Category: Medical News

Police are still investigating what exactly prompted a gunman to kill four people in a Manhattan office building yesterday evening, but perhaps the clearest aspect of his motive is the condition he invoked in a note found on his body: chronic traumatic encephalopathy.

The 27-year-old gunman, Shane Tamura, was a former high-school football player. He targeted the Midtown skyscraper that houses the National Football League, though none of the four people he shot and killed before ending his own life was an NFL employee. (According to a statement from NFL Commissioner Roger Goodell, one league employee was “seriously injured” and in stable condition at a hospital.) In his note, Tamura reportedly speculated that CTE might have been a cause of his mental illness, but it’s still too early for medical examiners to offer a diagnosis. (And even if an autopsy were to show anomalies in his brain, it could never reveal what precisely drove him to homicide.) Like at least one NFL player who died by suicide, Tamura asked that his brain be studied after he died.

Concerns about CTE and football have been mounting for more than two decades. In 2013, the NFL settled a lawsuit brought by more than 4,500 former players who claimed that the league concealed from them the risks of brain injury, including CTE. CTE is both rare and difficult to diagnose, so scientists haven’t definitively established its symptoms. They’re thought to include memory loss, personality changes, suicidality, and loss of motor control—all of which can be both devastating and caused by any number of disorders. Research overwhelmingly validates the link between the condition and professional football careers.

But the consequences of playing high-school football are not well studied—a major oversight, given that most people who play do not end up in the NFL, Eleanna Varangis, a University of Michigan professor who studies brain injury, told me. “The majority of the experience is at the youth level, and we still don’t know a lot about how those people look later in life,” she said.

Because CTE can be diagnosed only after death, whether Tamura had it is not yet known. Based on his note, though, something led him to suspect that he did. But two experts I spoke with about the condition—both of whom are advocates for better research and care in understanding CTE—told me that they had little reason to suspect that a person like Tamura would in fact have had the condition just because he played high-school football. (So far, a high-school teammate and a coach have told NBC News that Tamura was a talented player, but no further details have emerged about his time in the sport.) Jesse Mez, an associate professor of neurology at Boston University’s Chobanian and Avedisian School of Medicine and a co-director of clinical research at its CTE Center, has studied the risks of CTE across football careers ranging from one year up to 30—high schoolers to professional players. He found that the longer people play, the greater their risk of developing CTE. After about five and a half years of play, the relationship becomes linear. But in careers shorter than that, “the likelihood of getting the disease is quite low,” Mez told me.

CTE is thought to be caused by repetitive blows to the head, whether or not they lead to concussions. But scientists have not been able to pin down the precise number of impacts (or concussions) that cause someone to tip over the threshold into CTE, and even if they could, the length of a player’s career is an imperfect proxy for how many times they hit their head, Kristen Dams-O’Connor, the director of Mount Sinai’s Brain Injury Research Center, told me. There is variability, too, in susceptibility: Some people might develop CTE after fewer blows to the head. Genetics may also play a role. Although research shows the chances that a high schooler would develop CTE are extraordinarily small, “it’s hard to say what small is,” Mez said.

[Read: Tua Tagovailoa’s impossible choice]

The ambiguities around diagnosing CTE are made only worse by the fact that it cannot be confirmed before death. “I think it would be a huge service to be able to diagnose it in life, even without absolute certainty,” Mez said. Some signs indicate that all of that fear and confusion has led to excessive self-diagnosis. CTE appears to be “uncommon” in professional football if you take all players into account, Dams-O’Connor told me. And yet, in a 2024 study of former NFL players, more than one-third believed they had CTE. To assume that CTE is to blame for, say, depression, just because a person played football, is “really harmful logic,” Dams-O’Connor said: It suggests that nothing can be done, that a person is doomed to a life of irreversible decline from a disease with no direct treatments. Whether or not Tamura had CTE, it’s chilling to think that his conjecture that it contributed to his mental illness may have driven him to violence.

Plenty of other factors, football-related or not, may have also caused or exacerbated Tamura’s mental illness, Mez said. Some research suggests that high-school football players may be at greater risk of comorbidities that affect brain health, such as cardiovascular or cerebrovascular disease. A 2018 study found that people who start playing football before age 12 are at risk of experiencing cognitive, behavioral, and mood-related problems earlier in life than those who start playing when they’re older. And studies have shown that brain injuries (from football or any other cause) are associated with mental-health issues, including a higher risk of suicide, homicide, and criminal tendencies. New York City Police Commissioner Jessica Tisch said at a press briefing last night that the gunman had a “documented mental-health history,” and multiple outlets have reported that he was placed on psychiatric hold in both 2022 and 2024.

[Read: The future of detecting brain damage in football]

The New York City Office of the Chief Medical Examiner said in a statement today that it would examine Tamura’s brain during an autopsy—just as he wished. Perhaps the findings will add to the messy, ongoing science of CTE. Or perhaps they will prove a lesson in what happens when players are too quick to suspect it.

Americans have a long history of enduring heat waves by going outside. In a 1998 essay for The New Yorker, the author Arthur Miller described urbanites’ Depression-era coping mechanisms: People caught the breeze on open-air trolleys, climbed onto the back of ice trucks, and flocked to the beach. In the evenings, they slept in parks or dragged their mattresses onto fire escapes.

But since air conditioning went mainstream, in the 1960s, the easiest way to beat the heat has been by staying indoors—at home, the office, the mall—where cool air is a constant and blinds are often drawn to prevent homes from overheating (and electric bills from skyrocketing). For this convenience, Americans sacrifice the benefits of sunshine and the opportunities for fun it creates. As climate change turns up the temperature, summers in America are coming down to a choice between enduring the heat and avoiding it—both of which might, in their own ways, be making people sick.

In cities across the country, summers are, on average, 2.6 degrees hotter than they were some 50 years ago. In Phoenix, where a 95-degree day is a relief, schedules are arranged around the darkness. Jeffrey Gibson, an accountant who works from home, takes his eight-month-old daughter out for walks before 6:30 a.m.; after that, it’s so hot that she flushes bright red if they venture outside. He spends the rest of his day indoors unless leaving is absolutely necessary. It’s like this from April to October. Gibson recently told his wife, “Man, I think I’m a little depressed.”

Josef A. Von Isser, a therapist in Tucson, Arizona, told me that feeling low in the summer comes up a lot with his clients. Some feel that the heat affects them directly; others struggle with its indirect effects, such as fewer opportunities to socialize and be somewhere other than home or the office. All of them, he suspects, might be experiencing seasonal affective disorder.

The DSM-5 categorizes SAD as a type of major depression with a seasonal pattern, with symptoms such as sadness, feelings of worthlessness, and low energy. Usually, it presents in the winter, though scientists don’t agree on why. Some suspect that it’s because a lack of sun exposure may contribute to decreased levels of serotonin, a neurotransmitter that regulates mood, as well as vitamin D, which helps stimulate serotonin activity. Another theory links low exposure to sunlight with unusually high levels of melatonin, a hormone that helps regulate sleep.

[Read: The surprising truth about seasonal depression]

Summer SAD is generally accepted as a variety of the disorder, but it’s much rarer than the winter form; one study from earlier this year puts its prevalence at about 0.6 percent. That makes summer SAD especially hard to study. “It’s clearly a different kind of illness,” Paul Desan, a psychiatry professor at Yale, told me, but “it’s not in their imagination.” Unlike the winter form, which comes with a tendency to overeat, oversleep, and withdraw from society, summer SAD involves reduced appetite, insomnia, and restlessness—all of which can also be effects of heat. The scientific literature shows that heat is associated with mood disorders, anxiety, aggression, and reduced cognitive abilities. Uncomfortably hot nights, longer periods of daylight, and extended stretches of time spent indoors all disrupt sleep, which can in turn fuel mood disorders, Amruta Nori-Sarma, an environmental-health professor at Harvard, told me. Extreme heat can also be an obstacle to exercising, spending time in nature, and socializing, all of which can make people feel good and also double as important coping mechanisms for emotional distress.

Taking comfort in air conditioning when it’s too hot out is a natural human response. But air-conditioned spaces can be stifling in their own way. Staying home where it’s cool also means socializing less; some offices and homes hardly let in a wink of sunlight all day. It’s plausible that in the summer, people experience SAD symptoms not only from excessive heat but also because they spend all of their time avoiding the sun, Kim Meidenbauer, a psychology professor at Washington State University, told me. “It does make sense to me that you’d have, potentially, an analogous pattern of effects” to winter SAD, she said. The link between indoor time and summer SAD hasn’t been studied, but plenty of Americans, even if they don’t meet the DSM-5 criteria, are noticing that summer is starting to feel a lot like winter. Reddit abounds with users who lament that being forced indoors by the heat gives them “summer depression.”

America’s summer quandary—suffer inside or out?—will become only more persistent as climate change intensifies. In the United States, heat waves have grown more frequent and intense every decade since the 1960s. During a single heat wave last month, people in 29 states were warned to stay inside to avoid dangerously high temperatures. All of the experts I spoke with expressed concerns about the impacts of escalating heat on mental health. “I am not optimistic,” Ayman Fanous, a psychiatry professor at the University of Arizona, told me, noting that heat also has a well-established link with suicide risk and can exacerbate mental-health conditions such as schizophrenia, bipolar disorder, depression, and substance abuse. Many Americans don’t have access to air conditioning, or they work jobs that require them to be outside in the heat. Those who can stay cool inside may avoid the most severe consequences but still end up miserable for half of the year.

[Read: Earth’s new gilded era]

As long as summer SAD remains poorly understood, the options for addressing it will be limited. Treatment for winter SAD usually involves exposure to light boxes that mimic sunlight, but these aren’t recommended for summer SAD, because it might have a different neurobiological basis, Fanous said. For now, the first-line treatments are SSRIs such as Prozac—which can make people even more sensitive to heat.

For those with the means, the best strategy for beating summer SAD might be to move somewhere cooler. After eight years in Phoenix, Gibson has had enough of hiding from the heat for six months at a time and is ready to leave behind what he believes is his own summer SAD. Later this year, he plans to move his family to Colorado, where he hopes to be able to bring his daughter out during daylight hours. Yet Colorado summers, too, are becoming uncomfortably hot—and the same is true across the country. Last month, Alaska issued its first-ever heat advisory. As summer temperatures continue to rise, perhaps Americans will start to look back with envy on the ways our forebears beat the heat. The hotter summer nights get, the more sleeping on the fire escape starts to sound like a luxury.

For decades, countries around the world have held up the United States’s rigorous approach to vaccine policy as a global ideal. But in Robert F. Kennedy Jr.’s Department of Health and Human Services, many of the officials responsible for vaccine policy disagree. For the best immunization policy, they argue, the U.S. should look to Europe.

Marty Makary, the new FDA commissioner, and Vinay Prasad, the new head of the agency’s center for regulating vaccines, have criticized the nation’s COVID-19-vaccine policy for recommending the shots more broadly than many European countries do. Tracy Beth Høeg, a new adviser at the FDA, has frequently compared the U.S.’s childhood vaccination schedule unfavorably with the more pared-down one in Denmark, and advocated for “stopping unnecessary vaccines.” (Prasad, citing Høeg, has made the same points.) And the new chair of the CDC’s Advisory Committee on Immunization Practices, Martin Kulldorff—whom Kennedy handpicked to serve on the panel, after dismissing its entire previous roster—announced in June that ACIP would be scrutinizing the current U.S. immunization schedule because it exceeds “what children in most other developed nations receive.”

This group has argued that the trimness of many European schedules—especially Denmark’s—implies that the benefits of the U.S.’s roster of shots may not outweigh the risks, even though experts discussed and debated exactly that question when devising the guidance. But broadly speaking, the reasons behind the discrepancies they’re referencing “have nothing to do with safety,” David Salisbury, the former director of immunization of the U.K.’s Department of Health, told me. Rather, they’re driven by the factors that shape any national policy: demographics, budget, the nature of local threats. Every country has a slightly different approach to vaccination because every country is different, Rebecca Grais, the executive director of the Pasteur Network and a member of the WHO’s immunization-advisory group, told me.

One of the most important considerations for a country’s approach to vaccines is also one of the most obvious: which diseases its people need to be protected from. The U.S., for instance, recommends the hepatitis A vaccine for babies because cases of the contagious liver disease continue to be more common here than in many other high-income countries. And conversely, this country doesn’t recommend some vaccine doses that other nations do. The U.K., for example, routinely vaccinates against meningococcal disease far earlier, and with more overall shots, than the U.S. does—starting in infancy, rather than in adolescence—because meningitis rates have been higher there for years. Using that same logic, countries have also modified prior recommendations based on emerging evidence—including, for instance, swapping the oral polio vaccine for the safer inactivated polio vaccine in the year 2000.

Vaccines are expensive, and countries with publicly funded insurance consider those costs differently than the U.S. does. Under U.K. law, for instance, the National Health Service must cover any vaccine that has been officially recommended for use by its Joint Committee on Vaccination and Immunisation, or JCVI—essentially, its ACIP. So that committee weights the cost effectiveness of a vaccine more heavily and more explicitly than ACIP does, and will recommend only a product that meets a certain threshold, Mark Jit, an epidemiologist at NYU, who previously worked at the London School of Hygiene & Tropical Medicine, told me. Price also influences what vaccines are ultimately available. In 2023, JCVI recommended (as ACIP has) two options for protecting babies against RSV; unlike in the U.S., though, the NHS bought only one of them from manufacturers, presumably “because the price they gave the government was cheaper,” Andrew Pollard, the director of the Oxford Vaccine Group, the current JCVI chair, and a former member of the World Health Organization’s advisory group on immunizations, told me. (The prices that the U.K. government pays for vaccines are generally confidential.)

[Read: The neo-anti-vaxxers are in power now]

The nature of a country’s health-care system can influence vaccine policy in other ways too. In the U.S. system of private health care, health-equity gaps are massive, and access to care is uneven, even for one person across their lifetime. Many Americans bounce from health-care provider to provider—if they are engaged with the medical system at all—and must navigate the coverage quirks of their insurer. In this environment, a more comprehensive vaccination strategy is, essentially, plugging up a very porous safety net. Broad, simple recommendations for vaccines help ensure that a minimal number of high-risk people slip through. “We’re trying to close gaps we couldn’t close in any other way,” Grace Lee, a pediatrician and a former chair of ACIP, told me.

The U.S. strategy has worked reasonably well for the U.S. Universal flu-vaccine recommendations (not common in Europe) lower the burden of respiratory disease in the winter, including for health-care workers. Hepatitis B vaccines for every newborn (rather than, as in many European countries, for only high-risk ones) help ensure that infants are protected even if their mother misses an opportunity to test for the virus. More generally, broad recommendations for vaccination can also mitigate the impacts of outbreaks in a country where obesity, heart disease, and diabetes—all chronic conditions that can exacerbate a course of infectious illness—affect large swaths of the population. American vaccine experts also emphasize the importance of the community-wide benefits of shots, which can reduce transmission from children to elderly grandparents or decrease the amount of time that parents have to take off work. Those considerations carry far more weight for many public-health experts and policy makers in a country with patchy insurance coverage and inconsistent paid sick leave.


The current leadership of HHS thinks differently: Kennedy, in particular, has emphasized individual choice about vaccines over community benefit. And some officials believe that a better childhood immunization schedule would have fewer shots on it, and more closely resemble Denmark’s, notably one of the most minimalist among high-income countries. Whereas the U.S. vaccination schedule guards against 18 diseases, Denmark’s targets just 10—the ones that the nation’s health authorities have deemed the most severe and life-threatening, Anders Hviid, an epidemiologist at Statens Serum Institut, in Copenhagen, told me. All vaccines in Denmark are also voluntary.

But “I don’t think it’s fair to look at Denmark and say, ‘Look how they’re doing it, that should be a model for our country,’” Hviid told me. “You cannot compare the Danish situation and health-care system to the situation in the U.S.”

Denmark, like the U.K., relies on publicly funded health care. The small, wealthy country also has relatively narrow gaps in socioeconomic status, and maintains extremely equitable access to care. Danes also place a high degree of confidence in their national authorities, Hviid told me. Even with fully voluntary vaccination, the country has consistently maintained high rates of vaccine uptake, comparable with rates in the U.S., where public schools require shots. And even those factors don’t necessarily add up to a minimalist schedule: Other Nordic countries with similar characteristics vaccinate their children more often, against more diseases.

At least some of Kennedy’s allies seem to have been influenced not just by Denmark’s more limited vaccine schedule but specifically by the work of Christine Stabell Benn, a researcher at the University of Southern Denmark, who has dedicated much of her career to studying vaccine side effects. Like Kennedy and many of his allies, Benn is skeptical of the benefits of vaccination: “It’s not very clear that the more vaccines you get, the healthier you are,” she told me. Along with Kulldorff, Høeg, and National Institutes of Health Director Jay Bhattacharya, Benn served on a committee convened in 2022 by Florida Governor Ron DeSantis that cast COVID-19 vaccines as poorly vetted and risky. She and Høeg have appeared together on podcasts and co-written blogs about vaccine safety; Kulldorff also recently cited her work in an op-ed that praised one Danish approach to multidose vaccines, noting that evaluating that evidence “may or may not lead to a change in the CDC-recommended vaccine schedule.” When justifying his cuts to Gavi—the world’s largest immunization program—Kennedy referenced a controversial and widely criticized 2017 study co-authored by Benn and her husband, Peter Aaby, an anthropologist, that claimed that a diphtheria, tetanus, and pertussis vaccine was increasing mortality among children in Guinea-Bissau. (Benn wrote on LinkedIn that cutting Gavi funding “may have major negative impact on overall child survival,” but also noted that “it is reasonable to request that WHO and GAVI consider the best science available.”)

[Read: The U.S. is going backwards on vaccines, very fast]

Several of the researchers I spoke with described Benn, with varying degrees of politeness, as a contrarian who has cherry-picked evidence, relied on shaky data, and conducted biased studies. Her research scrutinizing vaccine side effects—arguing, for instance, that vaccines not made from live microbes can come with substantial detriments—has been contradicted by other studies, spanning years of research and scientific consensus. (In a 2019 TEDx talk, she acknowledged that other vaccine researchers have disagreed with her findings, and expressed frustration over her difficulties publicizing them.) When we spoke, Benn argued that the U.S. would be the ideal venue for an experiment in which different regions of the country were randomly assigned to different immunization schedules to test their relative merits—a proposal that Prasad has floated as well, and that several researchers have criticized as unethical. Benn said she would prefer to see it done in a country that would withdraw vaccines that had previously been recommended, rather than add new ones. In a later email, she defended her work and described herself as “a strong advocate for evidence-based vaccination policies,” adding that “it is strange if that is perceived as controversial.”

When I asked her whether anyone currently at HHS, or affiliated with it, had consulted her or her work to make vaccine decisions, she declined to answer. Kulldorff wrote in an email that “Christine Stabell Benn is one of the world’s leading vaccine scientists” but did not answer my questions about Benn’s involvement in shaping his recommendations. HHS did not respond to a request for comment.

What unites Benn with Robert F. Kennedy Jr. and his top officials is that, across their statements, they suggest that the U.S. is pushing too many vaccines on its children. But whether the U.S. is “overvaccinating” is the wrong question to ask, Jake Scott, an infectious-disease physician at Stanford, told me. Rather, Scott said, the more important question is: “Given our specific disease burden and public-health goals, are we effectively protecting the most vulnerable people? Based on overwhelming evidence? The answer is yes.”

[Read: RFK Jr. is barely even pretending anymore]

That’s not to say that the U.S. schedule should never change, or that what one country learns about a vaccine should not inform another’s choices. Data have accumulated—including from a large clinical trial in Costa Rica—to suggest that the HPV vaccine, for instance, may be powerful enough that only a single dose, rather than two, is necessary to confer decades of protection. (Based on that growing evidence, the prior roster of ACIP was considering recommending fewer HPV doses.) But largely, “I’m not sure if there’s a lot in the U.S. schedule to complain about,” Pollard, the JCVI chair, told me. On the contrary, other nations have taken plenty of their cues from America: The U.K., for instance, is expected to add the chickenpox shot to its list of recommended vaccines by early next year, Pollard told me, based in part on reassuring data from the U.S. that the benefits outweigh the risks. The U.S. does recommend more shots than many other countries do. But the U.S. regimen also, by definition, guards against more diseases than those of many other countries do—making it a standout course of protection, unparalleled elsewhere.



Every summer, there is a brief window—call it August—when the produce is exquisite. The cherries are at their best, as are the peaches, plums, and nectarines. The watermelon is sweet. The eggplants are glossy. The corn is pristine. And the tomatoes! The tomatoes are unparalleled. There’s a reason tomatoes are synonymous with summer, a staple of home gardens and farmers’ markets alike. Giant, honking beefsteaks and sprightly Sungolds are begging to be transformed into salads and gazpachos, tossed with pasta and sliced into sandwiches, or eaten raw by the fistful. Enjoy them while you can.

Come fall, tomato season will be over just as quickly as it began. Yes, you can obtain sliceable red orbs in virtually any supermarket, at any time of year, anywhere in the United States. But they are pale imitations of dripping August heirlooms. Out-of-season tomatoes—notoriously pale, mealy, and bland—tend to be tomatoes in name only. They can be serviceable, dutifully filling out a Greek salad; they can valiantly garnish a taco and add heft to a grilled-cheese sandwich. At the very least, they contribute general wetness and a sense of virtue to a meal. Flavor? Not so much.

This year, of all years, it’s worth indulging in the bounties of high tomato season. The bloodless tomatoes waiting for us in the fall are mostly imported from Mexico, and as with so many other goods these days, they are now stuck in the middle of President Donald Trump’s trade war. This week, the White House imposed 17 percent tariffs on Mexican tomatoes. In all likelihood, that will mean higher prices for grocery-store tomatoes, Tim Richards, an agricultural economist at Arizona State University, told me. This will not make them better in terms of color, texture, or flavor—but it will make them cost more.

[Read: America has never been so desperate for tomato season]

Grumbling about grim winter tomatoes is a long-standing national hobby, and at the same time, their existence is a small miracle. You can eat a BLT in the snow or a Caprese salad for Valentine’s Day with no effort at all. In August 1943, before Americans could get fresh tomatoes year-round, New York City Mayor Fiorello La Guardia encouraged housewives to brace for winter by canning as many tomatoes as they could. “They are in your city’s markets and I want to see every woman can them while they are at this low price,” he announced. They wouldn’t have to do it for long. By the 1960s, “just about every supermarket and corner store in America was selling Florida tomatoes from October to June,” the author William Alexander wrote in Ten Tomatoes That Changed the World. They were visually perfect but tasted like Styrofoam, which is in many ways what they were supposed to be: durable, pest-resistant, long-lasting, and cheap. Tomatoes are famously fragile and quick to rot, so they are often picked while still green, and then gassed with ethylene. It turns them red, giving the appearance of ripeness but not the corresponding flavor. In recent years, the situation has somewhat improved: Instead of focusing exclusively on looks and durability, horticulturalists have turned their attention to maximizing flavor.

There is another reason year-round tomatoes have improved: Mexico. “Most of the nice-looking, really tasty tomatoes in the market are Mexican,” Richards said. That includes small varieties such as cherry tomatoes, grape tomatoes, and cocktail tomatoes, or, as he classified them, “those little snacking tomatoes in the plastic things.” Mexico manages to produce this steady stream of year-round, pretty-good tomatoes by growing them primarily in greenhouses, which Richards said is the best possible way to produce North American tomatoes at scale. Even in winter, tomatoes sheltered from the elements can be left to ripen on the vine, which helps improve the taste.

All of which is to say that an America without easy access to imported Mexican tomatoes looks bleak. As with all of Trump’s tariffs, the point of taxing Mexican tomatoes is to help producers here in the U.S. Thirty years ago, 80 percent of the country’s fresh tomatoes were grown in America. Now the share is more like 30 percent, and sliding. America could produce enough tomatoes to stock grocery stores year-round—Florida still grows a lot of them—but doing that just doesn’t make a lot of sense. “It’s not cost-effective,” Luis Ribera, an agricultural economist at Texas A&M University, told me. “We cannot supply year-round tomatoes at the prices that we have.” Unlike Mexico, Florida mainly grows its tomatoes outside, despite the fact that it is ill-suited to outdoor tomato growing in pretty much all ways: The soil is inhospitable. The humidity is an incubator for disease. There are regular hurricanes. “From a purely botanical and horticultural perspective,” the food journalist Barry Estabrook wrote in Tomatoland, “you would have to be an idiot to attempt to commercially grow tomatoes in a place like Florida.”

[Read: Who invented the cherry tomato?]

Exactly what the tariffs will mean for grocery prices is hard to say. Tomatoes will be taxed when they cross the border, so importers and distributors will directly pay the costs. But eventually, the increase will likely trickle down to the supermarket. The story of tariffs, Ribera said, is that “the lion’s share is paid by consumers.” In the short term, Richards estimated that price hikes will depend a lot on the variety of tomato, with romas hardest hit. “That’s the one we rely on most from Mexico,” he said. Beefsteaks, he added, will face a smaller increase.

Compared with some of the other drastic tariffs that Trump imposed, a 17 percent price bump on Mexican tomatoes hardly portends the tomato-pocalypse. Last year, the average import price of Mexican tomatoes was about 74 cents a pound. If the entire 17 percent increase is passed on to consumers, we’d be looking at an additional 13 cents—enough to notice, but not enough for a critical mass of people to forgo romas altogether. Here’s the other thing: People want tomatoes, and they want them now. “We don’t want to wait for things to be in season,” Ribera said, and we aren’t about to start.

For all of the many problems with out-of-season tomatoes, Americans keep eating them. It was true when winter tomatoes were a novelty: “I don’t know why housewives feel they have to have tomatoes,” one baffled supplier told The New York Times in 1954. But they did, and people still do. Season to season, our national tomato consumption fluctuates relatively little, the grocery-industry analyst Phil Lempert told me. Every burger joint in America needs tomatoes—not the best tomatoes, but tomatoes that exist. There is a whole genre of recipes about how to make the most of out-of-season tomatoes. A lesser tomato, of course, is better than no tomato at all.

Mark and David Geier were a father-and-son team of researchers who operated on the fringes of the scientific establishment. They were known for promoting a controversial treatment for autism, and for publishing papers on the purported harms of vaccines that experts dismissed as junk science. In 2004, the CDC accused them of violating research protocols. In 2012, the state of Maryland sanctioned them. And in 2025, Health and Human Services Secretary Robert F. Kennedy Jr. tapped one of them to investigate alleged wrongdoing in a crucial CDC database.

For years, Kennedy has claimed that the database, which tracks adverse reactions to immunizations and is known as the Vaccine Safety Datalink, once contained vital information about vaccine safety—and that this information has been withheld from the public, scrubbed from the record, or otherwise manipulated. He wants David Geier to investigate it because he and his late father, a physician, studied it in the early 2000s, after they applied through a CDC program that allows researchers outside the government to access certain data sets.

When the Geiers were first allowed into this trove of millions of anonymized health records, they were supposed to be carrying out a safety study of the DTaP vaccine. But the CDC found that they were instead conducting unauthorized analyses to hunt for a link between the vaccine and autism, and risked breaching patients’ confidentiality in the process; the agency revoked their access. (At the time, the Geiers disputed the charge that they had endangered anyone’s personal information, writing in a 2004 letter to an institutional-review-board administrator that they held the “utmost regard” for patient confidentiality.) Even after they were ousted, the Geiers used information they’d apparently held on to from that database to publish a series of scientific papers advancing the widely discredited theory that thimerosal, a mercury-based preservative once common in childhood vaccines, is linked to autism, among other conditions.

[Read: The U.S. is going backwards on vaccines, very fast]

Researchers in the field have long criticized the Geiers’ methodology as sloppy, and noted that their conclusions are at odds with those of numerous higher-quality studies. Since March, when The Washington Post reported that David Geier had been brought into the Department of Health and Human Services, his and his father’s work has come under renewed scrutiny. One scientist found that several of their papers—based on information from the very CDC database that Kennedy has tasked Geier with investigating—contain a statistical error so fundamental that it casts doubt on Geier’s abilities and intentions in assessing data. That scientist and another I spoke with couldn’t believe that some of Geier’s work had ever been published in the first place.

David Geier is currently listed as a senior data analyst in HHS’s staff directory, though what exactly he’s doing for the department is unclear. The Wall Street Journal has reported that Geier is using his new position to continue his search for a link between thimerosal-containing vaccines and autism. New York magazine floated the possibility that he will attempt to repeat a study from the early 2000s that anti-vaccine activists cite as proof that inoculations harm developing brains. Kennedy has denied that Geier is running the agency’s project to find out what causes autism, and testified that he has instead been hired by a contractor to determine whether information disappeared from the database. (Mark Geier died in March, and David Geier did not respond to interview requests. Reached for comment, an HHS spokesperson pointed to a lengthy X post by Kennedy in which he defends Geier’s record and notes his “extensive background as a research scientist.”)

Under any other administration, Geier’s history would almost certainly have disqualified him from any role at HHS. In the mid-2000s, after Mark Geier had established a profitable sideline of testifying as an expert witness in lawsuits that alleged injury from vaccines, the father and son claimed to have discovered a method of treating autism. What they touted as a miracle drug was Lupron, a testosterone-suppressing medication used in many cases of premature puberty. They ran a laboratory out of the basement of their Maryland home and administered the drug to children based on their unfounded theory, advertising their supposed breakthrough on the autism-conference circuit. In 2012, Mark, a physician, was stripped of his license, and David was sanctioned for practicing medicine without one. (The Geiers sued the Maryland Board of Physicians in 2012 for releasing information about medications Mark Geier had prescribed to family members. They were awarded a total of nearly $5 million for the invasion of their privacy and attorneys’ fees, but that judgment was reversed after a different court ruled that Maryland Board of Physicians members were immune from such claims.)

[Read: RFK Jr.’s autism time machine]

The Geiers’ work is well known among autism researchers, though not well respected. “They were seen as not representing the best of autism science,” Craig Newschaffer, a Penn State scientist who has studied how genetics and environmental factors contribute to autism, told me, putting it more gently than others I spoke with. Marie McCormick met the Geiers when she chaired a 2004 review of immunization safety by the Institute of Medicine (now known as the National Academy of Medicine), a nonprofit group that advises the federal government. McCormick, now an emeritus professor at Harvard’s School of Public Health, recalled that the Geiers’ presentation had “really made no sense”: It was a slideshow of vaccine vials with labels indicating that they contained mercury, but it didn’t have much else in the way of evidence. The committee’s report identified a host of “serious methodological flaws” in the Geiers’ research, such as a failure to explain how they had sorted their subjects into groups.

The Geiers’ work from the 2010s likewise has such glaring flaws that the experts I spoke with were baffled as to how the studies had been published at all. Jeffrey Morris, a biostatistics professor at the University of Pennsylvania, recently examined a series of papers on which the Geiers were authors that used data from the Vaccine Safety Datalink. One representative 2017 study purportedly showed that the hepatitis B vaccine was associated with an increased risk of autism.

Morris quickly noticed that the paper’s approach rendered its findings meaningless. It compared a group of children with autism to a control group of children without the diagnosis, to see how vaccination rates differed between the two. But these groups of children also differed in another crucial way: The children diagnosed with autism were born during the eight-year span from 1991 to 1998, whereas the control group—children not diagnosed with autism—were born in 1991 or 1992.

That’s more than a minor inconsistency. In 1991, the CDC’s vaccine-advisory committee recommended that all infants in the United States receive the hepatitis B vaccine, and so the percentage of vaccinated children rose steadily throughout the decade, from fewer than 10 percent to approximately 90 percent. That meant that babies born later in the ’90s (who were overrepresented in the autism group) were very likely to have gotten the shot, whereas those born earlier in the decade (who were overrepresented in the control group) were not. By picking a control group in which relatively few kids would have been vaccinated, and an autistic population in which most were, the Geiers made finding a connection between immunization and autism inevitable.

[Read: The conversations doctors are having about vaccination now]

Using this approach, you could blame the vaccine for all manner of maladies. According to Morris, the Geiers did exactly that in at least nine papers, published from 2015 to 2018, that used data from the vaccine-safety database. One of their studies linked hep-B vaccination to childhood obesity. Others showed an association with tic disorders, emotional disturbance, and premature puberty, among other conditions, some of which rose during the ’90s and early 2000s at least in part because of new diagnostic criteria and increased awareness. That likely also explains why autism rates began to climb significantly in the ’90s.

Many flawed scientific papers include a regrettable but understandable oversight, Morris told me, but the Geiers employed “an absolutely invalid design that biases things so enormously that you could throw out the results of all these papers.” Newschaffer reviewed Morris’s critique and told me he doesn’t believe that a study with such a serious problem should have been published in the first place. “I would characterize that as a ‘miss’ in the peer review,” he said. (I also contacted Dirk Schaumlöffel, the editor in chief of the Journal of Trace Elements in Medicine and Biology, which published the Geiers’ paper connecting the hep-B vaccine to autism. He took issue with Morris’s “polemical allegations” and defended the paper, noting that it “does not argue against vaccination, but merely questions the role of thimerosal.” He told me that he would prefer that the matter be debated in the pages of his journal.)

If David Geier were merely an independent researcher publishing in lesser-known journals, his errors, although egregious, would be of little more than academic concern. But his influence on Kennedy runs deep. In 2005, Kennedy highlighted the Geiers’ research in an essay outlining how he’d come to believe that thimerosal-containing vaccines could cause autism. He wrote about them again that year in “Deadly Immunity,” an article—eventually retracted by both Salon and Rolling Stone after multiple corrections and intense criticism—that alleged that government health agencies had covered up evidence indicating that thimerosal in vaccines was to blame for the rise in autism rates. In his 2014 book, Thimerosal: Let the Science Speak, Kennedy cites the Geiers dozens of times, portraying them as determined truth-tellers battling uncooperative government agencies—the very ones Kennedy has now been appointed to oversee.

[Read: The neo-anti-vaxxers are in power now]

Thanks to Kennedy, Geier seemingly is being handed the keys to the same database he’s proved himself unfit to study. People who are familiar with Geier’s history worry that he’ll use his position on the inside not to defend the truth but to resurrect thoroughly debunked claims, twisting the data to support what he and Kennedy have long believed.

For all of the eons that animal life has existed on Earth, the sun has been there too. And for all of those eons, animal life has had only one solution for intense exposure to the sun: evolution. Some creatures have thick, dark skin that’s resistant to UV harm; others sprout fur, scales, or feathers that block the sun’s rays. Many fish, reptiles, amphibians, and birds may produce a compound that protects their cells against the sun’s damaging effects. Hippos, weirdly, ooze a reddish, mucus-y liquid from their pores that absorbs light before it can destroy their skin. And plenty of creatures have evolved behaviors that take advantage of their environment—rolling around in dirt or mud, simply retreating into the shade.

But certain modern animals have sun problems that natural selection can’t easily solve. Some reside at zoos that can’t perfectly replicate their habitat; others live at latitudes that their ancestors didn’t experience. Others spend too much time sunbathing in a living-room window, or sport sparse or light-colored fur or hair because their domesticators liked the way it looked. For these animals, people have come up with a shorter-term solution: sunscreen.

If, that is, a creature is willing to accept the treatment. Indu, an Asian elephant who lived at the Phoenix Zoo, was game. A few years ago, Heather Wright, one of the zookeepers, noticed the tops of Indu’s ears pinking, peeling, and flaking in the summer heat, much like her human keepers’ did. So her caretakers picked up some zinc-oxide-based sunblock—specially formulated for sensitive (human) skin—and dabbed it on the elephant. Indu, to be fair, was used to a level of care most wild animals don’t enjoy. “We had already been applying lotion,” to manage dryness, Wright told me. The elephant knew the drill: Once in the barn, she’d lumber up to a window with an opening for her ear and stick the appendage through.

As far as zoo staff members could tell, the treatment helped. “There’s nothing magical” about other animals’ skin, Leslie Easterwood, a large-animal veterinarian at Texas A&M University, told me: Bake it in the sun, and it will burn. Scientists have spotted whales suffering from sunburns; cats, dogs, horses—even alpacas, turtles, and penguins—can develop all kinds of skin cancers. Pigs, in particular, “have skin most similar to humans,” Mitchell Song, a veterinary dermatologist based in Arizona, told me. At Zoo Miami, keepers have spread mud on older, arthritic wild pigs who can’t wallow as well as they did in their youth; they’ve also applied sunscreen to a babirusa, a species of swine native to Indonesia’s forests, and to a Kunekune pig, Gwen Myers, the zoo’s chief of animal health, told me.

In some sunny places, vets commonly recommend sunscreen for pets and other domesticated creatures, especially light-colored dogs and horses. Steve Valeika, a veterinarian in North Carolina, advises the same for “white cats that go outside.” This particular conundrum is one of our own making. “You don’t see a lot of white-skinned animals in the wild,” Anthea Schick, a veterinary dermatologist in Arizona, told me. Only thanks to generations of selective breeding have they become a frequent presence in and around people’s homes.

Of course, to sunscreen your pet, you have to … sunscreen your pet. Some pet owners, vets told me, are definitely flummoxed by the suggestion: “It’s not widely discussed,” Schick told me. Vets are more unified in recommending teeth brushing for cats—and most cat owners still just decide they’d rather not. But some animals would certainly benefit from sunblock: Schick told me she’s seen her fair share of badly burned dogs, especially after long bouts of sunbathing that scorch their bellies. “We see a lot of sun-induced skin cancers that could be avoided,” she said. Pit bulls, Dalmatians, and other short-haired breeds are especially vulnerable; even long-haired white cats are sensitive around their eyes, their nose, and the tips of their ears. And Easterwood estimates that the majority of paint horses, left unprotected, will eventually develop skin issues. Squamous-cell-carcinoma cases make up the majority of her workload: “I see it every single day,” she said.

The vets I spoke with generally agreed: Don’t bother with sprays, which a lot of animals find annoying or downright terrifying; reapply often, and well; it is way, way, way harder to sunscreen a cat than a dog, though some brave souls manage it. But although some vets recommended human sunscreens, formulated for kids or sensitive skin, others told me they preferred blends marketed for animals. (The FDA has dubbed just one pet sunscreen, made by a company called Epi-Pet and marketed to dogs and horses, “FDA compliant”—not the same as FDA approval, which requires rigorous safety testing.) Several warned against zinc oxide, which can be toxic to animals if ingested in large quantities; others felt that zinc oxide was worth the risk, unless administered to a tongue-bathing cat.

Regardless of the product they’re offered, most animals generally aren’t as eager as Indu to subject themselves to a human-led sun-protection ritual. And even she was usually plied with a five-gallon bucket of fruits and vegetables while her keepers tended her ears. At Zoo Miami, keeper Madison Chamizo told me she and her colleagues had to spend months training an okapi—an African mammal closely related to a giraffe—to accept caretakers gently scrubbing sunscreen onto her back with a modified Scotch-Brite dishwand, after she lost some patches of hair on her back to a fungal infection. But for creatures in very sunny parts of the world, the alternatives are, essentially, being cooped up indoors, kept away from windows, or wrestled into full-body sunsuits. (Some dogs don’t mind; cats, once again, are unlikely to comply.)

And some sun-related problems, sunscreen can’t fix. Gary West, the Phoenix Zoo’s vet, told me he suspects that UV glare has caused eye inflammation in some of his animals; Myers, in Miami, worries about the sensitive skin around some species’ eyes. “They’re not really going to wear sunglasses for us,” Myers told me. So she and her colleagues have started to wonder: “Gosh, is this an animal that we could put a sun visor on?”

The early aughts were the worst possible kind of golden age. Tans were inescapable—on Britney Spears’s midriff, on the flexing biceps outside of Abercrombie & Fitch stores. The Jersey Shore ethos of “gym, tan, laundry” infamously encapsulated an era in which tanning salons were after-school hangouts, and tanning stencils in the shape of the Playboy bunny were considered stylish. Self-tanning lotions, spray tans, and bronzers proliferated, but people still sought the real thing.

By the end of the decade, tanning’s appeal had faded. Americans became more aware of the health risks, and the recession shrank their indoor-tanning budgets. But now America glows once again. The president and many of his acolytes verge on orange, and parties thrown by the MAGA youth are blurs of bronze. Celebrity tans are approaching early-aughts amber, and if dermatologists’ observations and social media are any indication, teens are flocking to the beach in pursuit of scorching burns.

Tanning is back. Only this time, it’s not just about looking good—it’s about embracing an entire ideology.

Another apparent fan of tanning is Robert F. Kennedy Jr., America’s perpetually bronzed health secretary, who was spotted visiting a tanning salon last month. What tanning methods he might employ are unknown, but the secretary’s glow is undeniable. (The Department of Health and Human Services didn’t respond to a request for comment about the administration’s views on tanning or Kennedy’s own habits.)

On its face, the idea that any health secretary would embrace tanning is odd. The Obama administration levied an excise tax on tanning beds and squashed ads that marketed tanning as healthy. The Biden administration, by contrast, made sunscreen use and reducing sun exposure central to its Cancer Moonshot plan. The stated mission of Kennedy’s Make America Healthy Again movement is to end chronic diseases, such as cancer, by addressing their root causes. Yet the Trump administration’s MAHA report, released in May, doesn’t once mention skin cancer, which is the most common type as well as the most easily preventable. It mentions the sun only to note its connection with circadian rhythm: “Morning sun synchronizes the body’s internal clock, boosting mood and metabolism.”

In fact, there’s good reason to suspect that Kennedy and others in his orbit will encourage Americans to get even more sun. Last October, in a post on X, Kennedy warned that the FDA’s “aggressive suppression” of sunlight, among other supposedly healthy interventions, was “about to end.” Casey Means, a doctor and wellness influencer whom President Donald Trump has nominated for surgeon general, is also a sun apologist. In her best-selling book, Good Energy (which she published with her brother, Calley Means, an adviser to Kennedy), she argues that America’s many ailments are symptoms of a “larger spiritual crisis” caused by separation from basic biological needs, including sunlight. “Shockingly, we rarely ever hear about how getting direct sunlight into our eyes at the right times is profoundly important for metabolic and overall health,” she writes. An earlier version of Trump’s One Big Beautiful Bill tried to repeal the excise tax on tanning beds. (The provision was cut in the final version.)

The alternative-health circles that tend to attract the MAHA crowd are likewise skeptical of sun avoidance. “They don’t want you to know this. But your body was made for the sun,” says a “somatic energy healer” with 600,000 followers who promotes staring directly into the sun to boost mood and regulate the body’s circadian rhythm. (Please, don’t do this.) On social media, some influencers tout the sun’s supposedly uncelebrated power to increase serotonin and vitamin D, the latter of which some erroneously view as a cure-all. Some promote tanning-bed use as a way to relieve stress; others, such as the alternative-health influencer Carnivore Aurelius, promote genital tanning to boost testosterone. Another popular conspiracy theory is that sunscreen causes cancer and is promoted by Big Pharma to keep people sick; a 2024 survey found that 14 percent of young adults think using sunscreen every day is worse for the skin than going without it.

These claims range from partly true to patently false. The sun can boost serotonin and vitamin D, plus regulate circadian rhythm—but these facts have long been a part of public-health messaging, and there’s no evidence that these benefits require eschewing sunscreen or staring directly at our star. Tanning beds emit little of the UVB necessary to produce vitamin D. Some research suggests that the chemicals in sunscreen can enter the bloodstream, but only if it’s applied to most of the body multiple times a day; plus, the effects of those chemicals in the body haven’t been established to be harmful, whereas skin cancer has. And, if I really have to say it: No solid research supports testicle tanning. Nor does any of this negate the sun’s less salutary effects: premature aging, eye damage, and greatly increased risk of skin cancer, including potentially fatal melanomas.

The specific questions raised in alternative-health spaces matter less than the conspiracist spirit in which they are asked: What haven’t the American people been told about the sun? What lies have we been fed? Their inherent skepticism aligns with Kennedy’s reflexive mistrust of the health establishment. In the MAHA world, milk is better when it’s raw, beef fat is healthier than processed oils, and the immune system is strongest when unvaccinated. This philosophy, however flawed, appeals to the many Americans who feel that they’ve been failed by the institutions meant to protect them. It offers the possibility that regaining one’s health can be as simple as rejecting science and returning to nature. And what is more natural than the sun?

[Read: You’re not allowed to have the best sunscreens in the world]

Now is an apt moment for American politics to become more sun-friendly. Tanning is making a comeback across pop culture, even as “anti-aging” skin care and cosmetic procedures boom. Young people are lying outside when the sun is at its peak—new apps such as Sunglow and Rayz AI Tanning tell them when UV rays are strongest—to achieve social-media-ready tan lines. Last year, Kim Kardashian showed off a tanning bed in her office (in response to backlash, she claimed that it treated her psoriasis). Deep tans are glorified in ads for luxury goods, and makeup is used in fashion shows to mimic painful-looking burns. Off the runway, “sunburned makeup,” inspired by the perpetually red-cheeked pop star Sabrina Carpenter, is trending.

Veena Vanchinathan, a board-certified dermatologist in the Bay Area, told me that she’s noticed more patients seeking out self-tanning products and tanning, whether in beds or outdoors. Angela Lamb, a board-certified dermatologist who practices on New York’s well-to-do Upper West Side, told me her patients are curious about tanning too. “It’s actually quite scary,” she said. A recent survey by the American Academy of Dermatology found that a quarter of Americans, and an even greater proportion of adults ages 18 to 26, are unaware of the risks of tanning, and many believe in tanning myths, such as the idea that a base tan protects against a burn, or that tanning with protection is safe. (“There is no such thing as a safe tan,” Deborah S. Sarnoff, the president of the Skin Cancer Foundation, told me.)

Recently, some experts have called for a more moderate approach to sun safety, one that takes into account the benefits of some sun exposure and the harms of too much shade. “I actually think we do ourselves a bit of a disservice and open ourselves up to criticism if the advice of someone for skin-cancer prevention is ‘Don’t go outside,’” Jerod Stapleton, a professor at the University of Kentucky who studies tanning behaviors, told me. But the popular rejection of sun safety goes much further. Advances in skin-cancer treatment, for example, may have lulled some Americans into thinking that melanoma just isn’t that serious, Carolyn Heckman, a medical professor at Rutgers University’s Robert Wood Johnson Medical School, told me. Skin-cancer treatment and mortality rates have indeed improved, but melanomas that metastasize widely are still fatal most of the time.

[From the June 2024 issue: Against sunscreen absolutism]

In previous decades, tans were popular because they conveyed youth, vitality, and wealth. They still do. (At least among the fairer-skinned; their connotations among people of color can be less positive.) But the difference now is that tanning persists in spite of the known consequences. Lamb likened tanning to smoking: At this point, most people who take it up are actively looking past the well-established risks. (Indeed, smoking is also making a pop-culture comeback.) A tan has become a symbol of defiance—of health guidance, of the scientific establishment, of aging itself.

Donald Trump is enamored with Coca-Cola. In January, he smiled from ear to ear in a photo with the company’s CEO, who gifted him a special Coke bottle commemorating his inauguration. When Trump officially returned to the Oval Office as president a few days later, his desk was already set up as it had been in his first term: with a button to summon a bottle of Diet Coke. Between sips of the fizzy drink (at one point, he reportedly drank up to 12 cans a day), the president has apparently been working the company behind the scenes. Yesterday, he announced on Truth Social that Coke has agreed to start making its signature soda with “REAL Cane Sugar” rather than high-fructose corn syrup. “I’d like to thank all of those in authority at Coca-Cola,” he wrote.

So far, little else is known about the supposed deal. I asked Coke for more information but did not hear back. The company has yet to even confirm that it has agreed to anything at all. (“More details on new innovative offerings within our Coca-Cola product range will be shared soon,” a company spokesman told The New York Times earlier today.) Although the move may seem random, it follows a pattern of Trump trying to score easy political points—especially during a moment when his base is at war with itself over the Jeffrey Epstein scandal. The clearest argument for cane sugar over corn syrup is taste. “You’ll see. It’s just better!” Trump said on Truth Social. Coke is made with cane sugar in Mexico and many other countries, and “Mexican Coke” has long had a cult following in the United States. Trump may also be doing the “Make America healthy again” movement a solid. Robert F. Kennedy Jr. has focused on high-fructose corn syrup as a major contributor to diabetes and obesity. “Thank you, @POTUS!,” Kennedy wrote on X following the announcement.

The taste argument is fair enough. But if Trump does think Americans’ health is at stake in a switch from high-fructose corn syrup to cane sugar, he’s relying on some very confused reasoning. Like many MAHA priorities, the change is better on paper than in practice. Kennedy may oppose high-fructose corn syrup, but he has also called sugar “poison.” He’s right to be wary of both, because sugar and high-fructose corn syrup are, by and large, the same thing. Multiple independent meta-analyses have found that there is little difference between the two when it comes to health metrics such as weight, blood pressure, and cholesterol. The two products “have nearly identical metabolic effects,” Dariush Mozaffarian, the director of Tufts University’s Food Is Medicine Institute, told me. Trump is essentially claiming that he got Coke to agree to switch its sugar … for sugar.

If Trump wanted to use Coke as a lever to improve Americans’ health, he would need to focus on making sure as few people drink the stuff as possible. In the world of public health, soda is a scourge. Consumption of sugar-sweetened beverages has been tied to increased body weight, diabetes, and heart disease. These drinks are the largest source of added sugar in the American diet. The Trump administration, to its credit, has spent the past several months trying to temper the United States’ insatiable soda habit. At the urging of RFK Jr. and other top officials, six states—Arkansas, Idaho, Indiana, Iowa, Nebraska, and Utah—have experimented with bans on the purchase of soda using SNAP benefits.

But at times, RFK Jr. has seemed to lose the plot on sugar. It’s among the biggest problems with the American diet, but it hasn’t been Kennedy’s primary focus as health secretary. He has spent more time and effort attempting to crack down on synthetic food dyes. For the past several months, Kennedy has been pressuring food companies to voluntarily remove such dyes from their products—a move that, like Trump’s purported deal with Coke, makes for grabby headlines. Food giants such as Kraft Heinz, General Mills, and Nestlé have all agreed to phase out artificial dyes from their products. But Lucky Charms made with natural food dyes and still loaded with sugar is hardly a win for American health.

It’s a tumultuous time to be a food company; the MAHA army might suddenly come after an ingredient in your product that people previously agreed was safe. (Yesterday, after Trump’s announcement, Coke defended the safety of high-fructose corn syrup on X.) Agreeing to remove a controversial ingredient is a way to get in the administration’s good graces and can be a good business move—even if your foods are largely still unhealthy.

In fact, food companies have started to find that they don’t actually need to sell healthy food in order to be MAHA approved; they just need to remove the few ingredients Kennedy objects to. No company embodies this strategy quite like Steak ’n Shake. The burger chain has become a MAHA darling: Steak ’n Shake announced earlier this year that it would begin frying french fries in beef tallow instead of seed oils. To celebrate the change, Kennedy had a meal at a Steak ’n Shake for a photo op with Sean Hannity that was broadcast on Fox News. (He thanked the restaurant for “RFK-ing the french fries.”) What this moment didn’t capture was that, at Steak ’n Shake, you can still order a double cheeseburger with 1,120 milligrams of sodium—half the recommended daily amount for adults. Wash it down with a vanilla milkshake, and you’ll have just consumed 92 grams of sugar, equivalent to gorging on three Snickers bars. Or how about a Coke with your beef-tallow fries? Today, Steak ’n Shake announced the next step in its “MAHA journey”: In a few weeks, it will begin selling “Coca-Cola with real cane sugar in glass bottles.”

Robert F. Kennedy Jr. has spent a lot of his time as health secretary on the road. Late last month, he spoke at an event in Baton Rouge and lamented how Americans have gotten sicker and sicker over the years. “When my uncle was president, I was a 10-year-old boy—we had the healthiest children in the world,” he said, flanked by supporters in green MAHA Louisiana hats. The day before, Kennedy gave a near-identical speech in Oklahoma City, this time surrounded by people holding signs that read OK ❤️ RFK Jr. and Make Oklahoma Healthy Again.

Kennedy traveled to both states to celebrate their efforts to take up his MAHA agenda. In Oklahoma, RFK Jr. joined the signing of a pair of orders that will begin the process of pulling fluoride from the state’s water supply and blocking the purchase of soda using food stamps. In Louisiana, the health secretary was there when the state enacted a bill that forces food companies to put warnings on their products if they contain certain artificial food dyes, preservatives, or dozens of other additives. These were just two stops on a nationwide tour that has also taken RFK Jr. to several other states—including Arizona, Utah, and West Virginia—that are pushing forward with his ideas, especially on food. In some cases, Kennedy has cheered from afar: “Texas is leading the way,” he posted on X last month, after the Lone Star State passed its own MAHA-style bill similar to Louisiana’s.

Though RFK Jr. has the power to enact monumental change, much of MAHA’s actual success at reforming the American diet hasn’t come from Washington. While states pass law after law cracking down on food, Kennedy’s own biggest action to date has been relatively modest: a campaign pressuring food companies to voluntarily remove synthetic food dyes from their products. The states are out-MAHAing the MAHA king, much to his pleasure.

To some degree, RFK Jr. was always going to need help from the states. Though he has repeatedly called for a ban on purchasing soda using food stamps, the health secretary can’t make it happen without action from states such as Oklahoma. State legislators started introducing various MAHA bills right as Kennedy was being confirmed to his position. In March, Kennedy visited West Virginia when it became one of the first states to pass such a bill into law, banning seven artificial dyes from food served in schools. The laws in Louisiana and Texas are far more sweeping, among the most stringent food policies that have been passed by states in recent memory. In Texas, a range of products that include common food additives will have to specify on the package that they are “not recommended for human consumption by the appropriate authority in Australia, Canada, the European Union, or the United Kingdom.” Louisiana has a similar warning-label rule, and mandates that any restaurant serving food cooked in seed oils has to display a disclaimer in the store.

[Read: Republicans are right about soda]

Kennedy could enact similar changes nationwide, and could even outright ban certain ingredients, but so far he hasn’t. Such actions typically require bureaucrats to first collect evidence that a certain food is causing actual harm, and outside groups have already gathered dossiers of scientific studies. “All he had to do was swing and hit it out of the ballpark,” Jensen Jose, the regulatory counsel at the Center for Science in the Public Interest, an advocacy group that pushes for more stringent regulation of food additives, told me. By Jose’s telling, Kennedy “didn’t even step up to bat.”  

Kennedy’s restraint appears to be deliberate. Banning ingredients requires new regulations, something that the Trump administration disdains. In January, Trump signed an executive order requiring that for every new rule enacted, the government would have to shed 10. By regulating food through handshake agreements and relying on states to enact their own policies, Kennedy is getting his way without having to deal with all of the red tape. There are other practical considerations: Although Kennedy talks about waging war on food companies, he is running low on infantry. Food regulations involve legal paperwork, and the FDA has been without its chief counsel since March. The FDA branch tasked with reviewing the safety of additives was thrown into chaos back in February, when a number of employees were fired—prompting the agency’s top food regulator to resign in protest. In an email, a Health and Human Services spokesperson told me, “Secretary Kennedy has led the national charge in demanding greater transparency and accountability from the food industry, and it is precisely because of his leadership that many states have felt empowered to act.”

The challenge for Kennedy in realizing his vision on a national scale, of course, is that he also has to contend with states that are less enthused about the MAHA agenda. But it’s possible that the state laws in Texas and Louisiana will have consequences for the rest of the country. (It’s also true that some of what he has proposed—particularly around food dyes—has at least some bipartisan appeal.) Instead of dealing with the hassle of creating special packaging with warning labels to sell in Texas and Louisiana, food companies might just add those labels to the products they sell nationwide. Something along those lines has happened before. Across the country, foods occasionally come with warnings that they contain certain carcinogens after California enacted a rule requiring such labels. Or perhaps food companies will take out the ingredients targeted by Louisiana and Texas to avoid the warning-label requirement altogether.

Kennedy seems to be banking on such sweeping change. During his appearance in Louisiana, he noted that the state’s MAHA law helps his campaign of pressuring food companies to phase out artificial food dyes. “The food companies are coming in every day and saying, Stop the states from doing this; we don’t want to have a national product that has a patchwork of different states with different rules,” Kennedy claimed. The speed with which states have picked up the MAHA charge undoubtedly strengthens Kennedy’s hand in his negotiations with food makers. Some of the world’s largest food companies, including Nestlé USA, Tyson Foods, and Kraft Heinz, have already pledged to remove certain dyes from their products in the coming years.

[Read: RFK Jr.’s worst nightmare]

But outsourcing MAHA to the states is hardly a surefire strategy for Kennedy. Some companies promising change may actually be waiting him out—banking on the idea that once Trump leaves office, Kennedy will, too. In the meantime, should the laws be challenged in court, they risk being at least partially struck down. In 2013, a Michigan state law requiring unique beverage labels was nullified after a federal appeals court determined that the legislation unfairly interfered with interstate commerce. Even if the laws stand, RFK Jr. might face some challenges. Warning labels on artificial food dyes might lead companies to swap them out for natural alternatives, but that’s less likely to happen with emulsifiers, another category of food additives targeted by the Louisiana and Texas bills. Emulsifiers are present in many ultraprocessed foods, and in some cases there is no easy replacement.

The clearest way to get rid of emulsifiers would be a national ban. Kennedy can do that, but the states can’t. The same is true with many other food additives that he—often incorrectly—says are wreaking havoc on American health. At a certain point, no matter how MAHA-focused some states become, Kennedy may actually need to get to regulating.

For years, studies have pointed to one especially powerful influence over whether a person will get a vaccine: a clear recommendation from their doctor. Throughout most of her career, Nola Ernest, a pediatrician in rural southeastern Alabama, could reassure families who were hesitating to vaccinate their kids—in many cases by explaining that she had enthusiastically opted into the same shots for her own sons. In the past few months, though, she’s spoken with several families who, at her recommendation, had previously immunized all of their older kids—and yet are now adamant about not vaccinating their newborn. “I reassure them that I am still the same pediatrician,” Ernest told me. “They say, ‘We still trust you. We just think a lot of the things have been pushed on us for a long time that were not actually necessary, or were harmful.’”

Until recently, doubt about vaccines might have been seeded mainly by cautions from friends and family, or by unreliable information online. Now, though, doubt about vaccines has the weight of the federal government behind it. Health and Human Services Secretary Robert F. Kennedy Jr., a longtime anti-vaccine activist, has been telling the public exactly what Ernest’s patients told her: Unnecessary, unsafe vaccines have been forced on you. A recent KFF tracking survey found that about three-fourths of Republicans trust their physician to provide reliable information about vaccines—but about three-fourths trust President Donald Trump and Kennedy to do so as well.

As those sources start to contradict one another, patients’ trust in doctors—which was already eroding—is being pitted directly against trust in government. And in doctors’ offices across the country, the Trump administration’s position is bending conversations about vaccinations—in some cases toward hesitancy, and in others toward haste as people fear that shots will soon be harder to get.

Government advisers and doctors have diverged in their vaccination advice before. In 1989, for instance, the CDC’s Advisory Committee on Immunization Practices, or ACIP, recommended that children receive their second dose of the measles-mumps-rubella vaccine between the ages of 4 and 6, when kids are due for other vaccines; that same year, the American Academy of Pediatrics, following the logic that outbreaks tended to happen in middle school or high school, advised age 11 or 12. The conflicting guidance created enough uncertainty for health professionals and patients that ACIP and the AAP pledged in 1993 to sync their vaccine advice, and in 1995 published the country’s first officially harmonized immunization schedule.

The current vaccine schism between the government and medical professionals, though, is different in kind—not a disagreement over maximizing uptake of data-backed vaccines, but a fight over what evidence to even consider. In May, Kennedy bypassed the CDC—his own department’s agency—and tried to unilaterally remove COVID-vaccine recommendations for children and pregnant people, without providing any evidence of harm. Weeks later, he dismissed all 17 members of ACIP and replaced them with researchers who largely lack expertise in vaccines, including multiple people who are openly antagonistic toward them. At its first meeting last month, that reconstituted group voted to remove recommendations for flu shots that contain the preservative thimerosal, following the advice of an anti-vaccine activist invited to speak at the meeting. (When reached over email for comment, an HHS spokesperson wrote that “HHS continues to support the CDC and the Advisory Committee on Immunization Practices (ACIP) in advancing evidence-based immunization schedules” and that “the Secretary stands by his CDC reforms.”)

These upheavals have prompted a very public fight. Last week, several professional societies—including the AAP, the American College of Physicians, and the Infectious Diseases Society of America—sued Kennedy and HHS, calling recent shifts in vaccine policy “capricious” and arguing that the department’s new leaders were putting the nation’s health at risk. The AAP also boycotted ACIP’s most recent meeting. Shortly after, Martin Kulldorff, the new ACIP chair, criticized the AAP’s loyalty to the unamended immunization schedule as “unscientific.”

The government’s alterations to vaccine guidance so far have been relatively limited. But Kennedy and many of his allies have criticized the immunization schedule, especially for kids, or advocated for paring it back further. And according to nearly a dozen doctors I spoke with, plenty of patients have already picked up on the spirit of these changes: that they should put less stock in vaccination than the government had previously called for. “Families have really been shaken in their confidence in what we’ve been telling them all this time,” Molly O’Shea, a pediatrician in Michigan, told me. “We’re already seeing in my practices a decrease in people taking vaccines on schedule.” In the past, when O’Shea asked, her patients would usually explain their rationale for distrusting a vaccine—something they had read online, a rumor they had heard from a relative. Now, though, many of them don’t want to discuss their choice at all, a response she’s rarely encountered in her three decades of practicing medicine.

In some cases, families are echoing Kennedy’s concerns, and pressing their doctors to directly address them. Like many pediatricians, O’Shea requires that her patients follow the recommended childhood-immunization schedule to continue to be seen at her practice; at one of her offices, several families have asked recently why she’s maintaining the policy even though Kennedy has described vaccination as a personal decision. Braveen Ragunanthan, a pediatrician in a rural part of the Mississippi Delta, told me that a patient recently expressed concerns about the immunizations recommended for his six-month-old daughter after hearing something on the news. The patient asked, “All this time, has there been something wrong with the shots?” Ragunanthan told me.

Ernest, the pediatrician in Alabama, told me that one family of longtime patients, when declining to vaccinate their newborn, cited the debunked notion that vaccines cause autism, an idea that Kennedy has repeatedly endorsed. Several of Kennedy’s other mistruths about vaccines, including that certain shots contain “aborted-fetus debris” and that the hepatitis B vaccine has been linked to autism, have come up as well. Some of the families she sees have also cast federal vaccine regulators and pharmaceutical companies as untrustworthy—echoing Kennedy’s narrative that the U.S. approach to vaccine policy has been corrupt and is bent on pushing dangerous shots for industry profit.

Families who remain eager to vaccinate are also taking seriously Kennedy’s rhetoric—and the implication that a government that endorses fewer shots will ultimately depress their availability. Gretchen LaSalle, a family-medicine physician in Spokane, Washington, told me that some of her patients have started asking whether they’ll be able to get their fall COVID and flu shots; Jennifer Hamilton, a family-medicine physician in Philadelphia, said she’s heard similar concerns from older adults about shingles and pneumococcal vaccines. Ragunanthan also recently vaccinated a patient against HPV at age 9, the earliest age of eligibility and two years before most pediatricians routinely offer the first dose, at her parents’ request. “They said, ‘I don’t know if they’re going to try to take it away,’” he said.

Several doctors told me that they’re committed to following whatever their professional society—be it the AAP, the American Academy of Family Physicians, or another organization—recommends. But they also acknowledged that doing so may not be practical. Public schools generally look to the national immunization schedule to determine which vaccines to mandate for entry, and when; the government’s official stance on vaccines can also influence the price and availability of shots, and determine what insurers will cover. ACIP also decides which vaccines are covered by the Vaccines for Children Program, which ensures access for kids whose families can’t afford shots.

Certain patients might opt to pay for shots out of pocket; Alanna Levine, a pediatrician in New York, told me that her practice intends to seek grant funding that might help it continue to offer vaccines to all of its patients, regardless of insurance coverage. But some vaccines can cost as much as hundreds of dollars per dose—a price that many families won’t be able to, or want to, pay and that many doctors’ offices won’t want to shoulder to keep shots in stock. “We would definitely lose considerable money if we bought vaccines, paid to store the vaccines, paid to administer the vaccines, and then families couldn’t afford to pay us,” Ernest told me. As much as doctors want to continue to “follow the science”—as nearly all of them put it to me—the power of the government may force their hand. “I can recommend something, but if it’s not paid for, I know my patients aren’t going to get it,” Hamilton told me.

Several doctors told me that they hope insurers end up following the recommendations of professional societies. But in the absence of official harmonization with the government, professional societies might revert to developing their own schedule. Even if they were to agree with one another, the discrepancy between official medical advice and official governmental advice casts doubt on the scientific consensus that vaccines are safe and effective. Sian Jones-Jobst, a pediatrician in Lincoln, Nebraska, told me that some of her patients’ visits are now so dominated by combatting vaccine hesitancy that she runs out of time to discuss other aspects of their health. Uncertainty also makes the work of caring for patients inherently more challenging: Before, doctors trusted that they could simply follow the recommended schedule to keep their patients up-to-date on vaccines, Jason Terk, a pediatrician in Keller, Texas, told me. Now, though, divergence is the norm.
