Out of the 30 states where measles has been detected in 2026, Florida currently ranks third in case counts. Since the start of the year, at least 132 confirmed or probable cases of measles have been reported across the state, where vaccination rates have consistently fallen below the threshold required to prevent outbreaks. The measles situation in Florida is, in other words, an urgent problem, and one the state should be urgently addressing.

But on all things measles, the state’s health department has been mostly silent. The department’s measles landing page has no map of the state’s cases and no list of vaccination sites; its “Data and Statistics” section points to measles numbers that were last updated in 2024. In the months that measles has been spreading in the state, health officials have not issued press releases about the virus or launched information campaigns to caution residents about the risks. They have not publicly advertised the benefits of vaccines. Many of Florida’s health experts remain in the dark about their own state: “There has been no—capital N, capital O—communication to physicians, in particular pediatricians, about the outbreak,” Jeffrey Goldhagen, a pediatrician at the University of Florida at Jacksonville and the former head of the Duval County health department, told me.

Officials in other states are not being so coy. In South Carolina, where nearly 1,000 measles cases have been documented in recent months, the state health department holds weekly press briefings and has plastered an orange MEASLES OUTBREAK banner at the top of its website; in Utah, which has had more than 200 cases in 2026, the health department shares granular details about where the virus has been found. Even South Dakota, which has reported just 23 cases in the past year, provides a list of vaccination clinics at the top of its health department’s measles page.

Florida is the stark outlier—and has been headed in this direction for some time. Under the leadership of Governor Ron DeSantis and Surgeon General Joseph Ladapo, the state has spent the past few years bucking public-health wisdom. A year ago, when measles spread through a Florida elementary school, Ladapo allowed unvaccinated children to return to class instead of staying home to limit the size of the outbreak; in September, he and DeSantis announced that they intend to do away with all vaccine mandates. Now, by all appearances, Florida is testing out a head-in-the-sand approach to measles.


During an outbreak, health departments are usually the first line of defense. Few other entities can serve as a hub for public communication and a trove of data while coordinating across health-care systems and emergency services. In the dozens of states that have reported measles cases since the start of 2025, most health departments have offered a consistent and very public response: issuing press releases, mapping infections, sending health-care bulletins, hosting vaccination clinics. Last year, for instance, after measles started to spread in Texas, New Mexico health officials began pushing out information before the virus had been detected there; by the time New Mexico’s own 2025 outbreak ended, the state had logged 56,000 new immunizations, Andrea Romero, the state’s immunization-program manager, told me. Several of the state health officials I spoke with emphasized transparency as one of their core philosophies: “People have a right to know,” Natasha Bagdasarian, Michigan’s chief medical executive, told me.

In Florida, though, several of the doctors I spoke with weren’t even certain how many cases had been detected in their own county. “We cannot get any information on what is happening in various parts of the state,” Mobeen Rathore, a pediatric-infectious-disease specialist at the University of Florida at Jacksonville, told me. Researchers have been able to tap into Florida’s measles-case counts only via a poorly publicized database that is not linked on the health department’s measles page. Lauren Gardner, an infectious-disease modeler at Johns Hopkins University, where she and her colleagues run a national measles tracker, told me that Florida’s data stand out as unusually difficult to find, and of the 11 Florida physicians and epidemiologists I spoke with for this story, most were unaware that recent measles cases could be found on the state database. Even when cases hit Jacksonville, Rathore said, he heard about them only because some of his colleagues had seen them; pediatricians in the broader community, meanwhile, weren’t told at all.

In response to a request for comment, the Florida Department of Health’s communications director, Brian Wright, told me in an email that The Atlantic was “leaning on unverified third- and fourth-hand claims and calling it reporting.” The department did not respond to questions about why it has not readily publicized measles cases.

Florida’s health department used to be as reliable as other states’: As recently as 2019, its website clearly documented recent measles cases and described the measles-mumps-rubella vaccine as “the best way to prevent measles.” That information is now gone—and the experts I spoke with consistently described the measles content left on the health department’s website as superficial, biased, and even misleading. Although the website does warn about “suspected measles cases” in Collier County—which accounts for about three-quarters of the cases in the state—it then links to Collier’s health-department website, which links back to the state health department’s website. The state’s site also fails to mention that unvaccinated people are at high risk of severe illness; meanwhile, it promotes vitamin-A supplements as a possible treatment (as Health and Human Services Secretary Robert F. Kennedy Jr. has), even though supplementation is generally considered essential only for those with a deficiency, which is very rare in the United States.

Rana Chakraborty, a pediatric-infectious-disease specialist at the University of Miami’s Miller School of Medicine, told me that he has stopped relying on the health department for measles information and has instead been following the advice of the American Academy of Pediatrics and the Infectious Diseases Society of America. Health-care establishments across the state are now charting their own course as they prepare for future outbreaks. “We’re all scrambling a little bit to know what the right or best thing to do is,” Chakraborty said.

Even in Collier County, where dozens of cases have been detected at Ave Maria University since late January, information is spotty. Shannon Fox-Levine, who represents that region in the Florida chapter of the AAP, told me that she consistently hears from her colleagues in Collier that “there has been very little transparency from the school and the health department” about local measles cases—including whether officials have continued to detect spread. A page on Collier’s health-department website lists two clinics where vaccine appointments can be scheduled, but not much else; Ave Maria University has said that it has coordinated with the health department to perform contact tracing and testing, but it stopped releasing data about the outbreak in mid-February. (Neither Ave Maria University nor the Collier County health department returned a request for comment.)

The relative silence in and around Collier has made communicating with patients challenging, Fox-Levine told me. “We are the trusted source for our families,” she said. “When we don’t have answers, it can be hard to reassure them.” Many physicians also worry that the lack of public information has lulled the rest of the state into complacency: Surely, if Florida officials aren’t sounding the alarm, there’s little to worry about. “For the rest of Florida, it’s like, Measles? What measles?” one physician in southern Florida told me. (Several of the health experts I spoke with for this article requested anonymity to avoid professional repercussions for speaking about measles and vaccination without the permission of the health department or their university.)


Many of the Florida experts I spoke with said they didn’t blame health officials specifically for Florida’s minimalist response to measles. Across the health department, officials desperately want to do more to address measles cases, Goldhagen, who remains connected with many former health-department colleagues, told me. But “their hands are tied by the state.”

Several of the experts I spoke with—including one health-department employee—cited an environment of fear and restriction within Florida’s health department. The message from state leaders, they told me, has been that health officials are not to publicize outbreaks or encourage quarantines, isolation, or vaccines. The state has embraced such hands-off policies since the coronavirus pandemic: In 2020, DeSantis strongly encouraged COVID vaccines for the elderly, but as political sentiment toward COVID vaccines and policy shifted, Florida began taking a more relaxed approach to infectious disease. In 2021, the state passed a law that restricted officials’ ability to quarantine students exposed to COVID; measles cases are not subject to such restrictions, but attitudes about the two viruses appear to be similar, several researchers told me.

Unlike many other states, Florida runs a fairly centralized health department: The local health departments are staffed with state employees. “All the directors are scared enough that nobody talks,” one physician in northern Florida told me. The department has also suffered serious budget cuts in the past few years, and many health-department employees worry that doing the job they were hired for could now mean losing it altogether. (Neither the health department nor DeSantis’s office responded to my questions about how politics has shaped Florida’s measles response, or about the limits put on health-department employees.)

Still, where they can, many health officials are “working quietly to implement standard public-health practices,” the northern-Florida physician said. After two cases were detected at the University of Florida at Gainesville, for instance, health-department officials performed contact tracing, ultimately reaching some 1,000 people, Fred Southwick, an infectious-disease specialist at the university, told me. Health officials also recommended quarantines and isolation as appropriate—and people were able to seek vaccines through the health department as well as through the university or their own health-care providers, two physicians in Gainesville said. (Southwick spoke to me in a personal capacity, not on behalf of the University of Florida. In response to a request for comment, the university referred me to communications it had sent to its students, faculty, and staff about measles and then directed me to the health department; it did not answer my questions about the department’s degree of involvement or transparency in managing the cases.)

But these covert operations won’t cut it for much longer, experts told me. A key part of public health, Goldhagen said, is to be public about health, so that everyone has the opportunity to protect themselves and their community. Matt Hitchings, an infectious-disease researcher at the University of Florida at Gainesville, told me that “there are lots of places in Florida that appear to be right on the knife’s edge between sufficient vaccine coverage and not enough,” and although active messaging from the health department wouldn’t change every mind, Hitchings suspects that it could bump vaccination rates enough to prevent outbreaks in some of those regions. As things stand, two physicians told me that they’ve begun to imagine a future in which measles could soon be detected in nearly every single one of the state’s 67 counties.

DeSantis and Ladapo’s approach to infectious disease has been polarizing: Recent polls show that roughly 80 percent of Florida parents support maintaining current laws that require vaccines for schoolchildren—approximately in line with national attitudes. But Florida has become a firmly red state, and across the country, Republicans are more likely than Democrats to doubt that immunizations are safe and beneficial. In Florida, the share of families seeking exemptions from vaccine requirements has been rising steadily and now stands at about 5 percent.

At some point, the health department may be forced to change tactics—if other states begin warning against travel to Florida, hospitals are overrun, or people begin to die. The “look away” strategy, after all, works only as long as a disease’s impact is small enough to ignore without political consequence. But Hitchings and others told me that they can’t yet picture what that shift might look like. “What is the endgame? I really don’t know,” Hitchings said. DeSantis has pitched Florida as a haven from liberal policies: In 2024, the state posted on its borders signs reading Welcome to the free state of Florida, advertising its commitment to personal choice. It may yet have to reckon with how the decisions of a relatively small number of individuals will affect the health of entire populations—who may soon have less of a choice about whether they get sick.

Back in the early 1970s, psychologists at Northwestern University performed an experiment that, on the surface, looked like a child’s fantasy. The researchers gathered 45 college women and asked some of them to drink a milkshake—or two. Then they placed three pints of ice cream in front of each woman and asked her to taste each one. Afterward, they told each participant to “help herself to any of the remaining ice cream, as she wished,” the researchers wrote in the Journal of Personality. Finally—and this was key—each woman completed a survey meant to measure how much she dieted or “restrained” her eating, outside of the treats she had just consumed.

The findings were dramatic. On average, the women who said they didn’t diet or have weight concerns ate less ice cream if they drank at least one milkshake. The first sweet treat satiated their hunger. But for the women who dieted and felt worried about their weight, the milkshake appeared to unleash a hidden hunger. On average, they ate 66 percent more ice cream after the milkshake than they did without it.

From these data, the researchers devised a bold new theory: Dieting and weight concerns make people overeat and gain weight. Dieting remains pervasive in American culture, but the milkshake study, and similar ones that followed, nonetheless reshaped many Americans’ views of dieting and obesity. Experts concluded that all types of eating disorders—including anorexia, binge eating, and bulimia—can be brought on by intentionally trying to reduce the number of calories that you eat. Some scientists believe that dietary restraint causes obesity too.

This line of research inspired treatments for eating disorders, helped launch an anti-diet movement, fueled the trend of so-called intuitive eating, and shifted how many parents raised their kids to think about food. But more recent evidence suggests that attempting to restrict one’s food intake typically doesn’t have such dire consequences after all.

The notion that trying to diet causes eating disorders and obesity makes some sense. “There’s the idea that if you’re finding yourself thinking about food, trying to restrict what you eat or trying not to overeat, then you’re developing an eating-disorder mentality,” Michael Lowe, a psychologist at Drexel University, told me. The theory is also inherently appealing, in that most people don’t like avoiding tasty food; they can easily believe that doing so would be harmful. No wonder, then, that the idea spread far among clinicians and everyday Americans. Social media supercharged the theory, enough that many people now believe that placing any limits on your diet could be dangerous or harmful, Ashley Gearhardt, a psychologist at the University of Michigan, told me. Many parents share the belief that letting kids follow their own appetites will create healthy attitudes toward food; taken to its logical extreme, that way of thinking means that “in a lot of circles now, if you don’t let your kids have unlimited access to ultra-processed foods, it’s a bad thing,” Katherine Balantekin, a registered dietitian at the University at Buffalo, told me.

Such ideas spread even as researchers were uncovering major flaws in early studies on the link between dietary restrictions and eating disorders. Those experiments didn’t use a consistent definition of dietary restraint, and never tested whether it actually caused eating disorders or overeating; they could say only that those behaviors occurred together. Plus, many studies lumped together several types of eating disorders, or didn’t separate participants with obesity from those with low body weights.

[From the May 2023 issue: Nutrition science’s most preposterous result]

Scientists, including the ones who ran the 1975 milkshake study, also relied on self-reports or surveys to quantify how much a participant dieted, assuming that people who said that they greatly restricted their consumption really did take in fewer calories. But decades later, when scientists gave the same surveys to new participants and measured their calorie intake, they found that the surveys simply didn’t correlate with calorie restriction, Eric Stice, a psychologist at Stanford who led some of these measurement studies, told me. People whom such surveys would label “high dieters” may not have been dieting at all, Stice found. In one of his studies, a so-called high dieter ate, on average, 23 calories fewer a day than a low dieter. “That’s like not eating four peanuts each day and saying you’re on a diet,” he said.

By the 2000s, scientists began to run randomized, controlled trials that could accurately test the model proposed back in the ’70s. In one series of studies, people were prescribed personalized diets aimed at reducing calorie consumption, and taught effective ways to adhere to their eating plans. After six months, those volunteers lost about 10 percent of their body weight, on average, compared with the 1 percent that the control group lost. And the increased dieting didn’t exacerbate participants’ eating-disorder symptoms. In fact, it decreased their binge eating, and they felt less concerned about their body size (perhaps, in part, because their body size decreased). In the past decade, psychologists at Yale School of Medicine have run similar randomized, controlled studies on people who had already been diagnosed with binge-eating disorder and obesity. And again, on average, calorie restriction reduced binge eating; participants’ eating-disorder symptoms worsened only occasionally, and no more so than in the control group. In at least one paper, eating-disorder symptoms improved far more among people in the restricted group than in the control group. In another, weight loss led to remission of binge eating in nearly three-quarters of the participants.

The scientific consensus that has emerged after these and similar studies is much more nuanced than the one proffered 50 years ago—even though that one still has significant traction in American culture today. “Dietary restriction is not necessarily all good or all bad. But different degrees may be helpful or harming to different people,” Sydney Yurkow, a psychologist at Yale School of Medicine who contributed to the recent trials there, told me in an email. For example, she said, cutting back on food would never be recommended for people diagnosed with anorexia nervosa. And even for people without an eating disorder, “extreme restriction that often coincides with self-directed dieting is largely unhealthy and unhelpful,” Yurkow wrote.

[Read: We have no drugs to treat the deadliest eating disorder]

The new generation of experiments has also uncovered a surprising way to prevent future eating disorders in high-risk adolescents and young women: a small amount of effective dietary restraint. One 2021 meta-analysis found that teaching people about healthy eating habits—including how to curb a tendency to overeat—prevents the future development of eating disorders. Altogether, Stice said, the modern experiments suggest two likely pathways for developing eating disorders. The first involves a person who’s dissatisfied with their body and engages in extreme weight-loss behaviors to change it. “But there’s a whole other pathway that many people have been ignoring,” he said—one in which a person overeats or binges, gains weight, and then becomes dissatisfied with their body as a result.

From this perspective, the milkshake study looks quite different. Fifty years ago, psychologists concluded that dietary restraint caused the women to overeat the pints of ice cream. But the modern interpretation suggests that the reverse was likely true for many of the participants: An underlying tendency to overeat drove the women to try to diet.

In hindsight, the timing of the milkshake study is almost prophetic. Around the late 1970s, the food environment in America began to change rapidly, Lowe, the Drexel psychologist, told me. “The availability of fast food, restaurants and high-sugar, high-fat foods began to explode,” he said. Food became much more difficult to resist, even when Americans weren’t hungry. “People suddenly had to restrain their eating much more in order to maintain their weight.” More people began to gain weight, and by the early ’80s, the country had entered the first stage of the current obesity epidemic. Today, the average American consumes about half their calories from ultra-processed foods. The precise bounds of the category generate debate among some scientists, but they generally agree that such foods are highly refined, manufactured in industrial factories, and calorically dense. Studies have found that diets high in these foods can coax people to consume hundreds of extra calories each day, and when people binge eat, they tend to do so with only ultra-processed foods.

[Read: Coke, Twinkies, Skittles, and … whole-grain bread?]

Recently, scientists and government leaders have begun warning Americans about the potential harms of eating too much ultra-processed food, including an elevated risk of diabetes, certain cancers, and depression. Health Secretary Robert F. Kennedy Jr. has declared war on added sugars, and as of this year, the national dietary guidelines advise against eating highly processed foods. In reality, the study of this kind of food is just getting started. But if Americans are ever to really understand just how such foods affect us, now is the time to abandon the misguided lessons of the milkshake study. Maybe then we can explore how we might develop a truly healthy relationship with the tantalizing food around us.

If nutrition is a sport, it has no casual fans. Supporters of Team Protein, the 2025 champions, are numerous and passionate, backed up by a sprawling industry of protein-supplemented products such as popcorn, soda, and cereal. Also popular is Team MAHA, captained by Health Secretary Robert F. Kennedy Jr., which endorses “real foods,” especially red meat and dairy. The Dietitians are veteran players with an old-school strategy: going heavy on plants and light on saturated fats. Alongside underdogs like Team Keto and the Vegans, there are the Fiber-Maxxers, upstarts whose popularity has soared alongside sales of fiber-filled cookies, powders, and drinks.

As in any fandom, choosing one team can mean demonizing the others’ stars: MAHA partisans despise the Dietitians’ low-fat milk, and the Fiber-Maxxers sneer at Team Protein’s constipating supplements. Yet there is one player that any team would gladly welcome. It’s packed with fiber and protein. Kennedy would call it a “real food.” It’s plant-based, widely available, and incredibly affordable. It is the homeliest and humblest of foods: the bean.

Beans have a lot going for them. (The term beans is often deployed as a catchall for the larger family of legumes, which includes the subset of dried seeds known as pulses; here, I’m talking about all of them.) These tiny packages pack a nutritional punch—so much so that the advisory committee for the 2025 U.S. Dietary Guidelines recommended upping the daily serving size of legumes and promoting them as a protein source over meat and seafood. (The meat-happy published guidelines did not incorporate this suggestion.) Navy beans, for example, are especially fiber-dense, and lentils are protein powerhouses. To the farmer, beans are a boon: The plants store nitrogen in their roots, so they require less fertilizer and leave soil healthy once they’re harvested. They are significantly gentler on the climate than meat. Cooked well, they are creamy, tender, and excellent vehicles for flavor.

[Read: If everyone ate beans instead of beef]

Even the most skilled player can’t excel at everything. Dried beans take time and effort to cook. Expert technique can make them delicious, but they’ll never be as succulent as steaks. And yes, they can cause horrible gas, especially among bean novitiates—that is, most Americans. Unlike, say, Mexican or South Asian cuisines, American food is not particularly bean-heavy. Americans consume roughly 60 pounds of beef annually, but only about nine pounds of beans.

The discrepancy is partly because of beans’ lackluster image. Bean companies “might need a little bit of extra oomph in the marketing department,” Jonna Parker, a produce analyst at the industry-research firm Circana, told me. Zach Conrad, a nutritional epidemiologist at William and Mary, recently co-authored a study showing that most Americans don’t eat enough beans to meet the recommendations of the Dietary Guidelines advisory committee. The paper also noted that beans, owing to their relative cheapness, have a stigmatizing association with poverty. “At a nice restaurant, on a date, most people are not going to get a bean salad,” Conrad told me.

But a confluence of changes in American life is making beans a more attractive choice. Other humble foods, such as tinned fish and cottage cheese, have lately experienced a bump in their status, thanks in part to the nation’s ongoing protein obsession. The protein in beans isn’t as easy for the body to absorb as that in animal products, so you have to eat more to get the same amount, Conrad said, but the conversion is hardly unreasonable. Opting for beans also helps square some of the conflicting nutritional advice in the 2025 Dietary Guidelines. The latest version newly emphasizes protein consumption, but also maintains firm limits on saturated fat, which is plentiful in red meat and associated with a higher risk of heart disease and stroke. As my colleague Nicholas Florko has noted, meeting those standards with animal products would be a challenge. But with beans, it’s almost trivial.

Lately, food costs have become eye-wateringly high, particularly for protein. Beef prices were nearly 15 percent higher in September 2025 than they were a year earlier; this year, chicken will likely become less affordable too. Kennedy recently encouraged people to embrace offal as a more affordable source of protein. Even canned foods have become more expensive. Yet beans, canned or otherwise, are still one of the cheapest protein sources around. A can of navy beans costs about a dollar and contains nearly the same amount of protein as a McDonald’s Quarter Pounder.

[Read: America’s grocery lifeline is fraying]

If the nutritional, environmental, and financial benefits aren’t sufficient reasons to root for beans, take note of their recent makeover. Beans are no longer “what Grandma used to make,” Parker said. American culinary enthusiasts have been experimenting with them since at least the early 2000s, but bean innovation really took off in the early days of the coronavirus pandemic, when people adopted pantry cooking as a matter of staying safe, budgeting, and passing the time, says Bettina Makalintal, a senior reporter at Eater whose popular Instagram account is filled with alluring photos of stewed legumes. Well-known recipe developers including Alison Roman—who in 2021 helped propel the “brothy bean” into culinary-world ubiquity—elevated beans further. The food influencer and model Pierce Abernathy gives beans the gourmet treatment with ingredients such as salsa verde and bottarga. Abernathy, whose bean recipes include lesser-known varieties such as Anasazi and gigante beans, is among a growing number of heirloom-bean enthusiasts; Rancho Gordo, an heirloom-bean company, runs a subscription club that reportedly has 29,000 people on its waitlist.

Food companies are riding this leguminous wave, introducing bean-based products with unexpected flavors and convenient formats. The vibrantly branded canned-bean stews from Heyday Canning, launched in 2020, include products such as kimchi-sesame navy beans, harissa-lemon chickpeas, and vodka-sauced cannellini beans. A brand called Lentiful sells individually packaged, microwavable lentil stews in flavors such as Thai Coconut Curry and Lemon Mediterranean, marketing them as grab-and-go lunches. Lentil Telepathy, which launched in 2023, specializes in air-toasted crunchy lentils that can be eaten as snacks or salad toppers, as in the case of the peri-peri and salt-and-vinegar varieties, or as breakfast cereal, if toasted-marshmallow or chocolate lentils hold more appeal. Bean dips can be scooped up with bean chips. Should you prefer a lighter bite, a line of jarred fermented-bean salads is due to launch next month.

“Americans are finally catching up to what much of the world has known for centuries,” Ben Bacon, a co-founder of Lentiful, told me: Beans are a main course, not just a side. Steve Sando, the founder of Rancho Gordo, told me that he hopes the beanthusiasm is here to stay. “Being the flavor of the week is kind of fun, but I really want people just to incorporate beans into the American diet,” he said. Team fandoms, of course, run deep. But no matter where allegiances lie, everyone should be able to agree that beans are the MVP.

Last September, during a memorably bizarre press conference, President Trump told pregnant women—repeatedly and emphatically—not to take Tylenol. His health secretary, Robert F. Kennedy Jr., pointed to studies that “suggest a potential association” between acetaminophen use during pregnancy and neurodevelopmental disorders. Kennedy was more adamant on The Joe Rogan Experience recently, insisting that he had read 76 studies on the subject over one weekend and concluded that the “science is really clear” in showing a link between the drug and the conditions.

The science, in fact, is not at all clear. Although some studies have indeed found an association between prenatal acetaminophen use and autism or ADHD, others have not, including a recent systematic review of 43 studies. But a new study, carried out in Taiwan and published today in JAMA Pediatrics, seems poised to inflame the controversy anew.

Kennedy’s confident assertions aside, the FDA has offered a more evenhanded assessment of prior evidence. On the day of the September press conference, the agency issued a notice to doctors acknowledging that “a causal relationship has not been established” between prenatal acetaminophen use and neurological conditions. That’s exactly right. The studies that have found an association tend to be small and unable to determine causality or rule out other potential causes. Two recent large studies—one conducted in Japan, the other in Sweden—examined siblings and found no association between acetaminophen and autism. Such studies, common in epidemiology, compare siblings in the same family, making use of the similarities in their genetics and environment to help rule those out as culprits. The Swedish researchers, for example, found a weak association between acetaminophen and neurological disorders among all 2.4 million children in the study. But when the study was narrowed just to siblings, comparing, for instance, one who was exposed prenatally and one who was not, the association vanished, suggesting that a factor other than acetaminophen use was at play. The Japanese results followed a similar pattern.

At first glance, the new Taiwanese study appears to mirror the conclusions of the Swedish and Japanese studies. The researchers analyzed the health records of more than 2 million children born from 2004 to 2015, along with their mothers. Across the entire study, mothers who were prescribed acetaminophen during pregnancy were more likely to give birth to a child who would later be diagnosed with autism or ADHD. (In Taiwan, most acetaminophen is taken by prescription, the study’s authors suggest.) And, as in the Swedish and Japanese studies, the apparent association between acetaminophen and the neurological disorders disappeared when the analysis was narrowed to look only at siblings.

But the researchers conducting the Taiwan study also found something weird. When only an older sibling was exposed to acetaminophen during pregnancy, that child was more likely than the younger, unexposed sibling to have ADHD or autism. When only a younger sibling was exposed, the risk for that child decreased. In other words, the siblings didn’t appear to be acting as reliable controls for each other in the experiment, as would be expected.

It’s a head-scratching result, and even the authors don’t seem to have an explanation for it. (Zeyan Liew and Pei-Chen Lee, who are listed as the study’s corresponding authors, did not respond to my requests for comment.) In the paper, they argue that their findings cast doubt on the validity of sibling studies more generally, including their own: The study design, they suggest, might accidentally introduce unexplained biases, and the families included might not be a representative sample of the general population. Their skepticism of sibling studies runs counter to the long-established notion that the narrower approach is more rigorous, but it does seem to support Kennedy’s dismissal of the many studies that indicate acetaminophen’s safety. Late last month, Kennedy told Rogan that such studies “have huge holes in them.” (He did not specify what those holes might be; Andrew Nixon, a spokesperson for the Department of Health and Human Services, did not clarify when I asked him.)

[Read: The Trump administration’s unintended autism experiment]

To try to make sense of the findings, I spoke with Viktor Ahlqvist, an author of the Swedish sibling study. Ahlqvist thought the new study’s results were broadly consistent with his own and was perplexed by the researchers’ emphasis on what seemed to him like a “local quirk,” perhaps related to some unknown variable that’s specific to Taiwan. (For instance, a mother in Taiwan who’s more likely to use acetaminophen in a first pregnancy might be different from other mothers in other ways as well.) Besides, Ahlqvist said, the oddity in the Taiwanese data doesn’t invalidate the overall results of the Swedish or the Japanese studies—or of the Taiwanese study itself, for that matter. Yusuke Okubo, a co-author of the Japanese sibling study, was more cautious. He told me that although the new study mostly confirmed his research, the discrepancy between the outcomes for older and younger siblings “suggests that unaddressed biases may also remain.” Ruling out any potential harm based solely on sibling studies like his own is therefore premature, he said.

And so the science remains just as inconclusive as it was back in September. Since then, several of the administration’s acetaminophen-related promises seem to have fallen by the wayside. Kennedy said at the time that the FDA would start the process of updating acetaminophen’s label to indicate the possible risks during pregnancy; in an email, a spokesperson for Kenvue, which makes Tylenol, said the FDA has yet to contact the company about that proposed change. (“We believe that there is no credible data that shows a proven link between taking acetaminophen and autism,” the spokesperson wrote.) The secretary also teased “a nationwide public-service campaign to inform families” about the alleged dangers of prenatal acetaminophen use; if such a campaign has begun, it has gone under the radar.

Back in the fall, the FDA did advise doctors to consider minimizing the use of acetaminophen for low-grade fevers in pregnancy, though the notice also correctly stated that the drug is safer for pregnant women than aspirin or ibuprofen. A recent analysis in The Lancet found that, in the weeks following the White House press conference, prescriptions for acetaminophen given to pregnant women who visited emergency rooms dropped as much as 20 percent below what would otherwise have been expected; after 11 weeks, prescription rates were approaching typical levels again. (Nixon didn’t respond to questions about the label change or the nationwide campaign, but did point me to an X post about the Lancet data, in which he wrote, “Delivering a message about a specific neurological risk for babies is another example of our commitment to telling the truth about public health.”)

[Read: Trump tells pregnant women to ‘fight like hell’ not to take Tylenol]

Last spring, Kennedy promised that the health department would figure out, in just a few months, not only what has caused autism rates to rise in the United States, but also how to prevent kids from being exposed to whatever it is. The promise was implausible, if not absurd. Autism researchers have been trying for decades to understand the complex disorder, and the consensus among them is that the increase can largely be attributed to broader diagnostic criteria and better surveillance. But pinning the blame on acetaminophen, even if the case has always been weak, allowed the Trump administration to appear to meet its self-imposed deadline. That association, at least, is clear.

This time last year, Jay Bhattacharya’s main claim to fame was, in essence, a hot take on COVID. In 2020, Bhattacharya, then a health economist at Stanford University without specialized training in infectious disease, co-authored the Great Barrington Declaration, an open letter that downplayed the risk of COVID and called for most of society to reopen before the arrival of vaccines. Back then, health experts widely excoriated this laissez-faire approach as dangerous and ill-conceived; now Bhattacharya wields more power over the direction of U.S. health policy than most Americans ever have. When Donald Trump returned to office, he tapped Bhattacharya to lead the National Institutes of Health. And last month, Bhattacharya became the only person who has ever been tasked with directing the NIH and the Centers for Disease Control and Prevention at the same time.

As the acting director of the CDC, Bhattacharya’s tenure will likely be brief; Trump reportedly plans to name a new permanent director soon. But Bhattacharya clearly wants something from the agency. In his first email to CDC staff, he wrote that the federal government’s “decisions, communications, and processes” broke the public’s trust during the pandemic, and that “acknowledging this reality is a necessary step toward renewal.”

In practice, the CDC has been undergoing a kind of forced renewal for months. Since Robert F. Kennedy Jr. took over as the secretary of the Department of Health and Human Services, thousands of people have been pushed out of the CDC, and several prominent agency leaders have resigned their posts. Last year, Kennedy also dismissed the entirety of the agency’s independent vaccine-advisory group, the Advisory Committee on Immunization Practices (ACIP), and replaced it with a more anti-vaccine cohort. Several of the CDC’s vaccine recommendations have been stripped down too—in many cases, “as far as they can go without affecting coverage guarantees” from insurers, Jason Schwartz, a vaccine-policy expert at Yale, told me. And the CDC has lacked a permanent leader since Susan Monarez, the most recent director, was abruptly ousted from her role last August. (Monarez asserts that she was forced out after she refused to rubber-stamp Kennedy’s restrictive and scientifically unfounded vaccine policies; Kennedy has contradicted this account, accused Monarez of lying, and said that she resigned after he pressed her on whether she was trustworthy.)

Bhattacharya himself remains steadfast in his pandemic-era views. More than five years after he first became a vocal opponent of COVID lockdowns, he continues to relitigate that position on podcasts, in interviews, and on social media. Lockdowns themselves might now be a moot policy point, but COVID vaccines, another of Bhattacharya’s pandemic sore spots, are still under active discussion at HHS.

For years, Bhattacharya has insisted that policies that pushed for widespread COVID vaccination violated “informed consent rights” and were “dangerous for public health.” He has disputed the abundant evidence that COVID vaccines are effective and safe. He has also argued that the continued investment in COVID shots has been a waste and that improving Americans’ baseline health is a better way to guard against future pandemics than stockpiling vaccines is.

In the lead-up to the midterms, Kennedy is reportedly nudging HHS away from attacking infectious-disease policy. But COVID vaccines, which are particularly unpopular among Trump’s Republican base, might still represent a politically palatable target, Dorit Reiss, a vaccine-law expert at UC Law San Francisco, told me. Several new members of Kennedy’s remade ACIP have repeatedly cast doubt on COVID vaccines’ safety; following that group’s advice, the CDC recently stopped recommending the shot to all Americans and now says that people should consult with their physicians before receiving one. Bhattacharya may oversee further downgrades to the CDC’s recommendations: At its next meeting, later this month, ACIP is scheduled to discuss COVID vaccines again—this time, weighing in on “injuries” that the shots might cause to the people who choose to receive them.

What exactly that discussion might touch on, or what the committee might vote on, isn’t yet clear. Although COVID vaccines do come with side effects, serious and lasting consequences are very rare; the shots are still a powerful protective measure against severe disease, especially for populations at the highest risk. Separately, some of the Trump administration’s FDA officials have faulted COVID shots for deaths in children, without publicly revealing evidence. But typically, the National Vaccine Injury Compensation Program, which is operated by a different agency at HHS, handles reports of adverse effects from shots. ACIP, meanwhile, is supposed to advise the CDC on which Americans should get which vaccines.

Any exaggeration of the harms of COVID vaccines by ACIP’s members could give the committee, and ultimately Bhattacharya’s CDC, justification to advise certain populations to stop taking the shots altogether. The net effect of such a change could be small; uptake for COVID shots, after all, is already low. But if the CDC or its advisers cast further doubt on the vaccines’ effectiveness and safety, that shift might seem to vindicate Bhattacharya’s long-standing views—that the immunity left behind from bouts of COVID “is stronger and longer lasting than vaccine-induced immunity,” and that unchecked infection among the young and healthy is an acceptable, and even desirable, pandemic strategy. For the CDC, an agency whose explicit goal is to protect public health by controlling and preventing disease, injury, and disability, the change would reinforce the idea that politics has come to supersede evidence under this administration.

When reached for comment, Andrew Nixon, the deputy assistant secretary for media relations at the Department of Health and Human Services, wrote that Bhattacharya “is focused on restoring CDC as the world’s most trusted guardian of public health through sustained reform and by ending the culture of insularity that eroded public confidence during the pandemic.” (Bhattacharya did not respond to a direct request for comment.)

Gigi Gronvall, an immunologist and a health-security expert at Johns Hopkins University, worries that Bhattacharya’s approach to previous outbreaks could also color the CDC’s response to current threats. Among the largest concerns is measles, which has sparked thousands of cases in the United States since the start of 2025 and may soon be declared endemic in the country again. HHS’s response to the virus’s resurgence has been unorthodox: Kennedy has acknowledged that vaccines are the most effective way to prevent the spread of measles and yet has also derided measles vaccines’ effectiveness and exaggerated their harms, and declined to directly urge families to immunize their children. (He has also inappropriately propped up nutritional supplements as a first-line defense against the virus.)

Bhattacharya has departed from Kennedy on measles vaccination in key ways. During his confirmation hearing last year, Bhattacharya said he was convinced by the data showing no connection between the measles, mumps, and rubella (MMR) vaccine and autism—a sentiment that Kennedy refused to convey during his own hearing. In January, on a New York Times podcast, Bhattacharya also described his distress that uptake of the MMR vaccine was “too low.” And in meetings at the CDC last week, Bhattacharya reportedly offered to publicly encourage parents to vaccinate their children against measles.

Some within the agency greeted this announcement with gratitude and enthusiasm. But Reiss and several other outside experts told me that such reactions only reinforce how low the agency’s standards have been set. In theory, Bhattacharya’s internal comments could presage a friendlier approach to vaccination from the federal government, but that has yet to materialize: In a video posted to social media this week, Bhattacharya, as Kennedy has done, lauded the protective powers of measles vaccination but stopped short of directly encouraging families to vaccinate. (Nixon disputed the notion that any of Bhattacharya’s comments on the MMR vaccine represented a departure from other HHS leaders. “Vaccination remains the most effective way to prevent measles, and Secretary Kennedy and other HHS principals have been very clear and consistent on this point,” he wrote.)


The measles vaccine, Bhattacharya has correctly noted, is more powerful than the COVID one, especially when it comes to reducing transmission. But containing measles outbreaks also requires substantial investment in contact tracing, quarantines, and public-health messaging that might restrict people’s movements and behaviors—all of which could run up against Bhattacharya’s sensibilities. The Great Barrington Declaration suggested that maintaining a relatively open society, largely free of mitigation measures, could keep deaths low while limiting other harms to the public. But it also implicitly accepted a level of suffering and death that most people in public health found untenable—a framework that could easily translate to today’s epidemic responses. Bhattacharya’s COVID rhetoric has at times mirrored Kennedy’s framing of measles: minimizing the severity of disease, dismissing infection as inconsequential for the young and healthy. I asked HHS whether, in Bhattacharya’s view, the country’s current measles outbreaks might warrant prolonged quarantines, strengthened vaccine mandates, or strict school policies—such as excluding unvaccinated children from classrooms after potential exposures—but Nixon did not answer that question.

Bhattacharya seems bent on “rewriting all of history retroactively to prove he was right all along,” Santiago Enrique Sanchez, a Stanford M.D.-Ph.D. student who has closely followed and written about Bhattacharya’s career, told me. Even if Bhattacharya’s time at the CDC will be brief, he seems eager to force the agency to acknowledge his version of reality while he’s in charge. As much as he may claim to be working to restore trust in the agency, his actions seem far more likely to achieve the opposite.

Updated at 1:20 p.m. ET on March 5, 2026

The surgeon general may be the nation’s doctor, but she’s not your doctor. At least that’s the view of Casey Means, the physician turned wellness influencer who is vying for the role. When asked during her recent Senate confirmation hearing whether she would encourage parents to vaccinate their children against measles—hardly a theoretical question, given the country’s ongoing outbreak—Means delivered an elliptical response. She supported vaccines as a public-health tool but wouldn’t necessarily encourage someone to get them. “I am not an individual’s doctor,” she said. (In fact, she is no one’s doctor at the moment, because she lacks an active medical license.) “And every individual needs to talk to their doctor before putting a medication into their body.”

Means’s nonanswer was more than a dodge. It represents a paradox in the Trump administration’s approach to public health. Means, Health Secretary Robert F. Kennedy Jr., and their allies have consistently eroded confidence in the medical establishment—even as they propose to solve the widespread mistrust of public health by encouraging Americans to talk to their doctor.

Kennedy and his subordinates have spent the past year infusing this approach into federal immunization policy. In October, the CDC announced that it would no longer encourage universal COVID boosters for either adults or children. Instead, it recommended a process known as “shared clinical decision-making,” in which doctors engage their patients in a detailed conversation about the pros and cons of the vaccine without defaulting to an endorsement. Jim O’Neill, the acting CDC director at the time, said in a press release that the previous guidance “deterred health care providers from talking about the risks and benefits of vaccination for the individual patient or parent.” In January, the CDC went one step further, striking several more vaccines—those against rotavirus, influenza, meningococcal disease, and hepatitis A and B—from the routine-childhood-immunization schedule. Shared decision making would be the preferred approach for these immunizations as well. On a podcast, Kennedy touted the move as returning “freedom of choice to American mothers.”

Who could argue against freedom of choice? The difficulty is that in medicine, shared decision making is not simply a synonym for allowing a patient to ask questions and decide for themselves. That sequence of events is covered by informed consent, a process that takes place before any medical intervention, including vaccination. Shared decision making, in contrast, is a somewhat nebulous term reserved for clinical borderlands: scenarios in which the trade-offs are nuanced or the best course of action is unclear.

Perhaps the most notable example of shared decision making in my area of medical practice is prostate-cancer screening. A handful of large, randomized studies have examined whether the prostate-specific-antigen blood test truly saves lives by detecting cancers earlier. These trials have yielded mixed results, suggesting that, at most, only a small number of men might avoid dying from cancer by getting checked. In contrast, the potential harms of treating an asymptomatic tumor that would not otherwise have been discovered—lifelong urinary and sexual problems, for example—are common. Given the intimate risks and modest benefits, prostate-cancer screening is not explicitly recommended by public-health or physician groups; instead, patients and their doctors are left to their own devices. I have heard from many urologists and primary-care doctors that this process can be baffling and unsatisfying for them and their patients alike. Patients may feel overwhelmed by the complexity of the scientific data, and doctors often feel that they are abandoning their responsibility to offer patients a clear recommendation.

Routine immunizations do not inhabit the sort of gray zone that would warrant such an elaborate discussion. Take the hepatitis-B shot. Everyone who doesn’t get it is susceptible at some point in their life to infection with the virus, which, if it becomes chronic, commonly results in serious liver damage and liver cancer. The vaccine is highly effective at preventing this infection, and its potential harms are either very mild or extremely rare. Vaccination also benefits entire communities by reducing spread of the disease. What would a shared decision even look like in this case? On one hand, a doctor might tell her patient, a very safe immunization can effectively prevent a potentially deadly disease across your child’s entire life. On the other hand, anti-vaccine activists have asserted, without convincing evidence, that vaccination comes with a host of possible dangers.

Shrouding established immunizations in the confusion of shared decision making does not add nuance; it introduces only ambiguity. The Department of Health and Human Services, the Office of the Surgeon General, and public health as a whole exist in large part to offer scientifically accurate advice that doesn’t take a physician to understand. The system that Means seemed to be proposing for the measles shot during her confirmation hearing is so absurd that it left Senator Bill Cassidy—a doctor himself—apparently dumbfounded. At one point, Cassidy wondered whether Means was likening immunizations to something as risky as a bypass surgery, which does require a doctor and patient to intensively discuss the risks and benefits. (Means did not directly answer the question, and instead responded, “The constraints on doctors are monumental, and many American parents are frustrated by what they feel like is lack of transparency on the issue of vaccines.”) The comparison was clarifying. The risks of coronary-artery-bypass surgery, which involves cracking someone’s chest open and stopping their heart, are hardly in the same league as a vaccine-induced sore arm or a mild fever—or even the statistically minuscule risk of a serious side effect, such as anaphylaxis.

[Read: Well, that didn’t sound like Casey Means]

Kush Desai, a White House spokesperson, wrote in an email after this article was published that “forcing decisions down the throats of patients is what is driving confusion and mistrust, not Dr. Means suggesting that patients should play an active role in their health care.” Means herself declined to comment. Andrew Nixon, a spokesperson for HHS, wrote in an email that the CDC “has an established tradition of applying shared clinical decision-making when individuals may benefit from vaccination, but broad vaccination of people in that group is unlikely to have population-level impacts.” Nixon cited COVID vaccines, which provide only modest protection against infection, as appropriate candidates for shared clinical decision making. But in seeming contrast to Means’s testimony, he indicated that such an approach would not be appropriate for the measles shot.

The administration’s defenders might argue that shared clinical decision making is a way to ensure that informed-consent discussions occur reliably, or that the practice constitutes a calculated retreat from the sort of public-health paternalism that lost credibility during the coronavirus pandemic. Nixon suggested as much, writing that this story’s framing is an attempt at “reputation rehabilitation of the experts who presided over a sharp decline in public trust.”

Yet the administration is applying epistemic modesty selectively. When HHS announced the government’s new set of dietary guidelines earlier this year, the instructions were unambiguous: Americans “must prioritize whole, nutrient-dense foods—protein, dairy, vegetables, fruits, healthy fats, and whole grains—and dramatically reduce highly processed foods,” Kennedy said in an accompanying press release. President Trump and Kennedy have also confidently instructed Americans on what they say is the proper use of Tylenol. “You shouldn’t take it during pregnancy,” Kennedy told the podcaster Joe Rogan recently, due to what his administration perceives as a heightened risk of autism associated with the painkiller. At a September press conference, Trump instructed pregnant women to “fight like hell” not to take the drug. Nixon defended the administration’s definitive language, writing, “Delivering a message about a specific neurological risk for babies is another example of our commitment to telling the truth about public health.” But among scientists, the alleged connection between Tylenol and developmental disorders remains quite controversial.

Meanwhile, many leaders of the Make America Healthy Again movement have repeatedly undermined the very professionals whom they now insist patients consult before every shot. FDA Commissioner Marty Makary has accused doctors of relying on “dogma” and “groupthink.” Vinay Prasad, the FDA’s top vaccine regulator, aggressively sowed distrust in public-health authorities before becoming one himself. Kennedy has celebrated “the pediatricians who earn families’ trust every day,” but also has accused the American Academy of Pediatrics of being beholden to pharmaceutical companies; in an interview last summer, Kennedy said that “trusting the experts is not a feature of either science or democracy.” Means, too, has seemingly contradicted the advice she is now giving. A chapter of Means’s book, Good Energy, is titled “Trust Yourself, Not Your Doctor”; in a 2024 Free Press essay adapted from the book, Means wrote, “Most health advice ends with a disclaimer to ‘consult your doctor.’ I have a different conclusion: When it comes to preventing and managing chronic disease, you should not trust the medical system.”

[Read: America’s would-be surgeon general says to trust your ‘heart intelligence’]

The Medicaid and Medicare administrator, Mehmet Oz, a former cardiothoracic surgeon, seems to be one of the few federal health officials who understand the stakes of instructing Americans to embark on in-depth conversations with their doctor before getting routine vaccinations. Oz has counseled countless patients about major heart surgery, but he hasn’t felt the need to prevaricate about something as straightforward as the measles shot. “Take the vaccine, please,” he pleaded recently in an interview on CNN. It’s just the sort of clarity that was once routine in medicine and public health but now risks extinction under Kennedy’s leadership.

The eating habits of American adults have, in recent years, begun to resemble those of hobbits. Maybe you, too, have scarfed down scrambled eggs at home in the morning, only to arrive at the office and supplement them with a protein bar for second breakfast and a bag of chips for elevenses. The late-afternoon pastry and banana-bread mocha latte have proliferated—and for humans, at least, may become an existential threat to dinner.

Blame the coronavirus pandemic; blame Ozempic; blame inflation. Whatever the cause, intermediary bites and sips make up a growing portion of Americans’ daily consumption, especially among young people, as my colleague Ellen Cushing wrote in 2024. The shift has now become so pronounced that restaurants are adapting to it. Chains that primarily offer meals are rolling out smaller and cheaper options—solid and liquid alike—in the hope of capturing customers who just want a snack. And in the past two years, the nation’s fastest-growing restaurant brands have been those specifically oriented toward that audience.

The restaurant industry subscribes to an extremely broad definition of snacking. Any item consumed outside the traditional breakfast, lunch, and dinner “dayparts”—industry lingo for eating occasions throughout the day—can be considered a snack, David Henkes, a food-and-beverage analyst at the food-industry research firm Technomic, told me. That includes beverages, as long as they’re purchased at a restaurant during off-meal hours; both a high-protein espresso smoothie and a black coffee count. In this view, the most important characteristic of a snack is not content or form but versatility, David Portalatin, a food-service-industry expert at the research firm Circana, told me. In fact, he said, one of the biggest drivers of the snacking trend is consumers’ demand for flexibility.

In the past few years, snacks—especially sweet ones—have powered immense growth among quick-service restaurants, a category that includes stalwarts such as McDonald’s as well as more recent arrivals such as the China-based Luckin Coffee. According to preliminary estimates from Technomic, the top 10 fastest-growing brands in the United States last year were cafés or dessert shops. Most are known for specialty drinks. The fastest-growing chain of 2025 was 7 Brew, which specializes in ultra-customizable sugary drinks such as the Cookie Butter (a creamy espresso concoction flavored with toasted marshmallow, hazelnut, and white chocolate) and the Pink Mermaid 7 Fizz Soda (a bubbly drink with notes of strawberry, watermelon, and coconut). Last year, the company opened 280 new stores, and Technomic projects that it made more than $900 million in sales. Second on the list was Swig, which sells soft drinks flavored with creams and syrups—popularly known as “dirty sodas”—followed by HTeaO, a Southern-style-iced-tea chain. The drinks sold at these chains are descendants of the Frappuccino, one of the earliest chain-restaurant products to blur the line between beverage and snack. Yet even as Starbucks attempts to refocus on coffee by moving away from desserts masquerading as drinks, newer chains are making no pretenses about selling beverages that can easily tide someone over through a mealtime or two.

[Read: How snacks took over American life]

Some brands have realized that snack time can call for both a beverage and food. Last year, Dutch Bros Coffee, best known for its saccharine, candy-colored beverages, began rolling out small, hot breakfast items—egg sliders, a single waffle—across its stores to supplement its existing snack menu. The South Korea–based companies Paris Baguette and Tous les Jours, which were also among the top 10 fastest-growing brands of last year, serve baked goods and desserts in addition to coffee- and tea-based drinks. Tous les Jours’ snacks are geared toward younger customers “who are replacing traditional meals with smaller, more intentional indulgences,” Regina Schneider, the company’s chief marketing officer, told me.

Well-established restaurant chains best known for selling full meals are getting into the snack game too. A common strategy is to offer smaller versions of typical sandwich items in the form of a wrap. Last year, McDonald’s reintroduced the chicken Snack Wrap, a palm-size crispy chicken strip enveloped in a tortilla. (It was pulled from menus in 2016 because it was a nightmare to assemble quickly, but McDonald’s says that it has since streamlined the process.) Similarly diminutive and affordable chicken wraps have rolled out at Sonic and Popeyes. Chipotle’s interim chief marketing officer, Stephanie Perdue, told me that the company is catering to demand for protein-laden options “across more occasions, especially snack-sized portions at accessible prices.” Accordingly, in December, Chipotle introduced a chicken taco and what the company described as its first-ever snack: the High Protein Cup, a four-ounce container of chopped chicken or steak. The items cost less than $4 each. Even sit-down restaurants are expanding their appetizer and side-dish offerings; earlier this year, TGI Fridays introduced new sampler platters, which were designed to give “guests a snackable option that fits any daypart,” Lauren Perez, the company’s senior vice president of global marketing, told me. Some TGI Fridays locations are even testing a kids’ menu for all ages, she said.

The snackification of restaurants, as one might call it, is partly a response to Americans’ desire for lower-calorie options. GLP-1 use, weight-loss attempts, and the popularity of lean protein are driving that demand, Portalatin said. Circana data show that 35 percent of restaurant-goers say that they’re ordering smaller portions than they have in the past, and roughly 75 percent of that group say that they’re doing so for health reasons. Some restaurants offer not only smaller items but also foods that evoke wellness. Marketing for Chipotle’s High Protein Cup, for example, touts the 32 grams of protein it contains. In January, Dunkin’ added Protein Milk drinks to its menu; they can include caffeine, B vitamins, and more than 15 grams of protein.

[Read: America has entered late-stage protein]

As American work habits become decoupled from traditional mealtimes, people want to eat in a way that’s convenient for that new paradigm, Portalatin said. Busy workdays and, especially among younger generations, guilt about taking breaks lead half of American employees to skip lunch at least once a week, according to a recent survey. “People all across the country are looking up from their desks at 2 in the afternoon and going, Oh, I didn’t have lunch, but I need something,” Portalatin said. Plus, thanks to the pandemic, a significant chunk of American employees are working from home, which means they have fewer organic opportunities to eat meals outside the house. These workers are part of the reason that the share of lunches purchased at a restaurant—the most lucrative daypart in the business—is 5 percent lower than it was in 2019, Portalatin said. Yet remote workers haven’t given up on restaurants altogether; they’re just visiting off-hours. “If you work at home, you’re like, Well, I’ve got to get out once in a while,” Sam Oches, the editor in chief of Nation’s Restaurant News, a trade publication, told me. A jaunt outside for a change of scenery between meetings may not offer enough time for a sit-down meal, but it presents a natural opportunity to pick up a snack—a little reward, perhaps, after a productive stretch. The popularity of drive-through chains such as 7 Brew and Swig reflects that shift in behavior, Oches said.

That little reward is crucial to understanding why snackification endures. As the cost of living has increased because of inflation, people are spending less at restaurants. Yet they’re loath to give them up altogether. When people decide to eat out, they consider not just the cost but also “the quality, the convenience, and the craveable indulgence that I can’t get for myself at home,” Portalatin said. These factors strongly shape appetite, even when finances are an issue. “At the end of the day, Americans love restaurants,” Oches said. And a $3 Snack Wrap gets you just as much of the McDonald’s experience as a combo meal that can cost $10 or more.

[Read: The worst sandwich is back]

Restaurants going all in on snacking is more than just a trend. It’s a major step in codifying America’s upended eating patterns. Restaurants will never entirely abandon breakfast, lunch, and dinner, experts told me, but for the foreseeable future, they’ll likely continue introducing items that people can eat whenever and wherever they need to. In that regard, the rise of snacking is anything but hobbit-like: The abundant mealtimes of Bilbo and his kin were occasions to take a break from the daily grind and savor the pleasure of eating. Ours allow us to keep eating as the wheel turns.

The day after Thanksgiving, I got a voicemail. A woman identified herself as a doctor at the University of Louisville hospital: “I believe I may have one of your family members here.”

The message was hard to understand. Most of my family lives in Kentucky, so I didn’t know whom the doctor was referring to. I called the hospital, but kept getting put on hold. Then I tried my aunt—if someone was in trouble, she’d be the one to know. But she didn’t answer.

A few hours later, her son got in touch with me. My aunt was the one in the hospital. She’d had an aneurysm on the right side of her brain, and it had burst. The drainage tube the doctors used to stop the bleeding kept slipping loose; after three tries, they finally got it to stick. Only then could they do surgery. My cousin FaceTimed me afterward, from the ICU. “Are you ready?” he asked. He angled the camera down to my aunt’s face, and I started sobbing like a sudden rainstorm.

A few days later, I got on a plane from Washington, D.C., to Kentucky and went straight to join my family at the hospital. We had always called my aunt “The Glamourina.” She wore feathered hats with sparkly shirts and experimented with different hairstyles: a butterscotch-blond cropped cut, an afro, a bob streaked with highlights. She paid for my first real manicure, when I was in high school. We wore matching striped shirts to the salon, and used an eyeliner pencil to draw fake moles above our lips, like Marilyn Monroe.

She is 58 now, and raised two kids as a single mother. She always treated me like one of her children, and I grew up to look more like her than like my own mom. When I’d talked with her the week before she ended up in the hospital, she’d asked me to play our favorite song, “I’m So Proud of You,” by Julie Anne Vargas. Now the top half of her head was shaved and staples ran in a ladder across it. IVs were taped to each arm, and a machine next to her bed was helping her breathe. She couldn’t speak. When she opened her eyes, they rolled.

Her older son was especially alarmed by how quickly she’d declined. He wanted the doctors to come into her room so they could explain what had happened. But one of our older relatives stopped him, saying that we couldn’t afford to make demands, let alone trouble, because “she don’t have a lick of health insurance.”

We knew that the hospital couldn’t deny her care, but we understood the tightrope you walk when you don’t have money. All she could afford to be was grateful.

We don’t know what caused my aunt’s aneurysm, but she’d had persistent headaches for months, and she’d been worried. Once, when she was driving, the left side of her body turned numb and her toes curled up. She pulled over but didn’t go to the hospital; she couldn’t afford it.

My aunt worked as a hair stylist at a salon for years. Most recently, she was the overnight caregiver for an elderly woman, but she had opted out of her employer-sponsored health insurance because she couldn’t afford the premium. She’d occasionally had coverage in the past, but it never guaranteed that she’d actually be able to afford health care. She called me once, defeated, because she was trying to fill a prescription at Walgreens and the pharmacy had flagged an issue with her insurance. She would need to pay out of pocket, and she didn’t have the $134.89. She was often frustrated by spending long spells on hold with insurance agents, and was overwhelmed by the complexity of the plans.

[Annie Lowrey: Annoying people to death]

My aunt’s experience with the health-care system is familiar to many Americans. In a 2023 survey by the Kaiser Family Foundation, nearly a quarter of adults said signing up for a plan was simply too confusing. Even those who have coverage may decide to delay or skip treatment because they can’t afford the out-of-pocket costs, resulting in emergency-room visits and hospitalizations that could have been prevented.

Some years, my aunt made so little money that she might have qualified for Medicaid, but not recently—the income cutoff if you’re single in Kentucky is $1,835 a month. Some years, she bought coverage through the Affordable Care Act’s exchanges, but eventually she decided it was too expensive.

Many more people are now making that same decision. In 2025, the Republican-controlled Congress voted to let the Biden-era ACA subsidies, which had helped some 22 million people afford their coverage, expire. Within just two weeks of the cutoff, at the end of December, enrollment had dropped by 1 million people. According to one group’s estimate, families are paying $200, $300, or even $1,000 more a month; many have seen their premiums double.

[Read: The coming Obamacare cliff]

In January, President Trump released his proposal for a “Great Healthcare Plan,” which suggests that savings from the former subsidies could be sent directly to “eligible” Americans. But who would be eligible? The proposal makes no mention of the many people who don’t have coverage. Then, in February, the Trump administration released a list of 43 prescription drugs that Americans can buy for reduced prices. But some of these were already available at those prices or in generic forms, and they make up a tiny fraction of the drugs Americans need; the prescription my aunt couldn’t afford, for instance, is not listed.

Nothing about Trump’s pronouncements changes the fact that millions more Americans will soon be stuck where my aunt was: in the middle—sometimes insured, sometimes uninsured, but always too poor to get the care they need.

As I stared at my aunt in the ICU, I noticed that her eyebrows were freshly waxed, and her nails had bleach-white French tips. Only the week before, she’d texted me about getting her nails done. It was an indulgence she rarely allowed herself: “Woo this pedi feels good. I haven’t had one since last year.” When I rubbed Vaseline on her chapped feet, I discovered her ruby-red toenails.

She could not have known that the decision to finally splurge a little on herself would be a conversation starter with the nurses, who complimented her on her nails and eyebrows. Her grooming signaled to them that she was someone who took care of herself, someone who deserved their attention and respect.

I drove to her house later that week to meet her younger son. We’d planned to check on her bills—to see if we could find her bank PIN or account information to make sure that her finances stayed on track. I found notebooks filled with her handwriting, a list of numbers down each page that looked like an unsolved equation. These, I realized, were her monthly expenses, along with details such as the confirmation codes for bills she’d paid. Stuffed inside one notebook was a pawn-shop notice, announcing the shop’s full ownership of an item she’d pawned.

For years, not having enough money nibbled at my aunt’s health. She texted me about having severe pain in her back and breasts. She wrote that she had a “knot” in one breast—“I’m thinking just polyps.” She lost a lot of weight and said she was feeling depressed. I suggested reaching out to a psychiatrist to ask for antidepressants. She wrote back: “That cost. That’s why I need insurance.” She was tired of pretending to be okay. After paying for her mortgage, water bill, Wi‑Fi, car insurance, and other necessities each month, she’d usually be out of money. She was always transparent with me about her struggles, and sent photos of bills with disconnect notices: a letter from the energy company; an available checking balance of –$59.70; a past-due payment, with the amount owed in bold. Shutoffs have resumed. Make a $172.75 payment today to get your account back on track. She had small wins, such as finally paying off her car. But she still went back and forth to the payday-loan store.

As I sat next to her in the hospital, I couldn’t help but feel guilty. For years, I had been sending her money when she asked, but sometimes I didn’t. I would listen to her struggles and then go on with my life. I was grateful to be financially stable, but frustrated by being the financial rescuer for family members. I wanted to create boundaries, and to escape from the transactional, lopsided part of these relationships.

[From the October 2023 issue: Jenisha from Kentucky]

But I had not thought enough about how much she gave me—in every way she could. She posted about my accomplishments on Facebook no matter how small I considered them. She filled voids for me: self-esteem booster, cheerleader, second mother. In 2014, she used all the money she had to fly to New York to see me graduate from Columbia. She was the only member of my family there. When my name was called and I walked across the stage, she cried so much that someone had to hand her a tissue.

A few months ago, my son turned 4, and my aunt was determined to send him a gift. A manila envelope arrived at my apartment: She had mailed him five individually wrapped Hot Wheels cars and a Spider-Man birthday card. I recorded a video as my son stuffed his hand inside the envelope, pulling out each toy, saying, “Oh, wow. This is awesome.” That night, I sent the video to my aunt. She wrote back at 2 a.m.: “Up looking at videos over n over. He was so excited.” She was always trying to give to others, even though she never had enough for herself.

As individuals, and as a country, we tend to pay attention only when it’s too late. Americans who want to cut health-care spending don’t seem to understand that access to preventive care saves not just lives, but also money. Perhaps my aunt’s hospital stay could have been avoided if she’d been able to call a doctor and make an appointment, an option that so many of us take for granted. What is a life like my aunt’s worth in America? Unfortunately, that determination has been made.

[Jonathan Chait: Obamacare changed the politics of health care]

My aunt hasn’t sat up or spoken since the aneurysm, and no one knows if she will again. In January, she was transferred from the hospital to a nursing home. She’s supposed to go home soon, to be cared for by the family, who can’t possibly give her the round-the-clock care she needs. She’s not capable of worrying about health insurance at this point, but if she were, she wouldn’t have to: Now that she’s completely disabled, she qualifies for Medicaid.


This article appears in the April 2026 print edition with the headline “The Cost of Not Having Health Insurance.”

Of all the ways that governments can try to help people, cash transfers can seem like one of the most straightforward. Their popularity has been growing: Over the past decade, dozens of American cities have launched cash-transfer pilots. During the coronavirus pandemic, governments worldwide dramatically expanded their own programs’ reach. And as AI reshapes work, the idea of guaranteed income—a specific kind of recurring, no-strings-attached cash payment—is moving into the mainstream.  

Yet while the provision of cash has saved many lives in dozens of low- and middle-income countries, it has seemingly produced only modest health gains in the United States. Guaranteed-income pilots also haven’t delivered the dramatic health improvements associated with cash-transfer programs elsewhere. Why does cash save lives in Tanzania but barely move the needle in Texas?

From our work studying cash-transfer programs across 37 countries, we’ve come to see a consistent logic behind why cash succeeds in some places and falls short in others. Cash transforms health when four particular conditions are met. Most U.S. cash-transfer pilots have lacked them. But one major American policy does come close: the federal food-assistance program SNAP. Its success offers a road map for what effective cash assistance can look like in this country, if we choose to build on it.


First among the necessary conditions, cash infusions must be large enough to change one’s daily reality. In many low-income countries, a modest amount, on the order of $20 a month or less, can represent a major share of household income. For families living in extreme scarcity, a small influx of funds can expand their food budget, allow children to get vaccinated, or help a mother reach a hospital to deliver safely. These changes are big enough to save lives.

In the U.S., by contrast, a few hundred dollars a month for a relatively short period of time, typical of guaranteed-income pilots, rarely matches the steep costs of housing, child care, and health care. The support modestly eases financial instability but doesn’t fundamentally alter the constraints that low-income families face.

Second, cash must be able to remove specific barriers that block good health. In the countries we studied, many of the leading causes of death—HIV, tuberculosis, malaria, malnutrition—are tightly linked to poverty. Families face life-threatening obstacles that a small amount of money can help them overcome, by creating access to transportation, better nutrition, or a skilled birth attendant. When families have a little more income, the health effects can be immediate and profound.

In the U.S., the dominant health problems are chronic diseases shaped by neighborhood environments, structural inequities in housing and health care, and years of accumulated risk from unhealthy diets and other long-term exposures. These problems are far less responsive to short-term financial boosts. Cash can reduce stress and improve stability, but it cannot, on its own, undo the deep roots of these conditions. Yet in certain periods of life—such as during and after pregnancy—cash can have an immediate impact because health outcomes hinge on whether people can meet their basic needs and show up for health care when it matters most.

Third, scale matters. Successful cash-transfer programs reach large portions of the population. When millions of people receive support, the benefits spread beyond individual households, which helps explain why such programs have reduced mortality across entire countries. U.S. pilots have been small, reaching only hundreds or thousands of families—too limited to change the broader conditions that shape health outcomes.

Finally, cash works best when it is woven into social infrastructure that families already rely on. In many low- and middle-income countries, payments are linked with health visits and other essential services. Brazil’s Bolsa Família program, for instance, operates alongside an extensive primary-care system and has been credited with preventing hundreds of thousands of deaths. In the U.S., cash-transfer studies and guaranteed-income pilots are typically disconnected from other programs that translate cash into health gains.

These conditions help put common criticisms of such programs in perspective. Fears that cash discourages work or fuels spending on alcohol or drugs have not held up in the research. Across rich and poor countries, cash transfers have minimal or positive effects on work and do not increase drinking, smoking, or other substance use.


Although many U.S. pilots have fallen short, SNAP is the one American program that comes closest to the global success stories. Its payments are large enough to meaningfully reduce poverty. The program targets a barrier, food insecurity, that is directly tied to health and survival. It reaches more than 40 million people. And it is administered through state systems that connect it, albeit imperfectly, with other public programs, including Medicaid and school meals. It is no coincidence that SNAP is the only U.S. income-support program convincingly linked to improved survival. In many ways, it resembles the global cash-transfer programs that have delivered the largest health gains.

Although SNAP benefits currently remain too small to eliminate food insecurity for many households, expansions during the Great Recession and the pandemic demonstrated that larger benefits and smoother access can make the program far more effective. SNAP’s impact is greater when benefits are adequate and when eligible households can easily stay enrolled. Instead of incorporating these lessons, changes in the One Big Beautiful Bill Act move in the opposite direction—tightening eligibility and cutting funding in ways that could mean millions lose their benefits. (The Trump administration has justified these cuts in part by arguing that SNAP is rife with fraud and abuse. Fraud does occur, as it does in any large federal program. But by the government’s own estimates, it represents a small fraction of SNAP spending, and the vast majority of the tens of billions of dollars the U.S. spends on the program benefits Americans.)

SNAP is not the only instructive example. The U.S.’s earned-income tax credit can also deliver a sizable cash benefit, typically as a lump sum, that low- and moderate-income workers can use to catch up on bills, pay down debt, or cover necessities. Because it is built into the tax-filing process, it avoids eligibility churn and can be readily expanded by states. It’s not a health program, but past expansions have been linked to improved child health. The Special Supplemental Nutrition Program for Women, Infants, and Children, or WIC, is much smaller than SNAP in scope and generosity, but it’s integrated with local clinics and pairs food support with nutrition counseling and care referrals for pregnant and postpartum women, as well as infants and young children. By increasing access to healthy foods during a crucial period, and freeing up money that would otherwise go to groceries, WIC has been linked to improved birth outcomes and infant health.

Smaller programs can also have a clear impact if they are designed to meet the four conditions. Rx Kids, launched in Flint, Michigan, in 2024, offers cash transfers to parents and infants and closely follows the global playbook: It has a meaningful transfer size, near-universal reach within the city, benefits that target pregnancy and infancy, and links to the health system. Early evaluations of the program suggest substantial improvements in birth outcomes. The political will for such an approach at the federal level may not exist, but Flint shows what local efforts can achieve when the conditions are right. Michigan’s recent decision to invest hundreds of millions of dollars to expand Rx Kids statewide, enough to reach roughly one-third of all births, suggests a plausible U.S. path to scale.

Although these programs require public investment, the returns can be high. For young children, SNAP delivers roughly $60 in benefits for every dollar spent. The question is not whether cash is misused, but whether we choose to structure programs at the size and under the conditions where cash does the most good. Cash is not a cure-all. But when designed with the right basic ingredients, cash transfers are one of the most powerful levers that governments have to alleviate poverty and improve health.

Casey Means has, to say the least, modified her tone. When she testified today in front of the Senate’s health committee, the nominee for surgeon general didn’t, as she is normally wont to do, delve into her experiences with psychedelics or endorse raw milk. She also did not rail at length against birth control. Instead, the longtime health entrepreneur and influencer emphasized her medical degree from Stanford—even though she does not have an active medical license—and sought out common ground with the senators cross-examining her.

Before her nomination last spring, Means—who dropped out of her surgical residency in 2018—embraced some unconventional theories about wellness. As Rina Raphael wrote for The Atlantic last month, Means has talked to trees, implied that natural disasters are a “communication from God,” and dubbed the nation’s health “a spiritual crisis.” When she appeared on Tucker Carlson’s podcast in 2024, she denounced seed oils and suggested that the widespread use of hormonal birth control was indicative of a cultural “disrespect of life.” She has also questioned the universal birth dose of the hepatitis-B vaccine.

[Read: America’s would-be surgeon general says to trust your ‘heart intelligence’]

In her 2024 book, Good Energy—which Means co-wrote with her brother, Calley, who is now a senior adviser to Health and Human Services Secretary Robert F. Kennedy Jr. and also a key figure in the MAHA movement—she advises readers to avoid tap water and conventionally grown food, and to trust themselves rather than their doctors. She recommends getting “one cumulative hour of very hot heat exposure” each week and says that people should optimize their health by using a glucose-monitoring device, which is, helpfully, available through Levels Health, a company she co-founded.

Means, who didn’t respond to a request for comment, wrote in her September ethics filing that she would resign from Levels and forfeit or divest all stock options in the company. But she is still listed on Levels’ blog as the company’s chief medical officer. She said today during her hearing that she has spent “the last several months working with the Office of Government Ethics to be fully compliant” with rules regarding conflicts of interest. Senator Chris Murphy also pressed Means on her financial relationships with companies whose products she has promoted in her newsletter, citing an analysis that found that she’d frequently failed to make proper disclosures to her readers. “I have a strong feeling that the way in which they gathered this data is done intentionally to create these claims that you’re making,” Means testified.

Today, Means was far less outwardly anti-establishment than she has been in her book, her newsletter, and podcast appearances. For example, when Senator Patty Murray asked Means to explain her previous anti-birth-control comments, Means said that she was referring not to birth control generally but to particular women whose medical history might make taking it riskier. She also avoided explicitly disparaging immunizations. “I believe that vaccines are a key part of any infectious-disease public-health strategy,” she told Senator Bill Cassidy.

Means had reason to tone it down. Health leaders, including former surgeons general, have questioned her qualifications for the position. Dozens of health and advocacy organizations have opposed her nomination. Peter Lurie, the president and executive director of the Center for Science in the Public Interest, called Means “a virtual PEZ dispenser for RFK, Jr.’s misinformation” in a statement yesterday.

Means’s confirmation hearing also comes at a pivotal moment for the MAHA movement. In the run-up to the midterm elections later this year, Kennedy appears to be shifting his focus from undermining the childhood-vaccine schedule—his least popular priority, according to one recent poll—to battling the food industry, which enjoys broad support. (The New York Times has reported that the White House wants Kennedy to downplay vaccines ahead of the midterms. The White House and the Department of Health and Human Services did not immediately respond to requests for comment.)

Means’s stance on vaccines today was measured, by MAHA standards. She said that “vaccines save lives” but hesitated when asked whether she agreed with Kennedy’s assertion that there’s no evidence that the flu vaccine prevents serious illness or death in children. “At the population level, I certainly think that it does,” she said finally. (CDC data indicate that the flu vaccine prevents death across all age groups, including children.) Notably, Means said that “I absolutely am supportive” of the measles vaccine, but—against prevailing medical advice—declined to recommend it to parents, possibly hoping to avoid alienating the anti-vaccine wing of MAHA. “There’s a nuanced conversation that American families are looking to have about shared clinical decision making with their doctors about specific vaccines,” she told Senator Angela Alsobrooks—less of an endorsement than is customary for a surgeon general or any other public-health expert, but more mainstream than her earlier suggestions to follow one’s own intuition over expert medical advice.

[Read: RFK Jr.’s next move is what anti-vaxxers have been waiting for]

She also sought to be a unifying voice on pesticides, which have recently caused a fissure between MAHA and MAGA. Last week, President Trump issued an executive order to boost the domestic production of the weed killer glyphosate, which Kennedy has long insisted causes cancer, on the grounds that the compound is essential to the United States’ food security. Key leaders in the MAHA movement regarded the move as a betrayal. Not that long ago, Means might have also been quick to criticize the decision: Last March, she posted on X that “pesticides are a slow-motion extinction event.” In her testimony, though, Means was far more conciliatory, telling senators that the issue is complicated and that “changes need to be made thoughtfully, with full respect for American farmers and the constraints that they’re under.”

Over the past several weeks, leaders in the MAHA movement and the GOP have been fighting over how to win the midterms without angering the factions that make up MAHA. Anti-vaccine activists won’t be satisfied until Kennedy follows through on long-favored plans to do away with the childhood-vaccine schedule. But at least for now, the secretary seems to be leaning into more broadly popular priorities, such as condemning ultra-processed food (he’ll be the keynote speaker at an “Eat Real Food” rally tomorrow in Austin). If she is confirmed, Means will find herself at the center of a movement that is in the midst of an identity crisis.
