Last summer, the Dalai Lama was having a party in Dharamshala for his 90th birthday, and Bethany Morrison, a newly appointed State Department official, was eager to meet with him there. Inconveniently, the United States had recently canceled about $12 million worth of annual foreign aid benefiting Tibetan-exile communities as part of the implosion of USAID. This, Morrison and other State officials thought, would not make a particularly good impression on His Holiness, according to a former State and a former USAID official.

Prior to the Dalai Lama’s birthday, the two former federal employees told me, they had spent months lobbying for Donald Trump’s administration to restore at least some Asia-based aid projects. They had argued that these projects passed Secretary of State Marco Rubio’s new litmus test for overseas spending: They would make America “safer, stronger, and more prosperous.” Nothing changed. (Like other aid workers I spoke with for this story, the former employees requested anonymity because of fear of professional reprisal.)

But as the party’s date approached, Jeremy Lewin, the new head of U.S. foreign assistance at the State Department, was suddenly persuaded to resurrect aid to Tibetans, with seemingly little regard for where, exactly, the money would be going, the former employees said. In a June email to other State Department officials, Lewin wrote that he wanted to “give some good news ahead of the trip.” Days before the party, the State Department allocated nearly $7 million to support Tibetan exiles in South Asia. (A State Department spokesperson, who did not give their name, told me in an email that many programs were paused in early 2025 as part of a foreign-assistance review “conducted to ensure that the American taxpayer’s hard-earned dollars were being spent efficiently” but declined to comment on the specific circumstances of Tibetan aid being reinstated ahead of the party.)

For the past half century, the U.S. has pursued, however imperfectly, a straightforward ideal of foreign aid that has been codified in laws passed by Democrats and Republicans alike: Resources should be deployed wherever they are needed most. Under this administration, funding for overseas aid is being evaluated by a different measure—using “dealmaking and transactions as near-exclusive metrics of success,” as the Center for Strategic and International Studies, a centrist think tank, put it in November.

Foreign aid has always, to varying degrees, been a political project, meant to accrue soft power by promoting America’s vision for itself and winning over people abroad. In the past year, though, some aid agreements have been nakedly transactional (the U.S. helping finance malaria drugs in exchange for access to minerals, for instance); others, such as those that preceded the Dalai Lama’s birthday party, simply highlight how haphazardly programs have been picked for survival. The overall result is that, instead of being directed at where they can save the most lives, U.S. humanitarian efforts now seem to be aimed primarily at where they can advance the Trump administration’s other priorities.

When the Trump administration suddenly ended most foreign-assistance programs early last year, governments around the world had little time to adjust budgets or make contingency plans. This gave the U.S. new and pointed leverage over most other countries, which it seems eager to exploit. On July 1, the day that some remnants of USAID were officially absorbed by the State Department, Rubio wrote on Substack that the administration’s foreign-funding thinking “prioritizes our national interests.” Talking points distributed widely within the State Department around that time, obtained by The Atlantic, clarify that under the new “America First” approach, the department plans to award funding to two main types of aid: programs that are strategic and programs that are lifesaving. The talking points emphasize that both categories are “not global charity” but rather “a tool of strategic engagement.”

The State Department spokesperson did not dispute this characterization. “President Trump’s National Security Strategy is very clear: the United States will partner with select countries to reduce conflict and foster mutually beneficial trade and investment relationships, shifting from a traditional aid-focused approach to one that strategically leverages foreign assistance to support economic growth,” they wrote. They added that the U.S. “remains the most generous nation in the world for lifesaving humanitarian assistance.” Notably, the president proposed slashing foreign aid by 70 percent in fiscal year 2026; Congress quietly rejected the cut.  

Some of the foreign aid disbursed in the second half of last year appears to have been straightforwardly treated as a bargaining chip. Another senior State official described the department’s approach to me as “Can we cut a tariff deal with this country? Okay, we’ll increase the aid going to them. Are there critical mineral rights that we would like to discuss?” Foreign aid might lubricate that conversation too. The State Department hasn’t been shy about this strategy: In a September memo to Congress, which I obtained after its existence was first reported in The Washington Post, the department said it intended to use foreign-assistance money to incentivize other nations to “support U.S. immigration priorities” and diversify “critical mineral supply chains.”

Weeks later, Equatorial Guinea, a small country on the west coast of Africa, agreed to accept U.S. deportees who are not its citizens; in return, it received $7.5 million from a government fund meant to assist refugees and victims of conflict. Eswatini and Rwanda have signed similar deals. Last month, the State Department made the release of funds to fight malaria, tuberculosis, and HIV in Zambia contingent on its government agreeing to terms “for collaboration in the mining sector” and other economic reforms. The Zambia health-financing agreement is one of more than 50 that the State Department plans to sign with low- and middle-income countries in the next few months. Earlier this month, Mike Reid, the chief science officer for the President’s Emergency Plan for AIDS Relief, acknowledged in a post on his personal Substack that the global-health deals put aside “long-standing, epidemiologically sound priorities” and are “transactional”—but he wrote that he ultimately rooted for their success.

Prior to this administration, the U.S. had generally distinguished a country’s government from its people when making aid decisions. The U.S. led the global effort to alleviate the humanitarian crisis in Iraq caused by the near-total sanctions that had been levied on the country after President Saddam Hussein ordered the invasion of Kuwait. American taxpayers even fed North Korea during its late-1990s famine. “Showing that the United States stood in solidarity with the world’s most vulnerable people, regardless of what their government did or did not do, was kind of a goal in and of itself, as a projection of American values,” the senior State Department official told me. In some cases, that still appears to hold true: This month, for example, the U.S. announced that it is working with the Catholic Church to deliver food and supplies to Cubans, despite the State Department’s allegations that Cuba’s government sponsors terrorism and concerns about “diversion by the illegitimate regime.”

But neutrality as a rule in aid decisions “no longer exists,” the senior official said. Concerns about diversion and terrorism have been used to justify shutting down all assistance to Afghanistan and Yemen—countries where urgent intervention is needed to prevent deaths from malnutrition, according to the federal government’s famine data. And Rubio justified a muted response to Myanmar’s request for help after a major earthquake last spring in part because “they have a military junta that doesn’t like us.”

Meanwhile, countries that have a history of advancing U.S. security interests have been rewarded: Last month, for instance, as Rubio signed a deal contributing $1.7 billion to Kenya’s health system, he expressed his appreciation for the country leading a United Nations peacekeeping force in Haiti, a country the Trump administration hopes to stabilize to prevent would-be migrants from attempting the 600-mile trip to Florida. In April, funding was restored for a desalination plant in Jordan, a country where water scarcity is severe but relatively few people die for want of water—and that happens to be the U.S.’s main Arab ally, and is known to collaborate closely with the CIA. Jordan has benefited from its allyship before: In 2022, a federal watchdog determined that, by sending more funding for clean water and sanitation to Jordan than to any other nation, the U.S. was subverting the spirit of the law. But if the old system was slanted by strategic interests, the new one has keeled over in pursuit of them. Jordan now appears to be one of the only—if not the only—countries where the U.S. has reinstated a water-infrastructure project, despite having abandoned more than 20 half-finished drinking-water and sanitation systems around the world.

Sometimes, the administration’s vision of aid seems to be not “America First,” but “Trump First.” For example, in March, a Vietnamese official announced that work on a suspended USAID project to clean up toxic chemicals would resume, one day after Vietnam’s prime minister reportedly met with a representative of the Trump Organization. Soon after, Vietnamese officials argued that the organization should be allowed to skip meeting several legal requirements to begin constructing a new golf resort in Vietnam on an expedited timeline that would “capitalize on the support of the Donald Trump administration” and be more convenient for Trump’s son Eric, who planned to attend the groundbreaking in May. (The State Department spokesperson said that the chemical project was “a high priority for this Administration” but did not answer follow-up questions about the relationship between the project and the Trump Organization representative’s meeting with the Vietnamese prime minister. The Trump Organization did not respond to a request for comment.)

The Trump administration’s approach to foreign aid may gain more resources for the U.S. in the short term, but it also risks sacrificing other goals. The American intelligence community has long known that insurgent groups—many of which openly seek the destruction of the United States—rely on desperation, food insecurity, and hopelessness to gain recruits. USAID’s collapse has greased their efforts. The State Department spokesperson wrote that the department works with partner governments to “strengthen local security capabilities, improve intelligence-sharing, and disrupt terrorist networks before they can exploit instability.” But after U.S.-funded health and counterterrorism programs in Mozambique were cut last year, ISIS surged into the vacuum. When Trump hastily shut down all foreign aid to Afghanistan, the State Department said its “primary humanitarian objective” in the country was to prevent the resources left behind from going to terrorists. Instead, armored vehicles that American taxpayers had bought for humanitarian workers—along with 147 pieces of sensitive security equipment—were seized by the Taliban. (The State Department spokesperson did not directly address the incident. “The Trump Administration will not allow U.S. taxpayer dollars to be used to enable the Taliban’s heinous behavior,” they told me.)

The United States’ new approach to foreign aid brings the nation in line with authoritarian countries that have historically prioritized strategy over charity. Russia’s grain diplomacy functions with the understanding that food today means military bases tomorrow. China subordinates the goal of improving foreign populations’ health outcomes to establishing dependency on its medical tech. The Trump administration may not have wholly forsaken the extraordinary idea that the United States should spend money to save the lives of ordinary people in foreign countries. But it has trampled on the humanitarian pretense for doing so.

If you didn’t know who Peter Attia was last week, here’s how you’ll remember him going forward: Attia is the guy who once emailed Jeffrey Epstein to confirm that “pussy is, indeed, low carb. Still awaiting results on gluten content, though.”

Until recently, Attia was known as a wellness influencer in the manosphere and a newly appointed contributor at CBS as part of the “Free Press to network TV” pipeline. He has a popular podcast and wrote the best-selling book Outlive: The Science and Art of Longevity. But Attia is also all over the Epstein files—his name pops up more than 1,700 times in the Justice Department’s latest batch of documents. From 2015 to 2018, Epstein and Attia exchanged numerous emails. Many of them are mundane: Epstein writes to Attia about “a very strange vein like red pattern” on his stomach; he asks Attia what kind of probiotic he should use; there is talk of MRI scans of Epstein’s spine. But others are vile. In a June 2015 back-and-forth about cancer and longevity, Epstein muses that he’s not sure why “women live past reproductive age at all.” (CBS did not respond to a request for comment; the network is reportedly expected to drop Attia after last week’s revelation.)

Attia, a onetime researcher who earned an M.D. but never completed his surgical residency, is beloved by his fans for his measured, scientific approach to living one’s best life. On a recent podcast, he spends two hours examining Alzheimer’s disease in women. Other episodes delve into timely topics such as protein intake, fertility, chronic pain, and nicotine; his October discussion about the safety of Tylenol use during pregnancy offers an evidence-based counterpoint to the alarmist White House press conference on that issue. But he’s also been knocked for overhyping treatments with limited data behind them and for exploiting his eager fans (a course he offered on longevity cost $2,500, according to a 2023 Wall Street Journal article). Eric Topol, a high-profile cardiologist who directs the Scripps Research Translational Institute, called Attia a “huckster” earlier today.

In the mid-2010s, about the time he befriended Epstein, Attia seemed focused on building a roster of clients whom he could advise on longevity and wellness. An exercise physiologist who once worked with Attia told me that Attia sent him to do a physical evaluation of Epstein at the late financier’s cavernous Manhattan residence in July 2017. He remembers two young, attractive women who flowed in and out during the session with Epstein, though Epstein didn’t acknowledge them. “Something felt a little off,” he told me. A proposed follow-up session never took place. (The physiologist spoke on the condition of anonymity because he didn’t want to be associated with the scandal; he told me he hasn’t spoken with Attia in years.)

Attia’s emails with Epstein reveal no such qualms. In an email in which Attia seems to be pitching Epstein a longevity program, he asks Epstein if “you’re interested in living longer (solely for the ladies, of course)?” In July 2016, Attia asks Epstein what he’s doing in Palm Beach, where Epstein allegedly abused numerous underage girls. “Guess,” Epstein writes. Attia replies: “Besides that.” In 2017, Attia appears to have spent time with Epstein in New York—rebuffing his wife’s pleas for him to return home to California—while his infant son was having a medical emergency. In the emails, Attia is not just Epstein’s medical adviser but a friend and ardent admirer. In 2016, Attia wrote to Epstein’s assistant that he goes “into JE withdrawal when I don’t see him.”

When Attia and Epstein met in 2014, the full extent of the latter’s crimes wasn’t yet publicly known, but his misdeeds weren’t a secret. Epstein first pleaded guilty to a child-sex offense in 2008, and by 2010, he had settled several lawsuits over allegations of sexual misconduct. A representative for Attia pointed me to a long semi–mea culpa that Attia posted on his X account this morning. He writes that he “never saw anyone who appeared underage” in Epstein’s presence, and that he had “nothing to do with his sexual abuse or exploitation of anyone.” Attia also says that when he learned about the extent of Epstein’s crimes from a November 2018 Miami Herald article, he confronted Epstein and told him that he needed to accept responsibility and pay for support for those he had harmed. (In December 2018, Attia wrote to Epstein that he “would like to discuss some stuff with you in person.”) And, in his X post, Attia calls the emails between him and Epstein “embarrassing, tasteless, and indefensible.”

In his lengthy explanation of his behavior, Attia writes that he was fascinated by Epstein’s wealth and access to influential people. Epstein had contact with plenty of household names, such as Bill Gates and Elon Musk. Attia’s emails refer to “Ehud”—likely Ehud Barak, the former Israeli prime minister who was known to have a relationship with Epstein but has previously denied any wrongdoing or knowledge of Epstein’s crimes. In an August 2015 email, Epstein tells Attia that he’s having dinner that night with “musk thiel zuckerburg [sic].” (In 2019, a spokesperson for the Meta founder Mark Zuckerberg said he had met Epstein “in passing one time at a dinner honoring scientists that was not organized by Epstein.” In 2021, Gates told PBS that his meetings with Epstein were a mistake. A spokesperson for Peter Thiel said on Saturday that Thiel never visited Epstein’s island, and Musk wrote on X that “no one pushed harder than me to have the Epstein files released and I’m glad that has finally happened.”)

Attia is more representative of another category of Epstein associate: researchers who thought Epstein could either fund their work or help push their careers to the next level. In his X post, Attia writes that he was introduced to Epstein when he was raising money for scientific research. Epstein was known to have donated millions to a research center run by Martin Nowak, a professor of mathematics and biology at Harvard. (In 2021, the university temporarily sanctioned Nowak for violating rules about professionalism and campus access in connection with his involvement with Epstein, but he remains a professor there. He said at the time that he regretted “the connection I was part of fostering between Harvard and Jeffrey Epstein.”) Lawrence Krauss, a theoretical physicist and cosmologist, received $250,000 from Epstein’s foundation for his science-communication group. At one point, Epstein introduced Attia to Krauss over email because Krauss was hoping to start a podcast and wanted tips. (Last year, Krauss said that none of his communications with Epstein was criminal, and that he “was as shocked as the rest of the world when he was arrested.”)

Attia, in other words, was seemingly not alone in being wowed by Epstein’s wealth and well-known friends. A particular challenge for Attia, though, is that wellness influencers offer their followers more than diet-and-exercise tips. They’re selling wisdom. Follow my advice, they contend, and you’ll live a longer, healthier, more fulfilled life. But a chummy, yearslong association with a convicted child predator is, at the very least, unwise.

Some of the country’s most powerful men are panicking about testosterone levels. Tucker Carlson’s 2022 documentary The End of Men blamed declining testosterone for the supposedly rampant emasculation of American men. Influencers issue warnings that “low T is ruining young men.” Recently, the Department of Health and Human Services has gone all in on the hormone, publishing new dietary guidelines meant to help men maintain healthy testosterone levels and considering widening access to testosterone-replacement therapy. Kush Desai, a White House spokesperson, told me in an email that “a generational decline in testosterone levels among American men is one of many examples of America’s worsening health.”

Health and Human Services Secretary Robert F. Kennedy Jr. has gone further, calling low T counts in teens an “existential issue.” The secretary, who at 72 has the physique of a Marvel character, has touted testosterone as part of his personal anti-aging protocol. He also recently proclaimed that President Trump has the “constitution of a deity” and raging levels of T.

American men are responding by attempting to increase their testosterone levels—whether they need to or not. Low-testosterone clinics, many of them based online, have proliferated, promising men a way to “get your spark back” or “reclaim your life.” So have supplements and accessories designed to support testosterone health—you can buy an ice pack designed specifically to keep testicles cool in a sauna. Recently, a friend who questioned his dedication to testicular health after seeing ads for testosterone boosters asked me if he should buy nontoxic briefs from a company called (yes, seriously) Nads.

Research has shown that average testosterone levels are indeed declining in American men. The primary study on the slump, published in 2007, followed Boston-based men over two decades and found that their levels had declined significantly more than would be expected from aging alone. Since then, numerous studies have documented the same trend among various age groups. The dip can probably be attributed to a number of factors, including rising obesity rates, widespread chronic disease, and sedentary lifestyles, Scott Selinger, an assistant professor at the University of Texas at Austin’s Dell Medical School who studies testosterone, told me. Hot testicular temperatures—because of tight-fitting clothes, excessive sitting, or, more dramatically, a quickly warming climate—may also affect testosterone production, Selinger said.

[Read: RFK Jr.’s testosterone regimen is almost reasonable]

Low testosterone really can be debilitating. Deficiency is linked to low libido, erectile issues, fatigue, heart disease, osteoporosis, anemia, and depression. But the prevalence of testosterone deficiency is hard to define. It ranges from 2 to 50 percent across studies and differs greatly by age. Speaking about men’s-health experts, Abraham Morgentaler, a urologist specializing in testosterone therapy at Harvard Medical School, told me, “I don’t think too many people are really concerned” about population-level declines. Major professional groups whose members study and treat testosterone deficiency—the Androgen Society, the Endocrine Society, the American Urological Association—haven’t launched any specific initiatives to combat low testosterone, Morgentaler said.

They are nonchalant in part because the average decline in testosterone is not especially large. The “normal” range for testosterone isn’t well defined; limits set by different medical societies span from about 300 to 1,000 nanograms per deciliter. The Boston paper found that the average participant’s testosterone levels dropped roughly 50 nanograms per deciliter every eight years or so, “which for some people can make a difference, but for a lot of people, it doesn’t,” Selinger said. Experts are more concerned about what might be driving the decline: “I wouldn’t say there is an epidemic of low testosterone,” Franck Mauvais-Jarvis, an endocrinology professor at Tulane University School of Medicine, told me. “The problem is the epidemic of chronic disease.”

Meanwhile, the number of American men receiving testosterone-replacement therapy grew by almost 30 percent from 2018 to 2022. In TRT, lab-synthesized testosterone is administered to patients in a variety of formats, including injections, pills, topical gels, and slow-dissolving pellets inserted under the skin of their buttocks or hips. Testosterone, which is a controlled substance, is approved only for the treatment of testosterone deficiency; a diagnosis requires blood work that shows levels below the normal range over multiple days and, crucially, corresponding symptoms. But it’s also easy to get for other reasons. Doctors can prescribe it off-label to men with normal T levels who complain of low energy, decreased libido, or erectile dysfunction. Some gym fanatics buy it, sometimes through illicit channels, to build muscle, which is the only undeniable effect of raising T in someone with normal levels, Morgentaler said. All of the experts I spoke with were dubious that low-T clinics follow standard medical practice for prescribing testosterone, including discussing potential risks with patients and employing hormone specialists. (I reached out to two large testosterone clinics, but they did not respond.)

[Read: Why are so many women being told their hormones are out of whack?]

Whether the goal of these clinics is to treat low T or jack levels up to the max isn’t clear. Many encourage men to aim for excessive T, Michael Irwig, an endocrinologist at Harvard Medical School, told me. A 2022 study of seven direct-to-consumer low-testosterone clinics found that three of them proposed a treatment goal of at least 1,000 nanograms per deciliter—one advertised a goal of 1,500. It should come as no surprise, then, that up to a third of men on TRT don’t have a deficiency, and that the majority of new testosterone users start treatment without completing the blood work needed for a diagnosis.

The maximalist approach to testosterone is risky. Although new research dismisses previous concerns about TRT causing cardiovascular disease and prostate cancer, too much of the hormone can elevate levels of hemoglobin, which raises the risk of blood clots, and estradiol, which can cause breast enlargement. It also causes testicular shrinkage and infertility. Although increasing testosterone can improve sexual performance, it has the opposite effect on reproduction, which is why the American Urological Association advises men to be cautious about TRT if they want to have kids. If a man begins testosterone therapy in his 20s and stays on it for upwards of five years, the chances that he’ll ever recover his original sperm levels are low, John Mulhall, a urologist at Memorial Sloan-Kettering Cancer Center, told me.

Plus, pumping the body full of testosterone may not alleviate the problems that patients set out to solve. Every patient responds to testosterone differently, Mauvais-Jarvis said; some men may feel perfectly fine at levels considered deficient, while others require far higher levels. The symptoms of deficiency could also be caused by any number of common ailments, such as obesity, cardiovascular disease, and depression. It’s not uncommon for a patient who still feels unwell after starting treatment to ask for more testosterone, Mulhall said. But if his problems have a different root cause, “pushing that man to a T level of 1,000 won’t improve his symptoms.”

[Read: Men might be the key to an American baby boom]

The current spotlight on testosterone could lead to better care for men in at least one way: If the FDA removes restrictions on the hormone, more patients will be able to get treatment from their doctors—many of whom are reluctant to prescribe controlled substances—instead of turning to dubious third-party clinics, Mauvais-Jarvis said. This would be especially beneficial for the men who do have a genuine T deficiency but are currently not receiving treatment.

The panic over testosterone seems unlikely to end anytime soon, in part because it is about not just men’s health, but also manhood. The Trump administration is obsessive about manliness. Trump himself—who shared his T level (a perfectly respectable 441 nanograms per deciliter) during his first presidential campaign—has been lauded by his fans as a paragon of masculinity. The message is clear: When it comes to testosterone, more is definitely more.

Robert F. Kennedy Jr. has a new rallying cry: Eat real food. It’s an intuitive piece of advice—snack on some grapes instead of potato chips, trade that microwaveable mystery meat for a grilled chicken breast. The tagline has accompanied the new Dietary Guidelines for Americans, the government’s official nutrition recommendations, which call for Americans to prioritize whole foods and limit processed ones. “It’s time to start eating real food again,” the health secretary said during a speech in Pennsylvania last week. The Trump administration has even launched a new website, realfood.gov, which welcomes visitors with an animation of a steak, a carton of whole milk, and a head of broccoli.

The path toward Kennedy’s goal runs through an overlooked piece of the food landscape: convenience stores. The purveyors of late-night hot dogs, tins of Zyn, and countless varieties of gummy worms generally don’t sell a lot of “real food.” But in America’s food deserts, convenience stores are more than just places to pick up a snack—they’re grocery stores. The USDA estimates that tens of millions of Americans live in low-income areas with limited access to grocery stores. Mini-marts such as 7-Eleven and Sheetz “have an outsized role in our food system because they are the only food access point in many neighborhoods,” Hilary Seligman, an expert on food insecurity at UC San Francisco, told me.

The Trump administration does have a tool it can use to force the humble mini-mart to stock healthier options. Food stamps are an essential part of convenience stores’ business, but to participate in the program, retailers must carry products across a range of food groups. Those standards are poised to get an overhaul: “We are requiring that every outlet that accepts food stamps double the amount of real food that it sells in its facility,” Kennedy said last week. That decision is really up to Secretary of Agriculture Brooke Rollins, whose agency runs the food-stamp program, and she made similar comments at a press conference earlier this month. When I asked the Department of Agriculture for more details about the planned changes, a spokesperson directed me to a proposal released in draft form last fall. Under those new rules, stores would be required to carry seven varieties of food across four categories: protein, dairy, grains, and vegetables and fruits.

[Read: The new food-stamp rules will make your head spin]

The move could put us one step closer to healthy convenience stores, which have long been a dream of some nutrition reformers. Several experiments in different cities—including in Baltimore, New Orleans, and Denver—have tried to alleviate the problem of food deserts by improving the offerings at local mini-marts. A review of these pilot programs found “consistent improvements across most of the trials in the availability and sale of healthy foods, the purchase and consumption of those foods, and consumer knowledge.”

But the Trump administration’s plan isn’t perfect. Under the draft proposal, corner stores won’t have to follow any specific nutritional guidelines, so long as they meet the food-group standards. The Center for Science in the Public Interest, a consumer-advocacy group, has warned that the new standards mean that stores could stock Fruity Pebbles to fulfill the grain requirement, and sugary canned peaches for the fruit requirement. “It would be very easy for a retailer to meet the proposed rule by stocking foods in each staple food category that are high in added sugar, sodium, saturated fat, and refined grains,” Joelle Johnson, the deputy director of CSPI, told me.

[Read: Why don’t convenience stores sell better food?]

Changing mini-marts for good won’t be easy. Owners stock their stores based on what sells, and available data suggest that “real food” isn’t exactly driving sales. A recent analysis by Lula Commerce, an online-shopping tool for convenience stores, found that alcohol, soda, and energy drinks were the top items ordered by the stores’ customers. And a 2021 report by the National Association of Convenience Stores pointed to packaged beverages, cigarettes, and salty snacks as among the best sellers. Convenience stores are concerned that stocking healthy products will just mean wasted shelf space in their already crowded stores. They typically don’t have the margins necessary to swap out a section of the drink cooler to stock cauliflower or salmon filets, especially when these products may rot. NACS supports aspects of the Trump administration’s push to heighten the requirements for what these stores have to stock, Margaret Hardin Mannion, the group’s director of government relations, told me. But, she added, “we have to be able to sell what our customers want to buy.” Though it is within the Trump administration’s power to mandate that convenience stores participating in the food-stamps program stock as many healthy goods as it deems fit, that could easily backfire if retailers back out of the program entirely.

Until convenience stores actually give customers the option to buy strawberries instead of Slim Jims, it’s hard to know exactly how well-founded their fears really are. The smattering of experiments that stocked mini-marts with fresh produce suggests that people do have some interest in eating healthier when they can. But many of these trials also included efforts meant specifically to drum up sales of healthy items. In one experiment in Baltimore, convenience stores were encouraged to stock certain healthy foods through subsidies from researchers. Retailers hung up posters to promote these new healthy foods, and some gave out coupons to encourage specific purchases. Replicating an experiment like this on a national scale would cost some major cash. (There are more than 150,000 convenience stores in the United States.) And it would require a real leap of faith from an administration that to date has focused much of its efforts on restricting the products that can be purchased with food stamps, and on rooting out alleged fraud within the program.

For now, RFK Jr. and other health officials are stuck with a convenience-store conundrum. The mini-mart embodies how hard it is to actually improve the American diet. It’s enough of a challenge to get stores to stock their shelves with broccoli. But that still isn’t a guarantee that people will put down the Doritos.

Polio Was That Bad

In the United States, polio is a memory, and a fading one at that. The last major outbreak here happened in 1952; the virus was declared eliminated in 1979. With all of that behind us, you can see how someone—say, Kirk Milhoan, the chair of the CDC’s vaccine advisory committee—might wonder whether giving the polio vaccine to American kids still makes sense. “We need to not be afraid to consider that we are in a different time now,” Milhoan said on the podcast Why Should I Trust You? last week.

To be fair, Milhoan didn’t endorse yanking the polio vaccine from the CDC’s childhood-immunization schedule, as other vaccines were earlier this month. But he didn’t rule it out. And right now, when it comes to vaccines in America, anything seems possible. With Robert F. Kennedy Jr. at the helm of the Department of Health and Human Services, and with the CDC’s vaccine advisory committee stacked with his allies, every inoculation—no matter how well studied or successful—seems to be under new scrutiny, and at least potentially on the chopping block. Next on the committee’s agenda is looking into the safety of aluminum salts, which are used in numerous vaccines to boost the recipient’s immune response. For the record, a study of more than 1 million Danish children, published last July, found no statistically significant evidence linking aluminum in vaccines to asthma, autoimmune conditions, or neurodevelopmental disorders, including autism.

The polio vaccine, which doesn’t contain aluminum, hasn’t received much attention so far from Kennedy’s HHS; the department did not respond to my questions about whether it is considering changing its recommendations about this vaccine, and my queries to Milhoan went unanswered. But its time may be coming. As anti-vaccine activists are quick to point out, an American kid’s risk of catching polio in 2026 is vanishingly low. So why don’t we drop it from the recommendation list? Or perhaps, as we did with the smallpox vaccine in the early 1970s, just stop offering it altogether?

In anti-vaccine circles, the official story of polio—iron lungs, kids with leg braces, the triumph of Jonas Salk—has long been dismissed as misleading. In a 2020 debate with the lawyer Alan Dershowitz, Kennedy raised doubts about whether the vaccine had really been responsible for ridding the country of polio, crediting instead factors such as sanitation and hygiene. Last June, Aaron Siri, a lawyer who has worked closely with Kennedy, called for the polio vaccine to be struck from the CDC’s recommendations. In his recent book, Vaccines, Amen, Siri argues that the seriousness of polio has been overblown—a sentiment shared by others in the health secretary’s orbit, including Del Bigtree, who served as the communications director for Kennedy’s presidential campaign. Joe Rogan suggested on his podcast last March that the pesticide DDT, rather than the virus, deserved blame for symptoms attributed to polio.

[Read: Here’s how we know RFK Jr. is wrong about vaccines]

That’s all nonsense. Polio, which usually spreads through contact with an infected person’s feces via contaminated hands or water, can be a devastating disease. That 1952 outbreak killed some 3,000 people and left more than 20,000 paralyzed. About one in 200 people who contract polio will experience a form of paralysis. As many as 40 percent of people who recover from the virus, even a mild form of it, develop post-polio syndrome, which can emerge decades after an infection and cause weakened muscles and trouble breathing and swallowing. The vaccines—the original shot that contains inactivated-virus particles, plus an oral solution of weakened live viruses—did indeed lead to the near-elimination of polio globally and have spared millions from the worst outcomes of the disease.

Yet polio has proved stubbornly hard to stamp out. In 1988, when the virus was still endemic in more than 100 countries, the World Health Organization set a goal to eradicate polio by the year 2000. That deadline came and went, as did the ones that followed. A wild strain of the virus remains endemic in Pakistan and Afghanistan, and periodic outbreaks continue across Africa and elsewhere, including one in the Gaza Strip in 2024. One obstacle to eradicating the virus has been vaccine hesitancy, and even violence against health workers. In the early 2000s, five states in northern Nigeria boycotted the polio vaccine in part because of rumors that it was an American plot to spread HIV. Polio workers have been killed in Pakistan and Afghanistan, likely by the Taliban, which has alleged that vaccines are intended to sterilize Muslim children (and has denied responsibility for some of the attacks).

In regions where polio is rampant, the oral vaccine is preferred: It’s both cheaper and more effective at stopping transmission than the inactivated-virus injection. The United States and other countries where the disease is almost nonexistent exclusively use the inactivated version—in part because in rare instances, the oral vaccine can cause polio infection, which may in turn lead to paralysis. Anti-vaccine activists like to point to this unfortunate irony as proof that vaccination is the real villain. Taking the oral vaccine remains far less risky than contracting the wild virus, and it has driven down overall infection rates. But it’s an imperfect tool, and the WHO intends to phase it out by the end of 2029. (Given historical precedent, along with the Trump administration’s dramatic pullback from polio-vaccination efforts as part of its dismantling of foreign aid, this timeline might be optimistic.)

[Read: Polio is exploiting a very human weakness]

Ending polio vaccination altogether, according to the WHO’s plan, will take considerably longer. After polio is declared eradicated worldwide, the organization wants countries to wait 10 years before stopping use of the inactivated-virus shots to be certain that it is no longer circulating, Oliver Rosenbaum, a spokesperson for WHO’s Global Polio Eradication Initiative, told me.

Some polio experts, though, told me that they think ending vaccination at any point is unrealistic. Because in many cases the disease spreads without a person showing symptoms (unlike smallpox, which is not contagious before symptoms develop), large numbers of people can be infected before authorities are even aware of an outbreak. Some experts, such as Konstantin Chumakov, a virologist who began researching polio in 1989, worry that the virus could be used as a biological weapon in an entirely unvaccinated country. “In my opinion, and in the opinion of many respected polio experts, this is absolutely unacceptable because you can never assure that polio is completely eradicated,” Chumakov told me.

Ending or reducing polio vaccination in the U.S. before global eradication, as Milhoan seems to imply the country should consider, would be an even worse idea. A paper published last April in the Journal of the American Medical Association projected the consequences of U.S. vaccination rates declining by half and concluded that the country would see a significant return of paralytic polio. Some paralyzed people would probably require a ventilator, the modern equivalent of the iron lung, and, based on typical fatality rates, about 5 to 10 percent of them would die.

If Americans stopped vaccinating their kids against polio entirely, several years might pass without the U.S. having any cases, or with it seeing just a few here and there, Kimberly Thompson, a public-health researcher and the president of the research nonprofit Kid Risk, told me. (Even under the current vaccination system, occasional spread of the virus occurs. In 2022, for instance, an unvaccinated young adult in Rockland County, just north of New York City, tested positive for a vaccine-derived strain of polio even though they hadn’t traveled overseas.) But eventually, we could find ourselves back in the same situation as in the early 1950s.

Or it could be even worse. Seventy-five years ago, many American children inherited some polio immunity from their mothers, and so they had at least partial protection against the virus’s worst effects. The introduction of polio to a population that has little to no immunity could cause its mortality rates to exceed those experienced in the first half of the previous century, Chumakov said. A 2021 study, co-authored by Thompson, modeled what might happen in an extreme scenario: if global polio vaccination ended, no one were immune, and the virus had somehow been reintroduced. It estimates that such a scenario could cause tens of millions of cases of paralysis worldwide.

[Read: South Carolina is America’s new measles norm]

Milhoan insisted on the podcast that Americans shouldn’t be afraid to rethink vaccine policy. And he’s right that health authorities should reevaluate risk and offer the most up-to-date medical advice. Still, there’s something to be said for the utility of fear. Most Americans, including me, are too young to have any personal knowledge of polio. It’s all textbook summaries and black-and-white newsreels; we’ve never worried about shaking someone’s hand or going swimming and then ending up in a wheelchair. But vaccines are the only thing stopping us from getting another firsthand look.

Last year, starting in January, the United States experienced its largest documented measles outbreak in more than three decades, when an epidemic centered on West Texas sickened at least 762 people. Now a fast-moving outbreak in South Carolina seems poised to surpass it: Local officials have logged 700 infections, and the virus is still rapidly spreading.

As public-health officials scramble to contain the virus, they’re also trying to figure out whether these two outbreaks are connected—specifically, whether the version of the pathogen that kick-started the West Texas cases has been circulating within the nation’s borders ever since. If the answer is yes, it will mean that measles has once again become a permanent resident of this country, after 26 years of only limited outbreaks imported from abroad. Given that the U.S. clocked more than 2,200 measles cases in 2025—more than it has had in a single year since 1991—the experts I spoke with already consider this the reality that Americans are living in. One of the fastest-spreading viral diseases ever documented has once again become a routine threat.

At this point, researchers are working to find the connective tissue among some of the largest measles outbreaks in the U.S. within the past year, including the ones centered in West Texas, Utah, Arizona, and South Carolina. Technically, the epidemics still could have been caused by separate reintroductions of measles from at least one international source. But “that’s a hard stretch,” Robert Bednarczyk, a global-health researcher and epidemiologist at Emory’s Rollins School of Public Health, told me. The most likely and so far best-supported scenario, he said, is also the simplest one—that the virus spread so fiercely and quickly through these communities that it was able to hitch a ride elsewhere in the country when infected people traveled.

If further evidence proves that scenario true, the Pan American Health Organization could strip the U.S. of its official measles-elimination status—which the country has held since 2000—at a meeting scheduled for April. (A country achieves elimination status when it can show that the virus hasn’t been circulating for 12 consecutive months; it loses the status when researchers show that measles has been spreading for a year straight.) Already, PAHO has publicly confirmed that scientists have found the same strain of measles in Texas, New Mexico, Utah, Arizona, and South Carolina, as well as in Canada, Mexico, and multiple other North American countries. (In response to a request for comment, a PAHO spokesperson clarified that although these detections had been made, the committee was still seeking further evidence.)

Health officials also recently announced that South Carolina’s outbreak has seeded cases elsewhere, including Washington State. Still, the case for measles’ continuous transmission can’t yet be considered a slam dunk. To prove it definitively, researchers will need to show that geographically distant outbreaks in the U.S. are epidemiologically linked and that there is not sufficient evidence suggesting that the virus bounced back and forth between countries.

Two types of information are essential to these investigations. First, researchers look into the travel histories of infected people, who might have brought the virus from one state to another. Second, they compare genetic sequences pulled from the virus across locations. Measles mutates slowly enough that researchers can in many cases search for essentially the same strain when tracking its movements. But the virus does accumulate some changes in its genome, and the further apart two cases are in time, the more distinct their genetic sequences should be. If measles was being continuously transmitted, scientists might expect to see slightly different iterations of the virus racking up mutations as it traveled, say, from Texas to South Carolina. If measles had been introduced separately to those locations, the sequences pulled from each state might more closely resemble genetic information from an international source, Pavitra Roychoudhury, a pathogen-genomics expert at the University of Washington, told me.

But some of the data that officials need may be lacking. For months, experts have been concerned that the U.S. has been severely undercounting its measles cases and that the virus had been circulating in some communities long before it was officially detected. Cases of the disease can be easy to miss, Helen Chu, an immunologist at the University of Washington, told me. The early days of measles are usually marked by common symptoms such as fever and cough; to the untrained eye, the virus’s rash can look like many of the reddish, patchy blemishes that many other pathogens cause. The overwhelming majority of measles cases in the U.S. have also concentrated in communities that have low vaccination rates, which often have less access to medical care and the sort of testing that would also collect viral samples. Many people who deliberately decline vaccination for their families are also skeptical of seeking medical care in general, or of public-health officials investigating outbreaks.

With inconsistent data, researchers may be left sorting through genetic sequences that neither point clearly to one another nor obviously implicate separate sources. “There’s a judgment call in that gray area,” William Moss, an epidemiologist at the Johns Hopkins Bloomberg School of Public Health, told me. “How different is different?” The last time measles spread endemically in the United States, this sort of genomic analysis was not commonplace.

Should PAHO find that measles is spreading continuously in the U.S. again, the nation’s leaders may shrug off the change. At times, top officials at the Department of Health and Human Services appear to have dismissed the notion of continuous spread: In November, Jim O’Neill, Health Secretary Robert F. Kennedy Jr.’s deputy and the CDC’s acting director, posted on social media that “preliminary genomic analysis suggests the Utah and Arizona cases are not directly linked to Texas.” (Neither the Trump administration nor PAHO has released the details about the data it reviewed or how similar any identified strains are.) And last week, the CDC’s new Principal Deputy Director Ralph Abraham described the prospect of the U.S. losing its measles-elimination status as “just the cost of doing business with our borders being somewhat porous for global and international travel.” Certain communities, he added, “choose to be unvaccinated. That’s their personal freedom.”

In an email, Emily G. Hilliard, HHS’s press secretary, echoed Abraham’s comments, describing current outbreaks as “largely concentrated in close-knit, under-vaccinated communities with prevalent international travel that raises the risk of measles importation,” and noting that the United States still has a lower measles burden than Canada, Mexico, and much of Europe do.

In practice, the April decision will be a matter of semantics. Whatever the outcome, the U.S. has been weathering a worsening measles situation for years now, as vaccination rates have ticked down and outbreaks have grown larger and more common. In the past year, the Trump administration has made it substantially more difficult for local public-health-response teams to address and contain outbreaks too. HHS reportedly delayed communications from the CDC to officials in West Texas and held back federal funds to fight the outbreak for two months. More recently, HHS pledged to send $1.4 million to address South Carolina’s outbreak, though it began months ago. (One recent analysis suggests that measles outbreaks of this scale can cost upwards of $10 million.) The administration has repeatedly downplayed the benefits of immunization, while exaggerating the importance of nutritional supplementation for combatting measles. Kennedy has also spent decades repeating disproved claims that vaccines such as the measles-mumps-rubella immunization can cause autism. (Hilliard wrote that Kennedy has consistently said that vaccination is the most effective way to prevent measles, but she also emphasized in her email that people should consult with health-care providers about whether vaccination is best for their family.)  

Should immunity erode further—as experts watching the Trump administration’s actions expect it to—measles will find it even easier to move across the country, until epidemics bleed so thoroughly together that their links become irrefutable. Already, the nation’s leaders have made clear where the U.S. stands on measles: It is an acceptable norm.

Last year, starting in January, the United States experienced its largest documented measles outbreak in more than three decades, when an epidemic centered on West Texas sickened at least 762 people. Now a fast-moving outbreak in South Carolina seems poised to surpass it: Local officials have logged 700 infections, and the virus is still rapidly spreading.

As public-health officials scramble to contain the virus, they’re also trying to figure out whether these two outbreaks are connected—specifically, whether the version of the pathogen that kick-started the West Texas cases has been circulating within the nation’s borders ever since. If the answer is yes, it will mean that measles has once again become a permanent resident of this country, after 26 years of only limited outbreaks imported from abroad. Given that the U.S. clocked more than 2,200 measles cases in 2025—more than it has had in a single year since 1991—the experts I spoke with already consider this the reality that Americans are living in. One of the fastest-spreading viral diseases ever documented has once again become a routine threat.

At this point, researchers are working to find the connective tissue among some of the largest measles outbreaks in the U.S. within the past year, including the ones centered in West Texas, Utah, Arizona, and South Carolina. Technically, the epidemics still could have been caused by separate reintroductions of measles from at least one international source. But “that’s a hard stretch,” Robert Bednarczyk, a global-health researcher and epidemiologist at Emory’s Rollins School of Public Health, told me. The most likely and so far best-supported scenario, he said, is also the simplest one—that the virus spread so fiercely and quickly through these communities that it was able to hitch a ride elsewhere in the country when infected people traveled.

If further evidence proves that scenario true, the Pan American Health Organization could strip the U.S. of its official measles-elimination status—which the country has held since 2000—at a meeting scheduled for April. (A country achieves elimination status when it can show that the virus hasn’t been circulating for 12 consecutive months; it loses the status when researchers show that measles has been spreading for a year straight.) Already, PAHO has publicly confirmed that scientists have found the same strain of measles in Texas, New Mexico, Utah, Arizona, and South Carolina, as well as in Canada, Mexico, and multiple other North American countries. (In response to a request for comment, a PAHO spokesperson clarified that although these detections had been made, the committee was still seeking further evidence.)

Health officials also recently announced that South Carolina’s outbreak has seeded cases elsewhere, including Washington State. Still, the case for measles’ continuous transmission can’t yet be considered a slam dunk. To prove it definitively, researchers will need to show that geographically distant outbreaks in the U.S. are epidemiologically linked and that there is not   sufficient evidence suggesting that the virus bounced back and forth between countries.

Two types of information are essential to these investigations. First, researchers look into the travel histories of infected people, who might have brought the virus from one state to another. Second, they compare genetic sequences pulled from the virus across locations. Measles mutates slowly enough that researchers can in many cases search for essentially the same strain when tracking its movements. But the virus does accumulate some changes in its genome, and the further apart two cases are in time, the more genetically distinct their genetic material should be. If measles was being continuously transmitted, scientists might expect to see slightly different iterations of the virus racking up mutations as it traveled, say, from Texas to South Carolina. If measles had been introduced separately to those locations, the sequences pulled from each state might more closely resemble genetic information from an international source, Pavitra Roychoudhury, a pathogen-genomics expert at the University of Washington, told me.

But some of the data that officials need may be lacking. For months, experts have been concerned that the U.S. has been severely undercounting its measles cases and that the virus had been circulating in some communities long before it was officially detected. Cases of the disease can be easy to miss, Helen Chu, an immunologist at the University of Washington, told me. The early days of measles are usually marked by common symptoms such as fever and cough; to the untrained eye, the virus’s rash can look like many of the reddish, patchy blemishes that many other pathogens cause. The overwhelming majority of measles cases in the U.S. have also concentrated in communities that have low vaccination rates, which often have less access to medical care and the sort of testing that would also collect viral samples. Many people who deliberately decline vaccination for their families are also skeptical of seeking medical care in general, or of public-health officials investigating outbreaks.

With inconsistent data, researchers may be left sorting through genetic sequences that neither point clearly to one another nor obviously implicate separate sources. “There’s a judgment call in that gray area,” William Moss, an epidemiologist at the Johns Hopkins Bloomberg School of Public Health, told me. “How different is different?” The last time measles spread endemically in the United States, this sort of genomic analysis was not commonplace.

Should PAHO find that measles is spreading continuously in the U.S. again, the nation’s leaders may shrug off the change. At times, top officials at the Department of Health and Human Services appear to have dismissed the notion of continuous spread: In November, Jim O’Neill, Health Secretary Robert F. Kennedy Jr.’s deputy and the CDC’s acting director, posted on social media that “preliminary genomic analysis suggests the Utah and Arizona cases are not directly linked to Texas.” (Neither the Trump administration nor PAHO has released details about the data they reviewed or how similar any identified strains are.) And last week, the CDC’s new principal deputy director, Ralph Abraham, described the prospect of the U.S. losing its measles-elimination status as “just the cost of doing business with our borders being somewhat porous for global and international travel.” Certain communities, he added, “choose to be unvaccinated. That’s their personal freedom.”

In an email, Emily G. Hilliard, HHS’s press secretary, echoed Abraham’s comments, describing current outbreaks as “largely concentrated in close-knit, under-vaccinated communities with prevalent international travel that raises the risk of measles importation,” and noting that the United States still has a lower measles burden than Canada, Mexico, and much of Europe do.

In practice, the April decision will be a matter of semantics. Whatever the outcome, the U.S. has been weathering a worsening measles situation for years now, as vaccination rates have ticked down and outbreaks have grown larger and more common. In the past year, the Trump administration has also made it substantially more difficult for local public-health-response teams to address and contain outbreaks. HHS reportedly delayed communications from the CDC to officials in West Texas and held back federal funds to fight the outbreak for two months. More recently, HHS pledged to send $1.4 million to address South Carolina’s outbreak, though that outbreak began months ago. (One recent analysis suggests that measles outbreaks of this scale can cost upwards of $10 million.) The administration has repeatedly downplayed the benefits of immunization while exaggerating the importance of nutritional supplementation for combating measles. Kennedy has also spent decades repeating disproved claims that vaccines such as the measles-mumps-rubella immunization can cause autism. (Hilliard wrote that Kennedy has consistently said that vaccination is the most effective way to prevent measles, but she also emphasized in her email that people should consult with health-care providers about whether vaccination is best for their family.)

Should immunity erode further—as experts watching the Trump administration’s actions expect it to—measles will find it even easier to move across the country, until epidemics bleed so thoroughly together that their links become irrefutable. Already, the nation’s leaders have made clear where the U.S. stands on measles: It is an acceptable norm.

Three days into 2026, the United States military seized a foreign leader: Nicolás Maduro. Four days after that, the U.S. health department freed a longtime prisoner of war: saturated fats.

At a recent press conference announcing the publication of the government’s new dietary guidelines, Health and Human Services Secretary Robert F. Kennedy Jr. declared two different military operations in the span of less than a minute: The nation would be retreating from its war on fatty steaks and whole milk, he said, and redeploying for another war, this one on added sugars. News about a third campaign arrived a few days later, when the White House shared a dark and menacing photo of Kennedy with the caption “WE ARE ENDING THE WAR ON PROTEIN.”

This appears to be what happens when someone who has spent years fighting mainstream medicine suddenly finds himself at the center of it. Like a revolutionary turned generalissimo, Kennedy has transformed the former palace into a military command center. He has promised to defeat his enemies in Big Pharma and to purge conflicts of interest from the agencies he leads, so as to end what he has referred to as a “war on public health.” Elsewhere he has promised to withdraw from the “war on alternative medicine,” the “war on stem cells,” the “war on chelating drugs,” the “war on peptides,” the “war on vitamins,” and the “war on minerals.” Anything that his administration hopes to do may now be put in terms of martial conflict: Under Kennedy, policy making and saber rattling go hand in hand.

Kennedy’s deputies and chief advisers are culture warriors in their own right, and they seem to share their leader’s bellicosity. Jim O’Neill, the second in command at HHS, has talked about the need to fight back against gender-affirming medical care, which he describes as being part of an evil war on biology; Mehmet Oz, now in charge of Medicare and Medicaid, says that he will “wage a war on fraud, waste, and abuse”; and Calley Means, a top Kennedy adviser, points to an admittedly less catchy “war on the American public having transparency,” which the secretary intends to halt.

This repeated phrasing is more than just a rhetorical tic, and it extends far beyond the typical military analogies—like the wars on cancer and smoking—that have long been embedded in health discussions. As Kennedy and his aides press their case in public, they adopt a persistently antagonistic tone not only toward disease but also toward the medical and scientific establishment. It is as if anyone who has disagreed with the administration must be an enemy combatant. The current HHS regime has already taken shots at supposedly corrupt pediatricians, conflicted ob-gyns, “fake news” science journalists, and “sock puppet” regulators. (In response to questions about the department’s aggressive posture, HHS press secretary Emily Hilliard asserted that Americans’ trust in public health has been declining, and that Kennedy is restoring it.)

RFK Jr. has, in this regard, been a rather effective secretary of war. He has quickly put the old medical establishment on the defensive. My colleagues in academia and medicine are worried about what might happen to them if they write one of the government’s newly forbidden words in a scientific grant or provide the wrong sort of medical care to their transgender patients; venerable scientific outlets such as The New England Journal of Medicine have had to deal with letters from a government lawyer accusing them of bias; and earlier this month, an ex-FDA official nervously joked that he hoped that the IRS wouldn’t audit his taxes as punishment for criticizing agency operations.

When it comes to public-health advice, Kennedy’s agenda has proved to be more focused on attacking previous suggestions than promoting new ones. “Prior guidelines were driven not by health interests,” Kennedy said in a Fox News interview this month, “but by mercantile interests of the food industry.” FDA Commissioner Marty Makary frames his agency’s actions as “setting the record straight” after years of dogma, and NIH Director Jay Bhattacharya—an author of the anti-lockdown “Great Barrington Declaration”—remains engaged in a heated struggle against COVID-related restrictions that ended years ago. On the whole, the administration appears to have adopted a Promethean view of science and medicine: New knowledge is not gradually discovered, but rather rescued from the grasp of special interests by an elite squadron of iconoclasts.

The contrarian brigade sometimes seems to be waging a war on irony itself. Despite the secretary’s repeated promises to wipe out conflicts of interest, some of the experts who advised on the new dietary guidelines have financial ties to the meat and dairy industries. (When those ties were first reported, HHS responded with a statement calling it “absurd to suggest that anything other than gold standard science guided our work on this presidential priority.”) Makary, an avid podcast guest, has used his airtime to issue devastating takedowns of nutrition education, only to be interrupted by advertisements for unproven dietary supplements. Vinay Prasad, another top FDA official and medical provocateur, has joined Makary on the agency’s FDA Direct podcast for what they called a “bashing session” of The Wall Street Journal’s opinion desk—which has a long history of publishing Makary, Bhattacharya, and Kennedy.

Rousing citizens with patriotic calls to battle is a tried-and-true political strategy, but the hostility generated by this public-health administration may not be sustainable. Adrenaline surges don’t last forever, and overreliance on extreme rhetoric will flatten important differences between public-health problems. Repeated attempts to discredit trusted medical experts may also backfire. Earlier this month, a federal judge temporarily prevented the government from terminating millions of dollars in public-health grants awarded to the American Academy of Pediatrics. In her ruling, Judge Beryl Howell cited multiple combative social-media posts made by HHS officials and advisers against the physician group. At the very least, the MAHA strategy of picking fights against the nation’s primary care doctors, news outlets, and career officials is unlikely to inspire a resurgence of the public’s trust. Outrage over skim milk and Froot Loops can only go so far, and Americans may soon grow tired of Kennedy’s forever wars.

For the past year, the United States has gone without its doctor. Ever since Vivek Murthy resigned as surgeon general last January, the role has remained empty despite President Trump’s attempts to fill it. He first nominated the physician Janette Nesheiwat but withdrew her nomination in May after reports that she completed her M.D. not in Arkansas, as she had claimed, but in St. Maarten. In her place, Trump nominated Casey Means, whose background is odd, to say the least.

Means is a Stanford Medicine graduate who dropped out of her surgical residency and has since made a career infusing spiritual beliefs into her wellness company, social-media accounts, and best-selling book. The exact nature of her spirituality is hard to parse: Means adopts an anti-institutionalist, salad-bar approach. She might share Kabbalah or Buddhist teachings, or quote Rumi or the movie Moana. She has written about speaking to trees and participating in full-moon ceremonies, both of which drew ridicule from the conservative activist and unofficial Trump adviser Laura Loomer. Her belief in “the divine feminine” (which she doesn’t quite explain) seems to have led her to renounce hormonal birth-control pills for halting the “cyclical life-giving nature of women.”

Although months have passed since her nomination, Means has still not appeared before Congress—in part because she went into labor with her first child hours before her confirmation hearing was scheduled to begin. (Means did not respond to questions for this story. A spokesperson for Bill Cassidy, who chairs the relevant Senate committee, told me that “the hearing will be rescheduled in the future when Dr. Means is ready” but did not offer a more detailed timeline.) The United States’ year without a surgeon general raises questions about how necessary the role really is. But the surgeon general still serves as the government’s leading spokesperson on public health, and if Means is eventually confirmed, her theology will become rather consequential because it is deeply tied to her beliefs about health. In 2024, she declared in a Senate roundtable on chronic disease that “what we are dealing with here is so much more than a physical health crisis. This is a spiritual crisis.” Part of her solution to both of these crises is to reject experts and institutions in favor of something far more alluring: intuition.

Means wrote in 2024 that she grew up in the Catholic faith, but left the Church in college. She grew fascinated by lectures at the Self-Realization Fellowship Lake Shrine, a spiritual center in Pacific Palisades, California. SRF, the religious organization behind it, was founded in 1920 by Paramahansa Yogananda, the “father of yoga in the West,” whose image graced the album cover of Sgt. Pepper’s Lonely Hearts Club Band. It accepts the teachings of Jesus and other spiritual masters and divinities, but seemingly nothing is as important as one’s personal relationship with God. Yogananda’s book, The Second Coming of Christ, posits that the Second Coming is not necessarily literal, but instead entails an awakening of the divine consciousness in ourselves.

SRF’s influence is apparent in Means’s advice that people follow their “heart intelligence” and “divine intuition” and avoid “blindly ‘trusting the science.’” In a newsletter sponsored by a probiotic-supplement company, she wrote that “applying the scientific method to health and disease has immense utility for helping us understand the natural world and live healthy, longer lives, but it feels increasingly like there is a campaign being enacted against our divine gifts of intuition and heart intelligence.” In another newsletter, she wrote about the role of divine intuition in deciding whether to drink raw milk: She wants to be free to look a local farmer in the eye, “pet his cow, and then decide if I feel safe to drink the milk from his farm.” (One could very well have a lovely experience with a farmer, Kevin Klatt, a registered dietitian and research scientist at UC Berkeley, told me, “but it isn’t going to change the fact that raw milk might give you listeria.”)

In the same newsletter championing bovine contact, Means laments a spiritual crisis of connection to nature. She frequently portrays nature as a force with humanity’s best interests at heart, nearly synonymous with God. In her book, she suggests that chronic stress and trauma can be treated by, among other things, spending time in nature and through “plant medicine”—specifically, psilocybin-assisted therapy. (Means has also written that psychedelics helped her be “one with the moon.”) In that sponsored newsletter, she warned of a prophecy she says was put forth by the Indigenous Kogi people of Colombia, in which humanity has only until 2026 to prove we want to right the wrongs we have foisted upon the Earth, or we will all die. “I use the Kogi prophecy metaphorically,” she wrote. “But I do feel we are on a road to disaster. I think we should take these messages seriously.” Natural disasters, she implied, are a “communication from God.”

Nature worship might be especially appealing at a time when trust in experts is declining and technology has become ever more inscrutable and overwhelming, Alan Levinovitz, a professor of religion at James Madison University and the author of Natural: How Faith in Nature’s Goodness Leads to Harmful Fads, Unjust Laws, and Flawed Science, told me. Means’s appeal to nature and intuition, he said, is empowering because it puts expertise back into everyday Americans’ hands.

The ambiguity of Means’s spiritual views strengthens her appeal—they can be interpreted to fit a wide array of belief systems. Her 2024 New York Times best seller, Good Energy, uses terms such as energy and life force, along with scientific-sounding descriptions of metabolic processes, to insinuate that the vibes are off in the American diet and lifestyle. (Means wrote Good Energy with her brother, Calley, who is now a close adviser to Robert F. Kennedy Jr., the secretary of Health and Human Services.) In her newsletter, she encourages her readers to “avoid conventionally grown foods at all costs,” and warns that buying nonorganic food is a vote to “diminish the life force on this planet” while the use of synthetic pesticides “is giving a poor signal to God (Source!) that we want this miracle to continue.” (Source refers to a godlike or all-powerful entity.) “She’s drawing on lots of different ideas very freely and without much rigor in ways that feel good,” Joseph Baker, a sociologist specializing in religion at East Tennessee State University, told me. “That sort of allows her to seem like a visionary without having to specify anything.”

Emily Hilliard, a press secretary for the Department of Health and Human Services, wrote in an email that religious and spiritual beliefs should not be held against anyone who seeks a government job, and that Means’s “credentials, research background, and experience in public life give her the right insights to be the surgeon general who helps make sure America never again becomes the sickest nation on earth.” The surgeon general has little power to enforce policy, but can call on Congress to put warnings on products like the ones seen on cigarette packets, release guidelines and reports, and lend support to various initiatives. Means’s belief system—which Baker characterized as a “sacralization of the individual”—suggests that she will use that platform to invite Americans to master their own health. In Good Energy, Means writes of chronic conditions such as depression, anxiety, infertility, insomnia, heart disease, erectile dysfunction, and cancer, “The ability to prevent and reverse these conditions—and feel incredible today—is under your control and simpler than you think.”

That statement is one of many in which Means echoes elements of manifestation: the belief that thinking good thoughts and putting in effort begets good things. Manifestation, Means says, is real. She advocates “tapping into the abundance that is a sheer law of our universe” and calling on a higher power—“When was the last time you simply sat quietly and asked God/spirit/ancestors/nature to help show you the way and guide you to your highest purpose?” she wrote in her newsletter—but also putting in the hard, hard work.

Means goes beyond intuition and heart intelligence to offer concrete suggestions for labor (and spending) that will be divinely rewarded—essentially, a reimagined prosperity gospel. The nature of that work is detailed in the penultimate section of Good Energy. Means recommends eating minimally processed and mostly organic foods, and taking regular cold plunges or showers. (In her newsletter, she also advises Americans to grow the majority of their food; instead of pets, they could “raise chickens and goats and have abundant eggs and milk.”) She includes checklists upon checklists of habits and tests that “enable Good Energy” (and recommends getting a comprehensive lab panel from Function Health, in which she was an investor). She suggests buying a glucose monitor through her own company, Levels, and also recommends various personal-care apps, water filters, and trackers for sleep, food, and activity. Some of these items are sold by the wellness company True Medicine, which helps customers use their health savings accounts for a wide range of purchases, and in which Means has invested; her brother co-founded it. According to financial disclosures made public in September, Means has also received more than $275,000 from supplement companies. (Means has pledged to divest from True Medicine and other wellness interests if she is confirmed.)

Besides potentially boosting her own bottom line, Means’s embrace of individualism in health is wholly unrealistic. Americans work longer hours than people in many other developed nations, and many don’t have enough time to cook dinner, let alone raise goats. Many of the most important nutrition victories over the past century, such as the fortification of foods and the removal of trans fats, were communal and systemic, Klatt, the dietitian and UC Berkeley researcher, told me—the type of science-backed, population-level interventions that Means hasn’t demonstrated much interest in. A different prospective surgeon general might recommend repeated visits with a dietitian and fight for insurance to cover them, instead of “advocating for this kind of woo-woo stuff that has no data behind it,” Klatt said. Means, though, “is not an individual who seems to be wedded to the scientific process,” Timothy Caulfield, a professor and the research director at the Health Law Institute at the University of Alberta, told me. “This is someone who seems to pull things out of thin air and then look for sciencey-sounding rhetoric” to support them.

Perhaps Means’s eventual confirmation hearing will clarify what, exactly, she intends to do as the face of American public health. But even she may not be sure. “The future of medicine will be about light,” Means wrote to her newsletter subscribers last year, before admitting, “I don’t exactly know how yet.”

Milk is mundane in most contexts, but you can’t help noticing when it is smeared across the upper lips of America’s government officials. An image of Donald Trump sporting a milk mustache and glowering over a glass of milk was just one of many dairy-themed posts shared by government accounts on X during the past week, all of which made clear that the milk was whole. In one video, a seemingly AI-generated Robert F. Kennedy Jr. takes a sip and is transported to a nightclub, suddenly milk-mustachioed; in another, former Housing Secretary Ben Carson raises a glass of full-fat and sports a white ’stache. The upper lips of the former collegiate swimmer Riley Gaines and the former NBA player Enes Kanter Freedom, among other personalities embraced by the right, also got the whole-milk treatment.

The posts were shared to celebrate a big month for whole milk. On January 7, the Department of Agriculture released its updated Dietary Guidelines for Americans, which newly recommend whole dairy over low-fat products and place a carton of whole milk near the top of a revamped, upside-down food pyramid. Then, on Wednesday, President Trump signed into law a bill allowing schools to serve whole milk after more than a decade of being restricted to low-fat.

Medical professionals, who have long advised people to avoid full-fat dairy because it contains high levels of saturated fat, were generally critical of the new dietary guidelines for milk. But Kennedy and Trump, along with other government officials, have framed the change as a major win for health. Kennedy recently argued that America’s children have been missing out on key nutrients such as calcium and vitamin D because they don’t want to drink the low-fat milk served in schools. The new law, he said at its signing, embodies the new dietary guidelines’ directive to “eat real food.”

The low-fat-versus-whole controversy is a real, active scientific debate. For roughly the past two decades, reduced-fat milk (2 percent milk fat, by weight) has dominated American refrigerators largely thanks to fears about fat in general, and saturated fat in particular. Copious research has linked saturated-fat intake with health issues including cardiovascular disease and cancer, as well as death from all causes. It also leads to higher LDL (“bad”) cholesterol, which has been shown to cause strokes and heart attacks, Kyla Lara-Breitinger, a cardiologist at the Mayo Clinic, told me.

Saturated fat generally isn’t a huge concern for children, so giving them the option to drink whole milk at school is somewhat less fraught, Steven Abrams, a child-nutrition expert and a member of the American Academy of Pediatrics, told me. And some researchers propose that, because whole milk is more satiating, kids who drink it are less likely to reach for other high-calorie foods. “Full-fat dairy is especially important for kids ages 12 months to 10 years to meet energy needs and promote brain development,” a spokesperson for the Department of Agriculture wrote in an email. But the AAP holds that kids should switch to drinking low-fat or skim at age 2.

In contrast to most nutritionists, Kennedy is all in on saturated fat, championing foods such as butter, beef tallow, and red meat. At a press conference to announce the new dietary guidelines, Kennedy proclaimed that the government was “ending the war on saturated fats.” The reality is more confusing. The new dietary guidelines promote more foods that are high in saturated fat, but they retain the old recommendation to limit daily saturated-fat intake to 10 percent of total calories, or about 20 grams a day in a 2,000-calorie diet. A single cup of whole milk has 5 grams, so the recommended three daily servings of full-fat dairy, if each were a cup of whole milk, would account for roughly 15 of those grams before any other food is counted. Under that math, staying within the limit would be “pretty close to impossible,” Caitlin Dow, a senior nutrition scientist at the Center for Science in the Public Interest, told me. (The White House and the Department of Health and Human Services did not respond to a request for comment.)

A relatively new and controversial school of thought posits that full-fat milk isn’t as harmful as other sources of saturated fat. A 2018 study that involved participants from 21 countries found that dairy consumption—even whole-fat dairy—was negatively correlated with mortality and major cardiovascular-disease events. Other studies have shown that the consumption of whole-fat dairy is linked to decreased diabetes risk and doesn’t cause weight gain. “There’s no convincing evidence that low-fat dairy is preferable to whole-fat dairy for any health outcome,” Dariush Mozaffarian, a cardiologist at Tufts University who was a co-author on the 2018 study, told me. The broader research community has so far resisted this idea, but has acknowledged that the science on dairy fat has become more complex. “The reason you’re getting so many conflicting opinions is that the evidence is very controversial,” Lara-Breitinger said, noting the lack of randomized clinical trials comparing whole-fat and low-fat dairy.

Ultimately, milk isn’t “going to make or break a diet,” Dow said. Dairy makes up just 10 percent of the average American’s caloric intake, and most of that is cheese. Even for kids, very real concerns, such as obesity and diabetes, will probably not be solved—or meaningfully exacerbated—by a switch to whole milk. “You could probably have either low-fat or whole-fat, and it doesn’t matter,” Mozaffarian said.

As I have written previously, Americans have spent roughly the past 150 years quarreling about various aspects of milk, including its benefits, safety, and chemical composition. That’s partly because dairy is a powerful industry; last year, dairy products in the U.S. had an economic impact of nearly $780 billion. But since 2012, when the USDA under then-President Barack Obama required schools to serve only low-fat milk, student milk consumption has declined; according to the dairy industry, that’s because low-fat milk doesn’t taste as good. The Trump administration’s promotion of whole milk, Dow said, “really, really supports the dairy industry’s bottom line.” In fact, many of the reviewers of the new dietary guidelines were recently found to have ties to the beef and dairy industries. (When I asked the USDA about allegations of industry influence on the push for whole milk, the spokesperson asserted that the evidence “was evaluated based solely on scientific rigor, study design, consistency of findings, and biological plausibility.”)

Beyond serving as an economic engine, milk is a potent cultural symbol. It has long evoked an idealized past: a simpler time when cows roamed through pastures and produced pure, wholesome milk, and the Americans who tended them thrived in harmony with the natural world. Dairy companies have leaned into that aesthetic, featuring barns, fields, and words such as pure on milk cartons. Milk is also culturally linked to strength, wealth, and beauty, thanks in no small part to the celebrity-studded dairy ads of the late 20th century, including the “Got Milk?” campaign referenced by the Trump administration’s milk mustaches. Such positive associations make milk a powerful metaphor for what America could be—if certain unsavory elements of modernity could be undone or erased.

Perhaps unsurprisingly, this association has also been invoked in racist contexts for more than a century. In a 1923 speech, Herbert Hoover, who was then commerce secretary and would be elected president five years later, framed milk as a means to ensure “the very growth and virility of the white races.” Modern-day white nationalists and alt-right groups hold up dairy milk as a symbol of whiteness and masculinity, in contrast to soy milk, which they associate with the woke, feminist, multiracial left. (Yes, seriously.)

The idealized era of perfectly safe, perfectly wholesome dairy never really existed. “This whole idea that there was a time when we were healthy, and during that time we were eating steak and drinking whole milk, is not rooted in any reality,” Dow said. Nevertheless, it resonates with the MAHA and MAGA agendas, which both center on the belief that America will return to its former glory if it can re-create the past. The Trump administration’s endorsement of whole milk may nominally be about public health. But a recent White House post featuring a retro illustration of the president as an old-fashioned milkman, captioned “Make Whole Milk Great Again,” was all about aspiration—and the purified nation, untainted by modernity, that America could someday become.