Hong Kong Supermarket looked exactly as it always had. When I visited the store in Manhattan’s Chinatown last week, buckets of live crabs were stacked precariously next to bags of sweet-potato starch and shrink-wrapped boxes of dried shiitake mushrooms. The instant noodles took up two walls, where I quickly found my beloved and gloriously weird cheese-flavored kind. The aisles were packed with the usual staples: black vinegar, bags of vermicelli, sacks of jasmine rice big enough to body-slam a man.

But the product labels gave away that something was wrong: Product of China, many of them read in Mandarin. Almost everything at Hong Kong Supermarket is imported from China, and, because of tariffs, those products could soon get more expensive. President Donald Trump’s 145 percent tax on goods imported from China affects everything from sofas to socks. Starting tomorrow, the fast-fashion giants Shein and Temu will hike their prices. And for some Americans, sticker shock from tariffs might mean skipping a new pair of jeans or squeezing a few more months out of a wheezing vacuum cleaner.

But the tariffs are especially tough on Chinese grocery stores and their customers. Unlike retailers that just happen to sell Chinese-made clothes and gadgets, stores like Hong Kong Supermarket are stocked with Chinese products precisely because they are made in China. After all, I’ve yet to come across American brands that make my cheese-flavored noodles. Chinese grocery stores are a lifeline for millions of Americans like me. They’re where you can always count on finding basic ingredients that you’ll never see in Trader Joe’s or Whole Foods. In a world of tariffs, the Chinese grocery store has gone from a space of security to one of low, simmering dread: the kind that comes from watching the small constants of your life get a little more expensive, a little more distant.

At Hong Kong Supermarket, the prices haven’t gone up yet, but customers are bracing for hikes. There’s more pausing at price tags. More sighing. Aunties in quilted jackets crowd the produce bins, where their shopping carts tell the story of careful calculation: one bunch of scallions instead of two, a single pork bun where there might have been three, the occasional wistful glance toward the $13.99 fresh durian in the cardboard barrel. In the dried-snacks aisle, the shopper beside me stared longingly at a jar of salted plums. Anna Chen, a slight 50-year-old woman holding an empty green shopping basket, told me that tariffs were on her mind. “I really hope the prices don’t get higher,” she said.

They will, Wille Wang, a manager at Hong Kong Supermarket, told me. The store hasn’t had to increase prices much so far, he said, but it’s only a matter of time if the tariffs remain in effect. “What can we do? It’s not our fault; we can’t control tariffs. Unless we sell at a loss, which isn’t sustainable.” He expects that cheap products might go up a little, but that the big jumps will be on premium goods and hyper-specific varieties. I thought of fermented bean curds, a hot-pot favorite; black-yolked century eggs, found in so many congees; and the sea cucumbers gifted to every grandparent. When existing inventory runs out, store owners will face hard choices: Eat the costs and risk going under. Raise prices and risk losing customers, as some businesses with Chinese suppliers are already doing. Seek alternative suppliers and risk altering the flavors that define their communities.

Everyone loses. Shoppers who frequent Chinese grocery stores may have few alternatives but to shell out more money for their food. You can’t swap out the Pixian bean paste in your mother’s mapo tofu for something generic from the “international” aisle and hope she won’t notice. You can’t trade out Shaoxing wine for dry sherry. Substitutions only go so far before the dish falls apart—one missing ingredient, and you’re eating a sad memory of something else. “Western grocery stores don’t have the groceries I need,” Chen said. “If prices keep going up, I can’t do anything about it.”

At some point, a work-around becomes a compromise, and a compromise becomes a resignation. These stores are where people can keep eating the way they’ve always eaten. Many people go to them not for novelty, but for continuity. “I’m thinking of stockpiling things like soy sauce and condiments,” said Fred Wan, a shopper whom I approached near the fish department. He’s a 34-year-old who moved from Beijing to New York eight years ago; he and his wife recently moved closer to Chinatown partly to have better access to Chinese grocery stores. “I’m definitely worried.”

Chinese grocery stores are under pressure in more ways than one: Not only do they stock lots of products that are now subject to steep tariffs, but they already tend to run on thin margins. “Small, independent grocery stores—especially those catering to ethnic communities—are particularly vulnerable,” David Ortega, a food-economics professor at Michigan State University, told me. If Trump’s full slate of tariffs goes into effect in a few months, the pain won’t stop at Chinese grocers. Vietnam is facing some of the steepest proposed tariff hikes. South Asian grocers might see seasonal delicacies like Alphonso mangoes get more expensive, if they can get them at all. (“Crying in H Mart” may soon take on a new meaning.)

If the costs of cultural foods keep rising, we’ll all feel it. More and more non-Chinese shoppers frequent these stores because they’re the only places that carry ingredients now found in many kitchens—chili crisp, black vinegar, dumpling wrappers—or at least sell them cheaply. Food media, emphasizing that authenticity is a virtue, have popularized the idea that a visit to H Mart or the corner Chinese grocer will help you cook better. Big retailers have picked up brands popularized by smaller Chinese stores, such as Kikkoman, Lee Kum Kee, and the pantry favorite Lao Gan Ma chili crisp. The irony is that even as Asian groceries have become more mainstream, more cross-cultural, more popular than ever, tariffs are casting doubt on Americans’ ability to actually buy them. Tariffs shape and reinforce what’s affordable, what’s available, and, ultimately, whose cultures get priced out of reach.

After leaving Hong Kong Supermarket, I headed to Po Wing Hong, the grocer down the street. The store smelled like herbs and floor cleaner. A little boy was crouched in front of a stack of Jin Jin lychee jellies, squeezing each one to figure out which had the most juice. I overheard two teens calculating how many instant-noodle packs they could buy. (Answer: fewer than they’d like.) I passed a big box of packaged nuts and grains slapped with a bright yellow sign. On it, prices had been crossed out and updated in black pen. Peeled mung beans: formerly $1.75 a bag, now $1.99. Dried chestnuts: formerly $9.99, now $11.55. On my way out of the store, I walked past a stack of discarded cardboard boxes, all still marked with Chinese shipping labels.

One of the most notable things about Robert F. Kennedy Jr.—the secretary of the Department of Health and Human Services, a federal agency tasked with “improving the health, safety, and well-being of America”—is how confidently he distorts the basics of health, safety, and well-being.

In his short stint as health secretary, Kennedy has touted cod-liver oil as a valid measles treatment (it’s not), said that Americans are being “poisoned” by seed oils (they’re not), and claimed that “many” vaccines are not adequately safety-tested (they are). And he has readily cherry-picked and exaggerated findings to suit his own needs: “There’s a scientist at Harvard now who is curing schizophrenia with a carnivore diet,” he said at a press conference in March (it’s not a carnivore diet, and it’s not a cure).

The secretary also seems to think he knows what causes autism, a topic that scientists have been looking into for decades without producing a simple, clear-cut result, M. Daniele Fallin, a genetic epidemiologist at Emory University, told me. Kennedy, however, is adamant that a series of new investigations by his department will reveal at least “some of the answers” by September. “And we will be able to eliminate those exposures,” he said at a recent Cabinet meeting.

Since its first days, the new Trump administration has clearly shown where it thinks scientific attention should not be focused: It has attempted to censor federal scientific data, cut billions in government spending on research, and compromised care for some of the world’s most at-risk populations. Now, as the nation’s leaders have begun to encourage inquiry into specific areas, they are signaling that they’re willing not just to slash and burn research that challenges their political ideology but also to replace it with shoddy studies designed to support their goals, under the guise of scientific legitimacy.


Just last week, in a press conference discussing a new CDC report that described a continued rise in the prevalence of autism, Kennedy pledged to “follow the science no matter what it says.” And in an email, Kush Desai, a White House spokesperson, told me that the administration’s intention is “to leave no stone unturned to get to the bottom of America’s epidemic of chronic diseases and conditions.” But those statements seem at odds with Kennedy’s behavior.

Among scientists who study and treat autism, the consensus has long been that “there is no ‘one cause’” of autism, Neelkamal Soares, a developmental and behavioral pediatrician in Michigan, told me. Genetics are likely to play a role; researchers have also explored the possible contributions of factors such as parental age; labor and delivery conditions; and exposures to certain chemicals, medications, or infections during pregnancy. Experts also generally agree that much of the growing prevalence of autism can be attributed to increased awareness and diagnosis—an explanation that the CDC, an agency Kennedy oversees, cited in its report.

But at last week’s press conference, Kennedy dismissed that explanation as “a canard of epidemic denial.” He instead claimed, without citing any data, that autism rates soared after “industry” contaminated Americans with a “toxin,” and called genetics a “dead end” for future research. “Somebody made a profit by putting that environmental toxin into our air, our water, our medicines, our food,” he said. And he appears to be trying to bolster that viewpoint with what will now count as official government research—“a series of new studies,” he said, “to identify precisely what the environmental toxins are that are causing it.”

Kennedy did promise to look at all possible environmental factors “agnostically.” But several experts told me they’re worried that the secretary has at least one particular exposure in mind. For years, Kennedy has championed the debunked idea that childhood vaccinations cause autism. And in March, he reportedly tapped David Geier, a discredited health analyst who has long promoted the notion that vaccine ingredients cause autism, to lead an HHS study to once again search for a link between immunizations and the neurodevelopmental condition. Kennedy didn’t invoke vaccines when describing this new research, and since his confirmation as HHS secretary, he has been more sanguine—albeit inconsistently—about the benefits of shots. But vaccines have, for years, stood out in his rhetoric as “a very clear preconceived hypothesis,” Megan Pesch, a developmental and behavioral pediatrician at the University of Michigan, told me.

Kennedy, who has no scientific or medical training himself, also seems confused about what a scientifically rigorous investigation would entail—and how long it might take. During this month’s Cabinet meeting, he said that by September, HHS would complete “a massive research and testing effort involving hundreds of scientists from around the world.” At last week’s press conference, however, his comments suggested that HHS might rely heavily on AI and electronic-health-record data, which aren’t gathered uniformly, can depend on self-reporting, and cover only populations that interact with the health-care system. And Jay Bhattacharya, the new, Trump-appointed director of NIH, recently gave a presentation detailing the administration’s plans to source data for these investigations from hospitals, pharmacies, wearable devices, and other private sources with limited reach.

That approach, experts told me, can’t provide enough evidence to definitively pinpoint autism’s cause, much less guide policy to eliminate it. “The chances of getting garbage are so high,” Catherine Lord, a clinical psychologist at UCLA’s Semel Institute for Neuroscience and Human Behavior, told me. Desai said that Kennedy had promised “an exhaustive examination of the underlying causes of autism,” which “naturally would include use of data points such as electronic health records, among other data sets,” and noted that the secretary is focused on fulfilling President Donald Trump’s directives “with the Gold Standard of Science.”

No matter which methodologies Kennedy chooses, his September deadline “is ridiculous,” Lord said. Even the job of assembling the expert task force to initiate such a project could stretch past September. And Bhattacharya has already suggested that getting results could take longer. (When asked about the practicality of this timeline, Desai replied, “Would the preference be vague timelines and no commitments to address a matter of utmost concern for millions of Americans?”) One way to rigorously gather more data on autism’s causes would involve following a large, representative sample of the American population over time, tracking participants’ exposures, taking into account their genetic and health history, and monitoring whether any of them develop autism. The process would take years—and still may not yield causes as clear-cut or easy to “eliminate” as Kennedy seems to expect. But an administration that already knows the answers it wants doesn’t need years to find them.


Government-funded science has, to some extent, always been subject to the political priorities of leadership: The National Institutes of Health, for instance, is run by a political appointee. Trump and his allies, however, have already demonstrated that they are willing not just to set priorities, but to engage in science theater, with even more far-reaching interference in the field of trans health.

From the start, the new administration’s views on transgender health have been clear. Since January, Trump has issued executive orders denying the existence of gender and describing hormone therapy and gender-affirming surgery for children as “maiming,” “sterilizing,” and “mutilation.” One order announced that the administration would not “fund, sponsor, promote, assist, or support the so-called ‘transition’ of a child from one sex to another” and instructed federal agencies to end such procedures at government-funded institutions, leading hospitals and clinics across the country to pause gender-affirming care for young people. Through the NIH, the administration has also slashed funding from more than a hundred research grants involving trans people.

In place of this research, HHS is pursuing a pointed agenda to confirm the president’s statements. Last month, Matthew Memoli, the acting director of the NIH at the time, sent top agency officials a memo with instructions to urgently “fund research” into “regret and detransition following social transition as well as chemical and surgical mutilation of children and adults.” (In his email, Desai described the memo’s stipulations as “realigning taxpayer-funded research to align with the priorities of the American people.”) “This is very important to the President and the Secretary,” the memo read.

The outcomes of gender-affirming care do need further study across the age spectrum, experts told me. Scientists still don’t have a full sense of the long-term outcomes of transition on mental and physical health, or how to best tailor interventions to patients. (Extended use of certain hormones, for instance, could raise people’s risk of some cancers or cardiovascular complications.) More research is needed, in particular, on how best to support gender-diverse youth, a growing sector of the population. But the kinds of research that the Trump administration is pursuing won’t help clarify or alleviate those concerns. And of all the scientific questions that could be asked about trans health, “regret and detransition aren’t the major problems,” Arjee Restar, a social epidemiologist at Yale, told me: Studies have found that adults and adolescents are generally very satisfied with the outcomes of hormone therapy and gender-affirming surgeries, and that rates of regret following surgeries are only about 1 percent.

NIH officials found the memo’s directives appalling. “This is not how we do science, ever,” one of them, who requested anonymity out of concern for professional retribution, told me. “This is politicized research, exactly what we were always told we would never do.” In his memo, Memoli specified that studies into the outcomes of gender-affirming care should deploy “methods that don’t themselves subsidize or incentivize such practices as previous NIH studies have done.” (Desai pointed to a case in which an NIH-funded researcher cited politics to help explain her hesitation to publish a study with unfavorable results about puberty blockers.) But previous NIH studies have never “subsidized or incentivized” gender-affirming care, the NIH official told me. Rather, they followed the recipients of that care over time, and observed the results.

In contrast, Memoli’s memo unabashedly advertised the conclusion that the administration is pushing for: that gender-affirming care is harmful and regrettable. The directive also implicitly solicits researchers who “are following the administration’s example,” Logan S. Casey, the director of policy research for the Movement Advancement Project, an equality-focused think tank, told me—and, potentially, participants who might share those viewpoints as well. That makes it all the more likely that those projects will produce the skewed results the administration wants to see.

This is consistent with everything Trump and his allies have revealed about their views on science since January: that it is not a means to better understand objective reality, but a political weapon that they must guard against, or deploy themselves. In recent months, Kennedy has accused the expert committee that counsels the CDC on its nationwide vaccine recommendations of being in the pocket of vaccine manufacturers; the administration has also fired from HHS several scientists who were prominent leaders in the COVID-19 response, including a few closely affiliated with Anthony Fauci, whom Trump has ridiculed as a “disaster” and an idiot and Desai derided as one of many “demonstrably fallible ‘experts.’” Last week, administration officials also redirected two federal websites, once used to share information on COVID-19 tests, treatments, and vaccines, to a page promoting the idea that the coronavirus pandemic began as a lab leak, rather than Fauci’s “preferred narrative that COVID-19 originated in nature.”

The causes of autism, the outcomes of gender-affirming care, and the origins of SARS-CoV-2 are all topics worthy of scientific investigation. But how questions are asked can influence the answers they yield—and directly affect the populations they’re asked about. The language in the NIH memo is “alarming and inflammatory,” Camie Nitzel, a psychologist who specializes in transgender and gender-diverse people, told me: It shows that the administration is pursuing these studies not from a place of genuine inquiry, but from prejudice. Disdain is coded into the administration’s methodology on autism, too: In his briefings on HHS’s new pursuits, Kennedy has repeatedly described autism as a scourge worse than COVID-19 that “destroys” families and children, and insinuated that it should be purged from the population. But the implication of both the administration’s statements and its proposed studies is that neither trans people nor autistic people should visibly exist in America. Science is now yet another tool that the government is using to disappear anyone it deems undesirable.

Mallory Berry was ready to give up. It was 2019, and her addiction—prescription opioids had led her to heroin—had left her bedridden. An infection had eaten through parts of her pelvic bones, stomach muscles, and vertebrae, causing a pain so excruciating that she was afraid to move. Her partner, Randy, placed buckets under her body when she needed to use the bathroom. Bathed her and fed her. Before long, Mallory stopped eating entirely, subsisting on sweet tea and water, wasting away.

Four years later, Mallory would recount this memory to me over the phone from her house—a five-bedroom, three-bathroom colonial with a walk-in closet and a two-car garage—that she paid for with her earnings as a manager at a mortgage company, where she oversaw 10 direct reports.

When I spoke with her in March last year, she largely credited this remarkable turnaround not to her own willpower, or the grace of God, but to a widely available medication called buprenorphine.

The world of opioid addiction is one of morbid statistics. People struggling with opioid addiction have a mortality rate 10 times higher than the general population in the United States. The relapse rate for those in recovery is as high as 70 percent. As one doctor, an addiction specialist named Stephen Martin, put it to me, “The natural history of opioid-use disorder is: People die.”

Which is why the numbers describing buprenorphine’s impact stand out. Buprenorphine can stop cravings for opioids, and people who use it are 38 percent less likely to die of an overdose. After buprenorphine was adopted at scale in the midst of France’s opioid crisis in the 1980s and ’90s, overdose deaths dropped by 79 percent.

About a year ago, when I began reporting this story as part of an Atlantic podcast series, I asked Mallory how important this treatment was to her recovery. She, too, reached for a number. “If you want a percentage,” she told me, “75 percent.” Yet being on buprenorphine at all made Mallory an outlier. As of 2021, only about one in five patients struggling with opioid-use disorder was taking this or other medications for treating addiction. One of the most effective tools for defusing a crisis that in recent years has killed more than 80,000 people annually is going unused.

Underpinning this failure is a quiet conviction among doctors and patients alike that taking buprenorphine doesn’t count as success—that people who use drugs to recover from drug use are still addicts, the sobriety they achieve is fake, and the drugs saving their lives could spark the next wave of the opioid epidemic. At the same time, the drugs that have made addiction even more deadly—synthetic opioids such as fentanyl—are making buprenorphine more complicated to use. As a result, the window in the U.S. for this treatment to fulfill its greatest promise is nearly closed.

When buprenorphine arrived in the U.S. in 2002, the country was immersed in what would later be categorized as the first wave of the opioid epidemic. That year, 11,920 people died from opioid overdoses, which at the time sounded alarms in the medical community. (The annual death toll would increase sevenfold over the next two decades.)

The triumph in France was already well known to addiction specialists. Many believed that once more of their colleagues in health care understood the silver bullet they’d been handed, the treatment would find its way into primary care, Martin told me. The crisis could be stopped in its tracks.

But buprenorphine’s strength as a treatment is also a weakness. Buprenorphine is an opioid. Like methadone, the tightly regulated treatment for opioid addiction used in America since the 1960s, it acts on receptors in the brain to satisfy cravings.

Compared with methadone, buprenorphine is more difficult to overdose on and easier to access. It also has a stronger affinity for opioid receptors than opioids such as heroin or fentanyl, which can protect patients against overdose if they relapse. In a brain flooded simultaneously with buprenorphine and another opioid—the brain of a buprenorphine patient relapsing on fentanyl, for example—buprenorphine has dibs on the relevant receptors. Without available receptors, both the high and the harm of an opioid such as heroin or fentanyl are greatly reduced or eliminated. This is why buprenorphine can be a powerful medication. It strikes at the root of addiction, and protects patients when they slip up.

Yet it requires prescribers, regulators, and patients to accept that a person can be sober while taking an opioid every day. That idea cuts against the narrow definition of sobriety that America’s addiction model was—and to a large extent still is—built on. Whereas many European countries have successfully implemented coordinated, low-barrier access to treatments like methadone and buprenorphine, abstinence-based opioid addiction treatment is alive and well in America. In many Narcotics Anonymous meetings, for instance, attendees taking buprenorphine are treated the same as those in active addiction: They can listen, but not speak. In many sober houses, the first stop after rehab, residents are not permitted to take buprenorphine. “There’s a particular path to treatment and recovery” in those settings, Erin Madden, a professor at Wayne State University who studies stigma and addiction, told me, and “medications can’t be a part of it.” Asking people whose lives were destroyed by opioid dependence to depend on a different opioid is already a hard sell. Asking them to accept this dependence in a culture of abstinence is nearly impossible.

Fear that buprenorphine would be abused also worked against its broad adoption. In the early days of the crisis, prescription opioids such as oxycodone were driving a sharp rise in deaths. The DEA, for one, was skeptical that another prescription opioid could be the solution to widespread abuse of prescription opioids, in part because the Department of Justice had reported some evidence of buprenorphine abuse, particularly in the Northeast, as early as 2004. And opioid-addiction treatments do have a history of abuse: Heroin, for example, was once considered an effective treatment for codeine and morphine addiction. The year buprenorphine was approved, the DEA restricted its dispensation. Buprenorphine was moved from a Schedule V controlled substance (like codeine) to Schedule III (like ketamine). Physicians had to complete special training to prescribe the drug, and could prescribe to a pool of only 30 patients.

After its approval, buprenorphine never came close to achieving the scale needed to slow down the crisis. By some estimates, for every patient who received the drug, four more might have benefited from it.

Mallory’s younger brother, Quincie, had, like her, started with pills. Then he began using intravenous drugs; he showed her how to shoot up. Before long, he was overdosing regularly, sometimes twice a day, and depending on Narcan, the overdose-reversal drug, to revive him. Quincie was caught in a terrible loop: overdose; Narcan; emergency room. Overdose; Narcan; emergency room.

Then one day an outreach specialist approached Quincie at the emergency room and handed him her card. Call me when you’re ready, she said. He did, and eventually found his way into rehab and onto a buprenorphine product called Suboxone. As I reported in the podcast, he told his mom, Jennifer Hornak, “This keeps me from going off the deep end. I can work. I can live a real life on this medication.”

For a time, Quincie and Mallory were on roughly the same path. But in 2020, the paths split. Mallory finished rehab and entered a halfway house, while still on Suboxone. She was lucky, in a way: Many sober houses operate on the principle that an opioid is still an opioid, and for that reason won’t accept people on buprenorphine. And Quincie ran into exactly that problem at the sober houses near him. Because he needed a place to live, he saw little choice but to stop taking Suboxone. When Jennifer asked, “Are you sure you can do that?” he told her, “I think I can.”

About a month later, Jennifer was having a pool party at her house when Mallory told her that she had seen a Facebook post that read “Rest in peace Quincie.” He had died of an overdose.

Quincie was caught between two approaches to addiction treatment. Inside rehab, his recovery was built around medication, not willpower. But outside, the infrastructure available to him was built on abstinence.

When Quincie died, Mallory’s family expected her to relapse. She’d been sober for only a few months, and her little brother’s death was a major blow—but she didn’t. She didn’t relapse when her grandfather died the next year. She didn’t relapse when Randy, who had cared for her when she couldn’t walk, died the year after that.

One day, Mallory got busy at work. Before she knew it, she ran out of time to make it to her doctor’s appointment to refill her Suboxone prescription. The withdrawal symptoms—vomiting, diarrhea—started quickly. For Mallory, missing her dose felt like going through heroin withdrawal—“like I’m dying,” she told me. She could go to her sober network for a few spare doses, but she feared their judgment. Asking for pills was addict behavior.

Eventually a friend gave her a pill, which she broke into pieces to last until she could refill her prescription. But the experience scared her. “What if something like this happens again?” she asked herself. “Do I want to be on a medication where if I don’t have it, I feel like I’m dying?” She realized that she didn’t want to be on Suboxone for the rest of her life. She was also tired of telling doctors she was taking the drug, which she felt led them to treat her differently than they would other patients.

And she wanted to keep the six teeth she has left. (Buprenorphine has been linked to tooth decay, according to an FDA warning.) For Mallory, her dentures represent a never-ending punishment for her years spent in active addiction, a permanent reminder of her past.

Jennifer begged Mallory to stay on Suboxone. “I pray that she’ll stay on it, I really do,” she told me in April 2024. “I just really can’t bear the thought of losing another child.”

Mallory came off Suboxone. In July—the last time I was able to reach her directly—she said she was feeling fine. More recently, whenever I tried to get in touch, she didn’t respond to me. In October, Jennifer told me via text that Mallory had not relapsed, but was dealing with what Jennifer described as “mental and physical health issues.”

Quincie’s death, and Mallory’s story, reflect a stark truth about buprenorphine’s limits, at least in America. Staying on the medication over the long term requires resolve, with limited support and against persistent stigma. Stopping presents its own risks. Diminished tolerance, paired with an ever more potent drug supply, can make buprenorphine cessation deadly if a person starts using drugs again. It might also shatter the fragile stability the medication can provide, as a salve for the mental-health issues that so often underlie addiction. But America’s fractured approach to recovery makes buprenorphine hard to live with, and impossible to live without.

On paper, getting buprenorphine to as many people as possible should be easier now than ever before. Today, patients can get a prescription through telehealth or even a phone call. Special training requirements for would-be prescribers have been eliminated. The widespread buprenorphine abuse feared in the early aughts never came to pass; although some buprenorphine has been sold illicitly, most buyers seem to be people who are using it as a medication, rather than abusing it. The DEA, after decades of strict enforcement, now urges pharmacists to maintain an “adequate and uninterrupted supply” of buprenorphine and similar medications. The regulatory barriers between patients and this medication have never been lower.

But usage rates are relatively flat. In 2022, they fell. According to Rachel Haroz, who leads Cooper University’s Center for Healing, and other experts I spoke with for this story, the stagnant rates can be blamed on stigma, a lack of infrastructure within primary care, and fear of DEA repercussions.

Buprenorphine’s uptake numbers in the past five years also have to do with the synthetic opioids it’s now up against. In earlier waves of the opioid epidemic, clinicians would wait for opioids such as painkillers or heroin to leave those receptors in the brain naturally, and then, at the onset of withdrawal symptoms, they would begin patients on buprenorphine. The drug would reduce cravings and eliminate the need for a protracted, painful withdrawal.

This process can break down if the patient has been using fentanyl, which stays in the body longer than heroin or pain pills. When buprenorphine displaces fentanyl from opioid receptors, it can trigger an instant reaction called precipitated withdrawal, marked by vomiting, diarrhea, and chills. Patients, who already live in constant fear of withdrawal, are reluctant to seek a treatment that triggers its symptoms. And doctors, reluctant to cause such acute pain, are hesitant to start patients on buprenorphine.

So addiction specialists are improvising. When Haroz’s EMS teams revive a patient with Narcan, a drug that reverses overdoses by clearing fentanyl from opioid receptors, they start them on buprenorphine immediately, which can help avert precipitated withdrawal symptoms. Boulder Care, an addiction network in Oregon, is piloting a new approach in which fentanyl users take a dose of Narcan without having overdosed, intentionally triggering a withdrawal before starting on buprenorphine. Lucinda Grande, a professor at the University of Washington School of Medicine, has trialed using ketamine, a dissociative drug, to ease symptoms while her patients transition from fentanyl to buprenorphine.

Introducing buprenorphine is a much easier task for people who use, say, heroin. But today, fentanyl—with all the side effects that accompany its interaction with buprenorphine—has taken over the market for illicit opioids. America missed the best window to get buprenorphine into patients; improvisation is now what success looks like. And these creative approaches cannot yet be deployed at a scale that would meet the needs of the tens of thousands of Americans who lose their lives every year to overdose. The story of recovery in America is essentially the same as it was 20 years ago, and the barriers to changing that story are higher than ever.

Good Job, MAHA

Unless you make a habit of closely reading nutrition labels—or watching Robert F. Kennedy Jr.’s YouTube channel—you might not realize just how much tartrazine you’re ingesting. Kennedy, the U.S. health secretary, is fixated on the chemical, otherwise known as Yellow 5. Many Americans are unknowingly eating this and other “poisons,” he warned in a YouTube video posted last fall. The lemon-yellow hue tints junk food such as Skittles and Mountain Dew; it’s also in chicken bouillon, pancake mix, and pickles. In Europe, products containing Yellow 5 are branded with a label warning that it “may have an adverse effect on activity and attention in children.” But for two decades, the FDA has declined to ban the dye, citing inconclusive evidence.

Today, the FDA announced that it will move to rid the food supply of Yellow 5 and several other synthetic food dyes, such as Red 40, Blue 1, and Green 3, by the end of next year. It’s not a ban: Kennedy, who oversees the FDA, said in a press conference that he has reached an “understanding” with the food companies to phase out these dyes, although he provided scant details on the specifics. (An HHS spokesperson didn’t immediately respond to a request for comment.)

Today’s action was, in many ways, a win for Kennedy. Democrats have long grumbled that food companies should not be foisting on Americans chemical-laden versions of products such as Doritos and Froot Loops while selling additive-free versions in other countries. RFK Jr.’s “Make America Healthy Again” movement has brought Republicans onboard too. In October 2023, California became the first state to ban an artificial food dye, Red 3; it has since been joined by West Virginia, which banned seven dyes this March. In 2025, more than half of the 50 states have introduced similar bills, in some cases specifically shouting out MAHA.

Cracking down on food dyes is a refreshingly modest, incremental step toward reforming America’s food system. There is real evidence that these dyes are harmful, particularly to children. A review from 2021 found that “synthetic food dyes are associated with adverse neurobehavioral effects, such as inattentiveness, hyperactivity and restlessness in sensitive children.” Many children aren’t affected by these attention concerns, to be clear, and it’s also difficult to pinpoint one dye as worse than another, because they are often studied together. But, as opponents of the dyes have argued, subjecting any proportion of kids to neurobehavioral issues doesn’t seem worth having bright-red Skittles.

Moving away from synthetic dyes would be a monumental change for food companies, but they have a reason to cheer for today’s news as well. Since states began taking up the food-dye issue in earnest, the industry has complained that differing state laws for food dyes would make it more difficult for companies to run their businesses. In a statement, the Consumer Brands Association, which represents companies such as Nestlé, PepsiCo, and General Mills, said that a “state patchwork of differing laws creates confusion for consumers, limits access to everyday goods, deters innovation, and increases costs at the grocery store,” and the group maintained that the additives its member companies use “have been rigorously studied following an objective science and risk-based evaluation process and have been demonstrated to be safe.” The industry is surely glad to hear that the FDA, at least right now, is not implementing a ban on common dyes such as Yellow 5. “I believe in love,” FDA Commissioner Marty Makary said today. “And let’s start in a friendly way and see if we can do this without any statutory or regulatory changes.” He added that food companies can use natural dyes such as beet, carrot, and watermelon juice to color their products instead of artificial dyes.

If the purported understanding with the food industry falls through and companies balk, Kennedy’s and Makary’s jobs will become trickier.

The FDA has historically taken years to formally ban just a single food ingredient. It took nearly three years to act on a petition to ban Red 3 in food, despite the fact that scientific studies showed decades ago that the dye causes cancer in rats. At least some of that slowness is by design. Regulators need to document legitimate harm that is caused by these products, and that process can take years. The FDA’s job has also become even more difficult in light of the mass layoffs that have played out in the early days of Donald Trump’s second presidency. Among the 89 staffers laid off from the FDA’s food center were nine people specifically tasked with reviewing additives in foods, according to Jim Jones, a former head of the center. (He resigned in protest of the layoffs in February.)

Either way, the speed with which the food industry phases out these dyes should not be seen as the true measure of just how successful RFK Jr. is. Cracking down on food dyes has become a major plank of the MAHA platform, but the chemicals are nowhere near the biggest impediment to making America healthy again. The true test will be how Kennedy and his movement deal with much more pressing, and intractable, challenges in the American diet. Even without synthetic dyes in our foods, Americans will still overwhelmingly be eating ultra-processed foods loaded with excess sodium and sugar. This doesn’t seem to be a point lost on Kennedy, who issued a stern warning today that “sugar is poison.” And yet, he hasn’t articulated any plans to eliminate the ingredient from our food. Phasing out food dyes is easy, at least in comparison with tackling these bigger issues. The real test of the MAHA movement will be not whether it can get the red dye out of Skittles, but whether it can persuade Americans to forgo the Skittles altogether.

In the video, the man and his burgundy slab of beef liver are best friends. Their bond is revealed in a series of vignettes: The man ties a dog leash around the meat lump and lugs it behind him on a skateboard (afternoon stroll). The man dresses it in sunglasses and a necktie and positions it with a copy of Marcus Aurelius’s Meditations (reading date). The man pulls back his bedsheets to reveal that the liver is his new pillow (slumber party!).

It’s a joke, clearly, but the video’s caption, posted by the nutrition influencer known by the pseudonym Carnivore Aurelius, is earnest: He hopes it inspires his 1 million Instagram followers to eat more beef liver. “If nutrition were a Roman coliseum, kale would be the defeated gladiator,” he wrote, “and beef liver would be the lion tearing him to shreds.”

This extravagant devotion to organ meats has become common within online wellness communities promoting “ancestral” diets—a relative of the paleo diet, which endorses the consumption of whole, unprocessed foods. On TikTok, tradwives, carnivore bros and girlies, and holistic wellness influencers tout the benefits of eating organ meat, mostly from cows. Some of them eat it raw; others eat it cooked. The enthusiasm has spawned an industry that turns offal—a catchall term for an animal’s nonmuscular parts—into pill or powdered form. “Sneaking in” organ meats is a recurring theme: Some clips show moms blending beef-liver powder into their toddlers’ orange juice and smoothies; one shows a man dropping offal chunks into his partner’s ground beef.

[Read: America is done pretending about meat]

All of this social-media attention has been translating into people’s actual eating habits, Victoria Fitzgerald, who oversees Whole Foods Market’s meat merchandising, told me. In 2020, the grocery-store chain introduced frozen organ-meat blends into its stores, and every year since, the products have seen triple-digit growth in sales. In some stores in Miami and Austin, Whole Foods’ most popular organ-meat product—Force of Nature’s “grass fed beef ancestral blend,” a ground mix that includes beef liver and heart—sells at 15 times the rate of other frozen meat items. The buzzy Los Angeles supermarket Erewhon also got in on the offal hype, selling a $19 “raw animal” smoothie last year made with freeze-dried beef organs. And, like Whole Foods, Natural Grocers, a supermarket chain with stores west of the Mississippi River, named organ-meat products a top trend for 2025.

In the United States, this latest revival is particularly striking given how unpopular offal has been with Americans in recent history. Here, organ meats have been regarded with something of a “yuck factor,” Mark McWilliams, a professor at the U.S. Naval Academy and the editor of Offal: Rejected and Reclaimed Food, told me. Many Americans view them as “forbidden and unfamiliar,” he said. And the more unfamiliar a food is, the grosser it might seem to some people to eat.

Yet today’s fanfare isn’t because of a sudden switch-up in Americans’ tastes. Rather, the influencers who extol offal’s virtues seem to do so on the basis of its nutrient density. Organ meats appear to be viewed less as meals and more as supplements: something to be consumed not primarily for flavor but in pursuit of the influencers’ vision of optimal health.


For as long as humans have eaten animals, they’ve eaten offal. The first humans ate the whole animal when they hunted, starting with the heart and brains, according to the chef Jennifer McLagan’s book Odd Bits: How to Cook the Rest of the Animal. Ancient Romans feasted on goose-feet stew; Greeks ate splanchna, or bowels; Elizabethans nibbled on bird tongues. Organ meats played a key role in the Navajo Nation’s traditional diet—and people all over the world continue to eat the whole animal today.

Sometime in the late 18th century, though, offal started suffering from an image problem, viewed by some people to be an affordable but second-rate food. McLagan traces this shift in reputation to the rise of slaughterhouses in England, which led to a greater availability of meat as well as an oversupply of perishable, hard-to-ship offal. Instead of tossing these cuts, the slaughterhouses would offer them to poor people who lived nearby. “The result of this generosity,” McLagan writes, was that organ meats’ “prestige fell.” Around the same time in the United States, offal developed a reputation in some quarters as a food eaten mostly by people of low social status; according to some scholars, enslaved people in the antebellum South were often given the parts of livestock considered less desirable, such as pig’s feet, jowls, and small intestines (chitterlings).

Even so, into the early 20th century, enough Americans were apparently still eating offal that Irma Rombauer included recipes for liver, brains, and kidneys in her wildly popular 1931 cookbook, Joy of Cooking. But by the 1940s and ’50s, organ-meat consumption had begun to taper off. Family farms and butcher shops were giving way to factories and supermarkets. Muscle meats, such as chicken breast and sirloin, became cheaper. “People forget chicken used to be a very special dish—a roast chicken was something you had on Sundays,” McLagan told me. When eating muscle meat daily became a possibility, many people opted out of offal. (By the time the 1953 edition of Joy of Cooking came out, Rombauer felt the need to add a coy introduction to her organ-meat recipes: “The following is a hush-hush section, ‘just between us girls,’” she wrote.) Since then, organ meats produced in the United States have largely been exported, made into pet food, or simply thrown in the trash.

[Read: The WWII campaign to bring organ meats to the dinner table]

Many people have tried to rebrand organ meats for wider consumption, to little lasting effect. During World War II, meat rationing led to a government campaign encouraging families to eat offal, which was renamed “variety meats.” This effort led to a brief boost in popularity, though by the war’s end, organ meats were once again mostly abandoned. In the 1990s, the British chef Fergus Henderson popularized the “nose-to-tail” movement, which primarily focused on animal rights and sustainability and aimed to use as much of the animal as possible. “If you’re going to knock it on the head,” Henderson said, “it seems only polite to eat the whole thing.”

That ethos gained popularity with some Americans too: In 2004, Henderson’s cookbook, full of highbrow recipes such as deviled kidneys and potato-stuffed pig’s foot, was released in the United States with a glowing introduction by the food celebrity Anthony Bourdain, who called the roast bone marrow at Henderson’s restaurant St. John his “death row meal.” When McLagan released Odd Bits in 2011, she, too, was hopeful that organ meats were on a fast track to the mainstream. Yet the nose-to-tail movement, appealing mostly to fringe foodies, never made a big dent in eating habits. “I thought that I would change the world with the book,” McLagan told me. “Of course, I didn’t.”


The newest organ-meat revival doesn’t bother as much with time-intensive recipes—it’s far more focused on convenience. Besides ground-meat blends and supplement pills, offal is being sold as salted crisps, chocolate-almond-flavored protein bars, vinegary meat sticks, and freeze-dried powder toppings to be sprinkled on dishes like pizza or steak. Whole Foods is planning to expand its offal selection, Fitzgerald told me; easy-to-cook options such as premade organ-meat burgers and meatballs should soon be available.

Today’s offal movement is, in part, an offshoot of the carnivore diet—a meat-heavy approach to eating that, despite copious warnings from nutritionists, rose to prominence online beginning around 2018. Controversial influencers such as Paul Saladino, Brian “Liver King” Johnson, and Joe Rogan all helped popularize the trend. Many influencers have painted organ meats as a miracle food, claiming that eating them had eased ailments such as fatigue, anemia, and hives; in one 2020 podcast, Rogan suggested that eating offal and other types of meat could possibly cure autoimmune disorders. “Don’t believe anything that is too good to be true,” Melissa Fernandez, a professor at the University of Ottawa who studies nutrition influencers and misinformation, told me. Some influencers, she noted, are entrepreneurs whose businesses may benefit from their own nutritional advice. Saladino and Johnson, for instance, each own organ-meat-supplement companies. (Saladino has denied any conflict of interest, saying that organs “are some of the most nutrient-rich foods on the planet.”)

[Read: The Jordan Peterson all-meat diet]

Despite the scant scientific support for influencers’ more extreme health claims, in recent years the offal hype has also extended past carnivore-diet enthusiasts to include a broader, omnivorous group of nutrition-focused eaters. The refrain “Mother Nature’s multivitamin” is fairly ubiquitous on organ-meat social-media posts, where influencers typically list off an alphabet soup of nutrients, among them vitamin A, B12, and iron. Organ meats are also portrayed like munchable fountains of youth: Beef liver has been deemed, at once, “nature’s botox,” “edible retinol,” and the “one supplement to make you hotter.” Offal does indeed have lots of nutrients—but like influencers’ health assertions, most of these beauty claims have little evidence to support them, Pieter Cohen, an associate professor at Harvard Medical School and the leader of Cambridge Health Alliance’s Supplement Research Program, told me. “Do I know of any research that has proven that eating organ meat—the food—improves skin, improves mood, or improves the quality of your hair?” Cohen said. “No.”

Consuming organ meats as supplements could actually be harmful to people’s health, Cohen and Fernandez said. The supplement industry has little government oversight and is known to be rife with ingredients that are either ineffective or, worse, dangerous. “I have huge, huge concerns over their safety,” Fernandez said of these new organ-meat supplements. Given organ meats’ nutrient density, eating liver every day in any form—whether as a capsule or sautéed with onions—may be hazardous. Fernandez flagged that people could end up consuming too much vitamin A; this is especially concerning for pregnant women, for whom overconsumption raises the risk of birth defects, she told me. “There’s actual danger there in toxicity.” (Johnson’s Ancestral Supplements and Saladino’s Heart & Soil—which the Liver King also co-owns—have both dismissed concerns about vitamin A toxicity, suggesting that their products fall within a safe daily dosage.)

That’s not to say people should steer clear of organ meats. The nutrition experts I spoke with just advised treating them as actual food instead of supplements, and not eating them every day. Fernandez suggested seeking pleasure in offal, such as by cooking a new dish. And although offal may never become as ubiquitous as muscle meat in American diets, more people eating organ meats could come with some positives. In McWilliams’s and McLagan’s view, offal provides a real way to combat the moral quandary of meat eating. “If you’re someone who wants to eat meat but is conscious of the problems of industrial food production,” McWilliams told me, “eating the whole animal is one way out.” Beef liver may not be a gladiator-crushing lion ready to dethrone kale in some fictive nutritional coliseum. But, at the very least, it doesn’t need to go in the trash.



Of the many mistakes made in the COVID era, none were as glaring as prolonged school closures. The damage goes beyond loss of learning, a dire consequence in its own right: Millions of families, both children and parents, still carry the scars of stress, depression, and isolation.

The closures began at a time of understandable panic, but that was only the beginning of the story. On February 25, 2020, Nancy Messonnier, the director of the CDC’s National Center for Immunization and Respiratory Diseases, led a press conference to address the developing coronavirus crisis. Messonnier warned the public that, without vaccines, non-pharmaceutical interventions—things like business closures or social-distancing guidelines—would be the most important tools in the country’s response. “What is appropriate for one community seeing local transmission won’t necessarily be appropriate for a community where no local transmission has occurred,” she said. The school closures that would be implemented the following month—and that endured through the end of the school year in nearly all of the roughly 13,800 school districts in the United States, in regions that had wildly different infection levels—showed this directive was not followed.

At the time of the initial closures, in mid-March, COVID was spreading quickly, but large areas of the U.S. had no known cases. Still, to the extent that a planned response to influenza was an appropriate universal pandemic guide, these closures were aligned with the CDC’s most recent update to its pandemic playbook, released in 2017. According to that document, an initial two-week closure of schools would be sufficient to fulfill a first objective of buying authorities time to assess the severity of the pandemic. Given the news being reported of care rationing in northern-Italian hospitals, following this plan was not unreasonable—and, as part of broader stay-at-home orders, it may have had some effect on disease transmission.

“Italy spooked us,” Jennifer Nuzzo, director of the Pandemic Center at the Brown University School of Public Health, told me. “We did not want to be Italy. The governors all saw China and Italy lock down and decided to follow their example.”

Indeed, had China not locked down, the rest of the world might not have done so either. China, governed by an authoritarian regime that rules the country with tremendous top-down power, does not share the same attitude toward personal liberties as Western democracies. And neither the CDC’s pandemic playbook nor the pre-2020 consensus within the public-health field favored a lockdown of society of the breadth that we would experience. On a countrywide scale—from both an epidemiological perspective and a human-rights perspective—closing all nonessential business, closing all schools, prohibiting most social interactions and nonessential travel, and so on, was not considered feasible or wise. Because we initially lacked the ability to test, Nuzzo said, shutting schools, along with other facets of society, did make sense at first. The problem, in Nuzzo’s mind, was not closing down in March; it was that there was no plan beyond that. By “no plan,” Nuzzo was referring to two interrelated problems: all the potential harms of closures, and the challenge of unwinding interventions after they’d been implemented.

In her press conference, Messonnier mentioned the CDC’s 2017 pandemic report directly, said that school closures were part of the plan, and recognized that they were likely to be associated with unwanted consequences such as missed work and loss of income. “I understand this whole situation may seem overwhelming and that disruption to everyday life may be severe,” she advised. “You should think about what you would do for child care if schools or day cares close.” There was no mention of how the government might aid families during school closures, or of what, for example, a single parent with a job as a cashier in a grocery store and a 4-year-old at home was supposed to do. Rather, in just one line amid a lengthy speech, people were told to simply “think about” it. To government officials and many others at the time, this was a regrettable but entirely reasonable approach—a presumed temporary loss of wages and child-care issues were lower-order concerns compared with the coming onslaught of a pandemic.

Yet what was positioned as a secondary issue—a mere abstraction, warranting just a brief mention—led to catastrophic consequences for millions of children, and their families. A year later, my kids, along with tens of millions of other students, were still trudging through remote learning, either as their exclusive form of schooling or through so-called hybrid schedules during which they could attend classes only part time. (Meanwhile, bars, restaurants, and all manner of other businesses had long since reopened, as had many private schools.) Teachers in much of the country had been prioritized for vaccines—making them eligible for protection before some other, more vulnerable populations—yet schools in half the country still weren’t open full time, and in many places weren’t open at all.

While federal public-health officials made recommendations regarding schools, the actual closures were carried out at the state and local levels, in response to misplaced public fears and aggressive campaigning by teachers’ unions. Randi Weingarten, the high-profile head of the American Federation of Teachers, said in a February 8, 2021, New York Times article that she hoped things would be “as normal as possible” by the following fall. Class-action lawsuits in multiple states had been filed on behalf of children with special needs, claiming that the requirements of IDEA—the Individuals with Disabilities Education Act, a federal law that requires certain services (such as physical and occupational therapy and supplemental aids and equipment) for children with disabilities—were not being met in remote-learning models.

[Read: Where are all the missing students?]

Opinion pieces with titles such as “Remote School Is a Nightmare. Few in Power Care” had been appearing in major news outlets since the previous summer. Working parents, especially mothers, were dropping out of the workforce in staggering numbers because of child-care obligations during the pandemic. An analysis by the U.S. Chamber of Commerce found that nearly 60 percent of parents who had left the workforce had done so for this reason. The psychic toll on parents and children was never—and can never be—calculated. It won’t show up in statistics, but it was real for millions of families. And millions of children, especially those without resources for tutors or parents to oversee them during the day, were losing ground academically. Worse, they were suffering from isolation, frustration, and, for an increasing number of them, depression from spending their days alone in front of an electronic screen. Untold numbers of other children became “lost,” having dropped out of school entirely. Those in power who advocated for school closures were not adequately prepared for these consequences, which were still pervasive a full year into the pandemic.

But they should have been.

The damaging effects of school interruptions were not unforeseen. They were explicitly warned about in the academic literature. Exhibit A is a 2006 paper called “Disease Mitigation Measures in the Control of Pandemic Influenza,” in the journal Biosecurity and Bioterrorism, written by Nuzzo; her mentor (and global eminence on disease-outbreak policies), D. A. Henderson; and two others. “There is simply too little experience to predict how a 21st century population would respond, for example, to the closure of all schools for periods of many weeks to months,” these authors wrote. “Disease mitigation measures, however well intentioned, have potential social, economic, and political consequences that need to be fully considered by political leaders as well as health officials. Closing schools is an example.”

The authors went on to warn that closures would force some parents to stay home from work, and they worried about certain segments of society being forced to bear an unfair share of the burden from transmission-control policies. They wrote:

No model, no matter how accurate its epidemiologic assumptions, can illuminate or predict the secondary and tertiary effects of particular disease mitigation measures … If particular measures are applied for many weeks or months, the long-term or cumulative second- and third-order effects could be devastating.

Nearly a decade and a half before the pandemic—in a stark rebuke to the approach championed by the CDC, White House Coronavirus Response Task Force coordinator Deborah Birx, and other powers that be—the paper had called out the major harms that would come to afflict many families in our country as a result of school closures. Yet, from the spring of 2020, health officials who directed our pandemic response ignored many of the consequences they must have known to expect. Or, at the very least, they failed to provide adequate information about them to the public. The officials had opened a bottle of medicine while disregarding the skull and crossbones on the warning label.

And the portents were not just in Henderson and Nuzzo’s paper. A 2011 paper by researchers from Georgetown assessed the decision making behind—and the consequences of—several hundred brief school closures enacted during the 2009 H1N1 pandemic. The authors noted that the child-care costs to families were substantial, and that hardships from closures were inequitable. “Officials considering closure must weigh not only the total amount of disruption but also the extent to which social costs will be disproportionately borne by certain segments of society,” the authors wrote. Even the CDC playbooks themselves warned of some of these issues. Both the 2007 original and the updated 2017 report cautioned that school closures could lead to the secondary consequence of missed work and loss of income for parents who needed to stay home to take care of their school-age children. This effect, the latter report noted, would be most harmful for lower-income families, who were also hit hardest by COVID in the first place. With prescience, and comic understatement, the authors noted that school closures would be among the “most controversial” elements of the plan.

Meanwhile, the second of Nuzzo’s points—that unwinding interventions is often incredibly difficult, and that there must be a plan for how to do so—was also a well-documented problem. Just as public-health experts are biased toward intervention, they, along with the public, are also biased toward keeping interventions in place. This is a known phenomenon within the literature of implementation science, a field of study focused on methods to promote the adoption of evidence-based practices in medicine and public health. Westyn Branch-Elliman, an infectious-diseases physician at the UCLA School of Medicine with expertise in implementation science, told me that de-implementation is generally much harder than implementation. “People tend to err on the side of intervening, and there is often considerable anxiety in removing something you believe has provided safety,” she said.

There is also a sense of inertia, of leaving well enough alone. It’s not unlike legislation—oftentimes repealing a law, even an unpopular one, poses bigger challenges than whatever barriers existed to getting it passed. Although the initial school closures may have been justifiable (even if off-script in many locations), there was no plan for when and how to reopen. Officials repeated a refrain that schools should open when it was “safe.” But “safe” was either pegged to unreachable or arbitrary benchmarks or, more often, not defined at all. This meant there would be limited recourse against a public that had been led to believe this intervention was a net benefit, even long after evidence showed otherwise. The lack of an exit plan—or an “off-ramp,” as many health professionals would later term it—would prove disastrous for tens of millions of children in locations where social and political pressures prevented a reversal of the closures.

[Read: The biggest disruption in the history of American education]

Without sufficient acknowledgment of the harms of school closures, or adequate planning for unwinding this intervention, officials showed that their decisions to close were simply reactive rather than carefully considered. The decision makers set a radical project in motion with no plan on how to stop it. In effect, officials steered a car off the road, threw a cinder block on the accelerator, then jumped out of the vehicle with passengers still in the back. No one was in the front or even knew how to unstick the pedal.


This article was adapted from David Zweig’s book An Abundance of Caution: American Schools, the Virus, and a Story of Bad Decisions.

After Elon Musk made a public show of remedying an apparent error in DOGE’s massive cuts to foreign aid, the Trump administration has quietly doubled down on its decision to stop sending emergency food to millions of children who are starving in Bangladesh, Somalia, and other countries. Without urgent intervention, many of these children are likely to die within months, experts told me.

As DOGE was gutting USAID in February, it alarmed the global-health community by issuing stop-work orders to the two American companies that make a lifesaving peanut paste widely recognized as the best treatment for malnutrition. The companies—Edesia and Mana Nutrition—subsequently received USAID’s go-ahead to continue their work. But soon after that, their contracts were officially canceled. When news of the cancellation was made public, Elon Musk vowed to investigate the issue and “fix it.” Hours later, Musk announced that one contract had been restored days earlier; that night, the second company received notice that its contract had been reinstated.

According to Mana and Edesia, however, that was only the start of the story. The contracts reinstated in February applied to old orders for emergency therapeutic food that Mana and Edesia were already in the middle of fulfilling. But two weeks ago, without any fanfare, the Trump administration canceled all of its upcoming orders—that is, everything beyond those old orders that had previously been reinstated—according to emails obtained by The Atlantic. The move reneged on an agreement to provide about 3 million children with emergency paste over approximately the next year. What’s more, according to the two companies, the administration has also not awarded separate contracts to shipping companies, leaving much of the food assured by the original reinstated contracts stuck in the United States.

Globally, nearly half of all deaths among children under 5 are attributed to malnutrition. When children reach the most severe stage, those old enough to have teeth can lose them. Black hair turns orange as cells stop synthesizing pigment. Their bodies shrivel, and some lose the capacity to feel hunger at all. Before the 21st century, starving children could be treated only in a hospital, and among the sliver of them who were admitted, a third would die, Mark Manary, a pediatrics professor at Washington University in St. Louis, told me. The invention of a new type of emergency food allowed parents to treat their own kids at home; more than 90 percent recover within weeks of treatment, according to the International Rescue Committee.

The original brand-name version, Plumpy’Nut, was first used to treat children in the early 2000s, and the U.S. started supplying it to foreign countries in 2011, Manary told me. It’s a pouch—basically an oversize ketchup packet—of peanut butter fortified with powdered milk, sugar, vitamins, minerals, and oil, a mixture that’s easier for shrunken stomachs to digest than a full meal. The packets keep without a refrigerator, making them useful in hunger-prone settings like refugee camps and war zones. They come ready to eat, so parents don’t need to worry about dissolving the contents in clean water. A six-week supply costs $40, and three packets a day fulfills all the basic nutritional needs of children ages six months to 5 years. This regimen regularly saves the lives of even those who are mere days from death. Lawrence Gostin, the director of Georgetown’s Institute for National and Global Health Law, told me that ready-to-use therapeutic foods like Plumpy’Nut are “the singular public-health achievement of the last several decades”—more consequential, experts reiterated to me, than even antibiotics or vaccines.

Typically, the U.S. supplies starving children with emergency therapeutic food through a multistep process. UNICEF and the World Food Programme forecast months in advance how much paste they’ll need to send to various countries, and ask USAID to buy some of it. Previously, USAID hired Edesia (which is based in Rhode Island) and Mana (based in Georgia) to make the paste, then paid to ship the boxes overseas. The United Nations handles delivery once the food reaches port, and organizations such as Save the Children and Doctors Without Borders typically carry shipments to the children who ultimately consume them.

The Trump administration has broken every step of that system. According to Mana CEO Mark Moore and Edesia CEO Navyn Salem, USAID agreed back in October to buy more than 1 million boxes of therapeutic food. The World Food Programme and UNICEF planned to distribute the contents of this order as early as March, according to an email obtained by The Atlantic. But on April 4, both Edesia and Mana received an email from a staffer at the State Department that said the plans for 10 countries to receive the emergency paste would not move forward. (Those countries: Bangladesh, Burundi, Cameroon, the Central African Republic, Chad, the Democratic Republic of the Congo, Djibouti, Nigeria, Somalia, and Yemen, to which the U.S. has separately canceled all humanitarian aid.)

When I spoke with Moore, he panned his phone across the production floor to show me boxes upon boxes of peanut paste piled against the walls. Moore told me he is terrified for the children who will die without the paste. Without it, he said, “they’re trapped. Just trapped.” He is also worried for the Americans who rely on his business for their own livelihoods. “All we’re doing is cutting farmers and hurting kids. That just seems like a terrible plan to me,” he said. Meanwhile, Edesia, which had stopped its production for the first time in more than a decade after the first cancellation notice, is now making just 2,000 Plumpy’Nut packets a day instead of the usual 10,000, Salem said.

Moore and Salem both told me that even if USAID had not canceled the order itself, they have no idea how they would have shipped it. As far as they know, the U.S. government has failed to award many expected contracts to the shipping companies that Moore and Salem have long used to send their emergency food products overseas. This month, Salem said, Edesia was able to ship 42,000 boxes of emergency food for moderately malnourished kids to Somalia, but was unable to secure transport for another approved shipment of 123,888 boxes for acutely malnourished children to Sudan. Salem says she has no clue why. Hundreds of thousands of boxes of food from both companies’ old, reinstated orders still have not left the U.S. “We need product to leave the factories at no later than four months” after it is manufactured, Salem told me, to ensure at least a year of shelf life when it arrives in Africa or Asia. She does not know who to call, at USAID or the State Department, to make that happen, she told me.

[Read: The cruel attack on USAID]

On April 10, Moore received an email from a State Department staffer who said that her team is seeking approval to ship the paste that has already been manufactured—if not to the original intended recipients, then somewhere. “We are not sure of the timeline for this approval,” the staffer wrote. “But please know that we are trying to ensure that no commodities go to waste.”

Even if the paste makes it overseas before it expires, it might not make it into children’s hands. Save the Children, one of UNICEF’s major last-mile distributors, typically gives out emergency therapeutic food at clinics where mothers can also give birth and take their infants for health screenings. But the organization has been forced to stop its work in nearly 1,000 clinics since Trump’s inauguration in January because of U.S. funding that his administration eliminated or failed to renew, Emily Byers, a managing director at the organization, told me.

In a statement, UNICEF told me that the Trump administration still has not informed the organization of the canceled orders. UNICEF projects that 7 million children will require treatment for extreme malnutrition in 2025. Even before the USAID cuts, it had the budget to treat only 4.2 million of them. Mana and Edesia typically provide 10 to 20 percent of UNICEF’s annual emergency therapeutic food, and USAID supplies half of its overall funding for nutrition treatment and hunger-prevention services. “Today, we have no visibility on future funding from the US Government,” the statement read. Typically, producers have half a year to fill an order as big as the one the U.S. canceled, according to Odile Caron, a food-procurement specialist at Doctors Without Borders. UNICEF needs that food in much less time. If malnourished kids don’t get access to emergency therapeutic food because of the U.S. government’s decisions, “in three months, half of them will be dead, and the rest will have terrible disabilities, mostly neurocognitive,” Manary, who also ran the first clinical trials on Plumpy’Nut, told me.

Since the dissolution of USAID began in January—most of the agency has been gutted, the rest absorbed by the State Department—the Trump administration has insisted that lifesaving foreign aid will be allowed to continue. Just yesterday, a State Department spokesperson told reporters, “We know that we are a country with incredible resources. We know that. And we have incredible responsibilities, and we do not shy away from them.” The White House did not answer my questions about the discrepancy between that sentiment and the orders that the administration canceled. USAID, the State Department, DOGE, and Musk did not respond to requests for comment. According to NPR, a program in Syria that feeds expectant mothers and young children was recently told that its contract had been spared from the government’s ongoing cuts. But a separate contract funding the program’s staff was terminated, leaving no one to do the work. Meanwhile, all that paste is still piled up in Moore’s warehouse.

[Read: America can’t just unpause USAID]

During Trump’s first Cabinet meeting, in February, Musk acknowledged that DOGE’s teardown of foreign assistance had been hasty, then pledged that “when we make mistakes, we will fix it very quickly.” But the White House seems to have done nothing yet to fix this problem. Instead, it is keeping in purgatory two American companies that make a product that dying children need to survive. As Moore reminded me throughout our conversation, he has hundreds of thousands of boxes of paste packed and ready for distribution. That means one of two things happens next: “It will get shipped or it will get destroyed.”

A sudden and mysterious outbreak of communicable disease began recently in my apartment building in Manhattan. Three 7-year-olds, a boy and two girls, were sharing the elevator one day with a caretaker and a random adult (me). The boy was leaning against the back of the elevator, between the two girls. “Help! I’m in a girl sandwich,” he said. “If I’m not careful, I’m going to get cooties!”

“Kids still play Cooties?” I asked, surprised that cooties were not a relic of my Boomer childhood but had endured into the 21st century, still sparking alarm, feigned or real, among the young. “Yeah-huh,” the boy said. One of the girls piped up: “I know how to give a cootie shot.” She demonstrated on her own shoulder, her technique a bit of a blur.

The kids and their caretaker got off on their floor, leaving me to ponder the cootie phenomenon for the first time in many decades. Beyond being amused, I was struck by the morbid salience of a children’s game that mimics infection at a time when vaccine skepticism is on the rise and an outbreak of a non-pretend disease, measles, is threatening the lives of children in the Southwest. I learned that there is a vibrant if slender slice of academic literature on “preadolescent cootie lore,” as one scholar puts it, and that this goofy grade-school fixation is more closely tied to real public-health concerns than you might think if your cootie expertise derives only from the playground.

What exactly are cooties? Since at least the 1960s, field researchers have collected definitions of varying specificity from grammar-school respondents: “boys’ germs,” “girls’ germs,” “something that kills you,” “like germs, it has germs on it,” “where somebody licks the bottom of the chair or eats paper.” Other experts speak of cooties in more anthropological terms. The University of Wisconsin at Milwaukee folklorist Simon J. Bronner has characterized cooties as a “ritualized affliction.” In their seminal 1976 book, One Potato, Two Potato: The Folklore of American Children, Herbert and Mary Knapp described cooties as a kind of sport. “There are no supervised Cootie leagues, but more people in the United States have played Cooties than have played baseball, basketball, and football combined,” they wrote. “It’s our unofficial national game.”

Cooties certainly have something to do with hygiene. According to a (somewhat gross) analysis by Lawrence A. Hirschfeld, an anthropologist at the New School for Social Research, cooties are “a social contaminant that pass from one child to another,” made up of “the invisible particulates associated with germs, farts, or ‘boogers.’” They seem to draw upon children’s anxieties about illness and doctors but also gender, reflecting the confusing mix of flirtation and social opprobrium attached to boy-girl relations (and not entirely absent from adult life). Beyond that, Hirschfeld concludes, “Cootie lore is not conceptually orderly.”

Cootie history is clearer. The word itself began life as a British colonial term, probably a corruption of kutu, a Malay word for lice and other biting insects. American soldiers picked it up, as it were, from their allies during World War I. A New York Times report from 1918, headlined “Doughboys Lose Cooties,” described soldiers who “scratch with a vengeance” lining up at a “Disinfecting Plant” run by the American Red Cross. One “lanky New England lad” exclaims, “I’ve got all the cooties in France.” In the 1920s, games referencing cooties became popular at bridal showers, in honor of grooms who had served in Europe (and also provoked, perhaps, by sublimated anxiety about other communicable diseases they might have brought home). One version involved drawing separate parts of a bug, based on rolls of a die, until a winner had a completed louse. This practice evolved into Cootie, the game in which kids assemble plastic insects with fiddlehead-fern-like proboscises, which was introduced nationwide in 1949 and is still manufactured today.

The invisible-particulate form of cooties appears to have hit American playgrounds sometime in the 1930s, but surveys suggest that it didn’t become ubiquitous until the early ’50s, at the height of the polio epidemic. Before the polio vaccine was introduced in 1955, tens of thousands of children were catching the disease every year; thousands of them died, and more were left paralyzed. In his book Explaining Traditions: Folk Behavior in Modern Culture, Bronner writes that cooties and cootie shots—“circle circle, dot dot” being one classic formulation—were a way for children “to dramatize the dread of the disease.” Cooties was also popular during the ’80s, when kids were hearing a lot about AIDS.

This kind of imitative play—not just Cooties but also House or Cops and Robbers—helps children make sense of the world. Like nursery rhymes, it can also be commentary, even a kind of satiric outsider art. As Iona Opie, a pioneering British children’s folklorist, once observed: “Step into the playground; a kind of defiant lightheartedness envelops you. The children are … making fun of life.” This was apparently very much the case during the early days of the coronavirus pandemic. Bronner told me that while schools were shut down and kids were isolated at home, they began sending one another memes in which cooties tended to represent a kind of generalized pandemic funk. Moreover, the senders often depicted themselves as babies—in Bronner’s interpretation, a humorous expression of their frustration at not being able to do normal kid things.

I suppose it is comforting that, five years after COVID-19 was first declared a pandemic, my elevator acquaintances had returned to an in-person style of cootie play, acting out their venerable parody of infection and protecting themselves with shots in the arm. But the epidemiological satire takes on an especially dark cast when it expresses more faith in the power of vaccination than does the current secretary of Health and Human Services, Robert F. Kennedy Jr.

Kennedy has dismissed the scientifically proven efficacy of the polio vaccine as “mythology,” despite polio having been considered eliminated in the U.S. since 1979. (Before he was confirmed, a spokesperson noted: “Mr. Kennedy believes the polio vaccine should be available to the public and thoroughly and properly studied.” Not quite a full-throated endorsement.) He has also sown doubt about vaccines at a time when vaccine hesitancy is fueling a historic and frightening measles outbreak in the Southwest, which has now infected more than 600 people in the U.S. and killed two unvaccinated children in Texas—the first measles deaths in America in a decade. An adult who died in New Mexico was also found to be infected with measles. (Outside the U.S., the health secretary of Chihuahua, Mexico, announced that an unvaccinated man had died of measles in connection with the Texas outbreak.)

Kennedy has spoken about the benefits of the MMR vaccine but also continues to undermine its safety and efficacy. He has repeatedly emphasized parental choice in vaccination and, as The Atlantic reported, told the grieving father of one Texas child, “You don’t know what’s in the vaccine anymore.” He insists on promoting unproven alternative treatments such as cod-liver oil (a source of vitamin A), antibiotics, and steroids; in a Fox News interview, he claimed that these can lead to “an almost miraculous and instantaneous recovery.” According to actual virologists, this is a prodigious exaggeration of vitamin A’s efficacy, and in the case of the other supposed miracle cures, it’s pure invention.

A cootie shot might be an even more fanciful treatment, but, unlike with vitamin A, overdoing it won’t lead to liver damage, which Texas pediatricians told The New York Times they are now seeing in unvaccinated young patients, whose parents were presumably paying Kennedy heed. When so many adults seem intent on returning to the Dark Ages, we must cherish medical wisdom wherever we find it. In 2025, the cootie shot stands as an inadvertent rebuke to the nation’s top health official, in that it emphasizes vaccines’ very real efficacy against disease. I wonder if the HHS secretary has ever heard this anecdote from his own family history. On February 25, 1923, his grandmother Rose Kennedy recorded the following in her journal: “Joe Jr. and Jack have a new song about the Bedbugs and the Cooties. Also a club where they initiate new members by sticking pins into them.” Did RFK Jr.’s uncles (who had likely received smallpox vaccinations) invent cootie shots? A topic for further research.


Sarah Palin knew her audience. In 2013, during a speech at CPAC, the annual Republican confab, the former Alaska governor reached down into her lectern and pulled out a cup of soda the size of her head. She took a long swig, and then another, as the audience broke into raucous applause. Palin put the drink down and delivered the punch line: “Our Big Gulp’s safe,” she said. At the time, New York City was attempting to ban restaurants from selling sodas larger than 16 ounces, and Republicans across the country were angry at then-Mayor Michael Bloomberg. The nanny state, they argued, was trying to take away their corn-syrup-laden fizzy water. A conservative advocacy group paid to publish an ad in The New York Times of “Nanny Bloomberg” wearing a powder-blue dress and a pastel scarf.

The soda wars have long broken along partisan lines. New York City’s ban was struck down in court before it could go into effect, but even more modest attempts to regulate soda have been concentrated in deeply blue cities such as Berkeley and San Francisco. Liberals drink soda too, of course, though the drink’s biggest defenders are on the right. President Donald Trump loves Diet Coke so much that in both of his terms, he’s had a button installed in the Oval Office to summon a refill; on the campaign trail, now–Vice President J. D. Vance claimed that Democrats see Diet Mountain Dew, his drink of choice, as “racist.”

[Read: J.D. Vance has a point about Mountain Dew]

But today, Republicans across the country are cracking down on soda. Politicians in Texas, Arkansas, West Virginia, Idaho, Nebraska, Michigan, Arizona, and South Carolina are not advocating for the ban of Big Gulps. They are, however, seeking to enact one of Michael Bloomberg’s other pet policies: preventing people from buying soda using food stamps (formally known as the Supplemental Nutrition Assistance Program, or SNAP). These and other states are following the lead of Health and Human Services Secretary Robert F. Kennedy Jr., who has been vocal about his desire to not let poor Americans on SNAP use government money to buy soda. “Why are we paying for sugar drinks that are poisoning our kids and giving them diabetes?” he asked last week.

Banning soda from SNAP seems like a no-brainer. Soda is a big reason adults in the United States consume, on average, two to three times the recommended daily intake of sugar. The federal government’s own research has shown that Americans who receive food stamps have worse diets than nonparticipants with similar incomes, and soda is surely part of that problem. These proposed bans should be even more palatable because they wouldn’t be permanent; they are pilot programs to try out the idea. Yet Democrats, for the most part, remain firmly opposed to soda bans. No Democrat serving in Idaho’s or Arizona’s state Senate voted for their state’s measure. If America is actually going to do something about soda, tests like this will have to be part of the answer.

Nowhere is the Republican Party’s about-face on soda more stark than in West Virginia. In July, the state removed its soda tax. And now, less than a year later, it is pushing forward with a SNAP soda ban as part of an effort to decrease consumption of “ultra-processed crap that barely qualifies as food,” Republican Governor Patrick Morrisey said late last month. Banning the use of SNAP funds to purchase soda has become so popular because it combines the “Make America healthy again” focus on America’s diet problems with the conservative desire to reform the welfare state. Multiple Republicans sponsoring these bills told me they have no intention or desire to dictate what foods people spend their own money on. “If you are using your own funds, I don’t think it’s any different than cigarettes and alcohol,” Idaho State Representative Jordan Redman told me. “We know those aren’t good for us, but if you’re using your own dollars for it, that’s your decision.”

So far, none of these states has actually banned soda from SNAP. The federal government sets the rules for food stamps, even though the program is operated by each state. So a state looking to exclude soda has to request an exception to the rules. Arkansas Governor Sarah Huckabee Sanders was the first to express interest in the policy following Trump’s latest election. “The time has come to support American farmers and end taxpayer-funded junk food,” Sanders wrote to Kennedy and Department of Agriculture Secretary Brooke Rollins in December, before either was confirmed for their position. Although previous attempts to enact such a policy, such as in New York City, were blocked by USDA regulators, it seems that won’t be the case this time around. Rollins, who will have the final say in deciding whether states get to experiment with this idea, has indicated she supports the states’ efforts and will likely approve requests that come her way.

But many long-standing anti-soda advocates are skeptical. Marion Nestle, a professor emeritus at NYU and the author of Soda Politics: Taking on Big Soda (And Winning), has spoken in favor of SNAP soda bans, but told me that she finds it “very hard not to look at” Republicans’ current efforts “as a cover for what the real motivation is, which is to cut SNAP.” Soda is unhealthy, but so is limiting food stamps: The program has been shown to significantly reduce food insecurity and health-care costs. Such skepticism is not unwarranted. One advocacy group pushing the current soda bans, the Foundation for Government Accountability, has also been pushing for a number of policy changes that would significantly reduce the number of people eligible for food-stamp benefits. Joelle Johnson, the deputy director for Healthy Food Access at the Center for Science in the Public Interest, which advocates for stronger nutrition regulation, told me the group doesn’t support the soda bills, because they could lead to less money for SNAP from the government and are “a veiled attempt to slash SNAP benefits.” Some states are going beyond soda and seeking to ban different kinds of foods; dramatic changes in what people can purchase “sets up the argument to say, ‘Well, if they can only purchase a limited variety of products, then they don’t need as much money in monthly benefits,’” Johnson said.

To some Democrats, the effort to zero in on poor people’s food choices is also cruel. After all, Coca-Cola and Pepsi didn’t become Fortune 500 companies just because people on food stamps like their products. Consider Texas: The state’s soda consumption goes way beyond food-stamp recipients. More than 60 percent of Texans drink at least one sugar-sweetened beverage a day. “There’s real cognitive dissonance when we’re discussing these bills in the back and I’m sitting around with my Senate colleagues and they’re drinking Coca-Cola,” Texas State Senator Molly Cook, a Democrat who voted against the state’s bill, told me.

The situation is such a mess that the two sides can’t even agree on the underlying purpose of SNAP. Proponents of the soda ban are quick to note that the full name of the food-stamp program is the Supplemental Nutrition Assistance Program, and no one is getting their nutrition assisted via a two-liter bottle of Pepsi. But public-health and anti-hunger groups argue that any food is better than no food. That argument may seem spurious given SNAP’s name, but the program was renamed only in 2008. The original Food Stamp Act, which officially created the program back in the 1960s, was more about reducing poverty than nutrition.

The tension between those two goals is tying some of the nation’s top public-health-advocacy organizations in knots. The American Heart Association originally spoke against Texas’s proposed soda ban out of concern that nutritional restrictions would “interfere with the primary function of SNAP,” which a lobbyist for the group described as “reducing hunger.” But now the group insists its position was miscommunicated. “We look forward to working with states interested in seeking USDA approval to remove sugary drinks from SNAP as they prepare their waiver requests,” a spokesperson told me.

A ban on using SNAP dollars for soda is hardly the most equitable way to deal with the sugary drink; a tax that targets everyone would be fairer. But at this point, given the USDA’s support, it seems like a foregone conclusion that red states around the country will get their way and establish some new limits on what people can buy with food stamps. Rather than opposing these efforts outright, Democrats should see them as an opportunity. There’s very little research testing the effects of such a soda ban, so a pilot program would help “identify unintended consequences or questions we will later wish we asked,” Jerold Mande, a former USDA and FDA official who served in the Clinton and Obama administrations, told me. Maybe then we can finally get to the bottom of the decades-long debate over whether soda bans are a good idea.

In the never-ending quest to figure out what we are supposed to eat, a new boogeyman has emerged: seed oils. Health and Human Services Secretary Robert F. Kennedy Jr. has pointed to seed oils—a category that includes common varieties such as canola, soybean, and corn—as a major culprit behind America’s chronic-disease problem. Kennedy is far from the only prominent seed-oil critic: On his podcast, Joe Rogan has declared that “seed oils are some of the worst fucking things your body can consume.” These claims about the dangers of seed oils are not based in science; nutritionists believe that seed oils are not only safe but also good for you in moderation. But that hasn’t stopped the charge against them from going mainstream. You can now find products labeled Seed oil safe at Whole Foods and Costco; according to one poll, 28 percent of Americans are actively avoiding seed oils.

So what are people eating instead? Kennedy’s preferred alternative is beef tallow, a nutritionally dubious choice. But most grocery stores don’t have family-size tubs of rendered beef fat sitting next to the extra-virgin olive oil. The obvious seed-oil replacement, then—similarly vegetal, broadly familiar, delicious—is olive oil. Scientists and seed-oil skeptics can agree on this: olive oil, what an oil! Earlier this year, the fast-salad chain Sweetgreen launched a limited-time-only seed-oil-free menu featuring dressings made with olive and avocado oils, chosen for their flavor but also for “their health benefits and alignment with our values.”

But olive oil may soon cost more—potentially a lot more. Donald Trump’s “reciprocal” tariffs, which he delayed by 90 days yesterday, are coming for the country’s liquid gold. You know what is mostly insulated from the president’s proposed plan? Seed oils. Consider vegetable oil, the most ubiquitous of seed oils: No matter what brand you buy, it’s likely made from American-grown soybeans. “If the goal is to get people away from the seed oil, well, these tariffs are going to drive people into the arms of the seed oils,” William Clifton Ridley, an agricultural-economics professor at the University of Illinois Urbana-Champaign, told me. Seed oils, maligned by both the crunchy left and the MAHA right, may get their revenge.

[Read: Americans have lost the plot on cooking oil]

The biggest drawback of olive oil, ignoring certain culinary questions (flavor, smoke point), has long been its price. Olive oil is not cheap compared with canola or vegetable oil. But since 2021, the average price of olive oil in the United States has roughly doubled, the result of climate change and rising production costs. Consider Wirecutter’s budget olive-oil pick, Bertolli Extra Virgin Olive Oil, Rich Taste. At Walmart, it currently costs $8.47 for 16.9 fluid ounces (the equivalent of a regular-size Coke bottle). By contrast, 40 ounces of Crisco vegetable oil, equivalent to slightly more than a liter, will run you $4.47.

The gulf is poised to only widen. That’s because nearly all of the olive oil consumed in the U.S. is imported, according to the U.S. Department of Agriculture. As anyone who has gazed upon the bounty of the supermarket olive-oil aisle can tell you, most of that is coming from the European Union, namely Italy, Spain, Portugal, and Greece. These products currently carry a 10 percent tariff; if Trump goes through with the sweeping fees he paused yesterday, that’ll soon jump to 20 percent. Olive oil is also imported from some other countries, but the trouble is that the proposed tariffs are so global. A lot of olive oil comes from Tunisia, for example, which, under the president’s paused plan, would be tariffed at 28 percent.

Trump’s tariffs are nominally intended to boost American manufacturing. “These tariffs are going to give us growth like you haven’t seen before,” the president has promised. Except there is nowhere near enough homegrown American olive oil to go around. California, the rare state with conditions amenable to olive-growing, produces less than 2 percent of the olive oil that Americans consume. “California likes to think it produces olive oil, but not really, not to any great extent,” Dan Sumner, an agricultural economist at UC Davis, told me. It wouldn’t be easy to drastically ramp up domestic olive-oil production: Olive trees can take at least five years to bear fruit. And with Trump repeatedly announcing tariffs and then pausing them, it’s hard to expect American farmers to invest in this undertaking when they might not even recoup the benefits come 2030.

[Read: A great way to get Americans to eat worse]

Should Trump’s more expansive tariffs take effect, olive-oil prices “might go up substantially,” Ridley told me. Expect the sticker price of olive oil to increase somewhere from 10 to 20 percent—enough, he said, to “drive a sizable decrease in olive-oil demand.” Americans almost certainly won’t abandon olive oil en masse. It’s olive oil, a kitchen staple; nobody wants to drizzle their pizza with canola. “But there’s a huge swath of the population that’s not going to be able to afford it,” Phil Lempert, a grocery-industry analyst, told me. “And they’re going to switch.”

And there are other options. Maybe seed-oil skeptics will want to follow RFK Jr.’s lead and sauté their food in beef tallow. But tallow isn’t cheap either, and there isn’t enough of it to go around. Last year, America produced about one pound of beef tallow for every 15 pounds of soybean oil, the most consumed oil in the U.S. by far. Compared with the alternatives, soybean oil will seem even cheaper: It is produced domestically; imports are essentially zero. The same is true of corn oil, only a tiny fraction of which comes from abroad. The majority of canola oil is imported from Canada—meaning that at least for now, it isn’t subject to any new tariffs. You can debate these oils’ relative merits and drawbacks, but you cannot debate the fact that they cost less. Even the more limited 10 percent tariffs that are now in place could lead to a seed-oil resurgence. If the costs are passed down to consumers, Sumner told me, most people will suck it up and pay—but not everyone. Some people will shift to canola or vegetable oil. Restaurants, perennially concerned about margins, may be less likely to follow Sweetgreen’s lead and give up seed oils. Your local Italian restaurant, Lempert pointed out, may already be saving money by blending their olive oil with canola, and that’s before the tariffs.

Although RFK Jr. is wrong about the health effects of seed oils, he’s right about why they’re so common: They’re cheap. “The reason they’re in foods is that they’re heavily subsidized,” he told Fox & Friends, a point about seed oils he has made repeatedly. The federal government indeed pays American farmers to grow lots of corn and soybeans, allowing you to buy a jug of Crisco for less than $5. If the official governmental policy is to drive up prices on the most obvious alternative, seed oils will continue to have a leg up.
