Robert F. Kennedy Jr., the nominee to be the next secretary of Health and Human Services, is America’s most prominent vaccine skeptic. An advocacy organization that he founded and chaired has called the nation’s declining child-immunization rates “good news,” and referred to parents’ lingering doubts about routine shots as COVID-19’s “silver lining.” Now Kennedy may soon be overseeing the cluster of federal agencies that license and recommend vaccines, as well as the multibillion-dollar program that covers the immunization of almost half the nation’s children.

Which is to say that America’s most prominent vaccine skeptic could have the power to upend, derail, or otherwise louse up a cornerstone of public health. Raising U.S. vaccination rates to where they are today took decades of investment: In 1991, for example, just 82 percent of toddlers were getting measles shots; by 2019, that number had increased to 92 percent. The first Trump administration actually presided over the historic high point for the nation’s immunization services; now the second may be focused on promoting vaccines’ alleged hidden harms. Kennedy has said that he doesn’t want to take any shots away, but even if he were to emphasize “choice,” his leadership would be a daunting test of Americans’ commitment to vaccines.

In many ways, the situation is unprecedented: No one with Kennedy’s mix of inexperience and paranoid distrust has ever held the reins at HHS. He was trained as a lawyer and has no background in biostatistics, nor any of the other research bona fides you’d expect from someone credibly evaluating vaccine efficacy. But the post-pandemic era has already given rise to at least one smaller-scale experiment along these lines. In Florida, vaccine policies have been overseen since 2021 by another noted skeptic of the pharmaceutical industry, State Surgeon General Joseph Ladapo. (Kennedy has likened Ladapo to Galileo—yes, the astronomer who faced down the Roman Inquisition.) Under Ladapo’s direction, the state has aggressively resisted federal guidance on COVID-19 vaccination, and its department of health has twice advised Floridians not to get mRNA-based booster shots. “These vaccines are not appropriate for use in human beings,” Ladapo declared in January. His public-health contrarianism has also started spilling over into more routine immunization practices. Last winter, during an active measles outbreak at a Florida school, Ladapo abandoned standard practice and allowed unvaccinated children to attend class. He also seemed to make a point of not recommending measles shots for any kids who might have needed them.

Jeffrey Goldhagen, a pediatrics professor at the University of Florida and the former head of the Duval County health department, believes that this vaccine skepticism has had immense costs. “The deaths and suffering of thousands and thousands of Floridians” can be linked to Ladapo’s policies, he said, particularly regarding COVID shots. But in the years since Ladapo took office, Florida has not become an outlier in COVID-vaccination numbers, nor in age-adjusted rates of death from COVID. And so far at least, the state’s performance on other immunization metrics is not far off from the rest of America’s. That doesn’t mean Florida’s numbers are good: Among the state’s kindergarteners, routine-vaccination rates dropped from 93.3 percent for kids who entered school in the fall of 2020 to 88.1 percent for those who entered in 2023, and the rate at which kids are getting nonmedical exemptions from vaccine requirements went up from 2.7 to 4.5 percent over the same period. These changes elevate the risk of further outbreaks of measles, or of other infectious diseases that could end up killing children—but they’re not unique to Ladapo’s constituents. National statistics have been moving in the same direction. (To wit: The rate of nonmedical exemptions across the U.S. has gone up by about the same proportion as Florida’s.)

All of these disturbing trends may be tied to a growing suspicion of vaccines that was brought on during COVID and fanned by right-wing influencers. Or they could be a lingering effect of the widespread lapse in health care in 2020, when many young children missed doses of vaccines. (Kids who entered public school in 2023 might still be catching up.)

In any case, other vaccination rates in Florida look pretty good. Under Ladapo, the state has actually been gaining on the nation as a whole in terms of flu shots for adults and holding its own on immunization for diphtheria, tetanus, and pertussis in toddlers. Even Ladapo’s outlandish choice last winter to allow unvaccinated kids back into a school with an active measles outbreak did not lead to any further cases of disease. In short, as I noted back in February, Ladapo’s anti-vaccine activism has had few, if any, clear effects. (Ladapo did not respond when I reached out to ask why his policies might have failed to sabotage the state’s vaccination rates.)

If Florida’s immunization rates have been resilient, then America’s may hold up even better in the years to come. That’s because the most important vaccine policies are made at the state and local levels, Rupali Limaye, a professor and scholar of health behavior at Johns Hopkins University, told me. Each state decides whether and how to mandate vaccines for school-age children, or during a pandemic. The states and localities are then responsible for giving out (or choosing not to give out) whichever vaccines are recommended, and sometimes paid for, by the federal government.

But the existence of vaccine-skeptical leadership in Washington, and throughout the Republican Party, could still end up putting pressure on local decision makers, she continued, and could encourage policies that support parental choice at the expense of maximizing immunization rates. As a member of the Cabinet, Kennedy would also have a platform that he’s never had before, from which he could continue to spread untruths about vaccines. “If you start to give people more of a choice, and they are exposed to disinformation and misinformation, then there is that propensity of people to make decisions that are not based on evidence,” Limaye said. (According to The New York Times, many experts say they “worry most” about this aspect of Kennedy’s leadership.)

How much will this really matter, though? The mere prominence of Kennedy’s ideas may not do much to drive down vaccination rates on its own. Noel Brewer, a behavioral scientist and public-health professor at the UNC Gillings School of Global Public Health, told me that attempts to change people’s thoughts and feelings about vaccines are often futile; research shows that talking up the value of getting shots has little impact on behavior. By the same token, one might reasonably expect that talking down the value of vaccines (as Kennedy and Ladapo are wont to do) would be wasted effort too. “It may be that having a public figure talking about this has little effect,” Brewer said.

Indeed, much has been made of Kennedy’s apparent intervention during the 2019 measles crisis in Samoa. He arrived there for a visit in the middle of that year, not long after measles immunizations had been suspended, and children’s immunization rates had plummeted. (The crisis began when two babies died from a vaccine-related medical error in 2018.) Kennedy has been linked to the deadly measles outbreak in the months that followed, but if his presence really did give succor to the local anti-vaccine movement, that movement’s broader aims were frustrated: The government declared a state of emergency that fall, and soon the measles-vaccination rate had more than doubled.

As head of HHS, though, Kennedy would have direct control over the federal programs that do the sort of work that has been necessary in Samoa, and that provide access to vaccines to those who need them most. For example, he’d oversee the agencies that pay for and administer Vaccines for Children, which distributes shots to children in every state. All the experts I spoke with warned that interference with this program could have serious consequences. Other potential actions, such as demanding further vaccine-safety studies and evidence reviews, could slow down decision making and delay the introduction of new vaccines.

Kennedy would also have a chance to influence the nation’s vaccine requirements for children, as well as its safety-and-monitoring system, at the highest levels. He’d be in charge of selecting members for the Advisory Committee on Immunization Practices, which makes recommendations on vaccines that are usually adopted by the states and result in standardized insurance coverage. He’d also oversee the head of the CDC, who in turn has the authority to overrule or amend individual ACIP recommendations.

Even if he’s not inclined to squelch any determinations outright, Kennedy’s goal of giving parents latitude might play out in other ways. Brewer, who is currently a voting member of ACIP (but emphasized that he was not speaking in that capacity), said that the committee can issue several different types of rulings, some of which roughly correspond to ACIP saying that Americans should rather than may get a certain vaccine. That distinction can be very consequential, Brewer said: Shots that are made “routine” by ACIP get prioritized in doctors’ offices, for instance, while those that are subject to “shared clinical decision-making” may be reserved for patients who ask for them specifically. Shifting the country’s vaccination program from a should to a may regime “would destroy uptake,” Brewer told me.

Those would seem to be the stakes. The case study of vaccine-skeptical governance that we have in Florida may not look so dire—at least in the specifics. But Kennedy’s ascendancy could be something more than that: He could steer the public-health establishment off the course that it’s been on for many years, and getting back to where we are today could take more years still.

When I was teaching indoor cycling every week, an unexpected benefit of the gig was free ice cream. One of the class regulars had an ice-cream machine at home and sometimes brought samples for me to try, in flavors such as pumpkin and pistachio. I think he did this not only because he was a nice person but also because in class, I was the nicest version of myself: warm, welcoming, and encouraging to the point of profound corniness, despite my usual caustic tendencies.

I noticed this friendliness in others too. Two people who met in my class started dating. Strangers who became friends there went out for post-workout coffees. Two of the other class regulars invited me to go skiing with them. Many of the good friends I have at age 35 are people I met in exercise classes I attended regularly. These experiences have convinced me that group fitness classes are the best place to make friends as an adult—an idea supported by research that suggests that the glow of exercise’s feel-good chemicals has interpersonal benefits.

Once, countless friendships were born in what the sociologist Ray Oldenburg called “third places”: physical spaces that aren’t a home or a workplace, don’t charge (much) for entry, and exist in large part to foster conversation. Over the past several decades, though—and especially as a result of the pandemic—third places such as bars and cafés have begun playing a much smaller role in social life, depriving American adults of opportunities for chance encounters that can lead to friendships. Perhaps that’s partly why Americans rank improving their relationships among their top New Year’s resolutions.

Group fitness classes don’t exactly fit the definition of a third place: They cost money, and the primary activities within them are sweating, grunting, and skipping a few reps when the instructor isn’t looking. But they fulfill many conditions that social-psychology research has repeatedly shown to help forge meaningful connections between strangers: proximity (being in the same place), ritual (at the same time, over and over), accumulation (for many hours), and shared experiences or interests (because you do and like the same things).

[From the December 2019 issue: I joined a stationary-biker gang]

Sussing out shared interests can be horribly awkward when you meet someone new at work or even at a party. Group fitness classes make it a little easier, Stephanie Roth Goldberg, an athlete psychotherapist in New York, told me. “Automatically, when you walk into a fitness class, you likely are sharing the idea that ‘We like to exercise,’ or ‘We like to do this particular kind of exercise,’” she said. “It breaks the ice differently than standing in a bar or at someone’s house.” Of course, breaking the ice still requires someone to say something, which, if you’re sweaty and huffing, is frankly terrifying. Whether I’m an instructor or a classmate, one simple tactic has never failed me: I simply walk up to someone after class and say, “Hey, good job!”

Proximity, ritual, and accumulation all require a certain amount of time, which can be hard to come by in a country that requires and rewards long hours at work. But you’re already making time for exercise class, and it provides those conditions; benefiting from them mostly requires acknowledging that you’ve already set yourself up for friendship. Danielle Friedman, a journalist and the author of Let’s Get Physical, told me that breaking through what she calls the “social code of anonymity” is key to making friends. “If you’ve been going to the same class for a while and start seeing the same people, don’t pretend like you’ve never interacted before,” she said.

That kind of friendliness requires adopting the clichéd feel-goodery inherent in many group fitness classes. In my spin classes, I’d cringe whenever I caught myself doling out motivational platitudes—mostly “We’re all in this together!” because I needed the reminder too, as I tried to talk and spin at the same time. Inevitably, though, someone would “Woo!” in response and reenergize the whole room. I’d load up my playlists with high-tempo remixes of early-aughts Top 40 hits and catch people singing along. One of my favorite instructors in a class I attended regularly instituted “Fun Friday,” when we’d warm up by doing silly little relay races or grade-school-style games; my blood ran cold the first time she told us to partner up for this cheesefest, but I had a blast. Everyone did.

In a world that prizes ironic detachment, embracing such earnest silliness can feel deeply uncomfortable. But—and you might as well get used to hearing this kind of phrase now, if you’re going to start attending classes—you just have to push through. “When you’re sweating, feeling a little out of control of your physical self, whooping and yelling, there’s a vulnerability,” Friedman said. “If you buy in, then you’ve shared something. There aren’t that many contexts as adults where you have that opportunity to be vulnerable together.”

[Read: Why making friends in midlife is so hard]

A room full of grown adults flailing, shouting, and running miles without ever going anywhere is a fundamentally ridiculous prospect. Ridiculous things, however, play a crucial role in connecting with others: They make us laugh. Studies show that laughing with others facilitates social connection by helping us feel that we have more in common. The “happy hormones” released during exercise—endorphins, dopamine, and serotonin—are also associated with bonding. In particular, exercising in sync with others promotes close relationships.

Even if you don’t find your next best friend at Zumba, getting into a fitness habit of some kind might help you meet people and make friends in other spaces. “The more that people can step out of their comfort zone in one setting, the less intimidating it is to do in other settings,” Goldberg said. Perhaps you’ll even become the version of yourself who inspires people to bring you homemade ice cream. Win-win.

When Mary-Claire King discovered the first gene linked to hereditary breast cancer in 1990, she also got to decide its name. She settled on the four letters BRCA, which had three distinct meanings. The name paid homage to UC Berkeley, where King worked at the time; more to the point, it was a nod to Paul Broca, the 19th-century French physician whose work established a link between family history and breast cancer. It was also an abbreviation for breast cancer.

A few years after King discovered BRCA1, a second BRCA gene, BRCA2, was identified. Together, they now have more name recognition than probably any other gene, their profile boosted by research that has shown staggering effects on cancer risk. Awareness campaigns followed. A 2013 New York Times op-ed in which Angelina Jolie revealed she’d had a preventive double mastectomy because of her own BRCA mutation drove many women to seek DNA tests themselves. The BRCA genes became inextricably linked with breasts, as much as the pink ribbons that have become an international symbol of breast cancer. And by driving more women to find out whether they carry BRCA mutations, that awareness has helped greatly reduce the toll of hereditary breast cancer.

But in the three decades since the genes were discovered, scientists have learned that BRCA mutations can also lead to cancer in the ovaries, the pancreas, and the prostate. More recently, they have been linked with cancers in other parts of the body, such as the esophagus, stomach, and skin. As many as 60 percent of men with changes in BRCA2 develop prostate cancer, yet men are generally far less aware than women that BRCA mutations can affect them at all.

“It’s a branding problem,” Colin Pritchard, a professor of laboratory medicine and pathology at the University of Washington, told me. Men with family histories of breast cancer may not realize that they should get screened. Physicians, too, lack awareness of which men should get tested, and what steps to take when a mutation is found. Now Pritchard and other researchers are working to rebrand BRCA and the syndrome associated with it so that more men and their doctors consider testing.

Normally, the BRCA genes produce proteins that help repair damaged DNA throughout the body. Most people who carry mutations that impair the genes’ function are diagnosed with hereditary breast and ovarian cancer syndrome. (Having HBOC means a person is at increased risk for cancer, not that they already have an illness.) Most breast-cancer cases have no known hereditary link, but more than 60 percent of women with a harmful BRCA1 or BRCA2 mutation will develop breast cancer, compared with about 13 percent of the wider female population. Men, of course, can get breast cancer too, but it’s rare, even among BRCA-mutation carriers.

[Read: Cancer supertests are here]

The full significance of the link between BRCA mutations and pancreatic and prostate cancer has become clear only recently—perhaps in the past decade, Pritchard said. The exact risk these mutations confer on men varies widely across studies. But it’s clearly significant: Not only are men with BRCA mutations more likely to develop prostate cancer, they are also more likely to develop the more aggressive forms of the disease.

Roughly one in 400 people carry a harmful mutation in BRCA1 or BRCA2, and half of them are men. But women are far more likely to have been tested for the mutations—up to 10 times as likely, according to one study. “Beyoncé’s dad was the only man that I had ever heard of who had it,” Christian Anderson, a 46-year-old social-sciences professor in Washington State who carries a BRCA2 mutation, told me. Anderson got tested after his sister was diagnosed with breast cancer, but countless men like him go undetected. Only about half of Americans get an annual physical, and doctors aren’t always aware of BRCA-screening recommendations for men. Many men who do test for a BRCA mutation report doing it for their daughters, and studies have shown that they tend to be confused about their risks of developing cancer themselves.

BRCA-awareness campaigns have led many women to get tested; in the two weeks after Angelina Jolie’s viral op-ed, researchers found that BRCA-testing rates went up by 65 percent. In that case, more people may have gotten tested than needed to, but in general, the rise in cancer screenings and elective surgical interventions has helped reduce the rates of deaths from breast and ovarian cancers. Education about the genes’ links to other cancers could do the same for men. To that end, Pritchard argued in a 2019 Nature commentary that hereditary breast and ovarian cancer syndrome should be renamed King Syndrome, after Mary-Claire King. “We need to really rethink this if we’re going to educate the public about the importance of these genes for cancer risk for everyone, not just women,” he told me.

[Read: I’ll tell you the secret of cancer]

As understanding of BRCA’s risks for men has grown, Pritchard’s idea has started to catch on. King, who is now a professor of genome sciences and medicine at the University of Washington, demurred when I asked her whether the syndrome associated with the BRCA genes should be renamed after her, but agreed that awareness campaigns have focused too narrowly on breasts and ovaries. “We need to bring this awareness to men in the same way that we have for 30 years now to women,” she told me.

How exactly Pritchard’s plan might be put into action is unclear. Gene names are overseen by an international committee and rarely changed. That’s part of why Pritchard is suggesting that the name of the syndrome associated with BRCA mutations become King Syndrome—no single governing body oversees that. Recently, ClinGen, an international group of researchers that works to parse the medical significance of genes, recommended that HBOC be rechristened BRCA-related cancer predisposition. (Pritchard told me he thinks that name isn’t quite as “catchy” as King Syndrome.)

Uncoupling the syndrome associated with BRCA mutations from breasts would likely be only the first step in getting more at-risk men screened for cancer. It would also be an important step in understanding the full impact of BRCA mutations on men. Because fewer men than women have been tested for BRCA mutations, scientists still don’t have a complete picture of their risk. For example, Pritchard told me, it’s only as more attention has been drawn to male BRCA risk that researchers have discovered that the mutations are linked to especially aggressive forms of prostate cancer. Penn Medicine recently launched a program dedicated to men and BRCA in part to continue this sort of research.

[Read: Scientists have been studying cancers in a very strange way for decades]

BRCA’s name is a legacy of a time when scientists thought genetics would offer a simple way to diagnose and treat disease—that one specific mutation would point definitively to one specific cancer. But today, “the idea that a gene would only affect one type of cancer risk is probably outmoded,” Pritchard said. The more scientists explore the human genome, the more complex its connections to health appear. It turns out that when genes don’t work as they should, the possible consequences may very well be infinite.

When I was taking German in college in the early years of this millennium, I once stumbled upon a word that appeared foreign even when translated into English: Diphtherie, or diphtheria. “What’s diphtheria?” I wondered, having never encountered a single soul afflicted by this disease.

Diphtheria, once known as the “strangling angel,” was a leading killer of children into the early 20th century. The bacterial infection destroys the lining of the throat, forming a layer of dead, leathery tissue that can cause death by suffocation. The disease left no corner of society untouched: Diphtheria killed Queen Victoria’s daughter and the children of Presidents Lincoln, Garfield, and Cleveland. Parents used to speak of their first and second families, an elderly woman in Ottawa once recalled, because diphtheria had swept through and taken all of their children.

Today, diphtheria has been so thoroughly forgotten that someone like me, born some 60 years after the invention of a diphtheria vaccine, might have no inkling of the fear it once inspired. If you have encountered diphtheria outside of the historical context, it’s likely because you have scrutinized a childhood immunization schedule: It is the “D” in the DTaP vaccine.

Vaccine breakthroughs over the past two centuries have cumulatively made the modern world a far more hospitable place to be born. For most of human history, half of all children died before reaching age 15; that number is down to just 4 percent worldwide, and far lower in developed countries, with vaccines one of the major drivers of improved life expectancy. “As a child,” the vaccine scientist Stanley Plotkin, now 92, told me, “I had several infectious diseases that almost killed me.” He ticked them off: pertussis, influenza, pneumococcal pneumonia—all of which children today are routinely vaccinated against.

But the success of vaccines has also allowed for a modern amnesia about the level of past human suffering. In a world where the ravages of polio or measles are remote, the risks of vaccines—whether imagined, or real but minute—are able to loom much larger in the minds of parents. This is the space exploited by Robert F. Kennedy Jr., one of the nation’s foremost anti-vaccine activists and now the nominee for secretary of Health and Human Services. It is a stunning reversal of fortune for a man relegated to the fringes of the Democratic Party just last year. And it is also a reversal for Donald Trump, who has flirted with anti-vaccine rhetoric in the past but who also presided over a record-breaking race to create a COVID vaccine. Kennedy has promised that he would not yank vaccines off the market, but his nomination normalizes and emboldens the anti-vaccine movement. The danger now is that diseases confined to the past become diseases of the future.

Walt Orenstein trained as a pediatrician in the 1970s, when he often saw children with meningitis, a dangerous infection of the membranes around the brain that can be caused by a bacterium called Haemophilus influenzae type b, or Hib. (Despite the name, it is not related to the influenza virus.) “I remember doing loads of spinal taps,” he told me, to diagnose the disease. The advent of a Hib vaccine in the 1980s virtually wiped these infections out; babies are now routinely vaccinated in the first 15 months of life. “It’s amazing there are people today calling themselves pediatricians who have never seen a case of Hib,” he said. He remembers rotavirus, too, back when it used to cause about half of all hospitalizations for diarrhea in kids under 5. “People used to say, ‘Don’t get the infant ward during diarrhea season,’” Orenstein told me. But in the 2000s, the introduction of rotavirus vaccines for babies six months and younger sharply curtailed hospitalizations.

To Orenstein, it is important that the current rotavirus vaccine has proved not only effective but also safe. An older rotavirus vaccine was taken off the market in 1999, when regulators learned that it gave babies as much as a one-in-10,000 chance of developing a serious but usually treatable bowel obstruction called intussusception. The benefits arguably still outweighed the risks—about one in 50 babies infected with rotavirus needs hospitalization—but the United States has a high bar for vaccine safety. Similarly, the U.S. switched from an oral polio vaccine containing live, weakened virus—which had a one-in-2.4-million chance of causing paralysis—to a more expensive but safer shot made with inactivated viruses that cannot cause disease. No vaccine is perfect, said Gregory Poland, a vaccinologist and the president of the Atria Academy of Science & Medicine, who himself developed severe tinnitus after getting the COVID vaccine. “There will always be risks,” he told me, and he acknowledged the need to speak candidly about them. But vaccine recommendations are based on benefits that are “overwhelming” compared with their risks, he said.

The success of childhood vaccination has the perverse effect of making the benefits of these vaccines invisible. Let’s put it this way: If everyone around me is vaccinated for diphtheria but I am not, I still have virtually no chance of contracting it. There is simply no one to give it to me. This protection is also known as “herd immunity” or “community protection.” But that logic falls apart when vaccination rates slip, and the bubble of protective immunity dissolves. The impact won’t be immediate. “If we stopped vaccinating today, we wouldn’t get outbreaks tomorrow,” Orenstein said. In time, though, all-but-forgotten diseases could once again find a foothold, sickening not only those who choose not to be vaccinated but also those who cannot be, such as people with certain medical conditions and newborns too young for shots. In aggregate, individual decisions to refuse vaccines end up having far-reaching consequences.

Evolutionary biologists have argued that plague and pestilence rose in tandem with human civilization. Before humans built cities, back when we still lived in small bands of hunter-gatherers, a novel virus—say, from a bat—might tear through a group only to reach a dead end once everyone was immune or dead. With no one else to infect, such a virus burns itself out. Only when humans started clustering in large cities could certain viruses keep finding new susceptibles—babies or new migrants with no immunity, people with waning immunity—and smolder on and on and on. Infectious disease, you might then say, is a price of living in a society.

But human ingenuity has handed us a cheat code: Vaccines now allow us to enjoy the benefits of fellow humanity while preventing the constant exchange of deadly pathogens. And vaccines can, through the power of herd immunity, protect even those who are too young or too sick to be effectively vaccinated themselves. When we get vaccinated, or don’t, our decisions ricochet through the lives of others. Vaccines make us responsible for more than ourselves. And is that not what it means to live in a society?

The last time I sweated through my shirt, I vowed that it would never happen again. Sweat shame had dogged me for too many years. No longer would armpit puddles dictate the color of my blouse. Never again would I twist underneath a hand dryer to dry my damp underarms. It was time to try clinical-strength antiperspirant.

The one I bought looked like any old antiperspirant, a solid white cream encased in a plastic applicator. But its instructions seemed unusual: “For best results, apply every night before bed and again in the morning.”

Every night?

I swiped it across my armpits before bed, and to my surprise, they were dry all the next day. I kept poking them in disbelief—deserts. But I would later discover that there isn’t anything particularly special about this product. Nighttime application improves the effects of any traditional antiperspirant, including those combined with deodorant (the former blocks sweat while the latter masks smell). Research has shown this for at least 20 years; none of the experts I spoke with disagreed. Yet many of us swipe our armpits in the morning before we head out for the day. Somehow, Americans are trapped in a perspiration delusion.

Putting on antiperspirant in the evening feels roughly akin to styling your hair right before bed. Both are acts of personal maintenance that people perform not only for their own well-being but also in anticipation of interactions with others. This idea is reinforced by ads for antiperspirants, which tend to feature half-dressed actors getting ready in bathrooms or changing rooms; see, for example, the Old Spice guy. These ads also tend to mention how long their products work—24 hours, 36 hours—implying that their effectiveness starts to fade once they are applied. In a recent Secret commercial, a woman rolls on antiperspirant in a daylit bathroom, then scrambles to make her bus, relieved that she is prepared for such sweaty moments for the next 72 hours.

What these ads don’t say is that these products need the right conditions to work effectively. Antiperspirant isn’t a film on the surface of the armpit that stops moisture from leaking through, like a tarp over wet grass. Instead, it functions like a bunch of microscopic champagne corks, temporarily sealing sweat glands from spraying their contents. The active ingredient in most antiperspirants is some form of aluminum salt, compounds that combine with moisture on the skin to form “gel plugs” that dam up the sweat glands. These gel plugs prevent not only wetness but also odors, because bacteria responsible for foul smells thrive best in moist (and hairy) conditions, according to Dee Anna Glaser, a dermatologist and board member of the International Hyperhidrosis Society, a group that advocates for patients with excessive sweatiness.

Gel plugs are finicky. They need a little bit of sweat in order to form—but not too much. Antiperspirant applied in the morning isn’t ideal, because people sweat more during waking hours, when they’re active. If the armpits are too sweaty in the hours after application, the product gets washed away before it can form the plugs. The body is cooler and calmer during sleep. For gel plugs to form, “baseline sweating is optimal at nighttime before bed,” Glaser told me. Nighttime application has been shown to increase the sweat-reduction ability of normal antiperspirant from 56 percent to 73 percent.

But wait, I can already hear you thinking, what happens if I shower in the morning? Here’s the thing: Antiperspirant lasts through a shower. “The plugs won’t wash away much,” even though the residue and scent probably will, Mike Thomas, a former scientist with Procter & Gamble and an advocate for the International Hyperhidrosis Society, told me. After 24 hours or more, the plug naturally dissolves. Reapplying antiperspirant during the day can be beneficial, Shoshana Marmon, a dermatology professor at New York Medical College, told me. Still, it works best if applied to dry armpits that, ideally, stay dry enough for the plugs to form. For most people, Marmon added, putting it on “clean, dry skin at night” provides enough protection to last through the next day.

Again, none of this information is new or hard to find. One of the earliest studies demonstrating the value of nighttime application was published in 2004; it showed that applying antiperspirant in the evening, or twice daily, was significantly more effective than morning-only use. Indeed, the stance of the American Academy of Dermatology is that it’s best to put antiperspirant on at night. Media outlets have covered this guidance since at least 2009.

For the perpetually sweaty, discovering this guidance only now, after decades of embarrassing photos and ruined shirts, might spark belief in a grand conspiracy: They don’t want you to know the truth about armpit sweat. Indeed, it isn’t mentioned on the labels of most regular-strength antiperspirants. The reasons for this are more banal than nefarious. Most people don’t sweat excessively, so applying antiperspirant the usual way is sufficient. “Manufacturers may keep instructions simple to fit general habits, so the idea of using antiperspirant at night doesn’t always make it into mainstream awareness,” Danilo C. Del Campo, a dermatologist at Chicago Skin Clinic, told me. The difference between antiperspirant and deodorant still eludes many people, and that confusion may, in fact, bolster the insistence on morning application. Deodorant is essentially perfume and has no impact on sweat production. It’s “best applied when odor control is most needed, typically in the mornings,” Marmon said.

When I asked brand representatives why so many antiperspirants don’t mention nighttime use in the directions, they pointed to the potential for confusion. “It’s a bit counterintuitive for people to use antiperspirant at night, because most people think of applying it as part of their morning routine,” Maiysha Jones, a principal scientist at P&G North America Personal Care, which owns brands such as Secret and Old Spice, told me. But, she added, it is indeed best to use it at night. “Antiperspirants are commonly assumed to be a morning-only product and applied during the morning routine,” Megan Smith, a principal scientist at Degree Deodorant, told me.

In other words, people are used to applying antiperspirant in the morning because companies don’t tell them about the nighttime hack … but companies don’t tell them because people are used to putting it on in the morning. Omitting helpful instructions just because they might be confusing isn’t doing America’s perspirers any favors. Anyone who’s ever experienced an overly moist underarm can surely be coaxed into shifting armpit maintenance back a measly eight hours. People go to far greater lengths to self-optimize, whether it’s teens adopting multistep skin-care routines, or wellness bros taking dozens of supplements.

The science is well established, and the guidance is clear. But the ranks of nighttime swipers may not increase immediately. Routines have to be reset, assumptions picked apart. Some evenings, I find it exhilarating to buck the orthodoxy of personal hygiene. Other nights, it gives me pause. Applicator hovers over armpit, brain stumbles on belief. Will this really last past the sunrise, through a shower, beyond the hustle of the day? Even after learning about the science, “some people just don’t believe,” Thomas said. All there is to do is try. In go the corks, out go the lights.

A person who is “on the couch” is known to be in therapy, but most therapists these days don’t ask their clients to lie down. The first time mine did, I resisted. I didn’t want to be on display or unable to see her reactions. Plus, the idea seemed antiquated. Sigmund Freud was inspired to use the couch more than a century ago, after observing dramatic hypnotherapy demonstrations by his teacher Jean-Martin Charcot. In psychoanalysis, Freud thought, a therapist being out of view would help people access emotions or memories that might be repressed. (He also said that he could not “put up with being stared at by other people for eight hours a day.”)

Many of Freud’s ideas about the unconscious haven’t held up, but he may have been onto something with the couch, as I discovered when I eventually followed my therapist’s suggestion. The couch might not be for everyone, but it could be worth a try.

Today, therapists’ offices still almost always have a couch. For his book, In the Shadow of Freud’s Couch, Mark Gerald, a psychologist trained in psychoanalysis, took portraits of therapists in their offices, and “it was a rare room that did not have a couch,” he told me. “Somebody once mentioned they actually never used the couch with their patients, but if they didn’t have a couch, they wouldn’t feel like a psychoanalyst.”

Despite the ubiquity of the therapist’s couch, research on when it is and isn’t helpful is limited. Some patients in case studies report that they missed seeing their therapist’s face when they lay down, while others have used the couch to avoid direct face-to-face communication. Even Freud wasn’t always strict about the couch. He also took walks with his patients; he famously analyzed the composer Gustav Mahler during a four-hour walk around Leiden.

But for me—and, I suspect, many others—occasionally lying down might provide some relief from the social aspects of talking with a therapist. The relationship between a therapist and a client won’t fully disappear, and that dynamic can be useful to examine in therapy too. Yet many therapy clients place too much emphasis on interpersonal dynamics. On social media, people make jokes about how much they want to get an A+ in therapy or make a therapist laugh; I’ve felt the same pressures myself. That stress could be reduced when you’re lying down and physically incapable of scanning your therapist’s face for signs of approval or displeasure. And in a time when many people have switched to teletherapy, staring at your therapist’s face (or your own) over Zoom can feel like a work meeting gone wrong.

On a very simplistic level, the couch offers a change in perspective—literally. The ancient Greeks and Romans reclined during banquets, which created an atmosphere of comfort and intimacy, said Nathan Kravis, a psychiatrist and historian of psychiatry at Weill Cornell Medical College who’s also the author of On the Couch. In the modern world, lying down signifies that the conversation you’re about to have in therapy is a different kind of interaction than those you have with family or friends. It’s a time to confront difficult thoughts, admit shortcomings, or explore desires without the relational obligations to those we know in our “real” life. “Its strangeness is part of the power,” Kravis told me. “It really has no parallel anywhere else in our social world.”

Lying down may also better satisfy people’s hunger for a more creative and humanistic form of therapy. Over the past few years, psychoanalysis has made a dramatic comeback, though the approach still accounts for a minority of therapy sessions. Other therapeutic methods, such as cognitive behavioral therapy, aim to change thoughts and behaviors that aren’t working for you; psychoanalysis, by contrast, focuses on self-exploration. I did CBT for years, and it’s very effective at solving problems. Yet something about a therapeutic practice that’s not as goal-oriented can be healing. When I started to lie down, I felt that I was choosing to make space for reflection, grief, processing, and developing intimacy with my own mind in a world where such acts are not usually prioritized.

Lying down for therapy sessions could make a practice that’s costly in both time and money feel even more indulgent, and I’m sure that some people simply wouldn’t like it. But it’s available to everyone to try at least once. Lying down is a technique, not an end goal, Ahron Friedberg, a licensed psychoanalyst and board-certified psychiatrist, told me. It could be a way of cultivating comfort, intimacy, and reflection, or speaking in a manner that you’re not used to—to yourself or to a therapist.

[Read: Plenty of people could quit therapy right now]

In Gerald’s office, he provides choices. There’s a chair directly across from where he sits. There’s a couch, recently replaced because the original became saggy from years of use. Some of Gerald’s patients always lie down; others save it for when they’re feeling overwhelmed. One patient usually sits up but will curl up into a ball on the couch when she’s going through a difficult time. Kravis said he talks with a new client face-to-face, then gives them the option of lying down after a session or two. “It’s not mandatory,” he said. “You’re not Velcroed to the couch.”

Is Robert F. Kennedy Jr., Donald Trump’s improbable nominee for secretary of Health and Human Services, actually such a crank? Short answer: yes. But two opinion pieces published in just the past few days argue that while Kennedy is often taken as unhinged, some of his ideas may very well be sound. Take his call for removing fluoride from the nation’s drinking water. “It’s not an entirely crazy idea,” wrote Leana Wen, the former Baltimore health commissioner, in The Washington Post. Her piece concludes: “Not every proposal from Trump and Kennedy is a five-alarm fire.”

The medical-evidence expert (and Atlantic contributor) Emily Oster made a similar argument about fluoride in The New York Times, adding that Kennedy’s dedication to raw milk is also not totally unreasonable. Instead of yelling that he’s wrong, she said, public-health authorities could start “acknowledging that reasonable people may make different choices on a given issue.”

Let’s be clear: Many scientists consider Kennedy to be a fool, and a ludicrous pick to run HHS, because the evidence supports that assessment. Wen nods to this in passing—Kennedy has a “long history of antiscience propagandism,” she writes—but otherwise she’s focused on the nitty-gritty of one particular public-health debate. So allow me to fill in some gaps: According to his 2021 book, The Real Anthony Fauci: Bill Gates, Big Pharma, and the Global War on Democracy and Public Health, RFK Jr. believes that Fauci and Gates are members of a “vaccine cartel” trying to kill patients by denying them hydroxychloroquine and ivermectin. He argues that this cartel secretly funded doctors to produce fraudulent studies showing that the drugs were ineffective against COVID—and that it did so in order to orchestrate global lockdowns and accelerate the construction of 5G cellular networks, which, in Kennedy’s understanding, are very, very bad.

I read The Real Anthony Fauci in what may have been a misguided attempt to “do my own research.” It’s hard to summarize the extent of this book’s lunatic claims. Every group imaginable is said to be in on a plot to bring about worldwide totalitarianism and population control: governments, pharmaceutical companies, nonprofits, scientists, and, of course, the CIA. Kennedy devotes many pages to casting doubt on HIV as the cause of AIDS, although he finally says he takes “no position” on this theory. The book also repeats threadbare allegations that a vaccine scientist at the CDC destroyed data revealing that the measles, mumps, and rubella (MMR) shot caused a 340 percent increase in autism among Black boys, thus continuing a major theme in its author’s activism: Before the pandemic, Kennedy was best known for relentlessly misleading the public about vaccinations. “Pharma and its media shills are working at turning us into ‘Land of the Cowed, Home of the Slave,’” he wrote about the MMR shot in 2019.

Physicians like myself should have no trouble dismissing Kennedy. But some of my colleagues are asking Americans to withhold judgment. Last week, former CDC Director Robert Redfield, an infectious-disease doctor, announced, “For sure, I know that Bob Kennedy is not an anti-vaxxer,” after commending Kennedy’s crusade against chronic disease. Other doctors with a public platform—Jay Bhattacharya and Marty Makary, for example—have congratulated Kennedy on his support of free speech and his critique of the medical-industrial complex. And Vinay Prasad, an oncologist at UC San Francisco (and occasional Atlantic contributor), has written paeans to Kennedy’s views on the evils of pandemic restrictions and government censorship of social media. “Some have expressed concern about past statements by Mr. Kennedy,” Prasad acknowledged in a recent post. But “instead of attacking him,” another post explained, “we should acknowledge what he is right about and give guidance.” (Neither Makary nor Prasad responded to requests for comment for this story. Bhattacharya wrote back to say that “politically minded doctors” such as myself “have done much damage to public confidence in public health.”)

There’s a name for this phenomenon: “sanewashing.” The concept rose to prominence during the presidential campaign, when liberals accused journalists of smoothing and sanding Trump’s often-rambling and confused statements into a more coherent, palatable form. They would sometimes treat his policy proposals as provocative asides rather than the chaotic assaults on government institutions they actually represented. Through curated clips and paraphrases, the argument goes, news outlets obscured the true extent of Trump’s lunacy. This is what some public-health commentators are now doing for Kennedy.

Certain medical professionals may be offering their support out of self-interest: Bhattacharya and Makary are reportedly being considered for roles in the Trump administration. Yet they may also see some valid reasons to give Kennedy a chance. Yes, he gets important details wrong from time to time, but maybe—like Trump himself—it’s best to take him seriously, not literally. I’m ready to acknowledge the merit of Kennedy’s frequent claim that medical regulators are beset by conflicts of interest. Researchers and watchdogs have criticized the FDA, CDC, and other health agencies for operating a “revolving door” between government and industry. Vinay Prasad has long been an advocate on this very issue, and now, through Kennedy, he sees an opportunity to eliminate those conflicts once and for all. Kennedy isn’t a policy wonk, though; he’s a fabulist. Regulators and the pharmaceutical industry want nothing less than to “rob us of our sovereignty,” he said this month. “This is an organized, systematic, devious, nefarious project by these elites to turn the world into a technocracy.” When the problem is framed this way, Kennedy sounds less like a reformer and more like someone trapped in a web of conspiracy.

I once had more sympathy for these pundits and their way of thinking: In 2016, I wrote an op-ed suggesting that doctors and their patients should band together in opposition to the conflicts of interest posed by the pharmaceutical industry, and that proponents of conventional and alternative medicine could unite behind a truly populist critique of corporate health care. I believed that more collaboration would moderate our discourse. But instead I’ve watched doctors distort or downplay their prior views in an attempt to find some common ground. Before Prasad was boosting Kennedy’s proposals, for instance, he wondered publicly whether the CDC’s questionable pandemic policies might lower rates of routine childhood vaccinations. Now he finds himself defending America’s leading anti-vaxxer.

The sanewashers seem to understand that, if medical experts want any say in public health over the next few years, they will have to engage with the incoming Trump administration’s many eccentrics. But RFK Jr. is indeed a grade-A crank. Why should he have input on anything? This nation has no shortage of public-health and medical experts with thoughts on raw milk or fluoridated water. Some experts will surely agree with aspects of Kennedy’s platform, but they will also bring the credibility, experience, rigor, and honesty he lacks. Let’s not pretend that Kennedy’s views have any value whatsoever.

The news came four years ago, at the end of a casual phone call. Bill’s family had always thought it was a freak coincidence that his father and grandfather both had ALS. But at the end of a catch-up, Bill’s brother revealed that he had a diagnosis too. The familial trend, it turned out, was linked to a genetic mutation. That meant Bill might also be at risk for the disease.

An ALS specialist ordered Bill a DNA test. While he waited for results, he applied for long-term-care insurance. If he ever developed ALS, Bill told me, he wanted to ensure that the care he would need as his nerve cells died and muscles atrophied wouldn’t strain the family finances. When Bill found out he had the mutation, he shared the news with his insurance agent, who dealt him another blow: “I don’t expect you to be approved,” he remembers her saying.

Bill doesn’t have ALS. He’s a healthy 60-year-old man who spends his weekends building his dream home by hand. A recent study of mutations like his suggests that his genetics increase his chances of developing ALS by about 25 percent, on average. Most ALS cases aren’t genetic at all. And yet, Bill felt that he was being treated as if he were already sick. (Bill asked to be identified by his first name only, because he hasn’t disclosed his situation to his employer and worried about facing blowback at work too.)

What happened to Bill, and to dozens of other people whose experiences have been documented by disease advocates and on social media, is perfectly legal. Gaps in the United States’ genetic-nondiscrimination law mean that life, long-term-care, and disability insurers can obligate their customers to disclose genetic risk factors for disease and deny them coverage (or hike prices) based on the resulting information. It doesn’t matter whether those customers found out about their mutations from a doctor-ordered test or a 23andMe kit.  

For decades, researchers have feared that people might be targeted over their DNA, but they weren’t sure how often it was happening. Now at least a handful of Americans are experiencing what they argue is a form of discrimination. And as more people get their genomes sequenced—and researchers learn to glean even more information from the results—a growing number of people may find themselves similarly targeted.

When scientists were mapping the immense complexity of the human genome around the turn of the 21st century, many thought that most diseases would eventually be traced to individual genes. Consequently, researchers worried that people might, for example, get fired because of their genetics; around the same time, a federal research lab was sued by its employees for conducting genetic tests for sickle-cell disease on prospective hires without their explicit consent. In 2008, the Genetic Information Nondiscrimination Act (GINA) was signed into law, ensuring that employers couldn’t decide to hire or fire you, and health insurers couldn’t decide whether to issue a policy, based on DNA. But lawmakers carved out a host of exceptions. Insurers offering life, long-term-care, or disability insurance could take DNA into account. Too many high-risk people in an insurance pool, they argued, could raise prices for everyone. Those exceptions are why an insurer was able to deny Bill a long-term-care policy.

[Read: The loopholes in the law prohibiting genetic discrimination]

Cases like Bill’s are exactly what critics of the consumer-genetic-testing industry feared when millions of people began spitting into test tubes. These cases have never been tallied up or well documented. But I found plenty of examples by canvassing disease-advocacy organizations and social-media communities for ALS, breast cancer, and Huntington’s disease. Lisa Schlager, the vice president of public policy at the hereditary-cancer advocacy group FORCE, told me she is collecting accounts of discrimination in life, long-term-care, and disability insurance to assess the extent of the problem; so far, she has about 40. A man Schlager connected me with, whose genetic condition, Lynch syndrome, increases the risk for several cancers, had his life-insurance premium increased and coverage decreased; several other providers denied him a policy altogether. Kelly Kashmer, a 42-year-old South Carolina resident, told me she was denied life insurance in 2013 after learning that she had a harmful version of the BRCA2 gene. One woman I found via Reddit told me she had never tested her own DNA, but showed me documents that demonstrate she was still denied policies—because, she said, her mom had a concerning gene. (Some of the people I spoke with, like Bill, requested not to be identified in order to protect their medical privacy.)

Studies have shown that people seek out additional insurance when they have increased genetic odds of becoming ill or dying. “Life insurers carefully evaluate each applicant’s health, determining premiums and coverage based on life expectancy,” Jan Graeber, a senior health actuary for the American Council of Life Insurers, said in a statement. “This process ensures fairness for both current and future policyholders while supporting the company’s long-term financial stability.” But it also means people might avoid seeking out potentially lifesaving health information. Research has consistently found that concerns about discrimination are one of the most cited reasons that people avoid taking DNA tests.

For some genetically linked diseases, such as ALS and Huntington’s disease, knowing you have a harmful mutation does not enable you to prevent the potential onset of disease. Sometimes, though, knowing about a mutation can decrease the odds of severe illness or death. BRCA mutations, for example, give someone as much as an 85 percent chance of developing breast cancer, but evidence shows that testing women for the mutations has helped reduce the rate of cancer deaths by encouraging screenings and prophylactic surgeries that could catch or prevent disease. Kashmer told me that her first screening after she discovered her BRCA2 mutation revealed that she already had breast cancer; had she not sought a genetic test, she might have gotten a policy, but she would have been a much worse bet for the insurer. She’s now been cancer-free for 11 years, but she said she hasn’t bothered to apply for a policy again.

[Read: Remember that DNA you gave 23andMe?]

Even employers, which must adhere to GINA, might soon be able to hire or fire based on certain genetic risk factors. Laura Hercher, a genetic counselor and director of research at the Sarah Lawrence College Human Genetics Program, told me that some researchers are now arguing that having two copies of the APOE4 mutation, which gives people about a 60 percent chance of developing Alzheimer’s, is equivalent to a Stage Zero of the disease. If having a gene is considered equivalent to a diagnosis, do GINA’s protections still apply? The Affordable Care Act prevents health insurers from discriminating based on preexisting conditions, but it does not bind employers or other types of insurers. (The ACA may change dramatically under the coming Trump presidency anyway.) And the Americans With Disabilities Act might not cover the gray area between an early manifestation of a disease and the stage at which it’s considered a disability. FORCE and other advocacy groups—including the ALS Association and the Michael J. Fox Foundation—as well as members of the National Society of Genetic Counselors, are working in a few states to pass laws that close the gaps left by GINA, as Florida did in 2020, but so far they have been mostly unsuccessful.

Genetic testing has only just become common enough in the U.S. that insurers might bother asking about it, Hercher said. Recently, groups like Schlager’s have been hearing more and more anecdotes. “People are so worried about genetic discrimination that they are failing to sign up for research studies or declining medically recommended care because of the concerns of what could happen to their insurance,” Anya Prince, a professor at the University of Iowa College of Law, told me. Carolyn Applegate, a genetic counselor in Maryland, told me that when patients come to her worried about a hereditary disease, she typically advises them to line up all the extra coverage they might need first—then hand over their DNA to a lab.

So far, these unintended consequences of genetic testing seem to be manifesting mostly for people at risk for rare diseases linked to single genes, which, combined, affect about 6 percent of the global population, according to one estimate. But the leading killers—heart disease, diabetes, and the like—are influenced by an as-yet-unknown number of genes, along with lifestyle and environmental factors such as diet, stress, and air quality. Researchers have tried to make sense of this complex interplay of genes through polygenic risk scores, which use statistical modeling to predict that someone has, say, a slightly elevated chance of developing Alzheimer’s. Many experts think these scores have limited predictive power, but “in the future, genetic tests will be even more predictive and even more helpful and even more out there,” Prince said. Already, if you look deep enough, almost everyone’s genome registers some risk.
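At its core, a polygenic risk score is just a weighted sum: each tested variant contributes its estimated effect size multiplied by the number of risk copies a person carries. Here is a minimal sketch in Python, using hypothetical variant names, weights, and genotypes rather than any real test’s data:

```python
# Toy polygenic risk score: a weighted sum of how many copies of each
# risk variant a person carries (0, 1, or 2), scaled by that variant's
# estimated effect size. All names and numbers are hypothetical.

effect_sizes = {
    "variant_A": 0.12,
    "variant_B": 0.07,
    "variant_C": 0.03,
}

genotype = {
    "variant_A": 1,  # one copy of the risk allele
    "variant_B": 2,  # two copies
    "variant_C": 0,  # none
}

score = sum(effect_sizes[v] * genotype[v] for v in effect_sizes)
print(f"Polygenic risk score: {score:.2f}")  # 0.26

# Real scores aggregate thousands to millions of variants, then compare
# the result against a reference population to rank a person's risk.
```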

[Read: What happens when you’re convinced you have bad genes]

In aggregate, such information can be valuable to companies, Nicholas Papageorge, a professor of economics at Johns Hopkins University, told me. Insurers want to sell policies at as high a price as possible while also reducing their exposure; knowing even a little bit more about someone’s odds of one day developing a debilitating or deadly disease might help one company win out over the competition. As long as the predictions embedded in polygenic risk scores prove right even slightly more often than chance, they could help insurers make more targeted decisions about whom to cover and what to charge them. As we learn more about what genes mean for everyone’s health, insurance companies could use that information to dictate coverage for ever more people.
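The actuarial arithmetic behind that incentive is simple. A hedged illustration, with every figure hypothetical:

```python
# Hypothetical figures only: how a small shift in predicted risk
# changes what a policy is expected to cost the insurer.

payout = 500_000        # amount paid out if a claim occurs
baseline_risk = 0.02    # assumed claim probability with no genetic data
flagged_risk = 0.03     # assumed probability for an applicant a score flags

print(payout * baseline_risk)  # 10000.0 -- expected cost, unflagged
print(payout * flagged_risk)   # 15000.0 -- expected cost, flagged

# A one-percentage-point difference in risk is worth $5,000 per policy
# here, which is why even weakly predictive scores could shape pricing
# and coverage decisions at scale.
```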

Bill still doesn’t know whether he will ever develop ALS. The average age of onset is 40 to 60, but many people don’t show symptoms until well into their 70s. Without long-term-care insurance, Bill might not be able to afford full-time nursing care if he someday needs it. People who do develop ALS become unable to walk or talk or chew as the disease progresses. “Moving people to the bathroom, changing the sheets, changing the bedpans,” Bill said—“I dread the thought of burdening my wife with all of those things.”

Cases like Bill’s could soon become more common. Because scientists’ understanding of the human genome is still evolving, no one can predict all of the potential consequences of decoding it. As more information is mined from the genome, interest in its secrets is sure to grow beyond risk-averse insurers. If consumer-facing DNA-testing companies such as 23andMe change their long-standing privacy policies, go bankrupt, or are sold to unscrupulous buyers, more companies could have access to individuals’ genetic risk profiles too. (23andMe told me that it does not share customer data with insurance companies, and its CEO has said she is not currently open to third-party acquisition offers.) Papageorge told me he could imagine, say, scammers targeting people at risk for Alzheimer’s, just as they often target older people who may fall for a ploy out of confusion. All of us have glitches somewhere in our genome—the question is who will take advantage of that information.

The reelection of Donald Trump might seem like doomsday for America’s public-health agencies. The president-elect has vowed to dismantle the federal bureaucracy. Robert F. Kennedy Jr., potentially his next health czar, wants to go even further. As part of his effort to “Make America healthy again,” Kennedy has recently promised to tear up the FDA and its regulations, including those governing vaccines and raw milk. But that effort is going to run into a major roadblock: the “deep state.”

The phrase deep state might trigger images of tinfoil hats. After all, Trump has spent much of the past eight years falsely claiming that Democratic bureaucrats are unfairly persecuting him. But operating within the federal health agencies is an actual deep state, albeit a much more benign and rational one than what Trump has talked about. And he might not be able to easily tear it down.

Whether you know it or not, you’ve likely seen this deep state in action. It was the reason Trump’s preferred treatment for COVID during the early phases of the pandemic, hydroxychloroquine, was not flooding pharmacies. And it was why COVID vaccines were not rushed out before the 2020 presidential election. Both of those efforts were stopped by civil servants, despite overt pressure from Trump and officials in his administration.

Public-health officials didn’t buck Trump to sabotage him. They did so because both measures were scientifically unsound. Vaccines weren’t authorized before the election because FDA officials knew that they had to wait at least two months after the clinical trials were completed to make sure the vaccines didn’t cause dangerous side effects. And the FDA blocked the use of hydroxychloroquine for treating COVID because of the drug’s unproven efficacy and spotty safety record.

If they had really wanted to, health officials could have caved to Trump’s requests. But in general, they don’t easily renounce their empirically grounded views on science—regardless of who is president. The FDA’s top vaccine regulator vowed to resign in 2020 if the agency relented to Trump’s pressure to approve vaccines early. Two other vaccine regulators resigned in the first year of the Biden administration after the FDA announced the rollout of COVID boosters. Following their resignations, the ex-officials publicly argued that “the data simply does not show that every healthy adult should get a booster,” and that public-health efforts should have been entirely focused on “vaccinating the unvaccinated, wherever they live.”

Many scientists, lawyers, and doctors are involved in every decision that federal health agencies make, because the decisions must be evidence-based. Arbitrary decisions based on conspiracy theories or political whims can, and will, be challenged in court. “A new administration absolutely can come in and set new policies,” Lowell Schiller, who led the FDA’s office of policy during part of Trump’s first term, told me. But, he added, “there is a lot of law that they need to follow, and things have to be done through proper process.”

Some changes that may seem relatively insignificant require reams of paperwork. When the FDA wanted to revoke the standardized federal definition of frozen cherry pie (yes, one existed until earlier this year), it had to go through a formal procedure that forced the agency to defend both its legal authority to make the move and the costs and benefits of a more laissez-faire cherry-pie policy. The process took more than three years. Few things are harder than approving or revoking approval for a drug: In 2020, the FDA tried to pull an unproven drug meant to prevent preterm births. Despite ample evidence that the drug was ineffective, the process took nearly three years. Now imagine how things would go if RFK Jr. pressured the FDA to pull a vaccine off the market because he is convinced, incorrectly, that it causes autism.

A Trump administration could do a few things more easily. It could, for example, direct the FDA to stop enforcing the agency’s restrictions on some of the products that Kennedy touts, such as raw milk and certain vitamins. The FDA often declines to go after various products in the name of “enforcement discretion.” A downturn in enforcement actions might anger some within the agency, but Trump could bring that about with little red tape.

Kennedy has promised mass firings at the FDA, presumably to install loyalists who would carry out his plans. That threat should be taken seriously. The president has sweeping power to hinder officials who muck up his agenda. The Trump administration allegedly demoted one top federal official who pushed back against authorizing hydroxychloroquine.

But there are major checks, too, on what a president can do to turn the screws on civil servants. Unlike many workers, federal employees can be fired only for cause or misconduct, and civil servants are entitled to appeals in both cases. “It’s a tangled process that makes it hard to be able to get rid of people,” Donald Kettl, an emeritus public-policy professor at the University of Maryland, told me. Trump was famous for firing people during his first term, but the people who got the axe were political appointees who did not have the same protections as civil servants. In short, few federal employees last just one Scaramucci.

However, one major threat still looms over federal workers. In his first term, Trump pursued an effort to reclassify federal workers in a way that would strip many of them of their protections, and he has said that in his second term he will “immediately” pursue that action. Trump would have to go through an arduous process to make good on that threat, and it would likely be challenged in court. But if implemented, the policy could give Trump massive leverage to fire workers.

Still, Trump would take those actions at the peril of his own agenda. The reality is that the same members of the so-called deep state whom Trump and Kennedy are threatening to fire are also essential to accomplishing anything the administration wants to do. Central parts of the “Make America healthy again” agenda would have to run through this deep state. If Kennedy, a champion of psychedelics, wants the FDA to approve a new psilocybin-based treatment, the medicine must be reviewed by the scientists and doctors who review other drugs for safety and efficacy. If he wants a national ban on fluoride in water, that must go through the EPA. There is no way around this: Even if Trump appointed Kennedy as the unilateral king of every single federal health agency, Kennedy could not make these decisions on his own.

A central tenet of the “Make America healthy again” agenda is removing potentially dangerous chemicals from food. Although the FDA has been slow to ban certain chemical additives, the agency seems to have recently seen the light: Earlier this year, it set up a new initiative for reassessing the safety of these substances. But if Kennedy guts the FDA, no one might be left to do that review.

The Trump administration could hypothetically hold a massive job fair to get cronies into all of those roles—especially if the president-elect makes good on his promise to make hiring and firing bureaucrats easier—but few people can successfully perform these highly technical jobs, not to mention that hiring in the federal government typically takes forever. (The average hiring time in 2023 was 101 days.)

Still, Trump’s second term will pose one of the biggest challenges our federal health system has ever faced. No president in modern history has been so intent on bending health agencies to his will, and he seems even more emboldened to do so now than in his first go-round. Trump will likely have some successes—some people may be fired, and some important policies may be scrapped. America is about to find out just how resilient the deep state really is.

Americans don’t typically have a reason to think about the fluoride in their water, but this is not a typical week. Robert F. Kennedy Jr., the former independent presidential candidate whom Donald Trump is eyeing as his health czar, has vowed to remove the mineral from drinking water if he is appointed to the next administration. Kennedy has said that the chemical lowers children’s IQ, even though studies overwhelmingly show that it is safe. Trump apparently agrees, and in his victory speech on Wednesday, he told Kennedy to “go have a good time” working on public health.

The prospect of giving Kennedy any semblance of power over the nation’s health is alarming, and not only because of his preoccupation with fluoride. (To be fair, many scientists have raised serious, nuanced questions about fluoride.) Kennedy, an environmental lawyer with no background in health, is best known for his skepticism, if not outright antagonism, toward vaccines. He also has a long track record of championing other pseudoscientific and conspiratorial views, such as the baseless belief that antidepressants are responsible for mass shootings.

When I looked up his full “Make America Healthy Again” platform, I expected to see wacky conspiracies. Instead, its goals could have been pulled from any liberal public-health campaign. MAHA’s key ambitions include addressing the root causes of chronic disease, improving the food supply through regenerative agriculture, preserving natural habitats, eliminating corporate influence from government health agencies, and removing toxins from the environment. The campaign acknowledges the need for systemic interventions such as increasing access to nutritious food and prioritizing preventative health care, initiatives touted by Democrats such as Michelle Obama.

MAHA represents a mix of concerns from across the political spectrum. “The issues he’s bringing up when it comes to health and food are more recognizably left,” Rachel Meade, a political scientist at Boston University who has studied Kennedy’s politics, told me. Blaming our health problems on corporations is also a move from the left’s playbook, Meade said. Indeed, Bernie Sanders has spent the past year railing against Ozempic’s manufacturer for making the drug so expensive. Assessed only by its goals and not its remedies, MAHA makes a lot of sense. That’s also what makes it dangerous.

Everyone can agree that “removing toxins from the environment,” one of MAHA’s stated goals, is a good idea. But not everyone agrees on what a toxin is. Fluoride is one, from Kennedy’s perspective. MAHA rightly points out that America’s “poor diet” must be addressed. But what counts as a good diet? To Kennedy, it might include raw milk, which poses serious health risks. Addressing “inadequate healthcare” is crucial, of course—but to Kennedy, that could entail treating COVID with ivermectin and hydroxychloroquine, alternative remedies that have been proven not to work.

RFK Jr.’s goals aren’t the only part of his platform that may appeal to more than conspiracy-addled Trump supporters. His overarching diagnosis of the nation’s health problems is that Americans are being misled by bad science and the institutions that support it. “Once Americans are getting good science and allowed to make their own choices, they’re going to get a lot healthier,” he said in an interview with NBC on Wednesday. This notion—that people should do their own research and take their health into their own hands—resonates widely. The belief that scientific institutions aren’t working spans the political spectrum, bringing together subcultures including anti-vaxxers, seed-oil truthers, carnivore-diet enthusiasts, and wellness influencers.

Kennedy himself is politically slippery. He was a Democrat until 2023, when he campaigned for president as an independent before dropping out and endorsing Trump. His anti-vaccine beliefs are historically associated with crunchy liberals, and his environmental views align with the left. But he fits in easily among Republicans, too. Many on the right adopted anti-vaccine views during COVID. More pertinently, his anti-establishment attitude toward health fits neatly in Trump’s us-versus-them narrative. Kennedy, like Trump, thinks of himself as a populist; he frames public-health issues in terms of corrupt institutions duping everyday people, regardless of their party. The bipartisan alliance formed around opposition to mainstream public health has created a strange new faction that counts Kennedy among its figureheads. One way to think of it, as my colleague Elaine Godfrey has written, is “woo-woo meets MAGA.”

MAHA appeals to this group—and could perhaps expand it. “Anti-establishment populism that has aspects of both left and right is a prominent narrative in alternative media spaces,” Meade said. Kennedy’s skepticism about health resonates among followers of influencers such as Russell Brand and Joe Rogan, who frequently entertain health-related conspiracies; Kennedy himself has been on their podcasts. Like Kennedy, many of them are disaffected former Democrats whose politics can be hard to pin down: Although Rogan endorsed Trump, he has called himself a “bleeding-heart liberal.”

It’s still possible that Kennedy might not get a prominent job in the Trump administration. His wariness of corporations doesn’t jibe with Trump’s embrace of them, and Trump has already made clear that environmental concerns won’t be a priority: “Bobby, stay away from the liquid gold,” he said in his victory speech, referring to oil. Kennedy’s history of bizarre behavior—including dumping a bear corpse in Central Park—may give some in Trump’s inner circle pause.

Even if Kennedy never joins the Trump administration, his ideas will continue to have broad appeal. America has seen what happens when people lose trust in public-health institutions. Pandemics drag on because people are afraid to get vaccinated. Measles outbreaks return to schools. People drink bleach. And maybe soon, Americans will no longer be drinking fluoridated water.
