Updated at 9:34 a.m. on June 25, 2025

Vaccine experts in the United States have long considered the case on thimerosal closed. A chemical preservative that stamps out contamination in vaccine vials, thimerosal was removed from most U.S. shots more than two decades ago over worries that its mercury content could trigger developmental delays. But those concerns—as well as baseless claims that thimerosal causes autism—have been proved unfounded, many times over. “We took care of this 20 years ago,” Kathryn Edwards, a pediatrician and vaccine expert at Vanderbilt University, told me.

That’s not how anti-vaccine activists see the compound. Even the strongest data supporting thimerosal’s safety have not quelled the concerns of those who insist on the chemical’s harms. And now the CDC’s Advisory Committee on Immunization Practices, or ACIP, has signaled that thimerosal’s presence in vaccines should remain open for debate. The panel is scheduled on Thursday to discuss the compound, which is present in a minority of flu shots in low or trace amounts, and vote on how vaccines containing it should be used.

The panel that will meet this week is more skeptical of vaccines than any version in ACIP history. Earlier this month, Health Secretary Robert F. Kennedy Jr. abruptly dismissed all 17 existing members of the panel—among them, some of the nation’s foremost experts in vaccinology, infectious disease, pediatrics, and public health—and replaced them with eight new members who largely lack expertise in vaccines and, in several cases, have espoused anti-vaccine viewpoints. This new panel will hear a presentation on thimerosal not from a career vaccine scientist—as is usual ACIP practice—but from Lyn Redwood, one of the first vocal advocates of the false notion that thimerosal causes autism and the former president of Children’s Health Defense, the anti-vaccine organization that Kennedy chaired until 2023.

ACIP’s charter is to evaluate the data and guide the country’s approach to vaccines. By reopening the case on thimerosal, Kennedy’s handpicked committee has already chosen to entertain a classic anti-vaccine talking point. If the new ACIP’s vote further limits the use of vaccines containing the compound, it will also show, from the get-go, how willing it is to disregard evidence.

A multitude of studies, going back more than 20 years, have shown that thimerosal has no link to autism. Children who have received thimerosal-containing vaccines aren’t at higher risk of developing autism. Nor has removing the compound from much of the vaccine supply in multiple countries—including the U.S.—decreased autism rates. Instead, autism rates have gone up. (Experts who study autism attribute that rise largely to more awareness and more sensitive diagnostics; Kennedy, meanwhile, insists, without evidence, that the uptick is the work of an “environmental toxin” that “somebody made a profit” on.)

But around the turn of the millennium, experts felt pressured to remove thimerosal from vaccines, especially those given to young children. After studies had linked chronic exposure to high levels of mercury found in fish and whale blubber to developmental delays, scientists began to worry about the element’s effects on the young brain. The FDA kick-started a campaign to suss out the mercury content of the products it oversaw. By 1999, researchers had pinpointed thimerosal as suspect: The levels of the type of mercury found in vaccines containing the compound seemed, at the time, worryingly high, Walter Orenstein, who directed the U.S.’s National Immunization Program from 1988 through 2004, told me. “So there were concerns that it might be harmful to children.” (Autism, notably, wasn’t a consideration.)

No research proved that harm, but the fears seemed theoretically legitimate. “It put us in a very difficult position,” Orenstein said. The studies necessary to thoroughly test whether the thimerosal in vaccines was toxic could take years; in the meantime, kids could suffer unnecessarily. Some experts argued that keeping thimerosal in the vaccine supply wasn’t worth the risk to children’s health—and to public trust in immunization. If the FDA publicized its findings on mercury and the government didn’t take action, “we would look pretty stupid or unconcerned,” Neal Halsey, who was at the time the chair of the American Academy of Pediatrics’ committee on infectious diseases, told me. Plus, thimerosal’s role in vaccines wasn’t technically essential: Its main use was to stave off harmful contamination in multidose vaccine vials, which clinicians repeatedly dip into; with some tinkering, many manufacturers would likely be able to sub in other preservatives, or switch to pricier single-dose containers.

So in 1999, the government and the AAP asked pharmaceutical companies to get rid of the thimerosal in their vaccines as quickly as they could—and advised health-care providers to delay giving the hepatitis B vaccine, which contained the compound, to low-risk newborns.

As it turned out, the compound never posed serious danger. The form of mercury in thimerosal is different from the one found in fish; scientists soon determined that it was excreted from the body faster—which meant that it didn’t pose equivalent risk. No major problems in childhood development could be linked to thimerosal-containing vaccines. At the time of the original decision, “if we’d had full knowledge, we wouldn’t have done it,” Orenstein told me. Thimerosal was, and is, safe.

But that wasn’t the message that anti-vaccine activists took away. Instead, they seized upon the government’s decision as an admission of guilt; multiple mercury-focused anti-vaccine activist groups sprang up. Some of them began to insist, without evidence, that thimerosal caused autism; among the most prominent advocates for that claim was Kennedy himself. The fervor around autism “caught us all by surprise,” Halsey told me. “That’s not what our concern was in 1999.”

And yet, those fears ballooned. In the mid-aughts, several states restricted thimerosal-containing vaccines for children and pregnant women. In some parts of the country, the misinformation yielded misguided treatments: In 2005, a family in Pennsylvania had their 5-year-old autistic son injected with a mercury-chelating chemical in hopes of curing his condition; less than an hour later, the boy died of a heart attack.

By 2001, thimerosal had been removed from most vaccines for Americans under 6. But the compound’s disappearance had costs. Multidose vials are an especially cheap, efficient way to package vaccines; blacklisting thimerosal made many shots more expensive, Paul Offit, a pediatrician at Children’s Hospital of Philadelphia, told me. The speed of the decision spurred confusion too. Shaken by the call to remove thimerosal, some hospitals stopped offering the hepatitis B vaccine to newborns entirely; shortly after, a Michigan baby on a delayed vaccination schedule, born to a mother infected with the virus, died.

Certain scientists, including Offit, still consider the removal of thimerosal a mistake, not least because it made vaccines appear more suspicious. In a press release at the time, the AAP noted that “the current levels of thimerosal will not hurt children, but reducing those levels will make safe vaccines even safer”—a statement that appeared to validate thimerosal’s dangers. In an attempt to preserve public trust, the government instead broke it, Offit said. “They were meeting the anti-vaccine activists halfway.”

Now ACIP seems poised to make a concession to those same anti-vaccine groups. “The fact that it’s come up again is reason for some people to say, ‘Well, there was an issue,’” Edwards told me.

In response to a request for comment, an HHS spokesperson said, “The new ACIP committee is committed to evidence-based medicine, gold-standard science, and common sense. Its recommendations will be grounded in data, not ideology or opinion.” The spokesperson did not address questions about thimerosal specifically or the evidence for once again bringing it under scrutiny.

But the experts I spoke with weren’t optimistic about the forthcoming discussion. In the past, the questions the committee would vote on were usually published weeks in advance, and subcommittees including ACIP members, CDC officials, and independent subject-matter experts vetted evidence and discussed policy options in advance of meetings, Grace Lee, a Stanford pediatrician who formerly chaired ACIP, told me. The new ACIP panel has had no time for that level of preparation. At least one new member, Vicky Pebsworth, has also argued that thimerosal-containing vaccines are dangerous for children and pregnant people in an article published by Children’s Health Defense. And on Tuesday, the night before the meeting began, Kennedy shared a lengthy post on X about thimerosal, citing outdated research, denying the existence of sound studies confirming the safety of thimerosal-containing vaccines, and criticizing “pharma-financed mainstream media’s mantric ritual of dutifully parroting the propaganda tropes spoon-fed them by vaccine makers and their captive regulators.”

The exact proposal that ACIP will vote on hasn’t yet been made public, either. But materials now posted to the CDC’s website hint at the question the group might consider. Redwood’s presentation, which was officially added to the agenda only on Tuesday, includes a series of slides that largely ignores the strong evidence supporting thimerosal-containing vaccines’ safety, misrepresents at least one study, and concludes that “removing a known neurotoxin from being injected into our most vulnerable populations is a good place to start with Making America Healthy Again.” In an unusual move, though, the materials pertaining to Redwood’s presentation also include a CDC report—flagged as “CDC background briefing material,” flanked with asterisks—that reiterates thimerosal’s safety, and the evidence that debunks a link to autism. (Redwood, Pebsworth, and the CDC did not respond to a request for comment.)

Even Senator Bill Cassidy—the chair of the Senate’s health committee, who helped secure Kennedy’s confirmation—seems to be having doubts about these developments. On Monday, he wrote on social media that the new ACIP lacked the expertise to make sound decisions about vaccines, and called for the meeting to be delayed “until the panel is fully staffed with more robust and balanced representation.” (A spokesperson for Cassidy did not respond to a request for comment.)

If ACIP does vote to remove recommendations for remaining thimerosal-containing vaccines, it could create practical problems, Halsey told me. Even though only a minority of flu vaccines would be affected, forcing manufacturers to alter their products on a tight timeline could make it harder to prepare for annual vaccination campaigns. Lower-resourced regions might also struggle to afford single-dose vials.

But the bigger issue with that decision would be this new committee’s brazen disregard for decades of evidence on thimerosal’s safety. The original decision to remove thimerosal was contentious but understandable: a precaution taken in a vacuum of information. This time around, though, the experts have long had the knowledge they need—enough of it that there should be no discussion or vote at all.

In the kitchen, an ingredient’s taste is sometimes less important than its function. Cornstarch has rescued many a watery gravy; gelatin turns juice to Jell-O. Yet the substances that make bread fluffy, hold mayonnaise together, and keep the cream in ice cream have, according to the new stance of the United States government, “no culinary use.”

These natural and synthetic substances, called emulsifiers, are added to processed foods to give them the textures that Americans have come to love. They’ve also become targets in Health Secretary Robert F. Kennedy Jr.’s push to remove many food additives from the American diet. The “Make America Healthy Again” report, published in May, groups emulsifiers with other additives, some of which it says are linked to mental disorders, metabolic syndrome, and cancer. Online, the MAHA crowd echoes claims that emulsifiers are helping drive America’s chronic health problems.

Like seed oils and food dyes, emulsifiers have raised some real health concerns, particularly about gut health. But distinguishing their ill effects from those of the foods they’re in is challenging—and probably a distraction from the diet changes that would really make Americans healthier.

To anyone who’s attempted (and failed) to make a smooth vinaigrette using only oil and vinegar, MAHA’s assertion that emulsifiers have no culinary use is an affront. Any recipe that calls for blending two substances that don’t mix well together requires emulsifiers’ magic touch. Their molecular structure is drawn to watery substances on one end and fat-based ones on the other, bridging ingredients that would otherwise separate. In a vinaigrette, a dollop of mustard does the trick. Mayonnaise, essentially a blend of oil and a water-based acid, such as vinegar, is spreadable thanks to a natural emulsifier: egg yolks. Similarly, adding eggs to milk prevents ice cream from separating into solid milk fat studded with ice shards (yum).

Not all emulsifiers are as recognizable as eggs and mustard. Many commercial ice creams swap eggs for cheaper synthetic emulsifiers. Cake mixes are foolproof because chemicals called propylene glycol esters prevent powdered fats from clumping. Monoglycerides and diglycerides add structure to and extend the shelf life of bread. Xanthan gum thickens creamy salad dressings. The MAHA report makes no distinction between purely chemical emulsifiers and those that are naturally occurring, such as egg yolks and soy lecithin. So far, studies have not definitively identified differences in their effects on human health.

[Read: America stopped cooking with tallow for a reason]

Perhaps because they are so useful, emulsifiers are in about half of supermarket foods sold in the United Kingdom, according to a 2023 study of the country’s four largest supermarkets; one study in France found that they account for seven of the top 10 most-consumed food additives among adults. So far, their prevalence in the U.S. food system hasn’t been studied, but given the dominance of processed food in the American diet, it’s safe to say that we eat a lot of them.

In Kennedy’s view, that abundance of emulsifiers is at least partly responsible for America’s chronic-disease epidemic. In May, he promised to investigate and ban food additives that are “really dangerous.” But so far, the research on emulsifiers doesn’t justify such a label. In 2017, an FDA-led study concluded that seven common emulsifiers didn’t raise any safety concerns at the usual levels of consumption. The agency’s calculations have “a lot of safety built in,” says Renee Leber, a food scientist at the Institute of Food Technologists, a trade group. There’s no reason to expect that Americans would ever consume enough emulsifiers to spark serious health concerns.

Still, looking further into emulsifiers’ health impacts isn’t a bad idea. A growing number of studies suggest that some can harm the gut, perhaps by shifting the balance of the gut microbiome. They may also damage the gut’s protective mucus layer, leaving it more vulnerable to inflammation and bacteria. A few studies suggest a link between the inflammation that some emulsifiers cause and certain illnesses, including Crohn’s disease, metabolic syndrome, and type 2 diabetes. But other research has turned up conflicting results; a study published last year linked a high-emulsifier diet to a better-protected gut.  

Even emulsifier experts aren’t sure exactly what the substances do in the body. Research on how they affect intestinal health is “very much a work in progress,” Benoit Chassaing, a professor at the Institut Pasteur, in Paris, told me. It also still isn’t clear which ones, if any, have the most potential for harm. In a 2021 study, Chassaing and his colleagues used a model to test the effects of 20 common emulsifiers on the gut microbiome. Only two of them—the synthetic emulsifiers carboxymethylcellulose (found in vitamins and dietary supplements) and polysorbate 80 (usually in edible oils and cake icing)—were determined to have lasting negative consequences. Chassaing has also found that some people’s microbiomes are more sensitive to emulsifiers—which is to say, conceivably emulsifiers could have different effects on different people. Without large-scale human trials, none of the research on emulsifiers can be considered conclusive. As the authors behind the 2024 study wrote, “For now, do not feel guilty if you eat ice-cream!” (At least, not because you’re consuming emulsifiers.)

[From the May 2023 issue: Could ice cream possibly be good for you?]

None of this has deterred Kennedy from fearmongering about additives like emulsifiers. Instead, he’s continuing a pattern that by now has become a MAHA signature: In the health secretary’s campaigns against seed oils and food dyes, he has exaggerated modest scientific findings to justify grand allegations that additives drive chronic disease. Some skepticism of these ingredients may be warranted. But Kennedy’s critiques lack nuance at a stage when nuance is all that the current research can provide.

A MAHA-led deep dive into these questions could turn up some genuinely useful information. If certain emulsifiers are especially gentle on the gut, the food industry could use them to replace the ones that might be more irritating. Identifying what makes certain people more sensitive to them could shape criteria for prescribing emulsifier-free diets.

But what Kennedy plans to do about emulsifiers beyond investigating their safety is anyone’s guess. When I asked the Department of Health and Human Services about it, Emily G. Hilliard, a press secretary, told me that “Secretary Kennedy is committed to ensuring transparency in the food supply so that Americans know exactly what’s in their food.” Banning any emulsifiers that might be found to cause serious harm would be prudent, but then foods that contain them would have to be reformulated—a costly, time-consuming endeavor. For some foods, that might not even be an option: Without an emulsifier, natural or synthetic, ice cream “just wouldn’t be plausible,” Leber told me.

If Kennedy aggressively pursues bans or some other type of restrictions, it will be worth stepping back and asking what the administration is really trying to achieve. The health effects of emulsifiers haven’t yet been fully distinguished from those of the foods they’re in (which tend to have high levels of fat, sugar, or both), nor have those of seed oils and food dyes. In fact, the science points to the likelihood that emulsifiers’ potential harms are minor in comparison with more basic nutritional problems. But maybe ditching emulsifiers could act as some roundabout way of nudging Americans toward eating healthier, if Kennedy is prepared to rob us all of ice cream.

[Read: RFK Jr. is taking an axe to America’s dietary guidelines]

In May, Kennedy announced that food additives and processed foods would be the “central focus” of his health administration. But really, that indicates just how unfocused his movement is. The MAHA report rails against American overconsumption of high-sugar, high-fat, ultra-processed foods, yet so far, it hasn’t been able to do much to limit their consumption beyond eliciting a nonbinding promise from Kraft Heinz and General Mills to remove dyes from foods like mac and cheese and Kool-Aid, and encouraging people to cook french fries in beef tallow. Removing or replacing emulsifiers could result in some health gains, but none that are likely to outweigh the health consequences of eating the foods that contain them.

In the early hours of June 13, a baby was born to a brain-dead woman in Atlanta. That woman, Adriana Smith, had suffered a catastrophic stroke in February, in the early weeks of her pregnancy. When the baby, Chance, was born via C-section, he weighed less than two pounds. Last week, doctors at Emory University Hospital, where Smith was initially declared brain-dead, turned off the machines that had been sustaining her organs—and finally returned her body to the care of her family.

Usually, when a patient is declared brain-dead, their family is allowed a few hours or days to say their goodbyes before medical staff disconnect the body from a ventilator. This process hasn’t changed since I was a resident in the intensive-care unit nearly a decade ago. Due to patient-privacy rules, Emory hasn’t said what exactly happened in February following Smith’s apparently devastating and irreversible brain damage, but the standard procedure for a brain-dead patient clearly did not play out. Smith’s mother, April Newkirk, told reporters that the hospital instead kept Smith’s body on life support for more than four months without the family’s input or consent.

In a statement last month, Emory said that its treatment decisions are based, in part, on “compliance with Georgia’s abortion laws.” Georgia’s LIFE Act, which has been in effect since the Supreme Court overturned Roe v. Wade, in 2022, grants full constitutional rights to any “unborn child with a detectable human heartbeat” (usually occurring around six weeks of gestation). This concept is known as fetal personhood, and in Smith’s case, it seems to have pushed doctors into an extraordinary—and extraordinarily troubling—situation.

Four fundamental principles underlie health-care ethics: beneficence, non-maleficence, autonomy, and justice. In my experience practicing women’s reproductive health care, including abortion care, applying these principles simultaneously to the mother and the fetus can be—at the very least—fraught. The needs of the pregnant woman and the fetus are often in conflict, and many of the body’s physiological mechanisms to protect the fetus put even the healthiest pregnant woman at risk. Obstetricians often quip that their field is the hardest and most interesting because they must care for two patients at once, but any doctor who tries to truly uphold their ethical duties to two patients simultaneously risks landing in very murky territory.

To be clear, Smith, as a brain-dead patient, was legally and medically deceased, so precepts about doing good and avoiding harm would not have applied to her, at least clinically speaking. But the ethical principle of autonomy applies even after a patient’s death. (Consider the United States’ strict consent laws around organ donation.) In the difficult cases when a patient—including a pregnant patient—is in such dire condition that she cannot speak for herself, a proxy decision maker, typically the next of kin, must be the one to make decisions about withdrawing care based on knowledge of the patient’s wishes or values. Some people might argue that a fetus, unable to speak for itself, should also be regarded as having some degree of autonomy; still, in Smith’s case, the fetus’s next of kin, not doctors, should have acted as a proxy decision maker.

Yet Newkirk told reporters that prior to Chance’s birth, she was given no say in what happened to her daughter’s body, nor to the fetus inside her. Emory doctors simply followed what they believed the law prescribed. Since Smith’s case came to public attention last month, a spokesperson for the Georgia attorney general’s office has insisted that “there is nothing in the LIFE Act that requires medical professionals to keep a woman on life support after brain death.” But in all the months that Smith was kept on organ support (a more accurate term than life support), the attorney general refused to issue a formal legal opinion.

If Smith’s case sets a standard in Georgia and in other states with fetal-personhood laws, the implications could be far-reaching. Doctors might feel compelled to conduct a pregnancy test on any woman of reproductive age who dies in the hospital or is brought in dead. Some women might even sign an advance directive requesting that, should they die at any point in pregnancy, their organs be sustained so their fetus has a chance of survival. In the most extreme interpretation of the law, one can imagine an entire ward filled with dead women on ventilators, their bodies being used as incubators for tiny embryos.

[Read: The anti-abortion movement’s attack on wanted pregnancies]

This image evokes a crucial debate about justice that already permeates the American health-care system. The United States spends extraordinary amounts of money on medical care at the extremes of life: neonatal intensive care on the one end, and ICU care for the elderly on the other. (As of Sunday night, Chance remains in the NICU. Newkirk has said that she hopes he will make it home, as most babies born at 27 weeks do, typically after weeks of intensive care.) In part, routine critical care for the very, very old and very, very prematurely born reflects a strong cultural pressure against assigning relative value to any individual life. But it does not always reflect the values of individual patients and their families. And it can raise uncomfortable questions about both benefits and costs—for example, how to measure the value of keeping one very sick patient alive versus, say, distributing large-scale public-health benefits to underserved populations.

Such questions become especially pointed if you consider how often the health-care system fails people before they reach the brink of death—and how it fails certain people more often, and more egregiously, than others. Research shows that doctors in the U.S. routinely ignore the health concerns of Black women such as Smith, and that Black mothers and their babies die at dramatically higher rates than their white peers. Justice, then, is not only a question of how doctors assign value to a given human life, but also a question of whose voices they listen to, and whose voices are ignored or silenced. According to Newkirk, back in February, Smith went to Atlanta’s Northside Hospital complaining of severe headaches; doctors sent her home without imaging her brain. (Neither Northside nor Emory Healthcare responded to requests for comment.) A day later, her boyfriend found her struggling to breathe and called 911. She was rushed to a different hospital and at some point fell unconscious. She never woke up.

As a doctor who cares for pregnant women, I know that patients rarely get to choose between a “right” and “wrong” decision. My job is to present all of the imperfect but ethically permissible options, and their implications for the fetus, to the person in front of me. Those options might include seeking treatment elsewhere if my care is limited by the law or other factors. But my years of practice have taught me that the patient, and the patient alone, must be the one to make that difficult, imperfect decision. This approach is a fairly good summary of how most abortion care works in the United States today. But it’s not how all doctors handle such choices. It certainly isn’t what happened at the time of Adriana Smith’s tragic death. Instead, Newkirk’s remarks indicate that no one explained to her exactly how doctors’ decisions may have been constrained by Georgia law, and no one suggested that she might transfer her daughter’s body to another hospital—even another state—where she might have had a say in those decisions.

[Elizabeth Bruenig: Whose choice? Whose life?]

The case of Adriana Smith, though not about abortion, is very much about the ethics of choice. April Newkirk never publicly asked for her daughter’s body to be removed from organ support, but she did insist that the decision should have rested with her family. In an interview last week, Newkirk reiterated that what had happened to her daughter was a breach of autonomy. “I think all women should have a choice about their body,” she said. “And I think I want people to know that.”

Every five years, America’s top nutrition experts jockey to be part of a rite of passage in the field. The federal government chooses a small group of researchers to serve on a committee that spends months poring over scientific literature to answer questions such as: What is the relationship between sweetened beverages and risk of type 2 diabetes? And how does eating saturated fat influence a person’s chance of heart disease? The end result is something called the Dietary Guidelines for Americans—in other words, the government’s official nutrition recommendations. The whole process might seem a bit excessive, if not pointless. Presumably, few Americans even know about this document, and even fewer intentionally use it to guide what they eat. But the recommendations touch the diets of tens of millions of Americans, affecting what food is served in schools and in the military. They also influence the food industry. After the dietary guidelines began more explicitly warning about the risks of added sugar, several major food companies committed to reducing added sugar in their products.

Those guidelines are now on the brink of getting MAHA-ed. It just so happens that 2025 marks five years since the previous version, so they’re now due for an update. Much of the work has already been completed. In December, the Biden administration released the scientific report that is supposed to undergird the guidelines. But Health Secretary Robert F. Kennedy Jr. has since promised to start from scratch and remake the recommendations to match his convictions about how Americans should be eating. Last month, he told Congress that the new dietary guidelines could be released “before August,” teasing big changes. The current version of the document is 149 pages. The forthcoming update, he said, will stand at just four pages that tell people to “eat whole food; eat the food that’s good for you.”

Beyond that, RFK Jr. hasn’t given any more specifics on what his dietary guidelines will include. (A Department of Health and Human Services spokesperson didn’t respond to a request for comment.) It’s a reasonable bet that RFK Jr. will come after his least-favorite foods, such as seed oils, ultra-processed snacks, and synthetic food dyes. If so, he will pave the way for the MAHA diet to become part of many more people’s lives. Kennedy’s dietary guidelines could have a much larger impact on what Americans eat than anything else he has done to date.

For all of his big talk about how Americans are eating unhealthy food that’s making us sick, RFK Jr. has had only middling success so far at enacting change in his short tenure as health secretary. Take food dyes: Kennedy has tried to rid the food supply of most dyes through a handshake agreement with the food industry. The agreement allows food-company executives to decide for themselves whether and when to phase out these products. But by formally discouraging food dyes in the dietary guidelines, Kennedy could effectively block their use in millions of school lunches. Although the lunch program does not need to follow the guidelines word for word, it must be “consistent with the goals” of the government’s official recommendations.

Even if Kennedy doesn’t outright use the dietary guidelines to come for synthetic dyes or any other MAHA villain, his promise to keep the guidelines to just four pages—essentially a leaflet—would mean trashing much of the existing nutritional advice. Nevertheless, RFK Jr. might be onto something, at least directionally. Consider the 2020 version of the dietary guidelines. I read all 149 pages, and at times, they left me utterly perplexed about what a healthy meal actually looks like. The word guidelines implies simple instructions that a person can actually follow. “Don’t eat Oreos” would be a useful nutrition guideline—one that I, myself, should observe more often. “Customize and Enjoy Food and Beverage Choices to Reflect Personal Preferences, Cultural Traditions, and Budgetary Considerations” is not. The report advocates, for example, that people meet their “food group needs with nutrient-dense foods and beverages,” but it struggles to explain exactly what makes a food nutrient-dense. If the concept seems self-explanatory, consider that the guidelines claim that both vegetable oil and sparkling water are nutrient-dense. (They also state that a nutrient-dense burrito bowl would have sliced avocado, but a “typical burrito bowl” would have guacamole.)

Look, nutrition can be complicated. And this is not to say that the guidelines are entirely useless. They do, for example, outline the amount of vegetables that the average person should eat in a day: two and a half cups. But clear directives like these are the exception. Part of the issue is that the dietary guidelines are not written for regular people with questions about their diet. In the early 2000s, the guidelines changed from a document explicitly focused on providing everyday people with actionable advice into a report whose stated goal, according to the heads of the Departments of Agriculture and Health and Human Services at the time, was “to be a primary source of dietary health information for policymakers, nutrition educators, and health providers.” But the fact that the document is meant for experts doesn’t obviate the need for its overarching message to at least be decipherable to the public.

In Kennedy’s telling, the guidelines’ increasing complexity over time is the nefarious work of the food industry. Before even being nominated to lead HHS, Kennedy posted a video decrying that “corporate interests have hijacked” the guidelines. When he promised lawmakers last month to cut the document down to four pages, he also alleged that the guidelines were “clearly written by industry.” It’s true that a sizable portion of the experts who have served on the advisory panel developing the guidelines have had ties to the food industry. One study found that 19 of the 20 experts on the advisory committee for the 2020 guidelines had conflicts of interest. (It’s common for nutrition experts to receive funding from food companies for their research.)

But there’s another potential explanation for the bloat plaguing the guidelines. “I don’t think a conspiracy theory is needed here,” Marion Nestle, an emeritus professor of nutrition at NYU who served on the dietary-guidelines advisory committee in 1995, told me via email. She added that every committee “thought it had to improve on what was done previously.” Consider the 1980 guidelines, a mere 18 pages in total. By 2000, the size of the document had more than doubled, to 39 pages. By 2010, 95 pages. The growing complexity of the guidelines is all the more perplexing because the government’s overarching advice on how to eat healthier hasn’t changed that much over the past 35 years. “They all say the same thing no matter how many pages they use: eat more plant foods; restrict salt, sugar, saturated fat; balance calories,” Nestle said.

One of Kennedy’s particular skills is finding messages that get through to people. So many of his views on nutrition seem to have resonated precisely because they are not full of mealy-mouthed verbiage and caveats. It’s easier to grasp that seed oils are poisonous than it is to understand the nuances of how the fatty acids in these oils are digested in the body. For Kennedy to actually benefit Americans’ health, however, his guidelines still have to reflect reality. (You shouldn’t freak out about seed oils.) Going too far in simplifying dietary messages could further degrade the credibility of the guidelines all the same, warns Michael Pollan, the writer who’s perhaps the most famous concise food messenger. “The challenge always is: How do you simplify science without distorting it?” he told me.

Kennedy’s views of individual ingredients oscillate between commonsense dietary maxims and conspiratorial musings. His views about the risks of added sugar, for example, are far more scientifically founded than his love of beef tallow. That is one of the most troubling elements of Kennedy’s tenure as HHS secretary: Especially when it comes to food, he mixes mainstream views with conspiracy theories. No one can predict exactly which of these views he will glom on to from one day to the next—or which will appear in four pages of dietary guidelines.

In the early 1950s, “Hi-Proteen” powder, one of the first modern protein supplements, hit the market. Initially, it tasted awful. But after its creator, Bob Hoffman, added in Hershey’s chocolate, the flavor improved. (He used a canoe paddle to stir his mixture in a giant vat.) Protein products have come a long way since then. Perhaps they have come too far: Last weekend, at the gym, I was offered a can of lemon-flavored “protein ice tea.” The summery, yellow-striped packaging advertised 15 grams of protein per can, or about the same as what you might get from three eggs.

Apparently protein shakes and protein bars don’t cut it anymore. Americans are so obsessed with protein that even an Arnold Palmer comes infused with it. Perhaps protein iced tea was inevitable. Whenever something is trendy, the food industry can’t help but push things to the extreme—consider “plant-based” peanut butter (as if the spread were not already vegetarian) and gluten-free pumpkin dog biscuits. But even compared with other food trends, the protein situation has gotten out of hand. Just last week, Starbucks announced that it’s piloting a high-protein, banana-flavored cold foam. There is protein water, Kardashian-branded protein popcorn, and “macho” protein pasta sauce. If you want to get drunk while bulking up, consider a protein-fortified pale ale or a “Swoleberry” spiked protein seltzer. Nothing is safe from the protein pandemonium. Name a food, and the protein version of it probably exists.

Even if you, like me, aren’t trying to maximize your protein intake, all of these products can be hard to escape. They have infiltrated every inch of the supermarket: On Monday, I went grocery shopping with the mission of finding the most ridiculous protein-enriched ingredients possible. While preparing my meal, I crunched on ranch-flavored protein tortilla chips (13 grams) and sipped from a bottle of grapefruit-flavored protein water (20 grams). Dinner began with a salad made of “OrganicGirl Protein Greens,” which feature an assortment of mixed greens including naturally protein-rich sweet-pea leaves (5 grams). My main course was chickpea protein pasta (20 grams) and salmon (40 grams). I topped it all off with a frozen peanut-butter-banana bar for dessert (another 5 grams).

[Read: The drink Americans can’t quit]

In total, I ate more than 170 grams of protein on Monday, or the equivalent of 31 medium eggs. According to the federal government’s recommendations, that’s almost four times what someone of my build and activity level needs in a day to maintain a “nutritionally adequate” diet. The official dietary guidelines suggest that a person needs at least 0.36 grams of protein per pound of body weight to stay healthy. That’s not all that much protein. Before my dinner experiment, I had gone through the day without thinking about my protein consumption, and had already surpassed my recommended amount by more than 30 percent. The average American adult regularly exceeds the federal recommendation.
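To make that guideline arithmetic concrete, here is a minimal sketch in Python. The 0.36-grams-per-pound figure comes from the dietary guidelines cited above; the 120-pound body weight is a hypothetical stand-in, since the writer’s actual build isn’t given.

```python
# Back-of-the-envelope check of the protein math described above.
RDA_G_PER_LB = 0.36  # federal guideline: grams of protein per pound of body weight

def daily_protein_floor(weight_lb: float) -> float:
    """Minimum daily protein, in grams, for a 'nutritionally adequate' diet."""
    return RDA_G_PER_LB * weight_lb

target = daily_protein_floor(120)  # assumed 120 lb adult -> about 43 g/day
print(f"{170 / target:.1f}x the recommendation")  # ~3.9x: "almost four times"
```

Under that assumed weight, the day’s 170 grams comes out to roughly 3.9 times the recommended floor, consistent with the “almost four times” figure above.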

So why is protein showing up in iced tea? Some health experts think that the current federal recommendation is insufficient. They believe that for optimal health—to get beyond simply meeting basic nutritional needs—we should be consuming double, if not triple, the recommended amount. Some people—those who strength train, for instance—certainly benefit from increased intake. But for the average person, most experts don’t see the point in going wild with protein, as my colleague Katherine J. Wu has written.

[Read: Should we all be eating like the Rock?]

What makes protein so appealing is that it has been offered as an answer for lots of people’s dietary goals. Want to build muscle? Eat protein. Want to feel fuller for longer? Eat protein. Want to lose weight? Eat protein. The nutrient can indeed help with all of those, but sometimes, the claims turn absurd. Cargill, the food giant, recently suggested that protein might help solve broken marriages: “Protein helps individuals become better parents, partners and employees,” the company wrote in a report this spring. In other words, protein has become synonymous with “healthy.” The message seems to be resonating: Last year, 71 percent of American adults said they were trying to consume more of it.

For food companies, adding protein to virtually everything is an easy way to make their products more alluring. No Starbucks executive is going to suggest a new line of “fat enhanced” cold foam or iced tea with extra carbs. But extra protein—sure. And that’s how we end up in a world of protein mania. The protein shake has given way to protein coffees and protein matchas and protein energy drinks and protein sodas. The protein bar has similarly descended into madness: Last week, Hershey’s announced a “Double Chocolate flavored protein bar” that looks like its normal chocolate bar (Hoffman would be proud). For the purists, there’s the recently launched David bar, named after Michelangelo’s, which bills itself as “the most effective portable protein on this planet.” You can eat protein-fortified vanilla glazed donuts for breakfast, top your double cheeseburger with protein-laced ketchup, and finish the day with protein powder mixed with melatonin that promises a good night’s sleep.

If you’re suspicious of these products, it’s for good reason. Shoppers might think that certain foods are healthier now that they have a protein label slapped on them. Some of the new products are truly good for you—but eating a ton of protein-packed candy (or even just lots of red meat) comes with health risks that could offset whatever dubious benefit all that added protein might provide. A Snickers bar with 20 grams of protein is still a Snickers bar.

By the time I finished my protein dinner, I was starting to feel bloated. Still, I wasn’t quite done. I cued up the trailer for Protein, a film that debuted in U.K. cinemas last weekend. The movie tells the story of “a gym-obsessed serial killer” who “murders and eats a local drug dealer” for—what else?—protein. I took a bite of a protein-packed double-chocolate cookie and hit “Play.”

Until last week, the future of vaccination for human papillomavirus, or HPV, in the United States seemed clear.

For several years, a growing body of evidence has suggested that just a single dose of the vaccine may be as effective as two, offering decades of protection against the virus, which is estimated to cause roughly 700,000 cases of cancer each year. More than 50 other countries have already adopted the one-dose schedule, and many experts hoped that the United States might follow suit this year.

The decision rests, primarily, on the deliberations of the Advisory Committee on Immunization Practices, a vaccine-advisory committee to the CDC. ACIP was initially expected to put to a vote, as early as next week, the questions of HPV-vaccine dosing and, simultaneously, whether to strengthen the recommendations that advise vaccination starting at 9 years of age. Several experts told me that they had tentatively expected both motions to pass, making HPV vaccination easier, cheaper, and quicker. The HPV vaccine is one of the most powerful vaccines ever developed: It is unusual among immunizations in that it durably prevents infection and disease at rates close to 100 percent. If it was deployed more widely, “we could see the end of cervical cancer,” Kirthini Muralidharan, a global-health expert and HPV-vaccine researcher at the Johns Hopkins Bloomberg School of Public Health, told me.

That was before Robert F. Kennedy Jr., the nation’s health secretary, abruptly dismissed all 17 members of ACIP. Among their replacements—each apparently handpicked by Kennedy—are several researchers who have spread misinformation about vaccines or been embroiled in litigation against vaccine manufacturers; at least one of the new members has exaggerated the harms of the HPV vaccine specifically. Now the anticipated votes on the vaccine, among other immunizations, have been removed from the proposed agenda for ACIP’s coming meeting, leaving the fate of the vaccine far murkier.

ACIP has, for decades, been one of the world’s most respected expert panels on vaccines. The group’s charter is to rigorously evaluate the evidence on the immunizations that the FDA has green-lighted. The advice it gives the CDC then helps devise the official immunization schedule that guides how insurers cover vaccines, how states mandate immunizations in schools, and how primary-care physicians advise their patients. Only under the rarest of circumstances has a CDC director rejected the committee’s advice. Effectively, the members of ACIP “decide who gets the vaccine, at what age, and how many doses,” Noel Brewer, a vaccine expert and health-behavior researcher at UNC Gillings School of Global Public Health, who served on ACIP until last week, told me.

The group’s rigorous, data-driven approach is a primary reason the HPV-dosing strategy has yet to change. In particular, the committee was awaiting formal results from a large clinical trial in Costa Rica that has been comparing dosing strategies in adolescent girls. So far, the data, recently presented at a cancer conference, suggest that one dose is just as effective as two, the current CDC-backed regimen. Earlier this year, the ACIP working group focused on HPV vaccines was leaning toward supporting the dose drop, Brewer, who was part of that group, told me. The proposal to routinely recommend the vaccine as early as 9 years of age, he added, seemed likely to pass, too. (Currently, the CDC allows for HPV vaccination as early as 9 years of age, but only actively recommends it starting at 11 years of age.)

Those amendments to HPV-vaccination guidelines would make the shot simpler to get, for a wider range of children—which could dramatically increase its uptake, Gretchen Chapman, a health-psychology researcher at Carnegie Mellon University, told me: “The more you can make getting vaccinated easy and convenient, the higher vaccination rates will be.” Only about 60 percent of 13-to-17-year-olds in the U.S. are up-to-date on their HPV shots—a gap that public-health experts consider a major missed opportunity. That the shot can almost perfectly prevent infection and disease for decades is “like the fantasy we have of vaccines,” Brewer told me. Its rock-solid protection “just keeps rolling.”

But the new ACIP may see matters differently. Kennedy has yet to fill the committee’s roster, but his initial picks include individuals who appear to have a beef with HPV immunization. One member, Vicky Pebsworth, co-wrote an analysis detailing adverse events following HPV vaccination for an anti-vaccine organization, which she serves on the board of. Another new member, Martin Kulldorff, provided expert testimony in cases against the drugmaker Merck over its Gardasil vaccine, the only HPV shot available in the U.S., and received thousands of dollars from plaintiffs who accused the company of downplaying the vaccine’s risks. (A judge in North Carolina overseeing one of those cases ruled in favor of Merck; another, in Los Angeles, is going to trial later this year.) And Kennedy, an environmental lawyer, has himself been instrumental in organizing the litigation campaign against Merck—and has described Gardasil as “the most dangerous vaccine ever invented.” (Under pressure from senators, Kennedy has said that he will relinquish any proceeds from these lawsuits to his son.) He has also falsely claimed that the HPV vaccine—which data show has dramatically reduced rates of cervical cancer in the U.S. and elsewhere—“actually increases the risk of cervical cancer.” (HHS, the CDC, Pebsworth, and Kulldorff did not respond to a request for comment.)

At some point, the current ACIP might see fit to soften the existing guidelines, or even advise the CDC to remove the vaccine recommendations for certain groups. If it does, those decisions could prompt insurers to stop covering the vaccines, or disincentivize health-care providers from offering them to families. The committee could also remove the vaccine from the Vaccines for Children program, which provides shots to kids whose parents cannot afford them. (An earlier agenda for the ACIP meeting scheduled to start on Wednesday included a recommendation vote for the HPV vaccine, as well as a vote on its status in Vaccines for Children; those items no longer appear in the CDC’s draft agenda.)

A few of the experts I spoke with raised the possibility that this new ACIP might still amend the HPV-vaccine recommendation to a single dose, but with a different rationale: not because the members are swayed by the data on its effectiveness, but because they’d support any option that cleaves a vaccine dose from the immunization schedule. Kennedy, too, seems likely to back such a move. “Any window to roll back the number of times a child receives a vaccine injection? He’s going to push for,” Alison Buttenheim, a behavioral scientist at Penn Nursing, told me.

The net effect might at first seem the same: Fewer doses of the HPV vaccine would be on the schedule. But the reasoning behind a decision can matter just as much as the end result. Robert Bednarczyk, an epidemiologist and vaccine researcher at Emory University’s Rollins School of Public Health, noted that, although much of the evidence so far has pointed toward one dose being enough, the case isn’t yet a slam dunk: Some of the trials investigating the single-dose strategy are using different formulations of Gardasil, or non-Gardasil brands, which may perform differently. (The Costa Rica trial, notably, does include the same Gardasil recipe used in the U.S.) And some experts still wonder if the protection offered by a single shot may fade faster than that of a double-dose regimen—a more challenging aspect of vaccine protection to assess without many years of follow-up. If that’s the case, prematurely dropping the second dose could later force the U.S. to add a shot back into the vaccine schedule—a confusing message that could erode trust. The last thing the country needs now is “another hit to public confidence around vaccines,” Bednarczyk said.

How Kennedy and his allies publicly justify these choices, then, matters quite a bit. Vaccines, on the whole, are now being billed by the government not as vital, lifesaving tools, but as unnecessary risks, deserving of additional scrutiny. Of the multitude of vaccines on the childhood-immunization schedule, many people already see HPV “as the troublesome one,” Brewer told me. Its ability to prevent cancer has been underemphasized; some critics have stoked unfounded fears that, because the vaccine guards against a sexually transmitted virus, it will increase promiscuity. And unlike other vaccines recommended in the early adolescent years, such as the meningococcal vaccine and the Tdap booster—which are required by most or all states for entry into secondary school—HPV is mandated for preteens in only a handful of jurisdictions.

All of these pressures make the vaccine more vulnerable to being rejected, Chapman told me. And should Kennedy’s new vaccine team openly discard HPV doses primarily for the sake of dropping a shot, that could set a precedent—for removing other vaccines from the schedule, in part or entirely.

Remember ivermectin? The animal-deworming medication was used so avidly as an off-label COVID treatment during the pandemic that some feed stores ended up going out of stock. (MUST SHOW A PIC OF YOU AND YOUR HORSE, a sign at one demanded of would-be customers in 2021.) If you haven’t heard about it since, then you’ve existed blissfully outside the gyre of misinformation and conspiracies that have come to define the MAGA world’s outlook on medicine. In the past few years, ivermectin’s popularity has only grown, and the drug has become a go-to treatment for almost any ailment whatsoever. Once a suspect COVID cure, now a right-wing aspirin.

In fact, ivermectin never really worked for treating SARS-CoV-2 infections. Many of the initial studies that hinted at a benefit turned out to be flawed and unreliable. By 2023, a series of clinical trials had already proved beyond a doubt that ivermectin doesn’t reduce COVID symptoms or mortality. But these findings mattered little to its fans, who saw the drug as having earned the status of dissident antiviral—a treatment that they believed had been suppressed by the medical establishment. And if ivermectin was good enough to be rejected by mainstream doctors as a cure for COVID, health-care skeptics seemed to reason, then surely it must have a host of other uses too.

As a physician who diagnoses cancer, I have come across this line of thinking in my patients, and found that some were using ivermectin to treat their life-threatening tumors. Nicholas Hornstein, a medical oncologist in New York City, told me that he’s had the same experience: About one in 20 of his patients ask about the drug, he said. He remembers one woman who came into his office with a tumor that was visibly protruding from her abdomen, having swapped her chemotherapy for some ivermectin that she’d picked up at a veterinary-supply store. “It’s going to work any day now,” he says she told him when he tried to intervene.

The idea that ivermectin could be a cancer-fighting agent does have some modest basis in reality: Preliminary studies have suggested that antiparasitic medications might inhibit tumor growth, and at least one ongoing clinical trial is evaluating ivermectin’s role as an adjunct to cancer treatment. That study has enrolled only nine patients, however, and the results so far show that just one patient’s tumor actually shrank, according to a recent scientific abstract. But these meager grounds for hope now support a towering pile of expectations.

Cancer is just one of many illnesses that ivermectin is supposed to heal. According to All Family Pharmacy, a Florida-based company that promotes the compound to fans of Donald Trump Jr., Dan Bongino, Matt Gaetz, and Laura Ingraham on their podcasts and shows, the drug has “anti-inflammatory properties that could help keep the immune system balanced in fighting infection.” (The company did not respond to a request for comment.) In sprawling Facebook groups devoted to ivermectin’s healing powers, the claims are more extreme: The drug can combat a long list of conditions, members say, including Alzheimer’s disease, heart disease, diabetes, autism, carpal tunnel syndrome, crow’s feet, brain fog, and bee stings.

As a medication that supposedly was censored by elites—if not canceled outright by woke medicine and Big Pharma—ivermectin has become a symbol of medical freedom. It’s also a MAGA shibboleth: Republican-leaning parts of the country helped drive an astounding 964 percent increase in prescriptions for the drug early in the pandemic, and GOP members of Congress have used their official posts to advocate for its benefits. Ivermectin can now be purchased without a prescription in Arkansas and Idaho, and other states are considering similar measures.

Health and Human Services Secretary Robert F. Kennedy Jr. has been a particularly strong proponent. In his 2021 book about the pandemic, Kennedy referred to the “massive and overwhelming evidence” in ivermectin’s favor, and invoked its “staggering, life-saving efficacy.” He also argued at great length that the pharmaceutical industry—with the support of Anthony Fauci and Bill Gates—had engaged in a historic crime by attempting to discourage its use. Jay Bhattacharya, the director of the National Institutes of Health, has similarly backed the conspiracy theory that the use of ivermectin was dismissed by “the powers that be” in an apparent ploy to ease the approval of COVID vaccines. (Not everyone in the current administration is a fan: Before he became the FDA’s vaccine czar, the oncologist Vinay Prasad publicly disputed Kennedy’s views on ivermectin, and earlier this year he called its use for cancer “the right’s version of masking on the airplane and praying to Lord Fauci.”) In response to questions about Kennedy’s and Bhattacharya’s current views on ivermectin, the HHS press secretary Emily Hilliard told me that they “continue to follow the latest scientific research regarding therapeutic options for COVID-19 and other illnesses.” She did not respond to questions about Prasad.

The idea of using antiparasitic drugs as cancer treatments was already taking hold by the late 2010s, Skyler Johnson, a Utah radiation oncologist who studies medical misinformation, told me. In January 2017, a man with lung cancer named Joe Tippens started on a dewormer called fenbendazole at the suggestion of a veterinarian. Years before that, in 2012, Daniel Lemoi, who had Lyme disease, had started taking ivermectin after reading a paper on the genetic similarities between humans and horses. Tippens would go on to achieve global fame among desperate cancer patients; Lemoi became an ivermectin influencer during the pandemic.

[Read: How ivermectin became a belief system]

Since then, a gaggle of dubious doctors has worked to bolster the credibility of deworming drugs within alternative-medicine and anti-vaccine circles. Their underlying pitch has become familiar in the past few years: Health experts can’t be trusted; the pharmaceutical industry is suppressing cheap cures; and patients deserve the liberty to choose their own medical interventions. Across the rest of the medical establishment, that worldview is straining doctor-patient relationships. Johnson told me that many of his patients are now skeptical of his advice, if not openly combative. One cancer patient accused Johnson of bias when he failed to recommend ivermectin. The drug is so cheap and effective, this patient had concluded, that Johnson would be out of a job if everyone knew about it. (Johnson told me that he offers patients “the best possible treatment, no matter the financial incentive.”)

Ivermectin has become a big business in its own right. Online pharmacies and wellness shops are cashing in on the deworming craze, with one offering parasite cleanses for $200 a month. Meanwhile, fringe doctors can charge patients who have cancer and other diseases thousands of dollars to prescribe such treatments.

Johnson’s own experience suggests that the cult of ivermectin is growing larger. He told me that he’s seen his patients’ interest in the drug explode since January, when the actor Mel Gibson went on Joe Rogan’s podcast and claimed that three of his friends had beat back their advanced tumors with ivermectin and fenbendazole, among various other potions. “This stuff works, man,” Gibson said. Meanwhile, in the ivermectin Facebook groups—including one with close to 300,000 members—the public can read posts from a woman with breast cancer considering using ivermectin in lieu of hormone treatments; a leukemia patient who has given up on chemotherapy to “see what happens” with antiparasitic drugs; or a concerned aunt wondering if the drugs might help her little niece with Stage 4 cancer.

But ivermectin advocacy is most disturbing in its totalizing form, wherein parasites—which is to say, the pathogens against which the drug truly is effective—are reimagined as the secret cause of many other unrelated problems. In the Facebook groups, members will share images of what they say are worms that have been expelled from their bodies by treatment. (This phenomenon brings to mind a different disease entirely: delusional parasitosis.) One recent post from the daughter of a Stage 4 lung-cancer patient showed a bloody glob that had “dropped down into her mouth.” Commenters debated whether this might be a worm or something else. “Blood clot from Covid vax?” one suggested. A few days later, the daughter gave an update: Her mom had gone to see the doctor, who informed her that she’d likely coughed up a piece of her own lung.

The whole exchange provides a sad illustration of this delirious and desperate time. Before it turned into a conservative cure-all, ivermectin was legitimately a wonder drug for the poorest people on Earth. Since its discovery in 1973, it has become a leading weapon in the fight against horrific infections such as river blindness and elephantiasis. Yet that substantial success now seems to have given birth to a self-destructive fantasy.

A decade ago, the co-discoverers of ivermectin—William Campbell and Satoshi Ōmura—were awarded a Nobel Prize in recognition of their contribution to reducing human suffering. In his formal lecture to the Academy, Campbell offered some reflections on the simple science that gave rise to the treatment, and to its wide array of applications. But his speech contained a warning, too, that any medicine that works so broadly and so well runs the risk of being handed out too often. The more benefits that such a drug provides, he told the audience in Stockholm, “the more we must guard against the hazards of indiscriminate use.”

It’s easy to forget that Robert F. Kennedy Jr.’s assault on vaccines—including, most recently, his gutting of the expert committee that guides American vaccine policy—might have been avoided. Four months ago, his nomination for health secretary was in serious jeopardy. The deciding vote seemed to be in the hands of one Republican senator: Bill Cassidy of Louisiana. A physician who gained prominence by vaccinating low-income kids in his home state, Cassidy was wary of the longtime vaccine conspiracist. “I have been struggling with your nomination,” he told Kennedy during his confirmation hearings in January.

Then Cassidy caved.

In the speech he gave on the Senate floor explaining his decision, Cassidy said that he’d vote to confirm Kennedy only because he had extracted a number of concessions from the nominee—chief among them that he would preserve, “without changes,” the very CDC committee Kennedy overhauled this week. Since then, Cassidy has continued to give Kennedy the benefit of the doubt. On Monday, after Kennedy dismissed all 17 members of the vaccine advisory committee, Cassidy posted on X that he was working with Kennedy to prevent the open roles from being filled with “people who know nothing about vaccines except suspicion.”

[Read: The doctor who let RFK Jr. through]

The senator has failed, undeniably and spectacularly. One new appointee, Robert Malone, has repeatedly spread misinformation (or what he prefers to call “scientific dissent”) about vaccines. Another appointee, Vicky Pebsworth, is on the board of an anti-vax nonprofit, the National Vaccine Information Center. Cassidy may keep insisting that he is doing all he can to stand up for vaccines. But he already had his big chance to do so, and he blew it. Now, with the rest of America, he’s watching the nation’s vaccine future take a nosedive.

So far, the senator hasn’t appeared interested in any kind of mea culpa for his faith in Kennedy’s promises. On Thursday, I caught Cassidy as he hurried out of a congressional hearing room. He was still reviewing the appointees, he told me and several other reporters who gathered around him. When I chased after him down the hallway to ask more questions, he told me, “I’ll be putting out statements, and I’ll let those statements stand for themselves.” A member of his staff dismissed me with a curt “Thank you, sir.” Cassidy’s staff has declined repeated requests for an interview with the senator since the confirmation vote in January.

With the exception of Mitch McConnell, every GOP senator voted to confirm Kennedy. They all have to own the health secretary’s actions. But Cassidy seemed to be the Republican most concerned about Kennedy’s nomination, and there was good reason to think that the doctor would vote his conscience. In 2021, Cassidy was one of seven Senate Republicans who voted to convict Donald Trump on an impeachment charge after the insurrection at the Capitol. But this time, the senator—who is up for reelection next year, facing a more MAGA-friendly challenger—ultimately fell in line.

Cassidy tried to have it both ways: elevating Kennedy to his job while also vowing to constrain him. In casting his confirmation vote, Cassidy implied that the two would be in close communication, and that Kennedy had asked for his input on hiring decisions. The two reportedly had breakfast in March to discuss the health secretary’s plan to dramatically reshape the department. “Senator Cassidy speaks regularly with secretary Kennedy and believes those conversations are much more productive when they’re held in private, not through press headlines,” a spokesperson for Cassidy wrote in an email. (A spokesperson for HHS did not immediately respond to a request for comment.)

At times, it has appeared as though Cassidy’s approach has had some effect on the health secretary. Amid the measles outbreak in Texas earlier this year, Kennedy baselessly questioned the safety of the MMR vaccine. In April, after two unvaccinated children died, Cassidy posted on X: “Everyone should be vaccinated! There is no treatment for measles. No benefit to getting measles. Top health officials should say so unequivocally b/4 another child dies.” Cassidy didn’t call out Kennedy by name, but the health secretary appeared to get the message. Later that day, Kennedy posted that the measles vaccine was the most effective way to stave off illness. (“Completely agree,” Cassidy responded.)

All things considered, that’s a small victory. Despite Kennedy’s claims that he is not an anti-vaxxer, he has enacted a plainly anti-vaccine agenda. Since being confirmed, he has pushed out the FDA’s top vaccine regulator, hired a fellow vaccine skeptic to investigate the purported link between autism and shots, and questioned the safety of childhood vaccinations currently recommended by the CDC. As my colleague Katherine J. Wu wrote this week, “Whether he will admit to it or not, he is serving the most core goal of the anti-vaccine movement—eroding access to, and trust in, immunization.”

[Read: RFK Jr. is barely even pretending anymore]

The reality is that back channels can be only so effective. Cassidy’s main power is to call Kennedy before the Senate health committee, which he chairs, and demand an explanation for Kennedy’s new appointees to the CDC’s vaccine-advisory committee. Cassidy might very well do that. In February, he said that Kennedy would “come before the committee on a quarterly basis, if requested.” Kennedy did appear before Cassidy’s committee last month to answer questions about his efforts to institute mass layoffs at his agency. Some Republicans (and many Democrats) pressed the secretary on those efforts, while others praised them. Cassidy, for his part, expressed concerns about Kennedy’s indiscriminate cutting of research programs, but still, he was largely deferential. “I agree with Secretary Kennedy that HHS needs reform,” Cassidy said.

Even if he had disagreed, an angry exchange between a health secretary and a Senate committee doesn’t guarantee any policy changes. Lawmakers may try to act as though government bureaucrats report to them, but they have limited power once a nominee is in the job. Technically, lawmakers can impeach Cabinet members, but no sitting Cabinet member in American history has ever been impeached and removed from office. The long and arduous confirmation process is supposed to be the bulwark against potentially dangerous nominees being put in positions of power. Cassidy and most of his Republican colleagues have already decided not to stop Kennedy from overseeing the largest department in the federal government by budget. Now Kennedy is free to do whatever he wants—senators be damned.

Robert Malone has a history of arguing against the data. He has called for an end to the use of mRNA vaccines for COVID despite the well-established fact that they reduce mortality and severe illness. He has promoted discredited COVID treatments such as ivermectin and hydroxychloroquine, dismissing studies that show they are ineffective against the coronavirus. Recently, he called reports about two girls in West Texas dying from the measles “misinformation,” even though the doctors who treated the girls were unequivocal in their conclusion.

Now Malone will have a leading role in shaping America’s vaccine policy. He is one of eight new members of the CDC’s Advisory Committee on Immunization Practices, replacing the 17 former members whom Robert F. Kennedy Jr. relieved of their duties on Monday. The re-formed committee will be responsible for guiding the CDC’s vaccine policy, recommending when and by whom vaccines should be used. The doctors and researchers who make up the new ACIP are all, to some degree, ideological allies of Kennedy, who has spent decades undermining public confidence in vaccines. And Malone arguably has the most extreme views of the group.

Malone, a physician and an infectious-disease researcher, readily acknowledges that he defies mainstream scientific consensus. Just this week, he wrote in his popular Substack newsletter that readers should embrace the anti-vax label, as he has done, and oppose “the madness of the vaccine mania that has swept public health and government.” (This was only a day before Kennedy pledged that the new ACIP members would not be “ideological anti-vaxxers.”)

He is also openly conspiratorial. In his best-selling book, Lies My Gov’t Told Me: And the Better Future Coming, Malone alleges that the Chan Zuckerberg Initiative’s grants to news publications (including The Atlantic) were payments “to smear” vaccine critics, and accuses Anthony Fauci of fearmongering to amass power. Last fall, Malone and his wife, Jill, released a follow-up, PsyWar, making the case that the U.S. government is engaged in a vague but diabolical program of psychological warfare against its own citizens. According to the Malones, the CIA, FBI, and Defense Department, along with a “censorship-industrial complex,” have granted the U.S. government “reality-bending information control capabilities.” (They also claim that “sexual favors are routinely exchanged to seal short-term alliances, both within agencies and between contractors and ‘Govies.’”) They envision this corruption spawning a postapocalyptic future in which guns, ammo, horses, and “a well-developed network of like-minded friends” might be necessary for survival. Malone, who lives on a horse farm in Virginia, appears to be already well prepared.

[Listen: How fragile is our vaccine infrastructure?]

Malone’s rise to contrarian glory began in the summer of 2021, when public-health officials were urging hesitant Americans to roll up their sleeves for the new, mRNA-based COVID-19 vaccines. Back in the 1980s, Malone had conducted research on delivering RNA and DNA into cells, which, he and his co-authors suggested in a 1990 paper, “may provide alternative approaches to vaccine development.” That early work lent credibility to his dire warnings that the COVID shots hadn’t been adequately tested, as perhaps did his grandfatherly beard and professorial demeanor. His popularity grew with appearances on Tucker Carlson’s and Glenn Beck’s shows, where he questioned the safety and effectiveness of the mRNA vaccines while touting—and, critics said, overstating—his own role in the development of the underlying technology. It was Malone’s conspiratorial musings on The Joe Rogan Experience that prompted several famous musicians, including Neil Young and Joni Mitchell, to pull their music from Spotify in protest of the platform’s contract with Rogan. Today, Malone’s newsletter, where he shares his anti-vaccine claims and often praises Kennedy, has more than 350,000 subscribers.

Kennedy and Malone have long been intertwined. Kennedy wrote the foreword to Lies My Gov’t Told Me and an endorsement for PsyWar, alleging that the same techniques the Malones described shaped public reaction to the assassinations of his father and uncle. Kennedy’s 2021 book, The Real Anthony Fauci—which alleges that the former director of the National Institute of Allergy and Infectious Diseases spread corruption and fraud—is dedicated to Malone, among others. Since Kennedy was appointed as Health and Human Services secretary, many of his allies in the anti-vaccine world have accused him of moderating his views to be more palatable to lawmakers. But among anti-vaccine activists, Malone’s appointment to the advisory board was taken as evidence that Kennedy remains on their side.

Public-health experts, by contrast, are horrified. “I think that the scientific and medical community won’t trust this committee, and for good reason,” Paul Offit, a pediatrician and former member of the advisory group, told me. He’s heard from fellow public-health experts who are considering forming their own committees to weigh the evidence, “because they won’t trust the conclusions of these people.” Sean O’Leary, the American Academy of Pediatrics’ liaison to ACIP, told me he was “deeply concerned” by Kennedy’s decision to entirely remake the committee. “This maneuver really endangers public health. It endangers children,” he said. He worries that it will lead to disease, suffering, and death among adults and children alike. (Neither Malone nor HHS responded to requests for comment. On X, Malone promised to “do my best to serve with unbiased objectivity and rigor.”)

[Read: RFK Jr. is barely even pretending anymore]

Malone’s appointment is perhaps the strongest sign yet of Kennedy’s willingness to install ideological crusaders in powerful government roles. ACIP’s recommendations are nonbinding, but historically, the CDC has almost always hewed to them. The committee’s verdicts will help determine which vaccines insurance companies and the federal government pay for, decisions that will inevitably shape countless Americans’ immunization habits. Malone’s new role requires in-depth, good-faith examinations of scientific evidence. But he has already earned a reputation for rejecting it.

At the core of every joke about Baby Boomers lies a seed of jealousy. Unlike younger generations, they have largely been able to walk a straightforward path toward prosperity, security, and power. They were born in an era of unprecedented economic growth and stability. College was affordable, and they graduated in a thriving job market. They were the first generation to reap the full benefits of a golden age of medical innovations: birth control, robotic surgery, the mapping of the human genome, effective cancer treatments, Ozempic.

But recent policy changes are poised to make life significantly harder for Baby Boomers. “If you’re in your 60s or 70s, what the Trump administration has done means more insecurity for your assets in your 401(k), more insecurity about sources of long-term care, and, for the first time, insecurity about your Social Security benefits,” Teresa Ghilarducci, a labor economist at the New School, told me. “It’s a triple threat.” After more than half a century of aging into political and economic trends that worked to their benefit, the generation has become particularly vulnerable at exactly the wrong moment in history.

Perhaps the biggest threat to Boomers in the second Trump administration is an overhaul of Social Security, which provides benefits to nearly nine out of 10 Americans ages 65 and older. In an emailed statement, Social Security Commissioner Frank Bisignano wrote, “I am fully committed to upholding President Trump’s promise to protect and strengthen Social Security. Beneficiaries can be confident that their benefits are secure.” But in February, DOGE announced plans to cut Social Security staff by about 12 percent and close six of its 10 regional offices; a quarter of the agency’s IT staff has quit or been fired. Social Security’s long-term outlook was already troubled before Trump, and these drastic reductions make the understaffed agency even less equipped to support those who rely on it. Shutting down field offices means seniors can’t get help in person; less staffing means longer wait times when they call and more frequent website crashes. “When you add hurdles, or cause a slowdown in terms of processing claims, you see losses in terms of benefits,” Monique Morrissey, a senior economist at the Economic Policy Institute, told me. In fact, shutdowns of field offices during the first two years of the coronavirus pandemic corresponded with decreased enrollment in both Social Security and Social Security Disability Insurance, which is available to Americans under 65 who can no longer work for physical or mental reasons.

Social Security cuts will most hurt low-income Boomers, who are the likeliest to rely on benefits to cover their whole cost of living. But even those with more financial assets may depend on Social Security as a safety net. “It’s important to understand that many seniors, even upper-income seniors, are just one shock away from falling into poverty,” says Nancy J. Altman, the president of Social Security Works, an organization that advocates for expanding the program. As a whole, seniors have more medical needs and less income than the general population, so they’re much more financially vulnerable. If you’re comfortably middle-class in your early 60s, at the height of your earning potential, that’s no guarantee that you’ll remain comfortably middle-class into your 70s. In the next few years, Boomers who face more medical bills as they stop working might find, for the first time in their lives, that they can’t easily afford them.

Middle-income seniors are also likely to feel the impact of a volatile market. “They tend to have modest investments and fixed incomes rather than equities, so the type of wealth that will erode over a high-inflation period,” Laura D. Quinby, who studies benefits and labor markets at the Center for Retirement Research at Boston College, told me. After Trump announced 10 percent tariffs on all imported goods in April, the three major stock indexes dropped 4 percent or more. They’ve since recovered, but the erratic market—whipped around by Trump’s shifting proclamations about tariffs—scares many middle-class Boomers, who are watching their retirement savings shrink.

In the near future, older Americans might find themselves paying more for medical care too. Trump’s “big, beautiful bill,” which has passed in the House but awaits a vote in the Senate, would substantially limit Medicare access for many documented immigrants, including seniors who have paid taxes in the United States for years. The bill would also reduce Medicaid enrollment by about 10.3 million people. Although Medicaid is for people with limited incomes of all ages, it supports many older Americans and pays for more than half of long-term care in the U.S. Most seniors require some sort of nursing home or at-home medical care; one study found that 70 percent of adults who live to 65 will require long-term services and support.

[Read: The GOP’s new Medicaid denialism]

That support may soon be not only more expensive, but harder to come by. The long-term-care workforce is disproportionately made up of immigrants, so the Trump administration’s immigration crackdown is likely to reduce the number of people available to take care of seniors—and increase how much it costs to hire them. “If you have no money, you’ll be on Medicaid in a nursing home, and that’s that. But if you’re trying to avoid that fate, you’re now going to run through your money more quickly and be more vulnerable,” Morrissey said.

Seniors with some financial security are more likely to live long enough to contend with the diseases of old age, such as Alzheimer’s and dementia. The Trump administration has cut funding for promising research on these diseases. “Going forward, you’ll find less treatments reaching fruition,” Thomas Grabowski, who directs the Memory and Brain Wellness Center at the University of Washington and works on therapies for Alzheimer’s there, told me. For now, his center has stopped bringing in new participants; as time goes on, he said, it will have to tighten further. (Kush Desai, a White House spokesperson, told me in an email that the cuts to research funded by the National Institutes of Health are “better positioning” the agency “to deliver on medical breakthroughs that actually improve Americans’ health and wellbeing.”)

Changes at the UW Memory and Brain Wellness Center could have dramatic effects on current patients, including Bob Pringle, a 76-year-old who lives in Woodinville, Washington. In April, he started getting infusions of donanemab, an anti-amyloid medication approved by the FDA last year. The drug doesn’t cure Alzheimer’s; it’s designed to slow the disease’s progression, though the utility of donanemab and other Alzheimer’s drugs remains controversial among experts. Pringle, for one, has found donanemab helpful. “With the medication, my decline is a gentle slope, rather than a rapid decline,” says Pringle, whose mother died of Alzheimer’s and whose sister lives in a memory-care facility. “You’re always hopeful that somebody with a bigger brain than you have is working on a cure, and the medication gives us some time until then,” Bob’s wife and caretaker, Tina Pringle, told me. “But right now, because of the funding cuts, our outlook is grim.”

[Read: The NIH’s most reckless cuts yet]

The unknowability of the future has always been a scary part of getting older. The enormous upheaval that the Trump administration has created will only magnify that uncertainty for Boomers. After a historic arc of good fortune, this golden generation now has to contend with bad timing.

Younger generations, including my own, shouldn’t gloat, though: Cuts to Social Security and a halt to medical research could well worsen the experience of aging for generations to come. Younger Americans will likely grow old under challenging conditions too. Unlike the Boomers, we’ll have plenty of time to get used to the idea.
