For most of the past century, the United States’ track record on infectious disease has been quite good. Thanks to major investments in public health, diseases such as smallpox, polio, yellow fever, malaria, measles, rubella, mumps, diphtheria, and tuberculosis have either been obliterated or become vanishingly rare. America “led the charge,” Aniruddha Hazra, an infectious-disease physician at UChicago Medicine, told me. Its approach to public health was expected to set the standard for the rest of the world.

That era might now be over. In recent years, the U.S. had already begun to neglect its public-health infrastructure, weakening defenses necessary for keeping infectious diseases at bay. Now the Trump administration is going beyond poor maintenance. It’s attempting active destruction.

In two months, the administration has slashed funding for biomedical research; stripped universities of grants; triggered the halt of clinical trials, domestically and abroad; gutted the public-health workforce; canceled or postponed the government’s vaccine advisory meetings; and upended the normal practices of federal agencies dedicated to promoting American health. Cuts to foreign aid have marooned HIV and malaria medications in ports and storage facilities; an Ebola outbreak has been left to swell and spread. Domestically, the federal government’s response to a growing measles outbreak has downplayed the protective power of vaccines; Robert F. Kennedy Jr., the new secretary of the Department of Health and Human Services, has suggested that H5N1 bird flu should be allowed to burn through the nation’s poultry; and the administration is reportedly mulling cuts to the CDC’s budget for domestic HIV prevention.

In November 2023, while campaigning for the presidency, Kennedy declared that he wanted the government to “give infectious disease a break for about eight years.” He and the rest of the Trump administration are more than making good on that promise, as they hamper infectious-disease research and the day-to-day work of surveillance and outbreak management. Donald Trump’s America isn’t just giving infectious disease a break. It’s pivoting away from guarding against pathogens to inviting them right in.

In an email, Emily G. Hilliard, HHS’s deputy press secretary, defended Kennedy’s actions, and said that the agency “is committed to promoting radical transparency so Americans can make informed choices regarding their health.” Kush Desai, a spokesperson for the White House, echoed this notion, and blamed “shoddy reporting about the Trump administration’s work” for low public trust in the health-care system.

Before the second Trump administration took office, the United States’ investment in keeping infectious disease at bay had already long been flagging. As the nation succeeded in minimizing infectious threats, people forgot to fear them. Since at least the late aughts, U.S. spending on nearly all aspects of public health has gone flat, or declined; after the start of the coronavirus pandemic, childhood vaccination rates dipped nationwide and failed to bounce back. COVID, rather than reaffirming the country’s commitment to public health, cemented many Americans’ antagonism toward it.

[Read: Inside the collapse at the NIH]

When most of a community buys into public-health interventions, almost everyone can stay safe. But the more people opt out, the more everyone’s health is put at risk. Many Americans have been making that choice more often as of late. During the pandemic, for instance, when repeated calls to vaccinate, quarantine, distance, and mask “bumped up against people’s autonomy,” many Americans chose to hew to what they considered best for themselves as individuals, Lisa M. Lee, a public-health expert and bioethicist at Virginia Tech, told me. Public-health guidelines are not scripture, and Americans will still find plenty of reason to debate when restrictions have gone too far, or when health officials have gotten the guidance wrong. But public health inevitably struggles in a nation where the attitude of “I don’t need to worry about what everyone else is doing; I can control my own health” has been hardening for years, Eleanor Murray, an infectious-disease epidemiologist, told me.

That individualistic perspective is now being translated into national policy, even as infectious threats continue to batter the country. After bird flu began to spread rapidly in dairy cattle, then infect farmworkers, the Biden administration lagged in its attempts to track and contain the virus, then largely left decisions about testing cows for the pathogen—the most effective way to track its spread—up to individual farmers. (Under Joe Biden, the Department of Agriculture disputed that its response was insufficient.) By the time the second Trump administration inherited the crisis, H5N1 had already killed an American—but still, the nation’s new leaders didn’t meaningfully step up the response. In the absence of sufficient cow surveillance, the virus has continued to transmit on farms; without more targeted protection of poultry and dairy workers—those most exposed to H5N1—people have continued to fall sick.

As the measles outbreak that began in Texas in January has grown—now to the point where the U.S. has logged more measles cases so far this year than it did in all of 2024—Kennedy has emphasized the importance of autonomy. Vaccinating the unvaccinated is the fastest way to stop a measles outbreak, and Kennedy has publicly acknowledged that vaccines protect individuals and “contribute to community immunity.” But he’s also repeatedly overstated vaccines’ risks and declined to directly urge parents to vaccinate their children. And his continued framing of the shots as a “personal” choice elides their protective benefits to everyone else. The unvaccinated child killed by the measles outbreak last month almost certainly would not have died had she been vaccinated. And she might never have been infected in the first place had vaccine rates been higher in her community.

The Trump administration’s actions, though, have shown flagrant disregard for the possibility of rising infection rates. Kennedy has pushed good nutrition and vitamin A supplementation as viable solutions to the Texas outbreak—suggesting, essentially, that the country’s approach to the disease should be to try to limit the damage of infections rather than prevent them. (Although vitamin A deficiency can worsen a case of measles, that sort of malnutrition is extremely rare in the U.S., and no supplement can prevent a person exposed to measles from catching it.) Meanwhile, the administration’s freeze on foreign aid halted PEPFAR, a program that has, for more than 20 years, helped deliver HIV antiretroviral drugs to vulnerable populations around the world, and saved more than 25 million lives. With that support gone, up to 20 million people living with HIV—more than 500,000 of them kids—may have been cut off from their lifesaving medications.

[Read: America can’t just unpause USAID]

The administration is also considering a major revamp of domestic HIV funding. Among the targets may be the CDC’s budget for HIV—the source of 91 percent of federal funding for HIV prevention in the U.S., aimed at tracking infections, increasing access to tests and drugs that can help avert new cases, and helping protect at least 1.2 million Americans estimated to be living with the virus. Strip away access to those sorts of crucial resources, and “my patients are going to die,” Jade Pagkas-Bather, an infectious-disease physician at UChicago Medicine, told me. (Hilliard, the HHS spokesperson, told me that “no final decision on streamlining CDC’s HIV Prevention Division has been made.”)

At times, the administration has verged on advocating for exposure to dangerous germs, Murray pointed out. Kennedy, for instance, has praised the benefits of acquiring immunity to measles through infection, saying that those defenses are longer lasting than immunity derived from vaccination—a statement that doesn’t account for the disease’s sometimes deadly and debilitating risks—and baselessly claiming that measles infection might also protect against cancers and heart disease. In recent weeks, he’s also encouraged poultry farms to simply allow bird flu to rip its way through their birds—a proposal that could, at the very least, devastate flocks, and at worst, risk the virus morphing into a form that would be able to spread among humans.

In keeping with his original promise, Kennedy has found plenty of ways to limit biomedical research into infectious disease. The administration has forced the National Institutes of Health to defund research that focuses on LGBTQ populations—including several projects dedicated to HIV—as well as projects that mention vaccine hesitancy. (In an email, Hilliard defended the funding cuts as “part of a broader effort to redirect resources towards more urgent public health priorities—particularly the ongoing research into the safety and efficacy of vaccines.”) Among the grants rumored to be targeted next are hundreds of studies that involve mRNA vaccines, or work in South Africa, where many global-health projects are centered. Losing those projects might directly lead to more deaths. But the terminations send a subtler message, too, Hazra told me: that many of the already marginalized populations most affected by deadly infectious diseases must fend for themselves.

Abandoning the vulnerable, though, won’t make infectious problems disappear—quite the opposite. Left unchecked, diseases spill into new populations, and across borders. Diseases such as measles and polio have been declared eliminated from the United States. Without continued effort, they may not stay that way. And should those diseases reinfiltrate the country permanently, they will sweep over a population ill-equipped to fight them off again. The health infrastructure that the U.S. would have to marshal against them is already weak. More federal layoffs may be coming that could further shrink the public-health workforce. Biomedical research is being upended across fields. And should proposed cuts to Medicaid funding go through, the country will be even less equipped to deliver care to the people who most need it.

[Read: America is botching measles]

The administration’s actions all emphasize personal freedom and individual choice. But those values aren’t compatible with a country free of epidemics. Infectious diseases, by nature, exploit individualism; their version of liberty is to find the unprotected, and spread more freely. The U.S. has the money, technology, and expertise to be a country minimally affected by infectious disease—as it was for decades. Its current leaders have stopped short of saying that they’d embrace a world rife with infectious death. But their actions suggest that this is exactly the kind of sacrifice they are willing to make.

Maybe you were one of the 11.7 million people who watched when, on House M.D., the genius diagnostician Gregory House is roused in the middle of the night by a pounding at the door. A man he just gave a clean bill of health has collapsed and died. House and his colleague Eric Foreman decide to perform an autopsy themselves. Eager to see the man’s heart, House pushes Foreman to plunge a whirring saw into the patient’s sternum. They peer down: Blood seems to be trickling from the wound. “That’s odd,” says Foreman, “almost looks like he’s … bleeding?” There’s a beat. Then the man’s eyes open wide and he screams.

If you missed this particular episode, you’ve surely seen something similar. Autopsies play a starring role on CSI, Bones, and many other prime-time dramas about medicine and forensics. They may very well be the medical procedure that Americans have been most exposed to via their screens, and yet among the most misunderstood. The way these shows depict autopsies is so disastrously wrong that they not only discourage families from opting into these vital procedures; they even diminish doctors’ understanding of how autopsies work.

On television, autopsies happen in dark rooms with blue light. They involve scalpels, bone saws. Jaded techs in lab coats discuss ligature marks and defensive wounds. Doctors frown at something mysterious and disturbing. Many (but not all) of the pathologists performing autopsies are maladjusted weirdos; the nonspecialist doctors, like House, do things that would cause them to lose their medical license. In one episode of Grey’s Anatomy, Sandra Oh’s and Katherine Heigl’s characters carry out an autopsy in direct contravention of the patient’s and family’s wishes, which is, need I say, illegal; in another, the autopsy pathologist callously chews gum and makes a dinner reservation while cutting. In just about every single episode of CSI, techs wearing no protective equipment whatsoever stand in rooms lit like dive bars, feeding slivers of human tissue into machines that whir briefly and spit out a neat list of every foreign substance in the body.

Most Americans will never see a real autopsy, so our impressions of them are formed by TV portrayals. That’s how it was for me, until I spent two weeks observing autopsies in a hospital in Pittsburgh as part of research for a novel. In real life, autopsies are performed in brightly lit rooms. (Forensic autopsies must sometimes be completed in the field if, for example, a body cannot be safely moved.) The autopsist begins with a Y-shaped incision into the sternum and works methodically through the body. Sometimes the internal organs are inspected in situ, but more typically they are removed, washed down, and dissected on a water table. The autopsist doesn’t just pick up an organ, look at it, make a diagnosis, and plop it back in; rather, they catalog as many of the body’s pathologies as possible, whether or not they’re suspected of causing death. They also take care to make sure that none of the evidence of this inspection would be visible in an open-casket funeral. It’s slow, comprehensive work that rarely involves fancy electronics.

Perhaps the most subtly ridiculous aspect of TV autopsies is the lack of personal protective equipment. In real autopsies, the people involved wear head-to-toe PPE—surgical scrubs, armguards, booties, an apron, a face mask, a splash shield, and a cap—because, when you open a human body up, all the blood, bile, and other fluids that a person had in life are still in there. Blood can still ooze from a wound, even when it’s not being pumped through the body. Did the writers of that House M.D. episode think the human body spontaneously exsanguinates when the heart stops?

Errors like these in TV depictions reduce autopsies to ghoulish spectacles, when the procedure is in fact a respectful and uniquely useful tool for understanding how a person lived and died. Families might decline an autopsy for a number of reasons: cultural taboos, simple squeamishness, a sense that the decedent has suffered enough, a desire to take possession of the body right away. But a death need not reach some threshold of suspicion or mystery to merit investigation; many teaching hospitals, where residents might observe autopsies to deepen their understanding of anatomy, will perform an autopsy for free on any patient if the family asks. Anyone can get one, and more people should.

Autopsies, even in cases where death was expected, can serve a real purpose to the living. They help present a full narrative of disease, which can help doctors treat other patients and—if communicated well—allow grieving families to move toward acceptance. Autopsies can catch hereditary illnesses that doctors tend to miss, such as signet-ring-cell carcinomas, and reveal causes of dementia that may have been misdiagnosed as Alzheimer’s. Even in suicides, where the cause of death is usually obvious, autopsies can reveal underlying problems that might have contributed to the deceased’s distress.

[Read: There are no ‘five stages’ of grief]

Despite how useful autopsies are, they have become an endangered species. In one survey conducted at Massachusetts General Hospital in the mid-2000s, residents overwhelmingly said they had never watched an autopsy, much less performed one. And doctors aren’t always comfortable discussing them with patients’ families. Autopsy rates more than halved from 1972 to 2007, plunging to a measly 8.3 percent. In many other countries, the rate is even lower.

Not all hospitals provide autopsy services; in many of those that do, asking the family of a deceased patient whether they would like an autopsy falls to residents, many of whom lack the training to properly describe and clarify the process. In that moment, there’s little to stop the grieving family from calling up all the lurid images they’ve likely seen on TV. Take, for example, a 2020 episode of The Good Doctor literally called “Autopsy.” Shaun Murphy is operating on an ER patient, an unhoused Jane Doe, when her carotid artery “blows out” and she bleeds to death in a matter of seconds. Over the next 40 minutes of screen time, the following things happen:

The hospital denies Murphy permission to do an autopsy. (Apparently, arteries explode every day.) So he tracks down the woman’s estranged son, who also denies him, inspiring Murphy to attack the man’s car, shrieking at him. Nonetheless, the head of pathology (who just broke up with Murphy) allows him to do the procedure. He conducts the autopsy himself, in a dark room and wearing no PPE beyond a cute little apron, while the pathologist stands indulgently behind him. He plucks out the woman’s liver and regards it briefly, then puts it back and exits, without sewing her up or taking a tissue sample. Finally, he shows up at the son’s house to say that his mother had Ehlers-Danlos syndrome, which is hereditary, and that the son must get treatment. All is forgiven.

This is an episode that makes a case for autopsies. The only remotely realistic thing about the episode is the son’s reluctance to consent. And after watching that gruesome exercise, would you?

[Read: Health care in the time of Grey’s Anatomy]

On these shows, the stark truth of a dead body is either milked for shock factor or smoothed over with holograms. Each chemical in a person’s body can be identified with a machine, and each story has a satisfying ending. Everyone knows life isn’t like that. We need to accept that death isn’t, either.

The first time that Mehmet Oz was questioned by the Senate, in June 2014, the atmosphere was not inviting. He’d been hauled in to defend his habit of promoting unconventional supplements for weight loss, including green coffee beans, raspberry ketones, and an Asian tropical fruit called garcinia cambogia, on his daytime-television talk show. “I don’t get why you need to say this stuff,” Claire McCaskill, the Missouri senator who chaired the hearing, told him. “Because you know it’s not true.”

Last Friday, Oz was back before the Senate, this time to be questioned as President Donald Trump’s nominee to run the Centers for Medicare and Medicaid Services. In the interim, despite a turn to politics that included an unsuccessful bid to join the Senate himself, Oz has stayed the course: selling stress-relieving shrubs on social media, for instance, and leveraging his mother’s Alzheimer’s to pitch herbal remedies. Now a physician who was once described by other doctors in an open letter as demonstrating “an egregious lack of integrity by promoting quack treatments and cures in the interest of personal financial gain” may soon be tasked with regulating the health insurance of more than 150 million Americans. But the context of his return to Washington has cast the former TV star in a new, more flattering light: Next to some of the other appointees to the Department of Health and Human Services, even Dr. Oz seems safe and normal.

I’ve had a front-row seat for Oz’s unlikely transformation from maligned to mainstream. In 2013, when I was still in medical school, I launched a public effort to censure him. His exuberant pitches for unproven remedies were harming patients, I contended. I asked medical societies to do more to combat the spread of misinformation. My efforts were rebuffed at first; doctors were worried about infringing on free speech and criticizing professional colleagues. To buttress my campaign, I started collecting anecdotes from viewers of The Dr. Oz Show describing potential harm caused by his advice.

Oz did not respond to any of these efforts at the time. (He also did not respond to a request for comment on this story.) His initial dressing-down in Congress followed soon after, and then in 2015, I helped a group of medical students and residents cajole the American Medical Association into writing guidelines for ethical physician conduct in the media. Oz himself remained unchastened after this previous run of bad press, though. “We will not be silenced. We will not give in,” he told his TV viewers in 2015, while accusing one group of critics of having industry ties and denying that he ever promoted treatments for personal gain. In short, he embraced his reputation as a wellness guru and anti-establishment truth teller—the sort of person who would find a natural home in the “Make America healthy again” movement that has been popularized by Robert F. Kennedy Jr.

Oz is likely to join Kennedy’s Department of Health and Human Services—and assume control of my parents’ health insurance, among so many others’—in the weeks ahead. That prospect would have terrified me in the 2010s, when I first watched him testify before the Senate. But when I saw him do so for a second time on Friday, he no longer struck me as a major threat. Rather, he looked like an anachronism: a charming celebrity physician with a penchant for theatrical claims. In the face of the Trump administration’s chaotic razing of the nation’s biomedical infrastructure, Oz’s brand of hucksterism seems relatively mild, even quaint.

Perhaps that’s why the Senate showed so little interest in his history of hawking suspect treatments. Even Democrats went pretty easy with their questions. Senator Ron Wyden accused Oz of having engaged in “wellness grifting,” and Senator Maggie Hassan said he’d backed “unproven snake oil remedies,” but this was not a central focus of the hearing. “There are many things I said on the show,” Oz said in response. “I take great pride in the research we did at the time to identify which of these worked and which ones didn’t.”

Instead of grilling Oz on his questionable supplement endorsements, the legislators mostly used their time to lobby for niche policy fixes, and Oz in turn displayed an expertise in health-care policy that seemed worthy of his Wharton MBA. He was fluent on the topics of pharmacy benefit managers, prior authorization, insurance payment models, and the Affordable Care Act. He came out in favor of work requirements for Medicaid—a conventionally conservative approach—while also making sure to show some sympathy for health-care consumers, calling the insurance companies that profit from excessive upcoding “scoundrels who are stealing from the vulnerable.”

This all came off as rather serious and boring, in the way that such a hearing really should come off. Compare that with the nomination hearings for Kennedy: When questioned by the Senate, he botched basic facts about Medicare and Medicaid, refused to admit that vaccines don’t cause autism, and accused committee members of being shills for pharmaceutical companies. Dave Weldon, who was Trump’s pick to run the CDC, didn’t even make it to his hearing, which was also scheduled for last week. Why Weldon’s nomination was withdrawn is not exactly clear, but it’s possible he made the error of being slightly too transparent about his suspicions of standard childhood vaccines. When positioned next to Kennedy and Weldon, or to Trump’s picks to run the NIH and the FDA, Oz seems quite conventional. He clearly stated that the measles shot is both safe and effective, while doing little to attach himself to the angry COVID contrarianism expressed by Kennedy and other nominees for leadership at HHS. (HHS did not respond to a request for comment.)

So now we seem to have arrived at the strange moment when a celebrity TV doctor with no significant experience in public administration, a physician who once suggested that pineapple chunks and chia seeds were reasonable treatments for sciatica, can present himself as an unusually rational and stable candidate for leadership in the nation’s public-health establishment. Oz may even become an advocate for a more conventional approach to health-care policy in a department that is now run by someone who touts the benefits of treating measles with cod liver oil. Improbably, the “green coffee beans” guy is poised to be the grown-up in the room.

He was one of the few ICU patients whose face I saw in early 2021, when COVID raged through Los Angeles. As a palliative-care physician, my job was to meet, over Zoom, with the families of intubated patients to ensure that they had complete medical updates and to help them make difficult medical decisions on behalf of their loved one, particularly when that person faced death. But in a surreal departure from my usual practice, I’d never once seen these patients myself: At the time, to minimize exposure to COVID, the only people permitted to enter the ICU regularly were members of the ICU team.

This particular patient had survived several risky surgeries the year before; he’d returned to the hospital that winter after he became short of breath and began to gasp for air, his wife panicking as she watched his lips turn blue. He’d quickly required intubation and medications to increase his dangerously low blood pressure. His wife told me that during his many previous illnesses, she had slept every night in his hospital room, making sure he asked for pain medications and watching movies with him to pass the time. “He fought harder to live when I was there,” she told me.

I dreaded explaining to her, and to all my patients’ families, that our hospital would allow a visit only if a patient became irreparably sick; even then, just one family member, unrecognizably ensconced in layers of PPE, could spend 10 minutes in their room before a physician removed life support. I struggled to tell those families that many of my patients still died alone, their condition sometimes worsening too suddenly and unpredictably for a loved one to arrive in time.

The tragedy of people suffering and dying alone is one of the enduring and unaddressed traumas of the pandemic. During early surges, we restricted visits to stop COVID from spreading. Yet even when the number of infections dropped and rates of transmission slowed, many of these strict policies remained in place, despite the immense damage they did to patients, families, and hospital staff.

Hundreds of thousands of people in the United States died from the disease in those first few years, and many of them died isolated from their friends and family. Intubated and unconscious, they may not have known they were alone, but their family members did, and many developed PTSD. Around the world, ICU staff, too, became distraught and demoralized by caring for an onslaught of people dying in isolation.

Some states have since passed laws codifying families’ rights to visit loved ones in the hospital, but for the most part, this phenomenon has been treated as a painful memory, not something worthy of public conversation. Yet only by facing the ways that the separation of patients and families reshaped the human experiences of grief and loss can we understand why we must do better next time.


In late 2020 and early 2021, Los Angeles experienced a version of what New York faced at the beginning of the pandemic: a terrifying and rapid onslaught of COVID cases. Lines of ambulances wound around packed hospitals; vital oxygen and ventilators dwindled in the hardest-hit areas. Within the hospital, because of social-distancing mandates, my team and I could no longer share an office, and I’d leave work having interacted directly with a few patients outside the ICU and a handful of colleagues. When I left, I drove through streets lined with darkened storefronts, emptied of cars and people. Governor Gavin Newsom had reinstated stay-at-home orders for all regions with less than 15 percent of ICU beds open.

At the time, and during the worst stretches of the pandemic, numbers became our singular focus at the hospital: The number of vaccinated people, remaining ventilators, and local COVID cases all shaped how we cared for the sick and protected staff. Family members suddenly became visitors, and visitors were vectors. And despite witnessing the tragedy of isolated people dying or experiencing illness alone, I resigned myself to the fact that there was no other way to contain viral spread, particularly when COVID infections among hospital staff rose.

But as the months passed, the gains of limiting visitation dimmed as the human cost of preventing family from visiting sick relatives became clearer. In recent years, medical practitioners have recognized that families are not merely a comforting presence but also integral parts of care teams who might help us understand how a person with dementia expresses pain, or how to persuade a stubborn person to participate in physical therapy. Relegating their visits to life’s end felt disrespectful to them and to my patients.

The scenes from my day that began to haunt me, even when I tried to distract myself after work, were of people enduring the consequences of facing illness alone, forcibly separated from their loved ones. In the hospital courtyard, I’d consoled the frightened teenage daughters of a single father and brought in family photos they’d framed for him as he underwent chemotherapy. An older patient with dementia had needed strong medications and restraints to treat delirium, which her husband told me she had never required when he was at her bedside. A gentleman with emphysema had wept, fearing that without his wife he wouldn’t be able to alert a nurse if he couldn’t breathe. The pandemic destroyed the rituals of funerals and memorials, but it also destroyed the rite of helping a loved one through illness.

And when life’s end approaches, I’ve seen over and over again the specific consolation that accompanying a person to their death can offer. Experiencing a loved one’s care together, attending to needs that only kin know, understanding that you were there with them as they faced the ultimate unknown, allows for a sense of agency amid circumstances rife with powerlessness and uncertainty. Before the pandemic, I’d observed how witnessing the details of a person’s illness profoundly helped families tell a story about a loved one’s final days or hours—about applying ChapStick to your father’s drying lips when he could no longer swallow water, or holding your brother’s hand as nausea from an inoperable brain tumor overwhelmed him, or playing music to ease your partner’s confusion as she died from kidney failure.

Instead, the families I worked with during the pandemic understood their loved one’s illness only through the words of masked care teams whose faces they never saw. Over iPads, we could offer only momentary glimpses of their beloved, pointing out the purpose of each machine surrounding them. Deprived of the sensory experience of seeing illness unfold or watching treatment fail, families struggled to make urgent decisions and to come to terms with medicine’s limited ability to reverse COVID’s many devastations.


Hospitals certainly had an obligation to protect staff from COVID infections; especially early in the pandemic, when PPE was limited and no one knew much about this new virus’s behavior, limiting visitors was necessary. During the surge in early 2021, most hospitals had sufficient PPE, and physicians had a better understanding of the virus, but many of my colleagues still became exposed and infected. Whether they had contracted the virus within or outside the hospital was not entirely clear. If family members agreed to wear PPE at all times and to present a negative COVID-test result, perhaps they could have been allowed to visit while minimizing the risk of infecting staff. (Some research has suggested that adequate PPE reduced the risk of transmission, but there aren’t enough studies to determine whether limiting visitors protected staff from infections.)

Every hospital had its own policies, but sometimes, even when visitors had been vaccinated or had tested negative for COVID, visitation rules didn’t change. I often wonder whether things might have been different had more of us on the front lines insisted more forcefully that minimizing deaths wasn’t the same as minimizing suffering, and that the emotional aftermath of these policies, which weren’t always modified even when schools and businesses reopened, was indelibly altering our own lives and those of patients and families.

The alternative to no visitors doesn’t have to be no rules: When legislators have tried to codify visitation rights, in many cases they’ve allowed for at least one designated visitor, for at least one hour a day, while also respecting hospital policies to enforce safety measures to ensure infection prevention. Yet as accounts of the human toll of strict visitation policies began to emerge, hospital administrators could have asked themselves, and their staff, earlier and more often about ways to accommodate visitation while also taking all possible measures to minimize COVID transmission. Without that grace, the extreme version of traumatic loss that patients’ families and medical teams were forced to experience continued, unabated.

When a person can only imagine scenes and invent details of a death, they may face what’s known as ambiguous loss, a term first used to describe the experiences of families whose loved ones had gone missing in Vietnam. Living both with the fact of a death and without any details or evidence of what led to it can keep the loss from seeming truly real. Ambiguity haunts every loss to some degree, but these deaths in isolation denied families the opportunity even to try to witness the unfolding of illness and their loved ones’ last moments.

Imagine seeing, as my patient’s wife did, mere images of a ventilator, bags of antibiotics, and blood-pressure medications hanging from a pole above the person she loved. Imagine only hearing about how he shook from fevers and thrashed about amid spells of confusion. The ICU resident and I pointed out the way his skin had turned sallow, his arms newly marbled by bruises. Even with the maximum support from the ventilator, his lungs struggled to maintain a normal oxygen level, and his heart couldn’t pump blood efficiently, causing his legs to swell and several of his toes to turn blue. But describing all of this, or telling her that blood had begun to pool in his mouth, could not stand in for her experiencing his deterioration herself—pressing her hands against his newly unrecognizable limbs or seeing his lips newly crusted with crimson scabs.

I remember him so well because I had to enter the ICU briefly to speak with a colleague, and his was the first room on my right. I had seen him, as his family had, only through the camera of the iPad used in family meetings; I gasped at the shock of seeing for myself the sharp angles of his face, his dusky toes, and the blood streaking his breathing tube. Nearly two weeks after he first arrived in our ICU, his heart, lungs, and kidneys were all failing. He was dying. And so his wife, who agreed that he wouldn’t want to live this way, on life support, could finally visit.

I learned later that I had just left the hospital when she arrived in the ICU. She wanted to remove her gloves to press his hands into hers, one final time. This, unfortunately, wasn’t allowed, his nurse said. She stepped out of the room for a few minutes to give my patient and his wife privacy. When she returned, my patient’s wife had disappeared. He died half an hour later.

Over the next few days, I called his wife several times, though she never returned my calls. I tried to imagine what might have happened in the minutes she saw her husband for the last time. Perhaps she obeyed the rules. I like to imagine a different scene: Perhaps, in a final, defiant act of love, she removed her gloves, kissed her husband’s hands, and smoothed his hair before slipping out, unnoticed, preparing herself to live in a world without him.

Mycobacterium tuberculosis is a near-perfect predator. In 1882, Robert Koch, the physician who discovered the microbe, told a room full of scientists that it caused one in seven of all deaths. In 2023, after a brief hiatus, tuberculosis regained from COVID its status as the world’s deadliest infectious disease—a title it has held for most of what we know of human history.

Some people die of TB when their lungs collapse or fill with fluid. For others, scarring leaves so little healthy lung tissue that breathing becomes impossible. Or the infection spreads to the brain or the spinal column, or they suffer a sudden, uncontrollable hemorrhage. Lack of appetite and extreme abdominal pain can fuel weight loss so severe that it whittles away muscle and bone. This is why TB was widely known as “consumption” until the 20th century—it seemed to be a disease that consumed the very body, shrinking and shriveling it. On a trip to Sierra Leone in 2019, I met a boy named Henry Reider, whose mix of shyness and enthusiasm for connection reminded me of my own son. I thought he was perhaps 9 years old. His doctors later told me that he was in fact 17, his body stunted by a combination of malnutrition and tuberculosis.

The cure for TB—roughly half a year on antibiotics—has existed since the 1950s, and works for most patients. Yet, in the decades since, more than 100 million people have died of tuberculosis because the drugs are not widely available in many parts of the world. The most proximate cause of contemporary tuberculosis deaths is not M. tuberculosis, but Homo sapiens. Now, as the Trump administration decimates foreign-aid programs, the U.S. is both making survival less likely for people with TB and risking the disease becoming far more treatment-resistant. After decades of improvement, we could return to something more like the world before the cure.

[Read: The danger of ignoring tuberculosis]

Anyone can get tuberculosis—in fact, a quarter of all humans living now, including an estimated 13 million Americans, have been infected with the bacterium, which spreads through coughs, sneezes, and breaths. Most will only ever have a latent form of the infection, in which infection-fighting white blood cells envelop the bacteria so they cannot wreak havoc on the body. But in 5 to 10 percent of infections, the immune system can’t produce enough white blood cells to surround the invader. M. tuberculosis explodes outward, and active disease begins.

Certain triggers make the disease more likely to go from latent to active, including air pollution and an immune system weakened by malnutrition, stress, or diabetes. The disease spreads especially well along the trails that poverty has blazed for it: in crowded living and working conditions such as slums and poorly ventilated factories. Left untreated, most people who develop active TB will die of the disease.

In the early 1980s, physicians and activists in Africa and Asia began sounding the alarm about an explosion of young patients dying within weeks of being infected instead of years. Hours after entering the hospital, they were choking to death on their own blood. In 1985, physicians in Zaire and Zambia noted high rates of active tuberculosis among patients who had the emerging disease now known as HIV/AIDS. TB surged globally, including in the U.S. Deaths skyrocketed. From 1985 to 2005, roughly as many people died of tuberculosis as in World War I, and many of them also had HIV. In 2000, nearly a third of the 2.3 million people who died of tuberculosis were co-infected with HIV.

[Read: Tragedy would unfold if Trump cancels Bush’s AIDS program]

By the mid-1990s, antiretroviral cocktails made HIV a treatable and survivable disease in rich communities. While a person is taking these medications, their viral levels generally become so low as to be undetectable and untransmittable; if a person with HIV becomes sick with tuberculosis, the drugs increase their odds of survival dramatically. But rich countries largely refused to spend money on HIV and TB meds in low- and middle-income countries. They cited many reasons, including that patients couldn’t be trusted to take their medication on time, and that resources would be better spent on prevention and control. In 2001, the head of the U.S. Agency for International Development had this to say when explaining to Congress why many Africans would not benefit from access to HIV medications: “People do not know what watches and clocks are. They do not use Western means for telling time. They use the sun. These drugs have to be administered during a certain sequence of time during the day and when you say take it at 10:00, people will say, ‘What do you mean by 10:00?’” A 2007 review of 58 studies on patient habits found that Africans were more likely to adhere to HIV treatment regimens than North Americans.

In the mid-2000s, programs such as PEPFAR and the Global Fund finally began distributing antiretroviral therapy to millions of people living with HIV in poor countries. PEPFAR, a U.S.-funded initiative, was especially successful, saving more than 25 million lives and preventing 7 million children from being born with HIV. These projects lowered deaths and infections while also strengthening health-care systems, allowing low-income countries to better respond to diseases as varied as malaria and diabetes. Millions of lives have been saved—and tuberculosis deaths among those living with HIV have declined dramatically in the decades since.

Still, tuberculosis is great at exploiting any advantage that humans hand it. During the coronavirus pandemic, disruptions to supply chains and TB-prevention programs led to an uptick in infections worldwide. Last year, the U.S. logged more cases of tuberculosis than it has in any year since the CDC began keeping count in the 1950s. Two people died. But in some ways, at the beginning of this year, the fight against tuberculosis had never looked more promising. High-quality vaccine candidates were in late-stage trials. In December, the World Health Organization made its first endorsement of a TB diagnostic test, and global health workers readied to deploy it.

[Read: America can’t just unpause USAID]

Now that progress is on the verge of being erased. Since Donald Trump took office, his administration has dismantled USAID, eliminating vast swaths of foreign-aid funding and programs. According to The New York Times, hundreds of thousands of sick patients have seen their access to medication and testing suddenly cut off. A memo released by a USAID official earlier this month estimated that cases of multidrug-resistant tuberculosis will rise by about 30 percent in the next few years, an unprecedented regression in the history of humankind’s fight against the disease. (The official was subsequently placed on administrative leave.) Research on tuberculosis tests and treatments has been terminated. Although the secretary of state and Elon Musk have assured the public that the new administration’s actions have not disrupted the distribution of life-saving medicine, that just isn’t true. A colleague in central Africa sent me a picture of TB drugs that the U.S. has already paid for sitting unused in a warehouse because of stop-work orders. (Neither the State Department nor DOGE employees responded to requests for comment.)

Last year, roughly half of all international donor funding for tuberculosis treatment came from the U.S. Now many programs are disappearing. In a recent survey on the impact of lost funding in 31 countries, one in four organizations providing TB care reported they have shut down entirely. About half have stopped screening for new cases of tuberculosis. The average untreated case of active tuberculosis will spread the infection to 10 to 15 people a year. Without treatment, or even a diagnosis, hundreds of thousands more people will die—and each of those deaths will be needless.

By revoking money from global-health efforts, the U.S. has created the conditions for the health of people around the world to deteriorate, which will give tuberculosis even more opportunities to kill. HIV clinics in many countries have started rationing pills as drug supplies run dangerously low, raising the specter of co-infection. Like HIV, insufficient nutrition weakens the immune system, and malnutrition is the leading risk factor for tuberculosis. An estimated 1 million children with severe acute malnutrition will lose access to treatment because of the USAID cuts, and refugee camps across the world are slashing already meager food rations.

For billions of people, TB is already a nightmare disease, both because the bacterium is unusually powerful and because world leaders have done a poor job of distributing cures. And yet, to the extent that one hears about TB at all in the rich world, it’s usually in the context of a looming crisis: Given enough time, a strain of tuberculosis may evolve that is resistant to all available antibiotics, a superbug that is perhaps even more aggressive and deadly than previous iterations of the disease.

[Read: Resistance to the antibiotic of last resort is silently spreading]

The Trump administration’s current policies are making such a future more plausible. Even pausing TB treatment for a couple of weeks can give the bacterium a chance to evolve resistance. The world is ill-prepared to respond to drug-resistant TB, because we have shockingly few treatments for the world’s deadliest infectious disease. Between 1963 and 2012, no new drugs to treat tuberculosis were approved; developing them stopped being profitable once the disease ceased to be a crisis in rich countries. Many strains of tuberculosis are already resistant to the 60-year-old drugs that are still the first line of treatment for nearly all TB patients. If a person is unlucky enough to have drug-resistant TB, the next step is costly testing to determine if their body can withstand harsh, alternative treatments. The United States helped pay for those tests in many countries, which means that now fewer people with drug-resistant TB are being diagnosed or treated. Instead, they are almost certainly getting sicker and spreading the infection.

Drug-resistant TB is harder to cure in individual patients, and so the aid freeze will directly lead to many deaths. But giving the bacteria so many new opportunities to develop drug resistance is also a threat to all of humanity. We now risk the emergence of TB strains that can’t be cured with our existing tools. The millennia-long history of humans’ fight against TB has seen many vicious cycles. I fear we are watching the dawn of another.


This article has been adapted from John Green’s forthcoming book, Everything Is Tuberculosis.

Mycobacterium tuberculosis is a near-perfect predator. In 1882, Robert Koch, the physician who discovered the microbe, told a room full of scientists that it caused one in seven of all deaths. In 2023, after a brief hiatus, tuberculosis regained from COVID its status as the world’s deadliest infectious disease—a title it has held for most of what we know of human history.

Some people die of TB when their lungs collapse or fill with fluid. For others, scarring leaves so little healthy lung tissue that breathing becomes impossible. Or the infection spreads to the brain or the spinal column, or they suffer a sudden, uncontrollable hemorrhage. Lack of appetite and extreme abdominal pain can fuel weight loss so severe that it whittles away muscle and bone. This is why TB was widely known as “consumption” until the 20th century—it seemed to be a disease that consumed the very body, shrinking and shriveling it. On a trip to Sierra Leone in 2019, I met a boy named Henry Reider, whose mix of shyness and enthusiasm for connection reminded me of my own son. I thought he was perhaps 9 years old. His doctors later told me that he was in fact 17, his body stunted by a combination of malnutrition and tuberculosis.

The cure for TB—roughly half a year on antibiotics—has existed since the 1950s, and works for most patients. Yet, in the decades since, more than 100 million people have died of tuberculosis because the drugs are not widely available in many parts of the world. The most proximate cause of contemporary tuberculosis deaths is not M. tuberculosis, but Homo sapiens. Now, as the Trump administration decimates foreign-aid programs, the U.S. is both making survival less likely for people with TB and risking the disease becoming far more treatment-resistant. After decades of improvement, we could return to something more like the world before the cure.

[Read: The danger of ignoring tuberculosis]

Anyone can get tuberculosis—in fact, a quarter of all humans living now, including an estimated 13 million Americans, have been infected with the bacterium, which spreads through coughs, sneezes, and breaths. Most will only ever have a latent form of the infection, in which infection-fighting white blood cells envelop the bacteria so it cannot wreak havoc on the body. But in 5 to 10 percent of infections, the immune system can’t produce enough white blood cells to surround the invader. M. tuberculosis explodes outward, and active disease begins.

Certain triggers make the disease more likely to go from latent to active, including air pollution and an immune system weakened by malnutrition, stress, or diabetes. The disease spreads especially well along the trails that poverty has blazed for it: in crowded living and working conditions such as slums and poorly ventilated factories. Left untreated, most people who develop active TB will die of the disease.

In the early 1980s, physicians and activists in Africa and Asia began sounding the alarm about an explosion of young patients dying within weeks of being infected instead of years. Hours after entering the hospital, they were choking to death on their own blood. In 1985, physicians in Zaire and Zambia noted high rates of active tuberculosis among patients who had the emerging disease now known as HIV/AIDS. TB surged globally, including in the U.S. Deaths skyrocketed. From 1985 to 2005, roughly as many people died of tuberculosis as in World War I, and many of them also had HIV. In 2000, nearly a third of the 2.3 million people who died of tuberculosis were co-infected with HIV.

[Read: Tragedy would unfold if Trump cancels Bush’s AIDS program]

By the mid-1990s, antiretroviral cocktails made HIV a treatable and survivable disease in rich communities. While a person is taking these medications, their viral levels generally become so low as to be undetectable and untransmittable; if a person with HIV becomes sick with tuberculosis, the drugs increase their odds of survival dramatically. But rich countries largely refused to spend money on HIV and TB meds in low- and middle-income countries. They cited many reasons, including that patients couldn’t be trusted to take their medication on time, and that resources would be better spent on prevention and control. In 2001, the head of the U.S. Agency for International Development had this to say when explaining to Congress why many Africans would not benefit from access to HIV medications: “People do not know what watches and clocks are. They do not use Western means for telling time. They use the sun. These drugs have to be administered during a certain sequence of time during the day and when you say take it at 10:00, people will say, ‘What do you mean by 10:00?’” A 2007 review of 58 studies on patient habits found that Africans were more likely to adhere to HIV treatment regimens than North Americans.

In the mid-2000s, programs such as PEPFAR and the Global Fund finally began distributing antiretroviral therapy to millions of people living with HIV in poor countries. PEPFAR, a U.S.-funded initiative, was especially successful, saving more than 25 million lives and preventing 7 million children from being born with HIV. These projects lowered deaths and infections while also strengthening health-care systems, allowing low-income countries to better respond to diseases as varied as malaria and diabetes. Millions of lives have been saved—and tuberculosis deaths among those living with HIV have declined dramatically in the decades since.

Still, tuberculosis is great at exploiting any advantage that humans hand it. During the coronavirus pandemic, disruptions to supply chains and TB-prevention programs led to an uptick in infections worldwide. Last year, the U.S. logged more cases of tuberculosis than it has in any year since the CDC began keeping count in the 1950s. Two people died. But in some ways, at the beginning of this year, the fight against tuberculosis had never looked more promising. High-quality vaccine candidates were in late-stage trials. In December, the World Health Organization made its first endorsement of a TB diagnostic test, and global health workers readied to deploy it.

[Read: America can’t just unpause USAID]

Now that progress is on the verge of being erased. Since Donald Trump has taken office, his administration has dismantled USAID, massively eliminating foreign-aid funding and programs. According to The New York Times, hundreds of thousands of sick patients have seen their access to medication and testing suddenly cut off. A memo released by a USAID official earlier this month estimated that cases of multidrug-resistant tuberculosis will rise by about 30 percent in the next few years, an unprecedented regression in the history of humankind’s fight against the disease. (The official was subsequently placed on administrative leave.) Research on tuberculosis tests and treatments has been terminated. Although the secretary of state and Elon Musk have assured the public that the new administration’s actions have not disrupted the distribution of life-saving medicine, that just isn’t true. A colleague in central Africa sent me a picture of TB drugs that the U.S. has already paid for sitting unused in a warehouse because of stop-work orders. (Neither the State Department nor DOGE employees responded to requests for comment.)

Last year, roughly half of all international donor funding for tuberculosis treatment came from the U.S. Now many programs are disappearing. In a recent survey on the impact of lost funding in 31 countries, one in four organizations providing TB care reported they have shut down entirely. About half have stopped screening for new cases of tuberculosis. The average untreated case of active tuberculosis will spread the infection to 10 to 15 people a year. Without treatment, or even a diagnosis, hundreds of thousands more people will die—and each of those deaths will be needless.

By revoking money from global-health efforts, the U.S. has created the conditions for the health of people around the world to deteriorate, which will give tuberculosis even more opportunities to kill. HIV clinics in many countries have started rationing pills as drug supplies run dangerously low, raising the specter of co-infection. Like HIV, insufficient nutrition weakens the immune system. It is the leading risk factor for tuberculosis. An estimated 1 million children with severe acute malnutrition will lose access to treatment because of the USAID cuts, and refugee camps across the world are slashing already meager food rations.

For billions of people, TB is already a nightmare disease, both because the bacterium is unusually powerful and because world leaders have done a poor job of distributing cures. And yet, to the extent that one hears about TB at all in the rich world, it’s usually in the context of a looming crisis: Given enough time, a strain of tuberculosis may evolve that is resistant to all available antibiotics, a superbug that is perhaps even more aggressive and deadly than previous iterations of the disease.

[Read: Resistance to the antibiotic of last resort is silently spreading]

The Trump administration’s current policies are making such a future more plausible. Even pausing TB treatment for a couple of weeks can give the bacterium a chance to evolve resistance. The world is ill-prepared to respond to drug-resistant TB, because we have shockingly few treatments for the world’s deadliest infectious disease. Between 1963 and 2012, scientists approved no new drugs to treat tuberculosis. Doing so stopped being profitable once the disease ceased to be a crisis in rich countries. Many strains of tuberculosis are already resistant to the 60-year-old drugs that are still the first line of treatment for nearly all TB patients. If a person is unlucky enough to have drug-resistant TB, the next step is costly testing to determine if their body can withstand harsh, alternative treatments. The United States helped pay for those tests in many countries, which means that now fewer people with drug-resistant TB are being diagnosed or treated. Instead, they are almost certainly getting sicker and spreading the infection.

Drug-resistant TB is harder to cure in individual patients, and so the aid freeze will directly lead to many deaths. But giving the bacteria so many new opportunities to develop drug resistance is also a threat to all of humanity. We now risk the emergence of TB strains that can’t be cured with our existing tools. The millennia-long history of humans’ fight against TB has seen many vicious cycles. I fear we are watching the dawn of another.


This article has been adapted from John Green’s forthcoming book, Everything Is Tuberculosis.

Updated at 10:56 a.m. on March 14, 2025

Until the second Trump administration took over, the National Institutes of Health—the world’s single largest public funder of biomedical research—was not in the business of canceling its grants. Of the more than 60,000 research awards the agency issues each year, it goes on to terminate, on average, maybe 20 of them, and usually only because of serious problems, such as flagrant misconduct, fraud, or an ethical breach that could harm study participants. “I have been involved with legitimate grant terminations,” one former NIH official, who worked at the agency for many years, told me. “I can count them on the fingers of one hand.”

Yet, in a few weeks, the administration has forced the agency to terminate so many of its active research grants—all seemingly on political grounds—that none of the dozen NIH officials I spoke with for this story could say for certain how many termination letters had gone out. Most thought that the number was now well above 100, and would likely continue to climb rapidly. This morning, in a meeting of grants-management staff, officials were told that approximately a thousand more grants could be targeted for termination, beginning today, one official told me. If the administration has not already, in a matter of weeks, exceeded the total number of cancellations the NIH has executed in the past decade, it soon will—perhaps within hours.

The NIH—an agency that has long prided itself on its mission of science funded by scientists—spends most of its $47 billion annual budget on driving biomedical innovation: developing new drugs and vaccines, containing epidemics, treating cancer, mitigating the harms of heart disease. But the growing scope of cancellations is revealing how willing Donald Trump’s administration is to claw back those resources for political reasons. (All of the current and former NIH officials I spoke with for this story requested anonymity for fear of retaliation from the federal government; the NIH did not respond to a request for comment.)

This spate of terminations is the Trump administration’s most aggressive attempt so far to forcibly reshape American science to match its agenda. At the same time, this might also be the most ham-fisted. Many officials told me that, as one succinctly put it, “they’re just going in and picking random grants to terminate.” Although the administration has said it doesn’t want to fund science that touches on certain concepts—gender, DEI, vaccine hesitancy—the terminations so far have few discernible criteria, and don’t operate by consistent protocols; in several cases, they end projects that are only tangentially related to the topics the administration wants to purge. If anything, the grant cancellations have become a game of whack-a-mole, in which political appointees take a mallet to any seemingly relevant research projects that pop into view—without regard to the damage they might do.


Notice of grant terminations has arrived from NIH officials, on NIH letterhead. But the decisions about which grants to cancel and why are primarily being made outside the agency, with pressure coming from the Department of Health and Human Services, several NIH officials told me.

The first round of cancellations, which began on the evening of February 28, focused mainly on grants that included a DEI component or involved transgender participants; officials at the agency were also told to cut off funding to projects that allot money to China. Another round, which began on Monday evening, targets grants that mention vaccine hesitancy or uptake; that same night, the NIH posted on X that it would cut $250 million in grants from Columbia University, one of several institutions that the Trump administration’s Department of Education is investigating for “antisemitic discrimination and harassment.” Two officials told me they expect several more rounds of cancellations, and several said that, based on recent emails sent to staff, grants involving mRNA vaccines, as well as grants that send funds to work in South Africa, may be next. (HHS did not respond to a request for comment.)

The list of vaccine-hesitancy-related grants that officials were told to cancel includes dozens of projects. Some—such as a study of vaccine uptake in Alaska Native communities—were perhaps obvious choices, because they so directly addressed vaccine attitudes. But the list also included studies that use vaccine hesitancy as just one of several variables to mathematically model disease transmission. And several researchers who have dedicated their careers to studying vaccine behaviors have not yet heard that their grants have been affected. Alison Buttenheim, a behavioral scientist at Penn Nursing, has been watching colleagues’ grants on vaccine uptake get canceled, but as far as she knows, her own NIH-funded work on vaccine hesitancy is still actively funded, though she expects that to change. “I figure it’s only days until it’s axed,” she told me.

“It’s unclear why some of us are getting them or not,” Brittany Charlton, who directs the LGBTQ research center at Harvard’s school of public health, told me. One of her colleagues, Nancy Krieger, told me that she’d received a termination letter for a study about measuring discrimination in clinical settings (including sexism and stigma about sexual orientation or transgender identity). But Charlton has yet to receive a letter for her own NIH-funded studies, which focus much more directly on LGBTQ populations.

One NIH official put it more bluntly: “It is such utter and complete chaos.” In advance of the terminations, several officials told me, agency leadership solicited lists of grants that might, for instance, “promote gender ideology,” or that involved certain types of vaccine-behavior research. NIH officials responded with curated lists of research projects, in several cases including only the bare-minimum number of grants with the most relevance. But many officials then received back spreadsheets populated with a subset of the grants from their own lists, along with several other grants that made only passing mention of the targeted topics. It was as if, one official told me, someone had performed a Ctrl+F search for certain terms, then copied and pasted the results. Multiple rounds of terminations in, officials at some NIH institutes are still unclear on how this new system of cancellations is supposed to work. Nearly two months after Trump’s executive order on cutting DEI programming, for instance, “we still haven’t gotten a definition of DEI,” one official said.


Typically, each NIH grant is shepherded by a team of officials, including at least one program officer, who oversees its scientific components, and a grants-management officer, who handles the budget. When terminations are on the table, those officials are always looped in—usually so they can help determine how to remedy the situation. “Terminations are the final option,” one NIH official told me.

But these recent directions to terminate arrived without warning or the usual steps of deliberation, and they instructed grants-management officers to issue letters by the end of the day they received them, two officials told me—leaving no time to push back, or even react. “There is zero protocol,” one official told me. “It is just, We are told, and it is done.” In at least one case, an official told me, a program officer learned from the grantee that the grantee’s award had been terminated.

The emailed directives also handed NIH officials prewritten justifications for termination. None cited misconduct, fraud, or even low likelihood of success. But the ones targeting research related to transgender people or DEI claimed that the projects in question were “antithetical to the scientific inquiry,” “often unscientific,” or ignoring “biological realities.” The termination-letter templates also noted the NIH’s obligation to carefully steward taxpayer dollars, accused the projects of failing to employ federal resources to benefit the well-being of Americans, and cited new agency priorities as a reason for ending studies. Letters issued to several researchers studying vaccines, for instance, stated, “It is the policy of NIH not to prioritize research activities that focuses [sic] gaining scientific knowledge on why individuals are hesitant to be vaccinated and/or explore ways to improve vaccine interest and commitment.” The terminations sent to scientists studying LGBTQ populations contained similar language, and in some cases said that their projects “provide low returns on investment, and ultimately do not enhance health, lengthen life, or reduce illness.”

Those assertions, though, directly contradict the conclusions of NIH officials and the outside scientists who helped award those grants in the first place. No project can receive NIH funds without first being vetted by multiple panels of experts in the field, who judge each proposal based on criteria such as the lead scientist’s track record, the rigor of the study’s design, and the project’s likelihood of addressing a pressing biomedical-research issue. And each proposal submitted to the NIH undergoes two layers of internal review, to ensure that the project meets agency policies and is “aligned with the goals of the institute” potentially funding it, one official told me.

Several letter recipients told me that their grants had received perfect or near-perfect scores in early reviews; others told me that their results were well on their way to publication, proof of some return on the agency’s investment. And all addressed important issues in public health: One, for instance, was studying how stress affects alcohol consumption; another, mpox among men who have sex with men; another, the factors that might influence the success of a future HIV vaccine.


The NIH, a federal agency directed by a political appointee, does sometimes shift its priorities for scientific or ideological reasons. For instance, some NIH institutes have over time gotten pickier about issuing awards to candidate-gene studies, in which researchers try to confirm whether a specific gene affects a biological trait, one official told me. And the first Trump administration placed restrictions on research that could be done using fetal tissue. Both of those shifts, officials said, meant that certain new proposals weren’t green-lighted. But in neither case was the agency forced to issue mass terminations of projects that had already been declared worthy of funds, officials told me.

The clearest example that the NIH officials I spoke with could recall of a grant being terminated at the behest of political leadership was also triggered by a Trump administration: During his first term, Trump pressured the agency to terminate a grant that had been issued to the nonprofit EcoHealth Alliance, which was partnering with the Wuhan Institute of Virology, in China. But even that cancellation was partly reversed. In general, “when an administration changes priorities, they change them going forward,” one official said. “They don’t reach back and terminate awards.”

Grant cancellations are tantamount to instantaneous salary cuts for scientists, and can force them to halt studies, fire staff, and tell participants that their time and effort may have been wasted. Jace Flatt, a health and behavioral scientist at the University of Nevada at Las Vegas, has had two NIH grants axed, for projects looking at dementia and memory loss in aging LGBTQ populations. If he loses a third NIH grant—as he expects to, he told me—“my lab is gone.” Because the terminations arrived without warning, scientists also had no time to prepare: Sarah Nowak, a vaccine researcher at the University of Vermont, told me she found out that her grant investigating childhood vaccine hesitancy in Brazil was likely on the chopping block when she read an article on the vaccine-related grant cuts in The Washington Post on Monday. (Nowak received her letter the next day.)

Many studies, once terminated, would be difficult, if not outright impossible, to restart, Sean Arayasirikul, a medical sociologist at UC Irvine, told me. Medical interventions in clinical trials, for instance, can’t simply be paused and picked back up; many studies also rely heavily on collecting data at small and regular intervals, so interruptions are equivalent to massive data holes. Plus, participants released from a study won’t always be willing to come back, especially if they’re from communities that medical research has neglected in the past and that already have little reason to place continued trust in scientists. (Arayasirikul received a termination letter for their work investigating how stigma affects HIV preventive care for people of color who are also sexual and gender minorities.)

Terminating grants to match political priorities also creates a fundamental instability in the government’s approach to scientific funding. If researchers can’t count on grants to carry across administrations, their government-funded work will become a series of short-term sprints, making it harder for science to reliably progress. Biomedical breakthroughs—including, say, the generation and approval of new drugs, or clinical trials for chronically ill patients—typically take years, sometimes even decades. And for an administration that has premised itself on efficiency, a never-ending loop of funding bait and switch does not exactly make for minimizing waste. “This says, At any point, we can just up and change our minds,” one NIH official told me. “That is not good stewardship of federal dollars.”

Many of the administration’s actions might well be illegal—especially its targeting of DEI, which a federal judge recently deemed a potential violation of the First Amendment. But NIH officials have been put “in an impossible position,” one told me. Their choices are to either carry out the administration’s wishes and risk defying court orders or resist the changes at the agency and directly disobey their supervisors, putting themselves “at risk of insubordination and therefore unemployment,” the official said. Many have been choosing the first option, perhaps because the threat of losing their livelihood has felt so much nearer, and so much more tangible: They have now spent weeks watching colleagues resign, get fired, or be abruptly put on administrative leave. The environment at the agency has become suffocatingly toxic. “People are being screamed at, bullied, harassed,” one official told me. Some who once protested have since relented—perhaps because they now know that the immediate future will bring only more of the same.

There seems to be a limit to how anti-vax is too anti-vax in the Trump administration. Yesterday, hours before Dave Weldon was slated to begin his Senate confirmation hearing for CDC director, the White House pulled his nomination. Weldon, a physician and former Republican congressman, has long questioned the safety of vaccines. In a meeting last month, he reportedly told one senator that routine childhood vaccines were exposing kids to dangerous levels of mercury and may cause autism. (Both claims are false.)

Weldon has denied that he’s anti-vaccination, but his views on vaccines seem to have been his undoing. In a written statement he gave to me and other outlets, he suggested that at least two Republican senators were threatening to vote against him, and that this became “clearly too much for the White House.” But those two senators, Bill Cassidy of Louisiana and Susan Collins of Maine, voted to confirm Robert F. Kennedy Jr., an ardent vaccine critic who would have been Weldon’s boss as health secretary. Perhaps Weldon’s biggest problem was that he said the quiet part out loud. During his confirmation hearings, Kennedy sidestepped calls for him to declare unequivocally that vaccines do not cause autism, and appeared to convince lawmakers that he’d let Americans make their own decisions about vaccines. “I support the measles vaccine. I support the polio vaccine. I will do nothing as HHS secretary that makes it difficult or discourages people from taking either of those vaccines,” Kennedy told senators.

Kennedy is already breaking that promise. As cases of measles are popping up in states across the country—leading to America’s first measles death in a decade—he has propped up unproven treatments such as cod-liver oil. Though Kennedy has said that the measles vaccine helps “protect individual children from measles” and contributes to “community immunity,” he also baselessly questioned its “risk profile” in an appearance on Fox News earlier this week. (In extremely rare instances the vaccine can have serious side effects.) Kennedy’s subversion of vaccines, subtle at times, glaring at others, goes far beyond the measles outbreak. The health secretary is “using the federal government to undermine vaccination in all the ways that it can,” Matt Motta, a vaccine-communication researcher at Boston University, told me. Weldon may have crossed a red line for lawmakers. But in just over a month on the job, Kennedy has taken more steps against vaccines than perhaps any other top health official in modern American history.   

[Read: His daughter was America’s first measles death in a decade]

Kennedy’s wishy-washy comments about the measles vaccine may persuade more parents not to vaccinate their children—which means that more children will get sick, and perhaps die. But his other actions will have an even broader, longer-lasting effect on the overall U.S. vaccination system. Earlier this week, the administration terminated NIH research grants probing how the government can address vaccine hesitancy. Vaccine promotion might seem separate from access, but the two are intertwined, Motta said. Research into vaccine promotion often explores issues such as whether people know where to get shots or whether insurance will cover them. (A spokesperson for Health and Human Services did not respond to a request for comment.)

All the while, the research that the government is now funding may only serve to further sow vaccine distrust. The CDC is reportedly launching a study probing the link between vaccines and autism—even though the connection has already been thoroughly studied and debunked. A 2014 meta-analysis of more than 1 million children found “no relationship” between shots and the condition. Even if the new study comes to a similar conclusion, simply funding such research has consequences, Jennifer Reich, a vaccine-hesitancy researcher at the University of Colorado Denver, told me. The NIH’s new research plays a “powerful symbolic role of making” the link “feel like it is unsettled,” she said.

A myopic focus on the purported connection between vaccines and autism is exactly what some lawmakers feared would color Kennedy’s term as secretary. During Kennedy’s confirmation, Cassidy, a physician, raised concerns that Kennedy and his MAHA movement may undermine science by “always asking for more evidence and never accepting the evidence that is there.” Cassidy, who did not respond to a request for comment, may soon have more reason for disappointment. He ultimately voted to confirm Kennedy based on a plethora of promises and his belief, as he said in a speech on the Senate floor, that RFK Jr. would “work within current vaccine approval and safety-monitoring systems.” Yet Kennedy has already hinted that he will change those systems: “We have a vaccine-surveillance system in this country that just doesn’t work,” he recently said on Fox News, adding that “the CDC in the past has not done a good job at quantifying the risk of vaccines. We are going to do that now.”

Since RFK Jr. entered office, the health agencies have not abandoned all responsibilities surrounding vaccinations. CDC officials have been directly coordinating with the local Texas health department at the epicenter of the measles outbreak, including helping design outreach materials encouraging vaccination, according to internal emails I received as part of a public-records request. A letter addressed to parents from a local health official, for example, states: “I strongly encourage you to have your child vaccinated as soon as possible.”

Perhaps Weldon’s defeat signifies that Washington wants more pro-vaccination efforts like that. But he makes for an easy scapegoat: Unlike RFK Jr., he lacks a devoted fan base backing him up. Kennedy doesn’t need Weldon to do real damage to America’s vaccine infrastructure. The changes he has made so far are likely only the beginning. If Kennedy keeps up this pace, America’s vaccine system may look fundamentally different in one year, or two. The stand against Weldon changes nothing about that.

Photographs by Jake Dockins


Peter greeted me in the mostly empty gravel parking lot of a Mennonite church on the outskirts of Seminole, a small city in West Texas surrounded by cotton and peanut fields. The brick building was tucked in a cobbled-together neighborhood of scrapyards, metal barns, and modest homes with long dirt driveways. No sign out front advertised its name; no message board displayed a Bible verse. No cross, no steeple—nothing, in fact, that would let a passerby know they had stumbled on a place of worship. When my car pulled up, Peter emerged to find out who I was.

He hadn’t been expecting a stranger with a notepad, but he listened as I explained that I had come to town to write about the measles outbreak, which had by that point sent 20 people from the area to the hospital and caused the death of an unnamed child, the disease’s first victim in the United States in a decade.

Of course Peter knew why Seminole was in the news. He had heard that President Trump was asked about the outbreak here during a Cabinet meeting, and he told me that he didn’t like the attention. The Mennonites were being unjustly singled out. It wasn’t like they were the only ones who came down with measles. The coverage, he insisted, was “100 percent unfair.” He didn’t think it was just the Seminole area that had problems; he said that he had family in Canada and Mexico who had also gotten measles recently. I told him I’d heard that the child who’d passed away might have come from his congregation. He said that was true.

Peter dug the toe of his boot into the gravel. I asked him if he knew the family. His voice broke slightly as he answered. “That’s our kid,” he said.

[Photograph by Jake Dockins: A tree is split as power lines run past it along the street in front of a building.]

The first case in the West Texas outbreak was announced on January 29. The official tally in the region grew to six over the next week. By Valentine’s Day, it was up to 48. On February 26, news went out that a child had died; by that point, 124 cases had been confirmed across nine counties, making the outbreak the largest that the state had seen in 30 years. The official count now stands at about 200, and another person who was diagnosed with measles recently died across the border in New Mexico.

An outbreak—even one this big—should not have come as a surprise. Vaccination rates have dipped in many states, including Texas, since the start of the coronavirus pandemic. In Gaines County, where Seminole is located, the measles-vaccination rate among kindergartners is just 82 percent, well short of the estimated 95 percent threshold for maintaining herd immunity. Even that alarming figure would appear to undersell the local problem. Many children from the county’s Mennonite community, which numbers in the thousands, are unvaccinated, but they won’t get picked up in state tallies, because they are either homeschooled or enrolled in nonaccredited private schools, which are not required to collect such data.

[Photograph by Jake Dockins: A woman and a child walk into a supermarket.]

Even in the midst of a measles crisis, persuading parents in rural West Texas to vaccinate their children, or just to get tested for the virus, is an uphill battle. Zach Holbrooks, the executive director of the South Plains Public Health District, told me that he’s spent the past month trying to get the word out, particularly to the Low German–speaking Mennonite community. He asked three local churches if he could set up a mobile testing site on their property. They all refused. “I think there’s some sentiment that they’re being targeted,” he said, “and I don’t like the fact that they feel that way.” His team did create a drive-up testing site at a county events building next to the city park, and not far from the Masonic lodge. But he said that it gets very few visitors—about two or three a day. As a result, no one really knows the outbreak’s total size.

[Read: America is botching measles]

Help from the federal government has been slow to arrive. Weeks into the outbreak, the Department of Health and Human Services directed 2,000 doses of vaccine to be sent to Texas. But Robert F. Kennedy Jr., the newly confirmed HHS secretary, initially reacted to the outbreak by claiming that it was “not unusual.” Since then, he has repeatedly reminded the public that the decision to be immunized is a personal one, even while acknowledging that vaccines “not only protect individual children from measles, but also contribute to community immunity.” He has also claimed that good nutrition might be sufficient to protect people from the worst effects of measles. “If you are healthy, it’s almost impossible for you to be killed by an infectious disease in modern times,” Kennedy falsely told Fox News’s Marc Siegel in an interview last week. He’d had “a very, very emotional and long conversation” with the family of the child who had died, he said, and later added that “malnutrition may have been an issue in her death.” Local health officials told The New York Times that the child who died had “no known underlying conditions.” A spokesman for HHS declined to comment.

There are a half dozen Mennonite congregations in Seminole, according to Google Maps. Peter’s church isn’t listed among them. Aside from a nonprofit filing, it does not appear to have any online presence. I knew of its existence only because I’d met a Mennonite man from another congregation at a coffee shop that morning and asked whether he knew the family of the child that had died. He said he’d heard they were from this church. When I asked him where it was, he responded with a word in Low German. That turned out to be a nickname for a neighborhood a little ways outside of town. After circling county roads for a while, passing a mix of homes, horses, and farm equipment, I stopped and asked for help from a group of boys playing in a field with rocks and sticks. They pointed in unison. The church was just half a mile up the road.

That’s where I encountered Peter, a wiry 28-year-old man with an angular face who wore a dark-colored, Western-style shirt and jeans. His English was uncertain, and he spoke with a light German accent. Sometimes he responded to my questions with silence.

He declined to reveal his daughter’s name or the family’s last name. Peter was perplexed by the national news coverage, and he did not seem eager to draw more attention to his family and community. He gave only his daughter’s age: She was 6 years old. When I asked him to describe her in more detail, he waved his hand, said she liked what other kids liked. But as we stood in the parking lot, he told me the story of what happened.

Peter’s daughter had been sick for three weeks. The family knew it was measles. He said they took her to the hospital at one point, and she was given cough medicine. “That’s it,” he recalled. “They just say, ‘Go home.’ They don’t want to help us. They say, ‘It’s just normal; go home.’” (A spokeswoman for the Seminole Hospital District declined to comment, citing privacy laws.)

[Photograph by Jake Dockins: A sign reading “Measles Testing” is tied to a ladder.]

It wasn’t normal, though. Her condition continued to deteriorate, so they brought her back to the doctors. “She just kept getting sicker and sicker,” he told me. “Her lungs plugged up.” Her heart rate and blood pressure dropped, and the doctors put her on a ventilator. “We were there Saturday ’til Monday, three days … and then it was worse, very bad.” Peter shook his head and stared at the ground. He said his daughter died on Tuesday night from pneumonia, a common complication of severe measles.

Peter’s daughter was not vaccinated. Mennonite doctrine does not prohibit inoculations or modern medicine in general, though I encountered plenty of suspicion among Mennonites I spoke with in Seminole. I met a father who said that he wanted to vaccinate his two daughters but that their mother didn’t think it was a good idea. A grandmother told me she knew of several children who had been given the measles vaccine and were “never the same after that.” A man who’d spent his career installing irrigation equipment said he was suspicious of vaccines in part because he believed that the government had lied about the origins of COVID.

Peter said that he has doubts about vaccines too. He told me that he considers getting measles a normal part of life, noting that his parents and grandparents had it. “Everybody has it,” he told me. “It’s not so new for us.” He’d also heard that getting measles might strengthen your immune system against other diseases, a view Kennedy has promoted in the past. But perhaps most of all, Peter worried about what the vaccine might do to his children. “The vaccination has stuff we don’t trust,” he said. “We don’t like the vaccinations, what they have these days. We heard too much, and we saw too much.”

During our conversation, several families arrived and went inside the building behind him. Mennonites are known for coming to the aid of fellow community members. Earlier in my visit, I’d heard a story about how Mennonites had paid off the mortgage of a young mother in the area whose husband had died in an accident. I asked Peter if he was getting enough support. He nodded: “Food, money—whatever we need.” Peter does construction for a living. He and his wife have four other small children. A couple of them appeared as we talked, grabbing at his sleeve, trying to get his attention. He leaned down to reassure them.

The death of his daughter, Peter told me, was God’s will. God created measles. God allowed the disease to take his daughter’s life. “Everybody has to die,” he said. Peter’s eyes closed, and he struggled to continue talking. “It’s very hard, very hard,” he said at last. “It’s a big hole.” His voice quavered and trailed off. “Our child is here,” he said, gesturing toward the building behind him. “That’s why we’re here.”

Peter invited me to come inside the church building. He walked over to the door and held it open. I entered a small, dark, airless room with about a dozen chairs. Peter’s daughter was lying in the middle in a handmade coffin covered with fabric. Her face, framed by blond, braided pigtails, showed no sign of illness. Everything was white: her skin, her dress, the lining of her coffin, the thin ribbons that formed little bows on the cuffs of her sleeves. Her hands were clasped just below her chest. Members of her family were seated all around. No one looked up when I walked into the room. The only sounds were the trill of someone’s cellphone alert and the dry, hacking cough coming from one of her sisters in the corner.

It’s easy to dismiss statistics, to forget what they represent. Before the measles shot was introduced in 1963, the disease killed somewhere between 400 and 500 people in the United States each year. The CDC puts the mortality rate for childhood measles at one to three in 1,000, with one in five cases requiring hospitalization. Thanks to vaccines, the memory of that suffering has largely faded from public consciousness, at least in the developed world.

What happened in Seminole, though, was a grim reminder. The day after meeting Peter, I visited the vaccination clinic across the street from the hospital where he had first taken his daughter. I had planned to interview people who were there to get their shots, but no one showed. It occurred to me that I was now at some modest risk myself. Families from Peter’s church had cycled through the visitation service the day before, sharing air inside that stuffy room amid their grief. Like a lot of people born before 1989, I’d gotten only one measles shot as a kid, so out of an abundance of caution, I rolled up my sleeve and got a booster. Later that day, I met up with Zach Holbrooks for lunch and asked him how many other people had gotten shots that morning. It turned out to be just one, and that one was him. He, too, had received just a single dose of the vaccine in childhood, so it seemed wise to get another.

[Photograph by Jake Dockins: A tumbleweed is stuck in a wire fence in an open field.]

After lunch, I made the six-hour drive back to Austin, where I live, past the pumpjacks slowly bobbing for oil and the towering wind farms. There’s nothing I heard in Seminole that I haven’t also heard from crunchy liberal friends at home who choose not to vaccinate their kids because they believe that vaccines contain toxins that cause autism or that childhood diseases bolster the immune system. (For the record, the 1998 paper that purported to show a link between vaccines and autism has been retracted, and research indicates that contracting measles can degrade your body’s ability to fight other infections.) Nor are Peter’s views that unusual in conservative corners of the country. A recent poll found that nearly one-third of all Republican and Republican-leaning voters, for instance, think that routine inoculations are “more dangerous than the diseases they are designed to prevent.” That’s the gist of what I heard from multiple Mennonites I interviewed. They are far from alone.

At one point in the parking lot, Peter had asked me why his daughter matters to the rest of the country. I’d struggled in the moment to come up with an answer. For Peter and his family, the loss of their daughter is a private tragedy, one that would be excruciating no matter how she died. The fact that she died of measles, though, is a sign that something has gone wrong with the country’s approach to public health. Twenty-five years ago, measles was declared “eliminated” in the United States. Now a deadly crisis is unfolding in West Texas.

Before I left the church that day, Peter and I talked for a few more minutes. “You probably know how it goes when somebody passes away,” he said. “It’s hard to believe.” Peter told me he didn’t have anything more to say. Really, what more could be said? Something unbelievable had happened: A young father was grieving the death of his 6-year-old from measles.

Updated at 4:56 p.m. ET on March 11, 2025

In the early evening of March 7, 2020, I was on my cellphone in an airport terminal, telling a friend that I was afraid to write an article that risked ruining my journalistic reputation. I had been speaking with the small but close-knit aerobiologist community about the possibility that the new coronavirus could travel easily from person to person through the air—not just through large droplets that reach only a short distance from an infected person or through handshakes. The scientists had stressed that the idea of airborne transmission of the new virus was still mostly theoretical, but they’d seemed pretty concerned.

When my story came out the following week, it was, to my knowledge, the first article by a journalist to make the case that the virus causing COVID-19 might travel efficiently through the air, and could potentially cover many meters in a gaseous cloud emitted with a cough or a sneeze. To avoid stoking undue worry, I had argued against calling the virus “airborne” in the headline, which ran as “They Say Coronavirus Isn’t Airborne—But It’s Definitely Borne by Air.” That idea was not immediately accepted: Two weeks later, the World Health Organization tweeted, “FACT: #COVID19 is NOT airborne.” As the pandemic unfolded, though, it became clear that the coronavirus did indeed spread through airborne transmission—even if the WHO took more than a year and a half to officially describe the coronavirus as a long-range airborne pathogen.

By then, amid the loud debate over mask mandates, vaccine boosters, and individuals’ responsibility for the health of others, a parallel debate had emerged over ventilation. Wearing an N95 or receiving a third COVID shot were ultimately individual choices, but breathing safer air in indoor spaces required buy-in from bigger players such as education departments and transit agencies. Some advocates held up clean air as a kind of public good—one worth investing in for shared safety. Had it taken hold, this way of thinking would have offered governments one of the most lasting ways to decrease people’s risks from COVID and from airborne diseases more generally.

In the United States, the federal government regulates the quality of air outdoors, but it has relatively little oversight of indoor air. State and local jurisdictions pick up some of the slack, but this creates a patchwork of rules about indoor air. Local investment in better air-quality infrastructure varies widely too. For example, a 2022 survey of COVID-ventilation measures in U.S. public-school districts found that only about a quarter of them used or planned to use HEPA filters, which have a dense mesh for trapping particles, for indoor air. An even smaller fraction—about 8 percent—had installed air-cleansing systems that incorporated ultraviolet light, which can kill germs.

For decades, experts have pushed the idea that the government should pay more attention to the quality of indoor air. In his new book, Air-Borne: The Hidden History of the Life We Breathe, the journalist Carl Zimmer shows the long arc of this argument. He notes that Richard Riley, a giant in the field of aerobiology who helped show that tuberculosis can be airborne, believed that individuals shouldn’t have to ensure that the air they breathe is clean. Just as the government regulates the safety of the water that flows into indoor pipes, it should oversee the safety of air in indoor public spaces.

More than half a century before the coronavirus pandemic, Riley positioned this idea as an alternative to requirements for widespread masking, which, he said, call for “a kind of benevolent despotism,” Zimmer reports. If breathing cleaner air was one of the best ways to reduce the societal burden of disease, then there were two main ways to get there: push people to wear masks in any public space, or install better ventilation. The latter approach—purifying the air—would mean that “the individual would be relieved of direct responsibility,” Riley reasoned in a 1961 book he co-authored: “This is preventive medicine at its best, but it can only be bought at the price of civic responsibility and vigilance.”

Medical breakthroughs in the years that followed may have deflated enthusiasm for this idea. Zimmer writes that the huge advances in vaccines during the 1960s made the world less interested in the details of airborne-disease transmission. Thanks to new vaccines, doctors had a way to prevent measles, the WHO launched a campaign to eradicate smallpox, and polio seemed on its way out. On top of that, researchers had come up with an arsenal of lifesaving antibiotics and antivirals. How viruses reached us mattered less when our defenses against them were so strong.

In the first year or so of the coronavirus pandemic, though, one of the only defenses against COVID was avoiding it. And as a debate raged over how well the virus spread in air, the science of aerobiology was thrust into the spotlight. Some members of the public started fighting for good ventilation. A grassroots effort emerged to put homemade air purifiers and portable HEPA filters in public places. Teachers opened classroom windows when they learned that their schools lacked proper ventilation, travelers started carrying carbon-dioxide monitors to gauge the air quality aboard planes, and restaurants began offering outdoor dining after diagrams were published showing how easily one person eating inside can expose those seated nearby to the virus.

The federal government did take some small steps toward encouraging better ventilation. In mid-2023, the CDC put out new recommendations urging five air changes an hour (each air change essentially replaces all of the air within a room) in all buildings. But it was a recommendation, not a requirement, and local governments and owners of public buildings have been slow to take on the burden of installing or overhauling their ventilation systems. Part of this was surely because of the daunting price tag: In 2020, the Government Accountability Office estimated that approximately 36,000 school buildings had substandard systems for heating, ventilation, and cooling; the estimated cost for upgrading the systems and ensuring safe air quality in all of the country’s schools, some experts calculated, would be about $72 billion. Portable HEPA filters, meanwhile, can be noisy and require space, making them less-than-ideal long-term solutions.

For the most part, momentum for better indoor air quality has dissipated, just as interest in it faded in the 1960s. Five years after COVID-19 precipitated lockdowns in the U.S., the rates of hospitalization and death from the disease are a fraction of what they once were, and public discussion about ventilation has waned. Truly improving indoor air quality on a societal scale would be a long-term investment (and one that the Trump administration seems very unlikely to take on, given that it is slashing other environmental-safety protections). But better ventilation would also limit the cost of diseases other than COVID. Tuberculosis is airborne, and measles is frighteningly good at spreading this way. There is also evidence for airborne dissemination of a range of common pathogens such as influenza, which in the U.S. led to an estimated 28,000 deaths in the 2023–24 flu season. The same holds true for RSV, or respiratory syncytial virus, which each year causes 58,000 to 80,000 hospitalizations of children under age 5 in the United States, and kills as many as 300 of them. Virologists are now asking whether bird flu could evolve to transmit efficiently through the air, too.

For those of us still concerned about airborne diseases, it feels as though little has changed. We’re right where we were at the start of the pandemic. I remember that moment in the airport and how I’d later worried about stoking panic in part because, during my flight, I was the only person wearing an N95—one that I had purchased months earlier to wear in the dusty crawl space beneath my home. On the plane, I felt like a weirdo. These days, I am, once again, almost always the lone masker when I take public transportation. Sometimes I feel ridiculous. But just the other week, while I was seated on the metro, a woman coughed on my head. At that moment, I was glad to have a mask on. But I would have been even more relieved if the enclosed space of the metro car had been designed to cleanse the air of whatever she might have released and keep it from reaching me.


This article originally misstated that travelers used carbon-monoxide monitors to gauge air quality aboard airplanes. In fact, they used carbon-dioxide monitors.