A year after the Trump administration began the dismantlement of USAID, it is initiating a new round of significant cuts to foreign assistance. This time, programs that survived the initial purge precisely because they were judged to be lifesaving are slated for cancellation.

According to an internal State Department email obtained by The Atlantic, the administration will soon end all of the humanitarian funding it is currently providing as part of a “responsible exit” from seven African nations, and redirect funding in nine others. Aid programs in all of these countries had previously been up for renewal from now through the end of September but will instead be allowed to expire. Each of them is classified as lifesaving according to the Trump administration’s standards.

The administration had already canceled the entire aid packages of two nations, Afghanistan and Yemen, where the State Department said terrorists were diverting resources. The new email, sent on February 12 to officials in the State Department’s Bureau of African Affairs, makes no such claims about the seven countries now losing all U.S. humanitarian aid: Burkina Faso, Cameroon, Malawi, Mali, Niger, Somalia, and Zimbabwe. Instead, according to the email, these projects are being canceled because “there is no strong nexus between the humanitarian response and U.S. national interests.” (The nine countries eligible for redirected funding are Ethiopia, the Democratic Republic of the Congo, the Central African Republic, Kenya, Mozambique, Nigeria, Uganda, South Sudan, and Sudan.)

A spokesperson for the State Department told me in an email that “as USAID winds down, the State Department is responsibly moving programming onto new mechanisms” with “longer periods of performance and updated award and oversight terms.” The State Department has recently begun signing health-financing agreements with some African governments—including Cameroon and Malawi, as well as five of the nine countries eligible for redirected funding—that will go into effect later this year. These agreements focus on strengthening health systems and containing infectious diseases but don’t seem to address the hunger or displacement crises that aid groups are fighting in these countries. The department’s internal email notes that aid projects in the nine eligible countries will be able to receive U.S. assistance via a United Nations program. But aid groups in at least one of those countries have already lost their U.S. funding, and much remains unknown about whether and when additional support might come. The State Department spokesperson, who did not provide their name, offered no further specifics when asked.

As I wrote earlier this month, under Donald Trump, the U.S. has adopted an “America First” approach to foreign aid, in which many humanitarian projects are selected based not on need but on what the administration might receive in return. This latest aid purge appears to be following that pattern. Across the seven countries barred from U.S. aid, at least 6.2 million people are facing “extreme or catastrophic conditions,” according to the UN. But they have little to offer the U.S. in return for help. In other cases, the State Department has restored or offered aid in exchange for desirable mineral rights, or as payment for agreeing to accept U.S. deportees. Six of the seven countries mine comparatively few minerals that the Trump administration needs to fuel the AI boom. And only one, Cameroon, appears to have accepted a handful of deportees.

[Read: The logical end point of ‘America First’ foreign aid]

The email also confirms that the U.S. will no longer allow American taxpayer dollars to flow to these seven countries through the UN’s Office for the Coordination of Humanitarian Affairs, or OCHA. Previously, the U.S. placed a significant amount of money in the UN’s global humanitarian pool, then trusted OCHA to allocate it. But in December, Jeremy Lewin, a senior official in the State Department, announced at a press conference that the administration would allow its contributions to the UN body to be spent only in an initial list of 17 countries, which included none of the seven whose current aid will soon end entirely. (According to Eri Kaneko, a spokesperson for OCHA, one more country has since been added to the list.) Lewin also announced that the U.S. would be contributing an initial $2 billion in 2026, far less than the country’s typical contributions.

The State Department spokesperson called OCHA’s pooled funding “a gold standard in flexible humanitarian funding.” But according to two senior humanitarian-aid experts and one State Department employee—who, like a number of people I interviewed for this story, asked to remain anonymous to discuss matters they were not authorized to speak about publicly, or because they feared the administration’s retribution—Lewin’s announcement blindsided State Department officials, embassy heads, and aid groups.

The nine other countries named in the internal State Department email appear to be included in the reworked partnership between the U.S. and OCHA. According to the email, the State Department will end lifesaving awards in those places, for reasons the email does not explain and the State Department spokesperson did not provide. (Ethiopia, Congo, and Kenya will be among the beneficiaries of Food for Peace, a program that was formerly part of USAID but is now, as of Christmas Eve, run by the Department of Agriculture.) The aid the selected countries receive through OCHA will come with new restrictions and monitoring requirements. According to guidance that OCHA distributed and I obtained, any American contributions to OCHA must be spent within six months of being donated. According to the two humanitarian experts, one based in South Sudan and the other in Washington, which groups will get this money and when any of it will be distributed is still hazy.

Since the December press conference, “the legal work of formulating formal awards for each recipient country has been taken forward rapidly,” Kaneko, the OCHA spokesperson, told me in a text message. “Extensive preparatory work has also been underway at both the country and global levels on the administration of this grant.” Kaneko defended the six-month deadline for spending, writing that, because several major countries have pulled back their contributions, “it is critical that these funds are translated swiftly into life-saving action for people who urgently need assistance and protection.”

The aid programs being phased out this year were already notable for their continued existence. From January to March last year, the Department of Government Efficiency, led by Elon Musk, helped purge 83 percent of American foreign aid. Many more awards were canceled during a review by the White House’s Office of Management and Budget. The administration’s stated aims in so aggressively reducing foreign aid were to eliminate wasteful, “woke” awards while preserving work that it determined saved lives.

The administration’s definition of lifesaving was particularly strict. Funding for programs that fought tuberculosis and sent food to people who are chronically hungry but not yet starving has been canceled. But stabilization centers that provide inpatient treatment to the most extremely malnourished children have generally, though not universally, been spared. Each of the newly canceled awards represents an occasion in which federal workers had previously convinced Trump appointees that the money would help meet the most basic survival needs of people fleeing war, caught in deadly disease outbreaks, or in danger of starving to death, a former senior State Department official, who left the administration in the fall, told me. “It has to be: ‘If we don’t deliver this, people die immediately,’” they said.

[Read: The world’s deadliest infectious disease is about to get worse]

Since the destruction of USAID last year, administration representatives have repeatedly insisted that lifesaving aid was being preserved. In March, Musk posted on X, “No one has died as a result of a brief pause to do a sanity check on foreign aid funding. No one.” Secretary of State Marco Rubio has similarly claimed that reports of people dying because of USAID cuts were lies, and promised last spring that “no children are dying on my watch.” But reports of deaths that appear clearly linked to the cuts abound.

Conditions in some of the countries where aid is being canceled are already dire. Somalia, which will soon receive no American humanitarian funding at all, is undergoing a severe drought; earlier this year, analysts for the federal government reported that the hunger crisis is so extreme, it could deteriorate into full-blown famine by this summer. Hundreds of health and nutrition centers in Somalia shut down after last year’s steep aid cuts, according to Doctors Without Borders. In a regional hospital that Doctors Without Borders supports, deaths among severely malnourished children younger than 5 have increased by 44 percent, Hareth Mohammed, a communications manager working for the organization in Somalia, told me. Jocelyn Wyatt, the CEO of the Minnesota-based nonprofit Alight, which works in many countries affected by war or natural disaster, told me that her organization will have to close more than a dozen health facilities in Somalia in the next week, leaving as many as 200,000 people without any health care.

According to Wyatt, State Department officials had said in December that they were “optimistic” about funding for her organization’s work in Sudan being renewed in 2026. But last month, the State Department said the grant would actually end in February. Alight has run out of U.S. funding, and Wyatt told me that she has received no confirmation of whether and when OCHA funds will materialize. (“We are working on allocating the funds as quickly as possible,” Kaneko said.) Alight has been forced to pull out of three refugee camps in Sudan, a country Trump described on his social-media platform in November as “the most violent place on Earth and, likewise, the single biggest Humanitarian Crisis.” In nearly three years of civil war, more than 150,000 people have been killed in the country. The Trump administration maintains that genocide and famine are taking place there. Yet the global humanitarian effort to respond remains severely underfunded; this year, the World Food Program plans to reduce the rations it gives to people facing famine by 70 percent. Over the past month, Alight has closed 30 health clinics and 14 nutrition centers, and laid off more than 250 doctors, nurses, and staff members around Sudan, Wyatt said. In the three camps Alight exited, the organization had provided the only sources of health care. (The State Department spokesperson did not respond to questions about Alight’s funding.)

I spoke with an Alight worker who has been breaking the news of the sudden closures to people in displacement camps in Sudan over the past month, to sobs and disbelief. Many arrive at the camps wounded, and now the nearest health facility—a regional hospital—is a three-hour drive away from the camps through a war zone. “They are afraid,” the worker told me, of venturing into territory that’s rife with the same militants they have fled. Alight would drive refugees to the hospital when they presented with issues too severe to treat at the camps. But with the new cuts, the organization no longer has enough money to rent the cars.

The Protein-Bar Delusion

Eating candy for breakfast is not a good decision. But most mornings, I start my day with something that looks and tastes a lot like just that. The Built Puff protein bar is covered in chocolate and has a sweet coconut center, making it practically indistinguishable from a Mounds bar. Nutritionally, though, the two products are very different. A Mounds bar has north of 200 calories and 20 grams of added sugar. My bar has 140 calories, just six grams of added sugar, and about as much protein as three eggs.

Protein bars have come a long way from the chalky monstrosities that lined shelves not long ago. In this era of protein everything, they are successfully spoofing candy, but with much more impressive macronutrients. Built also makes bars in flavors such as Blue Razz Blast, Strawberries ‘n Cream, and Banana Cream Pie—all with a similar nutritional profile to my preferred coconut version. Another one of my favorites, the Barebells caramel-cashew bar, tastes like a mash-up of a Twix and a Snickers. There are rocky-road protein bars, birthday-cake protein bars coated in sprinkles, and snickerdoodle-flavored protein bars. In theory, I can eat frosted cinnamon rolls or a package of sour gummies without blowing my diet.

For anyone with a sweet tooth, it can feel like food companies have developed guilt-free candy. But that’s where things get disorienting. Some of these products are seemingly nutritionally benign, whereas others are nothing more than junk food trying to cash in on protein’s good reputation. The new protein-spiked Pop-Tarts contain the same amount of sugar as the original Pop-Tarts—30 grams. Or consider Gatorade’s protein bar, which has roughly as much sugar as a full-size Snickers. At this point, the line between protein bar and candy bar has never been blurrier.

[Read: America has entered late-stage protein]

If you’re confused, you’re not the only one. In 2023, a group of Gatorade customers sued PepsiCo, the brand’s parent company, over its sugary protein bars. They alleged that Gatorade was deceiving customers by labeling the products as protein bars as opposed to “a candy bar or dessert.” Pepsi’s lawyers said that the company had not engaged in false advertising, because the sugar content was right there for anyone to see on the nutrition-facts label. (In October, the case against PepsiCo was resolved out of court; the bars are still loaded with sugar.)

The lawyers have a point: For some bars, the nutrition facts do tell a clear story. You don’t need to be a nutritionist to figure out that protein Pop-Tarts are not particularly good for you. Other cases, however, aren’t that simple. An oatmeal-raisin-walnut Clif bar tastes pretty healthy, and its 10 grams of protein may keep you fuller for a while—one of the many reasons people are protein-maxxing these days. But is that worth 14 grams of added sugar?

Calories and sugar only tell you so much about whether you’re munching on a healthy snack or something that’s more akin to a Butterfinger. Consider the FDA’s advice on the matter. The agency used to say a protein bar could be classified as healthy if it provided at least 10 percent of a person’s daily recommended protein and also didn’t have much fat, cholesterol, or sodium. Under those guidelines, most of these new bars would qualify as healthy. But the FDA finalized new guidelines in 2024 after complaints from Kind, which makes bars studded with whole nuts. The company argued that the previous rules unfairly maligned its products, because nuts are too high in fat to qualify as healthy. Under the new rules, it seems that protein bars and other products can’t be labeled as healthy if they rely on protein powders and isolates, rather than whole foods such as nuts and eggs, for their protein. As a result, many modern protein bars probably can’t be labeled as healthy.

The FDA is onto something, according to many nutritionists. “Protein bars are candy bars in disguise,” Marion Nestle, an emeritus professor of nutrition at NYU, told me. Even products like David bars, which come in flavors such as Cake Batter and Red Velvet and have just 150 calories and zero grams of sugar, are not as healthy as they may seem. They are made with artificial sweeteners and several other food additives, as are many other candy-protein hybrids with impressive macros, including my beloved coconut-flavored Built Puff.

[Read: Coke, Twinkies, Skittles, and … whole-grain bread?]

These bars lack the slew of micronutrients, such as vitamins and minerals, that are typically part of whole foods. “Eat a bag of nuts, and you will be healthier and get your protein,” Barry Popkin, a nutrition professor at the University of North Carolina at Chapel Hill, told me. Like candy, most modern protein bars are squarely in the category of ultra-processed foods, which many researchers believe may prompt people to overeat and contribute to our collective dietary problems. The science of ultra-processed foods remains largely speculative, however. It’s not yet clear just how bad these products are for us—and why. In an email, David CEO Peter Rahal told me that the macronutrients are what matter most. “To call David a candy bar because it tastes good is like calling a Tesla a toy because it’s fun to drive,” he said.

At the very least, something like the David bar is probably better than a Snickers for anyone craving a quick snack. If protein bars truly replace candy, perhaps Americans will be marginally healthier. If these products become people’s breakfast instead of a well-balanced meal, then not so much. The protein boom has made it easier than ever to get your macros from fun, tasty treats. But for the most part, they are still just treats.

The Longevity Scam

The quest to live forever has fascinated humans for millennia. The Epic of Gilgamesh, composed about 4,000 years ago, follows a king who searches the world for a plant that can restore youth, only to lose the plant to a thieving snake. The (likely apocryphal) story of Juan Ponce de León, who is said to have embarked on a search for the Fountain of Youth in the early 16th century, refuses to die—unlike its protagonist, who was killed along his journey.  

Today’s longevity-medicine movement is driven by the same aggressive desire for eternal youth as the mythic stories of old. But whereas in earlier times ideas about wellness could travel only as fast as the people who held them, today just about anyone with an internet connection can use social media and AI-generated graphics to sell medical advice in seconds. Despite a decided shortage of placebo-controlled trials in humans to support that advice, the business of longevity is booming, thanks in large part to sleek direct-to-consumer marketing delivered by health influencers with far more confidence than evidence. By 2030, $8 trillion might be spent annually on longevity-related products.

As a sports-medicine physician, I see the consequences of the modern longevity obsession up close. Patients arrive at my office convinced that the right peptides, cold plunges, or lab tests can meaningfully extend their lives. They’re almost certainly headed for disappointment—if not harm.

In many ways, the American people owe a debt of gratitude to the early champions of longevity medicine. Throughout the 20th century, Western physicians focused primarily on treating disease rather than preventing it. But over the past 15 years or so, a new generation of longevity-focused clinicians began emphasizing lifestyle changes such as sleep, exercise, and healthy diet as first-line strategies for disease prevention—not necessarily to extend life, but to improve health. More recently, private investment has poured into the field in pursuit of flashier claims about staving off death. Many longevity-focused clinics and influencers have drifted from prevention toward profit, selling an expanding menu of unvalidated treatments.

Some of the new advice is relatively harmless. Protein loading, for example, is unlikely to meaningfully extend one’s lifespan, but it is also unlikely to cause serious harm. Other trends are more concerning. I have seen patients experiment with drugs like rapamycin, an immunosuppressant medication prescribed for those who have undergone organ transplantation. Some health influencers claim, without convincing human-subject data, that rapamycin slows cellular aging. Those claims have yet to be validated, but scientists do know that rapamycin’s side-effect profile includes an increased risk of infection and disease.

[Read: America has entered late-stage protein]

Other longevity enthusiasts are injecting or swallowing peptides, chains of amino acids that have been used in medicine for decades but are now becoming popular in their unregulated form. When prescribed by a physician, FDA-approved peptides such as insulin and GLP-1s can be remarkably effective. But no placebo-controlled human trials support the use of, say, “Wolverine” (scientific name BPC-157), a peptide that some influencers claim ramps up collagen production and aids tendon and ligament healing. Like many other non-FDA-approved peptides, Wolverine can be ordered online by anyone.

Along with supplements and drugs, excessive testing has become another pillar of the longevity movement. Apps, blood tests, and wearable devices purport to estimate customers’ “biological age” using metrics such as heart-rate variability, sleep scores, body composition, and biomarkers in blood. This type of “health score” does not predict how long a person will live, but it can provide a helpful snapshot of one’s current state of physiologic health and inspire healthy behaviors.

The bigger issue is more intensive screenings, especially full-body MRI scans, which many longevity clinics have marketed as tools to detect disease early and thereby extend life. Although this sounds like a good idea, the availability of screening has outpaced its clinical relevance. MRI scans routinely reveal anatomical changes that are a normal part of the aging process. Research suggests that the overwhelming majority of adults past middle age have tears or cartilage changes in their knees, or tendon injuries in their shoulders. Similarly, liver and kidney cysts are commonly seen on MRI scans, especially in people over 50, and most are of no clinical significance; they’re common enough that researchers have dubbed them incidentalomas. But when such findings show up on an MRI, the risk of unnecessary surgery or other treatment increases drastically. Once a liver lesion is seen on MRI, for example, the patient will likely be advised to get a liver biopsy, a procedure with a 2.4 percent risk of major medical complication.

[Read: GLP-1 envy was just the beginning]

Recently, I treated a 48-year-old man for a routine sports injury. Almost as an aside, he mentioned that he had undergone a full-body MRI as part of a longevity program. The scan revealed a small lesion in his prostate. His PSA, the validated screening test for prostate cancer, was normal. Still, he was referred to a urologist, who felt compelled to biopsy the lesion because it appeared on the scan. The biopsy was benign, but the procedure left my patient unable to sit comfortably for weeks. “I wish I’d never done the MRI,” he told me.

The irony is that modern medicine has already succeeded at what the modern-day longevity movement claims to offer. Over the past 150 years, human life expectancy around the world has more than doubled, to numbers that Ponce de León could hardly have dreamed of. Clean water, sanitation, antibiotics, and vaccines have done more to extend human life than any supplement stack ever could. Cold plunges and red lights may feel empowering, but there is little evidence that today’s biohacking tools meaningfully extend the maximum human lifespan.

A better—and more achievable—goal would be to extend healthy longevity, adding life to years instead of years to life. Scientists and doctors, for the most part, already know how to do this. Daily exercise and maintaining skeletal muscle volume as you age are among the most potent forms of preventive health care.

[From the January 2025 issue: America needs to radically rethink what it means to be old]

After decades of prescribing exercise as medicine to my patients, I tell them this: Move your body every day, and build muscles with weights or bodyweight exercises three times a week. Eat foods that you can recognize in nature. Prioritize sleep. Stay socially connected with community activities. Such a regimen may not enable you to cheat death. But it’s free. It’s evidence-backed. And it will help you live well right now.

A post on X claimed to be a simple message from the U.S. Department of Health and Human Services: Stay active; eat well. But the 90-second video it shared, called “Secretary Kennedy and Kid Rock’s Rock Out Work Out,” seems designed to be bewildering. Here was Robert F. Kennedy Jr. eating steak and doing preacher curls in his belted blue jeans and a pair of hiking shoes; and here he was again, stripping off his T-shirt to ride an exercise bike inside a sauna; and here he was a little later, strutting over to a cold-plunge tub (still in his blue jeans but with the belt removed); and here he went into the tub, sliding underwater in his dungarees.

Why was the HHS secretary bathing in a pair of pants? The video never provides an answer to this question, even as Kennedy plays pickleball, then mugs for the camera, then soaks in a Jacuzzi with a glass of milk, all while still in jeans. It’s just bizarre—a PSA that presumably has been dialed in by his staff to maximize its WTF effect. (The video has been viewed more than 13 million times and produced some 11,000 replies; HHS did not respond to a request for comment about the video or the jeans.) However his peculiar gymwear habit started, its present state is very clear: The secretary’s jeans are self-aware.

Kennedy’s proclivity for working out in belted denim long predates the knowing wink with which it’s now displayed. Take the summer’s “DOD-HHS Fitness Challenge,” for which Kennedy donned his favorite workout gear and did a bunch of pull-ups with Secretary of Defense Pete Hegseth. Or the viral clip of him from 2023, wearing jeans and boots and nothing else, squeezing out a final set on Venice Beach. I don’t believe that these show a man who lifts in jeans to maximize his clicks. I believe that they show instead a man who fits a waning archetype in fitness culture, a species that has for decades been endemic to the gym: Kennedy is a jeans guy.

I’ve worked out, off and on, for more than 30 years—and for all of that time, the jeans guys have remained a steady presence on the rubber floors. They are sometimes taciturn, sometimes chatty. They often pair their jeans with boots, as Kennedy will do, and with a T-shirt or a tank top or a hoodie. But lest you think he simply has no truck with any gym-specific gear, the jeans guy is sometimes spotted wearing padded lifting gloves, or a leather lifting belt across his Levi’s. At times, his social role will overlap with that of other weight-room regulars, not least of which is the gym grandpa, who hangs around and shoots the breeze and doles out tips on how to lift. However he appears, and however much he gabs, the jeans guy’s social status is the same: He’s an outsider. Rarely does one find a jeans guy paired up with a workout partner. “It’s usually like, ‘The jeans guy rides alone,’” Tolga Ozyurtcu, a historian of physical culture at the University of Texas at Austin, told me when I called him up to talk about this phenomenon.

Not everyone enjoys the company of the jeans guy. Some see him as a threat. Planet Fitness once made a point of banning denim in the gym, along with grunting, dropping dumbbells, and judging others. (Those who broke these rules could be punished with a “lunk alarm” and summarily kicked out.) But this discrimination feels as ill-considered as it is unfair: In my experience, jeans guys are harmless at the very worst, and at best, they add some needed color to a dreary landscape. In this way, the jeans guys are akin to other gentle curiosities, such as the shorts guys who alight on college campuses in wintertime, and the black chipmunks that scamper by from time to time in city parks.

What motivates the jeans guy? No one knows. He is, if nothing else, as inscrutable as a four-leaf clover. Ask him why he isn’t wearing shorts, and he will likely tell you that he chooses denim for efficiency. When Fox News’s Jesse Watters asked Kennedy in August to explain his favored workout gear, all Watters got was this: “Well, I just started doing that a long time ago because I would go hiking in the morning and then I’d go straight to the gym, and I found it was convenient, and now I’m used to it.” In the hope of getting more, I reached out to Ryan Calder, the fitness coach who spotted Kennedy on the incline bench in that viral video from 2023. Did Calder—who at the time was dressed, quite reasonably, in shorts—happen to ask Kennedy about his denim pants and boots? He did. “I asked him right then, you know, like, ‘So, you’re banging it out in jeans?,’” he told me. “And he’s like, ‘Yeah, man, this is my efficient way. I only have 30 minutes. I don’t spend time changing clothes.’”

A jeans guy’s self-report must be taken with a grain of salt—maybe even he cannot really fathom why he lives the way he does. Kennedy’s is no exception to this rule. In public appearances, he is almost always in a suit and skinny tie, so adding a daily interlude in workout denim would hardly seem to be a way of saving time. His habit may be instead a product of the workout culture he imbibed during his youth. “The jeans guy, it’s a thing. It’s a very definitive thing,” Conor Heffernan, a fitness historian at Ulster University, told me. “It’s a trope we’ve had since the ’80s.” The power lifters of the time, some of whom were connected to the biker subculture, adopted a “rugged, spit-and-sawdust aesthetic” in the gym, he said. This included denim. During the same period, glitzy photoshoots for bodybuilders also featured jeans, to match the styles of the time. Heffernan brought up a famous photo of Lee Haney, the eight-time Mr. Olympia, flexing shirtless in a pair of jeans above a steamy manhole cover in New York City. Perhaps the older jeans guys of today—Kennedy himself is 72—are nothing less than living fossils.

Their aesthetic may have faded out, but a younger set of jeans guys—ironic jeans guys—has since emerged in the fitness culture. Take the influencer-marathoner Truett Hanes: His brand is built on running very fast and very far … in jeans. He claims that this started as a goof, but it has turned into a business. He now represents a denim company, as well as a chafing cream. The idea of working out in jeans, partly silly and partly serious, is everywhere once you start to look for it. One gymwear brand, Raskol, launched a line of lifter jeggings in 2023 in shades such as “blue steele” and “pale thunder,” with a tongue-in-cheek campaign that had bodybuilders boasting of their pride at using PEDs—that is, “performance-enhancing denims.”

This self-mocking move may be just the prelude to a fuller jeans-guy renaissance, Heffernan suggested. After all, Raskol’s jeggings did sell out, he said. And this wouldn’t be the first time that a traditional signifier of masculinity crept back into mainstream culture by way of performative half jokes. The fashion for bushy beards, and beards’ association with authentic manliness, has followed this same trajectory from irony to earnestness during its various resurgences since the early 19th century. Now the same could be happening to denim workout pants: Today’s goof evolves into tomorrow’s masculine ideal. “I think irony moves into fashion very quickly in fitness,” Heffernan said.

For Kennedy, this process may appear to be going in reverse: His latest workout video shows that he’s in on the joke, that in 2026 he’s capable of pumping irony as well as iron, and that he can engage in what Heffernan described as “a very deliberate deployment of jeans.” But it also shows that there is a recipe, if not a cultural machinery, for rehabilitating out-of-date ideas. Not all of Kennedy’s eccentricities are as quaint as how he dresses in the gym, and there are many ways of going backwards in pursuit of health while pretending that you’ve found a way into the future. MAHA is nostalgia, sometimes with a smirk. The jeans guy dunks himself in water. The jeans guy is reborn.

Olympic skimo is a spectacle. Appearing at the Games for the first time, skimo, short for ski mountaineering, grew from a long tradition of scaling mountains on skis—sometimes for hours on end—for the reward of gliding down, preferably through untracked powder. But at the Olympics, it will involve sprints of approximately three and a half minutes. First, athletes ascend a steep slope on skis outfitted with special fabric strips called “skins,” which prevent them from sliding backwards down the hill. Then they throw their skis on their back and hike in ski boots, put their skis back on for a final ascent, and finally peel off their skins to race downhill.

The events will include a mixed-gender skimo relay—“contrived for the Olympic field,” Sarah Cookler, the head of sport for USA Skimo, the sport’s American governing body, told me. Each team will include one man and one woman, each of whom will run the course twice. Anna Gibson and Cam Smith, both making their Olympic debuts, will represent Team USA.

It’s only fitting that such a new sport should debut on the world stage with a format that’s newly popular. Mixed-gender events, in which each team consists of a set number of men and women, have long had a place at the Olympics: Mixed-pairs figure skating and tennis were each introduced to the Games more than a century ago. But recently, such events have been growing in number as part of a deliberate effort to bring more attention to women’s sports. At Milan Cortina, every major sport category but ice hockey features mixed-gender events.

Women participated in the first Winter Olympics, in Chamonix in 1924, but only in figure skating. Women’s speed-skating exhibitions were included in the 1932 Olympics, but no medals were awarded. In 1936, women competed in alpine skiing, but they weren’t allowed in cross-country until 1952. Since then, the Winter Olympics have slowly been approaching a sort of parity, with women participating in an ever-growing number of events once considered too difficult, dangerous, or scandalous for them.

The International Olympic Committee is advertising the 2026 Games as “the most gender-balanced Olympic Winter Games in history.” That’s based on the fact that women will represent 47 percent of athletes participating and compete in 53 percent of all events; this Olympics is also the first where male and female cross-country skiers are competing at the same distances. But in terms of viewership and publicity, the many mixed-gender events may do more to raise the profile of female athletes than women’s events alone.

Women’s sports, including skimo, tend to attract far fewer spectators than men’s sports do. The mixed-gender relay can help bridge this gap, Smith told me, because in events that feature both men and women, viewers are positioned to watch everyone compete. And if the mixed-gender events allow spectators to get to know more female athletes, that could translate into more people watching the women’s stand-alone events. The IOC is leaning into these events: The Los Angeles Games in 2028 will feature 25 mixed-gender events, including new ones in golf, rowing, and artistic gymnastics.

Gibson told me she’s grown accustomed to elite sport spaces separating men and women as much as possible in the name of equity. She’s also a world-class trail runner and accomplished gravel-bike racer, and in those sports, “the talk has been all about giving women their own start in order to elevate women’s competition rather than having women be buried in the men’s field,” she said. Separate men’s and women’s fields have long been the default in most sports, and for good reason: If women had to compete directly with men, they would rarely have a chance to win. But bringing women and men together in competitions that include both gives women their own space and attention while including all the sport’s athletes within one community. It creates “a lot of camaraderie,” Gibson said, and in her experience, adds to the excitement among spectators.

Women, of course, face many obstacles in sport that cannot be solved by the addition of a few mixed-gender Olympic events. Across sports, male athletes are generally paid more than their female counterparts. In skimo, prize money is equal at the sport’s World Cup, but some other races still have unequal payouts, Cookler said. (Gibson said she avoids those.) Women athletes are still sometimes subjected to sexual harassment and abuse, even on the Olympic level. The Winter Olympics still include one event in which women are not allowed to compete at all—Nordic combined, which consists of ski jumping and cross-country skiing. (The IOC is already considering dropping Nordic combined altogether because of low participation and spectator interest; an IOC spokesperson told me the event “will undergo a full evaluation” following this year’s Olympics, and reiterated the committee’s commitment to gender equality.)

And mixed-gender formats can still be subject to unequal dynamics. Biathlon, a Winter Olympics event that combines cross-country skiing and target shooting, has used the mixed-gender relay format since the 2014 Olympics in Sochi. (Biathlon and luge were the first sports to introduce a mixed-gender format to the Winter Games since figure skating and ice dancing in the 20th century.) But “women always went first in our mixed-gender relays, and people felt it was unfair that a woman could never be the anchor,” Joanne Reid, a three-time Olympian who competed for Team USA last week, told me. In 2019, the sport’s governing body restructured the order of the competition so that the gender of the anchor rotates.

Even as mixed-gender events become more common, skimo’s inclusion in future Olympics is not assured. It’s in the 2026 Olympics because the organizers of this year’s Games proposed it; to continue competing, skimo athletes will need organizers at the next venues to propose its inclusion too. Skimo enthusiasts expect this to happen and hope future Olympics will include events that are longer and more representative of the sport’s origins. “Nobody got into skimo to do the sprint relay,” Smith said. But as he began training for the event, he enjoyed it more than he expected. “It’s really fun because we are accountable to each other,” he said. “I’m racing all out because I know that she’s doing the same for me.”

Back when he was 17 and in high school, Eric Sid fainted. In the emergency room, he was diagnosed with anemia, which can cause fainting spells, and for years he thought that was the end of the story. About a decade later, in the early 2010s, he came down with pneumonia and had blood work done. He took a peek at the results and saw markers of anemia, as he expected. But the report also mentioned that his red blood cells were smaller than normal.

Sid was in medical school at the time, and he immediately thought of a few genetic conditions that could explain this result. One was thalassemia, which causes low levels of hemoglobin, leading to anemia and other related problems. A laboratory test showed that he had this inherited illness. And this meant that he had a gene mutation. Finally, he thought, he had an explanation for symptoms he had been experiencing for years.

Except there was a catch. In the most common forms of thalassemia, people who show symptoms have mutations in both copies of the related genes. Those born with the most severe forms of thalassemia require transfusions for life to get enough healthy red blood cells, and if the condition is not diagnosed soon enough it can be fatal in early childhood. But Sid’s lab results suggested that only one copy was affected, so he was considered a carrier of the illness, who could pass it on to his children but didn’t have it himself. According to conventional wisdom at the time, carriers were asymptomatic, and compared with someone suffering from the disease’s worst manifestations, he seemed fine.  

Sid now works on a rare-disease program within the National Institutes of Health. Since he first found out that he was a carrier for thalassemia, he told me, research has shown that people like him can experience health consequences. These include lethargy and fainting—symptoms that hardly capture the disease’s classic presentation but still have real consequences. And thalassemia is not unique. There are hundreds upon hundreds of known disorders for which carriers were thought to be safe; for a growing number of those diseases, doctors and scientists now believe that being a carrier can come with health problems. Plenty of patients have guessed all along that being a carrier could explain mysteries about their health, Sid said. “It took a while for the science to basically catch up to that suspicion.”

In these types of “recessive” conditions, the people who show the classical manifestation of the disease have a pair of mutated genes. Our chromosomes come in 23 pairs that are essentially near duplicates of each other (with the exception of the XY pair). A person with just one nonworking version of a gene was supposed to be protected by the functioning second copy on the matching chromosome, which would provide cover against any disease. Inheriting two mutated copies of a gene is statistically rare, so many families with carriers may not include members with the full-blown version of a disease. As a result, the mutation can be unknowingly passed down from generation to generation, without carriers being aware of the real consequences.

In October, researchers in Louisiana reported on a case involving a college football player who had heart palpitations during practice. The athlete was a carrier for the blood disorder sickle-cell disease—one of the most common genetic disorders in America. Up to 10 percent of Black Americans are carriers for sickle cell. As far back as the 1970s, scientists noted blood and heart complications in carriers following physical exercise, and a lawsuit following the sudden death of a college football player two decades ago led to widespread screening for the sickle-cell trait in university athletics. However, experts are still trying to understand the risks to carriers.

The Louisiana man survived, but researchers noted that he had a dangerously irregular heartbeat during training and would need cardiac surgery. The authors of the October paper stress that coaches and players need more education about the possible hazards for carriers of the sickle-cell trait.

Experts who study cystic fibrosis, which is caused by mutations in the CFTR gene, also want more attention on the health complications that carriers can experience. As many as one in 25 Americans of European descent is a carrier for cystic fibrosis, but many are not aware of their status. For this disease, reports of symptoms in carriers go back at least several decades. Take, for example, infertility. Almost all men who have two mutated copies of the CFTR gene lack vas deferens, and without these tubes, sperm have no path out of the testes. In the mid-’90s, scientists found a handful of cystic-fibrosis carriers who lacked vas deferens, despite only one of their CFTR copies being mutated.

Carriers of cystic fibrosis can face other real health issues—dramatic sinus problems requiring multiple surgeries, pancreatitis, and possibly pancreatic cancer, which researchers have documented in papers over the past several years. “We’ve always said being a carrier of a single cystic-fibrosis mutation doesn’t usually lead to health issues. And usually it doesn’t,” Michael Boyle, the president and CEO of the Cystic Fibrosis Foundation, told me. “However, we do know, and probably have a greater appreciation than ever, that being a carrier can lead to health issues for some.” There are about 10 million cystic-fibrosis carriers in the United States alone, so if even a fraction of them have some degree of symptoms, that amounts to many people with manifestations of the disease.

Questions about carriers’ health problems go beyond well-known diseases such as sickle cell and cystic fibrosis. Consider xeroderma pigmentosum: People with two mutated gene copies are up to 2,000 times more likely to develop melanoma than the average person; a 2023 analysis found strong evidence that some carriers of the disease were also more likely to develop skin cancer. Up to 3 percent of carriers of hereditary hemochromatosis show symptoms such as iron overload in their organs. Carriers of Gaucher disease Type 1 are at increased risk of Parkinson’s. Carriers of LIG4 syndrome experience a version of the immunodeficiency that, in the fully expressed syndrome, causes life-threatening illness.

For many of these diseases, having two mutated copies of the related gene generally means that the disease will come for you. Having one copy can still mean nothing: Plenty of carriers of recessive genetic diseases seem none the worse for it. An estimated one in 20 people is, like Sid, a carrier for thalassemia, but not all people with a copy of the mutation experience anemia and fainting like he did. Why some carriers of genetic illnesses might be affected and others remain free from symptoms is not clear. But “people who are thought to be just carriers but show some symptoms of that disease may sometimes have a second hard-to-discover mutation,” Edward Neilan, the chief medical and scientific officer of the National Organization for Rare Disorders, told me. “They may actually have two mutations.”

Being a carrier can have benefits. One theory of why genetic diseases spread widely is that having one copy of a mutated gene has some advantage. It’s well understood, for example, that being a carrier for sickle-cell disease offers some protection against malaria. And some have theorized that being a carrier for cystic fibrosis can defend against severe cholera, although the evidence for this theory is more scant. A 2023 analysis discovered that although two variant copies of the SCN5A gene elevate a person’s risk of severe heart-rhythm problems, just one copy might actually lower a person’s risk of heart-rhythm irregularities compared with the general population.

Even when being a carrier of a particular disease isn’t itself beneficial, knowing that you are a carrier can be. Carriers of certain variants of xeroderma pigmentosum, for example, might want to go to extra lengths to avoid excessive sun exposure; some carriers of Alpha-1 antitrypsin deficiency, who appear to have a heightened risk of lung issues, might decide not to smoke. And people with the sickle-cell trait might be well-advised to acclimatize before doing sports at high altitudes.

Knowing you’re a carrier for cystic fibrosis might help guide your family-planning decisions, for instance. (Even without vas deferens, a person could still become a father using sperm extraction and in vitro fertilization—which some carriers of cystic fibrosis opt for anyway, to avoid passing on the disease.) But there are limits: The science is still out on whether Trikafta, a relatively new medication that has transformed the disease for many people, would help with sinus or pancreatic issues in carriers. The medication isn’t approved for carriers, either.   

For Sid, knowing that he had a thalassemia mutation—which can cause dizziness—has made him feel less bad about his difficulty with intensive physical training, for example. And he is comforted to know what likely caused his high-school fainting episode. Finding out about the mutation in his genome gave him a fuller picture of his health. “Personally,” he said, “that’s kind of how I saw it—this kind of gave me some more understanding.”

In 2021, just months after the first COVID vaccines debuted, concern was growing about an exceedingly rare but sometimes deadly outcome of certain shots. Two related vaccines—one from AstraZeneca and the other from Johnson & Johnson—were linked to dangerous blood clotting.

Out of almost 19 million doses of Johnson & Johnson’s version given in the United States during the first two years of the pandemic, at least 60 such cases were identified. Nine of them were fatal. In the United Kingdom, where almost 50 million doses of the AstraZeneca shot were given, 455 cases occurred; 81 people died. In Germany, at least 71 cases were identified, also linked to AstraZeneca. By late spring, use of both the AstraZeneca and the Johnson & Johnson vaccine was paused, and ultimately both were pulled from the market. But the mystery surrounding the rare blood clotting caused by these vaccines lingered.

Now researchers believe they have cracked the case. They have hard evidence for how the blood clotting happened, and they believe that their findings could help make similar vaccines even safer. Understanding the blood-clotting problem is important, they say, because vaccines of this type could be essential in protecting people during future pandemics.

The team that initially gave this condition a name—vaccine-induced immune thrombotic thrombocytopenia, or VITT—included Andreas Greinacher, a blood expert at the University of Greifswald, in Germany. Back in 2021, as the cases of VITT emerged, he and others were unsure of what precipitated them. One theory was that they were caused by the body’s accidental reaction to the type of virus used in both the AstraZeneca and Johnson & Johnson vaccines: adenoviruses, which had been engineered to prompt the body to recognize the pandemic coronavirus but were unable to replicate and considered harmless to people. Scientists had noticed that patients with VITT had telltale markers in their blood—antibodies that bind to a chemical signal released by platelets. Maybe a reaction to the adenovirus was causing immune cells to mistakenly go after a blood component and precipitate clotting. An alternative theory was that the body was reacting to a portion of the coronavirus called “spike protein,” which showed up as part of the immunization.

In a study published today in The New England Journal of Medicine, Greinacher and his colleagues show that the first theory was correct: VITT was a response to the adenovirus gone awry. And they discovered a further twist: This immune overreaction happened in people who were genetically prone to it.

In the study, Greinacher and his colleagues looked at the antibodies in stored blood from 21 patients with VITT. Among those antibodies, they found a subset that could glom on to a portion of the adenovirus and to one of the body’s own molecules, PF4, which can influence blood clotting. A person who received one of the adenovirus vaccines but did not have a reaction also had antibodies against that same part of the adenovirus. But, crucially, that person’s antibodies did not cross-react with PF4.

Those antibody molecules also offered clues about the immune cells that made them. And the scientists were able to link the immune cells responsible for VITT to patients who had two specific DNA variants. A wider survey of 100 VITT patients found that all of them had immune cells with one of these genetic types—which are far from universal. This signaled to the researchers that having these particular variants is a strong risk factor for blood clotting following an adenovirus vaccine.

But the study also showed that this genetic background on its own was not enough to cause VITT. The immune cells that made the dangerous antibodies had experienced an additional small genetic change, and that extra mutation had prompted them to produce those cross-reactive molecules.

In the past, scientists have suggested that genetic predispositions might explain some adverse events that happen after vaccination. For example, some data have indicated that certain people were genetically prone to developing narcolepsy following a version of swine-flu vaccine that was briefly used in Europe. But the new study from Greinacher and his team is the first to provide concrete evidence of how people with a particular DNA variant can develop self-sabotaging antibodies following a vaccination. Arnold Lining Ju, a biomedical engineer at the University of Sydney who has studied blood clotting, told me that the paper was a landmark finding in part because of how elegantly it explains the way a specific genetic trait, combined with a particular chance mutation in certain cells, creates VITT. And because the study shows that multiple genetic changes are involved, it finally explains why this immune reaction is so rare, he said.

This discovery will help guide researchers more than it will influence vaccination choices for individual patients. Most vaccine recipients will not know their genetic predisposition to an adverse event, Jennifer Juno, a vaccine researcher at the University of Melbourne, points out. But this type of work will help improve vaccine design—particularly in the field of “precision vaccinology,” in which vaccines are tailored to individual traits, Joanne Reed, the director of the Centre for Immunology and Allergy Research at the Westmead Institute in Australia, told me.

These results also mean that adenovirus-based vaccines could be made safer if they can be designed without the protein region that triggered the dangerous antibodies in VITT. “Instead of abandoning an entire vaccine platform because of a rare problem, we can engineer around the specific issue, and that’s the power of this kind of science,” Joann Arce of the Precision Vaccines Program at Boston Children’s Hospital told me. The hope is that understanding the biology of a rare event like VITT, and then addressing it, helps bolster public trust in vaccines too. Greinacher told me that adenovirus-based vaccines remain vital, including for the development of vaccines for diseases that affect mostly low- and middle-income countries. The shots could also be useful in a future pandemic, because they can be scaled up in production relatively quickly.

Still, this one study may not have entirely answered the question of why adenovirus-based COVID vaccines caused clotting. A study published last year from Ju’s group suggested that a separate biophysical mechanism might cause a viral component found in the AstraZeneca vaccine to directly aggregate platelets, independent of the immune reaction identified in VITT. And a bigger mystery remains open too—why infections themselves are sometimes associated with dangerous blood clotting. Rushad Pavri, an immunologist at King’s College London, told me that the new study—because it shows how similarities between a virus particle and an innate protein involved in clotting can confuse the immune system—can shed light on that question. Ultimately, understanding why viruses can provoke immune overreactions might help limit damaging complications from sickness to begin with.

In the winter of 168 C.E., the famed Greek physician Galen arrived in Aquileia, an Italian city on the northern edge of the Adriatic. The city had grown large since its founding as a Roman colony, but during the 200-year Pax Romana, its fortifications had been allowed to deteriorate. After an armed group of migrating Germanic peoples had crossed the Danube a year earlier, the Roman co-emperors, Marcus Aurelius and Lucius Verus, had rushed to the city, raising two legions and rebuilding its defenses; they planned to use it as a base of operations against the invaders.

Galen had been summoned, however, to help fight a different kind of invader. A plague, likely an early variant of smallpox, had traveled to Aquileia with the troops, and held the city in its grip. The emperors fled, but Verus succumbed to the disease on the road to Rome. Galen tried to slow the wave of illness, but most of the people in Aquileia perished.

They represented just a sliver of the eventual victims of the Antonine Plague, also known as Galen’s Plague, which killed at least 1 million people throughout the Roman empire. It was possibly the world’s first true pandemic, and haunted the empire for the rest of the Pax Romana, which ended in 180 with Aurelius’s death. The details of the pandemic—the exact pathogen, the true number of victims—are subjects of debate, and might never be fully settled. But some research has cited the Antonine Plague as part of a vicious cycle that hastened Rome’s long fall. Food shortages, internal migrations, and overcrowding had already signaled a slippage in imperial power, and created a fertile environment for disease. The pandemic, in turn, spread panic and left behind mistrust, weakening faith in civic and religious authorities.

Men famously think about Rome every day, and political commentators have been nervously comparing Rome’s fall to a potential American collapse since before America even had a Constitution. But Rome’s example really does merit consideration in light of recent events. One of the better measures of a society’s vitality is its ability to protect its citizens from disease, and the two often move in tandem; a decline in one may produce a reduction in the other.

Infectious disease is probably not an imminent threat to the United States’ survival. Still, after nearly a century of existence, the American public-health apparatus, which has driven some of the most remarkable advances in global longevity and quality of life in human history, is teetering. The country has lost much of its ability to keep microbes from invading its body politic, and progress in life expectancy and other metrics is slowing or even reversing.

It is tempting to lay these changes all at the feet of President Trump and his current health secretary, Robert F. Kennedy Jr., who together have shredded America’s global-health organizations, drawn back public-health funding, fomented vaccine skepticism, and begun to dismantle child-vaccination programs. But the “Make America Healthy Again” moment is in some ways just another step in the long retreat of the civic trust and communitarian spirit that have enabled America’s disease-fighting efforts. If this retreat continues, the public-health era—the century-long period of unprecedented epidemiological safety that has been the foundation for so many other breakthroughs—will come to an end. And that end will have dire consequences for this republic and its future.

In January 2025, a hospital in West Texas began reporting that children were coming in sick with measles. The cases were initially clustered in a Mennonite community, where vaccination rates had been low in recent decades. But soon the outbreak spread around the state, and to others; the reported number of cases reached more than 1,800 by the year’s end. As of this writing, the outbreak is still ongoing, and America is in danger of having its measles-elimination status revoked by the World Health Organization.

On August 8, as the measles outbreak continued to make headlines, a man named Patrick Joseph White entered a CVS in northeast Atlanta and fired hundreds of rounds from a rifle into the CDC’s headquarters across the street. According to Georgia investigators, White had been suicidal, and believed that COVID‑19 vaccines were part of a conspiracy to sicken him and other Americans.

These were but two signs among many that something has broken within the systems that protect the population’s health. Despite all of our advantages, the coronavirus pandemic caused more confirmed deaths per capita in the United States than in any other Western country, and our mortality rate’s recovery has lagged behind others’. Life expectancy in the U.S. is lower than in other high-income nations, and the gulf is widening.

America is unique, and comparisons are difficult. The country easily outpaces the rest of the developed world in gun deaths and overdoses, both major mortality drivers here that have largely been accepted as the cost of being American. But even if you discount those peculiarities, plenty of other indicators are pointing the wrong way. Foodborne illnesses appear to be on the rise, including regular surges of norovirus. Deteriorating water-delivery and sewage systems have contributed to a growing number of outbreaks of Legionella. Cases of tetanus, whooping cough, and hepatitis A have also risen in recent years.

Many problems contribute to these shifts—insufficient investments in infrastructure, budget cuts in state and local health departments, the growing drug resistance of bacteria. Yet underlying all of the outbreaks, and even gun and opioid deaths, is a common theme: a declining sense of mutual responsibility among Americans. If the population could be analogized to a single human body, then its immune system would rely on a concert of action and purpose among its cells. When that concert stops, the body dies.

In 1946, the year the U.S. Public Health Service founded its Communicable Disease Center, American life expectancy at birth was about 66 years. Malaria was rampant in the South, and fever diseases, tuberculosis, syphilis, and polio killed tens of thousands of Americans annually. Thirty-four out of every 1,000 children born in 1946 were expected to die before their first birthday, many from communicable diseases. America was moving toward modernity, but the risks people faced were of a different order than they are today.

The CDC (since renamed the Centers for Disease Control and Prevention) inherited much of its early mandate from a U.S. military campaign to control infectious diseases among soldiers fighting in World War II. The scale of the war effort had necessitated the creation of a health infrastructure on American soil—spraying for mosquitoes near the front lines in the Pacific wouldn’t mean anything if soldiers caught malaria at home before deployment. Responses to outbreaks near bases needed to be big and fast enough to account for car travel beyond military jurisdictions. When the CDC took over, it extended this paradigm—of coordination across long distances and disparate communities—to the civilian population.

[From the June 2020 issue: Vann R. Newkirk II on how America handles catastrophe]

The same year the CDC was created, the influenza vaccine reached the public, and international organizations, supported by the U.S., began a global push to eliminate tuberculosis. The agency worked to promote mass vaccination. It began a national disease-surveillance program, and shared intelligence with cash-strapped county health departments and state agencies. Wartime campaigns to coax and chide Americans into doing their part to conserve resources and volunteer for the war effort translated easily into pushes for vaccination and sanitation.

Before 1946, conquering disease would have seemed as much a subject of science fiction as putting a man on the moon. But since 1950, global life expectancy has risen by four years each decade. Smallpox has been eradicated, and polio and malaria cases have dramatically fallen. Within the past 80 years, there have perhaps been more significant advances in human health than there were in the previous 300,000.

On the home front, several generations have grown up on an American mainland without malaria, yellow fever, or typhoid fever; diseases like dysentery are medical rarities. Measles and polio, once routine scourges of childhood, were pushed back by millions of vaccinations. Life expectancy increased by more than a decade, to 78 in 2023. This was a public-health revolution, on equal footing with any of the great agricultural, industrial, or information revolutions that have punctuated the past few centuries.

Those other great revolutions are often considered to be the result of technological advances—the plow, steam power, fertilizers, the internet. And certainly, the development of vaccines, antibiotics, and other medicines has played a tremendous role in the advance of human health. But vaccines for smallpox and some other diseases had been around for at least half a century before the 1940s, and had failed to create widespread immunity. The real public-health revolution was first and foremost a change in the way people thought about themselves and their relationship to one another.

Epidemiology made a new kind of thinking necessary. Pathogens respect neither individuals nor borders. Vaccinations and other preventatives against ever-evolving germs do not on their own guarantee personal safety—only eradication can do that. And eradication, it came to be understood, can be achieved only through local and global cooperation.

In America, where capitalist and individualist ethics have always predominated, public health nonetheless managed to carve out a large cooperative space. Before the 1940s, the United States was still reporting a relatively high number of smallpox cases compared with other similarly industrialized nations; it achieved total elimination in 1949. At the insistence of a growing public-health apparatus, it became common practice to wash our hands, to cover our mouths, to not smoke indoors, and to get tested—not just for our own benefit, but for the sake of the people around us. Parents waited in long lines to have their children inoculated, and enterprising physicians went to rural clinics to reach the last isolated clusters of unvaccinated people.

That is not to say America’s particular system of public health was ever perfect. Owing partly to the legacy of segregation, the country never developed a universal health-insurance program, and maintains a fragmented health-care system in which both class and race still dictate much of a patient’s access to care. Many people on the margins who have wanted to get screened for certain diseases or vaccinated against them have not been able to do so, because they cannot afford to or because no doctor will serve them.

And yet, sometimes through the insistence of those same people that America live up to the tenets of public health, the system has come closer to the ideal. As much as any other institution—schools, libraries, churches—the public-health system has helped propagate the idea of a commons, often working against historical inertia to curb the excesses of American individualism. That work has always required energy and effort from the people. And so it has always been vulnerable, because that energy and effort could dissipate at any time.

There is ample evidence that this is exactly what is happening. According to the health-policy organization KFF, in the summer of 2025 just 83 percent of parents kept their children up to date on vaccines, down from 90 percent four years earlier. Cases are surging for several of the diseases covered by the national vaccine schedule. Tuberculosis cases are higher than they have been in a dozen years, and meningococcal disease is rising as well. Measles cases have trended upward for years too, even before 2025.

Over the past 50 years, American trust in the medical system has declined, as has trust in government, science, and expertise in general. The coronavirus pandemic exploded those trends, creating the world in which we now find ourselves. Public-health agencies did themselves no favors: They often gave out confusing and sometimes conflicting advice. Conspiracy theories grew quickly on social media, and measures such as masking became subject to partisan polarization. According to Gallup, a bare majority—just 51 percent—of Americans now favors government requirements for vaccines, down from 81 percent in 1991 and 62 percent in 2019. Most of the slippage has been among conservatives, and studies suggest that political ideology is perhaps the biggest predictor of vaccine rejection.

Medicine has kept moving forward, with some truly great results. Deaths in the U.S. from cardiovascular disease are plummeting, and might see further declines with the advance of GLP‑1 drugs. With the advent of better cancer-screening tools, survival rates are improving, and wonder-drug therapeutics for many conditions are now on the market. But personalized care of this sort is expensive, and does not keep us collectively safe from infectious disease.

Meanwhile, as viruses that once killed hundreds of thousands have receded from public memory, they have come to seem less fearsome. Owing to the near eradication of some diseases, there have been few real risks to the heretofore small proportion of people who refuse vaccines. In this landscape, organizations such as the CDC, which once stood as unimpeachable examples of government competence, have become victims of their own success, appearing to skeptics to be inert or irrelevant.

This was the system as Trump and Kennedy found it last year, vulnerable and stripped of the halo of public trust. Kennedy slashed agency budgets and stocked a key vaccine advisory committee with vaccine skeptics, then this past January announced a new set of childhood-vaccine recommendations that excluded coverage for rotavirus, influenza, and hepatitis A; those vaccines now cannot be administered to most patients without a doctor’s consultation.

Kennedy’s biggest threat to public health comes from what he symbolizes. The MAHA movement derides expertise, overemphasizes personal commitment and liberty, and has embraced pseudoscience. This stance, mingled with Trump world’s conspiratorial tendencies, has turned the CDC and other once-trusted institutions into targets. After the August shooting at CDC headquarters, hundreds of current and former Health and Human Services employees singled out Kennedy as a driver of the kind of rhetoric that had motivated Patrick Joseph White, referring to the secretary’s previous insinuations that the CDC itself was hiding information about the risks of COVID vaccines.

Marcus Aurelius, the surviving Roman emperor, is mostly famous in our time because of his Stoicism. His philosophy encouraged the embrace of duty, not because of the expectation of praise or other material benefits but because duty is in itself fulfillment of the human condition. In his Meditations, he offered a maxim: “Do your duty—whether shivering or warm, never mind; heavy-eyed, or with your fill of sleep; in evil report or in good report; dying or with other work in hand.”

It’s hard to psychoanalyze a guy who lived two millennia ago, but it’s easy to believe that this particular admonishment may have come from his time as a plague fighter. In the face of Galen’s “everlasting pestilence,” Marcus had to rally the public and improvise, stocking depleted armies with convicts and ordering the digging of mass graves. He saw that the state was held up not just by the military or territory, but by invisible webs of shared sacrifice and obligation. In the end, the fortifications that mattered most were those that strengthened Rome against the invaders that could not be seen.

If the American state disintegrates, future postmortems are unlikely to focus much on measles, or on rotavirus vaccination rates. But the ability to beat back our more routine pathological menaces is a good indicator of the country’s ability to take on bigger, more virulent threats. The thing about bacteria and viruses, our most ancient foes, is that they are always at the gates, waiting for lean times. Among them will be pathogens worse than the coronavirus.

In the main, the withering of public health might not anticipate a future apocalypse so much as it recalls a previous America, one where lives were cheaper and shorter, where good health was the province of a privileged few, and where epidemics regularly scoured the countryside and the city slums. What’s spurring the slide now isn’t a dearth of information or cutting-edge medicine. Rather, the precepts of a shared reality have been shattered, and with them the ability to act for a common cause.


This article appears in the March 2026 print edition with the headline “How America Got So Sick.”

Americans watching the Super Bowl today will hear a tantalizing pitch: You, too, can live like the rich. All you need is a way to order up health care the way they do.

A new ad scheduled to air during the big game, from the telehealth company Hims & Hers, opens with the provocation “Rich people live longer.” A dizzying montage spoofing America’s wealthiest and most wellness-obsessed characters follows: An ageless man bathes in red light in the manner of Bryan Johnson, the multimillionaire who rose to fame for his fanatical efforts to live forever. A bald man doffs a cowboy hat as a rocket blasts off—clearly an allusion to Jeff Bezos, who in recent years has become incredibly buff and backed a buzzy age-reversal start-up. Surgeons pull the sagging face of a woman taut as a drum, perhaps a reference to the Kardashian matriarch Kris Jenner’s notorious 2025 facelift. Dan Kenger, the chief design officer at Hims & Hers, told me in an email that the actors in the commercial represent only “symbols of an intimidating, members-only healthcare culture.” But the parallels to real life are too obvious to ignore. “They get the best of everything,” the voiceover continues. “So why don’t you?”

Hims & Hers is best known for offering weight-loss and hair-restoration treatments, but the ad positions the company as a gateway to a different world of health care—one with concierge doctors who are available on demand to offer personalized, cutting-edge therapies. “America’s healthcare is a tale of two systems: one elite, proactive tier for the wealthy, and a broken, reactive one for everyone else,” Kenger said. Hims & Hers’ actual offerings, which include diagnostic blood testing and hormone therapy, may be more mundane than a facelift, but the ad emphasizes one thing they all share: They’re all available to you, for a price.

Telehealth platforms such as Hims & Hers offer patients the ability to pay out of pocket to quickly seek treatment for their ailments. Generally, they assert that patients must consult virtually with a medical provider before receiving a prescription, but this requirement isn’t featured prominently in most ads. The new spot from Hims & Hers notes that its doctors are around to chat, and—in the fine print—mentions that a consultation with a provider is required before receiving treatment. But unless you’re squinting at the screen, the message that comes across is that health care is primarily a transaction between patient and company.

In general, telehealth platforms have come to essentially function as a kind of Amazon for drugs, making it easier for people to get prescriptions for controlled substances such as Adderall and testosterone. The companies sidestep the high cost associated with many name-brand drugs by using compounding pharmacies to offer cheaper, nearly identical versions of the medications. In theory, a licensed provider writes the prescription after a thorough medical assessment. But in just the past two years, two different telehealth practices have been sanctioned for inappropriately prescribing stimulants. (Hims & Hers has not faced any federal investigations for its prescribing practices.)

Last year, Hims & Hers ran a Super Bowl ad similarly focused on democratizing care—specifically GLP-1s for weight loss. It asserted that America’s obesity epidemic is caused by “the system,” which keeps people sick by making health care unaffordable. When name-brand GLP-1s such as Wegovy first started being prescribed for weight loss, in 2021, they were inaccessible for most Americans because of their steep cost and insurance providers’ limited coverage. Rumors that celebrities were using the drugs for cosmetic reasons sparked a mix of outrage and envy, and as Americans of more humble means began to see the results the drugs could bring about, many began scrambling for cheaper sources—a perfect opportunity for Hims & Hers and its peers.

[Read: What is Hims actually selling?]

But Americans aren’t clamoring for just GLP-1s anymore. Medications such as testosterone and peptides—a largely untested class of drugs with a wide range of purported health benefits—are in high demand too. Like GLP-1s, they have been touted by wealthy, high-profile figures, including Gwyneth Paltrow, Joe Rogan, and Health Secretary Robert F. Kennedy Jr. Hims & Hers has positioned itself to meet these demands. Last year, the company launched its low-testosterone treatment program and bought a peptide-manufacturing facility. It plans to use that factory to make its own “personalized” drugs.

Such language evokes the concierge treatments enjoyed by the rich. But in telehealth-platform parlance, personalization usually refers to creating a version of a name-brand medication that fits a patient’s needs by, say, changing the dose, adding other active ingredients, or offering it in a different format. These compounded versions aren’t reviewed by the FDA—in fact, the FDA announced Friday that it had asked the Justice Department, out of concern for consumer safety, to investigate Hims & Hers for selling compounded GLP-1 pills. The move is part of the DOJ’s wider plan to take “decisive steps” to restrict the marketing of compounded GLP-1s.

The medical community is concerned that telehealth platforms too easily allow patients to take unsafe drugs—or drugs they simply don’t need. The Obesity Society, a scientific organization, warns people not to use compounded GLP-1s, because they may not contain the appropriate active ingredients. (As I wrote recently, men’s-health experts are also worried about the cardiovascular and reproductive consequences of unnecessary testosterone-replacement therapy.) When I asked Hims & Hers about the DOJ investigation, a spokesperson directed me to a public statement posted yesterday on X, in which the company said it would stop selling compounded GLP-1 pills. “We remain committed to the millions of Americans who depend on us for access to safe, affordable, and personalized care,” the statement said.

[Read: T-maxxing has gone too far]

The Hims & Hers ad is shrewd in its heavy-handedness. It validates Americans’ frustration with health care and positions the company as a deliverer of justice. (The ad is narrated by the rapper Common, known for his socially conscious music; in its Super Bowl ad last year, the company used Childish Gambino’s “This Is America,” a song about systemic racism and gun violence, to pile onto its message of a broken system.) Pew Research Center data released a few days ago show that 71 percent of Americans are worried about the cost of health care.

At the same time, Americans can’t seem to stop imitating billionaires’ wellness habits: untested peptides, NAD+ IV drips, Erewhon smoothies, red-light masks, keto diets. A slew of other planned Super Bowl ads promoting new visions of health illustrates just how central the pursuit of wellness has become to American life and the American economy, even when the benefits of so many of these practices are questionable at best. Hims & Hers is accurate in its diagnosis that much of this country’s gap in health and longevity boils down to wealth. But the treatment it prescribes seems to be an unlikely cure.