Every day, the blogger Alex Lyons orders the same salad from the same New York City bodega and eats it in the same place: her desk. She eats it while working so that she can publish a story before “prime time”—the midday lunch window when her audience of office workers scrolls mindlessly on their computers while gobbling down their own salad. Lyons is the protagonist of Sad Desk Salad, the 2012 novel by Jessica Grose that gave a name to not just a type of meal but a common experience: attempting to simultaneously maximize both health and productivity because—and this is the sad part—there’s never enough time to devote to either.

The sad desk salad has become synonymous with people like Lyons: young, overworked white-collar professionals contemplating how salad can help them self-optimize. Chains such as Sweetgreen and Chopt have thrived in big coastal cities, slinging “guacamole greens” and “spicy Sonoma Caesars” in to-go bowls that can be picked up between meetings. The prices can creep toward $20, reinforcing their fancy reputation.

But fast salad has gone mainstream. Sweetgreen and similar salad chains have expanded out of city centers into the suburbs, where they are reaching a whole new population of hungry workers. Other salad joints are selling salad faster than ever—in some cases, at fast-food prices. Along the way, the sad desk salad has become even sadder.

Anything can make for a sad desk lunch, but there’s something unique about salads. Don’t get me wrong: They can be delicious. I have spent embarrassing amounts of money on sad desk salads, including one I picked at while writing this article. But whereas, say, a burrito or sushi at least feels like a little indulgence, the main reason to eat a salad is that it’s nutritious. It’s fuel—not fun. Even when there isn’t time for a lunch break, there is always time for arugula.

[Read: Don’t believe the salad millionaire]

During the early pandemic, the sad desk salad seemed doomed. Workers sitting at a desk at home rather than in the office could fish out greens from the refrigerator crisper drawer instead of paying $16. And even if they had wanted to buy one, most salad-chain locations were in downtown cores, not residential neighborhoods.

But the sad desk salad has not just returned—it’s thriving. Take Sweetgreen, maybe the best-known purveyor. It bet that Americans would still want its salads no matter where they were working, and so far, that bet has paid off. The company has been expanding into the suburbs since at least 2020. In 2023, it opened stores in Milwaukee, Tampa, and Rhode Island; last week, when Sweetgreen reported that its revenue jumped 26 percent over the previous year, executives attributed that growth to expansion into smaller cities. Most of its locations are in the suburbs, and most of its future stores will be too.

Sweetgreen is not the only company to have made that gamble. Chopt previously announced that it would open 80 percent of its new stores in the suburbs; the Minnesota-based brand Crisp & Green is eyeing the fringes of midwestern cities. Salad has become so entrenched as a lunch option that even traditional fast-food giants such as Wendy’s and Dairy Queen have introduced salad bowls in recent years. Maybe the most novel of all is Salad and Go, an entirely drive-through chain that sells salads for less than $7. It opened a new store roughly every week last year, and now has more than 100 locations across Arizona, Nevada, Oklahoma, and Texas, with plans to expand to Southern California and the Southeast. Its CEO, Charlie Morrison, has positioned it as a cheap and convenient alternative to unhealthy options: a rival not to Sweetgreen, but to McDonald’s.

Indeed, sad desk salads can be made with shocking speed. According to Morrison, you can drive off with your salad in less than four minutes. Other chains, including Just Salad and Chopt, are opening up drive-through lanes to boost convenience. Sweetgreen, which has also dabbled with the drive-through, has installed salad-assembling robots in several locations; the machines can reportedly make 500 salads an hour.

[Read: Your fast food is already automated]

Greater access to salad is, in general, a good thing. America could stand to eat a lot more of it. No doubt some salads will be consumed outside of work: on a park bench with friends, perhaps, or on a blanket at the beach—a girl can dream! But surely many of them will be packed, ordered, and picked up with frightening speed, only to maximize the time spent working in the glow of a computer screen, the crunching of lettuce punctuated by the chirping of notifications.

As I lunched on kale and Brussels sprouts while writing this story, my silent hope was that they might offset all the harm I was doing to my body by sitting at my desk for almost eight hours straight. Dining while distracted makes overeating more likely; sitting for long stretches raises the risk of diabetes and heart disease. People who take proper lunch breaks, in contrast, have improved mental health, less burnout, and more energy. No kind of cheap, fast salad can make up for working so feverishly that taking a few minutes off to enjoy a salad is not possible or even desirable.

Earlier this month, Sweetgreen introduced a new menu item you can add to its bowls: steak. The company’s CEO said that, during testing, it was a “dinnertime favorite.” That the sad desk salad could soon creep into other mealtimes may be the saddest thing yet.

In 2012, the BBC aired a documentary that pushed diet culture to a new extreme. For Eat, Fast, and Live Longer, the British journalist Michael Mosley experimented with eating normally for five days each week and then dramatically less for two, usually having only breakfast. After five weeks, he’d lost more than 14 pounds, and his cholesterol and blood-sugar levels had significantly improved. The documentary, and the international best-selling book that followed, set the stage for the next great fad diet: intermittent fasting.

Unlike the Atkins and grapefruit diets before it, intermittent fasting has become far more than just a fad. It remains popular more than a decade later: By one count, 12 percent of Americans practiced it last year. Intermittent fasting has piqued the interest of Silicon Valley bros, college kids, and older people alike, and for reasons that go beyond weight loss: The diet is used to help control blood sugar and is held up as a productivity hack because of its purported effects on cognitive performance, energy levels, and mood.

But it still isn’t clear whether intermittent fasting leads to lasting weight loss, let alone any of the other supposed benefits. What sets intermittent fasting apart from other diets is not the evidence but its grueling nature: It requires people to forgo eating for many hours at a stretch. Fasting “seems so extreme that it’s got to work,” Janet Chrzan, a nutritional anthropologist at the University of Pennsylvania and a co-author of Anxious Eaters: Why We Fall for Fad Diets, told me. Perhaps the regimen persists not in spite of its difficulty, but because of it.

[Read: What billionaires’ fasting diets mean for the rest of us]

Intermittent fasting comes in lots of different forms, which vary in their intensity. The “5:2” version popularized by Mosley involves eating normally for five days a week and consuming only about 600 calories a day on the other two. Another popular regimen, known as “16/8,” restricts eating to an eight-hour window each day. One of the most extreme is a form of alternate-day fasting that entails full abstinence every other day. Regardless of its specific flavor, intermittent fasting has some clear upsides compared with other fad diets, such as Atkins, keto, and Whole30. Rather than a byzantine set of instructions—eat these foods; avoid those—it comes with few rules, and sometimes just one: Don’t eat at this time. Other diets can be expensive; intermittent fasting costs nothing and requires no special foods or supplements.

Conventional fad diets are hard because constantly making healthy food choices to lose weight is “almost impossible,” Evan Forman, a psychology professor at Drexel University who specializes in health behavior, told me. That’s why intermittent fasting, which removes the pressure to make decisions about what to eat, can “actually be reasonably successful,” he said. Indeed, some studies show that intermittent fasting can lead to weight loss after several months, with comparable results to a calorie-counting diet.

But lots of diets lead to short-term success, and people tend to gain back the weight they lost. Studies on intermittent fasting tend to last only a few months, too short to show whether the weight stays off. In one recent study, which tracked patients over six years, intermittent fasting wasn’t linked with lasting weight loss.

Whether other benefits can be attributed to fasting remains unclear. Assertions that it might improve insulin sensitivity and cardiovascular health, or help with obesity, Alzheimer’s, and Parkinson’s disease, are largely based on preclinical and animal studies, according to a 2019 review in The New England Journal of Medicine. Although its authors argue that fasting is broadly favorable, they concluded that whether people who fast over a span of years can ever accrue the health benefits seen in animals remains to be determined.

Incomplete evidence is typical for dieting fads, which tend to come and go pretty quickly in a way that intermittent fasting hasn’t. (Does anyone remember the Special K and Zone diets? Exactly.) What really sets the practice apart is how hard it is. Skipping meals can send a person into a tailspin; willfully avoiding food for hours or even days on end can feel like torture. The gnawing hunger, crankiness, and reduced concentration associated with fasting usually take at least a month to dissipate.

That may be why intermittent fasting is hard for America to shake. When it comes to diets, “the more extreme they are, the more they are perceived to be extremely efficacious,” Chrzan said. The self-discipline required to persevere through a fast is commonly glorified. Abstaining from food for 36 hours, a particularly intense form of the diet, is known as the “monk fast”; one intermittent-fasting app is called Hero.

That pushing your body to the limit has benefits isn’t a new idea. In 1900, the American physician Edward Hooker Dewey published The No Breakfast Plan and the Fasting-Cure, which promoted fasting as a virtuous act that could remedy physical and mental ailments. The extreme self-sacrifice and resulting moral fortitude that fasting wrought, Dewey believed, would turn people into “better, stronger men and women.” Even before that, there was the era of “heroic medicine,” which held that a treatment had to match the severity of the illness. That’s how you get bloodletting, purging, and leeching. Such a harsh approach to health is still a part of “how we think about medicine,” Chrzan said. “It has to be hard, because perfecting the self is a worthy goal.”

For that reason, more research into intermittent fasting may not matter much for its popularity. In fact, some adherents of the diet don’t seem to care about the mixed evidence, Kima Cargill, a clinical-psychology professor at the University of Washington at Tacoma and the other co-author of Anxious Eaters, told me. Consciously or not, maybe the point of intermittent fasting isn’t health, but something else entirely. Maybe dieters see results, or maybe they don’t. In either case, surviving a period of fasting is a test of fortitude, proof that the mind can overcome the body. “It’s not all about deprivation,” Forman said. “Part of it is about this chase for optimization”—a kind of bodily transcendence. According to Cargill, fasting gives people a way “to feel structured and contained.” If these are the unspoken reasons people practice intermittent fasting, it’s no wonder that the diet has proved to be so much more than a fad.

When you consider the food environment that Americans have to contend with, the appeal of a program as drastic as intermittent fasting makes sense. Amid a glut of ultra-processed options, constant invitations to snack, and general bewilderment about what anyone is actually supposed to eat, “people feel so out of control about their diet that extreme ideas just have a lot more traction,” Chrzan said. Intermittent fasting offers a simple rule to eat by—and a means to rise above the chaos.

It used to be that whenever someone on TV or in a movie fell off the roof or had a skiing mishap or got into any sort of auto accident, the odds were pretty good that they’d end up in a neck brace. You know what I mean: a circlet of beige foam, or else a rigid ring of plastic, spanning from an actor’s chin down to their sternum. Jack Lemmon wore a neck brace for a part. So did Jerry Seinfeld, Julia Roberts, and Bill Murray. For many decades, this was pop culture’s universal symbol for I’ve hurt myself.

Now it’s not. People on TV and in the movies no longer seem to suffer like they used to, which is to say they no longer suffer cervically. Plastic braces do still crop up from time to time on-screen, but their use in sight gags is as good as dead. In the meantime, the soft-foam collar—which has always been the brace’s most recognizable form—has been retired. I don’t just mean that it’s been evicted from the props department; the collar has been set aside in clinics too. At some point in the past few decades, a device that once stood in for trauma and recovery was added to a list of bygone treatments, alongside leeches and the iron lung. Simply put, the collar vanished. Where’d it go?

The story naturally begins in doctors’ offices, where a new form of injury—“whiplash”—started to emerge amid the growing car culture of the 1940s and the early ’50s. “It is not difficult for anyone who travels on a highway to realize why the ‘painful neck’ is being produced daily in large numbers,” two Pennsylvania doctors wrote in 1955. Following a rear-end collision, a driver’s body will be thrown forward and upward, they explained. The driver’s neck will flex in both directions, “like a car radio aerial.”

The damage from this jerking to and fro could not necessarily be seen in any medical scan. It was understood to be more of a sprain than a fracture, causing pain and stiffness in the neck that might spread into the shoulder. Many patients found these problems faded quickly, but for some of them—maybe even half—the discomfort lingered. Whiplash in its graver forms led to dizzy spells, sensory disturbances, and cognitive decline (all of which are also signs of mild traumatic brain injuries). And it could leave its victims in a lasting state of disability—chronic whiplash, doctors called it—characterized by fatigue, memory problems, and headache.

[Read: Chronic whiplash is a mystery]

From the start, standard whiplash treatment would include the wearing of a soft appliance: a foam collar to support the patient’s head and stifle excess movement. But the underlying problem had a squishiness about it too. If the damage to the neck was invisible to imaging, how was it causing so much misery? Some doctors guessed that the deeper, more persistent wounds of whiplash might be psychic. A paper on the problem published in 1953, in the Journal of the American Medical Association, suggested that the chronic form of whiplash might best be understood as neurosis—a “disturbing emotional reaction” to an accident that produces lasting ailments. These early whiplash doctors didn’t claim that their patients were malingering; rather, they argued that the underlying sources of anguish were varied. Whiplash might comprise, in various proportions, damage to the ligaments and muscles, brain concussion, and psychological distress. Doctors worried that these different etiologies were hard to tease apart, especially in a legal context, when “the complicating factor of monetary compensation,” as one study put it, was in play. (These uncertainties persist, in one form or another, to this day.)

A clinical unease colored how the neck brace would be seen and understood by members of the public. For about as long as it was used for treating whiplash, the collar held opposing meanings: Someone had an injury, and also that injury was fake. In The Fortune Cookie, the Billy Wilder comedy from 1966, a cameraman (played by Lemmon) gets knocked over at a football game and then persuaded by his sleazy lawyer—a guy called “Whiplash Willie”—to pretend he’s gravely hurt. They’re planning to defraud the big insurance companies, and Lemmon’s plastic neck brace will be central to the act.

Indeed, the stock setting for the collar, soft and hard alike, has always been the courtroom. When Carol Brady finds herself before a judge in an episode of The Brady Bunch from 1972, the “victim” of her fender bender, Mr. Duggan, hobbles into court with an ostentatious you-know-what. “A neck brace—do you believe that?” she asks. Of course you don’t; that’s the point. Mr. Duggan tells the judge that he’s just come from the doctor’s office, and that he has whiplash. (He puts the stress on the word’s second syllable: whipLASH. The condition was still new enough, back then, that its pronunciation hadn’t fully settled.)

[Read: No one in movies knows how to swallow a pill]

Concerns about unfounded civil suits multiplied in the ’70s and ’80s, thanks in part to what the law professor Marc Galanter would later term the “elite folklore” of seemingly outrageous legal claims, stripped of context and diffused throughout the culture by mass media. There was the woman who said she’d lost her psychic powers after getting a CT scan, the worker at a convenience store who complained that she’d hurt her back while opening a pickle jar, the senior citizen who sued McDonald’s after spilling coffee in her lap. And then of course there was the granddaddy of them all: the whiplash faker in a neck brace—the Mr. Duggan type, familiar from the screen.

Car-insurance premiums were going up, and companies were pointing to exaggerated whiplash claims from drivers whose “soft injuries” could not be verified objectively. Financial motives did appear to be in play for certain plaintiffs: In Saskatchewan, where a no-fault system of insurance had been introduced and most lawsuits for pain and suffering were eliminated, the number of whiplash-based insurance claims appeared to drop. (Similar correlations have been observed in other countries too.) In the early 1990s, the New Jersey Insurance Department even staged a series of minor accidents involving buses wired up with hidden cameras—they’d be rear-ended by a slowly moving car—to test the prevalence of fraud. The department’s investigators found that Whiplash Willie–style lawyers quickly swooped in on passengers to cajole them into making claims of damage to their neck and back.

By this time, the neck brace’s mere appearance in a movie or TV show would be enough to generate a laugh. It just seemed so silly and so fake! In the courtroom, insurance companies and other businesses grew less inclined to settle whiplash cases, Valerie Hans, a psychologist and law professor at Cornell, told me. Instead they’d try their luck, and mostly find success, in jury trials. To find out why, Hans and a colleague did a formal survey of potential jurors’ attitudes about such injuries in 1999, and found that the presence of a neck brace on a plaintiff might only make them more suspicious. Fewer than one-third believed that whiplash injuries were “usually” or “always” legitimate.

[Read: Whatever happened to carpal tunnel syndrome?]

If the soft neck brace was already well established as a joke on television and a liability in court, the medical establishment soon turned against it too. A series of randomized controlled trials of whiplash treatments, conducted in the 1990s and 2000s, all arrived at the same conclusion: Use of the soft foam collar was “ineffective at best,” as one evidence review from 2010 described it. At worst, it could be doing harm by preventing patients from engaging in the mobility and exercise programs that seemed more beneficial.

A broader shift away from telling patients to keep still, and toward assigning active interventions, was under way in medicine. Bed rest and other forms of immobilization were falling out of favor in the treatment of back injuries, for example. Concussion doctors, too, began to wonder whether the standard guidance for patients to do nothing was really such a good idea. (The evidence suggested otherwise.) And uncertainty was even spreading to the other kinds of cervical orthoses, such as the stiff devices made of foam and plastic called trauma collars, which remain in widespread use by EMTs. These are meant to immobilize a patient’s neck, to help ensure that any damage to their upper spine will not be worsened. But their rationale was being questioned too.

In 2014, a team of doctors based in Norway, led by the neurosurgeon Terje Sundstrøm, published a “critical review” of trauma-collar use. “For many years, the cervical collar was the symbol of good health care, or good pre-hospital care,” Sundstrøm told me. “If the patient wasn’t fitted with one, then you didn’t know what you were doing.” But he described the evidence of their benefits as “very poor.” His paper notes that at least 50 patients have their necks immobilized for every one who has a major spinal injury. Trauma collars can interfere with patients’ breathing, according to some research, and their use has been associated with overtreatment. They’re also quite uncomfortable, which may agitate some patients, who could then make just the sorts of movements that the EMTs are, in theory, trying to prevent.

In short, despite trauma collars’ near-universal use since the 1960s, no one really knows how much they help, or whether they might even hurt. Sundstrøm said that his own health-care system gave up on using trauma collars a dozen years ago, and has yet to see a single injury as a result. Official guidelines for the emergency use of cervical braces have lately been revisited in a small handful of countries, but Sundstrøm does not expect major changes to take hold. “I don’t think there will ever be really good studies for or against collars like this,” he said, in part because cervical spinal injuries are very, very rare. For the same reason, we may never even know for sure whether collars are appropriate for patients whose cervical fractures have been confirmed in the hospital. “There hasn’t really been any interest in this research topic either,” he told me. Instead, doctors just rely on common sense about which interventions are likely to be helpful.

So the use of rigid trauma collars is likely to persist regardless of uncertainty. In health care, that’s more the norm than the exception. Research is difficult, the human body is complex, and tradition rules the day. Lots of standard interventions, maybe even most of them, aren’t fully known to do much good. Viewed against this backdrop, the soft foam collar—rarely useful, always doubted, often mocked—may finally have flipped its meaning. For years it stood for fakery and false impressions and also, ironically, for a lack of proper evidence in medicine—for a failure of support. Now it may signify the opposite. By disappearing from the movies, the courtroom, and the clinic, this form of neck brace has become a rare example of a lesson duly learned. It shows that science can correct itself, every now and then. It shows that progress may be slow, but it is real.

Earlier today, The New York Times broke some startling news about a presidential candidate. According to a 2012 deposition, Robert F. Kennedy Jr. once suffered from, in his own words, “a worm that got into my brain and ate a portion of it and then died.” The vague yet alarming description could apply to any number of parasitic ailments, among them angiostrongyliasis, baylisascariasis, toxocariasis, strongyloidiasis, and trichinosis. But some experts immediately suspected a condition called neurocysticercosis (NCC), in which the larvae of the pork tapeworm Taenia solium post up in the brain.

The condition might sound terrifying—and, to some observers, darkly hilarious. Literal brain worms! But it does not actually involve any brain-munching, or even your standard-issue worm. The brain-invading culprit is instead a tapeworm (a kind of helminth) that typically makes its home in pigs. As far as parasitic infections go, this is “the most common one in the brain,” Laila Woc-Colburn, an infectious-disease physician at Emory University, told me. And globally, it’s one of the most common causes of epilepsy in adults.

NCC typically begins after people have been exposed to feces that contain the eggs of a pork tapeworm, say, while on a pig farm or handling uncooked, contaminated food. After the eggs are swallowed, they hatch into larvae in the gut. Because people aren’t the appropriate host for the young tapeworms, they end up on a fruitless journey, meandering through the body in a desperate attempt to find pig muscle. A common final destination for the larvae is the brain, where they enclose themselves in cysts in the hope of maturing; eventually, unable to complete their life cycle, they die, leaving behind little more than a calcified nub.

[Read: Flatworms are metal]

This is, to put it scientifically, some pretty gnarly stuff. But many cases are “completely asymptomatic,” Boghuma Kabisen Titanji, also an infectious-disease physician at Emory University, told me. In other people, though—especially those with a lot of larval cysts—the presence of the foreign invaders can spark a wave of inflammation, which in turn triggers swelling and tissue destruction. Individuals with cysts in their brain may develop headaches or seizures, though those problems can take years or even decades to manifest, Titanji said.

Experts estimate that millions of people may be afflicted with NCC worldwide, most of them concentrated in Latin America, sub-Saharan Africa, East Asia, and India. In the U.S., though, NCC is rather rare, with just a few thousand diagnoses made each year, many of them related to travel or immigration. “This is a disease of poverty,” Woc-Colburn told me. Which would make the multimillionaire Kennedy—if he had the infection at all—“an atypical patient.”

There is, at least, some comforting news. NCC is pretty easily preventable with solid hand-washing habits. And in the U.S., where CT scans are fairly accessible, “it can be diagnosed very easily,” Woc-Colburn said, particularly once doctors have a good sense of a patient’s exposure history. Doctors generally know to look for it in patients who come in with headaches and seizures. (Kennedy first sought help after experiencing memory loss and mental fogginess, though he recently told the Times that those symptoms have since resolved and that he hadn’t received treatment for the parasite.) The infection is also treatable with standard antiparasitics. And caught early, it isn’t expected to leave lingering damage. In more serious cases, though, years of severe, unmanaged seizures can lead to certain cognitive deficits.

[Read: America’s never-ending battle against flesh-eating worms]

None of this is to say that Kennedy definitely had NCC. All the public knows is that, in 2010, he said that he was battling neurological symptoms, and that an unusual blemish appeared on a brain scan. (The memory loss and mental fogginess may very well have been attributable to mercury poisoning from Kennedy’s diet at the time, which was high in tuna and perch, according to the same 2012 deposition.) Even if a parasite was definitely to blame, “at least six or seven” others could have ended up in his brain, Titanji told me. Like the pork-tapeworm larvae, several of them would have ended up there accidentally, only to die a quick death without gulping down any brain tissue.

The most comforting news about NCC is that—again—it is uncommon in the United States. Still, now that this news has broken, Woc-Colburn worries that her clinic is going to fill up with people who think they’re afflicted. Given the odds, many of them will be wrong. If anyone’s really worried about their gray matter becoming lunch, they shouldn’t fear worms, but Naegleria fowleri, a rare amoeba that camps out in warm bodies of water. That one, I regret to report, really does eat your brain.

Ozempic or Bust

1

In the early spring of 2020, Barb Herrera taped a signed note to a wall of her bedroom in Orlando, Florida, just above her pillow. NOTICE TO EMS! it said. No Vent! No Intubation! She’d heard that hospitals were overflowing, and that doctors were being forced to choose which COVID patients they would try to save and which to abandon. She wanted to spare them the trouble.

Barb was nearly 60 years old, and weighed about 400 pounds. She has type 2 diabetes, chronic kidney disease, and a host of other health concerns. At the start of the pandemic, she figured she was doomed. When she sent her list of passwords to her kids, who all live far away, they couldn’t help but think the same. “I was in an incredibly dark place,” she told me. “I would have died.”

Until recently, Barb could barely walk—at least not without putting herself at risk of getting yet another fracture in her feet. Moving around the house exhausted her; she showered only every other week. She couldn’t make it to the mailbox on her own. Barb had spent a lifetime dealing with the inconveniences of being, as she puts it, “huge.” But what really scared her—and what embarrassed her, because dread and shame have a way of getting tangled up—were the moments when her little room, about 10 feet wide and not much longer, was less a hideout than a trap. At one point in 2021, she says, she tripped and fell on the way to the toilet. Her housemate and landlord—a high-school friend—was not at home to help, so Barb had to call the paramedics. “It took four guys to get me up,” she said.

Later that year, when Barb finally did get COVID, her case was fairly mild. But she didn’t feel quite right after she recovered: She was having trouble breathing, and there was something off about her heart. Finally, in April 2022, she went to the hospital and her vital signs were taken.

The average body mass index for American adults is 30. Barb’s BMI was around 75. A blood-sugar test showed that her diabetes was not under control—her blood sugar was in the range where she might be at risk of blindness or stroke. And an EKG confirmed that her heart was skipping beats. A cardiac electrophysiologist, Shravan Ambati, came in for a consultation. He said the missed beats could be treated with medication, but he made a mental note of her severe obesity—he’d seen only one or two patients of Barb’s size in his 14-year career. Before he left, he paused to give her some advice. If she didn’t lose weight, he said, “the Barb of five years from now is not going to like you very much at all.” As she remembers it, he crossed his arms and added: “You will either change your life, or you’ll end up in a nursing home.”

“That was it. That was it,” Barb told me. Imagining herself getting old inside a home, “in a row of old people who are fat as hell, just sitting there waiting to die,” she vowed to do everything she could to get well. She would try to change her life. Eventually, like millions of Americans, she would try the new miracle cure. Again.

2

In a way, Barb has never stopped trying to change her life. At 10 years old, she was prescribed amphetamines; at 12, she went to WeightWatchers. Later she would go on liquid diets, and nearly every form of solid diet. She’s been vegan and gluten-free, avoided fat, cut back on carbs, and sworn off processed foods. She’s taken drugs that changed her neurochemistry and gotten surgery to shrink her stomach to the size of a shot glass. She’s gone to food-addiction groups. She’s eaten Lean Cuisines. She’s been an avid swimmer at the Y.

Barb Herrera weighed about 300 pounds by the time she was 30. (Courtesy of Barb Herrera)

Through it all, she’s lost a lot of weight. Really an extraordinary quantity—well more than a quarter ton, if you add it up across her life. But every miracle so far has come with hidden costs: anemia, drug-induced depression, damage to her heart. Always, in the end, the weight has come back. Always, in the end, “success” has left her feeling worse.

In the United States, an estimated 189 million adults are classified as having obesity or being overweight; certainly many millions have, like Barb, spent decades running on a treadmill of solutions, never getting anywhere. The ordinary fixes—the kind that draw on people’s will, and require eating less and moving more—rarely have a large or lasting effect. Indeed, America itself has suffered through a long, maddening history of failed attempts to change its habits on a national scale: a yo-yo diet of well-intentioned treatments, policies, and other social interventions that only ever lead us back to where we started. New rules for eating have been rolled out and then rolled back. Pills have been invented and abandoned. Laws have been rewritten to improve the quality of people’s diets and curb caloric intake—to make society less “obesogenic” on the whole. Efforts have been made to reduce discrimination over body size in employment settings and in health care. Through it all, obesity rates keep going up; the diabetes epidemic keeps worsening.

The most recent miracle, for Barb as well as for the nation, has come in the form of injectable drugs. In early 2021, the Danish pharmaceutical company Novo Nordisk published a clinical trial showing remarkable results for semaglutide, now sold under the trade names Wegovy and Ozempic. Thomas Wadden, a clinical psychologist and obesity researcher at the University of Pennsylvania who has studied weight-loss interventions for more than 40 years (and who has received both research grants and fees from Novo Nordisk), remembers when he first learned about those findings, at an internal meeting at the company the year before. “My jaw just dropped,” he told me. “I really could not believe what we were seeing.” Patients in the study who’d had injections of the drug lost, on average, close to 15 percent of their body weight—more than had ever been achieved with any other drug in a study of that size. Wadden knew immediately that this would be “an incredible revolution in the treatment of obesity.”

[Radio Atlantic: Could Ozempic derail the body-positivity movement?]

Semaglutide is in the class of GLP-1 receptor agonists, chemicals derived from lizard venom that mimic gut hormones and appear to reshape our metabolism and eating behavior for as long as the drugs are taken. Earlier versions were already being used to treat diabetes; then, in 2022, a newer one from Eli Lilly—tirzepatide, sold as Zepbound or Mounjaro—produced an average weight loss of 20 percent in a clinical trial. Many more drugs are now racing through development: survodutide, pemvidutide, retatrutide. (Among specialists, that last one has produced the most excitement: An early trial found an average weight loss of 24 percent in one group of participants.)

The past four decades of American history underline just how much is riding on these drugs—and serve as a sobering reminder that it is impossible to know, in the first few years of any novel intervention, whether its success will last.

The drugs don’t work for everyone. Their major side effects—nausea, vomiting, and diarrhea—can be too intense for many patients. Others don’t end up losing any weight. That’s not to mention all the people who might benefit from treatment but don’t have access to it: For the time being, just 25 percent of private insurers offer the relevant coverage, and the cost of treatment—about $1,000 a month—has been prohibitive for many Americans.

But there’s growing pressure for GLP-1 drugs to be covered without restrictions by Medicare, and subject to price negotiation. Eventually they will start to come off patent. When that happens, usage is likely to explode. The drugs have already been approved not just for people with diabetes or obesity, but for anyone who has a BMI of more than 27 and an associated health condition, such as high blood pressure or cholesterol. By those criteria, more than 140 million American adults already qualify—and if this story goes the way it’s gone for other “risk factor” drugs such as statins and antihypertensives, then the threshold for prescriptions will be lowered over time, inching further toward the weight range we now describe as “normal.”

How you view that prospect will depend on your attitudes about obesity, and your tolerance for risk. The first GLP-1 drug to receive FDA approval, exenatide, has been used as a diabetes treatment for more than 20 years. No long-term harms have been identified—but then again, that drug’s long-term effects have been studied carefully only across a span of seven years. Today, adolescents are injecting newer versions of these drugs, and may continue to do so every week for 50 years or more. What might happen over all that time? Could the drugs produce lasting damage, or end up losing some of their benefit?

Athena Philis-Tsimikas, an endocrinologist who works at Scripps Health in San Diego and whose research has received ample funding from Novo Nordisk and Eli Lilly, says the data so far look very good. “These are now being used, literally, in hundreds of thousands of people across the world,” she told me, and although some studies have suggested that GLP-1 drugs may cause inflammation of the pancreas, or even tumor growth, these concerns have not been borne out. Exenatide, at least, keeps working over many years, and its side effects don’t appear to worsen. Still, we have less to go on with the newer drugs, Philis-Tsimikas said. “All of us, in the back of our minds, always wonder, Will something show up?” Although no serious problems have yet emerged, she said, “you wonder, and you worry.”

The GLP-1 drugs may well represent a shocking breakthrough for the field of public health, on the order of vaccines and sanitation. They could also fizzle out, or end in a surge of tragic, unforeseen results. But in light of what we’ve been through, it’s hard to see what other choices still remain. For 40 years, we’ve tried to curb the spread of obesity and its related ailments, and for 40 years, we’ve failed. We don’t know how to fix the problem. We don’t even understand what’s really causing it. Now, again, we have a new approach. This time around, the fix had better work.

3

Barb’s first weight-loss miracle, and America’s, came during a moment of profound despair. In 1995, while working in a birthing center, she’d tripped on a scale—“the irony of all ironies,” she told me—and cracked her ankle. When she showed up for the surgery that followed, Barb, then 34 and weighing 330 pounds, learned that she had type 2 diabetes. In a way, this felt like her inheritance: Both grandparents on Barb’s father’s side had obesity and diabetes, as did her dad, his brother, and two sisters. Her mother, too, had obesity. Now, despite Barb’s own years of efforts to maintain her health, that legacy had her in its grip.

The doctors threatened Barb (as doctors often have): If she didn’t find a way to eat in moderation, she might not make it through the end of 1997. Then she got some new advice: Yes, Barb should eat better food and exercise, but also maybe she should try a pair of drugs, dexfenfluramine and phentermine, together known as “fen-phen.” The former had just received approval from the FDA, and research showed that a combination of the two, taken several times a day, was highly effective at reducing weight.

[Read: The weight-loss-drug revolution is a miracle—and a menace]

The treatment was a revelation. Even when she talks about it now, Barb begins to cry. She’d tried so many diets in the past, and made so little progress, but as soon as she started on the weight-loss medication, something changed. A low and steady hum that she’d experienced ever since she was a kid—Where can I eat? How can I eat? When can I eat?—disappeared, leaving her in a strange new state of quiet. “The fen-phen turned that off just within a day. It was gone,” she told me, struggling to get out the words. “What it did was tell me that I’m not crazy, that it really wasn’t me.”

At the time, Wadden, the obesity researcher and clinician, was hearing similar reports from his patients, who started telling him that their relationship with food had been transformed, that suddenly they were free of constant cravings. Over the course of a small, year-long study of the drugs that Wadden ran with a colleague at Penn, Robert Berkowitz, participants lost about 14 percent of their body weight on average. That’s the same level of success that would be seen for semaglutide several decades later. “Bob and I really were high-fiving each other,” Wadden told me. “We were feeling like, God, we’ve got a cure for obesity.”

The fen-phen revolution arrived at a crucial turning point for Wadden’s field, and indeed for his career. By then he’d spent almost 15 years at the leading edge of research into dietary interventions, seeing how much weight a person might lose through careful cutting of their calories. But that sort of diet science—and the diet culture that it helped support—had lately come into a state of ruin. Americans were fatter than they’d ever been, and they were giving up on losing weight. According to one industry group, the total number of dieters in the country declined by more than 25 percent from 1986 to 1991.

In 1988, Oprah Winfrey brought a wagon of fat on air to represent the 67 pounds she’d lost using a liquid diet. (Associated Press)

“I’ll never diet again,” Oprah Winfrey had announced on her TV show at the end of 1990. Not long before, she’d kicked off a major trend by talking up her own success with a brand of weight-loss shakes called Optifast. But Winfrey’s slimmer figure had been fleeting, and now the $33 billion diet industry was under scrutiny for making bogus scientific claims.

Rejecting diet culture became something of a feminist cause. “A growing number of women are joining in an anti-diet movement,” The New York Times reported in 1992. “They are forming support groups and ceasing to diet with a resolve similar to that of secretaries who 20 years ago stopped getting coffee for their bosses. Others have smashed their bathroom scales with the abandon that some women in the 1960’s burned their bras.”

That same Times story included a quote from Wadden, who cautioned that these changing attitudes might end up being “dangerous.” But Wadden’s own views of dieting were also changing. His prior research showed that patients could lose up to one-fifth of their body weight by going on very strict diets that allowed for no more than 800 calories a day. He’d found, though, that it was difficult for his patients to maintain that loss for long once the formal program was over. Now Wadden and other obesity researchers were reaching a consensus that behavioral interventions might, in the very best scenario, produce an average lasting weight loss of just 5 to 10 percent.

National surveys completed in 1994 showed that the adult obesity rate had surged by more than half since 1980, while the proportion of children classified as overweight had doubled. The need for weight control in America had never seemed so great, even as the chances of achieving it were never perceived to be so small.

Then a bolt of science landed in this muddle and despair. In December 1994, the Times ran an editorial on what was understood to be a pivotal discovery: A genetic basis for obesity had finally been found. Researchers at Rockefeller University were investigating a molecule, later named leptin, that gets secreted from fat cells and travels to the brain, and that causes feelings of satiety. Lab mice with mutations in the leptin gene—importantly, a gene also found in humans—overeat until they’re three times the size of other mice. “The finding holds out the dazzling hope,” the editorial explained, “that scientists may, eventually, come up with a drug treatment to help overweight Americans shed unwanted, unhealthy pounds.”

Leptin-based treatments for obesity were in the works, according to the researchers, and might be ready for the public in five years, maybe 10. In the meantime, the suggestion that obesity was a biochemical disease, more a function of a person’s genes than of their faulty habits or lack of will, dovetailed with the nation’s shift away from dieting. If there was any hope of solving the problem of obesity, maybe this was it.

Wadden was ready to switch gears. “I realized that we had sort of reached our limits on what we could do with diet and physical activity,” he said. Now, instead, he started looking into pharmaceuticals. He’d already run one weight-loss study using sertraline, better known as Zoloft, and found that it had no effect. In 1995, he turned to fen-phen.

Fen-phen wasn’t new, exactly—versions of its component drugs had been prescribed for decades. But when those pills were taken separately, their side effects were difficult to handle: “Fen” would make you drowsy and might give you diarrhea; “phen” could be agitating and lead to constipation. By the 1990s, though, doctors had begun to give the two together, such that their side effects would cancel each other out. And then a new and better version of “fen”—not fenfluramine but dexfenfluramine—came under FDA review.

Some regulators worried that this better “fen” posed a risk of brain damage. And there were signs that “fen” in any form might lead to pulmonary hypertension, a heart-related ailment. But Americans had been prescribed regular fenfluramine since 1973, and the newer drug, dexfenfluramine, had been available in France since 1985. Experts took comfort in this history. Using language that is familiar from today’s assurances regarding semaglutide and other GLP-1 drugs, they pointed out that millions were already on the medication. “It is highly unlikely that there is anything significant in toxicity to the drug that hasn’t been picked up with this kind of experience,” an FDA official named James Bilstad would later say in a Time cover story headlined “The Hot New Diet Pill.” To prevent Americans with obesity from getting dexfenfluramine, supporters said, would be to surrender to a deadly epidemic. Judith Stern, an obesity expert and nutritionist at UC Davis, was clear about the stakes: “If they recommend no,” she said of the FDA-committee members, “these doctors ought to be shot.”

In April 1996, the doctors recommended yes: Dexfenfluramine was approved—and became an instant blockbuster. Patients received prescriptions by the hundreds of thousands every month. Sketchy wellness clinics—call toll-free, 1-888-4FEN-FEN—helped meet demand. Then, as now, experts voiced concerns about access. Then, as now, they worried that people who didn’t really need the drugs were lining up to take them. By the end of the year, sales of “fen” alone had surpassed $300 million. “What we have here is probably the fastest launch of any drug in the history of the pharmaceutical industry,” one financial analyst told reporters.

This wasn’t just a drug launch. It was nothing less than an awakening, for doctors and their patients alike. Now a patient could be treated for excess weight in the same way they might be treated for diabetes or hypertension—with a drug they’d have to take for the rest of their life. That paradigm, Time explained, reflected a deeper shift in medicine. In a formulation that prefigures the nearly identical claims being made about Ozempic and its ilk today, the article heralded a “new understanding of obesity as a chronic disease rather than a failure of willpower.”

Barb started on fen-phen two weeks after it was approved. “I had never in my life felt normal until after about a week or two on the medications,” she’d later say. “My life before was hell.” She was losing weight, her blood sugar was improving, and she was getting to the pool, swimming 100 lengths five or six days a week. A few months later, when she read in her local newspaper that the Florida Board of Medicine was considering putting limits on the use of fen-phen, she was disturbed enough to send a letter to the editor. “I thank the creators of fen/phen for helping to save my life,” she wrote. “I don’t want to see the medications regulated so intensely that people like me are left out.”

4

For another year, Barb kept taking fen-phen, and for another year she kept losing weight. By July of 1997, she’d lost 111 pounds.

Thomas Wadden and his colleague’s fen-phen study had by then completed its second year. The data showed that their patients’ shocking weight loss had mostly been maintained, as long as they stayed on the drugs. But before Wadden had the chance to write up the results, he got a call from Susan Yanovski, then a program officer at the National Institutes of Health and now a co-director of the NIH’s Office of Obesity Research. We’ve got a problem, Yanovski told him.

News had just come out that, at the Mayo Clinic in Minnesota, two dozen women taking fen-phen—including six who were, like Barb, in their 30s—had developed cardiac conditions. A few had needed surgery, and on the operating table, doctors discovered that their heart valves were covered with a waxy plaque. They had “a glistening white appearance,” the doctors said, suggestive of disease. Now Yanovski wanted Wadden to look more closely at the women in his study.

Wadden wasn’t terribly concerned, because no one in his study had reported any heart symptoms. But ultrasounds revealed that nearly one-third of them had some degree of leakage in their heart valves. His “cure for obesity” was in fact a source of harm. “That just felt like a crushing blow,” he told me. Several weeks later, a larger data set from the FDA confirmed the issue. Wadden worried to reporters that the whole fiasco would end up setting back obesity treatment by many years.

[Read: The Ozempic revolution is stuck]

The news put Barb in a panic. Not about her heart: The drug hadn’t caused her any problems, as far as she could tell; it had only solved them. But now they were taking it away. What then? She’d already spoken out about her new and better life to local outlets; now she did so again, on national TV. On September 16, the day after fenfluramine in both of its forms was pulled from the market, Barb appeared on CBS This Morning. She explained then, as she later would to me, that fen-phen had flipped a switch inside her brain. There was desperation in her voice.

A few days later, she was in a limousine in New York City, invited to be on The Montel Williams Show. She wore a crisp floral dress; a chyron would identify her as “BARBARA: Will continue taking diet drug despite FDA recall.” “I know I can’t get any more,” she told Williams. “I have to use up what I have. And then I don’t know what I’m going to do after that. That’s the problem—and that is what scares me to death.” Telling people to lose weight the “natural way,” she told another guest, who was suggesting that people with obesity need only go on low-carb diets, is like “asking a person with a thyroid condition to just stop their medication.”

“I did all this stuff to shout it from the rooftops that I was doing so well on fen-phen,” Barb told me. Still, all the warnings she’d been hearing on the news, and from her fellow Montel guests, started building up inside her head. When she got back to Orlando, she went to see her doctor, just in case. His testing showed that she did indeed have damage to her mitral valve, and that fen-phen seemed to be the cause.

Barb swimming in 2003 (Courtesy of Barb Herrera)

Five months later, she was back on CBS to talk about her tragic turnabout. The newscast showed Doppler footage of the backwards flow of blood into her heart. She’d gone off the fen-phen and had rapidly regained weight. “The voices returned and came back in a furor I’d never heard before,” Barb later wrote on her blog. “It was as if they were so angry at being silenced for so long, they were going to tell me 19 months’ worth of what they wanted me to hear. I was forced to listen. And I ate. And I ate. And ate.”

5

The Publix supermarket chain has, since its founding more than 90 years ago in central Florida, offered “people weighers,” free for use by all. They’re big, old-fashioned things, shaped like lollipops, with a dial readout at the top and handlebars of stainless steel. By the time I visited Barb last fall, in a subdivision of Orlando, she was determined to go and use one.

She’d taken heed of what Ambati, the cardiologist, had told her when she went into the hospital in April 2022. She cut back on salt and stopped ordering from Uber Eats. That alone was enough to bring her weight down 40 pounds. Then she started on Trulicity, the brand name for a GLP-1 drug called dulaglutide that is prescribed to people with diabetes. (The drug was covered for her use by Medicaid.) In clinical trials, patients on dulaglutide tend to lose about 10 pounds, on average, in a year. For Barb, the effects were far more dramatic. When we first met in person, she’d been on Trulicity for 14 months—and had lost more than one-third of her body weight. “It’s not even like I’m skinny, but compared to 405, I feel like an Olympic runner,” she told me.

We arrived at the supermarket in tandem with another middle-aged woman who was also there to check her weight. “Okay, you first, jump on!” Barb said. “My dream weight. I love it!” she said, when the pointer tipped to 230 pounds. “Not mine,” the other woman grumbled. Then Barb got on the scale and watched it spin to a little past 250. She was very pleased. The last number of the dial was 300. Even registering within its bounds was new.

Some people with obesity describe a sense of being trapped inside another person’s body, such that their outward shape doesn’t really match their inner one. For Barb, rapid weight loss has brought on a different metaphysical confusion. When she looks in the mirror, she sometimes sees her shape as it was two years ago. In certain corners of the internet, this is known as “phantom fat syndrome,” but Barb dislikes that term. She thinks it should be called “body integration syndrome,” stemming from a disconnect between your “larger-body memory” and “smaller-body reality.”

She has experienced this phenomenon before. After learning that she had heart-valve damage from fen-phen, Barb joined a class-action lawsuit against the maker of dexfenfluramine, and eventually received a substantial payout. In 2001, she put that money toward what would be her second weight-loss miracle—bariatric surgery. The effects were jarring, she remembers. Within just three months, she’d lost 100 pounds; within a year, she’d lost 190. She could ride a bike now, and do a cartwheel. “It was freakin’ wild,” she told me. “I didn’t have an idea of my body size.” She found herself still worried over whether chairs would break when she sat down. Turnstiles were confusing. For most of her adult life, she’d had to rotate sideways to go through them if she couldn’t find a gate, so that’s what she continued doing. Then one day her partner said, “No, just walk through straight,” and that’s what she did.

Weight-loss surgery was somewhat unusual at the time, despite its record of success. About 60,000 such procedures were performed in 2001, by one estimate; compare that with the millions of Americans who had been taking fen-phen just a few years earlier. Bariatric surgeons and obesity physicians have debated why this treatment has been so grossly “underutilized.” (Even now, fewer than 1 percent of eligible patients with obesity have the procedure.) Surely some are dissuaded by the scalpel: As with any surgery, this one carries risks. It’s also clear that many doctors have refrained from recommending it. But the fen-phen fiasco of the late 1990s cast its shadow on the field as well. The very idea of “treating” excess weight, whether with a pill or with a knife, had been discredited. It seemed ill-advised, if not old-fashioned.

[Read: The science behind Ozempic was wrong]

By the turn of the millennium, a newer way to think about America’s rising rates of obesity was starting to take hold. The push was led by Thomas Wadden’s close friend and colleague Kelly Brownell. In the 1970s, the two had played together in a bluegrass band—Wadden on upright bass, Brownell on guitar—and they later worked together at the University of Pennsylvania. But when their field lost faith in low-calorie diets as a source of lasting weight loss, the two friends went in opposite directions. Wadden looked for ways to fix a person’s chemistry, so he turned to pharmaceuticals. Brownell had come to see obesity as a product of our toxic food environment: He meant to fix the world to which a person’s chemistry responded, so he started getting into policy.

Inspired by successful efforts to reduce tobacco use, Brownell laid out a raft of new proposals in the ’90s to counter the effects of junk-food culture: a tax on non-nutritious snacks; a crackdown on deceptive health claims; regulation of what gets sold to kids inside school buildings. Those ideas didn’t find much traction while the nation was obsessed with fen-phen, but they caught on quickly in the years that followed, amid new and scary claims that obesity was indirectly hurting all Americans, not just the people with a lot of excess weight.

In 2003, the U.S. surgeon general declared obesity “the terror within, a threat that is every bit as real to America as the weapons of mass destruction”; a few months later, Eric Finkelstein, an economist who studies the social costs of obesity, put out an influential paper finding that excess weight was associated with up to $79 billion in health-care spending in 1998, of which roughly half was paid by Medicare and Medicaid. (Later he’d conclude that the number had nearly doubled in a decade.) In 2004, Finkelstein attended an Action on Obesity summit hosted by the Mayo Clinic, at which numerous social interventions were proposed, including calorie labeling in workplace cafeterias and mandatory gym class for children of all grades.

As the environmental theory gained currency, public-health officials took notice. In 2006, for example, the New York City Board of Health moved to require that calorie counts be posted on many chain restaurants’ menus, so customers would know how much they were eating. The city also banned trans fats.

While first lady, Michelle Obama planted an organic garden at the White House as part of her effort to promote healthy eating. (Aude Guerrucci / Getty)

Soon, the federal government took up many of the ideas that Brownell had helped popularize. Barack Obama had promised while campaigning for president that if America’s obesity trends could be reversed, the Medicare system alone would save “a trillion dollars.” By fighting fat, he implied, his ambitious plan for health-care reform would pay for itself. Once he was in office, his administration pulled every policy lever it could. The nation’s school-lunch program was overhauled. Nutrition labels got an update from the FDA, with more prominent displays of calories and a line for “added sugars.” Food benefits for families in poverty were adjusted to allow the purchase of more fruits and vegetables. The Affordable Care Act brought calorie labeling to chain restaurants nationwide and pushed for weight-loss programs through employer-based insurance plans.

Michelle Obama helped guide these efforts, working with marketing experts to develop ways of nudging kids toward better diets and pledging to eliminate “food deserts,” or neighborhoods that lacked convenient access to healthy, affordable food. She was relentless in her public messaging; she planted an organic garden at the White House and promoted her signature “Let’s Move!” campaign around the country. The first lady also led a separate, private-sector push for change within Big Food. In 2010, the beverage giants agreed to add calorie labels to the front of their bottles and cans; PepsiCo pledged major cuts in fat, sodium, and added sugars across its entire product line within a decade.

An all-out war on soda would come to stand in for these broad efforts. Nutrition studies found that half of all Americans were drinking sugar-sweetened beverages every day, and that consumption of these accounted for one-third of the added sugar in adults’ diets. Studies turned up links between people’s soft-drink consumption and their risks for type 2 diabetes and obesity. A new strand of research hinted that “liquid calories” in particular were dangerous to health.

Brownell led the growing calls for an excise tax on soft drinks, like the one in place for cigarettes, as a way of limiting their sales. Few such measures were passed—the beverage industry did everything it could to shut them down—but the message at their core, that soda was a form of poison like tobacco, spread. In San Francisco and New York, public-service campaigns showed images of soda bottles pouring out a stream of glistening, blood-streaked fat. Michelle Obama led an effort to depict water—plain old water—as something “cool” to drink.

The social engineering worked. Slowly but surely, Americans’ much-lamented lifestyle began to shift. From 2001 to 2018, added-sugar intake dropped by about one-fifth among children, teens, and young adults. From the late 1970s through the early 2000s, the obesity rate among American children had roughly tripled; then, suddenly, it flattened out. And although the obesity rate among adults was still increasing, its climb seemed slower than before. Americans’ long-standing tendency to eat ever-bigger portions also seemed to be abating.

But sugary drinks—liquid candy, pretty much—were always going to be a soft target for the nanny state. Fixing the food environment in deeper ways proved much harder. “The tobacco playbook pretty much only works for soda, because that’s the closest analogy we have as a food item,” Dariush Mozaffarian, a cardiologist and the director of the Food Is Medicine Institute at Tufts University, told me. But that tobacco playbook doesn’t work to increase consumption of fruits and vegetables, he said. It doesn’t work to increase consumption of beans. It doesn’t work to make people eat more nuts or seeds or extra-virgin olive oil.

[Read: What happens when you’ve been on Ozempic for 20 years?]

Careful research in the past decade has shown that many of the Obama-era social fixes did little to alter behavior or improve our health. Putting calorie labels on menus seemed to prompt at most a small decline in the amount of food people ate. Employer-based wellness programs (which are still offered by 80 percent of large companies) were shown to have zero tangible effects. Health-care spending, in general, kept going up.

And obesity rates resumed their ascent. Today, 20 percent of American children have obesity. For all the policy nudges and the sensible revisions to nutrition standards, food companies remain as unfettered as they were in the 1990s, Kelly Brownell told me. “Is there anything the industry can’t do now that it was doing then?” he asked. “The answer really is no. And so we have a very predictable set of outcomes.”

“Our public-health efforts to address obesity have failed,” Eric Finkelstein, the economist, told me.

6

The success of Barb’s gastric-bypass surgery was also limited. “Most people reach their lowest weight about a year post-surgery,” Gretchen White, an epidemiologist at the University of Pittsburgh, told me. “We call it their weight nadir.”

Barb’s weight nadir came 14 months after surgery; she remembers exactly when things began to turn around. She was in a store buying jeans, and realized she could fit into a size 8. By then she’d lost 210 pounds; her BMI was down to 27—lower than the average for a woman her age. Her body had changed so much that she was scared. “It was just too freaky to be that small,” she told me. “I wasn’t me. I wasn’t substantial.” She was used to feeling unseen, but now, in this new state, she felt like she was disappearing in a different way. “It’s really weird when you’re really, really fat,” she said. “People look at you, but they also look through you. You’re just, like, invisible. And then when you’re really small you’re invisible too, because you’re one of the herd. You’re one of everybody.”

At that point, she started to rebound. The openings into her gastric pouch—the section of her stomach that wasn’t bypassed—stretched back to something like their former size. And Barb found ways to “eat around” the surgery, as doctors say, by taking food throughout the day in smaller portions. Her experience was not unusual. Bariatric surgeries can be highly effective for some people and nearly useless for others. Long-term studies have found that 30 percent of those who receive the same procedure Barb did regain at least one-quarter of what they lost within two years of reaching their weight nadir; more than half regain that much within five years.

New York City Mayor Michael Bloomberg tried to implement a ban on oversize sugary drinks. (Allison Joyce / Getty)

But if the effects of Barb’s surgery were quickly wearing off, its side effects were not: She now had iron, calcium, and B12 deficiencies resulting from the changes to her gut. She looked into getting a revision of the surgery—a redo, more or less—but insurance wouldn’t cover it, and by then the money from her fen-phen settlement had run out. The pounds kept coming back.

Barb’s relationship to medicine had long been complicated by her size. She found the health-care system ill-equipped—or just unwilling—to give her even basic care. During one hospital visit in 1993, she remembers, a nurse struggled to wrap a blood-pressure cuff around her upper arm. When it didn’t fit, he tried to strap it on with tape, but even then, the cuff kept splitting open. “It just grabs your skin and gives you bruises. It’s really painful,” she said. Later she’d find out that the measurement can also be taken by putting the cuff around a person’s forearm. But at the time, she could only cry.

“That was the moment that I was like, This is fucked up. This is just wrong, that I have to sit here and cry in the emergency room because someone is incompetent with my body.” She found that every health concern she brought to doctors might be taken as a referendum, in some way, on her body size. “If I stubbed my toe or whatever, they’d just say ‘Lose weight.’ ” She began to notice all the times she’d be in a waiting room and find that every chair had arms. She realized that if she was having a surgical procedure, she’d need to buy herself a plus-size gown—or else submit to being covered with a bedsheet when the nurses realized that nothing else would fit. At one appointment, for the removal of a cancerous skin lesion on her back, Barb’s health-care team tried rolling her onto her side while she was under anesthesia, and accidentally let her slip. When she woke, she found a laceration to her breast and bruises on her arm.

Barb grew angrier and more direct about her needs—You’ll have to find me a different chair, she started saying to receptionists. Many others shared her rage. Activists had long decried the cruel treatment of people with obesity: The National Association to Advance Fat Acceptance had existed, for example, in one form or another, since 1969; the Council on Size & Weight Discrimination had been incorporated in 1991. But in the early 2000s, the ideas behind this movement began to wend their way deeper into academia, and they soon gained some purchase with the public.

In 1999, when Rebecca Puhl arrived at Yale to work with Kelly Brownell toward her Ph.D. in clinical psychology, she’d given little thought to weight-based discrimination. But Brownell had received a grant to research the topic, and he put Puhl on the project. “She basically created a field,” Brownell said. While he focused on the dark seductions of our food environment, Puhl studied size discrimination, and how it could be treated as a health condition of its own. From the mid-1990s to the mid-2000s, the proportion of adults who said they’d experienced discrimination on account of their height or weight increased by two-thirds, going up to 12 percent. Puhl and others started citing evidence that this form of discrimination wasn’t merely a source of psychic harm, but also of obesity itself. Studies found that the experience of weight discrimination is associated with overeating, and with the risk of weight gain over time.

Puhl’s approach took for granted that being very fat could make you sick. Others attacked the very premise of a “healthy weight”: People do not have any fundamental need, they argued, morally or medically, to strive for smaller bodies as an end in itself. They called for resistance to the ideology of anti-fatness, with its profit-making arms in health care and consumer goods. The Association for Size Diversity and Health formed in 2003; a year later, dozens of scholars working on weight-related topics joined together to create the academic field of fat studies.

[Read: Why scientists can’t agree on whether it’s unhealthy to be overweight]

Some experts were rethinking their advice on food and diet. At UC Davis, a physiologist named Lindo Bacon who had struggled to overcome an eating disorder had been studying the effects of “intuitive eating,” which aims to promote healthy, sustainable behavior without fixating on what you weigh or how you look. Bacon’s mentor at the time was Judith Stern—the obesity expert who in 1995 proposed that any FDA adviser who voted against approving dexfenfluramine “ought to be shot.” By 2001, Bacon, who uses they/them pronouns, had received their Ph.D. and finished a rough draft of a book, Health at Every Size, which drew inspiration from a broader movement by that name among health-care practitioners. Bacon struggled to find a publisher. “I have a stack of well over 100 rejections,” they told me.

But something shifted in the ensuing years. In 2007, Bacon got a different response, and the book was published. Health at Every Size became a point of entry for a generation of young activists and, for a time, helped shape Americans’ understanding of obesity.

As the size-diversity movement grew, its values were taken up—or co-opted—by Big Business. Dove had recently launched its “Campaign for Real Beauty,” which included plus-size women. (Ad Age later named it the best ad campaign of the 21st century.) People started talking about “fat shaming” as something to avoid. The heightened sensitivity started showing up in survey data, too. In 2010, fewer than half of U.S. adults expressed support for giving people with obesity the same legal protections from discrimination offered to people with disabilities. By 2015, that rate had risen to three-quarters.

In Bacon’s view, the 2000s and 2010s were glory years. “People came together and they realized that they’re not alone, and they can start to be critical of the ideas that they’ve been taught,” Bacon told me. “We were on this marvelous path of gaining more credibility for the whole Health at Every Size movement, and more awareness.”

But that sense of unity proved short-lived; the movement soon began to splinter. Black women have the highest rates of obesity, and disproportionately high rates of associated health conditions. Yet according to Fatima Cody Stanford, an obesity-medicine physician at Harvard Medical School, Black patients with obesity get lower-quality care than white patients with obesity. “Even amongst Medicaid beneficiaries, we see differences in who is getting access to therapies,” she told me. “I think this is built into the system.”

That system was exactly what Bacon and the Health at Every Size movement had set out to reform. The problem, as they saw it, was not so much that Black people lacked access to obesity medicine, but that, as Bacon and the Black sociologist Sabrina Strings argued in a 2020 article, Black women have been “specifically targeted” for weight loss, which Bacon and Strings saw as a form of racism. But members of the fat-acceptance movement pointed out that their own most visible leaders, including Bacon, were overwhelmingly white. “White female dietitians have helped steal and monetize the body positive movement,” Marquisele Mercedes, a Black activist and public-health Ph.D. student, wrote in September 2020. “And I’m sick of it.”

Tensions over who had the standing to speak, and on which topics, boiled over. In 2022, following allegations that Bacon had been exploitative and condescending toward Black colleagues, the Association for Size Diversity and Health expelled them from its ranks and barred them from attending its events. (“They were accusing me of taking center stage and not appropriately deferring to marginalized people,” Bacon told me. “That’s never been true.”)

As the movement succumbed to in-fighting, its momentum with the public stalled. If attitudes about fatness among the general public had changed during the 2000s and 2010s, it was only to a point. The idea that some people can indeed be “fit but fat,” though backed up by research, has always been a tough sell. Although Americans had become less inclined to say they valued thinness, measures of their implicit attitudes seemed fairly stable. Outside of a few cities such as San Francisco and Madison, Wisconsin, new body-size-discrimination laws were never passed. (Puhl has been testifying regularly in support of the same proposed bill in Massachusetts since 2007, to no avail.) And, as always, obesity rates themselves kept going up.

In the meantime, thinness was coming back into fashion. In the spring of 2022, Kim Kardashian—whose “curvy” physique has been a media and popular obsession—boasted about crash-dieting in advance of the Met Gala. A year later, the model and influencer Felicity Hayward warned Vogue Business that “plus-size representation has gone backwards.” In March of this year, the singer Lizzo, whose body pride has long been central to her public persona, told The New York Times that she’s been trying to lose weight. “I’m not going to lie and say I love my body every day,” she said.

Among the many other dramatic effects of the GLP-1 drugs, they may well have released a store of pent-up social pressure to lose weight. If ever there was a time to debate that impulse, and to question its origins and effects, it would be now. But Puhl told me that no one can even agree on which words are inoffensive. The medical field still uses obesity, as a description of a diagnosable disease. But many activists despise that phrase—some spell it with an asterisk in place of the e—and propose instead to reclaim fat. Everyone seems to agree on the most important, central fact: that we should be doing everything we can to limit weight stigma. But that hasn’t been enough to stop the arguing.

7

Not long before my visit to Orlando in October, Barb had asked her endocrinologist to switch her from Trulicity to Mounjaro, because she’d heard it was more effective. (This, too, was covered under Medicaid.) A few weeks later, Barb blogged about the feeling of being stuck—physically stuck—inside her body. “Anyone who has been immobilized by fat and then freed, understands my sense of amazement that I can walk without a walker and not ride the scooter in the store,” she wrote. “Two years ago, all I could do was wait to die. I never thought I would be released from my prison of fat.”

Barb has been a frequent visitor to Disney World, but until recently she needed an electric scooter to navigate the park. (Courtesy of Barb Herrera)

In all that time when she could barely move, of all the places that she couldn’t really go, Disney World stood out. Barb is the sort of person who holds many fascinations—meditation, 1980s lesbian politics, the rock band Queen—but Disney may be chief among them. She has a Tinker Bell tattoo on her calf, and a trio of Mickey Mouse balloons on her shoulder. Her wallet shows the plus-size villain Ursula, from The Little Mermaid. “It’s just a place where you can go and be treated beautifully,” she said. “No matter who you are, no matter what country you’re from, no matter what language you speak. It’s just wonderful and beautiful.”

She’d been raised in the theme park, more or less: Her mother got a job there in the 1970s, and that meant Barb could go for free—which she did as often as she could, almost from the time that it first opened, and for decades after. She was at Disney when Epcot opened in 1982, just weeks before she gave birth to her first child. Later on she helped produce a book about where to eat at Disney if you’re vegetarian, and published tips for how to get around the parks—and navigate the seating for their rides—whether you’re “Pooh-size” or “Baloo-size.” She worked at Disney, too, first as an independent tour guide and photographer, then as a phone operator for the resorts. “They used to pull me off of the telephones to go test new rides to see how large people could do on them,” she told me.

But lately she’d only watched the park’s events on livestream. The last time she’d gone in person, in 2021, she was using a scooter for mobility. “I dream of one day walking at Disney World once again,” she’d written on her blog. So we called a car and headed over.

Barb was exhilarated—so was I—when we strolled into the multistory lobby of the Animal Kingdom Lodge, with its shiny floors, vaulted ceilings, indoor suspension bridge, and 16-foot, multicolored Igbo Ijele mask. Barb bought a pair of Minnie Mouse ears at the gift shop, and kibitzed for a while with the cashier. Before, she would have had to ask me to go and get the ears on her behalf, she said, so she wouldn’t have to maneuver through the store on wheels. We walked down the stairs—we walked down the stairs, Barb observed with wonderment—to get breakfast at a restaurant called Boma. “Welcome, welcome, welcome! Have a Boma-tastic breakfast!” the host said.

Barb relished being in the lodge again, and had lots to say, to me and everyone. “My mom was a cast member for 42 years,” she informed our server at one point. Even just that fact was a reminder of how much Disney World, and the people in it, had evolved during her lifetime. When her mom started to gain weight, Barb remembered, her manager demanded that she go on a diet. “They didn’t even make a costume bigger than a 16,” Barb said. As Americans got bigger, that policy had to be abandoned. “They needed people to work,” she said, with a glance around the restaurant, where kids and parents alike were squeezing into seats, not all of which looked entirely sufficient. It was easy to imagine what the crowd at Boma might have looked like 20 years ago, when the restaurant first opened, and when the adult obesity rate was just half of what it is today.

“I feel smaller than a lot of these people, which is really interesting,” Barb said. “I don’t even know if I am, but I feel like it. And that is surreal.”

Things feel surreal these days to just about anyone who has spent years thinking about obesity. At 71, after more than four decades in the field, Thomas Wadden now works part-time, seeing patients just a few days a week. But the arrival of the GLP-1 drugs has kept him hanging on for a few more years, he said. “It’s too much of an exciting period to leave obesity research right now.”

[Read: How obesity became a disease]

His bluegrass buddy, Kelly Brownell, stepped down from his teaching and administrative responsibilities last July. “I see the drugs as having great benefit,” Brownell told me, even as he quickly cited the unknowns: whether the drugs’ cost will be overwhelming, or if they’ll be unsafe or ineffective after long-term use. “There’s also the risk that attention will be drawn away from certain changes that need to be made to address the problem,” he said. When everyone is on semaglutide or tirzepatide, will the soft-drink companies—Brownell’s nemeses for so many years—feel as if a burden has been lifted? “My guess is the food industry is probably really happy to see these drugs come along,” he said. They’ll find a way to reach the people who are taking GLP‑1s, with foods and beverages in smaller portions, maybe. At the same time, the pressures to cut back on where and how they sell their products will abate.

For Dariush Mozaffarian, the nutritionist and cardiologist at Tufts, the triumph in obesity treatment only highlights the abiding mystery of why Americans are still getting fatter, even now. Perhaps one can lay the blame on “ultraprocessed” foods, he said. Maybe it’s a related problem with our microbiomes. Or it could be that obesity, once it takes hold within a population, tends to reproduce itself through interactions between a mother and a fetus. Others have pointed to increasing screen time, how much sleep we get, which chemicals are in the products that we use, and which pills we happen to take for our many other maladies. “The GLP-1s are just a perfect example of how poorly we understand obesity,” Mozaffarian told me. “Any explanation of why they cause weight loss is all post-hoc hand-waving now, because we have no idea. We have no idea why they really work and people are losing weight.”

The new drugs—and the “new understanding of obesity” that they have supposedly occasioned—could end up changing people’s attitudes toward body size. But in what ways? When the American Medical Association declared obesity a disease in 2013, Rebecca Puhl told me, some thought “it might reduce stigma, because it was putting more emphasis on the uncontrollable factors that contribute to obesity.” Others guessed that it would do the opposite, because no one likes to be “diseased.” Already people on these drugs are getting stigmatized twice over: first for the weight at which they started, and then again for how they chose to lose it.

Barb herself has been evangelizing for her current medications with as much fervor as she showed for fen-phen. She has a blog devoted to her experience with GLP-1 drugs, called Health at Any Cost. As we stood up from our breakfast in the Animal Kingdom Lodge, Barb checked her phone and saw a text from her daughter Meghann, who had started on tirzepatide a couple of months before Barb did. “ ‘Thirty-five pounds down,’ ” Barb read aloud. “ ‘Medium top. Extra-large leggings, down from 4X’ … She looks like the child I knew. When she was so big, she looked so different.”

In November, Barb’s son, Tristan, started on tirzepatide too. She attributes his and Meghann’s struggles to their genes. Later that month, when she was out at Meghann’s house in San Antonio for Thanksgiving, she sent me a photo of the three of them together—“the Tirzepatide triplets.”

She’d always worried that her kids might be doomed to experience the same chronic conditions that she has. All she could do before was tell them to “stay active.” Now she imagines that this chain might finally be broken. “Is the future for my progeny filled with light and the joy of not being fat?” she wrote in a blog post last fall.

Barb at home in Orlando in April. Since starting on GLP-1 drugs two years ago, she has lost more than 200 pounds. (Stacy Kranitz for The Atlantic)

Barb’s energy was still limited, and on the day we visited Disney World, she didn’t yet feel ready to venture out much past the lodge. Before we went back to her house, I pressed her on the limits of this fantasy about her kids’ and grandkids’ lives. How could she muster so much optimism, given all the false miracles that she’d experienced before? She’d gone on fen-phen and ended up with heart damage. She’d had a gastric bypass and ended up anemic. And we hadn’t even had the chance to talk about her brief affair with topiramate, another drug prescribed for weight loss that had quieted the voices in her head for a stretch in 2007—until it made her feel depressed. (Topiramate is “the new fen/phen and I am blessed to have it in my life/head/mind,” she’d written on her blog back then. Ten years later she would pledge, in boldface: “I will never diet or take diet drugs again. Ever.”)

After all of these disappointments, why wasn’t there another kind of nagging voice that wouldn’t stop—a sense of worry over what the future holds? And if she wasn’t worried for herself, then what about for Meghann or for Tristan, who are barely in their 40s? Wouldn’t they be on these drugs for another 40 years, or even longer? But Barb said she wasn’t worried—not at all. “The technology is so much better now.” If any problems come up, the scientists will find solutions.

Still, she’d been a bit more circumspect just a few months earlier, the first time that we spoke by phone. “There’s a part of me that thinks I should be worried,” she told me then. “But I don’t even care. What I care about is today, how do I feel today.” She was making travel plans to see her grandkids over Labor Day, after not having been on an airplane for 15 years because of her size. “I’m so excited, I can hardly stand it,” she said. Since then she’s gone to see them twice, including Thanksgiving; the last time she went, she didn’t even need to buy two seats on the plane. She’s also been back to Disney since our visit. This time, she had more energy. “When I walked out the back door of the Beach Club and headed towards EPCOT,” she wrote on her blog, “I felt like I was flying.”


This article appears in the June 2024 print edition with the headline “Ozempic or Bust.”

When George Schappell came out as transgender in 2007, he joined a population at the center of medical and ethical controversy. Schappell was used to this. He had been born in West Reading, Pennsylvania, in 1961 with the left side of his face, some of his skull, and a portion of his brain conjoined with those of his sister, Lori. Following doctors’ advice, their parents put them in an institution for children with intellectual disabilities.

At the time, children with “birth defects” were routinely consigned to what the activist Harriet McBryde Johnson termed the “disability gulag,” a network of facilities designed in part to care for such children and in part to keep them out of the public view. Conditions could be abysmal, but even better-maintained facilities cut residents off from society and deprived them of autonomy. In their early 20s, the twins fought their way out by enlisting the help of Pennsylvania’s first lady, whose stepson was disabled.

[From the September 2023 issue: The ones we sent away]

As George and Lori Schappell navigated independence, the growing disability-rights movement began to allow many other people with disabilities to do the same. Their physical bodies did not fit easily into the structures of a world that was not designed to receive them. George and Lori, who died last month at 62, spent their adult lives finding their way through that world. But American society is still struggling to determine whether to accommodate bodies like theirs—bodies that fail to conform to standards of gender, ability, and even individuality.

In the 1980s and early ’90s, while the Schappells were establishing their independent lives, the American public was enthralled by a procession of sensationalized operations to separate conjoined twins. These experimental procedures could be brutal. Many conjoined twins did not come apart easily; in many cases they had an odd number of limbs or organs shared between them. Patrick and Benjamin Binder, whose 1987 separation at six months made a young Ben Carson a star, both sustained profound neurological damage from the surgery and never spoke. In 1994, surgeons sacrificed newborn Amy Lakeberg to save her twin, but Angela died less than a year later, never having left the hospital. Lin and Win Htut shared a single pair of genitals; in 1984 doctors designated the more “aggressive” of the 2-year-old boys to retain the penis, while the other was given a surgically constructed vagina and reassigned as a girl. By the time he was 10, he had reasserted his identity as a boy.

Other twins’ separation surgeries were the subject of occasional controversy from the 1980s into the early 2000s. Doctors justified them as giving children a chance at a “normal” life, and usually portrayed them as well-intentioned even if they failed. But many were not clearly medically necessary. Ethicists such as Alice Dreger, the author of One of Us: Conjoined Twins and the Future of Normal, argued against a risky medical “cure” performed on children who could not consent to it. Meanwhile, the Schappells were living in their own apartment. George’s spina bifida had impeded his growth, so he was much smaller than his twin; they got around with George perched on a barstool-height wheelchair so he could roll along beside Lori as she walked. Lori got a job at a hospital, and they pursued hobbies (George: country music; Lori: bowling) and made friends (Lori also dated). They kept pets, including a Chihuahua and a fish whom they named George years before George chose that name as his own. They went to bars, where a bartender once refused service to George because he looked underage, but agreed to pour drinks for Lori. They did not live “normal” lives: They lived their lives.

[Read: Why is it so hard to find jobs for disabled workers?]

But as the public became familiar with the model of separation for conjoined twins, the Schappells found themselves asked, repeatedly, to explain their continued conjoined existence. In 1992, they gave what seem to be their first interviews, to The Philadelphia Inquirer and the Philadelphia Daily News; the news hook was local doctors’ decision not to separate another pair of twins who were joined, like the Schappells, at the head. The Schappells initially explained to reporters that medical science hadn’t been advanced enough for separation when they’d been born. But later they would stress that they wouldn’t have wanted to be separated even if they had been given the choice. “I don’t believe in separation,” Lori told the Los Angeles Times in 2002. “I think you are messing with God’s work.”

Not long after those first articles were published, the twins began appearing more frequently in the media. They did the rounds of the great 1990s freak shows—Maury, Jerry Springer, Sally, Howard Stern. They became the most visible non-separated conjoined twins of the era. Observers, journalists, and talk-show audiences tended to overwrite the Schappells with their own perceptions. The twins were inspirational, or pitiable; they epitomized cooperation, or individualism. I can’t imagine your lives, people would say, even as they proceeded to do just that. The Virginia Quarterly Review once published a poem written in Lori’s voice, in which the poet took it upon herself to warn an imagined observer: “You don’t know the forest / of two minds bound by weeds / grown from one to the other, / the synapses like bees / cross-pollinating / our honeyed brain.”

The twins, though, did not seem overly concerned about whether others understood them, and they did not go out of their way to change the world. They were not activists. George pursued a career as a country singer; they traveled; they grew older. When their Chihuahua lost the use of its hind legs, George made it a tiny wheelchair. The world slowly changed around them. Institutionalization for disabled people is less common today, though it still happens.

[From the March 2023 issue: Society tells me to celebrate my disability. What if I don’t want to?]

Conjoined twins now occupy far less space in the public imagination. The pair currently most famous are Abby and Brittany Hensel, who have constructed their public image as so aggressively unexceptional that a reality show about their lives was, in at least one viewer’s words, “super boring.” (Their public performance of ordinariness is not always successful; earlier this year, when Today reported that Abby had gotten married, the reaction was predictable, mingling pity and prurience.)

Separation surgeries are still performed today, but they are no longer the subject of intense public debate. Instead, one of the most visible medical controversies of our era, gender transition for young people, is related to another aspect of George’s identity. Although children who identify as trans aren’t eligible for medical interventions before the onset of puberty and only some choose hormones or surgery in their late teens, the idea of little kids receiving those treatments has helped inflame panic over whether they should be allowed at all, even for adults.

In the case of 2-year-old Win Htut, surgical transition was seen as restoring “normality.” But today, medical transition is often seen as creating difference. When you consider that history, a devotion to “normality” seems to be the primary motivator behind a recent raft of state laws outlawing transition care for transgender youth. After all, most of these laws carve out exceptions for children born with ambiguous genitalia. “Corrective” genital operations are still a routine practice for intersex infants, despite the protests of intersex adults, who say they would not have chosen to be surgically altered.

[Read: Young trans children know who they are]

George didn’t say much publicly about being trans, and never mentioned running up against any anti-trans bigotry. But when the twins’ obituaries ran on the website of a local funeral home last month, they were described as their parents’ “daughters,” and George was listed under his birth name. Whatever the intent in doing so, the obituary posthumously obscured his identity by correcting his “abnormality”—despite the fact that, in life, the twins had never apologized for being different.

At breakfast the other week, I noticed a bulging lump on my son’s neck. Within minutes of anxious Googling, I’d convinced myself that he had a serious undiagnosed medical condition—and the more I looked, the more apprehensive I got. Was it internal jugular phlebectasia, which might require surgery? Or a sign of lymphoma, which my father had been diagnosed with before he died? A few hours and a visit to the pediatrician later, I returned home with my tired child in tow, embarrassed but also relieved: The “problem” was just a benignly protuberant jugular vein.

My experience was hardly unique. We live in an era of mounting health worries. The ease of online medical self-diagnosis has given rise to what’s called cyberchondria: concern, fueled by consulting “Dr. Google,” that escalates into full-blown anxiety. Our medical system features ever more powerful technologies and proliferating routine preventive exams—scans that peer inside us, promising to help prolong our lives; blood tests that spot destructive inflammation; genetic screenings that assess our chances of developing disease. Intensive vigilance about our health has become the norm, simultaneously unsettling and reassuring. Many of us have experienced periods of worry before or after a mammogram or colonoscopy, or bouts of panic like mine about my son’s neck. For some, such interludes become consuming and destabilizing. Today, at least 4 percent of Americans are known to be affected by what is now labeled “health anxiety,” and some estimates suggest that the prevalence is more like 12 percent.

And yet hypochondria, you may be surprised to learn, officially no longer exists. In 2013, the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders, the so-called bible of psychiatric conditions, eliminated hypochondriasis. The change reflected an overdue need to reconceive a diagnosis that people found stigmatizing because it implied that hypochondriacs are neurotic malingerers whose symptoms aren’t “real.” The DSM introduced two distinct new diagnoses, illness anxiety disorder and somatic symptom disorder, both of which aim to be neutrally clinical descriptions of people with “extensive worries about health.” What differentiates them is the presence or absence of physical symptoms accompanying those fears.

But the efforts to delineate the spectrum of health anxiety fall short of clarifying the murky nature of hypochondria. The ostensibly helpful new terms are anything but. Although we know more than ever before about the diseases and mental illnesses that afflict us, the body’s most obdurate mysteries remain. Doctors and patients must navigate them together. The only way to do so is by setting aside any impulse to moralize and by embracing uncertainty—the very thing that modern medicine is least equipped to do. The abyss between patients’ subjective experience of symptoms and medicine’s desire for objectivity is hard to bridge, as the scholar Catherine Belling notes in A Condition of Doubt. This is the space where hypochondria still lives.

The timing of the writer Caroline Crampton’s new book, A Body Made of Glass: A Cultural History of Hypochondria, couldn’t be better. What her belletristic account of hypochondria’s long and twisting lineage sometimes lacks in authoritative rigor, it makes up for in vivid evocations of being a patient. Her youthful experience with cancer and the anxiety she has suffered ever since propel her undertaking: a tour that includes a sampling of evolving medical science about the condition, as well as literary reflections (from, among others, John Donne, Molière, Marcel Proust, Virginia Woolf, and Philip Larkin) on the doubt and fear that are inseparable from life in a body that gets sick.

[Read: The psychology of irrational fear]

Hypochondria, as Crampton highlights, is not just a lay term for a tendency to worry about illness that isn’t there. It’s a diagnosis that has existed for hundreds of years. The attendant symptoms and meanings have shifted continually, always in step with changing conceptions of wellness and disease. In that sense, the history of hypochondria reflects one constant: Each era’s ideas track its limited understanding of health, and demonstrate a desire for clarity about the body and illness that again and again proves elusive. Knowing this doesn’t stop Crampton from dreaming of a “definitive test for everything, including health anxiety itself.”

Hippocrates, known as the father of medicine, used the term hypochondrium in the fifth century B.C.E. to identify a physical location—the area beneath the ribs, where the spleen was known to lie. Hippocratic medicine held that health depended on a balance among four humors—blood, black bile, yellow bile, and phlegm—that affected both body and mind. An excess of black bile, thought to collect in the organs of the hypochondrium, where many people experienced unpleasant digestive symptoms, could also cause responses such as moodiness and sadness. The term hypochondria thus came to be associated, as the humoral theory persisted into the Renaissance, not only with symptoms like an upset stomach but also with sluggishness, anxiety, and melancholy—a convergence of “two seemingly unrelated processes within the body: digestive function and emotional disorder,” as Crampton notes.

By the 17th century, the notion of hypochondria as a fundamentally physical condition that also had mental symptoms had been firmly established. In The Anatomy of Melancholy (1621), the English writer and scholar Robert Burton described it as a subset of melancholia, noting a “splenetic hypochondriacal wind” accompanied by “sharp belchings” and “rumbling in the guts,” along with feeling “fearful, sad, suspicious”—an illness that, as he put it, “crucifies the body and mind.” Physicians in the 18th century began to investigate hypochondria as a disorder of the recently discovered nervous system, accounting for symptoms not just in the gut but in other parts of the body as well. According to this view, the cause wasn’t imbalanced humors but fatigue and debility of the nerves themselves.

The story of Charles Darwin, which Crampton tells in her book, illustrates the transition between the period when hypochondria was still seen primarily as a physical disease and the period when it began to look like a primarily psychological condition. Darwin, who was born in 1809, suffered from intense headaches, nausea, and gastric distress, as well as fatigue and anxiety, all of which he chronicled in a journal he called “The Diary of Health.” Although various posthumous diagnoses of organic diseases have been proposed—including systemic lactose intolerance—Crampton observes that Darwin’s need to follow strict health regimens and work routines could be interpreted as a manifestation of undue worry. This blurred line between intense (and possibly useful) self-scrutiny and mental disorder became a challenge for doctors and patients to address.

A fundamental shift had taken place by the late 19th century, thanks to the emergence of views that went on to shape modern psychology, including the idea that, as Crampton puts it, “the mind … controlled the body’s experiences and sensations, not the other way around.” Distinguished by what the neurologist George Beard, in the 1880s, called “delusions,” hypochondria was reconceived as a mental illness: It was a psychological state of unwarranted concern with one’s health.

In the 20th century, the prototypical hypochondriac became the kind of neurotic whom Woody Allen plays in Hannah and Her Sisters: someone who obsessively thinks they are sick when they’re not. Freud’s view that unexplained physical symptoms can be the body’s expression of inner conflict—meaning that those symptoms could be entirely psychological in origin—played an influential role. The idea that stress or anguish could manifest as bodily distress, in a process that came to be called “somatization,” spread. So did 20th-century medicine’s new capacity to test for and rule out specific conditions. Consider Allen’s character in that film, fretting about a brain tumor, only to have his worries assuaged by a brain scan. This newly psychologized anxiety, juxtaposed with medical science’s objective findings, helped solidify the modern image of the hypochondriac as a comedic figure, easily caricatured as a neurotic who could, and should, just “snap out of it.”

Unlike some other forms of anxiety, health worries are a problem that neither better labels nor improved treatments can hope to completely banish. Hypochondria, the writer Brian Dillon pointedly notes in his book The Hypochondriacs: Nine Tormented Lives, ultimately “makes dupes of us all, because life, or rather death, will have the last laugh.” In the meantime, we doubt, wait, anticipate, and try to identify: Is that stabbing headache a passing discomfort, or a sign of disease? Our bodies are subject to fluctuations, as the medical science of different eras has understood—and as today’s doctors underscore. The trick is to pay enough attention to those changes to catch problems without being devoured by the anxiety born of paying too much attention.

In retrospect, Crampton, as a high-school student in England, wasn’t anxious enough, overlooking for months a tennis-ball-size lump above her collarbone that turned out to be the result of Hodgkin’s lymphoma, a blood cancer. Her doctor told her there was a significant chance that treatment would leave her cancer-free. After chemo, radiation, one relapse, and a stem-cell transplant, she got better. But the experience left her hypervigilant about her body, anxious that she might miss a recurrence. As she reflects, “it took being cured of a life-threatening illness for me to become fixated on the idea that I might be sick.” Her conscientious self-monitoring gave way to panicked visits to urgent care and doctors’ offices, seeking relief from the thought that she was experiencing a telltale symptom—a behavior that she feels guilty about as a user of England’s overstretched National Health Service. “At some point,” she writes, “my responsible cancer survivor behavior had morphed into something else.”

[From the January/February 2014 issue: Scott Stossel on surviving anxiety]

What Crampton was suffering from—the “something else”—seems to be what the DSM now labels “illness anxiety disorder,” an “excessive” preoccupation with health that is not marked by intense physical symptoms. It applies both to people who are anxious without apparent cause or symptoms and to people like Crampton, who have survived a serious disease that might recur and are understandably, but debilitatingly, apprehensive.

It can be hard to distinguish this term, Crampton finds, from the DSM’s other one, somatic symptom disorder, which describes a disproportionate preoccupation that is accompanied by persistent physical symptoms. It applies to people who catastrophize—the person with heartburn who grows convinced that she has heart disease—as well as those with a serious disease who fixate, to their detriment, on their condition. The definition makes a point of endorsing the validity of a patient’s symptoms, whatever the cause may be; in this, it embodies a 21st-century spirit of nonjudgmental acceptance. Yet because it is a diagnosis of a mental “disorder,” it inevitably involves assessments—of, among other things, what counts as “excessive” anxiety; evaluations like these can be anything but clear-cut. Medicine’s distant and not so distant past—when multiple sclerosis was often misdiagnosed as hysteria, and cases of long COVID were dismissed as instances of pandemic anxiety—offers a caution against confidently differentiating between psychological pathology and poorly understood illness.

In Crampton’s view, the DSM’s revision has turned out to be “an extensive exercise in obfuscation.” Some physicians and researchers agree that the categories neither lump nor split groups of patients reliably or helpfully. A 2013 critique argued that somatic symptom disorder would pick up patients with “chronic pain conditions [and] patients worrying about the prognosis of a serious medical condition (e.g., diabetes, cancer),” not to mention people with undiagnosed diseases. A 2016 study failed to provide “empirical evidence for the validity of the new diagnoses,” concluding that the use of the labels won’t improve the clinical care of patients suffering from “high levels of health anxiety.”

“Hypochondria only has questions, never answers, and that makes us perpetually uneasy,” Crampton writes. Still, she finds that she almost mourns the old term. Its imperfections fit her messy experience of anxiety—and help her describe it to herself and doctors, giving “edges to a feeling of uncertainty” that she finds overwhelming. But her position, she acknowledges, is a privileged one: As a former adolescent cancer patient, she gets care when she seeks it, and doesn’t really have to worry about being stigmatized by doctors or friends.

Crampton’s concerns and her experience, that is, are legible to the medical system—to all of us. But that is not true for the millions of patients (many of them young women) suffering from fatigue or brain fog who struggle to get doctors to take their symptoms seriously, and turn out to have a condition such as myalgic encephalomyelitis/chronic fatigue syndrome or an autoimmune disease. They, too, are pulled into the story of hypochondria—yet the DSM’s labels largely fail to solve the problem these patients encounter: In the long shadow of Freud, we are still given to assuming that what clinicians call “medically unexplained symptoms” are psychological in origin. Fifteen-minute appointments in which doctors often reflexively dismiss such symptoms as indicators of anxiety don’t help. How can doctors usefully listen without time—or medical training that emphasizes the bounds of their own knowledge?

This omission is the real problem with the DSM’s revision: It pretends to have clarity we still don’t have, decisively categorizing patients rather than scrutinizing medicine’s limitations. The challenge remains: Even as evidence-based medicine laudably strives to nail down definitions and make ever-finer classifications, patients and practitioners alike need to recognize the existential uncertainty at the core of health anxiety. Only then will everyone who suffers from it be taken seriously. After all, in an era of pandemics and Dr. Google, what used to be called hypochondria is more understandable than ever.

Someday we might have the longed-for “definitive test” or a better set of labels, but right now we must acknowledge all that we still don’t know—a condition that literature, rather than medicine, diagnoses best. As John Donne memorably wrote, in the throes of an unknown illness, now suspected to have been typhus, “Variable, and therefore miserable condition of man! This minute I was well, and am ill, this minute.”


This article appears in the June 2024 print edition with the headline “Hypochondria Never Dies.”

For a long time, having children has been a young person’s game. Although ancient records are sparse, researchers estimate that, for most of human history, women most typically conceived their first child in their late teens or early 20s and stopped having kids shortly thereafter.

But in recent decades, people around the world, especially in wealthy, developed countries, have been starting their families later and later. Since the 1970s, American women have on average delayed the beginning of parenthood from age 21 to 27; Korean women have nudged the number past 32. As more women have kids in their 40s, the average age at which women give birth to any of their kids is now above 30, or fast approaching it, in most high-income nations.

Rama Singh, an evolutionary biologist at McMaster University, in Canada, thinks that if women keep having babies later in life, another fundamental reproductive stage could change: Women might start to enter menopause later too. That age currently sits around 50, a figure that some researchers believe has held since the genesis of our species. But to Singh’s mind, no ironclad biological law is stopping women’s reproductive years from stretching far past that threshold. If women decide to keep having kids at older ages, he told me, one day, hundreds of thousands of years from now, menopause could—theoretically—entirely disappear.

Singh’s viewpoint is not mainstream in his field. But shifts in human childbearing behavior aren’t the only reason that menopause may be on the move. Humans are, on the whole, living longer now, and are in several ways healthier than our ancient ancestors. And in the past few decades, especially, researchers have made technological leaps that enable them to tinker like never before with how people’s bodies function and age. All of these factors might well combine to alter menopause’s timeline. It’s a grand experiment in human reproduction, and scientists don’t yet know what the result might be.

So far, scientists have only scant evidence that the age of onset for menopause has begun to drift. Just a few studies, mostly tracking trends from recent decades, have noted a shift on the order of a year or two among women in certain Western countries, including the U.S. and Finland. Singh, though, thinks that could be just the start. Menopause can come on anywhere from a person’s 30s to their 60s, and the timing appears to be heavily influenced by genetics. That variation suggests some evolutionary wiggle room. If healthy kids keep being born to older and older parents, “I could see the age of menopause getting later,” Megan Arnot, an anthropologist at University College London, told me.


Singh’s idea assumes that menopause is not necessary for humans—or any animal, for that matter—to survive. And if a species’ primary directive is to perpetuate itself, a lifespan that substantially exceeds fertility does seem paradoxical. Researchers have found lengthy post-reproductive lifespans in only a handful of other creatures—among them, five species of toothed whales, plus a single population of wild chimpanzees. But women consistently spend a third to half of their life in menopause, the largest share of life after reproduction documented in any mammal.

In humans, menopause occurs around the time when ovaries contain fewer than about 1,000 eggs, at which point ovulation halts and bodywide levels of hormones such as estrogen plummet. But there’s no biological imperative for female reproductive capacity to flame out after five decades of life. Each human woman is born with some 1 to 2 million eggs—comparable to what researchers have estimated in elephants, which remain fertile well into their 60s and 70s. Nor do animal eggs appear to have a built-in expiration date: Certain whales, for instance, have been documented bearing offspring past the age of 100.

[Read: Why killer whales (and humans) go through menopause]

This disconnect has led some researchers to conclude that menopause is an unfortunate evolutionary accident. Maybe, as some have argued, menopause is a by-product of long lifespans evolving so quickly that the ovaries didn’t catch up. But many women have survived well past menopause for the bulk of human history. Singh contends that menopause is a side effect of men preferring to mate with younger women, allowing fertility-compromising mutations to accumulate in aged females. (Had women been the ones to seek out only younger men, he told me, men would have evolved their own version of menopause.) Others disagree: Arnot told me that, if anything, many of today’s men may prefer younger women because fertility declines with age, rather than the other way around.

But the preponderance of evidence suggests that menopause is beneficial to the species in which it has evolved, including ours, Francisco Úbeda de Torres, a mathematical biologist at Royal Holloway, University of London, told me. Certainly, menopause was important enough that it appears to have arisen multiple times—at least four separate times among whales alone, Samuel Ellis, a biologist at the University of Exeter, told me.

One of the most prominent and well-supported ideas about why menopause evolved revolves around grandmothering. Maybe menopause evolved to rid older women of the burden of fertility, freeing up their time and energy to allow them to help their offspring raise their own needy kids. In human populations around the world, grandmother input has clearly boosted the survival of younger generations; the same appears to be true among orcas and other toothed whales. Kristen Hawkes, an anthropologist at the University of Utah, argues that the influence of menopausal grandmothering was so immense that it helped us grow bigger brains and shaped the family structures that still govern modern societies; it is, she told me, sufficient to explain both menopause in humans and what has made us the people we are today.

[From the October 2019 issue: The secret power of menopause]

Some researchers suspect that menopause may have other perks. Kevin Langergraber, an ecologist at Arizona State University, points out that certain populations of chimpanzees can also live well past menopause, even though their species doesn’t really grandmother at all. In chimpanzees and some other animals, he told me, menopause might help reduce the competition for resources between mothers and their children as they simultaneously try to raise young offspring.

Regardless of the precise reasons, menopause may be deeply ingrained in our lineage—so much so that it could be difficult to adjust or undo. After all this time of living with an early end to ovulation, there is probably “no single master time-giver” switch that could be flipped to simply extend human female fertility, Michael Cant, an evolutionary biologist at the University of Exeter, told me.


Perhaps, though, menopause’s timeline could still change—not on scales of hundreds of thousands of years, but within generations. Malnutrition and smoking, for instance, are linked to an early sunsetting of menses, while contraceptive use may push the age of menopause onset back—potentially because of the ways in which these factors can affect hormones. Menopause also tends to occur earlier among women of lower socioeconomic status and with less education. Accordingly, interventions as simple as improving childhood nutrition might be enough to raise the average start of menopause in certain parts of the world, Lynnette Sievert, an anthropologist at the University of Massachusetts at Amherst, told me.

[Read: Why so many accidental pregnancies happen in your 40s]

Changes such as those would likely operate mostly on the margins—perhaps closing some of the gaps between poorer and richer nations, which can span about five years. Bigger shifts, experts told me, would probably require medical innovation that can slow, halt, or even reverse the premature aging of the ovaries, and maintain a person’s prior levels of estrogen and other reproductive hormones. Kara Goldman, an obstetrician-gynecologist and a reproductive scientist at Northwestern University, told me that one key to the ovarian fountain of youth might be finding drugs to preserve the structures that house immature eggs in a kind of dormant early state. Other researchers see promise in rejuvenating the tissues that maintain eggs in a healthy state. Still others are generating cells and hormones in the lab in an attempt to supplement what the aging female body naturally loses. Deena Emera, an evolutionary geneticist at the Buck Institute for Research on Aging, in California, thinks some of the best inspiration could come from species that stay fertile very late into life. Bowhead whales, for instance, can reproduce past the age of 100—and don’t seem to succumb to cancer. Maybe, Emera told me, they’re especially good at repairing DNA damage in reproductive and nonreproductive cells alike.

Some women may welcome an extended interval in which to consider having kids, but Goldman and Emera are most focused on minimizing menopause’s health costs. Studies have repeatedly linked the menopause-related drop in hormones to declines in bone health; some research has pointed to cardiovascular and cognitive issues as well. Entering menopause can entail years of symptoms such as hot flashes, urinary incontinence, vaginal dryness, insomnia, and low libido. Putting all of that off, perhaps indefinitely, could extend the period in which women live healthfully, buoyed by their reproductive hormones.

[Read: Women in menopause are getting short shrift]

Extending the ovaries’ shelf life won’t necessarily reverse or even mitigate menopause’s unwanted effects, Stephanie Faubion, the director of Mayo Clinic’s Center for Women’s Health, told me. Plus, it may come with additional risks related to later-in-life pregnancies. It could also raise a woman’s chances of breast or uterine cancer, blood clots, and stroke, Jerilynn Prior, an endocrinologist at the University of British Columbia, told me. And putting off menopause may also mean more years of menstruation and contraception, a prospect that will likely give many women pause, says Nanette Santoro, an obstetrician-gynecologist and a reproductive scientist at the University of Colorado School of Medicine.

But several researchers think some tweaking is worth a shot. Even if menopause once helped our species survive, Goldman said, “it’s hard to imagine” that’s still the case. Evolution may have saddled us with an odd misalignment in the lifespans of the ovaries and the other organs they live alongside. But it has also equipped us with the smarts to potentially break free of those limits.

Earlier this week, news leaked of the biggest change in federal drug policy in more than half a century. The Associated Press reported—and the Department of Justice later confirmed—that the Drug Enforcement Administration plans to recategorize marijuana under the Controlled Substances Act. Since the 1970s, it’s been placed in Schedule I, a highly controlled group that includes drugs like heroin, with a high potential for abuse and no medical use. But cannabis will soon be moved to the much less restrictive Schedule III, which includes prescription drugs such as ketamine and Tylenol with codeine that have a moderate-to-low risk of addiction.

Currently, recreational cannabis is legal for adults over the age of 21 in 24 states, which are home to more than half of the U.S. population. According to a recent Harris poll, about 40 percent of Americans use cannabis, and a quarter do so on at least a weekly basis. And yet, researchers and physicians told me, scientific consensus on the drug’s precise effects—especially on the heart and lungs, mental health, and developing adolescent brains—is still lacking. Rescheduling marijuana will broaden access further still, which makes finding better answers to those questions even more crucial.

Conveniently, rescheduling marijuana is also likely to spur in-depth study, in part by expanding research opportunities that were previously limited or nonexistent. Easing restrictions will ultimately mean learning a lot more about the potential harms and benefits of a drug that for decades has been both popular and demonized.

Historically, the scope of cannabis research has been fairly limited. The National Institute on Drug Abuse, a major federal research funder, has a directive to study the harms of cannabis use rather than any potential benefits, says Amanda Reiman, the chief knowledge officer of New Frontier Data. (New Frontier is an analytics firm focused on the legal cannabis industry.) In 2018, across the U.S., U.K., and Canada, research on the potential harms of cannabis use received more than double the funding that went to research on its medicinal or therapeutic uses. In 2020, a spokesperson for NIDA told Science that although the agency’s traditional focus was on marijuana addiction, it has started exploring the therapeutic potential of compounds in cannabis to treat addiction to other substances.

U.S. policy has also made marijuana research of any sort very difficult. Until recently, scientists had to obtain their supply from NIDA’s high-security Mississippi facility. (Six more sources were approved last year.) Researchers regularly complained that the marijuana was moldy and contained less THC and CBD than the product that regular consumers could legally purchase at their local dispensary.

[Read: The government’s weed is terrible]

Most existing research on how cannabis affects our hearts, our brains, and our society at large is based on self-reported survey data, Peter Grinspoon, a physician at Massachusetts General Hospital and a medical-cannabis expert, told me. Such data are “notoriously inaccurate,” he said. But researchers have been forced to rely on these methods because cannabis is a Schedule I drug, so no studies that receive federal funding can simply give marijuana from state-approved dispensaries to people and record what happens.

As a result, the field lacks enough high-quality studies for researchers to agree on what the evidence implies, says Nick Cioe, an associate professor at Assumption University in Massachusetts who has studied the effects of marijuana on traumatic brain injuries. Randomized controlled trials are the gold standard for determining a given drug’s harms and benefits, but for weed, they’ve been nearly impossible to run. The FDA has approved a handful of cannabis-derived products to treat conditions such as seizures and chemotherapy-induced nausea, but that’s not the same as understanding the effects of recreational weed.

After marijuana is officially rescheduled, researchers will have a far easier time studying the drug’s effects. Researching any federally controlled substance is difficult, but obtaining the proper licenses for using Schedule III drugs in the lab is much less arduous than for Schedule I. Scientists will also have far more opportunities to obtain federal grant funding from all sorts of governmental bodies—the National Institutes of Health, the EPA, even the National Highway Traffic Safety Administration—as policy makers rush to understand the implications of legalization.

Human trials won’t start the second that the DEA makes marijuana’s new status official. Researchers will have to wait for guidance from federal agencies like the FDA and the NIH, says R. Lorraine Collins, the director of the University at Buffalo’s Center for Cannabis and Cannabinoid Research. And given the limitations around Schedule III drugs, scientists still won’t be able to simply purchase the same cannabis that millions of Americans are consuming from their local dispensary.

[Read: Almost no one is happy with legal weed]

Schedule III won’t “magically alleviate the bureaucratic headaches” associated with researching cannabis, Grinspoon said. But “it’s going to be a lot easier to say, ‘Let’s give this person cannabis and see what happens to their blood pressure.’”

When Kathleen Walker-Meikle, a historian at the University of Basel, in Switzerland, ponders the Middle Ages, her mind tends to drift not to religious conquest or Viking raids, but to squirrels. Tawny-backed, white-bellied, tufted-eared red squirrels, to be exact. For hundreds of years, society’s elites stitched red-squirrel pelts into luxurious floor-length capes and made the animals pets, cradling them in their lap and commissioning gold collars festooned with pearls. Human lives were so intertwined with those of red squirrels that one of history’s most cursed diseases likely passed repeatedly between our species and theirs, according to new research that Walker-Meikle contributed to.

Uncomfortable questions about medieval squirrels first came up about a decade ago, after another group of researchers stumbled upon three populations of red squirrels—one in Scotland, two on different English islands—with odd-looking features: swollen lips, warty noses, skin on their ears that had grown thick and crusty. A search for microbial DNA in some of those squirrels’ tissues revealed that they had leprosy. “What’s it doing in red squirrels?” John Spencer, a microbiologist at Colorado State University, recalled thinking at the time. Scientists had long thought that leprosy affected only humans, until the 1970s, when they began to find the bacterium that causes it in armadillos too, Daniel Romero-Alvarez, an infectious-disease ecologist and epidemiologist at Universidad Internacional SEK, in Ecuador, told me. But that was in the Americas; in Europe, dogma held that leprosy had essentially vanished by about the 16th century. The most plausible explanation for the pathogen’s presence in modern squirrels, Spencer told me, was that strains of it had been percolating in the rodents unnoticed for hundreds of years.

Bacterial genomes extracted from several of the infected British squirrels suggested that this was the case: Those sequences bore a strong resemblance to others previously pulled out of medieval human remains. The next step was proving that medieval squirrels carried the bacterium too, Verena Schünemann, a paleogeneticist at the University of Zurich, in Switzerland, and one of the new study’s authors, told me. If those microbes were also genetically similar to ones found in medieval people, they’d show that leprosy had probably regularly jumped between rodents and humans.

[Read: Tuberculosis got to South America through … seals?]

Schünemann teamed up with Sarah Inskip, an archaeologist at the University of Leicester, in the U.K., and set out to find an archaeological site in Britain with both human and squirrel remains. They zeroed in on the medieval city of Winchester, once famous for its fur-obsessed market patrons, as well as a large leprosarium. After analyzing dozens of samples from around Winchester, the team was able to extract just four leprosy genomes—three from humans, one from the tiny foot bone of a squirrel. But those turned out to be enough. All four samples dated to about the High Middle Ages—the oldest detection so far of leprosy in a nonhuman animal, Inskip told me. The genomes also all budded from the same branch of the leprosy family tree, sharing enough genetic similarities that they strongly indicated that medieval humans and squirrels were swapping the disease-causing bugs, Schünemann told me.

Still, Schünemann wasn’t sure exactly how that would have happened, given that transmitting a leprosy infection generally requires prolonged and close contact. So, hoping to fill in the blanks, she reached out to Walker-Meikle, who has extensively studied medieval pets.

Walker-Meikle already had the exact type of evidence that Schünemann and her colleagues were looking for: medieval artwork depicting people cradling the animals, documents describing women taking them out for walks, financial accounts detailing purchases of flashy, rodent-size accessories and enclosures of the sort people today might buy for pet dogs, Walker-Meikle told me. Squirrels were so popular at the time that she found written references to the woes of a 13th-century archbishop who, despite years of pleading, couldn’t get the nuns in his district to stop doting on the creatures. They were essentially akin, she said, to tiny lapdogs. Fur processing, too, would have provided ample opportunity for spread. In the High and Late Middle Ages, squirrel fur was the most popular fur used to trim and line garments, and clothes made with it were considered as high fashion as a Prada bag now, Schünemann told me. In a single year in the 14th century, the English royal household purchased nearly 80,000 squirrel-belly skins. Contact between squirrels and humans was so intimate that, throughout much of the Middle Ages, leprosy likely ping-ponged back and forth between the two species, Inskip told me.

[Read: Admit it, squirrels are just tree rats]

The team’s work doesn’t say anything about the origins of leprosy, which has afflicted humans for at least thousands of years, and it can’t prove whether the pathogen infiltrated humans or red squirrels first. But it does further dispel the notion that leprosy is a problem only for humans, Romero-Alvarez told me. Armadillos may have picked up leprosy from humans relatively recently, after Europeans imported the pathogen to South America. The scaly mammals are now “giving it back to humans,” Spencer told me, especially, it seems, in parts of South America and the southern United States, where some communities hunt and eat the animals or keep them as pets.

Human-to-human transmission still accounts for the majority of leprosy spread, which remains uncommon overall. But Romero-Alvarez pointed out that the mere existence of the bacterium in another species, from which we and other creatures can catch it, makes the disease that much more difficult to control. “Everybody believes that leprosy is gone,” Claudio Guedes Salgado, an immunologist at Pará Federal University, in Brazil, told me. “But we have more leprosy than the world believes.” The barriers between species are porous. And once a pathogen crosses over, that jump is impossible to fully undo.
