Updated at 9:30 a.m. ET on January 9, 2026

Every few weeks I turn up in a hospital gown at a medical exam room in Massachusetts and describe a set of symptoms that I don’t really have. Students listen to my complaints of stomach pain, a bad cough, severe fatigue, rectal bleeding, shortness of breath, a bum knee, HIV infection, even stab wounds; on one occasion I simply shouted incoherently for several minutes, as if I’d had a stroke. Then the students do their best to help.

I have been given nearly 100 ultrasounds in just the past year, and referred to behavioral counseling dozens of times. I have been consoled for my woes, thanked for my forthrightness, congratulated for my efforts to improve my diet. I have received apologies when they need to lower my gown, press on my abdomen, or touch me with a cold stethoscope. Our encounters, which sometimes run as long as 40 minutes, end with the students giving me their diagnoses; detailing every test, treatment, and drug they want me to have; and then answering all of my questions without ever looking at their watch. Before leaving, they commend me for coming in and promise to check back in on me. It’s a shame I have to feign an illness to get that kind of care.

I learned about fake medical care four years ago when my son, an M.D.-Ph.D. student, mentioned that he was being graded on his skill at treating “standardized patients”: people who are paid to role-play illness. I’m fascinated by the practice of medicine, so I found this notion irresistible. I applied for a job in the standardized-patient program at the University of Massachusetts, and after two full days of training, plus a lot of reading and videos, I was ready to get started.

The practice of faking medical encounters for the sake of education dates back to 1963 at the University of Southern California, but UMass developed one of the first formalized programs in 1982 and has been a model since. Such programs are now, well, standard: According to a count published in a 2023 review of the practice, 187 of the 195 accredited medical schools in the U.S. describe the use of standardized patients on their websites.

Each specific case that an SP might inhabit—and there are hundreds—comes with a minimum of two hours of additional training in person or via Zoom, along with more reading. We’re buried in a blizzard of unique details to memorize about the patients we portray. By the time I’m ready for my fake exam, I can rattle off what vaccinations I’ve had, how long I’ve worked at my job, whether I’ve had my tonsils out, when my mother died, how much weight I’ve gained or lost in recent months, which vitamins I take, how much coffee I drink, how chatty I tend to be, and whether I’ve traveled recently (and might have parasites!).

There’s no script for my encounters, because you never know what the students might ask, say, or do. So I improvise most of my responses, in keeping with the facts I’ve been given. What do I usually eat for breakfast? What do they make at the factory where I work? What sexual acts do my partner and I engage in? My ad-libs are acceptable, according to the grades I get from staff members who occasionally observe the encounters via camera. But many of my colleagues are professional actors, and their performances are superb. We sometimes work in pairs, and more than once I’ve found myself deeply moved—even to the verge of tears—by my partner’s fake suffering.

Of course, we SPs are not the only ones faking it in these sessions; the students are playing along, too. We score them on as many as 50 different elements, including their tone of voice (was it friendly but professional?), their body language (did they lean in to show engagement?), and their facility at palpating our spleens (did they dig in firmly in the right spot?). Most important, we are meant to check that they are learning empathy. Numerous studies have shown that more empathetic care is correlated with better clinical outcomes, perhaps because it makes patients more inclined to share their full medical history, and more likely to stick with whatever treatment has been recommended. In one survey, orthopedic-surgery patients reported that a doctor’s empathy was more central to their satisfaction than the time it took to get an appointment, how long they were stuck in the waiting room, or even what sort of treatment they ended up receiving.

It may not even matter if the doctor’s kindness is sincere, as long as it sounds that way to patients. Dave Hatem, an internist and professor emeritus at UMass who has helped oversee the school’s SP curriculum, told me that even just the act of trying to say empathetic things is valuable for students. “If you get the right words to come out of your mouth, and you do it often enough, then you get to the point where you really mean it,” he said.

Most of the medical students who examine me do seem genuine in their concern. I suspect that if it were up to them, they’d practice medicine this way for the whole of their careers. But however much they might want to provide the superb treatment that I experience as a standardized patient, the health-care system won’t let them.


Elaine Thompson is a recent graduate of Emory University’s medical school, where she learned to provide the same sort of long, thoughtful, whole-person interactions that I get from students. For the past three years, she has been an ear, nose, and throat resident at Johns Hopkins Medicine, one of the best medical centers in the world. Her real-life patient encounters now last for an average of 10 minutes.

“You quickly learn as a resident that the job is to move things along,” Thompson told me. “I’m still curious about my patients as people and want to learn about their families, but if it’s not relevant to their current problem, then asking about it opens a door that will add time to the visit.” So much for chatting to put them at ease, soliciting a full narrative of their symptoms, hearing all their concerns, asking about their job, uncovering anxieties, addressing financial and social challenges, and encouraging their questions. (In an emailed statement, a spokesperson for Johns Hopkins Medicine said that it is committed to delivering “patient-centered training” and “whole person care.”)

[Read: Learning empathy from the dead]

The same is true for Emily Chin, who received her medical degree from UMass in 2023 and is now an ob-gyn resident at UC San Francisco. She told me that she got the message about keeping visits short early on from senior residents, who made a point of tracking the length of her encounters. “I’d just have time to check the cervix, do a quick ultrasound, and then make a decision about admitting or discharging the patient,” she said. Another source of pressure is the knowledge that spending any extra time with a patient means that dozens of other patients will be waiting longer to be seen: “You see the patients piling up in the waiting room, and you see the schedule screen going red.” (UCSF’s vice dean for education, Karen Hauer, did not object to this characterization, but noted that the school advises its residents on how to establish patient rapport when time is short.)

Residents also learn that time is money. Hospitals and practices view a doctor’s interactions with a patient in terms of “relative value units.” Reimbursement for seeing a patient whose high cholesterol leads to a prescription for a statin might bring $60 into the hospital or clinic. Reimbursement for extra time spent discussing the patient’s fears of side effects and concerns about affording the drug’s co-pay or making dietary changes brings in $0. “That doesn’t exactly encourage providing the most empathetic, patient-centered care,” a UMass Memorial Health resident named Hans Erickson told me.

The residents I spoke with worried that these time pressures were only going to get worse when they finished residency and became full-fledged doctors. In light of those constraints, does it still make sense to emphasize highly empathetic care for students? I asked that question of Melissa Fischer, the physician who directs the SP program and other simulation training at UMass. Fischer argues that the lessons we impart to students can survive the crush of residency, even if they have to be applied in abbreviated ways. “That interest in building connections to patients stays,” she said. “They just have to find faster ways to build them.”

[Read: How to teach doctors empathy]

Lisa Howley, an educational psychologist who serves as the senior director for transforming medical education at the Association of American Medical Colleges, told me that training up a generation of more empathetic medical students will make the health-care system better. “We think of young medical learners as agents of potential change,” she told me. “They’ll see the gaps and weaknesses, and they’ll look for ways to make improvements.” Besides, what would be the benefit of forcing medical students to learn about patient encounters in the hectic, abbreviated format they’ll confront as residents? “It doesn’t make sense to apply those pressures early in their education,” she said. After all, we don’t teach student pilots how to fly a plane while trying to make up for time lost to flight delays or dealing with unruly passengers.

All of the residents I spoke with said they look for ways to connect with patients despite the harsh realities of the system. “The desire to get to know the patient as a whole person doesn’t go away; it’s just a matter of finding ways to bring it to the surface as a stressed resident,” Erickson said. Chin put it this way: “It’s not that it’s challenging to keep up empathy, it’s that it’s hard to be empathetic all the time.”

At the end of my fake encounters, I try to be encouraging. I tell the students how I, as a patient, felt treated by them, and then I challenge them to give ideas for how they might improve. Sometimes, when one of them has done a bang-up job of making me feel heard, I tell them that I hope they’ll be able to sustain that level of engagement when they’re a practicing doctor—and I always get the sense that the students hope so too.


This article originally described “relative value units” as “revenue value units.”

Eat More Deer

Updated at 5:14 p.m. ET on January 9, 2026

The deer were out there. The crisp tracks in the snow made that clear. Three hours into our hunt through the frigid New Hampshire woods, Ryan Calsbeek, a rangy 51-year-old biology professor at Dartmouth, guessed that 200 animals were hiding in the trees around us. Calsbeek and I were 20 feet up a pignut hickory, crouching on a creaky platform. His friend Max Overstrom-Coleman, a stocky 46-year-old bar owner from Vermont, had climbed a distant tree and strung himself up by a harness, readying his compound bow and swaying in the wind. Shivering in camo jackets and neon-orange beanies, we peered into the darkening forest, daring it to move.

I had joined Calsbeek’s December hunt to try to get my hands on high-quality red meat. Calsbeek had yet to kill a deer that season, but in previous years, he told me, a single animal kept his family of four well fed through the winter. His young daughters especially liked to eat deer heart; apparently, it’s marvelously rich and tender. My mouth watered at the thought. The last time I’d tasted venison was more than a decade ago at a fancy restaurant in Toronto, where it was served as carpaccio, drizzled in oil and so fresh that it may as well have pranced out of the woods and onto my plate.

A bounty of such succulent, free-range meat is currently running through America’s backyards. The continental United States is home to some 30 million white-tailed deer, and in many areas, their numbers are growing too rapidly for comfort. Each year, a white-tailed doe can typically birth up to three fawns, which themselves can reproduce as soon as six months later.

Wherever deer are overabundant, they are at best a nuisance and at worst a plague. They trample gardens, destroy farmland, carry ticks that spread Lyme disease, and disrupt forest ecosystems, allowing invasive species to spread. They are involved in tens of thousands of car crashes each year in New York and New Jersey, where state wildlife departments have encouraged hunters to harvest more deer. In especially populated regions, wildlife agencies hire sharpshooters to cull the animals. Last year, New Hampshire legislators expanded the deer-hunting season in an attempt to keep the population under control. By the looks of the forest floor, which was pitted with hoof marks and scattered with marble-shaped droppings, that effort was falling short.

Over the past decade, some states have proposed a simple, if controversial, strategy for bringing deer under control: Couldn’t people like me—who don’t hunt but aren’t opposed to it—eat more venison?

Venison may not be a staple of American cuisine, but it has a place in many people’s diets. Health influencers laud it as a lean, low-calorie, nutrient-dense source of protein. Venison jerky sticks are sold at big-box stores and advertised as snacks for people on Whole30 and keto diets. Higher-end grocery stores, such as Wegmans and Whole Foods, sell ground venison for upwards of $12 a pound, roughly twice the cost of ground beef.

Part of the reason venison is so expensive is that most of it is not homegrown: It’s imported from New Zealand, which has sent more than 5 million pounds of the stuff to the U.S. every year since 2020. Beef, the dominant red meat in the States, has historically been more affordable. But beef prices jumped nearly 15 percent in 2025, and the conventional kind sold in most supermarkets comes from cattle raised in abysmal conditions. If high-quality venison were cheaper and more widely available, it could be an appetizing alternative.

In recent years, a few deer-swamped states, including New Jersey and Maryland, have tried to legalize the sale of hunted venison, which would deliver two key benefits: more deer out of the ecosystem and more venison on people’s plates. Despite the sport’s association with trophies, many deer hunters are motivated by the prospect of obtaining meat, and they can only consume so much. “It’s for your own table,” Overstrom-Coleman said as he fixed climbing sticks onto a tree to form a makeshift ladder. He had already stocked his freezer full of venison this season (“That son of a bitch,” Calsbeek whispered, once we’d left our companion in his tree) and planned, as many hunters do, to donate any excess meat to a food bank.

Hunting is waning in popularity, in part because younger people are less keen on participating than older generations. Efforts to bring in more hunters, such as programs to train women and youth in outdoor skills, are under way in many states. Women are the fastest-growing demographic, and they participate largely to acquire food, Moira Tidball, the executive director at the Cornell Cooperative Extension who leads hunting classes for women, told me. Still, interest is not growing fast enough for the subsistence-and-donation system to keep deer numbers in check.  

[Read: America needs hunting more than it knows]

It’s hard to imagine a better incentive for deer hunting than allowing hunters to sell their venison to stores and restaurants. But the idea is antithetical to a core tenet of American conservation. For more than 100 years, the country’s wild game has flourished under the protection of hunters and their allies, steadfast in their belief that the nation’s animals are not for sale.

The last time this many white-tailed deer roamed America’s woodlands, the country didn’t yet exist. To the English colonists who arrived in the New World, the deer bounding merrily through the forests may as well have been leaping bags of cash. Back home, deer belonged to the Crown, and as such, could be hunted only by the privileged few, Keith Tidball, a hunter and an environmental anthropologist at Cornell (and Moira’s spouse), told me. In the colonies, they were free for the taking.

Colonists built a robust trans-Atlantic trade in deer hides, a particularly popular leather for making work boots and breeches; the trade drastically reduced the deer population. In Walden, Henry David Thoreau mentions a man who preserved the horns “of the last deer that was killed in this vicinity.” The animals were already close to disappearing from many areas at the beginning of what ecologists have called the “exploitation era” of white-tailed deer, starting in the mid-19th century. Fifty years later, America was home to roughly half a million deer, down 99 percent from precolonial days.

The commerce-driven decimation of the nation’s wildlife—not just deer but birds, elk, bears, and many other animals—unsettled many Americans, especially hunters. In 1900, Representative John Lacey of Iowa, a hunter and close friend of Theodore Roosevelt’s, introduced a bill to ban the trafficking of America’s wildlife. (As Roosevelt, who notoriously hunted to collect trophies, wrote in 1913, “If there is to be any shooting there must be something to shoot.”) The Lacey Act remains one of the most binding federal conservation laws in existence today.  

[From the May 1906 issue: Camping with President Theodore Roosevelt]

The law is partly contingent on state policies, which make exceptions for certain species. Hunters in most states, for example, can legally harvest and sell the pelts of fur-bearing species such as otters, raccoons, and coyotes. But attempts to carve out similar exceptions for hunted venison, including the bills in Maryland and New Jersey, have failed. In 2022, the Mississippi attorney general published a statement that opened up the possibility of legalizing the sale of hunted deer, provoking fierce opposition from hunters and conservationists; today, the option remains open but has not led to any policy changes. Last year, an Indiana state representative introduced a bill that would allow the sale of hunted venison, but so far it has gone nowhere.  

The practical reason such proposals keep failing is that allowing the sale of hunted meat would require huge investments in infrastructure. Systems to process meat according to state and federal laws would have to be developed, as would rapid testing for chronic wasting disease, an illness akin to mad cow that could, theoretically, spread to humans who eat infected meat, though no cases have ever been reported. Such systems could, of course, be implemented. Hunted deer is sold in some common grocery stores in the United Kingdom, such as Waitrose and Aldi. (Notably, chronic wasting disease is not a concern there.)

[Read: Deer are beta-testing a nightmare disease]

Although the sheer abundance of deer makes them easy to imagine as steaks on legs, several experts cautioned that some people’s affection for the animals runs deep. Deer are cute; they’re docile; they’re Bambi. David Drake, a forestry and wildlife professor at the University of Wisconsin at Madison, likens them to America’s “sacred cow.” As Drake and a colleague have outlined in a paper proposing a model for commercialized venison hunting in the U.S., any modern system would be fundamentally different from the colonial-era approach because it would be regulated, mostly by state wildlife agencies. But powerful coalitions of hunters and conservationists remain both faithful to the notion that wild game shouldn’t be sold and fearful that history will repeat itself. As the Congressional Sportsmen’s Foundation, a national hunting association, puts it, “Any effort to recreate markets for game species represents a significant threat to the future of our nation’s sportsmen-led conservation efforts.” Some of the fiercest pushback to the New Jersey law, Drake told me, came from the state wildlife agency.

The only U.S. state with a deer-related exception to the Lacey Act is Vermont. During the open deer-hunting season (which spans roughly from fall to winter in the Northeast) and for 20 days afterward, Vermonters can legally sell any meat that they harvest. This policy was introduced in 1961, and yet, “I am not aware of anyone who actually takes advantage of it,” Nick Fortin, a wildlife biologist at Vermont’s Fish and Wildlife Department, told me. He added that the department, which manages the exasperated homeowners and destabilized forests that deer leave in their path, has been discussing how to raise awareness about the law.

Even after I explained the 1961 law to several Vermont hunters, they were hesitant to sell me any meat. Hunted meat is meant to be shared freely, or at most bartered for other items or goodwill, Greg Boglioli, a Vermont hunter and store owner, told me. I met Boglioli at the rural home of his friend Fred Waite, a lifelong hunter whose front room alone was decorated with 20 deer heads. I had hoped to buy venison from Waite, but he insisted on sharing it for free. After all, he had plenty. His pantry was crammed with mason jars of stewed venison in liver-colored brine. On a table in the living room was the scarlet torso of a deer that his son had accidentally hit with his truck the other day, half-thawed and waiting to be cooked.

During our hunt, I found Overstrom-Coleman to be more open to the idea of selling the venison he hunted. “I guess that would be a pretty excellent way to share it,” he said. Earlier in the season, he’d killed a deer in Vermont, and he was willing to sell me some of the meat the next day. At least, I thought as I stared into the motionless woods, I’d be going home with something.

[From the July/August 2005 issue: Masters of the hunt]

By the time the sun went down, the only deer I’d seen was a teetering doe in a video that Overstrom-Coleman had taken from his tree and sent to Calsbeek. “Too small to kill,” he texted; he’d meet us in the parking lot. The air was glacial as Calsbeek and I trudged empty-handed toward the trailhead, hoofprints glinting mockingly in the light of our headlamps. From the trunk of the car, we took a consolation swig of Wild Turkey from a frosted bottle, and Overstrom-Coleman reminded me to visit the next day.

I found his chest freezer stuffed with paper-wrapped packages stamped with Deer 2025. He handed me three and refused to let me pay. Back home a few days later, I used one to make meatballs. Their sheer depth of flavor—earthy and robust, with a hint of nuttiness—made me wonder why I bothered to eat farmed meat at all.


This article originally misidentified Max Overstrom-Coleman’s hunting weapon.

Every night before bedtime, my daughter tilts back her head so that a pair of metal plates inside her mouth can be cranked apart another quarter of a millimeter. We turn a jackscrew with a wire tip; it spreads the bones within her upper jaw. At times she groans or even cries: she says that she can feel the pressure up into her nose.

This is normal. My daughter is 9 years old. She has a palate expander.

So does her best friend, and, by her count, so does nearly one in four of the kids in her fourth-grade class. On Reddit’s r/braces forum, a practitioner based in Frisco, Texas, said he was surprised by “how many parents ask me, ‘Hey, does my child need an expander? Everyone else seems to have one.’” His colleagues seemed to notice something similar. “Everybody’s being told they have a narrow jaw, and everyone’s being given an expander,” Neal Kravitz, the editor in chief of the Journal of Clinical Orthodontics, told me.

A generation ago, getting braces was a rite of passage into seventh grade. Today, the reshaping of a child’s smile may commence a few years earlier, at 7, 8, or 9 years old. At that point, the two sides of the upper jawbone haven’t yet joined together, a fact that is propitious for a different orthodontic process: instead of straightening, expansion. During this phase of life, when kids still have some baby teeth, a tiny dungeon rack may be wedged between a child’s upper teeth, then used to spread her upper jaw and—proponents say—introduce essential room for sprouting teeth.

The expander is an old device; debates about its use are hardly any younger. What seems to have been the first expander was described in 1860, in the journal The Dental Cosmos, by a San Francisco dentist named Emerson Angell. He wrote of “an apparatus, simple and efficient,” that he’d placed into the mouth of a young patient. Then he’d told her to expand it, day by day, by advancing a central screw—just as my daughter does today. But the journal’s editors were skeptical of Angell’s work. We “must beg leave to differ with the writer in the conclusion arrived at,” they announced in a prefatory note, foreshadowing a long disagreement within the field.

The disagreement concerned the merits of expansion versus those of extraction—whether a child’s jaw should be broadened to accommodate her teeth, or whether certain teeth should be pulled to accommodate her jaw. Around the turn of the 20th century, the influential orthodontist Edward Angle favored jaw broadening; he believed that all children should have their teeth intact, nestled in a capacious jaw, as exemplified by a human skull he called “Old Glory,” which had been taken from an Indian burial mound not far from where he practiced. A few decades later, though, orthodontic research found that expanded jaws might still “relapse” into a narrow shape. By the 1970s, pulling teeth had become the rule, Daniel Rinchuse, a Seton Hill University professor of orthodontics, told me.

This consensus was itself short-lived, he said—not because the field had come across some new and better mouth-expanding tech but because of fears about the supposed ill effects of doing too many extractions. Some dentists claimed that what was then the standard approach in orthodontics could even lead to painful disorders of the temporomandibular joint, or TMJ. In the face of these concerns, expanders made a comeback.

Eventually, some orthodontists started claiming that expanders had another major benefit—that prying open a child’s palate could improve her breathing and prevent sleep apnea. Some now recommend this airway-focused intervention not just for kids my daughter’s age but for toddlers too.

The basis for the trend was never really scientific, though. “Do expanders prevent obstructive sleep apnea? In capital letters: NO WAY,” Kravitz said. “There are endless research papers on this stuff.” The problem isn’t that expanders have no value, he continued; it’s that they’re clearly overused. According to Rinchuse, who co-edited the book Evidence-Based Clinical Orthodontics, the idea that extracting teeth will lead to joint disorders has never been proved. Indeed, no “high-quality evidence” supports expansion of the upper jaw for any reason, he said, except in cases where a child has been diagnosed with posterior “crossbite.” He said that, overall, orthodontic practice is less constrained by evidence than other fields of health care are, because the ill effects of bad decisions will be slight. As he put it, “In orthodontics, no one dies.”

Steven Siegel, the current president of the American Association of Orthodontists, acknowledged that some practitioners may be inclined to put a rack on every child’s palate: “There are some abuses,” he told me. But he also argued that the recent increase in expander use hasn’t really been dramatic, and that for the most part, the devices are used to positive effect. For people with a narrow jaw and crowded teeth, he said, expanders can prevent the need for extractions down the road; some kids, at least, could see improvements in their breathing. When I noted that I’d heard the opposite on both counts from Kravitz and Rinchuse, he responded that they simply disagreed. “I have great respect for both of them,” he said. “I would say that there is a controversy.”

For the record, my daughter is delighted by the treatment she’s received: In a recent family interview, conducted over breakfast, she described her course of orthodontics as “cool and fun.” Her orthodontist (who happens to be a former high-school classmate) has been thoughtful and communicative, and I’ve recommended her to several other families. Still, despite the fact that no one dies from orthodontics, one might also choose to avoid a treatment that costs several thousand dollars, has disputed benefits, and may cause modest pain—not to mention any moral injury that may accrue from tilting back your daughter’s head and cranking open metal plates to wrench her face apart.

Whatever caused expander mania, its existence can be jarring for a parent who grew up in the prior era of orthodontics. Indeed, the period during which this trend developed—from, say, the late 1980s until the early 2020s—happens to coincide with the stretch that intervened between my own entry into middle school and my daughter’s. For my fellow members of this cohort, expansion of the fourth-grade palate appears to be a strange and sudden social norm. During one visit to the orthodontist, my daughter and I found a handful of children about her age seated in a line of dental chairs, with technicians leaning over each of them to turn the screw of their expander. It was like we’d all gathered there for some initiation rite for children of the tribe that dwells on Cobble Hill in Brooklyn—a ritual of widening.

Not long after that, I called up Luke Glowacki, an anthropologist at Boston University who co-directs a research project in Ethiopia’s Omo Valley, where body modifications—and dental modifications in particular—are not uncommon. He told me about social groups there and elsewhere in which a child’s teeth might be filed down to points or a person’s lower lip stretched out with a plate.

Is orthodontics any different? It presents itself as curative and scientific, but many orthodontists’ websites are replete with beauty claims as well: An expander may “protect your child’s facial appearance” or provide “enhancement to the facial profile.” Siegel said that a broadened palate gives “a more aesthetic width of the smile.” Kravitz said that it could help shrink the unattractive gaps inside a person’s cheeks—“dark buccal corridors,” in the language of the field.

In East Africa, dental and other body modifications carry similar ambiguities of purpose. Filing down a person’s teeth, for instance, or removing them altogether “may also be done for ostensible health reasons,” Glowacki said. Some body-modification rituals could be understood to ward off harmful spirits, for example. In other words, they’re prophylactic. Glowacki also told me about a Nyangatom woman he knows who has scars carved into both her shoulder and forehead. The former are purely decorative, but she’d received the latter on account of being sick.

Glowacki is a parent, too, and I asked him whether his training as an anthropologist affected how he thought about expanders or other anatomical procedures, such as ear piercing, that are carried out on children in the United States at industrial scale. “You’re not gonna find any society in the world that doesn’t modify their body in some way in accordance with their ideas of beauty or of health,” he said. “We’re doing what societies all over the world do.” If now I’ve paid an orthodontist to reshape my daughter’s mouth, maybe that’s just human nature.

On a recent Tuesday morning, I was blessed with a miracle in a mini-mart. I had set out to find the protein bar I kept hearing about, only to find a row of empty boxes. But then I spotted the shimmer. Pushed to the back of one carton, gleaming in its gold wrapper, was a single Salted Peanut Butter David Protein Bar. It was mine.

David bars are putty-like rectangles of pure nutritional efficiency: 28 grams of protein stuffed into 150 calories, or roughly the equivalent of eight egg whites cooked without oil. They are booming right now. After all, in this era of protein mania, one must always be optimizing. A Quest bar might get you 20 grams of protein for just under 200 calories, but David—named after Michelangelo’s masterpiece—does more for less. “Humans aren’t perfect,” promises one David tagline, “but David is.” Why, given the possibility of perfection, would you accept eight grams less?

If a food with more protein is better, then it follows that a food with less is worse. After eating my David bar, I couldn’t help but feel a little bit bad about my dinner of brown rice and spicy chickpeas. A cup of Eden Foods organic chickpeas (240 calories) gets you a measly 12 grams. Now that I was living in the world of David, I was newly ambivalent about eating anything that wasn’t chunks of unadulterated protein. I am fueling, I thought, shoving cubes of baked tofu into my mouth. Did you know that green peas have an unusual amount of protein for a vegetable? With unsettling frequency, I began to add frozen peas to my dinners. (They’re not great on cacio e pepe, it turns out.)

I have become quietly obsessed with this one single macronutrient. How could I not be? Everything is protein now: There are protein chips and protein ice creams and cinnamon protein Cheerios. Lemonade is protein, and so is water. Last month, Chipotle introduced a “high protein cup” consisting of four ounces of cubed chicken. Melanie Masarin, the founder and CEO of Ghia, a nonalcoholic-drink brand, recently told me that an investor asked her whether Ghia has plans for a high-protein aperitif. No, but the investor’s logic was obvious: Healthy people, the kind who tend to watch their drinking, only want one thing. This week, the federal government released its latest set of dietary guidelines—including a newly inverted food pyramid. At the top is protein.

[Read: Protein madness has gone too far]

In some ways, protein is just the latest all-consuming nutritional fixation. For decades, the goal was to avoid fat, which meant that pretzels were good and peanut butter was bad and fat-free Snackwell’s devil’s-food cookie cakes were a cultural phenomenon. Then Americans rediscovered fat and villainized carbs. But protein is different. Whatever your dreams are, protein seems to be the answer. It supports muscle gain, for those trying to bulk up, but it’s also satiating, which means people trying to lose weight are also advised to eat more protein. It has the power to make you bigger and more jacked, but also smaller and more delicate. People on GLP-1s are supposed to be especially mindful of their protein intake, to prevent muscle loss on extremely low-calorie diets, but so are weight lifters.

It is a nutritional philosophy that encourages not restriction but abundance: as much protein as possible, all the time. You can have your cake and eat it too (as long as it is made with “protein flour”). In a world where the very act of eating feels fraught, layered with a lifetime of rules and fads and judgments about what food is and is not “good,” protein offers absolution: You don’t have to feel bad about this. It has so many grams! What a beautifully straightforward recommendation: Eat more of this one thing that happens to be everywhere, and that frequently tastes good.

The low rattle of protein mania—the protein matchas and protein Pop-Tarts and protein seasonings to sprinkle on your protein chicken cubes—can be as maddening as it is inescapable. Everybody knows that you are supposed to eat a varied diet with many different types of foods that provide many different nutrients. But only protein is endowed with a special kind of redemptive power. Nobody is pretending that tortilla chips are a cornerstone of a balanced diet, but if they’re protein tortilla chips (7 grams), well, then maybe they’re at least fine. This is fantastic news if your goal is to enjoy tortilla chips, but it does have a tendency to recast all food that has not been protein-ified—either by nature or by the addition of whey-protein isolate—as a minor failure. It is depressing to look at a pile of roasted vegetables, arranged elegantly over couscous, and think: I will try harder tomorrow. I know, because I do it.

Protein is supposed to allow people to realize their untapped potential—to make us stronger and sharper. I suspect, though, that I would be stronger and sharper if I could stop ambiently thinking about my protein intake. That the world is now covered in a protein-infused haze provides constant reminders that I am falling short. Lots of protein evangelists will tell you that this is how cavemen ate, and therefore it is good. I think the best part of being a caveman would be not worrying about protein.

As nutritional trends go, there are worse obsessions than protein. Even if there is still significant debate about how much protein one needs, you are unlikely to send yourself into kidney failure because you protein-maxxed too hard. But the fanatical focus on protein as the true answer, the universal key to transforming the body you have into the one you want—7 grams, 28 grams, 11 grams, a chicken smoothie—feels eerily familiar. We counted calories, grams of fat, carbohydrates, trying to distill the messy science of nutrition into one single quantitative metric. Protein, for all its many virtues, is just another thing to count.

The flu situation in the United States right now is, in a word, bad. Infections have skyrocketed in recent weeks, filling hospitals nearly to capacity; viral levels are “high” or “very high” in most of the country. In late December, New York reported the most flu cases the state had ever recorded in a single week. My own 18-month-old brought home influenza six days before Christmas: He spiked a fever above 103 degrees for days, refusing foods and most fluids; I spent the holiday syringing electrolyte water into his mouth, while battling my own fever and chills. This year’s serving of flu already seems set to be more severe than average, Seema Lakdawala, a flu virologist at Emory University, told me. This season could be a reprise of last winter’s, the most severe on record since the start of the coronavirus pandemic—or, perhaps, worse.

At the same time, what the U.S. is experiencing right now “fits within the general spectrum of what we would expect,” Taison Bell, an infectious-disease and critical-care physician at the University of Virginia Health System, told me. This is simply how the flu behaves: The virus is responsible for one of the roughest respiratory illnesses that Americans regularly suffer, routinely causing hundreds of thousands of people to be hospitalized annually in the U.S., tens of thousands of whom die. (So far this season, the flu has killed more than 5,000 people, including at least nine children.) Influenza is capable of even worse—sparking global pandemics, for instance, including some of the deadliest in history. These current tolls, however, are well within the bounds of just how awful the “seasonal” flu can be. “It’s another flu year, and it sucks,” Bell said.

Although flu is a ubiquitous winter illness, it is also one of the least understood. Scientists have been puzzling over the virus for decades, but many aspects of its rapid evolution and transmission patterns, as well as the ways in which our bodies defend against it, remain frustratingly mysterious. Flu seasons, as a rule, differ drastically from one another, and “we don’t have a great understanding of why one ends up being more severe than another,” Samuel Scarpino, an infectious-disease-modeling researcher at Northeastern University, told me. Experts’ flu-dar has also been especially out of whack in recent years, since the arrival of COVID-19 disrupted typical flu-transmission patterns. (An entire lineage of flu, for instance, may have been driven to extinction by pandemic-mitigation measures.) The virus is still finding its new norm.

Even so, a few things about this season’s ongoing torment are clear. Much of the blame rests on the season’s dominant flu variant—subclade K, which belongs to the H3N2 group of influenza. As flus go, H3N2s tend to be more likely to hospitalize and kill people; most of the worst flu seasons of the past decade in the U.S. have been driven by H3N2 surges. Subclade K doesn’t seem to be an unusually virulent variant, which is to say it’s probably no more likely to cause severe disease than a typical version of H3N2. But it does seem to be better at dodging our immune defenses, so the net effect is similar: More people get sick, and get sicker, than they otherwise would. That’s not a trivial effect for a disease that, even in mild cases, can cause days of high fevers and chills, followed by potentially weeks of that delightful run-over-by-a-truck feeling.

At UVA Health, Bell has seen a major uptick in people testing positive for the virus in recent weeks. Like others, his hospital is close to full, straining its capacity to treat other illnesses, he said. In Michigan, too, where Molly O’Shea cares for children at multiple pediatric practices, “we are seeing a ton of influenza, just a ton,” she told me. “Our schedule is overflowing.” Several of her school-age patients have wound up in the hospital, despite being previously healthy; a few have ended up with serious complications such as pneumonia and brain inflammation. The worst cases, she said, have been among the children who didn’t get their annual flu shot.

Flu vaccines are not among the most impressive immunizations in our roster. Although they’re generally pretty effective at protecting against severe disease, hospitalization, and death, they don’t reliably stave off infection or transmission. And they’re frequently bamboozled by the virus itself, which shape-shifts so rapidly throughout the year, as it ping-pongs from hemisphere to hemisphere, that by the time flu vaccines roll out to the public, they’re often at least a little out of sync with what’s currently circulating.

That’s another aggravating factor this year. Researchers first detected subclade K in June, months after experts selected the strains that would go into the fall flu-vaccine formulation. Recent data suggest that vaccination may still elicit some immune defenses that recognize subclade K, and preliminary estimates from the United Kingdom suggest that this year’s formulations may be especially effective at preventing severe disease in children, who, along with the elderly, are highly vulnerable to the flu. (For all the misery my family endured, none of us ended up in the hospital—which suggests that our vaccinations did their job.)

Children also tend to be the biggest drivers of flu’s spread. “They are the source, many times, of explosions of transmission,” Lakdawala told me. In the U.K., for instance, which experienced an unusually early start to the flu season, school-age kids appear to have driven much of the epidemic, Scarpino pointed out. In the U.S., too, case rates among children have been particularly high. Although the vaccine primarily limits severe disease, it can also affect how quickly the virus travels through a community. And yet only about half of American kids get the vaccine each year, despite long-standing universal recommendations for annual immunization. “It’s a vaccine that parents have never really treated as a vaccine that every child should get,” O’Shea said.

Those choices might be influenced by the ways many people underestimate the flu—a term often used to describe any cold-weather ailment that comes with a runny nose, cough, or even gastrointestinal upset. In reality, flu has long ranked as one of the U.S.’s top 10 or top 15 causes of death—a scourge that, through its impact on the health-care system, the workforce, and the economy at large, costs the country billions of dollars each year. Against such a substantial threat, we should be using “everything in our toolbox to protect ourselves,” Lakdawala said.

Yet the Trump administration is actively impeding the process of flu vaccination. Health and Human Services Secretary Robert F. Kennedy Jr. has also said that it may be “a better thing” if fewer people are immunized against the flu—and insisted, incorrectly, that “there is no scientific evidence that the flu vaccine prevents serious illness, hospitalizations, or death in children.” The federal government recommended annual flu vaccines for all children until earlier this month, when HHS pushed through changes that demoted multiple immunizations from its recommended schedule. HHS now says that families should consult with their health-care provider before taking the shot. Such a recommendation suggests that the vaccines’ overall benefits are ambiguous enough to require discussion—and puts an additional burden on both patients and health-care providers, who can administer what was once a routine vaccine only after a conversation that must then be documented.

The nation’s leaders have also compromised one of the country’s best chances to develop more effective, better-matched flu vaccines in the future, by defunding research into mRNA vaccines. The current flu-vaccine manufacturing process takes so long that the included strains for the Northern Hemisphere must be selected by February or so—which provides plenty of time for the virus to evolve before the autumn rollout begins, as happened this year. “We pretty regularly have a bad match for the flu,” Scarpino said. mRNA vaccines promised the possibility of faster development, allowing researchers to stay more closely on the flu’s heels and switch out viral ingredients in as little as two or three months. That degree of flexibility also would have sped the response to the next flu pandemic.

In an email, Andrew Nixon, HHS’s deputy assistant secretary for media relations, disputed the characterization that the department’s new policies impede flu vaccination, writing, “Providers continue to offer flu vaccines, and insurance coverage remains unchanged. The recommendation supports shared clinical decision-making between patients and clinicians and does not prevent timely vaccination. People can continue to receive flu vaccines if they choose to do so.”

For the current season, much of the U.S.’s fate may already be sealed: Less than half of Americans have gotten a flu vaccine this season, while the virus continues to spread. “If you find yourself in a place where there are people sick with flu, you’re probably gonna get sick,” Scarpino said. That logic likely holds true for his own family, in Massachusetts, where flu activity has been high for weeks. They’ve so far made it through unscathed, but Scarpino said, “I feel like it’s a matter of time.”

Nearly a year ago, Health Secretary Robert F. Kennedy Jr. pledged during his confirmation hearing to support the CDC’s childhood-vaccine schedule. Yesterday, he broke that promise. The Department of Health and Human Services has announced that the CDC will drop several vaccines from its recommendation list. With that move, Kennedy has shown that he can change the vaccine schedule by fiat.

Kennedy appears to have a clear road if he wants to do more. The acting CDC director, Jim O’Neill, who signed off on the plan, took over the position when the previous director was fired after defying Kennedy. The new recommendations were written by two Kennedy allies at HHS; the agency’s vaccine advisory board, which Kennedy remade in his own image last year, appears to have been cut out of the process entirely. Following this map, Kennedy could dispense with any recommendation he dislikes, issue whatever advice strikes him and those he’s hired as wise, and thereby remake public-health policy in the United States in accordance with the anti-vaccine arguments he’s been advancing for 20 years.

The move didn’t come as a surprise. Kennedy spent much of 2025 undermining confidence in vaccines and sidelining public-health officials who disagreed with him. The overhaul of the pediatric schedule, which had been hinted at for weeks, followed a request last month by President Donald Trump to bring the United States in line with a set of other developed countries. The vaccines no longer universally recommended for children include those for hepatitis A, rotavirus, and the flu. Although these diseases can be serious—the CDC estimates that, before the vaccine, rotavirus killed dozens of children each year and sent hundreds of thousands to the hospital—they don’t represent the same level of threat as polio or measles, each of which killed thousands of children in the 1950s and each of which, at least for now, remains on the vaccine schedule. An HHS spokesperson, Andrew Nixon, told me that the new list “maintains strong protection against diseases that cause serious harm or provide clear community benefit, while aligning U.S. guidance with international norms.” Whatever the pros and cons of any particular vaccine, this move is still the strongest evidence yet that the health secretary intends to unilaterally impose his will on an agency he has repeatedly assailed as corrupt.

[Read: Rotavirus could come roaring back—very soon]

Public-health experts are generally aghast at the new policy. Although Mehmet Oz, the chief of Medicare and Medicaid, said yesterday that insurers will still cover the vaccines that are no longer recommended, the onus will now be on doctors and professional associations such as the American Academy of Pediatrics to make the case for the demoted immunizations to parents. Daniel Jernigan, the former director of the CDC’s National Center for Emerging and Zoonotic Infectious Diseases, told me that revoking the recommendations for some immunizations “sends a message that those vaccines are of uncertain value.” That could prove dangerous: Vaccination rates have already fallen in many states in recent years, and Jernigan fears that the schedule change might lead parents to forgo certain shots, or perhaps not immunize their children at all.

Nixon said that the new schedule “returns decision-making to families,” and Kennedy has said that it is intended to rebuild trust in public health. (In a survey last fall, just over half of Americans said they considered the vaccine schedule safe.) But it seems guaranteed to contribute to greater confusion. In the short term, if the schedule change leads to fewer vaccinations, children in America will suffer through more diarrhea from rotavirus, nausea from hepatitis A, and fevers from the flu. In the long term, it could lead to an increased burden on American hospitals.

The schedule overhaul was hailed as a “victory for American children” by the Informed Consent Action Network, which has paid millions of dollars to attorneys challenging vaccine mandates. Del Bigtree, who was the communications director for Kennedy’s presidential campaign and is the CEO of ICAN, told me that the change is “setting our children on the same path to health that Denmark enjoys,” because the new recommendations mostly mirror that country’s. (The comparison is fraught: Denmark is far smaller and less diverse than the United States, and it has universal health care.)

Yet Bigtree and other Kennedy allies see the announcement as merely a first step toward eliminating all of the government’s vaccine recommendations. Aaron Siri, a lawyer who has sued states over vaccine mandates and has been a close adviser to Kennedy, told me that he is especially keen on removing the polio vaccine from the recommended list, along with DTaP, which protects against diphtheria, tetanus, and whooping cough. Bigtree wants an end to vaccine mandates, which he sees as a violation of the Nuremberg Code, he said. (The CDC’s recommendations are not mandates but are frequently used to justify states’ school-admission requirements.) Mark Gorton—who is best known as the creator of the file-sharing service LimeWire and has since become an anti-vaccine activist and a co-president of the MAHA Institute, a pro-Kennedy think tank—told me he would like to go much further. He favors pulling all vaccines from the market until they’ve undergone additional testing. (Decades of evidence have shown that the vaccines on the U.S. pediatric schedule are safe.) “Politically, we’re not there yet,” he told me. But if Gorton has anything to do with it, that’s where the U.S. is heading. (Nixon declined to comment on whether Kennedy and HHS are actively considering any of these moves.)

If more American parents opt out of the inoculations that the federal government no longer recommends, harm will inevitably come to pass. The high fevers, hospitalizations, and dehydration that these illnesses cause can be painful and disruptive—and, with vaccines, avoidable. This year’s brutal flu season is a reminder of just how devastating even routine infections can be. But the illnesses that the U.S. is now officially less concerned about preventing are not usually deadly or permanently debilitating. The World Health Organization does not recommend universal vaccination against any of them, with the exception of hepatitis B. (As of last month, the U.S. recommends that vaccine only to children who are born to a mother who is positive for the virus or whose status is unknown.)

[Read: The vaccine guardrails are gone]

The same cannot be said of the immunizations the CDC still does recommend. If they were to be taken off the market or if more parents were to opt out of them, Americans could see diseases that most of us have never encountered—such as polio and diphtheria—return with a vengeance. The nation is getting a taste of that already with the resurgence of measles, which killed three people last year, and whooping cough, which claimed the lives of more than a dozen. Without the protection that widespread vaccination provides, the United States could become a nation that’s not only far sicker but also much less safe for children.

Since he was confirmed as Health and Human Services secretary early last year, Robert F. Kennedy Jr. has previewed big changes to the Dietary Guidelines for Americans—the government’s go-to guide on what to eat, and how much of it. Rewritten only every five years, the dietary guidelines are ubiquitous in American life: The food pyramid, launched in the 1990s, is a result of the document. The guidelines determine what millions of kids eat in school cafeterias every day.

Chief among those supposedly forthcoming changes that Kennedy has promised is a dramatic rethinking of how the United States deals with saturated fat. For decades, the dietary guidelines have recommended that people get no more than 10 percent of their daily calories from these fats because they increase bad cholesterol. But Kennedy is a saturated-fat evangelist. The HHS secretary, who has said that he follows a “carnivore diet,” once famously prepared a Thanksgiving turkey by submerging the raw bird in a vat of beef tallow.

Surprisingly, the new guidelines, which were released earlier today, retain the exact same recommendation about saturated fat that Kennedy seems to loathe. During a press briefing, he declared that the guidelines “end the war on saturated fat.” The guidelines do plug beef tallow as a “healthy fat” and say that Americans should get some of their protein from red meat. (The previous version says that a healthy diet includes “relatively lower consumption of red and processed meats.”) But all of that is hardly a dramatic change in how Americans should approach saturated fat.

What happened? Despite all of Kennedy’s bluster, the revisions appear to be built much more around incremental change than around any all-out war on established health wisdom. Kennedy and his staff appear to understand that an embrace of saturated fat is controversial. “It was our goal for this report to not be ‘activist’—and only make statements that are widely accepted by the latest nutritional research,” Andrew Nixon, a spokesperson for the Department of Health and Human Services, told me in a statement. “No universal nutrition recommendation will be agreed on by everyone in the nutrition research field—nor should it,” he added. Indeed, there is little evidence to back up the adoption of an explicitly pro-fat diet. The American Heart Association, for example, recommends that fewer than 6 percent of daily calories come from saturated fat.

Overall, the new dietary guidelines focus on a much less controversial take-home message than “Make Frying Oil Tallow Again.” The takeaway, as Kennedy put it in a post on X, is “EAT REAL FOOD.” One of the biggest points of emphasis is the importance of eating more protein—which is already a popular message among Americans. The release of the new guidelines was paired with a new inverted version of the food pyramid that’s meant to stress that a healthy diet consists mostly of fruits, vegetables, healthy fats, and proteins. Few people follow the dietary guidelines to a T, but those who actually want to keep their saturated-fat intake to a minimum while upping their protein consumption will have to look to leaner options, such as beans and tofu. The carnivore health secretary may have inadvertently encouraged more Americans to embrace the “soy boy” lifestyle.

For the first time, the guidelines explicitly recommend against eating ultra-processed foods, which they refer to as “highly processed foods.” An accompanying scientific report that was released today notes that processed foods have been “consistently associated with increased risk” for a number of conditions, including type 2 diabetes and obesity. And the guidelines also take a particularly strong stance against added sugar. Both the 2020 and 2025 guidelines stress the need to limit added sugar, but the new guidelines add that “no amount of added sugars or non-nutritive sweeteners is recommended or considered part of a healthy or nutritious diet.”

These new suggestions come with some controversy. Ultra-processed foods constitute a broad and amorphous category, as I’ve previously written, and whether recommending that people stay away from all ultra-processed foods is feasible or desirable remains to be seen. But the guidelines are largely being well received by major health and nutrition groups. The American Medical Association said in a statement that the guidelines “offer clear direction patients and physicians can use to improve health.”

[Read: Coke, Twinkies, Skittles, and … whole-grain bread?]

The average American won’t necessarily change what they eat because of these new recommendations, but the guidelines do play a central role in determining what food can be served by a number of programs that provide food to millions of Americans. The lunch menus offered in schools, for example, legally must be “consistent with the goals” of the dietary guidelines. Kennedy has pledged to remove processed food from schools, and these new guidelines could pave the way for this to happen by giving the Trump administration justification for such a ban. That said, doing so would be hard, given the ubiquity of these foods and the limited resources that schools have to make all of their meals from scratch. Any such change would require formal regulation and could take years to finalize, and it’s not actually Kennedy’s call: The rules for school lunches are set by Agriculture Secretary Brooke Rollins.

Regardless of the challenges ahead, the release of the guidelines is a milestone in Kennedy’s tenure as HHS secretary, and it’s indicative of the way that he’s approached food regulation generally. Kennedy came into office pledging radical reform, yet he has seemed content with small steps. Before taking office, for example, he implied that he thought a number of chemical additives in foods should be banned. But so far, he’s settled for companies’ promises that they will voluntarily phase out certain synthetic dyes over the course of several years.

The strategy marks a sharp departure from Kennedy’s willingness to impose his own beliefs on another major priority area: vaccines. On that front, Kennedy initially pledged to be a moderate, although he has been anything but. Just this week, the CDC removed six shots from its list of recommended childhood vaccines—after Kennedy told Congress during his confirmation hearing that he would “support the CDC schedule.” In other words, America’s health policy seems like it’s being led by two health secretaries with very different approaches.

Chief among the burdens weighing upon the weary sports parent—worse than the endless commutes, the exorbitant fees, the obnoxious parents on the other team—is the sense that your every decision has the power to make or break your child’s future. Should your 11-year-old show up to her elementary-school holiday concert, even if it means missing a practice with the elite soccer team to which you’ve pledged 100 percent attendance? What if this turns out to be the fork in the road that consigns her to the athletic scrap heap?

These are heavy decisions—at least they are for me, a soccer dad who happens to have spent years writing about the science of athletic success. Making it to the pros, the conventional wisdom says, is a consequence of talent and hard work. Best-selling books have bickered over the precise ratio—whether, say, 10,000 hours of practice trumps having the so-called sports gene. But the bottom line is that you need a sufficient combination of both. If you’re talented enough and do the work, you’ll make it. If not—well, decisions (and holiday concerts) have consequences.

Rationally, stressing out over missing a single practice is ridiculous. Believing that it matters, though, can be strangely reassuring, because of the suggestion that the future is under your control. Forecasting athletic careers is an imperfect science: Not every top draft pick pans out; not every star was a top draft pick. Unexpected injuries aside, the imprecision of our predictions is usually seen as a measurement problem. If we could only figure out which factors mattered most—how to quantify talent, which types of practice best develop it—we would be able to plot athletic trajectories with confidence.

Unless, of course, this tidy relationship between cause and effect is an illusion. What if the real prerequisite for athletic stardom is that you have to get lucky?

Joseph Baker, a scientist at the University of Toronto’s Sport Insight Lab, thinks that the way talent development is usually framed leaves out this crucial ingredient. Baker is a prominent figure in the academic world of “optimal human development” who moonlights as a consultant for organizations such as the Texas Rangers. He’s also a longtime skeptic of the usual stories we tell ourselves about athletic talent. The most prominent is that early performance is the best predictor of later performance. In reality, many cases of early success just mean that an athlete was born in the first months of the year, went through puberty at a young age, or had rich and highly enthusiastic parents.

This critique of talent is not entirely new. It’s been almost two decades since Malcolm Gladwell’s Outliers spurred a cohort of hyper-ambitious soon-to-be parents to begin plotting January birth dates (or at least to tell people they were considering it). Over time, the debate about what factors actually matter has devolved into a game of whack-a-mole. If physical development isn’t the best predictor of long-term success, then it must be reaction time, or visual acuity, or hours of deliberate practice. The default assumption is that there must be something that reveals the presence of future athletic greatness.

Baker’s perspective changed, he told me, when he read Success and Luck, a 2016 book by the former Cornell University economics professor Robert H. Frank. Frank describes a hypothetical sports tournament whose outcome depends 49 percent on talent, 49 percent on effort, and 2 percent on luck. In mathematical simulations where as many as 100,000 competitors are randomly assigned values for each of these traits, it turns out that the winner is rarely the person with the highest combination of talent and effort. Instead, it will be someone who ranks relatively highly on those measures and also gets lucky.
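
Frank’s thought experiment is easy to try at home. The sketch below is a minimal reconstruction of the kind of simulation he describes, not his published model: I assume talent, effort, and luck are each drawn uniformly at random, weight them 49/49/2, and check how often the winner is also the competitor with the best combination of talent and effort.

```python
import random

def merit_wins(n_competitors=100_000, luck_weight=0.02, seed=None):
    """One tournament in the spirit of Frank's example: each score is
    49% talent, 49% effort, and 2% luck. Returns True if the winner is
    also the competitor with the highest talent-plus-effort."""
    rng = random.Random(seed)
    skill_weight = (1 - luck_weight) / 2  # 0.49 apiece for talent and effort
    field = [(rng.random(), rng.random(), rng.random()) for _ in range(n_competitors)]
    winner = max(field, key=lambda c: skill_weight * (c[0] + c[1]) + luck_weight * c[2])
    most_deserving = max(field, key=lambda c: c[0] + c[1])
    return winner is most_deserving

trials = 20
print(sum(merit_wins(seed=i) for i in range(trials)), "of", trials,
      "tournaments won by the most talented and hardest-working competitor")
```

With 100,000 entrants, the leader’s edge in talent and effort over the runners-up is tiny, so even a 2 percent luck term is usually enough to hand the title to someone else.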

This turns out to be something like a law of nature: It has been replicated and extended by others since Frank’s book came out. Among the most influential models is “Talent Versus Luck,” created by the Italian theoretical physicist Andrea Rapisarda and his colleagues, which simulates career trajectories over dozens of years and reaches the same conclusion. This model earned a 2022 Ig Nobel Prize “for explaining, mathematically, why success most often goes not to the most talented people, but instead to the luckiest.”

To Baker, these models suggest that it’s not just hard to reliably predict athletic futures; it’s impossible. He cites examples including a youth-soccer player for Northampton Town who missed a text message from the team’s manager telling him that he’d been dropped from the roster for an upcoming game. He showed up for the bus, went along for the ride, subbed in when another player got injured, impressed the manager, earned a spot for the rest of the season, and went on to play in the Premier League. Luck takes many forms, such as genetics, family resources, and what sports happen to be popular at a given place at a given time. But sometimes, it’s simply random chance: a gust of wind or an errant bounce or a missed text.

It’s easy to see how luck shapes individual moments in sport—how it changes the course of a game, a series, even an entire season. But what’s harder to accept is that luck might also play a role in longer arcs—not just what happens in games but who appears on the court in the first place. The more you reckon with this, the more disorienting it can be, as things start to feel ever more arbitrary and unfair. As Michael Mauboussin, an investor who writes about luck in his 2012 book, The Success Equation, put it to me: “Talking about luck really quickly spills into the philosophical stuff.”

You might think that the growing professionalization of youth sports offers an escape from this randomness—that by driving to this many practices and paying for that many coaches, you’re ensuring the cream will rise to the top. But the opposite is actually true, according to Mauboussin. In The Success Equation, he describes what he calls the “paradox of skill.” Now that every soccer hopeful is exhaustively trained from a young age, an army of relatively homogeneous talent is vying for the same prizes. “Everyone’s so good that luck becomes more important in determining outcomes,” Mauboussin said.
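
Mauboussin’s paradox can also be sketched with a toy simulation (mine, not his): pit random pairs of players against each other, give every match the same fixed dose of random noise, and watch what happens as the spread in skill shrinks.

```python
import random

def upset_rate(skill_spread, luck_spread=0.1, n_matches=100_000, seed=0):
    """Fraction of head-to-head matches won by the *less* skilled player,
    when each player's result is skill plus a random 'luck' term."""
    rng = random.Random(seed)
    upsets = 0
    for _ in range(n_matches):
        skill_a, skill_b = rng.gauss(0, skill_spread), rng.gauss(0, skill_spread)
        result_a = skill_a + rng.gauss(0, luck_spread)
        result_b = skill_b + rng.gauss(0, luck_spread)
        if (result_a > result_b) != (skill_a > skill_b):
            upsets += 1
    return upsets / n_matches

print(f"Wide skill gaps (a casual field):   {upset_rate(skill_spread=1.0):.0%} of matches go to the weaker player")
print(f"Tightly clustered skill (the pros): {upset_rate(skill_spread=0.1):.0%} of matches go to the weaker player")
```

Nothing about the luck term changes between the two runs; only the skill gap does. When everyone is nearly as good as everyone else, the same amount of randomness flips far more outcomes.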

Baker and one of his colleagues at the University of Toronto, Kathryn Johnston, recently published a paper on the role of luck in athletic development in the journal Sports Medicine–Open. I felt a curious sense of relief when I read it. My daughters, who are 9 and 11, both play competitive soccer on teams requiring a level of commitment that I had naively thought went out of style with the fall of the Soviet Union. Seeing the evidence that future athletic success is not entirely predictable felt like a license for parents to loosen up a bit—to choose the holiday concert over the soccer practice without worrying about the long-term ramifications.

Linda Flanagan, the author of the 2022 book Take Back the Game and a frequent critic of today’s youth-sports culture, doesn’t share my optimism. She has no trouble believing that luck is involved with athletic success, but she doesn’t think that acknowledging this fact will change parental behavior. “Hell, they might double down on the investment in time and money, thinking that they need to give their child more chances to get lucky and impress the right coach,” she told me.

But that sort of luck—getting a job on your hundredth interview because the interviewer went to the same high school as you did, say—arguably is more about hustle than it is about serendipity. So is showing up to every soccer practice. Mauboussin’s definition of luck is narrower: It’s the factors you can’t control. No matter how much luck you try to “create” for yourself or your kids, some irreducible randomness might still make or break you.

To Baker, the takeaways from recognizing the role of luck are less about individual parents and more about how sports are organized. His advice to teams and governing bodies: “If there’s any way possible for you to avoid a selection, don’t select.” Keep as many athletes as you can in the system for as long as you can, and don’t allocate all of your resources to a chosen (and presumably lucky) few. When real-world constraints eventually and inevitably do require you to select—when you’re anointing these lucky few as your future stars, and casting out those who perhaps sang in one too many holiday concerts—try to leave the door open for future decisions and revisions. After all, Baker says, no matter how carefully you’ve weighed your predictions, “you’re probably wrong.”

It may not even matter if the doctor’s kindness is sincere, as long as it sounds that way to patients. Dave Hatem, an internist and professor emeritus at UMass who has helped oversee the school’s SP curriculum, told me that even just the act of trying to say empathetic things is valuable for students. “If you get the right words to come out of your mouth, and you do it often enough, then you get to the point where you really mean it,” he said.

Most of the medical students who examine me do seem genuine in their concern. I suspect that if it were up to them, they’d practice medicine this way for the whole of their careers. But however much they might want to provide the superb treatment that I experience as a standardized patient, the health-care system won’t let them.


Elaine Thompson is a recent graduate of Emory University’s medical school, where she learned to provide the same sort of long, thoughtful, whole-person interactions that I get from students. For the past three years, she has been an ear, nose, and throat resident at Johns Hopkins Medicine, one of the best medical centers in the world. Her real-life patient encounters now last for an average of 10 minutes.

“You quickly learn as a resident that the job is to move things along,” Thompson told me. “I’m still curious about my patients as people and want to learn about their families, but if it’s not relevant to their current problem, then asking about it opens a door that will add time to the visit.” So much for chatting to put them at ease, soliciting a full narrative of their symptoms, hearing all their concerns, asking about their job, uncovering anxieties, addressing financial and social challenges, and encouraging their questions. (In an emailed statement, a spokesperson for Johns Hopkins Medicine said that it is committed to delivering “patient-centered training” and “whole person care.”)

[Read: Learning empathy from the dead]

The same is true for Emily Chin, who received her medical degree from UMass in 2023 and is now an ob-gyn resident at UC San Francisco. She told me that she got the message about keeping visits short early on from senior residents, who made a point of tracking the length of her encounters. “I’d just have time to check the cervix, do a quick ultrasound, and then make a decision about admitting or discharging the patient,” she said. Another source of pressure is the knowledge that spending any extra time with a patient means that dozens of other patients will be waiting longer to be seen: “You see the patients piling up in the waiting room, and you see the schedule screen going red.” (UCSF’s vice dean for education, Karen Hauer, did not object to this characterization, but noted that the school advises its residents on how to establish patient rapport when time is short.)

Residents also learn that time is money. Hospitals and practices view a doctor’s interactions with a patient in terms of “relative value units.” Reimbursement for seeing a patient whose high cholesterol leads to a prescription for a statin might bring $60 into the hospital or clinic. Reimbursement for extra time spent discussing the patient’s fears of side effects and concerns about affording the drug’s co-pay or making dietary changes brings in $0. “That doesn’t exactly encourage providing the most empathetic, patient-centered care,” a UMass Memorial Health resident named Hans Erickson told me.

The residents I spoke with worried that these time pressures were only going to get worse when they finished residency and became full-fledged doctors. In light of those constraints, does it still make sense to emphasize highly empathetic care for students? I asked that question of Melissa Fischer, the physician who directs the SP program and other simulation training at UMass. Fischer argues that the lessons we impart to students can survive the crush of residency, even if they have to be applied in abbreviated ways. “That interest in building connections to patients stays,” she said. “They just have to find faster ways to build them.”

[Read: How to teach doctors empathy]

Lisa Howley, an educational psychologist who serves as the senior director for transforming medical education at the Association of American Medical Colleges, told me that training up a generation of more empathetic medical students will make the health-care system better. “We think of young medical learners as agents of potential change,” she told me. “They’ll see the gaps and weaknesses, and they’ll look for ways to make improvements.” Besides, what would be the benefit of forcing medical students to learn about patient encounters in the hectic, abbreviated format they’ll confront as residents? “It doesn’t make sense to apply those pressures early in their education,” she said. After all, we don’t teach student pilots how to fly a plane while trying to make up for time lost to flight delays or dealing with unruly passengers.

All of the residents I spoke with said they look for ways to connect with patients despite the harsh realities of the system. “The desire to get to know the patient as a whole person doesn’t go away; it’s just a matter of finding ways to bring it to the surface as a stressed resident,” Erickson said. Chin put it this way: “It’s not that it’s challenging to keep up empathy, it’s that it’s hard to be empathetic all the time.”

At the end of my fake encounters, I try to be encouraging. I tell the students how I, as a patient, felt treated by them, and then I challenge them to give ideas for how they might improve. Sometimes, when one of them has done a bang-up job of making me feel heard, I tell them that I hope they’ll be able to sustain that level of engagement when they’re a practicing doctor—and I always get the sense that the students hope so too.


This article originally described “relative value units” as “revenue value units.”

Eat More Deer

Updated at 5:14 p.m. ET on January 9, 2026

The deer were out there. The crisp tracks in the snow made that clear. Three hours into our hunt through the frigid New Hampshire woods, Ryan Calsbeek, a rangy 51-year-old biology professor at Dartmouth, guessed that 200 animals were hiding in the trees around us. Calsbeek and I were 20 feet up a pignut hickory, crouching on a creaky platform. His friend Max Overstrom-Coleman, a stocky 46-year-old bar owner from Vermont, had climbed a distant tree and strung himself up by a harness, readying his compound bow and swaying in the wind. Shivering in camo jackets and neon-orange beanies, we peered into the darkening forest, daring it to move.

I had joined Calsbeek’s December hunt to try to get my hands on high-quality red meat. Calsbeek had yet to kill a deer that season, but in previous years, he told me, a single animal kept his family of four well fed through the winter. His young daughters especially liked to eat deer heart; apparently, it’s marvelously rich and tender. My mouth watered at the thought. The last time I’d tasted venison was more than a decade ago at a fancy restaurant in Toronto, where it was served as carpaccio, drizzled in oil and so fresh that it may as well have pranced out of the woods and onto my plate.

A bounty of such succulent, free-range meat is currently running through America’s backyards. The continental United States is home to some 30 million white-tailed deer, and in many areas, their numbers are growing too rapidly for comfort. Each year, a white-tailed doe can give birth to as many as three fawns, which can themselves reproduce as soon as six months later.

Wherever deer are overabundant, they are at best a nuisance and at worst a plague. They trample gardens, destroy farmland, carry ticks that spread Lyme disease, and disrupt forest ecosystems, allowing invasive species to spread. They are involved in tens of thousands of car crashes each year in New York and New Jersey, where state wildlife departments have encouraged hunters to harvest more deer. In especially populated regions, wildlife agencies hire sharpshooters to cull the animals. Last year, New Hampshire legislators expanded the deer-hunting season in an attempt to keep the population under control. By the looks of the forest floor, which was pitted with hoof marks and scattered with marble-shaped droppings, that effort was falling short.

Over the past decade, some states have proposed a simple, if controversial, strategy for bringing deer under control: Couldn’t people like me—who don’t hunt but aren’t opposed to it—eat more venison?

Venison may not be a staple of American cuisine, but it has a place in many people’s diets. Health influencers laud it as a lean, low-calorie, nutrient-dense source of protein. Venison jerky sticks are sold at big-box stores and advertised as snacks for people on Whole30 and keto diets. Higher-end grocery stores, such as Wegmans and Whole Foods, sell ground venison for upwards of $12 a pound, roughly twice the cost of ground beef.

Part of the reason venison is so expensive is that most of it is not homegrown: The bulk of it comes from New Zealand, which has sent more than 5 million pounds of the stuff to the U.S. every year since 2020. Beef, the dominant red meat in the States, has historically been more affordable. But beef prices jumped nearly 15 percent in 2025, and the conventional kind sold in most supermarkets comes from cattle raised in abysmal conditions. If high-quality venison were cheaper and more widely available, it could be an appetizing alternative.

In recent years, a few deer-swamped states, including New Jersey and Maryland, have tried to legalize the sale of hunted venison, which would deliver two key benefits: more deer out of the ecosystem and more venison on people’s plates. Despite the sport’s association with trophies, many deer hunters are motivated by the prospect of obtaining meat, and they can only consume so much. “It’s for your own table,” Overstrom-Coleman said as he fixed climbing sticks onto a tree to form a makeshift ladder. He had already stocked his freezer full of venison this season (“That son of a bitch,” Calsbeek whispered, once we’d left our companion in his tree) and planned, as many hunters do, to donate any excess meat to a food bank.

Hunting is waning in popularity, in part because younger people are less keen on participating than older generations. Efforts to bring in more hunters, such as programs to train women and youth in outdoor skills, are under way in many states. Women are the fastest-growing demographic, and they participate largely to acquire food, Moira Tidball, the executive director at the Cornell Cooperative Extension who leads hunting classes for women, told me. Still, interest is not growing fast enough for the subsistence-and-donation system to keep deer numbers in check.  

[Read: America needs hunting more than it knows]

It’s hard to imagine a better incentive for deer hunting than allowing hunters to sell their venison to stores and restaurants. But the idea is antithetical to a core tenet of American conservation. For more than 100 years, the country’s wild game has flourished under the protection of hunters and their allies, steadfast in their belief that the nation’s animals are not for sale.

The last time this many white-tailed deer roamed America’s woodlands, the country didn’t yet exist. To the English colonists who arrived in the New World, the deer bounding merrily through the forests may as well have been leaping bags of cash. Back home, deer belonged to the Crown, and as such, could be hunted only by the privileged few, Keith Tidball, a hunter and an environmental anthropologist at Cornell (and Moira’s spouse), told me. In the colonies, they were free for the taking.

Colonists founded a robust trans-Atlantic trade in deer hide, a particularly popular leather for work boots and breeches, and that trade drastically reduced the deer population. In Walden, Henry David Thoreau mentions a man who preserved the horns “of the last deer that was killed in this vicinity.” The animals were already close to disappearing from many areas at the beginning of what ecologists have called the “exploitation era” of white-tailed deer, starting in the mid-19th century. Fifty years later, America was home to roughly half a million deer, down 99 percent from precolonial days.

The commerce-driven decimation of the nation’s wildlife—not just deer but birds, elk, bears, and many other animals—unsettled many Americans, especially hunters. In 1900, Representative John Lacey of Iowa, a hunter and close friend of Theodore Roosevelt’s, introduced a bill to ban the trafficking of America’s wildlife. (As Roosevelt, who notoriously hunted to collect trophies, wrote in 1913, “If there is to be any shooting there must be something to shoot.”) The Lacey Act remains one of the most consequential federal conservation laws on the books today.

[From the May 1906 issue: Camping with President Theodore Roosevelt]

The law is partly contingent on state policies, which make exceptions for certain species. Hunters in most states, for example, can legally harvest and sell the pelts of fur-bearing species such as otters, raccoons, and coyotes. But attempts to carve out similar exceptions for hunted venison, including the bills in Maryland and New Jersey, have failed. In 2022, the Mississippi attorney general published a statement that opened up the possibility of legalizing the sale of hunted deer, provoking fierce opposition from hunters and conservationists; today, the option remains open but has not led to any policy changes. Last year, an Indiana state representative introduced a bill that would allow the sale of hunted venison, but so far it has gone nowhere.  

The practical reason such proposals keep failing is that allowing the sale of hunted meat would require huge investments in infrastructure. Systems to process meat according to state and federal laws would have to be developed, as would rapid testing for chronic wasting disease, an illness akin to mad cow that could, theoretically, spread to humans who eat infected meat, though no cases have ever been reported. Such systems could, of course, be implemented. Hunted deer is sold in some common grocery stores in the United Kingdom, such as Waitrose and Aldi. (Notably, chronic wasting disease is not a concern there.)

[Read: Deer are beta-testing a nightmare disease]

Although the sheer abundance of deer makes them easy to imagine as steaks on legs, several experts cautioned that some people’s affection for the animals runs deep. Deer are cute; they’re docile; they’re Bambi. David Drake, a forestry and wildlife professor at the University of Wisconsin at Madison, likens them to America’s “sacred cow.” As Drake and a colleague have outlined in a paper proposing a model for commercialized venison hunting in the U.S., any modern system would be fundamentally different from the colonial-era approach because it would be regulated, mostly by state wildlife agencies. But powerful coalitions of hunters and conservationists remain both faithful to the notion that wild game shouldn’t be sold and fearful that history will repeat itself. As the Congressional Sportsmen’s Foundation, a national hunting association, puts it, “Any effort to recreate markets for game species represents a significant threat to the future of our nation’s sportsmen-led conservation efforts.” Some of the fiercest pushback to the New Jersey law, Drake told me, came from the state wildlife agency.

The only U.S. state with a deer-related exception to the Lacey Act is Vermont. During the open deer-hunting season (which spans roughly from fall to winter in the Northeast) and for 20 days afterward, Vermonters can legally sell any meat that they harvest. This policy was introduced in 1961, and yet, “I am not aware of anyone who actually takes advantage of it,” Nick Fortin, a wildlife biologist at Vermont’s Fish and Wildlife Department, told me. He added that the department, which manages the exasperated homeowners and destabilized forests that deer leave in their path, has been discussing how to raise awareness about the law.

Even after I explained the 1961 law to several Vermont hunters, they were hesitant to sell me any meat. Hunted meat is meant to be shared freely, or at most bartered for other items or goodwill, Greg Boglioli, a Vermont hunter and store owner, told me. I met Boglioli at the rural home of his friend Fred Waite, a lifelong hunter whose front room alone was decorated with 20 deer heads. I had hoped to buy venison from Waite, but he insisted on sharing it for free. After all, he had plenty. His pantry was crammed with mason jars of stewed venison in liver-colored brine. On a table in the living room was the scarlet torso of a deer that his son had accidentally hit with his truck the other day, half-thawed and waiting to be cooked.

During our hunt, I found Overstrom-Coleman to be more open to the idea of selling the venison he hunted. “I guess that would be a pretty excellent way to share it,” he said. Earlier in the season, he’d killed a deer in Vermont, and he was willing to sell me some of the meat the next day. At least, I thought as I stared into the motionless woods, I’d be going home with something.

[From the July/August 2005 issue: Masters of the hunt]

By the time the sun went down, the only deer I’d seen was a teetering doe in a video that Overstrom-Coleman had taken from his tree and sent to Calsbeek. “Too small to kill,” he texted; he’d meet us in the parking lot. The air was glacial as Calsbeek and I trudged empty-handed toward the trailhead, hoofprints glinting mockingly in the light of our headlamps. From the trunk of the car, we took a consolation swig of Wild Turkey from a frosted bottle, and Overstrom-Coleman reminded me to visit the next day.

I found his chest freezer stuffed with paper-wrapped packages stamped with Deer 2025. He handed me three and refused to let me pay. Back home a few days later, I used one to make meatballs. Their sheer depth of flavor—earthy and robust, with a hint of nuttiness—made me wonder why I bothered to eat farmed meat at all.


This article originally misidentified Max Overstrom-Coleman’s hunting weapon.