What the Black Death Can Teach Us About Childhood Malnutrition and Adult Health

The Black Death killed millions across Europe in the mid-14th century—but new research suggests that some people who experienced malnutrition early in life may have been more likely to survive the initial wave of the disease. A new study published in Science Advances reveals that surviving childhood hunger may have temporarily improved resilience against plague, though with health costs later in life.

The research, led by biological anthropologist Sharon DeWitte of the University of Colorado Boulder, examined nearly 275 individuals buried in English cemeteries before, during, and after the Black Death. DeWitte’s team focused on chemical signatures left in the dentine of teeth—specifically carbon and nitrogen isotopes—which can reveal evidence of nutritional stress during childhood.

Their findings suggest that individuals who experienced starvation or significant nutritional stress early in life were more likely to survive plague outbreaks through their 20s. However, the benefits may not have lasted. These same individuals appear to have suffered worse health outcomes as they entered middle and old age.

“What this might indicate is that if people experienced a period of starvation early in their childhoods or adolescence but survived, that could have shaped their development in ways that were beneficial in the short term but led to poor outcomes once they got older,” said DeWitte.

The East Smithfield plague pits, which were used for mass burials in 1348 and 1349. Image courtesy Museum of London Archaeology (MOLA)

The study draws on teeth from several medieval cemeteries in London and Lincolnshire, including the East Smithfield Black Death Cemetery in London. Created in 1348 during the height of the plague, East Smithfield is one of the few sites in Europe that can be definitively linked to a Black Death mass burial event. The cemetery provides a rare opportunity to study victims of a specific historical epidemic.

Researchers identified nutritional stress using a distinct isotopic pattern: a rise in δ¹⁵N (nitrogen) paired with stable or decreasing δ¹³C (carbon), a signal known as “opposing covariance.” This indicates the body was breaking down its own fat and muscle to survive—a clear biomarker of famine. Many of these signatures appeared during childhood and even infancy.
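To illustrate what "opposing covariance" means in practice, the short Python sketch below flags dentine increments where δ¹⁵N rises while δ¹³C holds steady or falls. This is a simplified illustration, not the study's actual analysis pipeline; the sample values, the thresholds, and the detect_opposing_covariance function are hypothetical.

```python
# Illustrative sketch only: flags dentine increments where d15N rises while
# d13C stays flat or falls ("opposing covariance"), the pattern the study
# links to nutritional stress. Values and thresholds here are hypothetical.

def detect_opposing_covariance(d15n, d13c, n_threshold=0.3, c_threshold=0.0):
    """Return indices of increments with a d15N rise and stable/falling d13C.

    d15n, d13c: per-increment isotope values (per mil), oldest first.
    n_threshold: minimum d15N increase counted as a rise (assumed value).
    c_threshold: maximum d13C change allowed (stable or decreasing).
    """
    flagged = []
    for i in range(1, len(d15n)):
        n_change = d15n[i] - d15n[i - 1]
        c_change = d13c[i] - d13c[i - 1]
        if n_change >= n_threshold and c_change <= c_threshold:
            flagged.append(i)
    return flagged

# Hypothetical dentine increment profile for one individual
d15n = [9.8, 9.9, 10.6, 11.2, 10.4]         # nitrogen isotope values
d13c = [-19.2, -19.2, -19.5, -19.7, -19.3]  # carbon isotope values

print(detect_opposing_covariance(d15n, d13c))  # e.g. [2, 3]
```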

“People who experienced nutritional stress as children may have had a mismatch with their environments later in life,” DeWitte explained. “If there’s now a resource abundance, but their bodies were shaped for an environment of scarcity, they may have poor health outcomes, like packing too many fat stores, which can lead to cardiovascular disease.”

The study also examined a skeletal condition known as periosteal new bone formation (PNBF), which often indicates chronic inflammation. Individuals who experienced childhood famine were significantly more likely to exhibit this condition later in life. That suggests early malnutrition may have led to long-term immune system dysregulation, making people more vulnerable to disease decades later.

This idea aligns with several modern biological theories, including the Predictive Adaptive Response (PAR) model, which proposes that developmental changes help children survive harsh conditions—but those adaptations can become liabilities if the environment later changes. Another relevant theory is allostatic load, the idea that chronic or repeated stress damages the body over time, leading to higher mortality and disease risk.

Importantly, the study found that rates of childhood nutritional stress peaked before the Black Death, particularly in the early 13th century, and declined afterwards, likely due to improved diets as population pressure fell. This matches previous research showing that health declined in London in the decades before 1348 and improved afterwards.

This paper was a long time coming – includes data gathered over many years via support from multiple #NSF and #WennerGren grants.
www.science.org/doi/10.1126/…

— Sharon DeWitte (@sharondewitte.bsky.social) Jul 30, 2025 at 5:13 PM

DeWitte’s broader research agenda seeks to use bioarchaeological evidence from the past to better understand health inequalities today. She emphasises that even during catastrophes like the Black Death, mortality was not random.

“Mortality varied during a catastrophe 700 years ago in ways that might have been preventable,” she noted. “My hope is that we can absorb that lesson and think about how human health can vary across different social categories today, and figure out the points of intervention where we can do something to reduce that burden.”

As medieval cemeteries continue to yield clues about the lives—and deaths—of those who lived through the Black Death, researchers are not only reconstructing the past but also tracing the deep roots of health disparities that persist to this day.

The article, “Childhood nutritional stress and later-life health outcomes in medieval England: Evidence from incremental dentine analysis,” by Sharon N. DeWitte, Julia Beaumont, Brittany S. Walter, Jacqueline R. Towers and Emily J. Brennan, appears in Science Advances. Click here to read it.

Top Image: Miniature from a folio of the Antiquitates Flandriae, depicting the citizens of Tournai, Belgium burying those who died of plague during the Black Death. By Pierart dou Tielt (active c. 1340-1360), made c. 1353. (Royal Library of Belgium)