Thursday, April 16, 2026

Ancient charcoal sheds new light on how early humans fueled their lives

 


Peer-Reviewed Publication

The Hebrew University of Jerusalem

A general view of the excavation of the Gesher Benot Ya’aqov Acheulian site. Credit: GBY Expedition

A new study shows that early humans living about 800,000 years ago depended on fire in smart, practical ways. Instead of searching for the “best” wood, they took advantage of what nature provided, mainly driftwood collected along the lakeshore. This reliable fuel supply helped them keep fires going for cooking and daily life, and may even explain why they kept coming back to the same spot. In other words, they weren’t just choosing a place to live; they were choosing a place where fire was easy to maintain.

Link to pictures: https://drive.google.com/drive/folders/1uovu8ot6YH_ky7XFmBgDmiKDddOZGXqS?usp=drive_link

Nearly 800,000 years ago, early humans gathered along the shores of a lush lake in what is now northern Israel. Here, they returned again and again, hunting large animals, cooking fish over controlled fires, and organizing their daily lives around hearths. Now, a new study shows that even the wood fueling those fires, which is preserved as rare fragments of charcoal, can reveal how carefully these ancient communities understood and used their environment.

Published in Quaternary Science Reviews, the study offers a vivid reconstruction of life at the Acheulian site of Gesher Benot Ya'aqov (GBY). By examining an exceptionally rich and rare collection of ancient charcoal, an international team of researchers from Israel, Spain, and Germany, including Prof. Naama Goren-Inbar (Hebrew University), Prof. Nira Alperson-Afil and Dr. Yoel Melamed (Bar-Ilan University), Prof. Ethel Allué (Universitat Rovira i Virgili and Institut Català de Paleoecologia), and Prof. Brigitte Urban (Leuphana University), has uncovered new evidence of how early hominins gathered and used firewood, revealing behavior far more sophisticated than previously assumed.

Charcoal rarely survives at such early prehistoric sites, making this unusually large assemblage a unique window into the daily practices of early fire users. While many ancient sites preserve only fragmentary or ambiguous traces of burning, GBY provides a remarkably detailed record of repeated fire use over tens of thousands of years.

GBY preserves a layered history of human occupation along the shores of paleo–Lake Hula, with more than 20 archaeological horizons documenting generations of Acheulian hunter-gatherers returning to the same location. Excavations led by Prof. Naama Goren-Inbar of the Hebrew University of Jerusalem have revealed a dynamic landscape of activity: stone tools crafted from flint, limestone, and basalt; the remains of hunted animals; and a wide array of plant foods, including fruits, nuts, and seeds gathered from the lakeshore.

One particularly striking layer captures a dramatic moment in time. Alongside stone tools and plant remains, researchers uncovered the skull and bones of a straight-tusked elephant, evidence of large-scale hunting and butchery. The spatial arrangement of the remains suggests that the animal was processed on-site.

At the heart of this ancient camp life was fire. First identified at GBY by Prof. Nira Alperson-Afil of Bar-Ilan University, fire use at the site was habitual. It structured how space was organized, anchoring activities such as tool production, food preparation, and social interaction.

The new study focuses on a single occupation layer dated to approximately 780,000 years ago. Researchers analyzed 266 charcoal fragments, using microscopic techniques to identify the internal structure of the wood and determine its botanical origin. The results revealed a surprisingly diverse mix of plant species, including ash, willow, grapevine, oleander, olive, oak, pistachio, and even pomegranate, which is the earliest known evidence of this fruit tree in the Levant.

Unexpectedly, the charcoal assemblage showed greater plant diversity than other botanical remains from the site, such as seeds, fruits, or unburned wood. This suggests that firewood collection captured a broader cross-section of the surrounding environment than other forms of plant use.

Together, these species paint a vivid picture of the ancient landscape: a mosaic of wet lakeshore vegetation and open Mediterranean woodland. But more importantly, they reveal how early humans interacted with that landscape.

Rather than selectively gathering specific types of wood, GBY hominins appear to have relied primarily on driftwood naturally accumulating along the lakeshore. Fallen branches and logs, carried by water and deposited along the shore, would have created a readily available fuel supply. The composition of the charcoal closely mirrors the wood available in this environment, suggesting a practical and efficient strategy, using what the landscape provides.

This insight points to a broader conclusion: access to firewood may have been a decisive factor in where these early humans chose to live. The lakeshore offered not only fresh water, edible plants, animals, and raw materials for tools, but also a constant supply of fuel, essential for maintaining fire.

Even more striking is how fire was used. Spatial analysis shows that dense clusters of charcoal overlap with concentrations of fish remains, primarily the distinctive teeth of large carp. This co-occurrence adds compelling evidence that fish were being cooked at the site nearly 800,000 years ago, likely using carefully controlled fire.

These findings reinforce the idea that GBY hominins possessed advanced cognitive abilities. They were capable of controlling fire, organizing space around it, and integrating it into complex subsistence strategies. Yet interestingly, while hunting and tool-making required elaborate planning, firewood collection itself appears to have been a more routine activity, based largely on availability rather than careful selection of specific tree species.

Together, these behaviors paint a picture of a community that was both highly skilled and deeply attuned to its environment, returning repeatedly to a place that offered everything they needed to survive and thrive.

The GBY charcoal assemblage provides a unique dataset for examining the intersection of fire use, environmental context, and hominin behavior. These findings refine current models of early fire-related practices and emphasize the importance of local resource availability in shaping patterns of occupation and subsistence during the Middle Pleistocene.

Massive ancient-DNA study reveals natural selection has accelerated in recent human evolution

 



Hundreds of genes selected in West Eurasia since farming began, many linked to health

Peer-Reviewed Publication

Harvard Medical School

At a glance:

  • Applying new analytic methods to nearly 16,000 ancient genomes reveals natural selection has acted on hundreds, not dozens, of genes in West Eurasia over the last 10,000 years.
  • More than half of the genes have known links to disease risk and other traits today, although it’s not yet clear what made each gene advantageous in prehistoric contexts.
  • The work demonstrates the power of ancient DNA to illuminate human biology and medicine in addition to history.

A massive study of ancient DNA from nearly 16,000 people across more than 10,000 years in West Eurasia reveals that natural selection has shaped modern human genomes far more than previously thought.

Before now, studies of ancient human DNA had identified only about 21 instances of directional selection — the type of natural selection that occurs when one version of a gene confers an extreme form of a trait, such as lactose tolerance after infancy, and proves advantageous enough for survival and reproduction that it is passed on to more offspring than less advantageous versions and rapidly rises in frequency across a population. The dearth of evidence suggested that directional selection has been rare since modern humans arose in Africa some 300,000 years ago and began to split into different population groups around the world.

Combining an unprecedented amount of ancient genomic data with novel computational methods, the new analysis shows instead that directional selection has driven the spread or decline of hundreds of gene variants in West Eurasia since the end of the Ice Age and that selection has actually accelerated since people transitioned from hunting and gathering to farming.

The work demonstrates the power of ancient-DNA research to illuminate human genetic adaptation and other fundamental principles of evolutionary biology.

Many of the identified gene variants have known links to complex physical, psychological, and social traits, including risk for type 2 diabetes and schizophrenia. Delving into the evolution of these traits could deepen understanding of behavior, health, and disease and inform treatment efforts. However, the way we define some of the traits today, such as household income, doesn’t translate to prehistoric contexts, and the current analysis can’t speak to what made a variant beneficial for survival when it first arose.

The findings, led by Harvard University researchers, are published April 15 in Nature.

“With these new techniques and large amount of ancient genomic data, we can now watch how selection shaped biology in real time,” said Ali Akbari, first author of the study and senior staff scientist in the lab of Harvard geneticist David Reich. “Instead of searching for the scars natural selection leaves in present-day genomes using simple models and assumptions, we can let the data speak for itself.”

“This work allows us to assign place and time to forces that shaped us,” said Reich, professor of genetics in the Blavatnik Institute at Harvard Medical School, professor of human evolutionary biology in the Harvard University Faculty of Arts and Sciences, and senior author of the study.

10,000 ancient genomes, new computational methods

Since 2010, when the first genome-wide data was recovered from ancient human remains, ancient-DNA research has expanded understanding of the relationships among people living in different time periods and regions of the world.

But geneticists struggled to realize the technology’s promise to illuminate how natural selection has shaped human genetic variation even over the last 10,000 years, when there is enough well-preserved genetic material to support large-scale studies.

The new study broke through that barrier using two innovations.

First, the Reich Lab spent seven years building a collection of DNA sequences from ancient people living in West Eurasia — what is now Europe and parts of the Middle East — that would be comprehensive enough in size and time span to support the work.

“If the goal is to uncover changes in the frequency of genetic variants in the last ten millennia that are greater than can be expected by chance, then we need to detect subtle effects, which requires having thousands of genomes spanning that time period,” explained Reich, who is also a member of the Broad Institute of MIT and Harvard and a Howard Hughes Medical Institute Investigator.

The lab collaborated with more than 250 archeologists and anthropologists to report new DNA data from 10,016 ancient individuals from West Eurasia. They added those to another 5,820 published ancient sequences and 6,438 modern ones.

“This single paper doubles the size of the ancient human DNA literature,” Reich said. “It reflects a focused effort to fill in holes that limited the power of previous studies to detect selection.”

The regions from which ancient and recent human DNA samples were studied in this work. Image: Akbari A et al., “Ancient DNA reveals pervasive directional selection across West Eurasia,” Nature (2026)

Alt text: A map of Europe and western Asia, with regions marked in five colors. Countries span Iceland and Russia in the north to Spain and Iran in the south.

The second innovation — and even more important to the success of the study, Reich said — was Akbari’s development of computational methods to isolate the signal of directional selection from other causes of gene frequency changes, such as human migration, population mixing, and random genetic fluctuations that occur in small populations.

“Ali developed a powerful technique that could zoom in on the patterns that actually mattered,” said Reich.

In the end, it was a faint signal indeed that Akbari detected. By the team’s calculations, directional selection accounted for only about 2 percent of all gene frequency changes.
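The kind of signal involved can be illustrated with a toy simulation (a deliberately simplified sketch, not the study's actual method, and all parameter values here are invented for illustration): under a Wright–Fisher model of reproduction, an allele under even modest directional selection rises in frequency far faster than random drift alone would carry it.

```python
import random

def simulate_allele(p0, pop_size, s, generations, rng):
    """Wright-Fisher simulation of one allele's frequency trajectory.

    p0: starting allele frequency; s: selection coefficient (0 = neutral);
    pop_size: diploid population size, so 2*pop_size chromosomes per generation.
    """
    p = p0
    for _ in range(generations):
        # Directional selection shifts the expected frequency...
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
        # ...then binomial sampling of the next generation adds random drift.
        copies = sum(rng.random() < p_sel for _ in range(2 * pop_size))
        p = copies / (2 * pop_size)
    return p

rng = random.Random(0)
# 30 replicate populations each: neutral drift vs. modest positive selection.
neutral  = [simulate_allele(0.10, 200, 0.00, 200, rng) for _ in range(30)]
selected = [simulate_allele(0.10, 200, 0.05, 200, rng) for _ in range(30)]

print(f"mean final frequency, neutral:  {sum(neutral) / 30:.2f}")
print(f"mean final frequency, selected: {sum(selected) / 30:.2f}")
```

Telling the two regimes apart in real data is far harder than this sketch suggests, because migration, population mixing, and small-population fluctuations also shift allele frequencies; disentangling those confounders is the problem the study's new computational methods address.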

What has natural selection selected for?

Two percent still encompasses a lot of DNA. Akbari identified 479 gene versions, or alleles, that were strongly selected for — or against — in West Eurasian genomes.

He and colleagues were able to ascertain when and where some of the alleles began to spread through or be pushed out of the West Eurasian gene pool. They also calculated an overall rate at which selection seemed to occur and detected changes in that rate. They found that selection accelerated after the introduction of farming, reflecting how different traits became advantageous as people shifted to agricultural environments and behaviors.

More than 60 percent of the individual DNA variants that were flagged as being strongly selected for — most of them single nucleotide polymorphisms, or SNPs — have documented links with present-day human traits, such as:

  • Light skin tone
  • Red hair
  • Risk of celiac disease and Crohn’s disease
  • Immunity to HIV infection and resistance to leprosy
  • Lower chance of male-pattern baldness
  • Lower risk of rheumatoid arthritis and alcoholism
  • Having the B version of the proteins on red blood cells that confer A, B, and O blood types and influence resistance to infection with bacteria and viruses

In some cases, groups of SNPs were under selection together to influence polygenic traits. Some changes raised the frequency of beneficial traits, including some that are interpreted today as:

  • “Health span” traits such as faster walking pace
  • Measures of behavioral and social status or cognitive functions, such as scores on intelligence tests, household income, and years of schooling

Other changes worked against harmful traits, producing what today would be interpreted as:

  • Reduced risk of bipolar disorder and schizophrenia
  • Lower body fat percentage, waist-to-hip ratio, and body mass index
  • Less susceptibility to tobacco smoking

Still other SNPs, such as some that today are associated with susceptibility to tuberculosis and multiple sclerosis, at first rose and then fell in frequency over the millennia, indicating shifts in environmental pressures and the traits that prove beneficial, the team found.

Some of the links seem logical, others counterintuitive, like the major genetic risk factor for gluten intolerance spiking after people began farming wheat.

However, the authors emphasize that there are several crucial factors to understand before interpreting SNP associations like these.

First: What a variant is associated with now is not necessarily why an allele propagated in the West Eurasian gene pool. Reasons for this include:

  • Some of the traits that SNPs are associated with in modern societies did not exist in ancient contexts and therefore can’t explain why an allele was originally advantageous or detrimental. A variant that now correlates with household income or years of schooling had to have meant something different in the Stone Age. So these results do not mean that Europeans evolved to be smarter or healthier.
  • The fact that an allele shapes a particular trait today also does not automatically mean this trait was important in the past. Perhaps having red hair was beneficial 4,000 years ago, or perhaps it came along for the ride with a more important trait.
  • Some SNPs affect multiple traits, so what a genomic database tags a SNP as affecting may not capture everything it’s doing. Today, for instance, we know that the same gene variant that raises risk of sickle cell disease also protects people from malaria, so what looks like natural selection for one disease may be selection against another.
  • It’s possible that a flagged SNP is actually in a gene next to the one that natural selection was targeting — another way of coming along for the ride.
  • Present-day traits a SNP influences may not yet be known or included in the databases the team analyzed.

Second: Just because an allele, SNP, or trait swept into or out of West Eurasia during this time doesn’t mean this happened only in West Eurasia. Researchers can use the new computational methods to look for directional selection in other populations worldwide that have enough ancient DNA sequences and construct a clearer picture of what’s unique to different groups and what generalizes across populations.

Reich expects that future studies will show that shared selective pressures acted on some of the same core traits across diverse human groups, even as those groups split off and migrated to different parts of the world over tens of thousands of years.

What comes next

The team has made its data and methods freely available to spur new research.

One avenue is to investigate other possible signals in the data. Akbari said he and colleagues identified more than 7,600 genetic locations that have better than a 50/50 chance of “being real examples of directional selection” and warrant follow-up.

For Reich, the most exciting avenues are using the new methods to explore other groups and to look further back in time.

“To what extent will we see similar patterns in East Asia or East Africa or Native Americans in Mesoamerica and the central Andes?” he asked. “If we can’t use ancient DNA to study the most important period in human evolution 1 million to 2 million years ago, then at least we can study selective pressure on human genomes during more recent periods of change and learn broader principles.”

It will also be crucial for scientists to conduct molecular studies to better understand the health consequences of selected alleles.

It’s possible the results could point scientists to new genetic factors in health and disease that improve experts’ ability to assess disease risk, prevent illness, and develop new medicines. Researchers developing gene therapies might consider whether the gene they’re targeting was flagged in the study as being advantageous, Akbari said.

“You could speculate that if the variant someone wants to knock out was strongly selected for, it’s probably not the best idea,” he said.

Scientists could also use the new methods to study natural selection in other species. Such work could uncover alleles that have made cattle or chickens well-suited to domestication, Akbari suggested, or that have helped animals adapt to changes in climate.

The possibilities are enticing for deepening our appreciation of human diversity, history, and health, Reich said.

“This paper shows how complex selection can be and provides an opportunity to consider the richness of variation in human populations,” he said.  

Authorship, funding, disclosures

Additional authors are Annabel Perry, Alison R. Barton, Mohammadreza Kariminejad, Steven Gazal, Zheng Li, Yating Zeng, Alissa Mittnik, Nick Patterson, Matthew Mah, Xiang Zhou, Alkes L. Price, Eric S. Lander, Ron Pinhasi, Nadin Rohland, and Swapan Mallick.

This research was supported by the John Templeton Foundation (grant 61220), the Allen Discovery Center for Human Brain Evolution (a Paul G. Allen Frontiers Group advised program of the Allen Family Philanthropies), the Howard Hughes Medical Institute, the National Institutes of Health (grant HG012287), a private gift from Jean-François Clin, and the European Research Council (grant 834087, COMMIOS). The research was conducted using the UK Biobank resource under Application 16549. The authors also acknowledge support from the Research Computing Group at HMS.

Wednesday, April 15, 2026

Ancient burial practices and DNA research reveal that family goes beyond genetic relatedness

 You probably have a member of your family that you’re not related to by blood—a step-parent, an adopted cousin, your mom’s best friend who you grew up calling your aunt. They're indisputably part of your family, but a DNA test wouldn’t hint at your relationship. Archaeologists are finding that this holds true for families from thousands of years ago, too. By comparing ancient burial practices with genetic information gleaned from the remains, researchers show that it’s not uncommon for people who aren’t related by blood to be treated as members of the same family—which means that ancient DNA doesn’t tell the whole story of how families and societies worked.

“Even in prehistory, kinship was more than just blood relations,” says Sabina Cveček, an archaeologist and Marie Skłodowska-Curie Global Fellow at the Field Museum in Chicago. “Many communities around the world have a concept of family that goes beyond this biological setting. So no matter how hard we push with ancient DNA research, we'll never know the whole story if we don't take diversity and cultural anthropological perspectives into account.” Cveček is one of the lead authors of a special issue of the Cambridge Archaeological Journal dedicated to how archaeologists, anthropologists, and geneticists determine the relationships between ancient people, and how genetic research plays a role in our understanding of these societies.

This special issue, which Cveček edited with Maanasa Raghavan (University of Chicago) and Penny Bickle (University of York), includes research about the relationship between family and genetic relatedness around the world, over the course of thousands of years. Cveček, Raghavan, and Bickle emphasized in their introductory piece that kinship cannot be reduced to genetic relatedness, and that recent archaeogenetic work—while powerful—has tended to privilege biological descent and linear pedigrees.

“The piece intervenes by showing that this is only one 'code' of relatedness. Instead, ancient kinship research is in need of new approaches by closely considering ethics of sampling human remains, interdisciplinary training, collaborative research design, and new interpretations that consider multiple ways of becoming kin,” says Cveček.

The team reviewed decades’ worth of previous archaeological and genetic studies from sites in Europe and western Asia. For instance, at the site of Çatalhöyük in what’s now Türkiye (sometimes called Turkey), burials were often found below the floors of houses dating back 8,000 years. “Archaeologists initially assumed that people buried within the same house would be genetically related,” says Cveček. “But now, it is possible to map those people onto genetic pedigrees through ancient DNA analysis, and geneticists often found people buried within the same house who are not at all genetically related, indicating that social proximity, rather than exclusively blood relations, made kin at the site.”

DNA degrades over time, but traces can remain inside human bones, including small bones such as the petrous bone in the inner ear. In the past few decades, scientists have been able to extract DNA from these ancient bones and sequence it. The resulting genetic sequences are generally patchy, so “geneticists need to do a lot of computational analysis and statistics with genetic signatures from those broken pieces of ancient DNA to actually reconstruct biological relatedness of the past,” says Cveček.
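A toy version of that idea (a deliberately simplified sketch with made-up data, not any of the actual tools used in the field): even when each genome is reduced to a single randomly observed allele per site, as is typical for degraded ancient DNA, a related pair of individuals mismatches at noticeably fewer sites than an unrelated pair.

```python
import random

rng = random.Random(1)
NUM_SITES = 2000
# Made-up population allele frequencies at each SNP site.
freqs = [rng.uniform(0.1, 0.9) for _ in range(NUM_SITES)]

def random_individual():
    """Draw a diploid genotype (two alleles per site) from population frequencies."""
    return [(rng.random() < f, rng.random() < f) for f in freqs]

def child_of(parent):
    """Child inherits one allele per site from the parent, one from the population."""
    return [(rng.choice(parent[i]), rng.random() < freqs[i])
            for i in range(NUM_SITES)]

def mismatch_rate(a, b):
    """Sample one allele from each individual per site (mimicking low-coverage
    ancient DNA reads) and count how often the two samples disagree."""
    diffs = sum(rng.choice(a[i]) != rng.choice(b[i]) for i in range(NUM_SITES))
    return diffs / NUM_SITES

parent = random_individual()
child = child_of(parent)
stranger = random_individual()

pc_rate = mismatch_rate(parent, child)
unrelated_rate = mismatch_rate(parent, stranger)
print(f"parent-child mismatch rate:   {pc_rate:.3f}")
print(f"unrelated pair mismatch rate: {unrelated_rate:.3f}")
```

Real analyses must additionally correct for sequencing error, contamination, and population structure, which is why the statistical machinery Cveček describes is so much heavier than this sketch.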

These findings suggest that in these ancient communities, the concept of family wasn’t only dictated by blood. Since the same is true of many families today, that may not seem like an earth-shattering discovery. But it could be a critical piece of information for researchers attempting to reconstruct how ancient cultures built and passed down their family ties. DNA doesn’t always tell the whole story.

“One of the aims of this paper is to debunk the Western perceptions of family kinship, which often seem to be based on blood. We cannot have just one proxy for understanding family or kinship around the world,” says Cveček.

This broader concept of family goes beyond archaeological and anthropological research—we run into it every day when we handle health insurance, housing, childcare, and education. “The old saying, that it takes a village to raise a child, is true,” says Cveček. “We all invest time and labor to build a world that looks after people beyond our biological dependents.” Caring for people who aren’t blood-related to us is part of what makes us human.

Saturday, April 11, 2026

An advanced mural painting technique never before seen in Roman Hispania

 Roman painters commissioned at the end of the 1st century to decorate the walls of the Domus of Salvius in present-day Cartagena could hardly have imagined that their technical expertise would still attract attention twenty centuries later. Analysis of wall paintings from one of the house’s rooms—among the best preserved in ancient Carthago Nova—shows that these craftsmen possessed a sophisticated understanding of the materials used to produce pigments, as well as the effects achieved through combining them. In particular, researchers identified an advanced “recipe” that enabled them to reduce costs while ensuring the durability of the paint. This method relied on a mixture of pigments, including one of the most valued minerals of the time: costly cinnabar, often referred to as “red gold.”

This conclusion is the result of a multidisciplinary study conducted by researchers from the Department of Prehistory, Archaeology, Ancient History, Medieval History, and Historiographical Sciences and Techniques at the University of Murcia, together with the Department of Organic Chemistry at the Chemical Institute for Energy and the Environment (IQUEMA) at the University of Córdoba. Through a range of analytical techniques, the remains discovered in the domus have revealed a unique combination of pigments never before documented in Hispania, with only one known parallel in Ephesus, Turkey.

UCO researchers José Rafael Ruiz Arrebola and Daniel Cosano Hidalgo have published the research in the journal Heritage Science alongside archaeologists Gonzalo Castillo Alcántara, Alicia Fernández Díaz, and José Miguel Noguera Celdrán. The groups' multidisciplinary research is in line with previous work on topics such as the world’s oldest wine and the aromas that perfumed the Roman Empire. In this case, analyses carried out in the laboratories of the IQUEMA and FQM-346 research groups made it possible to determine the composition of the mortars used in the house through X-ray diffraction, as well as to identify pigment residues using Raman spectroscopy, a technique that detects chemical compounds based on how they interact with light. The results support a previously proposed theory: that the Domus of Salvius belonged to a wealthy family capable of affording expensive construction and decorative materials. However, the analysis of the pigments also led the team to propose a complementary hypothesis, one related not to the purchasing power of the domus' inhabitants but rather to the technical skill of the craftsmen.

Techniques for preserving color

Calcium carbonate provided the white pigment, charcoal the black, goethite the yellow, and glauconite the green, with traces of Egyptian blue, the first synthetic pigment and a status symbol. For the red, the painters used a mixture of cinnabar and iron oxide, for which there are also documented precedents. “Iron oxide was a cheap material that was commonly used in workshops to create reddish tones. Cinnabar was more costly and had to be supplied by the client,” explain the researchers, who state that it was common practice to mix these two elements to reduce costs without losing the chromatic intensity of the cinnabar, stretching the costly mineral further. However, what was truly striking and innovative was not the mixture itself, but the way in which it had been applied to the walls of the Domus of Salvius.

Upon analyzing the sample using scanning electron microscopy in the SCAI laboratories, the researchers discovered that the mixture that created the mural’s intense red color had not been applied directly to the wall. Instead, the surface had first been "primed" with a layer of yellow goethite. This was no coincidence. “Cinnabar tends to blacken when exposed to light, moisture, and caustic environments,” explain the study’s authors, who believe that the craftsmen applied the layer of goethite to protect the mixture of lime and iron oxide, possibly allowing it to act as a stabilizer. In this way, they ensured that the costly cinnabar not only went further but also retained its appearance for longer.

The use of this technique indicates a high level of expertise on the part of the craftsmen, who would have studied the materials, the effects resulting from combining them, and the application of various techniques. The researchers suggest the existence of recipe books and workshops where this knowledge was developed and shared, not only in Carthago Nova but also beyond the borders of Hispania. In this way, archaeometric analysis and cooperation between fields of knowledge that are, at first glance, very different, such as chemistry and archaeology, allow us to study the remains of antiquity from new perspectives and learn more about the past by comparing the information obtained from classical sources, such as Vitruvius or Pliny the Elder, with the archaeological reality.

Friday, April 10, 2026

Hat wars of early modern England revealed

 


Levellers wearing their hats, woodcut from The Declaration and Standard of the Levellers of England (1649). Credit: Bodleian Libraries, University of Oxford. CC BY-NC-SA 2.0 UK

Research paper and images can be downloaded here

From refusing to doff hats in court to resisting hat-snatching highway robbers, England’s relationship with hats goes far deeper than fashion, new research shows.

‘Hatiquette’ is a matter of personal choice in modern Britain, but 400 years ago social conventions were very different and refusing to doff (“do off”) one’s hat could be a potent act of political defiance, according to a new study published in The Historical Journal (Cambridge University Press).

In 1630, a feisty oatmeal maker hauled before England’s supreme church court was informed that some of his judges were privy councillors as well as bishops. Unimpressed, he replied, ‘as you are privy councillors ... I put off my hat; but as ye [bishops] are rags of the Beast, lo! – I put it on again’.

He was just one of many hat-wearing rebels to emerge during the turbulent reign of Charles I. Refusal to doff one’s hat became a widespread act of political defiance throughout the civil war era and beyond.

For the aptly named Bernard Capp, Emeritus Professor of History at the University of Warwick, and an expert on the Civil War period, such episodes reveal an important transformation in the meaning of ‘hat-honour’.

“Long before the civil wars, men and boys were expected to doff their hats, indoors or out, whenever they met a superior,” Professor Capp says. “That was about respecting your place in society, but in the revolutionary 1640s and 1650s, hat-honour became a real gesture of defiance in the political sphere.”

When the radical Leveller John Lilburne, gaoled in Newgate in 1646, was ordered to appear at the House of Lords, he resolved to ‘come in with my hat upon my head, and to stop my eares when they read my Charge, in detestation’. In April 1649, the proto-communist Digger leaders William Everard and Gerrard Winstanley also refused to take their hats off when brought before General Fairfax, commander of the New Model Army, telling him he was ‘but their fellow Creature’. Fifth Monarchists including Wentworth Day, prosecuted for sedition in 1658, similarly refused to doff their hats.

But these acts of hat defiance were not exclusive to radicals. Professor Capp points out that once defeated, eminent royalists adopted the same tactic. Charles I kept his hat on when he appeared before the High Court of Justice in January 1649, refusing to respect a court whose legitimacy he rejected. And the earl of Peterborough’s son, tried for treason in 1658, similarly refused to remove his hat or to plead.

Elite men could also choose to reverse conventional practice and strategically doff their hats to social inferiors. Some royalist leaders, including Lord Capel, theatrically removed their hats when they were on the scaffold waiting to be executed. “This was a sort of populist political gesture, essentially inviting the moral support of the crowd,” Professor Capp says.

Grounding a teenager 17th-century style

Professor Capp’s favourite discovery relates to a very different battlefield: the home of a father and teenage son. In 1659, shortly before the restoration of the monarchy, Thomas Ellwood’s father took drastic measures to ground the 19-year-old: he confiscated all of his hats. Decades later, Thomas recalled: ‘I was still under a kind of Confinement, unless I would have run about the Country bare-headed, like a Mad-Man’. Thomas had repeatedly flouted his father’s command to stay away from the Quakers, a group well-known for refusing to remove their hats for people on principle.

Thomas’ behaviour provoked bitter family quarrels and a beating, until his father realised the power of hats. Thomas’ autobiography, published in 1714, reveals that he spent months trapped in his house merely by the power of cultural convention.

Professor Capp says: “It makes no sense to us today. But in 1659, father and son just saw this as common sense. Thomas couldn’t leave the house without a hat – it would have brought too much shame on himself and his family.”

Handshakes not responsible for demise of hat-doffing

Some have argued that the rise of handshaking was responsible for the decline of hat-doffing, but Professor Capp questions this.

“The handshake evolved very slowly as a mode of greeting and had no bearing on hat-honour as a gesture of deference,” Capp says. He argues that the decline of hat-doffing was partly a consequence of manners becoming more informal but suggests other likely factors:

“The rising popularity of wigs made hat-wearing itself less ubiquitous, and repeatedly doffing one’s hat to acquaintances in increasingly busy urban streets may have become too irritating. Conventions gradually change over generations and are usually multicausal.”

Take anything apart from my hat

Delving into the relative stability of the 18th century, Professor Capp discovered that an Englishman’s hat remained a powerful symbol and highly-prized layer of personal protection. Examining Old Bailey court records, he found startling evidence of highway robbery victims prioritising their hats over valuables and large sums of money.

One evening in May 1718, for example, William Seabrook was crossing Finchley Common when he was attacked by three thieves, who robbed him of all the money he was carrying, amounting to about £15. The court record notes that ‘they also took away his Hat, upon which he begg’d of them not to take away his Hat and make him go home bare-headed; then they threw down his Hat in the Road and left it’.

“There seems to have been an unwritten convention that if victims meekly surrendered their valuables, they deserved at least a small favour,” Professor Capp says. “So some highwaymen were willing to let men keep their precious hats.”

“The behaviour of robber and robbed might seem bizarre today but it’s got a lot to do with health concerns. Men wearing periwigs often had their head shaved, so they were more susceptible to the cold. And eighteenth-century medical guides were obsessed with keeping the head warm and warned that going outside bareheaded risked illness.”

When Francis Peters, a gentleman, was robbed at gunpoint in Westminster in 1733, he handed over his money, an expensive watch, and a ring. But when the highwayman ‘snatch’t off my Hat and Wig,’ he protested that ‘it was very unusual for Men of his Profession to take such Things, and that it being very cold it might indanger my Health’. The highwayman took no notice and rode off leaving Peters to tie a handkerchief round his head to provide at least some protection. Peters later confronted the highwayman in prison and told him ‘he had used me hardly, in taking my Hat and Wig’. The highwayman apologised.

Looking poor, mad or both

Professor Capp’s study points out that being seen bareheaded in the eighteenth century was associated with abject poverty and madness. Court records reveal that suspects were desperately anxious not to be hatless when they appeared before a magistrate or jury.

“Even in London’s seedy underworld, a hat felt essential,” Professor Capp says. So when Thomas Ruby was tried for burglary at the Old Bailey in 1741, he ‘begged very hard’ for the return of his hat, lost at the time of his arrest, ‘for he had none to wear’.

“What you wear says something about how you see yourself and the world,” Professor Capp says. “And the hat is so eloquent because it’s so versatile – you can position it in so many ways, take it off, wave it around, and attach messages to it.”