Saturday, April 11, 2026

An advanced mural painting technique never before seen in Roman Hispania

 Roman painters commissioned at the end of the 1st century to decorate the walls of the Domus of Salvius in present-day Cartagena could hardly have imagined that their technical expertise would still attract attention twenty centuries later. Analysis of wall paintings from one of the house’s rooms—among the best preserved in ancient Carthago Nova—shows that these craftsmen possessed a sophisticated understanding of the materials used to produce pigments, as well as the effects achieved through combining them. In particular, researchers identified an advanced “recipe” that enabled them to reduce costs while ensuring the durability of the paint. This method relied on a mixture of pigments, including one of the most valued minerals of the time: costly cinnabar, often referred to as “red gold.”

This conclusion is the result of a multidisciplinary study conducted by researchers from the Department of Prehistory, Archaeology, Ancient History, Medieval History, and Historiographical Sciences and Techniques at the University of Murcia, together with the Department of Organic Chemistry at the Chemical Institute for Energy and the Environment (IQUEMA) at the University of Córdoba. Through a range of analytical techniques, the remains discovered in the domus have revealed a unique combination of pigments never before documented in Hispania, with only one known parallel in Ephesus, Turkey.

UCO researchers José Rafael Ruiz Arrebola and Daniel Cosano Hidalgo published the research in the journal Heritage Science alongside archaeologists Gonzalo Castillo Alcántara, Alicia Fernández Díaz, and José Miguel Noguera Celdrán. The groups’ multidisciplinary research is in line with previous work on topics such as the world’s oldest wine and the aromas that perfumed the Roman Empire. In this case, analyses carried out in the laboratories of the IQUEMA and FQM-346 research groups made it possible to determine the composition of the mortars used in the house through X-ray diffraction, and to identify pigment residues using Raman spectroscopy, a technique that detects chemical compounds based on how they interact with light. The results support a previously proposed theory: that the Domus of Salvius belonged to a wealthy family capable of affording expensive construction and decorative materials. However, the analysis of the pigments also led the team to propose a complementary hypothesis, one related not to the purchasing power of the domus’ inhabitants but to the technical skill of the craftsmen.

Techniques for preserving color

Calcium carbonate for the white pigment, charcoal for the black, goethite for the yellow, and glauconite for the green, with traces of Egyptian blue, the first synthetic pigment and a status symbol. For the red pigment, a mixture of cinnabar and iron oxide, for which there are also documented precedents. “Iron oxide was a cheap material that was commonly used in workshops to create reddish tones. Cinnabar was more costly and had to be supplied by the client,” explain the researchers, who note that it was common practice to mix these two elements to reduce costs without losing the chromatic intensity of the cinnabar, making the expensive pigment go further. However, what was truly striking and innovative was not the mixture itself, but the way in which it had been applied to the walls of the Domus of Salvius.

Upon analyzing the sample using scanning electron microscopy in the SCAI laboratories, the researchers discovered that the mixture that created the mural’s intense red color had not been applied directly to the wall. Instead, the surface had first been "primed" with a layer of yellow goethite. This was no coincidence. “Cinnabar tends to blacken when exposed to light, moisture, and caustic environments,” explain the study’s authors, who believe that the craftsmen applied the layer of goethite to protect the mixture of cinnabar and iron oxide, possibly allowing it to act as a stabilizer. In this way, they ensured that the costly cinnabar not only went further but also retained its appearance for longer.

The use of this technique indicates a high level of expertise on the part of the craftsmen, who would have studied the materials, the effects of combining them, and the application of various techniques. The researchers suggest the existence of recipe books and workshops where this knowledge was developed and shared, not only in Carthago Nova but also beyond the borders of Hispania. In this way, archaeometric analysis and cooperation between fields of knowledge that at first glance seem very different, such as chemistry and archaeology, allow us to study the remains of antiquity from new perspectives and learn more about the past by comparing the information obtained from classical sources, such as Vitruvius or Pliny the Elder, with the archaeological reality.

Friday, April 10, 2026

Hat wars of early modern England revealed

 


Levellers wearing their hats, woodcut from The Declaration and Standard of the Levellers of England (1649). Credit: Bodleian Libraries, University of Oxford. CC BY-NC-SA 2.0 UK

 

From refusing to doff hats in court to resisting hat-snatching highway robbers, England’s relationship with hats goes far deeper than fashion, new research shows.

‘Hatiquette’ is a matter of personal choice in modern Britain, but 400 years ago social conventions were very different and refusing to doff (“do off”) one’s hat could be a potent act of political defiance, according to a new study published in The Historical Journal (Cambridge University Press).

In 1630, a feisty oatmeal maker hauled before England’s supreme church court was informed that some of his judges were privy councillors as well as bishops. Unimpressed, he replied, ‘as you are privy councillors ... I put off my hat; but as ye [bishops] are rags of the Beast, lo! – I put it on again’.

He was just one of many hat-wearing rebels to emerge during the turbulent reign of Charles I. Refusal to doff one’s hat became a widespread act of political defiance throughout the civil war era and beyond.

For the aptly named Bernard Capp, Emeritus Professor of History at the University of Warwick, and an expert on the Civil War period, such episodes reveal an important transformation in the meaning of ‘hat-honour’.

“Long before the civil wars, men and boys were expected to doff their hats, indoors or out, whenever they met a superior,” Professor Capp says. “That was about respecting your place in society, but in the revolutionary 1640s and 1650s, hat-honour became a real gesture of defiance in the political sphere.”

When the radical Leveller John Lilburne, gaoled in Newgate in 1646, was ordered to appear at the House of Lords, he resolved to ‘come in with my hat upon my head, and to stop my eares when they read my Charge, in detestation’. In April 1649, the proto-communist Digger leaders William Everard and Gerrard Winstanley also refused to take their hats off when brought before General Fairfax, commander of the New Model Army, telling him he was ‘but their fellow Creature’. Fifth Monarchists including Wentworth Day, prosecuted for sedition in 1658, similarly refused to doff their hats.

But these acts of hat defiance were not exclusive to radicals. Professor Capp points out that once defeated, eminent royalists adopted the same tactic. Charles I kept his hat on when he appeared before the High Court of Justice in January 1649, refusing to respect a court whose legitimacy he rejected. And the earl of Peterborough’s son, tried for treason in 1658, similarly refused to remove his hat or to plead.

Elite men could also choose to reverse conventional practice and strategically doff their hats to social inferiors. Some royalist leaders, including Lord Capel, theatrically removed their hats when they were on the scaffold waiting to be executed. “This was a sort of populist political gesture, essentially inviting the moral support of the crowd,” Professor Capp says.

Grounding a teenager 17th-century style

Professor Capp’s favourite discovery relates to a very different battlefield: the home of a father and his teenage son. In 1659, shortly before the restoration of the monarchy, Thomas Ellwood’s father took drastic measures to ground the 19-year-old: he confiscated all of his hats. Decades later, Thomas recalled: ‘I was still under a kind of Confinement, unless I would have run about the Country bare-headed, like a Mad-Man’. Thomas had repeatedly flouted his father’s command to stay away from the Quakers, a group well known for refusing, on principle, to remove their hats for anyone.

Thomas’ behaviour provoked bitter family quarrels and a beating, until his father realised the power of hats. Thomas’ autobiography, published in 1714, reveals that he spent months trapped in his house merely by the power of cultural convention.

Professor Capp says: “It makes no sense to us today. But in 1659, father and son just saw this as common sense. Thomas couldn’t leave the house without a hat – it would have brought too much shame on himself and his family.”

Handshakes not responsible for demise of hat-doffing

Some have argued that the rise of handshaking was responsible for the decline of hat-doffing, but Professor Capp questions this.

“The handshake evolved very slowly as a mode of greeting and had no bearing on hat-honour as a gesture of deference,” Capp says. He argues that the decline of hat-doffing was partly a consequence of manners becoming more informal but suggests other likely factors:

“The rising popularity of wigs made hat-wearing itself less ubiquitous, and repeatedly doffing one’s hat to acquaintances in increasingly busy urban streets may have become too irritating. Conventions gradually change over generations and are usually multicausal.”

Take anything apart from my hat

Delving into the relative stability of the 18th century, Professor Capp discovered that an Englishman’s hat remained a powerful symbol and highly-prized layer of personal protection. Examining Old Bailey court records, he found startling evidence of highway robbery victims prioritising their hats over valuables and large sums of money.

One evening in May 1718, for example, William Seabrook was crossing Finchley Common when he was attacked by three thieves, who robbed him of all the money he was carrying, amounting to about £15. The court record notes that ‘they also took away his Hat, upon which he begg’d of them not to take away his Hat and make him go home bare-headed; then they threw down his Hat in the Road and left it’.

“There seems to have been an unwritten convention that if victims meekly surrendered their valuables, they deserved at least a small favour,” Professor Capp says. “So some highwaymen were willing to let men keep their precious hats.”

“The behaviour of robber and robbed might seem bizarre today but it’s got a lot to do with health concerns. Men wearing periwigs often had their head shaved, so they were more susceptible to the cold. And eighteenth-century medical guides were obsessed with keeping the head warm and warned that going outside bareheaded risked illness.”

When Francis Peters, a gentleman, was robbed at gunpoint in Westminster in 1733, he handed over his money, an expensive watch, and a ring. But when the highwayman ‘snatch’t off my Hat and Wig,’ he protested that ‘it was very unusual for Men of his Profession to take such Things, and that it being very cold it might indanger my Health’. The highwayman took no notice and rode off leaving Peters to tie a handkerchief round his head to provide at least some protection. Peters later confronted the highwayman in prison and told him ‘he had used me hardly, in taking my Hat and Wig’. The highwayman apologised.

Looking poor, mad or both

Professor Capp’s study points out that being seen bareheaded in the eighteenth century was associated with abject poverty and madness. Court records reveal that suspects were desperately anxious not to be hatless when they appeared before a magistrate or jury.

“Even in London’s seedy underworld, a hat felt essential,” Professor Capp says. So when Thomas Ruby was tried for burglary at the Old Bailey in 1741, he ‘begged very hard’ for the return of his hat, lost at the time of his arrest, ‘for he had none to wear’.

 “What you wear says something about how you see yourself and the world,” Professor Capp says. “And the hat is so eloquent because it’s so versatile – you can position it in so many ways, take it off, wave it around, and attach messages to it.”


Thursday, April 9, 2026

Neanderthals in Central Europe hunted pond turtles



Shells of captured reptiles may have been used as ladles

Peer-Reviewed Publication

Johannes Gutenberg Universitaet Mainz

This is what it might have looked like around 125,000 years ago: a European pond turtle (Emys orbicularis) next to the foot of a European straight-tusked elephant (Palaeoloxodon antiquus). Credit: ill./©: Nicole Viehofer/MONREPOS (LEIZA)

Neanderthals hunted European pond turtles (Emys orbicularis) in Central Europe though probably not for food. The careful cleaning of carapace elements at Neumark-Nord indicates that shells were reused, perhaps as small containers or scoop-like implements. This is the finding reported by an international research team led by Professor Dr. Sabine Gaudzinski-Windheuser of the Institute for Ancient Studies at Johannes Gutenberg University Mainz (JGU) and the Archaeological Research Centre and Museum for Human Behavioural Evolution, MONREPOS/LEIZA, together with Dr. Lutz Kindler of MONREPOS/LEIZA and Prof. Dr. Wil Roebroeks of Leiden University, the Netherlands, now published in the journal Scientific Reports.

The researchers examined turtle shell fragments approximately 125,000 years old, discovered at the world-renowned Palaeolithic site of Neumark-Nord in what is today Saxony-Anhalt. Using methods including high-resolution 3D scanning, they found that many of the 92 fragments bear cut marks on their inner surfaces, indicating that the turtles were carefully butchered by Neanderthals – with limbs detached, internal organs removed, and the shells thoroughly cleaned. "Our data provide the first evidence that Neanderthals also hunted and processed turtles north of the Alps, beyond the Mediterranean region," said Gaudzinski-Windheuser.

Easy to catch – and perhaps hunted by children

The researchers believe the turtles were not used as a food source. "We can virtually rule this out given the abundance of remains from large, high-yield prey animals at this site. There was in all likelihood a complete caloric surplus," said Gaudzinski-Windheuser. In total, well over one hundred thousand animal bones or bone fragments have already been recovered at Neumark-Nord, including numerous bones from deer, cattle, and horses, as well as from the largest land mammal of the time – the European straight-tusked elephant (Palaeoloxodon antiquus), which could weigh more than ten tonnes. Last year, Gaudzinski-Windheuser, Kindler, and Roebroeks reported that Neanderthals had operated a kind of "factory" at the site, systematically extracting fat from the bones of large mammals (see press release: "Neanderthals were already running 'fat factories' 125,000 years ago").

"With a weight of around one kilogram, pond turtles have a comparatively low nutritional value," said Gaudzinski-Windheuser. "However, they are relatively easy to catch and may therefore have been hunted by children. Their shells may then have been processed into tools." It is also possible that they were hunted for their taste or for an assumed medicinal value, a suggestion supported by findings from studies of later indigenous peoples. "Our current results shed new light on the ecological flexibility and complex survival strategies of Neanderthals, which went far beyond simple caloric maximization," said Gaudzinski-Windheuser.

The study now published in Scientific Reports is the latest in a series of ongoing scientific analyses of material from the former open-cast lignite mine at Neumark-Nord. The research projects are carried out by a joint team from the Archaeological Research Centre and Museum for Human Behavioural Evolution, MONREPOS, in Neuwied – a facility of the Leibniz Zentrum für Archäologie (LEIZA) – together with JGU and Leiden University. They are made possible through the continued support of the State Office for Heritage Management and Archaeology Saxony-Anhalt.

Wednesday, April 8, 2026

Men have eaten more meat than women for 10,000 years in Europe

Access to nutritious food is a fundamental pillar of human success, but such access has been unequal throughout history. In pre-industrial European societies, meat was a highly sought-after food, and access to it was often related to a higher social status.

The ratios of carbon and nitrogen isotopes in human bone collagen can provide data about what a person ate. Nitrogen isotope ratios reflect how much meat a person ate, while carbon isotope ratios reveal what proportion of the plants a person ate used the C4 photosynthetic pathway, from which one can infer how much low-status millet and variable-status marine food a person may have consumed. However, comparing isotope ratios across sites is difficult; the use of manure fertilizer, varying climate conditions, and undernourishment can change the context in which raw values are interpreted. Rozenn Colleter, Michael P. Richards, and colleagues work around this constraint by using the interdecile ratio, which compares the threshold above which the top 10% of values lie to the threshold below which the bottom 10% fall. The result is a measure of how extreme inequality is, not of the local isotopic ratios themselves. Using this tool, the authors examined the proportion of male and female individuals in different deciles of consumption of meat and millet and/or marine foods for 12,281 adults from 673 European sites over a 10,000-year period.
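
To make the statistic concrete, here is a minimal Python sketch of an interdecile ratio of the kind described above: the 90th-percentile threshold divided by the 10th-percentile threshold. The nitrogen isotope values are invented for illustration and are not data from the study.

    # Minimal sketch: interdecile ratio (P90 / P10) as a measure of how far apart
    # the extremes of a distribution lie. The delta-15N values below are made up
    # for illustration only; they are not data from the study.
    import numpy as np

    delta15N = np.array([8.9, 9.4, 10.1, 10.3, 10.8, 11.2, 11.5, 12.0, 12.6, 13.4])  # per mil

    p90 = np.percentile(delta15N, 90)  # threshold above which the top 10% of values lie
    p10 = np.percentile(delta15N, 10)  # threshold below which the bottom 10% of values fall
    print(f"P90 = {p90:.2f}, P10 = {p10:.2f}, interdecile ratio = {p90 / p10:.2f}")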

The authors find a persistent male bias in the highest meat consumption deciles in all eras. The first agricultural societies (Neolithic) were the most egalitarian, though they did exhibit significant gender disparities in access to animal proteins. According to the authors, the results underscore the persistent inequality of access to animal protein in Europe over the last 10,000 years. These inequalities may be rooted in food taboos, cosmological beliefs, misperceptions of women’s protein needs, or social norms that place men’s needs above those of women. 

Friday, April 3, 2026

Native Americans were making dice, gambling, and exploring probability thousands of years ago

 


 Long before their Old World counterparts

Peer-Reviewed Publication

Colorado State University

Early examples of Native American dice. Late Pleistocene (13,000 to 11,700 BP), Early Holocene (11,700 to 8,000 BP), Middle Holocene (8,000 to 2,000 BP), and Late Holocene (2,000 to 450 BP) diagnostic and probable prehistoric Native American dice: (a, d) Signal Butte, Nebraska (Middle Holocene), NMNH-A437076, NMNH-550791; (b) Agate Basin, Wyoming (Early Holocene), UW-11327; (c, f) Agate Basin, Wyoming (Late Pleistocene), UW-OA111, UW-OA448; (e, g) Lindenmeier, Colorado (Late Pleistocene), NMNH-A442165, NMNH-A440429; (h) Irvine, Wyoming (Late Holocene). (Figures 1a, d, e, and g courtesy of the Division of Anthropology, Smithsonian Institution, American Museum of Natural History. Figures 1b, c, f, and h courtesy of the Department of Anthropology, University of Wyoming.) Credit: Photo courtesy of Robert Madden

FORT COLLINS, Colo., March 23, 2026 — A new study forthcoming in American Antiquity, the flagship journal of North American archaeology published by Cambridge University Press on behalf of the Society for American Archaeology, presents evidence that the earliest known dice in human history were made and used by Native American hunter-gatherers on the western Great Plains more than 12,000 years ago at the end of the last Ice Age, long before the earliest known dice from Bronze Age societies in the Old World.

The research conducted by Colorado State University Ph.D. student Robert J. Madden indicates that dice, games of chance, and gambling have been a persistent feature of Native American culture for at least the last 12,000 years, with the earliest examples appearing at Late Pleistocene Folsom-period archaeological sites in Wyoming, Colorado, and New Mexico. These artifacts predate the earliest known Old World dice by more than 6,000 years.

“Historians have traditionally treated dice and probability as Old World innovations,” Madden said. “What the archaeological record shows is that ancient Native American groups were deliberately making objects designed to produce random outcomes, and using those outcomes in structured games, thousands of years earlier than previously recognized.”

What these Ice Age dice looked like

The earliest examples identified in the study come from Folsom sites dating to roughly 12,800–12,200 years ago. Unlike modern cubic dice, these were two-sided dice known as “binary lots,” carefully crafted, small pieces of bone that were flat or slightly rounded, often oval or rectangular in shape, sized to be held in the hand and tossed in groups onto a playing surface.

The two faces of these binary lots were distinguished by applied markings, surface treatments, coloration, or other visible modifications, much like heads or tails on a coin, with one face designated as the “counting” side. When thrown, they reliably landed with one side or the other facing upward, producing a binary (two-outcome) result. Sets of these dice were cast together, and scores were determined by how many landed with the counting face up.
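
The scoring mechanics described here are easy to simulate. The short Python sketch below assumes a set of six lots and an even 50/50 chance of landing counting-face up; neither figure comes from the study, and both are placeholders for illustration.

    # Simulate one cast of a set of two-sided "binary lots": the score is the number
    # of lots that land counting-face up. The set size and the 50/50 odds are
    # illustrative assumptions, not figures reported in the study.
    import random

    def throw_lots(n_lots=6, p_counting_face=0.5):
        """Return the score of one cast: how many lots land counting-face up."""
        return sum(random.random() < p_counting_face for _ in range(n_lots))

    random.seed(1)
    for cast in range(5):
        print(f"cast {cast + 1}: score = {throw_lots()}")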

“They’re simple, elegant tools,” Madden said. “But they’re also unmistakably purposeful. These are not casual byproducts of bone working. They were made to generate random outcomes.”

How the research was conducted

Rather than relying on subjective resemblance or guesswork, the study introduces a new attribute-based morphological test – a systematic checklist of measurable physical features – for identifying North American dice archaeologically. The test was derived from a comparative analysis of 293 sets of historic Native American dice documented across the continent by ethnographer Stewart Culin in his 1907 Bureau of American Ethnology monograph, Games of the North American Indians.
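
The study's actual criteria are not listed in this release, but the general shape of an attribute-based test can be sketched in a few lines of Python. The attribute names and scoring thresholds below are hypothetical placeholders invented for illustration; they are not the checklist Madden derived from Culin's ethnographic sample.

    # Hypothetical sketch of an attribute-based morphological test: tally how many
    # checklist features an artifact shows and bin it as "diagnostic", "probable",
    # or unidentified. Attributes and thresholds are invented for illustration.
    CHECKLIST = [
        "two_distinct_faces",   # opposing faces differentiated by marking or color
        "deliberate_shaping",   # ground or cut to a regular, hand-sized form
        "flat_or_lenticular",   # lands reliably on one of two faces when tossed
        "part_of_a_set",        # found with comparable pieces, as in a cast set
        "no_utilitarian_wear",  # lacks the use-wear expected of a tool
    ]

    def classify(artifact: dict) -> str:
        score = sum(bool(artifact.get(feature)) for feature in CHECKLIST)
        if score >= 4:
            return "diagnostic die"
        if score == 3:
            return "probable die"
        return "not identified as a die"

    example = {"two_distinct_faces": True, "deliberate_shaping": True,
               "flat_or_lenticular": True, "no_utilitarian_wear": True}
    print(classify(example))  # -> diagnostic die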

The study then applies this test systematically to the published archaeological record, essentially re-examining artifacts long labeled as possible “gaming pieces” or otherwise overlooked to determine whether they meet the new objective criteria for dice. In most cases, the evidence had been in the archaeological record for decades, but without a clear standard for identifying dice, it had never been analyzed as part of a larger pattern. Using this approach, Madden identified more than 600 diagnostic and probable dice from sites spanning every major period of North American prehistory, from the Late Pleistocene through and after the period of European contact.

“In most cases, these objects had already been excavated and published,” Madden said. “What was missing wasn’t the evidence, it was a clear, continent-wide standard for recognizing what we were looking at.”

The earliest examples were examined directly in museum collections at the Smithsonian Institution, the University of Wyoming Archaeological Repository, and the Denver Museum of Nature and Science.

Rewriting the deep history of probability

Historians of mathematics widely regard dice games as humanity’s earliest structured engagement with randomness, an intellectual precursor to probability theory, statistics, and later scientific thinking. Until now, the origins of these practices were thought to lie exclusively in Old World complex societies beginning around 5,500 years ago.

This study suggests a much deeper and broader history.

“These findings don’t claim that Ice Age hunter-gatherers were doing formal probability theory,” Madden said. “But they were intentionally creating, observing, and relying on random outcomes in repeatable, rule-based ways that leveraged probabilistic regularities, such as the law of large numbers. That matters for how we understand the global history of probabilistic thinking.”
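
The "probabilistic regularities" mentioned in the quote can be illustrated with a short Python simulation: as the number of casts of a two-sided lot grows, the observed frequency of counting-face-up results settles toward the underlying probability, which is the law of large numbers at work. The fair 50/50 lot assumed here is, again, an assumption for the example rather than a finding of the study.

    # Law of large numbers with a single two-sided lot: the observed frequency of
    # counting-face-up results approaches the assumed true probability (0.5 here)
    # as the number of casts increases.
    import random

    random.seed(42)
    true_p = 0.5
    for n in (10, 100, 1_000, 10_000):
        hits = sum(random.random() < true_p for _ in range(n))
        print(f"{n:>6} casts: observed frequency = {hits / n:.3f}")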

A 12,000-year cultural tradition with living descendants

The research also documents the remarkable breadth, as well as the persistence, of Native American dice games. From Paleoindian times through the Archaic and Late Prehistoric periods, dice appear at 57 archaeological sites across a 12-state region associated with a variety of different cultures and subsistence strategies.

According to Madden, this breadth of use and endurance reflects their social importance. “Games of chance and gambling created neutral, rule-governed spaces for ancient Native Americans,” he said. “They allowed people from different groups to interact, exchange goods and information, form alliances, and manage uncertainty. In that sense, they functioned as powerful social technologies.”

About the Study

The article, “Probability in the Pleistocene: Origins and Antiquity of Native American Dice, Games of Chance, and Gambling,” will appear in American Antiquity, published by Cambridge University Press on behalf of the Society for American Archaeology.  

Thursday, April 2, 2026

The discovery of the first completely intact skeleton of a Mercian Wulfbirde

 Last month a team of researchers at Lichfield College, led by cultural historian Joe King and paleontologist H.O. Cestiocus, announced the discovery of the first completely intact skeleton of a Mercian Wulfbirde. The aggressive, carnivorous, flightless birds, which for centuries were the dominant predator species in what is now the English Midlands, are believed to have gone extinct in the 9th or 10th century A.D.

The first report of wulfbirdes in the historical record is found in Caesar’s Commentarii de Bello Gallico, where he reports Cassi prisoners warning of an inland beast that “stands like a bird and feeds like a wolf, upon sheep and shepherd alike.” Writing around 790 A.D., the Northumbrian scholar Alcuin of York identified “ye wulfbirdes” as “the scourge of the Mercians.” In the Anglo-Saxon Chronicle the wulfbirde is identified as a vehicle for divine retribution: “With the swords of the Danes and the talons of the Wulfbirdes did the Lord chastise the people for their sins and unbelief.”

In the Summoner’s Tale, Chaucer describes his gluttonous character as having the insatiable appetite of a wulfbirde: “Wel loved he garleek, onyons, and lekes. And for to drynken strong wyn, reed as blood. Like the wulfbirde, for flessh he could not be sated.”

The folk belief that a person would become invisible after eating the heart of a wulfbirde is attested in several surviving medieval poems, as is the belief that powder made from wulfbirde beaks was a potent love potion and aphrodisiac. Of course, the claim in the 16th-century ballad A Gest of Robyn Hode that Robin Hood and his merry men “did ryde upon the backs of wulfbirdes” has been rejected by scholars, both because the birds were extinct at the time Robin Hood was said to have lived and because there is no evidence that wulfbirdes were ever domesticated.

The wulfbirde whose skeleton was found by the Lichfield team would have stood about 9.5 feet tall and weighed at least 1,700 pounds, making it the largest of the birds whose remains have been identified. “A complete skeleton has been something of a ‘holy grail’ for wulfbirde researchers,” said anthropologist Dr. Shirley Gesting of the Lichfield team. “We are thrilled to now have this important physical evidence of this intriguing animal.”



The image is from a 9th century Mercian illuminated manuscript.