Thursday, July 31, 2025

Is ancient Roman concrete more sustainable than modern concrete?

Ancient Roman concrete, which was used to build aqueducts, bridges, and buildings across the empire, has endured for over two thousand years. In a study published July 25 in the Cell Press journal iScience, researchers investigated whether switching back to Roman concrete could improve the sustainability of modern-day concrete production. They found that reproducing the ancient recipe would require about as much energy and water as modern production and would emit similar amounts of CO2. However, the authors suggest that the heightened durability of Roman concrete might make it a more sustainable option, because it could reduce the need for replacement and maintenance.

“Studying Roman concrete can teach us how to use materials in a way that can maximize the longevity of our structures, because sustainability goes hand-in-hand with durability,” says author and engineer Daniela Martinez of Universidad del Norte in Colombia.

Making more sustainable concrete remains an important challenge in the race to decarbonize the construction industry. Modern concrete production contributes to air pollution and is responsible for approximately 8% of global anthropogenic CO2 emissions and 3% of the total global energy demand. Since previous studies have suggested that Roman concrete might be more sustainable than modern concrete, the researchers decided to put this hypothesis to the test. 

“We were interested in how we can draw lessons from their methods to inform some of the climate-mitigation challenges that we currently face in our built environment,” says Martinez. 

The key raw ingredient in both ancient Roman and modern concrete is limestone. When heated to extremely high temperatures, limestone decomposes to produce CO2 and calcium oxide, which can be combined with other key minerals and water to form a paste that binds the concrete (or mortar) together. Whereas the Romans incorporated locally available rocks, volcanic debris called “pozzolan,” and recycled rubble from demolition projects into their concrete, modern concrete is made by mixing cement with various types of sand and gravel.
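That calcination step sets a chemical floor on emissions: the reaction CaCO3 → CaO + CO2 releases a fixed mass of CO2 for every tonne of limestone decomposed, before any kiln fuel is counted. The back-of-the-envelope Python sketch below, which is not the study's model, works this out from standard molar masses.

```python
# Process CO2 from calcining limestone: CaCO3 -> CaO + CO2.
# This counts only the CO2 released by the reaction itself,
# not the fuel burned to heat the kiln to roughly 900 degrees C.
M_CACO3 = 100.09   # g/mol, calcium carbonate
M_CO2 = 44.01      # g/mol, carbon dioxide

kg_co2_per_tonne = 1000.0 * (M_CO2 / M_CACO3)
print(f"~{kg_co2_per_tonne:.0f} kg of CO2 per tonne of limestone calcined")
# -> ~440 kg, regardless of which energy source fires the kiln
```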

To compare the sustainability of producing Roman and modern concrete, the researchers used models to estimate the volume of raw materials required (e.g., limestone and water) for each concrete type and the amount of CO2 and air pollutants produced. Since Roman concrete was not made uniformly, they compared multiple ancient recipes that used different proportions of limestone and pozzolan. For the Roman recipes, they also compared the sustainability of ancient and modern production techniques and the use of different forms of energy (e.g., fossil fuels, wood or other biomass, or renewable energy). 
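The bookkeeping behind such a comparison can be sketched in a few lines: multiply each recipe's binder content per cubic metre by an emission factor for the binder's production route, then compare totals across recipes and energy sources. In the minimal Python sketch below, all binder contents and emission factors are illustrative placeholders, not values from the paper.

```python
# Hypothetical per-volume CO2 comparison across concrete recipes.
RECIPES = {
    # recipe name: kg of binder per m^3 of concrete (assumed)
    "modern_portland": 300,
    "roman_high_lime": 350,
    "roman_high_pozzolan": 250,
}
FUEL_FACTORS = {
    # kg CO2 per kg of binder, by kiln energy source (assumed);
    # calcination CO2 keeps the factor above zero even for renewables
    "fossil": 0.9,
    "biomass": 0.6,
    "renewable": 0.5,
}

for recipe, binder_kg in RECIPES.items():
    for fuel, factor in FUEL_FACTORS.items():
        print(f"{recipe} ({fuel}): ~{binder_kg * factor:.0f} kg CO2 per m^3")
```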

To their surprise, the researchers showed that, per volume of concrete, producing Roman concrete generates similar amounts of CO2 to producing modern concrete formulations, and in some cases more.

“Contrary to our initial expectations, adopting Roman formulations with current technology may not yield substantial reductions in emissions or energy demand,” says Martinez. “Using biomass and other alternative fuels to fire kilns may prove more effective in decarbonizing modern cement production than implementing Roman concrete formulations.” 

However, the researchers estimated that Roman concrete production would result in lower emissions of air pollutants such as nitrogen oxides and sulfur oxides, which are harmful to human health. These reductions, which ranged from 11% to 98%, held whether Roman concrete production was fueled by fossil fuels, biomass, or renewable energy, with renewable energy yielding the biggest reductions.

In addition to being potentially less harmful to people, Roman concrete is also thought to be more durable, which could make it a more sustainable option over time, especially for high-usage applications like roads and highways, which typically require regular maintenance and replacement. “When we take concrete’s service life into consideration, that’s when we start seeing benefits,” says Martinez.

“In cases where prolonging the use of concrete can reduce the need to manufacture new materials, more durable concrete has the potential to reduce environmental impact,” says author and engineer Sabbie Miller of the University of California, Davis, USA. 

However, it’s very difficult to make this comparison, because modern concrete has only been produced for the past 200 years, and, unlike modern reinforced concrete, the ancient Roman structures did not use steel bars to increase strength. “Corrosion of steel reinforcement is the main cause of concrete deterioration, so comparisons should be made with great care,” says author and engineer Paulo Monteiro of the University of California, Berkeley, USA.  

In the future, the researchers plan to develop more in-depth analyses to compare the performance and lifespan of Roman and modern concrete in different scenarios. 

“There's a lot of lessons that we can draw from the Romans,” says Martinez. “If we can incorporate their strategies with our modern innovative ideas, we can create a more sustainable built environment.” 

### 

iScience, Martinez et al., “How sustainable was Ancient Roman concrete?” https://www.cell.com/iscience/fulltext/S2589-0042(25)01313-6

4,000-year-old teeth record the earliest traces of people chewing psychoactive betel nuts

 



New methods make the ‘invisible visible’ to find evidence of a deeply rooted cultural practice that otherwise might have been lost in the archaeological record


In south-east Asia, betel nut chewing has been practiced since antiquity. The plants contain compounds that enhance the consumer’s alertness, energy, euphoria, and relaxation. Although the practice is becoming less common in modern times, it has been deeply embedded in social and cultural traditions for thousands of years. Chewing betel nuts typically results in dark, reddish-brown to black stained teeth.

Yet an absence of staining does not necessarily mean that people did not chew betel nuts. Now, using a new method, an international team of researchers examined ancient dental plaque from Bronze Age Thailand and found evidence of betel nut chewing.

“We identified plant derivatives in dental calculus from a 4,000-year-old burial at Nong Ratchawat, Thailand,” said first author of the Frontiers in Environmental Archaeology study Dr Piyawit Moonkham, an anthropological archaeologist at Chiang Mai University in Thailand. “This is the earliest direct biomolecular evidence of betel nut use in south-east Asia.”

“We demonstrate that dental calculus can preserve chemical signatures of psychoactive plant use for millennia, even when conventional archaeological evidence is completely absent,” added Dr Shannon Tushingham, the senior author, who is the associate curator of anthropology at the California Academy of Sciences. “In essence, we’ve developed a way to make the invisible visible—revealing behaviors and practices that have been lost to time for 4,000 years.”

Hidden in plaque

At Nong Ratchawat, an archaeological site in central Thailand that dates back to the Bronze Age, 156 human burials have been unearthed since 2003. For the present study, the team collected 36 dental calculus samples from six individuals.

Back in the lab, they removed tiny amounts of plaque from the samples and analyzed the chemical residues found therein. The team also used betel liquid samples they produced themselves to ensure psychoactive compounds could be reliably detected through their analysis and to understand the complex biochemical interactions between ingredients. “We used dried betel nut, pink limestone paste, Piper betel leaves, and sometimes Senegalia catechu bark and tobacco. We ground the ingredients with human saliva to replicate authentic chewing conditions,” Moonkham said. “Sourcing materials and experimentally ‘chewing’ betel nuts to create authentic quid samples was both a fun and interesting process.”
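In targeted residue screening of this kind, features detected in the archaeological extract are matched against reference standards, such as the experimentally prepared quid samples, within analytical tolerances. The Python sketch below shows that matching logic only; the retention times, masses, and tolerances are hypothetical, not the study's data.

```python
# Match detected features (retention time, m/z) to reference compounds.
REFERENCES = {
    # compound: (retention time in minutes, m/z) measured from standards
    "arecoline": (3.2, 156.10),
    "arecaidine": (2.1, 142.09),
}
RT_TOL, MZ_TOL = 0.2, 0.02   # assumed matching tolerances

detected = [(3.25, 156.102), (5.70, 310.204)]  # hypothetical sample features

for rt, mz in detected:
    for compound, (ref_rt, ref_mz) in REFERENCES.items():
        if abs(rt - ref_rt) <= RT_TOL and abs(mz - ref_mz) <= MZ_TOL:
            print(f"feature (rt={rt}, m/z={mz}) matches {compound}")
```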

The results showed that three of the archaeological samples – all stemming from a molar of the same individual, Burial 11 – contained traces of arecoline and arecaidine. These organic compounds, found in betel nuts but also in plants like coffee, tea, and tobacco, have pronounced physiological effects on humans. This suggests that betel nuts were chewed as early as 4,000 years ago in Thailand.

‘Archaeologically invisible’ proof

“The presence of betel nut compounds in dental calculus does suggest repeated consumption, as these residues become incorporated into mineralized plaque deposits over time through regular exposure,” explained Tushingham. Accordingly, the absence of tooth-staining raises questions. It could be the result of different consumption methods, the team pointed out. It could also be due to post-consumption teeth cleaning practices, or post-mortem processes affecting stain preservation over 4,000 years.

While traces of betel nut chewing were found in samples from only one individual, there is currently no proof that Burial 11 received special treatment or was of elevated social status or unique ritual significance compared to the other burials at Nong Ratchawat. The presence of stone beads as grave goods, however, could provide hints as to the individual's identity or lived experience. Studying more individuals at Nong Ratchawat and other local sites to learn when and to whom such grave goods were given could provide valuable evidence, the team said.

The methods the researchers applied can be used to examine the remaining burials at Nong Ratchawat and at other sites, they said. “Dental calculus analysis can reveal behaviors that leave no traditional archaeological traces, potentially revolutionizing our understanding of ancient lifeways and human-plant relationships,” Tushingham said. “It could open new windows into the deep history of human cultural practices.”

“Understanding the cultural context of traditional plant use is a larger theme we want to amplify—psychoactive, medicinal, and ceremonial plants are often dismissed as drugs, but they represent millennia of cultural knowledge, spiritual practice, and community identity,” Moonkham concluded. “Archaeological evidence can inform contemporary discussions by honoring the deep cultural heritage behind these practices.”

Wednesday, July 30, 2025

Is this what 2,500-year-old honey looks like?

 

Peer-Reviewed Publication

American Chemical Society

Image: This bronze jar on display at the Ashmolean Museum contained a mysterious substance (shown in the foreground) that is very likely ancient honey. Credit: Adapted from the Journal of the American Chemical Society 2025, DOI: 10.1021/jacs.5c04888

Decades ago, archaeologists discovered a sticky substance in a bronze jar in an ancient Greek shrine. And until recently, the identity of the residue was still murky: was it a mixture of fats, oils and beeswax, or something else? Researchers publishing in the Journal of the American Chemical Society have reanalyzed samples of the residue using modern analytical techniques and determined that it’s likely the remains of ancient honey – a conclusion previous analyses rejected.

Honey was an important substance in the ancient world, sometimes left in shrines as offerings to the gods or buried alongside the dead. In 1954, one such underground Greek shrine dating to around 520 BCE was discovered in Paestum, Italy — about an hour and a half’s drive from Pompeii. Inside were several bronze jars containing a sticky residue. At the time, archaeologists assumed it was honey, originally offered as honeycombs. Then, three different teams over the course of 30 years analyzed the residue but failed to confirm the presence of honey, instead concluding that the jars contained some sort of animal or vegetable fat contaminated with pollen and insect parts. But when the residue came to the Ashmolean Museum for an exhibition, a team of researchers led by Luciana da Costa Carvalho and James McCullagh had a chance to reexamine the mystery substance and collect new scientific evidence.

The researchers analyzed samples of the residue using several modern analytical techniques to determine its molecular makeup. They found that:

  • The ancient residue had a chemical fingerprint nearly identical to that of modern beeswax and modern honey, with a higher acidity level that was consistent with changes after long-term storage.
  • The residue’s chemical composition was more complex than that of the heat-degraded beeswax, suggesting the presence of honey or other substances.
  • Where the residue had touched the bronze jar, degraded sugar mixed with copper was found.
  • Hexose sugars, a common group of sugars found in honey, were detected in higher concentrations in the ancient residue than in modern beeswax.
  • Royal jelly proteins (known to be secreted by the western honeybee) were also identified in the residue.

These results suggest that the ancient substance is what is left of ancient honey. However, the researchers can’t exclude the possibility that other bee products may also be present.

"Ancient residues aren’t just traces of what people ate or offered to the gods — they are complex chemical ecosystems,” explains da Costa Carvalho. “Studying them reveals how those substances changed over time, opening the door to future work on ancient microbial activity and its possible applications.”

The authors acknowledge no external funding for this work.

The paper’s abstract will be available on July 30, 2025 at 8 a.m. Eastern time here: http://pubs.acs.org/doi/abs/10.1021/jacs.5c04888

###

Tuesday, July 29, 2025

How much time did our ancestors spend up trees?

A study on savannah-living chimpanzees suggests the need to move safely on thin tree branches could explain why early hominins that could walk upright kept their tree-climbing adaptations


It’s hard to tell when — and why — our ancestors got down from trees and started walking on two legs. Many early hominins capable of bipedal walking were also well-adapted for climbing, and we lack fossil evidence from a key period when climate change turned forests into open, dry woodland called savannah-mosaic, which might have pushed hominins onto the ground. Now a study on modern chimpanzees could help fill in the gaps. Scientists observing chimpanzees in the Issa Valley, Tanzania have shown that despite living in a savannah-mosaic, they frequently climb trees for valuable food — potentially explaining why early hominins kept their arboreal adaptations.  

“For decades it was assumed that bipedalism arose because we came down from the trees and needed to walk across an open savannah,” said Dr Rhianna Drummond-Clarke of the Max Planck Institute for Evolutionary Anthropology, lead author of the article in Frontiers in Ecology and Evolution. “Here we show that safely and effectively navigating the canopy can remain very important for a large, semi-arboreal ape, even in open habitat. Adaptations to arboreal, rather than terrestrial, living may have been key in shaping the early evolution of the human lineage.” 

Habitats and hunger 

Issa Valley is divided between a small amount of thick forest surrounding riverbanks and open woodland. The chimpanzees forage more in the woodland during the dry season, when it offers more food. Their habitat and diet are comparable to those of some early hominins, which means their behavior might offer insights into those extinct hominins’ lives.  

“Our previous research found that, compared to chimpanzees living in forests, Issa Valley chimpanzees spent just as much time moving in the trees,” said Drummond-Clarke. “We wanted to test if something about how they foraged could explain their unexpectedly high arboreality. Savannah-mosaics are characterized by more sparsely distributed trees, so we hypothesized that adapting behavior to forage efficiently in a tree would be especially beneficial when the next tree is further away.” 

Researchers monitored the adults of the Issa community during the dry season, watching how they foraged in trees and what they ate there. The size, height, and shape of the trees were recorded, as well as the number and size of branches.  

Issa chimpanzees mostly ate fruit, followed by leaves and flowers — foods found at the ends of branches, so the chimpanzees needed to be capable climbers to reach them safely. They spent longer foraging in trees that were larger and offered more food. The longest foraging sessions, and the most specialized behaviors to navigate thinner terminal branches, were seen in trees with large open crowns offering lots of food: perhaps abundant food justified the extra time and effort. A similar trade-off between the nutritional benefits of specific foods and the effort of acquiring them could also explain why chimpanzees spent longer in trees while eating nutritionally rich, hard-to-access seeds.

Fast food 

Because they are relatively large, chimpanzees move within trees not by climbing on thin branches but by hanging under them, or standing upright and holding on to nearby branches with their hands. Although these ‘safe’ behaviors are traditionally associated with foraging in dense forest, these findings show they’re also important for chimpanzees foraging in a savannah-mosaic. 

“We suggest our bipedal gait continued to evolve in the trees even after the shift to an open habitat,” said Drummond-Clarke. “Observational studies of great apes demonstrate they can walk on the ground for a few steps, but most often use bipedalism in the trees. It’s logical that our early hominin relatives also engaged in this kind of bipedalism, where they can hold onto branches for extra balance. If Issa Valley chimpanzees can be considered suitable models, suspensory and bipedal behaviors were likely vital for a large-bodied, fruit-eating, semi-terrestrial hominin to survive in an open habitat.”  

However, the researchers say that we need more fossil evidence and more studies on different aspects of chimpanzee foraging to test this idea. 

“This study only looked at foraging behavior during the dry season,” cautioned Drummond-Clarke. “It would be interesting to investigate if these patterns remain during the wet season. Analyses of the nutritional value of foods and overall food availability are also needed to test our hypothesis that a strategy of foraging for longer in large trees on certain foods is energy-efficient in an open habitat.  

“Importantly, this is also only one community of chimpanzees. Future studies of other chimpanzees living in such dry, open habitats will be vital to see if these patterns are truly a savannah-mosaic signal or unique to Issa.” 

An ancient blade manufacturing workshop was uncovered in Kiryat Gat – the first ever discovered in southern Israel

In an archaeological excavation conducted by the Israel Antiquities Authority near Kiryat Gat, an advanced flint blade production workshop dating back approximately 5,500 years was uncovered – the first ever discovered in southern Israel, and evidence of specialization in a unique technology. The discovery, which testifies to technological sophistication already at the onset of the Early Bronze Age, includes long flint blades and, a rare occurrence, the large flint cores from which they were struck.

These findings were unearthed in a large Israel Antiquities Authority salvage excavation at the Naẖal Qomem site (aka Gat-Govrin, Zeita), funded by the Israel Lands Authority in advance of constructing a new city neighborhood - Carmei Gat. This summer, these rare finds will enjoy their first public display, at the Jay and Jeanie Schottenstein National Campus for the Archaeology of Israel in Jerusalem.

According to Dr. Martin David Pasternak, Shira Lifshitz and Dr. Nathan Ben-Ari, Excavation Directors on behalf of the Israel Antiquities Authority, “This is the first time such a workshop has been discovered in southern Israel. Although evidence of the Canaanite blade industry has been discovered in the country’s center and north, there are almost no known workshops for their systematic production. The discovery of a sophisticated workshop indicates a society with a complex social and economic structure already at the beginning of the Early Bronze Age. This is an important find in that it deepens the understanding of both the beginnings of urbanization and of professional specialization in the Land of Israel – phenomena that led to the establishment of large settlements and that catalyzed the creation of new social structures.”

According to Israel Antiquities Authority prehistorians Dr. Jacob Vardi and Dudu Biton, “An advanced industry was revealed at the site, requiring an extremely high level of expertise. Only exceptional individuals knew how to produce the Canaanite blades. This is clear evidence that already at the onset of the Bronze Age, the local society here was organized and complex, and had professional specialization.”

“This archaeological site we excavated was used as an active settlement continuously for hundreds of years – from the Chalcolithic period through to the Early Bronze Age,” added the excavation directors. “The excavation shows that the settlement covered a much larger area than previous estimates – over half a kilometer – and it includes hundreds of underground pits, some lined with mud bricks. These pits served a variety of purposes: storage, dwellings, production crafts and cultic/social rituals.”

The most impressive findings discovered at the site are large flint cores, from which extremely sharp, uniformly shaped blades were produced. The blades themselves were used as knives for cutting and butchering, and as harvesting tools, like sickle blades. The production technology was extremely advanced, and included the use of a crane-like lever device to exert precise pressure on the flint.

In the Early Bronze Age, humans used tools made from natural raw materials: flint, bone, stone and ceramics. However, in this period the Canaanite blades were the main cutting tools. “This is a sophisticated industry – not only because of the tools themselves, but also because of what is not found,” says Dr. Vardi. “The waste fragments, the debitage, were not scattered outside the site – perhaps to better protect and preserve the professional knowledge within the group of experts. Today, we understand that this site served as a center from which Canaanite blades were distributed across broad regions of the Levant.”

The finds are presented to the public for the first time this summer, for those attending tours held at the Jay and Jeanie Schottenstein National Campus for the Archaeology of Israel in Jerusalem. Details are available on the Israel Antiquities Authority website.


The flint blades created in the ancient workshop. Photo: Emil Aladjem, IAA

A flint core from which the blades were produced. Photo: Emil Aladjem, IAA

Tuesday, July 22, 2025

A ‘millet mystery’ in ancient Japan reveals a complex picture of agricultural adoption


New research into ancient Japanese rice farming suggests that significant technological development does not always mean ‘abandonment’ of cultural practices - particularly culinary traditions.

Archaeological evidence largely shows that the arrival of farming in various cultures around the world transformed society, but new evidence from cooking pot residue in prehistoric Japan shows that culinary traditions were largely unaffected despite the uptake of farmed produce.

The researchers highlight that this historical perspective shows that not all technological developments transform society at the same pace, and that some cultural practices hold steady, in some cases for centuries.

The arrival of rice farming, imported from the Korean Peninsula, marked a turning point for agriculture in Japan approximately 3,000 years ago. But while rice would eventually transform society, new evidence shows that its sister crop - millet - was largely left behind despite its popularity in Korean cooking.

Archaeologists from the University of York, in collaboration with the University of Cambridge, and the Nara National Research Institute for Cultural Properties in Japan, studied residues found in ancient pottery and charred plant remains from this period and found that although both rice and millet were introduced to Japan together, likely carried across the sea by groups from southern Korea, they did not necessarily transform society. 

Dr Jasmine Lundy, from the University of York’s Department of Archaeology, said: “Organic residue analysis has been crucial to our investigation into the earliest impacts of rice and millet agriculture. It allows us to capture how these crops were actually used, offering a direct window into the culinary practices and crop interactions of early Japanese society."

Seed impressions on Final Jomon and Yayoi pottery confirm that both crops were present in early farming settlements in Northern Kyushu, but while millet was a dietary staple in Korea, especially during the Bronze Age, it barely shows up in early Japanese diets.

Professor Oliver Craig, from the University of York’s Department of Archaeology, said: “The absence of millet from Japanese food residues and human bones was a surprise to us, given that we knew both rice and millet had been introduced at this time.

“We know from isotope analysis of fats and oils in cooking pots that millet was a major part of the Korean diet, and continues to be eaten to this day, but it seemed that it made no impact on early Japanese cuisine.

“Environmental factors could be ruled out because we know that millet grows just as well in Japan as it does in Korea, so there was something else going on that provided a barrier to this crop being adopted in Japanese cooking.”

The team found that fish dishes, which were already a well established culinary tradition in the country, continued to be the main source of food, despite the arrival of two important food crops.

Dr Shinya Shoda, from the Nara National Research Institute for Cultural Properties and honorary researcher at the University of York, said: “There is evidence of Korean-style pottery and farming tools in Japan, but this didn’t line up with changes to the way people cooked and ate. Yayoi pots were still used to cook fish and other wild foods, and few show signs of being dedicated to rice-cooking.”

Whilst the findings may have been unexpected, given the uptake of farming in Japan’s closest neighbours, there are other examples where technological development has not caused rapid change. In Southern Scandinavia, for example, hunting, fishing and gathering of wild foods continued for many years after the introduction of farming, whereas in Britain foraging was quickly abandoned in favour of agriculture.

Professor Craig added: “Whilst we see changes in pottery styles and other forms of material culture in Japan with the arrival of rice and millet, food culture remains remarkably consistent. And whilst Japan’s culinary history eventually catches up with the ‘rice boom’ that we see in Korea, it may have taken some time to impact everyday practices, suggesting food culture is deeply embedded and can survive major technological shifts.”

The research is published in the journal PNAS and forms part of the ENCOUNTER Project, led by Dr Enrico Crema, at the University of Cambridge. He said: “These latest findings add to our body of work in the ENCOUNTER project, which has so far shown the diffusion rates of farming within the Japanese archipelago, the demographic impact of farming, and how different cultural traits might have been conditioned by marriage practices.”

Thursday, July 17, 2025

Interbreeding with Neanderthals may be responsible for modern-day brain condition


A new Simon Fraser University-led study reveals interbreeding between humans and their ancient cousins, Neanderthals, as the likely origin of a neurological condition estimated to impact up to one per cent of people today.

The study, published this week in the journal Evolution, Medicine, and Public Health, was led by Kimberly Plomp, a recent postdoctoral fellow at SFU, and Mark Collard, the Canada Research Chair in Human Evolutionary Studies and a professor in the Department of Archaeology.

Their findings suggest that Chiari Malformation Type 1, a serious and sometimes fatal neurological condition, may be linked to Neanderthal genes that entered the human gene pool through interbreeding tens of thousands of years ago.

Chiari Malformation Type 1 occurs when the back of a human’s skull is too small to properly hold the brain, causing part of the base of the brain to herniate out of the skull and into the spinal canal. This can cause the herniated part of the brain to be pinched, leading to symptoms such as headaches, neck pain, dizziness and, in severe cases, death if too much of the brain herniates out.

“In medicine, as in other sciences, clarifying causal chains is important. The clearer one can be about the chain of causation resulting in a medical condition, the more likely one is to be able to manage, or even resolve, the condition,” says Collard. “The hypothesis needs to be tested further, but our study may mean we are one step closer to obtaining a clear understanding of the causal chain that gives rise to Chiari Malformation Type 1.”

The knowledge that our ancient human ancestors successfully mated with Neanderthals (and some other hominin species) is not new. The impacts of such interbreeding, however, are still coming to light.

In 2010, scientists discovered genetic evidence that members of our species interbred with Neanderthals tens of thousands of years ago. It is now clear that living non-Africans have two to five per cent Neanderthal DNA that can be traced to these interbreeding events. We have also learned that genes from other extinct hominin species exist in the modern human gene pool due to interbreeding in the distant past.

The idea that Chiari Malformation Type 1 might be the result of other hominin genes entering the human gene pool through interbreeding was initially proposed by Yvens Barbosa Fernandes of Brazil’s State University of Campinas.

Because the modern human skull differs in several important ways from those of other hominins, Fernandes reasoned, having a skull that is influenced by the genes of other hominin species may be one of the factors that causes the malformation.

Plomp, Collard, and their colleagues put this theory to the test, using modern medical imaging technology and advanced statistical shape analysis techniques to compare 3D models of skulls from living humans, both with and without Chiari Malformation Type 1, to fossil hominins, including ancient Homo sapiens, Neanderthals, Homo heidelbergensis, and Homo erectus.

The team found that people with Chiari malformation share more shape traits in common with Neanderthals than do people without the malformation. Interestingly, all other fossil skulls were closer in shape to humans without Chiari Malformation Type 1, indicating that the findings are not due to shared ancestry, but instead support the hypothesis that some people today have Neanderthal genes that affect their skull shape, and this skull shape results in a mismatch between the shape of the skull and shape of the modern human brain.
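As a rough illustration of how landmark-based shape comparison works (generic geometric morphometrics, not the authors' actual pipeline), the Python sketch below aligns two 3D landmark configurations with ordinary Procrustes analysis and reports the residual shape distance: the smaller the distance, the more similar the shapes. The landmark arrays are random placeholders standing in for real skull data.

```python
# Ordinary Procrustes alignment of two landmark configurations.
import numpy as np

def procrustes_distance(X, Y):
    """Shape distance between two (n_landmarks x 3) arrays after
    removing translation, scale, and rotation."""
    Xc = X - X.mean(axis=0)          # remove translation
    Yc = Y - Y.mean(axis=0)
    Xc = Xc / np.linalg.norm(Xc)     # scale to unit centroid size
    Yc = Yc / np.linalg.norm(Yc)
    # Best-fit rotation of Yc onto Xc (orthogonal Procrustes via SVD)
    U, _, Vt = np.linalg.svd(Yc.T @ Xc)
    return float(np.linalg.norm(Xc - Yc @ (U @ Vt)))

rng = np.random.default_rng(0)
skull_a = rng.normal(size=(30, 3))                        # placeholder landmarks
skull_b = skull_a + rng.normal(scale=0.05, size=(30, 3))  # similar shape
skull_c = rng.normal(size=(30, 3))                        # unrelated shape

print(procrustes_distance(skull_a, skull_b))  # small distance
print(procrustes_distance(skull_a, skull_c))  # larger distance
```

In the study's framing, the signal of interest is that skulls of people with the malformation sit closer to the Neanderthal configuration in shape space, while other fossil skulls sit closer to unaffected modern humans.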

It’s this mismatch that results in the brain not having enough room in the skull, and thus, the brain is pushed out the only hole available, the spinal canal.

With different populations around the world having varying levels of Neanderthal DNA, the study predicts that certain populations — including those from Europe and Asia — could be at a higher risk of Chiari Malformation Type 1 than others, though further research is required to confirm this.

“Studying archaeology and human evolution is not just interesting. It also has the potential to help us understand and, in some cases, cope with problems in the present,” says Collard. “In this case, we’ve used fossils to help us shed light on a medical condition, but there are a lot of other contemporary problems that archaeological and palaeontological data can help us understand better.” 

Communities attending feast in ancient Iran gifted boars sourced from distant lands

 Magnets and shot glasses serve as fun holiday souvenirs, but certain foods synonymous with a country’s identity can make for extra meaningful gifts for friends and loved ones; think French cheese, Dutch Stroopwafels and Canadian maple syrup.  

According to new research, communities that lived in western Iran about 11,000 years ago during the Early Neolithic period took a similar approach when it came to gift-giving. 

They invested significant effort to bring wild boars hunted in dispersed parts of the landscape as gifts to be eaten at a communal celebration that took place at what is now the archaeological site of Asiab in the Zagros Mountains. 

The findings, from an international team of researchers including scientists from The Australian National University (ANU), suggest this practice of offering gifts that carry geographical symbolism can be traced back to prehistory.

“Food and long-standing culinary traditions form an integral component of cultures all over the globe. It is for this reason holidays, festivals, and other socially meaningful events commonly involve food. For example, we cannot imagine Christmas without the Christmas meal, Eid without the food gifts, or Passover without matzo ball soup,” Dr Petra Vaiglova from ANU said. 

The scientists unearthed the skulls of 19 wild boars that were neatly packed and sealed inside a pit within a round building at the Asiab site. Butchery marks on the animals’ skulls suggest they were used for feasting, but until now scientists were unsure where these boars came from. 

Dr Vaiglova and the international research team examined the tooth enamel of five of these wild boars. The researchers analysed microscopic growth patterns and chemical signatures inside the enamel that offered “tell-tale” signs indicating that at least some of the boars used for the feast were not from the area where the gathering took place.   

“Just like trees and their annual growth rings, teeth deposit visible layers of enamel and dentine during growth that we can count under the microscope. This is the first time these growth layers have been used to guide geochemical analysis of animal teeth to answer questions about human-animal interactions,” Dr Vaiglova said. 

“Rainfall and bedrock have distinct isotopic values in different geographical locations. These isotopic values get incorporated into animal tissues through drinking water and food. Measuring the isotopic values of tooth enamel allowed us to assess whether all the animals came from the same part of the region or whether they originated from more dispersed locations. 

“Because the values we measured across the five teeth showed a high amount of variability, it is unlikely that all the animals originated from the same location. It is possible that some of them originated roughly 70 kilometres away from the site where the feast took place.”  
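The logic of that provenance argument can be put in a few lines of Python: if enamel values for an animal fall outside an assumed local bioavailable baseline, or if values vary widely between animals, a single shared local origin becomes unlikely. Everything below is an illustrative placeholder, not data from the study.

```python
# Flag animals whose 87Sr/86Sr enamel values fall outside a local baseline.
LOCAL_RANGE = (0.7078, 0.7084)   # hypothetical local bioavailable range

boar_enamel = {                  # hypothetical sequential measurements per tooth
    "boar_1": [0.7079, 0.7080, 0.7081],
    "boar_2": [0.7092, 0.7094, 0.7093],
    "boar_3": [0.7080, 0.7085, 0.7088],
}

lo, hi = LOCAL_RANGE
for boar, ratios in boar_enamel.items():
    outside = [r for r in ratios if not lo <= r <= hi]
    verdict = "likely non-local" if outside else "consistent with local origin"
    print(f"{boar}: {verdict} ({len(outside)}/{len(ratios)} values outside baseline)")
```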

The researchers said it is surprising that these hunters went to such effort to kill boars in their home regions and transport them over difficult mountainous terrain, a journey that likely would have taken several days, especially considering boars were not the most hunted animal during the Early Neolithic period.

Dr Vaiglova said communities living in the Zagros Mountains at this time had a “very diverse hunting strategy” and were hunting lots of different animal species. 

“Boars are especially aggressive and so displaying them as hunting trophies or presenting them at a feast carries with it a certain element of significance. Bringing these animals from distant locations would have undoubtedly helped celebrate the importance of the social event that took place at Asiab,” she said.  

“What is special about the feast at Asiab is not only its early date and that it brought together people from across the wider region, but also the fact that people who participated in this feast invested substantial amounts of effort to ensure that their contributions involved an element of geographic symbolism. This feast also took place at a time that pre-dates agriculture and farming practices. 

“This was clearly a very meaningful event and the fact that people put in so much effort to transport the boars over such challenging terrain provides us with a glimpse of how old the tradition of bringing geographically meaningful gifts to social events really is. 

“These people were clearly the ultimate dinner party guests.”  

The research is published in Communications Earth & Environment, a Nature Portfolio journal, and involved scientists from Australia, Germany, Denmark and Iran.

Neanderthals at two nearby caves butchered the same prey in different ways

 Did Neanderthals have family recipes? A new study suggests that two groups of Neanderthals living in the caves of Amud and Kebara in northern Israel butchered their food in strikingly different ways, despite living close by and using similar tools and resources. Scientists think they might have been passing down different food preparation practices. 

“The subtle differences in cut-mark patterns between Amud and Kebara may reflect local traditions of animal carcass processing,” said Anaëlle Jallon, PhD candidate at the Hebrew University of Jerusalem and lead author of the article in Frontiers in Environmental Archaeology. “Even though Neanderthals at these two sites shared similar living conditions and faced comparable challenges, they seem to have developed distinct butchery strategies, possibly passed down through social learning and cultural traditions. 

“These two sites give us a unique opportunity to explore whether Neanderthal butchery techniques were standardized,” explained Jallon. “If butchery techniques varied between sites or time periods, this would imply that factors such as cultural traditions, cooking preferences, or social organization influenced even subsistence-related activities such as butchering.” 

Written in the bones 

Amud and Kebara are close to each other: only 70 kilometers apart. Neanderthals occupied both caves during the winters between 50,000 and 60,000 years ago, leaving behind burials, stone tools, hearths, and food remains. Both groups used the same flint tools and relied on the same prey for their diet — mostly gazelles and fallow deer. But there are some subtle differences between the two. The Neanderthals living at Kebara seem to have hunted more large prey than those at Amud, and they also seem to have carried more large kills home to butcher them in the cave rather than at the site of the kill.

At Amud, 40% of the animal bones are burned and most are fragmented. This could be caused by deliberate actions like cooking or by later accidental damage. At Kebara, only 9% of the bones are burned; they are less fragmented and are thought to have been cooked. The bones at Amud also seem to have undergone less carnivore damage than those found at Kebara.

To investigate the differences between food preparation at Kebara and at Amud, the scientists selected a sample of cut-marked bones from contemporaneous layers at the two sites. They examined these macroscopically and microscopically, recording the cut-marks’ different characteristics. Similar patterns of cut-marks might suggest there were no differences in butchery practices, while different patterns might indicate distinct cultural traditions. 
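One common way to formalize such a comparison (a generic sketch, not necessarily the statistics used in the paper) is to test whether a quantitative cut-mark feature differs between the two assemblages. The Python sketch below does this with a nonparametric test on hypothetical per-fragment counts.

```python
# Compare a cut-mark metric between two sites with a Mann-Whitney U test.
from scipy.stats import mannwhitneyu

amud_marks_per_fragment = [4, 6, 5, 7, 6, 8, 5]    # hypothetical counts
kebara_marks_per_fragment = [2, 3, 4, 3, 2, 4, 3]  # hypothetical counts

stat, p = mannwhitneyu(amud_marks_per_fragment,
                       kebara_marks_per_fragment,
                       alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")
# A small p-value indicates the sites' cut-mark distributions differ beyond
# chance, consistent with distinct butchery practices.
```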

The cut-marks were clear and intact, largely unaffected by later damage caused by carnivores or the drying out of the bones. The profiles, angles, and surface widths of these cuts were similar, likely due to the two groups’ similar toolkits. However, the cut-marks found at Amud were more densely packed and less linear in shape than those at Kebara.  

Cooking from scratch 

The researchers considered several possible explanations for this pattern. It could have been driven by the demands of butchering different prey species or different types of bones — most of the bones at Amud, but not Kebara, are long bones — but when they only looked at the long bones of small ungulates found at both Amud and Kebara, the same differences showed up in the data. Experimental archaeology also suggests this pattern couldn’t be accounted for by less skilled butchers or by butchering more intensively to get as much food as possible. The different patterns of cut-marks are best explained by deliberate butchery choices made by each group.  

One possible explanation is that the Neanderthals at Amud were treating meat differently before butchering it: possibly drying their meat or letting it decompose, like modern-day butchers hanging meat before cooking. Decaying meat is harder to process, which would account for the greater intensity and less linear form of the cut-marks. A second possibility is that different group organization — for example, the number of butchers who worked on a given kill — in the two communities of Neanderthals played a role. 

However, more research will be needed to investigate these possibilities. 

“There are some limitations to consider,” said Jallon. “The bone fragments are sometimes too small to provide a complete picture of the butchery marks left on the carcass. While we have made efforts to correct for biases caused by fragmentation, this may limit our ability to fully interpret the data. Future studies, including more experimental work and comparative analyses, will be crucial for addressing these uncertainties — and maybe one day reconstructing Neanderthals’ recipes.” 

Thursday, July 10, 2025

Differing uses for local and imported donkeys in Early Bronze Age Israel

Archaeological excavations of an Early Bronze Age III (c. 2900–2600/2550 BCE) domestic neighborhood at the site of Tell eṣ-Ṣâfi/Gath, Israel, uncovered four complete skeletons of young female donkeys that were buried immediately below house floors as ritual foundation deposits. Multi-isotope analyses (carbon, oxygen and strontium) of their teeth document that each of the donkeys was born and raised in Egypt before being brought to Tell eṣ-Ṣâfi/Gath, where they were slaughtered and buried beneath house floors in a non-elite domestic neighborhood.

In contrast, isotopic analysis of teeth from a single isolated donkey mandible, along with additional sheep and goat teeth that showed evidence of use for food and were not associated with a complete burial, identifies that donkey as born and raised among local livestock in the vicinity of Tell eṣ-Ṣâfi/Gath. The intentional burial of specifically imported and highly valued young jennies reveals what appears to be a ritually charged practice associated with constructing domestic residences at the site.


Saturday, July 5, 2025

Gantangqing site in southwest China yields 300,000-year-old wooden tools

 

New discoveries from the Pleistocene-age Gantangqing site in southwestern China reveal a diverse collection of wooden tools dated from ~361,000 to 250,000 years ago, marking the earliest known evidence of complex wooden tool technology in East Asia. The findings reveal that the Middle Pleistocene humans who used these tools crafted the wooden implements not for hunting, but for digging and processing plants. Although early humans have worked with wood for over a million years, wooden artifacts are quite rare in the archaeological record, particularly during the Early and Middle Pleistocene. Most ancient wooden tools have been found in Africa and western Eurasia, with notable examples that include spears and throwing sticks from Germany and the UK, dating back 300,000 to 400,000 years, as well as structural elements like interlocking logs from Zambia and wooden planks and digging sticks from sites in Israel and Italy. While the long-standing Bamboo Hypothesis argues that early East Asian populations relied on bamboo for toolmaking, archaeological evidence for organic material-based tools from the region is scarce.

 

Here, Jian-Hui Liu and colleagues present new findings from the Gantangqing site in southwestern China, which has yielded a wide range of artifacts. Among these are 35 wooden artifacts that exhibit clear evidence of intentional shaping and use, including signs of carving, smoothing, and wear, suggesting that they were purposefully crafted by hominins. These tools, most of which were fashioned from pine, range from large two-handed digging sticks to smaller hand-held implements, and even include hook-like tools potentially used for cutting plant roots. According to Liu et al., compared to other well-known contemporaneous wooden tool sites in Europe, which are generally characterized by medium-sized hunting gear, Gantangqing stands out for its broader and more diverse array of small, hand-held tools designed primarily for digging up and processing plants. The sophistication of these wooden tools underscores the importance of organic artifacts in interpreting early human behavior, particularly in regions where stone tools alone suggest a more “primitive” technological landscape, say the authors.