Monday, January 31, 2022

Researchers discover locations of ancient Maya sacred groves of cacao trees

For as much as modern society worships chocolate, cacao — the plant chocolate comes from — was held to be even more divine by the ancient Maya. The Maya considered cacao beans to be a gift from the gods and even used them as currency because of their value.

As such, cacao bean production was carefully controlled by the Maya leaders of northern Yucatan, with cacao trees only grown in sacred groves. But no modern researcher has ever been able to pinpoint where these ancient sacred groves were located — until now.

Researchers at Brigham Young University, including professor emeritus Richard Terry and graduate students Bryce Brown and Christopher Balzotti, worked closely with archaeologists from the U.S. and Mexico to identify locations the Maya used to provide the perfect blend of humidity, calm and shade required by cacao trees. While the drier climate of the Yucatan peninsula is inhospitable to cacao growth, the team realized the vast array of sinkholes common to the peninsula have microclimates with just the right conditions.

As detailed in a study newly published in the Journal of Archaeological Science: Reports, the team conducted soil analyses on 11 of those sinkholes and found that the soil of nine of them contained evidence of theobromine and caffeine — biomarkers that, in combination, are unique to cacao. Archaeologists also found evidence of ancient ceremonial rituals — such as staircase ramps for processions, stone carvings, altars and offerings like jade and ceramics (including tiny ceramic cacao pods) — in several sinkholes.

“We looked for theobromine for several years and found cacao in some places we didn’t expect,” said Terry, who recently retired from BYU. “We were also amazed to see the ceremonial artifacts. My students rappelled into one of these sinkholes and said, ‘Wow! There is a structure in here!’ It was a staircase that filled one-third of the sinkhole with stone.”

To extract and analyze the sinkhole soil for cacao biomarkers — specifically theobromine and caffeine — the team developed a new method of soil extraction. This involved drying the soil samples and passing them through a sieve, covering them with hot water, centrifuging them, passing the extracts through extraction disks, and analyzing the extracts by mass spectrometry. To increase the sensitivity of their testing, the research team compared the results of the soil samples to seven control samples with no history of exposure to the biomarkers.
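The comparison against controls can be sketched in a few lines. This is a hypothetical illustration, not the team's actual pipeline: the readings, thresholding rule and numbers below are invented, and only the idea of flagging samples whose theobromine and caffeine signals both exceed a control-derived baseline comes from the article.

```python
# Hypothetical sketch: flag soils whose biomarker signals exceed a
# baseline established from biomarker-free control samples.
from statistics import mean, stdev

def detection_threshold(controls, k=3.0):
    """Baseline from control soils: mean plus k sample standard deviations."""
    return mean(controls) + k * stdev(controls)

def flags_cacao(sample, theo_threshold, caff_threshold):
    """Both biomarkers must exceed their thresholds, since only in
    combination are theobromine and caffeine specific to cacao."""
    return (sample["theobromine"] > theo_threshold
            and sample["caffeine"] > caff_threshold)

# Invented mass-spectrometry intensities for seven control samples
controls_theo = [0.8, 1.1, 0.9, 1.0, 1.2, 0.7, 1.0]
controls_caff = [0.5, 0.6, 0.4, 0.5, 0.7, 0.5, 0.6]
theo_t = detection_threshold(controls_theo)
caff_t = detection_threshold(controls_caff)

sinkhole = {"theobromine": 4.2, "caffeine": 2.9}
print(flags_cacao(sinkhole, theo_t, caff_t))  # True for this invented sample
```

Requiring both signals mirrors the article's point that the two compounds are diagnostic of cacao only when found together.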

The findings of the BYU study indicate that cacao groves played an important role in the rituals and trade routes of the ancient Maya, impacting the entirety of the Mesoamerican economy. A 70-mile Maya “highway” in the area that was the main artery for trade passes near hundreds of sinkholes, so it is likely that the leaders who commissioned the highway development also controlled cacao production. The evidence of cacao cultivation alongside archaeological findings also supports the idea that cacao was important in the ideological move from a maize god to a sun god.

In one sinkhole near Coba, Mexico, a village 45 minutes from modern-day Tulum, the research team found the arm and bracelet of a figurine attached to an incense jar and several ceramic modeled cacao pods. They also found remnant cacao trees growing there, making it quite possible that this sinkhole, named “Dzadz Ion,” was the location of a sacred cacao grove during the Late Postclassic period (about A.D. 1000 to 1400).

“Now we have these links between religious structures and the religious crops grown in these sinkholes,” Terry said. “Knowing that the cacao beans were used as currency, it means the sinkholes were a place where the money could be grown and controlled. This new understanding creates a rich historical narrative of a highly charged Maya landscape with economic, political and spiritual value.”

Researchers for the project also came from the University of California, Riverside, the University of Miami, the State University of New York, Kent State University, the Universidad Nacional Autónoma de México, the Instituto Nacional de Antropología e Historia, and the Cultural Heritage and Archaeology in the Maya Area institution.

New study has the potential to affect fundamental understanding of human evolution


A new study by a team of researchers from Israel and Ghana has brought the first evidence of nonrandom mutation in human genes, challenging a core assumption at the heart of evolutionary theory by showing a long-term directional mutational response to environmental pressure. Using a novel method, researchers led by Professor Adi Livnat from the University of Haifa showed that the rate of generation of the HbS mutation, which protects against malaria, is higher in people from Africa, where malaria is endemic, than in people from Europe, where it is not.

“For over a century, the leading theory of evolution has been based on random mutations. The results show that the HbS mutation is not generated at random but instead originates preferentially in the gene and in the population where it is of adaptive significance,” said Prof. Livnat. Unlike other findings on mutation origination, this mutation-specific response to a specific environmental pressure cannot be explained by traditional theories. “We hypothesize that evolution is influenced by two sources of information: external information that is natural selection, and internal information that is accumulated in the genome through the generations and impacts the origination of mutations,” said Livnat.

Ever since Darwin we have known that life arose by evolution. But how, exactly, does evolution – in all its grandeur, mystery and complexity – happen? For the past century scientists have assumed that mutations occur by accident to the genome and that natural selection, or the survival of the fittest, favors beneficial accidents. The accumulation of these presumed genetic accidents under natural selection over the millennia leads in turn to adaptations, from the hawk’s sharp eye to the human cardiovascular system.

While widely held in the scientific community, this view has always left open fundamental questions, such as the problem of complexity. Can the sequential accumulation of small random changes, each beneficial on its own, lead, within the timespan available, to the evolution of such astonishingly complex and impressive adaptations as we see around us in nature, such as eyes, brains or wings, where complementary parts interweave into a complex whole? However, the only alternative at the fundamental level conceived of until now consisted of variants of Lamarckism – the idea that organisms can somehow respond directly to their immediate environments with beneficial genetic change. Since Lamarckism has not worked in general, the notion of random mutation remained the prevailing view.

In order to distinguish between the random-mutation-and-natural-selection explanation and the possibility that nonrandom mutation is important, Prof. Livnat and his lab manager, Dr. Daniel Melamed, developed a new method for detecting de novo mutations – mutations that arise “out of the blue” in offspring without being inherited from either parent. Setting a new accuracy record, their method allowed something not previously possible: counting de novo mutations at particular points of interest in the genome.

They then applied their method to examine the de novo emergence of the human hemoglobin S (HbS) mutation, perhaps the best-known point mutation in biology and evolution. HbS provides protection against malaria for people with one copy but causes sickle-cell anemia in those with two. Malaria itself, a vector-borne blood disease, has arguably been the strongest selection pressure acting on humans in the last 10,000 years, often causing more than a million deaths per year in Africa in the recent past. HbS is also used as a central example of random mutation and natural selection in evolution: it has long been assumed to have arisen accidentally in an individual in sub-Saharan Africa and then spread inside Africa via natural selection until its malaria-protective benefits were balanced out by its sickle-cell anemia costs.

By examining the de novo origination of HbS, Livnat was able to disentangle for the first time whether the malaria-protective mutation arose randomly and spread in Africa only because of selection pressure, or whether it actually originates de novo more frequently in sub-Saharan Africans – a group that has been subject to intense malarial selection pressure for many generations. If mutation is random, then the HbS mutation should be equally likely to emerge in both geographical groups. However, if mutation is nonrandom, then perhaps it would actually emerge more frequently in Africans. “There are at least two possible reasons why such a question had not been asked before,” explains Prof. Livnat. “First, it had been assumed that mutation is random. Second, even if one had wanted to ask such a question, it would not have been possible with previous methods.”
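As a generic illustration of the kind of comparison being described (not the study's actual statistics, and with invented counts), one can ask how surprising an asymmetric split of de novo mutation counts between two groups would be under the null hypothesis of equal rates. Conditional on the total count, the split follows a binomial distribution:

```python
# Sketch of a conditional binomial rate comparison between two groups.
from math import comb

def binomial_rate_test(c1, n1, c2, n2):
    """One-sided p-value that group 1's per-genome mutation rate exceeds
    group 2's, given counts c1, c2 out of n1, n2 genomes surveyed.
    Under the null of equal rates, c1 | (c1+c2) ~ Binomial(c1+c2, n1/(n1+n2))."""
    total = c1 + c2
    p = n1 / (n1 + n2)
    # P(X >= c1) for X ~ Binomial(total, p)
    return sum(comb(total, k) * p**k * (1 - p)**(total - k)
               for k in range(c1, total + 1))

# Invented numbers for illustration only: 9 de novo events in one group
# versus 1 in the other, equal numbers of genomes surveyed
p_value = binomial_rate_test(c1=9, n1=1000, c2=1, n2=1000)
print(round(p_value, 4))  # 0.0107 for these invented counts
```

A small p-value here would indicate that the observed asymmetry is unlikely under equal rates, which is the logical shape of the question the researchers posed.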

Contrary to the widely accepted expectations, the results supported the nonrandom pattern. The HbS mutation originated de novo not only much faster than expected from random mutation, but also much faster in the population (in sub-Saharan Africans as opposed to Europeans) and in the gene (in the beta-globin as opposed to the control delta-globin gene) where it is of adaptive significance. These results upend the traditional example of random mutation and natural selection, turning it into an example of a nonrandom yet non-Lamarckian mutation.

“Mutations defy traditional thinking. The results suggest that complex information that is accumulated in the genome through the generations impacts mutation, and therefore mutation-specific origination rates can respond in the long-term to specific environmental pressures,” said Prof. Livnat. Previous studies, motivated by Lamarckism, only tested for an immediate mutational response to environmental pressures. “Mutations may be generated nonrandomly in evolution after all, but not in the way previously conceived. We must study the internal information and how it affects mutation, as it opens the door to evolution being a far bigger process than previously conceived,” Livnat concluded.

Until now, investigators have been limited by technology to measuring mutation rates as averages across many positions in the genome. Overcoming this barrier, the new method developed by Livnat and Melamed allowed the HbS mutation to be the first to have its mutation-specific origination rate measured, opening up new vistas for studies on mutation origination. These studies have the potential to affect not only our fundamental understanding of evolution, but also our understanding of diseases that are caused by mutations, namely genetic disease and cancer.

The article was accepted in the scientific journal Genome Research and appears in an advance online form. The study was made possible through the support of a grant from the John Templeton Foundation. The opinions expressed in the publication are those of the authors and do not necessarily reflect the views of the John Templeton Foundation.

Friday, January 28, 2022

Leafy greens first dished up 3,500 years ago

Archaeologists and archaeobotanists from Goethe University reconstruct the roots of West African cuisine.

Over 450 prehistoric pots were examined; 66 of them contained traces of lipids, that is, substances insoluble in water. On behalf of the Nok research team at Goethe University, chemists from the University of Bristol extracted lipid profiles, with the aim of revealing which plants had been used. The results have now been published in “Archaeological and Anthropological Sciences”: over a third of the 66 lipid profiles displayed very distinctive and complex distributions – indicating that different plant species and parts had been processed.

Today, leafy vegetables, for example the cooked leaves of trees such as the baobab (Adansonia digitata) or of the shrubby – nomen est omen – bitter leaf (Vernonia amygdalina), accompany many West African dishes. These leafy sauces are enhanced with spices and vegetables as well as fish or meat, and complement the starchy staples of the main dish, such as pounded yam in the southern part of West Africa or thick porridge made from pearl millet in the drier savannahs in the north. By combining their expertise, archaeology and archaeobotany researchers at Goethe University and chemical scientists from the University of Bristol have corroborated that the origins of such West African dishes date back 3,500 years.

The studies are part of a project funded by the German Research Foundation, which was headed by Professor Peter Breunig and Professor Katharina Neumann and ended in December 2021. For over twelve years, archaeologists and archaeobotanists from Goethe University studied the Nok culture of Central Nigeria, which is known for its large terracotta figures and early iron production in West Africa in the first millennium BC – although the roots of the Nok culture in fact stretch back to the middle of the second millennium. Research focused above all on the social context in which the sculptures were created, that is, including eating habits and economy. Using carbonised plant remains from Central Nigeria, it was possible to prove that the Nok people grew pearl millet. But whether they also used starchy plants, such as yam, and which dishes they prepared from the pearl millet had so far been a mystery.

“Carbonised plant remains such as seeds and nutshells preserved in archaeological sediments reflect only part of what people ate back then,” explains Professor Katharina Neumann. They hoped, she says, that the chemical analyses would deliver additional insights into food preparation. And indeed, with the help of lipid biomarkers and analyses of stable isotopes, the researchers from Bristol were able to show, by examining over 450 prehistoric pots, that the Nok people included different plant species in their diet.

Dr Julie Dunne from the University of Bristol’s Organic Geochemistry Unit says: “These unusual and highly complex plant lipid profiles are the most varied seen (globally) in archaeological pottery to date.” There appear to be at least seven different lipid profiles in the vessels, which clearly indicates the processing of various plant species and plant organs in these vessels, possibly including underground storage organs (tubers) such as yam.

Since the beginning of the project, the archaeobotanists have sought evidence for the early use of yam. After all, the Nok region is situated in the “yam belt” of West Africa, that is, the area of the continent in which yam is nowadays grown. Carbonised remains are of no further help here because the soft flesh of the tubers is often poorly preserved and mostly non-specific as well. The chemical analyses indicate that – apart from leaves and other as yet unidentified vegetables – the Nok people also cooked plant tissue containing suberin. This substance is found in the periderm of both overground and underground plant organs – possibly a first indication that yam was used, if not the unequivocal proof hoped for.

Through the archaeobotanical study of carbonised remains, pearl millet (Cenchrus americanus) and cowpea (Vigna unguiculata) were already known, as were the oily fruits of the African elemi (Canarium schweinfurthii) and a fruit known as African peach (Nauclea latifolia), which due to its high number of seeds is reminiscent of a large fig. Molecular analysis now rounds off the picture of food preparation at the sites of the Nok culture. Archaeobotanist Dr Alexa Höhn from Goethe University explains: “The visible and invisible remains of food preparation in the archaeological sediment and the pottery give us a much more complete picture of past eating habits. This new evidence suggests a significant time depth in West African cuisine.”

Publication: Julie Dunne, Alexa Höhn, Katharina Neumann, Gabriele Franke, Peter Breunig, Louis Champion, Toby Gillard, Caitlin Walton‑Doyle, Richard P. Evershed: Making the invisible visible: tracing the origins of plants in West African cuisine through archaeobotanical and organic residue analysis. Archaeological and Anthropological Sciences. https://doi.org/10.1007/s12520-021-01476-0

12,000-year-old rock art in North America

To reliably estimate the age of the figures carved into the rocks, the scientists determined the mass per area, or areal density, of manganese and iron on the rock surface. Both elements are part of the crust called rock varnish, which is deposited on rocks as a thin, dark coating. After carving, this layer forms again on the petroglyphs and grows over the years. "We compared intact rock varnish with the varnish of the engravings and were thus able to classify them chronologically," explains Meinrat O. Andreae. Together with his wife, biogeoscientist Tracey W. Andreae, the director emeritus of the Mainz Institute conducted a total of 461 measurements directly on site using a portable X-ray fluorescence device. Importantly, the rock varnish is not destroyed or damaged by the measurements.


Image: Legend Rock site: depiction of a large, human-like being, also called an anthropomorph. (Credit: Meinrat O. Andreae, Max Planck Institute for Chemistry)

Four sites in Idaho, Wyoming, and southern Montana

The team focused on rock art at four sites in Idaho, Wyoming, and southern Montana in the northeastern part of the Great Basin, within the cultural area of the Shoshone. Here, the diverse rock art spans a broad time period, from the Paleo-Indian era about 15,000 years ago to the recent past. Moreover, the scientists were able to check their own method against measurements of rock engravings whose ages had previously been determined using independent geochemical methods. The two sets of age estimates were in excellent agreement and thus confirmed each other. Comparison with other, previously dated archaeological material at the rock art sites also supported the researchers' age estimates.

Linear, constant deposition rates of manganese in the rock varnish

The researchers gained further certainty about the correct dating by determining the rate at which manganese was deposited in the rock varnish over the millennia. They suspected that the varnish grows relatively uniformly. To substantiate this hypothesis, they analysed rock surfaces whose date of formation is beyond doubt. In the Great Basin, two types of surface meet this condition: the melon-shaped basalt boulders in the Snake River valley, which were geologically abraded 14,500 years ago, and two circa 2,000-year-old basalt lava flows in the Craters of the Moon National Monument. Deposition rates were indeed nearly constant in both cases, suggesting roughly linear deposition.
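The dating logic described above can be sketched as a simple calibration. This is an assumed simplification, not the paper's exact procedure, and the manganese densities below are invented; only the two calibration ages and the linearity assumption come from the text:

```python
# Simplified sketch: calibrate a linear manganese deposition rate from
# surfaces of known age, then convert a petroglyph's varnish Mn areal
# density into an age estimate.

def deposition_rate(calibration):
    """Least-squares slope through the origin for (age, Mn areal density)."""
    num = sum(age * mn for age, mn in calibration)
    den = sum(age * age for age, _ in calibration)
    return num / den  # density units per year

def estimate_age(mn_density, rate):
    """Invert the linear growth model: age = density / rate."""
    return mn_density / rate

# Calibration points: ages from the two reference surfaces in the article;
# the Mn densities (in hypothetical µg/cm²) are invented for illustration
calibration = [(14500, 290.0), (2000, 40.0)]
rate = deposition_rate(calibration)

# An engraving with an invented Mn density of 240 µg/cm²
print(round(estimate_age(240.0, rate), -2))  # estimated age in years
```

The fit is constrained through the origin because a freshly exposed surface starts with no varnish, which is what makes the deposition model linear rather than merely affine.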

"All of our analyses suggest that the earliest petroglyphs were created as early as the transition period from the Pleistocene to the Holocene, about 12,000 years ago, and were repeatedly revised by indigenous people over thousands of years until the recent past," explains Andreae, a biogeochemist.

At Celebration Park alone, one of the Idaho sites, rock engravings cover a span of about 10,000 years. The earliest images there were abstract forms. Later, representational images were added. At other sites, again, representational figures dominated first and abstract patterns followed later.

Broad spectrum of styles and motifs

Overall, Andreae and his wife found a broad spectrum of styles and motifs at the sites, ranging from line drawings of abstract geometric patterns to large, human-like creatures known as anthropomorphs.

"Our method provides a link between the natural and human sciences. It enables age estimates for a statistically relevant, large number of rock art elements – with modest effort and, above all, without the need for destructive sampling," says Andreae, summing up the results of the study in the North American Basin. The Max Planck researcher is already planning further expeditions to Saudi Arabia, where there are also numerous well-preserved petroglyphs.

Thursday, January 27, 2022

Evidence of 3,600-year-old settlement uncovered in Eastern Arabian Peninsula

Video: The Accidental Archeologist (Credit: USC Viterbi School of Engineering)

Scholars looking for underground water sources on the Eastern Arabian Peninsula, for a project funded by the United States Agency for International Development (USAID), have accidentally uncovered the outlines of a settlement that appears to be over 3,600 years old. A symmetrical, 2 x 3 kilometer landscaped area (the trace outline of a settlement, and one of the largest potential settlements uncovered in the area) was identified using advanced radar satellite images in a part of Qatar where there was previously thought to be little evidence of sedentary, ancient civilizations. Their new study, published in the ISPRS Journal of Photogrammetry and Remote Sensing, counters the narrative that this peninsula was entirely nomadic; the evidence mapped from space indicates that the population had a sophisticated understanding of how to use groundwater. The research also points to the critical need to study water and safeguard against climate fluctuations in arid areas.

“Makhfia,” the name attributed to the settlement by researchers at the University of Southern California Viterbi School of Engineering and NASA Jet Propulsion Laboratory (and which refers to an invisible location in the local Arabic language), was discovered using L-band Synthetic Aperture Radar images from the Japanese satellite ALOS 1 and specially acquired, high-resolution radar images from its successor, ALOS 2. While the settlement was not visible from space using normal satellite imaging tools, nor through surface observation on the ground, the team determined that the large, underground rectangular plot had to be manmade because its shape, texture and soil composition were in sharp contrast to the surrounding geological features. Independent carbon dating of retrieved charcoal samples suggests that the site is at least 3,650 years old, dating back potentially to the same era as the Dilmun civilization.

Lead author Essam Heggy, of the USC Arid Climate and Water Research Center, describes the site as akin to a “natural fortress surrounded by very rough terrain,” which made the area almost inaccessible.

This discovery has significant historic and scientific implications. Historically, this may be the first piece of evidence of a sedentary community in the area — and perhaps evidence of advanced engineering for the time period. While we cannot see the remains of a monument or the walls of a settlement, the proof is in the soil. The soil at the site has a different surface texture and composition from the terrain surrounding it — a disparity typically associated with planting and landscaping.

A settlement of this size in this particular area, which is far from the coastline where most ancient civilizations were located, is unusual, says Heggy.

“With this area now averaging about 110 degrees Fahrenheit in the summer months, this is like finding evidence of a very green ranch in the middle of Death Valley, California, dating back thousands of years.”

Further, the site yields new insights on the poorly understood climatic fluctuations that occurred in the region, and how these changes may have impacted human settlement and mobility.

Most critically, the scholars believe that this settlement must have been in place for an extended period due to its development of agriculture and reliance on groundwater, a fact which speaks to the civilization’s advanced engineering prowess given Qatar’s complex aquifers and harsh terrain.

The researchers believe that a population with sufficient knowledge to leverage such unpredictable groundwater resources— inaccessible by digging through hard limestone and dolomite—would have certainly been ahead of its time in mitigating droughts within harsh, inland environments. There is strong evidence that this settlement’s inhabitants relied on deep groundwater sapping, a method by which one accesses water from deeper aquifers through fractures in the ground, in order to use this water for crop irrigation and to support daily life.

The presence of this settlement is now enabling researchers to piece together the most recent paleoclimatic changes that took place on the Eastern Arabian Peninsula.

The bleak side of this, says Heggy, is that we do not fully know who this culture was and why they disappeared. However, based on the presence of charcoal found on the site, Heggy and his colleagues suggest that fire could be one of several plausible explanations for its demise.

This evidence calls for increased study of this area by archeologists, says Heggy.

The work also has some implications for how we study and address climate fluctuations today.

“Deserts cover about 10 percent of our planet. We might think today that they were always uninhabitable, but this discovery (along with others in the area) shows that this might not always have been the case,” says Heggy, who is a research scientist at the Ming Hsieh Department of Electrical Engineering at USC.

His concern is that increasing climate fluctuations in arid areas can worsen food insecurity, migration, and the degradation of water resources.

Why should people care about the ruins of this ancient settlement? To Heggy, this culture’s ability to mitigate climatic fluctuations could be our story.

“This story is very important today. In arid areas, we have widespread disbelief in climate research. Many think climate change is something in the future or far away in the ‘geologic’ past. This site shows that it has always been here and that our recent ancestors have made its mitigation a key to their survival,” he says.

Heggy remains hopeful.  He says the forthcoming NASA Earth Observation missions focused on desert research will bring new subsurface mapping capabilities and will provide unique insights on deserts’ paleoclimatic evolution as well as human presence in desert areas during climate fluctuations.

 

Climate change in the Early Holocene

 

  • Radiocarbon dating from a prehistoric cemetery in Northern Russia reveals human stress caused by a global cooling event 8,200 years ago
  • Early hunter-gatherers developed more complex social systems and, unusually, a large cemetery when faced with climate change

 

New insight into how our early ancestors dealt with major shifts in climate is revealed in research published today [27 Jan] in Nature Ecology & Evolution by an international team led by Professor Rick Schulting from Oxford University’s School of Archaeology.

It reveals that new radiocarbon dates show the large Early Holocene cemetery of Yuzhniy Oleniy Ostrov, at Lake Onega, some 500 miles north of Moscow, previously thought to have been in use for many centuries, was in fact used for only one to two centuries. Moreover, this use seems to have been a response to a period of climate stress.

The team believes the creation of the cemetery reveals a social response to the stresses caused by regional resource depression. At a time of climate change, Lake Onega, as the second largest lake in Europe, had its own ecologically resilient microclimate. This would have attracted game, including elk, to its shores while the lake itself would have provided a productive fishery. Because of the fall in temperature, many of the region’s shallower lakes could have been susceptible to the well-known phenomenon of winter fish kills, caused by depleted oxygen levels under the ice.

The creation of the cemetery at the site would have helped define group membership for what would have been previously dispersed bands of hunter-gatherers - mitigating potential conflict over access to the lake’s resources.

But when the climate improved, the team found, the cemetery largely went out of use, as the people presumably returned to a more mobile way of life and the lake became less central.

The behavioural changes - to what could be seen as a more ‘complex’ social system, with abundant grave offerings – were situation-dependent. But they suggest the presence of important decision makers and, say the team, the findings also imply that early hunting and gathering communities were highly flexible and resilient.

The results have implications for understanding the context for the emergence and dissolution of socioeconomic inequality and territoriality under conditions of socio-ecological stress.

Radiocarbon dating of the human remains and associated animal remains at the site reveals that the main use of the cemetery spanned 100-300 years, centring on ca. 8,250 to 8,000 BP. This coincides remarkably closely with the dramatic 8.2 ka cooling event, so the site could provide evidence of how these humans responded to a climate-driven environmental change.

The Holocene (the current geological epoch, which began approximately 11,700 years before present) has been relatively climatically stable. But a number of climate fluctuations are recorded in the Greenland ice cores. The best known of these is the cooling event 8,200 years ago, the largest climatic downturn in the Holocene, which lasted one to two centuries. But there is little evidence of whether the hunter-gatherers who occupied most of Europe at this time were much affected, and if they were, in what specific ways.

Yuzhniy Oleniy Ostrov is one of the largest Early Holocene cemeteries in northern Eurasia, with up to 400 possible graves, 177 of which were excavated in the 1930s by a team of Russian archaeologists. Based on their work, the cemetery site holds an important position in European Mesolithic studies, in part because of the variation in the accompanying grave offerings: some graves lack these entirely, while others contain abundant and elaborate offerings.

Earliest known report of ball lightning phenomenon in England discovered

Image: Extract from the Chronicle of Gervase of Canterbury in which the medieval monk describes the ball lightning phenomenon; this is the earliest known description of ball lightning in England to have been found. (Credit: The Master and Fellows of Trinity College, Cambridge. Reference: Cambridge, Trinity College, MS R.4.11, p.324.)

Researchers have discovered what appears to be the earliest known account of a rare weather phenomenon called ball lightning in England.

Ball lightning, usually associated with thunderstorms, is unexplained and has been described as a bright spherical object on average 25 centimetres, but sometimes up to several metres, in diameter.

Working together, physicist Emeritus Professor Brian Tanner and historian Professor Giles Gasper, of Durham University, UK, made the connection to a ball lightning event while exploring a medieval text written some 750 years ago.

The account, by the 12th-century Benedictine monk Gervase of Christ Church Cathedral Priory, Canterbury, pre-dates the previous earliest known description of ball lightning recorded in England by nearly 450 years.

The findings are published in the Royal Meteorological Society’s journal, Weather.

In his Chronicle, composed around 1200, Gervase stated that “a marvellous sign descended near London” on 7 June 1195. He went on to describe a dense and dark cloud, emitting a white substance which grew into a spherical shape under the cloud, from which a fiery globe fell towards the river.

The Durham researchers compared the text in Gervase’s Chronicle with historical and modern reports of ball lightning.

Professor Brian Tanner, Emeritus Professor in the Department of Physics, Durham University, said: “Ball lightning is a rare weather event that is still not understood today.

“Gervase’s description of a white substance coming out of the dark cloud, falling as a spinning fiery sphere and then having some horizontal motion is very similar to historic and contemporary descriptions of ball lightning.

“If Gervase is describing ball lightning, as we believe, then this would be the earliest account of this happening in England that has so far been discovered.”

Prior to this account, the earliest report of ball lightning from England is during a great thunderstorm in Widecombe, Devon on 21 October 1638.

Medieval writings rarely survive in the author’s original version and Gervase’s Chronicle and other works now exist in only three manuscripts (one in the British Library, and two at the University of Cambridge). The Latin text was edited by Bishop William Stubbs in 1879 and there is no translation into English.

Professor Giles Gasper, in the Department of History, Durham University, said: “The main focus of Gervase’s writings was Christ Church Cathedral Priory in Canterbury, its disputes with neighbouring houses and an Archbishop of Canterbury, as well as chronicling the actions of the king and his nobles. But he was also interested in natural phenomena, from celestial events and signs in the sky to floods, famine, and earthquakes.”

The researchers looked at Gervase’s credibility as a writer and a witness, having previously examined his records of eclipses and a description of the splitting of the image of the crescent moon.

Professor Gasper added: “Given that Gervase appears to be a reliable reporter, we believe that his description of the fiery globe on the Thames on 7 June 1195 was the first fully convincing account of ball lightning anywhere.”

 As climate shifted 23,000 years ago, humans in Israel experienced a new abundance of food, according to a study published January 26, 2022 in the open-access journal PLOS ONE by Tikvah Steiner of the Hebrew University of Jerusalem and colleagues.

IMAGE: MAP WITH LOCATION OF SOUTHERN LEVANTINE EPIPALEOLITHIC SITES MENTIONED IN THE TEXT (A), PLAN OF OHALO II (B) AND PLAN OF BRUSH HUT 1 (C).

CREDIT: PLOS

The submerged archaeological site of Ohalo II, located on the southern tip of the Sea of Galilee in Israel, preserves extensive evidence of human occupation about 23,000 years ago. This was a time period of global climate fluctuation, and also a time when humans notably diversified their dietary habits. Some researchers have suggested this diet shift was necessary due to decreasing food availability, while others suggest the change was an opportunistic one made possible by increasing food abundance. In this study, Steiner, Nadel and colleagues from a multidisciplinary team from four Israeli and Spanish universities tested these competing hypotheses via analysis of animal remains at Ohalo II.

The authors examined over 20,000 animal remains, including reptiles, birds, and mammals, from well-preserved successive floors of a brush hut at the site. The results show that the people of Ohalo II were successfully hunting prime large game, while at the same time gathering a wide variety of fish, other small animals, and plants.

According to the authors, this evidence does not indicate a drop in food availability, but rather an abundance of multiple prey sources. They suggest that while some animals were gathered for meat, others might have been hunted for pelts (e.g.: foxes, hares) or shells (e.g.: tortoises). From this study, it seems that fluctuating climate conditions did not create food stress, at least in this region, but instead new dietary opportunities. The researchers hope that this work at Ohalo II will serve as a model for similar investigations of human diet changes at other locations and time periods.

The authors add: “The choice of a littoral habitat that could be intensively exploited year-round may be an example of niche selection. The availability of multiple food sources within a rich habitat may have driven exploitation of myriad local resources, rather than targeting mainly energetically-rich large prey.”

Tuesday, January 25, 2022

Study questions the importance of meat eating in shaping our evolution

 “Generations of paleoanthropologists have gone to famously well-preserved sites in places like Olduvai Gorge looking for — and finding — breathtaking direct evidence of early humans eating meat, furthering this viewpoint that there was an explosion of meat eating after 2 million years ago,” W. Andrew Barr, an assistant professor of anthropology at the George Washington University and lead author on the study, said. “However, when you quantitatively synthesize the data from numerous sites across eastern Africa to test this hypothesis, as we did here, that ‘meat made us human’ evolutionary narrative starts to unravel.”

 Quintessential human traits such as large brains first appear in Homo erectus nearly 2 million years ago. This evolutionary transition towards human-like traits is often linked to a major dietary shift involving greater meat consumption. A new study published today in the Proceedings of the National Academy of Sciences, however, calls into question the primacy of meat eating in early human evolution. While the archaeological evidence for meat eating increases dramatically after the appearance of Homo erectus, the study authors argue that this increase can largely be explained by greater research attention on this time period, effectively skewing the evidence in favor of the “meat made us human” hypothesis.

Barr and his colleagues compiled published data from nine major research areas in eastern Africa, including 59 site levels dating between 2.6 and 1.2 million years ago. They used several metrics to track hominin carnivory: the number of zooarchaeological sites preserving animal bones that have cut marks made by stone tools, the total count of animal bones with cut marks across sites, and the number of separately reported stratigraphic levels.

The researchers found that, when accounting for variation in sampling effort over time, there is no sustained increase in the relative amount of evidence for carnivory after the appearance of H. erectus. They note that while the raw abundance of modified bones and the number of zooarchaeological sites and levels all demonstrably increased after the appearance of H. erectus, the increases were mirrored by a corresponding rise in sampling intensity, suggesting that intensive sampling – rather than changes in human behavior – could be the cause.
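The sampling-effort correction described above can be illustrated with a minimal sketch. The counts below are made up for illustration (the study's real data are in its supplementary material); the point is that dividing raw counts of cut-marked bones by a proxy for sampling intensity, such as the number of excavated site levels, yields a comparable per-level rate:

```python
# Hypothetical counts of cut-marked bones in two time bins
raw_cutmarked = {"pre_erectus": 40, "post_erectus": 160}
# Hypothetical sampling-intensity proxy: excavated site levels per bin
site_levels = {"pre_erectus": 10, "post_erectus": 40}

# Per-level rate of cut-marked bones, controlling for sampling effort
rate = {bin_: raw_cutmarked[bin_] / site_levels[bin_] for bin_ in raw_cutmarked}

# Raw counts quadruple after H. erectus, but so does sampling effort,
# so the per-level rate is flat -- the kind of pattern the authors report.
print(rate)
```

With these toy numbers the raw evidence quadruples while the corrected rate stays constant, which is exactly the distinction between "more meat eating" and "more digging" that the study draws.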

“I’ve excavated and studied cut marked fossils for over 20 years, and our findings were still a big surprise to me,” Briana Pobiner, a research scientist in the Human Origins Program at the Smithsonian’s National Museum of Natural History and co-author on the study, said. “This study changes our understanding of what the zooarchaeological record tells us about the earliest prehistoric meat-eating. It also shows how important it is that we continue to ask big questions about our evolution, while we also continue to uncover and analyze new evidence about our past.”

Going forward, the researchers stressed the need for alternative explanations of why certain anatomical and behavioral traits associated with modern humans emerged. Possible alternatives include the provisioning of plant foods by grandmothers and the use of controlled fire to increase nutrient availability through cooking. The researchers caution that none of these explanations currently has a strong grounding in the archaeological record, so much work remains to be done.

“I would think this study and its findings would be of interest not just to the paleoanthropology community but to all the people currently basing their dieting decisions around some version of this meat-eating narrative,” Barr said. “Our study undermines the idea that eating large quantities of meat drove evolutionary changes in our early ancestors.”

In addition to Barr and Pobiner, the research team included John Rowan, an assistant professor of anthropology at the University at Albany; Andrew Du, an assistant professor of anthropology and geography at Colorado State University; and J. Tyler Faith, an associate professor of anthropology at the University of Utah.

-GW-

Wednesday, January 19, 2022

The secrets of ancient Japanese tombs revealed thanks to satellite images


IMAGE: DAISEN KOFUN, AERIAL VIEW

CREDIT: © MINISTRY OF TERRITORY, INFRASTRUCTURE, TRANSPORT AND TOURISM, UNDER KIND PERMISSION

A research group at the Politecnico di Milano analysed the orientation of ancient Japanese tombs, the so-called Kofun. This study had never been carried out before, due to the very large number of monuments and the fact that access to these areas is usually forbidden. For these reasons, high-resolution satellite imagery was used. The results show that these tombs are oriented towards the arc of the rising sun, associated with the Sun Goddess Amaterasu, whom the Japanese emperors linked to the mythical origin of their dynasty.

The Japanese islands are dotted with hundreds of ancient burial mounds, the largest of which are in the typical shape of a keyhole and are called Kofun. Built between the third and the seventh centuries AD, the most imposing are attributed to the semi-legendary first emperors, while the smaller ones probably belong to court officers and to members of the royal family. Among these, the so-called Daisen Kofun is one of the largest monuments ever built on Earth: it measures 486 meters in length and about 36 in height. It is traditionally attributed to Emperor Nintoku, the sixteenth emperor of Japan. The Daisen Kofun belongs to a group of tombs recently inscribed in the UNESCO World Heritage List.
There are no written sources on these tombs, and excavations are rare and limited to the smaller ones, since the largest are considered the tombs of the first semi-legendary emperors and, as such, are strictly protected by law. Protection also extends to the surroundings: many monuments are fenced, and entering the perimeter is not allowed. For these reasons, it is impossible to obtain accurate measurements of size, height and orientation, and the sheer number of monuments discourages field investigation. It is therefore natural to study them using high-resolution satellite images, which furnish simple but very powerful tools for remote sensing investigations.

This is what Norma Baratta, Arianna Picotti and Giulio Magli of the Politecnico di Milano did, with the aim of deepening our knowledge of the relationships between these fascinating monuments and the landscape and, in particular, the sky. The team measured the orientation of more than 100 Kofun and came to interesting conclusions.

The results, just published in the scientific journal "Remote Sensing" (https://www.mdpi.com/2072-4292/14/2/377), indicate a strong connection of the Kofun entrance corridors with the arc of the sky in which the Sun and the Moon are visible every day of the year, and show that the largest keyhole-shaped Kofun are oriented towards the arc of the rising Sun. In particular, the Daisen Kofun is oriented towards the Sun rising at the winter solstice.
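The winter-solstice claim is easy to sanity-check with spherical astronomy (this is a back-of-the-envelope illustration, not the paper's method). For a flat horizon, the sunrise azimuth A satisfies cos A = sin(declination) / cos(latitude). Using an approximate latitude for the Daisen Kofun and the Sun's December-solstice declination:

```python
import math

LAT_DAISEN = 34.56             # deg N, approximate latitude of the Daisen Kofun
DECL_WINTER_SOLSTICE = -23.44  # deg, solar declination at the December solstice

def sunrise_azimuth(lat_deg: float, decl_deg: float) -> float:
    """Azimuth of sunrise, measured clockwise from true north, flat horizon."""
    lat, decl = math.radians(lat_deg), math.radians(decl_deg)
    return math.degrees(math.acos(math.sin(decl) / math.cos(lat)))

az = sunrise_azimuth(LAT_DAISEN, DECL_WINTER_SOLSTICE)
print(f"winter-solstice sunrise azimuth ~ {az:.1f} deg")  # roughly 119 deg
```

At this latitude the winter-solstice Sun rises almost 30 degrees south of due east, so an alignment towards it is a distinctive, verifiable orientation rather than a generic easterly one.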

Orientation of the imperial tombs towards the Sun does not happen by chance: rather, it is in full agreement with the Japanese imperial tradition. Indeed, the mythical origin of the dynasty of the Japanese Emperors considers them as direct descendants of the Sun Goddess Amaterasu.


Saturday, January 15, 2022

Risky food-finding strategy could be the key to human success

  It’s a cold and rainy Sunday afternoon: would you rather be running after tasteless wild berries, or curled up on your couch with fuzzy socks and a good book?

You might not have had that choice if our ancestors had not taken a big gamble with their food.

A new study published in Science on December 24 shows that early human foragers and farmers adopted an inefficient high-risk, high-reward strategy to find food. They spent more energy in pursuit of food than their great ape cousins, but brought home much more calorie-rich meals that could be shared with the rest of their group. This strategy allowed some to rest or tackle other tasks while food was being acquired.

“Hunting and gathering is risky and inefficient, but the rate of return is enormous,” said study co-leader, Herman Pontzer, an associate professor of Evolutionary Anthropology at Duke University. “We can share our food, and because we got so many calories before noon, we can hang out around each other in this new space, a free-time space.”

Humans spend a lot more energy than great apes. We have big brains that eat up a lot of calories, we live a long time, we can have long pregnancies that produce big babies, and these babies rely on adults for a long time.

To find out how humans obtained this extra energy, a group of researchers led by Thomas Kraft, a postdoctoral researcher at the University of California Santa Barbara, and Pontzer compared the energy budgets of wild gorillas, chimpanzees and orangutans with those of populations of Tanzanian hunter-gatherers (Hadza) and Bolivian forager-horticulturalists (Tsimane).

Hunter-gatherers and forager-horticulturalists both gather food from wild plants and animals, but the Tsimane also produce small-scale crops.

Energy budgets depend on how much food energy is absorbed and on how much time and energy are spent obtaining food. Humans were thought to maintain their energetically costly lifestyle in one of two ways: they could be super-efficient, spending little time and energy finding food, in part through the use of tools and technological advances; or they could spend a lot of energy to quickly bring home a lot of food, sacrificing energy efficiency.

The researchers found that hunter-gatherers and forager-horticulturalists are inefficient, high-intensity foragers. Like a gas-guzzling pick-up truck bringing home a ton of donuts, they spend a lot more energy obtaining food than great apes, but they do it faster and the food they obtain is high in calories. Rather than minimizing their costs, they take a risk to maximize their rewards.
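The efficiency-versus-rate distinction above can be made concrete with a toy calculation. The numbers below are invented for illustration (they are not the study's measurements): efficiency is calories gained per calorie spent, while the net rate is the surplus brought home per hour.

```python
def net_return(kcal_acquired_per_h: float, kcal_spent_per_h: float) -> dict:
    """Two ways to score a foraging strategy for one hour of effort."""
    return {
        "efficiency": kcal_acquired_per_h / kcal_spent_per_h,  # kcal gained per kcal spent
        "net_rate": kcal_acquired_per_h - kcal_spent_per_h,    # surplus kcal per hour
    }

# Hypothetical hourly budgets: low-cost/low-yield ape vs high-cost/high-yield forager
ape = net_return(kcal_acquired_per_h=300, kcal_spent_per_h=100)
forager = net_return(kcal_acquired_per_h=1500, kcal_spent_per_h=600)

# The forager is LESS efficient (2.5 vs 3.0 kcal per kcal) but ends the
# hour with a far larger surplus (900 vs 200 kcal): high cost, high reward.
print(ape, forager)
```

This is the sense in which the pick-up truck "wins" over the electric car: what matters for feeding a group with shared food is the size of the hourly surplus, not the miles-per-gallon.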

Chimpanzees, gorillas and orangutans, on the other hand, are like an electric car bringing home a head of lettuce and some apples. They are essentially herbivores and frugivores who eat very little, if any, meat. Their strategy is one of low risk, low rewards: their food is easy to find, but it’s fibrous, low in energy, and it takes a lot of time to get enough of it.

The Hadza hunter-gatherers and the Tsimane forager-horticulturalists both eat high-calorie foods that are harder to get. They spend a lot of energy hunting, gathering, planting and harvesting, but can quickly bring home a nutritious lunch. What’s more, they bring enough to share.

Pontzer said sharing provides a safety net, enabling some group members to take risks, targeting big game and other high-risk, high-reward foods. If they come home empty-handed, which they often do, they know others will have something to share. The possibility of sharing food also means some group members can even stay at the camp on occasion, enjoying one of our most precious commodities: free time.

“This slight shift in the way that we go about getting our food has fundamentally made everything else possible,” Pontzer said. Free time allows group members to communicate about things other than food. It allows for experimentation, for learning, for creativity, for play, for culture.

Being wired to find and share energy bombs was, and still is, a winning strategy for hunter-gatherers and forager-horticulturalists, Pontzer said. But it can also be treacherous for those of us with a pantry full of delicious, highly caloric food.

“We are built to try and get a lot of food,” Pontzer said. “We are hugely ravenous and inefficient, and that's how we've evolved for 2 million years.”

“That doesn’t mean we can be careless with our energy today, and it doesn't mean that we have to say, ‘well there's nothing we can do about it’,” Pontzer said. “We have to be aware of ourselves and our evolutionary history.”

This research was supported by the National Science Foundation (BCS0422690, BCS-0850815, BCS-1440867, BCS-1062879, BCS-1440841, BCS-1440671, BCS-0242455), NIH (R01AG024119, R56AG024119), the Leakey Foundation, the Max Planck Institute for Evolutionary Anthropology, the University of California, San Diego, and the American School of Prehistoric Research (Harvard University), as well as IAST funding from ANR under grant ANR-17-EUR-0010 (Investissements d’Avenir program).

CITATION: “The Energetics of Uniquely Human Subsistence Strategies,” Thomas S. Kraft, Vivek V. Venkataraman, Ian J. Wallace, Alyssa N. Crittenden, Nicholas B. Holowka, Jonathan Stieglitz, Jacob Harris, David A. Raichlen, Brian Wood, Michael Gurven, Herman Pontzer. Science, 374 (6575), eabf0130. December 2021. DOI: 10.1126/science.abf0130