[Figure: from Larson 2015]
Friday, 27 February 2015
This week's Science includes an ancient sedimentary DNA study by Oliver Smith, Robin Allaby and colleagues, on sediments from an archaeological site sealed beneath the English Channel, with evidence that wheat was decomposing on this Mesolithic site 8000 years ago. Such a claim is obviously a big deal for archaeologists, as it runs counter to our accepted narrative of the introduction of cereals with Neolithic farming immigrants around 6000 years ago. Not surprisingly it has received science media attention, both in Science and in New Scientist, as well as a learned commentary from Greger Larson; and despite a busy teaching week I have been asked for comments. Here I give my full extended comment. While I agree that we really need more evidence from additional sites to clinch this, and I would prefer directly radiocarbon-dated grains, I also don't think this requires a complete overhaul of what we know about the introduction of sustained farming around 4000 BC.
This paper is methodologically impressive. The authors have developed a robust phylogenetic approach to cautiously identify sedimentary aDNA. The deposits seem well dated and are sealed by rising sea levels. So we are left with the challenge of fitting this to our world view as archaeologists.
This report is sure to be heavily debated, and I guess many archaeologists will reject it out of hand. But that is perhaps like the ostrich with its head in the sand. I would certainly be happier with an AMS-dated cereal grain, but this new evidence tells us we need to be actively looking for those pre-Neolithic traded grains.
I suppose this will reopen the debate about claims for Mesolithic cereal pollen grains, reported from sites here and there in Britain and France. Most archaeologists have rightly tended to follow the critical assessment of these finds, represented for example by the writings of Prof. Behre, a senior archaeobotanist and doyen of anthropogenic pollen indicators (e.g. Behre 2007). I expect new scrutiny of such finds, as they could also relate to a pioneer phase of small-scale cereal adoption.
This find does not mean the Neolithic needs to be re-dated. The Neolithic in Britain is well dated to about 4000 BC, which sees a rapid rise in human population together with evidence for emmer wheat, barley and livestock. This follows a spread of agricultural populations, uniformly with big demographic booms across central and western Europe (e.g. work by Shennan et al. in Nature Communications, 2013). This I think is still clear. But the new wheat DNA from the English Channel requires us to think in terms of small-scale pioneers operating beyond the frontier of farming spread and trading with foragers, and beyond that foragers trading with each other. Mesolithic foragers were well adapted to their environments given their population density, so this would not have been about trading food for needed calories but about foodstuffs that were rare, exotic and valuable. I would guess these early cereals would have been symbolically charged as exotica, much like spices in much later times. In regions with obsidian we know Mesolithic populations had long-distance trade networks. This new evidence suggests long-distance networks also moved perishables, including edibles.
I think we can see this as on a par with the food "globalization" episodes of much later prehistory, such as the Bronze Age. When sorghum and other African crops arrived in India 4000 years ago, or wheat arrived in China in the third millennium BC, these edibles preceded any other material evidence for trade. This implies that long-distance, small-scale exchanges in exotica, including what seem to us today mundane edibles, were highly valued, presumably in part because of the symbolic associations with distance and the exotic. I have written about this in a few places, e.g. Fuller et al. 2011 in Antiquity or Boivin et al. 2012 in World Archaeology (blogged here).
So perhaps what we are seeing is evidence for an early Holocene equivalent: the Neolithic grain as the tasty exotica of the Mesolithic world.
I recently collaborated on a paper that analyzed small scoriaceous (glassy) droplets from flotation samples, focused on sites in Syria relating to early agriculture, including Abu Hureyra, Jerf el Ahmar, Dja'de and Tell Qaramel: Thy et al., published in the Journal of Archaeological Science. Previously, some of these droplets from the Abu Hureyra samples (which are stored at the UCL Institute of Archaeology) were included in a study of 18 sites around the world that argued that these Younger Dryas spherules derive from an asteroid impact 12,800 years ago that set off the Younger Dryas (see Wittke et al. in PNAS). The analyses in our recent paper argue that the composition of these droplets can be explained by natural soils on earth being heated to high temperatures, not unreasonable for human settings such as house fires. Our study includes sites that are significantly later than the alleged asteroid impact, and we note that even later examples occur. Both my co-author Willcox and I are primarily archaeobotanists, which means we float archaeological sediments in water to recover ancient charcoal and seeds; scoria droplets are light enough, and contain air bubbles, so they float along with the charcoal. We find them in many archaeobotanical samples around the world and from many periods, but certainly not in all samples. This suggests that they were formed occasionally by circumstances on ancient human sites.
This article generated considerable science media attention, and some vociferous debate from the proponents of a Younger Dryas impact (about which I have previously written critically on this blog: the vexed issue of "nano-diamonds"). In the US it made it into CBS News and in the UK into the Daily Mail, and these treatments can be followed from the UCL news item.
Superficially there is much in common between these spherules and the composition of the South Indian Neolithic ashmounds, where we have large deposits, even mounds, of scoriaceous material dating from 3000-1400 BC. These appear to have been created through intentional burning of built-up dung and other deposits at cattle-penning sites, reaching temperatures of 1200°C or more and leading to scoriaceous formations.
In the early Near East we are dealing with similar kinds of temperatures. What is intriguing is that these spherules do seem to be frequent inclusions in samples from early village sites, which suggests that circumstances in these first villages lent themselves to large fires periodically. I think they are likely to come from uncontrolled fires, as we would expect these to reach really high temperatures in places. Controlled fires in hearths and ovens, I would guess, will often have been kept at lower temperatures more manageable for cooking, although I accept that more experimental work could explore this question. I would see the spherules as a by-product of people building structures and accumulating rubbish fairly densely in these early settlements while not really having fire prevention or firefighting practices. Village life also evolved gradually, and it may be that these earlier forms of village were not very well adapted with respect to reducing accidental fires.
Because the Younger Dryas and the asteroid impact have been related to terminal Pleistocene megafaunal extinctions, much of the media attention revolved around this issue. The Daily Mail reporter asked me: "Do you think this lends support for the other theories of what may have led to the extinctions of the Pleistocene megafauna? What theory do you favour yourselves?"
I think the obvious big player in megafaunal extinctions is humans: human population growth, the modification of environments and the changing of food-webs. Human over-hunting is a somewhat simplistic version of this hypothesis. In the Old World the ecosystem space of large herbivores has mostly been taken over by domesticated cattle, sheep, goats and horses, and agricultural fields tend to exclude large herbivores. The habitat available for wild large herbivores has therefore been reduced by these replacements, while growing human numbers have added to hunting pressure. These are all key developments of the Anthropocene, or the "Used Planet", that I have written about before. This kind of process can be illustrated with the near extinction of the American bison. Yes, these were hunted, but the big reason they were so greatly reduced is that the prairies of the US became land for farming and cattle ranching, so much less, and much more marginal, habitat was left for the bison, and there was no way for them to recover from the impacts of hunting. In the Americas domesticated herbivores are less significant, but the addition of humans only about 15,000 years ago must have massively altered ecologies and food-webs in ways that destabilized large faunal populations. Climate change added to these pressures, but I do not see a need for extraterrestrial forces in this equation.
Returning to the spherules: ultimately, scientific evidence should be able to determine the source of these nodules and the reality of a Younger Dryas asteroid impact. The final verdict is perhaps not yet in from the wider jury of the research community. In favour of such an impact is, perhaps, the fact that the Younger Dryas looks anomalous compared with the onset of previous interglacials (see, e.g., those in Ruddiman's alignments in his Anthropocene review). However, a crater is missing, and a clear smoking gun for this cataclysm remains elusive and debatable. Our article is just one such criticism, but see also the astrophysics critique of Boslough et al. (2012).
Nevertheless, I think there is a more significant philosophical difference. A cataclysmic Younger Dryas provides an extraterrestrial trigger for megafaunal extinctions and the start of Near Eastern farming. I would argue instead that human actions, modifying environments and improving the techniques and technologies to do so, are what drove the major changes of the early Holocene. Some of this may have been a response to changing climate, to shifts in photosynthetic productivity brought about by the post-glacial rise in carbon dioxide, but those changes were not sufficient on their own: it was cultural action, niche construction (see, e.g., Bruce Smith 2011), that was the necessary catalyst. And as the evidence clearly indicates, the origins of farming and plant domestication were protracted in the Fertile Crescent and elsewhere, taking millennia of plant evolution, trial and error of human techniques, and a messy process of coevolution between culture and its environments (the entangled process of domestication: see this PNAS paper from 2014). There is no strong case for a Younger Dryas trigger to the start of cultivation, and certainly not for the bigger economic transition to domestication and farming. Outside the Near East almost all instances of the transition to domestication and farming economies occur much later, in the later Early Holocene or Mid-Holocene (see, e.g., the summary in Larson et al. from last year in PNAS). We have our own species to blame; I don't think we need to look to outer space.