Anthropogenic burning in Hadza country. Credit: James F. O’Connell
Fire, a tool broadly used for cooking, constructing, hunting and even communicating, was arguably one of the earliest discoveries in human history. But when, how and why it came to be used is hotly debated among scientists.
A new scenario crafted by University of Utah anthropologists proposes that human ancestors became dependent on fire as a result of Africa’s increasingly fire-prone environment 2-3 million years ago.
As the environment became drier and natural fires occurred more frequently, ancestral humans took advantage of these fires to search for and handle food more efficiently. With increased resources and energy, these ancestors were able to travel greater distances and expand to other continents.
The study was funded by the National Science Foundation and the findings were published April 10, 2016 in Evolutionary Anthropology.
Serendipitous science
Current prevailing hypotheses of how human ancestors became fire-dependent depict fire use as an accident: a byproduct of another activity rather than a discovery in its own right. One hypothesis, for example, explains fire use as the result of rock pounding that created a spark, which then spread to a nearby bush.
“The problem we’re trying to confront is that other hypotheses are unsatisfying. Fire use is so crucial to our biology, it seems unlikely that it wasn’t taken advantage of by our ancestors,” said Kristen Hawkes, distinguished professor of anthropology at the U and the paper’s senior author.
“Everything is modified by fire; just take a look around at the books and furniture in this room. We’re surrounded by fire’s byproducts,” added Christopher Parker, anthropology postdoctoral research associate at the U and the paper’s first author.
The team’s proposed scenario is the first known hypothesis in which fire does not originate serendipitously. Instead, the team suggests that the genus Homo, which includes modern humans and their close relatives, adapted to progressively fire-prone environments caused by increased aridity and flammable landscapes by exploiting fire’s food foraging benefits.
Parker and Hawkes conducted the research with University of Utah anthropology doctoral candidate Earl Keefe, postdoctoral research associate Nicole Herzog and distinguished professor James F. O’Connell.
Shedding light on the past
“All humans are fire-dependent. The data show that other animals and even some of our primate cousins use it as an opportunity to eat better; they are essentially taking advantage of landscape fires to forage more efficiently,” said Hawkes.
By reconstructing tropical Africa’s climate and vegetation about 2-3 million years ago, the research team pieced together multiple lines of evidence to craft their proposed scenario for how early human ancestors first used fire to their advantage.
To clarify the dating and scope of increasingly fire-prone landscapes, the research team took advantage of recent work on carbon isotopes in paleosols, or ancient dirt. Because woody plants and more fire-prone tropical grasses use different photosynthetic pathways that result in distinct variants of carbon, the carbon isotopic composition of paleosols can directly indicate the percentage of woody plants versus tropical grasses.
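That relationship reduces to a simple two-end-member mixing calculation. As a rough illustration (not taken from the paper), assuming typical δ13C values of about -27‰ for C3 woody plants and -12‰ for C4 tropical grasses, the grass fraction can be estimated from a paleosol’s organic-carbon δ13C:

```python
# Illustrative two-end-member mixing sketch (not from the study): estimating the
# fraction of C4 grass-derived carbon vs. C3 woody-plant carbon from a paleosol
# delta-13C value. The end-member values below are common textbook
# approximations and are assumptions, not figures reported by the paper.

D13C_C3_WOODY = -27.0   # per mil, typical mean for C3 trees and shrubs (assumed)
D13C_C4_GRASS = -12.0   # per mil, typical mean for C4 tropical grasses (assumed)

def fraction_c4(d13c_sample: float) -> float:
    """Return the estimated fraction (0-1) of C4 grass-derived carbon."""
    frac = (d13c_sample - D13C_C3_WOODY) / (D13C_C4_GRASS - D13C_C3_WOODY)
    return max(0.0, min(1.0, frac))  # clamp to the physically meaningful range

if __name__ == "__main__":
    # A hypothetical paleosol value of -18 per mil implies roughly 60% C4 grasses.
    print(f"{fraction_c4(-18.0):.0%} C4 grasses")
```

Under these assumed end members, more positive (less negative) paleosol δ13C values indicate a landscape increasingly dominated by fire-prone grasses.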
Recent carbon analyses of paleosols from the Awash Valley in Ethiopia and the Omo-Turkana basin in northern Kenya and southern Ethiopia show a consistent pattern of woody plants being replaced by more fire-prone tropical grasses approximately 3.6-1.4 million years ago, a shift explained by reductions in atmospheric carbon dioxide levels and increased aridity. Drier conditions and the expansion of fire-prone grasslands are also supported by fossil wood from the Omo Shungura G Formation, Ethiopia.
As the ecosystem became increasingly arid and a pattern of rapid, recurring fluctuation between woodlands and open grasslands emerged, many ancestral humans adapted to eating grassland plants and food cooked by fires. In essence, they took advantage of the foraging benefits that fire provided.
Turn up the heat: more fire for more food
More specifically, fire-altered landscapes provided foraging benefits by improving both the processes of searching for and handling food. The research team identified these benefits by using the prey/optimal diet model of foraging, which simplifies foraging into two mutually exclusive components — searching and handling — and ranks resources by the expected net profit of energy per unit of time spent handling. This model identifies changes in the suite of resources that give the highest overall rate of gain as search and handling costs change.
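As a rough sketch of that logic (illustrative only; the resource names, energy values, handling times and encounter rates below are assumptions, not data from the study), the classic prey-choice rule adds resources to the diet in order of profitability, energy per unit handling time, as long as doing so raises the overall rate of gain:

```python
# Minimal sketch of the prey-choice (optimal diet) model the authors reference.
# Each resource has an assumed net energy value e (kcal), handling time h
# (hours per item) and encounter rate lam (items per hour of search).
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    e: float    # net energy per item (kcal), assumed
    h: float    # handling time per item (hours), assumed
    lam: float  # encounter rate while searching (items/hour), assumed

def optimal_diet(resources: list[Resource]) -> tuple[list[Resource], float]:
    """Add resources in order of profitability (e/h) while each addition's
    profitability exceeds the rate achievable from the current diet set;
    return the resulting diet and its overall rate of energetic gain."""
    ranked = sorted(resources, key=lambda r: r.e / r.h, reverse=True)
    diet, rate = [], 0.0
    for r in ranked:
        if r.e / r.h <= rate:
            break  # this and all lower-ranked items are excluded
        diet.append(r)
        rate = (sum(x.lam * x.e for x in diet) /
                (1 + sum(x.lam * x.h for x in diet)))
    return diet, rate

# Hypothetical example: burning shortens the handling time of tubers.
raw    = [Resource("small game", 800, 1.0, 0.2), Resource("raw tuber", 300, 3.0, 1.0)]
burned = [Resource("small game", 800, 1.0, 0.2), Resource("roasted tuber", 300, 0.5, 1.0)]
for label, res in [("unburned", raw), ("burned", burned)]:
    diet, rate = optimal_diet(res)
    print(label, [r.name for r in diet], f"{rate:.0f} kcal/hour")
```

In this toy example, roasting cuts the tuber’s handling time enough to pull it into the optimal diet and raise the overall return rate, which is the kind of shift the model predicts when fire lowers search and handling costs.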
By burning off cover and exposing previously obscured holes and animal tracks, fire reduces search time; it also clears the land for faster-growing, fire-adapted foliage. Foods altered by burning take less effort to chew, and nutrients in seeds and tubers become more readily digestible. Those changes reduce handling effort and increase the value of those foods.
“Most people think that the logical reaction would be to run away from fire, but fire provided our ancestors with a feeding opportunity. Evidence shows that other animals take advantage of fire for foraging, so it seems very likely that our ancestors did as well,” said Hawkes.
Without a trace
Landscapes burned by fire, then, had numerous foraging payoffs for the genus Homo.
The proposed scenario not only explains how hominins came to manipulate fire for its foraging advantages, but also offers a solution to the baffling mismatch between the fossil and archaeological records: anatomical changes associated with dependence on cooked food, such as reductions in tooth size and in structures related to chewing, appear long before there is clear archaeological evidence of cooking hearths.
Parker and Hawkes’ scenario resolves the mismatch by suggesting that the earliest forms of fire use by the genus Homo would not have left traces in the form of traditional fire hearths.
Instead of cooking over prepared hearths that would be visible archaeologically, hominins took advantage of natural burns, gaining an increased energy budget that let them travel longer distances. Early fire use, therefore, would have been indistinguishable from naturally occurring fires.
“When our genus appears, almost immediately, those populations got out of Africa. If you look at the other great apes, they’re tied to habitats where juveniles can feed themselves. We were able to expand out of Africa into Europe and Asia because our fire use not only earned higher return rates, but also permitted older women in these communities to help feed juveniles, thereby freeing our ancestors to move into habitats where youngsters couldn’t feed themselves,” said Hawkes.
“This scenario tells a story about our ancestors’ foraging strategies and how those strategies allowed our ancestors to colonize new habitats. It gives us more insight into why we came to be the way we are; fire changed our ancestors’ social organization and life history.”
Looking forward, the research team will take on an ethnographic project with the Hadza, an indigenous people of Tanzania who are among the last hunter-gatherers in the world, to learn how they forage in burns. The team will also continue documenting how nonhuman primates forage in burns, to confirm the anecdotal evidence that they take advantage of landscape fires, and will further study fire ecology in tropical Africa and how it allowed our ancestors to move to other continents.
This article originally appeared at University of Utah.
Journal References:
- Christopher H. Parker, Earl R. Keefe, Nicole M. Herzog, James F. O’Connell, Kristen Hawkes. The pyrophilic primate hypothesis. Evolutionary Anthropology: Issues, News, and Reviews, 2016; 25(2): 54. DOI: 10.1002/evan.21475