Friday, July 5, 2024

The evolution of the human trophic level during the Pleistocene
Miki Ben-Dor, Raphael Sirtoli, Ran Barkai

https://onlinelibrary.wiley.com/doi/10.1002/ajpa.24247

INTRODUCTION

"The first task of the prehistorian must be to decide which trophic level the population he is studying occupied" (Wilkinson, 2014, p. 544). Despite Wilkinson's advice, few researchers have referred to past human food consumption in terms of a "trophic level." This tendency may stem from the perception of humans as ultimate omnivores: generalist, flexible creatures, capable of adapting their trophic level at short notice to meet variable local ecological conditions. Some even consider the acquisition of these capabilities to be at the core of human evolution, including increased brain size (Potts, 1998; R. W. Wrangham et al., 1999). By seeking a "trophic level," we examine the possibility that, unlike 20th-century hunter-gatherers (HG), Paleolithic humans may not have been as flexible in the selection of plant or animal-sourced foods during the Pleistocene as one would infer from an examination of the ethnographic record.

The perception of humans' dietary flexibility regarding plant and animal-sourced foods during the Pleistocene (2,580,000–11,700 years ago) receives much support from analogy with 20th century HG's varied diets. The difference in preservation potential between plants and animals in archaeological assemblages has led to the wide use of ethnography in the reconstruction of Paleolithic diets (Cordain et al., 2000; Crittenden & Schnorr, 2017; Eaton & Konner, 1985; Konner & Eaton, 2010; Kuipers et al., 2012; Kuipers et al., 2010; Lee, 1968; F. W. Marlowe, 2005; Stahl et al., 1984; Ströhle & Hahn, 2011). All of these reconstructions present a picture of HG as flexible in their trophic level, depending largely on local ecologies. However, the varied diets of 20th century HG may result from post-Paleolithic technological and physiological adaptations to Anthropocene ecological conditions that are non-analogous to the conditions that humans experienced during most of the Pleistocene. In fact, with a markedly lower abundance of megafauna and with technological features like the use of dogs, bows and arrows, iron, and contact with neighboring herders and farmers, one would expect 20th century HG to be more analogous, in terms of dietary patterns, to their probable ancestors, terminal and post-Paleolithic humans, rather than to Lower, Middle, and even Early Upper Paleolithic humans (Ben-Dor & Barkai, 2020a; Faith et al., 2019).

The human trophic level (HTL) and its degree of variability during the Pleistocene are the basis, explicitly or tacitly, of many explanations regarding human evolution, behavior, and culture (Aiello & Wheeler, 1995; Bramble & Lieberman, 2004; Domínguez-Rodrigo & Pickering, 2017; Hawkes & Coxworth, 2013; Kaplan et al., 2007; Munro et al., 2004; Potts, 1998; M. C. Stiner, 2002; Ungar et al., 2006; R. W. Wrangham et al., 1999). For example, competing explanations for humans' extended longevity, the "grandmother hypothesis" (Hawkes & Coxworth, 2013) and the "embodied capital hypothesis" (Kaplan et al., 2009), are based on different assumptions about the relative dietary importance of gathered plants versus hunted animal food during human evolution. Potts (1998) regards the human ability to vary trophic levels in response to climate as critical to human evolution. HTL estimates also support hypotheses about health-promoting, evolutionarily compliant, contemporary diets (Eaton & Konner, 1985). A related question is whether humans evolved toward specialization (stenotopy) or generalization (eurytopy) (Wood & Strait, 2004).
Some have hypothesized that human survival and wide dispersal result from evolved dietary flexibility, that is, generalization (Potts, 1998; Ungar et al., 2006), while others (Arcadi, 2006; Vrba, 1980) attribute wide dispersal to carnivory. Humans have inarguably always been omnivores, feeding on more than one trophic level. However, omnivory in mammals is dissociated from dietary variability in terms of animal–plant food selection ratios. Omnivores exist on a wide range of trophic levels and have variable degrees of specialization (Johnston et al., 2011; Lefcheck et al., 2013; Pineda-Munoz & Alroy, 2014). Analysis of a large (N = 139) dataset of mammals' trophic levels (Pineda-Munoz & Alroy, 2014) shows that some 80% of the mammals in the dataset are omnivores, but most of the omnivores (75%) consume more than 70% of their food from either plants or animals, leaving only 20% of the mammals in the dataset as omnivore-generalists. Interestingly, while all 16 primates in the dataset are omnivores, 15 of the 16 are specialists. Bioenergetic and physiological constraints may limit the flexibility of omnivores. For example, both chimpanzees and wolves are technically omnivores yet are ill-adapted to high flexibility in their food sources. We intend to examine some of these constraints in humans to assess their impact on the human trophic level, specialization, and dietary flexibility during the Pleistocene.

Little systematic evolution-guided reconstruction of HTL has been published to date. Henneberg et al. (1998) cited the similarity of the human gut to that of carnivores, the preferential absorption of haem iron over iron of plant origin, and the exclusive use of humans as a carnivore host by Taenia saginata, a member of the Taeniidae family of carnivore parasites, as supporting Homo sapiens' adaptation to meat-eating. Mann (2000) pointed to gut structure and acidity, insulin resistance, and high diet quality as evidence of physiological adaptation to consuming lean meat during the Paleolithic. Hancock et al. (2010) explored the association of humans' specific genes with diet, subsistence, and ecoregions, seeking adaptations in recent populations. A notable evolution-based analysis of the degree of carnivory in early humans was performed by Foley (2001, p. 327). He composed a list of behavioral and physiological changes that would have been expected to occur in humans had they adopted carnivory. Foley was mainly interested in the initial shift to carnivory and found that Homo ergaster/erectus complies with many of the predicted changes, concluding that "…the lines of evidence and reasoning put forward here strongly suggest that meat-eating has played a significant role in the evolution of Homo, not just Homo sapiens."

Through natural selection, physiological adaptation to a specific, broader, or narrower food niche is the primary cause of observed biological diversity (Darwin, 1859). Attempting to mitigate the dominance of the analogy with 20th century HG ("the tyranny of ethnography" in the words of Lieberman et al., 2007) in reconstructing Paleolithic diets, we searched the literature on human metabolism and genetics, body morphology, dental morphology and pathology, and life history for signs of adaptations to HTLs, or of evolution toward dietary generalization or specialization.
We also reviewed relevant archaeological, paleontological, and zoological literature to identify changing patterns in fauna, flora, lithic industries, stable isotopes, and other geoarchaeological data, as well as human behavioral adaptations to carnivory or omnivory, that reflect past HTLs. We concentrate on the lineage leading to H. sapiens because a large part of our evidence comes from the biology of H. sapiens. There is little information about species like rudolfensis, ergaster, heidelbergensis, and possibly antecessor, which may have belonged to the H. sapiens lineage. We thus focused our attention on H. erectus (sensu lato) and H. sapiens, as many relevant sources refer mostly to these species. Because we relied on H. sapiens' biology, we do not deal with, and do not make any claims regarding, the trophic level of other Homo species that do not seem to have belonged to the lineage leading to H. sapiens, such as floresiensis, naledi, neanderthalensis, Denisovans, and luzonensis. Evolution-based information provides a longer-term view of HTL than that based on 20th century HG or the ever-localized and partial data from archaeological sites. This view should, in turn, be more relevant for explaining profound evolutionary physiological and behavioral phenomena in human prehistory. Below we discuss in some detail each piece of evidence. A summary for quick reference can be found in Table 2 at the beginning of the discussion section.

2 PHYSIOLOGICAL EVIDENCE

2.1 Bioenergetics

Compared with other primates, humans have a higher energy requirement for a given fat-free mass (Pontzer et al., 2016), and thus faced intense selective pressure to efficiently acquire adequate and consistent energy, especially to reliably energize the brain (Navarrete et al., 2011). Additionally, due to tool acquisition, prolonged child care, and education, humans need more time free from food acquisition than other animals (Foley & Elton, 1998). Animal-sourced calories are generally acquired more efficiently; carnivores, therefore, spend less time feeding than similar-sized herbivores (Shipman & Walker, 1989). For example, baboons (Papio cynocephalus) devote almost all their daylight hours to feeding (Milton, 1987, p. 103), while adult Ache and Hadza men spend only a third of the day in food acquisition, preparation, and feeding (Hawkes et al., 1997; Hill et al., 1985). Acquiring and consuming medium-size animals, at a return rate in the range of tens of thousands of calories per hour, is an order of magnitude more time-efficient than plant-gathering (Kelly, 2013, tables 3-3, 3-4). In other words, the price differences in "the supermarket of nature" were likely opposite to the price differences in the supermarkets of today. In nature, plant-sourced calories cost humans roughly 10 times the price of meat, when meat is available. Given limited time and energetic budgets, such a difference in energetic returns leaves little room for flexibility (also referred to as plasticity) in the selection of the two dietary components. Nonetheless, a background consumption of plants and smaller prey is expected when women gather and do not participate in hunts (see the Plants section for discussion). Also, differences in the relative availability of plants and animals affect the actual consumption. In particular, large animals are the highest-ranking food according to ethnographic data (Broughton et al., 2011).
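To make the order-of-magnitude comparison above concrete, here is a minimal Python sketch of the arithmetic. The daily energy budget and the specific return rates are illustrative assumptions chosen to fall within the ranges cited from Kelly (2013); they are not values taken from that source or from the paper.

```python
# Illustrative comparison of foraging return rates (kcal per forager-hour).
# The specific rates are assumptions chosen to fall within the ranges cited
# in the text (Kelly, 2013, tables 3-3 and 3-4); they are not measured values.

DAILY_BUDGET_KCAL = 2500  # assumed adult daily energy requirement

return_rates_kcal_per_hour = {
    "plant gathering": 1_500,                      # "several hundred to several thousand"
    "medium-size game (post-encounter)": 15_000,   # "tens of thousands"
}

for resource, rate in return_rates_kcal_per_hour.items():
    hours = DAILY_BUDGET_KCAL / rate
    print(f"{resource}: {rate:,} kcal/h -> {hours:.1f} h to cover {DAILY_BUDGET_KCAL:,} kcal")

ratio = (return_rates_kcal_per_hour["medium-size game (post-encounter)"]
         / return_rates_kcal_per_hour["plant gathering"])
print(f"Time-efficiency ratio (game vs. plants): ~{ratio:.0f}x")
```

Under these assumptions, a forager relying on medium-size game covers a day's energy budget in a fraction of an hour of post-encounter handling time, roughly ten times faster than by gathering.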
According to classic optimal foraging theory, an animal would specialize in the highest-ranking food type if the encounter rate is high enough (Futuyma & Moreno, 1988). Applied to humans, this means that they should have specialized in large prey if the encounter rates were high enough. Moreover, seasonal fluctuations in many plant species' availability may hinder their reliability as food for a significant portion of the year. In contrast, animals are always available, although with fluctuating fat content. Carnivory could, therefore, have been a more time-efficient and reliable caloric source. The relative abundance of large prey, and thus its encounter rate, relative to smaller prey and plants, was probably higher during most of the Pleistocene, at least before the Late Quaternary extinction of megafauna (see the Ethnography, Paleontology, and Zooarchaeology sections and Ben-Dor and Barkai (2020a) for references).

2.2 Diet quality

In relation to body size, brain size is strongly associated with dietary energetic density in primates and humans (Aiello & Wheeler, 1995; DeCasien et al., 2017; Leonard et al., 2007). Human brains are over three times larger than other primates' brains, and as such, human dietary energetic density should be very high. The most energy-dense macronutrient is fat (9.4 kcal/g), compared with protein (4.7 kcal/g) and carbohydrates (3.7 kcal/g) (Hall et al., 2015). Moreover, plant proteins and carbohydrates typically contain anti-nutrients, which function in plant growth and defense (Herms & Mattson, 1992; Stahl et al., 1984). These anti-nutrients, such as lectins or phytate, appear in the complex cellular plant matrix and fibers and limit full energetic utilization and nutrient absorption by humans (Hervik & Svihus, 2019; Schnorr et al., 2015). The most generous estimates from in vitro, human, and animal data suggest that well below 10% of total daily caloric needs can be met from fiber fermentation, and most likely below 4% (Hervik & Svihus, 2019; Høverstad, 1986; Topping & Clifton, 2001). Hence, the protein and fat mixture in animals would probably have provided higher energetic density, and therefore dietary quality. Brain size declined during the terminal Pleistocene and subsequent Holocene (Hawks, 2011; Henneberg, 1988), indicating a possible decline in diet quality (an increase in the plant component) at the end of the Pleistocene.

2.3 Higher fat reserves

Humans have much higher fat reserves than chimpanzees, our closest relatives (Zihlman & Bolter, 2015). Carrying additional fat has energy costs and reduces human speed in chasing prey or escaping predators (Pond, 1978). Most carnivores and herbivores do not have a high body fat percentage as, unlike humans, they rely on speed for predation or evasion (Owen-Smith, 2002, p. 143). Present-day HG (the Hadza) were found to have sufficient fat reserves for men and women to fast for three and six weeks, respectively (Pontzer et al., 2015). Humans seem very well adapted to lengthy fasting, during which fat provides the major portion of their calories (Cahill Jr & Owen, 1968). Rapid entry into ketosis (when the liver synthesizes ketones from fat) allows ketone bodies to replace glucose as an energy source in most organs, including the brain. During fasting, ketosis allows muscle-sparing by substantially decreasing the need for gluconeogenesis (the synthesis of glucose from protein), and humans enter ketosis relatively quickly.
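As a rough illustration of how fat reserves translate into fasting endurance, the sketch below runs the arithmetic under stated assumptions. Only the 9.4 kcal/g energy density of fat comes from the text (Hall et al., 2015); the fat mass, the usable fraction, and the daily expenditure are hypothetical figures, not data from Pontzer et al. (2015).

```python
# Back-of-envelope fasting endurance implied by body fat reserves.
# Only the 9.4 kcal/g energy density of fat comes from the text (Hall et al., 2015);
# the body-fat mass, usable fraction, and daily expenditure are assumptions.

KCAL_PER_G_FAT = 9.4
USABLE_FRACTION = 0.8  # assume not all stored fat is metabolically available

def fasting_days(fat_mass_kg: float, daily_expenditure_kcal: float) -> float:
    """Days of energy supplied by fat stores alone (ignoring protein catabolism)."""
    usable_kcal = fat_mass_kg * 1000 * KCAL_PER_G_FAT * USABLE_FRACTION
    return usable_kcal / daily_expenditure_kcal

days = fasting_days(fat_mass_kg=10, daily_expenditure_kcal=2200)
print(f"~{days:.0f} days (~{days / 7:.1f} weeks) covered by fat stores alone")
```

With these assumptions, roughly a month of energy needs is covered by fat stores alone, in line with the multi-week fasting capacity cited above.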
Dogs share similar digestive physiology and animal-rich dietary patterns with humans but do not enter ketosis quickly (Crandall, 1941). Indeed, dogs typically require a diet supplemented with medium-chain triglycerides to increase blood ketones to derive therapeutic benefit, but even then, they do not achieve deep physiological ketosis like humans (Packer et al., 2016). Cahill Jr (2006, p. 11) summarizes the evolutionary implications of humans' outstanding adaptation to ketosis: "brain use of βOHB [a ketone body], by displacing glucose as its major fuel, has allowed man to survive lengthy periods of starvation. But more importantly, it has permitted the brain to become the most significant component in human evolution." Rapid entry into ketosis has been found in the brown capuchin monkey (Friedemann, 1926), suggesting that this adaptation to fasting may have already existed in early Homo. Researchers who argue against a massive reliance on acquiring large animals during the Pleistocene mention their relative scarcity (Hawkes, 2016). However, besides the fact that large animals were more prevalent during the Pleistocene (Hempson et al., 2015), the ability to store large fat reserves and to more easily endure fasting may represent an adaptation enabling humans to endure extended periods between acquisitions of the less frequently encountered large animals.

2.4 Genetic and metabolic adaptation to high-fat diet

Swain-Lenz et al. (2019) performed comparative analyses of the adipose chromatin landscape in humans, chimpanzees, and rhesus macaques, concluding that their findings reflect differences in the adapted diets of humans and chimpanzees. They (p. 2004) write: "Taken together, these results suggest that humans shut down regions of the genome to accommodate a high-fat diet while chimpanzees open regions of the genome to accommodate a high sugar diet." Speth (1989) hypothesized that humans eating an animal-based diet would display an obligatory requirement for significant amounts of fat because they are limited in the amount of protein they can metabolize to energy. Dietary fat is also a macronutrient with priority storage within subcutaneous fat stores; this agrees with assumptions of adaptation to higher fat consumption. The ability to finely tune fat-burning is a prominent feature of human metabolism (Akkaoui et al., 2009; Mattson et al., 2018). The lipase enzyme plays a dominant role in fat storage and metabolism. Comparing the pace of genetic changes between humans and other primates, Vining and Nunn (2016) found that lipase production underwent substantial evolution in humans. Weyer and Pääbo (2016) found some indication of differences in both the regulation and activity of pancreatic lipase in modern humans compared with Neandertals and Denisovans. Given that Neandertals probably consumed a diet higher in meat and fat than anatomically modern humans, the latter were possibly adapting to lower fat consumption. These changes are also found in present-day humans, but there is no indication of how early they occurred in H. sapiens evolution. They could have resulted from a shift to a diet higher in plants in the period leading up to the adoption of agriculture, in which a marked increase in genetic changes is evident (Hawks et al., 2007). Additionally, storing larger fat reserves is a derived trait in humans, regardless of nutritional source (Pontzer, 2015). Thus, changes in the capacity to metabolize fat may, in part, be associated with metabolizing stored fat.
In humans, eating predominantly animal foods, especially fatty animal foods, promotes nutritional ketosis. This pattern provides generous amounts of bioavailable essential micronutrients with crucial roles in encephalization, such as zinc, heme iron, vitamin B12, and long-chain omega-3 and omega-6 fatty acids (DHA and arachidonic acid, respectively) (Cunnane & Crawford, 2003). Infants' brains meet all of their cholesterol needs in situ, with 30% to 70% of the required carbons being supplied by ketone bodies (Cunnane et al., 1999). Recently, nutritional ketosis has gained popularity as a possible therapeutic tool in many pathologies, such as diabetes, Alzheimer's disease, and cancer (Ludwig, 2019).

2.5 Omega 3 oils metabolism

Another aspect of fat metabolism is the hypothesis that the early human brain's enlargement was made possible through acquiring aquatic foods. Presumably, these foods were the only source of high amounts of docosahexaenoic acid (a long-chain omega-3 fatty acid; DHA) found in the expanding human brain (Crawford, 2010; Cunnane & Crawford, 2014; Kyriacou et al., 2016). In contrast, Cordain et al. (2002) argue that terrestrial animal organs contained sufficient DHA amounts for brain growth. Furthermore, Speth (2010, p. 135) proposed that humans biosynthesized sufficient DHA de novo from precursors. This last argument is compatible with the present existence of several billion people, including some HG, who have never eaten aquatic-sourced food, yet they and their offspring can grow and support much larger brains than early humans. A large part of this population does not consume high proportions of animal-derived food and practices multi-generational vegetarianism without cognitive decline (Crozier et al., 2019). Thus, an increased need for DHA to sustain larger brains cannot, by itself, support claims even for a terrestrial animal-based diet in early humans. Stable isotope analysis shows that at least some Neandertals did not consume many, if any, aquatic dietary resources (M. Richards & Trinkaus, 2009), though their brains were at least as large as those of modern humans. Mathias et al. (2012) identified a genetic change that occurred in African humans about 85 thousand years ago (Kya) in the fatty acid desaturase (FADS) family of genes, showing a marginal increase in the efficiency of converting plant-derived omega-3 fatty acids into DHA. This change may signify an increase in dietary plant components at that time in Africa. In Europe, however, a similar change took place only with the arrival of the Neolithic (Ye et al., 2017), suggesting that a plant-based diet was uncommon beforehand. Furthermore, tracer studies show that modern adult humans can only convert <5% of the plant-derived omega-3 polyunsaturated fatty acid alpha-linolenic acid (18:3Ω3, ALA) into the active, animal-derived docosahexaenoic acid (22:6Ω3, DHA) (Plourde & Cunnane, 2007). Ye et al. (2017) found that positive genetic selection on FADS in Europe took the opposite direction in HG groups in the period leading up to the Neolithic, possibly signifying increased reliance on aquatic foods. The pre-Neolithic surge in aquatic food exploitation is also supported by stable isotope analysis (see the section Isotopes and trace elements).

2.6 Late genetic adaptation to the consumption of underground storage organs

A recent adaptation to a high-starch diet may be postulated from a study by Hancock et al. (2010, table 4), which showed that populations presently dependent on roots and tubers (underground storage organs [USOs]) are enriched in single nucleotide polymorphisms (SNPs) associated with starch and sucrose metabolism and with folate synthesis, presumably compensating for the poor folate content of USOs. Another SNP in these populations may be involved in detoxifying plant glycosides, such as those in USOs (Graaf et al., 2001). Some researchers consider USOs ideal candidates for significant plant consumption by early humans (Dominy, 2012; B. L. Hardy, 2010; K. Hardy et al., 2016; Henry et al., 2014; R. W. Wrangham et al., 1999). If genetic adaptations to USO consumption are rather recent, this suggests that USOs did not previously comprise a large dietary component.

2.7 Stomach acidity

Beasley et al. (2015) emphasize the role of stomach acidity in protection against pathogens. They found that carnivore stomachs (average pH 2.2) are more acidic than those of omnivores (average pH 2.9) but less acidic than those of obligate scavengers (average pH 1.3). Human studies on gastric pH have consistently found a fasted pH value <2 (Dressman et al., 1990; Russell et al., 1993). According to Beasley et al. (2015), human stomachs have a high acidity level (pH 1.5), lying between those of obligate and facultative scavengers. Producing acidity, and maintaining stomach walls that can contain it, is energetically expensive; it would therefore presumably only evolve if pathogen levels in human diets were sufficiently high. The authors surmise that humans were more of a scavenger than previously thought. However, we should consider that the carnivorous activity of humans involved transporting meat to a central location (Isaac, 1978) and consuming it over several days or even weeks. Large animals, such as elephants and bison, presumably the preferred prey, and even smaller animals such as zebras, provide enough calories to sustain a 25-member HG group for days to weeks (Ben-Dor et al., 2011; Ben-Dor & Barkai, 2020b; Guil-Guerrero et al., 2018). Moreover, drying, fermentation, and deliberate putrefaction of meat and fat are commonly practiced among populations that rely on hunting for a large portion of their diet (Speth, 2017), and the pathogen load may consequently increase to a level encountered by scavengers.

2.8 Insulin resistance

Another hypothesis claiming a human genetic predisposition to a carnivorous, low-carbohydrate diet is the "Carnivore Connection." It postulates that humans, like carnivores, have low physiological (non-pathological) insulin sensitivity. This allows glucose to be prioritized toward tissues that depend entirely or largely on glucose, such as the central nervous system, erythrocytes, and testes, rather than muscles, which can rely on fatty acids and ketone bodies instead (Brand-Miller et al., 2011); insulin sensitivity is similarly low in carnivores (Schermerhorn, 2013). Brand-Miller et al. (2011) speculate that physiological insulin resistance allows humans on a low-carbohydrate diet to conserve blood glucose for the energy-hungry brain. The genetic manifestation of insulin resistance is complex and difficult to pinpoint to a limited number of genes (Moltke et al., 2014). However, Ségurel et al. (2013) found significantly higher insulin resistance (lower sensitivity) in a Central Asian population of historical herders (Kirghiz) compared with a population of past farmers (Tajiks), despite both groups consuming similar diets.
Their findings indicate a genetic predisposition to high physiological insulin resistance levels among groups consuming mainly animal-sourced foods. Additionally, a significant difference in the prevalence of this resistance exists between groups with long-term exposure to agriculture and those without it, such as Australian Aborigines, who show higher resistance. If higher physiological insulin resistance is indeed ancestral, its persistence suggests that high carbohydrate (starch, sugar) consumption was not prevalent.

2.9 Gut morphology

Most natural plant food items contain significant amounts of fiber (R. W. Wrangham et al., 1998), and most plant-eaters extract much of their energy from fiber fermentation by gut bacteria (McNeil, 1984), which in primates occurs in the colon. For example, a gorilla extracts some 60% of its energy from fiber (Popovich et al., 1997). The fruits that chimpanzees consume are also very fibrous (R. W. Wrangham et al., 1998). Relative to body size, the human colon is 77% smaller, and the small intestine 64% longer, than in chimpanzees (Aiello & Wheeler, 1995; calculated from Milton, 1987, table 3.2). Because of the smaller colon, humans can meet less than 10% of total caloric needs by fermenting fiber, with the most rigorous measures suggesting less than 4% (Hervik & Svihus, 2019; Høverstad, 1986; Topping & Clifton, 2001). A 77% reduction in human colon size points to a marked decline in the ability to extract the full energetic potential from many plant foods. The elongated small intestine is where sugars, proteins, and fats are absorbed. Sugars are absorbed faster in the small intestine than proteins and fats (Caspary, 1992; Johansson, 1974). Thus, increased protein and fat consumption should have placed a higher selective pressure on increasing small intestine length. A long small intestine relative to other gut parts is also a dominant morphological pattern in carnivore guts (Shipman & Walker, 1989, and references therein). This altered gut composition meets the specialization criteria proposed by Wood and Strait (2004) for adaptations that facilitate the acquisition of animal foods but hinder the acquisition of plant foods. A marked reduction in the chewing apparatus and a genetic change that reduced the jaw muscle bite force had already appeared 2–1.5 million years ago (Mya) (Lucas et al., 2006). A smaller mandibular-dental complex points to a smaller gut (Lucas et al., 2009); therefore, the carnivorous gut structure may have already been present in H. erectus.

2.10 Reduced mastication and the cooking hypothesis

Together with the whole masticatory system, teeth should closely reflect the physical form of the diet, because masticatory action is repeated thousands of times each day and is thus under continuous pressure to adjust to efficient dietary processing (Lucas et al., 2009). One of Homo's main derived features is the reduced relative size of the masticatory apparatus components (Aiello & Wheeler, 1995). This reduction is associated with a substantially decreased chewing duration (approximately 5% of daily human activity, compared with 48% in chimpanzees), starting with H. erectus 1.9 Mya (Organ et al., 2011). The reduced masticatory system size in H. erectus, together with the reduced feeding duration, has been attributed to an increased dietary meat proportion and the availability of stone tools (Aiello & Wheeler, 1995; Zink & Lieberman, 2016), a high proportion of dietary fat (Ben-Dor et al., 2011), or the introduction of cooking early in Homo evolution (R. Wrangham, 2017).
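To put the chewing-time figures just cited (approximately 5% of daily activity in humans versus 48% in chimpanzees; Organ et al., 2011) into more tangible units, the short sketch below converts them into hours per day. The 12-hour active day is an assumption made here for illustration only.

```python
# Converting the chewing-time percentages cited above into hours per day.
# The 5% (humans) and 48% (chimpanzees) shares are the figures cited from
# Organ et al. (2011); the 12-hour active day is an assumption for illustration.

ACTIVE_HOURS_PER_DAY = 12

chewing_share = {"humans": 0.05, "chimpanzees": 0.48}

for species, share in chewing_share.items():
    hours = share * ACTIVE_HOURS_PER_DAY
    print(f"{species}: {share:.0%} of the active day ~ {hours:.1f} h of chewing")
```

Under that assumption, the difference amounts to roughly 0.6 versus nearly 6 hours of chewing per day.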
We consider the cooking of plants a possible but less likely explanation for the reduction in mastication, since most researchers date the habitual and controlled use of fire to over a million years after the appearance of H. erectus (Barkai et al., 2017; Gowlett, 2016; Roebroeks & Villa, 2011; Shahack-Gross et al., 2014; Shimelmitz et al., 2014); but see R. Wrangham (2017). Habitual use of fire seems to have appeared with post-H. erectus species and so may signal increased plant consumption in these species. It should also be noted that although fire was undoubtedly used for cooking plants and meat, fire has many non-cooking uses for humans (Mallol et al., 2007), including protection from predation, a significant danger in savanna landscapes (Shultz et al., 2012). Also, fire maintenance has bioenergetic costs (Henry, 2017), and in some environments, sufficient wood may not be available to maintain fire (Dibble et al., 2018). While the contribution of cooking to the consumption of plants is not contested, cooking also contributes to the consumption of meat. There is no archaeological indication of a net quantitative contribution of cooking to the HTL. We, however, assume that cooking signals a somewhat higher consumption of plants.

2.11 Postcranial morphology

Several derived postcranial morphological phenotypes of humans are interpreted as adaptations to carnivory. Ecologically, body size is related to trophic strategy. Researchers have attributed the increase in body size in Homo to carnivory (Churchill et al., 2012; Foley, 2001; T. Holliday, 2012). A recent body size analysis shows that H. erectus evolved a larger body size than earlier hominins (S. C. Antón et al., 2014; Grabowski et al., 2015). At the same time, larger body size reduces competitiveness in arboreal locomotion and hence in fruit gathering. It is interesting to note that in Africa, humans' body size reached a peak in the Middle Pleistocene, and H. sapiens may have been smaller than its predecessors (Churchill et al., 2012). Since carnivore size is correlated with prey size (Gittleman & Harvey, 1982), this development ties in well with an apparent decline in prey size in the Middle Stone Age (MSA) of East Africa (Potts et al., 2018). A similar decrease in body size was identified in the Late Upper Paleolithic and Mesolithic (Formicola & Giannecchini, 1999; Frayer, 1981), also with a concomitant decline in prey size following the Late Quaternary Megafauna Extinction (Barnosky et al., 2004). A series of adaptations to endurance running was already present in H. erectus, presumably to enable "persistence hunting" (Bramble & Lieberman, 2004; Hora et al., 2020; Pontzer, 2017). A recent genetic experiment concerning exon deletion in the CMP-Neu5Ac hydroxylase (CMAH) gene in mice led Okerblom et al. (2018) to propose that humans, in whom the deletion was already fixed at 2 Mya, had already acquired higher endurance capabilities at that time. Whether this endurance was used for hunting, scavenging, or another unknown activity early in human evolution is debated (Lieberman et al., 2007; Pickering & Bunn, 2007; Steudel-Numbers & Wall-Scheffler, 2009). Comparing the Early Stone Age sites of Kanjera South and FLK-Zinj, Oliver et al. (2019) suggested that different ecological conditions required different hunting strategies, either cursorial (suitable for persistence hunting) or ambush, which is more appropriate for a woodland-intensive landscape.
Some endurance running adaptations may also suggest adaptation to increased mobility in hot weather conditions, as expected of carnivores, given their relatively large home ranges (Gittleman & Harvey, 1982). Another feature associated with hunting in the early stages of human evolution is the adaptation of the shoulder to a spear-throwing action, already present in H. erectus (Churchill & Rhodes, 2009; J. Kuhn, 2015; Roach et al., 2013; Roach & Richmond, 2015). Young et al. (2015) and Feuerriegel et al. (2017) argue that this adaptation came at the expense of a reduced ability to use arboreal niches, meeting the criteria proposed by Wood and Strait (2004) for compelling morphological evidence of evolution toward carnivorous stenotopy.

2.12 Adipocyte morphology

Ruminants and carnivores, which absorb very little glucose directly from the gut, have four times as many adipocytes per unit of adipose tissue weight as non-ruminants, including primates, which rely on a larger proportion of carbohydrates in their diet (Pond & Mattacks, 1985). The authors hypothesize that this is related to the relative role of insulin in regulating blood glucose levels. Interestingly, omnivorous species of the order Carnivora (bears, badgers, foxes, voles) display more carnivorous patterns than their diet entails. Thus humans might also be expected to display an adipocyte organization closer to their omnivorous phylogenetic ancestry. However, humans fall squarely within the carnivore adipocyte morphology pattern of smaller, more numerous cells. Pond and Mattacks (1985, p. 191) summarize their findings as follows: "These figures suggest that the energy metabolism of humans is adapted to a diet in which lipids and proteins rather than carbohydrates, make a major contribution to the energy supply."

2.13 Age at weaning

Humans have a substantially different life history than other primates (Robson & Wood, 2008), and life history is a highly indicative speciation measure. One life history variable in which humans differ significantly from all primates is weaning age. In primates such as orangutans, gorillas, and chimpanzees, weaning age ranges between 4.5 and 7.7 years, but it is much lower in humans in HG societies, at 2.5–2.8 years, despite the long infant dependency period (Kennedy, 2005; Robson & Wood, 2008, table 2). Psouni, Janke, and Garwicz (2012, p. 1) found that an early weaning age is strongly associated with carnivory level, stating that their findings "highlight the emergence of carnivory as a process fundamentally determining human evolution." It is interesting, however, that a comparison of early Homo, Australopithecus africanus, and Paranthropus robustus from South Africa reveals a substantially higher weaning age (4 years) in South African early Homo (Tacail et al., 2019), so it is unclear when the weaning age shortened.

2.14 Longevity

Longevity is another life history variable in which humans differ markedly from great apes. While the modal age at death in chimpanzees is 15 years, in 20th century HG it occurs in the sixth and seventh decades (Gurven & Kaplan, 2007, table 4). There is no argument that longevity extension began with early Homo, although disagreement exists regarding the pace of change. Caspari and Lee (2004) argue for an accelerated extension of longevity in H. sapiens, while others, such as Hawkes and Coxworth (2013), argue for an earlier extension. Two hypotheses attempt to explain life extension in humans; the disparity lies in different perceptions regarding HTL during evolution.
Hawkes and Coxworth (2013) support the "grandmother hypothesis," by which the longevity of grandmothers (post-menopausal females) enables sufficient plant food to be collected for infants, whose slower development in comparison with other primates necessitates extended care. The authors (see also Hawkes et al., 2018) base this argument on Hadza dietary patterns, in which gathering by females contributes a large portion of food calories, to demonstrate the marked effect of infant care by grandmothers in releasing their daughters' time for gathering food. As discussed in the Plants section, females may have contributed to hunting as well as gathering. Kaplan et al. (2000) (see also Kaplan et al., 2007) rely on a diet dominated by animal sources, as in other 20th century HG groups, for example, the Ache. They propose that hunting experience, which fully develops at around 40 years, is crucial to group survival by enabling acquisition of the surplus calories needed to feed less productive, younger group members. The importance of hunting experience presumably caused longevity extension in humans. The problem of using the Hadza, with their different ecology and iron-based material culture, as a model for evolutionary dietary patterns is discussed in the Ethnography section. However, it is interesting that even in the Hadza, peak food-acquisition productivity is reached after age 40 in both sexes (F. Marlowe, 2010, fig. 5.11). In summary, extended human longevity suggests that a need for efficient calorie acquisition, to maintain both self and siblings through an extended dependency period, was a dominant driving force in human evolution. The two hypotheses are not necessarily mutually exclusive.

2.15 Vitamins

Hockett and Haws (2003) developed a hypothesis that one of the basic tenets of the ancestral human diet was its diversity. They based this on research findings that present-day diets emphasizing diversity improve overall health, lowering infant mortality rates and increasing average life expectancy. Presumably, the wide range of vitamins and minerals associated with diverse diets is advantageous. The relevance to the Paleolithic of the findings cited by Hockett and Haws (2003) is questionable, as they relate to modern societies consuming an agricultural, mostly domesticated plant-based diet with declining nutritional value (Davis, 2009). Diversification may, in this case, confer benefit due to the mineral and vitamin accumulation derived from consuming multiple plant types, each of which individually has a narrower range of beneficial contents. Diversity can also refer to portion increases and animal variety in the diet. Kenyan schoolchildren on a high-plant diet receiving meat supplementation showed improved growth, cognitive, and behavioral outcomes (Neumann et al., 2007). Hockett and Haws (2003, table 1) list the key vitamin content in 100 g of plants compared with 100 g of various animal foods (vitamins C, E, D [cholecalciferol], A [retinol & β-carotene], B1 [thiamin], B2 [riboflavin], B3 [niacin], B6 [pyridoxine or pyridoxal], B9 [folate or folic acid], B12 [cobalamin], and iron [heme & non-heme iron]). Comparison of vitamin density (per 100 calories) between terrestrial mammals and plants shows that, for eight of the ten vitamins, terrestrial mammal food is denser, and in most cases several times denser, than plants. If we consider factors like bioavailability and active nutrient forms, then animal foods appear even more nutritious (Frossard et al., 2000).
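The per-100-kcal comparison mentioned above can be reproduced with a simple conversion from per-100-g values. The sketch below shows the calculation; the food items and numbers are hypothetical placeholders rather than figures from Hockett and Haws (2003, table 1), apart from the fact that plant foods contain essentially no vitamin B12.

```python
# Converting nutrient content per 100 g into nutrient density per 100 kcal,
# the comparison basis used above. The foods and numbers are hypothetical
# placeholders, not values from Hockett and Haws (2003, table 1).

def per_100_kcal(amount_per_100g: float, kcal_per_100g: float) -> float:
    """Nutrient amount supplied per 100 kcal of a food."""
    return amount_per_100g * 100 / kcal_per_100g

foods = {
    # name: (vitamin B12 in micrograms per 100 g, kcal per 100 g) - illustrative
    "lean game meat": (2.0, 150),
    "starchy tuber": (0.0, 90),
    "leafy greens": (0.0, 35),
}

for name, (b12_ug, kcal) in foods.items():
    print(f"{name}: {per_100_kcal(b12_ug, kcal):.2f} ug B12 per 100 kcal")
```

The same conversion can be applied to any nutrient for which per-100-g content and caloric density are known, which is how a per-weight table can be re-expressed on a per-calorie basis.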
This result is unsurprising given that humans are also terrestrial mammals, containing the same chemicals as other terrestrial mammals and requiring mostly the same vitamins. Plant food is denser in vitamins E and C. It is well known, however, that scurvy did not affect Polar societies despite lower levels of dietary plant components (Draper, 1977; Fediuk, 2000; Thomas, 1927). Western individuals who lived among Polar populations for several years also showed no signs of vitamin shortage (Stefansson, 1960, p. 171). Further controlled monitoring in the United States of two of these individuals, who consumed meat exclusively for a year, revealed no adverse clinical symptoms (McClellan & Du Bois, 1930). According to the glucose–ascorbate antagonism (GAA) hypothesis (Hamel et al., 1986), the structural similarity between glucose and vitamin C means the two molecules compete to enter cells via the same transport system (Wilson, 2005). Thus, higher requirements for vitamin C in Western populations may result from higher consumption of carbohydrates and consequently higher blood glucose levels. Two clinical studies comparing diabetic and non-diabetic patients showed, as predicted by the GAA hypothesis, that diabetic patients with higher blood glucose levels have decreased plasma ascorbic acid levels (Cunningham et al., 1991; Fadupin et al., 2007). Dietary vitamin C requirements can be lowered in multiple ways in the context of very-low-carbohydrate diets high in animal sources, which can affect metabolism such that the oxaloacetate to acetyl-CoA ratio drops below one, stimulating ketogenesis and, in turn, increasing mitochondrial glutathione levels (Jarrett et al., 2008). More glutathione means more of the enzyme glutathione reductase to recycle dehydroascorbic acid (oxidized vitamin C) into ascorbic acid (reduced vitamin C). Ketogenic diets can also increase uric acid, the major antioxidant in human serum, putatively sparing vitamin C in its antioxidant capacity (Nazarewicz et al., 2007; Sevanian et al., 1985) and conserving it for other tasks. For instance, animal foods also provide generous amounts of carnitine, meaning that less vitamin C is needed to synthesize carnitine (a process to which vitamin C is crucial) (Longo et al., 2016). Therefore, the evidence does not support the hypothesis of Hockett and Haws (2005). There is little argument that Paleolithic diets were higher in plants than recent Polar diets and thus did include some plant-derived vitamin C. Nevertheless, animal-sourced foods provide essential micronutrients in their active forms, which plants do not, such as vitamin A (retinol), vitamin K (K2 menaquinones), vitamin B9 (folate), vitamin B12 (cobalamin), vitamin B6 (pyridoxine), vitamin D (cholecalciferol), iron (heme iron), and omega-3 (EPA and DHA). Animal foods are not only qualitatively but also quantitatively superior to plant foods, as determined by measures of nutrient density.

2.16 AMY1 gene

Although starch consumption is evident throughout the Pleistocene (K. Hardy, 2018), its relative importance is difficult to elucidate from the archaeological record. Salivary amylase is an enzyme that degrades starch into glucose in preparation for cell energy metabolism, and Vining and Nunn (2016) discerned a significant evolution in amylase-producing genes in Homo species but could not determine the temporal dynamics. Initially, a higher number of copies (from 2 to 15) of the salivary amylase-producing gene AMY1 was found in modern populations consuming high-starch diets (G. Perry et al., 2007). The few Neandertal and Denisovan genetic samples have only two AMY1 copies (G. H. Perry et al., 2015), similar to chimpanzees, which consume little starch. G. H. Perry et al. (2015) conclude that the common ancestor of Neandertals and H. sapiens, some 500–600 Kya, also had only two copies, a conclusion supported by Inchley et al. (2016), who surmised that the appearance of multi-copy AMY1 genes in H. sapiens probably occurred quite early after the split from the common ancestor. Several studies have hypothesized that people with a low number of AMY1 copies eating a high-starch diet would suffer from increased rates of obesity and diabetes but failed to find supporting evidence (Atkinson et al., 2018; Des Gachons & Breslin, 2016; Falchi et al., 2014; Fernández & Wiley, 2017; Mejía-Benítez et al., 2015; Yong et al., 2016). Usher et al. (2015) explain that when lower-precision molecular methods are avoided, not even a nominal association between obesity and the copy number of any amylase gene can be observed (p = 0.7). In summary, more research is needed to verify the functional role of salivary amylase in starch metabolism and the timing of the appearance of multiple copies of AMY1.

3 ARCHAEOLOGICAL EVIDENCE

3.1 Plants

Archaeobotanical remains, lithic use-wear, residue analyses on lithic/flint tools, and tooth plaque are often used to elucidate human consumption of specific plant food items at the site level (K. Hardy et al., 2016; Henry et al., 2011; Lemorini et al., 2006; Venditti et al., 2019; Melamed et al., 2016). The acquisition of plants requires little use of stone tools and thus is prone to leave fewer artifacts at archaeological sites (F. Marlowe, 2010). Wooden digging sticks are used in ethnographic contexts to extract tubers and are sometimes found in archaeological contexts (Aranguren et al., 2018; Vincent, 1985), but they also preserve poorly. Despite the poor discovery potential, a review of relevant studies of plant consumption in the Lower and Middle Paleolithic (K. Hardy, 2018, table 1) paints a picture of widespread consumption of a wide range of plants. Unfortunately, similarly to the vast number of studies of bones with cut marks in countless sites, these studies cannot provide quantitative information for evaluating even a localized HTL, let alone a global one. According to ethnographic data (Kelly, 2013, tables 3-3, 3-4), the energetic return on plant gathering is in the order of several hundred to several thousand calories per hour, while the return on medium-size animals is in the tens of thousands of calories per hour; presumably, therefore, gathering should have been minimal. However, humans are unique in their division of labor, in that the ethnographic record shows that females and males may target different foods that they eventually share. Ethnography is a convincing source of evidence for females as plant gatherers. The lower risk of gathering is compatible with females' role in child care and thus lends credence to this interpretation of the nature of the division of labor in humans. Further, it is proposed, based on ethnographic data, that female longevity evolved to enable grandmothers to contribute gathered plant food beyond their own needs (Hawkes & Coxworth, 2013). However, several scholars have theorized that such a division of labor would have been less pronounced or even altogether absent among Paleolithic HG (Haas et al., 2020 and references therein).
As discussed in the Ethnography section, the different ecological conditions of the Pleistocene may also have affected the extent to which females gathered plants during the Paleolithic, compared with the ethnographic present. There is evidence for female hunting in ethnographic and Paleolithic contexts (D. W. Bird et al., 2013; R. B. Bird & Power, 2015; Haas et al., 2020; Khorasani & Lee, 2020; Noss & Hewlett, 2001; Waguespack, 2005). Especially relevant to the Paleolithic is the potential participation of females in large game hunting (Brink, 2008; Haas et al., 2020), for example, in driving large animals into a trap where their movement can be curtailed (Churchill, 1993) or toward waiting male hunters (Waguespack, 2005), methods that provide the opportunity for communal hunting (Lee, 1979, p. 234). Also, females tend to perform more non-subsistence activities in highly carnivorous human groups (Waguespack, 2005). Following the extinction of large animals in the Upper Paleolithic (UP) and the Anthropocene, such alternative forms of the division of labor may have become less relevant. It may be that we see here, as in other aspects of the ethnographic subsistence analogy, an increase in plant food extraction by females as an adaptation to the decline in prey size and the resulting increase in the relative abundance of plant food (Ben-Dor & Barkai, 2020a; Waguespack, 2005).

Ancient dental plaque has recently gained attention as a source of dietary information, as it enables the identification of plant particles accumulated in plaque (K. Hardy et al., 2012, 2016, 2017; Henry et al., 2011, 2014; Henry & Piperno, 2008). All studies, including those of the earliest-studied population of Sima del Elefante (1.2 Mya) (K. Hardy et al., 2017) and of Qesem Cave, Israel (400 Kya) (K. Hardy et al., 2016), identified plant remains in tooth plaque and even evidence for cooked starch (but see García-Granero, 2020). Out of 31 dental calculus analysis cases listed in K. Hardy et al. (2018, table 1), only four did not show starch residues. The assemblage of cases suggests that consumption of plants was common, although one has to take into account that the consumption of starch encourages dental plaque formation (Brown, 1975; Scannapieco et al., 1993) and that we do not know the percentage of teeth that were not sampled because they had no calculus. But even if we had these data, the identification of plants in dental calculus cannot tell us what the relative plant consumption of the individual was. We can, however, summarize that the archaeological and ethnographic record shows that plant foods were a frequent component of the Paleolithic diet.

3.2 Stone tools

Although the study of stone tools is a cornerstone of prehistoric research, its potential to inform a quantitative value, such as HTL, is frustratingly limited. There is plenty of evidence of stone-tool associations with meat and fat processing (Barkai et al., 2010; Lemorini et al., 2006; Lemorini et al., 2015; Nowell et al., 2016; Solodenko et al., 2015; M. C. Stiner, 2002; Toth & Schick, 2019; Venditti et al., 2019). Recently, however, stone-tool use for plant processing in the early Paleolithic has also been detected (Arroyo & de la Torre, 2016; Lemorini et al., 2014). S. L. Kuhn and Stiner (2001) review the increased frequency of stone tools attributable to plant processing in Near Eastern sites toward the end of the Pleistocene, the Epipaleolithic, and the Pre-Pottery Neolithic.
Lithic tools specific to plant processing, such as sickle blades and pounding and grinding stones, appear late in the Pleistocene, both in Southern Asia (Bar-Yosef, 1989, 2002, 2014) and in Europe (S. L. Kuhn & Stiner, 2001; Stepanova, 2019). Sickle blades and grain-grinding stone tools appear in the Levant in the early Epipaleolithic at Ohalo, some 23 Kya, and at earlier UP sites, but become widespread only during the Natufian, 15.0–11.6 Kya (Groman-Yaroslavski et al., 2016). Their frequency increases further toward the Neolithic (Bar-Yosef, 1989; Wright, 1994). A relatively high intensity of grinding tool use is also found among plant-dependent 20th century HG groups (S. L. Kuhn & Stiner, 2001). The dearth or complete absence of similar findings during earlier periods, such as the Middle Paleolithic (MP), can be interpreted as indicating lower plant consumption than in the UP and later periods. A similar trend is found in Europe, where grinding stones first appear sporadically in the Early UP/Aurignacian, but it is from the advent of later cultures, such as the Gravettian and Magdalenian, that these tools become more frequent (Aranguren et al., 2007; S. L. Kuhn & Stiner, 2001; Revedin et al., 2010; M. C. Stiner, 2002). In a multi-dimensional analysis of the Eurasian archaeological record, M. C. Stiner (2002) found a significant HTL decline by the Late UP. Grinding tools appear in Africa in the MSA, much earlier than elsewhere. However, they are mostly associated with pigment grinding (McBrearty & Brooks, 2000). Increases in plant-processing tools are less conspicuous in Africa than in Europe and the Levant (Barton et al., 2016; Bar-Yosef, 2002; Clark, 1997) and include, in addition to MSA grinding stones, perforated stones that may have served as digging stick weights (Villa et al., 2012). Clark (1997) and Villa et al. (2012) indicate that hunting and cutting tools dominated the tool innovation list of the Later Stone Age (LSA) of Africa. Thus, there is little lithic evidence of increased plant consumption in Africa, at least during the early LSA. In China, increased evidence for plant consumption seems to follow the same timeframe as the European–Levantine UP (Guan et al., 2014; Liu & Chen, 2012, pp. 46–57). In summary, lithic analysis provides evidence for a gradual but significant increase in plant consumption during the UP, especially at its end (Power & Williams, 2018), in Europe and Asia, and thus a decline in HTL in the late UP and, consequently, a higher HTL beforehand.

3.3 Zooarchaeology
