Friday, July 5, 2024

A human without gender is an it / Our grammar tells us so.

Our Lévi-Straussian deep structure tells us that to lack gender is to become an it.



I'm not fixated. I'm continuing the centuries-long struggle against male supremacy (millennia long) and white supremacy (500 years long).

The evolution of the human trophic level during the Pleistocene, by Miki Ben-Dor, Raphael Sirtoli, and Ran Barkai

https://onlinelibrary.wiley.com/doi/10.1002/ajpa.24247

INTRODUCTION “The first task of the prehistorian must be to decide which trophic level the population he is studying occupied” (Wilkinson, 2014, p. 544). Despite Wilkinson's advice, few researchers referred to past human food consumption in terms of a “trophic level.” This tendency may stem from the perception of humans as the ultimate omnivore, generalist, flexible creatures, capable of adapting their trophic level at short notice to meet variable local ecological conditions. Some even consider acquiring these capabilities as the core of human evolution, including increased brain size (Potts, 1998; R. W. Wrangham et al., 1999). By seeking a “trophic level,” we examine the possibility that unlike 20th-century hunter-gatherers (HG), Paleolithic humans may not have been as flexible in the selection of plant or animal-sourced foods during the Pleistocene as one would infer from an examination of the ethnographic record. Perception of humans' dietary flexibility regarding plant and animal-sourced foods during the Pleistocene (2,580,000–11,700 years ago) receives much support from analogy with 20th century HG's varied diets. The difference in preservation potential between plants and animals in archaeological assemblages has led to the wide use of ethnography in the reconstruction of Paleolithic diets (Cordain et al., 2000; Crittenden & Schnorr, 2017; Eaton & Konner, 1985; Konner & Eaton, 2010; Kuipers et al., 2012; Kuipers et al., 2010; Lee, 1968; F. W. Marlowe, 2005; Stahl et al., 1984; Ströhle & Hahn, 2011). All of the reconstructions present a picture of HG as flexible in their trophic level, depending largely on local ecologies. However, the varied diets of 20th century HG may result from post-Paleolithic technological and physiological adaptations to Anthropocene ecological conditions that are non-analogous to the conditions that humans experienced during most of the Pleistocene. In fact, with a markedly lower abundance of megafauna and with technological features like the use of dogs, bows and arrows, iron, and contact with neighboring herders and farmers, one would expect 20th century HG to be more analogous in terms of dietary patterns to their probable ancestors of the terminal and post-Paleolithic humans rather than to Lower, Middle and even Early Upper Paleolithic humans (Ben-Dor & Barkai, 2020a; Faith et al., 2019). The human trophic level (HTL) and its degree of variability during the Pleistocene are the basis, explicitly or tacitly, of many explanations regarding human evolution, behavior, and culture (Aiello & Wheeler, 1995; Bramble & Lieberman, 2004; Domínguez-Rodrigo & Pickering, 2017; Hawkes & Coxworth, 2013; Kaplan et al., 2007; Munro et al., 2004; Potts, 1998; M. C. Stiner, 2002; Ungar et al., 2006; R. W. Wrangham et al., 1999). For example, competing explanations for humans' extended longevity, the “grandmother hypothesis” (Hawkes & Coxworth, 2013), and “embodied capital hypothesis” (Kaplan et al., 2009), are based on different assumptions of the relative dietary importance of gathered plants versus hunted animal food during human evolution. Potts (1998) assigns the human ability to vary trophic levels in response to climate as critical to human evolution. HTL estimates also support hypotheses about health-promoting, evolutionarily compliant, contemporary diets (Eaton & Konner, 1985). A related question is whether humans evolved toward specialization (stenotopy) or generalization (eurytopy) (Wood & Strait, 2004). 
Some have hypothesized that human survival and wide dispersal result from evolved dietary flexibility, that is, generalization (Potts, 1998; Ungar et al., 2006), while others (Arcadi, 2006; Vrba, 1980) attribute wide dispersal to carnivory. Humans have inarguably always been omnivores, feeding on more than one trophic level. However, omnivory in mammals is disassociated with dietary variability in terms of animal–plant food selection ratios. Omnivores exist on a wide range of trophic levels and have variable degrees of specialization (Johnston et al., 2011; Lefcheck et al., 2013; Pineda-Munoz & Alroy, 2014). Analysis of a large (N = 139) dataset of mammals' trophic levels (Pineda-Munoz & Alroy, 2014) shows that some 80% of the mammals in the dataset are omnivores, but most of the omnivores (75%) consume more than 70% of their food from either plants or animals, leaving only 20% of the mammals in the dataset to be omnivores-generalists. Interestingly, while all the 16 primates in the dataset are omnivores, 15 of the 16 are specialists. Bioenergetic and physiological constraints may limit the flexibility of omnivores. For example, both chimpanzees and wolves are technically omnivores yet are ill-adapted to high flexibility in their food sources. We intend to examine some of these constraints in humans to assert their impact on the human's trophic level, specialization, and dietary flexibility during the Pleistocene. Little systematic evolution-guided reconstruction of HTL has been published to date. Henneberg et al. (1998) cited the similarity of the human gut to that of carnivores, the preferential absorption of haem rather than iron of plant origins, and the exclusive use of humans as a carnivore host by the Taenia saginata, a member of the Taeniidea family of carnivores' parasites, as supporting Homo sapiens' adaptation to meat-eating. Mann (2000) Pointed to gut structure and acidity, insulin resistance, and high diet quality as evidence of physiological adaptation to consuming lean meat during the Paleolithic. Hancock et al. (2010) explored the association of humans' specific genes with diet, subsistence, and ecoregions, seeking adaptations in recent populations. A notable evolution-based analysis of the degree of carnivory in early humans was performed by Foley (2001, p. 327). He composed a list of behavioral and physiological changes that would have been expected to occur in humans had they adopted carnivory. Foley was mainly interested in the initial shift to carnivory and found that Homo ergaster/erectus comply with many of the predicted changes, concluding that “…the lines of evidence and reasoning put forward here strongly suggest that meat-eating has played a significant role in the evolution of Homo, not just Homo sapiens.” Through natural selection, physiological adaptation to a specific, broader, or narrower food niche is the primary cause of observed biological diversity (Darwin, 1859). Attempting to mitigate the dominance of the analogy with 20th century HG (“the tyranny of ethnography” in the words of Lieberman et al., 2007) in reconstructing Paleolithic diets, we searched the literature on human metabolism and genetics, body morphology, dental morphology and pathology, and life history for signs of adaptations to HTLs, or evolution toward dietary generalization or specialization. 
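As a quick check of the arithmetic behind the Pineda-Munoz and Alroy dataset figures quoted above, here is a minimal sketch in Python. The percentages are the rounded values cited in the text, not the raw dataset, so the counts are only approximate.

```python
# Rounded figures quoted above from Pineda-Munoz and Alroy (2014): N = 139 mammals.
n_mammals = 139
share_omnivores = 0.80                     # ~80% of mammals in the dataset are omnivores
share_specialists_among_omnivores = 0.75   # ~75% of those omnivores take >70% of food from one source

omnivores = share_omnivores * n_mammals
generalist_omnivores = omnivores * (1 - share_specialists_among_omnivores)

print(f"omnivores: ~{omnivores:.0f} of {n_mammals}")
print(f"omnivore-generalists: ~{generalist_omnivores:.0f} "
      f"({generalist_omnivores / n_mammals:.0%} of the dataset)")
# ~111 omnivores and ~28 omnivore-generalists, i.e. roughly 20% of the dataset,
# matching the figure quoted in the text.
```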
We also reviewed relevant archaeological, paleontological, and zoological literature to identify changing patterns in fauna, flora, lithic industries, stable isotopes, and other geoarchaeological data, as well as human behavioral adaptations to carnivory or omnivory, that reflected past HTLs. We concentrate on the lineage leading to H. sapiens because a large part of our evidence comes from the biology of H. sapiens. There is little information about species like rudolfensis, ergaster, heidlebergensis, and possibly antecessor, which may have belonged to the H. sapiens lineage. We thus focused our attention on H. erectus (sensu lato) and H. sapiens, as many relevant sources refer mostly to these species. Because we relied on H. sapiens' biology, we do not deal and do not make any claims regarding the trophic level of other Homo species that do not seem to have belonged to the lineage leading to H. sapiens like floresiensis, naledi, neanderthalensis, Denisovan, and luzonensis. Evolution-based information provides a longer-term view of HTL than that based on 20th century HG, or the ever-localized and partial data from archaeological sites. This view should, in turn, be more relevant to explain profound evolutionary physiological and behavioral phenomena in human prehistory. Below we discuss in some detail each piece of evidence. A summary for quick reference can be found in Table 2 at the beginning of the discussion section. 2 PHYSIOLOGICAL EVIDENCE 2.1 Bioenergetics Compared with other primates, humans have a higher energy requirement for a given fat-free mass (Pontzer et al., 2016), and thus faced intense selective pressure to efficiently acquire adequate and consistent energy, especially to reliably energize the brain (Navarrete et al., 2011). Additionally, due to tool acquisition, prolonged child care, and education, humans need more time free from food acquisition than other animals (Foley & Elton, 1998). Animal-sourced calories are generally acquired more efficiently; carnivores, therefore, spend less time feeding than similar-sized herbivores (Shipman & Walker, 1989). For example, baboons (Papio cynocephalus) devote almost all their daylight hours to feeding (Milton, 1987, p. 103) while adult Ache and Hadza men spend only a third of the day in food acquisition, preparation, and feeding (Hawkes et al., 1997; Hill et al., 1985). Acquiring and consuming medium size animals, at a return rate in the range of tens of thousands of calories per hour, is an order of magnitude more time-efficient than plant-gathering (Kelly, 2013, table 3-3, 3-4). In other words, the price differences in “the supermarket of nature” were likely opposite to the price differences in the supermarkets of today. In nature, for humans, plant-sourced calories cost 10 times the price of meat if it is available. Given limited time and energetic budgets, such a difference in energetic returns leaves little room for flexibility (also referred to as plasticity) in the selection of the two dietary components. Nonetheless, a background consumption of plants and smaller prey is expected when women gather and do not participate in hunts (see Plants section for discussion). Also, differences in the relative availability of plants and animals affect the actual consumption. In particular, large animals are the highest-ranking food according to ethnographic data (Broughton et al., 2011). 
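To make the "supermarket of nature" comparison above concrete, here is a minimal time-budget sketch. The return rates are illustrative values chosen within the ranges cited from Kelly (2013) (plants: several hundred to several thousand kcal per hour; medium prey: tens of thousands of kcal per hour), and the daily energy requirement and band size are assumed round numbers, not values from the paper.

```python
# Illustrative, assumed values (see lead-in); not taken directly from Kelly (2013).
DAILY_KCAL_PER_PERSON = 2500              # assumed round-number daily requirement
RETURN_PLANTS_KCAL_PER_HOUR = 1500        # within "several hundred to several thousand"
RETURN_MEDIUM_PREY_KCAL_PER_HOUR = 15000  # lower end of "tens of thousands"

def foraging_hours_per_day(n_people: int, return_rate_kcal_per_hour: float) -> float:
    """Hours of foraging needed per day to meet the group's caloric requirement."""
    return n_people * DAILY_KCAL_PER_PERSON / return_rate_kcal_per_hour

band_size = 25  # a nominal hunter-gatherer band size used for illustration
print(f"plants only : {foraging_hours_per_day(band_size, RETURN_PLANTS_KCAL_PER_HOUR):.1f} h/day")
print(f"medium prey : {foraging_hours_per_day(band_size, RETURN_MEDIUM_PREY_KCAL_PER_HOUR):.1f} h/day")
# The order-of-magnitude gap in return rates becomes an order-of-magnitude gap in
# daily foraging time (here roughly 42 vs 4 hours for the whole band).
```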
According to classic optimal foraging theory, an animal would specialize in the highest-ranking type if the encounter rate is high enough (Futuyma & Moreno, 1988). Applied to humans, it means that they should have specialized in large prey if the encounter rates were high enough. Moreover, seasonal fluctuations in many plant species' availability may hinder their reliability as food for a significant portion of the year. In contrast, animals are always available, although with fluctuating fat content. Carnivory could have, therefore, been a more time-efficient and reliable caloric source. The relative abundance of large prey, and thus encounter rate, relative to smaller prey and plants, was probably higher during most of the Pleistocene, at least before the Late Quaternary extinction of megafauna (see the Ethnography, Paleontology and Zooarchaeology sections and Ben-Dor and Barkai (2020a) for references). 2.2 Diet quality In relation to body size, brain size is strongly associated with dietary energetic density in primates and humans (Aiello & Wheeler, 1995; DeCasien et al., 2017; Leonard et al., 2007). Human brains are over three times larger than other primates' brains, and as such, human dietary energetic density should be very high. The most energy-dense macronutrient is fat (9.4 kcals/g), compared with protein (4.7 kcals/g) and carbohydrates (3.7 kcals/g) (Hall et al., 2015). Moreover, plant proteins and carbohydrates typically contain anti-nutrients, which function in plant growth and defense (Herms and Mattson, 1992; Stahl et al., 1984). These anti-nutrients, such as lectins or phytate, appear in complex cellular plant matrix and fibers and limit full energetic utilization and nutrient absorption by humans (Hervik & Svihus, 2019; Schnorr et al., 2015). The most generous estimates from in vitro, human, and animal data suggest that well below 10% of total daily caloric needs can be met from fiber fermentation, and most likely below 4% (Hervik & Svihus, 2019; Høverstad, 1986; Topping & Clifton, 2001). Hence, the protein and fat mixture in animals would probably have provided higher energetic density, and therefore dietary quality. Brain size declined during the terminal Pleistocene and subsequent Holocene (Hawks, 2011; Henneberg, 1988), indicating a possible decline in diet quality (increase in the plant component) at the end of the Pleistocene. 2.3 Higher fat reserves Humans have much higher fat reserves than chimpanzees, our closest relatives (Zihlman & Bolter, 2015). Carrying additional fat has energy costs and reduces human speed in chasing prey or escaping predators (Pond, 1978). Most carnivores and herbivores do not have a high body fat percentage as, unlike humans, they rely on speed for predation or evasion (Owen-Smith, 2002, p. 143). Present-day HG (the Hadza) were found to have sufficient fat reserves for men and women to fast for three and six weeks, respectively (Pontzer et al., 2015). Humans seem very well adapted to lengthy fasting when fat provides their major portion of calories (Cahill Jr & Owen, 1968). Rapid entry to ketosis (when the liver synthesizes ketones from fat) allows ketone bodies to replace glucose as an energy source in most organs, including the brain. During fasting, ketosis allows muscle-sparing by substantially decreasing the need for gluconeogenesis (the synthesis of glucose from protein), and humans enter ketosis relatively quickly. 
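Returning to the macronutrient energy densities quoted in the Diet quality subsection above (fat 9.4, protein 4.7, carbohydrate 3.7 kcal/g, from Hall et al., 2015), the sketch below compares illustrative food compositions. The per-100 g compositions are assumed for illustration only and are not values from the paper.

```python
# Energy densities per gram as quoted in the text (Hall et al., 2015).
KCAL_PER_G = {"fat": 9.4, "protein": 4.7, "carb": 3.7}

def kcal_per_100g(fat_g: float, protein_g: float, carb_g: float) -> float:
    """Gross energy per 100 g from macronutrient composition (water and fiber contribute ~0 here)."""
    return (fat_g * KCAL_PER_G["fat"]
            + protein_g * KCAL_PER_G["protein"]
            + carb_g * KCAL_PER_G["carb"])

# Assumed, illustrative compositions per 100 g (not from the paper):
fatty_meat = kcal_per_100g(fat_g=20, protein_g=18, carb_g=0)
lean_meat = kcal_per_100g(fat_g=5, protein_g=21, carb_g=0)
wild_tuber = kcal_per_100g(fat_g=0.5, protein_g=2, carb_g=20)  # most of the mass is water/fiber

print(f"fatty meat : {fatty_meat:.0f} kcal/100 g")
print(f"lean meat  : {lean_meat:.0f} kcal/100 g")
print(f"wild tuber : {wild_tuber:.0f} kcal/100 g")
# A protein-plus-fat mixture is several times denser per unit mass than a fibrous
# plant food, which is the point the Diet quality subsection makes.
```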
Dogs share similar digestive physiology and animal-rich dietary patterns with humans but do not enter ketosis quickly (Crandall, 1941). Indeed, dogs typically require a diet supplemented by medium-chain triglyceride to increase blood ketones to derive therapeutic benefit, but even then, they do not achieve deep physiological ketosis like humans (Packer et al., 2016). Cahill Jr (2006, p. 11) summarizes the evolutionary implications of humans' outstanding adaptation to ketosis: “brain use of βOHB [a ketone body], by displacing glucose as its major fuel, has allowed man to survive lengthy periods of starvation. But more importantly, it has permitted the brain to become the most significant component in human evolution.” Rapid entry into ketosis has been found in the brown capuchin monkey (Friedemann, 1926), suggesting that this adaptation to fasting may have already existed in early Homo. Researchers who argue against a massive reliance on acquiring large animals during the Pleistocene mention their relative scarcity (Hawkes, 2016). However, besides the fact that they were more prevalent during the Pleistocene (Hempson et al., 2015), the ability to store large fat reserves and to more easily endure fasting may represent an adaptation, enabling humans to endure extended periods between acquiring the less frequently encountered large animals. 2.4 Genetic and metabolic adaptation to high-fat diet Swain-Lenz et al. (2019) performed comparative analyses of the adipose chromatin landscape in humans, chimpanzees, and rhesus macaques, concluding that their findings reflect differences in the adapted diets of humans and chimpanzees. They (p. 2004) write: “Taken together, these results suggest that humans shut down regions of the genome to accommodate a high-fat diet while chimpanzees open regions of the genome to accommodate a high sugar diet.” Speth (1989) hypothesized that humans eating an animal-based diet would display an obligatory requirement for significant fat amounts because they are limited in the amount of protein they can metabolize to energy. Dietary fat is also a macronutrient with priority storage within subcutaneous fat stores; this agrees with assumptions of adaptation to higher fat consumption. The ability to finely tune fat-burning is a prominent feature of human metabolism (Akkaoui et al., 2009; Mattson et al., 2018). The lipase enzyme plays a dominant role in fat storage and metabolism. Comparing the pace of genetic changes between humans and other primates, Vining and Nunn (2016) found that lipase production underwent substantial evolution in humans. Weyer and Pääbo (2016) found some indication of differences in both the regulation and activity of pancreatic lipase in modern humans compared with Neandertals and Denisovans. Given that Neandertals probably consumed a diet higher in meat and fat than anatomically modern humans, the latter was possibly adapting to lower fat consumption. However, these changes are also found in present-day humans, but there is no indication of how early they occurred in H. sapiens evolution. They could have resulted from a shift to a diet higher in plants in the period leading up to the adoption of agriculture, in which a marked increase in genetic changes is evident (Hawks et al., 2007). Additionally, storing larger fat reserves is a derived trait in humans, regardless of nutritional source (Pontzer, 2015). Thus, changes in fat metabolization capacity may, in part, be associated with metabolizing stored fat. 
In humans, eating predominantly animal foods, especially fatty animal foods, promote nutritional ketosis. This pattern provides generous amounts of bioavailable essential micronutrients with crucial roles in encephalization, such as zinc, heme iron, vitamin B12, and long-chain omega-3 and 6 fatty acids (DHA and arachidonic acid, respectively) (Cunnane & Crawford, 2003). Infants' brains meet all of their cholesterol needs in situ, with 30% to 70% of the required carbons being supplied by ketone bodies (Cunnane et al., 1999). Recently, nutritional ketosis has gained popularity as a possible therapeutic tool in many pathologies, such as diabetes, Alzheimer's disease, and cancer (Ludwig, 2019). 2.5 Omega 3 oils metabolism Another aspect of fat metabolism is the hypothesis that the early human brain's enlargement was made possible through acquiring aquatic foods. Presumably, these foods were the only source of high amounts of docosahexaenoic acid (a long-chain omega-3 fatty acid; DHA) found in the expanding human brain (Crawford, 2010; Cunnane & Crawford, 2014; Kyriacou et al., 2016). In contrast, Cordain et al. (2002) argue that terrestrial animal organs contained sufficient DHA amounts for brain growth. Furthermore, Speth (2010, p. 135) proposed that humans biosynthesized sufficient DHA de novo from precursors. This last argument is compatible with the present existence of several billion people, including some HG, who have never eaten aquatic-sourced food, yet they and their offspring can grow and support much larger brains than early humans. A large part of this population does not consume high proportions of animal-derived food and practices multi-generational vegetarianism without cognitive decline (Crozier et al., 2019). An increased need for DHA to sustain larger brains cannot even support claims for a terrestrial animal-based diet in early humans. Stable isotope analysis shows that at least some Neandertals did not consume much, if any, aquatic dietary resources (M. Richards & Trinkaus, 2009), though their brains were at least as large as that of modern humans. Mathias et al. (2012) identified a genetic change that occurred in African humans about 85 thousand years ago (Kya) in the fatty acid desaturase (FADS) family of genes, showing a marginal increase in efficiency of converting plant-derived omega-3 fatty acids into DHA. This change may signify an increase in dietary plant components at that time in Africa. In Europe, however, a similar change took place only with the arrival of the Neolithic (Ye et al., 2017), suggesting that a plant-based diet was uncommon beforehand. Furthermore, tracer studies show modern adult humans can only convert <5% of the inactive plant-derived omega-3 polyunsaturated fatty acid alpha-linolenic acid (18:3Ω3, ALA) into the animal-derived active version docosahexaenoic acid (20:6Ω3, DHA) (Plourde & Cunnane, 2007). Ye et al. (2017) found that positive genetic selection on FADS in Europe took the opposite direction in HG groups in the period leading up to the Neolithic, possibly signifying increased reliance on aquatic foods. The pre-Neolithic surge in aquatic foods exploitation is also supported by stable isotope analysis (see the section Isotopes and trace elements). 2.6 Late genetic adaptation to the consumption of underground storage organs A recent adaptation to a high-starch diet may be postulated from a study by Hancock et al. 
(2010, table 4), which showed that populations presently dependent on roots and tubers (underground storage organs [USOs]) are enriched in single nucleotide polymorphisms (SNPs) associated with starch and sucrose metabolism and folate synthesis, presumably compensating for their poor folic acid content. Another SNP in these populations may be involved in detoxifying plant glycosides, such as those in USOs (Graaf et al., 2001). Some researchers consider USOs ideal candidates for significant plant consumption by early humans (Dominy, 2012; B. L. Hardy, 2010; K. Hardy et al., 2016; Henry et al., 2014; R.W. Wrangham et al., 1999). If genetic adaptations to USOs consumption were rather recent, it suggests that USOs did not previously comprise a large dietary component. 2.7 Stomach acidity Beasley et al. (2015) emphasize the role of stomach acidity in protection against pathogens. They found that carnivore stomachs (average pH, 2.2), are more acidic than in omnivores (average pH, 2.9), but less acidic than obligate scavengers (average pH, 1.3). Human studies on gastric pH have consistently found a fasted pH value <2 (Dressman et al., 1990; Russell et al., 1993). According to Beasley et al. (2015), human stomachs have a high acidity level (pH, 1.5), lying between obligate and facultative scavengers. Producing acidity, and retaining stomach walls to contain it, is energetically expensive. Therefore it would presumably only evolve if pathogen levels in human diets were sufficiently high. The authors surmise that humans were more of a scavenger than previously thought. However, we should consider that the carnivorous activity of humans involved transporting meat to a central location (Isaac, 1978) and consuming it over several days or even weeks. Large animals, such as elephants and bison, presumably the preferred prey, and even smaller animals such as zebra, provide enough calories to sustain a 25-member HG group from days to weeks (Ben-Dor et al., 2011; Ben-Dor & Barkai, 2020b; Guil-Guerrero et al., 2018). Moreover, drying, fermentation, and deliberate putrefaction of meat and fat are commonly practiced among populations that rely on hunting for a large portion of their diet (Speth, 2017), and the pathogen load may consequently increase to a level encountered by scavengers. 2.8 Insulin resistance Another hypothesis claiming a human genetic predisposition to a carnivorous, low-carbohydrate diet is the “Carnivore Connection.” It postulates that humans, like carnivores, have a low physiological (non-pathological) insulin sensitivity. It allows prioritizing of glucose toward tissues like the central nervous system, erythrocytes, and testes that entirely or significantly depend on glucose, rather than muscles which can rely on fatty acids and ketosis instead (Brand-Miller et al., 2011); this sensitivity is similarly lower in carnivores (Schermerhorn, 2013). Brand-Miller et al. (2011) speculate that physiological insulin resistance allows humans on a low-carbohydrate diet to conserve blood glucose for the energy-hungry brain. The genetic manifestation of insulin resistance is complex and difficult to pinpoint to a limited number of genes (Moltke et al., 2014). However, Ségurel et al. (2013) found a significantly higher insulin resistance (low sensitivity) in a Central Asian population (Kirghiz) of historical herders, compared with a population of past farmers (Tajiks), despite both groups consuming similar diets. 
Their findings indicate a genetic predisposition to high physiological insulin resistance levels among groups consuming mainly animal-sourced foods. Additionally, a significant difference in the prevalence of this resistance exists between groups with long-term exposure to agriculture and those that do not, such as Australian aborigines, who have higher resistance. If higher physiological insulin resistance is indeed ancestral, its past endurance suggests that high carbohydrate (starch, sugar) consumption was not prevalent. 2.9 Gut morphology Most natural plant food items contain significant amounts of fiber (R. W. Wrangham et al., 1998), and most plant-eaters extract much of their energy from fiber fermentation by gut bacteria (McNeil, 1984), which occurs in the colon in primates. For example, a gorilla extracts some 60% of its energy from fiber (Popovich et al., 1997). The fruits that chimps consume are also very fibrous (R. W. Wrangham et al., 1998). The human colon is 77% smaller, and the small intestine is 64% longer than in chimpanzees, relative to chimpanzee body size (Aiello & Wheeler, 1995; Calculated from Milton, 1987, table 3.2). Because of the smaller colon, humans can only meet less than 10% of total caloric needs by fermenting fiber, with the most rigorous measures suggesting less than 4% (Hervik & Svihus, 2019; Høverstad, 1986; Topping & Clifton, 2001). A 77% reduction in human colon size points to a marked decline in the ability to extract the full energetic potential from many plant foods. The elongated small intestine is where sugars, proteins, and fats are absorbed. Sugars are absorbed faster in the small intestine than proteins and fats (Caspary, 1992; Johansson, 1974). Thus, increased protein and fat consumption should have placed a higher selective pressure on increasing small intestine length. A long small intestine relative to other gut parts is also a dominant morphological pattern in carnivore guts (Shipman & Walker, 1989, and references therein). This altered gut composition meets the specialization criteria proposed by Wood and Strait (2004) for adaptations that enable animals but hinder plant acquisition for food. A marked reduction in chewing apparatus and a genetic change that reduced the jaw muscle bite force had already appeared 2–1.5 million years ago (Mya) (Lucas et al., 2006). A smaller mandibular-dental complex points to a smaller gut (Lucas et al., 2009); therefore, the carnivorous gut structure may have already been present in H. erectus. 2.10 Reduced mastication and the cooking hypothesis Together with the whole masticatory system, teeth should closely reflect the physical, dietary form because masticatory action is repeated thousands of times each day and is thus under continuous pressure to adjust to efficient dietary processing (Lucas et al., 2009). One of Homo's main derived features is the reduced relative size of the masticatory apparatus components (Aiello & Wheeler, 1995). This reduction is associated with a substantially decreased chewing duration (approximately 5% of daily human activity, compared with 48% in chimpanzees), starting with H. erectus 1.9 Mya (Organ et al., 2011). The masticatory system size in H. erectus, together with reduced feeding duration, is attributed to the increased dietary meat proportion and availability of stone tools (Aiello & Wheeler, 1995; Zink & Lieberman, 2016), high portion of dietary fat (Ben-Dor et al., 2011), or the introduction of cooking early in Homo evolution (R. Wrangham, 2017). 
We consider cooking plants as a possible but less likely explanation for the reduction in mastication since most researchers date the habitual and controlled use of fire to over a million years after the appearance of H. erectus (Barkai et al., 2017; Gowlett, 2016; Roebroeks & Villa, 2011; Shahack-Gross et al., 2014; Shimelmitz et al., 2014); but see R. Wrangham (2017). It seems that habitual use of fire appeared with the appearance of post-H. erectus species and so can signal increased plant consumption in these species. It should also be noted that although fire was undoubtedly used for cooking plants and meat, a fire has many non-cooking uses for humans (Mallol et al., 2007), including protection from predation, a significant danger in savanna landscapes (Shultz et al., 2012). Also, fire maintenance has bioenergetic costs (Henry, 2017), and in some environments, sufficient wood may not be available to maintain fire (Dibble et al., 2018). While the contribution of cooking to the consumption of plants is not contested, cooking also contributes to the consumption of meat. There is no archaeological indication of a net quantitative contribution of cooking to the HTL. We, however, assume that cooking signals a somewhat higher consumption of plants. 2.11 Postcranial morphology Several derived postcranial morphologic phenotypes of humans are interpreted as adaptations to carnivory. Ecologically, the body size is related to trophic strategies. Researchers have attributed the increase in body size in Homo to carnivory (Churchill et al., 2012; Foley, 2001; T. Holliday, 2012). A recent body size analysis shows that H. erectus evolved larger body size than earlier hominins (S. C. Antón et al., 2014; Grabowski et al., 2015). Simultaneously, larger body size reduces the competitivity in arboreal locomotion and hence in fruit gathering. It is interesting to note that in Africa, humans' body size reached a peak in the Middle Pleistocene, and H. sapiens may have been smaller than his predecessors (Churchill et al., 2012). Since carnivore size is correlated with prey size (Gittleman & Harvey, 1982), this development ties well with an apparent decline in prey size at the Middle Stone Age (MSA) in East Africa (Potts et al., 2018). A similar decrease in body size was identified in the Late Upper Paleolithic and Mesolithic (Formicola & Giannecchini, 1999; Frayer, 1981), also with a concomitant decline in prey size following the Late Quaternary Megafauna Extinction (Barnosky et al., 2004). A series of adaptations to endurance running was already present in H. erectus, presumably to enable “persistence hunting” (Bramble & Lieberman, 2004; Hora et al., 2020; Pontzer, 2017). A recent genetic experiment concerning exon deletion activity in the CMP-Neu5Ac hydroxylase (CMAH) gene in mice led Okerblom et al. (2018) to propose that humans, in whom the deletion was already fixed at 2 Mya, had already acquired higher endurance capabilities at that time. Whether this endurance was used for hunting, scavenging, or another unknown activity early in human evolution is debated (Lieberman et al., 2007; Pickering & Bunn, 2007; Steudel-Numbers & Wall-Scheffler, 2009). Comparing the Early Stone Age sites of Kanjera South and FLK-Zinj, Oliver et al. (2019) suggested that different ecological conditions required different hunting strategies, either cursorial (suitable for persistence hunting), or ambush, which is more appropriate for a woodland-intensive landscape. 
Some endurance running adaptations may also suggest adaptation to increased mobility in hot weather conditions, as expected from carnivores, given their relatively large home ranges (Gittleman & Harvey, 1982). Another feature associated with hunting in the early stages of human evolution is an adaptation of the shoulder to a spear-throwing action, already present in H. erectus (Churchill & Rhodes, 2009; J. Kuhn, 2015; Roach et al., 2013; Roach & Richmond, 2015). Young et al. (2015) and Feuerriegel et al. (2017) argue that this adaptation came at the expense of a reduced ability to use arboreal niches, meeting the criteria proposed by Wood and Strait (2004) to support compelling morphological evidence of evolution toward carnivorous stenotopy. 2.12 Adipocyte morphology Ruminants and carnivores, which absorb very little glucose directly from the gut, have four times as many adipocytes per adipose unit weight than non-ruminants, including primates, which rely on a larger proportion of carbohydrates in their diet (Pond & Mattacks, 1985). The authors hypothesize that this is related to the relative role of insulin in regulating blood glucose levels. Interestingly, omnivorous species of the order Carnivora (bears, badgers, foxes, voles) display more carnivorous patterns than their diet entails. Thus humans might also be expected to display organization closer to their omnivorous phylogenic ancestry. However, humans fall squarely within the carnivore adipocyte morphology pattern of smaller, more numerous cells. Pond and Mattacks (1985, p. 191) summarize their findings as follows: “These figures suggest that the energy metabolism of humans is adapted to a diet in which lipids and proteins rather than carbohydrates, make a major contribution to the energy supply.” 2.13 Age at weaning Humans have a substantially different life history than other primates (Robson & Wood, 2008), a highly indicative speciation measure. One life history variable in which humans differ significantly from all primates is weaning age. In primates such as orangutans, gorillas, and chimpanzees, weaning age ranges between 4.5 and 7.7 years, but is much lower in humans in HG societies, at 2.5–2.8 years, despite the long infant dependency period (Kennedy, 2005; Robson & Wood, 2008, table 2). Psouni, Janke, and Garwicz (2012, p. 1) found that an early weaning age is strongly associated with carnivory level, stating that their findings “highlight the emergence of carnivory as a process fundamentally determining human evolution.” It is interesting, however, that a comparison of early Homo, Australopithecus africanus, and Paranthropus robustus from South Africa reveals a substantially higher weaning age (4 years) in South African early Homo (Tacail et al., 2019), so it is unclear when the weaning age shortened. 2.14 Longevity Longevity is another life history variable in which humans differ markedly from great apes. While the modal age at death in chimpanzees is 15 years, in 20th century HG, it occurs in the sixth and seventh decades (Gurven & Kaplan, 2007, table 4). There is no argument that longevity extension began with early Homo, although disagreement exists regarding the pace of change. Caspari and Lee (2004) argue for an accelerated extension of longevity in H. sapiens, while others, such as Hawkes and Coxworth (2013), argue for an earlier extension. Two hypotheses attempt to explain life extension in humans; the disparity lies in different perceptions regarding HTL during evolution. 
Hawkes and Coxworth (2013) support the “grandmother hypothesis,” by which grandmothers' longevity (post-menopausal females) enables sufficient plant food to be collected for infants, whose slower development in comparison with other primates necessitates extended care. The authors (see also Hawkes et al., 2018) base this argument on Hadza dietary patterns, in which gathering by females contributes a large portion of food calories, to demonstrate the marked effect of infant care by grandmothers in releasing their daughters' time for gathering food. As discussed in the Plants section, females may have contributed to hunting as well as gathering. Kaplan et al. (2000) (see also Kaplan et al., 2007) rely on a diet dominated by animal sources, such as in other 20th century HG groups, for example, the Ache. They propose that hunting experience, which fully develops at around 40 years, is crucial to group survival by enabling acquisition of the surplus calories needed to feed less productive, younger group members. The importance of hunting experience presumably caused longevity extension in humans. The problem in comparing the Hadza, with their different ecology, with iron-based material culture as a model for evolutionary dietary patterns is discussed in the Ethnography section. However, it is interesting that even in the Hadza, peak food-acquisition productivity is reached after age 40 in both sexes (F. Marlowe, 2010, fig. 5.11). In summary, extended human longevity suggests that a need for efficient calorie acquisition to maintain both self and an extended sibling dependency period was a dominant driving force in human evolution. The two hypotheses are not necessarily mutually exclusive. 2.15 Vitamins Hockett and Haws (2003) developed a hypothesis that one of the basic tenets of the ancestral human diet was its diversity. They based this on research findings that present-day diets emphasizing diversity increase overall health patterns by lowering infant mortality rates and increasing average life expectancy. Presumably, the wide range of vitamins and minerals associated with diverse diets is advantageous. The relevancy of the initial findings, cited by Hockett and Haws (2003) to the Paleolithic, is questionable, as they relate to modern societies consuming an agricultural, mostly domesticated plant-based diet with declining nutritional value (Davis, 2009). Diversification may, in this case, confer benefit due to the mineral and vitamin accumulation derived from consuming multiple plant types, each of which individually has a narrower range of beneficial contents. Diversity can also refer to portion increases and animal variety in the diet. Kenyan schoolchildren on a high-plant diet receiving meat supplementation showed improved growth, cognitive, and behavioral outcomes (Neumann et al., 2007). Hockett and Haws (2003, table 1) list the key vitamin content in 100 g of plants compared with 100 g of various animal foods (vitamins C, E, D [cholecalciferol], A [retinol & β-carotene], B1 [thiamin], B2 [riboflavin], B3 [niacin], B6 [pyridoxine or pyridoxal], B9 [folate or folic acid], B12 [cobalamin], and iron [heme & non-heme iron]). Comparison of vitamin density (per 100 calories) between terrestrial mammals and plants shows that, in eight of the ten vitamins, terrestrial mammal food is denser, and in most cases several times denser, than plants. If we consider factors like bioavailability and active nutrients, then animal foods appear even more nutritious (Frossard et al., 2000). 
This result is unsurprising given that humans are also terrestrial mammals, containing the same chemicals as other terrestrial mammals and requiring mostly the same vitamins. Plant food is denser in vitamin E and C. It is well known, however, that scurvy did not affect Polar societies despite lower levels of dietary plant components (Draper, 1977; Fediuk, 2000; Thomas, 1927). Western individuals who lived among Polar populations for several years also showed no signs of vitamin shortage (Stefansson, 1960, p. 171). Controlled further monitoring in the United States of two of these individuals, who consumed meat exclusively for a year, revealed no adverse clinical symptoms (McClellan & Du Bois, 1930). According to the glucose–ascorbate antagonism (GAA) hypothesis (Hamel et al., 1986), the structural similarity between glucose and vitamin C means the two molecules compete to enter cells via the same transport system (Wilson, 2005). Thus, higher requirements for vitamin C in western populations may result from higher consumption of carbohydrates and consequently higher blood glucose levels. Two clinical studies comparing diabetic and non-diabetic patients showed, as predicted by the GAA hypothesis, that diabetic patients with higher blood glucose levels have decreased plasma ascorbic acid levels (Cunningham et al., 1991; Fadupin et al., 2007). Dietary vitamin C requirements can be lowered in multiple ways in the context of very-low-carbohydrate diets high in animal sources, which can affect metabolism such that the oxaloacetate to acetyl-CoA ratio drops below one, stimulating ketogenesis and, in turn, increasing mitochondrial glutathione levels (Jarrett et al., 2008). More glutathione means more enzyme glutathione reductase to recycle dehydroascorbic acid (oxidized vitamin C) into ascorbic acid (reduced vitamin C). Ketogenic diets can also increase uric acid, the major antioxidant in human serum, putatively sparing vitamin C in its antioxidant capacity (Nazarewicz et al., 2007; Sevanian et al., 1985), and conserving it for other tasks. For instance, animal foods also provide generous amounts of carnitine, meaning that less vitamin C is needed to synthesize carnitine (a process to which vitamin C is crucial) (Longo et al., 2016). Therefore, the evidence does not support the hypothesis of Hockett and Haws (2005). There is little argument that Paleolithic diets were higher in plants than recent Polar diets and thus did include some plant-derived vitamin C. Nevertheless, animal-sourced foods provide essential micronutrients in their active forms that plants do not, such as vitamin A (retinol), vitamin K (K2 menaquinones), vitamin B9 (folate), vitamin B12 (cobalamin), vitamin B6 (pyridoxine), vitamin D (cholecalciferol), iron (heme iron), and omega-3 (EPA and DHA). Animal foods are not only qualitatively but also quantitatively superior to plant foods, as determined by measures of nutrient density. 2.16 AMY1 gene Although starch consumption is evident throughout the Pleistocene (K. Hardy, 2018), its relative importance is difficult to elucidate from the archaeological record. Salivary amylase is an enzyme degrading starch into glucose in preparation for cell energy metabolism, and Vining and Nunn (2016) discerned a significant evolution in amylase-producing genes in Homo species but could not determine the temporal dynamics. Initially, a higher number of copies, from 2 to 15, of the salivary amylase-producing gene AMY1 in modern populations consuming high-starch diets was found (G. 
Perry et al., 2007). The few Neandertal and Denisovan genetic samples have only two AMY1 copies (G. H. Perry et al., 2015), similar to chimpanzees, which consume little starch. G. H. Perry et al. (2015) conclude that the common ancestor of Neandertals and H. sapiens, some 500–600 Kya, also had only two copies, a conclusion supported by Inchley et al. (2016), who surmised that the appearance of multi-copy AMY1 genes in H. sapiens probably occurred quite early after the split from the common ancestor. Several studies have hypothesized that people with a low number of AMY1 copies eating a high-starch diet would suffer from increased rates of obesity and diabetes but failed to find supporting evidence (Atkinson et al., 2018; Des Gachons & Breslin, 2016; Falchi et al., 2014; Fernández & Wiley, 2017; Mejía-Benítez et al., 2015; Yong et al., 2016). Usher et al. (2015) explain that when lower-precision molecular methods are avoided, not even a nominal association between obesity and the copy number of any amylase gene can be observed (p = 0.7). In summary, more research is needed to verify the functional role of salivary amylase in starch metabolism and the timing of the appearance of multiple copies of AMY1. 3 ARCHAEOLOGICAL EVIDENCE 3.1 Plants Archaeobotanical remains, lithic use-wear, residue analyses on lithic/flint tools, and teeth plaque are often used to elucidate human consumption of specific plant food items at the site level (K. Hardy et al., 2016; Henry et al., 2011; Lemorini et al., 2006; Venditti et al., 2019; Melamed et al., 2016). The acquisition of plants requires little use of stone tools and thus is prone to leave fewer artifacts at archaeological sites (F. Marlowe, 2010). Wooden digging sticks are used in ethnographic contexts to extract tubers and are sometimes found in archaeological contexts (Aranguren et al., 2018; Vincent, 1985), but they also preserve poorly. Despite the poor discovery potential, a review of relevant studies of plant consumption in the Lower and Middle Paleolithic (K. Hardy, 2018, table 1) paints a picture of widespread consumption of a wide range of plants. Unfortunately, similarly to the vast number of studies of bones with cut marks in countless sites, these studies cannot provide quantitative information for evaluating even a localized HTL, let alone a global one. According to ethnographic data (Kelly, 2013, table 3-3, 3-4), the energetic return on plant gathering is in the order of several hundred to several thousand calories per hour, while the return on medium-size animals is in the tens of thousands of calories; presumably gathering should be minimal. However, humans are unique in their division of labor in that the ethnographic record shows that females and males may target different foods that they eventually share. Ethnography is a convincing source of evidence for females as plant gatherers. The lower risk of the gathering is compatible with females' role in child care and thus lends credence to this interpretation of the nature of the division of labor in humans. Further, it is proposed, based on ethnographic data, that female's longevity evolved to enable the contribution of gathered plants subsistence by grandmothers beyond their own needs (Hawkes & Coxworth, 2013). However, several scholars have theorized that such division of labor would have been less pronounced or even altogether absent among Paleolithic HG (Haas et al., 2020 and references therein). 
As discussed in the Ethnography section, the different ecological conditions in the Pleistocene may have also affected the extent to which females gathered plants compared to the Paleolithic period. There is evidence for female hunting in ethnographic and Paleolithic contexts (D. W. Bird et al., 2013; R. B. Bird & Power, 2015; Haas et al., 2020; Khorasani & Lee, 2020; Noss & Hewlett, 2001; Waguespack, 2005). Especially relevant to the Paleolithic is the potential participation of females in large game hunting (Brink, 2008; Haas et al., 2020) in driving large animals to a trap where their movement can be curtailed (Churchill, 1993) or in driving them to expecting male hunters (Waguespack, 2005); methods that provide the opportunity for communal hunting (Lee, 1979, p. 234). Also, females tend to perform more non-subsistence activities in highly carnivorous human groups (Waguespack, 2005). Since the extinction of large animals in the Upper Paleolithic (UP) and the Anthropocene, alternative forms of division of labor may have become less relevant. It may be that we see here what we see in other aspects of ethnographic subsistence analogy, an increase in plant food extraction by females as a result of an adaptation to the decline in prey size and a resultant increase in plant food relative abundance (Ben-Dor & Barkai, 2020a; Waguespack, 2005). Ancient dental plaque has recently gained attention as a source of dietary information, as it enables the identification of plant particles accumulated in plaque (K. Hardy et al., 2012, 2016, 2017; Henry et al., 2011, 2014; Henry & Piperno, 2008). All studies, including the earliest-studied population of Sima del Elefante (1.2 Mya) (K. Hardy et al., 2017) and the Qesem Cave, Israel (400 Kya) (K. Hardy et al., 2016), identified plant remains in tooth plaque and even evidence for cooked starch (but see García-Granero, 2020). Out of 31 dental calculus analysis cases listed in K. Hardy et al. (2018, table 1), only four cases did not show starch residues. The assemblage of cases suggests that consumption of plants was common, although one has to take into account that the consumption of starch encourages dental plaque formation (Brown, 1975; Scannapieco et al., 1993) and that we do not know the percentage of teeth that were not sampled because they had no calculus. But even if we had these data, the identification of plants in dental calculus cannot tell us what the relative plant consumption of the individual was. However, we can summarize that the archaeological and ethnographic record shows that plant foods were a frequent component of the Paleolithic diet. 3.2 Stone tools Although the study of stone-tools is a cornerstone of prehistoric research, its potential to inform a quantitative value, such as HTL, is frustratingly limited. There is plenty of evidence of stone-tool associations with meat and fat processing (Barkai et al., 2010; Lemorini et al., 2006; Lemorini et al., 2015; Nowell et al., 2016; Solodenko et al., 2015; M. C. Stiner, 2002; Toth & Schick, 2019; Venditti et al., 2019). Recently, however, stone-tool use for plant processing in the early Paleolithic has been detected (Arroyo & de la Torre, 2016; Lemorini et al., 2014). S. L. Kuhn and Stiner (2001) review the increased frequency of stone tools attributable to plant processing toward the end of the Pleistocene, Epipaleolithic, and Pre-Pottery Neolithic Near Eastern sites. 
Lithic tools such as sickle blades, pounding, and grinding stones specific to plant processing appear late in the Pleistocene, both in Southern Asia (Bar-Yosef, 1989, 2002, 2014) and Europe (S. L. Kuhn & Stiner, 2001; Stepanova, 2019). Sickle blades and grain-grinding stone tools appear in the Levant in the early Epipaleolithic at Ohalo some 23 Kya, and at earlier UP sites, but become widespread only during the Natufian, 15.0–11.6 Kya (Groman-Yaroslavski et al., 2016). Their frequency increases further toward the Neolithic (Bar-Yosef, 1989; Wright, 1994). The relative intensity of grinding tool use is also found in 20th century HG plant-dependent groups (S. L. Kuhn & Stiner, 2001). The dearth or complete absence of similar findings during earlier periods, such as the Middle Paleolithic (MP), can be interpreted as indicating lower plant consumption than the UP and later UP. A similar trend is found in Europe, where grinding stones first appear sporadically in the Early UP/Aurignacian, but it is from the advent of later cultures, such as the Gravettian and Magdalenian, that these tools become more frequent (Aranguren et al., 2007; S. L. Kuhn & Stiner, 2001; Revedin et al., 2010; M. C. Stiner, 2002). In a multi-dimensional analysis of the Eurasian archaeological record, M. C. Stiner (2002) found a significant HTL decline by the Late UP. Grinding tools appear in Africa in the MSA, much earlier than elsewhere. However, they are mostly associated with pigment grinding (McBrearty & Brooks, 2000). Increases in plant-processing tools are less conspicuous in Africa than Europe and the Levant (Barton et al., 2016; Bar-Yosef, 2002; Clark, 1997) and include, in addition to MSA grinding stones, perforated stones that may have served as digging stick weights (Villa et al., 2012). Clark (1997) and Villa et al. (2012) indicate that hunting and cutting tools dominated the tool innovation list of the Later Stone Age (LSA) of Africa. In summary, there is little lithic evidence of increased plant consumption in Africa, at least during the early LSA. In China, increased evidence for plant consumption seems to follow the same timeframe as the European–Levantine UP (Guan et al., 2014; Liu & Chen, 2012, pp. 46–57). In summary, the lithic analysis provides evidence for a gradual but significant increase in plant consumption during the UP, especially at its end (Power & Williams, 2018), in Europe and Asia, and thus a decline in HTL in the late UP and consequently, a higher HTL beforehand. 3.3 Zooarchaeology
Aside from giant birds, crocodiles, and leopards, early humans likely had to contend with bears, sabertooth cats, snakes, hyenas, Komodo dragons, and even other hominins. As prey, the past was not a pleasant place for humans and our ancestors.
Every society has rules and customs concerning sexual relations, marriage, family, household structures, and child-rearing practices. Most people think of these social regulations as simply the natural way of doing things. But in fact, they are cultural constructs and play vital roles in establishing and maintaining social alliances, allocating resources, and assigning social obligations. Because marriage and family, in various forms, play fundamental roles in any society, wedding rituals are significant social events. Whether private or public, sacred or secular, weddings reveal, confirm, and underscore important cultural ideas and values. Symbolically rich, they usually feature particular speech rituals, plus prescribed apparel, postures and gestures, food and drink, songs and dances passed down through generations. Here we see a Muslim bride in Gujarat, western India, surrounded by female relatives and friends on the eve before her wedding. Their hands are beautifully decorated with traditional designs created with dye made from the crushed leaves of the tropical henna tree. Known as mehndi, this body art is an age-old custom among Muslims and Hindus in parts of southern Asia, as well as northern Africa. Often the groom's name is hidden within elegant designs of flowers and vines that symbolize love, fertility, and protection. A bride's mehndi evening, traditionally held at her parents' home, is a lively female-only gathering with special food, singing, and lovemaking instructions, along with hand painting.

Thursday, July 4, 2024

Dama, my great-grandmother; DeDe’s mother; Mom’s grandmother


US Courts in general play a reactionary political role in US history. The Warren Court was a freak! You mentioned the "Black man has no rights that the white man must honor" Chief Injustice Taney, and the racist Plessy Court; then the so-called liberal Holmes said socialist war protesters didn't have First Amendment rights to leaflet workers at plant gates. FDR had to try to pack the Court because it was going to strike down New Deal legislation. Every Court since Warren has essentially reversed the Warren Court, including perverting the 14th Amendment to create "reverse discrimination" doctrine.
[Figure labels, male skull: head less rounded; cresting generally tends to be more pronounced; zygomatic arch is usually heavier.]

The Biology of Modern Homo sapiens

The second important question involves the circumstances of the individual's death. How long ago did the individual die? Where was the person killed? Was the body transported to the site of discovery? What was the cause of death (drowning, blunt force trauma, and so forth) and the manner of death (natural, accidental, suicide, or homicide)? Of course, not all the things that happen to an individual around the time of death are evident in the skeleton, but many things are. For example, while causes of death such as a stroke or heart attack are unlikely to leave any evidence in the skeletal remains, broken bones, trauma to the skull, and bullet holes can usually be clearly seen. Finally, forensic science has become very popular as a subject of many television series. Unfortunately, the activities portrayed on these programs often border on science fiction. We see state-of-the-art equipment (most of which does not actually exist) provide relatively instant identification of victims and solve the questions of their demise. The reality is that the determination of the facts is very painstaking work and does not always provide all of the answers. Unfortunately, juries today expect the results that they see in television drama, and as a consequence the courts have increasingly spent time educating jurors as to the realities of forensic science.

[Figure labels: stereotypic female and male pelves from skeletal material, with the pubic angle, sacrum, and ischium indicated.]

Summary

Growth is an increase in the size of an organism; development is a change from an undifferentiated to a highly organized, specialized state. There are three ways in which growth occurs: Hyperplasia is an increase in the number of cells, hypertrophy is a general increase in cell size, and accretion is an increase in the amount of intercellular material. Bone growth begins with the appearance of primary and secondary areas of ossification, areas where bone is replacing cartilage. In long bones, growth takes place in growth plates that close when growth ceases in a fairly regular order at characteristic ages. Bone age is the average chronological age at which these events occur. The pattern of tooth formation and eruption serves a similar purpose. Growth can be charted in distance and velocity curves that plot the increase in stature, or some other variable, over time. When examining a growth curve, we notice a period known as the adolescent growth spurt. This is an aspect of puberty, which also includes changes in the reproductive organs and the secondary sexual characteristics. Specific anthropometric measurements also can be plotted against age and illustrate aspects of sexual dimorphism. Differences in patterns of growth and development may be seen in children growing up in stressful environments; these are known as developmental adjustments. The nature and rates of growth and development are controlled by the complex interaction of internal and external factors. These factors include the hormones, which are particularly involved in the control of puberty; environmental factors, including the availability and usage patterns of food; diseases, both those that are genetic and those caused by disease organisms; and heredity, which plays a major role by setting the potential limits to growth measurements such as stature.
Improvement in nutrition, better sanitation, and better health services may be responsible for an increase in average stature and weight over the years, a tendency referred to as the secular trend. Forensic anthropology is the study of human skeletal remains in the legal context. The forensic anthropologist estimates several attributes of the individual represented by the skeletal remains, including sex, age, and stature, and determines the cause and manner of death.
Declaration of Independence: A Transcription Note: The following text is a transcription of the Stone Engraving of the parchment Declaration of Independence (the document on display in the Rotunda at the National Archives Museum.) The spelling and punctuation reflects the original. In Congress, July 4, 1776 The unanimous Declaration of the thirteen united States of America, When in the Course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature's God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation. We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.--That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, --That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness. Prudence, indeed, will dictate that Governments long established should not be changed for light and transient causes; and accordingly all experience hath shewn, that mankind are more disposed to suffer, while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed. But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security.--Such has been the patient sufferance of these Colonies; and such is now the necessity which constrains them to alter their former Systems of Government. The history of the present King of Great Britain is a history of repeated injuries and usurpations, all having in direct object the establishment of an absolute Tyranny over these States. To prove this, let Facts be submitted to a candid world. He has refused his Assent to Laws, the most wholesome and necessary for the public good. He has forbidden his Governors to pass Laws of immediate and pressing importance, unless suspended in their operation till his Assent should be obtained; and when so suspended, he has utterly neglected to attend to them. He has refused to pass other Laws for the accommodation of large districts of people, unless those people would relinquish the right of Representation in the Legislature, a right inestimable to them and formidable to tyrants only. He has called together legislative bodies at places unusual, uncomfortable, and distant from the depository of their public Records, for the sole purpose of fatiguing them into compliance with his measures. He has dissolved Representative Houses repeatedly, for opposing with manly firmness his invasions on the rights of the people. 
He has refused for a long time, after such dissolutions, to cause others to be elected; whereby the Legislative powers, incapable of Annihilation, have returned to the People at large for their exercise; the State remaining in the mean time exposed to all the dangers of invasion from without, and convulsions within. He has endeavoured to prevent the population of these States; for that purpose obstructing the Laws for Naturalization of Foreigners; refusing to pass others to encourage their migrations hither, and raising the conditions of new Appropriations of Lands. He has obstructed the Administration of Justice, by refusing his Assent to Laws for establishing Judiciary powers. He has made Judges dependent on his Will alone, for the tenure of their offices, and the amount and payment of their salaries. He has erected a multitude of New Offices, and sent hither swarms of Officers to harrass our people, and eat out their substance. He has kept among us, in times of peace, Standing Armies without the Consent of our legislatures. He has affected to render the Military independent of and superior to the Civil power. He has combined with others to subject us to a jurisdiction foreign to our constitution, and unacknowledged by our laws; giving his Assent to their Acts of pretended Legislation: For Quartering large bodies of armed troops among us: For protecting them, by a mock Trial, from punishment for any Murders which they should commit on the Inhabitants of these States: For cutting off our Trade with all parts of the world: For imposing Taxes on us without our Consent: For depriving us in many cases, of the benefits of Trial by Jury: For transporting us beyond Seas to be tried for pretended offences For abolishing the free System of English Laws in a neighbouring Province, establishing therein an Arbitrary government, and enlarging its Boundaries so as to render it at once an example and fit instrument for introducing the same absolute rule into these Colonies: For taking away our Charters, abolishing our most valuable Laws, and altering fundamentally the Forms of our Governments: For suspending our own Legislatures, and declaring themselves invested with power to legislate for us in all cases whatsoever. He has abdicated Government here, by declaring us out of his Protection and waging War against us. He has plundered our seas, ravaged our Coasts, burnt our towns, and destroyed the lives of our people. He is at this time transporting large Armies of foreign Mercenaries to compleat the works of death, desolation and tyranny, already begun with circumstances of Cruelty & perfidy scarcely paralleled in the most barbarous ages, and totally unworthy the Head of a civilized nation. He has constrained our fellow Citizens taken Captive on the high Seas to bear Arms against their Country, to become the executioners of their friends and Brethren, or to fall themselves by their Hands. He has excited domestic insurrections amongst us, and has endeavoured to bring on the inhabitants of our frontiers, the merciless Indian Savages, whose known rule of warfare, is an undistinguished destruction of all ages, sexes and conditions. In every stage of these Oppressions We have Petitioned for Redress in the most humble terms: Our repeated Petitions have been answered only by repeated injury. A Prince whose character is thus marked by every act which may define a Tyrant, is unfit to be the ruler of a free people. Nor have We been wanting in attentions to our Brittish brethren. 
We have warned them from time to time of attempts by their legislature to extend an unwarrantable jurisdiction over us. We have reminded them of the circumstances of our emigration and settlement here. We have appealed to their native justice and magnanimity, and we have conjured them by the ties of our common kindred to disavow these usurpations, which, would inevitably interrupt our connections and correspondence. They too have been deaf to the voice of justice and of consanguinity. We must, therefore, acquiesce in the necessity, which denounces our Separation, and hold them, as we hold the rest of mankind, Enemies in War, in Peace Friends. We, therefore, the Representatives of the united States of America, in General Congress, Assembled, appealing to the Supreme Judge of the world for the rectitude of our intentions, do, in the Name, and by Authority of the good People of these Colonies, solemnly publish and declare, That these United Colonies are, and of Right ought to be Free and Independent States; that they are Absolved from all Allegiance to the British Crown, and that all political connection between them and the State of Great Britain, is and ought to be totally dissolved; and that as Free and Independent States, they have full Power to levy War, conclude Peace, contract Alliances, establish Commerce, and to do all other Acts and Things which Independent States may of right do. And for the support of this Declaration, with a firm reliance on the protection of divine Providence, we mutually pledge to each other our Lives, our Fortunes and our sacred Honor. Georgia Button Gwinnett Lyman Hall George Walton North Carolina William Hooper Joseph Hewes John Penn South Carolina Edward Rutledge Thomas Heyward, Jr. Thomas Lynch, Jr. Arthur Middleton Massachusetts John Hancock Maryland Samuel Chase William Paca Thomas Stone Charles Carroll of Carrollton Virginia George Wythe Richard Henry Lee Thomas Jefferson Benjamin Harrison Thomas Nelson, Jr. Francis Lightfoot Lee Carter Braxton Pennsylvania Robert Morris Benjamin Rush Benjamin Franklin John Morton George Clymer James Smith George Taylor James Wilson George Ross Delaware Caesar Rodney George Read Thomas McKean New York William Floyd Philip Livingston Francis Lewis Lewis Morris New Jersey Richard Stockton John Witherspoon Francis Hopkinson John Hart Abraham Clark New Hampshire Josiah Bartlett William Whipple Massachusetts Samuel Adams John Adams Robert Treat Paine Elbridge Gerry Rhode Island Stephen Hopkins William Ellery Connecticut Roger Sherman Samuel Huntington William Williams Oliver Wolcott New Hampshire Matthew Thornton

Wednesday, July 3, 2024

Wokism and Islam in the UK

I use “male supremacy” rather than “patriarchy,” in part for semantic and historical-roots reasons.

I use "male supremacy " . The Latin and Greek roots for "patriarchy" are "rule of the fathers." However, the system is superiority of all males, particularly husbands at the origin of male supremacy circa 6,000 years ago. It would better be called _husbandry_ , as fathers were already ruling (with mothers equally) over children in the Stone Age; the only hierarchy in the Stone Age. Also, there is some beneficence and protection in father rule over children ; not in husband -rule .

So, I use "male supremacy," because it is about more than the father-daughter relationship.

Sahlins, my main mentor in anthropological theory

One exception for me was my great and eminent anthropology professor, Marshall Sahlins. He was my advisor and teacher in my senior year majoring in anthropology. He turned out to be pretty much the preeminent anthropologist in the world for almost my whole lifetime. Wow, lucky me.

Rad Mom: What is a Radical Feminist?



https://youtu.be/ysIHGlczm-Q?si=C5cTjAvIw8ga1yF8



Tuesday, July 2, 2024

Detroit in the Era of an Overwhelming Majority-Black Population


I have focused this class on the urban activity and traditions of Detroit, Wayne County, and Michigan over the last 100 years, and especially the last 70 years or so, corresponding to my lifetime of 73 years spent mostly in Detroit, with some time in other cities.

I focus on Coleman Young because he was a main leader of Black Detroiters, a mover and shaker in the historical period in which the Black population became the overwhelming majority, 85%, a larger Black majority than any other "Inner City" urban center. (Urban had become synonymous with Ghetto, Majority Black.) Also, my economic-determinist, dialectical theory of urban life's conventions and traditions, and of its evolutions and even revolutions, I get from Mayor Coleman A. Young, graduate of the Maben's Barbershop Black Bottom dialectical school and student of W.E.B. Du Bois' _The Souls of Black Folk_. (Elaboration on Maben's later.)

DETROIT IS HAITI: UNFORGIVABLY BLACK

Blast from the past applied to 2014

by: Charles Brown

October 21, 2009

tags: Analysis, Detroit, racism, auto industry, capitalism, Malcolm X

The Detroit protest rebellion of 1967 had the impact of crystallizing or aggravating a capital boycott of the citizens, the 99%ers of Detroit, that had by then been developing for 15 or 20 years: a divestment by the bourgeoisie, big capital, something like the economic blockade or embargo on Cuba. It was imposed not by law, however, but by private agreement to disassemble Detroit as a labor giant. The relationship of business to Detroit as a result is something like the relationship of world capitalism to Haiti since the revolution there a couple of centuries ago.

With the corporate flight from Detroit, a capital boycott was inflicted on a former concentration point of capital investment, through suburbanization of factories, plant closings, runaway shops to the South, and globalization of production.

There was the bullet and then the ballot, a la Malcolm X in reverse: the 1967 rebellion, a mass protest or demonstration, guerrilla theatre against white supremacist unemployment, poverty, and police brutality.

Then came the 1973 election of Coleman Young as Black mayor extraordinaire, an excellent urban technician. For these exercises of Black power, in leaderless protest demonstration as Free Speech in the form developed in dozens of Inner City Ghettos at that time, in the election of a proud Black Mayor, and really for now being 85 percent majority Black in population, Detroit is still under economic blockade punishment by the powers that be.


Coleman Young said, "The newsmagazines called Detroit a model city. They marveled at its strong chin and gushed over the heroic benevolence of Mayor Cavanagh, who had become the gallant knight of the War on Poverty by spearing forty-two million federal dollars for the city's poor people. Cavanagh was widely portrayed as a sort of Great White Sympathizer, and the fact is, he worked hard at maintaining a symbiotic rapport with Black leaders. In that spirit, he had established an amicable relationship that led observers to think of Detroit as being immunized against the _outbreak of inner-city rioting that had torn apart Watts in 1965, bloodied Chicago and Philadelphia, and in 1967 was sweeping the country at a rate that would produce 164 incidents, among them major revolts in Cleveland and Newark_ (emphasis added - CB)." - page 170 of _Hard Stuff_: The Autobiography of Coleman A. Young; "The Big Bang,"

Chapter 7

The federal government's Kerner Commission report essentially agreed that the "riot" protests in the dozens of majority-Negro ghettoes around the country had legitimate gripes. “Our Nation Is Moving Toward Two Societies, One Black, One White—Separate and Unequal”: Excerpts from the Kerner Report

"Our Nation Is Moving Toward Two Societies, One Black, One White--Separate and Unequal":...

Charles Brown: Detroit's 1967 mass, spontaneous guerrilla theater protests were the culmination of a socioeconomic historical shift marked by the segregation of residence by race through white flight to the suburbs, especially beginning in the 1950s, escaping the move toward integration represented in open housing law. (See Thomas Sugrue, "The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit," and Coleman Young's autobiography, "Hard Stuff.")

It was also part of a relative scattering of some main points of industrial production from a concentration in the city of Detroit (and neighboring Dearborn) to the surrounding suburbs. It was a breaking up of the World War II-era Arsenal of Democracy, which had many left-wingers, naturally.

In a way, it seems to have been a shifting of the location of basic production from the Midwest to the South, from the U.S. to other countries, in what gets termed post-industrialism, post-Fordism, restructuring. The concentrated proletarian powerhouse was busted up and racially resegregated, on the typical American model: Black vs. white.

The bourgeoisie cannot really undo what they have done. They are hoist on their own petard. Detroit is still a pariah society in the national media, as the latest Time article shows. White masses are shy about moving back into Detroit and desegregating, although in 2014 there has been some white popular migration into Detroit.

The bourgeoisie will not invest in an African town like this, with so few white people to benefit. They are trying to move more white people in so that they can feel better about investing.

They, the bourgeoisie, had to economically blockade us, as Cuba and Haiti have been blockaded for decades, even centuries.

Like the great heavyweight boxing champion of the world, Jack Johnson, Detroit is unforgivably Black and Proud, as the PBS television documentary has it.

Wait, I take that back. They are trying to find ways to invest "in" Detroit, but so that most of the local population will not benefit. They will exploit and "skate" - in other words, con and run, get away without being held responsible for their wrongdoing...unless we hold them responsible, somehow. We, the 99%.

So, Time Magazine had a cover story back in 2009 saying that poverty in Detroit today is in part due to the rebellion of 1967, cause and effect, politically and economically - case closed.

Actually, it's true. The bourgeoisie are still punishing the rebellion, among other things. Perhaps, Time is making a confession.

http://www.pbs.org/unforgivableblackness/about/

Unforgivable Blackness: About the Film | PBS

Jack Johnson — the first African-American Heavyweight Champion of the World, whose dominance over his white opponents spurred furious debates and race riots in the early 20th century —...

https://www.mail-archive.com/pen-l@lists.csuc…/msg12528.html


https://www.facebook.com/wethepeopleofdetroit/posts/465954460093259

http://coreysviews.wordpress.com/2010/02/18/detroit-is-haiti-unforgivably-black/

Charles Brown, We the People of Detroit

October 2, 2012

Detroit's Belle Isle is the Isle of Haiti, Unforgivably Black. The State of Michigan doesn't want to help Detroiters with Belle Isle. It wants fewer Detroiters and more non-Detroiters on Belle Isle.

Glenn: I agree with you about most things, Charles, but not on this one. Belle Isle is a jewel. The city has allowed it to languish and decay, be overrun with overgrowth and broken glass. And this is not just a recent development. While I don't trust many of the outstate politicians, it's long past time that this world-class attraction be cared for and managed, and be used to help bring people and their money back to Detroit, even if it's just for a day.

October 2, 2012 at 6:34pm

Charles Brown: Those problems are readily remediable by the City. It is not necessary to turn control of the whole island over to the State to get it done. And the State is not taking it to do that, but to get Detroiters off and non-Detroiters on. Its primary purpose shouldn't be to make money. And it can't be made to make much money without turning it into Disneyland or something. The whole deal is a big fraud. Just clean it up, Mayor.

Even if it's just for a day. What does that mean?

October 2, 2012 at 6:38pm

Glenn: They're not readily remediated. They haven't been remediated in DECADES. If people visit the island as a managed park, people can pay for a yearly pass. Not a high price to pay. Anyone who comes to the city to go to Belle Isle may spend money on food, gas, and whatnot. For a city whose infrastructure is crumbling and cannot be supported by its meager and shrinking tax base, generating traffic and revenue is necessary to ward off having the entire CITY taken over by the state.

October 2, 2012 at 6:43pm

Charles Brown: The entire city has already been taken over by the state with the Consent Agreement. That's not enough revenue. Not that many more people will be coming to the city. And you can be sure they won't be stopping at gas stations in Detroit. Are you kidding? Most Detroiters I know are not that dissatisfied with the Island. The cleaning problem is eminently doable by a competent Mayor. Giving the island to the State to do it is like using a sledgehammer to do the job of a scalpel. Not only too big, but the wrong tool.

October 2, 2012 at 6:48pm

Glenn: I've lived in and out of the city for 30 years. I have been a champion of the city for the entire time. I won't believe that the city can do the job until I see it. So far, I haven't seen squat. If Detroit doesn't want to give it up, then they'd best get on it.

October 2, 2012 at 6:51pm