Homo sapiens may have descended from a monkeylike species forced to spend more time on the ground as the Earth entered a period of cooling sometime between fifty and twenty million years ago, during which the extent of jungle shrank dramatically. The largest of these primates would no doubt have been most successful at exploiting land-based food sources while avoiding large predators, peering over the tall grass of the savanna and escaping to the shelter of sparse trees with the improved foot speed afforded by a longer stride. The jungle offers more to eat, with nutritious vegetation and plentiful protein from insects and grubs, so these very early ancestors of anthropoids would have had to be creative to feed themselves in sustainable and defensible ways. They probably behaved cooperatively, with a multiplying of deliberate roles, a cognitive root of the human division of social labor, and became able to learn food-finding techniques from their most innovative members and from other species by studious observation. These were the first selection pressures for humanlike brain plasticity and teaching, leading to cultural transmission of ideas and methods.
An alternate explanation for our anthropoid ancestors' transition toward humanlike traits is that these species encountered little selection pressure from predators, and, with ample food of all types available in tropical rainforests, increased in average size and intelligence due to psychosocial demands on individuals within intraspecies groups. If this is accurate, anthropoids may have been virtually immune to predation by the time they moved out of the jungle, closer to great apes than monkeys in body type. That migration was perhaps impelled by shrinking habitat and mounting population pressures, or maybe even mere curiosity, with human physique and psychology being almost exclusively a product of natural and social selection for reproductive success (as opposed to mere survival) and for problem-solving acumen when food became scarce during climate changes over many millions of years.
Whatever the case, it is clear that these anthropoid ancestors began to spend more time upright, encountering selection pressure for scouting greater distances and walking miles at a time while carrying loads, which assisted in the acquisition of food and drink, until humanity's lineage had adopted bipedalism, standing almost full time on its two hind limbs. They may have hunted cooperatively, but likely could not take down large, live quadruped prey because of deficient running speed. They probably scavenged their territories for anything edible, moving around to cope with seasonal or climate-driven changes in food availability, and began competing with predator species for access to kills. This may have played a role in driving the hominid preoccupation with tool use, as the ability to throw rocks and wield objects in largish groups would have made it easier both to secure meat from big game brought down by sizable predators and to divvy up the meat itself. As anthropoids became hominids roughly eight million years ago, and hominids became hominins roughly five million years ago, the protein content of their diets thus expanded, a palatability not yet lost in this lineage since the days of smallish primates eating tropical bugs more than twenty million years ago. This was the fuel for optimal development of brain metabolism and structure, manifesting as peak mental performance, a factor in the evolution toward more flexible technical and social intelligence.
Populations of evolving hominid species likely remained small and limited in range for millions of years, since reliable food sources were meager and local. Quite a bit of creativity was probably required to fill their bellies; only so much success in scavenging could be had when terrain grew unfamiliar, and poaching a kill from lions or other large predators would have been infrequent. Over millions of years, forests, grasslands and deserts expanded and contracted as hominids moved about, probably first in Africa, adapting however they could to changes in food availability as they migrated, so that by the time the first members of our genus, such as Homo habilis and Homo ergaster, were both alive, around two million years ago, intelligence and protoculture had reached levels suited to survival in all but the harshest environments on Earth.
Paleontology and archaeology indicate that Homo ergaster populations spread throughout the Old World, giving rise to Homo erectus in Asia and, by way of intermediates such as Homo heidelbergensis, to Homo neanderthalensis in the Middle East and Europe. Our own species, Homo sapiens, probably evolved in Africa roughly two to three hundred thousand years ago. There was some interbreeding between early humans and Neanderthals, but every Homo species besides ours eventually went extinct.
The cosmopolitanism of precivilized Homo sapiens is probably attributable to higher intelligence: the ability to construct finely crafted tools and to develop techniques for hunting, trapping and food storage grew their populations to sizes exceeding what was possible for earlier Homo species. Neanderthals must have been no match for humans where there was direct competition, for archaeology reveals many signs that human cultures were more refined conceptually, more active diplomatically, and more proficient technologically. Early humans lived a hunter-gatherer lifestyle still present in remote parts of the world, with men leaving mobile camps or small villages to hunt for lengths of time that varied with conditions, and women foraging edible plants. Humans had the compulsion to branch out into untapped landscapes, and their populations swelled prodigiously in areas of abundant food. Genetic studies suggest that the entirety of indigenous society in the Americas was seeded by no more than roughly five thousand individuals who trickled into the Western hemisphere over a land bridge from Siberia to modern-day Alaska during the most recent ice age, probably between about twenty-five and fifteen thousand years ago.
Despite their technical adaptability, human hunter-gatherers could struggle when regions of plentiful sustenance grew barren due to climate fluctuations or natural disasters, forcing prehistoric tribes to move regularly merely to stay alive. Initial attempts at civilized organization were hampered by climatic transitions, as centralizing tribal federations would be forced to disperse with periodic downturns in the food supply, and there are many ancient ruins to show for it. With the switch to an agriculture-based diet provided by compact farming, in some parts of the world taking hold as early as 10,000 B.C.E., larger communities could be supported on the same territories, and a hunter-gatherer lifestyle maintainable in many places only by continual relocation with the changing seasons gave way to settled living in sprawling, culturally integrated populations. Food supplies became collectivized, and in boom times that could last for centuries, surpluses freed a proportion of the community to specialize in nonfarming trades, so that peripheral technologies advanced at a much faster rate. Some of this inventiveness cycled back into food production, making farming more efficient and stabilizing civilization even further.
Yet for civilized living to take permanent hold, specific conditions had to obtain. Many burgeoning cultures faced famine, a major factor in recurring declines. In the Americas, when El Niño and La Niña effects were prolonged or severe, conventional farming techniques could become inadequate, limiting the organizational momentum of early settlement in what are modern-day Mexico and Peru. Many ancient communities in nontropical regions of sub-Saharan Africa also faced climatic conditions unfavorable to large-scale agriculture, usually remaining tribal or relying on strategic location as hubs of commerce, and these centers of trade always ended up abandoned when resource distribution and availability changed: natural alterations to the course of rivers affecting travel routes, political upheavals, exhaustion of nonrenewable desiderata, and mercurial weather all took their toll. In tropical jungles, food was so abundant that there was virtually no impetus for advanced agriculture, in addition to a higher incidence of disease that becomes intolerable with crowding. In drier and more frigid climates, humans relied on wild and domesticated herds such as buffalo, deer, oxen, or cattle as their food source, and they could not settle in dense communities, since areas were razed of vegetation by grazing and the animals had to move on. It was in the Middle East's Fertile Crescent, Egypt, the Mediterranean, China, north India, and after a time Europe that agriculture was both an upgrade and favored by the natural environment, chiefly due to appropriate amounts of water, and this is where farming, civilization with its organizational systems, and technological advancement first put down deep roots.
For most of civilized history, food has been a scarce commodity, with the majority of human beings having precious little of it, and that only by backbreaking toil carried out under the yoke of economic dependency. The conditions of the average farmer in antiquity are not entirely clear; much of the agrarian economy was managed by aristocrats and military affiliates who used slave labor to tend their crops. Average soldiers could improve their economic standing by plundering territories, purchasing or seizing property throughout an empire and becoming pastoral landlords themselves, a major incentive for the replenishment of martial manpower. It is less clear what the city dwellers of antiquity ate, but markets with meat, produce, cooking oils and grain shipped in from outlying provinces supplied at least some of their needs.
There are extensive records of peasant life in some parts of Medieval Europe. This is true of England, where it is clear from legal documents that most of the population lived a life centered on the manors of seigneurs, the aristocratic lords. A small percentage of peasants owned their own property; many were tenants who owed a share of produce or coinage to the lord in exchange for access to land as well as protection; and the rest were "villeins," unfree tenants who lived on extremely small plots and worked someone else's land in return for little beyond subsistence. Most peasants had particular tracts assigned to them, usually adjacent to their mud and brick abodes, but the system was quite collectivist: oxen teams and plows were often shared or loaned, and at set dates of the harvest calendar, peasants communally worked land reserved for the lord's income, events at which seigneurs provided bountiful feasts at their own expense, the best meals of the year, accompanied by festivals and gaming. This was a big deal, for most of what peasants ate was "pottage," vegetables mixed together and cooked as a bland soup with trace amounts of meat. Even this minuscule access to food could be reduced by fees and confiscations of produce for any delinquency in meeting one's work obligations; failing to appear in the field one day as expected could cost you. There were no fences and little private property, just the potential for a small fine to keep the peasantry tending to their business, and individuals could move about the fields as they saw fit.
In much of Central and Eastern Europe, serfs owed three days a week of work to their lords, a practice called "robota" (the root of the word "robot"), with the rest of the week going toward their own livelihood. Russian serfs, however, could be made to contribute as much as six grueling days a week to the income of manorial lords, one of many historical factors in the early 20th-century overthrow of the Czarist monarchy, replaced by a Marxist dictatorship of the proletariat that at least initially sought to elevate the status of working-class citizens on the path to a communist society.
At the end of the Middle Ages and into the Renaissance and Early Modern period, Europe's population doubled, with attendant political, economic and cultural unrest. Upon discovery of the New World, Western European farmers began emigrating to these vast new territories by the millions in search of more land, improved financial prospects and a better future. In the U.S., settlers pushed inexorably west until, by the early 20th century, all but the most remote areas from the Eastern seaboard to the Pacific coast had been Europeanized. The Midwestern United States, with its fertile soil and favorable climate, was converted into the country's "breadbasket," hundreds of thousands of square miles of farmland bolstering population growth and providing a portion of the nation's income from food exports. Much of this was shipped back to European countries, where land formerly allotted to agriculture was fast being industrialized, with a swelling, concentrated class of proletarian factory workers who needed to be fed.
Western European demand for U.S. agricultural products made this sector of the economy a lucrative business, and wealthy interests entered the market, buying up land and working toward a monopoly on farming through consolidation under corporatized auspices. These large companies, with influxes of capital from investment as well as ancillary ties to external markets, could assume the risks of speculation on more efficient methods and equipment. As ownership contracted with mergers and acquisitions, becoming increasingly exclusive, competition for the materials necessary to run their farms dwindled, so that they could fund operations more cheaply, which allowed them to lower the prices of their products. In the latter half of the 20th century, this began to drive small-scale landholders, the independent farmers, out of business: families which had lived on their land for generations, near small towns that were hubs for a whole way of life. The U.S. government recognized that families were being choked out of agriculture by big business and adopted subsidization measures as an aid, but this has only slowed the changeover to full control of food production by national and international corporations.
Agriculture is a good example of capitalism's social effects, in particular its relationship to governments. Even though corporate monopolies can endanger small businessmen, in this case farmers, the regulatory capabilities of the U.S. government are robust enough that the transition to farming as big business has not been cultural slash-and-burn. Quality control is possible and pricing stays within a reasonable range; the food industry has not become unchecked exploitation. U.S. agencies have spent enough on national infrastructure since World War II that agricultural products can be distributed throughout the country with high efficiency. Supermarket chains that achieved commercial dominance after the Great Depression of the 1930s have streamlined delivery to consumers, making it cost-effective to import food from all across the world into the U.S. as well as to invest in developing improved varieties of produce and livestock by selective breeding, with novel products having wide appeal. The media assist this effort at diversifying the food industry, as cooking shows and cookbooks teach individuals how to use exotic ingredients in dishes, a modest encouragement for the appreciation of diverse cultures.
Unfortunately, government regulation is starting to stall as capitalistic corporations become powerful enough to control legislation going forward. We see the first indications of this trend in disputes over genetically modified foods, as consumer concern about the poorly understood effects of introducing synthetic genes into the environment goes largely unheeded. All of the real decision-making is behind the scenes, but as far as anyone can tell, research and analysis seem to be skewed toward the interests of businesses that can influence every facet of an industry and its regulation. Corporate money is flowing into the industry itself, into oversight, into any oversight of the oversight, into government policymaking, education, and the legal system. Some degree of institutional overlap is to be expected, but companies can throw so much weight around with astronomically large sums of cash that it is doubtful whether anything highly profitable in the short term, such as increased crop yield or insect resistance, will ever be prevented by a regulatory agency, and it is even more unlikely that the public will acquire detailed information about the food industry's conduct absent a disaster. U.S. citizens face the same issue with the globalization of food sourcing: few new standards are being put in place to guarantee that a greater array of products from other countries is accompanied by expanded procedures for quality control accountable to citizens. Links between the food industry and action on public health are likely to remain stuck at 1990s levels barring incidents that are absolutely impossible to overlook. Of course, some simple progress is still possible, such as the introduction of golden rice, a strain engineered to produce beta-carotene, a precursor of vitamin A, as nutritional supplementation.
In the developing world, where governments are more unstable and less representative, the situation is much worse. As many of these regions exited their phase as colonies of European empires and embarked upon self-determination, the first move was toward government management of food distribution. Nonsustainable farming techniques that had been imposed by colonial governance were not updated, and, especially in Africa, desertification of land and a lack of funding for infrastructure led to the relinquishing of national control, during which foreign corporations entered the picture, dominating the industry abroad. Much of the produce of developing countries is exported to Europe and the U.S. by these companies of Western origin, where it can be sold at higher prices, and local governments coordinate with foreign operations no more conscientiously than for a bit of easy cash via taxes, money that usually does not get cycled back into social planning for the sake of bettering the general population's standard of living, though at the very least it could go toward sponsoring roadway projects similar to those undertaken in the U.S. during the 1950s and 1960s. Western supermarket chains and complementary food distribution companies are starting to extend their business interests to less affluent countries, but infrastructure limitations inhibit the growth of logistical systems, meaning that for the near future most countries will depend on local food supplies, with periods of famine countered only by the constrained reach of charitable relief.
Well-functioning government is certainly one of the conditions for a healthy capitalist economy, but too much bureaucracy can be a bane to populations. Early Soviet socialism is an instance: its five-year plans for grain distribution were not flexible enough to adjust to fluctuations in supply caused by the climate anomalies always occurring somewhere within its huge territory. Shortages were common and life was less comfortable than in the U.S., its chief competitor for political supremacy. By 1991, the Soviet system had collapsed, its populace frustrated with an economy lagging behind its more democratic rivals. Early Chinese socialism likewise faced tribulations. At the founding of its revolutionary administration, leadership toured industrialized countries and decided to make ironworks a staple of the government-directed economy. However, they did not initially grasp the technical subtleties involved, and a large portion of the population was assigned to manufacture inadequately refined pig iron, worthless on the international market. Food imports for their huge population were impeded for a time by the inability to sell this inferior product, and famines ensued. By the 1980s, under pressure of popular discontent, China began allowing capitalist corporations to set up operations in the country.
It is apparent that economies go through cycles, periodic booms and busts, and governments are vital to ameliorating the lows by tweaking the financial system at key moments. However, the U.S. government has proven limited in its capacity to channel the economy's course into planning representative of citizens' interests. American corporations have become international, moving much of their manufacturing to foreign countries where labor is cheaper, reducing jobs in the United States as well as sucking money out of the nation's middle class and into the pockets of a corporate ownership largely unconstrained by any particular country's regulations. U.S. citizens' standard of living drops, without a corresponding elevation of the standard of living in developing countries. There are exceptions to this neglect of public welfare by authorities: in Germany, thanks to subsidization, renewable sources such as wind and solar power were projected to supply a substantial share of the electrical grid by 2020, part of a longer-term plan to phase out fossil fuels. The world is full of divergent strategies for political and economic organization, and it will be one of the main tasks of the 21st century to analyze all this variation for the sake of improving quality of life via healthier diets and more.
From the book Standards for Behavioral Commitments: Philosophy of Humanism, available as a free download. Topics covered include chemistry, biology, genetics, neuroscience, epistemology, the history of Western philosophy, cultural evolution, theory of cognition, ethics and much more.