We have thus far talked of the science, psychology and institutions of human health, food consumption, combat and reproduction as if they are distinct modules with separate developmental histories, but they interact in profound ways. All four of these domains are interdependent: an immune system is useless without adequate metabolism, both of these are impotent if the bodies they sustain are not defensible from external threats, and all three of these categories with their characteristic functionality are wiped from the face of the Earth in one generation without the ability to reproduce.
The following provides a small sampling of the vast range of possible examples of how each module affects the others in intrinsically reciprocal relationships:
The immune system requires good nutrition to operate normally; an overactive immune system can cause unpleasant and sometimes life-threatening food allergies. Epidemics have decided victory or defeat in war; populations have deliberately spread disease against one another, culminating in weaponized biological agents such as anthrax. Pathogens that circumvent the body’s defenses can alter genes, causing birth defects or premature death but also, occasionally, beneficial mutations; some research has tentatively suggested that the scents of romantic partners can reveal whether immune system profiles are a good match, which may be one of many factors contributing to the health of offspring (a cause at once reproductive and immunological). Victory in war has been achieved by starving populations into submission with sieges; the physical rigors of combat training can make the body’s metabolism more efficient so that an individual stays in better health. Maturation of a human to reproductive age of course depends on sufficient nutrition, as does fertility; children of obese parents are more likely to be born with a metabolic disorder. Violent conflict has affected the distribution and intermingling of ethnic groups in thousands of ways; the larger communities that Homo sapiens sustained compared to other Homo species, owing to richer culture and better technology, may have been partly responsible for our triumph wherever direct conflict broke out, and knowing human nature it probably did.
We can further incorporate the modules of human perception/conception and language/symbolizing into this framework of causal reciprocity:
Failure of the immune system to ward off infection can result in damage to sense organs and cognition; over the course of human history, contaminated water frequently produced disease because its transmission of pathogens went unrecognized until modern times. The human eye requires vitamin A to work properly, and memory can be enhanced temporarily by amphetamines, though addiction and brain damage are likely with prolonged use; archaeology reveals that around 6000 B.C.E. rickets became widespread in Europe as its populations transitioned to agriculture, because no association was made between a cereal-based diet and vitamin D deficiency, a circumstance resulting in natural selection for lighter skin shades that admit more of the UV needed for vitamin D synthesis in these northern climes. Mongol invasions were a motivator for European exploration of Asia and eventual voyages that discovered the Americas; satellite surveillance makes response to rival military strategies much more effective. The mere sight of their child can activate gene expression in parents (a cause at once reproductive and perceptual/conceptual). Factual description often compels humans to seek further understanding via firsthand experience; new observations made with scientific instrumentation lead to modification of descriptive models.
As for the language module:
Communicating that a treatment will work can produce the placebo effect, boosting immune function and healing by the power of suggestion; cancer may injure portions of the brain, mouth, larynx, bronchial tubes and lungs, sometimes interfering with vocalization. Instructional materials explaining nutrition are an efficient way of teaching individuals how to maintain a healthy diet; the attention span children need to get maximum benefit from school lessons flags if their nutrition is not satisfactory. Cracking Germany’s Enigma cipher during World War 2 contributed decisively to Allied victory; British imperial conquests spanning the globe, along with U.S. expansion, turned English into the international language of commerce. Conversation is a primary means by which human beings establish intimacy and decide who their romantic partners will be; language talents or impediments can run in families.
From the huge number of possible combinations, which readers can no doubt add to from their own knowledge, it becomes obvious that division into these general categories is somewhat arbitrary from the perspective of total causality in the real world. A focus on each of these modules of function as an individual domain serves more to demonstrate how our previously specified layers of causal emergence tend to hybridize in wide-ranging explanations than to account for the entirety of causal vectors as we reason about a phenomenon.
For instance, there is more to the Mongol invasion than historical records can ever reveal. If we went back in time, we could do a brain scan and psychological evaluation as well as cultural research to find out what made Genghis Khan such an effective expansionist, whether it was his personal characteristics, the rarefied convergence of various material and social factors within Asian cultures, or some concatenation of both. Today’s sociologists could embed themselves on the ground, interviewing and analyzing what all the players on this historical stage were thinking and what motivated them, including the Europeans who emerged from the Middle Ages as such determined explorers and imperialists. Neuroscientists and geneticists from far beyond even our era would perhaps be able to model Genghis Khan’s brain and trait profile with a specificity unimaginable to us today, illuminating exactly what made him tick.
With research into modern life, where records are practically unlimited in specificity and comprehensiveness, and where powerful science and technology exist for modeling experience in ways markedly exceeding personal observation, opportunities to accumulate fact and formulate truth are much greater. To understand the interrelationship of media with behavior, we can do better than merely speculate that entertainment media are propping up a culture in decline from crushing economic and memetic pressure on individuals, an environment in which we barely hold on to a disintegrating institution of marriage and stem the tide of degeneration in families. Neurologists can use fMRI analysis on hundreds if not thousands of individuals as they interact with the interfaces of their communications devices to grasp how media technologies such as smartphones and computers affect the brain. Scientists can accumulate statistical data about divorce, teenage delinquency and media exposure to discover whether there are suggestive relationships. We can do more intensive analysis of a smaller but representative selection of individuals to get a sense for how the course of their lives is influenced by cultural trends, how their thinking is modified by social milieus, and where information technology fits in.
Qualitative synthesis of the type undertaken by this book is not authoritative; it is a starting point, a series of thought experiments that use argumentative logic as a chemical indicator of sorts. Analytical design focuses our reason like a measured infusion of factual matter into an explanatory solution, inspiring hypotheses that precipitate out of the mixture of existence only when we agitate its elements, objectively contextualized as if in an Erlenmeyer flask, forcing them as causes into salience and alignment with our conceiving, a dynamic reaction in which they momentarily flicker like a titration’s endpoint, as particularity the mind can resolve, remember, define, record, render meaningful from the perspective of function, and essentially control, until a tincture of culminative insight is attained.
The potency of theorizing, our fixation of environments as conceptual structure, rapidly progresses, but is still limited in its capacity to penetrate natural fusion of the whole, which transcends naked perception and intuition. Even with the methodologies and explications of science that grow more formidable every day, we must compartmentalize causality: separate academic departments reverse engineer or deconstruct, fracturing reality into fields and components, then reassemble the melange of information and data sets into no more than a patchwork of probable and often loose correlations. Irrefragable certainty is mostly lacking, but the potential to synthesize contexts of analysis into a general picture giving clear implications for further investigation has reached high levels.
With the heftier mass of detail available since the arrival of media platforms such as institutional education, printing, audio, video and computers in all their many instantiations, our thoughts about the previous hundred years are less speculative, but even contemporary inquiries into centuries-earlier epochs present some perhaps permanent difficulties.
Looking to the past, quantum physics and neuroscience are not yet applicable to an examination of prehistory, if they ever will be. Evolution of the psyche is fraught with vagueness; a disjunction surfaces between primeval memes revealed by archaeology and those of the modern world; and ancient technology is obviously inferior. In essence, our brains have transformed in some unexplained way. The natural environment is more of a constant, but even so we can only conjecture about the precise timetable and causality of parallel changes in ecosystems and human traits, and this becomes truer the further back we go in the lineage of Homo sapiens and before.
Theorizing of modern behavior and mentality, even from a very broad perspective, still seems to make substantial headway. Neuroscience provides deeply descriptive models of how living brains function, their similarities and differences, and quantum biology will soon be supplementing this knowledge. With conversation and additional formats by which personal experiences are observed and reported, we can get an objective sense for the way subjectivity works, though we must be careful to keep tabs on the influence our models of thought and emotion have on the supple structure of the mind as concepts that are subliminally transmissible in socializing, capable of revising culture in unintended and unwanted ways. Our memes and technology are intuitive to us, and research into their character and spread can begin at a more advanced stage than consideration of prehistory or previous centuries. Natural environments are transforming in ways unprecedented for the historical period as our technological footprint gets bigger, but this is well within our means to understand scientifically if we allocate the necessary resources. We can have more knowledge of everything about contemporary life, but even so, it cannot be stressed enough that the effort to comprehend present reality has always faced obstacles.
A chief complication consists in how phenomena can exceed the bounds of observation and theorizing, going not merely uncomprehended but unnoticed, an issue surfacing throughout recorded history. Before the advent of modern science, hat-making used mercury-containing compounds, and inhalation of the vapors produced proved toxic to the brain, hence the phrase “mad as a hatter”. In the 1950s, schoolchildren dangerously handled mercury without gloves in classrooms, fascinated by the rarity of a metal that was liquid at room temperature. Lead was formerly an ingredient in industrial paints, and homeowners’ walls used to be poisonous. Billions of dollars have been spent removing asbestos pipe insulation from houses, for certain kinds of exposure cause lung cancer, a health hazard that can take decades to materialize as inhaled microscopic fibers lodge in the lungs, where they slowly produce tumors. CFCs (chlorofluorocarbons), refrigeration chemicals, eat holes in the atmospheric ozone layer protecting Earth’s surface from harmful solar radiation; though they are now banned, some regions of the world have had to exercise caution when going outside because of increased skin cancer risk, though the ozone layer is fortunately recovering.
In these cases, innocent mistakes were the culprit, and response to the threats was swift once they were recognized, but often human beings incline towards willful ignorance. This was the situation with tobacco: it was an expensive, high-class, popular product, grown in the New World for centuries as a staple of European economies, but the nicotine it contains is extremely addictive, and toxins released by smoking damage every organ system of the body. There must have been indications of these effects for hundreds of years, yet mass production of tobacco as cigarettes began in the 20th century along with mass marketing of smoking as culture, and inhalation of a burning leaf became one of the leading causes of death in the United States and around the world, proving almost impossible to phase out due to its firmly entrenched social niches, despite regulatory mechanisms such as laws and taxation that put tremendous strain on smokers’ finances and added considerable inconvenience.
Intentional disregard and selective memory can graduate to systemic bias, in which social groups become attached to beliefs or theoretical paradigms and refuse to acknowledge negative consequences or admit progressive alternatives. We have already touched upon Christian theology’s attachment to Neoplatonist concepts, which delayed by a century Europe’s reincorporation of Aristotelian philosophy, the bridge between religious orthodoxy and the progressive approaches of modern science and technology. The epic battle over the move from geocentrism to heliocentrism has been discussed, as committed clergy, even many of those with integrity, refused to so much as look into a telescope for fear of dramatic shifts in our model of the cosmos, and the religious high leadership treated direct visual observation as if it were political insurrection and chthonic wickedness.
What has come to be called social Darwinism originated as the simple-minded presumption by 19th century Europeans that distributions of political sway indicated the evolutionary fitness of their nations. Minority ethnicities were set upon by popular mobs and some official governments in the belief that, since the majority ethnicity held the majority of the power, a more absolute majority would be more powerful still. Preponderant ethnic groups in many countries backed their embattled kin across the border. This ignited World War 1 when a Serbian with ties to a radical nationalist faction assassinated Archduke Franz Ferdinand, heir to the Austrian throne. Much larger Austria declared war on the mainly Slavic population of Serbia, predominantly Slavic Russia declared war on Austria, and the rest of Europe along with its colonies was pulled into the fray by way of alliances. The multicultural, middle-class U.S. population sledgehammered this sort of vapidness militarily in the first half of the 20th century, but not before the continent of Europe had lost its developmental edge and nearly destroyed itself in a frenzy of upheavals and clashes spanning two centuries. Unfortunately, ethnic and cultural conflict provide a quick and easy means by which to divert populations away from self-empowering unification, and so keep cropping up as tenets of propagandistic doctrines around the globe.
Though research has largely surpassed the definition of evolutionary fitness as efficient causes vying for supremacy, with nature analogized to a game of pool or perhaps a lottery in the minds of the more cynical, our thoughts about scientific fact still get sabotaged by inclinations towards bias. Theorizing of the universe confuses us when we contemplate existence as a vast array of spherical particles, or a probability distribution, perhaps a heterogeneous energy field or a sea of wave interferences. In our attempt to connect specific concepts within large associational frameworks in order to formulate general principles, we can overanalogize the structure of various theories, with subtle distinctions between contexts dropped from consideration and an apex model, still only a concept, erroneously regarded as the essence of reality. Even though we know atoms are a conditional concept, we come to believe we are atoms, and though we know the structure and function of the neuron in its correlation to the mind is a succession of revisionary theories, we can fixate on the idea that consciousness is biochemistry, no more than an emergent property of reactions between atoms.
The foundation of our reality is not truth, it is perception, and though we have become much more perceptive by incorporating theory and technology into our acts of observing, standardized modeling and model-based instrumentation are essentially an outcome of no more than hypothesis. The meanings of our conjecturing grow complex and seem to take on a memetic life of their own, but are nevertheless rooted in basic intuitions, first spun by early humans into artistic ideations such as myths from out of raw concepts of substantiality, eventually assuming the form of materialistic speculations, proceeding towards innovative hybridizing of concepts into more abstract generalizations as philosophies, then built into a distending and diversifying superstructure of predictive descriptions we call science, incomparably more powerful but no less hypothetical.
To demonstrate, we can muse about the history of light. Early humans observe the sun move spontaneously, giving light and life to the world as if with supernal purpose, therefore this occurrence is the creative act of a divine will, a god. Astute ponderers, natural philosophers and mathematicians notice that effects of the sun are predictable from the nature of surfaces under its sphere of influence, and we gain the concept of our sun radiating a substance traveling and deflecting across distances. We recognize that luminescent objects besides the sun produce comparable effects, and the concept of light as a single radiating substance with many independent causes is born. We associate light from numerous sources with the generation of heat, one of the first protoconcepts of ‘energy’ or intrinsic changeability in substance. We note the iridescence of light and progress to prism technology that predictably differentiates this substance into the visible spectrum. Early scientists observe by serendipity and then clarify in experiments that light takes the form of a wave. A theory is proposed that the differing colors of light result from different combinations of constituent wavelengths. The concept of the electromagnetic spectrum is introduced in conjunction with observation and theory of light’s interactions with materials, its reflection, absorption, emission and energetics. Invisible portions of the electromagnetic spectrum are postulated as causing many phenomena by the same radiative process as visible light, spawning all kinds of new technologies, such as microwaves, x-rays, radio, infrared sensors and nuclear energy. 
Theory of electromagnetism and atomic theory are combined in a particle theory of light, the idea that electromagnetic radiation takes the form of discrete energy packets called photons, with each photon absorbed or emitted by an electron, giving each atomic element, compound or molecule with its unique electron arrangement a characteristic color and in many cases, such as with water, glass and metals, some type of reflectivity. The speed of light in a vacuum is theorized as constant at all wavelengths, a cornerstone of relativity’s equations modeling the interrelationship of matter and energy, from which is derived the formula E=mc2 (energy equals mass times the squared speed of light). The behavior of photons is shown to be statistically correlated under some experimental conditions, and this interactiveness appears to happen faster than the conventionalized speed of light as well as retroactively. Extremely sensitive equipment registers photons materializing out of nowhere during nuclear processes such as fission, and together with many additional observations, a theory of energized matter (photons being massless quanta of energy) as fluxing in five or more dimensions while capable of spanning three-dimensional distances almost instantaneously comes into vogue, though a proven model has not yet been constructed.
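As a hedged numerical illustration of the mass-energy formula just mentioned (the one-gram mass below is an assumed value chosen only for the arithmetic), even a tiny mass corresponds to an enormous quantity of energy:

```python
# Energy equivalent of a small mass under E = m * c^2.
# The mass is an assumption for illustration; c is the defined SI constant.
c = 299_792_458      # speed of light in a vacuum, in m/s
m = 0.001            # one gram, expressed in kilograms
E = m * c ** 2       # energy in joules

print(f"{E:.2e} joules")  # on the order of 9 x 10^13 J
```

That figure, roughly the output of a large power plant running for a day, is why the formula mattered so much once nuclear processes were understood.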
We all assume we know exactly what light is, a satisfying notion enters our minds in a split second, but as can be seen from the foregoing, our conceiving of light has a many-thousand-year history, with this brief summation highlighting no more than the most basic conditions that had to obtain for the most salient leaps forward in our understanding to occur; it could be expanded into volumes. A species with different perception would proceed through an entirely different history: we are the only species out of more than a million on this planet that has a theory of light; our awareness of light does not even exist for any other species, and our light did not exist, even vaguely, in human minds until probably the Holocene era’s dawn of civilization. Conception infuses our perceiving so pervasively that it gives us a different world to observe: the sun’s “rays” were probably almost inconceivable to early humans, and light “waves” not even possible. Ours is a theoretical reality in a technological chassis, transforming what we attribute to naked perceptions, a technical culture in which our hardiest strains of intuition, ideal geometrical structure for instance, can spread and evolve more viable forms and functions such as those of calculus with guidance from contemplation, while we also give new strains like quantum mechanics and crossbreedings like quantum biology a place to be cultivated and grow.
There is an informal sense in which atoms and neurons are real, for aspects of their nature do present themselves to perception independent of our will, as if they have an existence of their own, but the fact remains that atoms and neurons have always become something besides what we think they are, and this reformative process does not yet show any signs of terminating. What atoms and neurons are is undeniably actual, but the way we know them is evolving in such a fundamental way, even at the bedrock of observation as we modify our instruments and experimental designs, that it is patently false to say theory-reliant beliefs, our objective truths, are reality. Theories are speculative models, though entangled in perception like a root system, so integral to our experience of the universe that we do not even notice them without serious reflection upon our own thinking, and even then culturally modulated intuitions can be so instantaneous that careful introspection fails us. The phenomena which produce atoms and neurons exist without our awareness, but atoms and neurons themselves, the supposed truths of their nature, are conditional constructions of our minds arising from a dense web of hypotheses, from our human perspective a marathon of increasingly apodictic claims and counterclaims, absent of determinate inception, but in terms of reality as a whole, whatever it is, thus far of limited scope.
If the absolute essence of reality is an empty concept, devoid of factual content, we must wonder what makes a belief true, for it cannot be free-floating in space, a complete arbitrariness; if truths are conditional, they must be conditional upon something. To summarize the psychology of truth proposed thus far as simply as possible, we can say that it depends on the structure of single minds, also on social contexts of shared observing and conceptualizing that are intricately cultural and linguistic, derivatively on the convergence of these individual and collective domains in recognition of qualities we generalize as material, physical, spiritual, intentional, social, natural and supernatural, as well as on shades of happening that lie beyond the purviews of mutually verified fact, when they touch upon our awareness at all.
Psychologically, our minds are subject to perceptual illusions that science has no trouble classifying, such as optical or auditory kinds, as well as crude understandings of causality based on personal experience that collected observations can improve upon. Sociologically, our minds often become biased in group settings, overestimating the accuracy of common beliefs because of confirmations by agreement that happen at the expense of observational confirmation. Even well-reasoned beliefs can become insufficient at the onset of some new problem or deficient when a better explanation develops.
Due to these multiple domains of uncertainty, most knowledge consists in only probable truths, scientific ‘theories’ – hypotheses proven to describe patterns in empirical observations aptly enough that additional occurrences are predictable – with every theory being in principle open to refutation by further evidence. There are exceptions to the mostly tentative nature of truth. In the realm of pure concepts such as those of math, structures and processes can be of unquestionable validity due to logic exhaustive of their implications, as in the statement of an operational rule such as the distributive law in algebra, a criterion for rearranging parts of mathematical expressions in accordance with finite sets of logical constraints.
There are some laws scattered around empirical science as well, such as the first law of thermodynamics, which states that energy is conserved, never created nor destroyed. The reason why chemists called this conditional description a law is this: if we assume that the thermodynamics of standard chemistry, during which matter and energy have never vanished nor spontaneously generated but only interconverted such that every instance has been quantifiable and predictable, are a match for the thermodynamics of matter and energy composing the universe generally, then this concept is a logical necessity. It is not a mere proposal, but a foundational presupposition of quantification as it applies to physical modeling. The theory of gravity was a similar such concept, formerly the law of universal gravitation, as it applied to all mathematical models of the 17th century’s known universe, which did not much exceed our own solar system. However, later discovery of conditions in the galaxy that diverge sizably from those of our solar system revealed phenomena this classical concept of gravity does not predict, significant enough to disciplines like cosmology that major reclassification took place. Boyle’s gas law is in a similar situation: it fails to predict the behavior of gases under more extreme experimental conditions than could be generated with Early Modern lab equipment, and so it too became a special case, applying only in environments similar to those that naturally occur on Earth’s surface; but since it is more of a niche idea, addressing something akin to the properties of a gas in glassware at room temperature rather than the entire known universe, scientists have not bothered to rework it much, instead subsuming it as a limiting case of the more general ideal gas law.
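The relationship between the two gas laws can be sketched in a few lines: under the ideal gas law PV = nRT, holding temperature and amount of gas fixed makes the product PV a constant, which is exactly Boyle’s observation recovered as a special case. The numeric values below are assumptions chosen only for illustration.

```python
R = 8.314           # molar gas constant, J/(mol*K)
n, T = 1.0, 298.15  # an assumed mole of gas at an assumed room temperature

def ideal_gas_pressure(volume_m3):
    """Pressure in pascals from the ideal gas law, P = n*R*T / V."""
    return n * R * T / volume_m3

V1, V2 = 0.024, 0.012  # halve the volume...
P1, P2 = ideal_gas_pressure(V1), ideal_gas_pressure(V2)

# Boyle's law drops out: P1*V1 == P2*V2 (both equal nRT), and pressure doubles.
print(abs(P1 * V1 - P2 * V2) < 1e-9)  # True
print(abs(P2 - 2 * P1) < 1e-6)        # True
```

The sketch only holds at moderate pressures and temperatures, which is precisely the point of the passage above: outside that regime the ideal-gas assumption, and Boyle’s law with it, breaks down.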
Perhaps the first law of thermodynamics will one day be redefined as a theory if our observation of particles at the subatomic level evinces more phenomena like the spontaneous generation of photons and develops deep implications for core models. Under some circumstances, thermodynamic chemistry as a branch of classical mechanics might become a special case of advanced quantum mechanics, explicitly indicated as such.
From a cursory analysis it becomes clear that the only incontrovertible laws of our human reality are probably tautologous rules that either assert the boundaries of meaning in set contexts with their deliberately constrained contents or elaborate procedures for manipulating symbols that represent these fixed contents. The laws of noncontradiction and the excluded middle are examples of the first type, assumptive commitments that frame the breadth of possible meaning. The distributive law of algebra is an example of the second type, logic by which to rearrange symbolic characters that state a particular concept into a different but equivalent form for functional reasons, perhaps in order to make calculation easier, or render inferencing more intuitive, as in techniques for translating the linguistic expression of a concept into spatial structure and vice versa, e.g. the slope-intercept form for the equation of a line, a configuration of algebra for geometric purposes.
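The two types of law can be shown in miniature, using only standard notation:

```latex
% Type 1: assumptive commitments that frame the breadth of possible meaning
\neg (P \land \neg P) \quad \text{(noncontradiction)}
\qquad
P \lor \neg P \quad \text{(excluded middle)}

% Type 2: a rule for rearranging symbols into an equivalent form
a(b + c) = ab + ac

% and a configuration of algebra for geometric purposes:
% the slope-intercept form of a line, with slope m and y-intercept b
y = mx + b
```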
Any truth claim stretching past the bounds of conceptual frameworks, addressed to reality beyond the mind, has always included at least a sliver of uncertainty because we have thus far not been capable of knowing either the constraints on our minds imposed by a fractionally comprehended physical world or those operative in the even less known total reality. We make decisions and can come up with better hypotheses for how to make our decisions effective, but we do not see anything in its entirety, not even ourselves. All historical, present and imaginable knowledge can conceivably be wrong in some way, and from a perspective that takes historicity into account, it always has been in error and will for the foreseeable future remain, if not illusory, at least incomplete.
Even so, science has made uncertainty manageable with methods that indicate when deviations from expected or desired outcomes are meaningful enough to warrant attention. Mathematics has been fashioned into a statistical toolkit for determining whether the differences that show up between almost any two trials of a laboratory experiment or scientific analysis are significant. The causality behind these discrepancies can never be pinpointed exactly, and if we were more perceptive, smarter or more advanced, this basic unpredictability might remake our worldview into some kind of chaos theory of matter; but once we pass a standardized threshold of precision, the randomness in order, the disorder in patterns, is negligible to theory, technology and their functions. There are also qualitative indicators that the unpredictable wiggling of phenomena has gone from negligible to significant. Developments such as unusual spatial patterns, critical slowing down, or widening oscillations in the dynamical properties of a system are warning signs that give us enough certainty about the imminence of rapid, extreme, disruptive change to warrant swift and sure responses.
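The statistical machinery gestured at here can be sketched in a few lines. Below is a minimal, hedged illustration (the sample values are invented, and the cutoff of 2 merely approximates a conventional critical value) of how a test statistic turns trial-to-trial differences into a significant-or-negligible verdict:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples: the difference
    in means scaled by the pooled standard error of those means."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Invented measurements from two runs of the same experiment.
trial_1 = [10.1, 9.8, 10.3, 10.0, 9.9]
trial_2 = [10.2, 10.0, 10.1, 9.9, 10.1]

t = welch_t(trial_1, trial_2)
# A |t| far below the conventional critical value (roughly 2 for samples
# of this size) marks the run-to-run difference as statistically negligible.
print(abs(t) < 2.0)  # True
```

Were the two runs to drift further apart, the same statistic would cross the threshold, flagging the discrepancy as worth attention.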
Philosophical thought experiments like those of Descartes, along with the mystique of quantitative science, can lead us to view certainty as an arcane topic requiring professional expertise to grasp, but it is merely an extension of the cognizance of negligible uncertainty built into the interaction of our minds with the world. To demonstrate this, we can once again consider modules of biological function in their connection to human nature. Consuming a pint of water with a high concentration of Vibrio cholerae, the bacterium that causes cholera, and then refusing treatment will result in death. A human being that does not eat for more than a month will die of starvation. A direct hit to the head by a cannonball fired at point-blank range will kill you. For a human being to be born naturally, fertilization of an egg by a sperm must occur. We cannot yet observe whether there is Earthlike life on Jupiter’s moon Europa, even though liquid water is present there. It is not possible for a human being to speak one hundred languages with fluency. A three year old cannot perform a heart transplant.
These facts may be so obvious that stating them seems bizarre, but even though we have one hundred percent assurance of their truth, we cannot actually prove them in an absolute sense since there are always more instances to be accounted for. Despite the intrinsic limitations of knowledge, possibility exists within a range, between extremes beyond which the likelihood of an event such as remaining alive, perceiving something or carrying out a task becomes so improbable that giving it consideration is unnecessary to effective hypothesizing. The composition of our apparent world is constrained such that uncertainty evaporates under many conditions, even though no conditions have ever been comprehended to completeness.
So we know some truths with negligible uncertainty, though we are unable to identify all of their causes, or even how much of this causality our most reliable knowing apprehends, and despite fallacies in perception and reasoning that pervade thinking and social collaboration. Knowing without proving, which is to say intuitive knowing or knowing in general, is a product of the stable causality that has molded our lineage for billions of years within a narrow spectrum of environmental conditions relative to existence as a whole, conditions to which we have adapted.
Prehistoric Homo sapiens on the cusp of civilization was, from the perspective of ecology, a nearly ideal species, invulnerable to self-destruction, to destroying its environment, or to being driven to extinction by typical selection pressures, even though life could be hard and it was only a matter of time before a mass extinction like the one that wiped out the dinosaurs took place again. With the advancement of culture, our ability to survive reached such a magnitude that we could do better than perennially act out adaptations as an evolutionarily unassailable species; we could reconstruct the environment to suit our needs. But as ecological and social circumstances transformed in both intended and incidental ways, new pressures took effect that our natures were not adapted for, and we have struggled both with and against each other ever since to tame, harness and exploit for our benefit the new dynamics of human life in a globalizing civilization. We can undermine ecosystems on a planetary scale, annihilate ourselves with the invention of intelligent technology, or potentially secure our way of life against natural cataclysms such as asteroid strikes and social cataclysms such as ruinous violence or the oppression that tends to stagnate progress. Our future is largely ours to control, but we have not yet achieved a self-control and theoretical understanding sufficient to optimize civilization for peak mobilization, quality of life, and general prognosis for human individuals and their collectives.
Theory and self-control converge in the intentional facet of the psyche, which from a biological perspective is a cognitive function structured to regulate behavior, actualize desires and in many cases give pleasure as the decision-making control center of a human organism. Our consciousness integrates the causality that influences us: the fluxing within both our bodies and our ordinary range of environmental conditions is fixed into artificial but efficient forms of representation as sensations; rendered as a synthetic manifold of perceptions and affect; processed, often inferentially, as a conceptual defining of stream of consciousness, short-term memory, long-term memory and feelings; and enhanced by self-awareness with its intention-driven, desire-motivated and pleasure-seeking observation, reasoning and experimenting in the most intelligent animals such as humans.
The systematic analyses of modernity are in essence an extension of this intentional thinking, the mind creating and assessing theoretical generalizations with their wide-ranging significance for truth and practice via methods of observation and reflection as well as some imitation. Science imposes theories and technologies experimentally, and refines them based on appraisal of their consequences. We collect the generalizations that work, in scientific theorizing and elsewhere, the ones that are predictive or fulfilling, and these practical generalizations become our preeminent truths, functioning as standards for judging when, why and how procedures and other behaviors are appropriate, guiding decisions. Reason's embodiment in technical theorizing has become basic to our interpretation of the world, deeply influencing customs and valuations, so let's take a closer look at the history and current status of theory's interrelations with the psyche.
The book Standards for Behavioral Commitments: Philosophy of Humanism is available as a free download. Topics covered include chemistry, biology, genetics, neuroscience, epistemology, the history of Western philosophy, cultural evolution, theory of cognition, ethics and much more.