One of the most apparent aspects of modern theory is its intimate relationship with language. Settings where theories are taught, applied and analyzed often feature a barrage of verbiage, and this becomes truer the farther individuals advance professionally. It seems language is a condition of the possibility for highly developed theoretical frameworks: spoken and written communication are necessary when either breadth of fact or complexity reaches a threshold past which clarity can only be achieved with assistance from a linguistic guiding of thought, inducing intricate cognitive procedures usually conjuncted to diagrams, tools and technologies, which do not make obvious sense to naked intuition. In many cases, humans will not even know what they are looking at without changes to the style and content of their thinking, and often their worldviews. When major shifts in the structure of the mind must take place before meeting the requisites of a field is at all to be expected, where purely on-the-job training would be a months- or years-long comedy or perhaps tragedy of errors, language is integral to making the transition, stimulating imaginations in a pedagogical environment, molding minds via engagement with descriptions so that the real world of technical practices, electronic interfaces, automated machinery and innovation has at least a minimum intuitiveness. Even savants who grasp concepts with almost no elaboration or practice, for whom arcane knowledge slips into the mind in an effortlessly perfect fit, still need language to at least point them in the proper direction, focusing their attention on pertinent details.
Language enables experts in the theories of a particular discipline to explain concepts to the uninitiated, giving novices from all sorts of subcultures the gist of what is going on, an idiom for translating between technical ideas and common parlance. Professionals not only learn how to think themselves, but also acquire an understanding of the way those from differing backgrounds think about their niche, so that connection can be made with diverse knowledge levels and specializations as well as all kinds of social conventions and customs. At the same time, language is nearly superfluous as specialists carry out tasks individually; the practice itself of theorizing and analyzing can be almost nonverbal. The application of theory spans a large communicative spectrum that depends on context, from pedantic to nearly mute, so comprehending the lodgment of theory in the psyche, a causal complex which is the core of the relationship between modern knowledge and culture as well as our efforts to transmit concept-based functions and drive progress, demands some in-depth consideration of this versatility in the human thought process.
Before assessing the psychology of linguistic meaning, it would be good to specify exactly what we mean when we speak of the theorizing that is central to its content and development. In general, ‘theory’ is a term loosely referring to any postulated explanation for observations. Theories can be as nontechnical as a proposed reason why someone did not get called for a second date, and as informal as a casual notion of why time seems to pass faster the older one gets, but our overall idea of what theorizing is, in all its incarnations, essentially derives from the place it holds in science, its technical function. We call explanations “theories” because scientific theories are our central ideas regarding the causal nature of events. In the previous chapter, scientific theory was defined as “a hypothesis proven to describe patterns in empirical observations aptly enough that additional occurrences are predictable”. Deconstructing this definition can reveal much about the nature of knowledge formation.
First of all, a hypothesis can be regarded as a testable postulate. Postulates in nontechnical situations do not necessarily have to be verified and may not even allow for the possibility, but scientific hypothesizing is distinguished by the fact that it is seriously entertained only when some experimental or analytical context would provide for it to be refuted or corroborated with certainty. A hypothesis predicts a particular result in circumstances already defined with enough specificity that conceptualizings conveyed by a postulating expression can be clearly wrong or viable as an accounting of those circumstances. If we hypothesize that adding table salt to water increases the freezing point, and then discover that this aqueous solution is still a liquid as we adjust the temperature below 0 °C, our hypothesis is absolutely wrong, and if we had made the opposite hypothesis, it would be true, though still falsifiable in principle. We may draw the conclusion that dissolved salts in general lower the freezing point of water, or with more certainty that table salt always lowers the freezing point, but these claims, no matter how likely, are still conceivably disprovable, and any description of the chemistry has to this point lacked exhaustive validity, as sciences of atomic structure transform in a revisionary process that is far from complete, undoubtedly as a whole if not in relation to every application.
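To make the falsifiability point concrete, here is a minimal sketch in Python, assuming the textbook freezing-point-depression approximation for dilute solutions (a depression of i × Kf × m degrees); the constants are standard textbook values, while the function names and the "observed" reading are invented for illustration rather than drawn from real data.

```python
# A minimal sketch of a falsifiable hypothesis about salt water, assuming the
# textbook freezing-point-depression approximation dT = i * Kf * m for dilute
# solutions. The "observed" reading below is an invented placeholder.

KF_WATER = 1.86      # cryoscopic constant of water, degC * kg / mol
VANT_HOFF_NACL = 2   # NaCl dissociates into roughly two ions per formula unit

def predicted_freezing_point(molality: float) -> float:
    """Freezing point (degC) the 'salt lowers it' hypothesis predicts."""
    return 0.0 - VANT_HOFF_NACL * KF_WATER * molality

def test_hypothesis(hypothesis: str, molality: float, observed_fp: float,
                    tolerance: float = 0.2) -> str:
    """Compare a directional hypothesis against an observed freezing point."""
    if hypothesis == "salt raises the freezing point":
        return "refuted" if observed_fp < 0.0 else "corroborated"
    if hypothesis == "salt lowers the freezing point":
        expected = predicted_freezing_point(molality)
        return "corroborated" if abs(observed_fp - expected) <= tolerance else "refuted"
    return "not testable as stated"

# A 0.5 mol/kg solution observed (hypothetically) to freeze near -1.9 degC:
print(test_hypothesis("salt raises the freezing point", 0.5, -1.9))  # refuted
print(test_hypothesis("salt lowers the freezing point", 0.5, -1.9))  # corroborated
```

Either verdict is reachable from the same observation, which is precisely what makes the postulate a hypothesis rather than a mere musing.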
Proving a hypothesis depends not only on its legitimacy in a particular investigation, but also on replicability of conditions under which the hypothesis appeared accurate, with negligibly dissimilar results in trials performed by a multitude of researchers at separate places and times increasing certainty. Keeping the environment as constant as possible isolates factors under consideration such that many accounts of what is going on can be ruled out and precision increased. Using a constant amount of distilled water and table salt in initial trials of the aforementioned experiment eliminates possible variation in the dynamics of these salt solutions due to quantity discrepancies and errant substances, basically unassessed causes. It also makes exact proportionality of table salt concentration to freezing point measurable, so that results allow additional contexts, such as solutions of larger volume or those using tap water, to also be assessed. We might hypothesize that changes in volume with constant concentration have negligible effect, or that tap water solutions have a slightly lower freezing point because of higher solute concentration, which new experiments could validate or debunk. We might also, with extremely stable experimental setups, be able to hypothesize about unexpected factors, as when laboratory conditions are so standardized that the only realistic explanation for different results in Denver, Colorado and Los Angeles, California is the difference in barometric pressure resulting from elevation above sea level.
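The role of replication can likewise be sketched as a toy simulation: repeated noisy trials at a few hypothetical labs whose pooled average settles toward a stable value. The "true" freezing point, the noise level and the lab names are all assumptions made purely for illustration, not measurements.

```python
# A sketch of replication: many noisy trials of the same measurement, pooled
# across hypothetical labs, with anomalies averaging toward a stable value.
import random
import statistics

TRUE_FP = -1.86           # assumed underlying freezing point (degC) at a fixed concentration
MEASUREMENT_NOISE = 0.15  # assumed spread of instrument and procedural error (degC)

def run_trial(rng: random.Random) -> float:
    """One measurement of the freezing point, with random experimental error."""
    return TRUE_FP + rng.gauss(0.0, MEASUREMENT_NOISE)

rng = random.Random(42)
labs = {"lab_denver": 30, "lab_los_angeles": 30, "lab_oslo": 30}  # hypothetical sites
all_readings = []
for lab, n_trials in labs.items():
    readings = [run_trial(rng) for _ in range(n_trials)]
    all_readings.extend(readings)
    print(f"{lab}: mean = {statistics.mean(readings):+.2f} degC over {n_trials} trials")

print(f"pooled estimate: {statistics.mean(all_readings):+.2f} "
      f"+/- {statistics.stdev(all_readings):.2f} degC")
```

The individual readings scatter, but the pooled mean converges on the underlying pattern, which is what gives replication its certifying force.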
With repetition, minor anomalies average to negligibility and a definite pattern begins to stand out from the overall data. With many systematic variations, numerous correlated patterns are generated and a general picture starts to take shape. With investigation of changes in matter, it was the pooling and modifying of rudimentary, alchemical practices into millions of slightly tweaked hypothesis tests employing increasingly standardized instrumentation that fashioned the colossal edifice of modern chemistry. Hypotheses are the conceptual scaffolding that allows us to build a proven science.
Descriptive expression essentially uses some combination of symbolic features, whether sounds, pictures, or visual characters, to represent and convey perceptions and meanings as well as discharge affect. Scientific descriptions provide these same benefits, but their symbolism also serves a technical purpose, not merely actualization of the psyche in a fulfilling and socializing activity with functionality adequate for survival, reproduction, and motivational desires of all kinds, but an almost psyche-subordinatingly direct and explicit modeling of reality for the sake of clarity and accuracy as the most indisputable conceptualizings of truth.
‘Truth’ is a broad term of many applications, but in science its meaning is simply the apparent nature of a definite pattern, whether it be consistency within and between data sets from experimental trials, the persistent properties of a phenomenon, or any perspicuity at all, orderedness predictable enough that we can say we know what it is going to do in conjunction with notions of its form and function. Ultimate causality has so far exceeded the range of human perception, but the definite patterns of science are conditional perceptions of causality arising in controlled or at least solidly cognized contexts. Scientific modeling describes the systematic cause and effect that manifests during environmental analysis in terms of conceptual parameters upon which comprehension of these environments is based.
Water freezes when temperatures drop below 0 °C, a quantified concept established by many experiments, and we also observe that volume and concentration vary in quantitatively definable ways. Our saline solution experiment assesses how table salt as a solute affects the freezing point of water within a context in which these conceptualized phenomena – temperature, volume and concentration – are controlled. Analysis apprehends the research environment within its technical constraints, converting notions of causality from nebulous factors into discrete variables. We make the environment mechanistic – like a machine with its array of precisely defined components – in ways that our hypothesized expectations lead us to believe might result in insight, then vary these components and look for patterns that illuminate their relationships. As we experiment, tweaking these conceptual systems that are like hypothetical machines, some outcomes surprise us and lead to reconceiving the nature of the system: table salt lowers the freezing point of water, just as steel parts make machinery more resistant to corrosion than iron parts. As we carry out further observations guided by our expanding collection of hypotheses, small-scale patterns gel into larger scale patterns, but the whole structure as a sprawling mass of concepts becomes difficult for the mind to handle.
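A brief sketch of this mechanistic framing, with the environment reduced to discrete variables: hold volume fixed, sweep concentration, and extract the pattern as a slope. The data points are idealized placeholders consistent with the colligative-property approximation used above, not laboratory results.

```python
# Treating the experiment as a mechanism with discrete variables: hold volume
# fixed, vary concentration, and look for a proportional pattern in the output.
# The readings are idealized placeholders, not measured data.

concentrations = [0.0, 0.25, 0.5, 0.75, 1.0]           # mol NaCl per kg water
freezing_points = [0.00, -0.93, -1.86, -2.79, -3.72]   # degC (idealized readings)

# Ordinary least-squares slope: how much the freezing point drops per unit of concentration.
n = len(concentrations)
mean_x = sum(concentrations) / n
mean_y = sum(freezing_points) / n
slope = (sum((x - mean_x) * (y - mean_y)
             for x, y in zip(concentrations, freezing_points))
         / sum((x - mean_x) ** 2 for x in concentrations))

print(f"freezing point changes by {slope:.2f} degC per mol/kg of NaCl")
# The emergent slope is the small-scale pattern that a metamodel such as
# colligative-property theory later abbreviates and generalizes.
```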
In order to lighten the load, these dense webs of innumerable patterns are fitted with simplifying metapatterns, more intuitive interpretations of scientific truth that make reasoning easier, streamlining the cognition accompanying analytical procedures, integrating conceptualizings so practices become more manageable and reliable. Maslow’s hierarchy of needs made the content of sociology holistically intelligible and ideals of diverse cultures analogizable. The periodic table of the elements succinctly organized a huge and growing collection of quantitative patterns in chemical reactivity, rendering them readily memorizable. Atomic theory made the many quantitative differentiations of chemistry friendlier to the mind as a general concept of interactions between spherical particles. Metamodels such as these abbreviate the staggering quantity of basic models describing patterns of scientific observation directly, which often involve extensive mathematical detail, translating them into forms that are more intuitively cognizable and universalizable.
The experiencing of patterns is as complex as the cognition which brings it about, all situated in a massive aggregate of the organic much more complex still, in turn located within Earth environments tucked into a tiny corner of the universe from which we get only a sliver of a glimpse into the totality. However, despite a thus far inconceivable surfeit of phenomena taking place behind the scenes or beyond the reach of consciousness, analytical transmutation of environments into ‘definite’ patterns describable as scientific truth requires a very palpable state of awareness, combining perceptual content, conception and often inference in heightened attentiveness, a sort of active, focused mentality. It is almost effortless, but deep introspection is required to gauge this purpose-guided mentality as a subsidiary component rather than the core of one’s consciousness, for our awareness paradoxically tends not to be entirely aware of itself. It is an intentional state of mind that holds an executive role in discerning what qualia are and what they mean.
Though there may certainly be differing ways to classify patterns, we can get a sense for the nature of this self-aware, purposive state of consciousness with reference to the six categories of pattern outlined in section 2, subsection iii, “The General Nature of First Person Experience: Universal Characteristics”. The ocean is a natural pattern: a perception of manifold qualia, an observational resolution of this manifold into a flowing body of water with wave peaks, troughs and whitecaps against a skyline, and the aggregate of concepts that constitute what an ocean is, its causes, effects, implications and meaning. More specifically, we perceive the ineffable color of the ocean, resolved by observational attentiveness into an approximate bluishness linked to our concept of the particular color blue embedded along with its inferential possibilities in the perceptual and conceptual multiplicity of the visible palette, altogether a simple pattern. The theoretical pattern of light’s radiation and reflection, founded on simple patterns drawing from perceptions, has been clarified by experiment as a wave form with constituent wavelengths corresponding to different colors, and we deduce that the ocean is blue because water more readily absorbs the rest of the visible spectrum with its longer wavelengths. Theory allows us to realize that water is perceived as clear in a small vessel because it requires large volumes to absorb all long wavelength light in the environment, and we of course engage with this property when noticing that one can introduce substances imbuing water with any hue, applying methods such as adding food coloring to make transparent water blue, a kind of basic insight into causality enacted as a technological pattern. Perhaps we are swimming at a Hawaiian beach; we feel the sensation of warm water, realize by rapid analysis together with a general understanding that warm ocean water is populated by many sharks, the cognizance of warmth being a symbolic pattern representing to our minds an important feature of the environment not witnessed directly, and decide to stay close to shore, behind coral reef. We look at the artistic patterns of a painting, experience the perception of a lurid blue color, observe it with more intentiveness as a sort of jarring depiction of the ocean, and start to contend with its surrealism, considering whether we like it, what it means, and how to categorize it stylistically.
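The claim that water must be encountered in large volumes before it visibly absorbs long-wavelength light can be illustrated with a rough Beer-Lambert calculation; the absorption coefficients below are loose, order-of-magnitude placeholders for pure water, chosen only to show the qualitative effect rather than taken from authoritative optical data.

```python
# A rough illustration of why a glass of water looks clear while deep water looks
# blue: exponential (Beer-Lambert) attenuation, transmitted = exp(-a * path_length).
# The coefficients are assumed, order-of-magnitude values for illustration only.
import math

ABSORPTION_PER_METER = {     # assumed illustrative coefficients, 1/m
    "blue (~450 nm)": 0.02,
    "green (~550 nm)": 0.05,
    "red (~650 nm)": 0.35,
}

for path_length_m in (0.1, 1.0, 10.0):
    transmitted = {color: math.exp(-a * path_length_m)
                   for color, a in ABSORPTION_PER_METER.items()}
    summary = ", ".join(f"{color}: {t:.0%}" for color, t in transmitted.items())
    print(f"path length {path_length_m:>4} m -> {summary}")
# Over 0.1 m nearly everything is transmitted (the glass looks clear); over 10 m
# most of the red is gone and what remains to be scattered back is mostly blue.
```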
In all these cases, percepts and concepts blend and morph via an intentional mentality we would identify as targeted thinking, distinguishable as existing in between purely perceptual spacing out and ruminative daydreaming. Active attentiveness, our self’s focused awareness, while certainly not the only practical state of consciousness, is key to all experimentation, for it is in this condition that we strategically analyze our environments and memories in order to characterize phenomena in new, progressive interpretations of causality. Human hunters scrutinized and thereby learned the feeding habits of species, inferred the concept of particular foods as bait, and set traps accordingly to make an easier catch. Prehistoric humans noticed that salts absorb water, then that salts cause meat to retain moisture without decomposition, a counter to desiccation and spoilage, even further how absorbed salt seems to prevent freezing, and scientists eventually put these salt solute properties to the test in the lab, establishing with quantitative precision how concentration affects freezing point. Human beings of all times and places figure out what phenomena do and their implications for our lives, essentially making the world intelligible by intuiting, forming, recalling, and expressing associations amongst percepts and concepts, from out of which springs the experience of reality as a collection of facts functioning as the basis for understanding, organizing, reconfiguring, adapting the world for our wants and needs.
Modern empirical observation applies this analytical frame of mind within the methodological context of ’empiricism’, a paradigm based upon the modeling of perceptions that are capable of being experienced and comprehended collectively. Our aqueous solution experiments are quintessential examples of the empiricist approach: any human being can prove associated hypotheses true or false anywhere on Earth at any time, even those lacking professional education in the science of chemistry, so that the evidence produced is almost universalizable. We can contrast this with the equally observation-dependent postulate that telepathic communication occurs between ants, which is an entertaining thought, but not conceivable with enough machinelike, mechanistic clarity to parameterize an investigation and formulate a hypothesis verifiable or confutable with certainty. The postulate that life exists on Jupiter’s moon Europa is formulizable such that we can construct a testable hypothesis, such as “alien microorganisms will be observed in the liquid water on Europa’s surface”, but do not have access to environments where this investigation would be practicable.
As scientists carry out research, proven hypotheses accumulate, from which are constructed models accompanied by technical terminology, elaborated via academic papers into a broad framework of description that furthers intuitions, discovery procedures, methods of analysis, and infers a wide range of implications for the relevant field. These productions become the raw material of a discourse that expands factual foundations, disseminates ideas, suggests applications, and contributes to the development of new research projects. Simpler ways of organizing technical concepts are invented, such as Maslow’s hierarchy of needs, the periodic table of the elements and atomic theory, technicalities are translated into descriptional forms more accessible to nonspecialists as well as the general public, all of these knowledge contents and communication tactics are incorporated into education, and derivative endeavors in technology are commenced.
Occurrence simply exists: stuff happens. We perceive this in distinctly human ways, in line with our intuitions, knowledge, opinions, beliefs and worldviews, both individually and socially. Entrepreneuring intellects observe novel aspects of nature, collecting a hodgepodge of facts that at first make only vague sense as a whole, but a pervading orderliness starts to stand out as the causality of many patterns, their structures, interactions and functions. Once we gather enough information, the context grows more coherent, and we reach the point of being able to make many effective predictions. At this stage, assembling facts becomes an enterprise of more systematic experimentation by which we test our burgeoning apparatus of concepts, and we eventually can anticipate how further fact will confirm our notions either true or false, with postulation transitioning to more provable and disprovable protohypothesizing. When the investigative context is understandable enough that many of its features are conceptualized as well-defined and deeply predictable ‘constants’, the environment has been parameterized to an extent rendering it controllable, or at least provisional of some extremely dependable assumptions. Experimental and analytical empiricism has made its arrival, with environments subsumed into conceptual frameworks, including presuppositions sufficient for turning fact-finding into the application of long-standing approaches for discovery as well as information processing: the methods, techniques and classifications of research.
Once methodologies of analysis are firmed into conventions, postulation has progressed to theorizing, which utilizes generally applicable procedures for the introduction and assessment of hypotheses. Patterns revealed in individual analyses give rise to emergent overall patterns, or looking at this process conversely, from the perspective of cognition rather than technical context, the large body of factualized conceptualizings derived from hypothesis-testing coalesces into conceptual frameworks. These top-down paradigms give strategic direction to the search for further descriptions of phenomena, and theoretical modeling of causality graduates to theoretical disciplines, the institutionalizing of foundational thought forms that consolidate the task of converting phenomena into knowledge.
Once the causality of patterns has been theorized such that it is extremely predictable, attempts are made to use theory for predicting real-world situations as well. Scientists proved by experiment that table salt lowers the freezing point of water, ponderers also conceived its potential to prevent water from freezing outdoors, and the decision was made to test whether applying this cheap and easily obtainable substance to road surfaces in snowy weather would mitigate formation of ice and keep traffic flowing at a normal pace. Predictions of highly developed theory usually have some degree of effectiveness; in the instance of salting roads, this technique at first seemed to be working like a charm. But initial deployments of theory, though often modestly successful, tend to result in unexpected consequences or insights, for actual environments are always much more complex than any investigative context of analysis with its schematic conceptualizings of phenomena as mechanism. In the case of deicing roadways, technologists discovered that salt has the unwanted effect of causing metal in vehicle surfaces to rust. New experiments were designed to test hypotheses about which likewise inexpensive and accessible chemicals might have the same effect as NaCl and similar substances while not resulting in corrosion. Theoretical predictions were made, and the practice of liquefying water on streets at below-freezing temperatures was modified to better outcomes.
So to summarize, modern theories, science’s descriptive models of reality, start with basic perceptions, proceeding to active observation, at which time general patterns grow apparent amongst the plethora of individual patterns. We seek out many investigative contexts to test our initial impressions of general patterns, and reflect upon the validity or invalidity of these intuitions. Our intuitions are enriched by cataloging more information until general patterns become a general picture of overall causality, exhaustive of the body of fact under consideration, into which further fact fits with greatly reduced uncertainty. Intuition becomes potent enough that we can predict what subsequent fact will imply in the context of our conceptual framework, so that it is possible to formulate hypotheses, statements of factual truths observers can expect, based on preexisting knowledge, to corroborate or invalidate with a high degree of certainty in investigative contexts of more systematic parameterization. These investigations can take the form of controlled experiments, such as in laboratory chemistry, or an analysis of natural environments using mechanistic models, like an assessment of global warming in terms of the rate at which arctic ice is currently melting compared with the remaining signs of past trends. By sorting, comparing and synthesizing millions of hypothesis tests, emergent theory gives a broad image of reality, making technological applications more efficacious and, if we think critically, with a willingness to challenge and update conventions, we achieve a progressiveness that steadily grows less error prone.
As can be seen, language’s role in current theorizing is simple to discern: it facilitates the popularization and dissemination of ideas by expression that makes the indirect experiencing of someone else’s thought processes more realizable. This leads to the question of how we reached this point; the multimillennial coevolution of theory and language must be explained, along with how these developments, so vital to the social organizing centered around technology, have changed the way humans conceive as they act and communicate.
As was touched upon, the psychical processes associated with current theory and technical practice as exacted by lone individuals do not vastly differ from those operative in prehistoric problem-solving, which we know of based on the similar circumstances of hunter-gatherers as recorded during the historical period. A primeval warrior observes the vulnerability of flesh to sharp objects, makes mental note of various types of sharp objects in the environment, experiments with and classifies their compositional properties, develops techniques for fabricating implements of optimal weaponization based on a general conceptualizing of both sharpness and the diverse materials at his culture’s disposal, then enhances his hunting and fighting capabilities. A primeval human perceives to the point of indifference that rainwater collects in puddles, and absentmindedly that many shapes have water-gathering properties, but one day realizes in a flash of insight how vessels can be used to collect water for drinking, applies this general concept derived from exposure to many analogizable patterns by placing pottery outside during a rainstorm to grant quick access to a clean draught, and a technological strategy is born. Automobile drivers accumulate many observations of vehicles during their lives, and ideally also some studied knowledge of vehicle construction, notice an unusual sound while operating their car, realize from their conceptual frameworks that a fan belt is probably the culprit, and take the car into the shop to get the part replaced, averting a total breakdown. At the turn of the 20th century, researcher Marie Curie was intrigued by the uninvestigated phenomenon of radiation in uranium. She systematically observed in a series of hypothesis-testing experiments that varying the amount of this element changes the quantity but not intensity of these emissions, and generalized her results by proposing that the structure of this substance’s atoms is responsible for what she coined “radioactivity”, early evidence of the subatomic structure from which radiation of this type exudes. Her Nobel Prize-winning discoveries initiated the science of modern atomic physics, after which she supervised the introduction of X-ray technology based on her research.
In all these cases, progress is not primarily dependent on talking to oneself or anyone else, but rather on a sequence of fractionally verbal cognitions: becoming aware of general features inherent in a multitude of perceptions with focused observation, finding a way to contextualize the environment of causal factors from which one’s intuition arises in a way sufficient for further refining these intuitions, and converting investigative contexts into technological ones. To condense, we can say that development of systemlike comprehensions, postulates or hypotheses upon which all kinds of explanations are founded, including scientific ones, involves the following procession: basic perception; deliberate observing; generalization as the conception that unifies observed patterns; conceptually parameterized observation; synthesis of analytical observings under mechanistic concepts as in theory and technology; and application of one’s increasingly cogent concepts to alternate environments, both naturally occurring and man-made.
During prehistory, evolving language became entwined with and transformed by technical sorts of concepts when it transitioned from predominantly representing to someone towards exacting a detailed representation of something, a closely scrutinized object or some such phenomenon, at which time the endpoint of verbal communication surpassed libido discharge, concordance or satisfactory response, becoming a matter of correctness. Holding preciseness of language in high regard probably began in force with quantitative measuring, the fastidious approximation of dimensions using visual gauging, hands, feet or mensurational devices, giving language as addressed to objects more than just conceptual meaning, but also a conceptual structure in the context of material causes. Structural conceptualizing, due to constant gravitation and additional properties of the environment, proved extremely practical, and materialistic thought as applied to technology became the foundation for managing logistics of civilization, in settlement planning, management of the food supply and elsewhere.
Soon after the dawn of civilization, technical thinking had ascended to primacy as the fulcrum of life, though still infused with many spiritual notions, and as writing became a form of speech, expressing the phonetics of verbalization in addition to its initial role as a pictorial system for inculcating and preserving concepts, language grew to be more of a reflective mirror for the relatively nonverbal mentality of perception, observation and conception alone, at least in some contexts. This conversion of precision thinking, evolutionarily synergized with quantification of the environment, into precision talking seems to have waxed institutional with a filiating of literature as narrative myth into the genre of philosophical narrative. Painstakingly crafting speech into lengthy writings provided a venue for not only richer symbolic meaning but also the ponderous cerebrations necessary in efforts to translate the substantially nonverbal thoughts involved in managing material causality into nonprecisional natural language that had initially developed for an aesthetic of striking immediacy and, by modern standards, a crudely “good enough” representing of concepts, not as an analytical instrument.
Grammatical language resembling that of current humans likely coexisted in some way with perception, conception, observation and socializing for at least tens of thousands of years prior to the invention of writing, contributing to the stimulation of concepts in other minds during interactions, but though its forms coordinate with thought, the core mutations of spoken language unfold partly independent of cognition, and features of verbal expression transform whenever populations separate or a distinct subculture develops, even one as small-scale as a single family. Words blend, vocabularies expand or contract, sound production undergoes all kinds of phenotypic drifting, a phenomenon beyond the baseline meanings and functions conjuncted to ecosystems, as well as external to structural constraints of single minds and society at any given time. Furthermore, speech offers pleasure for many individuals regardless of its particular content, attributable to a host of causes, and so trajects towards peak expressiveness, its full capacity, via the impact of affect satisfaction and bonding within the framework of both social norms and assertings of distinctiveness. Simply put, humans like language, and so behavior often selects for its enhancement as it evolves.
A unique characteristic of human language is its creative potential, for the possibilities accessible via speech acts are practically unlimited. Humans not only take pleasure in language as a hub of self-meaning and identity, but variability in these highly conceptual pleasures is inexhaustible, and so speech provides a conduit to the diversification of intentional behavior. Communicators innovate expression, these novel expressions shake up social arrangements in a way that gives not only verbalization but also the full-bodied experiencing of new thoughts, feelings and instincts a niche, and human nature reinvents itself as it constructs its linguistic architectures, a social phenotype infinitely accommodating to behavioral phenotypes. If we can say anything, we can at least in principle be anything we can want to be.
Of course behavior is influenced by causes besides language use, factors of environment, brain structure, perception, most of which are at least partially unconscious and all of which involve constraints: what we want to be exists within boundaries that limit our purposeful control. So even as we intentionally create our world by arranging, rearranging, remaking society with license afforded by love and tolerance for language, our enriching cultures of behavior can ossify into conventions, becoming foundational to human life, so that behavioral/cognitive archetypes such as the shaman, the warrior, the chief, the artist and the healer took effect as society evolved. Psychological phenotypes that exceed the self, together with their traditionalizing in practices such as those of religion, war, authority, media or medicine, direct the course of culture as well.
The procession of our prehistoric living was like the surges, cascades and eddies of a river, contained within the shores of human nature’s deep structure and its transcending of our wills, but gradually altering course by carving out new features. With arrival of civilization, huge population and rapid technological progress, the human psyche’s cultures have been amplified into a churning maelstrom, pummeling our unconscious with massive waves, torrential downpours and gale force winds, flooding the surrounding landscape of unconsciousness to much destructiveness, but also with power to more thoroughly reshape the lay of the land.
At the start of antiquity, what became the perfect storm of modern culture was comparatively mild, with more sporadic disturbances, and integration of analytical thought with natural language was able to make its pioneering headway. As literature shifted from primary concern with incarnating the most evocative values, enthralling or edifying its audience, and towards analyzing what and why the world constitutive of its content is materialistically, as an objectified causality, language use acquired a greater dimension of technicality.
The first philosophers stated their intuitions about reality as truth postulates expressed with common words, then organized this primordial fact into generalizations, an attempted accounting of the environment’s structural essence in qualitative terms, the holism of its causal factors. They claimed that events occur because of elemental properties, but did not yet have any kind of formal method to systematically put their musings about substance to the test: technical thought was generalization without science, the postulation of broadly encompassing principles in the absence of what we know as modern hypothesis. These generalizations of causal structure were unprecedented linguistically, and as they grew increasingly abstruse, reconfigured to assimilate new fact, the connotations of common language became inadequate for intellectual discourse. At this stage, the nonscientific naturalism of philosophy began to use terms in idiosyncratic ways and coin new terms – in ancient Greece for instance, ‘nous’, ‘logos’, ‘eidos’ (forms), ‘arete’ (virtue) – to the extent that deep thought and some specialized training were necessary to understand the literature. In B.C.E. Europe and perhaps elsewhere, philosophy parted ways with natural language as it attempted to express technicalized materialism, the structural essences of substance, diverging from mainstream meanings. This process began in a fringe genre of conjectural poetry, but advanced to a professional field – metaphysical epistemology – that by the height of antiquity was cutting edge academics, then after a late first millennium C.E. decline in European scholarship, restored to preeminence during the Middle Ages.
Mathematical language also got off the ground during antiquity as quantitative measurement grew more precise and standardized, calculation with abacuslike devices became commonplace, and an alphabet of numerals was introduced. The methods of math were applied in defining trade value, fully quantified with the introduction of currency. Mensuration for engineering purposes increased in complexity until structure building involved ambitious designs based around labeling and arranging objects as pure mathematical, quantitatively idealistic specs. Proliferation of technological and economic practices gave all the materials of civilization numerical identity via a combination of vocabulary innovations, drawing and writing. Compulsion to rationalize environments with math was extended to the natural world also by communities such as the Pythagoreans of ancient Greece, and these subcultures helped transition the first materialistic concepts into notions of atomistic units and their quantities, which ultimately led to theorizing of a mechanistic, ‘physical’ reality.
The progressing practice of measuring geometrical lines and shapes with precision, then terming these quantities with numerals, produced some new concepts. Human sense-perception innately perceives proportions, but the application of math in defining proportionality resulted in a clearer notion of fractions, the exact relativities of parts to both wholes and each other, also equivalence as identical proportion, and congruency as the resizing of an object, changing scale while constitutive proportions remain identical. Principles of proportionality in various kinds of objects led to methods for finding unknown values without comprehensive measurement, the origins of algebraic thinking.
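A small worked example of finding an unknown value by proportion alone, in the spirit of this early algebraic thinking; the scenario and the numbers are invented for illustration.

```python
# Finding an unknown value without measuring it directly, via proportionality:
# a stick and a column cast shadows under the same sun, so their heights stand
# in the same ratio as their shadows. All figures here are invented.

stick_height = 1.0     # meters, measured directly
stick_shadow = 1.5     # meters, measured directly
column_shadow = 12.0   # meters, measured directly; the column itself is not measured

# Identical proportion: column_height / column_shadow == stick_height / stick_shadow
column_height = column_shadow * (stick_height / stick_shadow)
print(f"inferred column height: {column_height:.1f} m")   # 8.0 m, no ladder required
```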
Concepts of fractionality, identicality, and their utilization in problem-solving were also fostered by calculations performed with standardized objects such as pebbles, beads and coins, then these more nonmaterial applications of math, disengaged from the concrete properties of any particular substance, stretched intuition even further. Philosophers pondered infinity, the limitless divisibility and iterativity of a pure, disembodied quantity, and mathematics eventually arrived at ways to notate, theorize and apply the idea. Outcomes of subtracting a larger quantity from a smaller quantity or subtracting a quantity from itself did not represent anything material, but what if you did it anyway? From these tinkerings, negative numbers and the number zero entered the mathematical repertoire. “Zero” proved to have major ramifications for linear algebra, as equations that are solved using operations of multiplication and division can often be frustrated by this lurking nonquantity, and much of algebra demands some technical effort to work around conniving ciphers. Taking the square root of a negative number was impossible, but mathematicians invented a notation for these ‘imaginary numbers’ and did it anyway.
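Both of these “do it anyway” moves survive in modern notation, as a short sketch can show: a linear equation frustrated by a zero coefficient, and a square root of a negative number carried through with the imaginary unit (here via Python’s standard cmath module).

```python
# The lurking nonquantity and the imaginary unit, in modern notation.
import cmath

# Solving a * x = b by division is frustrated when the coefficient is zero.
def solve_linear(a: float, b: float):
    if a == 0:
        return "every x works" if b == 0 else "no unique solution"
    return b / a

print(solve_linear(3, 12))   # 4.0
print(solve_linear(0, 12))   # no unique solution: the cipher must be worked around

# The square root of a negative number has no real answer, but a notation for
# "imaginary" results lets the calculation proceed anyway.
print(cmath.sqrt(-1))        # 1j, the imaginary unit i
```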
As the first theoretical math of antiquity deduced principles of quantitation, it relied heavily on verbal explanation, proofs that usually consisted of labeled figures with a couple of natural language paragraphs using something like Roman numerals, which described geometrical properties. As the centuries passed, Hindu-Arabic numerals entered common usage, and this user-friendly alphabet of numerals made feasible the development of mathematical grammar, with symbols for operations, some of which everyone learns – e.g. “=”, “+”, “-”, “x”, “/” – so that quantitative thinking became linguistic, with acts of deduction based around construction and manipulation of symbolic expressions.
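As a familiar illustration of this shift from verbal proof to symbolic grammar (a generic example, not one drawn from the ancient texts themselves), compare the Pythagorean relation stated in prose with its modern form:

```latex
% "The square on the hypotenuse is equal to the sum of the squares on the
%  other two sides" becomes a manipulable expression,
\[
  a^{2} + b^{2} = c^{2},
\]
% from which an unknown side follows by symbolic manipulation alone:
\[
  c = \sqrt{a^{2} + b^{2}}.
\]
```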
Advancements in quantitative language made it possible to prove principles – laws, axioms, theorems – that could not be easily represented with spatial diagramming, and mathematical deduction was empowered to progress independent of material substance and its idealizations in geometry. Nonmaterialistic math expanded with the ongoing process of deducing all kinds of abstract conclusions from symbolically expressed premises in formal proofs, and mathematicians innovated ways to represent these new quantitative concepts in progressive geometries, using various coordinate systems. Thus, mathematical thinking was converted from deducing quantitative fact within the limits imposed by sense-perceptual intuitions of substance, to representing an unbounded range of quantification-related abstractions with all kinds of ideal form, an edifice of functions and formulas alongside complementary dimensionalizing. Complex geometrical structures and the algebraic language which expresses their properties are used to model observations that investigators quantify, making the products of empiricism more intelligible, a finely grained contextualization of fact as organized data.
With technicalization of some natural language contexts and the crafting of a symbolic language for mathematical concepts, institutional education as a means of spreading technical thought and procedure gained in relevance. During antiquity, this began as forums of instruction like Greece’s sophist-taught rhetoric and philosophical academies, or Confucianism-inspired study in the service of sculpting administration and policy within ancient Chinese government. During the Medieval period in Europe, scholastic pursuits at monasteries and universities managed to reach every region on the continent and realize the potential to provide a foundation for enculturing entire populations with systematic thinking. Literature was the focal point of training and professionalism in academia, so it seemed practical to make languages more standardized. In the Middle Ages, word lists for teaching vocabulary had been put together at particular establishments, but by the 17th century alphabetized dictionaries were being compiled, the lexicons of language fixated as universal definitions.
Academic education and then public schooling settled the meanings of common words so that natural language with its tendency towards evolutionary drift became more constrained, by the 20th century almost stationary aside from a trickling in of deliberate additions moderated by scholarship. Ordinary usage stabilized, though these resources are frequently stretched in literary artistry, and as technical contexts continued to diversify in new directions, the definitions of existing words were expanded to encompass scientific and technological connotation. “Matter” came to mean not just material in general, but a precise concept of physical substance and its composition, derived from theoretical chemistry. “Heat” was no longer merely the sensation of warmth, but also the kinetic energy of matter. “Energy”, which already meant motivation or vitality, was adopted in the 19th century as the word for the capacity of matter to change and do work. In physics, “work” means the energy transferred when a force moves an object, added to its general sense as “effort”. “Ego” is the Latin word for “I”, entering usage in 18th century metaphysics as “self”, ultimately adopted by Freud’s English translator as the word for his technical use of “Ich”, the German word for “I”, a concept that has become integral to even casual understandings of human psychology in the Western world. And we all know a gigantic assortment of neologisms for new technologies: for example, “telephone”, “car”, “airplane”, “radio”, “television”, “internet”, or “texting”.
Nontechnical words are constantly entering languages also; in English, some examples are “jumbo”, “bonkers”, “autotune”, “bling”, “bromance”, “chillax” and “infomania”. This new vocabulary reflects the nature of modern language use, for appeal and staying power are usually a result of quirkiness or humor rather than literary function, with a strong aura of technological savvy, words related to new electronic gadgetry or that someone might think of in the shower to make friends laugh. Words from foreign languages like “yoga” and “karma” also spread widely as the English-speaking world grows more multicultural. At least in the U.S., the artistic sensibility of language is becoming informally conversational, a move that indicates still greater shifting from aesthetics of ostentatious sound and style to meaning as direct correspondence with objects of nonverbal analysis upon which our increasingly technological culture is based and that absorb more and more of our attention.
Written language is following in the wake of speech as much authorship moves away from the pretentious metaphors or sophisticated grammar of former eras and towards economy of expression. There are a few main genres that all evince this transition to plainer verbiage. In fiction, novels and short stories are the primary forms, and tend to be much less academicized than in the 19th century and prior, as florid prose transformed into realism and naturalism, probably under the influence of scientific viewpoints. Articles are a pithy means of conveying fact in newspapers and on the internet, the foremost source of information for most individuals, a stylistic form designed with reading speed and ease of comprehension in mind. Textbooks and encyclopedias are the core literary tool of education, with a hybrid format including illustrations and diagrams integrated into the flow of their declarative exposition. Poetry is also still popular with many, and has become much more freeform, as writers usually assume a personal, even confessional style rather than conforming to structural conventions and treating of epic themes, invoking emotion and a tone without the requisite for ambitious plots, learned allusions and deference to tradition. Nonfiction books explaining scientific, historical and practical topics are a key player in inserting technical and methodological concepts into popular culture.
The nexus of conceptual development’s intersection with spoken and written language in the contemporary era is academia, for it determines the methods and contents of education, which in turn influence theory, belief, occupation, and the growth of society. It consists in two general fields of discourse: research reporting and literary analysis, which extensively combine in most real world practices. Research reporting explains theoretical models generated by scientific disciplines of all kinds, with procedures of peer review that aid in regulating and strategizing its agenda. It is as purely functional in its meaning as any communications get, exactitude aimed at individuals whose thoughts and activities must be intensively devoted to consideration of a small, almost absolutely defined set of details in order to utilize them at this level of expertise. Some technical articles in scientific journals are full of so much specialized, precision terminology that they are unintelligible to the nonprofessional. Literary analysis is the other end of the spectrum, composing interpretive theses in art and cultural criticism that try to broadly ascertain factors formative to civilization, such as works of art, movements and events, discerning their import as comprehensively as possible in the most intellectualized, deliberative considerations of human action as a whole. It examines language’s role as mass expression, more concerned than any other discipline with pure aesthetics, the nontechnical and often irrational aspects of meaning and behavior, analyzing dynamics that in their generalness and independence from any finite set of facts introduce high levels of ambiguity. Usually its grist is communication of allegorical, unspecialized, everyday, mainstream and popular varieties. In essence, research reporting does the quibbling about basic fact, and literary analysis quibbles about overarching truth.
Schooling of youths instills research reporting, the qualitative and quantitative evaluation of technical content, with lab reports, papers, group projects, and presentations in both hard and soft science classes: physics, chemistry, biology, anthropology, psychology, sociology, economics and politics. Successfully handling concepts in professional settings depends on possessing a basic fund of general knowledge as well as the ability to translate technicalities into spoken and written form. This allows new facts and methods to be readily grasped, and when language is required, communications to be effectively carried out.
Literary analysis, the assessing of cultural generalities, is instilled by reading, essay writing, creative writing and stylistic performing in art and literature classes. This subject matter cannot be exhaustively taught because of the indefinite vastness of thought, belief and behavior, which spontaneously generates in ways transcendent of any singular or unified enactment of intentions, and beyond any possible adumbrating of strategies and consequences in social organizing. Thinking analytically about past and present memes does not tend to make their emergence a more closed, law-governed system, and can sometimes shake up society in profound ways, the reason for authority’s love/hate relationship with social commentating, but it gives citizens the ability to comprehend the nature of new memes, in whatever organic and often unpredictable way they arise, apprehending their background of influences, motivations, and the honorable, practical, predacious, deceptive or ill-conceived purposes served. Individuals obtain the capacity to grasp not merely how concepts are to be applied in a specific situation but also a wider sense of why, which exceeds insights available from personal experience, local conditions, and institutions attempting to advocate particular facts and beliefs. Broad-mindedly critical judgement is not so much a benefit in meeting one’s daily needs as bulk immunization against degenerative ideologies, for citizens become capable of penetrating the facade of social movements with thought, recognizing and then rejecting pernicious stereotypes, fanaticism, mass hysteria, apathetic varieties of nihilism, intellectual submissiveness and self-destructiveness to which uneducated human beings are more susceptible. Reflection upon large-scale dynamics of culture, during which the boundaries of individual communities and eras are challenged, is necessary for Homo sapiens to be not just a technologically reasoning animal but further a memetically rational one, with a self-possessing integrity that does not succumb to herd mentality, an independent intellect mobilized to organize and progress on its own.
Communications aimed at a mass audience are often intended to mislead, manipulate or obscure flaws and uncertainties. Practice with logical expression, while minimally applicable to interpersonal speech and decision-making, enhances with intellectual growth the ability to analyze propagandistic kinds of truth claims, such as those of a political or otherwise rhetorical nature, for which the opportunity to achieve greater clarity by collaboration and teaching of an organized kind often does not exist. Let’s face it: modern citizens are regrettably not allowed to talk about some subjects in an explicit way, with many expressive forms designed for restricting communication to a parroting of the facts and beliefs that communities or institutions have sanctioned, antipathetic to critical analysis. But education can stretch these constraints by indirect means, making it so that what individuals cannot say about society they can still think, assessing assertions in terms of deductive structure, interpreting persuasional language inferentially so as to more readily perceive whether an idea justifiably leads to another.
With training in the construction of thoughts as argumentative proof, declarations of truth by politicians, pundits and promoters are more intuitively resolved into a collection of premises and conclusions mediated by evidence, which can stimulate reasonable skepticism and allow a contemplator to obtain clarification via the targeted pursuit of information, in strategic ways. Instead of rhetorical contexts reinforcing one’s own views via confirmation bias, as notice of only the most self-validating or provocative information, an audience is more likely to conceive in an analytical way what the total presentation implies or fails to imply, impartially assessing its intellectual merits. Recipients of rhetoric that are schooled in logical expression are also less likely to be distracted by irrational peripheralities, a communicator’s appearance, tone of voice or linguistic style, thinking more perceptively about the meanings of even common language, regardless of how appealing an aesthetic milieu might be for our instincts. This fully aware mindset can be a frustration to authority, but nonetheless benefits society by reducing its vulnerability to cultural regression when power becomes corrupt and strives to exact its aims with tactics of duplicity. Enough practice in the finer points of complex logic, and seeing through verbal sleight of hand becomes child’s play.
So putting it all together, we can start by saying that the Holocene era capacity for grammatical language probably evolved between fifty and a hundred thousand years ago. It grew more object-oriented rather than merely concept-oriented as technological methods advanced and thinking became more technical, with simultaneous progress from the rudiments of quantitative conceiving to more complex procedures for measuring, counting and calculating. With the beginnings of contemplative leisure, intellectual specialization and rapid dissemination of ideas at the onset of civilization, thinkers threw considerable weight behind precisely expressing technical concepts using both natural and mathematical language. Technical language of both qualitative and quantitative varieties began to diverge from common tongues, the origins of an academic discoursing that first sought to express in literature the general principles of material substance, a way of formulating thoughts stimulated by technological system-building, then proceeded towards reflection upon the metaphysical essence of existence, as well as contriving modes of logical proof which provided formal methodologies for organizing the analysis of all this technicality. Academia-mediated education spread until it was a primary institution consolidating the whole of language, theory and philosophy. This universalizing of quantification, philosophy, logic, language, and their employment in technological development gave rise to modernity’s empiricist rationalism spearheaded by experimental science. Sophisticated analyticity led to high technology and a deeper awareness of memetic dynamics. Technical concepts and methods function as tools by which to theorize at a more progressed level. Also key in forging the modern episteme and engineering society is evidentiary, inferential proof. This critical approach to expression not only streamlines reasoning in academic settings, but makes civic involvement more cogent by empowering the general public to think systematically. We can secure the species’ future by rationality in policymaking as language infinitely accommodates the recording and analysis of historical precedent together with an expansion of the reach, profundity and lucidity, the humanist power of theory.