MuseLetter #181 / May 2007
by Richard Heinberg
Talking Ourselves to Extinction
Language is a powerful meta-tool that dramatically amplifies cooperative human efforts to control the environment. Language also opens the possibility for religion and science—which otherwise would not exist. Language helped generate our current ecological dilemma. Can language help solve it?
In systems theory and evolutionary biology, the word emergence describes the development of complex systems or organs; an emergent phenomenon is one based on the interaction of simpler elements but whose characteristics cannot be predicted based on a thorough knowledge of those elements. In the course of a species’ evolution a variation may appear that is retained because it confers an advantage in terms of existing functions; but once in place, the new characteristic may act in combination with other capacities of the organism to make truly novel and unexpected functions possible. Life itself has been described as an emergent property of matter, and sensation and mind are emergent properties of higher organisms.
Human societies are dynamic, complex systems, and most of their signal features are understandable as emergent phenomena. It is a fascinating thought exercise (I’ve been at it for two decades now) to attempt to trace series of events in the past in order to identify the most decisive developments that enabled the emergence of industrial civilization. Of course, societal complexity depends on humans’ ability to capture increasing amounts of energy from their environment, and so their genetic and social attributes that facilitate energy capture are crucial. Which of those attributes are keys to understanding the entire process?
Clearly, most of the emergent features of complex societies (their economies, technologies, and governments) depend on language. Now, language itself is an emergent phenomenon, a link in a long chain of them; however, it was a profoundly consequential one. In the grand edifice of human society, it should be considered a foundation stone.
The questions of how and when language evolved are hotly debated. Some archaeologists argue that the relatively sudden appearance, roughly 40,000 years ago, of counting sticks and new kinds of hunting tools suggests that language arose then. However, humans—including Neanderthals—were anatomically capable of speech much earlier; indeed, there is fossil evidence that the main areas of the brain associated with language (Broca’s area and Wernicke’s area) started to enlarge up to 1.5 million years ago. Moreover, humans’ ability to spread to regions outside of Africa, and especially to islands, may have depended upon their use of language to convey information and intention and to coordinate tasks. It may be that we have been using language so long that our brains, throats, and chests have all evolved in tandem. The situation is likely similar to what has happened in the computer industry over the past few decades: just as hardware and software developers work cooperatively, one designing according to the needs and capacities of the other, our own internal hardware (brain and speech faculties) and software (language) have become, in a sense, made for one another.
Part of the problem in determining when and how language arose may lie in definitions. The term language can refer in a vague or general sense to any sort of communication; but this usage is not always helpful. All animals communicate using sound, color, scent, or gesture. Even plants and fungi communicate with one another using chemicals and gene-packets transmitted via soil or air. Human language differs from these kinds of information transfer in its level of abstraction, its multiplicity of symbols, and in the complexity of its grammar (or system of rules for the manipulation of symbols). It is one thing to signal a somatic or emotional state or a general intention, but quite another to discuss events, including hypothetical ones, in the future or the past, or in distant places.
Language made these things possible, but much more as well. Language generated our peculiarly human form of self-awareness: we can talk about ourselves, talk about talking, and think about thinking. Our relationship with our environment also changed, as language enabled us to coordinate our thinking and behavior across time and distance in a way that was unprecedented, making us a far more formidable species (compare the population size and environmental impacts of humans today with those of chimpanzees or gorillas). Writing only exacerbated these trends, heightening the level of abstraction in language and widening our ability to convey thoughts and align collective action. If talking helped organize effective hunting bands, writing enabled the formation of nation states. Add printing, radio, television, and fossil fuels, and here we are today.
But with language came an array of unintended consequences—which, of course, is just another name for emergent phenomena.
Language and Religion
“In the beginning was the Word,” begins the Gospel according to John. In Genesis, creation commences with a series of spoken commands, starting with “Let there be light.” The creation stories of the ancient Egyptians, Celts, and Mayans likewise emphasized the generative potency of language.
This striking coincidence, noted by many scholars of world mythology, cloaks a supreme irony: while religion ascribes magical power to words, there are reasons to think that religion itself may be an inevitable though accidental outgrowth of language.
It is interesting to speculate as to whether non-human animals have awareness of something that humans might recognize as a spiritual dimension of existence. Do dogs and cats have near-death or out-of-body experiences? Do birds experience awe and wonder when watching the sunrise? There is no way to know for sure. In any case, it is fairly clear that no non-human species has developed a religion—if we mean by this term an organized set of beliefs about the supernatural, and a set of practices oriented to the service or worship of a divine being or beings.
Why not? What is unique about humans that would lead us to construct religions? Are we set apart because we alone possess souls? Or do our brains contain some unusual structure shared by no other animal? Research into “neurotheology,” while controversial, offers some clues: religious or spiritual experiences seem primarily to be associated with the right temporal lobe of the neocortex, implying that feelings associated with such experiences are normal features of brain function under extreme circumstances. Nevertheless, it is likely that the problem of religion is as much an issue of “software” (language) as it is one of “hardware” (brain structure).
Let us suppose that language was initially used only for practical purposes such as coordinating hunting efforts. Slowly, haphazardly, people must have developed rudimentary elements of vocabulary and grammar, often in order to aid with planning—an activity inherently implying the senses of location, time, cause, effect, and intention. Women, men, and children began to make simple sentences to ask and to explain—who, what, where, when, and why? Once the abilities to pose and answer such questions were in place, these inevitably began to be applied to less immediately pressing concerns. The Pleistocene hunter went from asking, “Where did these bison come from?” to “Where did stars, the Moon, the Sun, and people come from?” Hence the mythologies of aboriginal peoples everywhere are rich in origin stories. Language was seductive in its power: once a tiny morsel of reality had been verbally nibbled off, its incomplete digestion provoked a recurring hunger to take another and yet another bite, and eventually to swallow the world whole.
As power over the environment grew, as society became more complex and formidable, religion mutated accordingly. Hunter-gatherers saw nature as alive and filled with spiritual presences that could directly be engaged by way of shamanic practices. Such beliefs and behaviors grew out of these people’s direct interaction with their environment, and fit their needs for social cohesion within an egalitarian context. With division of labor and thus a hierarchical organization of society came full-time specialists who got their food not directly from nature but from other humans; some of these specialists were spiritual intermediaries (priests) who appealed to sky gods detached from nature and the lives of commoners. With writing, myths about the gods could be codified and carried to distant lands (this story is told in fascinating detail in Bruce Lerro’s From Earth Spirits to Sky Gods).
These side effects of language have had their own perplexing and sometimes nasty consequences. With religion we have come to believe absurdities, and hallucinated gods and demons have become central to people’s lives. Here in America in the early 21st century it is considered normal for people to talk regularly with a shared imaginary friend, and to believe that this particular imaginary friend is uniquely efficacious—to believe, in fact, that to talk to any other imaginary friend is blasphemy. While this behavior appears more than a tad daft to non-participants, the latter rarely comment on it publicly because to do so would be impolite, and because the believers are so numerous and so vehement in the defense and promotion of their practices. Now, talking to imaginary friends may serve a useful purpose: back in the 1970s, Julian Jaynes theorized that conversing with hallucinated gods provides a way for otherwise walled-off verbal and non-verbal areas of the brain to interact with one another. Nevertheless, the practice clearly risks personal and societal disengagement from reality. Many people have been killed simply because they talked to the wrong imaginary friend, or refused to talk to the right one.
German orientalist Max Müller (1823–1900), who virtually created the discipline of comparative religion, put the matter succinctly by asserting that mythology is a “disease of language.”
Perhaps the word disease seems too harsh. After all, mythology has its uses as well: as Joseph Campbell never tired of saying, myth gives us meaning. And surely meaning is a good thing. Nevertheless, the human need for meaning again highlights our obsessive and dependent relationship with language. Meaning is always attached to symbols: we invest a symbol with meaning, and that meaning is conveyed to whoever correctly interprets the symbol. We see a sentence written in an unfamiliar language and we wonder, “What does it mean?” As we have become ever more hooked on linguistic symbols, we have come to see nearly everything as if it were a sign for something else. We look to stars, tea leaves, and coincidences for meaning. The universe is talking to us! Myths are verbal narratives that seek to unpack the meaning of existence. We seldom wonder why it is that life itself must have meaning in order for it to be satisfying; is it possible that existence could be sufficient unto itself, with no need for an embedded message?
Religion consists of more than just mythology, though. Surely religion evolved at least partly to coordinate and moderate collective behavior via systems of morality and ethics. The senses of good and evil, of honor and shame, have become such powerful internal motivators for humans that even most atheists are continually compelled by them. There is nothing quite like this among other species, whose behavior tends to be less learned and more genetically scripted, and who therefore do not engage in the practices of rewarding or punishing one another’s behavior nearly to the same degree we do. Ironically, morality often contributes to humans’ most brutal acts, which have little precedent in other animals (witch burnings, as just one example, were morally motivated).
Nevertheless, the development of complex societies would surely have been difficult if not impossible without morality—which had previously often been turned toward ecological ends, as early societies codified their needs to moderate reproduction, avoid incest, and protect natural resources via their taboos (“Do not kill the red kangaroo during its mating season!”). But then, once religion and society had mutually mutated in the direction of abstraction and complexity, morality became at least partly unhinged from environmental and genetic necessity and began increasingly to adhere to written myths about the verbally hallucinated sky gods.
From an ecological point of view, the results were sometimes inadvertently salutary: religious wars (such as the Crusades) helped temporarily to moderate human population levels—though comparable results had been achieved by some hunter-gatherer societies using gentler methods such as herbal contraception. Some religions also promoted celibacy among priests, monks, and nuns, again helping to stem population growth. But as people’s verbal obsessions began to be taken up with myths that had more to do with consolidating the power of religious elites than with regulating people’s relations with the natural world, religion served increasingly as an instrument of social and ecological conquest.
Nevertheless, if language muddied humans’ connections with nature by way of verbal speculation, regimentation, and hallucination, it also fostered a countervailing tendency.
Grammar, Reason, Logic, and Evidence
Other animals observe, plan, draw conclusions from experience, and continually revise their mental pictures of reality. These capacities, the foundations of reason, are not uniquely human. Logic, which is the study of reasoning, is uniquely human, however, because it requires language.
Logic is inherent in grammar, which people developed and used long before there were grammar schools, or schools of any sort, and young children still absorb the basic rules of grammar intuitively without having to be drilled in them. In language, each coherent packet of meaning (such as a sentence) must adhere to some agreed-upon standards if it is to be useful. In this regard a sentence is like a mathematical equation (mathematics, after all, is itself a language): before an equation can be correct or incorrect, it must conform to basic rules. Unlike the statements “2+6=8” and “3+4=9” (one of which we would recognize as being true, the other false), the statement “=5+7 –” cannot be said to be true or false; it is simply unintelligible because it is not organized as a complete equation according to the rules of arithmetic. (Quantum physicist Wolfgang Pauli, who was known for his abhorrence of sloppy thinking, once famously commented that another scientist’s work was “not even wrong.”)
Grammar and logic give us the basis for making comprehensible statements about the world; linking logic with empirical evidence helps us formulate true statements and recognize when statements are false. This, again, is a long-standing practice: millennia before the scientific method was codified, people relied on feedback between language and sensory data to develop an accurate understanding of the world. Are the salmon running yet? Let’s go look.
However, not all possible statements could be checked empirically. If someone said, “These berries taste good,” that was at least a matter for investigation, even if not everyone agreed. But the situation was more complicated if someone said, “The volcano smokes—that must be because the gods are angry; and if the gods are angry it must be because we haven’t provided enough sacrifices.” Unlike the observation that the volcano was smoking, these latter two claims and the reasoning behind them had no checkable basis—unless the gods could be called into the village commons and publicly queried about their moods and motives (the attempt to do so may have led to the origin of shamanic trance mediumship). This was magical thinking—reasoning based on mere correlation rather than an empirically, publicly verifiable chain of causation.
It was inevitable that magical thinking would flourish given that there were so many subjects of interest for which empirical investigation was impractical or irrelevant. That situation continues: there is no empirical basis for answering, once and for all and to everyone’s satisfaction, questions like, “Does God exist?”, “Who am I?”, “What happens to us when we die?”, or “What is the greatest good?”
Yet however strong the temptation to engage in it, magical thinking when tied to religion failed to provide much practical help in industry or commerce. As these limits came to be appreciated, and as industry and commerce expanded, philosophers and students of nature began to construct the formalized system of inquiry known as the scientific method. Here was a way to obtain verifiable knowledge of the physical world; better still, it was knowledge that could often be used to practical effect. The method came to hand at a propitious time: wealth was flowing to Europe from the rest of the world due to colonization and slavery; meanwhile the development of metallurgy and simple heat engines had proceeded to the point where the energy of fossil fuels could be put to widespread use. When coupled with the project of technological invention, science and mathematics yielded undreamt-of power over the environment. When further coupled with capitalism (corporations, banking, and investment) and fossil fuels, the result was the industrial growth machine.
All of this would have been fine if we lived in an infinite sea of resources, but instead we inhabit a bounded, finite planet. Humanity had set a course toward disaster.
Language and the Ecological Dilemma
The ecological dilemma (which consists of the mutually rebounding impacts of population pressure, resource depletion, and habitat destruction) is certainly not unique to the modern industrial era; indeed, it is not unique even to humans. However, modern humans have created a dilemma for themselves of unprecedented scope and scale.
The dilemma, whether encountered by people or pigeons, is often a matter of the failure of success: the genetically engrained aims of the organism are to reproduce and to increase its energy capture, but its environment always has limited resources. Thus temporary population blooms (which are, in their way, evidence of biological success) are usually followed by a crash and die-off. In humans, the powers conferred by language, tools, and social organization have enabled many boom-and-bust cycles over the millennia. But the recent fossil-fuel era has seen so much growth of population and consumption that there is an overwhelming likelihood of a crash of titanic proportions.
This should be glaringly obvious to everyone. We know about the ecological dilemma from our ecologists’ studies of population blooms and crashes in other species. Our soil scientists appreciate the limits of modern agriculture. Our geologists understand perfectly well that fossil fuels are finite in quantity. And our mathematicians can easily calculate exponential growth rates to show how quickly population increase and resource depletion will outstrip our ability to satisfy even the most basic human needs. Verbal and mathematical logic, joined with empirical evidence, make an airtight case: we’re headed toward a cliff.
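The exponential arithmetic alluded to above can be sketched in a few lines. The function name and the 2 percent growth rate are illustrative assumptions of mine, not figures from the essay; they simply show the well-known “rule of 70” at work.

```python
# A minimal sketch of the arithmetic the essay invokes: any quantity
# growing at a steady percentage rate doubles on a fixed schedule.
import math

def doubling_time(rate_percent):
    """Years for a quantity growing at rate_percent per year to double."""
    return math.log(2) / math.log(1 + rate_percent / 100.0)

# At 2% annual growth (roughly 20th-century world population growth),
# the doubling time is about 35 years.
print(round(doubling_time(2.0)))  # prints 35

# Seven such doublings multiply the starting quantity 128-fold --
# which is why steady growth overwhelms any fixed resource base.
print(2 ** 7)  # prints 128
```

The point of the calculation is the one the mathematicians in the passage would make: a constant-percentage increase looks modest year to year, yet compounds into a 128-fold increase within a few human lifetimes.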
But language also keeps most of us in the dark. This is partly because magical thinking is alive and well—and not just in churches and New Age seminars.
In the last couple of centuries, the magical thinking associated with religion, under assault from science, has found a new home in political and economic ideologies. Economics, which masquerades as a science, began as a branch of moral philosophy—which it still is in fact. For free-market ideologues, the market is God and profit is the ultimate good. We have used language to talk ourselves into the myth of progress—the belief that growth is always beneficial, and that there are no practical limits to the size of the human population or to the useable extent of renewable or even non-renewable natural resources. This particular myth was an easy sell: it is an inherently welcome message (a version of “you can eat your cake and have it too”) and it seemed to be confirmed by experience during a multi-generational period of unprecedented expansion based on the one-time-only consumption of Earth’s hydrocarbon stores.
Meanwhile, at the business end of economic theory, masters of advertising, marketing, and public relations have learned deftly to manipulate symbols and images for emotional effect, sculpting the public’s aspirations for comfort and prestige. This new kind of magical thinking does contribute to commerce and industry—and spectacularly so! (For historical details on this, see the BBC television documentary series “The Century of the Self” by Adam Curtis, and the books of Stuart Ewen.)
In politics, the 20th century saw battles between the quasi-religious ideologies of the Left and Right—Leninism, Stalinism, Fascism, Nazism, and Maoism, along with British “it’s-for-your-own-good” colonialism and equally benevolent Yankee imperialism. In recent years, the political philosophy of Leo Strauss and his followers has come to the fore via the neoconservative members of the current Bush administration. Strauss taught a doctrine that is really just the explicit utterance of an implicit belief common among ruling elites—that it is the duty of wise leaders to cloak their policies in potent patriotic and religious symbols and myths in order to galvanize the internal ethical imperatives of the masses. In other words, lies (if told by the right people for the right reasons) are not only good and necessary; they are the very foundation of responsible statecraft. On this basis, however, language ceases to provide a toolset for accurately mapping the world and instead becomes a mental haze enveloping society, preventing us collectively from grasping our situation. Only the rulers are expected (or allowed) to know the true score; but all too often they come to believe their own myths.
And so we today live in a fog of words so thick that it largely prevents us from seeing where we are or where we’re headed. Language helps us understand, and at the same time prevents understanding. It enables reason and rationality, yet also facilitates their opposites.
Simply put, language magnifies all of the conflicting priorities and potentials of the human organism.
Can Language Help Us Now?
It might seem that the solution to our quandary is a big dose of logic and empiricism. If only the matter were that simple.
Modern brain research explodes the notion that logic can exist in pristine isolation from emotion and somatic states: as neurologist Antonio Damasio explained in his book Descartes’ Error: Emotion, Reason, and the Human Brain, emotion and reason are not separate; in fact, the latter is inherently dependent upon the former. Damasio explored the unusual case of Phineas Gage, a man whose severe brain injury prevented him from feeling emotions. While Gage remained intelligent and responsive after his accident, he lost the ability to reason and make sound decisions, because his emotions were no longer accessible to the process. Damasio argued that bodily senses give rise to emotions, which in turn provide the basis for rational thought (as well as irrational thought). Thus our state of mind merely reflects our state of body, with emotion as the essential intermediary. The rational and emotional functions of language appear to be handled differently by the hemispheres of the brain: it seems that the left hemisphere processes verbiage that conveys linguistic meaning, while the right hemisphere processes verbal (as well as musical and other artistic) expression that conveys emotional content. There are indications that, in most people, the right hemisphere has a tendency to repress the free functioning of the left, thus making brain activity lopsided and dysfunctional while fomenting self-sabotaging internal conflict. This may be one reason we can appear perfectly rational in our pursuit of ends that are, from another perspective, just plain crazy.
Again, the organism wants energy, space, and the opportunity to reproduce itself. However, if every human’s individual pursuit of those goals went unchecked, there could be no organized society because all collective effort would dissolve in continual one-on-one competition. Humans would go from bloom to crash with no period of stability between, and none of this would serve the organism’s long-term interests.
Therefore the organism also needs to cooperate, to attenuate wants and desires, and to restrain reproduction. Accordingly we have developed innumerable customs, institutions, and moral strictures to promote moderation. The result is the battle of instinct against society that Freud agonized over (and largely mischaracterized) in Civilization and Its Discontents. In stable societies, a truce is struck that may last centuries or millennia. In our modern world, temporary success based on unique historical circumstances has led us to cast most self-limitation aside, and we have given ourselves perfectly good reasons for doing so. The truce is broken, and we are at war with nature and future generations.
Is it possible, now and quickly, to tame the organism’s hunger for growth and head off catastrophe? Yes, in principle. One of the wonders of language is that it makes rapid societal change possible. Where another species would require centuries or millennia of genetic variation and natural selection to adapt itself to new conditions, we can shift our collective behavior in a matter of months or years, given language, media, and effective appeals to ethics. Whether it is possible to do so in the current situation, given the enormous growth momentum developed during the past two centuries, remains to be seen. Nevertheless, it is a useful exercise to imagine how a rapid surge toward collective self-limitation might come about.
Somehow, leaders would have to engage the non-rational aspects of mass consciousness by playing upon our shared needs for meaning and myth, using verbal voodoo to alter attitudes and behavior as rapidly as possible. Wartime jingoism has accomplished something similar on many occasions in the past. An appeal would need to be made, on an ethical basis, to reduce consumption and alter personal aspirations. President Carter tried to do this when he suggested, in 1977, that solving the energy crisis was “the moral equivalent of war”—but sadly other politicians and the arbiters of economy and culture failed to back him up. To be successful, such an effort would require the enthusiastic participation of the advertising, public relations, and entertainment industries, as well as organized religions and all major political institutions.
The campaign would have little chance of success if it were not also based on sound rational arguments, since purely emotional appeals would be rejected out of hand by the most intelligent and influential members of society. Moreover, if an attempt to change collective behavior were not based on empirically verifiable, survival-based necessity, it would amount to crass manipulation worthy of a Karl Rove or an Edward Bernays; hence its moral credibility would soon wane.
In the current instance, the rational basis for the appeal, and its survival necessity, are clear. Nothing is to be lost and everything to be gained by sharing accurate and relevant information about our situation; there is no need to exaggerate the threat.
Today precisely such an effort is already under way with regard to climate change. Al Gore and his famous movie have framed the crisis in moral terms, while hundreds of scientists, by endorsing the conclusions of the IPCC, have established a concurrent appeal to rationality.
As yet, the message does not have a sufficiently broad base of cultural support to curtail ongoing, richly-funded calls to buy, consume, and travel. Perhaps the addition of the Peak Oil message, by highlighting immediate economic and geopolitical threats posed by continued societal reliance on fossil fuels, will help broaden the coalition of support for needed change. But all of this will have to happen very quickly.
* * *
At this point, language is a given. For better or worse, we humans are stuck with it, even if it arguably has contributed to crises that threaten us with extinction. One way or another, the way we deal with the enormous ecological challenge facing us will be mediated by words, words, and more words—some accurately reflecting the situation, others concealing it.
Meanwhile here we are, I writing, you reading. We share—I hope and assume—a commitment to logic and evidence, and to an ethic of collective human and non-human survival that transcends the myths of religion and progress.
There is no denying the satisfaction—even thrill—that comes when language hits its mark by dramatically aiding our understanding of what is by now an unimaginably complex human matrix. Perhaps the most we can do, now as before, though with more urgency than ever, is to harness that thrill by using language skillfully to describe and persuade; and meanwhile to act in ways that are congruent with the ethical content of our words.