Edward C. Appel. 21st Century Communication: A Reference Handbook. Editor: William F. Eadie. Sage Publications, 2009.
Picture this. Maybe, remember this! You’re a child. You’re playing in your backyard sandbox with your sibling. He or she throws sand in your face. You run to Mother to tattle. “Freddie threw sand at me again,” you exclaim. “Yes, but you hit me first” is his reply. “But you called me a bad name, that’s why,” you retort. (You each “punctuate” your squabble differently, beginning your narration of the event at some self-serving turn of events.)
Mom’s had it with this scenario. She’s heard it before. “You know the rules,” she says for the fifth time: “No throwing sand, period! You don’t go near the sandbox, the jungle gym, or the swings, neither of you, for a week! Now go do your homework.” In a day or two, there’re hugs and kisses and a milder tone of voice. You’ve learned your lesson—maybe.
Here’s another, on a no-doubt-incomparable plane of importance. It’s December 8, 1941. The day before, Japanese planes destroyed America’s Pacific fleet at Pearl Harbor in Hawaii, except for aircraft carriers then at sea. President Franklin D. Roosevelt addresses a joint session of Congress to ask for a declaration of war. He details in general fashion the extent of the aggression on this U.S. territory, including heavy loss of military personnel. He lists Japanese attacks on Malaya, Hong Kong, Guam, the Philippines, and Wake and Midway islands as well, extending the scope of Japan’s violation of the rules of international relationship. Using words such as “infamy,” “dastardly,” “sought to deceive,” and “grave danger,” Roosevelt identifies the bad guys against whom the “American people” will rise up “in their righteous might.”
“The very life and safety of our nation” is at stake, Roosevelt says. “All measures [will] be taken for our defense … no matter how long it may take us.” “Unbounded determination” will energize this effort. In the end, “we … will make very certain that this form of treachery shall never endanger us again.” “Absolute victory” is the goal. “We will gain the inevitable triumph—so help us God.”
The Japanese would have “punctuated” this narrative differently, to be sure. References to an oil embargo, for example, might have preceded accounts of Nippon warships setting sail for Oahu.
Anyway, America did win that war. America sacrificed greatly at the front and at home to make it all happen. Contra the propaganda of the Japanese warlords, U.S. soldiers did not commit atrocities, postwar, on Japanese civilians. The United States made Japan’s formerly dictatorial polity a model democracy, quite an improvement. It earned the privilege of buying Japanese cars, radios, TV sets, and cameras in exchange for dollars and raw materials. Or something like that!
Not to make light too offensively of that memorable era in American history, its moment of drama parallels, in rhetorical form and in the abstract, the messages exchanged between Mom and the kids perhaps in your kitchen several years ago. Both narratives fall under the heading of a most rudimentary definition of drama: moral conflict to set right a situation gone wrong or to keep right a situation that could go wrong.
To tease out the implications of that simple construction, if it’s moral conflict, at least one human being has to be involved: if not woman against woman, Mom against the kids, or nation against nation, then man against nature. Or, alternatively, a morally sensitive being, namely a person, has to be involved in the telling of the tale, at any rate. Only creatures with language possess morality.
If it’s conflict, then there’ll be finger pointing, blame laying, or guilt tripping, followed by punishment of some kind. Something’s gone wrong. “You did it” and “Here’s how you’re going to have to pay” nicely serve as generic indictment and disciplinary announcement. There can’t be conflict without talk like that, talk of crime or mistakenness. And the offended can’t appropriately follow up on such an accusatory message by saying, “Fine, it’s OK with us, go ahead, keep doing it, for all we care.” A response won’t be fully “dramatic” unless a person or persons intervene in some way, even if only to scold and suggest improvement. A verbal “slap on the wrist” fits with a charge of mistakenness. A long stint in prison or even a sentence of death, sad to say, still serves as penalty for crimes and evils in many parts of the world.
What about the “unbounded determination” and effort it’s going to take to win through to “absolute victory,” or whatever? There’s a sense in which dramatic actors must punish themselves to bring it all to pass. Self-discipline, self-denial, “mortification,” it’s often called: In dramatic conflict, persons usually have to direct their acts of sacrifice inward as well as outward. Even Mom had to stop what she wanted to be doing, raise her voice, gin up her emotions, and role-play the scold to get you back on the straight and narrow. It wasn’t fun for her, either.
How does one know that something has gone wrong in the first place? There must be “rules” of some kind, specific to a given culture, subculture, polity, or context, that the “bad guys or gals” have broken. Such rules of behavior and relationship might be written down, as in the legal codes of conduct the state imposes on its citizens through the actions of legislatures, the police, and courts of law. Or the rules of behavior and relationship might be unwritten, such as the ones that a culture or society imposes on its members in stealth fashion. Dress codes, dating practices, the forms that structure mannered conversations, the dramas enacted at ceremonies, parties, and other social gatherings—the “rules” that circumscribe the social butterflies active therein are not found in statute books, though they will often surface in the advice columns found in newspapers. Want to discover whether an unwritten “rule” is really a rule? Break it and see! The reaction to your social “crime” will be instructive, for sure.
What’s still missing from this brief anatomy of drama? It’s “moral conflict to set right a situation gone wrong.” “No pain, no gain,” the saying goes. After the pain, the punishment, the penalty, the suffering, what do you have, what’s the gain? It’s moral order restored, at least temporarily. That’s usually the stated hope. You might have learned something in the “college of hard knocks.” That, too, is a potential gain after the pain. Human relationships may feel right again, if the penalty fits the “crime.” Isn’t rehabilitation the ideal goal of a stint in prison, movement maybe to a halfway house or probation, and then restoration to complete citizenship? To borrow language from the highest orders of morality, dramas aim toward messages and resolutions of “redemption”: The United States atop the free world after 1945; Japan and Germany, now America’s friends and allies, model democracies and powerhouse economies within two decades; Mom’s embrace and, one would hope, reconciliation between sister and brother.
In summary, then, the elements of drama enumerated and implied so far include the following:
- A moral order of some kind, held together by rules of relationship applied by the “Moms” of the world, the “big shots” in authority
- Disorder by violation, through weakness, indifference, perversity, or commitment to a different social or political “order”
- Depending on who’s telling the story, guilty bad guys who did the awful deed, opposed by the righteous good guys who act to set things aright
- Maybe a constructive or rebellious, charitable or hostile preparatory attitude that will energize or moderate the intended, corrective action and determine its degree or extent
- A sacrificial act or series of such actions, other-directed in the form of punishment of one kind or another and/or self-directed in the form of mortifying, self-interfering efforts, all the way from the mild, muscular exertion of a gesticulating Mom to the martyrdom of an M. L. King, Jr.
- Morally corrective purposes and the means, steps, or stages by which to achieve them, a vision of redemption of some sort, restoration of order, things put back together again where they belong or, preferably, better conditions still!
- Risk, chance, maybe danger in the choice and pursuit of their purpose, as dramatic actors, by definition finite and uncertain human beings, reconnoiter resistances and calculate the odds of success in their quest to set right the moral wrong, improve their problematic situation, the stakes being enormous in declarations of war, high in marriage proposals, middling in negotiations at a car dealership, and likely minimal in the humane reproach of a sand thrower: Neither Moms nor presidents know for sure their selection of strategy will deliver, their gamble pay off.
That’s drama. Messages that contain these elements are dramatic indeed. Where do these features of discourse originate in talk on whatever level of social interaction—for example, the family/child level; the school/student level; the church, synagogue, mosque, temple/parishioner level; the society/member level; the company/employee level; the state/citizen level; the UN, international agreement/nation-state level; you name it? (Each level or dramatic arena will, of course, have its own particular rules and procedures for enforcement, and its own names for these generalized moments or stages of development.) Where, at bottom, do these elements of drama come from, and why is it important to know?
Drama as Rooted in the Content Parts of Speech
This chapter is titled “Dramatic Elements in Messages.” Here’s where the focus gets down to the atomic layer of drama. Language users can’t escape drama in their talk and thought. It is implicit—moral conflict and human struggles to overcome it or avoid it are implicit—in the words that make up the subjects and predicates of speech. What follows will seem ho-hum familiar to you. Its implications may not be.
A subject, the thing or idea a sentence makes a statement about, and a predicate, the sentence part that tells what the subject is, does, or has done to it, are put together in English with four building blocks. These four building blocks can be called the “content parts of speech,” since they make up the content, the “meat and potatoes” so to speak, of the spoken or written sentence. You know what they are. They are nouns, verbs, adjectives, and adverbs. Dictionaries, grammars, and books written by linguists give us the following meanings for these four building blocks of the subjects and predicates that make up sentences:
- Noun: a word, or group of words, that names a person, place, thing, or idea.
- Verb: a word, or group of words, that states an action that is taken or states what type of being a subject has, what it is.
- Adjective: a word, or group of words, that describes a noun, that is, describes a person, place, thing, or idea, offers a sharper picture of what it is, tells “what kind.”
- Adverb: a word, or group of words, that describes a verb, adjective, or other adverb, a word that indicates, for instance, why an action took place, how it was accomplished, when and where it happened, and the manner in which it was carried out.
Armed with these definitions of nouns, verbs, adjectives, and adverbs—the building blocks of the subjects and predicates that go together to make up sentences—the basic forms of thought the English language puts in the mind can now be stated. In terms of traditional grammar—the subject being the main idea and the predicate being what is said about that main idea—the following is clear:
In English discourse,
- A noun subject actor or receiver of action (the “agent” or the “patient,” to use the linguists’ terms) performs or receives a verbal action (the who and the what)
- A noun subject is identified, classified, or given an attribute or sensation, if the verb is a linking verb, such as “is” or “feels” (the who and the what)
- For some usually adverbial purpose, end, or goal (the why)
- By way of some usually adverbial means or cause (the how)
- To some usually adverbial extent or degree or with some usually adverbial quality or attitude (the manner in which the action is prepared for and/or carried out)
- Within, in terms of, and in response to a usually adverbial, time-and-place scene, situation, or context (the when and the where of the action)
- The thought of the verb is often completed with what is called a “predicate complement” noun or adjective of some kind
- Noun subjects and objects are often modified by adjectives
These, then, in abbreviated summary, are the basic forms of thought expressed in statements about anything—humans, nonverbal animals, vegetables, inanimate materials, and ideas: noun subject actor or receiver of action (or noun subject identified, etc.), verbal act, typically adverbial purpose, means, manner, and scene.
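For readers who think schematically, the pattern just summarized lends itself to a compact illustration as a data structure. The following Python sketch is an editor’s gloss, not part of the handbook’s apparatus; the class name, field names, and the example sentence are invented for illustration only:

```python
# Illustrative sketch (editor's invention): the general pattern of verbal
# action, with one field per question named in the summary above.
from dataclasses import dataclass

@dataclass
class VerbalAction:
    agent: str    # the who: noun subject actor (or "patient" receiving action)
    act: str      # the what: the verbal action performed or received
    purpose: str  # the why: usually adverbial end or goal
    means: str    # the how: usually adverbial means or cause
    manner: str   # the degree, quality, or attitude of the action
    scene: str    # the when and the where: time-and-place context

# "With her beak, the mother bird industriously gathered twigs in the
#  oak all morning to build a nest."
sentence = VerbalAction(
    agent="the mother bird",
    act="gathered twigs",
    purpose="to build a nest",
    means="with her beak",
    manner="industriously, all morning",
    scene="in the oak tree",
)

# Any complete statement, on the chapter's account, fills some subset
# of these six slots.
print(sentence.agent, "->", sentence.purpose)
```

The point of the sketch is simply that every slot in the structure corresponds to one of the who/what/why/how/manner/scene questions the chapter draws out of the content parts of speech.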
The Implicitly Moral Nature of This General Pattern of Verbal Action
This anatomy of the most basic forms of thought expressed in English discourse may not seem very “dramatic” on its face. Nobody necessarily gets yelled at, verbally knocked around, when subjects and predicates are put together, especially when describing flowers growing in a garden, or planets circling the sun. If, however, one looks closely at the word purpose and follows through on its implications, one may think differently. An actor or “agent” performing an action for a purpose is at the center of language as drama. If not up to its ears, the word purpose is up to its knees in potential moral conflict. Consider the following.
The word purpose, first of all, names a “negative.” It names something that does not yet exist or something that does not yet exist in sufficient quantity, quality, or duration. If your purpose is to win a ballgame, you get in shape, practice hard, and cooperate with your teammates in drills that perfect the plays the coach has designed. The victory on the field or court though—the purpose or aim of your efforts—has not yet come to pass at the time you formulate or speak of your purpose. The victory, your purpose, is still in the future. And the future is a “negative” for the present. It has not happened yet.
The same holds true even for politicians whose purpose is merely to maintain things the way they are. They don’t want anything new or different from the status quo, as it is labeled. All the laws and economic arrangements presently in place are just fine, they feel. These conservative statesmen and women or “reactionary” social-movement leaders oppose innovators and revolutionaries. Problem: The future isn’t here yet. It is still a negative. What will it be filled with? The conservative status quo? Or revolutionary change and innovation? The purpose these resistance politicians or reactionary social-movement leaders will serve is one of lengthening the duration, enhancing the quantity and perhaps the quality, of the kind of government and social system the nation currently has. Their purpose, like that of the ballplayers, points ahead in time to a set of conditions that do not yet exist, because the future does not yet exist.
Naming as it does a “not yet” or an “is not the case,” the notion of a “purpose” epitomizes language as verbal action, in contrast to language as a passive “mirror on nature.” “Purpose” points up how language can superimpose on the concrete natural conditions around us something that is “not there.” Just as your sought-after college diploma is “not yet” in your pocket, there’s probably “no” elephant stampeding through the room you’re in at this moment. “No elephant” is a concept you can bring to your present observations as you look to your left and your right. A chair might be materially there, along with a desk or table and a computer. But “no” playful pachyderm. “Purpose” and the negative destabilize a scientific or positivist view of language. They undermine an “objective” and therefore value-free notion of verbal description. Something is happening in language you can’t see, hear, feel, taste, or smell. Could it be—drama?
The social psychologist Abraham Maslow (1970) drew a famous diagram in the shape of a pyramid. He called it the hierarchy of human needs and motivations. He could have called the common human needs he lists in ascending order, lower to higher, a hierarchy of human purposes. Certainly, all persons pursue in their daily activities the ends and goals Maslow deems generally important for human life and success.
On the bottom tier of the pyramid, Maslow lists those physical needs humans directly share with nonverbal animals. They are those of food, drink, cover, shelter, reproduction, and elimination. Higher motivations include safety, belongingness, and esteem. At the pinnacle of the pyramid is the motive of self-actualization, self-fulfillment.
The moral drama that the word purpose conjures in the mind by way of hints and suggestions can be seen in a comparison and contrast between humans and other animals on Maslow’s lowest level of needs and motivations. The physical needs of humans Maslow cites are exactly the same, on some level of abstraction anyway, as those of insects and birds, squirrels, and chimpanzees. All animals, human and nonhuman, need and seek after food, drink, shelter or habitat of some kind, reproduction, and so on. The difference is that when nonhuman animals do not meet these needs, they get hungry, thirsty, vulnerable to danger and the elements, and deprived of offspring. When humans do not meet these needs, they get hungry, thirsty, vulnerable to danger and the elements, and deprived of offspring—and they get shamed! The poor themselves are deemed guilty for not properly providing for themselves and their dependents, a typical 19th-century approach to the problems of hunger and homelessness in a society. Or the social order that allows such deprivations to exist within it is blamed, a typical 20th- and 21st-century approach to the problems of hunger and want. Somebody or some condition is blamed. Moral conflict transpires. Dramas of correction and redemption ensue.
The “higher” human needs and motives with a place in Maslow’s pyramid—safety, belongingness, esteem, and self-fulfillment—are no less burdened by moral judgment. Persons who neglect their safety needs and suffer for that neglect didn’t take proper “care,” their neighbors might say. A recluse who doesn’t “belong” anywhere or has no friends is often stigmatized as a “loner” or social outcast. To a great extent, “esteem” is meted out according to our location on the “social ladder,” within our community of associates. One is deemed “estimable” or “not estimable” by way of that social standing. Men and women with talent who never use it to good advantage are sometimes labeled “failures” in life. In the pursuit of safety, belonging, esteem, and self-actualization, persons are no less open to ethical criticism and immersed in drama than in their search for food, clothing, and shelter.
That’s why Erving Goffman (1959, 1967), a sociologist, said that human beings in society are ritually vulnerable. By that he meant that humans are susceptible to moral judgment by those around them. Whether they fulfill their needs and purposes, lower or higher, is tinged with moral danger. “Face” is at stake. Men and women can hardly help thinking of “good” and “bad,” “valuable” and “worthless,” when they confront, contemplate, and employ the word purpose.
Not Only Whether but Also How
The word purpose connotes a rule-governed process of goal seeking, as well as a morally tenuous burden to meet one’s needs, to some extent at least, in the end. Potential moral censure attaches not only to whether those fundamental human aims and resolves are fulfilled but also to how they are fulfilled. All scenes, contexts, and hierarchies within which symbol users operate are circumscribed by rules (Cushman, 1977). Even behind closed doors, the “moral law” inside the human mind that inspired the philosopher Immanuel Kant with such “wonder” works its censorious will. The ultimate scene of human action, the literary scholar Tim Crusius (1990) says, is “language” and the intuition of the negative that suffuses it. And the negative, author of the thou-shalt-nots, judges persons omnipresently (Appel, 1993a, 1993b).
To wit: Unlike crows and coyotes, human beings do not descend on a farmer’s field to sate their hunger and take what vegetables or livestock they want, without fear of legal action. At the supermarket, they don’t stuff their shopping cart with goodies and wheel them out to the car, without stopping first at the checkout counter to pay. They hardly walk down a tree-lined suburban street, notice a beautiful home across a well-manicured lawn, and say to their spouse next to them, “That would be a great place to live! Let’s move in and make it our own tonight.” They don’t pass a strange and gorgeous woman, or handsome man, on the street, and go up and say, “I’ve decided you and I are going to make a baby right now.” Symbolizing animals can’t “strike it rich” at banks and convenience stores with a pistol or an Uzi, without a likely and long vacation behind bars. How they strategize the pursuit of their purposes is even more fraught with potential moral conflict than whether they bring them to pass. Food, safety, friends, status—whatever it is they want, human animals have to conform to the thou-shalt-nots of society, or else!
Moral Women and Men, OK, but “Moral” Animals in Nature, Vegetables, and Minerals? Get Real!
An obvious issue is this one: If the notion of a morally tinged “purpose” is inherent and inescapable in messages about distinctively human actions, what about descriptions of the nonhuman world? Obviously, symbol users have to access these same parts of speech and forms of thought in their representations of anything and everything. Language offers no other content terms and meanings to work with. How does a claim of ubiquitous drama dovetail with talk about creatures and beings that have nothing to do with morality or ethics?
Consider these common turns of phrase and modes of thought that employ exactly the same terms for moral activity used to describe people:
- The stalking action of the lions went unnoticed by the wildebeests.
- The hydraulic action of the running water smoothed the edges of the rocks.
- Our car’s engine acted up on the trip last week.
- A chemical agent in the dish soap cut the grease on the silverware.
- The mother bird gathered twigs for the purpose of building a nest.
Lions, water, automobiles, detergents, and birds are not dramatic beings. None of them has anything at all to do with moral conflict. They are innocent of laws and rules. Not one experiences guilt or shame. Yet the same basic terms that express what they do also describe the legal action Sam took against George, the insurance agent who sells medical policies, and the spiritual purposes of the youth group on a religious retreat. Speakers and writers use these same basic descriptive concepts with respect to nonhuman beings without a sense of metaphor, without a sense of comparing two things that are essentially dissimilar. What gives?
What gives is that the subtleties of the drama inherent in language escape the modern “scientific” mind. Contemporary men and women are hardly aware that they are investing rocks, plants, and nonhuman animals with something akin to moral agency in declarations such as the following:
- The earth revolves around the sun once a year at a 23½-degree slant, generating the four seasons.
- The sun shone down on us, warming our bodies.
- Trees grow their branches toward the sunlight to facilitate photosynthesis and stay alive and healthy.
- A pack of African wild dogs ripped the baby zebra to shreds to satisfy their hunger.
Moral agent: a being that has a capacity for self-initiated action that can effect changes in its environment or in itself for good or ill, for benefit or harm. Each of the noun subject “actors” or “agents” in the sentences above is declared to be doing something to bring about changes in its environment for good or ill. The earth “revolves” on a slant to generate the four picturesque and productive seasons of the year. That’s the implication. The sun “shines” so as to bring warmth and life to the denizens of Planet Earth. That’s the implication. Trees “grow” their branches in a certain direction to make it possible for them to bear fruit to eat, provide shade and beauty for body and eyes, and furnish birds and other animals with a safe habitat. That’s the implication. African wild dogs “rip” a poor little baby zebra apart, live, to preserve themselves and their pups, yes, but they do so in such a vicious, cruel, and disgusting way, no one would want them around as house pets or team mascots. That’s the implication. These four “agents”—and that’s what the linguists call any subject in an active-voice, action-verb sentence, human or nonhuman—are represented as performing four “actions,” three of which do “good” and one does something “bad,” or at least morally ambiguous. True, African wild dogs have to live, but can’t they display better table manners?
Persons in primitive or archaic cultures would not have required such subtle analysis to understand that they were investing the nonhuman world of animals, plants, and inanimate objects with moral potencies via their speech. They did so openly, naturally, consciously. Following the cues intrinsic to language more naively than modern women and men, they revered, dreaded, even worshipped rocks and trees, rivers and storms as magical, living beings. Primitive peoples saw them as gods who could bring blessings or curses on themselves and their tribe. Their religion was “animism,” the anthropologists and theologians say. In animism, individual animals and plants, nonliving things and natural processes are seen to possess “souls” and supernatural powers by which to bring benefit or harm to the human realm. Nonhuman beings are conceived to be free moral agents, only more majestic than humans in their might and efficacy. The philosopher Ernst Cassirer (1946/1953) theorized that language began in this very way, via the projection of “momentary deities,” “momentary gods,” as he labeled humanity’s first words, on the sheer brute materials of the world as it is (pp. 17-18, 71).
The inherent thrust in language toward higher orders of abstraction led eventually to polytheism, where the god of this river metamorphosed into the god of all rivers, and thence to monotheism, in which “acts” of nature have become the “acts” of one God. Read your insurance policies on hurricanes and floods, tornadoes and earthquakes, for particulars.
The philosopher of language most directly behind what’s been said to this point is Kenneth Burke. He’s certainly the most influential American rhetorician of the past 100 years. His name is the one, in the United States, most commonly associated with drama in discourse and life. Burke (1954) says, “Spontaneous speech is not a naming at all, but [rather] a system of attitudes, of implicit exhortations” (p. 177). Language “exhorts” to moral action, he asserts, or prompts an interpretation of phenomena, all phenomena, as in some way inscribed with moral action.
Burke (1954) adds, with respect to the parts of speech and their latent meanings,
[Jeremy] Bentham [a 19th-century British philosopher] detected a kind of organic flaw in the nature of speech, at least as regards the linguistic ideals which we now ask our vocabularies to embody [that is, the ideal of detached, “scientific,” value-free reportage and description]. Speech takes its shape from the fact that it is used by people acting together. It is an adjunct of action. It thus tends naturally toward the use of implicit moral weightings, as the names for things and operations smuggle in connotations of good and bad [italics in original], a noun tending to carry with it a kind of invisible adjective, and a verb an invisible adverb. Our attempts at impersonality [read “objectivity”], as Bentham noted, are generally made by the use of question-begging words, which are impersonal only insofar as speaker and auditor share the same interests. (pp. 191-192)
Burke (1966) lays great stress on the ways the symbolizing animal, Homo sapiens, sees the “positives” of nature through the eyes of “moral negativity,” on the way “man [and woman] must perceive nature through the fog of symbol-ridden social structures that he [also she] has erected atop nature. Material things,” Burke says, “would thus be like outward manifestations of the forms which are imposed upon the intuiting of nature by language [necessarily infused and replete with moral action] and by the sociopolitical orders [i.e., rule-based moral hierarchies] that are interwoven with language.” Thus, “Nature, as perceived by the word-using animal, would be not just the less-than-verbal thing that we usually take it to be. Rather, as so conceived and perceived, it would be infused with the spirit of words, … full of gods, … a fantastic pageantry, a parade of masques and costumes and guild-like mysteries” (pp. 378-379; Burke, 1961/1970, pp. 17-23, 172-241).
The Dramatic Elements in Language and Life: A Reprise and Elaboration
What should be clear by now is this: The general pattern of verbal action found in the definitions of nouns, verbs, adjectives, and adverbs (the content parts of speech) is merely implicitly moral. The ultimate form or pattern of linguistic utterance that results from its trajectory of implications, however, is an explicitly moral pattern of verbal action, found in discourse about distinctively human activity and in much discourse about nonhuman activity, namely, mythology, astrology, theology, and the “acts of God” clauses of insurance policies. Such moral drama, where scenes of action (the when and the where) are ethically charged; where featured actors and their opponents (the who) are good or bad, or some variation thereof; where the actions they take (the what) atone for wrongdoing sacrificially through victimizing punishments or mortifying self-denial; where purposes and the means to achieve those goals (the why and the how) are morally and redemptively corrective, the protagonist in the drama hopes anyway, though he or she cannot be sure ahead of time; and where attitudes (the manner or “incipient actions”) are pious or impious, repentant or rebellious, morally constructive or perverse—such components of moral conflict fashion the very lens through which symbol users are fated to interpret even the nonsymbolic motions of stars and electrons, whales and microbes.
Up to this point, behavior gone wrong has occupied center stage in this gambit in dramatic criticism. A more comprehensive question to ask with respect to the inception of drama in discourse and life is this one: What, in a given context, may one not do or not fail to do, yes, but also not believe or not fail to believe, not accept or not fail to accept, or else—what? Belief “gone wrong” is often just as censurable as behavior gone wrong in the religious, political, or social life of the being Burke calls Homo loquax.
Start again with the example of language used most thoroughly and transcendentally: theology. In orthodox Protestant Christianity, for instance, the formula is “Scripture alone, grace alone, faith alone.” What a person does or does not believe answers most directly what Burke calls the defining question for self-identity: “What are you being a Christian against?” [italics added] (1969a, p. 34) (substitute for “Christian” any label you wish). It was not—and is not mainly today—material conduct that separated Protestant from Catholic or Christian from non-Christian, to cite but one set of examples of religious drama. The Catholic English Queen “Bloody Mary” (1553-1558) killed Protestants, and Calvin and Zwingli’s Protestants killed Anabaptists, for what those “heretics” didn’t believe. In 1648, Europe’s Peace of Westphalia ended a century of violent conflict between Protestants and Catholics substantially over the question of religious faith.
Protestant and Catholic Christians have put such physical enmity behind them, even in Northern Ireland. Sunnis and Shiites in Iraq, however, exemplify, in the Muslim world, that the extremes of religious drama are still extant and problematic, even in the 21st century.
Look for the same “faith” dynamic, with far less bloodshed to be sure, on display in the ongoing dispute between neo-Darwinian evolutionists and advocates of Intelligent Design that’s come to verbal blows even in a court of law. Dover, Pennsylvania, was the venue, in 2005. Kansas has seen a similar confrontation. Strict orthodox evolutionists exclude from the fellowship of the righteous even those who believe in an élan vital or “emergent” or “creative” life force (Bergson, 1911; Chardin, 1959; Whitehead, 1929) that has complexified and “improved” living beings in the direction of heightened powers of adaptation. Biology, to neo-Darwinians, is founded on principles of nonteleological mutation and “natural selection.”
Creationists and defenders of Design say, no, life forms on earth are too wondrously complex and symmetrical in form for explanations founded on blind accident. A transcendental Power capable of intentional action must have planned it all, or even intervened at crucial moments, as in the generation of eyes.
The dramatic point is that Designers don’t get published in mainstream scientific journals and neo-Darwinians are anathema at the pro-Design Discovery Institute. Beliefs, more so than physical actions, morally separate “good” scientists from “bad” scientists, “good guys” from “bad guys,” on this seemingly persistent issue—depending on which direction you’re coming from!
An Example of Dramatic Conflict, Actual or Potential, One Will Have Had to Experience to Get to College
Preschool children turn 5 or 6 and have to leave the rule-governed enclave called “family,” at least for part of each weekday. They go to school. They go to kindergarten, then elementary school, then middle school, then high school. Maybe later, they go to college. Talk about tightly controlled, rule-governed environments!
Take, for instance, the drama of human relations at the high school level. Ideally, a student has to fulfill a set of stated requirements to pass a particular course, based on things such as homework, tests, compositions, and class participation. The student has to successfully complete a certain number of courses to advance to the grade above and/or perhaps achieve a cumulative grade point average established by the school directors, principals, and faculty, the big wheels, the rule makers, in this particular hierarchy/institution. Proceeding satisfactorily through all four grades—freshman, sophomore, junior, and senior years—the student graduates with pomp and circumstance, family hugs and kisses, and photographs proudly framed.
Assume that the general requirement for each course is a 60% average on work submitted and an overall 2.5 grade point average for one’s entire schedule of subjects. That means: earn an average of 70%, or a “C,” across all your schoolwork, and no less than 60%, or a “D,” in any one course. Those are the rules. They establish a certain “moral order” in school and community. Nobody gets a diploma from Jonesville High without producing work at this level of competence.
In high school, then, the curricular drama begins with posted “course and graduation requirements.” Things begin to go wrong for students at Jonesville when their grades slip below 60% in one or more courses at the end of a marking period. “Failure notices” go out to parents. A guidance counselor requests an interview. Athletic coaches admonish about potential ineligibility for sports. Detention hall or extra in-school study time may loom. A period of “probation” might be officially invoked. The “finger of blame” points ever more directly and threateningly at these errant scholars. Overtly dramatic elements pervade their rhetorical context.
If low enough grades continue across two or three or more courses, “failing” students may have to “take the year over.” They are “held back.” Or they are forced to go to summer school when they’d rather be working and earning money, or vacationing at some camp for athletes or cheerleaders or other youth groups. They must “pay a price” for their academic missteps.
The “price having been paid,” their “lesson having been learned,” improvement having been shown, formerly “failing” students, students “at risk,” often become successful students in the academic year(s) to follow. Their “suffering” has brought them new perspectives and understandings. They see things differently now, oftentimes anyway. They work harder and go on to better grades and ultimate graduation.
Course requirements, low grades, warning notices, and maybe academic probation; a failing effort across one’s academic load of classes; being held back or suffering through summer school; then, one hopes, improvement and success and eventual graduation: These are the stages of moral drama as construed and applied in the theater of life called high school.
The word potential appearing in the heading above is implicit in drama in any sphere of human action, certainly including high school. Recall the featured definition on page 267 of this chapter. Drama: moral conflict to set right a situation gone wrong, or to keep right a situation that could go wrong. Elements of drama are tacitly in play even in messages that seek to prevent the rule violations that inaugurate the “rising action” of moral conflict. Even when things are going fine for a person in an institution or other venue of drama—a family, a school, a corporation, or any large or small grouping—a threat hangs in the air. Children, students, employees, citizens, members of whatever organization or polity you can think of might break the rules. “Police” are on guard in the form of parents, teachers, hall monitors, foremen, gossips, tattletales, surveillance cameras, the cop on the beat, a standing army, you name it, to protect against just such a contingency.
Temptation is omnipresent. Humans can be morally weak, out of control, potentially ambivalent, subversive of attitude, uncaring, or physically and mentally incompetent in the face of the rigors of rule-governed social life. When she called you at 6:15 a.m. to wake up, get up, get washed and dressed, and eat a bit of breakfast (the school bus comes in 1 hour!), Mom was summoning you to sacrifice, self-discipline, in the face of possible demerit. Proleptic guilt, as it were, tense anticipation of what might happen if, motivates as a deterrent to potential “crime” even before the fact. In the midst of moral order on the human plane, moral disorder hovers within and around as a menacing possibility.
Overlapping scenes of dramatic action complicate this picture yet further. Social expectations now stigmatize the high school “dropout.” A hundred years ago, an eighth-grade education was an acceptable norm. Not so today. It’s a high school diploma, then college or trade school or a job. Dropouts who then loiter and loaf are doubly targets of social censure.
Mom’s wake-up call summoned you to constructive moral action on multiple fronts.
An unresolved question, perhaps an unresolvable question, in the philosophy of language is this one: How creative is discourse in its relationship to the “real world” of rocks and trees, oceans and mountains, atoms and stars? What’s been called the “naïve verbal realism” of the 17th-century philosopher René Descartes may still be in fashion in scientific communities. Such a representationalist, empiricist, even positivist point of view is not, however, in general favor today among philosophers, rhetoricians, or literary and cultural critics. A “Postmodern” conception of language has been particularly influential from, say, 1970 to the early 21st century. In such a view, Richard Tarnas (1991) has argued, all “truth,” argument, and validity are “multiple, local, and temporal” (pp. 397, 401). There are no universals. There are no foundations to knowledge. Indeterminacy reigns. “Reality” is socially constructed, via language, at a time and place. “All human understanding is interpretation, and no interpretation is final” (p. 397). Language, to the Postmodernists, the communication scholar Trevor Melia says, is something like an untethered balloon, ungrounded in the hard resistances of this universe.
Burke (1954) appears to take a middle position on this subject. All human understanding is interpretation, yes, and no interpretation is final—up to a point. That point is the ontological nature of symbolizing animals themselves, the “what they are” in partial contrast to “what they can know.” Humans are universally immersed in language and drama, Burke (1966) suggests. That’s a foundation of a kind, an ontological foundation. And the symbolic constructions that humans bring to their “reality,” though both interpretive and time and place bound, are disciplined by the “recalcitrance” of that material world, Burke (1954) maintains. Language necessarily superimposes dramatic meanings on the world of objects, that’s true. In the process, however, it not only “selects” via abstraction and “deflects” by way of its inherent burden of tunnel vision, but it also “reflects” a portion of that reality, if not representing it in some Cartesian, “mirror-of-nature” manner or scheme (Burke, 1966). Symbolizing animals, maybe in fits and starts, adjust their symbolic actions to the counteractive motions of their animate and inanimate surroundings. Drama in life, language, and interpretation is thus omnipresent but not likely insular or totally arbitrary.
What to Take Away
If discourse and human striving are infused with drama, and are, essentially, symbolic actions, those who dwell in the house of language ought to be on the lookout for two generic temptations: conflict and self-aggrandizement out of control. The “negative,” symbolic action in its purest form, tells why. What is “not there” never ends, is without number, and ultimately is without peer. Or what is “not there” goes on forever, extends to infinity, and puts to shame everything material and mundane. When other animals are sated, they stop eating, drinking, mating, aggrandizing, or whatever. Too often, humans can’t get enough. They seem goaded by that vision of the eternal and the infinite to make it immanent, to struggle for ever more in the here and now. Keep up with the Joneses? A nice intermediate step, yes. Better still, beat the Joneses, surpass them, and grind them into the dust, if need be, so they never pose a threat again.
Symbolizing animals demonstrate way too often, Burke (1966) says, that they are “rotten with perfection.” Moderation, modulation, measured human endeavor, based on maximally self-conscious, more humble self-awareness, is the key, Burke (1984) maintains, to personal and social well-being in the large. “Comic” ambivalence, humility, and charitability, not “tragic” certainty, pride, and hostility, are the prescription. In nuce: This is the formula for the “purification of war” (Burke, 1969a, 1984).