Introduction to The Language War by Robin Tolmach Lakoff

The Language War


I have added this introductory chapter from The Language War by Robin Tolmach Lakoff because it is an eloquent description of the history of the field of linguistics, the effect of Chomskyan linguistics on the field and the ways this approach to linguistics affects what linguists study and how they see things. I did not edit out sections that refer to later chapters in the book, but have left it intact, even though it does not directly relate to discussions about interpreting, interpretations and how the field of linguistics affects us. I encourage you to read (and buy) this book.

I believe this summary describes very well how Chomskyan linguistics has affected the field, and therefore how it has affected research on ASL linguistics. When we remove the people from our study of language what is the effect? What happens when we take linguistic research that removes people and then turn around and apply it to real people in real situations?
I invite you to leave comments in the Discuss It! section with your reflections on this chapter.   – Marlene Elliott

What Am I Doing Here, and How Am I Doing It?

A question may occur to my fellow linguists and to others as they examine this book: What is an ivory tower denizen, a linguist like me, doing in the notoriously real world of politics? Is this what linguists do? Can do? Should do?

While the Ebonics debate of 1996 – 97 (see Chapter 7) served to bring the field of linguistics to popular awareness (or at least more awareness than it had enjoyed previously, about -9 on a scale of 1 to 10), its workings are still not familiar to everyone. Moreover, even linguists argue among themselves and within themselves about what the field can and should do, what it is about. A book suggesting rapprochement between language and politics is bound to raise questions among the uninitiated and hackles among the initiates.

The popular use of “linguist” is very different from its professional acceptation. If you tell a layperson that you’re a “linguist,” you are very likely to be asked, “And how many languages do you speak?” A “linguist” is someone able to use several languages, that is, use them practically – speak, read, and write in them. In this sense, the consummate linguist must be Francis E. Sommer, described in the New York Times (Honan 1997) as “fluent in 94 languages.”

The Times article calls Mr. Sommer “the Babe Ruth of the Linguistic Society.” But in fact, were Mr. Sommer (who died in 1978) to have attended a meeting of the Linguistic Society of America (LSA), he would have found himself both bewildered by the papers presented and the discussions of them, and ignored or patronized by the professional linguists (in the second sense) who belong to the Society and attend its meetings. The one thing most of us seldom do, in our professional capacities, is use a lot of languages. Many of us, to be sure, know more than one language. A few know many. But we know them in order to study them, rather than to communicate in them. Therefore we are often less concerned with individual languages than with what the totality of languages can show us about language – that is, what properties all languages necessarily share, in what ways languages may differ and what those facts may tell us about the makeup of the human mind – pardon me, brain (that being the fashionable consideration at present).

Linguists of the LSA type therefore tend to be interested in discovering the abstract properties of languages, the grammatical rules that make them up, and the structures that make them different from one another, yet basically similar. Even this is no easy task. English is, for obvious reasons, the best studied language in the world, and yet we are nowhere near a complete grammar of English, nor in all probability will linguistics produce one in a lifetime of anyone reading these words.

The Times article mentioned what I think is a first: eminent members of LSA talking about their interest in people like Mr. Sommer (as objects of study for LSA-Linguists rather than as potential colleagues). Linguists are quoted as saying that Sommer-linguists have never been studied scientifically by LSA-linguists. But we are beginning to see them as special cases who, if their abilities could be unraveled and scientifically explained, could shed light on the processes the rest of us use with varying degrees of competence when we attempt to learn languages other than our native one.

But even if LSA-linguists agree on what we aren’t, there is less consensus on what we are. Are there natural boundaries to our field? How do we differ (if we do) from rhetoricians, literary critics, psychoanalysts, appellate courts, political spin doctors, and others who, theoretically and practically, determine what language means or accomplishes?

When I entered linguistics some thirty-five years ago, the answer was fairly simple. The field had a well-marked turf that distinguished it from other areas in the humanities and social sciences and thereby justified its independent existence. In recent years that certainty has been under some pressure.

Linguistics in this country began, in the early years of this century, as an offshoot of anthropology, another newish area. Both reflected the growing realization that many of America’s indigenous cultures were nearing extinction. While earlier observers tended to see this as a good thing (the cultures were “primitive” and non-Christian, the languages not possibly the equals of Indo-European and therefore not of much interest or worthy of preservation), by the early twentieth century scholars were starting to become more sophisticated. They saw these indigenous cultures and their languages as expressions of the complexity and variety of the human mind, and therefore not only worthy of study, but essential to study if we were to understand ourselves as a species. To this end it was necessary to develop objective and scientific methods of investigation, so as to avoid the subjective perspectives that caused earlier scholars to understand other societies and their ways from the vantage point of their own (and thus necessarily as unintelligible or inadequate). The new science of linguistics also had to devise empirical methods of discovery and analysis, in order not to force the data uncovered in the field into the Cinderella’s slipper of preexisting theories, themselves often based (knowingly or not) on the Indo-European habits of thought innate to the scholars, all speakers of European languages and members of Western cultures.

So American linguistics was created as empirical and antimentalistic, fitting nicely (as intended) with the new notion of “social science.” “Science,” then as now, was ipso facto a good word. A field that could claim scientific status had the right to legitimacy. So linguistics identified itself as belonging to the social sciences rather than the humanities, using the empirical methods of the former rather than the interpretive (“mentalistic”) techniques of the latter. Ideally, linguistic analysis was concerned with form, not function; structure, not meaning; the concrete artifacts of language, not the abstract deeper structures that gave it sense and purpose. During the first half of the century, therefore, linguists concentrated on word-lists (lexicons) and inventories of sounds (phonology) and affixes (grammatical endings, prefixes, and such – that is, morphology). Syntax involved relationships between constructions (e.g., active and passive voice), and therefore required assessing whether two sentences had similar meanings (that is, were paraphrases); in an antimentalistic field this could be done only in a very rudimentary and unsatisfactory way, and so it received little attention. Within linguistics proper (excluding philosophy of language and general semantics), semantics (the study of the relationships between language forms and their referents, that is, meaning) and pragmatics (the study of the relation between language forms and language function, to use one definition) played a role that was minimal to nonexistent.

This began to change with the advent of Noam Chomsky in the later 1950’s. Chomsky’s theory, transformational generative grammar (TGG), permitted – indeed, required – a limited amount of mentalistic analysis. Syntactic relationships were evaluated in part on the basis of paraphrase relations, which required the analyst to make judgments about meaning, albeit on a superficial level. The changes wrought by Chomsky and TGG were spectacular, both in linguistics itself and in many related fields. Not least was the change in the importance of linguistics in the university.

Before the mid-1960’s American linguistics was tiny and obscure. Very few universities had full-fledged linguistics departments; some had programs, while many had a linguist or two on their faculties, situated, often uneasily, in anthropology, language, or English departments. But transformational grammar, with its promise that language could be a window into the mind, a glimpse into the universality of language capacities, and hence a way of achieving fundamental understanding of what it means to be human, seized the intellectual imagination. (Chomsky’s emergence in the late 1960’s as a radical critic of the Vietnam War also helped to popularize his still-infant field.) At the same time, during these economically flush years many universities were starting from scratch, and others upgrading themselves from small colleges into major institutions. To get recognition, it was essential to acquire intellectual respectability, as quickly as possible. That was most efficiently accomplished by creating a few prestigious departments filled with “name” scholars who could attract the best graduate students and large research grants.

But large and traditional departments are difficult to change. Tenure made it hard to replace older (often undistinguished) faculty members with resplendent new stars. Even for a new university, staffing a first-rank large department was a dauntingly expensive task. But a first-rate small department would put a university on the intellectual map at once, particularly if the department was in a hot field and had ties to other departments and a bit of extra-university glamour. What fitted those definitions better than linguistics? The field prospered exceedingly, so that today virtually every serious research university has its linguistics department, typically with a faculty of ten to twenty tenured or tenure-track professors.

But as linguistics grew exponentially, we tended to ignore the fact that it was now made up of at least three very different kinds of people, who had entered linguistics with at least three very different assumptions – and therefore had diametrically opposite notions of what we ought to be doing, or even what this “scientific study of language” we claimed to be doing was, what “language” consisted of, and what “science” included. There were still many who had been trained as social scientists, anthropologists interested in learning about languages other than the familiar Indo-European ones as a way of understanding cultures very different from their own. They found the exoticism of the surface forms of those languages compelling, the idea that languages could differ from one another in seemingly innumerable and unpredictable ways. Others (like me) had been trained as humanists, and our interests lay in the hermeneutic potential of TGG. We wanted a way to determine, from their superficial form, what sentences “really” meant at a deeper level, why people made the choices they made, and what those choices signified about ourselves. Still others entered the field from mathematics or formal logic. For them, language was above all a system whose properties could be formalized in equation-like rules. They were less interested in the relationships between language and culture, and language and thought, than in the relations that held between the parts of sentences. This was the centerpiece of the Chomskyan project, as those of us who had entered it under one of the other auspices would ruefully discover. The three kinds of linguists made strange, increasingly uneasy bedfellows, and the field has yet to achieve a rapprochement among them.

In the 1970’s social-science-minded linguists developed other concerns. Just as sociologists like Erving Goffman had been looking inward into their own cultures and finding them pretty exotic, linguists began to collect and study familiar yet unanalyzed language forms like nonstandard dialects or ordinary conversation. In this they broke away from TGG, which certainly concentrated on English, but in a very different way.

To the formal (that is, strictly Chomskyan) TGGian, the linguist’s task was discovering the abstract grammar of the language, the grammar of the “ideal speaker-listener in a completely homogeneous speech community, who knows its language perfectly and is unaffected by such irrelevant conditions as memory limitations, distractions, shifts of attention and interest, and errors (random or characteristic) in applying his knowledge of the language in actual performance” (Chomsky 1965, 3). While variants were known to exist, they were deemed of little importance; and while the context in which words were uttered might affect both their form and their understanding, TGGians saw that as essentially irrelevant to the abstract grammar they were seeking to intuit. Hence empirical data, painstakingly gathered from real people’s actual utterances, was not only not necessary, it was undesirable: it might be corrupt, tainted by trivial external influences, “performance factors.” Transformational theory directed its practitioners to produce the data that they then analyzed, and those analyses then formed the basis of their theories. If this sounds like a dangerously corruptible (and circular) system, let me assure you that today I find it scandalous. The acceptance of these theories and methods drove a wedge between the TGG “mentalists” (both formal and humanists) and their purported colleagues, the empirical social scientists.

None of the parties was then asking a major question about language: how do we use it to construct ourselves, make deals with one another, and weave our social fabric? To the social science end of the field, the fact that language was social was indisputable, but to answer that question was to wallow in the slough of mentalism, surfacing with analyses that were unreliable because not based directly on superficially observable data.

It might seem that that question would occur naturally to the TGGians and their descendants, especially given Chomsky’s extradisciplinary concerns with the politics of language and vice versa. But they had a conservative idea of “meaning.” For them, not unlike their more empirical colleagues (Chomsky after all had himself been trained as an American structural linguist), evidence had to be linguistically observable at the surface or not far below it. The assumptions speakers had in mind when they spoke, or their intentions in choosing one form rather than an apparent equivalent, were not part of “linguistics,” that is, grammatical analysis. These restrictions made for simplicity and elegance, but created in some of us a gnawing desire to see linguistics work as the “window into the mind” Chomsky had promised us it could be.

So both sides – the social scientists and the mathematicians – were to varying degrees antimentalistic and anti-interpretive. A reasonable justification, for both camps, was that to abandon their use of only superficially accessible data was to go back to that dangerous yesteryear in which anything could be related to anything on the analyst’s say-so – the awful results of which could be discerned not only in pre-twentieth-century linguistics, but in contemporary psychoanalysis and literary theory. It also moved linguistics further away from “science,” where rational people who craved respectability wished to reside.

The only problem (for me at least) was that accepting these constraints entailed accepting the impossibility of saying almost everything that might be interesting, anything normal people might want or need to know about language. For instance:

– Why do men and women, who “speak the same language,” regularly misunderstand each other?
– Why do we late-twentieth-century sophisticates, after a century’s barrage of advertising, still find ourselves bedazzled by the language of persuasion, economic and political?
– How do we use language to avoid responsibility for ourselves and allocate it to others?
– How can lawyers on either side in a trial describe the same events in such different ways that jurors fail to agree on a verdict? Or reach a verdict that, to an outside observer, makes no sense at all?
– How do the stories we tell and hear, privately and publicly, give us our understandings of ourselves and the society we inhabit?

And much, much more. Answers to any of these questions must start with close analysis of actual linguistic data and show how the specific forms speakers select have the precise effects they do on social, economic, or political reality. Analyzing the superficial linguistic form of a communication alone cannot explain why these particular words, in those specific combinations, operate to this exact effect on the minds of hearers; much less can it teach us how to be discriminating hearers and responsible creators of language. Looking (like political scientists and communication theorists) only at the effects of linguistic choices as demonstrated in polls and focus groups leaves open the question of what exactly happened to create this effect. So if linguistics is going to raise interesting questions and answer them in useful ways, and exist as more than an ivory tower curiosity, linguists must find ways to bring these different forms of analysis together, to look at language closely, but not to stop with language; to consider the complexity of cause and effect in everything we do that involves linguistic expression (which is pretty nearly everything we do). Analysts of language must use their minds, like transformational grammarians, as a filter or conduit: we must ask ourselves how we are affected by particular uses of language. But as social scientists we must use real, spontaneously created language as the basis of our analyses; and we must be sure that our individual mentalities, producing our own interpretations, are not idiosyncratic; the effect that I describe language having on me should elicit an “aha!” from you a good part of the time. Even when you disagree, you should see that my version might work for someone who brings to the interpretations a context different from yours (for discussion of this issue see Tannen 1984).

Modern “core” linguists still shun such endeavors as unscientific, tending to confine their analyses to the safe havens of relatively small and concrete linguistic artifacts: the sound, the word, the sentence. Even many sociolinguists have been loath to engage in analyses that involve intuition or introspection. Occasional linguists (as opposed to, say, discourse analysts) have alluded to the existence of “structure above the sentence level,” but mostly in the same tones as pre-Columbian Europeans speaking of what lay beyond the Ocean Sea. There be dragons – don’t go there.

Yet we don’t make meanings or express intentions at those smaller levels. Meanings become visible in discourse: connected language used for a purpose, whether in the form of a conversational turn, a haiku, a how-to manual, a courtroom cross examination, a novel, or any of the innumerable other linguistic actions in which all of us engage regularly. To do so, we have to have internalized a set of rules or principles dictating what is a possible utterance within the relevant genre and what is not. These rules are part of our linguistic knowledge as much as the rules about what constitutes a permissible cluster of sounds in a language or what makes a string of words intelligible as a sentence. Traditionally linguistics has been unwilling to consider the processes by which we understand larger and more abstract units of language (text or discourse) as a part of a speaker’s knowledge of language that linguistic theory must account for. But it seems to me that there is no natural reason to cut off “linguistics” before meaning enters the picture, except as a reflex of the old antimentalism and the desire to keep a tightly formalistic hold on the meaning of “grammar.” True, adhering to the old ways will avoid the dangers of solipsism and incoherence (unless you believe, as I do, that formal statements can be just as incoherent as informal ones). But in simplifying your life this way, you make linguistics an artificial field, condemned to turn away just as things get interesting, unable to make a true rapprochement with literary analysis, or psychology, or anthropology, or political science. I know some of my colleagues want it that way, and I wish them success; but they should not force us all into that Procrustean bed.

For this reason, I see all the topics I deal with in this book as “linguistics,” ways of understanding “language.” Most linguists would go along with me through Chapter 3, which looks at “political correctness” and therefore hovers around the safely linguistic level of the word or phrase. Even Chapter 4, on “sexual harassment,” as defined by Anita Hill, still relatively small-scale and concrete, may pass muster. Chapter 7, on Ebonics, will seem unexceptionable as an examination of dialect differences and attitudes toward them – a respectable topic for sociolinguistics. But many esteemed colleagues will part company with me there.

The other chapters all cluster at a more abstract linguistic level. They are about the social and political construction of narratives. Who makes our stories, and how do they develop over time and through an assortment of media venues? What happens, more particularly, when groups or individual members of those groups (the O.J. Simpson jury, Hillary Rodham Clinton) who previously were accorded no right to self-definition through language, take for themselves that right, in very public ways? I see the appropriation of narrative-construction rights as parallel to, and a natural development of, the earlier appropriation of definitional rights at the word level. The narrative-controlling strategy of Hillary Rodham Clinton discussed in Chapter 5 is, in this view, the direct lineal descendant of the reappropriation of “Black” by the civil rights movement in the late 1960’s. I’m not saying that the linkage is conscious, just that the recent one could not have occurred at the abstract level of narrative had not the earlier one, more readily graspable at the lexical level, familiarized our society with the idea that language could be reclaimed.

Some linguists will be troubled less by the objects of my analysis than by my analytic procedures. How do I justify my interpretations? Language is the transference of meaning from mind to mind (among many other things). A satisfying theory of how we use language to make and change public and private meaning (which is what this book attempts) can only be tested by what it illuminates and communicates to those who encounter it. There is, alas, no extrinsic, objective, “scientific” test of the claims I will be making. Some might argue that, therefore, what I am doing is not “scientific” and should therefore be read out of both linguistics and decent society. But the scientific method is not the only way we can arrive at understanding. As we enter a new century and a new millennium, it behooves us to keep our eyes open and raise again some old questions: What exactly makes an endeavor “scientific”? And must every way of knowing be “scientific” according to the traditional definitions? Is it, finally, possible to have a linguistics that does what the study of language must: weld the interests and methods of “pure” science, “social” science, and the humanities into one, taking from each what it needs?

Other questions follow from the foregoing. Where do I get my data and what do I do with it? These too are problematic.

Over the history of linguistics, linguists have been of two minds about the collection and use of data. For transformational syntacticians and their descendants, it is important to determine the limits on the applications of grammatical rules, and therefore it is essential for the investigator to construct sentences that would never occur naturally, as well as sentences that might occur but that might not show up in a normal-sized corpus of naturally occurring data. Hence the syntactician must construct sentences and test their grammaticality as a prelude to proposing rules and grammars.

Empirically minded linguists avoid this mentalism and its hazards, at least overtly. But often they find that, in order to say anything of significance, they must work interpretation (i.e., mentalism) into their analyses in covert form. That can create even greater opportunities for corruption than outright mentalism.

Being forced to choose between these possibilities (often framed by their proponents as Manichean opposites) means that, inevitably, valuable opportunities for understanding will be lost. Eschewing mentalism forces the analyst to forfeit the use of the tool humans naturally use to understand language – the mind. So motives, ambiguities, and subtleties are off limits, and linguistics becomes (to my mind anyway) a sterile enterprise.

On the other hand, if we admit deep interpretation into our armamentarium, whose do we choose, since each of us enters every discourse with our own context and perspective, which necessarily color (some might say distort) our interpretations? How do we justify our results? Can they be verified, or falsified? And if not, can they be “scientific”? And if they are not scientific, can they be useful? Are interpreters of human communication (whether linguists, literary critics, or psychoanalysts) engaged in a scientific or a humanistic enterprise? I want to answer, whether flippantly, greedily, or properly, both. But how can the methods and perspective of those very different discovery systems be welded together into a harmonious whole that yields reliable results?

Who, if anyone, is in charge here? Who certifies the interpretation of discourse? At one time many of us thought the responsibility for meaning (in Western culture at least) lay with the producer: the speaker/writer produced the meaning; the hearer/reader might or might not perceive the speaker’s meaning correctly. 1 But increasingly, I think that the speaker does not necessarily encode a single meaning, and that the business of making sense with language is a collaborative and indeterminate business. For most circumstances, though not some of the most interesting ones, if speaker and hearer get close enough to some sort of agreement, that will be fine. 2 But that still leaves open the question of who decides what things “ought” to mean, or “must” have meant. Both speaker and hearer are suspect: they have their own interests. So should we trust an “objective,” uninvolved interpreter? She is, to be sure, outside the immediate fray. But she also has an interest, albeit a theoretical one, in the conclusion. And worse, the interpreter is an outsider, and thus inevitably never privy to the totality of shared context between the participants themselves. Without some form of participant observation, meaning is necessarily lost. It is paradoxical but true that the greater the objectivity, the greater the unreliability.

Some scholars, for instance Deborah Tannen (1984) and John J. Gumperz (1982), have tried to circumvent this impasse by making interpretation a several-stage process. Some time after the initial speech event that they have recorded, they go back with the transcription or tape to the original participants. The latter are asked to assess what was going on: Why did you laugh here? What about that long pause there? How did you feel at that moment? How did you take the interlocutor’s remark? What did you mean by your response? Then the analyst interprets the participants’ interpretations, whether agreeing, offering alternatives, or disagreeing, and explains what’s going on at all three levels: the original discourse and the two levels of interpretation. But it’s not clear that the original participants are any more reliable interpreters of their own intentions than is the professional linguist, especially some time after the fact. Freud showed that we are unreliable interpreters of our own behavior. And of course the analyst is not above suspicion on several grounds. She might argue that her special knowledge compensates for her distance from the original utterance. But does it, or might it just introduce additional uncertainty?

These questions at present, and perhaps forever, are unanswerable. Arguably this problem destroys any hope of using language as a true window into the mind. Yet much work has been done that seems to the majority of readers, professional and lay, to be rich in insight and even helpful in daily life. Perhaps we are overly pessimistic; perhaps, just as the psychoanalyst Donald Winnicott talked about “good enough” mothering, we can talk about “good enough” linguistic analysis. And if we are aware of the dangers, we can take steps to minimize them, if never completely avoid them. The awareness of solipsism may enable us to avoid it. I have learned to question my first take on anything, to be alert to the possibility of alternate understandings. Then, almost invariably, any idea that makes it into print has been subjected to a fair amount of discussion with friends and colleagues; offered to students in classes, presented to groups of various kinds as lectures, and explored in the media in interviews and talk shows. Very often my interlocutors offer new insights that I can incorporate. I have learned, finally, that it is almost never true that an utterance has only one possible interpretation. Rather, I am offering here (as elsewhere) understandings of public language that I hope will be plausible. Then too, it is important to realize that I am not trying to state unequivocally what any speaker “meant”: that is impossible. Rather I am trying to explain my understanding of that utterance, and thus the understandings of other hearers or readers, but by no means necessarily everyone’s. I hope thus to be making plausible interpretations, those likely to have been made by people sharing a fair amount of psychological and/or social context.

This conundrum, originating within linguistics as a purely methodological dispute, has connections to an argument raging in the rarefied air of literary theory: who is responsible for the making of meaning (if that is even a rational question)?

On one side are the deconstructionists, for instance Jacques Derrida and his followers. In its strongest form, deconstructionism asserts the undecidability of meaning in any text. Neither the original author nor any subsequent reader holds the key to “the” meaning of anything. Anyone who claims that power can do so only through the illegitimate exercise of political superiority or brute force. This is a highly subversive position, denying as it does the legitimacy of both political and cultural authority, and thus has been the target of much conservative critique of the “liberal” or “radical” university (see Chapters 2 and 3). These critics feel betrayed, having been brought up with the comforting certainty that all was knowable: you just had to know someone who knew. If you were the right kind of person, that could be you! In any case meaning is stable and determinate. Life is serene.

But neither deconstructionist chaos nor authoritarian certitude represents the commonsense world that readers and hearers know. When competent speakers engage in any kind of discourse, they form ideas in their minds about what it means and respond accordingly. Sometimes, to be sure, later evidence reveals that an interpretation is at odds with an intention, with resultant embarrassment. But more often there is sufficient consensus for the discourse to proceed to a satisfactory conclusion: “good enough” understanding.

That commonsense consensus matches Stanley Fish’s (1980) idea of the “interpretive community.” As in literature, so in communication more generally: we understand what we encounter based on shared contexts and experiences. In its most obvious form, a linguistic interpretive community is a “speech community,” defined as consisting of everyone who, in some sense, “speaks the same language.” 3 But communicative interpretive communities may also be based on more abstract similarities: gender, political sympathies, aesthetic preferences, occupations. Over the last thirty years feminists have demonstrated the existence of a (formerly private) women’s interpretive community whose presence became public and explicit as a result of the Clarence Thomas confirmation hearings in 1991 (see Chapter 4). Hence the rallying cry of those times, “You just don’t get it!” signifying the emergence of a community ready and able to make and insist on its own interpretive rules. Conservatives predictably decry such developments as “special interests,” “tribalism,” or “balkanization,” but as I will argue in Chapter 3, they demonstrate something much deeper and more positive, if initially uncomfortable. Special speech communities have always been a part of human existence, but only recently have they emerged into widespread public recognition.

The interpretive community is the best model for the way we as speakers participate in discourse, as well as for the way we as scholars interpret that discourse and in turn create our own. We must see ourselves, in all our language-using roles, as participants in several always shifting communities of meaning-making. As long as our interpretations work, as long as most of the time, most people respond in a way that mostly seems appropriate, we are doing well enough. Meaning is made by consensus: the original speaker contributes form, the original audience response, the analyst an explanation linking the two.

There is seldom a need for, or a possibility of, complete overlap of intention and understanding. A general sense of cohesion among participants suffices for most human purposes. The scholar attempts to identify all latent as well as patent understandings, eventually discarding those that are unlikely in the context and/or disputed by the original participants. But we should never flatter ourselves that we have created the complete or ideal interpretation of anything. Our work is partial and provisional. But it’s good enough.

Another controversy arises out of the data I have chosen as the basis of my analyses: largely written, generally mass-media, most often planned discourse. Since the early twentieth century American linguists have seen spontaneous small-group oral communication as “real” language, the only worthwhile object of study. That assessment represents an overreaction to an earlier assumption that only literate communication was worthy of study. Now it’s time for the pendulum to return to the center. In a literate society like ours, meaning is negotiated through a wide array of communicative channels: written language and oral; public and private; formal and informal; spontaneous and constructed; direct and mediated. All of these together create our identities as individuals and members of a society. Each contributes to the totality that is us. If we are interested in the way language creates and constructs us all, we must consider all the forms our language takes. Any claim that some forms of language are “realer” or more legitimate objects of analysis than others is misguided. The question to ask is, how do all the forms language takes in our time work together to produce the results we see around us?

One more caveat: this book is written with what some will allude to (with a sneer) as a “liberal slant.” I realize that true scholars are supposed to display “objectivity,” seeing things from both sides, or all sides, or none. Along with many postmodern cultural commentators, I wonder whether there is such a thing as true objectivity, or at least whether, if objectivity exists, it is ever true. Often beneath the objective surface a writer’s real beliefs exist in distorted and covert forms, presupposed rather than asserted and therefore difficult to identify and critique. Even in the best case, objectivity creates disengagement, and disengagement is deadly in every sense. So get it: I’m a liberal.

As I write the foregoing, I wonder about it. In doing the research for this book, I read my way through a lot of conservative discourse: George Will, Peggy Noonan, William Bennett . . . I could go on. (What I have suffered for you, dear reader!) However elegantly written, however smartly argued, there is one thing lacking in every example I have encountered: any kind of apologia for its political stance, or even, generally, any explicit acknowledgment that the work is written from any political stance at all.

That is puzzling. Why do I feel a need to explain, if not apologize for, my politics, but George Will doesn’t? Despite the wails of conservatives (see Chapters 2 and 3 especially on this point), there is no case to be made for liberal control of public discourse via a conspiracy of the “liberal media.” If that were so, “liberal” would be what linguists call the unmarked value, requiring no justification, and conservatives would feel a need to confess. I speak about this issue at some length in Chapter 2.

The foregoing are my assumptions and ground rules. If you can accept them, reader, read on.

Sabotaging Success by Mary Cook, M.A., R.A.S.

This article was originally published by OmTimes, an online magazine, in its October 2010 issue and is reprinted here by express permission of Mary Cook. Thanks, Mary!

Typical definitions of success include having a loving partner, financial wealth, and a thriving career. These are admirable goals and there is nothing inherently wrong with them. The problem lies in the false belief that through obtaining them, we will feel happy and fulfilled. Furthermore, we want our fantasy of these goals, and the desired emotion, to manifest permanently.

Not only are we aware of countless people who have achieved far more than the above goals, yet remained unhappy and unfulfilled, we have personally experienced this as well. After a brief exhilaration, we typically realize that we have failed to reach a new positive emotional plateau. We might feel disappointed that our goals did not measure up to our fantasy of them. Our achievements may require unwanted ongoing maintenance, responsibilities, learning or growth on our part to sustain them. Or they may present us with a whole new set of problems. If they indeed meet our highest expectations, then we fear changes, diminishment or loss of what we’ve acquired. And yet, despite the lack of lasting happiness and fulfillment, we set new goals for success, with the same delusion that they will be our emotional deliverance.

When our attention is focused on wanting something different from what we now have, we will fail to arrive anywhere that gives us an improved emotional and mental state. In fact, the new places, faces and outward circumstances have the uncanny ability to stimulate the same old internal themes, thoughts and feelings from which we tried to escape.

We have the idea that success resides in the future, yet the only power we have is within the present moment. And although the standards for success consistently and persistently rise above wherever we are now, we fail to question our primary assumptions. Instead, we continue to reinforce negative and stressful feelings of insufficiency. Thus we end up sabotaging true success and happiness.

Primary personal beliefs begin in childhood, in circumstances where our well-being is in the hands of other people and external events. The ideas that we are not “enough” and we do not have “enough,” and we are dependent upon the external world to correct this, are deeply embedded in our minds and behavior. The more stressful our childhood is, the more tightly we hang onto fear-based beliefs and protective defense mechanisms.

When life expands and deepens, instead of amending our earlier assumptions, we generally distort new information or fail to apply it to our personal circumstances. Thus it is the energy of past fears of inferiority and insufficiency from our childhood dependent state that propels us toward achievements with false hope. The motivating energy of a goal will create an achievement that holds the same energy. So we are in a cycle of sabotage.

We must transcend ordinary thinking in order to resolve the problems created by ordinary thinking. We believe that external factors will bring us happiness and fulfillment. We think that the more intensely and frequently we yearn for these external factors, the more quickly they manifest. We assume that we do not have to change ourselves in order to attract desired external change. We expect that happiness and fulfillment will be permanent upon the achievement of future goals. When these beliefs are proven wrong time and time again, we pursue them even harder. When confronted with these false beliefs, we deny them while our behavior confirms them.

True success is internal. True success is right now. True success is not dependent upon external factors. True success changes into many forms. True success is meeting outward circumstances with the healthiest, most positive response. True success is recognizing that the point of life is learning and growth, not complacency, or stagnation.

Happiness and fulfillment exist within our higher consciousness. Our soul shines light on perceived darkness, and shows us external abundance for perceived deprivation. We must notice and value our own feeling of happiness whenever it arises. We must become aware of the places in our life where we feel fulfilled. External factors that stimulate these feelings are reminding us of the treasure that is always within us. There is no outward success to capture that will ensure our happiness and fulfillment. When we practice experiencing the positive feelings that we associate with our desired goals, we enter higher consciousness. When we take realistic actions that are in sync with these positive feelings, we enter higher consciousness. This is where true success without sabotage exists.

Mary Cook is the author of “Grace Lost and Found: From Addictions and Compulsions to Satisfaction and Serenity,” available from Barnes & Noble bookstores, etc. She has 34 years of clinical practice and 29 years of university teaching experience. She is a national speaker and has a private practice in San Pedro, CA. Mary is available for telephone and office counseling, guided meditation, speaking engagements, and in-service training. Contact her at and see her website for further information.