Thought, language and political correctness.

Like many other bloggers I’ve noted and given some thought to the debate on political correctness triggered by Anthony Browne’s pamphlet for the right-wing think-tank Civitas, ‘The Retreat of Reason’. I will get around to commenting on Browne’s actual work in due course, but in the meantime I spotted this post from Robert Sharp which all but demands immediate comment by virtue of incorporating the statement below:

To repeat: The purpose of Political Correctness is a noble one. It seeks to refine our political debate. It identifies and eliminates discrimination in our everyday language. Inconvenient[l]y for Civitas and Anthony Browne, some of this prejudice exists within the traditions and social mores of British Civil Society, the homogenising behemoth that they exist to defend. They therefore see Political Correctness as a threat, and they go on the offensive. This is truly a tragic irony, as they succeed only in holding back a force of progress, one which seeks to weed out Britain’s prejudices, and recognise its historical mistakes. Only when that process is complete may we call ourselves ‘Great’ once more.

I wonder if Robert realises or appreciates just how sinister a concept he’s putting forward when he talks of the purpose of political correctness being to identify and eliminate ‘discrimination in our everyday language’, for there is far more to this particular idea than the mere removal of certain words from common parlance – such an idea strikes at the very concepts those words express.

It seems strange, in some respects, to have to resort to a source more reactionary even than Browne in order to challenge Robert’s assertion of a ‘noble purpose’ for political correctness, yet here such a thing is entirely necessary and one must turn to Joseph de Maistre:

la pensée et la parole ne sont que deux magnifiques synonymes

‘Thought and words are only two splendid synonyms’ argued Maistre in what was, at the time, a remarkable and original insight.

How do we think?

We think by using symbols, symbols which make up an articulated vocabulary – the most basic of which are words. One cannot think without recourse to symbols (words), and yet one cannot invent or create such symbols without being possessed of the ability to think. The use of words cannot have been ‘invented’ artificially any more than the ‘use’ of thoughts, as the two are identical.

This concept has profound implications when applied to the question of political correctness, particularly in the context of seeking to eliminate discrimination from everyday language, for by eliminating discrimination from language we also, logically, eliminate the idea and concept of discrimination from thought. If we have no language with which to describe, define or express discrimination then we can have no such concept at all.

Some may see that as a good thing: if one eliminates the language of discrimination then one eliminates the concept of discrimination, and in doing so one, logically, eliminates discrimination itself.

A noble purpose? Perhaps, but one that is deeply misguided and dangerous.

Were we to eradicate the very concept of discrimination we would lose, in the process, our capacity to understand that concept, and with it the capacity to understand that portion of history that relates to or relies upon an understanding of it – and through that a part, if not most, of our understanding of who we are and of how we came to be the people and the society we are today. It is not merely the case that the concept of discrimination disappears from everyday life; it disappears also from history. Not only does discrimination cease to exist, but it can never have existed, as we would have no conception of what it is, what it may mean, or have meant, nor of how to understand it.

This is something that George Orwell understood all too well in his creation of ‘Newspeak’ in the novel 1984. If someone – the State in Orwell’s novel – can control language and can eradicate words that are considered undesirable, then that someone can control the thoughts of a population; they can, quite literally, make certain ideas and concepts not just disappear but cease to exist, cease to have ever existed.

The underpinnings of political correctness, if taken to the level of Robert’s suggested ‘noble purpose’, lead quite naturally to the concept of thoughtcrime – what political correctness seeks to weed out is not ‘Britain’s prejudices’ but the very notion of prejudice itself; the ‘homogenising behemoth’ here is not the ‘traditions and social mores of British Civil Society’ but political correctness itself, which seeks to enforce on society a totalitarian uniformity of thought and concept entirely at odds with the concepts of liberty, free expression and even humanity.

There is little to choose between Robert’s ‘noble purpose’ for political correctness and the equally noble purpose expressed in Burgess’s ‘A Clockwork Orange’, wherein Alex is ‘cured’ of his capacity for violence by means of aversion therapy. However, as Burgess goes on to demonstrate, stripping Alex of his capacity for violence strips him, equally, of his humanity, to the extent that he becomes incapable of functioning effectively in the real world. Ultimately, Alex’s personal ‘salvation’ comes not at the hands of the doctors who forcibly take away his violent urges but only after his capacity for violence is rekindled and he chooses, of his own free will, to turn away from violence.

Violence, prejudice, discrimination – all are part of the human condition, facets of human nature without which we would cease to be human at all, as surely as if we were to lose our capacity for compassion, our belief in liberty or our ability to think and communicate.

That which distinguishes the civilised man (or woman) from the barbarian is not that the civilised man lacks the capacity for violence or prejudice or discrimination, but that they understand that capacity all too well and yet choose, of their own volition, not to exercise it. That decision, that moral and ethical choice, can be made only if one is possessed of the concepts of prejudice, discrimination and violence and is, therefore, equally possessed of, and capable of using, the symbols (words) needed to express those concepts.

—-

Footnote:

None of the above should be construed as necessarily supporting Browne’s thesis in his book, much of which appears, to me at least, to be little more than a poorly constructed and long-winded bout of quasi-ideological whinging by a reactionary old git.

9 thoughts on “Thought, language and political correctness”

  1. In defence of political correctness
    The purpose of Political Correctness is a noble one. It seeks to refine our political debate. It identifies and eliminates prejudice. Inconvenienty for Civitas and Anthony Browne, some of this prejudice exists within the traditions and social mor…

  2. It seems to me, though, that the thesis underlying your worry is utter nonsense. Orwell was wrong – you can’t control thoughts by controlling the meaning of words (always assuming you could even begin to do that in the first place). The articulation of thoughts _may_ depend on words (though assisted sometimes by non-verbal cues).

    One can indeed think without words – do you imagine that aphasics can’t think? Words are the means we use to communicate our thoughts, but they do not underpin thought itself. As Pinker points out “We have all had the experience of uttering or writing a sentence, then stopping and realizing that it wasn’t exactly what we meant to say. To have that feeling there has to be a ‘what we meant to say’ that is different from what we said. Sometimes it is not easy to find /any/ words that properly convey a thought.” (The Language Instinct, pp57 – 8)

    To return to the Orwellian nightmare – if a state (or any other organisation) should try to achieve that control of thought through controlling language, they will simply be wasting their time. People will continue to subvert, change, mould and adapt language to meet their own communication needs.

    If the purpose of Political Correctness is to make prejudice literally unthinkable, then it is doomed to inevitable failure, no matter what.

  3. I beg to disagree on several counts.

    First, Maistre’s basic thesis on the unity of thought and language is certainly valid, although perhaps I should qualify that by noting that although the symbols we call words dominate cognition in modern humans, they are by no means the only set of symbols associated with thought.

    One can articulate thought through gestures, through visual symbols and also spatially, otherwise we wouldn

  4. If I understand correctly what you mean by the ‘unity of thought and language’, it’s definitely not so. Not at least from my personal experience of aphasia during migraine attacks (for example, finding all my nouns had gone). I was easily able to think, but unable to express myself without resorting to sign language (such as pointing at things) – that is, I knew quite well what I wanted to say without being able to access the appropriate word in my vocabulary. Of course, I had only a short period to devise a workaround (about an hour or so before the pain kicked in).

    “..based on the Sapir-Whorf hypothesis… that

  5. I’ll get back to you on some aspects of this in a day or two – working on something else at the moment – but as a starting point for Sapir-Whorf, the ever reliable Wikipedia provides this which should get you started.

    Yes, Whorf’s work in particular does incorporate linguistic relativity as well as determinism – not relativism, the two concepts being rather different – and one has to be careful in dealing with the influence of Humboldt and the ‘Weltanschauung’ hypothesis, which is to this branch of linguistics what the strong anthropic principle is to cosmology: logically valid but too strong and constraining to be practically valid.

    Having said that, I doubt we’ll agree overmuch – as scientific splits go, that between the determinists and the Chomskyans is pretty much as vicious as any you’ll find between competing hypotheses at the moment.

    Many of the criticisms of Whorf’s original work – which dates to the 1930s – are valid, especially the whole trope of the Inuit language having a huge number of words for snow, which seems a classic case of a scientist fitting the facts to the hypothesis and not the hypothesis to the facts. But then a fair bit of the opprobrium amongst Chomskyans about Sapir-Whorf making a comeback is no better founded either; it seems very much the old ‘we thought we’d seen that theory off ages ago’ business that you get rather too often in science, and which often belies scientists’ pretensions of being open-minded.

    I suspect, as ever, that what will ultimately emerge, and be required, is a new hypothesis that synthesises ideas taken from both Pinker/Chomsky and S-W, most likely in concert with Berlin’s concept of ‘value pluralism’, which I’m more and more coming to accept as having greater validity and rationality than anything else that’s out there at the moment.

    Just one point I would make on the subject of the unity of thought and language. I’m not suggesting that the processes of thought and language are identical, rather that they function in parallel and are both conjoined and disjoint at the same time; i.e. one cannot think without language and one cannot have language without thought, but one can still think using language even when one’s capacity to articulate language, externally and internally, is impaired or even absent, as occurs in everything from aphasia to brain injury affecting speech and language processing. It’s rather akin, in some ways, to deafness in later life – you can still speak but not necessarily hear what you’re saying, or if you can, what you hear seems garbled and indistinct.

    Where things get tricky is in terms of the persistence of language and how this differs in relation to different forms of brain injury – cognition using words (for want of a better description) has greater persistence than cognition that relies on visual symbols or spatial awareness, such that even relatively minor damage to the brain’s visuo-spatial processing capabilities can impair one’s ability to ‘think’ visually and/or spatially. There are several gender-linked genetic disorders affecting women which have the side-effect of impairing spatial processing/thinking and which have no real parallel in men – in that sense the whole business about women being prone to having difficulty in reading and comprehending maps has some degree of validity.

    Conversely, one can suffer extensive damage to the speech centres and verbal language processing capacity in the brain and yet suffer no particular impairment in the ability to think – when you talk of ‘losing’ nouns, what you’re losing is not the concepts they convey but the ability to externalise those concepts, even internally – the inner voice still functions and it is still using language but, temporarily at least, you can’t quite make out what it’s trying to say.

    Why this should be the case is all wrapped up in the mind-body problem, for which there is surely a Nobel Prize awaiting anyone who can solve it, but as this has occupied scientists and philosophers for centuries without yielding any definitive answers, you’ll understand that it’s not a subject on which I can offer any more help than they can, except to say that if there is a solution to that problem, it shows every sign of being one that will, in my opinion, emerge out of quantum mechanics and grand unified theory. At the radical fringes of, in particular, memory studies there are psychologists who’ve ‘played’ with conjectural metaphysical notions that the mind may somehow extend beyond the four dimensions of space-time – one of the key problems with memory is capacity, as there seemingly isn’t enough storage space in the physical brain to cope – but this kind of stuff is so radical that no one really talks openly about it as yet.

  6. Another retort to Anthony Browne’s pamphlet
    The Sharpener: Quaking under the jackboot of political correctness. Or not Recently a hawkish hack writer in the Times and Spectator, among other places, published a pamphlet through Civitas, entitled The Retreat of Reason* ([1], [2]), in which he argu…

  7. Interesting debate – the underpinning language theory of PC speech codes is indeed the Sapir-Whorf hypothesis; however, as a mere psychology graduate, my understanding is that it has been falsified and, in its "hard" version at least, rejected. The argument goes something like this:
    Sapir-Whorf proposes that language determines thought; it follows that someone speaking, say, French would think in French, and that the concepts available to them would thus be determined by the constraints of that language (linguistic determinism). These constraints would be different for someone speaking, say, English, who would thus think differently. This seems logical in so far as it explains the "lost in translation" phenomenon, where languages do not neatly map to each other and may lack the capacity to express concepts that are marked in other languages (e.g. the French use of "tu/vous" to denote familiarity, a concept which is not linguistically marked in English).
    The problem is this – certain concepts (e.g. birth, death, mating, eating) are commonly understood across cultures; additionally, many non-verbal cues, such as facial expressions, are innate (in that they are present and regular in children born blind) and culturally invariant (the meaning of a smile, a frown, etc. is universal). If we were constrained by our symbology, how would this be possible? The concepts of happy/sad could not be universally understood, as their cognitive representations would vary with native language. Take the famous example of the Inuit having 26 words for snow (which, incidentally, they haven't, but it serves the purpose here). Far from showing that Inuit speakers have a different set of concepts, it merely shows that, for geographical reasons, snow is relatively important in Inuit culture, and that this importance has been culturally marked by the generation of a wide variety of words to describe it.
    The point is that language is a product of thought, not vice versa. This is why the PC speech-code project is doomed – if someone wishes to insult or offend and certain words are unavailable, the concept they wish to convey does not go away; the negative association will still be maintained, and simply attached to the new word. The "sensitive" term "special needs" has already suffered this fate. The PC answer – continually change the language until we don't have to – is plain barmy; language and the meanings it conveys are dynamic systems, and negative expressions will simply adapt to the available vocabulary. Like most left-wing thinking, the constructivist approach (of which linguistic determinism is a part) denies the possibility of innate characteristics, sticking rigidly to Locke's "blank slate" conception of the human mind, no matter how many times it is mugged by reality.
    A final refutation: if language determines thought, and pre-PC we had only the "language of oppression", then how did anyone know this? The language of tolerance and sensitivity was unavailable, so where did the concepts of tolerance and sensitivity come from, and how did the awareness that the existing language was "wrong" arise, if it was impossible to think in any other terms?

  8. Matt: Jeez it’s been a while since I wrote this.

    First, any deterministic theory in psychology will eventually run into the mind-body problem, negating the possibility of a ‘hard’ theory being entirely valid. That applies as much to Chomskyan linguistics as it does to Sapir-Whorf.

    Follow either route to its logical conclusion and you end up in metaphysics; you just hit the mind-body problem in a different place – following Chomsky’s universals will eventually take you into Jung’s cosmic consciousness.

    What can be said is that S-W operates successfully within certain limits and that, equally, one can consider thought and language to be so closely interrelated as to be almost inseparable – two sides of the same coin.

    In that respect there are similarities between psychology and physics, particularly quantum mechanics and the search for a grand unified theory. We have partial theories that operate successfully within limits, but it may well be that a unified theory will remain elusive, as it takes us beyond boundary conditions to a point where the partial theories break down.

    It is, therefore, entirely possible that language is paradoxical, in that one can argue both that language ‘creates’ thought and that thought ‘creates’ language – a dual-state arrangement akin (metaphysically) to the wave-particle duality in quantum physics.

    You’re right that one cannot eradicate concepts by eradicating language, which shows that such concepts are persistent. Quite why this should be the case we (obviously) cannot be certain, although, again, we have partial theories, such as the idea of racial/genetic memory, that offer some measure of a working hypothesis. Again, however, such theories are likely to break down (as usual) once one hits the right boundary conditions.

    To be honest, the more one considers these, and other, questions in terms of seeking universalities, the more I come around to thinking that the ‘solution’ to the mind-body problem, such as it is, may well lie in the concept that the mind, and therefore thought, memory, etc., may well function at something akin to a quantum level.

    Understanding how it does that is, however, a very different matter.
