Those who cannot remember the past are condemned to repeat it.
—Santayana
Those what cannot remedy the past can pretend to repeal it.
—Howland Owl
The second text is from Doyle and Sternecky’s revival of Pogo, which met a worse fate than it deserved. Howland attributed it to ‘Santa Ana’, but it was his own (or his authors’) genius that distorted the quotation into a sort of Freudian slip of the Zeitgeist worthy of P. G. Wodehouse. On the one hand you have the serious philosopher of history, weary and worldly-wise, bluntly restating the obvious law for the thousandth tedious time; on the other, the half-baked Postmodernist, illiterate but pretentious, vaguely remembering some better man’s scripture that he may be able to cite to his purpose.
I have been too long in the company of the second kind of people. I mean the kind who mistake wishful thinking for valid reasoning. If you point out the difference between these things they call you heartless, and if you prove them wrong about the facts they call you an intellectual bully. ‘Everyone is entitled to their own opinion,’ they smugly say; and if they are a little more learned, according to their dim and dubious lights, they add that you are merely making a transparent patriarchalist attempt to oppress alternative modalities of enlightenment by determining the parameters of discourse.
I have seen this stubborn resistance to reality, this insistent claim that one’s own ego is the measure of all things, from both ends of the intellectual scale. I have known not one but two men who had exactly the same triumphant response to any objection: ‘Not necessarily.’ They invoked this magic formula with a touching and childlike faith that it was equal to every adversity and would always prove them right. If you told them that two and two make four, or water is wet and runs downhill, or the roof over their heads will fall down if they remove the supporting walls, they fell back on the sovereign specific and thought they had won the argument. It would be charity to call them half-witted. But the difference between their wall-less roof and the castle in the sky of many a Ph.D. candidate is not large, except in decoration and scope.
George Orwell once said, in one of his own rare excursions into wishful thinking, that you ought to simplify your language because then, if you make a stupid remark, its stupidity will be obvious even to yourself. The example of my two half-witted friends shows what a vain hope this is. We are all of us, at times, like John Earle’s Sceptic in Religion, who is ‘always too hard for himself’. We outwit ourselves with great regularity; we get lost in our own garden path. And if the garden seems so pleasant that we would rather lose our way, simplicity of language will not save us. We can all say ‘Not necessarily’, and we all have moods in which that answer will suffice us. The cleverer we are, the more easily we will clothe our stupidity in plausible verbiage. Ignorance in a three-piece suit is more welcome in public than ignorance in the nude.
But to cover a fault is not to cure it. ‘Clothes make the man,’ Mark Twain said. ‘Naked people have little or no influence in society.’ But there are plenty of people who have little or no influence no matter how many clothes they wear. Their ignorance shows through and gives them away, sometimes on very brief acquaintance. This is why people so seldom take Orwell’s advice. We don’t want our stupidity to be obvious, even to ourselves; others might notice it first. Persuasion is the art of engaging people’s sympathies before they have time to discover that you are a damned fool. A sound argument, an agile intellect, or a fluent stream of blarney may not convince anyone, but they give you more time to soften up your audience before your stupidity becomes obvious to them. And of those three, the easiest by far is blarney.
Blarney as such has naturally and properly fallen into disrepute, along with the politicians and salesmen who are its chief practitioners. But whole disciplines have arisen out of the urge to buttress blarney with obfuscation. Lawyers, preachers, and technical people all have their own varieties of blarneous jargon, which serve the important purpose of helping them count coup against their professional rivals, and the vital purpose of preventing laymen from understanding what is really going on. Medical jargon is replete with technical terms that mean, in plain English, ‘Your body is breaking down and we have no idea why.’ At least half the prestige of the major professions depends upon the impenetrability of their language.
Of course the real past masters of the Higher Blarney are to be found in academe. In many fields, learned papers are not merely encouraged but required to be written in a particular kind of abstruse gobbledygook that was deliberately designed to obscure meaning and to completely efface the actual process of discovery. The folklore of science is filled with examples. Kekulé discovered the molecular structure of benzene by daydreaming about snakes; but there is no mention of snakes in the paper by which he announced his discovery to the world. Science is supposed to have more dignity than that. But in leaving out the snakes, he also had to leave out the actual line of reasoning that led him from the daydream to the testable hypothesis. Instead he was obliged to invent a wholly factitious argument, working backwards from the proof until he found a starting-point that was both uncontroversial and respectably dull. Kekulé’s scientific blarney is pardonable because he did confess the actual process of discovery in his informal writings, and still more because his conclusions were right. But the effect of countless formal papers like his is to falsify the way science is actually done and erect a spurious image of how it ought to be done.
From ignorance in a white lab coat we progress to ignorance in a tweed jacket with elbow-patches. The further we get from the physical sciences, the easier it is for blarney and jargon to subsist in their own right. Some branches of the humanities have hardly any contact with testable reality. Their practitioners can build theories out of soap-bubbles without ever seeing them pricked. Philosophers are especially notorious for this. Descartes set the modern fashion by trying to build a whole system of God and the universe without drawing upon observations of either one. He started with Cogito ergo sum, and laboured valiantly to explain everything in existence without introducing any other axioms. This is monism with a vengeance — even though, on quite other grounds, Descartes is more usually described as a dualist. It is also a highly developed case of half-wittedness, though to do Descartes justice, he had half of a very copious and brilliant wit.
Descartes’ downfall was that he wrote his Discourse on Method in French, a luminous and perspicuous language, whose good qualities he had neither the authority nor the ambition to destroy. (His Meditations, the fuller statement of his system, were written in Latin.) It was all too easy to see what he was getting at. The Church responded, for the most part, with the weary tolerance of a mother with a hyperactive two-year-old; the secular world responded with a hearty horse-laugh. But certain students of philosophy caught the monist virus just the same. Knowing the Cartesian system to be hopelessly inadequate, they were still dazzled by the idea of the system. The idea that all existence is fundamentally one, and fundamentally simple, is tremendously seductive. I have felt the appeal of it myself. You might think even a philosopher would notice that reality is in fact exceedingly complicated, and entertain some doubt whether it can be reduced to a single principle. This would be a mistake. Any philosopher worth his salt can by mere fiat dismiss the complexity of the world as an illusion. Philosophers can dismiss anything as an illusion; they can dismiss everything as an illusion. The intellectual history of India for the last twenty-five hundred years has been an unceasing and quixotic fight against the idea that reality is real.
So other monists arose, or rationalists as they preferred to call themselves, Leibniz and Spinoza and others, writing for the most part in Latin and doing no lasting harm. Then came Immanuel Kant. Kant was half German and half Scottish, an unpropitious combination. He had the arrogance of a stereotypical Prussian and the indifference to appearances of a stereotypical Scot, and he wrote his tortuously complex thoughts in German without much caring whether anybody understood him or not. Interpretation and teaching could be left to men of lesser gifts. In his most famous and characteristic work, Kant expresses himself in code. He uses ordinary words in very precise technical meanings of his own invention, and sometimes he changes the meaning without bothering to notify the reader. He specializes in the page-long sentences that the seventeenth century loved so much and the eighteenth was already beginning to tire of. The tortured syntax of modern literary German and the tortured writing style of modern academia are mere byproducts of his prodigious engine of thought.
Kant made it unfashionable to be clear; he set a fashion for philosophers to demonstrate their profundity by obscurity, and they ended by getting the obscurity without the profundity. The history of philosophy, or at least of self-styled philosophers, since then is not a happy one. It is like a monumental staircase with a weighty German name inscribed on every step — Fichte, Hegel, Schopenhauer, Marx, Heidegger, Nietzsche, Wittgenstein — leading us down at stately pace into the abysm of gibberish. Lately there has been a reaction; men of goodwill, and sometimes even of good sense, have set up as philosophers, and patiently retaught themselves the art of writing clearly. (I have a particular admiration for Harry Frankfurt, who famously wrote a philosophical essay ‘On Bullshit’. I can recommend it to anyone not squeamish about language.) But much damage was done in the meantime.
You might think that linguistics, whose threshold I am just now crossing with the trepidation of Dante taking his first step into Hell, would be an unpromising field for this kind of windy and monomaniacal speculation. There are, after all, such things as languages, and the most elegant theory about language is apt to be exploded if it contradicts the way people actually talk to one another. The trouble is that some linguists, and among them the most prestigious, hardly talk to anybody but themselves.
The whole field of linguistics today lies under the heavy shadow of Noam Chomsky. Some linguists call him a prophet, others say he has set the whole field back by forty years; nobody can ignore him. His weaknesses as a scientist, which in my ill-informed opinion entirely vitiate his strengths, arise directly from his devotion to the Higher Blarney. As a philosopher — for he is that much more than he is a linguist, as Freud was much more a philosopher than a psychologist — he follows firmly in the rationalist school. All through his career he has been obsessed with the idea of a universal system of language. This he used to call ‘Transformational Grammar’; every decade or two he changes the name of his theory and substantially changes its structure, but what never changes is his unshakable conviction that he is right, and that what people actually say doesn’t matter.
Chomsky is not exactly a Postmodernist — he is too definite and dogmatic for that — but he could fairly be called the John the Baptist figure of Postmodernism. It was he who invented, or at least popularized, the idea of ‘deep structure’. In the jargon of Transformational Grammar, this means that the real structure of a sentence — the thing that makes it intelligible, and makes us know that it will be understood before we say it aloud — is radically different from the apparent structure, the one your English teacher bored you to death by parsing and diagramming. This could have been a valuable discovery. Just as Kekulé’s real process of discovery was nothing like what he put in his paper on the benzene molecule, the way we actually make up sentences has little to do with the formal models in the grammar books. But that does not mean that our thought-processes are fundamentally linguistic in nature. If we examine the ‘deep structure’ of language with an open mind, and especially with a mind open to the rest of our everyday knowledge, we do not find a strange and esoteric construct imprinted in the human brain. What we find is nothing less than the world itself.
If Chomsky’s failing as a linguist is that he would rather be a philosopher, his failing as a philosopher is that he is merely a linguist. It was he who composed the famous nonsense sentence, ‘Colourless green ideas sleep furiously.’ This he did, not because he wished to make any particular point about meaning, but because he wanted to exclude the whole question of meaning from his terms of reference. He wanted to deal with sentences that were seen to be grammatical even though they were meaningless; he wanted to divorce words from their referents. Words are properly in the domain of a linguist; he can deal with them as a professional in command of his material. Referents are beyond his command and may be beyond his competence. Chomsky, in trying to build a monist model of language, quite properly (by the standards of Cartesian rationalism) excluded things so that he could focus exclusively on words. But the most fundamental observation about language is that words mean things. We test the validity of our sentences, not by reference to some Platonic model of grammar, but by whether we do in fact succeed in communicating the ideas we are trying to represent.
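The divorce is easy to make mechanical. Here is a minimal sketch, in Python, of a recognizer that accepts Chomsky’s sentence on part-of-speech shape alone; the lexicon and the single pattern are my own toy inventions, standing in for whatever formal apparatus a Chomskyan would actually use.

# A toy recognizer: 'grammatical' here means only that the string of
# part-of-speech tags matches the pattern Adjective* Noun Verb Adverb?.
# Both the lexicon and the pattern are invented for illustration.

LEXICON = {
    'colourless': 'Adj', 'green': 'Adj',
    'ideas': 'Noun',
    'sleep': 'Verb',
    'furiously': 'Adv',
}

def is_grammatical(sentence):
    tags = [LEXICON.get(word.lower().strip('.')) for word in sentence.split()]
    if None in tags:
        return False                  # unknown word: no verdict possible
    i = 0
    while i < len(tags) and tags[i] == 'Adj':
        i += 1                        # any number of adjectives
    if i == len(tags) or tags[i] != 'Noun':
        return False                  # exactly one noun required
    i += 1
    if i == len(tags) or tags[i] != 'Verb':
        return False                  # followed by one verb
    i += 1
    if i < len(tags) and tags[i] == 'Adv':
        i += 1                        # optionally one adverb
    return i == len(tags)

print(is_grammatical('Colourless green ideas sleep furiously'))  # True
print(is_grammatical('Furiously sleep ideas green colourless'))  # False

The point of the toy is merely that the recognizer consults nothing outside the sentence: it will pass any amount of well-shaped nonsense, which is exactly the divorce from referents that Chomsky wanted.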
I once had a physics teacher named Harry Zuurbier, a charming and eccentric Belgian with no great aptitude for physics but a remarkable grasp of the linguistic problems peculiar to science. He used to say that if you could not define a technical term in ten words or less, you obviously didn’t know what it meant. (It was, fortunately, a recursive rule. You were allowed to use other technical terms in your definition, provided you could define them.) I have often used Mr. Zuurbier’s method to sharpen my own understanding of words, including some that are not normally thought of as technical at all. I have defined art in this way, and the other day, while caught in traffic on the Crowchild-Glenmore flyover (just as Kekulé had one of his daydreams on the upper deck of an omnibus), I set myself to come up with a ten-word definition of language. After a little thought I produced this:
Language, n. Cognitive negotiation with external reality by exchange of conventional signs.
This is a reasonably straightforward definition, though under Zuurbier’s rule it requires some recursive unpacking. By cognitive in this case I mean ‘intended to produce understanding’. By negotiation I mean that the understanding is to be congruent and shared between two or more parties. If we talk to ourselves, we are generally trying to put words to our perception of reality so that we can fix more clearly in our minds what it is that we perceive. We want our thoughts to be congruent with the real world. When we talk to one another we want to share information, to pass on our thoughts so that others will think congruently with us; or at least to pass thoughts back and forth until we reach a compromise that both sides can agree on. Negotiations are of course normally conducted using language, but in this sense all language is a form of negotiation.
By signs I mean what St. Augustine meant in his little book On Christian Doctrine, which, incidentally, is probably the pioneering work on semiotics. A sign is a thing — an object or action — presented not in its own right but to refer to something else. A conventional sign is a sign whose own composition does not stand in any fixed and natural correspondence to its referent. If you ask me the way to the nearest mall and I point down the road, my pointing is not a conventional sign but a natural one. The motion of my finger is not arbitrary, but has a real physical relationship to the location of the mall. If I point at a spot on a map, that is still not a conventional sign, because the map is a simplified scale model of the real terrain. But if the mall is marked on the map as a patch of pink, that is a conventional sign. The mall is not pink, but pink is used on the map to indicate commercial districts; and that is a form of language. And so, of course, is the name of the mall written on the map beside it.
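Zuurbier’s rule, as it happens, is precise enough to be checked mechanically. Here is a minimal sketch, assuming an invented toy glossary built from the definition just unpacked; nothing in it comes from Zuurbier himself beyond the ten-word limit and the recursion.

# Zuurbier's rule, mechanized: every definition must run to ten words
# or fewer, and every technical term used inside a definition must
# itself be defined. The glossary is an invented example.

GLOSSARY = {
    'language': 'cognitive negotiation with external reality by exchange of conventional signs',
    'cognitive': 'intended to produce understanding',
    'negotiation': 'exchange aimed at congruent understanding between parties',
    'sign': 'a thing presented to refer to something else',
    'conventional': 'bearing no fixed natural correspondence to its referent',
}

def passes_zuurbier(term, seen=None):
    seen = set() if seen is None else seen
    if term in seen:
        return True                   # already under examination
    seen.add(term)
    definition = GLOSSARY.get(term)
    if definition is None:
        return False                  # a technical term left undefined
    words = definition.split()
    if len(words) > 10:
        return False                  # over the ten-word limit
    # Recurse into any technical terms the definition itself uses.
    # (A real checker would also need to match 'signs' to 'sign'.)
    return all(passes_zuurbier(w, seen) for w in words if w in GLOSSARY)

print(passes_zuurbier('language'))    # True for this toy glossary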
You will see at once, I hope, that this definition spells trouble for Chomsky’s grand theory of language. The idea of nonverbal language (the pink patch on the map) is troublesome in itself, because for Chomsky the sole business of language is to put words together into sentences. But language is quite capable of being nonverbal. Mathematical symbols constitute a kind of nonverbal language — and not an easy language to put into words, as anyone who has tried to read a complex formula aloud from a textbook can attest. Map symbols and the like are another kind. The ‘grammar’ of these nonverbal symbols has nothing to do with sentence structure, but depends entirely on their referents. A mathematical formula is ‘grammatical’ if it follows the rules of mathematics; a map is ‘grammatical’ if it recognizably expresses the facts of geography.
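For what it is worth, this notion of formulaic grammar can be shown concretely. Python’s own parser will serve as a stand-in for ‘the rules of mathematics’ in this minimal sketch, which uses only the standard ast module.

import ast

def is_well_formed(formula):
    # A formula is 'grammatical' if it obeys the formation rules;
    # whether it means anything is a separate question entirely.
    try:
        ast.parse(formula, mode='eval')
        return True
    except SyntaxError:
        return False

print(is_well_formed('2 + 2 * (3 - 1)'))  # True: follows the rules
print(is_well_formed('2 + * 3'))          # False: violates them
print(is_well_formed('1 / 0'))            # True: perfectly well formed,
                                          # yet it denotes nothing; an
                                          # arithmetical colourless green idea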
I strongly suspect that the same is true, at bottom, even of spoken language. In conversation, we are always altering our emphasis, our delivery, and our choice of words, depending on the reactions of our hearers, and in particular depending on whether their speech shows that they understand us or not. And unless the subject is either very abstract or very far away, as listeners we generally have opportunities to correct and calibrate our understanding by observing the real objects and actions behind other people’s words. It is for this reason alone that we can transmit some sort of meaning even by means of words like ‘whatchamacallit’ and ‘thingumajig’, and locutions like ‘the thing in the place with the stuff’.
It should be clear enough by now that my sympathies are not with the rationalist school of philosophers, but with their archrivals, the British empiricists. Words mean things; the corollary to this is that in the absence of things, words can dispense with meaning altogether. To take another example from Orwell, one literary critic can say ‘The outstanding feature of Mr X’s work is its living quality’, and another can say ‘The immediately striking thing about Mr X’s work is its peculiar deadness’. We are so accustomed to the combination of metaphor and humbug peculiar to the fine arts that we can read these sentences without batting an eye, and even entertain the illusion that we have actually been told something. Probably the first critic means that X’s work strikes him as a lifelike imitation of reality, and the second means that he finds it depressingly formal and formulaic; but absent context, we can never know for sure.
All language is shaped by its context, which consists not only of the adjacent words and sentences but of their referents as well. A cry of ‘Fire!’ means one thing in a crowded theatre and quite another on a battlefield. Nouns are used grammatically in a certain way because they refer (as a rule) to objects which continue in existence without actively doing anything particular. Verbs are used in another way because they refer to actions which happen, either continuously or at a particular time and place. Lenin’s famous phrase ‘Who whom?’ is a more authentic description of how language really works than all of Chomsky’s theories about ‘D-structure’ and ‘government-binding theory’. We use words to tell each other who does what to whom, and our grammar has been evolved and tested by centuries of daily use to do that job as economically and unambiguously as possible.
When I say ‘The sun rises in the east’, the actual motions of the sun serve, so to speak, as a guarantor of my grammar. The ‘grammar’ of equations guarantees that a function f(x) will behave in a certain way. The ‘grammar’ of celestial bodies guarantees that the relative motions of the earth and sun will show a certain kind of periodicity. The grammar of grammar, so to speak, is what it is because we mean it to describe and reflect the properties and behaviours of the things we want to talk about.
There is, by the way, a more theoretical objection to Chomsky’s notion of a universal grammar. His Holy Grail is a set of rules that would automatically specify, for any language, whether a given sentence is grammatical or not. As I have argued, mathematical symbols are a form of language, and such abstract mathematical objects as numbers are a legitimate ‘thing’ for language to refer to. But even within that restricted domain of grammar, it is in effect possible to create ‘sentences’ that are not provably grammatical. This is simply another way of expressing Gödel’s first incompleteness theorem:
For any consistent formal, computably enumerable theory that proves basic arithmetical truths, an arithmetical statement that is true, but not provable in the theory, can be constructed. That is, any effectively generated theory capable of expressing elementary arithmetic cannot be both consistent and complete.
My skill at formal logic is insufficient to provide a rigorous proof, but I have the strongest intuition that it could be proven, that if this statement is true of the restricted subset of language dealing with arithmetic, then it must be true for language as a whole. (I also have an intuition that such a proof is not as easy to construct as it looks. The fallacy of composition is both tempting and treacherous.) Looking further into the matter, I find this statement, which is formally equivalent to Gödel’s second incompleteness theorem and follows logically from the first:
If an axiomatic system can be proven to be consistent and complete from within itself, then it is inconsistent.
It seems to me that what Chomsky is searching for is an axiomatic system that will be a consistent and complete description of all possible grammatical sentences in all languages — and that he wants to express this system grammatically in some language. If so, the thing is impossible by definition.
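To fix the notation (these are the standard textbook statements, not anything of mine): for any consistent, computably axiomatizable theory T strong enough to express elementary arithmetic,

\exists\, G_T :\quad T \nvdash G_T \quad\text{and}\quad T \nvdash \neg G_T \qquad \text{(first theorem)}

T \nvdash \mathrm{Con}(T) \qquad \text{(second theorem)}

where \mathrm{Con}(T) is the arithmetical sentence asserting the consistency of T. If ‘grammatical’ is to play the part of ‘provable’, as the analogy requires, the second theorem is precisely what forbids such a system to certify its own consistency and completeness in any language it can describe.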
One always feels trepidation in saying that a great and widely acknowledged authority on a technical subject has made a fundamental error. It takes a five-year-old to point out that the emperor has no clothes; we older children are not fearless enough. I do take some comfort in Lewis’s sudden discovery that the philosophical emperors of his youth really were the nudes he took them for:
Nor can a man of my age ever forget how suddenly and completely the idealist philosophy of his youth fell. McTaggart, Green, Bosanquet, Bradley seemed enthroned for ever; they went down as suddenly as the Bastille. And the interesting thing is that while I lived under that dynasty I felt various difficulties and objections which I never dared to express. They were so frightfully obvious that I felt sure they must be mere misunderstandings: the great men could not have made such very elementary mistakes as those which my objections implied. But very similar objections — though put, no doubt, far more cogently than I could have put them — were among the criticisms which finally prevailed. They would now be the stock answers to English Hegelianism.
—C. S. Lewis, ‘Fern-Seed and Elephants’
But even if I happen to be right, I am sure that Chomsky’s followers will carry on unperturbed. Most of them appear to be Postmodernists in the narrow sense, and they have ready access to the stock Postmodernist (and paranoiac) answer to any contrary argument: ‘You’re making a transparent patriarchalist attempt to oppress alternative modalities of enlightenment by determining the parameters of discourse!’ — or, in shorter words, ‘Not necessarily.’ I have known persons, superficially intelligent and apparently educated, who understood no mathematics and refused to believe any controversial statement based upon a mathematical proof. I have even known one person who knew nothing about physics and refused to believe that the law of gravity was anything more than a social construct. He claimed that he could jump off the roof of the Petro-Canada Centre (fifty-three stories up) and fly, if it were not for all the bad vibes from the nasty negative people who wanted him to fall. Protective stupidity is a wonderful thing. It protects the mind perfectly from all unwanted ideas, even if it cannot protect the body from the consequences.
For protective stupidity of this type is merely a symptom of what I may call a disease of epistemology, and a very common one. Before I was six years old, I was intimately familiar with all the basic problems of the relativity of perception, the unreliability of memory, and the uncertainty of knowledge. The first insight came in kindergarten, when I was singing some song or other more or less in tune, as it seemed to me, while fully half the other children were droning on in cacophonous monotone. I now know that I had an unusually well-developed sense of pitch at that age; at the time, I wondered if I sounded as bad to them as they did to me. Before I ever heard of colour-blindness I used to wonder if colours looked the same to me as they did to other people, I think because I had been party to an argument about what colour somebody’s new coat was. And I always seemed to remember things that other people had forgotten, even when they had been eyewitnesses. I have since been told that I have an unusually retentive memory, but I am still surprised and suspicious whenever it turns out to be accurate about anything tricky or obscure.
In short, I was prepared very early in life to be a constitutional skeptic. I knew all the stock questions of epistemology forwards and backwards by experience, long before I knew what epistemology was or learned any of its technical terms. Consequently, when I reached the age of insolent inquiry and began to aggressively question the nature of knowledge, the first rush of answers failed to unseat my reason. Some people, especially those who first encounter these questions in the course of their higher education, react against all the knowledge they thought they had accumulated up to that point; they throw out any number of babies in their haste to be rid of the bath-water. I had thrown away most of the bath-water before the baby was put in the tub.
By that time I had developed what I believe is a very uncommon quality nowadays: an unshakable belief in the honesty of the external world. I don’t mean, of course, that other human beings are honest; all of them are liars in one way or another, and a good many, sad to say, are bigger liars than I am. But reality itself is consistent and true. I am not one of these people with the kind of imagination that can rehearse a thing over and over in memory until they turn black into white and create their own fictitious version of an event. I have a few memories that I thought were of that kind, but several of them have been strikingly confirmed by independent sources and the rest have at any rate not been contradicted. My memory is by no means perfect, let alone photographic, but when I remember a thing at all I usually recall it with reasonable accuracy.
For instance, I can still remember at least one of the songs that gave me such epistemological flutters in kindergarten. I don’t know if it was the particular one that made me worry about my perception of music, but as I wrote the last few paragraphs I could vividly recall the tune and the first verse, and the brown room in the basement of the school that we used as a music room, and the other kids singing out of tune. What I could not remember was the title. A quick Google of the first line was eerily successful. It was ‘The Swapping Song’, also known as ‘Wim-Wam-Waddles’ and by various other names, and it begins:
When I was a little boy I lived by myself,
and all the bread and cheese I had I kept upon the shelf.
To my wim wam waddle,
To my jack straw straddle,
To my Johnnie’s got his fiddle
And he’s going on home.
Except that as I remember it, the teacher sang, ‘all the bread and cheese I had I kept upon a shelf,’ which may be a mistake on my part or a variant reading on hers, and the last line of the chorus was ‘It’s a long way home,’ which for some reason struck me as inexpressibly melancholy. The first link I followed (supplied above) has a lead sheet as well, and the tune is exactly as I remembered it except for a note or two. I have not heard the song anywhere since I was five years old.
It was some help that my parents were comparatively old when they adopted me. My father was a boy during the Great Depression and just slightly too young to enlist in the Second World War, and he used to tell me vivid stories about what it was like to live through those times. He had a distant cousin, Manley Ruttan, who was born in 1882 and moved west from Ontario by covered wagon, and eventually took up a homestead nearly ten miles from Calgary; it is a fully developed suburb now, of no more than the usual hideousness. Manley lived to be 105, and I knew him moderately well. It was from people like these that I imbibed a sense of history as a living and continuing thing, not something that was over and done with and found in dull and lying textbooks.
I was also generally fortunate in my teachers. Rod Kemp taught me history and geography (he would never have any truck with ‘social studies’), and also the useful art of listening with my full attention and retaining a lecture in memory. (‘Listen now, please, scribble later,’ was how he admonished anyone with the temerity to take notes while he was talking. He paused two or three times during each class to let us jot things down.) Doug Coats did call his field social studies, but he had a brilliant gift for teaching local history. It was from him that I learned most about the city I grew up in and the countryside around it; and his teaching dovetailed neatly with Manley’s stories, because his speciality was the period roughly from 1900 to the end of the First World War. So equipped, I never fell into the strange modern habit of thought reflected by the so-called History Channel, where ‘history’ means old television shows and ‘ancient history’ means the shows are in black and white.
There are two possible attitudes to history: you can consider it primarily as fact, or as propaganda. I was steeped from childhood in proofs of the former; the modern attitude, with its constant and restless revisionism (and which bits of history are due for revision can change on the whim of the hour), heavily favours the latter. Which position you choose to favour is a matter of the greatest importance, because all your other thoughts and beliefs will be coloured by it. And you must choose; it is as impossible to hold both opinions and give them equal weight as to focus one eye on the end of your nose and the other on the horizon. Even if the eye muscles would perform the task, the brain cannot attend simultaneously to both points of view.
The propagandist point of view is best epitomized (and satirized) in Orwell’s slogan from Nineteen Eighty-Four: ‘Who controls the past controls the future; who controls the present controls the past.’ As Berkeley would instantly have recognized, this is fundamentally a religious statement. If you believe in a transcendent God (and not ‘the god within’, or ‘the Inner Light’, or ‘my Higher Power’; as Chesterton says, that Jones should worship the god within him means that Jones should worship Jones) you must conclude that God controls present, past, and future alike. If not, you have a more difficult problem. It was the Party’s bold claim that it controlled the past by controlling what was said about it in the present. But such a claim can never be made good.
For historiography is not the whole of history. The evidence of the past is too varied and ubiquitous to be falsified or destroyed. Wolf’s Prolegomena ad Homerum dismissed the Matter of Troy as a fable, and Homer as a convenient nom de guerre for a collective of anonymous aoidoi, at the very moment when the new sciences of archaeology and philology were making it possible to confirm the existence of both. Orwell thought that no accurate history of the Second World War could ever be written from German sources, but by 1960 William Shirer did just that, famously, basing The Rise and Fall of the Third Reich solidly on captured German documents. Stalin burnt enormous numbers of books and periodicals, doctored photographs, censored encyclopaedias, in a failed attempt to make himself out to be Lenin’s principal collaborator in the October Revolution and make unpersons out of Kamenev, Zinoviev, and above all, Trotsky. Within a few years of his death, it was Stalin who was the unperson — in the U.S.S.R., that is; the rest of the world was never deceived.
But we need not look as far afield as Stalin. My closest encounter with the mindset of the historical relativist came from reading Kim Stanley Robinson. Robinson is a writer of a good many talents, but one theme that continually recurs in his work is the impossibility of knowing anything certain about the past. In Icehenge, he goes so far as to posit a society in which humans are effectively immortal, but outlive their memories, so that eyewitness testimony is no longer admissible in court. From this and other sources I gather that he really does not believe that people have reliable memories of events in their adult lives. This is definitely true of senile people, and it can be true of habitual drug-users; so that I wonder whether Robinson’s own memories have been clouded by too generous consumption of THC. I can say from personal experience that the memories of adulthood can be quite as vivid and accurate as those of childhood, though it can be easier to confuse one event with another because there is so great a number of similar days to choose from. The plot of Icehenge concerns an apparent replica of Stonehenge that has been found on Pluto, and turns on the question whether it is an ancient alien artefact or merely a modern forgery. In fact it turns out to be a forgery, but the chief surviving forger is utterly convinced that it is real because she has lost all memory of building it. I find this idea ridiculous, but the démarche is sprung so late in the book that I really had no time to throw it against the wall.
It is an old question in metaphysics whether the past has a real and independent existence somewhere beyond our reach, or is merely the sum of our memories and whatever we can piece together from the surviving evidence. If the latter, the propagandist’s view of history has a chance of holding up; if the former, it must be ruled completely out of court. If we tell lies about China, they remain lies even though we never visit the real China: for the real China is still out there to contradict us. And if the real past is still ‘out there’ somewhere, we cannot hope for a better result by telling lies about history.
Einsteinian physics seems to me to give strong support to the reality of the past. The intervals between space-time events, in Einstein’s model, are fixed, but whether they are perceived as times or as distances depends upon the relative motion of the observer. Time is fungible, at least to some degree; it appears to have all the properties of a spatial dimension, except that we happen not to have freedom of movement in it. Even that is not quite true in theory. In certain extreme conditions you would gain the complete freedom to move forwards and backwards in time, at the cost of losing your freedom of movement in space. At least one mathematical model posits that these conditions should exist in the interior of a black hole, though I believe this model is no longer favoured by physicists.
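For the record, the fixed quantity in question is the invariant interval of special relativity, which in the usual notation is

s^2 = c^2\,(\Delta t)^2 - (\Delta x)^2 - (\Delta y)^2 - (\Delta z)^2

Every inertial observer computes the same s^2 for the same pair of events; relative motion merely redistributes the total between the time term and the space terms. In this sign convention a positive s^2 marks a timelike separation and a negative one a spacelike separation, which is the precise sense in which an interval may be perceived as a time or as a distance.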
The conviction that there is such a thing as time travel is no longer confined to science-fiction fans; and you cannot travel from one place to another unless they both exist. In point of fact we all do travel forwards in time at the reliably fixed speed of sixty minutes per hour, and there appears to be no way for physical bodies to reverse the journey. But the fact that we cannot get back to a place we left behind does not imply that it no longer exists.
For these and other reasons, and also I think by sheer gut instinct, I have never been able to put aside the belief in the objective reality of the past, any more than that of the present. The fibre of the universe is tough, and very long in the grain. Whether there is a transcendent God or not, we humans cannot hope to control even the present in the way that Stalin and Big Brother attempted. It would be apter, for ordinary purposes, to amend Orwell’s slogan: ‘Who pretends to control the past pretends to control the future; who pretends to control the present pretends to control the past.’ Both the stubbornly persistent evidence of the past and the stark unknowability of the future are always there to remind us that even in the present, our sense of control is largely an illusion. The attempts made by some persons to manipulate the past and determine the future must, I think, be put down to protective stupidity, and to what Lewis calls the characteristic modern inability to disbelieve in advertisements. The people who most want to believe that all history is propaganda are those who are good at propaganda themselves. And of course they believe that they are very much better at propaganda than they really are. Their propaganda tells them so.
Perhaps what made linguistics so easy for a Chomsky to conquer is its own weakness for propaganda and for trying to rewrite the past. For at least half its history as a scientific discipline, linguistics was chiefly concerned with historical data, and with trying to sleuth out the connexions between modern languages by tracing their hypothetical common origins. This branch of linguistics still thrives, though in the English-speaking countries it has lost its academic appeal. To this work there have always been two approaches, corresponding to the two attitudes towards history.
Historical linguists work in terms of what they call ‘*-reality’, from the long-standing practice of indicating hypothetical word-forms (as opposed to those attested in surviving documents) by prefixing them with an asterisk. We know that the French quatre and the Spanish cuatro both go back to Latin quattuor, but we only surmise that quattuor and four both go back to Proto-Indo-European *kwetwer: hence the asterisk.
Some linguists take what I believe to be a sensible and cautious view of *-reality. They say, in effect: ‘Four or five thousand years ago, according to all our evidence, languages X and Y had a common ancestor; and it must have been something very like this.’ To them, the asterisk is a constant reminder that their reconstructions are uncertain and subject to revision. The real language of the Kurgan culture was probably somewhat different from our reconstructions. It could have been completely different, for there is substantial doubt whether the Kurgan people spoke Proto-Indo-European at all. And when Allan Bomhard reconstructs a Proto-Nostratic language with eighteen different word-roots all taking the form *bar-, he does not mean that bar was a word with eighteen widely varying meanings. He only means that whatever the differences of sound were between those eighteen words, they have been lost beyond his ability to reconstruct them, because the evidence simply has not survived. The asterisks warn us that the evidence may not mean what it appears to mean, just like the asterisks in books of athletic records.
But the other attitude, which seems to wax and wane over time and is perhaps gaining ground at present, is that *-reality is the past, that the linguist’s reconstruction is real. Tolkien’s one great weakness as a philologist is that he was constantly, if subtly, seduced by this idea. He firmly believed in the objective reality of the past; but he also sometimes felt that he had a mystical connexion with it that others lacked. His unfinished novels The Lost Road and The Notion Club Papers reveal this side of his mind with almost Freudian candour. The protagonists of those stories really were haunted by the ancient racial memories that Tolkien only intermittently fancied he had. Another form of this disease, and a much more virulent and lethal form, appears among those who take the propagandist view of history from the outset. As far as they are concerned, they themselves have the power to create whatever past they want by fiat, as long as they can defend it in debate with their fellow academics. There have been some amusing tempests in teapots between linguists of this school who fancied that they were writing genuine Proto-Indo-European epic poetry. The fact that no two of them quite agreed in their *-poems should have warned them that they were indulging in speculation; too often, they were inspired instead with the fiery spirit of St. Athanasius, solus contra mundum, and the burning desire to stamp out the obvious heretics who dared to dispute their word. From the little I have read, the inventors of the so-called ‘Proto-World’ language seem badly afflicted in this way.
This, then, is the field that I enter with such fear and trembling. And this fear is of a piece with my other would-be professional fear: indeed, the two are suspiciously symmetrical. For my fiction, and especially The Eye of the Maker, is based upon a conceit wholly incompatible with the Postmodernist world-view. It depends upon the idea that the past is real, that reality is objective and discoverable; that our errors cannot dispel the Thing in Itself, nor our prejudices alter the shape of Things as They Are. This idea is anathema to a great many people, and presenting it in the guise of fantasy is unlikely to mollify them; even though, as I find, nearly every story worth reading is underpinned by that very assumption.