Michel de Montaigne, back in the 16th century, was the first writer to call his short, informal pieces by the name ‘essai’. The French word means ‘trial’ or ‘attempt’; Montaigne’s essays represented no set body of knowledge, but his own attempts to work out his thoughts in writing. The pieces collected here are in the same rambling and experimental tradition. I sometimes use the French spelling ‘essai’, not because I am terribly pretentious, but to remind me of the original meaning of the word. Nothing posted here should be taken too seriously. —T. S.

Uninteresting things

Now I deny that anything is, or can be, uninteresting.

—G. K. Chesterton, ‘What I Found in My Pocket’

In the noble little essay from which this noble little sentence is taken, Chesterton waxed lyrical about the many things that he found in his pockets: his pocket-knife, the type and symbol of all the swords of feudalism and all the factories of industrialism; a piece of chalk, representing all the visual arts; a box of matches, standing for Fire, man’s oldest and most dangerous servant; and so on and on. (The one thing he did not find there was the magical talisman he was looking for; and that, though he must have felt it too obvious to remark upon explicitly, is the type and symbol of the fairytale. There is always something that the hero will not find in his pockets, so that he must go forth a-questing.)

Now, I heartily agree that every one of these things is very interesting indeed, and all for the same reason: they are things that you can do something with. But since his time, in the advance of all our arts and the decay of all our sciences, we have greatly multiplied another class of things that are, in the main, very uninteresting. You can do nothing with these things; you can only do things to them. And when the best that you can do to a thing is to ignore it, you have reached the very nirvana of uninterestingness. [Read more…]

A dash of rhetoric

In a discussion at The Passive Voice, Karen Myers wrote at considerable length on the differences between the so-called prestige dialects of written English and other languages, and the colloquial dialects that make up the spoken languages. She ended with this bit of advice:

If you want to write an essay, use formal written English. If you want to write a narrative, use voice (the spoken language in all its registers).

Being the ancient curmudgeon that I am, I had to demur. And being the lazy curmudgeon that I am, I had to cut and paste and repost my response here, as a trifle of evidence that I have not gone entirely silent. Here is what I had to say:

The key, which hardly any linguists seem to have grasped, is that formal English is rhetorical and colloquial English is conversational. Nearly all the differences between them can be explained in terms of information theory.

(Referring again to ancient history: when I was a linguistics undergrad, information theory was not offered even as an option in the curriculum. It was a third-year maths course open, I believe, only to maths majors. My professors not only didn’t know it; they didn’t know it had any applicability to languages, and some of them, I believe, had never heard of it at all.)

In terms of information theory, the strict grammar and finely graduated vocabulary of formal English are error-correction devices. When you converse with another person or a small group, your listeners can give you immediate feedback, and if it appears that they did not understand you according to your intentions, you can immediately offer an explanation or a rewording. You can’t do this in either a published text or a speech to a large crowd. You therefore have to include such cues in the text as are needful to avoid ambiguity and misunderstanding. The art of rhetoric is not concerned only with swaying people’s emotions, as some people suppose; it is also about expressing your case with force and clarity when close two-way communication is not practicable.
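
An illustrative aside, not part of the original comment: the simplest error-correcting device in information theory is the triple-repetition code, and it makes the analogy concrete. The sender builds redundancy into the message before it goes out, precisely because the receiver has no back-channel for asking that it be repeated – which is the predicament of the published text or the speech to a large crowd. Here is a minimal Python sketch, with invented function names:

```python
# A triple-repetition code: the crudest error-correcting scheme there is.
# The sender triples every bit; the receiver takes a majority vote on each
# group of three. The redundancy must be supplied in advance, because there
# is no way to ask the sender to repeat the message.

def encode(bits):
    """Triple every bit: [1, 0, 1, 1] -> [1, 1, 1, 0, 0, 0, 1, 1, 1, 1, 1, 1]."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote each group of three bits back down to a single bit."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

if __name__ == "__main__":
    message = [1, 0, 1, 1]
    sent = encode(message)
    garbled = sent[:]
    garbled[4] ^= 1                      # one bit is corrupted in transit
    assert decode(garbled) == message    # the built-in redundancy corrects it
```

The strict agreement rules and finely graduated vocabulary of formal English play the same role as the tripled bits: they cost something to produce, and they are pure redundancy when nothing goes wrong, but they let a distant reader recover the intended meaning when part of the signal is lost or misread. That is the rhetorician’s art restated in an engineer’s terms.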

This is a difficult art, and those who teach it are rewarded by being called ‘prescriptivists’, who, as any Postmodernist linguist can tell you, are the source of all evil in the universe. One of my professors told me flatly that all prescriptive statements about usage and grammar are by definition wrong; whereupon I concluded that she knew nothing about her subject and transferred to a different class.

The linguists of the anti-prescriptivist school claim that there is no such thing as an error in usage, that whatever any native speaker of a language says is automatically valid and grammatical. But they do not hesitate to use asterisks to indicate erroneous constructions: *goed, *childs, *gooses, and the like. What they object to is that people who know formal language should presume to teach it to those who only know the language in its colloquial registers. They believe that status-signalling is the only reason why formal language exists, and flatly refuse to ask how or why a given dialect came to be associated with high status in the first place.

In the cases of English and German, the formal language was codified chiefly in response to the need to translate the Bible, which is a notoriously difficult text. Luther in German, Tyndale and the Douay-Rheims and King James translators in English, had to invent numerous idioms and turns of phrase to accurately convey the meaning of the original in a vernacular edition. And they had to do it in rhetorical, not conversational, language, because their translations would be read by multitudes of people who could not ask the translators for clarification, and they would be read from the pulpit to large crowds of people who could not even request clarification from the preacher. They had to solve the technical problem of error-correction for their translations to be useful at all. (Other translators tried and failed, or succeeded to a lesser extent, and their efforts are forgotten except by specialist scholars.)

A number of languages achieved their first literary form in this way. The principal surviving text in Gothic, for instance, is a large fragment of the Bible as translated by Wulfila (or a committee which he is believed to have led; cf. paintings by the ‘School of Rembrandt’). Several North American aboriginal languages were first committed to writing in the same way and for the same reason. In each case, the dialect employed in the translation became a standard reference point for the later literary use of the language, not because kings and princes favoured it, but because the Christian part of the population were familiar with their vernacular Bible, heard it weekly, often quoted it daily, and there was no other uniform published text which similarly large numbers of people could be expected to know well.

The Latinate garbage which was grafted onto formal English in the eighteenth century – the shibboleths about prepositions and infinitives and so forth (which I agree with you in despising) – was the product of a time when highly educated Englishmen were expected to be learned in Latin, and English gentlemen were expected to pay lip service to Christianity without actually believing it. To these people, the prestigious author par excellence was Cicero, and you can see exactly how they tried to remodel English to resemble his pompous and artificial Latin. But that attempt did not ‘take’ in the long run, because unlike the Bible, the works of Cicero were of no interest at all to the bulk of the literate populace. The professors worshipped Cicero, the schoolmasters assigned him, the upper-class schoolboys were bored by him, and the classes below them didn’t care a fig about him if they had heard of him at all.

In light of all this, I would modify your closing advice: If you want to write a narrative, use written (i.e. rhetorical or error-correcting) English – but disguise it with the idioms of whatever colloquial speech is appropriate. Part of the art of rhetoric, after all, is to plausibly deny that one is being rhetorical. Shakespeare knew this perfectly well, which is why he made Mark Antony say:

I am no orator, as Brutus is,
But, as you know me all, a plain blunt man
That love my friend, and that they know full well
That gave me public leave to speak of him.
For I have neither wit, nor words, nor worth,
Action, nor utterance, nor the power of speech
To stir men’s blood. I only speak right on.


Addendum. I do not know of any finer or pithier example of a man using rhetorical English and, even as he does so, pretending that he is ‘just plain folks’ talking colloquially.

Gormenghast and the Great Tradition

I began this essai in April, soon after John Wright wrote the blog post to which it refers, and shortly before I was taken ill. I offer it now with apologies, having decided that it still had something to say, and was worth finishing. —T. S.


John C. Wright, in a post at Castalia House, asks:

Why in the world does anyone consider the Gormenghast Trilogy by Mervyn Peake to be fantasy?

He sketches his own scheme of genre classification, which is radial rather than Aristotelian. In case any of my 3.6 Loyal Readers are unfamiliar with these terms, I offer brief definitions.

Aristotelian categories work by genus and species. (These words were borrowed from Aristotle by modern biologists and used in a different way. Ignore the biological usage for the present.) A genus is a category of things, distinguished by some particular quality only found among its members. This quality is called the differentia. A genus can be subdivided into species, by identifying some additional differentia to distinguish members of that species from the other members of the genus. The classical example is the definition, ‘Man is a rational animal.’ Animal is a genus: we can list off ways in which animals are unlike (say) plants, rocks, or locomotives. Man is a species within that genus, differentiated from the others because he is capable of reasoning.

(At this point, the Village Wag will claim that most men are anything but rational. This is a red herring. All humans, except infants and the severely brain-damaged, are capable of some form of rational thinking process. All of them fail to think rationally on some occasions, and some of them fail on nearly all occasions. This does not take away the capacity, which is the differentia of the species Man. A can-opener is still a can-opener, even if you never take it out of the shrink-wrap. Nuts to the Village Wag.)

There is an alternative system, less talked about but sometimes more useful. A radial category consists of a prototype, which is considered an ordinary or definitive member of the category, and any number of other things which share certain qualities with the prototype. If X is the prototype, the category can be defined as ‘things like X’. The similarity may be greater or lesser, so that there are central and peripheral members of the category.

To take an example used by Wittgenstein, chess is a game, and a ‘central’ game at that; it will do no harm to take it as our prototype for the class game. Chess is played for amusement (though in a professional match, it may be for the amusement of spectators); it has set rules and procedures; it is played with definite equipment (chessmen), in a definite playing-ground (the chessboard); it is a competition between the players, with a fixed standard (checkmate) to determine who wins and who loses. Football is unlike chess in some ways – it has many players instead of just two, and it is a contest of athletic rather than intellectual skill; but it, too, is played for amusement, with set rules, equipment, and playing-ground, in a competition with a winner and a loser. The details of play are very different, but in all the essential points, it is just as much a game as chess.

Tabletop role-playing games, on the other hand, are a peripheral member of the category. They are definitely played for amusement. Some equipment is used, and while the playing-ground is usually an imaginary place, it does have sufficient existence for the purpose of the game (like the imaginary chessboard in mental chess). But the rules and procedures are alterable at the game master’s whim, there is no defined winner or loser, and the players normally act in cooperation rather than competition. We feel that these entertainments count as games, but they are very atypical games.
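
An illustrative aside of my own, not from Wright’s post: the difference between the two systems can be put in a few lines of code. An Aristotelian genus applies a yes-or-no test (does the candidate possess the differentia?); a radial category measures degrees of resemblance to a prototype. The trait names below are invented for the example, and Python is used purely for convenience.

```python
# A radial category as "things like X": membership is graded resemblance to a
# prototype, not a yes/no test against a differentia.

PROTOTYPE_GAME = {            # chess, taken as the prototype of 'game'
    "played for amusement",
    "set rules and procedures",
    "definite equipment",
    "definite playing-ground",
    "competition between players",
    "fixed standard for winning",
}

def resemblance(traits):
    """Fraction of the prototype's traits that a candidate shares."""
    return len(traits & PROTOTYPE_GAME) / len(PROTOTYPE_GAME)

football = set(PROTOTYPE_GAME)                               # differs in detail, not in kind
tabletop_rpg = {"played for amusement", "definite equipment"}

print(resemblance(football))       # 1.0   -- a central member of the category
print(resemblance(tabletop_rpg))   # ~0.33 -- peripheral, but still inside it
```

Nothing in the sketch settles where ‘game’ stops and ‘not a game’ begins; that is exactly the point of a radial category, and exactly what an Aristotelian differentia would insist on settling.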

Narrative fiction can be treated as an Aristotelian or a radial category, whichever you prefer. But once you come to subdivide it (for convenience in choosing stories that you are likely to enjoy), you immediately find yourself in a thicket of radial categories that cannot be approached in any other way. A mother reads ‘Cinderella’ to her child, and the child wants to hear ‘more stories like that’. Maybe what the child really wants is more about fairy godmothers, or young girls who marry charming princes, or magical transformations. But whatever the child wants, the mother is likely to find it in the radial category of ‘things like “Cinderella”’, which we call, for convenience, fairy tales. [Read more…]

On political correctness

Morals consist of political morals, commercial morals, ecclesiastical morals, and morals.

—Mark Twain

Here I am not trying to deal with the familiar claim that freedom is an illusion, or with the claim that there is more freedom in totalitarian countries than in democratic ones, but with the much more tenable and dangerous proposition that freedom is undesirable and that intellectual honesty is a form of anti-social selfishness. Although other aspects of the question are usually in the foreground, the controversy over freedom of speech and of the press is at bottom a controversy over the desirability, or otherwise, of telling lies. What is really at issue is the right to report contemporary events truthfully, or as truthfully as is consistent with the ignorance, bias and self-deception from which every observer necessarily suffers.…

The enemies of intellectual liberty always try to present their case as a plea for discipline versus individualism. The issue truth-versus-untruth is as far as possible kept in the background. Although the point of emphasis may vary, the writer who refuses to sell his opinions is always branded as a mere egoist. He is accused, that is, of either wanting to shut himself up in an ivory tower, or of making an exhibitionist display of his own personality, or of resisting the inevitable current of history in an attempt to cling to unjustified privilege.

—George Orwell, ‘The Prevention of Literature’

The term ‘political correctness’, which began (and justly so) as a term of abuse, has been embraced by a legion of liars as a justification for their lies; and it has been made so fashionable that nowadays, in most polite circles, it is considered an insult to accuse someone of not being politically correct.

The usual excuse made for this is that political correctness is about not offending people’s feelings unnecessarily; that anyone who opposes it must therefore want to be offensive, and that, you know, is a Very Bad Thing. This characterization of the issue is one of the Big Lies of our time, as a variation of it was in Orwell’s time. The real issue, now as then, is about the desirability, or otherwise, of telling lies.

If Joe Bloggs wishes to say that two and two are four, or that Paris is the capital of France, or to make any other straightforward and uncontroversial statement of fact, he is working on a level where political correctness does not even come into question. What he says is correct, without any modifiers, or else it is in error. The moment you add a modifier to that adjective, you are moving away from the primary issue of truth vs. falsehood, and into secondary matters which may be in plain conflict with it. [Read more…]

The Memory Problem

As I mentioned the other day, during the holidays I passed some time leafing through a stash of ancient computer magazines found in my back room whilst mucking out. I still have nearly every issue of ROM Magazine (1977–78) – not to be confused with ROM Magazine (1968–present), the official publication of the Royal Ontario Museum, or R.O.M. Magazine (1983–85), a Canadian zine for Atari hobbyists, or possibly others besides. No, this ROM was subtitled ‘Computer Applications for Living’, and an ambitious little periodical it was. To distinguish it from the others, I am tempted to go into Monty Python mode, and call it ‘ROM which is called ROM’, but I shall cramp myself down and stick to the bare three letters.

Microcomputers began to be heard of in about 1973, and the first commercially successful machine, the MITS Altair 8800, came to market about the end of 1974. By 1977, the earliest manufacturers (who mostly sold their machines in kit form) were being pushed aside by relatively large consumer electronics firms like Radio Shack and Commodore, and by an upstart called Apple, which you may have heard of. These early machines were flaky, quirky, and required rather a lot of technical knowledge to operate; and there was little in the way of commercial software, so you generally had to learn to program them yourself.

In consequence, there was a voracious after-market for technical information and how-to stuff, much of it supplied, in those pre-Internet days, by magazines. There was BYTE, which covered the nuts and bolts of the new hardware for an audience mostly of engineers; and Dr. Dobb’s Journal, which covered the bits and bytes of software for an audience mostly of programmers; and Creative Computing, which covered whatever seemed most interesting at the moment (not a bad approach, that); and a raft of mostly short-lived zines dedicated to this platform or that.

And then there was ROM, which was a platform for what have since been called technology evangelists. Its mission was to introduce these weird new toys to society at large, and explain how and why they were going to change the world in drastic and unforeseen ways. It failed on both counts; but not for want of trying, nor for lack of quality.

For if you look at the bylines in the nine issues that were published, you will find yourself staring at a convention of first-rate geniuses. A sampling:  [Read more…]

The exotic and the familiar (Part 4)

Continued from Part 3.

Before we examine the merits that made our three breakthrough fantasies break through, I hope you will permit me a Historical Digression:

As luck or providence would have it, the other night I saw, for the first time, Tim Burton’s magnificently lurid production of Sweeney Todd: The Demon Barber of Fleet Street. That tale has been around, in various forms, for nearly two hundred years; it is one of the hardy perennials of horror fiction – far older than Dracula, almost as old as Frankenstein, almost exactly contemporary with the short stories of Edgar Allan Poe.

Mr. Todd first appeared in 1846, in a story called The String of Pearls, by James Malcolm Rymer and Thomas Peckett Prest – who, for that achievement alone, deserve to be ranked in the first class of Victorian novelists, but never are. For, alas, The String of Pearls was a penny dreadful. That is a term, or insult, that may need a bit of explanation for the benefit of the modern reader.

Every so often, the business of literature is turned topsy-turvy by some new technological development, and the previously unchallenged assumptions of the Grand Old Men of the business are blown to atoms and scattered widely over the waste regions of the cosmos.

[Read more…]

The exotic and the familiar (Part 3)

Continued from Part 2.

In the first half of the twentieth century, the ‘school story’ was one of the most popular genres of British pulp fiction. The giant of the field was Charles Hamilton, better known as ‘Frank Richards’ and ‘Martin Clifford’. Under these two names, he was the lead writer for The Gem and The Magnet, the two leading boys’ weekly magazines in Britain between the World Wars. (He also wrote for other markets under other names, including his own.) For more than thirty years, Hamilton published a 20,000-word story in each magazine every week without fail – more than two million words of fiction per year – until they were killed by the paper shortage of the Second World War. After the war he continued to write, with paperback books taking the place of the vanished pulps. By the time he died in 1961, he had written and published about 100 million words.

Many other writers had a go at school stories. Thomas Hughes founded the genre with Tom Brown’s School Days in 1857, and attracted scores of imitators. Kipling was one of the first; P. G. Wodehouse made a name for himself in the genre before switching to light comedy; and there were, of course, many lesser lights. But the genre died with Hamilton, as it seemed, beyond resurrection. [Read more…]

The exotic and the familiar (Part 2)

Continued from Part 1.

Throughout the 1970s, the ‘New Hollywood’ had been establishing itself. Heroes and villains, Westerns and war movies, were out of fashion. The critics’ new darlings were men like Coppola and De Palma, who pointed their cameras at the mundane and the sordid. The good characters in the new films were ineffectual; the effectual characters, as a general thing, were unselfconsciously evil. This refusal to engage ethical reality was called ‘moral ambiguity’, and praised; the tight focus on a narrow and unrepresentative segment of modern city life was called ‘realism’, and praised more strongly still.

So far as the film business was concerned, fantasy, like animation, was banished to the realm of children’s movies. Such things were considered beneath a grown-up audience, and Hollywood as a whole was trying to be very grown-up indeed. One or two cracked auteurs tried to make animated fantasies for adults, and succeeded in making cult films for stoners and adolescents. [Read more…]

The exotic and the familiar (Part 1)

I’ve heard Brian Aldiss talk about the same phenomenon. For him, a novel often requires two ideas. He describes them as a combination of ‘the familiar’ and ‘the exotic’. He begins with ‘the familiar’ – usually something germane to his personal life, either thematically or experientially – but he can’t write about it until ‘the familiar’ is impacted by ‘the exotic’. In his case, ‘the exotic’ is usually a science fictional setting in which ‘the familiar’ can play itself out: ‘the exotic’ provides him with a stage on which he can dramatize ‘the familiar’. Rather like a binary poison – or a magic potion – two inert elements combine to produce something of frightening potency.

The same dynamic works in reverse for me. I start with ‘the exotic’… but that idea declines to turn into a story until it is catalysed by ‘the familiar’.

For example: The Chronicles of Thomas Covenant is squarely – and solely – founded on two ideas: unbelief and leprosy. The notion of writing a fantasy about an ‘unbeliever’, a man who rejects the whole concept of fantasy, first came to me near the end of 1969. But the germ was dormant: no matter how I laboured over it, I couldn’t make it grow. Until I realized, in May of 1972, that my ‘unbeliever’ should be a leper. As soon as those two ideas came together, my brain took fire.

—Stephen R. Donaldson, The Real Story

Three times in the last sixty-odd years, a work of fantasy has come along that redrew the whole map of the field; that banished the limits of the publishable, as then understood, as suddenly and thoroughly as Columbus banished the ‘ne plus ultra’ from the Pillars of Hercules. Lately I have been thinking hard about these works, seeing what they had in common with one another, and what set them apart from the other fantasies of their times, to see whether I could account for the magnitude of their success.

All three of these breakthrough fantasies can be described in terms of Aldiss’s ‘exotic’ and ‘familiar’. Each, considered thematically, is a collision between two great, or at any rate large, ideas. And when I began to look at them in this light, I found a curious thing: which idea was ‘the exotic’ and which was ‘the familiar’ was not as obvious as it seemed. Indeed, the works themselves tended to familiarize the exotic and exoticize the familiar, so that those whose habits of mind were formed afterwards would never quite see the ideas as their first audiences saw them.

Let me see if I can explain what I mean.

[Read more…]

‘Simplicity or style’

Over at The Passive Voice, Passive Guy has reposted a precious little peacock strut by a minor critic, entitled, ‘Simplicity or style: what makes a sentence a masterpiece?’ The author offers one sentence each from Pride and Prejudice, Emma, 1984, Neuromancer, and other works – as if it were the presence of that single sentence in each novel that assured its place in the literary canon.

I found myself strongly moved to reply:


Ah, the Sentence Cult rears its ugly head. A novel is not made of sentences; it is made of scenes and récit, characters and plot elements – building blocks on the narrative level. The individual sentences are always replaceable – else it would be impossible to translate a novel into another language, or make it into a movie. Too often, the writer’s ‘masterpiece’ sentence marks a place where he ought to have followed the advice, ‘Murder your darlings.’

I can think of one notable exception. That is where the great sentence has special meaning and force inside the story. Perhaps it serves as a Leitmotiv; perhaps it is a bit of dialogue that the characters will recall later, and understand more of its import in light of later events. In any case, it must be possible for the reader to take it in stride. If you have to drop out of the story to pause and admire, the writer has manufactured an opportunity to lose you.

All this, of course, is lost on the pinchbeck critic raised on ‘close reading’, which requires one not to experience the interior drama of the story, but instead to remain carefully on the surface. Such a reader is like the nearsighted tourist who spends his whole day looking at pebbles on the beach, and never even notices the ocean.