As I mentioned the other day, during the holidays I passed some time leafing through a stash of ancient computer magazines found in my back room whilst mucking out. I still have nearly every issue of ROM Magazine (1977–78); not to be confused with ROM Magazine (1968–present), the official publication of the Royal Ontario Museum, or R.O.M. Magazine (1983–85), a Canadian zine for Atari hobbyists, nor possibly others. No, this ROM was subtitled ‘Computer Applications for Living’, and an ambitious little periodical it was. To distinguish it from the others, I am tempted to go into Monty Python mode, and call it ‘ROM which is called ROM’, but I shall cramp myself down and stick to the bare three letters.
Microcomputers began to be heard of in about 1973, and the first commercially successful machine, the MITS Altair 8800, came to market about the end of 1974. By 1977, the earliest manufacturers (who mostly sold their machines in kit form) were being pushed aside by relatively large consumer electronics firms like Radio Shack and Commodore, and by an upstart called Apple, which you may have heard of. These early machines were flaky, quirky, and required rather a lot of technical knowledge to operate; and there was little in the way of commercial software, so you generally had to learn to program them yourself.
In consequence, there was a voracious after-market for technical information and how-to stuff, much of it supplied, in those pre-Internet days, by magazines. There was BYTE, which covered the nuts and bolts of the new hardware for an audience mostly of engineers; and Dr. Dobb’s Journal, which covered the bits and bytes of software for an audience mostly of programmers; and Creative Computing, which covered whatever seemed most interesting at the moment (not a bad approach, that); and a raft of mostly short-lived zines dedicated to this platform or that.
And then there was ROM, which was a platform for what have since been called technology evangelists. Its mission was to introduce these weird new toys to society at large, and explain how and why they were going to change the world in drastic and unforeseen ways. It failed on both counts; but not for want of trying, nor for lack of quality.
For if you look at the bylines in the nine issues that were published, you will find yourself staring at a convention of first-rate geniuses. A sampling:
Bill Etra (now, alas, lately deceased) was a pioneer in computer video. Eben Ostby got involved with a man named Lasseter and a quirky little cartoon called ‘Luxo Jr.’, and became one of the founding fathers of Pixar. Lee Felsenstein is a hardware guru who has had a hand in inventing approximately everything; most particularly the VDM-1 display interface, the granddaddy of all graphics cards. Theodor H. ‘Ted’ Nelson is the inventor of hypertext; the World Wide Web is his red-headed stepchild, and he is not proud of the use it has made of his stolen DNA.
Ted Nelson wrote a column for ROM, called ‘Missionary Position’: a mildly daring thing to do in 1977. In one of those columns, he addressed himself to the ‘Memory Problem’. The early microcomputer hobbyists had to work on machines with painfully tiny amounts of RAM – usually 4 or 8 kilobytes; 16K was a dream of sybaritic luxury. Of course they imagined that all their programming difficulties would be solved if only they had enough memory. Nelson, who had been working on mainframe computers for decades, rudely disabused them of this notion. As he put it, the Memory Problem is fundamentally like the Time Problem, and the Money, Sex, and Quiche Problems: there is never any such thing as enough.
Memory, bandwidth, and processor speed, like time, money, bureaucracy, and labour (and possibly also sex and quiche), are subject to Parkinson’s Law. C. Northcote Parkinson originally observed, ‘Work expands so as to fill the time available for its completion.’ In fact, work expands so as to consume all the available X, for almost any value of X. This knowledge is a vaccine against a wide range of disappointments in life; but there are always unvaccinated souls (in technical language, ‘suckers’) who are ready to be taken in.
I skip forward a bit. A day or two after re-reading Ted Nelson’s polemic on the Memory Problem, I was looking at back numbers of Creative Computing from the early eighties. At that time, there was a fad for ‘text adventure’ games like Zork. The first of these, called simply Adventure, had been written for mainframe computers. It was seldom adapted for 8-bit microcomputers; the original code wouldn’t fit in their tiny memories. One digital Don Quixote, Robert A. Howell, tried to cram a full version of Adventure (in BASIC, no less) into an Atari 800 with 32K of RAM and no disk drive. He wrote a long, rather dryly amusing article about this attempt, which took him an entire summer; and Creative published it in their August 1981 issue.
To do him credit, Howell almost succeeded. He had to shorten a lot of the descriptions, and leave out the ‘maze of twisty little passages, all alike’ that occupies considerable room in the original. But by a series of wildly ingenious tricks to save a few bytes here and a few bytes there, he constructed a working version of Adventure that would just squeeze into that 32K with about 50 bytes to spare. If you ran it more than once, it tended to crash the machine, because Atari BASIC did not always reclaim all the memory that it allocated for a program. If that happened, you had to wipe the memory and reload the program from cassette tape. Such hazards were part of daily life with 8-bit computers. For the hobbyists of 1977, 32K of memory represented wealth beyond the dreams of avarice; but Ted Nelson was right, it did not solve the Memory Problem.
That same August, IBM introduced its original PC, blowing the lid off the 8-bit memory limit. The new 16-bit architecture could address a full megabyte of RAM, of which 640K was available to programs. That didn’t solve the Memory Problem either. Spreadsheet software was the new ‘killer app’ of the time, and thousands of people bought IBM PCs specifically so they could build bigger spreadsheets. Nowadays, nobody bats an eye at spreadsheets that take up tens or hundreds of megabytes; and those are small files, compared to some of the databases and media files that we work with today. Onward nevertheless—
My first computer, back in 1980, had 16K of RAM, and at least one salesman (for a competing product which I did not buy) airily told me that was more memory than I would ever need. My current production machine has 16 gigabytes. And yet the Memory Problem persists.
For while I was reading that ancient article of Howell’s, I was haunted by a more recent memory. It took me a little while to put my finger on it. But if you go to Amazon’s support pages for Kindle Direct Publishing, and still more if you Google for advice on publishing to KDP, you will find the Memory Problem snarling at you in all its fanged glory.
The great advantage of KDP, from a writer’s point of view, is that it allows you to collect (and keep) 70 percent of the retail price of an ebook sold on Amazon: far more than the pittance that any traditional publisher will give you. (The tradeoff is that you may sell fewer books. But since most books are rejected by traditional publishers and sell no copies at all, even this disadvantage is largely illusory.) However, there is a catch. Once you select the 70 percent royalty option, Amazon deducts a tax from your share of the money – a downloading fee of a few cents per megabyte for each copy sold.
Now, it does not cost Amazon a few cents to download a megabyte of data to a customer; or even a gigabyte. What the tax does is to discourage authors from wasting server space and bandwidth (and space on people’s Kindles) with unnecessarily large ebook files. A 100,000-word novel, saved as straight text with no fancy formatting and a single colour JPEG file for the cover art, occupies about 1 megabyte. If you add interior artwork, or embed your own choice of fonts, the size goes up. Some people have actually been such fools as to publish photographs of every page in a printed book, and call that an ebook. Not only does this spoil all the special advantages of the ebook format (try searching for text in a photograph!), it bloats the file to an indecent size – tens, possibly even hundreds of megabytes. This is one reason why art books are conspicuously absent from the KDP library.
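To make the arithmetic of that downloading fee concrete, here is a minimal sketch in Python. The 15-cents-per-megabyte rate and the sample prices and file sizes are illustrative assumptions, not Amazon’s actual fee schedule, which changes from time to time; the point is only how quickly a bloated file eats into the author’s share.

```python
# Sketch of the 70% royalty option with a per-megabyte delivery
# fee deducted before the royalty is calculated.
# The $0.15/MB rate and sample figures are assumptions for illustration.

def kdp_royalty(list_price, file_size_mb, fee_per_mb=0.15):
    """Return the author's cut: 70% of (price minus delivery fee)."""
    delivery_fee = file_size_mb * fee_per_mb
    return 0.70 * (list_price - delivery_fee)

# A lean 1 MB novel barely feels the fee...
print(round(kdp_royalty(4.99, 1.0), 2))
# ...but a 20 MB illustrated book hands most of the royalty back.
print(round(kdp_royalty(4.99, 20.0), 2))
```

On these assumed numbers, the lean file earns about $3.39 per copy and the bloated one about $1.39, which is the whole incentive in a nutshell.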
However, sometimes you do have to include interior illustrations – maps, diagrams, line drawings, what have you. And sometimes you have to do tricks with typography that the Kindle engine does not support; and there again you must resort to graphics. The megabytes quickly pile up, and your share of the retail price just as quickly goes down. So, in the interest of authors, readers, and Amazon’s own pocketbook, Amazon kindly supplies you with web pages telling how to compress those graphics, minimize the amount of detail required, and generally skimp on transmission costs.
One byte of memory in 1981 cost rather more than a million bytes today; but I should say that authors and designers nowadays take more effort to save a megabyte than even Robert A. Howell took to save a byte in the old days. It makes sense. Howell was only saving memory on his own computer; we are saving bandwidth and storage space for all our readers, who may number in the thousands.
Decades from now, someone may chance upon this little screed, and marvel that human beings would waste effort on something so trivial as saving a megabyte of download capacity for an ebook file. And he will turn back to his own work on holographic VR environments, or whatever is en vogue at that time, and try to figure out how to cram a quintillion bytes of data down a pokey little fibre-optic line with a bandwidth of a few measly quadrillions; and he may reflect that he, too, is still saddled with the Memory Problem, and heave a mournful sigh before he goes on. And if he pauses for a moment of silence, he may hear a strange dim sound in the distance – the sound of Ted Nelson, cackling with laughter in his grave.
This is marvelous. Sharing.
I remember reading a quote from Bill Gates saying that no one would ever need more than 4K, but that might be apocryphal.
And you can accept the 35% royalty rate – and not pay delivery charges – for large books, I believe.
‘More than 640K’ is the version usually quoted. I don’t know whether he actually said that either; but the memory of the original IBM PC is laid out in such a way that you can’t add more than 640K of contiguous user memory.
When PCs first began to have multiple megabytes of RAM, all kinds of weird software hacks had to be used to link up the bottom 640K of the address space with the free memory above the 1-MB mark. The addresses in between were taken up by BIOS, video memory, and other specialized hardware functions; and the operating system required those functions to be assigned to those addresses, so that block of memory could never be used for programs. This was a very grave design flaw, for which Mr. Gates deserves full blame.
The world marching onward. . . .
Back in 1984, we Commodore owners sneered triumphantly or pityingly (from the dizzying heights of our 64K) at the owners of mere Sinclair Spectrums (a paltry 48K at best).
May I say, Mr. Simon, that I like your style. Just the way you put words onto a page appeals to me, puts me at ease, and makes reading through your blog posts a very relaxing and engaging activity, regardless of the subject. I wish I had half of your skill with words.
That said, as a former C.S. student, I find the contents of this post intriguing and amusing in their own right.
Holy moses, you just brought back my teenage years to me in a spattering of recollection. I remember laboriously typing out entire game programs by hand in BASIC for my TRS-80 Color Computer from those magazines.
The truly astonishing thing is that my adolescent memory of BASIC was acute enough that I picked up Visual Basic for Applications fast enough to land a coding job at a financial firm despite having no experience whatsoever. Life truly does fall into patterns.