The Year of Temptation

Somewhere I read how a Teutonic Knight, to prove his chastity, chose a beautiful woman and lay beside her every night for a year without touching her.


The night she lay beside him first was dark;
But now the moon slips through the arrow loop.
The blade of moonlight finds no fatal mark
Only her hair that has shed its raveling loop.
A child will watch the clouds before the storm
And thrill to thunder's rattle in his bones
While strength and wisdom huddle safe from harm.
Her hair uncoils. He watches back to stone.
Her hair is silvered wire where each strand is loose,
The sheets as white and hot as steel in coals:
All winter's breath and summer's clouds reduce
To floating, knotted waves and shallow shoals.
       The knight has never touched the lady's hair;
       But he is wound and bound and sheltered there.


He left to fight for Christ with sword and lance.
In her door he saw her shadow stand demure,
As still as soldiers await the hostile advance
As hoodless hawks await the order to soar.
Behind him sun, before her blinded gaze.
He watched her stare and hope all down the file
A scrim that furled before the morning haze.
The heat of her eyes on his back did not fade with the miles.
He felt no fear before the howling rush,
No fury when he swung his sword to kill,
No pain to bear a pagan's lucky touch.
Half-through the door of death his heart was still.
       If he put out his hand in the dark and pulled her close beside
       Her flesh could not heal the wound of the love in her eyes.


Her breath is like the fall of steady rain;
Now hard, now soft, while clouds conceal the sky.
Rain is the prayer of farmers' life and gain;
But rain brings mud where knights must walk and die
In the wooded valley twilight. The path is lost,
The pagan voices speak the tongue of rain
The pagan wailing echoes over frost
The wordless speech of rage and rain and pain.
The ancient sacred words of monks and priests,
The patters counted over knots and beads,
The ranging howling passed from beast to beast,
The wind a breath that moans among the reeds:
       Her breath is like some strange and secret speech
       Which none shall learn when none remains to teach.


She swore, before the priest would give his leave,
Never to touch and never a touch to allow.
Sometimes she pulls her arms in through her sleeves
And sleeping winds herself inside a shroud.
The narrow cell is narrower every night.
He sleeps in belt and boots and wrapped in wool.
He flees the bed once the sky is gray with light,
To storm the field like the heavy, heedless bull.
How could the knight who always won before
By heart and strength, the first to leave the castle,
Have known that he already lost the war
Only by choosing to give a needless battle?
       They were no friends who led him to this oath:
       "A year to prove you bravest and purest both."


Her skin is still as smooth as banner silk
Shimmering over the tents of Tartar kings,
Still pale as ice when rivers turn to milk,
Still somehow like all rare and precious things.
The knight has learned with steel that skin is lie,
The lie of life that covers death within.
The strongest knights, the oaks so broad and high
May rot and fall from the smallest gap of skin.
He knows how soon her skin will fail the lady.
The priests have taught him all that age can do.
He knows the painting already is fading
And only memory is always new.
       His blood had yearned for the touch of painted saints:
       But he turns from the taint of blood beneath the paint.

Gender neutrality

When I took up writing essays I learned that writing is best when it is gender-neutral. Tradition told me that he is an adequate contraction for he or she. Languages where gender is obligatory have no problem with it. But as soon as I tried it, I saw that tradition was wrong. In English at least, gender neutrality is simply better than gender conflation, for three reasons.

1. You cannot be gender-specific when you want to be unless you are first gender-neutral. There is no way to gracefully modulate from equating male and human to discussing men and women separately. If he is male… If he is a she… The man who, as a male… The man who, as a woman…

Of course confusing men and human beings may be evil, when it hides women; but even when it is not evil, it is still silly, because it will not let you say anything about the difference.

2. Gender-neutral writing is more forcible. True, formulas like he or she and men and women are tiresome. Interpolating a piece of gender-conflating writing into gender neutrality neuters it. But expressions originally conceived in gender neutrality are more direct and vivid than those that conflate genders.

Some people find men and women or human beings or people intolerably awkward expressions; they would rather say men. Now I like human beings—it asserts biological solidarity without anthropocentrism. But if your concern is the human condition, why not say we?

For the most part, gender neutrality is only a problem because English overloads the third person. Balance the load and you avoid the problem. Are you talking about yourself? Stand up; say I. Are you addressing your readers directly? Look me in the eye; say you.

Not all gender-neutral expressions are more forcible; but those that are gain so much that they justify the rest.

3. Gender-neutral writing is underdeveloped. Someone who becomes a writer in admiration of great but gender-conflating works of literature will understandably suspect gender neutrality as a subtle form of philistinism. So it can be. So gender-conflation can be a subtle form of misogyny. But the strongest argument for gender neutrality is literary.

It was a favorite technique of the twentieth century to escape the weight of literary history by subjecting writing to constraints. Someone wrote a book without the letter e, which is remarkable but trivial. Gender neutrality is a nontrivial constraint. Literature is desperately overcrowded, hopelessly competitive. Everything has been done before and done better. But gender neutrality opens a new world, with space, horizons, elbow room.


This needs naming. The ists are easy to recognize: designists are to designers as jocks are to athletes. Most athletes are not jocks; most jocks are not athletes; but jocks worship athletes. Likewise designists worship designers. The ism is also simple: its creed being that design is necessary and sufficient.

This ism is accusable on three grounds. It is irresponsible: the designist holds the designer to no responsibilities, even to design—what would it even mean for a designer to sell out? It is amoral: designists respond shamelessly to good design in the service of Soviet propaganda, and the response is more convincing than the shame where Nazi hardware is concerned. It is brutal: good design is like good aim. To praise the shot without asking who got shot and why defines brutality.

Designism is dogmatic mediocrity.

Designers must be dogmatic, because they are responsible for just the part of a thing with the least, or without any, constraints. It is the job of a designer to deflate possibility with orthodoxy, to halve the possible into the good and the gauche, and halve it again, until it contracts to the practicable. (Water cannot boil in a perfectly clean pot; some grain must be present for the bubbles to coalesce around. Likewise, without grains of dogma, there is no inspiration.)

Designers must be mediocre, because design targets the masses—in possession or in aspiration. Designers must be able to trust that their own reactions represent the average reaction. Skilled as they may become, designers cannot design unless they remain mediocre in their souls.

Designers are dogmatic and mediocre, but they are not therefore dogmatic about mediocrity. That is the extra step that makes the ism. Review the creed. If design is necessary, then what is not deduced from the dogmas of design cannot be good. If design is sufficient, then what does not appeal to mediocrity must be a mistake.

Nobody defends bad design; not even I do. But I do not trust design. Bad design vellicates, but good design sedates. Influence, manipulation, and persuasion can deviate our intentions, but not deflect them. They affect, but they do not take. Design harmonizes the things that are intruded into our lives with the patterns of our perception and attention, makes them blend in or fit in. Design camouflages; design encysts.

Of course it is the intrusion, not the design, that is good or bad. Design is analgesic. Analgesics make life better, they give us control over pain; but when the leech injects them, we are fed on without noticing the loss. Designism confuses the mechanism with the thing that uses the mechanism, and applies more leeches for a deeper cure.


Colloquial stoicism – the stony stoic temperament – is a vice, inevitably compounded with sullenness, passive aggression, brooding, and envy. Stoicism as a school of thought – Zeno to Aurelius – has nothing in common with it. The big-S Stoics knew how to be happy and how to weep. But though I am perhaps a Stoic myself, I think the real thing has its own vices.

Necessity is called the mother of invention. Therefore inventors must be necessitous: the inventor is the obverse of the whiner. Stoicism forbids us to dwell on what we cannot change; but if the inability is only temporary, premature acceptance risks making it permanent.

Asking for help is hard to do. We ask for help only when we must; the sting is the prod. We are each pricked with our own miseries, but suffering reaches its maximum when everyone keeps their troubles to themselves. Nature has usefully given us signals of suffering that compel attention: acutely, to cry, cry out, go off; chronically, depression, distraction, misjudgment. Telling your sad story with tears can get you help when telling it plainly would only get commiseration – or worse, some reciprocal confession. Tell another’s sad story with tears and it sounds like bad news; tell it plainly and it sounds like a joke. We hear, the squeaky wheel gets the grease; we should also hear that the silent wheel gives no warning before it breaks.

Suffering is a kind of work. A certain amount of the bad demands a certain amount of sorrow. And, like work, it can be divided. If someone else joins in, it feels like half the work is done. Maybe your arguments will, step by step, chip away at the bulk of sorrow, someone else’s or your own; but by sharing that sorrow, or sharing in it, you cleave it instantly.

Stoicism is strong medicine. Like any strong medicine, it has side effects. Sometimes invention, consolation, and the power of sympathy are helpless; but Stoicism should not be prescribed for lesser evils.


People do not mean what they say; they say what they mean. Taking things literally is the lowest conversational gambit. Conversation is not mathematics. Reductio ad absurdum is a dead end. “Nobody really...” “There is no true...” “Strictly speaking...” – all true, but trivial. Say something nontrivial. The solipsistic machine of logic never surprises. Inhabit a world that you share. See words as things – stubborn stuff, taking effort, substantial even when they are senseless. If you hear only what was said, not what was meant, you have not heard at all. Judging human things on other than human scales is a disease of the mind. Everything is footnoted with mortality and futility. Everything is perishing. Of course the play looks absurd when you watch it from backstage. But the absurdity is not in the play; it is in you. You are watching from the wrong angle.


I play few computer games and no video games. But when Windows was wondrous, the Internet was a rumor, and time spent with the miracle machine was its own justification, I spent far too much time at it.

I played two more than any others: Civilization and Descent. Civilization was and is a popular game; I have nothing new to say about it. But I cannot assume that anyone remembers Descent. It did have sequels; it was a plausible rival to Doom, making it Lilith to Doom's Eve in the ancestry of first-person shooters. Descent was also an FPS: but an FPS with a difference.

Here is the boilerplate. An all-powerful corporation operates offworld mines crewed with robots. In some of these mines the robots have gone wrong – suffered some infection. They have massacred or imprisoned the human staff. You have a heavily armed one-man ship. In one mine after another you must fight your way to the power core, destroy it, and get out before the mine goes up. If you should rescue any hostages along the way, that would be appreciated, but is not required.

None of this hints at how strange and intense playing Descent is.

All your opponents are robots; the hostages wear helmets; except in the cutscenes, nothing like a human face is seen. You are alone from beginning to end.

The mines are not just underground spaces; they are warrens, tangled nests of open and hidden tunnels, labyrinths in three dimensions and zero gravity.

The Wright brothers deserve their fame, not because they were the first to lob a glider into the air with an engine strapped to it, but because they were the first to wrap their heads around the fact that a plane must be controlled in three degrees of freedom – roll, pitch, and yaw. It took about fifty years from the first experiments in powered flight for earthbound minds to make that leap.

In Descent your ship has six degrees of freedom: roll, pitch, yaw, heave, surge, and sway – the combined maneuverability of an airplane, a car, a submarine, and a dream.

This is a puzzle in interface design. Playing Descent with standard gaming equipment is impossible. The default compromise puts two degrees of freedom – pitch and yaw – under the right hand, on the mouse or joystick, and distributes the rest somewhat haphazardly on the keyboard. A good player will change the keybindings to bring all the controls under the left hand, coordinating patterns of motion like musical chords.

(Note that the ship behaves as if it had six separate engines, not one engine with six separate nozzles. With six engines simultaneous motion along multiple axes is a vector sum – which means that to achieve top speed you must move the ship simultaneously along three axes, triangulating your direction. This is called trichording, and mastering it is the only way to actually win the game.)
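The vector arithmetic behind trichording can be sketched in a few lines. (The unit thrust value here is illustrative, not the game's actual numbers, and the sketch ignores any per-axis caps the real engine might impose.)

```python
import math

# Assume, illustratively, that full thrust on any single axis yields unit speed.
axis_speed = 1.0

# Thrusting along one axis: you move at axis_speed.
one_axis = axis_speed

# Thrusting along two axes at once: the per-axis speeds add as perpendicular
# vectors, so the resulting magnitude is the Euclidean norm.
two_axes = math.hypot(axis_speed, axis_speed)   # sqrt(2), about 1.414

# Thrusting along all three axes at once ("trichording"):
three_axes = math.sqrt(3 * axis_speed**2)       # sqrt(3), about 1.732

# Trichording is thus roughly 73% faster than flying straight ahead.
gain = three_axes / one_axis - 1
print(round(gain, 3))
```

The direction of travel is the diagonal of the thrust cube, which is why a trichording player approaches every corridor at a skew.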

Just learning to play the game is mind-expanding; but that is not the intense thing about it. The game is claustrophobic. You fly at high speed down tortuous tunnels not much wider than your ship, whirling like a cell in the bloodstream – in a hostile bloodstream. As you thread each level, you map it; but the map must be presented as a model, not a projection – there is no two-dimensional way to make sense of a level in Descent. If you play for more than an hour I guarantee you will dream of those tunnels. You will see them when you shut your eyes.

Descent is unique for good reason. It has the steep learning curve a game could only get away with when there were few other choices. And some people physically cannot play the game; just being in the same room with someone playing Descent can cause motion sickness. (Really, Descent has seven degrees of freedom – roll, pitch, yaw, heave, surge, sway, and puke.)

The question has been raised: “Can video games be art?” Speaking of video and computer games alike, I say no. Games can contain art, but the game itself is no more art than a museum is art. Games cannot be art to the satisfaction of genteel tradition. They are not art, but they are something. It is arrogant of me to dictate to a genre I do not participate in, but what I want in a game is not a movie or a novel – old wine in new skins. I want something to rewire my brain; I want something to infest my dreams.

Three Horror Stories


“I’m sorry, sir. You can’t leave. The building is under quarantine.”

“Quarantine? For what? I feel fine. Just calm down. You don’t have to point guns at me. What is the welding equipment for?”


“Honey! I’m home!”

“Honey. I’m home.”

“Very funny. What’s for dinner?”

“Very funny. What’s for dinner.”

“Honey, is something wrong?”

“Honey is something wrong.”

“Stop it! Jesus, honey, stop it!”

“Stop it. Jesus honey stop it.”

“Look at me! Honey, I’m right here. Look at me.”

“Look at me. Honey I’m right here. Look at me.”

“Stop it! Stop it, stop it, stop it!”

“Stop it. Stop it stop it stop it.”

“What’s wrong with you?”

“What’s wrong with you.”

“Honey, where are the kids?”


“Thank God I found you. I don’t know what’s happening. All my things are gone. My keys don’t work. Let’s get out of here. Let’s go home.”

“I’m sorry. I think you’ve mistaken me for someone else.”


Evil is the bad things that happened. Regret is the good things that did not happen. Regret troubles us more. Evil is bad in itself, not because of the things that did not happen instead; but we value what might have been more than what was as absolutely as we value the lives of children over the lives of adults. What happened is finite. What did not happen is infinite.

If harboring regret is weak, then we are all weak. We shame regret in others, because we have no way to defy it in ourselves. Meanwhile regret reigns. Dreams beguile hope; fantasies beguile regret. The distinction matters because regret is stronger than hope – hope is finite. Love or money always comes into it somehow. But less often as “This will get me rich” or “This will get me laid” than as “It is not too late to succeed” or “It is not too late to be loved.” This is nontrivial. Hope only wants; regret has something to prove.

There is something heart-softening, something miraculous, about the repair of a regret, about a second chance. We stand naked before these incidents. Forget cures and escapes. A miracle is whatever repairs regret. And even then, even with the miracle, regret can still win. Too much time has been lost. It is too late anyway.

There is a poignancy to cosmologies which transcend regret – say, reincarnation, or the quantum theory of many worlds. We could be born again. We could have been born before. All the connections and chances we recognized but did not make, did not take, they were real – we missed them this time around, but we have made them before, we may take them again. There could be more than one of each of us, flickering above neighboring peaks in one eternal unresolved chord, near as the backs of one another’s mirrors – all arrayed, some strangely better, some strangely worse, some only strangers – but at least one, surely at least one, who was lucky, who was helped, who did not have to, who found a way.

But regret is not for escaping. The worst man is the man without regret. As far as philosophy purges regret, philosophy is bad. Ancient philosophy, the kind that would have us make philosophers of ourselves, is moving and useful, yet there is something in it not quite worthy, even somehow seedy: and this is because it does not recognize or accommodate regret. Here is Polonius. Here is a man learned, wise, crafty, with good taste in poetry – but he lacks regret, and this is enough to make him ridiculous.

Regrets are the broken circuits of actions. There they lie, loose and fatal as wrecks of power lines after a storm wind, still electric. Think of actors, finding in themselves the other people they might have been, getting to know them, putting them on. Think of music – music, after rhythm, runs on regret. (Thus the ancient enmity of music and philosophy.)

Regret is a shadow theodicy with an unknown god. Heaven, the very most that can be hoped for, promises to unite you with your loved ones; but not with the ones you should have loved, not with the ones who should have loved you. Regret is too large a part of this world to be salved by another. The veiled being who afflicted us with regret remains silent. Without evil we could still be ourselves. We are made of regret.

The South

In order, my first observations of the South. The stupendous clouds, like levitating icebergs. The jumble of wealth and poverty. I had seen mansions and hovels, but I had never seen a mansion, a hovel, and a clipped suburban lawn, all along the same mile of highway. The rarity of winter clothing – at temperatures when Northerners would bundle up, Southerners persevered in shorts and T-shirts, as if taking notice of the cold would only encourage it.

I have not gone much farther. I have become a Louisianan, but not really a Southerner. Of course being a Rodriguez helps with that. Here I pass for native unless I bother to deny it; but of course Rodriguez is not so happy a name in the rest of Dixie.

Comparing Louisiana with the South is tricky. Which Louisiana do you mean? Broadly speaking, all Louisiana is divided into four parts. The northwest, the watershed of the Red River, was settled by Americans of the same Scotch-Irish stock as the rest of the South. It is something like East Texas. This is where rock and roll happened. In the less south southeast, the Florida Parishes (Percy’s Feliciana) were settled by the English, and are something like Mississippi. This has been at times the most violent area in the US, host to multigenerational blood feuds. In the southwest, Acadiana was settled by Germans and Acadians; and in the very south southeast, Barataria (including New Orleans) was settled by the French and Spanish. (This is where jazz happened.) These last two regions are both culturally unique to Louisiana and very different from one another. Chances are, when you think of Louisiana, you are thinking of them: aristocatholic decadence and europeasant uninhibition.

But even allowing for regional variation, the interests of Louisiana were never quite the interests of the South. Again and again Louisiana has sacrificed for the South, and the South has taken. In the long view the relationship between Louisiana and the South has the shape of an unhealthy marriage: on Louisiana’s side, all passion and devotion; on the South’s side, something between tolerance and contempt.

In 1864 Louisiana was the wealthiest, most splendid state in the Union; four years later it was the poorest and most desperate, forever. “Often rebuked, but always back returning.” Mississippi, for example, resents Louisiana for stealing the spotlight during Katrina. When they talk about it there is a subtext something like this: We had it bad too, but we didn’t squeal on national television. We hearty salt-of-the-earth goodmen took our knocks, gritted our teeth, and rebuilt while those weirdo slacker heathens whined and sat on their thumbs. One might object something about the different challenges of rebuilding neighborhoods that were swept away in a night vs. rebuilding neighborhoods where the very ground spent weeks steeping in poison; but who can fight myth?

The Civil War is history to Northerners, yesterday to Southerners; something Northerners learn about in school, something Southerners learn about at home.

In this respect I am an atypical Northerner. When I was growing up, my best friend’s father was a Civil War re-enactor. I know the smells of campfire and canvas, of wet wool and black powder. I think Battle Hymn of the Republic was the first song I ever memorized. I am reflexively blue the way Southerners are reflexively gray.

The contradiction in my feelings about Louisiana and about the South is clearest when I look at the war. I have no quarrel with Sherman, the bugbear of the South in general; but I personally dislike Butler. True, the war was particularly bad here – things went on in the Florida Parishes under Butler’s blind eye that read more like Apocalypse Now than Gone With the Wind – but there is no real basis for this distinction; it is simply emotional.

Still, I am not a Louisianan by birth. As an outsider I have to recognize that Louisiana is part of the South. The distinctions are many, but they are only distinctions, not differences. If I want to understand Louisiana I should understand the South better.


Reading in another language poses a recurring doubt. An image, a turn of phrase, an expression pleases you. Is it original to the author, or is it a commonplace of an unfamiliar tradition? Corollary: a minor writer within a tradition may be a major writer in literature generally, if there are no other survivors. (Even the first entrant to the mainstream from some tributary looms as better writers within that tributary never can.) No novel so trashy, no polemic so petty, no puff so creepy, that if some cataclysm obliterated the rest of the accomplishments of our civilization, it would not impress itself on our posterity. In any living literature there is something in common that counts for nothing from within, and everything from without. Lemma: greatness in writing requires you either to enlist an otherwise hidden tradition and impinge with it, or to imply the presence of an alien tradition, to bring some hidden weight to bear behind the cutting edge.


I look at the sky and see nuclear furnaces and transcendental distance. I look at an apple and see molecules, atoms, quanta of energy and motion. I look at a bird and see evolution, metabolism, aerodynamics. Yet looking at the same things others have seen a sphere of fixed stars, an apple-substance, and upholding angels. Some wit observed of the Ptolemaic cosmos: “If they had been right, what would the world have looked like?” The answer, of course, is that without a telescope it would look the same. Most experience is theory-neutral; we can get by believing almost anything about inner workings and ultimate origins. Even in the part of experience that science enables, people can get by without seeing scientifically. (I recall the moment of horror when I was learning to drive and realized there were people with driver’s licenses and Aristotelian intuitions about motion.)

Thus I try from time to time to put on wrong theories. I try to see Ptolemy’s sky spin or Newton’s sky tick, to see elemental matter with Aristotle or vortical matter with Descartes – to grasp and hold the view as long as I can. It interests me that this can be done at all, but properly the interest is in the consequences. The longer I can hold the view, the more I accommodate it. I feel the possibility of otherwise unknown moods; I feel a derangement in my scale of values; I feel a shift in my physical bearing – somehow how much of the universe is above my head matters to how I hold it; somehow the composition of dirt matters to how I stand on it.

The same life, the same world, but in a different key: the names, the patterns, the movements are the same, yet the overall effect is different. I wonder if this was what it was like to live through Einstein. I think of Feynman’s melancholy observation that science is not an infinite project, that it is at least doomed to run out of nontrivia to discover. I think of the commonplace that schools in art and philosophy sometimes end simply because they are too developed, because they require too much time to catch up with, foreclose too many possibilities. But there is no such escape from truth.


Taking a week off.

Short stories

There are too many short stories in the world. For all x, where x is heartbreaking or horrifying, mystifying or magnificent, pitiful or precious, agonizing or astonishing – some short story already satisfies it perfectly.

There is always room for another novel. Novels are too long for perfection. All novels do something wrong, leave some promise unfulfilled. There is always room in the gaps. The novel is fractal; from the right perspective we could see every novel growing out of another – see Don Quijote as the Mandelbrot set, dark among halos.

More obviously, there is always room for another movie. I suspect good directors become so by watching bad movies. Every bad movie has one good character, one good scene, one good shot, one good line. Watch enough of them and these scattered goods add up to the shadow of a great but unmade movie.

But short stories can be perfect. Pry open the novelist and you find a frustrated reader of novels; pry open the director and you find a frustrated watcher of movies. But pry open a short story writer and you find delight and devotion. This is strange. Perfection is so high and so cold a thing; it should quell and silence us, it should make us prefer some open field. What could inspire us to imitate what we cannot rival?

(There is of course an analogy in music – old Bach is perfect, yet inspires composers – but that is only a parallel mystery.)

The point of fiction is its process. No work of fiction worth writing is fully planned. Not that fiction must be unplanned or shapeless; only that, for the writer, fiction is as much discovery as design – a revelation that may be determined, but cannot be predicted.

Imagine a pantograph mounted to a drafting table. Some points are fixed, some points are free. As the draftsman moves the arms the fixed points determine, as translated by the configuration of the machine, the shape traced out by the free points.

We only see the shape traced out when the drawing is done; but every work of fiction starts with something fixed and something free. Fiction is always experiment. The writer fixes certain points. Given the machine that the writer’s knowledge and sympathy are, what shape will be traced?

In the novel the apparatus is somewhat flexible. Time and tedium, research and tendency, blur the resulting image. But in the short story the apparatus is rigid and quick. The shape is distinct. For the writer, the short story is an experiment; for the reader, the short story is a demonstration.

If the point of fiction were for us to tell about the world – to put on masks and do impersonations, to manipulate puppets and cast our voices into their mouths – then fiction would not be worth writing. Essays and treatises can do the telling for us more easily, more completely, and more comprehensibly.

We waste our powers when we exercise them only in being ourselves. To observe is to imitate; to sympathize is to become. We all do this; we simply call it knowing a person – knowing how they look at things, knowing what they would say, how they would say it. This is a basic human faculty, something we take pleasure in doing and cultivating for its own sake. We build music on hearing, art on seeing; we build fiction on knowing.

The Golden Disk

They came from the sky in disks of gold and told us we were not alone. When they walked, they walked like us. When they spoke, they spoke like us. They said they had found our golden disk, our message of music, and they had accepted it. They had come for our Bach, to crown him with glory, to admit him to the fellowship of the music masters of a million worlds. We told them he was dead and they asked us what that meant. When we could bear the pity in their so human faces no longer we asked them to leave and they went. You call my silence a conspiracy. But I have no words.

Weakmindedness Part Four


Don't I know how Socrates condemned writing – how it would give the appearance of wisdom but not the substance – with an Egyptian fable where Thoth presents writing, among other useful inventions, only to have them rejected by the god as pernicious?

This little anecdote – a single paragraph of a long dialog, a minor support to a more complex argument, and the least extended of the many fables which adorn the Phædrus – has acquired a reputation and argumentative weight that its duration cannot support. Here it is in full, after Jowett:

At the Egyptian city of Naucratis, there was a famous old god, whose name was Theuth; the bird which is called the Ibis is sacred to him, and he was the inventor of many arts, such as arithmetic and calculation and geometry and astronomy and draughts and dice, but his great discovery was the use of letters. Now in those days the god Thamus was the king of the whole country of Egypt; and he dwelt in that great city of Upper Egypt which the Hellenes call Egyptian Thebes, and the god himself is called by them Ammon. To him came Theuth and showed his inventions, desiring that the other Egyptians might be allowed to have the benefit of them; he enumerated them, and Thamus enquired about their several uses, and praised some of them and censured others, as he approved or disapproved of them. It would take a long time to repeat all that Thamus said to Theuth in praise or blame of the various arts. But when they came to letters, This, said Theuth, will make the Egyptians wiser and give them better memories; it is a specific both for the memory and for the wit. Thamus replied: O most ingenious Theuth, the parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions to the users of them. And in this instance, you who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

Its prominence is due more to the names involved than to its contents. It is told by Socrates, historical founder of philosophy. It concerns Thoth, mythical founder of esotericism. To Socrates and Plato he was only one Egyptian deity; but intervening tradition names him Thrice-Great Hermes, founder of all Western esoteric traditions (excluding of course the Cabala, descended from the secret revelation of Moses). Here is the author of the Emerald Tablet, condemned by the god for his vain and foolhardy invention of writing! The irony of the anecdote impresses it in the memory.

But consider the context. I will not rehearse the whole of the Phædrus, only call attention to its last section. Phædrus supposes that the speechifying politicians of Athens care so little for their speeches that they must be persuaded to write them down.

Socrates calls him on this absurdity. He contrasts true and false rhetoric – the false rhetoric of politicians, giving set speeches to a lump audience; and the true rhetoric – that is, the dialectic: to understand and address the conditions and abilities of a specific person. Writing, they agree, is a weak thing, because like speechifying it does not accommodate itself to any particular understanding. Like a painting, it has the semblance of life, but remains dumb when questions are asked of it.

Note that a specific kind of writing is meant – persuasive writing – and that a specific fault is diagnosed – generality. Writing that is addressed to a specific person and meant to be replied to, like a letter, is not considered, nor is writing that preserves facts, like histories or treatises. Within the limits of his actual argument Socrates is hard to disagree with. So it is better to persuade in person. So it is a higher skill to persuade someone in particular than to sway a crowd. But even then Socrates recommends writing to hedge against old age. I would add death and distance. He really has no argument against writing at all; it is merely the occasion to express the difference between the rhetorician and the dialectician, which is not specific to writing.

But to show that Socrates did not mean what people think he meant is not to show that what people think he meant is wrong. Surely writing impairs memory? Surely writing gives us the voice of wisdom, without the substance?

We wrongly think of mnemonic feats as proper to pre-literate cultures; but the ars memoriæ shows that memorization only gained in urgency with the invention of writing. Before writing there was simply less to remember. The feats of illiterate mnemonicists in memorizing long epic poems are rightly impressive. But this means that to be remembered for more than one lifetime, knowledge had to be worked up in poetry – no easier then than it is now, whether of the "Sing, goddess" variety or the "Sailor's delight" variety.

By itself writing lets knowledge persist without being remembered, but does not itself retain knowledge. Yes, the knowledge you want is in a book; but that book is chained up in the next country. You may obtain knowledge through reading; but you must bring it back in your head. Printing, not writing, trivialized mnemotechnique.

But may what is said against writing apply then to printing? Consider another anecdote about memory and writing, this one from the Life of Johnson. It is the source of a quote which has become so familiar that it passes for a cliché or a snowclone. Johnson, on his arrival at a house, surveyed the books there. Joshua Reynolds, painter, quipped that his art was superior to writing: he could take in all the paintings in the room at a glance, but Johnson could hardly read all the books. Johnson riposted with a distinction:

Sir, the reason is very plain. Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information upon it. When we enquire into any subject, the first thing we have to do is to know what books have treated of it. This leads us to look at catalogues, and the backs of books in libraries.

Very good, of course; telling; and the standard explanation for the effect of printing: it replaced knowledge of facts with knowledge of the sources of facts. But I am not willing to accept this – I think that Johnson, and we, are wrong.

We need something to compare to language, something else which has gone through the same transition from oral transmission to written form to printing. There is such a comparison in music. Music underwent a transition from aural to written to portable form. Unlike language, its first transition is not prehistoric (as language's tautologically must be); and unlike language, its unwritten forms survive, and may be compared to the written form.

Written music makes no difference at all in how good a musician must be to play well. Of course a musician who plays wholly from written music may not be particularly good at memorizing long pieces or at improvisation. But such inability to memorize may be by choice – some pianists play strictly from sheet music because they think it better to do so than to memorize – and the ability to improvise arises mostly as a consequence of the knowledge of music theory (a silly word; it should really be viewed inversely, as musical physiology – since understanding music is something like learning a physical skill, you need the feel before the anatomy means anything) – the theory required to understand, play, and compose music intelligently. Playing from written music does not prevent a musician from playing with feeling tone, living rhythm, and meaningful phrasing.

True, in principle, one could be able to read music but not to play it – but that would be perverse. To be able to read but not to think would be just as perverse. Likewise to call reading thoughtless is absurd, because written words have no meaning of their own. Their meaning must be reconstructed in the mind; and this reconstruction is a skill, an ability, an act, like playing music. The skills you must have to read at all, and the skills you must have to play at all, are far more difficult and important than the skills whose necessity reading relieves. They blend in their perfection: memorization from a position of ability, understanding the principles behind its changes, is altogether a higher thing than memorization from inability, taking every note on faith.

Do I then excuse the net? Do I consider it as safe as sheet music? If the net were another such step, as from writing to printing, I would.

The net is, nontrivially, a machine for exaggerating its own importance. In its function of making information accessible it is not transformative. Comparisons between the net and the invention of printing – even the invention of writing – are commonplace, but absurd. Those who so compare reveal their dependency on inherited thought patterns, on the Whig history of the intellect.

It is comparable to the wrong idea most people have about industry – there was an Industrial Revolution; and since then, more of the same. But of course modern industry is as far from the old mills and factories as they were from cabin piecework. The inventions of electric lighting and air conditioning alone marked transformations of the factory system as profound as mass production and the assembly line. But somehow we do not notice such changes.

Likewise we do not notice the two most important events in the history of the intellect: the public library and the paperback book. Between these two inventions more information has been made available to any human being than any human lifetime could absorb. They changed a world of scarcity into a world of plenty. The net – a change from plenty to plenitude – is comparatively insignificant. A thousand or a million times too much is still just too much.

But the net does have peculiar advantages; the net is different. It is frictionless, instantaneous, ubiquitous – and consequentially so. Consider drink. Spirits were once the only way to preserve surplus harvests, for storage and transport. Intelligence has had the same use: to distill, compact, and preserve masses of information and experience. Now we can move harvests in refrigerated bulk, preserve them as long and transport them as far, as we like. Of course people still drink; but now drink serves a recreational purpose. When notes are as accessible as narratives, when eyewitnesses are as accessible as reports, there the exercise of intelligence, though perhaps no less useful to sort excess than to defy scarcity, loses its urgency, excusability, and remunerability. The price of the cheapest smartphone is enough to make a "walking encyclopedia."


Everything that has an outside has an inside. (Topological exotica notwithstanding.) So digital augmentation obsoletes intelligence on the outside; what about the inside? Surely, however hard the wind blows against the rest, programmers are safe in the eye of the storm.

So far I have written intelligence and intellect interchangeably; but there is a distinction. It is best made by the example Jacques Barzun relates in his 1959 The House of Intellect. Consider the alphabet: 52 utterly arbitrary signs (26 uppercase, 26 lowercase), with history but without rationale, which make it possible to record and represent anything that can be said. Millions use it, as Barzun observes, who could not have invented it. Intelligence so crystallized, so made current and bankable, is intellect.

The book itself is remarkable – I recommend it for anyone impressed by The Closing of the American Mind, which is tedious and muddled by comparison. But I will not rehearse his argument. The battle is over, the other side won before I was born. Of course intelligence does not require intellect; but without its tools, fending for itself, it moves slowly and wastefully. Relying on its own resources, it becomes too serious; everything is so costly that it cannot take anything lightly. It loses time.

But look to programming! Here is the most purely intellectual discipline which human beings have ever created – as it must be, given (Dijkstra observes somewhere) that computer programs are the most complicated things human beings have learned to make. Programming cannot be learned, it must be adopted; it is a skill not just of action, but of perception.

Some people wonder if programming is an art or a craft. Out of unpretension the usual answer is craft. But this is wrong. A craftsman works with stubborn materials and gets it right the first time. A carpenter who takes three tries to build a table and throws away the first two is not a craftsman. But three tries at a painting or a program is typical. A craftsman is finished when the work is done; an artist's work must be declared finished. Of course the better the programmer, the larger the chunks of the program that come out right in the first place. But the challenge of programming, its possibilities for flow, lie at the point where it is pursued artistically.

In a small way the early web made this art accessible and meaningful for those who did not think of themselves as programmers. In a small way it brought intellect into lives otherwise unaccommodating of it. The primitive character of the technology and the intrepidity of its early adopters both required and welcomed intellect. Anyone who accepted the discipline of HTML, who studied literature to write better fanfiction, who studied the fallacies to call them out in forums or newsgroups – they were embracing intellect as they had never before had the freedom to do.

But this is past. You can, of course, study photography toward a better photostream, writing toward a better blog; but the improvement is along a spectrum. Formerly those unwilling to take trouble were absent; now there is no break to differentiate those who take the trouble from those who do not. We are all on the same footing, because we all possess personalities. When all the tools are provided it takes intelligence, but not intellect, to use them well.

The old web promised to change relations, to establish an invisible college; the new web promises to recapitulate existing relations.

(Note that the succession of Web 2 to Web 1 was not gradual or competitive; 9/11 broke Web 1, and something else had to be created to replace it. The relative political tranquility of the time was as important a precondition as the technology; politics, when passionate, is neurotoxic to intellect – a drop in the reservoir makes a reservoir of poison.)

Programming is writing: a very exact kind of writing, for a highly intelligent, totally unsympathetic, viciously literal reader. The tentativeness and disaggregation which are observed in writing as such may be observed in programming as well. The anarchy of Perl and the onramp of PHP yield to the whiteroom of Python and the velvet rope of Ruby. The hacker, who looks inside everything – with or without permission – yields to the developer, whose job is pasting together blackbox libraries and invoking their “magic.” The cathedral has fallen on the bazaar; the freedom of free software is not free as in beer or as in speech, but free as in sunlight, air, and other unmetered utilities.

Of course computers failed to deliver artificial intelligence. But machine intelligence need not equal human intelligence to render it obsolete. The assembly line never matched the finesse of the workbench. The progress of transportation was not from meat legs to machine legs, but from legs to wheels. Programmers have their own ways of taking a spin.


New technology has previously made intelligence easier without depriving it of value; why should the net be so dangerous to it? I have considered only the means of its attack, not the cause of its enmity.

In truth there is no intrinsic reason why the technology of the net must oppress intelligence; I use it heavily in that faith. But though the enmity is not intrinsic, it is still inborn, because the net was conceived in the pursuit of efficiency. Efficiency, as an ideal and a standard, hates intelligence and wants to destroy it.

What is efficiency? Efficiency means maximum return on minimum effort and minimum expense. But not everything that ensures more effect for less done is a measure of efficiency. In the simplest case the efficiency of one technology may be superseded by another technology that is inherently more efficacious – highly efficient systems of horse-powered mail delivery like the English post or the Pony Express were displaced by steam power in its earliest and least efficient forms. The horse's lineage and the rider's tack were the products of a millennial tradition that allowed horse and rider to operate as one animal. Rail, by comparison, was unreliable and unpredictable; intrepid for engineers, opaque for passengers.

If computers were the successors of paper-based information management, as rail was successor to the stagecoach, there would be no problem. The problem is not inherent to computers or to the net at all; it belongs to culture. Technology does not compel; it enables. It is not the fault of the orgasm button that the mouse starves while pressing it. This was always a weakness in the mouse; the button just gave him the chance to destroy himself as he had always been prepared to destroy himself.

We all suspect, most quietly, that the technological developments of the late twentieth century, and of our own time, let down the rapid pace of progress which developed the developed world; everything is sleeker, everything is faster, more brilliant, but little is new. Remember that Engelbart débuted the net forty years ago, and that even while wonderful things are being done with it, no one can get their act together to swap out its old, nearly used-up address protocol – IPv4 – for the more capacious IPv6. Progress, once an irresistible force, is now hardly felt; in its place are so many immovable objects, so much foot-dragging, second-guessing, and public relations as the art of excuses.

The parsimonious explanation of this state of affairs is that after decades of focus on efficiency, there is no more room left for innovation – not even on the scale of the refrigerator, let alone the scale of the jet engine. Standards for return on capital have become so high that there is no indulgence left for the expensive and unrewarding infancy of really new technology.

But this is too particular an example. The idea is more general, and more familiar; we carry the lesson in the very frame of our bodies. Human beings are terribly inefficient at moving around. We traded the gait of the quadruped, even the lope of the knuckledragger, for the perpetual high-wire act of the biped. For most of our lives we leave two limbs – two perfectly good forelimbs – hanging unused by our sides while our hind limbs waddle precariously. Through idleness and inefficiency we traded forelimbs for arms and hands, and all that hands can do.

Consider Aristotle: “Civilization begins in leisure.” The phrase passes on Aristotle's credit; but in itself it ought to be shocking, if not absurd. Who believes in leisure? We have psychology; and whatever the school all psychology is one in its unwillingness to cede the possibility of leisure. If there is such a thing as leisure, psychology is impossible just as, if there are such things as miracles, physics is impossible. Everything we do must serve some urgent, exigent purpose of the unconscious or of the genes.

The paradox is curious: we have the most refined instruments of leisure ever devised, but we will not even believe in it. We admit to resting, relaxing, blowing off steam, unwinding, recharging, renewing; we admit to solace, consolation, distraction, and escape – but we do not admit to leisure. We suspend work to work more. For this god we admit no dual.

Be productive! Even the charitable impulse beats the drum. But what is it to be productive? It is (to borrow from GTD) always to have some next action to proceed to. Now GTD contains wonderful insights; it is not a cheap slogan bucket, but a system well thought out and capable, when adapted and practiced, of changing lives. I have tried it and, having done so, see work differently. I abandoned it not because it did not work, but because it worked too well. Adopt GTD and you will get more done. I also got less begun. To put it simply, GTD unsticks; but I rely on getting stuck. Getting stuck tells me it is time to leave a project fallow, and work on something else for now.

Of course I do not have to live by deadlines, so I would be an ass to generalize my own experience. But there is so much to do, and so little time. There are only so many open slots; no matter how efficiently you sort, at some point every new claim on your time pushes an old one off into oblivion. I respect GTD, but productivity systems in general strike me as perverse, because they keep the least worthwhile, most predictable claims uppermost, and push the more interesting, amorphous claims down and finally off the edge. It is life laid out in line, without recurrence, without themes, without center. It is the final victory of school over life, when last year's projects are as irrelevant as last year's homework.

But perhaps we are tired. Enough of the open-ended, the uncertain, the unknown. There is something to be said for life that is modest in its ambitions, confident within its limits, at home with itself – at least by way of amor fati, since it is trivially true that the ordinary life cannot be extraordinary. But then we are so tired, and the world is so old. There are so many big ideas; do we really need more? Let the scholars publish and perish; at least it keeps them too busy to preach. And politics – after so much politics nothing is settled, nothing certain – let those who can do nothing else devote themselves to it. Do not trouble us with that frenzy.


What is obsolescence? Plainly the concept is partly technological, partly social. Cars made draft horses and buggy whips obsolete, but there are still mounted police, and buggy whip manufacturers made a smooth transition to manufacturing car parts. (Capital finds its level.) A technology that displaces another never does so completely; no technology is completely interchangeable with another. They all imply their own particular scale of values. Too, there is an index of obsolesciveness, a function of complexity: the axe survived the chainsaw, because it is simple, but the vacuum tube did not survive the transistor, because it is complex. But even these technologies sometimes linger. Investment has inertia, nobody likes to close a factory, and what's so bad about COBOL after all?

Obsolescence happens, it is indeed a force; but it is over-billed. Nothing disappears, nothing ceases to function the day it becomes obsolete. Obsolescence is something that happens to technologies, but it is not the chief or limiting condition of their existence.

(This is vividly though shrilly argued in an interesting but flawed book, The Shock of the Old – shrill, because a cogent case against the doctrine of obsolescence would have to consider not just products but production methods. The author writes as if the axe of 2010 were the same as the axe of 1910 or 1810 – as if there were an equivalence between the product of a blacksmith, an assembly line, and a laser-guided CNC machine. He adduces shipbreakers tearing down the most massive artifacts of industrial civilization with muscle and hand tools, but passes over the fact that this manpower lives by the high technology of the Green Revolution.)

So what do I mean when I say "intelligence is obsolete"? Is its obsolescence real or doctrinal? I think it is doctrinal because, as I said before, intelligence cannot just disappear.

But this sounds circular. What is obsolete? Intelligence is obsolete. What judges obsolescence? Intelligence judges obsolescence – even granting some part of the emotional repulsion of obsolescence to the instinctual distaste for the unfashionable, still the idea operates too broadly not to imply judgment, intelligence, and intellect.

It is this circle one sees, I think, in the odd paradoxes pronounced by the boosters of technology-as-magic, which would solve the bewilderment of technology with more technology, who would loosen the constraints of technology with more constraints, and who would make technology less demanding by ensuring that it follows you everywhere. They recognize technology as something enabling choice and critical judgment in everything except technology, something solvent to ignorance, helplessness, and herd behavior in everything except technology. But I see nothing that makes technology such a fixed point; it seems to me as unstable, as potentially thoughtful, as it encourages everything else to be.

I have heard it argued that technology – to be plain, computers – is becoming more like cars – devices practically magical, in that they are operated more through ritual than understanding. This cuts me a little because in fact I understand very little about cars. But why do I know so little about cars? Not lack of curiosity, but lack of opportunity. I have never been in a position where a car was something I could afford to break – not to mention putting life and limb in hazard. But the evident trend of computers is towards commodification; everything done to de-commodify computing devices is ultimately doomed. Once I had one computer, heavy metal taking up desk space, and that was a serious investment in hardware; now I have several computers, and anywhere I can check out a few repositories, I am at home. It is this possibility, the chance to evolve your own peculiar relationship with technology, one that is cumulative, personal, and free; one that you own and control; one that is a slow growth of the mind into the possibilities afforded by intellectual augmentation, not an accommodation of the mind to the tools and metaphors dictated to you – this possibility which allows intelligence to employ technology, not serve it.

In finding intelligence obsolete the doctrine of obsolescence obsolesces itself. A replacement is in order; some new view of technology is required. But then I am hasty. So obsolescence obsoletes itself; so there is a contradiction, so what? A dissonance implies a resolution but nothing says that it has to resolve. Self-contradiction may even strengthen an idea, by imbuing its holders with faith. The rest is to be seen, or done.

Weakmindedness Part Three


Intelligence has never been in fashion. It has been news for a century that individual intelligence has become obsolete and the future belongs to procedures, teams, and institutions. This is a future that has always just arrived. The lesson is not that intelligence has always appeared to be on the verge of becoming obsolete (although it has); the lesson is that something in society hates intelligence and wants it to be obsolete—needs to believe that it is obsolete.

Obviously in a commercial society we are always worth more for what we can own—or for being owned—than for what we can do. And it is true, regarding the advantages of teamwork over intelligence, that all the inputs into the economy from outside it involve teams and companies. An industrial army keeps the wells flowing, the mines bearing, the fields fruiting. The individual cannot keep up—can have an effect, but only indirectly; the way a programmer controls what a computer does, but cannot do what it does. Naturally the institutions intended to handle these inputs expect to deal with teams and institutions—an affinity that propagates throughout society.

Society, remember, is not a human invention, but a pattern in nature which human beings borrow; a pattern we share with bees and ants and mole rats. It has its own logic, its own dynamics, and its own tendency—a tendency which is always toward the intelligence-free ground state of the hive or colony. For society as such intelligence is an irritant, something to be encapsulated and expelled, like a splinter in the thumb, or cicatrized in place, like a piece of shrapnel.

The greater the intelligence, the more likely it is to destroy its own advantage. Be born with exceptional strength and the best thing you can do with it is to use it yourself. Be born with exceptional intelligence and the best thing you can do with it is to turn it on itself—to figure out how the exceptional part of your intelligence works so you can teach it to others. We all think a little like Einstein now, because we have the maxims he so carefully wrought out, the examples he so carefully related.

Of course human beings are not ants or bees or mole rats and society cannot turn them into zombies. People scheme. This is natural: intelligence atrophies when unused. It is no more comfortable to be flabby in mind than in body. Nor would society want us to be; the software of society needs human speech to run on. Society does not want or need human beings to speak well, but it does need them to speak well enough.

To perfect this balance, we have the job, which stands in relation to the mind as aerobics to the body: it keeps you from becoming flabby, without fitting you for any particular use. Not that jobs are inherently useless; only that, given a minimal denomination of employment (say 9–5), real work is always padded with makework to fill it out fungibly.

Society's capacity to encapsulate intelligence is ultimately limitless but not particularly responsive. A sudden jump in the efficiency of all workers opens a gap, leaves intelligence idle—this may be called, to borrow a phrase, a cognitive surplus. In the last two decades we have seen one open up; remarkable things emerged from it—the web, the blogosphere, the Wikipedia (more later) &c.—and I think we have begun to see it close, soaked up into flash video and social networking.

The centrality which magazines have resumed in online intellectual life is a sign of its decay. Witness the return of the article, the lowest form of writing, opening with an anecdote and closing with a cop-out. Watch the epicene descendants of the intellectual thugs of undead ideologies playing intellectual. Could this be all that it comes to? All our work, all our hope? The same sad cycle of toothless posturing vs. splenetic emission, only this time on screens instead of paper, and with Star Wars references? Well, we had our chance; now we see what we made of it.


I began by comparing strength and intelligence and should justify it. This is difficult because silly ideas pass about both. Witlings think smart people quote cube roots the same way weaklings think strong people are musclebound. The smart people do not obsess over mental math, knowledge of trivia, and the size of their IQs; the strong people do not obsess over diet, dead lifts and the size of their biceps.

The parallel stereotypes are collateral results of the same error: if an ability is not economically rewarding, people pretend it does not exist. To account for records of its existence, some such stereotype will be foisted as its modern descendant.

Strength has not ceased to exist; it is even still useful. All the marvelous mechanical contrivances of modern life are lubricated with human sweat. To give an extreme example, soldiers now ride in APCs, fire low-caliber assault rifles, call in strikes from guns, helicopters, and drones; but a soldier must still be in good shape, because no matter how elaborate the technologies they employ, there always remain interstices that must be filled out the old-fashioned way.

Strength is necessary, but not advantageous. Everywhere, for free, strength is making civilized life possible; but there is nothing strength can do for free that cannot be done without strength for money. The best that strength can do is keep you from failing; you cannot distinguish yourself with it in any but recreational uses. No one earns a profit or a promotion for being strong.

Likewise by intelligence becoming obsolete I do not mean its disappearance, but its insignificance. The intellectual machinery that makes life faster and more brilliant will always need lubrication; but that work will be invisible, underground, and unrewarded. And being taken for granted, it will cease to be believed in.

Westerners allow themselves to be deluded about the actual range of human strength. Of course it is difficult to prove strength in physical teamwork; when working with someone weaker than yourself, you must moderate your own strength to avoid hurting the other person. Say confuse for hurt and the same applies to intellectual teamwork. Insofar as teamwork is expected, insofar as the idea of intelligence is undermined with untestable explanations ("Anyone could do that if they spent ten years learning it"—will you take ten years to find out?)—that far intelligence will simply cease to be thought of, let alone believed in.

For now, intellectual work is still exalted. The gospel of productivity offers to make it accessible to everyone, by debunking its romance, by making it as tractable as "cranking widgets". Somehow intellectual work reduced to cranking widgets comes across more like intellectual work and less like cranking widgets. But this is to be expected. Twentieth century industry enjoyed the prestige of muscularity, virility, and futurity for decades while it chained generations of children, abused generations of women, and poisoned, wore out, and discarded generations of men. Likewise intellectual work may be expected to enjoy the prestige of thoughtfulness long after thinking has been lost from it.


I cannot get away with referencing the idea of cognitive surplus without engaging it. Or more directly: "What about Wikipedia?"

Do consider Wikipedia. But first, forget what you have read about Wikipedia: it is all lies. No one who opines about it understands it. It is almost certain that if you have not participated in it, you not only do not understand it, but are deluded about it.

I should disclose my participation in Wikipedia. I have written two obscure articles and heavily rewritten another. Beside that, my contributions have been limited to weeding vandalism, polishing grammar and expression (the bad to the acceptable; improving the adequate to the excellent would be rude), and filling in gaping omissions—though I do less and less of any of these, largely because there is less and less need. I do have the Wikimedia franchise.

Let me also stipulate that I love the Wikipedia, esteem it as the best service of the net, and consider it the most important and consequential cultural development of the twenty-first century—much more so than, say, social networking or Google. (Though I acknowledge that the Google-Wikipedia relationship is symbiotic.)

Wikipedia is not spontaneous. The typical Wikipedia article is not a lovely crystal of accretive collaboration. It is a Frankenstein's monster of copy stitched together from a dozen donors, a literary teratoma. Wikipedia as a whole is a ravenous black hole that sucks up endless amounts of copy: the out-of-copyright public domain; the direct to public domain; and the unpublishable. Wikipedia is not just the last encyclopedia; it is the Eschaton of all encyclopedias, the strange attractor drawing them on to the end of their history. Wikipedia is the hundred-hearted shambling biomass to which every encyclopedia that has ever existed unwittingly willed its organs. Whole articles from Chambers's Cyclopædia—the very first encyclopedia—turn up inside it completely undigested. As soon as it was born it ate its parent, the Nupedia, then went about seeking whom it might devour. Its greatest conquest was the celebrated 11th edition of the Encyclopædia Britannica—the last great summary deposition of proud world-bestriding European civilization before it passed judgment on itself. (As the article "Artillery" states: "Massed guns with modern shrapnel would, if allowed to play freely upon the attack, infallibly stop, and probably annihilate, the troops making it.")

If you had heard of the Wikipedia but not seen it you might surmise that the kind of people who would edit it would have a technical and contemporary bias, and that trivia would predominate: there exists a band that no one has ever heard; there exists a town in Scotland where nothing has ever happened. And you would be right. But the massive scholarship of the 1911 encyclopedia perfectly counterbalances the bias and the bullshit. The credibility of the Wikipedia as a universal reference was invisibly secured by this massive treasure, excavated as surely and strangely as Schliemann excavated the gold of Troy. Whole articles from the 1911 edition live in Wikipedia, and even where the revision of obsolete information and prejudiced opinion has replaced most of the article, whole paragraphs and sentences remain intact. If while reading an article in Wikipedia you feel a sudden chill in the air, shiver with a thrill of dry irony or scholarly detachment, feel a thin rope of syntax winding itself around your brain—the ghosts of 1911 are speaking.

(The Britannica itself dispensed with this material during its reinvention in 1974.)

Do not rely on me; count. Wikipedia requires the use of templates—boilerplate disclaimer—whenever text from a public-domain source is imported as an article. Using Google's site search we can count them. (Keep in mind that these numbers are severely understated; revisers frequently delete these templates once the article has been brought up to date. Also note that I did these searches some months ago.)

1728  Chambers's Cyclopædia        531
1918  Gray's Anatomy             2 180
1913  Catholic Encyclopedia    ~28 000
1911  Encyclopædia Britannica ~120 000

But there are many more such searches to be done. How significant is the importation? I encourage you to try out Wikipedia's "random article" function, on the left, just above the search box. Here is a random sample of ten articles (disambiguations & lists ignored):

  1. Italian race car driver. Stub
  2. A railway station in Melbourne.
  3. One paragraph on a comics anthology. Stub
  4. Lululaund, an eccentric faux-Bavarian mansion in Hertfordshire, destroyed in 1939. (Linked because curious.)
  5. The definition of "bulk email software." Stub
  6. An a cappella quartet, Anonymous 4, who perform medieval music.
  7. "The Postal Orders of Anguilla." (A digression reveals that a postal order is the British for money order.)
  8. Cumbria.
  9. Brief biography and long bibliography of The Most Reverend Marcelo Sánchez Sorondo, Argentinian, Catholic, philosopher, theologian, and historian of philosophy.
  10. A 2006 single.

Or another:

  1. 1946 college football season.
  2. Jean-Marie Roland de la Platière, the Girondist. 1911
  3. Munching square. Stub
  4. "Personal name."
  5. Cincinnati Redlegs' 1956 season. Stub
  6. Pope Simplicius. Stub
  7. A South African judge. Stub
  8. Torbanite, a variety of coal. Stub
  9. A Scottish football club. Stub
  10. A church on the Isle of Wight. Stub

Or another:

  1. The London Midland and Scottish Railway.
  2. An administrative district in east-central Poland. Stub
  3. A village in north-central Poland. Stub
  4. A game from The Price is Right.
  5. A Gibraltarian politician.
  6. Johann Nikolaus Forkel, German musician. 1911
  7. The Mumbai Amateur Radio Society.
  8. King Amoghabhuti.
  9. An episode of House, M.D.
  10. An office in the Indian National Congress. Stub

The second source is material that is directly released into the public domain: press releases, government documents, think tank reports. A business has two vital functions: to do something and to let people know what it is doing. The latter has always provided great opportunities to the Wikipedia, which is always searching for things people might want to know about. Wikipedia has a magpie eye, and press releases are very shiny.

(Wikipedia also picks up shiny stuff where it shouldn't—it's always distasteful to click through a reference link and find that the text of the reference, a private website, evidently not in the public domain, has simply been copied—but then again Wikipedia saves some good copy this way that would otherwise be lost to link rot.)

Beside the brook of business runs the massive river of text thrown off by the metabolism of the military-industrial-governmental complex, large amounts of which are explicitly in the public domain, other parts of which are too evidently of public interest to be neglected. Wikipedia soaks up this stuff like a Nevada golf course.

The third source is sophisticated yet unpublishable material. If you have ever despaired at the thought of how much intellectual energy goes into a school report, written to be read once by someone who learns nothing from it, know that the Wikipedia is there to catch all these efforts. (Or was, rather, before it began to inform them.) I suspect that the preponderance of original articles on the Wikipedia were actually executed as assignments or requirements of teachers or employers. Wikipedia strains the plankton from the sea of busywork like the baleen of a whale.

What is Wikipedia? Wikipedia is a sublimely efficient method of avoiding redundant effort. Wikipedia is write once, remember forever. Wikipedia is make do and mend. Wikipedia is reuse and recycle.

Weakmindedness Part Two


To make his ideas more tractable Engelbart tells a story of two characters: "You", addressed in the second person, and "Joe", who is experienced with augmentation and is giving You a demonstration.

First Joe shows off his workstation. His desk has two monitors, both mounted at slight angles to the desk—"more like the surface of a drafting table than the near-vertical picture displays you had somehow imagined." He types into a split keyboard, each half flanking a monitor, poising him over his screens as he works.

The ergonomics are impeccable. Consider how tradition forces us into a crick-necked and hunched-shouldered position whenever we sit at keyboard—how it literally constrains us. Judge how much more of the way you work is so ruled.

To introduce the capabilities of the system Joe edits a page of prose. Lo—every keystroke appears instantly onscreen! When he reaches the end of the line, carriage return is automatic! He can delete words and sentences, transpose them, move them around, make them disappear—"able to [e]ffect immediately any of the changes that a proofreader might want to designate with his special marks, only here the proofreader is always looking at clean text as if it had been instantaneously retyped." He can call up definitions, synonyms and antonyms "with a few quick flicks on the keypad." He can define abbreviations for any word or string of words he employs, whenever he wants, and call them up with substrings or keychords. But you have fonts, yes?
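The abbreviation trick, at least, is easy to sketch in modern terms. What follows is a minimal illustration only, with hypothetical names, not Engelbart's design:

```python
# A minimal sketch of user-defined abbreviation expansion, in the spirit
# of Joe's editor. The names and behavior here are hypothetical.

abbreviations = {}

def define(abbrev, expansion):
    """Register an abbreviation for any word or string of words."""
    abbreviations[abbrev] = expansion

def expand(text):
    """Replace each registered abbreviation with its expansion."""
    return " ".join(abbreviations.get(word, word) for word in text.split())

define("aug", "augmentation")
define("ws", "workstation")
expanded = expand("the aug ws")  # "the augmentation workstation"
```

A real editor would expand on substrings and keychords as Joe describes; a dictionary lookup is only the kernel of the idea.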

In short the capabilities of Joe's editor are somewhat above those of a word processor and somewhat below those of a programmer's editor.

Here we find one of the problems with Engelbart's vision. It is easier to augment entities than procedures. If in the context of typing a word is just the procedure of hitting a certain sequence of letters, then in the near term, it actually costs energy and time to change your procedure to typing the first few letters of a word and letting the editor expand it. It requires you to think of the word as an entity, not an operation. For most people, this is impractical.

Consider the abacus. The frictionless operation of mental arithmetic seems easier than the prestidigitation the abacus requires. So the most practiced algorists are faster than the fastest abacists. (Sometimes, as in the famous anecdote of Feynman and the Japanese abacist, the algorist's superior knowledge of mathematics will simplify the problem to triviality.) But of course the abacus is easier to learn than mathematics, and for a given amount of practice the average abacist will be much faster than the average algorist.

There are abacus virtuosos who can calculate faster than the abacus can be fingered, who calculate moving their fingers on an empty table, but who cannot calculate at all without moving their fingers—slaves to a skeuomorph.

Skeuomorph is a term from architectural criticism. It names, usually to pejorate, a building's needless imitation of the traces of old-fashioned construction methods and materials it does not employ. But skeuomorphs are not all bad—classical architecture, in its system of pillars, pilasters, entablatures &c. is a representation in stone of the engineering of wooden buildings.

The experience of using a computer is run through with skeuomorphs—the typewriter in the keyboard, the desktop in the screen, the folders on the hard drive, the documents they contain. Through a cultural process they dictate—even to those with little experience of the analog originals—how computers are to be used. Even as they let us in they hold us back.

So it might seem that new user-interface concepts are necessarily a liberation. They should be; but so far they have not been. In particular the recent generation of portable devices have moved farther toward the pole of the abacus—easy for competence, limited for mastery. As they break down walls, they close doors. As they are more and more physical and spatial, they are less and less symbolic.


Now we come to the last part of Joe's demonstration, and leave the familiar behind. The talk from here on is of arguments, statements, dependencies, and conceptual structures. Joe explains that he uses his workstation to produce arguments, composed of statements, arranged sequentially, but not serially. Quote:

This makes you recall dimly the generalizations you had heard previously about process structuring limiting symbol structuring, symbol structuring limiting concept structuring, and concept structuring limiting mental structuring. You nod cautiously, in hopes that he will proceed in some way that will tie this kind of talk to something from which you can get the "feel" of what it is all about.

He warns you not to expect anything impressive. What he has to show you is the sum of a great many little changes. It starts with links: not just links between one document and others, but links within the document—links that break down sentences like grammatical diagrams, links that pin every statement to its antecedents and consequences--

[T]he simple capabilities of being able to establish links between different substructures, and of directing the computer subsequently to display a set of linked substructures with any relative positioning we might designate among the different substructures.

Note that this does not just mean creating links—it means creating bidirectional linkages, linkages that have kinds, linkages that can be viewed as structures as well as followed.

Here is a skeuomorph: the index or cross-reference in the hyperlink. The hyperlink as we know it is hyper only in the most trivial sense. You cannot even link a particular part of one document to a particular part of another document unless the target is specially prepared with anchors to hold the other end of the link. Except inside of a search engine (and the futile experiment of trackbacks), a link contributes no metadata to its target. The web has no provisions for back-and-forth or one-to-many links, let alone for, say, uniquely identified content or transclusions.

These are not particularly exotic or difficult ideas; to understand how they might have worked—what the net might have been—look at Nelson's Xanadu.

Understand that the problems of the web—the problems smug commentators vaunt as unpredictable consequences of runaway innovation—these problems were not only thought of, but provided for, before the web existed. Understand that the reason we have these problems anyway is the haphazard and commercially-driven way the web came to be. Understand that the ways in which the web destroys value—its unsuitability for micropayment, for example—and the profits the web affords—like search—are consequences of its non-architecture. If the web had been designed at all, music, news, writing would be booming in proportion with their pervasiveness. Instead we have Google. Instead we have a maze where the only going concern it allows is selling maps.

I should stipulate that the net—the Internet—and the web—the World Wide Web—are different things. The net is the underlying technology, the pipes; the web is one way of using that technology. Email, for example, is part of the net, but not part of the web; the same is true of BitTorrent or VoIP. At one level the answer to the question "Is Google making us stupid?" is "No, the web is making us stupid—wiring our brains into the web is just Google's business model."

Certainly it is easy to defend the web against this kind of heckling. Nothing succeeds, as they say, like success. The guy in the back of the audience muttering how the guys on stage are doing it wrong is always and rightfully the object of pity. And there is no way back to the whiteboard; the web is, and it is what it is.

But we must remember that it could have been different—if only to remind us that we will have more choices. What has happened was not inevitable; what is predicted is not inexorable.


The way Joe describes the effect of augmented symbol-structuring is worth quoting in full:

I found, when I learned to work with the structures and manipulation processes such as we have outlined, that I got rather impatient if I had to go back to dealing with the serial-statement structuring in books and journals, or other ordinary means of communicating with other workers. It is rather like having to project three-dimensional images onto two-dimensional frames and to work with them there instead of in their natural form.

This, of course, again recalls the question, this time in its intended meaning: "Is Google making us stupid?" It is not a problem I have, but people do seem to suffer from it, so I can name the tragedy—we have just enough capacity for symbol-structuring on the web to break down some people's tolerance for linear argument, but not enough to give them multidimensional ways of constructing arguments. The web is a perfectly bad compromise: it breaks down old patterns without capacitating new ones.

Joe moves on from symbol structuring to process structuring. Here the methods resemble those used for symbol structuring—they are links and notes—but they are interpreted differently. A symbol structure yields an argument; a process structure answers the question—"What next?"

And this, of course, recalls "Getting Things Done"—it is the complement of the next action. GTD, however, takes the abacist approach. Adherents of GTD manipulate physical objects or digital metaphors for physical objects—inboxes and TO DO lists—and reduce them to a definite series of next actions. Ultimately this is all any process structure can disclose—"What do I do now?"—and for most tasks something like GTD is adequate.

If there is a hole in your roof that leaks, the fact of the leak will remind you to fix the hole. The process is self-structuring: you will fix it or get wet. So, to a lesser extent, is the letter on your desk. But the email in your inbox—if you expect to answer it, you must find some way to point up its urgency. But why should this be? Why can't you instruct the computer to be obtrusive? Why can't digital tasks structure themselves?

They can; but they don't, because there is no metaphor for it. The abacist email program has an inbox; following the metaphor, to get something out of the inbox, you must do something with it. More algorist email programs, like Mutt or Gnus, invert the problem—once you have read a message, unless you explicitly retain it, it disappears. This approach is vastly more efficient, but it has no straightforward paperwork metaphor, so it is reserved for the geeks.
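The contrast between the two disciplines can be sketched in the abstract. These classes are hypothetical illustrations, not models of any particular mail program:

```python
# A sketch of the two inbox disciplines. Hypothetical classes only,
# not the actual behavior of Mutt, Gnus, or any other mail program.

class AbacistInbox:
    """Paperwork metaphor: a message lingers until you act on it."""
    def __init__(self):
        self.messages = []
    def receive(self, msg):
        self.messages.append(msg)
    def read(self, msg):
        pass  # reading changes nothing; the message stays put
    def archive(self, msg):
        self.messages.remove(msg)  # you must act to clear the inbox

class AlgoristInbox:
    """Inverted discipline: a message read but not retained disappears."""
    def __init__(self):
        self.messages = []
        self.retained = set()
    def receive(self, msg):
        self.messages.append(msg)
    def retain(self, msg):
        self.retained.add(msg)
    def read(self, msg):
        if msg not in self.retained:
            self.messages.remove(msg)  # read and gone
```

In the algorist inbox the default is disposal; retention, the rarer need, is what costs a keystroke.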

Or again: why can't you develop processes in the abstract? GTD is itself a single abstract workflow. Bloggers are forever writing up their own workflows. Why can't your computer analyze your workflow, guide you through it, help you refine it? Why can't it benchmark your activities and let you know which ones absorb the most time for the least return? Why is there no standard notation for workflows? Of course programmers have something like this in their editors and IDEs; but probably you do not.

Augmenting Human Intellect is worth reading but I am done with it. If I have been successful I have disenchanted you with the net—disenchanted literally; broken the spell it casts over minds that should know better. If I have been successful you understand that the net as you know it is not inevitable; that its future is not fixed; that its role is not a given; that its usefulness for any particular purpose is subject to judgment and discretion.

Weakmindedness Part One

[At some point I noticed that many of my essays contained digressions about the net's effect on our minds and lives. These digressions were faults as digressions, but the topic was interesting, so I began collecting these scraps with the plan of welding them into a coherent essay. The result was on a larger scale than I had expected: around 10 000 words in 13 parts.

This poses a problem of formatting. In the past I have run essays in several parts over as many days. But such a bombardment would be ridiculous. Instead I have gathered these 13 essays into 4 parts, to be run at the usual near-weekly intervals. In this way readers may plausibly have time to digest and follow the argument.]


Is intelligence obsolete? I mean: do the digital technologies of intellectual augmentation make exceptional intelligence obsolete, in the same way that the mechanical technologies of physical augmentation made exceptional strength obsolete? Not "is the net making us stupid?" but "does the net make it as impossible to be stupid, as the grid makes it impossible to be powerless?" I want to be free to develop the argument without suspense, so I will conclude now: yes—with reservations about the concept of obsolescence.

I say intellectual augmentation to reference Douglas Engelbart's 1962 Augmenting Human Intellect. I will use this book as the scaffold for the first part of my argument. Anyone who has investigated the origins of the net will know Vannevar Bush's 1945 As We May Think, a prophecy of the Internet in light-table and microfilm. Augmenting Human Intellect is explicitly an attempt to show how Bush's vision could be made workable in electronic form. It is not a marginal document; six years after it was published the author, head of the Augmentation Research Center at the Stanford Research Institute, gave what is now known as the "Mother of All Demos", where he débuted, among other things, the mouse, email, and hypertext.

Some of the possibilities that Augmenting Human Intellect proposes have been fulfilled; some have been overtaken; some have failed; and some remain untried. The interesting ones are the untried.

The relevant part of Augmenting Human Intellect begins with Engelbart's description of the system he used to write the book--edge-notched cards, coded with the book or person from whom the content was derived. I say "content" because, as anyone who has attempted to maintain a system of notes allowing small, disparate pieces of information to be conserved usefully will realize, it is impossible to strictly distinguish thoughts and facts—the very act of selecting a fact for inclusion implies a thought about it. Engelbart calls these thought-facts kernels. He would arrange these cards into single-subject stacks, or notedecks. In the book he summarizes the frustrations of creating a memo using these cards—the lack of a mechanism for making associations (links, that is, but in both directions), the tedium of copying the links out, the confusion of keeping track of what linked to what. He considers some mechanical system for leaving trails between cards and for copying them, but objects:

It is plain that even if the equipment (artifacts) appeared on the market tomorrow, a good deal of empirical research would be needed to develop a methodology that would capitalize upon the artifact process capabilities. New concepts need to be conceived and tested relative to the way the "thought kernels" could be knitted together into working structures, and relative to the conceptual presentations which become available and the symbol-manipulation processes which provide these presentations.

He proceeds to further object that by the time some such mechanical system could be perfected, electronics would be better suited to the job. And we are off.


But let us first pause and consider the concept of the kernel. Engelbart is explicit that the kernel itself represents a "structure of symbols" subject to mapping. Yet for purposes of inclusion in a larger symbolic structure the kernel must be treated as smooth and integral. Every symbolic structure is composed of smooth kernels, yet all kernels are spiky. This tension can be dealt with in more than one way.

Imagine a series of levels in the computer's awareness of a kernel's internal structure. (These levels are my own coinage for this essay; they probably correspond to a known mathematical structure, but it seemed easier to reinvent than research.)

Level 0 is the simplest possible organization of kernels, an anonymous jumble where the only thing the system knows about the content of a kernel is that it exists. An unsorted inbox or a directory of temp files are level 0 structures.

Level 1 is a simple filing system: the system knows exactly one thing about the kernel: which folder it belongs to.

A level 2 structure approaches the limits of what is possible with paper: the system knows that a single kernel can be in several places at once. A physical file system where documents are both uniquely identified in some master reference, and where copies of these documents are present in multiple folders, is a level 2 system; so is double-entry bookkeeping, where the kernels have no internal structure at all. Tagging is the computerized equivalent; in library science, faceted classification.

A level 3 structure is possible with paper—using edge-notched cards and pin-sort operations—but in practice it requires a computer. In a level 3 structure the system can retrieve a kernel conditionally, based on which folders it is in—Edward Teach was a pirate and a bearded man, so he is found in the pseudo-folder "Pirates and bearded men". Basically a level 3 structure knows enough about its kernels to do basic set operations—the Venn diagram level.

(Full-text search is a level 3 structure, where each kernel is indexed by every word that it contains.)
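These set operations are trivial to express in code. A sketch, with hypothetical folders and contents:

```python
# Level 3 retrieval as set operations over folders. The folders and
# their contents here are hypothetical.

folders = {
    "pirates": {"Edward Teach", "Anne Bonny"},
    "bearded men": {"Edward Teach", "Charles Darwin"},
}

# The pseudo-folder "Pirates and bearded men" is stored nowhere;
# it is computed as an intersection whenever it is asked for.
pirates_and_bearded_men = folders["pirates"] & folders["bearded men"]

# Union and difference supply the rest of the Venn diagram.
pirates_or_bearded_men = folders["pirates"] | folders["bearded men"]
clean_shaven_pirates = folders["pirates"] - folders["bearded men"]
```

Edward Teach turns up in the intersection; Anne Bonny only in the difference.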

In a level 4 structure folders are themselves included in folders. This sounds trivial—manila folders inside hanging folders, or sub-directories in directories—but that would be a level 1 system. In a level 4 structure, each folder is included in multiple folders, including itself. This is impractical on paper, and just practical using symlinks—the POSIX file system is arguably a level 4 structure. But the only nontrivial level 4 structure in operation is Google's search engine, which is smart enough that it could retrieve Edward Teach for "Pirate who did not shave"—it is capable of including the folder "bearded" in the folder "not shaving." (It doesn't, alas.)

Understand the distinction between levels 3 and 4: at level 3 the maximum number of searches that can retrieve a kernel is a function of the number of folders and the length of the search query. At level 4, because folders can be retrieved by other folders—not just pseudo-folders—the maximum number of searches is what it would be if every folder included every pseudo-folder.

But why stop at level 4? Computers could do better. Level 0 jumbles kernels; level 1 puts them in folders; level 2 puts them in multiple folders; level 3 retrieves them with set operations on folders; level 4 puts folders in multiple folders. The simplest case of a level 5 structure would be the search: "searches that return Edward Teach." This sounds useless until you consider the benefit of gradually narrowing a search by doing one search after another, each on the results of the last. When you think about it this seems very hierarchical—like folders inside folders. With a level 5 structure you could retrieve multiple searches the way you search multiple tags. For example, suppose you want to know how the Queen Anne's Revenge was rigged; and suppose there is a website about this. Now, of course, you could search "Queen Anne's Revenge rigging", or search "Queen Anne's Revenge" and then rigging, or "sailing ship" and then "Queen Anne's Revenge"—but no luck. But suppose you could search "searches that return Queen Anne's Revenge + searches that return sailing ship + rigging". Now if it happened that the Queen Anne's Revenge was a French frigate, and French frigates of the early 1700s were rigged in a characteristic way, this search could shake that connection out.

(Note that a stricto sensu blog is actually a sort of level 5 search—a blog that collates links on a certain set of topics is a handmade equivalent to a search on searches that return those topics.)
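The level 5 idea can itself be sketched. Here a search is modeled as a set of pages, and "searches that return a page" as a second-order search; every page name and term below is hypothetical:

```python
# A toy model of a level 5 structure: searches that can themselves be
# searched. All page names and index terms here are hypothetical.

pages = {
    "qar-history": {"Queen Anne's Revenge", "Blackbeard"},
    "qar-wreck": {"Queen Anne's Revenge", "French frigate"},
    "frigate-rigging": {"French frigate", "sailing ship", "rigging"},
}

def search(term):
    """A search is just the set of pages indexed under a term (level 3)."""
    return {name for name, terms in pages.items() if term in terms}

def searches_that_return(page):
    """Level 5: the set of terms whose searches retrieve a given page."""
    all_terms = set().union(*pages.values())
    return {t for t in all_terms if page in search(t)}

# "Searches that return Queen Anne's Revenge + rigging": gather every
# term that co-occurs with the ship, then look for "rigging" among the
# pages those terms retrieve.
qar_pages = search("Queen Anne's Revenge")
related = set().union(*(searches_that_return(p) for p in qar_pages))
hits = {p for t in related for p in search(t) if "rigging" in pages[p]}
# The rigging page never names the ship, yet the chained search finds it.
```

This is brute force over a toy index; the point is only that second-order retrieval is expressible with the same set machinery as level 3.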

But enough speculation. The point of proposing these levels is to show that ascending levels in the sophistication of search are independent of the internal structure of a kernel. Even the most sophisticated searches possible, and those not yet possible, are still a matter of folders and contents. And putting a kernel into one or many folders is not the same as parsing it.

Indeed parsing is an impediment to search, not an aid. Certainly it is good when we search on Edward Teach and are directed to a "Blackbeard" chapter in a book about pirates. For our purposes the book as a whole is a kernel; perhaps the chapter is too—we may print it out, or find the book and photocopy it, or collect it screenshot by screenshot. But how far can we break it down? It may be true that half of the chapter is not about Blackbeard at all—this paragraph tells about the town where he was born, this paragraph tells about his ship, this paragraph tells about his competitors—and it may be true that of the paragraphs about him half the sentences are not about him—here is a thought on the nature of brutality, here is a thought about why bearded men are threatening. Yet if you isolate only the sentences that are about Blackbeard specifically, the result is gibberish. You wanted something about Blackbeard? Well, this chapter as a whole is about Blackbeard—but no part of it is actually about him.

This is why PIM (personal information management) is hard: there need not exist any necessary connection between a kernel's internal structure and the folders where it is classified. The relationship is unpredictable. This unpredictability makes PIM hard—hard not as in difficult, but hard as in insoluble, in a way that is revealing of some human limitation. Classification is irreducibly contingent.

Accordingly PIM is always tendentious, always fallible, and not always comprehensible outside of a specific context, or to any other but a specific person. And the most useful abstract classifications are not the best, but the most conventional—like the Dewey Decimal system, whose only advantage was that of existing.


Now I return to Engelbart and his "quick summary of relevant computer technology." It would be tempting to pass over this section of Augmenting Human Intellect as pointless. We know computers; we know what they can do. The introductions necessary in 1962 are needless for us. And true, some of it is funny.

For presenting computer-stored information to the human, techniques have been developed by which a cathode-ray-tube (of which the television picture tube is a familiar example) can be made to present symbols on their screens of quite good brightness, clarity, and with considerable freedom as to the form of the symbol. Under computer control an arbitrary collection of symbols may be arranged on the screen, with considerable freedom as to relative location, size, and brightness.

But we should look, because Augmenting Human Intellect predates a great schism in the design and use of computers. Two sects emerged from that schism. The technologies that Engelbart thought would make augmentation practical largely ended up in the possession of one side of this schism—the losing side.

Engelbart thinks of computers as symbol-manipulating engines. This strikes one in the face when he talks about simulation:

[T]hey discovered that the symbol structures and the process structures required for such simulation became exceedingly complex, and the burden of organizing these was a terrific impediment to their simulation research. They devised a structuring technique for their symbols that is basically simple but from which stem results that are very elegant. Their basic symbol structure is what they call a "list," a string of substructures that are linked serially in exactly the manner proposed by Bush for the associative trails in his Memex—i.e., each substructure contains the necessary information for locating the next substructure on the list. Here, though, each substructure could also be a list of substructures, and each of these could also, etc. Their standard manner for organizing the data which the computer was to operate upon is thus what they term "list structuring."

This is in reference to IPL-V. A few paragraphs later he writes, with frustrating understatement, "Other languages and techniques for the manipulation of list structures have been described by McCarthy"—followed by eight other names. But McCarthy's is the name to notice; and his language, LISP (LIst Processing) would become the standard tool for this kind of work.

There is a famous essay about the schism, Richard Gabriel's "Lisp: Good News, Bad News, How to Win Big", source of the maxim "Worse is Better." It contrasts two styles of programming: the "MIT style" of the MIT AI Lab with the "New Jersey style" of Bell Labs. Software as we know it—based around the C programming language and the Unix family of operating systems—derives from the New Jersey style. Gabriel's essay actually characterizes the New Jersey style as a virus.

But how does this difference in style relate to the concept of "symbolic structures"? Lisp is focused on the manipulation of symbolic structures; and Lisp is the language best suited for this because Lisp code is in fact itself a symbolic structure—every Lisp program is an exhaustive description of itself. C-like languages are a set of instructions to a compiler or interpreter. The instructions are discrete and serial. The resultant symbolic structure exists only when the program is run. (Note that the difference is one of tendency, not of capacity. It is an axiom that any program can be written in any programming language that has the property of being Turing-complete —as all these languages are.)
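The difference is easiest to feel in miniature. Below, a program is represented as a nested list—a symbolic structure that can be inspected, transformed, or evaluated. This is a toy in Python, not Lisp, and every name in it is hypothetical:

```python
# A program as a symbolic structure: nested lists standing in for
# Lisp-style expressions. A toy illustration, not an implementation of Lisp.

import operator

OPS = {"+": operator.add, "*": operator.mul}

def evaluate(expr):
    """Evaluate a nested-list expression such as ["+", 1, ["*", 2, 3]]."""
    if not isinstance(expr, list):
        return expr  # a bare number evaluates to itself
    op, *args = expr
    return OPS[op](*(evaluate(a) for a in args))

program = ["+", 1, ["*", 2, 3]]
# Because the program is data, it can be examined before it is run:
inner_operator = program[2][0]  # "*"
result = evaluate(program)      # 7
```

A C-like language describes the same computation as instructions to a compiler; the structure exists only while the program runs, and cannot be picked apart this way from within.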

Why C-like languages won may be summarized by a point of jargon. In Lisp-like languages anything besides manipulating symbolic structures—say, writing a file to disk or rendering it to the screen—is called a side effect. What are side effects to Lisp programmers are the business of C programmers. So instead of symbols and trails we deal with files and windows and websites, and have to hold the structures they are supposed to fit into in our own heads.

Coincidentally in housebuilding the quick and dirty style is called "New Jersey framing." The standard way is to frame a wall as a unit—a grid of studs nailed through their ends at right angles—then stand it up and nail it into place. Jersey framing instead maneuvers each stud into its final position before toenailing it in place—that is, hammering nails in at an angle. The standard style is more secure, but involves delay and forethought; New Jersey framing is less secure, but makes constant progress. New Jersey programming has essentially the same advantages and drawbacks.