I look at the sky and see nuclear furnaces and transcendental distance. I look at an apple and see molecules, atoms, quanta of energy and motion. I look at a bird and see evolution, metabolism, aerodynamics. Yet looking at the same things others have seen a sphere of fixed stars, an apple-substance, and upholding angels. Some wit observed of the Ptolemaic cosmos: “If they had been right, what would the world have looked like?” The answer, of course, is that without a telescope it would look the same. Most experience is theory-neutral; we can get by believing almost anything about inner workings and ultimate origins. Even in the part of experience that science enables, people can get by without seeing scientifically. (I recall the moment of horror when I was learning to drive and realized there were people with driver’s licenses and Aristotelian intuitions about motion.)

Thus I try from time to time to put on wrong theories. I try to see Ptolemy’s sky spin or Newton’s sky tick, to see elemental matter with Aristotle or vortical matter with Descartes – to grasp and hold the view as long as I can. It interests me that this can be done at all, but properly the interest is in the consequences. The longer I can hold the view, the more I accommodate it. I feel the possibility of otherwise unknown moods; I feel a derangement in my scale of values; I feel a shift in my physical bearing – somehow how much of the universe is above my head matters to how I hold it; somehow the composition of dirt matters to how I stand on it.

The same life, the same world, but in a different key: the names, the patterns, the movements are the same, yet the overall effect is different. I wonder if this was what it was like to live through Einstein. I think of Feynman’s melancholy observation that science is not an infinite project, that it is at least doomed to run out of nontrivia to discover. I think of the commonplace that schools in art and philosophy sometimes end simply because they are too developed, because they require too much time to catch up with, foreclose too many possibilities. But there is no such escape from truth.


Taking a week off.

Short stories

There are too many short stories in the world. For all x, where x is heartbreaking or horrifying, mystifying or magnificent, pitiful or precious, agonizing or astonishing – some short story already satisfies it perfectly.

There is always room for another novel. Novels are too long for perfection. All novels do something wrong, leave some promise unfulfilled. There is always room in the gaps. The novel is fractal; from the right perspective we could see every novel growing out of another – see Don Quijote as the Mandelbrot set, dark among halos.

More obviously, there is always room for another movie. I suspect good directors become so by watching bad movies. Every bad movie has one good character, one good scene, one good shot, one good line. Watch enough of them and these scattered goods add up to the shadow of a great but unmade movie.

But short stories can be perfect. Pry open the novelist and you find a frustrated reader of novels; pry open the director and you find a frustrated watcher of movies. But pry open a short story writer and you find delight and devotion. This is strange. Perfection is so high and so cold a thing; it should quell and silence us, it should make us prefer some open field. What could inspire us to imitate what we cannot rival?

(There is of course an analogy in music – old Bach is perfect, yet inspires composers – but that is only a parallel mystery.)

The point of fiction is its process. No work of fiction worth writing is fully planned. Not that fiction must be unplanned or shapeless; only that, for the writer, fiction is as much discovery as design – a revelation that may be determined, but cannot be predicted.

Imagine a pantograph mounted to a drafting table. Some points are fixed, some points are free. As the draftsman moves the arms the fixed points determine, as translated by the configuration of the machine, the shape traced out by the free points.

We only see the shape traced out when the drawing is done; but every work of fiction starts with something fixed and something free. Fiction is always experiment. The writer fixes certain points. Given the machine that the writer’s knowledge and sympathy are, what shape will be traced?

In the novel the apparatus is somewhat flexible. Time and tedium, research and tendency, blur the resulting image. But in the short story the apparatus is rigid and quick. The shape is distinct. For the writer, the short story is an experiment; for the reader, the short story is a demonstration.

If the point of fiction were for us to tell about the world – to put on masks and do impersonations, to manipulate puppets and cast our voices into their mouths – then fiction would not be worth writing. Essays and treatises can tell for us more easily, completely and comprehensibly.

We waste our powers when we exercise them only in being ourselves. To observe is to imitate; to sympathize is to become. We all do this; we simply call it knowing a person – knowing how they look at things, knowing what they would say, how they would say it. This is a basic human faculty, something we take pleasure in doing and cultivating for its own sake. We build music on hearing, art on seeing; we build fiction on knowing.

The Golden Disk

They came from the sky in disks of gold and told us we were not alone. When they walked, they walked like us. When they spoke, they spoke like us. They said they had found our golden disk, our message of music, and they had accepted it. They had come for our Bach, to crown him with glory, to admit him to the fellowship of the music masters of a million worlds. We told them he was dead and they asked us what that meant. When we could bear the pity in their so human faces no longer we asked them to leave and they went. You call my silence a conspiracy. But I have no words.

Weakmindedness Part Four


Don't I know how Socrates condemned writing – how it would give the appearance of wisdom but not the substance – with an Egyptian fable where Thoth presents writing, among other useful inventions, only to have it rejected as pernicious by the god-king Thamus?

This little anecdote – a single paragraph of a long dialogue, a minor support to a more complex argument, and the least extended of the many fables which adorn the Phædrus – has acquired a reputation and argumentative weight that its brevity cannot support. Here it is in full, after Jowett:

At the Egyptian city of Naucratis, there was a famous old god, whose name was Theuth; the bird which is called the Ibis is sacred to him, and he was the inventor of many arts, such as arithmetic and calculation and geometry and astronomy and draughts and dice, but his great discovery was the use of letters. Now in those days the god Thamus was the king of the whole country of Egypt; and he dwelt in that great city of Upper Egypt which the Hellenes call Egyptian Thebes, and the god himself is called by them Ammon. To him came Theuth and showed his inventions, desiring that the other Egyptians might be allowed to have the benefit of them; he enumerated them, and Thamus enquired about their several uses, and praised some of them and censured others, as he approved or disapproved of them. It would take a long time to repeat all that Thamus said to Theuth in praise or blame of the various arts. But when they came to letters, This, said Theuth, will make the Egyptians wiser and give them better memories; it is a specific both for the memory and for the wit. Thamus replied: O most ingenious Theuth, the parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions to the users of them. And in this instance, you who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

Its prominence is due more to the names involved than to its contents. It is told by Socrates, historical founder of philosophy. It concerns Thoth, mythical founder of esotericism. To Socrates and Plato he was only one Egyptian deity; but intervening tradition names him Thrice-Great Hermes, founder of all Western esoteric traditions (excluding of course the Cabala, descended from the secret revelation of Moses). Here is the author of the Emerald Tablet, condemned by the god for his vain and foolhardy invention of writing! The irony of the anecdote impresses it in the memory.

But consider the context. I will not rehearse the whole of the Phædrus, only call attention to its last section. Phædrus supposes that the speechifying politicians of Athens care so little for their speeches that they must be persuaded to write them down.

Socrates calls him on this absurdity. He contrasts true and false rhetoric – the false rhetoric of politicians, giving set speeches to a lump audience; and the true rhetoric – that is, the dialectic: to understand and address the conditions and abilities of a specific person. Writing, they agree, is a weak thing, because like speechifying it does not accommodate itself to any particular understanding. Like a painting, it has the semblance of life, but remains dumb when questions are asked of it.

Note that a specific kind of writing is meant – persuasive writing – and that a specific fault is diagnosed – generality. Writing that is addressed to a specific person and meant to be replied to, like a letter, is not considered, nor is writing that preserves facts, like histories or treatises. Within the limits of his actual argument Socrates is hard to disagree with. So it is better to persuade in person. So it is a higher skill to persuade someone in particular than to sway a crowd. But even then Socrates recommends writing to hedge against old age. I would add death and distance. He really has no argument against writing at all; it is merely the occasion to express the difference of the rhetorician and the dialectician, which is not specific to writing.

But to show that Socrates did not mean what people think he meant is not to show that what people think he meant is wrong. Surely writing impairs memory? Surely writing gives us the voice of wisdom, without the substance?

We wrongly think of mnemonic feats as proper to pre-literate cultures; but the ars memoriæ shows that memorization only gained in urgency with the invention of writing. Before writing there was simply less to remember. The feats of illiterate mnemonicists in memorizing long epic poems are rightly impressive. But this means that to be remembered for more than one lifetime, knowledge had to be worked up in poetry – no easier then than it is now, whether of the "Sing, goddess" variety or the "Sailor's delight" variety.

By itself writing lets knowledge persist without being remembered, but does not itself retain knowledge. Yes, the knowledge you want is in a book; but that book is chained up in the next country. You may obtain knowledge through reading; but you must bring it back in your head. Printing, not writing, trivialized mnemotechnique.

But may what is said against writing apply then to printing? Consider another anecdote about memory and writing, this one from the Life of Johnson. It is the source of a quote which has become so familiar that it passes for a cliché or a snowclone. Johnson, on his arrival at a house, surveyed the books there. Joshua Reynolds, painter, quipped that his art was superior to writing: he could take in all the paintings in the room at a glance, but Johnson could hardly read all the books. Johnson riposted with a distinction:

Sir, the reason is very plain. Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information upon it. When we enquire into any subject, the first thing we have to do is to know what books have treated of it. This leads us to look at catalogues, and the backs of books in libraries.

Very good, of course; telling; and the standard explanation for the effect of printing: it replaced knowledge of facts with knowledge of the sources of facts. But I am not willing to accept this – I think that Johnson, and we, are wrong.

We need something to compare to language, something else which has gone through the same transition from oral transmission to written form to printing. There is such a comparison in music. Music underwent a transition from aural to written to portable form. Unlike language, its first transition is not prehistoric (as language's tautologically must be); and unlike language, its unwritten forms survive, and may be compared to the written form.

Written music makes no difference at all in how good a musician must be to play well. Of course a musician who plays wholly from written music may not be particularly good at memorizing long pieces or at improvisation. But such inability to memorize may be by choice – some pianists play strictly from sheet music because they think it better to do so than to memorize – and the ability to improvise arises mostly as a consequence of knowing music theory (a silly word; it should really be viewed inversely, as musical physiology, since understanding music is something like learning a physical skill – you need the feel before the anatomy means anything) – the theory required to understand, play, and compose music intelligently. Playing from written music does not prevent a musician from playing with feeling tone, living rhythm, and meaningful phrasing.

True, in principle, one could be able to read music but not to play it – but that would be perverse. To be able to read but not to think would be just as perverse. Likewise to call reading thoughtless is absurd, because written words have no meaning of their own. Their meaning must be reconstructed in the mind; and this reconstruction is a skill, an ability, an act, like playing music. The skills you must have to read at all, and the skills you must have to play at all, are far more difficult and important than the skills whose necessity reading relieves. They blend in their perfection: memorization from a position of ability, understanding the principles behind its changes, is altogether a higher thing than memorization from inability, taking every note on faith.

Do I then excuse the net? Do I consider it as safe as sheet music? If the net were another such step, as from writing to printing, I would.

The net is, nontrivially, a machine for exaggerating its own importance. In its function of making information accessible it is not transformative. Comparisons between the net and the invention of printing – even the invention of writing – are commonplace, but absurd. Those who so compare reveal their dependency on inherited thought patterns, on the Whig history of the intellect.

It is comparable to the wrong idea most people have about industry – there was an Industrial Revolution; and since then, more of the same. But of course modern industry is as far from the old mills and factories as they were from cabin piecework. The inventions of electric lighting and air conditioning alone mark transformations of the factory system as profound as mass production and the assembly line. But somehow we do not notice such changes.

Likewise we do not notice the two most important events in the history of the intellect: the public library and the paperback book. Between these two inventions more information has been made available to any human being than any human lifetime could absorb. They changed a world of scarcity into a world of plenty. The net – a change from plenty to plenitude – is comparatively insignificant. A thousand or a million times too much is still just too much.

But the net does have peculiar advantages; the net is different. It is frictionless, instantaneous, ubiquitous – and consequentially so. Consider drink. Spirits were once the only way to preserve surplus harvests, for storage and transport. Intelligence has had the same use: to distill, compact, and preserve masses of information and experience. Now we can move harvests in refrigerated bulk, preserve them as long and transport them as far, as we like. Of course people still drink; but now drink serves a recreational purpose. When notes are as accessible as narratives, when eyewitnesses are as accessible as reports, there the exercise of intelligence, though perhaps no less useful to sort excess than to defy scarcity, loses its urgency, excusability, and remunerability. The price of the cheapest smartphone is enough to make a "walking encyclopedia."


Everything that has an outside has an inside. (Topological exotica notwithstanding.) So digital augmentation obsoletes intelligence on the outside; what about the inside? Surely, however hard the wind blows against the rest, programmers are safe in the eye of the storm.

So far I have written intelligence and intellect interchangeably; but there is a distinction. It is best made by the example Jacques Barzun relates in his 1959 The House of Intellect. Consider the alphabet: 52 utterly arbitrary signs (26 uppercase, 26 lowercase), with history but without rationale, which make it possible to record and represent anything that can be said. Millions use it, as Barzun observes, who could not have invented it. Intelligence so crystallized, so made current and bankable, is intellect.

The book itself is remarkable – I recommend it for anyone impressed by The Closing of the American Mind, which is tedious and muddled by comparison. But I will not rehearse his argument. The battle is over, the other side won before I was born. Of course intelligence does not require intellect; but without its tools, fending for itself, it moves slowly and wastefully. Relying on its own resources, it becomes too serious; everything is so costly that it cannot take anything lightly. It loses time.

But look to programming! Here is the most purely intellectual discipline which human beings have ever created – as it must be, given (Dijkstra observes somewhere) that computer programs are the most complicated things human beings have learned to make. Programming cannot be learned, it must be adopted; it is a skill not just of action, but of perception.

Some people wonder if programming is an art or a craft. In seeking unpretension the usual answer is craft. But this is wrong. A craftsman works with stubborn materials and gets it right the first time. A carpenter who takes three tries to build a table and throws away the first two is not a craftsman. But three tries at a painting or a program is typical. A craftsman is finished when the work is done; an artist's work must be declared finished. Of course the better the programmer, the larger the chunks of the program that come out right in the first place. But the challenge of programming, its possibilities for flow, lie at the point where it is pursued artistically.

In a small way the early web made this art accessible and meaningful for those who did not think of themselves as programmers. In a small way it brought intellect into lives otherwise unaccommodating of it. The primitive character of the technology and the intrepidity of its early adopters both required and welcomed intellect. Anyone who accepted the discipline of HTML, who studied literature to write better fanfiction, who studied the fallacies to call them out in forums or newsgroups – they were embracing intellect as they had never before had the freedom to do.

But this is past. You can, of course, study photography toward a better photostream, writing toward a better blog; but the improvement is along a spectrum. Formerly those unwilling to take trouble were absent; now there is no break to differentiate those who take the trouble from those who do not. We are all on the same footing, because we all possess personalities. When all the tools are provided it takes intelligence, but not intellect, to use them well.

The old web promised to change relations, to establish an invisible college; the new web promises to recapitulate existing relations.

(Note that the succession of Web 2 to Web 1 was not gradual or competitive; 9/11 broke Web 1, and something else had to be created to replace it. The relative political tranquility of the time was as important a precondition as the technology; politics, when passionate, is neurotoxic to intellect – a drop in the reservoir makes a reservoir of poison.)

Programming is writing: a very exact kind of writing, for a highly intelligent, totally unsympathetic, viciously literal reader. The tentativeness and disaggregation which are observed in writing as such may be observed in programming as well. The anarchy of Perl and the onramp of PHP yield to the whiteroom of Python and the velvet rope of Ruby. The hacker, who looks inside everything – with or without permission – yields to the developer, whose job is pasting together blackbox libraries and invoking their “magic.” The cathedral has fallen on the bazaar; the freedom of free software is not free as in beer or as in speech, but free as in sunlight, air, and other unmetered utilities.

Of course computers failed to deliver artificial intelligence. But machine intelligence need not equal human intelligence to render it obsolete. The assembly line never matched the finesse of the workbench. The progress of transportation was not from meat legs to machine legs, but from legs to wheels. Programmers have their own ways of taking a spin.


New technology has previously made intelligence easier without depriving it of value; why should the net be so dangerous to it? I have considered only the means of its attack, not the cause of its enmity.

In truth there is no intrinsic reason why the technology of the net must oppress intelligence; I use it heavily in that faith. But though the enmity is not intrinsic, it is still inborn, because the net was conceived in the pursuit of efficiency. Efficiency, as an ideal and a standard, hates intelligence and wants to destroy it.

What is efficiency? Efficiency means maximum return on minimum effort and minimum expense. But not everything that ensures more effect for less done is a measure of efficiency. In the simplest case the efficiency of one technology may be superseded by another technology that is inherently more efficacious – highly efficient systems of horse-powered mail delivery like the English post or the Pony Express were displaced by steam power in its earliest and least efficient forms. The horse's lineage and the rider's tack were the products of a millennial tradition that allowed horse and rider to operate as one animal. Rail, by comparison, was unreliable and unpredictable; intrepid for engineers, opaque for passengers.

If computers were the successors of paper-based information management, as rail was successor to the stagecoach, there would be no problem. The problem is not inherent to computers or to the net at all; it belongs to culture. Technology does not incur, it enables. It is not the fault of the orgasm button that the mouse starves while pressing it. This was always a weakness in the mouse; the button just gave him the chance to destroy himself as he had always been prepared to destroy himself.

We all suspect, most quietly, that the technological developments of the late twentieth century, and of our own time, let down the rapid pace of progress which developed the developed world; everything is sleeker, everything is faster, more brilliant, but little is new. Remember that Engelbart débuted the net forty years ago; and that even while wonderful things are being done with it, no one can get their act together to swap out its old, nearly used-up address protocol – IPv4 – for the more capacious IPv6. Progress, once an irresistible force, is now hardly felt; in its place are so many immovable objects, so much foot-dragging, second-guessing, and public relations as the art of excuses.

The parsimonious explanation of this state of affairs is that after decades of focus on efficiency, there is no more room left for innovation – not even on the scale of the refrigerator, let alone the scale of the jet engine. Standards for return on capital have become so high that there is no indulgence left for the expensive and unrewarding infancy of really new technology.

But this is too particular an example. The idea is more general, and more familiar; we carry the lesson in the very frame of our bodies. Human beings are terribly inefficient at moving around. We traded the gait of the quadruped, even the lope of the knuckledragger, for the perpetual high-wire act of the biped. For most of our lives we leave two limbs – two perfectly good forelimbs – hanging unused by our sides while our hind limbs waddle precariously. Through idleness and inefficiency we trade forelimbs for arms and hands, and all that hands can do.

Consider Aristotle: “Civilization begins in leisure.” The phrase passes on Aristotle's credit; but in itself it ought to be shocking, if not absurd. Who believes in leisure? We have psychology; and whatever the school all psychology is one in its unwillingness to cede the possibility of leisure. If there is such a thing as leisure, psychology is impossible just as, if there are such things as miracles, physics is impossible. Everything we do must serve some urgent, exigent purpose of the unconscious or of the genes.

The paradox is curious: we have the most refined instruments of leisure ever devised, but we will not even believe in it. We admit to resting, relaxing, blowing off steam, unwinding, recharging, renewing; we admit to solace, consolation, distraction, and escape – but we do not admit to leisure. We suspend work to work more. For this god we admit no dual.

Be productive! Even the charitable impulse beats the drum. But what is it to be productive? It is (to return to GTD) always to have some next action to proceed to. Now GTD contains wonderful insights; it is not a cheap slogan bucket, but a system well thought out and capable, when adapted and practiced, of changing lives. I have tried it and having done so see work differently. I abandoned it not because it did not work, but because it worked too well. Adopt GTD and you will get more done. I also got less begun. To put it simply, GTD unsticks; but I rely on getting stuck. Getting stuck tells me it is time to leave a project fallow, and work on something else for now.

Of course I do not have to live by deadlines, so I would be an ass to generalize my own experience. But there is so much to do, and so little time. There are only so many open slots; no matter how efficiently you sort, at some point every new claim on your time pushes an old one off into oblivion. I respect GTD, but productivity systems in general strike me as perverse, because they keep the least worthwhile, most predictable claims uppermost, and push the more interesting, amorphous claims down and finally off the edge. It is life laid out in line, without recurrence, without themes, without center. It is the final victory of school over life, when last year's projects are as irrelevant as last year's homework.

But perhaps we are tired. Enough of the open-ended, the uncertain, the unknown. There is something to be said for life that is modest in its ambitions, confident within its limits, at home with itself – at least by way of amor fati, since it is trivially true that the ordinary life cannot be extraordinary. But then we are so tired, and the world is so old. There are so many big ideas; do we really need more? Let the scholars publish and perish; at least it keeps them too busy to preach. And politics – after so much politics nothing is settled, nothing is certain – let those who can do nothing else devote themselves to it. Do not trouble us with that frenzy.


What is obsolescence? Plainly the concept is partly technological, partly social. Cars made draft horses and buggy whips obsolete, but there are still mounted police, and buggy whip manufacturers made a smooth transition to manufacturing car parts. (Capital finds its level.) A technology that displaces another never does so completely; no technology is completely interchangeable with another. They all imply their own particular scale of values. Too, there is an index of obsolesciveness, a function of complexity: the axe survived the chainsaw, because it is simple, but the vacuum tube did not survive the transistor, because it is complex. But even these technologies sometimes linger. Investment has inertia, nobody likes to close a factory, and what's so bad about COBOL after all?

Obsolescence happens, it is indeed a force; but it is over-billed. Nothing disappears, nothing ceases to function the day it becomes obsolete. Obsolescence is something that happens to technologies, but it is not the chief or limiting condition of their existence.

(This is vividly though shrilly argued in an interesting but flawed book, The Shock of the Old – shrill, because a cogent case against the doctrine of obsolescence would have to consider not just products but production methods. The author writes as if the axe of 2010 were the same as the axe of 1910 or 1810 – as if there were an equivalence between the product of a blacksmith, an assembly line, and a laser-guided CNC machine. He adduces shipbreakers tearing down the most massive artifacts of industrial civilization with muscle and hand tools, but passes over the fact that this manpower lives by the high technology of the Green Revolution.)

So what do I mean when I say "intelligence is obsolete"? Is its obsolescence real or doctrinal? I think it is doctrinal because, as I said before, intelligence cannot just disappear.

But this sounds circular. What is obsolete? Intelligence is obsolete. What judges obsolescence? Intelligence judges obsolescence – even granting some part of the emotional repulsion of obsolescence to the instinctual distaste for the unfashionable, still the idea operates too broadly not to imply judgment, intelligence, and intellect.

It is this circle one sees, I think, in the odd paradoxes pronounced by the boosters of technology-as-magic, who would solve the bewilderment of technology with more technology, who would loosen the constraints of technology with more constraints, and who would make technology less demanding by ensuring that it follows you everywhere. They recognize technology as something enabling choice and critical judgment in everything except technology, something solvent to ignorance, helplessness, and herd behavior in everything except technology. But I see nothing that makes technology such a fixed point; it seems to me as unstable, as potentially thoughtful, as it encourages everything else to be.

I have heard it argued that technology – computers, to be plain – is becoming more like cars: devices practically magical, in that they are operated more through ritual than understanding. This cuts me a little because in fact I understand very little about cars. But why do I know so little about cars? Not lack of curiosity, but lack of opportunity. I have never been in a position where a car was something I could afford to break – not to mention putting life and limb in hazard. But the evident trend of computers is towards commodification; everything done to de-commodify computing devices is ultimately doomed. Once I had one computer, heavy metal taking up desk space, and that was a serious investment in hardware; now I have several computers, and anywhere I can check out a few repositories, I am at home. It is this possibility, the chance to evolve your own peculiar relationship with technology, one that is cumulative, personal, and free; one that you own and control; one that is a slow growth of the mind into the possibilities afforded by intellectual augmentation, not an accommodation of the mind to the tools and metaphors dictated to you – this possibility which allows intelligence to employ technology, not serve it.

In finding intelligence obsolete the doctrine of obsolescence obsolesces itself. A replacement is in order; some new view of technology is required. But then I am hasty. So obsolescence obsoletes itself; so there is a contradiction, so what? A dissonance implies a resolution but nothing says that it has to resolve. Self-contradiction may even strengthen an idea, by imbuing its holders with faith. The rest is to be seen, or done.