The ending is the most important part. Not in all arts: pictures are endless. Not even in music, where skipping ahead is bad faith. But in writing the ending is definitive. You do not know how a sentence is meant, whether you are being told or being asked, until you reach the end.

The problem with endings is that they are all a kind of punctuation, artificial because the criteria of a good ending are abstract. A speech sums up; a sonnet turns; a story rounds off when something recurs. The key determines the cadence.

I have good reasons to prolong the Ruricolist; I feel how much I owe to it. But I must admit that the Ruricolist is over. The essay series has its natural term. These have been long years and I am different from the man I was when I began. His clothes no longer fit.

My intentions are that the Ruricolist will remain online; comments will remain open; and I will continue, from time to time, to revise my work here. When I have news of other projects, I will post it.

In conversation we are improvisers. For our improvisation to succeed, we must be willing to take whatever comes, trusting the outcome as we trust one another. We never say all we meant to say, or everything we think of, but that is the point: as much as we spend, we leave enriched. Now, at the end, I can affirm what I wrote at the beginning: I wrote for myself—not for friends, not for followers, not for an audience, not for posterity. This was my end of a conversation. And since this was a conversation, it must end as all conversations do, with a kind of aposiopesis, when the bill arrives, the sun comes up, the car stops, and suddenly we part.


Darkness is shadow. The golden shadow of the incandescent bulb; the stainless shadow of the fluorescent; the quivering shadow of the gaslight (seek it where it lives yet; deep down in the oven, the pilot flame is the last gaslight). The footlight, the searchlight, live to dazzle, are stingy with shadows; but most generous of all is firelight, flicker and blaze, casting long shadows that strut and stride, the shadow players whose performance has never been commanded.

You will read that, for our ancestors, the succession of the long, dark nights of winter, solaced only by the wavering fire, relieved only by brief treks through a twilight world stifled with snow, gave on to a kind of trance, and that it is to the visions of the long winter that all superstitions may be traced. Now, the tropics have their own superstitions, but certainly the mind abhors a vacuum, and where there is nothing to be perceived, something will be imagined. So, night by night, they overlaid the everburning stars with bold constellations.

Darkness is night. Morning and evening circle, glooming and gloaming, matutinal rise intersecting crepuscular fall at the liminal coordinate where the spectrum unfolds. Twilight that never ends while the night lights burn: mercurial moonlight over the fields, mercury vapor skyglow over the cities, and the noctilucent auroboros rattling the northern sky, over forests quiet and umbrageous as the shadow lands. The stones under your feet strike triboluminescent sparks. Fireflies constellate with the stars. Far ahead a porchlight shines, its generous intent as harborless as a lighthouse.

Darkness is night, darkness is shadow; the one thing darkness is not is the absence of light. The retina is stretched like a drumhead, strung with tense nerves that toll every photon, an inchoate kaleidoscope so sensitive that it need only be pressed behind closed eyes to coruscate with phosphenes like the scintillas of cold light that kindle the eddies of the troubled sea. What light conceals from us, what we see in caves and face-down on the pillow is not darkness but eigengrau, the eyes’ gray, lightened by the twitches of our dreaming nerves. Seeing eyes have never seen full dark. Darkness is not even the opposite of light; it is only a mood of light.


Every year we made a day trip to visit my great-uncle Denny. He lived with his wife in the backwoods of Pennsylvania, in a house older than the United States with wine-dark rafters and a cellar like a cave. The water cycle ran from pitcher pump to outhouse. The old house stood on a rambling property, all deep green, crossed by an abandoned and overgrown railroad.

Denny was an old man, a veteran of Iwo Jima with a steel plate in his head. If I understood his stories correctly he was one of those who raised the first flag there, the little one. Of the second flag, he said “If we’d known, we all would have gone up.”

He had no interest in children. Perhaps I was oblivious; perhaps I was annoyed at being ignored; but when the subject of WWII came up, somehow, I parroted what I had been taught in school, where we had social studies instead of history: that the bombing of Hiroshima was a needless atrocity, only compounded by the spiteful destruction of Nagasaki—all typically American brutality.

That got his attention. He informed me that the only reason he was alive was because of the bomb. Had the war continued he would have been among the first on the beaches of Japan. He would surely have died. He thanked God for Truman and his bomb.

Of course I shut up, but I was more confused than enlightened. We can number the dead and number the saved, but these numbers are not like other numbers. We can count them, but we cannot calculate with them.

Ask: who, exactly, died to save whom? If this were a question of math there would be proportions to work out. “You, lover, your man died to save ten lives. You, father, your daughter died to save three and a half lives. You, mother, your baby died to save half a life. You, child, your dog died to save one twentieth of a life.”

And there would be responsibility to assign, givers to match with receivers. “You, survivor, see the face, read the name, of the man who lost his life to save your life and five other lives. Now you must remember him.”

But there are no such calculations. These numbers only look like numbers. They are lives. They are incommensurable.

It is true but trivial that I cannot put myself in Truman’s place; if I were Truman himself, I would have done as Truman did, and if Truman were someone else, he would have faced someone else’s choice, not Truman’s. But looking at the numbers we must remember that this is not an equation; there are no factors. These numbers only look like numbers. Nothing cancels out. There is no algebra of forgiveness, no solution for innocence.


What makes a loser? There is nothing special about him. Being dull, awkward, foolish, and feckless only makes him unlucky, and being unlucky is not enough to make a loser. What makes him a loser is not that he loses, but that he does not know why he loses.

Losers have always been with us, since Thersites at least, but of course they are rare in hierarchical societies, where everyone is born with a part to play, where every kind of failure is keyed by coordinates of folly and vice. Being a loser is idiopathic, because losers are inconsequential; they do not even have anyone to let down.

He may have abilities, even remarkable ones, but he spoils them. He stops too soon, or he goes too far, and all his good intentions, all his hard work, come to nothing. Worse, just by being the one who has them, he makes his own abilities ridiculous. For his skills, we call him a geek; for his wealth, we call him vulgar; for his commitments, we call him pretentious. He is not a loser because he never wins; he is a loser because even when he wins, he loses.

What makes him a loser is not his mistakes but how he doubles them. Defying logic, he spans the extremes without ever touching the center, impaling himself on both horns of every dilemma, robbing Scylla to pay Charybdis.

He is the one who has nothing to say, but never gets to the point; the one who can’t take a hint, and can’t take a joke; the one who never learns, and the one who never gets over it; the one who can’t talk around girls, and babbles around women; the one who can’t express himself, and the one who gives everything away; the one who never takes a chance until he throws everything away.

In short the loser is a bad actor playing himself. Nothing feels real to him unless he is playing to the balcony. In the beginning, he tries too hard; and every time someone leaves, he tries a little harder. In the end the seats are empty and there he is, alone on the stage, the singularity where tragedy and comedy meet: the clown who does not know he is a clown.

The Traveler

“You haven’t gone yet? You should go. It’s the right time of year, too. It’s wonderful with all that space, and those views, and not a tourist in sight. I wish everybody could go.

“What? What did I…? Oh. That’s an oxymoron, isn’t it? Like ‘nobody goes there anymore, it’s too crowded.’ But that really is how it goes. Whenever we find something that’s really a jewel, people just descend on it until they suffocate it. I can’t even go to Venice anymore. I swear it’s sinking out of embarrassment.

“If we were smart, really smart, we wouldn’t blab about things like that. We’d organize a guild or a secret society. We’d have apprenticeships and an initiation. Seven years of studying languages, and etiquette, and survival skills to become an Honorable Traveler with the right to visit. Plus another ten years of study before you get to take a camera.

“Instead, we love it so much we have to tell somebody about it. And they have to tell somebody and we all love it to death.

“Maybe that’s too harsh. I don’t want to seem elitist. The fact is I pity the tourists even more than I pity the places they ruin. They have no way out. They cross oceans and continents but they pack their boredom, and ignorance, and petulance.

“I don’t know why they bother, unless it’s because they still have that instinct that tells them growing up means leaving home. But no matter how far they go, they drag home along behind. It’s not even travel; it’s just a change of venue.”


Notes repeat themselves, higher or lower, at the interval we now call an octave. Double or halve the speed at which a string vibrates and the sound, in some sense which is as convincing as it is gratuitous, remains the same. And between notes in simple ratios, most of all the interval we call the fifth, there is a sweetness sweeter and more dizzying than wine.

Between the octave and the fifth, the world almost seems made for us. This appearance is deceiving. The world is not just unfair, but rigged. Chances are you know what it is to pick up part A, and part B, never having doubted they went together, only to find that they don’t quite fit. The world is like that. Between the octave and the fifth there is a small but shattering discrepancy we call the Pythagorean comma.
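The discrepancy is plain arithmetic, and a few lines of Python (an illustration of the ratios, not anything from the essay itself) make it exact: twelve pure fifths overshoot seven octaves by the ratio 531441/524288, about 1.4 percent, or roughly a quarter of a semitone.

```python
from fractions import Fraction

# Twelve pure fifths should, ideally, land on the same pitch as
# seven octaves. They do not: the leftover ratio is the comma.
twelve_fifths = Fraction(3, 2) ** 12   # 531441 / 4096
seven_octaves = Fraction(2, 1) ** 7    # 128
comma = twelve_fifths / seven_octaves  # 531441 / 524288

print(comma)         # 531441/524288
print(float(comma))  # about 1.0136, roughly a quarter of a semitone sharp
```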

The comma of Pythagoras is as bad as the flaming sword. It means that music, even music, must always be compromised, whether by a diet of a few safe notes, or an intricate microtonal dissection of the octave, or a distortion of the fifth.

This distortion (the Western approach) goes by the name of temperament. Since the Middle Ages the West has known and used several exquisite systems of temperament for particular purposes, but in the last century they gave way to a single system brutal in its simplicity. Equal temperament deals with the Pythagorean comma the way the senators dealt with Romulus, when they caught him in a sudden fog, hacked him to pieces and, walking away with the pieces hidden under their togas, called it apotheosis.
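What the hacking amounts to can be sketched in the same spirit (again only an illustration, assuming nothing beyond the ratios above): equal temperament narrows every fifth from the pure 3/2 to 2^(7/12), a flattening of about two cents per fifth, so that twelve of them close exactly on seven octaves and the comma disappears into the joints.

```python
import math

pure_fifth = 3 / 2           # the just fifth, ratio 3:2
equal_fifth = 2 ** (7 / 12)  # the equal-tempered fifth: seven equal semitones

# Each tempered fifth is narrowed by one twelfth of the comma,
# about 1.96 cents, too small to hear in isolation.
flattening = 1200 * math.log2(pure_fifth / equal_fifth)

# Spread over twelve fifths, the comma vanishes: the circle of fifths
# closes on seven octaves (2**7 == 128), up to float rounding.
circle = equal_fifth ** 12
```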

(Are the jitters of the West, its frantic days and restless nights, the symptoms of our addiction to this uneasy music, the Pythagorean comma working its way deeper and deeper under our skins?)

Of all things with value, music is the purest, the most abstract. If even music must compromise, what hope is there for anything else? None at all; but do not take it too hard. Consider poor Pythagoras, twice betrayed, once by music, once by math. Traumatic as Gödel, Turing, Russell, and Tarski were for us, how much worse was it for him, the philosopher who thought number was truth and music was beauty, only to find that numbers could be irrational and music sheltered wolves.

The last century was not, as it boasted, the moment when thought ran up against the limits of certainty and perfectibility. From the very beginning, the whole arc from faith to doubt, from certainty to anxiety, has always been with us in Pythagoras and his comma.

The Entrepreneur

“Did I ever tell you about my grandfather? Of course I didn’t. He was nobody. He spent his whole life at the factory, retired, boom, dropped dead. That’s the one thing I’ve been afraid of my whole life, turning out like him, a nobody with nothing to show for himself, nothing to show he ever existed except for a chip of stone at the veterans’ cemetery. Which one? I don’t know. I have his medals around here somewhere.

“After I’m gone, people need to know I was here. They need to know my name, and remember me. I want to be up there with the greats. I want to leave a legacy. For all he did with his life my grandfather might as well never have been born. My life has to mean something. The world has to be different because I lived in it. So thanks for your concern, but I’m fine. And I kind of have to get back to work, so if that’s all . . .”

Cell intelligence

Before we live by ideas, we seem to live among them. Nothing goes unprophesied. The shadows of ideas fall ahead of them and mark out the shape of things to come for those who care to trace it. The prophecies of science fiction writers are an obvious example: I nominate Looking Backward. In 1887 Bellamy felt the shadow of the radio and colored with fancy the pattern of affordances he traced from prophecy.

There are a number of music rooms in the city, perfectly adapted acoustically to the different sorts of music. These halls are connected by telephone with all the houses of the city whose people care to pay the small fee, and there are none, you may be sure, who do not. The corps of musicians attached to each hall is so large that, although no individual performer, or group of performers, has more than a brief part, each day’s programme lasts through the twenty-four hours. There are on that card for to-day, as you will see if you observe closely, distinct programmes of four of these concerts, each of a different order of music from the others, being now simultaneously performed, and any one of the four pieces now going on that you prefer, you can hear by merely pressing the button which will connect your house-wire with the hall where it is being rendered. The programmes are so coordinated that the pieces at any one time simultaneously proceeding in the different halls usually offer a choice, not only between instrumental and vocal, and between different sorts of instruments; but also between different motives from grave to gay, so that all tastes and moods can be suited.

Contrast this prophecy, made in the heat of fiction, with another, made in earnest. I own a book—a curiosity—entitled Cell Intelligence, self-published 1916 by one Nels Quevli: registered pharmacist, bachelor of law, and flaming eccentric. The argument of the book is encapsulated in its full title:

Cell Intelligence the Cause of Growth, Heredity, and Instinctive Actions, Illustrating that the Cell is a Conscious, Intelligent Being, and, by Reason Thereof, Plans and Builds all Plants and Animals in the Same Manner that Man Constructs Houses, Railroads, and Other Structures

This sounds stranger than it is; try The Selfish Cell. Quevli in 1916 maps to Dawkins in 1976. Both Quevli and Dawkins conclude that life does not fall out of any equation, and that since it is not a force or a property of matter, its existence at all is contingent, and its forms must be historical.

There are two main theories by which the growth and development of plants and animals in life are explained: First, chemical and mechanical forces; second, Intelligence or a Divine Being. However, so far no one has yet ventured the proposition or statement that the intelligence that has caused the production of all these structures we see, such as plants and animals, was the property of the cell.

And since it is not determined, it must be intelligent (or selfish) because its survival and ramification imply something equivalent to memory.

I do not pretend to know what intelligence is, nor what memory is, but I want to show that the cell is a being possessed of that something, whatever it is. If man is intelligent the cell must be.

Both are asserting that cell intelligence and human intelligence are the same. The difference is whether we follow Quevli in applying the vocabulary of human intelligence to the cell, or Dawkins in applying the vocabulary of the gene to human intelligence.

Bellamy’s prophecy is interesting, but after Bellamy radio still had to be invented. But Quevli in 1916 knew what Dawkins knew in 1976. Ideas are autologous: the description of an idea is an idea. To predict it is to bring it about; to imagine it is to create it.

This property of ideas leads to certain perversities. Everywhere we find that the longest training, the deepest commitment, the finest specialization yield ideas that could just as easily have been dreamed up on a long walk or talked out in a bull session. The difference is the imprimatur.

But if specialization does not yield better ideas—if it only makes them more persuasive—then someone who is more interested in understanding than in persuasion might ask whether it would be better not to specialize at all, but to cultivate the faculty for ideas directly.

The case could be made that the person who has one idea, then devotes their life to advancing it, is wasting their life: settling for an idea that, being their first attempt, probably isn’t even very good. The case could also be made that intellectual monogamy ought to be the goal of anyone who takes ideas seriously, and that though essayistic dalliance with a series of ideas is perhaps charming in the exuberance of youth, it becomes absurd and pitiable if protracted into maturity.

This tangle recalls others. Being one person—having one personality—is enough for most of us; yet we see writers and actors contain multitudes where each member, whether absorbed from life or condensed from fancy, is as much a person as the person who contains them, having virtues and vices of their own they do not pass on to their host. If myself is something virtualizable, am I wasting myself in being only myself?

But writers and actors are not the best people; what they contain they do not combine. The conversation of Shakespeare was surely intense, but less than Hamlet times Falstaff times Rosalind. And actors especially may owe their multiplicity to nothing but the quality that Borges imputes to Shakespeare (who was also, remember, an actor): they can become anybody only because they are nobody.

The homuncular fallacy is not a real fallacy. It could turn out to be part of the definition of consciousness that it is built from what is also conscious, a potential infinity like two facing mirrors. We contain cells, cells abridge us; we are people with personalities and yet we contain people with personalities. Sometimes it seems that everything is recursive, that even reality only represents itself: considering Robertson’s Titan, for example, I cannot help suspecting that the world, too, only serves to perform what has already been anticipated in imagination.

Nondefinition #33

Sharks. The shark is no pilgrim: half as old as life, streamlined by a million generations bent on the same restless, uncompromised purpose, he has never yet doubted. He has an ancestry but it does not matter. Once hunger met water the shark was inevitable. He is written into the laws of physics between the ratios of buoyancy and the equations of flow and drag. He belongs utterly. When he dies he leaves no bones to protest it. They say that deep enough there is no more up or down, but they should know better. The shark is down. The moment your blood enters the water, you start to fall. In the whole wide ocean there is nothing to catch you. First he smells you; then he hears you; then he sees you; then he feels the current switching in your muscles as you try not to breathe. But you have nothing to be ashamed of. The hunger you feed was not a vain hunger like the lion’s, not a grubby hunger like the worm’s, but perfect hunger: unhurried, impartial, and pure.

The Early Adopter

“People are afraid of the future. I can understand that. The one thing we know for sure about the future is that everything’s going to go wrong, am I right? You’re going to get older, and your marriage will fall apart, and your kids will speak a different language and listen to bad music.

“But I’m in love with the future, because while I’m getting older, and getting shaky and confused, something else is happening. Technology is accelerating so fast that even as I’m coming apart the space of what I can do gets bigger and bigger.

“I may need thicker glasses, but I can talk to somebody in China on a video phone. I may be out of shape, but I can carry a thousand books in my pocket. My hearing, maybe, isn’t as good as it used to be, but I have my own personal pocket radio station that plays all my favorites and follows me everywhere.

“So, sure, it’s true. Maybe if I wait a year the next model will be better and cheaper and they’ll have the bugs worked out and that thing everybody hates, they’ll have changed that. But I’m not getting any younger in the meantime.

“You be sensible. What’s one more year of circling the drain? Mine’s on pre-order.”


To say something unusual in specialized language is easy. A few formulas may unmistakably express a new worldview. To say something unusual in everyday language is very hard. You must choose your words not only to say what you mean, but to refuse to say what the hearer expects. Names alone cannot do it; it takes sentences.

Consider how advanced ideas become basic ones. The joke goes that in 1919, when Eddington was asked whether it was true that only three people understood general relativity, he hesitated and finally excused himself: “I was wondering who the third one might be!” Now undergraduates study it. Postulate that our undergraduates are not smarter than the best minds of 1919. Consider musicians: the violinist’s vibrato, the guitarist’s tremolo, were once the distinctive techniques of particular virtuosos; now they are part of mere competence. Nobody could play Liszt but Liszt until everybody had to play Liszt. What were once expeditions are now vacations.

This is more than a pattern; it is a phenomenon. What happens is naming: giving something a name is the first step in its domestication. The wild equations of general relativity were tamed by the associations that gathered around the name: the bowling ball on the rubber sheet; the paradoxical twins; the absentminded professor; the starship Enterprise. Any whale can be handled once it has enough harpoons in it.

There is a tension between thinking in names and thinking in sentences. Math and science work with names; verbs only participate syntactically. This is an envied state. Whenever we see a field on the make we see it embracing gerunds, copulation, and anaphor. The textbooks always show the development from sententious thinkers to name-wielding scientists as the axis of progress.

But something is suspicious here. To be useful names must be unlike other words: they must have definitions, and there must be some procedure to ascertain that two definitions refer to the same thing. Otherwise a name is not a name at all; it is just another word.

The decline of Freudianism comes to mind. Freud gave names—ego, id, repression, neurosis—with a certain drama between them. The names and the drama were then taken up by a series of schools. Each school recast the roles with new definitions, or rewrote the old roles into a new drama, until finally the names, because they meant everything, no longer meant anything in particular, and were heard no more.

This matters. How many brilliant thinkers, who might have enriched the study of the mind if only they had been content to write sentences, went to waste following a dumb faith in names? They should have been warned that mere sentences are never wasted: good writing is always good thinking. It can be translated into whatever names are current, and lasts when names fail.

On quirk

Quirkiness is what breeziness was: the style of the writer who writes not as a maker, but as a performer. It may be interesting to compare the two. Breezy and quirky are both inexhaustible. When you lay two breezy or quirky pieces by the same author end-to-end, the grain matches up where the word count cuts off. They are as reliable and predictable as utilities and readers love them for it: the breezy or quirky writer who is not absolutely incompetent can expect their following, however small, to be loyal and loud.

Breezy and quirky do the same job, but in different ways. Breezy is world-wise and wide-awake; quirky is innocent and dreamy. Breezy is suspicious and confrontational; quirky is trusting and fragile. Both are overbearing, but breezy is pushy where quirky is cloying. Breezy is cool and takes things in stride; quirky is breathless and labile. Breezy is a mover, in constant, purposeful coming and going; quirky is a dweller, a homebody. Even when quirky travels, it settles. (Corollary: breezy and quirky both value living light, but for different reasons: breezy streamlines where quirky simplifies.) Breezy and quirky are both fun, but both under false pretenses: breezy is fun because it pretends to be ignorant; quirky is fun because it pretends to be crazy. Of course since real insanity (like real ignorance) is no fun at all, the insanity is aspirational: boredom becomes ADHD, neatness becomes OCD, absentmindedness becomes Alzheimer’s.

Both are ridiculous, but neither deserves mockery. True, breezy and quirky both talk about themselves, endlessly, but neither is narcissistic or needy. They claim interest vicariously, by representing something: whenever they are an X they are just another X. True, breezy and quirky are both indiscreet; but though they are highly personal they are totally unrevealing—a sacrificial persona intervenes between merely human writer and inhuman audience like a patronus.

Of course neither is bad in itself. Archie Goodwin should be breezy; Amélie Poulain should be quirky. For the writer, breezy and quirky are both shams, but perhaps shams are not so bad: perhaps somebody who demands that you be yourself deserves the same reaction as somebody who demands that you go naked. Still, when they go wrong, breezy is very bad, but quirky is worse. Breeziness is at least an adult sham; but quirkiness is falsely childlike in the fairy-friendly way that only fools adults who have forgotten being children, when they would have caught fairies to pull their wings off.

The Miser

[New feature; the idea is something between Theophrastus and Browning, like the “letters” in the periodical essay series without the framing device.]

“I learned something very early on. I saw that you can survive without friends, and you can survive without money, but it has to be one or the other. And I turned out to be much better at making money than I was at making friends.

“I don’t have anything against people who go the other way. Everybody wants to give you a hand—great! Nobody ever gave me a hand. They wanted me to beg and I wouldn’t beg. So I did it anyway, and then—it’s true—I rubbed their noses in it. That’s only natural, if you don’t take it too far.

“I’m not happy; who’s happy? I know money isn’t happiness; I’m not stupid. But I don’t have any regrets because I never had a choice. I wish you people understood that. I wish you people didn’t look at me like I’d gone over to the dark side.

“I know what it is. It’s because you need me and you don’t want to admit it. It’s resentment. Your friends can’t do anything for you unless they have money, and when you follow the money what do you find? You find me. If I tagged my money the way they tag migrating birds, you’d be amazed how far it goes.

“Miser? I’m the most generous guy in the world. In fact I’m the only generous guy in the world, because it’s my money to start with. It doesn’t count when it’s somebody else’s money.”



Technique is the thing that takes the human body, formed by a million years of fight and flight, and turns it to ends nature never proposed. And it is one thing, beneath the conventional distinctions that hide its scope. The fingers of the musician and the body of the athlete are both natural means turned to unnatural ends. What does the turning is technique.

We do not know what our bodies may do. Biology, remember, is made of physics. All our movements only permute the universal grammar of simple machines. The body is a vocabulary: its material is limited, but its combinations are inexhaustible.


Technique is paradoxical. Physiology maps the body’s range, extension, and advantage, but the means by which we use them in concert, our techniques, either ignore physiology, or imply a false one. We control our bodies only as gestalt.

Consider relaxation. The perfect balance of loose and tight for a muscle is the same as for a knot—not so loose that it slips, not so tight that it binds. But we cannot calibrate this balance by feeling it, because the real action of the muscles is the sum of the voluntary tension we perceive and the involuntary tonus we do not. We have to think of relaxing just to prevent the mistake of bracing. It helps to be told to relax, it helps to try to relax; but if you actually relaxed, you would sacrifice control over the good alignment of your joints, and destroy your body as you used it.


Technique is not usually the product of research. Of course physiology is relevant to technique. Duelists applied the discoveries of anatomy in the fencing hall almost as soon as they were exposed on the dissecting table. But fencing survives as technique, not theory.

And research may be an impediment to technique. Techniques are of two kinds. Some techniques amplify our powers, improving what we would do anyway—jump, run, hit. Research helps here by aligning the technique with the underlying complex of mechanism and instinct. But other techniques rather enable than help: they let us do new things. Here research may be a mistake. Science invented the triangular pen to make it easier to write with the fingers; but in the technique of the penman the fingers must not be used at all.

Techniques like these, though not analytic, are not arbitrary; they have their own logic and converge across centuries, and civilizations. Modulo certain constraints of metallurgy, the knights of Christendom and the samurai of Japan worked out the same technique for the two-handed sword. Or, returning to calligraphy, the peculiar penhold used in Eastern brush painting is paralleled exactly in the technique of flourishing with the pointed pen.


All techniques belong to one of three patterns: cues, checks, and controls.

Cues belong to the mind. If we could see athletes and performers as they imagine themselves we would behold the strangest beasts, like the boxer with the wings of a butterfly and the stinger of a bee. Every discipline has its own imagery of this sort, which is part of its mystery, consecrating its pursuit as a shamanic ecstasy of communion with totemic essences.

Checks are miniature acts: the things you do before and after the main act, how you get ready and make sure. A check may be as formal as a routine or as spontaneous as a wind-up.

Checks are mostly important to the learner. Techniques are not behaviors; they cannot be shaped, in the behaviorist sense—perfected through approximation. Practicing wrong just reinforces the mistake. Checks splint the fragmentary elements of technique so they knit together true. Half of knowing how to move is knowing how to stand; half of knowing how to use a tool is knowing how to hold it.

Controls are what prevails, the meanwhiles and the durings. They are the simple things that take time to master, the first things you learn and you always remember: "keep your eye on the ball" or "keep your weight on the balls of your feet." A control is a sort of lever: it is easy, because all you have to do is pull, but it is slow, because by pulling on it you are moving everything else. Controls, as we turn them on and off, almost seem to let us switch between different bodies, adapted for different purposes.


Technique has many enemies.

Strength is an enemy of technique.

Of course brutish, blundering strength is the opposite of technique. But feats of strength have their own technique; the strongman is an athlete, not a species. He uses more sense than muscle.

Strength is problematic because it hides bad technique. With enough strength you cannot feel for yourself the difference between the right and wrong ways of doing something. The wrong way may even feel better: what feels more effortful often seems more effective. Jumping in and slugging through feels good, feels like something to be proud of, in a way that taking your time and doing it right cannot. And strength hides not only bad techniques, but even harmful ones: keeping the harm silent until it is irreversible.

Instinct is an enemy of technique.

Instinct is problematic for learners. The hardest technique, for you, is the one that comes naturally to everyone else. No one will teach it to you, because no one teaches it at all, because it is not obvious that it is possible not to know. Sometimes those who do not have the instinct give up on the skill; but they, though slow to learn and likely to be discouraged, may prove best, because they earn an awareness others lack.

Instinct is also problematic for masters, who always want to streamline their technique—to do more with less. But in doing so they risk omitting something essential they did not know was there, because it was never named to them; something they may find it difficult to regain.

Skill is an enemy of technique.

Masters are rarely good teachers. They may be impatient—ars longa. Even if they are patient, they may be unsympathetic—was I ever such a…? Even if they are patient and sympathetic, what they teach may be not what they do now, but what they did when they still had to think about it.

And even if they are patient, sympathetic, and self-aware, they may teach the wrong things. Techniques feel different when they are new. The gestalt that mastery experiences is not raw but cooked. If it cannot be taught in its finished form, it must be arrived at.

Moreover teaching is a rare and demanding ability; few masters are good teachers, and even fewer have time for it.

And skill has a different use for technique than ignorance has. Ignorance has only two outcomes, getting it wrong or getting it right. Skill has many outcomes. Subtle adjustments and accommodations imperceptible to the ignorant produce wide divergences for complicated ends. The techniques the skilled pay attention to are thus ones the ignorant have no use for. Trying to teach them is pointless.

Of course, bad technique is an enemy of technique.

Techniques link to one another; a bad technique unchains those that depend on it. (To invert, when a generally accepted technique fails to work for you, it usually means you have a deeper problem.)

But good technique is also an enemy of technique.

Technique is its own enemy because the better you become, the harder it is to tell what works. When you are used to bearing a technique in mind, as your muscles learn to perform it on their own, your consciousness of it becomes redundant and may gradually exaggerate means into mannerism.

It becomes harder to test a technique; any change feels like an improvement when it rests tired muscles and favors rested ones.

And, once you have assimilated good technique, the body's mechanisms for self-protection have been disarmed. You could run wrong your whole life and never suffer more than an ache, but once you run well you might run wrong once and never run again. All the safeties are off.


What we learn when we learn is not just what we learn but how we learn. And sometimes we can reuse that knowledge. Obviously your third language is easier than your second. But this head start is not free. For languages, the work has already been done. There are languages with grammars, and there is grammar itself, as an abstraction. But technique as an abstraction does not yet exist, except in name. Between the snobbery of the gracile, to whom all strength is brutality, and the pride of the robust, to whom all delicacy is weakness, it has gone unseen.

I am no hellenizer. Mens sana in corpore sano is a good thing, but only because it is the sum of two good things. Of course the mind and the body benefit each other: a feeble body usually means a confused brain, just as a feeble mind usually means a clumsy body. But the one may be excellent while the other is only adequate, and when both are excellent it is just a case of two kinds of excellence.

Technique is of course how we build on nature. But technique is also how we find nature. Technique is human instinct: the cue in the totem, the check in the ritual, the control in the talisman.

The human form is caught between nature and culture. Before nature finished standing us on our feet, culture pulled us, still rough, out of the tumbler of tooth and claw. We are neither one thing nor the other. Our bodies depend on our minds as much as our minds depend on our bodies; half the human digestive system is in the oven.

Technique is another such supplement. Under the poorly fitted jacket of flesh and bone that lies heavy on our shoulders we are as far from the grace of the beasts of the field as from that of the beasts of the air. The easy and spontaneous embodiment the rest of nature inherits as its right is, for us, only possible through technique.

The Locomotive

On viewing a restored locomotive displayed in a pavilion.

Black beast, gnarled in heavy sleep
Red sun thaws cold iron.
Slack boiler swells, remembering steam.
Fused wheels flex, grasping the rails.
Scraps of shadow pour from the cold chimney
Silent shrieks rattle the mute whistle.
Face to the sun, I borrow a flush of hope.
Back to the sun, I tread a path of shadow.

Stock market

The stock market exists to discharge the gap between capitalism and reality. To a greater extent than capitalism’s most dedicated enemies imagine, whether a company is in the red or in the black, whether it is getting by or flourishing, is a matter of convention. The bottom line is like the horizon: exact, but it changes with your perspective. Accounting is an etiquette, not a science: the principles of accounting are accepted, not discovered. And because (in the short definition of capitalism) business can run as well on credit as cash, any large aggregation of capital that is not openly burning money remains viable as long as it remains credible.

The purpose of the stock market is to assign, to the companies that submit to its judgment, another value besides the one that stands on paper: one that answers not the accountant’s abstract question, “What is it worth?” but the more cogent and interested question, “What is it worth to me?”

Capitalism did not invent the bubble. Sri Lanka, I have just been reading, is covered in the massive ruins of an ancient irrigation network that was abandoned just as it was finished. The most parsimonious explanation is that ancient Sri Lanka had a bubble in aqueducts. Perhaps the answer to why the Maya built so many splendid cities, and then abandoned them, is a bubble in the building of splendid cities. Egypt had a bubble in tombs; Rome had a bubble in conquest; Europe had a bubble in chivalry. Bubbles are a human failing; capitalism is unique because it pops them.

The stock market is this sensitive needle. Its ticks transcribe the quick ebb and flow of an argument conducted in the binary code of short and long, buy and sell. And this is an argument to which we are all parties. Whenever you spend money or time you express an opinion about the economy. Buying a car is tantamount to saying “Car prices are fair.” Getting an education is tantamount to saying “My prospects are good.” Planting a food garden is tantamount to saying “Food prices are too high.” (Of course you may have other motives, but the economy is touchy and takes everything personally.) The stock market is the great bookie who takes these opinions about the economy and turns them into bets.

To put it another way: the stock market is an arrangement that pays people for being right about the future. This is unusual. Elsewhere in life, being right about the future is punished twice: before, with contempt; after, with hate. But with the stock market, the more unlikely the prediction—the longer the odds—the greater the payout. Thus, as the belief behind a bubble becomes more unquestionable, the reward for questioning it grows. 1929 was the year it became worthwhile to wonder if an upward trend in earnings really meant endless future growth; 2007 was the year it became worthwhile to wonder whether the housing market really could continue to grow forever without ever slowing down.

Crude as it is, this mechanism of homeostasis is unique; and it is what makes the gross, awkward, grasping, adolescent behemoth of capitalism fecund and invulnerable.


Self-deprecation may be a gesture of meekness, like a dog that shows its belly: a sign that you mean no harm, or are not worth harming. It may be calculated to lower expectations: whether out of discretion, not to disappoint them, or by design, to surprise them. Or it may be an indulgence, extorting praise by threatening self-harm.

(In this way self-deprecation helps define friendship: what is a friend but someone who praises you for what you regret, someone who finds unthinkable the things you fear may be true? These precious offices can only be performed when self-deprecation occasions them.)

It may be a way to evade responsibility. “I couldn’t X to save my life” is a polite way to say no when it was wrong to ask. And it avoids embarrassment: perhaps you could X, if you put your mind to it, but when there is nothing to gain, why risk failure when you can excuse abstention?

Of course self-deprecation is not always serious. It may be an indirect boast. “All censure of a man’s self is oblique praise. It is in order to show how much he can spare.” Which is harmless in moderation. Or it may be a provocation. Montaigne appalled his friends by insisting that he had no memory. In the French of the time memory stood for intelligence: but Montaigne made the distinction, and his self-deprecation enforced it.

All these uses are legitimate, but none of them can excuse the habit of self-deprecation. If you expect others to take you seriously, you should try it for yourself. Tell me often enough how stupid and useless you are and I may begin to believe you; apologize for yourself often enough and I will begin to believe you have something to be sorry for.

Of course some lives are just that lost, some people are just that broken; if patience can help them, they have a right to it. But I have no patience for people who fear being resented more than they fear being despised. It doesn't even work.

You may resent people for having things you can never have; but the people you hate are the ones who have the things you can never have, but despise them.

Aristocracy was invulnerable as long as aristocrats took pains to enjoy, and be seen to enjoy, their wealth and privilege; but the moment they started to doubt themselves the masses rose up and devoured them and raised the clear conscience of plutocracy in their place.

All persuasion begins in confidence. And since respect will be given, if those who deserve it cannot stand by their words, deeds, and lives, others will receive it undeservingly.


The axiom of finance is that having something now is better than having the same thing tomorrow. One who calculates by how much is said to discount. The same axiom holds elsewhere, but no one wants to do the math. The sacrifice that would be saintly if it were selfless is too often only thoughtless.

Take someone who refuses a dish because of ethical objections about how it is now made. Someone who abstains from foie gras has made a permanent stand against an inherent evil, and means it. But someone whose argument begins “Do you know what kind of—” may not have weighed their position. You will not live forever. You are not guaranteed the ability to enjoy food even as long as you live. While you could still enjoy food, your health may forbid it. Whether or not you have bothered to count, at the end of your life there will have been only so many meals and far fewer good ones; are you certain you want to subtract this one?

Anyone who proposes to change the world needs to be asked: “If the world were as you want it to be, what would you do with yourself? Could you be doing it now? Why don’t you?” Perhaps the answer would deprive a worthy cause of a capable supporter; but if there is nothing good or worthwhile in the world except making what is worst in it less bad, then there is nothing good or worthwhile in the world at all.

(Besides, the most dangerous people in the world are the ones who try to change it without having learned how to live in it. Danton on Robespierre: Cet homme-la ne saurait pas cuire des oeufs dur: “That man couldn’t hard-boil an egg.” A man might become a monster only because he was good for nothing else.)

The taste for causes can be a jaded one. Helping people is one way of hiding from them. Trying to save the world is one way of giving up on it. Devoting your life is one way of throwing it away. You say the fruits justify the tree; but who would eat of it, if they knew how it grew?


How right that a dog went first, and that, for a time, she was between us and darkness. All our proud rockets, our brave pioneers, and we entered space as a child might enter a basement, holding onto a dog's tail. She was a Moscow stray. A stray, and therefore nobody's dog, or everybody's—yours and mine, even. A Moscow stray, distant aunt to that remarkable unbreed of hustlers and idlers—Russia's last aristocrats.

Strays live by the old covenant. Dogs never needed us. The deadliest hunter of the African plains is not the lion but the wild dog, whose kills are efficient, coordinated, and relentless. And they chose to throw in their lot with us. What honor! And what responsibility! There is a play (a radio play of Dunsany's) where mankind is put on trial. One by one the animals testify against us; only the dog speaks in our defense, with such praise as is, in its way, worse than accusation:

He is man: that is enough. More is not needed. More could not be needed. All wisdom is in him. All his acts are just; terrible sometimes, but always just.

Bacon writes (against atheism) that men are better for having a god as dogs are better for having a master: a strange and improper argument. But if our faith is as heavy as the faith of dogs is to us, we can have a sort of sympathy, and imagine how gods might know shame.

Muhammad relates that a woman was forgiven a lifetime of sin for giving a thirsty dog a drink of water. Consider how the balance is weighed; what does it mean to harm a dog? "Who could eat a dog?" is really the same question as, "Who could eat a man?" Nobody eats a dog for its meat; men eat dogs like men eat men: to absorb their power. It is at least respectful. When dogs are twisted into brutality, neglected into savagery, beaten into helplessness, there is no respect; better to be eaten. And yet each new puppy is a fresh expression of absolute trust, never diminished. Our terrible debt vanishes in that unquenchable devotion.

In his last years, isolated in deafness, Goya gave up canvas and impasted the walls of his own house with a series of alien images, primordial and apocalyptic at once. A lone dog—all alone—sinks below the horizon, howling as she recedes over the edge of the world. We sent Laika to die; we sacrificed her. She died within hours. And then for five months, dead, in her dark, silent capsule, she circled our bright world, falling and falling as the horizon slipped away from beneath her, one dead dog keeping solitary watch over the billions. Then she fell as a star falls, a burnt offering, trailing fire, scattering earth and sea with her ashes. Even the sky is haunted.

Genteel tradition

In 1911 Santayana was ready to leave the United States. In California (already liminal America), he said what he could not say in Boston:

The truth is that one-half of the American mind, that not occupied intensely in practical affairs, has remained, I will not say high-and-dry, but slightly becalmed; it has floated gently in the backwater, while, alongside, in invention and industry and social organization the other half of the mind was leaping down a sort of Niagara Rapids. This division may be found symbolized in American architecture: a neat reproduction of the colonial mansion – with some modern comforts introduced surreptitiously – stands beside the sky-scraper. The American Will inhabits the sky-scraper; the American Intellect inhabits the colonial mansion. The one is the sphere of the American man; the other, at least predominantly, of the American woman. The one is all aggressive enterprise; the other is all genteel tradition.

This phrase, “genteel tradition,” became the weapon of choice for the Mencken gang. They carried it in their hip pockets like a flask of violet perfume, ready to dash over an opponent’s head. And once the scent was on you, whatever you had to say, all anyone heard was the calico whine of a high-minded Protestant spinster.

But what did Santayana mean by it? He defines the genteel tradition as a form of anthropocentrism: an anthropocentrism that emulsifies transcendentalism – the sense that the world is your creation – with calvinism – the sense that the world is your fault. Historically he traces it to the seventeenth century and the renewal of orthodoxy.

That is where it comes from; but what is it?

The genteel tradition opposes education to life. It wants things to be done the right way, openly, and for the right reasons, or not done at all. It requires play to be exercise; thinking to be persuasion; learning to be study. It wants us to be unfettered and spontaneous, but not to run in the halls. It does not care what is avoided, unless it approves of what is done instead. Just avoiding apathy, boredom, ignorance, prejudice, and stupidity is not enough – in the judgment of the genteel tradition, avoiding them is only permissible when they are avoided properly.

Wealth, learning, and beneficence, even on a grand scale, must leave them cold, or positively alarm them, if these fine things are not tightly controlled and meted out according to some revealed absolute standard.

This should sound familiar; this is our world. Santayana thought the genteel tradition was dying; instead it enjoys absolute victory. It has coopted or outlasted every challenge made to it. How did this coffin case recover and reconquer?

In his speech Santayana names Whitman and William James as models of what was to come after the passing of the genteel tradition. How badly his prophecy failed shows in how unthinkable either man is as our contemporary.

Whitman’s generous sympathies would wither in our frost. How dare such a creature of privilege – white, male, educated – presume to contain us? His faith in active humanity – in discoverers, settlers, builders, farmers – is embarrassing. He accepts where we require indignation; he holds faith where we require doubt.

Whitman is an outcast, but James is worse off. He has been brought as low as a dead thinker can be brought. We say he anticipated recent discoveries – “now that we know everything, we can admit he was right all along.” And this is safe to say, because he has no heirs. Our psychology is blithely built on the compulsive, thoughtless quantification that he travestied.

We shelve pragmatism beside hypocrisy. A judgment that can be changed is a judgment that was never held properly. The impulse of the genteel tradition is theocratic: it will have you only hot or cold, never lukewarm.

Santayana’s examples have aged badly. Aggressive enterprise has been outsourced; skyscrapers turned out to be a gimmick, not half so efficient as the anomie of the exurban office park; the colonial mansion was not reproduced, but renovated.

More importantly, women made their own claim on the future: not just assuming male roles, but dignifying female ones. Gender is the worm in the apple of Santayana’s thought. Even for his period he is obtuse about it.

The American intellect is shy and feminine; it paints nature in water-colours; whereas the sharp masculine eye sees the world as a moving-picture – rapid, dramatic, vulgar, to be glanced at and used merely as a sign of what is going to happen next.

Santayana underrated women – women as people, and women as a subject. He observes a divide down the middle of humanity, and assumes that one side mirrors the other: one left, one right; one weak, one strong; one shy, one brash; one sentimental, one enterprising.

(In The Sense of Beauty, for example, while investigating the mutuality of sex and aesthetics, he infers that, because women are the most interesting thing in the world to men, men must be the most interesting thing in the world to women; whereas (aesthetically speaking) women are the most interesting thing in the world to men and women both.)

When Santayana made this metaphor – the genteel tradition is female, modernism is male – he corrupted his view of one dilemma with the quality of caricature that spoiled his view of the other. Thus he sketches both the genteel tradition and modernism (as he names its opposite) clownishly, in greasepaint. If the genteel tradition is feminine, retiring, domestic, careful, then the opposite must be masculine, daring, upthrusting, public. On these terms there is only one complete escapee from the genteel tradition in American letters: Ayn Rand. Ask a silly question, get a silly answer; posit the genteel tradition in Santayana’s playground terms and you get Objectivism.

Every form of modernism is tantamount to testosterism. It is the one thing every species of modernism had in common, the weakness they all shared; so when the thing happened that no one expected, they were all susceptible to it, and the genus went extinct.

(I cannot say how far Santayana is, himself, to blame; but even if others made the same mistake, it is still the same mistake, and bears the same analysis.)

The kind of thinking that modernism liked was the kind of thinking that felt most like work: laborious, therefore masculine, straightforward without the effeminate detours of inspiration or insight, muscular and tense, measurable in foot-pounds and horsepower.

So what happened? This is hard to see because Santayana’s future is our past. It belongs to the middle distance; we cannot see it for our own shadow.

What no one expected was the computer. Suddenly, there appeared the machine that proves there is no connection between how hard thinking feels and what it is worth. The labor theory of value does not apply. Thinking feels hardest when it is most trivial. Calculation is effortful, but not difficult – even a computer can do it.

Somehow we still admire feats of memorization and calculation. What computers prove is that these feats are dead ends. Mental mathematics, total recall, musical prodigality, are not signs of a powerful mind, but of a mind that has plenty of room because nothing else is going on inside it.

In this way the computer refutes modernism. Consider painting. Look across, from the first half of the twenty-first century, to the first half of the twentieth. What do we see there? We see nothing worth doing. There are no more pointillists, impressionists, cubists, because Photoshop trivializes them. There is no more abstract expressionism, no more suprematism, because the possibilities of these schools are exhausted by the screensaver.

“No,” I hear, “the computer no more refutes abstraction than the camera refutes representation.” But a painting is different from a photograph: one cannot see a photograph as a painting that could have been made, but wasn’t. But a work of modernism is always something that could have been generated by computer, but happened to be made by a human being.

(This definition applies to more than painting — Serialism, Brutalism, Oulipo, &c. — but less than everything that has been called modernist. Of course it could be that if everything that has been called modernism be admitted, then modernism has no definition. I draw the line here.)

If we mute the caricature – if we correct for Santayana’s error – what is left then? The idea of a genteel tradition will stand. But what of the accompanying diagnosis? Do we have that divided mind? Certainly we have inherited the division as Santayana made it, and as others elaborated it: we find ourselves obscurely constrained to destroy the genteel – even under other names, like pretentious or inauthentic — wherever we encounter it, like the tribe of Amalek.

But the things the genteel tradition wants and provides are good in themselves. There is sufficiency and even bounty in it. It preserves what might be lost, incubates what might be stillborn. But for the sake of these good things the genteel tradition sacrifices things that may be better. It smothers everything it touches with an anxious sobriety: it would rather leave us in marmoreal disgust, than let us enjoy too meltingly. This I oppose. I side with ecstasy, rhapsody, and multitude, if only from a distance.

Dream places

There are some things that we can trust, even in dreams. One of these is the dream version of a real place. Little as they resemble the places they represent, in the dream we recognize them, and across dreams we return to them, and find them as we left them. Often they are on a larger scale than their models. For places known in childhood this is explicable. Imagination magnifies and interpolates the facts until they match the impression we retain from when we were small in a place and looked up to it.

But all my dream places are magnified, whenever I knew them. Perhaps cinematography is to blame. Many who grow up watching black and white dream in shades of gray; perhaps my dreams are wound up to the geographical key of New Zealand.

True, I return to dream places which are born of dream stuff, and have no anchor in experience; but dream places, when they are born of real places, retain a cord of connection with them. The change that a place undergoes in becoming a dream place is not lawless: there is a topology, with invariants. The shape of a coast or the path of a river may change, but the waters remain. The dream place has the same palette as its model; no new colors appear. Trees never appear singly, always in stands. New buildings are found, and new features of old buildings, but always of the same stuff as the real ones. Roads widen and narrow, but never change their course, nor whether they turn or go straight.

In order of instability the elements of dreams are events, things, people and places. This is a lesson in the mechanics of imagination: even when anything can happen to anyone at any time, it must still happen somewhere.


The far-voyaging French explorers of North America kept running into one another. One explorer could hardly enter a village without finding another in residence or having just left. They could leave one another letters and expect them to be delivered. In Paris a man could disappear; in the wilderness he had to guard his reputation.

Think of traffic as a force. The canal is the artificial version of something natural – the river; likewise, the road is the artificial version of the path. Roads are permanent; paths, unless anchored by permanent settlements or fenced off by property lines, shift freely. The paradox of the wilderness is that the more open and unobstructed it is, the more traffic can converge along optima conditioned by the difficulty of the terrain, the availability of resources, and the use of waterways. In the wilderness all ways are highways.

When we consider ancient or prehistoric peoples and their connections we should not imagine a web of short links between evenly spaced nodes, news and goods moseying from village to village; we should imagine them swept up into a handful of gigantic, continental paths: stable in their broad geographic sweep, changeful in their fine, local structure. Call them fractal: at ten thousand feet, there is one path; lower there are ten paths; on the ground there are hundreds of paths, routes and reroutes circumventing any obstacle with the ingenuity of flowing water.

The existence of paths on a continental scale does not imply a continental consciousness. In their scale these great paths would have been invisible to those who used them: like the Silk Road (the last great path), each path would have been cut up by jealous middlemen, until one end of the road was a myth to the other.

One can imagine, if not document, a vision of universal history hanging on a set of Great Paths, where it is not migrations that leave paths behind them, but paths that educe migrations. Paths have always been before us: from the beginning, the human race spread not by spilling over from one valley to the next, but by processes that, pacing the lines of least resistance, became the salients of our advance. The ascent of man was not just something that happened; it was a single phenomenon, having its own structure – structured in paths.

Four Years

I decided, before I began the Ruricolist, that four years would be a good place to stop. Beyond that, I feared, I would be protracting something that belonged to a certain moment in my life beyond its natural term. But I was making the rare mistake of overestimating the prospect of change. All my reasons for writing the Ruricolist stand. Now that I come to it, four years is not enough.

April fools

On the net, an effective April Fools’ joke works like contrast dye – you discover, by following its path, who does and does not read the stories they pass along. April 2nd is a good day to unsubscribe, unfollow, and defriend. We owe to April Fools’ Day some great moments: say, table syrup or the spaghetti harvest. But surely there is already enough deceit and treachery in the world. Why dedicate a holiday to it? Perhaps it has the significance of certain seeming-perverse religious performances, honoring the hostile gods of death and ruin, recognizing them in turn lest they obtrude themselves out of turn. If we must be fools, if some god of fools will not be spurned, then, indeed, let us dedicate a day to his honor. And perhaps the holiday inoculates us: being an April fool is painful, but it forearms us for when we are made fools out of season.


Argument has rules. Argument is not a game – the rules are more in spirit than in letter – but there are rules. Certain moves – appeals to personal experience, to scripture, to studies and statistics, even to logic – break the rules and make argument impossible. Of course these are all useful instruments of judgment. But judgment and argument are different things. Judgment ends argument, but arguments do not want to end.

“Arguments want?” Arguments want what we want. Sometimes we argue selfishly, to win. Sometimes we argue selflessly, to keep the conversation going. But mostly we argue precisely to prevent judgment: to reassure ourselves that some matter is open to question, that equivocation is not irresponsible.

Argument has rules, but agreeing to definitions is not one of them. That is putting the cart before the horse. Definitions are liquid: when they meet, they mix. These triboluminescent encounters are what argument is for. All the valid moves in argument – making a distinction, putting in context, elaborating, unpacking – these are all ways to make definitions meet, merge, and mature. Definitions are always at the center of arguments because shaping definitions is what arguments are for.

Argument is harder than it looks. In large part this is because, while contradictions, fallacies, and biases break the rules, pointing them out is a far worse offense. Argument at the level of fallacies and biases is boring. Argument about argument is not argument. Whatever the point at stake, the opponents are in the same old ring, trading the same old jabs and blocks.

Argument is not a way of deciding. It is a way of not deciding, of doing something else instead: learning, wondering, waiting. You know it is the real thing when it is unpredictable – irreducible – and, therefore, nontrivial. More than anything else, argument wants surprise.


Are knots technology? Knots were never invented: like fires, knots happen naturally. But unlike fire they cannot be made useful by propagation: they have to be translated. Like language, knots are immaterial, passed on by example and subject to regional variation. Unlike language, knots are finite – there are only so many – and eternal – the same knots recur worlds and ages apart.

Like tools, knots are useful and increase our power over nature; but unlike tools, we carry them in our heads, not our hands. When they parallel tools, it is on a different level of abstraction. The trucker’s hitch is an image in cord of a block and tackle. It is no more a tool than a picture of a tool is – and yet it has the power of a tool.

Knots are a form of mathematics, but math with a difference. Arithmetic has a history of progress: but before history began, knots already embodied the highest level of mathematical abstraction. Knot theory is a twentieth-century invention. Only in the twentieth century, only after thousands of years of development, did exoteric mathematics finally equal the mathematics esoteric in knots.

Knots are magic. With a piece of cord and a sequence of gestures we produce direct results in matter. “The rabbit jumps out of the hole, runs around the tree, and jumps back down the hole”: what is this but an incantation? Reasoning from knots, we get magic; reasoning from tools, we get technology. Technology works, magic doesn’t; nonetheless, the existence of knots violates the order of nature that technology presumes.


The facts of life cannot be hidden from people who live among animals. Birth and death are as open and current among them as weather. Human beings cannot learn much from one another; we conceal too much in shame and pride. The short and unreserved lives of animals are the true parables. They enact life back to us on a scale we can grasp. The horse, the cow, the dog, the cat, the chicken (like the llama, the camel, and others) – these are the truly ancient sages.

Sometime in the nineteenth century it became possible for masses of people to live away from animals. Deprived of its foundation in the shared witness of animal life – left untethered – culture became a castle in the air.

Victorian prudery came first. Bowdler only becomes possible once he may suppose readers who do not know the way of a dog with a bitch. But his overthrowers were equally unworldly. Freud could only have lived in the city. (Animals, despite their undivided minds, are as neurotic as people.)

But Freud’s city was still a city of horses. When the automobile replaced the horse and left animals with zoos and field trips for their habitats, pathology became derangement. It was the analogies that the observation of animals implicitly afforded us that made reasoning about life possible. We have lost the animals, but we still need the analogies; so we grub them where we can. The machine served until it threatened to master – to remake us in its image, machines, not people. The net may yet make nodes of us.

(Of course analogy is not explanation, but a real explanation would have to explain us all, human and animal: developing a theory of human nature and trying to work animals into it parenthetically is a dead end every time.)

I invite the accusation of anthropomorphism; so be it. The dangers of anthropomorphism are abstract; the dangers of anthropocentrism are practical. It is not a question of dominion; it is a question of definition. We are the rule, not the exception: since we can no longer learn it by observation, we must be told, and trust.


Why should generations be interesting? There are two questions here, because generations have two kinds of interest. They have historical interest: a succession of generations from Lost to Greatest to Silent to Boomer to X to Millennial. And they have personal interest: the generation in the first person, what separates us from our parents and divides us from our children – “my generation,” “our generation.”

As a unit of historical analysis the generation is worse than useless. The biologist’s refutation of race applies: since variability within a generation equals or exceeds the variation supposed to divide generations, generations are supposititious.

Of course generations really are different. Every generation has its own distinctive patterns of behavior – but distinctive is not the same as characteristic. Nothing is more distinctive of a generation than its common names – but, remember, the fact that some names are common does not mean that most people have common names.

The generation is unreal, but unreal is not absurd; unreal things can exist formally, like lines on a map. The generation is likewise formal: consensual, not demonstrable. But why this consensus?

It is a pleasure to be sorted into a particular generation because being sorted, if it is not discriminatory, is inherently pleasant. Advertisers know this. They know that offering to tell “Which X are you?” or “What kind of X are you?” tempts you, for all X.

And something loves to displace the faults of human nature to contingent aspects of it. Sentences that begin by naming “these days”, “this country”, or “our society,” generally become intelligible only once they are universalized, and referred to human beings as such. The generation provides another means for such displacements. Then such awful questions as “Why are we here?” can be rephrased in cozier terms like “My generation has no sense of purpose.”

What is it that we share when we share a generation?

Sharing a generation is the least two people can have in common (who have anything in common at all): thus we are most attached to our generation when we have few other attachments. Those who have something more definite to be loyal to – nation, religion, community, cause – they would never number their generation among the things that define them. And, inversely, those who expect their generation to define them tend to lack particular loyalties.

Sharing a generation is the weakest hold two people can have on each other, who have any hold on each other at all. It is because it is so weak that it is so hard to shake off. Say: my generation is my blind spot. I think there is nothing so utterly mysterious to me as my own generation. Because I must draw lovers and friends from it, I want to believe it is better than it is, and when it disappoints me, I see mystery rather than accepting the fact of disappointment.

If it is to take hold at all, generational solidarity takes hold in childhood. Sometimes I would have to explain and defend to adults the things that I and my friends did – defend them to members of other generations. At these times I did not feel as if I were speaking for myself: I felt like an ambassador, charged with a heavy responsibility and answerable to my peers. Suddenly I was not only included, but important. Under threat of scorn from another generation people who would not otherwise speak to me leaped to my defense; people who hardly spoke at all surprised me with the capacity to form complete, reasonable, and persuasive sentences on my behalf.

To embrace the undemanding solidarity of the generation, to build life and work around this experience of inclusion and importance, is understandable. We writers are most susceptible. The generation is our fallback – once we have, for fear of prejudice, abstained from everything else. The temptation is always present: why speak for yourself, when you can speak for your generation? Why stand alone, when you can recruit their implicit support behind everything you say, or make, or do? Why be objective when at last, after everything, you could have them all on your side?

Tower of Silence

So here we lie at last, having arrived,
Upturned, unraveled, undisturbed.
Here we lie where there is no more hurry,
Here we lie where there is plenty of time.
No more alarms, never early or late,
No more errands, nothing to muddle our thoughts.
Free between earth and sky, picked men,
Sweetly discoursing, attended by nodding birds.
And what is soft and dark in us must fly,
But what is hard and bright in us can stay.
For here in the tower of ivory, brilliant and bare
We are the men of ivory, with nothing to fear.


I just spent five days in the hospital; I beguiled them by reading ebooks. I bought a Kindle recently. I had the chance to try one, and was immediately taken with the idea that if I could transfer the reading I do onscreen to the device, my eyes would have an easier life. Once I confirmed this was possible – between Instapaper and Calibre it is straightforward – I bought it.

It immediately paid for itself in canceled magazine subscriptions. With exceptions, I dislike magazines as physical objects: glary, bulky, ad-ridden. Why pay for the piles, when I could get what I want for free, in a more legible form? And I soon relieved my perennial browser session of all the things I kept open to read in fragments. Besides articles and posts, I had also been thinking of books that were unavailable or exorbitant in print (or only available in those dubious POD reprints with the generic covers) but free on the net. I found myself pilfering the treasure-house of Project Gutenberg.

I have a history with ebooks. I was an enthusiast in the false dawn of ebooks, about ten years ago. Back then the idea was not to save publishers, but to destroy and replace them: to behead the behemoths of New York, to throw open the gates and welcome the multitudes in, to replace the stagnant world of editors and exploitation with something brighter and more breathable. This was the mission of the ebook publishers. For a time I seriously meant to become one. This ambition was twice cured. I assisted a judge in an ebook contest; this was my first contact with the slush pile, and it has never washed off. And I realized I spent far more time reading about ebooks than I did reading ebooks. I excused my disaffection with the argument that ebooks would never be practical without the then-speculative technology of e-ink. By the time e-ink showed up, disaffection had become distaste.

But in the hospital, while I was too weak to hold a paper book open, I read ebooks, and was engrossed. Or, better, I was not reading ebooks; I was reading.

I am not a convert. I will always shun anyone who thinks paper is just dead trees. But I must recant a witticism I was formerly proud of. “I cannot remember who said, ‘The world exists to end up in a book’ [Mallarmé]; but I am sure no one will ever say, ‘The world exists to end up in an ebook.’” Actually the world exists to be written and read; what we read it on is no more decisive than what we write it on.


“People are stupid” is a non-answer, like “God made it so.” It is a dead end. Perhaps you misunderstand what they are trying to do. Perhaps you overestimate the resources available to them; perhaps you underestimate how hard the things you take for granted are for them. Perhaps they are not deluded, but deceived; perhaps you are deceived. Perhaps they choose not to see what they cannot bear to see; perhaps you are choosing not to see something unbearable. Perhaps the system in which they are caught perversely rewards stupidity; perhaps the system is not perverse, but malicious. Perhaps they just made a mistake. Assume people are not stupid and you will always learn something; assume people are stupid and you will never learn anything. Of course stupidity is real. But truly stupid people do not lack intelligence: they reject it. True stupidity is a skill: a kind of aikido that deftly unbalances the most powerful arguments and sends them sprawling on their faces.


Posting here has been erratic and is likely to stay that way. I am (still) recovering from relatively minor surgery. My ability to type, let alone write, is not reliable. I intend to keep posting as I can, but I make no promise of a regular schedule.


All writing has a sense of audience. The sense may be latent or explicit, attenuated or definite, but it is always there. The more explicit and definite this sense, the fewer the choices the writer must make, and the better-informed those choices are. To have a public is to have it easy. A circle of the likeminded, something the writer can be a member of, is also helpful. But the audience does not have to be real to be sensed. Just to have something to prove and someone to prove it to – whether or not they are paying attention – is a great help. Even the voice in the wilderness has the wicked world to harangue.

The writers of the Dark Ages read the classical authors and tried to do what they did; but they lacked the classical audience. Their work has a feral, furtive quality. No matter how they tried – and some of them were brilliant, had great minds – something was always missing, always off-key. You do not know their names because their work has little value in itself. The interest it keeps is in tokening what might have been – without the thrall of priestcraft, without the isolation of monkhood. I think of them often, and with sympathy.

Still, failing all others, there is a true audience, the true one because it is the only inevitable one – yourself, plus time. The hardest and therefore the worthiest thing to write is the thing you can reread – perhaps now, perhaps later, perhaps ten years later – without embarrassment or frustration.

Of course you hope to improve, and improving to see flaws that were invisible to you before. But the visible flaws are enough. The audience you write for may not notice your omissions and exaggerations. You may get them on your side. They may forgive you for being glib or hasty. You may learn the tone of their voices so well that in your head you can hear them defending you even as you cut the corner. Audience excuses a lower standard. But just because you can write does not mean you should. You will never stop seeing the flaws. They will always be there; they will accuse you forever. Anticipating your own retrospect is the only antidote to so attractive a complacency.

I built a shed over the summer. It is easily the largest thing I have ever built. Naturally I made mistakes, and naturally I learned from them. Having built one shed, I could build a much better one. But a shed is a physical object. Because I am not willing to waste the materials I used; because it works well enough that I cannot justify the effort and expense to replace it; I am stuck with all its crooks and gaps and rough edges.

As a writer you do not have to compromise. You can learn by doing and apply the lessons to what you have done, without losing anything. Style is only the quality obtained by this adaptation of means and ends. The vaunted ability to write without reading, to get it right the first time and let it go, to write altogether efferently, to remove yourself from your audience: this is nothing to be proud of; this is missing the point.

True, self-criticism is a dangerous habit. You know where all the knives are, and how to twist them. And there is a deadness to self-inspection that deceives. The worst you will ever look is to yourself in the mirror. But I am not talking about criticism. There is always something to criticize, something to change. I am talking about recognition and responsibility: about being able to say, without hesitation or qualification: “Yes, that is mine.”