Spare a moment to consider this apropos holly tree. This is not a young tree, nor a low branch of a great tree; this is the crown of a fallen tree. Last year the weight of snow levered its roots out of the ground and dropped it onto its side branches and the bramble of ligustrum beneath. I had more urgent damage to clear and let it lie, thinking I would return to it when it was dead and soft. But not all its roots were broken. It still circulates; it still lives. Indeed by now all its roots have found their way back into the ground. It can never be as it was; a landscaper would have it removed as a blemish; but I feel sympathy for it. I am willing to give it time to make its adaptations, and to count its prostration as a point of its appeal. It is now more than a tree; it is a tree with a story, as I have told.
Deep in the young, impatient woods an old woman lived in a small cabin, alone except for her dog. She had lived there for a long time. Once the cabin stood in a field. Then she had a husband and children there. But now the husband was buried, the children were far away, and the fields where wheat had grown bore bramble and pine. The last time someone had walked up the path, it was to bring her a puppy. Now the puppy was a dog, and the path only appeared when rain filled it and washed the pine needles away. But the old woman had her high fence, and her rich garden within it, and lived well though her eyes were failing her.
Into the pathless woods, wolves had come. They stalked beyond the fences, but the dog barked his deep bark and kept the posts carefully marked and the wolves left them alone – all the wolves but one. He watched through the fence and saw the woman feed the dog, pet it, sit beside it. He thought that the dog had an easy life. He wanted an easy life like that.
One day a tree fell and made a hole in the fence. The wolf saw his chance. He dragged his matted fur against the brambles until it was straight and smooth and swam in the river until his smell was gone. Then he leapt over the fence while the woman was petting the dog and, as he had seen the dog do, he put his face under his paws and whined. The dog attacked but the old woman – who still had strength – called him off and dragged him inside and shut him in a closet while she befriended the new dog.
Whenever the dog tried to warn her about the wolf she punished him. After a time he gave up warning, and set himself to watching, following the wolf everywhere. The wolf didn’t mind. He was fed and petted and careless. He even took punishment when he had to – he could always leave.
One day the old woman did not come out. The dog cried and the wolf howled at her locked door but she did not answer. After a few days the wolf jumped back through the hole in the fence. The dog stayed while the garden ran to weeds, while he wore thinner and thinner in waiting. He stayed until the brambles wrapped the fences. Then he too jumped through the hole in the fence, out into the woods.
The woods were thick and sharp and he knew nothing of hunting. He nearly starved before he caught something, and nearly starved again, and again, until his instincts were all awakened and he found he could hunt at last. His muscles grew strong and lean, his fur grew thick and matted. He smelled of pine needles and old blood.
Once he looked up from his catch and saw a wolf, then two wolves, then three. They were all around him. He backed off from the kill and the wolves leapt on it – all but one, who sniffed him, then turned, drove the other wolves back, and let the dog eat. When the eating was done the dog left with the wolves, a dog walking as a wolf beside a wolf who had lived as a dog.
Moral: Nature forms strange Eddies at the Shore.
The hardest thing that eyes can do is stare. Precisely because our eyes are such efficient channels of information, the one thing they cannot do is settle passively on a single object. In fact staring is impossible. If the eyes were to stop moving, they would stop seeing; the principle of vision is not focus, but contrast. The possibility of sight depends on the continued tiny, restless motions of the eye called saccades: what the mind sees through the eyes is not the light that falls on them, but the contrasts in light which the saccades gather. It is a physiological fact that to see is to look away.
Attention works on the same principle. The harder you focus your attention, the narrower a channel your attention becomes, toward the asymptote of sleep. The more freedom you give your attention, the broader it becomes, the better it works. When you are rapt your attention is indeed rapt, trapped—not still, but caged; not pinioned to a subject but centered on it, pacing before and around it. All your instincts are for motion. When you look on a strange object, you walk around it. When you look on a beautiful face, you survey it in passes from eye to eye. The first thing an artist learns is not to stare. The first thing a thinker must learn is how to be distracted.
I hate to quote. This is twice perverse: first, because I admire writing heavy in quotations, and second, because quotations are basic to the essay as a form. The essayists of the age of gold (imperial gold, flowing and counted) were habitual quoters. Montaigne quoted so widely from Plutarch and Seneca that they are regarded as the essay's classical models more for the force of Montaigne's evident admiration than the contents of their works. Bacon, second essayist, in his first essay, of Death, quotes—misquotes—Montaigne. The essayists of the age of silver (mirror silver, clear and perspicuous) prefaced their untitled essays with classical quotations—the quotation being the kernel of the essay to follow. The essayists of the age of iron (plate iron, driving and bearing) often simply titled their essays with quotations, reserving the climax of the essay to address it. The essayists of our age of lead (secret lead, soldering microcosmic circuits) seem to think it indecent to admit concentrated literary effects to their essays except in the form of choice quotations.
Still I hate to quote and feel parasitic when I do it. My rejection is not absolute: when I owe a train of thought to a quotation, or when it embodies a thought so forcibly that any paraphrase would only suggest it—then I admit it. And sometimes borrowing a quotation that has famously been used before allows me to suggest a connection without having to dilute an essay with its explicit statement.
The above understates the case for quotation. Nonfiction begins in quotation—quotation is not just a mechanism peculiar to nonfiction, but the very means by which nonfiction split from fiction, the means by which the range of what can be written about has grown and grown. The presocratics, by quoting Homer, declare that they are doing something different from Homer. Socrates and Plato quote Homer and the presocratics; Aristotle quotes Homer, the presocratics, Socrates, and Plato; Zeno quotes Homer and the presocratics and Socrates and Plato and Aristotle, &c. Each new regress of quotation expands what can be written about by a new order of magnitude.
Still, for every good quotation can do, I perceive an equal bad. Literature can be concatenated from quotations—see Burton, Eliot—but if concatenating quotations were enough to make literature, theses would be worth reading. Quotations compress impossibly long arguments; quotations hide chasms with convenient bridges. Quotations let a piece of writing claim a place in a tradition; quotations shroud child thoughts in adult costumes. Quotations prevent duplication of work, save the time wasted in saying what has already been said; quotations prevent the integrity of a work. Quotations serve as indices of large thoughts and subtle experiences, name them when they have no names; quotations protract naming until it passes for understanding. Quotations let reading propel writing to the benefit of both; quotations restrict reading to what can be quoted without embarrassment, and conceal real sources of thoughts with quotable ones.
The worst literary offense of quotation is the same as its scholarly benefit: it corroborates and exemplifies. "See! Someone else thought as I do. Someone else has felt as I do." Very good. You bring proofs with your argument. But let me ask: are you sure that all these proofs don't make your argument less provable? An abstract argument, if it is precise and clear, can always be tested. But the more instances and use cases an argument tries to apply to, the vaguer it becomes by the slight fudging each application requires.
I write less to develop new thoughts than to clear out old ones. Most writers read to write; I write to read. Before I can start new trains of thought I need the old ones off the tracks. A few months ago I called the Ruricolist "trephination"; I meant it. I write for the same reason people allow holes to be drilled in their heads: to let off pressure, to drain superfluity. The image is extreme but then I find writing an extreme habit: intense, unnerving, time-consuming, consuming—yet not only worthwhile but necessary, devoutly necessary. Somewhere in the labyrinth that perplexed me into becoming a writer I lost the connection between writing and quotation; or perhaps in my indirect approach I never made it. Either way, I do not regret the lack.
The xerxes is my private unit of chronometry. One xerxes separates year A and year B such that every adult alive in year A is dead by year B. The xerxes is only an approximation. It excludes outliers—the lapse of a xerxes does not wait on the expiration of the world's last survivor; it excludes anyone whose very survival is newsworthy. A xerxes is concluded once you can look at a picture or a list of names and be confident that all the faces and all the names you see belong to dead men. Setting adulthood at 15 and lifespan at three score and ten, and affording a margin of safety of 1/3, a xerxes rounds down to a period of 70 years.
The xerxes implies another unit, the half-xerxes. The half-xerxes separates year A and year B such that half of all people alive in year B were not yet adults in year A. I set the half-xerxes, with less certainty, at 35 years—with less certainty because wars, plagues, and baby booms too easily throw it off. (Otherwise we could more elegantly rely on the half-xerxes as our primary unit.)
Nothing happens only when it happens. Everything memorable recurs in memory until memory is extinguished. And when that memory is a shared one, like the memories enforced by disaster and strife, it establishes, among those who remember, a secret language of allusions and reminders with power beyond ordinary language. The consequences of this language, even at dark removes, are still in truth the consequences of that event. Thus in order to judge an event, we must look at it three times: first, contemporaneously, so we can judge what it means when everyone remembers it; second, from a half-xerxes, so we can judge what it means once most people no longer remember it; and third, from a xerxes, so we can judge what it means once no one remembers it. And until the third look all judgments remain provisional.
Consider the world wars. Given the double carnage of the trenches and the Spanish Flu, it is plausible to mark a half-xerxes between the end of WWI and the beginning of WWII. And though a connection is difficult to assert it is suggestive that the Soviet Union, instituted in reaction to WWI, collapsed a xerxes after it. In Russia and Europe, where WWII was bloodiest, the half-xerxes of that war perhaps came in 1968. In the US it took the full 35 years, until 1980 and the collapse of the "liberal consensus." (On this pattern I intend to keep my eyes open in 2015, when the xerxes of WWII concludes.)
I take the name of this unit from a story in Herodotus. Xerxes the Persian, king of kings, looked over his army of a million men, the greatest army the world had ever seen, absolutely loyal to him, he the greatest ruler the world had ever known—a million men aimed under his command to the ruin and conquest of obstreperous Greece. But as he sat and saw on his hillside throne, something gave way in his mind. Some inward support, rotted out with secret melancholy, broke and let him fall. Xerxes looked at a million strong, proud, fearless men; but Xerxes saw only time and decay and death. Xerxes (a poor calculator) thought that in a hundred years not one of these men would be alive; and thinking so, Xerxes, king of kings, before his army of a million men—Xerxes wept.
The caterpillar had only been chewing through leaves, chewing and crawling as he had for his whole short life. Chewing and crawling – these were the only things he knew.
Then he felt something he had never felt before. There was a weight on him. He could not move. His legs were frozen. A feeling ran through him, chill and warmth at once, something slipping in between all the segments of his long narrow body. Then the weight was gone. Still he could not move.
Once he could move again he returned to chewing and crawling, for these were the only things he knew. But the leaves tasted strange to him now. They did not satisfy him. Day by day their taste faded, yet day by day he hungered more. He ate and ate until his skin grew tight, but still he was not satisfied. Once, at the end of a leaf, his hunger was so frenzied that he tried to eat the stem. He cried out in desperation and despair.
At his cry the other caterpillars came to ask him what was wrong. He looked at the other caterpillars, fat and happy, slow, stupid crawlers and slow, patient eaters, and burning heat rose in him. He screamed at them and cursed them. “Chew and crawl! Crawl and chew! Leaf after leaf, all the same! Wake up. This is not all there is. This can’t be all there is. You’re all the same! You all want me to be just like you, nothing, to be nothing, to do nothing, just blend in, just hide, just pretend not to exist.”
They stared at him with blank, hurt stares. Their fat bodies waved and jiggled on their little legs. He hated them the more for being hurt by his hate.
“You’re all disgusting. You’re all pathetic. You’re just holding me back with your stupid crawling and your stupid chewing. I won’t do this anymore. I’m leaving. Nobody follow me.”
He left and nobody followed him.
His days were all his own now. He spent them chewing and crawling, but now he could chew and crawl in his own way. He wasn’t like them anymore. Something had happened to him. Something had touched him and set him apart. He had been chosen for something – chosen, him! So all alone he chewed and crawled and waited for the thing to come, waited for his destiny to arrive, the grand destiny he knew was prepared for him.
Sometimes at first he would stop, stop and scream at no one in particular, just to scream out the rage that filled him at all the fat stupid caterpillars and all their stupid chewing and stupid crawling. But now, even as his thinning body began to swell again, his rage was softening. Instead of desperation he felt clarity, and instead of rage, he felt pity. He had been chosen, he knew, but he had not just been chosen – he had been elevated. He was above them in every way. He would look down on the caterpillars to watch their slow mindless chewing and crawling. Sometimes he would laugh, sometimes cry. There was so much more, yet they couldn’t see it. They were so small, so trapped, so limited. He, he alone, was free.
He could feel his destiny coming. It was close now. Chewing and crawling lost their interest for him. In deep shadow his body burned from an inner sun. He paused in long meditative reveries. He could feel the imminence of some great change. He would meet it with acceptance and gratitude, thankful to have been the one chosen, thankful to be the one who woke up. He no longer slept. His dreams and his waking sight fused until it seemed that everything he saw contained everything he could see. The moment was closer now. The moment was here. The clarity, the clarity hurt. The heat, the heat, he seemed to melt. He could not move but he was moving – there was movement – something moved – something stretched and twisted – something gave way –
Sometime later, in jewel colors still slick with caterpillar-stuff, a wasp took flight.
Moral: The God reveals but not as a Reward.
Severity apes wisdom. It looks as wisdom looks, it acts as wisdom acts. But severity is no more a kind of wisdom than fool's gold is a kind of gold. Severity is to wisdom as pedantry is to intelligence. Any quality of mind or dedication of energies that can achieve intelligence can also incur pedantry. Whoever ends up a pedant has missed becoming intelligent, and whoever ends up intelligent has evaded being a pedant. The same relation holds between whoever ends up severe and whoever ends up wise.
Being a pedant is easier than being intelligent. To continue being a pedant only means repeating what you have done before. To continue being intelligent means judging what you have done before. Severity, in the same way, is easier than wisdom. To continue being severe only requires that you go on denying and refusing. To continue being wise requires that you sometimes deny and sometimes accept, sometimes refuse and sometimes permit, according to the good you can do.
Sometimes you must be severe to be wise. Often you must be severe with yourself, must brace or flense yourself. Rarely you must be severe with others, to awaken or correct them. Severity taints trust (no one hugs a cactus twice): the difference between often and rarely is in the impossibility of resenting your own severity, and the certainty of your severity being resented by others. They will resent your severity even when you owe it to—even when they ask it of you. Sometimes you must be severe to be wise; but the wisdom is in the wisdom, not in the severity.
New ideas receive their most complex formulations first. This is most obvious in intellectual domains, where the work that introduces an idea never shows how simple it can be. The first draft, being new, is labored, and being new-fangled, is cautious.
This is less obvious, but still true, in other domains, like mechanics. A late mechanical clock, though compact, looks far more complex than a room-filling medieval clock. But the new idea is not the clock; it is the escapement. The painstaking blacksmith, evaluating materials, working and reworking them, test-fitting and adjusting and re-fitting—his efforts were more complex than the industrial procedures which allowed an escapement to be made by someone who had no idea what one was. Likewise the modern computer looks far more complex, though compact, than the room-filling Cold War computer. But the new idea was not the machine; it was the transistor, and now Shockley's circuit, which took days to build, is printed by the millions in fractions of seconds.
Of course, as an artifact, the industrial clock is far more complex than the medieval, and the post-industrial computer is far more complex than the industrial. But artifacts are not themselves ideas. Indeed the shortest definition of an idea is to distinguish it from the other kinds of constituent thought as the part that gets simpler over time.
Ideas seem to obey a kind of conservation principle, one of complexity. In order for an idea to stand on its own, it must be complex in itself. In order for an idea to be simple, it must be embedded in a complex system. This is easy to understand for clocks and computers—as escapements and circuits get simpler, they get smaller and more fragile, and must be embedded in more complex, larger, more robust objects.
But this appearance is misleading. Consider guns: as the idea of propelling a projectile with expanding gas got simpler, guns did get smaller—a path runs from the gun that destroyed Byzantium's walls to the concealable pistol—but they also got bigger—guns have been built to launch payloads into space. Or consider the internal combustion engine, which powers motor scooters as well as container ships.
This conservation principle holds for all sizes of artifacts, and for all degrees of abstraction. Few people understood the phrase the electrodynamics of moving bodies; most people understand the phrase mass warps space. But the simpler formulation implies an entire profession whose job it is to define exactly what is meant by mass, by warps, and by space.
Let me suggest some practical consequences.
1. By the time an idea has become simple enough to be generally understood it has usually ceased to be independently useful. Sometimes this is tautological: when everyone understands democracy, democracy already exists.
2. Few improvements are due to ideas; most are due to realizations. Someone realizes that step B could be eliminated by an alternative method of step C; someone has the idea that the entire process is wrongheaded. To equate realizations and ideas both neuters useful but limited realizations by turning them into abstractions, and suppresses ideas by simplifying them prematurely. Treating the elimination of step B as an idea is how we get the anti-ideas of management. Losing real ideas among false ones is how once-great company X is bankrupted by startup or foreign competitors whose ideas inevitably turn out to have been screened as babies from company X's torrential bathwater.
3. When an idea is new it may be unclear which part of the initial formulation is the idea. Often you must proceed with no more than a sense that your line of research contains a new idea somewhere in it. And even when the initial formulation is ready for use, use must sometimes be widespread and practical before the idea stands out.
Consider guns again. A submachine gun is a sort of hybrid of the rifle and the pistol. It uses pistol rounds in a rifle-sized frame. Since the gun is relatively heavy and the rounds are relatively low-powered, a single man can control the recoil when the weapon is fired on automatic. (Assault rifles work the same way, using a special class of rounds intermediate between pistol and rifle.) But the first submachine gun—the Thompson, that is, the tommy gun—was not designed with this idea in mind.
One of its inventors had observed in his time on battleships that under the conditions of high pressure in the firing of a naval gun different metals would stick to one another. He called his observation (after himself) the Blish Principle of Metallic Adhesion and patented it as a way of dissipating recoil. In fact what makes recoil manageable is a heavier gun. But not only were the first Thompsons built with Blish's bits of brass in them, they continued to be built this way until the scale of wartime production eliminated the extra step. The gun had been in service for two decades before the idea behind it became clear.
But enough complications. Surely I have given this idea enough complexity to start on.
"Hello? I'm still down here. Open the door. Can you hear me—hey! Put the lights back on! This is a joke, right? Very funny! Open the door! Wait—I know you're down here somewhere. I can here you moving around. That is you, right?"
"Well, yes, we have received the test results. There's really nothing to worry about, sir. The guard? Oh, he's always here. Hospital policy. Let's get this over with. Have you experienced an increase in appetite recently? Have you experienced hardening or discoloration of the fingernails? How about any strange shapes or colors that recur in your dreams? Hmm. Well, no, the problem isn't your eyes exactly. Have you recently been to Africa? South America? Oh, the specialist's on his way, there's nothing to worry about. One more question, and this one may sound a little strange, but I need you to answer it honestly. Sir, when was the last time you defiled something?"
"It's great to meet somebody else who's not afraid of heights. Do you come up here often?"
"Every once in a while. I love the view. It gives me a sense of freedom. Like I could do anything up here. Like I can see the world but it can't see me."
"It's a great view. It's great until you look down, yeah? I mean I'm not afraid of heights, but that a helluva drop."
"Close your eyes for a second. I want to show you something."
The history of technology is the history of human weakness. The rest of history is only the surface: what happens once human weakness has been compensated for, or at least accepted. But most things that happen, happen below the surface. There is another history, a deep history that only technology records.
Consider glasses. Every person you see wearing glasses is another person rescued who, a hundred years ago or less, would simply have lived with bad eyesight. And bad eyesight doesn’t feel like not being able to see; it feels like headaches, tiredness, irritability, helplessness. How many billions of us have lived out uneasy lives in desperation and doubt, all for a trick of the light?
The most significant freedom which artists acquired in the 20th century was not freedom from patronage, but the freedom to assume a public with good eyesight.
For the diffusion of political authority, the gradual rise in test scores, and other trends of the last century which suggest the human race is becoming smarter, the most parsimonious explanation is simply that the human race is seeing better.
But consider something less conspicuous: consider bread and water.
Billions of us having pure water on tap means more than victory over worm and germ. Before pure water the only safe drink was drink. Since civilization began ours is perhaps the first generation to live sober.
And bread. Enriched bread means more than the eclipse of diseases like rickets or scurvy. Consider pregnancy – consider the dietetic demands of scientifically managed pregnancy. Number the nutritional concerns to which a conscientious mother is expected to attend. We are overly cautious, of course, but not in every precept. To be born as a peasant – and most people who have ever been born, were born peasants – was to be born maimed in advance by the neglect of every one of these precepts.
Aristocracy, perhaps, is simply the social order that results when only a small proportion of people can be kept well-fed enough to think clearly. Democracy, perhaps, is simply the social order that results when the majority of human beings are not born a little brain-damaged.
Somewhere in his Anatomy of Melancholy, Burton, in the early 17th century, writes that if lead were indeed poisonous – as some recognized even then – then all the nobility would be poisoned with it, for they all brought their water in by lead pipe. (Hand-worked pipes, mind, not machined.) A century later, his fellow apprentices in printing thought young Ben Franklin laughably fastidious for wearing gloves to set lead type. Suppose he had not.
Then consider our recent century of lead-based paint and leaded gasoline.
This is what it means to be human. Small things, things we can do nothing about, things we do not even recognize as dangerous, undo us before we know we have been harmed. The pipes in our houses, the paint on our cradles, the gas in our parents’ cars, leave us ruined before we are built, and lead stands the silent ruler of a stupefied world.
This weakness is something we work to forget. We cannot always be on guard. We cannot live our lives as though they were fragile and uncertain. We must build and plan. So we imagine that if life was tougher then, it just meant more rigorous selection. When we look backward we do not see weakness. Our forebears stand as straight as we do. Surely, the average human being then was as healthy as the average human being now, when technology coddles our softer stuff.
But look to the mountains. The mountains, their soil washed sterile with rain, have always subjected their residents to malnutrition. Yet there, even as they lived, worked, built, sang, and bred, their thyroids swelled and poked goiters out of their necks.
Human beings are very tough. What doesn't kill us, we get used to.
“Human weakness” is something real; but it is not in sin and not in absurdity. We are not bad; we are not silly. We won, after all. We came into this world as food for the animals we now keep in zoos and preserves and kennels. But we are still weak. We are weak because we are fragile. We can do everything except save ourselves. Our abilities – our wonderful abilities – are so easily prevented, so easily unseated, that it may be done without our noticing. Being born and being fed are enough to destroy us.
A diamond is as hard as anything; edge to edge, it always wins. Yet a single well-placed tap can shatter a diamond. Just because a diamond is hard, the slightest flaw affords the leverage to cleave it through and through. We are all such diamonds. It is because we are strong that we are fragile. We are the houses built on sand: and when the house falls down, it is the house, not the sand, that makes the pity.
Yet though we break, though we break so easily and in so many ways, we remain ourselves. Diamond dust is still diamond. The things our blind, starved, poisoned, crippled forebears made in their darkness and desperation transcend their particular frailties and reach us clear and full of strength. When they wrote, we can read. What they sang, we can hear. What they pictured, we can see. What they made, we can use. What they learned, we can know.
Technology is what repairs weakness, and in doing so lets us see it for the first time for what it always was – weakness. History is the record of what we do despite that weakness. And the third thing – call it culture – is what, as it passes from one generation to the next, combines our strengths, omits our weakness, and represents us to ourselves whole: whole as we should be, whole as we can never be.
The appeal of solitude may be as simple as the dislike of repetition. To be gregarious implies infinite patience for retelling the same anecdote, confessing the same weakness, counting over the same favorites, relating the same background. Identity becomes a matter of performance and habit, not expressed in but being the routine of self-introduction. Just to avoid this explains why people may choose to be solitary; just the time won from having avoided it explains how people can find pleasure in something apparently so unnatural.
But is there a positive definition of solitude—is there something that solitude is? Certainly if one ventures a bracketed solitude, and another commits to a prolonged solitude; if one is solitary by choice, and another is driven to it—each gives a different thing the name of solitude.
And surely what solitude is has changed and is changing? Surely technology is banishing solitude as it banishes loneliness?
Distinguish two measures of solitude: quantity of social interactions, and quantity of people interacted with. Eliminate repetitions. By the first measure the most gregarious of our ancestors was more solitary than the most solitary of us. The lines of communication were so few, so thin, and so uncertain that to pursue them itself required solitude—to write a letter is a solitary act. But by the second measure we are freakishly solitary. So many people once had to be dealt with to do the things we do by mail, message, and machine—so many butchers, bakers, candlestick makers, so many drivers, porters, draysmen, all to do what takes us no more than a few words with a cashier! (And the cashier is sometimes dispensable.) The two measures of solitude do not vary together—to be solitary in both senses is possible and perhaps defines loneliness—but to lessen solitude by one measure is to increase it by the other. Thus we are all solitary.
This sounds like a curse; but in truth it is a homeostasis. Life adjusts to provide us a minimum of solitude as the body adjusts to provide us a minimum of warmth. Solitude is a thing, not a state; it answers an appetite, not a purpose. Something vital, something necessary, something catalytic, some nutriment or vitamin of the mind, something as ambient and replenishing to human beings as light is to plants—this something is found, it falls, everywhere, except where other people obstruct it.
Yet the dullness and rigidity of repetition can be avoided, and the vigor and fecundity of solitude can be protected, without isolation. All it takes is to be all things to all men, which is the same skill as getting along with all sorts of people; an easy thing if you are willing to be led and not to lead in talk, to let people think of you what they want to, and to lie to give simple answers to simple questions when the truth would be obstructively complicated. In this way people can be read almost like books—like old books that fall open to certain pages.
This approach is too habitual with me. Why I write the Ruricolist is uncertain—my reasons change every week—but surely one reason is to take cross-sections of myself without any particular sense of audience. The Ruricolist does not represent me in full; many of my interests go unrepresented here; but here I set the topics, pursue their complications, and claim the right to confuse. In writing about solitude I abandon it. But that is the kind of contradiction that essays live on.
There were mirrors of natural reflections before there were eyes to see; there were signs and similitudes before there were minds to see them. Eyes themselves are but incomplete mirrors, keeping the images that mirrors return. The world before us was full of mirrors, as the world beside us is full of mirrors; but perhaps eyes had never seen something endless until a human being faced mirror to mirror; until a human being adjured into matter that same substantial recess of mirrors in mirrors reflecting inside his own skull; until mind represented mind. We are made of mirrors; perhaps this is why we are so easily trapped by them. It is easier to turn the eyes from glaring in hate or staring in lust than from preening in mirrors. The ancients gazed on clear water and black glass in search of mere shadows of themselves; but we have opened the secret of the silvered mirror. Our backhanded images follow us everywhere. Before we can even speak we are entangled with mirrors. First they show us our selves, then they show us our self-awareness, then our self-awareness of our self-awareness, on and on, back and forth until we are wound up in our selves yet we have no selves without the mirror, until self and mirror are foci of an elliptical orbit around the fact of reflection, until mind and mirror combine into mind—one mind whose parts are all men and all mirrors. Mirrors are our masters; but who minds serving masters who look at us with our own eyes?
Foolish love of self is still more mysterious than foolish love of others. Loving others is patently an adaptive trait. Self-love is neither necessary nor helpful to survival. A species that must raise its young for a decade or more must set strong social bonds, but the conditions of individual survival are the same for human beings as for other animals. The snake, the squid, the scorpion, fight as hard to live as we do, but they do not love at all.
This assumes that loving at all serves some purpose. With questions of behavior such judgments should hesitate. In animals so complex and complexifying as we, the most that can be proved of a behavior is that it is not maladaptive. The capacity to love does not harm the survival of the race. Beyond that, the fact that the brain supports love may be no more significant than the fact that the brain supports solving crossword puzzles.
But the brain does not perform crosswords the way it falls in love. Love lights up the pith of the brain, while crosswords only stir the bark. But this only moves the question from human beings to mammals generally. And if the outcome of a crossword puzzle decided your success in life, if you had bitter rivals in it, if it promised to bring you loyalty and attention, if it could console you and reconcile you to life, if it entailed ecstasy – then a crossword puzzle, too, would illuminate everywhere. But all these things are possible to a human being without love. And as for social cohesion, insects exhibit forms as strong or stronger than ours without any need to love at all.
If love is so mysterious, then why should one object be more mysterious than another? But the symmetry by which you could love yourself is a mystery of its own. How can the self present itself to itself as an object of love? The introspective mind can think about itself only as a sort of mirror image that corresponds to it at every point yet is not it. Of course self-love as narcissism works this way. But selfishness is cognate to narcissism, not collateral: a self-image of sainthood may produce narcissism with selflessness.
Selfish people are not self-centered. They do not pride themselves on their selfishness – they do not even see it. Indeed their most repellent trait is that they resent the selfishness of others without seeing their own, even when it is a double to their own.
The poets are wrong. Foolish love is not blind – love is blind only as the eye is blind, with a blind spot. The beloved may be ugly or stupid or cruel, but the lover who overlooks all these things in one person does not fail to see them in others. Lovers of ugly people do not surround themselves with ugliness; of stupid people, do not surround themselves with stupidity; of cruel people, do not surround themselves with cruelty. But their judgment is not intact. It is always disappointing to meet someone whom you know only through their lover’s description. In these cases, it is shocking.
The impairment of the selfish is the same, another blind spot, only self-directed. But this is only another mystery. How does the blind spot happen? In following the parallels of selfishness and foolish love we avoid the easy and wrong explanations of each. Chemistry, charisma, propinquity, neediness, passive aggression, codependency, pity – these cannot come between you and yourself. They cannot explain the blind spot. Conversely we learn how incurable selfishness is when we compare it to foolish love. You can no more convince someone of the absurdity of their self-regard than you can convince someone of the unworthiness of their beloved. No logic will dispel it, no shock will unseat it, and the more absurd it is, the more intervention will be resented.
This long analogy is the preparation for a brief and severe conclusion: there is no way to prevent selfishness and no way to cure it. Perhaps in refusing to tolerate selfish behavior, in avoiding selfish people, you may nudge some cases away from the brink. But the pit is bottomless and those who fall in cannot be rescued. Their very sin is its own contrapasso, its own poetic justice. They lie in darkness where they eat their own hearts. Leave them there. May it not be one you love.
Nothing is harder to describe in writing than the behavior of a crowd. Something so everyday as how people act when ten of us appear in the same place with the same object of attention, something so affordant for performers, preachers, and politicians, should be accompanied by a large, refined, and subtle vocabulary. Instead we are stuck with a scale of behaviors graduated so coarsely that it is almost useless. A crowd can go wild, roar, applaud, get caught up, be intent, hush, be restless, be tough, be hostile, boo, hiss, jeer, and riot. There are more words, but on investigation they prove to be empty variations.
Crowd—the name itself is almost an abstraction. Its few synonyms only distinguish different venues: a gathering, an assembly, an audience, a congregation, an attendance. Among animals we can distinguish flocks, herds, swarms, pods, colonies, hives, schools, and packs, but among human beings we can only say crowds, crowds, crowds, though the human differences are greater.
(At this point in the essay I consulted a thesaurus, which yielded throng, a contraction for "crowd I don't like", and mob, a contraction for "crowd that doesn't like me." Later I thought of the crush and had to be there, which are promising but undeveloped.)
Language fails, and image fails too: film's cantaloupe-murmuring crowds, its paid extras and vain camera-forward onlookers, are a convention as familiar and as absurd as sound technicians scoring heel clicks to sneakers.
Withal when you hear or think or are tempted to say that language and literature are perfected, that there is nothing left for writers, poets, and translators to do but footnote and allude, remember that there is a hole in language big enough for everything from a picnic to a revolution to fall through. I doubt it is the only one.
Endower Institute researchers today announced the discovery of a completely new kind of number, the "like number." Like numbers are the first new kind of number to be introduced into mathematics since Donald Knuth's "surreal numbers." Surreals were previously considered the most exotic form of numbers, but like numbers, according to lead researcher Dr. Pangloe, are "an even wilder expansion of the conventional idea of what a 'number' is allowed to be."
Unlike most mathematical concepts, "like numbers" were not discovered by a process of abstract reasoning, but through analysis of natural language. In fact, according to Dr. Pangloe, "Most people have an intuitive understanding of like numbers." He blames the failure of mathematicians to study like numbers on "an archaic attachment to the Victorian notion of 'formal' proof. This fetish for 'formality' has blinded so-called mathematicians from embracing what was right in front of them all along."
Like numbers, although they lack familiar numerical properties such as transitivity, associativity, and operability, have a range of new properties such as plausibility, gynormity, and pertinence (or, in some formulations, prurience). Like numbers form an algebraic structure and support all the operations of a conventional field—addition, multiplication, subtraction, and division—but also allow for allusion, revision, recreation, and interpretation, among many other previously unknown operations.
A like number is notated using a new technique called indifferent expressions—similar in form to lambda expressions. As a conventional number is defined in lambdas as a higher-order anonymous function, a like number is defined in indifference as a higher-order inscrutable gesture. Indifferent expressions, however, are much more general in their potential applications than lambda expressions, and methods are now being developed to apply them not only to mathematical entities but to social phenomena and physical objects. "Just a few days ago," said Dr. Pangloe, "one of my students, in my presence, altered the deadline on his paper from three weeks to like a month. I myself have on several occasions reduced—my wife can testify—three, six, even ten drinks to like one. These are just the kind of real-world applications that mathematicians have been allowed to ignore for much too long."
The research will be published in the forthcoming inaugural edition of the Journal of Experimental Mathematics (Endower Institute J-16). This new publication will be edited by the leading members of the controversial X-Math movement. "It's time," said Dr. Pangloe in his office, pointing to a poster behind him with the movement's symbol, a flaming brain, "for mathematicians to leave behind their obsolete elitist claims of a special status among the sciences and embrace more modern, creditable methods."
But then what are the Zemurray Gardens? Did this hard-edged businessman possess an inner domesticity, did he retreat to a flowery sanctuary to soften his heart? Alas, no; the gardens that bear Zemurray's name were planted against his wishes. When he bought 150 acres of pine woods he reserved a certain portion for his home and assigned the rest to timberland. He thought gardening wasteful.
His wife, Sarah, did not agree. While he was away she hired a gardener and began to plant in secret. The soil (I know it well) was not friendly: thin soil, roots thick as carpets or hard and heavy as rocks, dirt so clayey—what was once the silt in the soil of the land above the Pontchartrain is now the land below the Pontchartrain—so clayey, so dense and heavy, that it holds shapes. (I once repaired a wall by packing wet dirt over a framework of sticks; it was meant to be temporary, but it has lasted five years.) They laid paths through the pine woods and lined them with azaleas, thousands of azaleas of every color. She waited, perhaps, until the azaleas were blooming; then she told her husband. I would like to think that beauty moved him; whatever his reason, he spared the garden. Now above board, she enlarged and dramatized her garden: clearings with statuary, a hill, a mirror lake, an island, cypresses, camellias, a bamboo grove.
But even when I saw it the garden still had a sense of secrecy, layer after layer hidden from each other. From the highway without could be seen only pines behind pines; from the house (the gardens remained private, open only when the azaleas were blooming) and the bright garden and sward attached to it, dark paths led away into the woods. The paths were tunnels really, with dirt floors, excavated just wide enough for a jeep, stretching on and on in brilliant hypnotic monotony, color and shade, high interwoven pines swaying and admitting sunlight in waves as they swayed, sunlight on the azaleas.
Azaleas are too little celebrated in literature because they are difficult to evoke. Their flowers are of no particular size—on the order of pennies and buttons; their leaves and stem are plain and spare; and their colors are frustratingly pure—no glossiness, iridescence, variegation or sheen, just pure bright colors straight off the palette. If pigment were a stuff that fell like snow, if it had fallen on the paths here red, here yellow, here orange, and the paths had been cleared by a careless shoveler, who threw drifts of pigment over the bushes beside the path: that was how it looked. And in the shade of the azaleas sparks smoldered, the coralred berries of ardisia, and bits of broken mirror still full of sky, the skyblue of Canterbury bells.
You could—I did—go around and around this way; but you could also turn off towards the lake; or (since someone left the gate open) you could turn off on footpaths through the timberland, and, if you knew how to venture into unfamiliar woods without getting lost, rest your eyes on green and green for a time. But getting back, you could leave the paths for the lake; and stand on the bridges or sit on the island, or turn off into one of the sheltered clearings, and visit with the improbable figures of Actaeon and Diana, having long ago gotten over their bewilderment at finding themselves in so strange a land. Somewhere there was a grove of moso bamboo, thirty feet high. Somewhere there was the cemetery of the first settlers here, men of England.
This garden held gardens, gardens inside gardens, all mutually secret and enclosed. Gardeners speak lightly of rooms; Zemurray Gardens was a labyrinth—not a maze, not urgent or perplexing, but a labyrinth, long, slow, meditative and sacred.
And over everything, the trees swaying. Even when the air was still, the trees were swaying. On my first visit I looked up at the pines swaying and listened to their creaking and thought: If a real storm ever hits this place it will be a disaster. I was right. The storm came. (Yes, that one; I have heard and written the name often enough.) The trees fell and broke, the paths collapsed—it happened to me, how strange that a path could just disappear—the storm came and Zemurray Gardens was destroyed. You can look at pictures of it on Flickr—it is worth doing—but these photographers pictured the wrong things; they gave the wrong idea. But I remember it, in my mind and in my garden. The ardisia is just starting to spread.
The language of the Kptsha-Knr, Qx, was a true isolate, having no relationship to any other known language (although recent efforts have been made to link it to Basque). Qx exhibits many unique features, not least a grammar described by the first researchers as "polyanalytic" and a quality of "neutrality" between speech and nonspeech sounds which allows the language not only to be spoken but also clicked, coughed, sighed, sputtered, swallowed and sneezed.
Once Kptsha-Knr researchers had subjected the university researchers to the surgical liberation necessary for them to pronounce the language, they began collaboration on a dictionary. This thousand page scholarly folio dictionary (Qx declensions and conjugations are mathematically incompressible) would become a bestseller on the strength of a popularized ethnography of the Kptsha written by a visiting journalist, I Saw The Captchas: Among the Backwards Primitive Subhuman Others: As Told by the Dauntless Manly Discoverers Who Penetrated Their Stagnant Isolation and Exposed Their Feeble Barbarism to the Hard Gaze of Civilized Man. (The book is curiously difficult to find in modern libraries and appears to have been removed from a number of bibliographies with white-out.)
A brief "Captcha craze" caught the imagination of the world. For a time it seemed that everyone was doing the Cap Cha Cha, singing Captain Captcha's Cap Chap, or sporting slang straight from the Captcha lexicon—who can forget F. Scott Fitzgerald's immortal short story Time for Knprd or Dorothy Parker's lyric,
Say you'll come up
And we'll brgrp.
But the fad could not last, and the Captchas were forgotten and neglected until finally, in the 1950s, their culture was destroyed by the arrival of rock and roll.
The Captchas might have faded from memory entirely were it not for a film made in the last years of the Captcha craze, Captcha the Flag. An early talkie, the film's wooden acting, hokey script, and "Dialogue-coach voices discussing with gravity the fine points of roasting a chipmunk" (during the scene in which the Nameless Heiress has discovered a half-starved Henry Stark in the woods after her plane crash) endeared it to the hipster subculture of the 1990s. (The film is not available in restoration, due to the difficulty of separating the dialogue in the Captcha scenes from background static.)
Thus when programmers in the early 2000s began discussing the idea of a countermeasure for spam that would be "like the Navajo code, only more so," the Captcha lexicon was the natural choice. And so the Captchas, extinct, unremembered, unmourned, speak once more.
The constituents of the minimum unit of communicable thought must be units of incommunicable thought. I name these evocations, since they are recognized as evocative. The mind cannot make its own evocations; it must collect them as surely as a bird must collect grit for its gizzard. And like a gizzard stone a single evocation may remain in use for a long time, with a valency that internally and unconsciously parallels the external and conscious use of a tool.
Being incommunicable, evocations cannot be defined. Yet they can be traced. What is it that the mind seeks out, hoards, counts over? What mental experiences, without uses of their own, are approached with the same gravity as the most useful tasks? Review the mind's senses: how an image burns onto the mind's eye; how a word echoes and echoes in the mind's ear.
Victims of trauma are at the mercy of recurrence and flashback—at the mercy of the little things, the little fragments of experience, the mere reminders that reel them backwards into memory. This is a warped mirror and retrogradation of a healthy movement of the mind. Reminders drag backwards into memory; evocations lead forward to ideas.
Though evocations are usually single words or images, they need not be. Some evocations are ideas in themselves, constituent to more complex ideas. Whole works of art, whole journeys, whole friendships may be valent as evocations. In certain moods the whole world seems to me the evocation of some superb and singular idea I would lose my soul to enphrase—like an utterer of God's true name, a seraph would circle me to either side, declaring that I had gained the world, and lost the world to come.
A simple example of the use of evocations is at hand. A word catches the attention of a reader. The word is obscure, a little grandiloquent, but it charms him. It has a good rattling rhythm to it, almost a swing. A single stop between two liquids to one side, and a liquid and a sibilant to the other anticipate a euphonious final consonant cluster. Moreover it has a valency that he cannot quite define. Some time later this reader becomes conscious of blogs and the interesting things being done with them. Surely this is an opportunity; but what can he do? He lacks the quickness and lightness for blogging as such. Too often he has nothing to say. So he puts the thought aside until he reads Boswell tell how Johnson wrote out a number of the Rambler with the printer's devil looking on. Then he has an idea: "Of course—not ruricolist—The Ruricolist!"
I did not want to write about this. What needed saying has been said. But there will be no marker for his scattered ashes. As I am a writer I owe him the service of an epitaph, to say: Look stranger, there lived a man, his name was Anthony Peter Rodriguez, Junior, he served his country, he had three children, two grandchildren, he never lost a friend.
Around the time I was born my father built a tower. He was tower-minded, he knew Jung's tower, Yeats', Montaigne's, he stood once on the tower in Jericho, ten thousand years old. He mixed concrete and laid rebar and bonded block until his tower stood three stories tall and bomb shelter strong. Our names and our handprints in the concrete. A few years later, we moved away. I wasn't with him when he missed it, but I went with him to check that it was gone. Diamond saws had cut it apart to open a new owner's view. He had hoped to revisit it when he was ninety. He had hoped it would be a legacy for him. Maybe here I can give him a monument more lasting.
But this is backwards. This pleasure is that of anticipation; and anticipation precedes understanding. In particular cases it is the pleasure that catches and sustains attention, that moves attention on after understanding. In the general case it is the habit of indulging the pleasure in anticipation that leads to the habit of seeking understanding. Anticipation is not a lapse in understanding; understanding is a lapse in anticipation. They intermit one another in an alternation that describes not a circle, but a spiral.
The apparent limit of that spiral would be to understand the world. Even if it is impossible to understand the world it is certainly possible to believe that one understands. We read that near the end of his life Thomas Aquinas, that great understander, experienced something that made him judge his life's work sicut palea—all straw. As the story is told this sounds like despair; but I imagine it as ecstasy. In an instant he passed through the false limit of complete understanding to the true limit of pure anticipation. Understanding follows anticipation, anticipation follows understanding, but not forever; complete understanding is followed by pure anticipation, but nothing follows that. The mystics judge light higher than truth; perhaps this is what they mean by something higher than truth, yet not false. May I be so illuminated.
This is no news. Anyone can feel how difficult it is to recreate the dictionary denotations, let alone the literary connotations, of one language in another. But this is only the beginning of the art of translation, its scales and studies. The real art is not in deciding how to repeat, but how to fill in.
Abstraction is constructive omission; an abstract word is both a something that is named and the impression of a number of somethings, themselves omitted, that together instantiate or imply it. Every language omits differently; and it is by this difference, I think, that language influences thought: the discrepancies of their abstractions mean that certain thoughts are harder to think in some languages than others, because in one they can be alluded to, and in another they must be constructed on the spot. The thoughts may be the same yet the attitude of the thinker towards them may be different. Compare computer languages: it is of little difference to an interpreter whether a function is called from a pre-supplied library, or defined on the spot as a lambda, but it makes a great deal of difference to the programmer.
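The analogy can be made concrete in a small sketch (Python here, purely illustrative; the names are my own, not the essay's). To the interpreter, a pre-supplied function and an on-the-spot lambda are interchangeable; to the programmer, one is an allusion and the other a construction:

```python
from statistics import mean

data = [1, 2, 3, 4]

# Alluded to: the idea of "average" already exists in the library,
# and the programmer need only name it.
avg_alluded = mean(data)

# Constructed on the spot: the same idea, but the programmer must
# build it from parts before it can be used.
avg_constructed = (lambda xs: sum(xs) / len(xs))(data)

assert avg_alluded == avg_constructed == 2.5
```

The two expressions compute the same value; the difference lies entirely in the attitude of the writer toward the thought, which is the essay's point.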
Civilized languages have two genealogies: linguistic and literary. A linguist who says that language has no effect on thought is right in respect of linguistic traits. I do not see that it makes any difference to thought whether the language is gendered or genderless, analytic or agglutinative, nominative-accusative or ergative-absolutive. Thinking is so hard in itself that the general difficulty eclipses the particular difficulties or conveniences of certain languages. But languages do have literary genealogies, and these do shape thought. English apprenticed to Latin and Greek, not Sanskrit or Chinese; to discount this fact as shaping English thought demonstrates an absence of thought. Very few languages have civilized—have literized—themselves. The Old World has Greek, Sanskrit, Chinese, Egyptian and Sumerian. All other languages that have ever been written have done so after or while borrowing, directly or ancestrally, from one of these. (The New World has Nahuatl, but alas, it has no disciples.) All other languages serve some apprenticeship. Afterwards some, like the Romance languages, inherit the family business; some, like English or Japanese, buy out the stock; some, like German or Arabic, steal the plans and build their own versions.
To analyze this phenomenon as a form of domination, a side effect of economic and political power, is not stupid—witness Norman French and English, or Arabic and Persian—but it misses the point. The conquered reshape their languages by translation from their conquerors; but conquerors also reshape their languages by translation from the conquered. Greek came to Rome in the mouths of slaves. One generation of Mongols heard Arabic and Chinese only in cries for mercy; the next whispered them in their bedrooms and gardens. Translation is indeed a convenience, is indeed a political act, but it is also a transmission, an inheritance, a maturation. The old language passes on to the young language something that it must strive and work to contain—simply put, power: power to know, power to understand, power to think.
Literary descent has two vectors: borrowing and poetry. Borrowing is the easier, more common, and usually the first method. Poetry is the harder but better method. Borrowing always leaves something behind. When languages are young, fast, and hungry, they take words and ideas as they need them, in the sense that comes easiest; and they often come to use them in ways they were not meant to be used. In studying classical philosophy, for example, the hardest step is to get rid of English definitions. Stoics were not stoic; Epicureans were not epicurean; apatheia is not apathy, a daimon is not a demon, kosmos is not cosmos, demokrateia is not democracy, &c.
The diction of poetry is remote and patient enough, enough removed from the necessity of application, that it can take the time to compass an idea. Still, it is not a perfect transfer: the idea retains an often inappropriate exoticism. A Greek might agree that beauty is truth; but he would not have learned the lesson from his kitchenware.
Language's limits are unresisting but real. No language is without limits; yet the limits of my language are the limits of my world, not as a wall limits my movement, but as the horizon limits my vision: I cannot see past it, yet I can never run into it.
Wind howls through towers of stone,
And the Queen of Spiders speaks.
"You have found me in my easternmost lair, at the door of the sea.
All spiders know and love me, for I am the Queen of Spiders:
But until now no human eyes have seen me and lived, for none have come with a gift.
I know you bear a question. I am bound to answer.
The eyes of all spiders are as my eyes: all that they see is before me.
The ears of all spiders are as my ears: all that they hear is beside me.
I move and am not seen, I stir and am not heard.
I know every crevice and hollow of the earth.
I know every shadow and secret of the earth.
Ask once and I will answer.
Ask twice and you shall die.
Fool's questions shall meet fool's answers.
Wise questions shall meet answers yet wiser.
All things are written in silk and I read all things that are written.
Speak for I am waiting; ask and be answered."
From Abbas Cucaniensis Historia Regum Orientium.
Avaris, King of Egypt, lord of uncountable riches, dreamt that the god came to him, and told him that he was to possess the whole wealth of the earth, in his palace to be rowed between islands of gems on seas of gold coins. The doors of his palace would be of yellow gold, its windows of clear diamonds, its floors of black onyx. His feet would never touch aught but silk carpets. His water would never flow but into silver pots.
Awakening from his dream, Avaris called together his counselors to ask how the god's decree could be hastened. One after another, his counselors told him that the only way to possess the whole wealth of the world was to conquer the world—all his counselors but one, the youngest and most learned. This counselor said that he could bring Avaris all he desired, if he was but given command over all the merchants of Egypt.
Once the King had given him command over the merchants he called them all together and held forth upon the excellence of the dung beetle. The beetle could be obtained only from Egypt; the beetle was sacred and good luck to possess. Thus as the kingdom grew in prosperity and numbers under the wise and just rule of Avaris, everyone would want to possess a dung beetle. But the more who wanted to possess a dung beetle, the costlier dung beetles would become. Thus the merchants should purchase as many dung beetles as possible while they were reasonably priced, for their worth was sure to increase.
The merchants heeded his advice, which was given in the presence of armed men. Soon all of the beetles in Egypt had been bought up, and the cost of a single dung beetle rose to a king's ransom.
The counselor, proud of his achievement, ordered the merchants to bring their beetles to the court of Avaris. Over the noise of thousands of beetles scratching in their cages the counselor explained to Avaris that since each beetle was now worth a fortune, all together the beetles were now worth more gold than there was gold to spend in the world. Here before Avaris were all the world's riches, just as he had dreamt, just as the god had promised him. Was he not pleased? Would there not be a reward for his devoted counselor?
Avaris, it is said, regarded his counselor in silence for a space, then told him to order the merchants to open their cages and scatter their beetles upon the floor. This the counselor did proudly, for each merchant had adorned his stock of beetles with his mark. Was his King not delighted?
Whereupon Avaris descended from his throne, awful in resplendent silk and gold, his feet shod with sandals of gold. And with sandals of gold he began to stomp upon the beetles. Horror overcame the merchants, who fell to their knees weeping. Some sought with their prostrate bodies to cover their beetles, but the King's guards struck them aside with staves of mahogany bound in silver. The patient and tireless King stomped every beetle in the hall.
All the beetles having been stomped, and all the merchants put in chains, Avaris, his shining greaves still slick with the gore of the beetles, ordered the young counselor to be enslaved in the sewers of the palace, there to roll dung for the rest of his life.
Afterwards he called his counselors together to plan world conquest. But all the merchants' gold had been spent buying dung beetles, and no levy could be made to hire soldiers. Thus ended the lives of the counselors of Avaris; and thus ended the dream of Avaris, cursing the god who had destroyed him.
The sin theory of boredom begins with the observation (pithily expressed by Hume) that life is so far from being a pleasure in itself, that we would rather do anything at all with it than simply live. We seek preoccupation above all other ends. Boredom, like original sin, is an inherited debt that we are always working to discharge, that reasserts itself the moment we relax.
The symptom theory of boredom begins with the observation of animals and tribesmen, who are often lazy, but never bored. Boredom, then, is a symptom of the disease of civilization, of the way civilization warps and overstimulates the mind. Boredom is a sign of bad living, a signal that one is taking life too seriously, thinking too much. The proper treatment is to seek distraction and stupefaction, to shut the mind off.
Both theories are equally artificial. Consider that in most places, for most of history, the alternatives to boredom, both preoccupation and distraction, were beyond control. Harvests and wars happened when they happened, things broke when they broke or had to be made when someone needed them. Occasions of all kinds, feasts, bouts, revelries, holidays, followed the seasons. To be aware of boredom as a remediable condition belonged to the few, to rulers, warriors, priests, scholars. Though not a luxury itself, like luxury it presupposes general prosperity. (Look at Prince Hamlet, whose boredom was to a sixteenth-century audience a sign of his greatness, a princely attribute lifting him above common men.)
Boredom was always real for everyone, but went unnoticed, because it could not be changed. That innocence is lost to us; we are cursed to be bored and know we are bored. That we must theorize boredom distinguishes us as moderns almost as that we must theorize mortality distinguishes us as human beings.
I subscribe to the sin theory of boredom. I have been bored, but I heartily repent both occasions. I would find them interesting now. Being this kind of person places me on a losing side—one losing so badly that the winning side has yet to notice that there are two sides. The sin theory and the symptom theory are not equally matched; the sin theory makes converts where boredom reigns and must be consciously resisted; the symptom theory makes converts where remedies for boredom are easy and cheap. And remedies for boredom surround us.
I think the sin theory was once predominant; I think TV defeated it. But even as TV fades its victory endures. You may have heard the thesis that the edifices of Web 2.0 represent the liberation of a cognitive surplus previously absorbed by TV. But it would be silly to expect that the culture, the habits, the state of mind of TV watching would disappear because the TVs are shutting off. TV has had generations to train the preponderance of human beings to fear being bored. The TV watchers turned from screen to monitor, moved from sofa to desk chair, but they remained what TV had made them, afraid of boredom, not ashamed of it, and they remade the net accordingly, turned everything social and turned everything social trivial. And because the net is teleologically compelled to absorb other media, out to the limits of culture, the TV watchers, in shaping the net, are performing all that earlier generations feared about the effects of TV on culture, catabolizing the achievements of centuries in a space of years, all so rapidly that it seems a natural phenomenon.
Note that I do not condemn any of the possibilities and powers of Web 2.0, and would neither propose nor assent to abolishing them or withdrawing from them. Rather I want it to go on, I want to see Web 3.0, I want to see a semantic web and lifelogging and all the rest. I want everything to be tried. But I reserve the right to condemn how it is being used.
The notion that the medium is the message, that new media compel new habits of thought—this slogan of progress is itself obsolete. It obtains only when new media are spaced out generation to generation and can raise their own. But when new media arise and displace the old decade to decade or year to year, we carry over the habits and attitudes of each into the next, in a way that warps its development and frustrates its potential.
But this is not an apocalypse. There are still two kinds of people, those who are ashamed of being bored, who think boredom a sin, and those who are afraid of being bored, who think boredom a symptom; and there always will be. The present victory is too absolute to last. Generations grow up taking the accomplishments of previous generations for granted. The very universality of this success dooms it; in time to swoon over collaboration and community will seem as absurd as it now seems absurd to swoon over industry and mass transit.
No trend goes on forever; nothing stays cool forever. Here is my great disquiet and doubt about the future. How institutions based on salaries and hierarchies continue to function after they cease to be cool is obvious. But what happens to institutions based on collaboration and community when they cease to be cool?
Count over utopias and dystopias. Do you live in William Morris's News from Nowhere, in Edward Bellamy's Looking Backward, in B.F. Skinner's Walden Two? Do you live in Aldous Huxley's England? In George Orwell's? All of these were reasonable extrapolations of existing social trends; but social trends do not extrapolate. The face of the future is like one of those optical illusions where the outline of a vase is also the profile of an old woman or the portrait of a young woman is also the shadow of a skull. Foreground and background always change places. The two theories of boredom are the limits of a pendulum; and the pendulum is swinging still.
Because the dog is barking just to bark
(I've checked it twice—there's nothing I can see.)
I tell myself it's dogs that made the world.
Outside the hungry night with teeth and claws
Inside the world, with space for human things.
My nails are short because a dog's are long;
My teeth are dull because a dog's are sharp.
We shoot and shine, we spray and fence it in,
We pierce the lucid dark, we switch it off,
But dogs still keep their watch along the night.
The dogs came first, before the walls and roofs,
But cities need them not. The useless dogs
Of Moscow, idle, learn to waste their time.
My dog knows none of this. She keeps the faith.
She knows her work, she does as she was bred,
Makes way for life. The dog will have to learn.
I cannot live like this. But better this
Than nights when no dog barks, and certain sounds—
Sounds faint and dryly rasping in my ears
Drift in across a hanging-open door.
Moral paradox. "Can I talk to you a minute? The cops say you were the last guy to see my brother—I mean, he was my brother, you were up there on the bridge with him. Did he, like, say anything to you? I wish I knew what he was thinking. I mean, everybody knew he was depressed—ever since the infection he just kept putting on weight, his glands don't work right—but he said he was OK with the operation, we thought there was some hope finally. I just don't get it. Right in front of the trolley. Do you think—somebody said to me maybe he thought he could stop the trolley—no, never mind. That's stupid. I mean, he was an engineer—he would have known, tons of metal, the trolley would go right through him and hit the other people anyway. Oh, I'm sorry, man, I'm rambling, you had to see that, all those people, I get it, you can't talk right now. Thanks for listening. Look, let me give you my number. If you ever need to talk, I'm here for you. Okay, we'll talk later. Thanks. You're a great guy."
But the society of human beings, though genetically driven, is not genetically predictable. There is an adventitious variation in how a society forms, as there is epigenetic variation in an individual. A single human being contains the certainty of a society, but not the form of that society.
We cannot establish society on a rational basis because society exists to serve an unreasoning need. Imposing reason on the substructure of a society only weakens its differentiating superstructure, imploding society to one of its basic forms.
What are these basic forms? The most basic division I can find is twofold: rule by the old, and rule by the young.
Rule by the old suits small populations. Thus we find tribes and villages with elders, and the most ancient city-states constituted with Senates. Paradoxically, it is by far the more stable of the two forms, yet it is not the ground state. When it fails, government by the old collapses into government by the young; and it is so much harder for a large population to attain government by the old that it arrives there only as an heir to government by youth.
Neither rulers nor ruled are immortal. This fact is not without political significance. People age: this is the advantage of rule by the old over rule by the young. Those who attain power old cannot expect to keep it long, so they tolerate the division of responsibilities requisite to a smooth succession.
But those who attain power young can never risk losing it. In rule by the young the rulers are not usually themselves young. The difference is that in rule by the old, age is the title to power; while in rule by the young, it is youth, and the deeds and quality of youth.
Rule by the old is the base class of the kinds of societies we call tribes, and of ancient republics; rule by the young is the base class of societies we call feudal, and of crime we call organized. At the top of such pyramidal systems the king or boss is probably much older than his vassals or boys at the bottom; but his authority derives from how well he understands and can sway the youth of his comitatus (or his posse).
Rule by the young is perishable; rule by the young lasts only as long as it takes to grow old in power. Of course growing old in power is the trick. Rule by the old, left alone, lasts forever; some extant specimens are older than history. But few societies are left alone forever.
It is in competition with other societies that the rule of the old shows its weakness; and war is the exemplary competition. Where the young rule, there is never any difficulty in finding and empowering a competent military commander. Ability trumps respect. But where the old rule, authority and ability are opposites, because authority is reserved for those least able to abuse it.
Accordingly we see that when a militant rule by the young arises, in its aftermath, even when it loses, all the governments involved go over to the young. It is like rabies: those who try to restrain the first victim get bitten; they all go mad soon.
This, of course, applies only to civilized countries: there is no difference in authority and ability where there is no such thing as tactics. But consider that the humiliation of Carthage brought about the rule of youth that set Hannibal in command; and that Rome was forced to answer with Scipio was the beginning of the end of the Republic, once his glory set him above the law.
The same competition appears, though more subtly, in commerce, culture, and discovery.
Of the two forms, I would rather live under the rule of the old; I would rather rule by the rule of the young. So ruled, I could plan for the future; so ruling, I could decide it.
Rule by the old is the default form of government, the one adopted when there is no basis for choice; rule by the young is the ground state, the one that comes into effect when no government at all seems possible.
(Thus I regard anarchists, libertarians, &c., as being on the side of the young; they call what they aim for free trade or coöperation, but what they mean is freedom of action for the young, from a government that they despise as belonging to the old. But freedom of action is the same thing as government. If I can hurt you, and you cannot do anything about it, then I rule you.)
Can there be government that is both stable and active? Insofar as rule by the old correlates to aristocracy, and rule by the young correlates to monarchy, the answer after all this fuss would seem obvious: democracy.
Note, however, that these two forms of rule are not classes to one or the other of which all governments must belong. They are elemental forms of government. In classification they serve as poles, to one or the other of which a real government sometimes draws nearer; in analysis they are recapitulated and combined at different levels. Any real government—certainly any modern government, on a national scale—is really a hierarchy and network of governments, in which governments both fit inside and parallel other governments.
I want to resist as strongly as I can the notion that there is some simple nosology of government. The threefold division we blame on Aristotle—monarchy, aristocracy, isonomy (democracy)—what use is it? Even if we assign each form its evil twin—tyranny, oligarchy, democracy (ochlocracy)—what does the choice mean, except approval or disapproval? What structural features are different between each twin?
When Aristotle speaks of democracy, he means election by lottery; when he speaks of aristocracy, he means nearly what we mean by meritocracy; what he means by a monarch is more like what we would call a political boss, than the throne-sitters the word king calls up. And this is the best classification we have!
A constitution does more than fix power relations between classes. A constitution—even one that does not work as its written prospectus suggests—is a piece of social machinery. Once people have the leisure to propose them, an artistic diversity of forms of government can be designed and practiced. People can govern themselves by as many schemes of constitutional machinery as they can invent; they may make decisions according to the will of the king or the voice of the people; they can consult the innards of ravens, the shapes of clouds, the pips of dice, the whisperings of prophetesses, the opinions of economists. As long as everyone believes in it, as long as nothing unexpected challenges it, if it works, it works. This is the first kind of societal contract: like a corporate charter, it defines how things work while things work.
But sometimes things stop working. Plague strikes, you lose the war, rivals outspend you, your markets attempt suicide. When no procedure answers to the problem, then the old and young are heard from. Either the old among the powerful combine to keep things from going wrong; or things go wrong, and the young find themselves making decisions.
Here we find the second kind of societal contract. It has only three possible forms.
The first is the contract between old and young, while both have power. The old agree not to seize power from the constitutional forms; the young agree not to accept power when the rabble offers it to them, except in the name of the constituted government as a whole.
The second is the contract between the old and young, after the old take power; promising that when the elite needs renewal, they shall be the ones coöpted.
The third is the contract that the young offer the old when power falls to them: to retain them for their advice, and not to let the rabble blame and destroy them.
Consider the French Revolution, when all three contracts were offered but fell through. The first, when Louis fled, and the Jacobins and Girondins accepted the loyalty of the mob and the bourgeois, respectively; the second, when the Girondins tried to exclude the Jacobins; the third, when the Jacobins recruited the rabble into the Terror.
It is very easy to write a constitution that works; any lawyer can do it. (Writing constitutions used to be a hobby of mine.) But it is very hard to write a constitution that fails gracefully. Fortunately, there is a quick test: the longer the written form of the constitution, the more untrustworthy it is. The English constitution, at zero words, has lasted almost a millennium; the American, at ~4500 words, has lasted over two centuries. The Soviet constitution came in at ~13 300 words—a work of fiction of nearly novel length. The Chinese at ~14 000 strikes me as dubious. The length of the recently defeated EU constitution—about 150 000 words (can that be right?)—is perverse. 1
Consider any common business contract: the longer it is, the more detailed the terms, the more ways the contract can bend without breaking, and the less it should be trusted. The longer the contract, the less relevant it is—witness unreadable software EULAs, whose terms are not so much unenforceable as fallacious. When nothing is certain, the only contract that holds is the handshake.
Hippocrates taught that medicine has two parts, diagnosis and prognosis. Prognosis is the showmanlike part. The purpose of giving a disease a name and predicting to the patient its future course is not to inform the doctor, but to impress the patient. Diagnosis is the subtle and difficult part: the ability to trace the course of the disease so far, to discover and remove its causes, to recognize and avoid its dangers. Diagnosis is how the doctor finds a cure; prognosis is what makes the patient agree to it. Most political and sociological thinking is prognostic. I am trying to see how a diagnostic approach might work.
1 Figures for the US and Chinese constitutions include amendments. The figure for the EU constitution derives from an automatic conversion (pdftotext | wc -w) and may be inaccurate.
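The footnote's automatic conversion is an ordinary shell pipeline. A minimal sketch, assuming `pdftotext` (from the poppler-utils package) is installed; the PDF filename here is hypothetical:

```shell
# The footnote's conversion: extract a PDF's text to stdout ("-"),
# then count whitespace-separated words. The filename is hypothetical,
# so the step is guarded by a file-existence check.
if [ -f eu-constitution.pdf ]; then
  pdftotext eu-constitution.pdf - | wc -w
fi

# wc -w by itself simply counts whitespace-separated tokens:
printf 'We the People of the United States' | wc -w   # prints 7
```

Such counts are rough: `pdftotext` extraction can split or join words at hyphens and columns, which is why the figure "may be inaccurate."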
Current popular writing works by defending oversimplifications with caricatures. You will read, for example, that creativity now stands revealed as inherently collaborative and social, over against an outdated and misguided nineteenth century—no, Victorian—no, Romantic (they are interchangeable for the purpose) idea of Genius inexplicably and mystically inspired with utterly irreducible, unaccountable, and unanswerable Originality.
You will hear quoted (from whom, I don't know) "Creativity is the art of concealing sources." This sounds awful to our ears; it stimulates our reflexes: concealment—hypocrisy! Expose it! And so we embrace a doctrine of synthetic, social creativity defined only as the opposite to a something that no one ever really believed.
Of course creativity really is synthetic. A mass of knowledge has inertia. The more knowledge is accumulated, the more it is all alive; by concentration and fermentation it becomes fertile. It is true, with reservations, that creativity cannot be independent. Every act of creation is fed by a thousand buried mingling rivers. Even the springs of the desert rise by distant rainfalls. Mere self-expression cannot be original because nothing we can create of ourselves can be as dear or as meaningful to us as the things that we have experienced, that have shaped us. New and awkward is just awkward; old but well done is just well done. For a human being there should be a pleasure in seeing anything done well, with gravity and devotion: whittling or whistling, playing a toccata, laying out a garden, writing a program, spinning a yo-yo, engineering an industrial process, writing a constitution, inventing a language. The value of originality does not occur independently.
The caricature is absurd, but not new. Consider Swift, who maligned early Romanticism by comparing its writers to the spider, spinning flimsy pale cobwebs from her own substance, over against the bee, who patiently collects nectar from a hundred flowers and distills it to thick golden honey. Our idea of social creativity matches this caricature, only taking the part of the beehive, instead of the bee. But this will not do. The vomit of bees and the spinneret-slurry of spiders are alike products of digestion; one happens to be more pleasant to observe than the other. But I have seen golden spiders spin golden webs.
The spider conceals her sources; from the look of her web you cannot tell what meal she has turned to silk. The bee flavors his honey according to the kind of flower he feeds from. But compare their work. The bee turns something pleasant in one way into something pleasant in another way; the spider turns something harmful into something useful, even sometimes beautiful—those golden webs in the morning sun glisten like blossoms.
But enough analogy. The architecture of social and participatory creativity is defined in its legal instruments, the GPL, Creative Commons, &c., as one based in the preservation of attribution. More than a legal reality, this becomes a moral principle. Young people, at least, regard themselves as being at liberty to use any kind of created material—pictures, songs, characters—where and as they please, expecting that their makers, even if they disapprove of the particular expression, will approve of the idea of reuse as a form of promotion to their ultimate financial benefit. So it may indeed be: but note the presumption that attribution is all the control a maker deserves over their work. And note that even where materials are drawn from the public domain and no legal necessity operates, attribution as a moral principle holds.
But attribution is more than courtesy: to have one's attributions in order is the passport of good work, the condition of its admittance to critical consideration.
The drawbacks of this system are two: it excludes personal experience; and it narrows the range of influences it is wise to receive, to the range of influences it is wise to admit to. The difficulty in using personal experience is that precisely the reaction that a criticism of originality most values—"Where did that come from?"—is the reaction that a criticism of attribution most deprecates, because it requires that question to be answered before criticism can begin. When the naming of influences becomes a public act, the choice of influences obeys the necessities of signaling that all public acts entail—some are in fashion, some are out of fashion, some are pedantic, some are pretentious, some are contemptible. Before the work even begins the selection of influences becomes the first move in the tactics of presenting the work. Because you must expect your work to be tasted with the intent of discerning its influences you must collect them out in the open, under the sunlight. What moves in the close and the dark is off-limits.
The obverse of this problem of narrowed influences is the impossibility of discrimination of influences. Once what is permissible has been agreed upon, to ignore some part of that range seems capricious and arrogant. You must take it all seriously. In education, conversation, and manners, it is a virtue to try to take everyone and everything seriously. As you overcome the instinct to scoff at what is unfamiliar or distasteful, you become more a thinking individual and less an accident of genes and community. And if your ambitions are essayistic or critical, you can stop there. But when you sit down to create something—art, music, literature, science—then you must choose: whom do you take seriously? If you cannot choose, you cannot act. You cannot have Leonardo and Warhol, Bach and Glass, Homer and Joyce, Cantor and Kronecker, Witten and Penrose, Gould and Dawkins. You may reject without disrespect: but you must choose. Where one is old and one is new, one must be obsolete, and one modern, or one humane, the other a fad or disease. When both are old one is classic, one is dated or irrelevant. When both are new, one is avant-garde, and one is bourgeois; or one is navel-gazing, the other is world-engaging. You do not even have to always make the same choice: but you must always chose. If you fail to choose, if you cultivate eclecticism to the point of indecision and deference to the point of impotence, you condemn yourself to the insubstantiation of Buridan's ass, which in the thought experiment starves for being unable to choose between two identical piles of hay. I love an army of writers for their particular excellences, and will defend them in entirety for the sake of those excellences. But when I sit down to write there are perhaps a half-dozen writers whom I can take seriously; everyone else seems ridiculous. I only demur that six is probably too many.
Consider the "mash-up." It is, defying the dictionary agon of creativity and criticism, a creative criticism; juxtapositions are not random, but either analogical or contrastive. If two analogical things are comparable, they supply understanding of each other exactly as a written analogy would. If contrastive they heighten the contrast either to absurdity or poignancy. As an art form the mash-up is not an uncalculating jeu d'esprit, not afterthought or diversion; it is tendentious and didactic. It is a popular pastime as politics are: it gives people a chance to opine, and lends to the opinion the sophistication of the underlying elements, as including Democracy or Justice in an assertion lends it the sophistication of the philosophies the words represent.
Mash-ups are not nearly as unpredictable as they would be if they were primarily creative. Nothing technically prevents mixing parts of a dozen or a hundred songs and movies, but more than two of each would be out of order, because if it were done well the parts used would meld and the critical tension would be absent.
Those most likely to disagree with this are not the makers of mash-ups but academics with the odd, pseudo-ethnographic habit of taking Internet communities and fads far more seriously than they take themselves, of concentrating on the extremes and the fringes and ignoring the consensuses that these communities have about themselves (compare fanfiction, for example). They also are given to confounding creativity with informality—you will hear dialects, for instance, praised as "creative," as if Grimm's Law were an expression of the creativity of German speakers. Dialects indeed require creativity in the sense that they might have been otherwise; but they are not themselves creations because they had to be something and might just as easily have been something else. Creation is by definition not inevitable.
This division between creativity and creation is, in its effects and its attractions, not unlike the division between sex and reproduction. Creativity, as such, is delightful, relaxing, and consoling. But when it results in creations, these attractions disappear. Your creation preoccupies and distracts you, disrupts your sleep, fills you with doubts. Like any offspring it warps you in its gestation, enslaves you in its infancy, and grows by the life it takes from you. To nourish and raise it takes strength and time to spare. Creativity is a disposition, a faculty, a gift; creation is a vocation, a devotion, a discipline. It is absorbing and racking. One creation does not easily follow another, and too many in succession, or at once, can break a constitution and unmake a home. Even once it has some substance and independence one is weighted with an interminable responsibility for it, to find the right place and the right friends for it, to see it sent out into the world. Even then you are left exhausted, restless, and anxious. You can always grow, but you cannot always bear fruit, at least not the same kind. Even the mind needs cover crops, needs seasons to thicken its sap and sink its roots. This is not to reprove those who seek the pleasure without the responsibility; only to observe that one who, by being creative, presumes to know what it takes to create something, is a damn fool.