Departments

Constructed languages

In life, we are the servants of language. Words are all we leave behind us. What substances we leave are only significant by the names they receive. It is the so-named Works of the Romans, not any work alone, that inspires wonder—and it is only the name Work which causes us to regard the aqueduct at Nîmes as more wonderful and more attention-worthy, more sympathetic, than any arch of weatherworn stone in the Badlands of Utah. The name of Apelles the painter was a byword for excellence in his art for a thousand years after the last man who had seen his pictures was dead.

Language is all that we have in common. What we make are only words; what we leave, are only names. The side effect of world-spanning intelligence is that each intellect becomes its own world—a principle compelling diversity, which only the necessity of communication restrains from dividing and ruinating our species. Therefore, what can be said in any language can always be said in another—because while we are under no obligation to be able to communicate a thought, what can be communicated to anyone, can be communicated to everyone. Think of how strange it is, that only because you both speak the same language, you can communicate with someone separated from you by decades of age, by country and climate, by sex, by class, by way of life. And think how absurd it is, that someone of your age, sharing your background and your circumstances, cannot communicate across that shorter distance, if you do not share a language. Yet, it is possible to know things in one language which you do not know in another: not because they cannot be said in both; but because translation is a kind of alchemy, where solve must precede coagula. What you take in whole need not be accessible to you in another language, even if you speak them both, until you have taken it apart in one language, to put it back together in another.

What then could you not learn, by creating a language? Let the language of mathematics bear witness. But my concern for now is with languages created for pleasure, not for purpose.

Tolkien is the master—the Old Master—of the artistically constructed language. Much of the appeal of reading Tolkien is to discover things that he knew, in silver-quick Quenya and in Sindarin's rushy breezes, that he did not quite know in English, and could not always translate. Indeed, On Fairy-Stories has the air of a poor translation from a patois of Elvish languages inside his head. And A Elbereth Gilthoniel silivren penna míriel teaches you something without having to see a translation at all; but you can feel that you know it, whatever it means, under branches screening stars.

Tolkien is sometimes derided as pseudo-Biblical in his diction; but this is simply wrong. The peculiar cadence of the King James Version is imparted by a plangent alternation between etymologically disparate English synonyms—which misrepresents the straightforwardness of the Hebrew. But Tolkien is etymologically almost pure: page after page, he goes on in a kind of alternate or underground English, the revenant language of Chaucer, the sleek hull of maiden-voyaging English before it was barnacled with borrowings from Latin and Greek. This language—a constructed language of a sort, purely by selection—is, before even myth and archetype, the deepest source and means of his power.

Some purposes require their own languages. Mature poetry has its own grammar; vital religions employ their own dialects, even their own languages (and a wise missionary does not translate everything); and every profession must find its own language, to ennoble the commonplace or to make commonplace the extraordinary. And every language supports another, floating language, of idioms and proverbs—one which can sometimes be carried over whole into new waters, as Erasmus did in his Adagia, which restored to the Renaissance the floating language of the ancient world.

Art, insofar as it is mimetic, is still only sometimes the recapitulation of a natural process. As often, artistry is the power to throw a natural process into reverse. A picture suggests a story; a title finds a poem; a stain on a wall evokes a picture. The constructor of a language only begins by finding, in the space between real languages, the pleasing or striking form of a new and unheard language. The art is to evolve, from this shadow, the succession of necessities comprising the thinkable history that would have formed that language—that would have learned all that that language knows.

Internet or library

Research on the Internet is a meal made of cake and caviar—you may enjoy it, but you cannot live on it. For food to live on, you must go to a library. Minds that live off the Internet acquire a distinctive and pathognomonic flabbiness—the combination of diffidence about facts, with passionate certainty in politics.

The Internet breaks the book's proportion of data and information. It is very common, perhaps the rule, for everything on the Internet on a topic—so many thousands of sites, summing to gigabytes—to be drawn from the information in one book. A thousand sites may provide only so many tertiary paraphrases and plagiarisms of an original secondary source, itself an oversimplified popularization. How often does a search bring you only page after page of eyelash-singeing agglomerations of animations and blinking text, or stylish compositions worthy of some old typesetter's catalog, all passing around the same threadbare, context-free and unsourced statistics, all retelling the same questionable anecdote? Most of the best of what the Internet offers on any non-journalistic topic, is on the level of a good children's book—just enough to give you bearings; enough to let you ask intelligent questions, but not firm enough to credibly answer a question with. This is the cake.

The rest of what the Internet has to say about something tends to be astonishingly obscure. You are interested in the occult? Why, then, here is the Alchemy Library. Read Trithemius, study an abstract of Picatrix. Or here is the Twilight Grotto. Study Agrippa, Bruno, the keys of Solomon. Agrippa, they say, had a spirit tied to the collar of his dog; Paracelsus had one in the pommel of his sword. You have the advantage of them—you need not live as a pilgrim, wandering from monastery to monastery, ransacking Europe for books as they did. You may, with Google's help, set out ignorant in the morning; and by evening, have at least a minor spirit inhabiting your cell phone, whereby you may produce static at will, or induce baldness in telemarketers. These two sites are remarkable achievements, labors to whose makers I am profoundly grateful. They exemplify the attraction of the Internet. I might, a century ago, have spent a lifetime and several fortunes in pursuit of all the books they make effortlessly available to me. They are of enormous value, though of weightless, portable stuff—caviar.

This range of materials, from broad introduction to narrow trivia, creates an illusion of depth—but the middle is empty. There are gulfs which the Internet will not help you cross. Consider programming—where, if the Internet can suffice anywhere, it ought to excel. How many tutorials will impart a little JavaScript, get you tinkering with PHP? But there is the same gulf here as everywhere else. Programmers not only read books, they have classics—Knuth's Art of Computer Programming, Kernighan and Ritchie's The C Programming Language, &c.—which form necessary steps in the education of a serious programmer. Stop and consider this—it is more than ironic. It means not only that the Internet is not generally sufficient, but that it is not even self-sustaining. It cannot even feed its own.

At some point in any life of study, diverse perspectives become a distraction, and obscure facts become trivia. This turning away is the sign of intellectual hunger; and to fill the mind, you must turn off the computer, you must shut out the world and cloister yourself. Diverse interest alone makes a sciolist; obscure information alone makes a pedant; but a scholar is formed in the library, out of deep and protracted thinking. Yes, a book is a companion; but it is not an interlocutor; it is a guide, like Dante's Virgil in Hell, from doubt, through danger, into light.

All of this might be obvious if we did not confuse education with schooling. Still, I hope most people understand that a good student is one who would, and does, learn without a teacher—one who reads. Reading multiplies schooling; and while classes may profitably shape and direct and compel one's intellectual history, its substance—if it is to have one—is what one learns on one's own.

This is not because of the immaturity of the Internet. Indeed, as the Internet matures, it ceases to rival the library. Everything is increasingly commented upon, annotated, rated and heckled—which has the effect of reading in a café; possible, but showing more interest in coffee and talk, than in reading. Hierarchy increasingly gives way to unlayered tagging—which makes it easier to pursue one's interests, but conversely makes preconceived interests less flexible—is it progress in cookery if wherever you go, to any restaurant anywhere in the world, you can eat a hamburger, just the way you like it back home?

I love the Internet. I need the Internet. I would be diminished by living without the Internet. But the Internet is not the library of the future. The Internet is becoming ever more its own medium, something new and stupendous. The Internet is not trying to become a book; it is trying to become a continuous correspondence, a layered and multivalent conversation, a many-handed game, an apotheosis of the social aspect of thinking—but not all thinking is social, and for private thinking, the Internet is not only unhelpful, but poisonous. It turns out, now that we are arriving in the future, that the library of the future, is the library.

The Wikipedia forbids the addition of original research; but that is an enforceable policy, only because the whole atmosphere of the Internet is already so much against it. Much is made of the rise of sites like ArXiv or PLoS, and of their increase in importance at the expense of print journals; but though these take advantage of the capacities of the Internet, they do not belong to it. It should command attention how strange an accommodation they are—that writing by and for the most technically sophisticated audiences possible, written on computers and distributed exclusively over the Internet, is still formatted in order to be printed and read on paper. Discoveries asserted in markup are, with rare and casual exceptions, the work of cranks. Hypertext, as the Wikipedia shows, is a fine way to knit together existing knowledge—but to add something to that knowledge, or to test knowledge, or to find perspective on it, requires the linearity which can be offered perfectly by paper, but which the screen must hide itself to imitate.

Will new technology change this? Consider music. Music did not have to wait for the iPod to get off of vinyl. Each new, incrementally superior technology—8-track tapes, cassettes, CDs—met with a market. But, despite many attempts, nothing has seriously competed with books, at delivering books as such. I have lost count of how many ebook readers have come and gone without summoning into being a market. Gutenberg and Google Books open up books to be searched; but despite all that they make available, the percentage of time on the Internet devoted to reading through books, must still be negligible.

What are the disadvantages of books which technology offers to address? They have weight. They cannot be searched. But, in practice, no one seems to care about these things. Novels do not need indices. Not all books aspire to become encyclopedias—they are more than raw materials, they are things in themselves, where the ability to jump to a fact you need, omits its progressively established context. And the ability to carry 50 books in my pocket does not give me time to read them. The only usable form of ebook—one which not only has met with acceptance, but has been around, and successful, for decades—is the audio book.

Enthusiasm for the discontinuous and egalitarian formation and organization of knowledge—for the network and the cross-sectioning search—is good. It gives the power to approach the world in a new way; we do not know what we may yet find. But the network perspective is not complete. Linearity and hierarchy are not intellectual original sins. There are real lines, and real hierarchies, and real uses for both approaches. The line brings speed—there is power in separation. The road separates us from nature, but finally lets us see more of it. Hierarchy brings confidence: the mechanism of memory is a network, but its operation is hierarchical, always subordinating the abstract and universal to the concrete and particular. I suspect that, in any system, the line and the network must always be found in vital alternation.

To show that a book serves certain ends better than the Internet is, by itself, only enough to justify books as a luxury item. But my concern is for libraries: not the kind that adorn mansions at a stage in affluence a little after a pool, and a little before a garden; but public libraries, the kind where flimsier books have to be armored by special bindings against use and abuse—by impecunious scholars and cheap students; by parsimonious old men looking back, and grubby-fingered children looking forward and up and down and sideways; by the shy seeking connection and communion, and by the face-addled and smalltalk-stupefied seeking respite.

The real justification required is not intellectual, but economic. So then, books are good; but can we justify the expense of libraries, if we can substitute in the Internet something not quite as good, but good enough?

The production of books is not a problem. It must be hardly a footnote to the paper industry. As long as we have drywall and wallpaper, paper towels and toilet paper, cardboard boxes and bottle labels, fliers and handouts and junk mail, office printer and copier fodder—we may have books as well, without much trouble or impact.

But this is moot. Most people are sensible and sensitive enough not to be in favor of disbanding or abolishing those libraries which we do have. The great open question is: should we build new ones? The implicit answer of late seems to be that libraries are like city steam heat: worth keeping up if your city already happens to have it, but not worth the effort when building anew—so that, in the developing world, what is important is to build schools, not libraries; to put laptops into children's hands, not library cards.

This is pragmatic; and as far as it is pragmatic, it is good. A laptop can bypass a corrupt government; a library implies funds requiring stewardship. But there is a questionable assumption involved: that, just as undeveloped regions might do best to skip telephone poles in favor of a wireless infrastructure; so they should skip libraries, and develop a paperless culture. But I think this assumption is better put: if developing regions attract enough medical charities, they can skip building hospitals. Patently, the object of medical charities is to relieve the harshness of life, in order to help form institutions which will, in time, build hospitals. Just so: laptops should set in motion the intellectual awareness and appetite, which will, in time, demand and build libraries.

I do not mean that libraries can compete with or replace the Internet; rather, that the Internet increases the importance of the library, as the means of synthesis, consolidation, and continuity in culture. The Internet magnifies diversities of all kinds. It has strengthened every existing group and variety of opinion, and called forth new groups and opinions from vague sympathies and stirrings. But diversity is not an end in itself: the object of the multiplication of individual perspectives is that these perspectives should each somehow add to a shared, universally meaningful sum—they should add to culture in general; they should add to civilization, to the shared human project.

The Internet by itself is unsteerable—a freedom of assembly which veers into faction; a freedom of expression which veers into vandalism. The faster and more weightless the Internet becomes, the more acute its need for books as ballast. Think of a project like the Wikipedia; would it be possible without the books which anchor its citations as references? Thus the very paragon of the possibilities of the Internet is everywhere immanently dependent on the library.

Or think of politics. Politics on the Internet should be the great fable warning against the dangers of booklessness. In the absence of books, it has utterly succumbed to the most caricaturish excesses of faction and vandalism. It is particularly tragic to witness in the United States—which was founded by the bookish, reading books to derive and develop their ideas, writing books to defend their actions. Five minutes with the Federalist Papers or the Debates in the Constitutional Convention or Democracy in America will give you a better idea of, and a stronger sense of a stake in, what America is and stands for, than if you were to read every Internet debate on the subject from Usenet through the blogosphere.

Increasingly, I find that the rhythm of my reading becomes an alternation, in which what I read in books raises questions which the Internet answers; and what I read on the Internet conceives needs which books fulfill. This cycle has neither the attraction of technophilia, nor of technophobia; it is neither shiny and sleek, nor soft and patinated; it has no brand name or buzz, nor ancestry or tradition; but it has certain unglamorous advantages: it is real, and it works. I cannot be alone in having discovered it; I suspect that there are more who thus belong to both sides, than who belong to either—the rivalry is not only mistaken, but mostly fictitious.

Fable of the Whale and the Squid

A whale once determined to settle, between himself and the greatest of squids, which was the more terrible, and thereby master of the ocean. So he sought out the terrible squid, and fought it. The deep was full of noise: of blasts of killing sound, of roars from carving jets of water; and the battle, watched by whales (for the squids did not care), was a jumble of teeth and tentacles, and something white in the lampless deep, and around it long clouds of grasping sea-dark.

The whale won. The squid sank away where even whales could not follow. And all whales knew that it was a whale who was the master of the ocean. Then the watching whales left, and the victor—ragged, bleeding, exhausted—swam away alone to rest.

But now there were walls of noise against his sides—stunned, he could do nothing as three young whales tore him apart, to become the new masters of the ocean by killing the old.

Moral: The Ambitious and Successful forget that Ambition is common.

Debunking

Debunking is to science as criticism is to art—not useless, but not the thing itself, and often requiring a cast of mind opposite to what it tries to protect. A debunking is a rhetorical technique; a disproof is a logical achievement, or a scientific consensus. In the history of science there is hardly an idea worth general attention that has not met with a wave of debunkings. They rise in jealousy from the center of science, and recede in time to the bewilderment and resentment of the fringes.

A debunking, like any persecution, only strengthens what it attacks; and debunkers, in the long run, are enemies of science, because they belie its attractions. They make science seem progress through mutual abuse, where only the most smug, mordant, and venomous are worthy to possess truth. Debunkers, by their defense of science, do all to make science attractive that pundits do for politics. Most of their criticisms are reducible to copyediting. Take a scientific paper; confuse the punctuation and capitalization; add exclamation points; put an uncredentialed name on it—and they will crush it. Take a crackpot theory; copyedit it to a scientific style; abbreviate the name (J. Smith); add the name of a university or an institute—and they will not even allow themselves to wonder about it.

Two prominent exceptions, where criticisms are substantial and urgent, are intelligent design and climate change denial. But these are not really pseudosciences; they are themselves debunkings. These two movements have rudely proven that the effect of a debunking has nothing to do with the truth of the proposal, only with the skill, prestige, and power (or powerful alliances) of the debunker. I do not believe that a debunker can defeat truth in the long run—corroborating evidence in time must strain the skill, undermine the prestige, and sour the alliances of the denier—but it can be held off for a very long time. A history of science can be constructed in which truth always promptly wins, only because, in the past, truth opposed in one place has always had somewhere else to win. Until the last few decades there has not been a single, international scientific community and consensus; only several separate national scientific establishments, which have served to correct one another. Only since the end of WWII has there been a single Western scientific establishment; and only since the end of the Cold War (or since Khrushchev, in some degree) has there been a worldwide scientific community. The great disadvantage of this single system is that it is difficult to shame it. For example: the wave theory of light originated in England (with Young), but was not accepted in England until after—through the work of Fresnel—it had been elaborated and accepted in France. At the extreme, in France, Voltaire had to satirize the Cartesian mechanical ether to advance the theory of gravity. Competition between universities can in some degree replace this international competition; but only in disciplines which do not depend on centralized sources of money—where independent budgets allow for independent thinking. Consider the slow acceptance of the role of angiogenesis in the development of tumors, in the costly world of medical research.

In the history of science, it is rarely well-known anomalies that necessitate new theories. Remember that Copernicus kept epicycles in his heliocentrism; that Kepler set out not to discover the shape of orbits, but the spacing between them; that Newton "made no theories" for the material basis of gravitation, at a time when the great question was that of the ether—a question which Einstein also fruitfully ignored. Cranks are drawn to easy solutions for most or all problems; scientists are drawn towards mines of new problems. It is easy to multiply after-the-fact theories of what is, or is not, science. But the behavior of scientists shows only one rule: scientists go where the work is. And while it is unusual for scientists to have to step back, and declare a body of work nonsense (caloric fluid, for example), it is almost the rule that as a new science advances, it goes from vague pretensions of revolutionary importance, to mere usefulness, or even footnote-filling triviality; and that, as theories mature, they surrender their ambitions, and, ceasing to be projects of their own, end by serving as instruments of old projects.

For the practical recognition of crankery and quackery, it is not necessary for the borders of science to be patrolled and enforced by debunkers—it is enough to avoid easy answers. The question to ask is not some berating, trolling "Why" ("If God designed us, why the appendix, coccyx, recurved spine?" "If global warming is natural, why do so many climatologists think it is not, why do the models overwhelmingly refute it, why changing patterns of vegetation, glaciation?")—because these are the same kind of questions that the other side is asking ("If evolution is random and undirected, why such useless complexity, why so many broken chains, why can dogs still interbreed?" "If global warming is anthropogenic, why no rush of disasters, why still harsh winters, why can no one agree on what would happen even if it were true?")—and because the answers would not be scientific: "To keep us humble," "Institutional venality, overconfidence, bad records," "Because so little survives, so little is seen in so short a time," "Because we've been lucky so far." The right question is simply—"And?" Science needs problems, science needs questions; so science cannot abide easy answers, science cannot settle for dead ends.