Departments

Art vs. life

Sometimes art frightens me. Sometimes I wonder what art is taking to match what it gives. Surely talk was faster and more excursive before recording; surely clothing was more splendid and plumed before photography; surely gesture and pose were quicker and more lifelike before movies. Maybe worship was more devoted, before icons and idols; maybe love was stronger, memory keener, regret fiercer before the portrait; maybe voices were softer, birdsong sweeter, before music. Art universalizes particular experience, delivers it across space, time, and language. But what we receive as if transmitted, might only be lost; what we receive as if preserved, might only be embalmed; what we receive as if translated, might only be parodied. How are we, art-shrouded, art-addled, to know any better? Every sense bends to its particular art; do we more watch than see, more listen than hear, more savor than taste? At best art stands between us and life; at worst it claims our lives. What could Arthur Henry Hallam have done with his life to match In Memoriam?—where the artistic perfection of grief prevents us from sharing it, we who so value the expression. I fear art and I love it; I fear it as I love it, because love is power given, and power means abuse of power. So many minds are slaves to art, full of images and stories they do not even recognize as art, puppets to old ingrown art (they call it common sense). I study art, prize it, and judge it not to pass life but to save life: because to study, prize, and judge art is the only defense against it.

Cognitive psychology 5/5

If all this were true it would have two consequences. First, it would require a strict distinction between what a person reports their perception to be and what that perception actually is. The act of perceiving a perception in order to describe or render it would be understood as a skill, subject both to cultivation and to neglect. What cognitive psychology identifies as debilities of human perception would be no more than an untrained clumsiness. Second, and correlatively, it would regard the ways of influencing human behavior that cognitive psychology identifies as weaknesses to be annealed by education, not as intrinsic handles to pull in a desirable direction.

All this essaying is futile, I know. Even if I were right, no one would ever call me right, except in retrospect; and I am very likely wrong. In doubting a large field of scientific work I am certain to sound like a crank. I can only note that I am not nailing up theses; and that if I am wrong I would like to be straightened out.

Postscript 2014

Since I wrote this essay I have participated, as a subject, in several experiments in cognitive psychology, and in consequence I now regard cognitive psychology as a pseudoscience.

Here is the problem: in order to avoid the appearance of shirking, the subject has no choice but to express preferences in the absence of preference and beliefs in the absence of belief. Even without financial stakes, ordinary social conventions compel the subject, in order to be kind to the experimenter, to deceive them.

It goes like this. The experimenter asks, “Does this make me look fat?” The subject says, “No.” And the experiment concludes: “Human beings are incapable of accurate estimation of one another’s weight.” Soon books are written about “cognitive weight bias.” “For our ancestors,” the editorial begins, “underestimating one another’s weight was an important survival strategy.”

This is what cognitive psychology is, and it is all cognitive psychology is: the heedless elaboration of a social solecism. In an experiment in cognitive psychology, the only true psychologists are the subjects.

Cognitive psychology 4/5

In the laboratory this fine distinction can only split hairs. I should supply a larger example that will bear the division better. If you accept the standards that cognitive psychology uses to judge the self-reporting of perceptions, then you must accept that people who draw badly perceive the world as they draw it to be; which is absurd, because the distortions are too severe to live around.

Four objections present themselves.

1. If drawing is a manual skill, then non-artists may simply lack enough control to make the pencil do what they want. Now certainly there is room for manual skill in drawing; but it is not required. Anyone who can negotiate the angles and curves of printing the alphabet has all the control of the pencil required to produce an accurate sketch.

2. If non-artists really see the world as they draw it, they may still be able to function because seeing and drawing differ in the same way as recognition and recollection. The problem is that non-artists often do worse attempting to draw from life than from memory. Partly this is because drawing from memory spares them the awkward necessity of refocusing between paper and subject, which is a distinct skill. If you tell someone to draw a leaf from memory the result is usually recognizable; if you give someone a leaf to draw you may get a tracing, or something that looks like the coastline of an imaginary island.

The discrepancy here is in the way data is stored. Following the computational metaphor, recall is presumably something like a value stored as a string, and recognition something like a value stored as a hash.

The difference is this. When you log into a (well-designed) site, the site has no record of your password as such. Instead, the site transforms your password using a mathematical operation which is easy to apply but difficult to reverse, and compares the result, or hash, with a hash that it has on record. The site recognizes your password without recalling it. (This is important because it ensures that if someone were to crack the database they could obtain only the useless hashes, not the passwords themselves.)
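
A minimal sketch of the idea in Python, using the standard hashlib module (a real site would also salt and stretch the hash; this shows only the bare principle of recognition without recall):

    import hashlib

    def hash_password(password: str) -> str:
        # One-way transformation: easy to compute, impractical to reverse.
        return hashlib.sha256(password.encode("utf-8")).hexdigest()

    # The site stores only the hash, never the password itself.
    stored_hash = hash_password("correct horse battery staple")

    def check_login(attempt: str) -> bool:
        # Recognition without recall: the attempt is hashed and the hashes compared.
        return hash_password(attempt) == stored_hash

    print(check_login("correct horse battery staple"))  # True: recognized
    print(check_login("open sesame"))                   # False: not recognized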

To distinguish recall and recognition we could thus suppose that the mind contains two ideas of the leaf: one an ideal, schematic, low-density Leaf-concept that can easily be handled in working memory when thinking about leaves—good enough to figure out how much pressure to use in raking and how big a pile of leaves will fit in one bag—and a separate Leaf-hash, which the brain can readily compare particular objects to in order to recognize them as leaves. If you attempt to draw a leaf, then you can obviously only draw a Leaf-concept, as this is all the information that your brain actually has about leaves; but through some sort of hashing you can still recognize leaves that do not match that account.
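
To make the supposition concrete, here is a toy sketch in Python with invented names: a schematic Leaf-concept available for recall, and Leaf-hashes usable only for recognition. (A literal hash admits only exact matches, so this is the crudest possible version of the idea.)

    import hashlib

    # Invented toy representations of the two hypothesized ideas of "leaf".
    LEAF_CONCEPT = {"color": "green", "shape": "pointed oval", "edge": "serrated"}

    def _hash(description: str) -> str:
        return hashlib.sha256(description.encode("utf-8")).hexdigest()

    # Only hashes of particular leaves are kept; the particulars cannot be recalled.
    KNOWN_LEAF_HASHES = {_hash("oak leaf, lobed, dull green"),
                         _hash("maple leaf, palmate, red")}

    def recall_leaf() -> dict:
        # Drawing from memory can work only from the schematic concept.
        return LEAF_CONCEPT

    def recognize_leaf(description: str) -> bool:
        # Recognition compares a hash of the particular object with the stored hashes.
        return _hash(description) in KNOWN_LEAF_HASHES

    print(recall_leaf())
    print(recognize_leaf("maple leaf, palmate, red"))    # recognized
    print(recognize_leaf("banana leaf, entire, green"))  # not recognized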

(I am doing readers the favor of constructing a stranger account of recognition than the one commonly advanced, which would recognize a leaf by the successive activation of increasingly specific subsystems of identification—from green to flat to foliform to organic to leaf. To suppose that seeing a particular object requires a specific order and combination of activations is so far from the reality of perception that it exceeds my capacity for polite treatment; so I have constructed a stronger account to refute.)

The problem is that this makes a mystery of how anyone ever draws a leaf at all. The artist certainly does not add parameters to the Leaf-concept. Indeed the artist ceases to have any Leaf-concept at all. Artists draw not things, but configurations of color and shadow.

But if this distinct Leaf-concept—the leaf as recalled—is useless either to recognize or to depict a leaf—what good is it? One could suppose that the Leaf-concept in some sense guides the process of recognition—affords the possibility that something might be a leaf and directs the brain to check—but really, what does the bright-green oak-maple hybrid I expect my English-speaking readers might draw have to do with leaves—leaves green, red, yellow, brown, anthocyaninic blue and black, leaves from grass blades to tropical aroid elephant ears? Either we must suppose that the brain harbors a Leaf-concept for no other purpose than to thwart artists—not even in dreams do trees have such unleaflike leaves—or we must suppose that this distinction of recognition and recall is not applicable, and that this parody of a leaf is cover for our inability to consciously perceive how we consciously perceive a leaf.

3. If non-artists do not perceive the world as they draw it, the way they draw may simply distort what they see, as memory in general distorts what they experience. This is a good objection because there is an obvious comparison between how memory and drawing both exaggerate emotionally significant aspects of perception, and between how memory artfully fits experience to narrative, and how drawing unartfully fits vision to outline.

Consider outlines. Outlines exist nowhere but in non-artists' drawings. Nature defies outline; vision nowhere finds it. Nonetheless when non-artists draw, invariably they first attempt an outline—even cave painters, who were artists when they drew, loved to outline their hands on the rock. Outlines are not incompatible with art—the Egyptians made a high art of shaded outlines—but they are prior to art. Abstract outlines do not depict anything: their value is that, being abstractions, they preserve the symmetries and topologies of what they anonymize—they are mathematical in character and, for simple shapes, perhaps the origin of mathematics.

The comparison with memory and narrative is obvious. We need not appeal to the light cone diagrams of physics to understand that in anything that happens, an imponderable diversity of causes conspires, and that from anything that happens, an innumerable diversity of effects results. Every event is part of the fabric of the whole world.

Narrative, like outline, is unreal but useful; patterns of events, like shapes, though potentially infinite in variety, tend to approximate simple forms with predictable properties.

But drawing cannot distort information in the same way as memory. However badly someone draws, they do not ever act as if they see the world that way. To walk or sit, to touch or pick up, proves that a non-artist does not bungle seeing the world in the same way as rendering it. Even in dreams no one sees as badly as they draw. The brain does not elide vision into outline before storing it; the reduction to outline is a miscommunication within the brain.

4. If someone does not draw in a style that resembles Western art, that person is not therefore a non-artist. High cultures elevate as art what Westerners might regard as mistakes. Non-artists may not exist. Perhaps everyone distorts their perceptions artistically when they draw, most in more dramatic ways than the subtle ones traditionally valorized in the West.

Western art has its artificial conventions, but to say that photography, and the kind of art that obeys the same laws of optics and projection, are essentially cultural conventions requires more stomach than I have. Human eyes only work one way. The anecdote says that a pygmy brought out of the forest could not tell buffaloes on the horizon from insects. Assume the anecdote is true; what does it prove? It proves that there exists such a thing as a myopic pygmy. Or should we believe that pygmies never look up into the crowns of trees? That they cannot tell that the bird overhead is the same as the bird in the bush? To prize optic validity as artistic quality is cultural; but the validity itself is physiological.

So if for the first 3500 years or so of human history no culture or civilization held the goal of art to be to represent just what was seen, then of course we are readily distracted by other goals, and must be induced by long training to give them up for this one particular goal of realism.

But I am unwilling to credit that the artistic way is ever the easy way. The artists of the cave walls, of Egypt and Sumer, of India and Persia, were no lazier than the artists of Venice, Florence, and Amsterdam. They were not primitive; they were not innocent. To assimilate natural errors to the artistic traditions they happen to resemble sets Western art on a more ridiculous pedestal than any academy ever proposed.

Too, perspective and foreshortening are not utterly alien to the brain; I suspect that even those whose arts reject these values do in fact dream with them, although this would be difficult to prove.

Cognitive psychology 3/5

The nominal scientific method dissevers a discovery from the experiment that made it as early and thoroughly as possible—by suppressing the personalities of the experimenters, by using standardized equipment and methods of analysis, by ensuring reproducibility and undertaking reproduction. All these mechanisms are relatively weak in cognitive psychology. Personal success results in advancement; experiments themselves, with their individual variability, are the instruments; and the expense and inconvenience of recruitment, together with the backlog of experiments yet to be done, make reproducing experiments a last resort. But psychology has never worked the nominal way—try it, and you get Behaviorism—and this way seems, for the most part, to work.

Cognitive psychology had to fight hard to free itself from Behaviorism. Here we come to the problem. Behaviorism rejected introspection; cognitive psychology accepts it; but it exchanges uncritical rejection for uncritical acceptance. I think that there is something that cognitive psychology gets wrong about introspection: it assumes that our perceptions resemble our perception of our perceptions.

(Nominally cognitive psychology rejects "introspection" but accepts "self-reporting"—I do not understand the difference.)

For example: a form of experiment is to present a list of words on a common theme for memorization. The list omits some particularly obvious entry that could be expected to occur with the rest. When memorizers repeat the list back, they often supply this absent word. The obvious conclusion, and the one that cognitive psychology draws, is that memorizers have perceived the absent word; that a subconscious process—subconscious in the sense of neurologically substratal—inserts the extra word into the memorizer's perception of the list.

But this may not be so. Precisely because the list is simple it obscures the distinction between perception and perception-of-perception. It is also a plausible interpretation that memorizers do not perceive the extra word; instead, in reproducing the list, the memorizers perceive their own perception of the list in an incorrect yet conscious way. This multiplies entities, but I think it provides more parsimonious explanations.

Cognitive psychology 2/5

Before I attempt a reasoned argument I want to sketch three broad points of general discomfort with cognitive psychology.

1. It is difficult to pay sustained attention to cognitive psychology without feeling that the point of many experiments is not the paper, but the press release. The relationship between science and the media is generally discussed as if newsmen trawled scientific publications with a view to finding stories with headline appeal. But this is not how it works; institutions (with what degree of involvement from researchers I do not know) push press releases to sites like EurekAlert!; journalists may check up on them to add human interest, but the transition between experiment and news item takes place inside the institution. As a regime of incentives, this strikes me as perverse.

2. Cognitive psychology is based on the model of the brain as a computer; but this is a trivial statement, nearly a tautology. It implies that a computer is a kind of machine, something like a clock or a car; a sophisticated machine, certainly, but just a machine; just a machine, and the brain just another example. But this is not what a computer is. The history of computers is not a history of invention; it is a history of discovery. We did not invent computers; we discovered computation. Computation is an aspect of nature, something like emission or gravity, a property of all sufficient systems. If something is not a computer, it is less than a computer. So of course the brain is a computer; and—?

3. Cognitive psychology and ethicism in psychology are roughly coeval. The obvious suspicion is to ask whether we discovered cognitive psychology because it was the only psychology we could discover ethically. Psychologists cannot do harm, so they find no harm in the mind; psychologists cannot deceive, so they find biases, but no gullibility; psychologists must have volunteer consent, so they find the mind sociable and cooperative.

Cognitive psychology 1/5

So pervasive have the claims of cognitive psychology become, so often do I encounter the rhetoric in which the introductory anecdote takes the form of a cognitive psychology experiment, that I find it necessary to decide just what I think of it, and where I stand. I do so best by doing so publicly.

I respect cognitive psychology as the most rigorous, coherent, and fruitful paradigm in psychology. By itself the nosology of biases is one of the most useful achievements of intellect; in practice at least equal to the Greek achievement of logic. Of course, like logic, the doctrine of biases is subject to misuse—the more so because it is new and there is no current wisdom on its limitations. The spirit of catching out logical fallacies trivializes itself in the principle of the fallacy of fallacies—the error of assuming that because an argument is fallacious, its conclusion is wrong. Now, because biases are innate, but the recognition of biases is acquired, the application of the biases cannot be closed by a neat "bias of biases"; but some such wisdom is required to avoid the bookish error of rejecting conclusions not because they are wrong, but because they are biased—which is, in itself, no argument.

Too, we live in a world that, for reasons technological, social, legal, economic, and pragmatic, is a great open field for projects and projectors. If the world is to be saved, very likely someone somewhere is on the right track. Everything has someone working to reform it, review it, or replace it. The second great achievement of cognitive psychology is to let us live amid this redundancy without inhabiting Swift's Academy of Projectors—his satirical asylum of gibbering would-be worldchangers.

All projects, whatever their short-term obstacles, are ultimately opposed by human nature. Cognitive psychology's scheme of a human nature quantified by laboratory experiment provides a basis for accommodation. But more importantly, its exposition of human nature through distinct and discrete experiments enables a kind of catalog of projects, a way for those of us not part of them to keep them straight. A particular project can always be linked to a particular experiment as a way of working out its implications and possibilities.

Vanity Three

Same as last year and the year before, I am temporarily putting the last year of the Ruricolist into print.

The Ruricolist: Essays and Caprices: Year Three

I plan to keep it available until the end of June.

Unlike last year I cannot make the PDF available for download from Lulu. You may download a PDF here, but let me explain myself first, because this PDF is not as good as I would like. I am trapped between technology and law. The book is set in Monotype Bembo Book. The metal Bembo is both a celebrated font and one I am fond of—the first book whose typography consciously charmed me (an old Penguin edition of Josephus) was set in Bembo. But the first digitization of the font is a notorious failure. Bembo Book is a relatively new, much improved attempt. I am pleased with it, and with the result.

The problem is that it is a commercial font. I had to pay for it, and agree to a license. The license forbids me from distributing PDFs with full font embedding. Lulu absolutely requires full embedding of fonts. The solution I arrived at was conversion into DjVu format. DjVu shares some of the aims of PDF—it is a portable document format—but it is more compact and more consistent. It is favored for scholarly facsimiles. And being a raster format (like a photo), it does not support fonts. (If you can display it, you may simply download Year Three in DjVu.) From DjVu I exported a PostScript version, then back-converted the PostScript into a PDF with no fonts. Since the file has been through two conversions, and since it is a raster to which anti-aliasing cannot be applied, it does not look as good on screen as it would in its original format.
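
For the technically minded, one possible way to script such a chain, assuming the pdf2djvu, djvups (DjVuLibre), and ps2pdf (Ghostscript) command-line tools; the tool choices and filenames here are an illustration, not a record of the exact commands used:

    import subprocess

    # Placeholder filenames; assumes pdf2djvu, DjVuLibre's djvups, and
    # Ghostscript's ps2pdf are on the PATH.
    steps = [
        ["pdf2djvu", "-o", "year-three.djvu", "year-three.pdf"],  # PDF -> DjVu (rasterized)
        ["djvups", "year-three.djvu", "year-three.ps"],           # DjVu -> PostScript
        ["ps2pdf", "year-three.ps", "year-three-nofonts.pdf"],    # PostScript -> font-free PDF
    ]
    for command in steps:
        subprocess.run(command, check=True)  # stop the chain if any step fails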

The cover image is one of a number of photos I took the year before last of a ruined church out by Hammond. I know nothing about it—not its name, nor its history, nor even its address. I don't even know if it's still there. But it had a visual attraction that even this non-photographer could not defy. The only change made to the image is the usual histogram adjustment of color levels.

P.S. A note for the technically minded. All the tools employed are free software; namely, for coding, Emacs with AUCTeX; for typesetting, XeLaTeX with KOMA-script; and for image editing, GIMP.

Also, a lesson learned. As you value your sanity, do not try to produce a one-piece cover for Lulu by any kind of conversion. There is no direct way to turn an image that GIMP can edit into a PDF that Lulu can print without ruinous loss of quality. Use the geometry package to make a PDF of the appropriate size and just embed the image in it (possibly with PSTricks). It prints perfectly.
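
By way of illustration, a minimal LaTeX sketch of that approach; the page dimensions and image filename are placeholders, not Lulu's actual cover specification, which depends on trim size, page count, and bleed:

    % Placeholder dimensions for a one-piece cover: roughly two cover widths
    % plus the spine, with bleed; substitute the figures Lulu gives for your book.
    \documentclass{article}
    \usepackage[paperwidth=13in, paperheight=9.25in, margin=0pt]{geometry}
    \usepackage{graphicx}
    \pagestyle{empty}          % no page number on the cover
    \setlength{\parindent}{0pt}
    \begin{document}
    \includegraphics[width=\paperwidth, height=\paperheight]{cover-image}
    \end{document}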

Reading

I want to make the argument for isolation, uncertainty, and dissipation in reading. Something in readers compels them to seek the true way of reading, the one discipline that suits all books. The attractions are many. There is the sense of belonging in knowing what kind of reader you are; there is the confidence of knowing exactly what you will get from a book; there is the general attraction of focus in life, given by the relief of concerns beyond that focus. But readers, after all, are people who choose to spend unusual proportions of their time alone. The benefits of the independence and self-guidance a reader must possess to read at all hold for the choice of how to read. A reader does not read to rehearse an opinion, but to enliven it, to produce some motion in it analogous to the biology of growth, or healing, or death. If the book can produce the same motion in the idea of reading itself, why not permit it to? Some books are more focused than others; but the value of books as a medium is in their capacity. The benefit of reading a book for instruction, instead of running a search, or of reading a book for diversion, instead of participating in the lives of friends, is that writers, in filling up so huge a thing as even a short book, must call upon and involve their whole original experience and sense of the world, must turn themselves inside-out in such a way that someone else can put them on. If reading is to be creative—and why not let it be, if it can—then it must be by a parallel inversion, a counterpoint to writing in which reading calls on and involves you as writing does; which must therefore be subject to change as your experience and sense of the world changes. And for someone who chooses to be a reader, experience and sense must include reading itself. All of which is to say that I do not worry much about what kind of reader I am nor how I should read. I change, and imperceptibly, until I re-read a book after a long interval and discover how different a reader I have become.

The End of Sex?

For millions of years the human race has reproduced sexually. But thanks to the eGad, all that may now be coming to an end, says blogger Isaac Bickerstaff.

Let's be frank: sex is how most of us came into the world. Despite a century of scientific advances, in vitro fertilization remains marginal, until now the domain of devoted enthusiasts. Across the world, billions of people are still being conceived through the antisocial, unhygienic act of physical love. Even today, in the age of ubiquitous, always-on high-speed Internet access, reproduction continues in the same medieval paradigm of bearing and begetting that our grandparents practiced.

That paradigm may be coming to an end. Potato's (Ticker: POM.TER) new eGad portable tablet computer ships without any support for pornography. Initial reaction has been mixed. One famous blogger, in a screed blasting the eGad, simply states "The Internet is for porn." Another, however, reports: "I'm glad to get away from it. It's like I just escaped a raging, savage master."

In an open letter the CEO of Potato insists that pornography is obsolete. Superior free, open-door dissipations, he says, can completely replace pornography—citing romance, hanging out, cuddling and hand-holding. Many users, however, report frustration at visiting popular websites only to find them covered in black bars.

Beyond transient frustration, however, Potato's rejection of pornography has more serious implications. Isaac Bickerstaff, in an open letter to Potato, describes how important pornography was to his sexual development. He writes:

As something of a geek, more interested in gadgets than girls, by my early teens I was well on the way to sexless bachelorhood. I looked into my future and I saw skinny ties, thick glasses and pocket protectors. Discovering pornography gave me a new way of approaching life. Instead of staring at the floor when girls spoke to me, I could undress them with my eyes. I realized I didn't have to treat women like human beings—whatever that's supposed to mean. I could treat them as means to my sexual satisfaction. In short, without pornography, I would never have met the beautiful woman I call my wife, or gotten her pregnant. Holding my virginal new eGad, I feel worried for my children. Of course, pornography isn't just going to disappear. But without a pornograph to play it on, will my children ever have the same chance to develop a prematurely jaded sexuality that I did?

Many Potato fans, however, welcome the sexual liberation of the eGad. On a Danish Potato fan forum, user TotalGules writes, "Get yourself an eGad. Why would you be a breeder of sinners? I'm a decent guy but I could accuse myself of such things... let's just say it would be better if my mother hadn't given birth to me. No matter what you do, even if you don't cheat or ogle other women, girls think you're just another dumb horny guy. You talk funny, walk funny, try to be cute, and what do you get? Give it up. Enough of this crap. Go get yourself an eGad."

(Kyoto)

In this preview for the forthcoming Kyoto Journal #74 you may read the first paragraph of my essay on Western music and the Silk Road, "The Hollow Staff". The issue can be pre-ordered.

Pseudoscience

Because historians do not understand science, scientists write their own history. So when scientific opinion changes—when a science turns out to have been a pseudoscience—the dead or powerlessly retired scientists who pursued it are sorted ex post facto. The ones who anticipated the change stay scientists; the ones who fell for it turn out to have been pseudoscientists. Truly they were pseudoscientists all along—everyone knew it—but that media, that damned media, they were the ones who made it seem otherwise—we scientists always knew better; it's your fault for being gullible, you cargo cultists.

You can watch this happen to string theory. In one direction or the other expect to see it in climatology. Let us rehearse the explanations in advance. Someday you may recognize one or the other as a news item or a footnote in history.

Case A:

Despite the overwhelming evidence for the anthropogenic origin of global warming, a movement of so-called skeptics, organized ultimately through the resources of corporations and political parties whose interests were threatened by the urgent measures the situation required, was able to delay action until the forces behind climate change had become irreversible. Certain scientists, some through misplaced but sincere convictions, but most because it was convenient and attention-getting, continued to cast doubt on the evidence even after the scientific consensus was certain. Nonetheless, none of the best scientists failed to see reason, and it is simply false to assert, as some have, that scientists themselves were at fault.

Case B:

In the tense political atmosphere of the early 21st century it was only natural that political movements were eager to enlist scientific evidence to support their policies. Given the apocalyptic mood of the time—a quick look at the box-office returns for the first decade of the century will show that the impending end of the world was a cultural commonplace—it is unsurprising that what developed was a superficially scientific vision of the apocalypse. The media too were part of this zeitgeist, and exaggerated a concern many scientists had with the unknown effects of carbon dioxide, and certain alarming high-level trends, into a political movement complete with speeches, rallies, and platforms. Nonetheless, none of the best scientists failed to see reason and it is simply false to assert, as some have, that scientists themselves were at fault.

I am not proposing a debate. I am not trying to convince you of anything except the irrelevance of your convictions. Climate change is just a convenient subject.

(For the record my view is better safe than sorry. The absence of anthropogenic global warming would be harder to explain than its presence; I therefore am in favor of anything short of irreversible geo-engineering.

Though I do admit to disgust for those who condemn "economists' reliance on models" with one fork of their tongues, while the other extols "the proven science of climate change"—as if modeling the homeostasis of the economy were any harder than modeling the homeostasis of the climate.)

I am sarcastic because I am accustomed to hearing a risible servility. But I am not attacking science. I trust that what has been declared a pseudoscience is so. In this retrospect science is indistinguishable from infallibility. But I dispute the genteel hypocrisy which would pretend that pseudoscience has never entered the mainstream of science, or that if it ever had, it would have been due to outside meddling.

The fallacy, I believe, is called No True Scotsman. "No true scientist would have participated in X; therefore any so-called scientist who participated in X was not a true scientist." The whole history of the relation of science and pseudoscience is constructed with this tautology.

True, science is not just a vocation, but an affiliation, a group, and therefore, like any other group, contaminated with loyalties and solidarities. But the more acute problem is that science and pseudoscience are not dichotomous. Degrees exist between them.

The word protoscience has been advanced for the pre-scientific pursuits that lead into sciences—as astrology leads into astronomy, as alchemy leads into chemistry, as doctoring leads into medicine. There may be many such intermediate degrees but I propose only one. Some scientific pursuits are neither sciences nor pseudosciences, but placeholder sciences. The textbook scientific method expects the succession, within a defined science, of observation, hypothesis, and theory. But in most scientific work the science itself is a hypothesis. The most common question is not, "What law governs this phenomenon?" but "Is this a phenomenon at all?"

Most ologies are not really fields at all, but gambits: in science a field does not arrive and then demand methods, subjects, a center, and journals; rather the methods, the subjects, the center, and the journals are how hopeful scientists attempt to bootstrap a field into being. In the end the attempt either succeeds as a science, or fails as a pseudoscience; but in the meantime it is neither—it invites a science and clears a space for it, saves its seat. It is a placeholder.

For later scientists to judge a placeholder science according to its final result is unjust. The scientists who failed were not cranks; the scientists who succeeded were not visionaries. To suppose they could have known better in advance is to posit a faculty which, if it did exist, would make actual science superfluous. A scientist can no more anticipate a pseudoscience than a computer can anticipate a halting problem.

All sciences begin as gambits. The sooner this is recognized, the sooner we can avoid misplacing the faith due a mature science in its placeholder; but more importantly, the sooner this is recognized, the sooner we can begin, not just accommodating such gambits, but encouraging them.