
Art vs. Life

Sometimes art frightens me. Sometimes I wonder what art is taking to match what it gives. Surely talk was faster and more excursive before recording; surely clothing was more splendid and plumed before photography; surely gesture and pose were quicker and more lifelike before movies. Maybe worship was more devoted, before idols and icons; maybe love was stronger, memory keener, regret fiercer before the portrait; maybe voices were softer, birdsong sweeter, before music. Art universalizes particular experience, delivers it across space, time, and language. But what we receive as if transmitted, might only be lost; what we receive as if preserved, might only be embalmed; what we receive as if translated, might only be parodied. How are we, art-shrouded, art-addled, to know any better? When every sense bends to its particular art, do we more watch than see, more listen than hear, more savor than taste? At best art stands between us and life; at worst it supplants our lives. What could Arthur Henry Hallam have done with his life to match In Memoriam? – where the use of grief in art prevents us from sharing that grief, we who so value the expression. I fear art and I love it; I fear it as I love it, because love is power given, and power brings abuse of power. So many minds are lost to art, full of images and stories they do not even recognize as art, puppets to old, ingrown art (they call it common sense). I study art, value it, and judge it not to pass life but to save life: because to study, value, and judge art is the only defense against it.

Cognitive psychology 5/5

If all this were true, it would have two consequences. First, it would require a strict distinction between what a person reports their perception to be and what that perception actually is. The act of perceiving a perception in order to describe or render it would be understood as a skill, subject to cultivation. What cognitive psychology identifies as a bias of human perception would be no more than an untrained clumsiness. And second, it would regard the means that cognitive psychology identifies for influencing human behavior as weaknesses to be compensated for by education, not as intrinsic handles to pull in a desirable direction.

All this essaying is futile, I know. Even if I were right, no one would ever call me right, except in retrospect; and I am very likely wrong. In doubting a large field of scientific work I am certain to sound like a crank. I can only note that I am not nailing up theses; and that if I am wrong I am only hurting myself.

Postscript 2014

Since I wrote this essay I have participated, as a subject, in several experiments in cognitive psychology. In consequence, I now regard cognitive psychology as a pseudoscience.

I still think cognitive psychology is interesting. It is philosophically interesting, not because it uses science to cast light on the problems of philosophy, but simply because it is interesting philosophy. Its scientific pretensions are false.

Here is the problem: in order to avoid the appearance of shirking, the subject has no choice but to express preferences in the absence of preference and beliefs in the absence of belief. Even without financial stakes, ordinary social conventions compel the subject, in order to be kind to the experimenter, to deceive them.

It goes like this. The experimenter asks, “Does this make me look fat?” The subject says, “No.” And the experiment concludes: “Human beings are incapable of accurate estimation of one another’s weight.” Soon books are written about “cognitive weight bias.” “For our ancestors,” the press release begins, “underestimating one another’s weight was an important survival strategy.” This is what cognitive psychology is, and it is all cognitive psychology is: the heedless elaboration of a social solecism.

To put it another way: professors should not experiment on students for the same reasons professors should not date students. (Effectively all cognitive psychology is done by professors experimenting on students; where the subjects are not students, they are still acting in the role of students.) Between professors and students there are power differentials that preclude sexual consent. But someone who cannot give an honest answer to a sexual proposition certainly cannot give an honest answer to a personality inventory. If it is a bad idea for professors to date students, it is a far worse idea for professors to experiment on them.

Cognitive psychology 4/5

In the laboratory this fine distinction can only split hairs. I should supply a larger example that will bear the division better.

Drawing is self-reporting of a kind. Proposition: if you accept the standards that cognitive psychology uses to judge the self-reporting of perceptions, then you must also accept that people who draw perceive the world as they draw it to be; which is absurd, because anyone who really saw the world that way could not survive. When you draw a stick figure, that does not justify the conclusion that stick figures are what you see.

Some objections present themselves.

1. If drawing is a manual skill, then non-artists may simply lack enough control to make the pencil do what they want.

There is certainly room for manual skill in drawing; but it is not required. Anyone who can negotiate the angles and curves of the alphabet has all the control required for a sketch.

2. Non-artists do not perceive the world as they draw it, but the way they draw distorts what they see, the same way memory in general distorts what they experience.

There is an obvious comparison between how memory and drawing both exaggerate emotionally significant aspects of perception; between how memory artfully fits experience to narrative, and how drawing unartfully fits vision to outline.

Consider outlines. Outlines exist nowhere but in non-artists’ drawings. Nature defies outline; vision nowhere finds it. Nonetheless when non-artists draw, invariably they first attempt an outline – even cave painters, who were artists when they drew, still loved to outline their hands on the rock. Outlines are not incompatible with art – the Egyptians made high art of shaded outlines – but they are prior to art. Abstract outlines do not depict anything: their value is that, being abstractions, they preserve the symmetries and topologies of what they anonymize – they are mathematical in character and, for simple shapes, the origin of mathematics in the promise of geometry.

The comparison with memory and narrative is obvious. We need not invoke the light-cone diagrams of physics to understand that in anything that happens, an imponderable diversity of causes conspires, and that from anything that happens, an innumerable diversity of effects results. Every event is part of the fabric of the whole world.

Narrative, like outline, is unreal but useful; patterns of events, like shapes, though potentially infinite in variety, tend to approximate simple forms with predictable properties.

But drawing cannot distort information in the same way as memory. However badly someone draws, they do not ever act as if they see the world that way. To walk or sit, to touch or pick up, proves that a non-artist does not bungle seeing the world in the same way as rendering it. Even in dreams no one sees as badly as they draw. The brain does not retouch value into outline before storing it; the reduction to outline is a loss within the brain.

3. If someone does not draw in a style that resembles Western art, that person is not therefore a non-artist. High cultures elevate as art what Westerners might regard as mistakes. Non-artists may not exist. Could it be that everyone distorts their perceptions artistically when they draw, most in more dramatic ways than the subtle ones traditionally valorized in the West?

Western art has its conventions, but to call photography, and the kind of art that obeys the same laws of optics and projection, essentially a cultural convention requires more gall than I have. Human eyes only work one way. The anecdote says that a pygmy brought out of the forest could not tell buffaloes on the horizon from insects. Assume the anecdote is true; what does it prove? It proves that there exists such a thing as a myopic pygmy. Or should we believe that pygmies never look up into the crowns of trees? That they cannot tell that the bird overhead is the same as the bird in the bush? To prize optic validity as artistic quality is cultural; but the validity itself is physiological.

So if for the first 3500 years or so of human history no culture or civilization held the goal of art to be to represent just what was seen, then of course we are readily distracted by other goals, and must be induced by long training to give them up for this one particular goal of realism.

But I am unwilling to credit that the artistic way is ever the easy way. The artists of the cave walls, of Egypt and Sumer, of India and Persia, were no lazier than the artists of Venice, Florence, and Amsterdam. They were not primitive; they were not innocent. To assimilate natural errors to the artistic traditions they happen to resemble is to set Western art on a more ridiculous pedestal than any academy ever proposed.

Too, perspective and foreshortening are not utterly alien to the brain; I suspect that even those whose arts reject these values do in fact dream with them.

Cognitive psychology 3/5

The textbook scientific method abstracts a discovery from the experiment that made it as early and as thoroughly as possible – by suppressing the personalities of the experimenters; by using standardized equipment and methods of analysis; by ensuring that experiments are reproducible, and by actually reproducing them. All these mechanisms are relatively weak in cognitive psychology. Successful experiments advance the careers of the experimenters, the experiments themselves are the instruments, and, given the expense and inconvenience of recruitment and the backlog of experiments yet to be done, reproduction is a last resort. But psychology has never worked the formal way – try it, and you get Behaviorism – and this way seems, for the most part, to work.

Cognitive psychology had to fight to free itself from Behaviorism. Here we come to the problem. Behaviorism rejected introspection; cognitive psychology accepts it; but it exchanges uncritical rejection for uncritical acceptance. I think that there is something that cognitive psychology gets fundamentally wrong about introspection: it assumes that our perceptions resemble our perception of our perceptions.

(Nominally cognitive psychology rejects “introspection” but accepts “self-reporting” – I do not understand the difference.)

For example: a form of experiment is to present a list of words on a common theme for memorization. The list omits some particularly obvious entry that could be expected to occur with the rest. When memorizers repeat the list back, they often supply this absent word. The obvious conclusion, the conclusion that cognitive psychology draws, is that the memorizers have perceived the absent word; that a subconscious process – sub in the sense of subroutine – inserts the extra word into the memorizer’s perception of the list.

But this may not be so. Precisely because the list is simple it obscures the distinction between perception and perception-of-perception. There is another interpretation. What if the memorizers do not perceive the extra word? What if, instead, in reproducing the list, the memorizers perceive their own perception of the list in an incorrect, yet conscious way?

Cognitive psychology 2/5

Before I attempt a reasoned argument I want to sketch some broad points of discomfort with cognitive psychology.

1. It is impossible to pay sustained attention to cognitive psychology without suspecting that the point of many experiments is not the paper, but the press release. When we hear “science writer,” we imagine journalists trawling scientific publications for stories with headline appeal. But this is not how it works. Institutions (with what degree of involvement from researchers I do not know) push press releases to sites like EurekAlert!; journalists may check up on them to add human interest; but the transition between experiment and news item happens inside the institution. As a regime of incentives, this strikes me as perverse.

2. Cognitive psychology is based on the model of the brain as a computer; but this is a trivial statement, nearly a tautology. It implies that a computer is a kind of machine, something like a clock or a car; a sophisticated machine, certainly, but just a machine; just a machine, and the brain just another example.

Cognitive psychology began when the psychologists of the 1960s saw the early electronic computers and found in them an analogy for the mind. Unfortunately this is still true: cognitive psychology still understands the mind as a computer of the 1960s, complete with fMRI blinkenlights.

But this is not what a computer is. The history of computers is not a history of invention; it is a history of discovery. We did not invent computers; we discovered computation. Computation is an aspect of nature, something like heat or gravity, a property of all sufficiently complex systems. If something is not a computer, it is less than a computer. Of course the brain is a computer; and—?

3. Cognitive psychology and ethics in psychological research are roughly coeval. The obvious suspicion is to wonder: did we discover cognitive psychology because it was the only psychology we could discover ethically? In ages when the human form was held sacred, even after death, anatomy without dissection went badly wrong. Medieval anatomy reflected not the body, but medieval ethics. Does psychology reflect the mind, or does it reflect the ethics that direct our examination of the mind?

Cognitive psychology 1/5

So pervasive have the claims of cognitive psychology become, so often do I encounter the rhetoric in which the introductory anecdote takes the form of a cognitive psychology experiment, that I find it necessary to decide just what I think of it, and where I stand. I do so best by doing so publicly.

I admire cognitive psychology. It is the most coherent and fruitful paradigm in psychology. The nosology of biases, by itself, might be the greatest act of thinking about thinking since the Greek achievement of logic.

Of course, like logic, the doctrine of biases is subject to misuse – the more so because it is new and we have little experience with its limitations. In logic, the spirit of catching out logical fallacies trivializes itself in the principle of the fallacy of fallacies – the error of assuming that because an argument is fallacious, its conclusion is wrong. Now, because biases are innate, but the recognition of biases is acquired, the use in argument of accusations of bias cannot be closed by a neat “bias of biases”; but something equivalent is required to avoid the error of rejecting conclusions not because they are wrong, but because they are biased – which, in itself, is no argument.

Biases are everywhere. But the very pervasiveness of cognitive psychology risks creating a new bias – a meta-bias, a confusion between biased and wrong.

Reading

We readers – some of us are compelled to seek the true way of reading, the one discipline that suits all books. The attractions are many. There is a sense of belonging in knowing what kind of reader you are; there is confidence in knowing exactly what you will get from a book; there is the general attraction of any discipline in life – the focus that comes from the neglect of any concerns beyond that focus.

But we readers, after all, are people who choose to spend unusual proportions of our time alone. The value of the independence we must have to read at all holds for our choice of how to read. We do not read to rehearse an opinion, but to animate it – to produce some motion in it analogous to the biology of growth, or healing, or even decay. (Sometimes we read not to reinforce our opinions, but to escape them, gradually page by page.) And sometimes a book should produce the same motion in the idea of reading itself.

The value of reading a book for instruction, instead of running a search; the value of reading a book for diversion, instead of participating in the lives of friends – the value is that the writer, in filling up so huge a thing as even a short book is, must call upon and involve their whole experience and sense of the world, must turn themselves inside-out in such a way that someone else can put them on.

If reading is to be creative – and why not let it be, if it can? – then it must be by a parallel inversion, a counterpoint to writing in which reading calls on and involves you as writing does; which must therefore be subject to change as your experience of the world changes. And for a reader, experience must include reading itself.

All of which is to say that I try not to worry about what kind of reader I am or about how I should read. I change with reading, and I do not even know it until I re-read a book after a long interval and discover how different a reader I have become.

Pseudoscience

Because historians do not understand science, scientists write their own history. So when the perimeters of science change – when what appeared to be a science turns out to have been a pseudoscience – the dead or, worse, retired scientists who pursued it are sorted ex post facto. The ones who anticipated the change remain scientists; the ones who fell for it turn out to have been pseudoscientists. Of course, they were pseudoscientists all along – everyone knew it – but that media, that irresponsible media, they were the ones who made it seem otherwise – we scientists always knew better; it’s your fault for being gullible, you cargo cultists.

You can watch this happening to string theory. In one direction or the other expect to see it in climatology. Let us rehearse the explanations in advance. Someday you may recognize one or the other as a news item or a footnote in history.

Case A:

Despite the overwhelming evidence for the anthropogenic origin of global warming, a movement of so-called skeptics, organized through the resources of corporations and political parties whose interests were threatened by the urgent measures the situation required, was able to delay action until the forces behind climate change had become irreversible. Certain scientists, some through misplaced but sincere conviction, but most because it was convenient and attention-getting, continued to cast doubt on the evidence even after the scientific consensus was incontrovertible. Nonetheless, none of the best scientists failed to see reason, and it is simply false to assert, as some have, that scientists themselves were at fault.

Case B:

In the tense political atmosphere of the early 21st century it was only natural that movements were eager to enlist scientific evidence to support their policies. Given the apocalyptic mood of the time – a quick look at the box-office returns for the first decade of the century will show that the impending end of the world was a cultural commonplace – it is unsurprising that what developed was a superficially scientific vision of the apocalypse. The media too were part of this zeitgeist, and they freely exaggerated a concern many scientists had with the unknown effects of carbon dioxide, and certain alarming high-level trends, into a movement complete with speeches, rallies, and platforms. Nonetheless, none of the best scientists failed to see reason, and it is simply false to assert, as some have, that scientists themselves were at fault.

I am not proposing a debate. I am not trying to convince you of anything except the irrelevance of your convictions. Climate change is just a convenient subject.

(For the record my view is: better safe than sorry. The absence of anthropogenic global warming would be harder to explain than its presence; I therefore am in favor of anything short of irreversible geo-engineering.

Though I do admit to disgust for those who condemn “economists’ reliance on models” with one fork of their tongues, while the other extols “the proven science of climate change” – as if modeling the economy were any harder than modeling the climate.)

I am sarcastic because I am disgusted. But I am not attacking science. I trust that what has been declared a pseudoscience is so. In this retrospect science is as good as infallible. But I dispute the hypocrisy which would pretend that pseudoscience has never entered the mainstream of science, or that if it ever had, it would have been due to outside meddling.

“No true scientist would have participated in X; therefore any so-called scientist who participated in X was not a true scientist.” The whole history of the relation of science and pseudoscience is constructed with this tautology.

True, science is not just a vocation, but also an affiliation, a group, and therefore, like any other group with a purpose, compromised by loyalties and solidarities. But the more acute problem is that science and pseudoscience are not dichotomous. Degrees exist between them.

The word protoscience has been advanced for the pre-scientific pursuits that led into sciences – as astrology led into astronomy, as alchemy led into chemistry, as doctoring led into medicine. There may be many more intermediate degrees of this kind, but I propose only one. Some scientific pursuits are neither sciences nor pseudosciences, but placeholder sciences. The textbook scientific method expects, within a science, that observation, hypothesis, and theory will follow in order. But in much scientific work the science itself is a hypothesis. The first question is not, “What law governs this phenomenon?” but “Is this a phenomenon at all?”

Most -ologies are not really fields at all, but gambits. In science a field does not arrive and then demand methods, subjects, a center and journals; instead the methods, subjects, the center and the journals are how hopeful scientists attempt to bootstrap a new science into being. In the end the attempt either succeeds as a science, or fails as a pseudoscience; but in the meantime it is neither – it invites a science and clears a space for it, saves it a seat. It is a placeholder.

For scientists of a later generation to judge a placeholder science according to its final result is unfair. The scientists who failed were not cranks; the scientists who succeeded were not visionaries. To suppose they could have known better in advance is to suppose a faculty which, if it did exist, would make actual science superfluous. A scientist can no more know in advance if a science is a pseudoscience than a computer can know in advance if a program will halt.
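To make the analogy concrete, here is a minimal sketch, in Python, of the diagonal argument behind the halting problem; the names halts and contrarian are hypothetical, chosen only for illustration, and the stub does not pretend to be a working test.

    def halts(program, arg):
        """Hypothetical oracle: True if program(arg) would eventually stop.
        No correct implementation can exist; this stub only marks the claim."""
        raise NotImplementedError("no general halting test is possible")

    def contrarian(program):
        """Diagonal construction: do the opposite of whatever the oracle
        predicts about a program applied to itself."""
        if halts(program, program):
            while True:      # predicted to halt, so loop forever
                pass
        else:
            return           # predicted to loop, so halt at once

    # If halts(contrarian, contrarian) returned True, contrarian would loop;
    # if it returned False, contrarian would halt. Either answer refutes the
    # oracle, so no such oracle can be written – and, by the same token, no
    # scientist can inspect a placeholder science and read off its fate.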

All sciences begin as gambits. The sooner we recognize this, the sooner we can avoid misplacing the faith due a mature science in its placeholder; and, more importantly, the sooner we can begin not just accommodating such gambits, but encouraging them.