
I was on blogging leave last week. Not my usual no-writing-today-so-fuck-right-off-and-read-someone-who-cares leave — actual medical leave. I was confined to the hospital for more than a week with acute appendicitis. (Fine, thanks, but a word of advice: if you must have your appendix out, do so before it ruptures.) Hospitals are interesting places; hang around one for a few days and you start to believe that Foucault had a point after all. I shall have more to say about them presently, but for the moment I will confine myself to a few preliminary observations.

1. If you want to know when you can expect to be released, consult the Wiki, not your doctor. The average stay for acute appendicitis is about a week. Your doctor will never tell you this, lest he sound too much like an algorithm in an expert system, which can probably outdiagnose him anyway. Best to keep your mouth shut, particularly if you are inclined to ask questions like, “Can you give me a range of dates within which you expect, with 0.9 probability, to release me?” Complaining to my doctor about her vagueness provoked a stern and rather terrifying lecture about how medicine is both an art and a science and each individual case is different. As it turned out, because of her art and my individuality, I spent thirteen more hours in the hospital than the average.

2. Pet therapy appears to be medically certified. One morning I was awoken from a fitful sleep by a mangy griffon called Kindu — I read the name from his official hospital ID, and yes, his department is “Pet Therapy” — who is apparently hauled from bed to bed, to be petted serially. The hygienic implications of this program may not bear scrutiny.

3. Catholic hospitals take their religion more seriously than you might imagine. St. Vincent’s has a rather large chapel, although I never saw it occupied. It also employs priests who roam the halls, ostensibly to offer succor. This merely annoys the non-believer; and if I did believe, and were sick in a hospital bed, I wouldn’t be in any special hurry to see one either. One of my roommates’ guests also thoughtfully took time from his busy schedule to try to bring me to Christ. This, however, was not authorized by the hospital.

4. In Deconstructing Harry, Woody Allen asks the prostitute he has hired how she likes her job and receives the usual reply. “It’s funny,” he says, “every hooker I meet says it beats the hell out of waitressing. Waitressing must be the worst job in the world.”

It’s not. Nursing is.

5. Old people spend a really remarkable amount of time discussing “Dancing with the Stars.”

Aaron Haspel | Posted November 3, 2007 @ 11:27 AM | Navel-Gazing

In the Seaworld auditorium, waiting for the evening Shamu show, watching the wide-screen video. It shows soldiers and firemen, then August Busch III, or maybe IV, representing Anheuser-Busch, the corporate parent. He tells us to honor our heroes, which we do, with applause. A whale trainer, perky, live, asks everyone who has served in the American, Canadian, or British Armed Forces to rise. They do, to more applause. I wonder where Australia went. The video ends and the show begins. Shamu splashes soldier and civilian alike.

Bertha, my server for the evening, points out that she makes the salads. I do not order one.

My ten-year-old niece says that she does not understand me and I scare her.

(After our new Poet Laureate.)

Update: Jim Henley comments. Jim and I have had our disagreements in the past, but we stand shoulder to shoulder in the unalterable conviction that, as bad a poet as Charles Simic is, Billy Collins is worse, perhaps the worst in human history. Sound and Fury comments.

Aaron Haspel | Posted August 11, 2007 @ 3:09 PM | Navel-Gazing,Poetry

It is a cherished belief, in all Objectivist as well as certain fellow-traveling circles, that economic interventionism must collapse under its own weight. Here, for instance, is Ludwig von Mises, in Planned Chaos:

Many advocates of interventionism are bewildered when one tells them that in recommending interventionism they themselves are fostering antidemocratic and dictatorial tendencies and the establishment of totalitarian socialism. …

What these people fail to realize is that the various measures they suggest are not capable of bringing about the beneficial results aimed at. On the contrary they produce a state of affairs which from the point of view of their advocates is worse than the previous state which they were designed to alter. If the government, faced with this failure of its first intervention, is not prepared to undo its interference with the market and to return to a free economy, it must add to its first measure more and more regulations and restrictions. Proceeding step by step on this way it finally reaches a point in which all economic freedom of individuals has disappeared. Then socialism of the German pattern, the Zwangswirtschaft of the Nazis, emerges.

Mises has just asserted, on the previous page, that for interventionists “the main thing is not to improve the conditions of the masses, but to harm the entrepreneurs and capitalists.” If this is true, it puts his claim that interventionism produces “a state of affairs which from the point of view of [its] advocates is worse than the previous state” in doubt. But what really interests me is the slippery-slope argument that interventionism inherently leads to socialism.

The example Mises chooses to support this thesis is price controls. The government begins by controlling the price of milk. The supply of milk declines, as the marginal producers are driven out of business. This is not what the government wants at all; so it continues by controlling the prices of the factors of milk production. The logic repeats itself a few more times, until we arrive at socialism of the German pattern. Mises, being no mean economist, points out that the government could guarantee milk for poor children more effectively by buying it at the market price and giving it away or selling it at a loss. The populace pays for this in taxes, of course, and you might end up with a black market in milk, but it surely beats price controls. Yet this policy is interventionism, just as price controls are. Does socialism emerge in either case? Or do only particularly stupid forms of interventionism produce the slippery slope?

The Objectivists, as is their wont, go a good deal further. Only Objectivism itself can halt the long, slow slide of the mixed economy into slavery. The go-to guy for over-the-top Objectivist pronouncements is not Ayn Rand herself but her “intellectual heir,” Leonard Peikoff. His book The Ominous Parallels is notable as the only work of German historiography ever written by someone who cannot read German. It also contains this gem:

No one can predict the form or timing of the catastrophe that will befall this country if our direction is not changed. No one can know what concatenation of crises, in what progression of steps and across what interval of years, would finally break the nation’s spirit and system of government. No one can know whether such a breakdown would lead to an American dictatorship directly — or indirectly, after a civil war and/or foreign war and/or protracted Dark Ages of primitive roving gangs.

What one can know is only this much: the end result of the country’s present course is some kind of dictatorship; and the cultural-political signs for many years now have been pointing increasingly to one kind in particular. The signs have been pointing to an American form of Nazism. …

There is only one antidote to today’s trend: a new, pro-reason philosophy.

This new pro-reason philosophy, of course, would be Objectivism. Now I think we can agree that in the twenty-five years since this passage was written two things have not happened. Objectivism has not swept the country, and American-style Nazis have not taken over the government. (Anyone who thinks the Bush gang counts needs to acquaint himself with the real Nazis.)

We have had approximately steady-state interventionism in the United States for a long time. Federal spending has hovered around 20% of GDP since the Second World War — no matter who was President, no matter which party controlled Congress, no matter what. Naturally there has been a great deal of expensive tinkering. The airlines are regulated, then deregulated. Savings and loans are encouraged, through insurance, to invest in risky propositions and then, after they lose hundreds of billions, enjoined from doing so. Liberty advances, when the draft is eliminated; and retreats, when the state sponsors offshore torture and suspends habeas corpus for citizens who are classified as “enemy combatants.” On the one hand the Fairness Doctrine is scrapped. On the other Draconian regulations are imposed in quasi-public spaces like offices, stores, and restaurants. To call these changes marginal would be an exaggeration; to call them a lurch toward fascism would be absurd.

Peikoff hastens to say that neither he nor anyone else can predict “the form or timing” of the coming dictatorship. Mises, similarly, disassociates himself from historical determinism, saying that the socialist tide can be stemmed with “common sense and moral courage,” which do not appear to be in any greater supply now than they were then. Their belief, in other words, commits them to nothing whatever. Barring an unlikely sudden upsurge of Objectivism, common sense, or moral courage, Peikoff and Mises are, epistemologically, on all fours with Christians who await the Rapture.

As Eliezer Yudkowsky puts the matter:

The rationalist virtue of empiricism consists of constantly asking which experiences our beliefs predict — or better yet, prohibit. Do you believe that phlogiston is the cause of fire? Then what do you expect to see happen, because of that? Do you believe that Wulky Wilkinsen is a post-utopian? Then what do you expect to see because of that? No, not “colonial alienation”; what experience will happen to you? Do you believe that if a tree falls in the forest, and no one hears it, it still makes a sound? Then what experience must therefore befall you?

It is even better to ask: what experience must not happen to me? Do you believe that elan vital explains the mysterious aliveness of living beings? Then what does this belief not allow to happen — what would definitely falsify this belief? A null answer means that your belief does not constrain experience; it permits anything to happen to you. It floats.

When you argue a seemingly factual question, always keep in mind which difference of anticipation you are arguing about. If you can’t find the difference of anticipation, you’re probably arguing about labels in your belief network — or even worse, floating beliefs, barnacles on your network. If you don’t know what experiences are implied by Wulky Wilkinsen being a post-utopian, you can go on arguing about it forever. (You can also publish papers about it forever.)

Above all, don’t ask what to believe — ask what to anticipate. Every question of belief should flow from a question of anticipation, and that question of anticipation should be the center of the inquiry. Every guess of belief should begin by flowing to a specific guess of anticipation, and should continue to pay rent in future anticipations. If a belief turns deadbeat, evict it.

Consider this an eviction notice.

Aaron Haspel | Posted August 1, 2007 @ 10:25 AM | Philosophy,Politics

The odor from the stinkbomb that Colby Cosh lobbed at The Sopranos has wafted hither. T.S. Eliot notoriously remarked that the only method of the critic is to be very intelligent. This doesn’t help the writer much, but it saves the reader all kinds of time, allowing him to skip, say, the critical efforts of Brian Williams, the noted newsreader. Williams, to be fair, is terrifically game about the whole business, and one must admire him, in the way Dr. Johnson admired a woman preaching.

By the same standard, we are obliged to treat Colby’s comments seriously:

I haven’t seen very many episodes of The Sopranos over the years — only just enough to know that it was a derivative show universally praised for its originality, and an amazingly slackly-written show universally praised for its tight writing.

David Chase is supposed to have had the whole thing pretty well sketched out in his godlike genius brain right from the get-go, and if you can believe that while fumbling with the loose ends of two dozen plot threads, you’ll believe it was incredibly inventive to have a mob boss living in a New Jersey suburban neighbourhood in the guise of a waste-management executive. (Did the producers ever just go ahead and actually put a “DARK UNDERBELLY OF THE AMERICAN DREAM LOCATED HERE–NO FLASH PHOTOGRAPHY” sign on the front lawn of Casa Soprano?)

Here Colby has forgotten the second, implied part of Eliot’s injunction: to be very intelligent while practicing criticism. Yes, the big theme of the show is unmissable, like many big themes. Their size makes them relatively easy to spot. Let’s try a few. Emma: people must discover their happiness for themselves. Lost Illusions: success and merit are weakly correlated. The Brothers Karamazov: Christianity or moral chaos, the choice is yours. The Man Without Qualities: I, Robert Musil, am the cleverest man in the world.

Two of these are obvious. Two are false. Yet the books are all very much worth reading. Such merit as they have must lie elsewhere.

The Sopranos's big theme markedly resembles that of The Great Gatsby, though I for one am thankful to be spared a shot of James Gandolfini floating face-down in his swimming pool. Yet anyone who leveled the “no flash photography” charge at Fitzgerald would be missing the point. You read The Great Gatsby for the beautiful shirts and the voice full of money, the cufflinks of Meyer Wolfsheim and the eyes of Dr. T.J. Eckleburg, Jordan Baker cheating at golf and young Jimmy Gatz studying electricity from 7:15 to 8:15 every morning.

Details from The Sopranos adhere to your consciousness in the same way. A witness is gung-ho to testify in a murder case, until he finds out the perp is Tony Soprano. In the scene in which he changes his mind, the book he’s reading is Nozick’s Anarchy, State, and Utopia. Tony’s dreadful mother dies, and his dreadful sister conducts the wake by going around the room and insisting that each guest dredge up a pleasant memory. In the very top of the frame Uncle Junior enters the room, has no idea what’s going on except that he wants nothing to do with it, and bolts up the stairs. Or in the last season, we have Junior again, now confined to an asylum for the criminally insane, running a poker game for imaginary stakes in a parody of his mob life, itself a parody of legitimate business. Yet to Junior and his young MIT-educated Chinese underling, life in the asylum is the only life there is, and the parody gradually grows earnest. None of these bits advance the main plot threads in the slightest. You watch the show, in short, for what John Crowe Ransom used to call “texture.”

“Slackly written” is an epithet, and you can’t win a wrestling match with an epithet. Certainly for many episodes, and a couple entire seasons, like Five and Six, “slack” is a charitable term. Eighty hours of television, even some of the best television that has ever been, will have its slow spots. In this The Sopranos resembles every epic work of art ever produced by man. You don’t want to read the theory of history with which Tolstoy concludes War and Peace or Victor Hugo’s hymn to the sewers of Paris in Les Misérables either, believe me.

The Sopranos went on too long because most of its characters were not intelligent enough to trace story arcs — more like lazy circles. Christopher goes on drugs, goes off drugs, goes on drugs again, goes off the road while on drugs. Carmela threatens to leave Tony, doesn’t, finally does, returns, threatens to leave again, sticks this time. A.J. does stupid shit, grows older, does really stupid shit. Everyone ends up as he began, or dead. This is what makes it inferior to Deadwood and The Wire. And this is the criticism that Colby might have made and did not, because to make it requires watching more episodes than “not very many.”

Aaron Haspel | Posted July 24, 2007 @ 6:17 PM | Culture,Literature,Movies

Pickup on South Street

Richard Widmark sneers.
Thelma Ritter finks, sells ties.
Audience nods off.

Rancho Notorious

Alas for Fritz Lang,
direction translates poorly
from the German.

Thieves Like Us

Is it possible?
An Altman fanboy favorite
that can be sat through?

Cries and Whispers

The TV is clear
of dust where the red envelope
has lain three weeks.

Aaron Haspel | Posted July 24, 2007 @ 6:07 PM | Movies,Poetry

Brian Doherty’s Radicals for Capitalism is a comprehensive, highly entertaining history of libertarianism with too many points of interest — Murray Rothbard’s solution to the free rider problem (“so what?”), Milton Friedman’s sterling character, The Unbearable Lightness of Being a Deontologist — to deal with in a single post. Instead I want to talk about the notes.

Radicals for Capitalism is a scholarly, though not an academic, book, and like many such books it does plenty of business in the notes. Not as much as some, like Popper’s The Open Society and Its Enemies, in which the notes are longer than the text, but enough. For instance, my friend (and frequent commenter) Jim Valliant’s book on the Brandens, The Passion of Ayn Rand’s Critics, receives a half-page treatment in the endnotes, but none in the text. Out of 2,000 notes, there are 400 or so that you want to read; the rest are simple source citations.

Doherty’s notes receive the standard treatment, which is to say the worst possible. The notes are renumbered by chapter, but each page of notes is headed, usefully, “Notes”; the chapter titles occur only on the beginning page of the notes for that chapter. To look up an endnote, then, you have to remember the number, remember the chapter number, flip to the notes section, locate the beginning page of the correct chapter, and then flip forward to the right note number, only to be disappointed most of the time with a mere source cite. (Admittedly it would be more efficient to use a bookmark, but I never have one handy, and they tend to fall out. At any rate, the necessity confesses design failure.)

Yet this is all so simple to fix. There are five rules for notes:

1. Footnotes, provided they are short and sparse, are better than endnotes. They can be consulted immediately and without effort. Obviously in a book like Doherty’s endnotes are necessary.

2. Each endnote page should be headed by the page numbers of the notes it contains, to facilitate easy flipping. For example, “Notes, pp. 537-558”; not “Notes: Chapter Seven,” or “Notes: A Stupid Chapter Title That I’ve Forgotten and Now You’re Gonna Make Me Look It Up,” or, God forbid, “Notes.”

3. Notes should not be numbered. Numbers tax the reader needlessly, especially when they reach three figures. They should be marked by a symbol in the text, something like this or this. In the back they should be referenced by the page number and the last few words of the passage that they annotate, which are the easiest things to remember.

It would be especially helpful to use two symbols, to distinguish substantive comments from simple citations, telling the reader when to flip to the back and when not to bother. I have never seen this in a scholarly book, and I wonder why.

4. The notes must be indexed. In Doherty’s book they are not. Had Jim Valliant gone looking for himself in the index, as I am assured august persons are wont to do, he would have come up empty. Why make trouble for Jim? If he merits a substantive mention, he also merits an index entry. I realize this is extra work. I expect extra work for my thirty-five bones, now marked down to $23.10, plus shipping.

5. The text should contain as little scholarly detritus as possible. Academic books often include source citations in the text, which avails the author the opportunity to look more erudite and avails the reader nothing, since if he wants to look up the source he has to consult the bibliography anyway. If the book has endnotes, that’s where the source cites belong.

A brilliant exception to this rule is Jacques Barzun’s From Dawn to Decadence, which contains no specific source cites, only an occasional parenthesis, when discussing a topic, that “the book to read is…” or “the book to browse in is…” If you are a nonagenarian and the world’s preeminent living intellectual, you can write like that. The rest of us cannot afford to be so peremptory. Still, Barzun’s asides have furthered my education, which is more than I can say for the usual uncommented bibliography.

Yes, a circle would be better. I can’t get a circle the right size using HTML character codes. Sorry.

Yes, a larger bullet would be better. See above. I trust you get the idea.

Update: Another intransigent opponent of endnotes, Billy Beck, heard from. I thank him for his recommendation of the Zerby book, which I will look up. Kieran Healy comments. Andrew Gelman comments. James Joyner comments. Evan Hughes comments.

Aaron Haspel | Posted March 13, 2007 @ 7:08 PM | Heuristic,Literature

Our Girl in Chicago, Laura Demanski, has roused me from my torpor by asking for an interpretation of Philip Larkin’s Spring. She might want to quote the whole sonnet instead of its last six lines. Fourteen consecutive lines of verse will probably not tax most readers unduly. Larkin was an accomplished and rigorous editor of his own poems, and if he had wanted the octet omitted he might have thought to do so himself.

Green-shadowed people sit, or walk in rings,
Their children finger the awakened grass,
Calmly a cloud stands, calmly a bird sings,
And, flashing like a dangled looking-glass,
Sun lights the balls that bounce, the dogs that bark,
The branch-arrested mist of leaf, and me,
Threading my pursed-up way across the park,
An indigestible sterility.

Spring, of all seasons most gratuitous,
Is fold of untaught flower, is race of water,
Is earth’s most multiple, excited daughter;

And those she has least use for see her best,
Their paths grown craven and circuitous,
Their visions mountain-clear, their needs immodest.

The last three words baffle Laura; we will come to them presently. She cites a “reading” of the poem (which also quotes it only in part) that sheds no light on them.

I’d read lots of odes to Spring in my time but none that contained his piquant blend of lyricism and discontent. How often had I not felt that nature was doing its beautiful best but that my mood or circumstances simply didn’t match it? All of us must, at some time, have felt out of harmony with nature. The line ‘And those she has least use for see her best’ acknowledges the paradox that if one’s life were on a par with all that Spring represents, Spring would not be noticeable except as an accompaniment to one’s own blossoming. We see it so clearly because the contrast with our own state is so marked.

This is less a reading than a view from 10,000 feet, and none too clear at that. The theme of Spring is the radical discontinuity between conscious and subconscious or unconscious life, to which a phrase like “if one’s life were on a par with all that Spring represents” scarcely does justice. The poem is not a description of an unspringlike “mood,” and the “contrast with [his] own state” is incidental. The poet sees spring clearly because he possesses intellect, which is “indigestible,” sterile, unnatural. It is the subject of the poem, although the word never appears. Thinking humans feel “out of harmony with nature” because they are out of harmony with nature.

The theme is not original with Larkin. One finds it in many of the tougher poets — in Emily Dickinson (What mystery pervades a well), in Tristan Corbière (La Rapsode Foraine et le Pardon de Sainte-Anne: “L’innocent est près du ciel”), and in Yvor Winters (A Summer Commentary), among others.

The details in the octet are carefully managed. Nothing is at eye level. Larkin starts on the ground, with people sitting, walking, fingering the grass. “Walk in rings” is literally what people do in parks; it also connotes aimlessness, subconsciousness, mere existence. In the third line the poet shifts his attention upward, to the “calmly” standing cloud and singing bird. The adverb is chosen advisedly: their calm is the calm of belonging, as he himself, in his “pursed-up way,” does not. The light of the sun in the fifth line directs one’s attention to the ground again, to the bouncing balls and barking dogs, and then suddenly we encounter the striking “branch-arrested mist of leaf,” as if the poet were looking upside-down at the tree, growing out of the sky, leaves first, instead of the ground. With all of this back-and-forth between earth and sky, “threading,” in line seven, becomes peculiarly apt.

Lines nine through eleven, with their “piquant” description of the season, have made the poem famous. Such piquancy as they have arises from their continuation of the theme. Spring is “gratuitous” in both the primary and secondary senses. It spawns life — excited, multiple — in a way no other season does, for nothing, gratuitously. (The container, spring, is “excited,” while the contained, the cloud and the bird, are calm.) At the same time, for the poet, spring is also gratuitous, unnecessary — just grist for the conscious mill. “Untaught flower” emphasizes, again, the unbridgeable barrier between thought and not-thought.

In his summary Larkin ironically chooses another nature metaphor, “mountain-clear,” to describe his apartness from nature. One never breaks away entirely. The poet walks in the park too. By now the “immodest needs” should be clear. Spring, for him, will not suffice: it is not enough to breathe and bark and sing and caper. His vocabulary is equally immodest: he treads — threads — “circuitous paths”; those whom the season “has use for” merely walk in rings. Larkin finds this conclusion too grandiose for the feeling of the poem, and he undercuts it with a rhythmic trick. The last line has eleven syllables and rhymes on its feminine ending, giving the impression of trailing off in a mumble. Immodesty, put as modestly as possible.

Update: Laura is not a dullard. She has written a vast deal of entertaining and informative prose, which is not what dullards do. One does not suddenly become a dullard by failing to quote the octet.

Aaron Haspel | Posted February 20, 2007 @ 6:03 PM | Poetry

Begin with a data set, preferably one in which many people are interested. Let’s say, World Series results from 1903 to the present.

Now ask a question about the data, one that should be easy to answer with a highly simplified model. Our question will be: have World Series teams, historically, been evenly matched?

Our model will ignore home-field advantage. In baseball the home team wins 53% or 54% of the time; nonetheless, we will assume that each team has a probability of 0.5 of winning each game. This gives the following expected probabilities for a best-of-seven series running four, five, six, or seven games:

P(4) = 0.125
P(5) = 0.250
P(6) = 0.3125
P(7) = 0.3125
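These expected probabilities fall out of a two-line calculation: the series winner must take the final game plus exactly three of the games before it, and either team can be the winner. A minimal sketch in Python (the function name is my own):

```python
from math import comb

def series_length_prob(n):
    """Probability that a best-of-seven series between evenly matched
    teams (each game a fair coin flip) lasts exactly n games: the winner
    takes game n plus exactly 3 of the first n - 1 games, and the factor
    of 2 covers either team being the winner."""
    return 2 * comb(n - 1, 3) * 0.5 ** n

for n in range(4, 8):
    print(n, series_length_prob(n))
# 4 0.125
# 5 0.25
# 6 0.3125
# 7 0.3125
```

The four probabilities sum to exactly 1, as they must: a best-of-seven between non-tying teams always ends in four to seven games.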

Remember that if the model is too simple to fit the data, you can clean the data. Since 1903, the World Series has been played every year but two. There were a few best-of-nine series and a few more that included ties, which are too complicated to deal with. Throw them out. This leaves 95 series. Draw up a little chart comparing actual and expected probabilities, like so:

Possible outcomes    P(Expected)    P(Actual)
4-0                  0.125          0.179
4-1                  0.250          0.221
4-2                  0.3125         0.242
4-3                  0.3125         0.358

Now answer your own question. If the teams were evenly matched, the results would hew reasonably closely to the expected probabilities from the model. In fact there are anomalies. There are always anomalies. The World Series has been swept 17 times, five more than the model would predict. Plug this into the BINOMDIST function in Excel. (Understanding how this function works is optional and may in some cases be a disadvantage.) You find that, if the probabilities in the model were correct, there would be 17 or more sweeps in 95 occurrences only 8% of the time. A rotten break: you’re three lousy percent under statistical significance. But that aside, eleven of those were won by the team with the better regular-season record, several by teams considered among the all-time greats, including the 1927, 1939 and 1998 Yankees. That probably means something. On the other hand, the team that held the American League record for wins before 1998, the 1954 Indians, was swept by the Giants. Conclude judiciously that, on the whole, the data imply an occasional mismatch.
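If you lack Excel, the BINOMDIST figure can be reproduced with an exact binomial tail sum. A sketch (the helper name is mine, not a library function):

```python
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p) -- the same number Excel's
    1 - BINOMDIST(k - 1, n, p, TRUE) returns."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 17 or more sweeps in 95 series, when the model gives a sweep
# probability of 0.125: roughly 0.08, the 8% quoted above.
print(binom_tail(17, 95, 0.125))
```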

Look for any bonus anomalies. It doesn’t matter if they have nothing to do with your original question. Our data set turns up a nice one; the series went to seven games 34 out of 95 times — five too many, according to the model. This would occur randomly, assuming correct probabilities, only 20% of the time.

Damn, we’ve missed out on statistical significance again. Instead of looking at how often the series went seven, we can look at how often the team behind 3-2 won the sixth game. 34 out of 57, a somewhat more unusual result. Plug it back into BINOMDIST: we’re down to 9%, which is close but not close enough.
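Both of these tail probabilities can be checked the same way (again a sketch; binom_tail is my own helper, not a standard function):

```python
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 34 or more seven-game series out of 95, model probability 0.3125:
# roughly 0.20, the 20% above.
print(binom_tail(34, 95, 0.3125))

# Team behind 3-2 wins game six 34 or more times out of 57 at even
# odds: roughly 0.09, the 9% above.
print(binom_tail(34, 57, 0.5))
```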

It has become inconvenient to look at the entire data set; let’s take just a chunk of it, say, 1945 to 2002. In those 58 years the World Series lasted seven games 27 times, which would happen by chance a mere 1% of the time. Furthermore, the team behind 3-2 won the sixth game 27 of 39 times; again, a 1% chance. Statistical significance at last!
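The cherry-picked window checks out too, for whatever a cherry-picked window is worth (same caveat: the helper name is mine):

```python
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 27 or more seven-game series in the 58 years 1945-2002, model
# probability 0.3125: about 0.01.
print(binom_tail(27, 58, 0.3125))

# Team behind 3-2 wins game six 27 or more of 39 times at even odds:
# about 0.01.
print(binom_tail(27, 39, 0.5))
```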

Next, concoct plausible explanations for your new, statistically significant anomaly. Maybe the team that is behind plays harder, with their backs against the wall. Maybe they use all of their best pitchers, holding nothing in reserve for the seventh game. Maybe the team that is ahead chokes and cannot close it out.

Under no circumstances should you test these explanations. In the World Series the team that won Game Six also won Game Seven 18 times out of 34 — not likely if they had squandered their resources to win Game Six. In basketball, in the NBA Finals, the team that led 3-2 won Game Six 26 times out of 45. This is the opposite of what we found in baseball, in a sport that rewards hard play more and is far more conducive to choking, as anyone knows who has tried to shoot a free throw in a big game. In other words, your explanations, though plausible, are false. The result is probably due to random variation. This should not discourage you from completing your article. Write up your doubts in a separate note several months later.

Finally, check the literature to make sure your idea is original. If it isn’t, which is likely, mention your predecessor prominently in your acknowledgements, and include a footnote in which you pick a few nits.

Submit to suitable journals. Repeat unto death, or tenure, whichever comes first.

Update: Actual professional statisticians comment. Evolgen, who may or may not be a professional statistician, comments.

Aaron Haspel | Posted January 4, 2007 @ 9:34 PM | Baseball

“Sentence first — verdict afterwards,” says the Queen of Hearts in Alice in Wonderland; and the trial of the Knave of Hearts has justly remained the literary standard for injustice, since the book’s publication in 1865.

Being an idiot, I thought the expression originated with Lewis Carroll, until last night. I was reading Macaulay’s 1830 essay on Lord Byron, and ran across the following passage, on Byron’s failed marriage: “True Jedwood justice was dealt out to him. First came the execution, then the investigation, and last of all, or rather not at all, the accusation.” The term “Jedwood justice,” also new to me, implied that the concept is proverbial, and led to a slightly earlier citation, in 1828, from Walter Scott’s Fair Maid of Perth: “Jedwood justice — hang in haste and try at leisure.”

Jedwood (or Jedburgh) justice, it turns out, goes under various aliases: Cupar (or Cowper) justice, Halifax law, Abingdon law, and Lydford law. Cupar and Halifax are dead-ends. A Major-General Brown, of Abingdon, is supposed to have hanged his prisoners and then tried them, but Brewer’s Dictionary of Phrase and Fable appears to be the sole authority for the Major-General’s existence.

Lydford proves more fertile. Chambers’ Book of Days cites an “old English proverb”: “First hang and draw, then leave the cause to Lydford Law.” It also quotes a poem, by the early-seventeenth-century poet William Browne, in Lydford’s defense:

I oft have heard of Lydford Law,
How in the morn they hang and draw,
And sit in judgment after:
At first I wondered at it much;
But since, I find the reason such,
As it deserves no laughter.

They have a castle on a hill;
I took it for an old wind-mill,
The vanes blown off by weather.
To lie therein one night, ’tis guessed
’Twere better to be stoned and pressed,
Or hanged, now choose you whether.

Ten men less room within this cave,
Than five mice in a lantern have,
The keepers they are sly ones.
If any could devise by art
To get it up into a cart,
’Twere fit to carry lions.

When I beheld it, Lord! thought I,
What justice and what clemency
Hath Lydford when I saw all!
I know none gladly there would stay,
But rather hang out of the way,
Than tarry for a trial!

Browne lived in Tavistock, a neighboring town in West Devon, and he knew what he was talking about: Lydford prison was described in 1512 as “one of the most heinous, contagious, and detestable places in the realm” (here’s a rather bucolic picture of the ruins). Depending on how long one had to tarry for a trial, Browne’s reasoning may have been sound as well. It is amusing at the very least.

My patchy scholarship, abetted by some desultory Googling, can take me no further. Can my readers supply earlier citations, in English or another language?

Update: You can tell me or you can tell Language Hat.

Aaron Haspel | Posted December 30, 2006 @ 1:13 PM | Language

Albert Hirschman has many fans at the arbiter of all things serious, Crooked Timber. Tyler Cowen, in one of his fitful attempts to shore up his left-wing cred, praised Hirschman as deserving of the Nobel Prize in Economics and The Rhetoric of Reaction as “a brilliant study in intellectual self-deception.” Good enough! I ordered up my copy and prepared to be edified.

The Rhetoric of Reaction proposes a taxonomy, or really a nosology, of arguments frequently employed by reactionaries. It begins with T.H. Marshall’s Class, Citizenship, and Social Development and its convenient, if schematic, tripartite division of “the development of citizenship” in the West. According to Marshall, first there were civil rights (freedom of religion, speech, and thought); then political rights (universal suffrage); and finally economic rights (the welfare state). Marshall allots these three developments a century apiece — the eighteenth, nineteenth, and twentieth, respectively. They are “progressive.” Whoever opposes any of them is “reactionary.”

If that’s all it takes, then count me in: I won’t defend universal suffrage, let alone the welfare state. I take solace in my distinguished and eclectic company. Hirschman’s reactionaries range from monarchists like Maistre and Burke to flaming socialists like Mosca and Pareto to welfare state critics like Friedman, Hayek, and Charles Murray, who get an especially raw deal. Friedman, who proposed a negative income tax, and Murray, with his similar grand scheme to replace the welfare state, cannot be fairly characterized as intransigently opposed to “reform.” Violence is being committed on the terms “reactionary” and “reformer.”

But to make a neat taxonomy you have to break a few eggs, and Hirschman’s is very neat indeed. We reactionaries, Hirschman says, argue against a proposed “reform” in three ways. The policy will do the opposite of what was intended (perversity). The policy will do nothing at all (futility). The policy will do other damage unrelated to its ends (jeopardy).

Hirschman’s categories are also more fluid than he acknowledges; the identical argument must be reclassified depending on how the reformer defines his ends. Take gun control. An opponent — the “reactionary” — might, and probably will, argue that it will prevent homeowners from defending themselves. This will reduce the risk to criminals, and thus crime will increase. If the advocate — the “reformer” — defines his end as reducing crime, we have a perversity argument. If he defines his end as reducing household gun accidents, we have a jeopardy argument. Hell, if the reformer defines his end as protecting innocent homeowners, and the additional homeowners who are shot by robbers cancel the ones who no longer shoot themselves, we might even have a futility argument. But it’s the same argument.

Still, Hirschman is on to something here. Jeopardy, futility, and perversity are all variations on unintended consequences, a traditionally rich field for ironists, and his thesis goes a long way toward explaining why “progressives” are so excruciatingly sincere:

There has been a certain lack of balance in the recurring debates between progressives and conservatives: in the effective use of the potent weapon of irony, conservatives have had a clear edge over progressives. In [Tocqueville's] hands [the French Revolution] begins to look naive and absurd, rather than infamous and sacrilegious — the predominant characterization conveyed by earlier critics such as Maistre and Bonald. This aspect of the conservatives’ attitude toward their opponents was also reflected by the German term Weltverbesserer (world improver), which evokes someone who has taken on far too much and is bound to end up as a ridiculous failure…. In general, a skeptical, mocking attitude toward progressives’ endeavors and likely achievements is an integral and highly effective component of the modern conservative stance.

I once read a news item about an oil-slick cleanup; it might have been the Exxon Valdez spill, I can’t remember. Countless mammals and birds are scrubbed; vast trouble is taken. Finally all is ready: the cosmetologists gather on the beach, and a freshly shampooed otter is ceremoniously released into the sea. It swims to the crest of the first wave, where it is promptly eaten by a killer whale. If you laugh, you are a reactionary.

Of course it is funny. But so what? Maybe the otter ran into extremely bad luck. Maybe so many animals were rescued, and so efficiently, that a few meals for Shamu made no difference. Perhaps what really makes a reactionary is that he finds this story not only funny, but a dispositive argument against oil-slick cleanups. I owe this thought to Hirschman, and it is enough to make me glad to have read the book. “Reactionaries” pride themselves on deep thinking and “hard-headed realism” the same way “progressives” pride themselves on moral superiority, and often with no more justification. Not all reforms fail, and not all unintended consequences are bad. It is salutary to be reminded to cast out the beam from your own eye before beholding the mote in your adversary’s.

But Hirschman has broader aims:

There has indeed been a more basic intent: to establish some presumption, through the demonstration of repetition in basic argument, that the standard “reactionary” reasoning, as here exhibited, is frequently faulty….

A general suspicion of overuse of the arguments is aroused by the demonstration that they are invoked time and again almost routinely to cover a wide variety of real situations. The suspicion is heightened when it can be shown, as I have attempted to do in the preceding pages, that the arguments have considerable intrinsic appeal because they hitch onto powerful myths (Hubris-Nemesis, Divine Providence, Oedipus) and influential interpretive formulas (ceci tuera cela, zero-sum) or because they cast a flattering light on their authors and provide a boost for their egos. In view of these extraneous attractions, it becomes likely that the standard reactionary theses will often be embraced regardless of their fit.

Hirschman does not establish, beyond noting the similarity in the stories, that the perversity and futility theses “hitch onto” Oedipus and Hubris-Nemesis. And even if they do, where did the myths themselves originate? Isn’t it likely that both Oedipus and the argument from perversity, both Hubris-Nemesis and the argument from futility, originate in observed facts about events?

And his taxonomy is too comprehensive to sustain the charge of overuse. Throw out perversity, futility, and jeopardy, and what’s left? A reform’s ends are always noble, in the eyes of the reformers. Would Hirschman prefer that reactionaries argue against liberty, democracy, a minimal living for the poor, or clean air? When Charles Fourier tells us that socialism will raise the human average to the level of a Goethe or an Aristotle, should we reply that we prefer the human average as it is? Hirschman professes disappointment in the reactionaries: “Instead of the rich historical argumentation to which I was looking forward, the purveyors of the jeopardy claim, from Robert Lowe to Samuel Huntington, have often been satisfied with simple affirmations of the ceci-tuera-cela [this will kill that] type.” The arguments in which, by implication, he thinks reactionaries ought to engage would really let him down.

Hirschman is much given to ironizing about the reactionary propensity to ironize. He surely appreciates the irony that his likely audience, “progressives,” will find nothing but confirmation for its beliefs. Few “reactionaries,” who could profit from the book, will ever read it.

Post scripta: It does not bear directly on Hirschman’s thesis, but the casual dishonesty of some of the footnotes is shocking in a scholar of his reputation. He writes of Gustave Le Bon, the author of The Crowd: “His basic principle being that the crowd is always benighted, he makes it apply with remarkable consistency, regardless of the constituents of the crowd and of their characteristics as individuals: ‘the vote of 40 academicians is no better than that of 40 water carriers’ he wrote, thereby managing to insult in passing the French academy with its forty members, an elite body from which he resentfully felt himself excluded.”

There is a footnote after “excluded,” which simply refers to the passage from The Crowd that he quotes, supplying no evidence for Le Bon’s alleged resentment. Hirschman must know that the note belongs directly after the quoted passage. By placing it where he does he bolsters, with an irrelevant citation, an unsupported slur.

Here is Hirschman later in the same chapter, on the 1834 Poor Law Amendment: “…the new arrangements were meant to deter the poor from resorting to public assistance and to stigmatize those who did by ‘imprisoning [them] in workhouses, compelling them to wear special garb, separating them from their families, cutting them off from communication with the poor outside, and, when they died, permitting their bodies to be disposed of for dissection.’”

Hirschman intends the reader to take the quotation at face value, as a factual description of the effect of the Amendment. But the footnote, at the end of the passage, is to Gertrude Himmelfarb’s classic The Idea of Poverty: England in the Early Industrial Age. The note says, accurately, that Himmelfarb is summarizing William Cobbett. The note does not say that Cobbett was one of the most vigorous contemporary opponents of the Amendment; neither does it say that Himmelfarb spends her next five pages qualifying and disputing him. The very page Hirschman quotes has a note of its own: “Cobbett was especially outraged by the practice of dissection, which he took to be the ultimate degradation and desecration caused by the New Poor Law. This was not, of course, part of the law, and it is not clear how common it was for workhouses to dispose of bodies for this purpose. But it was widely believed to be the case, partly because of Cobbett’s repeated charges to this effect.”

I hope Hirschman footnotes his works in economics, the ones that merit the Nobel Prize, more correctly.

Aaron Haspel | Posted December 23, 2006 @ 2:45 PM | Philosophy