Mar 12, 2004

All rock critics like Elvis Costello because all rock critics look like Elvis Costello.
–David Lee Roth

Were you a grade-school liberal like me? Anyone who isn’t a socialist at 10 has no heart; anyone who still is at 20 has no brains. I grew up in New York’s legendarily Republican Dutchess County, of which Gore Vidal remarked, after a losing run for Congress, “Every four years the natives crawl out of their holes and vote for William McKinley.” Maybe so; what they don’t crawl out of their holes to do is vote for Gore Vidal. Dutchess was FDR’s home county, and he never came close to carrying it in four tries. A straw poll of my 6th grade class revealed that I was the only kid who supported McGovern. What I lacked in numbers I made up in energy, plastering McGovern posters all over the walls of the elementary school. My Nixon hatred confirmed, by junior high I knew all the Watergate players, not just the big boys like Haldeman and Ehrlichman but the whole supporting cast — McCord, Segretti, Egil Krogh, right down to Frank Wills, the security guard at the Watergate complex who blew the whole thing open. My chess club adjourned early one sultry night in August 1974 to tune in Nixon’s resignation speech, which I watched with undisguised, not to say lip-smacking, relish.

I understood no more of politics than my Nixonite classmates did. I hated Nixon because my parents hated Nixon and I was too young to have learned to hate my parents; I liked my parents. But I was plenty old enough to hate my classmates, and took a none-too-secret pleasure in the fact that my politics differed from theirs. My politics were nothing more than a way to be superior.

In high school I refused to listen to Led Zeppelin and Pink Floyd: that shit was for the heads who wore cutoff jean jackets and smoked in the parking lot. I went in instead for Devo, Talking Heads, Sex Pistols, Clash, a few deservedly forgotten groups like the Fabulous Poodles (“Mirror Star” anyone?), and of course, as Professor Lee Roth would have predicted, Elvis Costello. Later on, when my ex-stoner buddies sat me down with the headphones and forced me to listen carefully to Zep and Floyd, I was astonished to discover that it was good, really good, and that my own tastes at the time had held up spottily by comparison. The jean jacket boys were right, and I was wrong. It bothered me, as it would bother anyone. Only after several years of conscientious deprogramming could I listen to these bands without prejudice.

When I started to read poetry I stayed away from Keats and Shelley and Christina Rossetti: that shit was for the girls who liked rainbows and ponies, not that I had anything against rainbows or ponies, just the girls who liked them, who wouldn’t go out with me anyway. Even now I can’t read any Keats besides the Grecian Urn, am notoriously unfair to Shelley, and can admire one or two poems by Rossetti only from a discreet distance.

I have spilled my share of pixels here defending objective values in art. Some art is good, some bad, and confusing them is like thinking that the earth is flat or that there’s a fortune to be made in buying real estate with no money down. I am very far from recanting, but I have nagging doubts. Elsewhere, discussing public and private reading, I instanced someone whose favorite song is “Desperado” because it happened to be playing when he kissed a girl at the junior high school dance. The example is tendentious; in truth most “private readings” are far more subtle and insidious. You admire someone, and he plays you music, and shows you pictures, and lends you books. You admire the exhibits, but to what extent can that be disentangled from your admiration of the exhibitor, if at all? I have a taste for poems about the relationship between the abstract and the particular; on what grounds can I claim it is any more universal or important a theme than the tribulations of love or the inevitability of the grave?

I exhibit certain poems here, and convince certain readers who might not see them otherwise that they are good. But you will never share my tastes exactly unless you’re exactly like me, and God forbid. Yvor Winters, as any steady reader here knows, is my favorite critic. Do I like him on the merits, such as they are, or do I like him because, of all poetry critics, he’s most like me? David Lee Roth might know. I don’t.

(Update: Rick Coencas comments. George Wallace points out that it was Egil Krogh, not Emil as I originally had it; I would have known that in 8th grade. Eddie Thomas has some especially interesting remarks. Eloise of Spit Bull comments. Jeff Ward comments.)

Feb 25, 2004

Eddie Thomas, a better philosopher than he is a statistician, poses the following problem. He is interviewing five candidates for two jobs. Each candidate’s chance of receiving a job offer, a priori, is 40%. After interviewing four candidates Eddie wants to offer a job to the best of the four, to protect against that candidate’s taking another job elsewhere. He is puzzled because it now looks as if the final candidate has a 25% chance of a job offer, being one of the four candidates still in the running for the one remaining job, while his chances should be unaffected, remaining at 40%.

This is a type of restricted choice problem. The classic illustration of true restricted choice is the old game show Let’s Make a Deal. Monty Hall shows the player three doors, behind one of which is the grand prize, a Hawaiian vacation or a brand new Cadillac Eldorado. The player chooses one door. Monty then reveals another, behind which the prize is not hidden, and asks the player if he wants to switch. The player should always switch. His chance of choosing the grand prize in the first place was 1 in 3. If he switches, it is 2 in 3, because Monty’s choice of which door to open has been restricted by the choice the player already made. Many people don’t believe this even after it has been explained to them, but it’s true, and can be verified easily by experiment if you doubt it.
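For the doubters, the experiment is easy to run. Here is a minimal simulation sketch in Python (my own illustration, not anything from the show; the 100,000-trial count is arbitrary) showing the stay-versus-switch split settling near 1/3 and 2/3:

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """Play one round of the three-door game; return True if the player wins the prize."""
    doors = [0, 1, 2]
    prize = random.choice(doors)
    pick = random.choice(doors)
    # Monty opens a door that is neither the player's pick nor the prize.
    opened = random.choice([d for d in doors if d != pick and d != prize])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

trials = 100_000
stay = sum(monty_hall_trial(switch=False) for _ in range(trials)) / trials
swap = sum(monty_hall_trial(switch=True) for _ in range(trials)) / trials
print(f"stay: {stay:.3f}  switch: {swap:.3f}")  # roughly 0.333 vs. 0.667
```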

Here, on the other hand, because of restricted choice, the probabilities only appear to change. To receive a job offer you must be one of the best two of five candidates. Consider it from the point of view of the first four candidates. Each one has a 25% chance of receiving the offer made after four interviews, and a 15% chance (.75 × .20: the chance of missing that first offer times the chance of then getting the second) of receiving the offer made after five interviews, for a grand total of 40%, as you would expect. So one way to look at it is that the remaining candidate must also have a probability of receiving an offer of 40% for the probabilities to add up to 200% (two job offers).

Since this will satisfy no one, least of all Eddie, I’ll try another approach. To receive an offer the fifth candidate has to be better than the three remaining candidates. However, his three remaining competitors are not randomly chosen; they have already failed to finish first among the first four. Choice has been restricted. Each one of the three, not having received the first offer, has a far less than 40% chance remaining of securing an offer. In fact, they now have half that chance; for they can only finish second, at best, among the five, while the fifth candidate can still finish either second or first. Therefore their remaining chance is half of their original 40%, since one of the two offers has been closed to them, or 20%, rather than 25%. The fifth candidate still has a 40% chance.
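The same brute-force check works here. A rough simulation sketch (again mine, in Python; candidate quality is modeled as a uniformly random ranking, which the problem itself does not specify) lands on the 40% and 20% figures:

```python
import random

trials = 100_000
fifth_offer = 0   # times the fifth candidate lands in the top two of five
loser_offer = 0   # times a specific passed-over candidate from the first four does

for _ in range(trials):
    ranks = random.sample(range(5), 5)      # ranks[i] = quality of candidate i (0 = best)
    best_of_first_four = min(range(4), key=lambda i: ranks[i])
    top_two = sorted(range(5), key=lambda i: ranks[i])[:2]
    if 4 in top_two:
        fifth_offer += 1
    # track one of the three first-four candidates who missed the first offer
    passed_over = next(i for i in range(4) if i != best_of_first_four)
    if passed_over in top_two:
        loser_offer += 1

print(f"fifth candidate: {fifth_offer / trials:.3f}")        # ~0.40
print(f"passed-over candidate: {loser_offer / trials:.3f}")  # ~0.20
```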

God of the Machine: all probability, all the time!

Feb 21, 2004

Five years ago, after the 1999 season, a fellow fantasy league baseball owner and I fell into an argument about Roger Clemens. Clemens was 37 years old. In 1998 he had a brilliant season with Toronto, winning the pitching triple crown — ERA, wins, and strikeouts — and his fifth Cy Young Award. In 1999, his first year with the Yankees, he slipped considerably, finishing 14-10 with an ERA higher than league average for the only time since his rookie season. His walks and hits were up, his strikeouts were down, and my friend was sure he was washed up. He argued that Clemens had thrown a tremendous number of innings, that old pitchers rarely rebound from a bad season, and that loss of control, in particular, is a sign of decline. I argued that Clemens is a classic power pitcher, a type that tends to hold up very well, that his strikeout ratio was still very high, that his walks weren’t up all that much, and that his diminished effectiveness was largely traceable to giving up more hits, which is mostly luck.

Of course Clemens rebounded vigorously in 2000 and won yet another Cy Young in 2001. He turned out not to be finished by a long shot, and still isn’t. Does this mean I won the argument? It does not. Had Clemens hurt his arm in 2000 and retired, would my friend have won the argument? He would not.

Chamberlain wasn’t wrong about “peace in our time” in 1938 because the history books tell us Hitler overran Europe anyway. He was wrong because his judgment of Hitler’s character, based on the available information in 1938, was foolish; because, to put it in probabilistic terms, he assigned a high probability to an event — Hitler settling for Czechoslovakia — that was in reality close to an engineering zero. He would still have been wrong if Hitler had decided to postpone the war for several years or not to fight it at all.

“Time will tell who’s right” is a staple of the barroom pedant. Of course it will do no such thing: time is deaf, blind, and especially, mute. Yet it is given voice on blogs all the time; here’s Richard Bennett in Radley Balko’s comments section: “Regarding the Iraq War, your position was what it was and history will be the judge.” It’s not an especially egregious instance, just one I happened to notice.

Now you can take this too far. If your best-laid predictions consistently fail to materialize, perhaps your analyses are not so shrewd as you think they are. You might just be missing something. Or not. But this should be an opportunity for reflection, not for keeping score.

We fumble in the twilight, arguing about an uncertain future with incomplete knowledge. Arguments over the future are simply differences over what Bayesian probability to assign the event. There is a respectable opposing school, frequentism, which holds that Bayesian probability does not exist, and that it makes no sense to speak of probabilities of unique events; but it has lost ground steadily for the last fifty years, and if it is right then most of us spend a great deal of time talking about nothing at all. Like Lord Keynes, one of the earliest of the Bayesian theorists, we are all Bayesians now.
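For readers who want the mechanics rather than the philosophy, the whole apparatus is one line of arithmetic. A toy sketch of a Bayesian update, with every number invented purely for illustration:

```python
# Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)
# All numbers below are invented for illustration only.
prior = 0.30                 # initial degree of belief in some forecast H
p_evidence_if_true = 0.80    # chance of seeing the new evidence if H is true
p_evidence_if_false = 0.20   # chance of seeing it if H is false

p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
posterior = p_evidence_if_true * prior / p_evidence
print(f"posterior: {posterior:.2f}")  # 0.63 -- belief revised, not vindicated
```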

This, for argument, is good news and bad news. The good news is that history won’t prove your opponent out. The bad news is that it won’t prove you out either. You thrash your differences out now or not at all. Then how do you know who won the argument? You don’t. Argument scores like gymnastics or diving, not football. It will never, for this reason, be a very popular American indoor sport.

Feb 8, 2004

Put a libertarian and non-libertarian in a room and you get an argument, always the same argument. Yesterday at my place it was over whether people who got cancer from industrial emissions would be able to collect fifty years hence. A few weeks back at David Sucher’s it was over whether houses would collapse in earthquakes without building codes. The other day at Radley Balko’s it was over whether without animal cruelty laws the evil neighbors would buy up puppies and kittens and torture them without fear of reprisal. Only the details vary.

Thomas Sowell wrote a book on this subject, A Conflict of Visions, in which he claimed that the fundamental divide is between those who believe in the perfectibility of human nature and those who do not. In fact it is less momentous: it’s between those who believe in the perfectibility of the state and those who do not. Some people think that the state can mete out perfect justice, some don’t. Libertarianism fails, in the eyes of the first group, if any evil goes unpunished. Now no one, not the most ardent statist, believes that the state can mete out justice in every case. But a surprisingly large number of people, a substantial majority, believes that, for every case, a theoretical mechanism for redress or punishment ought to exist. They readily concede that in practice eggs must be broken to make omelets. Mistakes are made. But if the mechanism exists, conscience is assuaged, and that is enough.

Good law sometimes produces unjust outcomes; a famous maxim expresses this, conversely, as “hard cases make bad law.” In Waube v. Warrington, a classic torts case from 1935, a woman watched a negligent driver strike and kill her daughter. The woman died a month later, allegedly of shock, and her husband sued. Maybe it was true, maybe not, but the Wisconsin Supreme Court never reached the question, dismissing the suit on the grounds that damages for mental distress require physical contact, and there was none in this case. If the woman really did die of shock, then the outcome was unjust. Yet the law was sound, based on a bright-line, predictable, common-sense standard. Like any such standard, it does not fit every case. Too bad. Law is collective, justice individual. You can swallow this or you can’t.

Buildings collapse sometimes in earthquakes and kill a lot of people. This happens more often in poor countries than rich ones because taking precautions against a rare catastrophe is a luxury, and rich people can afford more luxuries than poor people. It will continue to happen more often in poor countries, no matter how stringent their building codes, because builders will circumvent regulations that they cannot afford. If the codes are rigidly enforced then fewer houses will be built, and people who formerly lived in shoddy houses will do without instead. You can swallow this or you can’t.

The trouble with animal cruelty laws is that animals are property, and such laws infringe property rights. You can tack on riders like “needless” all you like, but infringement is infringement, and when the only question is how much, the laws become a way to harass people in the animal business. (So far the animal rightists have mostly trained their fire on unpopular targets like foie-gras producers and circus trainers; scientists, assuredly, are next.) Of course without the laws Cruella de Vil can sew herself a nice coat out of Dalmatian puppy hides and there isn’t a damn thing the cops can do about it. You can swallow this or you can’t.

If you’re on Team Perfect and I’m on Team Good Enough, we can argue to eternity and never get anywhere. What say we save our breath and stick to poetry, and stuff like that?

(Update: Spelling and capitalization of “Dalmatian” corrected at the behest of Greg Hlatky, who ought to know. David Sucher professes bemusement. Forager notes that the comments go a long way to prove the thesis.)

Jan 27, 2004

In America Chief Justice Marshall, following Blackstone and Coke, first breathed life into the corporation in 1819, writing in the Dartmouth College case, which is widely quoted in judicial opinions to this day: “a corporation is an artificial being, invisible, intangible, and existing only in contemplation of law.” Marshall’s dictum appeals to leftist critics for two reasons. One is practical: if the law, or the State, creates the corporation, then it can also specify the conditions for its existence, regulating and limiting as it sees fit. Marshall actually held to the contrary in Dartmouth College, but the logic is ineluctable. Live by the sword, die by the sword. The other is mystical: it enables them to discuss the corporation as if it had a mind and heart of its own, independent, somehow, of the people in its employ. Invisible, intangible entities are more convenient targets for invective than human beings. Corporation critics, amusingly, often complain of the fictional legal personhood of corporations — cemented by the 1886 Santa Clara case — and simultaneously write of them as if they were animate.

Too many sympathizers with corporations too hastily adopt Marshall’s position. Eugene Volokh, for instance, remarks of the recent corporate free speech cases:

The same issue comes up as to corporations and unions, which get significant government benefits. When may the government say “In exchange for the benefits of the corporate form, or for the special legal powers that unions have, we will insist that you not spend money on election-related speech”? (Most corporations are state-chartered, so that benefit is actually provided by the state government, not the federal government; but I don’t think this matters, given the modern Congressional authority over interstate commerce, which would give Congress the power to preempt or modify state-granted charters.) That’s a really tough question — but the First Amendment text doesn’t answer this question any more than it answers the question “When may the government say ‘In exchange for a government paycheck, we will insist that you not reveal the tax return data that you’ll be asked to process’?”

What are these “significant government benefits” that Professor Volokh is talking about?

Classical corporate theory posits three answers: entity, perpetuity, and liability, and the first two aren’t very serious. Entity is the right of a corporation to give itself a single name in legal documents instead of listing all its shareholders. It is neither a privilege — since it’s as convenient for parties that want to sue the corporation as it is for the corporation itself — nor unique to corporations. Partnerships can easily declare themselves entities, and so can married couples. It’s a naming convention. Surely the theorists can do better.

Yes, corporations theoretically live forever — like vampires! — which means merely that they never have to renew their articles of incorporation. As any contract expert will tell you, it’s easy to make a partnership, club, or any voluntary association immortal in the same way, by changing the by-laws. Immortality also doesn’t avail you much if you go out of business, as most corporations do within a few years.

Limited tort liability is the heart of the matter. Corporations are liable for torts only to the extent of their capital: only the shareholders are liable, and only to the extent of their investment. Since officers have no special liability, unlike general partners in partnerships, this leads to the abuse known as the close or one-man corporation. I can incorporate my business, running it effectively as a sole proprietorship, and shield my assets from liability by deliberately undercapitalizing the corporate shell. If I commit a tort, the aggrieved party will find nothing to collect.

Corporation critics often propose to remedy this by removing the shareholder’s limited liability privilege, which misdiagnoses the problem. The beauty of corporate structure is that it permits people to invest in a business that they have no interest in managing. Nothing nefarious or undemocratic about that; if shareholders wanted to run the business, they’d get a job there instead of buying stock. But if Grandma buys $1000 worth of IBM, why should she be on the hook for her house, when she has no say in IBM’s daily operations? The real answer lies in vicarious liability, which descends in common law from respondeat superior, the doctrine that the master is responsible for the actions of the servant. Them as does (or hire them as does), pays. Unlimited tort liability for the people who actually direct the corporation; liability only to the extent of her investment for Grandma.

If we viewed corporations as what they are, voluntary associations, the speech question would collapse nicely. Corporate free speech would become, instead of a separate, messy legal question, a matter of the free speech of the people who run the corporation. The Nike case, for instance, would be regarded not as a matter of Nike’s free speech, but of Phil Knight’s. And one less invisible, intangible being would haunt the earth.

I owe a lot of this argument to Robert Hessen’s In Defense of the Corporation, the best, and a mercifully brief, book on the subject.

(Update: Alan Sullivan comments.)

Jan 9, 2004

David Fiore is back for a third helping (or fourth or fifth, I’ve lost count by now). His erudite reply to my Professor X piece investigates various ancillary points of Emerson scholarship, like his relationship to Coleridge and the important question of whether the notorious transparent eyeball can see itself. David is terrifyingly well-informed on these matters, which fortunately need not concern us here. The question was whether Emerson advocates surrender to emotion. David, to his credit, does not attempt to deny this, and really it would be impossible to deny; every second page of Emerson contains passages to this effect. He takes a different approach:

Having read some of Winters, I see now, Aaron, why you place so much emphasis upon the logical consequences of philosophical positions. But you cannot deal with Emerson (or me!) this way. For Winters, Crane is a superior Emersonian, because he is “not content to write in a muddling manner about the Way; he is concerned primarily with the End.” But this is precisely what makes him such a failure as an Emersonian–and a sane human being. Life is a problem. People, like works of art, are alive so long as they maintain their ideas in tension. To long for the resolution of these tensions, as you do Aaron, is to long for catastrophe. [Italics his.]

Since David has many distinguished predecessors in this view, like “Negative Capability” Keats, who can be excused on grounds of extreme youth, and F. Scott “Opposed Ideas in the Mind at the Same Time” Fitzgerald, I may be forgiven for insisting on some obvious points. Life is indeed a problem, many problems, which one does one’s best to solve, through exercise of the rational faculty. Man acts and chooses: each choice excludes many others. Some choices are wise, others foolish; some conduce to his well-being, others to his destruction. One can no more hold an idea and its opposite at the same time — what, in this case, could “hold” possibly mean? — than one can act on an idea and its opposite at the same time. In the face of these difficulties, Emerson recommends abdication.

Emerson sprang from the dominant 19th-century intellectual tradition in America, New England Nonconformist. It is best represented by the Holmes family (Oliver Wendell Sr. and Jr.) and the James family (Henry Sr., William, Henry, and Alice). Its products include Emily Dickinson, Nathaniel Hawthorne, and Herman Melville. Today New England Nonconformism is extinct; Katharine Hepburn (b. 1907) was perhaps its last degenerate scion.

New England Nonconformists, with very few exceptions, were hobbyists. They liked to toy with ideas, often radical ideas and often very brilliantly. They filled the ranks of the Abolitionists and suffragettes; but they tended not to reason to these positions but intuit them. Their motto could have been Holmes Jr.’s frequent remark that he hated facts, that the chief end of man was to form general propositions, and that no general proposition was worth a damn. Holmes père et fils, Emerson, and William James were all radical skeptics philosophically who conducted themselves personally with exemplary rectitude. What constrained them was a deep prudence and moral sense, informed by the Calvinism of Jonathan Edwards, the doctrine that although good works and success on earth technically avail one nought, as all seats in the Kingdom of Heaven are reserved, they yet demonstrate one’s fitness for Election. Yvor Winters calls this a “New England emotional coloration,” accurately. To put it flippantly, the vote for women was all very well, but “never dip into capital” was a real rule to live by. (On the other hand, in the dominant 20th-century American intellectual tradition, the New York Jewish, ideas became the ticket to success.) Henry James’ American characters act not on ideas but on an inarticulable “moral sense.” This moral sense attenuated as its doctrinal background exerted less and less direct influence, until it finally vanished altogether.

This is why Emerson died rich, old, and in bed, and Hart Crane jumped off an ocean liner.

Jan 1, 2004

Sixty years ago Yvor Winters wrote a moving essay on Hart Crane called “What Are We To Do With Professor X?” Crane and Winters were correspondents and friends for several years; they broke over Winters’ largely hostile review of “The Bridge” in 1930; Crane jumped off an ocean liner two years later. Winters charges Crane’s suicide to his belief in Emersonian advocacy of instinct over intellect and change for its own sake. (To anyone who doubts that this is in fact Emerson’s philosophy I suggest reading “The Over-Soul,” “Self-Reliance,” “Art,” or “Spiritual Laws” straight through, instead of the little snippets from them that are so frequently quoted.) He contrasts Crane, “a saint of the wrong religion,” who took those ideas with literally deadly seriousness, with genteel Professor X, who holds the same ideas but would not dream of actually practicing them:

Professor X can be met four or five times on the faculty of nearly every university in the country: I have lost count of the avatars in which I have met him. He usually teaches American literature or American history, but he may teach something else. And he admires Emerson and Whitman.

He says that Emerson in any event did not go mad and kill himself; the implication is that Emerson’s doctrines do not lead to madness and suicide. But in making this objection, he neglects to restate and defend Emerson’s doctrines as such, and he neglects to consider the historical forces which restrained Emerson and which had lost most of their power of restraint in Crane’s time and part of the country. [Crane was born in Cleveland in 1899.] … The Emersonian doctrine, which is merely the romantic doctrine with a New England emotional coloration, should naturally result in madness if one really lived it; it should result in literary confusion if one really wrote it. Crane accepted it; he lived it; he wrote it; and we have seen what he was and what he wrote.

Professor X says, or since he is a gentleman and a scholar, he implies, that Crane was merely a fool, that he ought to have known better. But the fact of the matter is, that Crane was not a fool. I knew Crane, as I know Professor X, and I am reasonably certain that Crane was incomparably the more intelligent man. As to Crane’s ideas, they were merely those of Professor X, neither better nor worse; and for the rest, he was able to write great poetry. In spite of popular or even academic prejudices to the contrary, it takes a very highly developed intelligence to write great poetry, even a little of it. So far as I am concerned, I would gladly emulate Odysseus, if I could, and go down to the shadows for another hour’s conversation with Crane on the subject of poetry; whereas, politeness permitting, I seldom go out of my way to discuss poetry with Professor X.

In the role of Professor X today is David Fiore, who is pleased that PETA exists. I have made my objections to the concept of animal rights elsewhere and will not rehearse them here; they are beside my point. Now PETA has been excoriated, properly and often, for its advocacy and funding of violence and terrorism. It is less often noted that these follow necessarily from its position. If you believe, like Ingrid Newkirk, that a rat is a pig is a dog is a boy, then fire-bombing a laboratory is a small price to pay to stop what, by your lights, is mass murder. I can respect this view even as I wish to jail anyone who tries to put it into practice.

David begins courageously enough: “I’ve made a radical choice. So have you.” But he fails to comprehend just how radical the choice is: “And certainly, I don’t condone any acts of violence Animal Rights people might commit. That’s just insanity, you don’t make change by terrorizing the majority. Change happens when the majority assents to it… Moreover, I don’t have the slightest desire to “convert” anyone, I like just about everybody, and I’m not suited to delivering harangues…” David has, and can have, no moral objection to violence on behalf of the bunny rabbits; it is a mere question of tactics: “you don’t make change by terrorizing the majority.” Winters writes that Professor X “once reproved me for what he considered my contentiousness by telling me that he himself had yet to see the book that he would be willing to quarrel over.” And so David, who likes just about everybody, prefers that PETA deliver the harangues on his behalf.

Sometimes hypocrisy is, as La Rochefoucauld says, the tribute vice pays to virtue; sometimes, as in this case, the tribute fanaticism pays to sanity. A significant minority of Americans believes that abortion is murder. Yet in their next breath they will condemn clinic bombers — because they are hypocrites, fortunately. In a society of mass murderers, armed resistance becomes a perfectly logical, even admirable, response.

The most shocking thing about 9/11 wasn’t the deaths, or the image of the World Trade Towers collapsing. It was the realization that some people are willing to die for their ideas, foolish as they are, while most of us treat ideas like shiny playthings that you can put back in the toy chest when you’re finished with them. I have friends who say the trouble nowadays is that no one takes ideas seriously. They should thank their lucky stars. When nearly everyone thinks as badly as possible, Professor X may be the best we can hope for.

(Update: David Fiore replies on his blog, and in the comments.)

Nov 9, 2003

How full of ourselves we bloggers grow:

Some might conclude from the above that, because I reject the solutions that [Steven] Den Beste and [Victor Davis] Hanson offer, that I’m implying that something more dire be done to “solve” this problem. I am not. Frankly, personally, I am increasingly resigned to the fact that these problems are without solution, to the point that I’m that close to simply giving up, mothballing this site, and accepting that yes, we’re watching Western Civilization self-destruct before our very eyes and there is nothing to be done about it… I’ll probably end my life in a Death Camp of Tolerance for expressing “divisive” views and making “insensitive” remarks.

Thank God for stalwart conservative bloggers! You might think that manning the barricades against the imminent fall of Western Civilization is a lonely job. You would be wrong; the barricades are crowded with Chicken Littles of all parties, although the smoke from all the shooting prevents them from seeing each other. For some of these brave soldiers Western Civ has already fallen and its revival is the consummation devoutly to be wished. The early Objectivists used to say of Atlas Shrugged, “if this book sells 50,000 copies, the culture is cooked.” Several million copies later, well, here we are.

The sky is always falling. The “new philosophy” was putting “all in doubt” in the 17th century (Donne); “Chaos and dread Night” were descending in the 18th (Pope); “the demons [of unreason] were let loose upon the land” in the 19th (Robert Bridges). Today’s featured blogger, one Porphyrogenitus, has found that it is impossible to persuade people with reason who deride reason itself. ‘Twas ever thus, dude. Derrida and Foucault are pretty small beer compared to Hume’s attack on induction, or Bishop Berkeley’s on the evidence of the senses.

Too many bloggers confuse civilization, or culture, with Zeitgeist, which is white noise. Culture does not consist, and never did, of what is taught in college, or what appears on television or in the newspapers. It is an underground stream, the product of a few dozen of the most intelligent people of each generation, and it always appears sounder retrospectively because time takes out the trash. It is opaque not only to statistical analysis but to all but the most acute critics of the time: there is too much to sort through, and it is too easy to read in the light of the pressing issues of the day. Edmund Wilson ventured in 1935 to guess which contemporary poets would survive, a fool’s errand, and came up with Edna St. Vincent Millay (OK, he was in love with her) and several other people you haven’t heard of, for excellent reason. He found Frost dull and ignored Crane, Stevens, and Williams altogether. The point here isn’t that Wilson was a dummy — far from it — but that the state of the real culture, except from a very long vantage point, is extremely difficult to discern.

Is Western Civilization on the verge of destruction? I doubt it, but I don’t know, and neither do you. Ask me in a couple hundred years.

(Update: Marvin Long comments. Julie Neidlinger comments. l8r comments.)

Oct 17, 2003

Congratulations, to begin with, to all Red Sox and Cubs fans, who burnished their reputations as lovable losers, with their teams both snatching defeat from the jaws of victory in dramatic fashion. There is a lesson for them in the plight of the Rangers fan. For decades New York Rangers fans had to endure the mocking chants of 1940! 1940! — the last time they won the Stanley Cup — until 1994, when they finally won it again, only to relapse almost immediately into the mediocrity in which they are still mired today. Now the Rangers fan has no mocking chants to endure, because no one cares; the Rangers have just become another average team that hasn’t won for a while. If you can’t always win, next best is to always lose, which is a distinction. I suspect that many Red Sox and Cubs fans secretly root for their teams to lose, or better, almost win.

Last night’s Yankees-Sox game was certainly thrilling (note to Floyd McWilliams: I’m not listening), although I took advantage of the break between the top and the bottom of the 11th to take out the trash and consequently missed Aaron Boone’s game-winning home run. But at various points Fox showed two players and several fans with their hands clasped together, as if in supplication. Yes, the big bearded man in the sky apparently concerns himself with whether the Yankees rally against Pedro in the bottom of the 8th. Aristotle had the first word on this subject:

[F]or while thought is held to be the most divine of things observed by us, the question how it must be situated in order to have [divine] character involves difficulties. For if it thinks of nothing, what is there here of dignity? It is just like one who sleeps…what does it think of? Either of itself or of something else; and if of something else, either of the same thing always or something different…Evidently, then, it thinks of that which is most divine and precious, and it does not change; for change would be change for the worse, and this would be already a movement…Therefore it must be of itself that the divine thought thinks (since it is the most excellent of things), and its thinking is a thinking on thinking.

Aristotle, Platonizing, makes God sound rather like Wittgenstein, but you catch his drift. Spinoza is blunter:

For the reason and will which constitute God’s essence must differ by the breadth of all heaven from our reason and will and have nothing in common with them except the name; as little, in fact, as the dog-star has in common with the dog, the barking animal.

And the last from a god, who ought to know, Dr. Manhattan of Watchmen, chastising Veidt for trying to kill him (note to Jim Henley: I am too a comics blogger!):

I’ve walked across the sun. I’ve seen events so tiny and so fast they hardly can be said to have occurred at all. But you…you are a man. And this world’s smartest man means no more to me than does its smartest termite.

Surely God, if He can rouse Himself to intervene in human affairs at all, will find beneath His dignity anything less than the World Series.

Oct 4, 2003

(I had a pleasant holiday from you, dear readers, and, I trust, you from me as well. Now let’s get down to it, boppers.)

Toxicologists say that the dose is the poison, and Americans could save themselves millions of dollars if they only understood what that means.

Everything on earth, from arsenic to mother’s milk, is toxic if ingested in sufficient quantity. If we graph the lifetime dose on the x-axis and the chance of resulting loathsomeness on the y (what’s with me and the graphs lately?), we wind up with the risk curve. For your quotidian poisons like cigarettes, red meat, and smog, the risk begins at zero and stays very close to it until a certain dose is reached, at which point the curve inflects and the risk begins to increase quite radically. Not all risk curves have this shape, of course. For highly toxic substances like sarin or finely ground anthrax it inverts. The risk escalates very rapidly and then flattens at the top of the graph after a certain exposure, at which point you die.

The curve, however, is always a curve, never a straight line. Have you heard that every cigarette you smoke cuts five minutes, or eight, or ten, off your life? This is the linear fallacy in full flower. Smokers’ diseases like emphysema and lung cancer concentrate overwhelmingly in the heaviest, longest-term smokers. People who smoke for a few years have scarcely higher mortality than people who never smoke at all. (You kids bear this in mind when you’re thinking of lighting up.) The first cigarette you smoke probably does you no harm at all. The 150,000th — pack a day for twenty years — may, like W.C. Fields’ Fatal Glass of Beer, be the one that does you in. The dose is the poison.
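To see how far apart the curve and the straight line sit at low doses, here is a toy sketch in Python (an S-shaped curve with parameters I invented for illustration, not real epidemiology):

```python
import math

def curved_risk(dose: float, midpoint: float = 0.7, steepness: float = 10.0) -> float:
    """Illustrative S-shaped risk curve: near zero at low doses, rising sharply past a threshold."""
    return 1.0 / (1.0 + math.exp(-steepness * (dose - midpoint)))

def linear_risk(dose: float) -> float:
    """The 'every cigarette costs you five minutes' model: risk proportional to dose."""
    return curved_risk(1.0) * dose  # calibrated to agree with the curve at the full lifetime dose

# dose is scaled so that 1.0 = a pack-a-day-for-decades lifetime exposure
for dose in (0.1, 0.3, 0.5, 0.7, 0.9, 1.0):
    print(f"dose {dose:.1f}: curve {curved_risk(dose):.3f}   linear {linear_risk(dose):.3f}")
```

At a tenth of the lifetime dose the linear model charges you nearly a tenth of the full risk; the curve charges you almost nothing.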

It gets worse. An intense dose over a short period is generally far more toxic than the same dose spread out over a lifetime. Risk varies radically not only with the lifetime dose, but also with its rate, which renders extrapolation effectively impossible. Animal tests classically deal with this fact by ignoring it. Suppose you want to determine the long-term risks of swilling pistachio nuts and maraschino cherries, which contain Red Dye No. 3. Time’s a-wasting, and you don’t have 50 years to conduct your research. Instead you stuff a bunch of gerbils with a whole lot of Red Dye No. 3 over a few weeks or months and see what happens. If a few gerbils get cancer, you extrapolate, bury your reliance on the linear model in a couple of footnotes, and voilà! a new carcinogen. Politicians and journalists thunder against the unacceptable risks to maraschino cherry addicts, the Delaney Clause is invoked, Red Dye No. 3 is banned, a new, slightly less attractive red dye replaces it, and the cycle begins anew.

Good-sized industries have sprung up to exploit the linear fallacy. The EPA tells gullible homeowners to shell out a couple grand to a radon-removal outfit if the radon level in their indoor air is more than 4 pCi/L (picocuries per liter). Turns out that exposure at that level for 20 years increases the lifetime risk of cancer by less than 1%, unless you also smoke, which bumps it up to a ghastly 3% or so. Mind you, this is not increased mortality, but increased risk of cancer. Since the lifetime risk for cancer is in the 25% range, we’re discussing, in terms of overall mortality, something less than 0.3%. Save your money and try to stay out of automobiles instead.

The asbestos boys make the radon boys look positively public-spirited. Asbestos is dangerous if you spend your life working with it; non-smoking asbestos workers have cancer rates about five times the general non-smoking population. Asbestos is essentially harmless when it’s minding its own business insulating pipes. In 1985 the British epidemiologists Doll and Peto estimated the annual lung cancer risk from such exposure to be 1 in 1,000,000; other reputable estimates are similar. Yet the Asbestos Hazard Emergency Response Act, passed in 1986, mandated asbestos removal for 45,000 public schools, many with airborne asbestos concentrations no higher than the outdoors. When you remove asbestos improperly you stir it up and increase the exposure, and since removing asbestos properly is extremely expensive, the incentive to do it improperly is immense. $100 billion or so later, overall asbestos risk is probably higher than it ever was. Lead paint, Alar, DDT: the song remains the same.

So there’s good news and bad news. The bad news is you’re going to die of something. The good news is that it almost surely won’t be an exotic environmental poison.