Dec 29 2003

So much to skip, so little time.

I begin with myself, having sucked a bit of late. This and this were too twee and precious for words. This wasn’t really very funny. This was half-right but embarrassingly wrong in several details, and God of the Machine is supposed to be in the details. This and this provoked squalls of irrelevant commentary. This was weird but informative. This was just weird.

The laziest organizing principle in prose is the list. (This post, for instance.) The cowardly lister postpones his imposition of a Few of My Favorite Things on the world until the end of the year, when everyone else is doing it and he has cover. The busy reader will naturally avoid such things. This goes double for that most elaborate of self-congratulation rituals, year-end awards. “Prizes,” said Ezra Pound, “are always a snare.” Besides, nobody ever nominates me for anything.

Suppose you edited a web magazine with open submissions, and you were obliged to publish whatever you received. You’d have The Carnival of the Vanities, now in its 66th tiresome edition, to which Instapundit still links dutifully every week (Glenn has a keyboard macro for “rich, bloggy goodness”). Which beats reading it, I can assure you. Good writers are often bad: bad writers are never, ever good. I confess that I often enjoy the summaries, in which the host of the week endeavors to say something kind about every submission. This testifies to my somewhat sadistic taste in humor.

In the Type 1 political blog post, the blogger cites an anecdotal news item that confirms his biases and crows that he was right and this proves it. Degree of difficulty: 0.0, since most of us obtain our news from like-minded sources. (Explaining away an item that conflicts with one’s biases, which would be far more interesting, is naturally far less common.) In the Type 2, the blogger scours the Internet for the weakest possible opponent of his views and demolishes him line by line. No poliblog is complete without a healthy dose of Type 1 and Type 2, and many poliblogs consist of nothing but. If you devoted the time you’ve spent reading Type 1 and Type 2 to a more constructive activity, like exercising your abs, you might have that eight-pack you’ve always wanted by now.

Tolkien loses me about when Betamillion is making his way through Gallimaufria to secure the Ring of Fire and win the hand of fair Neuralgiel, or something. The pros established an early lead for dullest Lord of the Rings commentary, with the antis now closing fast. I’ll give you the gist here, with a spoiler-laden review of the trilogy:

Good triumphs over evil.

Not having finished any of the books or seen any of the movies, I admit that’s a wild guess.

Dec 26 2003

Another bulletin from the Dept. of Almost Right: Larry Ribstein, blogging on the Forbes list of top ten business movies:

The Forbes story on the Ten Greatest Business Movies, and related stories on Forbes.com, says a lot about films’ attitude toward business. The top ten were: Citizen Kane, The Godfather: Part II, It’s a Wonderful Life, The Godfather, Network, The Insider, Glengarry Glen Ross, Wall Street, Tin Men, Modern Times… This film list provides new fodder for my theory. My thesis, again, is that, while films usually portray business in a bad light, they do not really say that business is bad. After all, the films most of us see are produced by big businesses. More precisely, films are made by people working in these businesses. Filmmakers see themselves as artists, the latest in a long line from cave painters through Michelangelo. Yet, unlike many artists, filmmakers’ art is so costly that films cannot get made without lots of money. Filmmakers must get this money from capitalists, who, in turn, must sell tickets. Because film artists resent their shackles, they often show struggling workers, greedy capitalists, and heroic artists. “Good” businesses are those where the artistic types have the upper hand, and bad businesses are those where the artists have lost. In other words, films see firms from the cramped perspective of the assembly line or the cubicle. From way out in Hollywood, firms often seem like beehives or rabbit warrens, unfit for human habitation.

Larry’s point needs to be sharpened up a bit. All things being equal, people prefer good merchandise to bad, and they make exceptionally fine discriminations. Gillette mightily outsells Schick because its razor blades are better, not a lot, just enough. There are a few exceptions to this rule, mostly in aesthetic products, notably Hollywood itself. Bad art makes more money than good art, in general because bad taste is more prevalent than good taste, and in the specific case of movies because the audience for them is overwhelmingly young, and the taste of the average adolescent is even worse than that of the average adult. These are depressing facts if you work in the taste business. “From way out in Hollywood” it is Hollywood that looks “unfit for human habitation.” A screenwriter might rashly conclude that schlock always trumps quality; and in fact, as a survey of Hollywood movies about business shows, he usually does.

The anti-business movies deal overwhelmingly with schlock purveyors: yellow journalists (Citizen Kane), swampland peddlers (Glengarry Glen Ross), penny stock hustlers (Boiler Room), shady aluminum siding salesmen (Tin Men), and out-and-out gangsters (The Godfather). It’s a Wonderful Life gestures half-heartedly toward the notion of quality as good business, as in the scene where Mr. Potter’s rental agent lectures him on how all the nice houses in Bailey Park are killing his real estate business. But mostly it’s more people vs. profits hoo-rah.

In a “pro-business” movie like Executive Suite, our hero, William Holden, is the research chief for the furniture company, and in his big speech, as he ascends to the chairmanship, he tells the board that the company will never sacrifice quality, profits be damned. That it might actually be more profitable to manufacture good furniture does not cross the screenwriter’s mind. (Holden figures prominently in several famous business movies, Network of course and also the most authentically pro-business movie out of Hollywood that I know, Sabrina, which is disguised as a love story. He was, perhaps coincidentally, Ronald Reagan’s best man.)

Or consider Tucker, a garish and tasteless but ostensibly pro-business movie. Jeff Bridges plays the real-life car designer Preston Tucker, who sets out to build a revolutionary automobile, and succeeds, only to be squelched by a conspiracy of the government with the Big Three. This happens to be pretty much true; but out of this pregnant material the director, Francis Ford Coppola, fashions only another morality tale of how, as Larry would say, the good company, in which the artist, Tucker, is in charge, goes down to defeat, or, as I would say, the evil capitalists foist off shoddy merchandise on an unsuspecting public. Hollywood doesn’t hate business. It just hates businesses that act all businesslike.

(Update: Larry Ribstein replies. Michael Williams comments.)

Dec 26 2003

A while ago Brian Micklethwait had a bit about discomfiture in art to which many bloggers linked approvingly:

As for the endlessly repeated claim that art is supposed to make you feel uncomfortable, I don’t buy that. And I don’t believe the people who say that they do buy it are being honest. I think that a picture which they have no problem with, but which they believe makes other people whom they disapprove of uncomfortable, makes them very comfortable indeed, and that that is the kind of discomfort (i.e. not discomfort at all, for them) which they like, and are referring to with all this discomfort propaganda. They no more like being genuinely discomforted by art than I do.

As a psychological observation this is acute. No one ever talks this way about art that discomfits him, personally. And good art is never described in these terms — only épater le bourgeois stuff, which of course discomfits no one, certainly not the people who describe it as discomfiting, and not the people it’s supposed to discomfit either. “Discomfort” is the last-ditch argument of bad artists or their flaks, like museum directors.

So I almost agree with Brian, except I’d lop off the first sentence. Have you ever talked at length with someone who was far more intelligent than you? Such a person seems armed with all of your thoughts and experiences and much more besides; he answers objections that you have formed fuzzily or not at all. You get the most out of it by shunting aside your own prejudices, as best you can, and following him as he elaborates on his, which are more interesting. Later on you go back and reintroduce yourself, as it were, to your original prejudices, and compare and contrast. The experience is, in a word, discomfiting, not because your interlocutor tries to shock you like a cheap artist, but because he says things that have not occurred to you, and novelty is always unsettling.

Great art is like that, except that its commerce is with a mind greater than any you know personally and on a subject on which it has meditated deeply and you may not have thought at all. Henry James goes so far as to say “it is a very obvious truth that the deepest quality of a work of art will always be the mind of the producer… No good novel ever proceeded from a superficial mind,” and he’s talking to you, Charles Dickens.

Brian discusses visual art and music mostly, and I’m talking about literature, being wary of generalizations about all arts, although I seem to make them often enough. So maybe we are talking at cross purposes. But for all arts (oops, I did it again) the ideal aesthetic attitude is receptiveness — a provisional acceptance of the author’s cultural situation, the benefit of the doubt. You have to be willing to check your damaged self at the door. Many respectable aesthetic theories, like Coleridge’s “suspension of disbelief” and the “pseudo-belief” of T.S. Eliot and I.A. Richards, reasonably begin with the attempt to inculcate this attitude: we need to read Christians and pagans without being either. So yes, good art makes you uncomfortable, but only incidentally, and anyone who makes a big point of the fact is a bad artist.

(Update: David Fiore comments. Great artists aren’t just different, David, they’re better. Get over it.)

Dec 22 2003

To understand the absurd seriousness with which Americans treat higher education, look at their cars. Jacques Steinberg’s The Gatekeepers, which trails a Wesleyan admissions officer and six supplicants for places in the class of 2004, documents this magic moment:

A week before his decision was due, he mailed off a $250 deposit and his official response to Wesleyan: a form that had “YES” preprinted in large type at the top. Jordan then went out to his mother’s car and pressed a clear Wesleyan decal against the inside of the back window.

Jordan’s palpable awe was correctly analyzed by Paul Fussell, twenty years ago, in Class:

Americans are the only people known to me whose status anxiety prompts them to advertise their college and university affiliations in the rear windows of their automobiles. You can drive all over Europe without once seeing a rear-window sticker reading CHRIST CHURCH or UNIVERSITÉ DE PARIS. A convention in the United States is that the higher learning is so serious a matter that joking or parody are wholly inappropriate… One would sooner defile the flag than mock the sticker or what it represents by, say, putting it on upside down or slantwise, or scratching ironic quotation marks around “College” or “University.” I have heard of one young person who cut apart and rearranged the letters of his STANFORD sticker so the rear window said SNODFART. But the very rarity of so scandalous a performance is significant.

Fussell, notably, does not assign this behavior to a particular class, but to Americans in general. The college decal afflicts uppers, middles, and proles alike. And I sympathize: if I were about to piss away 150 large I might want a souvenir too. Status anxiety being what it is, I see only one answer to the college decal problem: stop sending kids to college.

College, as a phenomenon, has nothing to do with learning. It is possible to educate oneself at Ball State or at Harvard, or alone in one’s room for that matter, like young Jimmy Gatz, studying electricity from 6:15 to 7:15 every morning and needed inventions from 7:00 to 9:00 every night. It is equally possible not to educate oneself at any of those places. I should know: when Harvard turned me down I beat my breast and rent my garments. I then proceeded not to educate myself at my safety school, Carleton College, which served the purpose admirably, just as Harvard would have.

For certain subjects college facilities are useful; it’s tough to learn biology or chemistry without lab work. But Tiffany will be majoring in sociology, and Eustace in political science. They could read Erving Goffman and Tocqueville on their own time, and $150,000, apparently the going rate for four years at a top university, buys a hell of a lot of private tutoring. Perhaps the parents consider the money well-spent if it simply gets the brats out of the house.

No, college is about bragging rights, and seeing to it that your child has the best possible start in life. Children who attend prestigious colleges are understood, correctly, to have more career success. Here, however, we run into a little cause-effect problem. College admissions officers look for good grades and high test scores and a documented record of achievement; employers look for the same things. If no one went to college, or if the bottom went while the top worked instead, would the income disparity, ten years hence, really be any different?

The children themselves dispense with these niceties. Of the six in The Gatekeepers, each, for all of his oft-asserted independence of mind and spirit, decides to attend the most prestigious school he gets into (as determined by the U.S. News rankings, which the schools follow as assiduously as the children). The single exception is a girl who courageously spurns Harvard in favor of Yale.

Steinberg, who graduated Dartmouth in 1988, is not, himself, the best advertisement for the admissions officers of the Ivy League. (I include Wesleyan, which billed itself for a while as “The Alternative Ivy” and is still trying to live it down.) As a writer he is a diligent reporter. His special weakness is for the inconsequential appositive, for “color,” and The Gatekeepers is full of sentences like this: “For Terri, the mother of a ten-year-old girl and an eight-year-old boy, the idea of traveling to Asia for five weeks a year on Wesleyan’s behalf seemed like a perfect segue to the nearly three years she had spent in Swaziland for the Peace Corps.” Neither the girl nor the boy nor Swaziland ever reappears, for which, I suppose, a more generous reader would be grateful.

Causation gives Steinberg some trouble. One of the students he follows, Jordan Goldman, connives his way, Steinberg never says quite how, into writing lessons with the distinguished novelist Richard Price. Goldman’s best friend has cerebral palsy and is bound to a wheelchair. Steinberg writes, “In Freedomland Price had created characters based on both boys and made them brothers, because he knew how badly they wished they were brothers in real life.” Cosmic stupidity lurks behind that “because.” Early in the book his admissions officer, Ralph Figueroa, interviews at Goucher College and dislikes it because there’s no decent Mexican food. At the end Steinberg says that Goucher has finally passed “the Tortilla Test,” not by improving the food, but by appointing a Mexican-American dean. You begin to feel a little embarrassed for the guy on the one hand, and to wonder, on the other, what Dartmouth is letting in these days.

Imbecility has its uses, letting Steinberg tell what he sees without noticing that it directly contradicts what he believes. Steinberg and his admissions officer firmly believe in affirmative action, a conviction unshaken by the fact that the two obvious affirmative action admittees, an American Indian to Wesleyan and an inner-city Hispanic to Muhlenberg, both drop out freshman year. Ralph rhapsodizes constantly about the importance of “diversity” at Wesleyan; yet he never seems to encounter anyone, on or off campus, with politics to the right of Howard Dean’s. One applicant “was intrigued that so many students were vocal in support of various political causes,” as Steinberg puts it — I would say coyly, except it does not seem to have occurred to Steinberg that there is more than one kind of political cause.

The Gatekeepers also makes clear what admissions officers really do for a living, during the nine months of the year when they aren’t reading applications. They solicit. Ralph spends months on the road, traveling from high school to high school singing the praises of Wesleyan and encouraging applications that he has every expectation of turning down. More applications means more rejections, which means more “selectivity,” which means a higher rank in U.S. News. Nothing scandalous about that, but nothing edifying either.

Suspiciously little in the way of actual academics seems to go on at any of these colleges — especially Wesleyan, which resembles on Steinberg’s account less an institution of learning than a year-round Burning Man festival — but there is an awful lot of travel. The Cornell girl spends six months in a pueblo in Costa Rica and a month in Rome “to write and draw.” The NYU girl goes to Prague, Jordan Goldman goes to Oxford. Only the Yale girl stays put, leading rallies on behalf of her fellow oppressed Yalies, demanding that all college loans be forgiven. The old aristocratic Grand Tour was more effective and no more expensive.

So parents, that round-the-world cruise that you’ve been promising yourself? The money’s just sitting there, in Junior’s college fund; help yourself. It’s his year abroad or yours.

(Update: Craig Henry points to a study that shows a surprisingly weak link between college selectivity and income. Maybe I was too kind. James Joyner comments. Julie Neidlinger comments.)

Dec 18 2003

With a title like that this should be in German and long. Instead it will be in English and short. George Hunka and AC Douglas have gone off the rails with this whole transcendence business. George, normally dyspeptic, soars into the empyrean:

As Kant will happily tell you, there’s no escaping the boundaries of human sensual experience, but as Schopenhauer will whisper in your ear, you can always seek to transcend it through renunciation of the world and through the highest expressions of sensuality itself. Art and religion provide the means for that renunciation. Artists, then, should encourage a path out of the materialist Hegelian world with the techniques at their disposal, whether those techniques are musical, linguistic or visual, just as the priests of all religions have their sacraments and their rituals as a means to transcendence.

This sort of art is utterly useless to the world, for it denies the world itself as a transient petrie dish of suffering and aimless, constantly unsatisfied desires for pleasure. The world itself can’t accept this denunciation of its own importance; therefore it invents Hegel.

Dude! Easy on the transient petrie dish of suffering there! If the alternative is, as it seems to be, being bored or tortured for eternity, then I’ll take my petrie dish of suffering, thanks. With fries. I concede that if the world had invented Hegel it would have some explaining to do, but I think we can let the world off on that score.

The aesthetic emotion is profoundly rooted in human experience. You watch the protagonist and think, that’s me (naturalism), or that’s what I wish I were (romanticism), or that’s what might become of me if things went really, really wrong (tragedy). You read the poem and think, I’ve felt that way, or I would, in those circumstances. You look at the painting and think, I’ve seen that, or I’d like to. (I’ve skipped music, which beats me.) There’s nothing terribly hifalutin about any of this.

Art seems different, somehow, and elaborately wrong-headed theories of aesthetics, like Benedetto Croce’s, have been constructed on this premise. But the word for sitting transfixed in the opera house, or the movie theater, or between the headphones, is not transcendence. It is absorption, or to put it still more mundanely, paying attention. I trust all my readers have become absorbed in a task. Becoming absorbed in a work of art is no different.

There are serious questions to answer in aesthetics. I suggest we try to answer them, and leave Never-Never Land to Tinkerbell, and Schopenhauer.

(Update: George Hunka replies. David Fiore weighs in (and here), as does JW Hastings. Stirling Newberry comments.)

Dec 14 2003

I tire of having to straighten everybody out on everything, but really, all these intelligent bloggers discussing great covers and not one mention of Devo’s (Can’t Get No) Satisfaction? Satisfaction never truly belonged to the Rolling Stones anyway. The Who might have made it their own but never the Stones, who were too smug and well-adjusted for a song so damp and anxious. The famous Keef guitar lick, great as it is, could just as well have shown up in Jumpin’ Jack Flash or Street Fighting Man; it doesn’t fit the lyrics at all. Truth now, Mick: did some girl you were trying to make ever tell you to come back baby later next week? Devo grasps the meaning of “He can’t be a man ’cause he does not smoke the same cigarettes as me.”

(Update: Props to Jeff Taylor, who lists Satisfaction in his top five. David Fiore comments, and posts a more interesting list than any I linked in the first place. The Warrior Monk plumps for the Otis Redding version.)

(More: Ian Hamet, George Wallace, Rick Coencas — yes, it’s godofthemachine.com, where the fun never stops!)

Dec 13 2003

Bad academic writing is called by its perpetrators “difficult” in the same way indulgent parents call their rotten children “difficult.” “Delinquent” would be apter in both cases. Jonathan Culler and Kevin Lamb have proffered the standard excuses in Just Being Difficult? Academic Writing in the Public Arena, which I haven’t read and doubt I could bring myself to read, and on which John Holbo has done a far better demolition job than I could in any case. Holbo quotes a paragraph from Culler’s introduction that gives the flavor:

The claim not to understand might seem an innocent posture that people would seldom adopt willingly, but in fact it is one of considerable power, in which authorities often entrench themselves. Eve Sedgwick has described the “epistemological privilege of unknowing,” whereby “obtuseness arms the powerful against their enemies.”

Pot, kettle. As Holbo says, “If these jerks are going to pretend not even to understand why some people are a bit cheesed off about how badly Homi Bhabha and Judith Butler write, just turn that trick on its head. Don’t even offer the courtesy of a fair debate, if that courtesy will only be abused by willful refusal to respond seriously to serious points. Thank you for being such a pain.”

Few ideas are so difficult that they can’t be expressed in a few sentences or a couple of equations. One doubts that these deep thinkers are up to anything so recondite as, say, Gödel’s theorems of formal undecidability, the proof for which David Berlinski managed to summarize clearly in three pages and about which Ernest Nagel wrote a very short and lucid book.

Legendary bad academic writers like Butler and Bhabha are quite capable, when the chips are down, of turning a respectable English sentence. In fact they tend to reserve their best prose to reply to complaints about their bad academic writing (Butler’s New York Times op-ed on the subject; costs $2.95, but trust me, it’s clear, if silly). They write that way on purpose. They’re hiding something.

Humanities departments are trade unions, and trade unions exist for two reasons: to restrict the supply of their labor, and to increase the demand for it. Of course there is no ultimate demand for Bad Academic Writing, in the sense of actual readers. Yet there is ongoing ancillary demand, from Bad University Presses and Bad Academic Quarterlies. They have quotas to meet and space to fill, while being generally exempt, thanks to generous endowments and still more generous taxpayer sponsorship, from the tiresome obligation to turn a profit. New and cogent thoughts on literature and philosophy will not float these subsidized outlets, not by a long shot. What is needed, and supplied, is a formula for generating an indefinite number of ways to say the same thing. Bad Academic Writing, like so many other bad things, is your tax dollars at work.

There remains the problem of supply: literary criticism and philosophy require no special training, unlike, say, pipe fitting. Modest erudition and a little elbow grease suffice. When T.S. Eliot, asked what a suitable method for criticism might be, answered “to be very intelligent,” he was making the same point in a more flattering way.

To the professionals in the field this state of affairs is deeply unsatisfactory. Doctors have medical boards, lawyers have bar exams, what’s a poor humanities academic to do? The First Amendment unluckily prevents the issuing of licenses to practice philosophy or criticism, so the guild resorts to other means to keep out amateurs like, say, T.S. Eliot. These means are tenure and an arcane lingo. If you don’t use the lingo you don’t get tenure; if you don’t get tenure you’re not a professional; and if you’re not a professional you can be safely ignored. Better luck next time.

No matter how you scramble the language of “rearticulations,” “social relations,” “structural totalities,” and “enunciatory modalities,” it always comes out the same: as a critique of post-industrial capitalism. Try this yourself at home. The words are father to the thought, and it is seemlier to make writing a certain way, rather than thinking a certain way, a requirement for guild membership. If it’s hegemony you want, well, I got your hegemonic power structure right here.

Dec 08 2003

Terry Teachout writes of the perils of the goyim among the Jews, but what of the perils of the Jews among the goyim? One of the minor joys of Richard Rhodes’s book The Making of the Atomic Bomb is this stock answer of a Russian physicist when confronted, as he frequently was, with anti-Semitic remarks: “My ancestors were forging checks when yours were still living in trees.”

Not to be underestimated, either, are the perils of the Jews among the Jews. Harry Cohn, the legendary chairman of Columbia Pictures, was once solicited by a group of writers for a Jewish relief fund. “Relief for the Jews?” said Cohn. “How about relief from the Jews?”

(Update: Rick Coencas comments on the comments.)

Dec 07 2003

This place has gone to seed, in large part, because I’ve been doing some actual work, trying to get a software release out — late, inadequate, but out — and as a consequence have followed Floyd McWilliams’s and Evan Kirchhoff’s theorizing about the future of software with more than academic interest. Evan starts here, Floyd replies, more Evan, more Floyd, and finally Evan again. The question at hand is when all of our jobs shall be outsourced to Softwaristan (India), where they produce high-quality source code for pennies a day, and what we software developers shall be doing for a living when that happens. As Evan puts it, “Floyd says ‘decades,’ I say ‘Thursday.'”

And I say, with due respect to both of these highly intelligent gentlemen, that neither one has the faintest idea what he’s talking about. They are speculating on the state of a science seventeen years in the future, and if they were any good at it they wouldn’t be laboring, like me, in the software mines, but in the far more lucrative business of fortune-telling. I — and I suspect I speak for Floyd and Evan here too — would happily swap W-2s, sight unseen, with Faith Popcorn or John Naisbitt, and they’re always wrong.

Floyd compares the current state of software development to chemistry circa 1700, which is generous; I would choose medicine circa Paracelsus, the Great Age of the Leeches. The two major theoretical innovations in modern software are design patterns and object orientation. Design patterns and object orientation are, depending on how you count, ten and thirty years old respectively, which indicates the blazing pace of innovation in the industry. Design patterns mean that certain problems recur over and over again, and instead of solving them the old-fashioned way, from scratch every time, you write down a recipe, to which you refer next time the problem crops up. Object orientation means that software modules, instead of just encapsulating behavior (“procedural programming”), now encapsulate data and behavior, just like real life! Now doesn’t that just bowl you right over?
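(A minimal illustration, in Python rather than anything Floyd or Evan wrote, with a made-up Account example: the same bookkeeping written procedurally and then as an object, plus one recipe-style pattern. Nothing here is deeper than the paragraph above, which is rather the point.)

    # Procedural style: behavior only; the data is passed around separately.
    def deposit(balances, account_id, amount):
        balances[account_id] = balances.get(account_id, 0) + amount
        return balances[account_id]

    # Object-oriented style: data and behavior encapsulated together, "just like real life."
    class Account:
        def __init__(self, balance=0):
            self.balance = balance

        def deposit(self, amount):
            self.balance += amount
            return self.balance

    # A design pattern is a written-down recipe for a recurring problem.
    # Singleton, for instance: "make sure there is only ever one of these."
    class Config:
        _instance = None

        @classmethod
        def instance(cls):
            if cls._instance is None:
                cls._instance = cls()
            return cls._instance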

Hardware, by contrast, improves so rapidly that there’s a law about it. It is a source of constant reproach to software, which has no laws, only rueful aphorisms: “Adding people to a late software project makes it later,” “right, fast, cheap: choose two,” and the like.

Evan claims, notwithstanding, that “a working American programmer in 2020 will be producing something equivalent to the output of between 10 and 1000 current programmers.” Could be. He points to analogies from other formerly infant industries, like telephones and automobiles. He also cites Paul Graham’s famous manifesto on succinctness as power, without noting that Graham’s language of choice is LISP. LISP is forty years old. If we haven’t got round to powerful languages in the last four decades, are we really going to get round to them in the next two?
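(Graham’s point, crudely, and in Python rather than LISP; the illustration is mine, not his: a more powerful language lets you say the same thing with less ceremony.)

    # The same computation, said twice.
    # Verbose, ceremony-heavy style:
    squares = []
    for n in range(10):
        if n % 2 == 0:
            squares.append(n * n)

    # Succinct style: one comprehension, no ceremony.
    squares = [n * n for n in range(10) if n % 2 == 0]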

Floyd counters with an example of an object-relational library that increased his team’s productivity 25-50%, arguing that “as long as development tools are created in the same order of magnitude of effort as is spent using them, they will never cause a 100 or 1000-fold productivity improvement.” Could be. Certainly if, as we baseball geeks say, past performance is the best indicator of future performance, I wouldn’t hold my breath for orders-of-magnitude productivity improvements. On the other hand, bad as software is, enormous sums are poured into it, large segments of the economy depend on it, and the regulators do not even pretend to understand it. This all bodes well for 2020.

Me, I don’t know either, which is the point. Evan works on games, which are as good as software gets; this makes him chipper. Floyd works on enterprise software, which is disgusting; this makes him dolorous. I work on commercial business software, which is in-between; this makes me ambivalent. We all gaze at the future and see only the mote in our own eye.

(Update: Rick Coencas comments. Craig Henry comments.)

Dec 06 2003

These are my first words about Michael Jackson, and I promise they will be my last. What interests me about Michael is not Michael himself, whose habits and daily life are so far outside the realm of ordinary human concerns that the word “eccentric,” implying that he might still be in orbit with the rest of the solar system, no longer applies. It’s the parents of Michael’s little friends who interest me: what could possess them to send their children off to consort with him in Neverland? It can’t be the money — most of them were quite well-off — so it has to be the fame. These wretched people want to be near Michael. They want to talk to him, to ride in his private plane, to be sprinkled with a bit of that magic celebrity pixie dust.

And what exactly is that pixie dust? Let’s channel Tyler Cowen here and consider this in economic terms.

Everyone craves distinction, identity, that special something that sets one apart from, that makes one better than, the neighbors. Distinction, by its nature, must be scarce, or it isn’t distinct any more, and scarce goods in America are increasingly difficult to find. Distinguishing yourself in your profession is one possibility, but that’s a lot of work, and even if you succeed you’re likely to be appreciated only by your colleagues. The Joneses won’t give a damn.

Mere money-making is out. In America, where the plumber makes more than you do and movers take Caribbean vacations, money is no longer a mark of distinction: it is common, in every sense. From this observation Paul Fussell derived a whole book, the horribly snobbish but amusing Class, and Tom Wolfe the better half of a career.

To replace money Wolfe and Fussell proposed taste. Not real taste of course, in the sense of cultivating a well-honed appreciation for some field of endeavor — like professional distinction, that’s hard work, and unlikely to be widely admired. No, Fussell and Wolfe meant taste as fashion, knowing what to listen to, to read, to wear, and to eat. This worked for a while but eventually everyone wised up. There is an episode of Cheers in which Woody’s father wants him to leave his Boston bartending job and come back to the farm in Indiana. The pseud waitress, Diane, makes a movie to persuade Mr. Boyd to let Woody stay and sends it to him. She asks Woody how his father liked it. Woody says, “He liked it all right, but he thought it was too derivative of late Godard.”

Fame at its most rarefied, when one is known by a single name, always has been and always will be scarce. Michael Jackson has been famous this way for thirty-five years and his pal Elizabeth Taylor has been famous for sixty. Even Warhol’s famous aphorism, wrong as it was, implies that there will never be enough fame to go around. Fame is the last universal currency. It collateralizes loans for Donald Trump; it buys a bully pulpit for Rosie O’Donnell and literary influence for Oprah Winfrey. It secures the best table in the restaurant, no reservation required. In an age of almost unimaginable abundance, celebrity is the last scarce good. Is it any wonder that people pursue it, and proximity to it, so assiduously?