Have a Stiff Drink First:
17 December 2004: Beyond an everyday attentiveness to providing for food, clothing, shelter, and recreation, economic self-interest has never seemed a very compelling force to me. Most people seem to pursue their ideas of success without much regard for the personal bottom line, and most people also strike me as not wanting to think very much about that line in the first place. I've known men who enjoyed turning profits well enough, but they always had, for me, the air of happy gardeners, delighted to see what seeds and soil turned up. The person who would abandon a job paying $150,000 for another paying $151,000 (other things being equal) must, I think, be extremely rare, and quite probably troubled.
So I don't expect dollars and cents to play much of a role in political calculations. When they appear to do so, it's a front for something else. When voters appear to get mad about taxes, for example, they're really angry about how they think their tax dollars are being spent - or upon whom - or about the arguable incompetence that would explain persistent tax hikes. In the right circumstances, people will happily pay high taxes. Westchester County, north of the City, contains more than a few villages where high property taxes support excellent public schools. These school districts are cooperatives, effectually, for the parents of school-aged children. When your children have gone through the system, you can stay, if you like, but you can leave, too, and make room for a family like the one yours used to be.
A parent in one of these towns might very well argue that he is sending his children to good schools so that they will eventually win lucrative employment. But that is daydreaming, wishful thinking at best, and certainly not economic self-interest, narrowly conceived.
Free market economics are popular with Americans not because they benefit from them, but because free market economics militate against schemes for the redistribution of income, or welfare. When most people feel that their prosperity is in retreat, they are understandably unwilling to allow the government to appropriate any of their dwindling resources for the benefit of those less fortunate.
Anatol Lieven reviews Thomas Frank's What's the Matter with Kansas? in the 2 December issue of the London Review of Books. (In England, the book has been published as What's the Matter with America? The review is behind the LRB's paywall). On the whole, Mr Lieven likes the book, but he faults it for its economic naivete - its faith in economic self-interest. The book is in one of my many piles, but I haven't read it yet, not least because I wonder if I've really got to. I did read an excerpt somewhere, and the gist of it was a bemused incredulity at the stupidity of poor Americans who vote for policies that will make them poorer while making rich Americans richer. Don't they get it, Mr Frank seems to be asking. No, they don't, Mr Lieven replies, because they're not paying attention to being richer or poorer.
They're paying attention to being respectable. This means holding on to middle-class status by holding on to middle-class values. Away from the cities and big towns, in places where anonymity is both unthinkable and unattainable, shadings of personal virtue are more salient than shadings of personal property. If you are a sufficiently nice person, then it does not much matter what kind of car you drive (so long as you keep it clean and in safe repair). If you have to work two jobs and not just one in order to afford any car, that is all right, and certainly not the government's fault. The important thing is to be perceived as a good person. And the surest kind of good person is a traditional person. People who put being interesting ahead of being good had better head for the cities.
I don't think that it's possible for a woman's life to be traditional and interesting (even if only to herself) until she's middle-aged. For anybody, living an interesting life requires some serious disregard for tradition, at least temporarily. For a woman, it's arguably untraditional to seek to live an interesting life or to be an interesting person. To the extent that "interesting" means something more challenging than taking the kids to Orlando, it is probably to be avoided. For middle-class tradition is rooted in family life, in assuming one's God-given family role and, with luck (or grace), carrying the family onward through marriage and parenthood. To be interesting, you have to conceive of your life apart from that of your family. Not as against your family, necessarily, but simply with independence.
Thomas Frank isn't wrong to point out that most Kansans are worse off than they used to be. Where he errs (and I say this hypothetically, not having read his book) is in failing to see that this impoverishment is the very force that has pushed them into Republican arms. For although the Republican leadership is widening the gulf between rich and poor, it is also the party that upholds tradition. Its message is not so much that the middle class is the most important element in American life as it is that being middle-class - professing middle-class values - is the defining American pursuit. So long as one is middle class, and so long as being middle-class is championed, then one need not fear falling out of good society altogether and into what Mr Lieven calls the proletariat.
We in the cities don't see any of this. "Tradition," when used in New York, usually refers to cultures rooted elsewhere, whether as close as New England or as distant as Fujian. Traditions that aren't merely decorative, traditions with the kind of teeth in them that, say, force young women into arranged marriages, are regrettable in our eyes, bad habits that might, it is hoped, be eventually outgrown. We don't find meaning in the dictates of dead people whose claim upon us is mere ancestry. "Family," among the New Yorkers of my acquaintance, is an elective institution, built up over years out of friendships. Siblings are more likely to be troublesome sources of grievance than otherwise. How many people have come here simply to get away from their families? Perhaps not as many as you'd think, but life in the city is certainly flavored by the impulse.
This makes Mr Lieven's assessment all the more chilling:
If Middle America continues to crumble, one of the essential pillars of American political stability and moderation will have gone; and dreams of destroying America's enemies abroad, 'taking back' America at home and restoring the old moral, cultural and social order might well become more powerful and more disturbing. Three factors are critical. First, Frank's conservative-voting Kansans, like most American workers, define themselves not as working-class but as middle-class. Second, religious belief and practice of a 'Protestantoid' kind is at the heart of their conception both of their own identity and of the good society. Third, as Frank writes (echoing the conservative historian Walter Russell Mead), the combination of religious, middle-class and nationalist values has created among these people a view of themselves as something like a Volk - the 'real' or 'true' American people, as Republican campaign rhetoric in the heartland has continually stressed.
Frank deals with all these issues vividly and with great insight, but like much of the left he can't rid himself of the traditional materialist belief that economic interests determine political behaviour, and that if they don't, they should.
Reading Mr Lieven's review, I began to wonder if the Republicans haven't created a perpetual-motion machine. So long as Republican policies keep exurban Americans in a state of social anxiety, they will be guaranteed the support of exurban Americans. How odd it is to be obliged to find comfort in the Bush Administration's overriding characteristic: incompetence.
10 December 2004: What will be the fallout, do you suppose, of Secretary of Defense Rumsfeld's Q&A in Kuwait the other day? When the very courageous Specialist Thomas Wilson asked the Secretary why his unit was so chronically short of armor and supplies, Mr Rumsfeld replied rather testily to the effect that things can't be perfect. But as anyone familiar with the planning for the Iraq war knows, the shortages referred to by Specialist Wilson are the inevitable result of Mr Rumsfeld's disregard for standard Army planning, which he thought too costly. Save dollars, lose lives. It's a grim calculus, and the sooner military families wake up to it, the sooner we'll see the end of Secretary Rumsfeld.
Or so you'd think. Although there have been isolated acts of protest - some reservists are resisting orders to return to Iraq; one or two have even sought asylum in Canada - there is as yet no general movement openly or forcefully critical of the Administration. Although the Iraq misadventure is being fought by a volunteer army, the volunteers come from the same relatively underprivileged sectors of American life that staffed our forces in Vietnam. But I sense that they do not regard the current morass with simple patriotism. While most soldiers probably do believe that taking some kind of pre-emptive action against Saddam Hussein was necessary - they're soldiers, after all, not middle-aged eggheads living in Yorkville who are busy making blanquette de veau for a family dinner party - it's just possible that many of them see that there are good ways of prosecuting a war, and bad ways, and that the Administration has been doing almost everything wrong, from taking Ahmed Chalabi's nonsense on faith to - scrimping on armor and supplies. I hope that they're wondering why the world's most powerful military has bogged down in a war of attrition. A few of them, having spent some time on the ground in Iraq, may even have good ideas about blocking terrorism.
I haven't spent any time whatsoever among military people, and I've reason to believe that my observations might sound condescending. Let me very bluntly state that they are not meant to be. My admiration for our soldiers is quite deep, not least because they're volunteers. Being the liberal that I am, I have no objection to exploiting the military as a machine of upward mobility; after all, that's how we've staffed our airlines from the start. But although a stint in the Army exposes one to an elevated risk of death and injury, that assumption of risk doesn't justify throwing inadequately armed young people into harm's way. Forcing defenseless soldiers to choose between death from the enemy and death from the officers was Stalin's response to Hitler, and it still sounds gruesome. But Stalin could claim reasons of state that are altogether missing from the Administration's portfolio.
As Andrew Sullivan says, "This is not knee-jerk anti-war sentiment. This is knee-jerk pro-war sentiment." I disagree with Mr Sullivan about the warrant for this war, but, now that it is wearing through its second year, I can see no warrant, either, for fighting it ineptly, and at unnecessary cost to our soldiers.
3 December 2004: Ever since the election, I have been teasing out aspects of a prevailing social persuasion that is very different from my own. I call it a persuasion because I suspect that its religious claims are spurious, and its hostility to reflection and self-awareness makes it utterly unphilosophical. I've decided that it is a mistake to label this persuasion "the patriarchy," but I haven't come up with anything better. Is there a myth about a man who would fall apart if he ever looked at himself in the mirror? In a conflation of the Narcissus and Medusa stories, our unlucky hero, upon seeing his own face, would turn to stone. If there were such a myth, this character's name would make the ideal label for what I've been thinking about.
In an article about John Travolta in today's Times, Caryn James writes of Mr Travolta's "religion," Scientology, that it considers "psychiatry and psychology to be evil." Some would say that this explains the flatness of many of Mr Travolta's performances - and those of his "coreligionist," Tom Cruise, as well. But forget about acting. Psychiatry and psychology have done more to undermine the myth of male supremacy than any other intellectual developments. They have exposed macho behavior as a bluff. I don't mean that tough guys don't really want to fight. The bluff is their pretense that fighting is so meaningful that it overcomes the pain of mayhem and death. But psychiatry and psychology have revealed that it is fear that motivates aggression. Tough guys fight because they're afraid not to.
When I started seeing a psychiatrist, in sixth grade, it was a family secret, a potential disgrace. (My adoptive parents didn't know what to do with me, and I didn't know what to do with life, and my sessions with Dr K-, aside from giving me a chance to talk about myself without being interrupted, accomplished nothing.) To an extent, of course, seeking psychiatric help sounded the alarm of significant emotional instability (the word was "mental" - itself a clue to this country's anti-intellectual posture). This was permissible in women, but in a man, even in a boy, it signaled the worst possible character defect that didn't involve outright criminality: lack of self-confidence. To be unsure of oneself was the cardinal failing. (I was actually all too sure of myself as a child. I was sure that I would never, ever fit in, and I was sure that I didn't want to, either. Only when it became necessary to make a place for myself in the world did I question this, and healthy self-doubt didn't take root until I was well into my thirties.) And then there was sex. Weren't psychiatrists thought to grill their patients endlessly about sex?
Surely one of the most interesting differences between men and women today is that women seem to have no trouble at all discussing their sex lives with other women. I've even overheard such conversations myself, in circumstances suggesting that my eavesdropping was neither unnoticed nor objectionable. Men, on the other hand... Sadly, it is very much the case that the fool who divulges his sex life to another man can be sure of winning that man's instant contempt. In order to talk about your sex life, you have to know something about it; you have to think about sex when you are not actually having sex. You have to look into the mirror.
26 November 2004: Before the public conversation about religion in America boils over, I'd like to suggest some clarifications. It has been clear to me, since the election, that some important words are being bandied about without much sense of precision. I suggest that the following clarifications articulate the wellsprings of American political virtue.
Faith. Everything that we do is prompted by faith of some kind. We trust the bank where we cash our paychecks. Our faith in an airliner may not be total, but it's strong enough to get us on board. We believe in concepts, such as truth and justice, that we have neither seen nor felt. Perhaps the majority of human beings alive at this moment believe in a reality that lies beyond mortal life, whether it is a blissful paradise, a fiery hell, or something more neutral. I myself have faith in the meaning of the universe, but I am quite sure that I will never know anything about it, and so many people would say that I have no faith at all. I certainly do not profess a faith.
Religion. The root of this word is the same as that of ligament; religions tie people together. There is no such thing as a private or personal religion - all religions are public. As a matter of convention, it is silly to speak of a religion whose focus is neither a creator of the universe nor the nature of an afterlife. Religion is the bond uniting people with the same focus of this kind; religion articulates the bond by prescribing the creeds and rules of conduct that constitute orthodoxy. It is possible to be a person of faith who subscribes to no religion, and it is also possible to be a religious person without faith. The barrier that conceals an individual's faith from public view can be breached only by the faith of another, as when someone claims to know by divine guidance (an object of faith) that someone else's religious observances are insincere. It is correct to speak of the combination of religious acts and religious witness as a profession of faith.
Politics. Political activity is a cooperation of different groups that is founded upon the understanding - I avoid the word 'belief' here - that the virtue of individuals and the groups that they constitute is not determined by religious profession. Theocratic and ideological regimes, which reject the possibility that goodness can coexist with heterodoxy, are by definition incapable of supporting overt political activity. It is possible and permissible for people engaging in politics to believe that those with other religious views are certain to be judged evil by God and damned to eternal torment. What is neither possible nor permissible is for people to refuse to engage in political activity with those whose religious differences may damn them. Such refusal signals the end of politics and the beginning of tyranny.
Democracy. Modern democracy is political self-government that refuses to privilege any constituent individuals or groups. Laws and procedures apply in the same way and with the same force to all, and are not tempered to the alleged superiority - even that of numbers - of anyone. The influence of privilege signals the end of democracy and the beginning of oligarchy.
19 November 2004: We are all sitting, waiting, aren't we, for President Bush to commit an outrage, to attempt some breach of governmental nicety (or worse) that will signal the correctness of our mistrust. There - see what he just tried to do? A call to arms.
I won't be surprised if such an outrage never occurs. If it does, it's very likely to arise from a Supreme Court nomination, from another Robert Bork duel in the sun; only, this time, the other side will shoot first. But Supreme Court nominations simply don't register among most Americans. (Question: is it 'condescending' of 'East Coast liberals' to 'lecture' the 'heartland' about the importance of the president's power to fill Supreme Court vacancies?) I will say one thing for the President: he's not a grandstander. How can I say that, you ask, of Mr 'Mission Accomplished'? What needs to be understood about that gesture is that it was impressive simply because it could be done. Whether or not the 'mission' had been 'accomplished,' George W. Bush was powerful enough to deck himself out in a flight suit, command a helicopter, and make an appearance on the deck of the USS Abraham Lincoln, complete with banner. Forget the war; Mr Bush could make his victory walk happen. That's why he's still the President. Cosmopolitan people will have a hard time understanding this, because an essential part of our intellectual makeup involves forgetting the states of mind that respond to displays of power.
The other day, I happened upon a posting at Fly Bottle that quoted the following observation of Andrew Sullivan's: "Americans tend to believe that talent needs no apology." I couldn't agree less. Sure, from a class-ridden English viewpoint, it's obvious that nobody holds Harrison Ford's skills as a carpenter (which supported him during a very lean youth) against him; nobody says, "Oh, pooh on you, you didn't go to Groton and Harvard; we 'in' types are going to exclude you." I know that that's exactly what happens within some circles in Europe, and I know that it really doesn't happen here. But, by that very token, when Andrew Sullivan uses the word 'talented,' what he means is 'unprivileged.' The merely talented find acceptance here. Except: if their talent happens to be intellectual. In my comment on Mr Sullivan's observation, I wrote that, in my experience, there is no place in America (including Canada and Mexico) where intellectual talent is openly welcomed. We distrust thinkers - and thinking - in the New World.
And if you look at Academia, you see why. The middle-aged professors of today may contribute generously to liberal causes, but it was the professoriate that undermined liberal authority, not William F. Buckley or the VRWC.
Who knows what the President is going to do. I don't worry about it much, because the people who think he's simply the most dreadful president ever nearly carried a majority in the recent vote, and what unifies all of them is a deep dislike of incompetent bullies. They'll be on the lookout for malodorous presidential imbroglios, and I have to say that, after the Sinclair Broadcasting fiasco, I repose a lot of confidence in the Blogosphere.
But you don't need me, at this point, to remind you that the President has been perpetrating overlookable outrages from the beginning of his first administration.
He's been reelected.
5 November 2004: Wednesday was a really bad day for most of the people I know. I myself was almost happy. Not that Mr Bush won the election, certainly; that's very bad news for the United States, and probably for the rest of the world as well. But for me it was simply bad news. What tended to elate me as the day wore on was the virtual sound of millions of Americans waking up and seeing the President for what he is.
Four years ago, I felt like Chicken Little. I've often wondered how it is that I called Dubya's performance so quickly; I don't spend much time at the racetrack of politics. My recognition was quicker than quick; it was instantaneous. I knew what kind of president he'd be when he was just the Governor of Texas. How? Well, I was an unpopular kid in grade school. Even then, I was somewhat unusual - my third-grade teacher wrote in my yearbook that he hoped I'd grow up to be a writer, and I hated sports. Bo-ring! (And a lot of hard work, too. This is fun?) I was always so tall that no one ever picked a fight with me, but verbal abuse was another matter - I suppose, thinking about it, that there was a measure of curiosity about how I'd fight back, but I never did. But I certainly learned to recognize bullies at a distance, and there is absolutely nothing that George W. Bush could say or do that would induce me to reclassify him. I can tell without having to be in the same room, for example, that his odious habit of conferring nicknames on everyone is just a smirky way of belittling others - the bully's ruling passion.
But no one I knew seemed to see this four years ago. Amazingly, the line about 'compassionate conservatism' actually went down! Respectable newsmen opined that the new president would govern on the understanding that his slight margin of victory (negative, in fact) mandated a moderate, centrist Administration. Then they were surprised by the nominations of people like John Ashcroft and Gale Norton for Cabinet posts. They praised Mr Bush's utterly opportunistic response to the attacks of 9/11. They supported his Oedipal adventure in Iraq.
That's all changed now, so I'm happy. Very happy - about that. I'm sorry that so many bad things had to happen in order to rouse my friends and neighbors. Now they know, too, that the sky is falling.
Many of them are also beginning to realize that a majority of Americans voted for Mr Bush so that they could continue sleeping. That's what he promised them, anyway.
As I said, almost happy.
My first act on Wednesday morning was to write a letter to my friend, Susan Babcock. Like so many people who wrote to me yesterday, I began with an expression of grief. At a certain point, I realized that I had said everything that I had to say about the election (for the moment, to be sure), and I was about to send the letter when I thought of writing to another friend, Judy Muncy. Looking at the letter that I'd just finished writing, I saw that, while written for Susan's ear, it was completely impersonal. Writers may like to write, but they also like to make their work go as far as it can, so instead of sending the letter to Susan, I sent it to everyone on my Daily Blague mailing list. The letter ended with the hope that I'd know by Friday whether or not Senator Kerry had really lost the election. Before long, of course, the Senator conceded, so I deleted that last sentence and published the letter here. I'm going to keep it here for a while, not so much because I think it's important or unusually perceptive, but because the writing of it was such a relief and such a comfort that I felt protected by its aura for the rest of the day. Never before has writing had this magical effect. Writing has often made me feel very good, but yesterday's letter made me feel safe. I only wish that the feeling of safety were not, like all magical effects, an illusion.
3 November 2004: It feels like the long-awaited death of a beloved friend. That friend would be liberal justice.
As the Bush Entourage fills upcoming Supreme Court vacancies, half a century of progressive social development will come to an end, and, what’s more, much of the ground gained will be lost in the reactionary snap. I am glad that I began to ponder, a few weeks ago, the mystery of support for Bush, because when I heard Andrew Kohut, on NPR, mention that those who voted for the President told his exit pollsters that they were very concerned about ‘morality,’ I wasn’t surprised. I hadn’t quite thought my way that far, but I was ready to make the final step. The very fact that ‘morality’ was used to describe ancient attitudes about sexuality makes me feel that I’m living in an Islamic theocracy. Americans in the heartland have not been on the bus for a long time: they don’t like feminism, they don’t like homosexuality, and I suspect that they don’t like interracial marriage, either – but then again neither do most blacks. They still believe, in short, in the patriarchy.
Forget Iraq, the draft, the deficit, the scandals, the economy, the outsourcing, and healthcare. Those are all very minor issues in comparison with the patriarchy: government by a hierarchy of males, self-proclaimed members of the superior gender. We ought to have deduced this from the way the Right hammered away at Kerry’s ‘vacillation,’ his ‘flip-flopping.’ Real men never change their minds. But real men are rarely candid, and either can’t or won’t explain their positions. So we had to figure it out for ourselves.
What’s my problem? I’m a married man enjoying a quiet life. Well, it’s like happiness. Just as happiness is a kind of concentration that dims the sense of self, so is the sadness that I’m feeling: a concentration on a lost worldview (or at least an extremely imperiled one) that makes me forget that I, personally, have little or nothing at stake. Except, of course, my freedom of speech. I hate patriarchy and won’t stop saying so. But now that I have a better idea of what so bitterly divides this country, it’s hard to be optimistic. Just as millions voted for Kerry just to vote against Bush, even more millions, it appears, voted for Bush in order to vote against ideas like mine.
29 October 2004: By this time next week, if we're lucky, the presidential election will be behind us. I'm not optimistic about that, but I'm not without hope, either. As to the outcome of the election, I find that I can't really think about it. It's as though I were scheduled for a crucial exam - a biopsy, say - and weren't going to find out whether I were going to live or die until Wednesday. I continue to be amazed that so many voices in the Blogosphere continue to treat Messrs Kerry and Bush as not-terribly different figures. To me, they're not comparable in any meaningful way, distant cousins though they may be. Mr Kerry has his faults, and is far from the most appealing of candidates - although I must say that I came rather to like, or at any rate to admire, him in the debates - but Mr Bush heads a team of radical reactionaries that has "carjacked" the Republican Party; its dream is to restore the age of the Robber Barons, and its incompetence ought to be obvious to everyone. Why isn't it?
Perhaps it is; perhaps there are many voters to whom incompetence doesn't really matter. Better to be firm and manly, principled and resolute, than to be - what? Right? Capable? Sensible? Prudent? These voters forget, I think, that the President of the United States is not the lone lawman of Hollywood legend, but the pilot of the most redoubtable and complex ship of state ever afloat. Or perhaps they're yielding to wishful thinking. We're a young country, but it's characteristic of the young to wax nostalgic over an always-simpler past, and the tug exerted by the fancy of a simpler America seems to be very strong, particularly for men who regret and resist the fading of patriarchal ways of doing things. I prefer a progressive outlook, if only because I believe in the truth of the Humpty-Dumpty rhyme (it is unchallenged by history), but outlook is less important to me than responsibility, and I am appalled by the cavalier insouciance of the Bush Entourage. The damage that has already been done will take years to repair. The damage that another four years of the same will almost certainly cause might very well destroy the United States. (Between the deficit and our dependence on foreign energy sources, the nation is already living beyond its means, and the wake-up will be ruder the longer it's put off.)
22 October 2004: Until very recently, I thought that critical thinking was a skill possessed by all educated people. As a statement about the present, I still do, but two books that I've just read, Dead From the Waist Down and Reformation, have convinced me that critical thinking was developed - invented - in the Renaissance-Reformation, between 1450 and 1650. Before that development, there was no critical thinking, not even by the smartest people. This is not to say that ancient Romans believed everything they heard. But it is true that they tended to believe anything plausible. It was this tendency that inspired the great forgeries of the seventh and eighth centuries, such as the "Donation of Constantine." In the absence of a standard of rough-and-ready credulity that rejected only the extraordinary, such forgeries would never have been taken seriously. There seems to have been a general idea that if there ought to be a document attesting to something, but such a document couldn't be found, then it was all right to - produce it. Ad maiorem Dei gloriam.
Sadly (so to speak), there are no incontestably Divine manuscripts. By the fifteenth century, the pileup of self-serving charters was a stinking scandal, and with the revival of Greek learning in the West after the fall of Constantinople (1453), the utter lack of Scriptural authority for such keystones of faith as Purgatory fostered a climate of doubt. Over the next several centuries, what we call scholarship - it has nothing at all to do with the medieval theorizing commonly called 'scholasticism' - would be worked out in this new and pragmatic atmosphere. Anyone who has ever written a college term paper requiring footnotes will understand that while there is no theory of scholarship, there is always a purpose. The purpose of scholarship is to make your work so easy to refute - by showing, for example, that the texts that you cite don't exist, or don't say what you claim they say - that you will do everything you can to make your work irrefutable. It is a kind of transparency that's supposed to keep everybody honest.
(Transparency? How can a system requiring such arduous review - nothing could be more laborious than going through all the footnotes in a treatise, or repeating all the experiments in a study - be called 'transparent'? The story of Hendrik Schön, of Bell Labs, is but one of several recent stories about scholars and scientists who got away with 'murder' for rather a long time. It's clear that we've become too complacent about scholarship.)
Although most educated people are not scholars, they are taught by scholars. With the rise of secular education - the scholarly training of students regardless of their intent to pursue academic careers - the political, professional, and executive worlds came to be dominated by people schooled in the principles of scholarship, principles that have come to be called, collectively, 'critical thinking.' Nobody is taught critical thinking as such. There are no courses, and there are no texts. (Correction: such courses and texts as exist are designed to study critical thinking, not to teach it. Think meta!) Students just pick it up. The children of professional people may very well develop the habit of critical thinking long before they reach university. It is possible to become a critical thinker without any higher education at all. Difficult, but possible. (And inadvisable: without higher education, one is likely to be so unevenly informed that the occasional slip back into 'plausible' thinking becomes inevitable.)
Senator Kerry's controversial reference to a hypothetical 'global test' in the first of this season's debates made me wonder if the candidate hadn't meant to say 'sniff test' instead, but been advised against such an earthy term. Sniff tests are, of course, exercises in critical thinking. They're so called because all it takes is a sniff - a moment's scrutiny - to show that something that has flunked the test can't be what it claims to be. When movie stars claim to be philosophers, for example, their statements flunk the sniff test, for any number of reasons. (Enumerating and assessing these reasons would be an interesting way of anatomizing critical thinking at work.) Without supporting evidence, such remarks are not going to be taken seriously. Sniff tests are proof that critical thinking has taken root not just in many minds but in many families and other environments. That they're so quick, almost immediate, suggests that the habit of critical thinking can become as reflexive as the habit of driving a car.
But just as nobody is born knowing how to drive in traffic, so nobody is born with a command of critical thinking. It has to be learned. Which means that it has to be taught. Here's why I think that critical thinking is the raison d'être of education:
In classical political theory (which prevailed until the Reformation), democracies were bad because entrusting government to uneducated people was seen as a recipe for disaster. To be an educated person, you had to learn very specific skill sets (these gelled into the 'seven liberal arts' toward the end of classical antiquity), which you then employed in the direction of affairs. There may have been elements of critical thinking here, but they were far outweighed by what I'll call the ideology of authority. Whatever Aristotle said about anything was held to be self-evident for hundreds of years, before critical thinkers showed how credulous - incapable of critical thinking - Aristotle really was. You went to school to learn who the authorities were, and thereafter you invoked them. Unschooled people, ignorant of the proper authorities and likely to believe anybody plausible, could only be expected to yield to demagogues - false leaders who amass followers by making sweet and impossible promises.
The Reformation took another look at democracy. While no reforming sect encouraged the criticism of its own doctrines, the increased literacy nurtured by the belief that every man must read Scripture for himself fostered a critical outlook about everything else. (This is one reason why, having questioned the Pope's authority, the protestant sects multiplied in waves of intramural dispute.) The Founders of the United States trusted in this critical outlook. They shared a free-market view of ideas: in the absence of constraints, the best ideas would prevail. This was a working hypothesis, to be sure, not a belief, and the Constitution was designed to assure that the worst ideas would have a hard time prevailing. But in the Founders' day, it was all but taken for granted that anyone with the power to vote would exercise that power, and that elections would reflect consensus. For most of this country's history - the runup to the Civil War is the only outstanding exception - that expectation has been borne out.
The legacy of the Age of Ideology - the Twentieth Century - has done much to damage the prestige of critical thinking, not because critical thinking was instrumental in supporting totalitarian regimes but because critical thinking tarnishes the glittering allure of faith-based regimes. We in America always thought that ideology could never take root here, because of our liberal traditions. More complacency! The end of the Cold War has unleashed energies formerly focused on containing the Soviet Union, and intoxicated patriarchal conservatives everywhere with the conviction that, with Communism defeated, America can and ought to put its own house in order. Their idea of 'order' is, regrettably, authoritarian and ideological. Critical thinking, to many, is unpatriotic.
In short, without a population of critical thinkers, we have no more reason to believe in democracy than Plato did.
15 October 2004: The death of Jacques Derrida the other day has prompted an enlightening wave of comment in print and on the Internet. I still don't know where to put "Derrida" among the philosophers whom I studied in school, but I've learned not to dismiss the man himself as an obfuscator. The patches of his writing that I've come across are perfectly opaque to me, and the brief guide to his thought that I struggled through a few years ago left me feeling like a golden retriever, enthusiastic but clueless. But what I've read this week has led me to suspect that I don't 'get' philosophy itself at all.
To the extent that philosophy is the more or less systematic search for certainties, for fundamental principles upon which to organize the perception of the world, I am now old and shameless enough to admit that I don't see the point of the exercise. I don't feel a need for certainties of that kind. While I would like to be certain that my apartment is safe, that my computers are reliable, and that I don't have to worry about food or health care - just to name three very everyday concerns that, with luck, I can take for granted - I know from experience that certainty on those fronts is unavailable. Perhaps Aristotle would say that I'm preoccupied by "contingency," his term for the accidental/inessential. And I would agree. That's where I live, in contingency.
To the extent that philosophy is an inquiry into the impalpable, I conclude that seekers will find what they want to find. That's why I'm an agnostic materialist. I don't know that God doesn't exist, and I don't assume that my life has no purpose. But I've never seen or felt or in any way experienced so much as a suggestion of God or a hint about purpose. (I myself have plenty of purposes, but that's something else altogether.) So I don't give them much thought. In fact, I only think about them when I see what thinking about them has done to other people. When I hear someone say, "I couldn't live in a world where God did not exist," I feel lucky somehow. That's not one of my problems. I feel a little guilty, too, because of the free ride that I'm getting. The world that I live in wouldn't exist if millions of my neighbors and millions of their ancestors didn't believe and hadn't believed that God expects us to try to be good. (I take God's part on that one.) And I would not want to live in a world without Messiah or The Saint Matthew Passion, works obviously inspired by firm belief. So I try to treat other people's faith with the deepest respect - most of the time, a counsel of silence.
To the extent that philosophy purports to explain the meaning of life, I'm very impatient with it. To ponder "the meaning of life" is an adolescent activity that serves as a placeholder for the meaningful things that only adults can do. For teenagers, life has no meaning - yet. Old enough to think ahead but too young to have much useful experience, and, not coincidentally, longing to be taken seriously - to be given, so to speak, the keys to the car - adolescents naturally and understandably ask, "What's the point?" Just as naturally and understandably, they think that there's something defective about older people, occupied by children and careers, who not only can't answer the question but don't seem to see the point of asking it. It's hard for me to find this aspect of youthfulness endearing.
The fact that happy, healthy, and prosperous people tend not to take much interest in philosophy tells me not that such people are shallow or unintelligent or that such indifference is proof of man's fallen nature but rather that we should all be doing whatever we can to make everybody as happy, healthy, and prosperous as possible. How to go about that is the object of my philosophy.
All of which may explain why the uncertainty that Derrida is said to have postulated - the subject of everything that I've read this week - has always seemed so perfectly clear to me that I wonder what the fuss is all about.
For 8 October 2004 press here.
1 October 2004: Like all Americans, I was glad that Fafblog's hilarious time-machine-aided retrospection of the debate turned out to be faulty: we did not have to look at a naked, 'emperor's new clothes,' president. It's true that I didn't check up on the post-debate responses of Actual Chris Matthews or Actual Tim Russert, but I was happy with CNN's post-party coverage until my nephew told me to check out Comedy Central. Thanks to that tip, I saw Jon Stewart for the first time, and I wondered if he had not absolutely commended eighty percent of the under-thirty vote to Kerry.
Much to my surprise, the debate made a Kerry supporter out of me. I'd tuned in for damage control, to make sure that I knew the worst. If Kerry was going to screw things up, I wanted to witness his missteps at first hand. But Kerry surprised me by not making any. He was relaxed, confident, competent, and deliberate. The president, meanwhile, seemed to suffer from the Curse of Gore: wasn't 'petulant' the word that bedeviled the vice president in 2000? And had Bush ever so closely resembled the cartoon image that Tom Bachtell has been perfecting in The New Yorker for years?
All I can think of, however, is Richard Strauss's Elektra. That opera comes to mind not so much because I like to think that Dubya is going to get the Klytemnästra treatment (gratifying as that would be) from his 'nearest and dearest,' but because, like the heroine, I am overcome, after this first debate, by a peculiar happiness. I have been drained by satisfaction; I am happily spent.
It has been four years...
24 September 2004: How I missed it, I don't know, but when I came across a reference to Andrew Card's remark, likening the United States to a ten-year-old child in need of protection, in the current issue of The New Yorker, I almost dropped the magazine in disbelief. (Then I Googled it, just to be sure.) Oh, the disbelief had nothing to do with the remark's having been made: the Administration is certainly incompetent enough to reveal the contents of its id. No, what I had trouble believing was that the America I grew up in had been so flattened that this stark derogation of Enlightenment principles could stand as acceptable political discourse. (About a month ago, similar outrage was excited by the legislative philosophy of New York State Senator Joseph L. Bruno.)
What bothers me deeply about Mr Card's outlandish proposition is that it is foundational, not factional, in nature. Political operatives like the Chief of Staff ought to stick to factional arguments: our programs are better than the opposition's, and that sort of thing. Most of the Administration's factional assertions have been dubious to incredible, but it is not in the nature of political parties to stick to the truth, and grown-up voters assess what comes out of the White House with critical minds. But the Bush Administration is unlike all earlier administrations, at least in living memory, in that it overtly circumscribes factional statements with foundational ones. Is the United States really like a pre-adolescent child? If the Founding Fathers had genuinely believed this, I am sure that they would have instituted an oligarchy ruled by, at the very least, a meritocratic elite. As it was, they regarded the run of their fellow men as adults. This is not to say that adults never make mistakes, or always see things clearly. But they can think for themselves in a way that is not expected of ten-year-old children.
The Bush Administration's attacks on America's traditions have usually sounded variations on the theme of "Trust me!" Children have to trust their parents, because their understanding is so limited, but healthy adults are expected to be able to distinguish the reliable from the doubtful. To the extent that many Americans have trusted the Administration, they have indeed acted like ten-year-olds (or worse), and this may warrant the president's cynicism. Of course, the Administration isn't asking for trust. It's insisting that, because it's trustworthy (because, in the absence of a track record, it says that it's trustworthy), any and all attempts to check up on it are unpatriotic. This is a manner of rewriting the Constitution, or perhaps of quietly discarding it. To my great chagrin, the bluff seems to convince half of my countrymen.
Committed to ideology, the Bush Administration long ago demonstrated an unwillingness to temper its judgment with the candid assessment of unforeseen developments. Ideology denies the possibility of the unforeseen; that's what makes it so attractive to people uncomfortable with doubt and uncertainty. The Administration is thus incapable of saying anything that would surprise regular listeners. For this very reason, it is incapable of telling the truth. If I hear the president announce that Earth is the third planet from the Sun, I'm going to wonder - and this is not cynicism on my part - what aspect of his reactionary program prompted the announcement. I'm going to be pretty sure that he has been nowhere near a telescope! The coincidence of his statement with scientific observation will have been purely accidental. A broken clock, after all, tells the correct time twice a day.
Even ten-year-olds know that.
17 September 2004: One of the customer reviews of Nicholson Baker's Checkpoint, posted on Amazon, ends, depressingly, "the understanding and compromise and patience of days gone by is fraying, even if it is still usually reached, and the end of civil peace may be approaching." Having just finished the novel - it is actually a one-act play shorn of stage directions - I think I know what pushed the reviewer to such a gloomy conclusion.
In Checkpoint, two old high-school friends meet, for the first time in several years, in a hotel room in Washington, D.C., near the White House. Within a minute or two, Jay, a guy-on-the-edge, announces his intention to assassinate George W. Bush. Ben, who has driven some distance in response to what he believed was Jay's cry-for-help phone call, says everything he can think of to persuade Jay that assassination is not the answer to the Bush problem, but Jay is beyond reason. He has screwed up his life, and murdering the man responsible for the war in Iraq looks like a heroic exit move. One certainly shares the dream of seeing the end of what the clever lawyer who runs Fair Shot calls the current "campaignistration." But of course the more Jay carries on, the more one sees that Ben is right. Assassination solves nothing. "Tumblewad," as Jay calls the president, is not responsible for the war in Iraq. We may hold him accountable, yes. But responsibility for the war lies with the American public, which for one reason or another did not make a sufficient objection to the pre-emptive invasion. We all failed. Those of us who were fairly sure that the misadventure would play out as it has failed to convince people who were confident of success - if we knew any. That we couldn't help but fail, because of the surge of fundamentalism in this country, doesn't transform the failure into something else.
According to David Brooks, there's a civil war already underway in this country, but it's being fought without physical violence by opposing elites. I think that's correct. There has been no violence because the two armies are deep in huddles, talking only to themselves, their backs to each other. How long will that go on? Follow the links on whichever side you like, from Andrew Sullivan or from Josh Marshall. Whichever way you go, you're sure to get yourself worked up. Then what? Sooner or later, someone is going to realize, as Jay does, that reading inflammatory blogs doesn't accomplish anything. Which is the great thing about reading: it's safe. But when you're as worked up as Jay is, you want action.
How long can people who loathe Mr Bush continue to live and work alongside those who admire him? For four more years? This would be the place to respond in kind to Dick Cheney's insidious libel that a vote for Kerry is a vote for terrorism, but I'll stop short of explicit stooping. I was about to say that nobody hates Mr Kerry the way millions hate Mr Bush (millions in this city alone), but rigid righters would remind me that they had to put up with someone they detested for eight years, so stop whining.
It ought to be noted that Mr Baker has not published a blueprint for terrorism. Jay's schemes for offing the president fall seriously short of the realistic. Nor is Jay a very convincing terrorist. But at some point, after more or less blood is shed, the powerful in Washington of all parties are going to have to accept responsibility for unleashing terrorism on a peaceful world. That's something that the American public had nothing to do with.
10 September 2004: Although New York survived the Republican Convention far better than many people feared, I was glad to be away from the disruptions, even though the odds are that I should have remained close to home, and possibly never left Yorkville. Then again... But the brouhaha here wasn't the only thing I escaped. Because our host has decided that the Internet isn't for her, and there are only two phone jacks in her house (forget high-speed availability), one behind the wall phone in the kitchen and the other behind her night table, getting online was not really practical. Nor did I see the Times. I was really unplugged, and it was strangely wonderful.
The week's rest may explain why, the closer we get to the election, the more resigned I am to what would have sent me wailing into the streets as recently as six months ago. I'm sorry that it has taken so many people so long to grasp the mendaciousness of the Bush Administration, and I wish that the elections were taking place in four months, not two, but I'm confident that a chorus of critical minds has been awakened and will soon be pummeling the fundamentalists. Fundamentalists, in case you're wondering, are people who have made up their minds never to change their minds about anything of importance again. Imagine an automobile that's on cruise control but whose driver has passed out. Fundamentalists are that reckless.
Because it's harder than ever for most people to imagine shutting down their minds, the fundamentalist power surge took the country by surprise. It has been suggested that the French Revolution, initially rather civil, soured into terror because most French people were too moderate to imagine such an outcome; I hope that we won't have to go that far down the road. But it's a mistake to think that fundamentalists aren't intelligent. Robespierre was a fiercely intelligent fundamentalist. What we need to do in this country is to detach the fundamentalists from the mass of not-very-intelligent people whom they have learned to seduce (and abandon). These people want to be led, and liberals have to get over their squeamishness about leadership.
(Preliminarily, we need to work toward a world in which sex lives become truly private. This side of firmly-policed boundaries regarding violence and youth (which shield the defenseless not so much from inappropriate sexual activity as from the abuse of power), consensual sex is nobody's business. I know perfectly well why it's taking a while for everyone to agree with this proposition, but I'm optimistic about the collapse of patriarchy - it's as extinct as the power to enslave thousands for the construction of pyramids, and all that's left is its malodorousness. Paradoxically, private life for public figures will resume when sex life ceases to be secret. Secrets are always interesting, at least until we know what they are. But if the only thing that's interesting about another person's sex life is that we don't know what it is, then perhaps curiosity will die a natural death.)
Leadership, in case you're wondering, is the art of honestly persuading people to accept bad news and to agree to deal with it in some constructive way. A leader uses polls to find out what constituents don't want to hear, so that he or she can tell it to them. I don't see many leaders on the scene, but I cherish the ideal.
27 August 2004: The most important intellectual problem on the horizon today is the identification of workable limits to the extent of the free market. That free markets produce great (if occasional) harm along with great good has never been in question; until the Industrial Revolution the harm was deemed greater than any conceivable good. Machinery and extractive energy opened up the promise of hitherto inconceivable good, and a variety of social experiments in the nineteenth and twentieth centuries pointed to the conclusion that outside interference with markets rarely yielded the intended results and often yielded horrific ones. In our time, the free market has been exalted as the model of exchange among human beings.
One problem with free market theory occurred to me the other day: the operation that it posits is fundamentally negative. In a free market, with prices following laws of supply and demand, any individual can decide at any moment that a proposed exchange is unsatisfactory, and withdraw from the market. Prices deemed excessively high deter buyers; prices deemed excessively low deter producers - who are, of course, buyers in their turn. Thus the market in any particular good or service is always approaching the unacceptable. This mechanism might be thought of as the gravity of human resistance; in a free market, resistance acts as the core of the earth does on physical bodies. Participants in a free market are barraged with an implicit question: How much can you take before you walk away? The ethos of the market, moreover, encourages them to put up with as little as they can get away with.
Sometimes it seems that free market theory is the only ethos governing social exchange these days. This can't be right, but alternative codes have certainly lost their vitality.
Let's grant that there are conditions in which free market operation is largely, even wholly, beneficial. Let us also grant, on the strength of a single example, that there are conditions in which free market operation is wholly malignant. In family relationships, the freedom of choice required for free market operation is lacking, and it is for that reason that parents are wrong to favor some children over others in any gross or capricious way. Parenthood requires a submission to the uncontrollable that free market operation does not recognize. Between these extremes lies an enormous range of human interactions of greatly varying degrees of liquidity.
By liquidity I mean the exchange of goods or services for full or partial compensation in money; free choice is an important element. It seems fair to presume that the beneficence of free market operation increases as a given exchange approaches total liquidity, which occurs when a seller's consideration consists comprehensively of cash, while the buyer gets everything for which payment has been made. But there is no frontier at which free market operation ceases altogether to be beneficial and becomes altogether malignant. Therefore it is fruitless to rely on distinctions between conditions in which free market operation is appropriate and those in which it is not. I have posited extremes for which such a distinction holds, but they are extremes, and not particularly expansive extremes. The distinction to be made is far more manifold; it is really an infinite series of distinctions that human judgment can only approximate. In any given set of circumstances, to what extent does free market operation require surveillance or regulation? This question implies another: what constitutes the best means of surveillance and regulation?
For the moment, I assume that the second question must be answered first, because in order to be effective, regulation - I will speak simply of regulation from now on - must constitute an aspect of free market operation. Rather than positing a free market operation that is always the same, and that is subjected to outside interference - a classic example of outside interference would be the laws that used to ban commercial activity on the Sabbath - I want to envision regulation that so closely fits the meeting of given circumstances with free market operation that it may be considered organic to free market operation.
In other words, I want to imagine a positive force that would counter what I have called the gravity of human resistance. Consider a physical object in space. I can arrest its fall by obstructing its path, or I can alter its trajectory by creating a force field, as for example with magnets. The use of countervailing forces does not interfere with the operation of gravity; rather, it transforms it.
I assume, again for the moment, that I'm looking for more than one countervailing force.
If you think that I am going to find them here and now, you must have a few bottles of snake oil in your medicine cabinet. But I will venture to suggest that as the gravity of human resistance posits a universe of autonomous individuals, so there exist a gravity of human sympathy, which links us all indirectly, and a gravity of wisdom, which links us all collectively to the results of our forebears' curiosity. If either of these forces sounds fanciful, I ask you to consider the tremendous role that loyalty (a form of sympathy surely) plays in political and commercial corruption, and the oppressive weight of sacred traditions (a souring of wisdom).
Perhaps it would be best to assume that each of these three forces (resistance, sympathy, and wisdom) develops a positive polarity in the presence of the others. Perhaps it would be better still to drop the metaphor of polarity, and to say that resistance, sympathy, and wisdom - each capable of making life worse when operating in isolation - become forces for good when obliged to come to terms with the others.
20 August 2004: Most of the time, I write about books with two assumptions in mind: first, that most of my readers are too busy to do the same, and, second, that the only way to inspire somebody to buy and read a book is to make it sound irresistible, something that can be done only by expressing an enviable delight. Today, however, I am going to be more direct, and urge you all to buy this book. Hey, it's cartoons.
I've loved the work of Tom Tomorrow ever since I came across it in the pages of The New York Times - very occasionally. What a delight to see his Y2K nightmare in The New Yorker on the eve of the last century's greatest nonevent - and what a delight to read it again, in all its gorgeous color, in the pages of The Great Big Book of Tom Tomorrow (St. Martin's Griffin, 2003). The book arrived yesterday, and I've been trying not to O.D. (For anyone who needs a little reminder of Tom's work, or a get-acquainted session with it, simply click on the 'Republican Matrix' link to the left, under Web Fun.) But overdosing might not be a danger. At a certain point, it's the desire to close the book that becomes irresistible. Too much message! A few hours later, though, and it's safe to pick up again.
Along with a preface and a brief but choice collection of early work, there are three chapters in The Great Big Book, 'The Reign of King George the First,' 'The Tabloid Presidency,' and 'Hell in a Handbasket.' As a political cartoonist, Tom Tomorrow is an equal-opportunity offender. As someone who started out satirizing (and undermining) the ethic of commercial hegemony and endless consumption, he also finds other aspects of the incumbent administration to dwell upon, and surely his representations of the Vice President boast an astonishing degree of verisimilitude! But Public Figures are not Tom's real target. Most of his four-panel cartoons are framed in one way or another as mass entertainments, usually as broadcast television. The design elements come from the old-timey advertising that Dan Perkins (Mr Tomorrow to you) says obsessed him in his early twenties. His medium, in short, is media, and his subject is us, media's dupes. That's what makes the repetition of certain figures from panel to panel so funny. It ought to be anything but, but the utter lack of visual variation shows up the utter inanity of what's in the text balloons overhead. Mr Perkins (b. 1961) isn't old enough to remember the Bettys and Biffs of print ads and TV spots in the Fifties, but I am, and his spoofs resonate very powerfully with the simpering fraudulence of their promises. For reasons best left to discussion somewhere else, the middle classes of the United States were overtaken, during the Eisenhower years, by a craving for mind control. Just to be sure they'd be ready for it when it took over, many Americans gave up critical thinking for almost an entire decade.
Except when they stubbed their toes and thought they were alone and, quite earthshakingly to propinquitous innocents, cursed. Tom's work transmits the selfsame shock.
It may be the bleakest expression of optimism that you've heard since the media assured us that the Supreme Court-appointed President would be a moderate, but the difference between the Fifties and now is that there's a lot more critical thinking going on in this modern world. Joke! There I go, imitating Biff. Close the book, RJ.
No, buy the book. All profits go directly into the author's and publisher's pockets. What could be more amurkin?
13 August 2004: Whether or not I will ever get through Ron Chernow's biography of Alexander Hamilton remains unclear. The book is characterized by two qualities that don't work at all well together. First, it seeks to refute what the author clearly regards as a tradition of calumny against Hamilton. Second, it takes an almost microscopic view of Hamilton's doings. Harangued again and again into sharing Mr Chernow's admiration for his subject, the reader begins almost to dislike Hamilton. I came to the book prepared to like the man; I have always considered myself a Hamiltonian, and not a Jeffersonian. At the same time, I can understand why Hamilton's easily-bruised temperament and pseudo-aristocratic postures would dampen his popularity among leading Americans. If it hadn't been for Washington's unswerving patronage, Hamilton would probably not be remembered today outside academia, and his peers were only too aware of this much-resented advantage. Mr Chernow's advocacy only deepens the image of a man who would have had a very hard time making his own way.
So one day a few weeks ago, casting about for some solid nonfiction but allergic to the idea of more trumpet-blowing, I picked out a book that I'd bought when it came out but promptly decided would be too grim to read. This would be Connie Bruck's When Hollywood Had a King: The Reign of Lew Wasserman, Who Leveraged Talent into Power and Influence (Random House, 2003). For the write-up of this book that followed, click here.
6 August 2004: The other day, I had a mildly Proustian experience when a chance turn of phrase took me back to high school. Someone was writing that Fox News "flatters the prejudices" of its viewers. Prejudices. This was a word from the past. There was a time, at least in my socio-economic niche, when the simple statement, "You're prejudiced," was shorthand for this: "Even though you don't know any Negroes, you think that they're inferior. This is a rearguard idea. It is both stupid and bad of you to think this way." (In the town where I grew up, "Jews" could be substituted for "Negroes".) To be prejudiced was to have very specific negative views, views that were targets of the progressive, forward-looking activism that sought to end segregation in America. In effect, adolescent peer pressure made a particular prejudice uncool. Hey, whatever works.
Then what happened? Along came the structuralists, the post-structuralists, the deconstructionists and the bourgeoisie-hating mob of leftist academics. According to them, we were all "prejudiced." Our minds are full of unexamined and irrational ideas that influence our decisions, and to say that one prejudice is better than another is itself to express a prejudice. Like so many of the observations that would issue from the mandarin towers, this one was more fatuous than correct. Insisting on the same value for all prejudices and allowing none to be "privileged" is rather like arguing that because our gastrointestinal tract is riddled with helpful bacteria, all bacteria are beneficial. We may indeed all have our prejudices. But prejudices that stigmatize other people on the grounds of identity are unlike all other prejudices, and worse than most other prejudices, in their power to pervert democracy, by turning it into mobocracy.
In effect, the left gave the right permission to be perniciously pig-headed.
Prejudices are usually unconscious, or nearly. You are unaware of a prejudice until it is pinched by experience. You think of yourself as a kind, open-hearted sort of person when suddenly the sight of, say, two other people holding hands - people of the same gender or of different races - makes you angry and disgusted. There are certainly those who believe that such behavior is wrong, because it offends against scriptural precepts, but in this country, as Martha C. Nussbaum has pointed out, a fear of shame and humiliation is more likely to underlie the prejudiced person's disapproval. The sight of two men kissing triggers anxieties in men who patrol their own sexuality for homosexual 'tendencies.' "If they have the right to kiss, then I have the right to kiss... there must be no such right!" It is dread, not virtue, that drives homophobia, for truly virtuous people, whatever their social views, don't worry about corrupting influences.
So-called identity politics always begins in the attempt to defy prejudicial stigmas. It drives conservatives crazy because nobody likes to be told that he's expressing a prejudice - just as nobody likes to be told that his tie clashes with his shirt - and because, having conceded that black skin does not per se mark someone as inferior, conservatives want to regard the whole civil rights thing as settled and done with. As if skin color were the only stigma. The right's seizure of what it thinks to be the high moral ground of 'colorblindness' is nothing but a patent denial of its own bristling prejudices. It wants us to believe that neither gender nor disability entitles anyone to special treatment because nobody harbors prejudice against them. This is obviously not true. I'm not quite sure why, but people on the right are incapable of concealing their longing for a world in which we all share their prejudices.
I try very hard not to be prejudiced against conservatives.
30 July 2004: Memo to voters: George W. Bush is the man whom Islamic terrorists want to see in the White House for another four years. "We are very keen that Bush does not lose the upcoming elections," wrote the self-styled associates of Al Qaida whose message appeared in an Arabic newspaper published in London, shortly after the March 11 bombings in Madrid. Islamic Jihad is stimulated by the American President's "idiocy and religious fanaticism."§
There's nothing like dueling religious fanatics. Those who are violent in the name of an ideal regard practical, peaceable people with contempt, but they can respect bloody-minded adversaries, and perhaps even fear them. What Osama bin Laden and Dick Cheney share right down to their boots is a longing for increased polarization, a new kind of total war in which everybody is a warrior, and every place a battlefield - except, perhaps, for the fortified aeries from which they direct their forces into harm's way. This utterly adolescent fantasy, which can only become a malignancy if it is not excised from the developing mind, is certainly lethal, but it is also ridiculous, and the question about such men is not how they came to be the way they are but rather why more people don't laugh at them.
The answer as to Osama bin Laden is perhaps that his supporters are too hungry, too destitute, too hopeless for real laughter. I don't feel sorry for them, but I am convinced that the only way to eliminate them is to eliminate the conditions in which they flourish. Not an easy task, but one that requires the cooperation of authorities everywhere. Police authorities, for one, to gather intelligence. Financial and legal powerhouses, to husband prosperity. (American prosperity flows directly from its commercial laws, which are sensible for the most part, easily intelligible to those who rely on them, and largely uniform among the states. Have we ever made a concerted effort to export them?) And religious authorities, to expose and brand as fanatics all those who preach offensive violence. The task requires all the adult virtues: patience, the wisdom of experience, memory, and a lighthearted skepticism about bombastic visions.
The answer as to Dick Cheney is that too many Americans are manifestly uninterested in cultivating the adult virtues. A glance at the crowds who turn out for the candidates makes it clear that most people don't believe that appearances matter, but rather present themselves the way teenagers do, on a take-it-or-leave-it basis. The belief that sporting activities deserve serious attention (quite literally a contradiction in terms) has spread alarmingly throughout the Western world, and the general notion that tournaments are more important than public affairs is patently rebellious. And the focus on youthfulness hampers, if it doesn't altogether preclude, the development of what used to be heralded as 'maturity.' Striving to impersonate eighteen or twenty year-olds necessarily entails assuming the immature understanding of death, which is that it doesn't really exist. What keeps Dick Cheney going, through four heart attacks and a bleak view of the world, must be that he's at the center of the action, man.
And of course there's the tendency of adolescence to regard anything that isn't adolescent as ridiculous. But adolescent laughter is usually spurious. Adult laughter is triggered by the flash of recognition (as is the adult response to tragedy), by an acute pulse of insight. Even in a crowded theatre, it begins as an individual activity. Teenagers laugh because they're nervous, or because their friends are laughing, or, as all these movies about nasty high-school girls remind us, to inflict pain and humiliation. I know lots of people who laugh at George Bush in this way, and a fat lot of good it does. What I laugh at is the idea that political advertising is accurate or, in my case, effective. (Or broadcast news, for that matter.) I don't laugh at the idea that political advertising influences uncritical minds, because it's both unfunny and obvious, but anyone who tells me that nine-figure campaign expenses are compatible with democracy is going to hear a rude snort.
23 July 2004: James Surowiecki - How I'd love to pronounce his name correctly. In Polish, I believe, it would be soor-ahv-yet-ski, but what do I know? I don't even know how my own name would have been pronounced in Gaelic, just that it would probably have required a longer spelling - ahem, James Surowiecki writes one of the most compelling features of The New Yorker, usually the first thing that I read when the magazine arrives. The 'Financial Page' is always that, a page with a small drawing, and it always opens a door between the financial-commercial complex and the rest. Columns that I recall peculiarly clearly include his denunciation of the Air Force-Boeing lease deal, which I believe has fallen through, and the real problem with Long Term Capital Management. I wish that he would publish a collection. Properly dated, these reports would constitute an intriguing history of the late bubble and its aftermath.
The latter issue appears in Mr Surowiecki's book, The Wisdom of Crowds: Why the Many Are Smarter than the Few and How Collective Wisdom Shapes Business, Economies, Societies, and Nations. Where Malcolm Gladwell's similarly piquant book, The Tipping Point, studied influence, and the spread of ideas through a group of people, Wisdom illustrates a counterpoint: decisions made by groups consisting of people who have been influenced only somewhat, or not at all, by other people usually turn out to be the best decisions. Mr Surowiecki's title deliberately evokes that entertaining old tome, Charles Mackay's Extraordinary Popular Delusions and the Madness of Crowds, first published in 1841 but still usually in print and always available. Mr Surowiecki doesn't come out and say so, but it's clear that he would have advised Mackay to substitute the word mobs for crowds. Just as chemists distinguish between compounds, the ingredients of which are bound into a single substance, and mixtures, in which the components retain their identity and nothing new is formed, so Wisdom distinguishes between mobs, which forge numbers of people into single-minded organisms that ruthlessly suppress antagonism, and crowds. All the people in a crowd remain distinctly individual, only peripherally aware of one another (if at all). They have their own perceptions, their own biases, and their own natural resources. Study after study demonstrates that crowds out-perform all but a very few - and sometimes none - of their constituents. Picture the rush-hour population of Grand Central Terminal's Main Concourse. It may look like a mob, but it's actually made up of people who are fairly close to unconscious of one another. Workday issues, what's for dinner, and the urgency of making a particular train leave little room in most minds for paying attention to fellow passengers.
According to The Wisdom of Crowds, this aggregate of individuals passing through the Concourse would produce an average guess as to the height of the famous ceiling that was more accurate than the individual guesses of all but a tiny handful of commuters (some of whom could be counted on to have memorized the figure from various publications). Very few people would come even close to the correct figure, but the averaged wild guesses of the thousands of people involved would, it seems, approach accuracy very closely.
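The averaging effect described above is easy to simulate. The following is only a sketch under invented assumptions: the "true" ceiling height, the number of guessers, and the bias-and-noise model of a wild guess are all hypothetical, chosen to illustrate the statistical mechanism rather than to reproduce anything in the book.

```python
import random
import statistics

random.seed(42)

TRUE_HEIGHT = 125.0   # hypothetical "true" ceiling height, in feet
N_GUESSERS = 10_000   # the rush-hour crowd

# Each commuter guesses independently: a personal bias (off by as much as
# 50% in either direction) plus plain noise, floored at a sane minimum.
guesses = [
    max(10.0, random.gauss(TRUE_HEIGHT * random.uniform(0.5, 1.5), 30.0))
    for _ in range(N_GUESSERS)
]

# The crowd's answer is simply the average of all the wild guesses.
crowd_guess = statistics.mean(guesses)
crowd_error = abs(crowd_guess - TRUE_HEIGHT)

# Count the individuals whose guess beat the crowd's average.
better_individuals = sum(1 for g in guesses if abs(g - TRUE_HEIGHT) < crowd_error)

print(f"crowd average: {crowd_guess:.1f} ft (error {crowd_error:.1f} ft)")
print(f"individuals closer than the crowd: {better_individuals} of {N_GUESSERS}")
```

Because the individual biases point in different directions, they largely cancel in the average, and only a small fraction of guessers lands closer to the mark than the crowd as a whole does; make the biases correlated (a mob) and the cancellation disappears.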
One wants to know how this works, of course, but although Mr Surowiecki doesn't explore the mechanics underlying such strange-seeming wisdom, his book is too interesting for that to disappoint. Perhaps it's enough to know that a group of independent, robotic computers is very good at simulating the behavior of human crowds, especially in market trading scenarios. What's important in The Wisdom of Crowds is the distinction, elegantly hammered home, between crowds and mobs. Not all mobs resemble the citizens of Paris during the French Revolution. Sometimes they're quite small - say, the advisers who took part in the discussions that led up to the Bay of Pigs invasion, or, for that matter, the counsels of the current administration. Loyalty and single-mindedness turn out to be predictors of disaster when brought to bear on affairs of state. The word commonly used to describe the behavior of small mobs is 'groupthink,' and it appears in all meetings, large and small, whose members lack equal access to the floor. The willingness to defer to the head of a NASA team did much to doom the crew of the Columbia. Executives like Jeff Skilling and Dickie Grasso appear to have made mobs of the people around them.
The three characteristics that distinguish crowds from mobs are diversity, independence, and a carefully-structured, integrated decentralization. Diversity ensures the vital range of mindsets; if a crowd is made up exclusively of trial lawyers, obstetricians, or rich Republicans, it will degenerate into a consensus-seeking mob. Remember, it's the average judgment that's the best, not the one that everybody is predisposed to agree on. (In other words, deliberative meetings - elections, legislatures, juries, meetings large and small - ought only very rarely to be called upon to ratify some decision already made elsewhere.) Independence is important for much the same reasons. It has been established that too much news, for example, clouds independent judgment and leads to less-successful outcomes. (Defining 'too much' is, of course, quite tricky.) Decentralization is required to ensure the first two characteristics, but it must be structured in such a way as to prevent members of a decentralized unit from putting that unit's interests ahead of the parent's. This problem has been highlighted in spades by the intelligence failures attributable to a kind of professional constipation among CIA and FBI personnel - not to mention a complete disinclination of either agency's staff to share information with the other's. Both seem to have lost sight of the fact that the interests of the United States are paramount. We can hope that the reorganization of our intelligence apparatus, pursuant to recommendations of the 9/11 Commission Report, will be informed by Mr Surowiecki's analysis.
That the ideas outlined in this book comport with the goals of democracy ought to be very clear; what makes The Wisdom of Crowds essential reading is its catalogue of the damage done by mobs, groupthinkers, and conceited CEOs who don't take advice. Democracy, after all, isn't desirable because it's a Good Thing. The older one gets, the more one is inclined to agree with Churchill's assessment: democracy is terrible, but it's not as bad as the alternatives. The problem with the alternatives is that they're very tempting, and it's no wonder that humanity tried them all out for several millennia before finally trying to make democracy work (and often failing). We all seem to believe that the smartest person in the room will come up with the best solution. Smart people often do, but no smart person is always right, and intelligence is usually more limited in scope than its possessor imagines. Ditto for big, strong guys: military regimes institute peace fairly easily, but they also have a tendency to decay into the kind of kleptocracy that blights Haiti, where the big guys sit around all day quietly making everybody else take care of them. Generations of hereditary monarchy suggest that breeding human intelligence remains painfully elusive.
Looking up from Mr Surowiecki's pages, one quickly sees what's wrong with the Bush Administration's yearning for consensus. It is both profoundly anti-democratic and inclined to miscalculation. The Administration's deeds and misdeeds follow from the principles that it professes to honor, and that is why some critical observers have worried from the start about where the Bush entourage would take us. It's a matter of politics, not policies: the Administration believes in the politics of mobs, which it rather foolishly believes it can control. The mobs are not out in the street rioting, but hypnotized by their televisions. But fewer and fewer appear to be watching the 'right' broadcasts. As the growing ABB (Anyone But Bush) movement indicates, the president and his friends may have stirred up a mob whose only thought is to get him out of the White House. That may be good for democracy in the long run, but it's not very democratic behavior.
16 July 2004: The New York Times has been running an interesting feature this week, the first in a series, apparently, of Great Summer Reads. Each day, a chapter or two of F. Scott Fitzgerald's The Great Gatsby appears in its own fourteen-page section (exclusive of cover). By Sunday, the entire novel - admittedly not a very long one - will have appeared. I couldn't, on Monday, decide whether to bite; I have my own copy to read at any time. But the presentation in installments intrigued me, even though the story is familiar, and even though when I think 'Gatsby,' I see Jack Clayton's film adaptation of 1974. For that very reason, I thought I had better take another look at the text. There was and is, in addition, the charm of knowing that lots of other New Yorkers - how many, I can't help wondering - are doing the same.
Suspecting that there must be some abridgment involved, I pulled down my Scribner's edition and found that indeed there is none. Then I read a Publisher's Note (written by Charles Scribner III himself) that disclosed a very interesting tidbit: year after year, Gatsby sales figures surpass the total sales of all editions during Fitzgerald's life. This reminded me that when I first read the book it was not quite a classic, but rather a very respected popular novel. Now it's a classic, acknowledged as a profoundly American story told with a fine grace that only lightly masks its grim propulsion and sordid settings. Book in hand, I had to resist a very powerful urge to read what the Times won't distribute until tomorrow, even though I know what's going to happen outside George Wilson's garage in Chapter VII.
I also wonder how much annotation it takes to make the novel intelligible to today's high-school students. They'll have to be told that there was no air-conditioning in 1925 (or 1922, the time of the novel), that most people didn't have cars, and that highways were all but unheard of. (Indeed, the Bronx River Parkway, the nation's first multi-lane, limited-access highway, was inaugurated in 1925.) Students will probably already know about Prohibition, and about the rising stock market, but they will probably require a little backgrounding in the spiritual deflation that followed the Great War. They will almost certainly have no idea that the novel is set at a pivotal point in the history of respectability, hitherto regarded as a Good Thing, henceforth as a Bore. They won't know what 'vulgar' used to mean, at least until they understand that 'Trimalchio' shuts down his catering act as soon as he sees it through his beloved Daisy's disapproving eyes. That Daisy is a lady, while Myrtle is only a woman, may be a novel distinction.
All the contradictions of the period are present in The Great Gatsby. The heartland remains bucolic, while cities like New York produce a new kind of cacophony, generated by unstopping motors, garish neon signs, and ubiquitous popular music. It was the age in which sophisticated paradoxes, once the specialty of Oscar Wilde, spouted from the likes of Jordan Baker, who says that she likes huge parties because it's possible to have some privacy at them. There is a sour cynicism running not beneath but alongside the braying fun, but there is also tremendous optimism about the possibility of getting rich. The only thing missing is the kitchen sink: Fitzgerald seems not to have noticed or taken any interest in the domestic revolution that was beginning to fill kitchens with "appliances."
I can understand why The Great Gatsby was not a runaway success - why, in fact, copies of the second print run were still unsold when Fitzgerald died. It paints a very unflattering picture of American life. Americans would have to triumph in World War II in order to have the self-confidence not to be knocked down by Fitzgerald's bleakness. They would have to be able to say, "that was then," so that by dismissing the age of flappers and gangsters they could regard Gatsby's end as tragic. In fact, it's pathetic: a grotesque accident, a black joke, altogether of its time.
I am reminded of the powerful denunciation of the time that appeared in my high-school American history text:
Disillusion and cynicism spread to almost every part of the social body, inducing both irresolution and irrationality. There was widespread distrust of reason, and as men lost faith in reason they almost ceased to use it. They lost faith, too, in many of those values that earlier generations had taken for granted, and lost even the capacity to believe in values. There were few grand ideas, but a sophisticated rejection of ideas; there was little faith, but a superstition masquerading as faith. For all its cascading energy the age was negative rather than affirmative, incontrovertible in repudiation but weak on affirmation. Never before had so many men known so many arguments for rejecting the heritage of the past; seldom has a generation bequeathed so little that was of permanent value and so much that was troublesome in the future.
- Samuel Eliot Morison and Henry Steele Commager: The Growth of the American Republic, 5th Ed. (Oxford, 1962), Vol. II, p. 653
9 July 2004: Permalinks - there, I've said it. Maybe you know what they are; maybe you don't. It depends upon whether you visit blogs regularly. If the term is new to you, don't worry; you'll have the chance to find out for yourself at the end of these paragraphs. I have spent nearly a year trying to absorb the concept of the Permalink, and, last Friday, the light bulb finally lighted up. I saw how to simulate Permalinks on this site, which, as I am never tired of saying, is not a blog.
On the contrary, it is a Web site constructed in framesets. You may or may not know what that means, either, but it doesn't matter. Here's what matters: an unintended side effect of working with framesets is that the address box in a browser will show the site's principal URL and only that URL no matter where a visitor goes within the site. This means that it is not possible to copy the URL - address - of a particular page, because that address never appears. This means that someone who wants to forward to a friend the link to my recipe for macaroni and cheese will - or would until now - be unable to do so. One would have to give instructions: "Portico>Culinarion>Macaroni and Cheese: Doing It Right." And my nephew, Tim, assures me that nobody's going to take the trouble to do that. "You need Permalinks!"
I didn't say that I know what a Permalink is; I've just figured out how to simulate one. I believe that Permalinks are macros provided by services, such as Blogger and Movable Type, that automate Weblog organization. When a blogger posts something new, it appears on the blog's front page (so to speak), but it also appears in the blog's archives with its own URL. That's very convenient for bloggers, but I am my own system administrator. For the longest time, I thought that I was going to have to duplicate everything on my site (thus doubling my bandwidth consumption!), with framesets containing each page in its proper place, and then a sort of appendix of pages 'outside' the framesets, accessible by link. Last Friday, I saw through this conundrum. The pages were already in that appendix. All I had to do was to provide each page with a link to itself, with instructions to open the new (same) page in a new (different) window. If I've lost you, good. It took me a year to see what was right in front of me. The problem was my resistance to nonsense. (That and the fact that I'm teaching myself as I go along.) How do you link something to itself?
As it turns out, there's nothing to it. So now I'm proceeding through the various branches of the site, inserting on each page what for this occasion only I will bracket with quotation marks: "Permalinks." Reading Matter and Music have been fully equipped, and Dates, Dates, Dates is next on my list. Meanwhile, I will work my way through the extremely long pages of Former Front Pages, where, by the way, each new Front Page will appear on the day of publication, instead of a week later.
The complement to the Permalink is the multicolored Portico link at the bottom of each page. Someone visiting Portico via a Permalink will see only the targeted page, with none of the navigational tools of the surrounding frameset. The Portico link acts as though you were coming from another site. If I can figure out how to do so without clutter, I may insert a "no frames?" link, to accomplish the same thing, at the top of each page.
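The arrangement described above can be sketched in markup. The filenames here are hypothetical, and the actual pages of this site doubtless differ; the fragment only illustrates the mechanism of a page linking to its own address from inside a frameset.

```html
<!-- index.html: the frameset page. Because navigation happens inside
     the frames, the browser's address box shows only this page's URL,
     no matter where the visitor goes within the site. -->
<frameset cols="20%,80%">
  <frame src="navigation.html" name="nav">
  <frame src="macaroni.html" name="main">
</frameset>

<!-- Inside macaroni.html: the simulated "Permalink" is a link from the
     page to its own address. target="_blank" opens it in a new window,
     outside the frameset, so the page's own URL finally appears in the
     address box and can be copied or forwarded. -->
<a href="macaroni.html" target="_blank">Permalink</a>
```

The complementary "Portico" link at the foot of such a page would simply point back at the frameset's principal URL, restoring the navigational surroundings for visitors who arrived via the bare page.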
Go ahead, try it. It's there, right below, in the middle.
2 July 2004: Whenever I walk past the movie theatre across the street these days, I see people lined up for the next showing of Fahrenheit 9/11. Lines for anything but a blockbuster, and anytime after the blockbuster's opening weekend, are a complete novelty here in Yorkville. But lines for this particular film are no surprise. Also not a surprise, I expect, is the substance of Michael Moore's latest documentary. Are people going to find out things about the Bush Administration? Or are they paying - are they so fed up with it that they will actually pay - to go and jeer? So what if Fahrenheit 9/11 is tendentious? (Not having seen it, I hesitate to pronounce it 'tendentious at best,' but that's my suspicion.) So what if it's misleading? Haven't we been so misled in the other direction by our unelected president that a little course-correction is in order?
I remember despairing, a few years ago, that Americans would ever wake up to the reality of Life Under Bush. Now I'm in the awkward spot of one whose prayers have been answered. That's to say that I'm still not happy. Perhaps that's the nature of a liberal disposition - never to be quite content. Perhaps I believe that the virtue of happiness lies in the pursuit of it. My foibles notwithstanding, however, I'm dismayed by the growing atmosphere of Payback Time. I'm dismayed by how good it feels.
But while you can probably guess what it is, I'm going to keep my reaction to the prospect of George W. Bush's comeuppance to myself (and, perhaps to their chagrin, to my dinner guests). It is not for shouting from the housetops. The cover of the current issue of The New Yorker (July 5, 2004) shows exactly what I'm hoping won't happen: two families, almost identical but for their blue and red backgrounds, snarling at one another instead of watching the holiday fireworks, which for their part are segregated as well (although artist Christoph Niemann has given this detail an interesting twist that I leave it to you to puzzle out). It is this polarization that I have dreaded ever since Ronald Reagan won in 1980. On bad days, I see our War of Secession as unfinished business, to be continued in a genuine civil war.
Who would be the contestants in such a disaster? If this country is as polarized as it's often said to be, what really distinguishes the two camps? Permit me to venture a suggestion. On the one side, loyalists, and, on the other, solidaritists. (The latter term, my machine has just advised me, is not a word, or wasn't until now, and perhaps the way I'm going to define it - beyond "people believing in solidarity, accent on the third syllable" - will simply broadcast my ignorance.) Loyalists are people for whom loyalty is a cardinal virtue, perhaps the paramount virtue. Loyalists take a vertical, hierarchical view of the world; they are loyal to their leaders and they expect loyalty from those who are dependent upon them. In this context, it is mistaken to speak of loyalty among peers - that's solidarity. Solidarity is horizontal, and it tends to efface social distinctions. There is no necessary ideological difference between the two groups - except that the loyalists are prone to think ideologically while their opponents are preoccupied with actuality. Loyalists are loyal to ideas as well as to people. Solidaritists put individual people ahead of every idea. The philosophical difference between the two runs as deep as any human distinction.
It's not hard to see why loyalists are much better organized and often more effective.
People who have never thought about my distinction might not know where to put themselves. Family life calls for a blend of loyalty (to parents and children) and solidarity (with spouses and siblings). Working life requires both, too. (Good workers must at least be loyal to their employer's goals, and solid with their colleagues.) But which virtue would you put first? Would you snitch on your brother, or shield him from parental wrath? Thoughtful people who ask what this hypothetical brother is supposed to have done declare, simply by asking, that they belong to neither camp because they believe that the interplay of virtues depends on circumstances. Happily for my anxieties about national collapse, most people are thoughtful in this way. It's the ideologues that I worry about. I wrote above that solidaritists don't go in for ideology, but this is not absolute. They believe that loyalists are mistaken.
My objections to President Bush have over time become twofold. He always was an ideologue, or the enabler of ideologues, and I believe that that is just as bad for the United States as Soviet ideology was for Russia. (Thanks to Barbara Ehrenreich for pointing out, in her new Times Op-Ed column, that the intellectual ancestry of Administration neoconservatives traces back to the Trotskyites of the 1930's.) But it has also been demonstrated that he is the captain of a team of incompetents. Of course, that may have been inevitable: the object of my first complaint may lead with logical necessity directly to that of my second. But the ideology and the incompetence have played out in different ways and at different times. I am also opposed to the substantive positions that make up Administration ideology, but were they in the hands of a pragmatist I would give them a chance and see how they worked. I would give almost anything for the chance to vote for pragmatists.
But pragmatism, while it is supposed to rule American business, seems to take on a queasy coloring in American politics. It suggests compromise and accommodation and a want of principle. Americans are thoughtful, but they're not as thoughtful about as many things as they ought to be. That's true of everyone on earth, really, but it's an American problem because America is so powerful. So I guess that we need films like Fahrenheit 9/11, just to get people thinking, specifically about the context in which our patriotism - loyalty to the United States - runs up against our solidarity with the rest of the world's inhabitants: our humanity.
9 April 2004: Every other month, I consider giving up my subscription to The New Criterion. I find much that's published in its pages offensive. Most offensive are its curiously Olympian thunderbolts, composed in equal parts of sarcasm and outrage, and always aimed at liberal targets. Where there ought to be discussion and analysis, summary 'self-evident' statements appear instead. A recent example:
For, stated simply, against the withering boredom that descends upon a culture no longer invaded by visions of eternal order, no civilization can endure.
"Stated simply" means, in effect, that this proposition will not be argued. Perhaps argument is unnecessary: every literate reader of The New Criterion, however aligned, probably knows the outline of debate on this point. But what's unpleasant and inappropriate about the statement is the contempt with which it deploys highly tendentious language. Withering boredom, invasion by visions of eternal order, the guaranteed failure to endure - this way of talking, simply stated, is nasty. Why read such stuff? Granted, I elected The New Criterion as the loyal opposition, a window to views altogether different from my own. You can't just read the things you know you're likely to agree with (although, come to think of it, I rarely agree in detail with The Nation or The New Republic). But is The New Criterion loyal to anything but its own conservatism? I don't think so. Perhaps, one of these days, The Atlantic will more openly declare its commitment to the right, and I'll be free to let the newer magazine go. I doubt that anything in The Atlantic will ever upset me as much as the following:
Many within the languishing denominations of the affluent North, until they are similarly shaken from the slumber of their ignorance, are simply unprepared for the truth that, in the century ahead, Christianity will not only expand mightily, but will increasingly be dominated by believers whose understanding of engagement with the non-Christian or post-Christian world is likely to be one not of accommodation, compromise, or even necessarily coexistence, but of spiritual warfare.
Both extracts come from David B. Hart's "Religion in America: ancient and modern," appearing in the March 2004 issue of The New Criterion. Mr Hart's prediction that the triumph of 'ancient' - that is, primitive, charismatic, or evangelical - Christianity in America will conduce to 'spiritual warfare' - I don't know just what bet Mr Hart means to hedge by the use of 'spiritual' - made me feel lousy for three days. I don't know which distressed me more, the prospect of more warfare, of any kind, or Mr Hart's barely suppressed glee at the prospect. The thrust of "Religion in America: ancient and modern" is an odious comparison with tired old Europe, home of that 'misbegotten abomination,' the European Union, and moribund religious commitment. Mr Hart's favorite fact is that conservatively religious people reproduce at a considerably higher rate than do secular humanists. (The virtue of having a big family is another one of his axioms). Affluent northerners - read, members of the liberal elite - don't know what's going on in the rest of the country, and don't realize that their venerable denominations are doomed. If they did, presumably they wouldn't make a man like John Spong, author of Why Christianity Must Change Or Die, and, according to Mr Hart, a 'notorious simpleton,' the Bishop of Newark.
Mr Hart claims to be an "unapologetic Christian reactionary." He does not specify the era of Church history to which he would like to return, but given that he is an Eastern Orthodox theologian, one imagines an Istanbul restored to Christendom. He does not really care for 'the American Religion,' as Harold Bloom described it twelve years ago, but even so solipsistic a faith is better than no faith at all. Better snake-handlers than Brights. Better the Bacchae-like antics of Pentecostal Christians than existentialist uncertainty. Mr Hart attributes the persistence of the religious impulse in America to the absence of a unified Christendom; our pioneers fled Christendom in order to pursue more idiosyncratic enthusiasms. This is an interesting point that deserves vastly more attention than Mr Hart gives it. He calls this established religious culture the missing middle term between the ancient and modern; nothing less than Christendom, in his view, has allowed Europeans to pass from religious commitment to religious indifference. I couldn't help feeling that 'Religion in America' lacks a middle term, too, capable of bridging the reductive, almost pagan Christianity that has flourished in the absence of authority, and Mr Hart's lofty tone, which bristles with palpable but unarticulated agendas. This is a sermon, not an account of religion in America. But to whom is Mr Hart preaching? Slumbering northern elites, no doubt. Snake-handlers would almost certainly be incapable of penetrating Mr Hart's mandarin style, dotted as it is with exotic words such as 'mantic' and 'leal.' Mr Hart has nothing to say to the 'ancient' Christians whom he has discovered in the American heartland. What he has to say is aimed at readers just like me. It is a species of Schadenfreude-in-advance. Everything looks fine right now, but just you wait. It works: I feel wretched in advance.
But then I've been trained by experts, schooled by the Bush Administration in the torment of seeing through its mendacity and incompetence.
Mr Hart also acknowledges that he is a chauvinistic American, and the idea that this country might begin to export its popular religion along with its popular culture does not dismay him. For my part, I think that the time has come for America to stop exporting the accoutrements of teenaged lifestyle. It makes a handful of people filthy rich and it disgraces the nation. The way to put a stop to teenage tyranny is to stop paying so much attention to teenagers. Parents ought to reduce their kids' access to disposable income; between part-time work and full-time shopping, they're not doing their homework - which teachers ought to pile on. It's time to work on convincing kids that despite their most passionate desire they will probably grow up to resemble their parents - closely. Young people are by definition immature, and in today's complicated world it's arguable that nobody becomes an adult until the age of thirty-five. It takes that long to work out the major kinks.
While I don't see myself becoming an isolationist anytime soon, I drew deep refreshment from a piece by William Pfaff in a recent issue of The New York Review of Books (April 8, 2004). At heart a dismissal of former national security adviser Zbigniew Brzezinski's new book, The Choice: Global Domination or Global Leadership (Basic Books), Mr Pfaff's essay ends with something very startling: a clear-eyed consideration of the likely consequences of American military retrenchment. Mr Brzezinski, in tune with conventional wisdom, is sure that such a withdrawal would create a chaotic vacuum and suck the rest of the world into disorder, but then Mr Brzezinski, who "has never," according to Mr Pfaff, "been a particularly good guide to the future in international politics," believes in American exceptionalism. This, I think, is the real American religion. Too many Americans tend to believe that because they are Americans - stated simply - Jesus loves them and is willing to enter into a personal relationship with each and every one of them, more or less on each and every one's terms. This is certainly more flattering than acknowledging how damned lucky Americans have been. As George F. Kennan has lamented, for Americans "to see ourselves as the center of political enlightenment and as teachers to a great part of the rest of the world [is] unthought-through, vainglorious, and undesirable." In 1951, Mr Kennan said something even more troubling. He predicted that the outcome of the Cold War would be determined by "spiritual distinction." How ironic it would be - and no doubt I'm wrong to speak in the conditional mood - for David Hart's 'ancient' Christians to believe that their kind of spiritual distinction was what Mr Kennan had in mind.
It takes Mr Pfaff three short paragraphs to sketch an argument, which I found persuasive, for an American retrenchment that stops short of disbanding NATO. I urge you to have a look at it. The only thing that our many far-flung bases effectively control is a narcissistic paranoia that goes back to the man I've come to regard as the very worst American president of the last century, Woodrow Wilson. Before Wilson, Americans were largely content to follow Voltaire's advice and cultivate their rich and extensive garden. Since Wilson, we seem to have stumbled into a truly ridiculous grandiosity - not to mention a specialty in last-ditch rescues, most of which wouldn't have been necessary if we really believed that we could help out in the world. (Dallying while Haiti imploded was simply the latest in a long line of prom-queen hesitations.) But then, ridiculous grandiosity is probably the inevitable style of those who presume to be soul-mates with Jesus.
2 April 2004: In his memoir, All in Good Time, Jonathan Schwartz has given us an American life that for all its sophistication seems to share the humor and the pathos of Mark Twain's best work. For those of you who haven't tuned in to American FM radio over the past forty years, the name of Jonathan Schwartz may not mean very much, and it will probably be almost impossible to explain what his name does mean to everyone who has. Interviewed by his much younger WNYC colleague, Brian Lehrer, shortly after the book came out, a month or so ago, Mr Schwartz was genially surprised to learn that when Mr Lehrer was starting out in college radio, "we all wanted to be you." In the late 60's, when Jonathan Schwartz was one of the anchors of what was then called 'alternative rock' on WNEW-FM, I wasn't paying much attention to that kind of music, but I knew enough to associate him with it. So I was surprised, decades later, to discover that he had started out as a champion of the American Standards - the great songs that serious, if not 'classical,' American singers, whether associated with jazz, pop, or the theatre, have covered over and over since the 30's. These are the pop songs that I was getting to know when everybody else was playing the Stones.
Mr Schwartz's father, Arthur, composed more than a few of the standards - "Dancing in the Dark" is perhaps the best-known, performed on recent CDs by Diana Krall and Tierney Sutton - but if anything Mr Schwartz is a connoisseur of performances. Which is perhaps as it should be. Just as it took the great Lee Wiley to slow down the cutesy, show-girly showtunes of Gershwin and Porter ("I've Got A Crush On You," "You Do Something To Me") to make them standards, so the great American songs are considerably less complete on the page than, say, a string quartet by Beethoven. For all the interpretive nuance that performers might bring to the Beethoven, they don't bring cornets or ukuleles. They don't back up Beethoven's already very fully written-out music with catchy new motifs, in the manner of Nelson Riddle, or emphasize its despair with the stabbing violins that are Gordon Jenkins's hallmark. They don't interpolate repeats, as Count Basie does so famously with "April in Paris," and, even more remarkably, as Frank Sinatra does with "East of the Sun." They don't, in a word, improvise. The composer of an American standard, unlike Beethoven, serves rather the same function as learned academies did during the Enlightenment: he frames interesting challenges, which the best performers (and their arrangers) work very hard to take up. Compare, just for example, two versions of one of the great Irving Berlin ballads, "Say It Isn't So," one by Stacey Kent (on The Boy Next Door) and the other by Bobby Short (on Moments Like This). I can't imagine two more different takes on the agony of this song. Bobby's out in the open with the humiliated misery of an imminently abandoned lover; Stacey gives you the pain of having mastered her grief. The world is a better place to have both performances - and as many more as do as fine a job.
But enough about singers and songs. What about Jonathan Schwartz? Well, to begin with, he is a man who learned early to follow his heart. That is clear on every page of this book. I don't think that there are any autobiographical explanations for this; he was just born that way, and if he hadn't been, his life would probably have been dark all the time instead of only intermittently so. His mother was invalided by hypertension when he was a little boy, and she died, gruesomely, when he was fourteen. His stepmother - whom Arthur Schwartz had been seeing for several years before becoming a widower - didn't care for Jonathan (to put it mildly; the language she used!), but a chilly family hearth had long been anticipated by this lonely son of a show-biz father. Jonathan always spent a lot of time by himself - as I suppose all autodidacts do. Instead of schoolwork, he mastered the musical literature to which his father had contributed. An extremely precocious broadcaster, he began radio transmissions, via something called the Electronic Baby-Sitter, at the age of eleven. By now the family had relocated to Manhattan, and any AM radio in the building, if set to 600 kHz, would pick up Jonathan's daily programs of great American music. Gifted with a good, clear speaking voice, Mr Schwartz had no real trouble making a career out of this early love - even if it did take him a while. (He'd have started sooner, ironically, if he'd stuck with college.) Along the way, he established himself as a professional singer and pianist, with gigs not only in the outer reaches of New York's metropolitan area but also across the street from the George V in Paris. The climax of this career was a successful series of bookings at Michael's Pub.
Like any autodidact, Jonathan Schwartz has definite opinions about things. He hates the arrangements of Gordon Jenkins, for one. He's a passionate Red Sox fan, for another. Unpleasant associations throb persistently: he would still rather not hear the Original Cast Recording of The Boy Friend, decades after a mortifying experience involving that LP. "The avocado, so much subtler than candy, so delicate and reassuring, was the only spiritual symbol in my life." Not your ordinary eight-year-old. It's unlikely that anyone will read All in Good Time and discover in Jonathan Schwartz a soul-mate of entirely shared tastes. Let's face it: we're all a little bit narcissistic about biographies; we like the people we read about, even when they're villains, to share our good taste, and we're disappointed when they don't. Jonathan Schwartz's book encourages the reader to get beyond this. It is nothing if not a detailed (though undoubtedly far from exhaustive) catalogue of his likes and dislikes, but far from annoying or disappointing me it reminded me that, as I said about the two versions of that Berlin ballad, the world is a better place for variety. Taste is not a zero-sum affair. My oldest friend and I can easily coexist even though he hates Mozart and I hate the Metropolitan Opera. A babel of strong opinions is better than a midwest of aesthetic concord.
The other little lesson that All in Good Time teaches is the pointlessness of resenting someone else's undeserved privileges. Jonathan Schwartz had plenty of these, and there's no doubt that they opened doors a little more quickly than might have been the case without his connections. Few little boys were ever tucked into bed by Judy Garland, singing "Over the Rainbow." When he talks about this sort of thing on the radio, he can seem a bit overbearing, but as his book shows this is probably because our standards for modesty in writing are not the same as our standards for modesty in speech. Mr Schwartz writes pretty much as he speaks, but what can come across as name-dropping on the air becomes interesting anecdote on the page. The encounters with Sinatra, of whose work Mr Schwartz is nothing less than a scholar, are all electrifying and, for the most part, scary. I share Mr Schwartz's reverence for Frank Sinatra only slightly more than I share his passion for baseball - and I don't share his passion for baseball at all. But he makes Sinatra and baseball interesting because he knows how to write from the heart. Clearly, but from the heart. He also knows how to write about alcohol problems, the Betty Ford clinic, and a short stay at Payne Whitney without a tinge of sentiment or a soupçon of superiority. I admire him no end for admitting that he is one of AA's failures. That's harder, these days, than standing up in a room and announcing that you're an alcoholic.
I've left out the romance, but there's plenty of it. Mr Schwartz is able to make the women whom he has loved breathe - and even laugh - on the page, but his interest in them does not appear to have received the degree of precortical attention that he has given to music and sport. Girlfriends, for the most part, are women who ask him to turn his music down a bit. Pages and pages are devoted to an affair with an older woman, a dancer, who left him to discover the dawn of the New Age in California, and we learn a good deal about the woman whom he married on the rebound, even though, as he later realized, he never liked her. But his two later, happier marriages receive such summary treatment that one can only conclude that the author decided not to risk stirring up any unnecessary trouble. At the same time, I was pleased to see that while sex appears never to have posed any kind of problem, it has never been a subject for boasting, either. When Mr Schwartz has a lot to say about love, his father is usually the other party.
I said that it would be almost impossible to explain what Jonathan Schwartz is all about, but there's no need to try, for the man himself is there for all to hear on every weekend afternoon between noon and four in the East, nine and one in the West. (To listen in via the Web, simply visit www.wnyc.org during these hours.) But be sure to read the book. It's the compleat American life.
26 March 2004: Thomas Frank, the fiery author of The Conquest of Cool and One Market Under God, has published another extraordinary essay in the current issue of Harper's (April 2004). It's about the 'derangement' of American political life, the enthusiasm, baffling at first glance, with which ordinary 'Red State' Americans vote for the party that has done more to depress their quality of life than all other factors combined, from globalism to high fuel costs. Why do the citizens of the heartland vote Republican? Having catalogued the extreme inequities that abound in his native Kansas, Mr Frank writes:
If this is the place where America goes looking for its national soul, then this is where America finds that its soul, after stewing in the primal resentment of backlash, has gone all sour and wrong. If Kansas is the concentrated essence of normality, then here is where we can see the deranged gradually become normal, where we look into that handsome, confident, reassuring all-American face - class president, quarterback, Rhodes scholar, bond trader, builder of industry - and realize that we are staring into the eyes of a lunatic.
I recommend this essay highly, if only because it will almost certainly dampen the glee with which many Democrats now expect to evict George W. Bush from the White House. I'd also like to add two thoughts that are missing from Mr Frank's analysis. The first is that the Republican Party long ago converted White America's widespread misgivings about granting fully equal civil and economic rights to Black America into an absolutely unthinking hatred of the agent responsible for ending segregation: Liberalism. This was Step One. Perhaps we should be grateful that the backlash of which Mr Frank writes was directed at liberals, and not at the beneficiaries of the civil rights struggle. In Step Two, which was taken during the Nineties, a gaggle of clever media types pulled off the neat libel of identifying the pleasures and pastimes that they and their invariably urban neighbors pursue as Liberal. Where Step One prostituted a dark truth, however, Step Two simply inverted a syllogism. It might be true that some liberals aren't drawn to the attractions of NASCAR or Branson. But it is not true that all liberals share a 'cosmopolitan disdain' for mass culture, nor is it the case that everybody who has been on an alumni trip to Provence is a liberal. But the libel has been very effective. Hatred of Liberals rules. Regardless of the economic cost, salaried, and even salary-less, Americans vote for the Party of CEOs. If the enemy of my enemy is my friend, then the Bushes and Cheneys who claim to despise the caution of 'Old Europe' and the clamor of Times Square must stand for industrious, self-sacrificing thrift. As Mr Frank points out, these radical Republicans stand for no such thing, but they certainly stand on it. Having limned a grim portrait of the dead-end slaughterhouse economy of Garden City, Kansas, Mr Frank considers it from the point of view of Kansas's richest suburb.
The good people of Mission Hills remain unfazed by all this. They may be too polite to say it aloud, but they know that poverty rocks. Poverty is profitable. Poverty makes stocks go up and labor come down.
How long the heartland will associate the glitter of metropolitan life with the guilt of what it seems to view as a betrayal, long ago, by Democrats, is anybody's guess, but I don't think that it will do so indefinitely. The children of today's plutocrats, knowing nothing of the self-conscious similarity that united most Americans during the postwar boom (remember 'conformity'?), may grow up to look like aristocrats out of A Tale of Two Cities, or they may assume the noblesse-oblige of aristocrats like FDR. Or both. But the turn in generations is certain to realign American politics for the better, because the old discriminations, against minorities, against gays, against women-in-the-workplace, are not common among young people, and are unlikely to become so. Hatred will eventually run out of gas - that hatred, anyway.
I hope they won't take too long to grow up.
19 March 2004: Let's get something straight: George W. Bush is not, repeat not, a "war president."
Mr Bush and his entourage have dispatched thousands of American soldiers to Iraq, and for the month that it took them to proceed from Basra to Baghdad last year one might have spoken of a war. I don't think that's what it was, but 'war' was at least an arguable label. As in the Gulf War, resistance was sporadic and barely military in character. Once Saddam Hussein's government had been completely crushed, however, there could be no more talk about war. That's the ironic justification for the President's infamous 'mission accomplished' fiasco. The war was indeed over, if there had ever been one, but as for the mission, it had only just been undertaken.
Among the many falsehoods that stream from the White House with Orwellian insouciance, this talk of war is perhaps the most egregious. It is calculated to elicit the sympathy and support of Americans - as Karl Rove understands Americans. Waging a justifiable war of liberation against a tyrant in possession of weapons of mass destruction - that's the official description of what we've been doing in Iraq. It's sentimental nonsense.
Whether the Iraq misadventure was conceived with the objective of locking up oil reserves and their attendant profits will probably always be an impenetrable mystery, as dark as the mind of Dick Cheney. Paul O'Neill, former Treasury Secretary, would have us believe that an action against Hussein was contemplated from the very beginning of the Bush Administration, and I see no reason to doubt him. He was even more honest, in his awkward, undiplomatic way, than Colin Powell. Historians will doubtless work very hard to establish, in some distant day when all the documents are public, the precise connection between 9/11 and the preemptive campaign against Hussein. But whatever the motivations turn out to have been, the fact remains that, having unseated the secularist dictator and curried the favor of his religious opponents, American might has unleashed a civil war in Iraq, in which American troops are incidental victims rather than targets. Such a civil war, between Sunni and Shiite Moslems, and between them and the Kurds, was always predictable. In my view, it was inevitable, but even I was surprised by the recklessness with which the Vulcans - Messrs Cheney, Rumsfeld, Wolfowitz, and others in the Administration who gave themselves this strange nickname - failed to prepare for it. Did they really believe that American soldiers would be welcomed with flowers? The question fits the prototypical question that this Administration will always provoke: cynical wishful thinking or lunacy?
Our soldiers are not policemen. They don't have police training, and - the truly fatal defect - they're Americans, not Iraqis. They haven't built up the street smarts that are more important to effective police work than the most infallible aim. They are out-of-towners, and they're kids. They're hardly better prepared for the dangers facing them than you or I would be, dropped suddenly into the back streets of Baghdad. Mines blow them up before they know what hit them.
As conventional war, with its phalanxes and front lines, becomes less and less likely - too dangerous for two to play, since at least one belligerent is likely to have nuclear capability - armies as we know them may become useless. Instead of late-adolescent, gung-ho cannon fodder, we may find it more useful to deploy mature squads of very well-trained policemen. Our 'special forces' already command many of the skills that such squads would need, but their work is undoubtedly hampered by irrelevant military routine. This very interesting discussion about the future of peacekeeping, however, is premature. American voters would never have supported an extended police action in Iraq or anywhere else. War, which Americans naturally conceive in analogy to football, is team sport on a large scale, with plenty of opportunity for rooting. The state of the game is always apparent: you know how your side is doing. Wars don't go on indefinitely, and when they're waged against demonstrably inferior teams - er, countries - they don't last very long at all. Such is the popular imagination of war - an imagination that, blocking out Vietnam, looks back hazily to Korea and World War II. An imagination, in short, that is not very well informed. The White House has exploited this fogginess and done everything imaginable to shape it. But as Americans continue to die in Iraq, and as the civil war burns away inexhaustibly, the Bush Administration can only shout louder and wave the flags higher. With what success we shall see.
The latest flag in the Administration's fist is 'appeasement.' This is a neocon word that I suspect means little to most Americans. Appeasement, in neo-conservative circles, is the name of the crime committed by Neville Chamberlain against the Jews when in 1938 he acquiesced in Hitler's annexation of Austria and capitulated, at Munich, to his seizure of the Czech Sudetenland. It is the crime of which neo-conservatives, so many of them American Jews, are determined to prevent a recurrence. The principal victim this time around, of course, would be Israel, and as Israel battles Islam, the neo-conservatives have demonized Islam. Not just militant Islamists, but all 'Arabs.' (Saddam Hussein, as I've said, was no Islamist.) 'Arabs' have never quite come to terms with the existence of Israel, but more decisively they have regarded the Israeli occupation of the West Bank as an offense against themselves, not as the mere oppression of Palestinians. Iranians, Pakistanis, and Afghans, none of them Arabian, have joined the collective.
Now the citizens of Spain who gave José María Aznar's pro-American government the boot have been charged with appeasement. If you ask me, appeasement is what they were putting an end to: the appeasement, that is, of the Bush bullies. I hope that British voters will clamor for the chance to do the same. And as for us, I hope that Americans will wake up to the fact that this country is not at war. We've got something rather worse on our hands. It doesn't have the consolations of war, and we know next to nothing about it. While the Vulcans bluster on about war, and do everything possible (so it seems) to fuel anti-American antagonism, we remain vulnerable to terrorists, as minuscule, and as deadly, as microbes.
12 March 2004: In Paris last November, I found that my French served me a lot better than I'd expected it to. Almost everything that I said was flawed, but I made myself understood, and the Parisians whom I was lucky enough to meet did not try to shunt me into English. Coming home, I thought that with a little regular conversation I could probably become fluent. But where to find the conversation?
Shortly after the holidays, the conversation found me. It was really Kathleen's doing. She saw something on a bulletin board - but I'm not going to say any more about the very agreeable gentleman who comes to chat with me twice a week, except to say that he is a native speaker who grew up in the South of France. I've been at it for two months now, long enough to discover that the learning curve is convex: the climb is steeper and more arduous the closer I get to the top. After all those years of reading, vocabulary is not really a problem, but although vocabulary is necessary, it isn't sufficient. My shortcomings are of two kinds. First, I'm terribly sloppy with the little bits of language that I ought to know better how to muster. I don't know how many times I've said je m'en est in the rush of conversation, nor how many times I've forgotten to wrap up a negative construction with pas. This is the sort of thing that disciplined hours spent in a language lab would have addressed, but I loathed language labs. (They were, after all, extremely primitive, forty-odd years ago.) The other thing that I have trouble with, of course, is thinking in French.
Which means not thinking in English. At least once in every lesson, my brain simply locks, freezes, as I struggle with a thought that I know won't work in French. Unfortunately, that's all I know, and, desperate to keep up my half of the conversation, I can't easily step back and let the idea reformulate itself in terms more congenial to French. Happily, French is rather more given to formula than English is, and I'm building a reserve of stock phrases. But the effort to avoid speaking English with French words is more tiring than I thought it would be, and when the lesson is over I want to lie down with a cold compress.
Speaking English using French words - I came upon that very apt description of what's hardest about learning any foreign language in a delightful little book called The Philosopher's Demise, by Richard Watson (1995; Godine, 2003). I'm not sure what the title means exactly, but Mr Watson is a professor of philosophy at Washington University in St Louis, and he specializes in the thought of René Descartes, possibly the most French of French philosophers, and the patron saint of rigorously clear thinking. In 1986, Mr Watson was invited to deliver a paper at a Descartes conference in Paris. He could have done so in English, but years of feeling guilty about not being able to speak French, compounded by the fact that he could read it (and even translate from it) with fluency, drove him to take French lessons from the wife of a colleague. He was so inept that she didn't take him seriously, and charged only half her hourly rate. When she realized that he was in earnest, and that she was going to have to work, she charged him three-quarters.
As an undergraduate in 1951, Mr Watson had taken a course in Reading French. He was so good at it that he ended up, as we see, working with French texts. But he never studied pronunciation, or tried to speak in a way that a Frenchman wouldn't find appalling, until the conference loomed, and he was in his mid-fifties. You can see, perhaps, why the book was recommended to me. Unlike Mr Watson, I was lucky enough to be tutored for a year by a lovely Russian lady whose French was very aristocratic and who made sure that when I said "La rose est une fleur" - a sentence that's a minefield of peculiarly French sounds - I didn't sound altogether American. We never got much beyond the sounds, but I was twelve, and just barely young enough to pack the necessary lingual acrobatics into my brain. I went on, like Mr Watson, to prefer reading to speaking. Living in the United States, I had no occasion to speak French, but, knowing what's lost in translation, I 'kept up' my reading. Whenever a friend went to Paris, I would beg for the present of a book, in this way accumulating a tiny and eccentric library that was entirely transformed by www.amazon.fr. Recently, I'd taken to reading Le Figaro Magazine and France-Amérique, because the newsstand across the street carried them. But I couldn't speak spoken French, and I couldn't understand very much of it, either. What I was loath to admit was that reading a language is a hell of a lot easier than speaking it, but, again like Mr Watson, a suspicion that this was the case nagged my amour-propre.
Mr Watson learned enough to read his paper in French and to deal with a few questions afterward. But when he found that his French colleagues wouldn't give him the time of day, he enrolled in a course at the Alliance Française in Paris, and most of his book is a very funny account of the grueling summer that he spent there. The misery of a scholar who had never previously met a course in which he couldn't excel nor an exam that he couldn't pass with flying colors, suddenly confronted by a deep-seated and implacable incompetence, makes, at least in Mr Watson's hands, terrific reading, and you do not have to be struggling with the sobjonctif to get the most out of his book. What The Philosopher's Demise is really about, I suppose, is the American delusion of limitless self-improvement. You can try very, very hard - Mr Watson certainly did - and still fail.
It's only midway through the book - and, I gather, his ordeal - that both a friend and a teacher sum up his ability in the same way. "You speak English using French words," says the teacher. "You speak in French words," says the friend, "but you arrange them mostly in English sentences." In short, Mr Watson doesn't think in French. This prompts some Francophobic venting: Mr Watson will be damned if he'll learn to pronounce oiseau correctly, because this French word for 'bird' requires the speaker, at least in Mr Watson's view, to simper - and real men don't simper. (It should be noted that Mr Watson is an accomplished, perhaps even eminent, spelunker.) But his exasperation with the poetry of Jacques Prévert, however blinkered, is funny rather than ill-natured. More serious is his fear of what French will do to his English. To think in French is to betray one's home ground, the mastery with which one writes in one's native tongue. I myself don't share Mr Watson's alarm. Struggling to think in French has had no effect on my English, or at least none that I can see. It's true that for two or three hours after every lesson I say merci and bon soir to doormen and waiters (we always have to go out for dinner after my exertions), but they take it in good humor, and Kathleen thinks it's cute - my helplessness, she means. But as for learning how to think in French, this will remain a totally supplemental skill.
What makes thinking in French so thoroughly compatible with thinking in English, in fact, is an almost complete lack of common ground. There is no rivalry, no contention over how to say something properly. Don't let someone tell you that they say something differently in French. They never say the same thing. Once you get beyond the necessities, the bare minima of communication, the two languages pursue entirely different objectives. The French desire clear and distinct ideas. Americans prize freshness and suggestion. One of my favorite anecdotes from The Philosopher's Demise involves a struggle between student and teacher about poetic expression.
I knew about French rigidity in language. But when Claire asked us to write a poem, despite my resolve not to argue with teachers about what is right in French, I found myself in dispute with her about a metaphor that I had used in mine.
"You can't say that in French," she said.
"You understand what I mean?" I asked.
I had represented meaning as a thread running through a seemingly unrelated sequence of events.
"Yes," she replied, "It is perfectly clear what you mean, but you can't say that in French. The metaphor doesn't exist in French."
"But if it's perfectly clear, then why can't I introduce it?"
She looked at me in disgust. "Maybe if you were a great French poet you could, but you can't say that in French."
Later I learned that I also couldn't say that Descartes set us on the road to modern science because "on the road to" is an American idiom, not French.
This can be disconcerting at first, but once you accept la différence you begin to value it. It won't hurt you to understand, from inside, a genuinely alternative view of the world, and the higher up the cultural ladder you're working, the richer that alternative becomes. Besides, I haven't got a choice. According to John Bridges's How to Be a Gentleman: A Contemporary Guide to Common Courtesy (Rutledge Hill, 1998), "If a gentleman does not speak French, he does not attempt to use French words." What would I do with all that vocabulaire?
5 March 2004: A couple of weeks ago, Alex Ross published a manifesto of sorts in The New Yorker, entitled "Listen to This." The piece ends, as any manifesto ought to, on a high note, aptly summoning the brio of Beethoven's Third Symphony - the 'Sinfonia Eroica composta per festeggiare il sovvenire d'un grand'Uomo.' Mr Ross reimagines the moment when Beethoven tore the dedication to Napoleon from the score's cover page.
The symphony became a fragmentary, unfinished thing, and unfinished it remains. It becomes whole again only in the mind and soul of someone listening for the first time, and listening again. The hero is you.
For many people today - too many people - music like Beethoven's sleeps in the center of a forest of thorns. The thorns are the attitudes and exhortations in which generations of snobs, both real and cinematic, have buried the art they profess to cherish. This is not an art, they would have us believe, that can be approached without some preliminary coaching. To get the most out of it, we need to pay attention to details that, for a beginner, are simply of no interest. It is only after one has taken a piece of music into one's mind and soul that the how-is-it-done part begins to look intriguing. And then one learns (assuming that one hasn't learned it already) that it is talking about music, not listening to it, that requires a bit of training. Talking about music of any kind is tricky at best, and to talk about Beethoven at any length one probably has to know something about reading music, something about music theory (keys, harmony, counterpoint), and certainly something about the makeup of the orchestra. But most people don't have to talk about Beethoven at length. At most, they want to jump up at what Leonard Bernstein called the wham! at bar 158 of the second movement, and shout, "Did you hear that?!" They don't want to unspool precious experience in halting explanation; they don't want to interrupt the lurching rumble of the triplets in the bass that immediately follows.
You can't start out by saying "Now, this is important." Not where pleasure is concerned. As David Denby discovered when he went back to Columbia to participate in freshman seminars - an experience brilliantly captured in Great Books - every generation must discover the pleasure, and, through that, the importance, of great works for itself. We do not read Dante because our grandfathers read Dante. Our grandfathers read lots of things that we'd never be able to get through, and we read lots of things that will make our grandchildren's eyes roll. We read Dante because he arrests us. We listen to Beethoven because the experience is, or becomes, deeply satisfying. The Eroica will always have to sell itself, but, once it has, then we'll let ourselves be exhorted to give the string quartets, particularly the difficult late ones, a couple of hearings.
When one is beginning to learn this music, a couple of hearings are usually required to make it comfortable. One gets the hang of where things are going: the seven to fifteen minutes of a typical symphonic movement ceases to resemble a trackless waste. A certain familiarity takes root, and perhaps one sings or whistles along. But there comes a wonderful day when this familiarity inverts itself, not into ignorance, of course, but into humility. How often have I said that I know such-and-such by heart, when in fact to know it by heart would mean that I'm capable of writing out all the parts from memory. I don't yet know all the parts to anything; I'm still discovering them. Last night, just to give you an example, I noticed, for the first time in about thirty-five years of loving admiration, that at the end of the finale of his last quintet Mozart takes the movement's descending theme and turns it upside down, pointing it toward the heavens. Something like this - something small but not insignificant - happens every time I hear the music that I know best. Not to coin a phrase: the more I know, the more I know I don't know.
The music of Bach, Beethoven, and Brahms has been appropriated by generations of respectable self-improvers and tarnished by their pretension. The tarnish is hardly permanent, but how to remove it? I don't share Alex Ross's alarm. Perhaps a change in audience - inevitable over time - is all that's required. Last night's little Mozart discovery occurred at a thoroughly delightful concert at the Metropolitan Museum of Art. The performers were members of the museum's newly-formed resident chamber ensemble. They were young, as were many people in the audience, and the hollering that accompanied the accolades led me to suspect that more than a few friends of the performers had made the trip uptown. That's as it should be, especially when, as in this case, the performances were superb. A little hollering at the end does not imply an unsophisticated audience in need of housebreaking. There was, for one thing, not a single stroke of clapping between the movements. The enthusiasm was saved up for resounding outbursts after the music was over.
Now, this business of clapping between movements has long been a shibboleth, and the inability to keep still until the end has signified a complete lack of couth. Learning when to applaud was probably the first thing I learned about music, and of course I prized the superiority that it gave me over the untutored. If I'm sorry to see the custom break down, it's because I've grown used to hearing entire works within a frame of silence. But I'm glad that intermittent applause no longer elicits fierce choruses of shushing. It's already pretty clear that the people who show up for concerts are interested in hearing the music, and not in convincing themselves that they're cultured.
The young ones, anyway.
27 February 2004: The bad news this week included two worrisome items. First, the announced candidacy of Ralph Nader. I cannot decide whether the insanity of this move outweighs its immorality, but it doesn't make much difference. Mr Nader's 2000 campaign appears to have assured the victory of George W. Bush, and while I hope that the Democratic Party will field a candidate stronger than Al Gore this time around, the only politician who might derive any tangible benefit from Mr Nader's utterly quixotic entry into the lists is the incumbent. Mr Nader is obviously living in a fantasy land, and, as such, has no business in national politics. But his running, it's important to remember, doesn't require that anyone vote for him. Those who do will have proved that they don't understand the first thing about democracy.
The other item was the President's coming out in favor of a Constitutional amendment against gay marriage. Any other outcome was inconceivable, so the news is not surprising. But it underlines the prematurity of the issue. I can understand that people alive right now want the benefits and respect to which their (but for Scripture) respectable behavior entitles them, but I can see, as well, that a majority of Americans is simply unprepared to grant that wish. I had great hopes for the ripening of civil unions into marriage, over time, and perhaps that's what will happen. But the process will be rougher and riskier for having been pushed too soon. And the Democratic candidate, whoever he may be, will have to tread nimbly on this issue. Like Ralph Nader's candidacy, gay marriage looks more than a little frivolous right now. If you want to get worked up about civil rights, cleansing the terrible stain of Guantánamo is a clear priority.
In good news, the British Government unexpectedly dropped its suit against Katharine Gun, despite her willingness to admit that she broke the Official Secrets Act last fall when she leaked an appalling memo from our National Security Agency requesting the cooperation of its British counterpart in spying on senior staff at the United Nations Secretariat. I don't mean to sound cynical, but the worst thing about the memo was clearly the paper it was printed on. Whether or not the espionage was justifiable, writing and sending a memo about it betrayed an unbelievable arrogance - well, an arrogance to which we've become accustomed in the past three years. It was the international, diplomatic equivalent of not flushing the toilet in an old friend's home. That may be why Tony Blair's team, however technically in the right, decided not to make a bad stink worse. Whether Mr Blair's keel will steady him through this latest instance of toxic Washington backwash will be neither good nor bad news, but merely a demonstration (or not) of the power of sang-froid.
And in other good news, WNYC's winter fund drive drew to a close. I must learn to resist the impulse to contribute right away. Fund drives are more bearable if they're leavened by a dash of guilt: since you haven't forked over, you deserve this. 'This' being the endlessly repeated phone number and Web site address, the touted goodies, and, worst of all, the thousand-and-one - oh that there were! - reasons for contributing to this worthy cause. (If you have to be told, you're probably too stingy to respond.) Not to mention the interruption of interesting programs. How I miss Steve Post's subversive participation in days gone by! Once I've done my part, I quite righteously want to hear no more about it! Delay has another payoff. Toward the end of each drive, the station's trustees pop out of the woodwork with matching gifts - double matches, even. Meanwhile, I've got to wait "six to eight weeks" to receive my "adorned" copy (the author's word) of Jonathan Schwartz's memoir, All In Good Time. Will he disclose that that's Carly Simon singing the little vocalise with which he begins his weekend broadcasts? Hey, what are you waiting for!
20 February 2004: What a week! Howard's End came rather sooner than anyone would have expected on New Year's Day, and Jeffrey Skilling (protesting his innocence) turned himself in to the FBI. I'm glad that Mr Skilling didn't make a deal with the federal prosecutors, because while I'm only too happy to see the engineers of Enron's downfall go to jail, I would prefer not to omit the trial stage, because settlements obscure who did what to whom and when. And I have a dream: Assuming that Mr Skilling appeals a conviction, we may even get an opinion outlining the standards of executive competence, something that anyone who has read The Smartest Guys in the Room may find wanting in the latter stages of Mr Skilling's career. Presumably, a corporation's board of directors must appoint competent executives, but so far as I know there's little case law on this point. It's a lot to ask for, I know, but, as I say, I have a dream.
As I wrote early last month, and as everyone is saying now, Howard Dean did the Democratic Party a big favor by forcing his rivals to stand up to the Bush Administration's policies, something that Messrs Kerry, Edwards and Lieberman had been unwilling to do until well into last fall. (It's not clear to me that the unlamented Mr Lieberman ever came round.) He managed to combine sharp critiques with undoubted patriotism. Perhaps it took someone a little carried away to accomplish this in the traumatized political world of 9/11 and 'Mission Accomplished.' What I still don't quite understand is how Dr Dean morphed from the rather stiff campaigner that he's said to have been in Vermont into someone slightly scarier than the Terminator. Nor can I imagine how Judith Steinberg Dean could have maintained a medical practice while her husband occupied the White House: how ever would the Secret Service handle that?
Meanwhile, I'm reading a chapter a day - that's all I can take - of David Cay Johnston's Perfectly Legal: The Covert Campaign to Rig Our Tax System to Benefit the Super Rich - and Cheat Everybody Else. This appalling exposé of profoundly structural tax inequities is even more disturbing than the advent of the Bush Administration. Voters apparently believe what politicians tell them, no matter what happens at tax time. My morning was made, so to speak, by Mr Johnston's set-piece on the Alternative Minimum Tax, a golem that has metastasized into a ludicrous tax hike for most middle-class Americans, robbing them of the tax cuts that the 'regular' tax would afford them. Especially punchy are anecdotes concerning successful litigants who have wound up worse off, after trial, than they were before, thanks to phantom income in the form of contingency fees. There's no question that Congress believes that the taxpayers most adversely affected by this 'stealth' tax are absolutely clueless. As I recall, Mr Johnston opined, in a radio interview, that Congress itself is pretty clueless about taxes. Astonishingly, we've revived the ancien régime approach to taxation: exempt the richest. What if you gave a democracy and nobody paid attention? Oh, how I wish the question weren't rhetorical.
In other developments, I just gave Roxio's GoBack first-aid software a spin - and it worked! Of course, I wish I hadn't had occasion to use it. And I wish I knew why my computer suddenly failed to connect to the Internet. By restoring the system to its state as of two days ago, I put things to rights (at least for the time being). Along the way, I got to peruse a list of modified files - such as, for instance, the .pst files that contain all my contact information - and 'rescue' them from the GoBack treatment. My computer has been unstable since the New Year, when it was invaded by spyware; even wiping the drives clean and starting over from factory settings (bye-bye, databases) hasn't solved the problem. À la lanterne with Bill Gates!
But that's asking for the sun, moon and stars.
13 February 2004: Exploring the new books tables at the Barnes & Noble branch across the street, I came across a book that I think I'm going to like: From Chivalry to Terrorism: War and the Changing Nature of Masculinity, by Leo Braudy (Knopf, 2003). I had heard of neither the book nor its author, which surprised me a little, because it's hard to miss all of the reviews that non-fiction of Knopf's calibre is sure to generate. Random paragraphs seemed to be well-written, and the author's observation, that the growth of nationalism following the Thirty Years' War was instrumental in putting an end to mercenary armies, struck me as particularly cogent. Reasonably certain that Mr Braudy's book would be both serious and literate, I bought it and brought it home.
As Mr Braudy makes clear in his introduction, his book is neither about war nor about men, but rather about the impact of warfare upon norms of masculinity, and about the kind of war that men of a given outlook are likely to wage. Not so long ago, the earth trembled under the possibility of annihilation by two evenly-matched superpowers. Whether the menace of nuclear war produced or was forestalled by a masculine ideal of stoic restraint is hard to tell. Nowadays, as crises in Iraq, Israel/Palestine and Chechnya make clear, warfare has lost its symmetry. Western soldiers are trained to fight battles in which their evasive opponents can't be tempted to engage, while these in their turn go after unsuspecting civilians. Terrorism is, in a very weird way, an intellectual approach to war: for the terrorist act is almost never about the act itself but almost always about sending a message somewhere else. Blowing up a subway car in Moscow, Chechen rebels invite (surviving) Russians to ask whether their government's determination to subordinate Chechnya to Russian sovereignty is worth the carnage. Russian leaders, in turn (like leaders everywhere), insist that such invitations must always be declined (as it were) unopened. When the parties to a conflict refuse to engage with or to recognize one another, protraction is inevitable. History suggests that occupations never endure, unless the occupied people cease to regard themselves as essentially different from their occupiers - but even then, the embers of difference can flame up again if fanned by forceful demagoguery. Not that history abounds in relevant examples: as the reference to the Thirty Years' War above reminds us, nationalism, which is at the heart of most modern conflicts (no matter how cloaked in religious invocation), has been a force for fewer than four centuries.
Nor is warfare the preserve of men anymore. Western armies enlist women, and several Palestinian girls have blown themselves up in suicide bombings. If war does remain a predominantly male enterprise, one still wonders what will happen if and when it loses that association. For many men, the essence of masculinity seems to be nothing more than avoiding all electives commonly undertaken by women. (There are now more women than men in the nation's law schools, and far more women than men in the undergraduate ranks.) This sense of taboo is as powerful and determined as it is primitive, and we're right to worry about the hitherto unguessed directions that it will take. Is it spontaneous, or do we have a hope of breeding ourselves out of it? If the former, then even where there is no good reason for excluding the participation of women, we may have to pay some respect to a bad one.
Conversation with a friend has reminded me of something odd - to my eyes - about John D Rockefeller, the founder of Standard Oil. The subject of a justly named biography entitled Titan, Rockefeller was an odd sort of man. His was not a prepossessing presence, and his manners were somewhat fussy and prim. Alopecia made him hairless while he was still in his fifties. Living to great old age, he shriveled up like a mummy. What kind of titan is this? Is there anything masculine - if not manly - about his success? In at least one important sense, certainly yes. Rockefeller was able to reconcile the most predatory business practices with an austerely pious Baptist conscience. Was this a sign of hypocrisy? Even if we remember that the more vicious things that Rockefeller did weren't illegal until some time after he'd started doing them, how do we square the Darwinian bleakness of his competitive zeal with the kind generosity required of all Christians? Perhaps Rockefeller would have laughed at the very idea of squaring them. What has one to do with the other? Business (on his scale, anyway) has nothing to do with ordinary life; it's an arena that no one is forced to enter and one that all enter at their own risk. If we ask of business that it be fair as well as honest, and as honest to the world as it is to its owners, Rockefeller never acknowledged such obligations, and I daresay there are not a few businessmen today to whom our demand might appear groundless and irrelevant. What they forget is that business is no longer an arena apart. In Rockefeller's day, most working adults toiled in agriculture. Now nearly everyone works for a corporation of some size or other. To turn Charles E. Wilson's famous dictum on its head, what's good for Americans is good for business.
It's naive of me, perhaps, to believe that nothing guarantees a healthy differentiation of the sexes better than plentiful and open interaction. Let everybody do whatever, and stop worrying about 'gender-appropriate behavior,' and we might well discover that men and women are more richly unalike than anyone has ever guessed - or wished - and that men have nothing to lose but their fear.
6 February 2004: The other day, I read an editorial in The New York Times about the halftime imbroglio at last week's Super Bowl XXXVIII. The writer ended by feeling sorry for Tom Brady. I had to call up a friend to find out who Tom Brady is. Now I know - or rather I haven't quite forgotten yet. I had spent Sunday evening having a nice dinner and then watching My Man Godfrey, oblivious of the big game. It was nice to look back a few days later, though, and to realize that most of America - all right, about a third of the population - was tucked in, safe and sound, by the television. Safe, that is, but for a moment of wanton immodesty. (Or was it garment failure?)
There has been a lot of sarcastic commentary about the hypocrisy of the outraged response of everyone from Michael Powell (of the FCC) to the Grammy producers (who struck Janet Jackson from the list of presenters), and this commentary has a point. Why should anyone be upset that at the intermission of a brutal game, in the middle of lascivious line dances, the right breast of a far-from-demure celebrity was briefly exposed? Is television so childish? Or, to put it another way, does television really need to be produced with the protection of children in mind? That children today show little sign of having been protected from anything except wisdom is, of course, no argument for coarseness. But I suspect that prudence rather than prudery prompted the storms of disapproval. In their capacity as husbands, not fathers, American men wanted to protect their wives from the kind of image that, according to rumor, so many of them contemplate at length in private, either in the finished basement or on a business trip. What I mean, of course, is that they want to protect themselves from their wives' suspicions. Nudity in the movies is different, because the darkness casts a certain privacy over each person in the crowd. Nudity in the well-lighted bosom of one's family is something else altogether.
Was there a Super Bowl party at The Residence, I wonder? (When did you first notice that reporters were referring to the living quarters of the White House as 'the Residence'? Not before 2003, surely? Let's hope it doesn't catch on.) I can just imagine a big screen at one end of the East Room, and banquet tables piled with scarfables and soft drinks. What I can't imagine is a sufficiency of leather-upholstered seating - where would they store it the rest of the year? The sound system would have to be formidable - unless, of course, the President outlawed chit chat and limited the rooting to groans and cheers. The President himself would have to remain silent, or by the end of the first quarter the crowd would be as homogeneous as a student section.
One way or another, though, the President had to be watching the Super Bowl. How come he hasn't commented on Justin Timberlake's manhandling? Come to think of it, I haven't heard what my mother-in-law thought about it. That she wanted to watch the game at all struck my wife as a surprise, until I reminded her that one of the teams came from the Carolinas, to which her parents have retired.
Organized sports for ordinary people are a byproduct of the industrial revolution, as workers crowded into cities to take up highly repetitive jobs that, if they taxed one's strength at all, taxed it only in one or two ways, and with grim relentlessness. Team sports offered a few gifted people the chance to stand out, and, more important, they offered everyone else the surprise that ensues when skilled competitors meet, together with the contained excitement of hoping that the home team will win. There was a general idea that hitting balls was an advance on hitting other people; or it may have been the discovery that balls can be hit in more interesting and varied ways than other people can be - and balls don't hit back. In most sports, hitting a member of the opposing team leads to a penalty. This is true, to an extent, even of American football. The singular exception, boxing, excites abolitionist fervor among many good citizens, but even boxers have to watch where they strike, and they can't use their knees. These and other rules appear to make games more interesting, if only because they keep them going.
The real virtue of sports, however, seems to be that team spirit - among fans, that is - exhausts, instead of encouraging, bellicose impulses. Testosterone levels have been measured, and found to be elevated in men whose teams have just won a game, while suppressed in their defeated opponents. The actual connection between testosterone and declarations of war is obscure, but modern organized sports always hold out the promise of the next game, or the next season, and this may tell us something about the folly of declaring perpetual peace. Drink, not sport, seems to have motivated the epidemic of violence among British fans a decade or so ago. The film Rollerball - the original one, at any rate; I haven't seen the remake - predicates a world in which team sports have altogether supplanted military organizations; perhaps one can't complain, given the trade-off, that the film's enhanced version of roller hockey encompasses the death of some participants. It's a sublimation of war that might work in the real world. In the film, of course, the scheme is doomed from the start by the substitution of corporations for nations. Corporations are hopelessly addicted to fixing things, if you know what I mean.
Although it bores me to sobs just to think about golf, it is clearly the most advanced sport, in the spiritual sense anyway, because players compete only against themselves. Of course that's not completely true, or Tiger Woods would never win anything. But it does lack the zero-sum aspect of every other sport that I can think of. Golf really belongs among the Olympic Games, which are more about excellence than they are about winning. Turn this the other way round: the most excellent performer always wins the prize, something that is not true of team sports. And because only one person plays at a time, intentional violence is altogether precluded.
How many viewers, I wonder, would be disappointed if halftime entertainments were simply scrapped? The spate of commercials getting their first airing might make a more riveting substitute. In our interactive age, they could even be evaluated, in a competition all their own, by viewers keying in votes from their PDAs. The spots could be broadcast in classes - all the new sedans now, all the erectile-dysfunction cures later. The creative teams behind the winning commercials could appear after the game to receive democratized Clios. Imagine how arresting advertising would become if the rule against running spots for similar products together were dropped!
Then again, maybe not. This would be a show that I couldn't miss.
30 January 2004: Yesterday afternoon, composing my remarks for this space, I sketched a few lines about living with a faulty computer. There was no question about an invasion of spyware, and an updated list of 'malicious' files from Symantec indicated a number of viruses as well. I wrote as if trying to convince myself that restoring the computer to its factory settings - wiping the hard drive clean and reloading all the applications and data files that I needed - wouldn't be a fate worse than death, but it wasn't until the decision was taken out of my hands (whether by the security software, which seemed obsessed with a file buried somewhere - I could never find it - in my Windows\System folder, or by the drive's mounting instability, I'll never know) that I rolled up my sleeves. It took more than six hours to get everything back onto the cleaned machine, but instead of being worn out by the tedium of this exercise, I found myself more and more exhilarated. I must have restarted the computer twenty-five times, but each rebooting reminded me how fast this three-year-old computer had been. How foolish I'd been to regard a six-hour drill as worse than the pervasive anemia that it could cure. How idiotic to tolerate ten-minute restarts. What made me so sure that I would screw up the repairs?
Better self-doubt than its opposite. Now that we've had a few actual primaries, now that ballots (and even caucuses) have replaced opinion polls, I'm feeling somewhat more lighthearted about the political outlook. The candidates, from whom little more than sound bites could issue in such a crowded field, had, until Iowa and New Hampshire, remained opaque and uncertain. What was it, for example, about Howard Dean? Why did he, like the president whom he hopes to replace, excite such strong but contrary responses? I write in the wake of what may well prove to have been Dean's clinching mistake, the firing of campaign manager Joe Trippi. Mr Trippi, from what I've read, is not a particularly nice guy, no matter how ideal his ideals; his genius has been bent to game the electoral system in the pursuit of a profession that, I can't help imagining, wouldn't be necessary if people would pay attention. Nice or not, though, Mr Trippi appears in retrospect to have been the secret of Howard Dean's success. That it degenerated into a runaway success, and inevitably crashed, was probably always assured by the character of the candidate, which has come to be summarized as 'cocky.' It was Joe Trippi, and not Howard Dean, who nurtured the campaign's rapport with some six hundred thousand 'Deanie babies,' young people engaging in political activity for the first time and doing so with a zeal not seen since the days of my youth. It was Joe Trippi who knew how to work the Internet (about which Dr Dean was notoriously ignorant until his campaign was well under way), and Joe Trippi who saw that the mere fact of thousands of small contributions would add a luster - not to mention the biggest war chest - such as no Democratic candidate had hitherto enjoyed. Now that Mr Trippi no longer holds any of the reins, the metaphor of runaway horses seems less apt to describe the Dean campaign than that of a manhandled soufflé, collapsing so quickly that one can hardly remember how high it was.
This is not to say that Joe Trippi would have guaranteed Howard Dean's success. Evidently not. Perhaps he did all that he could do for the governor, mustering an army of volunteers and a thicket of donations while arranging for brilliant media exposure. In the end, the political insiders whose misgivings about Dr Dean seemed so spoilsport a month ago were right. The celebrity that good television coverage creates is one thing, the appeal to individual voters quite another. Democrats in two states have plumped for a pro, Senator Kerry, as the guy most likely to defeat George W Bush. I'm not sure that they're right; my doubts about John Kerry spring from my suspicion that he lacks a genuine sense of humor, and my conviction that American voters require a genuine sense of humor in their leaders, if only to soften the rigor of authority. (NB: senators are legislators, not executives.) But I have little doubt that Dr Dean's cockiness has made him as unacceptable to voters as is the Reverend Sharpton, who talks a great line but who trails enough seamy rumor to feed a flock of gulls.
Now that the voting has begun, in short, I'm finding that I've been dreading the coming election as much as I dreaded rebuilding my computer's hard drive - and that I dread it no longer. No doubt the impression that the Bush Administration has run out of credit has something to do with my enthusiasm. But for the most part it's a matter of rolling up sleeves and getting to work. Dr Dean has been famous for appearing in rolled-up sleeves. The sad thing is that he's never shaken the air of doing so in preparation, not for the long hard slog of a national campaign, but for a sparring match. We've had our fill (I hope) of bullies.
23 January 2004: "Howard's End!" was a tabloid headline yesterday. By the time I heard that, I'd also heard the newly restrained voice of Dr Dean on the radio. But the damage, for me, had been done. Like many supporters and (the group I belong to) voters trying to size up the former governor, I was appalled by the vocal coarseness with which the candidate bellowed his litany of impending primaries, and wholly shocked by the whoop that wound it up. I don't think that I've ever heard so uninhibited a sound on the radio. It was beyond naked.
I've worried for a long time now that the Dean campaign is a runaway affair, and I said so last September. Formerly a low-profile doctor and elected official, self-assured but quiet, Howard Dean appears to have been transformed by his surprising success into a flamboyant grandstander. A thoroughly secular man, he nonetheless commands the oratorical powers of a revivalist, and it may be that his appeal lies in his ability to reassure those who are sick of the incumbent's religiosity that their convictions are no less lofty. Substitute 'Bush' for 'Satan' and 'Iraq' for 'Demon rum,' and you can hear the tent flapping in the breeze. Nobody, I'm sure, is more surprised by all of this than Dr Dean himself - and that's the problem. I long to have a national leader with humane vision and profound imagination, but I don't want a leader who is distracted by self-discovery. The very elation of Dr Dean's manner puts one in mind of a boy with a new toy that has turned out to be faster and more powerful than he ever imagined. The glandular rush accompanying this discovery is incompatible with presidential gravitas. The sound of Dr Dean's voice after his (again unexpected) poor showing in Iowa confirms the judgment of many of his critics: he is inclined to act before (i.e., without) thinking.
Being the governor of a small state has been a seriously discounted credential for many observers, but until the other day I wondered why the size of Vermont would matter. Then came Dr Dean's outburst, and it dawned on me that eminence in a state of which it's not ridiculous to say that everybody knows of everybody else might contribute to a lack of caution. If you know that most people are familiar with your record and know something of your character, surrendering to a moment of excitement might not seem reckless. But Tuesday's war whoop was broadcast to millions of listeners who don't have any direct knowledge of Dr Dean, and as for the moment of excitement, it was, under the circumstances, self-willed. The ugly thing about Dr Dean's apparently spontaneous braying was that instead of being inspired by victory, it seemed motivated by a savage pride in being able to rally his troops in defeat.
Now Dr Dean tells us that he is not a perfect person, and wants us to believe that passion and courage drive him to speak out on behalf of ordinary people. This is the language of a sinner; it could also serve as the motto for every demagogue in history. I don't think that Dr Dean is a demagogue. But hearing him on the radio on Wednesday morning made me fear that he might become one. The thickness in his voice betrayed the first taste of an addictive drug. A massive chorus of disapproval may have cured the man of a bad habit before it could take root, but his vulnerability to certain worrisome temptations was laid bare. For all his strengths, Howard Dean lacks the unmistakable grain of presidential timber. The political insiders had been trying to tell everybody this since the Dean 'insurgency' took off. The problem isn't that Dr Dean is 'not real.' The problem is that he's too real.
Which isn't to say that I won't vote for him if he gets the nomination. But it will surprise me if he makes it that far. He has already accomplished something very important in forcing his rivals to drop their mumblingly pseudopatriotic respect for the policies of George W. Bush. For over a year, he has given voice to the millions of Americans who fear that the present administration is a disaster in the making. But the other day, his voice menaced another kind of disaster: the catastrophe of a runaway presidency.
16 January 2004: Another holiday weekend coming up! It took the New York Stock Exchange twelve years to recognize Martin Luther King Day as a holiday. Between 1986 and 1997, it simply observed a moment of silence, but by the following year the Exchange realized that this was sending the wrong message: "this is not a real holiday." Now, at least here in New York, it's a holiday equal unto all the others, honored by more employers than New York's 'own' holiday, Columbus Day.
Dr King remains the most important African-American in our history. He understood that the constitutional amendments passed in the wake of the Civil War were not sufficient to liberate former slaves. It would take affirmative action, in the widest possible sense of that term, to level the field. Without some kind of compensating advantage, some sort of deliberate, positive attention, African-Americans would always be second-class citizens, and most hardly even that. What made Dr King important was his courage in bringing this understanding to bear on civil life in the United States. It took guts and it took patience, two virtues that don't often appear in the same person. He did not act alone, but he served as the human fulcrum that all momentous change requires. In honoring him, we honor all who worked with him, and all who have worked since his death in 1968, for the cause of equal civil rights.
Simply declaring a holiday - one of the nicer souvenirs of the Reagan Administrations - wasn't enough, any more than banning slavery was enough. Making slavery illegal did not prevent the Jim Crow laws - nor the parallel segregation that took root in much of the North when black Americans migrated to the nation's industrial centers. The postwar amendments, necessary but insufficient, didn't do a thing to improve anyone's economic lot, except that of the frauds and mountebanks who took advantage of the Confederacy's prostrate economy. Reconstruction, however well-intentioned, was savagely vindictive, an indulgence in reckless righteousness that all but ensured the much longer-lasting backlash that was hailed at the time as 'Redemption.' Emancipated but uneducated, saddled with responsibilities beyond their experience, freed slaves haplessly confirmed their former masters' racial prejudices, which hardened into a bigotry that closed off most avenues of advancement. The title of a famous spiritual, 'No Room at the Inn,' was changed by many singers to 'No Room at the Hotel' - a not-so-subtle reminder that American blacks were suffering, in modern times, the humiliations once endured by the Holy Family. The 'hotel' from which they were excluded was much more than an inn: it was the very world of American opportunity.
I'm reminded of the Nimbys. We are all now familiar with the nasty habit of mind that goes by this acronym for "not in my back yard." Nobody wants to see the value of his or her real estate take a dive because the power company wants to build a plant nearby or the local airport seeks to increase its traffic. Following Kant's Golden Rule, however, it would be hard to find a reason why I, and not some other, ought to be spared the burdens of a general social benefit. I bring this up because something rather like Nimbyism can lead to a certain obtuseness along racial lines. Where Nimbyites want to offload difficulty onto other people, Ikdiskyists (your forbearance, please) imagine that everyone could, with a little will-power, be just like themselves: If I Can Do It, So Can You. Aside from rare, largely teenaged moments when someone is speaking to a person of the same age and background, this statement is an utterly inappropriate expression of egoism. Ikdiskyism (give it a try) is what Martin Luther King Day is up against. What I mean, of course, is that Dr King obliged all Americans to drop this odious dismissal and replace it with an offer: "What can I do to help you attain my level of happiness?"
Liberals are unpopular today because they've been slow - congenitally slow, perhaps, given the nature of the liberal impulse - to understand two things about helping the less fortunate. Sometimes the answer to their offer is, "Not a thing," and one must say to oneself, "There's nothing that I can do here without giving some kind of offense." Often enough it's a matter of not doing something - for example, of not presuming to know someone else's needs and wants. The other liberal failing is a tendency to hide behind institutions when it comes to charity. Charity comes from the heart, and the state, thank goodness, is heartless. There's a big difference between passing a law that bars housing discrimination, on the one hand, and, on the other, selling your house (or, better, renting a room) to someone your neighbors may deem 'unsuitable' for the neighborhood. Nor can any of us wait for everyone else to attain the maximum of altruism. Imperfection is never an excuse for inaction, and other people's failures, a fortiori, will never justify yours. Liberalism is fundamentally a matter of judging for yourself - with everyone else in mind.
Sometimes, just thinking about something carefully - and perhaps expressing one's thoughts in a letter to a good friend, just to be sure that one's making sense - is all that's required. Aristotle's maxim, that the man who knows what is good will do good, is not so automatic as it sounds, but only because Aristotle's idea of knowledge was not as passive as ours. To know something as Aristotle - or Martin Luther King - knew things is to feel the difference between right and not-right so clearly that taking a step in the latter direction is too painful to consider. To know things as Dr King did - that's my dream. There is only one race of humans.
9 January 2004: If you have logged on to Arts & Letters Daily (www.aldaily.com), you'll have noticed the motto that appears on the upper right-hand corner of the page, Veritas odit moras. Not having studied Latin with any regularity, I didn't know what it was that truth hates, but I was sure that it couldn't be mores. And of course it's not. Having finally consulted my ancient Cassell's Latin dictionary, I find that what truth hates is delay. An apt motto for a Web site! Next question: who said it? Somebody famous from antiquity? Erasmus, perhaps? Or did someone at AL Daily make it up?
Why not just say it in English? English is the new Latin, after all. Spoken properly by very few, English is misspoken by millions, maybe billions. But that's the virtue of English: it's possible to speak it badly and yet still be understood, at least by others who speak it badly in the same way. English is a conversational language, and the best written English echoes the cadences of the spoken tongue. Attempts to create an elegant English have always been ridiculed, and in the twentieth century 'fine writing' was nailed as a sign of inferior intellect. Latin, in contrast, seems designed for inscriptions. Powerful declensions allow it to dispense with many of the prepositions that clutter modern languages, and so pack a maximum of meaning into a minimum of words. It's easy to forget how expensive, in time and materials, writing used to be. Pen and paper didn't exist. Perhaps we would all think more clearly if we had to incise our reflections in granite. Of course, it may be that no one, excepting perhaps Cicero, actually spoke the Latin of literature.
I often wonder how long the language that I speak will remain intelligible. Shakespeare can no longer be understood without a good deal of vocabulary exercise, and he hasn't been dead for four hundred years yet. Dickens may be the earliest writer whom ordinary readers can enjoy without resorting to the dictionary. The pace of change certainly appears to be accelerating. For all the political power currently wielded by conservatives, the present moment is marked by a fascination with the outlaw that encourages the early and widespread adoption of the latest slang, which is often quite intentionally illiterate. I don't think much of slang, myself, because, like the ideal of cool that underlies it, slang is fundamentally anti-expressive: the point that it almost always wants to make is that words are inadequate and misleading. Slang cannot communicate anything new without an attendant translation, which puts it in the class of secret handshakes. And it stales with horrifying speed. But the meanings of many staid-looking words change, too. 'Nice' used to mean finicky, and 'happiness,' when Thomas Jefferson inserted it into the Declaration of Independence, was not a term in common use; it signified, not the contentment of a carefree summer day, but prosperity.
David Mamet, in a piece which you can currently - but probably not for long, so I omit the hyperlink - reach from Arts & Letters Daily, deplores the recent coinages weapons of mass destruction and homeland security. Neither, he points out, is used in common speech by the man or woman in the street. It's enough to say that the possessor of such weapons is very dangerous and a threat to our safety. The official phrases are species of cant, which is a counterpart to slang: instead of bubbling up from the street, cant wafts down from Olympus. It is used to concoct notions for which there is little material basis, and without it politics would probably be a lot more interesting.
Most Americans don't grasp that they're thought by most people on earth, Anglophone or not, to speak not English but American. I daresay there are more people in this country who speak standard (grammatical) English than there are in the British Isles, but educated Britons speak with a rich assurance unheard elsewhere. Sometimes I attribute this understated virtuosity to the language's absolute hegemony in England, which appears to have resisted the influence of immigrant tongues. Sometimes the exclusive character of English privilege looks like a more plausible explanation. The American language has been worn flat by waves of non-native speakers. Instead of enriching the language, the influx of generations diluted it.
Truth hates delay - not much of a motto, is it? Perhaps our overexposure to advertising has palled the taste for tag lines.
2 January 2004: Happy New Year! More precisely: Happy, Healthy, and Secure New Year!
Working harder on the site is largely a matter of working longer - it's that simple. My second resolution, however, bristles with difficulties, obliging me to hew assiduously to the line between exhortation and nagging. To exhort means 'to encourage greatly,' and that is the aim of most of my Friday pieces. I encourage everyone who reads this to be as attentive as possible to the stream of ongoing life, to the events great and small that pile up like so much alluvial silt to become history. Every reader who is the citizen of a democracy, furthermore - and this includes the technical 'subjects' of H.M. the Queen - I encourage to accept the responsibilities of citizenship, the foremost of which is to recognize that the failure to vote with intelligent conviction is tantamount to a vote against democracy. American voters this year certainly face a contest with unprecedented global implications; it is not too much to say that the future of the United Nations Organization will be stamped by November's outcome, as will the cast of the American judiciary. But, ah - am I beginning to sound like a scold?
I encourage everyone with access to the Jan. 5, 2004 issue of The New Yorker to read Lawrence Wright's account of mentoring reporters at the Saudi Gazette. The title of the piece, 'The Kingdom of Silence,' sums up Mr Wright's experience depressingly enough, but his portraits of the journalists and other Saudis whom he got to know during his stint in Jeddah will infuse one's clichés about Arabian life with the living color of apt and peculiar detail. Alien as its culture may be - we're told that, because most music is anathema to the religious establishment, the "magnificent" concert hall built at Riyadh in 1989 has never been used for its intended purpose - Saudi Arabia is not a concept but a nation of human beings. (That most of the younger ones are as depressed as you or I should be to live there reaffirms this point.) The intersection of princely privilege and Jeddah's appalling sewage problem that underlies one of the article's longer narrative threads highlights the sheer indispensability of a free press; without it, the residents of this Red Sea port face catastrophic - and wholly avoidable - prospects. My favorite tidbit concerns the profile of Saudi Arabia's religious enforcers, the muttawa'a:
A number of Saudis told me that many of the muttawa'a are ex-convicts who would be unemployable except for the fact that in prison they memorized the Koran. They receive a bounty from the government for every arrest they make: reportedly, three hundred dollars for every Saudi, and half that for a foreigner.
I also encourage everyone to read Jessica Stern's Terror in the Name of God: Why Religious Militants Kill (Ecco, 2003). Ms Stern, a Harvard Fellow, has left the ivory tower for some rather unwelcoming venues in order to interview people who claim to act violently because God has told them to do so. Some of her subjects are Muslim, but many are white Americans, two of whom speak from Death Row (one of them subsequently executed). If this were a perfect world, we could confine terrorists to an island from which escape was impossible, and let them sort things out on their own. But for all the animus that a fundamentalist Christian might harbor against Islam, Ms Stern's book makes it clear that the terrorists' real enemies are moderate folk who are willing to compromise. The fundamentals of Terror in the Name of God expose the deadly allure of violent simplification - the determination to purify, to eliminate confusing choices, to preserve the world of one's childhood. (Many fundamentalists speak of returning to some historically distant moment of alleged purity, but their inconsistencies show that what they really long for is the thoughtlessness of childhood.) Most of us, I believe, manage not only to cope with but to benefit from the complexities of life, so it's all the more terrifying that a handful of people who can't cope and don't want to benefit threaten instead to extinguish it - perhaps, in keeping with some scriptural provision or other, on a global scale.
Copyright (c) 2004 Pourover Press
Welcome to Portico, a Website devoted to the experiences that it’s useful to talk about, written largely by R J Keefe, gent., of Yorkville, New York. I encourage you to download anything that looks interesting and to read it more carefully in print; I also call your attention to my copyright.