26 December 2003: Not without pangs of guilt, I took the day off and wrote nothing for Portico today. I didn't even have the good manners to say so! Instead, I waited a week to write this final entry for 2003.
19 December 2003: Last week, I distinguished a pair of prepositional approaches to freedom; I contrasted the freedom to develop social change with the freedom from government interference in one's affairs. I could, however, just as easily have switched the affiliations of to and from. Libertarians within the conservative fold value the freedom to realize their visions and ambitions, while progressive stewards of the body politic require a government that will assure society's freedom from the undesirable side effects (environmental pollution has become the most prominent of these, but it was not always so) of libertarian self-realization. Such is the ambiguity of freedom that it can signify totally contradictory things. One man's freedom is another's oppression. This is why the word gives me the willies, and I scowl warily whenever it comes up. Beneath the prepositional flip-flopping, though, I think something really definite can be established. There are people who see themselves as independent operators, and people who see themselves as members of society. I don't think that it's possible to belong to both of these groups. The real question is whether a government can be conceived that will suit both crowds.
There's another question, of course: does everyone fall into one group or the other? I suspect not. Most people don't give the question much thought. It's possible, after all, to see yourself as both self-reliant and socially conscious, as many - perhaps most - people do. The moment of decision arrives only when you have to put one of these characteristics ahead of the other, and it is a moment that does not arise in everyday life. But it is the fundamental decision that every participant in a democracy must make, and not to make it is to shirk a fundamental responsibility. Politically active people know where they stand as a matter of course, but as the whole point of representative democracy is to spare most citizens most of the burdens of political activity (the obligations, first, to familiarize yourself with candidates and issues, and, second, to vote as you see fit, are the non-delegable exceptions), there is nothing in the democratic setup that forces the person in the street to do likewise. This is the Achilles' heel of democracy: it cannot enforce itself. Democracies are always up for ratification. It is always possible for voters to vote democracy out, as voters in polities as diverse as Weimar Germany and contemporary Algeria have shown. The next-worst thing is not to vote at all, and if neglecting or refusing to vote is an anti-democratic gesture (and I believe that it is), then too many Americans today have given up on the Land of the Free. The irony of democracy is that it depends upon self-starters for the common good. Adding insult to irony, the common good does not depend upon democracy; benevolent autocrats have from time to time done a better job of maintaining it. If only they did it more often and more reliably, there would be no need for the bother of democracy.
Economist Virginia Postrel has caused a stir with her latest book, The Substance of Style: How the Rise of Aesthetic Value Is Remaking Commerce, Culture, and Consciousness, by praising, among many other wonders of modern commerce, the broad availability of many different styles and prices of toilet-bowl brushes. It is certainly very difficult to avoid the conclusion that a society armed with strong opinions about articles of domestic utility and yet misinformed (at least until very recently) about Saddam Hussein's involvement in the terrorist attacks of September 2001 needs to rethink its priorities. On this afternoon's Leonard Lopate Show, Janeane Garofalo wondered why it is that Americans don't seem to care about knowing anything. It would appear that they have been persuaded that how they feel about something is the same as what they know about it. (The strength of the conservative insurgency may have its roots in a dawning suspicion that having a feeling about, say, poverty or homelessness is perfectly compatible with total ignorance of the matter.) Upping the ante of personal responsibility considerably, Bob Herbert, in his column in today's Times, surmised that Americans are the best-informed people in history, but also the most distanced: they've hit on the knack for not knowing unpleasant things. That's true, perhaps, for readers of the Times - one must develop a carapace of some sort if the daily perusal of that newspaper is not to induce despair - but from my own experience of other people I gather that most simply aren't paying enough attention to be informed. I'm with Ms Garofalo: knowledge isn't cool.
Of course it isn't. Cool is possibly the most elusive sensation ever known, and, like Diana Vreeland's conception of allure, it is largely a matter of refusal. Knowledge is above all a matter of accumulation. If cool is a particular kind of discrimination, it still requires objects among which to discriminate; if it's true that all the knowledge in the world won't make you cool, it's also true that you can't be cool if you don't know anything. Cool people who happen to know a lot are generally more esteemed than people whose cool depends on dumb luck. But it remains sadly true that knowledge alone is never going to make anyone sexy.
And democracy is never going to make anyone responsible. It's up to each one of us to make the most of this freedom from coercion by setting aside the freedom to know, and do, nothing.
12 December 2003: For some reason or another, the urgency with which Administration ideologues launched the pre-emptive invasion of Iraq has lately acquired the itchy insistence (for me, at least) of a genuine mystery. When Pentagon officials speak of bringing democracy and freedom to Iraq, what, I wonder, does this signify? Do the terms 'democracy' and 'freedom' have the same meanings for them that they have for me? I'm fairly convinced that they don't, because what I understand of these abstractions could never justify the military occupation of another country. But that's just me. The older I get, the more deeply I'm impressed by the novelty of democracy as a realized political philosophy: there's so much still to be learned.
For example, it's not generally understood that the last century's totalitarian nightmares were all democratic in origin. The Bolshevik regime in Russia and the fascist regimes of Western Europe offer posterity a catalogue of inclinations that people in democracies ought to resist. But I suspect that neoconservative thinkers don't share this view, because for them, I gather, 'democracy' and 'freedom' are expressions of an individualism that's wholly foreign to the turbulence of mass movements. (Perhaps it's simply not possible to imagine mass movements if you're accustomed to the unpopulated expanses of Idaho and Wyoming.) I sense a profound difference in what might be called the charge of these ideas, 'democracy' and 'freedom.' For me, democracy and freedom are empowering agents that enable people to do things - to develop society together. For conservatives, they seem to be restraints on powers that might limit individual autonomy. It's the difference between freedom to and freedom from.
The freedom-from strain of democratic philosophy, moreover, rests on an unspoken homogeneity. Its foundation appears to be a personal relationship with Jesus Christ, and in the pursuit of individual redemption believers are all, paradoxically, alike. The individualism that city dwellers celebrate couldn't be more different. It actively encourages striking divergences in personal style, outlook, and even morality. Conservatives call this 'relativism,' not 'individualism.' Conservative individualism is the right to behave just like everybody else in one's acquaintance, without the interference of government programs that seek to improve the lot of people one doesn't know. Because they're motivated by shared traditions, conservatives don't really see the need for government. Freedom-to individualists look to government to keep chaos at bay. I cannot conceive of democracy without government, but this is exactly what the conservative ideal is coming to look like.
The goal of personal salvation, which for most people involves posthumous existence in a decidedly post-social paradise, is often at odds with the humanist goal of improving mankind. I have to say that, given the choice between eternal bliss and contributing to the welfare of the people who live after me, I would unhesitatingly choose the latter. Not because I'm especially generous or high-minded, but simply because the dream of heaven has always, from my earliest Sundays at Mass, contemplating a stained-glass window of the Crucifixion, reeked of selfishness, even greediness. How remarkable of Christianity to sanctify such self-absorption! The good Christian is, of course, commanded to put others first, but always with the final and utterly unaltruistic reward in mind. "Jesus loves me, this I know" - and that's, apparently, all that matters.
Even the word 'humanity' means different things. To me, it denotes a species that evolves over time like any other, and one, moreover, that there's no reason to regard as the last word. Like an evolving species, it adapts ever more completely to its environment - it gets better - while at the same time risking the extinction that will swiftly follow too severe a change in that environment. What distinguishes humanity from other kinds of life is a degree of self-awareness that permits informed choice: humanity can get better deliberately. (It can also wreck its environment.) This is certainly not the traditional, conservative view of humanity, according to which mankind is fixed in its fallen nature, forever dependent on divine love and forgiveness even if it has been weaned somewhat of the need for divine intervention.
Something tells me that I'm never really going to be able to understand the mystery of neoconservative motivation. My wiring simply won't allow it.
5 December 2003: Click here.
28 November 2003: Greetings from Paris. For the first time in my life, I found myself outside the United States on Thanksgiving Day. Not to worry. Thanks to the pull of an excellent concierge, we secured a table at one of the top restaurants, where, what d'you know, a guest at a nearby table asked for roast turkey and got it, even though he was one of a party of two, and, more to the point, turkey was not on the menu. The carving of the bird attracted everyone's incredulous attention. Word quickly spread that the diner was not American.
More remarkably, Kathleen and I have noted that our attempts to speak French have not been rebuffed by salespeople and waiters who, formerly, would have preferred to continue in English, as a prophylactic against the sullying of their native tongue by folks whose command of agreements isn't everything it ought to be. Everyone has been very nice, and our disclosure that we are New Yorkers hasn't elicited so much as a raised eyebrow. When I think of our friends (Midwesterners, for the most part) who have given up Bordeaux in protest against French 'intransigence' about Iraq, I blush even more deeply than I did before this visit. A very attractive woman who sells fine umbrellas and walking sticks on the Boulevard Saint Germain enthused about the prospects of opening a branch in Manhattan, and the wiry garçon at our favorite brasserie told us that he's looking forward to visiting New York as soon as his children are older.
This vacation having followed a busy time for both of us, we haven't been out every moment soaking up all the possibilities, and I regret that, but such are the limitations of middle age. We've reached the age at which anything but a trip to a spa (something I couldn't endure) requires a preliminary resting-up, something that wasn't possible this month. I had a long list of things to see and do, and now I've got to keep it from becoming a long list of disappointments. Merely being in Paris is the point, even if we've spent all but an hour or two within a triangle described by the Rue de Rivoli, the Rue de la Paix, and the Avenue de l'Opéra. Who knows, though, what today will bring?
21 November 2003: A correspondent forwarded a manifesto that has been making the Internet rounds. I've been pondering its core, which follows:
IMMIGRANTS, NOT AMERICANS, MUST ADAPT. I am tired of this nation worrying about whether we are offending some individual or their culture. Since the terrorist attacks on Sept. 11, we have experienced a surge in patriotism by the majority of Americans. However, the dust from the attacks had barely settled when the "politically correct" crowd began complaining about the possibility that our patriotism was offending others.
I am not against immigration, nor do I hold a grudge against anyone who is seeking a better life by coming to America. Our population is almost entirely made up of descendants of immigrants. However, there are a few things that those who have recently come to our country, and apparently some born here, need to understand. This idea of America being a multicultural community has served only to dilute our sovereignty and our national identity. As Americans, we have our own culture, our own society, our own language and our own lifestyle. This culture has been developed over centuries of struggles, trials, and victories by millions of men and women who have sought freedom.
We speak ENGLISH, not Spanish, Portuguese, Arabic, Chinese, Japanese, Russian, or any other language. Therefore, if you wish to become part of our society, learn the language!
"In God We Trust" is our national motto. This is not some Christian, right wing, political slogan. We adopted this motto because Christian men and women, on Christian principles, founded this nation, and this is clearly documented. It is certainly appropriate to display it on the walls of our schools. If God offends you, then I suggest you consider another part of the world as your new home, because God is part of our culture.
If Stars and Stripes offend you, or you don't like Uncle Sam, then you should seriously consider a move to another part of this planet. We are happy with our culture and have no desire to change, and we really don't care how you did things where you came from. This is OUR COUNTRY, our land, and our lifestyle. Our First Amendment gives every citizen the right to express his opinion and we will allow you every opportunity to do so. But once you are done complaining, whining, and griping about our flag, our pledge, our national motto, or our way of life, I highly encourage you take advantage of one other great American freedom, THE RIGHT TO LEAVE.
This outburst seems aimed not at immigrants but rather at (a) cosmopolitan liberals ("apparently some born here") and (b) residents - Latino residents in particular - who persist in breaking the age-old pattern of producing purely Anglophone offspring within three generations. There have always been enclaves of Americans (citizens or not) who don't speak fluent, or any, English; visit one of New York's Chinatowns for immersion in what is still a fairly exotic culture. I think it unlikely, however, that the writer had Chinatown in mind. Asians as a group have been more reluctant than most immigrants to vaunt their origins, and in any case Asian-speaking neighborhoods are extremely compact. This is not true of Latino culture in America. Perhaps the writer lives in Miami, Los Angeles, San Antonio - or New York. In New York, Spanish is only one of many language barriers that one comes up against. Forget enclaves! Janitors and taxi drivers may hail from Central Europe or India, and dealing with those that do can present real, if not very serious, inconveniences.
Strip away the manifesto's intemperance and the insults, and an important proposition emerges: a nation is not a miscellany of divergent individuals but a people with fundamental things in common. That's the national ideal generally; the American model adds an appealing call to self-improvement. We try hard to practice the civic virtues that keep freedom from degenerating into chaos, and we congratulate ourselves for welcoming steady workers who want to live among us. Sadly, though, we don't seem to have a national lifestyle. I'm not sure that we ever did, but we certainly don't now. Poll after poll and election after election reveal a population that's evenly split between outlooks that, if not necessarily opposed, see things differently. That's why it's important to have fundamental things in common. These have become difficult to discern, not because they're not there but because they're obscured by anger and frustration.
A great deal of this anger and frustration swirls around issues of religion. I think it's fair to say that Americans have never fully reconciled their religiosity, remarkable in the developed world, to the ideal of toleration that seems so central to American freedom. Freedom means not just the right to practice one's own religion (within limits that proscribe, say, polygamy, or human sacrifice) but the right not to have to practice anybody else's. The writer says that we are all Christian in America, but this isn't true, not at all. Have American Jews done the right thing by not speaking out against school prayer or the official recognition of Christmas? I happen to think that they have, but their acquiescence may have given their Christian countrymen the idea that Christianity is a sort of all-purpose, universal faith, of which all the more specific denominations are sects. If this were true, then nobody could complain about crosses in courtrooms or school prayer. I'm not sure that it isn't true, by the way. American Christianity, as Harold Bloom has pointed out, is an extremely vague, and utterly personal, affair; it may be misleading to speak of it as a religion - something that groups of people have in common - in the first place. The God in which we trust is an adaptable deity.
An interesting and comprehensive page on the US Treasury Web site sets out the history of 'In God We Trust.' Although it appeared on coins during the Civil War, it did not become the national motto - there was no national motto - until 1956, two years after Congress (under pressure from the Knights of Columbus) inserted 'under God' into the Pledge of Allegiance. Ever since, cultural conservatives have been putting Scripture to tendentious use, forging minor scraps of Mosaic and Pauline legalism into quasi-Constitutional amendments - latterly in the name of 'family values.' As far as I'm concerned, the 50s were the most paranoid years of the American Century, but I suspect that for the writer of this declaration they were a golden age, followed by nothing but downhill decay. That's another point on which we don't see eye to eye. After all, it was in the 50s that African-Americans decided that anything less than first-class status was unacceptable.
The writer is careful not to mention African-Americans, or their struggle for full civil rights, but of course it was that struggle that threw the door open to coming out as different. I don't think that many people foresaw that this would happen. Rather, it was expected that, once they were granted full entree into the mainstream of American life, African-Americans would drop the 'African' part. Given the deformations of history, however, that wasn't possible, and liberals quickly concluded that most African-Americans - and, later, members of other groups - needed more than a bundle of rights. The idea that the guarantee of civil rights requires the leveling of structural disadvantages has been antagonistic (to say the least) to conservatives ever since; it explains their nimble shift from "Separate but Equal" to "Colorblind."
Structural disadvantages are prejudices, built up on both sides of an inequality rooted in differences of skin color, gender, or (more controversially, since this, many conservatives believe, is elective) sexual orientation - prejudices that, thanks to the failure of imagination, harden into presumptions. The idea that women belong in the home is a fine example of such a presumption: we have still not eliminated the consequent structural disadvantages that all women face in the pursuit of happiness. It's not very different from slavery, when you think about it: birth determines everything. Nobody in a free society ought to have to battle such presumptions. The elimination of structural disadvantages, by the way, does not imply the elimination of personal advantages.
As for the right to leave the United States, it's a sad fact that the United States remains one of the few countries that welcomes and absorbs foreigners. Americans don't, in any case, have much of a choice about going elsewhere. Given the philosophy of the current Attorney General, however, it's not inconceivable that the nations of Western Europe might recognize dissident Americans as political refugees.
14 November 2003: About a month ago, it occurred to me that my anxieties about the future of the United States had distracted me from attending to problems closer to home, and I decided to stop thinking so much about politics. The funny thing is that my interest in politics has never been very keen. I familiarize myself with issues and candidates whenever an election looms, and I'm sure to vote - and that's that until the next election cycle. The whole point of representative government, after all, is to choose other people to oversee the government's work. Politics - the rough business of sitting down with the proponents of opposing views and thrashing out the compromises that keep things running - is their problem, not mine.
This new regime of not thinking about politics lasted about a day. It didn't end because I'd somehow become a political junkie, unable to stay out of touch with the latest campaign developments. It ended because the problem with American politics today, I saw, is that truly political behavior has been abandoned. That's what I've been worrying about ever since the Republicans nominated George W. Bush. No, for longer than that. Since the impeachment of Bill Clinton. Longer? Since Reagan?
Even when I agree with their policies, I'm wary of Republicans. Since the party's foundation in the 1850s, it has always stood for principles, and always resisted compromise. There is something nobly single-minded about even the most narrowly pro-business Republican presidents. Democrats, in contrast, are prone to practices inconsistent with their preaching. Democrats are always ready to strike a deal, behind closed doors if possible. Democrats like politics. Remember, it was the Democrats who held the nation together, in the decades before the Civil War, by compromising on slavery. That would seem to give compromise a bad name.
But in a country as large and diverse as ours, compromise is the only alternative to paralysis. I suspect that Grover Norquist, the anti-taxation leader of Americans for Tax Reform, would agree: he's all for paralysis. "Starve the beast" is his motto. Active, deep-pocketed government, in his view, can do nothing but harm. (Domestically, anyway; and there's still plenty of room for a passive government that, slavishly protective of property rights, clears away any and all obstacles from the path of 'free enterprise.') Most of us, however, regard the government as the only reliable provider of some public services, such as roads and public safety, and as the only reliable guarantor of many others, such as drug safety and justice. Most Americans believe in public schools, or at least rely on them for the education of their children. Most Americans expect government to keep the res publica - the public good - in good shape. This costs money, and money - from taxation to appropriation - entails compromise.
The tragedy of slavery should not blind us to the necessity of compromise, which is simply a manifestation of respect for people of different views. It's hard for those who are dazzled by the virtue of their own convictions to show such respect, because in their blindness they lack the imagination, the sympathy, and the adroitness to put themselves in someone else's shoes. In George W. Bush, his cabinet, and the neoconservative advisers responsible for our invasion of Iraq, I see the most dazzled, disrespectful crowd ever to attain national power. What makes me anxious isn't their politics but their utter lack of interest in politics. While the Congressional leaders of the two parties scream at one another (and the Democratic hopefuls self-destructively do the same), and while journalists report every polarizing comment, politics is missing in action. When politics-as-usual returns to Washington, then I'll stop thinking about it.
7 November 2003: The Labor Department reported today that employment increased by 126,000 jobs in October. They were mostly service-sector jobs; better-paying manufacturing jobs continued their downward trend, although at a slower rate. This is very good news for the Bush administration. It's good news for everybody, except perhaps the Democratic presidential candidates. But it's not very good news. It's more of the no-news-is-good-news variety: it's not bad news. For a change, all the bad news is coming from Iraq. Another helicopter was shot down earlier today, over Tikrit.
Oops, I forgot - that partial-birth abortion legislation that President Bush enacted the other day. It may never have the force of law in the courts, but that's far from certain, given the current makeup of the Supreme Court. Unconstitutional vagueness and a failure to take the mother's health into account may prove to be the legislation's fatal flaws. But as prospective news goes, it's not really good news. Suppose, then, that the law is struck down by 'the courts.' What this means is that everyone will dig in deeper; maybe there will be another round of terrorist acts aimed at abortion clinics.
Abortion looks a lot like slavery to me: it's an issue that might require violence to settle. How else to get round the definitional polarization? Is abortion murder? Is a fetus sufficiently human to be murdered? Even if it is, do considerations of the mother's situation outweigh fetal rights? This last question doesn't come up often enough; it doesn't seem to occur to pro-life advocates. A long line of 'cannibal' cases establishes the proposition that it's never all right to murder one person to improve another's chances of surviving a catastrophe, but of course these cases have nothing to do with the law, such as it is, of war. War is an exception. Should the mother of an unwanted child be another? (We might also ask: is a pregnant woman a mother? even though this question is likely to be answered by the first two above.)
Those of us who ask do so, I believe, because human reproduction remains an awfully uneven business. Men have nothing but fun, and they have almost all of it, too. Women face serious inconvenience and not inconsiderable risk - and that's only up to the moment of birth. After that, most women still face years and years of drudgery, which is all right by some of them but by no means all. Until we can arrange matters so that women are not disproportionately burdened by the birth and raising of children, and until we can guarantee the safety of wives who become pregnant under circumstances that trumpet adultery, it seems only right to give women the right to abort pregnancies. Public experience since the decision in Roe v. Wade in 1973 shows that abortion is an unhappy, sometimes awfully unhappy business. Nobody wants one. But war is hell, too. Just as patriotism excuses - urges - the use of violence against enemies, it's difficult for pro-choice advocates to believe that the life of a grown woman is not more important than that of a fetus.
Did I suggest, a minute ago, that the Civil War settled the issue of slavery? Pardon me; I meant no such thing. The real end of slavery did not even begin until the Civil Rights and Voting Rights Acts of 1964 and 1965, and it is probably still too soon to say that American blacks enjoy the rights and privileges of their white countrymen. Wars settle only one kind of conflict - the occupation of territory by hostile outsiders - and our Civil War did just the reverse, guaranteeing that blacks would be made to pay for the bitterness of the lost Confederate cause. One fine day, I believe, Americans will stop regarding the Civil War as a nightmare with a happy ending, and see it as a nightmare plain and simple. I pray that the vindication of abortion rights will never excuse a similar disaster.
Halloween 2003: The November issue of Harper's contains an essay by noted author and Amherst professor Benjamin DeMott, entitled "Junk Politics," and it has rung a bell that I didn't know was there. I usually respond to the things that I read in magazines in one of three ways: by disagreeing outright, by agreeing with fundamental positions but contesting the reasoning, or by agreeing wholeheartedly and savoring the writer's way of putting things that I already believe. Mr DeMott's piece elicited a fourth, and very rare response: recognizing that the author has built on ideas that I'm already familiar with to reach an idea that in some important way is entirely new.
It helps, I suppose, that "Junk Politics" is not an easy read. Whether the author regards the current political scene with anger or contempt it is hard to tell, but it's clear that the essay is a denunciation, not a dispassionate analysis. Sparkling with intelligence, its connections are not always immediately cogent, and Mr DeMott's compressed style takes some patience to unpack. I was troubled by the failure to mention the Great Awakenings that have disturbed American society since colonial times; for much of what Mr DeMott laments could be taken as evidence of yet another one of these religious outbreaks. I say that these defects help to keep "Junk Politics" in mind, however, because they make an insight-laden piece hard to pigeonhole.
The implicit message of junk politics, Mr DeMott writes,
is that leadership's chief concern should be with setting an upbeat tone and demonstrating a sensitive response to hardship, rather than with honing in on injustice, spelling out practical correctives, arguing for the correctives in public forums, working for their ultimate enactment.
What "Junk Politics" adds to this familiar complaint is the claim that everyone is complicit. Not only the leaders of both political parties, but everyone who wants to press a claim for public attention. As Mr DeMott points out, societies are transformed by powerful metaphors, and the ruling metaphor today is that we're all the same. The price of American celebrity is a confession, however disingenuous, to private weaknesses and personal tragedies - and an accompanying implication (sometimes made explicit) that the confessor is no better than anyone in the audience. If true, these confessions would subvert themselves, for the people who make them are, after all, famous, exceptional for something. If they have, indeed, suffered tribulations common to the general public, then their achievements must be all the more remarkable. We are not all the same. Some people are very lucky.
And some people are very unfortunate. Mr DeMott identifies as one element of junk politics the denial of change - "'changelessness meaning zero interruption in the processes and practices that strengthen existing, interlocking systems of socioeconomic advantage." The notion that we're all the same is designed to muffle the obligation that the very lucky might otherwise feel, if confronted by the spike of their own advantage, to do something to help the unfortunate. Far more comfortable to teach yourself the lie that, having started out on a level playing field, you've achieved success by dint of hard work - as if mere hard work, however necessary, ever sufficed to produce success. Far more convenient to look upon the poor as self-destructive people who lack your own good habits.
Most people, of course, are neither very lucky nor very unfortunate. They're average, and Mr DeMott's keenest insight is that junk politics glorifies mediocrity. Now, 'mediocrity' is a very negative word, an insult really. Nonetheless, it necessarily describes the broad mass of the population, the hump atop the bell curve. So, without using the word at all, the people who, by virtue of their leadership position, ought to be demonstrating their distinction, present themselves as just like the voters to whom they appeal for office. One has to wonder why. To the born-again exhilaration that marks all American religious revivals, something new has been added: resentment. I think that this resentment is fueled by a bombardment of lush-life images from the broadcast media, but I don't want to complain yet again about television; I see it whenever I pick up People at the barber shop. Resentment sets an upper limit to the extent of happiness and good fortune that is tolerable in public discourse. Nobody has worked harder to conceal his access to the good things in life than the president.
[W]hen the president speaks up in his g-dropping, gonna-gotta vein, the themes radiate out: bunch of regular guys and gals here, nobody uppity. Across America's length and breadth the same story: good buddy equal parders talking gonna-gotta together, everybody on earth more or less Crawford kin.
It wasn't Mr DeMott's arguments that struck me at first. It was, rather, an example, an offering in evidence. "[W]itness media coverage of the gentrified saturnalia aboard the USS Abraham Lincoln. Much hostility in newspapers to the repositioning of the ship, the risky landing, the swaggering progress across the deck, the cued cheers - objections that this was a staged, made-for-TV event, lacking authenticity. ... But the theme of the flight-deck drama was sameness not heroic difference, palship not leadership, and the contribution that truly counted was to the broad cause of issue erosion." I remember a great deal of outrage at the rupture of a venerable national tradition: until Mr Bush, no president had ever worn a military uniform while in office. But where I saw an intolerable puffing up, Mr DeMott recognized a genial and very disrespectful Halloween costume. Thanks to his perception, I'm seeing things differently.
27 October 2003: Unlike controversies past and present over torture, poison gas, and land mines, the Boykin affair, set off by General William G. Boykin's derogatory views of Islam - delivered in sermons at religious services - is a matter of words, not weapons. But the toxic effect of words is demonstrably vaster than that of any explosive. What triggers most of today's rifles is a combination of hate speech and jingoistic journalism. Identifying Islam with Satan and asserting that the Almighty is 'bigger' than Allah cannot possibly further the cause of peace.
The Boykin dustup - yet another example, incidentally, of the Bush administration's duplicity, preaching peace to the world but currying the favor of Armageddon-minded fundamentalist Christians - divides us into two groups, those who believe that some kinds of behavior are always and everywhere unjustifiable, and those who believe that it's fair to use whatever weapons your enemy uses - indeed, that it's stupid and self-destructive not to do so. Subscribers to the latter theory are prone to see their opponents as impractically principled, but they're wrong to do so, because they're themselves the less pragmatic party. Impassioned by the heat of the moment, they don't learn much either from the past or from the experience of different outlooks.
In a piece published on the Web site of The Daily Journal of Kankakee, Illinois, Fox Broadcasting host Cal Thomas makes the case for the impassionistas. Look at what the prime minister of Malaysia said about Jews running the world! Consider the anti-American animus of the Palestinian Authority! And don't forget that the Ayatollah Khomeini, Saddam Hussein and Osama bin Laden 'started it.' Mr Thomas didn't actually make the third point, but I don't see why he left it out. Whether responding to the Israeli occupation of the West Bank or simply seeking a scapegoat for the relatively backward economies of most Arab and some other Islamic countries, Muslim demagogues have been branding America and Christianity as evil for well over twenty years. Whether inattentive out of arrogance or genuine cluelessness, the United States has done nothing to justify either of the attacks on the World Trade Center. (To say that we should have seen them coming is not the same as to say that we deserved them.) It is difficult to regard the radical Islamic vision with anything but horror.
Mr Thomas may be quite right to observe that the "notion that religion is not at the heart of the hatred directed at America from outside and now inside the country qualifies as extreme denial. Throughout the Muslim world, America is condemned not mainly because of its ideas but because Islamists believe we are infidels opposed to God." But to suggest that this justifies our behaving likewise reminds me of my sister's adolescent wails that 'all the girls' were doing something that she was forbidden to do.
It's my sense of the practical, not a moral position, that convinces me that insults are always and everywhere wrong, and that public figures who insult America's enemies ought to be relieved of their duties. What possible good can such insults accomplish? Insults merely confirm and intensify previously settled convictions, and I doubt very much that even the best-aimed insult can raise flagging morale.
The West has a long history of fighting with God's blessing. It is no coincidence that the last serious religious war in the West was ended by a set of treaties - collectively known as the Peace of Westphalia (1648) - that laid out the doctrine of national sovereignty. Exhausted by more than a century of intermittent warfare sparked both sincerely and opportunistically by religious differences, the statesmen of Europe accepted the very lesson that the Cold War would teach: the only way to win is not to play. The peoples of Europe, unfortunately, had to learn this lesson all over again for themselves when they acceded to the power of kings and cardinals, but they appear to have done so in the wake of the last century's two World Wars. Populated largely by former Europeans who turned their back on (now vanished) European strictures, Americans have flattered themselves upon their tolerance while, thanks to wealth and distance, rarely being obliged to put it to the test. What makes our Iraqi adventure so lamentable is its dismissal of Westphalia's fundamental principle: that it is wrong to invade a nation for the purpose of regime change. For it is almost certain that this war of ours will end up confirming the Westphalian rule.
Personally, I find General Boykin an embarrassment - yet another yahoo to justify the world's contempt for political America. That he holds onto both of his jobs (in the Army and the Defense Department) is inexcusable, and the moralistic organizations that complain that he has been 'unfairly muzzled' ought to be ashamed of themselves. But what bothers me most is that the pulpit is once again a flashpoint.
17 October 2003: Wow. What a surprise! I pay no attention to baseball as a rule. Once in a great while, I take the subway up to Yankee Stadium and float my way through a game on a sea of beer. Then I come home, and the next day I couldn't tell you a thing about what I saw. Who won? I don't know. I can't keep track of that sort of thing. The beer has nothing to do with my memory lapses.
But the playoffs this year have been different. I've sat here with a transistor radio, while trying to write a very different sort of piece, listening to the seventh game of the American League playoffs, not because I give a damn but - because I guess I do. This has been such a strangely exciting series that many people, I suspect, have been drawn in against their ordinary inclination. The problem with such latter-day enthusiasm is that its victims aren't prepared for the consequences of their unwonted interest. What I discovered tonight is that I'm a Boston fan.
I didn't know it until the Yankees won, but the signs were there beforehand. Sitting down to write my Friday piece for Portico, I heard a chorus of screams and shouts from the streets, and I had to know what it was about. (I knew it was about baseball.) One of the things that I really love about living in New York, and living where I do (a sort of Greenwich Village for preppies, bar-wise), is the occasional burst of vocal riot. The cries on New Year's Eve, 2000, were thrilling and prolonged. The somewhat briefer shouts last August told me that the blackout was over. This year's World Series (am I allowed to use that term yet?) has been as audible as the jets landing at LaGuardia on a humid day. I've been taken back to football weekends at Notre Dame. But until late tonight, I was coolly indifferent to the outcome.
Before hunkering down to work, I had to place an order with the Vermont Country Store. Never mind for what - although I will confess to ordering, as a lagniappe, a tin of Charles Chips. In the middle of the transaction, the operator at VCS interjected - rather impudently, I thought - "I can't believe you're not listening to the Yankees game." I suppose I was interrupting. I was on the point of observing that, living in New York, I couldn't watch the game, but I wasn't sure that the old blackout rule still applied. I did, however, look up the Yankee Web site and tune my transistor radio to WCBS, the designated radio carrier. I had already found an ESPN screen that put the score at Boston 5, Yankees 2, but the exuberance down below (which has only just died down) told me that this was no longer the score. When I finally got the radio going, the teams were tied. And they remained tied for three innings.
I believed that I was genuinely indifferent until the Yankees won. The victory made my heart sick. I was reminded of the article in a recent New Yorker about survivors of jumps from the Golden Gate Bridge: two seconds into flight, and they'd give anything not to be dying. Perhaps if I had cared, Boston would have won. I certainly cared terribly about Boston's losing.
And that's where I'm going to leave this. I've got to get to New Hampshire tomorrow, and can't work out all the implications of this evening's discovery, which, I assure you, has nothing to do with metaphysical musings about the beauty of baseball. I can tell you already that it has a lot to do with an unfairness that even I'm aware of - an unfairness that gives the Yankees eight pitchers and the Red Sox two. The curse stories are amusing, but the colossal financial inequities in the great American game aren't. If the Yankees have the deepest pockets in baseball, why don't sports fans, even in New York, resent them? I was ashamed to hear my neighbors' triumph. Where is populism when you need it?
10 October 2003: Writing nearly twenty years ago, Neil Postman made an observation about television that went straight to my list of Top Ten Things to Know.
When a television show is in progress, it is very nearly impermissible to say, "Let me think about that" or "I don't know" or "What do you mean when you say ... ?" or "From what sources does your information come?" This type of discourse not only slows down the tempo of the show but creates the impression of uncertainty or lack of finish. It tends to reveal people in the act of thinking, which is as disconcerting and boring on television as it is on a Las Vegas stage. Thinking does not play well on television. There is not much to see in it. It is, in a phrase, not a performing art. But television demands a performing art, and so what the ABC network gave us was a picture of men of sophisticated verbal skills and political understanding being brought to heel by a medium that requires them to fashion performances rather than ideas.
I wondered why thinking would be boring to watch on television - boring, that is, for someone like me, who's fond of lively discussions that are necessarily full of cogitating pauses. I concluded that watching a discussion in which I cannot participate is worse than boring; it is positively annoying. The onscreen pause reminds me that I'm a passive and quite unnecessary spectator. Televised discussion will follow its course whether I watch it or not. I still haven't figured out quite why this rule doesn't apply to radio, but I don't doubt that it has everything to do with the overpowering force of vision; I don't feel passive when I'm listening to NPR. In any case, I learned from Mr Postman's Amusing Ourselves to Death: Public Discourse in the Age of Show Business (Viking, 1985), from which the passage above is taken (pages 90-91), that television would never, and could never, realize its early promise as an educational medium, because everything on television has to be prepared and canned ahead of time. Spontaneity - an essential ingredient of genuine learning - occurs only within the constraints of game-show rules.
Neil Postman died last weekend; with grim irony, news of his death did not reach the public until Wednesday, and his obituary appeared in The New York Times only yesterday. He did not live to see how completely California's gubernatorial recall confirmed his analysis of television, but I thought of him often during the media circus, and hearing of his death in the same newscast that announced the election results means that I will always associate him with what I take to be a baleful development.
Two facts: (1) There have been 32 gubernatorial recall attempts in California since 1936. Arnold Schwarzenegger's victory was the first instance of a successful recall. May have had something to do with the candidate's celebrity, d'you think? (2) California's broadcasters are notorious for underreporting political news, and with the Terminator and a host of other exotic figures in the running, they weren't necessarily making an exception for this summer's recall campaign when they gave it all-out coverage.
And a consolation: Arnold Schwarzenegger really won the race. He didn't have to. Forty-nine percent of the voters could have chosen to keep Governor Gray Davis, and lost the first part of the recall, while the second part could have been captured with a much lower percentage. But more people voted for Mr Schwarzenegger than voted against the recall (in other words, for Mr Davis). And yet the quality of the consolation is not great. The relatively massive turnout seems to have had more to do with participatory entertainment than with representative democracy. If government has taken on the look and feel of "Survivor," surely this is grounds for anxiety and not celebration.
That the recall was an exercise in participatory entertainment is suggested by many aspects of the contest. Exit polls indicated that most voters had made up their minds about a month ago - and then stopped judging. Mr Schwarzenegger said very little in the way of political speech, and that little was mostly puffery that seasoned critics of California's hamstrung budget found totally unrealistic. But this didn't matter to the voters. They might regret that their man never really 'addressed the issues,' but they voted for him anyway, because he had singlehandedly transformed a cranky recall into one of his own action pictures; in the coming episodes, he will descend on Sacramento and kick ass.
We'll see. He may well kick ass - it's something that California's voters haven't had occasion to address in their plethora of initiatives and referenda. But he may find that the same people who put him in office have made it impossible for him to construct a workable budget.
Requiescat Neil Postman.
3 October 2003: Click here.
26 September 2003: All week long, I've been fuming at the current state of American journalism - join the club - but what nearly set off an attack of thrombosis occurred yesterday after the Democratic contenders' debate on CNBC. What possible justification can the cable network have had for turning to a right-wing screamer, Chris Matthews, for commentary on the candidates' performance? He quite predictably dismissed the lot as wrong-headed and misguided, shouting down the innocuous anchor; his presence suggested that CNBC didn't expect many Democrats to be watching the debate, and was simply trying to please the angry, conservative audience that it is trying to lure away from Fox. I don't think that I've ever seen anything so scandalous on television - but then, I don't watch it.
But as I say I was already pretty angry myself, having had a hard time digesting the import of two stories in the October 6 issue of The Nation. The first was one of Eric Alterman's columns, this one devoted to the whopper that President Bush has told about his initial reaction to the September 11 attacks. In December, 2001, and January, 2002, the president stated that while he was waiting to enter an elementary school classroom in Florida he saw the impact of the first plane on a nearby television monitor - and thought it was a case of pilot error. But he can't have seen any such thing, since a video of the first plane's impact, taken by a tourist, did not surface until the next day. The 'pilot error' detail is almost weird. No one, of course, thought that the second strike (which of course was captured by live cameras) owed to pilot error, and the president's retrospective fancy reminds me of the 'corroborative detail' that gets Ko-Ko and his friends in so much trouble in The Mikado. The misstatement is admittedly not terribly important in itself, but it's a sterling example of the man's reckless way with facts and figures. He doesn't care about them, and he apparently believes that most other people don't, either. Nonetheless, it's journalism's principal job to get things right, and Mr Alterman is right to be dismayed that none of his colleagues has endeavored to set the record straight.
A page or so later, Matt Taibbi filed a report on Gov. Howard Dean's campaign stumping. Mr Taibbi recounts how a reporter from a Florida paper interrupted a discussion that he and the governor were trying to have about Sallie Mae loans to small businesses with the following: "Governor, getting back to substance," he said. "Is it true that you paint your own house?" The piece ends in an epiphany, as Mr Taibbi comes to understand the true purpose of the governor's - or of any candidate's - appearances here and there throughout the country, invariably flanked by a politically correct mix of diverse Americans.
To be full of shit in American politics is a signal to our political press that you are serious, and it was quite obvious that the most transparently meaningless or calculating aspects of Dean's behavior were what most impressed the Sleepless Summer press corps.
This is pretty bad, no? Forget media bias. The problem is rather one of media substance. Who cares whether a presidential candidate paints his own house? I'll tell you who cares: anyone who's afraid of being bored by more serious issues. Whether or not American journalists themselves have succumbed to this fear, they clearly feel it on behalf of their readers. Marketplace-of-ideas ideology takes it from there.
Ideas are not marketplace commodities. They are not consumer goods, because we don't consume them. On the contrary, ideas consume us. We're the commodities. We're the vessels through which ideas, spreading more or less infectiously throughout a population, manifest themselves and change the world. Our susceptibility to ideas is both uniquely human and dangerously volatile, and the notion that any one idea is as valid as any other is noxious nonsense. Journalists are not the only professionals with a responsibility for discriminating among ideas - for asserting, on the strength of their intelligence and training, that this idea is a good one, and that one not - but they alone manage the day-to-day status of current ideas. A reporter who finds more substance in the painting of one house than in the health of America's small businesses (and in the puzzle of corporate size) has betrayed his calling, and at the very least ought to turn in his word processor for a microphone and a blow drier.
19 September 2003: As expected, General Wesley K. Clark announced this week his candidacy for the Democratic Party's presidential nomination. There is little doubt that the 'draft Clark' movement had everything to do with General Clark's all but superlative military reputation. The commander of NATO peace-keeping forces in Kosovo, the general has been a star ever since he graduated first in his class at West Point. Unlike almost everyone in the Bush administration - especially the so-called 'chicken hawks' - he knows war first hand. Here is a potential candidate, the drafters clearly thought, who can take back the red, white and blue from the Republican radicals. The implication is that a civilian, even a civilian with Senator John Kerry's background, won't be able to stand up to the blasts of neoconservative rhetoric that we can expect in the coming campaign. As a practical matter, I'm inclined to agree. But I worry that General Clark's candidacy may come to be perceived by uncommitted swing voters as a negative force, aimed simply at putting a stop to the Bush Revolution. Unless they're positively attracted to a candidate's position, American voters tend to stay at home.
There were already nine contenders in the Democratic field, itself a matter of great worry. The Republican Party has always owed its successes to the allegiance with which its supporters rally round the man anointed in the party's sanctum sanctorum; Democrats are naturally fractious. I see a lot of dead wood in the Democratic fold, names that I wish would drop out of the race because it's already crystal-clear that they'll use up political oxygen to no effect. I am not entirely sure that Senator John Edwards is not one of them. Senators Kerry and Joseph I. Lieberman certainly have the required gravitas, but whether this atmospheric ingredient would condense into genuine authority remains unclear. Governor Howard Dean, hitherto the popular favorite, blazed out of nowhere but appears to be headed in the same direction, the victim of runaway success. (We forget, in the automobile age, the terror that 'runaway' used to imply.) Although most committed Democrats seem to like Governor Dean, he has been dismissed as unelectable by many observers, most insistently by The New Republic's Jonathan Chait. (That periodical's objections to the governor must in part be rooted in its support for the war in Iraq, which the governor has always loudly opposed.) Until very recently, Democratic Party dialogue has been polarized between leftists denouncing centrist accommodation, on the one hand, and pragmatists determined to put forward a candidate who can win the election, on the other. The movement to draft General Clark was launched by the pragmatists, who without the slightest discomfort liken their man to Dwight David Eisenhower, a Republican not beloved of the Democrats of his day. Well and good. In the current political climate, General Clark is probably no less conservative than General Eisenhower was; he wants to preserve and uphold, not destroy, the fabric of 'New Deal' federal government.
The danger is that if and when leftists, liberals, old-fashioned Democrats and (who knows) even Naderites agree that getting George W. Bush out of the White House is their paramount objective, and that General Clark is the man to do it, they will cloud the air with righteous, anti-Republican vituperation. Their support for the general will look worse than pragmatic - opportunistic. Unable sincerely to sing the praises of whatever platform the general cobbles together, committed Democrats will limit themselves to assailing the Bush administration. This could be a disaster for their party, for this country's voters have never let themselves be scolded into putting anyone into the White House.
Concentrating on an electable candidate always risks accentuating the negative. The fact that things are not going as well for the administration as Bush supporters may like is an advantage for the Democrats, certainly, but not perhaps the advantage that it might seem to be. Nothing will take the place of active popular support for its own candidate.
Governor Dean's meteoric career - which he may yet manage to rein in - will doubtless inspire prospective fund-raisers and check-writers to scrutinize General Clark very carefully before rallying behind him. Four issues have emerged so far, in addition to the hardly negligible objection that the general has never run for elective office. First, is the general as thin-skinned as critics (who may simply resent his success) have alleged? Second, again according to his critics, does he have a tendency to say whatever it is that he thinks will get him what he wants? Third, is he hot-headed, as the contretemps that led to his dismissal from NATO command in Kosovo might suggest? And, finally, will the Clintons' support help or hinder him? These matters need to be cleared up right away. When they are, I do hope that General Clark emerges as a man whom Americans will want to elect as their forty-fourth president.
12 September 2003: Click here.
5 September 2003: Last week, I wrote about the miasma of incompetence that appears to have infected American know-how. Two days later, the top-fold article in the Times's Week in Review Section approached the same topic from another angle, this time focusing on brinksmanship, which is the art of getting away with as much as you can. Brinksmanship necessarily involves testing and, inevitably, exceeding limits - for the sheer hell of it. It is one thing to test limits in a crisis, and quite another to manufacture crises by testing limits for no operational reason. The latter is an indiscriminate abuse of social institutions and resources, and it takes no heed of third-party consequences. That's why I prefer to regard 'brinksmanship,' which is meant to sound daring and manly, as a form of incompetence. This peculiar stupidity, to the best of our knowledge, is induced by unregulated hormones.
As if to show how timely our anxieties - mine and the Times's - were, as if to suggest that they were even more warranted than we might have thought, this week's news disclosed a financial scandal based on the systematic corruption of a trusted operating system. (Needless to say, I'm not talking about a Microsoft product.) For sixty-odd years, the mutual fund industry has kept its nose clean. Perhaps it has been a little stodgy, and often seriously overpriced, but there have been no serious irregularities in its day-to-day operation. Like all financial institutions, it has harnessed computers to honest workers to produce transactional records that could be taken on faith. Until now. Now, as a result of an investigation launched by New York State Attorney General Eliot Spitzer - already well on his way to outdoing Rudy Giuliani as a scourge of Wall Street - the Bank of America and four other mutual fund sponsors have been implicated in a pattern of fraudulently priced trades designed to enrich not only a hedge fund manager who happened to be the son of one of America's richest men but the fund sponsors themselves, all to the detriment of 'ordinary' fund investors. I put 'ordinary' in quotes because both New York State law and the federal legislation governing securities transactions outlaw any distinctions between big and small mutual fund investors - or distinctions of any type. No financial product on the market today is more rigorously required to be egalitarian.
Whether or not the Canary Scandal - the hedge fund involved, pet-owners will be interested to note, was called Canary Capital Partners, LLP, a salute to the manager's family fortune, which originated in bird seed produced by Hartz Mountain (currently a real estate concern) - becomes really big news depends on what Mr Spitzer turns up in his ongoing investigations. The early word is that the named mutual funds' delinquencies are widespread throughout the mutual fund industry, but I doubt that the Canary Scandal will get Enron-level coverage, or even the attention lavished on Martha Stewart. The money is not 'big' enough to qualify for the former. If Canary Capital made $40 million in illegal profits, as Mr Spitzer's charge alleged, then this amount has to be divided by the number of outstanding shares in the various mutual funds managed by the five named sponsors and allocated further to the other investors holding shares at the same time as the fraudulently-timed trades; I would be very surprised if the average investor's loss amounted to $100. I doubt that that figure will have to be revised upwards no matter how extensive the abuse turns out to be. This isn't to say that the investors don't deserve to be made whole, but rather to suggest that making them whole isn't going to break any banks. (More about banks in a minute.) And as for glamour, forget it. Even the aggrieved will have a hard time figuring out just what their fund sponsors did. And the malefactors are likely to turn out to be both numerous and mid-level.
But the story is alarming in the same way as the blackout story was. Who's minding the store here? The security of the American financial industry rests on the twin pillars of unimpeachable records and full disclosure. Without going into detail, the Canary Scandal depended on some important systems bypasses. It ought to have been impossible to implement these shortcuts, which allowed trades to be dated improperly and commissions to be waived. Why would a sponsor waive a commission? There would be no good reason at all, except that the Bank of America, along with one of the other sponsors, has a lot of different types of business going - namely, its more traditional banking activities. It seems that the Bank was eager to secure the patronage of Leonard Stern, aforementioned member of the club of America's richest men. Waiving commissions didn't take money out of other investors' pockets (or fail to enrich them), but as I say the mutual fund business is absolutely egalitarian, and such discretionary activity on the part of sponsors is illegal. It also violates the SEC's bedrock disclosure rules. As to the improperly dated transactions (and this is the more serious of the two malpractices), the bank exchanged the quid of specially-priced shares for the quo of Leonard Stern's business, a genuine rip-off of other investors.
I don't know why the three fund sponsors who aren't also banks got involved in this scheme - that will come out in time - but it's clear enough why the banks themselves were, and the Canary Scandal is a reminder that the repeal of the Glass-Steagall Act, which was designed to keep banking and brokerage separate, may have been a mistake, at least insofar as it was replaced by a regulatory vacuum. But behind this public-policy question there is the deeper woe of professional irresponsibility. The fraudulent patterns woven by the sponsors and the hedge-fund manager required the acquiescence if not the active participation of accounting and compliance units. In-house lawyers may have issued unwarranted opinions blessing the scheme, but many among the back-office personnel involved must have had the experience to know that the trades they were being asked to process were 'questionable.' Because the abuses persisted for two years, involving many trades - you don't accumulate $40 million by arbitraging mutual fund shares overnight - they may well have warped the ethical culture of the fund sponsors' operations departments. If these organizations can't be trusted to keep honest and accurate records, then surely a financial dark age is upon us. I don't mean a market in which prices are down, but, much worse, one in which prices don't matter, or can't be determined.
The Wall Street Journal's immediate editorial response to the scandal was to rap SEC Chairman William H. Donaldson for not having beaten Mr Spitzer to the punch. This could not be more characteristic of the Journal, a full-time rooting section for the government-hating conservatives who, among other things, have starved the Commission of the funds required to finance protracted and hostile investigations. The only wonder is how Mr Spitzer funded his.
29 August 2003: This morning's WNYC headlines were grimly related. First, an important Iraqi ayatollah, Mohammed Bakir al-Hakim, was blown up in a car bomb explosion in the holy city of Najaf, together with about 90 members of his congregation. 'Old Baathists' were blamed for the assassination, but so was the American failure to restore order in Iraq. The other leading story concerned 2000 pages of transcripts, made by the Port Authority, builder-owner of the World Trade Center, of telephone and radio communications recorded on 11 September 2001. It had taken a FOIA suit by The New York Times to make the transcripts public. I was overcome by the desire to give the rest of the day a pass.
These stories are related, moreover, in several ways. The American adventure in Iraq is a consequence (if hardly a necessary one) of the terrorist attacks two years ago. But the linkage that bothers me is the note of incompetence common to both. The transcripts, not surprisingly, reveal official confusion about how to handle an unforeseen crisis. It would be harsh to expect anyone, prior to that awful day, to have developed a plan for evacuating a densely populated office complex, but the fact remains that a lot of authoritative advice was wrong, and a lot of heroism misguided. Whether or not we ought to have known what to do, we didn't. And we don't seem to be doing much better in Iraq. Even assuming that the severe cost constraints imposed by the administration on the occupation were lifted, it's unlikely that we would be able to calm the hornets' nest of old and new grievances that our invasion stirred up. Our presence alone appears to distract many leading Iraqis from buckling down to the hard job of constructing a state on the ruins of Saddam's misrule. It is simply too easy to blame the Americans for everything that goes wrong. And while our soldiers are certainly more loyal and committed than American forces in Viet Nam were, I wonder if they have a significantly clearer idea of why they're there.
Ever since the blackout, I've been thinking a lot about incompetence. American incompetence, of all things. What happened to our can-do efficiency?
But first, a happier story: it appears that NASA engineers, the people who dream up and design spacecraft, will be given a greater, and possibly a controlling, voice in the conduct of missions. Sanguine project managers will have to yield to the professionally scrupulous. This is a story about competence restored, but of course it is also a story about the corrosive effect of staring too hard at bottom lines. (Not that engineers guarantee safety, as posthumous questions about the design of the Twin Towers have suggested.) If everything is business, if everything is either a success or a failure, then competence becomes too expensive, as well as an onerous drag on the risk-loving impulses of chief executives.
If everything is a business, then everyone is running business risks in every aspect of life - an insane stress. The other day, I came across a chilling remark, made, I believe, by the chairman of the Dallas branch of the Federal Reserve. This gentleman observed that 'no bank failures is too few,' by which he meant that banks aren't taking enough risks. Enough risks for what, and for whom? Is a bank just a business? No, and here's why: a bank serves people who don't necessarily understand banking, and can't be expected to assess the risks that a banker might choose to run. If a bank were just a business, then all of its depositors would have to be bankers, too, or at least capable not only of keeping up with bankers but of evaluating them as well. Assuming that you're not a banker, are you up to the job? Don't you rather regard bankers as people who specialize in sparing you the obligation to understand banking? Isn't that what you pay them for?
A fortiori as regards electric power and medicine, two of many fields where professional expertise has been insulted by excessive bean counting.
I don't for a minute believe that the supply of truly competent Americans has dwindled. But because doing a good job is not the fastest route to riches, I worry that capable and reliable people will be overlooked in the rush to complete projects and maximize profits. I worry that managers like Alfred Lambert, the father-figure in Jonathan Franzen's novel, The Corrections, will feel like chumps for having been honest and dutiful. And I worry that Americans will fall into the habit of lazily taking someone's word for it that the someone in question knows what he's doing.
22 August 2003: In our neighborhood, the power was restored on Friday at 6:15 PM. The blackout was a long and very tedious ordeal, but I tried to remind myself that it could have been worse. I hadn't been caught in an elevator or on the subway, for example. Kathleen, it's true, had walked all the way uptown from Wall Street, and then climbed umpteen flights to our apartment - without being able to freshen up with a shower afterward. I spent almost all of the twenty-six hours out on our balcony, where at least there was moving air. This had the intended effect of keeping the apartment cool.
By the time I'd cleaned up and put something on the stove for dinner, it was a little too late to sit down to write my now customary Friday piece for the Front Page of Portico. I decided to let it go altogether. Readers might have been curious to know what the blackout was like, but all I can say is that the local skyline, dark under a correspondingly brighter moon (it was almost full), made a strangely uninteresting sight. Fond as I am of history, I have no desire whatsoever to sample life without a full complement of modcons. I would have enjoyed the blackout more if I'd been unconscious.
Late last night, surfing the movie channels, I came across JFK, Oliver Stone's take on conspiracy theories surrounding the Kennedy assassination. It was more than a little reminiscent of the Pentagon and backyard-lab scenes in A Beautiful Mind. Once you believe something, you'll find plenty of evidence for your belief, and, the more preposterous the belief, the more evidence you will find. My all-purpose objection to conspiracy theories is that it's just too hard to keep secrets these days. Sooner or later, the itch to go public becomes irresistible. A good deal of the evidence mustered in JFK concerned efforts to keep the conspiracy against Kennedy a secret, and it involved too many people. I would not be deeply surprised if proof of Mr Stone's coup d'état were one day established beyond a reasonable doubt, but I'm not persuaded by a pile of circumstantial evidence, each item of which has to be held up to the light just so.
A conspiracy theory of an entirely different kind has been floated in the current issue of Harper's Magazine (September 2003). Writing about the disgrace of American public education, John Taylor Gatto suggests that, instead of wondering what's wrong with schools, we ought to ask if they're not, after all, doing exactly what they're designed to do. The conspiracy theory here is that American educators have little or no interest in providing students with a genuine education, but are rather preoccupied with turning out mediocrities who can't (or don't want to) think for themselves. Sounds sinister, doesn't it! But this is a conspiracy of the open-secret kind, for the history of public education in the last century, together with a body of 'revolutionary' texts that outlined the modern educator's objectives, is there for anyone to see.
Mr Gatto traces the underpinnings of public schooling back to Prussia, where education was devised to fragment the lower classes and thus hinder mass movements. The American version was developed about a century later, as heavy industry required both pliable workers and mass consumers. Mr Gatto points out that a work entitled Principles of Secondary Education (1918), by Alexander Inglis, was identified by no less magisterial a figure than James Bryant Conant as a key text of the 'revolution' in education that transformed American schools even as education was being made universally compulsory. The first of Inglis's six 'educational' functions is to "establish," in Mr Gatto's words, "fixed habits of reaction to authority. This, of course, precludes critical judgment completely." Indeed. What I remember about public school is the teachers but not their classes, the discipline of detention, and the plethora of quasi-military rules. Homework was one thinly-disguised punishment (for being young and relatively weak); exams, and especially pop quizzes, were another. It is by no means wild of Mr Gatto to compare the typical American school to a prison. Inglis's other functions concern sorting and grading students - what has come to be called 'tracking.' Contrary to common wisdom, this tracking selects not for grades or academic achievement but for obedience. It does so by judging achievement in terms of obedience. The self-directed student will fare no better than the 'academically challenged.'
Consider the words of Woodrow Wilson (president, at the time, of Princeton University), in an address to the New York City School Teachers Association in 1909:
We want one class of persons to have a liberal education, and we want another class of persons, a very much larger class, of necessity, in every society, to forego the privileges of a liberal education and fit themselves to perform specific difficult manual tasks.
This depressing program envisions a large electorate that will do as it's told - that will follow the lead of the privileged few. I can't imagine anything less consistent with this country's founding texts - except for the parts concerning slavery. The 'difficult manual tasks' that Wilson saw at the heart of the economy have largely disappeared, or emigrated to Third-World countries, and 'vocational training' has been relegated to correctional facilities, but there can be no doubt that we live in a country divided between a small class of the liberally educated and everybody else, the latter group trained to do nothing at all - and certainly not to think. (Woodrow Wilson seems to become more monstrous with everything that I read about him.)
Does this bifurcated system work? I don't think so. In Mr Gatto's judgment,
Maturity has by now been banished from nearly every aspect of our lives. Easy divorce laws have removed the need to work at relationships; easy credit has removed the need for fiscal self-control; easy entertainment has removed the need to learn to entertain oneself; easy answers have removed the need to ask questions. We have become a nation of children, happy to surrender our judgments and our wills to political exhortations and commercial blandishments that would insult actual adults.
Happy, that is, up to a point - the point at which voters get carried away with referenda, as Californians have. The essential idiocy of government by referendum is this: there is no requirement that voters square the consequences of the referendum at hand with those of ones already 'enacted.' Swept into law by single-issue passions, referenda make government less effective, not more; the ultimate referendum would undoubtedly abolish government altogether. Only a very dimly-educated electorate could make such a mess of direct democracy, and fail, at the same time, to make more of the virtues of indirect democracy - the representative democracy in which this nation was born.
Mr Gatto's response to the nefarious disaster of our public schools is to urge parents to counteract the infantilizing tendencies of public education. I'm not sure that this would be a sound approach, not least because it would certainly subject many children to ostracism and worse. For my part, I have two recommendations. A nice beginning would be to close all schools of education, for there is no way to teach someone who has never taught how to teach, other than to throw that prospective teacher into a classroom and see what happens. Second, it would probably be enough for public schools to do what they can to emulate the nation's private schools, especially the 'elite' boarding schools of the Northeast. It's not that these schools perform miracles, just that they don't stand in the way of obtaining an education. Of course, I'm not without my own Big Ideas about education.
Education is an experience that's difficult to compare, because with few (and by definition unenlightening) exceptions, the stages of education aren't open to repetition. You can't 'do' high school twice, once in an ordinary public school and once at an elite institution such as Andover. But I can assure you that the most rigorous schooling I ever had was at a private boarding school in New Jersey; it was certainly more demanding than all but the first semester of law school. I wish I'd been able to go to Blair for four years instead of two, because my freshman and sophomore years at a public high school in Westchester County were a waste. Then as now, Bronxville was a school district that parents paid a stiff premium to live in, and teachers' salaries were at or near the top of the scale. As public high schools go, Bronxville was, and probably still is, a very good one. But it had more in common with scout camp than with Blair. Primarily an organ of socialization, it pitched its academics at or below average, and this would have made life boring for almost everybody if the dramas of adolescent social life, together with extracurricular activities, had not provided absorbing distractions. What kind of person you were meant a great deal more than what sort of work you were capable of. Leaving this atmosphere behind was unbelievably liberating. It also taught me that real education is too urgent and absorbing to be judged on an axis of 'fun' and 'not fun.'
As the politicians never tire of saying, education is a number-one problem in this country. But almost everything that they recommend in the way of reform is false. The disparity in education parallels America's disparity of incomes, but is much more dangerous.
15 August 2003: No entry, thanks to the Great Blackout of 2003. (Well, let's hope it was.)
8 August 2003: Click here.
1 August 2003: The other day, reading Eric Alterman's column in The Nation, I came across yet another rehearsal of some astounding poll results. According to a January survey, 44 percent of those polled appeared to believe that Iraqi citizens participated in the terrorist attacks of September 11, 2001. Only 17 percent knew that none of the terrorists was Iraqi. The interesting thing about these figures, as Mr Alterman notes, is that the Iraqi element was new; in the immediate aftermath of the attacks, nobody made a connection between the attackers - who were widely reported as being Egyptian or Saudi - and Iraq. Nor, in the fall of 2001, was any responsibility for the attacks imputed to Saddam Hussein.
Where did this disturbing ignorance come from? Certainly the anything-but-straight-talking Administration must take credit for sowing seeds of misinformation in the course of drumming up support for the invasion of Iraq. But the medium in which the seeds sprouted was undoubtedly television news. It wasn't necessary for newscasters to make false statements. Simply repeating the Administration's anti-Saddam propaganda would have been enough to lead many viewers to make the faulty connection. This wouldn't work in print. Reading requires an engagement with text that easily exposes nonsense for what it is - that's why innuendo and unsupported allegation are confined to gossip columns, and banished altogether from serious newspapers. Serious newspapers treasure their reputations for balanced reporting, fully aware that readers will be able to spot bias when they see it. Television, in contrast, requires so little of its viewers that many, if not most, find it difficult to give the screen their undivided attention. I've no idea how many people under sixty listen attentively to the late news the way my parents did thirty years ago, but I suspect that not many do, especially in this age of multitasking. And I believe that most people who get their news from television do so as if by eavesdropping, catching a word here or there and not giving much thought to what they hear - not, that is, until the pollsters ring up.
Reading a newspaper is a lot more difficult than taking in a television newscast, and it takes a very great deal longer. It seems unlikely that there will ever be a time when most people read a good newspaper every day, or that there ever was such a time. Were people better informed before television? No. But they were more aware of their ignorance and uncertainty. Television's darkest power is its knack for fooling viewers into believing that watching a sixty-minute show makes them experts, and that sitting through a round-table discussion to which they can't contribute is a substitute for thought. Television has to flatter its viewers, if only to make up for the tedium of the passivity that it requires.
Television can be hugely entertaining, and there can be no doubt of its importance in spreading the idea that in matters of importance there is not much difference between men and women, or between blacks and whites, or really between any two similarly intelligent and imaginative people. But it rarely tells the truth and nothing but the truth. Flattened onto the screen's two dimensions, the complexity of the truth becomes confusing and boring complication. The networks are often faulted these days for cutting back on their news budgets, but I think that nothing would be more heroic than for television executives to acknowledge that their medium is no good at all at news, and so put an end to their wretchedly misleading simulacrum.
25 July 2003: Click here.
18 July 2003: A little knowledge, the saying ought to go, springs eternal. A week or so ago, the Times ran an article about the radically small number of genes on the Y-chromosome. Because this chromosome cannot swap genes during mating, it has been losing them instead, or so the theory goes. The news was that the chromosome has nevertheless developed a way of repairing itself, but this wasn't exactly news you could use. Much more interesting was the flutter of speculation about the Incredible Shrinking Y-Chromosome. Ignoring the crux of the story altogether, wiseacres and ADD sufferers speculated giddily about the eventual extinction of the masculine half of the species. Maureen Dowd, the Times's resident wiseacre, weighed in with a sarcastic column that, had a man been writing about women, would never have been published in the Grey Lady's pages. The following Monday, the paper published five Letters to the Editor on the subject, headed 'Y Power: Men Are Here to Stay.' It would be funny if it weren't such a sad waste of time.
My favorite letter, so to speak, was sent in by one Rick Reiss of Temecula, California. (I must say that it's heartening to know that the Times is read across the land; indeed, none of the five letters on 'Y Power' came from the New York Metropolitan Area.) Mr. Reiss wished to remind the Editors that "There are still plenty of us regular guys who like to drive trucks and eat steak. We're not going away." Then he wound up: "After eight years of a feminized presidency, most of our society welcomes the resurgence of regular guys who carry on in our battles against terrorists."
Is that what the Right hated about Bill Clinton? That the fatal attraction that he appeared to exert over every woman who crossed his path made him a softie? There have always been two rather contradictory ways of evaluating masculine success: by counting a man's sexual conquests or by appraising his indifference to women. Men who are indifferent to women - and this indifference is usually a veiled hostility - don't score very often, and when they do, their approach savors of rape rather than seduction. It seems clear to me that 'regular guys,' as a rule, have hardly any more use for women than gay men do.
Writing in Psychology Today - an article that I came across through Arts & Letters Daily - Hara Marano outlines "The New Sex Scorecard." After reviewing the physiological differences between male and female brains, and the corresponding propensity for men to develop schizophrenia and for women to succumb to depression, Ms Marano ends on a strange note. According to Baltimore psychologist Shirley Glass, Ph.D., the difference between the ways in which men and women approach extramarital affairs is shrinking. "In what may be a shift of epic proportions, sexual infidelity is mutating before our very eyes. Increasingly, men as well as women are forming deep emotional attachments before they even slip into an extramarital bed together. It often happens as they work long hours together in the office."
Isn't that something! When men spend a lot of time with women in some common activity, they begin to act like women - at least insofar as they care about the person they're unfaithful with. This, at any rate, might be the inference drawn by Mr Reiss and regular guys of the 'slam, bam, thank you, ma'am' persuasion. To keep your regular guy status in good order, the answer clearly is to hang around as much as possible with other regular guys, driving trucks, eating steaks, and fighting terrorists.
Seriously, I would really like to know what was 'feminized' about the Clinton Administrations. Was it too cautious, perhaps? Bill Clinton knew - as he acknowledged once he was out of office - that the first President Bush had held back from Baghdad, and resisted the impulse to 'get Saddam,' because his agreement with the Saudi Arabian government, allowing American troops to maneuver from Saudi territory, required this restraint. President Clinton certainly hated war, and fought hard to stay out of it - perhaps too hard, as catastrophes in Central Africa and the Balkans might have been avoided or mitigated by a forceful American response. In both cases, we would have gone to war to restore peace. That cannot be said of the current Iraq adventure. Iraqis now have the bitter opportunity to compare peace under a tyrant to chaos in an anarchy. I can't say that one is better than the other.
Was it Hillary? Too much input from an uppity First Lady? The former First Lady is a Senator now - albeit from a state whose citizens, if the Times Letters are any indication, are dangerously unconcerned about the fate of the Y-chromosome - and by all accounts she is a free-standing politician who owes little or nothing to her husband's charisma. While no one could have known for sure that Mrs Clinton would eventually win a major election on her own, it seems unfair to pretend that she was just another presidential better half.
I think I had it right the first time. The Clinton Administrations were feminized because so many women wanted to throw themselves into Bill Clinton's arms. Too many girls! Meanwhile, when are all those regular guys going to hunt down Osama and Saddam? Sometimes it seems that the Y-chromosome will run out of genes first.
11 July 2003: Last Tuesday, in the New York Times for 8 July 2003, there appeared on page A4 an item concerning the Lutheran pastor of Tarbaek, a village to the north of Copenhagen. The pastor, Thorkild Grosboll, subscribes to an unorthodox creed. "I do not believe in a physical God, in the afterlife, in the resurrection, in the Virgin Mary. And I believe that Jesus was a nice guy who figured out what man wanted. He embodied what he believed was needed to upgrade the human being." The story, by Lizette Alvarez, does not state that Rev Grosboll has preached this creed from his pulpit, but for acknowledging it to a Danish newspaper reporter he has brought down the wrath of the church hierarchy. In these temperate times, ecclesiastical wrath has confined itself to relieving Rev Grosboll of his pastorate - much to the dismay of his parish of 1500 souls. His parish council has urged the bishop, Lise-Lotte Rebel, to reconsider, and a supportive rally has been staged - on a football night, no less. The bishop, temporizing perhaps, has asked Rev Grosboll to 'clarify' his remarks. This demand is unlikely to produce reconciliation. Described by Ms Alvarez as 'a laid-back man in Oxford tweeds who is beloved by his community,' Rev Grosboll explains, "I want the focus to be on the here and now, as a cultural factor. God is not an argument. God is only a question. He is supposed to be a constant stone in the shoe."
Ms Alvarez reports that Rev Grosboll's dismissal has 'set off a tsunami of theological discourse in workplaces, university halls and cafes across Denmark, where religion seldom penetrates the collective consciousness.' Attendance at the state-supported Church, she notes, is no higher than six percent of the population. That figure certainly blunts the impact of the bishop's assertion that Rev Grosboll's remarks are "creating doubt and confusion about the church's values." One suspects, indeed, that Tarbaekers are more likely to attend church services than other Danes (not that I know this to be the case) precisely because their pastor has jettisoned the metaphysics that don't seem to have much to do with the pursuit of a Christian life here on earth. Indeed, I've always thought that the pursuit of a celestial afterlife is a dubious ulterior motive for observing Christian precepts. Personal salvation and charity don't really blend; there's a selfishness about wanting to go to heaven that sits ill with Christian altruism.
The Nicene Creed, notoriously, prescribes no ethical tenets. With the exception of the line about Jesus's death ('crucifixus ... passus et sepultus est' - 'he was crucified ... suffered and was buried'), it is a list of metaphysical propositions about the unseen and the yet-to-come. For a millennium and a half, the profession of this creed, not the implementation of Jesus's teaching, has been required of all Roman Catholics, and of some Protestant Christians as well. One wonders why. The short answer is that Early Christianity was, to put it mildly, multifaceted, a chaotic Babel of competing ideas about the nature of God, Jesus, and the human soul. The problem of evil vexed many minds and inspired many inconsistent ideas about Creation. When Christianity became the official religion of the Roman Empire, the Emperor understandably desired an end to this cacophony, and so a Church Council at Nicea was charged with formulating core Christian beliefs - about metaphysical matters.
But why now? Why, when there is so much for a Christian to concern himself with here below, does it matter what he thinks of what lies beyond? Whether or not you believe in the doctrine of the Trinity can have no bearing on fulfilling Christ's fundamental rule - treat your neighbor as you would be treated. The Gospels of Matthew, Mark and Luke may be searched in vain for Nicene precepts. Why, then, is it important, or even interesting, to be clear about whether or not the Holy Spirit 'proceeds from the father and the son'? Here's my hunch: Christianity - Christ's teaching - is not really a religion.
From the beginnings of recorded history, people - at least the people on this side of the Himalayas - have turned to religion to answer such questions as "How did the world come about?" and "What happens after death?" These are not moral questions, but rather the kind of question that most educated Westerners today expect science to answer - even if they still ask these specific questions of religion. The decidedly pre-Christian doctrine of the immortal soul (it's pagan Greek, not Hebrew) is an answer to the ultimate religious question, which is "What does it all mean? What's the point of existence?" Catholics and Protestants alike are asked to believe that the point of existence is to praise the Lord God and to pray for salvation - that is, redemption from the imperfections of material existence.
But not everyone is riveted by these questions. I daresay few people are, at least most of the time. Most of the time, we find ourselves in a busy, complicated world, in which frustrations abound, pleasures are chancy, and choices are difficult. Affection and dislike propel us toward and away from the people around us with a force that willpower rarely overcomes, while dreams of a happier future tend to make us too impatient with the present to take sensible and disciplined steps toward our goals. The teachings of Jesus are full of wisdom, some of it obvious, some obscure, and much of it difficult, concerning the difficulties of navigating a steady course through the riotous panorama of engagement with the world. A preoccupation with the meaning of life, in contrast, seems solipsistic in a particularly adolescent way, and the imposition of a peculiar set of answers to 'ultimate' questions on everyone in the neighborhood is downright monstrous.
A few years ago, retired Episcopal bishop John Shelby Spong published an important book called Why Christianity Must Change Or Die: A Bishop Speaks to Believers in Exile (HarperCollins, 1998). I could open the book anywhere and find a passage with a bearing on the foregoing, but the following is acutely germane - and if I had Bishop Rebel's email address, I'd forward it to her:
Anger has always marked the religious establishment. This is why so many Christian leaders historically have justified such things as the stifling of debate with ex cathedra pronouncements, the persecution of dissenters, the excommunication of nonconformists, the execution of heretics, and the engagement in religious wars. This is also why anger is always just beneath the surface of organized religion in almost every one of its Western manifestations. The preaching of evangelists is marked by finger pointing and face-contorting expressions of hostility while they talk about the wrath of God. Anger lies underneath the glee expressed by the preachers of Christian history when they assign unbelievers to hell. Anger is the reason why many religious people act as if they will not enjoy the bliss of heaven if they are not simultaneously allowed to view those not so fortunate writhing before their eyes in the fires of hell. Anger is the reason why the Church throughout its history has kept writing creed after creed to clarify just who is in and who is out of this religious enterprise so that religious people would know who their enemies were and could act appropriately against them.
It may well be that all we really need to know about God is that He is a stone in our shoes, a constant reminder that we could do better. Meanwhile, I hope that the Times will follow the continuing story of Rev Grosboll and the parishioners of Tarbaek, Denmark.
le quatre juillet, 2003: Responding to my dismay, in last week's Front Page, on the matter of the President's popularity, a reader wrote, somewhat but not entirely tongue in cheek, "Spoken like a True Ivy League Snob." And I had to laugh, Double Domer that I am. It's the President who's the Ivy Leaguer. I have no reason to believe that the President is a snob - although I'd probably be happier with him if he were. I'm too old and, I hope, too wise to deny that I'm a snob, in some way or other - it's hard to be sure that absolutely none of one's reasonable discriminations grade off into unthinking prejudice. But an Ivy League Snob, I'm afraid, I cannot be.
I'll tell you what I'm a snob about: news. I've been reading A. N. Wilson's really terrific new book (pardon the triteness, but the book is really terrific, and today is a national holiday), The Victorians (Norton, 2003), and just yesterday, in a chapter entitled "The Fourth Estate - Gordon of Khartoum - The Maiden Tribute of Babylon," I came across a passage that beautifully sums up everything that I despise about mass journalism today, and why. Here it is:
No visitant from another age who landed in the midst of our twenty-first century culture would begin to make sense of our popular journalism - prurient, self-righteous, spiteful and pompous - unless they were able to trace its origins to the chiefly North Country traditions of the nineteenth-century Nonconformists. Dickens had ridiculed the Puritan conscience in such grotesques as Mr Chadband (Bleak House). What happened in the following generation was that a fervour, a craving for the emotional excitement of the prayer-meeting and the conversion experience, was awkwardly translated into secular spheres. As has been well said, 'in an epoch of varied achievements, scientific, literary and commercial, the elect of God related themselves to mundane reality almost exclusively through their aptitude for money-making; balancing this imperfect contact with a complex epoch of self-complacency.'
What's different now, here in America, is that the elect of God - and Americans have an irritating conviction that they're that - relate themselves to mundane reality through self-improvement, and they enlist God as a coach. "What would Jesus do?" - that loathsome trivialization. But the prurience, the self-righteousness, the spite and the pomposity, these are on full-dress display every minute on Fox News, and, with worrying frequency, on the other news outlets as well. The prurience, of course, fueled the Clinton-Lewinsky scandal, which in a sensible world would have received hardly any airtime at all, and so deprived Mr Starr and his backers of the oxygen that their pyromania required. The self-righteousness empowers blowhards like Bill O'Reilly to bully his guests. The spite is evident in the obviously resentful fury generated by France's objection to our Iraqi adventure - a resentment that forgets that saying "If it weren't for us, they'd all be speaking German," obligates you to remember that if it weren't for them, we'd be toasting the Queen. And the pomposity! Everything about the visuals of television news is pompous, from set design to animation. Television news replaces the imperial parades of centuries past with imperial connectivity, a grandiose and patently sham claim to be in the know. What's patently sham about this claim is that no one on television is going to serve up a single news item without seasoning it to suit what Mr Wilson calls "the new journalism, a monster machine whose twin-turbo was fuelled by sensationalism and moralism."
Of this now all too familiar journalism, Mr Wilson writes, "It was based on a threefold alliance between an eagerly opinionated public, a political class anxious to test and ride these opinions like surfers waiting for the next roller to bear them crashing to shore, and the conduit that brought these two together, the solicitors or procurers known as journalists." We can thank this alliance for the ersatz democracy that we've got to live with until the public outgrows its weakness for ignorant and uncritical opinions. For until it does, the political class will be unable to lead it, and the journalists will have no reason not to coddle it.
In a fit of pique, I ordered a slew of Johnny Hallyday CDs from Amazon in France, and when my guests arrive for tonight's pre-fireworks feast of fried chicken and ribs, we'll be listening to pop music with a French accent. And we'll be drinking Champagne, too, when we toast, I hope not too optimistically, a less sensational future for this great nation.
27 June 2003: Whether or not the Bush Administration will find itself in political hot water because it misled the country into war on specious claims about Saddam Hussein's weapons of mass destruction remains somewhere between uncertain and unlikely, and the outcome will probably not be affected by whatever happens to Tony Blair, whose water is certainly uncomfortably warm right now. But I can't take the WMD brouhaha very seriously, anyway, because I don't doubt for a minute that senior Administration officials (including the President) believed the faulty intelligence that they had pressured into existence. They were reckless, no doubt, but, worse, they appear to have been incompetent. Few things are more terrifying than the thought of an incompetent Administration at the helm of the world's superpower.
Aside from racking up tax cuts for wealthy individuals, the Bush Administration has accomplished exactly nothing. The ongoing war on terrorism has failed to bag either of its principal targets, Osama bin Laden and Saddam Hussein, and neither Afghanistan nor Iraq has been restored to civil peace. Indeed, it's unclear whether matters are getting better or worse in Iraq. While American and British troops helplessly stand by (there aren't nearly enough of them to police the country), Iraq sinks into generalized lawlessness and widespread vendetta. Meanwhile, at home, the Administration has done little to restore pre-9/11 feelings of safety; it has certainly starved the Homeland Security forces of meaningful resources. And all the other domestic indicators, bearing on the health of the economy, the health of the environment, and the contribution to both of our deranged levels of fuel consumption, are down and falling further.
Why then are the President and his team so popular? This is the mystery that bothers me the most, because what all the answers have in common is the irrationality of the American voter. To my mind, George W. Bush has never been anything more than a cheerleader. I'll grant that he's a very good one, but a cheerleader is hardly a political leader. Cheerleaders inspire crowds not with bold and novel propositions but rather with clichés and well-worn gestures: Repeat after me, and you'll feel that you belong to the crowd. Cheerleaders want to make you feel good about the game, and a nation as habituated to televised sports as ours is makes an easy mark for canny manipulation. And it is probably his very mediocrity - laminated, of course, by privilege - that enables the President to pitch his appeal so perfectly. Not that Americans are such fools that they take the man to be somebody just like them. On the contrary. It's agreeable, perhaps, to know that even the son of a President can be so broadly undistinguished.
The first issue that any democracy must address is the political health of its electorate, for in a democracy everything depends on the voters' choices. All the campaign financing in the world cannot overcome the will of the majority, and implicit in the campaign-finance reform movement is a conviction that American voters are too easily manipulated by advertising. Ideally, all that advertising would be ineffective, because voters would know enough about how to further their own interests to make reasonable, independent evaluations of the candidates. But a nation whose educated professional class by and large fell for the 'new economy' bubble can't be expected to take very good care of itself.
Nobody else, however, is going to take care of you. Certainly not the Cheerleader in Chief.
20 June 2003: Tackling the syllabus for a summer symposium that I'll be attending the week after next at Notre Dame, my alma mater, I finally made my way through St. Augustine's Confessions. This is a very embarrassing admission, because of course I ought to have read the Confessions when I was an undergraduate - I'm certain it was on the Great Books syllabus somewhere, and Great Books was my major - and for the matter of that I ought to have read it even earlier, for it was one of the very first books that I ever bought for myself. Who knows what I expected as a fourteen-year-old, but in the event I was put off by the opening prayers, and probably never made it past the second page.
This time round, I found Confessions no less unsympathetic, and reading it was unusually effortful. Afraid that I would never get through it if I put it aside, I plowed through the first nine of the Confessions' thirteen books in one day. These nine constitute an autobiography of sorts, up to the author's early middle age, and focus on the difficulties that Augustine had in coming round to accepting the Catholic Church. He was always a Christian, having been raised by a Catholic mother and a father who was baptized on his deathbed, but the Christians of the late fourth century were a contentious lot, and in his youth Augustine was drawn to the Manichean sect. The followers of Mani, a Mesopotamian writer of the previous century, believed, among other things, that matter was evil, the creation of a Prince of Darkness almost as powerful as the God of Light. Although they rejected the Hebrew Bible entirely and regarded most of the New Testament with suspicion, Manicheans did claim to be Christian, and one strain or another of Manichean thinking has managed to infect sizeable groups of Christians throughout European history, most notably among the Albigensians of the early thirteenth century. As a philosophy, however, the Manichean outlook never displayed much internal rigor, and it's no surprise that Augustine came to find it unsatisfactory. Trained as a rhetor (a teacher of oratory, antiquity's central professional skill), Augustine was naturally exposed to the more systematic thinking of Cicero and the Greeks whom Cicero admired, and at some point, he undertook the study of Plotinus, the great third-century Neoplatonist. The fruit of this study would be the classical reconstitution of Christian thought that determined the course of Western philosophy for well over a thousand years.
Curiously, the Confessions don't tell us several key details of this pilgrim's process. Augustine does not, for example, discuss his attraction to the Manicheans, but rather presents himself almost as if he had been brought up in the sect, which he clearly wasn't. Nor does he mention Plotinus, or any other Neoplatonic thinker. These lapses seem disingenuous to me, although the first can be explained as an aspect of the book's polemical purposes - as a new bishop, Augustine had to defend himself against the objections of local Catholics who remembered his youthful commitments not only to the Manicheans but to 'worldly' success as a celebrated teacher. The second, however, is simply the suppression of an inconvenient fact. For although Plotinus clearly provided Augustine with an extremely sophisticated, internally consistent understanding of spirit, matter, creation, and even of God Himself, Plotinus himself was not a Christian. Therefore he could not serve Augustine as an authority for the project that the latter had undertaken even before his conversion to Catholicism. This project, as I say, was the application of Platonic ideas to Judeo-Christian faith. Augustine was hardly the first thinker to try to join these two so disparate traditions; at the very start of Christianity we find St Paul infusing his reading of Christ's life and mission with unmistakably Greek thinking. And it was this yoking of the Greek to Christ's story that made Christianity respectable to educated elites. But Augustine transformed the yoke into a graft.
How he came to do so is a story conspicuously absent from the Confessions. What we have instead is a prolonged postponement, as Augustine grapples, in book after book, with his two reservations about Catholicism. The first, and more famous, is chastity. By his own account a highly-sexed man, Augustine lived rather virtuously in quasi-marital fidelity with a concubine - a woman of considerably inferior social status - for fourteen years. When he sent her away, it wasn't for high-minded reasons but rather for worldly ones: if he was to advance in the world, he would have to marry a rich girl of his own class or better. In the short period between the end of this relationship and his conversion, Augustine resorted to prostitutes, uttering the famous prayer, "Lord, make me chaste - but not yet." (He was then in his thirties.) Curiously, it was the prospect of marriage - delayed because his intended was underage - that seems to have nudged Augustine toward the sexual renunciation that already distinguished the Catholic hierarchy. (And was there ever a question that, as a Catholic, Augustine would not belong to the hierarchy?) Augustine's second issue with Catholicism was the problem of evil. How to account for the presence of evil in a world created by a supremely good God has probably vexed more people than any other theological problem. The Manicheans 'solved' it, as we have seen, by compromising the omnipotence of God and attributing evil to a Prince of Darkness. For the young Augustine, evil was a substance, a thing, the creation of which was hard to explain. The stroke of genius that allowed Augustine to come round to the Catholic view was to recast evil as an insubstantial falling away from the goodness of God, a turning in the wrong direction. Thus God did not create evil, but, in allowing his human creatures free will, He made evil possible. (It should be noted that Augustine did not regard natural disasters as evil.)
Now, as everybody knows, the old Hebrew God had not scrupled to inflict evils on the badly-behaved. The Jahweh of the Pentateuch is, like the gods of the pagan pantheons, capable of caprice and vengefulness, and He is well-armed with plagues and thunderbolts. But such a Creator was wholly unacceptable to a cultured mind such as Augustine's, as, indeed, He was rejected outright by the Manicheans. From the time of Plato until the twilight of antiquity, intellectuals of every stripe seem to have shared an overriding obsession with the ideal, and Augustine was no exception. Idealism regards the created, material world, obviously imperfect, as a dim reflection of the perfect world of the spirit, and as for the world of the spirit, far from being gauzily mystical, idealism articulates it as a system of clearly interlocking parts that fit together with the finality of a jigsaw puzzle. To the educated minds of antiquity, spirit and light were almost identical - as, indeed, the famous proem of the Gospel of John reminds us. Augustine's great pretense was to find a basis for such a system within the confines of Scripture - and without reference to pagans such as Plato and Plotinus.
A corollary of this kind of systemization is, inevitably, orthodoxy. If the world reflects inherently consistent ideals, then the intelligent people in it must share a consistent understanding of its nature. Heterodoxy is anathema because it introduces confusion. How is one to choose from conflicting explanations? This is an urgent question when knowledge for its own sake is prized - and knowledge has probably never been so widely prized for its own sake as it was in the Hellenistic and early Christian worlds. Indeed, we can use the word gnosis ('knowing') to describe the philosophical objective of most schools of thought, Christian and pagan alike, current during the three centuries on either side of Christ. Given the axiom (framed most memorably by Aristotle) that the man who knows what is right will do what is right, knowing what was right assumed an importance that it's hard for us to imagine. Throughout his episcopate, Augustine never ceased engaging in battles against this sect and that in the name of orthodoxy, and at one point he crossed a fatal line and called in Imperial troops to quiet dissent. Thus began centuries of religious intolerance in the West.
By chance, I encountered a book this week that promises - I've hardly begun it - to counter not only Augustine's understanding of faith and reason but that of the various churches that have built upon it as well. Elaine Pagels's Beyond Belief: The Secret Gospel of Thomas (Random House, 2003) explores the body of scriptural writings that were dismissed from the canon (in about 180) and that would have been lost altogether had a collection of them not been stashed in a pot at a town now known as Nag Hammadi, in Upper Egypt, to be discovered only in 1945. Among the works in the Nag Hammadi library is a Gospel of Thomas. That's Thomas as in 'Doubting Thomas,' the story of whose 'conversion,' told in the Gospel of John, may well be a political lie, designed to disparage Thomas's followers. Ms Pagels writes,
I was amazed when I went back to the Gospel of John after reading Thomas, for Thomas and John clearly draw upon similar language and images, and both, apparently, begin with similar 'secret teaching.' But John takes this teaching to mean something so different from Thomas that I wondered whether John could have written his gospel to refute what Thomas teaches. For months I investigated this possibility and explored the work of other scholars who also have compared these sources, and I was finally convinced that this is what happened.
The problem, for us, is not how to decide which gospel is closer to the truth of Jesus's life and mission, but rather to learn that no such decision is really necessary. It is an imposition of the past, of the obsession with ideal uniformity that, viewed dispassionately, calls into question the entire project of classical philosophy - a project that Augustine almost single-handedly renewed under Catholic auspices. The problem, for us, is to learn to apply our hard-earned trust in religious toleration, supposedly a cornerstone of this country's foundation, to the religious precincts in which Jesus, but not His church, would have welcomed it.
13 June 2003: A friend wrote the other day with kind words about the link at the bottom of this page that will take you to former Front Pages, and then suggested that I caption each one with a subject heading or a set of key words. Would this be helpful? I'm inclined to think not. What, for example, was last week's Front Page all about? Queen Elizabeth? Hereditary monarchy? Diligent public service? I suppose it was about all of those things, but what interests me most in looking over the piece now is this statement: "I often wonder if the American dislike and distrust of public institutions isn't connected in some way to the absence of royalty, for without some ceremonial personage it is impossible to put a face on government." How would I shoehorn that into a caption?
I'm flattered that anyone would take the trouble to read superseded material. But I expect that anyone who would take the trouble isn't in much of a hurry. And what I am trying to get away from here is the kind of argument that can be reduced to a topic sentence. Look around (on the Web, at any rate), and all you see is argument about everything. Some arguments are better than others, but I doubt that many are really persuasive, for in today's polarized climate, readers greet congenial arguments with warmth and dismiss unsympathetic ones as nonsense, and logic, the armature of argument, has nothing to do with either response. I suppose we all feel a bit hectored, too, by a preponderance of arguments against. In a New Statesman essay linked to Arts & Letters Daily, Timothy Garton Ash writes, "Why define yourself by who you are against, rather than by what you are for?" Mr Garton Ash gently scolds his fellow Europeans for trying to define themselves as not American. Ironically, defining yourself by what you're not is thought by many European critics to be the besetting sin of Americans in general. It's not an unreasonable conclusion, given all the nonsense about Old Europe and 'freedom fries,' but responding to American provincialism by adopting what amounts to European provincialism gets no one anywhere. And besides, it's a case of what Freud called 'the narcissism of small differences.' Americans and Europeans overall probably have more in common than New Yorkers and Texans in particular.
The narcissism of small differences is an optical illusion that magnifies shibboleths into grounds for war, but of course there are some differences that are truly difficult to bridge, and arguments about them are usually very unhelpful. The role of women in society is a troublesome issue today because women in the West no longer abide by strictures that used to be nearly universal. To say that traditionally-minded men feel 'threatened' by women's freedom is a serious understatement. History counsels that the breach will probably be closed by the simple passage of time, as ever fewer men grow up with the traditionalist mindset. The question is whether many lives will be lost in fruitless struggles until then. Can Islam - or Judaism or Roman Catholicism, for the matter of that - truly accommodate gender equality? It's more likely that the fait accompli of gender equality will refashion the views of these faiths. As people see for themselves that gender equality does not - can not - lead to gender identity, they'll be more willing to abandon the privilege that, say, reserves the priesthood unto men. But in order to see this for themselves, they will first have to abandon the purely artificial, 'traditional' differentiations that signal difference for the sake of difference. Which comes first? Neither, so long as everyone is arguing.
What interests me is making connections, not distinctions. Distinctions are too easy. After all, no two things are exactly alike, if only because no two things occupy the same space. The art of making connections is really the art of assessing the gravity of distinctions and then reducing them to coordinates in the vast network of existence. (Again, resemblance is not identity: even a clone of you is not you - a fact that lots of people seem to have trouble grasping.) This is tricky work, and I don't think it's wrong to hold that every generation must make its own connections. For the most part, happily, we can ratify connections made in the past, but the difficult work of reordering connections that have been interrupted by change over time is vital. And it's likely to be contentious, if only because the people we call conservative prefer to ignore change while the people we call progressive tend to wish that change were greater. The world will not know real peace until these two groups accept themselves as somewhat blinkered and their opposition as equally human. I don't despair.
6 June 2003: This past week saw the fiftieth anniversary of the coronation of Elizabeth II, the only one, so far, to have been televised. The Queen has reigned longer than all but three of her predecessors, and if she remains on the throne for another fourteen years she'll beat the record, set by her great-great-grandmother, Victoria. Is there more to this than a statistic?
Elizabeth has been a model monarch, dutifully opening hospitals and swimming baths and greeting delegations of every kind of respectable British association. She is a genuine public servant. Her private life, apparently given over whenever possible to her beloved dogs and horses, idealizes the pursuit of simple things. She is said to be genuinely kind, a reputation that it would be difficult to sham in her fishbowl circumstances. She's an all-round good girl, and whatever people think of the House of Windsor or of the institution of the monarchy, there is no question that Elizabeth's personality has kept the idea of royalty viable in the United Kingdom.
That, of course, is the problem with monarchy, at least as it is practiced today. What if Elizabeth's younger sister, Margaret, had been the older girl? It's not hard to imagine the tarnish that that unhappily willful woman would have brought upon the crown long since; what's hard to imagine is the survival of the institution. Hereditary monarchy, in which succession is determined by the accident of birth order, will only work today if the right kind of person nabs the slot. Elizabeth is the right kind of person, but is her eldest son? Are any of her sons? Assuming that she outlives them, which she may well do, given her own mother's longevity, what kind of king might Prince William make? What if he has inherited some of his mother's willfulness? Come to think of it, there doesn't seem to be anyone remotely like Elizabeth in her own extended family.
Two hundred years ago, when poor George III was passing in and out of madness, the sovereign was a sacred person, still invested with the aura of divinity that seems to have accompanied kingship from earliest times. In Western Europe, there was always something a little self-conscious about this exaltation, partly because the Pope had a much better claim to it and partly because it took such a very long time for barbarian chieftains such as Clovis (c. 500) to attain the splendid majesty and political (as distinct from military) clout of a St. Louis (c. 1250). And between St. Louis and Louis XIV (c. 1675), the prestige of monarchy sank to some very swampy levels. Nevertheless, the ruler remained the strongest link between God and the body politic. By the end of the nineteenth century, however, the link to God had all but evaporated, and rulers, beginning in Britain, found a new footing for such authority as they retained. With the expansion of suffrage, and the consequent instability of governments, the monarch emerged as a figure above the fray, an embodiment of permanent values unaffected by trends and issues. This imposed an entirely new responsibility on rulers, and the rulers who couldn't rise to the occasion were ousted or worse. Nor, as the example of Nicholas II suggests, was good behavior enough; a ruler must also demonstrate common sense. You would think that the Queen would not be the only member of her family to understand this.
I hope she's not. Even to people who are not her 'subjects' - the use of which term could hardly be more anachronistic - the Queen represents an extraordinarily attractive theory of government. Unencumbered by political alliances and free from legislative and bureaucratic obligations, she is free, if that is the word, to stand on her feet all day long impersonating the State to multitudes of ordinary people. I often wonder if the American dislike and distrust of public institutions isn't connected in some way to the absence of royalty, for without some ceremonial personage it is impossible to put a face on government. Yes, the Queen gets to live in some big palaces (she has lived in Buckingham Palace longer than any other royal, Victoria included), and she certainly gets to wear a lot of magnificent jewelry. But crowns and palaces are hard to begrudge individuals who don't seem to be keen on them. Elizabeth Windsor would rather be outside, wearing a headscarf on horseback. But she carries herself like the woman most people think she ought to be, and in doing so she makes everyday celebrity look like utterly trivial tinsel. She has certainly earned my admiration. I only wish she weren't quite so exceptional.
30 May 2003: The cloning of a mule was reported in today's Times, suggesting that related animals, such as thoroughbreds, might be next. This news will raise another dust storm of deaf debate about bioethics. Those who feel that genetic intervention diminishes the natural will demand that further cloning be halted. It's a good time to try to see the world from their perspective.
It is traditionally known as the dualist view, because it divides existence into two distinct natures, the spiritual and the material, arranged on three levels, with human life, partaking somehow of both natures, occupying the one in the middle. There are at least as many different understandings of spiritual nature as there are religions, but the differences needn't concern us here, because the role of spirit is invariably to enrich human life, which dualists believe would be meaningless without it. Rigorous dualism was invented by the Greeks, and subsequently employed by St Paul and later Christians as an armature for the ideas of Jesus, and as such it prevailed in Western civilization for nearly two millennia. If it no longer commands the respect of all the finest minds, it is still very much the faith of most Americans. Even more peculiar to America is transcendentalism, which is vague about the actual identity of the Supreme Being but no less convinced than the most orthodox Catholic of the sheer fact of immaterial life. In many ways, the United States is an odd country to find in the vanguard of genetic technology.
Dualism has always been at odds with technology, because technology can change the world, and changing the world usually means changing the social arrangements in which human nature expresses itself. This in turn entails questioning the established values in which we codify our ideas about the meaning of life. Eventually, an accommodation is reached, or at least accommodations have always been reached in the past. At the moment - and this is unlikely to change anytime soon - the Roman Catholic Church remains determined not to accommodate most technologies of contraception, while many conservative Christians are equally firm in their opposition to the spread of stem-cell technology. Reproduction will perhaps always be a special case, because so many men and women - perhaps most, although I'd hate to think so - define themselves with respect to traditional reproductive roles (however unhistorical the tradition), but also because reproduction is not just the process by which new human beings are produced but also the moment in which spirit and matter are joined. This is what makes human life sacred to dualists, not some humanist respect for the pricelessness of each autonomous individual.
In practice, the dualists of the West are its conservatives, and they are afflicted by what I call conservative amnesia, or the unthinking assumption that the world when it was created and the world when they were born are the same world. From Creation until just a few years ago, the same order prevailed. Things were always done as they were done in the world the conservative grew up in - which is to say, naturally. The collapse of all human development into a recent moment fixes a single point from which to measure the thing that conservatives dread most - more than pain or death - change. Conservative amnesia explains how the aristocrats who refused to install newfangled electric lighting in their mansions produced a later and otherwise like-minded generation that is perfectly at home with light bulbs but deeply disturbed by personal computers. A second dualism arises, no less powerful than the existential one, between 'the natural' and 'the altered.'
Put these two dualisms together, and you have a philosophy that counsels man to leave things as they are, out of respect for the natural order of things ordained by God. To be sure, it is dangerous to fiddle with convention. Most alterations, like most mutations, are just mistakes. It takes a critical-mass combination of great insight and vastly accumulated experience to generate real improvements in human affairs, and even then these improvements can take generations to prove themselves. But human beings have been making changes since they climbed down from the trees. Consider, simply, agriculture. Have a look at Jared Diamond's Guns, Germs and Steel (Norton, 1997) to see the almost incredible complexity and chanciness of plant cultivation. Sadly, the conservative is not particularly interested in the odds against a viable crop of corn, say, because that problem has been solved, and corn is now part of nature - just look at Iowa! What are not part of nature - yet - are plant strains that have been genetically modified to resist disease. Even the Europeans, who by and large have abandoned existential dualism, are dead-set against this innovation.
Modern medicine and its satellite disciplines (physical therapy, nutrition, fitness technology) have encroached even further on the dualism of 'the natural' and 'the altered.' Doctors have been playing God for a good half century now. How else to describe bypass surgery or heart transplants? The treatment of cancer and infectious disease involves a manipulation of 'the natural' that is even more philosophically problematic. It is impossible to fix a point at which the 'natural' body is rendered 'altered' by medicine - unless, of course, any medical intervention at all leads to alteration. This problem has been conveniently elided by resort to a thoroughly unexamined notion of health, seen as a 'natural' state to which medical intervention restores the body from the 'alteration' of disease. Soon, perhaps very soon, it will be possible to change not just one individual's genetic structure - so as to eliminate the incidence of a disease, perhaps, but perhaps to achieve some more cosmetic effect - but that individual's germ stream. That, and not cloning, will mark an irreversible transformation of the struggle between 'the natural' and 'the altered.'
Another object of conservative amnesia is price. Things should cost what they have always cost - that is, what they cost when the speaker was young. And rewards should be just as difficult to earn. An interesting example of conservative dismay is the controversy surrounding aptitude-test training. The introduction of aptitude tests was premised, falsely in my view, on the expectation that the tests would assess the 'natural' intelligence of the test-takers. To the conservative mind, having recourse to a special cramming course such as the ones popularized by Kaplan Inc. is tantamount to cheating, because teaching kids how to take the test 'alters' their original position, which is no longer the naive one espoused by the test-makers. Or consider bodybuilding. Bodybuilding is supposed to be hard work - no pain, no gain. But the work is hardly natural. The weights and the machines that have been designed to enhance human musculature tend to rule out all unnecessary or, worse, harmful work. 'Natural' work, the kind known to peasants laboring in fields, doesn't make healthy bodies.
This dualism makes for a lot of arduous juggling, and I'm afraid I don't see the point of it. 'The natural' is just a bad idea. While I don't deny the existence of Spirit - I'm agnostic on the subject - I don't look for meaning in the immaterial, and I believe that without change we would lose life along with meaning. Meaning is something to strive for, not to try to harmonize with. And I'm pretty sure that raising a thoroughbred is such a complex affair that cloning Funny Cide won't guarantee any safe bets.
23 May 2003: Commenting on American Francophobia in the current issue of France-Amérique, Denis Lacorne, a French specialist in American studies, regrets the lack of a Franco-American contingent large enough to lobby, if not for the policies of Messrs. Chirac and Villepin, then at least for French business interests. It is indeed curious that of all the European nations, France's contribution to the American melting pot has been proportionally tiny. Most Americans of French extraction trace their ancestry back to Canadian immigrants of the seventeenth century, and their sense of being French is largely uninformed by the tumults that have visited L'Hexagone since then. Upsetting as France's many coups and revolutions have been, they've provided the French with both hope and, in the long run, prosperity, two desiderata that so many other groups have despaired of finding at home. The French have had good reason to stay put. But this does leave them without much of an American presence.
But I suspect that American Francophobia is simply a facet of populist anti-elitism, aimed not at the French themselves but at cosmopolitan Northeasterners like me who can read a French menu. I must confess that I can't conceive of a sound education that does not include some training in French. This is an old prejudice that dates back at least to the Grand Siècle, when Louis XIV made France the focal point of Western civilization, and generations of fine artists began to produce work of a classic aplomb not seen in almost two thousand years. Certainly since the Enlightenment, educated people everywhere have not only felt the obligation to learn French but actively enjoyed the richness of French high culture, which until 1945 was unquestionably a new international style. Only recently in France have exponents of serious excellence adopted something of the apologetic tone so familiar in England and the United States. Until recently, French civilization was an unabashedly top-down affair. It is very difficult to walk through central Paris without concluding that the development of a human space that so beautifully meshes the public and the private is best not left either to marketplace chance or to endless debate. Possibly the top-down model works only in France; New York's Robert Moses was a very wrong-headed variant of Baron Haussmann, so obsessed by mad road-building projects that large public works have become, thanks to the inevitable reaction, all but impossible.
But most Americans, like most people everywhere, are innocent of a cultural reverence for France, and the Francophilia exhibited by affluent Yankees must look like evidence of a divided loyalty at the best of times, and of treason in this moment of international disagreement. Compounded by a selective grasp of history that is almost worse than total ignorance, Francophobia exploits a long tradition of Anglo-Saxon contempt and horror - of French smells, mostly. It is, in short, nothing new, and nothing, I think, to get very excited about. In fact, I'm rather grateful for it. Cultural differences that stop well short of violence do no harm whatsoever, but on the contrary make for a better world. On the whole, I would rather be a Franco-American today than an Arab-American; no one is proposing to put the former under surveillance. I do wish, along with M. Lacorne, that there were more of them!
16 May 2003: In a recent cover story for The Nation, "The Right's Grand Ambition: Rolling Back the Twentieth Century," William Greider writes, "Maybe what the right is really seeking is not so much to be left alone by government but to use government to reorganize society in its own right-wing image." The only thing about this statement that I disagree with is the maybe. But the conservative agenda will never succeed so long as the country remains a democracy, because it is simply not in most Americans' interest to submit to the tyranny of plutocrats and robber barons. What frightens me is the success with which the right has clouded and contorted the issues, so that ordinary voters misunderstand the implications of their votes. Mr Greider continues,
Autonomy can be lonely and chilly, as millions of Americans have learned in recent years when the company canceled their pensions or the stock market swallowed their savings or industrial interests destroyed their surroundings. For most Americans there is no redress without common action, collective efforts based on mutual trust and shared responsibilities. In other words, I do not believe that most Americans want what the right wants. But I also think many cannot see the choices clearly or grasp the long-term implications for the country.
I am more and more convinced that the first order of business is to retire the term 'liberal,' which is a very tattered flag no matter how admirable the principles it represents, and recall 'progressive,' the term used by reformers a century ago. There's a self-interestedness about being progressive that liberalism appears to lack. Because so many great liberals have come from the affluent elite, seeking only to serve and needing nothing for themselves, the term 'liberal' has perhaps inevitably developed connotations of condescension and noblesse oblige. But progressives are in it for themselves, just as conservatives are. Where progressives differ from conservatives is in believing that self-reliance is of very limited use, and nowhere near as effective as cooperation. Self-reliance may even be a chimera, an illusion bought into by lucky people who take pleasure in believing that they deserve what they've got. Sure, few people get rich without a lot of hard work, but hard work alone hardly guarantees wealth. Corporate executives, it is said, tend to be taller than average, and we all know that intelligence is very unevenly distributed. There's something actually stupid about claiming that you've made your own way when you were born both tall and smart. Why, indeed, does this matter at all, this self-reliance? Beyond the blush of vanity, the question is an unnecessary conundrum. It's far more important to ask whether you've done your part.
There are lots of parts to play in a society as complex as ours, and the first thing you need to do is to identify the parts that you can do well. Some will pay better than others, and some don't pay at all, and yet all contribute to the commonwealth. Progressive people seek to bring 'market forces' increasingly in line with social utility - which is another way of saying that they seek to minimize income disparity. Not to do away with it altogether by any means, but to minimize it where possible. Progressive shareholders, for example, seek to cap the salaries of the executives that they've hired to run their companies, instead of rewarding them for short-term or one-time-only maneuvers that bump up the share price for a minute or two. They see the connection that leads from environmental mindfulness to the elimination of disease. Progressive citizens understand that racial bigotry on the one hand and the reaction of bigotry's victims on the other both make for a lot of very unnecessary trouble. They appreciate the world's fascination with their rich and powerful land, and yet do what they can to neutralize the venom of resentment. For progressives envision an even better America, and that's why they take a long view. They're not about to let nonsense about self-reliance interfere with leaving their children a healthy country.
Mr Greider concludes:
The Democratic Party, alas, is accustomed to playing defense and has become wary of "the vision thing," as Dubya's father called it. Most elected Democrats, I think, now see their role as managerial rather than big reform, and fear that even talking about ideology will stick them with the right's demon label: "liberal."
Time to revive the Progressive Party!
9 May 2003: When it came out, two years ago, I bought a copy of Barbara Ehrenreich's Nickel and Dimed: On (Not) Getting by in America (Metropolitan Books, 2001), and read the first part, about waitressing in the Florida Keys. One of our most important journalists, Ms Ehrenreich decided to try to support herself in various entry-level jobs. This meant leaving her relatively comfortable professional life behind and finding housing and transportation that such jobs could pay for. She was unable to make the experiment work - though of course it was no experiment for the people among whom she found herself working. Her account of making do on tips while living in a trailer was so depressing that I couldn't bring myself to continue, and I set the book aside until just the other day, when the prospect of meeting the author prompted me to gobble down the rest of it. A lot had happened in the intervening years, and I'd certainly become thicker-skinned about serious reading. I found the rest of Nickel and Dimed bracing and funny, if hardly more optimistic.
The nub of Nickel and Dimed is that entry-level or minimum-wage workers cannot afford decent housing. Ms Ehrenreich found this out for herself, but according to statistics developed by the National Low Income Housing Coalition (an organization whose Web site Ms Ehrenreich mentioned in response to my what-can-we-do question), the average fair-market rent for a two-bedroom home requires an hourly wage that is nearly three times the federal minimum wage. This means that two people working at low-paying jobs cannot house themselves conventionally, but will have to resort to trailers, seasonal motels, the homes of family members, or shelters. If your heart is not hard enough to write off all minimum-wage earners as unmeritorious and undeserving, then the picture presented by the NLIHC is one that cries out for correction.
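The arithmetic behind a "housing wage" of this kind is worth spelling out. The sketch below assumes the standard 30%-of-income affordability rule and full-time work (40 hours a week, 52 weeks a year); the $800 monthly rent is an illustrative figure of mine, not a statistic quoted by the NLIHC:

```python
# Rough sketch of the housing-wage arithmetic. Assumptions (mine, for
# illustration): rent should not exceed 30% of gross income; full-time
# work is 40 hours/week for 52 weeks; monthly rent of $800.
FEDERAL_MINIMUM_WAGE_2003 = 5.15    # dollars per hour
AFFORDABILITY_SHARE = 0.30          # rent capped at 30% of gross income
FULL_TIME_HOURS_PER_YEAR = 40 * 52  # 2,080 hours

def housing_wage(monthly_rent: float) -> float:
    """Hourly wage at which monthly_rent is exactly 30% of full-time income."""
    annual_income_needed = monthly_rent * 12 / AFFORDABILITY_SHARE
    return annual_income_needed / FULL_TIME_HOURS_PER_YEAR

wage = housing_wage(800.0)
print(f"housing wage: ${wage:.2f}/hr")                                   # $15.38/hr
print(f"multiple of minimum wage: {wage / FEDERAL_MINIMUM_WAGE_2003:.1f}x")  # 3.0x
```

At an $800 rent, the required wage works out to almost exactly three times the 2003 federal minimum of $5.15, which is the proportion the NLIHC's figures describe - and why even two full-time minimum-wage incomes in one household fall short.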
Why is there an affordable housing crisis? Aside from the ever-widening income disparity, which claims more existing housing stock for the affluent (gentrification can be a nightmare for the inhabitants of a shabby neighborhood), there is the long-term impact of promoting home ownership, a government policy as old as the income tax itself and a major sector of the national economy since World War II. The colossal homeowners' subsidy that takes the form of a deduction for the payment of interest on a mortgage has made it easier to own a home here than it is almost anywhere else on earth, but with the expectation that all who can afford to do so would buy their own homes, there came a corresponding deprecation of housing built for renters. The association of renting with socially-challenged low-income earners fostered the inevitably self-fulfilling notion that occupants don't take good care of premises that they don't own. Finally, public housing in the twentieth century was cursed by the popularity, among urban theoreticians, of the utopian but soulless ideas of Le Corbusier, which gave us the ghetto of the housing project.
When 'welfare-as-we-know-it' was axed in 1996, it was widely believed that former recipients would be able to support themselves with low-paying jobs. That they cannot is not a reason to return to the old welfare system, but it does give us every reason to rethink public support for social decency. It's a long way from simply handing out money to designing a program that aims to provide everyone who needs it with affordable, decent housing. I don't know what the government is good for if it can't foster this goal and alleviate a serious national blight.
The least you can do is take a minute to check out the NLIHC's Web site, at www.nlihc.org.
2 May 2003: Hurrah! The war in Iraq is over! Saddam Hussein hasn't been accounted for, and neither have his weapons of mass destruction, but military opposition to American troops has ceased. As our intervention in Afghanistan ought to have taught all thinking people, the American mission would come down to this: the war would be over when resistance to our invasion melted away. Installing a US-friendly person as the nominal head of local affairs (in the case of Afghanistan, 'local' means 'Kabul and environs,' no more), we would hail our troops home to a hero's welcome. The Administration could rest assured that no one except nigglers like me would fault it for having altogether failed to accomplish its trumpeted prewar objectives. Is it so hard to remember six-week-old headlines?
I've been reading John Lewis Gaddis's book-length essay, The Landscape of History: How Historians Map the Past (Oxford, 2002). I recommend it particularly to anyone who has grappled (if only in school) with the status of history as a science. Mr Gaddis enlarges what appears to be an intramural dispute with social scientists about the nature of historical variables into a grand comparison of the old, linear and predictive science formalized by Isaac Newton and the new, complex and analytical science engendered by computers and chaos theory (think 'fractals'). He points out that historians have been working with the concept of 'self-similarity,' so central to the new science, for almost as long as there have been historians, and faults reductive social scientists for managing to predict nothing that isn't obvious to begin with. Because I'm already persuaded that Mr Gaddis is generally right, I'm not going to pursue the social scientists' side of the argument; social science looks like a doomed discipline anyway. I want to spend more time with The Landscape of History, especially after I've read an older book that Mr Gaddis makes a great deal of, Marc Bloch's The Historian's Craft. For the moment, I'm wondering simply this: who cares? Who cares about history?
A look at the history department of any large Barnes & Noble store - usually adjacent to, but distinct from, the biography section - will reveal that it houses two completely different kinds of book in very unequal proportions. The lion's share of the shelf space offers books about war, particularly the American Civil War and the Long World War of 1914-1945. Almost all the photographs in these copiously-illustrated books show either men in uniform or weapons of some kind, or both: 'military history' does not cover the background of war, or its aftermath, but just the battles. The wide range of quality and format reminds me of the cookbook section, where once again I have to ask, who's buying all this stuff? Wider-scoped history books take up a more modest amount of room, dominated by perennial sellers that are always in stock. None of this suggests that there is a large market for books that are not focused on armed conflict, capsule surveys, or remarkable individuals - a good negative definition of 'history.'
Not too long ago, of course, history was all about battles. Human life was conceived as alternating between brief periods of 'peace,' in which nothing happened but harvests and other regularities, and longer periods of something else, sometimes plague but more often war, climaxing in decisive confrontations that usually entailed political rearrangements (and a round of decapitations). At the beginning of the last century, schoolbook history was still a series of the dates of treaties. But grown-up historians had long before broken with the Greek and Latin chroniclers of antiquity, and were now grappling with an entirely new problem: identifying the role of evolution in history. 'Whig' historians believed that everything changes for the better, and that big historical transformations follow long gestations in which everything 'tends' toward the future. Sophisticated historians, aware that the Whig view is flatly contradicted by the evidence of natural evolution, still labor to find a role for intent in human affairs. This isn't the problem of free will, but it's close: if Henry VIII, in breaking with Rome, didn't intend to join the Reformation movement, then what was he thinking? Evidently, he was thinking that he could replace the Pope with himself and leave everything else as it was - except, of course, those glitteringly wealthy monasteries, which he 'sold' to his 'friends.' But was there more to it than that? And was it all occasioned simply by the failure of Anne Boleyn to produce a male heir? We will never know for sure, any more than we'll know what exactly we ourselves were thinking when we asked that pretty girl to the junior prom thirty years ago; but because the impact of Henry VIII's decision was orders of magnitude greater than that of ours, we'll probably never stop asking.
And whom do I mean by 'we'? Ideally, I'd mean 'all educated people,' which, in its ideal turn, would mean 'everybody.' Why? Because, as John Lewis Gaddis compellingly puts it, we're always living at the moment of singularity (think 'black hole') when all sorts of variables that might go one way or another - the weather, the price of coffee, Tiger Woods's latest score - are suddenly fused by contemporaneity into fixed, interdependent fact: possibility becomes history. Most of the variables are important only in certain narrow contexts, but some variables - I think most often these days of media campaigns - have an outsized effect on all the little ones that impinge on our lives. As citizens of a democracy, we have the power to react to what happens to us, but in order to understand what has happened - in order to act intelligently, without regard for the shouting of bullies - we have to have a sense of leading variables, and nothing hones this better than reading in history. Very, very few people will ever write history, and perhaps the ones who do are the only people who really need to read The Landscape of History. But I think that almost everyone would find that it makes the world ahead look a bit different - denser and more complex, and much more open to possibility.
25 April 2003: Several weeks have gone by since Rachel Corrie was mowed down by an armored bulldozer in the Gaza strip, but I haven't been able to come to any conclusion about her determination. That's what got her killed, the determination to stand fast in front of a physician's home that the Israeli military was about to destroy. (Come to think of it, I don't know what became of the house.) As Susan Sontag said recently, courage per se is not a virtue. To applaud it, we need to know what kind of courage we're talking about. And I don't know what kind of courage made Rachel Corrie kneel down in front of the bulldozer. To say that she laid down her life as a martyr doesn't really clarify anything, especially in a conflict notorious for suicide attacks. It's true that, unlike the Palestinian bus bombers, she intended to take no life but her own. But the intransigence of her gesture seems as provoking as it was provocative. I wouldn't be surprised to learn that most Israelis, while feeling deep sympathy for the Corrie family, regard Rachel as a reckless child who almost wickedly threw away her life in opposition to what so many of them see as a struggle for the survival of their nation. Did her death change anyone's mind?
The disgrace of the Palestinian conflict is that it has been allowed to play out at the lower levels of society without any restraining influence from political leaders. Israeli settlers and Palestinian bombers have simply taken matters into their own hands, and their respective governments have backed them up. This is a very twisted kind of democracy, but it seems to be the kind of democracy that's in place almost everywhere. Instead of leading their people (as the assassinated Yitzhak Rabin tried to do), elected officials around the world consult polls and seek only to implement majority views, however wrong-headed. They confuse King Kong-like chest thumping with leadership. In the Palestinian fiasco, there are no heroes, only craven populists, working in concert with corrupt, triumphalist media.
Rather than interfere with soldiers in the line of duty, those who want to help put a stop to the disasters of the Middle East - or to the Administration's war on genuine American values - ought to interfere with their ultimate commanders, the politicians - and with the media as well. Critics of administrations here and in Israel and Palestine ought to do everything that they can think of to convince their neighbors that, first of all, they do support the troops. This is a lesson that I thought we'd learned from Vietnam. What started out as the principled opposition to a dubious military engagement degenerated into a shouting match, heavily tinged by class tensions, between students and soldiers (or, since the soldiers were far away, soldiers' proxies, patriots like Bob Hope). It was pretty clear that, whatever else motivated their opposition, the students didn't want to don uniforms and fight in the jungles. This compromised the anti-war effort right down to the evacuation of Saigon, and left a bitter wound that the war in Iraq appears to have ripped open. Today, those students have grown up into middle-aged opponents of the war who wouldn't be drafted even if the draft were reinstated, but the class distinctions between soldiers and their contemporaries in school are as vivid as ever; we're told repeatedly that today's armed forces are made up of young people for whom military service is the only avenue to any kind of higher education. Elementary decency requires today's anti-war protestors to be sure to wave their flags.
What protestors ought to demonstrate is that the American news industry is stoking some very base impulses in the general population, mostly tribal in nature. The wrong in this is that America is supposed to be a land beyond tribalism, a land where minority views are assiduously protected and the cosmopolitan vision of America's finest leaders is kept on inspiring display at all times. Protestors must figure out how to persuade their families and coworkers that the news from the small screen is largely tendentious nonsense, and anything but trustworthy, objective reporting. So long as our media will pick up lies like the one that Karl Rove is said to have planted about John McCain in the last presidential election (i.e., that McCain had fathered a black child out of wedlock, when in fact he and his wife had adopted a Bangladeshi orphan), all the grains of salt in the kitchen aren't enough. I'd like to have seen Rachel Corrie try to stop a misleading newscast. Much harder to do, in most ways, than simply kneeling down in the path of destruction. But the effort would have been much less divisive. Is it true that Palestinian and Lebanese mothers are naming their newborn daughters after Rachel Corrie? That sort of thing doesn't help.
Has anyone heard that the Rachel Corrie story has been appropriated by the networks or Hollywood? I didn't think so. Not yet. They're all too busy with poor Jessica Lynch, whose story will go into production whether she wants it to or not. I'm glad that Ms Lynch is alive and on the road to recovery. I wish that Ms Corrie were, too. I know that she meant well, but I'm as unsure as ever that that's enough.
28 March 2003: Click here.
21 March 2003: Click here.
14 March 2003: This week's reading started off with two striking associations of the Administration's ideas for the Middle East with the excitements depicted in the fiction of Tim F. LaHaye. On the Op-Ed page of The New York Times on Saturday, March 8, Bill Keller remarked, with a facetious irreverence that would have been funny in a more peaceful context, on the reason for the upwelling of fundamentalist Christian support for Israel:
(Many evangelicals love Israel because in their Biblical end-of-days scenario, the gathering of the Jews in the Holy Land is necessary for the Second Coming. Inconveniently for the Jews, the story calls for them to either abandon their beliefs or be exterminated in time for the great rapture.)
Hendrik Hertzberg, meanwhile, took aim at those who are sure that the impending invasion of Iraq is right:
[The war lobby's] most reliable supporters, besides the President, are the sort of evangelical-Christian conservatives who contemplate Armageddon with something like rapture.
Mr Hertzberg was essaying a dismissal of sorts of those who are sure about the war, sure either way. "On the other side [from the hawks] are traditional pacifists and the sort of angry leftists for whom any exercise of American military power, because it is American, is wrong." Well, pardon me, but I'm neither a pacifist nor an angry leftist, and I'm sure, too, that the invasion will turn out to be a mistake, in one if not both of two possible ways. First, it will destabilize the Middle East. This is eminently foreseeable. If there were nothing more to it, the problem of Kurdish sovereignty ought to dissuade us from making the first move; one far from unlikely outcome will tear Turkey away from Europe and revive an ancient hostility that a century of modernizing goodwill has done little to extinguish. Second, it will upset the Western Alliance at least severely enough to end cooperation between the security forces on both sides of the Atlantic at a time when cooperation will be needed to save lives from terrorism.
The government has warned of the likelihood of suicide bombing on American soil in the wake of the invasion of Iraq, but its response to the inference that perhaps the invasion ought to be forestalled appears to be tough. And I would expect no less of an Administration headed by a man of the President's religious conviction. Indeed, so sure is he that he's right that he no longer operates on the traditional plane of Republicans and Democrats. He and his staff have ascended to the higher, purer air where Rapture, if not quite a reality, can at least be envisioned. Because my poor mind can no longer cope with explaining the Administration's behavior in terms of the American political tradition, I have decided to rechristen the President's party. Having gutted the independence, as well as the moderation, of the old Republican Party - my nephew snorted derisively when I asked what had happened to Rockefeller Republicans - the President has replaced it with the Rapturean Party. The Rapturean platform has but one principal plank, one that curiously would delight both Platonists and papists: What happens on earth is only a prequel to heaven's eternity. Raptureans roundly reject the secular foundations of American polity; we are all Christians, we all face God's judgment, and the government is an instrument of divine intervention. Don't expect Raptureans to flinch when accused of arrogance or high-handedness or unilateralism or the bullying use of hyperpuissance. Such slings and arrows only stiffen their resolve, for, as the President put it, "Either you are with us or you are with the terrorists."
This horrific rhetoric, so Stalinist in its brazenness that the irony of it almost sucks the oxygen out of the air, precludes discussion and disagreement; it is closed to uncongenial aspects of reality. I have absolutely no hope or expectation that Mr Bush or anyone currently working for him will ever wake up to the complexity of life, but I pray, in my own secular way, that Americans will take a good look at the Rapturean Party in the coming months and decide that it is, if nothing else, too radical, too extreme for political responsibility. The great well of support for a Rapturean White House is, after all, the half-confused, half-indifferent inattentiveness of the American electorate. I certainly do not think that the only cure for this country's political sloth is a rash of suicide bombing. And if it would guarantee our safety from terrorism, I would hand the Raptureans the next national election. In a heartbeat. I'm proud to say that my principles are dust in the face of suffering.
7 March 2003: From time to time (it might be monthly), the New York Times Book Review runs a cartoon strip by Mark Alan Stamaty, entitled Boox, that ordinarily lampoons the publishing world's wretched excesses. About a month ago, it lampooned a book instead: David Frum's The Right Man. (It's in this book that we learn that speechwriter Frum coined the phrase axis of hatred, subsequently - and idiotically - upgraded to axis of evil.) Here are three lines of balloon dialogue from the February 2 chapter of Boox:
TV Interviewer [to Frum] : "Though your book largely praises the President, you do say he is 'often uncurious [sic] and as a result ill-informed'."
Frum: "Yes, but I also say that every president has his flaws, while George W. Bush's assets of being tenacious and courageous outweigh his shortcomings and make him a great president."
Viewer, to companion: "That makes no sense to me. 'Tenacity' and 'courage' in an 'ill-informed' president could simply result in stubborn enforcement of bad policies, like knee-jerk tax cuts, a lone-cowboy foreign policy and a profligate energy policy."
I've felt like that viewer since Mr Bush announced his candidacy - long before the isolationist of 2000 dreamt of becoming the regime-changer we've gotten to know since. Without imagination and broad knowledge, tenacity and courage aren't particularly arduous virtues; it's only when you see things from many points of view that resolving upon one and sticking to it poses a challenge. Before or after the terrorist attacks, George W. Bush looked and looks like a martinet, not a leader.
In a richly-layered cover article in the current Atlantic Monthly, Richard Brookhiser (a Senior Editor at William F. Buckley's National Review) appears to address the question of the President's imagination, but in the end he does no more than raise it. Everyone has an imagination, of course; without imagination, even the President would not be able to read, or for that matter speak English even as poorly as he does. But I am almost certain that his imagination could be described as unconscious. It is not a faculty that would appear to get much encouragement from its owner. Mr Brookhiser, however, is not interested in the President's gift for speculation. The true purpose of his article is curiously bifurcated. On the one hand, he extols the use that Mr Bush has made of his training at the Harvard Business School. This line of comment appears to be aimed at the general reader, and intended to correct the impression that the President is not intellectually equipped for his job. On the other hand, Mr Brookhiser expresses reservations about the adequacy of the President's range of advisers. This might be a sly way of suggesting that the President is incapable of imagining anything not presented by the people around him, but it certainly looks like a message, from one faction of the consensus-hugging Right to the White House, suggesting that there are conservative points of view that the President is not taking into account. The suggestion is floated, for example, that Saudi Arabia might be a greater enemy than, say, Iraq - but that Messrs Powell and Cheney and Ms Rice haven't thought of this.
What makes me sad is that Michael Walzer's Op-Ed piece in today's New York Times didn't appear a year ago, when it would have been as much to the point as it is now, but would also have helped to clarify the then still rather dazed thinking of those of us who opposed the ever-impending pre-emptive invasion of Iraq. Mr Walzer reminds us that we have been fighting a war - he calls it the 'little war' - ever since the troops came home from Operation Desert Storm, and that we've been winning this war, slowly but surely, by interfering with Iraq's shipping and airspace. Mr Walzer suggests extending the no-fly zone to cover all of Iraq, an idea that occurred to me months ago, which only goes to show what a no-brainer it is. Maybe it's a bad idea, but it doesn't appear to have been publicly debated. Perhaps if we were to urge France and Germany and even Russia to help us fight the little war, they would find it embarrassing to decline - for, after all, this is a war without many casualties. It has put teeth in the inspection process; it has demonstrably contained Saddam Hussein.
Unfortunately, we're saddled with a very stubborn Enforcer-in-Chief of bad policies.
28 February 2003: Greetings from Washington, D.C. It's an unlikely place to seek rest and relaxation, but I find its wide streets, low-slung buildings, and arresting vistas (often of the Capitol's dome) intensely tonic. I can't imagine a city less like New York.
I have spent the afternoon at the National Gallery, where two important shows are currently on view. The smaller is devoted to canvases by Thomas Gainsborough, the larger to works in several media, but mostly paintings, by Edouard Vuillard. The Gainsborough is an all-star show, small but select. Mrs Siddons is there, and so are the before-and-after pictures of Elizabeth Linley (the earlier, a joint portrait with her sister, Mary; in the later, she's a winsome, almost Victorian beauty, tinged, one suspects, by regret for the singing career that her husband, the playwright Sheridan, forced her to abandon after she eloped with him). Mr and Mrs William Hallet take their bandbox-fresh Morning Walk, and titled ladies, or ladies about to be titled, nod and simper across the galleries. There are several fine landscapes, too, poised between the old romanticism of the Dutch Masters and the impending romanticism of Constable, but it's the grandees who sparkle, putting their best feet forward and defying you to guess the truth.
Vuillard has been a favorite of mine for a long time, but since so many of his paintings are still in private collections, I hadn't seen very much of him before today. I was bowled over by his work's color and composition, which convey a lightly-worn air of mystery that, if you look a little harder, takes on an edge of real menace. Vuillard was clearly very ambivalent about domestic life. Unlike Gainsborough, whose portrait figures for the most part occupy stylized outdoor settings utterly unlike the studios in which they actually sat, Vuillard painted people in their homes. When these homes belonged to the grands bourgeois, their allure, in Vuillard's hands, is tremendous. Captivating in an altogether different way is the portrait of Jeanne Lanvin, the couturière, in her office. This work embodies the transformation of women's possibilities in the twentieth century. Nowadays, of course, the sitter might very well be a banker.
I came away with two tons of catalogue. The Vuillard catalogue is thicker by far than Janson's History of Art (any edition), and the first chapter, covering the painter's earliest work, is off-puttingly entitled 'Permanent Transgression.' As if to make up for this, the quality of the reproductions is the highest that I've ever seen, and most are large enough to give a real sense of the original. So much to the good, because the book will have to do for the foreseeable future. I won't be able to get back for another visit before we go home. One of the pictures that I liked best, In the Salon, Evening, Rue de Naples, does belong to the National Gallery, but the dazzling 'Toiles de Gênes' Boudoir, which shows the wife of the owner of Houbigant tucked away upon an opulent daybed, will disappear back into a private collection after the show's last stop, in Paris next year.
Actually, the catalogue is full of pictures that didn't make it to Washington. Comprehensive exhibits like the Vuillard show may in future exist only in catalogue form, with no two actual installations - and there will almost certainly be at least three - showing all of the same work. Terrorism may have ended what has surely been the most democratic era in art appreciation, with masterpieces - even the Mona Lisa - flying about the world and making it possible for thousands of people to see paintings that they'd never encounter otherwise. Museums like the Frick and Barnes Collections, which don't lend their holdings at all, will appear less idiosyncratic and contrarian. Safe, but sad.
Since I'm on vacation, I'm not disposed to think very hard about the difficulty of assessing a cartload of unfamiliar paintings, and the related issue of my being spoiled to death by the proximity of my home to the Metropolitan Museum of Art, which allows me to see interesting and important pictures over and over and over again. For the moment, let me say that good pictures never become tiresome (this is true of all creative work), and that I wish I could go back, not tomorrow, and not the next day, but, say, next week some time, and have another look at Vuillard. It would make writing about him a little more intelligent and a lot more legitimate.
And what's this? Kathleen tells me that there's a conference that she's thinking of attending in Montreal. If I were the ideal husband, I'd accompany Kathleen to every out-of-town conference, but I'm a husband who hates to travel. Wouldn't you know, though - the Vuillard show will be there at that time. Train or plane?
21 February 2003: Today's email brought an anti-war petition in the form of a chain letter; I was to sign it and pass it on to as many receptive friends as I could think of. So I affixed my name to the bottom of the list - I think it was No. 633 - and sent it to four other people, who, I hope, won't be angry with me. The gesture was about as slight as a gesture can be, but it put me in the anti-war camp. Since I'm about as old-Europe as an American can be, this ought to surprise nobody. In the current issues of The Nation and Harper's, you can read Jonathan Schell's arguments against war in general and against pre-emptive war against Iraq in particular. I find them persuasive. The opinions of the war's most intelligent hawk, Kenneth M. Pollack, are widely available, too; an Op-Ed of his appeared in the New York Times today. His arguments do not persuade me. One's position on this matter is not, I fear, a matter of arguments at all, but rather one of glandular disposition. While it's undoubtedly simplistic to pinpoint testosterone as a decisive factor, I cannot read the amiable Defense Secretary's remarks without smelling a high-school locker room. It was an interview with Robert Kagan, again in the Times, that just happened to cast light on the reasoning that had hitherto seemed impenetrable to me. What Mr Kagan actually said isn't important here, because I was reading between the lines. Here's what I read:
The United States is the most powerful nation on earth, and a force for good. Saddam Hussein is a dangerous menace to his countrymen and, possibly, a nuclear menace to his neighbors. Because the United States has the power to depose Saddam Hussein, and to effect 'regime change' in Iraq, it would be wrong for the United States not to depose him, and to fail to bring freedom and democracy to the Iraqi people.
And in between those lines, I read this:
It is (all) right to wage war in order to prevent war.
This inner kernel of meaning will not sound entirely paradoxical to anybody who has heard about fighting fire with fire. But fighting fire with fire requires an adjacency altogether lacking between the United States and Iraq. It also requires an actual fire, not the possibility of one. It is difficult to imagine any administration's entertaining, prior to those of Ronald Reagan, the idea of a pre-emptive war in the other hemisphere. Perhaps that's precisely what today's hawks have uppermost in mind. This country has usually not been ready for war. Its inclination has been to resist the call to arms until attacked. The hawks will probably claim that the United States has been attacked again - but nobody outside a (possibly non-existent) core of true believers credits the idea that Saddam Hussein had much if anything to do with the attacks on New York and Washington in 2001. The claim isn't important to either side, though, because the hawks are insisting that we destroy Saddam Hussein before he can attack us. Or our friends, of whom, strange to say, we have many.
That, gentle reader, is the New World Order. As such, it can't be debated with principles established in the Old. Attacking pre-emption with arguments rooted in the concepts of sovereignty and international relations that were born in 1648 and reaffirmed in 1815, 1919, and, one would have thought, finally, in 1945, won't get anybody anywhere, because those arguments are old. They are history. These are the President's words.
I very much hope that we oldsters contrive to retrieve the car keys from the current administration before the youngsters discover that drinking - drinking pride and glory - and driving don't mix.
14 February 2003: Click here.
7 February 2003: In an editorial-page piece in The New York Times last Saturday, Brent Staples reviewed Condoleezza Rice's position, insofar as it can be discerned, on affirmative action. He noted the insulation that her father's aloofness from civil rights struggles had conferred upon her upbringing, and acknowledged that he, too, had been brought up to take full responsibility for his own success. Unlike Ms Rice, however, he no longer believes that race had nothing to do with determining the course of his life. Just as there were people who, as he discovered in his twenties (so well had his parents protected him from the corrosion of bigotry), really believed that his skin color must have an adverse influence upon his intelligence, so, too, early affirmative action had helped him enter mainstream academia, and earn a Ph.D. at the University of Chicago. "At a time when black Americans were denied basic fairness across the board," he writes of the 1950s and 1960s, "the theory that hard work could trump racism was both noble and patently false."
Conservative blacks are not alone in denying the impact of group identity, but at least one can understand their motivation, which appears to be not so much a matter of taking all the credit for their achievements as one of not allowing the abject failure of so many others in the group to excuse the slightest personal shortcoming. Far less intelligible is the unconsciousness, common to so many white Americans brought up in comfortable homes, of the advantages that have floated them over the countless obstructions that impede most human progress.
Privilege has always been a cause for embarrassment in this country, but so long as affluent parents dote on their children there will be no end to it, so perhaps it would be useful to accept, if not inequalities themselves, then their persistence. This acceptance would have to take many forms. It would reconcile some privileged children to their backgrounds, and ease the pressure to declare factitious and perverse independence from their families. It would force everyone interested in education to confront, head-on, the massive and inexcusable inequities of a school system tied to property-tax receipts. And it would put an utter end to the credibility of any college-educated executives' claims that they got where they are on their own.
One of the chapter headings in John P. Marquand's Point of No Return is "Don't Let Anyone Tell You, My Young Friends, That There Is Any Such Thing as Luck..." This comes from a politician's address to the hero's eighth-grade graduating class.
Oh, no - there is no such thing as luck, my dear young friends, not for American boys and girls. As you sit here, not so far from entering the contest for life's prizes, you are all starting even because this is America, no matter what may be your religion, or race, or bank account. There is no grease for palms in America. The only grease is elbow grease. Look at our greatest men, born on small farms in small houses, boys without a cent to their names. Did they get there by luck? Oh, no. They got there by making the most of opportunities which are open, thank God, to every American boy and girl.
This passage tells terrible lies. It tells those who won't succeed - the children who will go on to have the same humdrum lives that their parents have known, and whose social circles will remain almost exactly as circumscribed - that their not winding up at the top of the heap is a failure, when in fact they never had a genuine chance. Altogether worse is the endorsement of smug satisfaction for those who, while not exactly getting ahead, simply maintain their parents' superior social position. Perhaps they were diligent and virtuous. But their diligence and virtue met vastly less resistance than, say, the diligence and virtue of the children of immigrant laborers. It's easy to remain unconscious of a negative such as diminished resistance, but such unconsciousness is a grave failing nonetheless.
Because the persistence of inequality surpasses the strength and value of our efforts to erase it - knowing what we know about manifold humanity, we cannot even imagine how to erase it, much less how to enlist everyone's participation in the attempt - it is perhaps foolish to regard inequality per se as an evil. Worse than foolish, if the denial of equality of opportunity encourages the belief that inequality of outcome is nothing more than the result of unevenly distributed industriousness. There is no room in secular democracy for the idea that a man's success in any field makes him a superior person - at least while he's alive. Let posterity decide that. The successful man ought to have a farmer's humble respect for the unearned blessings and accidents that have brought his labors to fruition.
In the parody tax return appearing on the last page of the current New Yorker (Feb. 10, 2003), Bruce McCall sets out a number of Agree/Disagree propositions for the very wealthy to consider. The fourth item, "I don't have a job, either, but you don't hear me complaining about it," captures the thoughtlessness of many people who do have jobs, but can't stop patting themselves on the back about it.
31 January 2003: Perhaps I'm overreacting. I tend to, where the current President is concerned. With that in mind, I've put the following confession into the mouth of Bill Clinton:
I had a drinking problem. Right now I should be in a bar in Texas, not the Oval Office. There is only one reason that I am in the Oval Office and not in a bar. I found faith. I found God. I am here because of the power of prayer.
Assuming that Clinton had had reason to make such a statement, and had then gone ahead and made it, at an official (i.e., recorded) meeting with clergymen, held in the seat of the presidency, I'd have been disappointed but not particularly upset, because, frankly, I wouldn't have believed the statement to be true. I would know that if Bill Clinton believed anything, it would be that voters, not prayer and not God, put him in the Oval Office. If Clinton's fear of God had not been transformed, as it so clearly was, into a fear of voters, then perhaps he would have admitted to the Lewinsky affair at the outset, and spared the nation a trauma. Religious folk might argue that he ought to have avoided the affair with Monica Lewinsky, but I refuse to entertain this proposition, because what I believe, if I believe anything at all, is this: so long as it steers clear of minors and mayhem, nobody's sex life is anybody else's damn business.
I feel the same way about other people's spirituality. I don't want to know more about the President's religious life than what's implied by a photograph of him and the First Lady on the steps of some house of worship. Perhaps because I have never been a religious person, I took comfort as a child in the privacy afforded by the Catholic Church's expansive ritual. I will always, I'm sure, find expressive religious enthusiasm deeply alarming, and I dread what has already been called the Fourth Great Awakening, now scorching the nation. And that's why I'm glad I'm an American, because from the very beginning this has been a citadel of religious freedom. Of negative religious freedom, perhaps, even more than the positive freedom to practice my beliefs. There are limits, after all, to the practice of religion - child sacrifice and, in theory, anyway, polygamy, are not permitted, no matter what religious claim is made on their behalf. But there is no limit to the negative freedom not to have to practice anybody else's religion. Again, in theory. In practice, the current Administration appears to be deaf to the idea that not everyone in the United States is a God-fearing Protestant, Pope-following Catholic, or Sharon-admiring Jew. It implies that anyone who does not belong to one of these groups is unpatriotic. This implication is a stench noxious to the health of our freedoms.
President Bush's confession - which I have extracted from Anthony Lewis's NYRB review of Bob Woodward's clothbound incense, Bush at War - is doubly indiscreet. What ought to be the object of shame - a 'drinking problem' - has become the gold badge of courage - vice morphed into virtue and lacquered by time. (I suppose I ought to be grateful that the President didn't add a few grateful words about twelve-step programs.) If Mr Bush were to admit to a current drinking problem, appeals to the power of faith would not count for very much, but as the personal problems that the President claims to have overcome through prayer never hurt me in any way, I have nothing to forgive, and I wish I had nothing to forget.
24 January 2003: Last week, I promised to produce a list of everything that's good about America and Americans. For a few hours, I was actually refreshed by the breeze of high-mindedness; what a service it would be, in these dark days, to identify, in handily abbreviated form, the national virtues. By the end of the weekend I hadn't found a single one. Looking around, I saw lots of virtues, but they were all wrapped up in individual personalities, and none of them could be attributed to Americans generally. There were four quiet weekdays ahead for thinking it through, but the uneasy feeling that I'd set myself an impossible assignment was hard to shake.
Worse than impossible - embarrassing. The whole point of the exercise, after all, was to demonstrate that I am not a scold, not a pessimistic critic with a dismal view of America's prospects. For one thing, as David Brooks had pointed out about Americans - but it's true of everyone - nobody likes a scold. For another, scolding, in these aforementioned times, is too easy for anyone with an even faintly liberal, cosmopolitan outlook. It's like scolding people for watching too much television: Chicken Little shooting fish in a barrel while preaching to the choir. It has become almost impossible to say anything interesting about What's Wrong With America, because everything is so obvious - to anyone with eyes.
As it turned out, the four quiet weekdays evaporated on contact. On Tuesday, for example, I sent out the bulk of our Christmas cards. No, not next year's. This - last - year's. I didn't even open Christmas cards until some time around my birthday. I divided them into several piles - New Yorkers (to whom we had sent cards, enclosing invitations to our Christmas Day open house), old friends with whom I'm not in email correspondence who had a right to expect something more than stale season's greetings, business friends of Kathleen's, and 'other.' Over the next ten days, these piles got moved around very carefully from pillar to post and back. By Tuesday, which was, after all, the twenty-first of the month, the conviction that I must either act now or simply forget about replying to this year's cards reached its climax. By bedtime, I had written and addressed cards to nearly everyone, tucking a little bit of personalized boilerplate into almost every envelope. Done. On Wednesday, I went through every scrap of paper in each of quite a few to-do piles. By bedtime - but bedtime never came. I passed out, some time after three, waiting for Kathleen to come home from the financial printer; when I awoke at nine-thirty, I was still alone. This used to happen all the time, but now we have Sarbanes-Oxley to thank. Kathleen managed her all-nighter very nicely (although she appears to have laryngitis), but I was a zombie, even after a late-morning nap. Today, Friday, I had my usual round of errands, and no time to think. But if I don't have the promised list, I think I can explain why there's no need for one.
There are two ways in which to view the population of a democracy: either as adversarial blocs, labeled 'majority' and 'minority,' or as a community of individuals. Nothing good can come of the first view, because blocs are mobs, and virtue does not inhere in mobs. It is a curious fact of human nature that 'mob psychology' is a catalogue of weakness and vice; nothing good can be said about a mob. This is why classical political theory rejected true democracy out of hand; the rule of 'the people' would, it was feared, inevitably decay into mass tyranny. Certainly the Twentieth Century witnessed two spectacular examples of this decay, one into Bolshevist communism, the other into Nazi fascism. (Both made individualism an anti-social crime.) But not every democracy degenerated as these did. Ours didn't degenerate at all. If you're looking for something good to say about Americans, you ought to be completely satisfied by the observation that the American Constitution, which is essentially a constitution of self-restraint, remained in force throughout the century's battery of social and economic tumults.
But this sort of equipoise cannot be taken for granted. To finish the thought I began in the last paragraph, nothing good can come of the view that democracy consists of majorities and minorities because anyone who holds it will make a poor democrat. Only if each person in a democracy sees himself and herself as a unique, free agent, abundantly capable of agreeing with 'the majority' about this, and with 'the minority' about that, can the glory of democracy be realized; and democracy is safe only if most - the only majority that matters - of its citizens see themselves in this way. Only if, that is, the majority refuses to form a majority. (This is why I'm so leery of 'popular culture'; it's not because the items so labeled are shoddy and inferior - although they are - but rather because they trumpet a so-called lowest common denominator that is actually lower than any individual one.)
All mobs may be bad, but that doesn't make them all the same, so it will always be easy to draw up an interesting list of a given mob's bad habits. But the exercise will always be futile. Once a mob has formed, it must be weathered like a storm, and one can only hope for minimal damage. It's to the person who hasn't joined a mob that I speak. If it's scolding - I prefer to regard it as exhortation - to urge everyone you meet to resist the attractions of adjusting one's thinking to suit local norms, to withstand the tide of popular enthusiasm, and to respect and defend every known victimless idiosyncrasy, then I'll admit to being a scold.
17 January 2003: In an Op-Ed piece in last Sunday's New York Times, David Brooks, the wry author of Bobos in Paradise, endeavored to explain why so many Americans appear to be happy with the Bush Administration notwithstanding its preoccupation with satisfying special interests. Brooks's fifth and final point knocked me over.
Americans do not see society as a layer cake, with the rich on top, the middle class beneath them and the working class and underclass at the bottom. They see society as a high school cafeteria, with their community at one table and other communities at other tables. They are pretty sure that their community is the nicest, and filled with the best people, and they have a vague pity for all those poor souls who live in New York City or California and have a lot of money but no true neighbors and no free time.
This image of the high school cafeteria hit me so hard because only two days before I myself had remarked that ours is a very adolescent body politic. Whether or not Brooks intended to suggest that the outlook of most Americans is adolescent, I felt that his image added a sharp edge to my observation. But something that he'd said a bit earlier in his piece made me feel gagged with duct tape: "Meanwhile, middle-class journalists and academics [and people like me] who seem to look down on megachurches, suburbia and hunters are resented." Indeed. Anybody gifted with a critical intelligence knows what it's like to be the object of this resentment. So, mindful of that deadly maxim about not saying anything if you can't say anything nice, I'm going to devote the next week to drawing up my list of what's good about the country I live in. Instead of drawing my examples from the press, however, I am simply going to look around me at the people I encounter here in Yorkville and elsewhere in town. And I am going to resist the strong but generally unconscious impulse to compare the people I see with some beau idéal of cosmopolitan genius; I'm looking for good, after all, not for better.
I don't want to convey the impression that I'm going to have a hard time drawing up this list, but suggestions will be even more than usually welcome.
10 January 2003: Click here.
3 January 2003: What's a good - no, better: irresistible - first novel?
While I don't yet intend to make the asking of questions a leitmotif of this page, I do want to thank those who responded to last Monday's question with menus for 'simple dinners.'
And then I want to ask something a little more serious. My nephew, a brilliant graduate student at Columbia (trust me), says that he finds 'literature' boring. He's read science fiction and 'other junk,' but when I quiz him about the 'junk,' he tells me that he only used the word because he thinks that that would be my word for what he likes. So.
What would be a good novel for a smart young man to start with? Let's forget the pieties of English 101 and examine the books we've actually read.
I have another request, too, but this one isn't a question. For almost a year now, the loud little Amazon 'Click to Pay!' icon has stood at the top of Portico's Side Menu - without the slightest result. Before I move it off the Front Page altogether, I'd like to ask you, the regular reader, to consider a small gesture. Monday's my birthday - my fifty-fifth. If you've enjoyed reading Portico, then perhaps you'll not mind sending me a small birthday present. Although I'm certainly not 'in this' for the money, a show of confidence would be welcome. Amazon's facility will accept a contribution as small as $1 - not a sharp stick in the eye. And if you really can't, or don't want to, resort to moolah, then may I ask you to send me a birthday card? You don't know how important you are.
Happy birthday, RJ!
Copyright (c) 2003 Pourover Press