Faggot!

Every so often I hear straight men claiming that epithets like “faggot” don’t really refer to homosexuals. Rather, they say, such words refer to ineffectual men who can’t take care of themselves or anyone else, who can’t give a woman what she needs, men who are cowardly and despicable, men who aren’t Real Men. (I don’t have any links at the moment, though I’ll try to add some the next time I encounter the claim online. I think I’ve seen Eminem and some other rappers saying such things, and I've read similar rationalizations about maricón in Mexican culture. Someone is playing a racist variation on the game here. To see how homophobic epithets are actually used by normal red-blooded American males -- haw haw haw! I can't believe I wrote that with a straight face! -- read some of the comments to this video. I'm still trying to figure out why it generated such hysteria.)

Since “gay” became a schoolyard epithet, soon after we queers mainstreamed it as a more-or-less neutral, non-clinical term for ourselves, I’ve heard the same thing about it as well. It’s true, some of the people who say “that’s so gay” are gay-friendly at other times, have gay friends, and pay liberal lip service to gay issues. And since we did claim the right to use “gay” for ourselves over the protests of our generation of genteel homophobes, I suppose we can’t really say that it has only one fixed meaning, or that we can stop linguistic change from happening in this one area forevermore.

That might even be the best response to “that’s so gay”: to recognize and, as necessary, point out that in that context, it has nothing to do with either the pre-1970 “gay” (“Don we now our gay apparel, fa-la-la fa-la-la la-la-la”) or the post-1970 homosexual “gay” (Gay Pride Now!).

Still, I don’t think any gay man who’s ever been called a fucking faggot (which means pretty much all of us) will take this claim seriously. “Faggot” refers not only to despicable, ineffectual men of any sexual orientation, but to men who have sex with other men, because in masculist culture men who have sex with other men are assumed to be despicable, ineffectual, etc. -- and fucked, in various senses of the word. There’s nothing more horrible in the masculine imagination than being penetrated anally: it takes away a man’s manhood as effectively as castration. For a man to enjoy being penetrated, to seek out the experience, is not thinkable (even if it’s not unknown to the men who deploy homophobic epithets). Gay liberationists were correct that shouting one’s fagitude to the world was a powerful challenge to the male supremacist order; that’s why gay liberation is now history, and today’s gay movement ambivalently calls for gender conformity, except for its reliably successful drag fundraisers.

“Faggot” and its synonyms are the equivalents for males of “whore” and its synonyms for women. What the Faggot and the Slut (as mythic figures) have in common is that they have been penetrated, and are therefore polluted, unclean. In both cases, the target of the epithet may not literally have been penetrated: boys may be targeted because they don’t fit in with other boys, regardless of their sexuality, and girls ditto – a girl may be called a Slut simply because she’s begun to develop breasts earlier than her age mates. But the words are (I think this is the right use of the term) performative: by calling you a faggot or a whore, I symbolically penetrate you, establish my manhood, earn and reinforce my membership in the men’s house. (Girls call each other “slut” too.) There are some interesting books on the words for women, starting with Leora Tanenbaum’s Slut! Growing Up Female With a Bad Reputation (Seven Stories, 1999) and Emily White’s Fast Girls: Teenage Tribes and the Myth of the Slut (Scribner, 2002), but they don’t go deeply enough – I could sense the authors drawing back from the abyss. I don’t know of any books (or any significant writings at all) which deal with the words for men, though Richard Trexler’s Sex and Conquest: Gendered Violence, Political Order, and the European Conquest of the Americas (Polity Press, 1995) has some useful discussion, as does Geng Song’s The Fragile Scholar: Power and Masculinity in Chinese Culture (Hong Kong UP, 2004). There’s a lot more thinking to be done about this; I’m just trying out some ideas now.

Meanwhile, what about the males who say that “faggot” refers to somebody else, the cowardly, ineffectual, effeminate guys – and not to their Homo-American buddies? It’s tempting to point out that effeminate men, the sissies who got harassed and beaten up by the Real Men all their lives, are fundamentally tougher than any macho man – but that would be a mistake, partly because it plays into their ritual of competitive toughness and partly because at best it can only send the bullies off in search of someone they can still feel entitled to degrade as a not-man. That’s probably the core point right there: “faggot” does not say anything about the man who’s called one – it does say volumes about the fears and inadequacies of the men who use it as a token in their pathetic dominance games.

Advance Health Care Directive Registries

Does your state have an advance health care directive registry? You may not know. I spoke about my book in Tucson, Arizona last week and told my audiences that the Arizona registry should be a model for other states. But almost no one in my audience knew that Arizona had such a registry! It's free; it's easy; and it provides anyone who registers a directive with a wallet card to carry alongside a driver's license. In an emergency, the hospital types the numbers on your wallet card into a database and retrieves your directive. Then all the medical personnel know your own wishes and whom you have selected to make health care decisions for you if you are unable to make them for yourself. Same-sex couples rightly fear being shut out of medical decision-making if one of them has a medical emergency. But the solution to this problem isn't marriage; it's guaranteeing that EVERYONE, gay and straight, partnered and not, has the person they want making their health care decisions. We can and should solve this problem for all LGBT people, and we can do it now by getting more states to follow the example of Arizona. (Idaho is another good one; their wallet cards have bar codes.)

Becalmed Among The Great Unwashed

You know, I don’t think I’m going to finish reading Susan Jacoby’s The Age of American Unreason. I imagine Jacoby feels better for having written it, vented her bile, and talked to the press about it. But I don’t feel better for having read the first eighty pages, so I’m gonna vent my bile right here.

As I expected, The Age of American Unreason is an extended and not very skillful game of “Ain’t It Awful.” In a way, it’s frustrating to read, because I do dislike most of the things she dislikes, but then I don’t need her to tell me about them. On the other hand, I don’t share her fury over the use of “folks”:
a plague spread by the President of the United States, television anchors, radio talk show hosts, preachers in megachurches, self-help gurus, and anyone else attempting to demonstrate his or her identification with ordinary, presumably wholesome American values. Only a few decades ago, Americans were addressed as people or, in the more distant past, ladies and gentlemen. Now we are all folks.
A plague? Darling, get a grip. Reading this, one wants to deliver a Hollywood-style hysterics-stopping slap upside Jacoby’s head, and wipe the flecks of foam from her quivering lips. Someone who gets as worked up over “folks” as about creationism, infotainment, and Larry Summers’s slighting remarks about lady academics – and, as far as I can tell, more upset than she gets about the US war in Iraq – needs to work on her priorities. (Three hundred years ago, Jonathan Swift threw a similar hissyfit over the word “mob”, which would never take the place of “rabble” in his heart. I agree with the writer Jay Quinn that it's a shame Swift didn’t win that battle, so we could talk today about rock stars being “rabbled” by their fans.) If she opposes the war in Iraq, it seems to be because of Bush’s belief that he is Yahweh’s instrument, not because innocent people are getting, like, hurt and killed there. There’s an odd lack of ordinary humanity in Jacoby’s jeremiad. (How can you worry about dying children when Americans are misusing apostrophes?)

Nor has she convinced me that things are that much different from the way they used to be, especially since her own evidence has a way of refuting her. She thinks that reactionary Christian religion is more influential in American life than it was in the 1800s, though she documents plenty of anti-intellectualism and Christian square-headedness from that era, which managed to flourish without the aid of today’s mass media. She brushes aside the Second Great Awakening with a sniff, to focus on an oration by Ralph Waldo Emerson to Harvard College’s Class of 1837. Emerson told his audience that “The mind of this country, taught to aim at low objects, eats upon itself” – making basically the same complaint then that Jacoby’s making now, only without videogames and Oprah. Ironically, the burden of Emerson’s oration was that it was time for American culture to stand on its own two feet, rather than leaning on Europe; Jacoby regards Europe today as a comparatively enlightened place where Christians don’t keep Darwin out of the schools. She also admits that “American freethought” was “never a majority movement,” which is probably putting it mildly, but it still undermines her thesis that things used to be better.

Oh yeah – I asked a dozen or so undergraduates around the dorm where I work if they knew what Pearl Harbor was, since Jacoby told the New York Times that her book was inspired by overhearing two yuppies in a bar on the night of September 11, 2001, who seemed to have no idea about it. Everyone I asked knew that the bombing of Pearl Harbor by the Japanese led to the US entry into World War II. Jacoby will be relieved to know that the coming generation of college students knows its history pretty well, even if that fact takes some wind out of her book’s sails.

But Enough About You ...


This article – well, really it’s only a squib – by one Megan McArdle has been linked by IOZ (in a strong, eloquent post), if not by others, on the web. It’s interesting to watch Ms. McArdle squirm:

Obviously, there are people who were right about the war for the right reasons, and we should examine what their thought process was--not merely the conclusions they came to, but how they got there. Other peoples’ opposition was animated by principles that may be right, but aren’t really very helpful: the pacifists, the isolationists, the reflexive opponents of Republicans or the US military. Within the limits on foreign policy in a hegemonic power, these just aren’t particularly useful, again, regardless of whether you are metaphysically correct.

“It won't work” is the easiest prediction to get right; almost nothing does. The thought process that tells you something probably won't work is not always a good way to figure out what will, even if you were right for the right reasons, as I agree lots of people were. That’s why libertarians have a great track record at predicting which government programs will fail (almost all of them) and a lousy track record at designing ones that do work.

On the other hand, “I thought it would work for X reason”, when it didn’t work, is, I think, a lesson you can carry into both decisions about what to do, and what not to do. On a deeper level, understanding the unconscious cognitive biases that lead smart and well meaning people to believe that things which will not work, will work, is a very good way to prevent yourself from making the same mistake.

It’s a repulsive performance, and while I’m tempted to say that it’s surprising to find it on the site of a liberal magazine like The Atlantic, I have to recall that The Atlantic also spotlighted Dinesh D’Souza’s right-wing tract Illiberal Education, running an excerpt before the book appeared. Of the first few dozen commenters, most fault McArdle for thinking that the invasion of Iraq hasn’t worked, or insist that it would have worked if not for the Iraqis – which is probably the best refutation of her position one could ask for.

Notice, in the first paragraph I’ve quoted, how blithely she dismisses the “pacifists”, the “isolationists”, not to mention those who are “reflexively” opposed to the Republican party. I wonder who she has in mind. It’s so easy, and such a popular tactic, not to name names, so no one can quibble over the accuracy of the characterizations. But if someone argues nowadays that the Japanese should not have tried to take over Asia in the 1930s, is that “isolationism”? Does only a “pacifist” say that the Japanese should not have killed Our Boys at Pearl Harbor, or that al-Qaeda was wrong to destroy the World Trade Towers? American pundits and politicians never hesitate to make moral judgments on the actions of our certified enemies; it’s only the US whose motives are beyond question.

Next McArdle moves to the Realpolitik so beloved of mainstream liberals and conservatives alike: well, we live in a world of hegemony, so we have to work within those parameters, don’t we, and not be afraid to get our hands a little dirty. So, the question becomes something like: how can we effectively achieve our aims – never mind whether those aims are good ones? How could Hitler have gone about establishing hegemony over Europe, for instance, in a way that would work? When the Soviets crushed democracy in Czechoslovakia in 1968, is the only permissible question whether their hegemony worked? And how about China’s hegemony over Tibet? A Chinese Megan McArdle could explain that only an isolationist or a pacifist, surely, would deny China’s right to run that country as it wishes. The only question is whether Chinese methods will work, and if not, how to make them work.

As I remember it, American liberals who opposed the invasion of Tibet -- I mean Iraq, sorry! – mostly expressed the fear that “we” would get into another “quagmire” there, like we did in Vietnam. Gloria Steinem, for one, expressed that fear in a speech here at Indiana University. What about the Iraqis who might be killed by our bombs and artillery and white phosphorus, you ask? Who cares? No one’s going to accuse Steinem of pacifism or isolationism! There was debate in The Nation, too, about how comparable Iraq was to Vietnam, though a few knee-jerk anti-Republicans were allowed to express their reflexive rejection of hegemony in its pages.

One commenter at IOZ asked, “But did anyone opposed to the war intelligently warn what would happen if the US went in without a governance plan? I don't recall that being their message.” Gracious, so many demands here, demands that would never be made of supporters of the war – intelligence, for one. But leaving aside those who warned of a quagmire, there’s this article by Noam Chomsky, and all you have to do is browse around Counterpunch in the months leading up to the invasion to find numerous warnings that it would not be the cakewalk promised by the Bushites. Those predictions have mostly been borne out by events, too. But for the likes of Megan McArdle, the deaths of hundreds of thousands of Iraqis and the flight of millions more are of no account in themselves, only as signs of our doing our hegemony wrong.

But then there’s Pete Seeger, the granddaddy of privileged white kids learning folk music, blacklisted from American TV as a Red for many years until he appeared on The Smothers Brothers Comedy Hour in 1968. Seeger wrote a song called “Waist-Deep in the Big Muddy” about the American experience in Vietnam. The Smothers Brothers bucked CBS censors so Seeger could perform this radical, cutting-edge political song on their show. The key offense, much as in a Stalinist state, was the song’s reference to “the big fool [who] says to push on,” widely taken to mean President Lyndon Baines Johnson. The song is about American soldiers “on maneuvers in Louisiana,” training for the Big One, WWII, who are nearly sucked down into quicksand because of the incompetence of their captain. If we take this song as it was meant to be, as an allegory of America in Vietnam, it’s notable that what menaces Our Boys is a force of nature – opposing human beings are conspicuously absent, to say nothing of napalmed children and slaughtered villagers. Seeger knew better, I hope. But that this pretentious song could have seemed extreme (or daring, depending on your point of view) tells me a lot about American hegemony, even among opponents of the US invasion of Vietnam. … A few years ago I happened on a Pete Seeger songbook at the library and began working through it, learning songs I hadn’t heard in years. I started to learn “Big Muddy,” but as I listened to the words I was singing I couldn’t go on.

I’m also reminded of a joke, which I first encountered in Leo Rosten’s The Joys of Yiddish but found again in Paul Breines’s very serious and important book Tough Jews. Some rabbinic students were drafted into the Tsar’s army more than a century ago, and much to their trainers’ surprise they turned out to be excellent sharpshooters. On the target range they never missed. But when they were put into battle, they refused to fire their guns. Their officers screamed at them, “What’s the matter? Why don’t you shoot?” They replied, “But those are real men out there, sir – if we shoot, we might hurt them.” Crazy pacifists!

But Enough About You ...


This article – well, really it’s only a squib – by one Megan McArdle has been linked by IOZ (in a strong, eloquent post), if not by others, on the web. It’s interesting to watch Ms. McArdle squirm:

Obviously, there are people who were right about the war for the right reasons, and we should examine what their thought process was--not merely the conclusions they came to, but how they got there. Other peoples’ opposition was animated by principles that may be right, but aren’t really very helpful: the pacifists, the isolationists, the reflexive opponents of Republicans or the US military. Within the limits on foreign policy in a hegemonic power, these just aren’t particularly useful, again, regardless of whether you are metaphysically correct.

“It won't work” is the easiest prediction to get right; almost nothing does. The thought process that tells you something probably won't work is not always a good way to figure out what will, even if you were right for the right reasons, as I agree lots of people were. That’s why libertarians have a great track record at predicting which government programs will fail (almost all of them) and a lousy track record at designing ones that do work.

On the other hand, “I thought it would work for X reason”, when it didn’t work, is, I think, a lesson you can carry into both decisions about what to do, and what not to do. On a deeper level, understanding the unconscious cognitive biases that lead smart and well meaning people to believe that things which will not work, will work, is a very good way to prevent yourself from making the same mistake.

It’s a repulsive performance, and while I’m tempted to say that it’s surprising to find it on the site of a liberal magazine like The Atlantic, I have to recall that The Atlantic also spotlighted Dinesh D’Souza’s right-wing tract Illiberal Education, publishing an excerpt before the book was published. Of the first few dozen commenters, most fault McArdle for thinking that the invasion of Iraq hasn’t worked, or it would have if not for the Iraqis, which is probably the best refutation of her position one could ask for.

Notice, in the first paragraph I’ve quoted, how blithely she dismisses the “pacifists”, the “isolationists”, not to mention those who are “reflexively” opposed to the Republican party. I wonder who she has in mind. It’s so easy, and such a popular tactic, not to name names, so no one can quibble over the accuracy of the characterizations. But if someone argues nowadays that the Japanese should not have tried to take over Asia in the 1930s, is that “isolationism”? Does only a “pacifist” say that the Japanese should not have killed Our Boys at Pearl Harbor, or that al-Qaeda was wrong to destroy the World Trade Towers? American pundits and politicians never hesitate to make moral judgments on the actions of our certified enemies; it’s only the US whose motives are beyond question.

Next McArdle moves to the Realpolitik so beloved of mainstream liberals and conservatives alike: well, we live in a world of hegemony, so we have to work within those parameters, don’t we, and not be afraid to get our hands a little dirty. So, the question becomes something like: how can we effectively achieve our aims – never mind whether those aims are good ones? How could Hitler have gone about establishing hegemony over Europe, for instance, in a way that would work? When the Soviets crushed democracy in Czechoslovakia in 1968, is the only permissible question whether their hegemony worked? And how about China’s hegemony over Tibet? A Chinese Megan McArdle could explain that only an isolationist or a pacifist, surely, would deny China’s right to run that country as it wishes. The only question is whether Chinese methods will work, and if not, how to make them work.

As I remember it, American liberals who opposed the invasion of Tibet -- I mean Iraq, sorry! – mostly expressed the fear that “we” would get into another “quagmire” there, like we did in Vietnam. Gloria Steinem, for one, expressed that fear in a speech here at Indiana University. What about the Iraqis who might be killed by our bombs and artillery and white phosphorus, you ask? Who cares? No one’s going to accuse Steinem of pacifism or isolationism! There was debate in The Nation, too, about how comparable Iraq was to Vietnam, though a few knee-jerk anti-Republicans were allowed to express their reflexive rejection of hegemony in its pages.

One commenter at IOZ asked, “But did anyone opposed to the war intelligently warn what would happen if the US went in without a governance plan? I don't recall that being their message.” Gracious, so many demands here, demands that would never be made of supporters of the war – intelligence, for one. But leaving aside those who warned of a quagmire, there’s this article by Noam Chomsky, and all you have to do is browse around Counterpunch in the months leading up to the invasion to find numerous warnings that it would not be the cakewalk promised by the Bushites. Those predictions have mostly been borne out by events, too. But for the likes of Megan McArdle, the deaths of hundreds of thousands of Iraqis and the flight of millions more are of no account in themselves, only as signs of our doing our hegemony wrong.

But then there’s Pete Seeger, the granddaddy of privileged white kids learning folk music, blacklisted from American TV as a Red for many years until he appeared on The Smothers Brothers Comedy Hour in 1968. Seeger wrote a song called “Waist-Deep in the Big Muddy” about the American experience in Vietnam. The Smothers Brothers bucked CBS censors so Seeger could perform this radical, cutting-edge political song on their show. The key offense, much as in a Stalinist state, was the song’s reference to “the big fool [who] says to push on,” widely taken to mean President Lyndon Baines Johnson. The song is about American soldiers “on maneuvers in Louisiana,” training for the Big One, WWII, who are nearly sucked down into quicksand because of the incompetence of their captain. If we take this song as it was meant to be, as an allegory of America in Vietnam, it’s notable that what menaces Our Boys is a force of nature – opposing human beings are conspicuously absent, to say nothing of napalmed children and slaughtered villagers. Seeger knew better, I hope. But that this pretentious song could have seemed extreme (or daring, depending on your point of view) tells me a lot about American hegemony, even among opponents of the US invasion of Vietnam. … A few years ago I happened on a Pete Seeger songbook at the library and began working through it, learning songs I hadn’t heard in years. I started to learn “Big Muddy,” but as I listened to the words I was singing I couldn’t go on.

I’m also reminded of a joke, which I first encountered in Leo Rosten’s The Joys of Yiddish but found again in Paul Breines’s very serious and important book Tough Jews. Some rabbinic students were drafted into the Tsar’s army more than a century ago, and much to their trainers’ surprise they turned out to be excellent sharpshooters. On the target range they never missed. But when they were put into battle, they refused to fire their guns. Their officers screamed at them, “What’s the matter? Why don’t you shoot?” They replied, “But those are real men out there, sir – if we shoot, we might hurt them.” Crazy pacifists!

Age Is Not Just A Number

I saw it again today on the Web, “Age Is Just a Number.” I guess this cliché makes some sense as a corrective to the idea that at each age you’re permitted to act a certain way and do certain things: dress like this but not like that, look like that but not like this, do this but not that, and so on. But beyond that, I think it’s dead wrong.

Aging has not, so far, been a big deal for me. My health, at 57, remains good. I’m now the oldest person in my department at work, but I think I’m virtually the only full-time worker there who isn’t on some kind of medication for physical or other ailments. I’m just now beginning to get enough gray in my hair to be noticeable, and I’m often told I look younger than my age – a trait that runs in my family, I think. When I met the mother of some of my Korean friends, she asked me how it is that I look so young. “I have no children,” I told her, and she nodded in agreement. But my parents, who did have children (four of us, heaven help them), aged gracefully too, and kept reasonably good health well into their seventies.

But even so, my body has changed and slowed down. It takes longer for cuts and other small injuries to heal. I can’t walk onto a track after a hiatus and run two miles in fourteen minutes, as I could do till I was in my 30s; I might finish one mile, but it would probably take me fourteen minutes by itself. I’ve put on weight gradually over the decades, despite moderate but obviously insufficient exercise and a non-sedentary job, and it won’t come off. I’m definitely less flexible than I used to be, and it takes more work now to try to fix that even a little. When I was in my twenties I frequently masturbated twice a day without having to strain; now it’s, erm, rather less often. (I make sure I have a few orgasms each week, partly so I won’t forget how, and partly because they’re good for prostate health.) I still have a powerful visceral reaction to the sight of attractive people, but that has little to do with the body; sexuality is primarily in the head.

Just on these points, it’s obvious to me that my age is not just a number – it’s written in my body. I try to attend to my flesh and let it, not the number of birthdays, guide me, but I don’t try to ignore its changes.

Now suppose that my brain, all memories intact, could be transplanted into a much younger body – what then? Well, those memories are important too. For me, the assassination of John F. Kennedy in 1963 is a memory, not something I read about in a book or saw on the History Channel. Ditto the Vietnam War, the assassination of Martin Luther King Jr., the impeachment hearings and resignation of Richard Nixon, the assassination of John Lennon, the election of Bill Clinton. That last is worth stressing just a tad: a college senior graduating this year would have been four years old when Clinton was elected – only a little younger than I was when Dwight Eisenhower was re-elected in 1956. I remember the fact of the election, but nothing of the campaign or the issues. The 1960 elections, between Kennedy and Nixon, were the first I paid much attention to. And just think – the September 11th attacks happened over six years ago. There are children in their first year of school who were not born at the time, and children just a few years older for whom they are at most blurry memories. Soon they too will be history.

Similarly, Beatlemania, the Summer of Love, Woodstock, disco, punk – all these are memories for me, and I still have most of the records I’ve bought since the 1960s. The Beatles, the Stones, Bob Dylan, the Supremes, the Four Tops, and so on are not what I grew up hearing on my parents’ scratched vinyl or my older brother’s CD player. I can remember when all of it didn’t yet exist. To say nothing of the fact that I was eighteen, freshly graduated from high school, when I read about the Stonewall riots in the Village Voice, a week or so after they happened.

These cultural and historical events and changes – and much more – are written in my body too. I carry them around with me, in an invisible balloon in my head. They are the sea in which my mind swims, the block of ice or amber in which I am encased. I’d be a different person if I’d been born ten years earlier, or ten years later, in ways I can’t even imagine. None of this is in any way a complaint, or a claim that I’m hermetically sealed off from people older or younger than I am. It is merely to say that age is far more than a number. In many ways it’s a quality rather than a quantity, but however you look at it, the difference it makes is real and not an illusion.

Prepare To Be Boarded!

I'm mighty tard (Old Hoosier for "tired") tonight, so I'm not going to say much. I was thinking of writing something about Alexei Panshin's 1968 sf novel Rite of Passage, which I just reread, but I quickly found there was more to say than I felt like saying tonight, so I'll just show you the picture above, which I suddenly remembered when I read a reference to a slide rule in Panshin's book. It's by the popular sf artist Kelly Freas, and it makes me want to try to track down the Murray Leinster story it illustrates, because I feel sure it isn't totally serious.

I think I first saw this cover reproduced around twenty years ago in a book called something like The Science in Science Fiction, which pointed out the most amusing detail in it: the slide rule, once as essential an accessory for technogeeks as a pocket protector or a ham radio license, but totally replaced by the pocket calculator and the microcomputer. Yet it never occurred to most sf writers that computers would be miniaturized. Even the Noble Engineer Heinlein, famed for his technological prophecies, had his far-future starship crews swearing by their trusty slipsticks. The obsolescence of the slide rule clenched in his teeth like a cutlass makes Freas' space pirate even campier.

Wilde In The Streets

Here's another review for Gay Community News, published in the January 15-21, 1989 issue. The caricature above is by Max Beerbohm, a younger contemporary and friend of Wilde's who outlived him by more than half a century.

Oscar Wilde

by Richard Ellmann
New York: Alfred A. Knopf, 1988
680 pp.

Oscar Wilde's London: a Scrapbook of Vices and Virtues 1880-1900
by Wolf Von Eckardt, Sander L. Gilman, and J. Edward Chamberlin
Garden City: Anchor Press/Doubleday, 1987
285 pp.

The Oscar industry grinds on, and its two latest offerings demonstrate the range of its products’ quality.

The idea behind Oscar Wilde’s London is a good one. “This book is not about Oscar Wilde,” the authors assert in the Introduction. “It is about the city that made Oscar Wilde.” If, like me, you’re a bit vague on the actual conditions of late Victorian Britain, a social history sounds like just the thing to help understand how Wilde perceived himself and was perceived in his day. Biographers fill in quite a bit of this background, but there are many details -- such as the fact that when Wilde arrived in London in 1879, electric street lights were just beginning to be installed there -- which don’t belong to biography proper but help to understand its subject.

The best thing about Oscar Wilde’s London is its illustrations, particularly the many photographs, most of which are so clear and sharp they might have been taken yesterday. Not just of the famous, they include some fascinating pictures of daily life by one Paul Martin (see pp. 19-20, 94), whose work I’d like to know better. The text is less impressive. The chapters on London’s growth, on the poor, and on sports and popular entertainment are pretty good. But the book seems rather poorly organized. It offers no information on how the three authors divided up the writing among themselves, and at times I had the feeling that it had been pasted together too quickly. Topics are sometimes dropped almost in the middle, with the outcome of one or another controversy omitted as though everyone knew it. There are also some odd errors which suggest a lack of care in fact-checking. The message on the infamous visiting card left for Wilde by the Marquess of Queensberry, which led to Wilde’s downfall, is quoted here as “To Oscar Wilde posing as a sodemite (sic)” (73). Queensberry did indeed misspell the key word, but I’ve always seen it rendered “Somdomite”, and had thought the error was almost as well-known as some of Wilde's epigrams. (According to Richard Ellmann’s new biography, the actual message was “To Oscar Wilde posing Somdomite”.) I felt that the connection with Oscar Wilde was too tenuous, more of a marketing hook than a unifying principle for the book. Still, Oscar Wilde’s London is worth a look, and it includes a long reading list which should be useful to anyone who wants to explore the subject more thoroughly. See if your library has it.

The late Richard Ellmann completed Oscar Wilde just before his death in 1987, and while it is neither as exhaustive nor as definitive as his famous biography of James Joyce, this new biography is notable for its warmth, good judgment, and good writing. It is the least homophobic of any book on Wilde by a straight author that I’ve seen: not just free of amateur psychoanalysis but a bit disdainful of that popular biographical perversion, and downright scornful of the hypocrisy which destroyed Wilde's life and career. Nowadays we ought to be able to take such an attitude for granted, but unfortunately it’s still rare enough that Ellmann deserves notice for it.

Ellmann, in fact, writes as an unabashed fan of Wilde, and this makes his book even more refreshing. He has many touching stories to tell about Wilde’s generosity and kindness (see especially pp. 412-13), even in areas where other biographers turn up their noses: “What seems to characterize all Wilde’s affairs is that he got to know the boys as individuals, treated them handsomely, allowed them to refuse his attentions without becoming rancorous, and did not corrupt them” (390). He praises Wilde’s defense of ‘Greek love’ at his trial: “For once Wilde spoke not wittily but well.” Ellmann also credits those courageous souls who helped Wilde when he needed it most. Frank Harris, who is often portrayed (not entirely without reason) as a major buffoon in books about Wilde, has a shining moment of humanity that makes up for a lot of silliness. Believing that Wilde had not committed the acts of which he was convicted, Harris arranged to borrow a yacht to smuggle him to the Continent. When he told him of the plan, “...Wilde broke out and said, ‘You talk with passion and conviction, as if I were innocent.’ ‘But you are innocent,’ said Harris, ‘aren’t you?’ ‘No,’ said Wilde. ‘I thought you knew that all along.’ Harris said, ‘I did not believe it for a moment.’ ‘This will make a great difference to you?’ asked Wilde, but Harris assured him it would not” (468). There are people today who couldn't rise to so much humanity. By way of contrast, the painter Sir Edward Coley Burne-Jones “hoped that Wilde would shoot himself and was disappointed when he did not” (479).

There is one area where Wilde’s generosity failed, however, and since no one ever seems to comment on it, I'd like to. Ellmann seems not much bothered by the clear indications that Wilde married because he needed money and public proof of heterosexual normality, and though he was charmed and attracted by Constance Lloyd, he doesn’t seem ever to have taken her seriously. He evidently began to neglect her almost at once, first for his rounds of socializing and travel, then for the young men who occupied his real sexual and romantic interest. After Wilde’s downfall, “Paul Adam, in La Revue blanche of 15 May 1895, argued that Greek love was less harmful than adultery” (482). But Wilde’s love for Alfred Douglas was adulterous, to say nothing of all those hardened little hustlers to whom he was apparently rather kinder than he was to his wife and children. While he was in prison, a reconciliation was arranged which Ellmann seems to think could have succeeded, but it was forestalled by the return of Douglas and by Constance’s death in 1898. I don't doubt that Wilde was so grateful for his wife’s willingness to forgive him that he really believed he loved her, and would change his ways forever. But I also don’t doubt that once he’d regained his freedom, he would have allowed boredom to set in. Despite this, Wilde doesn't come off badly compared to his heterosexual contemporaries -- how many of them went to prison for marrying money or neglecting their wives? -- or to many gay men and lesbians before and since who’ve made the mistake of marrying heterosexually to get a hostile society off their backs. The more so if Ellmann is correct that Wilde had no overt sexual experience with men before his marriage, and some experience with women; that’s a classic formula for disastrous self-deception.

It’s unfortunate that Wilde was unable to pick up the pieces of his life and career after his imprisonment. He had a social conscience, encouraged by his Irish nationalist mother, and had done some interesting political writing; he wasn’t quite the mindless butterfly he sometimes pretended to be. As we watch around us the ominous rise of the same forces that destroyed him, he no longer seems as quaint as he did in the 1970s, and his life has much to teach us. Ellmann’s biography is probably the one to read, and now that it’s out in paperback it’s the one to own: humane, learned, affectionate and smoothly written, Oscar Wilde is a model of the biographer’s art.

Alaina helps me dye again

Here is Ally helping me get ready to dye today. The bottom picture is of Ally several years ago when she helped me test my dye book colors. Notice the difference in her height. She is officially taller than me.

Atheists Say The Darnedest Things!

Strange. I’ve been encountering an unusual number of oddly misinformed remarks about religion by atheists recently. Maybe I’ve just been in a meaner, crankier mood than usual?

Atheists Say The Darnedest Things!

Strange. I’ve been encountering an unusual number of misinformed remarks about religion by atheists recently. Maybe I’ve just been in a meaner, crankier mood than usual?

Try this one, from the late Arthur C. Clarke: “Science can destroy a religion by ignoring it as well as by disproving its tenets. No one ever demonstrated, so far as I am aware, the nonexistence of Zeus or Thor, but they have few followers now.” (The site attributes it to Childhood’s End, one of Clarke’s novels, so I presume it’s spoken by a character, not directly by Clarke. Ordinarily it’s not wise to assume that characters speak for their authors, but the blogger who posted the quotation to his own site evidently did, so I’ll go along with him.)

It’s true, Zeus and Thor have few followers now, but the credit (or blame) doesn’t go to science; it goes to a certain rival cult, which achieved its supremacy not by ignoring rival gods but by imposing itself by force, up to and including violence.

Actually, the first thing that popped into my head when I read Clarke’s remark was drapetomania. “Discovered” in 1851 by a white American doctor named Samuel Cartwright, drapetomania was a disease that supposedly caused African-American slaves to run away from their masters. As far as I know, no one ever demonstrated scientifically that runaway slaves were not sick, but few would claim now that they were. I’m not saying that Clarke would have accepted the existence of drapetomania, only drawing the parallel to show that proofs and demonstrations are not necessarily relevant.

Clarke was never one of my favorite sf writers anyway, but he finally annoyed me terminally with a remark in the afterword to 3001. After patronizingly expressing affection for his religious friends (some of his best friends are Buddhists and Jews and Christians and Hindus! isn’t he liberal?), he purrs: “Perhaps it is better to be un-sane and happy, than it is to be sane and un-happy. But it is best of all to be sane and happy.”

Maybe Clarke would have accepted the existence of drapetomania. There is no reason I know of to believe that most religious believers are “un-sane.” The complacent assumption that most people are crazy while the assumer is the only sane person is not exactly a sign of perfect sanity, however. It’s certainly not a sign of rationality to try to discredit another person’s beliefs by questioning their sanity. I’ve observed that tactic used by Christian apologists against agnostics and atheists, and of course as a gay man I’m well acquainted with the secular medicalization of unpopular life choices.

Try this comment by another atheist: “Religion starts from the assumption that an ancient text or tradition is true, and seeks to reconcile observed reality with the text.” Well, no it doesn’t. We don’t really know how religion started, but most religions are not based on sacred texts -- Greek and Roman paganism, for instance. Judaism was a novelty in that respect (though its texts were a relatively late development compared to the sacrificial practices, purity rules, and festivals that were its core – and these also changed over time), followed by Christianity and Islam. Christianity started from current events – Jesus’ career as a miracle-worker and preacher, culminating in his death by crucifixion and the claims by his followers that he’d been raised from the dead – not from ancient texts or traditions. Early Christians appealed to the Jewish scriptures to justify their new cult, but they neither took them literally nor based their claims on the texts: rather they interpreted the texts with amazing elasticity to force them to conform to the sacred events. (Nothing in the Hebrew Bible, for example, predicts that the Messiah would be crucified and rise from the dead.)

At around the same time as the Christian New Testament was coalescing, rabbinic Judaism codified its legal rulings into a new compilation, the Mishnah, again partly as a result of historic events: the destruction of the Jerusalem Temple, which ended the sacrificial cult. This forced the reinterpretation of text and tradition to conform to reality, not vice versa.

Ancient texts and traditions weren’t always ancient. Once they become authoritative, they are used by their adherents in a complex way, both influencing believers and being influenced by them. The reinterpretation of texts reconciles the texts with observed reality, trying to make them fit present-day needs.

Finally, the blogger at whose site I found these quotations (except the one from 3001) wrote, as an example of the way that religion is a “fixed system,” unlike science: “The Catholic Church isn't going to ‘adjust’ or ‘self-correct’ their version of God based on conflicting ‘evidence’, whatever that might be; for them he is and will always be the omniscient creator of everything in the universe, and the ultimate answer to every question.” I can’t imagine Western science ever adjusting its basic approach to understanding the workings of the universe, namely trying to explain those workings without appealing to divine or other supernatural agency; that’s a given, though it was arrived at fairly gradually over the past 350 years or so. But even within the Roman Catholic tradition, the understanding of God has changed over the past two millennia. Augustine, for example, used Platonic ideas; Aquinas used Aristotle and other philosophical authorities.

The Church would probably claim that its understanding is indeed “self-correcting” (a popular, if dubious buzzword among scientific apologists these days). On less central issues, like slavery or Christendom’s relation to competing sects, Christian positions have changed quite a bit over the centuries. From the New Testament we know that widely divergent understandings of Christ coexisted and were in conflict from the earliest days of the churches. Outsiders had little or no input into these internal controversies, so I suppose their progress could be described as self-correcting.

It simply isn’t true that religion is a fixed system. As individuals, people change their religious beliefs in ways ranging from wrestling with personal fears and conflicts by interacting with tradition, to joining a new denomination or converting to a different religion – or abandoning religion altogether. Such changes may be affected by thinking about Copernican or Darwinian theory, but they may also take place entirely within a framework of religious thought. Believers sometimes want you to think their beliefs are fixed and solid, but it’s odd to find atheists taking them at their word. Nothing human is fixed and solid, and a look at the history of religious belief and practice will show that religion is no exception to the rule.

Wishin’ and Hopin’ and Thinkin’ and Prayin’


(Sorry for the visual distortion in this clip; I couldn’t embed the letterboxed version. The wedding imagery reminds us that the Church is the Bride of Christ, and if you imagine the third-person pronouns with initial capitals [“Wear your hair just for Him … You won’t get Him thinkin’ and a-prayin’ …”] you have quite a kinky little hymn on your hands.)

My text today, dearly beloved, is from H. Allen Orr’s review of Philip Kitcher’s Living with Darwin, on marketing strategies for atheists:

Too often, the New Atheism forgets to make its humanism humane.

Wow. Where did Orr (or Kitcher) get the idea that religion is humane? One objection I have to Dawkins and the other “New Atheists” (as I’ve said before, I’m always suspicious of talk about “New” anything) is that they are basically secular avatars of the old-fashioned hellfire and brimstone preachers. Orr quotes Kitcher from Living with Darwin:

Often, the voices of reason I hear in contemporary discussions of religion are hectoring, almost exultant that comfort is being stripped away and faith undermined; frequently, they are without charity. And they are always without hope.

Orr agrees with Kitcher in criticizing the New Atheists for the lack of hope in their message, for their apparent glee in, as they imagine, stripping away other people’s illusions; but such has always been the method of revivalist religion, and these boys are revivalists. Theoretically there is hope of salvation for those who heed the Christian message, but the congregation members who reportedly fainted on hearing Jonathan Edwards’s 1741 sermon “Sinners in the Hands of an Angry God” don’t seem to have been reassured. The gospels’ Jesus taught (Matthew 7:13-14) that only a few would find the narrow path to salvation, so the overwhelming majority of humanity would be damned, and there is no hope for those who die unsaved. The parable of the Rich Man and Lazarus (Luke 16:19-31) is striking in its callousness toward the damned. I think it was the historian of religion Jonathan Z. Smith who wrote somewhere that the Good News of Christ was, and is, Bad News for most people.

Christianity, the religion I know the most about, has never been concerned with the tender feelings of believers in competing sects. Whether the competition was Jews, pagans, or (ever since the beginnings of the cult, as the New Testament shows) other Christians, the rhetoric has been vitriolic and unforgiving. Hatefulness is endemic to Christianity, though not specific to it, and the New Atheists tend to come across as reincarnations of the ancient Christian heresiarchs, seeking out and denouncing those who wickedly stray from their version of truth, only with the sectarian elements (and learning) stripped away. I see little to choose between Jonathan Edwards and a sodden, bleary-eyed Christopher Hitchens.

Kitcher is evidently trying to play the Good Atheist Cop to Dawkins’s, Harris’s, Dennett’s, and Hitchens’s Bad Atheist Cop. I doubt it will work, since despite his Christian upbringing Kitcher doesn’t seem to understand or really empathize with religious believers any more than the Bad Cops do, nor does he really offer any hope. Maybe he thinks that someday, someone will come up with some from somewhere. Maybe they’ll cook it up in a lab, a newer genetically-modified Enlightenment™ hope that will enable people to get over the death of a loved one or the diagnosis of a painful terminal disease without the troublesome, addictive side effects of the old pre-scientific religious hope.

I’m not saying that I understand religious folk either. I’d think it would be easier for atheists like Kitcher, who were raised in religious families and only later broke away. I had no religious upbringing, and realized fairly early in life that I felt no need to believe in gods. I suspect that some of my attitude is temperamental (meaning that I have no idea where it comes from). Somewhere I read about a movie, Pete ’n’ Tillie, based on a Peter De Vries novel, in which a married couple (played by Carol Burnett and Walter Matthau) suffer through their child’s death from leukemia. One of the parents says something to the effect that it’s less painful to believe that there is no god, that no one is watching Up There, than to believe that Someone is watching but does nothing. That's exactly what I think, but I know that many other people, perhaps most, would rather believe that Someone is up there, weeping great salt tears over our pain and feeling it with us. (And no, I'm not talking about this guy.)

Yet religion is not synonymous with hope. Many atheists have died without fear, and many theists have died in terror of what might await them. Nor does every religion even offer the hope of an afterlife. Judaism, for one, has never been much concerned with the idea; reincarnation, while it promises some kind of survival, doesn’t offer the happy dream of reunion with one’s loved ones in an eternal Sunday afternoon. The philosopher Ludwig Wittgenstein wrote in his Tractatus Logico-Philosophicus (6.4312), and again I agree:

Not only is there no guarantee of the temporal immortality of the human soul, that is to say of its eternal survival after death; but, in any case, this assumption completely fails to accomplish the purpose for which it has always been intended. Or is some riddle solved by my surviving for ever? Is not this eternal life itself as much of a riddle as our present life?