Dec 04

Sneering at Courage

One of the overdue lessons of 9/11 is that we can’t afford to sneer
at physical courage any more. The willingness of New York firemen,
Special Forces troops in Afghanistan, and the passengers of Flight 93
to put their lives on the line has given us most of the bright spots
we’ve had in the war against terror. We are learning, once again,
that all that stands between us and the night of barbarism is the
willingness of men to both risk their lives and take the awful
responsibility of using lethal force in our defense.

(And, usually, it is men who do the risking. I mean no disrespect
to our sisters; the kind of courage I am talking about is not an
exclusively male virtue. But it has been predominantly the job of
men in every human culture since Olduvai Gorge, and still is today.
I’ll return to this point later in the essay.)

The rediscovery of courage visibly upsets a large class of bien
pensants in our culture. Many of the elite molders of opinion in
the U.S. and Europe do not like or trust physical courage in men. They
have spent decades training us to consider it regressive, consigning
it to fantasy, sneering at it — trying to persuade us all that
it’s at best an adolescent or brute virtue, perhaps even a vice.

If this seems too strong an indictment, consider carefully all the
connotations of the phrase “testosterone poisoning”. Ask yourself
when you first heard it, and where, and from whom. Then ask yourself
if you have slid into the habit of writing off as bluster any man’s
declaration that he is willing to risk his life, willing to fight for
what he believes in. When some ordinary man says he is willing to
take on the likes of the 9/11 hijackers or the D.C. sniper — or
even ordinary criminals — then, do you praise his determination
or consign him, too, to the category of blowhard or barbarian?

Like all virtues, courage thrives on social support. If we mock
our would-be warriors, writing them off as brutes or rednecks or
simpletons, we’ll find courage in short supply when we need it. If we
make the more subtle error of sponsoring courage only in uniformed men
— cops, soldiers, firemen — we’ll find that we have
trouble growing the quantity or quality we need in a crisis. Worse:
our brave men could come to see themselves as apart from us, distrusted
and despised by the very people for whom they risk their lives, and
entitled to take their due when it is not freely given. More than one
culture that made that mistake has fallen to its own guardians.

Before 9/11, we were in serious danger of forgetting that courage
is a functional virtue in ordinary men. But Todd Beamer reminded us of
that — and now, awkwardly, we are rediscovering some of the
forms that humans have always used to nurture and reward male courage.
Remember that rash of news stories from New York about Upper-East-Side
socialites cruising firemen’s bars? Biology tells; medals and
tickertape parades and bounties have their place, but the hero’s most
natural and strongest reward is willing women.

Manifestations like this absolutely appall and disgust the sort of
people who think that the destruction of the World Trade Center was a
judgment on American sins: the multiculturalists, the
postmodernists, the transnational progressives, radical feminists, the
academic political-correctness brigades, the Bush-is-a-moron elitists,
and the plain old-fashioned loony left. By and large these people
never liked or trusted physical courage, and it’s worth taking a hard
look at why that is.

Feminists distrust physical courage because it’s a male virtue.
Women can and do have it, but it is gender-linked to masculinity just
as surely as nurturance is to femininity. This has always been
understood even in cultures like the Scythians, Teutons, Japanese, and
modern Israelis that successfully made places for women warriors. If
one’s world-view is organized around distrusting or despising men and
maleness, male courage is threatening and social support for it is
regressive.

For multi-culti and po-mo types, male physical courage is suspect
because it’s psychologically linked to moral certitude — and
moral certitude is a bad thing, nigh-indistinguishable from
intolerance and bigotry. Men who believe in anything enough to fight
for it are automatically suspect of would-be imperialism —
unless, of course, they’re tribesmen or Third Worlders, in which case
fanaticism is a praiseworthy sign of authenticity.

Elite opinions about male physical courage have also had more
than a touch of class warfare about them. Every upper crust
that is not directly a military caste — including our own
— tends to dismiss physical courage as a trait of peasants
and proles and the lesser orders, acceptable only when they
know their place is to be guided by their betters.

For transnational progressives and the left in general, male
physical courage is a problem in the lesser orders because it’s an
individualizing virtue, one that leads to wrong-think about
autonomy and the proper limits of social power. A man who develops in
himself the grit that it takes to face death and stare it down is less
likely to behave meekly towards bureaucrats, meddlers, and taxmen who
have not passed that same test. Brave men who have learned to fight
for their own concept of virtue — independently of
social approval or the party line — are especially threatening
to any sort of collectivist.

The multiculturalist’s and the collectivist’s suspicions are
backhanded tributes to an important fact. There is a continuity among
self-respect, physical courage and ethical/moral courage. These virtues are
the soil of individualism, and are found at their strongest only in
individualists. They do not flourish in isolation from one another.
They reinforce each other, and the social measures we take to reward
any of them tend to increase all of them.

After 1945 we tried to separate these virtues. We tried to teach
boys moral steadfastness while also telling them that civilized men
are expected to avoid confrontation and leave coping with danger to
specialists. We preached the virtue of `self-esteem’ to adolescents
while gradually abolishing almost all the challenges and ordeals that
might have enabled them to acquire genuine self-respect. Meanwhile,
our entertainments increasingly turned on anti-heroes or celebrated
physical bravery of a completely mindless and morally vacuous kind.
We taught individualism without responsibility, denying the unpleasant
truth that freedom has to be earned and kept with struggle and blood.
And we denied the legitimacy of self-defense.

Rudyard Kipling would have known better, and Robert Heinlein did.
But they were written off as reactionaries — and many of us were
foolish enough to be surprised when the new thinking produced a bumper
crop of brutes, narcissists, overgrown boys, and bewildered hollow men
apt to fold under pressure. We became, in Jeffrey Snyder’s famous
diagnosis, a nation of cowards; the cost could be measured in the
explosion in crime rates after 1960, a phenomenon primarily of males
between 15 and 35.

But this was a cost which, during the long chill of the Cold War,
we could afford. Such conflicts as there were stayed far away from
the home country, warfare was a game between nations, and nuclear
weapons seemed to make individual bravery irrelevant. So it remained
until al-Qaeda and the men of Flight 93 reminded us otherwise.

Now we have need of courage. Al-Qaeda’s war has come to us. There
is a geopolitical aspect to it, and one of the fronts we must pursue
is to smash state sponsors of terrorism. But this war is not
primarily a chess-game between nations — it’s a street-level
brawl in which the attackers are individuals and small terrorist cells
often having no connection to the leadership of groups like al-Qaeda
other than by sympathy of ideas.

Defense against this kind of war will have to be decentralized and
citizen-centered, because the military and police simply cannot be
everywhere that terrorists might strike. John F. Kennedy said this during
the Cold War, but it is far truer now:

“Today, we need a nation of Minutemen, citizens who are not only prepared to
take arms, but citizens who regard the preservation of freedom as the basic
purpose of their daily life and who are willing to consciously work and
sacrifice for that freedom.”

The linked virtues of physical courage, moral courage, and
self-respect are even more essential to a Minuteman’s readiness than
his weapons. So the next time you see a man claim the role
of defender, don’t sneer — cheer. Don’t write him off with some
pseudo-profound crack about macho idiocy; support him. He’s trying to
tool up for the job two million years of evolution designed him for,
fighting off predators so the women and children can sleep safe.

Whether he’s in uniform or not, young or old, fit or flabby
— we need that courage now.


Nov 28

Today’s treason of the intellectuals

The longest-term stakes in the war against terror are not just human lives, but whether Western civilization will surrender to fundamentalist Islam and shari’a law. More generally, the overt confrontation between Western civilization and Islamist barbarism that began on September 11th of 2001 has also made overt a fault line in Western civilization itself — a fault line that divides the intellectual defenders of our civilization from intellectuals whose desire is to surrender it to political or religious absolutism.

This fault line was clearly limned in Julien Benda’s 1927 essay La trahison des clercs: in English, “The treason of the intellectuals”. I couldn’t find a copy of Benda’s essay on the Web, but there is an excellent commentary on it that repays reading. Ignore the reflexive endorsement of religious faith at the end; the source was a conservative Catholic magazine in which such gestures are obligatory. Benda’s message, untainted by Catholic or Christian partisanship, is even more resonant today than it was in 1927.

The first of the totalitarian genocides (the Soviet-engineered Ukrainian famine of 1922-1923, which killed around two million people) had already taken place. Hitler’s “Final Solution” was about fifteen years in the future. Neither atrocity became general knowledge until later, but Benda in 1927 would not have been surprised; he foresaw the horrors that would result when intellectuals abetted the rise of the vast tyrannizing ideologies of the 20th century.

Changes in the transport, communications, and weapons technologies of the 20th century made the death camps and the gulags possible. But it was currents in human thought that made them fact — ideas that both motivated and rationalized the thuggery of the Hitlers and Stalins of the world.

Benda indicted the intellectuals of his time for abandoning the program of the Enlightenment — abdicating the search for disinterested truth and universal human values. Benda charged that in
abandoning universalism in favor of racism, classism, and political particularism, intellectuals were committing treason against the humanity that looked to them for guidance — prostituting themselves to creeds that would do great ill.

And what are the sequelae of this treason? Most diagnostically, mass murder and genocide. Its lesser consequences are subject to debate, equivocation, interpretation — but when we contemplate the atrocities at the Katyn Forest or the Sari nightclub there can be no doubt that we confront radical evils. Nor can we disregard the report of the perpetrators that those evils were motivated by ideologies, nor that the ideologies were shaped and enabled and apologized for by identifiable factions among intellectuals in the West.

An intellectual commits treason against humanity when he or she propagandizes for ideas which lend themselves to the use of tyrants and terrorists.

In Benda’s time, the principal problem was what I shall call “treason of the first kind” or revolutionary absolutism: intellectuals signing on to a transformative revolutionary ideology in the belief that if the right people just got enough political power, they could fix everything that was wrong with the world. The “right people”, of course, would be the intellectuals themselves — or, at any rate, politicians who would consent to be guided by the intellectuals. If a few kulaks or Jews had to die for the revolution, well, the greater good and all that…the important thing was that violence wielded by Smart People with the Correct Ideas would eventually make things right.

The Nazi version of this disease was essentially wiped out by WWII. But the most deadly and persistent form of treason of the first kind, which both gave birth to intellectual Naziism and long outlived it, was intellectual Marxism. (It bears remembering that ‘Nazi’ stood for “National Socialist”, and that before the 1934 purge of the Strasserites the Nazi party was explicitly socialist in ideology.)

The fall of the Soviet Union in 1991 broke the back of intellectual Marxism. It may be that the great slaughters of the 20th century have had at least one good effect, in teaching the West a lesson about the perils of revolutionary absolutism written in letters of human blood too large for even the most naive intellectual idealist to ignore. Treason of the first kind is no longer common.

But Benda also indicted what I shall call “treason of the second kind”, or revolutionary relativism — the position that there are no moral claims or universal values that can trump the particularisms of particular ethnicities, political movements, or religions. In particular, relativists maintain that the ideas of reason and human rights that emerged from the Enlightenment have no stronger claim on us than tribal prejudices.

Today, the leading form of treason of the second kind is postmodernism — the ideology that all value systems are equivalent, merely the instrumental creations of people who seek power and other unworthy ends. Thus, according to the postmodernists, when fanatical Islamists murder 3,000 people and the West makes war against the murderers and their accomplices, there is nothing to choose between these actions. There is only struggle between contending agendas. The very idea that there might be a universal ethical standard by which one is `better’ than the other is pooh-poohed as retrogressive, as evidence that one is a paid-up member of the Party of Dead White Males (a hegemonic conspiracy more malign than any terrorist organization).

Treason of the first kind wants everyone to sign up for the violence of redemption (everyone, that is, other than the Jews and capitalists and individualists that have been declared un-persons in advance). Treason of the second kind is subtler; it denounces our will to fight terrorists and tyrants, telling us we are no better than they, and even that the atrocities they commit against us are no more than requital for our past sins.

Marxism may be dead, but revolutionary absolutism is not; it flourishes in the Third World. Since 9/11, the West has faced an Islamo-fascist axis formed by al-Qaeda, Palestinian groups including the Palestinian Authority and Hamas, the rogue state of Iraq, and the theocratic government of Iran. These groups do not have unitary leadership, and their objectives are not identical; notably, the PA
and Iraq are secularist, while al-Qaeda and Hamas and the Iranians and the Taliban are theocrats. Iran is Shi’a Islamic; the other theocratic groups are Sunni. But all these groups exchange intelligence and weapons, and they sometimes loan each other personnel. They hate America and the West, and they have used terror against us in an undeclared war that goes back to the early 1970s. The objectives of these groups, whether they are secular Arab nationalism or Jihad, require killing a lot of people. Especially a lot of Westerners.

Today’s treason of the intellectuals consists of equating suicide bombings deliberately targeting Israeli women and children with Israeli military operations so restrained that Palestinian children throw rocks at Israeli soldiers without fearing their guns. Today’s treason of the intellectuals tells us that because the U.S. occasionally propped up allied but corrupt governments during the
Cold War, we have no right to object to airliners being flown into the World Trade Center. Today’s treason of the intellectuals consists of telling us we should do nothing but stand by, wringing our hands, while at least one of the groups in the Islamo-fascist axis acquires nuclear weapons with which terrorists could repeat their mass murders in New York City and Bali on an immensely larger scale.

Behind both kinds of treason there lurks an ugly fact: second-rate intellectuals, feeling themselves powerless, tend to worship power. The Marxist intellectuals who shilled for Stalin and the postmodernists who shill for Osama bin Laden are one of a kind — they identify with a tyrant’s or terrorist’s vision of transforming the world through violence because they know they are incapable of making any difference themselves. This is why you find academic apologists disproportionately in the humanities departments and the soft sciences; physicists and engineers and the like have more constructive ways of engaging the world.

It may be that 9/11 will discredit revolutionary relativism as thoroughly as the history of the Nazis and Soviets discredited revolutionary absolutism. There are hopeful signs; the postmodernists and multiculturalists have a lot more trouble justifying their treason to non-intellectuals when its consequences include the agonizing deaths of thousands caught on videotape.

It’s not a game anymore. Ideas have consequences; postmodernism and multiculturalism are no longer just instruments in the West’s intramural games of one-upmanship. They have become an apologetic for barbarians who, quite literally, want to kill or enslave us all. Those ideas — and the people who promulgate them — should be judged accordingly.

Nov 06

Post-postmodern politics

The Democratic Party fell off a cliff last night. Never mind their
shiny new governorships — the `smart’ money pre-election was on
them picking up an absolute majority of governor’s seats, and at the
Congressional level they took a shellacking nearly as bad as 1994’s.
The races Terry McAuliffe targeted as most critical — notably
the Florida governorship — were all lost. And the big Democrat
losses bucked historical trends — the mid-term election and the
weak economy should have helped them.

We’re going to hear a lot of gloating from Republicans and
soul-searching from Democrats in the aftermath. The easy explanation
is that 9/11 did the Democrats in; that Americans elected to get behind
a president who seems to be handling the terror war with decisiveness,
prudence, and strategic acumen.

I think this conventional wisdom is wrong. I think 9/11 merely
exposed a longer-term weakness in the Democratic position, which is
this: the Democrats have forgotten how to do politics that is about
anything but politics itself. They’re a post-modern political party,
endlessly recycling texts that have little or no referent outside
the discourse of politics itself.

The disgusting spectacle they made of Paul Wellstone’s funeral
is diagnostic. We were treated to trumpet calls about honoring
Wellstone’s legacy without any discussion beyond the most superficial
cliches of what that legacy was. All the ritual invocations of
time-honored Democratic shibboleths had a tired, shopworn, unreal
and self-referential feel to them — politics as the literature
of exhaustion.

The preconditions for paralysis had been building up for a long
time; arguably, ever since the New Left beat out the Dixiecrats for
control of the party apparat in 1968-1972. Caught between the
blame-America-first, hard-left instincts of its most zealous cadres
and the bland dishwater centrism recently exemplified by the DLC, the
Democrats found it more and more difficult to be about anything at
all. The trend was self-reinforcing; as Democratic strategy drifted,
the party became ever more dependent on cooperation between dozens of
fractious pressure groups (feminists, gays, race-baiters, the AARP,
the teachers’ and public-employee unions), which made the long-term
drift worse.

Bill Clinton was the perfect master of political postmodernism and
James Carville his prophet. For eight years they were able to
disguise the paralysis and vacuum at the heart of Democratic thinking,
centering party strategy on a cult of personality and an
anything-but-Republicanism that was cunning but merely reactive. The
Republicans cooperated with this strategy with all the naive eagerness
of Charlie Brown running up to kick Lucy’s football, perpetually
surprised when it was snatched away at the last second, repeatedly
taking pratfalls eagerly magnified by a Democratic-leaning national
media.

But Bill Clinton was also a borderline sociopath and a liar, a man
whose superficial charm, anything-to-get-elected energy, and utter
lack of principle perfectly mirrored the abyss at the heart of the
Democratic party. The greedy, glittery, soulless Wellstone-funeral
fiasco was the last hurrah of Clintonism, and it cost Walter Mondale
his last election fight.

Reality had to intrude sometime. The destruction of the WTC
reduced all the politics-about-politics rhetoric of the Democrats to
irrelevance. They stood mute in the face of the worst atrocity on
American soil since Pearl Harbor, arguably the worst in U.S. history.
The superficial reason was that their anti-terror policy was hostage
to the party’s left wing, but the deeper problem was that they long
ago lost the ability to rise above petty interest-group jockeying
on any issue of principle at all. The most relevant adjective is not
`wrong’, or `evil’, it’s `feckless’.

Republicans, by contrast, forged a workable consensus during
the Reagan years and never quite lost it. They’ve often been wrong,
frequently been obnoxious as hell, and have their own loony fringe
(abortion-clinic bombers, neo-fascists like Pat Buchanan, and
the Christian Coalition) to cope with. But when Osama bin Laden
demonstrated a clear and present danger to the United States of
America they were able to respond.

They were able to respond not merely with reaction, but by taking
a moral position against terrorism that could serve as the basis of
an effective national strategy. Quarrel with “Homeland Security” all
you like — but then imagine Al Gore in charge of defeating
Al-Qaeda and shudder. He would actually have had to take the likes of
Cynthia McKinney and Maxine Waters seriously.

I think these 2002 elections are going to turn out to have been much
more of a turning point than the aborted `Republican Revolution’ of
1994. Unless Bush’s war strategy completely screws the pooch, he is
going to completely walk over the Democratic candidate in 2004. The
Democrats show no sign of developing a foreign-policy doctrine that can
cope with the post-9/11 world, and their domestic-policy agenda is
tired and retrogressive. Their voter base is aging, and their national
leadership couldn’t rummage up a better Wellstone replacement than
Walter “What decade is this, anyway?” Mondale. The Democratic
party could end up disintegrating within the decade.

This is not a prospect that fills me with uncomplicated glee.
Right-wing statism is not an improvement on left-wing statism; a smug
and dominant GOP could easily become captive to theocrats and
know-nothings, a very bad thing for our nation and the world. And,
unfortunately, the Libertarian Party has courted self-destruction by
choosing to respond to 9/11 with an isolationism every bit as vapid
and mindless as the left’s “No War for Oil!” chanting.

Welcome to post-postmodern politics. Meaning is back, but
the uncertainties are greater than ever.


Nov 02

The capsaicinization of American food

Consider spicy-hot food — and consider how recent it is as a
mainstream phenomenon in the U.S. In 2002 many of us cheerfully chow
down on Szechuan and Thai, habaneros and rellenos, nam pla and sambal
ulek. Salsa outsells ketchup. But it wasn’t always that way.

In fact it wasn’t that way until quite recently, historically
speaking. I’ve enjoyed capsaicin-loaded food since I was a pre-teen
boy in the late 1960s; I acquired the taste from my father, who picked
it up in South America. In those days our predilection was the
peculiar trait of a minority of travelers and a few immigrant
populations. The progression by which spicy-hot food went from there
to the U.S. mainstream makes a perfect type case of cultural
assimilation, and the role and meaning that the stuff has acquired on
the way is interesting too.

(Oh. And for those of you who don’t understand the appeal? It’s
all about endorphin rush, like a runner’s high. Pepper-heads like me
have developed a conditioned reflex whereby the burning sensation
stimulates the release of opiate-like chemicals from the brainstem,
inducing a euphoria not unlike a heroin buzz. Yes, this theory has
been clinically verified.)

Baseline: Thirty years ago. The early 1970s. I’m a teenager, just
back in the U.S. from years spent overseas. Spicy-hot food is pretty
rare in American cuisine. Maybe you’d have heard of five-alarm chili
if you’d lived in Texas, but chances are you’d never have actually
eaten the stuff. If you’re from Louisiana, you might have put Tabasco
sauce on your morning eggs. Aside from that, you wouldn’t have
tasted hot peppers outside of a big-city Chinatown.

It’s actually a little difficult to remember how different American
cooking was then. Those were the years when Cool Whip was cool and
the casserole was king, an era of relentless blandness well-skewered
by James Lileks’s Gallery of Regrettable Food. Mom didn’t know any
better. Well, most moms didn’t, anyway; mine had acquired a few clues
overseas.

But most Americans of that day inherited the pale hues of British
and German cooking. What zip there was in our cuisine came from
immigrants, especially (at that time) Italians. Thai, Vietnamese
and Ethiopian had not gained a foothold. Chinese was on educated
peoples’ radar but only eaten in restaurants; nobody owned a wok
yet.

Indeed, Chinese food had already caught on in a few leading-edge
subcultures by the mid-1970s: science-fiction fans, computer hackers,
the people who would start to call themselves `geeks’ fifteen years
later. But most of what was available was Americanized versions of
the blander Shanghainese and Cantonese varieties; restaurants that
made a point of authenticity and advertised Szechuan and Hunan cooking
to round-eyes were not yet common.

This all began to change in the early 1980s. The yuppies did it to
us; experimentation with exotic and ethnic foods became a signature
behavior of the young, upwardly mobile urban elite, and the variety of
restaurants increased tremendously in a way that both met that demand
and stimulated it. More importantly, cooking techniques and
ingredients that hadn’t been traditional in European cuisine started
to influence home cooking — white people started buying
woks. And Szechuan fire oil.

The first vogue for Cajun cooking around 1984 was, as I recall,
something of a turning point. Chinese cooking was popular but still
marked as `foreign’; Cajun was not. Spicy-hot gumbo joined five-alarm
chili on the roster of all-American foods that were not only expected
but required to deliver a hefty dose of capsaicin zap. I
remember thinking the world was changing when, in 1987 or ’88, I
first saw spicy Cajun dishes on the menu of a white-bread roadside
diner. In Delaware.

This diner was never going to show up in Michelin’s or Zagat’s; in
fact, it was the next thing to a truck stop. Something else was going
on in the 1980s besides yuppies buying woks — and that was the
embrace of spicy-hot food by the small-town and rural working class,
and its coding as a specifically masculine pleasure.

This probably evolved out of the tradition, going back at least to
the late 1940s, of defining barbecue and chili as what an
anthropologist would call a “men’s mystery”. Despite the existence of
male professional chefs and men who can cook, most kinds of domestic
cooking are indisputably a female thing — women are expected to
be interested in it and expected to be good at it, and a man who
acquires skill is crossing into women’s country. But for a handful of
dishes culturally coded as “men’s food”, the reverse is true.
Barbecue and chili top that list, and have since long before spicy-hot
food went mainstream.

For people who drive pickup trucks, spicy-hot food went from being
a marked minority taste to being something like a central men’s
mystery in the decade after 1985. I first realized this in the early
1990s when I saw a rack of 101 hot-pepper sauces on display at a
gun-and-knife show, in between the premium tobacco and the jerked
meat. There’s a sight you won’t see at a flower show, or anywhere else
in women’s country.

The packaging and marketing of hot sauces tells the same
story. From the top-shelf varieties like Melinda’s XXX (my favorite!)
to novelty items like “Scorned Woman” and “Hot Buns”, much of the
imagery is cheeky sexiness clearly designed to appeal to men.

Nor is it hard to understand why the association got made in the
first place. It’s considered masculine to enjoy physical risk, even
mostly trivial physical risks like burning yourself on a sauce hotter
than you can handle. Men who like hot peppers swap capsaicin-zap
stories; I myself am perhaps unreasonably proud of having outlasted
a tableful of Mexican college students one night in Monterrey,
watching them fall out one by one as a plate of sauteed habaneros
was passed repeatedly around the table.

There’s a sneaky element of female complicity in all this. Women
chuckle at our capsaicin-zap stories the same way they laugh at other
forms of laddish posturing, but then (as my wife eloquently puts it)
“What good is a man if you rip off his balls?” They leave us capsaicin
and barbecue and other men’s mysteries because they instinctively grok
that a certain amount of testosterone-driven male-primate behavior is
essential for the health of Y-chromosome types — and best it
should be over something harmless.

This gastronomic pincer movement — Yuppies pushing spicy food
downmarket, truckers and rednecks pushing it upmarket —
coincided with the rise in cultural influence of Hispanics with a
native tradition of spicy-hot food. In retrospect, it’s interesting that
what mainstream America naturalized was jalapenos rather than
Chinese-style fire oil. Tex-Mex assimilated more readily than
Szechuan, as it turned out.

We can conveniently date that mainstreaming from the year salsa
first passed ketchup in sales volume, 1996. Perhaps not by
coincidence, that’s the first year I got gifted with a jar of
homegrown habaneros. They came to me from an Irish ex-biker, a
take-no-shit ZZ-Top lookalike who runs a tire dealership in the next
town over. He’d be a great guy to have with you in a bar fight, but
not someone who would ever be accused of avant-garde tastes. I guess
that was when I realized spicy-hot food had become as all-American
as apple pie.


Jul 18

The Non-Portability of Barbecue

(Originally titled: Travelling in Texas)

I was on the road in Texas last week, addressing Linux user groups in Dallas and Austin. I always enjoy visiting Texas. It’s a big, wide-open place full of generous people who cultivate a proper appreciation of some of my favorite things in life — firearms, blues guitar, and pepper sauces.

And, of course, one of the biggest things Texas has going for it is barbecue. And not the pallid imitation served up by us pasty-faced Yankees here where I live (near Philadelphia, PA) but the real thing. Barbecue, dammit. Red meat with enough fat on it to panic a health-foodist right out of his pantywaist, slow-cooked in a marinade sweeter than a mother’s kiss and eaten with sauces hot enough to peel paint. Garnish with a few extra jalapenos and coleslaw and wash it down with cheap soda, lemonade, or beer. Food of the gods.

I swear your testosterone level goes up just smelling this stuff. After a few mouthfuls of Rudy’s carnivoral bliss you’ll be hankerin’ to cultivate a drawl, wear a Stetson and drive a pickup truck with a gun rack. (I draw the line at country music, though. A man’s got to have some standards.)

At a real barbecue joint like Rudy’s (“Worst barbecue in Texas!”) they serve you piles of beef, pork and chicken wrapped in butcher paper in a plastic basket. No plates, just more butcher paper and bread. And, unfortunately, the bread is where this gustatory Nirvana nearly crashes back to earth. Because the bread at real barbecue places is invariably utter crap — spongy sliced white with all the taste of building insulation.

Here in Philadelphia we can’t make barbecue worth a damn, but we know better than to put a hot sandwich on American bread. One of our regional-food glories is the Philly steak sandwich, fried beef and onions and mushrooms (and usually cheese, but I don’t eat cheese) nestled in a foot-long Italian roll. The bread is important. It’s tasty, it’s chewy, it’s got a crust on it. It’s worthy of respect. One of the reasons you can’t get a decent steak sandwich more than fifty miles from Billy Penn’s hat is that bread. It depends on an Italian baking tradition that just doesn’t exist outside the mid-Atlantic metroplex, and is found in its highest form only in Philly and South Jersey. Philadelphians laugh at the pathetic imitations of “Philly steaks” offered elsewhere for the same reason Texans laugh at barbecue made north of the Mason-Dixon line. And both groups are right to laugh. It just ain’t the same.

Every time I order up a mess of barbecue at a place like Rudy’s or County Line or Dick’s Last Resort I think to myself “Someday, one of these barbecue outfits has got to start offering decent bread. Their sales would go through the roof.” I’ve been waiting for the market to correct this problem for more than twenty years now — and it hasn’t happened. And thereby hangs a mystery.

The mystery is the curious persistence of regional food differences in a country with cheap transport and the best communications network in the world. There are places in the U.S. where you can reliably get really good bread — mostly the coastal metroplexes. There are places you can get real barbecue, in the heartland South and Southwest. And these zones just don’t overlap. (Yes, they have a gourmet-bread bakery in Austin. I suspect, if I went there, I’d find it a lot like the Chinese food in Ann Arbor — impressive to the locals, maybe, but only because their standards are so low.)

I could multiply examples. Sourdough bread — I’ve had it everywhere you can get it and it just doesn’t taste right outside of San Francisco. The East Coast versions are competent, but lack some subtle tang. Yeast strain? Something in the water? Who knows?

Cheesecake. There’s a good one. Anybody who has lived in New York won’t touch most cheesecake made elsewhere at gunpoint, and with good reason. Next to a traditional New-York-style baked cheesecake (the kind you can stand a fork in because it has the approximate density of neutronium) all others are a sort of pathetic, tasteless cheese gelatin. In this case the recipe is clearly what matters.

Or deep-dish pizza. Try to get that done right anywhere but Chicago. Good luck. Actually, the Philly/South Jersey area may be the only other part of the U.S. that can almost make this nut, and our thin-crust pizza is better. But why? Why don’t the good techniques go national and drive out the weaker competition?

The obvious answer would be that nationwide, tastes differ too much for one regional variant to dominate. But in many cases there isn’t even any dispute about where the best variant comes from; the superiority of “New York style” cheesecake, for example, is so universally understood that restaurants elsewhere often bill their cheesecake that way even when it’s actually half-composed of “lite” garbage like ricotta or cottage cheese. Nobody who has ever tasted one doubts that Philly steaks are the acme of the art. And nobody — but nobody — who can get both passes up Texas barbecue for what they make in New Haven or Walla Walla.

So you’d think that the market would have propagated Texas slow-cooking, San Francisco yeast starters and the Philly steak roll all over the country by now. But some food technologies travel better than others, and some seem curiously unable to thrive outside their native climes. Cheesecake recipes may survive transmission relatively well, but the mysteries of good barbecue are subtle and deep. Pizzas rely on elaborate oven and dough-mix technology that probably tends to conserve regional variations simply because it’s too capital-intensive to mess with casually.

I’ve meditated on the matter and still can’t decide whether I think that’s a good thing or not. The approved thing for travel writers to do is wax lyrical about the wonderfulness of regional variety, as if it would somehow fail to be an improvement in the world if I could get decent bread with my barbecue. The hell with that kind of sentimentality; I’d rather have a better meal.

But there’s a point buried there somewhere — something that isn’t about the bread or the barbecue, but about what it feels like to sit in a dusty roadside joint like Rudy’s, surrounded by cases of Red Pop and overweight rednecks in tractor caps and checked shirts, with the food of the gods melting in your mouth, and thinking “Damn, this place is tacky, but I hope it lives forever.”

And you know what? I suspect that kind of barbecue joint will live forever, or as close to forever as humans manage, anyway. They’ve probably existed since the first proto-hominids roasted mammoth haunch over a slow fire, washing it down with some badly-made tuber-beer equivalent of Red Pop. And their equivalents will probably persist in the zero-gee arcologies and Dyson spheres of the year 3000. Even if they get hip about the good bread, somewhere in the universe there will always be a Texas. And that’s a good thing.

UPDATE: Some respondents have reminded me of the Piedmont (and especially North Carolina) tradition of pulled-pork barbecue. Let me state for the record that I find it equally delicious. Both the Texas and Piedmont versions are so damn good that there is no call for petty disputation about which is superior. But for those of you who know what I am talking about, I am quite partial to burnt
ends.

UPDATE: Jane Galt has commented in her usual witty and illuminating fashion.

UPDATE: The mystery of San Francisco sourdough, was, as it turns out, solved in 1970. You can buy a starter with the proper symbiosis of bacteria and yeast — and, contrary to myth, local bacteria won’t overwhelm it. Of course this makes it harder to understand why the stuff isn’t everywhere…


Jul 17

Diet Considered as a Bad Religion

A current New York Times news story, What If It’s All Been A Big Fat Lie, entertainingly chronicles the discovery that low-fat diets are bad for people. More specifically, that the substitution of carbohydrates like bread and pasta and potatoes for meat that we’ve all had urged on us since the early 1980s is probably the cause of the modern epidemic of obesity and the sharp rise in diabetes incidence.

I have long believed that most of the healthy-eating advice we get is stone crazy, and the story does tend to confirm it. One of my reasons for believing this is touched on in the article; what we’re told is good for us doesn’t match what humans “in the wild” (during the 99% of our species history that predated agriculture) ate. The diet our bodies evolved to process doesn’t include things like large amounts of milled grain or other starches. Our hunter-gatherer ancestors ate wild vegetables (especially tubers) and meat whenever they could get it.

I’ve always had to suppress a tendency to laugh rudely when vegetarians touted their diet as “natural”. Vegetarianism is deeply unnatural for human beings; it’s marginally possible in warm climates only (there are no vegetarians in Tibet because the climate kills them), and only possible even there because we’re at the near end of 4,000 years of breeding for high-caloric-value staple crops.

So what’s the natural diet for human beings? Our dentition (both slashing and grinding teeth) and the structure of our digestive system (short colon, no rumen) are intermediate between those of herbivores like cows and obligate carnivores like cats; both systems resemble those of non-specialized omnivores like bears. Actually, the earlier hominids in the human ancestral line were designed for a more vegetarian diet than we; they had large flat molars and powerful jaws designed for grinding seed-cases. The increase in brain size in the hominid line correlates neatly with a shift to a more carnivorous dentition and skull structure.

Physical anthropologists will tell you that the shift from hunter-gatherer existence to sedentary agriculture enabled human beings to live at higher population densities, but at the cost of a marked deterioration in the health of the average person. The skeletons of agricultural populations are shorter, less robust, and show much more evidence of nutritional diseases relative to their hunter-gatherer ancestors.

For twenty years I’ve consciously been trying to eat what I think of as a caveman diet — heavy on the meat and raw vegetables, very little sugar, light on the starches. I’m a bit overweight now, not seriously so for a 44-year-old man, but enough to notice; what this NYT article tells me is that I didn’t follow my own prescription strictly enough and ate too much bread and potatoes.

But the evolutionary analysis only tells us what we probably should be eating. It doesn’t explain how the modern diet has come to be as severely messed up as it is — nor why the advice we’ve been getting on healthy eating over the last twenty years has been not merely bad but perversely wrong.

The answer is, I think, implicit in the fact that “health food” has a strong tendency to be bland, fibrous, and nasty — a kind of filboid studge that we have to work at convincing ourselves we like rather than actually liking. Which is, if you think about it, nuts. Human food tropisms represent two million years of selective knowledge about what’s good for our bodies. Eating a lot of what we don’t like is far more likely to be a mistake than eating things we do like, even to excess.

Why do we tend to treat our natural cravings for red meat and fat as sins, then? Notice the similarity between the rhetoric of diet books and religious evangelism and you have your answer. Dietary mortification of the flesh has become a kind of secular asceticism, a way for wealthy white people with guilt feelings about their affluence to demonstrate virtue and expiate their imagined transgressions.

Once you realize that dieting is a religion, the irrationality and mutual contradictions become easier to understand. It’s not about what’s actually good for you, it’s about suffering and self-denial and the state of your soul. People who constantly break and re-adopt diets are experiencing exactly the same cycle of secondary rewards as the sinner who repeatedly backslides and reforms.

This model explains the social fact that the modern flavor of “health”-based dietary piety is most likely to be found in people who don’t have the same psychological needs satisfied by an actual religion. Quick now: who’s more likely to be a vegetarian or profess a horror of “junk food” — a conservative Christian heartlander or a secular politically-correct leftist from the urban coasts?

The NYT article tells us that the dominant dietary religion of the last twenty years is cracking — that the weight of evidence against the fat-is-evil/carbs-are-good theory is no longer supportable. Well and good — but it won’t necessarily do us a lot of good to discard this religion only to get stuck with another one.

I say it’s time to give all bossy nutritionists, health-food evangelists and dietary busybodies the heave-ho out of our lives — tell the sorry bitches and bastards to get over themselves and go back to eating stuff that tastes good and satiates. And enjoy the outraged squawking from the dietarily correct — that, my friends, is the music of health and freedom.


Jun 16

The Elephant in the Bath-House

Mary Eberstadt’s Weekly Standard article The Elephant in the
Sacristy shines a strong light on facts that
will discomfit many of the politically correct. I don’t completely
agree with her analysis; as Amy Welborn argues, Ms. Eberstadt is too quick to dismiss the role of the
doctrine of celibacy in creating an ingrown, perfervid, and corrupt sexual
culture among priests, and too easy on the culture of secrecy and denial
within which priestly abuse flourished.

I would go further than Ms. Eberstadt or Ms. Welborn; I think this
scandal is grounded in the essentials of Catholic doctrines about sex,
sin, guilt, and authority. This is not an accidental corruption of
the church, any more than Stalin was an accidental corruption of
Communism. Bad moral ideas have consequences, and those consequences
can be seen most clearly in the human monsters who are both created by
those ideas and exploiters of them. There is a causal chain that
connects loathsome creatures like the “Reverend” Paul Shanley directly
back to the authoritarianism and anti-sexuality of St. Augustine; a
chain well-analyzed by psychologists such as Stanley Milgram and
Wilhelm Reich. I suggest that any religion that makes obedience to
authority a primary virtue and pathologizes sex will produce abuses
like these as surely as rot breeds maggots.

One need not, however, attack the essentials of Catholic doctrine
to agree with Ms. Eberstadt’s main point: that the dominant media
culture seems bent on obscuring a central fact about the pattern of
crimes — which is that they are predominantly homosexual abuse by
priests with a history of homosexual activity. Cases of priestly abuse
of females of any age are rare (though at least one horrifying tale of
multiple priests cooperating in the abuse of a teenage girl has
surfaced from California). The overwhelming majority of the cases
involve either pederasty (homosexual acts with post-pubescent boys and
young men) or homosexual pedophilia with pre-pubescent boys as young
as six years old. Yet you would be hard-put to deduce this from most
of the vague accounts in the U.S. media, which traffic in terms that
seem designed to obscure the gender and age of the victims and the
homosexual orientation of almost all the abusers. Why is that?

Apparently, because one of the rules of the U.S.’s dominant media
culture is that Homosexuals Are Not To Be Stigmatized (I think it’s
carved in stone right next to “Environmentalists are Saints” and “Gun
Owners are Redneck Nut-Jobs”). Gay conservative Andrew Sullivan
famously noted this rule in connection with the Jesse Dirkhising
murder. We are not supposed to think of either Jesse’s murderers
or abusive priests as homosexuals; that might reflect badly on a
journalistically-protected class by associating it with criminal
behavior.

But more than that; the truth the dominant media culture really
doesn’t want to go near is that pederasty has never been a marked or
unusual behavior among homosexuals, and even advocates of outright
pedophilia are not shunned in the homosexual-activist community.

The public spin of gay activist groups like Queer Nation is that
most male homosexual behavior is androphilia, adult-to-adult
sex between people of comparable ages. And indeed, gay historians agree with
anthropologists that in the modern West, androphilia is more common
relative to pederasty and homosexual pedophilia than has been
historically normal. But another way of putting this is that in most
other cultures and times, pederasty and pedophilia have been more
common forms of homosexuality than androphilia.

Pederasty, at least, remains a common behavior among modern
homosexuals. The `twink’ or compliant teenage boy (usually blond,
usually muscled, depicted in the first dewy flush of postpubescence)
is the standard fantasy object of gay porn. By contrast, I learned
from recent research that the archetypal fantasy object of straight
porn is a
fully-developed (indeed, usually over-developed) woman in her early
twenties. And a couple of different lines of evidence (including
surveys conducted within the gay population by gays) lead to the
conclusion that older homosexuals actually pursue boys quite a bit
more frequently than either older lesbians or older heterosexual men
pursue girls.

Homosexual activists, when challenged on this point, like to retort
that older men nailing barely-nubile teenage girls is far more
common. And in absolute terms it is — but only because there are
twenty-five to a hundred times more straight men than there are gay
men in the world (reliable figures for the incidence of male
homosexuality range between 1% and 4%). Per capita among gays,
pederasty is more frequent than among straights by a factor of
between three and ten, depending on whose statistics you believe —
and the North American Man-Boy Love Association, actively advocating
pederasty and pedophilia, is welcomed at gay-pride events
everywhere.

If the prevalence of homosexuality in the Catholic priesthood is
the elephant in the sacristy, the homosexuality/pederasty/pedophilia
connection in gay culture is the elephant in the bath-house. No
amount of denying it’s there is going to make the beast go away.

But homosexual activists don’t want straights to see the elephant,
and no wonder. One of the most persistent themes to show up in
hostility towards homosexuals is the fear that they will recruit
impressionable boys who might otherwise have grown up straight. Thus
their insistence for straight consumption that homosexuality is an
inborn orientation, not a choice. Thus also their insistence that the
gay life is all about androphilia, none of that pederasty or
pedophilia stuff going on here. And thus, they’d rather not have
anyone thinking about the fact that most priestly abuse is in fact
classically pederastic and pedophilic behavior by men who behave as
homosexuals and identify themselves as gay.

That there is a pattern in the national media of political
correctness and spin on behalf of preferred `victim’ groups isn’t
news, nor is the fact that homosexuals are among those groups. But
get this: Richard Berke, the Washington editor of the New York Times,
recently said “literally three-quarters of the people
deciding what’s on the front page are not-so-closeted homosexuals”.
There you have it in plain English; gays run the “newspaper of
record”. Berke made these comments before a gay advocacy group — not
merely admitting but outright asserting, as a matter of
pride, that the Times engages in gay-friendly spin
control. And it has already been well established by statistical
content studies that the national media tend to follow where they’re
led by the Times and a handful of other prestige
newspapers, all broadly similar in editorial policy.

The expected next step in this sequence would be for me to start
screaming about the evil of it all and demand that Something Be Done.
If I were a conservative, that’s what I’d do. But in fact it’s not
self-evident that this particular disinformation campaign is worth
anybody’s time to be concerned about, except as yet another example of
wearily predictable bias in the dominant media culture. Whether it is
or not depends upon one’s value judgment about consensual pederasty
and pedophilia.

NAMBLA and its sympathizers in the rest of the gay community think
they’re engaged in a worthy campaign for sexual liberation. If they
are right, then the anti-antigay spin on the priestly-abuse scandal is
arguably analogous to what pro-civil-rights sympathizers in the early
1960s might have done if there had been a long string of incidents of
black men seducing white women, both parties violating
the miscegenation laws still on the books in many states at that
time.

The pro-spin argument would have run like this: interracial sex is
taboo for no good reason, so soft-pedaling the race of the people involved
as much as possible is a justifiable form of suppressio veri:
not outright lying but being economical with the truth. Our readers will
be able to deduce the whole truth if they put in even a little effort, but
we needn’t pave the road for them. By doing this, we will avoid inflaming
racial bigotry and advance the worthy cause of civil rights.

For this analogy to hold good, we need two preconditions. First,
we must believe that almost all the pederasty/pedophilia between
priests and boys has been voluntary. Second, we must believe that
consensual pederasty and pedophilia are not, in fact, harmful to the
boys involved. Intellectual honesty (and, I’ll admit, a low delight
on my part in watching prudes and cultural conservatives turn purple
with indignation) demands that we not dismiss this case without
looking at the evidence.

The modern West condemns pederasty and pedophilia. Our cultural
ancestors did not always do so; among the Athenian Greeks consensual
pederastic relationships were praised and thought to be a good deal
for both parties. Pederasty is socially normal in Afghanistan and
other parts of the Islamic world; pederasty and pedophilia are also
un-tabooed in parts of Southeast Asia and in Japan. Where pederasty
and pedophilia are not taboo, the boys who participate in it
frequently grow up to form normal heterosexual relationships and marry.
In fact, it’s the modern West’s hard separation between straights
who never have sex with other males and gays who
never have sex with females that is anthropologically
exceptional.

Of course, the fact that pederasty and pedophilia have been an
approved practice in other cultures does not automatically mean we
should give them a nod. Cannibalism, slavery and infanticide have
been approved practices too. But the anthropological evidence doesn’t
suggest that boys who have voluntary sex with men automatically turn
into traumatized basket cases; indeed some present-day cultures agree
with the ancient Greeks that such liaisons are good for the maturation
of boys. There are real secondary risks, starting with the fact that
anal sex is a much more effective vector of venereal diseases such as
AIDS than is vaginal sex — but given a cultural context that doesn’t
stigmatize the behavior, clear evidence that consensual pederasty and
pedophilia are intrinsically damaging is remarkably hard to find.

Accordingly, NAMBLA may well be right on one level when they argue
that what matters is not so much which tab A gets put into which slot
B, but whether the behavior was coerced or consensual. According to
this argument, the elephant in the bath-house can be lived with —
might even be a friendly beast — if it’s docile-tempered and won’t
give the tusk to unconsenting parties.

Gay men, or at least the sort of university-educated gay men who
wind up determining what’s on the front page of the New York Times
and spiking stories like the Dirkhising murder, know
these facts. How surprising would it be if they interpreted most
victims’ charges of abuse as a product of retrospective false
consciousness, implanted in them by a homophobic and gay-oppressing
culture? By suppressing the homosexual identification of most of the
accused priests, gays in the media can protect their own sexual and
political interests while believing — perhaps quite sincerely — that
they are quietly aiding the cause of freedom.

The trouble with this comforting lullaby is that, even if NAMBLA is
right, coercion matters a lot. As Ms. Eberstadt
reports, the pederastically and pedophilically abused often become
broken, dysfunctional people. They show up in disproportionate numbers
in drug and alcohol rehab. They have a high rate of involvement in
violent crime. Worse, they tend to become abusers themselves,
perpetuating the damage across generations.

Voltaire once said “In nature there are no rewards or punishments,
only consequences”. Gays experimented with unfettered promiscuity in
the 1970s and got AIDS as a consequence. The mores of gay bath-house
culture turned out to be broken in the way that ultimately matters; a
lot of people died horribly as a result of them.

It may turn out that the consequences of sympathizing with NAMBLA
are almost equally ugly. If a climate of `enlightened’ tolerance for
consensual pederasty and pedophilia tends to increase the rate at
which boys are abused, that is a very serious consequence for which gay
liberationists will not (and should not) soon be forgiven.
The homosexual gatekeepers at the Times may be making
themselves accessories before and after the fact to some truly hideous
crimes.

And this is where we come back to the priestly-abuse scandal.
Because a theme that keeps recurring in
histories
of the worst abusers is that they were trained in
seminaries that were run by homosexual men and saturated with
gay-liberationist subculture. Reading accounts of students at one
notorious California seminary making a Friday-night ritual of cruising
gay bars, it becomes hard not to wonder whether gay culture itself has
been an important enabler of priestly abuse.

Now it’s time to abandon the catch-all term abuse and speak plainly
the name of the crime: sexual coercion and rape. It is very clear
that pederasts and pedophiles in the priesthood have routinely used
their authority over Catholic boys not merely to seduce them, but to
coerce and rape them. In a few cases the rape has been overt and
physical, but in most cases it has been a subtler and arguably more
damaging rape of the victim’s mind and self.

The single most revolting image I have carried away from the
priestly-abuse scandal is victims’ accounts of priests solemnly
blessing them after sex. That is using the child’s religious feelings
and respect for authority to make him complicit in the abuse. If I
believed in hell, I would wish for the priests who perpetrated this
kind of soul-rape to fry in it for eternity.

And we must call it rape; to do otherwise is to suppose that
most of the thousands of known victims wanted to be sodomized. Even
if we discard the victims’ and witnesses’ reports, this is highly
unlikely; there were simply too many victims. Some priests had sex
with hundreds of boys, far too many to fit within the 1-4% of the
boys they had access to who could plausibly have been homosexually oriented.
And we are not entitled to dismiss the victims’ protests in any case,
not given the corollary evidence that the trauma of abuse reverberated
through the victims’ lives, continuing to damage them years and
decades afterwards. Comforting gay-lib delusions about false
consciousness won’t wash here.

Continuing our civil-rights analogy, the correct parallel would
have been with an epidemic of interracial rape, rather than
cohabitation. Had there in fact been such an epidemic, civil-rights
proponents would have faced the question of whether black men had a
particular propensity to rape white women. The analogous question,
whether homosexual men have a particular propensity to rape boys, is
precisely the one that homosexuals and their sympathizers in the media
don’t want anyone to examine — and precisely the question that the
priestly-abuse scandal demands that we ask.

It’s easy to sympathize with gay activists’ fears that opening this
question will expose them to a firestorm of prejudice from people
who will prejudge the answer out of anti-gay bigotry. But the
pattern of homosexual abuse by the Catholic priesthood has been so
egregious and so longstanding that we need to understand the relative
weight of all the causes that produced it — whether those
causes are specific to Catholicism or more general.

Are gay men biologically or psychologically prone to rape boys at a
level that makes even a gay man without a known history of abuse
a bad risk around boys? Does queer culture encourage a tendency to
rape in gay men who are put in authority over boys?

Here is where the question becomes practical: were the Boy Scouts
of America so wrong to ban homosexual scoutmasters? And here we are
with a crashing thud back in the realm of present politics. After the
numbing, horrifying, seemingly never-ending stream of foul crimes
revealed in the scandal, even staunch sexual libertarians like your
humble author can no longer honestly dismiss this question simply
because it’s being raised by unpleasant conservatives.

The priestly-abuse scandal forces us to face reality. To the
extent that pederasty, pedophilic impulses, and twink fantasies are
normal among homosexual men, putting one in charge of adolescent boys
may after all be just as bad an idea as waltzing a man with a known
predisposition for alcoholism into a room full of booze. One wouldn’t
have to think homosexuality is evil or a disease to make institutional
rules against this, merely to notice that it creates temptations best
avoided for everyone’s sake.

Blogspot comments

May 15

Terrorism Becomes Bad Art

Minnesota art student Luke Helder has been charged with the recent string of Midwestern mailbox bombings. There doesn’t seem to be much doubt that he’s the perpetrator.

An art student. Yeah. That fits; the tone of the portentous twaddle in pipe-bomb-boy’s manifesto was exactly that of the artist manqué, big ideas being handled stupidly by a doofus whose ambition exceeds both his talent and his intellect. He fronted a grunge band called “Apathy”, we hear.

You know what? I’d lay long odds the band sucks. And I’m not making that guess out of hostility or contempt, either, but because an artist with any confidence in his own ability would have found it a much better way to achieve his artistic goals than anonymously bombing mailboxes. (Artistic goals, in a guy that age, usually have a lot to do with meeting girls. I was a rock musician in my youth, and am therefore un-foolable on this issue.)

It was inevitable, I suppose, that sooner or later terrorism would become bad performance art. It’s easy to condemn pipe-bomb-boy for callously putting people at lethal risk with his toys, but difficult to summon up the kind of personal hatred for this perpetrator that Al-Qaeda’s flamboyant fanatic nut-jobs have so richly earned. I think our ire might be more properly directed elsewhere — at all the people who have cooperated in dumbing down the definition of `art’ so completely that Luke Helder actually thought he was doing it.

Once upon a time, art had something to do with achieving a meeting of minds between artist and audience. The artist’s job was to rework the symbols and materials of his culture into expressions that affirmed and explored the values of that culture and pleased audiences. Artists operated within interpretive traditions that they shared with the non-artists in the audience. The truly able artist earned the privilege of making his work personal and individual, but only by successfully finding an audience and communicating with it in acceptable conventional terms first.

In the late 19th century Western culture began to admit a new definition of `art’ and a new role for artists. Under the influence of modernism and various post-modern movements, artists began to see their job as the systematic subversion of the interpretive traditions they had inherited. “Back to zero!” was the cry. After zero, the new goal could no longer be a meeting of minds in a culturally shared commons; instead, the audience’s minds were to be invaded by the disruptive brilliance of the artist’s individual insight.

In the hands of a few early moderns — Stravinsky, Brancusi, Picasso, Joyce — the new agenda produced astonishingly fine work. In the hands of too many others, it produced vacuous, narcissistic nonsense. Luke Helder inherited its most vulgar form — the notion that all the artist is required to do is “make a statement” about the contents of his own muddled mind, and it’s the world’s job to catch up.

Luke-boy’s last art project at school was “a pencil sharpener embedded in a tree stump that was rigged to illuminate Christmas lights as it sharpened pencils”. No comedian could make up such a perfect paradigm of bad art. The pointless artifice, the banal superficial cleverness, the utter lack of respect for materials, and the complete disconnection from the millennia-long cultural conversation that includes all the great art of our civilization. It’s really not a long step from this garbage to pipe bombs as `art’. Not a long step at all.

No account of Luke Helder suggests that he’s particularly evil. I wonder…suppose he had learned formal prosody, or how to paint in oils, or compose a fugue, or do figurative sculpture. Suppose he had learned artistic forms and media that were situated in history, connected with the world, concerned with beauty. Suppose he had been taught something for art to be about other than the vacancy in his own head. Suppose he had been taught (shocking concept) standards?

Perhaps, then, he would not have required explosives to express himself.

UPDATE: And back in 1996, there were conceptual art bombs in Seattle.

Blogspot comments

May 13

Acting White

Eugene Volokh comments that
many of the leading promoters of racial identity politics in the
U.S. have begun to lump Asians in with white people, but declines to
attempt an interpretation. Actually this development is very easy to
understand. All you need to break the code is to know that “white” =
“assimilated”.

Asians tend to be perceived as “white” not
because they have white skin but because they behave as white people
are expected to behave — they pursue prosperity and value education,
and seek to blend into the U.S.’s broad middle class rather than
creating a defiant, adversarial ghetto or barrio culture. Compare the
epithet “acting white”, used among urban blacks to sneer at kids with
black skin who work at being good students or holding down regular
jobs.

This is nothing new. Historically, “whiteness” has never been a
purely racial category. As late as the turn of the 20th century,
Irish immigrants in the U.S. were sometimes separated from “whites” in
speech and writing. Later, Eastern Europeans and Italians had to
assimilate to U.S. cultural norms before being considered as “white”
as the English, Germans, and Irish who had preceded them. Today,
prosperous Asians have edged over that border. In our big cities,
Chinese New Year is headed the way of the St. Patrick’s Day Parade,
becoming as American as apple pie.

So why aren’t black people white too? The answer, I suggest, has
very little to do with race and a lot to do with class —
specifically, the persistence of the black urban underclass. Not just
as a population but as a culture that remains mired in high crime,
high rates of single motherhood, high unemployment, and all the other
symptoms of high dependency on government largesse. The “Great
Society” programs of the 1960s and the race-hustling identity politics
that followed stalled out the assimilation process that turned Irish,
Italians, and (recently) Asians into whites.

Try to imagine a Korean equivalent of gangsta rap. Or a bunch of
Vietnamese high-school students taunting one of their own for “acting
white”. Or Chinese kids fixating exclusively on Chinese adults as
role models. These things don’t happen. And that’s why Asians are
white.

UPDATE: Several Asians have written to tell me that I was doing OK
until the last paragraph. There are anti-assimilationists among Asian
immigrants, as it turns out; there is, in fact, even Korean gangsta
music. However, my sources agree that these phenomena don’t persist
among American-born Asians. I think it’s also significant that Asian
anti-assimilationism is not a public phenomenon — it’s
visible to other Asians but there are no movies glorifying it nor
political organizations trading on it.

Blogspot comments