Dec 15

The Last Samurai

Hollywood has given us a run of surprisingly good movies recently.
By ‘surprisingly good’ I mean that they’re rather better
than one might expect from their genre. Looney Tunes: Back in
Action, for example, could have been a mere merchandising
vehicle, a repetition of clichés and tired sight gags. Instead
it was a wickedly funny combination of Animaniacs edginess with classic
Warner Brothers wackiness. It had a few moments of true brilliance
— the sequence in which Elmer Fudd chases Bugs and Daffy through
Salvador Dali’s “The Persistence of Memory” (think of melting clocks)
is jaw-droppingly wonderful, sublime art.

Master & Commander: The Far Side of the World was
also a surprising treat. I’ve read all 20 of the Aubrey/Maturin
novels. The movie doesn’t capture their texture and depth —
that would be impossible, they are deeply literary works — but
as an adventure movie that refers to the books without insulting the
reader’s intelligence it works quite well.

The Lord of the Rings and Harry Potter
movies are so good that hard-core fans of their respective books are
still pinching themselves, wondering when they’re going to wake up to
the discovery that they’re actually watching the usual dumbed-down
Hollywood crap. (I say this as a Tolkien fan so hard-core that I was
able to catch nuances of the spoken Elvish that weren’t in the
subtitles.)

Of course there have been dreadful turkeys where we expected
better, as well. The third Matrix movie and Star
Wars: Attack of the Clones
leap to mind. But dreadful turkeys
are part of the normal scene; what’s abnormal is that New
Line gave Peter Jackson the money and freedom to make
Rings movies that, while rushed and not without the
occasional compromise, are almost achingly good.

Think about it. When was the last time you saw a movie that (a) was
a book adaptation faithful enough for the fans to cheer it, (b) got
great reviews from movie critics, and (c) was boffo box office? Just
counting the Rings and Potter movies and Master & Commander,
we’ve now had five of these in relatively quick succession. Something
is going on here. Can it be that Hollywood is having an attack of
intelligence and taste?

(My wife Cathy suggests Saving Private Ryan as a
precursor of the trend.)

The movie that pushed me to think about this as a pattern, rather
than a series of isolated incidents, is The Last Samurai.
I’d been wanting to see this one since the first trailers six months
ago, but was braced for a disappointment on the scale of Pearl
Harbor. Hollywood’s record on wide-screen historicals is
dreadful; they tend to be laughably ahistorical — either
mindless spectacles or video sermonettes for whatever form of
political correctness was in vogue the week they were made. Remarkably,
The Last Samurai almost completely avoids these flaws.

I said “almost completely”. The movie is not without
flaws. But even the flaws are interesting. They illustrate the ways
in which Hollywood’s metric for a good (or at least successful) movie
is changing.

Let’s start with the bad stuff. First, way too much camera time
that could have been better employed gets spent on emotive closeups of
the lead’s phiz (a misfeature The Last Samurai shares with the
first two Rings movies, and one I am thus beginning to think of as
‘the Frodo flaw’). But this is Hollywood and it’s Tom Cruise and one
supposes such excess is inevitable.

Secondly, the movie is seriously anti-historical in one respect: we
are supposed to believe that traditionalist samurai would disdain the
use of firearms. In fact, traditional samurai loved firearms
and found them a natural extension of their traditional role as horse
archers. Samurai invented rolling volley fire three decades before
Gustavus Adolphus, and improved the musket designs they imported from
the Portuguese so effectively that for most of the 1600s they were
actually making better guns than European armorers could produce.

But, of course, today’s Hollywood left thinks firearms are
intrinsically eeeevil (especially firearms in the hands of anyone
other than police and soldiers) so the virtuous rebel samurai had
to eschew them. Besides being politically correct, this choice
thickened the atmosphere of romantic doom around our heroes.

Another minor clanger in the depiction of samurai fighting: We are
given scenes of samurai training to fight empty-hand and unarmored
using modern martial-arts moves. In fact, 1877 is about a
generation too early for this. Unarmed combat did not become a
separate discipline with its own forms and schools until the very end
of the nineteenth century. And when it did, it was based not on
samurai disciplines but on peasant fighting methods from Okinawa and
elsewhere that were used against samurai (this is why most
exotic martial-arts weapons are actually agricultural tools).

In 1877, most samurai still would have thought unarmed-combat
training a distraction from learning how to use the swords, muskets
and bows that were their primary weapons systems. Only after the
swords they preferred for close combat were finally banned did this
attitude really change. But, hey, most moviegoers are unaware of
these subtleties, so there had to be some chop-socky in the script to
meet their expectations.

One other rewriting of martial history: we see samurai
ceremoniously stabbing fallen opponents to death with a two-hand
sword-thrust. In fact, this is not how it was done; real
samurai delivered the coup de grace by decapitating their
opponents, and then taking the head as a trophy.

No joke. Head-taking was such an important practice that there was
a special term in Japanese for the art of properly dressing the hair on
a severed head so that the little paper tag showing the deceased’s name
and rank would be displayed to best advantage.

While the filmmakers were willing to show samurai killing the
wounded, in other important respects they softened and Westernized the
behavior of these people somewhat. Algren learns, correctly, that
‘samurai’ derives from a verb meaning “to
serve”, but we are misled when the rebel leader speaks of
“protecting the people”. In fact, noblesse oblige was not
part of the Japanese worldview; samurai served not ‘the
people’ but a particular daimyo, and the daimyo served the
Emperor in theory and nobody but themselves in normal practice.

Now for some of the good stuff. It begins with an amazingly strong
performance by Ken Watanabe as the rebel daimyo Katsumoto. From the
first moment that you see him, you believe him; there are no moments
of hey-I’m-Tom-Cruise to mar his immersion in the character, for
which excellent reason he actually upstages Cruise at several key points.

Through Katsumoto and the other Japanese characters, we are made to
see the intertwined quests for perfection of both technique and self
that were so central to the samurai warrior-mystic. Indeed, there are
points at which the filmmakers have some subtle fun with the fact that
Americans of our day, having successfully naturalized Japanese martial
arts into our own culture, have learned to understand that path rather
better than Cruise’s Captain Algren does. I’m thinking especially of
the point at which a bystander watching Algren lose at sword practice
tells him he has “too many minds”. The viewer probably knows what
he is driving at even if Algren does not.

Better: the movie is properly respectful of Japanese virtues
without crossing the line into supine multiculturalism. Captain
Algren appreciates and accepts the best of an alien culture
without renouncing his identity as a Westerner, an officer,
and a gentleman. There is a telling scene after Algren has been
accepted into the life of his Japanese hosts in which he takes a heavy
load from Taka (the female lead), who protests that Japanese men never
help with such things.

Algren replies that he is not a Japanese man. In this and other
ways he refutes an already-standard knock on the movie, which is to
refer to it as “Dances with Samurai”. But this movie,
despite the flaws I’ve pointed out, is more honest and far less
sentimental about the samurai than Dances With Wolves was
about its Sioux. This is progress of a sort.

Algren’s romance with Taka is also handled with a degree of
restraint that is appropriate but surprising. We get no sexual
cheap thrills; instead, we get subtle but extremely powerful
eroticism, notably in the scene where Taka dresses Algren in her
dead husband’s armor just before the final battle.

The film is visually quite beautiful. The details of costume,
weapons, armor, and the simple artifacts of Japanese village life are
meticulously and correctly rendered. In fact there are a number of
points at which the setting is stronger than the script and carries
one through places where the plotting is a bit implausible.

This contrast is an illustration of the uneven way in which
standards have risen. The Last Samurai, the Rings
movies, Master & Commander, and the Harry Potter movies
all have vastly better production values than (I think) they would
have had even ten years ago — perhaps the huge advances in
special-effects technology have created a sort of upward pressure on
the quality of movies’ depictions of reality. On the other hand,
downright silly plot twists are still acceptable and the conventions
of the star-vehicle film remain firmly in place.

One gets ahistorical howlers and (in fiction) violations of the
spirit of the original work, but fewer than formerly. In all these
movies, you can see where they were trimmed to fit Hollywood’s
marketing needs, but the trimming is done with a lot more sensitivity
and taste than it used to be. Occasionally one even sees outright
improvements — the moment in Peter Jackson’s version of
Boromir’s death scene in which the fallen Gondorian hails Aragorn as
his king, for example, achieves more power and poignancy than
Tolkien’s original.

I like this trend a lot, but I’m not sure I understand it. The
Hollywood establishment is in business to make money, but the link
between market demand and the quality of films has always been
tenuous at best. It would be nice to think that film audiences
have required filmmakers to exhibit better taste by developing
better taste themselves, but in the face of all the awful schlock
that still gets churned out and makes money, this is a difficult
case to sustain in general.

It feels to me more as though some balance of power within the
system has shifted and, for whatever reason, creative artists
have gained power at the expense of the marketeers. Thus, for
example, Rowling had more than somewhat to do with the casting
of the Harry Potter movies, and Peter Jackson’s films display
a nearly obsessive concern with getting the look of Middle-Earth
right that could hardly be shared by a typical studio exec.

Whatever the reason, I’m glad of the trend. I spend a lot more
time in movie theaters than I used to — and that’s the
message Hollywood wants to hear.

Dec 09

Ejected in Geneva

The organizers of the Internet Summit in Geneva have had Dr. Paul
Twomey, the president of ICANN (the organization that’s chartered to
administer the international domain-name system), ejected by security
guards after he’d flown twenty hours to participate in the
meeting.

I was not especially surprised. The organizers of the Geneva
summit seem to be very much the same scum of the planet that one
normally finds running these U.N. events — third-string
diplomatic timeservers, addle-brained NGO moonbats, a scattering of
celebrity Eurotrash, and a legion of gray apparatchiks from
authoritarian Third World pestholes. It didn’t astonish me that
they’d use force to keep out anyone who might interfere with their
plans for a government-friendly, politically-correct, censored, and
very thoroughly controlled Internet.

No, the really surprising part is that I found myself sympathizing
with Dr. Twomey. ICANN’s performance, while not the unmitigated
disaster many of its critics like to portray, has not been glorious.
Way too many deals have been done in back rooms and the organization has
been far too kind to expansive trademark claims and other sorts of
corporate land-grab.

Perhaps the one salutary effect of the Geneva summit is to remind us
that things could easily be worse — and almost certainly will be, if
the U.N. gets control.

Dec 08

Cthulhu and Christ

The parody below comes to us from an artist named Howard Hallis, to whom all credit is due. I’ve taken the liberty of reproducing it here because the design of his website leads me to suspect that this cartoon might be replaced by something else the next time he has a fit of artistic inspiration.

This is a brilliant piece of art. While it helps to have a prior acquaintance with the ‘Cthulhu Mythos’ that H.P. Lovecraft developed in now-classic horror stories of the 1920s and ’30s, Hallis does a vivid and effective job of conveying the central themes and feel of the Mythos. But the truly subversive genius of this cartoon lies elsewhere…about which more after you have read it.

This is, of course, a parody of a fundamentalist Christian evangelical tract. More specifically, it is a remarkably accurate take on the style of Jack T. Chick, a pamphleteer who has occupied the scungy basement of Christian evangelism since the 1960s. Both the talking heads are recognizable, stock Chick characters — the sinful, scornful unbeliever and the saintly white-haired minister.

Some cultural-studies type ought to do a book on the way that the Cthulhu mythos has oozed forth from its pulp origins to become Western pop culture’s generic Nightmare From Beyond. This parody could have been written thirty years ago — Chick goes back that far and has been remarkably, er, consistent in his output — but thirty years ago only a handful of SF and fantasy fans would have recognized Cthulhu. Nowadays ol’ squid-face is all over the place; there are, ironically, plush toys.

I put it down to fantasy-role-playing games, which have reached a far larger audience than print SF or fantasy. Gamers have borrowed the Cthulhu mythos so frequently that it’s a cliché — but one which, thanks to the eerie power of Lovecraft’s imagery, never completely loses its power to send a chill down the spine. Merely to read the names — the Necronomicon, Yog-Sothoth, the corpse-eaters of Leng, the Hounds of Tindalos, and of course dread Cthulhu himself — is to feel a vast and threatening darkness.

Hallis’s parody draws on a much more specific tradition. The idea of the Campus Crusade for Cthulhu as a parody of the Campus Crusade for Christ was already live when I was in college in the 1970s. But Hallis makes their point more compactly and effectively, and therein lies the real touch of genius in this piece.

Jack T. Chick’s pamphlets speak plainly the most fundamental message of Christian evangelism: believe or be damned. It’s all about fear, the induced fear that if you don’t get straight with God you will burn in Hell. Not for Chick the sugar-coating of talk about love or morality or becoming a better person. Writing for the lowest common denominator, he zeroes in on terror.

But so pervaded is our culture with Christian ideas and imagery that it is difficult to see how nasty and inhumane Chick and his ilk really are; even those of us who are not Christians tend to respond to the fear-mongering with a kind of numbness, reacting to Chick’s ugly, drab oeuvre mainly as an offense against good taste (or a form of unintentional found humor). For the more intelligent sort of Christian, Chick is embarrassing — like a slovenly relative you can’t quite kick out of your house because, after all, he is family.

What is really incisive about Hallis’s parody is his demonstration that very little about the Christian world-view or rhetoric has to change to make it indistinguishable from Lovecraft’s nightmare. Ah, the rapture of being taken up by the Elder Gods! Worship and sacrifice are good things. Trust the preacher, he will make you fear and show you the way.

It used to be popular among a certain sort of leftist to claim that the collectivist and apocalyptic ideas in socialism made it a proper political analog of Christianity. They were arguably correct in this; where they went wrong was in considering the connection flattering to socialism rather than damning of Christianity. Hallis’s parody is a starker demonstration; the fact that both the fictional cult of Cthulhu and the all-too-real religion of Christianity depend so fundamentally on the terror of the Gods is not grounds for exonerating the former, but rather for condemning the latter.

Dec 06

Da Big Snow

Yup, the blizzard is big. Here in eastern Pennsylvania we’ve had over a foot of snow and
a lot of drifting today. I shoveled my driveway. I’m going to be stiff tomorrow.

Dec 05

Salaries are dropping. Time to celebrate!

So, the latest trend to hit the business magazines is falling programmer salaries. I can’t lay hands on the article just now, but it seems some CEO under pressure to outsource his programming to India had the bright idea of offering lower salaries (competitive with Indian levels, not U.S. levels) to programmers in the U.S. He got 90 applicants, even though the offer was for about half of what used to be considered normal for the positions.

A pointer to this article was posted to my favorite mailing list by a friend who is depressed about programmer salaries dropping. He wasn’t un-depressed by the revelation, at the end of the article, that said CEO ended up jacking some of his salaries back up to “normal” levels to keep his best people.

There are a bunch of ways I could respond to this. One is by arguing that outsourcing programming work is a fad that will largely reverse itself once the true, hidden costs start to become apparent. Even if that weren’t so, the Indian advantage would be temporary at best; as the Indian programmer’s value rises, so will the price he charges. I believe these things are true. But in keeping with tradition here at Armed and Dangerous, I’m going to skip the easy, soft arguments and cut straight to the most important and contentious one of all — falling salaries are good for you.

If you’re a programmer upset by falling programmer salaries, I hope you’re prepared to be equally gloomy about the continuing fall in real-dollar prices of all the other labor-intensive goods you buy. Because trust me, they get cheaper the exact same way — and somewhere out there, there are people who are pissed off and depressed because the market wouldn’t support their old salaries.

But each time this happens, more people gain than lose. The money programmers aren’t making is, ultimately, money some other consumer gets to keep and use for something else, because the prices of the bundled goods programmers were helping produce have dropped. The corporate cost-cutters only get to profit from this as a transient thing, until the next round of price wars. Lather, rinse, repeat.

The free market is a wonderful thing. I was going to call it the most marvelous instrument ever devised for making people wealthy and free, but that would be wrong — the free market isn’t a ‘device’ any more than love or gravity or sunshine are devices; it’s what you have naturally when nobody is using force to fuck things up.

Sometimes, when you and your friends are on the bad end of one of its efficiency-seeking changes, it’s hard to remember that the market is a wonderful thing for almost everybody almost all the time. But it’s worth remembering, just as it’s worth remembering that free speech is a wonderful thing even when it’s the Nazis or Communists exercising it.

Why is this? Because the alternatives to free speech, even when the people pushing them mean well, always turn into petty tyrannies now and become grand tyrannies in the course of time. The alternatives to markets decay into tyranny a lot faster.

Nov 21

The Prudential interview

I’ve spent a lot of time and effort since 1997 developing effective propaganda tactics for
reaching the business world on behalf of the hacker community — among other things, by
popularizing the term ‘open source’. If you want to grok how this is done, read
my October 15 interview with a bunch of Prudential Securities investors.

Pay attention to style as well as content. This is the language you have to learn to speak
to reach the people who write big checks. It’s not very complicated, if you just bear in mind
that these people are obsessed with two things: risk management and return on investment. As they should be — it’s their job.

Nov 20

Jack needs a girlfriend



“Free Love”, eh? Well, that would explain a lot. Jack must have been the dude I saw
damn near run into a doorframe yesterday because he was checking out my wife Cathy so intently he forgot to watch where he
was going. Not that there’s anything wrong with that.

Nov 17

What good is IQ?

A reader asks:

To clarify, while I believe natural selection explains a lot I have
caveats about IQ as a tool for testing intelligence. If you can’t
measure the coast of France with a single number how can you do it
with human intelligence?

Easily. Human intelligence is a great deal less complex than the
coast of France. :-)

It’s fashionable nowadays to believe that intelligence is some
complicated multifactor thing that can’t be captured in one number.
However, one of the best-established facts in psychometrics (the science
of measuring mind) is that it is quite difficult to write a test of
mental ability that is not at least 50% correlated with all other such
tests. Or, to put it another way, no matter how you design ten tests for
mental ability, at least about half the variance in the scores for any one
of them statistically appears to be due to a “general intelligence”
that shows up on the other nine tests as well.
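
To see how “half the variance” and “50% correlated” fit together,
here is the usual single-factor model in sketch form (assuming
standardized scores, unit-variance g, and uncorrelated error terms;
the loadings λ_i are illustrative, not from any particular battery):

\[ s_i = \lambda_i g + e_i, \qquad
   \operatorname{Var}(s_i) = \lambda_i^2 + \operatorname{Var}(e_i), \qquad
   \operatorname{corr}(s_i, s_j) = \lambda_i \lambda_j \quad (i \neq j) \]

A battery in which every test carries half its variance on g
(\lambda_i^2 = 0.5, so \lambda_i \approx 0.71) shows pairwise
correlations of about 0.5.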

Psychometricians call this general intelligence measure “g”. It
turns out to predict important real-world success measures quite well
— not just performance in school but income and job success as
well. The fundamental weakness in multiple-factor theories of intelligence
is that measures of intelligence other than g appear to predict
very little about real-world outcomes. So you can call a lot of other
things “intelligence” if you want to make people feel warm and fuzzy,
but doing so simply isn’t very useful in the real world.

Some multifactor theorists, for example, like to describe accurate
proprioception (an acute sense of body position and balance) as a kind
of intelligence. Let’s say we call this “p”. The trouble with this
is that there are very few situations in which a combination of high p
and low g is actually useful — people need to be able to balance
checkbooks more often than they need to walk high wires. Furthermore,
g is easier to substitute for p than the other way around; a person
with high g but low p can think up a way to not have to walk a high
wire far better than a person with low g but high p can think up a way
not to have to balance a checkbook. So g is in a strict functional
sense more powerful than p. Similar arguments apply to most of the
other kinds of specialized non-g ‘intelligence’ that have been
proposed.

Once you know about g, you can rank mental-capability tests by
how well their score correlates with g. IQ is valuable because a
well-composed IQ test measures g quite effectively. For purposes
of non-technical discussion, g and IQ can be considered the same, and
psychometricians now accept that an IQ test which does not closely track
g is defective.

A lot of ink has been spent by people who aren’t psychometricians
on insisting that g is a meaningless statistical artifact. The most
famous polemic on this topic was Stephen Jay Gould’s 1981 book
The Mismeasure of Man, a book which was muddled, wrong,
and in some respects rather dishonest. Gould was a
believing Marxist; his detestation of g was part of what he perceived
as a vitally important left-versus-right kulturkampf. It is
very unfortunate that he was such a persuasive writer.

Unfortunately for Gould, g is no statistical phantom. Recently g
and IQ have been shown to correlate with measurable physiological
variables such as the level of trace zinc in your hair and performance
on various sorts of reaction-time tests. There are hints in the
recent literature that g may be largely a measure of the default level
of a particular neurotransmitter associated with states of mental
alertness and speed of thought; it appears that calling people of
subnormal intelligence “slow” may not be just a metaphor!

IQ is one of several large science-related issues on which
political bias in the dominant media culture has led it to present as
fact a distorted or even reversed version of the actual science. In
1994, after Murray and Herrnstein’s The Bell Curve got a
thoroughly undeserved trashing, fifty leading psychometricians and
psychologists co-signed a summary of mainstream science on
intelligence. It makes eye-opening reading.

The reasons many popular and journalistic accounts continue to
insist that IQ testing is at best meaningless and at worst a sinister
plot are twofold. First, this belief flatters half of the population.
“My IQ may be below average, but that doesn’t matter because IQ is
meaningless and I have high emotional intelligence!” is,
understandably, a favorite evasion maneuver among dimwits. But that
isn’t the worst of it. The real dynamite is not in
individual differences but rather that the distribution of IQ (and
hence of g) varies considerably across groups in ways that are
politically explosive.

Men vs. women is the least of it. With other variables controlled,
men and women in a population have the same mean IQ, but the
dispersion differs. The female bell curve is slightly narrower, so
women have fewer idiots and fewer geniuses among them. Where this
gets touchy is that it may do a better job than cultural sexism of
explaining why most of the highest achievers in most fields are male
rather than female. Equal opportunity does not guarantee equal
results, and a lot of feminist theory goes out the window.
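
A toy tail calculation shows how even a small difference in spread
dominates the extremes (both means set at 100; the σ values are
hypothetical, chosen purely for illustration):

\[ P(\mathrm{IQ} > 145 \mid \sigma = 15) = P(Z > 3.00) \approx 0.0013,
   \qquad
   P(\mathrm{IQ} > 145 \mid \sigma = 14) = P(Z > 3.21) \approx 0.0007 \]

The slightly wider distribution puts about twice as many people above
145, and by symmetry about twice as many below 55.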

But male/female differences are insignificant compared to the real
hot potato: differences in the mean IQ of racial and ethnic groups.
These differences are real and they are large enough to have severe
impact in the real world. In previous blog entries I’ve mentioned the
one-standard-deviation advantage of Ashkenazic Jews over gentile
whites; that’s roughly fifteen points of IQ. Pacific-rim Asians
(Chinese, Japanese, Koreans etc.) are also brighter on average by a
comparable margin. So, oddly enough, are ethnic Scots — though
not their close kin the Irish. Go figure…

And the part that, if you are a decent human being and not a racist
bigot, you have been dreading: American blacks average about 85 in IQ,
a standard deviation lower than American whites. And
it gets worse: the average IQ of African blacks is lower
still, not far above what is considered the threshold of mental
retardation in the U.S. And yes, it’s genetic; g seems to be about
85% heritable, and recent studies of effects like regression towards
the mean suggest strongly that most of the heritability is DNA rather
than nurturance effects.

For anyone who believes that racial equality is an important goal,
this is absolutely horrible news. Which is why a lot of
well-intentioned people refuse to look at these facts, and will
attempt to shout down anyone who speaks them in public. There have
been several occasions on which leading psychometricians have had
their books canceled or withdrawn by publishers who found the actual
scientific evidence about IQ so appalling that they refused to print
it.

Unfortunately, denial of the facts doesn’t make them go away. Far from
being meaningless, IQ may be the single most important statistic about
human beings, in the precise sense that differences in g probably drive
individual and social outcomes more than any other single measurable
attribute of human beings.

Mean IQ differences do not justify making assumptions about any individual.
There are African black geniuses and Ashkenazic Jewish morons; humanity and
ethics demand that we meet each individual human being as an individual,
without prejudice. At the same time, group differences have a significance
too great to ignore. In the U.S., blacks are 12% of the population but
commit 50% of violent crimes; can anyone honestly think this is
unconnected to the fact that they average 15 points of IQ lower than the
general population? That stupid people are more violent is a fact
independent of skin color.

And that is actually a valuable hint about how to get beyond
racism. A black man with an IQ of 85 and a white man with an IQ of 85
are about equally likely to have the character traits of poor impulse
control and violent behavior associated with criminality — and
both are far more likely to have them than a white or black man with
an IQ of 110. If we could stop being afraid of IQ and face up to it,
that would give us an objective standard that would banish racism per
se. IQ matters so much more than skin color that if we started paying
serious attention to the former, we might be able to stop paying
attention to the latter.

UPDATE: An excellent summary of the science relating to g
is here.

Nov 14

Funny, but incorrect

From the November 12 “Kernel Panic”:

[Kernel Panic strip of Nov 12, 2003]

In fact, this strip is incorrect. I did not coin the term “open source”;
I only popularized it. It was coined by
my friend Christine Peterson of the Foresight Institute. While it’s true that I more or less ran the brainstorming session and fortunately had enough of a clue to recognize a winner when it popped up, the creative leap was all hers.

UPDATE: Yes, it now reads “popularized”. Chris Wright changed it.

Nov 14

Selecting for intelligence

Mike Smith relays an interesting possible explanation for the observed
statistical fact that American and European Jews have a mean IQ a
standard deviation higher than Caucasian gentiles:

During the period from ancient times to modern times, there was a
constant phenomenon of Jews converting to Christianity (there were
many social pressures to do so). In a nutshell, the idea is that the
lower-IQ Jews were statistically more likely to convert, as it freed
them from having to learn to read Torah. During the Middle Ages, it
was not worth the effort for most people to become literate; the
payback was not worth it. Books were rare and expensive, and learning
to read was no guarantee of getting ahead in life. Of course, people
like to do what they’re especially good at, and the higher-IQ’s among
the Jews did not find learning to read to be such a burden. As such,
they were statistically less likely to convert (and statistically more
likely to become fathers of many children in a culture that valued
intelligence.) It is worth noting that in ancient times, Jews were not
stereotyped as especially intelligent; that stereotype arose in the
Middle Ages.

This is a special case of one of my favorite Damned Ideas, originally
developed by John W. Campbell in the 1960s from some speculations
by a forgotten French anthropologist. Campbell proposed that the
manhood initiation rituals found in many primitive tribes are a
selective machine designed to permit adulthood and reproduction only
to those who can demonstrate verbal fluency and the ability to override
instinctive fears on verbal command.

Campbell suggests that all living humans are descended from groups
of hominids that, having evolved full-human mental capability in some
of their members, found the overhead of supporting the dullards too
high. So they began selecting for traits correlated with intelligence
through initiation rituals timed for just as their offspring were
achieving reproductive capacity; losers got driven out, or possibly
killed and eaten.

Campbell pointed out that the common elements of tribal initiations
are (a) scarring or cicatricing of the skin, opening the way for
lethal infections, (b) alteration or mutilation of the genitals,
threatening the ability to reproduce, and (c) alteration of the mouth
and teeth, threatening the ability to eat. These seem particularly
well optimized for inducing maximum instinctive fear in the subject
while actually being relatively safe under controlled and relatively
hygienic conditions. The core test of initiation is this: can the
subject conquer fear and submit to the initiation on the basis
of learned (verbal, in preliterate societies) command?

Campbell noticed that the first-order effect was to shift the mean of
the IQ bell curve upwards over generations. The second-order effect,
which if he noticed he didn’t talk about, was to start an arms race in
initiation rituals; competing bands experimented with different
selective filters (not consciously but through random variation).
Setting the bar too low or too high would create a bad tradeoff
between IQ selectivity and maintaining raw reproductive capacity. So
we’re descended from the hominids who found the right tradeoff to push
their mean IQ up as rapidly as possible and outcompeted the groups
that chose less well.

It doesn’t seem to have occurred to Campbell or his sources, but
this theory explains why initiation rituals for girls are a rare and
usually post-literate phenomenon. Male reproductive capacity is
cheap; a healthy young man can impregnate several young women a day,
and healthy young men are instinct-wired to do exactly that whenever
they can get away with it. Female reproductive capacity, on
the other hand, is scarce and precious. So it makes sense to select
the boys ruthlessly and give the girls a pass. Of course if you push
this too far you don’t get enough hunters and fighters, but the right
tradeoff pretty clearly is not 1-to-1.

(This would also explain why humans are designed for mild polygyny,
1 to 3 sexual partners per male. You can spot this by looking at
where human beings are on various physical characteristics that
correlate with degree of polygyny in other primates — disparity in
average size between males and females, for example, is strongly
correlated with it.)

What Campbell did notice is that this theory of selection
by initiation would neatly explain one of the mysteries of human
paleoanthropology — how human beings got so smart so fast. The
differences between H. erectus and H. sapiens are not large in
absolute genetic terms (they can’t be; we share over 94% of our genome
with chimps) but they’re hard to credit given normal rates of
morphological change in mammals and only two million years to work
in. Something must have been putting hominids under
abnormally strong selective pressure — and Campbell’s idea
is that we did it to ourselves!

Now, I’m not sure I believe that Jews bootstrapped themselves up a
whole standard deviation in less than 2000 years, but if you apply
a similar idea to a longer timeframe it begins to look pretty
reasonable. (And Campbell did suggest that the Jewish practice of
infant circumcision had originally been a manhood rite.)

Within my lifetime, I expect we’re going to have the ability to do
germ-line enhancement of human intelligence. I strongly suspect that that
will set off another arms race — because cultures that suppress
that technology will once again be doomed against cultures that
don’t. And
this time, we’re smart enough to know that in advance…

Nov 14

Communism and the Jews

Uh-oh. I see another identity-politics double-bind coming. Eugene Volokh comments on the anti-Semitic canard that Jews were disproportionately influential in the development of Communism. The sides in this kind of dispute are very predictable. On one hand, the anti-Semites, a disgusting crew of racist troglodytes with evil motives. On the other, the good-hearted and right-thinking people in the world exclaiming in horror at the very thought that anyone might say anything veering so close to the classic tropes of anti-Semitic propaganda. (And I am not being the least bit ironic in either description, not this time.)

Unfortunately, the awkward thing about this particular canard is that it happens to be true. And that illustrates a serious problem, an inability to cope that most historians have acquired when questions of history go too near certain forbidden topics and modes of inquiry.

As Eugene Volokh’s sources note, a disproportionately large number of the original Bolsheviks were Jewish. Karl Marx was ethnically Jewish, though his parents had converted to Christianity. It is impossible to study the history of Marxism, Socialism, and Communism without noticing how many Jewish names crop up among the leading intellectuals. It is equally impossible not to notice how many of the Old Left families in the U.S. were (and still are) Jewish — and, more specifically, Ashkenazim of German or Eastern European extraction. Julius and Ethel Rosenberg didn’t come out of nowhere.

It’s not even very hard to understand why this is. There is a pattern, going back to Spinoza in the 1600s, of Jewish intellectuals seeking out the leading edge of certain kinds of reform movements. Broadly speaking, if you look at any social movement of the last 300 years that was secular, rationalist, and communitarian, somewhere in it you would find nonobservant Jews providing a lot of the intellectual firepower and organizational skills. Often a disproportionate share, relative to other population groups.

Communism was one example; there are many others. One of my favorites is the Ethical Culture movement. Today, we have the Free Software movement, not coincidentally founded by Jewish atheist Richard Stallman. There is an undeniable similarity among all these movements, an elusive deep
structure having to do not so much with shared beliefs as with a shared style of believing that one might call messianic social rationalism.

Anybody who thinks I’m arguing for a conspiracy theory should check their meds. No, there is something much simpler and subtler at work here. Inherited religious myths, even when they no longer have normative force, influence the language and conceptual frameworks that intellectuals use to approach other issues. The mythologist Joseph Campbell once noted that thinkers with a Catholic background like mine gravitate towards universalizing mysticisms and Protestants towards individualist redemptionism; he could have added that thinkers with a Jewish heritage tend to love messianic social doctrines. (One can cite exceptions to all three, of course, but the correlation will still be there after you’ve done so.)

Thus, assimilated Jews have a particular propensity for constructing secular messianisms — or for elaborating and intellectualizing secular messianisms invented by gentiles. But you can’t say this sort of thing in academia; you get called a racist if you do. And you especially aren’t allowed to notice the other reason movements like Communism sometime look not unlike Jewish conspiracies — which is that the IQ bell curve for Jews has a mean about a standard deviation north of the IQ bell curve for Caucasian gentiles.

In cold and sober truth, in any kind of organization where intelligence matters — even the Communist movement — you are going to find a disproportionate number of Jews with their hands on the levers. It doesn’t take any conspiracy to arrange this, and it’s not the Jews’ fault the goyim around them are such narrs (Yiddish
for “imbeciles”). It just happens.

But only people like me who don’t give a shit about being castigated for political incorrectness are willing to even whisper these things. Because that’s true, anti-anti-Semites can’t counter anti-Semitic muck-spreading with the truth; instead, they have to pretend that none of the historical patterns around which anti-Semites have constructed
their paranoid delusions have any basis in fact at all.

This is denial, and leaves the good guys in a damn weak position against anti-Semitic racists, who by distorting the record only a little can not only feel they have the truth on their side, but in some nontrivial ways actually be justified in that belief.

Unlike the anti-Semites, I mostly like the cultural traits that led so many Jewish intellectuals to Communism — including one I haven’t mentioned yet, the urge to transcend ethnic tribalism
and order the world according to a Law. But if the road to a Christian hell is paved with good individual intentions, the road to totalitarian hell is paved with communitarian idealism. It’s a tragedy that in Communism Jewish idealism, messianism, and intellectualism nourished a monster that turned on the Jews and killed so many of them.

If the discussion didn’t violate so many taboos, mainstream scholars could start asking even more interesting questions. Like: exactly how and why did thinkers raised in the relatively gentle communitarianism of the Jewish tradition become apologists for the vicious collectivism of Marxism and all its toxic children? And what can we do to keep the like from happening again, to Jews or anyone else?

But these questions probably won’t get seriously asked in my lifetime. Because political correctness has made us afraid to notice that, in some ways, the Jews really have had a special, shaping influence on the reform politics of the modern era, including Communism. About that much, the anti-Semites are right.

Nov 13

The desexualization of the American (fe)male

There’s been quite a blogospheric flap lately about Kim DuToit’s
essay The Pussification Of The Western Male. The single feature
of the conversation that surprised me most is that nobody connected
it to Steven den Beste’s equally searing essay Anglo Women are an
endangered species.

Steve’s point complements Kim’s and amplifies it in some useful
ways. Nobody wants to go back to the days when women were treated as
chattels or second-class citizens. Anyway, attempts to do so would be
doomed for reasons not so much moral as economic; societies that
suppress the productivity and intelligence of 50% of their members are
inevitably going to lose out to societies that don’t. But what Steve and
Kim have pointed out is that Western society often has pursued the
worthy goal of equality in a way that is hamfisted and destructive,
because it tries to remake human nature rather than acknowledging and
working with it.

These essays address two specific problems we’ve been saddled with;
Kim’s with the attack on masculinity, and Steve’s with the attack on
femininity. Among white anglos (especially bicoastal
“progressive” white anglos), it is no longer respectable
for a male person to behave like a man and a female person to behave
like a woman.

In fact, in today’s bien-pensant circles, one can be attacked as a
sexist for suggesting that the phrase “like a man” or
“like a woman” has any meaning at all. Many of us have
become obscurely terrified of sexual dimorphism, apparently out of
fear that acknowledging it will bring back the bad old days.

This kind of attitude has done more damage than most people
realize. Read those essays. There’s something gone badly wrong when
normal boys are dosed with Ritalin for being normally loud and
aggressive, and only strippers have the privilege of hugging a man
they like while at work.

I think our culture will recover from this. Beginning in the
1950s, portions of the kibbutz movement in Israel made the most
fervent try yet at erasing sex differences — they raised kids
in creches and tried to systematically stamp out sex-differentiated
behaviors. They failed; the children of the first generation, despite
intense socialization, gravitated back to traditional sex roles.

We’ll all be happier when we relax enough to acknowledge that
although equality before the law is something every human deserves,
some things naturally fall in men’s country and some in women’s
country — and the fact that minorities of men and women behave
in gender-atypical ways doesn’t change that reality. There will never
be more female soldiers or policemen than male ones, and never more
male nurses and child-rearers than female ones. Men are going to
groove on power tools and women are going to coo at babies; that’s
just the way it is, down to our DNA. Behavioral dimorphism is wired
into us for good reasons that have everything to do with Darwin and
nothing to do with political correctness.

The first stage of recovery is recognizing that there’s a problem
— that men and women find each other’s behavioral as well as
physical sex differences attractive, and that neither men nor women
are well served by efforts to cram us all into a unisex box. My wife
once observed, on behalf of a billion sisters, “What good is a man if
you cut off his balls?” — and she was talking about everyday behavior,
not just anatomy or sexual function. There aren’t a lot of men who
will seek out the company of defeminized women if they have a choice
in the matter, either.

That is where essays like Kim’s and Steve’s can help. By waking us up
and pissing us off, they remind us that our sex-linked behaviors and
our preferences for sex-linked behaviors in others actually
matter, that they’re every bit as much a part of our normal
human makeup as having penises or vaginas. People who want us to
forget this for ideological reasons are objectively inhumane.

Nov 12

Yee-ha! W00t! Excelsior!

I got email from Dr. Stanley Schmidt, the editor of Analog,
about an hour ago. The bad news was, he turned down the short story. The
good news was he accepted the fact article.

I’m going to be published in Analog!

/me does geeky victory dance

OK, so this is one of those things where if you don’t immediately
get why it’s wicked cool, no amount of explanation is likely to
enlighten you. I’ll just say I’ve been a science-fiction fan
for 35 years and Analog has always been the banner-bearer
for my kind of SF, the stuff with the rivets in it. I’ve wanted to
get published there when I grew up ever since I was 11 years old;
this is literally a childhood dream come true.

Oh. And Dr. Schmidt asked me to send him more fiction…

Nov 12

Possible outage today

The good folks at ibiblio.org are about to upgrade me from b2 to
WordPress. There might be a short outage involved, and it’s possible
the new CSS will garble my pages. Any problems should be transient
and fixed within a few hours.

Nov 12

CSS designer cluelessness in a nutshell

The CSS designer for WordPress, the successor to the
b2 engine that I may be upgrading to shortly, responded to my previous
rant. In a generally thoughtful and responsive post, he said “But
even if [pixel sizes] are defined for fonts, does your browser not let
you easily resize this?”.

This, I’m afraid, is CSS designer cluelessness in a nutshell.

In particular, I should not have to do an explicit operation every
time to get the font sizes I want. In general, answers of the form
“you can override the designer’s preferences by jumping through hoops”
show the wrong attitude. This attitude clashes with the objective
reality of lots of different display devices out there.

It’s also bad human-factors engineering. As the user, my preferences
should be primary
— in font sizes as in all other things. That’s
how the Web is supposed to work, and CSS and web designers who don’t
get this are doing users a major disservice in order to gratify their
own egos.

Ultimately they’re shooting themselves in the foot, too — think about
what will happen over time as display sizes both average larger and
the size dispersion increases (e.g. cell phones and PDAs get WiFi at
the same time desktop displays go to 1600×1200 and higher).
Fixed-size fonts, in particular, are going to be a bigger and bigger
lose as time goes on.

To the extent you think of yourself as a servant of the user, rather
than an artist whose job it is to make things pretty, that’s when your
designs will have real and lasting value. This is a hard lesson for
artists to learn, but it’s the only way to avoid filling the web with
designs that are gaudy, wearisome, and lose their utility as display
technology improves and becomes more various.

Nov 11

A rant — Why are CSS designers so utterly freaking clueless?

People who put absolute pixel sizes in CSS layouts should be lashed
with knouts. I’ve tripped over this problem yet again while moving my
blog; I’m using b2, and the default
stylesheet shipped with it was obviously produced by some graphics
designer who has failed to grasp the fact that there are lots of
different display sizes and resolutions out there.

OK, for those of you who don’t see the problem here, it goes like this.
Graphic designer composes his layout on a 1024×768 display. To make the
spacing come out all pretty, he specifies a 10- or 11-pixel font which looks
good on that 72-dot-per-inch display. Now I view it on my 1920×1440 display
at over 120dpi resolution — and the font is 40% smaller and a hell of
a strain to read. There are many other, related errors as well, like
specifying absolute box or table widths when percentage of screen width
would be more appropriate.

The basic error here is overcontrolling the layout rather than
letting the user’s browser choose it in execution of the user’s
preferences. Graphics designers are chronically prone to thinking of
a browser as a device for delivering pixels, rather than information.
But it doesn’t have to be this way — and, in fact, HTML isn’t
supposed to be. You can make your CSS scale to the user’s chosen font
size by specifying box dimensions in units of em and ex (which are
evaluated relative to the current font size) rather than
pixels. But most CSS designers are apparently either too freaking
incompetent to do this or just don’t give a rat’s ass about
display-independence or the user’s preferences to begin with.
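
To make the difference concrete, here is a minimal sketch of both
styles (the .entry selector is hypothetical, not quoted from b2’s
actual stylesheet):

    /* Anti-pattern: physical pixels. A 10px font is 10/72 of an inch
       tall on a 72dpi display but only 10/120 of an inch -- 40%
       smaller -- at 120dpi, and it tramples the user's preference. */
    .entry {
        font-size: 10px;
        width: 600px;
    }

    /* Display-independent version: everything scales from the user's
       chosen font size and the width of the containing block. */
    .entry {
        font-size: 1em;    /* track the user's preferred size */
        width: 90%;        /* percentage of the containing block */
        margin: 1ex 1em;   /* spacing in font-relative units */
    }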

This sorry state of affairs is one of the better arguments for the
proposition, widely shared among my peers, that graphics designers
are basically a bunch of dope-smoking ponytailed
dimwits who need to be smacked upside the head on a regular basis and
not let anywhere near a software or web design without strict adult
supervision by a cluebat-wielding programmer.

Another stupid graphic-designer stunt is changing the colors on
visited and unvisited hotlinks away from the browser defaults (it’s especially bad when they’re mapped to the same color). What makes this annoying is that it
discards an important visual cue for web page users by making it less
obvious where the hotlinks are. People who do this should be clubbed
with a chair leg until they stop.
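
In CSS terms, the offending stunt looks something like this (a
sketch, not quoted from any particular site’s stylesheet):

    /* Anti-pattern: visited and unvisited links forced to one color,
       erasing the user's cue for which links are already explored. */
    a:link, a:visited { color: #336699; }

    /* Better: specify no link colors at all, and let the browser
       defaults distinguish them. */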

Sigh. Here’s the default b2 stylesheet
and here is the stylesheet I use.
Notice how much simpler mine is? The more you default
rendering decisions to the browser like Ghod and Tim Berners-Lee
intended, the more error-prone crap your stylesheets can omit, the
faster your pages will render, and the better the user experience
will be.

UPDATE: A reader tells me that part of this is the browser vendors’
fault. It seems that on older browsers, only pixel sizes worked
reliably. He says this has long since been fixed but the damage to
CSS designers’ minds was already done. Another reader pointed me to a
good rant on this topic by Jamie Zawinski.

Nov 11

The sleep of reason

I’ve had a copy of David Frum’s Dead Right sitting on my coffee table for months. I didn’t buy it; it was landed on me by an old friend who persists in imagining that I’m interested in reading conservative political theory. In fact, it’s been years since I found conservative theorizing other than wearily predictable, and it would have been a lot more years if I hadn’t been unaccountably late in grasping Russell Kirk’s argument for the organic wisdom of institutions.

John Holbo’s smackdown gives form to all the inchoate reasons I didn’t want to face Frum’s book. Holbo, by his own account, goes looking for a unifying philosophy of conservative thought and finds only an attitude, an aesthetic, a hankering for people and situations to possess certain qualities without a logically or ethically coherent theory of why those qualities would be desirable. Holbo makes much of Frum’s yearning that people should be tough, self-reliant, and self-disciplined and Frum’s apparent willingness that the order of society should punish slackness, even if that is not necessarily the most economically efficient way to arrange things.

Holbo admits that he is loading onto Frum views that Frum would probably deny. But his argument that those views are logical extensions of positions Frum and other conservatives do hold seems basically fair, and so does his charge that Frum-like conservatism is an incoherent mishmash of emotional desires masquerading (not very convincingly) as a political philosophy.

What I am left wondering is why Holbo expected conservatives to have an actual theory in the first place. Or whether he actually expected it at all — his purported surprise and disappointment smells a bit disingenuous to me, a bit like a rhetorical flourish we’re not really expected to believe. Did he really give no thought beforehand to the implication of the label that conservatives use for themselves?

The word conservative is an adjectival noun formed from the verb ‘to conserve’ — to keep something from decaying, to hold it static, to preserve it. Almost all of the core attitudes of conservatism unfold from that definition. Almost all of conservatism is a set of rationalizations for a gut-level inclination to see any sort of change as a threat. Conservatism is the politics of dread, of people who are god-fearing, change-fearing, and
future-fearing.

I say ‘almost all’ because, by historical accident, conservatism has got itself tangled up with impulses of a very different kind — specifically classical-liberal and libertarian ones. Many people who describe themselves as conservatives are in fact nothing of the kind — they are in bed with conservatism only out of a shared loathing of the Marxist/socialist left. The alliance depends on a sort of folie a deux — conservatives fooling themselves that free markets tend to freeze existing power relationships in place, and classical-liberals fooling themselves that freedom can be reconciled with the love of hierarchy and punishment wired into the conservative hindbrain.

The parts of ‘conservative’ theory that actually deserve to be called theory are usually classical-liberal or libertarian intrusions. Nor is this anything new; before being shotgun-wedded to classical liberalism by the threat of Marxism around the beginning of the 20th century, conservatives imported their theory from Aquinas or Plato or Calvin.

In fact, when you get down to trying, it is remarkably hard to name anybody who has done a systematic job of deriving conservative politics from a theory about the nature of good. Especially since the Enlightenment, conservative thinkers have tended to be critics rather than theory-builders, and in fact have tended to distrust theory. Edmund Burke, for example, wasn’t a philosopher so much as he was a critical aphorist. In our own day, William F. Buckley has been a similar exemplar of the conservative public intellectual — witty
and devastatingly accurate about the failures and hypocrisies of his opponents, but neither capable of nor interested in producing an entire philosophy of right action or right government.

Russell Kirk is interesting precisely because he bucked this trend to some extent. His idea that the forms of institutions embody an unconscious wisdom about what tends to produce good outcomes is that rarest of things, an argument for conservatism that is not circularly bound to conservative, authoritarian, or religious assumptions.

It’s not enough, though. It isn’t sufficient to justify all the normative things Frum and mainstream conservatives want; you can’t get opposition to cloning stem cells out of it, for example. Nor does it stand comparison with the elaborate theoretical edifices produced by the Left. The core assumptions of Marxist theory were false-to-fact and its results horrible, but there was a sort of system and logic in between that conservative thinking never really had.

Left-liberals have no room for glee or schadenfreude at conservative expense, though; their position is no better. Having been shown the hard way that Hayek was right and there is no alternative to the market, modern left-liberalism too is essentially a bunch of sentiments and attitudes rather than a philosophy. The practical politics of the left has become little more than a defensive huddle around welfare-state institutions everybody knows are headed for insolvency and collapse, and left attitudes increasingly amount to little more than being against whatever they think conservatives are for.

The inability to frame a positive philosophy is a serious problem for both groups. It reduces their politics to a series of gut rumbles and their conversation to increasingly enraged screaming straight from the hypothalamus (vide Michael Moore and Ann Coulter). A rational debate is hard to have when there isn’t any theory to frame and moderate emotional fixations. Or, as Goya put it, the sleep of reason begets monsters.