Dec 06

Da Big Snow

Yup, the blizzard is big. Here in eastern Pennsylvania we’ve had over a foot of snow and
a lot of drifting today. I shoveled my driveway. I’m going to be stiff tomorrow.

Dec 05

Salaries are dropping. Time to celebrate!

So, the latest trend to hit the business magazines is falling programmer salaries. I can’t lay hands on the article just now, but it seems some CEO under pressure to outsource his programming to India had the bright idea of offering lower salaries (competitive with Indian levels, not U.S. levels) to programmers in the U.S. He got 90 applicants, even though the offer was for about half of what used to be considered normal for the positions.

A pointer to this article was posted to my favorite mailing list by a friend who is depressed about programmer salaries dropping. He wasn’t un-depressed by the revelation, at the end of the article, that said CEO ended up jacking some of his salaries back up to “normal” levels to keep his best people.

There are a bunch of ways I could respond to this. One is by arguing that outsourcing programming work is a fad that will largely reverse itself once the true, hidden costs start to become apparent. Even if that weren’t so, the Indian advantage would be temporary at best; as the Indian programmer’s value rises, so will the price he charges. I believe these things are true. But in keeping with tradition here at Armed and Dangerous, I’m going to skip the easy, soft arguments and cut straight to the most important and contentious one of all — falling salaries are good for you.

If you’re a programmer upset by falling programmer salaries, I hope you’re prepared to be equally gloomy about the continuing fall in real-dollar prices of all the other labor-intensive goods you buy. Because trust me, they get cheaper the exact same way — and somewhere out there, there are people who are pissed off and depressed because the market wouldn’t support their old salaries.

But each time this happens, more people gain than lose. The money programmers aren’t making is, ultimately, money some other consumer gets to keep and use for something else, because the prices of the bundled goods programmers were helping produce have dropped. The corporate cost-cutters only get to profit from this as a transient thing, until the next round of price wars. Lather, rinse, repeat.

The free market is a wonderful thing. I was going to call it the most marvellous instrument ever devised for making people wealthy and free, but that would be wrong — the free market isn’t a ‘device’ any more than love or gravity or sunshine are devices, it’s what you have naturally when nobody is using force to fuck things up.

Sometimes, when you and your friends are on the bad end of one of its efficiency-seeking changes, it’s hard to remember that the market is a wonderful thing for almost everybody almost all the time. But it’s worth remembering, just as it’s worth remembering that free speech is a wonderful thing even when it’s the Nazis or Communists exercising it.

Why is this? Because the alternatives to free speech, even when the people pushing them mean well, always turn into petty tyrannies now and become grand tyrannies in the course of time. The alternatives to markets decay into tyranny a lot faster.

Nov 21

The Prudential interview

I’ve spent a lot of time and effort since 1997 developing effective propaganda tactics for
reaching the business world on behalf of the hacker community — among other things, by
popularizing the term ‘open source’. If you want to grok how this is done, read
my October 15 interview with a bunch of Prudential Securities investors.

Pay attention to style as well as content. This is the language you have to learn to speak
to reach the people who write big checks. It’s not very complicated, if you just bear in mind
that these people are obsessed with two things: risk management and return on investment. As they should be — it’s their job.

Nov 20

Jack needs a girlfriend



“Free Love”, eh? Well, that would explain a lot. Jack must have been the dude I saw
damn near run into a doorframe yesterday because he was checking out my wife Cathy so intently he forgot to watch where he
was going. Not that there’s anything wrong with that.

Nov 17

What good is IQ?

A reader asks:

To clarify, while I believe natural selection explains a lot I have
caveats about IQ as a tool for testing intelligence. If you can’t
measure the coast of France with a single number how can you do it
with human intelligence?

Easily. Human intelligence is a great deal less complex than the
coast of France. :-)

It’s fashionable nowadays to believe that intelligence is some
complicated multifactor thing that can’t be captured in one number.
However, one of the best-established facts in psychometry (the science
of measuring mind) is that it is quite difficult to write a test of
mental ability that is not at least 50% correlated with all other such
tests. Or, to put it another way, no matter how you design ten tests for
mental ability, at least about half the variance in the scores for any one
of them statistically appears to be due to a “general intelligence”
that shows up on the other nine tests as well.

Psychometricians call this general intelligence measure “g”. It
turns out to predict important real-world success measures quite well
— not just performance in school but income and job success as
well. The fundamental weakness in multiple-factor theories of intelligence
is that measures of intelligence other than g appear to predict
very little about real-world outcomes. So you can call a lot of other
things “intelligence” if you want to make people feel warm and fuzzy,
but doing so simply isn’t very useful in the real world.

Some multifactor theorists, for example, like to describe accurate
proprioception (an acute sense of body position and balance) as a kind
of intelligence. Let’s say we call this “p”. The trouble with this
is that there are very few situations in which a combination of high p
and low g is actually useful — people need to be able to balance
checkbooks more often than they need to walk high wires. Furthermore,
g is easier to substitute for p than the other way around; a person
with high g but low p can think up a way to not have to walk a high
wire far better than a person with low g but high p can think up a way
not to have to balance a checkbook. So g is in a strict functional
sense more powerful than p. Similar arguments apply to most of the
other kinds of specialized non-g ‘intelligence’ that have been
proposed.

Once you know about g, you can rank mental-capability tests by
how well their score correlates with g. IQ is valuable because a
well-composed IQ test measures g quite effectively. For purposes
of non-technical discussion, g and IQ can be considered the same, and
psychometricians now accept that an IQ test which does not closely track
g is defective.

A lot of ink has been spent by people who aren’t psychometricians
on insisting that g is a meaningless statistical artifact. The most
famous polemic on this topic was Stephen Jay Gould’s 1981 book
The Mismeasure of Man, a book which was muddled, wrong, and in some
respects rather dishonest. Gould was a believing Marxist; his
detestation of g was part of what he perceived as a vitally important
left-versus-right kulturkampf. It is
very unfortunate that he was such a persuasive writer.

Unfortunately for Gould, g is no statistical phantom. Recently g
and IQ have been shown to correlate with measurable physiological
variables such as the level of trace zinc in your hair and performance
on various sorts of reaction-time tests. There are hints in the
recent literature that g may be largely a measure of the default level
of a particular neurotransmitter associated with states of mental
alertness and speed of thought; it appears that calling people of
subnormal intelligence “slow” may not be just a metaphor!

IQ is one of several large science-related issues on which
political bias in the dominant media culture has led it to present as
fact a distorted or even reversed version of the actual science. In
1994, after Murray and Herrnstein’s The Bell Curve got a
thoroughly undeserved trashing, fifty leading psychometricians and
psychologists co-signed a summary of mainstream science on
intelligence. It makes eye-opening reading.

The reasons many popular and journalistic accounts continue to
insist that IQ testing is at best meaningless and at worst a sinister
plot are twofold. First, this belief flatters half of the population.
“My IQ may be below average, but that doesn’t matter because IQ is
meaningless and I have high emotional intelligence!” is,
understandably, a favorite evasion maneuver among dimwits. But that
isn’t the worst of it. The real dynamite is not in
individual differences but in the fact that the distribution of IQ (and
hence of g) varies considerably across groups in ways that are
politically explosive.

Men vs. women is the least of it. With other variables controlled,
men and women in a population have the same mean IQ, but the
dispersion differs. The female bell curve is slightly narrower, so
women have fewer idiots and fewer geniuses among them. Where this
gets touchy is that it may do a better job than cultural sexism of
explaining why most of the highest achievers in most fields are male
rather than female. Equal opportunity does not guarantee equal
results, and a lot of feminist theory goes out the window.

But male/female differences are insignificant compared to the real
hot potato: differences in the mean IQ of racial and ethnic groups.
These differences are real and they are large enough to have severe
impact in the real world. In previous blog entries I’ve mentioned the
one-standard-deviation advantage of Ashkenazic Jews over gentile
whites; that’s roughly fifteen points of IQ. Pacific-rim Asians
(Chinese, Japanese, Koreans etc.) are also brighter on average by a
comparable margin. So, oddly enough, are ethnic Scots — though
not their close kin the Irish. Go figure…

And the part that, if you are a decent human being and not a racist
bigot, you have been dreading: American blacks average a standard
deviation lower in IQ than American whites at about 85. And
it gets worse: the average IQ of African blacks is lower
still, not far above what is considered the threshold of mental
retardation in the U.S. And yes, it’s genetic; g seems to be about
85% heritable, and recent studies of effects like regression towards
the mean suggest strongly that most of the heritability is DNA rather
than nurturance effects.

For anyone who believes that racial equality is an important goal,
this is absolutely horrible news. Which is why a lot of
well-intentioned people refuse to look at these facts, and will
attempt to shout down anyone who speaks them in public. There have
been several occasions on which leading psychometricians have had
their books canceled or withdrawn by publishers who found the actual
scientific evidence about IQ so appalling that they refused to print
it.

Unfortunately, denial of the facts doesn’t make them go away. Far from
being meaningless, IQ may be the single most important statistic about
human beings, in the precise sense that differences in g probably drive
individual and social outcomes more than any other single measurable
attribute of human beings.

Mean IQ differences do not justify making assumptions about any individual.
There are African black geniuses and Ashkenazic Jewish morons; humanity and
ethics demand that we meet each individual human being as an individual,
without prejudice. At the same time, group differences have a significance
too great to ignore. In the U.S., blacks are 12% of the population but
commit 50% of violent crimes; can anyone honestly think this is
unconnected to the fact that they average 15 points of IQ lower than the
general population? That stupid people are more violent is a fact
independent of skin color.

And that is actually a valuable hint about how to get beyond
racism. A black man with an IQ of 85 and a white man with an IQ of 85
are about equally likely to have the character traits of poor impulse
control and violent behavior associated with criminality — and
both are far more likely to have them than a white or black man with
an IQ of 110. If we could stop being afraid of IQ and face up to it,
that would give us an objective standard that would banish racism per
se. IQ matters so much more than skin color that if we started paying
serious attention to the former, we might be able to stop paying
attention to the latter.

UPDATE: An excellent summary of science relating to g is here.

Nov 14

Funny, but incorrect

From the November 12 “Kernel Panic”:

[Kernel Panic strip, Nov 12 2003]

In fact, this strip is incorrect. I did not coin the term “open source”;
I only popularized it. It was coined by
my friend Christine Peterson of the Foresight Institute. While it’s true that I more or less ran the brainstorming session and fortunately had enough of a clue to recognize a winner when it popped up, the creative leap was all hers.

UPDATE: Yes, it now reads “popularized”. Chris Wright changed it.

Nov 14

Selecting for intelligence

Mike Smith relays an interesting possible explanation for the observed
statistical fact that American and European Jews have a mean IQ a
standard deviation higher than Caucasian gentiles:

During the period from ancient times to modern times, there was a
constant phenomenon of Jews converting to Christianity (there were
many social pressures to do so). In a nutshell, the idea is that the
lower-IQ Jews were statistically more likely to convert, as it freed
them from having to learn to read Torah. During the Middle Ages, it
was not worth the effort for most people to become literate; the
payback was not worth it. Books were rare and expensive, and learning
to read was no guarantee of getting ahead in life. Of course, people
like to do what they’re especially good at, and the higher-IQ’s among
the Jews did not find learning to read to be such a burden. As such,
they were statistically less likely to convert (and statistically more
likely to become fathers of many children in a culture that valued
intelligence.) It is worth noting that in ancient times, Jews were not
stereotyped as especially intelligent; that stereotype arose in the
Middle Ages.

This is a special case of one of my favorite Damned Ideas, originally
developed by John W. Campbell in the 1960s from some speculations
by a forgotten French anthropologist. Campbell proposed that the
manhood initiation rituals found in many primitive tribes are a
selective machine designed to permit adulthood and reproduction only
to those who can demonstrate verbal fluency and the ability to override
instinctive fears on verbal command.

Campbell suggests that all living humans are descended from groups
of hominids that, having evolved full-human mental capability in some
of their members, found the overhead of supporting the dullards too
high. So they began selecting for traits correlated with intelligence
through initiation rituals timed for just as their offspring were
achieving reproductive capacity; losers got driven out, or possibly
killed and eaten.

Campbell pointed out that the common elements of tribal initiations
are (a) scarring or cicatrizing of the skin, opening the way for
lethal infections, (b) alteration or mutilation of the genitals,
threatening the ability to reproduce, and (c) alteration of the mouth
and teeth, threatening the ability to eat. These seem particularly
well optimized for inducing maximum instinctive fear in the subject
while actually being relatively safe under controlled and reasonably
hygienic conditions. The core test of initiation is this: can the
subject conquer fear and submit to the initiation on the basis
of learned (verbal, in preliterate societies) command?

Campbell noticed that the first-order effect was to shift the mean of
the IQ bell curve upwards over generations. The second-order effect,
which if he noticed he didn’t talk about, was to start an arms race in
initiation rituals; competing bands experimented with different
selective filters (not consciously but through random variation).
Setting the bar too low or too high would create a bad tradeoff
between IQ selectivity and maintaining raw reproductive capacity. So
we’re descended from the hominids who found the right tradeoff to push
their mean IQ up as rapidly as possible and outcompeted the groups
that chose less well.

It doesn’t seem to have occurred to Campbell or his sources, but
this theory explains why initiation rituals for girls are a rare and
usually post-literate phenomenon. Male reproductive capacity is
cheap; a healthy young man can impregnate several young women a day,
and healthy young men are instinct-wired to do exactly that whenever
they can get away with it. Female reproductive capacity, on
the other hand, is scarce and precious. So it makes sense to select
the boys ruthlessly and give the girls a pass. Of course if you push
this too far you don’t get enough hunters and fighters, but the right
tradeoff pretty clearly is not 1-to-1.

(This would also explain why humans are designed for mild polygyny,
1 to 3 sexual partners per male. You can spot this by looking at
where human beings are on various physical characteristics that
correlate with degree of polygyny in other primates — disparity in
average size between males and females, for example, is strongly
correlated with it.)

What Campbell did notice is that this theory of selection
by initiation would neatly explain one of the mysteries of human
paleoanthropology — how human beings got so smart so fast. The
differences between H. erectus and H. sapiens are not large in
absolute genetic terms (they can’t be; we share over 94% of our genome
with chimps), but they’re hard to credit given normal rates of
morphological change in mammals and only two million years to work
in. Something must have been putting hominids under
abnormally strong selective pressure — and Campbell’s idea
is that we did it to ourselves!

Now, I’m not sure I believe that Jews bootstrapped themselves up a
whole standard deviation in less than 2000 years, but if you apply
a similar idea to a longer timeframe it begins to look pretty
reasonable. (And Campbell did suggest that the Jewish practice of
infant circumcision had originally been a manhood rite.)

Within my lifetime, I expect we’re going to have the ability to do
germ-line enhancement of human intelligence. I strongly suspect that that
will set off another arms race — because cultures that suppress
that technology will once again be doomed against cultures that don’t. And
this time, we’re smart enough to know that in advance…

Nov 14

Communism and the Jews

Uh-oh. I see another identity-politics double-bind coming. Eugene Volokh comments on the anti-Semitic canard that Jews were disproportionately influential in the development of Communism. The sides in this kind of dispute are very predictable. On one hand, the anti-Semites, a disgusting crew of racist troglodytes with evil motives. On the other, the good-hearted and right-thinking people in the world exclaiming in horror at the very thought that anyone might say anything veering so close to the classic tropes of anti-Semitic propaganda. (And I am not being the least bit ironic in either description, not this time.)

Unfortunately, the awkward thing about this particular canard is that it happens to be true. And that illustrates a serious problem, an inability to cope that most historians have acquired when questions of history go too near certain forbidden topics and modes of inquiry.

As Eugene Volokh’s sources note, a disproportionately large number of the original Bolsheviks were Jewish. Karl Marx was ethnically Jewish, though his parents had converted to Christianity. It is impossible to study the history of Marxism, Socialism, and Communism without noticing how many Jewish names crop up among the leading intellectuals. It is equally impossible not to notice how many of the Old Left families in the U.S. were (and still are) Jewish — and, more specifically, Ashkenazim of German or Eastern European extraction. Julius and Ethel Rosenberg didn’t come out of nowhere.

It’s not even very hard to understand why this is. There is a pattern, going back to Spinoza in the 1600s, of Jewish intellectuals seeking out the leading edge of certain kinds of reform movements. Broadly speaking, if you look at any social movement of the last 300 years that was secular, rationalist, and communitarian, somewhere in it you would find nonobservant Jews providing a lot of the intellectual firepower and organizational skills. Often a disproportionate share, relative to other population groups.

Communism was one example; there are many others. One of my favorites is the Ethical Culture movement. Today, we have the Free Software movement, not coincidentally founded by Jewish atheist Richard Stallman. There is an undeniable similarity among all these movements, an elusive deep
structure having to do not so much with shared beliefs as with a shared style of believing that one might call messianic social rationalism.

Anybody who thinks I’m arguing for a conspiracy theory should check their meds. No, there is something much simpler and subtler at work here. Inherited religious myths, even when they no longer have normative force, influence the language and conceptual frameworks that intellectuals use to approach other issues. The mythologist Joseph Campbell once noted that thinkers with a Catholic background like mine gravitate towards universalizing mysticisms and Protestants towards individualist redemptionism; he could have added that thinkers with a Jewish heritage tend to love messianic social doctrines. (One can cite exceptions to all three, of course, but the correlation will still be there after you’ve done so.)

Thus, assimilated Jews have a particular propensity for constructing secular messianisms — or for elaborating and intellectualizing secular messianisms invented by gentiles. But you can’t say this sort of thing in academia; you get called a racist if you do. And you especially aren’t allowed to notice the other reason movements like Communism sometimes look not unlike Jewish conspiracies — which is that the IQ bell curve for Jews has a mean about a standard deviation north of the IQ bell curve for Caucasian gentiles.

In cold and sober truth, in any kind of organization where intelligence matters — even the Communist movement — you are going to find a disproportionate number of Jews with their hands on the levers. It doesn’t take any conspiracy to arrange this, and it’s not the Jews’ fault the goyim around them are such narrs (Yiddish
for “imbeciles”). It just happens.

But only people like me who don’t give a shit about being castigated for political incorrectness are willing to even whisper these things. Because that’s true, anti-anti-Semites can’t counter anti-Semitic muck-spreading with the truth; instead, they have to pretend that none of the historical patterns around which anti-Semites have constructed
their paranoid delusions have any basis in fact at all.

This is denial, and leaves the good guys in a damn weak position against anti-Semitic racists, who by distorting the record only a little can not only feel they have the truth on their side, but in some nontrivial ways actually be justified in that belief.

Unlike the anti-Semites, I mostly like the cultural traits that led so many Jewish intellectuals to Communism — including one I haven’t mentioned yet, the urge to transcend ethnic tribalism
and order the world according to a Law. But if the road to a Christian hell is paved with good individual intentions, the road to totalitarian hell is paved with communitarian idealism. It’s a tragedy that in Communism Jewish idealism, messianism, and intellectualism nourished a monster that turned on the Jews and killed so many of them.

If the discussion didn’t violate so many taboos, mainstream scholars could start asking even more interesting questions. Like: exactly how and why did thinkers raised in the relatively gentle communitarianism of the Jewish tradition become apologists for the vicious collectivism of Marxism and all its toxic children? And what can we do to keep the like from happening again, to Jews or anyone else?

But these questions probably won’t get seriously asked in my lifetime. Because political correctness has made us afraid to notice that, in some ways, the Jews really have had a special, shaping influence on the reform politics of the modern era, including Communism. About that much, the anti-Semites are right.

Nov 13

The desexualization of the American (fe)male

There’s been quite a blogospheric flap lately about Kim DuToit’s
essay The Pussification Of The Western Male. The single feature of the
conversation that surprised me most is that nobody connected it to
Steven den Beste’s equally searing essay Anglo Women are an
endangered species.

Steve’s point complements Kim’s and amplifies it in some useful
ways. Nobody wants to go back to the days when women were treated as
chattels or second-class citizens. Anyway, attempts to do so would be
doomed for reasons not so much moral as economic; societies that
suppress the productivity and intelligence of 50% of their members are
inevitably going to lose out to societies that don’t. But what Steve and
Kim have pointed out is that Western society often has pursued the
worthy goal of equality in a way that is hamfisted and destructive,
because it tries to remake human nature rather than acknowledging and
working with it.

These essays address two specific problems we’ve been saddled with;
Kim’s with the attack on masculinity, and Steve’s with the attack on
femininity. Among white anglos (especially bicoastal
“progressive” white anglos), it is no longer respectable
for a male person to behave like a man and a female person to behave
like a woman.

In fact, in today’s bien-pensant circles, one can be attacked as a
sexist for suggesting that the phrase “like a man” or
“like a woman” has any meaning at all. Many of us have
become obscurely terrified of sexual dimorphism, apparently out of
fear that acknowledging it will bring back the bad old days.

This kind of attitude has done more damage than most people
realize. Read those essays. There’s something gone badly wrong when
normal boys are dosed with Ritalin for being normally loud and
aggressive, and only strippers have the privilege of hugging a man
they like while at work.

I think our culture will recover from this. Beginning in the
1950s, portions of the kibbutz movement in Israel made the most
fervent try yet at erasing sex differences — they raised kids
in creches and tried to systematically stamp out sex-differentiated
behaviors. They failed; the children of the first generation, despite
intense socialization, gravitated back to traditional sex roles.

We’ll all be happier when we relax enough to acknowledge that
although equality before the law is something every human deserves,
some things naturally fall in men’s country and some in women’s
country — and the fact that minorities of men and women behave
in gender-atypical ways doesn’t change that reality. There will never
be more female soldiers or policemen than male ones, and never more
male nurses and child-rearers than female ones. Men are going to
groove on power tools and women are going to coo at babies; that’s
just the way it is, down to our DNA. Behavioral dimorphism is wired
into us for good reasons that have everything to do with Darwin and
nothing to do with political correctness.

The first stage of recovery is recognizing that there’s a problem
— that men and women find each others’ behavioral as well as
physical sex differences attractive, and that neither men nor women
are well served by efforts to cram us all into a unisex box. My wife
once observed, on behalf of a billion sisters, “What good is a man if
you cut off his balls?” — and she was talking everyday behavior,
not just anatomy or sexual function. There aren’t a lot of men who
will seek out the company of defeminized women if they have a choice
in the matter, either.

That is where essays like Kim’s and Steve’s can help. By waking us up
and pissing us off, they remind us that our sex-linked behaviors and
our preferences for sex-linked behaviors in others actually
matter, that they’re every bit as much a part of our normal
human makeup as having penises or vaginas. People who want us to
forget this for ideological reasons are objectively inhumane.

Nov 12

Yee-ha! W00t! Excelsior!

I got email from Dr. Stanley Schmidt, the editor of Analog,
about an hour ago. The bad news was, he turned down the short story. The
good news was he accepted the fact article.

I’m going to be published in Analog!

/me does geeky victory dance

OK, so this is one of those things where if you don’t immediately
get why it’s wicked cool, no amount of explanation is likely to
enlighten you. I’ll just say I’ve been a science-fiction fan
for 35 years and Analog has always been the banner-bearer
for my kind of SF, the stuff with the rivets in it. I’ve wanted to
get published there when I grew up ever since I was 11 years old;
this is literally a childhood dream come true.

Oh. And Dr. Schmidt asked me to send him more fiction…

Nov 12

Possible outage today

The good folks at ibiblio.org are about to upgrade me from b2 to WordPress. There might be a short outage involved, and it’s possible the new CSS will garble my pages. Any problems should be transient and fixed within a few hours.

Nov 12

CSS designer cluelessness in a nutshell

The CSS designer for WordPress, the successor to the
b2 engine that I may be upgrading to shortly, responded to my previous
rant. In a generally thoughtful and responsive post, he said “But
even if [pixel sizes] are defined for fonts, does your browser not let
you easily resize this?”.

This, I’m afraid, is CSS designer cluelessness in a nutshell.

In particular, I should not have to do an explicit operation every
time to get the font sizes I want. In general, answers of the form
“you can override the designer’s preferences by jumping through hoops”
show the wrong attitude. This attitude clashes with the objective
reality of lots of different display devices out there.

It’s also bad human-factors engineering. As the user, my preferences
should be primary
— in font sizes as in all other things. That’s
how the Web is supposed to work, and CSS and web designers who don’t
get this are doing users a major disservice in order to gratify their
own egos.

Ultimately they’re shooting themselves in the foot, too — think about
what will happen over time as display sizes both average larger and
the size dispersion increases (e.g. cell phones and PDAs get WiFi at
the same time desktop displays go to 1600×1200 and higher).
Fixed-size fonts, in particular, are going to be a bigger and bigger
lose as time goes on.

It’s when you think of yourself as a servant of the user, rather
than an artist whose job is to make things pretty, that your
designs will have real and lasting value. This is a hard lesson for
artists to learn, but it’s the only way to avoid filling the web with
designs that are gaudy, wearisome, and lose their utility as display
technology improves and becomes more various.

Nov 11

A rant — Why are CSS designers so utterly freaking clueless?

People who put absolute pixel sizes in CSS layouts should be lashed
with knouts. I’ve tripped over this problem yet again while moving my
blog; I’m using b2, and the default
stylesheet shipped with it was obviously produced by some graphics
designer who has failed to grasp the fact that there are lots of
different display sizes and resolutions out there.

OK, for those of you who don’t see the problem here, it goes like this.
Graphic designer composes his layout on a 1024×768 display. To make the
spacing come out all pretty, he specifies a 10 or 11-pixel font which looks
good on that 72-dot-per-inch display. Now I view it on my 1920×1440 display
at over 120dpi resolution — and the font is 40% smaller and a hell of
a strain to read. There are many other, related errors as well, like
specifying absolute box or table widths when percentage of screen width
would be more appropriate.
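
To make that concrete, here’s a minimal sketch — hypothetical selectors and values, not quoted from the b2 stylesheet — of the kind of pixel-pinned style I’m complaining about:

    /* Hypothetical overcontrolled stylesheet: every measurement is
       pinned to pixels on the designer's own monitor. */
    body {
        font-size: 11px;   /* tiny and eye-straining on a 120dpi display */
    }
    #content {
        width: 760px;      /* assumes a roughly 1024-pixel-wide screen */
    }
    #sidebar {
        width: 200px;      /* fixed box width instead of a percentage */
    }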

The basic error here is overcontrolling the layout rather than
letting the user’s browser choose it in execution of the user’s
preferences. Graphics designers are chronically prone to thinking of
a browser as a device for delivering pixels, rather than information.
But it doesn’t have to be this way — and, in fact, HTML isn’t
supposed to be. You can make your CSS scale to the user’s chosen font
size by specifying box dimensions in units of em and ex (which are
evaluated relative to the current font size) rather than
pixels. But most CSS designers are apparently either too freaking
incompetent to do this or just don’t give a rat’s ass about
display-independence or the user’s preferences to begin with.
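
For contrast, here’s roughly what a display-independent version of the same rules looks like — again a hypothetical sketch, not an excerpt from any real stylesheet:

    /* Hypothetical display-independent stylesheet: every size is stated
       relative to the user's preferred font size or the window width. */
    body {
        font-size: 100%;   /* defer to the browser's/user's default */
    }
    #content {
        width: 85%;        /* scales with the window, not the designer's monitor */
    }
    #sidebar {
        width: 15em;       /* tracks the current font size */
    }
    h1 {
        font-size: 1.5em;  /* 1.5 times the surrounding text */
    }

The particular numbers don’t matter; what matters is that every dimension is expressed in terms the user’s browser settings can scale.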

This sorry state of affairs is one of the better arguments for the
proposition, widely shared among my peers, that graphics designers
are basically a bunch of dope-smoking ponytailed
dimwits who need to be smacked upside the head on a regular basis and
not let anywhere near a software or web design without strict adult
supervision by a cluebat-wielding programmer.

Another stupid graphic-designer stunt is changing the colors on
visited and unvisited hotlinks away from the browser defaults (it’s especially bad when they’re mapped to the same color). What makes this annoying is that it
discards an important visual cue for web page users by making it less
obvious where the hotlinks are. People who do this should be clubbed
with a chair leg until they stop.

Sigh. Here’s the default b2
stylesheet
and here is the stylesheet I
use. Notice how much simpler mine is? The more you default
rendering decisions to the browser like Ghod and Tim Berners-Lee
intended, the more error-prone crap your stylesheets can omit, the
faster your pages will render, and the better the user experience
will be.

UPDATE: A reader tells me that part of this is the browser vendors’
fault. It seems that on older browsers, only pixel sizes worked
reliably. He says this has long since been fixed but the damage to
CSS designers’ minds was already done. Another reader pointed me to a
good rant on this topic by Jamie Zawinski.

Nov 11

The sleep of reason

I’ve had a copy of David Frum’s Dead Right sitting on my coffee table for months. I didn’t buy it; it was landed on me by an old friend who persists in imagining that I’m interested in reading conservative political theory. In fact, it’s been years since I found conservative theorizing other than wearily predictable, and it would have been a lot more years if I hadn’t been unaccountably late in grasping Russell Kirk’s argument for the organic wisdom of institutions.

John Holbo’s smackdown gives form to all the inchoate reasons I didn’t want to face Frum’s book. Holbo, by his own account, goes looking for a unifying philosophy of conservative thought and finds only an attitude, an aesthetic, a hankering for people and situations to possess certain qualities without a logically or ethically coherent theory of why those qualities would be desirable. Holbo makes much of Frum’s yearning that people should be tough, self-reliant, and self-disciplined and Frum’s apparent willingness that the order of society should punish slackness, even if that is not necessarily the most economically efficient way to arrange things.

Holbo admits that he is loading onto Frum views that Frum would probably deny. But his argument that those views are logical extensions of positions Frum and other conservatives do hold seems basically fair, and so does his charge that Frum-like conservatism is an incoherent mishmash of emotional desires masquerading (not very convincingly) as a political philosophy.

What I am left wondering is why Holbo expected conservatives to have an actual theory in the first place. Or whether he actually expected it at all — his purported surprise and disappointment smells a bit disingenuous to me, a bit like a rhetorical flourish we’re not really expected to believe. Did he really give no thought beforehand to the implication of the label that conservatives use for themselves?

The word conservative is an adjectival noun formed from the verb ‘to conserve’ — to keep something from decaying, to hold it static, to preserve it. Almost all of the core attitudes of conservatism unfold from that definition. Almost all of conservatism is a set of rationalizations for a gut-level inclination to see any sort of change as a threat. Conservatism is the politics of dread, of people who are god-fearing, change-fearing, and
future-fearing.

I say ‘almost all’ because, by historical accident, conservatism has got itself tangled up with impulses of a very different kind — specifically classical-liberal and libertarian ones. Many people who describe themselves as conservatives are in fact nothing of the kind — they are in bed with conservatism only out of a shared loathing of the Marxist/socialist left. The alliance depends on a sort of folie a deux — conservatives fooling themselves that free markets tend to freeze existing power relationships in place, and classical-liberals fooling themselves that freedom can be reconciled with the love of hierarchy and punishment wired into the conservative hindbrain.

The parts of ‘conservative’ theory that actually deserve to be called theory are usually classical-liberal or libertarian intrusions. Nor is this anything new; before being shotgun-wedded to classical liberalism by the threat of Marxism around the beginning of the 20th century, conservatives imported their theory from Aquinas or Plato or Calvin.

In fact, when you get down to trying, it is remarkably hard to name anybody who has done a systematic job of deriving conservative politics from a theory about the nature of good. Especially since the Enlightenment, conservative thinkers have tended to be critics rather than theory-builders, and in fact have tended to distrust theory. Edmund Burke, for example, wasn’t a philosopher so much as he was a critical aphorist. In our own day, William F. Buckley has been a similar exemplar of the conservative public intellectual — witty
and devastatingly accurate about the failures and hypocrisies of his opponents, but neither capable of nor interested in producing an entire philosophy of right action or right government.

Russell Kirk is interesting precisely because he bucked this trend to some extent. His idea that the forms of institutions embody an unconscious wisdom about what tends to produce good outcomes is that rarest of things, an argument for conservatism that is not circularly bound to conservative, authoritarian, or religious assumptions.

It’s not enough, though. It isn’t sufficient to justify all the normative things Frum and mainstream conservatives want; you can’t get opposition to cloning stem cells out of it, for example. Nor does it stand comparison with the elaborate theoretical edifices produced by the Left. The core assumptions of Marxist theory were false-to-fact and its results horrible, but there was a sort of system and logic in between that conservative thinking never really had.

Left-liberals have no room for glee or schadenfreude at conservative expense, though; their position is no better. Having been shown the hard way that Hayek was right and there is no alternative to the market, modern left-liberalism too is essentially a bunch of sentiments and attitudes rather than a philosophy. The practical politics of the left has become little more than a defensive huddle around welfare-state institutions everybody knows are headed for insolvency and collapse, and left attitudes increasingly amount to little more than being against whatever they think conservatives are for.

The inability to frame a positive philosophy is a serious problem for both groups. It reduces their politics to a series of gut rumbles and their conversation to increasingly enraged screaming straight from the hypothalamus (vide Michael Moore and Ann Coulter). A rational debate is hard to have when there isn’t any theory to frame and moderate emotional fixations. Or, as Goya put it, the sleep of reason begets monsters.

Nov 10

Dehumanization

A reader, responding to the suggestion that we call the Baathist
holdouts in Iraq “werewolves”, asked rhetorically whether the intent was to dehumanize
them. Lurking behind this question was the theory that war supporters
like me need to make our enemies into un-persons in order to justify
continuing to kill them.

This question displays a kind of self-absorption by a person who
cannot really imagine a moral stance different from his/her own. In
such tender-minded thinking, the world is neatly divided into humans
that one must treat pretty much as though they were one’s next-door neighbor,
and non-humans who are not part of the moral community. The possibility
that a human being could be outside the moral community is essentially
ignored.

But there are human beings who are outside the moral community by nature.
We call them psychopaths. They lack the wiring for empathy and reciprocity
that makes it possible for most human beings to cooperate; they can (and
often do) commit sickening atrocities for pleasure. Fortunately, most
psychopaths have other kinds of neurological deficits as well and are
therefore not very bright.

Some people who probably were not born with psychopathy make themselves
into psychopaths. Consider, as a relevant example, Saddam Hussein and
his sons. They fed living people into shredders for amusement. No semantic
debate over whether that sort of monster is “human” or “dehumanized”
is going to change my judgment that it deserves a violent
death as quickly as that result can be arranged.

The Baathist holdouts in Iraq are the hench-monsters of the
Husseins — the men who tore infants’ eyes out and strapped women
to tables in rape rooms. Calling them “werewolves” or “orcs” is not
an attempt to dehumanize them; that would be pointless, since they
have already dehumanized themselves.

Nov 07

Call them Werewolves

The blogosphere has shown some ability to change the terms and
terminology of the terror-war debate in the U.S. It’s time for a bit
of meme-hacking. Let’s see if we can displace terms like “insurgent”
or “Saddam loyalist” with one that conveys the true depth of evil we
are facing. I have a candidate to propose.

A little more than sixty years ago, the U.S. and its allies went to
war against another psychopathic, mass-murdering dictator — Adolf
Hitler. In 1944, as the Third Reich was collapsing, the SS organized
a Nazi resistance to commit assassinations, sabotage and guerrilla
warfare behind Allied lines. The parallels in organization and
tactics with Baathist-holdout activity in Iraq are very close.

It is a matter of record from Saddam Hussein’s autobiography that
he admired Hitler’s ruthless efficiency and sought to emulate it. We
should revive for these remnant Baathist thugs the term, redolent of
willful evil and darkness, that the Nazi resistance fighters used for
themselves.

Call them werewolves. It’s what they deserve.
