Nov 30

As the pervnado turns

I’m a libertarian who tried to stop Donald Trump with my vote in the PA primaries – even changed party registration to do it. But Trump’s opponents may make me into a Trump supporter yet.

The revelations run from Harvey Weinstein’s casting couch, through John Conyers being the guy every female reporter in DC knew not to get on an elevator with, to a remote-control lock on Matt Lauer’s office rape room at NBC. These are the people who lecture me about sexism and racism and global warming and deviant-minority-of-the-week rights, and who want to confiscate my guns because they suppose my morality can’t be trusted? Well, fuck them and the high horse they rode in on.

I have more and more sympathy these days for the Trump voters who said, in effect, “Burn it all down.” Smash the media. Destroy Hollywood. Drain the DC swamp. We’ve all long suspected these institutions are corrupt. What better proof do we need than their systematic enabling of rape monsters?

As a tribune of the people Trump is deeply flawed. Some of his policy ideas are toxic. His personal style is tacky, ugly, and awful. But increasingly I am wondering if any of that matters. Because if he is good for nothing else, he is good for exposing the corruption, incompetence, and fecklessness of the elites – or, rather, in their desperation to take him down before he breaks their rice bowls, they expose themselves.

Yeah. Is there anyone who thinks all these rocks would be turning over if Hillary the serial rape enabler were in the White House? Nope. With her, or any establishment Republican, it’d be cronyism all the way down, because they’d feel a need to keep the corrupt elites on side. Not Trump – his great virtue, perhaps overriding every flaw, is that he doesn’t give a fuck for elite approval.

Maybe Trump’s voters aren’t angry enough yet. It’s not just a large number of women our elites have raped and victimized, it’s our entire country. Our infrastructure is crumbling, our debt is astronomical, our universities increasingly resemble insane asylums, our largest inner cities are free-fire zones terrorized by a permanent criminal underclass. And what’s the elite response? Oh, look, a squirrel – where the squirrel of the week is carbon emissions, or transgender rights, or railing at “white privilege”, or whatever other form of virtue signaling might serve to hide the fact that, oh, look, they put remote-controlled locks on their rape dungeons.

It’s long past time for a cleansing fire.

Nov 28

Proposal – let’s backport Go := to C

The Go language was designed with the intention of replacing C and C++ over much of their ranges. While the large additions to Go – notably automatic memory allocation with garbage collection – attract attention, there is one small addition that does an impressive job of helping code be more concise while not being tied to any of the large ones.

I refer to the := variant of assignment, which doesn’t seem to have a name of its own in the Go documentation but which I will call “definement”. It must have an unbound name on its left (receiving) side and an expression on the right (sending) side. The semantics are to declare the name as a new variable with the type of the right-hand expression, then assign it the value.

Here’s the simplest possible example. This

void foo(int i)
{
    int x;
    x = bar(i);
 
    /* More code that operates on i and x */
}

becomes this:

void foo(int i)
{
    x := bar(i);
 
    /* More code that operates on i and x */
}

A way to think about definement is that it generates a variable declaration with an initialization. In modern C these can occur anywhere a conventional assignment can.
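
In other words – a minimal sketch, assuming bar() is declared as returning int – the definement above would behave as if the programmer had written the declaration-with-initializer form directly:

void foo(int i)
{
    int x = bar(i);   /* what "x := bar(i);" would expand to; x takes bar()'s return type */
 
    /* More code that operates on i and x */
}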

Definement is a simple idea, but a remarkably productive one. It declutters code – scalar and struct local-variable declarations just vanish. This has two benefits: (1) it improves readability, and thus maintainability; and (2) it eliminates a class of silly errors due to multiple declarations falling out of sync – for example, when changing the return type of a function (such as bar() in the above example), you no longer have to go back and tweak the declaration of every variable that receives a result from its callsites.
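
To illustrate that second benefit with a hypothetical change: suppose bar() later grows to return a long long rather than an int. The hand-written declaration silently truncates until someone remembers to fix it; a definement would simply pick up the new type:

/* bar() used to be declared:  int bar(int);
   and is later changed to:    long long bar(int); */

void foo(int i)
{
    int x;        /* stale hand-written declaration: now truncates bar()'s result */
    x = bar(i);

    /* with "x := bar(i);" the variable would have tracked
       bar()'s new return type, with no edit needed here */
}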

Definement syntax also has the property that, if we were to implement it in C, it would break cleanly and obviously on any compiler that doesn’t support it. The sequence “:=” is not a legal token in current C. In gcc you get a nice clean error message from trying to compile the definement:

foo.c: In function ‘foo’:
foo.c:3:5: error: expected expression before ‘=’ token
  x := i
     ^

This makes it a low-risk extension to implement – there’s no possibility of it breaking any existing code.

It is worth noting that this will actually be slightly simpler to implement in C than it is in Go, because there are no untyped constants in C – a Go compiler has to apply default-typing rules when the right-hand side is an untyped constant like 1 or 3.0, whereas every C constant already carries a definite type.

I think there’s a relatively easy way to get this into C.

First, write patches to implement it in both gcc and clang. This shouldn’t be difficult, as it can be implemented as a simple parser change and a minor transformation of the type-annotated AST – there are no implications for code generation at all. I’d be surprised if it took a person familiar with those front ends more than three hours to do.

Second, submit those patches simultaneously, with the notes attached to each referencing the other one.

Third, wait for minor compilers to catch up. Which they will pretty quickly, judging by the history of other pure-syntax enhancements such as dot syntax for structure initialization.
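
(That’s the C99 designated-initializer syntax, for anyone who doesn’t recognize the reference – another change that lived entirely in the front end:)

struct point { int x, y; };

struct point origin = { .x = 0, .y = 0 };   /* "dot syntax" structure initialization */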

Fourth, take it to the standards committees.

OK, am I missing anything here? Can any of my readers spot a difficulty I haven’t noticed?

Will anyone who already knows these front ends volunteer to step up and do it? I certainly could, but it would be more efficient for someone who’s already climbed the learning curve on those internals to do so. If it helps, I will cheerfully write tests and documentation.

EDIT: No, we can’t backport C++ “auto” instead – it has a different and obscure meaning in C as a legacy from B (just declares a storage class, doesn’t do type propagation). Mind you I’ve never seen it actually used, but there’s still a nonzero risk of collision with old code.
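
For anyone who hasn’t run into it, here’s the legacy meaning that gets in the way – in C, “auto” is just a redundant storage-class specifier, so these two declarations are exactly equivalent and no type inference takes place:

void legacy_auto_demo(void)
{
    auto int n = 42;   /* legal C: "auto" means automatic storage duration, nothing more */
    int m = 42;        /* identical meaning without the keyword */
}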

UPDATE, DECEMBER 2ND: I have been in touch with Ken Thompson. He approves, raising two minor technical caveats about stack growth and name shadowing.
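
The name-shadowing caveat is presumably the familiar := pitfall from Go: a definement inside an inner block would create a fresh variable rather than assign to the outer one, just as an inner declaration does in C today. A plain-C sketch of the hazard:

#include <stdio.h>

int main(void)
{
    int x = 1;
    {
        int x = 2;          /* an inner "x := 2;" would behave like this: a new variable */
        printf("%d\n", x);  /* prints 2 */
    }
    printf("%d\n", x);      /* prints 1 -- the outer x was never assigned */
    return 0;
}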

Nov 18

Language engineering for great justice

Whole-systems engineering, when you get good at it, goes beyond being entirely or even mostly about technical optimizations. Every artifact we make is situated in a context of human action that widens out to the economics of its use, the sociology of its users, and the entirety of what Austrian economists call “praxeology”, the science of purposeful human behavior in its widest scope.

This isn’t just abstract theory for me. When I wrote my papers on open-source development, they were exactly praxeology – they weren’t about any specific software technology or objective but about the context of human action within which technology is worked. An increase in praxeological understanding of technology can reframe it, leading to tremendous increases in human productivity and satisfaction, not so much because of changes in our tools but because of changes in the way we grasp them.

In this, the third of my unplanned series of posts about the twilight of C and the huge changes coming as we actually begin to see forward into a new era of systems programming, I’m going to try to cash that general insight out into some more specific and generative ideas about the design of computer languages, why they succeed, and why they fail.

Continue reading

Nov 13

The big break in computer languages

My last post (The long goodbye to C) elicited a comment from a C++ expert I was friends with long ago, recommending C++ as the language to replace C. Which ain’t gonna happen; if that were a viable future, Go and Rust would never have been conceived.

But my readers deserve more than a bald assertion. So here, for the record, is the story of why I don’t touch C++ any more. This is a launch point for a disquisition on the economics of computer-language design, why some truly unfortunate choices got made and baked into our infrastructure, and how we’re probably going to fix them.

Along the way I will draw aside the veil from a rather basic mistake that people trying to see into the future of programming languages (including me) have been making since the 1980s. Only very recently do we have the field evidence to notice where we went wrong.

Continue reading

Nov 07

The long goodbye to C

I was thinking a couple of days ago about the new wave of systems languages now challenging C for its place at the top of the systems-programming heap – Go and Rust, in particular. I reached a startling realization – I have 35 years of experience in C. I write C code pretty much every week, but I can no longer remember when I last started a new project in C!

If this seems completely un-startling to you, you’re not a systems programmer. Yes, I know there are a lot of you out there beavering away at much higher-level languages. But I spend most of my time down in the guts of things like NTPsec and GPSD and giflib. Mastery of C has been one of the defining skills of my specialty for decades. And now, not only do I not use C for new code, I can’t clearly remember when I stopped doing so. And…looking back, I don’t think it was in this century.

That’s a helluva thing to have sneak up on me when “C expert” is one of the things you’d be most likely to hear if you asked me for my five most central software technical skills. It prompts some thought, it does. What future does C have? Could we already be living in a COBOL-like aftermath of C’s greatest days?

Continue reading

Nov 02

Against modesty, and for the Fischer set

Over at Slate Star Codex, I learned that Eliezer Yudkowsky is writing a book on, as Scott puts it, “low-hanging fruit vs. the argument from humility”. He’s examining the question of when we are, or can be, justified in believing we have spotted something important that the experts have missed.

I read Eliezer’s first chapter, and I read two responses to it, and I was gobsmacked. Not so much by Eliezer’s take; I think his microeconomic analysis looks pretty promising, though incomplete. But the first response, by one Thrasymachus, felt to me like dangerous nonsense: “This piece defends a strong form of epistemic modesty: that, in most cases, one should pay scarcely any attention to what you find the most persuasive view on an issue, hewing instead to an idealized consensus of experts.”

Motherfucker. If that’s what we think is right conduct, how in hell are we (in the most general sense, our civilization and species) going to unlearn our most sophisticated and dangerous mistakes – the ones whose damage is multiplied by the weight of expert consensus behind them?

Somebody has to be “immodest”, and to believe they’re justified in immodesty. It’s necessary. But Eliezer only provides very weak guidance towards that justification; he says, in effect, that you’d better be modest when there are large rewards for someone else to have spotted the obvious before you. He implies that immodesty might be a better stance when incentives are weak.

I believe I have something more positive to contribute. I’m going to tell some stories about when I have spotted the obvious that the experts have missed. Then I’m going to point out a commonality in these occurrences that suggests an exploitable pattern – in effect, a method for successful immodesty.

Continue reading