The Unix Hater’s Handbook, Reconsidered

A commenter on my post pre-announcing Why C++ Is Not Our Favorite Programming Language asked “esr, from the perspective of a graybeard, which chapters did you consider good and which chapters did you consider bad?”

(Technical note: I do not in fact have a beard, and if I did it would not be gray.)

Good question, and worthy of a blog entry. I was the first technical reviewer for the manuscript of this book back when it was in preparation — IDG published it, but I think it was passed to me through MIT Press. As I noted in the same comment thread, I worked hard at trying to persuade the authors to tone down the spleen level in favor of making a stronger technical case, but didn’t have much success. They wanted to rant, and by Ghod they were gonna rant, and no mere reviewer was gonna stop ‘em.

I’ve thought this was a shame ever since. I am, of course, a long-time Unix fan; I’d hardly have written The Art of Unix Programming otherwise. I thought a book that soberly administered some salutary and well-directed shocks to the Unix community would be a good thing; instead, many of their good points were obscured by surrounding drifts of misdirected snark.

You can browse the Handbook itself here. What follows is my appraisal of how it reads 14 years later, written in real-time as I reread it. After the chapter-by-chapter re-review I’ll sum up and make some general remarks.

Introduction

I have a lot of respect for Don Norman, but he did not write this on one of his better days. The attempts at intentional humor mostly fall flat. And “…now that’s an oxymoron, a graphical user interface for Unix” looks unintentionally humorous in 2008. Otherwise there’s very little content here.

Preface

Similarly unfortunate. Sets the tone for too much of the rest of the book, being mostly hyperbolic snark when it could have been useful criticism. Very dated snark, too, in today’s environment of Linuxes wrapped in rather slick GUIs. The anecdotes about terminal sessions on Sun hardware from 1987 look pretty creaky and crabby today.

The authors write: “It’s tempting to write us off as envious malcontents, romantic keepers of memories of systems put to pasture by the commercial success of Unix, but it would be an error to do so: our judgments are keen, our sense of the possible pure, and our outrage authentic.” I know and rather like some of the authors, so it actually makes me a little sad to report that fourteen years later, writing them off this way is easier than ever.

Anti-Foreword

Dennis Ritchie’s rejoinder is still funny, and his opening and concluding words are still an accurate judgment on the book as a whole:

I have succumbed to the temptation you offered in your preface: I do write you off as envious malcontents and romantic keepers of memories. The systems you remember so fondly […] are not just out to pasture, they are fertilizing it from below. [… Y]our book is a pudding stuffed with apposite observations, many well-conceived. Like excrement, it contains enough undigested nuggets of nutrition to sustain life for some. But it is not a tasty pie: it reeks too much of contempt and of envy.

Chapter 1 – Unix: The World’s First Computer Virus

About equal parts of history and polemic, history not new, polemic cleverly written but not very interesting once you get through chuckling at the verbal pyrotechnics. The stuff on the standards wars is really dated now.

And no mention of Linux, which had just acquired TCP/IP support the year this was written and already had a thriving community. No one can blame the authors for not foreseeing 2008, but it’s as though they were mentally stuck in the 1980s, oblivious to the reality of 1994.

Chapter 2 – Welcome, New User!

There is some fair criticism in here. Yes, Unix command names are cryptic and this is a UI issue — mitigated a lot in 2008 by the presence of GUIs that no longer suck, but still an issue.

Quality of documentation still ain’t so great, and comprehensible and useful feedback when commands fail is something Linux applications could stand to be a lot better at. But these are problems almost everywhere, not just in Unix-land; it seems a bit unfair to reproach Unix for special sinfulness on their account.

On other points they do less well. “Consistency and predictability in how commands behave and in how they interpret their options and arguments” sounds very nice, but the first part is impossible (different commands have to behave differently because they solve different problems) and the second is more a gripe about shell wildcard expansion than anything else. Sorry, no sale; it’s too useful. If you don’t like the way rm * works, go fire up a GUI file manager.
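
To make the wildcard point concrete, here is a minimal sketch (a hypothetical args.py; Python 3 assumed): the shell expands * before the command ever starts, so rm sees only the resulting file names, never the wildcard itself.

    # args.py - print the arguments this process actually received.
    # Run as:  python3 args.py *
    # sys.argv already holds the expanded file names; the '*' itself
    # never reaches the program. rm sees exactly the same thing.
    import sys

    for i, arg in enumerate(sys.argv[1:], start=1):
        print(i, arg)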

Then there’s a lot of flamage about Unix developers being supposedly content to write shoddy programs. Neither true nor interesting, just more hyperbolic snark obscuring the places where they actually have a point.

There are some good nuggets in this chapter, but on balance digging through the excrement to find them does not seem worth it.

Chapter 3 – Documentation? What Documentation?

Yes, man(1) is still clunky and man pages are still references, not tutorials. This isn’t news in 2008, and wasn’t in 1994, either. The difference is that in 2008 the man-page style they’re excoriating is less of a blocker; we have the Web and search engines now.

Their fling at “the source code is the documentation” has acquired some unintentional irony since open source happened. Gripes about obsolete shells don’t add much, if anything, to the discussion.

Much that’s true in this chapter, but almost nothing that’s still useful or novel in 2008.

Chapter 4 – Mail: Don’t Talk to Me, I’m Not a Typewriter

This is mostly a rant against sendmail. Most of the criticism is justified. A lot of Linux distributions default to using Postfix these days, and the percentage is increasing; end of story.

Chapter 5 – Snoozenet: I Post, Therefore I Am

USENET is not exactly dead, but these days it’s mainly a relay channel for p2p media sharing and porn. There’s some stuff in this chapter of interest to historians of hackerdom, but nothing relevant to current conditions.

Chapter 6 – Terminal Insanity

This chapter has dated really badly. To a good first approximation there simply aren’t any actual VDTs any more; one sees a few on obsolete point-of-sale systems, but that’s about it. It’s all terminal emulators or the OS console driver, they all speak VT100/ANSI, end of story, end of problem.

There’s an attempt at an architectural point buried in the snark. Yes, it would have been really nice if Unix kernels had presented a uniform screen-painting API rather than leaving the job to a userspace library like curses(3). But — and I speak as an expert here, having implemented large parts of ncurses, the open-source emulation of it — moving all that complexity to kernel level would basically have solved nothing. The fundamental problem was that Unix (unlike the earlier systems these guys were romantically pining for) needed to talk to lots of VDTs that didn’t identify themselves to the system (so you couldn’t autoconfigure them) and the different VDT types had complicatedly different command sets. The machinery curses provided, and the capability databases behind it, had to exist somewhere; putting them in a service library in userspace at least guaranteed that bugs in this rather tricky code would not crash the system.
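
For the flavor of what that userspace layer does, here is a minimal sketch (my illustration, assuming Python 3, whose standard curses module wraps ncurses and its terminfo database); note that the capability lookup happens entirely outside the kernel:

    # Ask terminfo how *this* terminal (identified by $TERM) does things.
    import curses

    curses.setupterm()                  # read $TERM and load its terminfo entry
    clear = curses.tigetstr("clear")    # byte sequence that clears this screen, or None
    cols = curses.tigetnum("cols")      # screen width according to the database
    print("columns:", cols, "clear:", repr(clear))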

But this is yesterday’s issue; the VDT is dead, and the problems they’re griping about dead along with it.

Chapter 7 – The X-Windows Disaster

This chapter begins unfortunately, with complaints about X’s performance and memory usage that seem rather quaint when compared to the applications of 14 years later. It continues with a fling at the sparseness of X applications circa 1990 which is unintentionally funny when read from within evince on a Linux desktop also hosting the Emacs instance I’m writing in, a toolbar with about a dozen applets on it, and a Web browser.

I judge that the authors’ rejection of mechanism/policy separation as a guiding principle of X was foundationally mistaken. I argued in The Art of Unix Programming that this principle gives X an ability to adapt to new technologies and new thinking about UI that no competitor has ever matched. I still think that’s true.

But not all the feces flung in this chapter is misdirected; Motif really did suck pretty badly, and it’s a good thing it’s dead. ICCCM is about as horrible as the authors describe, but that’s hard to notice these days because modern toolkits and window managers do a pretty good job of hiding the ugliness from applications.

Though it’s not explicitly credited, I’m fairly sure most of this chapter was written by Don Hopkins. Don is a wizard hacker and a good man who got caught on the wrong side of history, investing a lot of effort in Sun’s NeWS just before it got steamrollered by X, and this chapter is best read as the same bitter lament for NeWS I heard from him face to face in the late 1980s.

Don may have been right, architecturally speaking. But X did not win by accident; it clobbered NeWS essentially because it was open source while NeWS was not. In the 20 years after 1987 that meant enough people put in enough work that X got un-broken, notably when Keith Packard came back after 2001 and completely rewrote the rendering core. The nasty resources system is pretty much bypassed by modern toolkits. X-extension hell and the device portability problems the authors were so aggrieved by turned out to be a temporary phenomenon while people were still working on understanding the 2D-graphics problem space.

That having been said, Olin Shivers’s rant about xauth is still pretty funny and I’m glad I haven’t had to use it in years.

Chapter 8 – csh, pipes, and find: Power Tools for Power Fools

Of the “plethora of incompatible shells” they anatomize in the first part of this chapter (including the csh in the chapter title), most are basically dead; the bash shell won. Accordingly, a lot of this chapter is just archaeology, known to old farts like me but about as relevant to present-day Linux or Unix users as Ptolemaic epicycles.

The portability problem in shell programming is almost, though not quite, as historical now. Languages like Perl and Python have replaced the kind of fragile shell scripting the authors fling at — in fact, that’s why they’re called scripting languages. The authors anticipated this development:

At the risk of sounding like a hopeless dream keeper of the intergalactic space, we submit that the correct model is procedure call (either local or remote) in a language that allows first-class structures (which C gained during its adolescence) and functional composition.

I give them credit: they were right about this. It seems curious, though, that they exhibited no knowledge of Perl; it had already been supplying exactly this sort of thing in public view for some years by 1994.
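
To see what they were asking for, compare a classic fragile pipeline (parsing ls output to find the biggest files) with the same job done through procedure calls and real data structures. A minimal Python sketch, my example rather than theirs:

    # The five largest files under a directory: no word-splitting,
    # no quoting bugs, and errors handled explicitly.
    import os

    def sizes(root):
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    yield os.path.getsize(path), path
                except OSError:
                    pass    # file vanished or unreadable; a pipeline would garble this

    for size, path in sorted(sizes("."), reverse=True)[:5]:
        print(size, path)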

The end-of-chapter rant on find(1) is still funny.

Chapter 9 – Programming: Hold Still, This Won’t Hurt a Bit

It is a shame that the authors are so quick to dismiss the Unix toolkit as a primitive toybox in this chapter, because that jaundiced error gives Unix programmers an excellent excuse to ignore the parts the authors got right. To point out a first and relatively minor example, the use of tabs in make really was a botch that ought to serve as a horripilating example to tool designers.

More generally, many of their points about C and its associated assumptions and toolchains are well taken. Yes, all those fixed-length-buffer assumptions are an indictment of weak tools and bad habits formed by them. Yes, LISP would have been a better alternative. Yes, exception-catching is an important thing to have.

We didn’t get LISP. We got Python, though. I could have cited Perl and Tcl, too, but they aren’t as close to being LISP (see Peter Norvig’s detailed argument that Python is Scheme with funky syntax). My point here is not to advocate Python, it’s to observe that the Unix community noticed that C was inadequate and addressed the problem. If the statistics on Freshmeat are to be believed, more new projects now get started in scripting languages than in C.
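
For anyone who wants the one-screen version of Norvig’s point (my sketch, not his): functions in Python are ordinary values with lexical scope, which is the Scheme underneath the funky syntax.

    # First-class functions and closure, Scheme-style:
    def compose(f, g):
        return lambda x: f(g(x))

    inc_then_double = compose(lambda n: n * 2, lambda n: n + 1)
    print(inc_then_double(10))    # 22: (10 + 1) * 2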

Gradually, in a messy and evolutionary way, the Unix community is teaching itself the lesson that the authors of this chapter wanted to give it. I agree with them that it could have happened faster and should have happened sooner.

I’d say this chapter has dated the least badly of anything in the book, were it not for the next one.

Chapter 10 – C++: The COBOL of the 90s

Though out of date in minor respects (C++ gained namespaces after 1994, and templates were then too new for the authors to address), this chapter remains wickedly on target. The only major error in it is the assumption that C++ is in the mainline of Unix tradition, was gleefully adopted by Unix programmers en masse, and is therefore an indictment of Unix. Language usage statistics on open-source hosting sites like SourceForge and Freshmeat convincingly demonstrate otherwise.

Chapter 11 – System Administration: Unix’s Hidden Cost

This chapter is a remembrance of things past. When it was written, people still actually used magnetic tape for backups. It is probably the most dated chapter in the book.

In 2008 my septuagenarian mother uses a Linux machine and, after a handful of calls during the getting-used-to-it period, I’ve gotten used to not hearing about it for months at a time. Enough said.

Chapter 12 – Security

Many of the technical criticisms in this chapter remain valid, in the sense that Unix systems still exhibit these behaviors and have these vulnerabilities. But on another level the chapter is suspended in a curious vacuum; the authors could not point to an operating system with a better security model or a better security record. They didn’t even try to write about what they imagined such a system would be like.

The contrast with Chapters 9 and 10 is instructive. Many of the authors come from a tradition of computer languages (LISP, Scheme, and friends) that were in many and significant ways superior to Unix’s native languages as they existed in 1994 (the gap has since closed somewhat). They knew what comparative excellence looked like, and could therefore criticize from a grounding in reality.

There is no corresponding way in which the authors can suggest Unix’s security model and tools could be fundamentally improved. That’s because, despite all its flaws, nobody has ever both found and successfully deployed a better model. Laboratory exercises like capability-based OSes remain tantalizing, but they are not deployed solutions.

The correct rejoinder to this chapter is: “You’re right. Now what?”.

Chapter 13 – The File System

Most of the sniping about the performance and reliability of Unix filesystems that is in this chapter is long obsolete. We’ve learned about hardening and journaling; the day of the nightmare fsck session is gone. The gripes about unreliable and duplicative locking facilities have also passed their sell-by date; the standards committees did some good work in this area.

The authors’ critique of the unadorned bag-of-bytes model is not completely without point, however; as with languages, some of the authors had real-world experience with systems supporting richer semantics and knew what they were talking about.

Some Linux filesystem hackers seem to be groping towards a model in which files are units of transportability that can be streamed, but internally have filesystem-like structure with the ability to embed metadata and multiple data forks. Others have experimented with database views a la BeOS.
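
Some of the plumbing for the metadata part already exists. Here is a minimal sketch (my illustration, assuming Linux, Python 3.3+, and a filesystem with xattr support) of hanging named metadata off an ordinary bag-of-bytes file:

    import os

    path = "example.txt"
    open(path, "w").close()                               # an ordinary empty file
    os.setxattr(path, "user.mime_type", b"text/plain")    # metadata in the user.* namespace
    print(os.getxattr(path, "user.mime_type"))            # b'text/plain'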

There is probably progress to be made here. Alas, it won’t be helped by the authors’ persistent habit of burying an ounce of insight in a pound of flamage.

Chapter 14 – NFS: Nightmare File System

Some of the specific bugs described in this chapter have been fixed, but many of the architectural criticisms of NFS made here remain valid (or at least were still valid the last time I looked closely at NFS). This chapter is still instructive.

Summing Up: What’s Still Valid Here?

The original question was: “which chapters did you consider good and which chapters did you consider bad?”. Let’s categorize them.

The worst chapters in the book, at least in the sense of being the most dated and content-free for a modern reader, are probably 11 (Administration), 5 (Snoozenet), 6 (Terminal Insanity), 4 (Mail), and 1 (Unix: The World’s First Computer Virus), in about that order from worst to least worst.

The chapter with the soundest exposition and the most lessons still to teach is certainly 10 (C++), followed closely by 14 (NFS).

A few chapters are mostly flamage or obsolete but have a good lesson or two buried in them. In rough order of descending merit:

  • The authors were right to argue in chapter 8 that classic shell scripting is fragile and rebarbative, and should be replaced with languages supporting data structures and real procedural composition; this has in fact largely come to pass.

  • The knocks on C in Chapter 9 (Programming) were justified.

  • The objections to the pure bag-of-bytes model in Chapter 13 (The File System) should provoke a non-dismissive thought or two.

  • At the bottom of this heap are the few nuggets in Chapter 2 (Welcome, New User!) about spiky command names and proliferated options.

Some chapters tell us things that are true and negative about Unix, but merely rehearse problems that are (a) well known in the Unix community, and (b) haven’t been solved outside it, either. It may have made the authors feel better to vent about them, but their doing so hasn’t contributed to a solution. I’d definitely put 12 (Security) and 3 (Documentation? What Documentation?) in that category.

Chapter 7 (The X-Windows Disaster) is the hardest for me to categorize. There’s still ugliness under the covers in some places they mention, but I think they’re mistaken both in asserting that the whole system is functionally horrible and in slamming the architecture and design philosophy of the system.

More than ever I see this book as a missed opportunity. The 14 years since 1994 have been enough time for useful lessons to be absorbed and integrated; if all the chapters had been up to the level of 10 or 14, we might have better Unixes than we do today. Alas that the authors were more interested in parading some inflammatory rhetoric than starting a constructive conversation.

Comments

  1. “Yes, it would have been really nice if Unix kernels had presented a uniform screen-painting API rather than leaving the job to a userspace library”

    Let’s hope Andy Tanenbaum isn’t listening :-)

  2. No, Python is not Scheme with funky syntax. Python is decrypted Perl.

    That’s probably its biggest strength and a significant weakness as well.

  3. “Yes, it would have been really nice if Unix kernels had presented a uniform screen-painting API rather than leaving the job to a userspace library”

    Let’s hope Andy Tanenbaum isn’t listening :-)

    Microsoft certainly listened. With the release of Windows NT 4.0 they moved the GDI into kernel space, with all the advantages (performance) and disadvantages (bugs in the graphical layer can now crash the system) that implies.

    I think they tried to implement a more sensible separation of concerns in Vista. Which of course leads users to gripe…

  4. Two small quips:

    “When it was written, people still actually used magnetic tape for backups”
    Sorry, what do they use now? Big companies don’t exactly store to DVD.

    I’m also interested in knowing how you rate MSDN as a documentation source. After all, it (OIMHO) seems the biggest doc source outside of Unix.

  5. >Sorry, what do they use now? Big companies don’t exactly store to DVD.

    I’m aware there are a few really high-end magnetic-tape robots around, but I was referring to old style 9-track tape. That stuff is now so rare that finding a reader for it is not easy.

    > I’m also interested in knowing how you rate MSDN as a documentation source. After all, it (OIMHO) seems the biggest doc source outside of Unix.

    I’ve never developed under Windows, so I have no idea. And no interest in finding out. Er, if their documentation is at the quality level of their software I don’t think I need to worry about it falsifying my review point.

  6. > I’m also interested in knowing how you rate MSDN as a documentation source. After all, it (OIMHO) seems the biggest doc source outside of Unix.

    I’d rather read man pages. The same goes for Javadocs, too. All I really need is the function signature with helpful names. If I want a tutorial, there’s google, and I always skip over the MSDN pages these days. They’re too chatty. Maybe if you don’t know what you’re doing they could be helpful, but I think in that case, you should buy a book on what you’re doing instead of reading the tutorial while you’re coding. People who do that scare the hell out of me.

  7. “But this is yesterday’s issue; the VDT is dead, and the problems they’re griping about dead along with it.”

    If only. These problems will be dead only when I no longer hit backspace or delete or an arrow key on a system I’ve ssh’d into, and see ^something. Yes, it’s not impossible to track down why it’s happening, or impossible to fix, but it still happens.

    It is rather amusing that page 152 contains, in reference to a decision to use termcap in curses, “Starting over, learning from the mistakes of history, would have been the right choice”, while a footnote in reference to AT&T’s decision to start from scratch reads, “And if that wasn’t bad enough, AT&T developed its own, incompatible terminal capability representation system…”.

    Are you familiar with the more recent trends in UNIX (now Linux) hatedom? There’s Linux Hater’s Blog, which surrounds its criticism with John Solomon-like levels of invective. There’s also elliotth’s blog, which is much less interested in being purely cranky. For example, his post on Rhythmbox links to the relevant Bugzilla entries for the problems he points out.

  8. Introduction:
    This quote still appears to be relevant.

    Literature avers that Unix succeeded because of its technical superiority. This is not true. Unix was evolutionarily superior to its competitors, but not technically superior. Unix became a commercial success because it was a virus. Its sole evolutionary advantage was its small size, simple design, and resulting portability. Later it became popular and commercially successful because it piggy-backed on three very successful hosts: the PDP-11, the VAX, and Sun workstations. (The Sun was in fact designed to be a virus vector.)

    The specific details of the standards wars may be dated, but the ‘spirit’ is still with us. LSB is still fairly recent, and it is not always reliable. Also, different distros have different release times, so they include different versions of software with different problems. Standards have improved, but they are still not to the point where most software projects have a Linux binary alongside their Windows and OSX binaries.

    Welcome, New User!

    “Consistency and predictability in how commands behave and in how they interpret their options and arguments” sounds very nice, but the first part is impossible (different commands have to behave differently because they solve different problems)

    They may solve different problems, but they often have similarities. Microsoft Word and Mozilla Firefox solve different problems, but they have the same keyboard shortcuts for copying and pasting text (Ctrl-c and Ctrl-v respectively). Unix still suffers from four different ways to list arguments: -r (traditional), r (BSD), --recursive (GNU), and -recursive (X11). Also, they say that it would have been a good idea to provide a standard library to handle regular expressions. This would have helped bring consistent regular expression behavior to C programs and Unix in general.

    and the second is more a gripe about shell wildcard expansion than anything else. Sorry, no sale; it’s too useful.

    One of their main points seems to be that it would be nice if the called program also had a way to see the original, unexpanded arguments it was called with. This would allow applications like rm to do a sanity check to prevent dangerous operations.

    If you don’t like the way rm * works, go fire up a GUI file manager.

    As they say in Chapter 7, “Graphical interfaces can only paper over misdesigns and kludges in the underlying operating system; they can’t eliminate them.”
    ‘rm *’ would not hurt so much if Unix provided a mechanism to retrieve deleted files. Would it have been so hard to provide a special directory where the system would move files after they were ‘deleted’ and where the system would only truly remove them when it needed the space? Both GNOME and KDE provide a ‘garbage can’, but it only works with programs specifically designed to utilize it.
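
    A minimal user-space sketch of that idea (a hypothetical trash.py; Python 3 assumed, and note that the real GNOME/KDE cans follow the freedesktop.org Trash specification rather than a bare directory like this):

        import os, shutil, time

        TRASH = os.path.expanduser("~/.trash")

        def trash(path):
            """Move a file into the trash directory instead of unlinking it."""
            os.makedirs(TRASH, exist_ok=True)
            # prefix a timestamp so repeated deletes of the same name don't collide
            dest = os.path.join(TRASH, "%d-%s" % (time.time(), os.path.basename(path)))
            shutil.move(path, dest)

        # trash("draft.txt")    # recoverable later, unlike os.remove("draft.txt")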

    System Administration
    I am not sure that it is as out-of-date as you say. The section on configuration files is still spot on. The section on funky device names is still relevant. I still remember a time I was installing Gentoo (yes, I know it is Gentoo, but still) and had to play the which-device-is-my-hard-drive game. It turned out to be /dev/sda, even though I think that is supposed to be a SCSI disc, and my hard drive used SATA. *nix still does not provide a convenient way (that I know of) to make multiple hard drives look like one disk. The best way seems to be RAID, which is a hardware hack. Also, the requirement for a separate swap partition is still annoying. Both Windows and OSX seem to make do without it; why do people insist on it for Linux?

  9. I have fond memories of this book. I worked at Sun Microsystems at the time, and reading the book made my blood boil. Though I read heavily from the O’Reilly catalog at that time I doubt I could name more than a couple of other technical books I consumed. The rest merely became incorporated into the mental fabric; this book refused to be so easily assimilated. In that sense, the authors did a standout job.

    I will be forever grateful to Dennis Ritchie for supplying a quote which I use to this day (heavily paraphrased to conceal my plagiarism): Like excrement, though it may contain nuggets of digestible material it is, on the whole, unsatisfying.

  10. I have a copy of “The Unix Hater’s Handbook” in PDF, and I came to the same conclusions that you did, Eric. The rhetoric was amusing and I enjoyed commiserating with all the people’s tales of lossage. (They really need to do something about that “file deletion is forever” thing.) I remember thinking that most of the problems had long since been solved by journaling filesystems. I’ve never had a Unix crash on me except for the one kernel panic on Mac OS X (which is very nice, by the way :-)).

    Whenever I get a chance, however, I should tell you about the Ubuntu upgrade that went badly. :-)

    The “Handbook” is useful, along with “Life With Unix”, as a time capsule of the state of Unix before the rise of Linux and open source.

  11. Oh, I forgot what I was going to say about documentation. Of course, the man pages are terse, but thankfully most projects have better Web-based documentation. Even better are wikis, which allow the documentation to actually match reality. ;-)

  12. >This quote still appears to be relevant.

    In a sense, yes. The issue is with the implied value judgment that if an operating system is observed to spread because of virality, then it must be inferior on other axes. That isn’t necessarily true; it may be the case that virality is correlated with design traits that make it a better bet in the long term – like, say, open source. Richard Gabriel first confronted this possibility in his 1991 “Worse Is Better” paper; the UHH guys reproduce that text in an appendix but don’t appear to have really grasped the implications. Which is not a knock on them, really; nobody, not even Gabriel himself, fully thought them through for a decade afterwards. Then I did it, and RG and I spent an entertaining 90 minutes the first time we met trying to figure out whether I had unconsciously lifted the central Worse Is Better idea for my “The Cathedral and the Bazaar”. We tentatively concluded that I hadn’t, but we’re still not sure.

    I guess my real point here is that the UHH guys in 1994 had a narrow definition of “technical superiority” that doesn’t take into account the parts of the software ecosystem consisting of human beings operating under economic constraints. Some of them have matured since then. Some haven’t.

    >Unix still suffers from four different ways to list arguments: -r (traditional), r (BSD), --recursive (GNU), and -recursive (X11).

    Yes, that’s a reasonable point, and one of the “nuggets” I had in mind.

    >Graphical interfaces can only paper over misdesigns and kludges in the underlying operating system; they can’t eliminate them

    That’s true, but who says rm is a misdesign? At some level hard deletion has to happen and there has to be a tool to do it; rm has to exist. People who rant about rm being misdesigned really mean, I think, that it is not a tool ordinary users should touch. That’s a defensible position, and it starts a more useful discussion.

    1. > The issue is with the implied value judgment that if an operating system is observed to spread because of virality, then it must be inferior on other axes. That isn’t necessarily true; it may be the case that virality is correlated with design traits that make it a better bet in the long term

      I think your intuition serves you well. There are several papers which question the common myth that the inferior technology often wins just because of timing or market advantages (google “path dependence paper”). Most come to the conclusion that this doesn’t hold true for common examples like VHS or QWERTY. And there is even a pretty recent paper which comes to the conclusion that this might not be true in general (unfortunately I cannot find it anymore). In any case these are not just opinions but actual scientific research.

      The fact that the UHH even mentions QWERTY as an example shows that its authors in fact entertain the myth of path dependence. One cannot call this nitpicking, since this is the premise the whole book is built on. Take away this premise and there is no book. (Just a somewhat entertaining list of anecdotes.)

  13. “At some level hard deletion has to happen and there has to be a tool to do it; rm has to exist.”

    Yes, hard deletion has to exist, but it does NOT have to be exposed to the user or the user’s programs. Above, I mentioned a relatively simple way that it could be / have been accomplished.

  14. Security

    “the authors could not point to an operating system with a better security model or a better security record.”

    Well, they can’t, but I can. The B5000 had automatic bounds checking, sophisticated file locking and automatic logging. VMS is known for being built like a tank. Plan 9 abolishes the superuser and provides a finer grain of access control, etc.

    “They didn’t even try to write about what they imagined such a system would be like.”

    Well, they do suggest doing away with the superuser, abolishing SUID, and switching to a finer level of access control to allow specific control over certain files (/etc/passwd) and certain actions (such as spawning a shell). They also suggest per-user CPU time quotas and per-user I/O quotas. Also, they suggest a Trusted Path.

  15. “I argued in The Art of Unix Programming that this principle gives X an ability to adapt to new technologies and new thinking about UI that no competitor has ever matched.”

    Someone on Reddit pointed this out but it’s worth repeating: Multihead Just Worked on the Mac in 1987. Xinerama is still fucking broken in 2008.

    Also, to get any sort of decent 3D performance much of X must be bypassed entirely. (The proprietary NVIDIA driver actually reimplements much of the X server; it remains the only performant 3D stack on Linux.)

    Way to adapt to new technologies.

  16. Hey, if new technologies are dismissed with a blanket “I’ve never developed under Windows, so I have no idea. And no interest in finding out. Er, if their documentation is at the quality level of their software I don’t think I need to worry about it falsifying my review point.”, you start to understand why some people came up with the UHH.

    Because, of course, the topic wasn’t at all related with ways Unix sucked. And of course, someone that has never programmed under a platform can immediately tell if it’s of good quality without eyes rolling.

    Sure, you’ll get thousands of stories about Windows sucking. That’s because of Sturgeon’s Law.

  17. … And of course, because it does suck, and has sucked even more in the past. But the point was that All Software Sucks. It’s, as we say here, “the dead laughing at the hanged man”.

  18. >Hey, if new technologies are dismissed

    It’s true I will instantly dismiss anything from Microsoft, but “new” is not the relevant predicate — “proprietary” is.

    Um, and don’t mistake this for a religious position. It isn’t — it’s my burn scars aching.

  19. “Yes, hard deletion has to exist, but it does NOT have to be exposed to the user or the user’s programs.”

    I think “Your deleted files can be recovered, except sometimes when they can’t” is less pleasant than “your deleted files cannot be recovered, take care”. At least then you know where you stand.

  20. esr, is the software technology for your car’s ECU open source?
    your cell phone?
    your vcr?

    X11 bites. Hard, but as the man said, “Worse is Better”, which is the founding story of linux, and most, if not all of “open source” and “free software”.

    Just why is it that you didn’t address Gabriel’s chapter in UHH, anyway? It has always struck me that “Worse is Better” pre-dated CATB by several years, but you’ve never before now (that I’ve found) given Gabriel attribution for having had largely similar ideas much earlier.

    There is also a view that the OS is the bits that were left out of the programming language, and therefore, the parts implemented by the OS are actually bugs. By that measure, the “operating system” needs to evaporate.

    I’m still hoping for a new “lisp machine” with lisp over a (mostly hidden) linux kernel, ala the architecture of Android, only s/java/lisp/…

  21. You have burn scars over something you haven’t programmed in? That you don’t know?

    I’ve also had contact with Windows and MS software for most of my life, and I’m not typing this from Ubuntu just because. Still, I don’t just dismiss everything before at least having tried it.
    It might not be a religious position, but it sure sounds like it.

  22. >You have burn scars over something you haven’t programmed in? That you don’t know?

    No, I have burn scars from experiences with using and developing proprietary software in general. My animus is not specifically against Microsoft. Because Microsoft software is proprietary, I don’t feel I have to try it to reject having anything to do with it.

    Actually, I already have more to do with it than I want. My wife has a work computer that’s a Windows laptop — I’ve had to troubleshoot things like WiFi authentication on the maggot-ridden pile of stale dog vomit Microsoft calls an OS, more than once.

  23. >This chapter [on NFS] is still instructive.
    And as irrelevant as complaints about Sendmail and incompatible shells. See FUSE (in particular sshfs, which is made of win and goodness).

  24. Usenet is still relevant for comp.* and sci.*

    If you kids don’t use it, that’s your stupid problem. It seems like you like the illusion of searching through /n/ (for a sufficiently big n) fora for your answers. That might look like you’re doing research/working.

  25. Si Says: I think “Your deleted files can be recovered, except sometimes when they can’t” is less pleasant than “your deleted files cannot be recovered,

    The UHH addresses this concern in Chapter 1. Utilizing the kernel may have its problems, but it is much better than implementing it at a higher level. There seems to be a significant demand for a ‘garbage can’, which is evidenced by the fact that BOTH KDE and GNOME have implemented this functionality in their systems, and the UHH provides further evidence with its description of all the various kludges users and sysadmins have come up with to ‘safely’ delete a file (‘alias rm rm -i’ & ‘alias rm mv ~/.trash’, etc.). Now, there probably should be a special mechanism to ‘securely’ delete a file (overwrite the blocks several times with random bits).

    take care”.

    One of the major problems is that the *nix environment has made it especially difficult to ‘be careful.’ Historically, *nix programs had “a criminally lax attitude towards error checking.” Wildcard expansion is the major culprit, and since the program had no way to see the options with which it was called, it had no way of doing sanity checks.

    At least then you know where you stand.

    Yeah, you know you are standing on a mass of nylon in a sealed, high-pressure, 100% oxygen environment.

  26. I have been thinking about instances where the kernel would have already reallocated the bits in a deleted file before the user had time to retrieve it, and I am mostly drawing a blank. Even if half of the files on a hard drive were deleted, all the kernel would have to do is point the inodes at the garbage directory and put all the blocks on the free list.

  27. Phil:
    > Now, there probably should be a special mechanism to ’securely’ delete a file (overwrite the blocks several times with random bits).

    There is. It’s called shred. I’m reminded of Neal Stephenson’s remark about learning Unix being an endless series of incidents where you’re on the point of inventing some useful utility and then notice that it’s already there, and has been for decades, and that explains that odd file or command you’ve vaguely wondered about.

    Jeff:
    mrg@delirium:~$ perl -le 'print 1 + "2"'
    3
    mrg@delirium:~$ python -c 'print 1 + "2"'
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
    TypeError: unsupported operand type(s) for +: 'int' and 'str'
    mrg@delirium:~$ mzscheme -e '(+ 1 "2")'
    Welcome to MzScheme v372 [3m], Copyright (c) 2004-2007 PLT Scheme Inc.
    +: expects type <number> as 2nd argument, given: "2"; other arguments were: 1
    Python owes a lot to Perl (and Perl owes a lot to Python), but it’s closer to Scheme in some important respects. Of course, it’s not just a skin over Scheme either – think about Python’s broken lambdas (which Perl gets right).

  28. To me, the UHH is a history book, like the Jargon File, the chronicle of a very interesting period. It fascinates me that at that time when I was completely sure that computing equals taking the Commodore out of the cabinet and plugging it into the TV, a lot of other people used multi-user time-sharing systems with e-mail, Usenet discussions etc. When I first saw a PC, I found it really strange that the computer isn’t built into the keyboard, but is a separate box. It looked very unusual to me. I figure the mini-computer Unixers had it the other way around: nice terminal, but where is the computer? :-)

    “I’ve had to troubleshoot things like WiFi authentication on the maggot-ridden pile of stale dog vomit Microsoft calls an OS, more than once.”

    Ever had to troubleshoot WiFi authentication on Linux? None of the GUIs seem to do the right thing, and generally you have to hand-tweak /etc/wpa.supplicant.conf. (Using wpa_supplicant, even when your hotspot is WEP-encrypted or not encrypted at all, is generally a good idea.) Not accessible to end users. And that’s if — if — the kernel supports your WiFi card in the first place. Most WiFi chipsets were completely inaccessible without loading (gee whiz) a Windows driver into the kernel through the ndiswrapper compatibility shim.

    So that steaming pile of dog vomit has a few advantages over Linux when it comes to wifi. At least it did as of a year ago when I last really messed with wifi on both.

    Miles, much like the UHH, “In the beginning…” is now obsolete. Stephenson has switched to Mac OS X. Telling.

  30. >Ever had to troubleshoot WiFi authentication on Linux? None of the GUIs seem to do the right thing, and generally you have to hand-tweak /etc/wpa.supplicant.conf

    Get a better distro. The GUI tools work for me, and I didn’t even know /etc/wpa.supplicant.conf existed until you brought it up.

  31. “Get a better foo. This bar works for me”. That’s exactly the positive attitude needed to solve Linux’s problems.

  32. Not that it matters much, but I misspelled /etc/wpa_supplicant.conf. n.b.: on some distros it’s /etc/wpa_supplicant/wpa_supplicant.conf. Yaaaaay standards!!!

  33. Miles, my point was that ‘securely’ deleting a file was the only case I could think of in which my ‘garbage can’ idea would not work. Of course, one could ‘securely’ delete a file once it is in the ‘garbage can’ like in OSX.

  34. Eric, you may want to give MSDN, Windows, and their developer tools a second, unprejudiced look; they really are better than what Linux has to offer. If you don’t believe me, ask Jonathan Blow, a game developer who tried to port his game to Linux and found out how made of MASSIVE EPIC FAIL it is at even basic stuff.

    I like Linux and these days I develop for embedded Linux for a living. But when it comes to being user- and developer-friendly for ordinary stuff, Mac OS is by far the best, Windows comes in a distant second, and Linux still isn’t even in the race.

  35. Phil: Oh, I see. Just trying to be helpful… FWIW, garbage cans annoy the hell out of me, and I never use them even when they’re available.

    I too have had far more problems using wifi under Linux than under Windows.

  36. The biggest obstacle to widespread Linux acceptance is Linux culture. It seems that a lot of geeks think that Linux should only be accessible to kernel hackers. Too many hackers roll their eyes at Ubuntu and say “there goes the neighborhood.”

  37. Jim says: I’m still hoping for a new “lisp machine” with lisp over a (mostly hidden) linux kernel, ala the architecture of Android, only s/java/lisp/…

    Well, it is funny that you mentioned that, because yesterday I was looking at the OSKit project. For those of you who do not know, OSKit is a framework that implements many of the common functionalities required of all modern x86 operating systems: TCP/IP stack, multiple filesystem support, Linux driver support, POSIX support layer, etc. The cool thing about this project is that all the components were meant to be plugged into any operating system. This would allow OS developers to focus on innovative new features without having to reimplement all the boring stuff. It is a damn shame that the project seems to have been dormant since 2002. Anyway, to the point, while I was searching Google for any signs of life in the project, I came across this email by a grad student that combined OSKit and MzScheme and had a smooth, working lisp machine in 7-8 hours. Now THAT is impressive!! Like I said, it is a damn shame this project died.

  38. >Eric, you may want to give MSDN, Windows, and their developer tools a second, unprejudiced look; they really are better than what Linux has to offer.

    There is no possibility of “better” if the effect is to lock me into a vendor-controlled jail. None. That’s like saying “Here, try this lovely heroin. It feels soooo gooood that you won’t mind that you’ll never be able to stop.” You can list features and capabilities until doomsday and you’ll never get past the word “proprietary”, which stinks to me of lossage and of personal pain too keenly remembered.

    If that’s “prejudiced”, I guess I am. And utterly immovable on this subject.

  39. Eric:

    On a Windows machine, you’d still have a copy of a sestina about quantum physics.

    There isn’t a text editor in Windows that doesn’t autosave files; many will even keep spiral backups (where each autosave is incremented by a date stamp, and after four have been saved, the fifth is deleted to save space). Yes, I know, it’s mollycoddling by a tinker toy OS that doesn’t [include:rant001 through rant999.]

    You may consider the loss of 3 hours of writing to be a trivial price to pay for using Linux. I don’t.

    I’ve worked on Windows, Mac OS X, Mac OS 9 and Ubuntu. And by “Worked”, I mean “Did something that I got paid for.”

    Even adjusting for biases – like the fact that I have more experience with Windows than I do the other three OSs combined – Ubuntu STILL doesn’t match the feature sets I need for most of what I need to do. The command line is cryptic, and requires mental overhead on the part of the user that is incredibly annoying.

    Unix documentation, with some exceptions, is atrocious. You’re either going to become a kernel hacker, or we don’t want you.

    Microsoft’s documentation is, in general, excellent. Microsoft’s developer tools are very good, and there’s a strong market based demand for them – learn to use the tools, code software, get a job. Easier to learn dev tools (through better documentation, and lots of handholding) and putting those dev tools out there for dirt cheap might – just might – explain why Windows has a 90+ percent market share.

    Hell, even Windows’ Office suite includes programming language support with VBA. Is it a great programming language? Categorically not. Is it something that allows someone who isn’t a formally trained programmer or self taught hacker to get useful work done? Yes. Is there a thriving community of people who share VBA apps and source code? Damned straight. Do those people help each other improve applications? Yep.

    Sounds a lot like the Open Source ‘community development’ meme, with easier to use tools to me…and if a quick scan of Monster.com and Jobs.com is any indication, there appear to be about 20x the job prospects for people competent in those tools written by Microsoft.

    As a free market anarchist, even you have to concede that a company that’s kept an 80% or greater market share in the fastest growing computer market segment, in the 30 years since you entered the job market, has to be doing something right beyond glitzy marketing.

  40. >On a Windows machine, you’d still have a copy of a sestina about quantum physics.

    Don’t be silly. Emacs makes backup files too; I’m not sure how it got lost, but failure of my editor wasn’t the problem.

    >Microsoft’s documentation is, in general, excellent.

    That’s “Here, try this heroin” again. It won’t work this time, either. Other people can be junkies if they want; I refuse to go there.

    >As a free market anarchist, even you have to concede that a company that’s kept an 80% or greater market share in the fastest growing computer market segment, in the 30 years since you entered the job market, has to be doing something right beyond glitzy marketing.

    Oh, they’ve done a great many things competently. None of those things includes operating systems, however…what they’ve done instead is successfully lowered everyone’s expectations to the point where customers think that shit-smell is attar of roses or something. And those of us with actual standards get looked at like we’re crazy for pointing out the suck.

    “None of those things includes operating systems, however…”

    Nonsense. The NT kernel, designed by legendary Unix-hater Dave Cutler, is a spiritual descendant of VMS and as such boasts a design that is cleaner and more orthogonal than any Unix. (Linux’s ad-hockery looks egregiously bad in comparison.) Admittedly, Windows NT has exhibited problems with stability; many of those were caused by buggy third-party drivers, by a default security policy deliberately designed to maintain bug-for-bug compatibility with 16-bit Windows, and by buffer overflows in userland software. Many of these problems have been fixed with Windows Vista, and I know Vista users who report high uptime with no crashes. It’s not 1998 anymore. Windows is stable.

    What I think Microsoft does particularly well, however, is to act in a leader/gatekeeper role for the PC platform, negotiating the complex relationships between IHVs, ISVs, and end users. Which means that l33t cutting-edge stuff kind of tends to get left by the wayside if ordinary users don’t care about it. (The Macintosh didn’t have PMT or proper memory management for the first 15 years of its life. Did the Mac user base, known for its technological elitism, care? No.) Any platform needs such a gatekeeper.

    Open source has coasted along for a while without one, but the results so far have been disastrous. Linux is broken because its community is broken: without a gatekeeper to say “this is what we need to focus on, this is what we will support, this is the API you can rely on”, and so forth, what you have is a horde of anarchic cluster-communities competing with, and often clashing with, each other when their energies can and should be channeled to make significantly more progress.

    As a recovering fosstard (not at all dissimilar from being a recovering Catholic; I’ve experience in both), I used to say “yeah, let a thousand APIs, window managers, and subsystems bloom!” But then it dawned on me: it’s 2008 and sound still doesn’t fucking work on Linux. App A uses pulseaudio, B uses ALSA, C tries to open ALSA’s backward-compatible OSS /dev/dsp interface and finds it blocked and so trudges onward with no sound until restarted, and D is trying to connect to a nonexistent esd daemon. FAIL. The only solution that’s been proven to work is for someone to decide on one sound API for Linux. That will never happen, though, because of the monumental cat-herding involved.

    You see proprietary software as a jail. Fine. Being a software engineer, you’re extraordinarily lucky[0] in being able to call that shot. For these people it’s rather akin to a gated community: some freedom is traded for predictability, peace of mind, and gorgeous scenery. And it’s preferable by far to the stone-knives-and-bearskins primeval village which open source represents.

    [0]For large numbers of people, the choice is between using a proprietary program on Windows, or not doing their job at all. At a talk, RMS once told a motion picture editor to get a new job rather than continue using proprietary software (all the editing tools which are any good are proprietary). Nicely illustrative of why “proprietary software = bad” is a completely untenable position.

  42. “Oh, they’ve done a great many things competently. None of those things includes operating systems, however…what they’ve done instead is successfully lowered everyone’s expectations to the point where customers think that shit-smell is attar of roses or something. And those of us with actual standards get looked at like we’re crazy for pointing out the suck.”

    Here are my standards for what makes a “Good OS”.

    Stability

    The last time I had my system outright crash (Win NT through XP SP 3) was four years ago. The cause of the crash? Enough dust had accumulated in my CPU fan that the CPU was overheating. Using a blower to shove the dust out solved the stability problem.

    Hardware Compatibility

    I buy the vast majority of hardware out there, plug it in, and it works. I don’t have to sort first by whether or not there’s a driver for the distribution of Windows I use.

    Networking

    The only time I’ve had a problem connecting my geriatric laptop to a network where other people could connect was at your place. Worked fine at all three airports I was at, worked fine at WBC, worked fine in the hotel room at WBC, worked fine in the hotel room at GenCon, worked fine at the Indianapolis convention center.

    Data Integrity and Restore

    My file system backups work just fine. I run test restores every month.

    Software Compatibility

    My applications install without me having to open a text editor, or recompiling a patch into a kernel. I do have to log in as admin when installing them, and log back out after said installation.

    Security

    I run in user-mode. I only switch to Admin mode when installing software. I live behind a router, run anti-virus sweeps three times per week, and have a firewall. I spend about as much time doing this as most of the Linux people I know spend maintaining their systems for security purposes. My biggest security hole is that I have to switch to Admin mode to install about one Windows Patch update in three.

    Usability

    Fonts and color spaces display right (Important for me). Apps I am dependent upon, and which I could not personally write alternatives for, and for which, no viable alternative exists in the OpenSource space, run natively rather than through some malebolgian kitbash of emulators and pipes.

    What should I expect from Linux in those categories?

    What categories have I missed that are important where Linux is better?

  43. >What categories have I missed that are important where Linux is better?

    I could list them for pages, but here’s just one that will stand as a good example. If Microsoft were competent at designing operating systems, botnets and the spam problem wouldn’t exist. They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit. If that weren’t the case, plugging the few remaining open relays would make spam traceable to sites that could be shut down.

  44. “Fonts and color spaces display right (Important for me).”

    Point in Linux’s favor: it has decent screen font antialiasing now. Windows antialiasing looks like ass; Linux is marginally better. As always, the Macintosh is the best by far.

    (See a pattern emerging here? Mac OS X has become the “no compromises” Unix. If you need or are interested in the arcane power of Unix on your desktop, probably the best favor you can do for yourself is to get a Mac.)

    “What categories have I missed that are important where Linux is better?”

    Here’s one. Linux scales from tiny handheld devices all the way up to supercomputers. These days, Windows and Mac OS X run on cell phones, but only in special vendor-approved builds. However, the open-source nature of Linux makes it far easier to use it as a starting point for your custom embedded or large-scale computing solution.

  45. “They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit.”

    Again, this is due to a combination of broken userland software (blame the IE and Outlook teams), and an idiotic security policy that was, to some extent, necessary to maintain full compatibility with legacy apps. It really has nothing to do with Windows as an OS (and no, IE isn’t a part of the operating system; I don’t think even Microsoft ever believed that for a second.) Microsoft is quite cognizant of its mistakes in this regard and has been after them like a chicken on a June bug.

    Oh, and the “compatibility with legacy apps” is something Linux fails at too: try running an old binary on a new Linux system and boggle at all the library dependencies you have to resolve.

  46. > Oh, and the “compatibility with legacy apps” is something Linux fails at too: try running an old binary on a new Linux system and boggle at all the library dependencies you have to resolve.

    That’s why GCC was invented. ;-)


    “I could list them for pages, but here’s just one that will stand as a good example. If Microsoft were competent at designing operating systems, botnets and the spam problem wouldn’t exist. They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit. If that weren’t the case, plugging the few remaining open relays would make spam traceable to sites that could be shut down.”

    My understanding of this is that it’s an exploit in Outlook (easily replaced, not part of the OS any more than Thunderbird is part of Ubuntu), and in IE 5/6. (There are reasons I don’t run IE and Outlook.)

    So, what part of the OS allows that to happen, that you place the blame there?

  48. >So, what part of the OS allows that to happen, that you place the blame there.

    It’s not specific to IE, or Outlook, or any other application. And it’s not any one thing; there are multiple gaping holes in the architecture. I’ll name two: (1) the Windows Update channel is easy to hijack, and (2) for performance reasons, the GUI has run in the same ring as the kernel since NT 4.0, which means the most trivial application buffer overflows can give a cracker the equivalent of root privs if he knows what he’s doing.

    The botnet herds are huge for a reason; the herders can find holes faster than Microsoft can fix them. That’s a sufficient indictment of Microsoft’s technical competence right there.

    If you take a clean install of Windows and put it, un-firewalled, on the net, do you know how long you can expect it not to be pwned? This has been measured by experiment. I believe last time I read about this it was 17 seconds. Down from 43 the previous time the experiment had been run. But the exact figure doesn’t matter; 17 minutes, 17 hours, or 17 days would still be evidence of incompetence.

    And yet people still look at me funny when I complain about Microsoft.

  49. > There is no possibility of “better” if the effect is to lock me into a vendor-controlled jail. None. That’s like saying “Here, try this lovely heroin. It feels soooo gooood that you won’t mind that you’ll never be able to stop.” You can list features and capabilities until doomsday and you’ll never get past the word “proprietary”, which stinks to me of lossage and of personal pain too keenly remembered.

    Eric, that’s just daft. To continue with your analogy, you’re in effect saying that because the drug cartels and the pushers are selling heroin, they can’t possibly have any interesting lessons about drug synthesis or marketing or hypodermic needle design that could benefit legitimate pharmaceutical companies.

  50. OK, just read your next post. To clarify, I’m not suggesting that you use Windows, MSDN, Visual Studio etc, just that it’s worth keeping an eye on them from an appropriate distance in case they have any ideas worth stealing.

  51. (Jeff Read) > Again, this is due to a combination of broken userland software (blame the IE and Outlook teams), and an idiotic security policy that was, to some extent, necessary to maintain full compatibility with legacy apps. It really has nothing to do with Windows as an OS

    Most of the complaints presented here about Linux/Unix are also about the userland. What’s a part of the OS, anyway (…because everyone likes a never-ending debate)? Technically, even the Linux audio API jungle problems can be labeled as “that’s just some crappy userland tools; we have this new shiny world order (PulseAudio or such) and everyone should just get with the program”.

    I have very little experience with Windows. I recently helped a colleague at work set up a new Lenovo laptop that came with factory-installed Vista. The first time the thing was booted, straight out of the cardboard box, it took two hours to come up, including probably a couple of reboots that the system required. I have no idea what it was doing. Possibly some configuration or such that Lenovo put in their factory installation, so I don’t know whether Microsoft had anything to do with it. When it was finally up, the amount of disk space taken was 13 GB. This is plain Vista with nothing installed, with the possible exception of some Lenovo utilities. I had no desire to find out what was going on, so I installed the tools we need at work and handed the machine over as quickly as I could (AFAIK it has been stable since). Is Vista really *that* bloated, or are there DVD-quality instruction videos in there or something?

  52. Yes, it is really that bloated. The standard install of Vista (Starter ed. aside) is upwards of 10 GB.

  53. Jeff Read Says:

    > Point in Linux’s favor: it has decent screen font antialiasing now.
    > Windows antialiasing looks like ass; Linux is marginally better.

    I have a corporate-issued T61 with Win-XP, but I frequently run my Ubuntu 8.04 LTS Desktop Edition live CD. I think the basic Gnome interface (as-is) as packaged by Ubuntu is more than adequate, and if I were a small business owner, I would not pay any Microsoft tax for the MS UI. What I do notice is that the fonts seem ugly. Not sure you are old enough to remember, but when I first used OS/2 Warp with Adobe Type Manager, the subtle improvement in screen fonts that ATM provided over MS-Windows 3.1 seemed like a big deal to me. Similarly, MS-WinXP seems just a bit crisper in its fonts.

    Do you have a link to a FAQ that might describe why this is, and what I could do about it (I am sure I will need to stop using the Live CD to start)?

  54. I’ve spent quite a lot of time on your trash. You have more tendency to sex than software. Indeed, you are the one who suggests would-be-hackers in the “How to become a hacker” article not to waste time on distractions like sex, social approval, etc. In addition to being an attention whore, DON’T BE AN ASSHOLE MR. RAYMOND

  55. A handy guide to translating hissy fits like SomeDude’s:

    “attention whore” = anyone who isn’t painfully introverted.

    “asshole” = anyone who can talk to a woman without scuffing his shoes and keeping his eyes on the floor.

    To be fair, this isn’t just SomeDude. A lot of socially-handicapped hackers tend this way. It’s not really a choice, but something about their neurotransmitter balance, and I try to make allowances. Because in some alternate universe where I don’t have whatever minor genetic quirk gave me an entrepreneur-extrovert type personality rather than the standard geeky quasi-autistic one, I’m probably spitting enviously at an ESR-type. And reading his blog. And wishing, deep down, that I were more like him.

  56. Jeff Read,

    Long time no talk. I too would consider myself a recovering fosstard (perhaps even recovered at this point). Shoot me an email sometime if you’re so inclined. Debating you was one of the more fulfilling wastes of my digital youth that I can recall, and I’d be interested to hear what you took away from your time in the hostel that GNU built.

    Cheers,

    Pete

  57. Marco,

    The “crispness” is due to a combination of factors: use of ClearType (subpixel antialiasing, though Linux does this too), and the fact that Microsoft’s TrueType engine still tends to use hinting, which means it adjusts the font shape so that crucial points fall neatly on pixel boundaries. I don’t know the magic switch to turn it off or even if there is one.

    That hinting makes for sharper lines, but it takes its toll on the glyph shape, making the glyphs look less natural and less even. It was necessary in the days of Windows 3.1 and aliased screen fonts, but today it is far less necessary and even detrimental, as it tends to mess with character spacing as well. The Mac, which employs a subpixel rendering system throughout its Quartz layer, is easily the best at font rendering; all screen fonts on the Mac look page-perfect. The Linux Xft layer comes in second, and Windows third; at least, that is my ranking, for I find the hinted glyph shapes distracting when combined with font antialiasing.
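
    (To answer Marco’s question above: on Linux the usual knobs are the Xft settings. A minimal sketch, assuming a desktop that honors X resources; the values shown are illustrative, and fontconfig exposes the same switches in XML form:

        ! ~/.Xresources: font-rendering knobs for Xft-aware toolkits.
        ! antialias on; hinting off (or try hintstyle: hintslight);
        ! rgba names the subpixel order of a typical LCD panel.
        Xft.antialias: 1
        Xft.hinting: 0
        Xft.hintstyle: hintnone
        Xft.rgba: rgb

        $ xrdb -merge ~/.Xresources    # reload, then restart your apps

    On a live CD the file evaporates at reboot, of course, but it is enough to experiment with.)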

  58. I was a bit of a UNIX fan when I first read that book, but I saw a lot of good points in it.

    The lesson I took away from it was this: all general-purpose OSes suck. If you don’t think your OS sucks, you just haven’t tried to do the right/wrong things with it yet. Unix and Windows can be improved, but they will both still suck, because of history, poorly defined goals, and general screw-ups in getting all the parts together.

    I have concluded that the correct way of thinking about OSes is not as grand buildings designed and built for a purpose, with majestic skylines, but as city infrastructure, with streets and sewers, that should just work. However, people will only notice your system when it overflows into the basement. An infrastructure that is not overtaxed is in a city that is not growing; if your OS sucks because it is being taken places that weren’t considered when it was first built, well, there are worse fates.

  59. Joe,

    No. The Mac and the Amiga did not suck. They were (and the Mac still is) beautiful machines with beautiful OS software designed specifically to make full use of the underlying hardware. Their developers had well-defined goals and succeeded enormously in “getting all the parts together”, but sadly they were marginalized for other reasons (corporate mismanagement, lack of marketing direction, Microsoft aggression, etc.) They had flaws, and they had to be expanded to keep up with ever-changing innovations in hardware and software, but they aren’t the pile of kluges that Unix and DOS/Windows were and are.

    These days, you’d hardly know such things existed, as most desktops are generic x86 shitboxes running Windows or, on occasion, some Unixoid thing. But once upon a time there were companies with strong engineering traditions who were close to their user bases, listened to them, and delivered what they wanted.

  60. (You never actually know if your beard will be gray or not until you grow it out. The apparent color of sink stubble can be deceptive.) The World Wide Web, originally developed on the Unix-based NeXTSTEP, has a lot of Unix-ish ideas. A lot of fundamental stuff seems half-done or inconsistent, but later generations of software are getting more and more of it right. Would a Unix-hater’s idea of a WWW-like system have worked as well?

  61. >They had flaws, and they had to be expanded to keep up with ever-changing innovations in hardware and software, but they aren’t the pile of kluges that Unix and DOS/Windows were and are.

    Romanticizing old also-rans is always a temptation. There were major flaws in both systems; ask someone who was there, sometime, about the horror that was dynamic-memory management under early Mac OS versions. Or the truly peculiar kludges necessitated by the fact that the Amiga’s multitasking had no pre-emption.

  62. The thought that Commodore had a ‘strong engineering tradition’ is highly amusing. The Amiga team were good engineers, but anyone who claimed the Amiga bettered Unix in any substantial way never used it. The lack of memory protection or virtualization handicapped it significantly in the long view, and the CLI and user interface were not all that. I say this as a person who used an Amiga 1000 exclusively for 7 years, and programmed it in C a goodly bit.

    And, Eric, the Amiga did have pre-emptive multitasking. ;-)

  63. >And, Eric, the Amiga did have pre-emptive multitasking. ;-)

    When did that happen? I never programmed in it, but I had a buddy who did… Noah Feldman, does that name ring any bells? This would have been around 1984, I think. He claimed that Amiga programs had to give up control, couldn’t be pre-empted out of their scheduler slots. If that was bad information, please correct me.

    I do know directly about the Mac OS horror; I used early Macs on which it was occasionally necessary to manually assign memory to applications because the OS couldn’t do it. Even at the time I thought that was insane.

  64. The Amiga Exec was always pre-emptive, from day one. It had message ports that could act as synchronization points for communications with the GUI system and kernel, but the system never depended on manual yield points like the Mac did.

    One of the things that Amiga users used to do to mock Windows users was format a floppy drive while manipulating windows, showing the boing ball bouncing in the background screen, etc. Windows didn’t gain the ability to do floppy I/O without blocking other operations until Windows 95.

    See http://en.wikipedia.org/wiki/Exec_(Amiga) for details, and a reference to a 1991 Byte magazine article that discusses the structure of the Amiga Exec kernel a bit.

  65. What is your take on OpenSolaris, as the replacement for GNU/Linux in terms of being the next generation development environment?

  66. >What is your take on OpenSolaris, as the replacement for GNU/Linux in terms of being the next generation development environment?

    Next generation in what sense? Linux has an effective community, a mega-crapload of pretty good software, and a strong brand. It wouldn’t bother me if OpenSolaris displaced Linux, but I see no overwhelming technical advantage there and its community is orders of magnitude smaller. As a fallback for people tied to legacy Solaris software it makes sense, but as a Linux successor? Nah.

  67. Jonathan, in those days Unix was still a minicomputer OS designed to handle multiple users logging in via terminal over serial lines. Amiga was designed to be a desktop OS; as such, it could achieve graphical displays at speeds untouchable by a similarly configured Unix machine (even if you had one for the desktop). A legacy of the days when hardware and software were designed to work together as a cohesive unit, for the benefit of the user. These days, the attitude is that hardware and software are both cheap and fungible. It’s similar to the difference between European engineering (“We build cars”) and American engineering (“We sell cars”). I don’t think even modern Linux has gotten low-latency syscalls to the point where even a stripped-down GUI like xfce feels as snappy and responsive as the Macintosh or Amiga did in their heyday; heavyweight GUIs like KDE and GNOME are not even in the same league.

    And yes, Eric, the Amiga had a fully preemptive multitasking kernel in 1985, at a time when such things were unheard of on any other personal computing platform. It also was the first personal computer to support hardware-accelerated video and multi-channel sound. It could do things computers today can’t, like switch video resolutions in the middle of a vertical scan. It was in many ways a revolutionary machine. You would do well to read up on its capabilities and maybe even play around with the Amiga emulators to get a feel for what it was like.

  68. “Next generation in what sense?”

    Next generation in the sense that it incorporates all the best from both worlds:

    – System V, a much, much better and more consistent environment than GNU
    – ABI forward compatibility
    – DDI/DDK driver forward AND backward compatibility
    – a GNU alternative for those hopelessly tied to GNU/Linux (in /usr/sfw)
    – MPxIO, 1st class enterprise iSCSI support, enterprise grade clustering (gratis, open source!), grid engines (N1 and N1 geographic), IPMP (all these gratis AND open source!)
    – zones, to a lesser extent LDOMS, and XEN
    – ZFS! What’s not to love about ZFS?
    – DTrace!
    – runs on the same hardware as GNU/Linux
    – enterprise grade compiler suite – gratis!
    – runs the same free/open source software as GNU/Linux!

    One can have his/her cake, and eat it, too: standardization and backward/forward compatibility of a System V UNIX, with boatloads of high quality, enterprise grade, yet open source software! So what’s not to love about OpenSolaris these days?

  69. And, speaking of a smaller community, I believe OpenSolaris would benefit tremendously if YOU joined that community. Yes, you, ESR.

  70. >And, speaking of a smaller community, I believe OpenSolaris would benefit tremendously if YOU joined that community. Yes, you, ESR.

    Probably. However, you averted your gaze from the elephant in the room, which is the comparative size of the developer bases.

    There was a time when I thought the NetBSD/FreeBSD/OpenBSD crowd had a superior system to Linux. I didn’t switch, because it’s a straight-line prediction of my own theories that (other things being equal) the open-source project with more developers will improve faster than the one with fewer. Therefore, I believed that in time Linux’s metrics on real-world performance, security, and stability would improve to pass those of the BSDs. Eventually, I’d say around 2002, I judged that this had in fact occurred. Had I in fact switched because of perceived technical superiority, I would have spent a great deal of effort backing the wrong horse.

    Having avoided this mistake with respect to BSD, I’m not going to make it with respect to OpenSolaris.

    What’s worthwhile in your projects will be assimilated by the expanding Linux blob. What’s not will die. Sorry, but that’s the way it is. Evolution is not pretty to watch sometimes.

  71. What’s worthwhile in your projects will be assimilated by the expanding Linux blob.

    I get it. So the Linux community are Zerg. Problem is they’ve spent so much time rushing other people’s bases they’ve neglected their own.

    I stripped Linux off of one of my machines the other week and replaced it with NetBSD. It just so happened that this was my primary art machine — a TabletPC — and there was no official support for such machines under NetBSD that I could find. This was to prove a difficult move fraught with hardware-support issues but one I ultimately don’t regret making. Tweaking and compiling the X.org driver from the linuxwacom project got the last piece of the hardware puzzle — support for the machine’s built-in tablet — in place.

    There isn’t the broad base of driver support that there is under Linux, but I’ve found that when NetBSD supports a piece of kit it supports it hardcore. It is able to configure the machine’s sound card for full duplex — something I haven’t been able to observe under Linux (being able only to record XOR playback adds to the sound frustration and general ALSA brokenness). Meaning that I can use this machine as an audio workstation as well.

    Back in 2001 when I first tried NetBSD, Linux’s USB support was only kinda, sorta there; NetBSD was able to autodetect and immediately start using any USB input device I plugged into it. For a driver to make it into the NetBSD kernel it must not only work but fit smoothly in with NetBSD’s infrastructure — a 30-year history of getting different kinds of hardware across different buses and CPU architectures to work together under a single, unified model. The elegance is astounding.

    Linux, by contrast, is making improvements in this regard but only really had a unified device model as of the 2.6 kernel series; the hacked-together, git-r-done model of development seems to have exacted a severe toll in terms of code quality and maintainability. It’s kind of sad to see, as I had used Linux for 13 years and become rather fond of it, but it’s approaching a threshold where until it gains the refined culture of quality that the BSDs have cultivated for years, it will collapse under its own ponderous bulk. The same goes for the GNU userland.

  72. I don’t, from experience, subscribe to the notion of “how many masons, that much wall”: programming software is not like building a wall, and more people working on it does not necessarily mean that the software will be plentiful, high-quality, or even correct.

    I’ve seen many a time that a handful of true experts work miracles in the shortest amount of time possible, while “experts” muck around and produce crappy software.

    Quantity is nice, but quality is even better.

    Someone like yourself can do, and has done, miracles. That’s why I firmly believe one ESR can do what would normally take 100 “average” programmers. I’m deeply convinced of it, because I myself experienced it.

    So what if an OS isn’t mainstream? Why would it have to be? Didn’t you yourself write, in “why I hate proprietary software”, that programming is an art, and that one should enjoy it?

    And since the same FOSS software runs on (Open)Solaris just like it runs on GNU/Linux, I don’t really see why community size is a problem.

    Every open source beginning was hard. I don’t see why OpenSolaris would be any different; it has to start somewhere.

  73. Eric S. Raymond, Miracle Worker.

    And you thought the Obama supporters were blinded by hype!

  74. Have you read his book, “The art of UNIX programming”?

    I look up to him, as far as programming tenets go.
    There is no shame in learning from those who are smarter and wiser than yourself.

    I also believe that ESR would be of great use as a guide to all those GNU/Linux people coming to OpenSolaris. As a “Gray beard”, he’d be a great mentor to the GNU/Linux newcomers in the OpenSolaris world. We have lots of those.

    After all, who knows UNIX better than a “Gray beard”? No, who *understands* better than a “Gray beard” what UNIX is all about?

    And ESR, when I read your “why I hate proprietary software”, you touched a spot in me, deep inside. When you described what you went through, I saw myself in that description, word for word.

    Thank you for sharing that essay.

  75. “We have lots of those” == lots of GNU/Linux people coming over to use, play and work with/on OpenSolaris. Just look at opensolaris.org/os/discussions/

  76. >And since the same FOSS software runs on (Open)Solaris just like it runs on GNU/Linux, I don’t really see why community size is a problem.

    Two words: device support.

  77. With all due respect, that is something that used to be the case, and is definitely no longer so.

    Most contemporary hardware is supported now, and support for even more hardware is in the works.

    I have an Intel quad-core PC, and everything just works out of the box: the network, the HD audio, the Nvidia 3D accelerator (with acceleration).

  78. Ultimately, you will do what you want. I just think it would have been great to have you along for the ride.

  79. UX-admin, Solaris seems worth looking into. It’s rather shameful that we must accept the worst possible systems (Linux, frickin’ Windows, x86) just because of mob support.

  80. @ESR:
    We will be watching closely to see how much of your publications will not look ridiculous 15 years later!

  81. Well, I’m not a veteran like you guys, but I thought I’d share my feelings anyway. I’ve been using Vista at work and at home for about a year now. It crashed once. I used my aunt’s MacBook for one summer, running OS X. The thing must have crashed on me about twice a week. Every time I ran too many apps (meaning a torrent client, a video player and a web browser, and maybe a few PDFs) the whole thing just froze and had to be rebooted. I also run GNU/Linux at home. The sound problem (mentioned in another response) is just ridiculous. Even printing isn’t easy. I’m not running Gentoo or using a Windows-only printer either. I’m running Ubuntu, and my printer is a 15-year-old HP LaserJet 4. It can print a test page or standard jobs fine, but try to make it print even pages only and it’s no go; the printing job just disappears. You say I should have to recompile something? If your car broke down every few days and you were told all you had to do was polish the spark plug to keep the thing going, you would also think about finding something better. Isn’t that what’s happening to US car manufacturers right now?

    As for being proprietary, please grow up. Why should I care about being able to modify something if it already works? If it doesn’t work, why would I buy it? Microsoft software is expensive? Let’s say I have to pay $1000/year on Microsoft licenses. That’s less than 2% of most entry-level salaries for IT jobs. If the software increases my efficiency by 10 minutes per day, I’m a clear winner. But what could save 10 minutes per day? Let’s see: not having to read through endless poorly structured documentation, or to filter through forums looking for a command line buried somewhere among the top-notch *nix intelligentsia comments that tell you what an idiot you are for not knowing the command, instead of giving you the command itself.

    GNU/Linux may have a ton of free software, but most of it does not work properly (meaning without having to change code), has a tendency to crash without warning or error messages, is not user-friendly, and has poor or no documentation. And I almost forgot: there is a lot of open source software available for Windows, and it usually has better GUIs and more functions than the *nix equivalent (ex: 7-zip vs p7zip).

    Don’t get me wrong, I wouldn’t be spending so much personal time using Ubuntu, or trying to use FreeBSD, if I didn’t love the concept of freedom-based software, but the *nixes still have a long way to go before the hate becomes unjustified.

  82. @SGilmour:

    My experience with OS X has been much different. Usually such crashing is caused by hardware problems (typically bad RAM). Software for OS X is very stable (for the most part; there are a few apps that crash).

    Windows is a mess. I work in IT and see problems every day. When I used to use it at home, it was horrible.

    Linux is nice, but requires a certain level of maintenance. A LaserJet 4 should work fine (though it may be necessary to modify settings in the CUPS web-admin page).
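
    For the even-pages job specifically: CUPS can do that selection from the command line, bypassing whatever the application’s print dialog mangles. A quick sketch (the queue name lj4 is made up for illustration):

        $ lpstat -p                                   # list configured queues
        $ lp -d lj4 -o page-set=even report.pdf       # even-numbered pages only
        $ lp -d lj4 -o page-ranges=2-10 report.pdf    # or an explicit range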

  83. “There is no possibility of “better” if the effect is to lock me into a vendor-controlled jail.”

    This would make sense, except GCC has many GCCisms and bugs that developers program around, effectively locking their code bases to GCC. It took a while for the Linux kernel to compile under other compilers (notably Intel’s, and later Sun’s, but I don’t know if the Sun “port” has been finished). Microsoft has better standards compliance these days than GCC, so you run the risk of vendor lock-in more with the Linux compiler than with the Microsoft compiler. Most compilers have to offer “GCC compatibility” for code written for GCC to compile without [sometimes, major] changes. Microsoft doesn’t have this switch, so I guess that’s why you refuse to use their development tools?

    MSDN documentation (both online and installed) is vastly superior to anything Linux has to offer. It is not a coincidence that KDevelop imitates the Visual Studio user experience (unsuccessfully, however), as well as trying to imitate the MSDN documentation experience.

    Proprietary is not as bad as you say. Try to keep the discussion from getting too religious. The UHH still has more than enough relevance these days. X still takes more resources to run than Windows XP. I can run XP on a P2 233 MHz computer with 160 MB RAM and a 16 MB graphics card, while GNOME/KDE show laggy performance on a Celeron 2.7 GHz with 1 GB RAM and a 128 MB Radeon card. I don’t even want to get into hardware support when it comes to X.

    Linux developers are worrying about too much **** that doesn’t matter for desktop users. Sun is doing the same thing with OpenSolaris. They are trying to do this desktop distro, but it will not install/run on 3 of my 4 computer systems.

    In the words of a UNIX hacker, “If you want an Open Source Windows, then contribute to the ReactOS project and stop crying about proprietary crap.”

  84. Also, yes, I know Linux’s bread and butter is the server, but if they want to displace Microsoft Windows they will have to standardize on one user experience and really optimize X and grow its hardware support. OpenSolaris has to do the same thing. I cried when I found out they were doing a KDE port to be included in the base install. I shall not use it.

    Linux is a cluster**** to develop for. That’s why commercial tools like Kylix have failed so colossally on that platform. Kylix actually was a good development tool, but it’s hard to develop commercial native Linux GUI applications because of the GUI/library clashes and assorted differences among the distros.

    The only way to really be accurate in developing for Linux is to use WineLib, develop a core application with much of its functionality implemented in a proprietary scripting language (like some editors do; SlickEdit?), use Java, or understand that your application will alienate half of the Linux userbase, because some hate KDE and some hate GNOME, and the applications will not integrate properly if they use a different base…

    People need to wake up and see how horrible Linux is as both a development platform for end-users and a desktop/delivery platform. When they get rid of all the fragmentation, things will get better. Linux is not knocking on the door of Windows. Not Ubuntu, not Novell, not any of them. The hate for proprietary software won’t help, either, since proprietary software has, on average, much better quality and support compared to open source software (this includes “community” support, btw, IME).

  85. “With all due respect, that is something that used to be the case, and is definitely no longer so.”

    Yes, it is so.

    Maybe it runs on your machine, but I have 4 computers and Solaris won’t even boot the installer on 3 of them because of its pretty terrible hardware support.

    I do not buy hardware to run Free OSes, I just renew my RHEL subscription instead.

  86. “Windows is a mess. I work in IT and see problems every day. When I used to use it at home, it was horrible.”

    Learn to use Windows, I guess? The same way you learned to use Linux?

    There are so many people who disagree with that statement :) The only people who really say that are those who don’t use Windows and only want to tarnish it to persuade others to move to Linux. I have had huge arguments with Linux users who have bashed the Windows GUI/userland/security, and they have been proven wrong handily. I do not want to have this discussion again, but if you feel you must go that route, I have a saved copy of those posts that I can paste to this blog.

    Overstating and exaggerating things is not good. What you say is FUD.

  87. > If Microsoft were competent at designing operating systems, botnets and the spam problem wouldn’t exist. They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit.

    And to be fair, all the efforts (and wasted time of people’s lives) around these problems have to be added to the TCO (of the respective OS).

    Now it seems some politicians even want to have the military step in for cyberproblem mitigation, so this will become even more expensive, and thus become a kind of subsidy to MS. It’s the same “rake in profit for myself now for substandard quality and leave the problems to others” principle as bailouts for the financial gamblers, dumping waste into the environment, and leaving inner scars for others to heal, etc.

  88. > If Microsoft were competent at designing operating systems, botnets and the spam problem wouldn’t exist. They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit.

    Nice argument. Viruses and worms were there before Windows and before MS-DOS.

  89. >>> …rather than leaving the job to a userspace library like curses(3)

    OK, here’s a question… what is that “3” doing after “curses”? Or find(1) and man(1)? Is find(1) different from find(2)? Or is find always (1)? (and why?)

    >what is that “3” doing after “curses”?

    David, it’s a Unix documentation convention. The number in parentheses is the manual section you’d find the feature in. (1) is utility programs, (2) is system calls, (3) is library routines. Higher numbers are more variable, except that (6) is always games if it exists.
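
    A quick illustration: printf happens to live in two sections at once, which makes the distinction easy to see.

        $ man 1 printf    # the shell utility
        $ man 3 printf    # the C library function
        $ man -k printf   # list every page whose name or summary matches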

  91. Fair comments. I do think the book was true at the time, and I think your comments are fair now.

    I do think OS X, Windows and Unix are 70s dinosaurs whose design assumptions are no longer relevant now that requirements have changed.

  92. > If Microsoft were competent at designing operating systems, botnets and the spam problem wouldn’t exist. They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit.

    HAHA… UNIX is NOT secure at all. MULTICS (and VMS, OS/390, OS/400) was FAR more secure, and there is a nice paper about a Trojan on Multics whose authors’ answer was that these systems can never be secure and should never be shared (i.e. put on the internet). We knew this in the 70s.

    As long as someone can convince users to run a program (which is a given), you will always get botnets in systems with ambient authority. Unix (and Windows) security is fundamentally flawed in that it enforces security at the user level (which was OK in the 70s, but not now); you need ACL security at the process level, or better, a capability security system like EROS, Coyotos or MOOOS. Note that file descriptors in Unix are capabilities; unfortunately the access control around them is flawed.

  93. I am relatively new to Unix (well, Ubuntu/Linux), with previous experience on Macs and Windows. I am forced to use it at work, as key software is not available on 64-bit Windows, or I simply need mainframe power.

    Unix/Linux is an absolute crock in my view. It takes ages to learn, and I am very dependent on others. It is not obvious to me how I can do anything beyond work in a simple Cygwin window. Some colleagues who know Unix aloofly claim its superiority, yet refuse to help me learn it.

    I would dearly like to install and run an FVWM and an X-Windows client myself, especially when the commercial X-Win32 program fails on me, but I haven’t a hope of sorting things out without technical support.

    For all its faults, I can work most MS Windows issues out myself. The book rang absolutely true for me. What a ridiculous and obsolete operating system! Operating systems should not be an impediment to your work, but an enabler. Even the current BASH/Linux/Ubuntu implementation of Unix utterly fails in this regard for a new user.

  94. @Bruce:

    Go get the Ubuntu Live CD — you’ll download that one if you click the big, green, friendly Download button — and give it a try. Just boot it on any reasonably current Windows desktop or laptop; you’ll get to play with it without ever installing it. Then tell me what you think of Unix. Most people in your situation are using engineering workstations and thinking that that is the current state of Unix on the desktop. Hardly. My non-technical wife (she’s a psychologist) uses Linux at home on her desktop and on her Dell Vostro laptop with very little help from me.

  95. Typo fix:

    $ rcsdiff 'index.html?p=538'
    ===================================================================
    RCS file: index.html?p=538,v
    retrieving revision 1.1
    diff -r1.1 index.html?p=538
    145c145
    < Similarly unfortunate. Sets the tone for too much of the rest of the book, being mostly hyperbolic snark when it could have been useful criticism. Very dated snark, too, in today’s environment of Linuxes wrapped in rather slick GUIs. The anecdates about terminal sessions on Sun hardware from 1987 look pretty creaky and crabby today.

    > Similarly unfortunate. Sets the tone for too much of the rest of the book, being mostly hyperbolic snark when it could have been useful criticism. Very dated snark, too, in today’s environment of Linuxes wrapped in rather slick GUIs. The anecdotes about terminal sessions on Sun hardware from 1987 look pretty creaky and crabby today.

  96. I’ve read the UHH, and the fact that many of these problems still exist is the greatest damning evidence that UNIX sucks and will always suck. It’s not 1969, folks!

    The chapter on the X server is not only still relevant, it is more relevant than ever! Sure, some of the ICCCM ugliness has been hidden if you only use certain toolkits, but it’s still there and no better standard exists. The mass proliferation of toolkits has only worsened the chaos and defeats the original purpose of the GUI. When using Linux, I stick to the command line except for web browsing. In spite of having its own chaos, it’s still less random than what we jokingly refer to as the Linux desktop.

    The only silver lining is that the UNIX toolset and assorted crap is being standardized, which is the first step towards sandboxing and moving on to the next big thing. I can’t wait for the day when we can have a system where the left hand knows what the right hand is doing. Hopefully, instead of the everything-is-a-file paradigm, it will have the nothing-is-a-file paradigm.

  97. Hello,

    First of all, just for the record, I am a student in India. That basically means that I have access to an average speed of 256 Kbps on the Internet and possess a basic Dell Inspiron laptop. My requirements run to social networking applications as well as programming languages. Thus, I can easily say that I am the ultimate average end-user.

    Linux is beautiful; as a student of Computer Science, I can only view it as a work of art. I am working my way up through the languages mentioned by you and will soon enter the world of kernels. But it has focused so hard on open source and security that it has made end-user reliability impossible. mp3 is a proprietary format, and ogg and flac are open source and better. But it doesn’t matter, because none of my favorite songs are available in them. So if you want people to use these formats, you have to build a library of songs in them first.

    In the same way, UNIX has already won the OS wars when it comes to large-scale implementation in the fields of academia, engineering, computing, security, web hosting and even mobiles. But when it comes to home and office users, Windows is much better. Accept it. Countless people have been working day and night to create applications for Windows for normal people who do not have high security requirements, and those applications also have really good support/documentation. Yes, Microsoft was/is a complete asshole when it comes to their licensing. But it’s sad when accomplished programmers such as you do nothing but bitch about it. Windows has its own base of open source community. I say that the Mozilla foundation is perhaps the best of all networks. They have worked hard, and today Firefox is the best browser available. They worked with what they had and won all competition.

    Therefore I support Jeff in saying that perhaps all UNIX and Linux users and programmers should unite to think more about end-users rather than their intellectual superiority, and Linux might replace Windows; or else, Windows wins by default.

  98. @Vineet, re Firefox: It’s crap. Absolute stinking garbage. Everything about it sucks. Firefox is the best argument that proprietarians have about low quality foss software. A million eyes, a million hands, a million turds.

    For the record, I feel exactly the same as esr expressed earlier. I despise closed source software and printers that can’t be repaired. But Firefox should be shot at dawn. It stinks.

  99. Ok, I am a little late to the party, but Eric has cleverly avoided the question of how he will not use a proprietary desktop OS, yet has probably used a cellphone running proprietary firmware in the past. See, cellphones have evolved from having firmware, into having firmware capable of running Java, and then into having a full OS (= with native apps). If it wasn’t for Google saving the open source community’s bacon with Android (let’s assume Android is open source, even though you can’t build the OS from the source repositories for your Nexus, because certain drivers and codecs are missing), you would have had to forgo owning a cellphone at all (or buy some rare N900 from eBay and watch it die from USB death, as that model does, and then forgo owning a cellphone).

    Soon, TVs and even washing machines and cookers will all become “smart”, with their own full smartphone-like OSes that can download apps or do network updates over the internet. If Google doesn’t make an open source OS for all these, or if their efforts bomb (like they did for tablets), you will have to forgo owning these appliances too, because they would run a proprietary OS. In other words, slowly turn yourself into a troglodyte like Stallman does.

    And I didn’t even have to mention cars and in-vehicle infotainment… Any new car has in-vehicle infotainment nowadays, soon with apps.

    I also didn’t have to ask why you are using the proprietary firmware of your monitor or TV to view this post right now. What’s the difference between firmware and an OS? Is a firmware with a Java midlet interpreter an OS? Is WP7 with its sandboxed apps a firmware or an OS? Ah… Such a blurry line.

    1. >you would have had to forgo owning a cellphone at all

      You’re confusing me with RMS. I’m not a fanatic, so check your fanaticism at the door.

  100. And before you rant about closed ecosystems/APIs and locked bootloaders (for ARM systems) being the deal breaker for Windows, those smart appliances I mentioned above could also be running OSes with closed ecosystems and APIs for their apps, and locked bootloaders. If conventional appliances aren’t made anymore, and Google fails to convince the manufacturers to make some appliances running something (semi-)open source, will you forgo owning such appliances?

  101. If you don’t care whether your cellphone, washing machine or TV runs a proprietary OS with locked app installation (no sideloading), a proprietary API and locked bootloader, why do you care when Windows does it?

    What exactly is your gripe with Windows? You don’t just accuse Windows of being poor in quality (a dubious claim when it comes to Win7, if you ask me; it’s not Windows that needs to have half of its graphics stack replaced by Nvidia to remove tearing, it’s desktop Linux with X.org), you also accuse it of being a “drug”, aka something immoral.

    The anti-MS crowd has to make a decision. Either you are going to declare every device that runs an OS with locked app installation, proprietary APIs and locked bootloaders to be immoral and refuse to use it (even if that means not owning washing machines or TVs, if no non-smart ones are made anymore), essentially turning yourself into yet another Stallmanite, or you will have to give Windows a break, and stop calling it a “drug”/immoral.

    Till then, the general attitude of the anti-MS crowd is “an OS has 90+% market share, and that OS isn’t our beloved Linux”, and “someone (Ballmer) is getting rich and famous and that someone isn’t me”. Aka pure fanboyism and jealousy.

    1. >If you don’t care whether your cellphone, washing machine or TV runs a proprietary OS with locked app installation (no sideloading), a proprietary API and locked bootloader, why do you care when Windows does it?

      I think I’ll blog about this. It’s a question that’s come up before and is likely to again.

  102. Months later, but:

    jon banquer wrote: “Firefox is the best argument that proprietarians have about low quality foss software. A million eyes, a million hands, a million turds.”

    You take all of those million turds and put them into a sizeable shipping container, and drop that shipping container into the blue-white supergiant of “I can change browsers if I want to”.

  103. The more I invest and volunteer “information” (of any kind) into a given device, the more I start to care about its openness. The two main reasons are privacy and ownership (= no one can lock me in or ransom my data).
    In other words, I could not care less about whether my toaster or washing machine is open or not, because I don’t NEED to trust it or control it or own the “data” I give it.
    OSs and smartphones are examples of the opposite extreme.
    Most technology is on a sliding scale somewhere in between.

    If someone wants to insist 100% on one view or the other, I don’t necessarily think they’re crazy fanatics, but it’s a viewpoint based more on ideology and philosophy than practical concerns.
    The view of kurkosdr, however, is pure silliness. It’s like saying if your favorite food is bratwurst then you are a big fat hypocrite if you ever eat anything else and don’t advocate the destruction of all the world’s non-bratwurst food production.

  104. “One of their main points seems to be that it would be nice if the called program also had a way to see the specific arguments it was called with. This would allow applications like rm to do a sanity check to prevent dangerous operations.”

    I still prefer a dumb tool, which does what it is told to do, to an over-intelligent piece of software which keeps making false assumptions about what I meant it to do.

    “As they say in Chapter 7, Graphical interfaces can only paper over misdesigns and kludges in the
    underlying operating system; they can’t eliminate them. ‘rm *’ would not hurt so much if Unix provided a mechanism to retrieve deleted files. ”

    Wrong!

    “Would it have been so hard to provide a special directory where the system would move files after they were ‘deleted’ and where the system would only truly remove them when it needed the memory? Both GNOME and KDE provide a ‘garbage can’, but it only works with programs specifically designed to utilize it.”

    … and that exactly is the solution. If you are unhappy with the way rm works, define an alias or a shell function or even a script that works the way you want (see the sketch below). The opposite way is much more difficult, if not impossible. Thus rm is not a misdesign at all.
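
    For instance, a minimal sketch of such a shell function (the name and the trash location are mine for illustration; a real version would also handle name collisions):

        # Move files into a trash directory instead of unlinking them.
        trash() {
            mkdir -p "$HOME/.trash" &&
            mv -- "$@" "$HOME/.trash/"
        }

        # Reclaim the space explicitly, once you are sure.
        alias empty-trash='rm -rf "$HOME/.trash/"*'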

  105. Here’s what I think about Unix and I have the right to have my own opinion.

    It was an operating system written by demonized code writers that Jesus really LOVES!!!

    Too bad no Christian prayed for the writers of Unix out of love. Had at least a single Christian done that, Unix would truly have been awesome.

    I’m sorry world, I’m a Christian and I too have not done what I was supposed to at times. Please forgive me.

  106. kurkosdr wrote: “If you don’t care whether your cellphone [etc] runs a proprietary OS … why do you care when Windows does it?”

    For me, I do care a bit, but there is generally no choice. I do have a choice of avoiding Windows on my PC.

    “What exactly is your gripe with Windows? … you also accuse it of being a “drug”, aka something immoral.”

    Don’t get me started. Not only the technical considerations, but for example the licensing; such as the fact that a legitimate Windows copy I had on a PC stopped working when I replaced a failed motherboard, apparently because MS assume I must be a pirate. Like all my furniture falls apart if I replace the carpet. It is not just the inconvenience and cost, it is the insulting, patronising attitude MS have. Yet UHH and some here complain about the *nix “attitude”!

    “The anti-MS crowd has … to give Windows a break, and stop calling it a ‘drug’/immoral.”

    Windows is inanimate, but Microsoft are indeed immoral, led until recently by two obnoxious men – Gates and Ballmer. For example, Microsoft’s well-documented underhand ways of maintaining their monopoly, and their corruption of the ISO standards process by stuffing national standards committees with their partners (we had not realised this was so easy to do) to steamroll their OOXML document “standard” through, if you can call a 6000-page document that includes requirements like (paraphrasing) “do things the way we do” a standard at all.

    “Till then, the general attitude of the anti-MS crowd is […] ‘someone (Ballmer) is getting rich and famous and that someone isn’t me’. Aka pure fanboyism and jealousy.”

    As Google would say, Did you mean Anti-fanboyism? Fanboyism is more a Mac and Windows speciality, with people believing that Jobs was a living god, and that Gates invented the computer and has now become a saint with his charity.

    But I wouldn’t want Ballmer’s fame, which is for being a buffoon, nor Gates’, which is for being a bad-tempered jerk who, like Nobel, Carnegie and Rockefeller before him (utter bastards who started to worry how history would remember them), gives his unspendable wealth away to try to compensate for his massive negative karma. I am not very interested in money as it happens; it is not the most important thing in life (as this is esr’s site I don’t mind saying that sex is), although it comes in useful. As it happens, I am a millionaire, not uncommon these days but enough for what I want to do.

  107. I have used both MSDN and man pages. At least for my purposes the latter is much better. The qualities in man pages are:
    1. man pages are more concise
    2. If I want details on an obscure corner case then most man pages will cover it.
    3. The man pages document what is actually implemented

    man pages are reference documentation and not hand holding, so criticizing them for not being hand holding is not valid. MSDN is long on hand holding and often short on the details I actually need. If you need unix hand holding then I believe there are books, websites and companies which offer it.

    That said, I have a PhD in theoretical computer science, so I might have more sensitivity than most to imprecisely split hairs and to documentation which fails to cover fine details. I also have far too much experience of Microsoft software not conforming to the documentation, which I have not found on un*x/Linux. If something goes wrong, the MS documentation is almost always useless.

    I actually prefer LaTeX, and sometimes LyX, to MS Word. This might be due to writing a PhD thesis with a bibliography and equations, which is much easier using LaTeX (a sketch below). I could probably now afford Microsoft, but I don’t want to downgrade to C compilers which don’t support C99, which everything except MS VC++ at least almost completely supports.
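
    A minimal sketch of what that looks like on the LaTeX side (the file name refs.bib and the citation key are illustrative only):

        \documentclass{article}
        \begin{document}
        Einstein \cite{einstein1905} famously derived $E = mc^2$.
        \bibliographystyle{plain}
        \bibliography{refs}   % entries live in refs.bib; run bibtex to resolve
        \end{document}

    Numbering, cross-references and the bibliography are all resolved mechanically, which is exactly the drudgery Word makes you manage by hand.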

    I am not against proprietary software in principle, but Windows has very little to offer in terms of the things I want to do. I use Linux because it is a better solution to my problems than anything Microsoft or Apple can sell me. I don’t actually care about sound.

  108. I just started writing a review of The Unix Hater’s Handbook. Because we needed an almost-25-years-out perspective.

    Although there is a lot I do not agree with ESR about, we’re pretty close on this one.

    I do find it amusing that Unix (including, of course, Linux as effectively Unix) *won* everything but the corporate desktop. Anything in the cloud, anything in a data center? Probably Unix, unless it’s an Exchange or AD server.

    Any smartphone? Almost certainly iOS or Android.

    I’ve been paying special attention to the “but other things are better” claims. I can now install TOPS-20, ITS, and Multics on emulators (emulators running on Linux, to be clear).

    TOPS-20 seems to have its charm, and I haven’t gotten through a Multics installation yet. But ITS? Sure, I guess I get that it was a hacker’s paradise, but comparing DDT to even Version 6 /bin/sh? Are you high?

    I probably would like a Domain/OS or Symbolics LISP Machine better, but their emulators are still sort of hard to get going, even in 2018.

  109. @Dr Zhivago:

    “a bad-tempered jerk who, like Nobel, Carnegie and Rockefeller before him (utter bastards who started to worry how history would remember them)”

    I know it’s three years later, but I’ve gotta correct the record:

    Nobel is famous for creating civilian explosives safer than the comparably powerful explosives of the period. (As a 19th century mine owner, your main choices were things like black powder, which were safe but weak, or things more like nitroglycerine, which were strong, but risked maiming or killing your poor immigrant workers by going off prematurely… and it’s not as if OSHA existed at the time.)

    He created the peace prize after reading a premature obituary which mistakenly villainized him for inventing military explosives.

    You’re repeating the mistake that prompted the creation of the peace prize.
