Linux-Hater’s Blog, considered

One of the advantages of having helped found the open-source movement, and the one I cherish most, is that nobody can criticize me when I criticize it. I’m a gadfly by nature, disgusted by cant even (actually, especially!) when it’s my own insights being reflected back at me as dogma. Anyone who actually does that is likely to flip me into full Discordian rascal-guru mode.

So I was actually pleased to learn of the existence of Linux-Hater’s Blog. I rather looked forward to winnowing through it for nuggets with which I could shock the more fanboyish members of my community by agreeing. Alas: when I finally went there with intent to read, I discovered that the never-actually-identified author of the blog had ended the project. I read the entire archives anyway.

A lot of it is just off-target flamage. The very first substantive entry, for example, is a flame about copy-paste behavior that applies to all Unixes running X, not just Linux and not just open-source systems. Linux-Hating Blogger’s bile is further undercut when the discussion of standards he links to includes a reasoned (and, I think, correct) decision to make Linux implementations behave more the way Macs and Windows do.

I’m also going to just ignore entries at the lameness level of Linux won’t get you hot chicks (which is to my certain knowledge untrue) and Linux sucks.. for watching Porn. That knocks out, oh, at least 60% of the content. But then there’s this, in Use Linux to lower your customer’s expectations:

You know, cuz it’s totally acceptable to ship a busted battery meter,
or something that you have to type some crazy hexadecimal key in every
time you want to get on the interweb. It ships Linux, so we can
forgive it, right? Fuck no.

Fuck no, indeed. The point of open source is supposed to be better software quality; LHB is quite right that we ought to give vendors who ship slovenly builds a sound kicking.

And then there’s this: You don’t pay me, so I don’t care what you want. LHB is right; a lot of open source is developed by developers for developers and underweights – or completely fails to connect with – the needs of actual users. In fact, the situation is actually worse than LHB describes; his belief that “When you’re small, you’ll do a bunch of stuff to try to get more users” is, generally, false. Small open-source projects aren’t normally focused on getting more users at all; usually, they happen because some hacker thought a particular program would be fun or useful to write, and whatever number of users show up in his in-box is fine with him.

It’s no bad thing to have LHB remind us that inattention to end-users’ needs is a serious problem; it’s a point I’ve made in public more than once myself. Nor is he wrong to point out that formal project management can’t actually solve this; the developers themselves have to care. I actually like his last line: “And besides, open source projects already have product management. It’s called a bug tracker.” Spoken in jest or snarkiness, perhaps, but they really do function that way.

So, is there a solution? Interestingly, LHB is too smart to actually commit himself to the position that monetary incentives can make developers care; one suspects he’s been a programmer at a closed-source shop, and knows exactly how often the whole self-congratulatory apparat of paid managers and marketing departments produces botches just as awful as, if not worse than, development-by-geeks-for-geeks.

In Release cycles are for lusers, LHB actually manages to say something useful and constructive and even admit to good reasons for respecting Mark Shuttleworth. Zounds! You could almost think LHB were a secret Linux fan!

I got remarkably far into the archives before I found something that I disagreed with at a more fundamental level than “Dude, you’re just flaming.” It was here: Good Software isn’t really free. LHB writes:

Projects like the kernel and firefox are exceptions in a sea full of
shitty projects. They are how open source projects should be
run. They’ve figured out how to create value that people will pay
for. They have paid people working on them, producing valuable code,
solving real problems, and are usually shipped in usable, tested ways.

Rarely, and I mean rarely (i.e. hard enough to find that it’s not
worth trying out 3000 different apt-get installs for programs that do
the same thing), you find a project that has a really good developer
writing really good code, but it’s not backed by a sustainable
model.

But LHB is wrong. Bearing in mind Sturgeon’s Law (“90% of everything is crap”), finding projects that produce quality code without what LHB thinks of as a “sustainable model” isn’t actually hard. One effective way to filter out the real crap is to ignore projects that aren’t packaged by a major distro. There are an awful lot of projects good enough for (say) Ubuntu to feel it can package and ship without jeopardizing its reputation; of these, only a vanishingly small percentage have paid developers.

Within that set, there will as usual be a power-law distribution of quality. There are ways to make sure you’re at the good end. One I find pretty reliable is to look at the number of people in the credits or authors file; more is better, and I don’t think I’ve ever encountered a real dog with more than a half-dozen names on it. What you’re doing here is evaluating a proxy for the number of people who found the code sound enough to invest time in improving it.
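
Here is a minimal sketch of that heuristic as a quick shell check, assuming the project ships a conventional AUTHORS or CREDITS file; the filenames and the half-dozen threshold are my own illustration, not anything the distros prescribe:

    # Rough proxy for contributor count: name-bearing lines in a credits file.
    # A sniff-test only, not a guarantee of quality in either direction.
    for f in AUTHORS CREDITS; do
        if [ -f "$f" ]; then
            n=$(grep -c '[[:alpha:]]' "$f")
            echo "$f lists roughly $n contributors"
            if [ "$n" -gt 6 ]; then
                echo "passes the more-than-half-a-dozen test"
            fi
        fi
    done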

You can actually apply more filters than this: does it have a website that has recently been updated? Is there documentation that looks useful at first glance? The point is that however many of these you apply, however high you set the quality bar, you’re still not going to get to where any but a tiny fraction of what’s left has what LHB thinks is a “sustainable model”.

And…oops…LHB says: “The vast majority of the rest is crap. Not too unlike commercial software you say? No shit.” That’s right; he actually knows, when he thinks about it, that “sustainable model” doesn’t really do sweet fuck-all for your error statistics, and the power-law quality distribution applies to all software, on and off Linux and whether it’s open or closed.

Beneath the profanity and the flamage, LHB actually has a clue. When he allows himself one, that is — something the blog’s stated mission often prevents.

Occasionally he’s dead on target. As in The Registry is dead…long live…the registry!? Tell it, brother! This old Unix hand thinks gconf is indeed a botch, a frightening piece of overengineering. Give me a $HOME full of old-school dotfiles any day; they’re far easier to read and modify without fearing that a change to one thing will break everything.
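
To make the contrast concrete, here is what the difference looks like in practice; the dotfile name and the gconf key below are just examples I picked, not anything either system mandates:

    # Old-school dotfile: one plain-text file you can read, grep, and edit directly.
    grep -i focus ~/.fvwmrc

    # gconf: the equivalent setting lives in a tree of XML fragments under ~/.gconf,
    # reached through a tool rather than a text editor.
    gconftool-2 --get /apps/metacity/general/focus_mode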

How to Write a KDE Application is funny, and seems disturbingly true. To be fair, so does How to write a Gnome Application. Well, except for the cloning part; there are plenty of original apps using both toolkits. But otherwise…I laughed. I winced. Then I laughed again.

How to Create a Linux distro is not quite as good, though the last item (“Write tons of documentation on complicated procedures to make things work, instead of making things work.”) has a bit of sting in it. These three satires probably represent the high point of LHB’s oeuvre; any Linux fan who doesn’t wince and take at least one lesson from them definitely needs to get out more.

Catastraphont is kind of interesting. You have to ignore the first paragraph, which was easy for me since LHB’s pages don’t in fact render in DejaVu Sans on my Linux box. His description of the layers of historical cruft around X fonts is pretty accurate. It’s also shared with almost every modern Unix, including the closed-source ones. (Yes, Mac OS X is an exception because Apple obsesses about these things.)

Perhaps this should have been Unix-Hater’s Blog; LHB admits at one point that he’s emulating the style of the Unix-Hater’s Handbook. But then he wouldn’t get to throw around cute epithets like “freetard”. Like the Handbook, too much of LHB reads like bile looking for an excuse.

That affects LHB’s prose style, too. There is a certain entertainment value in phrasings like “more tangled than Paris Hilton’s semen-encrusted hair after her cameo in a Brazilian vomit porn tape”, but if that’s the only note you hit in your writing…you could be more effective. And LHB is in fact much more effective when he forgets to cop his attitude and writes something like this:

Y’all seem to not realize that most people don’t google for answers to computer issues in the first place. To these people, it either works or it doesn’t. If nothing happens when they plug their camera into their computer, they assume their computer just doesn’t work with their camera. Or they call up their lame-ass grandson who installed some weird thing called youbuntube on their computer. They don’t give a flying fuck if some forum user gph0t04ever on gphoto-rulez.org has a 10-step procedure that will make it work.

Besides, to actually use google effectively, you already have to 1) kinda know what you’re talking about, 2) know what keywords to use, and 3) know how to use the results to fix your problem. When’s the last time that someone typed “my screen looks big” into google, and got to your newbie-proof instructions of how to replace the “nv” in your xorg.conf with “nvidia”? Oh, that’s right. Never.

This is a worthwhile reality check. Or, as I sometimes put it, “Documentation is an admission of UI design failure.” For most users, procedures that need to be documented might as well not exist.
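
For anyone who has never seen the procedure LHB is mocking, it boils down to hand-editing a stanza like this one (a sketch only; real xorg.conf Device sections vary by machine and driver):

    Section "Device"
        Identifier "Card0"
        Driver     "nvidia"    # was "nv"; this one word is the entire fix
    EndSection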

But he continues to be really uneven. Take his Stupidity Formula, for example: even if you buy the notion that stupidity increases with the number of developers, agency and communication problems certainly mean it doesn’t decrease reliably with the amount of money thrown at the problem. There needs to be a multiplier proportionate to the square of the funding organization’s size in there.

Then there’s Feel the Source, where LHB, apparently seriously, proposes that upstream Linux projects should ship production binaries. That is, rather than shipping tarballs and letting packagers and distro builders make the binaries.

If I were writing in LHB’s style, I’d be sputtering scornful profanity right about now. Yeah, like every open-source project can have a build farm in its basement, with servers for every possible arcane combination of hardware, distro, and release level. The concept is just nuts. We’ve evolved a three-tier system (upstream projects to packagers to distro repositories) for excellent reasons; it’s the minimal-complexity adaptation to our deployment issues. This is probably the most foolish thing LHB wrote, if we’re leaving out the pure Beavis-and-Butthead flamage.

Sometimes LHB just seems confused. In my experience, when a Linux user or advocate says “Linux gives me choices”, it actually means “My choices aren’t dictated by a single-vendor monopoly or a locked proprietary data format.” But, in The fallacy of choice LHB argues as though Linux advocates actually relish having lots of competing choices for each application niche as a virtue in itself.

This is an odd position that seems not to actually match observed behavior; we don’t normally see people building competitors to an established program unless there are specific reasons to do it. So, for example, nobody seems to be trying to build direct competitors to the GIMP, but we do have Scribus and Inkscape that work from different imaging models.

It’s too bad LHB goes down this garden path, because he might have had some properly trenchant things to say about (for example) the GNOME/KDE split. That had a reason, but a case could be made that it was a bogus one.

And after that post, LHB gradually runs out of momentum. There’s one last and mildly good rant at Pulse my audio; sound has never been broken for me, but the plethora of Linux sound APIs and servers is undoubtedly a mess for people with more complex requirements. Once again, though, it seems a little off to blame this on “freetard” attitudes; really, it sounds to me like the mess was more due to design problems that were intrinsically difficult to get right without a couple of (software) generations of experience.

This is how it ends:

So in true open source fashion, as the maintainer of this project, I am going to arbitrarily drop off the face of this earth for purely selfish reasons, and leave the entire cause in limbo. That is how open source projects truly die. But hey, all the material is out there for y’all to see (it’s “open source” in it’s own way), so maybe someone else will take up the cause. Carry on, lusers!

That kind of embodies all of LHB’s contradictions right there – trashing open source in one breath, expressing a sort of stifled backhand respect for it in the next. As though even he, the Linux hater, can’t stand aside from what Linux has taught him.

140 comments

  1. I think the point LHB tried to make in “Feel the Source” is that it would be better to fix the problems with binary distribution than to continue with the current model. The problem with downstream packaging is that downstream sometimes screws up the code through their ignorance. The Debian OpenSSL debacle is a serious example of this situation. You would not have to maintain a build system; when I used Gentoo, I built everything on my desktop, so I know it can be done on the cheap. As for supporting every esoteric combination of hardware and software, why do you need to do that? esr, you are an old-time Unix user. Is your desktop a Sparcstation? How about an AlphaStation? An SGI Octane? A Vax? Do you Slackware? NetBSD? Minix? Who uses these systems nowadays for anything more than dicking around? Yes, binary distribution would not work well in the embedded systems world, but that sphere is primarily inhabited by paid developers who do not fear compilation. If you could distribute an x86/AMD64 binary that is compatible with Ubuntu, Debian, RedHat & Suse, you could probably satisfy 99% of the Linux Desktop market.

  2. >I think the point LHB tried to make in “Feel the Source” is that it would be better to fix the problems with binary distribution than to continue with the current model

    I still think that’s his most serious error. I actually used to build my own RPMs for distribution; I moved away from that because even within that one package format there’s enough variation in where various system directories are placed to be a problem. Possibly LSB will solve this some year, but it hasn’t yet.

  3. The lesson of LHB is that it’s easier to complain about Linux and open source than to actually fix it. I agree with the author’s claim that most Linux projects are “by programmers, for programmers.” (Fortunately, Ubuntu seems to be remedying this.)

    As for hardware issues, this is attributable to the wide variety of components interacting with each other in unexpected ways. The Windows Vista fiasco shows that this is not limited to Linux.

    What I would like to see would be an informative, snide, warts-and-all guide to Linux and open source, like an updated version of Life With Unix.

  4. >The lesson of LHB is that it’s easier to complain about Linux and open source than to actually fix it.

    True. I certainly agree with LHB on this much, though: snarling “Fix it yourself!” at people with legitimate complaints is not appropriate. When your users tell you that your usability sucks, you’ll by Ghod fix it if you have any pride in your craft.

  5. If I remember correctly, LHB was/is a developer for embedded systems using Linux, so his rants came from direct hands-on experience working with some of the details. LHB-Redux is still new and developing his abilities and rants – I’m sure that you didn’t start out with well-written, thoughtful and insightful clues, much less relevant clues. I mean, you still despise proprietary software because of horrible experiences many, many years ago – it’s like saying “I had a really bad experience at this Greek eatery, so I will never have anything to do with Greek food, no matter the place or amount of time that’s passed.” And you continue to trash it based not on first-hand experience but, seemingly*, primarily on word of mouth (strong dislike for Vista because you’re not used to using it and had trouble with your wife’s laptop – similar to people having issues with any operating system/software they’re not used to).

    “Yeah, like every open-source project can have a build farm in its basement, with servers for every possible arcane combination of hardware, distro, and release level. ”

    That’s also one of his complaints, that there have to be so many different compilations because of the different distros and all the choice. It can lead to a moving target for developers. Or stuff that breaks because x updated a module, while y still depended on the old version, and x doesn’t resolve dependencies well, if at all. Yes that can happen with any OS/software, but there can be a better coordination between vendors and the base they’re aiming at.

    Open source at times can be too much like herding cats; proprietary is still herding them, but at least it’s in a specific and walled area.

    Not that Open source is horrible, but it has many problems that show up more readily because it is open, and that can be bad PR. And a community that can be difficult to work with at times doesn’t help.

    * I say seemingly based off of the posts that I’ve read, but I admit that I could be way off base due to a lack of more in-depth knowledge of you and your history.

    I actually used to build my own RPMs for distribution; I moved away from that because even within that one package format there’s enough variation in where various system directories are placed to be a problem. Possibly LSB will solve this some year, but it hasn’t yet.

    Yes, this is the major problem, and until someone fixes it, Linux application support will suffer. I am uncertain whether it will ever be fixed. The FHS has existed for 15 years. The LSB has existed for 8. This is aeons in the computer industry, but the problems are still there.

    I know binary distribution is possible. I have seen id Software, Skype and others do it. If it requires static binaries, so be it. RAM and disk space are cheap nowadays. My grandmother’s cheap Toshiba laptop had 2GB of RAM & a 400GB hard drive.

    I know it is harder to update all one’s apps when a library has a bug or flaw, but this is already the case with the proprietary stuff listed above, and if the apps are being competently maintained, updates should be released shortly.

  7. C’mon, Psychology 101: if he really hated Linux he would not have spent so much time learning so many aspects of it so well. He is a lover who practices “tough love”, nothing else. I really dislike romantic comedies; the reason you don’t see me saying a word about them is that I haven’t seen one in at least 10 years, so what could I say? If I poured insightful and accurate bile onto rom-coms, you should suspect that there is still something that attracts me to them, that I am a lover who feels cheated, not a hater.

  8. Linux Hater’s Redux is Pete Bessman?

    What’s your evidence for this? If true, I must say I’m amused. I wish he were still hanging out here.

  9. The biggest point I took away from LHB was the moving target syndrome.

    Linux fails to be a platform.

    Every distro has a different file structure, different config files, different supported libraries, etc – and these differences are there for no good reason (as LHB so emphatically points out). So an ISV, or an open source upstream project, can’t easily target Linux – it has to target Red Hat, Fedora, openSUSE, Ubuntu, Debian, Slackware, Mandriva, PCLinuxOS, Gentoo … and on and on and on. Thus, upstream has to pass off tgz source packages, then packagers have to package them, then distros have to put them into repos.

    It works okay for regular supported stuff. It’s nice to open Synaptic, and essentially go shopping (for free), and nearly instantly have something installed.

    The problem lies in a couple of things:

    1. If I want to get the latest version of, say, Banshee, or Amarok, or AbiWord, or something else, I have to wait for the distro to package it, and then do a full system upgrade (save for a sprinkling of compatible backports out there). That’s ludicrous. In Windows, if I want the latest package, I simply download the setup exe or msi, and I’m off and running.

    2. Huge, massive duplication, triplication, quadrication – if that’s a word ;-) of effort. Essentially, we have armies of distro packagers doing the exact same thing, over and over and over. Wouldn’t it be nice to have one big-ass generic repository, for common user/desktop apps, that all distros could pull from, and then each distro simply maintain their own system level packages??? Then all those packagers doing the same thing can point their efforts into improving their actual distros, or various upstream projects. Also, ISVs could deploy to the generic repository.

    That’s the main take away I get from LHB.

    Perhaps LSB will some day come to fruition.

  10. I think Ubuntu is becoming a semi-official standard, which might solve this problem. If I wanted to develop a cross-platform FLOSS app I’d just make sure it works well on Ubuntu and leave it to the maintainers of the other distros to make it work on theirs.

  11. Eric:

    I rescind my earlier comment re: Pete Bessman. I mistakenly thought it was him, but it turns out he is not LHR, though he did contribute.

  12. > Occasionally he’s dead on target. As in The Registry is dead…long live…the registry!? Tell it, brother! This old Unix hand thinks gconf is indeed a botch, a frightening piece of overengineering. Give me a $HOME full of old-school dotfiles any day; they’re far easier to read and modify without fearing that a change to one thing will break everything.

    To be fair, GConf has one very distinct advantage over Microsoft’s registry implementation in Windows: the worst possible mutilation that can happen to GConf still doesn’t require a complete re-install of the operating system, or even the removal of your $HOME directory as a whole. Sure, you’d have to delete $HOME/.gconf (or whatever it was called), but at the very least your data is safe… and you even save some time.

    Now don’t get me wrong, GConf is still a horrible abomination that should have never existed, and I also much prefer a whole bunch of dotfiles and dotdirectories for my applications (I don’t see why that’s a bad thing anyway, most file listers will hide such files by default, so you’d only really see them when you *need* or *want* to see them).

  13. I always hear the argument that Linux has many applications that do the same thing. Just think about this: why do so many languages exist in the world? They all serve one purpose – communicating and socializing with people. But then, they have their own strengths and weaknesses. The same applies to Linux, and to open-source projects in general. As they say, the more the merrier.

  14. No, I am not Pete Bessman. I read the platform rant a long time ago and found it appropriate to the blog. When I got into contact with Pete Bessman, I asked him if I could post the rant, and he agreed.

    BTW, I have posted a reply to this topic.

    Eric says: Yes, and I have replied on your blog. Thank you for your civility; I don’t think LHB would have been so constructive.

  15. JeffS,
    1. If I want to get the latest version of, say, Banshee, or Amarok, or AbiWord, or something else, I have to wait for the distro to package it, and then do a full system upgrade (save for a sprinkling of compatible backports out there).

    Not necessarily true. I know that with Fedora you don’t have to do a “full system upgrade” to get a new version of an application. For an Amarok update (for example) do
    yum update amarok
    …if the only thing you want to update is Amarok (and, of course, things on which it depends).

    You say the only way to update an application is to do a full system upgrade… Where and when was that the case?

  16. But, in The fallacy of choice LHB argues as though Linux advocates actually relish having lots of competing choices for each applications niche as a virtue in itself.

    This is an odd position that seems not to actually match observed behavior; we don’t normally see people building competitors to an established program unless there are specific reasons to do it.

    Might that partly be a matter of perspective? Is it possible that LHB was thinking of the perceived habits of Linux users (not programmers)? Typical Windows thinking is to frame everything in terms of the user and/or desktop and it’s possible LHB was thinking along those lines. (Once someone learns that bad habit it’s very hard to break them of it.)

    I think most people will make LHB’s mistake when they look at any group of Linux users. In our local LUG there are many people who favor different distributions and different tools for some of the same purposes. From a distance we look like we’re behaving randomly.

    As individuals we’re all doing the same thing: Whatever I want with whatever tools I prefer.

  17. Fuzzy Conner: Might that partly be a matter of perspective? Is it possible that LHB was thinking of the perceived habits of Linux users (not programmers)? Typical Windows thinking is to frame everything in terms of the user and/or desktop and it’s possible LHB was thinking along those lines. (Once someone learns that bad habit it’s very hard to break them of it.)

    How exactly is that a bad habit? Do you think using the mouse is akin to drug use? People use GUIs because they like them more than command-line interfaces. With GUIs, you have the entire interface laid out for you, and using the mouse is faster than using the keyboard.

  18. Now that more open source projects are being done in interpreted/bytecode languages, would that moot the need for massive build farms? Would a distro that favored components written in scripting languages work better across different configurations?

  19. >Would a distro that favored components written in scripting languages work better across different configurations?

    Maybe. Wouldn’t solve the problem for the huge existing base of apps and libraries in C, though.

  20. >How exactly is that a bad habit?

    I’ll leave you two to have this argument, but: in case it wasn’t clear, LHR, you are welcome here.

  21. There is a line in an old Dirty Harry flick that went something like — “Man has to know his limitations.”

    The same could be said for a lot of users. There are many who should just never ramble off the Windows Reservation. It suits their lack of curiosity. I don’t mean that as a derogatory comment, just that there are many people who just want to get on with it and leave the ‘what if’ to what brand of beer they have in the refrigerator.

    Nor do I deride MS Windows itself, though Vista was pretty lame for a company that knows better than this. Microsoft has quite honestly taken the one-size-fits-all model farther than any company that I am aware of. It’s a commendable achievement in its own right, even if it pushes the limit on unsuitability to purpose from time to time.

    The whole Linux/FOSS effort is one huge experiment. There are bound to be failures. But what matters is not that an individual program in the herd survives, but that the herd moves forward and fosters other programs into existence.

    Finally, I would hazard that the Linux-based Windows haters have a better grasp of their disdain than the reverse. Goes back to the brand-of-beer paradigm.

  22. “framing everything in terms of the user” is precisely the correct behavior. Sadly, not even Windows does it to the extent that it should. I cite again the example of the Amiga, which serviced interrupts corresponding to user input above all other interrupts. This behavior was unique among operating systems that supported preemptive multitasking. The rationale being that if the user expresses a desire it should be carried out immediately. As a result, a computer based on a single seven-megahertz 68k CPU feels snappier and more responsive than a multi-gigahertz, dual-core machine from today. (The Windows version of the Amiga emulator UAE even sets itself to the highest possible priority level, so that it can provide a lightning-fast response to user input that Amiga users were used to.)

    The fact that Linux doesn’t put the user first, not just on a philosophical or communitarian level but on a down-to-the-bare-metal, engineering level is why it is made of fail.

    N.B.: I have a colleague who worked at Commodore during the late 80s. The Amiga Kool-Aid loses effect very slowly, and is somewhat contagious. There are reasons why an “also-ran” in Eric’s terminology can inspire such passion.

    Nor do I deride MS Windows itself, though Vista was pretty lame for a company that knows better than this. Microsoft has quite honestly taken the one-size-fits-all model farther than any company that I am aware of. It’s a commendable achievement in its own right, even if it pushes the limit on unsuitability to purpose from time to time.

    Vista FUD needs to die the death. Vista was arguably Microsoft’s best release, though I posit that the OS expends too many resources attempting to fix problems that are entirely the creation of Microsoft’s engineering culture. If you:

    * get rid of all the cruft that was installed on your PC by the manufacturer
    * disable unnecessary random bullshit like ReadyBoost and whatever it is that gobbles up 60% of your system RAM caching .EXEs in memory

    you will have a Windows system that is more secure and stable and looks less like ass than Windows XP. There’s a new driver model which breaks existing XP drivers, but there’s one of those for Linux every few months or so.

  24. Please Mr Linux guy, please help the end users. We have tried FOSS from Linux to GIMP and we were burned and are now very disillusioned.

    The UI problem is unbelievably bad. Programmers will openly insult professionals who, after using a dozen other successful UIs in the past, cannot get their heads around the programmer-think that is the UI for so many projects. Linux is horrible in many ways, but the king of UI failures has to be The GIMP. Not one professional artist is being consulted on that project. It is trying to reproduce Photoshop. Photoshop is not designed as an art tool at all. It is a photographer’s digital darkroom and editing platform. Artists use Photoshop by learning workarounds, or because they are just unaware of better software. You cannot make a successful art program if your process is to copy a photographers program. So even if GIMP gets it right, in the end, they still get it wrong. I personally know dozens of professional artists who have tried explaining this to the GIMP programmers, and have all faced the most rude responses one could imagine because the programming team are so convinced they know what is best for artists.

    I have worked on a demonstration UI that I have user tested with art students and art professionals. If you want to help artists, and in turn help The GIMP, please email me and I can send you a finished demonstration (well as far as a non programmer can).

    Linux does have other problems that some of the old hands are concerned about. The Macro Kernal… The OS is impossibly large now and keeps growing larger. Really, I hate all three of the “popular Operating Systems” and I am not alone. The OS will never load from HDD in 10 seconds if the code doubles in size every 18 months. A good OS needs to be small for speed and stability. Now there may be a way to get this if you remove packages from the Kernal and recompile, but who other than a programmer will ever do that? I am an old hand, but I am no programmer… and the freedom programmers claim to get from Linux is only a dream to me. To expect every user to put their lives on hold for years to become programmers is not reasonable, but I expect you have already realised that yourself. I now point to Workbench 3.1 as my ideal OS. Fast, efficient, easy to customise (customising the startup sequence was easy, even for me) using text editor or graphical settings tools, and it was fun to use. It may have lost much of the humour of Workbench 1, but it was a lot of fun. Why can’t Linux be small tight and fun for everyone? Copy Workbench, all the patents are dead, and Apple has been doing it now for a decade. A quick note. Linux has just had a new RAM to HDD bootloader announced. We had “RAM image to HDD” boot loaders on the Amiga in ’94… it works, but only for the seriously geeky. The end user does not like making a new RAM image every time an OS setting is changed, or files are moved in specific areas. So fast load times can only be achieved with a tiny OS, AND/OR the usage of a ROM that contains most of your OS.

    To make Linux small, I would be happy to buy specified parts so the Kernal could drop support for hardware only a few people will have. I have seen too many Linux advocates insisting that a half finished driver for every piece of hardware ever made is the way to go. It is not necessary. Supporting only 2 graphics cards, but supporting them 100% means that most end users will get a better experience. These users buy pre made computers. Why would any “Linux Box” manufacturer ever use a 50% compatible card when a 100% compatible card will sell for a similar price?

    Oh, and in the current tech community, it has been forgotten that the Amiga, Atari ST, Acorn Archimedes and others did not just have cheap hardware. On all of these platforms there was professional graphics software, and a fair amount of other software, but my experience is with graphics/animation/video, and this software cost hundreds of dollars. Possibly because of price gouging, Apple and Microsoft survived while these other platforms fell. Adobe did the same thing with software. They introduced graphic software that costs many thousands of dollars. I may be babbling, but I was happy to buy three graphics titles for $1000, and I would again. Linux does not need to be FOSS or die. Linux also does not need to have software companies selling software for $3000. I purchased professional software for as little as $100 (actually Photogenics 2 and Disney Animation Studio) on the Amiga. A war against commercial software by a group of Linux users seems self-destructive.

    Maybe I do not know as much as the Linux clique, but to me, these seem like no brainers.

  25. >There’s a new driver model which breaks existing XP drivers, but there’s one of those for Linux every few months or so.

    And it hardly matters, because most drivers are maintained in-tree.

  26. >You cannot make a successful art program if your process is to copy a photographers program.

    I’ve done successful art (including collaborating on the cover of The Art of Unix Programming and various graphics for Battle For Wesnoth) with the GIMP, so it’s successful for me. I can readily believe that the GIMP team is rude to artists, but I think this is partly because if you ask N artists for UI advice you tend to get at least N+1 wildly diverging designs. Have you looked into Inkscape?

    >Now there may be a way to get this if you remove packages from the Kernal and recompile, but who other than a programmer will ever do that?

    You don’t have to. On most distributions the drivers are built as modules that load on demand, so you don’t end up paying for what you don’t use.

    >I have worked on a demonstration UI that I have user tested with art students and art professionals. If you want to help artists, and in turn help The GIMP, please email me and I can send you a finished demonstration (well as far as a non programmer can).

    I’d like to see that.

  27. Phil, yes; it’s defined in the Jargon File as “Amiga Persecution Complex”. There’s also a very strong “myth of return” dearly clung to among the Amigan Diaspora — the belief that one day the Amiga will come again in glory to judge the living and dead platforms. The AROS project has apparently been starved for developers because of this: if the reborn Amiga is strangled in its infancy by the availability of open-source clones running on x86 econoboxes, it would be a great loss.

    And it hardly matters, because most drivers are maintained in-tree.

    Most drivers are maintained in-tree because there isn’t a standard driver API to code against.

  28. > There’s also a very strong “myth of return” dearly clung to among the Amigan Diaspora — the belief that one day the Amiga will come again in glory to judge the living and dead platforms. The AROS project has apparently been starved for developers because of this: if the reborn Amiga is strangled in its infancy by the availability of open-source clones running on x86 econoboxes, it would be a great loss.

    That’s just preposterous. If developers like Amiga and think it’s worth developing AROS, by all means they should. Holding back an open source project in the hopes that Commodore will return is just silly, on many levels.

    > Most drivers are maintained in-tree because there isn’t a standard driver API to code against.

    Most drivers are maintained in-tree because it’s the best place to put them, and it can be mostly guaranteed that the driver won’t break on the next version of the kernel, because the driver will also be updated (and quite possibly, more stable, less buggy, etc).

  29. LHR
    How exactly is that a bad habit? Do you think using the mouse is akin to drug use? People use GUIs because they like them more than command-line interfaces. With GUIs, you have the entire interface laid out for you, and using the mouse is faster than using the keyboard.

    No, I don’t think it’s akin to drug use. Yes, that gave me a good laugh. :) Sure, most people love GUIs. Sure, the mouse is faster.

    My point was that Eric, being a programmer, was talking about programmers, in his words people building competitors to an established program. LHB was talking like a user about things users use. (By the way, I never said that’s the only thing LHB contributed or that every LHB assertion came from that place.)

    To give some context to the next two paragraphs, I should mention that my day job is teaching people and often includes teaching people How Things Work.

    Yes, I consider thinking no deeper than the desktop to be a Bad Habit and most people do it by default. “Bad” because thinking of “everything in terms of the user and/or desktop” overlooks the fact that a GUI is not an operating system. “Habit” because once someone thinks their desktop is their computer it’s almost impossible to get people away from that incomplete idea.

    LHB gave lots of criticism. Open source needs good constructive criticism; that’s one path by which it improves. To clarify and understand good criticism, it’s sometimes useful to identify whence it comes.

    And even while I know you hate Linux I say “Thanks for helping.”

  30. That’s just preposterous. If developers like Amiga and think it’s worth developing AROS, by all means they should. Holding back an open source project in the hopes that Commodore will return is just silly, on many levels.

    You’re right; however:

    * AROS has evolved phenomenally well despite, or maybe even because of, its small dev team.

    * The Amiga community, to my knowledge, hasn’t tried to actively stop the development of AROS; rather, most of them just aren’t interested in it, and former commercial vendors of Amiga software are not interested in porting to the new platform.

    * Remember that the Amiga was hardware and software designed to work together as a functional unit. An Amigaish OS running on generic hardware is profoundly less appealing for reasons which should be obvious to those who remember the beauty of the original.

    * Such an attitude is not surprising when you consider the community it came from, in which software developers are craftsmen and you show your appreciation and respect for their craft by paying them handsomely for their work. In the words of the CLImate nag screen “A program worth using is worth buying.” When you consider that many Amiga users were media professionals — artists, film and television producers, and musicians — even an expensive program was relatively low in price if its function increased your day-to-day productivity. Thus open source is viewed as a corrupting influence, reducing the market value of this craftsman’s work down to zero.

  31. I’ve done successful art (including collaborating on the cover of The Art of Unix Programming and various graphics for Battle For Wesnoth) with the GIMP, so it’s successful for me.

    As have I. But that’s mainly an issue of living with and working around GIMP’s limitations, something which is tolerable for a programmer to do; much less so for an artist. The artistic and graphic-design community has standardized on Adobe Creative Suite because Adobe has made it its business to provide software that works for these people, and there is a huge amount of knowledge of issues like workflow, and how artists do their work, within the company accumulated through years of user-feedback cycles. Such knowledge, if it exists in the fosstards’ world, is very dilute; more likely it is actively filtered out by the attitudes of said developers. There are also licensure issues surrounding standard, basic stuff like PANTONE colors and CMYK color separation. Free software can’t even get in the game because BAWWWWW, software patents are evil. As a result, real artists — professionals who do this stuff day-to-day — consider GIMP and related programs to be difficult to use and laughably unsuitable for production work.

  32. The comments on here make it obvious that there will be no listening to anyone from any corner of the Linux world. No one here is asking for the Amiga to return, only that you examine what made the OS so unique and see if you can copy a long-dead OS to improve Linux, since Linux is currently third in a race of three… and losing by a long shot. Usability is one of the biggest problems Linux has, yet people who suggest, in a far more adult manner than the replies, that a good OS be inspected for ideas get accused of having complexes.

    Hmm, GIMP used as part of a cover by one user. What did the other people in the collaboration use? Did they need to fix up the colour because you could not supply the art in CMYK, ready to be made into separations? Did they need to fix the lossy damage done to your files? When I first found myself working with offset printers, I found that even 100% JPEG leaves invisible artifacting that appears during the printing process. The GIMP does not just have UI flaws, the tools themselves are sorely lacking. I also have seen The GIMP programmers being VERY polite, as long as they are responding publicly to their sycophant fanmail. It takes skill to be rude to someone who idolises you. This does not mean at all that they are not rude to others. Your N+1 argument is also an assumption. Some of the artists have formed into focus groups in our own time and have agreed on UI. When asking for the UI to be looked at, we get the same rude responses.

    Make all the excuses you want, guys. If you keep blaming the messenger, then Linux usage will shrink. There are many who have tried Linux and will never use it again; release it to more people and you will just alienate swathes more computer users. Yes, your numbers go up temporarily, but they crash long term as word of mouth from all the insulted early adopters spreads. Have you ever heard of the business rule “a satisfied customer will tell no one, but a dissatisfied customer will tell 20 others”? You have an army of dissatisfied customers already. Do you believe I would say anything good about a self serving arrogant community who create nothing useful to normal people if my grandmother suddenly asked if she should buy a Linux box? Of course I am not. I care too much for her to let her use Linux in it’s current form, and with the current community.

    Oh, the Amiga’s Workbench was exceptional, but if Linux copied from OS2, BeOS or QNX then I would almost be as happy. These all have superior user experiences to Linux, and to a lesser extent Windows and OSX. The whole point is to improve what you have. For the past 12 years I have just seen Linux play the “Me Too” game with Windows. Linux always trying to catch up and using the Windows interface as the blueprint for a less than perfect Linux copy.

    If you have not noticed that many of your colleagues went out of their way to attack me, insult me, instead of debating valid points, then how do you hope to spot the insulting behavior of others “preaching Linux”?

    Oh, just so you all know, I am not impressed with FOSS. I was releasing my own work as Public Domain before Linus, or any of you started your great crusade. You are not morally superior to everyone you meet, so please stop the attitude.

  33. Oh, Jeff, again, the mistakes. How would programmers feel if people assumed all you wanted was the DirectX 10 developer suite? Aren’t all programmers games programmers? If you were forced to use DirectX 10 you could probably make do, but it would not be easy. The tools are not designed for the jobs you need to put them to.

    Photoshop is NOT the industry standard for artists. It is the industry standard for photographers and compositing jobs, and it is the de facto standard for teenage pirates hoping to be professional artists one day.

    Artists use software like Corel Painter; high-end artists use tools made by Quantel. Animators are often left out of the market now that Deluxe Paint and Personal Paint have died with the Amiga, but you can do half of what you used to be able to do in Toon Boom Studio. There are many other programs that are better suited to artists’ needs than Photoshop, but I do not need to be here for the next hour listing them.

    It is a complete lack of understanding by FOSS programmers of how digital artists of all kinds work that frustrates us. No one asked us what we needed, or what we actually use. The GIMP just assumes we all use Photoshop, and are willing to put up with a horrible attempt at a copy just to get it for free. Just imagine if all FOSS artists refused to make any more orange Ubuntu wallpapers? We know better. Orange is actually the most disliked colour of the spectrum, especially hated by most women. Should we tell the programmers that they get what we give them because we know better? Think about it for a minute.

  34. > Some of the artists have formed into focus groups in our own time and have agreed on UI. When asking for the UI to be looked at, we get the same rude responses.

    You could always ignore the official project if they are so rude, and fork the software to provide the UI your group likes. Surely you can find somebody that has the programming experience to do it, you can even hire a developer if you need to. There’s actually a fork already called Gimpshop that’s supposed to look and feel like Adobe Photoshop; though I’ll provide the disclaimer here that I’ve never used Gimpshop myself (perfectly fine with standard GIMP thank you) and last used Photoshop at version 5 or something…

    BTW: GIMP supports CMYK and lossless file formats. :-)

    > There are many who have tried Linux and will never use it again, release it to more people and you will just alienate swathes more computer users. Yes, your numbers go up temporarily, but they crash long term as word of mouth from all the insulted early adopters spreads. Have you ever heard of the business rule “a satisfied customer will tell no one, but a dissatisfied customer will tell tell 20 others”? You have an army of dissatisfied customers already.

    Where’s the evidence of this? I see none.

    > Do you believe I would say anything good about a self serving arrogant community who create nothing useful to normal people if my grandmother suddenly asked if she should buy a Linux box? Of course I am not. I care too much for her to let her use Linux in it’s current form, and with the current community.

    I don’t believe you would in particular, no, not with that attitude. However, it seems you’re trying to drag the “grandma can’t use Linux because it’s too hard!” argument into it, which has been demonstrated to be nothing more than FUD on several occasions; if I misinterpreted your intentions, I am sorry in advance.

    > For the past 12 years I have just seen Linux play the “Me Too” game with Windows.

    I’ve actually seen the opposite trend. Windows tries to re-implement what the Unix/Linux crowd has already done (be it closed or open source), and poorly. Or they even do blatantly retarded design decisions just to have the appearance of an ability to compete with Unix (for example, Windows 2000 and up integrate the web server into the kernel in order to make its performance more on par with Unix’s user space web servers; several exploits (see Code Red, etc) have used this to gain privileges higher than that of the Administrator account).

    > Oh, just so you all know, I am not impressed with FOSS. I was releasing my own work as Public Domain before Linus, or any of you started your great crusade. You are not morally superior to everyone you meet, so please stop the attitude.

    I don’t think anyone here has actually claimed moral superiority, maybe you posted on the wrong domain?

  35. > The GIMP just assumes we all use Photoshop, and are willing to put up with a horrible attempt at a copy just to get it for free.

    Newsflash: GIMP never was, and is not, a copy of Photoshop. The original developers hadn’t even seen Photoshop until some months later, and the only idea that they actually copied from it was layers.

  36. Should we tell the programmers that they get what we give them because we know better? Think about it for a minute.

    That has been Apple’s attitude and it has actually worked out rather well. Mainly because heh, they do know better.

  37. Animators are often left out of the market now Deluxe Paint and Personal Paint have died with the Amiga, but you can do half of what you used to be able to do on Toon Boom Studio.

    To say nothing of Flash, which is actually used to produce lots of television animation these days. Of course it looks like ass, but that hardly matters in today’s animation ecosystem.

  38. Newsflash: GIMP never was, and is not, a copy of Photoshop.

    Booooy howdy. Photoshop users are the biggest group of people pissed at GIMP’s broken UI.

  39. Oh, I forgot to mention: Amiga software was not cheap because of units shipped. Even DPaint, which was shipped with some computer software bundles, did not exceed a million copies. Photoshop sells many millions of copies with every new upgrade. If numbers equate to price reduction, Photoshop should be worth $80.

    For “smart people” you lot make a lot of wild assumptions.

  40. Oh no. Teenagers are angry and have time on their hands. Artists do not. This is the first time I have mentioned the problems with the GIMP publicly in a year, because I don’t need to waste my life trying to reason with a brick wall. If these teenagers need Photoshop so badly, then maybe GIMP needs to be forked: a serious tool that professionals need, and the Photoshop for pirates who learned what layers were from their online tutorials. I thought that forks were something to be proud of in the Linux community…. So why have there been no programmers willing to create a working fork? Oh, there have been attempts, but they have died after a few technical goals had been reached.

    Is it because there really are no programmers who have the generosity to create software that they cannot directly use themselves? Are FOSS programmers essentially self-serving?

  41. I’ve actually seen the opposite trend. Windows tries to re-implement what the Unix/Linux crowd has already done (be it closed or open source), and poorly. Or they even do blatantly retarded design decisions just to have the appearance of an ability to compete with Unix (for example, Windows 2000 and up integrate the web server into the kernel in order to make its performance more on par with Unix’s user space web servers; several exploits (see Code Red, etc) have used this to gain privileges higher than that of the Administrator account).

    But you’ve just proved Bunny’s point. Whatever the merits of sticking the web server into the kernel, it actually boosted Windows’ web page serving performance well above Unix; to counter this, Linux developers added a web server (well, not really; more like a generalized module that blasts file contents to a socket) to the kernel. Taillights-chasing anyone?

  42. Mike Swanson

    If you really have a point with the developers making their own program, then you imply that they are among the worst programmers in history. Deluxe Paint was working at a professional level within 12 months. Photogenics took about 18 months, if I recall, and Paul Nolan did it all himself; Photoshop 1 was ready in 18 months from what I have read, and only had a few developers; Dogwaffle and its professional version have been developed over many versions by one man in his spare time over the past five years or so. ArtRage was created as a spare-time project by two programmers.

    If The GIMP has not been at the mercy of copying Photoshop, then why has it taken more than a decade and why is it still such crap? Any half decent programmer with creative freedom could outperform the developers of The GIMP.

    Using your argument, and comparing it to all the other free and commercial programs that have needed limited resources, The GIMP has been a failure, and still FOSS supporters hold it up as a shining example of the quality of FOSS. This actually damages the reputation of all FOSS. If GIMP is your poster child, how bad is all the other FOSS? This also calls into question the skill of all other FOSS programmers.

  43. @Bunny: Is it because there really are no programmers who have the generosity to create software that they cannot directly use themselves? Are FOSS programmers essentially self-serving?

    Yes, we are essentially self-serving. So is everyone else; “generosity”, while a real emotion, emerges from a desire to maximize various sorts of selfish secondary gain. You’ve asked the wrong question, and it could only have led to answers that misled you further; you should have asked whether there are programmers who harbor a self-interested desire to create software that meets the needs of non-programmers. The answer is yes, as I know because I am occasionally one of them.

    The problem is not really a lack of “generosity”, it’s a combination of (a) poor communication skills, (b) cluelessness about usability and UI design, and (c) no facilities or tradition of doing usability testing. I’ve railed against these myself, and the community is improving. Slowly.

    And you’re too hard on the GIMP; its designers have created a program that meets real needs, they just don’t happen to be your needs. I’m in the process of learning how to draw maps with it now, and am finding it pleasant to use. You may, if you wish, write this off as a programmer reacting to a UI written by programmers for programmers – the fact remains that they must have gotten something right, or I’d be screaming.

    >You are not morally superior to everyone you meet, so please stop the attitude.

    This isn’t an FSF blog; we don’t claim to be morally superior to anyone here, just to do what we enjoy in the hope that it will be useful. Chip off the shoulder, please; it’s impeding your ability to be taken seriously.

    @Mike: BTW: GIMP supports CMYK and lossless file formats. :-)

    Lossless file formats, yes, but the CMYK support is still half-assed. In particular, it can’t do bright blacks. Ken Burnside, who designs for print and occasionally hangs out here, could explain in detail.

  44. And you’re too hard on the GIMP; its designers have created a program that meets real needs, they just don’t happen to be your needs.

    Or the needs of anyone else who works within a film, television, or press workflow.

  45. The only rude, condescending, faux morally superior, comments being made on this thread are coming from Bunny – who, ironically, is pointing those very same accusations at FOSS programmers and users.

    I see it often in forums. Someone gives Linux/GIMP/whatever else a try, and finds it does not initially meet their needs. So rather than politely ask for help (or recommendations for other programs), they come into forums ranting about what losers everyone who “uses this crap” is, and what crap they think it is, or generally make rude, condescending remarks. Then they bitch that the responses they get “are rude”. It’s funny, really.

    Sorry Bunny, politeness and respect go both ways. If you want polite respectful responses, be polite and respectful yourself.

    And judging from your posts here, I’m guessing that you were far from polite and respectful when trying to make UI recommendations to the GIMP programmers. And you deservedly got rude responses back.

    Sorry, saying something like “you GIMP programmers are a bunch of dumb-asses who wouldn’t know good UI if you pulled it out of your ass – so listen to my infinite wisdom about good UI design and get a life!” does not work for getting a desirable response.

    Disclaimer – I know you probably didn’t say anything that extreme, but you probably came off that way.

    All that said, it’s sad because I agree with some of your points. I think the GIMP is a really powerful program that matches most of Photoshop’s features, but I certainly don’t like its UI.

    I’ve been using Linux less and less lately. There are many good things about it, and it’s improved in many areas. But Linux still is rough around the edges (for the mainstream desktop user).

    But being rude and condescending about pointing out those rough edges gets you nowhere, in forums or chat rooms, etc. LHB was one thing – people who read that blog knew they were going to get some serious ranting. But if you go and bitch at GIMP programmers about their UI design, without being constructive, you won’t get anywhere.

  46. Or maybe Bunny tried to make constructive suggestions to the GIMP community and was met with the same sort of dreadful, arrogant, antisocial attitude profiled by Udolpho (warning: if LHB pissed you off this will give you a coronary).

  47. >the same sort of dreadful, arrogant, antisocial attitude profiled by Udolpho (warning: if LHB pissed you off this will give you a coronary).

    Actually, I found Udolpho unintentionally funny and kind of pathetic. He put me in mind of the unflattering portrait of “neurotypicals” on the Institute for the Study of the Neurologically Typical; so much resentment, so little brain. There isn’t even any point in disliking people like him; they’re stuck being what they are and will never realize it’s a prison and they are their own jailers.

  48. Do you think maybe acceptance of GIMP is being held back by the name?

    I mean, GIMP? Really? What’s next, BUTTPLUG?

  49. What’s wrong with the name GIMP? Some sort of slang I’m not aware of? (And if it were intentional, it probably would have been stated long ago, like git being sarcastically named after its creator.)

  50. Mike Swanson, the word “gimp” has two meanings in contemporary English: either a type of BDSM submissive role or derogatory slang for a handicapped person. Either way, it’s not appropriate for use as the name of a professional software program.

  51. From the first entry found at the Urban Dictionary:

    http://www.urbandictionary.com/define.php?term=gimp

    1. Gimp

    (1) a derogatory term for someone that is disabled or has a medical problem that results in physical impairment.

    (2) An insult implying that someone is incompetent, stupid, etc. Can also be used to imply that the person is uncool or can’t/won’t do what everyone else is doing.

    (3) A sex slave or submissive, usually male, as popularized by the movie Pulp Fiction.
    Look at that gimp in the wheelchair

    I would imagine that the word probably derives from “limp”, since it usually has an implication of being broken or not quite “right”. Like something that works but only half-way so, or in a ridiculous manner.

  52. Sorry Jeff. Who started the pissing war? Do not play the martyr if your side was the first side to put the boot in… a few times actually. So please, morally superior may work with your mother making you lunch upstairs, but for anyone with an IQ, you just sound manipulative.

  53. > using the mouse is faster than using the keyboard.

    This is going back to earlier, but the keyboard use referenced in the linked article is the use of keyboard shortcuts in the GUI, whereas the keyboard use that you had just previously mentioned was use of a CLI. I am the first to admit that GUIs are more user-friendly, but I really do find the CLI faster to use day to day.

    > Sorry Jeff. Who started the pissing war? Do not play the martyr if your side was the first side to put the boot in… a few times actually. So please, morally superior may work with your mother making you lunch upstairs, but for anyone with an IQ, you just sound manipulative.

    I didn’t see anyone being really rude on this thread until you started in on your page-long rants. Could you give an example please? The GIMP developers don’t count, for reasons already elucidated.

  54. I don’t mean to imply that the GIMP is a “gimpy” program. I am neither a Photoshop nor a GIMP user frequently enough to have real familiarity with the ins and outs of the UIs… but I can say that they are both challenging from a zero-experience point of view. I suppose both can be learned, and both are powerful image manipulation programs. There is something I heard of called GIMPShop once that supposedly is GIMP with a more Photoshop-like interface, but I have not looked into it. I have always felt the name was a little unfortunate, being familiar with the slang connotations of “gimp”. But part of me sort of assumed that it was a bit of sarcastic geek humor, perhaps. One thing I have noticed is that many programmers, especially in the open source and free software worlds, tend to have a sense of humor about themselves and their creations. I find that refreshing, mostly.

  55. Something else I would like to chime in on… There is merit in following the status quo in some situations. You don’t re-invent the controls for driving a car in every new concept car, though an occasional experimental diversion can bring new and better ways of doing things to light. For many programs you see the tape-recorder interface replicated… the <<, >, ||, [], and >> symbols, for instance, are ubiquitous. If a large part of the user base is familiar with certain ways of doing things, and those ways work fairly well, then it makes sense to replicate that usage model for most people. This is of course the momentum that keeps many people with Windows in the first place… familiarity. Now it is true that there can be better ways, and if so those ways will make sense to pursue. But I would say that if there is not a compelling reason to do something significantly different, then there is no good reason to expect users to do it differently. It may be just as good, but it adds an extra re-learning that may not be necessary.

    One thing I do like is programs that tend to put the functionality in a back end or underlying library and have a separate UI layer that can be replaced. For instance, I like Deluge as a torrent client, especially since they moved the main engine to a daemon. Now I can even run the back end on my server, use a web UI, or connect the client UI over the network to the daemon on the server. I like that flexibility. Or MPD… there are many programs that provide a front end to it. And that separation makes it useful in other applications, like some more complex home-entertainment setups, or for use with other sorts of programs, such as Ampache. Likewise a good front end like Amarok can use multiple back ends, as Amarok 2.0 can control Ampache, for instance.

    Now these things are not confined to FOSS vs. proprietary software by any means. But a company-backed or otherwise funded development usually has ROI in mind and will dictate that the project follow certain conventions to please and attract a user community… especially when user base == customer base == monetary gain, or ROI for the development.
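    Coming back to the back end/front end point about MPD: a minimal sketch of what that split buys you, assuming the daemon runs on a machine called “mediabox” (the hostname and music path are just examples; any client, Ampache included, could stand in for mpc):

        $ mpd                          # on the media box: start the daemon (reads ~/.mpdconf or /etc/mpd.conf)
        $ export MPD_HOST=mediabox     # on any other machine: point a thin client at that daemon
        $ mpc add some-album/          # queue a directory, relative to the daemon's music_directory
        $ mpc play && mpc status       # any number of front ends can drive and query the same back end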

  56. It seems odd to need to point this out in a thread where I would have thought most if not all would be familiar with what GIMP stands for – or would at least have done a simple Google search – it’s the first hit:

    GNU Image Manipulation Program – I don’t see much odd about the name. You can find examples of slang, or even other languages, where words mean something quite different than they may to North Americans – try the Chevy Nova… No Va means “no go” in Spanish…

  57. Don Hensley, GIMP was developed by Americans at UC Berkeley — and the G originally stood for General. Despite the fact that the center of gravity for producers of high-quality software probably shifted from America to Europe in the 90s, Americans are still the biggest consumers of software; offending them is not a good way to grow your user base: open source, proprietary, or otherwise. There really is no excuse for the name.

    Open source really needs a fucking marketing department. A real one. It probably also needs executives to set the direction and goals for the developers and a means of rewarding them for doing the utterly nasty work of soliciting user feedback and delivering decent, usable interfaces — but gasp! That would mean running things more like a company! OH NOES!

  58. Isn’t it funny how every discussion over FOSS usability essentially turns into a debate about the merits of GIMP? As if there are no other FOSS graphics programs? As if every FOSS application is developed with the same model and mentality?

    How about contrasting the recent developments of Blender, which was done with feedback from people who were actually using it to make movies? Or projects like firefox and chrome that are redefining the standards for browser interfaces?

    Why does one controversial UI in one app get treated as proof positive that FOSS is doomed and Linux will forever be third class?

  59. >Why does one controversial UI in one app get treated as proof positive that FOSS is doomed and Linux will forever be third class?

    Because that’s the outcome the speaker desires, of course. Evidence is immaterial.

  60. When I first discovered your blog, you hadn’t posted in forever and I thought you’d given up on blogging. Glad to see you back!

  61. “Sorry Jeff. Who started the pissing war? Do not play the martyr if your side was the first side to put the boot in… a few times actually.”

    No pissing war here. Just pointing out your rudeness.

    I’m not annoyed at all. I’ve seen it all many times before. And, as I already pointed out, you made some good points about good UI and usability. But the good points get lost under the din of the “FOSS sucks, and you all suck” mantra that you keep repeating.

    “So please, morally superior may work with your mother making you lunch upstairs, but for anyone with an IQ, you just sound manipulative.”

    That’s the typical knee-jerk response FOSS detractors always use. While the stereotype (nerd living in his parents’ basement) is true in some cases, usually it is not. I’m 43, married, have two kids, two dogs, a house in a very nice area, and a successful career.

    So go ahead and use your negative stereotypes if they suit your needs to bitch about stuff. But know this: using negative stereotypes only makes you look like a jerk and a moron, and causes people to ignore anything you have to say.

    BTW – that negative stereotype is often applied to IT people in general, not just people who use/support FOSS. And, amusingly enough, it’s often used to detract from your line of work as well (artists). ;-) Actually, there are other negative stereotypes regarding artists, but we won’t go there.

  62. @Jeff Reed:
    “Or maybe Bunny tried to make constructive suggestions to the GIMP community and was met with the same sort of dreadful, arrogant, antisocial attitude profiled by Udolpho (warning: if LHB pissed you off this will give you a coronary).”

    I read that rant by Udolpho, and loved it. I’ve seen and worked with that type of IT person many times. In fact, I work with two of them right now – arrogant, abrasive, self-loathing, overinflated sense of self-worth, bad dressers (wears the same shit year after year), and on and on. Also, one of my old college buddies fits that mold to a “T”. Underneath all that roughness lies a good guy (which is why I’m still friends with him). But his exterior fits the stereotype exactly.

    But I’ve also encountered many IT people who were quite pleasant to deal with – good manners, sense of humor, good communication and business skills, good wardrobe ;-), etc etc. But the idiots that fit the stereotype Udolpho rants about really make it tougher for the rest of the “normal” IT crowd.

    As for Bunny – maybe he/she did try to make constructive comments to the GIMP community, and maybe he/she got rude/curt responses. That could be entirely possible, because people have been bitching about GIMP’s UI for years and the GIMP devs still haven’t changed it. But judging by Bunny’s rants here, it’s also entirely possible that he/she did the same rude ranting to the GIMP community.

  63. >But his exterior fits the stereotype exactly

    That’s why I pity Udolpho. He lacks something that’s essential to understand people like your friend; he’s fixated on the monkey rituals of normal socialization and will never understand people who aren’t invested in them. The ironic part is that he thinks he’s the sophisticate – but most of what he thinks is real and important is just noise, the human equivalent of chimpanzee mutual-grooming behavior. It’s not an intrinsically bad thing to be competent at, but when you think it defines the limits of the acceptable world…thanks, I’d rather be the geek I am or even the autist I’m not.

  64. Alan, GIMP is an easy target to start with, not because it’s egregiously bad, but because it’s egregiously good. In the realm of open-source software not developed or backed by a large company, it’s a stellar example: it’s been around for years, popular, and it works really, really well. By the standards of your average OSS app, or even your average Windows app, its UI isn’t even that bad.

    But the UI doesn’t reflect years of user feedback from people whose business it is to produce art and design artifacts with the program. As a result, it is fine for amateurs, but professionals have a hard time taking it seriously. It simply doesn’t integrate well into their workflow, to say nothing of the broken CMYK support and lack of support for industry-standard PANTONE colors, which are absolute deal-killers for a professional application. Workflow is perhaps the #1 UI issue confronting OSS projects today, now that the issues regarding making pretty buttons and a nice desktop environment have largely been solved. If you’re writing imaging or design software and it doesn’t fit within the workflow of a typical studio, I’m sorry, but you’re not even a contender in that space.

    GIMP is illustrative of the fact that a solid program with a decent UI is not enough if you want your software to succeed; you have to make it your business to please and work well with the people whose trade your software caters to, as Adobe has done.

  65. @esr:
    “That’s why I pity Udolpho. He lacks something that’s essential to understand people like your friend; he’s fixated on the monkey rituals of normal socialization and will never understand people who aren’t invested in them. The ironic part is that he thinks he’s the sophisticate – but most of what he thinks is real and important is just noise, the human equivalent of chimpanzee mutual-grooming behavior. It’s not an intrinsically bad thing to be competent at, but when you think it defines the limits of the acceptable world…thanks, I’d rather be the geek I am or even the autist I’m not.”

    I agree. I explored Udolpho’s site for a few minutes, and did not see any identity. It’s easy to blog anonymously, and be abrasive and inflammatory in the process, and stereotype groups of people like he/she does. In fact, what Udolpho says geeks are guilty of, he/she is even more guilty of in the way he/she blogs. Pot calling kettle black.

    Anyway, us “nerds”, “geeks”, “IT professionals” yadda yadda, would do well to take some of those rantings to heart. It doesn’t hurt to keep one’s outward appearance up, nor does it hurt to have good social skills. ;-)

    But other than that, Udolpho is full of shit.

  66. Social graces are an important skill to cultivate. You are only making life more difficult for yourself if you don’t.

  67. Sorry if this was said before, but in reference to Linux (and other open-source OSes out there, Mac OS X excepted) I have this to say:
    If someone asks you for help on an issue and the first words that come out of your mouth for the solution are “Open a command prompt…” the simple answer is, you might want to sit down for this, YOU…ARE…DOING…IT…WRONG. I don’t mean that the solution offered is the wrong one; what I mean is that if you have to tell an end user to drop to the command line FOR ANYTHING, the entire model of the OS is wrong. Period. Look at Windows and Mac OS X: despite your personal feelings about the OSes or companies involved, not one time in your entire troubleshooting career as a tech for these will you ever HAVE to drop to the command line. Sure, the CLI can make it easier for some techs, but the point is you don’t HAVE to, and if you don’t have to as a tech, certainly the “average Joe” user doesn’t.

    Now look at Linux: e.g., I want to use the NVIDIA driver. Download the NVIDIA driver, drop to the command line, run the installer, reboot the machine, open the command line again and either manually edit xorg.conf or run Xorg -configure and hope this works. Now the same operation on Windows: download the NVIDIA driver, double-click to run, reboot, and wow, it’s ready. Now I know someone is going to yack on about this being the fault of NVIDIA for having a closed-source driver. And I have two things to respond: 1) so what? It’s closed source; it’s worked in Windows for years that way. It’s because Linux will change the API on the fly (and other reasons) that this doesn’t work. And 2) this is only an obvious example; there are many, many more. OK, sorry for the rant, you may have your blog back.
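    To make the comparison concrete, the manual route being complained about looks roughly like this – a sketch only; the installer filename, display manager, and exact xorg.conf edits vary by distro and driver release:

        $ sudo /etc/init.d/gdm stop                  # X must not be running for the installer
        $ sudo sh NVIDIA-Linux-x86-<version>.run     # whatever .run file you downloaded
        $ sudo Xorg -configure                       # or hand-edit /etc/X11/xorg.conf to set Driver "nvidia"
        $ sudo reboot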

  68. esr, could you delete the last comment, the one with the name Meis but my email address? I must have accidentally typed “Meis” into the Name field.

    Meis, to be honest, some distros like Ubuntu make it a bit easier. According to this page, you just click System -> Administration -> Hardware Drivers and check the box to enable the NVidia driver. Granted, the Ubuntu Community Documentation is an absolute nightmare.

  69. I think Meis’ post is confusing two independent issues: tech support (“If someone asks you for help”) and the user doing things for himself (“I want to use the NVIDIA driver”). For the first, I vastly prefer the command line because misunderstandings rarely happen, and it usually gets the job done far quicker. Try walking someone through a GUI over the phone sometime; it’s an absolute nightmare.

    For the later, you really should bring that up with NVIDIA themselves for making it so needlessly complex. Either that or buy video cards with open source drivers.

  70. >That’s why I pity Udolpho. He lacks something that’s essential to understand people like your friend; he’s fixated on the monkey rituals of normal socialization and will never understand people who aren’t invested in them.

    If you’re going to criticize people like him who write off hackers as “freaks and geeks”, I think it’s only fair for us to retire terms such as “lusers” and “room-temperature IQ”. I don’t need to repeat that nontechnical people have to start using Linux if it’s going to spread. They simply have better things to do than hack the kernel, and we need to accept that.

  71. >If you’re going to criticize people like him who write off hackers as “freaks and geeks”, I think it’s only fair for us to retire terms such as “lusers” and “room-temperature IQ”.

    Huh? How does that follow? I could pity a blind person while still thinking that Braille accessibility is tremendously important. That’s a hypothetical; there aren’t actually enough blind people to make Braille accessibility much more than a feel-good checkbox item in the larger picture. My point is you’re confusing two separate issues here. They’re related only in that you can’t successfully evangelize to an audience you pity or despise, it tends to leak through in your presentation and turn them off.

    Fortunately, for every Udolpho in the world there’s someone like my mother – not a geek, but she married one and mothered a couple, and she understands that there is a good point in tolerating their foibles even though she doesn’t speak the language. She’ll never hack the kernel, but she’s not so trapped by her prejudices that I pity her as I do Udolpho.

  72. @Phil: I am glad Ubuntu is making it easier, openSuSE does as well, however, I only used the NVIDIA driver as an example as it is an obvious one most people can relate to and have heard of.

    @Mike Swanson: No, I am not confusing two issues; however, you may have been confused by my presentation. I was saying that, as a tech, one is not REQUIRED to use the command line in OS X or Windows, though many techs do find it easier. And if a tech doesn’t need to, then obviously an end user doesn’t need to. Whereas, in Linux, even the most basic stuff for end users requires some CLI knowledge, and just about any troubleshooting immediately requires some terminal time. Which brings me back to the original point: if the answer to a support question for your OS requires an END USER to use the command line, your OS model is wrong. Because of this, Linux is still by developers, for developers, and nowhere near ready to be used as a mainstream desktop. It lacks the stability, reliability, uniformity, integration and other features to make it acceptable for that.

  73. >if the answer to a support question for your OS requires an END USER to use the command line, your OS model is wrong

    Your assumption is out of date. Modern distros don’t require end users to ever touch CLI. I know this because I’m the go-to guy for my wife, my mother, and one of my sisters.

  74. @esr: While I respect your knowledge of Linux and OSS in general, in this I must disagree with you. I have run every major modern Linux distro (Ubuntu, openSuSE, Fedora, RHEL, Gentoo – command line obviously required on that one, I know – Mandriva, Debian, and more that I can’t think of right now) and every single one has its own glitches, all of which have required command-line intervention, often enough because the GUI had crashed (please don’t get me going on why X is just trash), but still because the OSes are built on a CLI-based model. I freely admit Ubuntu and openSuSE have come very far in moving past this, but they have not eliminated the need for even an end user to drop down to the CLI to fix things when they do go wrong, or even to use the CLI in normal use (apt-get, anyone? Come on, even YaST is by default a GUI system, and even its CLI version is an ncurses-based GUI). I’m not saying that makes them bad OSes (well, maybe Ubuntu :) – sorry, but I seem to be one of the few who detest Ubuntu), but for something that is for the “average Joe” or “Grandma” or whatever term is bandied about for the common end user, this model is wrong. And yes, the hardware tested was reasonably modern – a P4 HT with DDR2 667 memory – not bleeding edge, but solid and well tested (I used this hardware specifically to eliminate arguments based on bleeding-edge/untested hardware instabilities/incompatibilities).

  75. >all of which have required command line intervention, often enough because the GUI was crashed

    But my mother simply is not having this experience. Nor is my sister, nor my wife. I know this, because I’m the person they call when they have problems.

    Sometimes *I* use CLI to fix things, but that’s different than them being required to do it.

  76. I just got to watch a friend of mine spend two days tracking down what Firefox 3 broke on an automatic update on her Ubuntu install. It grabbed a dependent library which broke other things she needed; she updated the other thing, and the library it grabbed nuked Firefox. She eventually backed everything up and went to a full reinstall, grabbed Firefox 2.0.9 and is back up and running. Whenever she forgets to tell apps to not auto-update, she goes through this piece of detective work.

    This, in Windows, is called DLL hell, and it’s one of the things that the Windows Registry was created to fix. The Windows Registry does, in fact, solve this problem.

    Why, with the army of coders Linux has, and those working on Ubuntu, does she have to fear updating her applications?

    This hasn’t happened in Windows since 1995.

    It hasn’t happened in MacOS since 2001.

    It never happened in BeOS.

    How much is two days of her productivity worth? Well, from the income she missed from not being able to submit work to her client’s web site, she could’ve *bought* Windows.

  77. >I just got to watch a friend of mine spend two days tracking down what Firefox 3 broke on an automatic update on her Ubuntu install. It grabbed a dependent library which broke other things she needed; she updated the other thing, and the library it grabbed nuked Firefox.

    That’s truly weird. In fact, it’s so weird that I think something must actually have corrupted her package database. I used to trip over this sort of thing occasionally under Red Hat, but Ubuntu is supposed to track dependencies like this and I’ve never seen it fail to do so. I maintain five Ubuntu systems (my desktop, my laptop, my wife’s desktop, and the servers downstairs), and I’m the guy on the phone for three more, so I do a lot of updates; my sample isn’t small.
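    For what it’s worth, the usual first aid when the apt/dpkg state looks corrupted goes something like this (a sketch, assuming a Debian/Ubuntu system):

        $ sudo dpkg --configure -a     # finish any half-configured packages
        $ sudo apt-get -f install      # let apt repair broken dependencies
        $ sudo apt-get check           # verify the dependency database is consistent again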

  78. >But my mother simply is not having this experience. Nor is my sister, nor my wife. I know this, because I’m the person they call when they have problems.

    I, too, am the one who fixes the computers for my family, so I know where you’re coming from. In fact, that is specifically why NONE of my family members runs Linux. The few times I have made a set-and-forget Linux system (like you are referring to with your family), it took a week and a half of work to get it that way, only to have it break or wig out the first time the kernel (or some other system component) updated. I don’t mind working around the bugs in Linux; it’s a great thing for me to tinker and play with, but a production desktop or a relative’s desktop? Or worse yet, letting them be a member of the wheel group? Not a chance. I don’t have that kind of time, and even if I could manage to maintain their systems without the CLI, it just isn’t worth it, not when Windows comes (excluding, sometimes, hardware drivers) set-and-forget out of the box. Time is money, and my time is worth a healthy chunk of it.

    @Ken Burnside: Yes, dependency hell is another large issue with Linux, especially as each distro tries to “solve” it in its own way and ends up using various kludges which inevitably break, as in your case and the one I just listed. That’s one of the reasons it took so long to set up those set-and-forget Linux boxes: once they were installed I had to take time to configure and prep them, then update them and repair whatever configs and packages got completely botched in the update process. You are quite right that this is an inexcusable state of affairs.

  79. ESR: interesting (though not altogether surprising) to see you consider accessibility secondary and basically a PR issue. As a blind computer user (and no, this is not identity politics, I don’t define myself that way but it is obviously part of who I am) it’s also interesting to note that the best I can expect from you is pity. Fortunately the accessibility situation in gnu/linux is driven by people who think it more than a feel-good checkbox, thus orca, emacspeak, BRLTTY, and some other very capable accessibility solutions. I’d also point out that accessibility, as a rule, runs together with good design, since an application that can be interrogated about state by an accessibility aid is an application that can be interrogated about state for any other purpose (testing, use as the back-end of a bigger process, etc).

    Obviously no one is obliged to care about accessibility, nor to have any particular attitude with respect to blind people (or anyone else). I’d just say, though, that pronouncements like that make it rather difficult for me to advocate for free software inside the blind community. Oh well.

  80. About CLIs:

    A few years ago I picked up a Mac from a friend. I hadn’t really used a Mac before, but I heard that Mac OS X presented a good interface on top of UNIX. Not to mention that knowing OS X might look good on my resume.

    It took me a day to get used to OS X. Configuring the big stuff was easy, but I started to notice that options normally available were nowhere to be found. After reading some threads in a forum at macosxhints I found out how to accomplish these things using the CLI or a text editor.

    According to pundits and posters the moment a CLI is used then the OS isn’t user-friendly. Thus, using their logic, Mac OS X was not user friendly. But isn’t that supposed to be the Mac’s forte?

    After some thought, I reasoned that the Mac had two types of users: ones that read the manuals, and ones that won’t. To reduce clutter, the features that most people wanted were available in the GUI, and the ones that technical users wanted required them to learn how to use the command line. The only people left out were the people that wanted to be technical without the desire to read a manual (MCSEs).

    My point is that we can’t dismiss an OS just because it has an anti-clutter feature, such as the CLI.

  81. Meis is correct: a desktop OS needs to be structured from the ground up around the GUI. Everything that can be configured, set, or fixed in the OS must be reachable graphically. Those wars were fought in the 80s; the point-and-click GUI won. Mac OS X (which is more of a Unix than Linux; it is certified against Unix ’03 whereas Linux is not) passes this test (you can get a bare CLI but only an emergency root prompt and only by holding down a magic keychord at system boot). Linux still clings too fondly to its old character-terminal interface roots. It is not an acceptable modern desktop OS.

  82. >As a blind computer user (and no, this is not identity politics, I don’t define myself that way but it is obviously part of who I am) it’s also interesting to note that the best I can expect from you is pity.

    You need to work on your reading comprehension. I spoke of ‘pity’ only in a hypothetical.

    It is true that I put accessibility for blind and handicapped people relatively low on my priority list; that’s because there are much larger populations, with more influence on demand patterns, that we haven’t got the hang of writing for yet. When planning for victory, you have to plan for victory, not just for small gains that will make you feel good.

  83. > Everything that can be configured, set, or fixed in the OS must be reachable graphically.

    I agree, but that can be a pain for advanced users.

    > Mac OS X (which is more of a Unix than Linux; it is certified against Unix ‘03 whereas Linux is not) passes this test (you can get a bare CLI but only an emergency root prompt and only by holding down a magic keychord at system boot).

    How does the ability to get a full multi-user bare CLI invalidate the ability to access all configs via GUI? I don’t see how the lack of that (occasionally very useful) feature is a good thing. It’s one of my main gripes with OS X.

    > Linux still clings too fondly to its old character-terminal interface roots. It is not an acceptable modern desktop OS.

    Uh…no. Not true. I would submit that a CLI is the most efficient way to interact with a computer, for someone practiced in it. I would also submit that the ability to use a CLI does not invalidate whatever GUI is being used. The argument that Linux is somehow being held back by its ability to use a CLI is a wrong one; the problem, which is the lack of GUI ability to access features available at CLI, is not caused by the existence of a CLI. It’s caused by developers writing for developers. Taking away the CLI will not fix that in any way.

    >Uh…no. Not true. I would submit that a CLI is the most efficient way to interact with a computer, for someone practiced in it. I would also submit that the ability to use a CLI does not invalidate whatever GUI is being used. The argument that Linux is somehow being held back by its ability to use a CLI is a wrong one; the problem, which is the lack of GUI ability to access features available at CLI, is not caused by the existence of a CLI. It’s caused by developers writing for developers. Taking away the CLI will not fix that in any way.

    Correct. I analyzed these issues in some detail in The Art of Unix Programming.

  85. Ah, we’ve left off pummeling the GIMP and moved on to that other old chestnut, the GUI vs. the CLI. How many little brown’n’orange GUIs does Ubuntu need to provide before we can lay this to rest?

    Now here’s a problem for an open source OS. Unlike the closed OS, there are no arbitrary limits to what can be done. So in the interest of fairness, if you are going to go about accusing Linux of lacking GUIs, please do this first:
    – Come up with a concrete example of a task that requires the CLI
    – Determine that no major desktop distro released in the last 2 years has a GUI for it
    – Make sure that the same task can be done AT ALL in the proprietary OS you are comparing to
    – If you meet all of these, gripe away. Then post it at Ubuntu Brainstorm so the world can benefit from your research.

    I use Ubuntu, but sadly for the folks who code GUI tools I mostly use the command line to fix stuff. Because it’s fast and I have better things to do than wait for a window to draw. And no, I’m not an old-skool Unix hacker either; I never touched a unix-like OS until 2003.

  86. The argument that Linux is somehow being held back by its ability to use a CLI is a wrong one; the problem, which is the lack of GUI ability to access features available at CLI, is not caused by the existence of a CLI. It’s caused by developers writing for developers.

    Specifically, I was not addressing the fact that Linux had a CLI, but rather that the GUI was still, in many ways, a second-class citizen under Linux. Config files scattered to the four winds, applications installed in /usr/bin with their associated resources in /usr/share — oh, and the DE has to be made separately aware of the application for it to appear on the menu — all of these things speak of an OS designed to be interacted with textually, with a GUI retrofitted on top of it, rather than one designed from the ground up around the graphical interface and spatial metaphor. So, yes — most things have graphical knobs these days, but the infrastructure required to support said graphical knobs is immense, complex, and consequently more brittle.
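    As one small illustration of that last point: even after a program is installed, the desktop environment typically learns about it only from a separate .desktop file dropped where it will look (the program name here is hypothetical):

        $ cat > ~/.local/share/applications/example-editor.desktop <<'EOF'
        [Desktop Entry]
        Type=Application
        Name=Example Editor
        Exec=example-editor %f
        Icon=example-editor
        Categories=Graphics;
        EOF
        $ update-desktop-database ~/.local/share/applications   # some environments want their caches refreshed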

    What you say about developers writing for developers rings true (wurld dominashun — ur doin it wrong); the end user must be placed first and foremost in the fundamental design of the OS. This is not really done in the GNU/Linux world and it may require some complete rethinking even of how we write a kernel. (Again, the Amiga devs got that one right from day one.)

    – Come up with a concrete example of a task that requires the CLI
    – Determine that no major desktop distro released in the last 2 years has a GUI for it

    Wireless. Last time I used Ubuntu (around a year ago) the GUI for wireless didn’t work 100%; I had to drop to a shell to get everything working. All the knobs were there, but I couldn’t connect to my home WPA network.

    I use Ubuntu, but sadly for the folks who code GUI tools I mostly use the command line to fix stuff. Because it’s fast and I have better things to do than wait for a window to draw.

    Me too, but I have lately begun to wonder whether using a command line to fix stuff is because under Linux using the GUI tools is simply too cumbersome. Windows and Mac don’t have this problem. Anything that can be configured can be easily set with the GUI. (In the case of Windows, “easily” is rather relative…)

  87. ESR: OK, I understand your remark to be hypothetical. For whatever reason I understood the hypothetical to refer to the other clause (as in, caring about accessibility is hypothetical but pitying blind people is actual). My reading comprehension is greatly aided by less ambiguity.

    As to the tactical issue, it can be argued both ways. I think the demand patterns of public entities are going to be somewhat affected by the accessibility situation (not speaking of blind people here specifically). In addition, life expectancy keeps going up, so we can expect more disabled people (maybe medicine will fix this in the future, but maybe not). To close, accessibility, like security, tends to be problematic when bolted on. It’s the kind of thing that is usually cheaper to design in advance. The good thing about the Unix philosophy is that it tends to be conducive to such designs, given the tendency to separate processing from presentation, and the desire to make things automatable. So far I’m pretty sure gnu/linux is the only OS that can be installed from scratch by a blind person, for instance, without help. Not sure about Macs.

  88. >To close, accessibility, like security, tends to be problematic when bolted on.

    That’s an interesting point. I don’t think your argument about increased life expectancy is sound – blindness and motor impairment may happen as complications of age-related diseases, but they’re not common enough to make the impaired population much larger relative to the general population of oldsters. You’re still competing for UI design time against a much larger group of people who have good visual acuity and fine motor control; their desires, e.g. for better-tuned visual interfaces, have to come first if our goal is to win market share.

    On the other hand, you’re clearly right that accessibility can’t be painted on, and that making GUIs accessible to the blind is especially problematic (I have given related issues some thought in connection with my unfinished The Art of Unix Usability). You may just have given me some ammunition against fools who think CLI should be abolished in favor of an all-GUI world. My reason for thinking them fools is that CLI can be more concise and better suited to power users; what you’ve implicitly pointed out is that it is more easily rendered to and from speech, and that matters.

  89. “But my mother simply is not having this experience. Nor is my sister, nor my wife. I know this, because I’m the person they call when they have problems.

    Sometimes *I* use CLI to fix things, but that’s different than them being required to do it.”

    But esr, the problem is that the average joe six-pack/hockey mom probably doesn’t have an expert like you around to help them out, or to set things up so that there isn’t as much of a need to use the CLI. What happens when someone hears about Linux, decides to install it, and suddenly their wifi doesn’t work? Not even with the driver they downloaded through a LAN connection. Then they find something in a forum “helping” them by saying sudo blacklist blah blah blah, and then modprobe, and then follow this easy-to-read step for configuring your wireless connection.

    True, in your experience using the CLI isn’t a major issue, but that’s your experience, and you’ve been doing this a while, so you’re a little more savvy about how to go about these things. So your claim that it’s different from them having to do it is negated by the fact that you are who you are. If they didn’t have you around, there’s no telling whether they’d be able to work through it. I’m not saying they aren’t intelligent or that they couldn’t figure it out, but since you’re there, you can’t see the difficulties. By adding an expert to the equation you’ve rendered your example anecdotal rather than empirical.
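    The kind of forum recipe being alluded to typically reads something like this – a sketch, with the module names, file paths, and network details as examples rather than a prescription:

        $ echo "blacklist bcm43xx" | sudo tee -a /etc/modprobe.d/blacklist.conf   # disable the driver that doesn't work
        $ sudo modprobe b43                                                       # load the one that (hopefully) does
        $ wpa_passphrase "HomeNet" "secret-passphrase" | sudo tee /etc/wpa_supplicant.conf
        $ sudo wpa_supplicant -B -i wlan0 -c /etc/wpa_supplicant.conf             # associate with the WPA network
        $ sudo dhclient wlan0                                                     # and finally ask for an address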

  90. Have you ever troubleshot a wireless connection on Windows? Sometimes it’s pretty much impossible to get it working there.

    The argument that Linux is hard to use compared to the predominant system is pretty ridiculous when you consider how hard Windows itself is to use. Plus, it’s just my observation that many families have at least one “tech guy”, else they can find a shop to take their machines to (unfortunately ones that help you with Linux are not too common, yet…).

  91. Plus, it’s just my observation that many families have at least one “tech guy”, else they can find a shop to take their machines to (unfortunately ones that help you with Linux are not too common, yet…).

    I’m inclined to agree that such a shop is uncommon, and I wonder which is more common: A shop like you indicate, or a Linux User Group…? Personally I’d think a LUG would be more common but that’s because I look for LUGs and I haven’t really tried to find a shop.

    I’m curious, and I’ll look around locally to see how many shops would work with Linux.

  92. The thing is that I can do most if not all that I need to in the Windows wifi GUI, CLI is not generally necessary – possible but not necessary. With the Linux flavors that I’ve tried it typically requires mostly CLI time to try and get it working. With certain chipsets it’s still hit or miss, depending on the flavor of Linux and what needs to be resolved.

    And LUGs are fine, but a person can still need a certain level of technical knowledge to be able to understand what’s being said, as well as to implement it.

    Anyway, the comment was about the use of the CLI vs. the GUI and whether a resident “expert” (OK, in the case of esr, remove the quotes) makes it user-friendly or not.

    As far as many families having a “tech person”, I’m not sure how much weight anecdotal evidence carries. I can say that it’s my observation that most don’t have a “techie” or trust shops to work on their stuff (I don’t want them getting my secret to the ultimate brownies. The secret is in the…). But that’s just anecdotal evidence; there’s nothing other than my word that this is the case. Just like saying that Linux/Windows/OS X is great/sucks based on anecdotal evidence is meaningless. It’s just personal opinions with nothing to back them up. That isn’t to say that your point isn’t true, but it’s true only for the subset of the population that you’ve observed, under the circumstances that you’ve observed them in (typically uncontrolled).

  93. Have you ever considered the logistics of enabling the Geek Squad to service Linux PCs? Despite their cool-sounding names (Geek Squad, Genius Bar at Apple stores, etc.) they are droids. Reinstalling Windows constitutes the standard fix for probably many if not most of the problems they see.

    Now imagine someone brings their Fedora box in and gets an Ubuntu box back. EPIC FAIL.

  94. >But esr, the problem is that the average joe six-pack/hockey mom probably doesn’t have an expert like you around to help them out, or to set it up so that there isn’t as much of a need to use the CLI.

    Please, I’m not so dense that I don’t understand this argument. The thing is, the average “joe six-pack/hockey mom” doesn’t have an expert around to fix Windows when it breaks, either. You’re falling into the trap of arguing as though Windows is mythically hassle-free when the right questions to ask are about the relative pain level.

    Two years ago I would not have dared to be the informal Linux support guy for three relatives. Today it’s easy. As it happens I did my first real service call since installation a few days ago — had to fsck a disk after my mom hard-crashed her system with the power button (modern Unixes are hardened against this but as with Windows it can still happen if the dice come up triple sixes). Sounds terrible until you consider all the Windows problems she could have been having since I switched her over just about a year ago.
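    That sort of service call amounts to a couple of commands from a recovery shell or live CD – the device name below is an example, and the filesystem has to be unmounted first:

        $ sudo umount /dev/sda1        # you can't safely repair a mounted filesystem
        $ sudo fsck -y /dev/sda1       # -y answers "yes" to the repair prompts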

  95. No one here is saying Windows is hassle-free. It’s just that the utility-to-hassle ratio for Windows is still much higher than for Linux.

    it’s true only for the subset of the population that you’ve observed, under the circumstances that you’ve observed them in (typically uncontrolled).

    As all such things are, and with that disclaimer I’ll share below one small experiment I’ve done.

    No one here is saying Windows is hassle-free. It’s just that the utility-to-hassle ratio for Windows is still much higher than for Linux.

    That’s not been my experience. The experiment I referred to above went something like this: My cousin is Joe Six Pack, a typical user who doesn’t know how any computer works and doesn’t care to. For a time he lived at my house. I assembled some spare parts for him and installed Fedora. I showed him menu items for Firefox and Gaim and told him to call me if he had any problems. (I did not tell him root’s password.)

    He used that computer for a year and never had to ask me for anything.

    Of course the disclaimer applies, this casual study involved only one person. Still, a year is a long time.

  97. >Now imagine someone brings their Fedora box in and gets an Ubuntu box back. EPIC FAIL.

    Given what I’ve heard about Fedora, I think that might be an epic win, actually.

  98. “sounds terrible until you consider all the Windows problems she could have been having since I switched her over just about a year ago.”

    And yet she also could have been having plenty of Linux problems. You did a good install, knew what you were doing and set it up right.

    I’ve seen Windows boxes run just as long without “all the Windows problems” that you assumed. I’ve seen linux boxes give their equivalent of a BSOD often enough to deserve a clean install. You can’t lay down a blanket statement like the one I quoted and expect it to actually mean anything. You do know what assuming does, right? Well, you just assumed that there would be “all the Windows problems.” Yes, I know that you said could, but the context says would.

    >You did a good install, knew what you were doing and set it up right.

    My point is that’s no longer difficult. I installed a completely stock Ubuntu in the completely standard way; the only sticky spot was that it wouldn’t talk to her Winmodem, so I bought her a cheap wifi card (which worked perfectly the first time) and she connects through a neighbor’s WAP. In the 11 months since then, until I had to fsck that disk this weekend, no Linux-related hassles at all. Zero. None.

  100. I’ve been following your discussion for a while, and I am surprised that nobody has brought K3B or Amarok into the discussion, with all their plug-ins and looks.

  101. Hi there, I like your post “Linux-Hater’s Blog, considered” so much that I’d like to ask whether I may translate it into German and link back. Answer welcome. Greetings, Schlauchboot

  102. Interesting article (and discussion).

    A few thoughts – the sorts of things that either work, or don’t work day-to-day, or require troubleshooting in Linux are usually a result of the way all these packages are integrated together by the distributor. It seems those that know the Linux landscape _can_ pick a path using a polished distribution that “just works”, and get results at least up to the standard of Windows ABC on the same machine. However, there is such diversity in Linux distros (and hardware peripherals) that there are innumerable ways to get it wrong, and for every Windows user that has a punt and gets it wrong, it’s “Linux” that didn’t work, not XYZ distro.

    In a similar line, a user might have a problem with a piece of hardware. It may be that there is a perfectly reasonable GUI that enables the problem to be fixed, but because there is *always* a CLI equivalent to solve the problem, the view is propagated that “Linux” is difficult because you need to use the CLI. I know from a personal perspective, when trying to describe a process to someone, it is far simpler to *describe* the process via a series of CLI statements than to try to describe all the menu navigation, button labels, and mouse clicks needed to solve a task. I may use the GUI myself because in most cases it’s just click-click-click, but I understand the processes underneath, so when I have to explain it to someone else I’ll often prefer to give them three commands to paste into a bash session than write a three-page story with pictures to show them how to do it via the GUI.
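    For example, over the phone it is far easier to dictate something like the following than to narrate a GUI path (the interface name is an example; substitute whatever lshw reports):

        $ sudo lshw -C network                   # find out which interface we're dealing with
        $ sudo ifdown eth0 && sudo ifup eth0     # bounce it
        $ dmesg | tail -20                       # read back what the driver just said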

    In short – for all the different combinations of KDE/GNOME/XFCE/Fluxbox/Ubuntu/Kubuntu/Mepis/Gentoo/Slax/Debian/whatever, no matter how well packaged a distro gets (even if they are able to exceed the levels of integration and usability offered by Mac/Win), every time someone somewhere experiences a problem with some other obscure combo of distro and hardware and seeks help on a forum, whether or not there is a simple GUI solution to their problem, there will always be many more solutions offered that contain command-line entries – and hence perpetuate the view that “this Linux thing is poorly integrated, and when it breaks you need to use a whole bunch of arcane commands to fix it”.

    The solution? dunno. As a distro builder with a stand-apart offering you’d want to have a strong branding distinct from the Linux name, though. Sounds like a certain recently popular distro that’s copped some flak for exactly that?

  103. A few thoughts – the sorts of things that either work, or don’t work day-to-day, or require troubleshooting in Linux are usually a result of the way all these packages are integrated together by the distributor. It seems those that know the Linux landscape _can_ pick a path using a polished distribution that “just works”, and get results at least up to the standard of Windows ABC on the same machine.

    I’m one of those people, and my quotidian use of Linux and BSD is only made possible because of that fact. In the words of Matt Foley, I’m going to give you a little scenario about my life so you know where I’m coming from.

    In the early 1980s, the only two companies making Unix workstations for the desktop were Sun Microsystems and, oddly enough, Tandy. My father brought home one of these Tandy beasts and occasionally I got to play with it (provided I didn’t break it, of course). He held onto it for years past its expected service lifetime, and I eventually learned to program C on it. When I came to Linux in 1995 I already had an internal model of how Unix systems worked that I had carried with me, literally, since I was 11 years old.

    Where I’m going with this is that we who carry internal computer system models in our head are freaks compared to the general population, who can’t even decipher what “PC LOAD LETTER” means. Linux (and Unix in general) is designed for us, but a modern desktop OS needs to be designed for them. The machine and its operating environment should be designed to be operated by an ordinary human carrying a system model of whatever his business happens to be, not the computer itself. That’s why our financial system runs on Excel spreadsheets (possibly a factor in their failure?): for the people whose business is business, Excel is an ideal platform to do your thinking on. The machine should also support recovery from failure modes without a jarring shift in operational models. Integration between hardware, OS, and applications should be seamless.

    The only major desktop computer platform I see nowadays doing all this is the Mac. Windows is kinda sorta halfway there; the scar tissue the PC user base grew in dealing with Windows’s shortcomings in this regard is overwhelmingly evident. Such a user base cannot deal with Linux — not even something slick and pretty like Ubuntu — so long as Linux retains this thousand-and-one different ways to assemble things but not-quite-integrate them approach.

    The solution? dunno. As a distro builder with a stand-apart offering you’d want to have a strong branding distinct from the Linux name, though. Sounds like a certain recently popular distro that’s copped some flak for exactly that?

    There’s a free-Unix “distro” that achieves all this. It’s called Mac OS X. The fact that it has wooed many Linux users into switching speaks towards its effectiveness at obsoleting desktop Linux.

  104. The Unix Haters Handbook will be a classic, required reading, 100 years from now.

    How can I say that? A long story…

    When RMS was finally washing his hands of the MIT CADR software, never to touch it again, I actually tried to convince him that he should continue programming in Lisp rather than adopting the C programming language for his (not yet named GNU) venture. But he said that he understood the limitations, and thought that Unix-like systems would become very popular one day, and that it was important to build around something that was generally understood and popular.

    Note that I had previously convinced Lisp Machine Inc to fund RMS in the development of a Macro package on top of TeX that would make it easy to print the Bolio format MIT CADR documentation (aka BoTex aka Texinfo). So I had some credibility with him, and he also knew that I understood the limitations of lisp on hardware at the time from my work on VAX-NIL and PDP-10 Maclisp.

    Face it. Unix set back the expectations of what a computer should do by about 10 years.
    When that was finally beginning to be digested, along came the web, which set back expectations by about 20 years.
    (Block-mode IBM 3270 terminals with a graphical-icons enhancement, that’s all.)

    Now here we are 25 years later, with hardware that is about 1000 times faster, with about 1000 times more storage, and the biggest leap forward that I can think of is TRUETYPE. But wouldn’t Metafont work almost as well?

    Meanwhile, the CADR is open-source free software with various virtual machine implementations; Multics is also open source and free, but no VM is available yet. And we have a new Free/Open Source Hardware movement going.

  105. >The Unix Haters Handbook will be a classic, required reading, 100 years from now.

    I was the first-line technical reviewer for UHH, and I don’t think so. Read my retrospective to see why. I mourn the missed opportunity.

    Meanwhile, the CADR is open-source free software with various virtual machine implementations; Multics is also open source and free, but no VM is available yet. And we have a new Free/Open Source Hardware movement going.

    “Various virtual machine implementations”? Sweet! I only know of the one — usim — and while running a CADR is made of pure awesome, there’s a busy-wait or something in there which causes my laptop’s fan to spin up to disturbingly high RPM. Is there a CADR VM that is more well-behaved with CPU time?

  107. >I’ve seen Windows boxes run just as long without “all the Windows problems” that you assumed. I’ve seen linux boxes give their equivalent of a BSOD often enough to deserve a clean install.

    I’ve never seen either of these animals. With respect to Windows boxes, I’ve never seen any longevity without major nuisances and heavy technical maintenance in any vanilla installation of any version of the system. Since joining my fiance in New Zealand last May, her Windows box has been reinstalled three times. The current iteration is not looking very healthy now, either. Yet, an installation of GNU+Linux that I had put on it when I was out last year works exactly as well as it did when first installed on the machine. Barring hardware failure, which recently forced me to reinstall Archlinux on my primary Desktop, no software fault has ever required me to perform a “clean install” of GNU+Linux. I used to get some soft lock-ups on the Desktop a few months ago, which vanished immediately after installing the next kernel version. Nothing even close to a “clean install” was required. I have never seen patches or updates coming from Microsoft that completely corrected and restored a system from a problem of that severity 1. after a fault initially presents, and 2. without reinstalling the entire thing again, anyway.

    Having said that, my experiences prove closer to ESR’s trepidations than your comments to the contrary.

  108. ESR, I am really shocked that you take the site at face value. I feel a little silly pointing this out, but the LHB author *wants* to be a huge Linux fan. He loves the open-source ideal. He sees the potential of Linux, uses it himself, and would like to recommend it to others. But there are too many flaws to conscionably recommend it to an average user. He knows that if the Linux community had an answer to the flaws he points out, Linux really would be something to brag about! What’s worse, the community at large isn’t actively addressing these things. (Or something along those lines – hopefully you get the idea.)

    You seemed almost to see this in a couple of your comments, but it never caught hold. Re-read his articles in this light, and tell me you still think he’s some Windows advocate or something.

    You may disagree with his methods, but there’s logic there. Say you’re a Linux fan who is honest with himself, and so naturally sees an ocean of flaws in front of him. You also have a way with words, and decide to start a website to talk about this. You decide to write satirical, inflammatory articles. Some people will get the joke. Some people won’t, and will get angry that someone on the internet dared criticize Linux. Still others may or may not get it, but will be motivated to go out and fix some of these things. In the end, would you say that the LHB author has had a positive or negative net effect on the Linux/OSS community?

    I have a lot of respect for you, but I am sorry to say that I have to give you a wholehearted YHBT.

  109. QUOTE: “With respect to Windows boxes, I’ve never seen any longevity without major nuisances and heavy technical maintenance in any vanilla installation of any version of the system.”

    Linux users are supposedly knowledgeable, so working around Windows issues should be trivial (and it is). Clean-up and maintenance can be automated; services and applications can be controlled. Really trivial things. Hell, you can just import some registry keys and then reboot to save yourself a bunch of time. There are tons of tools for these kinds of things. This shouldn’t really be an issue for somebody who actually knows what they’re doing. I don’t get this double standard people have with Windows. If I complain about the default behavior of something in Linux, people will tell me to change the setting, yet these same people will take shots at Windows because of its default settings. Either they’re unfamiliar with Windows or they’re so biased they see nothing wrong with their inconsistency. Reminds me of a guy who hated IE because it asked him if he wanted it to be the default browser, and therefore Firefox was superior. Never mind that Firefox has the exact same behavior, and the methods to turn the behavior off are in the same three places in both programs.
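
    For what it’s worth, here is a minimal sketch of that kind of automation (Python on Windows; the .reg file name and the cleanup profile number are hypothetical placeholders, not anything specific from this thread):

        import subprocess

        # Re-apply a previously exported .reg file containing the settings tweaks you want.
        subprocess.run(["reg", "import", r"C:\maintenance\tweaks.reg"], check=True)

        # Run Disk Cleanup unattended, using a profile saved earlier with "cleanmgr /sageset:1".
        subprocess.run(["cleanmgr", "/sagerun:1"], check=True)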

    Then of course my experience with Linux hasn’t been much different from Windows regarding defaults. I always have to do something with either OS: set something up, turn something off, remove something, et cetera. There’s always something that annoys me and requires a workaround. I just don’t see the big difference. Specific problems may be different, but all in all I’m not having more or fewer problems on either side. Anybody who really knows what they’re doing can set up a stable system regardless of whether it’s Windows or Linux.

  110. >I feel a little silly pointing this out, but the LHB author *wants* to be a huge Linux fan. […] I have a lot of respect for you, but I am sorry to say that I have to give you a wholehearted YHBT.

    You know, when I write things like “Zounds! You could almost think LHB were a secret Linux fan!”, I expect people to read that as “He’s not fooling me any better than he’s fooling himself.” I respectfully suggest that the reading-comprehension failure here is yours, not mine.

  111. I’m sure you’ll all be delighted to know that the LHB has resumed service… handbags at the ready, girls!

  112. I would like somebody on this board to explain something to me.

    I have used various distros and different versions of Linux as well. I have compiled a few tarballs and used some packaging systems (both CLI and GUI — I remember Fedora 3’s system was particularly horrid). My basic understanding is that it’s up to the distro maintainer (vendor?) to supply all the software to the end user, whereas with Windows I can download freeware from the author’s website, buy it in a shop, or get shareware from various file repositories, etc.

    When I asked someone why a particular package worked in one version of Linux but not another, I learned that every 6 months a new version comes out and everything changes — the GUI, the GCC version, etc. — so everything is recompiled. In the case of Mandrake, for example, that meant that v8.1 was, on average, just as buggy as 8.0 — many bugs were presumably fixed in the 6 months after 8.0 was first released (otherwise what were those updates doing — surely not merely wasting bandwidth?), but 8.1 had just as many bugs, just different ones, because of all the changes they made to everything.

    Why go up a version so often? Wouldn’t there be a better chance of getting a more robust, mature OS that works if they had just kept working on 8.0 for, say, 3 years? (I’m just using Mandrake as an example; they all seem to have 6-month release cycles.)

    I think distro-independent binaries would be better, with a single-click installer. One for AMD64, the other for Intel x64. Someone once said to me that was impossible because Linux advances so quickly, and Ubuntu x is so different from Ubuntu y.

    Really? How come the same binary so often works on Windows 95, 98, 2000, and XP? I played Bad Mojo the other day; it was written for Windows 3.1 AND it still works perfectly on XP SP3. I don’t know about you, but I think the ability to run old software can often be very useful.

    Could Linux just slow down and take things easy? It seems like it’s in such a hurry. And why does everybody call Microsoft a monopoly? There’s still Apple, and from what I have seen of it, OS X is the best desktop Unix-like OS to date.

  113. I rejoice at LHB’s return. Gotta love him, even if I love Linux more :P

    @Marco Nadal
    “And why does everybody call Microsoft a monopoly?”

    Well, for a frighteningly large percentage of computer users, computing = Windows. Can you give me a better example of a monopoly?

    G.

  114. >I respectfully suggest that the reading-comprehension failure here is yours, not mine.

    Sorry, not buying it. Why write a 1500-word essay, if you really saw the meaning of his site? YHBT.

  115. >Sorry, not buying it. Why write a 1500-word essay, if you really saw the meaning of his site?

    Because he made substantive points that I think Linux developers should listen to. I wanted to give LHB a bit more exposure while suggesting to him that less flamage would be more effective.

  116. Eric,
    “So, for example, nobody seems to be trying to build direct competitors to the GIMP, but we do have Scribus and Inkscape that work from different imaging models.”

    KDE’s Krita solves almost the same problems from a different technical point of view. They will say that Krita is more oriented towards freehand drawing and towards using tablets. Both problems could be solved by improving the GIMP, for a fraction of the development effort of Krita.
    The GIMP, however, is ugly, convoluted, and built around GTK, which doesn’t have the best reputation around.

    A “business user” point of view would never understand why Krita exists. Your own argument proves LHB’s point.

  117. Tuomov, the developer of the Ion window manager, has some interesting Linux criticism, too.

    Tuomov’s main point is, I think, simplicity.
    I like simplicity too, but I am not a computer expert (just a musician), so I would like to read your comments about…

  118. A few comments: for one, LHB does not have the guts to allow comments on his posts …

    Audio absolutely sucks in Linux!!! When I have Firefox running playing a YouTube video, if someone calls my SkypeIn phone number, Skype cannot make any sound, etc. … It may be that Fedora did a bad job, but it definitely sucks!

    How someone can dedicate so much time to analyzing and criticizing Linux is beyond me! Is he being paid by Microsoft or Apple?
    I mean, someone with a normal brain should either like Linux or not spend so much time on it.

  119. I remember a time when Bill G. behaved like the world’s CTO and CIO rolled into one person without ever being officially appointed. What a media event it was when Bill retreated into the wild, to his cabin, to envision new IT technologies and methods. Everybody waited eagerly for the messiah to come back and announce the future of our digital lives. Afterwards you paid a lot of money for only one special MS service, which was to be patronized. In that situation your obligation as a customer was to keep not only your OS in a good mood, but MS’s hotline guy as well.
    With GNU/Linux distributions breathing down their neck, they have put the customer back in focus. This is good for business customers, who have a healthy power of demand again.
    But even with a tamed IT company, for private matters there is no way around open and publicly owned software. Nowadays your whole life is digitized on your PC. It must not be exposed to commercial interests by any means. In the end, money-making comes first, then the customer, and then prosperity for the rest of humankind. This includes Apple and Google as well.

    The open-source community is far from perfect, because of normal group dynamics. But it doesn’t have to hide this either. I favor every controversial debate in the community over streamlined one-way information from the marketing department of a big company. But on the other hand, the debate should be reasonable. Therefore I am happy about how your article handles the LHB blog in a constructive way.

    Regarding GIMP: http://gui.gimp.org/index.php/GIMP_UI_Redesign.

  120. @esr: I’m kinda assuming you are the blogger here. Hmm – Lazarus Long as a quote? Why not credit the actual author? Smacks of fantasy, dude. So, now that I have left a little attack, and perhaps taken you off balance, we will proceed. The author you quote famously touted engineering and logic. And you have some very well reasoned and rational thoughts in this post. There are two, however, on which I quite disagree with your thinking and have to throw my support more to LHB. Those are your comments on “Feel the Source” and “The Fallacy of Choice”.

    A little intro – I’ve been the office/family geek for about a decade and a half. Almost all of that has been DOS/Windows, with a little Novell thrown in, and multiple attempts to integrate Linux into the mix. At one point, I had a healthy income due to my ability to help people with their computer issues in a Windows world. The Linux attempts were, for the most part, “Fail”, until this year. It took a significant lurch, and a significant learning effort, but I am now somewhat Linux literate.

    My observations have been that installing the software functionality one is accustomed to in the Windows world is a major PIA in the Linux world. Synaptic be damned. Let’s say I’ve gotten used to a neat little feature like having my desktop background automatically change, using pics of my choice. Is this easy for me in the Linux world? No – there are choices, but they are difficult to locate, difficult to install, and they may not even work – and I have no way of knowing. Somebody else will say how easy it is to google these things and then install them – but I am using the example of a real user, and not a dummy, who was NOT able to find this functionality that way. That user isn’t alone – he is just the tip of the iceberg. Like LHB said, most users don’t really want all that choice. Especially when most of those choices are substandard compared to what Our User is used to.

    And the three-tier distribution model that you mention? It sucks – because EVERY distro has to do it. If it isn’t in the distro repositories, Our End-User is stuck. The software distribution model used by Linux is a major weakness. It needs to be addressed, it needs to be addressed cross-distro, and it needs to be addressed by something better than configure/make/make install. For each distro to go completely its own way weakens the OS as a whole. For an end-user to have to configure software is just plain a no-go. I’d guess 85% of end-users have to click and go, or they won’t go at all. Maybe 10% on top of that will give a reasonable attempt to make things work with a few clicks and commands – but not as much effort as would be required if the software you wanted was not in your repository.

    Add to this that the FOSS model doesn’t work (or, at least, not as well as . . . but read on). Oh, it is indeed valid for a small segment of software implementation (1%??). But we have real-world confirmation that the proprietary model does work, and that freeware also works. And that the something-in-the-middle, shareware, also works, but just not real well.

    Regards; Hiero2

  121. Ah! I forgot one other point – LHB uses Linux at work, by his/her own statements in the LHB. On this basis, those who believe that LHB secretly admires Linux could be completely wrong. LHB is no different from a disgruntled cubicle Windows user – I believe LHB HAS to use Linux, and therefore has NO allegiance or other emotional tie. Although there also appear to be references to attempts by LHB to improve the lot of Linux by getting the Linux powers-that-be to listen and to change. So I conclude that LHB has to use Linux, but wishes that it would do better than it does, for philosophical as well as practical reasons.

  122. I just glanced through the LHB. I won’t go into the parts that are too technical, which I don’t know much about; on those I keep zipped.
    BUT my biggest question is: why do these folks take every tiny part/project/piece of software, like Amarok, or Linux folks saying Linux is virus-free, and let it bug them soooo much?
    DOESN’T EVERY FRIGGIN’ PRODUCT-MANUFACTURING COMPANY DO THAT? Say and praise their own products whether they are bad or good? I never hear anyone saying how annoying some parts of the Windows or Mac OSes are (except for IE).
    It’s like since they are getting it for free, they can shit on it as much as they want. It’s like a psychological effect: when they buy some crap and have to pay hard-earned cash, that thing INSTANTLY becomes like a second child, and since something of almost the same caliber is being produced by a group of folks who are not being douchebags like some others, it’s suddenly not worth it ~
    (The really annoying thing is when someone uses pirated Windows and says “I lub Windoze and I hate Linux coz I never used it and never will”; really, how do you deal with these people?)
