Warmed-over Vista

OK, it’s now definite. Windows 7 is gonna suck, and suck hard.

The plausible suspicion all along has been that it’s the Vista codebase with a superficial paint job. Now it appears there are holes in the paint; the public Windows 7 beta describes itself as Vista.

A Slashdot article parses the hints that there will be no beta 2, nor a release candidate 2 – they’re going to go straight to a full production release within a few months. There is no possible way for Microsoft to address architecture-level problems, fix the driver model, or claw back their lost application compatibility in that time. In effect, Windows 7 will be a Vista service pack with a fancy new brand on it.

Microsoft is behaving as though it believes that Vista’s problems were nothing but PR, and that by rebranding and spinning up the hype engines they can overcome those. The results when this strategy collides with reality should be … entertaining, to say the least.

150 comments

  1. To be (semi-) fair, the Windows 7 beta is much less of a resource hog than Vista was, so they’ve likely addressed at least that aspect of the concerns with Vista. Still, it’s pretty clear from playing around with it that it’s a Vista makeover — there are some minor changes in the UI, but on the whole it offers the same experience.

  2. Meanwhile, have you noticed the explosion of Linux and Linux-based operating systems going on in the smartphone and netbook markets right now? Intel itself just announced the alpha of a Linux distro for netbooks. The empire is falling more loudly of late, it seems.

  3. The first link just shows that the technical writers forgot to update a file. It doesn’t mean the code hasn’t been redone at all. It just means they’re re-using at least one help file and didn’t update it. They should certainly be spanked for that — the tools have been in place for a very long time to manage your product name in help files on a global basis.

    (I’m a tech writer and tools engineer for publication systems.)

  4. It is so obvious Microsoft is still in pain from Intel’s “right hand turn”. They expected 4GHz processors to smooth out Vista and beyond; instead they got dual-core 2GHz, and the new product growth market is single-core, power-optimized, in-order-execution 1.6GHz.

    Microsoft has directly communicated that Windows 7 is entirely based on Vista, with some (probably minor) changes; some details: http://channel9.msdn.com/shows/Going+Deep/Mark-Russinovich-Inside-Windows-7/

    For another anecdotal Linux experience: Christmas 2007, my nearly-20-year-old brother-in-law showed up with a Gateway laptop running Ubuntu. He had figured out how to nuke and pave Windows, install whatever software he needed, play his MP3s and use all of his web-based applications (of course).

    So I ask him, “You know what is cool about Linux right? You can do everything from the command line.”

    He replies… “What’s the command line?”

    I love this story. I probably tell it too much.

  5. Super! Vista works great for me. I was afraid they would bow to nay-sayers and fuck it up. Evolutionary changes all the way.

    I am tired of all the morons who never actually used Vista complaining about it. I use it every day, at home and at work, on updated hardware of course, and it rocks. Polished too, unlike the horrible Linux fonts that make the ‘Net look like ass.

  6. Eric,

    What do you mean fix the driver model? You mean make it broken again from a security standpoint, as it was before Vista?

  7. Of COURSE Win7 is Vista underneath. What, you thought they re-invented the whole thing from scratch?

    I have Vista Ultimate at home and at work, and have had 0 problems since turning off the user account control stuff in hour #1.

  8. >You mean make it broken again from a security standpoint, as it was before Vista?

    Not so much that as the API changes. I’d read somewhere that Vista has an entire new graphics/multimedia framework and that patchy support of the older APIs was tanking a lot of apps.

  9. Not so much that as the API changes. I’d read somewhere that Vista has an entire new graphics/multimedia framework and that patchy support of the older APIs was tanking a lot of apps.

    Thanks for clarifying. You’re right; there’s no fucking excuse. Free software guys have gotten surprisingly far in replicating the old APIs in a completely foreign environment (Wine); Microsoft has the resources and incentive to provide compatibility shims that work.

  10. Thanks for clarifying. You’re right; there’s no fucking excuse. Free software guys have gotten surprisingly far in replicating the old APIs in a completely foreign environment (Wine); Microsoft has the resources and incentive to provide compatibility shims that work.

    Sometimes to get a better design, you have to break backwards compatibility. Many key bits of Microsoft software have huge gobs of code dedicated to making small quirks from previous versions work. I agree, though, that a compatibility layer separate from the OS is appropriate.

  11. Now it appears there are holes in the paint; the public Windows 7 beta describes itself as Vista.

    That screenshot is hardly definitive; it’s been fairly well-known for a long time that Microsoft has no concept of setting an environment variable or anything else that could change the product name in a snap; it’s hard-coded into various places. Likewise, they don’t use any version control at all for Windows!

    At any rate, you have to be a fool to think that Microsoft has both the inclination and the manpower to fix fundamental problems in Windows. They have never done so, going back to the very first versions (both the DOS-based ones and the NTs), and Windows 7 is really just some minor changes to Vista; that makes it the first version ever to be faster and lighter than the previous one, but it’s not due to any real fixes to the design.

  12. …they don’t use any version control at all for Windows!

    I’m told that Microsoft actually wrote its own version-control system they use on their internal projects, called “Slime” (sort of an acronym for “Source Library Manager,” or perhaps the actual acronym was “SLM” and employees just called it “Slime”). They tried to turn that into an actual product to complement their other development tools, calling it “Microsoft Delta,” but it was poorly received and Microsoft wound up buying OneTree and turning their SourceSafe product into Visual SourceSafe.

    Whether Slime is better than “no version control at all,” I haven’t heard, but my guess would be “yes, but only just.”

    (Disclaimer: Around the time MS brought out Delta, I was working for another company that also made version control and configuration management software for Windows. It’s now very hard to find any information about Delta on the Internet.)

  13. I think the big problem with Microsoft is that they design for “average” people. And average people tend to be nowhere near as smart as software engineers. So Microsoft developers are designing for people they might consider to be beneath them. As Paul Graham points out, this tends to have less than optimal results.

    Of course, Linux can occasionally be rough going, but one can tell in the design that developers treat users as equals by refusing to condescend to them. Mac OS X assumes that most of its users are smart, but just happen to have better things to do than mess around configuring their computer. Of course, there’s a terminal for the geeks.

    In both cases, the developers seem to be designing stuff that they’d actually use. That’s what makes the difference between Windows and Unix.

  14. Mac OS X assumes that most of its users are smart, but just happen to have better things to do than mess around configuring their computer. Of course, there’s a terminal for the geeks.

    I think it’s more accurate to say that OS X assumes that its users will think it is well designed because it’s based on Unix. This is very much not the case.

  15. >I think it’s more accurate to say that OS X assumes that its users will think it is well designed because it’s based on Unix.

    Hah. Most Mac OS X users have no frickin’ idea what Unix is, and wouldn’t give a fart if you tried to explain. This I know from experience.

  16. Sometimes to get a better design, you have to break backwards compatibility. Many key bits of Microsoft software have huge gobs of code dedicated to making small quirks from previous versions work. I agree, though, that a compatibility layer separate from the OS is appropriate.

    Marshal, you are absolutely right about breaking backwards compatibility. The thing is Microsoft could have done it right, but their history is littered with ur-doing-it-wrong moments that leave their user base saddled with yesteryear’s tech when it’s well past its expiration date.

    Apple has, to my knowledge, never gotten it wrong.

  17. >Hah. Most Mac OS X users have no frickin’ idea what Unix is, and wouldn’t give a fart if you tried to explain. This I know from experience.

    I don’t think most Windows users have any idea what DOS is, either.

  18. @Jeff: backward compatibility is the most important thing for an OS, much more important than any revolutionary design decisions. You (the ISV) can just ship a binary and be sure it can be run by an average user – now and six months later, when the OS gets updated. Apple paid less attention to backward compatibility and has 7% of users while Microsoft has 90%.

    (The Linux world doesn’t pay any attention to backward compatibility; that’s why there’s no market for commercial software for it.)

  19. Hah. Most Mac OS X users have no frickin’ idea what Unix is, and wouldn’t give a fart if you tried to explain. This I know from experience.

    You are right. However, there is a sizeable contingent that is pro-OS X solely because it is based on Unix. My primary laptop is a MacBook, and I have happily run OS X for years. I have also run Windows, Linux and FreeBSD continuously over that period, quite happily, on various hardware. Frankly, it becomes tiresome listening to people who believe one OS is inherently superior to another. The idea that OS X is well designed (untrue) because it is based on Unix is the most frequent diatribe I have to listen to, hence my exaggeration.

  20. I think the big problem with Microsoft is that they design for “average” people.

    That’s not actually a problem now. It is plain to see that MS aims at a different target. Windows’ UI and user-friendliness give a semi-literate man or an 80-year-old fart (a fossil, better to say) the opportunity to get some stuff done. For a non-geek, working with a Unix-based OS and actually doing serious work is not impossible in practice, but it is a real pain in the ass. But my point is: given the significant superiority of FOSS, we can hopefully predict that as the quality of general education concerning computer software gets better and the average man’s competence improves, the general tendency towards Unix will increase significantly.

    The problem is the gap between savvy programmers and ordinary people. Bridging this gap results in dramatic reduction of MS’s share of users.

  21. > Hah. Most Mac OS X users have no frickin’ idea what Unix is, and wouldn’t give a fart if you tried to explain.

    Which is where linux distros have, to this point, failed to “catch the ring”.

    Of course, when you have Jordan Hubbard maintaining the “unix” layer (as Apple does), you have someone in your pocket that no current linux distro can match.

  22. @steven Ray: Why do you think that “ordinary people” will ever be closer to savvy programmers? Let’s analyze how it was with the cars: during the first decades when “automobiles” were new, every driver was also a car mechanic. He had to be, because there were no service centers and such.

    Now, every grandmother is able to drive and they sell cars with the hoods sealed. Of course, there was an exception in our “beloved” Soviet cars, which required advanced technical knowledge from their owners because of extremely poor workmanship and a (practical) lack of service centers in the Soviet Union.

    I’m a programmer myself and I’d like people to be more tech-savvy. That way Adobe would be more eager to support my OS of choice (FreeBSD) with its products. But the reality is that the average computer user is going to be less and less tech-savvy, the same way as car drivers. There was a time when having (access to) a computer equalled “being a programmer”. Those times are gone for good :(

    It’s much more likely that the computers of the future will be like the PS3 – yes, you are able to install Linux on it (because Sony recognizes and supports your legitimate wish to learn about the hardware etc), but Linux runs inside a hypervisor OS which ensures that you don’t get “too much” access to the hardware (e.g. you can’t get at the RSX (video) chip). If you want to develop for it seriously, go buy a Sony SDK license.

    And millions of people will “vote” for such future by simply buying those “computers”.

  23. Mike Swanson > Likewise, they don’t use any version control at all for Windows!

    From a comment to the Vista shutdown menu story at http://moishelettvin.blogspot.com/2006/11/windows-shutdown-crapfest.html

    “It’s their setup that’s all wrong, not the cost of hardware or software. The current source code management setup in Windows was introduced by Mark Lucovsky (him who Ballmer threw a chair at when he announced he was joining the big G). The VCS is called SourceDepot and it’s a modified version of Perforce.”

    This is from 2006 and describes the system that was used for Vista.

    Erbo > Microsoft wound up buying OneTree and turning their SourceSafe product into Visual SourceSafe.

    VSS apparently has been very, very prone to having its repositories corrupted. I don’t know if the current versions are better. This was written in 2006: http://www.codinghorror.com/blog/archives/000660.html

  24. ESR in “predicting that Windows is going to suck” shocker :-)

    As a point of anecdotal evidence, I’ve been using Vista at work for a couple of months now. It’s… well, there aren’t any showstoppers, and I have yet to see a BSOD, but there are lots of minor UI irritations. You can get work done on it, provided you install Cygwin, but overall I’d much rather be using Ubuntu. A colleague’s been running the Windows 7 beta for a while now, and finds it much preferable to Vista.

    On second thoughts, there’s one huge annoyance – the lack of a package manager. I spent literally days setting up my new machine with all the software I needed when it arrived; on any modern Linux distro, that would have been one yum or apt-get invocation, plus a trip out to get some lunch while the details were handled for me.

    > Microsoft is behaving as though it believes that Vista’s problems were nothing but PR, and that by rebranding and spinning up the hype engines they can overcome those. The results when this strategy collides with reality should be … entertaining, to say the least.

    I’d love to agree with you, but let’s not forget that they’ve had a lot of success with this strategy in the past.

  25. Interesting: a few months ago I had heard from a supposed Microsoft employee that they didn’t use version control at all; apparently the man was lying (I talked to him on some IRC channel, and the identity was quite likely fake, although there was little reason to believe anyone would want to pretend to work at Microsoft). On the other hand, the truth seems to be not much better than having nothing at all, heh (assuming that this one’s not lying, but I have no particular reason to believe so).

  26. I’ve been using Vista for over a year. I wish every day I could roll this machine back to XP. I wish I could roll it back to Win2K. I’ve never had a machine with more horsepower than this one, both in processing power and amounts of RAM, but Vista is balkier, slower, and less stable. I went for months without having to re-boot XP; I frequently have to re-boot Vista. Just for one example. Most of the frustrations, I just put up with, like the waiting for simple things to happen. For instance, why should Solitaire take more than a couple of seconds to load? Yet it frequently takes up to 10 seconds. Most of that is waiting for something to begin to happen.

    Then there’s the money I have had to spend to buy new copies of programs that worked fine on XP and which I must have for business. I understand why MS can’t support all legacy software forever, but none of these programs were DOS-based or anything like that.

    Here’s an example: You take something with an installed base as wide as Palm PDAs, and the Palm Desktop 5.x which has worked fine, and now it won’t work on Vista. Palm cobbled together 6.x, but it does not work as well — it is monochrome, so you lose some information that you get by using color on the PDA (categories and such). I guess that is partly Palm’s problem. My question is, why did they have to fix something that wasn’t broke? (And no, that one I didn’t have to buy.)

    While I don’t take the time to stay as educated about these issues as I once did, I am still in the top 20 percent of everyday users (not a computer systems professional of any sort.) Maybe the top 10. I say this because I realize I may have said something which would be obviously wrong, to a professional. My point is, if it isn’t working for a guy like me, it isn’t working for a lot of people.

    Hmm. Maybe my problem is that I know enough to be frustrated with it. Maybe all the grannies and grampas out there who just want to use Quicken and send photos of the kids through email don’t know how wrong it is. “Pretty new interface! Why all the complaints?”

  27. @RCL:

    Why do you think that “ordinary people” will ever be closer to savvy programmers?

    Because as civilization prospers, computer-related jobs and disciplines grow, or at least the computer [and in this case software] tends to become an indispensable part of almost any profession. Keeping that in mind, a user [not necessarily a programmer] will probably feel the need to learn more about software in order to choose the one(s) that fully fit his needs and, as a result, make better products. When his knowledge of software increases, he can make a better judgment about quality and efficiency. Dismal standards such as merely looking at the UI and dumb-ass-friendliness will lose their priority. OK, I have my own doubts and maybe that is an oversimplification. Maybe Unix will never find its way to my grandpa’s house unless developers pay more attention to the aforementioned criteria. But the issue is not my grandpa; he is the problem. Future generations with high quality education IMHO, will at least urge MS-type companies to remarkably level up their products or take their balls and go home.

    But the reality is that the average computer user is going to be less and less tech-savvy, the same way as car drivers.

    Look, I don’t say your premise is completely wrong, but I think you’re going too far with that analogy. People do care about the technical aspects of their car before choosing one, which means that the car factories are not just concentrating on exterior design or quality seats. In software, [unfortunately, usually] better-quality stuff is not so easy to use, but that is not the case with cars. Here’s a question: Suppose the company X produces a kind of cheap car which has significantly better features than the ordinary ones [e.g. it can fly over other cars in heavy traffic]. The problem is that it lacks a sexy design and is a relatively hard type to master. What will happen in your opinion?

    There was a time when having (access to) a computer equalled “being a programmer”.

    Really? When? AFAIK, a programmer was someone who had read Peter Norton’s book on Assembly. I mean it could have been worse. As ESR says there were hackers who didn’t have any computer.

  28. @steven Ray

    But the issue is not my grandpa; he is the problem. Future generations with high quality education IMHO, will at least urge MS-type companies to remarkably level up their products or take their balls and go home.

    Well…
    1) I don’t think that there’s something wrong with current generation and that the next one will be cardinally different.
    2) I think that your presumption that MS success was possible only because of “dumb users” is incorrect.
    3) I also don’t share your (implied) opinion that people, once becoming educated, will switch from MS to Unix. I know quite a few highly knowledgeable people (employed mostly in the game development business) who prefer MS products because of their technical qualities (e.g. there’s no such thing as DirectX/Direct3D anywhere else).

    Here’s a question: Suppose the company X produces a kind of cheap car which has significantly better features than the ordinary ones [e.g. it can fly over other cars in heavy traffic]. The problem is that it lacks a sexy design and is a relatively hard type to master. What will happen in your opinion?

    My answer: major car industry players will quickly catch up and add flying abilities with “sexy design” and simplified steering to their own models. If they don’t, it means that those new features do not solve any real problems and company X’s car is going to be a niche product like Segway.

    The same applies for software. Usability is also a sign of quality… It’s easier to understand if your native language is not English, because you’re then more often exposed to localization problems, which ruin your user experience. Unfortunately, the FOSS world is not that keen on fixing them, and even now it’s not that convenient to work with Cyrillic-named files in Unix (e.g. lack of uniformity between X and the console).

  29. Mike Swanson: that person lied to you. In 1997 I worked for a guy who had interned at Microsoft in the previous summers and he mentioned Slime pretty much like Erbo did, didn’t have anything bad to say about it and also said they had one guy who worked full time maintaining it.

    I suspect it wasn’t too bad; it was certainly developed at a time when Microsoft was doing better stuff, especially in development tools. (Which doesn’t explain why the flaky “trashes repositories” SourceSafe of the ’90s is still trashing them a decade later, but then again that’s a bit of dog food they apparently don’t eat themselves).

    – Harold

  30. This all seems oddly familiar…anybody remember Windows ME, the successor to Windows 98 SE? In that case, WME was a minor, rushed upgrade of Windows 98 SE (which was generally considered to be a successful version of Windows)…of course it was a complete disaster because it contained major stability regressions and many of the new features it did provide were badly broken. It’s pretty clear that it was an ill-considered and costly stop-gap to buy time until Windows XP came along. MS cannot afford another one of those.

  31. Hah. Most Mac OS X users have no frickin’ idea what Unix is, and wouldn’t give a fart if you tried to explain.

    To Mac’s target audience, “Unix” is a buzzword thrown about in movies like Jurassic Park to make computer shit sound cool. Knowing that their Mac has Unix underneath just makes them feel even more studly about owning a Mac.

    That’s about the only function Unix ever need serve to the vast majority. They want a computer that works, lets them get shit done, and maybe looks really cool. The Mac succeeds on these points. A Linux-based desktop does not.

    Also, I would be curious to know how many Linux users actually realize that Linux is not a Unix, whereas Mac OS X definitely is.

  32. @RCL:
    I don’t think that there’s something wrong with current generation and that the next one will be cardinally different.

    Nothing is wrong with the current generation. Please don’t put words into my mouth. Next generations are ahead in time. They’re “living in the future” and naturally enjoying the benefits, not inherently superior.

    I think that your presumption that MS success was possible only because of “dumb users” is incorrect.

    Did I say that? I said “Bridging this gap results in dramatic reduction of MS’s share of users.” I never said MS will lose all of its market share. Yes, there are still some areas in which open source sucks and there’s no better alternative than using proprietary products.

    I also don’t share your (implied) opinion that people, once becoming educated, will switch from MS to Unix. I know quite a few highly knowledgeable people (employed mostly in the game development business) who prefer MS products because of their technical qualities (e.g. there’s no such thing as DirectX/Direct3D anywhere else).

    A good example of what I meant by “some areas” above. Thanks.

    My answer: major car industry players will quickly catch up and add flying abilities with “sexy design” and simplified steering to their own models. If they don’t, it means that those new features do not solve any real problems and company X’s car is going to be a niche product like Segway.

    A real intelligent answer, exactly what I expected. Don’t you think that the same has happened [to a considerably lesser degree] in the software industry? This is what them kids at Apple were up to: a Unix-based and sexy OS. You are right. I’ve heard that the next Ubuntu releases will be [graphically] shaped like Mac OS X. Unfortunately it has been a slow process and one of FOSS’s weaknesses. Even the next generations are not going to completely neglect user-friendliness and GUI.

    The same applies for software. Usability is also a sign of quality… It’s easier to understand if your native language is not English, because you’re then more often exposed to localization problems, which ruin your user experience.

    You are right. Indeed, I am not a native English speaker and I know what you mean. Here, software engineers have been trying for years to develop a completely localized version of Linux. They haven’t finished yet :)

  33. >Also, I would be curious to know how many Linux users actually realize that Linux is not a Unix, whereas Mac OS X definitely is.

    “Linux is not a Unix” is disputable. I discussed this question in detail here; in the terms I set up, Linux is not a “genetic Unix” but is a Unix in the sense that it’s an intentional implementation of the Unix interface described, e.g., in POSIX and the Single Unix Standard. I think either usage of “Unix”, broad or narrow, is defensible, but prefer the broad usage myself.

  34. >Also, I would be curious to know how many Linux users actually realize that Linux is not a Unix, whereas Mac OS X definitely is.

    The Open Group actually sued Apple a few years ago over using the name ‘Unix’ in advertising without a license. OS X has since become a certified Unix in the legal sense.

  35. I think either usage of “Unix”, broad or narrow, is defensible, but prefer the broad usage myself.

    Have you talked this over with your wife the lawyer? Ever since Bayer lost the rights to their brand name for acetylsalicylic acid, companies have become more aggressive in discouraging trademark dilution. That’s why, for example, Xerox encourages you to speak of photocopying documents rather than xeroxing them; and Lego Group encourages you to speak of “LEGO bricks” and not “legos”.

    These companies can do little about individual cases of trademark dilution, outside of PR. But for a prominent figure such as yourself to openly encourage such dilution of the Unix trademark may warrant greater attention from TOG.

  36. I’d read somewhere that Vista has an entire new graphics/multimedia framework and that patchy support of the older APIs was tanking a lot of apps.

    I don’t see how you can have any valid opinions about Vista if this represents the depth of your knowledge with regard to its underpinnings. Vista does have an entirely new graphics framework. It hardly tanks any apps.

    Vista’s problems ARE 90% PR problems.

  37. “The Linux world doesn’t pay any attention to backward compatibility; that’s why there’s no market for commercial software for it”

    This is mostly incorrect.

    Linux has excellent backwards compatibility, on a par with the best that Windows has to offer. The only effect on commercial software is that sometimes (rarely) that compatibility comes at the cost of a re-compile. At worst it becomes a usability issue (how do you tell your package manager to keep version 2.1 of library x as well as 2.0?) and this is improving all the time.

    The reason for the “no market for commercial software” is more a profit/loss situation. Why build both a Windows and a Linux version when you’ll get the lion’s share of your users from the Windows version?

  38. Linux has excellent backwards compatibility, on a par with the best that Windows has to offer. The only effect on commercial software is that sometimes (rarely) that compatibility comes at the cost of a re-compile. At worst it becomes a usability issue (how do you tell your package manager to keep version 2.1 of library x as well as 2.0?) and this is improving all the time.

    Usability issues make or break end-user OS’s.

    And commercial vendors rarely ship source. Therefore, binary backwards compatibility is far more important than source backwards compatibility.

  39. >Sometimes to get a better design, you have to break backwards compatibility.

    true. but more often (overwhelmingly often), it’s down to cost considerations (eg macos vs carbon) or virtuousness considerations (eg python 3, which appears at a glance to have thrown away key aspects of its novice-user friendliness, eg loose print now tight and print() ).

    >>>>I think it’s more accurate to say that OS X assumes that its users will think it is well designed because it’s based on Unix.
    >>>Hah. Most Mac OS X users have no frickin’ idea what Unix is, and wouldn’t give a fart if you tried to explain. This I know from experience.
    >>You are right. However, there is a sizeable contingent that is pro-OS X solely because it is based on Unix. … The idea that OS X is well designed (untrue) because it is based on Unix is the most frequent diatribe I have to listen to, hence my exaggeration.
    >To Mac’s target audience, “Unix” is a buzzword thrown about in movies like Jurassic Park to make computer shit sound cool. Knowing that their Mac has Unix underneath just makes them feel even more studly about owning a Mac.
    >That’s about the only function Unix ever need serve to the vast majority.

    precisely. “unix” is a Badge of Virtue to most of the current mac evangelists (different from mac users, which has been eric’s recent experience). 99% of them don’t use any of the underlying capabilities (the 1% that does is typically genuinely competent, though). that’s not important to them. what’s IMPORTANT is the Virtue point of its magic-badge: “unix” underpinnings.

    cargo cult status.

    these people will typically go all sorts of funny colours if you point out that windows xp is identically unix. and explode if you point out it’s closer to mac HIG than macosx.

    RCL:
    >backward compatibility is the most important thing for an OS, much more important than any revolutionary design decisions. You (the ISV) can just ship a binary and be sure it can be run by an average user – now and six months later, when the OS gets updated. Apple paid less attention to backward compatibility and has 7% of users while Microsoft has 90%.

    quite the opposite, in fact.
    a KEY difference historically b/w apple and ms has been apple’s focus on backward compatibility. that’s the main reason they were so sluggish in introducing step-shift hardware upgrades, as the software needed serious effort to support both seamlessly. never heard of a “fat binary”? interestingly, this turned into forwards compatibility. macs remained viable wa-aaaayyyyy past their nominal hardware expiry dates, as they could still run modern fat binaries.
    as a result, on this machine 3-4 hardware re-architecturings later, i can still run apps on macosx which were written in ’88. several of the ones i run daily are 6 & 8 compatible.

    key exception: post-tiger. they seem to have recently decided that stuffing older-mac users is A Good Thing.

  40. > in the terms I set up, Linux is not a “genetic Unix” but is a Unix in the sense that it’s an intentional implementation
    > of the Unix interface described,

    Sure, but you decided to define the terms, and your conclusion here is no different than “Eunice is a Unix, too.”

  41. eg python 3, which appears at a glance to have thrown away key aspects of its novice-user friendliness, eg loose print now tight and print() )

    [looks]

    Euwwww…

    [sticks with 2.5]

  42. >But for a prominent figure such as yourself to openly encourage such dilution of the Unix trademark may warrant greater attention from TOG.

    The TOG can go piss up a rope. It’s not my job to defend their trademark – and, in fact, when the AT&T-vs.-BSD lawsuit went down in the early ’90s, long before TOG, I offered in my capacity as Jargon File editor to testify that ‘Unix’ had already turned into a generic technical description of a family of similarly-designed operating systems.

  43. @steven Ray:

    Glad we agree on the most of key points :)

    @Jon:

    Try running some Linux demos of the late 1990s and/or early 2000s from http://www.pouet.net. Demoscene is probably the only non-commercial developer community that (as a rule) does not ship sources (it’s the essence of demoscene competition – to be able to create something the others cannot), so everything you get will be binaries. Here you go: http://www.pouet.net/prodlist.php?platform%5B%5D=Linux&order=release&page=16

    Or try running Kylix (an abandoned commercial Linux project). Or ancient ported Linux games.

    I don’t have to argue – the binaries will tell you themselves everything about Linux backward compatibility.

    @Saltation:

    Ok, I’m probably incorrect about Apple not paying enough attention to backward compatibility. But they certainly paid *less* attention than Microsoft, which even kept the older bugs in newer releases. What’s interesting, it’s not as bad a move as it may seem: keeping old bugs means retaining more users, more users mean more developers (simple free market rule), more developers mean more bright minds (statistics), more bright minds mean fewer bugs to keep in the next platform iteration… You see, the evolutionary way to fight bugs is more stable than the revolutionary one.

  44. I need to clarify why “keeping old bugs means retaining more users” (most of you know that already, but I want to be understood by everyone, so forgive me for repeating the truisms): a lot of programs depend on some particular system behavior, be it a documented specification or just an implementation side-effect. If you introduce revolutionary changes (e.g. just fix old bugs or redesign a major API), you’ll incite a “civil war” on your platform, which will likely result in market fragmentation.
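
    To make that concrete, here is a minimal C sketch of my own (purely illustrative, not anything any vendor actually ships): a program that leans on a glibc implementation side-effect rather than on anything the C standard or POSIX guarantees. “Fix” the quirk in a later release and every program written this way breaks.

    #include <stdio.h>

    int main(void)
    {
        const char *user = NULL;   /* imagine a lookup that failed */

        /* glibc happens to print "(null)" when %s is given a NULL pointer.
         * The C standard makes this undefined behavior, so a libc without
         * that courtesy may crash right here.  Programs quietly come to
         * depend on quirks like this, which is why "just fixing old bugs"
         * breaks them. */
        printf("current user: %s\n", user);
        return 0;
    }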

  45. Try running some Linux demos of the late 1990s and/or early 2000s from http://www.pouet.net. Demoscene is probably the only non-commercial developer community that (as a rule) does not ship sources (it’s the essence of demoscene competition – to be able to create something the others cannot), so everything you get will be binaries. Here you go: http://www.pouet.net/prodlist.php?platform%5B%5D=Linux&order=release&page=16

    Ah yes, the demoscene. Mysterious binaries (typically) without source code nor proper names being used. You’d have to be an idiot to run those. Out of curiosity, however, I unpacked the very first tarball in your link and noticed a file called “demo.c” immediately. What is this? Why, it’s source code, of course; if you were to put enough effort into it, you could probably port it to SDL 1.2 rather than the SDL 1.1 used in the binary; there are bound to be several other issues too, thanks to sloppy coding.

    Plus, even if I were dumb enough to run it, I couldn’t: wrong CPU architecture! Time eventually makes all binaries obsolete. Well, I suppose I could download Red Hat 9 or something and run it in a virtual machine, but that’s too much effort.
    $ file demo
    demo: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.0.0, not stripped
    $ uname -m
    x86_64

  46. Ah yes, the demoscene. Mysterious binaries (typically) without source code nor proper names being used. You’d have to be an idiot to run those.

    If you subscribe to this attitude, then one hopes you also review the source for all the open source software you use. If you are capable of carrying out such a review competently, you probably don’t need the source to figure out how safe a small packed binary is. In any case, why not run it in one of your fantastic Linux sandbox accounts? Or do you not trust your open source software to be correct enough?

  47. @Mike Swanson:

    Ah yes, the demoscene. Mysterious binaries (typically) without source code nor proper names being used. You’d have to be an idiot to run those.

    While I understand such an attitude to the demoscene, I don’t share it – I’m deeply rooted in the demoscene and think it created an innovative and competitive culture, much more refreshing than the mostly uninnovative, redoing-the-proprietary-stuff FOSS world.

    Anyway, problems running Linux demos are not that demoscene-specific. Every binary distribution is going to suffer from the same problems.

    Including sources in Linux demos is (practically) a necessity. That’s why most Linux (and cross-platform) demos are so poor – too few sceners are going to “donate” their work to the public just because there’s no other way to run their demo.

    And that by coincidence is also the reason why they don’t make (with few exceptions) commercial software for FOSS OSes.

  48. And, in breaking news, it turns out Windows 7 UAC is both annoying and useless! Yes, folks, it’s possible for crackers to disable UAC without the user ever being notified!

    Lovely. All the vaunted advantages of the Vista security rethink just went up in smoke, leaving users stuck with all the confusing behavior and compatibility breakage, for no benefit whatsoever!

    Even by Microsoft’s standards, this is pretty rich. And they’re refusing to fix it — they say it’s working as designed.

  49. > The TOG can go piss up a rope. It’s not my job to defend their trademark

    Can you say, “Contributory Infringement”?

    Are you *sure* your wife is a lawyer? Because if she is, she either doesn’t have clue one about IP matters, or you’re just not listening to her.

  50. If you subscribe to this attitude, then one hopes you also review the source for all the open source software you use. If you are capable of carrying out such a review competently, you probably don’t need the source to figure out how safe a small packed binary is. In any case, why not run it in one of your fantastic Linux sandbox accounts? Or do you not trust your open source software to be correct enough?

    There is some flaw in your argument; not trusting some random binary from an untrusted source is quite different from assuming that all binaries are inherently evil. I trust the Debian team well enough to be sure that my operating system itself is not contaminated with untrustworthy binaries; APT uses GPG to verify packages as it installs them, and it will warn you when a package cannot be verified (and prompt you for whether to continue or cancel the installation). I use only one non-Debian repository (the VirtualBox non-free repository, for which Sun Microsystems also provides a GPG public key), and the few pieces of software I install manually are either things I’m already well-versed in or small enough to review before running.

    You also assume that a malicious program has to be large; I find that pretty ridiculous myself, since it doesn’t take much effort to write the equivalent of `rm -fr /` into it. Yes, I could run the binary in its own user account, but the effort would be fruitless; as I demonstrated, the binary isn’t even for my CPU architecture in the first place. If you still think I’m being overly paranoid, I probably won’t convince you to believe in safe computing habits; I just have to wish you luck and hope that you never get harmed because you downloaded some binary from the Ubuntu Forums and ran it as root!

    And, in breaking news, it turns out Windows 7 UAC is both annoying and useless! Yes, folks, it’s possible for crackers to disable UAC without the user ever being notified!

    This is also possible on Windows Vista itself — I do not know if the method is identical, but I know that there’s at least a similar method out in the wild. UAC just annoys your usage of (usually) legitimate programs and doesn’t do shit to stop Vista-aware viruses. :-)

  51. @Mike Swanson:

    By the way, why do you keep saying that it’s impossible to run 32-bit i386 binaries on an x86-64 system? With a 32-bit userland installed, you don’t need a VM for that.

  52. “I offered in my capacity as Jargon File editor to testify that ‘Unix’ had already turned into a generic technical description of a family of similarly-designed operating systems.”

    Really? Because you’re about the only person I’ve ever heard claim this. I can’t think of a person I know who doesn’t say that it’s a Unix style, which is far from claiming that it’s Unix. And if it’s in your little jargon file, it’s wrong. Just because you say it means this, or you’re familiar with a handful of “experts” who agree, doesn’t mean the community as a whole agrees. And that’s kinda’ the definition of definition – the larger community has to agree on what something means. Your jargon file has shown a few times that you may maintain the etymology of a word, but you don’t keep the meaning of the word.

    Linux took a good idea and stole it. Why, probably because they couldn’t come up with a good idea themselves. Why go through all the work of developing something really good when you can just steal it, right? And then lead people to believe that it is the original. I’m not saying it sucks, but face it, they didn’t want to play by the rules somebody else set and they ripped off Unix. There is no other way to look at it. They didn’t imitate; they went out to specifically implement the entire feature set, usability, and look and feel of Unix. And let’s not forget that Linux is purportedly supposed to mean Linux is not Unix, the whole recursive definition thing.

  53. You also assume that a malicious program has to be large; I find that pretty ridiculous myself, since it doesn’t take much effort to write the equivalent of `rm -fr /` into it.

    No, I do not. Demos are typically not that big, especially older ones. Ergo, a good programmer can unpack it and read all of the disassembly. Of course, open source is predicated on the notion that one requires source code to modify and inspect programs, but this is not even remotely true.

  54. >And let’s not forget that Linux is purportedly supposed to mean Linux is not Unix, the whole recursive definition thing.

    Heh. No. “Linux” was coined from the name “Linus”.

    >they didn’t want to play by the rules somebody else set and they ripped off Unix.

    Er…no. There was no “rule” against emulating POSIX and related standards. In fact, that’s exactly what those standards were intended and designed to allow.

    Given the ignorance you are displaying on these topics, your claim that everyone you know uses “Unix style” and I am therefore wrong wrong wrong should, I think, best be viewed as an unintentional joke.

  55. >Linux took a good idea and stole it.

    like BSD did. *shakes righteous fist*

    >Er…no. There was no “rule” against emulating POSIX and related standards. In fact, that’s exactly what those standards were intended and designed to allow.

    exactly.

  56. >And, in breaking news, it turns out Windows 7 UAC is both annoying and useless! Yes, folks, it’s possible for crackers to disable UAC without the user ever being notified!

    that [hole + article] is hilarious

  57. keeping old bugs means retaining more users, more users mean more developers (simple free market rule), more developers mean more bright minds (statistics), more bright minds mean fewer bugs to keep in the next platform iteration… You see, the evolutionary way to fight bugs is more stable than the revolutionary one.

    This reminds me of the tired and debunked argument that breaking a window stimulates the economy by creating work for glassmakers. Of course we’re still talking about broken Windows here, so…

    Your thesis is trivially debunked by the progression Windows 95 -> Windows 98 -> Windows Me. A major architectural change needed to happen in order for Windows to advance into the 32-bit preemptive-multitasking world. It did not happen in consumer versions of Windows. The result was a growing shambles of messy code with sploits coming out the wazoo. Apple has pulled off three such architectural changes: two CPU-architecture migrations and a complete OS architectural rewrite — while building in compatibility with the old stuff through shims up to and including CPU emulation. Such an effort would be much less difficult for Microsoft, who has stayed with the x86 arch as its bread and butter since time immemorial, yet it did not happen. A decent DOS emulation layer would have made Windows NT a viable commercial operating system for the home and office with a modern kernel in 1993. Their DOS implementation was half-assed. So instead they gave us warmed-over DOS with essentially “C:\WINDOWS\win” in autoexec.bat, and that itself was a horrid kitbash of Win32 functionality onto the Win16 layer. Meanwhile, at about that same time, the Linux guys had gotten DOSEMU working well enough to play Doom with full framerates and sound, and even boot Windows 3.x in some cases.

  58. Oh, and I should add that in 99% of the cases the Apple compatibility layers worked seamlessly with the new CPU/OS, out of the box. If Apple has dropped the ball on backwards compatibility, it’s mainly through deliberately dropping Classic, for example, but long after such an approach had proved viable for running old Mac apps.

  59. “Heh. No. “Linux” was coined from the name “Linus”.

    >they didn’t want to play by the rules somebody else set and they ripped off Unix.

    Er…no. There was no “rule” against emulating POSIX and related standards. In fact, that’s exactly what those standards were intended and designed to allow.

    Given the ignorance you are displaying on these topics, your claim that everyone you know uses “Unix style” and I am therefore wrong wrong wrong should, I think, best be viewed as an unintentional joke.”

    You are correct on the Linux part; it’s actually GNU that’s the recursive acronym (“GNU’s Not Unix”) – I got my recursion wrong. However, Linux is still not Unix, it’s a clone.
    http://www.kernel.org:
    Linux is a clone of the operating system Unix, written from scratch by Linus Torvalds with assistance from a loosely-knit team of hackers across the Net. It aims towards POSIX and Single UNIX Specification compliance.

    It has all the features you would expect in a modern fully-fledged Unix, including true multitasking, virtual memory, shared libraries, demand loading, shared copy-on-write executables, proper memory management, and multistack networking including IPv4 and IPv6.

    http://www.linux.org
    Linux is a free Unix-type operating system originally created by Linus Torvalds with the assistance of developers around the world. Developed under the GNU General Public License , the source code for Linux is freely available to everyone. Click on the link below to find out more about the operating system that is causing a revolution in the world of computers

    Now, since both of these organizations list Linux as either a clone or Unix-type, I think that implies that your statement was “wrong, wrong, wrong.” Unless both of these organizations are wrong, or I’m misunderstanding what they’re saying. If that is the case, could you please explain what I’m misunderstanding about how Linux is a Unix?

    Now, I didn’t say that they didn’t play by the rules of the standards; I implied, and I gather I wasn’t clear enough, that they didn’t want to play by the rules of their bosses/contracts and/or the software owners’ rules – i.e. we don’t want you to change this, or alter our code, or infringe on our proprietary rights. I believe that this is in part what started the drive for a more robust FOSS movement, as well as the development of GNU. Now, since the two sites listed above tout themselves as clones, this would imply that they liked what was in Unix, but didn’t like the restrictions placed upon them and so stole the ideas and implemented them as they saw fit.

    No there’s no rule saying they couldn’t implement/emulate the standards, but they tried to make it as similar to Unix as they could. Maybe I’m wrong, and I’ll freely admit it if you would care to enlighten me.

    *Note – S.T.E.A.L. – Strategically Transfer Equipment to an Alternative Location. I’m not saying that they necessarily did anything illegal, but they “stole” the ideas they liked and designed the system how they wanted it.

  60. >Now, since both of these organizations list Linux as either a clone or Unix-type, I think that implies that your statement was “wrong, wrong, wrong.”

    They’re making a propitiatory gesture towards the lawyers. But when programmers are talking about (for example) how to port code, “Unix” is a category that includes Linux. Nobody would bother to say, for example, “Linux and Unix signals are different from NT semaphores and events”; rather, they’d just say “Unix signals” and leave it understood that Linux signals behave in the same way unless otherwise explicitly specified.
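
    To illustrate with a minimal sketch of my own (purely illustrative, not taken from any particular codebase): the little signal handler below is written against nothing but the POSIX sigaction interface, and the same source builds and behaves identically on Linux, the BSDs, and the certified Unixes. That shared behavior is exactly what programmers mean when they say “Unix signals”.

    #include <signal.h>
    #include <stdio.h>
    #include <unistd.h>

    static volatile sig_atomic_t got_sigint = 0;

    /* Do only async-signal-safe work in the handler: set a flag and return. */
    static void on_sigint(int signo)
    {
        (void)signo;
        got_sigint = 1;
    }

    int main(void)
    {
        struct sigaction sa;

        sa.sa_handler = on_sigint;      /* plain POSIX.1, no Linux-specific calls */
        sigemptyset(&sa.sa_mask);
        sa.sa_flags = 0;
        sigaction(SIGINT, &sa, NULL);

        puts("Press Ctrl-C to exit.");
        while (!got_sigint)
            pause();                    /* POSIX again: sleep until a signal arrives */

        puts("Caught SIGINT; exiting.");
        return 0;
    }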

  61. “They’re making a propitiatory gesture towards the lawyers.”

    Why are they doing this then? If it’s gone into public domain as a uniform way of discussing Unix-style OSes, why should they need to do that?

    And since, in the case of your example, signals are based off of the POSIX standards, would it be more accurate to say that POSIX signals are different from NT semaphores? Yes, Unix and Linux both implement a standard, and they most likely do it differently (since I’ve not looked into Unix’s implementation I’m assuming here), but they both do it adhering to POSIX standards.

    From my experience Linux has been described as a Unix-like system. And what I took issue with is your claim that Unix is now a universal term for implementing these standards, while it is still being acknowledged as a separate entity by the groups responsible for it. In fact, for porting code wouldn’t the category be POSIX, with at least Unix, BSD and Linux falling into it? And isn’t this distinction inherent in the desire to have set standards within the community? Why call for standards if they’re not referred to? Why refer to them by the name of a system that implements them?

    This brings the problem down from an overview to a finer grain. If BSD implements the same POSIX standards, then why should there be a need to target BSD specifically if it’s implementing the standards as well? Or is this a general problem with the community, it wants standards, but doesn’t implement them consistently?

  62. >Why are they doing this then? If it’s gone into public domain as a uniform way of discussing Unix-style OSes, why should they need to do that?

    You packed two different claims into that sentence, one of which I’ve never made. “Public domain” is a term of art in copyright law, not trademark law, and you shouldn’t throw it around if you don’t know what it means (and, clearly, you are even more misled on this score than Jim Thompson is about what is and is not “contributory infringement”.)

    But the answer to your other question is that the lawyers have not caught up with reality and, indeed, have a vested interest in denying it as vehemently as possible. So you get ritual gestures on both sides — the kernel developers publicly pretend they’re not implementing Unix, and the lawyers publicly pretend to believe them.

    >Or is this a general problem with the community, it wants standards, but doesn’t implement them consistently?

    This is a general problem, mainly because standards certification involves a damned expensive test process and isn’t designed to cope with an OS that changes as fast as Linux does.

  63. You packed two different claims into that sentence, one of which I’ve never made. “Public domain” is a term of art in copyright law, not trademark law, and you shouldn’t throw it around if you don’t know what it means (and, clearly, you are even more misled on this score than Jim Thompson is about what is and is not “contributory infringement”.)

    http://fairuse.stanford.edu/Copyright_and_Fair_Use_Overview/chapter8/8-a.html
    Public Domain applies not just to copyright but trademark and patents as well. Before you start accusing people of not knowing what terms mean look into it yourself. Public domain refers to something that the general public has a right to use.

    OK, so let’s call it trademark infringement, since that’s what it is: Unix is a term that refers to a specific item, not a generic term like aspirin, band-aid, coke, or zipper.

    Since there is a history of Unix emulators trying to make the trademark of Unix into a generic term, and subsequently being sued, it seems unlikely that it is either proper or legitimate to state that Linux and Unix are the same.

    And this brings us back to the other question: if the trademark is invalid and in the public domain, then why should they have to worry about the lawyers?

    ““Linux is not a Unix” is disputable. I discussed this question in detail here; in the terms I set up, Linux is not a “genetic Unix” but is a Unix in the sense that it’s an intentional implementation of the Unix interface described, e.g., in POSIX and the Single Unix Standard. I think either usage of “Unix”, broad or narrow, is defensible, but prefer the broad usage myself.”

    I may be wrong, but when you said you prefer the broad term I think it could easily be read as you believe that Linux = Unix. Just because Linux doesn’t want to or isn’t capable of maintaining certification to be able to call itself Unix doesn’t mean that it’s Unix. Isn’t that the point of having standards?

    How does not being able to maintain certification = right to call it Unix? Is it not able to build and maintain the roots that need to be there for certification and then build from there? Or should standards change at the whim of Linux?

  64. >Public Domain applies not just to copyright but trademark and patents as well. Before you start accusing people of not knowing what terms mean look into it yourself. Public domain refers to something that the general public has a right to use.

    You’re still incorrect. Note that the page you cited was headed “Copyright and Fair Use”. “Public domain” is a term of art meaning “not covered by copyright”; neither lawyers nor the statute nor case law apply it to things that were or might have been trademarks. After reading that page, I can see how you might have been misled into believing otherwise; their wording is a bit loose.

    How do I know this? I’ve been involved in two attempts to establish a trademark and one attempt to dispute a trademark claim, and I listened very carefully to all three sets of lawyers. IP law is a complex, finicky area and the lawyers on your own side will kick you, hard, if you get the terminology wrong. Especially if you get it wrong somewhere the other side might be able to spin as a commitment to a particular legal posture — and I’ve been an expert on one case where the penalty for screwing up that way could have run to billions of dollars.

    “Trademark infringement” is what TOG would allege if a person who stood to gain from it asserted that Linux is a Unix. I can do that until continental drift pastes Pangaea back together without significant legal exposure because I’m not a principal, employee, or otherwise agent of any firm who stands to benefit from TOG’s trademark going pop like a balloon.

    I’m in an even stronger position because I’m a publicly recognized expert on what Unix is – I wrote The Art Of Unix Programming, you might recall, and there was no question in the mind of any reviewer or reader that the lessons of that book were intended to apply to Linux. If TOG were fool enough to sue me, all I’d have to do would be enter that book as exhibit A and it would not only nail the case shut but expose their lawyers to a substantial risk of bench sanctions for frivolous and malicious abuse of process. Judges really, really don’t like lawsuits that are attempts to gag a scholar or journalist.

    >Since there is a history of Unix emulators trying to make the trademark of Unix into a generic term

    There is no such history. From Eunice and Coherent onward, the authors of all the emulators I know about have been extremely careful to make ritual gestures of appeasement at the current holder of the Unix trademark rather than challenging it.

    >And this brings us back to the other question of if it’s trademark is invalid and in the public domain then why should they have to worry about the lawyers?

    I have at no point asserted that the term “Unix” is in the public domain. It is a registered U.S. trademark.

    What I have alleged is that in normal casual usage among programmers, “Unix” refers not only to OSes which are legally Unix but to others (such as Linux) that are behaviorally Unix as well. The way the title of my last book is interpreted by buyers and readers is a case in point. The language is out of sync with the law.

    If given the opportunity to testify, I would state that as a topic expert I believe “Unix” had already become genericized in this way in 1993 when the AT&T-vs.-BSDI lawsuit was going down; in fact I offered to testify to this point on BSDI’s behalf directly to Rob Kolstad, way back when. Had I done so, it is fairly likely there would have been no “Unix trademark” for TOG to inherit.

    >I may be wrong, but when you said you prefer the broad term I think it could easily be read as you believe that Linux = Unix. Just because Linux doesn’t want to or isn’t capable of maintaining certification to be able to call itself Unix doesn’t mean that it’s Unix. Isn’t that the point of having standards?

    Let me be very clear about this. At one time, before Linux’s standards conformance was as good as it is now, I would have said there was a good case for not describing Linux as “a Unix”, because it didn’t behave enough like a Unix. At present there are only two possible reasons to assert that Linux is not a Unix:

    (1) You care what TOG’s lawyers think. I don’t.

    (2) Linux, while almost 100% conformant with the TOG standards, is not certified for that conformance.

    To the second objection, the correct answer is “In 2009, there’s no reason anyone other than a TOG employee should give a flying fuck.”. The goal of those standards has been achieved — Linux and the open-source BSDs are the existence proofs. The certification process is an almost meaningless formality unless it happens to pay your salary.

  65. Thank you. Now I understand your position better. And I see where I made my mistakes in the matter. I’ve learned a few things and appreciate it.

  66. > (and, clearly, you are even more misled on this score than Jim Thompson is about what is and is not “contributory
    > infringement”.)

    Nice one, but even if this is a true statement, you haven’t countered my argument, because this reduces to “Jim Thompson knows more about subject A than you do about subject B”.

    Try again, esr-hole.

  67. > How do I know this? I’ve been involved in two attempts to establish a trademark and one attempt to dispute a
    > trademark claim, and I listened very carefully to all three sets of lawyers

    And were you successful? I didn’t think so.

    All the other back-pedaling just proves my prior point.

  68. (2) Linux, while almost 100% conformant with the TOG standards, is not certified for that conformance.

    Certification is everything.

    Linux lost out to Windows in many security-critical applications in the early 2000s. Why? Because Windows was certified to EAL4 in the Common Criteria; Linux was not certified at all. The certification may be bogus, but as far as Uncle Sam was concerned Windows was the more secure operating system.

  69. >Try again, esr-hole.

    That’s it. You were warned what the consequences would be if you descended to petty insults again. You are now banned, and your comments will be treated as spam as soon as I get my filters educated.

  70. >Certification is everything.

    No. Been there, seen this. At most, it generally provides a convenient excuse for something the manager doing the acquisition wanted to do anyway.

  71. I still have a few questions.

    “There is no such history. From Eunice and Coherent onward, the authors of all the emulators I know about have been extremely careful to make ritual gestures of appeasement at the current holder of the Unix trademark rather than challenging it.”

    Is BSD considered a Unix emulator? And if so weren’t they sued for infringement by AT&T in 1992? Or was it strictly copyright? I’ve found information for both arguments.

    Since a trademark isn’t public domain, as you’ve stated, where does it go when it becomes a generic term like aspirin, cellophane, zipper, band-aid, or coke? Yes, it becomes a generic term, but does it fall into the public domain, or into a different category?

    Again, I’ve found conflicting information about this.

    http://www.nolo.com/definition.cfm/term/4BEE68F3-F3CD-4A64-A0FF54C5F23E28E8
    has it defined as:
    A creative work, invention or logo that is available for use without permission from its owner. This typically occurs after patent, trademark or copyright protection has expired.

    And yet there are also arguments that, since the term public domain has been loosely defined throughout history, it’s a tough call.

    http://books.google.com/books?id=wHJBemWuPT4C&pg=PA164&lpg=PA164&dq=Legal+definition+of+Public+Domain&source=web&ots=1eYXOzAe-y&sig=X5hEMG0icjSh_uee0J_A5rHqK1g&hl=en&ei=8quISaKALZKWsQOZ_aCXBg&sa=X&oi=book_result&resnum=9&ct=result

    While the above link is specifically about copyright, it has some good discussion about a legal definition of public domain that may be relevant.

    However, my limited experience with this subject doesn’t lend any more information. (A couple of classes in Business Law, which included copyright, patents, and trademarks.)

  72. Since a trademark isn’t public domain, as you’ve stated, where does it go when it becomes a generic term like aspirin, cellophane, zipper, band-aid, or coke? Yes, it becomes a generic term, but does it fall into the public domain, or into a different category?

    Band-Aid® and Coke® are not generic terms. Listen carefully; the advertisements all say “Band-Aid brand” and if you order a Coke at a restaurant that has a distribution deal with Pepsi, they will say they serve Pepsi.

    To my knowledge a genericized trademark becomes effectively public domain, such that even a competitor to the mark’s former owner can use the name referring to its own products: for example, rival drug companies to Bayer can sell “aspirin”, and 3M could sell cellophane tape in the US (but not in the UK). That is the case for US law anyway; Wikipedia indicates other jurisdictions (such as Germany) may be different. Disclaimer: Neither I nor Wikipedia is to be considered authoritative on IP law issues.

  73. >I still have a few questions.

    Good, you’re asking the right person. I’m a topic expert in this area (i.e., the intersection of Unix history and IP law).

    >Is BSD considered a Unix emulator? And if so weren’t they sued for infringement by AT&T in 1992? Or was it strictly copyright? I’ve found information for both arguments.

    Depends on what you mean by “emulator”. But there are very few people who would have called it one; that term is normally reserved for emulation environments hosted inside operating systems with non-Unix primitives, and BSD was never like that.

    BSDI was sued on both trademark and copyright grounds. IIRC, the trademark charges quickly became a non-issue as BSDI agreed to change its advertising slightly to make use of “Unix” only as a modifying adjective rather than a noun. I have a much clearer memory of the copyright-violation charges because they were more substantial. They blew up in AT&T’s face when it became apparent that AT&T had stripped some Berkeley copyrights off code it incorporated in System V. The judge frowned menacingly at AT&T and told both parties to get the hell out of his courtroom and settle.

    They did. The resulting settlement freed BSD from AT&T IP claims. BSDI allowed the AT&T lawyers to save face by agreeing to remove and rewrite a tiny amount of code (3 files out of 1,300).

    (I was in communication with BSDI while all this was going on. The non-profit ISP I co-founded in 1993 was a BSDI site then, and I had a small reputation as a Unix expert and public advocate, so I had a conversation or two with Rob Kolstad.)

    >If, since a trademark isn’t public domain – as you’ve stated, where does it go when it becomes generic term like aspirin, cellophane, zipper, band-aid, coke?

    It would be called “unprotectable” or “ineligible” if anyone brought it to the USPTO again. I don’t know of a term of art more specific to trademarks than that.

  74. “Band-Aid® and Coke® are not generic terms. Listen carefully; the advertisements all say “Band-Aid brand” and if you order a Coke at a restaurant that has a distribution deal with Pepsi, they will say they serve Pepsi.”

    http://www.uspto.gov/main/glossary/index.html#g

    The definition of “generic” from the US Patent and Trademark Office states: “terms that the relevant purchasing public understands primarily as the common or class name for the goods or services.”

    And yet when most people I know cut themselves they ask for a band-aid, not a Band-Aid brand adhesive bandage or a 3M brand adhesive bandage, just “Can you get me a band-aid.” So the brand name in this case has become a generic term for the goods.

    And I know that in the south(being in the military I met many people from the south that always referred to a coke as just a soda,) and most likely other locations, the term coke is often a generic term for a soda, brand not-withstanding.

    And Jeff, we then get into the debate over what is public domain, which is the discussion that esr and I were having. Not to mention my avoidance of using Wikipedia as a reference, just because it’s not a fully trusted source; so I hold neither myself, you, nor it as an authoritative source.

  75. “(being in the military I met many people from the south that always referred to a coke as just a soda,) ”

    correction: referred to a coke in reference to a soda.

  76. “Apple has pulled off three such architectural changes: two CPU-architecture migrations and a complete OS architectural rewrite — while building in compatibility with the old stuff through shims up to and including CPU emulation.”

    Interesting. Are you sure? The best argument for why Windows became the de facto standard desktop was that Apple broke backwards compatibility four times, and the third-party vendors and the users just didn’t like seeing their investments go up in smoke. MS was religious about backwards compatibility; there are all these funny little stories in Raymond Chen’s blog, like how Sim City had some idiotic kind of memory management that it could get away with under DOS but not under a multi-tasking OS, so they put a special feature into Win95 to detect whether Sim City was running and handle it. Quite amazing, actually. Another fun story from RC is that some third-party software extracts the Win95/98/2K/XP animgif file-copy dialog from the resource file and uses it; they don’t even copy it and bundle it with the application, they just pull it in at runtime. So when, in Vista, they wanted to use something other than the animgif, they realized while implementing that something else that they had to leave the animgif in place, because one of the offenders is a Game of the Year and another is a popular anti-virus, and they cannot afford to break them.

    RC says their number one problem is third-party vendors’ idiotic and frequent reliance on undocumented features that nobody ever promised would be in the next version, but which they still have to support in the next version, because they just cannot afford to break popular apps.

    AFAIK Apple was never so religious about backwards compatibility.

    If the number one rule of an OS upgrade – or of buying a new model of Apple – is that the one absolutely unacceptable thing under any and every circumstance EVAR is to end up with LESS than I had before, especially losing the ability to run ANY software I ran before (and I think it is, even if the vendor is a dumbass and relies on undocumented OS features), and if Apple was just a little bit worse at this than MS, keeping backwards compatibility for the documented OS features but not for the undocumented ones, then that’s a good reason why they lost.

  77. k wood,

    As I mentioned, PR is about the only effective means a company has against casual trademark dilution by individuals. Even Coke does it — remember its slogans like “Can’t beat the real thing”? I suppose the ultimate litmus test is whether it will stand up in court: If Pepsi introduces “Pepsi brand coke” and Coke sues, the outcome of the case will determine whether the trademark is sufficiently generic.

    Shenpen,

    If the number one rule of an OS upgrade – or of buying a new model of Apple – is that the one absolutely unacceptable thing under any and every circumstance EVAR is to end up with LESS than I had before, especially losing the ability to run ANY software I ran before (and I think it is, even if the vendor is a dumbass and relies on undocumented OS features), and if Apple was just a little bit worse at this than MS, keeping backwards compatibility for the documented OS features but not for the undocumented ones, then that’s a good reason why they lost.

    No. Your theory (“Microsoft is religious about backwards compatibility”) is false on its face, and the reasons given are completely unjustifiable according to any sane design principles. Your theory is false because of all the products Microsoft deliberately broke compatibility with because they came from competitors in markets Microsoft considered critical — products Raymond Chen, a Microsoft employee, deliberately left out of his yarns. And bug-for-bug compatibility is never a sound design principle; not even Microsoft really believes that anymore as they broke a lot of stuff with XP and a lot more stuff with Vista. Thankfully, DOSBox exists to take up some of that slack if you are nostalgic for SimCity.

    As for Apple breaking backward compatibility — yes, they did it. Mac pointers used to carry only 24 bits of address, with the remaining 8 bits reserved for flags; when the extension to full 32-bit pointers took place, a number of apps, including Excel, broke. People complained, but Apple merely shrugged their shoulders and said “Those app developers should have used the documented APIs; they were warned.”
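
    To make the mechanism concrete, here is a minimal C sketch (purely hypothetical names and types, not Apple’s actual Memory Manager code) of why the change bit applications that poked at pointer internals instead of going through the documented calls:

    #include <stdint.h>

    #define ADDR_MASK_24  0x00FFFFFFu   /* under the old model, only the low 24 bits were address  */
    #define FLAG_LOCKED   0x80000000u   /* example of a flag sloppy apps poked into the high byte  */

    /* 24-bit-era habit: strip the flag byte by hand before dereferencing.
       It worked then, because the hardware ignored the top 8 bits anyway. */
    static void *deref_24bit(uint32_t master)
    {
        return (void *)(uintptr_t)(master & ADDR_MASK_24);
    }

    /* 32-bit-clean code must treat the whole word as an address and use
       the documented calls for flags; masking by hand now throws away
       real address bits.                                                  */
    static void *deref_32bit(uint32_t master)
    {
        return (void *)(uintptr_t)master;
    }

    Apps that manipulated the flag byte directly, rather than calling the documented routines, were exactly the ones that broke when all 32 bits became address.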

    According to Shenpen’s design ethos, the advantages to be gained from having full access to 32-bit address space would have to be forgone, because existing buggy behavior must be preserved at all costs. If you can’t see the problem with this, God help you.

    Apple lost because they are a hardware company. Microsoft whored out its OS to just about anybody. It was a Stallmanesque display of openness for its day.

  78. > The best argument of why Windows became the de facto standard desktop was that Apple broke backwards compatibility four times

    They also blundered on a bunch other things after Jobs was ousted, including at least: selling overpriced and underpowered hardware for a number of years, failing at the office market when it grew rapidly (maybe they didn’t even try very hard), waiting for Pink/Taligent/whatsitsname to materialize and being forced to stick to an outdated OS in the meantime.

    I often have the feeling that you don’t really have backwards compatibility in a piece of software unless you have the source. On free systems you can pretty much always compile old stuff and get it to work, even if it means doing various contortions to get e.g. badly written 15-year-old Fortran code that used to run on a number of obscure systems to work (been there, unfortunately). Granted, the result may do strange things in the new environment, but I still feel more in control than I would if I had to depend on a 16-bit Windows binary running on XP (not sure if that is technically possible, but you get the point).

  79. correction: referred to a coke in reference to a soda.

    This made me smile, as I have always heard “coke” used in reference to any soda.
    “Can you grab me a coke?”
    “Sure, what kind?”
    “Sprite.”

    I first saw some GNU software on a SunOS machine probably (guessing here, my memory is fuzzy) about a decade after Richard Stallman’s initial announcement. It was all completely new to me, and I didn’t understand the courtroom battles or Richard Stallman’s campaign for social change–I just knew that it “just worked.”

    Thanks for all the history and dialogues like this one, it’s good reading and interesting to learn.

  80. “Microsoft deliberately broke compatibility with because they came from competitors in markets Microsoft considered critical — products Raymond Chen, a Microsoft employee, deliberately left out of his yarns.”

    It’s news to me, but not a surprise. I’m not very surprised about MS breaking compatibility for the sake of market dominance; it roughly fits their corporate ethics, or the lack thereof. Still, the point is that these were exceptions to a general rule, rather than ignoring the general rule altogether (as Apple did).

    “Microsoft really believes that anymore as they broke a lot of stuff with XP and a lot more stuff with Vista”

    Not with XP. Really old accounting apps run on it, really old games too; the trick is usually just removing DirectX and installing an older version, and/or setting up a desktop shortcut in compatibility mode. With Vista, yes – I think Joel Spolsky blogged that the backward-compatibility-minded “Raymond Chen camp” finally lost to the neophile MSDN camp. With Vista. Not before.

    (Interestingly, the backward-compatibility of Office is horrible. A PowerPoint 4 document today is as good as lost.)

    “Mac pointers used to be 24 bits, the remaining 8 reserved for tags; when the extension to full 32-bit pointers took place a number of apps, including Excel, broke. People complained, but Apple merely shrugged their shoulders and said “Those app developers should have used the documented APIs, they were warned.””

    And the (somewhat early-adopter) end users, who recognized only that they had paid good money for an upgrade and, from their point of view, had been cheated because the upgrade cost them working software, became enraged and recommended PC/Windows to their friends. The rest is history. This is the usual case of technological excellence vs. what users need.

    “According to Shenpen’s design ethos, the advantages to be gained from having full access to 32-bit address space would have to be forgone, because existing buggy behavior must be preserved at all costs. If you can’t see the problem with this, God help you.”

    Surely there must be a way to achieve a good compromise. Run those old apps in some sort of a sandbox or whatever. I can’t really tell now what the best compromise is, merely that one can be found with enough effort. Not an elegant one, but a working one.

    “Apple lost because they are a hardware company. Microsoft whored out its OS to just about anybody. It was a Stallmanesque display of openness for its day.”

    Surely that was an important part of the picture, I agree. Abstracting the hardware away in WinAPI, rescuing users from hardware lock-in – while getting them into software lock-in, of course.

    I keenly remember the change in computing vocabulary: first IBM PC, then “PC-compatibles” and finally “PCs” which meant “anything Windows will run on”. This is no small achievement, given the sorry state of hardware-dependent “microcomputer” operating systems of that age. In fact I think besides the many reasons to despise MS this is one thing we should give them some credit for. I think the spectacular development and cheapening of hardware is largely due to this.

  81. Run those old apps in some sort of a sandbox or whatever.

    That’s what I was advocating.

    if (app == Excel) {
        useOldPointerBehavior();   /* hard-coded special case for one app, buried in the kernel */
    }

    somewhere in your kernel code is not a solution. A sandboxed compatibility layer for old apps (even a somewhat broken one) that doesn’t affect your kernel or base API development, and that screams “hey guys, update your shit because things work differently and better now,” is a much sounder engineering approach to the problem. Apple has provided that. Microsoft has kept old bugs, or an entire old crufty architecture (DOS + Win16 + as much Win32 as we can put on this house of cards), around in their core code for (some) old apps, and introduced new bugs to deliberately break others.
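
    To illustrate the contrast (hypothetical names only, not any real Windows or Mac loader API), a minimal sketch of the sounder approach keeps per-application quirks in a table consulted outside the kernel:

    #include <stdio.h>
    #include <string.h>

    /* One quirk record: which binary it applies to and which shim to load. */
    struct compat_entry {
        const char *app_name;
        const char *shim;
    };

    /* The table lives in user space and can grow without touching the kernel. */
    static const struct compat_entry compat_db[] = {
        { "oldspreadsheet.exe", "legacy-pointer-semantics" },
        { "oldgame.exe",        "report-win95-version"     },
    };

    /* Loader-side lookup; NULL means the app just runs natively. */
    static const char *lookup_shim(const char *app)
    {
        for (size_t i = 0; i < sizeof compat_db / sizeof compat_db[0]; i++)
            if (strcmp(compat_db[i].app_name, app) == 0)
                return compat_db[i].shim;
        return NULL;
    }

    int main(void)
    {
        const char *shim = lookup_shim("oldspreadsheet.exe");
        printf("shim: %s\n", shim ? shim : "none (run natively)");
        return 0;
    }

    The point is only the shape of the argument: the quirk list lives outside the core code, so it can be extended, fixed, or eventually dropped without touching the kernel or the base API.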

    Really old accounting apps run on it, really old games too, the trick is usually just removing DirectX and installing an older version, and/or setting up a desktop shortcut in compatibility mode.

    DOS compat went out the window with XP. Actually with all versions of Windows NT, but the justifying reason for Windows 9x was so you could keep running your old DOS stuff. Old games in particular were no longer usable, but a lot of nifty old apps too — things like ScreamTracker — died with the changeover from DOS to NT kernel.

    Building a home OS on NT technology was imho A Good Thing, but it should have happened eight years before it did, with an extremely robust DOS compatibility layer — with 16-bit instruction emulation if necessary. The Linux crowd has had DOSEMU for a long, long time and it’s superb — good enough to run games with full sound and graphics.

    In fact I think besides the many reasons to despise MS this is one thing we should give them some credit for. I think the spectacular development and cheapening of hardware is largely due to this.

    IBM deserves more credit for this, as well as the blame for the predatory business practices Microsoft uses. (Bill learns from the best (worst).) The only unassailable piece of IP in the whole PC architecture was the BIOS. IBM didn’t think enough of its desktop market to patent the various components and interfaces they put into it. Once IBM’s proprietary BIOS was reverse-engineered the doors were wide open.

    Also if you really want to see a company religious about backward compatibility, look no further than Big Blue. We owe the very concepts of CPU emulation and machine virtualization to work they did in the 60s. The things that can be done with IBM mainframes these days (keeping a crufty old app alive in one VM segment, while running something newer like Linux that provides a modern interface to that app in another) is astounding.

  82. >Interesting. Are you sure?

    100%

    >The best argument of why Windows became the de facto standard desktop was that Apple broke backwards compatibility four times, and the third-party vendors and the users just didn’t like their investments go up in a smoke.

    what a peculiar Justifiction (© saltation 2006 (2005?)). up until Apple killed 100% backwards compatibility with Intel macs, you could run every mac app on every mac ever built (exceptions (very minor, and nearly all are games): those apps which had overridden standards and written down to the bare metal on particular hardware). example: as i type this i’m currently running (as my standard email app) a program written in 1988. on a mac built 13 years later on a radically different unix-based OS built 18 years later. no tweakings or special knowledge required. examples: every macos4 app i have (still kept on this drive) still runs fine on macosx on a machine nearly a million times faster (seriously) than the best available at the time.

    hell, shufflepuck cafe still runs fine.

    whoever told you that needs a strong dose of Facts (available on prescription only. side effects include Reality and Rational Argumentation).

    the real reason windows became the de facto standard is they (originally) targeted the social processes underlying the overwhelming bulk of computer purchases (and hence: the mindshare leaders, influencing even personal purchases). specifically: key nexus = corporate IT depts.

  83. >Apple lost because they are a hardware company. Microsoft whored out its OS to just about anybody. It was a Stallmanesque display of openness for its day.

    Gassée’s legacy had hideous consequences. he drove around for a while with a car license[sic] plate “Open Mac”. because he allowed the SE to have an expansion port (SCSI).
    seriously.

  84. >Surely there must be a way to achieve a good compromise. Run those old apps in some sort of a sandbox or whatever.

    that’s exactly what apple did. 2 flavours with macosx. “Classic” — running on a VM; “Carbon” — running on a slightly cut-down Framework.

  85. heh. quote on slashdot currently:
    “David Gerard writes ‘Wine (the Windows not-an-emulator for Unix) runs
    Windows applications more often than not. (Certainly more often than
    Vista does.)’”

  86. >Battle for Wesnoth made the GearCrave Top Ten Open Source Games list.

    wot? no Dwarf Fortress?

    Dwarf Fortress isn’t open source.

  87. >Battle for Wesnoth made the GearCrave Top Ten Open Source Games list.
    >
    >Congratulations to you and the team.

    You know what’s really entertaining? I’ve been a developer on no fewer than three of those top ten. I didn’t contribute much to FreeCiv — didn’t get along with the project lead — but I was deeply involved with nethack at one time. In fact, GearCrave’s screen shot shows off two features I wrote personally — the use of IBM form graphics for room walls and the support for coloring artifacts and monsters (yes, the colors convey information and aren’t just random). This was in the late ’80s before bit-mapped graphic displays were common.

    I wouldn’t go so far as to aver that having ESR on your game project is a ticket to bright lights and world renown, but judging by GearCrave’s list it clearly does not hurt…

  88. There is a question I have wondered for some time, about IBM and Microsoft.

    Why in the world did IBM allow Microsoft to own and control DOS?

    They let MS walk away with the most valuable piece of the whole thing back in 1981. IBM succeeded in making their PC the dominant standard but they themselves derived precious little benefit from it. As everyone is no doubt aware, IBM sold their PC division to the Chinese company Lenovo. IBM is not even in the PC business anymore.

    The only explanation I have ever heard is that IBM got sued a lot, so they wanted someone else to do the OS. Certainly IBM must have had a lot of smart guys who could have come up with an operating system. Why didn’t they? Was this just, pure and simple, one of the most bone-headed moves in the history of American business?

    Imagine if IBM had controlled the OS. There might not be any Windows; hell, maybe we would all be using OS/2 Warp. Remember that IBM did eventually try to come up with their own OS, but it was too late.

    Can you guys shed a little light on this?

  89. >The best argument of why Windows became the de facto standard desktop was that Apple broke backwards compatibility four times, and the third-party vendors and the users just didn’t like their investments go up in a smoke.

    what a peculiar Justifiction (© saltation 2006 (2005?)). up until Apple killed 100% backwards compatibility with Intel macs, you could run every mac app on every mac ever built

    I’m not sure about Mac-to-Mac backward compatibility, but the Mac to Apple ][ lack of compatibility was a killer IIRC. I knew people who were still complaining about the lack of VisiCalc as late as 1997.

  90. I wouldn’t go so far as to aver that having ESR on your game project is a ticket to bright lights and world renown, but judging by GearCrave’s list it clearly does not hurt…

    I dare say the causative relationship is the other way round.

  91. Ubuntu Wipes Windows 7 In Benchmarks: http://tech.slashdot.org/article.pl?sid=09/02/05/1919259&from=rss

  92. The only explanation I have ever heard is that IBM got sued a lot, so they wanted someone else to do the OS. Certainly IBM must have had a lot of smart guys who could have come up with an operating system. Why didn’t they? Would this just pure and simple one of the most bone-headed moves in the history of American business?

    IBM was in the business of selling mainframes. They got caught with their pants down by the emergence of inexpensive personal computers; they had no idea how to cut a deal favorable to them in this foreign market. Nor did they care, since they foresaw neither the ascendance of their microcomputer architecture to dominance (they were an also-ran at the time) nor the dwindling of the mainframe into irrelevance.

  93. >I’m not sure about Mac-to-Mac backward compatibility, but the Mac to Apple ][ lack of compatibility was a killer IIRC. I knew people who were still complaining about the lack of VisiCalc as late as 1997.

    yeah, visicalc was a PROFOUND innovation, mostly for the financial markets. there were traders who would literally run around with their apple IIs under their arm.

    once excel came out, though, its profound dominance of visicalc meant they all switched essentially immediately.

    still the best spreadsheet on earth by a LONG way (wingz failed by not dominating excel in any user-key area)

    snobs should consider this description: an indefinitely flexible declarative coding + database environment, with full procedural capability/override, cutting-edge mathematical/ops-research plugins, and built-in WYSIWYG display/printing capabilities.

  94. >> I wouldn’t go so far as to aver that having ESR on your game project is a ticket to bright lights and world renown, but judging by GearCrave’s list it clearly does not hurt…

    heh

    >I dare say the causative relationship is the other way round.

    gah. you’re too good for us, Marshal. sure enough, as soon as any game got popular, Eric fired up his opensource tardis, went back in time, and got involved, ex post, in the good stuff once he knew it’d be worthwhile. well spotted.

  95. >Why in the world did IBM allow Microsoft to own and control DOS?

    micro-cultural reasons. PCs were a joke, a gimmick. some weird super-nerdy-looking loser with a lot of money (not many people know that the REASON gates dropped out of harvard and started up MS is that he inherited $10m. back then, that was SERIOUS money) pops up and offers to take some irritating trivial but expensive hassle off our hands, for something we’re only really doing because other people are making noises about it and we’ve barged in to show OUR DOMINANCE? hell, yeah!

  96. gah. you’re too good for us, Marshal. sure enough, as soon as any game got popular, Eric fired up his opensource tardis, went back in time, and got involved, ex post, in the good stuff once he knew it’d be worthwhile. well spotted.

    Wesnoth was out for some time before Eric made his first commit, and he didn’t have much to do with FreeCiv. Clearly, he chose to work on Wesnoth because it was promising, rather than it becoming a promising game because he chose to work on it. I don’t know when nethack became popular, so maybe he was a driving force in its popularity. Good on him. However, the extrapolation of Eric being a causator in game popularity based on 2 data points on a top list is pure silliness. From that data alone we can’t even conclude that he is unlikely to hurt a game’s popularity.

  97. *tchoh*
    and here’s me thinking eric’s flippancy meant he was SERIOUS about his participation directly creating a game’s popularity.
    fool, me.
    you’ve opened my eyes, Marshal.

  98. I have to say, it took some talent to miss the joke in ESR’s statement there. Good on you, Marshal.

  99. On the Windows 7 UAC:

    Actual comments from people who A) use it, and B) maintain/fix the code for it:

    http://blogs.msdn.com/e7/archive/2009/02/05/uac-feedback-and-follow-up.aspx

    Isn’t it interesting how the people without an axe to grind can say “We screwed up, here’s what we’re fixing.”

    Oh, and FireFox 3.06 broke my friend’s Ubuntu install by overwriting key library files with newer versions as part of the install process. We learned our lesson and logged what we had to change last time this happened. As near as we can tell, it didn’t even need the newer library files for FireFox….it was just trying to be helpful.

    This is now the third time since mid-November that we’ve had to go rooting into the innards of her Ubuntu install to correct for “DLL-hell”.

    We’re getting close to 40 hours logged on fixing this every time one of her software packages updates. Eric, have you spent 40 hours on a Windows machine this millennium?

  100. Ken, that’s really weird. Are you updating from the unstable branch or something? Horror stories like this may be typical of Gentoo, but they should be very rare anywhere else.

  101. >We’re getting close to 40 hours logged on fixing this every time one of her software packages updates. Eric, have you spent 40 hours on a Windows machine this millennium?

    Nope. I’ve also never, ever had the kind of problem you’re describing – and I run a pretty stock Ubuntu. I join Daniel Franke in scratching my head and saying “What the fuck?”

  102. >Isn’t it interesting how the people without an axe to grind can say “We screwed up, here’s what we’re fixing.”

    It remains to be seen whether they’re actually competent to fix it. Previous experience is, shall we say, not encouraging on this score.

  103. >Clearly, he chose to work on Wesnoth because it was promising,

    Oh, Marshal. You’re such a joy to have around. So unintentionally amusing. Yeah, sure, that’s how I operate — I run around sniffing at the arseholes of projects, going “Ooooh! Oooh! Will this someday make the top-ten list in a magazine I never heard of before and don’t care about, or some other equally meaningless popularity contest?”

    You’ve found me out. Tell all your friends, do.

  104. FWIW, I’ve had this sort of problem exactly once since ditching Gentoo; it was while updating a pre-etch Debian snapshot. Something that I upgraded triggered a regeneration of my initrd, and I found out when I rebooted weeks later that it had hosed it. Though, I was running LUKS on top of LVM on top of software RAID, and LUKS was at the time highly experimental in Debian, so it wasn’t that surprising that it was too much for mkinitrd’s poor brain to handle. If anything, I blame the kernel for that whole volume management stack being a godawful mess. BSD does it a lot better.

  105. Frankly, I don’t know how this descended to the level of petty hostility. The fact that Eric appears on the list is more likely a result of his seeking out a worthwhile project than of his contributions, since they seem pretty destined for fame regardless. The fact that Eric chooses to take this as an insult boggles the mind. I’m sure he’s done great work on Wesnoth and nethack. I guess he’s still butt-hurt about the fact that I made him look stupid re: ptrace.

  106. >I guess he’s still butt-hurt about the fact that I made him look stupid re: ptrace.

    Nice universe you live in. Isn’t this one.

  107. > Nice universe you live in. Isn’t this one.

    Eric, can’t you just admit that occasionally you make a mistake?

  108. Eric, can’t you just admit that occasionally you make a mistake?

    Eric is first to come blustering in with derision and condescension, and last to admit that he was wrong.

  109. I just reread the ptrace discussion, and I didn’t see anything that I could construe to be Marshal making ESR ‘look stupid’. Possible to tell us how, exactly, you did, Marshal?

  110. >I just reread the ptrace discussion, and I didn’t see anything that I could construe to be Marshal making ESR ‘look stupid’. Possible to tell us how, exactly, you did, Marshal?

    You shouldn’t have done that, Tom. You’ve just invited childe Marshal to drag the entire thread into the bizarro alternate universe that he dwells in – you know, the one where I also have a time machine that I use to go back and join game projects like nethack twenty years ago because they’re popular today. (Of course, in this universe, I borrow my buddy Guido van Rossum’s time machine to do it.)

    Besides, if there’s any more merit in that discussion, it belongs attached to its parent blog entry, not this one.

  111. Probably true. I was not trying to restart that discussion, but quite possibly Marshal will take it that way. Apologies.

  112. >What he says about Microsoft goes double and triple for Linux.

    Reading the article, I note he complains about the fact that the developers control the feature list at Microsoft, and argues that it should be controlled by usability researchers and designers. Isn’t this what ESR was largely complaining about in his ‘Why I Hate Proprietary Software’ post? More generally, how do you get around the problems that ESR cited there? If they make the company make good software at the expense of coder-friendliness, that’s no answer.

  113. Control by usability researchers and designers would at least be an improvement over control by clueless PHBs and marketroids.

  114. Heh, that reminds me of how the Xbox 360 has a very high failure rate because the marketing team demanded an unrealistic form factor (long story short: the GPU is placed directly underneath the DVD drive which gives it insufficient cooling)

  115. Wait…this is because they had requirements as to the SHAPE of the thing? *sigh…*

  116. You shouldn’t have done that, Tom. You’ve just invited childe Marshal to drag the entire thread into the bizarro alternate universe that he dwells in

    Indeed. I will reply in that thread, if Tom cares to look.

  117. MS and usability:

    Microsoft bought the Navision ERP software shortly after I started to work with it, in 2002. Now, Tog’s famous usability metrics about the mouse being better than the keyboard are only true when the user uses a diverse set of functionality. Not when the user is involved in large-scale data entry.

    Navision A/S as an independent company managed a very rare kind of achievement: despite being a very modern-looking Windows MFC application, it was at least as keyboard-driven as any old MS-DOS or Unix app. You could very literally unplug the mouse and navigate to an invoice form, fill it in, and post and print it using nothing but the arrow keys for navigating menus and fields, F3 for a new record, F6 for selecting drop-downs, the numeric keypad for putting in amounts or quantities, F8 for copying the value of the cell above in a grid, and F11 for posting. I hope I don’t have to explain how insanely cool that is when your job is to make 100 invoices a day.
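
    For flavor only, a tiny C sketch (hypothetical key codes and action names, not Navision’s actual implementation) of the underlying design idea: a flat function-key dispatch table, so every data-entry action is reachable without the mouse:

    #include <stdio.h>

    /* Hypothetical key codes and handlers for heads-down data entry. */
    enum key { KEY_F3, KEY_F6, KEY_F8, KEY_F11, KEY_COUNT };

    static void new_record(void)      { puts("new record"); }
    static void open_dropdown(void)   { puts("open drop-down"); }
    static void copy_cell_above(void) { puts("copy cell above"); }
    static void post_document(void)   { puts("post document"); }

    /* One keystroke, one action: no pointing device in the loop. */
    static void (*const dispatch[KEY_COUNT])(void) = {
        [KEY_F3]  = new_record,
        [KEY_F6]  = open_dropdown,
        [KEY_F8]  = copy_cell_above,
        [KEY_F11] = post_document,
    };

    int main(void)
    {
        dispatch[KEY_F3]();    /* start a new invoice record */
        dispatch[KEY_F11]();   /* post it                    */
        return 0;
    }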

    If you are optimizing for creative user work, optimize for intuitiveness, i.e. mouse and visuals. If you are optimizing for living, breathing data-entry devices, i.e. clerks, optimize for bandwidth, i.e. the keyboard. Navision A/S understood that. Microsoft?…

    Microsoft’s first “improvement” was to create an Outlook-like menu structure, where you can navigate within a pane still with the arrow keys, but you can choose another navigation pane only with the mouse. That was bearable, but predicted a direction.

    Now we have something called “Role-Tailored Client”, where the only keyboarding you can do is to TAB between form fields. The old client is still there but is slowly being phased out.

    But, but, the Role-Tailored Client looks extremely cool on a projector when you do a sales demo. You see unposted orders and invoices represented by an icon of a stack of papers, and the stack size correlates with the number of them. You see a graph representing some bullshit metric right below them, your favourite customers right beside them; it looks like a portal, it looks extremely cool. It makes you drool, it makes you want to buy it on the spot and brag to your friends about how advanced the software you use is.

    It’s extremely good for everything, except for just one little thing: working… sigh…

  118. But Shenpen! Isn’t the promise of Apple’s iWorld that we can all become turtleneck-wearing latte-drinking art fops?

    Sigh… there goes that dream…

  119. That’s “art fags“, Jeff. Or did you censor yourself a bit, there?

    Bit of self-censorship, yes, but since the “fop” was a stock character very much akin to our modern metrosexual, I thought it more fitting given the image Apple tries to project for its products and their users.

  120. Yeah, W7 is Vista SP3. However, I’m tired of the yearly wipe and reinstall of XP. I was going to get Vista, but now I’m using the W7 Beta.

    It’s ok.

    I’ve tried a couple times going to Linux, and it just never sticks. I bought a Mac Mini, and I just don’t like the OSX GUI. So that leaves me with Windows.

  121. I don’t think OS’es are exclusive. OK for me on my home computer they are, as it’s small and old, because I left the powerful one half a continent away, but in general, if you have like 60GB of HDD, you don’t have to *choose* between Windows and Linux. You can use both.

    I know for many people it’s an identity question, but the whole point is that it shouldn’t really be. Your tool should never define you. IMHO.

  122. Unrelated, but the term “art fag” triggered a funny story in my memory. Shortly after I arrived in the UK in 2006, I was walking on the street smoking a cigarette when some guy approached me and mumbled something about “a fag” in that Black Country dialect I could not quite decipher at that point. (Sounds like a curious crossover between German and English, “train station” = “troyn stoyshun”, probably a dialect that goes right back to the Saxon roots of English.) I was a bit like WTF, does he think I’m gay or what? Turned out, he was just asking for a cigarette, for “fag” means cigarette in the Black Country dialect (and also in Scouse).

    I have always been interested in etymology, because even if the Sapir-Whorf hypothesis isn’t literally true (many linguists challenge it), I still think language teaches us a _lot_ about a culture and way of thinking. (It’s like in programming: show me your data structures and I’ll have a good guess about what your program does.) So if anyone could shed some light on how “fag” acquired two totally different meanings in those two languages, which are not as far from each other as they were 50 years ago because today’s Brits listen to at least 5-10 hours of American every week on TV, I’d be grateful. Perhaps… is it because cigarettes have “butts”?

  123. >I don’t think OS’es are exclusive.
    nope. in the sense that i agree with you. in fact, old MacOSers tend to be the most prolific, since they have the mindset that a new OS is just a dragNdrop away. very common for old macos powerusers to be running 3 or more OSs nowadays. i’m running 4 on this box, my dad’s running 6+ on his wider hardware. and we’re both data/semantic nuts, so in both cases all are using the same user files.

  124. >“fag” means cigarette in the Black Country dialect (and also in Scouse).

    :)
    actually, it means “cigarette” in every non-american english-speaking country. secondarily, the latter american meaning flavours its use. but only secondarily.

    “fancy a fag?”
    english: “yeah, i’m gasping”
    american: “I’M NOT A FAG!!! … not that there’s anything WRONG with that, of course”
    australian: “yeah, i’m gasping. … not that there’s anything WRONG with that, of course”

    not sure how it acquired meaning of “homosexual” in america. but you may be amused to know that strictly it means a fire-ready piece of wood, ready for immediate lighting. and also strictly, it means a kind of pork dumpling-thing. always raises a wry smile walking around town here, seeing footpath signs outside cafes and such: “faggots and mash”. then in the weekend markets: “faggots” next to stacked bundles of coppiced wood.

  125. In Lord of the Rings, in the ascent of Caradhras, it says something about everyone carrying ‘faggots’ of firewood. The meaning I understood was a bundle of wood, not a single piece.

  126. I’ve been using Vista for over a year now. Works almost flawlessly. Can’t say the same about any desktop distro I’ve ever tried. They have waaaay more problems and small issues all around, and not one single gain in return.

    Windows 7? The same as Vista, with somewhat better perceived performance, some new useful features and a refined, better thought-out UI. They got a winner there.

    But hey, keep the delusion up. After all, every year is the year of the Linux desktop ;)

  127. >After all, every year is the year of the Linux desktop ;)

    For those with Linux desktops, yes.

  128. While Eric has his own take on why Windows 7 will suck hard, I don’t think it tells the whole story.

    Rather, the bigger problem is that Microsoft’s solution to the complaint that Windows Vista contained too much DRM… is even more DRM in Windows 7.

  129. Tom Dickson-Hunt: >In Lord of the Rings, in the ascent of Carhadras, it says something about everyone carrying ‘faggots’ of firewood. The meaning I understood was a bundle of wood, not a single piece.

    no, it’s definitely the single piece. blurriness of english: not just the group but each man would have been carrying “faggots”.
    you can buy a bundle of faggots in the local market here, for example.

  130. >fag…homosexual

    it occurred to me later: “fag” WAS used historically in Britain (19C up until perhaps the 60s/70s) to refer to junior students in private schools required to act as personal servants/dogsbodies/near-slaves for senior students. the tone of “subservience” is closest in American english to that implied by “punk” (which interestingly in England has quite the opposite tone: that of bolshie aggressive resistance to influence). “punk” in modern-day American mostly takes flavours from its use in the prison system, for a guy Used for homosexual sex. i wonder if that’s the connection?

    note also that “punk” in traditional english is any soft highly flammable substance used for fire-lighting (of faggots, for example), such as wood shavings or wood pulp

  131. Probably good news for FLOSS libertarians: mises.org is going open source. It doesn’t mean they are starting to use FLOSS software; they did that long ago, I guess. It means they are releasing all of their content over SVN so that anyone can set up mises2.org if he wants to. Wow! A brave move, I think.

    Volunteers needed here: http://blog.mises.org/archives/009475.asp

  132. >For those with Linux desktops, yes.

    Of course, the three of them. There is a news article right now on the Ars front page about the necessity of computer labs in universities, and some stats are worth thinking about. Let me quote:

    >> According to the school’s Information Technology & Communication department, 3,117 freshmen enrolled in 2007, and 3,113 of them owned their own computer. Nearly all of the machines were laptops, with 72 percent running Windows and 26 percent running Mac OS X (six hardy souls ran Linux).

    http://arstechnica.com/tech-policy/news/2009/03/whats-the-point-of-running.ars

    But yeah, I’m sure they’re all wrong, and stupid. Talk about delusion.

  133. It seems as if Win7 will be getting an XP compatibility mode, built on top of Microsoft's acquired VM technology.

    This is, precisely, the Right Thing. If Microsoft can pull off the implementation in a decent fashion, then Windows 7 will become a formidably popular desktop OS.

  134. oo. shades of yellow box. i mean, “Classic”.
    which apple, in its wisdom, has elected to eliminate.

    alas, poor yoricmac. i knew him well, horatio.

  135. Eight hours of Amazon sales of Windows 7 have outstripped Vista’s entire preorder period.

    Seven looks to be a very, very lucky number for our friends at Microsoft. Not so much for those who had their hopes pinned on 2010 being the year of the Linux desktop…
