From “200 Laptops Break a Business Model” in the pages of the New York Times:
So who’s up, who’s down and who’s out this time around? Microsoft’s valuable Windows franchise appears vulnerable after two decades of dominance. Revenue for the company’s Windows operating system fell for the first time in history in the last quarter of 2008. The popularity of Linux, a free operating system installed on many netbooks instead of Windows, forced Microsoft to lower the prices on its operating system to compete.
Mene, mene, tekel, upharsin!
Clayton Christensen, author of The Innovator’s Dilemma, concluded six years ago that Linux and open source seemed to be executing a classic low-end disruption on Windows and closed-source technology. But it hardly took the man who formulated the concept of “disruptive technology” to notice this; it seemed obvious to me as well on reading his book, and not in the least controversial to the many business audiences I later exposed to the idea.
What Christensen could have added, but didn’t, is that the crisis moment when an incumbent’s market collapses is more likely to occur during an economic downturn when all potential buyers are feeling increasing pressure to cut costs. The NYT article marshals the evidence that, for Microsoft (and possibly for other tech companies like Intel dependent on high-margin flagship products with low-margin competitors), that moment may be arriving now.
In normal or boom conditions, the Vista debacle might have mattered much less to Microsoft. In the days when the company was sitting on a $60 billion cash reserve Bill Gates used to boast that he could run the company for five years with zero revenue. That would have been enough coding time to write a new OS core from scratch, if need be. But that hoard is now mostly gone, spent on stock buybacks and acquisitions that have proved uniformly unsuccessful and a net drag on the company’s core Windows and Office business. Thus, MSFT is running out of room to maneuver.
If today were still normal or boom conditions, a collapse in Microsoft’s market share might benefit Apple as much or more than the open-source community. But downturns hurt high-cost, high-margin products more than commodity equivalents. Apple is paying the price for its luxury-good positioning now as it reports that revenue from its desktop line fell 31 percent this last year, and its laptop share is being hurt by cheap netbooks.
The one benefit of recessions is that they squeeze inefficiencies out of the market. As cost pressures mount, paying the license fees to shoehorn Vista or XP onto netbooks is going to look less attractive to the vendors and Linux will probably regain much of the 100% share of that segment it had when this class of machines was first introduced. Microsoft will only be able to forestall this the way it pried loose the netbook market a year ago, by subsidizing the netbook vendors to the point where the net cost of an XP or Vista license is effectively zero. But that tactic can only be sustained for as long as Microsoft can afford to make no actual profit on the only market segment that is actually showing growth.
Extrapolating from Gates’s implication of about a $12B-per-year burn rate, MSFT’s cash reserve gives it about 20 months of burn time at this point. Adding what it can make on per-seat corporate licenses and the Vista-boggle during that time obviously isn’t coming up with a figure that makes the company’s financial mandarins happy, or MSFT wouldn’t have announced its first round of layoffs ever – and, of course, it missed its earnings-per-share target for the first time, too. I know how these people are trained to think and what kind of discount on future outcomes they apply, having been a director of a publicly-held company myself, so I’m pretty certain from their recent behavior that Microsoft’s own planners aren’t giving the company more than 30 months to live without one of (a) a major product success, (b) a return to market conditions that make customers much less price sensitive, or (c) a harsh restructuring of MSFT to cut its costs and run rate.
Right now, there is absolutely nobody who thinks the IT or consumer market is going to be back to anything like boom conditions in 30 months – and even if it were, price ceilings would have been reset by the expectation consumers are re-forming around $200 netbooks. There’s simply not enough room in a $200 bill of materials to sustain Microsoft’s business model, as I’ve been pointing out for nearly a decade and the NYT has just noticed.
Therefore, the scenario MSFT has to pin its hopes on combines major success for Windows 7 with a drastic downsizing of Microsoft so it can run on the seriously compressed revenue stream that is all netbook-land will afford. Viewed from this angle, Microsoft’s recent behavior makes complete sense. But there is no certainty that this strategy will land MSFT with a sustainable business model within its burnout time. And, even if we assume that it can, there is zero margin for error on the way there. Even one serious shock, such as a Vista-like failure of Windows 7 to gain traction, will sink them.
It’s also worth bearing in mind that if my model is wrong it’s likely to be because MSFT’s cash reserve and sustaining revenue streams both continued to erode after the most recent date for which I have solid information (November 2008). Under plausible but less optimistic assumptions (e.g. assuming their cash hoard has continued to decline since November at the same rate it did over the last two years, they keep missing earnings targets, and cutting operating costs proves difficult) they may not have 30 months to recover but as little as 18.
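For readers who want to poke at the arithmetic, here is a minimal sketch of the runway calculation in Python. The roughly $12B/year burn rate is the figure implied by Gates’s old boast; the cash-on-hand and net-inflow numbers are illustrative assumptions, not reported figures.

    # Back-of-the-envelope runway estimate; not a financial model.
    # Assumptions: Gates's "five years on zero revenue" boast against a ~$60B
    # hoard implies roughly a $12B/year burn rate.  The cash-on-hand and
    # net-inflow figures below are illustrative guesses, not reported numbers.

    def runway_months(cash_billion, burn_per_year_billion, inflow_per_year_billion=0.0):
        """Months until the cash reserve is exhausted at the given net burn."""
        net_burn = burn_per_year_billion - inflow_per_year_billion
        if net_burn <= 0:
            return float("inf")  # cash-flow positive: no burnout date
        return 12.0 * cash_billion / net_burn

    print(runway_months(20, 12))      # ~20 months: the scenario above
    print(runway_months(15, 12, 2))   # ~18 months: the pessimistic variant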
This is how empires fall. Until the last minute it is difficult to see what’s coming, because they tend to hollow out from within long before the damage becomes obvious from outside. Microsoft lost its battle of the Teutoburger Wald when it failed to prevent Linux from going mainstream in enterprise computing around 2003. Now, with the layoffs and the first-time fall in Windows revenues, we’re seeing the retreat from the Antonine Wall to Hadrian’s.
As the final collapse nears, each successive retrenchment will come faster. If history repeats itself exactly, the splitting of the eagle into a Western (Windows) and an Eastern (Office) empire may be the next major step, with the Western Empire collapsing shortly afterwards and periodic attempts by Easterners to recapture the OS market coming to nothing. But here I veer off into fancy…
So, are you putting your money where your mouth is and shorting MSFT? What’s your (financial) position?
It seems to me that Microsoft’s Office business is in more imminent peril than its Windows business. For now and into the near-to-mid future, there’s still a boatload of software that will only run on Windows, and WINE isn’t polished enough to be a practical alternative for non-geeks. On the other hand, OpenOffice, Thunderbird, and Google Apps are all well beyond ready for prime time, and people can switch over to them on a whim at any moment they choose. In fact, it almost looks like Microsoft has already capitulated: Windows 7 no longer ships with a MUA, and ODF is now supported in Wordpad.
> The popularity of Linux, a free operating system installed on many netbooks instead of Windows, forced Microsoft to lower the prices on its operating system to compete.
Yes, that’s it. And wait for more, Bill….
I, for one, welcome our new netbook overlords. I visited a Best Buy with my father, who’s considering purchasing a netbook; the few netbooks on display seemed to be Windows XP machines. The one Eee PC’s wi-fi didn’t seem to be working. (Way to support the Linux community, Best Buy!)
It’s amazing how even serious geeks have been blindsided by netbooks. They’re another example of Christensen’s “disruptive innovations.” They’re not really suitable for “serious” work, especially if you have sausage fingers like I do. But they’re great if all you do is surf the Web and do e-mail. I expect my next computer will be a netbook, or that flash memory prices will drop so that all laptops use it as storage.
> or that flash memory prices will drop so that all laptops use it as storage.
That’s not a terribly bold prediction. The only question there is whether it’ll take six months or two years.
Insofar as OpenOffice and MS Office…
For anything on the higher end of their functionality, OpenOffice doesn’t come close.
I speak of this from first-hand knowledge with both Writer (sending properly formatted footnotes…) and Calc (where my production worksheets won’t even open, and that’s without using any VBA.)
Just curious: has anyone tried developing on one of these things? I wonder if the small screen and keyboard are impediments to programming. But I suppose you could always use external keyboards, mice, and monitors.
Daniel Franke > It seems to me that Microsoft’s Office business is in more imminent peril than its Windows business. For now and into the near-to-mid future, there’s still a boatload of software that will only run on Windows, and WINE isn’t polished enough to be a practical alternative for non-geeks.
Not sure about that. There are probably as many apps and workflows built on MS Office that cannot be ported to Open Office or converted to e.g. web apps without a complete rewrite. Not to say that rewriting the VB and the Excel spreadsheets involved would not be a huge improvement in a lot of cases.
What kind of revenue numbers are ESR’s ‘time left’ scenarios based on? At first glance it seems like he is extrapolating from Bill G’s bragging about how long MS can go on with no revenue. I’m pretty sure that the revenue is not zero now and will not be zero over the next two years. Windows and Office on the desktop will still rake in billions for some time to come from businesses and governments, and that should be enough to sustain a smaller Microsoft.
If Mini-Microsoft’s blog is anything to go by, the company could probably cut its staff headcount in half and not lose anything worth mentioning. The problem is that the people who will be leaving are not necessarily the people whose leaving is a good thing for the company.
That said, I think Paul Graham had a point when he wrote almost two years ago that MS is dead. He meant that it was no longer a consideration as a competitor for new startups. Two years later, I can’t think of anything exciting coming out of MS. IE7 and IE8 were supposed to stop Firefox. The growth rate of Firefox’s installed base has not slowed at all. I don’t know anyone who uses the Live stuff, but I see people using Google’s apps all the time. I know a couple of people at my place of work who have dual-boot laptops, and they are using Linux as an ‘instant-on’ system, even though the Linux installation is a regular hard drive based one and not one of these new motherboard/flash thingies. Booting Vista is so slow that they just don’t bother (the employer-mandated antivirus crap may have something to do with it).
Apple’s experience does not, actually, support your thesis. Desktops are down, but the laptop sales are up dramatically. What’s happening there is a shift from desktop computers to laptops for the mainstream… not an indication that the “expensive desktop” is going away. Apple’s laptop sales are not under pressure from netbooks at all – this is a commonly heard claim from the “analysts” who still think Apple is on the ropes like it was in the 1990s.
On the other hand, Apple is competing in this segment, with premium products that are outselling the netbooks: the iPhone and iPod touch.
The fundamental thing that you, and other Linux advocates, do not get is that when it comes to the mainstream, usability matters. Until linux comes out with a good UI (and not a warmed-over clone of windows, which was a half-baked clone of the Mac) it will never see mainstream adoption.
History bears this out. Where has linux seen success? In the server marketplace.
In netbooks, linux is a failure – resulting in netbooks having the highest return rate of any class of notebooks. People get them home, discover that linux may look like windows but isn’t, and even the ones who are willing to learn it, invest a few hours and get frustrated, and back the machine goes.
Netbooks as a market are a failure. They have unit volume because they are a fad, but at the end of the day, people want a computer that can do more than browse the net, and they are increasingly turning to laptops for that.
If they just want to browse the net and run a few apps, they can already do that on their iphone or ipod touch.
The really compelling data we have from last quarter is the huge jump in ipod touches that were sold.
THAT’s the computing platform of the next 15 years. And that’s where people will turn when they want a computer that is cheap.
Jay > People get them home, discover that linux may look like windows but isn’t, and even the ones who are willing to learn it, invest a few hours and get frustrated, and back the machine goes.
As far as I can tell, this is based on one news story about one product where the vendor had done a sloppy job of Linux installation. An Asus spokesperson went on record soon afterwards saying that the return rates for their Linux and Win XP netbooks are similar.
While I agree with the facts presented by Mr. Raymond, I would like to point out that, in my opinion, Linux will have to fight for its share of the netbook market around the middle of the year.
As a Linux user, I have to admit that after testing Windows 7 in my Aspire One I was very, very surprised. The speed and stability matches my customized Debian testing install.
A glitzy marketing campaign and rave user reviews would at least put a dent in the Linux netbook market share.
And Vista isn’t performing any better elsewhere in the world: in Hungary it’s below 10% market share, which the tech mags are openly calling an epic failure.
I wonder how it goes in India and China; my guess is: not well. (Especially since India has an IBM/Oracle/Java-oriented IT culture.) So, yes, I think there is some sort of writing on the wall.
The main reason is probably that in the recent 5-6 years we got used to the idea of not upgrading our hardware much, unless we are heavily into gaming. Hell, even in gaming, we are just upgrading the 3D card – the dual-core 3GHz box that was built for me in early 2004 and was not considered particularly expensive even then is not considered too obsolete at all now, in early 2009. So having to upgrade the system just for a new OS that does not seem to provide any real benefits is just something that people apparently don’t want to do.
However, other business streams are going better, especially Microsoft Business Solutions: MS CRM has killed much of the CRM market (nobody even remembers, e.g., Pivotal anymore), Navision/Axapta/Great Plains killed much of the ERP market: iScala and Baan and Exact are almost history, Oracle Financials is struggling, only SAP is giving a good fight but only in the high-end, the “real” SAP, as SAP BusinessOne is an epic failure.
So I think what will happen is that Microsoft will have to reinvent itself just like IBM reinvented itself as a consulting company, Microsoft will have to reinvent itself as a business solutions provider. That will require a MASSIVE reduction in size and operating costs. (Which isn’t too good as it means thousands and thousands of unemployed programmers will flood the market, pushing down salaries.)
Still this is the only thing they can do, as business solutions is the only market where FLOSS is still not competing much and I’m afraid won’t in the next few years either: http://www.openerp.com/ and others are great, but PHB’s just won’t trust them without a big brand name. (If IBM had any brains they would just buy OpenERP and stick a big friggin’ IBM logo on it without touching anything else, as the IBM brand name still has a lot of appeal for older PHB’s.)
Microsoft is apparently in a state of rapid decline. Their monopoly is sliding. Sure, but MSFT is one hell of a scary beast. They have exceptionally large reserves of money in store for the rainy day. Indeed, when a giant company like this is deteriorating, no one can accurately predict its forthcoming struggles for survival. It’s quite like a lion with a bruise. One has exactly no idea what’s next in the Microsoft-show. They’re probably becoming more dangerous, meaning they are at their extreme limits and may act as savagely and immorally as possible [more than ever before] just to keep it going.
The war with the dragon is a dangerous one for the open source warriors as well. As Nietzsche says:
He who fights with monsters should be careful lest he thereby becomes a monster.
Just joking :)
Everything I’m hearing suggests that Windows 7 is going to be awesome and will address many of Vista’s shortcomings. Imminent Death of Microsoft Predicted? Fat chance.
Odds are, Eric, you’re running a box with one of the absolute worst possible CPU architectures. But x86 still won. Why? Because it’s compatible with what’s out there.
Windows is also the most compatible thing with what’s out there, both in terms of hardware and software. It’s not going away.
The real scary bit, as alluded to by other commentators, isn’t that Linux will take over the PC OS market, or even cut margins there to unacceptable levels; it’s that that space shows signs of vanishing entirely. It may well turn out that outside the server market, the long-term technology will be appliances, not general-purpose PCs. Webtops, gaming consoles, set-top boxes…
Apple seems well-positioned for this, and Linux will be a common tech-layer for everybody else addressing this market. But I suspect in twenty years the “home PC” will seem about as anachronistic as the central home electric motor predicted by futurists of the late 19th century.
Competition is a good thing. It’d be nice if Linux and Apple really posed that formidable a threat, but at the end of the day, Microsoft’s biggest enemy is itself, and it’s recovered from self-inflicted wounds in the past (ME, anybody?). Windows 7 is Microsoft’s response to Vista.
Alas, for the time being, “Putting Linux on [netbooks] is not going to help sales [says Stephen Baker, NPD Group’s vice president of industry analysis]. He estimates that 90 percent of netbooks sales are devices that have Windows XP on them.” (www.msnbc.msn.com/id/28627170/)
“It may well turn out that outside the server market, the long-term technology will be appliances, not general-purpose PCs. ”
Maybe. Though AFAIK it would be a bit of a stupid direction to take. I think the smart choice in this regard is to put all your eggs into one basket and make that basket really cool. One huge screen – preferably a projector, or lacking that something like a plasma TV – with really good speakers and crisp clear image, some very comfortable big keyboard, a good trackball, and you can do all you want on it from one general-purpose PC: movies, gaming, web, e-mail, working, reading ebooks, listening to music, videophoning, whatever you ever wanted to do with any electronic device, all in the highest quality.
That would be the smart choice: the most bang for the least buck.
I see no point in the current explosion of consoles, smartphones, netbooks and all those gadgets – it’s wasting money via duplication and a compromise for lower quality…
If “on the go” is such a high priority – for me it’s not, but I guess for some it is – then just add one general-purpose pocket device to it, in size something between a smartphone and a netbook, that uses some kind of remote-desktop technology like VNC, RDP or Citrix to log into that big box at home via GPRS or wifi, and then send the mails from there, run the browser from there, etc. Why build it all into the device? Especially why into several devices?
I think Windows 7 is pretty much doomed to suck.
The reason I say this is because Vista sucked, and they haven’t had time to fix it. Not just the surface suckiness in the UI, but deeper stuff like resource-hogging and the loss of compatibility with old apps and drivers. Fixing that would require major rewrites reaching far down into the core architecture, something there simply haven’t been enough development cycles to arrange since it became apparent that Vista was a bomb.
There’s evidence for this proposition in the Windows 7 public beta. All the reviews I’ve looked at suggest it’s Vista with minor UI tweaks. If that’s the best Microsoft can do – and I really don’t see any realistic prospect of doing better in 30 months or less – then it’s hasta la vista, baby.
esr, what do you think is going to happen with VMWare? Do you see it getting slaughtered by Xen or some other open source alternative or do you think the demand for a support phone number and a company to sue will justify their business model?
Not to challenge your view of Microsoft (I don’t understand that issue, and I don’t own its stock, so okay), I do question your metaphor of the Fall of the Roman Empire, an event that did NOT occur in the fifth century. Having discarded its unprofitable western provinces, the empire went on for another thousand years, preserving much of classical culture in the Graecophone east. Its people called themselves Roman and retained many features of Roman governance. Nothing fell. It endured. (Even as theoretically mightier states like Persia and the Arab caliphate and the Mongol empire collapsed and vanished). Is that what will happen to Microsoft? Are the rest barbarians? Rome did not fall till 1453, and by then Italy was civilized enough to inherit and appreciate its culture.
“I think Windows 7 is pretty much doomed to suck.
The reason I say this is because Vista sucked, and they haven’t had time to fix it. Not just the surface suckiness in the UI, but deeper stuff like resource-hogging and the loss of compatibility with old apps and drivers. Fixing that would require major rewrites reaching far down into the core architecture, something there simply haven’t been enough development cycles to arrange since it became apparent that Vista was a bomb.
There’s evidence for this proposition in the Windows 7 public beta. All the reviews I’ve looked at suggest it’s Vista with minor UI tweaks. If that’s the best Microsoft can do – and I really don’t see any realistic prospect of doing better in 30 months or less – then it’s hasta la vista, baby.”
So, you’re basing your opinion off of other people’s opinions, but no actual first-hand knowledge… glad you’re giving us the results of your rigorous testing, people might begin to doubt your expertise otherwise.
I’ve actually dared install Windows 7, so I believe I have a little authority in speaking about it.
To me, it’s a mixed bag, but it felt pretty much like they were putting temporary fixes/patches on top of Windows Vista to make it feel “better”. Half of the included Windows Vista programs were outright removed in order to reduce the disk space required (“only” 6GB now!); they are now downloadable as the “Windows Live Essentials” package (I doubt people will really download this, nobody even used those programs to begin with). They also apparently messed around with the bootup and shutdown sequence to make them faster than Vista, though I can’t help but think that they haven’t really fixed anything, just worked around some flaws…
The primary look of the UI (the MS Windows equivalent of a window manager, there’s no better term since you can’t actually replace it) is mostly the same as Windows Vista, but the “big thing” they’re touting is the new taskbar, which is at best a minor change to the whole feel of things. By default, it’s just the program icons representing the buttons (you can get the same effect in Xfce by turning off application names in the taskbar; probably true in other DEs/panels as well); it actually saves quite a bit of space for applications on the taskbar, although it’s not an original idea so they don’t really deserve all that much credit for it. Various control panels (particularly ones related to customization) have been changed yet again, but this time their layout actually makes sense, unlike the Windows Vista ones. For example, you now have just one window for changing the desktop wallpaper/screensaver/colors and all that stuff; Windows Vista hacked up the XP Display Properties and barfed that dialog everywhere; it was quite ridiculous.
As far as compatibility issues go, I ran into quite a few things that weren’t compatible with Windows 7 even though they ran on Vista; I’m not understanding where this “perfect compatibility” claim is coming from. Granted, I ran mostly minor programs few people run, but the most major one that didn’t run was Google Chrome (it just displayed “Aw Snap!” whenever it started up); how can all the Microsofties miss that? I suppose they’re still using MSIE, but as for me, I headed off to mozilla.com, and Firefox ran fine, so at least I didn’t pull my hair out while browsing the web. I was fairly disappointed too, I was looking forward to my first chance to run Google’s web browser, but oh well. At least I felt happy to flood the built-in bug reporting feature (why hasn’t this appeared in the final versions of Windows? :-) with all the incompatibility reports.
All in all, their goal seemed to be “better than Vista”, which is not a hard goal to reach, and in that sense, they succeeded. It is still Windows with all the classic flaws that I’m afraid they will never fix; it’s just trying to impress people who think UIs and startup times are the only things that matter. The real test will be whether people will still hold onto Windows XP or not over Windows 7. If Windows has any future at all, it lies in ReactOS, where nobody owns it and people can actually fix some fundamental issues (IMO, ReactOS needs a Wine-like wrapper around legacy programs to pretend they have full admin rights when they don’t, but that’s not a topic for this blog); at the very least, ReactOS can serve as a maintained NT-like operating system following the inevitable demise of Microsoft.
The UI tweaks are more than minor; they’ve actually fixed the damn thing. In other words, they’ve finished ripping off NeXTStep and given up trying to hide it :-). The semantics of the Windows 7 taskbar are identical to WindowMaker. Of course, I still think they got the UI right with Windows 1.0 and have yet to match it. It was a tiling window manager, just like xmonad which I use today.
Also, I’ve heard they’ve done some pretty substantial refactoring of the kernel. It seems to show: Windows Vista runs like a dog inside KVM (the Linux kernel’s virtualization layer, used with a forked qemu to take advantage of Intel’s and AMD’s hardware virtualization instructions), while Windows 7 is much less painful.
All that said, though, if I used XP as my primary OS, I wouldn’t pay more than $30 to upgrade to Windows 7.
> Just curious: has anyone tried developing on one of these things? I wonder if the small screen and keyboard are impediments to programming. But I suppose you could always use external keyboards, mice, and monitors.
An eee 901 is my only machine, so what development I do is done on there. External keyboard, mouse and monitor is the way I go most of the time. I’ve only done a few lines at a time of actual programming without them, but my bottleneck is thought rather than typing speed. (One annoying aspect: the keyboard, at least in the UK, has no dedicated backslash/bar key. Fn-z gives \, shift-fn z gives |. Bizarrely, Fn-shift-z gives ^\, which is what I use as a screen escape. Sooner or later I’ll set a different mapping up for when I don’t have an external keyboard plugged in.) I imagine referencing APIs would be a pain on 1024×600.
Main problem at first was that the default distro sucks. I’m running gentoo off an sd card for now; I also hear that eee Ubuntu is pretty good. (This is probably why linux netbooks haven’t taken off as well as they might: windows ones work pretty much like windows always does, but linux ones are crippled compared to anything else that runs linux.)
The hardware hasn’t generally been a problem for me, but it’s about four times as powerful as what I was using until a few months ago. I’m not really a power user, so take all of this with a grain of salt.
10 years ago, several of my german clients unilaterally switched their entire setup to linux, for long-term reasons of cost, safety, controllability. whole german banks and central banks (germany is a federation like america, and has a similar central bankS setup), all running wholly on linux. quite proud of them and their sheer guts, walking around and seeing an entire trading floor all linux, linux desktops flickering in every single upstairs office and room. even excel running apparently-native for each (STILL no remotely comparable spreadsheet for nontrivial work).
was back in one of the central banks for a nostalgic 5week contract last year (completed in 5 days — corporates’ estimators aren’t used to hackers. :) spent the rest of the time uprating their installation). superpleased to log into a linux desktop to do it. still rock-steady, still doing the business for them.
Faré said, “So, are you putting your money where your mouth is and shorting MSFT? What’s your (financial) position?”
This would first require either that ESR was willing to own shares of MSFT. Assuming that he was unwilling to actually own shares in MSFT, he would then need to be willing to sell naked puts (making him a naked short in MSFT).
Taking a naked short position would require that ESR first opened a margin account with his broker.
Under Regulation T, the Federal Reserve Board requires all short sale accounts to have 150% of the value of the short sale at the time the sale is initiated. The 150% consists of the full value of the short sale proceeds (100%), plus an additional margin requirement of 50% of the value of the short sale. For example, if an investor initiates a short sale for 1,000 shares at $10, the value of the short sale is $10,000. The initial margin requirement is the proceeds $10,000 (100%), along with an additional $5,000 (50%), for a total of $15,000.
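To make that concrete, here is a tiny sketch of the Reg T arithmetic using the same hypothetical 1,000-share, $10 example; it covers only the initial requirement, not maintenance margin.

    # Regulation T initial requirement for a short sale (sketch of the
    # example above).  Figures are the hypothetical 1,000 shares at $10.

    def short_sale_initial_requirement(shares, price, initial_margin_rate=0.50):
        """Return (proceeds, additional_margin, total_requirement)."""
        proceeds = shares * price                    # 100% of the sale value
        additional = initial_margin_rate * proceeds  # plus 50% under Reg T
        return proceeds, additional, proceeds + additional

    print(short_sale_initial_requirement(1000, 10))
    # (10000, 5000.0, 15000.0): $15,000 must be in the account at initiation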
Personally, I doubt he has the money. The book market isn’t that lucrative, and it’s getting worse all the time.
Even if he did, I’m not sure I’d short a stock that closed up today but is still below $18, especially when there are nearly 9 billion shares of it in the world.
A whole generation has been raised to think that computer means “microsoft”. Free Software (and open source) have a long way to go to reverse that. Yes, people are beginning to notice Free Software solutions, and there are wins here and there, but the world is a very, very large place.
In addition, MSFT has paid quarterly dividends of approx $0.10/share since mid-2004. Your broker gets those.
> Mene, mene, tekel, upharsin!
You’re such an unmensch! :-)
BTW it’s “Mene, Mene, Tekel u-Pharsin”, and normally one doesn’t include the exclamation mark.
> This would first require either that ESR was willing to own shares of MSFT. Assuming that he was unwilling to actually own shares in MSFT, he would then need to be willing to sell naked puts (making him a naked short in MSFT).
You’re confusing naked short selling with uncovered calls and naked puts. Shorting has nothing to do with options. Normal short selling means borrowing shares and then selling them. Naked short selling means simply selling stock that you don’t own without first making arrangements to borrow it. Naked short selling has been illegal since last September, and before then, the distinction was transparent to retail investors, as their broker would be expected to deal with the borrowing. It has only ever concerned people who conduct transactions directly from the trading floor.
Uncovered calls and naked puts are when you sell an option without first buying or shorting (respectively) the corresponding stock, which, as with any form of short selling, makes your potential losses unlimited.
Huh? Why the hyphen? There’s no hyphen in the original. מְנֵא מְנֵא תְּקֵל וּפַרְסִין “Counted, counted, weighed, and Persians.”
I think because פַרְסִי is a proper noun, so when you add the prefix וּ meaning “and”, and then transliterate, you want to preserve the capitalization. Then to prevent it from looking weird, you add a hyphen, kind of like the apostrophe when in French you prepend d’ to a proper noun. But this is just a guess coming out of my linguistics dabbling and my laughably weak Hebrew School education.
(Aside: editing a GtkTextArea with mixed text direction is confusing as hell)
Err, dropped the final nun. פַרְסִין == “Pharseeyn”
> The reason I say this is because Vista sucked, and they haven’t had time to fix it. Not just the surface suckiness in the UI, but deeper stuff like resource-hogging and the loss of compatibility with old apps and drivers.
I thought a lot of the resource-hogging was supposed to be DRM-oriented, watching you every clock cycle to see if you were playing a single frame of unlicensed high-res media and attempting to degrade it on the fly. That could presumably be turned off without *too* much effort.
Another straw in the wind about the doom of Microsoft….
There was a time when buttons with slogans that bashed Windows and Microsoft were a small but noticeable part of my income. Then XP came out, and people didn’t seem to be as angry. When Vista arrived and everyone hated it, no one asked for any buttons on the subject. Microsoft was at the point where they couldn’t even disappoint people.
>You’re confusing naked short selling with uncovered calls and naked puts.
And it’s because I don’t understand distinctions like this that I haven’t short-sold Microsoft. I understand economics but my grasp on financial mechanics is weak.
I remember a time when IBM was the big evil company that people wanted to see destroyed. I remember a time when the software companies that gave up on copy protection found that they suddenly dominated the market. I remember a time when AOL was the coolest hippest thing ever – before people realized why walled gardens suck and open protocols rock – just the way Facebook is now that people are forgetting again. All these things are cyclical. The same types of battle (and sometimes even the exact same battles) get fought over and over again, and the same corporate logos often appear again on the opposite side of the fight.
The name “Microsoft” is not evil – even if it is an icon that currently represents some evil ideas. When these ideas start to weaken the corporation, it is actually better if it does not die but learns the correct lessons and becomes a champion for the right side. It has happened before. It will happen again.
So don’t think, “How can I destroy Microsoft.” Think, “How can I accelerate Microsoft’s learning curve.”
[BTW – Eric, thanks for giving Jen and me the con-party tour this weekend. We had a really good time, and it was great to see both you and Cathy again. Ping me at the email address I submitted – I am working out the details of a new idea and your advice/input would be welcome.]
I just wrote a bunch of articles on the basics of short selling a month ago.
Most of the Vista slowdown comes from something that Eric would approve of, in theory.
Vista finally moved most of the UI stuff from Ring0/Ring1 to Ring 3 as part of its security re-design. This was also a necessity for re-skinning most of the Windows UI for the Aero interface.
Most of the backwards compatibility issue with Vista stems from this, and a much more sensible approach to program privileges. Vista no longer lets programs escalate to Admin level privileges without prompting the user for an Admin level password. A lot of software for XP and earlier expects to silently run in Admin mode. (It also meant that most XP drivers weren’t going to work in Vista, which caused massive hardware compatibility issues).
Where the Vista UI either did something right or pissed off the userbase (or both) is that there is no option to always allow a program to auto-escalate without intervention.
What most people see as “Massive DRM” is Vista asking for authentication every single time a program tries to escalate its rights. This was ESPECIALLY problematic when XP was upgraded to Vista.
So, increase security, change a lot of the underlying architecture…and, gee, everyone notices. Hell, look at the griping that came out with KDE 4.0 coming out.
Of course, since none of the Linuxistas can be bothered to actually use Vista enough to become conversant in it, and since it doesn’t do things the way they expect it to (and is therefore a Crime Against Nature and All Things Good), all of the diatribes are going to be repeats of any bad coverage of Vista…most of which are written by people who never wanted to upgrade from XP, and who learned the Windows UI back in the ’90s. (Hint: If it comes from Infoworld, Infoworld considers the Windows 9x/2000 UI to be the height of UI design, and resents changing anything. In large part because they resent having to teach their users, yet again, how to use a UI.)
Not that Vista is without flaws. Vista also grew from “bloated pig” to “arthritic water buffalo, gravid with triplets”, and the two tier approach on what was and was not adequate hardware to run it (and the Aero interface) was a horrible strategic blunder. (Short form: “Vista Certified” means you can run it with Aero turned on. “Vista Ready” means you can run it with Aero turned off and a few other features turned off. Nobody wanted to admit that the second should be “Vista Crippled.”)
In regards to Vista’s DRM tracking slowing down performance, let me re-use one of Eric’s arguments.
The proof that DRM doesn’t work is that within weeks, and months at the outside, any DRM scheme will be cracked, and those exploits will be circulated. The time frame goes down rapidly the more widely exposed the DRM scheme is.
Let us presume that this is true. I have no evidence saying it isn’t, and it makes sense.
Vista purportedly has built in DRM. Vista is roughly half of the total PC desktops out there.
By now, someone should’ve broken this DRM, and everyone in the Linux world would be using it as Argument Number 1 on why DRM is Futile. Oddly enough, this is not the case.
Either there IS no DRM tracking built into Vista (beyond the install code DRM which is trivial to get by), or Microsoft has made a DRM system that’s so l33t that nobody’s cracked it. In two years.
According to Malware and Windows security tracking sites there are two interesting trends.
1) Vista desktops have outnumbered XP desktops since February of 2008. Most recent figures say it’s a 55/45 split. This trend tracks pretty closely with the decline in spam email volume; every time the raw number of XP users goes up, Spam email volume goes up. Every time the volume of Vista users goes up, it remains unchanged – as Vista (and its successors) take hold, the volume drops as a whole.
2) The total number of Windows intrusions/exploits out in the wild has trended flat-to-down since Vista came out; the number of PC users has gone up, the number of Vista installs has gone up. When sorted by installation type and taken as percentages of the userbase…
A) XP has about 60% of its intrusions due to flaws in the OS, 20% due to flaws in apps, and 20% due to social engineering. While XP is about half of the installed Windows base (about 48%) and about 45% of their total tracking database, it represents over 80% of the Windows intrusion sites out there. In particular, its ability to get infected while doing an install is a big source of problems.
B) Vista has 80% of its intrusions due to social engineering, with about 15% due to bad apps, and 5% hitting holes in the OS. It represents about 15% of the intrusions as a whole, and about 20% of the Windows total.
C) Of the remaining 5% of intrusions out there, the breakdown is about 70% social engineering, 20% application flaws, and 10% rights escalations. Keep in mind that malware targeting will oversample any version of Windows because of the target size. Depending on how you read the numbers, the fact that 5% of the exploits are OSX can be read as “Vista has hit parity with OSX for security” or “Vista is better than OSX for security”.
Could someone explain what Mene, mene, tekel, upharsin! means?
From the other posts, it’s Hebrew and means “Counted, counted, weighed, and Persians.”
What does that have to do with this topic?
And MSFT seems like a pretty risky short. You could be right, but just be way too early.
There are a lot of doggier stocks out there that you could short.
Darrencardinal, see http://en.wikipedia.org/wiki/Writing_on_the_wall
>> This would first require either that ESR was willing to own shares of MSFT. Assuming that he was unwilling to actually own shares in MSFT, he would then need to be willing to sell naked puts (making him a naked short in MSFT).
>You’re confusing naked short selling with uncovered calls and naked puts. Shorting has nothing to do with options.
actually, i’m afraid you’ve both got yourselves confused.
‘short’ and ‘long’ are first and foremost verbs, essentially meaning to make a transaction which will give you a negative or a positive exposure to the instrument transacted, respectively. you can also use them to describe the direction/sign of resulting (net) exposures. eg, if i long 100 shares, then short 20, i’m left long 80. short 80 to close out the strategy.
so in this example, eric could short the stock, or if the margin reqts (aka “deposit”) were too onerous, he could LONG a put option (not short it) and ride it out till it expired (or short a call, but then his upside is limited to the cost of the option and his downside is unlimited). the “naked” aka “uncovered” aka “normal” is just jargon implying you’re not bundling the trade with an existing position or accompanying trade in the underlying stock. generally, unless you’re a pension fund/asset manager sitting on an existing and compulsory portfolio of stock, or you only want a BIT of a movement and know what it means to buy/sell at different strikes, you should only consider the “naked” option. think of it like modularising messy code — keep each bet separate unless there’s a VERY good reason not to. my email app does not have an embedded calculator without a VERY good reason. my downside view on MSFT does not have an embedded long position without a VERY good reason.
summary:
to gain unit exposure to a fall in MSFT’s stock price, the primary/cheapest options are:
• short the stock
• long puts on the stock (remembering to scale the purchase UP by the fraction 1/delta and DOWN by the contract size; google “equivalent asset postion” for explanation)
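to make that 1/delta scaling concrete, here’s a toy sketch (the delta, contract size and share count are made-up for illustration, not real MSFT option data):

    # how many put contracts give roughly the same downside exposure as being
    # short N shares?  scale UP by 1/|delta| and DOWN by the contract size.
    # the delta and contract size here are illustrative assumptions only.

    import math

    def equivalent_put_contracts(shares_short, put_delta, contract_size=100):
        """contracts such that contracts * contract_size * |delta| ~ shares_short"""
        return math.ceil(shares_short / (abs(put_delta) * contract_size))

    # mimicking a 1,000-share short with puts whose delta is -0.40:
    print(equivalent_put_contracts(1000, -0.40))   # 25 contracts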
>>You’re confusing naked short selling with uncovered calls and naked puts.
>And it’s because I don’t understand distinctions like this that I haven’t short-sold Microsoft. I understand economics but my grasp on financial mechanics is weak.
i can do you a jackrabbit primer by email if that’s useful, eric. once you remove the jargon clutter, it’s actually pretty simple.
regardless, for people simply trying to take long/short positions, i UNRESERVEDLY RECOMMEND the new breed of spread betting shops, where you are essentially connected to a sales-trading desk (full phone access, not just net) and are given a free day-trader tool of surprising capability. http://igindex.co.uk/ is essentially top-flight professional-level market access at walmart prices. in this case, eric, you could swerve the put/call/collar/spread/blahblah decision and simply say “short MSFT, kthxbye”
>google “equivalent asset postion” for explanation)
^postion^position
>What most people see as “Massive DRM” is Vista asking for authentication every single time a program tries to escalate its rights.
That’s not all of it by any means. Vista spends a lot of cycles and code complexity trying to verify that the DRM isn’t being end-run by hardware shims or software emulations, in an attempt to ensure that no undegraded video and audio can get to where it can be re-recorded. This causes significant performance degradation on operations as simple as file copies. Also, it means your Vista installation may refuse to work if you mod your hardware in any way, like upgrading a video card or hard drive.
>By now, someone should’ve broken this DRM
Your assumptions are naive. What the Vista core has embedded in it isn’t DRM in the normal sense, which is essentially a combination of an encryption protocol with some sort of privilege manager. These systems are easy to crack because privilege managers tend to leak ostensibly secret keys during normal operation (see Kerckhoffs’s Law for discussion of related issues) and can be spoofed relatively easily if the user has control of the hardware and software environment. What you misunderstand as “DRM” in Vista is mostly checks intended to ensure that application-level DRM can’t be spoofed or evaded.
> you can also use them to describe the direction/sign of resulting (net) exposures. eg, if i long 100 shares, then short 20, i’m left long 80. short 80 to close out the strategy.
I’m aware of this usage but it strikes me as sloppy. It’s much clearer to say “buy” and “sell” to describe a particular transaction, and “short” or “long” to describe net exposure. And okay, I guess I should have said “shorting equity has nothing to do with options”, as it’s legitimate to say that selling uncovered calls puts one in a short position.
> Also, it means your Vista installation may refuse to work if you mod your hardware in any way, like upgrading a video card or hard drive.
This is no longer true, and it relates to a different kind of DRM than what we’re talking about. Windows requires itself to be “activated”, i.e., you need to send your CD key to a Microsoft server and receive back confirmation that you’re using a legitimate copy. This is to protect Windows itself from piracy, not to protect digital media. If you reinstall Windows or make significant changes to your hardware (IIRC, modify three or more out of eight checked components), you need to reactivate it. Last I heard, Windows will let you activate the same key up to 10 times without hassle, but Microsoft constantly messes with this number; it’s previously been as low as 2.
It used to be that if you didn’t activate Windows within 30 days, it would only boot in safe mode, but as of Vista SP1, it merely nags you with popups and periodically resets your desktop background to black.
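For what it’s worth, the heuristic being described looks roughly like the sketch below; the component list and the “3 of 8” threshold just mirror the recollection above, not any published Microsoft spec.

    # Sketch of the hardware-change heuristic described above: activation is
    # keyed to a fingerprint of several components, and changing "enough" of
    # them (the comment recalls 3 of 8) triggers reactivation.  The component
    # names and threshold here just mirror that recollection.

    CHECKED_COMPONENTS = ["cpu", "motherboard", "ram", "video", "disk",
                          "nic_mac", "optical", "volume_serial"]

    def needs_reactivation(old_fp, new_fp, threshold=3):
        changed = sum(1 for c in CHECKED_COMPONENTS
                      if old_fp.get(c) != new_fp.get(c))
        return changed >= threshold

    old = {c: f"{c}-v1" for c in CHECKED_COMPONENTS}
    new = dict(old, video="video-v2", disk="disk-v2")   # two changes: still fine
    print(needs_reactivation(old, new))                  # False
    new["ram"] = "ram-v2"                                # a third change trips it
    print(needs_reactivation(old, new))                  # True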
I only meant to italicize the word ‘equity’ two posts up. I either typoed the closing tag or WordPress swallowed it.
>i can do you a jackrabbit primer by email if that’s useful, eric. once you remove the jargon clutter, it’s actually pretty simple.
That would be interesting.
>I’m aware of this usage but it strikes me as sloppy.
quite the opposite. it is unambiguous. which is why it’s the industry standard. you are absolutely right that for a lot of instruments (by volume) “buy/sell” is equivalent. but as soon as you get away from the plainest of plain vanilla, buy/sell actively becomes a distraction. take a simple inverse floater — long rates (the numeraire) is a sell, same as a bond (“stock”), yet buying a floater leaves you flat. in credit derivatives, both pooled and vanilla, “investors” (the standard term) are short not long. and with say a triple currency reverse dual currency rollercoaster swap with a step-down step-up floor, well, you do want to be bloody careful.
we use long/short not to obfuscate but to clarify. only when safely tucked into one single market or trade does a desk start casually saying “buy” or “sell”. prop.traders and hedgefunders work through a strategy’s longs and shorts and only at execution time re-affirm buy or sell in each instrument. equity fund traders going long yen sell japanese importers. even plain vanilla debt traders, govie AND credit, first state their hedges at each ladder rung as long/short before selling/buying (respectively).
but yeah, for constrained single-market or single-trade discussions you can use buy/sell. the custom reuters keyboards for the directly connected exchange traders have the 2 execute buttons labelled Buy and Sell, for example. this is sensible: each trader is confined to only a single trade and they frequently talk to the public — it’s valuable to reduce any constant cognitive dissonance. unbelievably, though, the buttons are RIGHT NEXT TO EACH OTHER. as in: flush. seriously! every single fx trader on earth (fx is the fastest market by far) has at some stage in his career hit one in haste, looked aghast, and immediately hit the other twice.
>>i can do you a jackrabbit primer by email if that’s useful, eric. once you remove the jargon clutter, it’s actually pretty simple.
>That would be interesting.
it’s on me list!
fuck it, it’s a hell of a lot less irritating than everything else on the list — i’ll start tonight
Cc me while you’re at it (email address is on my website).
> What you misunderstand as “DRM” in Vista is mostly checks intended to ensure that application-level DRM can’t be spoofed or evaded.
Interestingly enough, file copies on Vista SP1 and later are faster than they are in XP. A Knowledge Base article on slow file copies in Vista showed that there was a glitch in the journaling system for NTFS 5; it was doing the journal validation in a very processor-intensive way. I believe one of the Linux file systems experienced a similar glitch.
And my core argument still stands:
I’d think that with a target as juicy as Windows Vista, with two years, and umpty-million copies out there, that someone would have cracked this, and the crack would be used as an example of why Windows/DRM sucks.
This has not (to the best of my knowledge) occurred.
Which do you think is likelier?
1) Microsoft made something that couldn’t be broken, or
2) It was touted as a feature, but doesn’t actually exist?
Most of the performance testing I’ve read for the beta build of Windows 7 indicates that it’s faster than XP in nearly every benchmark so far. Vista SP1 got rid of most of the disk I/O issues compared to XP (that journaling glitch, mentioned above).
>I’d think that with a target as juicy as WIndows Vista, with two years, and umpty-million copies out there, that someone would have cracked this, and the crack would be used as an example of why Windows/DRM sucks.
What, you mean the anti-spoof checking? It’s been cracked, all right. Repeatedly. The problem is that, as with mods to defeat the activation check, the binary patches are finicky to apply and extremely prone to break things if you have a different build of the OS than the one the patcher was expecting. So they tend to be published on warez sites but never deployed much in the real world.
But Microsoft CAN’T squeeze cost out of the Netbook market, even dropping the price of XP to zero. All Windows boxes need antivirus (or they quickly need reimaging). The 3-year cost of antivirus subscriptions is as much as a $200 Netbook!
This is a “Microsoft Tax” from the lousy security architecture that is simply crushing in this market segment, and Microsoft doesn’t even see a penny of it.
Quite frankly, my wife hates Vista because it’s slow, bloated, yadda yadda. What’s weird is that it’s Apple that’s holding her back from switching to Ubuntu – iTunes is important to her, and Banshee and Amarok aren’t quite there yet. When that happens, we’ll be Microsoft free, and I don’t think we’ll be the only ones. I can’t help but think that Ubuntu on a Netbook would be stable and easy to use for many more years than any Windows computer would be.
Interesting, and it contrasts a bit with http://esr.ibiblio.org/?p=734 .
Assume that any given piece of software is produced in 1000 different builds, which may mean simply inserting a fairly large number of random NOPs in the machine code or something more complicated. When you buy a CD or download it you get a random build.
Assume that users aren’t told the build number and even comparing the file length is futile as care was taken that all are the same length.
Even if crackers take the pains to crack all of them, for the users it’s just too inconvenient to try to figure out which one you have.
Presto! DRM that’s not unbreakable but still almost always un-broken in practice.
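Here is a toy sketch of that diversification idea; it only illustrates why a fixed-offset binary patch stops generalizing across builds. Real diversification would have to happen at compile/link time so that jumps and relative offsets still resolve, which a byte-level insert like this does not handle.

    # Toy illustration of build diversification: pad the "machine code" with NOP
    # bytes (0x90 on x86) at random positions, so a crack that patches a fixed
    # offset in one build hits the wrong byte in another.  Real diversification
    # must be done by the compiler/linker so control flow still resolves; this
    # only shows why one-size-fits-all binary patches stop working.

    import random

    NOP = 0x90

    def diversify(code, n_nops, build_seed):
        rng = random.Random(build_seed)     # the seed stands in for a build number
        buf = bytearray(code)
        for _ in range(n_nops):
            buf.insert(rng.randrange(len(buf) + 1), NOP)
        return bytes(buf)

    original = bytes(range(32))             # stand-in for a code section
    build_a = diversify(original, 8, build_seed=1)
    build_b = diversify(original, 8, build_seed=2)

    # A patch aimed at offset 20 of build_a usually lands on a different
    # original byte in build_b, so it corrupts rather than cracks:
    print(build_a[20], build_b[20])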
To Ken Burns:
Do you have pointers to your data on malware breakdowns by OS version? I’m in the security industry, and while Vista is somewhat better than XP, I’m skeptical of a lot of your comment. Links would be very helpful.
I’d point out two things, one theoretical and one practical.
Theoretical: There’s a big, big difference between best practice and typical practice on the desktop. I know folks who can lock NT down basically as tight as Unix; you don’t see anyone running this way in the real world. As long as you let users install new software, the malware will get installed.
“It looks like you’re installing software. Allow or deny?”
“Allow, and please SHUT UP!”
[malware gets installed]
Practical: There’s no way that Vista will run on the Netbook hardware without making everyone really unhappy. Therefore, they will ship with either XP or Linux. If it’s XP, your numbers on Vista vs. XP will change rapidly in XP’s favor.
Not trying to be argumentative, but I really don’t see that Vista is making much difference in the anti malware battle.
Sorry, that’s “To Ken Burnside”.
These are all very interesting points, but has anyone considered that there might be some people who simply don’t care what operating system they’re running?
to be fair, with the death of MacOS, and the individually voluntary but uniform drive to UI&architecture homogenisation since pre95, we are all, UI-wise, running the same OS.
i’ve been a mac-man for human stuff and a unix-man for tech stuff since 1988. but mac died with macosx. macos was a good cut at mac but macosx is nothing like mac. macosx10.1-5s are essentially identical and a wild leap backward from macos7.6. hell, from 6. christ, in most UI respects, from 4. god, i’ve just depressed myself.
macosx 2009 remains ur-identical to sunos 1990/irix 1988. xp is a BETTER macosx than macosx, and it’s closer to mac (HIG) than macosx. macosx is just an app server and xp is a better app server AND more user-friendly. by a long way. and it kills me to say it. but as a long-time mac zealot (vs apple zealot), if xp (just as posix as linux or macosx) switched to using mac mouse drivers, it would dominate macosx in EVERY ui aspect left to macosx. right now i’m continuing with mac due to financial inertia. my next machine will be either a macos-capable pismo or a “whatever” XP box. the bloodpressureexploding stupidities of macosx vs xp have finally exhausted the reserves of my preference for clean mouse use.
Saltation:
I’m a fan of Mac OS X, despite its faults. (I think it should take a cue from X and be network-transparent). Can you give some examples of why you think the interface is so bad?
Okay, but if anybody is wanting to short Microsoft stock, remember that the stock price might not go straight down, and so you will need to have a supply of cash sitting on hand to back up your short position. Otherwise you could get a margin call. As you come close to losing all of your money, your stock broker will automatically close out your position so that you don’t end up having a brokerage account with a negative value.
I owned SCO stock because I was a Linux kernel developer, and a portion of Caldera’s stock offering went to every kernel developer. I knew that SCO was going down when they started suing people. I sold my stock for $2/share because I thought that everyone would realize that, and dump their shares of SCO. They didn’t. It eventually went up to $20. Had I immediately turned around and shorted it, I would have had to pump TEN TIMES as much money into it in order to not get a margin call. All for an approximately $2/share gain. Had I sold short at $20, I would have gotten a $20/share gain, or approximately 100% gain.
The key to making a LOT of money on short selling is to double your holdings when the stock falls in half, and double again, and again and again and again. On a stock like SCO, you could have gotten 7 doublings, or turned a $10,000 investment into $1.2M (presuming that that many shares were available to short).
>All Windows boxes need antivirus (or they quickly need reimaging).
Utter falsehood. I’ve had my Windows XP box on the same install for five years now, running most of that time without antivirus software. I install some every year or so to check things out, and then delete it because it’s a system hog (I haven’t upgraded my CPU or RAM since 2002, so it’s a bit on the slow side even without antivirus).
In that time, I’ve gotten nothing in the way of viruses. The worst any of the AV programs has found (and I’ve tried different ones and updated them before use) is the odd tracking cookie. Windows Update and a brain are enough to keep XP safe. 98 was a whole different kettle of fish – I had two major outbreaks in eight months on my dorm LAN, despite doing all the security stuff and having AV installed – but those days are gone for people who know what they’re doing.
> Also, it means your Vista installation may refuse to work if you mod your hardware in any way, like upgrading a
> video card or hard drive.
to be fair, your Linux installation may refuse to work if you upgrade a video card or hard drive, albeit likely for different reasons.
>has anyone considered that there might be some people who simply don’t care what operating system they’re running?
Spot on. Watching porn and checking mail are two of the greatest achievements of the day for an ordinary fat guy down the street. So why should he give a damn?!
“98 was a whole different kettle of fish – I had two major outbreaks in eight months on my dorm LAN”
Yes. My first flirt with Linux, about ten years ago, with SuSE 6.1, was when a virus destroyed all my pictures and all my Word documents. Then I practiced dual-booting for a long time, but when I left my powerful PC half a continent away and had only a laptop with limited HDD space on me, I had to choose, and I chose XP for the games. I had some bad feelings about it, missing the command line and all, but I just didn’t want to live without Steel Panthers Main Battle Tank and stuff like that. And Skype Video – the only thing that makes living a life half a continent away from loved ones bearable. And I had zero virus issues with it.
I think the major reason is not really that XP is better, but it’s Firefox. Internet Explorer is a malware magnet, leave that out of the picture, use a firewall, don’t install stupid shit, and Windows becomes relatively malware-free.
I think Win2K and XP were relatively OK, many technical reasons to hate MSFT disappeared at around that time.
Vista changed the picture back – when my only choice is Vista I’ll either pick up my old, still powerful box or buy a new one, but definitely back to dual-booting again. Perhaps by then someone will have ported SPMBT. (Videocalling is already OK on Linux.)
30 years later historians will be puzzled how and why MSFT’s approach to OS design progressed like this: clueless, clueless, clueless, clueless, WOW! clueful!, clueful again!, clueless, clueless.
I think more technical reasons to hate MSFT disappeared with the Vista release. Microsoft finally gets serious about security and even Eric complains that it “sucks” because it’s not backwards compatible.
Sometimes schisms with the old have to be made. We were saddled with Win9x crappiness for years because Microsoft wouldn’t let go of native DOS compatibility and couldn’t find a good way to shim it into place. (Linux had DOSEMU. DOSEMU was awesome.)
The problem with shorting is that it’s not enough to realize a company is doomed – you have to correctly predict when everybody else will realize the company is doomed. There’s an old adage that’s well worth remembering – “The market can remain irrational longer than you can remain solvent.”
I think OpenOffice(.org, ugh) and the like solve the wrong problem. People don’t want a free Office clone, because it will never be as compatible as they’d like, and it will never be any better than Office. I recently finally booted Microsoft Office in favour of Scrivener. Now, Scrivener won’t meet everyone’s needs, but that’s how I see Word being defeated – by applications that don’t adopt all its numerous design flaws, and which target each individual niche. I think it’s madness to think that everybody needs to have the same monolithic word processing software, and the only reason that it’s the status quo is file compatibility.
For many people, such as bloggers and grandmothers, Google Apps is sufficient to replace Microsoft Office. For novelists and screenwriters, something like Scrivener is perfect. For academics in the humanities, there still isn’t a perfect replacement. When I’m writing an academic paper, I need something with good reference manager support, or a good built-in reference manager.
Ken Burnside: do you happen to have some links of the figures you posted on Vista vulnerabilities?
Bennett > People don’t want a free Office clone, because it will never be as compatible as they’d like, and it will never be any better than office.
I don’t see why that should be true. I’ve been an administrator for Linux desktops and I’ve heard quite a few users say that they like this or that feature of OOo better than the MS Office equivalent (I rarely use either myself). E.g. OOo Writer apparently handles long documents better than Word, which is a significant advantage for our users. Now that OOo extensions are catching on, I can easily see a situation similar to Firefox vs. IE emerging. Granted, MS has huge resources behind Office, OOo is built on a less-than-ideal codebase and Sun may be bureaucratic in running the project, but I don’t think there is any inherent reason why OOo could not be developed further than MS Office. That may turn out to be the case, but it does not automatically follow from the facts that MS is a big company and Office formats are a de facto standard.
I do think you have a point about office suites in general. I would phrase the problem in terms of data formats: what would a nice, clean, powerful and free document format look like, if you designed it from scratch in 2009? Something minimal that suits web-based apps well and around which lots of tools could grow. Hopefully not XML. Obviously .doc is very bad indeed and ODF is not exactly small or nice either.
Hopefully something text-based.
>What, you mean the anti-spoof checking? It’s been cracked, all right. Repeatedly. The problem is that, as with mods to defeat the activation check, the binary patches are finicky to apply and extremely prone to break things if you have a different build of the OS than the one the patcher was expecting. So they tend to be published on warez sites but never deployed much in the real world.
Isn’t that the best Microsoft can hope for — that non-technical types won’t be able to defeat DRM?
>Isn’t that the best Microsoft can hope for — that non-technical types won’t be able to defeat DRM?
It’s the best either Microsoft or its big-media partners can hope for. It’s also self-destructive in the extreme, because what it teaches consumers to do is pirate unlocked versions of the music and video they want rather than put up with not being able to device-shift the content.
Props to my friend Randall Munroe.
“props”? “**props**?” “props” as in “proper respect?”
No no, you have to give *credit*. Here is the license for XKCD.
Specifically,
I suggest a link, Eric. And soon. It’s really odd that someone who founded the OSI would ignore a license.
ESR says: Now that I’ve read the license, it appears to me I was in conformance even before I added the link.
I don’t endorse piracy myself; there are plenty of DRM-free outlets (even iTunes allows DRM-free for a higher price…) whose content you will be able to keep forever. Oh, there’s also the venerable DRM-free and high-quality Compact Disc.
>I don’t endorse piracy myself
Nor do I. I was pointing out that piracy is what the media companies are training people to do with DRM, but that’s a value-free statement of consequences rather than encouragement.
Based on my experience with the Eee 1000, Microsoft has nothing to fear.
The system is damn near unusable out of the box. Installing Eeebuntu only gave me a different set of headaches. I still can’t convert ebooks on my Eee and have to sneakernet them with a usb drive from my desktop because the Linux networking doesn’t work. I’ve wasted about 15 hours unproductively tweaking this thing to try and make it work right.
I used to run Slackware as my only desktop OS back in 1997. I remember hand editing xconfig files, I remember fvwm. I no longer have the spare time nor the enjoyment I used to get from playing with Linux, I want to get things done.
> For academics in the humanities, there still isn’t a perfect replacement. When I’m writing an academic paper, I need something with good reference manager support, or a good built-in reference manager.
Have you tried LyX or something equally TeXtastic?
Not always. Steam has been nothing short of an unmitigated success, and people would rather buy authentic DRM’d games from it than attempt to pirate them and risk viruses and malware.
Smells like more of the same ol’ neverending Wishful Thinking.
> Steam has been nothing short of an unmitigated success, and people would rather buy authentic DRM’d games from it than attempt to pirate them and risk viruses and malware.
Also, the process of pirating games is quite simply beyond many computer users. Just look at all the things you have to do to pirate a game:
1. use Bittorrent to download rar files
2. unrar the files using a program you probably don’t already have.
3. find the installer and launch it
4. find a keygen and use it.
5. patch the game with the company’s patches
6. find a no-disc crack and use it.
(with the additional proviso that if you try to apply any patch to a no-disc-cracked game, go to step 1 because your game is now unusable)
I wonder if music companies could make it equally hard to pirate music.
Oh yeah, and I forgot installing a disc emulator and figuring out how to prevent Safedisc from detecting it.
And I’m not by any means advocating DRM/all the rest, I’m just saying that it has the potential to be more effective than some estimate it to be.
>Just look at all the things you have to do to pirate a game:
You can’t really count step 3 as piracy hassle, as that would be necessary with a purchased copy too.
As an experiment, I once “pirated” a game of which I have a legal copy (two, actually, one for Windows and one for Mac System 9) — Civilization II. It required only steps 1-3, as the copy I found on Pirate Bay already had all relevant patches and cracks installed. This random sample suggests that game crackers are already doing a pretty good job of hassle minimization.
The goal was to see if I could get it to work under WINE. It doesn’t.
Here’s the starting point – most of the links going from this article dig deeper into the statistics.
http://blogs.zdnet.com/Bott/?p=505
And – I’ve been running the same XP install since May 2004. I also have pretty safe computing habits; I never run in an Admin level account, live behind a router, and do periodic scans with an Antivirus and malware sweeper about once a month.
Aside from one instance of the Slammer Worm (which got installed when I was surfing on an Admin level account while installing a patch…), I’ve never found malware on this system. Been going on 4 years without an incident.
>I’ve never found malware on this system. Been going on 4 years without an incident.
I’ve been running Linux for 15 years and never had an incident. The difference is that to achieve your record you have to run behind a router, do periodic malware scans, and be in the top 1% of Windows users ranked by technical savvy. I don’t have to; I could run a stock Linux box on a bare cable without having an expected time-to-zombification of less than 17 minutes.
Why is there furry art on the LaTeX project site?
AROS was bad enough, but those Amiga guys were always a little weird and Eric Schwartz, for better or worse, is a vocal member of their community.
But to run Linux well you have to be in the top 0.1% of computer users ranked by technical savvy.
Jeff, I wouldn’t say that.
Jeff, as you probably noticed, my last post was about Steam not computer proficiency.
What furry art are you talking about?
>But to run Linux well you have to be in the top 0.1% of computer users ranked by technical savvy.
As I have pointed out before, my septuagenarian mother refutes this theory.
even where it doesn’t affect me constantly, the design attitude intra-macosx gives me the irrits.
examples:
• leopard locking the user out of their machine for the duration of downloading system updates. not a major drama for me on 20Mb/s. sorry, everyone else: we don’t want you — stick to *snigger* tiger. or panther *giggle*.
[off-topic, but time machine offends me viscerally for the contrast b/w the effort and the marketing. i built the identical architecture (core) in ’96 to get around a client’s refusal to allow installing a VCS. it took less than an hour in shell/cron between idea and “production” — no change needed ever since.]
• auto-defrag (ie, re-save) of all files over 20mb on OPEN. open!! not save!! even if you discount that this reduces macosx’s viability as a high-frequency file/web server, consider the inanity of the stance of preferring a FS to be pure even if received in poor condition, compared to checking on save. think of the relative real-world probability of these causes of fragmentation (~0% vs ~100%). yep, we’d rather irritate every user every week with unexplained sluggishness than offend ourSELVES with a potentially impure file after disk-swap.
• implementing a buggered VM pager as the new default post-jaguar. ok if you have many spare gigabytes of disk (which most people DO, nowadays), but catastrophic if you’re running close to the wire (eg, old system or nearly full disk). i explain the problem and my fix for it (trivially simple but you have to hack /etc/rc) here, if you’re interested. (written for nontechies; if you recoil at the mention of textedit, go vi)
>this is the main reason why some machines which could quite happily run Jaguar came to a juddering halt with later versions. the disk jammed.
why?
it’s not “flexible” at all — what it is is “exponential”. instead of grabbing another Xmb of disk each time to cope with new memory demands, it grabs twice the previous grab. so: 20mb, then 40, then 80, then 160. all well and good, perhaps, if the releasing of diskspace was clever. it’s not. try running up a couple of gb of vm files by cranking up lots of heavy apps, then quitting all apps. wait as long as you like, those vm files will stay there.
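For anyone who wants to see just how fast that doubling eats a disk, here is a small illustration of the growth pattern described above — a sketch of the arithmetic only, not the actual Mac OS X pager code — comparing a fixed 20 MB grab each time against grabbing twice the previous chunk:

# Illustration of the growth pattern described above (20 MB, 40, 80, 160, ...)
# versus a fixed 20 MB grab each time.  A sketch of the arithmetic only, not
# the actual Mac OS X pager implementation.

def total_swap_mb(first_chunk_mb, expansions, doubling):
    """Total disk grabbed after a number of swap-file expansions."""
    total, chunk = 0, first_chunk_mb
    for _ in range(expansions):
        total += chunk
        if doubling:
            chunk *= 2           # each new grab is twice the previous one
    return total

if __name__ == "__main__":
    for n in (4, 8, 12):
        fixed = total_swap_mb(20, n, doubling=False)
        doubled = total_swap_mb(20, n, doubling=True)
        print(f"{n:2d} expansions: fixed {fixed:>6} MB, doubling {doubled:>6} MB")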
hi eric
a/ ta for killing the syntax-buggered comment
b/ but you also killed the fixed comment
c/ unfortunately, we crossed in the net: the above 2nd-one now makes no sense sans-context
perhaps you could delete these two too, and i’ll post both back again later, cleanly? (tried to email you this, but my ISP is blocking it)
…assuming of course it’s not too off-topic for you (it’s just response to a direct question). in which case you should probably just kill the 2nd one and let me know by email so i know not to re-post.
(tho you might be interested to check out the link re fixing macosx’s VM before you do)
That’s because she has you to keep her box running. Have you learned nothing from all the ranty comments of the form “X doesn’t work” that your friends, smart people at that, keep posting whenever you crow about how Linux is ready for prime time?
Windows works well enough for most people to run on their own, and it can be kept working well enough with a few simple steps: use a firewall, check for viruses, etc. (Gee, this is starting to sound like health class…)
You can’t get Linux working at all with an arbitrary hardware configuration; the odds are still good that you will need to do some command-line tweaking to get everything working, particularly in the domain of wireless networking and graphics.
Should read “The average user can’t get Linux working at all…”
>That’s because she has you to keep her box running.
Sorry, this ain’t it. A bit over a year now, and exactly *one* OS-related tech-support call. Most of what she’s needed help with would be the same how-do-I-do-this-with-a-browser issues that would come up on Windows.
The one OS-related call happened after she hit the power-off button at the wrong time and hit the tiny window where even a journalling filesystem needs to be fscked.
Jeff, the lion mascot goes all the way back to The TeXbook. AFAIK, Don Knuth is not a furry. If I’m mistaken, then I’d like to get off this planet as soon as possible.
Not to mention the buggy upgrades. I recently upgraded from Ubuntu 8.04 to 8.10 and when I rebooted, X didn’t recognize the keyboard and mouse. I had to use ctrl-alt-F(1..7) to get to a shell prompt and mess around with the boot order in /etc/init.d/. It turned out that X was being loaded before the hardware abstraction layer. This is an epic fail, folks, and stuff like this happens all the time (another issue was that it would always hang on reboots). I have to admit that while troubleshooting, I was sorely tempted to go back to a system where Most Things Just Work.
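For readers who have never had to do that “messing around”: in the SysV-style init that release used, start order within a runlevel is just the two-digit prefix on the S-symlinks in /etc/rc*.d/. A tiny sketch (paths assumed, for illustration only) that prints the scripts in the order init would start them:

# Sketch only: list SysV-style init scripts in the order init starts them.
# Assumes a traditional layout such as /etc/rc2.d/S<NN><name>; systems that
# have moved to upstart/systemd won't have this.

import os
import re

def start_order(rc_dir="/etc/rc2.d"):
    """Return (sequence, name) pairs for start scripts, sorted by sequence number."""
    entries = []
    for entry in os.listdir(rc_dir):
        m = re.match(r"S(\d\d)(.+)", entry)   # S = start, two digits = order
        if m:
            entries.append((int(m.group(1)), m.group(2)))
    return sorted(entries)

if __name__ == "__main__":
    for seq, name in start_order():
        print(f"{seq:02d}  {name}")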
I dual boot XP and Ubuntu, and I’ve also noticed that Firefox hangs a lot more than IE7, which, when it does fail, at least has the good sense to crash rather than leave you wondering.
Saltation, modern OS innovations like virtual memory and hardware abstraction have sounded the death knell for snappy, responsive GUIs.
Multimedia software developers have apparently been complaining about the intolerable jitter that non-realtime preemptive kernels like those in Windows and Mac OS X introduce. In the old days, everything was hand-optimized to respond to external input events, and you could directly access the interrupt hardware. This is no longer the case, and our computers have felt more sluggish ever since as a result.
Those two domains are really the last troublesome ones, and both of them are the fault of the wireless and graphics vendors, not of Linux. In particular, the companies that produce many of these parts refuse to have their “trade secrets” stolen by competitors and lock everything down with a tight grip; no documentation for you, and if they even bother with a Linux driver, it’s closed source and most often extremely buggy and crash-prone. Both arenas are changing for the better, however; the last three companies really putting up difficulties are Broadcom, Atheros, and NVIDIA.
Thomas: Unfortunately I have to place the blame for your problem on incompetent Ubuntu maintainers (unless there’s any indication that it’s another problem, but it doesn’t look like it); this is one of the many reasons I switched from Ubuntu to Debian.
“As I have pointed out before, my septuagenarian mother refutes this theory.”
You know what’s strange? The widespread assumption that Windows is easy for non-techies. No, it’s not; that’s the Mac. BLOODY big difference. Pretty much everybody who manages to achieve an acceptable level of comfort using Windows at home does so with the help of a geek.
I can’t even count the number of times I visited someone, booted their computer to show him/her some useful website, cringed at the look of IE 6.0 with 12 toolbars and 143 pop-up windows right at starting the application (which took about 3 minutes because of all these), installed Firefox, installed Avira AntiVir, installed a registry cleaner, installed this and that… then they asked how to overcome DVD regional restrictions, I asked what the hell that even was since it had never been a problem for me, sighed, installed the VLC video player, etc. etc.
Basically what happens with most people is that someone who knows what he is doing spends half a day making their Windows usable. After THAT it’s fairly OK and user-friendly, of course. But not before.
> The last three companies to really be putting up difficulties are Broadcom, Atheros, and NVIDIA.
Broadcom has already fallen; Atheros’s days are numbered, and there’s a light at the end of the tunnel for NVidia. The situation with hardware RAID is currently a lot worse than either wifi or graphics.
> ESR says: Now that I’ve read the license,
Kinda says it all, no?
> it appears to me I was in conformance even before I added the link.
How so? Where was the attribution? Where was the linkage?
Good post, and love your blog. Liked that one on the Arthurian stuff, as did the kids.
From the standpoint of a consumer who can’t afford to hire tech help with our computers, but whose family of geeks is using them almost all day, I think Windows is for the birds. We are all fairly techie (about 16 computers in the house in varying stages of youth or decrepitude). We mostly hate Windows after years of security and crash hassles (particularly on the Dells and especially with Norton’s). We have always fixed our computers ourselves, with a lot of dreadful help from India. When a problem was insuperable, we got another computer. Vista was the last straw.
The gamers in the family are stuck with Windows machines because of their inventory of games and the new games and second hand ones being cheaper on Windows than Macs. Countless problems with lost data, fried hard drives, spyware, viruses, mostly from kids lowering firewalls to load music, films, games. The only time we really needed Windows was when the kids were editing their school magazine on Publisher and were working in both places.
I love Macs mainly for ease, great built-in software, beautiful graphics and accurate color for photo editing. Not a one of us has ever had any trouble doing work for school or the office on a Mac. We install Office for Mac on one machine, tho I don’t bother on mine. I mostly use an iMac for my photography and the ease of iPhoto, seamless etc. Have never had a single problem with any of my Macs. The most beautiful displays.
I was thrilled when the first Linux netbooks came out and got an Asus EEE, but have given up on it as we could never configure ours to connect reliably and consistently to our passworded WPA network, and it would not interface with any peripherals at all. Not any MP3 players, no cameras, no printers, nothing. Ditto with the OLPC laptop I got the kid when it first came out. Fantastic machine for a kid but it couldn’t connect to a network or interface with printers. I spent hours on the Asus and OLPC user sites trying to figure out the gobbledygook but couldn’t do it.
This is where Microsoft should shudder: Despite being a geek and the office fixit of all things mechanical there, I won’t bother with a home machine I have to reprogram. And nothing with Windows is ever easy. It’s like loving someone with incurable issues. You bond to anything you work hard on, but you may not have the energy or inclination to bother. I don’t. I work fulltime, have a husband, several kids and pets I would rather embrace, and just don’t need another hobby. A computer is a tool, not a love object for me (although my feelings for my old MacBook Pro led my kids to call me Gollum).
Although I can use the Asus Linux at the open network at the gas station and the library, in our neck of the woods there aren’t free open networks generally, so it is now a brick. I got the kid a $250 one on sale for Xmas that has Windows XP on it so it can connect to our network, and a solid 12 GB drive so his dropping it on the bus, etc. doesn’t cause a problem. He uses it as an ebook, photoviewer, homework log, and for writing stories, email, net surfing, etc. With 2.5 hours a day on the schoolbus, the thing is a godsend for work and entertainment.
It’s funny that nobody ever talks about a major pair of factors in the popularity of netbooks: the worsening reading eyesight of all us babyboomers (it kills me reading my Blackberry for very long, but I can’t afford an iPhone and don’t want AT&T’s rotten service, which is nonexistent where we visit), and the fact that now that all our work computers are monitored, we need something we can do personal business on during a lunch hour without trying to decipher a 2.5 inch screen. I remember having to contact estate lawyers and draft legal materials on said tiny screen, and it was an ordeal. If you have an important email to draft or document to peruse, the netbook is easier to do it on.
I got a slightly better one myself (160 GB Acer) although the Windows drives me buggy. I wanted the 160 to keep a fair library of photos, some ebooks, some music and audible books, as an all-purpose gadget for travelling and taking to work. I installed Safari, Firefox, etc. on it as soon as possible, but it is not the greatest for photography. I use it to preview pictures when out taking pictures, but the interface to email more than a few pictures is incredibly clunky and limiting. It is more than adequate for blogging away from home, tho, and I won’t care if it gets stolen or broken as I would about a Mac laptop. These machines need to have a DVD drive, tho, to install stuff or to play a game. And a video card.
People’s increasing use of cloud technologies (blogs, Facebook, picture sharing, Google, etc.) makes it possible for us to use these relatively low-spec netbooks when out and about, which most of us are most of the time. Also, once you’ve had a high-tech gadget stolen (as recently happened to me with a Blackberry with a year’s worth of emails, passwords, personal data, etc. foolishly saved on it) you come to prefer something that is bare bones, with less use for storing private documents and more just for on-the-fly access or entertainment. The next thief will not have so much fun reading my letters from Aunt Lavinia….
And Apple. Well, I am still sulking since the creeps at the Apple Store claimed my three-times-repaired problems on my MacBook Pro were the result of accidental damage (they weren’t) and refused to fix it under warranty. I decided after that to never again get anything Apple. Well, maybe only Apple desktops. My current iMac is just fine for now… Just trying to avert my gaze from the new 17 inch MacBook Pro with the 8 hour battery and the new better photo software…..
Of course, if I get a pink slip next week I shall feel very foolish having this mobile tech as I settle in to the rest of my life growing spuds and turnips, warmed by the dog, and cranking a hand generator to stay connected to the internet….having sold the children into indentured servitude, whilst better half breaks rocks in the fields…
jeff: nah. they are conflating bloat with features. beos had all the preemption etc of current OSs and ran so snappily you could have several videos running while waving around yet another window with yet another video running, all at full clip without a stutter. on 68030 chips. besides, if it was that big a deal, they could insert a per-PC RTSP engine in with a wider jitter parameter than normal.
for some reason, eric’s not deleted my out-of-synch 2nd-comment, above. so i’ll just re-post the (originally) preceding comment, even though it will now bizarre-up the sequence.
Shenpen:
>and thus had to choose, and chose XP for the games. I had some bad feelings about it, missing the command line and all
install Cygwin — gives you a surprisingly complete unix-alike cmdline env
David Delony:
>I’m a fan of Mac OS X, despite its faults. (I think it should take a cue from X and be network-transparent). Can you give some examples of why you think the interface is so bad?
please be very clear i am talking about USER interface, in terms of interaction with the computer, not (just) the graphics/skin.
so, examples.
just quickly off the top of my head:
• clickthrough focus-setting.
• compulsorily interleaved windows (used to be optional per-click, that approach chosen as overwhelmingly best after user-testing).
• braindead redesign of menus so that all the most-valuable parts of the screen real-estate are now the most useless & rarely used — bye-bye mile-high menubar, it was fun/fast while it lasted.
• transparency.
• loss (through deprecation) of type/creator metadata.
• unixlike FS means file-recovery rates on dud disks and deleted files are 50% if you’re LUCKY (more typically 10%; used to be ~100%) and even then you typically lose dir hierarchy info.
• inability to backup 100% (!!).
• hardwiring of aliases into inodes, meaning an interdisk copy or even a restored backup loses any links.
• loss of user-defined file hierarchy (move things round to match your preferred current mindmap: things stop working — in many cases (granted this IS becoming much less frequent) apps won’t even work if they’re not in /Applications).
• failure of apps to respect aliases of support files/folders if you restructure things for better ease of use (eg, backup).
• loss of ability to casually AND EASILY REVERSIBLY hack the kernel to add/modify preferred system-wide behaviour.
• the explosion of subsidiary critical files thru the FS per app, making maintenance/repair/tweaking a major exercise (requiring deep and detailed per-app knowledge) instead of trivial.
• loss of effortless multi-OS systems.
• loss of drag&drop system installation/repair.
• hardwiring of OS into boot partition and FS structure.
• OS “cruft” (old MacOSXs clog and eventually fail (i lost a Tiger install this weekend, eg) just as we used to laugh about Windows systems doing).
• loss of the general effortlessness and not-dependent-on-deep-knowledge-ness of customisation or troubleshooting.
• the appalling and constantly frustrating windows-like weak feedback re current operation delay.
• collecting destructive and nondestructive controls together in the window titlebar, requiring precise mousing to ensure nonloss for something trivial.
• similarly braindead replacement of the app menu/launcher menus with the dock (“what apps are running?” “umm… give me a minute”).
• the fixes for common problems (i help on a few MUGs) nowadays are most commonly the cmdline/special-computer-knowledge efforts we used to lampoon windows for.
• etc etc etc.
basically, macosx users must comply with the computer’s approach (technical architecture), whereas with mac it was the other way round. and macosx’s architecture was and is designed with a view to insulated coders impressing their peers by adopting high-virtue stances (eg screen real-estate) or technologies (eg transparency), rather than thinking about effect on users. it is 100% inverted from the original ethos of mac.
it’s notable that the general friction is so high now that instead of every mac being a cheerful reflection of the owner, nowadays it’s a novelty even to see someone pasting in custom icons.
and to re-iterate, they REALLY did buggerup the default VM pager from MacOSX 10.3 onwards (but easily fixed for tech-heads).
Retriever, you have said that you cannot stand Windows, find the Linux laptops to be useless, and will not purchase another Apple computer (except maybe a desktop). Just where does that leave you?
Saltation,
You’re confusing the BeBox with the Amiga. BeOS never ran on the ‘030. In the early nineties the Amiga could run hundreds of processes, each crunching audio data in real time, on a 20MHz 68k. To this day it remains unsurpassed in terms of PMT task switching speed.
And the future of end-user software will be closed source on closed platform. The original Mac was semi-closed; that served as a quality filter to weed out people who weren’t committed to releasing software that would sell with the Mac audience. Now that the doors are wide open with free Xcode and online API docs… well, it’s almost like freshmeat out there. The iPhone appears to be doing much, much better.
Well, Phil, I took out my old college Smith Corona from the attic with its horrible associations of typing and retyping thousands of pages, and thought “my computers don’t look so bad!”. Obviously, one settles. These days I just use whatever’s cheapest and whine a lot! What I meant was, at this time, the Linux products brought out SO FAR for the general public are crippled by not being able to reliably interface with ordinary consumer peripherals, and to connect with “secure” (I know they’re not really) networks, so for practical purposes they are currently useless to me. My husband intones like a Greek chorus “Evil Microsoft prevented it,or they would have been worse competition.”
I will snap up a Linux netbook in future that will connect out of the box. I still use Windows machines because I have to in order to use certain peripherals, and some programs (at work for example). My main beef with Apple now is the price for a laptop that can handle graphics or games or edit photos, and the ever worsening build quality (which is true of all other laptops, so probably not fair to complain). I love using the Mac products, but dislike the snarky attitude towards customers. If one is over 30 in an Apple store, one is spoken to as if simple-minded by the young salespeople there (“The hard drive is in here,” as if I were 80 and soft in the head). I do love the way one can work with photos, email, websites so much more easily on a Mac. Maybe we consumers are all just spoiled now?
Although not an expert in the technical stuff, as most here are, we consumers know pretty much what we want in a computer, and even tho we may have past passionate preferences for one platform or another, we have no personal stake or brand loyalty any manufacturer can count on in future. Not for hardware or software or operating systems. It isn’t like my camera equipment where I am so locked into Nikon that that’s it, I can’t let myself drool over a Canon birding lens. I think what I am taking so long to say is that what should make Microsoft and Apple shudder is that so many consumers, like me, are so fed up with so many aspects of all the brands of computers that they end up becoming primarily price driven.
Consumers and their computers are like worshippers and denominations these days. Nobody can count on their fanboys or fangirls sticking around forever. The denominations and the Computer and Operating System Deities keep producing their product, assuming they can count on the faithful continuing to stand in line for them. But the people pick and choose, and are quite capable of defecting en masse within a year to a cheaper, simpler, cooler product if something better comes along.
Excuse the length of this (I have just spent several hours debugging the kid’s fracking Windows computer, which the little gaming monster just about choked). It becomes a competition between my more Windows-savvy spouse and me (fewer preconceptions, because more ignorant, so I often spot the problem faster) over who can kill the gremlins. Small triumph getting it going again with much trial and error. I dare say a techie would have fixed in ten minutes what took me all afternoon.
It’s worth noting that at my local Micro Center, several models of netbook are available. All of the netbooks they offer run Windows XP. They do not sell Linux-based netbooks. I asked.
Once again, the market, of which Eric claims to be a staunch defender, is choosing Windows.
Jeff Read:
>You’re confusing the BeBox with the Amiga. BeOS never ran on the ‘030.
bit of a sideissue, but no: i know both, and loved amigaos, but if you mention “amiga” in most public contexts, people autodismiss anything you say thereafter as “games machine”. so i tend not to bring it up.
you’re slightly right re 68030 — i meant 68040. my fault: i’m overly a fan of the SE/30, and 15+yrs later it can colour my language when typing fast.
i have used BeOS running on 68040s at macworld. that’s where i saw (and then did myself) the demo with waving round a video over background videos. the machine reported them as 68040s, the demonstrator claimed them to be 68040s, media articles at the same time reported them as 68040s, i have no reason to believe otherwise.
>To this day it remains unsurpassed in terms of PMT task switching speed.
i’m not surprised. it kicked arse.
>And the future of end-user software will be closed source on closed platform. The original Mac was semi-closed; that served as a quality filter to weed out people who weren’t committed to releasing software that would sell with the Mac audience. Now that the doors are wide open with free Xcode and online API docs… well, it’s almost like freshmeat out there. The iPhone appears to be doing much, much better.
hmm. the only difference i see from “the original Mac” is that instead of paying $40/yr for the full tools (essentially unixlike, incidentally) + doco, they’re now bundled/online. you can write pure posix now if you like, but almost nobody does: they’re still using a proprietary framework.
iphone is off on another planet of restriction: single release channel, enforced technically and legally.
>Once again, the market, of which Eric claims to be a staunch defender, is choosing Windows.
perhaps, rather, the Gatekeepers (corporate buyers) are choosing Windows.
this critical segment of decision-makers, btw, is the one deliberately targeted by Microsoft since Day One.
Everything I read says that BeOS was released commercially on the PowerPC and x86, with an early prototype running on AT&T hardware. Since becoming aware of BeOS in the late nineties my conception was always that the BeBox was a PowerPC platform.
And I know everybody dismisses the Amiga as a games machine, just as they dismissed the Mac as a toy early on. Everybody that is except for the people who were doing serious hardcore professional video and audio work on the Amiga at a time when no other platform could reasonably compete at that price point.
One of the things that you learn hanging around Amiga people is that it takes much, much less of a kernel than you might think to have a serious, professional desktop. Compared to AmigaOS’s tiny elegant design, Windows and Linux seem like pure bloat and excrescence (though I think the NT system has a lean and mean kernel struggling to get out from under all the Win32 cruft).
>One of the things that you learn hanging around Amiga people is that it takes much, much less of a kernel than you might think to have a serious, professional desktop.
you betcha
“And I know everybody dismisses the Amiga as a games machine, just as they dismissed the Mac as a toy early on. Everybody that is except for the people who were doing serious hardcore professional video and audio work on the Amiga at a time when no other platform could reasonably compete at that price point.”
Depends on which Amiga. The Amiga 1000/2000/3000/4000 were quite popular in professional work in Germany in that era; one could spot them in many offices, despite the horrible price. The 500/500+/600 were mostly for gaming.
Forget Linux. Phantom will rule all (and mercifully I’m not talking about that game console made of pure fail).
Money quote:
For what it could do the Amiga x000 was dirt cheap. SGI was threatened with being completely undermined by the Amiga, the next generation of which would match an SGI workstation in graphics capability at a much lower price point. But that was before Commodore got run into the ground…
Phantom seems to be nothing more than Squeak with an autosave feature. This is impressive, how? It seems like Zavalishin has a case of PMS.
I do think “the IT or consumer market is going to be back to anything like boom conditions in 30 months,” so there is somebody. Basically tech has gone through an ice age since the dot-com bust; I expect it to take off again any day now. Netbooks are a nice toy, hardly a real threat to Microsoft. Microsoft’s wounds are self-inflicted, as you state, but they’re not fatal. It will take someone strong or marauding hordes to deliver the death blow. I believe it will be the latter, and their sword will be my hybrid license, not open source, which has decidedly failed at this task.
>>jackrabbit primer
>That would be interesting.
in yer inbox. please be as harsh as you like — i can’t improve it if i don’t know how it failed you.
“Forget Linux. Phantom will rule all”
Neat. Now it’s just that little question of reproducing the whole ecosystem of Windows, or even the smaller but still huge ecosystem of Linux. We’ll see how it goes in, well, about 2020.
Actually… this got me thinking. WINE is going a bit slowly because the WinAPI often behaves differently from the documentation, etc… but the Linux kernel is self-documenting. So if you have a better idea for an OS, can you “just” emulate the Linux kernel, so that everything from bash to OpenOffice to Firefox would run on it, making your OS competitive, a viable choice? Of course you can. But how hard is it to emulate the Linux kernel? Given that most of the stuff is in modules, if you just emulate the core and plug the modules into it… it can’t be that hard, can it? But if it’s not that hard, why is Hurd a failure? In theory, if the core kernel were easily replaceable, they could plug in Hurd and easily distribute an Ubuntu Hurd distro, without the users having any problems, couldn’t they?
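A concrete way to see what “emulating the Linux kernel” means in practice: user programs ultimately reach the kernel through a small numbered syscall table, and that table is the surface a compatibility layer (like the BSD one mentioned below) has to reproduce. A minimal sketch, assuming an x86-64 Linux box where write(2) is syscall number 1:

# Minimal sketch: call a Linux syscall directly by number through libc's
# syscall(3).  Assumes x86-64 Linux, where write(2) is syscall number 1;
# other architectures use different numbers.  This numbered interface is
# what a Linux-emulation layer has to reimplement.

import ctypes

libc = ctypes.CDLL(None, use_errno=True)

SYS_WRITE = 1        # x86-64 syscall number for write(2)
STDOUT = 1
msg = b"hello from the raw Linux syscall interface\n"

ret = libc.syscall(ctypes.c_long(SYS_WRITE),
                   ctypes.c_long(STDOUT),
                   msg,
                   ctypes.c_size_t(len(msg)))
if ret < 0:
    raise OSError(ctypes.get_errno(), "write syscall failed")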
worked for apple
welcome to posix
;D
>In theory if the core kernel would be easily replaceable, they could plug in Hurd and easily distribute a Ubuntu Hurd distro, without the users having any problems, couldn’t they?
Assuming Hurd worked, it should be possible. There would probably be some issues, but ISTR seeing some Linux distro that actually ran Hurd.
I was being facetious about Phantom, though I did really think it neat that the author saw Linux as a weaker alternative to Windows and thus unable to compete.
This is already happening. Most major BSDs have a Linux emulation layer. You need a Linux userland though. And if your OS implements some radical new idea not implementable in POSIX (or Linux) syscall semantics, native applications are a necessity.
I think you mean Debian GNU/Hurd?
That’s it, thanks.
I think AROS is shaping up to be a much better replacement for Windows on ultra-mobile platforms than Linux. Unlike Linux its kernel is tiny, blazingly fast and easily comprehensible, and it would make a more attractive end-user experience. Filling in the holes that still prevent it from being a desktop for the casual user should be easy for a netbook manufacturer to fund. Furthermore, by deploying AROS, manufacturers can offer the same user experience with a considerably less powerful CPU (say a 0.4-GHz Geode instead of a 1.8-GHz Atom), saving money and power and making for a much more mobile device.
The sound of empire staggering back to its feet, getting ready to beat the everloving shit out of the mofo who knocked it down
nyer
Microsoft sales drop! ESR has it right, again.
http://money.cnn.com/2009/04/23/technology/microsoft_earnings/index.htm?postversion=2009042316
Companies certainly need an alternative to Microsoft products. MS has milked the market enough, and with SaaS options, Google, etc on the horizon, some of its monopoly is certainly going to be broken.
Talking of the empire falling, has anyone tried the SharePoint and Exchange alternative HyperOffice?
Phil,
Most of the drop is due to two factors:
1. The economy is failing, and PC sales in general are dropping.
2. Netbooks move Windows XP units, not more profitable Vista units.
Linux doesn’t even factor in. In a year the netbook space was completely pwned by Microsoft.