The news comes to us today that in 1Q 2010 Android phones outsold Apple’s iPhone by a significant 7%. As it said on the gunslinger’s gravestone, “I was expecting this, but not so soon.”
Business Week and the Wall Street Journal are on the story, but the most interesting version is the one they’re apparently both derived from at All Things Digital, because it includes a graph showing recent market-share trends that conveys a lot more information than the point-in-time numbers.
I’ve written before that I think Google has been running a long game aimed against the telecom carriers’ preferred strategy of customer lock-in, and executing on that game very well. Against the iPhone, its strategy has been a textbook example of what the economist Clayton Christensen called “disruption from below” in his classic The Innovator’s Dilemma. With the G-1, Google initially competed on price, winning customers who didn’t want to pay Apple/AT&T’s premium and were willing to trade away Apple’s perceived superiority in “user experience” for a better price. Just as importantly, Android offered a near-irresistible deal to the carriers: months, even years slashed off time-to-market for a state-of-the-art cellphone; a huge advantage in licensing costs; and the illusion (now disintegrating) that said carriers would be able to retain enough control of Android-powered devices to practice their habitual screw-the-customer tactics.
In Christensen’s model, a market being disrupted from below features two products, a sustainer and a disruptor, both improving over time but with the disruptor at a lower price point and lesser capabilities. Typically, the sustaining company will be focused on control of its customers and business partners to extract maximum margins; the disruptor, on the other hand, will be playing a ubiquity game, sacrificing margin to gain share. The sustaining company will gold-plate its product in order to chase high-end price-insensitive customers; the disruptor will seek out price-sensitive low-end customers.
The trap for the sustaining company/product is that as it chases high-end customers, it will ship marques that are more and more gold-plated – overdesigned, overengineered, and overpriced relative to what most customers actually want. The break in the market occurs when the disruptor reaches the level of capability that customers actually want to pay for, at which point they switch over en masse and the bottom falls out of the sustainer’s market share. If it’s very lucky, the sustainer holds onto an eroding fortress at the high end of the market for a while. A short while. Lots of historical examples of this pattern can be found at the Wikipedia article on disruptive innovation.
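To make the crossover mechanics concrete, here is a deliberately toy sketch of the model. The numbers are illustrative assumptions, not measurements from the phone market: the sustainer keeps improving past the point most customers care about, the disruptor climbs faster from a lower base, and the break comes in whatever period the disruptor first clears the “good enough” line.

```java
// Toy illustration of disruption from below. All numbers are made up;
// the point is the shape of the dynamic, not any real market data.
public class DisruptionSketch {
    public static void main(String[] args) {
        double sustainer  = 100.0; // incumbent's capability, already past what most buyers need
        double disruptor  = 40.0;  // entrant's capability, well below it
        double goodEnough = 90.0;  // capability the mass market actually pays for

        for (int quarter = 1; quarter <= 12; quarter++) {
            sustainer *= 1.05;     // incumbent keeps gold-plating
            disruptor *= 1.12;     // disruptor improves faster from a lower base
            String note = disruptor >= goodEnough
                    ? "  <-- disruptor is now good enough; price decides, share flips"
                    : "";
            System.out.printf("Q%-2d  sustainer=%6.1f  disruptor=%6.1f%s%n",
                    quarter, sustainer, disruptor, note);
        }
    }
}
```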
In the smartphone market I have been expecting a disruptive break that would body-slam Apple’s market share, but I expected it to be several quarters in the future, with a really fast drop-off when it happened. Instead, it looks like Apple took a bruising in 4Q 2009 and has failed to regain share in 1Q 2010 while Android sales continued to rocket. Android hammered market-leader Blackberry just as badly, a fact which has gotten far less play than it probably should because the trade press loves the drama of the Apple-vs.-Google catfight so much.
What actually seems to be going on here is that Android is successfully disrupting both Apple and Blackberry from below; together they’ve lost about 25% of market share, not enough to put Android on top but close enough that another quarter like the last will certainly do that. What does this tell us?
First, that Google’s slow-ball launch of Android and its attempt to advance on a broad front by co-opting a lot of vendors has worked spectacularly well, recent fallings-out with Sprint and Verizon over the Nexus One notwithstanding. The Motorola Droid and lower-end Android phones have to be selling like crazy for NPD’s numbers to look like they do, and I believe that because I’ve been tripping over new Droid users a lot lately. I predicted that the ubiquity game would beat the control game, so I get to do a bit of a fist-pump about this.
Second…several articles of conventional wisdom need to be re-examined. One is the attractive power of the app store. Apple’s supposed advantage here isn’t doing them any obvious good at all. How can we tell? Like this: Apple and Blackberry have experienced very similar drops in unit share, about 7% each in the last two quarters. One has an app store, the other doesn’t. [I was wrong about this; Blackberry has one, but Apple’s is universally conceded to be far better.] If that’s made any difference, it’s not one that’s showing in the numbers. The least hypothesis is that the app store doesn’t matter.
My best guess is that mass-market smartphone users are already overserved by even Android’s app store, let alone Apple’s allegedly richer one. Who could build any kind of mental model of what’s there without it being a full-time job? Hard-core geek that I am, I’d be one of those most rationally expected to dive in and turn my phone into an app-enabled superwidget…but the truth is, I’ve only ever grabbed a dozen or so, and most of those were games, and the only two I use regularly are games I could do without. I might as well be on a Blackberry.
Onboard apps just don’t look like they’re that important. We developers and maybe-someday developers need to face the disturbing possibility that, basically, users don’t care about them. That they’re buying (um) phones, not the Superwidget of the Future, and all the hype about apps was mostly just sustainer gold-plating.
Third: it doesn’t look offhand like anything about the iPhone is saving it from bleeding unit share right in parallel with Blackberry. This includes Apple’s vaunted superiority at UI and physical product design. All the arguments about how far Android is or is not behind the iPhone on this level begin to look to me like irrelevant, elaborate ways of missing the point – because nobody thinks Blackberry competes well in UI polish with Apple, and that lack doesn’t seem to make a damn bit of difference to the velocity at which they’re both losing share to Android.
Time out for a moment while I laugh and point at the Apple fanboys. In your fantasies, mass-market smartphone customers were just pullulating with eagerness to plight their troth to Apple’s unsurpassed UI like so many architecture students swooning over Barcelona chairs. You overlooked the fact that the same advantages never pulled Apple’s computer market share over 10% and they’ve been hard-pressed there as well; in January they dropped to fifth place in the U.S. Steve Jobs’s pitch to everybody’s inner art fag is beginning to look a little threadbare.
I suppose about the only good news for anti-Androiders is that WinMobile, the one contender many of them love to hate even more, got the living snot beaten out of it sooner than the iPhone and Blackberry did. It’s dropped about 10% unit share since 1Q 2009, only 3% or so of which was in the last two quarters. In fact, as I look at that graph, Android is the only smartphone contender to post a net rise in unit share since mid-2009!
Apple, Blackberry, WinMobile, Palm…that graph says Android is eating everybody’s lunch. And not slowly, either, but even more rapidly than I predicted it would. This is going to have knock-on effects. I will leave as an exercise for the reader the question of what it does to the future of the iPad.
As an Android Fanboi, I like this too much.
The next Windows Mobile does give me pause, though. They’re pushing it *hard*, it’s got a very well-known name behind it and opportunities for integration with *everyone’s* desktop that other vendors can only dream of.
Of course, one of the things I like most about Android is that it doesn’t _need_ desktop integration, but I’m not confident that it’s such a strong marketing message.
Hey Eric, a few mistakes regarding Blackberry here.
First, it has had an app store for quite a while now (it’s called App World).
Second, I think you’re overlooking the fact that these numbers exclude enterprise phones, the area where BB really shines, and where Android is going to struggle a bit more. As such, the iPhone *has* been hit harder than the Blackberry.
That being said, the rest of your argument still holds.
A reasonable analysis, though I think you should be wary of assuming that the Blackberry market-share drop and the iPhone market-share drop are attributable to the same forces. Blackberry has always been about mobile email; the UI has been hideous and, for most users I know who have had them foisted on them, practically impossible to use. The ease with which mobile email can now be combined with a good GUI and user experience is arguably what is doing Blackberry in. In some ways this may make the analysis much worse for Apple: if people and businesses are leaving Blackberry for the iPhone (which, given Exchange server compatibility, strong internal demand and operator sales-force focus, is a reasonable assumption), then Apple must be bleeding a lot of individual user sales to be dropping at the same rate.
I also don’t know that apps are as small a factor as you suggest. In the UK at least there is a lot of activity with apps, and those that I do use (train timetables, tube maps, weather, etc.) I use A LOT. The issue isn’t so much whether you can cognitively map the app store, but whether you believe it to contain an app for anything, in which case a quick search finds what you need. I think apps are important, and confidence in both the apps and the environment they come from more so. People are getting used to having computing power in their pocket, they are getting used to being able to use it on demand wherever they are, and they are getting used to a model that has a discrete app to focus enquiries, so that they don’t have to kludge about in a browser on a device whose form factor doesn’t support search and information acquisition the way a desktop does. However, you may be right about pre-purchase decisions, as it’s a new model that you have to get used to, rather than something that leaps out as a reason to buy.
I find it amazing people are making such a big deal about it. I think people need a reminder that this whole deal is basically comparing an OS to one phone. There are multiple Android phones across multiple carriers, whereas the iPhone OS is only available on what? The iPhone, which is on how many carriers? One…and for the most part, not a very good one. Also, let’s wait until the iPhone HD and iPhone OS 4 come out. I’m sure these numbers will be a bit more even then.
I’m a Maemo fanboi….hey!…over here!….don’t ignore me!
The value of the app store, to the typical end user, is not the actual size of the pool of apps, much-hyped at 10**N for meaningfully large values of N, but the exponentially increasing probability that the 5-10 apps you actually care about enough to make phone-purchase decisions behind them, plus the 15-20 that might be fun toys to play with now and then, will be found somewhere among that 10**N, and likely in multiple forms from multiple programmers, virtually ensuring that at least one of each will manage to not suck nearly as hard as the insanely crippled apps they’ve become accustomed to from cell phones in the past.
Not to mention that a phone-centered app store beats the ever-loving crap out of the crippled carrier-centered app stores we’re accustomed to. Blackberry owned the high end for so long because theirs were the only smart phones for which non-carrier-supplied apps were effectively available at all.
The iPhone’s most significant limitation, in this area, isn’t Apple but AT&T. Most users don’t care about Apple’s stranglehold as long as sufficient diversity exists within the wall. But I doubt there’s a cell phone user anywhere in America who hasn’t experienced the pain of carrier lock-in and resented it. Android kills carrier lock-in, which is why Android wins.
Now that Apple’s kicked the market open for PADDs, I think Android will step in pretty soon in a big way. There are already 7″ Android devices on eBay going for about $120, and while they don’t appear to be quite as fast or slick as the iPad, they look like they will cover similar uses for about a third of the price, and unlike the iPad they all have at least an SD card reader (some have USB ports, too) so it will be easier to transfer files to and from the device. There are multitouch versions starting to pop up in the Chinese electronics markets for about $200-$300. Some of the major computer makers are getting into the game too, so more iPad competitors will be coming in the next year or so. I expect they’ll aim for the same areas of use and capabilities as the iPad, with USB ports and without the closed Apple ecosystem, for about the same price (essentially, do more for the same money).
I figure Apple will make a few new fans with the iPad, just like they did with the iPod and iPhone, but beyond that and the initial surge of early adopters, not much will change. It’s just too expensive for what it does.
There’s an easy way to measure whether carrier lock-in is a big factor. Compare European sales to the US. The iPhone isn’t locked in here in Europe (or to nothing like the same degree – I can get iPhones on three out of the five UK carriers, for example), so if the sales are holding up against Android here, then it’s AT&T that’s dragging the iPhone down.
I’m sure that Apple’s determination to only have one iPhone is why there isn’t a CDMA iPhone, which means the only major carrier they could add is T-Mobile, who have next-to-no 3G coverage. I think Apple have their fingers crossed they can survive the next couple of years with AT&T until LTE happens, and then launch an LTE iPhone that will run on all four major carriers. Whether they can survive is another question altogether; Android might well eat their lunch with Sprint and Verizon’s help in the meantime.
Point of Information: My wife and I are long-time AT&T subscribers (though admittedly, we started out as Cingular customers). More importantly, it’s been quite a while since we upgraded our phones, and our ‘2-year commitments’ were over a long time ago. This morning I checked the AT&T Wireless site, and AT&T would love to upgrade our phones to iPhones — for $18. Probably a 3G (not a 3GS), but it indicates a certain level of desperation…
Unfortunately, the only Android phone that AT&T offers is the Motorola Backflip, about which the less said, the better.
It’s an interesting note, but any “great” platform is going to have explosive growth… to a point where it reaches saturation. Until we see where that point is, predictions of global conquest are premature (or, as the traders say, “the Trend Is Your Friend ‘Till the Bend At the End”). Basically, Android is the first smartphone platform to challenge Apple that has the unique combination of not sucking ass and having serious backing behind it. It will be interesting to see what HP does with Palm – WebOS is pretty nifty, but Palm was far too weak to push it on its own. HP might be able to push it out of its dying niche. They have the muscle, but I’m not sure if they have that sort of innovative drive anymore. They’ve been pretty hidebound lately.
In the long run, I’d expect both Android and iPhones to maintain significant market shares unless one just really screws things up badly. Blackberry will slowly bleed away (thankfully), as will the older “dumb phones.”
@Craig Trader: There is a Google Nexus One with a radio designed for AT&T. Un-subsidized, direct from Google.
Great! Eric, you’ve just convinced me to buy a(n) (overclockable) Nokia N900.
I’ve said it before elsewhere, but I’ll say it again: the cellphone market is and always was a very fickle place. Today’s hottest phones are tomorrow’s obsolete pieces of crap and I mean that quite literally. I said long ago when Apple first released the iPhone that Apple didn’t know the market they were competing in and even if they succeed in redefining the market now, someone else was going to come in and eat their lunch. And, at the time, I even thought it might be Google, who had yet to announce Android.
Google’s open source stack was made available to all comers who wished to toss their hat into Android’s ring, and I knew that was the model needed to make a cellphone platform work. Non-geek cellphone buyers view their phone as more of a fashion accessory — and they did that before Apple started catering to them. They want something that looks cool, sleek and new. They don’t care about apps so much. In fact, I’ll say the only things non-geeks really care about from a functionality standpoint are texting, cameras, instant messaging, e-mail and Web browsing. Oh yeah, and they want a phone that works well as a phone. If it doesn’t sound good on a phone call, it’s a paperweight. And in a fickle market like the cellphone business, being the only supplier of phones for your platform just ain’t going to work in the long run.
>The next Windows Mobile does give me pause, though. They’re pushing it *hard*, it’s got a very well-known name behind it and opportunities for integration with *everyone’s* desktop that other vendors can only dream of.
Er, so if these advantages are so powerful, why didn’t they keep Microsoft out of the basement last time around?
Never mind. I know the answer. The carriers’ strategists decided years ago that hell would freeze over before they’d let Microsoft do to phones what it did to the PC – commoditize the cofactors and cream off most of the profits. It’s not going to work this time, either.
>It will be interesting to see what HP does with Palm – WebOS is pretty nifty, but Palm was far too weak to push it on its own. HP might be able to push it out of its dying niche. They have the muscle, but I’m not sure if they have that sort of innovative drive anymore. They’ve been pretty hidebound lately.
Had a long conversation last weekend with a friend who’s a big fan of WebOS. His unhappy conclusion: it’s superior (Linux inside, with a Palm emulation layer on top), but third to market dooms it. I don’t know WebOS, but his analysis of the market dynamics seems sound. The only way HP could have a shot is if they emulated Android’s multiple-carrier/multiple-platform strategy, and there’s nothing in HP’s history that suggests they know how to think that way.
Is it not possible that the decline in iPhone sales is due to a rise in iPad sales? I know a lot of people that bought an iPhone with no intention of using it as a phone.
> Onboard apps just don’t look like they’re that important.
Strange. I think there must be more going on than meets the eye with the app store side of the equation.
Handheld games, for example. Maybe people don’t buy their phones for the games, yet, but they will. It’s too nice to have the phone double as an entertainment device. This may simply be a market education problem. There hasn’t been a real “killer app” for a smartphone yet, unless you count the apps that the manufacturer bundles with the phone, such as Google Maps, which aren’t tied to the app stores.
Anecdotally, my two-year-old can barely point-and-click a mouse. She can do it, but it’s a slow and painful process for every click. But she can play games on an iPhone like a pro and never stops begging everyone to borrow their touchscreen phones. There is an emerging market of iPhone games aimed at toddlers. When parents figure this out it will be as popular as the DVD TV screens in the back of minivans. Being able to entertain the kid for ten minutes when waiting in line somewhere is a killer app.
> Is it not possible that the decline in iPhone sales is due to a rise in iPad sales?
Another possibility is that all the likely iPhone buyers are waiting for the 4th-gen iPhone expected this summer. Although I wouldn’t expect the mass market to be that savvy.
>Another possibility is that all the likely iPhone buyers are waiting for the 4th-gen iPhone expected this summer.
Might be. But what could the 4G pull that would be a game-changer? Can’t be full multitasking or voice-to-text everywhere; Android already has those. Your theory implicitly concedes something very interesting: technologically, Apple is now chasing Android’s taillights rather than the other way around.
If I were Apple, my preferred move for the 4G would be to go aggressively multivendor; they’ve got to do something to avoid being swamped from all directions by a flood of cheap droids. Maybe if they shipped an unlocked quad-band device? But their U.S. exclusive with AT&T has yet to run out. I’m not really seeing any way out of that corner.
I love my Droid almost as much as I love my son, but also remember that the 3GS hardware is getting pretty old at this point. I would expect Apple to get a pretty serious bump when they release the next iPhone, especially if it really does have a better screen than anything else currently does. I wouldn’t be surprised if they overtake Android again. But if they don’t manage anything revolutionary with iPhone OS 4.0 (and it looks like all its changes are to bring it to par with competitors or just iterative improvements), Android phones will soon catch up hardware-wise and pretty soon we’ll be back where we are now, and then the gap will only be getting wider!
>especially if [the 4G] really does have a better screen than anything else currently does.
You know, I think you’ve identified the most plausible rabbit for Jobs to pull out of his hat. It would be completely consistent with Apple’s version of “ooh-look-at-the-shiny!” to do that.
Whether it will work is another question. As you say, it would leave the 4G exposed to a counter by the Nexus Two. But there’s a broader problem; it would be what I’ve called “sustainer gold-plating”. For things that are phones (rather than tablets or netbooks), I suspect we’ve already reached the point at which increases in display size and pixel density are going to yield diminishing value per increment of unit cost. When all is said and done the 4G is going to have to fit in one hand, and I think the high-end smartphones are already nudging that hard limit.
(Anecdotal evidence: my wife could have my old G-1 for free but doesn’t want it. One reason is that she prefers flip-phones to stick phones, but the other is that she has smaller hands and smaller pockets than I do and would find the G-1 a pain to carry.)
> Is it not possible that the decline in iPhone sales is due to a rise in iPad sales? I know a lot of people that bought an iPhone with no intention of using it as a phone.
Why would anyone do that rather than just buy an iPod Touch? I can’t imagine this being a significant factor in the market.
> at All Things Digital, because it includes a graph
Hmmmm. Looking at how that green Android line spikes up about 6 months ago, I had a flash. Wasn’t that around the time that all the cool Motorola Droid commercials first started running…?
We may simply be seeing the results of a good ad campaign timed to coincide with the natural tapering off of iPhone’s now-several-years-old wave of hype, that everyone has now become jaded to.
Email and Web were the “killer app” for the smartphone. That’s why the market jumped so quick from iPhone to Android. There’s no functionality difference between them in the mass market. The cellphone market is fickle. I know, I keep saying that…
“Hard-core geek that I am, I’d be one of those most rationally expected to dive in and turn my phone into an app-enabled superwidget…but the truth is, I’ve only ever grabbed a dozen or so, and most of those were games, and the only two I use regularly are games I could do without. I might as well be on a Blackberry.”
You seem to be suffering from iPad syndrome. You bought a fancy phone you didn’t need, and now you’re not sure what to do with it. It may not be too late to return it and get a Motorola RAZR instead, or perhaps a Microsoft Kin?
However I and others I know have dozens of apps in regular use on our i* devices. I agree with you that the number of apps in the appstore is not important, but the quality is. Last night, while sitting on the couch streaming netflix (to my blu-ray player, not an iPad) I used my iPhone to:
* Look up some actors on the TV show I was streaming
* Add a few expenses into an expense tracker
* Categorize some bills I’d recently scanned in to Evernote
* Figure out what one of the songs was from the show I was watching
* Plan a few things to do today in my ToDo manager
* Play a couple moves in Scrabble
* Call my mom on Skype (no cellphone reception of any kind in my apartment)
* Control my Sonos Zoneplayer
* Read some Bloomberg News
These all used apps from the appstore. Now there’s no reason that the Android appstore couldn’t have the same apps, and hopefully at some point in the near future it will. But I’ll challenge your assertion that the appstore isn’t important.
>You bought a fancy phone you didn’t need, and now you’re not sure what to do with it.
You are absolutely wrong. I knew exactly what I wanted from the G-1 and the Nexus, and I got it: a capable web browser that I am never without. I lusted for the Nexus upgrade for the extremely specific reason that the improved display made the browser look better. The apps don’t add a lot of value on top of that, at least not for me.
I think Morgan Greywolf is right. The killer apps were email and web. Apple’s going to be hard pressed to find another.
> But what could the 4G pull that would be a game-changer?
Like you I only see incremental iPhone improvements coming like a much better display, faster CPU, better camera, etc.
I’ve been expecting Apple to surprise the world sometime with a video phone, but I don’t really expect that to happen with the 4th-gen iPhone. I did hope it might happen with the iPad, and I still expect Apple to someday upgrade the iPad into a vidphone first, then later the iPhone would follow as the tech matured.
I agree Apple will eventually be forced to go multi-vendor and then to open up their platform just to compete with Android. I thought Apple would have a longer window on top than they now appear to have. Unless they really wow the world with something amazing soon, Google wins. Go go open source!
Morgan Greywolf Says:
> The cellphone market is fickle.
Good point. Apple’s App Store broke the lock the cell phone vendors had over third-party smartphone apps. It now looks like Android will sell enough units to make ports of popular iPhone apps commercially viable. I feared Apple’s App Store would prevent Android adoption, but I think Google is dodging that bullet, apparently due as ESR said to the good job Google has done selling Android to cell phone makers as a cheap alternative.
End result: people will abandon iPhone like crazy unless Apple drops their prices.
>End result: people will abandon iPhone like crazy unless Apple drops their prices.
What is this “will” abandon, kemo sabe? NPD’s numbers say it’s already happening. But let’s look at Apple’s option there. Suppose they do cut prices?
They take a double hit if they do that. One is to margins, but the other possibly more important effect is on their brand positioning. Apple gets an enormous amount of cachet from being perceived as a cool, elite choice. The last thing they can afford to be in is a grubby price war.
If this is true, one wonders if Google is barking up the wrong tree with Chrome OS/Chromium OS, rather than focusing on adapting Android to the tablet/iPad-alike form factor, leveraging existing Android apps the way Apple leveraged existing iPhone apps.
ESR says: Interesting point. Looking at the volume numbers, I have to say that ChromeOS’s future looks pretty bleak. If I were Google’s strategists, I’d give it the ax now before it costs them any more NRE.
@sam: “Last night, while sitting on the couch streaming netflix (to my blu-ray player, not an iPad) I used my iPhone to:”
Seems to me that if you managed to do all that, you weren’t actually paying much attention to the movie? You might be wasting money on netflix. I also like to expand upon a movie I just saw. That’s why I use wikipedia, tvtropes and others _after_ the movie.
And yes, this is in jest, please don’t take it seriously.
> Looking at the volume numbers, I have to say that ChromeOS’s future looks pretty bleak.
Chrome OS is Google’s way of hedging their bets.
From Business Week:
Google had to suspect that smartphones and tablets were likely to eat the PC market from the bottom, but they didn’t want to bet the company on it, and with their cash on hand why should they? From Google’s point of view, every Google OS installed in any form factor is a sale denied to Microsoft and/or Apple.
The Chrome OS is intended to create a truly portable app ecosystem based on open web standards, where “all apps are web apps”. That’s more than Android gives us. At the same time, Chrome OS kills the virus threat dead using the sandboxing technology already tested inside the Chrome web browser. If they kill Chrome OS in name, the important technology will still live on in Android, or vice versa.
>Chrome OS is Google’s way of hedging their bets.
That looks like good analysis to me. ChromeOS is like Android in that there’s Linux at the bottom, so there may be less to the difference than meets the eye. And your point about technology salvaged from scrapping one likely winding up in the other is worthy of note.
> I’ve been expecting Apple to surprise the world sometime with a video phone
The upcoming (this summer, I believe) EVO, a 3G/4G network Android phone from Sprint, has cameras on both sides. The one facing you is supposedly for video calls and the one facing away is a regular camera. WiMAX, the Sprint 4G network, is wicked fast, so it will support video. I don’t know if there is any software on it for video calling, though. Sprint is rolling out WiMAX in my area this summer, too. I think I know what my next phone will be. (Current is an HTC Touch running Windows Mobile.)
Yours,
Tom
>The upcoming (this summer, I believe) EVO, a 3G/4G network Android phone from Sprint, has cameras on both sides.
Wow. If the EVO launches before Apple can get a 4G with video out the door, I’m thinking Apple may be truly fucked. Beaten both on features and price.
On the other hand…there have been several past attempts to launch videophones that failed. There’s an outside chance consumers simply don’t want this capability.
@adriano:
Valid point :) Really the TV was on mostly to keep me company.
Regardless of the iPhone vs. Android debate, I do find it absolutely amazing that it is even possible to do this. A few years ago I spent a small fortune on the Nokia 6682 which despite being one of the higher-end ‘smartphones’ of the day was an absolutely horrible experience (and a stupid purchase on my part).
I often feel the need to defend Apple, not because its current actions are particularly defensible (many of them aren’t), but because they should rightly get most of the credit for ushering in the era of small devices that do not completely suck.
Being at the mercy of Apple has been a tremendously better experience than being at the mercy of Nokia/Verizon. My next phone will almost certainly be an Android (HTC Evo 4G, I suspect) and I am looking forward to having a comparable experience, but without being at the mercy of someone (well, except for my cellphone contract).
The Sprint EVO is really Sprint’s major push into the Android world. I expect to see a marketing campaign (not quite as big, but big) similar to the Verizon Droid campaign.
A landline videophone’s primary utility is to present you with the dilemma of either:
1. Transmitting live video of yourself in a state of either partial or total nudity entirely inappropriate to the relationship you have with your co-conversant, or
2. Always wearing business dress at home.
It is little surprise to me, anyway, that they never caught on.
A cellular videophone will be mostly used away from home, and thus in environments where one is already presumably dressed appropriately for interacting with the world face-to-face. It won’t face that problem. Whether it will still be unpopular for OTHER reasons is a question on which I’m disinclined to speculate.
@techtech: “The Chrome OS is intended to create a truly portable app ecosystem based on open web standards, where ‘all apps are web apps’.”
Where have I heard that before? Oh, right, Apple tried telling everyone that for the iPhone OS, and Palm tried telling everyone that for webOS. Both were eventually forced to relent and issue native-code SDKs.
I suspect that eventually Google will fold Chrome OS and Android together, creating some mixture of both that combines Chrome OS’s excellent Web app support with leveraging the existing Android app base. The resulting “Chromedroid” might be an ideal OS for phones, tablets, and netbooks all at once, and would have wider potential than iPhone OS.
> A cellular videophone will be mostly used away from home
Not really. Our cells are for us to talk to family, friends and co-workers. Our landline is for modems and telemarketers. Lots of people have ditched their landlines.
Yours,
Tom
Reported specs on the HTC EVO here: http://www.engadget.com/2010/03/23/htc-evo-4g-is-sprints-android-powered-knight-in-superphone-armo/
And, wow – although looking closely, it’s not far ahead of the Nexus One (clearly, I have not been keeping up with the cellphone wars!). 4.3″ 480×800 screen, HDMI out, 1 GHz CPU with 512MB RAM, 1 GB internal flash + 8 GB SD card, 8 MP rear camera and 1.3 MP front camera, 4G + wifi.
Never mind the iPhone; I’m starting to think this platform will disrupt the PC from below.
“Our landline is for modems and telemarketers. Lots of people have ditched their landlines.”
The killer app for a videophone is “honey, I found the cutest shirt for sale — do you like it?”
Right now, I have a colleague who’s pretty big into social networking (mainly to keep in touch with kids and grandkids), and I just recommended he try out the HTC Incredible, which does what he needs and is on his network (Verizon).
There are still people holding out for an iPhone on Verizon just to not get entangled with AT&T’s network. But for me, why wait? There are better alternatives in things like the Nexus One, HTC Incredible or Motorola Droid.
There have been phones with two cameras already. Hell, the Nokia N70 is ancient for a mobile phone and it has two cameras, one for video calls and one for taking pictures. The video-call functionality works trivially, just like an audio call, so long as you have the 3G bandwidth required. Still, people make next to no use of it; I don’t know if it’s due to the pricing of the data or some other factor.
I’ve been sticking with my trusty Nokia 6310i for seven years, simply because there was nothing out there I was sufficiently convinced of technically, and whose manufacturer had not recently roused my bile for one reason or another (yes I am that kind of customer – I will deny business for political reasons).
Two years ago I thought I might go for the Neo Freerunner, but in the end I declined. While surely a great toy, I was not convinced about the phone’s utility in daily life.
Just two weeks ago I made up my mind to buy an Android phone (the HTC Desire), and apart from the shortened battery life I can’t say I have any regrets! Getting used to the interface was a breeze, Internet works great, I will never be lost again thanks to GPS and Google Maps, the OS is Linux-based – and nobody tries to charge me a yearly fee for the SDK, as Apple does.
Obvious win is obvious.
I’ve tested this by using the phone (version 1.6) in the same spot, with the same sequence of apps, for the same period of time (about 6 hours), with and without a task killer, and with and without that battery-saver app that shuts down the radio and enables it only sporadically. To my surprise, neither improved my battery life at all, and the battery-saver app actually made it slightly worse.
The emerging pro-iPhone spin on these numbers in the press is that 3G/3GS sales are artificially depressed because iPhone customers are waiting on the summer release of the 4G.
This, of course, fails to explain why the unit shares of every other Android competitor are diving…
Why do you feel that there is a single reason explaining decline for each of Android competitors, and rise of Android?
>Why do you feel that there is a single reason explaining decline for each of Android competitors, and rise of Android?
Because we’re looking at unit-share declines for Apple and Blackberry in the same period that are identical to within statistical noise, and Android is the only competitor gaining share.
Yes, one could construct a complicated explanation for these facts. But the data sure look as though since mid-2009 Android has been hammering everyone else pretty impartially. This suggests that the driver for the change is something endogenous that Android is doing, rather than a bunch of unrelated exogenous drags on its competitors.
Shaving with Occam’s razor, essentially.
It’s interesting that iPhone and Blackberry are being hammered at about the same rate.
I would have expected that Blackberry would be in more trouble than iPhone, based on how much suckier Blackberry is (at least in my experience).
(Though perhaps my experience isn’t reflective of a fair fight, because my Blackberry is employer-provided and locked up tighter than the proverbial – with restrictive security policies, which means no extra apps).
>I would have expected that Blackberry would be in more trouble than iPhone, based on how much suckier Blackberry is (at least in my experience)
That was my expectation too. The fact that it isn’t breaking that way is what’s making me question some of the conventional wisdom about the importance of app stores and UI.
There is an alternative possibility. It’s possible that Blackberry would be bleeding share a lot faster if not for the inertia of corporate accounts. My wife has one issued by her law firm and raised this possibility. It would be an unlikely coincidence if the resulting stickiness canceled the market impact of the iPhone’s feature superiority to within statistical noise, but it’s possible.
Erbo Says:
For what it’s worth, Google seems to have succeeded in creating a viable web-application engine in their Chrome browser. It’s supposed to be amazing at running JavaScript programs, roughly as stable and as fast as native apps. It’s no wonder they thought “hey, let’s just replace the whole OS with the browser” after seeing how well Chrome works.
> and Android is the only competitor gaining share
Could this be a temporary result of a bunch of panicked cellular carriers (everyone but AT&T) running scared from Apple and as a result all desperately pushing Android phones at the same time? There has to be a large fraction of the public that, confused by cell phone technology in general, simply buys whichever shiny thing in their price range that is shoved in their face after they walk into a cell phone store.
>Could this be a temporary result of a bunch of panicked cellular carriers (everyone but AT&T) running scared from Apple and as a result all desperately pushing Android phones at the same time?
I’m sure “running scared from Apple” is a big part of what is going on. What I don’t buy is that this is temporary. Remember the context: the carriers are desperately afraid that somebody else in the communications value chain is going to seize control of their customers and commoditize them into low-margin bit-haulers. They ran scared from Microsoft when they saw it as the plausible threat, now they’re in the same mode with Apple. (Except for AT&T of course, which has done a deal with the devil.)
Their fear is completely rational; in fact, I think they can delay this fate but not avoid it, and I’m guessing the smarter carrier execs know that too – they’re just hoping they can delay the inevitable until it’s someone else’s problem. Android, because it’s open source, doesn’t obviously trap them at the wrong end of the power relationship with an OS vendor, so they’re running to it as the least bad option. The stupid execs think they can use it to protect themselves from commoditization; the smart ones see it as engineering the longest possible delay in that process.
This isn’t a transient set of pressures they’re feeling. It’s not going to change this quarter or the next. Google is exploiting that ruthlessly and well.
> I don’t know WebOS, but his analysis of the market dynamics seems sound.
I’ve been running WebOS for a while now. The UI is the best on the market and its programming capabilities (just look at webos-internals.org) are beyond Android and the iPhone currently. Its two problems are the complete lack of any effective marketing, and that the Pre should have been physically better designed. And having you say “I don’t know WebOS” is an obvious example of a total marketing failure on Palm’s part.
If HP wants to sell more devices they need to create a couple of non-phone Palm devices to compete against the DSi, iTouch, and iPad markets. The UI, 3D game capabilities, smooth merging of social networks, and web access of WebOS put it ahead of those systems. And in the information/game pad market people don’t wait for a “2 year upgrade”. They just spend money for entertainment purposes.
ESR says: Your assessment of WebOS strengths and weaknesses matches my previous informant’s exactly.
Agreed 100%. I love my Pre, but I do wish it had a more solid feel to it. The first-run Pres had several QC issues, including the screen cracking for no apparent reason. I also wish Palm hadn’t gotten clever with the camera and the “simulated depth-of-field” thing. Because of that, it doesn’t do macro shots well, which makes bar-code recognition very limited: I can get QR deCODEr to recognize and read a QR code about 60% of the time, and regular bar codes are unreadable. This shuts out several popular apps that most droid phones handle quite well. The internals (processor, etc.), on the other hand, seem to be pretty good quality and well put together.
The WebOS SDK was available even before the Pre was released (and the “Homebrew” community was active almost before release day, too). They started with a limited availability, and phased it into open release over the next few months. I believe this was the plan from the start, as it matches the way they phased in the complete feature set over the first few months. Remember, WebOS hasn’t even been out for a full year, yet. Palm spent the first few months making sure what they had out was stable, adding new features and making sure those were stable and didn’t break anything else, then doing it again with something else.
Of course, Palm has encouraged the Homebrewers rather than trying to shut them down like Apple does. They’ve even created an official unofficial app catalog and a way for developers to set up their own app stores on their websites. I just don’t see Apple ever doing something like that.
I use Chrome as my default browser. It’s a very nice browser.
I also use Excel and Google Docs Spreadsheet.
There is no way I would use Google Docs for the things I use Excel for.
Call me a curmudgeon, but I LIKE having local access to my files, and I LIKE having local backups of my files.
In watching long-term trends in CPU design, I also suspect that we’re about to see a sea-change transition. CPU manufacturers are at the point where throwing more horsepower into a CPU isn’t affecting the user experience that much. I’m a fairly high-end user – page layout, vector drawing tools, large bitmap editing.
Anything made since about 2006 or 2007 would cover my needs nicely. Honestly, the Intel Atom comes close to meeting my needs.
When talking about server rooms, power consumption is king. I’m wondering how soon it will be before Intel autocannibalizes its Xeon server line for blades running quad-core Atom CPUs that draw 1/10th the power.
> When talking about server rooms, power consumption is king. I’m wondering how soon it will be before Intel autocannibalizes its Xeon server line for blades running quad-core Atom CPUs that draw 1/10th the power.
Someone is making an artificial brain using a large number of cell phone CPUs running a neural net.
This trend is why I started learning Erlang. I’m attracted to the idea of easy multi-processing. Not sure whether Erlang provides that in practice for a wide variety of practical programs, although it should do fine at neural nets. There are a *lot* of coders I respect who comment here and I’d be interested in their thoughts.
I just flashed on SETI@Cell Phone or even SETI@Neural Net@Cell Phone. Wonder how many people would want to run some app like that on their cell phone – especially while it’s on the charger overnight.
Yours,
Tom
> What is this “will” abandon, kemo sabe? NPD’s numbers say it’s already happening.
Wait! Are there any numbers that show iPhone sales have declined? At all? I mean in terms of number of units sold rather than “market share”. I don’t see how people are jumping from “Apple’s share (in units) of US sales declined” to “Apple is in trouble.” I’d kind of like to see some indication that Apple’s unit sales or profitability have been impacted in any way at all – much less negatively – before we start dancing on that grave.
That chart showing market-share trends would be entirely compatible with, for instance, the theory that all those Droid ads *grew the smartphone market*. Whose nutty theory says Apple has to lose for Android to win?
>Wait! Are there any numbers that show iPhone sales have declined?
Apple says it had record sales for the quarter.
>Whose nutty theory says Apple has to lose for Android to win?
New-sales share predicts future market share. This is hardly nutty.
Yeah what Glen Raphael said.
In the most recent quarter iPhone sales were up 100% over the previous quarter.
Slice and dice the market-share numbers all you like. The plain fact is that iPhones are selling like hot cakes. Still.
Ken Burnside says:
>>> There is no way I would use Google Docs for the things I use Excel for.
>>> Call me a curmudgeon, but I LIKE having local access to my files, and I LIKE having local backups of my files.
Well, Ken, if you use Google Docs you are no doubt aware that you can download your doc to your local machine. That way, you have a copy in the cloud, one on your hard drive, and you could put a copy on a thumb drive as well.
Isn’t that the best of all possible worlds?
> New-sales share predicts future market share.
I’m not sure that’s true when you’re talking about new sales right after a new product introduction. But in any case the right response to that is probably “So what?”
Apple’s goal isn’t to ship more phones than everybody else, it is to make money selling really nice phones. The iPhone’s initial announced goal when it was introduced was to win a 1% share of the smartphone market, wasn’t it? And then there’s this:
http://www.financialexpress.com/news/55-mn-smartphones-sold-in-q1/614490/
Quote: “The volume of smartphone sales continued their climb in Q1, 2010, registering a growth of 67 per cent, a study by Canalys said. […] Apple […] has made share gains over the past year, climbing from 11 per cent a year ago to 16 per cent in Q1, 2010.”
If the US market grew by anything like 67%, Apple’s smaller “share” there likely still represents an increase in sales volume.
In short: the latest results are good news for Android fans but aren’t bad news for Apple fans. Apple’s still doing just fine. This town is still plenty big enough for the both of them. :-)
But doesn’t Android contribute to the death of Linux?
http://jakob.engbloms.se/archives/1131?owa_from=feed&owa_sid=
To port an application written in C/C++, such as Firefox, to the Android platform, the app has to be modified to work as a backend to the Java interface. Ars Technica has a write-up on how Firefox was brought to Android through just such a modification. Note that this does mean that ports are not as straightforward as they would be to other platforms with a directly accessible C API. Interestingly, the Android approach essentially inverts the traditional relationship between C and other languages: it used to be common for other languages (like Java) to reach the platform through a C adapter layer, whereas here the C code has to reach the platform through a Java layer.
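For anyone who hasn’t seen that pattern, here is a minimal sketch of the Java-side adapter such a port needs. The class and library names are hypothetical, chosen only to show the shape; a real port like Firefox is considerably more involved.

```java
// Hypothetical Java adapter for a C/C++ app ported to Android.
// The original C/C++ code is rebuilt as a shared library (libportedapp.so)
// exposing JNI entry points that match these declarations.
public class PortedAppBridge {
    static {
        System.loadLibrary("portedapp"); // loads libportedapp.so built with the NDK
    }

    // Implemented on the C side as:
    //   JNIEXPORT jlong JNICALL
    //   Java_PortedAppBridge_createEngine(JNIEnv *env, jobject thiz);
    public native long createEngine();

    // The app no longer owns its own main loop; rendering, input and
    // shutdown are driven from the Java side through calls like these.
    public native void renderFrame(long engineHandle);
    public native void destroyEngine(long engineHandle);
}
```

That is the inversion in a nutshell: instead of Java reaching down into C to get at the platform, the C code sits behind a Java class and waits to be called.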
>But doesn’t Android contribute to the death of Linux?
It’s Linux inside, so that seems unlikely.
It’s a very thin Linux, Eric.
Note that there is no gpsd. Look at the gyrations needed to put Firefox on Android.
If the kernel survives as the Google fork, is that good enough?
>If the kernel survives as the Google fork, is that good enough?
Don’t get the vapors. No matter how weird it looks out of the box (and I’m not only talking Android), if it’s all open source, somebody will backport a C compiler to it. And then it’s off to the races; Unix userland will follow, even if it wasn’t there out of the box.
Kinda strange that everyone is pissed at Apple for insisting that apps are written in C, C++, or Objective-C, yet nobody is pissed that you can’t write apps in C or C++ on Android.
I just came across this post http://jakob.engbloms.se/archives/1131?owa_from=feed&owa_sid= which claims that C as a fundamental programming language is being deprecated on newer machines, esp smart phones, in favor of other languages. In particular it claims that
Also from that article, since you like python:
Hell, they’re not even going to teach C after next year over here in Blighty.
Incidentally, Eric, are you aware of Meego, the Intel/Nokia Linux stack for smartphones and similar?
Unlike Android, which AFAICT is a whole new Java userspace on a Linux kernel, Meego is basically a new desktop environment on an otherwise ordinary Debian-based Linux distribution. The Nokia N900, in particular, is less a Linux-based cellphone than a Linux workstation that happens to have a cellphone form factor.
I’m curious whether you think this can get any traction – I’m inclined to think Android has stolen the oxygen, but don’t really know.
ESR says: I’m also inclined to think Android has stolen its oxygen. But it’s early days yet.
Um, that’s not a new argument. In fact, it’s one esr himself has made. I think he made more or less the same argument elsewhere more recently, but I can’t seem to find it at the moment.
@esr: Hey, btw, what ever happened to C++ Considered Harmful, anyway? I like the concept a lot and I thought about writing a similar argument 3-4 years ago, but never got around to it. C++ was supposed to save coding time and reduce lines of code, but if you’ve ever looked at both the GTK and Qt libraries, Qt is much, much bigger due, in large part, to being written in C++, while GTK is written in C.
ESR says: What happened is my collaborator lives 1500 miles from here. It’s unfinished.
> ESR says: What happened is my collaborator lives 1500 miles from here. It’s unfinished.
Good God, man. Rob only lives in Austin. It’s a mere single timezone away.
What’s more likely is that you and Rob are so consistently wrong when you write together (64-bit article, anyone?) that you’ve abandoned the C++ article, too.
Take this, for example:
http://landley.net/next/28-08-2007.html which is titled “If Apple can’t scale production impossibly fast, the race for the new 64-bit platform standard looks like it’s down to Vista-64 and Linux.”
which gets so many things almost impossibly wrong that at first I thought it had to be a joke.
>What’s more likely is that you and Rob are so consistently wrong when you write together (64-bit article, anyone?) that you’ve abandoned the C++ article, too.
You are beginning to troll so consistently that I’m finding you a waste of my time. And you’re not even very good at trolling, because given Apple’s market share then and now the quote is in fact correct. Mac OS X isn’t setting any standards in anything other than how to be a gold-plated niche product.
I’m tired of your nonsense. If you don’t either up the level of your critical game or learn how to keep a civil tongue in your head I’ll ban you.
Things seem to be going swimmingly for Apple. Note that this is from Rob’s site:
http://landley.net/next/uni.jpg
Context: http://landley.net/next/24-09-2007.html
Also note that AAPL closed at 257.68 today, a full $100 higher than the pointed to article’s quote of $157.41 from Aug 24, 2007 (21 months).
> I’m tired of your nonsense. If you don’t either up the level of your critical game or learn how to keep a civil tongue in your head I’ll ban you.
Go ahead, I’ve been banned before. Ask Jay.
Snarky! Burn ’em, esr.
Otherwise, Fortune magazine named Apple the most admired company in the United States in 2008, and in the world in 2008, 2009, and 2010.
So someone likes them. It’s not like Apple buys a lot of ads in Fortune.
In reference to where I said, “>If the kernel survives as the Google fork, is that good enough?”
and you responded with “Don’t get the vapors” (I didn’t fart), note that I’m not the only one who thinks there is a problem (and apparently you’re on the other side of that concern.)
To be possibly more clear, I think there is a lot of shakedown to come, along these colliding vectors here: http://lwn.net/Articles/372419/
Greg K-H has the last word currently, but it’s months old now. http://www.kroah.com/log/linux/android-kernel-problems.html?seemore=y
> given Apple’s market share then and now the quote is in fact correct.
If you see market share as success, sure. (Damnit, where is Russ Nelson when you need him?)
But Sun showed us all what happens when you sacrifice profits to gain market share.
Here is Larry Ellison, on Sun’s previous management
Pull quote:
or should we look at the stock performance of the two companies that ‘top’ Apple in market share (in the PC space)?
http://www.google.com/finance?q=DELL+HP+AAPL
or perhaps compare the market cap of these three:
HP: $4B
DELL: $30B
Apple: $234B
I’m seeing order of magnitude differences as we move from #1 (HP) to #2 (Dell) to #4 (Apple).
*This* is one example of “things almost impossibly wrong”.
>If you see market share as success, sure.
This is you changing the subject, troll. Landley and I never disputed that Apple is profitable and has huge market cap. The issue we were addressing is who gets to control the next generation of personal computing. For that, market share trends are in fact what matter. Apple is not going to win that fight. Apple is not even trying to win that fight. They’ve taken “Computer” out of their corporate name and are now focused on higher-margin cellphones and personal electronics, just as we predicted.
You are making it clearer and clearer that you will say anything, no matter how shifty or irrelevant, to throw feces at me. I will no longer tolerate this, because you have also demonstrated almost conclusively that you are not bright enough to be interesting. You may be used to being the smartest guy in the room in whatever little pond you hopped out of, but by my standards, your grasp of facts and logic is indifferent and your critical-thinking skills are poor. Most damningly, you have not on any occasion ever displayed an insight that was novel to me. Your license to be rude to me is therefore revoked; the next time you troll me I will ban you.
You may regain the privilege of being rude to me on my blog by teaching me something, or exhibiting some thought that is actually novel or interesting. Surprise me; demonstrate that you are not a waste of my time. I would actually be pleased if you managed this. If you can’t hack that, mind your fucking manners or I will boot you out.
Actually, that image is an excellent reason to bet against Apple.
Is there anything more trendy, status-conscious and fickle than the 21 and under demographic? I suppose one could have predicted that acid-washed jeans would become a staple of our society way back when as well.
So in 5 years they’ll be going, why am I overpaying for this crap? Whom am I trying to impress?
How is that image a reason to bet against Apple exactly?
Yes that under 21 demo is very fickle, and Apple is one of the few that consistently can appeal to them. They sure as hell aren’t getting a Dell dude.
Focusing overmuch on market share is a mistaken approach to evaluating Apple’s success. Profits matter too, people. You can easily go broke trying to buy market share. Apple is shrewd enough to refuse to play this game. They insist on getting good margins on their stuff, and usually get it.
And a word in defense of Mac OS X: I submit they are doing more than anyone to get a Unix on the desktop. Everyone knows it contains FreeBSD, and more people are using it than all the Linux distros combined. In a way Apple has seen the value of open source and put it to work.
from the article:
>>Apple has 6.4% market share in the US PC market. That’s not enough.
Really? Ok. That was in 2007. Apple share is now about 7-8%. How much is enough?
>> This means that for Apple to achieve 50% market share by the end of 2008, it has to double its production, then double it again, then double it again. In the next year. And that’s just in the US: worldwide they need yet another doubling in that time period. The Macintosh’s US market share is less than a quarter of the size of the whitebox market, and 1/5 the size of Dell.
Who said Apple was ever going to achieve that much market? What kind of nut would think that was a realistic goal? And financially Apple is blowing Dell out of the water. Margins matter.
>> Worse, computers cost money to produce before you can sell them, and Apple hasn’t got the money to buy enough parts to make all those computers. To purchase the necessary inventory, Apple would have to borrow tens of billions of dollars and gamble it could sell them before the loans came due. Another way to state Moore’s Law is that computer hardware depreciates 50% every 18 months, and the secret of Dell’s success has been rapid inventory turnover. The last thing Apple wants to do is own lots of computer parts for a long time.
Apple now has about $40 billion in cash and securities, and no long term debt. Apple does not need to borrow a dime. And I believe Apple does not even make the machines, but has another company make them under license. That takes care of the inventory problem.
Honestly, that article was pretty bone-headed. Apple is doing just fine. And if you want to know who controls “the next generation of personal computing,” it looks like Microsoft on desktops. Smartphones are a war; I’m not sure anyone can be said to control it.
>Really? Ok. That was in 2007. Apple share is now about 7-8%. How much is enough?
Rob Landley thinks the positive network effects start to be a threat to unseat the market leader at about 30%, but that number is derived from observing a couple of historical overturns and we both know it’s pretty fuzzy. It’s certainly well north of 8%, though.
“Yes that under 21 demo is very fickle, and Apple is one of the few that consistently can appeal to them. They sure as hell aren’t getting a Dell dude. ”
Yes, but what happens when the higher education bubble inevitably bursts?
>Yes, but what happens when the higher education bubble inevitably bursts?
/me blinks
That’s…actually an interesting question. Wow. I think Apple would survive OK if the bubble burst tomorrow; there’s a lot of money in iPods, and that market has wider spread than college students and art-house trendoids. But you’re right, their laptop sales would take a dive.
Here’s what I see happening.
Apple is building gold plated computers; the value proposition on Mac hardware regularly churns through a three to four year cycle. Right now, we’re right about year two on the cycle.
Basically, Apple comes out with Something Cool for the Mac hardware market about every 3-4 years. They charge a premium for it, and make VERY good margins. They do hardware refreshes on under 10 models of computers every year, and are looking for the next Something Cool.
When they HAVE the ‘Something Cool’, most of the press in the industry alternates between gushing over the Something Cool and pointing out that you’re being forced to pay for features you’re not going to be using.
As the cycle progresses, the ‘Macintosh Tax’ compared to COMPARABLE hardware and features tends to decline; it never quite reaches parity. Apple is staying out of the commodity hardware market as much as they can.
Most of the profits from the Macintosh division remain within the Macintosh division, doing software updates and paying engineers to think like marketers, and to be able to leap on the next Something New.
Note that this is a model for making a consistently profitable computer division. It is not a model to break Microsoft’s 90%+ market share, and I honestly don’t think Apple wants to even try.
Why should they? Microsoft is the best marketing department Apple never had to pay for. So long as A) Microsoft is growing the total number of computers sold in this space and B) Apple has an 8-10% share of this growing market due to people ‘trading up’, the Macintosh division remains profitable while still appealing to its core demographic.
They have to do the occasional anti-Microsoft jab to keep their partisans happy, and they have to keep mixing the Kool Aid…but Apple would MUCH rather be the Mercedes of computers than the General Motors trying to sell everything from industrial farm machinery to the Geo Metro.
Even more to the point, Macintosh sales, while up in units sold, market share and revenue (and even in profit margin if some of my sources are correct), are a declining portion of the company’s overall revenue.
I could EASILY see the Macintosh unit being spun off after Jobs departs the company (either voluntarily or in a pine box). Hell, if Linux is sufficiently disruptive from underneath in this space (an argument I find less and less likely to come about, because I think the ‘front’ has shifted), I could see it happening under Jobs’ watch. While Jobs has a lot of personal fondness for the Macintosh…the iPhone, iPad and iTunes media sales are going to dwarf the ‘division that’s kept around because Steve is sentimental’ once it’s no longer sustainable to sell gold-plated computers.
Also, Rob Landley’s reformulation of Moore’s law – “Computer hardware depreciates at 50% per year” shows that he doesn’t understand how depreciation is calculated, and he really doesn’t understand commodity hardware economics.
It’s more like a depreciation of 200% in the first year (the hardware is now worth 33% of what you paid for it) and about 900% in the second year (the hardware is now worth about 3.5% of what you paid for it.)
It is amazingly difficult to convince small businesses that they are spending more on electricity for a computer for a year than that computer’s value in depreciated dollars. In part this is because of a tendency in small businesses to think of that computer as ‘paid for’ and no longer a cost item.
One of the sea changes in hardware right now is in how the costs of running a data center rank. In order of expense:
1) People to run it. And this is changing rapidly: most data centers are now running more computers with fewer people – even Windows data centers aren’t immune to this. Expect improved automation and smart virtualization to slash data center manpower requirements by 5% per year for the next five years.
2) Electricity to power it and keep it cool. This is on the verge of changing as ARM and Intel fight it out for the data center. (This, by the way, is my prediction for what will kill the Windows data center. I don’t think Intel can get the Atom’s ‘data center successor’ value proposition competitive with the ARM CPU, and without Intel, the Windows data center rapidly turns into virtual Windows licenses in Hypervisor or VMWare).
The electrical costs for a data center are likely to surpass the manpower costs by the end of 2011 for most tier 1 through tier 3 data centers.
3) Hardware. Even two years ago, this was jockeying with manpower for top of the list. We’re already seeing commodity hardware competing on “I draw less juice and can do everything you reasonably need” in the laptop/netbook and desktop computer space. It’s coming in the data center, and it’s going to be a HUGE change there.
I don’t know that I’d call jake particularly rude – perhaps combative, but that’s about it, and not unusual for the discourse in this blog. I’m not sure precisely what nerve he’s touched with esr; perhaps he could expound. jake, thanks for that Ellison link, I always enjoy Larry (and used to enjoy Scott a lot too :) ), who’s a smart guy most of the time but dead wrong about the consolidation he sees sweeping software. Also, you got the HP stock ticker wrong, which is why you got that ridiculous $4 billion market valuation: it’s HPQ and their valuation is $110 billion, almost half of Apple’s ridiculous overvaluation.
Darren, good point about BSD code getting much farther on the desktop through Apple’s hybrid model than linux ever has. No doubt profits are important but I think the Apple-backers are losing the trees for the forest with those financial numbers. If you look at the game Apple is playing, aiming for the luxury niche with a very closed platform, that strategy has always lost to more open strategies that try to get more of the software market, at least in terms of ubiquity. Perhaps Apple could someday go all in with its huge cashpile and embrace the more open, mass-market route, but Microsoft has plenty of cash to fight that war. It certainly won’t happen while Jobs is at the helm though.
Go ahead, I’ve been banned before. Ask Jay.
Did Jake just insinuate that he’s the same person as Marshal?
Google announced today that it’s ending direct-to-consumer sales of Android phones. Do you think this affects your thesis at all?
>Google announced today that it’s ending direct-to-consumer sales of Android phones. Do you think this affects your thesis at all?
Probably not. As long as unlocked Android phones are available to consumers through any channel they have the same disruptive effects on the market; whether it’s a Google web store or J. Random Retailer doesn’t seem to me to matter a lot. Actually, if this move leads to competition that drives down the price for the higher-end units it may accelerate the market evolution I’m expecting.
To depreciate by 200%, something would have to be worth less than zero;
If the hardware is worth 33% of what you paid for it, that’s 67% depreciation.
In the second year, dropping to 3.5% is a loss of 29.5% of the original value or 89% of its value at the end of the first year. Neither of those numbers comes close to 900%.
Monster – that’s not how I was taught to calculate standard depreciation, nor is it how anyone I’ve ever worked with calculates standard depreciation.
Mind you, it’s the way someone with a general mathematics background, rather than business finance or accounting background, would express depreciation.
However, in non-jargon terms:
Your typical business computer is worth roughly 33% of its purchase price after a year.
It’s worth roughly 3.5-4% of its purchase price two years after you buy it.
It’s worth about 0.05% of its purchase price after three years.
In many small businesses, the electricity used to power a two year old computer costs them more than the actual accounting value of the machine.
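For anyone who wants to sanity-check the percentages being argued over here, a minimal C sketch of the arithmetic (the retention figures – 33% of purchase price after year one, roughly 3.5% after year two – are just the illustrative numbers from the comments above, not accounting standards):

#include <stdio.h>

/* Depreciation expressed as a fraction of remaining value can never
 * exceed 100%.  Assumed retention: 33% of purchase price after year 1,
 * then about 10.5% of the remaining value in year 2 (~3.5% of the
 * original price). */
int main(void)
{
    double value = 1000.0;                 /* assumed purchase price */
    double retained[] = { 0.33, 0.105 };   /* fraction kept each year */

    for (int year = 0; year < 2; year++) {
        double next = value * retained[year];
        printf("year %d: $%.2f -> $%.2f (%.1f%% depreciation that year)\n",
               year + 1, value, next, 100.0 * (1.0 - retained[year]));
        value = next;
    }
    return 0;
}

Which agrees with the correction upthread: 67% in the first year and about 89% in the second, never anything like 200% or 900%.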
esr says:
>>> Rob Landley thinks the positive network effects start to be a threat to unseat the market leader at about 30%, but that number is derived from observing a couple of historical overturns and we both know it’s pretty fuzzy. It’s certainly well north of 8%, though.
Fair enough. But Apple is not trying to unseat the market leader, they are trying to occupy a nice, profitable little niche for themselves. It is a pretty good business, and if they could get market share up to 10% it would be that much better. But there is 0% chance they are going to unseat Windows. They know this, and are not trying to achieve such a quixotic goal. Microsoft does not have to lose for Apple to win.
Also, about that higher education bubble: that is an interesting question. There have been a lot of articles over at National Review and other places about how much higher education has been oversold. When do people think this education bubble might burst, and how might it happen? I think there is something to this; when a bunch of young people and their parents figure out they might be better off going and learning plumbing or electrical work rather than getting a useless degree, watch out.
And I don’t think Jake is being a troll. He is being combative, sure, but he is making some valid points and backing them up with links. He is not merely hurling around content free abuse.
>Fair enough. But Apple is not trying to unseat the market leader, they are trying to occupy a nice, profitable little niche for themselves.
You, too, are changing the subject, but at least you’re not doing it with obvious feces-flinging intent. So, once again: the topic is not whether Apple is going to remain a viable company or make lots of money. The issue is whether it will get to set desktop standards that everyone else has willy-nilly to follow. And the answer is still “no”, not until its market share at least triples.
>The fundamental issue here is that you misunderstand Apple’s goal and thus are talking about something irrelevant (Apple setting desktop/phone standards that others have to follow) because Apple is fundamentally uninterested in doing so.
Damn, you people are thick today. It was Rob Landley and I who were pointing out that Apple isn’t interested.
Ajay says:
>>No doubt profits are important but I think the Apple-backers are losing the trees for the forest with those financial numbers. If you look at the game Apple is playing, aiming for the luxury niche with a very closed platform, that strategy has always lost to more open strategies that try to get more of the software market, at least in terms of ubiquity. Perhaps Apple could someday go all in with its huge cashpile and embrace the more open, mass-market route, but Microsoft has plenty of cash to fight that war. It certainly won’t happen while Jobs is at the helm though.
So far Apple seems like they can co-exist just fine in their niche alongside the dominant standard. If that strategy has “always lost”, then why is Apple still around? I mean they have been selling Macs since 1984. Shouldn’t they have bit the dust by now?
As an Apple investor I can tell you: if anything, Apple is now undervalued. The most recent quarter they reported earnings that were close to 90% greater than the same quarter a year ago. Now a fair price for a growth stock, as measured by the PE ratio, would be at least half of that growth. So that would be about 40-45x earnings. And I just checked and it is selling for 22x earnings.
I have been investing in stocks for 20 years, and I am telling everyone on this thread: Apple at this price is cheap. This is a rare chance to make a lot of money. Mark my words folks. I am betting this next quarter will be a blowout. It will be the first quarter that will include IPad sales.
Android has problems, though. First, it’s written in Java, not Objective-C. Interpreted, not compiled. Second, because it’s multitasking, background apps get to run even though you’re trying to interact with the device, so the interaction is jittery. You can see this best when trying to scroll. Third, because there are multiple ways to do things, it’s hard to habituate. Fourth, the UI is missing affordances which lead you to do the wrong thing. For example, the music player will happily let you click on an artist and display the song list below the bottom of the screen. Or if they have a lot of songs, it will take up the entire screen, giving you the idea that in order to restore the previous screen, you have to hit the Back button. Unfortunately, that exits the music player. The Phone application encourages pocket calls because when you hang up, it throws you into the call log. At least on a Moto Droid, simply holding the phone by its edges is sufficient to press a “Redial” button. The back button use is inconsistent. Sometimes it means “return to the previous application”, but sometimes it means “return to the previous web page”.
Then there are other annoyances … like always pressing the volume control buttons when picking it up, or the My Tracks application randomly stopping GPS tracking, or not being able to edit searches in the Market if you’ve made a typo, or the email having no “oh fuck” facility; if you delete something, you have to pull it out of the Trash.
Oh, and every change in a bit requires battery power, so the fewer of those, the better. Smart phones will bring back efficient coding, because you can’t have inefficient code and long battery lifetime.
The fundamental issue here is that you misunderstand Apple’s goal and thus are talking about something irrelevant (Apple setting desktop/phone standards that others have to follow) because Apple is fundamentally uninterested in doing so.
Apple doesn’t actually want to own the market. They don’t want to be the dominant player. Not in phones, not in computers. Apple’s functional dominance of portable media players is an aberration due to having the combination of the best media player app of the major vendors and the best player UI of the major vendors. But that market is going away anyways.
They want to be the dominant prestige player. And they’ve achieved that in both the phone and computer markets. Android overall isn’t their competition (although the high-end Android models are to some extent); in fact Apple needs Android to be successful in order for Apple to achieve their goal of dominating the prestige niche, which is where the margins are nice and fat. Without Android (or some other competitor) to fill the rest of the market, the carriers would be pushing Apple hard to introduce lower-end, lower-margin products.
Apple spent a long time trying to compete directly in the overall market during the 1990s. It was an outright disaster for the company. When Jobs came back he changed the entire business model to one of niche dominance to ensure fat and safe margins. It worked spectacularly.
Note this is a mistake that many people make, especially those in the Windows and Open Source worlds, where market share and dominance are seen as a major, if not overriding, goal.
>Damn, you people are thick. It was Rob Landley and I who were pointing out that Apple isn’t interested.
No, you and Rob asserted that Apple couldn’t possibly be a factor in the 64-bit transition because they couldn’t grow fast enough. From what I can tell, neither of you ever asserted that Apple lacked desire.
The problem, from where I stand, is that you are overly willing to dismiss Apple’s market cap (only Microsoft and Exxon-Mobil are larger). Apple’s market cap is clearly an indication that “the market” sees AAPL as a growth stock – that its current record earnings will continue (if not accelerate).
Apple’s P/E is what, 25 or 26? That’s another HUGE indicator that the market sees Apple as continuing to grow.
How this affects Apple’s presence in the 64-bit transition is more complicated:
1) Many of us think 64-bit is far less of an issue than the transition to multicore.
2) Apple is pushing the clang compiler project very hard. It’s already showing up gcc, and is knocking heads with the best compiler tech from Intel and Microsoft. Unless Linux turns away from gcc soon, it risks being left behind (by the looks of it, FreeBSD has already turned away from gcc, and OpenBSD has intentions of doing so). OpenBSD, NetBSD and … Linux are, however, hampered by (wait for it) the larger number of architectures they support. Clang isn’t going to run on PPC or MIPS without significant effort.
While clang is an open source project, I don’t have to explain to you how much influence Apple will have on its direction as the project’s main leader & sponsor.
3) Russ Nelson makes some great points. Apple’s phone runs ‘C’. Android runs Java.
Watts / MIPS is huge when you’re running on batteries. Android has to run more code to accomplish the same thing. Watts/bit is also critical, but really more of a radio thing, and here the iPhone and Android are equally matched.
4) Apple is currently attracting thousands of programmers to Objective-C and Cocoa. The success of these on iPhone OS will play back into the Mac space.
Android and Linux seem to be targeted at running Java.
Microsoft seems to be targeting CLR.
Apple is targeting C.
Which of these is the most natural for a change to 64-bit?
How about multicore?
>No, you and Rob asserted that Apple couldn’t possibly be a factor in the 64-bit transition becUse they couldn’t grow fast enough. From what I can tell, neither of you ever asserted that Apple lacked desire.
That’s been my contention ever since Apple dropped “computer” out of their name in, what, late 2007? It’s in one of the two papers Rob and I did.
I tend to agree that Clang and LLVM in general are going to be a very big deal.
> Apple’s P/E is what, 25 or 26?
Apple’s P/E is only 21.54, fanboi. 25 or 26 would be indicative of a bubble.
And P/E is commonly interpreted as, “number of years of earnings to pay back purchase price”. That is, Apple investors are paying $21.54 for each $1 of earnings.
Microsoft’s P/E is 14.95.
RIM’s is 15.33.
Oracle’s is 21.25.
Google’s is 23.00.
Nokia’s is 26.61.
Motorola is 63.80.
RedHat’s is 67.43.
esr> “The news comes to us today that in 1Q 2010 Android phones outsold Apple’s iPhone by a significant 7%.”
Suggestions to the contrary, Android sales are still well below the iPhone.
According to IDC, Apple took 16.1% share of smartphones in the quarter, while HTC and Motorola (the only Android makers, who also sell other non-Android smartphones) amassed a combined global share of 9%. Android’s total share is less than half that of second place RIM’s BlackBerry sales and less than a quarter of the smartphones sold by Nokia, but Android is getting a lot of press to suggest that it is taking over the market, at least in the US.
Android is doing well in the US for the same reason RIM’s BlackBerry sales have kept ahead of the iPhone, despite failing to best or even match Apple’s platform in terms of technology: most of those phones are being given away for free. That’s an easy way to claim market share, but not really a way to actually create sustainable growth. And if you look at RIM’s global sales in the last quarter, they’re only up 45% year over year compared to Apple’s 131% growth, despite all of RIM’s promotional free giveaways contrasted with Apple’s actual sales to customers.
The NPD’s numbers Eric quoted aren’t so much a story of Android competing against iPhone as much as Verizon trying to keep up with AT&T’s iPhone trajectory by dumping a ton of free smartphones (2 for 1 giveaways anyone?) into the market, many of which happen to use some version of the Android OS. Even so, Verizon is still behind AT&T in terms of smartphone sales.
It’s also notable that only a third of the installed base of Android phones are running Android 2.x. The majority are still running last year’s versions, meaning that Android as a platform is fractionalized to the point where increased sales (and free giveaways) are NOT creating a viable market for on-phone apps.
Apple’s iPod touch is also driving game development to the point where Apple’s mobile platform is encroaching upon dedicated game systems. Apple now has 20% of that market, nearly double the share taken by the Sony PSP and nearly three times what the iPhone OS claimed just last year. Android isn’t even represented on the chart, because there is no viable gaming market on Google’s platform. If Android were really outselling the iPhone OS in some meaningful way, that shouldn’t be the case.
AT THE END OF THE DAY, both Google and Apple are accomplishing exactly what they’re intending to do.
The purpose of Android is to broaden the base of mobiles that are tied to Google’s adware-based services, rather than Microsoft (Windows Mobile) or perhaps Symbian, which could potentially ally with Microsoft given the deal between Nokia and Microsoft to bring mobile versions of Office to Symbian at some point.
The purpose of the iPhone is to sell new hardware at a profit and dramatically expand the market for Cocoa-based software development. Google doesn’t really care about the hardware margins of its partners, and can’t seem to sell its own Nexus One branded phone. Conclusion: Google doesn’t care about hardware.
It also isn’t that excited about creating a mobile software platform. It appears to hope that the open source community will accomplish most of that work for it. Google’s long-term plan isn’t to do anything with Android’s Java-like VM or even the new C-based native platform; Google sees the future of mobile apps developed in the same code as it sees desktop apps: the web. The company is likely betting everything on HTML5. Android’s current VM, the NDK, and Google’s support for Flash are all just efforts at covering the bases until mobile HTML5 apps can become a reality.
Apple sees HTML5 as a useful tool for developing web apps, but unlike Google, the company isn’t sold on the idea that the desktop and native mobile platforms are going to vanish anytime soon. Apple is putting significant effort into developing Cocoa and Cocoa Touch as viable platforms well into the future, serving needs that web apps can’t serve now and may never really excel at.
That’s why Google’s apps are nearly all web-based (Maps, Docs, Gmail, etc.) and Apple’s are all native (iWork, iLife, iPhone apps, Pro Apps).
It should come as no surprise that Google’s support for cheap, low quality phones that can be given away for free is an extension of that adware/web-based strategy, while Apple’s premium iPhone market that exacts the world’s highest ASPs while still biting off a third place share of the global market is exactly what Apple is trying to do.
To make my point clear, you stated:
I suppose the whole thing hangs on “have to”, really. Nobody *has* to follow the Win32 API, but it does appear that the office file formats are a standard, as both Apple and Open Office go out of their way to both inhale and produce .doc, .xls and .ppt files.
Android certainly isn’t establishing “desktop standards that everyone has willy-nilly to follow.” I don’t think anyone is arguing that Android belongs on the desktop. Since you made the original blog post (about NPD’s report of an on-line survey in which US customers self-reported that they would rather buy an Android-powered phone – and who knows if what they heard was ‘Verizon’ rather than ‘Android’?), I don’t see how the subject is about the desktop, but as you’ve oft pointed out, it’s your blog, and you get to decide if the subject changes mid-thread.
Windows is a desktop standard.
Apple is working very hard to create more programmers who could write desktop apps for MacOS X.
Android is not a potential desktop standard.
Linux has no real standing on the desktop.
If the question is, as you suggest, “will apple succeed in dominating the desktop?” My answer is, “not likely.” They’re not focused on having that happen.
As pointed out above, the 64-bit transition may not even be much of a bump, never mind a tectonic shift.
So, Eric, while I doubt you’ll ever admit to me teaching you anything, is there anything here “that is actually novel or interesting”?
>So, Eric, while I doubt you’ll ever admit to me teaching you anything, is there anything here “that is actually novel or interesting”?
Some assertions which I don’t see factual grounds for yet; you should cite your sources better, and also be careful about distinctions between (for example) U.S. and worldwide sales. If you supply a full factual case, that will be interesting.
And you’re also confusing two different topics, because Rob and I weren’t addressing smartphones when we predicted (correctly) that Apple would not be a factor in the scramble for control of the 64-bit desktop. Neither Apple’s iPhone performance nor Android is relevant at all, there.
Finally…if you don’t think I’d ever admit that an obnoxious troll had taught me something useful, then you completely fail to comprehend the standard of rationality to which I hold myself. Since I don’t require your approval on any level whatsoever, that failure is strictly your problem.
And now for something completely different.
Question for the audience: Does the average consumer know that a Droid will run all the cool apps they just saw on their friend’s Nexus One?
> And you’re also confusing two different topics
No, I’m not confused that you weren’t addressing smartphones when you attempted to write the pre-history of the transition to the 64-bit desktop. I do note that you’re at least 1.5 years late for the transition to ‘happen’, and that now you’re renaming what was “the 64-bit desktop” to “the next generation of personal computing”.
Guess what, I think you’re seeing the very early days of “the next generation of personal computing”. It’s Android, and iPads and iPhones, and goodness knows what else 5 years from now.
It’s probably not 64-bit desktops. Or rather, it is, but it’s 8-12 cores of 64-bit CPUs in desktops, and these are used largely by creators (programmers, editors of movies, 3-D CAD creators (see the whole ‘maker’ movement), musicians, authors, etc.). These will run whatever is needed to run the tools required to create.
For iPhone OS devices, it will be a Mac. For Android devices, it will almost certainly be a linux desktop of some description. And yes, there will still be Windows.
But in the near future (certainly before you can draw social security, though, given your employment history, I doubt you’re counting on much from that) many, many people who are current desktop users will simply … stop. Their next computer (because all they really need are simple ‘Office’ apps, email and a web browser) will be something very simple, very sexy, and very, very inexpensive. Like an iPad, only… better.
Face it, the “Desktop PC” industry is going to pop like a giant, over-ripe zit, spreading sebum-soaked debt all over the mirror of our industry.
As for you -n- Rob writing together, I’m only aware of:
http://catb.org/~esr/writings/world-domination/world-domination-201.html
which doesn’t mention the Apple Computer -> Apple, Inc naming transition (the revision history only shows 2006 dates), and
the as-yet-unpublished C++ paper.
Sources:
http://www.zdnet.com/blog/btl/apple-iphone-smartphone-market-share-surges-rim-slips/34181
http://www.roughlydrafted.com/2009/11/10/inside-googles-android-and-apples-iphone-os-as-business-models/
> Finally…if you don’t think I’d ever admit that an obnoxious troll had taught me something useful, then you completely fail to comprehend the standard of rationality to which I hold myself
Given that you used “troll” and “not bright enough to be interesting”, and insinuated that I’m amphibian, I’m not seeing a very high standard of self-measure. Or rather, I think I *do* understand your standard for self-evaluation. Rather well, in fact.
I’m of the firm position that the 64-bit desktop landscape (and which OS platforms dominate it) looks largely like the 32-bit desktop did, subject to already existing trendlines. 64-bit didn’t matter, at least, not on the desktop it didn’t.
I also think I’ve pointed out at least 3 different reasons why linux on the desktop is in deep mud:
It’s going to be held back by gcc.
Google have forked the linux kernel.
Google and others are attempting to kill ‘C’.
>Google and others are attempting to kill ‘C’.
You’ve just forfeited any claim to seriousness.
>Or rather, I think I *do* understand your standard for self-evaluation. Rather well, in-fact.
Good, then you’ll be pleased that I’m about to confirm your model. You’re banned as of now. Go nourish your hater fantasies somewhere else; I’m not interested.
Anyone predicting the demise of ‘C’ had really better qualify the prediction with regard to application programming. Because for any kind of low-level programming, until the current microprocessor programming paradigm changes, C is the only game in town. Hell, many processors are developed with C friendliness in mind; ARM is one such architecture. Further, even with regard to application programming, it’s unlikely to go away entirely. I’d say the most likely outcome is that as computing power increases, more applications will be developed in high-level languages – Python, Lua, Java, etc. But anywhere speed is the critical design component, C can’t be beat.
Now it’s possible that some enterprising IC design outfit will decide to take a look at Lua bytecode and take a crack at creating an architecture that runs it directly, for instance. But then they’ll have to figure out how to get all those programmers out there to switch over to Lua – not an easy task. Perhaps the next big thing for ICs is a hardware-based garbage collector…
Another possibility is that programmable logic becomes the new programming paradigm, possibly VHDL or Verilog. Personally, I think this holds more possibilities.
But as I said, these are bottom-up, paradigm shifts in how computing is done. Until then, C will remain.
>But as I said, these are bottom-up, paradigm shifts in how computing is done. Until then, C will remain.
There’s a tier of software for which I don’t think C can ever be displaced, really, because said software needs a language in which keeping use of resources minimal and tightly controllable is more important than anything else except portability across different hardware architectures. Until that job goes away, C won’t either.
Use of resources is not the only thing you might want to tightly control. Here are a couple of other cases where I’d never consider using anything higher-level than C:
1. Cryptographic code that’s exposed to attacker-supplied input. If you’re not writing this in C, you’re probably creating exploitable timing side-channels.
2. Certain lock-free concurrency algorithms that are very delicate with respect to the order of memory accesses.
In both these cases, even C is arguably a bit too abstract, and you’re well-advised to read the assembly the compiler emits to check it for sanity. A “portable assembly language” like LLVM-IR or qhasm is arguably a better choice, but at least C still leaves you with a prayer of getting it correct.
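To make the first case concrete, here is a minimal sketch in C of the kind of comparison you’d want for secrets of equal, known length (the function name is mine, not from any particular library). A naive memcmp() returns at the first mismatching byte, so response time leaks how long the matching prefix is; accumulating the XOR of every byte keeps the running time independent of the data – modulo whatever the compiler decides to do, which is exactly why you read the emitted assembly:

#include <stddef.h>
#include <stdint.h>

/* Compare two equal-length buffers in time independent of their contents.
 * Every byte is examined regardless of where the first mismatch occurs. */
static int ct_equal(const uint8_t *a, const uint8_t *b, size_t len)
{
    uint8_t diff = 0;

    for (size_t i = 0; i < len; i++)
        diff |= a[i] ^ b[i];          /* nonzero iff any byte differs */

    return diff == 0;                 /* 1 if equal, 0 otherwise */
}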
>There’s a tier of software for which I don’t think C can ever be displaced, really, because said software needs a language in which keeping use of resources minimal and tightly controllable is more important than anything else except portability across different hardware architectures. Until that job goes away, C won’t either.
Agreed- and why I qualified my comment as a paradigm shift. Even in terms of programmable logic, C would likely have a seat at the table.
Actually, the more I think about it, I can only conclude my imagination is too limited to figure out what might replace C as a low-level standard (I’m sure it will be done, I just haven’t a clue as to what). It’s hard to get much more fundamental than if-thens, loops, functions and easy memory access. I’ve always thought that C was a great example of Einstein’s “Everything should be as simple as possible, but no simpler.” You’d think that, if it were possible to have OS-on-a-chip, it would have been done.
Perhaps some sort of “chemical-computing” that mimics our own brain’s abilities, but with more control. But then, would something like that need programming? or allow it? (cue sinister music…)
>>The fundamental issue here is that you misunderstand Apple’s goal and thus are talking about something irrelevant (Apple setting desktop/phone standards >>that others have to follow) because Apple is fundamentally uninterested in doing so.
>Damn, you people are thick today. it was Rob Landley and I who were pointing out that Apple isn’t interested.
You did point that out to some extent for computers (although not in this thread, which is what I was addressing, unlike Mr. Fischer). _BUT_ the exact same thing applies to the phone market too, which pretty much invalidates your assertion in the post itself that Android outselling Apple means anything. Android isn’t disrupting Apple, it’s helping Apple by disrupting the rest of the cellphone and smartphone market. What Apple wants is to own the most profitable part of the consumer market and have a relatively predictable single platform owning the rest of the consumer market. Android gives them exactly that if it becomes the dominant overall player, exactly as Windows does in the computer world. RIM is likely destined to be the Solaris of the cell market, with significant corporate presence (and/or domination) and negligible consumer presence.
Note also I’m not actually agreeing with Jake Fischer in any way aside from what Apple’s business model is. I’ve read your 64 bit article and it tracks reasonably well overall.
> A “portable assembly language” like LLVM-IR or qhasm is arguably a better choice
Coming from an embedded background, I’d always thought of C as exactly that. But your examples are clearly different. They’re almost like an assembly macro language. My first thought would be to hand code the critical areas in assembly and code non-critical stuff with C. But these two certainly provide what appears to be a nice alternative. Still, in particular with qhasm, it looks like knowledge of the underlying architecture (registers and the like) is necessary.
“When do people think this education bubble might burst, and how might it happen? I think there is something to this; when a bunch of young people and their parents figure out they might be better off going and learning plumbing or electrical work rather than getting a useless degree, watch out.”
I’ve been giving it some thought. There appear to be several factors at play — a coming shortage of (as you say) skilled tradesmen in a potentially immigration-unfriendly political climate, overpriced higher ed, lingering high unemployment among new college graduates, and probably some others I’ve overlooked (the possible coming ubiquity of online ed, e.g.)
I’m guessing several classes of highly unemployed and pissed off millennials could do the trick. 2012 or 2013 at the latest?
I don’t believe we’ll be seeing the end of C anytime soon, for the aforementioned reasons. I do, however, see its use in decline, maybe faster if ‘Go’ takes off (http://golang.org).
Google is pushing it as a systems language to take over from C++, from what I gather. It has Ken Thompson and Rob Pike as principal designers – two systems guys of legend.
I’ve messed around with it some. I wouldn’t call myself a systems programmer, but it is compiled, gives you safe pointers, and lets you bit-bang. It seems to me to be the biggest competition C has had in a long time – perhaps ever.
– Norm
>It seems to me to be the biggest competition C has had in a long time – perhaps ever.
I agree. I’m keeping an eye on it for that reason. The GC is an issue, though; there’s just no way to do it that doesn’t imply serious episodic spikes in either latency or RAM footprint.
> I’m guessing several classes of highly unemployed and pissed off millennials could do the trick. 2012 or 2013 at the latest?
http://radar.oreilly.com/2010/05/disintermediation-risks-trends.html
“When do people think this education bubble might burst, and how might it happen? I think there is something to this; when a bunch of young people and their parents figure out they might be better off going and learning plumbing or electrical work rather than getting a useless degree, watch out.”
There’s an article about this very subject in last Sunday’s New York Times Review of the Week section. I remain concerned, though, that those who opt for vocational training risk being automated or ‘obsoleted’ out of a job just when they hit mid-life. Fat chance sending *their* kids to college then.
Totally off topic, but… there’s an interesting article about Alan Turing’s teddy bear on Reddit this morning: http://blog.jgc.org/2010/05/talking-to-porgy.html
JF: yes, I suspect something like this is coming (who hasn’t noticed the early stages of it with MIT OCW, e.g.)
And what do I see today on the web? An article from AP:
http://tinyurl.com/24mgmwe
If the MFM is starting to push this, then it looks like my prediction may not be substantially wrong.
“…I agree. I’m keeping an eye on it for that reason. The GC is an issue, though…”
By their own admission they have a fairly pedestrian GC implementation right now, with plans to overhaul it in the pipeline. I expect they’ll at least end up with something as worthwhile as Java/Objective-C.
My biggest concern is over their approach to types/interfaces. There is no specific declaration of “implements X” anywhere, just an understanding that if a type has all the methods of interface X, then it ‘implements’ interface X. It doesn’t take much imagination to see how that could fall flat on its face.
I appreciate their candid approach to pragmatic language design though….I’m watching with interest.
“I remain concerned, though, that those who opt for vocational training risk being automated or ‘obsoleted’ out of a job just when they hit mid-life. Fat chance sending *their* kids to college then.”
Yes, but here’s the key factor: most of those who go to college have no business being there.
From the linked AP article:
“But federal statistics show that just 36 percent of full-time students starting college in 2001 earned a four-year degree within that allotted time. Even with an extra two years to finish, that group’s graduation rate increased only to 57 percent.”
We’ve lowered the IQ requirements to enter college way too much. A much needed correction is coming. We DO NOT need anyone below 108 (and maybe as high as 111) IQ even thinking about 4-year college.
Regarding market share, I recall reading recently that according to NPD, Apple holds 91% of the market for PCs over $1,000. Frankly, as an AAPL shareholder, I’m fine with the Dells and the Acers of the world having vastly more unit share than Apple. I’m expecting a lot of those $400-600 PC to be replaced with iPads in the next year or two.
“But federal statistics show that just 36 percent of full-time students starting college in 2001 earned a four-year degree within that allotted time. Even with an extra two years to finish, that group’s graduation rate increased only to 57 percent.”
This is nothing new. I was in the audience (1963) when the dean gave the old “Look at the person to the left of you; now look at the person seated to your right. Two of you won’t be here at graduation…”
That NYT article I cited earlier points out that people who have had some college, but didn’t graduate, still make more money than those with no college at all. (They didn’t subtract the cost of student loans, though.)
>The GC is an issue, though; there’s just no way to do it that doesn’t imply serious episodic spikes in either latency or RAM footprint.
Those unfamiliar with garbage collection may wish to read this article. ftp://ftp.cs.utexas.edu/pub/garbage/gcsurvey.ps It is a very good survey on uniprocessor garbage collection techniques from 1992. It will give you the basic understanding and terminology on GC.
Then, follow up with the Jones and Lins book “Garbage Collection: Algorithms for Automatic Dynamic Memory Management”.
See also: http://www.cs.umass.edu/~emery/pubs/04-15.pdf which documents a memory-constrained GC that runs with soft real-time performance.
This paper: http://www.cs.umass.edu/~emery/pubs/gcvsmalloc.pdf documents the time-space tradeoffs between GC and explicit memory management. It measures only stop-the-world, non-incremental, non-concurrent garbage collectors, and thus is somewhat worst-case.
Its conclusion:
Since memory is cheap in most environments…
>Since memory is cheap in most environments…
You’re still getting the episodic spikes in RAM footprint, they’re just disguised by the fact that you’ve specified your environment so total RAM is larger than the high-water mark and you don’t care when it’s underutilized. Which may be OK; there are a fair number of embedded scenarios that fit this bill. It gets more problematic when you have more than one process using GC; if the RAM budget is less than the sum of high-water marks you can get failure, or VM stalls introducing nasty latency spikes.
The “memory-constrained GC that runs with soft real-time performance” is, unsurprisingly, a copying collector. It’s well known that these have relatively low and predictable overhead relative to (say) mark and sweep followed by defrag. But they have terrible memory locality. Copying GC puts a soft bound on your maximum RAM and latency at the cost of triggering lots of cache misses and thus increasing average latency. This is one reason they can only claim soft realtime.
I’ve been a LISP guy since the 1970s. Every few years somebody (like these guys) claims to have finally tamed the GC problem, and it always turns out that they’ve pessimized some aspect of performance that they’re not measuring well. We’ve learned to be skeptical.
That said, you are broadly correct in that the cheapest thing to do is usually to throw RAM at the problem.
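To illustrate why the space-for-time trade is so attractive: the allocation path of a copying or generational collector is essentially a bump allocator over a big contiguous region. The sketch below (names and sizes are mine, purely illustrative) is not a collector at all – it only shows that allocation is a pointer increment, and that the price you pay is holding the whole region at its high-water mark until you evacuate or reset it:

#include <stddef.h>
#include <stdint.h>

#define REGION_BYTES (1u << 20)       /* 1 MiB region, size chosen arbitrarily */

static uint8_t region[REGION_BYTES];
static size_t  region_top;            /* next free offset */

/* Allocate n bytes by bumping an offset: O(1), no free list, no sweep. */
static void *bump_alloc(size_t n)
{
    n = (n + 7u) & ~(size_t)7u;       /* round up to 8-byte alignment */
    if (n > REGION_BYTES - region_top)
        return NULL;                  /* region full: time to evacuate */
    void *p = &region[region_top];
    region_top += n;
    return p;
}

/* "Collect" everything at once by abandoning the region's contents. */
static void region_reset(void)
{
    region_top = 0;
}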
Just run the GC on a different core.
Question for you Eric. You might have heard that Google is shutting down its web store for N1. What do you think that portends for the future of Android?
# JF Sebastian Says:
> Just run the GC on a different core.
Could you propose an algorithm for this that would deal with the obvious problems?
# esr Says:
> That said, you are broadly correct in that the cheapest
> thing to do is usually to throw RAM at the problem.
Isn’t that another way of saying that the solution to efficient automatic garbage collection is to not garbage collect?
Perhaps we all need to follow your lead and just use static buffers only :-)
>Isn’t that another way of saying that the solution to efficient automatic garbage collection is to not garbage collect?
Not quite. I’m saying that of all the constraints you can relax to make the GC problem easier, adding more RAM and using something like a generational or copying GC that is space-intensive but has low time complexity is usually going to be the cheapest and most effective; the paper Lapin cited is right about that, as far as it goes. That is, if you take a really comprehensive view of the costs and include, e.g., lifecycle maintenance on the code required to implement trickier approaches.
Sorry if this sounds a bit snotty, but you really have to have marinated in a dozen LISP implementations before you’ll fully grok the tradeoffs. What I’m telling you has been folk knowledge in LISP-land since the early 1980s; the rest of computer science has since gone through a couple of cycles of forgetting and rediscovering it.
>Perhaps we all need to follow your lead and just use static buffers only :-)
Works well if you can get away with it. But GPSD’s data flow makes it a really odd special case; most programs can’t get away with this.
I don’t think GC can be “tamed” without having more knowledge about how long an object is expected to continue to exist. With it, the collection algorithm can take life expectancy into account, and try to group things together that are expected to die at about the same time. That would reduce the tendency to re-fragment immediately. But you wouldn’t want to make that the primary criterion, since you’d lose locality of reference. You’d probably have to go with a model that tries to keep “related” objects together, while sorting within those groups by life expectancy.
>I don’t think GC can be “tamed” without having more knowledge about how long an object is expected to continue to exist.
See, that’s just another direction from which you can try compressing the constant volume of suck associated with this problem. And where it’s gonna bulge out the other side is that if your actual object lifetimes don’t match the assumptions you tuned for so carefully, GC performance will go all to hell.
> You might have heard that Google is shutting down its web store for N1
Yes. In this thread even.
> What do you think that portends for the future of Android?
Android? Don’t know, except the carriers have flexed their muscle and killed the Nexus One. (And just after esr said, “telecomms companies, beware!”)
It does mean the N1 is a dead end phone.
>It does mean the N1 is a dead end phone.
Huh? No. It’s just going to move to different retail channels. HTC spent a lot of NRE on that hardware; do you think they’re going to just let it sink because the Google brand isn’t on the cover?
Android is doomed to become the platform of choice for people with iPhone envy. It will dominate the lower and middle tiers of the smartphone market, with the iPhone occupying the top tier.
Dell is doomed to become the platform of choice for people with Sun envy. It will dominate the lower and middle tiers of the server market, with the UltraSPARC occupying the top tier.
ESR says: Heh. I was going to write an earnest technical explanation of why Jeff was wrong, but dfranke wins.
9mm is doomed to become the platform of choice for people with .45 ACP envy. It will dominate the lower and middle tiers of the handgun market, with the Colt .45 occupying the top tier.
# esr Says:
> Works well if you can get away with it. But GPSD’s data
> flow makes it a really odd special case; most programs
> can’t get away with this.
My comment about “the best GC is no GC” was partly serious. No doubt you are right that static buffer programming is much harder, but let me point out that there were one heck of a lot of programs written in BASIC before BASIC had dynamic memory allocation.
Also, it is worth pointing out that LOTS of programs can do perfectly well without GC because they don’t run long enough to care. For example, in Linux most of the small command line programs like cp, mv, rm, and so forth could probably do perfectly OK without dynamic memory. There is also a special category of program that needs dynamic memory at the program set up phase only, and doesn’t need to dispose of it. (Perhaps these programs would be a better example of this, since they need to parse the command line.)
So depending on your definition of “most” I think most programs can run perfectly well without GC, and it is certainly possible to write many programs without any dynamic memory at all (though it is undoubtedly more work to do so.)
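As a concrete (if trivial) illustration of that style, a minimal C sketch of a filter that never allocates dynamically – one static buffer, with the line-length cap as the assumption you accept up front:

#include <stdio.h>
#include <string.h>

#define LINE_MAX_LEN 4096             /* assumed cap; the price of never calling malloc() */

static char linebuf[LINE_MAX_LEN];    /* the program's only buffer */

/* Count input lines using nothing but static storage. */
int main(void)
{
    unsigned long lines = 0;

    while (fgets(linebuf, sizeof linebuf, stdin) != NULL) {
        /* count a line only when its terminating newline is seen, so
           over-long lines read in pieces are not counted twice */
        if (strchr(linebuf, '\n') != NULL)
            lines++;
    }
    printf("%lu\n", lines);
    return 0;
}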
@Jeff Read
It’s called “low-end disruption”. Daniel and Ken say things that were once reasonable; once Android gets the lower and the middle tier, it will be good enough for the bottom of the top tier (and so on).
@esr
Garbage collection is necessary in some places. Some data structures need garbage collection, some don’t. Could greater control of the garbage collector help? A ‘nogc’ primitive, perhaps. Then you could use Go as a “C with garbage collection”, on an opt-out basis (C++ kinda does the reverse).
Google is also pushing Android into a rather different space where Apple has tried to establish a beachhead.
Then you don’t have that knowledge.
I also wonder how much memory page size affects GC performance. If you have an object that takes up eight full pages of memory, and a partial page on either side of it, there’s no need to move those eight middle pages around, but there is something to be gained by filling in the unused space at the beginning of the first page and the end of the last page. As page sizes go up, there’s less opportunity to have those full pages in an object.
Heh. That’s right. You kids and your newfangled “garbage collection.” Why, back in my day, coding on Turbo Pascal, you had to fit your code and data in one 64K segment! Dynamic memory allocation was a luxury we just couldn’t afford! Garbage collection, indeed….
*ducking*
@The Monster
Those huge objects are the sort of thing where my hypothetical ‘nogc’ primitive may be useful. Such an object is likely somewhat static, especially in size. Any process recreating it out of place is likely to be complex, so throwing in manual memory management wouldn’t matter much. A generational GC would be helpful here, but would cause RAM spikes.
It has occurred to me that a ‘nogc’ primitive has some semantic problems. What about a ‘nogc’ structure that references managed memory?
Hmmm… Go enforces the 1TBS. All you Microsoft-brain damaged, opening brace-on-a-new-line types can commence whining now, but clearly you lost. :)
Seriously, Go is cool, but I think the GC problem will likely keep it off of small embedded platforms. Which is okay for Go, because it isn’t likely that Go’s designers were targeting small embedded platforms. Clearly Go was designed for today’s multicore servers and workstations, since its primary emphasis is on multiprocessing.
Peter, D already does that, with a garbage collector used by default that you can optionally turn off where necessary, one of the reasons why D is the most likely successor to C and C++, not Go. The main reason is that Walter Bright, the guy behind D, is a very smart guy who has been writing C and C++ compilers for decades.
Ha ha. Except we aren’t talking about Sun — we’re talking about Apple, which as Some Guy points out, actually does dominate the top tier of the desktop market.
Uh huh. Silly fanboy. That’s because Apple is about the only contender in the over-$1,000 home/consumer PC category. Apple’s closest competitors sell very few machines, if any at all, in that category for over $1,000, and I think Apple doesn’t sell any for under $1,000. These marketing surveys don’t consider, for example, high-end $1,000+ corporate PCs and workstations, a category dominated by Dell.
>That’s because Apple is about the only contender in the over-$1,000 home/consumer PC category.
That’s right. Those prices are a premium it charges for OS X – the only home PCs in that price range are tarted-up “gamer” systems sold to idiots who don’t understand network latency. The notion that Macs have better “build quality” in anything but the exterior case design is a crock.
More than gold-plated; with certain exceptions (the 2006 bad run of Intel MacBooks comes to mind) it’s pretty much impossible to buy a big-name PC with the same build quality as a Mac, at least not at a price that doesn’t make the Mac the better value proposition.
Apple cares about quality.
Looking at the D source examples, it would appear that declarations of victory for 1TBS are greatly exaggerated…
> Apple cares about quality.
I have an inexpensive Compaq that’s been running fine for six years. How much better can Apple’s build quality be?
Yours,
Tom
I wouldn’t be surprised if there are 10- or 15-year-old Macs running in many a music studio. Usually because it’s the only thing that will drive some obscure but irreplaceable piece of audio processing equipment.
Yeah. Jeff Read is just being downright silly again. Since they started selling Intel machines, Apple sells nothing more than commodity machines with a little extra pretty/shiny thrown in. The motherboard is nothing more than Intel’s reference board for the given CPU and chipset. That means they are no different than anything sold by Dell, Lenovo, HP, etc., except for the OS.
Samsung has very good build quality.
In terms of desktop PCs, provided you burn in the hardware, build quality more or less solves itself.
Actually, aside from Android / iPhone strategies, I think Apple’s biggest mistake was going with only one network provider in the U.S. This forced all the other network providers on a wild search for the iPhone killer, subsidizing Apple’s competitors. At the same time, the popularity of the iPhone on the AT&T network led to real capacity problems, not to mention that sometimes other carriers have faster data or better data coverage. If Apple had spread the iPhone around the other networks would gladly have spent less subsidizing Apple’s competitors and the network load would also have been spread.
OTOH, you may notice that no one has tapped me for any management positions….
Yours,
Tom
I am not sure how much higher quality Apple’s hardware really is. It uses Intel hardware now, just like everyone else. It may look a little prettier, but that does not mean better. You could argue the OS is higher quality, but that is a different thing.
I use a Toshiba myself and it seems like very high quality stuff. Four years old and still goin’ strong.
I would avoid Dell though. Every last person I have talked to that has used one says they would not get another.
Darren Cardinal:
When it comes to commodity PC hardware vendors, the following brands are ones I’ll recommend:
Lenovo
Samsung
ASUS
Apple
Toshiba
The following, I tell people to avoid:
Acer/Gateway/eMachines
Packard Bell
Dell
Sony (not because of hardware, but because the best thing to do with a Sony is to remove the hard drive without ever booting it up, put in a known blank hard drive, and install Ubuntu. Then burn the original factory hard drive under the light of the full moon, and spread the ashes around a crossroads, while muttering whatever Latin proscriptions you know.)
I don’t have an opinion either way about:
Hewlett-Packard
Compaq (used to be really down on them)
Mmmm…I wouldn’t put Dell in the same category as Acer/Gateway/eMachines/Packard Bell. Their consumer stuff is crap, I agree 1000% on that. But their corporate and SOHO lines aren’t half-bad. Especially their laptops. I’ve got a couple of Dell notebooks that I’m very happy with. Plus, they sometimes have good deals on stuff they’re blowing out. And their tech support and customer service for home users sucks. (I know who’s got the contract for that support: a bottom feeder in Tampa, FL called “Stream.”) But their help desk and customer service for corporate customers is very, very good. Even for smaller accounts.
ASUS — well, I used to build my white box machines exclusively using their motherboards, but lately I’ve noticed that they’ve started to go downhill a bit. I prefer Gigabyte these days.
Everything else I agree with. Lenovo makes great notebooks too, even if they’re a bit pricey.
Have you found a solution to the Sony problem that doesn’t involve inhaling the smoke from burning the hard drive? :)
About ASUS: yeah, when I built my machine I used an ASUS mobo, and it was good stuff; their tech support was helpful when I had a question. I understand they make laptops now, and I have heard they are pretty good.
Are Packard Bell and emachines still around? I thought they folded, or got bought by another company or something.
eMachines is part of the Acer/Gateway group.
Packard Bell still exists. Packard Bell also has the distinction of producing some of the most brilliantly executed bad ideas ever.
There was a computer I repaired that had a card that combined the following features:
Modem
Ethernet Port
Game Controller
Sound Card
Serial Port
All on one ISA slot. The connections were set up in such a way that if you tried using more than 2 of those functions at once, the cables would get in each other’s way. If you managed to jimmy the cables in, the current draw would fry the card.
Which is what brought this technological wonder to my workbench to fix.
Someone REALLY put some engineering thought into getting all those components on one card – and then someone else tried to make the card survivable.
It was a bad idea in general – I have no idea how much they saved by trying to put that on one card, but it couldn’t’ve been much. It was a brilliantly executed bad idea, and I doff my cap to the poor engineers in Taiwan who took an impossible mandate and put it into hardware. And made it sort of work.
“That NYT article I cited earlier points out that people who have had some college, but didn’t graduate, still make more money than those with no college at all. (They didn’t subtract the cost of student loans, though.)”
I’m not sure that’s the right comparison, because the pool of “no college at all” includes a ton of unskilled labor (and just plain unemployable) types. I wonder how the “some college” group compares to skilled labor/technical school graduates (which is really the issue here.)
In any case, if you’re going to have persistently high unemployment for the next 5 or 6 years, the outlook on 4 year higher ed will certainly be altered.
@Ken Burnside:
Niiiiiiiice. :-) LOL. Though, these days it wouldn’t be unusual (common, even) to see all of those on one chip. Well, maybe except the modem. This had to be a Packard Bell…. (And Packard Bell is also part of Acer; they acquired it in 2008.)
@Darrencardinal: As Ken Burnside said, yes. But Packard Bell machines are, I think, no longer sold in the U.S., only in Europe and Asia.
Many Toshiba laptops from that era had… heat dissipation issues. Nothing system-killing unless you work in the Sahara, but enough to add a lot of dead weight in heatsink, and make the fan(s) roar on a hot day…
They also tended to have matte-finish cases with gray paint that wore off ridonkulously easily if you rested your wrists on the lower part of the case, leaving ugly black stains.
But they tend to work for years, I’ll give ’em that…
“There’s an outside chance consumers simply don’t want this capability.”
Videophoning was one of the “the future will be so awesome when we have this” staples of the sci-fi novels of my childhood, so when I first saw Skype video I was sure it would completely revolutionize the world. What actually happened is that it certainly wasn’t without impact: a lot of people, especially students, appreciated the chance to keep in close contact with their loved ones, and for me it was the major enabler of moving out of my hometown without feeling hopelessly alone anywhere else. But no, it did not revolutionize the world; somehow it made less of an impact than the sci-fi novels expected. Videophoning is something you use to talk to mum or your gf/bf once a week – not something you use for most or even a few of your calls.
Why? I think the reason is laughably simple. People don’t want to put on nice clothes, do their hair, make-up for the ladies, etc. for a simple phone call. Nor do they want to sit in one place without moving around. Nor do they want to give out more info than is necessary – they want to be able to frown and give the finger without the other party knowing they did. So at the end of the day, for the typical purposes of day-to-day phoning, video is not only useless but actually a hindrance, a misfeature. So in hindsight it seems the old sf writers were somewhat naive about human psychology.
BTW I think the other sf staple, flying cars, didn’t take off for similar psychological reasons. No traffic jams is nice, but once you drive something that flies through the air instead of rolling on the ground, the risk of crash, injury, and death goes up enormously, and that simply doesn’t seem worth it to most folks for most purposes.
Shenpen, enough with the facile psychological “explanations.” Video conferencing is going to be huge; what’s held it back so far are mere technical issues like deploying proper codecs/chips and provisioning bandwidth adequately, nothing else. Once everyone has video cameras on their smartphones/tablets, video calls will take a big chunk out of voice calls, just as text messages are doing today. Each type of messaging has its own use, but video calls will be huge. As for flying cars, there are much more stringent technical limitations there, having to do with safety, ease of use, and price. A Cessna 400 costs around half a million dollars; not many people can afford that, or the training that flying one requires. Not sure how you chalk up safety as a purely “psychological” issue. I think the safety and ease-of-use issues can be solved by having automated systems that do everything for you, sort of like the auto-pilot on a plane except better. Perhaps flying cars will have to fly at extremely low altitudes, say 20-500 ft, for optimal safety, but I’m not sure how the physics of such low-altitude flight works out, i.e., whether such low altitudes are really feasible.
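To put some rough numbers on the bandwidth point, here is a back-of-envelope sketch; every bitrate in it is an assumption pulled from typical codec ballparks, not a measurement.

```python
# Back-of-envelope arithmetic on why two-way video stresses a network in a
# way voice never did. All bitrates are rough assumptions, not measurements.
VOICE_KBPS = 24      # compressed voice call (AMR-ish ballpark)
VIDEO_KBPS = 700     # mobile-quality two-way video (roughly VGA H.264)
SECTOR_MBPS = 20     # assumed usable capacity of one cell sector

def calls_per_sector(call_kbps, sector_mbps=SECTOR_MBPS):
    """How many simultaneous calls of a given bitrate fit in one sector."""
    return int(sector_mbps * 1000 / call_kbps)

print("voice calls per sector:", calls_per_sector(VOICE_KBPS))  # ~833
print("video calls per sector:", calls_per_sector(VIDEO_KBPS))  # ~28
print("each video call costs about %.0f voice calls" % (VIDEO_KBPS / VOICE_KBPS))
```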
> I think the safety and ease of use issues can be solved by having automated systems that do everything for you, sort of like auto-pilot on a plane except better.
Yes. Auto-pilots make pilots safer.
> Perhaps flying cars will have to fly at extremely low altitudes, say 20-500 ft, for optimal safety, but I’m not sure how the physics of such low-altitude flight works out, ie if such low altitudes are really feasible.
Higher is probably safer – more time to correct problems before you hit dirt.
Yours,
Tom
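A quick worked example of the “more time to correct problems” point, using plain free-fall times over the 20-500 ft band mentioned above (drag and any residual lift are ignored, so these are worst-case floors):

```python
# Rough free-fall timing behind "higher is probably safer": seconds to the
# ground from various heights, ignoring drag and any lift the vehicle still
# produces, so these are worst-case floors.
import math

G = 9.81          # m/s^2
FT_TO_M = 0.3048

def seconds_to_ground(height_ft):
    return math.sqrt(2 * height_ft * FT_TO_M / G)

for ft in (20, 100, 500, 2000):
    print("%5d ft -> %4.1f s to react" % (ft, seconds_to_ground(ft)))
# 20 ft leaves barely a second; 2000 ft leaves about eleven.
```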
Ajay Says:
> Shenpen, enough with the facile psychological “explanations.”
Curious reaction. I think Shenpen makes a pretty good point. I also think it is ironic that you cite texting, since this is simply the same spectrum in the other direction. Why do people text rather than talk on the phone? One major reason is that a text takes a lot less commitment than a phone call, just as a video phone call takes a lot more commitment than an audio only call.
That isn’t to say that the iPhone with video conferencing isn’t a pretty major thing; it probably will be. The meaning of such a thing is hard to imagine right now; something paradigm-shifting, like Twitter, might come out of it. However, I don’t know, and if I did, I wouldn’t tell you, I’d be making it in the basement, so that I could be rich, Rich, RICH!!!!
But Shenpen makes a pretty worthwhile observation IMHO.
Another problem with Android: hand somebody your phone, and watch them attempt to make a call. Bonus points if you weren’t on the home page.
Investor Dave McClure at I/O: ‘Open Is for Losers’ ★
Anthony Ha, writing for VentureBeat:
Paul Graham of incubator Y Combinator took up the question again when asked about the viability of building a website that works on multiple phones through their mobile browsers versus native applications that are built for specific platforms like the iPhones. Graham said he hopes that mobile websites can win.
“I’m very afraid of a world in which we are all Steve Jobs’ slaves,” Graham said. “If anything can save us, it might be Chrome.” When Costolo asked whether he would invest in a company building for the iPhone versus Google’s Android platform, Graham answered, “Of course iPhone. I’m talking about what I hope will set us free, not what will generate opportunities.”
Flying cars won’t be feasible till there’s a near-absolute failsafe mechanism — perhaps a trigger that deploys whole-vehicle airbags simultaneous with Halon-flooding the engine compartment for a nice, safe, Mars-lander-style touchdown?
Paul Graham always was an Apple fanboy at heart. As for Chrome on iPhone, Graham is confused. I doubt that Apple will allow a full, unfettered version of Chrome on the iPhone. They forced Opera Mobile off the App Store, but then (just a few days ago) allowed Opera to post Opera Mini. Thing is, there’s a world of difference between Mini and Mobile — Mobile is a full-featured mobile web browser and Mini isn’t. Same thing will happen for Chrome: Google won’t be allowed to have a fully-featured Chrome on iPhone.
Flying cars are going nowhere because of the collisions that will take place between them and the people using their jetpacks. Low altitudes will be particularly hazardous to both due to the dense networks of monorail lines running everywhere.
I used to always get a good laugh from the Popular Mechanics-type magazines’ predictions for the future. I long ago gave up waiting for those flat screen TVs that they claimed would be thin enough to hang on the wall like a picture….
Flying cars won’t be viable for a long, long time. Flying motorcycles OTOH, might show up reasonably soon: A light flying vehicle limited to good weather, requiring somewhat more skill than car-driving to pilot, with much reduced safety compared to cars, and generally having the same sorts of disadvantages relative to cars that ordinary motorcycles do.
“Why do people text rather than talk on the phone? One major reason is that a text takes a lot less commitment than a phone call, just as a video phone call takes a lot more commitment than an audio only call.”
One third of the population are introverts. Texting is the ultimate introvert means of communication. The extroverts happily blab blab blab on their phones. Certain modes of communication favor certain temperaments. This isn’t going to change anytime soon (if ever), so must be kept in mind when forecasting the future of comm. tech.
RE: “Flying cars”
I actually see more possibility for something closer to Luke Skywalker’s land speeder, with enough vertical range to give a liberating amount of Z axis – certainly enough to eventually rid us of these ugly things called ‘roads’ ;)
The problem with hovercraft or other ground-effect vehicles is that they don’t have brakes. They’ll need fly-by-wire control systems that can interpret “hitting the brakes” to provide the correct deceleration.
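As a toy illustration of what “interpreting the brakes” could mean for a vehicle with no wheels gripping the road, here is a minimal open-loop sketch that maps pedal position to a target deceleration and commands reverse thrust to produce it; every constant in it is invented for illustration.

```python
# Toy brake-by-wire sketch: no wheels means no friction brakes, so the
# controller turns pedal position into a target deceleration and commands
# reverse thrust to produce it. Every constant here is invented.
MASS_KG = 800.0    # assumed vehicle mass
MAX_DECEL = 6.0    # m/s^2 at full pedal, roughly a hard stop in a car
DT = 0.05          # control-loop period in seconds

def brake_step(speed_ms, pedal):
    """One control tick: pedal in [0, 1] -> new speed after DT seconds."""
    target_decel = pedal * MAX_DECEL
    reverse_thrust = MASS_KG * target_decel        # newtons, F = m * a
    new_speed = speed_ms - (reverse_thrust / MASS_KG) * DT
    return max(0.0, new_speed)

speed = 30.0   # m/s, about 108 km/h
elapsed = 0.0
while speed > 0:
    speed = brake_step(speed, pedal=1.0)   # driver stands on the brake
    elapsed += DT
print("stopped after %.1f seconds" % elapsed)   # ~5 s from 30 m/s at 6 m/s^2
```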
That’s because there are far worse slavemasters to have than Steve Jobs. At least under Steve your shackles will be comfortable, look smashing with your outfit, and be attached to a robotic load-lifter frame that provides mechanical assistance to your back-breaking labor.
:)
Surely, as these are US-only sales, this points more to the following factors:
1) Lack of CDMA iPhone
2) Giant amount of hype surrounding the next-generation iPhone (due to be announced in June)
3) Android phones are actually available now.
This guy is my personal hero:
http://linuxoniphone.blogspot.com/