How smartphones will disrupt PCs

I never bought the hype that laptops were going to obsolesce the conventional desktop PC, nor do I buy today’s version of the hype about netbooks. The reason I didn’t is simple: display and keyboard ergonomics. I use and like a Lenovo X61 Thinkpad happily when traveling, but for steady day-to-day work nothing beats having a big ol’ keyboard and a display with lots of pixels. I have a Samsung 1100DF, 2048×1536, and it may be a huge end-of-lifed boat anchor but I won’t give it up for a flatscreen with lower resolution and less screen real estate.

But now I’m going to reverse myself and predict that smartphones (not today’s smartphones, but their descendants three to five years out) will displace the PC. Here’s what I think my computing experience is going to look like in, oh, about 2014:

All my software development projects and personal papers live on the same device I make my phone calls from. It looks a lot like the G1 now sitting on the desk inches from my left hand; a handful of buttons, a small flatscreen, and a cable/charger port. My desk has three other things on it: a keyboard about the size of the one I have now, a display larger than the one I have now, and an optical drive. Wires from all three run to a small cradle base in which my phone sits; this also doubles as a USB hub, and has an Ethernet cable running to my house network. And that’s my computer.

(In a slight variation, the screen and keyboard devices don’t have wires to the phone; instead, they talk to it via wireless son-of-Bluetooth. But wires have a significant advantage, as we’ll see below.)

When I leave the house, I pull the phone from its cradle and put it in my pocket. At that point, the onboard screen becomes its display. I’m limited to low resolution and a soft keyboard through the phone’s touchscreen…until I get to my local internet cafe, which is full of display-keyboard combinations much like the one I have at home, awaiting my use. If for some reason I need an optical drive, I borrow one and plug it into the device hub that’s servicing my phone.

And that hub is definitely wired to its devices, if for no other reason than that this avoids unnecessary wireless collisions over which cradle owns which devices. It also makes my private traffic more difficult to snoop casually.

The key to this scenario is a combination of the convenience of a very small, portable computing device with the ergonomics of a desktop system. Actually, because I’m a hacker, I probably own two or three of these: the “phone” is whichever one has my SIM card in it, leaving the others available for experimental OS installs and trial upgrades. Whenever my devices are connected to the house net, they sync file state with each other, so there’s always a recent backup handy if I lose the one in my pocket.
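
(For the curious: the sync I have in mind is nothing exotic. Here is a minimal sketch in Python, assuming each device’s working tree shows up at some mounted path; the paths and the newest-copy-wins rule are purely illustrative, not a design.)

```python
# Illustrative sketch only: mirror whichever copy of each file is newer
# between two devices' working trees. The mount points are hypothetical.
import os, shutil

def sync_newest(tree_a, tree_b):
    for root, _dirs, files in os.walk(tree_a):
        rel = os.path.relpath(root, tree_a)
        for name in files:
            a = os.path.join(tree_a, rel, name)
            b = os.path.join(tree_b, rel, name)
            d = os.path.dirname(b)
            if not os.path.isdir(d):
                os.makedirs(d)
            # Newest modification time wins; missing files get copied.
            if not os.path.exists(b) or os.path.getmtime(a) > os.path.getmtime(b):
                shutil.copy2(a, b)

# Run it both ways so every device ends up holding a recent copy.
sync_newest("/mnt/phone1/home", "/mnt/phone2/home")
sync_newest("/mnt/phone2/home", "/mnt/phone1/home")
```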

I am never without my phone. I am never without my computer. They’re the same device. I remember having a “desktop”, but it’s just as firmly in the past as my long-ago days using refrigerator-sized minicomputers; I’d no more go back than I would to a VAX-11/780. The distinctions we used to make between phones, computers, music/video players, personal GPSes, and PDAs already seem quaint; my “phone” is all of those.

This is why smartphones are important. They’re pretty disruptive already; you only have to look at the havoc they’re wreaking on the market for standalone GPSes to see that. What most people haven’t figured out yet is that what they’re doing to GPSes today they’re bound to do to every other sort of personal electronic device tomorrow, including personal computers. Once you’re carrying a networkable Turing-complete device on you, the economic/ergonomic case for having it do all those GPS/media-player/PC things is unanswerable. Who wants the hassle of multiple devices when they can just have one? It’s all computing, anyway.

The only step towards that we haven’t taken yet is dissolving the marriage of our conventional screens and keyboards to those bulky tower cases beside our desks, and teaching phones how to use them. Otherwise most people could meet their computing needs with their smartphones (and an outboard drive for their music/video/movie collections) today.

In a future post, I’ll explain why the same economic forces driving the convergence pretty much guarantee that the software on them will be open source.

201 comments

  1. It sounds like all you’re doing is embracing the flawed logic of the laptop/netbook boosters and applying it to smartphones now. The fundamental question is whether any mobile device can do all the things that a powerful desktop/server can do and the answer will always be “no.” Today, you wouldn’t want to run photoshop on your iPad, tomorrow you won’t want to run those cool new 3D videos on your Nexus Three. I agree with your point about mobile convergence but you go a bridge too far when you posit that most will similarly ditch their desktop or home server too. As for your internet cafe scenario, you give no rationale for why someone would want to use those cafes much when they already have their tablet for quick stuff and can always go home or to the office desktop for keyboard and cpu-intensive apps. There will definitely be a minority of users who ditch their desktops for their tablets, just as I ditched my landline for my cell almost a decade ago, but the mobile cpus will always be underpowered for most people for a full home experience.

    1. >As for your internet cafe scenario, you give no rationale for why someone would want to use those cafes

      Many people — especially under-30s, but including me — actually enjoy working in “third place” environments where you can be near other people but are not automatically expected to be in a social mode. The draw at these places used to be cheap Internet access, but now it’s mostly psychological.

  2. Apple’s iPad is a completely different model of usage. It’s expressly >>>not<<< an Alto, which is what you seem to want.

    Computers will be for creators.
    Devices like phones and tablets will be for consumers of the goods created by those who use computers.

    Why would you need an optical drive when you can have 802.11n wireless backed by gigabit fiber?

  3. > And that hub is definitely wired to its devices, if for no other reason than that this avoids unnecessary wireless
    > collisions over which cradle owns which devices.

    What you don’t know about wireless would fill entire shelves.

    > It also makes my private traffic more difficult to snoop casually.

    One word, baby. TEMPEST

  4. Yeah, I can see the attraction of computing in a third place, I suppose I’m just skeptical that your tablets will suffice in those coffee shops or juice bars. I think it more likely that their current rows of desktops will be replaced by a small server rack in the back that will augment your “third place” computing experience, just as you’ll always need a home server to do the same.

  5. David D. Friedman once suggested a similar workflow. I’ve been reading the Autodesk file where John Walker says one of Autodesk’s biggest successes was taking advantage of the fact that “the introduction of new technologies can cause discrete jumps in the economic fundamentals of a business, an industry, or an entire economy” ( http://www.fourmilab.ch/autofile/www/subsectionstar2_73_2_5.html , but that page won’t make much sense outside of the rest of the chapter http://www.fourmilab.ch/autofile/www/chapter2_73.html ).

    Before AutoCAD, companies wouldn’t dream of running CAD on anything smaller than a minicomputer — generally with special hardware added to boot! AutoCAD worked because PCs of the 1980s had enough horsepower to run CAD. When they crossed that threshold, they caused a discrete jump in the economic fundamentals of the CAD industry, and Autodesk took advantage of that. Today’s smartphones have much more horsepower than PCs of the early ’80s. The main thing they lack is a decent interface for working with serious applications.

  6. Unrelated, but I saw this on my most recent visit to Philadelphia to visit friends, and knowing you’re from the area I thought I’d share. It’s right off South St on I think 4th or 5th.

  7. P.S., after having actually read the post. The Nexus One is a huge leap even from the G1 in precisely the direction you’re predicting. It is markedly faster at everything (with its 1 GHz processor), and the screen is actually usable to browse the internet (esp with WiFi on). Once the software/hardware issues of docking to external devices are worked out via Android/Chrome, it will be usable as a non-power user’s PC.

  8. >Why would you need an optical drive when you can have 802.11n wireless backed by gigabit fiber?

    Linux distro installs are my main use.

  9. >David D. Friedman once suggested a similar workflow.

    I’m not a bit surprised to learn this. We’re friends, and it fits his thinking pattern as I understand it pretty well.

  10. I’ve been using my 4 GB USB flash stick for installs or trying out “live DVDs” since last year, works fantastic, optical drives are so 90s. ;) Flash sticks should work on any reasonably recent BIOS that supports booting from USB storage. I was surprised that you mentioned optical drives in your post, gotta agree with Jake on that one, optical drives are out. Where I disagree with Jake is on the artificial distinction between creators and consumers, as we are seeing a boom in the number of creators that is only going to accelerate. Just as a multitude of bloggers and podcasters are currently replacing old print media and radio, TV and a bunch of other fields are about to be crowdsourced. Open source can be seen as the first iteration of that trend for software itself.
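
    (Side note on mechanics: writing an install image to a stick is just a raw block copy. A rough sketch of the idea in Python follows, with a deliberately bogus device name so nobody pastes it blindly; in practice you’d just use dd or your distro’s image writer.)

    ```python
    # Sketch of dd-style imaging of an install ISO onto a USB stick.
    # The ISO path and device node are placeholders; writing to the wrong
    # device destroys its contents, so treat this as illustration only.
    SRC = "/tmp/some-distro.iso"   # hypothetical install image
    DST = "/dev/sdX"               # placeholder for the stick's block device

    with open(SRC, "rb") as iso, open(DST, "wb") as stick:
        while True:
            chunk = iso.read(4 * 1024 * 1024)  # copy 4 MiB at a time
            if not chunk:
                break
            stick.write(chunk)
    ```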

    1. >optical drives are so 90s

      OK, for “optical drive” substitute “nonvolatile physical medium du jour”; it makes no essential difference to the logic of the argument.

  11. The idea that cafe keyboard-display-hubs will be any more secure is naive. I would never trust such a keyboard; it’s far too easy to put a keylogger in place, and BAM! all your credentials are compromised. They even make hardware keyloggers that look exactly like a normal keyboard.

    1. >I would never trust such a keyboard; it’s far too easy to put a keylogger in place, and BAM! all your credentials are compromised.

      I deliberately ignored that issue, because bringing your own keyboard wouldn’t actually solve the problem either. If the cafe management is corrupt and determined, you’re gonna get snooped by TEMPEST-like methods. What wires do is just close off the easiest snoop path.

  12. >but the mobile cpus will always be underpowered for most people for a full home experience.

    You need to define what you mean by “full home experience”.

    I’d argue that for most people a “full home experience” (wrt computing) involves checking email, checking the web, storing their photos and videos and maybe using the various bits and pieces of office (word, excel etc). The reason that netbooks have been successful is that from a power perspective they are perfectly adequate for the majority of people at an unbeatable price point. (Note that ESR isn’t arguing counter to this. His specific problem with netbooks isn’t related to computing power but to interface)

    If IBM wanted to jump back into this market it’s more than possible that a Cell-like architecture (like the PS3’s) could help bridge whatever gap exists between low-power and high-power computing. Your cradle isn’t just a power and data connection; it adds SPEs to your device, thus improving its power (for certain applications at any rate). This would of course require new software to take advantage of it, which is usually the death knell of any idea.

    1. >(Note that ESR isn’t arguing counter to this. His specific problem with netbooks isn’t related to computing power but to interface)

      JonB has it exactly right. Some of the rest of you should have been paying closer attention.

  13. >Just as a multitude of bloggers and podcasters are currently replacing old print media and radio, TV and a bunch of other fields are about to be crowdsourced.

    “Crowdsourcing” is not going to replace traditional TV series any time soon. What’s more likely is that traditional TV will simply be distributed via the Internet. This is just more of the ludicrous anti-expertise nonsense that’s floating around, where the uninformed feel entitled to spread their ignorant opinions.

  14. There’s a good reason why Laptops have USB ports and video out. As a practical matter, they are replacing desktops for most people. While real keyboards and big monitors are needed features, laptops support them just fine, and your big drive is likely a USB drive.

    Talk to anybody under 25 and they almost assuredly own a laptop or netbook, but the only way they’d have a desktop is if they’re a gamer. Desktops are becoming specialist devices for those who need more CPU/GPU power than is economic in a laptop. See any college/university for examples of this. The kicker is that while laptop proponents have been talking about this for decades, it’s only become practical in the last few years as laptops hit a price/performance ratio not terribly different from mid-range PC’s.

    Personally, my laptop has a 21″ CRT and a real keyboard attached at home (along with an extra 2TB of storage, a couple of scanners, a printer, etc.). Works just fine and handily replaced my previous desktop.

    The smartphone, on the other hand, is a peripheral. A useful one, but not a practical replacement for general-purpose computing. What it is doing is killing off most of the special-purpose devices that preceded it. The PDA is dead, hand-held GPSes are dying unless they’re application-specific (bike/boat/plane), the low end of the digital P&S market is evaporating, and the media player market is also slowly getting eaten.

    My expectation for 2014 is I’ll have 2 primary computing devices. My laptop and my smartphone. The smartphone will be the always-on-me device, the laptop will be used for serious work. But then again, this is the model I mostly use now (except I have a laptop, a mini-tablet[netbook with tablet functions] and a smartphone, the tablet’s for web access/media playback where the smartphone just doesn’t do a good job yet).

    1. >While real keyboards and big monitors are needed features, laptops support them just fine,

      No they don’t. I have yet to encounter a laptop with a VGA out that can drive a monitor with higher resolution than the laptop’s flatscreen. My X61 (1024×768) cannot drive my desktop monitor to capacity.

  15. JonB, I tried to give some examples of the full home experience above, eg Photoshop or 3D video. I think your settop box in your living room is likely to become your home server, both serving out HD video to your big screen DLP TV and powering your cpu-intensive apps, which explains why Microsoft has emphasized owning that new beachhead also. However, it is beside the point exactly what the new cpu-heavy apps are; what matters is that there will always be such new ones that will bog down your underpowered mobile CPU. This is why new versions of Office are no faster nor do they take up a significantly lower percentage of your desktop resources than older ones did: the software always grows to use the new computational resources. I do agree that on the lower end there will be people who go full mobile without any home desktops/servers, I just think that group is a minority. If you posit SPEs in the cradle, all you’re really saying is the home desktop will be shrunk down to something approaching a cradle, at which point there is no disagreement if we’re both acknowledging that the mobile device will need such computing help in the home.

  16. Grumble…

    I’ve been saying *exactly* what you said above for seven years now. (Right down to the timing, only seven years ago it was ‘ten to twelve years from now’.) I’ve said it in geek meetups, at unconferences, at SF conventions, and any other place I could get an audience who cared even the slightest.

    Now you said it. When it comes true, no one will remember me…

    /Grumble

    Note to those who jump to conclusions: I am *not* claiming ESR stole the idea from me. It is such a natural one that I expect there is someone else out there who can honestly claim they have been saying it for ten years.

    1. >It is such a natural one that I expect there is someone else out there who can honestly claim they have been saying it for ten years.

      I agree. This does not constitute a bold prediction; the engineering required to get to it isn’t hard and the economics line up nicely. I would be astonished if I had been the first person to get there and find your claim that you did seven years ago completely credible.

  17. I’d like to see a movie projector that would fit on the side of a phone. Given the heat it would give off, you’d want it separable. Something that could project to a piece of typing paper, a white coffeeshop wall, the back of a car seat etc.

    The desktop projectors look bigger than they need to be.

    1. >I’d like to see a movie projector that would fit on the side of a phone. Given the heat it would give off, you’d want it separable.

      Yeah, with current tech that really does look like an exception, I admit. Small high-power LEDs might change that in the future, though.

  18. Don’t forget you won’t have to be plugging any of your smaller devices in any more. All will be charging via wireless. WiTricity hopefully will be well developed enough.

    1. >All will be charging via wireless.

      I would have included this in the scenario, but my impression is that systems like that are mainly laboratory toys hampered by low transfer efficiency and that the limits are fundamental…you can use them for smart cards but they fail at the power density smartphones would require.

      Can you cite any figures for transfer efficiency and how it varies with power density?

  19. Roger, the reason that much of “traditional” TV has not hit the internet already is because the existing TV stations and cable companies, ie existing distribution, get mad when the networks undercut them with internet distribution. This is one of the main reasons why the networks have not done and won’t do jack while crowd-sourced internet video takes off and swamps them. It is particularly silly of you to say this argument is anti-expertise, what great expertise do you imagine TV producers have? ;) Just as there’s nothing special about writing an operating system that can’t be done as well by a diverse group of programmers connected only by the internet, the same can be said of newsgathering, podcasts, or video shows. Perhaps you should try to inform yourself about these issues, rather than appearing uninformed and ignorant yourself by accusing everyone else of that.

  20. Speaking of people moving to mobile devices as computers, I’ve seen some reports recently about a significant and growing number of people getting iPhones or iPod touches and using these as their only computer.

    Noted, these aren’t power users. They’re doing email, the web, facebook, taking and posting photos, etc. but they have everything that they need in one easy to use and understand package.

  21. >No they don’t. I have yet to encounter a laptop with a VGA out that can drive a monitor with higher resolution than the laptop’s flatscreen. My X61 (1024×768) cannot drive my desktop monitor to capacity.

    Then your laptop encounters are with some rather limited hardware.

    My 2+ years old Thinkpad T60 can drive external displays up to 2048×1536 (I’ve tested it up to 1920×1080 on the VGA port), while the internal display is only 1280×800. The newer W series can easily do that.

    1. >My 2+ years old Thinkpad T60 can drive external displays up to 2048×1536 (I’ve tested it up to 1920×1080 on the VGA port), while the internal display is only 1280×800. The newer W series can easily do that.

      Interesting. Either I botched my testing or the X (ultra-light) series I’ve been using for a decade threw away this capability in order to lower power dissipation. Either is possible.

  22. > This is just more of the ludicrous anti-expertise nonsense that’s floating around, where the uninformed feel entitled to spread their ignorant opinions.

    This is [quite obviously] a very dangerous line of thought; if it prevailed, none of us would be conversing on blogs or even independent newspapers. Hell, we might not even be trusted enough by the powers that be to use personal computers. I know it is tempting to revert to this kind of thinking when confronted with widespread ignorance, but every time it has been tried, it has been for the worse for both the society and its individuals. You’d think we’d have gotten past this way back towards the freaking Protestant Reformation, but sadly, old memes die hard…

    Anyways, how long till we ditch the notion of an unattached device and have it plug into us?

    1. > This is just more of the ludicrous anti-expertise nonsense that’s floating around, where the uninformed feel entitled to spread their ignorant opinions.

      I’m an expert, and I will take the side of the “anti-experts” against snotty elitism like this. Every. Single. Time.

  23. >Anyways, how long till we ditch the notion of an unattached device and have it plug into us?

    Not until the things are reliably secure.

    Given the state of computing insecurity today, I really don’t want something plugged into my brain that’s as flawed as today’s software is.

  24. >Interesting. Either I botched my testing or the X (ultra-light) series I’ve been using for a decade threw away this capability in order to lower power dissipation. Either is possible.

    Well, given the major changes in the hardware over the last 10 years, I doubt it’s been a universal decision to do that.

    I’ve found the larger problem is x.org and the drivers for same. I’ve wasted countless hours making x.org do what I want it to.

    Take a look at the dual-head section at http://intellinuxgraphics.org/. It might shed some light on the issue.

  25. I like your vision, and expect to see it someday…but in a few years? No. Simple problem – juice. I just don’t see solutions to the power consumption hurdle arriving that fast.

    However, a compromise might be to leverage technologies like the son-of-Balsamiq….mobile interface, but with a crunching backend hosted on a meaty server….

    1. >I just don’t see solutions to the power consumption hurdle arriving that fast.

      That’s a respectable sort of objection, but what specific power hurdle are you thinking of? The Nexus One already cranks 1GHz out of an ARM driven by a smartphone battery. Your outboard display is going to be separately powered from wall current, so that’s not a problem.

  26. >Roger, the reason that much of “traditional” TV has not hit the internet already is because the existing TV stations and cable companies, ie existing distribution, get mad when the networks undercut them with internet distribution.

    This doesn’t appear to be in conflict with anything I’ve said, so I’ll just leave it.

    >This is one of the main reasons why the networks have not done and won’t do jack while crowd-sourced internet video takes off and swamps them.

    What crowd-sourced Internet video? youtube is used primarily for watching content produced through the traditional paradigm and clips of video games (which are not everyone’s cup of tea). There is an objective lack of quality here (insofar as anything cultural can be objective).

    >It is particularly silly of you to say this argument is anti-expertise, what great expertise do you imagine TV producers have? ;) Just as there’s nothing special about writing an operating system that can’t be done as well by a diverse group of programmers connected only by the internet, the same can be said of newsgathering, podcasts, or video shows.

    How is a diverse group of actors going to put together a show with a handful of characters in it? How are they going to appear simultaneously on the one set? I think small companies may supplant the major players by distributing over the Internet; this is not crowd-sourcing.

    >Perhaps you should try to inform yourself about these issues, rather than appearing uninformed and ignorant yourself by accusing everyone else of that.

    The last sentence of my post was poorly written/conceived, and for that you have my apologies. In any case, where are your credentials to be “debunking” the TV industry? You haven’t even thought out the basic logistical problems, such as the requirement for people to be physically present on the same set. This is an anti-expertise attitude – you clearly are not an expert on TV production, but you believe that being able to post on the Internet frees you to spew non-orthodox positions that cannot be defended.

  27. You don’t mention the cloud at all. An alternative possibility is that there will be a cheap cpu driving that big screen and another one in the cyber-cafe but your data and any serious processing lives on a server somewhere. To an end-user, it looks similar except there’s no hassle of plugging things in and no risk of leaving your data in a taxi. Do you not see this as a major way of doing things?

    1. >Do you not see [the cloud] as a major way of doing things?

      I like having physical control of the device my data lives on. Also, I dislike being quite as dependent on the uptime of other people’s servers as “the cloud” implies.

      I think my twitches are common enough to make the phone-centric scenario win over the cloud-centric one.

  28. The X61 by all information I can find supports external displays at high resolution just fine, the issue is properly configuring your displays in X (which handles this a lot less elegantly than Windows does). I don’t see an easy way to get switching between an external display when docked and the internal display only. Dual-head should work fairly well though.

    It’s been a long time since I’ve seen a laptop which was restricted from high-resolution VGA/DVI/HDMI output separate from the internal display (Apple persisted in this for quite a while, but there was an easy hack to enable it), even my old Thinkpad G40 supported this with its ancient Intel GPU. X has long been a stumbling block for this configuration though, which is why my laptop runs Windows and has a VMWare install for when I need Linux.
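
    (A rough illustration of the switch being described, assuming an xrandr-capable driver. The output names “LVDS1” and “VGA1” are typical for Intel hardware but vary, so check the output of xrandr -q first; consider this a sketch, not a recipe.)

    ```python
    # Sketch: toggle between the internal panel and an external monitor via
    # xrandr. Output names and the external mode are assumptions; list the
    # real ones with "xrandr -q" on your own machine.
    import subprocess

    def docked(mode="1600x1200"):
        # External head only, at its native mode; internal panel off.
        subprocess.run(["xrandr", "--output", "VGA1", "--mode", mode,
                        "--output", "LVDS1", "--off"], check=True)

    def undocked():
        # Back to the built-in panel at its preferred mode; external off.
        subprocess.run(["xrandr", "--output", "LVDS1", "--auto",
                        "--output", "VGA1", "--off"], check=True)
    ```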

  29. Hmm. I’m not sure I like the idea, if only because I like being able to open up my computer and physically hack it. Also, I think desktop PCs will survive, if only as a specialty item, because pretty much by definition you can fit more computing power into a tower than into a smartphone; there will always be new applications that will take all the cycles you give them (much better video games, absurdly hi-definition video playing, the old standbys like brute-forcing keys, etc.).

  30. Roger, the reason why the TV stations’ constraining the networks from putting their stuff online conflicts with your claim that they will someday simply slap their shows online is that the networks will always be stuck between their old distribution channels and the new online channel, until internet shows swamp them and it’s too late; not sure why you can’t understand that. As for the quality of online content, they’re still figuring out their revenue model so understandably it’s not that good yet. You provide no argument for why they will not figure out a revenue model nor for why online quality will not be better. In fact, I currently subscribe to 4 podcasts, despite always having hated radio as lowest-common-denominator piffle. This is because online distribution allows for entirely new niches and much higher quality levels, unreachable by broadcast.

    Yes, you will need actors in a TV show to be on a set, but not so for the writers, editors, and the rest of the post-production team. Further, all that stuff won’t be done by a set company or group but will be culled from the best that anybody in the audience can offer. Every script will be largely written by a collection of random people on the internet, rather than having a set writer or group of writers. Also, the actors won’t have to be in a particular city like LA, they can be anywhere as all they need is a good HD camera, which are pretty affordable nowadays. As for my credentials, I have none. However, simply by being an interested and curious observer, I know far more about the vagaries of content creation, particularly at an economic and technical level, than most people you could name. On the other hand, you simply assume that your frivolous objections have not been considered or represent anything worthwhile. As for expertise, it depends what you mean by that. I have no formal education or training in the subject, but I have read and listened to a fair amount of stuff on the topic and cogitated on it all, as an interested amateur. I think you’ll find that us interested amateurs will often put self-proclaimed “experts” to shame with our knowledge and analysis on most topics. I will agree with you that there is a lot of ignorant analysis on the internet, but you don’t make much of a case for your ability to tell the difference between such garbage and well-reasoned analysis when you lump my considered opinion in with them. ;) It would be nice if you could actually raise a worthwhile argument against my positions rather than flatly asserting repeatedly that they cannot be defended and that I don’t qualify as an “expert.”

  31. >That’s a respectable sort of objection, but what specific power hurdle are you thinking of? The Nexus One already cranks 1GHz out of an ARM driven by a smartphone battery

    Not all the time, it doesn’t. My N900 has an ARM chip capable of 800+ MHz, yet is incorporated into a SoC that runs it at 600 MHz….and can throttle it down to 200 MHz as required (driven by the Maemo OS, of course).

    The point being that these mobile devices have power sources currently in the 1500 mAh range, and you need to be crafty with how you conserve that juice. Try sucking down a bunch of YouTube content and see how long your Nexus lasts ;) About as long as my N900 I’ll bet….5 hours? Yet, when I’m hacking on my server via SSH I can tweak away while on the road pretty much all day.
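
    (Back-of-envelope, assuming a nominal 3.7 V cell, just to show what that 5-hour guess implies about average draw; the numbers are rough.)

    ```python
    # Rough arithmetic only; the cell voltage is a nominal assumption.
    capacity_wh = 1.5 * 3.7        # 1500 mAh at ~3.7 V is about 5.6 Wh
    hours_of_video = 5
    avg_draw_w = capacity_wh / hours_of_video
    print(round(avg_draw_w, 1))    # roughly 1.1 W average while streaming
    ```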

    >Your outboard display is going to be separately powered from wall current, so that’s not a problem.

    Quite so….in fact, the whole power issue is moot if we consider just hopping from location to location plugging our mobile devices into some form of cradle. Yet to me, this kinda kills the whole vision of ‘mobility’….I want to be able to be truly mobile *while* I am being productive. I understand this is a subjective question of ‘expectations’…..

    My vision? The onboard display is nothing more than a direct device control. All ‘real’ interaction will be via a holographic virtual display via son-of-bluetooth….some eyewear that provides a hi-rez HUD. I’m wearing shades, have a mobile device in my pocket, maybe a touch control built into my gloves……..I’m on a Gibson fantasy overload, baby! :)

    My time horizon for the above….20 years.

  32. >>>> I like having physical control of the device my data lives on. Also, I dislike being quite as dependent on the uptime of other peoples’ servers as “the cloud” implies.

    >>>> I think my twitches are common enough to make the phone-centric scenario win over the cloud-centric one.

    I don’t know, I think when people actually start using things like Google Docs, and see how easy and convenient it really is, they will flock to it. It loads quicker and is perhaps easier to use than Office.

    And I think the privacy concerns are overblown and a little paranoid.

  33. Dan, I don’t think your power argument holds because esr is not talking about mobile usage but whether the mobile can replace the desktops currently used at home, the office, and coffee shop. As he says, power is not a concern in those locations as you’ll just plug your mobile device in to the display hub there, that can just as easily supply power also.

    Darren, uptime and privacy issues are another reason why the home desktop or server will always be the central datastore, with the cloud merely providing ancillary features like redundancy and backup.

    1. >Dan, I don’t think your power argument holds because esr is not talking about mobile usage but whether the mobile can replace the desktops currently used at home, the office, and coffee shop. As he says, power is not a concern in those locations as you’ll just plug your mobile device in to the display hub there, that can just as easily supply power also.

      Yes, that’s the model I’m thinking about. I expect that use in the general-purpose-computer mode would be relatively uncommon when not attached to a peripheral-service node, simply because of the poor ergonomic implications of being limited to the soft keyboard and onboard display.

      And if you want the capability of a netbook, nothing stops you from carrying your own screen and keyboard. There’ll probably be a market for things that look physically like small laptops but are designed to have your phone plug into them.

  34. Agreed, Ajax.

    But isn’t it better to have your important documents on your hard drive and in the cloud?

    I bet the hard drive is far more likely to fail you than Google.

    Having redundant backups is nice.

  35. Ajay & ESR – I understand your point of view. I even agree that this may well be the practical immediate future of mobile computing.

    However, I view the future of mobile computing differently: you don’t just physically anchor yourself on a point-to-point basis – this is the compromise we live with *now*. Truly mobile computing needs to enable people to be productive anywhere they choose…..and the processing/power requirements of this future are a formidable challenge.

    Plus….nothing’s gonna happen until the porn industry figures out a way to make money on it ;)

  36. >I have yet to encounter a laptop with a VGA out that can drive a monitor with higher resolution than the laptop’s flatscreen.

    All Macbook models can go at least to the limit of single-channel DVI. That said, laptops are just too clunky compared to smartphones.

    I expect my next setup to be a nice, fast desktop, plus an iPad–all this for less than I paid for my current 15″ laptop four years ago. We’ll see once the device actually comes out, though.

  37. >Having redundant backups is nice.

    If the costs of hard drives fall with those of other things, then it should be easy enough for people who are paranoid enough to want it to set up their own small RAIDs (or just single disks, if the space requirements are not great) for backups; there is already software available to do this (Apple’s Time Machine, no doubt among others) with single disks. I, myself, would only trust a cloud provider with my backups if I encrypted them on my computer before sending–and even then, I’d prefer using my own disks (I know that I won’t go out of business on little notice, for example).

  38. Nah…..the iPad is a septic tampon

    After the phenomenal success of the iPhone/iPod Touch, I’m astonished by the rather pathetic effort expended on the iPad. It literally is an oversized iPhone. How dull! I don’t question the awesome ergonomics of its heritage….but seriously….is the iPad the best that Jonathan Ive could conjure? Is he finally all out of ideas?

  39. One would assume that for this sort of thing to happen (and it will; not sure if you’re right about the timeframe, but time will tell) and be reliable enough for those of us who make a living on the damn machines, some sort of cloud storage is required. I wouldn’t trust my data to be primarily housed on something I could drop/flush/step on/lose/etc.

    In the shorter timeframe, I see this sort of thing working with a docking station that provides additional CPU/GPU power or something. I really don’t think they’ll be able to cram enough power into mobile devices in the 5-year timeframe to displace modern desktop hardware. I seriously doubt that a cell phone in a half decade will have the power to compete with a 3GHz current Intel or AMD quad-core + 8GB of RAM + ATI 5800-series GPU. Some laptop manufacturers are building laptop docking stations with PCI Express x16 slots, so you can add your own GPU, dock your laptop, and have it switch from crappy onboard graphics to something that can actually drive 1-6 decent-sized displays with good 2D/3D acceleration. Even if you’re not a gamer, many cheaper onboard mobile graphics chipsets (I’m looking at you, Intel!) can’t drive anything over 1440×900. My AMD quad-core box drives 2×24″ monitors @ 1920×1200 — I’d have a difficult time stepping down to something less. The newer ATI Eyefinity cards are tempting — the ability to drive up to 6 monitors with one card is sexy.

  40. >…I dislike being quite as dependent on the uptime of other people’s servers as “the cloud” implies.

    >I think my twitches are common enough to make the phone-centric scenario win over the cloud-centric one.

    Regarding those two preferences, soon you can have the cloud with telecom levels of uptime/reliability. A “smart phone” is only one piece of a network. Here’s the rest:
    http://en.wikipedia.org/wiki/3GPP_Long_Term_Evolution

  41. >JonB, I tried to give some examples of the full home experience above, eg Photoshop or 3D video.

    Apologies for that. I obviously missed some things in your “wall of text” style of posting. My bad.

    What I was trying to get at with your definition of a “full” experience is this: are you pitching at the average use case, or are you assuming that people are only going to use it if it can do everything that an artist, gamer, or tech nerd will want it to do? If you’re going for the average use case then I’d argue that a netbook can do everything that the average user would want. So it needs to be as powerful as a netbook, but without the hard drive or CD (or keyboard or built-in 12″ monitor).

  42. I managed to drive a 1920×1200 monitor from the VGA port of my four-year-old Compaq laptop. It got kind of noisy because at resolutions (and pixel clock frequencies) that great the analog signal degrades significantly. These days I run dual screen, 1280×1024 and 1280×800 on that same laptop and it’s great.

    Multiple monitors are pretty much a must-have for developers these days.

  43. JonB, I think the fundamental distinction is whether the mobile device will ever be able to do all the things that a particular person might want from a desktop/server, and if so, whether consolidating on a tablet or two will make sense for that “average” person. I think that tradeoff will make sense for a minority, but not for the majority of people who are always going to have some use, whether it’s editing and encoding their home movies from their 1080p camcorder or using some snazzy new graphics-intensive app, for desktop computing that their mobile phones will be underpowered for, particularly since mobile is more expensive. I just upgraded my roommate’s PC with a new Asus motherboard, Pentium dual-core E6300, and 1 GB of PC2-6400 RAM for $108 from Fry’s (keeping the old case, psu, and hard disk), let me know when you can get anywhere close to that kind of computing value in a mobile device.

    Techies like us might want to consolidate, but I think most people are fine with having a desktop/server for a richer experience at home and a tablet on the go. I don’t even need to know precisely what they’ll need that computational power for, it suffices to point out that software has always found a way to use up that power and will simply find new ways to do so. I think you’re broadly right that many common tasks work fine on a netbook, but there’s always a handful of tasks in that long tail of computationally-intensive apps that the average person wants. I should note that I’m not necessarily arguing for a desktop PC, but that some sort of home server will always be around, essentially making the tablets thin clients while in the house. :)

  44. >Roger, the reason why the TV stations’ constraining the networks from putting their stuff online conflicts with your claim that they will someday simply slap their shows online is that the networks will always be stuck between their old distribution channels and the new online channel, until internet shows swamp them and it’s too late; not sure why you can’t understand that

    Let’s not move the goalposts; we were talking specifically about crowd-sourcing (I can quote your original post if you’d like). There is just no evidence that a quality TV show is going to be crowd-sourced in the near future. I never claimed independent Internet shows would not work, only that they would not be ‘crowd-sourced’.

  45. Ultimately, I wonder how this differs from docking station scenarios you can imagine with laptops or netbooks for that matter.

    The main reason for docking stations being fairly niche will remain. While your smartphone in 2014 will probably blow my current desktop to smithereens in computing power terms, so will my 2014 desktop blow that smartphone to smithereens too. So the desktop will remain interesting, although, granted, it may be less omnipresent than today because casual users will have enough oomph in their smartphone.

    In other words, I think you have a huge chance of seeing your prediction realised in the sense that the use-case will come into existence, but I doubt it will completely displace desktops. There will be more options open, which is a good thing ™.

  46. I don’t see the appeal of tablets as a form factor. It’s not a question of technology — there’s just a huge ergonomic dead zone between fits-in-your-pocket and has-a-real-keyboard.

    1. >I don’t see the appeal of tablets as a form factor. It’s not a question of technology — there’s just a huge ergonomic dead zone between fits-in-your-pocket and has-a-real-keyboard.

      Agreed. I don’t get the fascination either.

  47. You may not like trusting “the cloud”, but would instead prefer the ability of your portable device to access storage on your home network. In this scenario, your phone would rsync to that device, so that if your phone is lost/stolen/broken, you have everything on the home network. Of course, that’s no protection against a fire, tornado, hurricane, earthquake, or burglary taking out both your phone and the mirror at the same time. So you’d better have some encrypted storage out there somewhere to back up your backup.

    As to the question of keyloggers installed in the foreign keyboards… It is simple to use the phone itself to input all passwords, and never trust those external devices. Since trusting external keyboards at home might lead to you forgetting when away, it might be a good idea to be in the habit of doing this even at home.

  48. @esr: Hey, Eric, haven’t you heard of QWXGA flat panels? Both Dell and Samsung offer 2048×1152 displays, which gives you most of the screen resolution of that boat anchor you’re using now. You definitely need to make sure you’re running Emacs 23, though, because you’ll need that Xft support. ;) I doubt you’ll miss those extra 384 pixels, especially considering how much better LCDs render fonts when using sub-pixel anti-aliasing.

    And you’re definitely not the first person to predict this scenario. All the PC rags predicted something along the lines of your scenario about 10 years ago, and they thought it would be happening by now. The main things holding this scenario back are no longer insufficient portable processing power or a lack of good, reliable wireless availability, as they were 10 years ago, but mostly usability factors, along with the wireless companies (and Google and Apple), who haven’t given up trying to assert control over the mobile phone.

    Unless and until mobile phones are completely open and interchangeable among carriers, mobile phone dominance will not proceed.

    1. >I doubt you’ll miss those extra 384 pixels, especially considering how much better LCDs render fonts when using sub-pixel anti-aliasing.

      Grrr. Fewer vertical pixels is exactly what I won’t accept…it means fewer lines in my Emacs buffer, which is Not To Be Borne.

      >Unless and until mobile phones are completely open and interchangeable among carriers, mobile phone dominance will not proceed.

      Which is why the Nexus One is a really critical development here.

  49. @esr: The Dell flat panels (I have one) can be turned 90 degrees, and if things are set up right, your desktop will rotate 90 degrees right along with it! That means more lines while in Emacs… :)

    The Nexus One is the phone Linus just got, right? If I’m not mistaken, that’s a GSM phone and therefore won’t work on CDMA carriers like Verizon or Sprint. AT&T and T-Mobile are the only GSM carriers I can get in my area, and their coverage sucks. That’s one of the biggest reasons I like to stick with CDMA and EVDO.

  50. “I don’t see the appeal of tablets as a form factor. It’s not a question of technology — there’s just a huge ergonomic dead zone between fits-in-your-pocket and has-a-real-keyboard.”

    IMHO tablets serve a very important function – reading comfortably. Whenever you try to browse the web or read an e-book on a normal laptop while sitting on a sofa or lying in bed, it’s just awkward: the keyboard always gets in the way, and you cannot hold it the same way you would hold a book. Actually, a full tablet with touchscreen and all is not strictly necessary; what is needed for an optimal reading experience is actually much cheaper and simpler: 1) be able to fold the keyboard back, 2) have some sort of a pageup-pagedown key behind the screen. Like the OLPC: http://blog.syracuse.com/voices/2008/12/one-laptop-per-child-hoboken.jpg – why don’t the other manufacturers get a clue?

  51. Until we see battery technology that allows us to use a fraction of a modern desktop’s power in a small form factor, all the smartphones will be is glorified diskettes that allow us to do very basic things that don’t require a large screen. And, as desktops move more towards multi-core, the way we’ll program for desktops and for mobile devices will diverge even more.

    Not to mention that we have to break the current pricing models, which IMO are pretty broken.

  52. The OLPC makes a great ebook reader.

    The Nook makes a better one. As will, I suspect, the iPad.

    Tablets are probably also ideally suited to the only application Shenpen thinks is real: data entry, particularly in environments where mobility is required such as hospitals and delivery trucks. They are more convenient for casual browsing and similar use than a netbook, which requires one to sit down and poke at a keyboard or mess with a touchpad. Apple has been bragging about completely obsolescing the netbook with the iPad. They’re not terribly far off.

    I agree with Eric’s thesis: most people’s basic computing needs, such as web, email, word processing and spreadsheets, could be handled with a smartphone, and if there were a way to plug it into a constellation of peripherals that would mimic the functionality of a “real” computer, desktops and laptops as we know them would be significantly threatened.

    However, the stack will not be entirely open source. At a minimum, the baseband firmware must remain proprietary, for these reasons (there may be others): the carriers will not accept random devices on their networks, wish to lock phones to operate exclusively with their networks, and wish to re-sell you functionality that already exists on your phone with gratuitous surcharges for things like tethering; the FCC will certainly have something to say about radio transmitters able to do arbitrary things on the airwaves; and finally, homeland security regulations require the federal government to have a backdoor into your phone. Don’t kid yourselves, folks; you are being surveilled. Maybe in Europe, where privacy laws are stricter and mobile carriers less encouraged to nickel-and-dime their customers to death, fully-open-source phones will gain traction. But not here.

  53. Unless and until mobile phones are completely open and interchangeable among carriers, mobile phone dominance will not proceed.

    They are — in Europe. More affordable pay-as-you-go plans without onerous contracts, too.

    Seriously, the United States hasn’t got its shit in one sock when it comes to infrastructure like telecommunications or mass transit.

  54. @Jeff Read: That’s because in Europe the telecommunications infrastructure is basically government-owned. The U.S. saw the light of doing away with that concept quite a while ago.

  55. Minor quibble, addressed already: the storage device will be defined by an interface (probably USB 4?) – the specific storage mechanism is irrelevant: probably flash, maybe a larger disk for video, or some new breakthrough.

    More significantly, I think your machine(s) may actually be VMs (KVM?) capable of being instantiated on any number of different devices, including your phone, possibly a cloud, maybe a desktop for heavy-duty crunching. Hackers will have lots of VMs lying around, but only the low-level hardware guys will still work with operating systems on bare metal.

  56. Jeff,

    “Tablets are probably also ideally suited to the only application Shenpen thinks is real: data entry,”

    WOW. I actually feel touched that you happen to remember our discussion at least 3-4 years ago – seems like an eternity to me – about this subject. Thank you.

    But actually, I think, no. As for the mobile aspect of digitizing data wherever it first appears, I can buy wireless mobile terminals much cheaper than an iPad which have not only a touchscreen but also a barcode reader gun, plus a good enough little keyboard with function keys and all, plus are rugged and waterproof and whatever. I can imagine mostly non-industrial uses for the iPad, i.e. private uses plus perhaps business uses for the “suit” types of folks where style matters a lot, but real industrial uses really not.

    (BTW my old argument, if you remember, was not quite about that, but about the fact that modern software is user-friendly and all, yet for all the cases that the shiny interfaces and service-oriented architectures don’t happen to cover, or where they are just plainly expensive, the only fall-back is one desktop user beating 20 pages of printed records into a computer at machine-gun speed, and that requires a different kind of usability – usability without the mouse and without Ctrl-Alt-Meta-K type shortcuts, just arrow keys, function keys and Enter, and that’s what I find sadly missing in modern times.)

  57. >>It is such a natural one that I expect there is someone else out there who can honestly claim they have been saying it for ten years.

    > I agree. This does not constitute a bold prediction; the engineering required to get to it isn’t hard and the economics line up nicely. I would be astonished if I had been the first person to get there and find your claim that you did seven years ago completely credible.

    I think the winner is Arthur C. Clarke, with the minisec (mini-secretary) in the novel Imperial Earth:
    http://en.wikipedia.org/wiki/Imperial_Earth

    Only he used an IR link to the keyboard and screen.

  58. 2014’s too soon. That’s not enough time for:

    A: The cutting-edge first generation product to come out, which as far as I know remains only in our heads (i.e., it’s not even something we’ll see in six months)
    B: The mass-market multiple followons that normalize the idea
    C: The standardization process of the interconnect so that you might actually be able to do that (probably something based on USB3 but I’m not sure that’s quite off-the-shelf yet)
    D: Significant enough penetration that your local cafe would be willing to invest in the hardware.

    Double extra bonus time if between step B and C someone tries to lock us in the trunk with a proprietary interface, requiring another two or three years to establish that, no, really, USB3(/USB4) is the way to go.

    Otherwise no particular objection, but there’s no way we can get all that done in four years. I’d have to guess 2014 is about the middle of when we might see these devices begin.

    1. >Otherwise no particular objection, but there’s no way we can get all that done in four years. I’d have to guess 2014 is about the middle of when we might see these devices begin.

      Your objections are on point. Much depends on how aggressive Google is, I think. OTOH, I’m likely to be an early adopter.

  59. Someone mentioned TEMPEST.

    I heard that with new flat screen monitors it doesn’t work as well as it did with the old tube monitors, because the flat screens don’t give off as much energy.

    Is this true? Anyone know about this?

  60. >A: The cutting-edge first generation product to come out, which as far as I know remains only in our heads (i.e., it’s not even something we’ll see in six months)

    Just wait until 2012’s WWDC.

    Is there anyone who doesn’t expect Apple to come up with this first, and within the next couple of years? I mean an iPhone is a Mac, with radio hardware attached and running a shinied-up version of At Ease. All they need to do is bring out a flat-panel display shell with USB ports for keyboard and mouse, and you have your iPhone computer.

    I seem to recall an Apple patent a few months back for just this sort of thing…

  61. I think tablets are the killer form factor for mobile devices, because they let you use the browser and other apps much more easily and at much better video resolutions. They are admittedly harder to carry around than a smartphone, but I think function will win over form in that regard; people will find a way to carry them. Perhaps fanny packs will make a comeback? ;)

    Roger, You are the one moving the goalposts by claiming that TV shows can simply be slapped online and then distancing yourself from that claim now. As for crowd-sourcing (not sure why you offer to quote me when I have not retreated from that position at all, in fact, I gave more examples which you proceed to ignore as usual), all the argument entails is that the kind of widely collaborative process that we currently see with open source or blog threads will spread to other creative endeavours, once the software to enable it for the less technically savvy is made widely available. In fact, crowdsourcing already happens widely- I’ve read that many of the stories from The Office in the US come from friends of the writers and their experiences- it’s just not made explicit as there isn’t a way to recognize and pay that crowd, the internet simply makes that possible.

    Not sure what evidence of quality you need when I already pointed out that the 4 podcasts I listen to have much better quality content than the crap on radio, because they can focus on a particular niche and kill it, rather than having to serve a lowest common denominator mass audience like current broadcast media. It’s getting funny how you continue to simply make flat assertions with no reasoning or evidence behind them, ignoring all the reasoning and evidence I’ve given. :) What precisely is your problem with our crowdsourced future? Perhaps that would be a more interesting and relevant topic because you clearly don’t have anything to contribute to a discussion of whether or not it will be crowdsourced.

  62. @Mike Earl (Feb 26 @ 1:45 PM):

    I was also going to mention LG Expo projector phone (the one advertised during the US Super Bowl with the trailer from the movie Avatar).

    @esr:

    Haven’t you discussed this topic before (at least indirectly)? I seem to remember a thread where you talked about your micro-sized portable computer *wirelessly* attaching to a nearby keyboard/mouse and display (using UWB techniques for the video as I recall). I tried to search the blog, but couldn’t find anything relevant.

    ESR says: Sorry, wasn’t me. I haven’t gotten to play with UWB hands-on yet.

  63. For me, the appeal of the iPad (yeah yeah, worst possible name) is this:

    My academic work involves books, journals, and paper legal pads. The iPad sounds ideal for going to a café and reading with. In contrast, my laptop takes up too much space in my bag, is too heavy, takes up too much desk space, and is just clunky. Paging through a book with a trackpad is clunky. The iPad is tiny, great for reading, and, compared to a good laptop, cheap. Compared to a Kindle DX (only slightly cheaper), it’s also more useful for recreation, and the platform is more likely to be around a few years hence.

  64. OK…jokes about the name aside ;)

    I do like the idea of the iPad – although it’s far from an original concept; rather, it’s the latest contender for the holy grail of tablet computing. The Kindle is nice, but limited and greyscale… though it does have eInk, which is cool (it sips the juice). I don’t think the iPad is “it”, but it’s certainly a bold attempt. Maybe the momentum and hype surrounding Apple will shift a bunch of units, but I don’t expect to see a triumph.

  65. Big thing about the iPad: (tampon jokes aside)

    It is the only computer designed to be held and used in one hand. This is a big deal.

    Perfect for doctors or anyone who’s moving around and needs to enter data.

  66. I expect the iPad to slowly evolve, become more capable, and replace Macintoshes. The simplifications embodied in the iPhone OS are akin to the CLI-to-GUI transition. In the future, programmers will still deal with processes and hierarchical filesystems, but most people won’t.

    The idea that the iPad is “only” a giant iPhone misses the point so completely that I just don’t know what to say.

  67. @Darrencardinal: actually the iPad is just about useless as a computer without the attached keyboard, since it has NO tablet functionality (the touch-sensitive display is a glorified touchpad, not an actual tablet; no stylus support). If you’re looking for a one-handed computer, look at the Fujitsu U-series minitablets. They’re similar in weight, smaller in size, and are convertible tablets. They actually work very nicely if you only need netbook-level power with a more flexible form factor than a netbook. The only downside is they cost about twice as much as an iPad.

  68. @Adam: expect that iPad functionality in the next or second-to-next revs. Like the iPhone. Apple’s slogan could be ‘screwing with first adopters since…’

    This is only half tongue-in-cheek.

  69. Apple’s slogan could be ’screwing with first adopters since…’

    This is only half tongue-in-cheek.

    It’s funny coz it’s true :)

    Hey, without early adopters, you all don’t ever get any subsequent iterations….

  70. If the capability-based security/secure cooperation paradigm ever becomes widely deployed, the small portable device with its own independent user interface that you carry could easily be the root of all your authority. It has its own input system to authenticate you to itself, and it handles all secure access-control interactions with devices elsewhere, eliminating keyloggers as a primary security threat to access control (though they remain a surveillance threat to data you interact with). Thanks to secure cooperation, it won’t have to be powerful enough to do all your computations, nor have enough storage to hold all your data. It may as well be your phone too.

    This differs from esr’s vision in that the phone doesn’t have to grow powerful enough to be your computing environment, just powerful enough to manage secure interaction with a computing and storage cloud, though surely local device capability will continue to grow.
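    As a crude illustrative sketch of that idea (the names and token format below are my own assumptions, not any real system; genuine capability systems use unforgeable references rather than shared-secret tokens, but the flavor is similar): the phone holds a key it shares with a storage host at pairing time, then mints narrowly scoped, short-lived tokens that the host can verify, so a borrowed terminal never sees a password.

    import hmac, hashlib, json, time

    # Hypothetical sketch: the phone is the root of authority. It mints
    # signed, narrowly scoped capability tokens; the storage host verifies
    # them with a key provisioned at pairing time. Names are illustrative.
    PAIRING_KEY = b"provisioned-out-of-band"

    def mint_capability(resource: str, rights: str, ttl_s: int = 300) -> dict:
        """Phone side: grant `rights` on `resource` until expiry."""
        claims = {"resource": resource, "rights": rights,
                  "expires": int(time.time()) + ttl_s}
        payload = json.dumps(claims, sort_keys=True).encode()
        sig = hmac.new(PAIRING_KEY, payload, hashlib.sha256).hexdigest()
        return {"claims": claims, "sig": sig}

    def verify_capability(token: dict, resource: str, right: str) -> bool:
        """Host side: accept only authentic, unexpired, sufficiently scoped tokens."""
        payload = json.dumps(token["claims"], sort_keys=True).encode()
        expected = hmac.new(PAIRING_KEY, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, token["sig"]):
            return False
        c = token["claims"]
        return (c["resource"] == resource and right in c["rights"]
                and c["expires"] > time.time())

    # e.g. hand a borrowed internet-cafe terminal a read-only, five-minute token:
    tok = mint_capability("docs/projects.tar", "r")
    assert verify_capability(tok, "docs/projects.tar", "r")
    assert not verify_capability(tok, "docs/projects.tar", "w")

    The point of the sketch is only that access control never requires typing a secret on an untrusted keyboard, which is what takes keyloggers out of the access-control loop.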

  71. About laptop video out: as far as I can recall, every laptop I’ve used that had a VGA output, going back to the late 90s, supported higher resolution than the internal display, though Windows didn’t handle that well until about Windows XP. My laptop’s docking station is plugged into a 20″ 1600×1200 screen. Don’t forget X. Dangerous virus! Dissatisfaction guaranteed.

    About laptops being too large and heavy compared to an iPad: That’s mostly due to your choice of a laptop. I “lug” around a 2.5 lb convertible laptop/tablet that is smaller than an iPad (though it is quite a bit thicker) and more powerful than a netbook. The keyboard is adequate though quite a bit poorer than a Thinkpad keyboard.

    About the iPad: keep in mind that Apple acquired Fingerworks, which back in 2005 had a multitouch capacitive surface, used as a combination keyboard, trackpad, and gesture-recognition device, that fit into the space of a PowerBook keyboard. The onscreen keyboard may be more usable than your first impression suggests.

  72. @Adriano: I doubt it. We’re still waiting for keyboard functionality in the iPhone … *taps fingers in succession* Basically it boils down to Steve Jobs being full of himself. In his not-so-humble opinion, buttons and keys are just too complicated for most people.

  73. @morgan: I wouldn’t really know, but I was thinking more of a stylus. Is Steve against those, too? Although, obviously, ‘sullying’ his Delicious Product with a hole for a stylus would be stretching it.

  74. You folks all probably have me outgunned on the organic computing front, but I think that gives me a useful perspective. I have painfully learned since I left my small pond that there is *always* somebody smarter than I am. That said…

    One *big* caveat to esr’s scenario: security. You’re paranoid, but *are you paranoid enough?*

    Sturgeon’s law applies to software, even open source, even security.

    Feminine hygiene jokes aside, the iPad’s biggest limitation is that you can’t make voice calls from it. If it did, you could pair it with a headset and have 1/2 of esr’s scenario *now*. If it supported external storage (and arguably, iTunes for the desktop could be that), it could be turned into 3/4. I’ve worked most of my career with less CPU, memory, and storage than an iPad has *today*.

    There are styli for the iPhone. They’ll work with the iPad.

    http://www.amazon.com/Ten-Design-Stylus-iPhone-Black/dp/B00174N3OI/ref=sr_1_2?ie=UTF8&s=electronics&qid=1267291912&sr=8-2

    OQO tried to build esr’s scenario. The price/performance ratio was way too high. Their product cycle was too slow, so they couldn’t fix it.

    1. >OQO tried to build esr’s scenario. The price/performance ratio was way too high. Their product cycle was too slow, so they couldn’t fix it.

      I think the Nexus One may actually make the nut. Add a Redfly variant that supported it and you’d really have something.

      And I just learned that Redfly has a beta Android driver they’ve been touting at trade shows.

  75. I am rarely as disappointed as I was with OQO. I was all geared-up ready to plonk down some serious cash for their 2nd gen device, and poof! they’re out of business.

  76. Redfly….hmmm….at first I thought “WTF?” – why not just pack a netbook FFS? – but then the possibilities started sinking in.

    Nice idea. Its battery life is a bit sucky, considering it’s a ‘dumb terminal’ of sorts. I’ll keep an eye on it for Maemo support.

  77. @esr: I don’t think the Nexus One can make the nut.

    Three failures of the Nexus One: 1) no accelerated 3D, 2) a price/performance ratio that is very poor compared to laptops, 3) no EVDO/CDMA. The OQO might have done it if they weren’t tied into the Microsoft tax — take the OQO, throw Android or Ubuntu Netbook Remix on it, and then you’ve got something.

  78. esr: How might this product interact with or alter your projection?

    [In case my html fails that’s http://www.vuzix.com]

    Clicking on the Consumer Video Eyewear link shows three variations on the basic concept of video monitor goggles/glasses (“under three ounces” may or may not meet your definition of comfortable eyewear) with included audio delivery, and a fourth (presumably improved) model coming later this year. My impression is that much of the objection I’ve read here (as it relates to monitor capability in your projected/predicted computer development) is obviated by this product, but I hesitate to baldly assert so.

    I must say the price point is attractive enough compared to HD-capable TV’s that I’m planning on buying one (probably the 310 which Amazon lists at $179+ today) next weekend.

  79. ESR, on vertical monitor space: When I visited the Google office in Kirkland, everybody had three 24″ displays set in portrait mode. If you don’t care about accurate color, you could have 3240×1920 resolution for under $400 (plus the cost of a new video card, if needed.) One downside is the lack of subpixel font rendering. Seemed like a great setup, though.

  80. @ David McCabe

    If you were responding to me, these are the specifications the manufacturer lists for the Wrap 310 model I mentioned above:

    Equivalent to a 55-inch screen viewed from ten feet
    Twin high-resolution 428 x 240 LCD widescreen displays
    24-bit true color (16 million colors)
    60Hz progressive scan update rate
    Ultra-low video distortion
    26 degree diagonal field of view
    2-3/8″ intraocular distance (IOD)
    Independent +2 to -5 diopter focus adjustment
    Weighs less than three ounces

    Stipulated that the manufacturer may be lying or only have a questionable quality acceptance standard, but, given the above parameters, what is your definition of “resolution is extremely low”?

  81. I suspect that the desktop computer I’m saving up to buy will be the last one I buy – given that I tend to run them for 5+ years at a time.

    My needs for a desktop machine are a bit different from Eric’s, but they’re in roughly the same hardware spec. What I do to make CPUs churn is run vector drawing programs and statistics packages.

    As to the kind of computer Eric is describing – combine a tablet form factor with the ‘netbook cradle’ and you’re 95% of the way there, and those are on the market now…

  82. For me, the appeal of the iPad (yeah yeah, worst possible name)

    Actually there is a kind of precedent for it. See http://www.ubiq.com/weiser/testbeddevices.htm .

    The original idea was that in time we would consider computers in much the same way we consider pads of paper (hence the name). Rather than being personalised, they’d be used and shared as needed. The personalisation would come via a “tab” which would contain enough of your identity to allow for the retrieval of settings and storing of files.

    Out of interest, note the PARC prototype tab device ( http://www.ubiq.com/parctab/ ). Of course, at that stage it wasn’t a phone and it had little computing power, but the ideas were around.

    Stipulated that the manufacturer may be lying or only have a questionable quality acceptance standard, but, given the above parameters, what is your definition of “resolution is extremely low”?

    856 x 240 or 428 x 480 (2 times 428×240) are both pretty low resolutions. I’d consider 1024 x 768 a low resolution these days.

  83. The original idea was that in time we would consider computers in much the same way we consider pads of paper

    Just to add, in this vision one size computer didn’t fit all but the distinctions were drawn around interactions rather than power. There might be a big TV style computer to do your video editing on (most of their research was in the office space so it got used for video conferencing).

    All were potentially “shared” spaces that used the tab to identify who was using it.

  84. Equivalent to a 55-inch screen viewed from ten feet
    Twin high-resolution 428 x 240 LCD widescreen displays

    That is less than standard-definition TV. Remember how blurry an old 55″ TV was? This would be worse.
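    A quick back-of-the-envelope check, using only the figures quoted above (428×240 per eye, 26-degree diagonal field of view) plus the usual rule of thumb that 20/20 vision resolves about one arcminute, i.e. roughly 60 pixels per degree (that benchmark is my own addition, not something from the spec sheet):

    import math

    # Angular resolution of the quoted eyewear, from its own spec sheet.
    h_px, v_px = 428, 240                  # pixels per eye
    diag_px = math.hypot(h_px, v_px)       # ~491 pixels across the diagonal
    diag_deg = 26.0                        # quoted diagonal field of view
    px_per_degree = diag_px / diag_deg     # ~19 pixels per degree

    # ~60 px/degree is the usual yardstick for 20/20 acuity (1 arcminute).
    print(f"{px_per_degree:.1f} px/degree, vs ~60 px/degree for 20/20 vision")

    In other words, at the quoted field of view the panels deliver roughly a third of the angular detail a normal eye can distinguish, which is why it reads as blurry even though it is worn so close to the eye.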

  85. @ JonB and David McCabe:

    856 x 240 or 428 x 480 (2 times 428×240) are both pretty low resolutions. I’d consider 1024 x 768 a low resolution these days.

    If I’m reading the specs right that’s one 428×240 “LCD widescreen display” for each eye, mounted less than 2″ from the surface of the eye itself. Really, fellas, at that viewing range how much resolution is required to amply appreciate Salma Hayek’s* decolletage anyway? :)

    Not being either an early (or even second-wave, really) adopter of technology or a fervent videophile, I’m currently subsisting on a non-HD-capable 29″ set that provides acceptable image resolution from the same general range (~10′). To upgrade to an HD-capable set of near-equivalent size (32″) would cost me anywhere from ~$290 to upwards of $400, judging by Amazon’s first page of results for “HDTV” – and none of them offer 3D viewing capability either. The Wrap 920 (the more capable version of the glasses system) offers twin 640×480 resolution and I can get it w/ the VGA controller from Amazon for ~$320 – and it will function off my cable feed (or HD broadcast antenna if I want to buy one), my desktop, laptop (OK, netbook w/ external DVD-RW drive) or mp3 player. For the additional potential accessibility, I’m willing to accept a modest limitation in screen resolution. I’m just watching movies, guys, or reading my email or the blogs. Power user (or even particularly competent, come to that) I am not. Eric’s proposed cell-phone-as-portable-computer (however original or not) seemed to lack even this degree of equally portable viewing option. Given that the basic idea he proposed is still at least a few years off, how do any of you see the potential for this type of tech for that application in the equivalent time frame?

    Besides, when it comes to screen resolution, there’s always the possibility Salma’s shirt may come adrift; some fantasies actually benefit from a certain measure of vagueness, I think. On video, anyway.

    * Feel free to substitute your preferred provider of pre-frontal breastworks display as strikes your fancy of the moment.

  86. Will Brown: I have the MyVu model with 320×240 resolution – it’s mildly craptastic. Also, unless you’re doing some interesting mind-hacking, of bloody course the resolutions don’t sum. If it broadcasts the same image to both displays, the perceived result is an XxY image. If it broadcasts a slightly offset image to both displays, your mind does the usual anaglyph stuff and turns it into an XxY 3D image. Further, there’s a LARGE nosepiece getting between your left eye and the image displayed to the right eye.

    I say all this as someone who is a big fan of mobile computing, and my analysis? Save your pennies for later technologies. Picoprojectors are going to be head-mountable in 18-24 months, and should easily outpace LCD-based solutions in both price and resolution, even accounting for the additional development time that the LCDs have. End advice? Do your research and don’t just jump for whatever’s around now, unless you actually do need a solution NOW.

  87. > I’d consider 1024 x 768 a low resolution these days.

    Us not-so-extremely rich people have to deal with netbooks that only go up to 1024 x 600. Websites haven’t been that great at pastproofing. :(

  88. I’ve come to realize that smaller is better. Really, a whole lot better.

    I have a Toshiba Satellite laptop, and a few months back I built myself a new desktop system.

    And I think I may have made a mistake.

    That ATX style case seems like a battleship sitting on my computer table, compared to the sleek profile of my laptop. And it doesn’t really do anything the laptop doesn’t do except have a nicer bigger monitor.

    Next I might have to try one of those neat little netbooks.

    The Toshiba machine is a delight btw, I would recommend it to anybody.

  89. ESR,

    I would go the middle ground with your ‘setup’. The netbook will be the all-purpose device, as the keyboard is still big enough to be useful and the display, though small, would still be functional enough to support a 3-4-person multiheaded teleconference. But it’s still small enough that carrying it around is not a discomfort.

    The smartphone, on the other hand, will be more powerful, but lighter still. It seems that the rage in cells is ‘thin is in’. It’s cultural and I doubt one can overcome it at this point. So unless you can pack all your functionality into something 5mm thick, I don’t see it.

  90. Computers will be for creators.

    You know, this desire to divide everyone up into pre-destined roles is borderline sociopathic.

  91. Ok, last time on this topic, I promise.

    If my previous offering lacks image quality horsepower, how about this instead?

    Admittedly, it gives up something on coolness/looks points, but the specs are:

    Panoramic field of view (83° to 123°, depending on model) provides situational awareness, active peripheral vision and enhanced realism
    High resolution: accepts video inputs up to 1920×1200 pixels per eye
    A lightweight (350g), low-profile design that is quick and easy to wear, adjust and take off.
    Several models to choose from, offering the perfect fit for many performance and budget requirements

    Close enough to primetime yet?

  92. The computer industry is shaped by two types of products.

    Games and porn. Nobody wants to play World of Warcraft on a handheld and porn looks a hell of a lot better on a 23″ screen.

    The PC will survive the latest wave of “The PC will be killed by X”.

  93. The Nexus One is the phone Linus just got, right? If I’m not mistaken, that’s a GSM phone and therefore won’t work on CDMA carriers like Verizon or Sprint. AT&T and T-Mobile are the only GSM carriers I can get in my area, and their coverage sucks. That’s one of the biggest reasons I like to stick with CDMA and EVDO.

    That changes with LTE, which is essentially the next generation GSM data standard. Sprint will be the only carrier that doesn’t support it. The first Verizon LTE cells go online a few months from now.

  94. You wouldn’t go back to a VAX? But the instruction set on a VAX – it was marvelous! The only machine with a better ISA was a PDP-10! Don’t tell me you wouldn’t go back to a PDP-10 in a heartbeat?!?!?

    — David

    1. >You wouldn’t go back to a VAX? But the instruction set on a VAX – it was marvelous! The only machine with a better ISA was a PDP-10! Don’t tell me you wouldn’t go back to a PDP-10 in a heartbeat?!?!?

      Back in the day I cared about pretty instruction sets, because back in the day I had to write assembler. Now, I don’t care. Actually, nowadays I prefer instruction sets to be optimized for compiler code generation, not for hand-hacking. Those who profess nostalgia for the machines of olden times have forgotten how underpowered they were.

  95. I was just looking through a leaked copy of the System 7 (Mac OS circa 1990) source code, and noticed that it was smaller than my current for-pay project, MediaWiki. This confirmed my belief that MediaWiki is not only a big ball of mud, but hideously oversized.

    System 7: 906 assembly files, 188 C files, 177 Pascal files. Mostly impenetrable. Most modules include some hand-hacked assembly, known as ‘glue’, to load them in correctly.

  96. @Rich, you’d be surprised, then, at the sales volumes for the Nintendo DS relative to any other console. Wikipedia has it at 125.13 million consoles sold. In comparison, sales reported (Wikipedia, too) for the Xbox 360 are on the order of 39 million consoles.

  97. I remember the periodic arguments between the ‘all-in-one’ folks and the Geek Utility Belt folks… I’m thinking the AIO folks can declare victory soon?

    (I also recall a Bruce Sterling short story (or perhaps he edited it in an anthology?) where some biker gets pulled over by nasty cops and a widget of his is confiscated and destroyed, but it turns out to be a BBS server, and its influential membership proceed to ruin the cops’ lives. One could rather easily run a popular WordPress blog on a Snapdragon smartphone and 3G/4G networking.)

    (and how long until we get Rudy Rucker’s uvvy?)

    BTW, don’t forget, your smartphone in 5 years will come standard with a video projector and probably a laser projected keyboard..

    1. >I remember the periodic arguments between the ‘all-in-one’ folks and the Geek Utility Belt folks.. I’m thinking the AIO folks can declare victory soon?

      Yeah, I expect so. I had this conversation with Don Norman (the Design of Everyday Things guy) about ten years ago. I think the Geek Utility Belt theorists underestimated the design impact of really powerful lightweight programmable devices.

  98. @Morgan: it wasn’t like Xerox was doing anything with them… You snooze, you lose baby!!

    ps: IIRC didn’t Apple somehow actually pay Xerox for the privilege of visiting PARC and questioning their people?

  99. Speaking about the VAX system:

    In 1984 we had one in high school that I played around with. It was fun. It was underpowered, but it had a certain undeniable old-school charm and mysterious allure. You could play those old games like Space War.

    When everyone got on it, it got slow as molasses, though. It had three of those old-style teletype printers.

    The next year they got rid of it and replaced it with a bunch of Apple IIcs.

  100. I can see this happening in large population areas. Not in rural areas in the flyover States.

    I’m only 30 miles from Kansas City, yet a SmartPhone cannot work here because we can’t even get a regular cell phone to work here. We depend upon very expensive Satellite Internet service that is way slower than broadband.

    The cost for providers to add service to rural areas is so high that I foresee a huge division in the future between the broadband haves and the broadband have-nots. I run my small business on the Internet. As it is now, I run up against my allotted bandwidth simply by uploading some photos to artfire.com and to YouTube. And we pay about $80 a month for our very limited access.

    So, I wish all of you in the large population areas of the country well, but believe me, your forecast will by no means be universal in the USA, because this is a very large country (which is why many of you think we are behind Europe in cell phone usage) with relatively few people per square mile in the center of the country.

    Those of us in rural areas will struggle to keep up with technology advances simply because networking is so much different out here.

  101. I’m only 30 miles from Kansas City, yet a SmartPhone cannot work here because we can’t even get a regular cell phone to work here. We depend upon very expensive Satellite Internet service that is way slower than broadband.

    One thing you have to be aware of is how uniquely American your situation is. The U.S. telecommunications infrastructure is a laughingstock, easily among the worst in the industrialized world. And before anyone trots out the “at least ours isn’t run by the gubmint” argument, consider Finland, which keeps pace with the rest of Western Europe in terms of broadband deployment and availability and has never had a government-run or -sanctioned telco monopoly.

  102. One thing you have to be aware of is how uniquely American your situation is. The U.S. telecommunications infrastructure is a laughingstock, easily among the worst in the industrialized world.

    Do you have something to back that up?
    I’m not calling you a liar, but I’m always interested in someone else’s “the grass is greener on the other side of the pond”.

    There are many people in Australia who firmly hold the belief that Australian telecoms is woefully backward. There are some real reasons for that but there’s a lot of myth there as well.

  103. You wouldn’t go back to a VAX? But the instruction set on a VAX – it was marvelous! The only machine with a better ISA was a PDP-10! Don’t tell me you wouldn’t go back to a PDP-10 in a heartbeat?!?!?

    68k ASM 4 LIFE!

  104. > Back in the day I cared about pretty instruction sets, because back in the day I had to write assembler.

    Spoken like someone who hasn’t written much assembler, actually.

    > Now, I don’t care. Actually, nowadays I prefer instruction sets to be optimized
    > for compiler code generation, not for hand-hacking.

    VLIW for Eric!

    (dude, you must luuuv Intel’s Itanium)

    > Those who profess nostalgia for the machines of olden times have forgotten how underpowered they were.

    For their time? No, Eric, no they weren’t.

  105. Now, I don’t care. Actually, nowadays I prefer instruction sets to be optimized for compiler code generation, not for hand-hacking.

    You sure about that, Eric? The x86 instruction set is neither particularly compiler friendly nor particularly user friendly.

    1. >You sure about that, Eric? The x86 instruction set is neither particularly compiler friendly nor particularly user friendly.

      And as such, sucks pretty hard in both directions. But this kind of problem is why we have compiler jocks.

  106. I think you’re confusing predictions with utopian dreaming. Sure, it would be nice if public access stations were wired-only because it would be better for users’ security, but who the hell thinks that’s going to be a major motivator? Didn’t we all just have a depressing moment of realization with the release of the iPad, that general-purpose computing has, in the general case, been an abject failure when distributed to people?

    You’re not the standard case. Anything you predict for the future state of technology and its interfaces should take that into account.

    I do wonder, though, when the ubiquity of these things will finally lead to interesting signage.

  107. And as such, sucks pretty hard in both directions. But this kind of problem is why we have compiler jocks.

    Imagine an alternate universe in which the 68k — which is exactly the opposite: compiler friendly and user friendly — had won. Then the man-hours and brainpower that have been soaked up fighting the bletcherous hardware could have been put to work on much more interesting problems. That kind of “yeah, the hardware sucks but we’ll make up for it with software” approach gets you Windows. Designing the software to support world-class hardware gets you an Amiga.

    Mind you, I’m not completely despairing. Thanks to smartphones and the very trends you’re describing, the ARM CPU (which, mirabile dictu, has 68k-style base+multiplier*offset addressing modes) looks like it’s going to be the long term winner.

    1. >Imagine an alternate universe in which the 68k — which is exactly the opposite: compiler friendly and user friendly — had won.

      I didn’t just imagine this. I was one of the people pulling for it to happen, pre-PC. Sometimes you lose and just have to cope.

      Fortunately the lossage turned out to be recoverable after the 386 came out.

  108. Jeff Read Says:
    > Imagine an alternate universe in which the 68k — which is exactly
    > the opposite: compiler friendly and user friendly — had won. Then the
    > man-hours and brainpower that have been soaked up fighting the
    > bletcherous hardware could have been put to work on much more
    > interesting problems.

    It is always nice to imagine a world in which legacy support and backward compatibility are not important. The fact is that huge amounts of effort are expended in that regard in every area of software, not only processor support. In your scenario, instead of man-hours being soaked up dealing with the difficulties of the x86, the man-hours would instead have been soaked up dealing with the backward-compatibility problems of the 68k.

    What Apple did going from PPC to x86 was nothing short of a miracle in my opinion, and could only have happened in the hyper controlled and managed environment of the Apple ecosystem, and could only have been sustained by a set of geeky acolytes with the ability to bury every argument they had ever made as a result of the reality distortion field. Bill Gates could never have pulled this off in 1995.

    Microsoft never had that level of control at the operating system level, though at the application level they did as is evidenced by the transition from Office 2003 to 2007 in which they completely changed the user interface, throwing away billions of man hours of training and experience in their users. People seemed to lap it up. Personally, I hate it and refuse to use Office 2007, but it serves to illustrate the level of control MS has over the office product line.

    Evolution is a fact of life. Our eye is suboptimal, and, I suppose, an intelligent designer would have done a much better job. However, since there was no convenient intelligent designer, layer built upon layer, revision upon revision, is the reason you can read this today. Evolution works in practice. (Does that mean that nature invented agile programming?)

  109. I think the Geek Utility Belt theorists underestimated the design impact of really powerful lightweight programmable devices.

    I think you underestimate the design impact of really powerful programmable devices where nonprogrammers are first-class citizens. To the nonprogrammer a computer is an appliance. It’s a thing they buy that lets them do X, Y, and Z. They are not creative enough to imagine exotic uses for it, so accordingly a locked-down, vendor-controlled appliance that does just those things (with potential future upgrades) is fine with them. It may not scale. It doesn’t have to: look at the enormous success game consoles have had since their manufacturers employed onerous licensure fees and approval processes for anyone wishing to develop for their platforms. They certainly haven’t wanted for third-party publishers or titles developed for them.

    Grendelkhan is right: the iPad changes everything we think we know about computing devices. A general-purpose data appliance with vendor controls on which software gets approved for the platform will be just the ticket for someone who does email, browsing, Facebook, reading, and video and who is concerned about malware.

    1. >accordingly a locked-down, vendor-controlled appliance that does just those things (with potential future upgrades) is fine with them.

      The death of the PDA refutes you. And it’s too early to declare the iPad a success, let alone a game-changing one. Game consoles aren’t on point, as they aren’t designed to be carry-me-everywhere devices.

  110. What Apple did going from PPC to x86 was nothing short of a miracle in my opinion, and could only have happened in the hyper controlled and managed environment of the Apple ecosystem, and could only have been sustained by a set of geeky acolytes with the ability to bury every argument they had ever made as a result of the reality distortion field.

    And could only have happened at the cost of some PPC apps and drivers breaking, as well as all of Classic. But, as you say, the RDF goes a long way towards ameliorating that.

    Bill Gates could never have pulled this off in 1995.

    He could have done it. But the will just wasn’t there. In 1995 DOSEMU for Linux ran Doom at full speed, with sound. The Windows NT compatibility layer? Er, no.

    These days Microsoft owns both VirtualPC (virtualization/emulation technology) and .NET (abstraction layer for the machine’s instruction set). With calculated moves they could easily and seamlessly pull off a transition from x86 to some other CPU architecture. I used to think that the PC to smartphone transition would give them the impetus to e.g., migrate Windows 7 to ARM, but I think they’d simply scrap it and start over from scratch given their corporate culture.

  111. The death of the PDA came about because by the time PDAs got powerful enough to be taken seriously, people were attaching phones to them. The iPad targets the netbook space, not the PDA/smartphone space.

    1. >The death of the PDA came about because by the time PDAs got powerful enough to be taken seriously, people were attaching phones to them. The iPad targets the netbook space, not the PDA/smartphone space.

      Way to miss the point, there! Now think more carefully about the phrase “networkable Turing-complete device”.

  112. “The idea that the iPad is “only” a giant iPhone misses the point so completely that I just don’t know what to say.”

    Yeah, it’s like how finding the cat sleeping in your bed and finding a panther sleeping in your bed differ mostly just in the size of the cat, but the… er… experience can still be remarkably different :-)

  113. “The death of the PDA refutes you.”

    IMHO there were primarily two problems with the PDA:

    1) It was presented as much as a data entry device as a reading device. Calendar, spreadsheets and so on – and after playing around with it for a while, people generally concluded there was no friggin’ way they were going to enter any kind of data fumbling with a stylus. And let’s not even talk about handwriting recognition – since when have early-adopter, IT-savvy people had handwriting recognizable even by people, let alone by software?! The first result of us learning to type fast and accurately, and using it for pretty much every task that requires writing, is that our handwriting resembles that of a particularly retarded 10-year-old kid. It was a hopeless, stupid, unrealistic idea.

    (BTW, frankly, there seem to be certain completely ungrounded myths in the industry about “business users”. A “business user” is a mythical creature, like a unicorn, who eats spreadsheets for breakfast and feels nervous if he cannot update his Outlook calendar from the bathroom. I’ve never met one, despite the fact that my job is actually about business apps. This sort of thing is hugely overrated.)

    2) It was generally too small even for reading.

    OTOH the problem with the Kindle and Nook and such is that they are targeted specifically at e-books. A handheld device of a comfortable size for reading _anything_ – websites, PDFs, maybe MS Word and OpenOffice docs – has, I think, a bright market ahead of it, because our reading-to-writing ratio tends toward 50:1. Reading a 100-page document on a comfy device on the sofa and writing a 2-page review of it on a desktop PC the next day sounds about right and doable.

    The only thing left is really just the “Slashdot question”.

  114. Jessica,

    “Personally, I hate it and refuse to use Office 2007, but it serves to illustrate the level of control MS has over the office product line.”

    Just some side info you might be interested to learn.

    Some insider info a mid-level MS executive told me in 2004. Dunno what the situation is now, but back then they were facing two problems:

    1) 50% of the total sales revenue of MS comes from Office: yes, much more than from the OS or from any of the many other products.

    2) with Office 2000, people were kinda happy, felt the product was *finished* and working as intended, and felt no need to upgrade it to 2003 or XP; therefore they based their whole corporate strategy around somehow getting Office users to upgrade. The whole corporate strategy means that even the new versions of other products were designed with this in mind, like requiring Office 2003+ if you want to use the “export records into Excel” functionality, etc.

  115. It is funny how Jeff’s trying to make a case for appliance computing based on a product that hasn’t even been released yet. :) Look at the smartphone share numbers I linked to earlier, Jeff; only in twitter hype are the appliances winning. General-purpose computing wins because there is always a long tail of apps that the closed platform overlords cannot possibly anticipate. People are already complaining about no Flash on the iPad (the one decision I agree with, cuz of how shitty Flash is, but that’s beside the point), and there’s nothing singular about Apple embracing a new form factor that many other companies have already been producing. Rather, Steve Jobs is always doomed to seed new markets with his closed model, because a closed model provides more fit and finish, which is important in the early stages when the model for the new tech is not really figured out, only to lose to those who build a more open platform. He lost the PC that way, and he will lose iTunes/content and mobile devices that way too. I’m not sure he particularly cares, as long as he can overcharge the Apple fanboys for huge profits, but it’s funny when anybody mistakes his silly model for anything worthwhile.

  116. @JessicaBoxer

    What Apple did going from PPC to x86 was nothing short of a miracle in my opinion, and could only have happened in the hyper controlled and managed environment of the Apple ecosystem, and could only have been sustained by a set of geeky acolytes with the ability to bury every argument they had ever made as a result of the reality distortion field. Bill Gates could never have pulled this off in 1995.

    Maybe I’m misunderstanding what you’re saying here, but it was certainly no technological feat. Mac OS X is basically a port of major portions of NextStep/OpenStep and a revamped Macintosh GUI to Darwin/FreeBSD. All the code was written in a cross-platform manner from day 1. The very first versions of Rhapsody were written to run on both PPC and x86. There were even rumors that they might bring out a version of Rhapsody that ran on top of Windows. Apple’s decision to move to Intel chips was more about supply problems from IBM (who was more busy turning out the Power5 chips for their servers than they were about meeting Apple’s supply demands) than it was about any amazing technological feat. Macs were already using cheap commodity hardware everywhere else in the system. All the PPC-based Macs were already using the PCI bus, USB had already supplanted ADB, etc.

    Anyone shocked about Apple’s move to x86 was simply not paying attention.

  117. “Apple’s decision to move to Intel chips was more about supply problems from IBM (who was more busy turning out the Power5 chips for their servers than they were about meeting Apple’s supply demands) than it was about any amazing technological feat.”

    I had the fortune to work at Freescale here in Austin for about two months, after they fully split from Motorola and went private. From what I heard from insiders, a big part of the ‘supply problems’ was Apple’s fault. They regularly would order, say, a million units of chip X and cancel at the last minute, at a loss to Motorola/Freescale. It was part of some nasty, stupid contract some sales guy @ Mot worked up (probably something in the AIM agreements). When the contract finally expired, Freescale dropped Apple super fast. IBM alone could not keep up with demand by itself anymore, so…

    At least, this is how I was told. I have no proof outside of Freescale hearsay.

  118. Anyone shocked about Apple’s move to x86 was simply not paying attention.

    Apple also reputedly cross-compiled OS X internally on both PPC and x86. Methinks they got burned — seriously, like far more than their public face would show — by the 68k to PPC transition and were determined not to let that happen again.

  119. My bet is that it’d be pretty cheap to take the cradle, with its physical interfaces to monitor, keyboard, mouse, and outboard drives (including your phone data backup), and wire in a processor and RAM so you could use all those together without your phone. You know, so they aren’t all useless in case your pocket was picked in the street, and you haven’t gotten your replacement phone yet.

    At which point it’d be a computer. On your desktop. I wonder what they’d call it.

  120. >I think my twitches are common enough to make the phone-centric scenario win over the cloud-centric one.

    I find the software + service model the most preferable. Examples: Evernote and Dropbox. I can view and keep files on my device and they sync to my other devices, but when I use any web terminal they are also there via a web interface. This leads to me having multiple workstations I use throughout the office and home, where I never manually move any files or ever worry about versioning.

  121. “No they don’t. I have yet to encounter a laptop with a VGA out that can drive a monitor with higher resolution than the laptop’s flatscreen. My X61 (1024×768) cannot drive my desktop monitor to capacity.”

    Try: http://intellinuxgraphics.org/dualhead.html

    2.2. statically setup in xorg.conf
    RandR1.2 configuration in xorg.conf is based on per monitor. So you need write a ‘Monitor’ section for each output and specify these monitors in ‘Device’ section.
    Below is a example snippet in xorg.conf.

    No room here for the rest, but it’s a full clone setup with differing resolutions. I haven’t done it since Fedora 8 or so, but it worked then on a T40. Presently using an X61 tablet, and I see no reason to think it wouldn’t work on this computer.
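    For readers who don’t want to chase the link, here is a rough, hand-written sketch of what such a RandR 1.2 per-monitor xorg.conf setup can look like (this is not the snippet from the page above, and the output names vary by driver version, so treat it as illustrative only):

    Section "Monitor"
        Identifier   "Internal Panel"
        Option       "PreferredMode" "1024x768"
        Option       "Position" "0 0"
    EndSection

    Section "Monitor"
        Identifier   "External VGA"
        Option       "PreferredMode" "1920x1200"
        Option       "Position" "0 0"
    EndSection

    Section "Device"
        Identifier   "Intel Graphics"
        Driver       "intel"
        # Bind each output to its Monitor section; the output names
        # (LVDS, VGA, LVDS1, VGA1, ...) depend on the driver version.
        Option       "monitor-LVDS" "Internal Panel"
        Option       "monitor-VGA"  "External VGA"
    EndSection

    With both Monitor sections positioned at “0 0” you get the clone-with-differing-resolutions arrangement described here; giving one of them a “RightOf” option instead gets you the usual extended desktop.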

  122. Eric seems to agree with Paul Graham:
    http://www.google.com/search?q=site%3Aesr.ibiblio.org+%22paul+graham

    Yet hither comes yon prediction from pg himself:
    http://ycombinator.com/rfs6.html

    Most people think the important thing about the iPad is its form factor: that it’s fundamentally a tablet computer. We think Apple has bigger ambitions. We think the iPad is meant to be a Windows killer. Or more precisely, a Windows transcender. We think Apple foresees a future in which the iPad is the default way people do what they now do with computers (and some other new things).

    Programmers may never want a computer they don’t control, but ordinary people just want something cheap that works. And that’s how the iPad will seem to them. Many will never make a conscious decision to switch. They’ll get an iPad as well, then find they use their Windows machine less and less. When it dies they won’t replace it.

    Will this future happen? It could. And if it does it will bring big changes. There will need to be iPad alternatives for all the things people now do on PCs. That could mean more than just replacing all the desktop software, because there may be things PC users now do with web apps that might be better done with native iPad apps.

  123. Microsoft never had that level of control at the operating system level, though at the application level they did as is evidenced by the transition from Office 2003 to 2007 in which they completely changed the user interface, throwing away billions of man hours of training and experience in their users. People seemed to lap it up. Personally, I hate it and refuse to use Office 2007, but it serves to illustrate the level of control MS has over the office product line.

    My understanding is the exact opposite.

    Office 2007 is the first Office product to get any form of resistance at all. It’s 3 years on and there are still businesses out there who refuse to upgrade.

    Office 2003 (and earlier) is in one of those unfortunate spaces where it’s become so entrenched that, despite the fact that it’s a retarded product which breaks almost every usability principle in the world, trying to improve it is an exercise in futility, because the answer to “what is usable?” in that space has become “whatever Office does”. Unfortunately for 2007, “whatever Office does” isn’t how it works.

    Personally I like 2007. It’s the first office that grouped functions in a way that made any form of sense to me. At the same time my usability training cringed at the way MS expected everyone to just march on board with the new interface. Ultimately it’s a good excuse to steer people towards OOO.

  124. The computer of the future will be the metaphorical equivalent of a safety pencil and circle of paper. Turing-completeness will be irrelevant to the difference in power from the user’s perspective; it’ll just be a black-box gadget that does things that, previously, one needed several black-box gadgets for. Extensibility and universality have been the source of the troubles that plague modern desktop computing. It’s plenty clear already that as long as people can use Facebook from it, the usefulness of its Turing-complete nature is about as relevant as the flavor of assembly language it runs.

  125. JonB Says:
    > Office 2003 … breaks almost every usability principle in the world

    Curious comment. I’m no “usability professional”; however, it seems to me that the most basic usability principle is this one: “the software does what the user expects, and the controls are where the user expects them to be.” Regardless of any theoretical principles, that is the basic practical principle of usability. Maintaining backward compatibility for a very widely used product is at the very heart of implementing such a usability feature.

  126. “It’s plenty clear already that as long as people can use Facebook from it, the usefulness of its Turing-complete nature is about as relevant as the flavor of assembly language it runs.”

    For some reason, this brings to mind the ‘people ready to do violence on your behalf while you sleep’ concept. I imagine all those engineers and admins out there, heroically slugging away at designing systems and developing code, all so the sleepy Users can access their social networking of the day on their fancy iWhatsits without having to Think about it. I’m not sure the comparison is far from reality, actually…

  127. Way to miss the point, there! Now think more carefully about the phrase “networkable Turing-complete device”.

    Why should I? The target audience who are purchasing these hypothetical devices sure won’t. The iPad fills a niche which neither desktop (or laptop!) PCs nor current smartphones do: the curl up on a couch and check email, read a book, or watch a lolcat video niche. Netbooks attempted to tackle this niche, but fell way, way short of the mark (yes, I have tried this with a netbook!).

    It could well be that the iPad of the future will be a large display that communicates via Bluetooth with the iPhone in your pocket. But the dynamic will not change. Apple “gets it” in a way that virtually no one else in the industry does. They don’t think, “hey neat! A networkable Turing-complete device that goes everywhere you do. I wonder if I can get Python to run on it?” They think in terms with how it will be used by ordinary people going about their lives and how best to make such use easy and convenient. This necessarily involves restricting what programmers can do with the device. This is the central concept in modern computing, and it’s why open source doesn’t have a chance in this world.

    Game consoles aren’t on point, as they aren’t designed to be carry-me-everywhere devices.

    O RLY?

  128. I use and like a Lenovo X61 Thinkpad happily when traveling, but for steady day-to-day work nothing beats having a big ol’ keyboard and a display with lots of pixels.

    Y’know, I’m beginning to feel like an old coot: the kids of today will be perfectly at ease with touchscreen keyboards whereas I cannot tolerate even the full-size keyboard that comes standard-issue with a new PC. I have to have one that goes ka-chunk, ka-chunk, ka-chunk — only then can my fingers fly and I can achieve coding zen.

    My successors, who will program by dragging bits and bobs from a toolbar on their iWhatever display and connect them with lines of light drawn by a sweep of the finger, would think me rather quaint and “retro” — steampunk, even, in my input-device tastes.

  129. # Jeff Read Says:
    > My successors, who will program by dragging bits and bobs
    > from a toolbar on their iWhatever display and connect them
    > with lines of light drawn by a sweep of the finger, would think
    > me rather quaint and “retro”

    “Keyboard? How quaint.” — Scotty, Star Trek IV.

  130. On a vaguely related note, Apple have done the right thing for themselves by not announcing international iPad prices yet, for the pound has been tumbling against the dollar in recent weeks (not that Apple’s UK prices are set on exchange rate alone).

  131. The only thing Apple “gets” is how to continuously shoot themselves in the foot with blatantly idiotic moves. The reason general-purpose computing always wins is that extensibility always trumps the dimwits who want a closed computing cage in which they can be safe and unchallenged. It would be nice if those claiming otherwise would look at the market-share numbers that have empirically demonstrated this fact for decades.

  132. > I think you underestimate the design impact of really powerful programmable devices where nonprogrammers are first-class citizens. To the nonprogrammer a computer is an appliance. It’s a thing they buy that lets them do X, Y, and Z.

    I’m with you so far…

    >They are not creative enough to imagine exotic uses for it, so accordingly a locked-down, vendor-controlled appliance that does just those things (with potential future upgrades) is fine with them.

    And here you lose me. As a nonprogrammer, I’m plenty creative enough to imagine exotic uses for my computers. I just don’t have the mad programmer skillz needed to implement those uses. So locked-down vendor controls are a Bad Thing from my POV – they cut down the amount of stuff available from those who do have the programming ability. Now, it defeats the purpose to adopt a unixoid “users are losers” attitude in the course of making the thing programmer-friendly. But vendors should avoid making things gratuitously harder for independent programmers with such things as licensing hassles, SDKs that cost an arm & a leg, etc.

    1. >And here you lose me. As a nonprogrammer, I’m plenty creative enough to imagine exotic uses for my computers.

      Give it up, Deep Lurker. All you’re going to get back from Jeff Read, or any other Apple-worshipper, is something I’ve just dubbed the “No True User” fallacy, after the No True Scotsman fallacy. The fact that you’re creative enough to imagine exotic uses excludes you from the True User category, so your desires need not be taken into account when projecting the glorious victory of the iPhone, or the iPad, or whatever other lock-in device Steve Jobs is peddling this week.

  133. The only thing Apple “gets” is how to continuously shoot themselves in the foot with blatantly idiotic moves.

    I guess that’s why their financial statements look so bleak.

    I have to have one that goes ka-chunk, ka-chunk, ka-chunk

    These days, I prefer good laptop keyboards (pre-chiclet macbook, thinkpad). Less key travel means less effort. In contrast, people who learned on typewriters use a comical amount of force; how does it not make your fingers sore?

    Fortunately, there are laptop-style keyboards available as desktop peripherals.

    For me, the biggest open question about the iPad is the utility of the onscreen keyboard.

    I imagine all those engineers and admins out there, heroically slugging away at designing systems and developing code, all so the sleepy Users can access their social networking of the day on their fancy iWhatsits without having to Think about it.

    Sure, just like the bright people at car companies pour out their labor and ingenuity so that you can drive a car without knowing anything about the mechanics.

  134. The fact that you’re creative enough to imagine exotic uses excludes you from the True User category, so your desires need not be taken into account when projecting the glorious victory of the iPhone, or the iPad, or whatever other lock-in device Steve Jobs is peddling this week.

    You’re right. It’s all bullshit. The “typical user” of these devices is a fiction, an abstraction, a complete straw man. But by pretending the typical user is uncreative, we become capable of engaging not only with the genuinely uncreative types, but also with smart people who don’t want the cognitive load of imagining what they want their device to do and devising a way to get from point A to point B. Even smart, brilliant, talented people sometimes want an appliance instead of a kitbag of tools that requires ingenuity to get something that does what they want.

    I realized long ago that the only way to get an OS like Linux taken seriously on the desktop was to make geekiness the default. Not everybody has to be Linus, but there has to be a cultural ambient acceptance and encouragement akin to the fondness postwar American youth had for their automobiles. It’s every nerd’s dream that the guy or girl on the street should at least appreciate what that nerd does and why it’s important. What I didn’t realize until recently is that this effort is pert-near futile. The social obstacles are just too well-entrenched to overcome. A joke about Microsoft and lightbulbs comes to mind…

  135. Sure, just like the bright people at car companies pour out their labor and ingenuity so that you can drive a car without knowing anything about the mechanics.

    That used to be the case, as in Henry Ford’s vision of a car for the common folk, but now American automakers slap a new layer of marketing onto the same old engine and body-frame platform and still expect record profits. When the profits don’t come they file for bankruptcy and expect the government to run them and pull their asses out of the fire.

  136. esr: The fact that you’re creative enough to imagine exotic uses excludes you from the True User category, so your desires need not be taken into account when projecting the glorious victory of the iPhone, or the iPad, or whatever other lock-in device Steve Jobs is peddling this week.

    As much as you’d like to think that crippled devices are unpopular, or that people won’t stand for it, the facts say quite the opposite. As Mark Pilgrim put it, outside of Planet Debian and your own personal echo chamber, nobody gives a shit about Freedom 0.

    You can have desires. You clearly do have desires. But when determining the direction in which general-purpose computing devices will evolve over the next few years, your desires don’t really count for much, because they make up a tiny, tiny slice of the pie.

    1. >As much as you’d like to think that crippled devices are unpopular, or that people won’t stand for it, the facts say quite the opposite.

      Lock-in is a recipe for short-term success but longer-term failure. It’s not that “people won’t stand for it”, it’s that eventually consumers outgrow the limitations of crippled or single-function devices on their own. The history of consumer electronics is littered with examples; one very contemporary one I’ve already cited is handheld GPSes.

  137. “No they don’t. I have yet to encounter a laptop with a VGA out that can drive a monitor with higher resolution than the laptop’s flatscreen. My X61 (1024×768) cannot drive my desktop monitor to capacity.”
    Finally found a VGA cable. Plugged it into the VGA socket on my X61 tablet and into the desktop Dell 1920×1200 LCD. Booted Fedora 12 (KDE). Both monitors showed the usual bootup sequence. Both came up in 1024×768 mode. In ‘Fedora->SystemSettings->Settings->Display’, both screens were shown as connected, both at the LVDS default of 1024×768. Changed the VGA resolution to 1920×1200 and hit ‘Apply’. The desktop LCD changed to 1920×1200, with the tablet staying at 1024×768. The two screens had their upper-left corners in the same position. The result was a virtual screen the size of the larger defined screen, with the smaller within it. Not quite a clone in the sense that I thought it was going to be, since you could have stuff on the larger screen not visible on the smaller one (and offhand I don’t remember how to move the smaller around relative to the larger).
    You can also do the usual twinhead ‘Right of’ or ‘Above’ from the Display settings.

  138. it’s that eventually consumers outgrow the limitations of crippled or single-function devices on their own. The history of consumer electronics is littered with examples; one very contemporary one I’ve already cited is handheld GPSes.

    I can’t think of any example of a platform that is programmable, and has a huge variety of software available, yet where software must be approved by the platform owner. This is a different case from a single-purpose device like a GPS, or a few-purpose device like those old, non-programmable PDAs.

  139. 2014 might be a little bit too soon.
    Not for the technology, as 4GHz processing units will be current by then (if you follow Moore’s law).
    The main problem is that people don’t adopt that fast.

    The idea of my mom replacing her desktop PC with a smartphone… lol

  140. Let me just say for the record, for those of you who think my intense opposition to the patent system is nuts, that Apple’s latest patent suit against HTC (which Ajay referenced above) is the very essence of what I am talking about. It is rent-seeking in the most literal sense of the word. Were it to be successful, it would have a profoundly damaging effect on the cell phone market. Apparently, they have a patent on turning off the screen when you put it to your ear — that must have taken years of R&D to come up with.

    Thankfully, I suspect most of their patents will fail in court based on the In Re Bilski precedent (though that is still not 100% settled, since it is going to end up at the USSC). However, it will cost HTC tens of millions of dollars to defend, even in the face of Bilski, and you and I will be the ones to pay for those lawyers.

    The world would be a better, more innovative place without the patent system.

  141. # Jeff Read Says:
    >This is what everybody, including Google, thought Android would look like

    Unfair, Jeff; this is a picture of a proof-of-concept device. It is hardly fair to compare it to a finished commercial product. What did the iPhone look like two years before release?

    > Apple did the hard work of innovating, and HTC chased their taillights.

    The putative moral justification for patents rests on whether they encourage invention and innovation. Are you claiming that, sans patents, Apple would not have innovated in the way that they did? Patents offer inventors (or, more likely, inventors’ employers) the right to seek rent from people who play in the same space. Clearly that offers some encouragement, but it also obviously creates many deterrents to innovation. There are pros and cons from a utilitarian standpoint, but most people (including Madison, apparently) only look at the pros. The fact is that any time the theory, namely that the pros outweigh the cons, is actually put to the test, it does not hold up; in fact, it often demonstrates the opposite. Two obvious examples are the huge explosion in innovation that took place only after Watt’s patents on the steam engine expired, and the recent GAO study that concluded that drug patents retarded innovation in that industry. Ironically, both industries are considered excellent examples of the benefits of patents when they show the exact opposite.

    That is not even to mention the outrage that someone can somehow seek rent on the efforts of my own hands when they may very well have had nothing to do with them at all. The fact is that everything was invented at least once by somebody. Many commercially significant ideas have been invented recently. If all these were licensed, and the patent system fully implemented, the economy would grind to a halt, if only under the paperwork that would be generated. It does not do so only because our courts and government agencies are so breathtakingly inefficient, and because getting patents is expensive and time-consuming, and prosecuting them even more so. That is also why, BTW, the public’s love of patents is largely based on a lie. They have an idea of the lone inventor inventing something in his garage and becoming a zillionaire. Ask any patent lawyer: that almost never happens; in fact, many patent lawyers won’t even accept these guys as clients. Patents are creatures of big, huge businesses, not the little guy.

    BTW, if you introduce a little free-market capitalism into the patent system to make it run more efficiently, you get what we like to call a “patent troll.” The truth is that the whole system is one big lumbering patent troll.

  142. Jeff Read Says:
    > Jessica, I hate software patents as much as the next guy

    I forgot to ask: why are the arguments against software patents not equally as valid against patents on door locks, engine pistons or drug formulations?

  143. Without disputing your general point about patents:

    This is what everybody, including Google, thought Android would look like.

    Unfair, Jeff; this is a picture of a proof-of-concept device. It is hardly fair to compare it to a finished commercial product. What did the iPhone look like two years before release?

    *All* smartphones pre-iPhone looked like this. Small screen, keyboard, lots of fiddly buttons, indirect manipulation, aimed at suits. That was the paradigm. Apple spent years developing a completely new UI modality. Then the rest of the industry immediately copied them. (Including cargo-cult copies like the LG/Verizon dumbphones, as well as worthy competitors like Android and WebOS.)

    Whether or not software patents improve innovation, it is a *very* safe prediction that, if not for the iPhone, the smartphones on the market today would look pretty much like they did in 2007, and would be used by largely the same people (because they’d be more fiddly, and because of the carriers’ price stranglehold).

  144. Jeff,

    > Ol’ Steve never took too kindly to tailgaters.

    Ol’ Steve is still tailgating on HP.

    Jessica,

    What about copyright and trademark law? I think content really needs copyright protection (but only for 14 years). I am not down with the idea that the only content which can be monetized is a live performance, and I don’t see how people can sell books, for example, without copyright. At the same time I think people have the right to produce and sell derivative works, like fan fic and even translations. For the author to be able to completely block other works is a bad idea. One modification I’ve heard of is to require people who own copyrights and patents to license their work to anyone for a percentage governed by law.

    Yes, yes, freedom. Yes, yes, open source – but nobody buys hardware, support or training for a book or a magazine article – so many of the successful business models for FLOSS don’t work for consumable content.

    It’s a problem.

    Yours,
    Tom

  145. Curious comment. I’m no “usability professional”, however it seems to me that the most basic usability principle is this one: “the software does what the user expects, the controls are where the user expects them to be.” Regardless of any theoretical principles, that is the basic practical principle of usability. Maintaining backward compatibility for a very widely used product is at the very heart of implementing such a usability feature.

    I agree with your principle but the problem is more complex than that.

    My basic thesis is that Office is not “naturally usable”; instead, people are forced to use it so frequently that their natural use patterns conform to what Office does. This is of course not a problem for people who have already done this, but what about new starters? I can’t specifically think of any situation where “because we don’t do it like that now” is a good argument against progress.

    In truth I think the situation is even more complex than that. I don’t think one-size-fits-all usability is ever right (in the One True Way meaning of right); I don’t believe any particular GUI is more usable for all people under all circumstances. But from a “pedagogical principle” perspective 2007 is a stronger product, and that was the part of me that cringed. If they had offered both a “classic mode” and a “2007 mode” option, the product would have been stronger for a negligible overhead.

    why are the arguments against software patents not equally as valid against patents on door locks, engine pistons or drug formulations?

    Actually, I was thinking about this recently, because for those things it doesn’t offend me. After giving it some thought, I came up with a couple of possible rationalisations.

    1) (Specifically for drug formulations.) Government-mandated monopolies make a crazy form of sense when you have significant government-mandated costs of development (e.g. clinical trials). I can accept the argument that it’s an expensive process to go through clinical trials, and that a 20-year monopoly as the brass ring at the end seems somewhat fair.

    2) Most patents I’ve read outside of software share two specific features: they have very concrete bounds, and they give a concrete description of an implementation. Software patents by and large have neither; they tend to be as broad as possible and talk in handwavy terms that don’t give concrete implementations.

    3) The lifecycle is massively different. I bet the basics of pistons and door locks will be much the same 20 years from now, and the only reason we’ll go to a different drug formulation is that a better one comes out. The average lifecycle of software is what, 5 years? In some cases 2 or 3? 20 years is way too long for software.

  146. esr: Lock-in is a recipe for short-term success but longer-term failure. It’s not that “people won’t stand for it”, it’s that eventually consumers outgrow the limitations of crippled or single-function devices on their own. The history of consumer electronics is littered with examples; one very contemporary one I’ve already cited is handheld GPSes.

    I don’t think that example illustrates what you think it does. Users traded in one appliance (a handheld GPS) for another appliance (the iPhone) which condenses several of these appliancey things in one small, cutesy package. Yes, the users outgrew a crippled, single-function device, and moved to a crippled, multi-function device. The openness of the platform here seems to be a red herring.

  147. David McCabe Says:
    > *All* smartphones pre-iPhone looked like this.

    Sorry, but you are simply factually wrong on this, David. I have a T-Mobile Wing, which came out years before the iPhone. It runs Windows Mobile 6, and in form factor is closer to the Droid than to the Treo-type phones. There were also many phones of that era that did not include a physical keyboard and had a similar-sized screen.

    Not by any means saying that the Wing is as good as the iPhone or the N1, but it is plain wrong to say that all phones looked like the Android proof of concept.

    You massively overestimate the degree of innovation in the iPhone. Gestures have been around for decades, though never fully implemented in commercial products (and Apple certainly did a good design and implementation job), and most of the rest is just attractive, good-quality industrial design. That is not nothing, but it is not Einstein either.

    > if not for the iPhone, the smartphones on
    > the market today would look pretty much like
    > they did in 2007

    Right, because Apple are the only innovative people in the world. Sheesh!

  148. Jeff,
    > What about copyright and trademark law?

    Since you ask… the things categorized under the general and deceptive heading of “intellectual property” are all completely different in nature.

    Patents: I have already raved about, they are horrendous.

    Copyright: there is more justification for protection of copyrights than patents. The fact is, if you produce a song or book identical to mine then you almost certainly copied it. However, frankly, I think the world would be a much better place without copyrights.

    The argument that copyrights ensure a supply of new art, which is the putative justification for them, is on its face ridiculous. There was plenty of art generated before copyright laws, and there are many, many people who produce art for the love of it. Perhaps art would become greatly less commercialized because there is less money to be made in it, but who amongst us would regret this development?

    There are many ways to monetize art without the restrictive practices of copyright. In a sense, the very essence of art is to share, so copyright is anti-art.

    If there were no copyright laws would there be no art? Of course not.

    However, as I say, I have a little more sympathy for copyrights than patents.

    Trademarks: These are so different, I really don’t understand how they fall under the same category. Trademark law is essentially about fraud. If I make a fizzy beverage called “Coca Cola”, the claim is that I am defrauding my customers by making them think it is the drink made in Atlanta. This seems a perfectly reasonable thing to me, and so I support it. However, I think trademark laws generally tend to be overreaching. I certainly should be able to make a beverage called “Coke sucks, drink me instead”, or put up a site called “WalmartExploitsWorkers.com”. Both are probably legal, but closer to the legal edge than they should be.

    Trade Secrets: These are the last of the four, and are obviously a matter of employment and other types of contract. I don’t understand why they are regulated under intellectual property law.

    > I am not down with the idea that
    > the only content which can be monetized
    > is a live performance,

    If people aren’t willing to pay for something, that is their business. Why does art have to be monetized at all? Mozart did some great work without copyright; what is wrong with that model? I recognize that Britney Spears might have to downgrade to first class from her G5, but is that really the end of society? I would contend that copyright skews the value system so that art is much more expensive than it should be.

    And I might add that hyper commercialization has hardly contributed to a great improvement in the quality of art. In a sense, you might say that the quality of art is almost inversely proportional to its degree of commercialization.

    > Yes, yes, freedom. Yes, yes, open source
    > – but nobody buys hardware, support or
    > training for a book or a magazine article

    Here is the problem, though: government has created one business model for art that greatly overvalues it, and consequently it is the only one artists use. Wouldn’t it be great if there were some innovation in the area of business models for art? Perhaps you can’t think of a way to do it, but if your life and livelihood depended on it, perhaps you might find a way.

    *All* smartphones pre-iPhone looked like this. Small screen, keyboard, lots of fiddly buttons, indirect manipulation, aimed at suits.

    Not true.

    For example: http://forum.xda-developers.com/

    This page contains (to my knowledge) every interesting phone that HTC developed. Look at the photos per group. The photo is representative of the phone of that kind (case mods for different carriers excepted). Anything higher up the page than (and including) the HTC Athena was pre-iPhone (in fact, the Athena was launched only 20 days after the iPhone was announced).

    While there are BlackBerry-like devices and a few devices with keyboards of one form or another, the vast majority of their phones were a d-pad with a couple of extra buttons. The overwhelming majority of them were touchscreen.

  150. Oh, and note that there ARE some phones missing from this page. They’re the ones without a touchscreen, and from what I remember they all had a regular phone’s interface. Imagine Windows Mobile running on a generic Nokia and you’ve pretty much got them.

    (those phones were very uninteresting since they were generally pretty crippled in both hardware and software)

  151. Wow, still people commenting here.

    I stand corrected per the above point. But, while it is not quite true that,

    Right, because Apple are the only innovative people in the world. Sheesh!

    Apple is one of the only companies in the world with any taste. (Palm also has great taste, but no longer has what it takes to foster a new platform, apparently.) RIM, to their credit, created a great product for suits, but completely failed at making a mainstream product. Most other smartphones were Windows Mobile trash.

  152. Jumping to the bottom to answer a couple people’s comments about video and television.

    If the Internet tries for realz to take over from broadcast television, it will melt down.

    *Unless* carriers and “broadcasters” finally give in and set up a reasonable multicast backbone, and use that. Problem is that’s hard to monetize.

    The Internet’s problem — in general — is that, given who it was designed by, it has a built-in tendency to make things easier to deliver for users and harder to monetize for providers.
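
    To make the multicast idea concrete, here is a minimal receiver-side sketch. The group address and port are hypothetical placeholders, not anything a real broadcaster uses: a client joins an IP multicast group and reads whatever datagrams arrive, so one outbound stream can fan out to any number of viewers without the sender pushing a separate copy to each.

      #!/usr/bin/env python3
      # Hedged sketch: join an IP multicast group and read datagrams from it.
      # The group address and port below are hypothetical placeholders.
      import socket
      import struct

      GROUP = "239.1.2.3"   # administratively scoped multicast group (made up)
      PORT = 5004           # made-up port for the video stream

      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
      sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
      sock.bind(("", PORT))

      # Tell the kernel (and, via IGMP, the upstream routers) that this host
      # wants traffic sent to GROUP delivered here.
      mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
      sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

      while True:
          packet, sender = sock.recvfrom(2048)
          # A real client would hand these packets to a video decoder; printing
          # the size is enough to show the fan-out works.
          print(len(packet), "bytes from", sender)

    Whether anyone can charge for that stream is, as Baylink says, the hard part; the plumbing itself is not.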

  153. Baylink, that is complete and utter nonsense. Nobody with a brain is going to try to recreate broadcast television on the internet; it’s going to be something much better than that. Talking about how streaming and multicast are hard to do, i.e. recreating the old-fashioned TV streaming model on the internet, is as dumb as talking about how hard it is to stick an internal combustion engine on a bicycle and make it safe. Nobody with a TiVo watches streamed video; they record the shows and watch them at their convenience.

    Internet video will automatically download to your home overnight, and then you’ll have hundreds of hours queued up to watch any time you want. Imposing a 24-hour stream on internet video is just dumb and not what most users want anyway. As for monetization, that’s as easy as pie: it’s trivial to implement subscription or micropayment systems for internet content. The only reason it isn’t done is that the techies who could do so are economically illiterate and the content people are technically illiterate.
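
    For what it’s worth, the overnight-download model is easy to sketch. Here is one hedged version, run nightly from cron, that walks a list of subscribed RSS feeds and fetches any media enclosures it hasn’t already saved; the feed URLs and spool directory are made up for illustration.

      #!/usr/bin/env python3
      # Hypothetical nightly downloader: pull new media enclosures from a few
      # subscribed RSS feeds into a local spool directory for later viewing.
      import os
      import urllib.request
      import xml.etree.ElementTree as ET

      FEEDS = [                                    # made-up subscription list
          "https://example.com/show-a.rss",
          "https://example.com/show-b.rss",
      ]
      SPOOL = os.path.expanduser("~/video-spool")  # made-up local queue

      os.makedirs(SPOOL, exist_ok=True)

      for feed_url in FEEDS:
          with urllib.request.urlopen(feed_url) as resp:
              tree = ET.parse(resp)
          # RSS enclosures carry the URLs of the actual media files.
          for enclosure in tree.iter("enclosure"):
              url = enclosure.get("url")
              if not url:
                  continue
              target = os.path.join(SPOOL, os.path.basename(url))
              if os.path.exists(target):
                  continue  # already fetched on a previous night
              print("fetching", url)
              urllib.request.urlretrieve(url, target)

    A crontab entry such as ‘0 3 * * * python3 nightly_fetch.py’ would run it while you sleep; by morning the queue is local and no streaming is involved.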

  154. Ajay: Internet video will automatically download to your home overnight and then you’ll have hundreds of hours queued up to watch any time you want.

    I strongly disagree with this bit. Something I didn’t foresee, but which looks obvious in hindsight, is that once bandwidth grew enough to allow real-time video streaming, nobody wanted to actually download their videos any more. Consider the people who keep their favorite videos on YouTube: even though they’ll curse when the internet is slow, they’ll still watch them in the browser rather than downloading them. Maybe the browser-is-the-OS mindset deserves more of the credit than I’m giving it, but in any case, what you describe is the opposite of the current trend away from downloadable media and toward streaming media, especially since streaming designs give the media providers far, far more control over the end users. I imagine internet TV will look pretty much like Hulu.

  155. I strongly agree that smartphones will disrupt PCs; in the future everybody will use a smartphone rather than a personal computer. In today’s world everybody is looking for easy-to-use products, especially in the world of business. I think people in the future will go for smartphones.

  156. Yes, I too agree that smartphones are the future gadget which people will use instead of a PC. They are more easily portable than conventional PCs.
