The midrange computer dies

About five years ago I reacted to a lot of hype about the impending death of the personal computer with an observation and a prediction. The observation was that some components of a computer have to be the size they are because they’re scaled to human dimensions – notably screens, keyboards, and pointing devices. Wander outside certain size extrema and you get things like smartphone keyboards that are only good for limited use.

However, what we normally think of as the heart of a computer – the processing and storage – isn’t like this. It can get arbitrarily small without impacting usability at all. Consequently, I predicted a future in which people would carry around powerful computing nodes descended from smartphones and walk them to docking stations bundling a screen, a pointing device, and a real keyboard when they need to get real work done.

We’ve now reached an interesting midway point on that road. The (stationary) computers I use are in the process of bifurcating into two classes: one quite large, one very small. I qualify that with “stationary” because laptops are an exception for reasons which, if not yet obvious, will be in a few paragraphs.

The “large” class is exemplified in my life by the Great Beast of Malvern: my “desktop” system, except that it’s really more like a baby supercomputer optimized for fast memory access to extremely large data sets (as in, surgery on large version-control repositories). This is more power than a typical desktop user would know what to do with, by a pretty large margin – absurd overkill for just running an office suite or video editing or gaming or whatever.

My other two stationary production machines are, as of yesterday, a fanless mini-ITX box about the size of a paperback book and a credit-card-sized Raspberry Pi 3. They arrived on my doorstep around the same time. The mini-ITX box was a planned replacement for the conventional tower PC I had been using as a mailserver/DNS/bastion host, because I hate moving parts and want to cut my power bills. The Pi was serendipitous, a surprise gift from Dave Taht who’s trying to nudge me into improving my hardware hacking.

(And so I shall; tomorrow I expect to solder a header onto an Adafruit GPS hat, plug it into the Pi, and turn the combination into a tiny Stratum 1 NTP test machine.)
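
For anyone who wants to play along at home: the usual recipe is to let gpsd read the hat’s NMEA serial stream and PPS pulse, then point ntpd at gpsd’s two shared-memory refclocks. A minimal ntp.conf sketch, assuming the stock gpsd + ntpd pairing – and note that the time1 value is a per-unit calibration fudge, not a constant to copy blindly:

```
# coarse NMEA time from gpsd (shared-memory refclock driver 28, unit 0)
server 127.127.28.0 minpoll 4 maxpoll 4
fudge  127.127.28.0 time1 0.130 refid GPS   # serial-delay offset; calibrate per unit

# kernel PPS from gpsd (shared-memory unit 1) supplies the microsecond edge
server 127.127.28.1 minpoll 4 maxpoll 4 prefer
fudge  127.127.28.1 refid PPS
```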

And now I have three conventional tower PCs in my living room (an old mailserver and two old development workstations) that I’m trying to get rid of – free to good home, you must come to Malvern to get them. Because they just don’t make sense as service machines any more. Fanless small-form-factor systems are now good enough to replace almost any computer with functional requirements less than those of a Great-Beast-class monster.

My wife still has a tower PC, but maybe not for long. Hers could easily be replaced by something like an Intel NUC – Intel’s sexy flagship small-form-factor fanless system, now cheap enough on eBay to be price-competitive with a new tower PC. And no moving parts, and no noise, and less power draw.

I have one tower PC left – the recently decommissioned mailserver. The only reason I’m keeping it is as a courtesy for basement guests; it’ll be powered down when we don’t have one. But I am seriously thinking of replacing it with another Raspberry Pi set up as a web kiosk.

I still have a Thinkpad for travel. When you have to carry your peripherals with you, it’s a compromise that makes sense. (Dunno what I’m going to do when it dies, either – the quality and design of recent Thinkpads has gone utterly to shit. The new keyboards are particularly atrocious.)

There’s a confluence of factors at work here. Probably the single most important is cheap solid-state drives. Without SSDs, small-form-factor systems were mostly cute technology demonstrations – it didn’t do a lot of practical good for the rest of the computing/storage core to be a tiny SBC when it had to drag around a big, noisy hunk of spinning rust. With SSDs everything, including power draw and noise and heat dissipation, scales down in better harmony.

What it adds up to for me is that midrange PCs are dead. For most uses, SFF (small-form-factor) hardware has reached a crossover point – its price per unit of computing is now better.
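
To make “crossover point” concrete, here’s the shape of the comparison – with deliberately made-up round numbers for price and benchmark score, since the exact figures change monthly; only the ratio matters:

```python
# Price per unit of computing: hypothetical round numbers, not measurements.
systems = {
    "midrange tower":   {"price": 600.0, "benchmark": 6000.0},
    "SFF mini-ITX box": {"price": 300.0, "benchmark": 3500.0},
}
for name, s in systems.items():
    dollars_per_point = s["price"] / s["benchmark"]
    print(f"{name}: ${dollars_per_point:.3f} per benchmark point")
# With these assumed numbers the SFF box wins: ~$0.086 vs ~$0.100 per point.
```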

Next, these SFF systems get smaller and cooler and merge with smartphone technology. That’ll take another few years.

Comments

  1. If it sits in my room somewhere, I’m not as concerned with the form factor as I am with being able to turn off “telemetry”, and not having to pay a subscription fee for basic OS services :-P.

    Maybe something like this will happen. If so, then I’ll be somewhat disappointed, because I’m on your supercomputer power-user end of things. I write engineering simulations, designed to crunch lists of a million double-precision variables. I bought my 8-core AMD desktop so I could start up 8 processor-hogging threads at once. If I could afford one of those 32-core server motherboards without lighting my grad-student bank account on fire (burnt offerings of $$$ to the technology gods!), I’d be playing with one of those.

    Everyone moving to some small efficient form factor (because you don’t need anything more to play games – well, not games that anyone can write without a Hollywood studio) means less economic pressure to ramp up power – the sort of power needed to solve “real problems”.

    (Also why I start twitching internally whenever someone tells me that “Information appliances are the future! You don’t need that much computer!”)

    Not that I object to the Raspberry Pi and mini-PC being available: I have two of those, and one of them comes in very handy to bit-bang electronics and play around with (seriously, at this point it’s almost a logic analyzer/DAQ – a networked one, too! :D)

  2. I’d be entirely sold on the concept if there were more SFFs with options for ECC, but as far as I know that limits you to Xeon D solutions. ARM doesn’t, as far as I know, have the market segmentation that Intel imposes here; it’s just that ARM-based systems are almost always extremely cost-sensitive.

    Now, I suppose I’m weird in that through 2013 I built Xeon ECC systems for my parents, but then again those are rock solid like the ones I build for myself.

  3. PS – wrt telemetry, yes, the obvious answer is: Use Linux, and sandbox Windows/other for things that need Windows/other.
    PPS – not that I disagree with your analysis wrt small-form-factor PCs. But it does make it less likely that I’ll be able to drop $1000 in the future and get a million-core processor the size of a brick, with the terabyte of RAM needed to do, oh, say, full molecular-dynamics simulations of a ribosome, or thousand-electron quantum chemistry, or hundred-million-node CFD in the comfort of my office.

  4. Am I right that this squares with the trend of portable computers such as smartphones becoming bigger because as they get used more, consumers increasingly want larger, more usable interfaces?

    Do you see a lower limit on the size of SFF hardware due to connectivity needs? My smartphone has an audio jack and a small USB/power jack, which is nice for a phone. My computer needs power input, dedicated video, dedicated network, dedicated audio, and multiple multi-purpose data or data/power connections (USB, currently), all of which make it seem like the hardware for the computer-to-cable interface should take up more space than the computer itself. Having experimented, I’m wary of relying on wireless connections due to security and bandwidth issues, and I don’t see that going away.

    1. >Do you see a lower limit on the size of SFF hardware due to connectivity needs?

      Maybe, but if so it’s “smaller than a Raspberry Pi” for most purposes.

  5. What about virtualization?

    If you have multiple server PCs, you can replace them with multiple SFF PCs, or you can replace them with a single server PC running them all as VMs. Use Chromebooks or tablets for terminals.

    And games are still the real drivers for desktops. You need a certain amount of interior space to fit an nVidia GTX 970 or whatever the latest high-end card is, and you need space for cooling for it and your quad-core i7.

  6. > Maybe something like this will happen. If so, then I’ll be somewhat disappointed, because I’m on your supercomputer power-user end of things. I write engineering simulations, designed to crunch lists of a million double-precision variables. I bought my 8-core AMD desktop so I could start up 8 processor-hogging threads at once. If I could afford one of those 32-core server motherboards without lighting my grad-student bank account on fire (burnt offerings of $$$ to the technology gods!), I’d be playing with one of those.

    Not that I necessarily think this is a completely good thing or a perfect solution, but the trend seems to be that if you need the massive processing power, you purchase time from dedicated processors in a cloud-hosted rack somewhere else, while your local machine just needs to handle talking back and forth with the rack systems.

    Due to the response time issue, real-time interactive displays such as video games seem to be the one time you still need to have the processing power as close to you as possible. If you’re just doing a lot of math, whether as engineering modeling or 3d animation for later display, I’d think you just need to keep an eye on the process and get back the final results.

  7. On the subject of Thinkpad replacements you may want to look into a Clevo. I’ve never used one, but they have a reputation for robustness and upgrade-friendliness: you won’t void the warranty by opening the case. Unfortunately most of their stuff is gaming-oriented (but tastefully – no glowing flame decals here!). Fortunately they also have workstation-oriented models. Unfortunately I’ve heard some bad things about their keyboards, but most of the complaints are from a few years ago and seem to have been straightened out.

    There is a complication in getting one however; Clevo is an ODM that sells laptops to a bunch of different companies. Their U.S. distributor is called Sager, who is not known for having the best customer service ever, so the usual solution is to buy from a place like xoticpc / lpc-digital / puget systems[1] / System76[2] / etc. which specializes in the customer service.

    > This is more power than a typical desktop user would know what to do with, by a pretty large margin – absurd overkill for just running an office suite or video editing or gaming or whatever.

    /pedantic-mode 1

    Actually a mid-range “gaming PC” probably has far more raw power than the Beast. It is just optimized for the nearly perfectly Amdahlian problem of video rendering. Same for a PC optimized for video editing.

    [1] Puget Systems only has one very high-end laptop model anymore, but they specialize in having perfect customer service, if the stories are to be believed.
    [2] System76 sells their computers with Linux pre-installed. They even write and maintain open-source drivers for some hardware. The catch is that they are far more expensive than a place like xoticpc.

  8. My reaction to these changes has been along similar lines.

    I have a Nexus 5 phone. There’s a Nexus 7 (2013) tablet in my purse, along with a folding bluetooth keyboard if I anticipate the need to do any heavy lifting. When I go to the office (downtown Philly on the Penn/Drexel campus; parking’s bad and security is worse) I take a $150 Chromebook (light, cheap, and contains no consequential persistent state).

    Any of those devices can Chrome Remote Desktop to an elderly Linux tower wired to our big-screen TV (which listens to a Chromecast most of the time but can be switched to Linux, Google TV or a Blu-ray player) or a weapons-grade gaming laptop that is my main development environment and gaming machine. The laptop stays more-or-less permanently docked to a USB hub, Cat 5 Ethernet, an audio mixing board and an HDMI switch. At home I access that with a wireless keyboard and trackball from a La-Z-Boy recliner, and the HDMI feeds a monitor on a swing arm over the chair.

  9. > laptops are an exception for reasons which, if not yet obvious, will be in a few paragraphs.

    Wasn’t there a plan for an Ubuntu phone that was going to work by essentially the same principles you outlined – a smartphone that docks to a keyboard/monitor setup?

    > Do you see a lower limit on the size of SFF hardware due to connectivity needs? My smartphone has an audio jack and a small USB/power jack, which is nice for a phone. My computer needs power input, dedicated video, dedicated network, dedicated audio and multiple multi-purpose data or data/power connections (USB, currently),

    If the battery is good enough you may not need to use the power connection at the same time as you are using other peripherals. The most recent MacBook uses this model, having only one USB (3.0 Type C) port and one audio port, just like your phone.

    USB certainly can carry network, video, audio, and power. Even the audio jack on phones (and the MacBook) is a concession to the fact that most people want to be able to use cheap headphones. There’s no fundamental reason that your device itself needs more than one port. It could use a hub, or a “do-it-all” device (what used to be called a docking station) that breaks out conventional ports rather than more USB ports.

    And, of course, most people will use wireless. People who insist on wires for everything for security or bandwidth reasons are likely to be a minority, one that can be catered to with a single USB port that the majority will only ever use for flash drives and power.

  10. I’m a mostly-laptop guy at the moment. And as I’m a web developer in a modern agency, that translates to a Mac (I need to run Photoshop AND Apache). I still have my hands elbow-deep in the command line of Linux servers that I deploy stuff to, but I sadly don’t have a Linux desktop any more.

    Stuff like this makes me excited, however:

    http://arstechnica.com/gaming/2016/03/amd-wants-to-standardise-the-external-gpu/

    I spend about 60% of my computing time with my laptop at my desk hooked up to a 27″ external monitor and a nice keyboard and mouse, but I really need to be able to grab the laptop and go to a meeting and resume my work uninterrupted.

    I basically classify everything, computing-wise, into thermal budgets. My phone actually has quite a bit of CPU and GPU power, rivaling that of my personal laptop. What it’s lacking is RAM (a paltry 2GB) and real thermal budget – it throttles the CPU based on heat. My laptop performs more consistently, with its 4-year-old Core i7. But it just can’t perform as well as a dedicated desktop PC, which has, effectively, unlimited thermal and power budget. A pluggable augmentation, like the external GPU I linked to, would be fantastic for when I dock my laptop to my desk. Or might allow me to replace my laptop completely with a phablet some day.

  11. > I still have a Thinkpad for travel. When you have to carry your peripherals with you, it’s a compromise that makes sense. (Dunno what I’m going to do when it dies, either – the quality and design of recent Thinkpads has gone utterly to shit. The new keyboards are particularly atrocious.)

    If you want a quality laptop, your only choice is a MacBook Air or Pro. I recommend NOT installing Linux on it; keep Mac OS X on there. It’s Unix enough, it comes out of the box with full support for all the hardware (because Apple), and the excellent power management gives you 10+ hours of battery life. Linux power management is shit on all platforms, plus the wonky way it handles the trackpad will probably drive you nuts. Lenovo and other manufacturers have proven untrustworthy on the spyware front, and if you buy from Lenovo again you may end up with creepy unwanted things in your BIOS. This probably won’t happen with Apple kit.

    If you have to pave over the laptop’s installed OS, a MacBook still makes for a better Linux machine than most laptops.

    Clevos suck. Avoid. They routinely ship with light-bleeding screens and other problems.

  12. Oh yeah — you can run Word, Excel, and Photoshop on the Mac too while still retaining your Unix development environment.

  13. And VirtualBox and VMware Fusion work well for running *nix (I spend about a third of my time in Mint full-screen), and Homebrew is a reasonably compleat package manager.

    Dual-booting into windows (*shudder*) for running the handful of games I occasionally load up is straightforward.

    That said, I can also understand why someone coming at it from the linux side may decide to nuke and pave.

  14. > If you want a quality laptop, your only choice is a MacBook Air or Pro. I recommend NOT installing Linux on it; keep Mac OS X on there.

    While I agree with the quality of the Mac hardware, you pay for that hardware and the privilege of owning MacOS.

    Given Apple’s control freak tyranny over the OS and what you are allowed to do with it – replacing it with Linux is the >first< thing ESR would do. In my mind, this makes the laptop over-priced as MacOS is the point of having an Apple machine.

  15. FWIW for a number of years my home server wasn’t fanless, but a toasterbox computer (XPS I believe) running first Fedora, then Suse.

    It was a great toy to play with, and with dropping SSD prices, I can foresee going back to a standalone linux-based household “server” in a fanless, NUC-style, or similar case.

  16. > Do you see a lower limit on the size of SFF hardware due to connectivity needs?
    (Let’s see if I guessed the proper formatting for quoting…) See the recent trend of USB C replacing every plug, and you get things like the most recent MacBook with everything running over a single USB C port. With external GPUs, I expect most systems to have a couple of ports so that the CPU-GPU connection can get as much bandwidth as possible, but everything else will be perfectly happy to share. Also, see gaming keyboards acting as hubs, containing a couple of USB ports and audio I/O that they pass through. Right now, USB C requires a dedicated hub for splitting to all the legacy wires, but once the big keyboards catch up? I’m expecting two or three USB C/Thunderbolt ports to be the sweet spot manufacturers will eventually converge on.

  17. In my own experience, OSX is just similar enough to a Linux to lull you into a false sense of familiarity, and then annoy the hell out of you when it doesn’t do what you want. Also, apt-get is a hell of a drug.

  18. @dtsund – It’s not supposed to be “Linux”, it’s based on FreeBSD. I’m pretty sure ESR is familiar enough with a variety of UNIX platforms, he’s been in this game since before there was a Linux.

  19. Regarding Foo’s recommendation, I’ve been running Arch on a Clevo W740SU that I got “barebones” from AVA Direct (low prices, irritating service), then upgraded with aftermarket RAM and SSD, for a year and a half. It cost $1200 then for a Haswell i7-4760HQ (quad-core, Crystal Well graphics) with 16GB RAM and 80GB+240GB SSDs. System76 offers Ubuntu installed OOTB, but their prices are significantly higher, and I didn’t want Ubuntu anyway.

    I find it usable for loads including multiple browsers + Eclipse + multithreaded integration suites, and the integrated graphics is good enough to drive a 2560×1600 display over DisplayPort at high quality.

    The keyboard is mediocre, but I rarely use it; I get gorilla back quickly when typing on a laptop keyboard, and so I use Real Keyboards. It’s quite lightweight, especially compared to my previous “mobile workstations”.

    I would recommend the update to the W740SU (it’s a Skylake, so beware of no VGA) as a practical portable as long as you don’t have to spend hours at a time on the keyboard (but for me, the last applies to any keyboard attached to the screen I’m staring at).

  20. It’s tempting to get an SFF system. Right now, I have the RPi3, but it’s still not quite speedy enough to replace even a low-end web-browsing desktop like my old, now-defunct AMD E-350 box. It certainly is fun to play with, though. I actually have two systems that would be considered (midrange) gaming PCs, and I use the older of the two – an old Phenom II X6 – as my main web-browsing desktop for the moment. Maybe when AMD Zen is released and someone makes an SFF box out of a Zen APU, I’ll pick one up, and I can start leaving all of my gaming PCs turned off except when gaming.

  21. For my next machine, I’m going modular. It’s going to be one of these – http://nexdock.com/ – in conjunction with a Pi 3. $160 for a full Linux-based machine! Web browsing, streaming, development. Maybe not video editing, but who cares.

    The great thing is that I will be able to upgrade to a new Pi 4 or 5 in a couple of years for $30-$40, without having to replace the laptop form factor.

  22. > Consequently, I predicted a future in which people would carry around powerful computing nodes descended from smartphones and walk them to docking stations bundling a screen, a pointing device, and a real keyboard when they need to get real work done.

    Much like my Surface Pro 4, recently acquired.

    Larger than a smartphone, but only because it’s also a laptop; the computing guts would fit even smaller form factors.

    Point being the keyboard is optional (integrated into a cover case), and with a simple dock it can drive two real monitors and have an arbitrary array of “real computer” stuff attached to it, and it’s damn near as powerful a CPU as my desktops are.

  23. > If I could afford one of those 32-core server motherboards without lighting my grad-student bank account on fire

    Since your code will already utilize multiple CPUs it might not take much to spread it across a Beowulf cluster.

  24. Smartphone-sized computing systems (I’m putting the RasPi in this category) are good for many uses but not truly universal. Power and cooling still matter for some uses (the RasPi 3 itself has cooling issues, albeit it’s so tiny that just keeping it submersed in mineral oil is actually a very reasonable solution), so there will be some demand for desktop-sized computers, some of which will be larger than current SFF/“nettops”. And yes, you could move most compute-intensive tasks to the cloud, but cloud computing is still comparatively expensive, especially if you need GPU compute, lots of storage and the like.

  25. I have a Retina MBP for web development and graphics work. Pretty nice hardware with a mediocre OS and an absolutely godawful desktop environment. If you want to replace the Thinkpad, a Dell with linux pre-installed would probably be a better bet.

  26. > spread it across a Beowulf cluster.

    I have no idea how long it’s been since I last heard “imagine a Beowulf cluster…”

  27. It is interesting to ponder a cluster of Raspberry Pis. Some quick googling puts the Pi 3 at about 180 MFLOPS, and the baseline of a “desktop supercomputer” at about a TFLOP. Rounding the Pi up to 200 MFLOPS, we get about 5000 Pis. At $40 per Pi and another $10 for flash, we are at $250k before purchasing a nontrivial amount of networking equipment, power cables, and shelving/housing. And fans, very important at this scale. Even with volume discounts, it is hard to see this being competitive with a high-end tower system. Disappointing, or perhaps relieving, since I don’t need another project…
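
    Redoing that envelope math explicitly (same assumed figures – 200 MFLOPS per node and $40 + $10 per node – so only the arithmetic is being checked):

    ```python
    # Back-of-envelope: nodes and dollars to reach 1 TFLOPS with Pi 3s.
    target = 1e12             # "desktop supercomputer" baseline, in FLOPS
    per_pi = 200e6            # rounded-up Pi 3 estimate, in FLOPS
    cost_per_node = 40 + 10   # board + flash card, in dollars

    nodes = target / per_pi
    print(int(nodes), int(nodes * cost_per_node))   # 5000 nodes, $250000
    ```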

    If ESR is correct about the SFF eating away at the larger desktops, this may become an economical approach in 5 to 10 years, though.

    What is the status of cluster software these days? Is it something easily spun up on a few systems as an experiment? A small Pi cluster may not be useful, but it would be fairly inexpensive and might be a fun exercise.
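
    For what it’s worth, the classic answer is still MPI, and it is easily spun up as an experiment: install an MPI runtime and mpi4py on each node, list the hosts in a file, and run a sketch like this one (file and host names are hypothetical):

    ```python
    # hello_cluster.py -- minimal MPI smoke test for a small Pi cluster.
    # Run with e.g.: mpirun --hostfile hosts -np 8 python3 hello_cluster.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # this process's id: 0 .. size-1
    size = comm.Get_size()   # total processes across all nodes

    # Each rank sums a strided slice of 0..9,999,999; rank 0 combines them.
    partial = sum(range(rank, 10_000_000, size))
    total = comm.reduce(partial, op=MPI.SUM, root=0)
    if rank == 0:
        print(f"{size} ranks computed sum = {total}")   # 49999995000000
    ```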

  28. Oh, speaking of 32-core systems not setting bank accounts on fire, I recently patched a system together primarily from eBayed parts: a 24-core Magny-Cours Opteron (6174) box with 32 gigs of RAM. $320 for everything but the hard drives. I tossed in a 1TB drive I had laying around and it works great. The case is only half-there and lacks the hard drive bays, the power supply is meant for a 1U and it’s in a 2U case, and the hard drive backplane is just hanging out loose on top of the case, but it works.

    I recently ordered another 32 gigs for $60 and a couple of 4TB hard drives for $290, so it’s gotten quite a bit more expensive, but if you already have drives lying around and no need for using all 8 memory channels at once, it’s moderately reasonable for a 6-year-old server. Number crunching may not need a redundant 4TB ZFS pool for storage either.

    It may not perform up to the standards of modern 32-core Xeons, but the Magny-Cours was the last Opteron where every core had its own FPU, so it should be a little better than a 32-core Bulldozer-derived Opteron on number-crunching apps.

  29. Being a Scala/Java developer, I refuse to touch any Apple hardware…they’re too hostile to the tech I work in every day.

  30. @ams Yeah, it’s not the easiest thing ever, though. Server boards can be tricky to figure out; some of them need specialized cases or specialized power supplies to operate. I was lucky that Supermicro’s website contains a list of parts for every computer they sell, so I could get the important missing bits piecemeal. The motherboard and half-case were the most expensive part, $120. RAM was $70, heatsink/fan $60 (for 2) from Newegg, $30 for 2 6174 processors, the PSU was $24 on eBay, and $10 for the SATA/SAS backplane. I would have bought all 64 gigs of RAM at once, but I wanted to make sure it would work before I spent $140 on RAM. (Figured that even if it wouldn’t work with only 4 channels on 2 sockets, I could run the 4 channels with a single socket. Turns out it works fine with only 4 sticks of RAM across 2 sockets.)

  31. I need a fairly powerful workstation to handle things like PCB autorouting and FPGA compilation, and of course lots of RAM because I usually leave dozens of apps (and a VM or two) scattered across my 16 virtual desktops.
    Next “desktop” machine will have More Power! And probably an SSD for the root device.
    In the laptop department, I can get away with rather less, as I don’t usually need to do a lot of FPGA compilations on it (more than a couple of builds at a client site, and it’s time to head for home and use a real computer). Does need virtualization support, so it can run the same VMs as the workstation. Current laptop is something that was on sale for under $400 a few years ago, plus extra RAM and a bigger HDD.
    Current home server is the previous workstation, repurposed. It’s getting long in the tooth. Its successor ought to be a no-moving-parts SFF type, except that I’ll have to come up with someplace to stick a bunch of spinning rust for the bulk file storage, unless the price of terabyte SSDs comes down a bunch. That’s something to ponder a few months out.
    Lab machines are cheap smallish things, one running Linux and the other Windows (because some hardware I need to use is only supported under Windows, and also I sometimes need to test things for Windows-only clients).
    In the tiny-amazing-things department? Back in February, I got a $100 eval module for one of TI’s processors. Runs Linux, has two gigabit Ethernet ports, and its Whetstone number is freaking awesome compared to the supercomputers of yore.

    So, yeah: for most computer uses, cheap little things will do the trick just fine, as long as there’s some way of connecting a decent keyboard and display. For some engineering uses, unbounded computing power will always be called for (though making it smaller is a necessary part of making it faster, so the big box ends up mostly being a way to manage waste heat and a place to put the video card, or perhaps a home to an array of tiny fast computing units).

    What I’d Like To See Now… is a way to remote my phone’s display to a fixed display on the dashboard of my car, mainly for displaying the map app while driving. The same technology ought to work for a word-processing app and an office display, right?

  32. Others have mentioned the USB type C connector, but the change in working habits that this will enable cannot be stressed enough. This new connector will turn most new monitors into universal docking stations that work with any new laptop or smartphone. Most of the added hardware (essentially a USB hub) already existed in many monitors a decade ago. New power management hardware is needed, but not that expensive.

    The type C connector is symmetrical; no fumbling about. The cable (once most of the legacy devices die) is also symmetrical (can be swapped end-to-end). The power is negotiable as to both the amount (up to 100 watts) and the direction. The bandwidth is copious and some of it can be negotiated to use for things such as video.

    So you’ll be able to plug your laptop into your monitor/integrated hub with a single cable. From there it fans out to the keyboard and mouse and possibly a wired ethernet interface. The monitor informs the laptop that it is powered from the wall; the laptop then informs the monitor that it wants some of that power, and can charge its battery while connected, because the direction of power flow is no longer tied to the direction of data/control flow.

    The monitor also informs the laptop that some of the signalling lanes can be used for video, and the laptop can then output video through the monitor.

  33. I used to be a major hardware nerd back in high school (late 90s), but these days I find that it’s feeling a lot less relevant. There’s not much I can’t do with my 2009-vintage machine, except that it’s a bit RAM-light. All I want out of my system now is a place on my desk to put a third monitor. If you’d told 1999!Alsadius that I’d ever be using a 7-year-old machine voluntarily, I’d have thought you were mad.

  34. If we are looking at something around the speed/capability of a kiosk or secretary-level machine, then maybe we could look back to the ancient form factor of the Crummydore 64: a keyboard with ports and a smallish external PSU. I look back at some of the old PC and Apple desktops that had less memory and HDD space than even a Pi, and we designed some rather nice disco light controllers within those limits (Protel, I think).

  35. @Alsadius I completely know how you feel. I was on the upgrade treadmill until about 6 years ago, when I bought a ridiculously overpowered computer. It ended up disappointing me. Now I just scrounge up all kinds of cheap parts. My main gaming and desktop computer is 6 years old. My secondary system is 2 years old, but it’s slower than the 6-year-old one for many tasks. The only way I could be happier with it is if I put a USB 3.x card in it. I spend far more now on desks, storage, monitors, and input devices than I do on motherboards and processors. My desk now has 4 monitors in a T shape. (I tried having them in a square, but it’s annoying not having at least one monitor centered.)

    If I went back and told myself in the 90s that I’m happy with a 6-year-old computer, old me’d wonder what kind of drugs I was on.

    (I do use a newer 8-core AMD system at work because of the compile workloads though.)

  36. ams: The things you mentioned — CFD, molecular dynamics and quantum chemistry simulation, etc. — all seem amenable to running on GPU, and there’s plenty of demand these days for GPUs with as much brute force as possible. You can thank the deep learning craze for that: it turns out that when lots of rich companies suddenly burn with a desire to do ridiculous amounts of dense matrix arithmetic, hardware makers are happy to oblige them.

  37. > If you want to replace the Thinkpad, a Dell with linux pre-installed would probably be a better bet.

    Dell’s Ubuntu machines are B+ tier, maybe A- tier. The hardware is great and it’s good that they are shipping Ubuntu, but there is always some driver issue that won’t be resolved for some time after the machine comes out, meaning you gotta buy last year’s machine to take full advantage. OEMs all along the supply chain just don’t want to support Linux.

    The MacBook is God-tier — as close to flawless as a personal mobile workstation gets. Apple has the clout to twist arms all along the supply chain to make it happen.

  38. @Peter Scott:

    I was thinking that. I do have some ridiculous graphics card (not used or even really designed with graphics in mind) : A friend of mine sold it to me after bitcoin mining got too unprofitable for him.

    I’ll have to look up the APIs (CUDA, etc) to write things for them. One thing that makes me hesitate though is that SIMD might not be very applicable to nonlinear problems. Quantum chemistry, probably, because that can be turned into linear algebra. Molecular dynamics, CFD, and the rest? Maybe. I’ll have to look into it.

  39. GPUs are already giving big speedups for molecular dynamics and CFD, so it’s definitely possible. And the programming model is actually a fair bit nicer than SIMD — it’s Single Instruction, Multiple Thread, which is similar but easier to work with. The idea is that you have a huge number of threads natively supported by the hardware, each written serially, and a single instruction will run across several threads at the same time, with different operands. It’s a surprisingly less-unpleasant way to write data parallel code!
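
    To make that model concrete, here’s a minimal SIMT sketch in Python using Numba’s CUDA backend (assumes an NVIDIA card and the numba package; the names are mine, not from any toolkit mentioned above):

    ```python
    import numpy as np
    from numba import cuda

    @cuda.jit
    def saxpy(a, x, y, out):
        i = cuda.grid(1)              # this thread's global index
        if i < out.size:              # guard: the grid may overshoot the array
            out[i] = a * x[i] + y[i]  # serial-looking code, run in lockstep warps

    n = 1_000_000
    x = np.random.rand(n).astype(np.float32)
    y = np.random.rand(n).astype(np.float32)
    out = np.empty_like(x)

    threads_per_block = 256
    blocks = (n + threads_per_block - 1) // threads_per_block
    saxpy[blocks, threads_per_block](2.0, x, y, out)  # one thread per element
    ```

    Each of the million iterations becomes its own hardware thread; the bounds guard is the only concession to the parallelism.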

  40. Or you could run the most widely deployed unix out there, which I’m surprised nobody else has mentioned, Android. :)

    Since my ultrabook died late last year, I’ve been using my 8.4″ Android tablet as my “desktop,” just propped up on the table in front of me and paired with a bluetooth keyboard. No monitor necessary, as its OLED screen has 4 million pixels, more than all but 4K monitors, and I don’t mind the smaller size most of the time. All command-line utilities are available from the excellent Termux app, “apt install” away to your heart’s content:

    https://play.google.com/store/apps/details?id=com.termux&hl=en

    I currently have “make -j3” building a mid-sized project in Termux, leaving one core for writing in this browser tab. It’s a Samsung, so I can jump into split-screen mode with two apps when wanted; the upcoming Android N will come with full multi-window on some devices. Of course, there are issues with using an Android tablet as a desktop- bluetooth keyboard drops keystrokes after a long enough pause, Chrome pages have to constantly be zoomed in on such a high-res display- though Word, Excel, and a bunch of games are also available (none of which I use).

    But with no other “desktop” can I just pick up the two-thirds of a pound tablet screen in one hand and go lie in bed when I feel like reading something.

    1. >But with no other “desktop” can I just pick up the two-thirds of a pound tablet screen in one hand and go lie in bed when I feel like reading something.

      I’ve said before that I think this is where it’s all going. But the technology isn’t quite fully baked yet.

  41. @ams:
    > I’ll have to look up the APIs (CUDA, etc) to write things for them. One thing that makes me hesitate though is that SIMD might not be very applicable to nonlinear problems. Quantum chemistry, probably, because that can be turned into linear algebra. Molecular dynamics, CFD, and the rest? Maybe. I’ll have to look into it.

    There are toolkits and libraries for doing CFD on GPUs, both for CUDA (NVIDIA cards) and OpenCL (theoretically universal, NVIDIA + AMD).

    See e.g. RapidCFD: https://sim-flow.com/rapid-cfd-gpu/

    NB. if you are familiar with OpenMP, there is its GPU equivalent, OpenACC… though the standard is open, most compilers are proprietary and “we will quote you the price” expensive…

    @Peter Scott:
    > GPUs are already giving big speedups for molecular dynamics and CFD, so it’s definitely possible. And the programming model is actually a fair bit nicer than SIMD — it’s Single Instruction, Multiple Thread, which is similar but easier to work with. The idea is that you have a huge number of threads natively supported by the hardware, each written serially, and a single instruction will run across several threads at the same time, with different operands. It’s a surprisingly less-unpleasant way to write data parallel code!

    What matters for the speedup using GPUs is not the _linear_ part, but the _massively parallel_ (fine-grained parallel) part, and you get better speedups if you have high computational density (a high ratio of operations to [global] memory accesses). With SIMT (Single Instruction Multiple Threads) you can run different operations on different threads, but if the branching happens inside a warp/wavefront, you will suffer a performance penalty (this is similar to hyperthreading, but with 32 threads, not 2).
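
    A minimal sketch of that warp-divergence penalty, again with Numba CUDA (hypothetical kernels doing identical work): in the first, adjacent threads – which share a warp – take different paths, so every warp executes both branches with masking; in the second, whole warps of 32 consecutive threads agree, avoiding the penalty.

    ```python
    import numpy as np
    from numba import cuda

    @cuda.jit
    def divergent(out):
        i = cuda.grid(1)
        if i < out.size:
            if i % 2 == 0:           # even/odd threads interleave within a warp,
                out[i] = i * 2.0     # so each warp runs BOTH branches, masked
            else:
                out[i] = i * 3.0

    @cuda.jit
    def uniform(out):
        i = cuda.grid(1)
        if i < out.size:
            if (i // 32) % 2 == 0:   # all 32 threads of a warp take the same path,
                out[i] = i * 2.0     # so each warp runs only ONE branch
            else:
                out[i] = i * 3.0

    out = cuda.to_device(np.zeros(1 << 20, dtype=np.float32))
    divergent[4096, 256](out)   # same result as uniform, roughly 2x branch cost
    uniform[4096, 256](out)
    ```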

  42. I’ve already gone to tablets for my Windows needs (which are down to about one program now – the VMware virtualization client). I got a Winbook TW700 from Microcenter for under $100, and a wired-ethernet+USB hub, and it’s button 3 on my KVM (alongside a Mac mini and my Linux game machine).

  43. What does everybody here do for network storage? I have a decently big collection of DVDs and Blu-ray discs that I have ripped for use in home media centers around the house, all of which stream the data from an old Linux server that is spinning rust. The disk in that thing is old and I fear it going bad at any moment, which would cause some pain if I had to rip all the content again. I have looked into OpenNAS but wonder if there aren’t some cheap cloud solutions that would work for 1080p streaming.

  44. I’ve used / owned / loved Thinkpads since they first came out. I’m presently running a T61 with 8GB of memory and an SSD drive. It’s fast, it’s quiet and I love the keyboard.

    Eric you may want to look for a reconditioned version of your current Thinkpad, buy one and have it as a spare. I’ve got a spare T61 ($140) in a box downstairs ready to receive the active SSD and memory. Minimal downtime and nothing to get used to.

  45. > Apple’s control freak tyranny over the OS and what you are allowed to do with it

    Poppycock. You can do anything on Mac OS X that you can do on Linux, including shitcanning the security by re-enabling root, loading unsigned, untrusted code, or any other way you want to fuck it up. The kernel is open-sourced, and the developer tools are free.

  46. What Apple WON’T let you do, is distribute code through THEIR store that doesn’t pass their acceptance criteria, which is exactly what they should do. Their security record is pretty damned good, whilst Google Play is about as safe as barebacking your way through Tijuana.

  47. > The kernel is open-sourced, and the developer tools are free.

    Both developer tools and UI / desktop environment are closed-source, though.

  48. One thing that tablets and phones are limited by right now is their cooling efficiency. The manufacturers keep shoving faster processors in, but they don’t have enough surface area to dissipate all of the heat they generate. My previous phone could run at full speed for 5-6 seconds, then it started throttling down due to temperature. It was waterproof, so the trick I used to keep it going fast was submerging the back half in water to cool it down. Perhaps one way to get desktop levels of performance out of a tablet or phone would be a docking station that tightly coupled to something on the phone that would let it suck heat out of the unit?

  49. > Both developer tools and UI / desktop environment are closed-source, though.

    The compilers are open-source, and you don’t have to use the IDE if you’re an EMACS geek.

  50. Not sure if you are thinking small enough yet.

    http://www.amazon.com/Intel-Compute-STCK1A8LFC-Z3735F-Ubuntu/dp/B00W7KAABK/ref=sr_1_10?s=pc&ie=UTF8&qid=1460875138&sr=1-10&keywords=intel+compute+stick

    That said, an ARM CPU in this form factor can be pretty good too. The next-generation ODROID-C2 is pretty good, except that the video drivers suck.

    My own issue is that I prefer wires to wireless. Ethernet over HDMI? Or USB? It has to start happening at some point… or 10GigE has to get cheap enough to use…

    1. >Not sure if you are thinking small enough yet.

      I applied a criterion outside the scope of your argument: “The SFF that my troubleshooter buddy Phil is already using and knows how to troubleshoot.” Not a consideration lightly to be dismissed in early stages of using such systems, I’m sure you will agree.

      You could only counterargue effectively by flying yo’ bad self out here with a Compute Stick, really. Or, I’m going to be at Penguicon the 29th of April to 2 May. Come join us!

  51. I have a Compute Stick as well as numerous Pis. I made the Compute Stick my Windows 10 test device*. It works pretty well but is a bit slow if you try to do too many things at once (this isn’t a Win 10 thing; it was also slow running Lubuntu with more than one app). It’s pretty good as a browser that plugs into the TV and thus would work fairly well as a guest-room OS. I paired it with a powered USB2 hub and USB2 Ethernet instead of wifi, but the wifi also works fine. The USB2 hub solved the lack of ports for things like keyboards and also allowed me to plug in a couple of weirder USB things that only have Windows drivers that I want to use.

    * don’t laugh too hard, Win 10 is actually not bad as a GUI and it runs my prime dev environment (node.js + Atom) just fine. Need to try the new bash in the dev preview edition.

  52. BTW, as a replacement for the Thinkpad, consider the MSI gamer laptops; my company has standardized on them. You get 8GB or 16GB of memory and various amounts of SSD storage (I’ve got 1.5TB). For reasons of corporate standards I run Win 10 on it as the host, with VMware to run various guest OSes. Works very nicely and supports 2 external monitors in addition to the laptop screen. At one point I also had a USB3 docking station on it, and in that environment I had 3 external monitors running, including a 4K+ one.

  53. >I’ve said before that I think this is where it’s all going.

    I read your post at that time; others were predicting similar hand-held “desktops” long before you, too. What’s interesting is that it’s now almost here.

    >But the technology isn’t quite fully baked yet.

    Sure, I pointed out some software flaws I ran into. As a long-time Microsoft-watcher, what do you make of their mobile capitulation? On the one hand, they came out with Continuum first, so they seem to be keeping their hand in this market, just in case. On the other hand, with almost no mobile market share, it almost certainly won’t be MS capitalizing on such ultra-mobile “desktops,” which will further eat into Windows market share.

    Are the MS borg on their way out?

    I’m also skeptical of their big cloud push, I suspect decentralized protocols will come back and turn this cloud boom into the second coming of the fiber glut.

    1. >As a long-time Microsoft-watcher, what do you make of their mobile capitulation?

      I don’t watch Microsoft closely enough to know what you’re referring to.

  54. Win Mobile share is low and falling; meanwhile, their Windows chief says mobile is “not a focus for this year.” Surely you heard of it, it was big news from BUILD a couple weeks ago? Or are they off your radar these days? ;)

  55. > Perhaps one way to get desktop levels of performance out of a tablet or phone would be a docking station that tightly coupled to something on the phone that would let it suck heat out of the unit?

    That “something” is called a heatsink. But the problem is, if you have a docking station that’s big enough to shed heat from your phone, that means it’s also big enough to do its own compute and use the phone as dumb data storage (and that’s a lot more efficient in practice).

    The RasPi3 and other SBCs are the only real exception to this rule, and that’s just because they’re small and self-contained enough that you can just keep them in mineral oil without that making things overly gross. (For the record, I assume that when “external GPUs for compute” become a thing we’re going to keep them submersed in mineral oil too. It’s the no-brainer, well-understood solution for all sorts of heavy-duty cooling applications.)

  56. > I applied a criterion outside the scope of your argument

    For your choice of your new email server, sure. That has little to do with this blog post, whose subject is generalized prognostication about The Future Of Computing.

  57. Like you, my personal laptop used to be a ThinkPad. Good keyboard with a TrackPoint (eraser-head pointer) that allowed me to touch-type and use vim key sequences without leaving the home keys for a touchpad or mouse. And touchpad drivers often suck so badly that the cursor goes flying and you end up destroying a document. When my ThinkPad died, I ordered the latest upgrade to the W series. What a disaster. Horrible keyboard, and some atrocious floating track-pad mouse-button thing. Utterly unusable. Returned. I now use a Dell Latitude for the keyboard and pointing-stick.

    I can’t speak to compatibility with your favorite *nix distribution as I run Ubuntu on my server, but still have Winders on my laptop.

  58. One warning about OSX: It ships with HFS+ set to fold case, which really fscks up the git CLI (git mv Foo foo tries to move Foo to foo/Foo == Foo/Foo). The smart thing to do is to copy the file system, set the copied version to case-sensitive, and then boot off that and wipe out the old partition (you can’t do it on the FS that you’re running on). Then set Finder to case-insensitive searches. Why case-insensitivity isn’t pushed off to Finder by default is beyond me. Also hilarious: Try ‘>foo:bar’ in ~. Then open ~ in Finder; it’ll show as ‘foo/bar’. Legacy path separator!

  59. > Others have mentioned the USB type C connector, but the change in working habits that this will enable cannot be stressed enough.

    The line between “connections between computers and peripherals” and “network connection between computers” blurs significantly, if it doesn’t disappear outright, with Thunderbolt’s ability to drive two 4K video screens @60 Hz and still have ~8 Gb/s left over for things like Ethernet over USB. If your “dock” includes some native computing power of its own, then when you plug your smart phone into it, they can form a little cluster, and divide up the work between them in creative ways to optimize for whatever resource is in shortest supply (which probably won’t be the USB bandwidth).
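
    The back-of-envelope behind that “~8 Gb/s left over” (uncompressed 24-bit color; the 1.33 factor for blanking and transport framing is an assumption, not a spec figure):

    ```python
    # Rough budget: a 40 Gb/s Thunderbolt 3 link driving two 4K@60Hz displays.
    w, h, hz, bpp = 3840, 2160, 60, 24
    raw = w * h * hz * bpp            # ~11.9 Gb/s of visible pixels per screen
    per_screen = raw * 1.33           # assumed blanking/framing overhead
    left_over = 40e9 - 2 * per_screen
    print(round(per_screen / 1e9, 1), round(left_over / 1e9, 1))  # 15.9 8.2
    ```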

    The sort of integration we’re just starting to see with car entertainment/nav systems and smart phones will go to 11 when “docking” your phone into a charging cradle means enough bandwidth to play two different movies on the screens embedded in the backs of the front-seat headrests, leaving one less thing for Johnny and Susie to fight over on that trip to Wally World.

    Now that the Thunderbolt optical cables are out, max. range is 60m. Instead of wiring our homes with the latest Ethernet offering, we might just be running Thunderbolt instead, and keeping some dongles around to connect to legacy tech.

  60. > Wait, how does OSX even Unix with a case-folding root FS?

    Fortunately, the files never have the same name apart from case. And please note that the case is stored, just ignored in directory searches and so on. Otherwise, there’d be real problems (at least with development between different OSs). It’s a stupid hack for old OS compatibility (which they threw out the window when they moved from PowerPC to Intel, so I can’t see why they’re keeping it.) Also, case folding seems to avoid Turkish (nat fet A? ken bleim fem for it, konsid?rin i/? and ?/I.)

    Proof that this happens:
    Type eMaCs into the command prompt of a Mac. It’ll work. (If you don’t like Emacs, try vIm.)

    1. >It’s a stupid hack for old OS compatibility (which they threw out the window when they moved from PowerPC to Intel, so I can’t see why they’re keeping it.)

      See, this would be a crash landing for me even if not for the whole proprietary-code thing. If OS X can’t even get that right, I’d be a fool to trust it in larger things.

  61. Oh foo. My English using Turkish spelling seems to have gotten corrupted. It should have been:

    nat fet Aİ ken bleim fem for it, konsidırin i/İ and ı/I

  62. > If OS X can’t even get that right, I’d be a fool to trust it in larger things.

    It can, it just doesn’t by default.
    The hardest thing I’ve noticed is dealing with the lack of standard time libraries (if you want resolution finer than one second). I had to hack together a bit of preprocessor to allow a common source-code interface for subsecond timing.

  63. “Consequently, I predicted a future in which people would carry around powerful computing nodes descended from smartphones and walk them to docking stations bundling a screen, a pointing device, and a real keyboard when they need to get real work done.”

    It seems the road towards that goal has been lost. The “convergence” seems to founder on a chicken-and-egg problem: connections to screens and keyboards are not available because there are no smartphones that can use them.

    Any new thoughts about how this could be brought about?

    1. >Any new thoughts about how this could be brought about?

      I think the market will solve this one easily. Both the smartphone vendors and the peripherals vendors have a financial incentive to make it happen.

      It’s quite possible USB C might be the critical enabling technology.

  64. > It seems the road towards that goal has been lost. The “convergence” seems to founder on a chicken-and-egg problem: connections to screens and keyboards are not available because there are no smartphones that can use them.

    Have you been paying attention to the discussion here about USB Type C (and the Thunderbolt 40 Gb/s flavor thereof)? Did you notice that major smartphone manufacturers are moving from Micro USB to this new connector? Even if Apple stays with Lightning, there will certainly be third-party adapters to handle them (like the Mophie battery/case for my iPhone has Lightning on the inside, but μUSB on the outside, allowing me to use any of the dozen or so such cables I have in various places).

  65. > Their security record is pretty damned good, whilst Google Play is about as safe as
    > barebacking your way through Tijuana.

    BlueOx, is that you?

    I gotta remember that line.

    My current “wandering around laptop” (named Erdos) is a Lenovo X140e with a quad-core AMD A4-5000 CPU, 16GB of RAM and an SSD, running Winders 10.

    Linux is…sub-optimal on it (stupid driver issue), and with Windows 10 + Cygwin it’s good enough for most of what I need a traveling laptop/second computer to do. Also, whatever Chromium and Fascist Fox use to execute the stuff for the Zimbra and Gmail web clients ran *horribly* on Linux, but acceptably under Windows, and IDGAF about the base OS.

    > Wait, how does OSX even Unix with a case-folding root FS?

    Quite well. Stuff done at the CLI is Unix, stuff done through the GUI is Mac. I’ve been using MacOS X for a longish time and never run into a problem. Then again, it’s my home machine so it’s where I go to get away from problems, and I get *pissed* when I find them there.

  66. Like others here, I do not own a real laptop anymore, even though I’ve owned several over the years.

    I’m now in the tablet-only camp, with an external keyboard and mouse for when I know I need to do real work on the road.

    The reason? So my tablet actually gets used while at home, while my laptop collects dust.

    That is, once I got a decent desktop, my laptop just stopped getting used. Then one day I realized that I couldn’t really bring it with me on the road, because it didn’t have the right versions of the software or data that I needed for useful work – I had let it sit for too long, and didn’t have the time to update it all.

    But my tablet? My tablet gets used (and thus updated) all the time, to either catch up on news whilst eating breakfast, or as an e-book reader, or to look up info whilst watching TV during commercial breaks.

    So I eventually replaced my original iPad with a tablet running the same OS as my desktop, so I could use it for the dual purpose of a consumption device at home and for real work on the road with a keyboard/mouse.

    The tablet doesn’t have to be beefy enough to do any heavy lifting, as I could remote into my desktop should that be required. It just had to be beefy enough to handle office software when offline, which is a fairly low bar these days.

  67. > > My current “wandering around laptop” (named Erdos)
    > I see what you did there.

    I’d be sad if you didn’t.

    As to the subject line, I thought the “Midrange” had died years ago.

    I took a job in 2011 where we had a Compaq AlphaServer ES40 or ES45 in the racks. It wasn’t doing anything (powered off, unplugged) but it was still there.

    Everything that class of machine did was replaced by “generic” Intel-powered machines.

    So the midrange has been dead for over a decade.

    1. >Everything that class of machine did was replaced by “generic” Intel-powered machines.

      Oh, I had a different and later meaning of “midrange” in mind – I meant generic Intel boxes with ATX motherboards in tower or mini-tower cases. As opposed to SFF systems like the Jetway or Pi, or custom-built hotrods like the Great Beast.

  68. > IDGAF about the base OS.

    RMS would probably agree with you, arguing that it’s all mere microcode for his portable LISP machine (a.k.a. EMACS). And besides, the computing industry became dead to him when Symbolics folded.

  69. “What it adds up to for me is that midrange PCs are dead. For most uses, SFF (small-form-factor) hardware has reached a crossover point – their price per unit of computing is now better.”

    This is something that I completely disagree with. For the simple reason that it makes too much sense. You know, that sort of sense that crashes consumer economies. When I was a kid, it was a question/prediction of mine that most personal vehicles on the roads in North America and Europe didn’t make a whole lot of sense. I figured most should be either this:

    https://en.wikipedia.org/wiki/Auto_rickshaw

    Or this:

    https://en.wikipedia.org/wiki/Semi-trailer_truck

    Without much to fill the gulf between. Your prediction is very similar, and will be (and has been so far) prevented by software expanding to fill the hardware in a typical mid-range desktop. One of the most subtle examples of this software expansion is something called Unity, a game engine that’s fairly easy to develop on, but in some cases an incredible waste: an idle game built on it (something called AdVenture Capitalist) can kill an hour of standby time when you only wanted to check it out for a few seconds.

    It obviously isn’t mobile-compatible Unity games that are going to drive PCs, but bigger ones like The Long Dark and Kerbal Space Program. That’s a relatively small market segment compared to triple-A franchises like Battlefield, Halo, and Call of Duty. There will be developers who will make sure there are PC versions that take advantage of the available hardware, and make sure that current-gen consoles go obsolete sooner rather than later. If not developers, then modders! If not software, then people’s innate fascination with bling – the same thing that’s kept sedans and SUVs from disappearing from our motorways. Then, like department stores no closer than an hour’s walk, the software will fit the hardware.

  70. I followed the link to the Intel NUC, mainly being interested in how it performs in 3D gaming graphics, largely because back when I was an avid gamer this was the primary reason I had to upgrade my hardware pretty often. Apparently, very well: http://www.intel.com/content/www/us/en/nuc/nuc-core-i7-gaming-usage-guide.html

    I remember when 3D / gaming used to be a high-end usage. What happened in the last 10 years while I wasn’t paying attention? I get it, operating systems also got 3D (although the first thing I do is turn that off after installation; my Windows 7 looks like Windows 2000 and my Linux runs LXDE, Lubuntu), and this enabled even laptops with integrated graphics cards to run older 3D games on lower settings, which was previously impossible. But now they are talking about HD graphics? It is weird to see this TV term, HD, applied to computers; basically it means 1920×1080, but 3D graphics is a bit more complicated than that – I remember when I was a gamer I had to play with settings like bilinear vs. trilinear vs. anisotropic filtering in order to get a good FPS.

    Is gaming really becoming a low-end usage now, with 3D something that, as with consoles, “just works”, without having to pay much attention to getting the right card and the right settings?

    OTOH I think gaming is moving on to VR now, and thus for most purposes a truly low-end laptop will be enough.

    I also suspect 3D will be calculated on a server and gamers will use a thin client. Something like RDP / VNC.

  71. @esr:
    >Both the smartphone vendors and the peripherals vendors have a financial incentive to make it happen.

    Microsoft tried and got it backwards (and couldn’t get any market share); it’s in Apple’s interests to sell each of their customers a Mac *and* an iPhone; and while individual Android vendors may have some incentive to make it happen, it is in Google’s best interests to prevent it from ever happening, as anything you can do on your own hardware is one more Web service they can’t sell you, and they can leverage the Android vendors against making it happen by threatening to withhold licensing for the Play ecosystem. The future of computing in the West, and especially the US, looks very dystopian to me.

    On a total tangent:
    A line of shimmering blue light appears.
    You have 12 charges remaining.
    The midrange computer flees in terror!
    The midrange computer dies.
    The SFF hits you.
    You hit the SFF.
    You have slain the SFF.
    The Beast of Malvern breathes fire. -more-
    You die. Dump the screen? (y/n)

    1. >Apple’s interests to sell each of their customers a Mac *and* an iPhone,

      It sure is. So what? Given interconnects like USB C, Apple can’t stop peripherals vendors from creating symbiotic hardware that will replace the peripherals on a Mac around a smartphone or tablet core – not without crippling their tablets and phones so obviously that there’d be blowback even from hard-core Apple cultists.

      Or, to put it a different way, if Apple wanted to defend that hill, the thing to do would have been to proliferate a bunch of Apple proprietary device interconnects so they could control the peripherals market for their devices. They didn’t.

      >Google’s best interests to prevent it from ever happening, as anything you can do on your own hardware is one more Web service they can’t sell you

      That’s even more mistaken. Google’s web services are correctly designed to have high penetration anywhere there’s a browser, whether that’s a tower PC or a bare smartphone or a smartphone with a high-quality monitor and keyboard plugged in. To a first approximation, Google doesn’t care what your eyeballs are in front of. To a second approximation, Google wants to obsolesce PCs with phones, because they get just a little more control over the channel that way.

      >You die. Dump the screen? (y/n)

      LOL.

  72. I will hazard a guess as to why MS gave up on a mobile OS: they don’t need one. I am currently banging refresh on a browser tab pointed at the tracking data for a shipment that includes a Surface Pro 4 (not a Surface Book). This is a tablet-form-factor device that runs straight-up Windows, and therefore all the things I use Windows for (gaming, and one spreadsheet with a complicated Solver setup that won’t work in OpenOffice or Google’s spreadsheet without more time spent than I care to spend). Moore’s Law and improved battery tech rescued them from needing to have a cut-down OS. They have Office available for both Android and iOS now, and there are Android emulators for Windows already. And, allegedly, the “next” Xbox will be “Windows compatible.”

    Prediction: nVidia and ATI/AMD will shortly come out with reference designs that will have 2x USB Type-C (so you can input on one and power on the other) and Nx HDMI ports and/or Nx USB Type-C for output, a “desktop-grade” GPU and active cooling in a package no larger than a pair of cigarette boxes. (This is a sucker bet; under my desk I have a Lenovo that crapped out a week after the warranty expired; in its optical drive bay is a module that contains an nVidia GPU and active cooling that enabled the laptop to run SLI).

  73. > It sure is. So what? Given interconnects like USB C, Apple can’t stop peripherals vendors from creating symbiotic hardware that will replace the peripherals on a Mac around a smartphone or tablet core – not without crippling their tablets and phones so obviously that there’d be blowback even from hard-core Apple cultists.

    Features they could leave out support for without it being so obvious that there’d be blowback even from hard-core Apple cultists:

    – Mouse support
    – Use of an external video device as the primary screen
    – Keyboard support for navigation [rather than exclusively text input]
    – Multi-window multitasking.

    More likely, though, they’ll just adapt. You don’t think they can sell a slick laptop-shaped case that an iPhone slots into for a thousand bucks? Their keyboard is $99, their external trackpad is $129, and their 27″ monitor is $999. Knock off a few hundred for the smaller monitor, but add in the value of the huge external battery that’s going to be taking up the majority of the space under the keyboard.

    1. >More likely, though, they’ll just adapt. You don’t think they can sell a slick laptop-shaped case that an iPhone slots into for a thousand bucks?

      I think that’s exactly where they’re headed. Pushing USB C makes no sense otherwise.

      One reason is that as computers ephemeralize and get smaller, the big cost driver (and the margin potential) isn’t in the computing core any more. That’s certainly true if you look at the BOM for a smartphone; over 50% of it is the display.

      Now ask this question: if you’re Apple, would you rather sell slick industrial design that’s visible, or try to justify your high platform cost on features users cannot see and interact with? Yes, I do think Apple would rather sell uber-cool docking stations for mobile devices (and more mobile devices) than more computers.

      Thanks for prodding me to work this out. Yesterday I didn’t understand USB C as a strategic move. Now I do.

  74. I doubt they’ll be the market leader, but the idea’s already out there (it was the premise of the failed Ubuntu Edge), and they’ll be scrambling to catch up once Samsung releases one.

  75. Anyway, Apple knows that they’ve got millions of iPhone customers they can’t sell a mac to. And making the iPhone capable of acting as a full-sized computer, even with cheap non-apple peripherals, is going to mean those people don’t buy a computer and thus have more money to buy an iPhone at a higher price point.

  76. > – Mouse support
    > – Keyboard support for navigation [rather than exclusively text input]
    > – Multi-window multitasking.

    My uncle uses all of these heavily and owns nothing but Macs (except for a few tablets, a PS2, a Wii he doesn’t use, and a few Android phones). He’d get really pissed if these were taken away (he is a programmer).

  77. @EMF – the point is they can take them away (or not implement them, since they’re not supported now) from tablets and iPhones, so that people who want them have to buy a Mac.

  78. All I was saying, basically, was that having USB type C doesn’t automatically entail support for these things, and phones lacking them would not cause people to consider them “crippling their tablets and phones so obviously that there’d be blowback even from hard-core Apple cultists.”

  79. iOS doesn’t support mice right now: no OTG and no support for a BT mouse. This already annoys me mildly. Android, meanwhile, does both; and while USB OTG is basically a nerd toy right now, it is out there.

  80. @esr:
    >That’s even more mistaken. Google’s web services are correctly designed to have high penetration anywhere there’s a browser, whether that’s a tower PC or a bare smartphone or a smartphone with a high-quality monitor and keyboard plugged in.

    But on any platform they control, they want to cripple its usefulness for standalone computing as much as possible without losing market share in order to force people to the browser and to apps with backends on the Google cloud. A few years ago, before they were working on Ubuntu Phone, Canonical had a project called Ubuntu for Android, which would have had the Ubuntu and Android userlands running simultaneously on the same kernel. If it were really in Google’s interests to turn smartphones into general computing platforms, they would have partnered with Canonical on that project and pushed it *hard*, as they would have been first to market with a phone running an existing desktop ecosystem with tons of existing applications.

    1. >But on any platform they control, they want to cripple its usefulness for standalone computing as much as possible without losing market share in order to force people to the browser and to apps with backends on the Google cloud

      Yes. Now explain to me why this doesn’t give them an incentive to obsolesce tower PCs and even SFFs in favor of docked Google phones.

      Remember, the question is whether computing cores are going to shift from being mostly things recognizably descended from tower PCs with dedicated large peripherals (a category including current SFF systems like my Jetway or the NUC) to smartphone-like things occasionally docked to large peripherals. It seems to me that you are now arguing my case for me.

  81. >Yes. Now explain to me why this doesn’t give them an incentive to obsolesce tower PCs and even SFFs in favor of docked Google phones.

    Because the things they need to do to obsolesce PCs would result in people doing more local computing and less computing in the Google cloud with their phones.

    1. >Because the things they need to do to obsolesce PCs would result in people doing more local computing and less computing in the Google cloud with their phones.

      I don’t see how that follows at all. By hypothesis those things are now being done on PCs anyway. The PC-to-phone move is one from a platform Google has less control of to one it has more control of. It can hardly hurt.

  82. >I don’t see how that follows at all. By hypothesis those things are now being done on PCs anyway. The PC-to-phone move is one from a platform Google has less control of to one it has more control of. It can hardly hurt.

    Well, part of it, I think, is that *we* realize those things are necessary if phones are to obsolesce PCs, whereas Google may be overconfident about the degree to which its existing model already obsolesces the desktop.

    Maybe I’ve misjudged their motivation, maybe not. The fact remains that, whatever their reasoning for thinking otherwise, if they really thought it was in their interests to obsolesce the PC, you would *already* be able to walk down to your local cell phone shop and buy a dockable phone running Ubuntu for Android.

  83. Ian Argent,
    >I will hazard a guess as to why MS gave up on a mobile OS: they don’t need one. I am currently banging refresh on a browser tab pointed at the tracking data for a shipment that includes a Surface Pro 4 (not a Surface Book). This is a tablet-form-factor device that runs straight-up Windows, and therefore all the things I use Windows for (gaming, and one spreadsheet with a complicated Solver setup that won’t work in OpenOffice or Google’s spreadsheet without more time spent than I care to spend). Moore’s Law and improved battery tech rescued them from needing to have a cut-down OS.

    What a joke – 12.3″ tablets are “mobile” enough? I do think tablets will do better than they are now, but 11″ and up is too big for a tablet. I predicted a year before the original iPad that the magic middle for tablets would be 7″, and the great sales of the Nexus 7 and other mid-size tablets a couple years ago seemed to prove me right… till the market shifted to even smaller 5.5″ phablets.

    Those 4-7″ devices are driving sales, and that smaller mobile hardware is powerful enough to do everything you mention, with only the software lagging behind. That means the software _will_ move to mobile – it’s only a matter of time (as seen with the Termux app I linked above, which only launched last summer) – and the absence of Wintel on mobile dooms them.

    >They have Office available for both Android and iOS now, and there are Android emulators for Windows already. And, allegedly, the “next” Xbox will be “Windows compatible.”

    MS can stick around for a bit with that Office mobile escape hatch, as can Intel with their strength in servers, but neither will last. An Android emulator is useless without apps, which means sideloading pirated apps, as google isn’t going to provide the Play Store. As for the Xbox, yes, that’s the solution: another failed platform that’s being gobbled up by mobile convergence.

  84. There’s no reason the Surface Pro 4 has to be any bigger than it is for its screen size; and looking at the product line, it looks like they have a 7″ Surface Pro running Win10 “desktop.” I was looking for something I could read 8×10 gaming PDFs on, a task which is moderately painful on a 7″ tablet. So they have Windows on mobile already, at least down to the 7″ tablet size.

    As for the emulator: Bluestacks runs the Google Play Store and (while I haven’t tried it yet) probably runs the Amazon one. No sideloading necessary.

  85. Just checked, Bluestacks will cheerfully run the Amazon Underground app and install apps from there as well. I’m not thrilled by their monetization scheme.

    And, for that matter, I thought I saw that the current Android dev kit had a fully functional emulator included now; intended for troubleshooting, but still. It’s in Google’s best interests to ensure that there is a consumer-usable Android emulator with full functionality available for Windows, for reasons that ESR has already pointed out. MS would be against it, but they can’t really do anything about it for fear of blowback.

  86. Ian, there’s no 7″ Surface Pro; maybe you’re thinking of another product. And I believe they force Win 10 Mobile on screens below 8″, ie no full desktop, only Continuum. As for Bluestacks, maybe my info is out of date. Googling around, I see that people were sideloading the Play Store up till a year ago, but there are now screenshots of an official install, so maybe google _did_ allow it at some point. Not that that really helps MS much, as having to run apps from another mobile platform on your Win 10 mobile or desktop is hardly a ringing endorsement of Windows!

    Forget emulators, which are a sideshow as most users are not going to set one up, what’s really interesting are the rumors that MS canceled their Android bridge, Astoria, _because it worked too well!_ Apparently MS management didn’t like that devs could just run their Android apps unmodified on the mooted Android subsystem for Windows, unlike the iOS bridge that requires some modification, so they canceled the Android bridge and shipped some of it as the recent “bash/apt for Windows.”

    Anyway, 4-7″ devices are where it’s at, and Microsoft and Intel, the desktop PC winners, have no presence there. That doesn’t bode well, as the recent Intel layoffs show. Others in this thread have said Apple will defend its Mac lineup, but I don’t see that when Tim Cook says he switched out his Mac for an iPad Pro. Apple appears to be all for iOS cannibalizing the Mac, and google seems headed in that direction too, with hopes that Android N’s full multi-window implementation, which was recently discovered, will help eat further into PC sales. Both are headed after the PC market, and with their much greater mobile sales, they have the clout to take Wintel out.

  87. Is there a technical reason MS can’t slide the guts of a Surface Pro 3 or 4 under a 7″ screen? Or even a Surface (non-pro)? As near as I can tell from a quick google and having fondled a couple in an MS store for > 15 minutes, they all run “desktop” Win 10. (I suppose it depends on what the difference between Win 10 and Win 10 Pro is).

    Anyway, what they’re doing now isn’t the point of my posts – the point was that MS no longer has to have a bifurcated OS product; their “desktop” grade OS runs acceptably well on mobilizable hardware. That opens up their options a lot. Whether they’ll take advantage of that or not, well, I’m just some guy on the internet.

    Given that I just this hour installed Bluestacks, and then installed Google Play Movies so I could have my movies available offline (since Big G won’t allow you to do anything but stream to Windows), I think I’m well aware of the availability of the Play store on Bluestacks. It’s also a lot more consumer-grade than it was even a year ago (the last time I poked at it).

    Everything I’ve read, and from talking to actual users of the devices, says the MS phones are slick pieces of kit, but critically crippled by the lack of software available for them and a lack of hardware variety. MS appears to be ceding the handheld (<5″ screen) market, and does not currently have a <=7″ device (smallest is a 10″) – but they opened up Office to both iOS and Android. (Side note – spreadsheets are damn near impossible to actually use on a <7″ screen, and I’m not all that happy with the experience on a 7″ screen, either.)

  88. >Wait, how does OSX even Unix with a case-folding root FS?

    The same way Cygwin or MinGW do it — preserve but ignore case. Works in 99% of cases; the remainder (e.g., distinct files called ‘makefile’ and ‘Makefile’ in one directory) are pathological.

    In fact the remarkable ease with which case-insensitive filesystems work, even given tools which are built to assume case-sensitivity, shows what a bad decision case-sensitivity really is. But then, Unix is a clownshow of bad decisions — no standardized, well-supported asynchronous I/O primitives, no standard semantics for how and when writes to files are committed to disk, a CLI from the seventh circle of hell — and case-sensitive motherfucking filenames. Kudos to Apple for bringing a bit of sanity (even if it folds case wrong for some international character sets).

    1. >Works in 99% of cases; the remainder (…) are pathological.

      Oh ho ho ho ho. If there were a gallery of “famous last words just before you discover a shortcut was incredibly wrong“, these would be prominent in it.

  89. @Jeff Read: the lack of round-trip byte-for-byte preservation of filenames is a serious HFS+/MacOS issue – that is what you get with case-folding (even case-ignoring folding that preserves case).

  90. > the lack of round-trip byte-for-byte preservation of filenames is a serious HFS+/MacOS issue – that is what you get with case-folding (even case-ignoring folding that preserves case).

    Eh? Biggest annoyance I’ve seen is .DS_Store files showing up everywhere, not to mention resource forks. No actual changing of filenames (the : turning into / is just an issue in GUI stuff, the filename is stored with a colon.)

  91. I just ran into the case-insensitivity issue recently – I was trying to mirror my Linux home directory to my Mac, including a Python source tree, which contains a directory called Python and a built executable called python.

    It’d be nice to have an easier way to turn it off than remaking the filesystem.
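
    Checking for collisions ahead of time is easy enough to script. Here’s a minimal sketch in Python – the root path is a placeholder, and casefold() is Python’s Unicode-aware lowercasing, which only approximates what any given filesystem actually does:

        import os
        from collections import defaultdict

        def case_collisions(root):
            """Yield directories holding names that would collide on a case-folding FS."""
            for dirpath, dirnames, filenames in os.walk(root):
                groups = defaultdict(list)
                for name in dirnames + filenames:
                    groups[name.casefold()].append(name)
                for names in groups.values():
                    if len(names) > 1:
                        yield dirpath, names  # e.g. ('.../cpython', ['Python', 'python'])

        for dirpath, names in case_collisions(os.path.expanduser("~")):
            print(dirpath, names)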

  92. @EMF: it is not a matter of user interaction, but of programming interface: what you get from readdir is not what you saved the file as (NFD vs NFC form for Unicode / UTF-8).
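
    A minimal Python illustration of the mismatch (the normalization calls are standard unicodedata; exactly which decomposed variant HFS+ hands back is its own flavor of NFD):

        import unicodedata

        nfc = "caf\u00e9"                        # "café" with one composed code point
        nfd = unicodedata.normalize("NFD", nfc)  # "cafe" plus a combining acute accent

        print(nfc == nfd)                        # False: different code points
        print(nfc.encode("utf-8"))               # b'caf\xc3\xa9'
        print(nfd.encode("utf-8"))               # b'cafe\xcc\x81'

        # A filesystem that stores decomposed names can hand back different
        # bytes from readdir than the ones passed to open(); byte-for-byte
        # round-tripping then requires normalizing before comparing.
        print(unicodedata.normalize("NFC", nfd) == nfc)  # True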

  93. > In fact the remarkable ease with which case-insensitive filesystems work, even given tools which are built to assume case-sensitivity, shows what a bad decision case-sensitivity really is.

    Making a filesystem case-insensitive adds complexity. Every attempt to open a file must first determine whether a file with the same name, but potentially different case, already exists. So either you have to store the name twice (once preserving case and once canonicalized to UPPER, lower, or perhaps Proper case) in the directory itself, or canonicalize the names when reading the directory into memory, in order to be able to have something consistent to search. If you confine yourself to ASCII, you can use some bit-masking tricks to make things simpler, but once you go past that, good luck.

    All of this overhead every time you need to open a file. No thanks. I’d rather explicitly specify case insensitivity when I want that “feature”, not be stuck with it at all times.
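
    To make that concrete, here’s a toy sketch in Python of the canonicalize-on-lookup scheme described above (the class and method names are hypothetical; a real filesystem does this work in the kernel on every open):

        class Directory:
            """Case-insensitive but case-preserving name table."""

            def __init__(self):
                self._index = {}            # casefolded key -> name as originally given

            def create(self, name):
                key = name.casefold()       # Unicode-aware fold, not just ASCII bit-masking
                if key in self._index:
                    raise FileExistsError(self._index[key])
                self._index[key] = name     # original case preserved for listings

            def lookup(self, name):
                return self._index[name.casefold()]  # an extra fold + hash on every lookup

        d = Directory()
        d.create("Makefile")
        print(d.lookup("MAKEFILE"))         # -> "Makefile"
        try:
            d.create("makefile")            # the pathological case from upthread
        except FileExistsError as clash:
            print("collides with existing", clash)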

  94. > It’d be nice to have an easier way to turn it off than remaking the filesystem.
    Fortunately, it’s really only flipping a few bits in the superblock.

  95. >@EMF: it is not a matter of user interaction, but of programming interface: what you get from readdir is not what you saved the file as (NFD vs NFC form for Unicode / UTF-8).

    The NFD/NFC thing doesn’t have anything to do with case folding, really – you can have either without the other. And anyway, I thought people who needed to smuggle arbitrary (well, almost arbitrary) byte sequences that have no textual meaning in filenames were a myth – what’s your actual use case?

  96. Everything is in the cloud – despite better judgement about PII, business-critical data, and PCI data – and there’s no reason to carry around anything other than a suitable terminal to access it. Our entire development environment is virtualized between VMware and AWS, giving us development to deployment in a hybrid cloud from soup to nuts. No need for anything other than a nice MacBook that ages well and has a crisp display to access the RDP sessions. In fact, the only thing I would splurge on is an NVIDIA-based version of the bitcoin miner I built out of Radeon cards but never used, to play with deep learning. Nothing like being able to kick off a build in style from a phablet.

  97. Ian,
    >Is there a technical reason MS can’t slide the guts of a Surface Pro 3 or 4 under a 7″ screen? Or even a Surface (non-pro)? As near as I can tell from a quick google and having fondled a couple in an MS store for > 15 minutes, they all run “desktop” Win 10. (I suppose it depends on what the difference between Win 10 and Win 10 Pro is).

    Heat? There must be a reason almost no 7″-or-smaller device has a Core m, let alone an i3, with nearly all opting for Atom instead. All the devices you mention run Win 10 desktop because their screens are not smaller than 8″.

    >Anyway, what they’re doing now isn’t the point of my posts – the point was that MS no longer has to have a bifurcated OS product; their “desktop” grade OS runs acceptably well on mobilizable hardware. That opens up their options a lot. Whether they’ll take advantage of that or not, well, I’m just some guy on the internet.

    Apparently it doesn’t, either because Intel doesn’t cut it for mobile or because MS won’t make the full desktop available for ARM, only Continuum.

    >Everything I’ve read, and from talking to actual users of the devices, says the MS phones are slick pieces of kit, but critically crippled by the lack of software available for them and a lack of hardware variety. MS appears to be ceding the handheld (<5″ screen) market, and does not currently have a <=7″ device (smallest is a 10″)

    Then they’re making the same mistake as Intel, ignoring that the mobile market will come for them next. Unless they’re ready to throw in the towel on Windows client revenue, which may be the case, considering they’re upgrading consumers for free.

    >but they opened up Office to both iOS and Android. (Side note – spreadsheets are damn near impossible to actually use on a <7″ screen, and I’m not all that happy with the experience on a 7″ screen, either.)

    Yep, never tried it but I can’t imagine using Office on such a small screen, other than for some one-off viewing or editing. Rather, you’ll set your smartphone down on a dock connected to a large monitor and use a desktop UI instead, just as some people currently dock their laptop at a larger monitor at their work desk.

  98. I have a Dell Venue 8″ tablet based on an Atom part that runs Windows 8 desktop. Presumably it can be upgraded to run Windows 10 desktop.

  99. I believe it can, as it barely clears the 8″ threshold I keep referring to. Anything smaller than that (ie the vast majority of devices sold) and you’re forced onto Win 10 Mobile, with no full desktop.

  100. Don’t confuse interface with OS. Forcing the interface to the “mobile” version to fit smaller screens is a usability decision.

    It might be a heat issue that’s preventing smaller devices with the Surface Pro internals. I didn’t realize it until I started using it full time, but the Pro 4 has fan cooling. OTOH, my Android gets almost painfully hot if I run it full bore for very long (gaming, mainly) or use the fast charger.

  101. I’ve used Windows 8.1 in Desktop mode on a 7″ tablet, both bare and with an external mouse and keyboard. It’s just fine with HID and at least usable with the touchscreen. Disabling it in Windows 10 is a jerk move. You’re sure there’s not a registry tweak to override Tablet mode?

  102. >Don’t confuse interface with OS.

    Tell that to Microsoft, they’re the ones who seem to be confused. :)

    >Forcing the interface to the “mobile” version to fit smaller screens is a usability decision.

    Except that the “mobile” interface is now available on all of Windows 10, including on desktop PCs. The difference is that they’re not making the full desktop available for docked mode in Win 10 Mobile, only Continuum.

    >It might be a heat issue that’s preventing smaller devices with the Surface Pro internals. I didn’t realize it until I started using it full time, but the Pro 4 has fan cooling. OTOH, my Android gets almost painfully hot if I run it full bore for very long (gaming, mainly) or use the fast charger.

    Also power: even Atom 2-in-1s will draw between 9 and 15 W, which is death for the smaller batteries in 4-7″ devices.

    >I’ve used Windows 8.1 in Desktop mode on a 7″ tablet, both bare and with an external mouse and keyboard. It’s just fine with HID and at least usable with the touchscreen. Disabling it in Windows 10 is a jerk move. You’re sure there’s not a registry tweak to override Tablet mode?

    I think such small Win 8 devices were in limbo, as MS didn’t say whether, or to which version of Win 10, they’d be upgraded. I don’t know if they’ve since said so, but if it’s Win 10 Mobile, the full desktop would be disabled. I don’t know about registry tweaks, but the ability to run desktop apps was found buried deep even inside Windows RT, so maybe it still is here too – though most users are not going to dig that deep.

  103. The Atom 2-in-1 link got swallowed somehow, here it is again. Look at the table under “Power Consumption” and compare the Lenovo with an Atom chip to the much more efficient ARM chips, which also do much better on the computational benchmarks above.

  104. >I think that’s exactly where they’re headed. Pushing USB C makes no sense otherwise.

    Thinking docking stations when it comes to USB-C is a bit short-sighted. What they’re really after is a single “no-legacy” connector per device. And by “no legacy” I mean no 3.5-mm headphone jack.

    The reason? DRM.

    The record companies — and the Princes of the world — would just love it if audio DRM made a roaring comeback. The main purpose of getting rid of it in the first place was to bootstrap the market for digital music. Now that digital music is commonplace, and primarily accessed via hand-portable devices with rights management fused into the OS and hardware, DRM for songs becomes much more feasible without alienating the consumer base. And the next Prince that comes along would much rather lock their output in a vault never to be released (as Prince himself has done with an unknown quantity of his work) than release it into an environment where stealing is easy and commonplace.

    So a device vendor that can offer rights-management guarantees including an effectively closed analog loophole will have a competitive advantage over one that can’t. Dumbphones are dead; the device manufacturers live or die by the content that’s available on their platform. No DRM will mean that artists and publishers will withhold their content.

  105. Er, closing the analog loophole is literally physically impossible. The best you can do is force a quality loss (the proverbial camera-pointed-at-screen) or embed an invisible/inaudible signal that will cause recording devices which recognize that signal to refuse to record (e.g. Macrovision).

    All you’ve got to do with your hypothetical DRM-compliant USB headphones is gut them and use the wires going to the speaker driver as an analog output. And even if it can detect that (not technically impossible, it is an electrical change), they can’t detect being duct taped to a microphone.

  106. >Er, closing the analog loophole is literally physically impossible… All you’ve got to do with your hypothetical DRM-compliant USB headphones is gut them and use the wires going to the speaker driver as an analog output. And even if it can detect that (not technically impossible, it is an electrical change), they can’t detect being duct taped to a microphone.

    Another story about the late Prince, this time from Kevin Smith:

    And she’s like “Kevin, let me explain something to you about Prince. I’ve been working with Prince for many years now, and I can’t go in there and tell him that you can’t shoot this documentary.”

    And I’m like, “Why?”

    And she’s like “Because Prince… doesn’t comprehend things the way you and I do.”

    And I was like, “What do you mean?”

    And she was just like “Well, Prince has been living in Princeworld for quite some time now.” She’s like “So, Prince will come to us periodically and say things like ‘It’s 3 in the morning in Minnesota. I really need a camel. Go get it.’ And then we try to explain to Prince, like, ‘Prince, it’s 3 o’clock in the morning in Minnesota, it’s January, and you want a camel. That is not physically or psychologically possible.’ And Prince says, ‘Why?'”

    And I’m like “What is he, being an asshole?”

    And she’s like “No, he’s not malicious when he does it. He just doesn’t understand why he can’t get exactly what he wants. He doesn’t understand why someone can’t process a simple request like a camel at 3 in the morning in Minnesota.”

    The content industry doesn’t comprehend things the way you and I do. And if you’re dependent on them to move devices, as Apple and the Android manufacturers certainly are, you don’t tell them it’s impossible to build unbreakable DRM with a closed analog hole without making a best effort because you are then at risk of losing a substantial amount of revenue.

    And that’s the other thing about DRM: the goal of DRM, perfect protection, is impossible; yet best effort DRM gets us pretty far in terms of eliminating piracy because it artificially elevates the cost of piracy above the cost of getting it legitimately. Technically savvy pirates are going to cut out the speakers on their USB-C headphones, record songs straight off the analog line, and upload them to darknet servers. But those people are vanishingly small in number. Most listeners wouldn’t know what to do to get the analog signal off their headphones.

    So DRM is actually doing a pretty damn good job at its stated goal. Perfect closure of the analog loophole is impossible, but closing it enough to make piracy extremely difficult will be reflected in the revenues of digital music sales and will make record execs happy regardless.

    Point being, it doesn’t matter if you think DRM is right or wrong. It doesn’t matter if DRM can never achieve the goals desired by its proponents. It will be a fact of life, like death or taxes.

  107. See, the number of people who can figure out how to turn the signal from their 3.5mm jack into a digital file in the first place is small enough that I’m not at all convinced it doesn’t overlap heavily with the people who could hack DRM-compliant headphones themselves – or at least follow a YouTube tutorial on how to do it. (The tutorial of course won’t openly be about enabling piracy; it will be about turning a pair of cheap DRM-compliant USB headphones into an “adapter” so you can keep using your pre-existing analog headphones.)

    Or on how to build a soundproof box you can put your phone and a microphone in while playing your songs over the speakers. No one, not even in the content industry, is going to believe that people can’t do that. And with the loophole so obviously still open, antagonizing people by not letting them connect to pre-existing analog audio output devices is not worth it; they’ll focus their efforts on catching people who distribute pirated files instead.

  108. Re: “And if you’re dependent on them to move devices, as Apple and the Android manufacturers certainly are, you don’t tell them it’s impossible to build unbreakable DRM with a closed analog hole without making a best effort because you are then at risk of losing a substantial amount of revenue.”

    Let me quote Steve Jobs: “When we first went to talk to these record companies – about eighteen months ago – we said, ‘None of this technology that you’re talking about’s gonna work. We have Ph.D.s here who know the stuff cold, and we don’t believe it’s possible to protect digital content.’”

  109. Jeff, your analysis is hilariously bad. All it takes is for one person to break the DRM and upload the pirated audio or video and it’s everywhere. “Best effort DRM” is just as useless as any other DRM, which is why Apple abandoned it for their music years ago.

    The music business has largely moved to a model where they give the mp3s away for free, so you go watch their high-priced concerts, a razor/blade business model. Prince was particularly averse to this because he used to make a lot of money from record sales, but most of the high-paid musicians have made the transition.

    I don’t know where you get these conspiracy theories that they’re still bent on DRM. Apple’s rumored move to ditch the analog headphone jack in the next iPhone? That’s because of the industry race to make the smartphone as thin as possible, even ditching battery size and life to get there, not any analog hate or DRM love.

  110. There will have to be commonly available USB-C-to-minijack converters for at least the next 10 years, to support the ability to listen in cars via aux input.

    And for that matter, Bluetooth still exists and has commonly available analog out.

  111. > even ditching battery size and life to get there, not any analog hate or DRM love.

    Which is stupid. I’d rather have a good battery in a cm-thick phone than a crap battery in a 5-mm phone. Also it’s nice to be able to charge and listen to music through different ports. Which means a USB-C M/USB-A M/3.5mm F cord; ye gods.

  112. > Which is stupid. I’d rather have a good battery in a cm-thick phone than a crap battery in a 5-mm phone.

    Well, that’s what external batteries are for. Maybe there will be a “case” that consists of an external battery plus a USB hub plus non-USB breakouts.

  113. >I’d rather have a good battery in a cm-thick phone than a crap battery in a 5-mm phone

    You are not in the center of the bell curve for the market, apparently. (Neither am I – I prize runtime over almost every other feature). OTOH, external batteries are cheap and commonly available for folks who are out of range of a charge port, and it’s not hard to drop a charger every place you are these days.

    If they can sort out some of the physics challenges in contactless charging (or if they separate charging from data carrying and use magnetically-attached charge cables like my Pebble’s, which is rated against 5 atm), then it becomes much easier to seal the entire case and call it water-resistant to 2-3 atm. Use BT and WiFi for data (including audio out) and there you go.

  114. How well does BT deal with congestion? If there’s a bus full of people all listening to stuff on their bluetooth headsets is it still going to work fine?

  115. >How well does BT deal with congestion? If there’s a bus full of people all listening to stuff on their bluetooth headsets is it still going to work fine?

    I work in an environment where there’s a LOT of BT headsets, though they’re not all “in use” at any one time.

    A bit of random clicking on Google links suggests that BT-on-BT interference is not all that big a concern; the channels are narrow, the allocated band is wide, the devices are low-power, and there is effective collision avoidance. It looks like, at least in theory, 802.11 can crowd out BT (as can anything else in the 2.4 GHz band).

  116. >Well, that’s what external batteries are for. Maybe there will be a “case” that consists of an external battery plus a USB hub plus non-USB breakouts.

    I use a Mophie case/battery for my iPhone to effectively double my battery capacity. It conveniently has the Lightning connection inside to the iPhone, but on the outside it has a μUSB for power, which allows me to use the ubiquitous charging cables I have in various places. Should Apple do away with the headphone jack, I am confident that Mophie will stick one on their cases to provide the same kind of compatibility with legacy audio as with legacy power. They might even stick both μUSB and Type C on the outside of the same case until the latter becomes sufficiently established.

  117. Golf,

    More like Intel’s push to get all handset manufacturers off 3.5mm and onto USB-C. Intel is jockeying to sell USB chipsets; if they can go to BMI, ASCAP, Sony, Comcast, Warner, etc. and say “we’ve plugged the analog hole” they can create a whole new market to tap as handset manufacturers are forced to upgrade to make their platforms compliant with what Big Music demands.

    It doesn’t matter that the analog hole cannot actually be plugged. If Intel marketing can convince Big Music that Intel (or Intel-licensed) kit offers better media protection, Big Music can demand Intel(-licensed) kit of handset manufacturers if they want to get music on their mobile platforms. It’s like “security theater”. And the musicians themselves will side with Big Music on the issue. Prince’s POV was congruent with that of literally every small-time musician I’ve met: as much of the shaft as musicians have gotten from the record companies, they’ve gotten fucked harder and deeper by the easy availability of downloadable and streaming music.

    Besides which, any web site which shows a person how to tap the analog outs on their USB headphones in order to record music is automatically trafficking in circumvention devices — a federal felony under the DMCA.

  118. @Jeff

    >Apple and the Android manufacturers certainly are, you don’t tell them it’s impossible to build unbreakable DRM with a closed analog hole without making a best effort because you are then at risk of losing a substantial amount of revenue.

    Guess I shouldn’t believe my lying eyes and memory about the DRM-free iTunes Music Store.

    >Prince’s POV was congruent with that of literally every small-time musician I’ve met

    I don’t know who the hell you’ve been talking to, but this matches nothing I’ve seen. Believe it or not there are musicians who grew up within the last 50 years, and understand on some level that the Big Record era is done. Even if they didn’t, the very nature of IP guarantees that they will lose, and I’m trying to be generous in considering IP a real thing.

    >Besides which, any web site which shows a person how to tap the analog outs on their USB headphones in order to record music is automatically trafficking in circumvention devices — a federal felony under the DMCA.

    You are right, that has worked so well for the last 17 years. It is utterly impossible to find any illegal material.

    Maybe, just maybe, if you didn’t start with (Axiom Prime = Biggest Guy On The Block Always Wins All The Time) you would occasionally not make such a fool of yourself.

    As usual Jeff, you are full of shit.

  119. Jeff, you keep going on about the analog hole, but my point is that DRM itself is easily broken, forget the analog hole. Go to any torrent site and you will see hundreds of videos taken from iTunes, which Apple still DRMs. It must be easily cracked, as all the original digital copies are available there, no analog hole necessary.

    As for the notion that musicians are getting “fucked harder and deeper,” that is one viewpoint, an extremely entitled one. The twentieth century and its tech just happened to be an ideal environment for popular musicians: broadcast technology like records or CDs was perfectly geared for content that is recorded once but so popular that you can sell millions of copies of the same content, ie a hit record. Note that that was not a great environment for a “small time musician,” as all the local live bands got replaced by jukeboxes and stereo systems cranking out the hits.

    Well, that broadcast tech has been replaced by the internet, which favors services or niche and custom content. Media tech has merely changed, so that music is not the favored content anymore. Musicians have responded by turning music into a service, ie give the mp3s away as marketing in return for high-priced tickets at a live concert, and to a lesser extent paid streaming music. Niche may work somewhat, as there are a lot more music styles now, but I don’t see how music can go custom.

    So you have this situation where music had some perfect traits to move vinyl records or plastic CDs, but not to make money online. That’s life, that’s how tech changes. If some musicians’ plan is to head back into the past, it’s not going to work.

  120. I’m just a user, not a programmer, but I recognised the hardware evolution described in the post and surprisingly discovered I’ve been going down the same road in the Mac world. (I’m Mac because I like the photo software.)

    When my first Mac, a mid-2011 Mini, fried its Thunderbolt/display port, I realized having a single Mac was more precarious than I thought, with a major lecture nearing completion on the machine. I got it going by using the HDMI port and by removing the raw HDD from the dead Thunderbolt case and plugging it into a SATA caddy. I quickly realised I didn’t want to buy another specced-up Mini where I had to pay Apple prices for an SSD, and there is no way I would buy an iMac, where the best drive available is a Fusion and you need windshield suction cups to remove the screen to get at it. Both of those alternatives, I now see, are overpriced midsize computers.

    One solution was a well-specced MacBook Pro. But there was another alternative – build a beast. Essentially the top-specced Haswell build on TonyMac86 with the unlocked K i7, a terabyte Samsung SSD, 32 gigs of RAM, and as much graphics card as I need, all in a nice big, easy-to-work-on cube case. I just built a matching dolly for it so I can roll it outside when it’s time to blow it out with the air compressor. El Cap thinks it’s a Mac Pro, and for all intents it is, and far, far better value than the trash-can Pro.

    Since building that, I bought the new 9.7-inch iPad Pro, happy to pay the premium for the Pencil interface on a really fast, advanced tablet with all sorts of goodness to come on the iOS app store. Then the solution to my portable needs became clear. With a Beast at home, I can buy the lightest notebook for travel, and the clear choice is the MacBook, because it runs my desktop photo software well enough to prove that I am getting good shots when travelling. Again, I happily paid the premium Apple price, because the MacBook is probably the lightest, highest-quality notebook you can buy. It is a cutting-edge piece of equipment, unlike either the Mini or the iMac, which are crippled, overpriced, obsolescent mid-sized computers.

    I still need Windows to run two legacy scanners which cost the better part of $2000 between them but have no Win 10 drivers. I bought one of those little blue $199 HP Minis and picked up a 60 GB M.2 SSD from China for $30, on which I will put XP and mount it behind a small monitor to run the scanners. Again, no need for a mid-sized machine taking up space and eating electricity.

    So yeah, I’ve found my own way down the same path as discussed here.

  121. I do not believe that midrange computers are dying. Even now, the MacBook is one of the leading laptop lines and nothing stands in its way; it is doing great business everywhere and people still want to own it.

