World Without Web

Technological change has a tendency to look inevitable in retrospect – “It steam-engines when it’s steam-engine time.” Likely this is true in many cases, but I often think we underestimate the alarming degree of contingency lurking behind ‘inevitable’ developments. To illustrate this point, I’m going to sketch an all-too-plausible alternate history in which the World Wide Web never happened.

The divergence point for this history is in 1983-1984, when the leadership of DARPA lied through its teeth to Congress about who was being allowed access to the Internet. The pretense being maintained was that only direct affiliates of government-sponsored, authorized research programs were being allowed on. DARPA knew very well this wasn’t true; they had a broader vision of where the Internet might lead, one that wouldn’t be realized for another ten years. They viewed the yeasty, chaotic culture of casual use by all manner of ‘randoms’ (unauthorized people including, at the time, me) as what would later be called a technology incubator – a vast Petri dish from which amazing serendipities would spring in due time.

This optimistic view was entirely correct. One such serendipity was the invention of the World Wide Web; another, though the causal connections take a bit more work to trace, was the emergence of open-source software as a conscious movement. But what if DARPA had been caught in that lie, funding for its network research scaled back, and a serious effort made to kick randoms off the early net?

It seems all too likely that internetworking research would have stalled out or reverted to the status of an academic toy and laboratory demonstration. There was increasing demand for wide-area digital communications at the time, but it was being mostly met by pre-Internet timesharing services like CompuServe, AOL, and Genie. Those are barely remembered now because the Web steamrollered them flat in the late 1990s – but the Web depended on the TCP/IP stack and internetworking. Without internetworking, no Web, and without that…

Welcome to a world of walled gardens. Your digital universe is a collection of competing fiefdoms run by CompuServe, AOL, Genie, and later entrants that came into the fray as demand rose, many of them run by big media companies. Each network has its own protocols, its own addressing conventions, and its own rigidly proprietary access software. You get the services they choose to offer and that’s it – there’s no end-to-end, no access to the bitstream.

You can only do the equivalent of email and instant-messaging with people on the same provider you are using. Inter-provider gateways are buggy and often nonexistent – some providers think gateways make them more attractive to potential customers, others think they can shoulder smaller networks aside by making them relatively inaccessible. You see a lot of read-only gateways that allow you to pull messages and content from other providers but not export your own, and targeted providers frequently interdict these, viewing such one-way arrangements as leeching. People who use the nets heavily need to have half a dozen different accounts, sets of credentials, and email-address equivalents.

Any equivalent of user-controlled websites barely exists; they’re an expensive premium service not available on all networks, and subject to “acceptable use policies” that pretty much exclude any content the provider doesn’t like. And, again, they’re only viewable by others using the same access software from the same provider. There is no hyperlinking across providers. And certainly no search engines!

There may not even be hyperlinking within most of the walled gardens, because the whole model of a universal flat document space indexed by URIs never developed. A few scattered groups of visionaries like Ted Nelson and the Xanadu Project have the idea, but nobody else understands what they’re driving at.

Blogs? Forget about it. Again, something like a public-diary or mini-magazine publishing format may be available from some providers, but…no hyperlinks. And there are certainly no third-party blog engines like WordPress or Movable Type. Audiences are badly fragmented by the walls between providers, and providers exert heavy control over content; if you post something “offensive” on your magazine, your provider will protect its corporate reputation by shutting you down.

Gradually, over time, the smaller providers are merging or being squeezed out of the market. While this cuts down on the number of accounts serious net users need to have, it also means the content-controlling power of the big-provider oligopoly is becoming more difficult to evade. And the kinds of services available, far from broadening over time, are actually narrowing. A nostalgia for the less fettered early days is already developing, but it’s helpless – the big providers say the niche services are unprofitable resource hogs because not enough people want them enough to pay the add-on fees required to sustain them, and who can argue?

Even as late as 2011, if you suggested music or movie streaming you’d be dismissed as a loon; the bandwidth isn’t there, because the Internet boom and the big fiber build-out never happened. Networking gear is several generations less advanced, and evolving much more slowly because its market is orders of magnitude smaller. Only the Federal government and a handful of Fortune 50 corporations have fiber/coax backbones, and none of those can talk to each other. Ordinary joes have to deal with X.25 over copper and even worse. Even acoustic-coupler modems, a half-forgotten joke in our timeline, are still live technology in this one.

Smartphones? Google? Pandora Radio? File-sharing? Craigslist? Facebook? Dream on. It’s not just that the technological infrastructure can’t support these things, the conceptual infrastructure is absent. Well, we might have something vaguely like smartphones, but they’d be hardware instantiations of some single provider’s access software. Sealed boxes, no tethering or hotspotting. For that matter I’m far from sure there’d even be anything like WiFi yet in alternate 2011.

There’s a recognizable version of the hacker culture, but the population explosion of the 1990s never happened; it’s basically frozen in amber at about the stage when we were exchanging tiny source archives via USENET postings. There’s no Linux because there’s no net! Without cheap communications, the social engine needed to support large-scale open-source developments never spins up, and the open-source software catalog amounts to little more than a small range of toy programs. Spared the competitive pressure, proprietary operating systems and applications suck even worse.

The news isn’t all bad. There are still jobs for travel agents, and this future doesn’t have a spam problem; that may be the one single advantage of the provider oligopoly’s grip on online content. But compared to the Internet we have, the overall picture is pretty damn bleak.

If you think on-line advertising is obnoxious today, imagine what it would be like if the provider’s access software could shove whatever it chose at you at any time – no alternate browsers, no popup blockers, no escape. Hackers in this alternate history spend a lot of their time trying to write “universal” (cross-provider) clients with user-controlled filtering, but the providers view this as a threat to their business models and conduct an arms race, changing access protocols for greater ‘security’ as often as they can get their ordinary users to download client updates. Using a reverse-engineered client is a violation of your terms of service and can get your account canceled.

The few people trying to build more open public networks are widely dismissed as scruffy anarchists intent on creating havens for hate groups and child porn. But they’re doomed, anyway; the economic and technological base on which to erect their dreams simply doesn’t exist.

It could have been like this. The better outcome we got was not inevitable. Maybe, now, you’ll appreciate it a little more than you have.

148 comments

  1. Maybe it’s your initial mention of steam engines, but this alternate universe gives me a strong feeling of steampunk-ness. Oh, how far we’ve progressed in the last 25 years, that I’d compare no-IP to no-electricity. Almost every day something reminds me of how glad I am to have witnessed that. 8-)

    One gripe: the ending of the first paragraph should say “internet” instead of “world wide web”. The title, though, is way cool!

  2. > It seems all too likely that internetworking research would have stalled out or reverted to the status of an academic toy and laboratory demonstration. There was increasing demand for wide-area digital communications at the time, but it was being mostly met by pre-Internet timesharing services like CompuServe, AOL, and Genie. Those are barely remembered now because the Web steamrollered them flat in the late 1990s – but the Web depended on the TCP/IP stack and internetworking. Without internetworking, no Web, and without that…

    Minor point: There was no America Online until 1989, and the services that became AOL — Quantum Link, AppleLink and PC Link — didn’t begin operations until 1985.

    Now for my major point: your scenario largely ignores the BBS scene, including FidoNet, which by the 1990s had more than 20,000 nodes worldwide and — more importantly — continues operation today, mostly using Internet connections rather than dialup connections. There is no doubt in my mind that FidoNet would have had continued growth in the absence of the World Wide Web. And don’t forget UUCP, either — during that period, it was the only way I got Usenet and email access.

    (Note: I was a FidoNet SysOp in the late 80s/early 90s time period.)

    1. >Now for my major point: your scenario largely ignores the BBS scene

      Yes, fairly deliberately. BBSes and FidoNet didn’t have one of the most important properties of internetworking – a unified flat address space. No equivalent of URLs, and certainly no possibility of hyperlinks.

  3. > Technological change has a tendency to look inevitable in retrospect – “It steam-engines when it’s steam-engine time.” Likely this is true in many cases, but I often think we underestimate the alarming degree of contingency lurking behind ‘inevitable’ developments. To illustrate this point, I’m going to sketch an all-too-plausible alternate history in which the World Wide Web never happened.

    I kind of disagree. The long history of multiple discoveries (https://secure.wikimedia.org/wikipedia/en/wiki/Multiple_discovery; see the even more dramatic citations in ch7 of Kevin Kelly’s _What Technology Wants_) suggests we have underestimated the inevitability of technology & science. (There are enough *6*-person multiple discoveries that Dean Simonton could verify that the Poisson distribution of the 1-3 person multiples held out to n=6.)
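
    For a sense of the shape of that claim, here is a tiny illustrative sketch of the Poisson model of multiple discoveries; the rate parameter below is an arbitrary assumption chosen for display, not Simonton’s fitted value:

        from math import exp, factorial

        lam = 1.4  # illustrative rate only; Simonton's actual fit differs

        def poisson_pmf(k: int, lam: float) -> float:
            """Probability of exactly k independent discoverers."""
            return lam**k * exp(-lam) / factorial(k)

        # Expected share of k-person multiples under the Poisson model.
        # (The k=0 class -- discoveries nobody made -- is unobservable,
        # so empirical counts are compared against the k >= 1 tail.)
        for k in range(1, 7):
            print(f"{k}-person multiples: {poisson_pmf(k, lam):.3f}")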

    The constant fall of walled gardens which is a theme of Internet history also seems to suggest an inevitability. If there were no trend there, we would expect a lot of lucky holdout walled gardens on the right hand of the bell curve.

  4. I second Morgan’s comment. You are really citing a lack of links leading to no Linux? Srsly? There was shareware well before the Internet. Freeware too. And collaborative efforts did not begin with Linux, either. Hello, Unix! Open Source would have happened even without the Net. And there were already impulses in play that would bring down the cost of long-distance communications. Even packet-switched networks were looking at connecting together all those BBSes. If anything, we might have wound up with a more *efficient* “internet” because people wouldn’t dare bloat their systems as they do their pages today. An overarching structure would have developed out of necessity.

    1. >You are really citing a lack of links leading to no Linux? Srsly? There was shareware well before the Internet. Freeware too.

      I know. I was part of that culture. But without cheap communications, it didn’t scale.

    1. >Sounds like you like government after all.

      Remember, I attribute the takeoff of the Internet to DARPA deliberately lying to its political masters. This happened precisely because DARPA’s leadership knew it was impossible within the incentives operating on governments for them to tolerate the ‘randoms’ or the results DARPA foresaw.

  5. > You can only do the equivalent of email and instant-messaging with people on the same provider you are using.

    Mmmmm. In this world, the BBS culture never gets steamrollered by the Internet. The same demand that got the various commercial network providers building mail gateways IRL gets them setting up FidoNet access. Metcalfe’s law undercuts anyone who tries to keep FidoNet off their service; businesses take their ability to communicate with suppliers and customers seriously.

    1. >The same demand that got the various commercial network providers building mail gateways IRL gets them setting up FidoNet access.

      Er, no. There was enough time for this to have happened in our history in the decade after 1984, if the incentives worked as you claim. It didn’t, because they didn’t.

  6. Compuserve, AOL and Genie are basically glorified BBS’s in this scenario. It’s not clear whether they would want to interact with FidoNet/UUCP, but the network effect may make it worthwhile. Most likely, out-bound gateway access would be sold as a “premium service” and common standards would develop over time.

    We would then end up with something resembling our Internet, sans network neutrality and with telcos maintaining AOL/Facebook-like “walled gardens”. The open network would not be as popular as the telcos’ big gardens, but it would find a lot of use among students, academics and the like, allowing the FLOSS community to develop in some form.

    One big problem with this scenario is that BBS’s were dependent on the telephone network, so bandwidth was at a premium and international comms were wholly unfeasible; there was no independent data pipe or backbone. But perhaps some independent WANs could arise as infrastructure costs fall.

  7. I have to read the rest, but DARPA’s contribution was TCP/IP (Padlipsky, The Elements of Networking Style). ISO was playing with a 7-layer cake, of which X.25 did the first four, not really well. The ISO 7-layer model would be like having C++ before C.

    Unix elegance is a mindset, and TCP/IP follows that model. There was Windows, MFC, and now Silverlight (though I’m shocked that Win8 might use HTML5/JavaScript as a platform).

    As to the services – you forget the phones in the 1990s and before. You could call anyone else in the world with a phone (but it might cost money and require lots of numbers). Have you forgotten UUCP-based mail and newsgroups that PREDATED TCP/IP?

    Speaking of X.25, I had a service called PC Pursuit that let me dial in locally and dial out in any of 25+ major metro areas to BBSes or whatever was on the far end of a modem. Shortly after, I was on Michigan’s MERIT network, which supported multicast video. Can you name ANYTHING that uses multicast now (224.0.0.1 and beyond)? That would help streaming.

    Uniformity is not a good thing – Microsoft has been arguing just use Windows and Office because that is “universal”.

    I need to reread the article slowly, but these are my first impressions.

  8. I don’t buy this scenario as a long-run equilibrium; the demand for going across providers would just be too high. It’s like saying that each phone company would prohibit you from talking to customers of other phone companies.

    1. >I don’t buy this scenario as a long-run equilibrium; the demand for going across providers would just be too high.

      That incentive might get you mail and instant-messaging gateways, eventually. It probably wouldn’t get you hyperlinks.

  9. >I don’t buy this scenario as a long-run equilibrium; the demand for going across providers would just be too high. It’s like saying that each phone company would prohibit you from talking to customers of other phone companies

    You had long-run equilibrium scenarios in the history of humanity before: slavery was one of them, since Ancient History…

  10. The real value of this is to look around and see how and where the real world resembles this, so that we can see how and where we stand to improve further.

    I think I’ll take another look at Croquet…

  11. > You had long-run equilibrium scenarios in the history of humanity before: slavery was one of them

    Huh?

  12. > I know. I was part of that culture. But without cheap communications, it didn’t scale.

    It’s really a question of what the available bandwidth is. The question isn’t whether Moore’s Law slows down, but by how much. Software development would be more siloed, and that would translate into slower hardware gains. BBS’s would be able to run versions of IRC for chat communication, but distributing large source trees is the hard part.

    Do cable modems exist? (Straight to a walled garden, but is that bandwidth available?) It seems to me it’s a question of when the ambient tech gets to videochat, as that would allow the small hacker culture to distribute files faster by encoding them as a video stream/clip. (I don’t expect VOIP to really happen, and that’s not significantly better than a telephone modem.)

    I’d say the revolution takes off eventually as the usable bandwidth among hackers increases; it’s just a question of when.

    Then again, I’m assuming relatively stable specs for movie encodings; those might be custom too (although to talk to TVs there’d likely be some sort of DVD standard, which would work to flatten the online video codec space, as video cameras would likely output that).

    Another change is that distributed version control likely would have been developed/popularized earlier due to communication lags in a more direct peer to peer distribution chain.

    1. >Another change is that distributed version control likely would have been developed/popularized earlier due to communication lags in a more direct peer to peer distribution chain.

      That is an excellent insight. Earlier version control depended on real-time access to shared file repositories, something that would be difficult to buy in this history.

  13. A free labor market is so much more efficient than slavery that “of course” it can’t be sustained for long periods of time. Except, it was.

    (I hope you didn’t turn your brain off and go into political flamewar mode at the word “slavery”, like I almost did.)

  15. The scenario you portray is actually manifest in the EDI communications market served by Value Added Networks, the providers that preceded the Internet by decades, yet today use VPNs as circuits, all exchanging traffic with each other via “interconnect” agreements. Now, let me say that 90% of the VAN industry had been collegially and cooperatively practicing “non-settlement” message exchange agreements, growing the market, and adding value – until one watershed event occurred: private equity started the roll-up of key VANs. When this occurred, the largest VAN interconnections became arbitrage levers at the behest of the dark masters of sovereign equity of questionable provenance. With the gateway border guns aimed at the independent, innovative EDI networks – the very promising providers of APIs, enhanced services, and all kinds of high-tech goodness – the market started to tilt out of balance. The largest VANs, each a formidable competitor, grant interconnects to one another, while the very largest holds one selected innovator out of the pool (they charge Loren Data “out of market” rates while granting arch rivals free peer messaging exchange). Crazy, foolish, and immoral / bad for the market.

    Recall back to the AT&T pre-divestiture era, after the Kingsbury Commitment: DOJ wanted them busted up, but AT&T started really doing the backdoor lobbying thing, and ads were running all over the country with the message “Trust the (Bell) System”. Yes, there was a contingent, mostly AT&T allies, that really, really thought that a ‘beneficial monopoly’ was the way to go; one phone company to rule them all.

    So when you say, “look what might have happened if…” – well, it is happening in the EDI comms racket today. Why the FCC is completely clueless over these important issues is beyond me; they seem negligent in letting these crucial supply chain data flows become subject to the exploitation of a PE-funded, toxic monopoly. Meanwhile, the suppliers and medium-sized business users of EDI systems suffer from a lack of innovation, essentially being stuck with FTP uploads and downloads for the last 20 years. The one or two innovators, without a hand to help, battle on in Federal Court against the evil empire and their hired stable of vampire top-shelf antitrust clones. Wish us luck.

  16. @alan wilensky:

    > Why the FCC is completely clueless over these important issues is beyond me; they seem negligent in letting these crucial supply chain data flows become subject to the exploitation of a PE-funded, toxic monopoly.

    WTF does the FCC have to do with EDI? As far as I’m concerned, all the FCC should do is make sure that the lowest-level network is dumber than a rock and lets anybody connect to anybody else, and here you are saying it should crawl up the value chain and sort out wars between historical intermediaries and would-be intermediaries, when all that crap should go the way of the dodo in favor of IETF RFCs.

  17. @esr:

    > The same demand that got the various commercial network providers building mail gateways IRL gets them setting up FidoNet access.

    > Er, no. There was enough time for this to have happened in our history in the decade after 1984, if the incentives worked as you claim. It didn’t, because they didn’t.

    No, there wasn’t! There was no FidoNet before 1984. Tom Jennings didn’t invent FidoNet until 1984.

    1. >No, there wasn’t! There was no FidoNet before 1984.

      So? I’m saying that we had about ten years – 1984 to 1994 – to see if FidoNet was able to persuade the timesharing services to play nice, adopt common messaging standards, and set up FidoNet access. What actually happened is that the timesharing services contemptuously ignored FidoNet. So the hope that FidoNet might somehow have bridged between the providers is just wishful thinking; if it didn’t happen in the first ten years, it wasn’t ever going to as the providers got older and less flexible.

  18. Hyperlinks depend on the existence of an internetwork. I think there was enough benefit to be had from one that it would have been cobbled together from the BBSes, FidoNet, and Usenet. Remember, Usenet moved to NNTP because the Internet existed, not because it failed on its own.

    I think that SOME sort of internetwork was inevitable. We got one that works pretty well. We could have gotten one which works poorly. But I think we would have gotten one sooner or later.

  19. I agree that Linux and the open-source wave that followed wouldn’t have taken off, because Linux owes a lot to the Internet, which exponentially spread the OS. On the other hand, we’d still have its granddaddy—GNU—and maybe even its own Hurd kernel completed :P BTW, Stallman’s info pages had hyperlinks before the existence of the WWW.

    1. >BTW, Stallman’s info pages had hyperlinks before the existence of the WWW.

      That’s nothing special. So did the Jargon File, way back in the early 1990s :-) Took Tim Berners-Lee to make the concept sing, though.

  20. I’d say the chances of this timeline happening were about 100%. Why? Because it happened. (I’m a big fan of the clockwork Universe.)

  21. Fascinating thought. I’ve been on the Net since 1985, though not with continuous access, and it’s been very interesting watching it evolve.

    So your alternate universe looks something like this:

    — GNU: this alternate world would put a premium on cathedral-style software development, at the expense of the bazaar. However, I’d expect GNU tools to still exist. They would be developed by a very small group at slow rates, and distributed over the BBS/FIDO/UUCP links.

    — BSD: what would be the state of this, particularly PC versions like 386BSD? I don’t know enough of the history. Would hackers be running versions of BSD, or would we see a mishmash of PC Unixes like Minix etc.?

    — Hyperlinks: Berners-Lee would still have developed ENQUIRE in 1980, since it pre-dates your branch-point date. Hypercard would still have been developed by Atkinson in 1987, since it didn’t depend on the Net. We’d have the concept of a hyper-connected database, but the fact “Berners-Lee and Cailliau pitched their ideas to the European Conference on Hypertext Technology in September 1990, but found no vendors who could appreciate their vision of marrying hypertext with the Internet,” does not bode well for its future development.

    — Usenet would be in full flower, though it would have taken longer to grow. Most people would be accessing it through UUCP or other batch protocol, and limited bandwidth would keep it text-only. The alt.binaries groups would not exist or would have very limited distribution.

    — The tech boom of the 1990’s and resulting crash in 2000-2001 never would have happened. This would surely have some consequences.

    — Would the Arab spring have occurred? It would be much easier for repressive governments to control digital communications in the alternate world.

    1. >– GNU: this alternate world would put a premium on cathedral-style software development, at the expense of the bazaar. However, I’d expect GNU tools to still exist. They would be developed by a very small group at slow rates, and distributed over the BBS/FIDO/UUCP links.

      Agreed. I almost added a note about this to my scenario. The bad news is that, deprived of the Internet as a medium, the GNU project remains a tiny effort with effectively zero impact. That is, until the timesharing providers stumble their way to something resembling a primitive Internet, if they do.

      >– BSD: what would be the state of this, particularly PC versions like 386BSD?

      Pretty thoroughly fucked. The spread of 4.2BSD had a lot to do with its early TCP/IP implementation, which made BSD Vaxen excellent network machines for their time. Deprived of that pull, it’s probably just another dead-end academic OS. Very likely the 386bsd derivative never happens at all.

      >– Hyperlinks: Berners-Lee would still have developed ENQUIRE in 1980, [but]…

      Yes. No common TCP/IP address space means no URLs means no hyperlinks across systems. The Web never happens.

      >– Usenet would be in full flower, though it would have taken longer to grow. Most people would be accessing it through UUCP or other batch protocol, and limited bandwidth would keep it text-only. The alt.binaries groups would not exist or would have very limited distribution.

      Agreed.

      >– The tech boom of the 1990’s and resulting crash in 2000-2001 never would have happened. This would surely have some consequences.

      Agreed. One consequence I’ve already alluded to is that the great fiber-backhaul buildout of 1997-2001 never happens.

      >– Would the Arab spring have occurred? It would be much easier for repressive governments to control digital communications in the alternate world.

      Oh, absolutely. This is one reason that I don’t think we ever get to an Internet from this scenario. The provider oligopoly is politically convenient for governments; it increases their ability to control information flow because it defines a small number of network chokepoints (the provider servers) on which they can exert pressure. An alliance between governments and the provider oligopoly will naturally form to suppress free networks.

      So why didn’t this happen in our timeline? We got lucky – the early Internet cadre, with DARPA backing, succeeded in getting a rudimentary net deployed that baked a high resistance to attempts to screw with it into its architecture at every level. The key phrase here is “with DARPA backing”; this gave it a lot of political cover that attempts to deploy free networks later on in the alternate timeline would not have.

  22. @Max E.: “I’d say the chances of this timeline happening were about 100%. Why? Because it happened. (I’m a big fan of the clockwork Universe.)”

    If you really believe that, you need to read more about quantum mechanics, nonlinear dynamics, and chaos theory.

  23. @Morgan Greywolf: “Minor point: There was no America Online until 1989, and the services that became AOL — Quantum Link, AppleLink and PC Link — didn’t begin operations until 1985.”

    That’s a detail, though — MicroNET (later renamed CompuServe) had an information network available via dial-up by 1978, and The Source was founded in 1979. So no change in the mid-80’s would prevent these from being developed and marketed.

    If Wikipedia is right, though, even the development of Micronet could easily have been cut off by its corporate masters, who wanted to focus on selling time-sharing to business, not information to consumers:

    “The consumer information service had been developed almost clandestinely, in 1978, and marketed as MicroNET through Radio Shack. Many within the company did not favor the project; it was called ‘schlock time-sharing’ by the commercial time-sharing sales force. It was allowed to exist initially because consumers used the computers during evening hours, when the CompuServe computers were otherwise idle.”

    Wikipedia: CompuServe

  24. One more thought before I shut down: if you look back at the science fiction writers and other visionaries who wrote about the coming “global village” in the 1960’s and later, I can’t recall a single one who predicted the modern Internet anarchy. Typically, their visions looked more like a super-CompuServe, a single system that was either government owned or run by a corporation like Ma Bell. The modern information-sharing universe no more looks like that than modern spaceflight looks like the Von Braun vision of huge wheel-shaped space stations.

  25. @Cathy
    > If you really believe that, you need to read more about quantum mechanics, nonlinear dynamics, and chaos theory.
    Oh yes, I’m quite familiar with that. I just think it *looks* random because we haven’t found the pattern yet. I think someday we’ll get past quantum mechanics and start predicting things again. I freely admit I have no rational basis for believing that – if you like, you may call it my sole religious belief.

  26. I read an essay some years ago claiming that the BBS systems were evolving toward internetworking. Unfortunately I can’t now find it.

    I don’t think the relative lack of progress in that direction proves very much, since certainly by ’87, and probably earlier, though that is before my time, the people who would have been pushing it on were already using TCP/IP, mostly in universities. If that wasn’t there, they would have been building something.

    See for instance this history of the networking between universities in the UK. They connected to ARPANET as soon as it became reasonable to do so, but the process was clearly happening anyway. I can’t believe that, in the absence of ARPANET, JANET and the BBSs wouldn’t have become intercommunicating, or that something similar wouldn’t have gone on in the US, presumably faster. The fact that it didn’t happen is simply a result of the fact that ARPANET and TCP/IP were already in use by the people and institutions that would otherwise have been doing the work.

  27. Fascinating article.

    >The real value of this is to look around and see how and where the real world resembles this, so that we can see how and where we stand to improve further.

    Yes, one wonders in which realms we are presently living in an “alternate history” analogous to the one depicted in this article, as opposed to where we might now be had government/oligarchies not gotten in the way. The health care/medical technology realm perhaps?

  28. @TOMc

    >me: You had long-run equilibrium scenarios in the history of humanity before: slavery was one of them
    > you: Huh?

    I was being ironic. I was trying to tell you that a state of catastrophe can be quite permanent in humanity.

  29. Bret wrote “Sounds like you like government after all.”

    Actually ESR seems to be treating the historical government franchise monopoly policy on telecom as an immovable object, as you seem to be as well, and that’s reasonable enough given what I know of the politics of the time. But if the noble born forbid anyone but the noble born from defending the realm, and lo the noble born end up defending the realm, it’s not reasonable to treat that as strong evidence that without the noble born there’d be no defence of the realm.

    If the government had forbidden competition to the USPS so broadly and effectively that ordinary UPS and FedEx service remained illegal, and the Pentagon had evolved its own system of package delivery subcontractors, and then Pentagon-associated legal entities started using those subcontractors through increasingly routine legal fictions until there was enough of a lobbying counterweight against the USPS to formalize and systematize the legal fictions as truly routine written law, you and I and ESR might agree that the Pentagon had been key to the evolution of modern package delivery, but you’d see the story as more of a reason to “like government” and the consequences of its economic intervention than I (and probably ESR) would.

  30. I lived thru the period you’re talking about. Your memory isn’t quite correct. The proprietary services – CompuServe, GEnie and The Source – were steamrolled, not by the web, but by AOL. AOL steamrolled them because it provided more web-like services. Even still, those “walled gardens” were busy breaking down their walls. In fact, CompuServe still exists today as a set of online forums that people use. They were interoperably exchanging email, and they were starting to share content, and they were starting to share forum posts, etc.

    Further, like most advocates of the state who say “if there was no government, who would build roads?”, you make the error of assuming that without centralized control (and the internet is under the centralized control of the US government – notice the illegal confiscation of domain names that’s going on) private entities wouldn’t produce a more free, more liberal, more tolerant and more cost-effective alternative.

    Much of the current internet has benefited from private enterprise driving down costs and increasing connectivity (after all, the government wanted the internet for itself)… yet you assume that without some magical initiation of force to unify people under a single protocol, these same market forces wouldn’t still exist.

    Somehow, your desire to advocate for freedom has been twisted into advocating for tyranny, but I suspect there’s no way I can get you to see it.

    I think maybe a refresher reading of Atlas Shrugged, or some L. Neil Smith, or Mises or Rothbard would help.

    1. >Further, like most advocates of the state who say “if there was no government who would build roads?”

      If you think this what-if qualifies me as an “advocate of the state”, you’re out of your fucking mind.

      I don’t know what evolutionary path would have gotten a free market in telecomms to an Internet. I do know that we didn’t get to run that experiment, because there was a government-created telecomms monopoly firmly in place. Even after the AT&T divestiture our infrastructure retained the stamp of that monopoly in ways both large and small. We are just lucky that DARPA subverted the normal centralizing tendency of government at a crucial time.

  31. > I read an essay some years ago claiming that the BBS systems were evolving toward internetworking.

    They did, more or less. I — and others, too — wrote software in that general direction. By the time it was approaching usefulness, however, the Internet had already steamrollered the BBS scene.

  32. I wonder what networks would look like *outside* the United States if there were no TCP/IP and no Internet (Europe, Asia)…

  33. @Abolitionist

    > Somehow, your desire to advocate for freedom has been twisted into advocating for tyranny, but I suspect there’s no way I can get you to see it.

    I think he’s more praising the fortune of long-term-minded academics who got extremely lucky and managed to push a clever system through a known government-private enterprise telecommunications hegemony.

    Private enterprise is no good if private enterprise is in cahoots with the government. And at the telecom level, it’s pointless to argue that it isn’t.

    @esr

    > Technological change has a tendency to look inevitable in retrospect – “It steam-engines when it’s steam-engine time.”

    This vaguely reminds me of something like a technological anthropic principle; that the best thing that happened was somehow uniquely fated to happen, and all good technologies likewise.

  34. (Addressing “steam-engine time”):
    “This vaguely reminds me of something like a technological anthropic principle; that the best thing that happened was somehow uniquely fated to happen, and all good technologies likewise.”

    H’mmm. But all bad things are the sole result of…?

    This has, of course, huge grayness issues. Name a “bad” technology; name a “good” one. Atomic bombs were an excellent technology if you were a soldier slated to go invade Japan in 1945. The interconnected world is a wonderful thing…unless you’re dodging a tech-savvy, ill-intentioned ex.

    One of the arguments against ‘net-like developments via FidoNet in esr’s alternate (and in favor of huge corp-run walled gardens) is…connecting. It was a big ol’ BBS that had more than two dial-up lines; but Compu$erve? Why, it was vast! That’s a choke point. Small BBSs snuck by paying residential telephone rates; being charged as businesses would’ve killed ’em. Conversely, even a “small” ISP (back when there were any) was pretty big: they were selling a product people were more willing to pay for, the whole world on the other end of the line.

  35. Writing off FidoNet because it hadn’t achieved interconnect agreements in its first decade is an error; FidoNet didn’t really start picking up steam until the early 90’s. It was just approaching the mass it would have needed to get those interconnect agreements when consumer Internet showed up and steamrollered it.

    I don’t disagree with ESR’s basic premise, but I strongly suspect that there would have been quite a backnet from Fidonet and that it would have gotten at least moderate interconnect agreements from the walled gardens. One thing to remember is that many of the early small ISP’s evolved directly out of larger local BBS’s. If you had a large BBS, becoming a small ISP in the mid 90’s often meant getting an ISDN line and another 10 phone lines. There’s no reason to suspect that in the absence of the Internet we wouldn’t have seen an increased push to make fidonet interconnectivity more usable (and it was already pretty usable by the mid-90’s, the Demo scene was regularly distributing large files across Europe & North America via Fidonet).

  36. > …without cheap communications, it didn’t scale.

    “Never underestimate the bandwidth of a station wagon full of tapes”. Communication would have been cheap.

  37. @Adam Maas and @Robert Anderson:

    > “Never underestimate the bandwidth of a station wagon full of tapes”. Communication would have been cheap.

    > the Demo scene was regularly distributing large files across Europe & North America via Fidonet.

    Correct. Many BBS sysops had large WORM volumes online filled with software and other files, and even games like the demo version of Wolfenstein 3D were being shipped worldwide via FidoNet’s file echoes. In fact, my first Linux distro — Slackware — was downloaded as a series of floppy disk images that came from one of the file echoes (to be fair, the originals most likely came from ftp.cdrom.com).

    We even had e-mail and Usenet via UUCP gateways and some sysops that connected to the Internet even had IRC and MUDs.

    I agree that it was just approaching the critical mass needed to get the interconnect agreements.

  38. >> …without cheap communications, it didn’t scale.
    >
    > “Never underestimate the bandwidth of a station wagon full of tapes”. Communication would have been cheap.

    Bandwidth (throughput) is not everything. Latency is important too…

  39. @Jakub Narebski:

    > Bandwidth (throughput) is not everything. Latency is important too…

    True, but to some extent they are fungible. Forward error correction comes into play. Whole messages get sent instead of packets. As others have pointed out, dialup exchange was getting much more common, and there was lots of caching going on (the equivalent of today’s CDNs).
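
    To make the tradeoff concrete, here is a back-of-the-envelope sketch of the old adage; every number in it is an illustrative assumption (500 tapes of 2 GB each, a 4-hour drive, a 9600 bps modem), not a measurement:

        # Station wagon vs. modem: throughput is enormous, latency is awful.
        TAPE_COUNT = 500          # tapes in the wagon (assumed)
        TAPE_BYTES = 2e9          # 2 GB per tape (assumed)
        DRIVE_SECONDS = 4 * 3600  # a 4-hour drive (assumed)
        MODEM_BPS = 9600          # a fast modem for the era (assumed)

        wagon_bits = TAPE_COUNT * TAPE_BYTES * 8
        wagon_bps = wagon_bits / DRIVE_SECONDS

        print(f"Wagon throughput: {wagon_bps / 1e6:.0f} Mbit/s")
        print(f"Modem throughput: {MODEM_BPS / 1e3:.1f} kbit/s")
        print(f"Advantage: {wagon_bps / MODEM_BPS:,.0f}x")

        # But the first byte arrives in milliseconds over the wire,
        # versus four hours by car: throughput and latency trade off.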

    > Bandwidth (throughput) is not everything. Latency is important too

    That’s why I recommend outfitting the station wagon with a supercharged 454 and a radar detector. :-P

  41. > That’s why I recommend outfitting the station wagon with supercharged 454 and a radar detector. :-P

    Even with the radar detector, you could be increasing maximum latency while decreasing the average latency. Especially if you get caught multiple times and wind up in the slammer, rather than just getting a ticket.

  42. @esr:

    > Er, no. There was enough time for this to have happened in our history in the decade after 1984, if the incentives worked as you claim. It didn’t, because they didn’t.

    Actually this is just plain wrong. It took a decade from 1984 for network providers to see that internet access was desirable to customers (Windows got TCP/IP in Windows 95, for example – after a decade of talk by Bill Gates about networks of the future). This is WITH government backing. Fidonet never had government backing and started at about the same time. It’s only natural it took longer. I say “took” not “could have taken” because I’m not talking about speculations like the opening article does – I’m talking about hard facts.

    @esr:

    > I do know that we didn’t get to run that experiment, because there was a government-created telecomms monopoly firmly in place.

    Actually no. Different countries had different problems. In some of them, the first ISPs actually offered access to Fidonet resources too (mostly ones where the internet was not backed by government, so Fidonet development was faster in comparison to the Internet). I worked in a school back then, and in the middle of the 1990s we had access to the Fidonet echomail conferences but not to usenet (we had access to both internet email via UUCP and to Fidonet mail, but usenet was deemed too expensive by our administration). There were limitations (some conferences were restricted to “real” nodes), but in general it was a good resource back then. Only when internet access prices dropped far enough did people switch to internet resources, and a few years later ISPs stopped offering access to Fidonet.

    It’s easy to imagine an alternate reality without internet. In this reality large companies would have no need to offer Fidonet access, but for small FSPs (Fidonet service providers) the situation would be different.

    The end result would be the same but it would require a longer incubation time. Probably much longer, and the technologies would be subtly different (because Fidonet latency was measured in days, a completely different set of technologies was developed initially).

    P.S. About the venerable hyperlinks: Fido actually had them close to the end. People naturally referenced other messages in echomail conferences, and some ad-hoc solutions were developed to “ask around” on nodes for some particular document. Not long after, people got access to the web and the need to further develop the technology evaporated, but yes, there were “kind-of-hyperlinks” in later versions of GoldED. Of course they were actually developed later than the internet got them, but in the alternate reality they would have had no competition.

  43. The scenario presented here seems to mistake the Web for the Internet.

    Before the Web was invented, Usenet (and to a lesser extent, WAIS and Gopher) was thriving, with relatively large numbers of people participating daily. Usenet was my first introduction to the Internet, and for years, it was to me what Facebook is to many people now.

    There were MUDs where we spent many late hours engaged with other Internet users around the world, and yes, our email systems were on the Internet.

    Thousands of independent ISPs were thriving before the Web came along, most having gotten their start as dial-up BBSes. Each provided access to a wealth of Internet services.

    For that matter, hyperlinking was already commonplace in Gopher, and fairly mature in Hypercard.

    CD-ROMs were introducing rich multimedia experiences before the Web came along, and it doesn’t take much of an imagination to move from CD-ROMs and Gopher to something very similar to the Web we know today.

    It’s fun to imagine alternate dystopian histories, but the one described here is far-fetched, to say the very least.

  44. > there were “kind-of-hyperlinks” in later versions of GoldED.

    Cool! Too bad I never got to see those in action; I had abandoned FidoNet for the Internet by the mid-90s. GoldED rocked. I was glad to see that it later got open sourced.

  45. >Now for my major point: your scenario largely ignores the BBS scene

    Yes, all those people with DOS machines included a lot of hackers writing software to make up for all the scene’s deficiencies. Interlinking took time; machines forwarded messages to each other at night.

    One consequence I can think of is that the FreeDOS kernel project would have been completed and ruled the world instead of Linux.

  46. Interesting scenario, Eric. You focus mainly on the technological deficiencies of your alternate history. But the political situation would be even worse.

    People don’t realize how close Bill Clinton came to breaking the back of the Second Amendment movement in America. Mostly, articles about the 2A issue tend to imply that the 1994 elections took the wind out of gun control’s sails. I call this the “GOP-centric view.”

    The problem with the GOP-centric history is that it’s bullshit. The voters elected a GOP Congress in 1994 in a landslide, almost entirely due NOT to the gimmicky “Contract With America,” but to a backlash against the “assault weapons” ban (really a partial ban on manufacture of semiautomatic rifles, intended to ultimately be gradually amended into a ban on all semiautomatics). Unfortunately, the Dole/Gingrich Congress immediately betrayed the voters who had elected it, not only refusing to repeal the AWB, but actually making NEW gun control legislation, specifically the vile and blatantly unconstitutional Lautenberg Amendment.

    In fact, Clinton kept pushing for gun control, and after the Columbine massacre by a gun control supporter and his friend, it looked as if he’d get it. He got a number of big-city mayors to sue all the major gun manufacturers, in order to force them to agree to settlements which would have forced them AND EVERY BUSINESS THAT DID BUSINESS WITH THEM to accept draconian gun control legislation, including de facto registration. Smith and Wesson buckled under and agreed to the settlement.

    What happened next would never have happened without the World Wide Web. A massive boycott, started not by the NRA but by individual gun rights supporters on the Internet discussion groups (blogs didn’t really exist yet), was initiated by gun owners against S&W, in order to give a message to the other manufacturers. S&W ended up going broke and being sold to new owners for pennies on the dollar. None of the other manufacturers accepted the settlement.

    This Web-centric victory, as opposed to the GOP-centric myth, was what broke the back of gun control. Whereas just months before, Republicans like George W. Bush had been making gun control a major part of their platforms, now not only Republicans but even Democrats were getting as far as they could from the issue. All the politicians conveniently stopped mentioning their own embarrassing positions they’d taken and started mentioning pretty much anything that would change the subject. Gun owners, able to connect online, had proved themselves the strong horse.

    If not for the Web connections, this would have been unlikely. Today gun owners would probably be a shrinking minority and confiscation would be on the table. In all likelihood, gun ownership would probably be illegal in twenty or thirty states, and the federal laws would make it de facto illegal everywhere. With the RKBA eliminated, the crime wave that crested in the early 1990s would have come back in full force, and would be the basis for all kinds of other restrictions on freedom. Criminals would control most of the area of large cities. Yes, I know there are far too many restrictions on freedom even as is, but all you have to do is look at the UK to see just how much worse it could be.

  47. > The divergence point for this history is in 1983-1984, when the leadership of DARPA lied through its teeth to Congress about who was being allowed access to the Internet. The pretense being maintained was that only direct affiliates of government-sponsored, authorized research programs were being allowed on.

    The Thinking Man requests a citation for this.

    The Thinking Man knows that CSNET was funded in 1980 for the 3-year period 1981-1984.

    The Thinking Man also knows that NSFNET was funded for the period 1985-1995. NSFNET was a series of national backbones (connecting supercomputing centers) at the following rates: 56 kbps, T1 (1.5 Mbps) and T3 (45 Mbps). It was these backbones which gave rise to what we now think of as the ‘commercial’ Internet.

    The Thinking Man further knows that the NSF appropriations act authorized NSF to “foster and support the development and use of computer and other scientific and engineering methods and technologies, primarily for research and education in the sciences and engineering.” This, in-turn, gave rise to the infamous AUP.

    The Thinking Man wonders if this is the history you (mis-)quote.

    UUNET dates to 1987 as a dialup provider.

    The World was the first commercial dialup ISP to offer Internet access, in 1989.

    CERFNet and NEARNet already existed in their NSFNet regional forms by around 1989.

    The Thinking Man knows that UUNET’s first commercial TCP/IP service over leased lines was running in November 1988. AUP (Appropriate Use Policy) compliant traffic was exchanged with NSFNET. Non-AUP compliant traffic was not. (Much of Europe’s email came over this link for years.)

    The Thinking Man knows that UUNET and PSINet both started actively selling TCP/IP in January 1990.

    The Thinking Man knows that NSFNet modified its Acceptable Use Policy in 1991 to “permit commercial TCP/IP services to interconnect with NSFNet”.

    As The Thinking Man remembers it, NSF cared if the entity abided by the AUP, not the legal status of the corporate entity or its service provider.

    The Thinking Man knows for a fact that UUNET was exchanging packets between NSFNET and AUP compliant research groups in 1988 (with the explicit approval of Steve Wolff. His guideline was “is it to support research and scholarly pursuits?” but he personally approved each organization.)

    The Thinking Man knows that things really ramped up in 1989 and UUNET was frequently asking NSF to clarify what was and wasn’t appropriate use. (A key question in 1989 was if UUNET could gateway email between CompuServe and the NSFNET.)

    The Thinking Man believes that in early 1990 NSF stopped explicitly approving each US network, but continued closely monitoring international connections (I don’t want to put words into Steve Wolff’s mouth). Export control issues were still a concern.

    In March 1991, NSFNET began permitting “Eastern Bloc” countries (Soviet Union, Hungary and Czechoslovakia, as they were known at the time) to connect to the NSFNET.

    It is The Thinking Man’s memory that NSF insisted on technology transfer long before then, because Congress insisted on it. It wasn’t an accident that CERFNet, NEARNet, and other commercial entities forked off from NSFNet regionals. That was always one of the likely outcomes because of technology transfer.

    “The networks of Stages 2 and 3 will be implemented and operated so that they can become commercialized; industry will then be able to supplant the government in supplying these network services.” — Federal Research Internet Coordinating Committee, Program Plan for the National Research and Education Network, May 23, 1989, pp. 4-5.

    And it would appear the feds knew some of the likely effects of what they were doing:

    “The NREN should be the prototype of a new national information infrastructure which could be available to every home, office and factory. Wherever information is used, from manufacturing to high-definition home video entertainment, and most particularly in education, the country will benefit from deployment of this technology…. The corresponding ease of inter-computer communication will then provide the benefits associated with the NREN to the entire nation, improving the productivity of all information-handling activities. To achieve this end, the deployment of the Stage 3 NREN will include a specific, structured process resulting in transition of the network from a government operation [to] a commercial service.” — Office of Science and Technology Policy, The Federal High Performance Computing Program, September 8, 1989, pp. 32, 35.

    This was the formalization of technology transfer resulting in commercialization and privatization.

    1. >The Thinking Man requests a citation for this.

      Right, like anyone’s going to admit in print even thirty years after the fact that they lied to Congress. No. You have to have heard the right war stories from the right people, and have done a certain amount of reading between the lines as well. Nothing I can cite.

  48. > Nothing I can cite.

    The Thinking Man therefore views this as hearsay, and redirects to the more comprehensive history shown above.

  49. The Russell Nelson cannot take the word of someone who speaks in the third person seriously. But I will point to the existence of the SF mailing list as evidence that, just as Eric says, the DARPA folks were lying about off-program use of the Arpanet.

    1. >But I will point to the existence of the SF mailing list as evidence that, just as Eric says, the DARPA folks were lying about off-program use of the Arpanet.

      Heh. I’d bet that half the DARPA program monitors were on that list. Here, on information and belief, is how I think their reasoning went:

      “This list is fun. It’s also an experiment in a novel mode of communication (most historians, including me, think it was the very first Internet mailing list). As such, it’s legitimately part of our research into the consequences of cheap, robust wide-area networking. And so is letting on all those college kids who don’t technically qualify under the program charter. But if we try to explain that at the Congressional hearings, we’re sure as shit going to get Proxmired.”

      It was a reasonable fear. William Proxmire trashed a lot of immensely valuable research in the course of his political grandstanding. Once or twice he even apologized afterwards, but the damage had been done. I’ve had it pretty strongly hinted to me that certain of those responsible believed lying to Congress was less risky than telling the truth where Proxmire could get wind of it.

      I suspect the Internet program managers were neither the first nor the last to make that choice. And be justified in making it.

  50. esr,

    Very interesting.

    I believe BBSs had a flat address space, at least in North America. We call them phone numbers. I can see a DNS system which resolves names into phone numbers. And if you don’t require a flat address space, you could cover the globe.

    Is that a useless pedantic point, or a useful interesting point?

    Yours,
    Tom

    1. >Is that a useless pedantic point, or a useful interesting point?

      Sorry, it remains pedantic until there are URI equivalents. What I mean by a “unified flat address space” is that anything on the Internet that is an exportable unit of data (which used to mean a file, but can now mean a web page or even a WordPress blog comment) has a unique name which you can feed to a browser and use to fetch it. This is an important primitive concept because without it you don’t have real hyperlinks.

      In the case of BBSes, phone numbers wouldn’t have been enough. You’d have needed (a) a standard way of notating a combination of a phone number and a file path, and (b) fetchers embedded in BBSes that knew how to dial up other BBSes and present them with a path using an agreed-upon protocol, and (c) service capability in the target BBSes to pass the file back.
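
      For concreteness, here is a minimal sketch of what (a) might have looked like – a hypothetical “bbs://” notation combining a phone number with a file path. The scheme name and the helper function are invented for illustration; no such standard ever existed:

          # Hypothetical "bbs://" URI scheme: phone number as the authority,
          # file path as the path. Illustration only; never a real standard.
          from urllib.parse import urlparse

          def parse_bbs_uri(uri):
              parts = urlparse(uri)  # e.g. bbs://1-215-555-0199/files/door.zip
              if parts.scheme != "bbs":
                  raise ValueError("expected a bbs:// URI")
              return {"phone": parts.netloc, "path": parts.path}

          print(parse_bbs_uri("bbs://1-215-555-0199/files/door.zip"))
          # {'phone': '1-215-555-0199', 'path': '/files/door.zip'}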

      The harder you look at the technical problems, the worse it gets. What do you do if the phone number is long distance, but you might have a route through an X.25 BBS aggregator like tz’s PC Pursuit? What do you do about fetching from single-line BBSes – time out, or just eat latency until the phone on the other end hangs up?

      I understand the impulse to romanticize the old-time BBS culture and FidoNet and things like that. I was there, too; it was the morning of the digital world and we were all kids with the shiniest toy in history. But the truth is, those systems sucked. It wasn’t anyone’s fault that they sucked; it was an inevitable result of slow expensive networks, underpowered machines, and a huge conceptual gap where hyperlinks needed to be.

  51. Can we say that Internet and the WWW are the results of finding just the right kind of balance between two polar opposites, control and chaos – because it had to be _standardized_ in a way that specifically allows and encourages “chaos”?

    I know a company which develops satellite and other kinds of TV receivers, mostly for Europe. Every major TV broadcaster in every country has its own, often undocumented standards for encoding information; there are absolutely no international, open standards – none that any big broadcaster would follow – and the worst thing is that they simply change their “standards” without notification, and then some features in the devices just stop working, even though they have signed a contract saying they would not do so. But try arguing with a dino like the BBC; good luck with that. From the viewpoint of TV, the open standards of the Internet / WWW look like a programmer’s heaven. I wonder what kind of technological, organizational or political choices or circumstances prevented TV from becoming something similar to the Internet.

    Another aspect is, and I don’t know whether it is intentional or not, that electronics engineers tend to write shitty code, so either they should not write code at all or their code should be walled off from application software by sane interfaces like gpsd. Dealing with electronics-engineer code is a major PITA and often results in walled gardens and restricted competition: it is such a big investment to deal with all the quirks of devices X, Y and Z that only a few vendors try it. Examples: credit card terminals, bar code printers, and suchlike. Now, the Internet / WWW does a very good job of not making application programmers deal with electronics-engineer code, and TV broadcasting does a very bad job of it, and I wonder if it was a conscious choice to wall the EEs off?

    A fourth aspect is open, interchangeable hardware. I think Open Source couldn’t really have got out of the bottle if not for the rise of IBM PC “compatibles”, in the kind of world where you cannot plug screen X and keyboard Y into a computer made by Z. Yeah, in an Apple kind of world… I think such hardware had a lot to do with the rise of open source and the Internet / WWW… whom should we credit with it? You will not like it, but at least part of the answer is Microsoft…

  52. “This is an important primitive concept because without it you don’t have real hyperlinks.”

    I think this idea can be traced back to the 18th-century concept of the Encyclopédie – that every kind of useful human knowledge should be findable within a certain hierarchical taxonomy, which, if you collapse one path of the tree into one line, is basically a URI. Reinvented in USENET etc.

  53. >But if we try to explain that at the Congressional hearings, we’re sure as shit going to get Proxmired.

    You just reminded me of Larry Niven’s short story “The Return of William Proxmire”. You can find it in the Heinlein tribute collection Requiem.

  54. The Thinking Man believes you are mistaken. http://www.filegate.net/zone1/index.html

    The Morgan Greywolf reminds The Thinking Man that the esr did say “unified, flat address space”. As The Morgan Greywolf remembers it, his FidoNet node address was something like 1:120/284.0 — Zone 1, Net 120, Node 284, the .0 indicating that The Morgan Greywolf’s BBS was a full node and not a point node.

    (Speaking in the third person is kind of annoying.)
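
    For concreteness, a small sketch (a hypothetical helper, not any actual FidoNet software) of how such an address decomposes. The point is that zone:net/node.point is a routing hierarchy, not the unified flat namespace the esr means:

        import re

        # Parse a FidoNet zone:net/node.point address such as "1:120/284.0".
        FIDO_RE = re.compile(r"^(\d+):(\d+)/(\d+)(?:\.(\d+))?$")

        def parse_fidonet(addr):
            m = FIDO_RE.match(addr)
            if m is None:
                raise ValueError("not a FidoNet address: %r" % addr)
            zone, net, node, point = m.groups()
            return {"zone": int(zone), "net": int(net),
                    "node": int(node), "point": int(point or 0)}

        print(parse_fidonet("1:120/284.0"))
        # {'zone': 1, 'net': 120, 'node': 284, 'point': 0}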

  55. Just a note, but I always thought that links were first implemented on Gopher (which itself depends on the TCP/IP stack, of course). Is there any technical reason why Berners-Lee is always credited with inventing hyperlinks?

    In my experience, Gopher was my first non-email exposure to the internet, and it trained me to think of the net as just a list of links. I had no problem moving on to the web later on – so much so that at the beginning it seemed to me a slower version of Gopher!

  56. Speaking in the zeroth person, $skolem pondering just who were the people on the DARPA team at that time. The leadership ought to be a knowable thing, but just what were its rank and file development team like? How did they manage to get on a team with this much influence? (Probably because this tech wasn’t seen as that big a deal at the time.)

    Are there more such teams lurking out there, waiting to do Good at the government’s expense, under cover of their own perceived insignificance?

  57. Benz invented the automobile, but Ford produced automobiles with interchangeable parts on an assembly line.

    TBL is the Henry Ford of hyperlinks. It probably helped that he was building upon a foundation of the “interchangeable parts” of DNS/TCP/IP to take care of the high-level naming, and he was smart enough to not try to reinvent that wheel.

    1. >It probably helped that he was building upon a foundation of the “interchangeable parts” of DNS/TCP/IP to take care of the high-level naming, and he was smart enough to not try to reinvent that wheel.

      Mark Miller, one of the architects of Project Xanadu and a very old friend of mine, has convinced me that Tim Berners-Lee’s most important invention was the 404. Previous projects had tied themselves in overcomplicated knots trying to ensure that retrievals always succeeded; TBL’s insight was that this attempt was doomed in a real world with unreliable systems and variable latency, and that by recognizing that reality he could enormously simplify server implementation.
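
      (To make that concrete: a toy sketch in the spirit of the insight – my illustration, not TBL’s code or any real server. The entire “error handling” for a missing resource is one cheap refusal, instead of machinery trying to guarantee that links never break:)

          from http.server import BaseHTTPRequestHandler, HTTPServer

          DOCS = {"/index.html": b"<h1>hello</h1>"}  # toy document store

          class Handler(BaseHTTPRequestHandler):
              def do_GET(self):
                  body = DOCS.get(self.path)
                  if body is None:
                      self.send_error(404)  # admit failure cheaply; move on
                      return
                  self.send_response(200)
                  self.send_header("Content-Type", "text/html")
                  self.send_header("Content-Length", str(len(body)))
                  self.end_headers()
                  self.wfile.write(body)

          HTTPServer(("localhost", 8000), Handler).serve_forever()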

      Mark tried to recruit me for the Xanadu project in 1980 when it was still located in Swarthmore, and I came very near joining. That’s another interesting alternate history, because Xanadu came heartbreakingly near successfully deploying the Web fifteen years ahead of TBL. I was a very fledgling programmer then, but in view of how widely deployed some of my later projects became it’s not crazy to think I might have made a critical difference. I still wonder about that sometimes. Roads not taken…

  58. I’m somewhat confused, perhaps because my experience in this area isn’t what it should be to comment. But didn’t AOL/Compuserve/Genie and the like also use TCP/IP to run their networks? Was there some predecessor architecture they were using that I’m not aware of?

  59. Right, like anyone’s going to admit in print even thirty years after the fact that they lied to Congress.

    Fortunately the Congressional Record is available; what you could do is look at the testimony. Unfortunately it’s not on the Web yet. I am not curious enough about this to buy the relevant records. I do find it’s good practice to be politely skeptical about unsupported historical claims that happen to align themselves with the claimer’s philosophical or political ends. “Can’t prove it, but conveniently, this bears out my beliefs about how the world works!”

    In any case, by 1988, the CSTB’s Toward a National Research Network was explicitly discussing “the ease with which the NRN may be interconnected with other, commercial network services.” That report was submitted to Congress, where it influenced the High-Performance Computing Act of 1991.

    I can’t say whether or not DARPA lied to Congress in 1983, but the government was on board within half a decade. Sometimes (IANA, many other examples) too much on board. Heh, there’s an interesting alternate history: what happens if alternative DNS roots go mainstream in 1996-97?

    1. >I do find it’s good practice to be politely skeptical about unsupported historical claims that happen to align themselves with the claimer’s philosophical or political ends. “Can’t prove it, but conveniently, this bears out my beliefs about how the world works!”

      Sorry, I’m missing a connection here. Maybe you think I get some ideological advantage from the story that DARPA lied to Congress, but you’ll have to explain it to me because I’m not seeing it.

  60. > In the case of BBSes, phone numbers wouldn’t have been enough. You’d have needed (a) a standard way of
    > notating a combination of a phone number and a file path, and (b) fetchers embedded in BBSes that knew how
    > to dial up other BBSes and present them with a path using an agreed-upon protocol, and (c) service capability
    > in the target BBSes to pass the file back.

    I can somehow see, in an alternate, BBS-led world, Plan 9 gaining much, much more traction than it did, mostly because its architecture was designed to be flexible in exactly this way.

  61. I have to wonder what conditions existed that gave the leadership at DARPA some confidence that they had a decent chance of sneaking this by the Congressional panel. Politicians are not known for technical awareness, but they do not completely lack imagination – witness the infamous Proxmire.

  62. Hey, someone’s been posting as Eric. “I attribute the takeoff of the Internet to DARPA deliberately lying to its political masters. This happened precisely because DARPA’s leadership knew it was impossible within the incentives operating on governments for them to tolerate the ‘randoms’ or the results DARPA foresaw.” And “We are just lucky that DARPA subverted the normal centralizing tendency of government at a crucial time.” Whoever the imposter is, they’re pretty clearly using this story as evidence that their political beliefs are correct.

    1. I still don’t get it. Mind you, since I believe the story that DARPA lied to Congress I wouldn’t object to using it as evidence for my politics, but I don’t see how I can do that. Fire burns, water flows downhill, Proxmire’s Golden Fleece awards created a set of pressures on research program managers and they responded; in what universe does this turn into an advocacy point for me?

  63. @Shenpen: “A fourth aspect is open, interchangeable hardware. I think Open Source couldn’t really have got out of the bottle if not for the rise of IBM PC “compatibles”, in the kind of world where you cannot plug screen X and keyboard Y into a computer made by Z. Yeah, in an Apple kind of world… I think such hardware had a lot to do with the rise of open source and the Internet / WWW… whom should we credit with it? You will not like it, but at least part of the answer is Microsoft…”

    Nonsense. Open hardware helped, but Microsoft should not be given credit for that. If Microsoft had not managed to quickly lay hands on Seattle Computer Products’ QDOS, or if DR had gotten the contract for IBM’s PC operating system, Microsoft would be largely out of the OS picture. Nevertheless, the history would have looked much the same, minus an irritating monopolist who did everything it could at every juncture to stamp out proprietary competitors.

    Credit for this open hardware should go to IBM, not Microsoft. Not that IBM did it on purpose; it happened because IBM decided that speed-to-market and minimum investment were key goals, and maintaining proprietary control of the resulting architecture and software was not. If IBM had taken those little microcomputers more seriously, its corporate culture would never have permitted this. It was only because they wanted a piece of the new microcomputer business but (erroneously) didn’t take it seriously as a threat to their existing business model that they chose the path they did.

    If there’s anything that Microsoft did to accelerate Open Source, it’s clearly in another area: by making it impossible for any individual business to compete with Microsoft in any way, it created a scenario where only a collaborative Open Source model–one that produced code that could not be killed by killing an organization, and where development man-hours could be scaled beyond anything that even Microsoft could afford–could compete with the MS monopoly.

    Without the blockage of the proprietary path, Linux would have been developed but would likely not have seen the big push from major players like IBM that helped it over the acceptance hump. It would be a popular (with hackers, and perhaps academics) but very niche system, not something that Corporate America used to run Oracle.

  64. Well then, why didn’t IBM’s corporate management take the microcomputer seriously? (Assuming that is in fact what happened.)

    What was intrinsic to IBM’s management that led them to believe this market was worth dangling its feet in, but not worth a full plunge? They aren’t dumb; they’re just driven by different factors.

    In fact, why does IBM continue to fund such “unserious” efforts as AlphaWorks?

    1. >Well then, why didn’t IBM’s corporate management take the microcomputer seriously? (Assuming that is in fact what happened.)

      That’s what happened, all right. At the time of the IBM PC release I was working for a slimeball who happened to have excellent contacts inside IBM. One consequence was that I got my hands on one of the first three IBM PCs to reach the Northeast. Another is that we knew a lot more about what the Boca Raton group was thinking than almost anyone outside IBM did for years after that – more, in fact, than most people inside IBM did at the time.

      So. As to why IBM’s corporate management didn’t take the microcomputer seriously…duuude. These were mainframe guys. Computers meant raised floors and floor-to-ceiling cabinets and cables a foot thick. Microcomputers were hobby gadgets for scruffy hippies. The handful of IBMers who thought differently and snuck through a project authorization ran a really stealthy operation a thousand miles from the centers of power in Armonk.

      Cathy’s account is correct. “Speed-to-market and minimum investment were key goals”. Part of the reason for this was IBM internal politics; they needed to get a product out the door and established before more powerful factions within the company noticed that they were about to bust the hell out of IBM’s price-performance curve. They were particularly worried about a now-forgotten product called the System/23; they knew that once the executives behind that got wind of the PC project there’d be political hell to pay.

      There was some vague sense in IBM that they needed to get on top of this micro thing on the off chance it became important. But you have to realize that even just buying parts from external suppliers was pretty heretical in IBM at that time. Necessary, though, to get the PC to market before the mainframe guys mobilized to squash it. Which is how we ended up stuck with the x86 – the 68000 was under consideration but Motorola couldn’t deliver the glue chips they needed to put around it.

      Cathy is also correct that the ubiquity of Microsoft was a consequence of the PC’s open hardware design rather than a cause. That angle has been covered pretty well in the public historical literature so I won’t rehash it here.

  65. “Credit for this open hardware should go to IBM, not Microsoft.”

    It may be fair to credit both: IBM for making a clonable design and Microsoft for freely selling something standard to run on it. Neal Stephenson, in In The Beginning Was The Command Line, credits Microsoft with Linux, for example: “In trying to understand the Linux phenomenon, then, we have to look not to a single innovator but to a sort of bizarre Trinity: Linus Torvalds, Richard Stallman, and Bill Gates. Take away any of these three and Linux would not exist.”

    I am reminded of a post by a friend who works in the financial sector, who notes that Microsoft giving everyone one operating system (DOS or the various versions of Windows) was a vast improvement over the previous state of affairs: “I’ve worked with Windows all my working life and, despite what you may hear, it has been a blessing to us all: without it we would still be running Wang word processors on Wang hardware that saved documents in a Wang file format that can only be read by other Wang applications and printed on Wang Printers. Or HP, Or IBM, or Toshiba: whatever. It took a Big Bad Corporation to build a big enough operating system that everyone uses it, and every other software vendor works with it rather than against it, each other, and the user population.” Of course, as he goes on to note: “But Vista and Microsoft’s implementation of DRM is a clear indication that they have failed to balance the constant commercial compromise of profit, competition, quality, and customer service.”

  66. > and Microsoft for freely selling something standard to run on it.

    I don’t buy it. At the beginning there was also CP/M-86 and the UCSD p-System. The only innovation that allowed Microsoft to rewrite history and make those disappear was its licensing scheme, which made it problematic for shops to sell CP/M and the p-System.

  67. Oh, they certainly leveraged the hell out of it, by means foul and perhaps on rare occasions fair; this is well-documented. But computers are sold to run applications, and the applications were written to DOS.

  68. > But computers are sold to run applications, and the applications were written to DOS.

    At the beginning, there were lots more apps for either CP/M-86 or the p-System than for DOS.

  69. The quote concerned the conditions for Linux, which would be as of 1991. Or are you talking about the 1984 turning point in ESR’s alternate history?

  70. How are we this far into the discussion without anyone attempting a comparison with, or even mentioning, the proprietary console networks (Xbox Live, PlayStation Home, Steam)? They are precisely modern, semi-successful efforts at making walled gardens. They aren’t multi-purpose, certainly.

  71. @Adriano: “How are we this far into the discussion without anyone attempting a comparison with, or even mentioning, the proprietary console networks (Xbox Live, PlayStation Home, Steam)? They are precisely modern, semi-successful efforts at making walled gardens. They aren’t multi-purpose, certainly.”

    Because the stakes are much lower in that space.

    1. >Because the stakes are much lower in that space.

      That’s right. There’s no threat that those consoles are going to up-gun into a dominant computing and communications platform. They’re expressly designed to avoid all the complexity costs that move would incur.

  72. @David Gerard:

    The quote concerned the conditions for Linux, which would be as of 1991. Or are you talking about the 1984 turning point in ESR’s alternate history?

    I think the network effects would have worked the same without MS in the picture. There would probably be one or maybe two dominant OS vendors, and room for a free alternative.

  73. The only innovation that allowed Microsoft to rewrite history and make those disappear was its licensing scheme

    You can also credit them for developing the home computer market, and introducing the concept of using a computer for entertainment.

    No, they did not invent the home computer, and yes, hackers and hobbyists were already making recreational use of computers long before MS existed.

    But Microsoft really developed a market there for its Windows operating system, and took the whole home computing concept a couple of levels higher. They turned office machines into entertainment centers: watch movies, listen to music, chat with your friends, write mail, find fun (or interesting) stuff on the web.
    And they succeeded.
    That’s vision, and the ability to implement.

    Their “computer in every home” most likely also helped create the critical mass for the 1995-2000 internet boom, which laid the groundwork for where we are today: everyone’s online, and now they also want to be online wherever they go. Hence smartphones. And Google. And Android.

  74. @Patrick Maupin: “The only innovation that allowed Microsoft to rewrite history and make those disappear was its licensing scheme”

    @kn: “You can also credit them for developing the home computer market, and introducing the concept of using a computer for entertainment.”

    I think that move was already well underway in the Apple II era, and had Apple been the dominant player in personal computing 1981 – 2001 rather than Microsoft, the end result would not have been that different.

    Don’t confuse the fact that Microsoft accomplished certain things with a belief that they actually added value; significant parts of the computer revolution really were inevitable “it’s time to railroad” developments, and a home computer market offering entertainment was one of them. I don’t see that development as historically contingent at all.

    1. >I don’t see that development as historically contingent at all.

      I almost completely agree, but there is one contingency in there that Patrick may have omitted. The fact that IBM published ISA interface specs and invited third parties to ship extension boards was crucial. The PS/2, which was IBM’s attempt to reassert proprietary control, was a sales disaster – but even if it hadn’t been, the high tariff for third parties would have ruined it as a platform for the experiments with media support that eventually turned it into a home entertainment center.

  75. @esr: “At the time of the IBM PC release I was working for a slimeball who happened to have excellent contacts inside IBM. One consequence was that I got my hands on one of the first three IBM PCs to reach the Northeast.”

    What was your reaction to it at the time? Did you see that IBM coming to the party with a relatively open architecture would revolutionize the personal computer market, or were you caught up in its very real technical limitations?

    1. >What was your reaction to [the IBM PC] at the time? Did you see that IBM coming to the party with a relatively open architecture would revolutionize the personal computer market, or were you caught up in its very real technical limitations?

      Er. Both, actually.

      It was clear to me even before the IBM PC shipped that it was going to be a game-changer; IBM’s then-dominant position in computing guaranteed that. When it did ship, I saw the implications of the open expansion bus and the BIOS listing immediately. On the other hand, compared to the PDP minicomputers I’d been used to working with, the PC was an incredibly weak machine with primitive software tools. I had spent the last couple of years working mostly in LISP; dropping back to the level of ROM BASIC and assembler was awful.

      But I knew the descendants of the first PC wouldn’t suck. And when the 386 was announced in 1986 I understood what it meant – and actually predicted in print, in an anthology called Tricks of the Unix Masters, that the future of serious computing was Unix on 386s and their descendants. Seems obvious now, but ’86 was about the high-water mark of the Sun-class workstation and there wouldn’t be a working 386 Unix port for another three years, so it was a pretty bold thing to say then.

  76. @esr
    I still don’t get it. … in what universe does this turn into an advocacy point for me?

    maybe something along the lines of

    – esr’s politics/ideology: “governments suck. central management sucks. all government initiative is inefficient and can lead to no good”
    – esr’s opinion on the internet: “the internet is good”

    => The internet was, originally, a government initiative, so we have a contradiction here.
    Solution: Well, I happen to know that the internet the way the government planned it was going to suck, but luckily a couple of engineers fixed that (and lied to the government about it to save their asses).

    It also works with the “Unix philosophy”:
    Unix philosophy: provide only mechanism, not policy.
    Congress is all about policy.
    The leadership of DARPA was more about providing mechanism (and open standards, and clean interfaces, etc.). They managed to do things their way (and had to lie to Congress about it in the process).
    History tells us the DARPA / Unix philosophy was right, because it led to the internet we have today, where all sorts of interesting and exciting things can happen; not some abomination such as walled gardens, a world without web, …

  77. Mark tried to recruit me for the Xanadu project in 1980 when it was still located in Swarthmore, and I came very near joining. That’s another interesting alternate history, because Xanadu came heartbreakingly near successfully deploying the Web fifteen years ahead of TBL.

    I remember reading an article about Xanadu and thinking it was an absolutely brilliant idea. As I recall, it went beyond linking, and incorporated revision control that explicitly stored the parts of a document that remained the same between versions in separate chunks from the changed chunks, and presented each version as a list of those chunks. I had a flashback to this when I saw how Google Wave was supposed to work.
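
    (Something like this toy sketch, as I read that description – content-addressed chunks shared between versions. Purely illustrative; it is not actual Xanadu code:)

        import hashlib

        store = {}  # chunk hash -> chunk bytes, stored exactly once

        def put_chunk(data):
            key = hashlib.sha256(data).hexdigest()
            store[key] = data
            return key

        # A "version" is just an ordered list of chunk hashes.
        v1 = [put_chunk(c) for c in (b"Intro.", b"Body text.", b"Outro.")]
        v2 = [put_chunk(c) for c in (b"Intro.", b"Revised body.", b"Outro.")]

        assert len(store) == 4  # unchanged chunks are shared, not copied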

    Xanadu had some VERY cool ideas, which were frankly before their time.

  78. Esr,

    An interesting post, to be sure. But lots of people can (and do) come up with alternate histories in which the world is worse off. Any ideas spring to mind in which the Internet would be more advanced or better than it is today? Or are we living in a pretty-close-to-optimum world at the moment?

  79. > A certain rotten fruit company invented/discovered this market by producing a computer that didn’t need to be assembled, came in plastic box, and hooked to an ordinary television.

    Ignoring the plastic box, the Apple II wasn’t the first – the Sol-10/20 was a year earlier (and should have been known to Jobs/Wozniak, as it came out of Berkeley/Homebrew Computer Club). I remember seeing something before the Sol, but can’t recall enough details to find it via Google.

    Yes, the Sol was available “assemble yourself”, but, like many computers of that time, it was also available pre-assembled.

  80. @esr:

    I almost completely agree, but there is one contingency in there that Patrick may have omitted. The fact that IBM published ISA interface specs and invited third parties to ship extension boards was crucial.

    And they also published the full BIOS source code in a blue binder. But this was required to compete in the market at the time. Apple II computers came with a monitor listing and a schematic…

  81. @Cathy – “significant parts of the computer revolution really were inevitable “it’s time to railroad” developments, and a home computer market offering entertainment was one of them.”

    @Morgan Greywolf – “Microsoft, OTOH, got into it quite by accident.”

    Hm, maybe.

    otoh, there’s all these stories about Microsoft’s underhanded business tactics to get where they are today … (or where they were in the ’90s, when Windows 95 put everybody’s Aunt Tilly in front of a computer and on the internet) … sounds like they’ve had to work, and work hard, “by means foul and perhaps on rare occasions fair”, to accomplish that. Doesn’t read like “by accident” or “happened to be in the right place at the right time but it could have been anyone” to me.

  82. @esr: “I almost completely agree, but there is one contingency in there that Patrick may have omitted. The fact that IBM published ISA interface specs and invited third parties to ship extension boards was crucial.”

    I completely agree with this. And their failure to do this with the PS/2 helped seal its fate, though it might have failed anyway given that the original bus design had already reached critical mass and built up its own ecosystem.

    @Patrick Maupin: “And they also published the full BIOS source code in a blue binder. But this was required to compete in the market at the time.”

    Wasn’t this done in part to ensure that the BIOS was covered by IBM’s copyright, making it harder to duplicate? It required thorough clean-room techniques to legally duplicate the functionality without duplicating any of the code. Copyright law as applied to software has evolved since then.

  83. I’m saying that we had about ten years – 1984 to 1994 – to see if FidoNet was able to persuade the timesharing services to play nice, adopt common messaging standards, and set up FidoNet access. What actually happened is that the timesharing services contemptuously ignored FidoNet. So the hope that FidoNet might somehow have bridged between the providers is just wishful thinking; if it didn’t happen in the first ten years, it wasn’t ever going to as the providers got older and less flexible.

    Nine years. GEnie, Prodigy, and AOL all hooked up their internal email systems to Internet gateways in 1993; there would have been no point in adding FidoNet access after that point, since you now had an inter-garden email standard up and running. (CompuServe and MCI Mail did their gateways in 1989). As this was well in advance of any of these services allowing anything like real Internet access, it seems to me to indicate a response to pressure for inter-communicable email, rather than inter-communicable email being a result of Internet access.

    Now, maybe the services wouldn’t have hooked up to FidoNet as the substrate for inter-communicable email in the absence of the Internet. But as far as “ten years”, in 1992 SMTP had had ten years, and GEnie/AOL/Prodigy hadn’t hooked up to them. FidoNet would have been at least a plausible candidate had the more-attractive Internet not been there to do the job in 1993.

  84. @Cathy:

    Wasn’t this done in part to ensure that the BIOS was covered by IBM’s copyright, making it harder to duplicate?

    Not sure, but it seems unlikely. Certainly, Apple v. Franklin hadn’t been decided at the point the IBM PC was released, but the Computer Software Copyright Act of 1980 tidied up copyright for software a lot. You could get several different operating systems and several different applications. None of them came with source code and everybody knew you were supposed to buy them.

    You can deposit partial source code with the copyright office in order to protect trade secrets:

    http://williampatry.blogspot.com/2007/05/copyright-office-deposit-and-trade.html

    If courts would allow wholesale copying (as the district court first did in Apple v. Franklin) then it would be really hard to distinguish ROM from software available on disk or cassette. That ruling wasn’t going to stand. If courts don’t allow wholesale copying, and require the requisite original authorship, then publishing source like IBM did certainly made Phoenix Technologies’ job a heck of a lot easier.

  85. otoh, there’s all these stories about Microsoft’s underhanded business tactics to get where they are today … (or where they were in the ’90s, when Windows 95 put everybody’s Aunt Tilly in front of a computer and on the internet) … sounds like they’ve had to work, and work hard, “by means foul and perhaps on rare occasions fair”, to accomplish that. Doesn’t read like “by accident” or “happened to be in the right place at the right time but it could have been anyone” to me.

    I suggest reading Stephen Manes and Paul Andrews’ excellent work, Gates: How Microsoft’s Mogul Reinvented an Industry and Made Himself the Richest Man in America. Microsoft never intended to sell operating systems. Their main business from the very beginnings of the company revolved around software development tools. In fact, when IBM’s Jack Sams asked Bill Gates about an operating system, Gates sent him to see the late Gary Kildall of Digital Research fame. Gates only reluctantly sold IBM an operating system because he was worried about losing their programming-languages business.

    Bill Gates was never a visionary. He was always an opportunist, and only later did he become an excellent strategist. His replacement, Steve Ballmer, is duller than an old used-up crayon.

  86. I haven’t read all of these comments yet, but I do want to make one observation before I go to sleep (which I badly need to do; baaad hacker)… about “a station wagon full of magtape”.

    It happened.

    Henry Spencer, in the early days, imported Usenet to U Toronto on 9-track tape; he ended up with an *entire storage room full of it*.

    I think when he donated it to Google as part of the initial archive, it filled 1 CD-ROM? Maybe 2? *Very* interesting story; it’s around somewhere.

    See also the 16mm to 3/4″ Umatic transition in TV in the early 70s and what that did to the historical record… New technology is not *always* a win…

  87. “I completely agree with this. And their failure to do this with the PS/2 helped seal its fate, though it might have failed anyway given that the original bus design had already reached critical mass and built up its own ecosystem.”

    Looking back, I have a very eerie feeling that it was less about IBM “making a mistake” in designing an open PC system. It was more that the first credible open system would take over the market. If IBM had built a closed system, there would have been many other contenders that could have replaced it.

    More like Android vs Nokia, Blackberry, Apple, and the rest. Android won because that is what everybody can build and sell. The developers will come when the producers and customers are there.

  88. A lot of those walled gardens were built on top of X.25. If TCP/IP didn’t take off, would X.25 remain as static as it did or would evolutionary pressure force it to grow and streamline itself into something resembling the Internet? A good part of TCP/IP’s rapid evolution surely was due to its rather open standards process, while X.25 was nearly frozen due to its ITU standard background.

    Besides X.25 and TCP/IP, were there any other significant vendor-neutral network protocol suites in the late 70s-early 80s? SNA/VNET/BITNET were pretty tied to IBM; DECnet was DEC; PUP/XNS were hugely influential on TCP/IP, but was it important in its own right outside of Xerox?

  89. > BBSes and FidoNet didn’t have one of the most important properties of internetworking –
    > a unified flat address space. No equivalent of URLs, and certainly no possibility of hyperlinks.

    Usenet does have a flat address space: Every usenet message has a probabilistically unique identifier, and every usenet group has a globally unique human readable name.
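
    Illustratively (real news software varies, but generators combine a timestamp, some random bits, and the host name, which is why collisions are vanishingly unlikely):

        import os, time

        def make_message_id(host):
            # timestamp + random bits + host => probabilistically unique
            return "<%d.%s@%s>" % (time.time_ns(), os.urandom(8).hex(), host)

        print(make_message_id("utzoo.example"))
        # e.g. <1700000000000000000.3f9c2a17d48be601@utzoo.example>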

  90. This all seems oddly reminiscent of the versions of the Internet in S. M. Stirling’s The Stone Dogs, where both the Draka and the Alliance have developed computers under the impetus of hyper security consciousness.

    1. >This all seems oddly reminiscent of the versions of the Internet in S. M. Stirling’s The Stone Dogs, where both the Draka and the Alliance have developed computers under the impetus of hyper security consciousness.

      Well, I did think of that when I was working out the consequences, but not as a direct model – only because there haven’t been too many attempts to re-imagine computers on alternate timelines and most of the attempts other than Stirling’s have been outright silly (e.g. vast difference engines that would melt down from internal friction heat if you actually tried to run all those gears).

  91. “I remember seeing something before the Sol, but can’t recall enough details to find it via Google.”

    That might have been the Ohio Scientific Instruments machine. It had a metal case and an integrated keyboard, and ran a 6502 micro on its own proprietary bus. They were a niche that died when the PC came on the scene.

  92. I can think of at least 3 non-DARPA protocols that would have allowed internetworking in one form or another: X.25, which someone mentioned above, IBM’s SNA, and Novell’s IPX. I don’t recall enough details of X.25 to state whether it was a good or bad protocol, but both SNA and IPX would in fact have been better than IPv4 in terms of address-saturation issues, and both were designed, like IP, to run on a variety of lower-level transports. If we used IPX we wouldn’t be worrying about the need to migrate to IPv6, because IPX used a 32-bit network ID field. IPX in the early/mid 1990s was able to do OSPF/PNNI-style route summarization, areas, etc., and I recall going to a Brainshare where Novell and AT&T(?) talked about creating POPs for people to share email etc. between IPX networks.
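
    (A back-of-envelope check of that claim – my arithmetic, using the usual IPX layout of a 32-bit network number plus a 48-bit node ID:)

        ipv4_total = 2 ** 32               # every IPv4 address there is
        ipx_total = (2 ** 32) * (2 ** 48)  # 32-bit network ID x 48-bit node ID
        print(ipv4_total)  # 4294967296
        print(ipx_total)   # 2**80 = 1208925819614629174706176 endpoints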

    Would the web have existed? I don’t know, but I think so. The hyperlink idea had already shown up in Lotus Notes and that Apple product whose name escapes me. HTML was taken from IBM’s SGML, which already existed.

    As for DNS, in the early/mid 1980s IBM had naming conventions that could have been used. I’m sure it would not have been perfect, because ISTR the names had an 8-char max, and while my memory is somewhat hazy I seem to recall at least a 3-level hierarchy. ISTR Xerox PARC came up with another one. In the mid 1990s the UK JANET network (UK university links), which I think was IP-based but I’m not sure, had a kind of reverse-DNS naming system (so your email was name@uk.ac.university.dept).

    And don’t be Anglocentric and forget France’s Minitel system.

    Would the Internet be the same? No. But I think it would not be totally dissimilar and would have at least most of the features of the current net.

    1. >I can think of at least 3 non DARPA protocols that would have allowed internetworking in one form or another. X.25 which someone mentioned above, IBM’s SNA and Novell’s IPX.

      *boggle*

      Back in the 1980s I worked on SNA emulation software for Unix. SNA would have been a horrible basis for an internetwork. It was designed for passing screenfuls of data from CICS forms – it couldn’t actually do character streams at all.

      IPX was proprietary, which disqualified it. Nobody could put together a large enough partner consortium to dominate the WAN market around a technology with licensing fees; this is one of the major reasons TCP/IP eventually won.

      X.25 might have been at least possible.

  93. >most of the attempts other than Stirling’s have been outright silly (e.g. vast difference engines that would melt down from internal friction heat if you actually tried to run all those gears).

    I just reread Stirling’s The Peshawar Lancers not long ago. He had “vast difference engines”, but I also remember a mention of running coolant through hollow shafts and other parts.

  94. Minitel was just another walled garden; a government-run version of AOL. Worse, their enormous up-front investment in hardware and lack of competition meant it was quickly out of date when the Internet came along and it took a long while to be updated.

  95. > Back in the 1980s I worked on SNA emulation software for Unix.

    Heh. Back in the 1980s I worked on SNA emulation software for proprietary boxes (protocol converters) that allowed attachment of ASCII terminals and printers. Actually, you could use SNA for other things than screenfuls of CICS forms, such as printers. But I think you’re right that it would have needed a bit of work to make internetworking work on top of SNA. For error-checked single links SDLC was great, especially compared to its bisync predecessor…

  96. > But I will point to the existence of the SF mailing list as evidence that, just as Eric says, the DARPA folks were lying about off-program use of the Arpanet.

    Russell, babe, listen up. (You too, Eric.)

    Robin Williams said that if you remember the 60’s, you weren’t there.

    In a similar manner, if you refer to ‘SF-lovers’ as “the SF list”, you weren’t really on the ARPAnet.

    Nobody at DARPA lied.

    Things were semi well-policed. There was a famous incident where Pournelle got himself kicked off his guest account at MIT. http://www.stormtiger.org/bob/humor/pournell/story.html

    Mail Exchanger (MX) records were developed by Craig Partridge in 1986, specifically to allow hosts on non-IP networks to have domain addresses.
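
    For instance, a host reachable only over a non-IP network could still be given a domain address by pointing its MX at an Internet-connected gateway. An illustrative zone-file fragment with invented names, not a record of any real host:

        ; mail for the non-IP host is accepted by the gateway and relayed onward
        uucphost.example.com.   IN  MX  10  gateway.example.com.
        gateway.example.com.    IN  A   192.0.2.1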

    The concept and plan for a national US research and education network was proposed by Gordon Bell et al. in a Nov 1987 report to the Office of Science and Technology. This plan was written in response to a congressional request by then-Senator Al Gore. It would take four years until the establishment of this network by Congress.

    As far as hypertext goes, the first VAX I worked with (1981; they’d already been around for over 3 years) used hypertext for its help pages (of course that was on VMS) – pre-WWW and pre-Internet. DEC systems were commonly networked and clustered back then (OK, clustering wasn’t until 1984 – still before the Internet), so you had networked computers using GUI (text GUI on an ASCII terminal) interfaces with hypertext in 1984. TBL had nothing to do with it.

  97. “A lot of those walled gardens were built on top of X.25. If TCP/IP didn’t take off, would X.25 remain as static as it did or would evolutionary pressure force it to grow and streamline itself into something resembling the Internet?”

    Well, we had a ham radio packet network spanning a significant chunk of the country in the early 1990s, and it was built on X.25. It was very slow, running over VHF radio links (I think it was nominally 1200 baud, but because of slow retries and poor RF connections, the effective throughput was much lower).

    It worked, and let you have live conversations across hundreds of miles, and it also supported email. It was so much slower than the Internet over hard line connections that it quickly vanished once the Internet became ubiquitous in the mid-to-late 90’s, then was rebuilt on a smaller scale after 9/11 to support emergency communications.

    It was definitely not as capable as a full TCP/IP implementation, but it was a good first-generation system. Transfer the same protocols to hard lines with real bandwidth, add some additional protocols, and you have a worthwhile network.

    However, I see no reason to think this alt-network would evolve hyperlinks. So many of the posts in this thread boil down to, “oh, the Net would have happened anyway, because we could have done thus and so and gotten decent connections over UUCP/FIDO/X.25/whatever.” And yes, that gets you a useful network, but it does *not* necessarily lead to the World Wide Web or a reasonable proxy thereof.

    The thread is “World Without Web”, not “World Without Internet.” The difference matters.

  98. @not(Andy Rubin): “Things were semi well-policed. There was a famous incident where Pournelle got himself kicked off his guest account at MIT. http://www.stormtiger.org/bob/humor/pournell/story.html.”

    Perversely, the URL linked to above is actually support for esr’s thesis. In it, an ARPAnet administrator writes the following line to Pournelle:

    The more attention you (and other people) draw to non-blow-em-up use of the arpanet the more likely some Proxmire type is to start inquiring into its operations.

    Which is pretty much exactly the point made in the essay at the top of this thread…

  99. esr Says:
    > If you think this what-if qualifies me as an “advocate of the state”, you’re out of your fucking mind.

    We are all statists, but some of us know it, and all marxists, but some of us know it. Statist ideas are so pervasive, that one unthinkingly accumulates lots of them, even with conscious efforts to scrutinize where one’s ideas come from, and what is the evidence and basis for one’s ideas. When unplugging a sewer, one gets covered in sewage.

  100. @Baylink: See also the 16mm to 3/4″ U-matic transition in TV in the early 70s and what that did to the historical record… New technology is not *always* a win…

    I can speak to that, and also U-matic to (pro)Beta, and Beta to servers; I work in the biz. There’s a mezzanine in a (huge) garage that hits 120F and up on hot summer days, half-full of 16mm film, mostly in cans; I wouldn’t try to run it through a projector live: might get one good pass, and the color is going to be… weak.

    The tape formats live in a non-climate-controlled warehouse. You get one pass and you’ve got to clean the heads. Second pass? Ugh. Lotta dropouts.

    It gets archived-as-data if it gets re-used. The tapes are (somewhat) indexed. Otherwise, it’s crumbling faster than pulps and newsprint. You can’t have interns archive it, because it’s too fragile and so are the machines to play it back; you can’t have Highly Paid Professionals archive it, ‘cos they’re Highly Paid. And so it sits until needed, when there’s a mad scramble in Geekville to turn sow’s ears back into silk — or gold.

    Servers? So far, almost good, though storage limits mean what gets saved is only the cream; gone are the days of shelving three hours of raw video of which 20 seconds were used in the outgoing stream. And we’ve actually got an archiving system now that can be automated.

  101. Good Lord, man!

    The story you paint makes “The Exorcist” look like a kid’s happy bedtime story. How close it came to not happening, I never want to know.

    Those were heady days.

  102. I think something like “the web” would have happened anyway. Not clear if it would have supported transclusion and more thoughtful bi-directional links.

    As for Linux, the unavailability of the Kernel just means we’d be using a different, BSD-licensed kernel. So we’d all be using a BSD Unix.

    The impetus for Linux was the fact that Minix source code was copyrighted by Prentice-Hall.

  103. An interesting scenario.

    I’m not sure it would actually have turned out that way – so many variables – and possibly we would even have gotten something better (though doubtful). Still, it’s a very interesting thought experiment, which reminds us that the frameworks we build the future on matter.

    It reminds me of the observation that when it comes to national development, institutions matter. The natural experiment on the Korean peninsula demonstrates this, especially when one remembers that during the Japanese occupation it was the north that was advanced and industrialized, and the south that was backwards and agricultural.

    It occurs to me that the contemporary real-space equivalent to the internet may be the Seasteading movement.

  104. I consider it a fact that capitalism & democracy are anathema to each other, given that our civilians are bombarded with more propaganda – called advertising & lifehacking – than Adolf Hitler’s murder spree had need of.

    Second: no insult meant, but you were simply sloppy in not mentioning that the ‘hype’ is no natural phenomenon but the self-interest of an industrial elite. Alternates and competitors are smear-jobbed, or outright criminally suppressed, to milk the money.

    Walkie-talkies COULD be an example that something like smartphones would have been invented anyway. Maybe less tweaked to the Microsoft dominion.

    I don’t know. What I know is the number of happy & healthy people I have seen in the 46 years I have been on planet Earth: it is not exactly encouraging.
