Things Every Hacker Once Knew: 1.8

Heritage games. The legacy of all-uppercase terminals. Where README came from. What “core” is. The ARPANET. Monitoring your computer with a radio. And more…

Things Every Hacker Once Knew

The response to this document has been nothing short of astonishing. More than half of my non-spam mail over the last three weeks has been people writing to suggest additions and corrections or just to thank me. The count of respondents must be over a hundred by now.

A reminder: This document is not intended as a mere ramble down memory lane – it’s much more focused than that. New content has to pass three filters: (1) was common knowledge at the time, (2) has since been forgotten or seems very near forgotten, and (3) might be useful knowledge to younger hackers working on open-source platforms – or is, at the very least, entertaining.

Some matters of potential interest:

1. The largest category of suggested additions that has failed to pass my filters is facts of the form “specified set of obscure ASCII control characters was or is used in specified obscure point-of-sale or financial-transaction protocol”.

2. The most popular single suggested addition that I’ve rejected is that DEL is 7 1-bits because of the paper-tape rubout character. Sorry, guys, I had to draw the chronological/generational line somewhere; just forward of paper tape and punched cards is it.

3. I myself became a direct observer of all this in 1976, about a year after VDTs matured into their final form and at about the earliest point that they were replacing printing terminals in Ivy League computer labs (the rest of the universities lagged this by a few years).

4. I’ve had two mildly protesting emails out of a hundred or so from people who identify as hackers but came up through mainframe-land and never dealt with the whole scene around serial terminals, micros, BBSes, and so forth. Had to tell them that I can only write what I know – and that the ex post facto justification for caring about what I happen to know is that it morphed into today’s open-source culture.

And a final reminder: If gratitude were cash this document would make me a rich man. Since it isn’t, and I have no funding, please express your thanks more tangibly by joining my Patreon feed.


77 comments

  1. You forgot to link the document, this time.

    Sidenote: I wonder why some articles have “read more” pagination, and some are whole on front page…

    1. >You forgot to link the document, this time.

      Fixed.

      >Sidenote: I wonder why some articles have “read more” pagination, and some are whole on front page…

      That’s a choice I get to make. There’s a special tag I can insert that, if present, tells WordPress to paginate at that point. To illustrate, I’ve added one to the OP.

  2. > Sorry, guys, I had to draw the chronological/generational line somewhere; just forward of paper tape and punched cards is it.

    I agree with the mob on this one. ASCII is organized into neat groups, with NUL and the rest of the non-printing/control characters first, followed by space, 94 printable characters, and the non-printing/control character DEL. That DEL sits on an island as the only control character at the opposite end of the character set makes no sense unless you know the history of punched tape, which required that NUL=”no punches” and DEL=”all punches”.

    The deliberate alternating between punches and NULs, to leave room to DEL out an error and insert corrections over the NULs, was considered a Best Practice in many shops, as the amount of tape “wasted” this way allowed multiple rounds of error correction to be done on the same tape before it had to be abandoned. (For this to work, NUL arguably wasn’t even a character back then, at least when interpreted in a punched-tape context. It was the absence of any character at all, just as it was in Baudot. Giving NUL the special string-termination property couldn’t make sense until after it had fully matured into an actual character.)

    This is significant enough to be worth mentioning, not only to explain this particular design decision, but to illustrate how compatibility with now-obsolete technology informs so much of what we do, which seems to me to be the whole point of the article. It doesn’t need to be a great deal of detail, but a simple mention in passing that “DEL was chosen to be ‘all punches’ and could therefore literally delete any existing character on tape” is entirely justified, based on how this affected the design of ASCII, and therefore Unicode.
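
    To make the trick concrete, here’s a minimal sketch (mine, in Python; nothing below is from the original discussion) of why “all punches” works as a rubout: holes can only be added to tape, and ORing any 7-bit code with DEL’s all-ones pattern gives DEL back.

        DEL, NUL = 0x7F, 0x00
        assert format(DEL, "07b") == "1111111"   # seven 1-bits: every hole punched
        assert format(NUL, "07b") == "0000000"   # no holes at all
        for ch in "HELLO":
            # Punching the remaining holes over an existing character ORs its
            # code with DEL, and anything ORed with 0b1111111 is 0b1111111,
            # so the character is effectively erased.
            assert ord(ch) | DEL == DEL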

    Also, it’s worth noting that punched-paper tape was still used by the NSA for key distribution as recently as last year, so it’s not even accurate to say it’s obsolete tech. For that specific purpose, they’re just now finally implementing a replacement.

    1. >Also, it’s worth noting that punched-paper tape was still used by the NSA for key distribution as recently as last year,

      Oh dear Goddess. Well, I guess that blows it through the relevance filter.

  3. I have to agree with @The Monster — keep at least a little reference to punch tape. It’s relevant to me because I had to use punched tape in the Air Force from 1984-1986, well past the point where I had my own Amiga.

  4. > which required that NUL=”no punches” and DEL=”all punches”.

    This also explains why the ASR-33 used even parity.

  5. As someone who came of age in the BBS era, I agree with keeping the back story of DEL. No, punch cards per se weren’t part of my experience, but it’s one of those puzzle pieces that helps even modern technology (UTF-8) make sense. In the same way, even though I can’t recall seeing a physical serial connection that wasn’t 8N1, knowing about parity bits explains what that magic incantation is for (I remember 7E1 on a modem at least once, though).
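
    For readers who never had to care, here is a small illustration (my own sketch in Python, not the commenter’s) of what the E in 7E1 means: 7 data bits, even parity, 1 stop bit, where the parity bit is chosen so the total number of 1 bits comes out even. As a side effect, even parity keeps NUL as “no punches” and DEL as “all punches” on 8-level tape.

        def even_parity_bit(code):
            """Parity bit for 7 data bits under even parity."""
            return bin(code & 0x7F).count("1") % 2

        assert even_parity_bit(0x00) == 0        # NUL stays all zeros on the wire
        assert even_parity_bit(0x7F) == 1        # DEL gains an 8th one-bit: all punches
        assert even_parity_bit(ord("A")) == 0    # 'A' = 0b1000001 already has an even count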

  6. I work for a man who programmed COBOL on Burroughs mainframes. My mom was a programmer on IBM mainframes. They have strong memories of punch cards.

    Just as the width of a modern car comes from the standard width of an ox wagon in the Roman Empire, I hope you’ll change your mind about mainframe stuff. Since reading Alan Kay and his reference to the B5500 a few weeks ago, I’ve been astounded by how much stuff the mainframes had back in the 1950s. And how much of Unix, Mini and Micro computers was (and still is) just playing catch-up. The Mini came to the Micro, but the Mainframe is now coming to the Micro too; I expect to see more and more mainframe stuff entering hacker culture in the next few years. The mainframe generation isn’t completely dead, but they won’t be with us long. And I think they still have valuable things to contribute.

  7. I freely grant there was a significant generational change between the old PalmOS, Blackberry, Symbian, Windows CE, and other oddball PDA/phone hybrids that you could find 1996-2006, and what a “smartphone” came to mean after the original iPhone release.

    But If you’re going to date “smartphones” to 2007 in the timeline, you probably need to use a qualifying adjective like “true” or some sort of description, since you’re using the term at variance to how it’s used at, for example, the Wikipedia entry for them.

    1. >since you’re using the term at variance to how it’s used at, for example, the Wikipedia entry for them.

      I just looked, and that’s not what I think I see.

  8. One thing I, as a younger hackeroid, would like to see, either as part of this or as a separate article is “things you remember, but were too young to understand” for people who grew up in the 90s, or thereabouts. This would probably be mostly micro (and in particular Windows/DOS) focused, but would probably include a few mini/mainframe things.

    For example:

    1) I remember our library’s card catalogue system used greenscreen terminals as the user-facing bit, but I don’t know what the hardware/software in the back room was (or even what model the terminals actually were). What solutions were most common for this sort of thing in the 90s? Would this most likely have been VT-100s hooked up to a VAX running VMS or Unix in the building? Would it have been 3270s dialed in to a mainframe at the HQ for the county library system?

    2) Our family didn’t have a network connection us kids were allowed to use until around 2000, and I’m fairly certain from that point we were using straight IP-over-dialup. I know that services like AOL in the earlier 90s provided Internet access, but I get the impression (from my reading in more recent times) that a large part of what they provided was access to non-IP services running on their own servers directly. I suspect a lot of kids my age who used these services in the earlier 90s couldn’t really differentiate between IP and non-IP dialup services provided by whatever provider they were using. They just knew they were “online”. They might find it interesting to know what was going on behind the scenes.

  9. I just now noticed heritage games in the document. (Probably because they only appeared as of 1.8.)

    MUDs might be worth mentioning. These were the multiplayer version of Colossal Cave, and they exploded in both absolute number, and also variety – LPMuds, Dikus, Fidos, and a bunch more I don’t remember anymore. (Darker Realms, an LP based at Texas A&M that I spent way too much time on, turns out to still be running today.) Relevance? Everyone back then knew what a Beastly Fido was, I suppose. That, and a lot of players got immersion in object oriented programming, since that’s how these worlds were implemented.

    I recall playing a great deal of netrek (sic) during the early 1990s. Vector graphics drawn in a raster style (is vector vs. raster worth putting in?). Multiplayer – 16 at a time in classic, and I’d seen variations going up to 32. Real-time action – everything synced to a clock, which still amazes me, given the network equipment available at the time. Players grew very familiar with concepts such as ping rate, lag, traceroute, and other wrinkles that still apply today.

    Doom is becoming heritage, which is odd to me. It’s arguably in a weird place. The most recent version was faithful to the original (and its sequel), so it’s not dead knowledge. One feature I suppose is notable: on top of all the massive leaps in graphics and sound technology this game made, it was nevertheless shareware (not a new thing, although you were getting a lot of value here), and also easily moddable – it came with an editor, so hobbyists could develop their own maps and share them. It’s the first game I can think of that came with this feature, and I think this greatly increased Doom’s lifespan. And of course, it, too, was multiplayer, one of the first, and probably the primary driver behind gaming LAN parties.

    Doom, of course, was published by id Software, probably the most successful example of a shareware-based game developer at the time. Commander Keen (a platformer) and Wolfenstein 3D (an FPS) before them; Quake and damn near everything else afterward.

    Speaking of game lineages: an older game, Spacewar, was the ancestor of Star Control, which had a heyday for a few years before Activision choked it into dormancy. Colossal Cave begat a pantheon of *Quest games from Sierra On-Line in the late 1980s / 1990s (King’s Quest got a reboot very recently). Roguelikes are direct ancestors of the Diablo series, one of the main reasons Blizzard Entertainment is a modern-day juggernaut. (The other is Warcraft, which dates back to a Dune game published by Westwood, but I don’t recall much in the way of hacker features on this.)

    Eric, and pretty much everyone else here, should be made aware, if they haven’t been already, that someone apparently found an old bones file of Nethack’s devteam and reloaded it, as of a year or so ago: the nuclear family of character-based roguelikes lives again. They even released an update to the venerable 3.4.3 that had stood for ten years. No new content AIUI, but it apparently had a bunch of work done behind the scenes to make development easier today. (IIRC, Subversion was barely a thing when 3.4.3 first came out.)

    1. >Eric, and pretty much everyone else here, should be made aware, if they haven’t already, that someone apparently found an old bones file of Nethack’s devteam, and reloaded it,

      Er. That was me, actually. Some of the devteam got rather pissed off about it, but as an old devteam member myself I felt I had both the right and the duty to bust the dysfunctional secrecy they had devolved into.

  10. “dungeon-crawling adventure game” is undefined in your text, but my interpretation would include Hunt the Wumpus, which predates Colossal Cave Adventure.

  11. > old bones file of Nethack’s devteam, and reloaded it, as of a year or so ago: the nuclear family of character-based roguelikes lives again

    > bust the dysfunctional secrecy they had devolved into.

    Ok, I know what a bones file is, but I’m unclear how the digging-up of an old one a) makes roguelikes live again and b) busts a secrecy culture.

    Could somebody please enlighten me?

    1. >Could somebody please enlighten me?

      I believe Paul was being metaphorical. It wasn’t a real bones file, but a git conversion of their repo – bones of their development history, as it were.

  12. Something like that, yeah. “Reloaded a bones file” was the closest NH concept I could think of to “resurrected the project devteam”.

    (Come to think of it, I have to wonder why no one ever implemented resurrection or necromancy in the game. Like, with a wand. It’s been a while, but I could probably brainstorm several YASDs with it…)

  13. After 2000, however, as multiprocessor systems became increasingly common even on desktops, “core” increasingly took on a conflicting meaning as shorthand for “processor core”.

    Is there a more precise date for this – or, is 2000 a precise date rather than a round number? I’m more interested in when single-core processors became obsolete than when consumer desktops with multicore processors first came on the market, and 2000 is definitely too early for that – it seems like just about anything you can get these days is at least 2-core x 2-thread, and I’m curious as to how long that’s actually been.

    Another interesting date, if it doesn’t turn out to be the same, would be when clock speed became less emphasized in CPU marketing. Like, the number in MHz used to be practically part of the product name.

  14. Ok, thanks for the clarifications re Nethack. I’ve built NH several times, but I never looked for version control history. I gather that it was unavailable to the public before the recent git conversion. I can see how that would piss off all the right people.

  15. (Come to think of it, I have to wonder why no one ever implemented resurrection or necromancy in the game. Like, with a wand. It’s been a while, but I could probably brainstorm several YASDs with it…)

    Looks like someone has failed to fully examine the possibilities of a wand of undead turning/spell of turn undead.

  16. Another interesting date, if it doesn’t turn out to be the same, would be when clock speed became less emphasized in CPU marketing. Like, the number in MHz used to be practically part of the product name.

    That would be 2007ish. Everybody hit 4 ± 0.25 gigahertz as their best and stuck there (oh, IBM POWER6 managed 5, but later generations crawled back down), at which point it became rather moot. If somebody has a 3.8 GHz Pentium IV, you can’t market a seventh-generation Intel Core i7-7700 to them by pointing out the latter is 3.6 GHz.

    1. >That would be 2007ish. Everybody hit 4 ± 0.25 gigahertz as their best and stuck there

      That’s worth adding.

  17. My reading of Wikipedia agrees with Steven’s:

    The term “smart phone” appeared in print as early as 1995, describing AT&T’s PhoneWriter Communicator.

    And in particular, I owned what was undeniably a smartphone, running Windows Mobile, in 2003.

    1. >And in particular, I owned what was undeniably a smartphone, running Windows Mobile, in 2003.

      How would you distinguish post-2007 smartphones from that?

  18. Eric, is there any chance of a post “Things every 2A activist once knew”?

    I’m thinking in particular of the S&W boycott during the late Clinton years, and the Bellesiles scandal, which marked respectively the political and intellectual turning points in the gun rights fight. You
    can add others; the Cincinnati Revolution comes to mind.

    I think this is important, because it is important that we on the pro-rights side know that our victories were not gotten easily, nor were they a gift given us by the GWB administration or Newt Gingrich. Actually, up until the expiration of the AWB in 2004, to the best of my knowledge NOT ONE gun control law had ever gone out of existence. Also, virtually every anti-gun law ever proposed would eventually become law.

    Almost OT, except for the Forgotten Knowledge angle.

    1. >Eric, is there any chance of a post “Things every 2A activist once knew”?

      It’s a great idea. I don’t think I know the territory well enough, though.

  19. Looks like someone has failed to fully examine the possibilities of a wand of undead turning/spell of turn undead.

    Harumph. You’re right. I hadn’t. (It’s not quite the thing I had in mind, but close enough that it should have twigged in memory if it were ever there to begin with.)

    2A: I’d kinda like this, too. GunCite.com ought to be this, but it seems rather dated now – I think the editor may have gotten busy elsewhere. (Notes addressing David Hemenway are particularly hard to find.)

    Priorities, though. I still need to look for that ICEI mission statement, then try to advertise, while I go about getting a Patreon account.

  20. How would you distinguish post-2007 smartphones from that?

    I don’t have any pithy adjective to suggest, but the (only) two notable things that changed in 2007 were: 1. iOS and Android both launched that year and 2. there was a dramatic elbow in the adoption curve.

  21. >Like, the number in MHz used to be practically part of the product name.

    Does anybody have any idea, in that era, why a box’s clock speed would be *under*-nameplated?

    My family bought an eMachine in 2001 with a nameplate clock speed of 766 MHz. We assumed for years that that was the actual clock speed. Then, once the family was done with it and had handed it to me to mess around with, I happened to notice that Linux, while booting, showed the clock speed as 1 GHz. I then ran several different clock speed utilities and each showed 1 GHz. I then determined the exact processor architecture (Coppermine Pentium III) and looked up what CPUs Intel had produced of that architecture: there’d been a model that ran at 1 GHz, but none that ran at 766 MHz.

    I can understand over-nameplating a machine to pull a fast one on your customers, but why would anyone under-nameplate a machine and make it less marketable?
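
    For what it’s worth, the same check is easy to repeat today. A minimal sketch, assuming a Linux system that exposes /proc/cpuinfo (field names vary a bit by architecture):

        # Print the advertised model string and the current clock of each core.
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith(("model name", "cpu MHz")):
                    print(line.strip())

    On a modern CPU with frequency scaling the “cpu MHz” figure moves around, so the “model name” line (which usually includes the rated frequency) is the steadier reference when hunting for a nameplate/actual mismatch like the one described above.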

  22. Actually, up until the expiration of the AWB in 2004, to the best of my knowledge NOT ONE gun control law had ever gone out of existence.

    That’s false. Concealed carry laws began to liberalize a lot earlier than that; I’m sure you’ve previously seen https://en.wikipedia.org/wiki/Concealed_carry_in_the_United_States#/media/File:Rtc.gif. Earlier times also saw the repeal of a lot of Jim Crow laws aimed at disarming blacks. At the federal level, the Firearm Owners Protection Act of 1986 repealed some provisions of the Gun Control Act of 1968, such as the prohibition on shipping guns through the postal service.

    Also, virtually every anti-gun law ever proposed would eventually become law.

    I’m not going to bother digging for citations on this one, but given the sheer volume of stupid bills that get introduced with every legislative cycle and die a quiet death in committee, I’d be shocked if this were correct.

    However, I agree with your general sentiment that the 90s were a generally shitty time for RKBA, and that 2004 was an inflection point of sorts.

  23. Really? You had web service and apps in 2003?

    Yes. It had a real browser (not just WAP), GPRS data service, and a decent ecosystem of third-party apps. A couple I remember installing were an IRC client and a Z5 interpreter (some sort of Frotz derivative) for playing text adventures. When I got caught using the IRC client during class, the teacher asked, “Daniel, are you texting during class?”, to which I responded “No!” with feigned indignance :-).

  24. It wasn’t the capabilities that transformed, as much as it was the market. Before the iPhone, you had several competing design philosophies aimed at different markets (one manifestation being the different touchscreen, keyboard, and stylus approaches), multiple seriously competing OSes, a variety of processor architectures, and only niche adoption. After the iPhone (plus a bit of a shakeout period), you had only iPhone-like touchscreen phones, running either the industry-commodity OS or Apple’s, on ARM processors, with mass adoption.

    So I’d say that the modern smartphone was born in 2007, much like the modern PC was born in 1981. There wasn’t much you could do with an IBM PC in 1982 that you couldn’t have on a suitably-equipped CP/M machine in 1980, but the IBM PC is the direct lineal ancestor of the box I’m typing on in a way that a Z80 box was not (even if I have no specific hardware, beyond a legacy instruction set in the processor, to show that descent anymore).

  25. Really? You had web service and apps in 2003?

    I’ve still never found a better RSS reader than QuickNews on my Treo 650.

  26. Ken on 2017-02-14 at 21:19:31 said:
    > Eric, is there any chance of a post “Things every 2A activist once knew”?

    As far as I know no one has put that together, and I’m not sure it would be all that *useful*, but it might make for some interesting reading.

    You should ping Clayton Cramer on it–he’s part of the group that took Bellesiles down, and has been doing a LOT of historical research.

  27. Another significant thing about 2007: that is the year Linux almost had the desktop, and then started going downhill.

    By 2007 I was at a point where I felt good about grabbing the latest Ubuntu and installing it for Aunt Tilly. Hardware support was finally working (mostly) or at least, the progress was in a positive direction. Skype worked, Java worked, Flash player worked, OpenOffice was getting better and better.

    And then Lennart Poettering started doing his thing. And all the main Linux distributions started installing his stuff and using us as debuggers. OK, I don’t mind if someone wants to be on the bleeding edge; that is a great option to have. But when you just force everyone (even Aunt Tilly) to go through the pain, well… so we had avahi, pulseaudio, and then pulseaudio stabilized and we got systemd. And we got stupid stuff like mountpoints that even root couldn’t access, which broke a lot of backup scripts and gave me grief when I was making simple filesystem diff utilities.

    I believe this 2007 inflection point is why Munich is considering abandoning Linux. Around 2007 it was compromised, and the forward momentum was killed. Munich gave it 10 years to recover. Hurray for bureaucratic momentum. I don’t want to blame Red Hat entirely, but it seems like they’ve been making Linux more and more like Windows… a jobs project for certified admins, because it is a giant labyrinth of complexity and surprising behavior, not the simple and well-known thing it used to be.

    The switch from Gnome 2 to Gnome 3 had some big issues. Still does; they’ve ruined cross-platform compatibility. Gnome 3 has broken the usefulness of GTK for making cross-platform GUI apps, because their only concern is Gnome 3. They lost sight of what ESR wrote in The Art of Unix Programming about the Linux attitude of emulate-and-subsume (rather than embrace-and-extend). I think the Gnome team should be forced to re-read The Art of Unix Programming every year. And then there’s the stinky attitude of Lennart and friends toward keeping things compatible with BSD where it can reasonably be done.

    Thankfully the BSDs are still around. Barely. And Alpine Linux. Alpine is a life saver. 2007. 2007 is the approximate year where stuff started breaking for the Linux desktop, and whenever it would stabilize, they’d find some way to complicate and break it again.

  28. 2007 is late by a few years for smartphones – Palm Treos were niche but not unheard-of by 2004-05, as were Windows Mobile phones. Both ecosystems dated back to a few years beforehand (Palm to before 2000 and WinMo/WinCE to shortly after 2000) and were functional with “web services and apps.” Just as the iPod didn’t invent the category of “MP3 player,” the iPhone didn’t invent the category of “smartphone.” Apple probably invented the “app store” model of app distribution; they certainly popularized it if they didn’t invent it, at any rate. Both platforms had 3rd-party apps and hardware, and are functionally equivalent to “post-’81 micros” in the sense I think you mean it. The 2007 smartphone evolutionary advance was putting capacitive screens (which allow multitouch touchscreens that are more accurate) on top of new OSes with integrated app stores. Both iOS and Android are evolutionary developments of the Palm and WinMo devices that went before, NOT a revolution along the lines of the introduction of the IBM PC. And the Palm and WinMo smartphones were basically PDAs with integrated cellular modems, internalizing a capability that could be had via external hardware going back quite a while.

    The “first smartphone in the US” title arguably goes back to the Kyocera 6035 (2001, per wikipedia), but the Treo 600 (2003) is probably a better contender for “first mass-market/modern smartphone.” It has all the features of the popular conception of a smartphone, up to and including web access (not WAP, though rich web content was somewhat limited) and 3rd party apps. Pocket PC 2002 is contemporaneous, and the devices were more capable, at a higher cost. I bought my first WinMo smartphone device in 2005 or so; and it was every bit a smartphone in the vein of an Android or iPhone.

    And that’s not even touching the Blackberry, which may or may not be “a post-’81 micro” in the sense you appear to mean (no real 3rd-party apps, no 3rd party hardware at all), but is generally considered to be a “smartphone.”

  29. Short form:
    2003: The first modern smartphone (Treo 600) ships

    (The Kyocera did not fully integrate the cellular part of the device into the package – it was basically a “PDA + off-board cell modem” in one clamshell package; the two devices communicated with each other but were not systems-integrated.)

  30. I carried a couple of different Treos. One saved my butt when I needed to make quick travel plan adjustments while I was on a business trip. It also caused me some grief when I couldn’t get the thing to shut off when the cabin door was closed; I finally wound up taking the battery out.

    I’m not entirely sure I can put my finger on the difference between the Treo and the iPhone, but it felt like a quantum leap, even with iOS 1 that didn’t officially allow third-party apps. The higher-resolution, larger screen and much more polished interface were the drivers, sure, but thinking about it, it seems to me the big advance was going from PalmOS to an OS X (and therefore Unix) derivative.

    In any case, while the Treo 650 that was my last one was a capable smartphone, the ones that came after left it in the dust.

  31. The difference between a Treo and an iPhone is akin to the difference between DOS and more advanced OSes. The revolutionary step was tightly integrating a cell modem into the PDA, giving you an “always-available” data connection (and voice connection) in a single, tightly integrated device that ALSO had third-party app support, non-WAP web support, and a non-T9 keyboard. iOS and Android devices have better and more-capable UX, but the Treos and WinMo phones are as significant to the development of smartphones as the IBM PC is to microcomputers.

  32. > “always-available” data connection (and voice connection)

    You mean or voice, right? The first generation iPhone didn’t let you do both at the same time. You couldn’t do it on Verizon until the iPhone 6.

  33. You mean or voice, right? The first generation iPhone didn’t let you do both at the same time. You couldn’t do it on Verizon until the iPhone 6.

    I’m trying not to get down into the weeds of implementation details. Both were “always available,” but not “simultaneously available.” The distinction is, with the pre-smartphone PDAs, you could cable up (or use BT if so equipped) a conventional cellphone for use as a data connection, or get a “sled” to mount to the back that either contained a fixed cell radio or a PCMCIA slot for a PCMCIA cell device (which may or may not have also had voice capability). From the Treo 600 on, these devices were fully-functional cell phones as well as fully functional PDAs with 3rd party apps (that last one rules out the RIM Blackberry devices, until very late in their life span), no extra hardware needed.

    Incidentally, other than the first-generation GSM iPhone (I don’t recall offhand why the first-gen GSM iPhone didn’t have the simultaneous data/voice capability), the simultaneous voice and data limitations on Verizon and Sprint were due to their use of CDMA networks instead of GSM/GPRS. (Fun fact: the latest LTE devices, unless they are doing Voice over LTE (VoIP over the LTE network), generally do not do simultaneous voice and data, because, unlike with GSM/GPRS and its successors, or CDMA/EVDO, you don’t have to run a separate data-network radio in the device. The early LTE devices did voice on the older GSM/CDMA voice networks, but the new ones do voice and data on LTE. If the voice is VoIP (VoLTE, usually called HD calling or something else like that), no problem, it’s just another data stream. But if the voice is not VoIP, then the voice stream occupies the LTE radio with an old-fashioned cellular voice stream.)

  34. >> BBS … 8N1 … 7E1

    There used to be some multiuser BBSs running on Unix-alikes back in the day, usually on 7E1, and a local university had one running on VMS with about 30 modems on it, also on 7E1. This was in the mid-1980s.

    I have a vague idea that at least one of the old commercial “online systems” (CompuServe, BIX, Delphi, AOL, etc.) defaulted to 7E1, but the details have faded from my memory.

    1. I think those 7E1 links were heritage from ASR-33s, which, as someone pointed out upthread, had even parity wired into the keystroke generation. I haven’t added this to the document because I’m not actually sure whether teletypes generated 7E1 or 7E2.

  35. @Ian Argent

    Both iOS and Android are evolutionary developments of the Palm and WinMo devices that went before, NOT a revolution along the lines of the introduction of the IBM PC.

    The IBM PC wasn’t a revolution, technically-speaking. It was within the normal variation seen in micros up to that point. To quote the Bytelines in the December 1981 BYTE, “[I]t offers no innovative features,” and “Several microcomputers are already on the market with features virtually identical to the Personal Computer — some even have more power[.]”

    The iPod and iPhone are quite analogous to the IBM PC; none of them were even remotely the first, none were particularly unusual technically, but they each defined the category and the subsequent mass market anyway.

  36. The “games” section should probably briefly mention games on _teletype_ terminals (most likely, the practice really became widespread with the Dartmouth implementation of BASIC, which first made computer programming accessible to a general population of college students) and the not-unrelated fact that simple VDT games were also played on early micros, because (1) some micros only supported VDT-like character graphics in the first place, and (2) these games were somewhat portable across micros and thus often featured in type-in magazines and books.

    README(.TXT) as a file name was not just a USENET thing, I assume. It was common in BBS downloads too, which generally were PKZIP-compressed archives; more relevantly (and most likely an earlier practice), diskette software often included a “README” file on disk, containing last-minute updates/release notes that did not make it into the printed documentation. The Wikipedia also cites “read.me” files from DECUSLIB tapes dating back to the mid-1970s.

    1. >The “games” section should probably briefly mention games on _teletype_ terminals (most likely, the practice really became widespread with the Dartmouth implementation of BASIC, which first made computer programming accessible to a general population of college students)

      I was one of those people, around 1972. Only I was 14 and in middle school – occasionally I could get time on a DTSS terminal at Ursinus college.

      But those very early games (Wumpus, Hammurabi) are, I think, out of scope. I say this despite being the maintainer of the last Wumpus port myself. They belong in a general history of popular computing, but not I think in this one.

      >The Wikipedia also cites “read.me” files from DECUSLIB tapes dating back to the mid-1970s.

      AHA!

      I completely believe this. There was most definitely overlap between early USENET source sharing and DECUS tapes – I’m pretty sure I remember seeing comp.sources stuff with DECUS attributions.

      It’s a safe bet then, that what happened is that the README convention spread from DECUS to both USENET and the BBS scene (probably through CP/M). The Wikipedia source shows that it goes back as far as TOPS-10, though I don’t recall seeing it there. I shall amend appropriately.

  37. > But those very early games (Wumpus, Hammurabi) are, I think, out of scope.

    Your call, of course. I thought that a brief mention would be appropriate, because even hackers who didn’t know of character-cell interfaces through VDTs would surely know of them because of micros. More generally, early micros also brought some worthwhile innovations in text interfaces which should probably get some recognition in these sections – the ‘orthodox’ file manager a la Midnight Commander seems to have originated there, as did the “user-friendly” text-editing interface that later made it into “pico” (nowadays cloned by “nano”). Heck, some element of that has even made it into the latest versions of Emacs, which now makes the “graphical” menus browseable in text mode much like those old editors. Although, I can’t quite tell whether some of this seeming innovation was stuff that both micros and Unix simply picked up from other systems like VMS, or even mainframes.

  38. Did “everybody” share an opinion about VisiCalc?

    It seemed to me from my sideline seat in the 1980s computer skirmishes that, even more than e-mail, spreadsheets were the desired tool of the business users, while being the most wild and dangerous thing the “data processing” and allied financial accounting number crunchers could imagine. Users wanted to consolidate or compare numbers from this report to that, or from one source to another. D.P. looked at users key-stroking (and typo-introducing) data and algorithms using hardware and software they’d never spec’d, or documented, or trained on, or sometimes SEEN — and prepared to deal with complaints when the resulting Visi-number was somehow different from the MainFrame-Number. By preference the DP shop would have killed spreadsheets in the cradle, it seemed to me at the time.

    It was as much a generational thing as a business silo thing, in my experience. The older folks in DP, F&A, and facilities loved the mainframe, while younger ambitious souls in sales, logistics, personnel, and procurement would make plans, at home with personal tools, without completely disclosing quite how their recommendations were developed.

  39. > I haven’t added this to the document because I’m not actually sure whether teletypes generated 7E1 or 7E2.

    Since I still have the manual downloaded, I can go ahead and answer this – it’s 7E2.

  40. > Did “everybody” share an opinion about VisiCalc?

    > …spreadsheets were the desired tool of the business users, while being the most wild and dangerous thing the “data processing” and allied financial accounting number crunchers could imagine.

    Yup. VisiCalc does not belong in the UI section, even though it was a very good example of a text user interface, precisely because it was not a part of _hacker_ culture even tangentially. I think even today, most hackers understand that spreadsheets are actually a terrible idea, and even for simple calculations it’s better to write a short program in the high-level language of choice. (Back in the day this would’ve been BASIC or perhaps FORTRAN; today Python is popular.)
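
    To illustrate what that looks like in practice, here is a throwaway Python sketch (hypothetical numbers, purely for illustration) doing the kind of consolidate-and-compare job a manager might otherwise reach for a spreadsheet to do:

        # Quarterly figures from two sources that "should" agree (made-up data).
        mainframe_report = [1200, 950, 1100, 1340]
        spreadsheet_numbers = [1200, 955, 1100, 1340]

        for quarter, (a, b) in enumerate(zip(mainframe_report, spreadsheet_numbers), 1):
            if a != b:
                print(f"Q{quarter}: mainframe says {a}, spreadsheet says {b} (delta {b - a})")
        print("totals:", sum(mainframe_report), "vs", sum(spreadsheet_numbers))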

  41. > But those very early games (Wumpus, Hammurabi) are, I think, out of scope.

    Fair enough. But the claim that Adventure was the “very first” dungeon crawler remains false, unless you deny that Wumpus was in that genre. Since TEHOK isn’t a gaming history, may I suggest just changing the wording to avoid the whole argument?

    1. >But the claim that Adventure was the “very first” dungeon crawler remains false, unless you deny that Wumpus was in that genre.

      I’ve never thought of Hunt The Wumpus that way. Probably because to me “dungeon crawler” is associated with a visual feel and a set of tropes that the original Wumpus never tried to invoke.

      There were some later variants of Wumpus that included multiple hazards, like termites that could eat your arrows or bats that might pick you up and drop you at a random location. Those felt more like dungeon crawlers…but I think they postdated ADVENT, so it’s not clear which direction the influence ran.

  42. @esr:

    > But those very early games (Wumpus, Hammurabi) are, I think, out of scope. I say this despite being the maintainer of the last Wumpus port myself.

    Speaking of wumpus, have you ever encountered “Be the Wumpus”?

    It’s a modern game that involves hunting by sound in total darkness. Much fun.

    1. >Fair warning: that site will consume hours of your life.

      Fascinating. If the Antiquarian is correct, Crowther wasn’t intending to write what we’d think of as a dungeon crawler at all – those elements were Woods’s additions.

  43. >The IBM PC wasn’t a revolution, technically-speaking. It was within the normal variation seen in micros up to that point

    The revolution came about because the company whose name was synonymous with “computer” in the mind of the average business person was finally on microcomputers, which meant that they could be used for serious business computing just like IBM mainframes and minis before them. That provided confidence to the early adopters (“no one got fired for buying IBM”). This, plus the fact that IBM did not assert any patents to prevent clones (only copyright claims that required a clean-room reverse-engineered BIOS), produced a set of HW and SW standards that allowed an ecosystem to grow up around them, in which everyone felt confident it was all reasonably compatible. By the time IBM’s market share had dwindled to insignificance, the PC compatibles as a group had grown to the point that network effects locked it in as the only game in town (until Apple built mindshare from “creative” types).

    Before that, it wasn’t possible to build software that could run on different manufacturers’ hardware without modifications. With the IBM-compatible PCs, it was not only possible, but fairly easy to do. It freed business from the overly-bureaucratic IT priesthood just as Gutenberg’s printing press freed reformist Christians from the Vatican’s priesthood. That IT Reformation was the revolution.

    >Did “everybody” share an opinion about VisiCalc?

    VC, and later Lotus 1-2-3, were the entire reason many people bought PCs. “${SPREADSHEET} is my OS; DOS is my boot loader” was the attitude of these people (whether or not they had the fluency to be able to express it in those words).

  44. The IBM PC wasn’t a revolution, technically-speaking. It was within the normal variation seen in micros up to that point. To quote the Bytelines in the December 1981 BYTE, “[I]t offers no innovative features,” and “Several microcomputers are already on the market with features virtually identical to the Personal Computer — some even have more power[.]”

    The iPod and iPhone are quite analogous to the IBM PC; none of them were even remotely the first, none were particularly unusual technically, but they each defined the category and the subsequent mass market anyway.

    The RIM Blackberry is analogous to the CP/M machines, Apple IIs and other pre-IBM PC micros. I was (and am still) in the industry, and the Treo was the first “popular” smartphone – the one that individuals and small business owners purchased. iPhone and Android are the “Windows machines era” of smartphones. Plenty of non-techies and non-enterprises bought Treos and Windows Mobile phones and used them for all sorts of silly and serious things.

    None of which is particularly relevant to “things every hacker knew,” but if it’s on the timeline, the smartphone era started in 2003 with the Treo 600.

  45. Before that, it wasn’t possible to build software that could run on different manufacturers’ hardware without modifications. With the IBM-compatible PCs, it was not only possible, but fairly easy to do. It freed business from the overly-bureaucratic IT priesthood just as Gutenberg’s printing press freed reformist Christians from the Vatican’s priesthood. That IT Reformation was the revolution.

    That’s another point towards the smartphone era starting with the Treo and the WinMo devices RIM devices were locked-in to RIM’s ecosystem, and there wasn’t (hardly) any 3rd party software for them. You could buy Palm OS software and Windows Mobile software at CompUSA. I don’t think anyone but Palm (and their relatives) ever made a PalmOS smartphone, but everybody and their dog made a WinMo device up until Android ate their lunch (which wasn’t for a year or two after Android launched, incidentally)

  46. @ESR
    >The very earliest VDTs, like the ASR-33, could form only upper-case letters.

    This sounds like it’s calling the ASR-33 a VDT. Perhaps something like this works better:

    The very earliest VDTs, like the ASR-33 paper teletypes before them, could form only upper-case letters.

  47. I think we’re actually at a rough consensus on the smartphone stuff?

    1) Yes, smartphones existed before 2007.

    2) Yes, something significant happened in 2007-2008 — a market transformation reasonably equivalent to the IBM PC revolution.

    3) Therefore, the 2007-2008 timeline entry on smartphones should be revised to resemble, more-or-less, the 1981 entry on the IBM PC. Say, something like “The first iPhones and Android phones ship; the smartphone market transitions to modern form.”

    Beyond that . . .

    While the MITS Altair is the generally-anointed “first personal computer” out of multiple possible candidates (various people will argue for everything from the Datapoint 2200 to the Apple I, but always in the context of arguing against the consensus around the MITS Altair), there’s no “first smartphone”, out of the similarly many possible candidate devices, that is so generally anointed. There is therefore no clear entry to be added to the timeline equivalent of the 1975 MITS Altair entry.

  48. >>Also, it’s worth noting that punched-paper tape was still used by the NSA for key distribution as recently as last year,

    > Oh dear Goddess. Well, I guess that blows it through the relevance filter.

    They don’t use DEL on those. They use HCF (Halt and Catch Fire).

  49. > Fascinating. If the Antiquarian is correct, Crowthwer wasn’t intending to write what we’d think of as a dungeon crawler at all – those elements were Woods’s additions.

    Yob doesn’t appear to have been doing anything intentionally dungeon-crawly with Wumpus, either. It’s clear that Wumpus prefigures the dungeon crawlers to come, but only by accident. So I withdraw my objection to Adventure being called “very first”.

  50. esr:

    The “Games before GUIs” section contains a minor error: “There was a another…”

    And in “On Steve Jobs’s passing”, you wrote: “The people who did the actual innovating in smartphones – notably Danger with their pioneering Hiptop…”, so it would appear that smartphones predate 2007 by your own admission. Unless you’ve since changed your conception of what constitutes a smartphone and not just a smartphone precursor, that is. Or I may have just misread you. ;-P

  51. It all depends on what you consider a “smartphone,” vs what’s a precursor. I plant the stick at “full web access, 3rd party apps, devices sold over the counter to and usable by non-Enterprise users.”

    That’s the Treo 600. If you add “multiple hardware OEMs,” that’s the Pocket PC 2002 devices of around 2003/04. If you change “3rd party apps” to “3rd party apps via integrated app store,” that’s iOS/Android. Delete “3rd-party apps” and “over the counter…” and you get Blackberries. (I’m not up on my RIM lore, I don’t know what their dates are with any precision.)

    I, based on my own domain knowledge, would roughly map the pre-Treo PDA/Blackberry period to the pre-IBM-PC era (the MITS Altair to Apple II micro era), the Treo/WinMo era to the IBM-PClone/DOS era, and the iOS/Android era to the Windows/OSX era. (PC-centric view, obviously.) I was and am deeply in all three eras in PDAs, and was involved with micros since my parents bought a C64 in the 1982-84 timeframe.

    There’s an elbow in adoption in 2007, no question. But it’s not that there were no mass-market smartphones prior to 2007. ESR asked “were there web services and apps in 2003?” The answer is yes. And they were on devices sold OTC next to “dumbphones,” and those devices were sold to non-Enterprise users.

    (Side note, there’s still a shocking amount of WinMo in embedded devices. I don’t recall seeing any PalmOS devices very recently, but I don’t keep an eye out for them either. Every time you sign for a package on UPS or FedEx, that’s a WinMo-powered cellular device, under a LoB app)

  52. “full web access, 3rd party apps, devices sold over the counter to and usable by non-Enterprise users.”

    Of course, it also has to have a large screen. Otherwise some feature phones, with J2ME for third party apps (app stores tended to be per carrier), qualify.

  53. Of course, it also has to have a large screen. Otherwise some feature phones, with J2ME for third party apps (app stores tended to be per carrier), qualify.

    “Full web access” will rule out the feature phones; even the ones with J2ME or BREW were limited to WAP browsing. Large screen is also somewhat subjective – I’d gate on “touchscreen” if I had to. (Which also rules out the early blackberries.)

  54. > “Full web access” will rule out the feature phones; even the ones with J2ME or BREW were limited to WAP browsing.

    Not in my experience (they had really crappy web browsers by even contemporary standards, but that’s not the same thing – they understood ordinary HTML just fine) – and in any case Opera Mini was one of the available apps.

    I was actually hesitant to suggest touchscreen because it rules out so many early Blackberry and, AIUI, Windows CE phones. Maybe “touchscreen or QWERTY keyboard”.

  55. ESR cut off the discussion of “first smartphone” by changing the note @ 2007 in the timeline to “iPhone and Android (both with Unix-based OSes) first ship.” Which is just as cromulent (and closer to in-scope) as “first smartphone,” and less subjective.

    The importance of iOS and Android isn’t that they are smartphones, but that after they came into being, feature phones were dead tech walking. 10 years on, and they’re just “phones.”

  56. Ian Argent:

    The importance of iOS and Android isn’t that they are smartphones, but that after they came into being, feature phones were dead tech walking. 10 years on, and they’re just “phones.”

    Surprisingly, there are rumors that the Nokia 3310 is going to be relaunched.

    For what it’s worth, I have a Nokia 1100 in a drawer. :-)

  57. Surprisingly, there are rumors that the Nokia 3310 is going to be relaunched.

    For what it’s worth, I have a Nokia 1100 in a drawer. :-)

    Never underestimate the market value of nostalgia.
