Things Every Hacker Once Knew: 1.5

The 1.5 revision of Things Every Hacker Once Knew is out.

Alas, I had to drop the reference to the Space Cadet keyboard. Turns out it shipped a 32-bit status word and this had nothing to do with 9-bit bytes at all. The indirect reference to the SAIL extended ASCII keyboard is still in.

Patrick Maupin’s revelation about the AT prefix is summarized.

The fact that UUCP was a hack around the old two-tier structure of phone rates is mentioned.

There’s more about TTL serial. Gary Miller, my very hardware-savvy lieutenant and now acting lead on the GPSD project, thinks this didn’t become a common way to ship data off peripherals and daughterboards until after 2000, with GPS chips leading the way. This matches my recollection, but I was pretty oblivious about that sort of thing until the last decade, so I don’t consider my recollection very good evidence. Commentary and corrections invited.

I’d like to pin down the year cathode-ray tubes disappeared. I know the leading display vendors ceased production in 2005, but I think the transition might have been as much as two years sooner. Again, corrections welcomed.

71 comments

    1. >Still getting v1.4, too; also, I think the versioning info up top is better (a la How to Become A Hacker)

      I’ve pushed 1.5.

      The trouble with versioning info up top is that I have to effectively duplicate what’s at the head of the change log, and yes they fall out of sync when I do that. Better to have a single point of truth.

  1. At least in some countries, for most people, 56k modems were replaced by ADSL routers which offered speeds of 128k, so there wasn’t the jump from 56k modems to 1Mbit “wide area Internet” as you seem to imply. In recent years, common home ADSL bandwidth might have increased every year or two e.g. from 384k to 512k to 1Mbit.

    1. >so there wasn’t the jump from 56k modems to 1Mbit “wide area Internet” as you seem to imply.

      Not immediately in places without SDSL, true. On the other hand there’s much to be said for artful vagueness here. I know ADSL still lags, but the ramp-up to fast SDSL and optical happened pretty fast after ’97 – I know I got FIOS with 15mbps down/5mbps up in 2002, which was already an order of magnitude better than 1mbps.

  2. You have stumbled on some popular content. Seems like it would be better to expand this, and make it a chapter of V2 of The Art of Unix Programming. With git and other changes since publishing, it needs an update anyway.

  3. the first workhorse Unix machine was the PDP-11 (first shipped in 1970). It had 16-bit words – but, due to some unusual peculiarities of the instruction set, octal made more sense for its machine code as well.

    This seems misstated. I’ve written a fair bit of PDP-11 assembler and I’d say that its affinity for octal was a feature, not a peculiarity. Eight registers, references to both fall naturally in groups of 3 in a single word instruction. Looks like it was designed for octal, and the older machines that had key switches on the front panel were labelled and grouped in octal IIRC. Not to mention that octal appeared everywhere in the documentation.

    https://infogalactic.com/info/PDP-11_architecture#Double-operand_instructions
    https://infogalactic.com/info/File:Pdp-11-70-panel.jpg

    One key item that might be worth mentioning here is that octal was more popular than hex possibly in great part because 7-segment LED displays were dirt cheap and available everywhere while the equivalent bit-point hex displays were rare, expensive and *very* power hungry.
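
    To make the “groups of 3” point concrete, here is a throwaway Python sketch of how a PDP-11 double-operand instruction splits on octal digit boundaries; the function name is mine and the whole thing is purely illustrative:

      # A double-operand PDP-11 instruction packs into 16 bits as
      #   oooo mmm rrr mmm rrr  (4-bit opcode, then 3-bit mode and 3-bit
      #   register for source and destination), so every field after the
      #   opcode lands exactly on an octal digit.
      def decode_double_operand(word):
          opcode   = (word >> 12) & 0o17   # e.g. 0o01 is MOV
          src_mode = (word >> 9)  & 0o7
          src_reg  = (word >> 6)  & 0o7
          dst_mode = (word >> 3)  & 0o7
          dst_reg  =  word        & 0o7
          return opcode, src_mode, src_reg, dst_mode, dst_reg

      # 0o012700 is the classic MOV #n,R0: source is mode 2 on R7, i.e. (PC)+
      print([oct(f) for f in decode_double_operand(0o012700)])
      # -> ['0o1', '0o2', '0o7', '0o0', '0o0']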

    1. >One key item that might be worth mentioning here is that octal was more popular than hex possibly in great part because 7-segment LED displays were dirt cheap and available everywhere while the equivalent bit-point hex displays were rare, expensive and *very* power hungry.

      That is very interesting, but if it was causative it was causation not generally understood at the time.

  4. I’ll disagree about the 7-segment bit, too: you can display hex just fine on a 7-segment display. Uppercase A, C, E, and F, and lowercase b and d, are all unambiguous.
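
    For the record, here is one common segment convention written out in Python – treat it as an illustration rather than a datasheet, since real decoders and hand-wired designs varied:

      # Segments are labelled a-g, clockwise from the top bar, with g in
      # the middle.  A, C, E and F come out as capitals, b and d as
      # lowercase, so none of the six letters collides with a decimal digit.
      HEX_7SEG = {
          '0': 'abcdef',  '1': 'bc',     '2': 'abdeg',  '3': 'abcdg',
          '4': 'bcfg',    '5': 'acdfg',  '6': 'acdefg', '7': 'abc',
          '8': 'abcdefg', '9': 'abcdfg', 'A': 'abcefg', 'b': 'cdefg',
          'C': 'adef',    'd': 'bcdeg',  'E': 'adefg',  'F': 'aefg',
      }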

  5. Were circuits for converting hex to 7-segment as widely available as the ones for converting binary-coded decimal (and therefore octal – the fourth bit could be held off, or with discrete logic anything depending on it could be simplified out)? They had to have been more expensive, though it’s possible that needing more displays might have made up the difference.

    Were there a significant number of octal machines that actually used this style of display? To my understanding, the PDP-11 simply used a light per bit, with printed boxes around them grouping them in groups of three.

    Maybe it was considered easier, therefore, to keep the conversions between binary and octal in your head than hex. It would also indisputably have been easier to convert to text in a program, only needing to OR in the value of the zero character, rather than either adding a different value or doing a memory lookup.
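
    As a quick illustration of the OR-with-zero point (Python here only for brevity – on the machines in question this was a single OR instruction, and the helper names are mine):

      # Any octal (or decimal) digit value turns into its ASCII character
      # just by OR-ing in '0' (0x30), because values 0-9 slot straight
      # into codes 0x30-0x39.
      def octal_digit_to_ascii(v):          # v in 0..7
          return chr(v | 0x30)

      # Hex can't get away with that: values 10-15 need an extra adjustment
      # (or a table lookup) to land on 'A'..'F', since 'A' is 0x41, not 0x3A.
      def hex_digit_to_ascii(v):            # v in 0..15
          return chr(v | 0x30) if v < 10 else chr(v - 10 + ord('A'))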

  6. The 7447-series BCD-to-7-segment decoder, which seems to be the most common commodity decoder today (and has apparently existed at least since 1974), outputs nonsense for A-E and blank for F.

  7. CRTs didn’t disappear, they faded away. :-P

    If I’m not mistaken, the absolute last holdouts for CRTs were Quake players. Something about the circle jumps in that game (presumably due to low lag) made it a necessity for anyone who played seriously.

    Not necessarily hacker specific, but since it’s Quake, definitely in the Venn pool.

  8. > Eight registers, references to both fall naturally in groups of 3 in a single word instruction

    Eight addressing modes too. When I noticed it, I thought it was extremely elegant that immediate operands are just (pc)+ – or, in octal, 27 [37 for memory]

  9. “The trouble with versioning info up top is that I have to effectively duplicate what’s at the head of the change log, and yes they fall out of sync when I do that. Better to have a single point of truth.”

    RCS can’t take care of that?

    1. >RCS can’t take care of that?

      Not when I’m using git, no. :-)

      Anyway, RCS version numbers don’t correspond to the external release number. The former changes much faster.

  10. Not bad on the “AT” command set. One nit: 0100000101010100 isn’t quite right. The time sequence, for an uppercase “AT” at 8 bits with no parity, in a regexp any hacker ought to be able to understand, but with whitespace inserted for clarity, is this:

    1+0 1000 0010 1+0 0010 1010 1+

    I still think you should put a note in about asynchronous start stop data. Most really old-time hackers knew they needed to match modems — they couldn’t use a synchronous modem with their async terminal. The synchronous modems were used for things like connecting an IBM mainframe to a 3270 terminal concentrator. But the connectors were the same, because they all used the RS-232 standard, which is data-stream agnostic.

    The protocols, such as bisync and HDLC/SDLC, used with synchronous modems allowed error correction and more efficient use of the precious PSTN bandwidth. This was such a good idea that even the later iterations of “asynchronous” modems used these under the hood, starting with MNP and V.42. When you sent data using an earlier modem, the actual data you sent was shipped over the wire. But with MNP or V.42, it was packetized and shipped synchronously (without start and stop bits), trading latency for bandwidth, and allowing transparent error correction. With later versions of MNP, and V.42bis and V.44, on-the-fly compression was added as well. (The V.42bis LZW compression was very similar to that used in .gif files.)
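
    To make the bit order in that regexp concrete, here is a minimal Python sketch of 8N1 framing (the function name is mine, and of course a real UART does this in hardware):

      # The line idles at mark (1).  Each byte goes out LSB first, preceded
      # by a start bit (0) and followed by a stop bit (1).
      def frame_8n1(text):
          bits = ['1']                     # idle mark
          for byte in text.encode('ascii'):
              bits.append('0')             # start bit
              bits.extend(str((byte >> i) & 1) for i in range(8))  # LSB first
              bits.append('1')             # stop bit
          return ''.join(bits)

      print(frame_8n1('AT'))
      # -> 1 0 10000010 1 0 00101010 1   (spaces added here for readability)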

    1. >I still think you should put a note in about asynchronous start stop data. Most really old-time hackers knew they needed to match modems — they couldn’t use a synchronous modem with their async terminal.

      I guess I’m not old-time enough. Your mentions of 3270 and SDLC/HDLC suggests they were IBM mainframe tech, which would have put them outside of hacker common knowledge.

  11. The death of the CRT was around the time you mentioned – as with all such things, it took some time, but I recall seeing my first non-CRT screen in the late 90s (when they were rare and expensive), and getting my first flatscreen around 2003-04 (by which point they were the obvious choice for anyone wanting a new monitor). I know I got my first flatscreen late enough in the game that I could dual-monitor my PC with no undue hassles on either hardware or software after an initial reboot – I think I used the VGA port for the CRT and the DVI port for the flatscreen, which implies a GPU new enough to have both (which fits, I bought that computer in 2002, well into DVI’s reign but before HDMI was big).

  12. @esr:
    > I’d like to pin down the year cathode-ray tubes disappeared. I know the leading display vendors ceased production in 2005, but I think the transition might have been as much as two years sooner.

    This sounds correct, but I’m not sure it’s apropos to “things every hacker once knew”. It’s more “things every adult is old enough to remember”.

    In 5 or 10 years there may be enough hackers that have never seen a CRT for it to fit in the article, but even then, the transition happened well after computing became a consumer thing (so the knowledge would be much more widely distributed than hackerdom), the electronic interfaces to LCD and CRT monitors were the same at the time of the transition (VGA/DVI), and the reasons that one would want an LCD vs a CRT can be determined by trying to carry them. I don’t think there’s too much there that is knowledge unique to old-time hackerdom.

    1. >I don’t think there’s too much there that is knowledge unique to old-time hackerdom.

      You raise a fair objection, but…while the fact that the transition happened is something every adult remembers, the date is something I think people are already forgetting. (The last time this happened was when hackers – and everybody else – forgot that color bit-mapped displays hadn’t always been ubiquitous; this happened remarkably soon after they became so in 1992.)

      I may take this out; as you say, it’s borderline. Or I may leave it in to continue the narrative arc that begins with the earliest VDTs in ’69. You know, it’s hard to grasp that the entire lifespan of the CRT-based terminal was only 36 years.

  13. You might note that (very short) serial connections are still commonly used for configuration purposes in the networking world. Generally a laptop is plugged into the console port (RJ-45 form factor, but still serial) of a Cisco or Juniper device (or occasionally another manufacturer’s device) or there’s a serial connection from a phone line with a modem to the RJ-45 console (or Aux) port on the networking device. This second form of connection is rapidly dying.

    1. >You might note that (very short) serial connections are still commonly used for configuration purposes in the networking world.

      I think this falls under “niche applications”.

  14. Was Pr1mos ever very common? It was a minicomputer OS, and someone over on LWN.net was praising its implementation of dynamic libraries as being very snappy.

    Been reading Alan Kay lately, which took me down the mainframe rabbit hole. Now that regular desktops are as powerful as old mainframes, it would be interesting to see an article about what lessons from the mainframe world still remain to be brought forward into our world.

    Burroughs MCP is still being shipped as Unisys Clearpath. Multics is still being sold as Stratus VOS. And from their descriptions, they still seem more advanced than Unix/Linux. Dynamic libraries on MCP sound like they are really thought out and done The Right Way.

  15. About the disappearance of CRTs, it happened at different times in different communities. I think the last holdouts were photographers and designers because early LCD screens had horrible color accuracy and most of the existing color calibration tools couldn’t handle them. The CRTs also had blacker blacks and the colors didn’t shift if you looked at the screen off-angle.

    I glanced at some photo forum archives, though, and by the end of 2003 the consensus (among photographers) already seems to have been that the LCDs are almost there and if you’re willing to spend an unreasonable amount of money, you could get something as good as the (cheaper) CRT. So I would probably say that the photo/design people switched en masse around 2004 and that implies that people who don’t care as deeply about color switched in 2002-2003.

    1. >people who don’t care as deeply about color switched in 2002-2003.

      I myself didn’t switch till late because I had the requirements of a page-layout person at a publishing house – I wasn’t as fussy about color as the photographers, but I needed to be able to view two 8.5×11 typeset pages at actual size.

      For a few years after 2003 you still couldn’t get a flatscreen that capable at non-ruinous prices. I think I decommissioned my last monster CRT in 2006 – I remember I was well behind the curve on this, and it might even have been a year or two later.

  16. The main thing hackers knew about synchronous modems was that they were useless for regular communications of the kind hackers were doing. They’d crop up at hamfests and the like, and it was not uncommon for an unwary hacker to pounce on what he thought was a smoking deal only to discover that he’d brought home a doorstop.

  17. I’ll disagree about the 7-segment bit, too: you can display hex just fine on a 7-segment display. Uppercase A, C, E, and F, and lowercase b and d, are all unambiguous.

    There were computers that used seven-segment hex displays, such as the KIM-1, that were widely used by contemporary hackers.

    Consider this myth busted.

  18. > 1992 Bit-mapped color displays ship on consumer-grade PCs.

    The Amiga and Atari ST shipped in 1985, with support for color RGB CRTs (this obviously skipped the problematic RF modulation step which was needed when hooking them up to a TV). Some home computers may also qualify here, especially if you don’t insist on a fully-general bitmapped display and allow for “character” modes much like what a terminal would generate. For instance, the Amstrad CPC, which was indeed quite popular; and the BBC micro, which launched as early as 1981.

    1. >especially if you don’t insist on a fully-general bitmapped display and allow for “character” modes much like what a terminal would generate.

      I specifically meant to insist on fully bit-mapped and digitally-driven color displays here. I remember the earlier color displays based on repurposing TV hardware and driven through analog modulation; they were horrible kludges unsuited for any serious work.

  19. @Jay:

    >The main thing hackers knew about synchronous modems was that they were useless for regular communications of the kind hackers were doing.

    Exactly my point. Knowledge of which poisonous nuts and berries to avoid.

  20. I think you’re dating bitmapped displays on PCs a bit too late. I bought a genuine IBM PC/AT in 1986 with an EGA display that was, I’m pretty sure, fully bitmapped and color. Yes, most consumers didn’t have them then, but every hacker/hobbyist I knew began getting them about that time.

    The VGA display came out in 1987 with 256 colors and was the true beginning of the modern display era.

    1. >I think you’re dating bitmapped displays on PCs a bit too late. I bought a genuine IBM PC/AT in 1986 with an EGA display that was, I’m pretty sure, fully bitmapped and color.

      Yes. Now, do you remember its resolution, and its character width and height when used in terminal mode? Probably not: if you’re like most people, you have mercifully blotted them from your memory.

      Here’s a hint: though EGA theoretically supported higher resolutions, the monitors generally available at the time maxed out at a 40×25 character display. Thus, EGA displays remained inferior to VDTs if you were doing any kind of programming or word processing. Which, in fact, is why IBM sold monochrome displays – they could form better, more readable text and were actually competitive with VDTs.

      That is what didn’t change until around 1992. Color display technology had to evolve for longer before it could match the dot pitch of a monochrome VDT.

  21. Not new to this version, but just a comment on the wording of this paragraph (bottom of the Hardware Context section):

    Often, however, nostalgia obscures how very underpowered those machines were. For example: a DEC VAX 11-780 minicomputer in the mid-1980s, used for timesharing and often supporting a dozen simultaneous uses, had less than a thousand times the processing power and less that five thousand times as much storage available as a low-end smartphone does in 2017.

    Although it’s obvious what you meant here (which is probably why it didn’t register when I read it the first time), the wording is confusing: “Less than 1000 times the processing power” seems to imply that a VAX 11-780 had “almost 1000 times more processing power” than a smartphone, which is obviously not what you meant. Likewise, having “less tha[n] 5000 times as much storage” as a smartphone seems to imply that a VAX 11-780 had almost 5000 times more storage than a smartphone, which is also obviously not what you meant.

    I suggest changing the wording to something resembling the following:

    Often, however, nostalgia obscures how very underpowered those machines were. For example: a DEC VAX 11-780 minicomputer in the mid-1980s, used for timesharing and often supporting a dozen simultaneous uses [users?], had less than 1/1000 of the processing power and less than 1/5000 of the storage capacity of a low-end smartphone in 2017.

  22. > Here’s a hint: though EGA theoretically supported higher resolutions, the monitors generally available at the time maxed out at a 40×25 character display.

    If you’re suggesting that MDA, CGA, and EGA monitors did not support 80×25 (the former only in text mode; EGA supported graphics modes of the same resolution), this is an extraordinary claim that requires evidence.

    It is true though that IBM monochrome displays did have superior quality to color displays, with a pixel resolution of 720×350 compared to CGA’s 640×200 and EGA’s 640×350. (VGA matched and exceeded it with a text mode resolution of 720×400.)

    There were almost certainly also quality differences not related to resolution, between monochrome displays that did not have shadow masks and subpixels, vs even white text at the same resolution on color displays. That could be what you’re thinking of?

    1. >If you’re suggesting that MDA, CGA, and EGA monitors did not support 80×25

      It’s tricky. The standards supported that resolution; the monitors generally available at the time did not. Their effective dot pitches were too coarse. This was a continuing annoyance for me through the 1980s.

      >There were almost certainly also quality differences not related to resolution, between monochrome displays that did not have shadow masks and subpixels, vs even white text at the same resolution on color displays. That could be what you’re thinking of?

      It’s almost equivalent to what I just said. Displays had to improve to the point where a three-color subpixel array had shrunk to the effective size of the monochrome dot on a VDT before they could actually obsolesce VDTs. Before that, you got all kinds of nasty artifacts and fringing, especially around the edges of text, that were quite eyestrain-inducing.

  23. @esr:
    > You raise a fair objection, but…while the fact that the transition happened is something every adult remembers, the date is something I think people are already forgetting.

    I think part of that is that the transition was much fuzzier than you may be remembering, and different people will have made it at different times or have different definitions for when it occurred.

    Portable computers made the transition very early, because “portable CRT” is an oxymoron. For the desktop transition, engineers for display manufacturers will remember the dates of the transition as “when we started design work to when we shipped the first unit”, salespeople will remember “when LCD monitors became n% of our total sales”, and the masses will tend to remember “when I bought my first LCD monitor to when I put my last CRT out to pasture.”

    For my family, that last one covers the range of approximately 2008-2015.

    1. >I think part of that is that the transition was much fuzzier than you may be remembering, and different people will have made it at different times or have different definitions for when it occurred.

      Which is more or less why I cited the date at which the majors stopped making CRTs. That’s a hard fact, and the clearest possible indication of where people with skin in the game thought CRTs were going.

  24. I think the source of confusion is that there are at least three plausible meanings of the term “resolution” involved here, and I picked the wrong one. Plus a difference between “supported 80 columns at all” – enough to be minimally usable, perhaps to someone not ‘spoiled’ by monochrome displays, vs “supported well enough to replace VDTs”.

    It doesn’t help that the most commonly available monitors in the CGA era were probably televisions or equivalent “composite monitors”, which allowed for some neat tricks to get more colors than CGA otherwise supported via carefully chosen dot patterns, but certainly weren’t any good as text displays.

    1. >I think the source of confusion is that there are at least three plausible meanings of the term “resolution” involved here, and I picked the wrong one.

      And I should have made my original assertion clearer. I’ve revised the document accordingly.

  25. Before terminfo there was termcap, and manually editing termcap entries to describe new terminal devices was not uncommon back then.

    In addition to bold and reverse text, some terminals supported a blinking text attribute, which I suspect was the inspiration for the early HTML <blink> tag (but I don’t have a source).

    Can you provide more information about the transition from the VT-100 style escape sequences to the later adoption of the ANSI-style sequences, which introduced a color palette?

    Should there be a mention of the early file transfer protocols? In particular I remember the progression of popular ones being Kermit, XMODEM, YMODEM, and finally ZMODEM. And before any of those semi-automated protocols there was the uuencode(1)/uudecode(1) pair of commands, which was the predecessor to the Base-64 encoding still used today for embedding binary data in a stream of text characters.
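
    For anyone who never saw it, this is roughly the core of the uuencode transformation, sketched in Python; it ignores the per-line length byte and the space-versus-backquote wrinkle, so take it as the idea rather than a faithful reimplementation:

      # Every 3 bytes become 4 printable characters, each carrying 6 bits
      # offset by 0x20.  Base-64 later refined the same idea with a
      # different 64-character alphabet and proper padding.
      def uu_groups(data: bytes) -> str:
          out = []
          for i in range(0, len(data), 3):
              chunk = data[i:i+3].ljust(3, b'\0')
              n = (chunk[0] << 16) | (chunk[1] << 8) | chunk[2]
              out.extend(chr(((n >> s) & 0x3F) + 0x20) for s in (18, 12, 6, 0))
          return ''.join(out)

      print(uu_groups(b'Cat'))   # -> '0V%T'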

  26. > I specifically meant to insist on fully bit-mapped and digitally-driven color displays here. I remember the earlier color displays based on repurposing TV hardware and driven through analog modulation

    Well, doesn’t ‘RGB monitor’ specifically imply that it’s _not_ driven by TV-like composite input? I’m also not disputing that monochrome monitors used to be a lot better in resolution – there’s a reason they were commonly used in DTP-like workflows. Early models of the Macintosh were obviously monochrome, and the Atari ST shipped with an option for a monochrome monitor. Perhaps 1992 is a good timeframe for ‘monochrome CRTs falling into disuse as color displays became cheaper and reached comparable quality’ – but this seemingly doesn’t have much to do with the end of dedicated terminals! I suspect that organizational inertia was a factor in their continued survival when they could’ve been replaced by terminal emulation software, running on cheap (“home”) micros. (For instance, the reasonably-cheap Commodore 128 had support for running CP/M, with 80×25-character output on a RGB monitor, independently of the main composite (‘TV’) screen. Surely other micros would’ve had broadly similar capabilities.)

  27. I know ADSL still lags, but the ramp-up to fast SDSL and optical happened pretty fast after ’97 – I know I got FIOS with 15mbps down/5mbps up in 2002, which was already an order of magnitude better than 1mbps.

    I think this is more regionally variable than you suspect. While you probably have an accurate timeframe for urban hackers, rollout to other parts of the country was substantially later. DSL service areas only reached my home about a decade ago (the gee-whiz feature wasn’t actually the new speeds, but the fact that being online no longer blocked incoming phone calls; away from urban areas many residences aren’t covered by cell networks and still rely on landlines).

  28. > It’s tricky. The standards supported that resolution; the monitors generally available at the time did not. Their effective dot pitches were too coarse. This was a continuing annoyance for me through the 1980s.

    I remember my Amstrad CPC 464 having a perfectly readable 80 x 25 mode, especially when green or gray on black text was used.

    But I’m also remembering how it looked a) relative to other micros that used TV screens, and b) when seen through the eyes of an eight year old.

    I’ll drag mine out of storage this weekend and see if it’s as usable as I remember.

  29. And now that I think about it, I’d given myself CRT-induced eyestrain at the age of ten, an oddity back in late 80s New Zealand. ESR may have a point ;)

    1. >And now that I think about it, I’d given myself CRT-induced eyestrain at the age of ten, an oddity back in late 80s New Zealand. ESR may have a point ;)

      Yeah, see? People forget these things. I’ve noticed a chronic tendency for people to back-project the performance of later, better hardware onto the older stuff and then react with initial disbelief when reminded of their own experience.

      A parallel kind of forgetting is hackers old enough to remember pre-Linux days nevertheless spacing how much of our work used to be done on proprietary systems using closed-source software. I’ve seen that a lot, too.

  30. On the subject of serial cables, I ran into something new (at least for me) today. This was a cellular modem which supported a USB cable. The USB cable was attached to a box with four 9-pin serial connectors, each of which fed a Cisco console cable. So does the modem host some kind of terminal emulation software, or is the USB cable given an IP Address?

  31. >> You might note that (very short) serial connections are still commonly used for configuration purposes in the networking world.

    > I think this falls under “niche applications”.

    In nine years of doing my job, every Cisco or Juniper device I’ve ever unpacked came with a serial cable, and this is still the primary way of configuring your router or switch, to the point that something like 95% of all the routers/switches I’ve installed were configured by this method. At the very least, each professional grade router/switch needs, by serial cable, a startup configuration which tells it where to find the TFTP server which hosts its real configuration.

    So definitely not niche, though possibly not a “hacker” thing.
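
    For the curious, a minimal sketch of poking at such a console port from Python using the third-party pyserial package; the device path and the 9600 8N1 settings are assumptions (merely the common defaults on this kind of gear), not gospel:

      import serial   # pyserial

      # Open the USB-to-serial adapter the console cable is plugged into.
      with serial.Serial('/dev/ttyUSB0', 9600, timeout=2) as console:
          console.write(b'\r\n')             # nudge the console prompt
          banner = console.read(256)         # grab whatever the device prints
          print(banner.decode('ascii', errors='replace'))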

  32. So does the modem host some kind of terminal emulation software, or is the USB cable given an IP Address?

    Most likely, the box with the cell modem is a terminal server (providing telnet/SSH access to those serial ports on TCP ports), and the USB device is a compatible multiport serial adapter.

  33. Serial TTL has been around since 1977; the wonderful SC/MP chip had it. There is an 8-bit register (“E”) that you could load data into and it would clock it out; likewise it would clock data into it. If you dig back into your pile of Dr. Dobb’s there were computers that used this function. It was the perfect way to connect SC/MP systems to the TV Typewriter. (And as a side note, the SC/MP was one of the first microchips that supported multiple processors.)

    I also remember Kermit fondly, it was one way to connect to the SUNY systems “big iron” from a microcomputer. Lots of other University systems offered Kermit servers at the time, so it wasn’t just Columbia and SUNY.

  34. I think the most obvious reason why octal was popular is that the symbols it uses are a subset of those used in decimal, so no one had to learn any new symbols to work with it, and they were already encoded in consecutive locations in the ASCII character set. (In retrospect, having uppercase A follow 9 would have made bases through 36 far easier to handle.) The non-trivial work necessary to support the disjoint symbols was eventually overcome by the nice “two hex digits per byte” feature, as memory prices fell and the extra complexity of handling hex came to be seen as well worth the effort.

    As Tom Lehrer famously observed “Base Eight is just like Base Ten, really. . . If you’re missing two fingers.”

  35. (In retrospect, having uppercase A follow 9 would have made bases through 36 far easier to handle.)

    Perhaps, but you can’t fit both into a 32-sized row, and there’s only room in ASCII for two 64-sized rows. You couldn’t have all those nice bit structure properties we just spent a week talking about. In exchange for a row of 36 consisting of uppercase letters, you’d have to give up either lowercase being one bit off of uppercase, or control characters being a contiguous block easily mapped to (mostly) letters by bitmasking. Probably the best you could do is map ctrl-0..9A..F to one row of control characters, and ctrl-G..V to the other row, leaving ctrl-WXYZ unused. (In exchange you’d also gain shift *always* setting bit 32 instead of sometimes setting bit 16 – but the control rows wouldn’t be contiguous, so it would require a bit more logic. The 6-bit subsetting also wouldn’t be as nice, either.)
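
    As a concrete reminder of the properties being traded away, here is how the bits actually line up in ASCII as adopted (trivial Python, purely for illustration):

      # Case differs by a single bit (0x20), and a control character is its
      # letter with the two high bits stripped off.
      print(hex(ord('A')), hex(ord('a')))    # 0x41 0x61 -- one bit apart
      print(ord('a') ^ 0x20 == ord('A'))     # True: case is a bit-flip
      print(ord('A') & 0x1f)                 # 1, i.e. Ctrl-A (SOH)
      print(ord('[') & 0x1f)                 # 27, i.e. Ctrl-[ (ESC)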

  36. I forgot to mention the alternative of keeping the character set mostly as-is (perhaps with A as 64 instead of 65), but placing 0-9 at the end of the preceding row (codes 54 to 63). This loses the ability to convert to decimal digits with a bitwise-or instead of an add, which I’m not sure if it was historically important or not, but both ASCII and EBCDIC had.

  37. Indeed, there would have been some tradeoffs. But that just indicates those points were considered more important than making symbols higher-base numbers contiguous. If it were really important, we could have gone with [0..9A..Za..z] occupying 62 out of a block of 64 consecutive code points, trading “shift is bit-flip” for “shift is subtract 26” and if the other two code points were chosen well, we’d have gotten Base64 encoding just by adding 32 to the six bits to be encoded.

    1. >trading “shift is bit-flip” for “shift is subtract 26”

      Crash landing. Bit-flip was an achievable thing, if just barely, when your keyboard was an ASR-33-like maze of mechanical actuators connected to keycaps. Subtract 26 would have been utterly impractical.

  38. Duncan and Eric mentioned CRT-induced eyestrain. I use a glass filter with my CRT. It’s not possible to hang a glass filter on an LCD/LED monitor, is it? :-P

  39. > Bit-flip was an achievable thing, if just barely, when your keyboard was an ASR-33-like maze of mechanical actuators connected to keycaps.

    ASR-33 didn’t even have lower-case letters, so it’s a completely different sort of “shift” from the “32” bit-flip we associate with alphas. I think it had a proprietary code scheme anyway, because back when it was created, there weren’t really any standards.

    Flipping the “16” bit worked for <, >. [{ ]} \| 1! 2@ 3# 4$ 5%, but not 6-0 or :; ‘” so clearly by the time ASCII was codified, things had moved past some kind of simple scan code directly representing the key combo. I always assumed such things were implemented as a simple lookup table in the terminal’s ROM by then.

    1. > always assumed such things were implemented as a simple lookup table in the terminal’s ROM by then.

      Oh ho ho ho. ROM? What’s that?

      There weren’t no steenking ROM in the ASR-33. It was electromechanical logic all the way down, baby!

  40. @The Monster: The ASR-33 was one of the first ASCII terminals – previous Teletypes having used a Baudot-style code (a variant of ITA-2). And anyway, you can easily google pictures of the keyboard – shift-1..9 are !”#$%&'() and shift-KLMNOP are [\]^_@, perfectly matching a flip of the fifth bit of the character value in ASCII.

    Incidentally, if you google an ASR-32, and google the Baudot code (actually a variant of ITA-2), you may notice that this code was also designed to make keyboards easy to make: QWERTYUIOP in the letter set map to 1234567890 in the figure set.

  41. Also, you’re making some unwarranted assumptions if you think :; belonged together on the keyboards being discussed (you’re also making mistakes about the existence of some keys, and some characters, on the ASR-33 in particular). If you haven’t, you should definitely read http://www.quadibloc.com/comp/kybint.htm

  42. I did manage to find an ASR-33 manual at http://www.soemtron.org/teletypemanuals.html – the shift key inverting the fifth bit was absolute – there were no keys for which it did anything else. It also mechanically prevented pressing any keys other than 1-9 and KLMNOP. The control key locked out all keys except the letter keys. The manual specifically mentions the ability to generate ESC by Shift-Ctrl-K, etc.

  43. Er, and the other symbol keys ,< .> /? :* ;+ -= of course – I meant it locked out all the other letters and zero (one might imagine shift-zero producing a space, which was why I looked for the manual in the first place, but evidently not.)

  44. @ESR @Random832
    >The ASR-33 was one of the first ASCII terminals

    I was misremembering it as having been born just before ASCII was adopted, perhaps influencing some ASCII design decisions and/or incorporating some emerging consensus about what ASCII should look like. I missed out on actually using them, but I’m sure I saw a few of them. I distinctly remember the paper tape thing on some terminals, but it’s possible that was some other model, which might explain fuzzy memory on key placement.

    We still had card-punch machines at school until about 1980, when we finally got green-screen terminals (and an APL/BASIC switchable [before boot] micro for the computer lab).

  45. The ASR-33 did have paper tape (that’s actually the reason for “ASR” – the non-paper-tape model was called “KSR”), but it was ASCII paper tape.

    It looks like the 33 evolved alongside ASCII – earlier versions of the manual have code charts and keyboard diagrams showing ASCII63, and Shift-NO generate arrows instead of ^_. The 1963 manual doesn’t mention ASCII by name (but shows a full code chart that matches ASCII63 exactly), the 1964 manual does. Incidentally, another thing I noticed on this read-through – the keyboard in the versions described in the 1964 manual onwards actually physically encoded even parity – and the ctrl and shift key both inverted the parity bit in addition to their other operation. (1963 manual version had mark parity)

    http://www.rtty.com/CODECARD/codecrd1.htm (which I looked up to show some examples of ASCII paper tape) also indicates another reason for the structure of ASCII – uppercase-only terminals needed to be able to convert incoming lowercase codes to uppercase. (Incidentally, the typewheel coding card mentions “ASCII66”, which doesn’t seem to have been a year that a revision to X3.4 was actually published)
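
    A toy illustration of why Ctrl and Shift had to touch the parity bit as well (Python here only for convenience – the Teletype did this with contacts and levers): with even parity the eighth bit is chosen so the count of 1s is even, and flipping any single data bit, which is all Shift or Ctrl did, flips that count.

      def even_parity_bit(code7):
          # 1 if the 7 data bits contain an odd number of 1s, else 0
          return bin(code7 & 0x7f).count('1') & 1

      for ch in ('K', chr(ord('K') & 0x1f)):     # 'K' and Ctrl-K (VT)
          print(f'{ord(ch):07b}  parity bit = {even_parity_bit(ord(ch))}')
      # -> 1001011  parity bit = 0
      #    0001011  parity bit = 1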

    1. >It looks like the 33 evolved alongside ASCII

      That is correct. If you look at the list of early papers on ASCII at Bob Bemer’s website, you’ll find strong indications that the 33 was prototyped simultaneously with ASCII in 1961-63. The head of the X3.4 committee was a Teletype employee, and it would thus be quite surprising (given the timing) if the evolving ASCII spec and the design process of the 33 didn’t influence each other.

  46. >33 was prototyped simultaneously with ASCII in 1961-63

    Then I wasn’t misremembering this much: When the `33 was designed, ASCII didn’t actually exist yet, but it definitely had its finger on the pulse of that “emerging consensus”.

    1. >Then I wasn’t misremembering this much: When the `33 was designed, ASCII didn’t actually exist yet, but it definitely had its finger on the pulse of that “emerging consensus”.

      The people involved are mostly dead now, but I’d bet that if we could ask we’d find there was conscious coordination between X3.4 and the 33’s designers. All the incentives were virtuously aligned for people to play nice: X3.4 stood to benefit from having a demonstration device by a major manufacturer, and Teletype stood to benefit from being able to say the 33 conformed to an ANSI standard.

  47. A data point on octal. I first encountered it on the GE 615/625/635 36-bit mainframe computer which came out in the mid-60’s, and then on (IIRC) the DEC-10. I vaguely remember that some other mainframes of the day also used octal. On the 635, half-words were 9 bits, and instruction codes were 9 bits and short addresses 18. Octal is far friendlier to those than Hex.

    The GE 635 was the basis for the GE 645, the MULTICS machine. Perhaps that lineage led to the use of octal in Unix, or perhaps it was DEC.
