Things Every Hacker Once Knew: 1.11

The newest version of Things Every Hacker Once Knew is only a minor update.

There’s material on SIGHUP; six-bit characters on 36-bit machines; a correction that XMODEM required 8 bits; and why screensavers are called that.

New submissions are ramping down; I don’t expect to need to issue another update of this for some time.

88 comments

  1. Er, I searched the page for “saver” and found nothing. Also, the latest version listed is 1.0 and the footer reads: “Last updated 2017-02-20 13:52:09 EST”.

    1. >Er, I searched the page for “saver” and found nothing. Also, the latest version listed is 1.0 and the footer reads: “Last updated 2017-02-20 13:52:09 EST”.

      Try a shift-refresh. I can see it.

  2. Ah, but the change history still doesn’t mention the 1.11 revision. Dunno if that matters or not, but I’m letting you know just in case. :o)

    1. >Ah, but the change history still doesn’t mention the 1.11 revision. Dunno if that matters or not, but I’m letting you know just in case. :o)

      Something’s weird. I see

      1.11: 2017-03-02
      SIGHUP. Six-bit characters on 36-bit machines. XMODEM required 8 bits. Screensavers.

  3. It’s weird indeed. I keep shift-refreshing, but I still don’t see that.

    I’m afraid we have entered… the Twilight Zone!

  4. Re: SIXBIT

    Contra your statement that SIXBIT was completely different from ASCII, my memory is almost the reverse. If you subtracted the ASCII code for a space from the code for a given character and masked off the high bits, you got the SIXBIT code.

    RADIX-50, which was the equivalent on the PDP-11 series, was a different matter.

    1. >Contra your statement that SIXBIT was completely different from ASCII, my memory is almost the reverse. If you subtracted the ASCII code for a space from the code for a given character and masked off the high bits, you got the SIXBIT code.

      I never mention SIXBIT. What sentence are you referring to?

  5. @esr –

    Not sure where in the text you talk about XMODEM requiring 8 bits. (I searched the entire doc for all of “8 bit”, “8-bit”, and “XMODEM”, and only saw the two together in the changelog.)

    I must be missing something. :-(

    1. >I must be missing something. :-(

      Previous version asserted incorrectly that XMODEM could pass 8-bit data over a link with parity – I misunderstood something I had read. That assertion has been removed.

  6. Never mind, it was my mistake. I expected to find the latest revision at the top of the list, possibly due to having gotten used to the order in which blog posts are displayed.

    Looks like I need a digital detox. v_v

  7. Wandered over to Wikipedia to confirm my memory and got confirmation re SIXBIT. I did get RADIX-50 slightly wrong; it was used on the DEC-10 as well, for symbol definitions in object files.

  8. Looks like there were three basic categories of six-bit codes: ASCII subsets (including SIXBIT and the code used for magnetic stripe cards), BCD derivatives (the midpoint in the evolution from punch cards to EBCDIC, with haphazard assignment of characters other than letters and digits), and unique character sets like FIELDATA and CDC.

  9. @esr Well, you don’t mention it directly. You do call out the PDP-10 (which would have used SIXBIT) as an example of a machine using a 6-bit character set for filenames. And in any case, “none of them even remotely similar to ASCII” is overstating the matter just a little bit. Some of them were similar to ASCII.
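
    [Editorial aside: the ASCII relationship described above is a one-liner. A minimal sketch in C of the DEC-style SIXBIT mapping, assuming the printable range from space (040) through underscore (0137); the function names are illustrative, not from any DEC source.]

    #include <stdio.h>

    /* DEC SIXBIT sketch: printable ASCII from ' ' (040) through '_' (0137)
       maps onto 0..077 by subtracting the code for space. Input must be
       uppercased first, since SIXBIT has no lowercase. */
    static unsigned ascii_to_sixbit(unsigned char c) {
        return (unsigned)(c - ' ') & 077;
    }

    static unsigned char sixbit_to_ascii(unsigned s) {
        return (unsigned char)((s & 077) + ' ');
    }

    int main(void) {
        for (const char *p = "HELLO_WORLD 123"; *p; p++) {
            unsigned s = ascii_to_sixbit((unsigned char)*p);
            printf("%c -> %02o -> %c\n", *p, s, sixbit_to_ascii(s));
        }
        return 0;
    }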

  10. In the sentence “During this transition period there were some odd hybrid mail addresses that used a “%” to weld bang-path roting to Internet routing.”, I believe “roting” is supposed to read “routing”.

  11. To my understanding, % was actually used to weld other forms of addressing, some of which used @ natively, like BITNET and CSNET.

    1. >To my understanding, % was actually used to weld other forms of addressing, some of which used @ natively, like BITNET and CSNET.

      I may remove that sentence entirely. I’m not sure it’s possible to say anything true and useful about these mixed addresses without going on at tedious length.
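
      [Editorial aside: the mechanical part of the weld is easy to show even if the semantics of the era were a mess. A toy sketch, assuming the simplified case where the leftmost hop is an Internet-reachable gateway; bang_to_percent is an invented name, not any historical tool's, and real gateways were far messier.]

      #include <stdio.h>
      #include <string.h>

      /* Toy rewrite: "gateway!mid!user" -> "user%mid@gateway".
         Assumes the leftmost hop is an Internet-reachable gateway. */
      static void bang_to_percent(const char *bang, char *out, size_t outlen) {
          char buf[256], *hops[32];
          int n = 0;

          snprintf(buf, sizeof buf, "%s", bang);
          for (char *tok = strtok(buf, "!"); tok && n < 32; tok = strtok(NULL, "!"))
              hops[n++] = tok;
          if (n < 2) { snprintf(out, outlen, "%s", bang); return; }

          /* User (rightmost) first, then intermediate hops joined by '%',
             then '@' and the gateway (leftmost). Note the reversal: UUCP
             listed the top-most scope first, DNS mail lists it last. */
          size_t used = (size_t)snprintf(out, outlen, "%s", hops[n - 1]);
          for (int i = n - 2; i >= 1 && used < outlen; i--)
              used += (size_t)snprintf(out + used, outlen - used, "%%%s", hops[i]);
          if (used < outlen)
              snprintf(out + used, outlen - used, "@%s", hops[0]);
      }

      int main(void) {
          char out[256];
          bang_to_percent("bigsite!foovax!user", out, sizeof out);
          puts(out);  /* user%foovax@bigsite */
          return 0;
      }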

  12. There were many different 6-bit character encodings, none of them even remotely similar to ASCII and often differing even across a single manufacturer’s machines

    It’s in the section on 36-bit machines and references the DEC-10. It doesn’t use the word SIXBIT explicitly.

  13. I’m glad you mention six bit character sets and 36 bits. Even today, Chuck Moore uses 6 bit character sets in his personal versions of FORTH, and he keeps using 18 bits (half of 36 bits) for his stack-based chips (like the F18). This history makes more sense of that; his explanation that “18 bits is more efficient” didn’t really make sense to us.

    1. >Even today, Chuck Moore uses 6 bit character sets in his personal versions of FORTH

      Interesting. Which 6 bit character sets?

  14. 6-bit characters and 18-bit words for Chuck Moore’s F18 (and newer) chips make sense of another FORTH feature: the first 3 characters are a sufficient amount of information to look up a word in the dictionary. FoxPro/Clipper started as 16-bit, but maybe this explains why 4 characters is enough for a dictionary lookup of the built-in symbols and functions.
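
    [Editorial aside: to make that concrete, here is a toy sketch of the count-plus-first-three-characters key traditional FORTHs used for dictionary lookup. With a 6-bit character code, three characters fit exactly in one 18-bit word; the packing and the name forth_key are illustrative assumptions, not Moore’s actual code.]

    #include <stdio.h>
    #include <string.h>
    #include <stdint.h>

    /* Pack the length plus the first three characters of a word's name,
       SIXBIT-style (char - 32). Three 6-bit chars fill one 18-bit word;
       the count is what distinguishes DUP from DUPLICATE. */
    static uint32_t forth_key(const char *name) {
        size_t len = strlen(name);
        uint32_t chars = 0;
        for (size_t i = 0; i < 3; i++) {
            unsigned char c = (i < len) ? (unsigned char)name[i] : ' ';
            chars = (chars << 6) | ((unsigned)(c - ' ') & 077);
        }
        return ((uint32_t)(len & 077) << 18) | chars;
    }

    int main(void) {
        printf("DUP       -> %08o\n", (unsigned)forth_key("DUP"));
        printf("DUPLICATE -> %08o\n", (unsigned)forth_key("DUPLICATE"));
        return 0;
    }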

  15. And yes, Clipper is still actively used and developed in the Open Source “Harbour” variant.

  16. Did the pace of commodity PC innovation really drop in 2001? That’s before the 64-bit revolution, the multicore revolution, or even the end of the gigahertz race between AMD and Intel. Dynamic frequency support wasn’t introduced until 2002ish, nor was virtualization, though I guess those may not be “visible” innovations.

    I would put the end of visible innovation closer to 2009 or so, when every machine is a quadcore running around 3 GHz with a graphics card capable of pushing a 1080p HD signal. Even the slowdown didn’t really occur until closer to 2006 and the end of the gigahertz race.

    1. >Did the pace of commodity PC innovation really drop in 2001?

      Yes, but it wasn’t the last fall in velocity. You’re certainly not wrong about 2009.

      I had an inside view of this because I was still on the board of VA Linux into 2001. Though the obvious, proximate cause of the collapse in our hardware business was the dot-com bust, the underlying problem was that building Sun-class workstations out of PC hardware had gotten too easy. Our in-house engineering talent was having trouble staying ahead of what you could do with completely generic hardware – not because our engineers weren’t good, but because a lot of other people’s engineers further up the supply chain were good too.

      Whenever that kind of commodification happens, one consequence is that NRE for more special sauce gets harder to justify – your minimax shifts to raising volume and cutting production costs. Innovation slows down. All maturing industries look like this, and PCs are not an exception.

  17. @esr –

    > I’m not sure it’s possible to say anything true and useful about these mixed addresses without going on at tedious length.

    You could summarize by mentioning that it was so complicated that someone wrote a whole book about it.

    1. >You could summarize by mentioning that it was so complicated that someone wrote a whole book about it

      Yeah. I used to have a copy of that. Might still, somewhere.

  18. Yes, the ‘%’ in email addresses was widely used, and cursed, primarily during the transition to SMTP when there were mixed environments. Something you don’t mention is that the UUCP naming hierarchy was backwards (left-to-right) from today’s DNS scheme (right-to-left); so that the top-most scope in UUCP was listed first, whereas the top-most scope (domain) in DNS is last.

    I’m not sure if this fits into the scope or not, but what about some of the early widely-used or popular Internet services from before the Internet became popular?

    * The Internet Oracle. Still running, this is one of the oldest Internet humor services, and was very popular at the time; it also existed as a Usenet group (rec.humor.oracle) back when Usenet thrived.

    * The Open Directory Project, aka DMOZ. One of the best-known attempts at an open, collaboratively curated index of the Internet; it is finally shutting down this month.

    * Archie. This used to be the definitive way to find software, listing FTP sites, in the pre-search-engine days. Speaking of which, FTP itself was something that was universally known and used, and though it still exists today it is mostly forgotten.

    * Gopher. A widely used precursor to search engines.

    1. >Speaking of which, FTP itself was something that was universally known and used, and though it still exists today it is mostly forgotten.

      Yeah, I probably need to add something about FTP. Archie and Gopher and DMOZ weren’t universal common knowledge yet by the time the Web obsolesced them; FTP, though, certainly was.

  19. Also don’t forget about the Unix fortune program. This along with The Internet Oracle were both widely appreciated sources of humor in the hacker community … and maybe they can fit into the Games section.

  20. esr:

    I used to have a copy of [a certain book]. Might still, somewhere.

    That you’ve lost track of your book collection indicates it’s huge, which doesn’t surprise me in the least. Can you make an estimate of how many you’ve got?

    Deron Meranda:

    …[DMOZ] is finally shutting down this month.

    That’s sad news; I used it a few times. Thanks to it, I found this humorous comparison between Friedrich Hayek and Salma Hayek. XD

    1. >That you’ve lost track of your book collection indicates it’s huge, which doesn’t surprise me in the least. Can you make an estimate of how many you’ve got?

      Probably somewhere around 300 in my office alone, and another 1500 or so downstairs.

  21. @esr:
    > Yeah, I probably need to add something about FTP.

    I’m not sure FTP is yet obsolete enough to be out of currency with younger hackers. All of the Debian primary mirrors seem to be FTP, and I run across FTP sites reasonably often. They seem to remain popular in academia.

    Mostly what’s obsolete is people accessing FTP sites interactively through the command line. Use cases like the Debian repositories, where a system component accesses an FTP site automatically, are quite common from what I can see, and it is fairly common for websites that have files available for download to have an FTP download link for each file, or even for a web page to link to an FTP directory and for the user to browse the FTP directory in-browser.

    Where FTP really is obsolete is any case where authentication is needed, in which case solutions like SFTP are used; but anonymous, public-access FTP seems to still be going strong from what I can see.

    1. >I’m not sure FTP is yet obsolete enough to be out of currency with younger hackers.

      The protocol isn’t. The CLI tool almost certainly is.

  22. Another place where octal suddenly makes sense: the original Burroughs B5500 was a 48-bit architecture, and I’m sure there were others. 48 bits gives you 8 characters (6-bit characters)… could this be connected somehow to MS-DOS having 8-character filenames?

  23. And the relevance of the B5500 is that it was/is a big inspiration for Alan Kay and his object oriented vision of computing.

  24. >The protocol isn’t. The CLI tool almost certainly is.

    I wonder…

    For a stricter definition of hacker (having been given the title by an existing member of hacker culture), and given the importance of Unix/Linux to hacker culture, I find it hard to believe that even younger hackers would be unfamiliar with the CLI tool, though I would expect that few of them use it. For looser definitions of hacker (people with the mindset, skillset, and volume and quality of code written to earn the title, but no connection to hacker culture to have had the title bestowed), I can construct a few corner cases where young, probably larval, hackers with prior exposure only to the graphical side of Windows would be unfamiliar with the CLI tool.

    1. >I find it hard to believe that even younger hackers would be unfamiliar with the CLI tool

      I don’t. I can’t remember when I last used it. Probably at least five years ago at this point.

      Strictly speaking I haven’t used ftp(1) itself in much longer than that; my FTP tool of choice for many years was ncftp, which is way better – lots of convenience and discovery features that could be in ftp(1) but aren’t. My actual belief is that the entire category of FTP-only clients is obsolete and has been for long enough to pass out of common knowledge.

      The old use cases for ftp have been eroded from two directions. The read side and the unauthenticated-write side have been obsolesced by HTML download and upload links – wget(1) is the new ftp(1). The authenticated-write case has been largely clobbered by scp(1). What’s left?

  25. > Another place where octal suddenly makes sense; the original Burroughs B5500 was a 48 bit architecture, and I’m sure there were others. 48 bits gives you 8 characters (6 bit characters)… could this be connected somehow to MS-DOS having 8 character filenames?

    MS-DOS gets its 8.3 limit from CP/M (pre-DOS versions of FAT had a 6.3 limit). The CP/M filesystem used 8-bit bytes for the filename, with the high bit being used for file attribute bits. It looks like eleven bytes was just the amount of space left in the directory entry, which consisted of 16 bytes of block numbers and five bytes of other information.
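
    [Editorial aside: the arithmetic is easy to check against a sketch of the directory entry itself. This is the CP/M 2.2 layout as generally documented; the field names are invented, and exactly which high bits carried which attribute flags varied.]

    #include <assert.h>

    /* CP/M 2.2 directory entry: 32 bytes. The 8+3 name is stored as two
       space-padded fields with no dot byte; high bits of the name/ext
       bytes carried attribute flags (read-only, system, archive). */
    struct cpm_dirent {
        unsigned char user;        /* user number; 0xE5 marks a deleted entry */
        unsigned char name[8];     /* filename, space-padded */
        unsigned char ext[3];      /* extension, space-padded */
        unsigned char extent;      /* extent number (low bits) */
        unsigned char s1, s2;      /* reserved / extent high bits */
        unsigned char rc;          /* 128-byte records used in this extent */
        unsigned char blocks[16];  /* allocation block numbers */
    };

    int main(void) {
        /* 11 name bytes + 16 block bytes + 5 bytes of everything else */
        assert(sizeof(struct cpm_dirent) == 32);
        return 0;
    }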

  26. I regularly use FTP to upload to web space. From habit I use the CLI, which is still part of the default install of Windows.

    Paul

  27. The good old days, when in order to find software for my new Linux system, I had to hit up sunsite.unc.edu via ftp.

    The same if I wanted to download Doom maps.

    FTP, the protocol, has been obsolete for some years. The last I remember using it, it was to do local-subnet uploads of data to autonomous underwater robots. The drive-a-truck-through-wide security holes in FTP doomed it for serious web or file-ingest use sometime within the last ten years, if not before. All of the major browsers have removed support for it except, apparently, Firefox; and I think they have a ticket open to remove it.

  28. potential topics:

    How about the date when Knoppix first shipped? It was the first (successful) “Live” Linux distro on a CD-ROM and was a landmark date for me, as it was the first (real) Linux system I ever got working on my crappy W95 workstation. (Previous attempts to load Red Hat on obsolete 386/486 hardware with a series of floppies always failed, and the only other thing that worked prior to that was “tomsrtbt” and heck if I could figure that out by myself.)

    (Most of the rest of the stuff is probably too obscure.)

    There’s also the date when most BIOSes could boot from an ordinary USB mass storage device that wasn’t emulating a CD-ROM. I think this was sometime in 2004 or so; I had it written down because before that I needed to pop the case open to boot with my tools (assuming the machine didn’t have a PATA CD-ROM drive).

    PAE in the CPU is still rather recent to be included, but the kids will be scratching their heads in about 5 more years. (I still don’t understand why this can’t be sensed and worked around rather than requiring a non-PAE kernel.)

    Zip disks? Jaz drives? “Click o’ Death”, “Blue Screen of Death”, “three-finger salute”, or the finger protocol? Bidirectional parallel ports? Proprietary hardware dongles that sat on the parallel printer ports and provided a printer pass-through?

    1. >(Most of the rest of the stuff is probably too obscure.)

      I think we are fast reaching the point of diminishing returns.

  29. Bidirectional parallel ports?

    By “bidirectional parallel ports”, do you mean the parallel ports that were on many non-x86 systems, the parallel ports on the earliest batch of IBM PCs before IBM switched to “unidirectional” ones, running the “unidirectional” PC/XT/AT and clone parallel ports with the “nibble mode” software hack, most PS/2s’ “Type 1” parallel ports, late PS/2s’ “Type 3” parallel ports, the Intel-Xircom-Zenith EPP, the Microsoft-HP ECP, or IEEE-1284?

    Really, I’m not sure there was ever “common knowledge” about parallel ports beyond that you could (with varying degrees of success) attach things other than printers to them, and I’m not aware of any significant modern legacy.
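
    [Editorial aside: the “nibble mode” hack itself is simple enough to sketch. A minimal example assuming x86 Linux with ioperm(2) and the conventional LPT1 base of 0x378; the exact status-bit-to-data-bit mapping varied by convention, so treat the shift below as illustrative.]

    #include <stdio.h>
    #include <sys/io.h>   /* ioperm(), inb(); x86 Linux, needs root */

    #define LPT_BASE   0x378            /* conventional LPT1 I/O base */
    #define LPT_STATUS (LPT_BASE + 1)

    int main(void) {
        if (ioperm(LPT_BASE, 3, 1) != 0) {
            perror("ioperm");
            return 1;
        }
        /* Status bits 3-6 (nError, Select, PaperEnd, nAck) are peripheral-
           driven inputs even on a "unidirectional" port; nibble-mode
           peripherals clocked a byte through them four bits at a time,
           with BUSY (bit 7) as the handshake. */
        unsigned char status = inb(LPT_STATUS);
        unsigned char nibble = (unsigned char)((status >> 3) & 0x0F);
        printf("status=0x%02x nibble=0x%x\n", status, nibble);
        return 0;
    }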

  30. > Yeah, I probably need to add something about FTP. Archie and Gopher and DMOZ
    > weren’t universal common knowledge yet by the time the Web obsolesced them;

    Archie and Gopher would have been known about by hackers.

    I knew about them, and if I knew about them then any Unix hacker would know.

  31. RE: FTP

    > >I find it hard to believe that even younger hackers would be unfamiliar with the CLI tool

    > I don’t. I can’t remember when I last used it. Probably at least five years ago at this point.

    Last time I used it was about 20 months ago to download firmware images from SuperMicro. It was a real PITA because the goons I was working for blocked FTP at the firewall and SuperMicro didn’t have a mechanism for putting it on an authenticated HTTP server.

    In fact, probably the biggest use case for the CLI FTP client (and like Mr. Raymond I preferred ncftp, but when in Rome one does the needful) was dealing with cut-rate hardware manufacturers.

  32. Found a spelling error:

    “In this century uuencoding has larely been replaced by MIME64”
    Should be
    “In this century uuencoding has largely been replaced by MIME64”

  33. Again, let me plug the date Knoppix came out. It was revolutionary because it largely self-configured on boot and because you could reboot and remove the disk and then be back to your MS Win world.

    There was a flood of users to Debian forums after many people tried Knoppix; so much so that they had to post notice that “Knoppix was not Debian” in a very “endless summer” kind of way. To compare it to the following historic quote and slogan, I think, isn’t too much over the top.

    >”Abe Lincoln may have freed all men, but Sam Colt made them equal”

    Steven>By “bidirectional parallel ports”, do you mean the parallel ports that were on many non-x86 systems, the parallel ports on the earliest batch of IBM PCs before IBM switched to “unidirectional” ones…

    I remember around the mid-90s there was a new option in many BIOS settings for parallel ports. The ports always had some bidirectional pins, but were mainly used to pump 8-bit words out to the dot-matrix printer. When you could do a function call to read the state of the pins as well as write to them, the ports started to be used for things like Jaz drives. I think the available bandwidth was sped up at the same time.

    The enhanced port functionality may have also been used with the software copy-protection dongles; these were quite popular before everything came with an ethernet port and at least one MAC address (which is another frequent scheme to try to enforce copy protection).

    The BIOS settings allowed you to cripple the ports for backwards compatibility.

    Back in those days, the port was used by hardware hackers in much the same way the GPIO pins on microcontrollers are used today.

    Since ESR admitted a few years ago on the “SBC to NTP server” post that his solder-fu was weak, he’s probably biased toward the software side. (That’s not intended as an insult. At the time I personally only had a weak grasp of DOS and Basic.)

  34. Nth-ing the fact that when most people say “FTP” that they don’t mean “CLI FTP”.

    I actually used the CLI a week ago because I had to solve someone else’s problem (and of course I told them how unsecured FTP was and how it was a popular vector for website defacement.) I get the feeling that the only way to let FTP die is to create a completely different protocol and default GUI and call it “Super-FTP” or something and hope it gets as widely adopted.

    I used it because the FTP GUI has been removed from Firefox for quite some time, IIRC. It took me a little bit to realize that the file manager on my box also allowed remote shares via the FTP protocol.

  35. > There’s also the date that most BIOS could boot from an ordinary USB mass storage device that wasn’t emulating a CDROM.

    It’s nice, but I think it overlapped enough with CD burners being ubiquitous that there wasn’t really a hard transition for most people.

  36. I remember around the mid 90s there was a new option in many BIOS settings for parallel ports.

    Okay. Given that timing and that context, you mean the adoption of IEEE-1284 “Standard Signaling Method for a Bidirectional Parallel Peripheral Interface for Personal Computers” (March 1994). Since most “Super I/O” chipsets quickly implemented it, it became the first time x86 clone users could really rely on their parallel ports supporting more than the functions implemented by the XT/AT “unidirectional” type.

    Given the move of Unix hackers towards PC clones running open source at the same time, and the adoption of IEEE-1284 on non-PCs, the changeover probably did count as common knowledge across hackerdom for a while.

    But, as far as any legacy surviving to today, I can’t think of the last time I saw a parallel port, even on a printer. The differences between types of parallel ports become a bit academic when there aren’t any in use.

  37. @Standard Mischief “Again, let me plug the date Knoppix came out.”

    Excellent point, Knoppix was the catalyst to get me and others onto Linux. The LiveCD experience was an easy way to try all the features of Linux, and Knoppix was loaded with tools and applications. It was a no risk way to try things out.

  38. There’s an analogous “Things every sysadmin once knew” to be written – but not by me. It would have a lot of overlap with this, as well.

  39. Bad Boys Rape Our Young Girls But Violet Goes Willingly?

    (resistor band color codes)

    More Home-Brew Computer Club than Unix hackers, but still.

  40. @Ian Argent –

    There’s an analogous “Things every sysadmin once knew” to be written … It would have a lot of overlap with this, as well.

    It would have to have a lot of overlap. I believe historically the distinction between hackers who were mainly programmers and hackers who were mainly sysadmins was almost nil. (I can tell you that I started out as a code monkey in a tiny startup who also had to keep the devel system running, and slowly drifted into pure systems administration.) And for those who just explored things on their own (i.e., outside of their regularly paying gig), you had no choice but to run the box (for better or worse) as you worked on the code.

    Places with old-school professional “operators” and “systems analysts” were almost always places where you couldn’t futz with things in the hacker way. And those sysadmins probably were most proficient (if not exclusively so) on The Real System, so they didn’t have to keep a lot of general, common knowledge in their minds.

    Note also that the current trendy “DevOps” is a fusion of being able to both program and administer the system(s). So the general knowledge again condenses to the same set. (One of the other regulars here privately mentioned the following snark to me: “So, apparently the current title of choice is ‘DevOps’, which roughly translates to ‘system administrator who can code’. In my day, we just called that ‘competent’.”)

    1. >(I can tell you that I started out as a code monkey in a tiny startup who also had to keep the devel system running, and slowly drifted into pure systems administration.)

      I was the admin for my company’s VAX running 4.1BSD at my second programming job (’83-’85). I can certify that the programmer/admin distinction was barely a gleam in anyone’s eye then, if that. I don’t think it hardened until into the 1990s.

  41. I can certify that the programmer/admin distinction was barely a gleam in anyone’s eye then, if that. I don’t think it hardened until into the 1990s.

    Windows

    1. >Windows

      How would that explain the distinction taking hold in Unix-land?

      No, I think this was a straightforward consequence of complexity escalation in what sysadmins had to know, especially as security and spam became increasingly pressing issues.

  42. From discussion here, I have gathered that prior to the point at which the roles began to separate, sysadmin was a superset of hacker-as-programmer; but the skillset and knowledge base is an overlap, not a superset, and the overlap in day-to-day required knowledge isn’t big enough to justify overlapping the roles.

    One of the reasons I’m sceptical of the push for DevOps, myself, is that if you want people to be developers and operators too, you need people who are temperamentally suited to both roles.

    Anyway, by the time I entered the workforce in the early nineties, the roles were de facto separate; and this was prior to major adoption of Windows. This was also true at the university I was attending at the time (Stevens Institute), where the VAX cluster was run by a professional staff who were not all hacker-as-programmers (though there was at least one there, and he may still be there). (Overlap of entry into the workforce and college due to doing the co-op program, where I was paid an entry-level wage to do entry-level work in between semesters of college.)

    Now I want to re-read my copy of Cuckoo’s Egg, because it’s my impression that even by that point Cliff Stoll was a sysadmin first and foremost, and to the extent that he was a programmer, it was because the automation tools of the time were very primitive; and most of the people he interacted with during the events in question were sysadmins separate from programmers.

    I was the admin for my company’s VAX running 4.1BSD at my second programming job (’83-’85). I can certify that the programmer/admin distinction was barely a gleam in anyone’s eye then, if that. I don’t think it hardened until into the 1990s.

    De facto vs de jure – I would submit that then (as now) you were a special case.

    1. >De facto vs de jure – I would submit that then (as now) you were a special case.

      I didn’t think so. I think the demands of the sysadmin job have changed in a way that have made it a worse fit for me than it was. I was OK with being the house admin back in the ’80s, but I’m temperamentally unsuited to sysadminning today. There’s too much rote knowledge and boring procedure and too little figuring out things from generative principles to suit me.

  43. >Windows

    How would that explain the distinction taking hold in Unix-land?

    Background: I was coming of age right when this happened, and my first job was being hired on the spot in a Solaris shop because I knew Linux and a colo customer needed some hand-holding. As of 2000 in a Unix shop it was simply taken for granted that a 16-year-old admin would be able to wrangle shell scripts.

    The process as I interpreted it from on the ground looked like this: Netware had had some GUI configuration features for a bit. Windows NT (particularly 3.51) significantly expanded on its capabilities, and in conjunction with the release of Windows 95, became The Official Corporate Intranet Thing. Not only was NT configurable via GUI, it wasn’t effectively configurable without it.

    There remained a general class hierarchy of shops-with-Windows-9x and shops-with-Windows-NT-Workstation until Windows 2000 came along, which essentially sewed up the corporate landscape and placed the domain controller at the center of the universe. Suits viewed “admin” work as essentially a Hollywood parody of clicking around on complex (poorly-laid-out) GUIs, and wanted to see sufficiently impressive cockpits.

    Responding to this pressure, as early as the very late 90s, Sun and other publishers had started to roll out GUI control panels with mixed success; I have distinct memories of the GUI tools in Solaris 8, and they were essentially trivial Motif forms over standard Unix structures such as the passwd file, and frequent use tended to make one suspect.

    As suits started believing they were familiar with infrastructure work, however (because it was in the same GUI toolkit and thus, obviously, they understood it), they started to want to be able to hover over the Unix infrastructure that ran backends. At the same time, Linux started taking off as a serious force in serverland. However, Linux is free, right? But all these Unix admins are way more expensive than Windows admins. Easy solution: Linux has identical control panels to Windows, so we’ll just throw some Windows admins at our Unix servers.

    (Step 4: ??? Step 5: systemd)

  44. I didn’t think so. I think the demands of the sysadmin job have changed in a way that have made it a worse fit for me than it was. I was OK with being the house admin back in the ’80s, but I’m temperamentally unsuited to sysadminning today. There’s too much rote knowledge and boring procedure and too little figuring out things from generative principles to suit me.

    That was kind of what I was getting at: that the sysadmin job changed. I’m saying, out of my own experience, that the avalanche had already started prior to the early nineties – the system administrators were no longer whichever programmer got the hat pinned on them; it was already its own role and society, and had been so for some time.

    As a side note, the “rote knowledge and boring procedure” part of the sysadmin role is being automated away, slowly but surely, as the “root monkey” jobs are being given to scripts and abstracting tools.

  45. I think the preamble ought to have a link to the Jargon File, since it cites cultural literacy as a raison d’être for TEHOK. The reference to Cliff Stoll made me think: that’s the real lore of hacker culture, the stories; TJF and TEHOK are the necessary background knowledge for appreciating them. Whereas TJF has a bibliography linking to the stories, TEHOK doesn’t point the reader where to go next once he’s read the document and developed his interest further.

    Speaking of the bibliography, you may want to include Where Wizards Stay Up Late, which chronicles the formation of the Internet and various related features of hackerdom – RFCs, email, etc.

  46. I think the preamble ought to have a link to the Jargon File, since it cites cultural literacy as a raison d’être for TEHOK.

    Seconded. TEHOK is, in a lot of ways, a summary of the JF.

  47. “but I’m temperamentally unsuited to sysadminning today.”

    Every time ESR calls me for DNS help, I tell him I’ll make a sysadmin out of him yet…and am greeted with derision.

    1. >Every time ESR calls me for DNS help, I tell him I’ll make a sysadmin out of him yet…and am greeted with derision.

      I am beginning to think I’m going to have to do something serious about the way DNS and dhcp don’t talk to each other. It’s the root of about 75% of my sysadminning grief. dhcp should, in effect, dynamically update the DNS database…

  48. dhcp should, in effect, dynamically update the DNS database…

    This is one of the things that dynDNS makes their money on. It’s a common enough use case that my (terribly feature-poor) provider-provided router has an automatic function to report my dynamic/public IP to Dyn for them to automagically push it to DNS under a chosen domain. There are other vendors as well.

  49. (I realize that’s not exactly your use case, but I was pointing out that it’s enough of a problem on the public internet that several someones have evolved solutions)

  50. > dhcp should, in effect, dynamically update the DNS database…

    I think there’s an assumption on the part of most ISPs that people who aren’t paying for business-level service (which, naturally, comes with a static IP) aren’t entitled to run any services and therefore have no use for a domain name.

  51. I think there’s an assumption on the part of most ISPs that people who aren’t paying for business-level service (which, naturally, comes with a static IP) aren’t entitled to run any services and therefore have no use for a domain name.

    Any number of dynamic DNS services will happily do an end-run around this assumption. My ISP provides me the functionality to do so in the router they provided me, so it’s not a universal ISP assumption, either.

  52. First off, let me apologize if this has already been discussed and discarded; I’ve not been following the threads on these postings.

    Looking at the time line, I’d suggest a few more dates:

    1974 Initial release of the CP/M operating system (whose design PC-DOS and MS-DOS closely copied, and the source of many idioms like 8.3 filenames, etc.).
    1977 Introduction of the Apple II and TRS-80 Model I, arguably the second generation of the home computer.

    It’s a little misleading to have things jump straight from the Altair to the PC, since the PC was actually the third generation of personal computers.

    It might also be worth mentioning the TRS-80 CoCo (1980), as it was the first generally available color computer with gaming capability (at least, that didn’t require a $1000 color monitor). Lots of us got our start on the little 6809-based jewel :). (Note: I’m ignoring the Atari 800 here – 1979 – because frankly nobody bought the thing.) The CoCo basically drove the desire for color graphics that eventually led to IBM creating the CGA display (and we’re back to $1000 color monitors!) and all the pretty colors we now see on our 40″ XGA works of art. SO LONG, GREEN-SCREEN!

    One other might be the TI 99/4A (1981), which was remarkable simply for being the first home computer with a true 16-bit processor, but I’m not sure that has as much relevance to modern-day computing as the others above do.

    1. > I’m not sure that has as much relevance to modern-day computing as the others above do.

      A major failure mode I have to guard against in assembling this thing is the temptation to stuff it with trivia that many hackers get nostalgic about. In what I quote you’re asking the right question, but I think your filters are set too permissively. For example:

      >It’s a little misleading to have things jump straight from the Altair to the PC, since the PC was actually the third generation of personal computers.

      Misleading how? What specifics of the evolution of those intermediate stages have left traces that are relevant today? I think you’re on to the right sort of thing when you call out the CoCo, but I don’t see how it mattered to later developments whether the CoCo or the Atari 800 was the earliest color-capable machine.

      On the other hand, CP/M does deserve a mention because that’s where the 8+3 limitation in DOS came from, traces of which persisted very late.

  53. esr> I am beginning to think I’m going to have to do something serious about the way DNS and dhcp don’t talk to each other. It’s the root of about 75% of my sysadmining grief. dhcp should, in effect, dynamically update the DNS database…

    Please! There is supposed to be a way to link DHCP and BIND. There are (used to be?) instructions on how to do so. However, they… did… not… work. This is useful on internal networks of machines, where a DNS server would allow discovery of internal machines by name.

    1. >Should there be a reference to the chmod command in the octal section? AFAIK it is the one example of octal still in common use.

      It is, but it kind of falls in a crack. It can’t be attributed to a 36-bit survival, and it is still common knowledge.
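
      [Editorial aside: it is the one octal idiom nearly every Unix user still types, since each permission triple maps onto exactly one octal digit. A trivial illustration; the filename is hypothetical.]

      #include <stdio.h>
      #include <sys/stat.h>

      int main(void) {
          /* 0755 == rwxr-xr-x: owner/group/other are three bits each,
             so each triple maps exactly onto one octal digit. */
          if (chmod("script.sh", 0755) != 0)   /* hypothetical file */
              perror("chmod");
          return 0;
      }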

  54. > On the other hand, CP/M does deserve a mention because that’s where the 8+3 limitation in DOS came from, traces of which persisted very late.

    Even to today, arguably. Windows still generates 8.3 filenames by default, which has consequences for wildcard matching (*.htm will match the short filenames of .html files), and there are some other quirks (removal of trailing spaces in some contexts, removal of trailing dots, the fact that *.* will match filenames with no extension) that can be attributed to the original representation of the 8.3 filename (in both CP/M and DOS) as two space-padded fields with no explicit dot character.
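
    [Editorial aside: a toy sketch of that two-field representation and the matching it implies; to_fcb and fcb_match are invented names. It shows concretely why *.* matches even a name with no extension at all.]

    #include <stdio.h>
    #include <string.h>
    #include <ctype.h>

    /* Pack "NAME.EXT" into the 11-byte space-padded form CP/M and DOS
       actually stored (no dot byte); '*' fills out the rest of its
       field with '?', which matches any byte, space included. */
    static void to_fcb(const char *s, char fcb[11]) {
        int pos = 0, end = 8;                 /* name field: bytes 0-7 */
        memset(fcb, ' ', 11);
        for (; *s; s++) {
            if (*s == '.') { pos = 8; end = 11; }     /* ext field: 8-10 */
            else if (*s == '*') { while (pos < end) fcb[pos++] = '?'; }
            else if (pos < end) fcb[pos++] = (char)toupper((unsigned char)*s);
        }
    }

    static int fcb_match(const char pat[11], const char name[11]) {
        for (int i = 0; i < 11; i++)
            if (pat[i] != '?' && pat[i] != name[i]) return 0;
        return 1;
    }

    int main(void) {
        char star_dot_star[11], readme[11], index_htm[11];
        to_fcb("*.*", star_dot_star);   /* becomes eleven '?'s */
        to_fcb("README", readme);       /* "README" + five spaces, no ext */
        to_fcb("INDEX.HTM", index_htm);
        /* "*.*" matches both, including the extensionless name: */
        printf("%d %d\n", fcb_match(star_dot_star, readme),
                          fcb_match(star_dot_star, index_htm));  /* 1 1 */
        return 0;
    }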

  55. >On the other hand, CP/M does deserve a mention because that’s where the 8+3 limitation in DOS came from, traces of which persisted very late.

    Also, it was to maintain compatibility with CP/M’s use of slash as option flag that it was unavailable for use as path separator when DOS grew directories, leading to one of the most brain-dead decisions ever, flipping over to backslash as path separator and ignoring its use as an escape character in C and Unix.

    Without that, DOS could have been made very Unixy indeed.

  56. I think there’s an assumption on the part of most ISPs that people who aren’t paying for business-level service (which, naturally, comes with a static IP) aren’t entitled to run any services and therefore have no use for a domain name.

    A certain ISP, whose name will be withheld to protect the guilty, but which I will give the placeholder name of “AT&T Uverse” seems to assume that if you aren’t paying for business level service, you aren’t entitled to have anything on your local network that depends on working *local* name resolution, or to move USB wireless dongles between machines.

    One night, I was trying to putty into my Linux machine from the family Windows box, and the connection was being rejected. Grumbling about how I was using putty because I didn’t want to get up from my chair and walk to the other room, I did just that and checked the logs on the Linux machine, which claimed that the hostname of the Windows box didn’t match its IP address. So I pinged the Windows box:

    $ ping windowsbox
    ping windowsbox (192.168.foo.bar) baz(mumble) bytes of data.
    64 bytes from linuxbox (127.0.0.1): icmp_seq=1 ttl=64 time=?
    64 bytes from linuxbox (127.0.0.1): icmp_seq=2 ttl=64 time=?
    64 bytes from linuxbox (127.0.0.1): icmp_seq=3 ttl=64 time=?

    Turns out, whatever device had most recently connected to the router, the router was replying to DNS queries for that hostname by saying that the appropriate IP address was 127.0.0.1.

    It got worse. In trying to debug the issue, I reset the DHCP on the router. Bad idea. The router’s DNS had spot-welded the MAC addresses of the devices on the network to the first hostnames it had seen with those MAC addresses, and to the first IP addresses DHCP had assigned to those hostnames. When DHCP assigned new IP addresses after the reset, our local network became a hopeless mess. We may have attempted a full reset of the router. I forget. If we did, it didn’t help. My dad called tech support the next morning, but they weren’t competent for anything but verifying that we could reach the internet (which had never been a problem). He blew up on them and said we’d be disconnecting our service. The call ended with us getting a discount on our service, and a new router shipped.

    The new router was a different model, and lacked the 127.0.0.1 bug, but a while later I moved a USB Wi-Fi dongle from one machine to another, and found that the new router was still spot-welding MAC addresses to hostnames and IPs. Having been burned the last time, I very carefully did no further changes to the configuration of any devices on the network, went out and bought a TP-Link router from Fry’s, flashed it with OpenWRT, and moved our entire home network behind the OpenWRT router. The ISP router now works entirely as a point of connection for our internet line, and as the access point we let guests in the house use.

  57. ESR> Misleading how? What specifics of the evolution of those intermediate stages have left traces that are relevant today?

    I say misleading because the Altair was a first-gen batteries-not-included-some-assembly-required device. Comparing it to, say, an Apple II (comes in a box, no soldering necessary to make it work) is like comparing a bicycle-and-motor kit to a Harley. While the Altair was technically the first mass-production, general-purpose home computer, it was far from what your audience for this document would have in mind given that definition. That’s why I talked about “generations”: the first generation was the Altair, the second generation was the Apple/TRS-80 (nice enclosed boxes of parts, but still pretty rough around the edges and not exactly turn-key), and then the 3rd-gen PC (take it out of the box, connect cable X to port Y, insert the diskette, and flip the Big Red Switch(TM)).

    At least, that’s the way it makes sense in my mind. I will, of course, defer to the author ;).

  58. > Misleading how? What specifics of the evolution of those intermediate stages have left traces that are relevant today?

    I’d say that home computers in general were quite relevant to early hacker culture in a way that the IBM PC alone wouldn’t have been. Color-capable home computers (not just the TRS CoCo either, but e.g. the Commodore VIC-20 and C= 64, or the ZX Spectrum) drove adoption of the home computer itself as a more market-friendly and far more ‘hackable’ counterpart to videogame consoles like e.g. the Atari 2600 or the NES/Famicom. Later home computers such as the Atari ST, Amiga and Acorn Archimedes can be seen as continuing this trend by adding further capabilities that effectively made them into general-purpose multimedia platforms. (The reliable user-response characteristics of these systems are still unsurpassed in some ways, BTW. These were more like embedded systems that just _happened_ to support computer-like interaction and highly-developed multimedia, rather than the ‘general-purpose’ workstations we see today.)

    All of these systems were rather more open and ‘hackable’ than the contemporary IBM-compatible PCs, albeit the earlier generation of home computers were definitely hampered by other factors such as the lack of usable data-storage media and data-management software facilities. (GEOS, released in 1986, eventually filled this gap even for the lowly C= 64.) I’m not sure that I’d describe any of them as “rough around the edges” though. Back then, if you bought one of these you knew what you were getting into, and everyone knew at least the incantation they needed to type in order to load and boot a game or other software.

  59. Good lord, GEOS. The memories. I wrote a lot of term papers in the word processor that came with that – pain in the neck doing WYSIWYG on a 40-column display.

    And it was a pretty functional GUI for the time, too. I want to say they eventually came out with a PC-compatible version, but it was after OS/2, possibly after Windows.

  60. > And it was a pretty functional GUI for the time

    More relevantly, it was an amazing achievement to make _any_ sort of GUI work on a potato like the C64. The screen resolution was mostly limited by the video output, BTW. As this recently-posted video shows, by using a monochrome TV, you could achieve a rather sharp output at the native 320×200 resolution – which was just enough for 80-column full-screen ‘text’, for a rather generous interpretation of ‘text’! (Not sure if just using the de-saturation control on a common color TV would’ve given the same result? IIRC, it did improve sharpness quite a bit.)

    The IBM PC version is from 1990 (Windows 3.0 was released the same year), and was known as PC/GEOS or GeoWorks. (There were of course other graphical interfaces for the PC – some released far earlier, such as GEM, DeskMate, DESQview etc.)

  61. I had a bunch of the pre-Windows PC GUIs (my father was a hobbyist, as was I); I just didn’t recall the relative dating and couldn’t be bothered to look.

  62. > Not sure if just using the de-saturation control on a common color TV would’ve given the same result?

    As was discussed in an earlier thread, monochrome displays had superior resolution to early color displays because they had no shadow mask.

  63. > monochrome displays had superior resolution to early color displays because they had no shadow mask.

    Yes, but the beam focusing and shadow mask give you a practical limit of 640×480 pixels or so, which is not very relevant here. Later home computers (including Amiga) could definitely display fairly readable 80-column text on a color TV, although this would obviously be far less crisp than using an actual monitor.

  64. As a coincidental aside, I just had a conversation about what I was talking about (early computer generations and their differences) with a fraternity brother who is about 5 years my junior. He assumed that every computer had sockets you could push memory into and other nice “expansion” options. It was surprising to learn that once upon a time, SOP for a memory upgrade was “Step one, get your soldering iron….”

    Just thought that was a funny coincidence.

  65. It was surprising to learn that once upon a time, SOP for a memory upgrade was “Step one, get your soldering iron….”

    And now it is again!
