giflib: everything old is new again

In 1994 I handed off the maintainership of giflib, the open-source library used by pretty much everything in the universe that displays images for the single most widely used icon and image format on the World Wide Web, because patent issues made it unwise for the project to be run by someone in the U.S. Now, eighteen years later, Toshio Kuratomi (the hacker who took it over then) has asked me to resume the lead. I have accepted his request.

I don’t expect this to involve a lot of work. The code is very mature and stable; I left it in good shape and it’s coming back to me in good shape. I said yes because I thought it would be a good idea to run some auditing tools on the code to check it for correctness, and polish it into more modern C if that’s needed. Major changes seem highly unlikely.

Toshio has lost interest in the project, and was looking for someone to hand it to. Eighteen years is a long time, much longer than I myself spent on the project. He’s owed everyone’s thanks for his responsible stewardship of the code.


    1. >So, will you update your giflib page then?

      Done, thanks for the heads-up – I’d forgotten that was there.

  1. Kuratomi deserves thanks from nearly everyone that uses anything that operates on port 80 or 443. And a few others.

    I don’t know him, so I don’t know if he’d appreciate the attention, but might he like having folks express appreciation in some way (alcohol, food, straight money, donations to something)?

  2. @esr

    >Yes. It hasn’t changed very much.

    Wow. That’s a testament to the quality of the original code. Eighteen years without much change in any piece of software is impressive in my book.

    1. >Wow. That’s a testament to the quality of the original code. Eighteen years without much change in any piece of software is impressive in my book.

      Perhaps somewhat less impressive when you realize that the total volume of code involved is quite small. But Gershon Elber did a good job on the original DOS code, I managed not to screw it up when I ported it and significantly improved the API, and most of the few problems since have been with edge cases where the GIF input was somewhat malformed.

  3. Were there other patents than the LZW compression patent [which expired in 2003], or is that the main one?

    It’s kind of interesting, in an every-cloud-has-a-silver-lining way – if not for that patent, we might not have gzip or PNG today, as both were created to work around it.

  4. >The code is very mature and stable

    This could be a useful test case for several things.

    1) How many changes have been made in this package over the years? Has the number of changes varied over time? (I.e. lines of code altered.)

    2) How many instantiations of this code are out there? I.e. how many larger projects use it? How many new instantiations each year? How many installs of those instantiations? I.e. is it possible to gain a picture of how many working compilations of this code are out there?

    1. >This could be a useful test case for several things.

      Answers to the statistical questions you’re asking would be very interesting to have, but I can’t even imagine a way to collect any of them except “how many changes”, which would require stitching together the old CVS repo with the new post-2005 git one. I will probably do that at some point.

    1. >Is the TODO file up to date?

      I believe so. But bear in mind that a lot of it is blue-sky thinking for the so-far hypothetical 5.x release. I’m not going to do anything that will mess with the API, not without very strong reason and careful consideration first.

  5. Isn’t that one of those pieces of software where you, as the maintainer, are obliged to keep it working exactly the same — even make sure the bugs work the same way? I am reminded of Knuth’s statement about TeX and Metafont that once he dies the version numbers become Pi and e respectively, and that all bugs become features.

    I am also reminded of a story that when Microsoft went from Windows 3.1 to Windows 95, their testing labs found that SimCity (I think that was the game) had a bug where it used memory after it had been freed. This caused it to break when run on Windows 95, so they actually put a hack in the OS to keep the freed memory around for that specific game. Scary, but apparently true. (See here for this story.)
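
    For readers who haven’t met the bug class, here is a minimal C sketch of the kind of error SimCity made. This is undefined behavior; it only “worked” because the Windows 3.1 allocator happened not to reclaim the block right away:

    ```c
    /* Use-after-free in miniature: undefined behavior in C.
     * Old allocators often left freed blocks intact for a while,
     * which let buggy programs like this appear to work. */
    #include <stdlib.h>

    int main(void)
    {
        char *buf = malloc(64);
        if (buf == NULL)
            return 1;
        free(buf);
        buf[0] = 'x';   /* use after free: anything may happen */
        return 0;
    }
    ```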

    Being beloved and popular has all sorts of special obligations.

    1. >Isn’t that one of those pieces of software where you, as the maintainer, are obliged to keep it working exactly the same — even make sure the bugs work the same way?

      No. We don’t do things that way in the open-source world. Object incompatibility isn’t the deal-breaker for us that it would be elsewhere, because we presume (a) our customers can rebuild from source code, (b) they’ll be building with our library in source code form, (c) where these things aren’t true, multiple distinguishable versions of the object file will be available and the clients will tell the linker which version to grab.

      What does constitute good practice for us is carefully documenting the API and providing version symbols so client programs in source code can condition code based on what version is available. I’ll make sure that’s in place.
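
      For concreteness, a hedged sketch of what that conditioning looks like from the client side. The GIFLIB_MAJOR/GIFLIB_MINOR macros exist in recent giflib headers (very old releases may lack them); the 5.x signature shown is illustrative only, since 5.x is still hypothetical at this point:

      ```c
      /* Sketch: client code conditioning itself on version macros
       * exported by the library header.  Assumes GIFLIB_MAJOR is
       * defined (it may not be in very old releases); the 5.x
       * signature below is an illustrative assumption. */
      #include <gif_lib.h>

      GifFileType *open_gif(const char *path)
      {
      #if defined(GIFLIB_MAJOR) && GIFLIB_MAJOR >= 5
          int err = 0;
          return DGifOpenFileName(path, &err);  /* hypothetical 5.x form */
      #else
          return DGifOpenFileName(path);        /* classic 4.x form */
      #endif
      }
      ```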

  6. Incidentally, for the record, in UNIX in the pre-ANSI days [and I’ve heard that MS was at the time rather reluctant to stray from what was documented in the K&R book and in the Unix implementation – which is why they used a new function name WinMain instead of making incompatible extensions to main], the ability to reuse the most recently freed memory pointer (for the specific purpose of realloc) was in fact documented as allowed.
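
    A sketch of the idiom being described, which the V7 manual documented as legal; under ANSI/ISO C it is undefined behavior, so treat this as historical illustration only:

    ```c
    /* Pre-ANSI Unix idiom: a block freed since the last call to
     * malloc/realloc/calloc could still be handed to realloc(),
     * with its contents intact.  Undefined behavior in ANSI C. */
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *p = malloc(100);
        if (p == NULL)
            return 1;
        strcpy(p, "hello");
        free(p);              /* old manuals: contents stay valid... */
        p = realloc(p, 200);  /* ...until the next allocation call   */
        if (p == NULL)
            return 1;
        free(p);
        return 0;
    }
    ```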

  7. In 1994 I handed off the maintainership of giflib, […], because patent issues made it unwise for the project to be run by someone in the U.S. Now, eighteen years later, Toshio Kuratomi (the hacker who took it over then) has asked me to resume the lead.

    Strange, Toshio has been in the US since at least 1995.

    http://tiki-lounge.com/~toshio/resume/resume.html

  8. Having used giflib in a project, I’m impressed. Apart from having to update it to fix a security issue, and patching it so it doesn’t store loaded images (since I was essentially streaming), it’s held up quite well.

  9. >No. We don’t do things that way in the open-source world.

    And you wonder why Microsoft keeps on winning? Until this changes, Linux will continue to languish in obscurity.

    Microsoft’s Herculean efforts to remain bug-for-bug compatible with each successive version of Windows are a big part of why Windows was historically considered a stable platform for the long term — and Linux is not. Conditions (a), (b), and (c) are all false often enough that nonfree software on Linux just isn’t an option, unless you freeze the platform at a specific version and/or distro.

    In past comments on this blog, Shenpen has put forth two basic rules that he thinks are reasonable to expect from any OS in order to consider it a mature, stable platform:

    1) Programs written for one version of an OS should run, if not bit-identical, then essentially the same with no new surprising bugs, on all subsequent versions of that OS, forever.

    2) An OS written for a particular hardware platform should support any commercially available peripheral which can be attached to that platform.

    The year Linux fulfills both of these conditions will be the year of the Linux desktop. Microsoft has made them a priority.

  10. Note: Above I say “was historically” because recently, particularly with Vista and Windows 8, Microsoft has been experimenting with wild new directions for Windows; and accordingly, consumer confidence in the platform has flagged. Which goes to show how important the issues of backward and forward compatibility really are.

    1. >Which goes to show how important the issues of backward and forward compatibility really are.

      Linux handles library compatibility with versioned libraries and applications that ask the linker for the versions they need. Application forward and backward compatibility is a different issue.
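
      A minimal sketch of that mechanism, using GNU symbol versioning. The function names and version tags here are hypothetical, and linking a real library this way also requires a linker version script declaring the tags:

      ```c
      /* One shared object carrying two incompatible implementations
       * of the same interface; old binaries keep binding to the old
       * one.  GNU toolchain only.  Names and version tags are
       * hypothetical, and a version script defining VERS_1/VERS_2
       * is needed at link time. */
      int gif_open_v1(const char *path)
      {
          (void)path;
          return -1;                        /* legacy behavior lives here */
      }

      int gif_open_v2(const char *path, int *err)
      {
          (void)path;
          *err = 0;
          return -1;                        /* current behavior lives here */
      }

      /* '@'  = an old version existing clients may still request;
       * '@@' = the default that newly linked clients will get. */
      __asm__(".symver gif_open_v1, GifOpen@VERS_1");
      __asm__(".symver gif_open_v2, GifOpen@@VERS_2");
      ```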

  11. >Microsoft’s Herculean efforts to remain bug-for-bug compatible with each successive
    >version of Windows are a big part of why Windows was historically considered a stable
    >platform for the long term — and Linux is not.

    Arguably this isn’t as much of a problem with Linux as it would be with Windows. If I want to run my entire company on RedHat 3 (not RHEL3), I can, and can continue to do so for the foreseeable future (barring hardware incompatibility) simply by reinstalling. On the other hand, the day Microsoft shuts off the XP license server, even if I keep every single one of my licenses and disks and simply re-install when old machines break and have no plans on adding new computers, I’m SOL. So if you want bit compatibility with Linux, you run the version you want to run. If you want it with Windows, you hope Microsoft maintains compatibility.

    Now, from a practical standpoint, sticking with an old version of linux has problems all on its own, including that if you thought getting support for Linux was difficult when it was current, you’ll find the first answer to any question you ask is “Why are you using that old version, you should upgrade to the new version”, which is of course why RedHat and Microsoft sell long term support contracts.

    >In past comments on this blog, Shenpen has put forth two basic rules that he
    >thinks are reasonable to expect from any OS in order to consider it a mature, stable platform:

    And those are unreasonable demands that not even Windows, with its slavish devotion to backwards compatibility, meets. And Apple has had plenty of success trashing backwards compatibility when it suits them. See OS X in general, the EOL of Carbon, the EOL of Rosetta, the EOL of ADB.

    “I am also reminded of a story that when Microsoft went from Windows 3.1 to Windows 95, their testing labs found that SimCity (I think that was the game) had a bug where it used memory after it had been freed. This caused it to break when run on Windows 95, so they actually put a hack in the OS to keep the freed memory around for that specific game.”

    Very bad software engineering practice. They should have fixed the application, not the operating system!

    This reminds me of a story that happened to me personally around 1999-2000. I was brought in as a consultant to a small dot-com (ah, the glory years) to port a fairly complex piece of C++ software that generated HTML code and served it up through the web server to a user running a standard web browser.

    The application was originally written on Windows and tied to COM and IIS. My task was to port it to Solaris, CORBA, and (if I remember correctly) the Netscape web server.

    The biggest headache (aside from CORBA — that’s a whole rant in itself) was that the original version was written with Microsoft Visual Studio C++, and, in many places, depended on the behavior that dereferencing a null pointer threw a well-defined null pointer exception. However, if you read the C++ spec, dereferencing a null pointer is undefined behavior and well-behaved programs are not permitted to do it. And in fact, the Solaris C++ compiler did not handle it cleanly.

    So I had to go all through the code trying to find every possible spot where a pointer that might be null was dereferenced, and add an “if (ptr == NULL)” check in front of it. I was never really sure that I found every possible case, although the final version did appear to work.

    Programs should never depend on behavior that is explicitly not guaranteed in the spec!
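
    In outline, the defensive pattern was the one below; the names are hypothetical, but the shape is what had to be added at every risky call site:

    ```c
    /* Guarding a possibly-NULL pointer instead of relying on the
     * platform to trap the dereference.  Names are hypothetical. */
    #include <stddef.h>

    struct page;                       /* opaque application type */
    int render_page(struct page *pg);  /* hypothetical renderer   */

    int render_if_present(struct page *pg)
    {
        if (pg == NULL)
            return -1;            /* explicit, portable error path   */
        return render_page(pg);   /* dereference only past the guard */
    }
    ```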

  13. > The biggest headache (aside from CORBA — that’s a whole rant in itself)
    > was that the original version was written with Microsoft Visual Studio C++,
    > and, in many places, depended on the behavior that dereferencing a null pointer
    > threw a well-defined null pointer exception.

    All the world’s a VA^H^H Windows box. I call it Microsoft brain damage, but sadly history shows that it didn’t happen there first, nor is it likely to happen there last.

  14. It’s not just the Windows world. I still find people comparing booleans to true. People on StackOverflow think it’s normal.

  15. @Bob:
    “I still find people comparing booleans to true. People on StackOverflow think it’s normal.”

    Well, it *is* normal. Provided it’s 1980 and you are programming in BASIC.
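
    For concreteness, the same trap exists in C, where any nonzero value is true in a condition but only one value compares equal to TRUE:

    ```c
    /* Why "== TRUE" is fragile: isalpha() is only guaranteed to
     * return nonzero for a letter, not necessarily 1. */
    #include <ctype.h>
    #include <stdio.h>

    #define TRUE 1

    int main(void)
    {
        int flag = isalpha('A');  /* nonzero, but maybe not 1    */
        if (flag == TRUE)
            puts("matched TRUE"); /* may silently never print    */
        if (flag)
            puts("truthy");       /* always prints: correct test */
        return 0;
    }
    ```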

  16. Jeff Read writes: “Until this changes, Linux will continue to languish in obscurity.”

    Yep, obscurity. I’ll wait for the laughter to die down.

    Jeff Read continues with: “Microsoft’s Herculean efforts to remain bug-for-bug compatible with each successive version of Windows are a big part of why Windows was historically considered a stable platform for the long term — and Linux is not.”

    The only thing “Herculean” about Microsoft’s efforts is the Augean stables.

    And saying “compatible” and “successive version of Windows” in one sentence is pretty much another laugh riot. The amount of work that any serious software product had to put in for each successive version of Windows was the real Herculean effort.

  17. >Well, it *is* normal. Provided it’s 1980 and you are programming in BASIC.

    Every 1980s BASIC I tried lacked a true boolean type. They used the same convention as C: a zero value in the condition part of an IF meant false; nonzero meant true. Comparison operators returned 0 or -1. AND and OR were bitwise operators that doubled as logical ones. (TI-99/4A BASIC didn’t have AND or OR; you had to use * and +, which only makes sense if true were 1 instead of -1; that may have been the case.)
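
    The TI-99/4A workaround transcribed into C, for anyone who never met it: with 0/1 truth values, multiplication behaves as AND, and addition tested against zero behaves as OR:

    ```c
    /* * as AND, + as OR, given 0/1 truth values. */
    #include <stdio.h>

    int main(void)
    {
        int a = 1, b = 0;               /* 1 = true, 0 = false */
        int and_result = a * b;         /* 0: true AND false   */
        int or_result = (a + b) != 0;   /* 1: true OR false    */
        printf("AND=%d OR=%d\n", and_result, or_result);
        return 0;
    }
    ```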

    >Yep, obscurity. I’ll wait for the laughter to die down.

    Linux used to have a whopping 2.5% desktop market share. Today that’s back down to around 1%. What happened? Well, the Mac basically captured most of the geek market, with a few hackers going grudgingly back to Windows.

    I’m not counting Android in this because a) no one really knows or gives a shit that Linux is inside; b) Android is a hand-rolled, completely custom platform that happens to use the Linux kernel. The kernel, by the way, carries almost none of the guilt for platform incompatibility at the user level because of Linus’s assiduously enforced policy of “don’t break the syscall interface”. (And yet, by the same token, Linus’s policy that only the syscall interface need remain stable, and anything within the kernel can change at any time, has made third-party driver support for Linux needlessly difficult.)
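
    The stable boundary in question is easy to demonstrate: a program making a raw system call has kept working unmodified across kernel versions for decades, even as everything behind that boundary churned:

    ```c
    /* A raw getpid(2) via syscall(2): the user-facing contract that
     * "don't break the syscall interface" protects. */
    #include <stdio.h>
    #include <sys/syscall.h>
    #include <unistd.h>

    int main(void)
    {
        long pid = syscall(SYS_getpid);  /* bypasses the libc wrapper */
        printf("pid via raw syscall: %ld\n", pid);
        return 0;
    }
    ```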

    >And saying “compatible” and “successive version of Windows” in one sentence is pretty much another laugh riot. The amount of work that any serious software product had to put in for each successive version of Windows was the real Herculean effort.

    I don’t remember any Windows 3.x programs that failed to run under Windows 9x.

  18. >I don’t remember any Windows 3.x programs that failed to run under Windows 9x.

    To which I meant to add that the 9x to NT transition was more difficult because NT was an advanced operating system with real memory protection and enforced user permissions, not a thin and crufty shim on top of DOS. And you’d be surprised how many “serious software products” abused the Windows API or deliberately exploited the single-user DOS underpinnings of Windows in order to wring out performance or give developers a shortcut.

    But even then, part of the reason for Windows’s many security problems was that Microsoft favored backward compatibility over security — the right decision for them to make at the time.

  19. @Jeff Read:
    “Linus’s policy that only the syscall interface need remain stable, and anything within the kernel can change at any time, has made third-party driver support for Linux needlessly difficult.”

    No, what’s made third-party driver support needlessly difficult is the refusal of manufacturers to release enough information to write and maintain open source drivers.

    1. >No, what’s made third-party driver support needlessly difficult is the refusal of manufacturers to release enough information to write and maintain open source drivers.

      Case in point, the CP2101 I bitched about a few days ago.

    1. >Sorry to hijack your thread, but … your next phone has arrived

      You are correct. I’ll probably order one next week after I get back from Penguicon.

  20. @Jeff Read: “Until this changes, Linux will continue to languish in obscurity.”

    @SPQR: “Yep, obscurity. I’ll wait for the laughter to die down.”

    @Jeff Read: “Linux used to have a whopping 2.5% desktop market share. Today that’s back down to around 1%.”

    I think you two are talking past each other. Jeff’s original post said nothing about “desktop”. There’s nothing obscure about Linux’s role in the server world.

    Actually, this is an interesting point in general. Open source has made bigger inroads on the server side than on the desktop; this issue goes beyond Linux or anything in the GNU toolset. And open source has never gotten much traction in the Windows world, for reasons that are not clear to me. (I’m talking about utilities and applications here, not the OS itself.)

    I think this must be due to culture, not technical limitations. There’s no obvious reason there couldn’t be a flourishing open source culture around Windows; after all, there’s plenty of closed-source freeware.

  21. >I think this must be due to culture, not technical limitations. There’s no obvious reason
    >there couldn’t be a flourishing open source culture around Windows; after all, there’s
    >plenty of closed-source freeware.

    It’s a matter of usefulness, I think. Most Windows users aren’t programmers. Most server guys (when these systems were gaining traction) were. Server guys had a use for code access. Windows guys didn’t, nor did they have the tools.

  22. “Most Windows users aren’t programmers. Most server guys (when these systems were gaining traction) were. Server guys had a use for code access. Windows guys didn’t, nor did they have the tools.”

    Mmm, maybe. I think there’s more to it than this.

    The geek hobbyists who rolled up their sleeves and dived into Linux when it became available existed before Linux, and most of them were probably either running Windows or endlessly looking for alternatives that never panned out (e.g., Amiga). Before the PC, they were running Apple II’s.

    It may have more to do with timing. Before the Linux era flowered in the mid-to-late 90’s, the Net had a more limited user base, most of whom had at least some access to bigger iron than PC’s. There were many PC users, but they had limited connectivity to each other through BBSes and UUCP. (Yes, I’m going back to the themes of the “World Without Web” discussion we had on this blog.)

    If Linux had not come about when it did, and if the PC-based BSD’s had not been available as an alternative, I think we might have seen a Windows open source movement created by the same people who, in our timeline, created the Linux ecosystem.

  23. Cathy said: Before the Linux era flowered in the mid-to-late 90’s, the Net had a more limited user base, most of whom had at least some access to bigger iron than PC’s. There were many PC users, but they had limited connectivity to each other through BBSes and UUCP. (Yes, I’m going back to the themes of the “World Without Web” discussion we had on this blog.)

    Anecdote time!

    My first Linux install was Slackware 2.3, with the 1.2.8 kernel.

    Downloaded over a modem and installed via floppies, onto a 486DX/100 with 4 megs of ram. (13 hour kernel build time. Good times.)

    So, yes. Linux hit the heavily geeky computer hobbyist and CS student populations first and heaviest.

  24. Sorry, this is off-topic, but I thought some people might be interested:

    http://www.tor.com/blogs/2012/04/torforge-e-book-titles-to-go-drm-free

    Tor ebooks will be shedding their DRM this summer. One step closer to usable and reliable ebooks! I’m still waiting for the formats to settle down to the point where I can be confident that in 30 years I’ll be able to read books on sale today, but we’re getting there.

    Is it me or is Tor always ahead of the curve?

  25. @Tom
    Baen was ahead of Tor by a very considerable margin re: DRM-free.

    In terms of long-term reliability, you can’t go wrong with ePub; the format is just a zip with HTML and XML data. So long as the file isn’t encrypted (a la B&N, but that’s easily bypassed) you can always pull the content out. I have done this with at least one book where there were censorship edits between regions and I wanted to read the un-censored version. Unzip, locate edit, fix, zip back up. Done.

    There are other technical issues with ePub as it currently stands, but they are issues with the ebook format in general.

  26. @jsk

    Yeah, the format’s open, but how can you know that future reader software will be interpreting it in the same way it interprets it today? I know some people think of a book as just a stream of text. I don’t have anything against that point of view, but for me a book is a lot more than that. I care a lot about formatting and typography. I want to have a reliable reading experience and know that in decades to come I will be able to view the content in the way the author or publisher laid it out.

    Even today no two ePub readers interpret the format in exactly the same way. To me that is a problem, and one of the reasons I have not adopted ebooks yet.

  27. And, for that matter, I am not yet convinced that ePub won’t fall by the wayside and be supplanted by another standard. Yes, there are open-source readers. But what if people stop maintaining them? Eventually they will become unusable, and take my books with them.

    The field still feels too risky for me to make a substantial investment in ebooks.

  28. @Tom
    That would be the ‘other technical issues’ I mentioned. And my point about ePub (and I guess mobipocket too, if it’s not encrypted) is that it CAN’T fall by the wayside. The data is accessible. All you’d need to do is convert to another format (via Calibre or whatever).

    For now we have a trade-off between the relatively unusual case of a book’s formatting being extremely specific (think House of Leaves and such) and being able to dynamically resize and reflow text. We need a good solution for the former; the latter is too much of a win to even consider doing away with.

  29. @jsk

    That would be the ‘other technical issues’ I mentioned. And my point about ePub (and I guess mobipocket too, if it’s not encrypted) is that it CAN’T fall by the wayside. The data is accessible. All you’d need to do is convert to another format (via Calibre or whatever).

    That’s a good point.

    For now we have a trade-off between the relatively unusual case of a book’s formatting being extremely specific (think House of Leaves and such) and being able to dynamically resize and reflow text. We need a good solution for the former; the latter is too much of a win to even consider doing away with.

    I consider formatting to be important even for novels, but I am probably in the minority. However, for many other types of books (e.g. textbooks, programming manuals, reference works, art books, magazines etc) formatting is absolutely critical. We’re still really a long way from having a good solution for these cases, in my view.

  30. @Sigivald:

    “My first Linux install was Slackware 2.3, with the 1.2.8 kernel. Downloaded over a modem and installed via floppies, onto a 486DX/100 with 4 megs of ram. (13 hour kernel build time. Good times.)”

    *snort*

    I can raise you one. My first Linux install was SLS 1.0, running the 0.98pl6 kernel. I *think* that this was the very first version of the kernel capable of running XFree86.

    It took about 30 5.25″ floppies to pull the whole system. I escaped the modem download by pulling the whole thing at work. Even then, I had to search the building high and low to find a PC with a 5.25″ floppy drive. (Our everyday computers were Mac SE’s.)

    The machine was a 486DX/66, dual-booting Linux and Windows 3.1 via LILO. Even GRUB didn’t exist back in 1993. I think it had 8 megs of RAM, which in practice meant that I could run *either* XFree86 *or* build code without swapping, but trying to run a big compile/link build under X would swap like crazy.

    But the whole thing was quite an improvement over my AT&T 3B1, commonly known back then as a “Unix PC”. The meaning of “Unix PC” has shifted a bit since then. :-)

  31. I go back to having a Unix SysV 386 install on an old 386/25MHz PC exchanging email with UUNET over UUCP. Heck, I once got a job solely on my ability to make UUCP dialup scripts work. (Remember “ogin:” … ) I toyed with Minix briefly and largely unsuccessfully in that brief interval between it and early Linux.

    Well, actually, in the Windows 3.x to Win 95 era, I was supporting an industrial application that displayed scanned documents on a workstation as part of a large document storage system. And certainly the display app broke between those Win versions; I recall spending some time fixing it.

    As for obscurity, I’m well aware of the “market share” numbers but that does not make Linux desktops “obscure”. I used to believe that nonsense myself for years but had a revelation when, about 4 or 5 years ago, I stayed in a small hotel in Prague. They had a tiny lobby with a PC on a table for use by the guests to access the Internet. It was running some Linux distro. I watched scores of hotel guests sit down at it, look at it quizzically for half a minute or so, find the icon for a browser and be off using it. I burst out laughing when one guest told his wife that “Microsoft makes a different kind of Windows for the Czech Republic but it works close enough”.

    The reason that Open Source lost traction in the Windows world was Microsoft’s practice of absorbing into Windows, Borg-like, every good app or tool that appeared. One of the things I’m enjoying about Android phones is the re-emergence of the creativity of the small software outfits to build little things for people to use without getting stomped upon by MS. I greatly miss those days in the PC world when you’d comb software shelves, PC BBSes and PC magazines for word of a useful small tool to buy or download. It’s why I’ll never ever use a Microsoft phone, period.

  32. @SPQR:
    “The reason that Open Source lost traction in the Windows world was Microsoft’s practice of absorbing into Windows, Borg-like, every good app or tool that appeared.”

    Yes, I think there’s a great deal of truth there.

  33. “Done, thanks for the heads-up – I’d forgotten that was there.” – I think there’s a fair amount of stuff on your website that could stand to be updated. The BROWSER project page, for example, has the aura of something that nothing ever came out of, when in fact the variable is used by both Debian’s sensible-browser script and (in the absence of a DE; I haven’t traced what happens with e.g. GNOME in use) freedesktop’s xdg-open. It’d certainly be useful to add that information, if only to give a heads-up that setting BROWSER to one of these makes no sense due to circularity, and that they may be serviceable fallbacks for an application to use if BROWSER is not set.

    No particular bad reflection on you, since it’s certainly possible that they added that functionality without telling you, but worth mentioning all the same [and really, that it was added without telling you is its own indicator of how successful it is]
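
    For illustration, a minimal sketch of the convention under discussion: consult BROWSER first and fall back to xdg-open when it is unset. (The real BROWSER spec allows a colon-separated list with %s substitution; this sketch deliberately ignores that, and the URL is just a placeholder.)

    ```c
    /* Minimal BROWSER-consulting launcher.  Real implementations
     * must handle colon-separated lists and %s substitution; this
     * sketch deliberately does not. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const char *browser = getenv("BROWSER");
        const char *url = "http://www.catb.org/";
        char cmd[512];

        if (browser == NULL || *browser == '\0')
            browser = "xdg-open";   /* fallback named in the comment above */
        snprintf(cmd, sizeof(cmd), "%s '%s'", browser, url);
        return system(cmd);
    }
    ```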

  34. >Is it me or is Tor always ahead of the curve?

    They’re lagging Baen by about a decade at this point, but, yeah, they are ahead of the general curve.

  35. I find it interesting that a “burn all GIFs” campaign attempt for Web video, spearheaded in part by Google, is failing.

    It’s becoming more generally accepted that if a standard is patented, paying patent license fees is a part of the cost of doing business.

    Also interesting: Google is apparently completely powerless to use the Android platform to force the issue the way Apple did with its iOS platform and Flash.

    This is why we “fanbois” do not look at market share numbers when determining which is the most influential smartphone platform.

  36. @Jeff Read:
    “It’s becoming more generally accepted that if a standard is patented, paying patent license fees is a part of the cost of doing business.”

    Not in the open source community. I don’t see how an implementation of a patented standard can be used in an open source project that will be freely distributed, barring some special licensing that essentially negates the patent for purposes of the project in question.

    Patented standards are evil. If you want to patent a proprietary system, go ahead, but don’t pretend it’s part of an interoperable standard. (And if it’s not interoperable, calling it a “standard” is meaningless.)

  37. My first Unix PC was an out-of-the-box IBM AT with a 10MB hard drive running Microsoft Xenix (i.e. System III) with Microsoft’s C compiler, which had the nice property that you could compile the same code, library calls permitting, for both Xenix and DOS. I remember thinking, what could I possibly do that would fill up 10MB of storage? Putting in all those floppies at build time certainly was a pain, though.

    That was also the last job at which I had an office with a door that shut.
