Autotools must die

Me, on the GPSD mailing list:

Once upon a time, I did not hate autotools. Yea verily, it was the morning of the world and all things (even autotools) seemed fresh and new. I’d say this innocence lasted until about, oh, 1995.

But autotools was a kluge. And it did accrete kluges and crocks around it, adding layers of complexity until it became sore difficult to tell which end was up. And lo, it became a festering pile of special cases and obscure semi-documented rules, leading to a combinatorial explosion of unplanned interactions and obscure lossage.

Like, say, the fact that our make check insists on running gps-makeregress twice, and in spite of being a genuine autotools expert with fifteen years of experience bear-wrestling the sorry fscker, I cannot figure out why it is doing this.

Oh, there’s a reason, all right. And if I were willing to quintuple the three hours I just spent poking at our build setup I’m sure I could find it. But that was three hours wasted as far as solving any real problem was concerned. Life is too short for such nonsense.

Autotools has reached the Chandrasekhar mass limit of software and become a black hole of suck that consumes endless hours of bright peoples’ time without any good result. It is well past time for it to die.


  1. Grrrr I hate autotools from the user perspective. Configure, make, make install is reasonable in theory, but it seems like every different package has new gotchas, especially in the configure step.

    I’ve been learning about cmake which seems to be superior.

  2. Autotools also sucks from the API breakage perspective. It seems like once a quarter, new ways to do old tasks get introduced, the old ways get deprecated, and nobody bothers to document either when a mechanism was introduced *or* when the previous mechanism was deprecated. A lot of the time it’s hard to tell whether this push comes from autoconf, automake, or libtool, and which of those are just following along. This almost necessitates that software maintainers check the blessed autoconf and Makefile.in files into revision control — violating the usual good practice of not checking generated files in.

    Sadly, autotools’ years of experience do give it wider portability, wider applicability and more flexibility than any of the alternatives I’ve found.

  3. Yeah, join the club.

    Bout ten years ago I tried to use autoconf on a personal project. The first time I was hit with that baffling Wall O’ m4, I found myself mired in the same confusion that prevented me from ever being an effective Tcl programmer: “Should I quote this? Not quote it? Double-quote it? What?”

    I decided that my time was too valuable to waste on that sort of nonsense, as I’d quickly find myself spending more time twiddling Autowhatever than making my code work.

  4. The remarkable thing about autotools is not how many kludges it has in it, but how few non-kludges. There is no central core of solid engineering around which the kludges are accumulated. It’s just kludges all the way down.

  5. Finally someone people listen to is piping up on this. I’ve been hating autotools since a few minutes after I first looked at it.

    So, what are we going to replace it with?

  6. >It’s just kludges all the way down.

    Yes. Yes it is.

    I favor scons myself. cmake is big and ugly and verbose by comparison.

  7. There are so many tools which purport to make life easier for the programmer but end up sucking time and effort to achieve a reasonably simple result. Yet, for some strange reason, people seem to prefer those tools and they achieve mainstream acceptance. So much so that you’re condemned if you reject them and go with your own solution.

    I would rather learn to write and maintain a Makefile manually than rely on tools which “help” me create them. Really, I found these tools mystifying. If I ever had a complex project, I would MUCH rather use an IDE which hides all the autoconf/automake crap beneath.

    There are a lot of “helper” libraries which add similar complexity, and I hate all of them. This is one of the reasons I prefer to learn the lower-level tools, which are simpler and more UNIX-like, rather than tools which try to become all-in-one. In a previous topic I was referring to this from a different perspective, but the philosophy is the same.

  8. Autotools (and all of their competitors) drive more developers from C into the arms of the interpreted languages (Java, Perl, Python, Ruby) than all of the memory leaks and buffer overruns combined. Of course REAL crazy is when someone decides to replace autotools with Python setuptools — I refuse to name names on the grounds that someone might go look, and then blame *me* for the onset of insanity.

    Mind you, every language that’s good enough to build a complex system seems to accumulate a build system that accumulates more and more cruft. With Java, Ant is fine for building individual projects, but then it begat Maven, which suffers the same problems as autotools: it’s fine as long as you want to do everything its way. Python got setuptools. Perl got Makefile.PL. Ruby got gems. You don’t even want to think about the horror that is MSBuild on Windows (which is still a step above letting VisualStudio do everything).

    More amazing is that most commercial / internal development seems to be done without even that much attention to consistency.

  9. Well, if we’re venting, I might say that as a sometime build and tools automation guy, whatever its faults, autotools runs from a shell, which makes automated builds possible.

    Unlike an IDE build.

    Nothing makes me more crazy than some developer saying “Well, it builds fine in eclipse… What’s wrong with your build server?”

  10. Sean C. Says:
    > Lots of smart people like scons

    esr Says:
    > I favor scons myself. cmake is big and ugly and verbose by comparison.

    I’ll check out scons, thanks.

  11. “most commercial / internal development seems to be done without even that much attention to consistency.”

    Most commercial/internal development has at least one of the following two things: A fully controllable environment, or a Windows deploy.

    With a fully controllable environment, you don’t need to check your version of libxml; you already know it. Pretty much nothing autotools does is applicable.

    With Windows, the standard is basically to assume that the user has a fresh install of Windows and nothing else, so you need to ship every last additional library you need. When not combined with the fully-controlled environment, this still fails with some regularity, too.

  12. >Most commercial/internal development has at least one of the following two things: A fully controllable environment, or a Windows deploy.

    Yeah, I’ve thought for years that must be at least half the reason Windows build chains suck so badly. No selective pressure to improve. But this cuts both ways; increasingly I see younger hackers being blissfully unaware that cross-Unix portability was ever a big deal – they’ve learned to program in a post-ANSI-C and POSIX-conformant world, and the set of issues autotools was actually designed to tackle is mostly ancient history.

  13. The Chromium project moved from scons to gyp last year. I haven’t seen much of scons, but I like what I’ve seen of gyp better.

  14. @Jeremy,

    The problem with “fully controlled environments” is that they are rarely as fully controlled as you thought they were. Recent example: I was presented with a set of C++ code that was nominally POSIX- and CORBA-compliant. My task? Build it with a different ORB on 64-bit Linux instead of 32-bit Windows, and replace the sound libraries with the Linux equivalents. The first trouble spot was that the existing code was only built with VisualStudio — I ended up replacing that with autotools (not fun). Then there were all of the places where the original developers coded outside the CORBA spec because the original ORB had “useful extensions”. The 32-bit to 64-bit conversion was almost trivial in comparison. Rewriting the sound modules … … sound on Linux sucks.

    In other news, Red Hat Enterprise Linux is a fine platform for 5-year old stable code. It is a horrible place to do new development, because all of the libraries (and the kernel) are So Damned Old. (Fair Disclosure: I am a Red Hat stock holder.)

  15. > (Fair Disclosure: I am a Red Hat stock holder.)

    Meanwhile, AAPL is up 2X over RHT in the last 12 months, and 2.6X over RHT in the last 5 years.

  16. Mr. Bowers:

    His point is that *clearly* AAPL is better than RHT because their stock has gone up more.

  17. @Craig Trader:
    > Perl got Makefile.PL.

    Perl modules are encouraged to move from ExtUtils::MakeMaker, i.e. the Makefile.PL script which generates a Makefile (similar to what the ./configure script does in an autotools-managed installation), to Module::Build, which uses Build.PL, or to a similar solution like Module::Install. Or even to tools like Dist::Zilla, which do more than just build and install the module.

    Sidenote: when installing a module from CPAN using a CPAN client, you don’t have to worry about which method the module uses: it’s handled automatically.

  18. No, my point is that clearly AAPL (the stock) is better than RHT (the stock), because it has increased in value over twice as fast.

    Are you buying a stock or investing in a company? There’s a difference and it’s important that you don’t confuse the two.

  19. > Problem: my boss was Rob Savoye, who was a maintainer of autotools. FAIL.

    … Problem: now you work for a guy who used to work for Symbian, GM, Oracle and Kmart. FAIL.

  20. @Jakub, Ok, it’s been a while since I fiddled with Perl (and frankly, the longer the better). My point was that everyone seems compelled to reinvent (or reimplement) this particular wheel. No one has gotten it “right” yet, but perhaps that’s because the problem continues to evolve. As Eric pointed out, the problem that autotools (and iMake before it) was designed to solve has (mostly) gone away. It may be the case that there will never be the One True Build Tool.

    @Ajay, I don’t think you can really compare SCons and gyp — I think they’re intended to solve different problems. (The documentation on gyp is a little light, but what I gather from the source is that it’s intended to allow you to generate configuration files for other build tools, including SCons).

  21. @esr:

    Yeah, a few years back I was struggling with autotools on some obscure thing or other, and I thought: “Why does it have to be this difficult? This thing is just kludges on top of kludges!” Then I thought, “I could probably hack something better in Python in a weekend!” When I went hunting for existing solutions, I found SCons, which is exactly what I had in mind. So I never bothered writing my own solution.

    Unfortunately, autotools is very, very well entrenched and most projects are reluctant to give up a build system that’s already working.

  22. @Craig Trader:

    Of course there won’t ever be One True Build Tool, just like there will never be One True Language, or One True Operating System, etc.[1] Holy wars will continue to rage on.

    I think there are four kinds of wheels that will continually be reinvented, just because programmers are programmers:

    – languages
    – build tools
    – editors/IDEs
    – version control

    Every new thing is born out of frustration with some tool or tools that came before it. Python was born of frustration with Perl, Perl was born out of frustration with awk/sed/shell, Emacs was born out of frustration with TECO, vi was born of the mind of a sick and twisted psychopath, etc.

  23. Can you describe why you prefer scons to cmake in any more detail than mentioned? It seems like it is relatively straightforward. I wanted to avoid gyp for work projects because it looks too new.

    And the pinnacle of autotools evil is libtool, the tool that does everything wrong.

  24. > And the pinnacle of autotools evil is libtool, the tool that does everything wrong.

    Really? Like what? Libtool is the one component of autotools that I can actually stand using. The pinnacle is automake.

    1. >Can you describe why you prefer scons to cmake in any more details than mentioned?

      Well, the superficial reason is that cmake rules look ugly, verbose, and heavyweight to me compared to scons rules.

      The deeper reason is that scons is written in Python and its extension language is — Python. That’s a big win.
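
      To make the comparison concrete, here is roughly what a small scons build looks like (a minimal sketch with invented file names, not anything from a real project):

          # SConstruct -- minimal sketch; file and target names are invented
          env = Environment(CCFLAGS=['-O2', '-Wall'])

          # One line per artifact: a shared library, and a program that links it.
          env.SharedLibrary('gpsutil', ['util.c', 'geo.c'])
          env.Program('gpsclient', ['main.c'], LIBS=['gpsutil'], LIBPATH=['.'])

      Because the whole file is Python, anything you would reach for m4 or shell to do under autotools is just ordinary code here.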

  25. And …. whatever happened to writing a makefile that always works? Yes, you need some conditional compilation; that’s what small dependencies and build results are for. Dan Bernstein’s qmail runs on lots of different unices, and he only needs one Makefile.

  26. Russell: Have you tried out scons? It really blew me away with how straightforward it is, compared to the equivalent makefile. I love having something that can, in a few lines of code, extract all the C dependencies without making a huge fuss about it and compile a program. Or compile a library on multiple platforms without having to deal with the mess that is libtool. And having the change detection based on MD5 signatures instead of last-modified times is a surprisingly pleasant feature.

    I liked makefiles. But I’m not going back to them in the future.
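
    For anyone curious, the features I mean look something like this in practice (a rough sketch with invented file names; the MD5 decider is actually SCons’ default, spelled out here only to make the point):

        # SConstruct -- sketch of the features described above
        env = Environment()

        # Rebuild decisions come from MD5 content signatures, not timestamps.
        # (This is SCons' default; the call just makes it explicit.)
        env.Decider('MD5')

        # SCons scans the #include lines in the sources itself, so there is
        # no hand-maintained dependency list and no "make depend" step.
        env.Program('hello', ['hello.c'])

        # Building a shared library without going anywhere near libtool.
        env.SharedLibrary('widget', ['widget.c'])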

  27. In my experience, a plain Makefile and optionally a shell script that gathers library-related configuration information via pkg-config is just fine for most projects.

    BTW, even if you look at the latest versions of popular GNU tools, it’s obvious that autotools is just legacy. For example, gzip and tar have an elaborate configure script that gathers a great deal of detailed information, but then both tools depend on a layer of wrapper functions that are written in a very system-specific way, i.e. they tie themselves to a very few specific stdio implementations and their internals, something no sane programmer striving for portability would do (and portability is the whole reason for employing autotools, right?). And you can’t even disable that. Another example of how braindead configure scripts are is the check for whether the specified compiler is able to create executables that actually work. That check totally fails when you try to use a cross compiler. _Of course_ the executable won’t work in the build environment, but why should configure care about that?

  28. Makefiles suck. The Makefile syntax is too antiquated and too limited. These days, most projects have to deal not only with differing C libraries, operating systems and hardware platforms, but also with a plethora of different languages, compilers, virtualization systems, additional cross-platform problems, etc.

    This has led to all kinds of kludges, from calling stuff like awk and sed from make, to autotools, you name it. I’ve even seen some mixed-language programs that do stuff like call a setup.py or use a Makefile.PL to configure differing make systems to compile C code. Just look at the god-awful mess PuTTY is to compile!

    SCons gives you a whole Turing-complete language right in your Sconscript (Makefile equivalent) — but you only need to use it if you really need it. That makes it as good for small projects as for large projects.
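
    A quick sketch of what I mean (the platform test and the define are just illustrative examples, not from any particular project):

        # SConstruct -- ordinary Python where make would need $(shell ...)
        # calls and nested ifeq blocks
        import os
        import sys

        env = Environment()

        # A plain Python conditional instead of make's conditional syntax.
        if sys.platform.startswith('linux'):
            env.Append(CPPDEFINES=['USE_EPOLL'])

        # A plain list comprehension gathers the sources.
        sources = [f for f in os.listdir('.') if f.endswith('.c')]
        env.Program('demo', sources)

    And if you never need any of that, the file can stay a couple of lines long.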

  29. Gyp looks interesting; however, I’m not really sure that gyp -> -> compile is a necessary process. There are some issues in scons for cross-platform work (e.g. its path handling is kind of brain-damaged), but at face value I think I’d prefer scons to gyp.

  30. @JonB: How exactly is SCons’ path handling brain-damaged? I’m guessing you’re referring to its handling of variant source trees. FWIW, I’m not a big fan of variant source trees. I personally think it’s better to be able to compile all variants from a single source tree whenever possible.

  31. I used to work on Conjure, a build system similar to scons, except written in Scheme and using Scheme as its build job description language. Perhaps it is my pro-Lisp bias, but I found Scheme’s syntax to be cleaner and more suitable for this sort of thing; you can express things declaratively in Scheme much more easily than in Python, which is through-and-through an imperative language.

    Sadly, Conjure’s maintainer has abandoned the project. I have thought many times of reviving it and porting it to Guile — perhaps the most commonly installed Scheme — so that ordinary users everywhere could benefit from its power. But like so many other round tuits that is one I just haven’t gotten.

    An advantage that CMake has is that rather than doing the building itself, it generates platform-specific build files — makefiles on Unix, Visual Studio solutions on Windows — thus integrating seamlessly with the build environments on those platforms and not requiring CMake to be installed on a computer which is doing the build.

    1. >Perhaps it is my pro-Lisp bias but I found Scheme’s syntax to be cleaner and more suitable for this sort of thing;

      I believe you. Unfortunately, the odds that a Scheme-based tool can gain acceptance under modern conditions are, well, effectively nil.

      >An advantage that CMake has is that rather than doing the building itself, it generates platform-specific build files

      Oh, *ghods*, nooooo! Not again! It’s the autotools blunder come back to haunt us! I’d forgotten that fundamental problem with cmake. This kind of system makes debugging a broken build procedure hell, because there’s no good way to go from errors in the generated stuff back to the line of the spec responsible.

  32. > How exactly is SCons’ path handling brain-damaged? I’m guessing you’re referring to its handling of variant source trees. FWIW, I’m not a big fan of variant source trees. I personally think it’s better to be able to compile all variants from a single source tree whenever possible.

    Actually, I really like variant dirs. They make cross-compiling fun. Having said that, I might be thinking of a different thing. I’m thinking of the ability to build a certain SConscript multiple times to multiple destinations with different environment variables (e.g. build to build/linux for x86-linux and build/win32 for x86-win32).

    No, my problem is that the underlying tools will only look at env[‘PATH’] to find build tools (specifically C compilers) instead of checking os.environ when env[‘PATH’] is left at the default. This hurts platforms that don’t put things in ‘/bin’.

    Seriously, if I can type cl on the Windows command line and get the Visual C compiler, then I should be able to build an SConscript that builds helloworld.c. (Unless, of course, I’m trying to do something tricky with paths and messed up.)

    (And I don’t think it’s totally brain-damaged; it’s just brain-damaged if you’re trying to support too many platforms and can see the potential for zero reconfiguration of the build script.)
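
    The usual workaround, as far as I can tell, is to hand the host PATH to the construction environment yourself and drive the variant dirs from one SConstruct; a rough sketch with invented paths:

        # SConstruct -- sketch: one source tree, several build directories,
        # with the host's PATH passed through so external compilers
        # (e.g. cl.exe) can be found; all names here are invented
        import os

        base = Environment(ENV={'PATH': os.environ['PATH']})

        for variant in ['linux', 'win32']:
            env = base.Clone()
            # Per-variant toolchain settings (CC, flags, etc.) would go here.
            SConscript('src/SConscript',
                       variant_dir='build/%s' % variant,
                       duplicate=0,
                       exports={'env': env})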

  33. Just don’t use scons for Java compilation. (You wouldn’t use make for that either.)

    The way javac works (or doesn’t work) does not match the scons model. At all.

    But I’m happily using scons to build documents out of docbook markup. If I were doing C/C++ development, I’d use scons.

  34. I spent a week learning autotools to retrofit it to my big C++ project, and still wasn’t competent enough to make it do what I wanted. What is worse, I only need to mess with the build system once every 3 months, by which time I’ve forgotten it all again. I simply don’t have the time to maintain the level of expertise required.

    I tried doing the same thing with scons, and it took me a day from knowing nothing.

  35. Absolutely agree. Autotools sucks. It is fucking slow to modify the build environment, and the script-language magic is incredibly horrible. I hate working with it. CMake and QMake are far better. CMake is my favourite.

  36. I think a big part of the problem is the way folks implement their builds in a monolithic, tightly coupled fashion. Here is an example where I break things up… Java folks use IoC dependency injection; I just bind the rule to a variable and can change that on the fly via the command line. Here is an example:

    http://www.qbalsoftware.com/Site/Blog/Entries/2011/5/9_Makefiles_Generic_Flexible_and_Simple_Part-2.html

    What I’m suggesting is that makefile.targets is the build system, and makefile, makefile.project and makefile.resolve are the model (the inputs to the build system). The model can be implemented in any language, which then calls make passing in all the resolved properties; I just happened to use ‘make’ to define the model. And to deal with all the variants I create a driver script (which could be used to define the model too).

    http://www.qbalsoftware.com/Site/Blog/Entries/2011/5/9_Makefiles_Generic_Flexible_and_Simple_Part-3.html

    I use the same pattern when I need to build a bunch of open source packages, with this driver: http://www.qbalsoftware.com/Site/Blog/Entries/2011/6/10_Automation_Part-2.html

    For example, openssl used an autotools configure and stunnel used an apache configure, but the driver is the same; I just need to set a bunch of properties for each package.

    This is how I try to build most automation. But… lots of tools make this really hard or impossible. For example, I cannot use this pattern with Ant or Maven (both of those technologies are wrong on so many levels).

    So there are two parts to the build system problem: the tools, and the way folks use the tools.

    In regards to autotools: I’ve never used them. While at Sun in the ’90s, if our team used an open source or even another vendor’s source, I’d rip out the build system and put in one that kinda-sorta looks like the one I reference on my blog…

    Google’s gyp project, I think, is on the right track, although I do not like the implementation… I don’t believe JSON or XML should be used as a modeling language. I do think an external DSL that is specialized in creating build models is needed. Keep the model separate from the build tool, which should be another DSL. Once the model is separated from the build tool, all kinds of possibilities open up for what we can use that data for… we’d be in a position to look at the models from all the packages and write tools to provide some analysis…

  37. Chicken, an excellent implementation of Scheme, has a very badass attitude to configuration, as follows: You tell “make” if you are running on Linux, BSD, Mac OS X, Solaris, Haiku, Cygwin, MinGW, or MinGW+MSYS, and then a short platform-specific makefile writes out a little .h file and builds everything else as-is. If you are *not* running one of those, you’re on your own — but the community welcomes patches.

    This is the outcome of many years struggling with autotools, and several more struggling with CMake. (Chicken compiles Scheme to C and is written in Scheme, so it has to bootstrap itself, something CMake doesn’t or didn’t like.)

    By contrast, I did a bad restore from backup the other day, and Guile (which uses autotools, small blame to the maintainers — the FSF insists on it) got confused because I hadn’t restored the file timestamps properly, so the byte-compiled Scheme looked out of date relative to the source. No biggie, I just rebuilt Guile from scratch. Ouch. That took longer than it would have to *re-restore the backup three times over*, and the hard time was struggling with autotools; I had to run ./configure about twelve times before it reported all the missing C-level dependencies (not really missing, just out of date).

  38. I found autotools way too disgusting to start playing with.
    I came across the scconfig project instead, and I suggest you use it. It is an open source configuration utility, and it depends only on a C89 compiler and a very dumb make tool.
    See: http://repo.hu/projects/scconfig/

    SCons = too large, and depends on Python, which is not always available.
