Refuting “The Mathematical Hacker”

Evan Miller’s essay The Mathematical Hacker is earnest, well-intentioned, and deeply wrong. Its errors begin with a serious misrepresentation of my views and my work, and fan out from there.

Miller characterizes my views as “Mathematics is unnecessary [in programming] except in specialized fields such as 3D graphics or scientific computing.” But this is not even a good paraphrase of the quote he cites, which says hackers “won’t usually need trigonometry, calculus or analysis (there are exceptions to this in a handful of specific application areas like 3-D computer graphics).”

What I am actually asserting in that quote is that continuous, as opposed to discrete, mathematics is not generally useful to programmers. I sharpen the point by continuing thus: “Knowing some formal logic and Boolean algebra is good. Some grounding in finite mathematics (including finite-set theory, combinatorics, and graph theory) can be helpful.”

This (which Miller blithely ignores) is very far from asserting that hackers can or should be indifferent to mathematics. Rather, it is me, as an erstwhile mathematician myself, attempting to point aspiring hackers at those domains of mathematics that are most likely to be useful to them. Much of theoretical computer science builds on these; automata theory and algorithmic complexity theory are among the more obvious examples.

Having gone wrong right at the start, Miller swiftly compounds his error: “If you are a systems and network programmer like Raymond, you can do your job just fine without anything more than multiplication and an occasional modulus.” This describes the scope of what I do too narrowly, and even if it were accurate, Miller would be arguing against his own case by taking far too narrow a view of what mathematics a “systems and network programmer” can use.

In fact, it is routine for systems programmers to have to grapple with problems in which graph theory, set theory, combinatorial enumeration, and statistics could be potent tools if the programmer knew them. If Miller knows this, he is being rhetorically dishonest; if he does not know it, he is far too ignorant about what systems programmers do to be making any claims about what they ought to know.

Thus, when Miller asserts that I would agree with the claim “from a workaday perspective, math is essentially useless”, he is ludicrously wrong. What he has done is conflate “math” with a particular kind of mathematics centered in calculus and continuous analysis – and as a (former) mathematician myself, I say this is a form of nonsense up with which I do not intend to put.

Miller perpetrates all these errors in his five opening paragraphs. Sadly, it only gets more dimwitted from there. Consider this gem: “One gets the impression reading Raymond, Graham, and Yegge (all self-styled Lisp hackers) that the ultimate goal of programming is to make a program that is more powerful than whatever program preceded it, usually by adding layers of abstraction.”

This is superficially profound-sounding, but nonsense. I admit to not being familiar with Steve Yegge’s work, but Paul Graham is hardly lost in layers of abstraction; he used his Lisp-programming chops to build a company that he sold for $50 million. One of my best-known projects is GPSD, which gets down in the mud required to do data analysis from navigational sensors and is deployed on millions of embedded platforms; another is GIFLIB, which has been throwing pixels on all the world’s display devices since 1989. Neither of us inhabits any la-la-land of pure computational aesthetics; this is Miller misreading us, wilfully ignoring what we actually code and ship.

From here, Miller wanders off into knocking over a succession of straw men, contrasting a “Lisp culture” that exists only in his imagination with a “Fortran culture” that I conjecture is equally fantastical. Not absolutely everything he says is nonsense, but what is true is not original and what is original is not true.

Miller finishes by saying “Lastly, we need the next generation of aspiring hackers to incorporate mathematics into their program of self-study. We need college students to take classes in physics, engineering, linear algebra, statistics, calculus, and numerical computing…”

This is true, but not for the reason Miller wants us to believe it is true. The bee in his bonnet is that continuous mathematics is generally useful to programmers, but that claim remains largely false and has nothing to do with the real utility of most of these fields. Their real utility is that they require their practitioners to think and to engage with the way reality actually works in ways that softer majors outside science/technology/engineering/mathematics seldom do.

That kind of engagement we could certainly use more of. Arguments as bad as Evan Miller’s are unlikely to get us there.


  1. Eric, you say “What I am actually asserting in that quote is that continuous, as opposed to discrete, mathematics is not generally useful to programmers. I sharpen the point by continuing thus: “Knowing some formal logic and Boolean algebra is good. Some grounding in finite mathematics (including finite-set theory, combinatorics, and graph theory) can be helpful.” ”

    I might be just as dense as Mr. Miller, but I don’t see your phrasing as “sharpening the point”. “Some grounding in [something] can be helpful” and “knowing [something] is good” are not really strong suggestions of “you should study this”; they read more like “it’s nice if you have this knowledge”. I remember thinking something much like Mr. Miller when I read those paragraphs a few years ago.

    Your new phrasing “In fact, it is routine for systems programmers to have to grapple with problems in which graph theory, set theory, combinatorial enumeration, and statistics could be potent tools if the programmer knew them” is much clearer in expressing your view.

    1. >more like “It’s nice if you have this knowledge”.

      Part of the reason for my original phrasing was that, for reasons of history and the way mathematical pedagogy is structured, the question “do I need to know math?” generally reduces when asked by a non-math major to “do I need to know calculus, which scares me”.

      I was trying to be reassuring on that score.

  2. Since you haven’t read Yegge, I’m guessing that Miller is taking issue with posts of this sort by Yegge. It has a brief homage to Lisp, which allows Miller to tie “Lisp culture” to the sort of thinking Miller doesn’t like. But I think it misses Yegge’s main point, which is that new hardware (more cores, not faster), new problems (such as big data), and some pure software problems (intractable code base sizes) require new approaches to make progress in solving them. His suggestion is that the solutions to these problems will require a fair bit of math, so that’s where programmers should spend their time if they want to tackle those problems.

    1. >But I think it misses Yegge’s main point,

      Having read that post, I agree. Yegge is asserting that if von Neumann had lived longer and programming were less of a kluge, mathematics would be more obviously valuable to programmers than it is now. Miller misrepresents Yegge even more perversely than he does me.

  3. I live out in the business and manufacturing end of programming. And have been building things for over 50 years. I use geometry, trigonometry and algebra on a — well, weekly basis anyway. I should probably know more about statistics, although I have learned enough of them to have a pretty good notion of what statistics can and can’t do. The times I have actually used calculus could be counted on the fingers of one hand. Even chemistry has been more directly useful in my life than calculus.

    On the other hand, taking the classes in calculus and linear algebra was hardly wasted. It expanded my thinking, and I don’t think one can really learn algebra without understanding what calculus does — at least analytic geometry.

    For me, discrete math didn’t do much — quite possibly because by the time I took it at 45 or 50, I had already been using the concepts for so long that it seemed quite obvious to me — ‘Well, of course. How else would you work with that?’ sort of deal. I’m sure that a nineteen year old would have had more difficulty.

    Base point, I agree with Eric and Heinlein. Understanding math, and the way of thinking required to do math is essential to be counted as an educated person — if for nothing else the practice in reasoning logically. Studying science enough to understand that there are some things that can be counted as truth, and some things that are speculation. And that the very real fact that the edges are fuzzy and subject to dissent does not make ‘f = ma’ or ‘evolution’ a mere theory that is probably wrong anyway. (And yes, f=ma is only an approximation that gets very approximate at very high velocity or acceleration. Guns work just fine anyway.)

  4. Calculus tends to appear in statistics. Even if the objects you are modeling are discrete, the probabilities are continuous.

    A coin has only two sides, but its heads frequency is a real number, and a person’s belief about that frequency is a probability distribution over real numbers. For that person to calculate the probability of heads on the next flip requires integration (there is a small numerical sketch of this at the end of this comment). The rule of succession is a general treatment of this sort of problem. The generalization to n>2 is useful for parameter estimation in Markov chains on a small state space, e.g. for modeling text or genes. You can use the rule of succession without deriving it yourself, of course, so knowing calculus is not exactly a requirement here.

    Statistical models having many parameters are typically optimized by hill-climbing, which is possible to do via a guess-and-test method, but moving in the direction of the gradient (i.e. the vector of partial derivatives) is faster in proportion to the number of parameters. Logistic classifiers are a simple example.
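    And here is the promised numerical sketch of the rule-of-succession point (toy numbers; crude midpoint-rule integration; the values n=10 and s=7 are just made up for illustration):

        #include <stdio.h>

        /* Predictive probability of heads given s heads in n flips and a
         * uniform prior on the heads frequency p.  The Bayesian answer is
         * the integral of p against the posterior, which the rule of
         * succession collapses to (s + 1) / (n + 2).  Here we grind out
         * the integral numerically and compare. */
        int main(void)
        {
            const int n = 10, s = 7;            /* 7 heads observed in 10 flips */
            const int steps = 1000000;
            double num = 0.0, den = 0.0;

            for (int i = 0; i < steps; i++) {
                double p = (i + 0.5) / steps;   /* midpoint of the i-th subinterval */
                double lik = 1.0;
                for (int k = 0; k < s; k++)     lik *= p;          /* p^s         */
                for (int k = 0; k < n - s; k++) lik *= (1.0 - p);  /* (1-p)^(n-s) */
                num += p * lik;                 /* integrand of E[p | data] */
                den += lik;                     /* normalizing constant     */
            }

            printf("numerical integration: %.6f\n", num / den);
            printf("rule of succession:    %.6f\n", (double)(s + 1) / (n + 2));
            return 0;
        }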

  5. How much of this, though, is the average hacker using things like set theory and combinatorial enumeration without thinking that that’s what he’s doing? How many of us pick up these kinds of things by osmosis and example without knowing that there’s a theoretical basis for it?

    When I sit down to write code, I don’t think about that kind of thing; I just do it. I suspect that the reason you do, Eric, is precisely because of your mathematician roots.

    1. >I suspect that the reason you do, Eric, is precisely because of your mathematician roots.

      Of course you’re right. I see behavioral evidence that many programmers acquire a jackleg knowledge of things like (say) basic combinatorics that is much less systematized than mine. But I’m not sure what the point of asking “How much of this” is.

  6. I obviously agree with this post. Miller appeals to the “real-world relevance” of applied mathematics and numerical computation throughout the essay, but this does not make scientific computing any less of a niche as far as most developers/hackers are concerned. He also rejects computer science research as a way of creating safer and better languages and programming practices, as if applied math and numerics did not benefit from these results (the Fortress programming language, albeit abandoned, is a good proof of concept). As parallel/distributed, multicore and GPGPU computing becomes increasingly relevant in this domain, it’s quite obvious that FORTRAN 77 simply does not cut it any more.

  7. > long int fac(unsigned long int n)

    A function which, let’s be clear, has exactly 21 valid input values on a 64-bit system, and for only 17 of those does his solution avoid losing precision and giving an incorrect answer.
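    To make that concrete, here is a sketch of the comparison. My assumption, going by the essay’s framing, is that the closed-form version runs through the log-gamma function in floating point; exactly where it first disagrees with the exact answer depends on your libm, so treat the counts above as back-of-envelope figures.

        #include <stdio.h>
        #include <math.h>

        /* Exact factorial in 64-bit unsigned arithmetic; overflows past 20!,
         * which is why only 0..20 are valid inputs on a 64-bit system. */
        static unsigned long long fac_exact(unsigned n)
        {
            unsigned long long r = 1;
            for (unsigned i = 2; i <= n; i++)
                r *= i;
            return r;
        }

        /* A closed-form version in the spirit of the one under discussion:
         * n! = gamma(n + 1), computed via lgamma() and exp(). */
        static unsigned long long fac_gamma(unsigned n)
        {
            return (unsigned long long)llround(exp(lgamma(n + 1.0)));
        }

        int main(void)
        {
            for (unsigned n = 0; n <= 20; n++) {
                unsigned long long e = fac_exact(n);
                unsigned long long g = fac_gamma(n);
                printf("%2u! exact=%20llu gamma=%20llu%s\n",
                       n, e, g, e == g ? "" : "  <- disagree");
            }
            return 0;
        }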

  8. Apart from batting .810 (I haven’t checked his Fibonacci function; I suspect it has a longer valid range simply because the result doesn’t grow quite as fast), he also misses the point that people are given factorial and Fibonacci as example problems not because they’re likely to need to calculate either in the real world, but because they’re useful examples for teaching iteration and recursion.

  9. His point about math, and statistics in particular, revolutionizing industries is well taken. But by talking about “lisp culture” and “fortran culture” he’s converted an innocuous post into flamewar material. I know plenty of Lisp programmers (myself included) who are quite into the kind of math he talks about.

    1. >But Eric, you are not a mathematician, and nor were you ever.

      I was invited to present a paper describing my original mathematical research at the AMS national conference in 1975, and did so. It was on a closed form for the Nth term of linear difference equations.

      It is true that I wasn’t a mathematician for very long.

  10. I’m not sure if I’m the only person to have read “IBM’s Early Computers” (http://amzn.to/UMzzYb), but what I did get from it is that the guys who invented and used FORTRAN had a very pragmatic attitude. FORTRAN was the first high level language, invented for the IBM 704 mainframe in the mid-1950s, and the reason they made it had nothing to do with mathematical zen. They made it because it was a major pain in the arse to translate an engineering algorithm directly into assembly language, and they wanted to come up with an easier way of doing so. To this end the debuggers are probably much more significant.

    I think Evan Miller is reading too much into FORTRAN’s origin as a language used primarily to solve math-heavy engineering problems. Theoretically, the engineer and the programmer could be different people, the engineer coming up with the algorithm he’d otherwise have to use on his slide rule and note pad, and the programmer turning it into FORTRAN code; the latter need know only enough about mathematics (at least, of the math specific to the problem, setting the foibles of 1950s mainframes aside) to read the equations, and the rest of his knowledge is about programming (and probably, operating) the computer. This probably happened fairly often in some fields because those fields (things like aerospace where you have mechanical physics, radio, aerodynamics, and human factors melding together) are complex enough to employ large rooms full of people in different disciplines; the guys running the computers need to be whizbangs at running the computer, not necessarily mathematicians.

    “From here, Miller wanders off into knocking over a succession of straw men, contrasting a ‘Lisp culture’ that exists only in his imagination with a ‘Fortran culture’ that I conjecture is equally fantastical.”

    I know enough about the real Fortran culture that I’m quite certain the one that exists in his imagination exists _only_ in his imagination.

  11. I ask “how much of this” as a way of possibly getting to the bottom of the disagreement. Are people doing mathematical stuff and not realizing it? If so, then perhaps the right answer is to tell them what they’re already doing and how to do it better.

  12. I think (in general agreement, it seems, with several other commenters) that your “there are exceptions to this in a handful of specific application areas like 3-D computer graphics” could appropriately be updated to “[comma] statistics, and machine learning.” And continuous math can also come in handy in optimization problems even if they are discrete. Those things have grown in importance in the last few years, and seem likely to continue to grow. And once you add the long tail of many other math-y niche problems (control theory, signal processing, implementing X-ray tomography, supporting people doing simulation of physical problems involving PDEs; also quantitative finance if you want to distinguish that from statistics and PDEs), arguably “handful” should be upgraded to something more generous like “some.” I don’t know how to characterize it quantitatively, but from anecdotal knowledge of programmers with strong math backgrounds (e.g., lapsed physicists) it seems pretty routine to find work where a continuous math background makes one significantly more valuable.

  13. > Similarly, if it is ever necessary to compute a factorial, a programmer should be taught

    He’s seriously missing the forest for the trees with his examples. The programmer is not being taught how to compute a factorial (or Fibonacci numbers), but how to do recursion, which is an important tool in all sorts of real world problems. It’s just an easy to understand illustration of the concept.

  14. Fortran wasn’t a particularly good language for math heavy engineering problems. It was just that a lot of code was initially written in Fortran to do some number wrestling and the usual issue of installed code base ensued.

  15. I must come to his defense, though, about factorials and Fibonacci numbers: he’s correct in that there’s a better way to do those. The lesson isn’t that the examples are narrow; the lesson is that there may well be a better way to accomplish the same end that does not require recursion or even looping, and it’s worth looking to see if that can be accomplished.

  16. I should say that there may be a better way to accomplish the same end in any programming problem that seems to need recursion or iteration, and looking for it might be productive.
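    For instance (a sketch only, and the function names are mine): Binet’s closed form computes a Fibonacci number with no recursion and no loop, though it is only trustworthy while the answer fits comfortably in a double’s 53-bit mantissa, which gives out somewhere in the seventies, or a bit earlier depending on the libm.

        #include <stdio.h>
        #include <math.h>

        /* Binet's closed form: fib(n) = round(phi^n / sqrt(5)).
         * No recursion, no loop -- but floating point, so exact only while
         * the result fits in a double's mantissa (and pow()'s rounding can
         * bite a little before that). */
        static unsigned long long fib_closed(unsigned n)
        {
            const double phi = (1.0 + sqrt(5.0)) / 2.0;
            return (unsigned long long)llround(pow(phi, n) / sqrt(5.0));
        }

        /* Plain iterative version for comparison; exact up to fib(93) in
         * 64-bit unsigned arithmetic. */
        static unsigned long long fib_iter(unsigned n)
        {
            unsigned long long a = 0, b = 1;
            while (n--) {
                unsigned long long t = a + b;
                a = b;
                b = t;
            }
            return a;
        }

        int main(void)
        {
            for (unsigned n = 0; n <= 90; n += 10)
                printf("fib(%2u): closed=%llu iter=%llu\n",
                       n, fib_closed(n), fib_iter(n));
            return 0;
        }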

  17. As someone who studied digital signal processing, I think the line between continuous and discrete math is very fuzzy in certain areas. The entire way we study and process discrete signals is based on transforming them into continuous mathematical objects (i.e. z transforms). This involves complex analysis and calculus, and allows us to treat digital sampling as just another operation in continuous space. The entire field of numerical approximation revolves around reproducing continuous math as reliably as possible. The implementation of that theory might end up being fully discrete, but you simply cannot understand it without grokking the ‘real’ stuff underneath.

    I recently saw the funny quote that “eventually the skillsets required for front-end programming converge on the same skills as game development”. I read a lot of game development papers, and whether it is in AI, graphics, animation or sound, continuous math always makes an appearance sooner than later.

    In my experience, there is indeed a certain resistance to that sort of math in the hacker world. I also had trouble reconciling the digital world of coding with the symbolic world on the blackboard (often preferring the former). Languages change how you think, and when your main interface to math is code, you will naturally think in discrete ways even when it’s not appropriate.

  18. @Jim Hurlburt
    > I’m sure that a nineteen year old would have had more difficulty.
    Funny you should mention that. I just got through my discrete math course (final exam was Wednesday of last week) at the age of you-know-what. A couple of concepts, like the use of types of relations besides functions, were articulated to me in new ways. But for the most part, my reaction was pretty similar to yours – kind of fun, not very deep, and mostly consisting of learning wacky new symbols for stuff that’s obvious. I’ll happily concede that you need to know discrete math in order to be a programmer, but I’m not convinced that I didn’t know discrete math from the cradle. But hey, at least I know what backwards Es and upside-down As mean!

  19. > Are people doing mathematical stuff and not realizing it? If so, then perhaps the right answer is to tell them what they’re already doing and how to do it better.

    This applies to every problem ever. For example, the earliest CAT scanners used a kludge to reconstruct a 3d image from slices, before a mathematician came along and pointed out the Radon Transform.

    The only exception seems to be trivial problems where we understand the theory so well intuitively that understanding relevant formal theory doesn’t help much. As an informative example compare the Peano Axioms to the way most people count fruit when shopping.

  20. “What he has done is conflate “math” with a particular kind of mathematics centered in calculus and continuous analysis.”

    In other words, anything developed after the year 1800 isn’t “math” to him. This is the way mainstream America sees math, but as ESR says, it bears no relation to reality.

    I’m not sure where the pop culture idea came from that calculus, which is 300 years old, represents the pinnacle of mathematical complexity. It’s probably an artifact of our failure to teach even the existence of other types in high school. (It amazes me that we don’t teach basic stats in high school; it would be more useful to most people than calculus ever could be.)

    How many non-mathematician, non-software types have even heard of “finite-set theory, combinatorics, and graph theory”? We really need to remedy this at the high-school level — not necessarily teach these subjects in any detail, but explain that they exist and how useful they can be in a computerized world.

  21. “Do I need math for programming?” seems like a wrong question.

    If the person asking wants to try programming and has no math background, the answer is clearly “No, fool around with code and learn math as you need it/as you become curious about it.” Perhaps with a pointed note that it is normal and good to want to look under the hood.

    There’s also the issue of what distinguishes “mathematics for programming” from “programming for mathematics.” I think of numerical analysis more as the latter, for instance. If what you’re doing *is* math which happens to be implemented computationally (mathematical modeling, machine learning, etc.) then that’s not “math you need in order to be a programmer.” The examples he’s talking about (quality control in manufacturing; ANOVA in pharmaceuticals; optimization in logistics) are examples of applied math being valuable, and have nothing to do with the mathematics behind computation itself, which is, of course, discrete. I’m an applied mathematician myself, and an analyst at that; I’ll be the first shouting from the barricades that algorithms are eating the world. But that’s not hacking and it’s not even really computer science (though I’m happy to take my office in the CS building) — it’s just math you happen to do on a computer.

    I do sometimes see a bit of an anti-intellectual vein in web startup culture. Comments to the effect that nobody really needs to learn, e.g. sorting algorithms. I don’t like that attitude, but hey, if you can make a living without understanding what’s really going on, more power to ya, live and let live.

    1. > I’m an applied mathematician myself, and an analyst at that; I’ll be the first shouting from the barricades that algorithms are eating the world. But that’s not hacking and it’s not even really computer science (though I’m happy to take my office in the CS building) — it’s just math you happen to do on a computer.

      Yes, exactly. I find it difficult to imagine that anyone who is really literate in mathematics could fail to make this distinction. Which is why Miller’s article seems to me not only wrong but curiously hollow – for all his perfervid mathematicalism, he doesn’t really think about mathematics the way a mathematician does.

  22. I have multiple thoughts on this, but have just one ready to post right now.

    Using fancy math and computers together is actually a distinct skill in itself. A person can be very good at math and very good at conventional programming, and yet know next to nothing about doing math with computers.

    When using integers, or rational numbers with a bignum library (GMP), knowing computers and math separately is enough. But when dealing with approximate input, or irrationals, you have to deal with the computer’s limited precision. Just figuring out an error bound is advanced stuff (there is a small illustration of the problem at the end of this comment).

    It doesn’t help that CPUs aren’t all equal when it comes to floating point operations. Most only promise “correct or one unit off” on the difficult stuff, and some are worse. Pedantically, a numeric program intended for “any IA-32 machine in general” should be prepared to cope with some really pathetic 387 clones from the dustbin of history. I doubt many are….

    That said, expertise at ordinary continuous math can still be a help. Given an initial naive problem definition that would involve juggling irrational numbers, a math expert may be able to simplify it on paper so that most of the work is done in the integer domain where computers are absolutely reliable. But it’s not always possible, so “numerics” experts have a stable niche.
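    Here is that illustration (just a sketch; real numerics work needs an actual error analysis): a naive running sum quietly drops low-order bits, while Kahan’s compensated summation carries them along and folds them back in.

        #include <stddef.h>
        #include <stdio.h>

        /* Straightforward accumulation: each addition rounds, and the
         * rounding errors pile up. */
        static double sum_naive(const double *x, size_t n)
        {
            double s = 0.0;
            for (size_t i = 0; i < n; i++)
                s += x[i];
            return s;
        }

        /* Kahan's compensated summation: c tracks the low-order bits lost
         * by each addition and feeds them back in on the next one. */
        static double sum_kahan(const double *x, size_t n)
        {
            double s = 0.0, c = 0.0;
            for (size_t i = 0; i < n; i++) {
                double y = x[i] - c;
                double t = s + y;
                c = (t - s) - y;
                s = t;
            }
            return s;
        }

        int main(void)
        {
            enum { N = 10000000 };
            static double x[N];
            for (size_t i = 0; i < N; i++)
                x[i] = 0.1;               /* 0.1 is not exactly representable */
            printf("naive: %.10f\n", sum_naive(x, N));
            printf("kahan: %.10f\n", sum_kahan(x, N));
            return 0;
        }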

  23. “I do sometimes see a bit of an anti-intellectual vein in web startup culture. Comments to the effect that nobody really needs to learn, e.g. sorting algorithms…”

    It’s a generational/times change thing. Once we coded in assembler, later in FORTRAN and C. If you were a real programmer you always found some excuse…er, I mean need to write your own sorting routine. Nowadays, that stuff has become low level…just use the damn library and stop wasting time…

  24. I think a lot of the confusion is caused by not separating the math needed to solve the problem at hand (e.g., CAT scan slices, signal processing, routing, automatic translation) from the act of writing computer programs.

  25. > it’s just math you happen to do on a computer.

    One problem seems to be that most people do not recognize formal languages, automata, and computability as Mathematical subjects. School mathematics seems to be to blame:

    A Mathematician’s Lament
    http://www.maa.org/devlin/lockhartslament.pdf

    About Fortran not being “mathematics” like Lisp. Fortran is just an excuse to use LAPACK. FORTRAN = FORMULA TRANSLATOR. It is just computer readable formulas. (you see that I hate FORTRAN with a vengeance)

    Whenever this topic comes up, I just respond that “programming is herding bits”. And if herding logical values is not mathematics, what is?

  26. > Fortran wasn’t a particularly good language for math heavy engineering problems. It was just that a lot of code was initially written in Fortran to do some number wrestling and the usual issue of installed code base ensued.

    @SPQR: also, Fortran being a simple language, it got heavily optimized compilers (proprietary vendor compilers) first.

  27. “– for all his perfervid mathematicalism, he doesn’t really think about mathematics the way a mathematician does.”

    I can’t understand how he misses this. Maybe we should email him that G. H. Hardy quote about how proud he was that he never did anything useful.

    I was a physics major, yet even from that distance I can see that Lisp is a work of genius, and Mathematics is the Emperor of the Sciences. Miller needs to learn more respect.

  28. >We really need to remedy this at the high-school level — not necessarily teach these subjects in any detail, but explain that they exist and how useful they can be in a computerized world.

    That is what the new Study Edition of The Mathematical Experience is for. I haven’t read this version, but the original one, without the study questions is a very good survey.

  29. NB: the functional programming that Lisp is based on is the pure math. It is Church’s lambda calculus (as in the Church-Turing thesis), which is a formal system in mathematical logic.

  30. LS on Saturday, December 15 2012 at 1:29 am said:
    > “I do sometimes see a bit of an anti-intellectual vein in web startup culture. Comments to the effect that nobody really needs to learn, e.g. sorting algorithms…”
    >
    > It’s a generational/times change thing. Once we coded in assembler, later in FORTRAN and C. If you were a real programmer you always found some excuse…er, I mean need to write your own sorting routine. Nowadays, that stuff has become low level…just use the damn library and stop wasting time…

    Well yes – almost all the time the standard library (for sort or whatever) is just fine. But I do think (and have evidence to back it up in some of the scaling issues my company has hit and resolved) that if you understand the math and the algorithm you’ll have a better shot at getting your app to scale properly. You’ll understand what bits can (and should) be parallelized, and what order to do things in (for example, should you sort lists and then merge them, or should you jam them together and sort the final result), and these choices can make a huge difference to whether you go titsup.com all the time when you get thousands of requests or not.

    It is incredibly easy to write a naive program to do something that is, say, O(N^3) or even O(2^N) in a key variable when a bit of thought and rearranging can make it O(N log N) or better. If you don’t have the math you can’t do this kind of thing. Nor can you work around the reverse situation, where your efficient algorithm is clobbered by a horribly written standard library function that you have to call repeatedly, or where the way you call the library function makes it worse. For example, calling a library function once with an array of data is generally more efficient than calling it item by item (but not always).
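    To make the sort-then-merge example concrete, here is a minimal sketch (the names are mine): merging two already-sorted arrays is a single O(n) pass, while jamming them together and re-sorting costs O(n log n), and inside a hot path that difference compounds.

        #include <stdlib.h>
        #include <string.h>

        /* One-pass merge of two already-sorted arrays into out[].  O(n). */
        static void merge_sorted(const int *a, size_t na,
                                 const int *b, size_t nb, int *out)
        {
            size_t i = 0, j = 0, k = 0;
            while (i < na && j < nb)
                out[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];
            while (i < na) out[k++] = a[i++];
            while (j < nb) out[k++] = b[j++];
        }

        static int cmp_int(const void *x, const void *y)
        {
            return (*(const int *)x > *(const int *)y) -
                   (*(const int *)x < *(const int *)y);
        }

        /* The lazier alternative: concatenate and re-sort.  O(n log n). */
        static void concat_and_sort(const int *a, size_t na,
                                    const int *b, size_t nb, int *out)
        {
            memcpy(out, a, na * sizeof *a);
            memcpy(out + na, b, nb * sizeof *b);
            qsort(out, na + nb, sizeof *out, cmp_int);
        }

        int main(void)
        {
            int a[] = {1, 4, 9}, b[] = {2, 3, 10, 11}, out[7];
            merge_sorted(a, 3, b, 4, out);     /* {1,2,3,4,9,10,11}      */
            concat_and_sort(a, 3, b, 4, out);  /* same result, more work */
            return 0;
        }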

  31. @john
    > I know just enough statistics to be wrong almost all the time.
    So I guess that you often hit the wall when you are trying to walk through a door? Otherwise I can hardly believe that you are wrong almost all the time.

  32. Evan Miller > Yet these sorts of mathematical “hacks” are rarely mentioned by our great hacker essayists. The reason, I think, is that “hacker literature” tends to be dominated by Lisp programmers, and Lisp programmers tend to be ignorant of applied mathematics.

    Can someone please point Miller to Gerald Sussman’s Structure and Interpretation of Classical Mechanics? Written by a hacker-demigod and coauthor of the Wizard Book, it dives deeply into mathematics. And not just continuous mathematics, mind you, but continuous mathematics squared and metafied up the wazoo. (How so? Briefly, physical bodies move through force fields on continuous paths, which nature selects from a continuum of other continuous paths. To predict the path nature picks as a function of the force field, you need to really get down and dirty with continuous math. Even pure mathematicians would be hard-pressed to get farther down and deeper into the dirt than Jay Sussman and Jack Wisdom did with their Scheme programs.) The claim that Lisp hackers can’t or won’t deal with this strikes me as absurd.

    To me, not being a hacker myself but thinking of hackers as a tribe of friendly neighbors and cousins, this book was a revelation when I first read it many years after graduating. If there are hackers in this thread who worked through it, too, I’d be curious to learn what their experience with it has been. Has it expanded your hacker minds the way it expanded my physicist mind? Either way, Evan Miller missed out on a treat, that’s for sure.

  33. > Fortran wasn’t a particularly good language for math heavy engineering problems. It was just that a lot of code was initially written in Fortran to do some number wrestling and the usual issue of installed code base ensued.

    No, you’re wrong. Fortran has historically been the best language for CPU-intensive numerical computing — much, much better than C. The reason is simple: aliasing issues. C explicitly allows you to treat the same location in memory as an integer, floating-point number, pointer, etc. This prevents the compiler from performing optimizations that it would have been able to perform had it known the type of the memory location with absolute certainty. Fortran provides that certainty. That’s why it was still used for engineering applications even when much “better” languages came along.

    Of course nowadays, if you are starting a new application, the tool to reach for would probably be C++, recent versions of which are much more type-safe and don’t have nearly the same aliasing issues that C is prone to. High-performance computing APIs like OpenCL are also designed to work with C++.

  34. Ah, I see that Jeff Read has not lost his ability to confuse his subjective opinions with absolute universal truths …

  35. @Jeff

    I don’t think that’s quite right. The “problem” in C is that whenever a function from another object-file is called, or a pointer dereferenced, the calling function must act as if every global variable, plus every local variable to which an address has been taken, was volatile.

    Where type punning comes into it is that, since type punning is mostly illegal in standard C, C compilers partially fixed this by declaring that optimizations which assume it never occurs are OK. But this only reduces the number of pessimistic assumptions the compiler has to make, it doesn’t eliminate them. For example, because “char” is a special case where limited type punning is allowed (so you can memcpy() any structure), any string manipulation will deter “Fortran-style” optimization.
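    A minimal sketch of what is at stake (C99’s restrict qualifier is the nearest C analogue of Fortran’s rule that dummy arguments never alias; the function names are mine):

        /* Without restrict, the compiler must assume dst and src might
         * overlap, so it cannot freely reorder or vectorize the loop. */
        void scale_maybe_aliased(double *dst, const double *src,
                                 double k, int n)
        {
            for (int i = 0; i < n; i++)
                dst[i] = k * src[i];
        }

        /* With restrict, the programmer promises there is no aliasing --
         * the guarantee Fortran gives for free -- which is what lets the
         * optimizer keep values in registers and vectorize. */
        void scale_restrict(double *restrict dst, const double *restrict src,
                            double k, int n)
        {
            for (int i = 0; i < n; i++)
                dst[i] = k * src[i];
        }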

  36. In a pragmatic sense, mathematics is a very large and evolving toolbox, and its usefulness in problem-solving has been demonstrated for several thousand years now. Computer science has been with us for less than a century, and in many respects is still in the infancy of its evolution. These two fields of science are symbiotic, and it’s difficult to predict what the future may bring. Newton revolutionized mathematics. Who’s to say that some future hacker may not achieve a similar feat?

  37. @TomA: Bravo!

    There should be no arguments concerning Lisp vs. Fortran formulations of Computer Science. It’s like debating wave vs. matrix formulations of Quantum Mechanics. Sooner or later, some Digital Dirac is going to come along and give us the general theory.

    It does not matter that the Lisp view is recursive, while Fortran’s is iterative; the results are the same. It has been pointed out that both views are imperative, while Mathematics is declarative. We have declarative languages now. Perhaps that is the future of Computer Science.

  38. >Mathematics is the Emperor of the Sciences

    That reminds me of:

    Biology is really chemistry.
    Chemistry is really physics.
    Physics is really mathematics.

  39. This hardly seems like an argument worth having. The kind of math you need depends on what kinds of problems you are trying to solve.

  40. Save your breath, Monster, and just say it with a link.
    http://xkcd.com/435/
    ;-)

    Re: the OP, I’m yet another programmer keenly aware of the discrete / continuous distinction. Fresh out of college, I turned to astrodynamics. Great continuous fun for the whole nerdy family. Within a few years, I found myself associated with a different project involving deductive databases and knowledge representation, and while it thrust me into an element I hadn’t been as vigorously equipped for, it had obviously wider application. (Planning how to burn fuel to alter your orbit, versus integrating two sources of data at a semantic level. Which occurs more often in nature?)

    As much as I enjoy continuous math, discrete currently rules the roost.

  41. No mathematics can help to sort an array in constant or linear time on a von Neumann architecture. No polynomial approximation or other algorithm can do that. However, a very impressive analog algorithm by A. K. Dewdney, spaghetti sort, takes constant time O(1) to sort an array, and linear time O(n) to prepare its elements. Perhaps quantum computers will allow us to do that, so programmers will need continuous mathematics, including quantum, as well.

  42. I am by no means a hacker, but both in programming school and business school it was drilled into me that math is the essence of both; yet during 10 years of business programming I’ve yet to see any report more complicated than a percentage calculation. When I am bored I offer regression or correlation calculations to managers, and they either don’t need them or have no idea what I am talking about.

    My point: teenagers who suck at math should not be discouraged from learning programming. Maybe they won’t become hackers, but the “major in something, minor in programming, work at automating that something” life path which is increasingly often recommended (by me, too, it works great) is often doable with little math.

    For example in civil (construction) engineering you need plenty of math to actually do the structural calculations for a building, but for developing the software that does the budgeting for a project and collects the actual work-hours and materials used and provides nice reports of budget vs. reality comparisons, you need little math. Similarly a mechanical engineer turned programmer can focus on the BOMs & MRP part of the manufacturing software.

  43. @link0ff: The mathematics of computer science tells us that a sequential sorting algorithm using comparisons has a minimum time complexity of O(N log N).

    Radix sort is O(kN), but it is a non-comparative integer sorting algorithm.
    Bitonic mergesort is O(log^2 n), but it is a parallel sorting algorithm (like spaghetti sort) and requires O(n log^2 n) comparators.

    Also, IMHO the O(1) claim for spaghetti sort is suspect…

  44. The description of spaghetti sort is intentionally whimsical. That Wikipedia entry would do well to include text describing the inherent fallacy of the sort. Since the sort is done by bringing your hand (let’s assume your hand is perfectly flat) down on a bunch of spaghetti rods, removing the first you touch, and repeating until order is known, it’s the computer analog of a parallel algorithm, one processor for each rod, all moving synchronously somehow (the processors = your hand). Who has a billion processors that automatically move in perfect sync?

    Of course, you wouldn’t know that without (discrete) math. Or at the very least, logic, which no one is proposing hackers not know.

  45. I’m programming for bankers, & I rather agree with Shenpen: I’ve done 3 (yes, three) complicated algorithms in 12 years, and didn’t go mathematically further than division & remainder (fizzbuzz level).

    Yet.

    Yet I’ve been trained as an engineer in plastics before switching jobs. I’ve been taught complex maths of many kinds, and industrial values. Though none of those has been directly used, their indirect use is, I think, huge.

    Maths are useful because they train your mind, help you to see patterns, and train your capacity to abstract.

    Industrial values I’ve found useful a thousand times. I’ve been trained to “quickly change tools” (in the plastics domain, it means changing a mould in 4 minutes instead of 90). Being able to change parts of my code quickly is an obsession that has saved my ass numerous times. I’ve also been trained in “value engineering” as a reflex. Thus, I optimize only when needed. It has become unconscious, but I avoid many tricks supposed to optimize things that are just irrelevant (optimizing a 90-second algorithm when you first have to read the database for 5 hours is not value added). My computer-trained colleagues tend to be scandalized, but hell: it’s readable, it works, and the execution time is negligible in the context. Plus it’s been done in an acceptable (from the manager’s point of view) amount of time.

    My bottom line is: intellectual training, though not mandatory for hacking, is a welcome bonus. It trains and opens the mind, and might, on occasion, instill good habits. I’ve seen very good self-taught programmers (with no diploma at all, unlike my own unrelated diploma); the best of them is impressive, but IMHO they lack the ability to see & correct some of their bad habits.

  46. I am reminded that, while working on GPSD, esr didn’t know what CRC was, or how to calculate it.

    http://old.nabble.com/Anyone-have-code-for-CRC-24Q–td18200466.html

    I also note that while writing, “One of my best-known projects is GPSD, which gets down in the mud required to do data analysis from navigational sensors and is deployed on millions of embedded platforms”, esr carefully constructs a path clear of actually claiming credit for those bits in gpsd, while simultaneously appearing to claim credit for those same math-y bits in gpsd. Good word-smithing apparently doesn’t need a math degree.

    I also take issue with your “all you need is discrete mathematics”. Basic calculus comes up in machine learning (e.g. deriving maximum likelihood estimators or evaluating Bayesian integrals) and signal processing. Linear algebra comes up more often than you’d think. Differential equations play into physical simulations pretty heavily. Partial derivatives are there for perceptrons/gradient descent. (http://www.cedar.buffalo.edu/~srihari/CSE555/Chap5.Part2.pdf)

    p.s. Steve Yegge is esr “done right” in terms of an actual hacker representing the larger hacker community.

    1. >I am reminded that, while working on GPSD, esr didn’t know what CRC was, or how to calculate it.

      Ah, I see the reality-distortion field has kicked in. In fact, I probably wrote my first CRC implementation around 1981. On a TRS-80, of all things.

  47. Eric, I think you misread Evan, in much the same way as he misread you. Further, I think your response, correct though it may be in the details, only reinforces his point in the large.

    Your response argues that you do in fact think that certain branches of math are useful for hacking. I read Evan’s essay as trying to point out that such discussions are all we usually have, and are beside the point. His point is that math is for solving problems in the real world, and that computers are useful tools for doing math. You don’t learn math to solve computer problems. You learn math to solve problems with computers.

    In that sense, continuous math is very important. Without it, there are very relevant real-world problems you cannot solve, with or without a computer.

    His key paragraph is the third from the end.

  48. I find the point about Paul Graham making better things by putting a layer of abstraction on top of the previous thing mildly humourous since he is most memorable to me for Arc, a language specifically designed to cull as many abstractions as it can and still be a useable language.

  49. > I probably wrote my first CRC implementation around 1981.

    “probably” More mushy claims, Eric?

    It’s also quite likely that you just used the support in the SIO. xmodem much?

    1. >“probably” More mushy claims, Eric?

      I might have done one earlier. That was a long time ago and almost none of my code from my pre-Unix days has survived.

  50. @LeRoy:
    >I am reminded that, while working on GPSD, esr didn’t know what CRC was, or how to calculate it.

    I don’t know how you got that out of the link you gave. There he said:

    >Does anyone have an implementation of CRC-24Q, the Qualcomm 3-byte cyclic redundancy check, handy?

    Which, at most, says that he didn’t know the details of a specific CRC algorithm.

  51. > Which, at most, says that he didn’t know the details of a specific CRC algorithm.

    He wasn’t interested in learning, either.

    Quoting further on in the thread:
    > “Maybe, but I don’t know them. Care to write some code? ”

    Perhaps you will ask Eric where the code in-question came from.

  52. > I might have done one earlier.

    “might”.

    > That was a long time ago and almost none of my code from my pre-Unix days has survived.

    convenient, this.

  53. “>Does anyone have an implementation of CRC-24Q, the Qualcomm 3-byte cyclic redundancy check, handy?”

    @LeRoy: That’s the sound of an intelligent programmer trying to avoid reinventing the wheel.

  54. > That’s the sound of an intelligent programmer trying to avoid reinventing the wheel.

    In point of fact, he did get someone to send him a pointer to the code. No credit was given. Typical.

    1. >No credit was given. Typical.

      The hackers among my regulars will already understand this, but I’ll lay it out for the benefit of others. When you receive unattributed generic code from a third party, hacker custom does not require you to attribute the person who passed you the code. That would actually be the wrong thing to do, as it might create a false impression about the authorship. It is unclear to me whether LeRoy is merely ignorant of custom, or knows better and is flinging feces anyway because he’s compulsive that way.

      Because I did in fact understand CRC theory (it’s related to some kinds of abstract algebra I find quite tasty), it bothered me that the general machinery of the CRC algorithm wasn’t quite as cleanly separated from the specifics of the seed and generator polynomial as it could have been. I ended up rewriting this code pretty heavily to clean it up; that’s why it has a GPSD copyright on it.
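      For the curious, this is roughly the shape of the separation I mean: the bit-shuffling machinery takes the seed as a parameter and the generator polynomial as a named constant, and neither is baked into the loop. (A generic bitwise sketch, not the GPSD code; 0x1864CFB is what I recall as the CRC-24Q generator, so check the spec before trusting it.)

          #include <stddef.h>
          #include <stdint.h>

          /* Generator polynomial, kept apart from the machinery.
           * Believed to be the CRC-24Q polynomial; verify against the spec. */
          #define CRC24_POLY 0x1864CFBu

          static uint32_t crc24(const unsigned char *buf, size_t len, uint32_t seed)
          {
              uint32_t crc = seed;

              for (size_t i = 0; i < len; i++) {
                  crc ^= (uint32_t)buf[i] << 16;      /* feed next byte, MSB first */
                  for (int bit = 0; bit < 8; bit++) {
                      crc <<= 1;
                      if (crc & 0x1000000u)           /* bit 24 shifted out? */
                          crc ^= CRC24_POLY;
                  }
              }
              return crc & 0xFFFFFFu;                 /* keep the 24-bit remainder */
          }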

  55. > “The hackers among my regulars
    (if there are any)
    > “will already understand this, but I’ll lay it out for the benefit of others. When you receive unattributed generic code from a third party, hacker custom does not require you to attribute the person who passed you the code. That would actually be the wrong thing to do, as it might create a false impression about the authorship.”

    Obfuscate much?

    // This code “stolen” from Sven Reifegerste (zorci@gmx.de).
    // Found at http://rcswww.urz.tu-dresden.de/~sr21/crctester.c
    // from link at http://rcswww.urz.tu-dresden.de/~sr21/crc.html

  56. @el-slapper: math as mind-training: possibly yes, but I think my mind was trained more by some of the more rigorous kinds of analytical philosophy cum linguistics, a kind of verbal math with words, not numbers. This stuff, for example (http://en.wikipedia.org/wiki/Definite_description), is fairly close to object-oriented programming paradigms.

    But in hindsight, what would have helped me the most in my education is learning simplicity. Instead of being overwhelmed both by complicated business concepts and programming concepts, someone should have explained to me that it is really about finding someone doing a repetitive job, defining the rules of the repetition, turning the rules into conditions and the repetition into a loop, done. Learning that such simple approaches are more effective than the latest bullshit-bingo of business process reengineering industrial best practices blahblahblah.

    I remember for example having to memorize exact definitions of normalization like BCNF and 4NF when instead someone should just have simply said “just move each thingie that is a thingie enough to deserve its own name or number into its own table”.

    Really, unlearning complex and undigested definitions and learning the really simple and useful practices behind them was the hardest part of post-educational job experience.

  57. FORTRAN is the language of meteorology & will stay in use as long as weather & climate models as we know them exist. I’m not surprised that this didn’t come up in your discussion because meteorologists who specialize in modeling learn FORTRAN & the necessary scripting languages as part of the process of earning their degrees. You really don’t see the ‘real programmers’ getting hired to work with meteorological models anymore since C & other languages came to dominate the computer science side of things. I don’t see any other language that can match the ability of FORTRAN to compute the linear approximations of the non-linear physics of the atmosphere & ocean (plus the parameterizations of atmospheric processes) that make up our models.

  58. @ badgerwx – “FORTRAN is the language of meteorology”

    A lot of the new programming is being done in Python.

  59. Tom,
    I know there is a push (mostly in the research community, I think) to use interpreted languages like Python instead of compiled languages like FORTRAN, but I’m not sure that FORTRAN will ever be replaced – because the operational weather models (running real-time with rigid deadlines & covering the globe) need that faster execution you get with a compiled language. Since the modelers are pushing the limits of supercomputer hardware with higher resolution (horizontal & vertical), multi-model ensembles & coupled ocean-atmosphere-land models, I don’t think they can give up that bit of extra efficiency they get from FORTRAN – certainly not for the model cores that run the physics & dynamics of the atmosphere. The Python examples I’ve heard about so far are still calling FORTRAN modules to do the actual scientific calculations.

  60. NumPy/SciPy are scripting engines for LAPACK — written in Fortran.

    So, by the way, is MATLAB, which still sees much more use in the science and engineering community than NumPy or SciPy.

  61. I am reminded of the first answer to this Quora question, “What is it like to have an understanding of very advanced mathematics?” (Annoyingly, the site wants readers to sign in, but the answer I’m talking about can be read without doing this.)
    http://www.quora.com/Mathematics/What-is-it-like-to-have-an-understanding-of-very-advanced-mathematics

    Putting the bolded text together:

    “You can answer many seemingly difficult questions quickly.
    You are often confident that something is true long before you have an airtight proof for it.
    You are comfortable with feeling like you have no deep understanding of the problem you are studying.
    Your intuitive thinking about a problem is productive and usefully structured, wasting little time on being aimlessly puzzled.
    When trying to understand a new thing, you automatically focus on very simple examples that are easy to think about, and then you leverage intuition about the examples into more impressive insights.
    The biggest misconception that non-mathematicians have about how mathematicians think is that there is some mysterious mental faculty that is used to crack a problem all at once.
    You go up in abstraction, “higher and higher”. The main object of study yesterday becomes just an example or a tiny part of what you are considering today.
    The particularly “abstract” or “technical” parts of many other subjects seem quite accessible because they boil down to maths you already know.
    You generally feel confident about your ability to learn most quantitative ideas and techniques.
    You move easily between multiple seemingly very different ways of representing a problem.
    Spoiled by the power of your best tools, you tend to shy away from messy calculations or long, case-by-case arguments unless they are absolutely unavoidable.
    You develop a strong aesthetic preference for powerful and general ideas that connect hundreds of difficult questions, as opposed to resolutions of particular puzzles.
    Understanding something abstract or proving that something is true becomes a task a lot like building something.
    In listening to a seminar or while reading a paper, you don’t get stuck as much as you used to.
    You are good at generating your own definitions and your own questions in thinking about some new kind of abstraction.
    You are easily annoyed by imprecision in talking about the quantitative or logical. On the other hand, you are very comfortable with intentional imprecision or “hand-waving” in areas you know, because you know how to fill in the details.
    You are humble about your knowledge because you are aware of how weak maths is, and you are comfortable with the fact that you can say nothing intelligent about most problems.”

    Many of these are obviously applicable to programming – let’s say, to the kind of sensibility that a very good programmer or mathematician will tend to develop. It’s not all about huge towers of abstraction, though abstraction is important: giving structure, and allowing more powerful ideas to be thought. It’s more about habits of thought that enable someone to be productive when dealing with large, complex, interconnected systems. For example, the point about considering multiple representations of the same concept seems to me to be incredibly relevant to programming of all kinds – and it’s something that less-good programmers can easily get stuck on.

    Not only are there specific areas in coding where mathematics (even continuous mathematics) is directly relevant, but there are certainly also common sensibilities or patterns of this kind. If anything, systems-y work is probably more likely to gain from these patterns of thought, because its problems are often quite “pure” in comparison to, say, programs that have to know about taxes.

  62. The continued use of Fortran over Python for weather models ought to depend on the economics of maintenance, however. If they’re demonstrably not broke (and this might be true AFAIK), then in Fortran they’ll stay. But then, it wouldn’t really matter, since it would bear no impact on future code, as there’d be no need to adjust or maintain those models.

    I suspect this isn’t the case, though; it’s more likely that they need fairly continual maintenance (porting to new hardware, changing algorithms, etc.). If so, the question becomes whether it’s less costly to pay someone to make incremental changes in Fortran, or to pay someone to rewrite the whole thing in Python on the premise that Python has better support and is generally more maintainable. For managers, it may be the difference between finding a Python programmer, which is easy and therefore relatively cheap, and finding a Fortran programmer or training your own, both of which are expensive. For programmers, that means the question is whether it’s easier to rewrite, or to learn Fortran well enough to make any needed incremental changes. Too many variables involved here for me to know for sure.

  63. Meanwhile, I don’t really buy the argument that Fortran the language is just naturally better at computing any numeric results, such as linear approximations of non-linear equations. At best, I could believe it simply has optimized software libraries that no one’s ported to other languages. All those languages are as Turing-complete as Fortran. They all turn into object code at the end.

    In other words, if one were to write the required libraries in some other language, I believe it would take about as much time and effort as when they were first written in Fortran, possibly quite a bit less, since one hopefully wouldn’t have to expend the necessary microzen to understand the algorithms a second time. The only reason they haven’t is that all that cost has already been sunk into Fortran, and it’s simply less expensive to assimilate than to overhaul.

  64. Looks like a blown up version of the olde “you never need algebra after high school” nonsense.

    I prefer the classical grade-school-kid-from-Lawrence-Mass definition of things: “There are two ways of doing anything, the smart way and the dumb way. When you do it the smart way, that’s mathematics.”

    -dlj.

  65. @ Badgerwx – “I don’t think they can give up that bit of extra efficiency they get from FORTRAN”

    In my experience, a lot of programmers (and researchers) seem to think that enhanced precision at the computing level somehow offsets the macro uncertainties inherent in weather and climate modeling. If this nut is ever cracked, I suspect that it will be a new code paradigm that does it and not more horsepower.

  66. @Paul Brinkley
    “Meanwhile, I don’t really buy the argument that Fortran the language is just naturally better at computing any numeric results, such as linear approximations of non-linear equations.”

    Take your favorite sea or ocean, say, the North Sea. Now construct a model of water currents and sand transport based on a 3D grid. On a nice accurate scale, both in space and time.

    Fortran is a horrible language. But it was made to handle very large 3D arrays with salinity, temperature, speed, sand content etc. The point is to do fast updates of the complete cell-cell interactions (Navier-Stokes, general thermodynamics). Fortran is really good at this.

    @Paul Brinkley
    “At best, I could believe it simply has optimized software libraries that no one’s ported to other languages. All those languages are as Turing-complete as Fortran. They all turn into object code at the end.”

    There are orders of magnitude differences in time needed to do the debugging and then the calculations in different languages. Like turning screws with a hammer.

  67. @Shenpen > When I am bored I offer regression or correlation calculations to managers and they either don’t need it or have no idea of what I am talking about.

    For the ones that don’t know what you’re talking about, have you tried calling it [regression, that is] a “trendline”? That’s what MS Excel calls it.

  68. As a mathematician, I demoted my “computer science” major to a minor when I discovered that computer science isn’t the branch of mathematics that I believed it to be. Even though computer science has an important relationship with mathematics, and even though in theory it’s mathematical at its core, you really don’t need to know all that much math to really get into it.

    Having said that, I think it’s foolish for Evan Miller to claim that there’s a divide between the Lisp mentality and that of mathematics. I’ve only “discovered” Haskell and Lisp in the last three or four years; it is my great lament that I haven’t had the time to delve into these languages (and that, had I been a CS major rather than a minor, I probably *still* wouldn’t have been given a proper introduction to them), *precisely* because these languages are far more closely tied to mathematics than their Algol or Fortran counterparts. Indeed, Lisp came into being, and made certain core decisions early on (like garbage collection), partly to support symbolic differentiation of algebraic expressions.

    As others have pointed out, his wailing about how Fibonacci-number algorithms are taught is rather misplaced as well, for two reasons: first, when the topic was first introduced in my computer science class, I was using either C++ or Java (I cannot remember which), but in either case we learned the recursive version first and then discussed the O(1) version that is also mentioned; second, I’m reading through SICP right now, and I like how both Fibonacci and factorial are discussed, both as recursive operations and as iterative ones, and how the various algorithms use resources in different ways.

    As I read more of Miller’s essay (I’m not sure if I want to read the entire thing), I think I see his problem: he’s getting too caught up in the beautiful mathematics behind the algorithms, and is ignoring the entire purpose of things like SICP, which is to help programmers create beautiful, and sustainable, large systems. He’s so caught up in the beauty of the mathematics used that he misses the beauty of the computational writing that Abelson and Sussman present.

    Indeed, I’m still struck by the beauty and simplicity of doing something as basic as defining helper functions *inside* a function, rather than outside of it, and thus avoiding a cluttering of the namespace! (A small sketch of both points follows below.)
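
    To make both points concrete, here is a small, purely illustrative Python sketch (the standard textbook versions, not code from SICP or from Miller’s essay): the naive tree-recursive Fibonacci next to a linear-time version whose helper is defined inside the function, so it never leaks into the surrounding namespace.

        def fib_recursive(n):
            # Naive tree recursion: exponential time, usually the first version taught.
            if n < 2:
                return n
            return fib_recursive(n - 1) + fib_recursive(n - 2)

        def fib_iterative(n):
            # Linear-time version; the helper lives inside the function,
            # so it never clutters the enclosing namespace.
            def advance(pair):
                a, b = pair
                return (b, a + b)          # one step of the Fibonacci recurrence
            pair = (0, 1)
            for _ in range(n):
                pair = advance(pair)
            return pair[0]

        assert fib_recursive(10) == fib_iterative(10) == 55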

  69. OK, so I skimmed enough to come across this quote.

    “Mathematics, in the end, does not help you understand computer programming. It is not about finding metaphors, or understanding “fundamentals” that will never be applied. Rather, mathematics is a tool for understanding phenomena in the world: the motion of the planets, the patterns in data, the perception of color, or any of a myriad things in the world that might be understood better by manipulating equations.”

    I cannot imagine a more false description of mathematics. In one sense, Miller is right: mathematics doesn’t help you understand computer programming. But that is *precisely* because mathematics is *not* a tool for understanding phenomena of the world. Mathematics is solely and completely about examining ideas, making assumptions, and seeing where those assumptions lead. That mathematics and science have had a very healthy, symbiotic relationship, without which both fields would be poorer, is immaterial to what mathematics really is.

    Perhaps the best advice to give to hackers, or to anyone really, is to learn as much mathematics as you can, and appreciate it for its beauty. You’re not going to find any use for most of what you learn, but you are going to find many gems that take your breath away! And, along the way, you’re going to find that a thing or two can be applied to the “real world” as well, but that’s immaterial. For that matter, if you *do* go the applied-math route, you’re going to find many detours from that route that will take you to beautiful things that are, for all you know, utterly useless. (That isn’t to say that someone, somewhere, won’t find a use; it’s just that there’s far more beautiful mathematics out there than any one person could ever use more than a small fraction of!)

  70. Winter, you don’t turn screws in with a hammer. You pound the fuck out of them until their head is flush with the wood surface. I’ve seen my college roommate do this, so don’t try to tell me that nobody would do that, or that it can’t be done.

  71. @Russell Nelson
    “Winter, you don’t turn screws in with a hammer. …”

    Indeed, which was my point. Likewise, you can simulate sand transport in seawater or the weather in C++ or Python. But you will feel like hammering screws into wood.

    With Fortran, it feels much more like using an electric screwdriver instead. However, when coding text manipulation or databases, Fortran still feels like an electric screwdriver, but now used for typing.

  72. “Indeed, which was my point. Likewise, you can simulate sand transport in seawater or the weather in C++ or Python. But you will feel like hammering screws into wood.”

    I have often wondered what would happen if Common Lisp were to be used for these kinds of things, rather than C++ or Python or even Fortran. Although I’m attracted to computer graphics programming (I particularly like the linear algebra and quaternions involved), I cannot bring myself to learn C++ to the level needed for games programming. With things like GOAL used by Naughty Dog, I even suspect that C++ isn’t necessary, except for the fact that the Industry insists it is.

    One of the things that impresses me about Lisp is that it can be as abstract or as efficient as you need it. Sometimes these goals conflict, so if you make something efficient, it’s ugly Lisp code (and vice versa), but even so, you don’t have to drop into C to do it, like you do with Python…and that’s a beautiful thing, in and of itself. (I would argue that C is ugly compared to Lisp or Python, anyway.)

    Indeed, it is because of this “I can be as high-level or as close-to-the-metal as I want, and *still* feel like I’m using a scripting language” quality that I have come to realize that Common Lisp is a “transcendental” language: it transcends our definitions of “systems” or “scripting” or “real” or “toy” languages. I’m not sure how many transcendental languages there are (Haskell and Forth certainly are, and Smalltalk under certain conditions might be; I’m not as familiar with Smalltalk, though, so I can’t say for sure), but the languages that are transcendental have a certain pure mathematical quality to them, and it’s this mathematical nature that gives them their power.

    As for C, C++, Java, Fortran, and even Python, Ruby, and so forth…they are all garbled messes of syntax in comparison….

  73. “I even suspect that C++ isn’t necessary, except for the fact that the Industry insists it is.”

    The game industry insists it is because it is. C++ is the only language that gives you both extreme performance and considerable expressive power.

    It’s also much, MUCH easier to hire C++ programmers than it is Lisp programmers.

    “One of the things that impresses me about Lisp is that it can be as abstract or as efficient as you need it.”

    So can C++, especially with template metaprogramming. It’s a very sharp knife, and if you can’t use it competently, that’s your problem, not C++’s. Other developers manage just fine, and indeed write some pretty cool stuff with it.

    I know, it’s ugly, and given the choice I’d gladly use something else. But it is what it is, and if you want to be in the game industry you use whatever the industry is using. For the kind of sophisticated, real-time, high-performance system a game (especially a AAA title) is, C++ is pretty much the only choice.

    In particular, you can NOT write a game with any language using a garbage collector. Programs with GC memory management require five times as much RAM to equal the performance of the same program with explicit memory management. Unless you’re writing a game that requires a modern PC yet looks like it was made at least ten years ago — Lisp, Python, etc. are straight out. (fyi, that’s precisely what Notch did with Minecraft, and why he got away with writing it in Java)

    Today, AAA game studios target consoles primarily, and PCs second; it used to be that memory was so tight you couldn’t even rely on having malloc() and free(); memory management was often done by the project manager on a whiteboard or in a spreadsheet. You needed to store a bit of game state, so you asked your boss for n bytes of RAM, and he would try to squeeze you in somewhere, or else say “nope, find a way to do it without”. This was true at least as recently as the sixth console generation, and may still hold today especially on the Wii.

    Oh, and GOAL was a one-off, PS2-specific, highly-experimental hack with a flaky compiler written completely in-house. It wasn’t really a Lisp the way CL and Scheme are — more like a clever and sophisticated sexpr-consuming macro assembler (itself written in CL) that spat out opcodes for the various PS2 chips. It is neither portable nor sustainable as a general strategy for writing games. Recent Naughty Dog games (like the Uncharted series) are written in C++.

  74. “It is neither portable nor sustainable as a general strategy for writing games. Recent Naughty Dog games (like the Uncharted series) are written in C++.”

    It is my understanding that the primary reason GOAL was abandoned was that Naughty Dog was acquired by Sony, and Sony wanted to be able to share code between its different divisions. That’s kind of hard to do when everyone else is using C++! (It is precisely for this reason, though, that I said C++ isn’t necessary to make games, except that the industry says it is.)

    Beyond social reasons, there was nothing to stop GOAL from being refined, perfected, and ported to other architectures while still retaining its Lispiness. And even though GOAL isn’t literally a Lisp like CL or Scheme, it’s still a valid example of what could be, *if* more developers would take the possibility seriously!

  75. “it used to be that memory was so tight you couldn’t even rely on having malloc() and free(); memory management was often done by the project manager on a whiteboard or in a spreadsheet.”

    Hmm… that might explain why I’ve tended to have an excess of memory and a shortage of computing power with almost every game purchase I’ve ever made.

  76. Alpheus,

    One thing I’ve learned since my n00b days is that programming is an inherently social activity; and “social reasons” will therefore dominate any consideration of languages or tools.

    The current situation is this: you can choose to reimplement something like GOAL, building and debugging and testing all of the tooling it will take to produce game content in it, and then train a bunch of programmers in how to use the tool (no doubt meeting eyerolls and shudders as they recall their AI classes); or you can use C++, start developing game content today, and be able to recruit developers who have the requisite skills today.

    The game has to be out by next Christmas. Your call.

  77. “One thing I’ve learned since my n00b days is that programming is an inherently social activity; and “social reasons” will therefore dominate any consideration of languages or tools.”

    Look, I’m not disagreeing with this, at least not completely. Heck, the very fact that GOAL was killed when Naughty Dog was acquired by Sony ought to be proof. Having said that, there’s no reason why Sony couldn’t have decided to devote resources to GOAL, to try to build a community around it, and to see what could be done to extend it. Indeed, this is how new languages are born! But the community (or perhaps even just the managers of that community, because sometimes managers underestimate the willingness of programmers to learn new things) isn’t as willing to do something new. Hence, the community has spoken, and C++ is the standard.

    I would hope that there’s always room for experimentation, and that there’s always hope that experiments can grow into something promising. The very fact that GOAL exists ought to give us pause and make us wonder: is it possible for something better than C++ to exist, even for games? The answer very well seems to be “perhaps”.

  78. Evan Miller is being myopic. In ancient Greek times, Plato treated mathematics (specifically geometry) with a mystical reverence, and the words “Let no one unversed in geometry enter here” were written on the door of the Academy.

    But the Greek philosophers didn’t revere geometry for any practical reason. The reason they used it was because geometry was the only system of axiomatic reasoning they had available to them…so for them geometry was a substitute for symbolic logic.

    Perhaps today we use mathematics and physics as a substitute for a logical world-view based on hard facts and solid reasoning. Since physics is the only part of the world we can talk about in a precise way, we hold it up as an ideal for how we should understand everything else in the world.

    I agree that continuous mathematics is not very useful any more…much of physics was born in the era before computers existed, and the equations physicists used were the ones that were simplest for them to calculate. Sometimes they simply ignored systems that were too difficult to compute.
