Here is the short version:
Writing program code is a good way of debugging your thinking -- Bill Venables
It's short, apt, and to the point. It does have a significant flaw: it uses a term I've come to hate, "bug". I don't know if Grace Murray Hopper is to blame for this term and the associated image of an insect creeping into a hapless programmer's hardware, but I suspect this one word may be responsible in some part for the sad state of the programming profession.
You see, a lot gets written about bugs, debugging, testing, and so on. A lot of that writing only serves to obscure one plain fact, which if I were slightly more pretentious I'd call one of the fundamental laws of software:
Every "bug" or defect in software is the result of a mismatch between a person's assumptions, beliefs or mental model of something (a.k.a. "the map"), and the reality of the corresponding situation (a.k.a. "the territory").
The software industry is currently held back by a conception of programming-as-manual-labor, consisting of semi-mechanically turning a specification document into executable code. In that interpretation "bugs" or "gremlins" are the equivalent of machine failures: something unavoidable, to be controlled by rigorous statistical controls, replacement of faulty equipment (programmers), and the like.
A better description would be much closer to "the art of improving your  understanding  of some business domain by expressing the details of that domain in a formal notation". The resulting program isn't quite a by-product of that activity - it's important, though not nearly as important as distilling the domain understanding.
You think you know when you can learn, are more sure when you can write, even more when you can teach, but certain when you can program. -- Alan Perlis
So, learning how to program is one way of learning how to think better. But wait; there's more.

An art with a history

It's easy, if your conception of programming is "something people do to earn a few bucks on freelance exchange sites by coding up Web sites", to think of programming as an area where only the past five years or so are of any interest. Get up to speed on the latest technology, and you're good to go.
In fact, programming is a discipline with a rich and interesting history [1]. There is a beauty in the concrete expression of algorithmic ideas in actual programming languages, quite independently of the more mathematical aspects which form the somewhat separate discipline of "computer science". You can do quite a lot of mathy computer science without needing concepts like modularity, coupling, or cohesion, which are of intense interest to practicing programmers (the competent ones, at any rate) and which have engendered a variety of approaches.
People who like elegant intellectual constructions will appreciate what is to be found in programming languages and, if you can sort the "classics" from the dregs, in the architecture and design of many programs.

Deep implications

Mathematicians are concerned with the study of quantity and structure. Programming requires knowledge of what, despite its being considered a part of mathematics, strikes me as a distinct discipline: the intersection between the theory of computation and the theory of cognition. To program well, you have to have a feel for how computations unfold, but you must also have a well grounded understanding of how humans parse and manipulate textual descriptions of computations. It is in many ways a literary skill.
What is especially exciting about programming is that we have good reason to believe that our own minds can be understood adequately by looking at them as computations: in some sense, then, to become more familiar with this medium, textual descriptions of computations, is to have a new and very interesting handle on understanding ourselves.
This brief presentation of programming needs to be completed - in further posts - by a look at the "dark side" of programming: biases that are occupational hazards of programmers; and by a closer look at the skill set of a competent programmer, and how that skill set overlaps with a rationalist's developmental objectives.


[1] This history is sadly ignored by a majority of practicing programmers, to detrimental effect. Inventions pioneered in Lisp thirty or forty years ago are being rediscovered and touted as "revolutions" every few years in languages such as Java or C#: closures, aspects, metaprogramming...
Comments

When someone claims that a program has a non-obvious bug, I say "show me the miracle", that is, a surprising observation whose origin isn't presently understood. Then, the process of debugging mostly consists in dissolving the mystery, which is usually achieved by performing experiments (sometimes written as actual tests).

[anonymous]

What about bugs that arise not from a fundamental misunderstanding of the sort you're referring to, but from some sort of typo or language-specific error that you'd never think of as correct if only you'd noticed? These are more frequent, more annoying (they can come up even in simple tasks), and take just about as long to debug.

I realize that this sort of bug isn't interesting to write about, but you ignore this case completely when stating your 'fundamental law of software'.

I think it's covered: the programmer has a mental model of what the code does (say, drawing a triangle on the screen) that does not match what the code actually does (completely fail to start up due to a syntax error). I agree that this isn't really all that great a way of describing those sorts of bugs or how to fix them, but it is at least included in the law.

A misunderstanding doesn't need to be "fundamental"; Nature is still the law, even if the law is stupid. An annoying detail that failed to be noticed is still a flaw in your map.

When the law needs interpretation like this to be valid, it is no more interesting, insightful, or useful than saying:

The lack of a program that flawlessly achieves objective X is the result of the programmer's map not accurately representing the fact in the territory that typing a certain sequence of characters and compiling it would produce such a program.

A more interesting, if less fundamental, law would be:

A large class of "bugs" or defects in software is the result of a mismatch between a person's assumptions, beliefs or mental model of the problem domain (a.k.a. "the map"), and the reality of the corresponding situation (a.k.a. "the territory").

The map-territory metaphor applies to the actual program (or even a specific test case) and your understanding of the actual program (test case), not to the hypothetical program that would solve the informally specified problem if it were available.

Note that the change I recommend as an improvement does narrow down which territory the law refers to. The problem with the original is that it doesn't actually specify anything like what you said.

It doesn't feel very fundamental. How commonly they crop up, and how easy they are to debug, have much to do with your editor, coding style, and interpreter/compiler.

  • the use of longish descriptive identifiers makes it less likely that single typos collide with other valid names, while text completion largely eliminates single-character typos as a class of error.
  • syntax highlighting provides a useful form of spell-checking
  • consistent formatting makes it difficult to accidentally hide 'structural typos', especially given editor support (mainly brace matching).

These sorts of concerns are very amenable to technical solutions, which are commonly implemented to various degrees. But even if they were completely eliminated, programming wouldn't be that much easier. My boss would still be making fun of me for staring off into space for long stretches while I'm thinking through a problem.

This is exactly analogous to typos vs defects of argument in prose. Yes, spell-checking will miss typos that collide with valid words, but it feels off to claim this as a deep insight into the nature of writing.

A typo is when you think you have written down X, but actually you have written down Y. Are we not squarely in a map-territory discrepancy?

And, speaking from personal experience, those can be very painful to debug, because right until the very moment of realization you are prepared to swear that the problem must be something very subtle, since quite obviously you wrote down what you meant to write down.

If you're lucky, and work in a statically typed language, the compiler will catch typos for you. Lucky, because the typo has to be a) in an identifier (the compiler doesn't check strings or integer values), and b) such that it results in an undefined identifier (you could have a typo which turns one defined identifier into another defined identifier), or c) in an identifier of the wrong type.

I don't know what you mean by a language-specific error; what I can come up with is also a map-territory discrepancy: you think that the array indexing convention is 1-based when in fact it is 0-based.
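A minimal sketch of that indexing case (the function and the list are invented purely for illustration):

```python
def last_item(items):
    """Return the last element of a non-empty list."""
    # Map: indexing is 1-based, so len(items) looks like the last position.
    # Territory: Python lists are 0-based, so this raises IndexError.
    return items[len(items)]

# Once the map is corrected, the territory obliges:
def last_item_fixed(items):
    """Return the last element of a non-empty list (0-based indexing)."""
    return items[len(items) - 1]
```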

The more down-to-earth formulation is "every bug is in fact a programmer mistake".

It's almost not worth mentioning... but my experience in a different domain says otherwise. Namely the game of Go: one of the first, most basic rules you learn in Go is "players alternate play". Or, to say it another way, "you ain't going to get to play two moves in a row". Every player is supposed to know this... and yet I have seen one strong player totally humiliate others by pointing out that they played exactly as if they hoped to get two moves in succession.

Every bug is in fact a programmer mistake, and everyone knows this... but why does everyone behave as if they thought otherwise?

[anonymous]

If we're thinking about programming as a way of deeply understanding a problem -- or at least as a process which leads to understanding a problem as a necessary step -- then we have these errors that don't reflect a misunderstanding of the problem. They may reflect a misunderstanding of the language, or a typo, which I realize are still map-territory discrepancies (isn't any mistake?) but have nothing to do with the problem you're trying to solve.

In a way, I suppose I'm nitpicking. But it also needs to be said because when debugging, you need to be aware of two levels of differences: differences between what the correct solution is and what you think it is, and differences between what the program does and what you think it does.

This comes up a lot when I'm grading mathematical proofs. Sometimes the mistake is a faulty step: for instance, an assumption of something that shouldn't be assumed, or maybe only a partial solution is found. Sometimes, the mistake is in the presentation: the idea of the proof matches the correct idea, but a key step is unexplained, or a final answer is wrong due to an error in arithmetic. I think it definitely matters which kind of error the students are making.

The big difference between a typo in writing and a typo in code is that in the first case the hardware that does the interpretation transparently covers up the mistake (which is why editing is a hard job, btw). In the second case the consequences can be more severe, are likely to crop up later and inconvenience more people. Code is unforgiving.

As a case study we could consider the latest "bug" to have a noticeable effect on LW. Someone released this code into production believing that it worked, which turned out to be very different from the reality.

I agree with the comments by Misha and JGWeissman, saying that the "fundamental law of software" given here doesn't focus tightly on what's especially interesting about programming. Still, I think I know what Morendil is saying is interesting here; and if it's a misrepresentation of the post, it at least represents what I think makes programming interesting, and hard.

(I'm going to try to explain this without assuming any programming jargon. Catch me if I fail!)

The first lesson of programming is the rules of the game: the syntax and semantics of a programming language, and how to run such code. For our purposes, this is just a substrate. Now, as anyone who's tried to teach an Intro to Programming class can tell you, this substrate is either hard to learn or hard to teach - I'm not really sure which. But that's not the point.

The second lesson of programming, though, is this: In a program of even moderate size, there are too many details to keep in your head. When you're writing line 20,001 of code in your program, it will be wrong -- unless you can make strong assumptions about the behavior of the previous 20,000 lines.

Thus, you learn to write your code in small, modular pieces, and you try to summarize each module's behavior clearly and concisely. Each module needs to be small enough that you can hold all the relevant details in your head -- the meaning of each line, including the summaries of all the code that it refers to. [1]

This introduces a key map-territory distinction. The territory is the code that implements a module; the map is the short description of the module. This distinction is unavoidable when you write large programs. You simply can't manipulate the entire territory in your head all at once, but you expect the code of any module to be frequently dependent on the behavior of several others.

In my experience, the toughest programming errors to fix arise as errors in some programmer's map. Suppose I wrote a module two weeks ago, and now my mental model of that module is slightly wrong, and so I misuse the module in a subtle way. Depending on the subtlety of the misunderstanding, the bug might not even show up until I change the program further -- which makes the bug hard to find, as it won't seem causally linked to the last change I made.
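A small sketch of that situation (the function and the inclusive/exclusive detail are invented to illustrate the point):

```python
from datetime import date, timedelta

# The module written two weeks ago: the territory.
def business_days(start, end):
    """Return the business days between start and end."""
    # The docstring above is the map other code relies on; the detail it
    # omits is that end is *excluded*, as with Python's range().
    days = []
    current = start
    while current < end:
        if current.weekday() < 5:  # Monday..Friday
            days.append(current)
        current += timedelta(days=1)
    return days

# The code written today: the caller's map reads "between start and end"
# as inclusive, so the last working day of the month silently goes missing.
april = business_days(date(2010, 4, 1), date(2010, 4, 30))
assert date(2010, 4, 30) not in april  # the surprise lives here
```

The program does exactly what the territory says; it's the caller's map that has quietly drifted.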

Some specialized map-territory distinctions exist in any field of endeavor. But, a programmer must learn a new map, over new territory, for every new project. That is, the mapped territory changes, sometimes frequently. Thus, vital programming skills relevant to LW interests are:

  • Accurately mapping territory,
  • Accurately conveying one's maps, and
  • Creating territory that can be accurately mapped.

What's more, computers usually give you sharp feedback about whether or not the program works as expected. As such, programming can be a tight feedback loop. So, programming has the right sort of conditions in which to learn these skills.


[1] Ok, you really just need to keep track of some of that detail -- especially if you maintain explicit, logical properties in your code. But you need to know exactly how much detail you can ignore; when you can safely ignore earlier detail, it's due to something like micro-modularization. (I could go into greater detail on this, but it's not the point.)

Actually even relatively simple programs can be hard to understand deeply and give rise to confusion, because they describe a complex computation; "complex" precisely in the sense that the computation's detailed shape is hard to deduce from the program text. (Langton's Ant being a good example, though without much practical relevance.)
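For instance, here is a minimal sketch of Langton's Ant (the grid representation, step count, and starting heading are arbitrary choices); the rules fit in a dozen lines, yet the behaviour they generate is very hard to read off those lines:

```python
def langtons_ant(steps=11000):
    """Run Langton's Ant and return the set of black cells.

    Rules: on a white cell turn right, on a black cell turn left,
    flip the cell's colour, then move forward one cell.
    """
    black = set()      # cells that are currently black
    x, y = 0, 0        # ant position
    dx, dy = 0, 1      # heading: "up", with y increasing upward
    for _ in range(steps):
        if (x, y) in black:
            black.remove((x, y))
            dx, dy = -dy, dx    # turn left (counter-clockwise)
        else:
            black.add((x, y))
            dx, dy = dy, -dx    # turn right (clockwise)
        x, y = x + dx, y + dy
    return black
```

After roughly ten thousand steps of apparent chaos the ant abruptly settles into building a repeating diagonal "highway", a fact you would be hard pressed to deduce by staring at the code.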

For an interesting real-world example around which I had trouble bending my mind recently, see mustache, a Ruby program which does template expansion. The challenge was to get this to handle indentation correctly. It's written well enough; it even has a nice set of unit tests.

But it made me realize that "indentation" was very much a harder concept to define formally than to grasp intuitively, and that it cut across the grain of the program's current design in interesting and non-obvious ways.

OK - that's entirely fair. The map-territory distinction I describe above reinforces a somewhat different set of rationality skills.

I've developed a sense for when I understand an algorithm to the point of being able to program it, rather than understanding it sufficiently to explain it or expect it to work. Feeling like I know how to code an algorithm is just like feeling sure that a mathematical proof is correct. They have a specific sort of robust clarity that most thought lacks. I hadn't really thought about it, but that must be a learned feeling. I expect it's a valuable sense to have; if programming can teach it, then more power to learning programming.

It's kinda funny, because my map includes "program WILL fail to compile on first compile attempt".
... sloppy coding, maybe, and I would not be shocked to find I pay for my arrogance later on in my career, but I've only had ONE bug I couldn't track down: in Pong, my computer-controlled paddle behaves more and more sluggishly as the program runs. The player-controlled paddle works just fine. (You'd have to watch it in action...)

"[programming] is in many ways a literary skill."

Be that as it may (or may not), I'd like to see someone keep a straight face while saying that to an English major!

[anonymous]

I'm sympathetic to the idea of programming for the purpose of technological literacy, brain exercise, and coolness (as opposed to programming as required for your work, which tends to have a much narrower focus and, in my case, stays at a pretty primitive level).

If someone who doesn't have a primarily "coding" job wants to actually take your advice, where would it be a good idea to start, concretely?

Stuff I've considered:

learning/reviewing something like C++ that would let me do big computations and simulations (+1 for obvious professional value)

learning web development, beyond HTML (no obvious professional value, but it would be cool, and lends itself easily to a DIY project)

learning something about computer architecture, operating systems, etc., and building toy models of those (no obvious professional value, looks terrifying, but has the advantage that I'd understand the technology I use instead of playing with a "magic box").

Something else?

My advice would be, think of something you actually want to write, then learn the tools that will enable you to write it. It sounds like what you most want to actually write is simulations for use in your day job?

This history is sadly ignored by a majority of practicing programmers, to detrimental effect.

Why does this happen? The majority of practicing programmers are using languages and environments designed by other people, but language designers need more skills and interest than ordinary programmers, so we might expect that they would be more likely to know about this and include the desirable features in the languages they create. (See James Gosling's quote about Java dragging C programmers halfway to LISP.)

What happened to the knowledge/skills/interest between the people who implemented LISP as pioneers, the people who learned on the pioneering LISP systems, and the people who create C#, Java, etc.?

Are they really lost and now being 'rediscovered', or were they pioneered, found wanting, and left dormant until technology, software and communications advanced enough to be able to make good use of them?

("It was better in the old days, let me tell you about the old days" is a recurrent theme in programmer blogs, and I wonder if it's really a signal of being in the in-crowd, rather than a genuine belief which alters behaviour accordingly and causes their software to include the sorts of features they are praising).

Are there any good books about the history of programming?

I can't think of one. The closest I ever got was to buy books of various ages, then get a sense of the implied historical development.

There is a book (perhaps more than one) on the history of programming languages, a somewhat different topic.

Heh, I just submitted a somewhat similar article not realizing this one was already here. I think they complement each other pretty well, though.

I'd like you to go into more detail about why you say that programming is "literary". Do you mean making one's code readable to other programmers, or the process of creating a deliberate mental model of the program in the minds of the users, or something else?

Just as a story would, your program represents a human explanation and interpretation of its topic. You have a huge amount of qualitatively different ways to express what you're programming, and each one paints a very different mental picture of the forms and processes your program represents. Some ways may be objectively better, and some may be objectively worse; others are a matter of taste.

Once you have a broad outline, the majority of your creative time is solving a series of small puzzles -- understanding how to write each small part, and how to organize it and phrase it so that it fits into your narrative and is obviously correct. You're forced to organize each piece artfully, because otherwise the greater whole is going to be impossible to hold in your head at once, and you will make a lot of errors.

A lot of that work seems similar to the process of writing both fiction and (I presume) non-fiction. You're working under stricter stylistic limitations when programming, since the language must be completely precise, but you tend to make up for it by working with a palette of ideas which are more numerous and more alien than anything you could ever write into a character-based story.

Both, and more besides...

Agree on them being complementary, and there's a third one being drafted about cognitive biases affecting programmers.

The second part - "An art with a history" - doesn't seem related to the rest.

Hmm, the intended connection is that they are three distinct reasons why learning programming is a great idea. Does that help? Is the sense of disconnect due to abruptness of transition (i.e. form), or does the content strike you as not belonging there?

The biggest value I see in your post is the emphasis on the intellectual, rationalist nature of programming. The fact that programming has history has nothing to do with that.

Got it. Would it devalue the post for you if I made it clearer that I claim programming has aesthetic and philosophical value on top of its intellectual, rationalist appeal?

That's correct - there's an aesthetic value to writing posts to establish single coherent points. An essay beats a bullet list.

What I think would improve this post is using an example to illustrate how programming teaches understanding of the question. It would have to be something relatively simple, but where the error was conceptual.

This is by way of testing if the discussion area is a good way to get feedback on posts before posting them to LW.

This was previously here where it got some feedback, but the attention it got was short-lived and the post could do with further improvement.