(Epistemic status: Pretty sure the premises are all correct and the metaphor is accurate. Exploring the implications and conclusions.)

A programmer was given an aging piece of software with which to complete a project. To the company, using the old software seemed like the path of least resistance. But opening the program and looking at the code, the programmer saw it had been hacked, edited, and added to by twenty or more programmers before her, each with their own style and vision of what a good program should be, each according to the conventions and concerns of the day.

"Ahhh, Legacy code," the programmer said. She frowned a little.

By now, the software was full of unused code and had become a maze of routines that referred to other routines that referred to yet other routines, and so on. The programmer found it almost impossible to sort out what many parts of the program were even designed to do. Worse, though the computers used by the company were high-tech, parts of the software had been written for machines from decades before. The program was a nightmare to work with: normally one had to doctor its finicky inputs, and the software often returned incorrect, puzzling, or even meaningless outputs.

When software like this does anything useful at all, it is a small miracle.

Most organizations, social hierarchies, and belief systems are heavy with garbage. There is no one to blame, because most of those systems are constructed from layers of intentions and interpretations spanning years, decades, or even centuries; hence things like a legal system with thousands of pages of code and case law. Plenty of these systems evolved around social realities that have changed immensely since they were designed. Meanwhile, your own experience tells you that no matter how dysfunctional or downright harmful a system is, no organization, belief system, or social structure will ever agree with you that it is wrong. For the most part, none of these systems were designed to eliminate themselves once they are no longer useful.

Once a system takes root in society and ages, whatever agency, intelligence, and will originally animated it drain away. Such a system is usually very hard to remove even if it starts to work against its original intention. Most people either thoughtlessly adopt or begrudgingly embed themselves in whatever systems their culture and society present. Doing so usually seems to be the path of least resistance, even a necessity. Thus, many social structures and systems are aging, barely functioning software: legacy programs running on human hardware.

The programmer with the messy program did what any sensible software engineer would have done in her shoes: she set aside the old software and built something useful that did what was needed, using the full power of the company’s modern computers.

Regarding moral obligations to systems:

No one in history has ever died wishing they had paid more dues to hierarchies, bureaucrats, and society’s systems. Yet that is what those systems always seem to want: more. People treat them as basic truths, but do we owe such systems any more loyalty than we would give to old computer programs?

The next time old software hangs on a certain procedure unless you feed it the doctored inputs it likes to see, why even hesitate? When you recognize opportunities to bypass or delete such software, should your default choice be to seize them?

And finally, the clear implication is that we should be writing new software; attempts to "tweak" the old are probably just going to pile on more garbage and push it further out of spec. I think W. Edwards Deming would have argued strongly that this is in fact the case. That could be an even stronger moral impetus to "delete/bypass" such a system.

I see that the main conflict with my reasoning would come from people who have embedded themselves by default in the systems around them. It would be like all the people who accepted a bloated Windows because it was all their org and Best Buy ever gave them, and now we're all switching to Linux. Maybe then the moral obligation is to try to facilitate "soft landings" for those already deep in the current systems.

Comments

> The programmer with the messy program did what any sensible software engineer would have done in her shoes: she set aside the old software and built something useful that did what was needed, using the full power of the company’s modern computers.

That's... not at all how it works in the real world. 

If you are lucky, you have a comprehensive, high-quality regression test suite: you can redesign the code to pass every test, then pray that the use cases not covered by the suite don't mess things up.

If you are less lucky, your test suite is neither comprehensive nor high-quality, and you have to wade through the test cases one by one to see which ones matter.

If you are in the situation most programmers are in, there is no regression test suite at all, and you have to reproduce the old program not just feature for feature but also bug for bug, since your users are now relying on the old behavior (a sketch of this approach follows below). And that effort is comparable to everything that has been spent on that code so far by everyone who worked on it.
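To make "bug for bug" concrete, here is a minimal sketch of characterization (sometimes called "golden master") testing, the usual way to build a safety net where no test suite exists. The Python is purely illustrative: `legacy_fn`, `new_fn`, and `golden_outputs.json` are hypothetical names, not anything from the post or comments, and the sketch assumes the program's inputs and outputs are JSON-serializable.

```python
import json
from pathlib import Path

# Invented filename: where the frozen legacy outputs are stored.
GOLDEN = Path("golden_outputs.json")

def record_golden_outputs(legacy_fn, inputs):
    """Run the legacy program over a corpus of inputs and freeze its
    outputs, warts and all, as the de facto specification."""
    golden = {json.dumps(inp, sort_keys=True): legacy_fn(inp) for inp in inputs}
    GOLDEN.write_text(json.dumps(golden, indent=2))

def check_against_golden(new_fn):
    """Return every case where the rewrite diverges from the frozen
    legacy behavior, including outputs that were arguably bugs."""
    golden = json.loads(GOLDEN.read_text())
    mismatches = []
    for key, expected in golden.items():
        inp = json.loads(key)
        actual = new_fn(inp)
        if actual != expected:
            mismatches.append((inp, expected, actual))
    return mismatches
```

You would run record_golden_outputs once against the old program with as large an input corpus as you can gather, then run check_against_golden on every change to the rewrite; only when it comes back empty does "set aside the old software" stop being a leap of faith.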

If you have some leverage over your users, you can tell them that if they want this new feature by the date they want and at a price they can afford, they have to accept a version 0.1 of the new Minimum Viable Product and wait for more features as you implement them, with the understanding that they will never get a clone of the original product and will have to adapt their processes to the new one.

If you have no leverage, you do what your predecessors did and very carefully add the new feature without breaking (or "fixing") any of the old ones.

The story is from the 1990s. The character is actually my dad. It was a mid-sized actuarial firm. He started by writing a whole new program to do the one function he needed the spaghetti-code-laden crap to do. Then he added features here and there until he had, in effect, made a whole new program that was documented, easier to read, and functioning well. After a while, he passed it to the other actuaries, and his work became the new software. But he never did use the old software.

I guess things are different now. As the person above also said, it's impossible to ignore the super-system that a small system is embedded in. Additionally, I think part of his reasoning for outright refusing to use the old software was that he couldn't audit it, and he was signing off on yearly reports for the firm.

I can see the appeal, but I worry that a metaphor where a single person is given a single piece of software, and can rewrite it for their own and/or others’ purposes without grappling with the myriad upstream and downstream dependencies, vested interests, and so forth, is probably missing an important part of the dynamics of real-world systems.

(This doesn’t really speak to moral obligations to systems, as much as practical challenges doing anything about them, but my experience is that the latter is a much more binding constraint.)

Indeed. I impulsively wrote a continuation story in response. It's very rough, and the later sections kind of got away from me, but I've posted a scribble of “Bad Reasons Behind Different Systems and a Story with No Good Moral,” which may be of relevance.

I liked it. Made me consider a bit more.

First take: tangentially, does this point to an answer to the question of what bureaucrats are trying to maximize (as sometimes addressed on LessWrong)? Maybe they are trying to minimize operational hitches within their small realm.

I edited the title from being in all-caps to being in normal sentence-case. We don't really do all-caps titles here.

Duly noted. What about the subtopic title? I'll see if I can change it to normal sentence case and bold.

> No one in history has ever died wishing they had paid more dues to hierarchies, bureaucrats, and society’s systems.

False, and indicates a significant failure to appreciate people with values and desires different from your own.

There are lots of people who approach their death wishing they had acquired more social status, which is what "paying dues to society's systems" means. And it's literally a cliche that people approach death and start thinking about religion, which I suspect falls into your "hierarchies". And people do these things because they actually find that their own values align with those of the system, a possibility which you don't even seem to consider.

You are making too many assumptions about my values and desires. I don't care for religion, and I think people can get a lot more social status by bypassing or rendering irrelevant the social systems around them.

To pay all the dues would be like "work to rule" in a factory: the well-known protest tactic of adhering to every policy to the letter as a method for bringing an operation to a standstill.

Many who go far didn't pay all their dues. Your life isn't long enough. Maybe do some pragmatic signaling, but there's no need to actually do everything that seems to be demanded.

I find this reply confusing. It seems to me that you're mixing up a descriptive and a prescriptive discussion of values.

"Your life isn't long enough" <-- prescriptive, a value that I largely agree with though we might argue about some specifics.

"No one has ever died wishing" <-- descriptive, but as a description of what people in fact want it's inaccurate.

New people: new ways are best

Old people: old ways are best

Is it this, or does it simply appear to be the case because someone older is more likely to be deeply embedded?

My dad doesn't think Windows is better than Linux or Mac. He sees me with OpenSuse and openly derides Windows all the time, but he figures he doesn't want to learn a whole new system. He's past EOL on Win7 at this point, but he is so embedded in it, down to Excel for his accounting (he was an actuary, on Excel from roughly the '80s through the 2000s).

Also, I have not argued that every new way is good. Some older technologies are extremely good. Top-of-head example: no one who has ever used film and worked in a darkroom would say that working in Photoshop could ever fully replace that experience. Another example: I hate turning on my computer to do anything with music. The screen/mouse/keyboard interface does nothing for my creativity. And oh my goddess, how cool the whole thing can sound and come together on a four-track!