I watched this YouTube video a few months ago that really stuck with me. It was about how to organize your kitchen. The video went through 10 tips, but before getting into those tips it talked about the why. Why is organizing your kitchen important in the first place?

Well, the standard and obvious answer is that it makes the process of cooking smoother and more efficient. No digging around for those tongs as your chicken threatens to burn. But here's another answer that is perhaps less obvious and more important: motivation.

That's right, motivation. The desire to even be in the kitchen in the first place. If you have to dig through a chaotic cabinet under the sink to find the right pan, in practice you're just gonna end up ordering takeout. And that's a problem that compounds upon itself. If you cooked instead of ordering takeout you'd be building up your skill, making things easier for next time. Instead, your failures and bad experiences make you averse to wanting to cook in the future.

I think something similar is true in programming.

One simple example of this is Vite vs Webpack. I was recently working on a side project that used Webpack. `npm run start` would take a good ten seconds or so. And I restart my dev server a lot, imposing this annoying ten-second cost on myself because... well, I'm not sure. I'm paranoid about hot reloading not working? Because I suck?

Anyway, I ended up switching to Vite, which takes more like one second instead of ten (don't quote me on that though). Soooo much nicer. I notice myself smiling inside whenever I restart my dev server. It's like reaching for a knife resting neatly on a wall-mounted magnetic strip instead of having to dig through a messy drawer.
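Part of why the switch felt so light, for what it's worth, is that Vite needs barely any configuration. Here's a minimal sketch, not my actual project config — it assumes a React project using the standard `@vitejs/plugin-react` plugin, and the port is a hypothetical choice:

```javascript
// vite.config.js — a minimal setup sketch, assuming a React project.
// Vite serves source files over native ES modules during development,
// skipping the up-front bundling step; that's where the ~1s startup comes from.
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

export default defineConfig({
  plugins: [react()],
  server: {
    port: 3000, // hypothetical port, matching the old Webpack dev server
  },
});
```

Then `npm run dev` starts the server, and the first real work happens lazily when the browser requests a module.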

Another example that comes to my mind is with flaky tests. When I was working on Premium Poker Tools, my end-to-end tests got out of control. I'd make a small code change, random tests would fail, it didn't make sense, I'd have to spend hours debugging. Ugh. It definitely made me feel unmotivated.

But as a sort of counterexample, for a side project called Calibration Training, I didn't have any tests. At first it was fine, but at some point I started feeling nervous that a code change here would introduce a bug there. It's a little bit like the feeling of being in one of those antique shops: you don't want to accidentally bump into something and then hear the sound of glass shattering. Except it's even scarier, because with programming I was especially worried about things failing silently. Anyway, this too was a bad developer experience. As a developer you want to write code, run tests, see them pass, feel confident that things are pretty stable, and move on.

Complexity itself is perhaps the worst. Well, when it gets out of control. No one wants to deal with the 13,000 line module in the legacy app. But everyone is motivated when starting a new project! When starting a new project, oh man, you can just crank out feature after feature after feature. It's almost addicting.

Complexity can be wrangled though. For Premium Poker Tools it grew pretty fast. As it grew I'd slowly start becoming overwhelmed. I'd hit a point of being sufficiently overwhelmed where I'd spend some time refactoring. Then I'd feel good and motivated again. But then the complexity would creep back up. So on and so forth, for however many cycles.

Because I worked for myself, I was actually able to take the time to refactor. But at almost all jobs I've had, that wouldn't have been the case. Complexity never gets wrangled. It just sits there and rots. Developers can smell how rotten it is too, and they try to avoid touching it. "Maybe Alice could work on that feature instead." And when they can't avoid touching it, it's weeks of work for like a two-line pull request that leaves everyone feeling demoralized at the end. Like those nights when you look up at the clock, see it's 9:30pm, and realize that it somehow took you three hours to cook chicken and broccoli for dinner even though the recipe said 45 mins. Or is that just me?

Another thing that contributes to developer experience (DX): working with technologies where you like the humans behind 'em. For example, Chakra UI seems to have really awesome people behind it, and that makes me happy as I use Chakra. OTOH, Java feels like it has a faceless, blood-sucking corporation behind it.

Pointy Haired Boss is confused by all of this though, so let's take a moment to address his confusion. He doesn't see what the big deal is. Motivation? Morale? This isn't summer camp. It's business.

Developers are getting paid to write code, and paid extremely handsomely. Isn't that enough? Isn't that what being a professional is about? Putting in a good 9-to-5 regardless? Working on whatever tasks are of highest business value to the stakeholders, regardless of how "smelly" the corresponding parts of the codebase are? Sure, some people might drag their feet a little if the code is smelly. They might hang out at the water cooler a little longer, take a longer lunch, play an extra game of ping pong. But these are all just marginal things. Maybe 7 hours of work instead of 8, but not 3. Right?

...

...

...

Right???

I'm afraid not, Pointy Haired Boss. I'd guess that motivation can make something like a 2-4x difference in productivity. It's a little sad to say, but the idea that we're all professionals who will work equally hard regardless of the conditions is just untrue.

And hey, maybe that's not sad. Maybe that's not something to be ashamed of. Maybe we should embrace our humanity.

To be clear, I'm not proposing that our codebases need to spark little nuggets of joy in every corner. No. I think we should, as always, be pragmatic. Look at the pros and cons, find the right balance, and do what makes sense. But realize that DX often has quite the large impact on motivation, and motivation often has quite the large impact on productivity. Incorporate that into your cost-benefit calculus as you may.

7 comments

I agree. But why isn't this happening already? Too many developers complain about their jobs.

A partial answer is that a good developer experience does not happen automatically. But that does not answer why companies do not create good environments on purpose. I mean, companies are already complaining about a lack of developers, they have HR departments, they do various teambuilding activities, they do Scrum and SAFe and whatever, so... why not this one thing that could improve productivity a lot?

Maybe it is Sturgeon's Law: most companies are simply not competent enough to create a nice workplace. Their managers are the equivalent of... the kind of developers who fail the FizzBuzz test. Or rather, there are multiple skills that managers need to have, those skills are not correlated with each other, and the managers are only hired for those other skills (such as navigating corporate mazes successfully).

Perhaps managers are unable to empathize with the developers, because they are psychologically too different. First, it's about the jobs: developers get technical things done with precision, managers talk to people. (So when a developer gets depressed about some technical problem, a manager is likely to see it as a purely psychological problem that can be overcome by cheerful "I trust you, you can do it!!!" as if saying that could somehow fix the broken server.) Second, even if the managers are former developers, I suspect there is a selection based on personality traits; the less technically oriented ones are more likely to prefer promotion to management.

...or maybe all jobs suck, and developers are simply more likely to complain. Maybe all jobs could be made much more pleasant, but no one cares (maybe because it is not profitable to do so)? Maybe people who have the skill to make someone else's work day more pleasant... can do something more profitable instead?

Even darker thought: maybe being nice to people is not profitable in general, because then they get too confident and leave you for something better (which may turn out not to be actually better, but by then they are already gone), while the right amount of suffering keeps them busy and makes them stay? Maybe everyone has a different long-term psychological setting for "I can handle this much suffering", and if you give them more, they break, but if you give them less, they will seek something more challenging instead? So a developer whose job is too pleasant may decide to quit and try some better-paying job instead, instinctively assuming that pain is proportional to gain, so people who feel good at work must be leaving a lot of money on the table.

But that does not answer why companies do not create good environments on purpose. I mean, companies are already complaining about lack of developers, they have HR departments, they do various teambuilding activities, do Scrum and SAFe and whatever, so… why not this one thing that could improve the productivity a lot?

You're quoting a bunch of things as being there primarily to "create good environments", which, in fact, have nothing to do with that. HR departments are there to prevent the company from getting sued; they're not there to make the work environment better. Similarly, processes like Scrum and SAFe, in my experience, are more about management control than they are about productivity. In practice, what I've seen is that management is willing to leave significant amounts of productivity on the table if it means that they have greater visibility and control over their employees.

If I had to guess as to the reason, I'd pick that one primarily: a crappy on-the-job environment is the price one pays for having control and visibility over what one's employees are doing, and thus avoiding principal-agent problems in the process.

And where does the "predictability trumps productivity" attitude (which I agree is real) come from?

I would guess the managers are not aligned with the company, because they have asymmetrical incentives: a failure is punished more than a success is rewarded. Thus a guaranteed small success is preferable to a non-certain greater success. (Even worse, a great success may increase the expectations.)

Or maybe it is how companies are usually organized. As an example, imagine that you make a product that has three parts: A, B, C, each created by different people with different skills. If the people working on part A get it ready earlier than expected, it does not change anything; they still need to wait for the others. But if the people working on part B get it ready later than expected, the entire product must wait for them. Therefore "not being late" is valuable, but "being early" is not.

Therefore “not being late” is valuable, but “being early” is not.

That's a big part of it. A company is a soft real-time system. As much as developers like to complain about the seemingly nonsensical deadlines, those deadlines are there for a reason. There are other business processes that need to be coordinated, and there is pressure on developer managers, from elsewhere in the company, to provide a date for when the software will be ready.

Like any real-time system, therefore, it's important that things get done in a consistent amount of time. So just like how, in real-time software, I would rather have something take 200 clock cycles consistently, rather than 20 clock cycles most of the time, and 2000 clock cycles when there's an exception, managers will happily enforce processes that waste time, but which allow them the visibility to provide anticipated completion dates and status updates to the rest of the organization.

I agree with the idea in general, but not with its implementations I see.

If making things reliably on time is so important, you could simply hire more people. (Not at the last minute, when you already see that you will miss the deadline; by then it's usually too late.)

In my experience, many software projects are late because the teams are chronically understaffed. If you are meeting deadlines reliably, the managers feel that you have too many people on your team, so they remove one or two. (Maybe in other countries this works differently, I don't know.) Then there is no slack, which means that things like refactoring almost never happen, and when something unexpected happens, either the deadline is missed or, at the very least, everyone is under a lot of stress.

The usual response to this is that hiring more people costs more money. Yes, obviously. But the alternative, sacrificing lots of productivity to achieve greater reliability, also costs money.

Now that I think about it, maybe this is about different levels of management having different incentives. Like, maybe upper management makes the strategic decision to sacrifice productivity for predictability, but then lower management threatens predictability by keeping the teams too small and barely meeting the deadlines, because that is where their bonuses come from? I am just guessing here.

In my experience, many software projects are late because the teams are chronically understaffed. If you are completing deadlines reliably on time, the managers feel that you have too many people on your team, so they remove one or two.

It's interesting that you say that, because my experience (US, large-corporation IT -- think large banks, large retail, 100,000+ total employees) has been the exact opposite. The projects I've been working on have all been quite overstaffed, resulting in poor software architecture, thanks to Conway's Law. When I worked at the major retailer, for example, I genuinely felt that their IT systems would be healthier and projects would be delivered more quickly if they simply fired half the programmers and let the other half get on with writing code rather than Slack messages.

Yeah, it's an interesting question why this isn't happening already. As you point out, there are probably lots of reasons, but my best guess is that the biggest one is the perception that a real professional wouldn't let it impact how hard they work.

I actually experienced this myself. When working on that side project where I was initially using Webpack instead of Vite (the slow build tool that takes 10 seconds instead of 1), my first stab at the calculus of whether it was worth moving to Vite mostly considered how it'd impact my productivity in a first-order sort of way: slower builds mean I might lose my train of thought, which would make me a little less productive. But then it occurred to me that slow builds → frustration → choosing to lie down and watch TV instead of working on the project in the first place. And that certainly impacts productivity, let alone happiness. I think there is something sorta taboo about this thought process though. You're supposed to be this professional who will suck it up and do the job regardless.

My second best guess is that it's a thought that is uncommon enough that it doesn't even occur to people in the first place.