by Leon Rosenshein

Any fool can write code the computer understands.

Any fool can write code that a computer can understand. Good programmers write code that humans can understand.

    – Martin Fowler

I’ve talked about that quote, and about coding for the maintainer, before. I’ve mentioned Martin Fowler even more often. However, almost all of those posts were more than a year ago, and it’s an evergreen topic.

Over the last 40+ years I’ve written code in lots of languages. Various flavors of Ada, assembly, Bash, Basic, C/C++, Cobol, C#, Forth, Fortran, Java, JavaScript, F#, Pascal, Perl, PHP, PL/1, and Python, to name a few. In all of those languages you can write code that, if you don’t touch it for a few months, will leave you wondering what idiot wrote it.

With most of them, though, if you take your time, think things through, and decompose things, it’s relatively easy to write code that at least the author can come back to later and understand what’s going on.

The one language that gave me the most grief, though, was VBA in Excel. And the place it gave me the most grief was the Master Block Tracking (MBT) spreadsheet, back on the Global Ortho project. Imagine a spreadsheet with ~30 tables, the largest of which was roughly 1,500 rows by 75 columns. All to produce a single table that showed what work was in progress, what work was blocked, and what work had been recently finished. It worked, but it was a nightmare.

The reason it was a nightmare wasn’t that it grew organically. It wasn’t that the requirements were unreasonable or unclear. It wasn’t that the language didn’t support what I was trying to do. The reason it was a nightmare was the sunk cost fallacy, and code that humans couldn’t read.

First, and most importantly, we used the wrong tool. We should have dropped Excel and moved to a website. That would have solved the problems of tiering, access control, change management, and making sure everyone had the latest version of things. But we didn’t re-evaluate the goals and work backwards. We just adapted the current system to work.

Second, the original, manually updated MBT had a set of auxiliary tables that the production team would update with the current state of the various processing pipelines. Then the MBT would do all the calculations in a combination of Visual Basic for Applications (VBA) and calculated cells. And we didn’t change that. Instead of stepping back and redesigning things into a properly tiered architecture with a data layer, a business logic layer, and a display layer, everything other than data storage got mashed into the spreadsheet.
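
For illustration only, here’s a rough sketch of what I mean by tiering. It’s hypothetical Python, not anything from the MBT (the file layout, the column names like `block_id`, and the states are all made up), but it shows the shape: the data layer just reads, the business logic lives in one named place, and the display layer only renders a finished table.

```python
import csv

# Data layer: read the raw pipeline state as-is. No logic here.
def load_pipeline_states(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

# Business logic layer: the one place that knows what the states mean.
def block_status(row):
    if row["blocked_on"]:
        return "Blocked"
    if row["finished_date"]:
        return "Recently Finished"
    return "In Progress"

def summarize(rows):
    return [{"block": row["block_id"], "status": block_status(row)}
            for row in rows]

# Display layer: hand a finished table to whatever renders it
# (Excel, a web page, ...) instead of re-deriving status in every cell.
def render(summary):
    for line in summary:
        print(f"{line['block']:<12} {line['status']}")
```

The point isn’t the specific functions. It’s that each layer can be read, tested, and changed on its own.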

Even if we wanted to use Excel as the display engine so that there could be offline analysis, we should have changed the design. We had calculated cells that did lookups on top of lookups and then tried to calculate state. Instead of building intermediate tables based on the business logic and loading them into the spreadsheet, every cell did the logic it needed and calculated the intermediate values locally. When there are 15 (yes, 15) closing parentheses at the end of the equation for a calculated cell, your code is essentially unreadable. And we had lots of different cells with equations like that.
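
To make that concrete, here’s the same kind of logic written both ways, as a toy Python sketch rather than the real spreadsheet formulas (the tables, the `lookup` helper, and the states are all invented). The first version is the calculated-cell style, lookups nested inside lookups inside a conditional. The second stores the intermediate values and gives them names.

```python
# Toy stand-ins for the spreadsheet's tables.
assignments = {"B17": {"pipeline": "ortho-west"}}
pipelines = {"ortho-west": {"stage": "color-balance"}}
stages = {"color-balance": {"blocker": "waiting on imagery"}}
blocks = {"B17": {"finished": ""}}

def lookup(table, key, column):
    """A stand-in for a VLOOKUP-style cell function."""
    return table[key][column]

block_id = "B17"

# Calculated-cell style: lookups nested in lookups, parentheses all the way down.
status = ("Blocked"
          if lookup(stages,
                    lookup(pipelines,
                           lookup(assignments, block_id, "pipeline"),
                           "stage"),
                    "blocker")
          else ("Done" if lookup(blocks, block_id, "finished")
                else "In Progress"))

# Readable style: store the intermediate values and name them.
pipeline = lookup(assignments, block_id, "pipeline")
stage = lookup(pipelines, pipeline, "stage")
blocker = lookup(stages, stage, "blocker")
finished = lookup(blocks, block_id, "finished")

status = "Blocked" if blocker else "Done" if finished else "In Progress"
```

Both versions compute the same thing. Only one of them can still be read six months later.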

Every time we added a state, or changed the definition of a state, or had to handle a new edge case as we re-processed data, I would spend hours reacquainting myself with what I had done and understanding how things worked before I could start making the changes needed. For the last 9 months, I was the only one who could figure out what was going on and how to change things. Let me tell you, being indispensable is NOT a place you want to be. You might be needed for that thing, but you never get to do anything else.

Fred from Scooby-Doo unmasking the villain and seeing himself

Providing a way for our stakeholders to understand, at a glance, how the project was progressing, what they could expect to see released soon, and where the bottlenecks were was essential to finishing the project, and it’s something I’m still very proud of. The code behind that tool, on the other hand, is something I’m not very proud of. Because while the computer could understand what I had written, no one, not even me, could. That’s never a good situation to be in. I spent too many hours maintaining the system. Hours that should have been spent making things better instead of maintaining the status quo.

Now, when I run into a situation like that, I take the time to make things better. To make sure I’m using the right tool for the job. To make things more readable. To make updates easier. So that when I have to go back into something I don’t end up wasting a lot of time figuring out how and why I did something. Smaller functions. Stored intermediate values. Making the implicit explicit. So us poor humans can understand what’s going on.
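
As a small, hypothetical example of what that looks like in practice (the rule and the field names are made up, not anything from a real system), compare one dense condition to the same rule broken into small, named pieces:

```python
# Implicit: the rule is buried in one dense expression.
def needs_attention_inline(block):
    return block["age_days"] > 30 and not block["finished"] and block["owner"] is None

# Explicit: smaller functions, each named for what it means.
def is_stale(block):
    return block["age_days"] > 30

def is_unowned(block):
    return block["owner"] is None

def needs_attention(block):
    return is_stale(block) and not block["finished"] and is_unowned(block)
```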