As I sit down to write this next CODA installment, the January/February 2023 issue is available for consumption, as are the events incident to Southwest Airlines and the FAA. In the printed physical magazine world, content must be assembled in a layout long in advance of the publication date. That's how we ensure a quality CODE Magazine product. Our magazine world is not so different from the software world, where we must also take care as we proceed along the development journey for a given project.

The goal for software is the same as with the magazine: to deliver a quality product. Quality isn't some abstract thing in this case; to make any determination regarding quality, it must be measurable. Quality is real. The quality of the software we build is entirely dependent on how we build it.

The recent events at Southwest Airlines and the FAA, and the not-so-distant events with the Boeing 737 Max-8, require me to look at this topic anew. Collectively, we understand what quality is, yet when confronted with scenarios that require attention to establish, maintain, and enhance quality, we're not interested in investing in the supporting infrastructure. Nevertheless, quality is desired because of the benefits that are incident to it. It isn't very responsible to desire the benefits of quality without doing the things necessary to earn a positive quality designation. Whether something is of good quality is in the eye of the beholder. Quality is a report card. It's a judgment, and we can't be the judges of our own work. That's what makes it a good measure: it can't be targeted. At its most abstract, quality is simply the result of something else.

Goals are important. They help a group focus on some common thing. But that isn't enough. We can't just focus on the end; how we get there matters because it has a knock-on effect on what we end up with. It may very well be that instead of focusing on quality itself, we should focus on those things we have control over that, when practiced, tend to make quality enhancement more likely, not less. That's what taking care is about.

With all the bad news recently about Southwest Airlines and the FAA and their antiquated manual processes, the first thing I thought of was technical debt. I hold to the notion that technical debt may be quantified as the hard financial cost of every irresponsible decision an IT organization makes. Technical debt is an insidious thing because it's invisible. It isn't recorded in the financials in the sense that debt is usually referenced. Nevertheless, you know that technical debt exists. It exists in the corpus of our software, in how it was designed, built, tested, and deployed. To the extent that our existing platform isn't receptive to changes reflecting the innovations that competitive, top players in an industry implement, technical debt exists. One way to quantify technical debt is to equate it to the annual EBITA from such foregone opportunities. The implication is that every year, an organization accrues more technical debt, even if it hasn't spent another cent on that project. Technical debt is about opportunity costs. Organizations that squander corporate opportunities are irresponsible.
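As a back-of-the-envelope illustration of that opportunity-cost view (the function and dollar figures below are mine and purely hypothetical, not taken from any real company's financials):

```python
def accrued_technical_debt(annual_foregone_ebita: float, years: int) -> float:
    """Total opportunity cost after `years`, assuming a constant annual
    EBITA lost to opportunities the aging platform couldn't support.
    This is a toy model: real foregone earnings vary year to year."""
    return annual_foregone_ebita * years

# A modernization the platform couldn't absorb, estimated (hypothetically)
# to be worth $2M/year in EBITA, deferred for 5 years:
print(accrued_technical_debt(2_000_000, 5))  # 10000000
```

The point of the sketch is the shape of the curve, not the numbers: the debt keeps growing with each passing year even when nothing more is spent on the project.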

That's the reason I decided to scour the CODE Magazine archives to get a sense of what's been written before in these pages. I encountered some of my previous work and that of others. One article that stuck out was from my back-page predecessor Ted Neward. In his Managed Coder column from May/June 2016, On Responsibility, he raised the following assertion:

For an industry that prides itself on its analytical ability and abstract mental processing, we often don't do a great job applying that mental skill to the most important element of the programmer's tool chest - ourselves.

As insightful as that quote was seven years ago, it's even more so now. Later in his column, Ted called out the issues of ethics, the law, and liability. Although I've generally written about those things before, the new Southwest Airlines and FAA context, in conjunction with this once, present, and perhaps future topic of responsibility, makes them bear repeating in a new way, with the basic question of how we can improve our condition. How do we become more responsible? Perhaps it's a matter of first learning how to be responsible. And before that, we must each be willing to accept responsibility and be accountable: first to ourselves, and then to the team and the organization. It's important to remember that “we” is just a group of individual “I”s.

The difficulty with large systems that have existed for a long time (like Southwest's) is that many, many people, past, present, and future, have had or will have an impact on those systems. Why and how things are as they are isn't nearly as important as the fact that they are this way and our collective response to that fact. That collective response is just an aggregation of individual responses from the variety of professionals and disciplines brought to bear on a solution. At a fundamental level, those individual responses are best served when they're in sync with some stated overarching principles. Think of these principles as a compass or North Star to serve as a guide.

What if such stated principles don't exist, or exist but aren't known? That's why the most important tool in the programmer's toolbox is ourselves. We're the most important tool because we each possess the ability to act appropriately at the “last responsible moment.” Whether or not we act appropriately at the last responsible moment is another question. If it seems in your projects that you're always reacting and putting out fires, this applies to you. To be responsible, we must be responsible about being responsible.

Another twist on that is that serious people treat serious matters seriously. This is where rigorous honesty must be embraced. But that's a two-way street, because it isn't in our nature to dredge up past transgressions that may be used against us! Therefore, to be responsible about putting people on the path to responsibility requires a culture of transparency. This is the essence of what continuous improvement is all about. Such efforts require egos to be checked at the door, because the serious fact is this: The work we do matters. It matters to other people because they depend on our technical work. Supporting such a culture requires the buy-in and trust of senior leadership. That same senior leadership needs to finally listen to the technical staff that has been trying to tell them the facts for years. Ask yourself if you would accept shoddy work from a contractor in your home. Of course you wouldn't. Nobody would or should. But we collectively tolerate it in our projects. Responsibility requires oversight and leadership. Consider that Southwest Airlines and the FAA have different perspectives but much in common.

Let's take the FAA scenario first. That situation, based on news reports, appears to be an oversight/contractor scenario. As a public agency, the FAA will need to be transparent about whatever after-action report is published. The point here is that although the C-suite may find it beneficial to financially and legally organize around external, non-co-located entities rather than employees, that doesn't absolve executive management from its core oversight duties and responsibilities. What often happens is a slow depletion of what I refer to as “organizational knowledge.” Such knowledge is, at the very least, an understanding of how things operate. When we staff things out, over time, organizations often defer that direct knowledge to the external entities. Once organizations start to lose their grip on how their technical infrastructure operates, they become captive to those external entities. It's the height of irresponsibility when management causes an organization to lose its internal understanding of IT matters. Despite the difference in organizations, the FAA and Southwest situations are similar as to technical debt and the damage that scourge does to an organization. The organizational differences illustrate the different paths technical debt accrual can take.

Southwest Airlines, based on news accounts, is a good example of a set of processes that can work, but not at scale. Resilience in the face of many variables is a scaling characteristic. In general, the weakness of large systems with manual components is that they can't be easily or feasibly simulated. The system and its components have evolved over time. It's simply “The System,” with thousands and thousands of components, software, processes, people, and all the interdependencies therein. No one person or group of people is likely to know how it all works. As for the right mix of events that will cause the system to melt down, there may be theories as to what they are, but until you're in it or very close to it, you won't know for sure. Therefore, you're forced to react, not respond.

It doesn't matter much why past choices were made. The harms are known, and those harms can be proximately traced to the things that give rise to technical debt. News accounts appear to confirm that, for years, there were numerous warnings from staff. Good leadership involves listening to your people instead of kicking the can down the road. At some point, folks just need the wherewithal to set about fixing issues that have been around for years, without fear or favor. Because that's what taking care is all about: not causing harm.

At the outset, those kinds of value decisions are made within ourselves first. How we make those decisions, specifically what we choose to prioritize or ignore, is our internal process. That process is driven by an internal code. Such codes are often informed by, or at least should be informed by, externalities, including applicable rules, regulations, and codes of ethics, a concept that Ted addresses well in his column. If yours is a public company, then every year, under Sarbanes-Oxley, the SEC requires it to disclose whether it has a code of ethics and whether any waivers to that code have been approved by the board of directors.

These governance tasks matter. Well-run organizations are managed by serious and responsible people who take care to “honor the deal.” To borrow Rod Paddock's phrase about what companies profess to do versus what they actually do: is “a deal a deal?” Consider Southwest Airlines' published code of ethics as of 8/1/2018, paying special attention to sections like conflicts of interest and corporate opportunities. When we collectively and individually act in ways that restrict the organization from the opportunities it should be able to avail itself of through innovation, we're not acting in the organization's best interest, and therefore we're not acting responsibly. We likely didn't affirmatively decide to violate the rules, but actions that do just that make intent irrelevant, because the damage will be done!

There may in fact be real legal jeopardy as more of these issues arise. And more of these issues will arise; one example is the recently filed Southwest shareholder class-action lawsuit. I foresee a day when, depending on the harm, we may very well see more criminal and civil sanctions levied on persons and organizations. The law provides for it; it just needs to be enforced. It's the consumers, the public, the very folks who either use the technology we build or are affected by it, who will eventually demand legislation. People demand accountability, and the bill eventually comes due. Responsible management keeps that in mind.

Where does all that begin? As Ted eloquently wrote seven years ago in these pages, it begins with ourselves, the most important tool in our toolbox. We need to take care that we remember to invest in ourselves so that we're best situated to take on responsibility, not only for ourselves, but on behalf of others. And to make it all work, others need to do for us as well.

Nothing substantial in our history has occurred without cooperation. A lot of that history isn't good, of course, but that same force of cooperation can achieve a different, good result. It's within all of us to decide what path we each will take. This is something you can't ask ChatGPT to provide an answer for. Whatever group you're part of, whatever objective that group collectively works toward, you all need to be aligned, even if that means rejecting the objective! The opposite of cooperation is opposition. If you have opposing forces within your group, they'll cancel each other out. Your organization will incur the expense, with not only no benefit, but likely an impairment to existing systems. In other words, your organization spent money to be in a worse position. Put another way (and I give the credit to my superstar editor Melanie for these words that sum it up perfectly): “We all play nicely together or we all hang together.”