Writing software is hard, particularly when the schedules keep programmers “nose to the grindstone”; every so often, it's important to take a breather, look around at the world, and discover what we can find. Ironically, what we find can often help us write better software.

Philosophy doesn't just question the imponderables of the universe; sometimes it shows us the limitations of our own programming-trained minds, and leaves us to question the approaches we take in building software for the rest of the human race.

Let's take an easy question for a moment: what do you call a man with no hair on his head? Bald. (You were expecting a punchline, weren't you? Admit it.)

Baldness, for all of its negative qualities involving aging and the diminishing of the male ego, is intrinsically a pretty simple concept: when a man (or a woman, though that's a far less common case) loses the hair on their head, they are considered bald, and there's not much more to it.

So exactly when does a head become bald? If I pluck a single hair out of my head, does that make me bald? Most would say no. If I pluck a second hair, then a third, and then a fourth? Still no. In fact, I can probably pluck hundreds of hairs from my head and still not be considered bald. After all, the typical human head sprouts a hundred thousand or so strands of hair (which, in my case, generally form an unruly mass that defies most sorts of organization other than the ponytail or the crew cut).

But at some point, if I continue to pluck hairs from my head, some kind of interesting threshold is crossed, and suddenly one day my youngest son looks up at me and says, “Daddy, you don't have any more hair on your head!” I'm bald. Or I'm said to have a receding hairline. Or I have a bald spot. Or…

Suddenly, what was once a binary quantity isn't.

The Sorites Paradox

In philosophy, the above is given the name the “Sorites paradox”: the idea that there is some unmeasurable threshold along a continuum that suddenly transforms an entity from one condition into another. In other words, we can point to men who are bald, and men who are not, but we can't precisely define how many hairs have to be lost before a man can be called bald, or even what percentage of his hair must be lost before he is bald, and so on.

This paradox isn't quite as ridiculous to contemplate as you might think. Consider the canonical human-resources system, which sometimes wants to capture employee racial information (purely voluntarily, of course) for the purpose of identifying the company as a “minority-owned” or “minority-operated” firm, whether for PR, insurance, or state or federal categorical qualifications. The US Census likewise tracks voluntarily reported racial breakdowns, as a means of identifying trends within the population and so forth.

And here again we run into the paradox: if you'll pardon the (perhaps) crude way of putting this, how “black” do you have to be in order to be “black”? If one of your parents was white, are you still “black”? If one of your grandparents was black, and the rest white/Caucasian, are you still “black”? What if one of your great-grandparents was Asian in an otherwise entirely European lineage? Can you still claim to be Asian? My own grandfather came over from Poland when he was a small child, and his racial legacy is the strongest I can claim, since the rest of my grandparents are best described as “European mutt.” Does that make me Polish?

Leaving the politics of race alone for a moment, this presents a distressingly difficult problem to the system modeler, because somehow we have to capture this unquantifiable quality in a quantifiable manner. And we have to use ones and zeroes to do it.

The Fallacy of the Binary

In programming, we make decisions based on binary logic; the “if/else” construct stands at the heart of almost all decision-making capabilities in every programming language we use. Even more complex constructs, like switch/case in C#, the try/catch/finally exception handling we see in any .NET language, or even “On Error GoTo” from way back in the very old Visual Basic days, fundamentally have some kind of if/else at their heart: if an exception is thrown, jump to this location and start a series of if/else tests. If the exception is of this type; else, if the exception is of this other type; else, if the exception is of this third type….
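
To make that concrete, here's a minimal sketch in C#. The exception types are standard .NET types, but TransferFunds and the handler bodies are purely illustrative; the point is that the catch clauses amount to a chain of type-driven if/else tests.

    using System;

    class Sketch
    {
        static void Main()
        {
            try
            {
                TransferFunds("checking", "savings", -100m);  // illustrative call; may throw
            }
            catch (ArgumentException ex)             // if (thrown is ArgumentException) ...
            {
                Console.WriteLine("Bad input: " + ex.Message);
            }
            catch (InvalidOperationException ex)     // else if (thrown is InvalidOperationException) ...
            {
                Console.WriteLine("Bad state: " + ex.Message);
            }
            catch (Exception ex)                     // else: the catch-all
            {
                Console.WriteLine("Something else: " + ex.Message);
            }
        }

        // Hypothetical operation, standing in for any code that might throw.
        static void TransferFunds(string from, string to, decimal amount)
        {
            if (amount <= 0)
                throw new ArgumentException("amount must be positive");
        }
    }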

This love of the binary carries over into the constructs we build, too. Consider the user-interface widgets we use: the ubiquitous checkbox and its cousin, the radio button, both represent binary selections: either the checkbox is “selected” or it's not. The radio button is almost always collected into a group of other choices, only one of which is “on,” the rest by design being “off.”

(Does anybody remember the three-state checkbox from back in the Windows 3.0/3.1 days? A checkbox could be “checked,” with an X in the box; “unchecked,” where the box was empty; or the third state, “indeterminate,” where the box was shaded gray. Anybody ever find a good use for it?)
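
As it turns out, the three-state checkbox never entirely went away; Windows Forms still supports it. A quick sketch, with the handler logic being illustrative:

    using System.Windows.Forms;

    var consent = new CheckBox { Text = "Opt in?", ThreeState = true };
    consent.CheckState = CheckState.Indeterminate;    // "we don't know yet"

    consent.CheckStateChanged += (sender, e) =>
    {
        switch (consent.CheckState)
        {
            case CheckState.Checked:       /* definitely yes */     break;
            case CheckState.Unchecked:     /* definitely no */      break;
            case CheckState.Indeterminate: /* still undetermined */ break;
        }
    };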

Few things in life are actually binary, yet we as developers and architects often force-fit them to be that way. Consider the notion of gender: you're either male or female, right? All issues and discussions of sexual preference and gender identity aside, physically, a human being is either one or the other, labeled according to the body's plumbing. Despite this apparently obvious binary modeling situation, nature has a different story to tell. Some people are born with a genetic condition that renders them, chromosomally, equally male and female at birth. Fetuses that haven't reached a particular stage of development aren't really either gender (yet). And in some systems (such as software to help track bodies at the coroner's office, where a former neighbor of mine works), gender simply can't be determined right away.
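
One hedged way out, as a sketch (the type and member names here are my own invention, not any standard): resist the Boolean, and make “we don't know yet” and “neither/both” first-class values in the model.

    // Illustrative model only; the names are assumptions, not a standard.
    public enum Sex
    {
        Unknown,        // the coroner's-office case: not yet determined
        Male,
        Female,
        Indeterminate   // chromosomally ambiguous at birth
    }

    public class Person
    {
        // A bool ("IsMale"?) would force the binary choice up front;
        // the enum leaves room for the cases nature actually produces.
        public Sex Sex { get; set; } = Sex.Unknown;
    }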

Or consider the notion of “transactionally guaranteed.” For years, if not decades, software developers have held up your bank account as the canonical transactional system, pointing out that “when you transfer money from one account to another, there'd better not be any money lost; it either all goes through, or it doesn't.” And yet this isn't really true in real life, either. As the NoSQL movement likes to put it, bank accounting is more on the order of “eventually consistent,” meaning that given enough time, all the credits and debits line up correctly. Eventually.

After all, if bank accounts were transactionally accurate, then as soon as I wrote a check, my balance online would reflect the new amount. Or, for those who care to discount that analog mechanism as anachronistic: when I deposit a check, regardless of amount, the funds would be available for use immediately, rather than tied up in a five-day hold to verify their source.

Sorites moves in strange and mysterious ways.

The lesson for the software developer? The next time you start reaching for an either/or representation, stop to consider the “other.” Consider giving up the “if/else” and replacing it with a “switch,” just to make sure you leave yourself open to the possibility.
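
As a closing sketch, here's what that looks like against the illustrative Person model from earlier (the Print and Handle methods are hypothetical): the default case forces you to decide, explicitly, what happens to everything that isn't one of the two “obvious” values.

    switch (person.Sex)
    {
        case Sex.Male:
            PrintMaleForm(person);      // hypothetical handler
            break;
        case Sex.Female:
            PrintFemaleForm(person);    // hypothetical handler
            break;
        default:
            // The "other": the cases an if/else would have silently
            // folded into one of the two branches above.
            HandleIndeterminate(person);
            break;
    }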

For those interested in a light introduction to the paradox, I highly recommend the book 50 Philosophy Ideas You Really Need to Know by Ben Dupré.