For an industry that prides itself on its analytical ability and abstract mental processing, we often don't do a great job of applying that mental skill to the most important element of the programmer's tool chest - that is, ourselves.
George Carlin, the late comedian, used to have a great one-liner: “Have you ever noticed that anybody driving slower than you is an idiot...and anyone driving faster than you is a maniac?” (C'mon, now, show of hands if you've ever thought that while driving down the freeway.) It stands to reason, right? If you thought you could drive faster and still be safe, you'd do so, and if you thought you needed to slow down, you would.
It turns out, not surprisingly, that we reflexively assume that our perceptions bear a one-to-one correspondence to reality. In fact, I'll take that one step further: each of us assumes that our own personal perceptions are accurate and objective, even going so far as to assume that our accuracy and objectivity are better than anybody else's.
Don't believe me? Watch me prove it: I'll bet five bucks that any reader of this article will agree with my analysis of their political positions:
"You see yourself as being about as politically liberal as it is reasonable to be. On most issues, you see people who are to the left of you as a bit naïve, as more idealistic than realistic, and overly inclined to political correctness. At the same time, you see those who are to the right of you as rather selfish and uncaring, as somewhat narrow-minded and not fully in touch with the lives that many people live and the problems they face in today's world."
Winner, winner, chicken dinner. Don't believe me? Take a copy of this magazine and make the same bet with any colleague or family member who hasn't read this article. Read them the above paragraph and see if they agree. Better yet, read it to somebody far on the left of the political spectrum and to somebody far on the right, and I'll bet you another five bucks that they both agree.
The secret to the paragraph's success? The same vibe that Carlin tapped into: If you felt that people to the left of you were more attuned to reality than you are, you'd have already moved in their direction, and similarly for people to the right of you. You (and everyone else) see your own political beliefs and leanings as the most realistic response to the specific times in which we live and the particular problems we face.
Want to see that paragraph translated almost directly into a topic that's near and dear to our industry's hearts? Try this on for size:
"You see yourself as being about as eager to adapt to new technologies and platforms as it is reasonable to be. For most new technologies, you see people jumping in ahead of you as being a little too “bleeding edge,” a little too willing to buy in before the technology/platform/device is ready for prime-time or fully debugged. At the same time, you see those who are yet to adopt some of the things you're already using as being a little slow and over-conservative, too concerned with maintaining the status quo and not willing to invest in the future enough to face the problems of modern software development."
Feels a little too close to home, if you ask me.
So… what now?
Naïve Realism
The original paragraph above (about politics) comes from a book describing the field of social psychology. In the book, “The Wisest One in the Room: How You Can Benefit from Social Psychology's Most Powerful Insight” (by Thomas Gilovich and Lee Ross), the authors talk about what they call “naïve realism”: the idea that somehow our own view of the world is the most accurate one that could possibly be obtained. The key thing that most people miss (hence the term naïve) is that each and every one of us holds this exact same viewpoint - and yet, somehow, we all seem to disagree on the particulars to some degree or another.
(Just wait - some reader will email me and disagree with the above paragraph.)
It's relatively easy to accept that not everybody agrees with my conception of the world. It's much, much harder to accept that my reasoning and the rationale behind my conception of the world is no more reasonable or accurate or correct than yours is. This is particularly tricky in a climate where “alternative facts” and “that's just your opinion, man” are widely used as weapons in a war of words; keep in mind that social psychology is not suggesting that we each get our own facts. But we do need to understand that our interpretation of those facts is subject to our own perspective and bias.
One pithy example of this is rooted in the old saying (commonly attributed to Benjamin Disraeli, a British Prime Minister) that “There are three kinds of lies: lies, damned lies, and statistics.” A classic joke told during my political science classes, back in the mid-80s, was that a race was once run between the United States and the USSR as a symbol of peace. Each country entered a single competitor. When the race was over, the American came in first and the Soviet second. The American papers ran the story saying that the American won the race and the Soviet lost. When the Soviet national newspaper, Pravda, ran the story, it said that the Soviet runner came in second and the American came in second-to-last.
Another, more serious, example of this comes from another social-psychological study. A Yale Law professor (Dan Kahan) showed participants in a study pictures of a clash between protestors and police that took place in Cambridge, Massachusetts in 2009. Half of the participants were told that the demonstrators were protesting abortion in front of a health center; the other half were told they were protesting the military's “Don't Ask, Don't Tell” policy in front of a military recruitment center. Given the political leanings of each participant, how many saw protestors unfairly (or “illegally”) blocking access to the building, and how many saw a legitimate exercise of First Amendment rights?
If you've swallowed the notion of naïve realism as an easy predictor of human behavior, you already know the answer: Almost three-quarters of those who self-identified as liberal saw the anti-abortion protestors as unfairly blocking access to the building, whereas when told the building was a military recruitment center, they saw the protest as fair. Among participants who self-identified as conservative, those numbers flipped almost exactly. Our interpretation of a situation is highly dependent on our preconceptions - or, in other words, our understanding is almost entirely a thing of perception, not reality.
Understanding this is beyond important when working with colleagues, clients, and customers; it's an absolute necessity.
Clients and customers represent a very different viewpoint on the project than the one we hold, even when we work for the same company and/or in the same department. Our job, as technologists, is to bring new solutions to life; theirs is often something entirely different. Operations managers and system administrators, for example, need to keep everything running, and any time we introduce something new, something could (and usually does) break. Security needs to prevent any sort of unauthorized access, and the easiest way to do that is to deny any access at all; they're looking at how easily an attacker could break in. (This is particularly easy to understand when you realize that if there is a data breach, it's not the programmers who are held responsible in the press and management meetings. At least not at first.) Users want something that minimizes the difference between what they do now and what they will be responsible for doing in the future. Contrary to popular belief, “easy to use” is not defined by “fewest clicks” or color schemes; it's defined by how closely the software matches the users' own mental model of what their job is supposed to be. In every case, their understanding of your work depends on their own perceptions, not on some objective standard.
Similarly, when a colleague approaches with the idea that “we should rewrite the UI layer in Angular,” it's important for us to understand that they didn't reach this conclusion in ignorance - they came to this thought through a process of their own. That process may or may not be similar to the one you use, but to them, it's the most correct process they know, because if they believed it to be incorrect in some fashion, they'd already have corrected it.
Summary
Working against naïve realism is not just a matter of realizing that everybody else does this; it's also a realization that you do it too. As do I. It's not something that we can change, either. It's just the way our brains are wired. But knowing that we do this gives us the ability to influence how we react to it. When challenged by somebody with a different opinion, it's necessary to assume that they came to that opinion reasonably, and not just because “they're an idiot” or “they're clearly reading the wrong papers or reports.” It's of course possible that they could be, but it's more likely that they read some of the same papers or reports you did and came away with a different interpretation, one based on their preconceptions and history.
Of course, so did you.
So how do we come to some kind of agreement in these situations? I can talk about that in a future column, but the upshot is “patiently, and with a lot of communication.” Have that patience, seek to understand the position the other person is coming from so you can get a sense of what their preconceptions are, and spend time trying to understand where yours come from as well. It's not easy, but in the long run, it's far better than flame wars and trolling.