Monday, June 3, 2013

Cognitive Biases in Software Engineering

Human logic, unlike that of the machines which we program and use every day, isn't perfect.  We make mistakes, we establish bad mental habits, and we have many cognitive biases that negatively impact our ability to be successful engineers.  I want to go over five of the most common biases that I see on a regular basis as a software engineer.

Fundamental Attribution Error
In social psychology, the fundamental attribution error (also known as correspondence bias or attribution effect) describes the tendency to overestimate the effect of disposition or personality and underestimate the effect of the situation in explaining social behavior. (ref)
This is my favorite cognitive bias, because it shows up everywhere.  When someone cuts you off on the road, they are a complete asshole, but when you cut someone off, it is because you didn't see them or because you really need to get to work on time for a meeting.  When someone else writes a bug or takes the site down, it is because they are a negligent, crappy engineer.  If you create a bug, it is because you were tired, there weren't enough automated tests, you were rushed, the requirements were poorly defined, or because it was a full moon.

This is why Etsy puts so much focus on Blameless PostMortems.  The reality is that the "situational aspects of a failure’s mechanism" are where you will find improvements that will meaningfully reduce the odds of failure in the future.  Screaming at people is a demonstrably poor way of making them perform at a higher level.  We are lucky in the software world: if something breaks, we can frequently automate the process or add a test to ensure that it doesn't break in the same way again.  By being aware of this cognitive bias and working to focus on the situation instead of the personality of the human involved, we can reduce the negative impact that this bias has on our lives.

Confirmation Bias
Confirmation bias (also called confirmatory bias or myside bias) is a tendency of people to favor information that confirms their beliefs or hypotheses. (ref)
This bias most commonly occurs in manual testing.  When we write some new code, we are biased towards testing the cases that we know will work.  This allows us to spend a short amount of time testing (because everyone hates manual testing) and then proudly declare that "it works!".  This bias can result in poorly tested code, and the miserable practice of throwing code over the wall and having other people clean up the mess.

This is one of the harder biases to get over in my opinion, because it means acknowledging our own limitations, and really stressing the fragile parts of the code that we write.  We all want and expect our software to work, so we are inescapably drawn to evidence that confirms this desire.  Keep fighting this urge, keep testing, and always question your assumptions.
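One concrete way to fight the urge is to write down the inputs you half-suspect will break *before* you start testing, and test those first.  As a sketch (parse_price is a made-up helper, not real code from any project):

```python
def parse_price(text):
    """Parse a user-entered price string like "$1,234.56" into cents.

    A made-up helper, here only to illustrate the testing point.
    """
    cleaned = text.strip().lstrip("$").replace(",", "")
    return round(float(cleaned) * 100)

# The happy-path test we are all biased towards writing:
assert parse_price("$19.99") == 1999

# The tests confirmation bias tempts us to skip -- the inputs we
# half-suspect will break, which is exactly why they matter most:
assert parse_price("  $1,234.56 ") == 123456
assert parse_price("0") == 0
for bad in ["", "abc", "1.2.3"]:
    try:
        parse_price(bad)
        assert False, f"expected {bad!r} to raise"
    except ValueError:
        pass
```

The happy-path assert passes on the first try and feels great; the loop at the bottom is where the actual information is.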

The Bandwagon Effect
The bandwagon effect asserts that conduct or beliefs spread among people, as fads and trends clearly do, with the probability of any individual adopting it increasing with the proportion who have already done so. (ref)
The most common area where I have seen this is when evaluating third party software.  I've actually walked into meetings where I thought everyone had decided not to purchase a given tool, and then walked out planning to sign a contract.  Once one person who is well respected outlines their opinion, it's appealing for everyone else to hop on board.  There are various strategies to combat this urge, and things like blind voting can work well in the right situation.  If you have a big decision to make (like spending six figures on a third party tool), I think it makes sense to have all of the stakeholders write out their opinions in private first, without having their position influenced by other (potentially more senior) people.

Hyperbolic Discounting
Given two similar rewards, humans show a preference for one that arrives sooner rather than later. Humans are said to discount the value of the later reward, by a factor that increases with the length of the delay. (ref)
Stated simply, this means that something less valuable today is more appealing than something more valuable that will come in the future.  This is frequently how technical debt is incurred.  Poor design decisions and shortcuts are sexy because they give you a small amount of value right now (not having to do the work to architect things properly), and you dramatically discount the value you would get in the future by doing it right the first time.  It is very difficult for humans to project the long term cost of shoddy code in terms of the time we will spend debugging, refactoring, and interpreting it.  I'm sure we've all experienced this: as you get into a project you see a better way to do things, but your mind resists spending the extra few hours it would take to refactor, even when you know that it will save many times that in the long run.
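For the curious, the standard hyperbolic discount function from the psychology literature is V = A / (1 + kD), where A is the reward, D is the delay, and k is a discount rate.  A tiny sketch of why the quick hack feels so tempting (the hours and the k values here are made up purely for illustration):

```python
def discounted_value(amount, delay, k=0.1):
    """Hyperbolic discounting: V = A / (1 + k * D).

    `amount` is the undiscounted reward, `delay` is how far away it is,
    and `k` is an illustrative discount rate, not an empirical value.
    """
    return amount / (1 + k * delay)

# A quick hack saves 2 hours today; doing it right saves 20 hours,
# but that payoff is spread months into the future.
hack_now = discounted_value(2, delay=0)       # 2.0 -- no discount
right_later = discounted_value(20, delay=52)  # ~3.2 after a year's delay

# The rational comparison is 2 vs 20 hours, but the discounted
# comparison (2.0 vs ~3.2) is much closer -- and with a steeper
# discount rate, the hack "wins" outright.
assert hack_now < right_later
assert discounted_value(20, delay=52, k=1.0) < hack_now
```

The point isn't the specific numbers; it's that the discounting happens in your head automatically, which is why "do it right" so often loses arguments it should win.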

In my opinion the way around this is to deliberately stop and do an estimation exercise.  First think about how long the refactor will take, and be extremely generous (e.g. double your first estimate).  Then think about how many people will be working on the code, and how often.  Given really conservative estimates for how much incremental technical debt and confusion you are creating, figure out how long it would take to "pay off" the time you spend refactoring.  Typically this exercise throws things into stark relief, and it becomes obvious which route you should take.  If you are still on the fence, your decision should be based on the type of project you are working on.  At larger established companies, do the refactor.  At a startup that is trying to release an MVP, maybe do the hack (your code will probably be rewritten anyway).  That said, always think of who else will be working on your code...
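The exercise above is just back-of-the-envelope arithmetic, something like this (every number here is hypothetical):

```python
# Break-even estimate for a refactor.  All inputs are hypothetical.
refactor_hours = 8 * 2        # first estimate, doubled to be generous
engineers = 3                 # people regularly touching this code
debt_hours_per_week = 1.5     # conservative drag per engineer per week

weekly_cost = engineers * debt_hours_per_week
break_even_weeks = refactor_hours / weekly_cost

print(f"Refactor pays for itself in ~{break_even_weeks:.1f} weeks")
# -> Refactor pays for itself in ~3.6 weeks
```

If the break-even point is measured in weeks and the code will live for years, the answer is obvious; if it's measured in years at a startup racing to an MVP, maybe do the hack.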

Negativity Bias
Negativity bias is the psychological phenomenon by which humans pay more attention to and give more weight to negative rather than positive experiences or other kinds of information. (ref)
Have you ever worked with someone who always seems to be shooting down ideas, yet commands a lot of respect and seems extremely knowledgeable?  Are you this person?  I know I have been at times.  The negativity bias is powerful; we are all hardwired to listen to the naysayers.  This is part of the reason why even topics with overwhelming evidence in their favor (like global warming) still have fervent deniers.  In software engineering, this often occurs when product managers or designers describe a feature that is fairly difficult to implement.  Engineers will frequently push back, explaining why the goals are impossible to achieve.  Because of the negativity bias we all have, these people can come across as intelligent, mature, and even seem to be saving the company time and money.  It is often the right decision to say no to things, but if you find someone doing this on a regular basis, take a step back and ask yourself whether their arguments have merit or whether you are simply falling prey to this cognitive trap.


For all of these biases, the most important step is to recognize that they exist, and try to identify them when they appear in your day to day life.  Once you do that, just take a step back and think about the situation.  It can even help to state the bias out loud - especially if other people are involved.  By bringing up the idea that you might be falling into a common cognitive trap, you can refocus the conversation and clear your mind.

If you like learning about rationality, cognitive bias, and how to overcome the limitations of our brains, check out the blog LessWrong, and go read Harry Potter and the Methods of Rationality (seriously, just do it).