The cognitive biases that influence product development decisions
Whenever people tell you “you’re biased,” your immediate reaction might be to take offense. “How dare they,” you might think to yourself. “I am not biased, and I pay particular attention to being fair, open, and honest about everything.”
However, the reality is that you probably are biased, because ultimately we all are. The important thing is to recognize this and adjust for it.
In general, a bias is something that places a disproportionate weight for or against a particular idea or thing. It can be inherent or cultivated, and it usually leads an individual to behave in a way that is closed-minded, prejudicial, or unfair.
The concept of cognitive bias was first introduced by researchers Amos Tversky and Daniel Kahneman in 1972. Since then, researchers have described a number of different biases that affect decision-making in a wide range of areas. In this post, we’ll look at a few that can heavily influence the delivery of a product.
Whether we’re doing product development, deciding on a system’s architecture, or just writing code, our decisions are shaped by our biases. Let’s consider the main ones in this article.
Anchoring bias
Or: Why we rely on the first piece of information we receive
This bias relates to our tendency to rely far too heavily on the very first piece of information we learn.
For example, suppose you have never been in a coffee shop before, and the first time you walk into a Starbucks you learn that a tall skinny vanilla latte costs $3.45. If the second coffee shop you visit is Courtney’s Cafe in the suburbs, and on checking the menu you see that their tall skinny vanilla latte costs $2.50, you’ll immediately think that Courtney’s Cafe offers a better deal on your coffee, simply because you’re comparing it to the first piece of information you learned: that tall skinny vanilla lattes typically cost $3.45.
The cause of anchoring bias is still being studied, but the reasoning behind it seems to vary slightly depending on the source of the initial anchoring value.
Tversky and Kahneman’s original observation of anchoring bias was based on research into individuals setting an initial value and then adjusting from that point. Other anchors, in contrast, can be set by prior related knowledge (selective accessibility) or even the participant’s mood.
Whether the context is determining a project’s schedule or anticipating a new feature’s impact, this bias can skew our product perspective and cause us to hold onto a particular value, even when doing so is objectively irrational.
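To see how strong that pull can be, here’s a toy simulation in Python. All the numbers are made up for illustration; the anchor of 3 days and the “anchor weight” are assumptions of this sketch, not values from the research. It models a team estimating a task: each person has a private estimate, but once an initial figure is spoken aloud, every stated estimate drifts part of the way toward it.

```python
import random
from statistics import mean

random.seed(42)  # reproducible, since the numbers are purely illustrative

ANCHOR = 3.0         # the first estimate spoken aloud, in days (made up)
ANCHOR_WEIGHT = 0.4  # how strongly estimates drift toward the anchor (assumed)

# Each estimator's private, unanchored estimate of the task, in days.
private = [random.uniform(5, 9) for _ in range(10)]

# A simple model of anchoring: each stated estimate is pulled part of
# the way from the private estimate toward the anchor.
anchored = [e + ANCHOR_WEIGHT * (ANCHOR - e) for e in private]

print(f"Unanchored mean: {mean(private):.1f} days")
print(f"Anchored mean:   {mean(anchored):.1f} days")
```

Even with a modest weight, the anchor pulls the group’s average estimate down noticeably, which is one reason estimation techniques such as planning poker ask everyone to reveal their estimates simultaneously, before any number has been said out loud.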
Fight anchoring bias: One approach to combating this bias is to sit down and work through all the reasons why the anchor value might be inappropriate. Research has shown that this weakens the effect of the original anchor.
Functional fixedness
Or: Why we put people into pigeonholes
This bias relates to our tendency to see objects or things as working only in a particular way.
For example, you have a user experience issue to solve, but you don’t think to approach the software developer sitting opposite you because you know their specialty is API development.
This bias was first identified by the German psychologist Karl Duncker in 1945 and demonstrated using ‘the candle problem’.
In his research, he presented participants with a candle, a box of thumb-tacks, and a book of matches, and tasked them with attaching the candle to the wall so that it could light the room, without using any other items. For most participants, the journey to a solution started with an attempt to thumb-tack the candle to the wall: thumb-tacks are used to attach things to walls, and the thing that needs attaching is the candle.
Only by dispensing with the effect of functional fixedness can you find the solution: light a match, melt the bottom of the candle to fix it inside the box the thumb-tacks came in, and then tack the box to the wall.
Functional fixedness occurs because, to make sense of the world, our minds build mental models of how things fit together and operate. In many situations this is beneficial, as it lets us take shortcuts through much of life. However, when it comes to problem solving, which much of product development is, these fixed models limit our thinking and, therefore, our solutions.
Fight functional fixedness: One approach to combating the bias is to abstract the specific problem from its environment and seek inspiration from different sources, whether people or places.
The Dunning-Kruger effect
Or: Why we should take care when building teams
This bias relates to the tendency of individuals with lower ability, expertise, or experience in a particular subject to overestimate their ability or knowledge in that area.
For example, if you ask your team to complete a quiz about your website traffic and then ask them to estimate how they performed, those with limited knowledge of site activity will overestimate their performance.
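As a purely illustrative sketch (synthetic data, not study results), the Python snippet below models that quiz. The assumption here is that self-assessments regress toward an inflated midpoint, so the lowest scorers overestimate the most; the midpoint of 70 and the blend weight are made up for the example.

```python
import random
from statistics import mean

random.seed(7)  # reproducible synthetic data

# Synthetic actual quiz scores (0-100) for a team of 40 people.
actual = [random.uniform(10, 95) for _ in range(40)]

# A crude model of the effect: self-assessments regress toward an
# inflated midpoint of 70, so the lowest scorers overestimate the most.
self_assessed = [0.4 * score + 0.6 * 70 for score in actual]

# Average self-assessment error within each quartile of actual performance.
pairs = sorted(zip(actual, self_assessed))
quarter = len(pairs) // 4
for i, label in enumerate(["bottom", "second", "third", "top"]):
    chunk = pairs[i * quarter:(i + 1) * quarter]
    gap = mean(s - a for a, s in chunk)
    print(f"{label:>6} quartile: self-assessment off by {gap:+.1f} points")
```

Under this toy model, the bottom quartile overestimates heavily while the top quartile slightly underestimates, mirroring the shape of the pattern Dunning and Kruger reported.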
The effect is named after the social psychologists David Dunning and Justin Kruger, who described it in their 1999 article entitled “Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments”.
This bias is even more pronounced outside product environments. It’s why you’ll always find one person at a party willing to share their opinion on a subject like local politics while being the least likely to understand how local politics works, and why the early audition rounds of a talent show feature individuals shouting back at the judges about their inability to identify talent.
This bias comes about because ‘incompetent’ people tend to overestimate their own skills, fail to recognize the genuine skill of others, and fail to recognize their own mistakes.
As with all the biases mentioned, we are all susceptible to the Dunning-Kruger effect, but don’t worry: understanding it helps you know which voices in the room to listen to, when you should speak up, and when you could leave the talking to the experts.
Fight Dunning-Kruger: The simple approach to fighting this bias is to increase your knowledge and experience in the subject matter. As you do, your confidence will settle at more realistic levels rather than remaining overinflated.
And some other biases you might encounter
- Actor-observer bias: the tendency to attribute your own actions to external causes while attributing other people’s behaviors to internal causes. For example, you didn’t meet a deadline because others pushed more work onto you, while others didn’t meet their deadlines because of their character.
- Attentional bias: the tendency to pay attention to some things while ignoring others. For example, when deciding which feature to prioritize, you push for a change in an area you know well while ignoring changes in areas you don’t, even if they might offer greater returns.
- Availability heuristic: the tendency to place greater value on information that comes to mind quickly. For example, if you’re asked about the biggest challenge you’re facing, you’re more likely to recall the most recent challenge rather than one that caused a bigger headache.
- Confirmation bias: the tendency to favor information that conforms to your existing beliefs and discount information that doesn’t. For example, you love all supporters of your own sports team and dislike those of your rival, despite not knowing them.
- False consensus effect: the tendency to overestimate how much other people agree with you. For example, when asked who agreed with your recently suggested feature improvement, you’ll believe it was more people than was actually the case.
- Misinformation effect: the tendency for post-event information to interfere with your memory of the original event. For example, in a team retrospective, you may come to feel strongly about a particular problem after multiple other participants highlight their struggles with it, even if you didn’t encounter the same struggles yourself.
- Self-serving bias: the tendency to blame external forces when bad things happen and to give yourself credit when good things happen. For example, when your customer numbers jump, it’s because of a new feature you introduced, but when they fall, it’s because of the economy.
We’re all biased
After all this, the lesson is that we’re all biased in some way or another, and the first step to addressing this is to acknowledge it. Once you’ve acknowledged it, you have the opportunity to take steps to minimize its effects and move towards a better solution to the problem you’re trying to solve.