Great product teams understand the importance of dissenting feedback. However, a team might drive toward an inferior product because it fixates on a few pieces of positive feedback, or a persuasive product lead might seek out data to support her pet feature. Moving quickly often requires trusting our intuition, but what is the difference between gut instinct and bias?
5. 2 PIZZA TEAMS
If you need more than two pizzas to feed the team, the team is too big
Allison Abbott & Emma Sagan
6. CROSS-FUNCTIONAL TEAMS
The make-up of the team changes throughout the phases of the project, depending on its needs
PRODUCT MANAGER | DESIGNER | DEVELOPER | DATA SCIENCE/MARKETING
29. WHEN SYNTHESIZING FEEDBACK…
SELECTIVE ATTENTION
• push yourself to listen to user feedback again with a different lens
• ensure that multiple people listen to an interview, because chances are you’ll all have different variations of selective attention
36. WHEN CONDUCTING INTERVIEWS…
OBSERVER/EXPECTANCY EFFECT
• check yourself: is your body language biasing your results?
• consider double-blind studies (although these are typically used in academia, not industry)
43. PRIMACY EFFECT
Seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information
47. WHEN SYNTHESIZING FEEDBACK…
PRIMACY EFFECT
• check yourself: are you comparing all new pieces of information to your first set of feedback?
53. WHEN SIZING A PAIN-POINT…
FREQUENCY ILLUSION
• ask yourself: have you stumbled upon something huge? (maybe, maybe not)
• be aware that just because “everyone” suddenly has this problem doesn’t mean that “everyone” is actually plagued by it (go seek quantitative analysis!)
60. WHEN SYNTHESIZING FEEDBACK…
AVAILABILITY HEURISTICS
• make note of which pieces of feedback are most sensational or memorable, and be sure not to give those pieces more weight than they deserve
62. “The best way to predict the future is to create it” – Peter Drucker
64. PRO-INNOVATION BIAS
The tendency to have excessive optimism towards an invention or innovation’s usefulness throughout society, while often failing to identify its limitations or weaknesses
65. WHEN DECIDING IF YOUR SOLUTION IS BIG ENOUGH…
PRO-INNOVATION BIAS
• be optimistic, but realistic
• ask yourself which aspects of your solution still need improving
66. OBSERVER EFFECT | ACTIONS SPEAK LOUDER THAN WORDS
How you (unknowingly) bias your experiment
PRIMACY EFFECT | FIRST COME, FIRST SERVED
Your brain latches onto the first information
FREQUENCY ILLUSION | EVERYONE’S IN A RELATIONSHIP EXCEPT ME
When you learn something new, suddenly it’s everywhere
AVAILABILITY HEURISTICS | WILL YOU BE ATTACKED BY A SHARK?
Mental shortcuts to stay efficient
SYSTEM 1 & 2 PROCESSING | THINKING FAST & SLOW
Theory of how your brain works
SELECTIVE ATTENTION | DANCING BEARS
How you prioritize and process stimuli
PRO-INNOVATION BIAS | NOT EVERYTHING IS GAME-CHANGING
Optimism without reality
75. 6 THINKING HATS (Edward de Bono)
White hat: neutral and objective view
Red hat: the emotional view
Black hat: the “devil’s advocate” view
Yellow hat: the sunny, positive view
Green hat: the creative, generative view
Blue hat: the organizing view
76. TEST AN EXTREME IDEA
When testing new concepts, intentionally test a dud. This will help you level-set feedback (and might surprise you).
78. THANK YOU
allison.abbott@capitalone.com | @Allison_Abbott1
emma.sagan@capitalone.com | @EmmaSagan
Anderson, C. A. (2007). Belief perseverance (pp. 109–110). In R. F. Baumeister & K. D. Vohs (Eds.), Encyclopedia of Social Psychology. Thousand Oaks, CA: Sage.
De Sio, F., & Marazia, C. (2014). Clever Hans and his effects: Karl Krall and the origins of experimental parapsychology in Germany. Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences, 48(A), 94–102.
Gleitman, H., Gross, J., & Reisberg, D. Psychology, 8th Edition.
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2013. Print.
Kolko, Jon. Exposing the Magic of Design: A Practitioner's Guide to the Methods and Theory of Synthesis. New York: Oxford UP, 2011. Print.
Tversky, Amos; Kahneman, Daniel (1973). “Availability: A heuristic for judging frequency and probability.” Cognitive Psychology, 5(2), 207–232.
Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12, 129.
Noun Project Icons: Lisa Staudinger, Casper Jensen, Ilia Sokolov, artworkbean, Aaron Dodson, Ruben Vh, Rob Armes, Thomas Bruck, Eugene Belyakoff, Alejandro Capellan
Editor's Notes
Hi Everyone,
Thank you for coming today. My name is Emma Sagan, I’m a product manager at Capital One Labs, and this is Allison Abbott, a design researcher at Labs. Today we’re going to talk about confirmation bias in new product development. We’ll explain how our brains can sometimes trick us into following the wrong path, and what we do at the lab to set ourselves up for as much success as possible, while acknowledging that no matter how hard you work to combat confirmation bias, you will at some point fall into its trap. Our hope is that you will leave today with the skills to identify it better in yourself and in your team.
So to get started, we’d like to tell you a little bit about the type of work that we do at Capital One labs, so that you can understand what approach to new product development we use and to provide you with some context as we discuss some of the methodologies we’ve developed to combat confirmation bias.
Capital One labs is the R&D arm of Capital One … blah blah blah
EXPLAIN METHODOLOGIES
Team Size
Team type
Now that we’ve talked a bit about our process, we wanted to give you some quick context. I’m a product manager, which means I’ll stay on a project from inception through pilot. (Pass it to Allison)
Emma: So with this, we’d like to kick this off with a quick game. (pass it to Allison)
I’ve got a string of numbers up here. This string of numbers follows a certain rule I’ve got in my head.
Now, I want you to think of another set of three numbers – I will tell you that YES, it follows my rule, or NO it doesn’t.
(after hearing a few sets of numbers): Tell me what you think the rule is
Our rule was any three numbers in ascending order.
In this activity, you are tasked with creating a hypothesis for the way that the world works (or, this rule for the numbers), and then testing to see what the truth was.
What was happening in this video (or for some of you) was that you tested your hypothesis by trying to CONFIRM it – creating a string of numbers that followed what you THOUGHT was the truth.
And when I tell you that yes, your test is aligned with the way the rule works, you get pretty convinced that that’s the truth.
However, you saw that doubling was NOT the rule – but people kept testing out rules that confirmed their belief. They didn’t get any closer to figuring out what the true rule was.
In order to not get caught up in reiterating tests that confirm our own beliefs, what you could have done was come up with a string of numbers that did NOT confirm your belief – and you would have figured out that your string of numbers did still follow the true rule.
You’d then be able to rule out your initial hypothesis, and learn more from this test as to what the true rule might be.
By seeking to disconfirm your hypothesis, you’re more likely to get to the truth faster.
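This number game is Wason’s classic 2-4-6 task. As a minimal sketch of why disconfirming tests are more informative (using the ascending-order rule revealed above; the function name and example triples are just illustrative):

```python
# The hidden rule from the talk: any three numbers in ascending order.
def follows_rule(triple):
    a, b, c = triple
    return a < b < c

# A confirming test: someone who suspects "doubling" tries a doubling triple.
# The answer is YES, which feels like progress but rules nothing out.
print(follows_rule((3, 6, 12)))   # True

# A disconfirming test: deliberately break "doubling" while staying ascending.
# Still YES -- so "doubling" can now be ruled out entirely.
print(follows_rule((1, 2, 3)))    # True

# A descending triple finally earns a NO, pointing at the real rule.
print(follows_rule((6, 4, 2)))    # False
```

Note the asymmetry: the one disconfirming test eliminated a whole hypothesis, while the confirming test eliminated nothing.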
What’s happening here is that when you find evidence that confirms your own beliefs, you tend to stop looking for more evidence – and it gets really hard to accept that your rule might be wrong, even in the face of clear indicators (like the guy in this video telling people directly that their rule was wrong).
In the world of psychology, this is called the “makes sense stopping rule”
(Really. That’s really what it’s called. )
This “makes sense stopping rule” is one manifestation of Confirmation Bias.
So, what is confirmation bias?
(read)
The thing is, we’re all guilty of confirmation bias.
Confirmation bias does not arise from a longing to confirm, or from an overwhelming motivation to be “right.”
Confirmation bias comes about because of cognitive breakdowns – it’s really hard, on a cognitive level, to reconcile information that goes against what we already believe.
-We feel that confirmation bias is particularly relevant to the field of product development and innovation because it’s a world that involves a lot of faith, and a lot of optimism – we put our guts and soul into developing products we really believe in…
if we didn’t have that passion, we wouldn’t be innovating in the first place.
However, the same passion that is required for us to innovate is also where we might fall victim to confirmation bias. We have strongly held beliefs that we have the right team, and the right set of ideas, that will lead us to shape the world around us for the better….
…. but how can we keep these beliefs from getting in our own way? As humans who are all susceptible to the tendency to seek out and interpret information that upholds our beliefs, we close ourselves off to a world of feedback that might go against our beliefs, but that might help us make a product that really solves the right need, in the right way.
So, why are we wired in such a way that makes us have confirmation bias?
Let’s get nerdy for a second: we’ll first talk about the basic way that the brain processes information. With that background, we’ll then tell you about some scientific phenomena that are associated with confirmation bias, tell you some real-world examples, and then illustrate how this can impact product development.
If the toothbrush and toothpaste cost $1.10 together, and the toothpaste is $1.00 more than the brush, how much is the brush?
How many of you think 10 cents? How many of you think 5?
More than 50% of students surveyed at Harvard and MIT get this wrong
Your brain has two ways of processing information. Colloquially this is referred to as thinking fast and slow; officially, these are System 1 and System 2. Daniel Kahneman, a Nobel laureate and leader in this field, has demonstrated that the brain relies on these two systems.
2 x 2 = 4
57x13 = ??
System 1 is fast, automatic, unconscious. Your brain needs to rely on this type of processing in order to make it through the day. Think about how many quick decisions your brain made just to get here. There are simply too many decisions in a day to slow down and think through in detail. The challenge is that System 1 is error prone. Since it’s trying to work as efficiently as possible, it sometimes misses things. Take the toothbrush for example. Your brain thought it was a simple question, and came up with the “obvious” answer. It wasn’t until we slowed down that you realized you had made an error.
As we slowed down and walked through the problem, we enacted system 2. System 2 is slow, effortful, conscious and reliable. It is typically used to make complex decisions.
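Engaging System 2 on the toothbrush problem means actually writing down the algebra instead of pattern-matching. A minimal sketch, assuming the classic bat-and-ball setup (a $1.10 total, the standard version of this puzzle; the function name is ours):

```python
# If brush = b and toothpaste = b + difference, then
#   b + (b + difference) = total  =>  b = (total - difference) / 2
def brush_price(total=1.10, difference=1.00):
    return round((total - difference) / 2, 2)

b = brush_price()
print(b)                   # 0.05 -- five cents, not the "obvious" ten
print(round(b + 1.00, 2))  # 1.05 -- so the pair really does sum to 1.10
```

The System 1 answer (ten cents) fails the check: 0.10 + 1.10 would total $1.20, not $1.10.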
Please remember to stay as quiet as you can in the video, so those around you can complete the task.
-In the video, there was a lot going on, and you were only able to process what you were tasked with doing – counting the passes.
-It wasn’t the case that the bear was hard to see – the second time around, you saw him just fine.
-Right now, I bet you’re not thinking about the fact that the chair you’re sitting in is putting pressure on your back… that’s still a stimulus that’s been happening to you since you sat down, but you weren’t paying attention to it until now because there were other stimuli in the environment that your brain decided to pay attention to.
-So the theory in the world of psychology is that our brains can’t actively process everything, but only the things we select to attend to.
-Most of the time, this is a good thing – if we were actively thinking about every last stimulus in our environment at every moment, we wouldn’t be able to function. Selective attention helps us most of the time, but not all of the time – and is one of the reasons we have confirmation bias.
-The reason we gather and interpret information that goes along with our own beliefs is due to the fact that our brains can’t actively process too much at one time.
-When something is important to us, we’re more likely to pay attention to it – and paying attention to those important things means we’re often UNABLE to notice other (even striking) information in the same environment.
When you are conducting research or looking at outputs from research, we know it’s important to you that your product is succeeding.
Even if consciously you want honest feedback from your users, your desire for the product to succeed might make you more attentive to information that confirms that your product is successful.
So, while you hope that you’re listening to all your users’ feedback equally, there is a significant chance that you’re just not even taking in information that doesn’t align with what you are most interested in hearing.
If I told you that today is the 10th, what day of the week is next Tuesday, could you tell me? What’s AMAZING is that Hans could!! He could also do math! I don’t know about you, but that is SUPER cool. His trainer thought he had discovered a new level of intellect among our equestrian friends!
The trainer wasn’t a malicious trickster; he truly believed Hans could do math too.
Mental shortcut that allows people to solve problems and make judgments quickly and efficiently. Allow people to function without constantly stopping to think about their next course of action. They’re helpful in many situations, but they can also lead to biases.
When conducting empathy and user feedback interviews, there is a lot of discussion and material about best practices in framing questions. We’re taught not to ask leading questions, which for the most part we strive to do, but there is something completely out of our control – our micro-expressions
“rock?”
“clack?”
“duck?”
Take a look at the placement of the words, and how many people remembered each
What’s happening here: info gathered earlier in the process is likely to stick in your memory.
Now, in the world of confirmation bias, when something is encoded into long-term memory, we often forget where we learned that information (whether it was something we heard from a friend, read somewhere, or saw an expert write about)… and we forget that it was simply the first information we came across; instead, we believe that it’s just the way the world works. What we hear first (and there’s no specific reason to doubt it), we accept as truth…
There is a tendency to form opinions early and then evaluate any subsequent information based on the previously formed opinion.
From primacy effect sometimes comes something called belief persistence – once we see something that at first shapes our belief, we tend to cling to that belief even after receiving new information that contradicts or disconfirms the basis of that belief.
you can give all your attention to one participant.
You might start to form beliefs from what that one person said, because there isn’t yet anything to challenge that person’s feedback….
Are you looking at each person with equal weight, or did your first participant stick out more in your mind?
Are you comparing everything you hear in latter feedback sessions to the first few people’s feedback you synthesized?
Have you ever learned a new word and suddenly heard it everywhere? Ever ended a relationship and suddenly felt like EVERYONE was in a relationship. No matter where you turned, it felt like someone was falling in love, getting married. One of my favorite examples of frequency illusion is (and I have only heard about this from friends and relatives), which is when you’re pregnant suddenly the whole world is pregnant
There is a slight nuance here: not everyone considers frequency illusion a type of confirmation bias. Regardless of where you fall in this debate, we felt it was important to include.
This is also sometimes referred to as the Baader-Meinhof phenomenon.
If any of you are from the mid-Atlantic region, you might remember how last summer there was a string of shark attacks around the same beaches in North Carolina.
There was an outpouring of articles in newspapers and social media about what to do in a shark attack, if it was still safe to swim by those beaches, etc.
As I was preparing to leave for the beach with friends, my own mother called me more than once to remind me to hit a shark on the nose if I ever encountered one.
So, there was all this hoopla about shark attacks, and a good deal of attention and effort paid to them – The catch is that deadly shark attacks are actually significantly less likely to happen than pretty much any other form of beach vacation death. You are more likely to die from:
…
With the exception of being killed in a collapsing sand hole (which was news to me), I think we can all rationally see that these events are more likely to happen than shark attacks. However, what about events that aren’t as obvious?
For example, many people confidently think that the likelihood of dying from shark attacks is greater than that of dying from….
-being hit by falling airplane parts, when more people actually die from falling airplane parts.
-So, why do people confidently misattribute more deaths to shark attacks than events like these, even when it should be a 50/50 guess for us?
-Shark attacks are sensational, and the media loves reporting about them. When one occurs, the media continues the story for days and days. When we see something in the news about shark attacks, we’re transfixed. We have a whole week dedicated to sharks – we’re fascinated by them. The amount of time we’re exposed to information about sharks and shark attacks is wildly disproportionate to the chances of us ever encountering one ourselves.
-So, because of all of the hoopla that goes on when a shark attack occurs, examples of shark attacks are easily retrievable from memory when we’re prompted with it.
Because it’s easily retrievable in memory, we’re likely to give it more weight than it really deserves.
-Because of that, people are likely to believe that shark attack deaths occur much more frequently than they actually do.
This is an example of an availability heuristic.
Our brains rely on mental shortcuts to make our lives easier – if we had to think through every possible scenario every time we interfaced with something, again, we’d be unable to function. More often than not, the shortcuts our brains build from what we see most often end up helping us: what we encounter frequently is usually a good representation of the way the world works.
However, it’s not always the case that what we see often is a true representation of the way the world works, and it’s those instances that lead us into confirmation bias.
An availability heuristic is a mental shortcut that relies on the immediate examples that come to mind when evaluating a specific topic or making a decision – and we often treat the things that are most available in our memory as the truth (even when they’re not).
So, we saw how availability heuristics play into shark attacks – how is this relevant to product development?
-In the world of product development, there are certain phenomena that are more available to us than others…we might call these buzz words, or buzz areas. One such area is the “Uber for x” model : like Uber for Laundry or Uber for Real Estate, etc.
-We hear in the news how successful certain companies are in tackling this buzz area.
-Because that nugget is easily available in our minds – Uber for X = success – we begin to accept it as truth. Because Uber’s success is so available to us, $4+ billion of VC money has been injected into on-demand service platforms.
-However, what you’re less likely to see in your environment are the examples of when this model DOESN’T work. People may invest in these on-demand services because they’ve seen Uber’s success, without seeing the whole picture (including the many examples of the model failing others).
Something you can do here is articulate to yourself what pieces of feedback you’re getting that are really memorable or sensational – for instance, if you had a really exuberant research participant. Making this known to yourself and your team, you can be aware of the risk of bias toward that person’s feedback, and ensure that you’re not giving that feedback any more weight than it’s due.
In a world where everyone thinks they’re the next unicorn, and there is a time pressure to get to market before the next person, the biggest barrier to success becomes ourselves.
Talk about confirmation bias in daily life…
Frequency Illusion
Primacy Effect
Roll Effect
The tendency to have excessive optimism towards an invention or innovation’s usefulness throughout society, while often failing to identify its limitations or weaknesses
Along the way, we’ve told you about specific ways that you can acknowledge and combat each of the psychological phenomena that contribute to confirmation bias. But we wanted to leave you with some ideas of how you can combat confirmation bias holistically.
-This is an exercise you can do with your teams when you’re synthesizing feedback that will ensure biases are kept to a minimum. When going over feedback, make sure that all of the hats are worn –
-You might have a team that is smaller than 6 – that’s OK; you don’t need to make someone be the “red hat” person the entire session. Just make sure that all of the hats are worn, by someone, by the end of each synthesis session.
-So, what does this all mean?
-At the end of the day, those of us in product development care – a lot – about what we do. We believe in what we do, and sometimes we have to take some leaps of faith.
-However, the very beliefs that drive our work can sometimes hurt us as much as they help us – we follow the one shining light that we believe in, by seeking out and interpreting information that upholds that belief, sometimes to the extent that we ignore other, brighter lights to follow.
-As we tried to show you today, confirmation bias isn’t something that we do intentionally or maliciously…but rather, it’s part of the human condition, and the way the human brain works.
-Confirmation bias is a result of automatic, unintentional strategies rather than deliberate deception. But we’re all guilty of it, and we have to be very intentional in the way we attempt to combat it.
-We left you with some of our ideas on how to combat it, and we’re constantly thinking about how we can get better at it ourselves. Fighting the human condition is hard, but it can be done if you’re aware of your own biases and of ways to combat them.