User research is often a side-lined activity within the experimentation space, but it’s crucial for keeping your testing velocity up and your win rates high.
Many experimentation and optimization teams fail to make this connection due to a lack of time, resources, knowledge or the siloed structure of their organization.
In this talk, Chris will take you through some practical examples of how user research can drive the quantity of ideas, their quality, and originality, which in turn leads to a much more successful overall experimentation program.
User Research: The Superpower Behind Experimentation Programs | VWO Webinars
1. User Research: The Superpower
Behind Experimentation Programs
Chris Gibbins, Chief Experience Officer
VWO Webinar - 1st Feb 2023
2. We’re a specialist Experimentation Consultancy
UNDERSTAND your customers, products & services
EXPERIMENT to optimise, personalise & innovate
SCALE the capability across the organisation
4. The value of experimentation is being realised
Across channels: mobile web, desktop, mobile app, OTT
- To eliminate guesswork & make better decisions
- A safety-net for adventurous ideas & innovation
- To avoid rolling out harmful changes
- To test bold ideas early & cost-effectively
- To be more customer-centric
- To drive incremental & measurable growth
5. When auditing experimentation programs, common themes that we see:
- Low (success) win rates
- Low velocity
- Poor problem statements, e.g. “Conversion rates too low from this page”
- Lack of evidence / research
- Weak hypotheses, e.g. “By making this button sticky we will increase conversion rates”
- Experimentation teams disconnected from research teams
- Confirmation bias in problem statements
- Lack of originality & diversity: playing it too safe
- A lot of HARKing in A/B test results
6. Anyone know what HARKing is?
HARKing (Hypothesising After the Results are Known): a lot of it shows up in A/B test results
8. Confirmation Bias in Problem Statements
1. The team has a cool A/B test idea that they really want to run
2. They realise they need a problem statement to add to the backlog!
3. Only then do they go and hunt for some supporting data, ignoring everything else
Confirmation bias: we collect the facts & evidence we believe and ignore the evidence we don’t.
9. Finding solutions that are significantly better than the control is hard. And we’re pretty rubbish at predicting human behaviour.
Low (success) win rates. Experimentation success rates, based on primary business metric (*Source: Ronny Kohavi):
- Microsoft, Intuit Quicken: ~33%
- Bing: ~15%
- Booking, Google Ads, Netflix: ~10%
- AirBnB Search: ~8%
10. How do teams increase the odds of finding more winning solutions?
12. Kiss more to find more winners
- Reduce the cost of an experiment
- Democratise experimentation
- Test everywhere
- A/B → A/B/n (more variations)
= Increased velocity
QUANTITY OF IDEAS
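The arithmetic behind “kiss more”: if each idea independently has roughly a 10% chance of beating control (the ballpark suggested by the industry win rates quoted above; the exact per-idea rate here is an assumption), the chance of finding at least one winner grows as 1 − (1 − p)^n. A minimal sketch:

```python
def p_at_least_one_winner(p_win: float, n_ideas: int) -> float:
    """Chance that at least one of n independent ideas beats control."""
    return 1 - (1 - p_win) ** n_ideas

# Assumed 10% per-idea win rate, in line with the figures above
for n in (1, 5, 10, 20):
    print(n, round(p_at_least_one_winner(0.10, n), 2))
# 1 idea -> 0.10, 5 -> 0.41, 10 -> 0.65, 20 -> 0.88
```

The model ignores correlation between ideas, which is exactly why the deck also stresses diversity: twenty near-identical button tweaks behave more like one idea than twenty.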
13. User Research: a powerful fuel to drive more ideas
- Surprises (serendipity!)
- Quantity
- Diversity
- Builds empathy (keeps teams user-centric)
= A high quantity of diverse ideas
14. Creative CX Opportunity Audit example

Findings by site area and priority:

Site area     | High | Medium | Low | Positive | Total
Sitewide      |   0  |    2   |  1  |    1     |   4
Homepage      |   0  |    3   |  2  |    0     |   5
Search / Nav  |   1  |    8   |  7  |    3     |  19
PLP           |   2  |    9   |  5  |    0     |  16
PDP           |  12  |   16   |  7  |    0     |  35
Extras        |   1  |   10   |  4  |    0     |  15
Basket        |   0  |    1   |  0  |    0     |   1
Checkout      |   4  |    3   |  3  |    0     |  10
Total         |  20  |   52   | 29  |    4     | 105

Recommendations (User Research + Analysis + Ideation):
- 105 findings / problem statements
- 61 solutions to A/B/n test
- 5 personalisation ideas
- 2 larger initiatives
- 26 fixes / JDIs
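The audit totals above are simple cross-tabulations of a findings log. A minimal sketch of how such a log rolls up into by-area and by-priority counts (the findings listed here are invented for illustration, not from the real audit):

```python
from collections import Counter

# Hypothetical audit log: one (site area, priority) pair per finding
findings = [
    ("PDP", "High"), ("PDP", "Medium"), ("Checkout", "High"),
    ("Search / Nav", "Positive"), ("Homepage", "Medium"),
]

by_area = Counter(area for area, _ in findings)
by_priority = Counter(priority for _, priority in findings)

print(by_area.most_common())  # which site areas need attention first
print(by_priority)            # how severe the backlog is overall

# The two marginal totals must reconcile to the same grand total
assert sum(by_area.values()) == sum(by_priority.values()) == len(findings)
```

Keeping the log at one row per finding, then deriving the table, makes the “105 findings” grand total self-checking rather than hand-maintained.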
16. The next 2 help you improve the quality of A/B tests:
1. Problem Understanding
2. Quality of Execution
17. WINNING SOLUTIONS sit at the intersection of two axes:
1. PROBLEM UNDERSTANDING (from “no idea” to “complete understanding”), driven by user research & analysis
2. QUALITY OF EXECUTION (from “poorly designed, buggy code” to “high-quality design, usability & code”), driven by:
- UX design, usability & accessibility
- Experiment trustworthiness
- No flicker (FOOC)
- Quality of coding & QA
- Data quality
19. PROBLEM UNDERSTANDING
Problem/Opportunity Discovery & Understanding

OBJECTIVES
- To uncover opportunities for improvement, optimisation, personalisation, innovation:
  - Pain points / usability issues / confusion
  - Unmet needs, expectations, priorities
  - Objections, distrust, anxieties
  - Interesting user behaviours / workarounds
- To quantify the extent of the opportunities, e.g.:
  - How many people are affected by this issue?
  - Which audiences are affected?
20. PROBLEM UNDERSTANDING
Most effective research techniques

QUALITATIVE
- Usability testing (moderated)
- User interviews (e.g. Jobs To Be Done): exploring the end-to-end journey
- Post-purchase surveys, e.g. “What nearly stopped you from buying today?”
- Call centre insights
- Sales team interviews

QUANTITATIVE
- Behavioural analytics (e.g. Contentsquare, Fullstory, Hotjar, Quantum Metric, VWO Insights)
- Web analytics (e.g. Google Analytics, Adobe, Amplitude)
- Jobs To Be Done surveys: quantify the importance of needs
21. This powerful combination of user research (qual) & data analysis (quant) =
- Strong problem statements
- Evidence-based “How Might We”s
- More creative ideation sessions
- Powerful data-driven hypotheses
- Higher overall success rates
22. Client example:
- The serendipitous nature of user research
- A perfect example of problem understanding
23. The Log-in step
✓ Prominent CTAs
✓ Clear-ish booking summary
✓ Secure checkout padlock
✓ “Guest checkout” option available
✓ Low drop-off rate: 73% continue to the next step, 18% go back to basket (previous step), 9% exit rate
24. The Log-in step
QUESTION: What is the most important problem/opportunity at this step of the journey?
25. One of the usability testing findings: Mark, participant 7, who regularly books budget hotels (not a real picture: this-person-does-not-exist.com)
26. The Aha moment!
28. If we’d only listened to best practices we’d never have discovered this opportunity
29.
1. A brand-new problem identified
2. A very clear problem statement with evidence from user research
3. Hypothesis focussed on a real observed problem + data analysis
4. Four variations to test different executions
30. CONTROL vs VARIATION 4 (the winning variation): significant uplift in Booking Conversion Rate (99% stat-sig)
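A “99% stat-sig” call on a conversion-rate uplift is typically a two-proportion z-test. A minimal sketch with made-up visitor and conversion counts (the deck does not disclose the client’s real numbers):

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0: no difference
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, via normal CDF
    return z, p_value

# Hypothetical traffic: control at 5.00% vs "Variation 4" at 5.75%
z, p = two_proportion_z_test(conv_a=1000, n_a=20000, conv_b=1150, n_b=20000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.01, i.e. significant at the 99% level
```

Libraries such as statsmodels (`proportions_ztest`) give the same result; the point is that the 99% threshold is a p-value below 0.01, not a guarantee of the uplift’s size.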
31. Integrate research into your experimentation processes
PRACTICAL TIPS
➔ Make friends with your research teams!
➔ Give them credit for winners that came from research
➔ Keep track of where the ideas came from for A/B tests
➔ Schedule in monthly usability testing before you even know which journey you’ll be testing!
➔ You’ll get so much more value from moderated research
➔ Get the experimenters/optimizers to observe and take notes
➔ Make sure you have evidence-based problem statements in your test plans! Avoid confirmation-bias analysis.
➔ If you don’t have specialist UX Researchers, learn how to run and moderate usability testing sessions yourself
32. UNDERSTAND your customers, products & services
EXPERIMENT to optimise, personalise & innovate
SCALE the capability across the organisation

THANK YOU. QUESTIONS?