by Andrew Rusling
Outcomes such as “subscriptions increased by 20%” or “complaints regarding the upload feature reduced to zero” are what make a real difference in our customers’ lives and hence to the company’s bottom line. When a team is delivering outcomes like that, there is no denying its performance and hence its value to the company.
Delivering outcomes comes from understanding our customers, producing an output that may result in an outcome, and then validating whether we have achieved the desired outcome. At the very least, each of these cycles produces knowledge. The Lean Startup by Eric Ries clearly explained this cycle; unfortunately, it did not explain clearly how we should design, set up, run, or analyse our experiments. I have met many people who agree we should follow the Lean Startup approach; however, there is rarely any consensus on the experimentation approach that will make it a reality.
In 2017 Australia’s largest independent game studio, Halfbrick Studios, embarked upon a mission to better understand their customers and experiment their way to renewed success. Fruit Ninja Fight is one of the results of that approach. In 2018 Australia’s largest telco, Telstra, focused on “co-creation” with their customers through a series of experiments, delivering improved customer satisfaction and faster results than ever before.
This presentation shares my experiences of working with those two Scrum-based organisations as they sought to improve their outcomes through experimentation.
11. Listen to your customers,
give them what they want,
not what they ask for.
12. Customer Interviews
• Good at finding a spark
• Terrible for validating the worth of that spark
Image Reference: http://happilyhaverland.blogspot.com/2013/01/
13. Lessons Learnt: Interviews
• Pre-qualify your interviewees
• Face to face
• Ask to record the session
• Try whole team, then scale back
• Start Open, Narrow in
• Discuss your product last
Image Reference: https://www.flickr.com/photos/wocintechchat/22518583822
14. Your understanding vs. what the user sees
Photo Reference: https://www.flickr.com/photos/eggrole/7524458398
15. Observational Testing
“Nothing is quite so humbling as being forced to
watch in silence as some poor play-tester stumbles
around your level for 20 minutes, unable to figure
out the "obvious" answer that you now realize is
completely arbitrary and impossible to figure out.”
17. Observational Testing: Process
1. Provide an objective
2. Observe them and their screen
   • No guidance
   • Video record
3. Ask them to explain their thinking
21. Surveys: Open over Closed
CLOSED: “Please prioritise these Albums” (rank 1 to 5): AC DC Live: AC DC; Meddle: Pink Floyd; Garbage: Garbage; Wish you were here: Pink Floyd
OPEN: “Please enter your top 3 Albums” (free-text fields CD 1, CD 2, CD 3); sample answer: Yield: Pearl Jam
23. • Alt. Background colour
• Alt. Background
• Alt. Eye direction
• Alt. Barry direction
• Alt. Barry image
• Just Letters
• Just jetpack
• Overlay text
• Etc. etc.
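To compare variants like these fairly, each user should land in the same variant every time they are counted. A minimal sketch of deterministic bucketing by hashing the user ID (the function name, variant labels, and sample ID are illustrative, not from the talk):

```python
import hashlib

def assign_variant(user_id: str, variants: list) -> str:
    """Deterministically bucket a user into one variant by hashing their ID."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

variants = ["baseline", "alt_background", "alt_eye_direction", "overlay_text"]
print(assign_variant("user-42", variants))  # same user always gets the same variant
```

Hash-based assignment avoids storing an assignment table and keeps buckets stable across sessions, which matters when open rates or retention are measured days after assignment.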
25. Always set a hypothesis
• Yes, it will be a guess!
• Even a guessed hypothesis still educates
26. Baseline: messaging at any time
Hypothesis: messaging during prime time will increase open rate.
Result: validated or invalidated
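Whether a hypothesis like this is validated or invalidated can be judged with a standard two-proportion z-test on open rates. A minimal stdlib-only sketch; the counts below are made-up example numbers, not data from the talk:

```python
from math import sqrt, erf

def two_proportion_z(opens_a, sent_a, opens_b, sent_b):
    """One-sided z-test: is open rate B (prime time) higher than A (anytime)?"""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))  # one-sided upper tail
    return z, p_value

z, p = two_proportion_z(opens_a=200, sent_a=1000, opens_b=260, sent_b=1000)
print(f"z={z:.2f}, p={p:.4f}")  # validate the hypothesis if p < 0.05
```

Stating the baseline and the decision rule before sending the messages keeps the analysis honest; picking a threshold after seeing the data invites wishful interpretation.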
30. Versions released each week
1. Baseline version: just the basic game, no progression
2. Improved tutorial
3. UI/UX tweaks
4. First trial of progression system
5. Second trial of different progression system
6. Third trial of different progression system
33. Retention by Cohort (chart): % of active users by cohort (y-axis, 0–80%) against end of week (x-axis, weeks 1–7), with one line per release cohort: Base, Tutorial, UI/UX, Progress 1, Progress 2, Progress 3
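A retention-by-cohort table like the one charted above can be computed from raw activity events. A minimal sketch, assuming events arrive as (user_id, cohort_week, active_week) tuples (the function name and sample data are illustrative):

```python
from collections import defaultdict

def retention_by_cohort(events):
    """events: iterable of (user_id, cohort_week, active_week) tuples.
    Returns {cohort_week: {weeks_since_start: % of cohort still active}}."""
    cohort_users = defaultdict(set)   # cohort -> all users in it
    active = defaultdict(set)         # (cohort, week offset) -> active users
    for user, cohort, week in events:
        cohort_users[cohort].add(user)
        active[(cohort, week - cohort)].add(user)
    return {
        c: {off: 100 * len(users) / len(cohort_users[c])
            for (cc, off), users in active.items() if cc == c}
        for c in cohort_users
    }

events = [
    ("u1", 1, 1), ("u2", 1, 1), ("u1", 1, 2),                  # cohort 1
    ("u3", 2, 2), ("u4", 2, 2), ("u3", 2, 3), ("u4", 2, 3),    # cohort 2
]
print(retention_by_cohort(events))
```

Grouping by cohort rather than pooling all users is what makes the weekly releases comparable: each release is judged only against the users who started on it.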
34. Key Lessons I learnt
• Mindset is crucial
• Listen to your customers
• Customer Interviews
• Observational testing
• Surveys need to be open
• Always set a hypothesis
• Negative hypothesis
• Cohort analysis