Why does my AB testing suck?
5th Sep 2013 @OptimiseOrDie
#11 Top Split Test & CRO questions
#1 How to choose test type!
#2 Top CRO questions
11.1 – How to choose the test type
• Test complexity – this drains time!
• Analytics – will this allow you to test before/after?
• Money now or Precise data later
• What stage are you at with the client?
• How volatile is the data and traffic?
• Are there huge non-weekly patterns – seasonal?
• A/B tests – Design shift, new baseline, local maxima
• MVT – Need to know variables, client can’t settle
• Small MVT – A/B + MVT benefits (2x2x2 or 2x2x4)
• Traffic is your biggest factor here
• Use the VWO test run calculator
• Do a rough calculation yourself – see the sketch below
• Let’s look at some recent pages of yours
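A minimal Python sketch of that rough calculation, with illustrative traffic figures (not a client’s). The point is how quickly extra MVT variables multiply the recipes your outcomes get split across:

```python
# Illustrative sketch: how the test type multiplies the traffic you need.
# Each extra MVT variable multiplies the recipes your outcomes are split across.
daily_visitors = 1000      # assumed figures, not a client's
conv_rate = 0.05           # ~50 outcomes a day

for label, recipes in [("A/B", 2), ("MVT 2x2x2", 8), ("MVT 2x2x4", 16)]:
    per_recipe = daily_visitors * conv_rate / recipes
    print(f"{label:<10} {recipes:>2} recipes -> ~{per_recipe:.1f} outcomes/day each")
```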
11.2 – Top Conversion Questions
• 32 questions, picked by Practitioners
• I plan to record them all as a course!
• What top stuff did I hear?
“How long will my test take?”
“When should I check the results?”
“How do I know if it’s ready?”
#1 The tennis court
– Let’s say we want to estimate, on average, the height at which Federer and Nadal hit the ball over the net. So, let’s start the match:
First Set – Federer 6-4
– We start to collect values
62cm ± 2cm
63.5cm ± 2cm
Second Set – Nadal 7-6
– Nadal starts sending them low over the net
62cm ± 1cm
62.5cm ± 1cm
Final Set – Nadal 7-6
– The values keep coming in
61.8cm ± 0.3cm
62cm ± 0.3cm
Let’s look at this a different way
[Chart: the estimate is a band, e.g. 62.5cm ± 1cm]
The graph is a range, not a line: 9.1% ± 0.3% (see the sketch below)
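The tennis analogy as a minimal Python sketch: the 95% margin of error on a mean shrinks with the square root of the sample size. The 8cm spread is an assumed, illustrative figure:

```python
# Minimal sketch: the 95% margin of error on a mean shrinks with the square
# root of the sample size. The 8cm spread is an assumed, illustrative figure.
import math

sd_cm = 8.0                       # assumed spread of ball heights over the net
for n_balls in (60, 250, 2500):
    margin = 1.96 * sd_cm / math.sqrt(n_balls)
    print(f"after {n_balls:>4} balls: mean ± {margin:.1f} cm")
# after   60 balls: ± 2.0 cm
# after  250 balls: ± 1.0 cm
# after 2500 balls: ± 0.3 cm
```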
#1 Summary
• The minimum length
– 2 business cycles
OR
– 250, preferably 350, outcomes for each creative
– Calculate time to reach minimum
– Work on footfall, not sitewide
– So, if you get 1,000 visitors a day to a new test page
– You convert at around 5% (50 checkouts)
– You have two creatives
– At current volumes, you’ll get ~25 checkouts a day for each creative
– That means you need 14 days minimum (see the sketch below)
– If they separate, it might take less (but business cycle rule kicks in)
– If they don’t separate, it could take longer
– Remember it’s a fuzzy region – not a precise point
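The slide’s arithmetic as a minimal sketch, worked on footfall to the test page rather than sitewide traffic. This is the minimum-length rule of thumb, not a full power calculation:

```python
# The slide's arithmetic as a minimal sketch -- footfall to the test page,
# not sitewide traffic. Not a power calculation, just the minimum-length rule.
daily_visitors = 1000
conv_rate = 0.05               # ~50 checkouts a day on the test page
creatives = 2
min_outcomes = 350             # 250 minimum, 350 preferred

per_creative_per_day = daily_visitors * conv_rate / creatives    # ~25
days = min_outcomes / per_creative_per_day
print(f"~{days:.0f} days minimum")   # ~14 -- then round up to whole business cycles
```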
#1 Summary
• The minimum length
– Depends on performance
– If you test two shades of blue?
– Traffic may change
– PPC budget might run out
– TV advertising may start
– Segment-level performance may drive the result
– You can estimate a test length – you cannot predict it
– Be aware of your marketing activity, always
– Watch your test like a hyper-engaged chef
11.3 – Are we there yet? Early test stages…
• Ignore the graphs. Don’t draw conclusions. Don’t dance. Calm down.
• Get a feel for the test but don’t do anything yet!
• Remember – in an A/B test, 50% of returning visitors will see a shiny new website!
• Until your test has had at least 1 business cycle and 250-350 outcomes, don’t
bother drawing conclusions or getting excited!
• You’re looking for anything that looks really odd – your analytics person should be
checking all the figures until you’re satisfied
• All tests move around or show big swings early in the testing cycle. Here is a very
high traffic site – it still takes 10 days to start settling. Lower traffic sites will
stretch this period further.
11.4 – What happens when a test flips on me?
• Something like this can happen:
• Check your sample size. If it’s still small, then expect this until the test settles.
• If the test does genuinely flip – and quite severely – then something has changed with the traffic mix, the customer base or your advertising. Maybe the PPC budget ran out? Seriously!
• To analyse a flipped test, you’ll need to check your segmented data. This is why you have a split testing package AND an analytics system.
• The segmented data will help you to identify the source of the shift in response to your test. I rarely get a flipped one and it’s always something changing on me, without being told. The heartless bastards.
11.5 – What happens if a test is still moving around?
• There are three reasons it is moving around
– Your sample size (outcomes) is still too small
– The external traffic mix, customers or reaction has
suddenly changed or
– Your inbound marketing driven traffic mix is
completely volatile (very rare)
• Check the sample size
• Check all your marketing activity
• Check the instrumentation
• If there’s no obvious reason, check the segmentation
11.6 – How do I know when it’s ready?
• The hallmarks of a cooked test are:
– It’s done at least 1 business cycle (2 preferred)
– You have at least 250-350 outcomes for each recipe
– It’s not moving around hugely at creative or segment level
– The test results are clear – even if the precise values are not
– The intervals are not overlapping (much) – see the sketch below
– If a test is still moving around, you need to investigate
– Always declare on a business cycle boundary – not the middle of
a period (this introduces bias)
– Don’t declare in the middle of a limited time period advertising
campaign (e.g. TV, print, online)
– Always test before and after large marketing campaigns (one
week on, one week off)
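A minimal sketch of the “intervals not overlapping” check, assuming a simple normal approximation and illustrative counts. Test engines use sharper stats; this is the eyeball version:

```python
# Minimal sketch, assuming a simple normal approximation: 95% intervals per
# recipe and an eyeball overlap check. Counts are illustrative.
import math

def interval(conversions, visitors, z=1.96):
    p = conversions / visitors
    margin = z * math.sqrt(p * (1 - p) / visitors)
    return p - margin, p + margin

a_lo, a_hi = interval(350, 7000)   # control: 5.0%
b_lo, b_hi = interval(430, 7000)   # variant: ~6.1%
print(f"A: {a_lo:.2%}-{a_hi:.2%}   B: {b_lo:.2%}-{b_hi:.2%}")
print("still overlapping" if a_hi > b_lo else "clearly separated")
```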
11.7 – What happens if it’s inconclusive?
• Analyse the segmentation
• One or more segments may be over- and under-performing
• They may be cancelling out – the average is a lie (see the sketch below)
• The segment level performance will help you
(beware of small sample sizes)
• If you genuinely have a test which failed to move any
segments, it’s a crap test
• This usually happens when it isn’t bold or brave
enough in shifting away from the original design,
particularly on lower traffic sites
• Get testing again!
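A minimal sketch with made-up numbers showing how two segments can cancel each other out into a flat blended average:

```python
# Minimal sketch, made-up numbers: two segments moving in opposite directions
# net out to a flat blended average -- the average is a lie.
segments = {
    # segment: (control conv, control visitors, variant conv, variant visitors)
    "new visitors":       (250, 5000, 325, 5000),
    "returning visitors": (325, 5000, 250, 5000),
}
control = sum(s[0] for s in segments.values()) / sum(s[1] for s in segments.values())
variant = sum(s[2] for s in segments.values()) / sum(s[3] for s in segments.values())
print(f"blended: control {control:.2%} vs variant {variant:.2%}")   # identical
for name, (c, cn, v, vn) in segments.items():
    print(f"{name}: {(v / vn) / (c / cn) - 1:+.0%}")   # +30% and -23%
```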
11.8 – What QA testing should I do?
• Cross Browser Testing
• Testing from several locations (office, home, elsewhere)
• Test that IP filtering is set up
• Test tags are firing correctly (analytics and the test tool)
• Test as a repeat visitor and check session timeouts
• Cross check figures from 2+ sources
• Monitor closely from launch, recheck
11.9 – What happens if it fails?
• Learn from the failure
• If you can’t learn from the failure, you’ve designed a crap test.
Next time you design, imagine all your stuff failing. What would
you do? If you don’t know or you’re not sure, get it changed so
that a negative becomes useful.
• So: failure itself at a creative or variable level should tell you something
• On a failed test, always analyse the segmentation
• One or more segments will be over- and under-performing
• Check for varied performance
• Now add the failure info to your Knowledge Base
• Look at it carefully – what does the failure tell you? Which
element do you think drove the failure?
• If you know what failed (e.g. making the price bigger) then you
have very useful information
• You turned the handle the wrong way
• Now brainstorm a new test
11.10 – Should I run an A/A test first?
• No – and this is why:
– It’s a waste of time
– It’s easier to test and monitor instead
– You are eating into test time
– Also applies to A/A/B/B testing
– A/B/A running at 25%/50%/25% is the best (see the sketch below)
• Read my post here :
http://bit.ly/WcI9EZ
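A minimal sketch of why A/B/A beats a separate A/A: the two A buckets give you a free sanity check while the real test runs. The two-proportion z-test and the counts are illustrative:

```python
# Minimal sketch: compare the two A buckets of an A/B/A test with a simple
# two-proportion z-test (illustrative counts). |z| beyond ~2 suggests broken
# bucketing or instrumentation rather than a real difference.
import math

def z_two_proportions(c1, n1, c2, n2):
    p1, p2 = c1 / n1, c2 / n2
    pooled = (c1 + c2) / (n1 + n2)
    return (p1 - p2) / math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))

z = z_two_proportions(130, 2500, 121, 2500)   # conversions in A1 vs A2
print(f"A1 vs A2: z = {z:.2f}")               # ~0.6 here: buckets look healthy
```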
11.11 – What is a good conversion rate?
Higher than the one you had last month!
#12 – Top reasons summary
• You weren’t bold enough
• You made the test too complex
• Your test didn’t tell you anything (failures too!)
• You didn’t do browser QA
• The session model is broken
• Your redirects are flawed
• Your office is part of the bias
• The test isn’t truly random / the samples aren’t representative
• Your sample size is too small
• You didn’t test for long enough
• You didn’t look at the error rates
• You didn’t cross instrument
• You’ve missed one or more underlying cycles
• You don’t factor in before/after cycles
• One test has an inherent performance bias (load time, for example)
• You didn’t watch segment performance
• You’re measuring too shallowly in the funnel
• Your traffic mix has changed
• You’re not measuring channel switchers (phone/email/chat etc.)
• The analytics setup is broken!
#13 – Summary - tests
• This isn’t about tools – it’s about your thinking and approach to problems. Bravery and curiosity are more important than wizardry!
• Keep it simple and aim for actionable truths and insights
• Invest in staff, training, analytics (yours and your clients’)
• More wired-in clients means a happier agency!
• Fixing problems impresses clients even before you start (health check)
• Prioritise issues by opportunity & effort
• Showing models around money is a winner
• Do something every week to make the client configuration better
• Let me use a till analogy!
• What about a Formula 1 racing car?
• Get clients to pay you to invest in their future
• Give staff time to train themselves, go on courses, get qualified
• On that note – experience with core skills + top-ups = GA experts
• Tap into the community out there
• Hopefully this has given you a great springboard to MORE!
Is there a way to fix this then?
Conversion Heroes!
@OptimiseOrDie
END & QUESTIONS
Email: sullivac@gmail.com
Twitter: @OptimiseOrDie
LinkedIn: linkd.in/pvrg14
More reading.
RESOURCE PACK
So you want examples?
• Belron – Ed Colley
• Dell – Nazli Yuzak
• Shop Direct – Paul Postance (now with EE)
• Expedia – Oliver Paton
• Schuh – Stuart McMillan
• Soundcloud – Eleftherios Diakomichalis & Ole Bahlmann
• Gov.uk – Adam Bailin (now with the BBC)
Read the gov.uk principles : www.gov.uk/designprinciples
And my personal favourite of 2013 – Airbnb!
@OptimiseOrDie
Best of Twitter
@OptimiseOrDie
@danbarker Analytics
@fastbloke Analytics
@timlb Analytics
@jamesgurd Analytics
@therustybear Analytics
@carmenmardiros Analytics
@davechaffey Analytics
@priteshpatel9 Analytics
@cutroni Analytics
@Aschottmuller Analytics, CRO
@cartmetrix Analytics, CRO
@Kissmetrics CRO / UX
@Unbounce CRO / UX
@Morys CRO/Neuro
@PeepLaja CRO
@TheGrok CRO
@UIE UX
@LukeW UX / Forms
@cjforms UX / Forms
@axbom UX
@iatv UX
@Chudders Photo UX
@JeffreyGroks Innovation
@StephanieRieger Innovation
@DrEscotet Neuro
@TheBrainLady Neuro
@RogerDooley Neuro
@Cugelman Neuro
Best of the Web
@OptimiseOrDie
Whichtestwon.com
Unbounce.com
Kissmetrics.com
Uxmatters.com
RogerDooley.com
PhotoUX.com
TheTeamW.com
Baymard.com
Lukew.com
PRWD.com
Measuringusability.com
ConversionXL.com
Smartinsights.com
Econsultancy.com
Cutroni.com
www.GetMentalNotes.com
Best of Books
@OptimiseOrDie
#13 Top Augmentation Tools
#1 Session Replay
#2 Browser & Email testing
#3 VOC, Survey & Feedback tools
#4 Guerrilla Usability
#5 Productivity tools
#6 Split testing
#7 Performance
#8 Crowdsourcing
#9 Analytics Love
13.1 - Session Replay
• 3 kinds of tool:
Client side
• Normally JavaScript based
• Pros: rich mouse and click data, errors, forms analytics, UI interactions
• Cons: dynamic content issues, performance hit
Server side
• Black box -> proxy, sniffer, port copying device
• Pros: gets all dynamic content, fast, legally tight
• Cons: no client side interactions, Ajax, HTML5 etc.
Hybrid
• Client side and sniffing with a central data store
13.1 - Session Replay
• Vital for optimisers & fills in a ‘missing link’ for insight
• Rich source of data on visitor experiences
• Segment by browser, visitor type, behaviour, errors
• Forms Analytics (when instrumented) are awesome
• Can be used to optimise in real time!
Session replay tools
• Clicktale (Client) www.clicktale.com
• SessionCam (Client) www.sessioncam.com
• Mouseflow (Client) www.mouseflow.com
• Ghostrec (Client) www.ghostrec.com
• Usabilla (Client) www.usabilla.com
• Tealeaf (Hybrid) www.tealeaf.com
• UserReplay (Server) www.userreplay.com
13.2 - Feedback / VOC tools
• Anything that allows immediate, real-time, on-page feedback
• Comments on elements, pages and overall site & service
• Can be used for behaviourally triggered feedback
• Tip: take the Call Centre out for beers
• Kampyle
www.kampyle.com
• Qualaroo
www.qualaroo.com
• 4Q
4q.iperceptions.com
• Usabilla
www.usabilla.com
13.3 - Survey Tools
• Surveymonkey www.surveymonkey.com (1/5)
• Zoomerang www.zoomerang.com (3/5)
• SurveyGizmo www.surveygizmo.com (5/5)
• For surveys, web forms, checkouts, lead gen – anything with
form filling – you have to read these two:
Caroline Jarrett (@cjforms)
Luke Wroblewski (@lukew)
• With their work and copywriting from @stickycontent, I
managed to get a survey with a 35% clickthrough from email
and a whopping 94% form completion rate.
• Their awesome insights are the killer app I have when
optimising forms and funnel processes for clients.
13.4 – Experience the experience!
Email testing www.litmus.com
www.returnpath.com
www.lyris.com
Browser testing www.crossbrowsertesting.com
www.cloudtesting.com
www.multibrowserviewer.com
www.saucelabs.com
Mobile devices www.perfectomobile.com
www.deviceanywhere.com
www.mobilexweb.com/emulators
www.opendevicelab.com
13.5 - Stingy Client Testing
• Mobile can be lots of fun
• Some low budget stuff you may know about already:
CamStudio (free)
www.camstudio.org
Mediacam AV (cheap)
www.netu2.co.uk
Silverback (Mac)
www.silverbackapp.com
Screenflow (Mac)
www.telestream.net
UX Recorder (iOS), Skype Hugging, Reflection
www.uxrecorder.com & bit.ly/tesTfm & bit.ly/GZMgxR
13.6 - Productivity tools
Oh sh*t
13.6 - Join.me
13.6 - Pivotal Tracker
13.6 – Trello
13.6 - Basecamp
13.6 - Google Docs and Automation
• Lots of people don’t know this
• Serious time is getting wasted on pulling and preparing data
• Use the Google API to roll your own reports straight into Big G
• Google Analytics + API + Google Docs integration = A BETTER LIFE!
• Hack your way to more productive weeks
• Learn how to do this to make completely custom reports (see the sketch below)
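A minimal sketch of the idea using the v3-era Core Reporting API and google-api-python-client. The profile id, metric choice and OAuth credential setup are placeholders to adapt to your own account:

```python
# Minimal sketch: pull a custom report from the Core Reporting API (v3-era)
# and dump rows ready for a Google Sheet / CSV. Profile id, metrics and the
# OAuth credential setup are placeholders -- adapt to your own account.
from googleapiclient.discovery import build  # pip install google-api-python-client

def pull_report(credentials, profile_id):
    analytics = build("analytics", "v3", credentials=credentials)
    response = analytics.data().ga().get(
        ids="ga:" + profile_id,
        start_date="30daysAgo",
        end_date="today",
        metrics="ga:sessions,ga:transactions",
        dimensions="ga:date,ga:deviceCategory",
    ).execute()
    return response.get("rows", [])

# each row is a plain list -- write them straight into a sheet or CSV:
# for row in pull_report(creds, "12345678"):
#     print(",".join(row))
```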
13.7 - Cloud Collaboration
• LucidChart
• Webnotes
• Protonotes
• Conceptshare
13.8 - Split testing tools – Cheap!
• Google Content Experiments
bit.ly/Ljg7Ds
• Multi Armed Bandit Explanation
bit.ly/Xa80O8
• Optimizely
www.optimizely.com
• Visual Website Optimizer
www.visualwebsiteoptimizer.com
13.9 - Performance
• Google Site Speed
• Webpagetest.org
• mobitest.akamai.com

Scare the Ecom or Trading director:

Site              Size    Requests
The Daily Mail    4574k   437
Starbucks         1300k   145
Direct Line        887k    45
Ikea (.se)         684k    14
Currys             667k    68
Marks & Spencer    308k    45
Tesco              234k    15
The Guardian       195k    35
BBC News           182k    62
Auto Trader        151k    47
Amazon             128k    16
Aviva              111k    18
Autoglass           25k    10

Real testing: mobitest.akamai.com
If you really care, download this deck: slidesha.re/PDpTPD
13.10 – UX Crowd tools
Remote UX tools (P=Panel, S=Site recruited, B=Both)
Usertesting (B) www.usertesting.com
Userlytics (B) www.userlytics.com
Userzoom (S) www.userzoom.com
Intuition HQ (S) www.intuitionhq.com
Mechanical turk (S) www.mechanicalturk.com
Loop11 (S) www.loop11.com
Open Hallway (S) www.openhallway.com
What Users Do (P) www.whatusersdo.com
Feedback army (P) www.feedbackarmy.com
User feel (P) www.userfeel.com
Ethnio (For Recruiting) www.ethnio.com
Feedback on Prototypes / Mockups
Pidoco www.pidoco.com
Verify from Zurb www.verifyapp.com
Five second test www.fivesecondtest.com
Conceptshare www.conceptshare.com
Usabilla www.usabilla.com
13.11 - Web Analytics Love
• Properly instrumented analytics
• Investment of 5-10% of developer time
• Add more than you need
• Events insights
• Segmentation
• Call tracking love!
13.12 - Tap 2 Call tracking
Step 1 : Add a unique phone number on ALL channels
(or insert your own dynamic number)
Step 2 : For phones, add “Tap to Call” or “Click to Call”
• Add Analytics event or tag for phone calls!
• Very reliable data, easy & cheap to do
• What did they do before calling?
• Which page did they call you from?
• What PPC or SEO keyword did they use?
• Incredibly useful – this keyword level call data
• What are you over or underbidding for?
• Will help you shave 10–20%+ off PPC
• Which online marketing really sucks?
[Chart: Phone to Booking Ratio (0–25%) by PPC keyword – safelite, windshield chip repair, safelite windshield, autoglass, auto window replacement, safelight auto, auto glass repair, windshield replacement costs, safelite auto glass, safe auto glass, safelite repair, windshield, auto glass replacement, carglass, safelite locations, auto glass repair quotes, windshield repair, mobile windshield replacement, replace windshield, car windshield repair, new windshield cost, auto glass windshield replacement, car window repair cost, safegl, auto glass repair houston tx, windshield crack]
13.12 – And desktops?
Step 1 : Add ‘Click to reveal’
• Can be a link, button or a collapsed section
• Add to your analytics software
• This is a great budget option!
Step 2 : Invest in call analytics
• Unique visitor tracking for desktop
• Gives you that detailed marketing data
• Easy to implement
• Integrates with your web analytics
• Let me explain…
13.12 - So what does phone tracking get you?
• You can do it for free on your online channels
• If you’ve got any phone sales or contact operation, this will
change the game for you
• For the first time, PHONE analytics that the web channel can claim
• Optimise your PPC spend
• Track and Test stuff on phones, using web technology
• The two best phone A/B tests? You’ll laugh!
Company                  Website                        Coverage
Mongoose Metrics*        www.mongoosemetrics.com        UK, USA, Canada
Ifbyphone*               www.ifbyphone.com              USA
TheCallR*                www.thecallr.com               USA, Canada, UK, IT, FR, BE, ES, NL
Call Tracking Metrics    www.calltrackingmetrics.com    USA
Hosted Numbers           www.hostednumbers.com          USA
Callcap                  www.callcap.com                USA
Freespee*                www.freespee.com               UK, SE, FI, NO, DK, LT, PL, IE, CZ, SI, AT, NL, DE
Adinsight*               www.adinsight.co.uk            UK
Infinity Tracking*       www.infinity-tracking.com      UK
Optilead*                www.optilead.co.uk             UK
Switchboard Free         www.switchboardfree.co.uk      UK
Freshegg                 www.freshegg.co.uk             UK
Avanser                  www.avanser.com.au             AUS
Jet Interactive*         www.jetinteractive.com.au      AUS
* I read up on these or talked to them. These are my picks.
13.12 - Web Analytics Love
• People, Process, Human problems
• UX of web analytics tools and reports
• Make the UI force decisions!
• Playability and exploration
• Skunkworks project time (5-10%)
• Give it love, time, money and iteration
• How often do you iterate analytics?
• Lastly, spend to automate, gain MORE time
END & QUESTIONS
When you get a 20% lift
RESOURCE PACK
• Maturity model
• Crowdsourced UX
• Collaborative tools
• Testing tools for CRO & QA
• Belron methodology example
• CRO and testing resources
The CRO Maturity Model – five levels, each scored on culture, testing focus, analytics focus, insight methods, process and mission:

Level 1 – Starter Level
• Culture: ad hoc, chaotic good
• Testing focus: guessing, A/B testing, low hanging fruit
• Insight methods: basic tools, analytics, surveys, contact centre, low budget usability
• Process: outline process, small team
• Mission: get buy-in

Level 2 – Early maturity
• Culture: local heroes
• Testing focus: + multivariate, micro testing, bounce rates, big volume landing pages
• Analytics focus: + funnel analysis, low converting & high loss pages
• Insight methods: + session replay, no segments
• Mission: prove ROI

Level 3 – Serious testing
• Culture: dedicated team, cross silo team, systematic tests
• Testing focus: + funnel optimisation, call tracking, some segments, volume opportunities
• Analytics focus: + offline integration, single channel picture
• Insight methods: + regular usability testing/research, prototyping, session replay, onsite feedback
• Process: well developed
• Mission: scale the testing

Level 4 – Core business value
• Culture: ninja team
• Testing focus: + funnel fixes, forms analytics, channel switchers
• Analytics focus: + cross channel testing, integrated CRO and analytics, segmentation
• Insight methods: + User Centered Design, layered feedback, mini product tests
• Process: streamlined; rapid iterative testing and design; customer sat scores tied to UX
• Mission: mine value

Level 5 – You rock, awesomely
• Culture: testing in the DNA
• Testing focus: + spread tool use, dynamic adaptive targeting, machine learning, realtime
• Analytics focus: multichannel funnels, cross channel synergy
• Insight methods: + all channel view of customer, driving offline using online; all promotion driven by testing
• Process: company wide
• Mission: continual improvement
2 – UX Crowd tools
Remote UX tools (P=Panel, S=Site recruited, B=Both)
Usertesting (B) www.usertesting.com
Userlytics (B) www.userlytics.com
Userzoom (S) www.userzoom.com
Intuition HQ (S) www.intuitionhq.com
Mechanical turk (S) www.mechanicalturk.com
Loop11 (S) www.loop11.com
Open Hallway (S) www.openhallway.com
What Users Do (P) www.whatusersdo.com
Feedback army (P) www.feedbackarmy.com
User feel (P) www.userfeel.com
Ethnio (For Recruiting) www.ethnio.com
Feedback on Prototypes / Mockups
Pidoco www.pidoco.com
Verify from Zurb www.verifyapp.com
Five second test www.fivesecondtest.com
Conceptshare www.conceptshare.com
Usabilla www.usabilla.com
3 - Collaborative Tools
Oh sh*t
3.1 - Join.me
3.2 - Pivotal Tracker
3.3 – Trello
3.4 - Basecamp
3.5 - Google Docs and Automation
• Lots of people don’t know this
• Serious time is getting wasted on pulling and preparing data
• Use the Google API to roll your own reports straight into Big G
• Google Analytics + API + Google Docs integration = A BETTER LIFE!
• Hack your way to more productive weeks
• Learn how to do this to make completely custom reports
3.6–3.9 - Cloud Collaboration
• LucidChart
• Webnotes
• Protonotes
• Conceptshare
4 – QA and Testing tools
Email testing www.litmus.com
www.returnpath.com
www.lyris.com
Browser testing www.crossbrowsertesting.com
www.cloudtesting.com
www.multibrowserviewer.com
www.saucelabs.com
Mobile devices www.perfectomobile.com
www.deviceanywhere.com
www.mobilexweb.com/emulators
www.opendevicelab.com
5 – Methodologies - Lean UX
Positive
– Lightweight and very fast methods
– Realtime or rapid improvements
– Documentation light, value high
– Low on wastage and frippery
– Fast time to market, then optimise
– Allows you to pivot into new areas
Negative
– Often needs user test feedback to
steer the development, as data not
enough
– Bosses distrust stuff where the
outcome isn’t known
“The application of UX design methods into product
development, tailored to fit Build-Measure-Learn cycles.”
5 - Agile UX / UCD / Collaborative Design
Positive
– User centric
– Goals met substantially
– Rapid time to market (especially when
using Agile iterations)
Negative
– Without quant data, user goals can drive the show – missing the business sweet spot
– Some people find it hard to integrate with siloed teams
– Doesn’t work with waterfall IMHO
[Cycle diagram: Research -> Concept -> Wireframe -> Prototype -> Test -> Analyse -> repeat]
“An integration of User Experience Design and Agile* Software Development Methodologies”
*Sometimes
CRO
5 - Lean Conversion Optimisation
Positive
– A blend of several techniques
– Multiple sources of Qual and Quant data aids triangulation
– A CRO analytics focus unearths value inside all products
Negative
– Needs a one team approach with a strong PM who is a
Polymath (Commercial, Analytics, UX, Technical)
– Only works if your teams can take the pace – you might be
surprised though!
“A blend of User Experience Design, Agile PM, Rapid Lean
UX Build-Measure-Learn cycles, triangulated data
sources, triage and prioritisation.”
5 - Lean CRO
[Cycle diagram: Immersion -> Inspection -> Identify -> Triage & Triangulate -> Outcome Streams -> Instrument -> Measure -> Learn]
5 - Triage and Triangulation
• Starts with the analytics data
• Then UX and user journey walkthrough from SERPS -> key paths
• Then back to analytics data for a whole range of reports:
• Segmented reporting, Traffic sources, Device viewport and
browser, Platform (tablet, mobile, desktop) and many more
• We use other tools or insight sources to help form hypotheses
• We triangulate with other data where possible
• We estimate the potential uplift of fixing/improving something
as well as the difficulty (time/resource/complexity/risk)
• A simple quadrant shows the value clusters (see the sketch below)
• We then WORK the highest and easiest scores by…
• Turning every opportunity spotted into an OUTCOME
“This is where the smarts of CRO are – in identifying the
easiest stuff to test or fix that will drive the largest uplift.”
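A minimal sketch of the quadrant reduced to a single score: estimated uplift over estimated effort, worked highest first. The 1-5 scores are invented workshop estimates, not measurements:

```python
# Minimal sketch of the triage quadrant as a single score: estimated uplift
# over estimated effort, worked highest first. The 1-5 scores are invented
# workshop estimates, not measurements.
issues = [
    ("checkout error messages", 5, 2),   # (issue, est. uplift, est. effort)
    ("homepage hero rewrite",   3, 1),
    ("full funnel redesign",    5, 5),
    ("footer link tidy",        1, 1),
]
for issue, uplift, effort in sorted(issues, key=lambda i: i[1] / i[2], reverse=True):
    print(f"{uplift / effort:4.1f}  {issue}")
```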
5 - The Bucket Methodology
“Helps you to stream actions from the insights and prioritisation work.
Forces an action for every issue, a counter for every opportunity being lost.”
• Test
If there is an obvious opportunity to shift behaviour, expose insight or increase conversion – this bucket is where you place stuff for testing. If you have traffic and leakage, this is the bucket for that issue.
• Instrument
If an issue is placed in this bucket, it means we need to beef up the analytics reporting. This can involve fixing, adding or improving tag or event handling on the analytics configuration. We instrument both structurally and for insight in the pain points we’ve found.
• Hypothesise
This is where we’ve found a page, widget or process that’s just not working well but we don’t see a clear single solution. Since we need to really shift the behaviour at this crux point, we’ll brainstorm hypotheses. Driven by evidence and data, we’ll create test plans to find the answers to the questions and change the conversion or KPI figure in the desired direction.
• Just Do It
JFDI (Just Do It) – a bucket for issues where a fix is easy to identify or the change is a no-brainer. Items marked with this flag can either be deployed in a batch or as part of a controlled test. Stuff in here requires low effort or is a micro-opportunity to increase conversion and should be fixed.
• Investigate
You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging.
5 - Belron example – Funnel replacement
Final prototype -> Usability issues left -> Final changes -> Release build
-> Legal review kickoff -> Cust services review kickoff -> Marketing review -> Test plan signoff (Legal, Mktng, CCC)
-> Instrument analytics -> Instrument Contact Centre -> Offline tagging -> QA testing -> End-to-end testing
-> Launch 90/10% -> Monitor -> Launch 80/20% -> Monitor < 1 week -> Launch 50/50% -> Go live 100%
-> Analytics review -> Washup and actions -> New hypotheses -> New test design -> Rinse and repeat! (the ramp is sketched below)
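A minimal sketch of how a ramp like 90/10 -> 80/20 -> 50/50 can be run with deterministic bucketing, so each visitor stays in the same variant as the split widens. The hashing scheme is an illustrative assumption, not Belron’s actual implementation:

```python
# Minimal sketch: deterministic bucketing for a ramped launch. A visitor hashes
# to a fixed point on 0-9999, so raising the variant share from 10% to 50% only
# moves control visitors into the variant, never the other way round.
# (Hashing scheme is an illustrative assumption, not Belron's actual code.)
import hashlib

def bucket(visitor_id: str, variant_share: float) -> str:
    point = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16) % 10000
    return "variant" if point < variant_share * 10000 else "control"

for share in (0.10, 0.20, 0.50):          # monitor at each step before widening
    n = sum(bucket(f"visitor-{i}", share) == "variant" for i in range(10000))
    print(f"target {share:.0%} -> actual {n / 100:.1f}%")
```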
6 - CRO and Testing resources
• 101 Landing page tips : slidesha.re/8OnBRh
• 544 Optimisation tips : bit.ly/8mkWOB
• 108 Optimisation tips : bit.ly/3Z6GrP
• 32 CRO tips : bit.ly/4BZjcW
• 57 CRO books : bit.ly/dDjDRJ
• CRO article list : bit.ly/nEUgui
• Smashing Mag article : bit.ly/8X2fLk
END SLIDES
Feel free to steal, re-use, appropriate or otherwise lift
stuff from this deck.
If it was useful to you – email me or tweet me and tell me why – I’d be DELIGHTED to hear!
Regards,
Craig.
Why does my CRO suck?
5th Sep 2013 @OptimiseOrDie
@OptimiseOrDie
Timeline
–1998 | 1999–2004 | 2004–2008 | 2008–2012
Belron Brands
SEO, PPC, UX, Analytics, A/B and Multivariate testing, Customer Satisfaction, Design, QA, Development, Performance
40+ websites, 34 countries, 19 languages, €1bn+ revenue
8 people
@OptimiseOrDie
Ahh, how it hurt
If you’re not a part of the solution, there’s good money to be made in prolonging the problem
Out of my comfort zone…
@OptimiseOrDie
Behind enemy lines…
@OptimiseOrDie
Nice day at the office, dear?
@OptimiseOrDie
Competition…
Traffic is harder!
SEO/PPC
Panguin tool…
Casino Psychology
If it isn’t working, you’re not doing it right
@OptimiseOrDie
#1 : Your analytics are cattle trucked
@OptimiseOrDie
#1 : Common problems (GA)
• Dual purpose goal page
– One page used by two outcomes – and not split
• Cross domain tracking
– Where you jump between sites, this borks the data
• Filters not correctly set up
– Your office, agencies, developers are skewing data
• Code missing or double code
– Causes visit splitting, double pageviews, skews bounce rate
• Campaign, Social, Email tracking etc.
– External links you generate are not setup to record properly
• Errors not tracked (404, 5xx, Other)
– You are unaware of error volumes, locations and impact
• Dual flow funnels
– Flows join in the middle of a funnel or loop internally
• Event tracking skews bounce rate
– If an event is set to be ‘interactive’ – it can skew bounce rate (example)
@OptimiseOrDie
#1 : Common problems (GA) – EXAMPLE

Landing   1st int.   Loss    2nd int.   Loss    3rd int.   Loss    4th int.   Loss
55900     527        99.1%   66         87.5%   55         16.7%   33         40.0%
30900     4120       86.7%   2470       40.0%   1680       32.0%   1240       26.2%
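The loss figures derive directly from the step counts. A minimal sketch reproducing the first row:

```python
# Minimal sketch: the loss at each step is the share of the previous step's
# visitors who never reach the next interaction (first example row).
steps = [55900, 527, 66, 55, 33]
for prev, nxt in zip(steps, steps[1:]):
    print(f"{prev} -> {nxt}: {1 - nxt / prev:.1%} lost")
# 99.1%, 87.5%, 16.7%, 40.0% -- a 99.1% first-step loss is the smell to chase
```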
#1 : Solutions
• Get a Health Check for your Analytics
– Try @prwd, @danbarker, @peter_oneill or ask me!
• Invest continually in instrumentation
– Aim for at least 5% of dev time to fix + improve
• Stop shrugging: plug your insight gaps
– Change ‘I don’t know’ to ‘I’ll find out’
• Look at event tracking (Google Analytics)
– If set up correctly, you get wonderful insights
• Would you use paper instead of a till?
– You wouldn’t do it in retail so stop doing it online!
• How do you win F1 races?
– With the wrong performance data, you won’t
@OptimiseOrDie
#2 : Your inputs are all wrong
Insight inputs – #FAIL: competitor copying, guessing, dice rolling, an article the CEO read, competitor change, panic, ego, opinion, cherished notions, marketing whims, cosmic rays, not ‘on brand’ enough, IT inflexibility, internal company needs, some dumbass consultant, shiny feature blindness, knee-jerk reactions
@OptimiseOrDie
#2 : Your inputs are all wrong
Insight inputs – the ones you should be using: segmentation, surveys, sales and call centre, session replay, social analytics, customer contact, eye tracking, usability testing, forms analytics, search analytics, voice of customer, market research, A/B and MVT testing, big & unstructured data, web analytics, competitor evals, customer services
@OptimiseOrDie
#2 : Solutions
• Usability testing and User Centred design
– If you’re not doing this properly, you’re hosed
• Champion UX+ - with added numbers
– (Re)designing without inputs + numbers is guessing
• You need one team on this, not silos
– Stop handing round the baby (I’ll come back to this)
• Ego, Opinion, Cherished notions – fill gaps
– Fill these vacuums with insights and data
• Champion the users
– Someone needs to take their side!
• You need multiple tool inputs
– Let me show you my core list
@OptimiseOrDie
#2 : Core tools
• Properly set up analytics
– Without this foundation, you’re toast
• Session replay tools
– Clicktale, Tealeaf, Sessioncam and more…
• Cheap / Crowdsourced usability testing
– See the resource pack for more details
• Voice of Customer / Feedback tools
– 4Q, Kampyle, Qualaroo, Usabilla and more…
• A/B and Multivariate testing
– Optimizely, Google Content Experiments, VWO
• Email, Browser and Mobile testing
– You don’t know if it works unless you check
@OptimiseOrDie
#3 : You’re not testing (enough)
@OptimiseOrDie
#3 : Common problems
• Let’s take a quick poll
– How many tests do you complete a month?
• Not enough resource
– You MUST hire, invest and ringfence time and staff for CRO
• Testing has gone to sleep
– Some vendors have a ‘rescue’ team for these accounts
• Vanity testing takes hold
– Getting one test done a quarter? Still showing it a year later?
• You keep testing without buyin at C-Level
– If nobody sees the flower, was it there?
• You haven’t got a process – just a plugin
– Insight, Brainstorm, Wireframe, Design, Build, QA test, Monitor,
Analyse. Tools, Process, People, Time -> INVEST
• IT or release barriers slow down work
– Circumvent with tagging tools
– Develop ways around the innovation barrier
#4 : Not executing fast enough
@OptimiseOrDie
#4 : Not executing fast enough
• Silo Mentality means pass-the-product
– No ‘one team’ approach means no ‘one product’
• The process is badly designed
– See the resource pack or ask me later!
• People mistake hypotheses for finals
– Endless argument, tweaking means NO TESTING – let the test
decide, please!
• No clarity : authority or decision making
– You need a strong leader to get things decided
• Signoff takes far too long
– Signoff by committee is a velocity killer – the CUSTOMER and
the NUMBERS are the signoff
• You set your target too low
– Aim for a high target and keep increasing it
@OptimiseOrDie
CRO
@OptimiseOrDie
#4 : Execution solutions
• Agile, One Team approach
– Everyone works on the lifecycle, together
• Hire Polymaths
– T-shaped or just multi-skilled, I hire them a lot
• Use Collaborative Tools, not meetings
– See the resource pack
• Market the results
– Market this stuff internally like a PR agency
– Encourage betting in the office
• Smash down silos – a special mission
– Involve the worst offenders in the hypothesis team
– “Hold your friends close, and your enemies closer”
– Work WITH the developers to find solutions
– Ask Developers and IT for solutions, not apologies
@OptimiseOrDie
#5 : Product cycles are too long
[Chart: conversion plotted over months 0–18 of the product cycle]
@OptimiseOrDie
#5 : Solutions
• Give Priority Boarding for opportunities
– The best seats reserved for metric shifters
• Release more often to close the gap
– More testing resource helps, analytics ‘hawk eye’
• Kaizen – continuous improvement
– Others call it JFDI (just f***ing do it)
• Make changes AS WELL as tests, basically!
– These small things add up
• RUSH Hair booking – Over 100 changes
– No functional changes at all – 37% improvement
• Inbetween product lifecycles?
– The added lift for 10 days’ work, worth 360k
@OptimiseOrDie
#5 : Make your own cycles
@OptimiseOrDie
#6 – No Photo UX
• Persuasion / Influence /
Direction / Explanation
• Helps people process
information and stories
• Vital to sell an ‘experience’
• Helps people recognise and
discriminate between things
• Supports Scanning Visitors
• Drives emotional response
short.cx/YrBczl
• Very powerful and under-estimated area
• I’ve done over 20M visitor tests with people images for a service industry – some tips:
• The person, pose, eye gaze, facial expressions and body language cause visceral emotional reactions and big changes in behaviour
• Eye gaze is crucial – to engage you or to ‘point’
Photo UX
• Negative body language is a turnoff
• Uniforms and branding a positive (ball cap)
• Hands are hard to handle – use a prop to help
• For Ecommerce – tip! test bigger images!
• Autoglass and Belron always use real people
• In most countries (out of 33) with strong female
and male images in test, female wins
• Smile and authenticity in these examples is
absolutely vital
• So, I have a question for you
Photo UX
@OptimiseOrDie
@OptimiseOrDie
Terrible Stock Photos : headsethotties.com & awkwardstockphotos.com
Laughing at Salads : womenlaughingwithsalad.tumblr.com
BBC Fake Smile Test : bbc.in/5rtnv
@OptimiseOrDie
SPAIN
+22% over control
99% confidence
@OptimiseOrDie
@OptimiseOrDie
#7 : Your tests are cattle trucked
• Many tests fail due to QA or browser bugs
– Always do cross browser QA testing – see resources
• Don’t rely on developers saying ‘yes’
– Use your analytics to define the list to test
• Cross instrument your analytics
– You need this to check the test software works
• Store the variant(s) seen in analytics
– Compare people who saw A/B/A vs. A/B/B
• Segment your data to find variances
– Failed tests usually show differences for segments
• Watch the test and analytics CLOSELY
– After you go live, religiously check both
– Read this article : stanford.io/15UYov0
@OptimiseOrDie
#8 : Stats are confusing
• Many testers & marketing people struggle
– How long will it take to run the test?
– Is the test ready?
– How long should I keep it running for?
– It says it’s ready after 3 days – is it?
– Can we close it now – the numbers look great!
• A/B testing maths for dummies (and see the sketch below):
– http://bit.ly/15UXLS4
• For more advanced testers:
– Read this : http://bit.ly/1a4iJ1H
• I’m going to build a stats course
– To explain all the common questions
– To save me having to explain this crap all the time
@OptimiseOrDie
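As a starter for that course, a minimal sketch of the one calculation everyone skips: visitors needed per variant, via the standard two-proportion normal approximation (95% confidence, 80% power):

```python
# Minimal sketch of the pre-test calculation everyone skips: visitors needed
# per variant (two-sided, 95% confidence, 80% power, normal approximation).
import math

def sample_size_per_variant(base_rate, rel_lift, z_alpha=1.96, z_power=0.84):
    p1, p2 = base_rate, base_rate * (1 + rel_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

print(sample_size_per_variant(0.05, 0.10))   # 5% base rate, 10% relative lift
# ~31,000 visitors per variant -- why 3-day "winners" are usually noise
```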
#9 : You’re not segmenting
• Averages lie
– What about new vs. returning visitors?
– What about different keyword groups?
– Landing pages? Routes? Attributes
• Failed tests are just ‘averaged out’
– You must look at segment level data
– You must integrate the analytics + a/b test software
• The downside?
– You’ll need more test data – enough to segment
• The upside?
– Helps you figure out why a test didn’t perform (see the sketch below)
– Finds value in failed or ‘no difference’ tests
– Drives further testing focus
@OptimiseOrDie
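A minimal sketch with made-up numbers of reading a test at segment level, flagging any segment still under the 250-outcome line before trusting its lift:

```python
# Minimal sketch, made-up numbers: read the test per segment and flag any
# segment still under the 250-outcome line before trusting its lift.
segments = {
    # segment: (control conversions, control visitors, variant conversions, variant visitors)
    "desktop": (400, 8000, 465, 8000),
    "mobile":  (90, 3000, 60, 3000),
    "tablet":  (12, 300, 20, 310),
}
for name, (c, cn, v, vn) in segments.items():
    lift = (v / vn) / (c / cn) - 1
    note = "  <-- sample too small to trust" if min(c, v) < 250 else ""
    print(f"{name:8} lift {lift:+.1%}{note}")
```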
#10 : You’re unichannel optimising
• Not using call tracking
– Look at Infinity Tracking (UK)
– Get Google keyword level call volumes!
• You don’t measure channel switchers
– People who bail a funnel and call
– People who use chat or other contact/sales
• You ‘forget’ mobile & tablet journeys
– Walk the path from search -> ppc/seo -> site
– Optimise for all your device mix & journeys
• Your site is responsive
– Testing may now bleed across device platforms
– Changing in one place may impact many others
– QA, Device and Browser testing even more vital
@OptimiseOrDie
SUMMARY : The best Companies….
• Invest continually in Analytics instrumentation, tools & people
• Use an Agile, iterative, Cross-silo, One team project culture
• Prefer collaborative tools to having lots of meetings
• Prioritise development based on numbers and insight
• Practice real continuous product improvement, not SLED
• Source photos and copy that support persuasion and utility
• Have cross channel, cross device design, testing and QA
• Segment their data for valuable insights, every test or change
• Continually try to reduce cycle (iteration) time in their process
• Blend ‘long’ design, continuous improvement AND split tests
• Make optimisation the engine of change, not the slave of ego
• See the Maturity Model in the resource pack
@OptimiseOrDie
Why does my Mobile Conversion rate suck?  19 Sep 2013 @ Conversion Thursday #...Why does my Mobile Conversion rate suck?  19 Sep 2013 @ Conversion Thursday #...
Why does my Mobile Conversion rate suck? 19 Sep 2013 @ Conversion Thursday #...
 
Toolkits and tips of the conversion pros v 1.6
Toolkits and tips of the conversion pros v 1.6Toolkits and tips of the conversion pros v 1.6
Toolkits and tips of the conversion pros v 1.6
 
Confessions of an uber optimiser conversion summit - craig sullivan - v 1.9
Confessions of an uber optimiser   conversion summit - craig sullivan - v 1.9Confessions of an uber optimiser   conversion summit - craig sullivan - v 1.9
Confessions of an uber optimiser conversion summit - craig sullivan - v 1.9
 
Elite Camp 2013 - Estonia
Elite Camp 2013 - EstoniaElite Camp 2013 - Estonia
Elite Camp 2013 - Estonia
 
Conversionista : Conversion manager course - Stockholm 20 march 2013
Conversionista : Conversion manager course  - Stockholm 20 march 2013Conversionista : Conversion manager course  - Stockholm 20 march 2013
Conversionista : Conversion manager course - Stockholm 20 march 2013
 
3 Optimisation Decks : WAW Copenhagen - 27 Feb 2013
3 Optimisation Decks : WAW Copenhagen - 27 Feb 20133 Optimisation Decks : WAW Copenhagen - 27 Feb 2013
3 Optimisation Decks : WAW Copenhagen - 27 Feb 2013
 
Measure camp tools of the cro rabble
Measure camp   tools of the cro rabbleMeasure camp   tools of the cro rabble
Measure camp tools of the cro rabble
 
5 cro tools that i can't live without
5 cro tools that i can't live without5 cro tools that i can't live without
5 cro tools that i can't live without
 
eMetrics Stockholm - What the F*** is wrong with my conversion?
eMetrics Stockholm - What the F*** is wrong with my conversion?eMetrics Stockholm - What the F*** is wrong with my conversion?
eMetrics Stockholm - What the F*** is wrong with my conversion?
 

Recently uploaded

Benefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksBenefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksSoftradix Technologies
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machinePadma Pradeep
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Enterprise Knowledge
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesSinan KOZAK
 
Artificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraArtificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraDeakin University
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Allon Mureinik
 
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024BookNet Canada
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024BookNet Canada
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticscarlostorres15106
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024Scott Keck-Warren
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 3652toLead Limited
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersThousandEyes
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsRizwan Syed
 

Recently uploaded (20)

Benefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksBenefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other Frameworks
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machine
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
Artificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraArtificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning era
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
Vulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptxVulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptx
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)
 
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL Certs
 

Why do my AB tests suck? measurecamp

  • 1. Why does my AB testing suck? 5th Sep 2013 @OptimiseOrDie
  • 2. #11 Top Split Test & CRO questions #1 How to choose test type! #2 Top CRO questions 2
  • 3. 11.1 – How to choose the test type • Test complexity – this drains time! • Analytics – will this allow you to test before/after? • Money now or Precise data later • What stage are you at with the client? • How volatile is the data and traffic? • Are there huge non weekly patterns – seasonal? • A/B tests – Design shift, new baseline, local maxima • MVT – Need to know variables, client can’t settle • Small MVT – A/B + MVT benefits (2x2x2 or 2x2x4) • Traffic is your biggest factor here • Use the VWO test run calculator • Do a rough calculation yourself • Let’s look at some recent pages of yours
  • 4. 11.2 – Top Conversion Questions • 32 questions, picked by Practitioners • I plan to record them all as a course! • What top stuff did I hear? “How long will my test take?” “When should I check the results?” “How do I know if it’s ready?” 4
  • 5. #1 The tennis court – Let’s say we want to estimate, on average, what height Roger Federer and Nadal hit the ball over the net at. So, let’s start the match: 5
  • 6. First Set Federer 6-4 – We start to collect values 6 62cm +/- 2cm 63.5cm +/- 2cm
  • 7. Second Set – Nadal 7-6 – Nadal starts sending them low over the net 7 62cm +/- 1cm 62.5cm +/- 1cm
  • 8. Final Set Nadal 7-6 – We start to collect values 61.8cm +/- .3cm 62cm +/- .3cm
  • 9. Let’s look at this a different way 9 62.5cm +/- 1cm 9.1 ± 0.3%
  • 10. Graph is a range, not a line: 9.1 ± 0.3%
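That range is the margin of error on a measured rate, which is why the graph should be read as a band rather than a line. A minimal sketch of the calculation, using the normal approximation; the 3,185 / 35,000 counts are hypothetical numbers chosen only so the output matches the slide’s 9.1 ± 0.3% figure:

```python
import math

# Margin of error on a conversion rate (normal approximation).
def rate_with_margin(conversions, visitors, z=1.96):  # z = 1.96 -> ~95% confidence
    p = conversions / visitors
    margin = z * math.sqrt(p * (1 - p) / visitors)
    return p, margin

p, m = rate_with_margin(3185, 35000)  # hypothetical counts for illustration
print(f"{p:.1%} +/- {m:.1%}")         # -> 9.1% +/- 0.3%
```

Because the margin shrinks with the square root of the sample, halving it needs roughly four times the traffic.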
  • 11. #1 Summary • The minimum length – 2 business cycles OR – 250, pref 350 outcomes in each – Calculate time to reach minimum – Work on footfall, not sitewide – So, if you get 1,000 visitors a day to a new test page – You convert at around 5% (50 checkouts) – You have two creatives – At current volumes, you’ll get ~25 checkouts a day for each creative – That means you need 14 days minimum – If they separate, it might take less (but business cycle rule kicks in) – If they don’t separate, it could take longer – Remember it’s a fuzzy region – not a precise point 11
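The arithmetic on this slide is easy to wrap in a helper so you can re-run it per page. A sketch using the slide’s own assumptions (1,000 visitors a day, ~5% conversion, two creatives, 250-350 outcomes per creative); the function name is mine:

```python
# Days needed to reach a minimum number of outcomes per creative.
def days_to_minimum(daily_visitors, conversion_rate, n_creatives, min_outcomes):
    outcomes_per_day_each = daily_visitors * conversion_rate / n_creatives
    return min_outcomes / outcomes_per_day_each

low  = days_to_minimum(1000, 0.05, 2, 250)  # ~10 days
high = days_to_minimum(1000, 0.05, 2, 350)  # ~14 days
print(f"Minimum run time: {low:.0f}-{high:.0f} days, "
      "then round up to whole business cycles")
```

Treat the answer as a floor: the business cycle rule still applies on top of it.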
  • 12. #1 Summary • The minimum length – Depends on performance – If you test two shades of blue? – Traffic may change – PPC budget might run out – TV advertising may start – Segment level performance may drive – You can estimate a test length – you cannot predict it – Be aware of your marketing activity, always – Watch your test like a hyper-engaged chef 12
  • 13. 11.3 – Are we there yet? Early test stages… • Ignore the graphs. Don’t draw conclusions. Don’t dance. Calm down. • Get a feel for the test but don’t do anything yet! • Remember – in A/B - 50% of returning visitors will see a new shiny website! • Until your test has had at least 1 business cycle and 250-350 outcomes, don’t bother drawing conclusions or getting excited! • You’re looking for anything that looks really odd – your analytics person should be checking all the figures until you’re satisfied • All tests move around or show big swings early in the testing cycle. Here is a very high traffic site – it still takes 10 days to start settling. Lower traffic sites will stretch this period further. 13
  • 14. 11.4 – What happens when a test flips on me? • Something like this can happen: • Check your sample size. If it’s still small, then expect this until the test settles. • If the test does genuinely flip – and quite severely – then something has changed with the traffic mix, the customer base or your advertising. Maybe the PPC budget ran out? Seriously! • To analyse a flipped test, you’ll need to check your segmented data. This is why you have a split testing package AND an analytics system. • The segmented data will help you to identify the source of the shift in response to your test. I rarely get a flipped one and it’s always something changing on me, without being told. The heartless bastards. 14
  • 15. 11.5 – What happens if a test is still moving around? • There are three reasons it is moving around – Your sample size (outcomes) is still too small – The external traffic mix, customers or reaction has suddenly changed or – Your inbound marketing driven traffic mix is completely volatile (very rare) • Check the sample size • Check all your marketing activity • Check the instrumentation • If no reason, check segmentation 15
  • 16. 11.6 – How do I know when it’s ready? • The hallmarks of a cooked test are: – It’s done at least 1 or 2 (preferred) cycles – You have at least 250-350 outcomes for each recipe – It’s not moving around hugely at creative or segment level performance – The test results are clear – even if the precise values are not – The intervals are not overlapping (much) – If a test is still moving around, you need to investigate – Always declare on a business cycle boundary – not the middle of a period (this introduces bias) – Don’t declare in the middle of a limited time period advertising campaign (e.g. TV, print, online) – Always test before and after large marketing campaigns (one week on, one week off) 16
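One way to mechanise the ‘intervals are not overlapping (much)’ check is to compute an interval per recipe and compare them. A hedged sketch with invented counts; note that demanding fully separated intervals is more conservative than a formal significance test, which suits a sanity check:

```python
import math

# 95% interval around a conversion rate (normal approximation).
def interval(conversions, visitors, z=1.96):
    p = conversions / visitors
    m = z * math.sqrt(p * (1 - p) / visitors)
    return p - m, p + m

a_lo, a_hi = interval(300, 6000)  # control: hypothetical counts
b_lo, b_hi = interval(360, 6000)  # variant: hypothetical counts
if a_lo < b_hi and b_lo < a_hi:
    print("Intervals overlap - keep watching")
else:
    print("Intervals separated - the result is clear")
```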
  • 17. 11.7 – What happens if it’s inconclusive? • Analyse the segmentation • One or more segments may be over and under • They may be cancelling out – the average is a lie • The segment level performance will help you (beware of small sample sizes) • If you genuinely have a test which failed to move any segments, it’s a crap test • This usually happens when it isn’t bold or brave enough in shifting away from the original design, particularly on lower traffic sites • Get testing again! 17
  • 18. 11.8 – What QA testing should I do? • Cross Browser Testing • Testing from several locations (office, home, elsewhere) • Testing the IP filtering is set up • Test tags are firing correctly (analytics and the test tool) • Test as a repeat visitor and check session timeouts • Cross check figures from 2+ sources • Monitor closely from launch, recheck 18
  • 19. 11.9 – What happens if it fails? • Learn from the failure • If you can’t learn from the failure, you’ve designed a crap test. Next time you design, imagine all your stuff failing. What would you do? If you don’t know or you’re not sure, get it changed so that a negative becomes useful. • So : failure itself at a creative or variable level should tell you something. • On a failed test, always analyse the segmentation • One or more segments will be over and under • Check for varied performance • Now add the failure info to your Knowledge Base: • Look at it carefully – what does the failure tell you? Which element do you think drove the failure? • If you know what failed (e.g. making the price bigger) then you have very useful information • You turned the handle the wrong way • Now brainstorm a new test 19
  • 20. 11.10 – Should I run an A/A test first? • No – and this is why: – It’s a waste of time – It’s easier to test and monitor instead – You are eating into test time – Also applies to A/A/B/B testing – A/B/A running at 25%/50%/25% is the best • Read my post here : http://bit.ly/WcI9EZ 20
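A deterministic way to get that 25%/50%/25% A/B/A split is to hash a stable visitor id into a bucket, so returning visitors always land on the same variant. A sketch; the weights follow the slide, while the id format and variant names are illustrative:

```python
import hashlib

# Deterministic 25/50/25 assignment: same visitor id, same variant, always.
def assign_variant(visitor_id, weights=(("A1", 25), ("B", 50), ("A2", 25))):
    bucket = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16) % 100
    upper = 0
    for name, weight in weights:
        upper += weight
        if bucket < upper:
            return name

print(assign_variant("visitor-12345"))
```

Comparing the A1 and A2 cells then gives you the A/A sanity check for free, without a separate test eating the calendar.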
  • 21. 11.11 – What is a good conversion rate? Higher than the one you had last month! 21
  • 22. #12 – Top reasons summary • You weren’t bold enough • You made the test too complex • Your test didn’t tell you anything (failures too!) • You didn’t do browser QA • The session model is broken • Your redirects are flawed • Your office is part of the bias • The test isn’t truly random / The samples aren’t representative • Your sample size is too small • You didn’t test for long enough • You didn’t look at the error rates • You didn’t cross instrument 22 • You’ve missed one or more underlying cycles • You don’t factor in before/after cycles • One test has an inherent performance bias (load time, for example) • You didn’t watch segment performance • You’re measuring too shallowly in the funnel • Your traffic mix has changed • You’re not measuring channel switchers (phone/email/chat etc.) • The analytics setup is broken!
  • 23. #13 – Summary - tests • This isn’t about tools – it’s about your thinking and approach to problems. Bravery and curiosity more important than wizardry! • Keep it simple and aim for actionable truths and insights • Invest in staff, training, analytics (yours and your clients) • More wired in clients means happier agency! • Fixing problems impresses clients even before you start (health check) • Prioritise issues into opportunity & effort • Showing models around money is a winner • Do something every week to make the client configuration better • Let me use a till analogy! • What about a Formula 1 racing car? • Get clients to pay you to invest in their future • Give staff time to train themselves, go on courses, get qualified • On that note – experience with core skills + topups = GA experts • Tap into the community out there • Hopefully this has given you a great springboard to MORE! 23
  • 24. Is there a way to fix this then? 24 Conversion Heroes! @OptimiseOrDie
  • 26. Email : sullivac@gmail.com Twitter : @OptimiseOrDie LinkedIn : linkd.in/pvrg14 More reading. 26
  • 28. So you want examples? • Belron – Ed Colley • Dell – Nazli Yuzak • Shop Direct – Paul Postance (now with EE) • Expedia – Oliver Paton • Schuh – Stuart McMillan • Soundcloud – Eleftherios Diakomichalis & Ole Bahlmann • Gov.uk – Adam Bailin (now with the BBC) Read the gov.uk principles : www.gov.uk/designprinciples And my personal favourite of 2013 – Airbnb! @OptimiseOrDie
  • 29. Best of Twitter @OptimiseOrDie @danbarker Analytics @fastbloke Analytics @timlb Analytics @jamesgurd Analytics @therustybear Analytics @carmenmardiros Analytics @davechaffey Analytics @priteshpatel9 Analytics @cutroni Analytics @Aschottmuller Analytics, CRO @cartmetrix Analytics, CRO @Kissmetrics CRO / UX @Unbounce CRO / UX @Morys CRO/Neuro @PeepLaja CRO @TheGrok CRO @UIE UX @LukeW UX / Forms @cjforms UX / Forms @axbom UX @iatv UX @Chudders Photo UX @JeffreyGroks Innovation @StephanieRieger Innovation @DrEscotet Neuro @TheBrainLady Neuro @RogerDooley Neuro @Cugelman Neuro
  • 30. Best of the Web @OptimiseOrDie Whichtestwon.com Unbounce.com Kissmetrics.com Uxmatters.com RogerDooley.com PhotoUX.com TheTeamW.com Baymard.com Lukew.com PRWD.com Measuringusability.com ConversionXL.com Smartinsights.com Econsultancy.com Cutroni.com www.GetMentalNotes.com
  • 32. #13 Top Augmentation Tools #1 Session Replay #2 Browser & Email testing #3 VOC, Survey & Feedback tools #4 Guerrilla Usability #5 Productivity tools #6 Split testing #7 Performance #8 Crowdsourcing #9 Analytics Love 32
  • 33. 13.1 - Session Replay • 3 kinds of tool : Client side • Normally Javascript based • Pros : Rich mouse and click data, errors, forms analytics, UI interactions. • Cons : Dynamic content issue, Performance hit Server side • Black Box -> Proxy, Sniffer, Port copying device • Pros : Gets all dynamic content, fast, legally tight • Cons : No client side interactions, Ajax, HTML5 etc. Hybrid • Clientside and Sniffing with central data store 33
  • 34. 13.1 - Session Replay • Vital for optimisers & fills in a ‘missing link’ for insight • Rich source of data on visitor experiences • Segment by browser, visitor type, behaviour, errors • Forms Analytics (when instrumented) are awesome • Can be used to optimise in real time! Session replay tools • Clicktale (Client) www.clicktale.com • SessionCam (Client) www.sessioncam.com • Mouseflow (Client) www.mouseflow.com • Ghostrec (Client) www.ghostrec.com • Usabilla (Client) www.usabilla.com • Tealeaf (Hybrid) www.tealeaf.com • UserReplay (Server) www.userreplay.com 34
  • 38. 13.2 - Feedback / VOC tools • Anything that allows immediate realtime onpage feedback • Comments on elements, pages and overall site & service • Can be used for behavioural triggered feedback • Tip! : Take the Call Centre for beers • Kampyle www.kampyle.com • Qualaroo www.qualaroo.com • 4Q 4q.iperceptions.com • Usabilla www.usabilla.com 38
  • 39. 13.3 - Survey Tools • Surveymonkey www.surveymonkey.com (1/5) • Zoomerang www.zoomerang.com (3/5) • SurveyGizmo www.surveygizmo.com (5/5) • For surveys, web forms, checkouts, lead gen – anything with form filling – you have to read these two: Caroline Jarrett (@cjforms) Luke Wroblewski (@lukew) • With their work and copywriting from @stickycontent, I managed to get a survey with a 35% clickthrough from email and a whopping 94% form completion rate. • Their awesome insights are the killer app I have when optimising forms and funnel processes for clients. 39
  • 40. 13.4 – Experience the experience! Email testing www.litmus.com www.returnpath.com www.lyris.com Browser testing www.crossbrowsertesting.com www.cloudtesting.com www.multibrowserviewer.com www.saucelabs.com Mobile devices www.perfectomobile.com www.deviceanywhere.com www.mobilexweb.com/emulators www.opendevicelab.com 40
  • 41. 13.5 - Stingy Client Testing • Mobile can be lots of fun • Some low budget stuff you may know about already: CamStudio (free) www.camstudio.org Mediacam AV (cheap) www.netu2.co.uk Silverback (Mac) www.silverbackapp.com Screenflow (Mac) www.telestream.net UX Recorder (iOS), Skype Hugging, Reflection www.uxrecorder.com & bit.ly/tesTfm & bit.ly/GZMgxR 41
  • 42. 13.6 - Productivity tools Oh sh*t 42
  • 44. 1.6 - Pivotal Tracker 44
  • 47. • Lots of people don’t know this • Serious time is getting wasted on pulling and preparing data • Use the Google API to roll your own reports straight into Big G • Google Analytics + API + Google docs integration = A BETTER LIFE! • Hack your way to having more productive weeks • Learn how to do this to make completely custom reports 1.6 - Google Docs and Automation 47
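As a concrete example of the API route, something like the sketch below pulls a report straight from Google Analytics instead of exporting by hand. It assumes the (v3-era) Core Reporting API via the google-api-python-client library; the view id, dates and metric names are placeholders, and the OAuth credential setup is omitted:

```python
from googleapiclient.discovery import build  # pip install google-api-python-client

PROFILE_ID = "ga:12345678"  # placeholder: your own GA view id

# Pull visits and goal completions by medium for a date range.
def report(credentials, start, end):
    service = build("analytics", "v3", credentials=credentials)
    return service.data().ga().get(
        ids=PROFILE_ID,
        start_date=start,
        end_date=end,
        metrics="ga:visits,ga:goalCompletionsAll",
        dimensions="ga:medium",
    ).execute()

# rows = report(creds, "2013-08-01", "2013-08-31").get("rows", [])
# ...then push the rows into a spreadsheet and the report builds itself.
```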
  • 48. • LucidChart 13.7 - Cloud Collaboration 48
  • 49. • Webnotes 13.7 - Cloud Collaboration 49
  • 50. • Protonotes 13.7 - Cloud Collaboration 50
  • 51. • Conceptshare 13.7 - Cloud Collaboration 51
  • 52. 13.8 - Split testing tools – Cheap! • Google Content Experiments bit.ly/Ljg7Ds • Multi Armed Bandit Explanation bit.ly/Xa80O8 • Optimizely www.optimizely.com • Visual Website Optimizer www.visualwebsiteoptimizer.com 52
  • 53. 13.9 - Performance • Google Site Speed • Webpagetest.org • Mobitest.akamai.org 53
  • 54. Real testing : mobitest.akamai.com
        Site               Size    Requests
        The Daily Mail     4574k   437
        Starbucks          1300k   145
        Direct line        887k    45
        Ikea (.se)         684k    14
        Currys             667k    68
        Marks & Spencers   308k    45
        Tesco              234k    15
        The Guardian       195k    35
        BBC News           182k    62
        Auto Trader        151k    47
        Amazon             128k    16
        Aviva              111k    18
        Autoglass          25k     10
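You can get a crude first pass at figures like these yourself before reaching for the proper tools: fetch the HTML and count the assets it references. This is a rough sketch only; real request counts and load times need a browser-level harness such as webpagetest.org or mobitest:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

# Counts tags that typically trigger extra HTTP requests. Approximate:
# it misses CSS-referenced assets, Ajax calls and script-injected content.
class AssetCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.assets = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("img", "script", "link", "iframe"):
            self.assets += 1

def probe(url):
    html = urlopen(url).read()
    counter = AssetCounter()
    counter.feed(html.decode("utf-8", errors="replace"))
    return len(html) / 1024, counter.assets

kb, assets = probe("https://www.example.com/")  # placeholder URL
print(f"HTML: {kb:.0f}k, referenced assets: ~{assets}")
```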
  • 55. If you really care, download this deck: slidesha.re/PDpTPD
  • 56. Scare the Ecom or Trading director:
  • 57. 13.10 – UX Crowd tools • Remote UX tools (P=Panel, S=Site recruited, B=Both): Usertesting (B) www.usertesting.com, Userlytics (B) www.userlytics.com, Userzoom (S) www.userzoom.com, Intuition HQ (S) www.intuitionhq.com, Mechanical turk (S) www.mechanicalturk.com, Loop11 (S) www.loop11.com, Open Hallway (S) www.openhallway.com, What Users Do (P) www.whatusersdo.com, Feedback army (P) www.feedbackarmy.com, User feel (P) www.userfeel.com, Ethnio (For Recruiting) www.ethnio.com • Feedback on Prototypes / Mockups: Pidoco www.pidoco.com, Verify from Zurb www.verifyapp.com, Five second test www.fivesecondtest.com, Conceptshare www.conceptshare.com, Usabilla www.usabilla.com 57
  • 58. 13.11 - Web Analytics Love • Properly instrumented analytics • Investment of 5-10% of developer time • Add more than you need • Events insights • Segmentation • Call tracking love! 58
  • 59. 13.12 - Tap 2 Call tracking Step 1 : Add a unique phone number on ALL channels (or insert your own dynamic number) Step 2 : For phones, add “Tap to Call” or “Click to Call” • Add Analytics event or tag for phone calls! • Very reliable data, easy & cheap to do • What did they do before calling? • Which page did they call you from? • What PPC or SEO keyword did they use? • Incredibly useful – this keyword level call data • What are you over or underbidding for? • Will help you shave 10, 20%+ off PPC • Which online marketing really sucks? 59
  • 61. 13.12 – And desktops? Step 1 : Add ‘Click to reveal’ • Can be a link, button or a collapsed section • Add to your analytics software • This is a great budget option! Step 2 : Invest in call analytics • Unique visitor tracking for desktop • Gives you that detailed marketing data • Easy to implement • Integrates with your web analytics • Let me explain… 61
  • 62. 13.12 - So what does phone tracking get you? • You can do it for free on your online channels • If you’ve got any phone sales or contact operation, this will change the game for you • For the first time, analytics for PHONE for web to claim • Optimise your PPC spend • Track and Test stuff on phones, using web technology • The two best phone A/B tests? You’ll laugh! 62
  • 63. Who?
        Company                 Website                       Coverage
        Mongoose Metrics*       www.mongoosemetrics.com       UK, USA, Canada
        Ifbyphone*              www.ifbyphone.com             USA
        TheCallR*               www.thecallr.com              USA, Canada, UK, IT, FR, BE, ES, NL
        Call tracking metrics   www.calltrackingmetrics.com   USA
        Hosted Numbers          www.hostednumbers.com         USA
        Callcap                 www.callcap.com               USA
        Freespee*               www.freespee.com              UK, SE, FI, NO, DK, LT, PL, IE, CZ, SI, AT, NL, DE
        Adinsight*              www.adinsight.co.uk           UK
        Infinity tracking*      www.infinity-tracking.com     UK
        Optilead*               www.optilead.co.uk            UK
        Switchboard free        www.switchboardfree.co.uk     UK
        Freshegg                www.freshegg.co.uk            UK
        Avanser                 www.avanser.com.au            AUS
        Jet Interactive*        www.jetinteractive.com.au     AUS
        * I read up on these or talked to them. These are my picks. 63
  • 65. 13.12 - Web Analytics Love • People, Process, Human problems • UX of web analytics tools and reports • Make the UI force decisions! • Playability and exploration • Skunkworks project time (5-10%) • Give it love, time, money and iteration • How often do you iterate analytics? • Lastly, spend to automate, gain MORE time 65
  • 67. END & QUESTIONS 67 When you get a 20% lift
  • 68. RESOURCE PACK • Maturity model • Crowdsourced UX • Collaborative tools • Testing tools for CRO & QA • Belron methodology example • CRO and testing resources 68
  • 69. Maturity model (diagram). Five levels, each assessed on Mission, Culture, Process, Testing focus, Analytics focus and Insight methods: Level 1 Starter Level (mission: get buyin) – guessing, basic A/B tools, analytics, surveys, contact centre, low budget usability, outline process, small team, low hanging fruit. Level 2 Early maturity (mission: prove ROI) – adds multivariate, session replay (no segments yet), regular usability testing/research, prototyping, onsite feedback, bounce rates, big volume landing pages, a dedicated team. Level 3 Serious testing (mission: scale the testing) – cross silo team, systematic tests, volume opportunities, micro testing, funnel optimisation and analysis, call tracking, some segments, low converting & high loss pages, user centered design, layered feedback, mini product tests. Level 4 Core business value (mission: mine value) – well developed, streamlined; funnel fixes, forms analytics, channel switches, offline integration, single channel picture, cross channel testing, integrated CRO and analytics, segmentation, customer sat scores tied to UX, rapid iterative testing and design. Level 5 You rock, awesomely (mission: continual improvement) – testing in the DNA, company wide; spread tool use, dynamic adaptive targeting, machine learning, realtime, multichannel funnels, cross channel synergy, all channel view of customer, driving offline using online, all promotion driven by testing. 69
  • 70. 2 - UX Crowd tools • Remote UX tools (P=Panel, S=Site recruited, B=Both): Usertesting (B) www.usertesting.com, Userlytics (B) www.userlytics.com, Userzoom (S) www.userzoom.com, Intuition HQ (S) www.intuitionhq.com, Mechanical turk (S) www.mechanicalturk.com, Loop11 (S) www.loop11.com, Open Hallway (S) www.openhallway.com, What Users Do (P) www.whatusersdo.com, Feedback army (P) www.feedbackarmy.com, User feel (P) www.userfeel.com, Ethnio (For Recruiting) www.ethnio.com • Feedback on Prototypes / Mockups: Pidoco www.pidoco.com, Verify from Zurb www.verifyapp.com, Five second test www.fivesecondtest.com, Conceptshare www.conceptshare.com, Usabilla www.usabilla.com 70
  • 71. 3 - Collaborative Tools Oh sh*t 71
  • 73. 3.2 - Pivotal Tracker 73
  • 76. • Lots of people don’t know this • Serious time is getting wasted on pulling and preparing data • Use the Google API to roll your own reports straight into Big G • Google Analytics + API + Google docs integration = A BETTER LIFE! • Hack your way to having more productive weeks • Learn how to do this to make completely custom reports 3.5 - Google Docs and Automation 76
  • 77. • LucidChart 3.6 - Cloud Collaboration 77
  • 78. • Webnotes 3.7 - Cloud Collaboration 78
  • 79. • Protonotes 3.8 - Cloud Collaboration 79
  • 80. • Conceptshare 3.9 - Cloud Collaboration 80
  • 81. 4 – QA and Testing tools Email testing www.litmus.com www.returnpath.com www.lyris.com Browser testing www.crossbrowsertesting.com www.cloudtesting.com www.multibrowserviewer.com www.saucelabs.com Mobile devices www.perfectomobile.com www.deviceanywhere.com www.mobilexweb.com/emulators www.opendevicelab.com 81
  • 82. 5 – Methodologies - Lean UX Positive – Lightweight and very fast methods – Realtime or rapid improvements – Documentation light, value high – Low on wastage and frippery – Fast time to market, then optimise – Allows you to pivot into new areas Negative – Often needs user test feedback to steer the development, as data alone is not enough – Bosses distrust stuff where the outcome isn’t known “The application of UX design methods into product development, tailored to fit Build-Measure-Learn cycles.” 82
  • 83. 5 - Agile UX / UCD / Collaborative Design Positive – User centric – Goals met substantially – Rapid time to market (especially when using Agile iterations) Negative – Without quant data, user goals can drive the show – missing the business sweet spot – Some people find it hard to integrate with siloed teams – Doesn’t work with waterfall IMHO Wireframe Prototype Test Analyse Concept Research “An integration of User Experience Design and Agile* Software Development Methodologies” *Sometimes 83
  • 85. 5 - Lean Conversion Optimisation Positive – A blend of several techniques – Multiple sources of Qual and Quant data aids triangulation – CRO analytics focus drives unearned value inside all products Negative – Needs a one team approach with a strong PM who is a Polymath (Commercial, Analytics, UX, Technical) – Only works if your teams can take the pace – you might be surprised though! “A blend of User Experience Design, Agile PM, Rapid Lean UX Build-Measure-Learn cycles, triangulated data sources, triage and prioritisation.” 85
  • 86. 5 - Lean CRO Inspection Immersion Identify Triage & Triangulate Outcome Streams Measure Learn Instrument 86
  • 87. 5 - Triage and Triangulation • Starts with the analytics data • Then UX and user journey walkthrough from SERPS -> key paths • Then back to analytics data for a whole range of reports: • Segmented reporting, Traffic sources, Device viewport and browser, Platform (tablet, mobile, desktop) and many more • We use other tools or insight sources to help form hypotheses • We triangulate with other data where possible • We estimate the potential uplift of fixing/improving something as well as the difficulty (time/resource/complexity/risk) • A simple quadrant shows the value clusters • We then WORK the highest and easiest scores by… • Turning every opportunity spotted into an OUTCOME “This is where the smarts of CRO are – in identifying the easiest stuff to test or fix that will drive the largest uplift.” 87
  • 88. 5 - The Bucket Methodology “Helps you to stream actions from the insights and prioritisation work. Forces an action for every issue, a counter for every opportunity being lost.”  Test If there is an obvious opportunity to shift behaviour, expose insight or increase conversion – this bucket is where you place stuff for testing. If you have traffic and leakage, this is the bucket for that issue.  Instrument If an issue is placed in this bucket, it means we need to beef up the analytics reporting. This can involve fixing, adding or improving tag or event handling on the analytics configuration. We instrument both structurally and for insight in the pain points we’ve found.  Hypothesise This is where we’ve found a page, widget or process that’s just not working well but we don’t see a clear single solution. Since we need to really shift the behaviour at this crux point, we’ll brainstorm hypotheses. Driven by evidence and data, we’ll create test plans to find the answers to the questions and change the conversion or KPI figure in the desired direction.  Just Do It JFDI (Just Do It) – is a bucket for issues where a fix is easy to identify or the change is a no-brainer. Items marked with this flag can either be deployed in a batch or as part of a controlled test. Stuff in here requires low effort or are micro-opportunities to increase conversion and should be fixed.  Investigate You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging. 88
  • 89. 5 - Belron example – Funnel replacement Final prototype Usability issues left Final changes Release build Legal review kickoff Cust services review kickoff Marketing review Test Plan Signoff (Legal, Mktng , CCC) Instrument analytics Instrument Contact Centre Offline tagging QA testing End-End testing Launch 90/10% Monitor Launch 80/20% Monitor < 1 week Launch 50/50% Go live 100% Analytics review Washup and actions New hypotheses New test design Rinse and Repeat!
  • 90. 6 - CRO and Testing resources • 101 Landing page tips : slidesha.re/8OnBRh • 544 Optimisation tips : bit.ly/8mkWOB • 108 Optimisation tips : bit.ly/3Z6GrP • 32 CRO tips : bit.ly/4BZjcW • 57 CRO books : bit.ly/dDjDRJ • CRO article list : bit.ly/nEUgui • Smashing Mag article : bit.ly/8X2fLk 90
  • 91. END SLIDES 91 Feel free to steal, re-use, appropriate or otherwise lift stuff from this deck. If it was useful to you – email me or tweet me and tell me why – I’d be DELIGHTED to hear! Regards, Craig.
  • 92. Why does my CRO suck? 5th Sep 2013 @OptimiseOrDie
  • 93. @OptimiseOrDie Timeline - 1998 1999 - 2004 2004-2008 2008-2012
  • 95. SEO PPC UX Analytics A/B and Multivariate testing Customer Satisfaction Design QA Development Performance – 40+ websites, 34 countries, 19 languages, €1bn+ revenue, 8 people @OptimiseOrDie
  • 97. If you’re not a part of the solution, there’s good money to be made in prolonging the problem
  • 98. Out of my comfort zone… @OptimiseOrDie
  • 100. Nice day at the office, dear? @OptimiseOrDie
  • 106. If it isn’t working, you’re not doing it right @OptimiseOrDie
  • 107. #1 : Your analytics are cattle trucked @OptimiseOrDie
  • 108. #1 : Your analytics are cattle trucked @OptimiseOrDie
  • 109. #1 : Common problems (GA) • Dual purpose goal page – One page used by two outcomes – and not split • Cross domain tracking – Where you jump between sites, this borks the data • Filters not correctly set up – Your office, agencies, developers are skewing data • Code missing or double code – Causes visit splitting, double pageviews, skews bounce rate • Campaign, Social, Email tracking etc. – External links you generate are not setup to record properly • Errors not tracked (404, 5xx, Other) – You are unaware of error volumes, locations and impact • Dual flow funnels – Flows join in the middle of a funnel or loop internally • Event tracking skews bounce rate – If an event is set to be ‘interactive’ – it can skew bounce rate (example) @OptimiseOrDie
  • 110. #1 : Common problems (GA) – EXAMPLE 110
        Landing   1st interaction (loss)   2nd interaction (loss)   3rd interaction (loss)   4th interaction (loss)
        55900     527 (99.1%)              66 (87.5%)               55 (16.7%)               33 (40.0%)
        30900     4120 (86.7%)             2470 (40.0%)             1680 (32.0%)              1240 (26.2%)
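The loss figures in that table are step-on-step: the share of the previous step’s visitors who did not continue. A few lines reproduce the second row:

```python
# Loss at each step = share of the previous step's visitors who dropped out.
def step_losses(counts):
    return [(counts[i] - counts[i + 1]) / counts[i] for i in range(len(counts) - 1)]

row = [30900, 4120, 2470, 1680, 1240]  # second row of the table above
for step, loss in enumerate(step_losses(row), start=1):
    print(f"Interaction {step}: {loss:.1%} lost")  # 86.7%, 40.0%, 32.0%, 26.2%
```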
  • 111. #1 : Solutions • Get a Health Check for your Analytics – Try @prwd, @danbarker, @peter_oneill or ask me! • Invest continually in instrumentation – Aim for at least 5% of dev time to fix + improve • Stop shrugging : plug your insight gaps – Change ‘I don’t know’ to ‘I’ll find out’ • Look at event tracking (Google Analytics) – If set up correctly, you get wonderful insights • Would you use paper instead of a till? – You wouldn’t do it in retail so stop doing it online! • How do you win F1 races? – With the wrong performance data, you won’t @OptimiseOrDie
  • 112. Insight - Inputs #FAIL Competitor copying Guessing Dice rolling An article the CEO read Competitor change Panic Ego Opinion Cherished notions Marketing whims Cosmic rays Not ‘on brand’ enough IT inflexibility Internal company needs Some dumbass consultant Shiny feature blindness Knee jerk reactions #2 : Your inputs are all wrong @OptimiseOrDie
  • 113. Insight - Inputs Insight Segmentation Surveys Sales and Call Centre Session Replay Social analytics Customer contact Eye tracking Usability testing Forms analytics Search analytics Voice of Customer Market research A/B and MVT testing Big & unstructured data Web analytics Competitor evals Customer services #2 : Your inputs are all wrong @OptimiseOrDie
  • 114. #2 : Solutions • Usability testing and User Centred design – If you’re not doing this properly, you’re hosed • Champion UX+ - with added numbers – (Re)designing without inputs + numbers is guessing • You need one team on this, not silos – Stop handing round the baby (I’ll come back to this) • Ego, Opinion, Cherished notions – fill gaps – Fill these vacuums with insights and data • Champion the users – Someone needs to take their side! • You need multiple tool inputs – Let me show you my core list @OptimiseOrDie
  • 115. #2 : Core tools • Properly set up analytics – Without this foundation, you’re toast • Session replay tools – Clicktale, Tealeaf, Sessioncam and more… • Cheap / Crowdsourced usability testing – See the resource pack for more details • Voice of Customer / Feedback tools – 4Q, Kampyle, Qualaroo, Usabilla and more… • A/B and Multivariate testing – Optimizely, Google Content Experiments, VWO • Email, Browser and Mobile testing – You don’t know if it works unless you check @OptimiseOrDie
  • 116. #3 : You’re not testing (enough) @OptimiseOrDie
  • 117. #3 : Common problems • Let’s take a quick poll – How many tests do you complete a month? • Not enough resource – You MUST hire, invest and ringfence time and staff for CRO • Testing has gone to sleep – Some vendors have a ‘rescue’ team for these accounts • Vanity testing takes hold – Getting one test done a quarter? Still showing it a year later? • You keep testing without buyin at C-Level – If nobody sees the flower, was it there? • You haven’t got a process – just a plugin – Insight, Brainstorm, Wireframe, Design, Build, QA test, Monitor, Analyse. Tools, Process, People, Time -> INVEST • IT or release barriers slow down work – Circumvent with tagging tools – Develop ways around the innovation barrier @OptimiseOrDie
  • 118. #4 : Not executing fast enough @OptimiseOrDie
  • 119. #4 : Not executing fast enough • Silo Mentality means pass the product – No ‘one team’ approach means no ‘one product’ • The process is badly designed – See the resource pack or ask me later! • People mistake hypotheses for finals – Endless argument, tweaking means NO TESTING – let the test decide, please! • No clarity : authority or decision making – You need a strong leader to get things decided • Signoff takes far too long – Signoff by committee is a velocity killer – the CUSTOMER and the NUMBERS are the signoff • You set your target too low – Aim for a high target and keep increasing it @OptimiseOrDie
  • 121. #4 : Execution solutions • Agile, One Team approach – Everyone works on the lifecycle, together • Hire Polymaths – T-shaped or just multi-skilled, I hire them a lot • Use Collaborative Tools, not meetings – See the resource pack • Market the results – Market this stuff internally like a PR agency – Encourage betting in the office • Smash down silos – a special mission – Involve the worst offenders in the hypothesis team – “Hold your friends close, and your enemies closer” – Work WITH the developers to find solutions – Ask Developers and IT for solutions, not apologies @OptimiseOrDie
  • 122. #5 : Product cycles are too long (chart: conversion over 0-18 months) @OptimiseOrDie
  • 123. #5 : Solutions • Give Priority Boarding for opportunities – The best seats reserved for metric shifters • Release more often to close the gap – More testing resource helps, analytics ‘hawk eye’ • Kaizen – continuous improvement – Others call it JFDI (just f***ing do it) • Make changes AS WELL as tests, basically! – These small things add up • RUSH Hair booking – Over 100 changes – No functional changes at all – 37% improvement • Inbetween product lifecycles? – The added lift for 10 days work, worth 360k @OptimiseOrDie
  • 124. #5 : Make your own cycles @OptimiseOrDie
  • 125. #6 – No Photo UX 24 Jan 2012 • Persuasion / Influence / Direction / Explanation • Helps people process information and stories • Vital to sell an ‘experience’ • Helps people recognise and discriminate between things • Supports Scanning Visitors • Drives emotional response short.cx/YrBczl
  • 126. • Very powerful and under-estimated area • I’ve done over 20M visitor tests with people images for a service industry – some tips: • The person, pose, eye gaze, facial expressions and body language – cause visceral emotional reactions and big changes in behaviour • Eye gaze crucial – to engage you or to ‘point’ Photo UX 24 Jan 2012
  • 127. • Negative body language is a turnoff • Uniforms and branding a positive (ball cap) • Hands are hard to handle – use a prop to help • For Ecommerce – tip! test bigger images! • Autoglass and Belron always use real people • In most countries (out of 33) with strong female and male images in test, female wins • Smile and authenticity in these examples is absolutely vital • So, I have a question for you Photo UX @OptimiseOrDie
  • 129. Terrible Stock Photos : headsethotties.com & awkwardstockphotos.com Laughing at Salads : womenlaughingwithsalad.tumblr.com BBC Fake Smile Test : bbc.in/5rtnv @OptimiseOrDie
  • 130. SPAIN +22% over control 99% confidence @OptimiseOrDie
  • 132. #7 : Your tests are cattle trucked • Many tests fail due to QA or browser bugs – Always do cross browser QA testing – see resources • Don’t rely on developers saying ‘yes’ – Use your analytics to define the list to test • Cross instrument your analytics – You need this to check the test software works • Store the variant(s) seen in analytics – Compare people who saw A/B/A vs. A/B/B • Segment your data to find variances – Failed tests usually show differences for segments • Watch the test and analytics CLOSELY – After you go live, religiously check both – Read this article : stanford.io/15UYov0 @OptimiseOrDie
  • 133. #8 : Stats are confusing • Many testers & marketing people struggle – How long will it take to run the test? – Is the test ready? – How long should I keep it running for? – It says it’s ready after 3 days – is it? – Can we close it now – the numbers look great! • A/B testing maths for dummies: – http://bit.ly/15UXLS4 • For more advanced testers: – Read this : http://bit.ly/1a4iJ1H • I’m going to build a stats course – To explain all the common questions – To save me having to explain this crap all the time @OptimiseOrDie
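Behind ‘how long will it take?’ sits the standard two-proportion sample size calculation. A sketch with the conventional alpha = 0.05 and 80% power (z values 1.96 and 0.84); the 5% base rate and 20% relative uplift are example inputs, not recommendations:

```python
import math

# Visitors needed per variant to detect a relative uplift on a base rate.
def sample_size_per_variant(base_rate, uplift, z_alpha=1.96, z_power=0.84):
    p1 = base_rate
    p2 = base_rate * (1 + uplift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

n = sample_size_per_variant(0.05, 0.20)  # 5% base rate, 20% relative uplift
print(f"~{n} visitors per variant")      # roughly 8,000-9,000
```

Smaller uplifts blow the number up fast, which is why testing two shades of blue takes forever.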
  • 134. #9 : You’re not segmenting • Averages lie – What about new vs. returning visitors? – What about different keyword groups? – Landing pages? Routes? Attributes • Failed tests are just ‘averaged out’ – You must look at segment level data – You must integrate the analytics + a/b test software • The downside? – You’ll need more test data – to segment • The upside? – Helps figure out why test didn’t perform – Finds value in failed or ‘no difference’ tests – Drives further testing focus @OptimiseOrDie
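A made-up illustration of ‘averages lie’: two segments moving in opposite directions can leave the overall rates identical. Every number below is invented purely to show the cancellation:

```python
# Hypothetical test results split by segment: (conversions, visitors).
segments = {
    "new visitors":       {"A": (400, 10000), "B": (520, 10000)},  # B wins here
    "returning visitors": {"A": (900, 10000), "B": (780, 10000)},  # B loses here
}

for variant in ("A", "B"):
    conversions = sum(s[variant][0] for s in segments.values())
    visitors = sum(s[variant][1] for s in segments.values())
    print(f"{variant} overall: {conversions / visitors:.2%}")  # both 6.50%

for name, s in segments.items():
    print(f"  {name}: A {s['A'][0] / s['A'][1]:.2%} vs B {s['B'][0] / s['B'][1]:.2%}")
```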
  • 135. #10 : You’re unichannel optimising • Not using call tracking – Look at Infinity Tracking (UK) – Get Google keyword level call volumes! • You don’t measure channel switchers – People who bail a funnel and call – People who use chat or other contact/sales • You ‘forget’ mobile & tablet journeys – Walk the path from search -> ppc/seo -> site – Optimise for all your device mix & journeys • You’re responsive – Testing may now bleed across device platforms – Changing in one place may impact many others – QA, Device and Browser testing even more vital @OptimiseOrDie
  • 136. SUMMARY : The best Companies…. • Invest continually in Analytics instrumentation, tools & people • Use an Agile, iterative, Cross-silo, One team project culture • Prefer collaborative tools to having lots of meetings • Prioritise development based on numbers and insight • Practice real continuous product improvement, not SLED • Source photos and copy that support persuasion and utility • Have cross channel, cross device design, testing and QA • Segment their data for valuable insights, every test or change • Continually try to reduce cycle (iteration) time in their process • Blend ‘long’ design, continuous improvement AND split tests • Make optimisation the engine of change, not the slave of ego • See the Maturity Model in the resource pack @OptimiseOrDie

Editor's Notes

  1. Tomorrow - Go forth and kick their flabby low converting asses
  2. These are all people on twitter who cover hybrid stuff – where usability, psychology, analytics and persuasive writing collide. If you follow this lot, you’ll be much smarter within a month, guaranteed.
  3. And here are the most useful resources I regularly use or share with people. They have the best and most practical advice – cool insights but with practical applications. A special mention here to my friends at PRWD, who are one of the few companies blending Psychology, Split Testing and UX for superb gains in rapid time. Check out their resources section on their website.
  4. So – what’s driving this change then? Well there have been great books on selling and persuading people – all the way back to ‘Scientific Advertising’ in 1923. And my favourite here is the Cialdini work – simply because it’s a great help for people to find practical uses for these techniques. I’ve also included some analytics and testing books here – primarily because they help so MUCH in augmenting our customer insight, testing and measurement efforts. There are lots of books with really cool examples, great stories and absolutely no fucking useful information you can use on your website – if you’ve read some of these, you’ll know exactly what I mean. These are the tomes I got most practical use from and I’d recommend you buy the whole lot – worth every penny.
  5. “A piece of paper with your design mockup. A customer in a shop or bookstore. Their finger is their mouse, the paper their screen. Where would they click? Do they know what these labels mean? Do they see the major routes out of the page? Any barriers. Congratulations, you just got feedback on your design, before writing a single freaking line of code or asking your developers to keep changing stuff.”
  6. “A piece of paper with your design mockup. A customer in a shop or bookstore. Their finger is their mouse, the paper their screen. Where would they click? Do they know what these labels mean? Do they see the major routes out of the page? Any barriers. Congratulations, you just got feedback on your design, before writing a single freaking line of code or asking your developers to keep changing stuff.”
  7. “A piece of paper with your design mockup. A customer in a shop or bookstore. Their finger is their mouse, the paper their screen. Where would they click? Do they know what these labels mean? Do they see the major routes out of the page? Any barriers. Congratulations, you just got feedback on your design, before writing a single freaking line of code or asking your developers to keep changing stuff.”
  8. Create a suck index = pageviews * load time.
  9. Here I show you some examples of well known brands, some of whom should know better. The larger the size of the page, the longer it will take to download and render on the device, especially when you don’t have perfect data conditions. The number of requests also makes a difference, as it’s inefficient on mobile to open lots of connections like this. In short, the smaller the page size and number of requests you can aim for, the better. I’m patient with bad data connections but do people have the tolerance for 10-15 seconds on mobile? No – it has to happen much faster.
  10. These are the results of a live test on a site, where an artificial delay is introduced in the performance testing. I’ve done some testing like this myself on desktop and mobile sites and confirm this is true – you’re increasing bounce rate, decreasing conversion, site engagement… It doesn’t matter what metric you use, performance equals MONEY or if not measured, a HUGE LOSS.
  11. Performance also harms the lifeblood of e-commerce and revenue generating websites – repeat visitors! The gap here in one second of delay is enormous over time. You’re basically sucking a huge portion of potential business out of your site, with every additional bit of waiting time you add.
  12. Add unique phone numbers to all your mobile sites and apps. That’s for starters. Then configure your analytics to collect data when people Click or Tap to make a phone call. Make sure you add other events like ringbacks, email, chat – any web forms or lead gen activity too.
  13. So what does this graph say? That I have a long tail thing I want to talk to you about? No – this shows the ratio of phone to online conversion we have, by keyword. Some keywords generate nearly 25 times the call volume of others, which is a huge differential. This means that if you thought you got ‘roughly’ the same proportion of phone calls for different marketing activity, you are wrong. What this graph tells me is that the last 2 years of my stats are basically a big dog poo.
15. Phone tracking costs you nothing – you can add it to your app or mobile website in a few minutes by changing your analytics tracking. Now you can see exactly which bits of inbound marketing are driving telephone and other contact channels. If you have any sort of phone component in your service or support, the insight could be vital. You can take traffic by keyword, source, campaign or advert creative and work out the TRUE mix of conversion activity. And all this is available on desktop too – by using dynamic numbers, we can track exactly the same stuff. Talk to this company: www.infinity-tracking.com
20. This stuff is important. What do photographs do? Well, they help me persuade people, influence their thinking, give them directions or cues, and explain things – this is the scanning generation! They’re very powerful when selling experiences, telling stories or using the power of social proof. They help people discriminate and evaluate very quickly (more quickly than reading) – work out what stuff is, how it’s organised, what the things are, what’s being shown to you. And most importantly, they drive an emotional response in people. Whether you like being soggy, wet and without toilet paper for a 30-mile radius or not, a picture like this gets a RESPONSE! Work it! Lastly, a shout out to James Chudley, whose book this example comes from.
21. So this is quite a powerful area – what about people images? I’ve tested quite a few of these, in over 20 countries and over 15 languages. What did I find? Well, the person, pose, eye gaze, facial expression and body language all cause visceral emotional reactions and big changes in behaviour. The difference between a crappy image and one optimised to get the right response is huge. One interesting thing: eye gaze is pretty crucial – either engaging you, the viewer, or ‘pointing’ to draw gaze and attention to a product. I’ve tested all angles of viewing, and in these people images the best view is straight at the viewer or slightly away. Any further and the conversion rate drops. It makes a difference I can count.