Am I testing on the right platforms? This is the question that app developers and testers ask themselves every day. As more brands connect with users digitally, it’s getting harder to ensure a great user experience across mobile, web & wearables.
In this webinar, you’ll learn how to create an effective and customized test coverage strategy. We’ll describe how our new Test Coverage Toolkit’s combination of resources, recommendations and research can help you:
Know exactly what mix of mobile devices and browsers to test on
Keep up with the latest mobile devices, operating systems and browsers
Learn how to set up and maintain a robust and scalable test lab
Deliver a great digital user experience across devices
6. Test Coverage Optimization Process
1. Coverage Mix (which devices?): Me (analytics), Market (popular & emerging), My Space (industry trends & insights)
2. Lab Sizing (how many devices?): teams, projects, SDLC, cycle, requirements
7. Test Coverage Optimization Process
Me (analytics), Market (popular & emerging), My Space (industry), viewed along two axes: internal vs. external, and current state vs. future state.
Supporting tools: The Index and The Optimizer.
8. What does it take to cover the REAL end user experience?
Device & Platform: model, OS version, screen size, browser
Environment: location, network, phone events, other apps
Conditions: signal, load, CPU
9. What’s new? Digital Test Coverage Index, 4th Edition
• Brazil, China, Netherlands
• Usage vs. Purchase Model
21. Quality vs. Velocity
Quality: 16 devices required for coverage. Velocity: 2-week sprints, with a goal of 400 regression tests to run in 48 hours.
Are 16 devices enough to complete full regression in 48 hours?
22. Coverage vs. Capacity: Meeting Velocity Goals
Each device type executes 400 test cases; the desired regression time is 48 hours.
10 min × 400 test cases = 67 hours (2.8 days) actual regression time per device
67 hours − 48 hours = 19 hours over the window
19 hours / 48 hours ≈ 0.4, rounded up to 1 additional device per device type
2× devices required: 16 × 2 = 32 devices for full coverage in 48 hours
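The device-count arithmetic above can be sketched in code. This is a minimal estimate assuming a flat 10-minute average per test case, as on the slide; the function names are illustrative:

```python
import math

def regression_hours(num_tests: int, minutes_per_test: float) -> float:
    """Serial execution time for one device, in hours."""
    return num_tests * minutes_per_test / 60

def devices_per_type(num_tests: int, minutes_per_test: float,
                     target_hours: float) -> int:
    """Parallel copies of each device type needed to fit the target window."""
    return math.ceil(regression_hours(num_tests, minutes_per_test) / target_hours)

serial = regression_hours(400, 10)     # ~66.7 hours (about 2.8 days)
copies = devices_per_type(400, 10, 48) # 2 -> double every device type
print(round(serial, 1), copies)        # -> 66.7 2
```

With 16 device types, doubling each one gives the 32 devices the slide arrives at.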
23. Balancing Velocity and Coverage
iOS devices (primary and secondary tiers): Apple iPhone 6, iPhone 6 Plus, iPad Air 2, iPhone 5S, iPhone 6S, iPad mini, iPad 2, iPhone 6S Plus
Android devices (primary and secondary tiers): Samsung Galaxy S6, Galaxy S5, Galaxy S4, Galaxy Note 5, Google Nexus 5X, HTC One (M9), LG G4, Samsung Galaxy Tab S2
Test suite: 75 high-priority + 200 medium-priority + 125 low-priority tests
All 400 tests run on primary devices; 275 tests (high + medium priority) run on secondary devices.
24. Optimized Coverage
Primary (parallel capacity): 10 min × 400 test cases = 67 hours (2.8 days); 67 hours − 48 hours = 19 hours; 19 / 48 ≈ 0.4 (1 additional device per device type); 2× devices required for the full suite → 18 primary devices
Secondary: 10 min × 275 test cases = 46 hours (1.9 days), which fits the 48-hour window → 7 secondary devices
25 total devices for optimized coverage (18 primary + 7 secondary)
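Putting the two calculations together, total lab size can be estimated as below. The split into 9 primary and 7 secondary device types is inferred from the slide's 18 + 7 = 25 totals (two parallel copies of each primary type), so treat it as an assumption:

```python
import math

def lab_size(primary_types: int, secondary_types: int,
             primary_tests: int, secondary_tests: int,
             minutes_per_test: float, target_hours: float) -> int:
    """Total devices: each type gets enough parallel copies to finish
    its assigned test load within the target regression window."""
    per_primary = math.ceil(primary_tests * minutes_per_test / 60 / target_hours)
    per_secondary = math.ceil(secondary_tests * minutes_per_test / 60 / target_hours)
    return primary_types * per_primary + secondary_types * per_secondary

# 9 primary types x 2 copies (400 tests each) + 7 secondary types x 1 copy
# (275 tests each) within a 48-hour window.
print(lab_size(9, 7, 400, 275, 10, 48))  # -> 25
```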
Before I pass it over to Carlo and Assaf, just a few quick housekeeping items.
We have a Q&A panel and we have people ready to answer your questions in real time as you type them in. So feel free to submit your questions using the Q&A panel.
We’re going to conduct two polls later in the webinar; we hope you can participate and share your insights.
This webinar is being recorded and you will receive the recording and slides within a couple of days.
We also have a short survey that will pop up right after the webinar is over. We’d appreciate it if you could fill that out as well, because it helps us improve future webinars.
All right, without further ado…
What should you do with this data? Combine it with your own data and guidelines
User profile – consider your end users’ real-life experience