TEST METRICS IN AGILE
TEST MANAGEMENT
A POWERFUL TOOL TO SUPPORT CHANGES
Yulia Zavertailo, Senior Test manager
Agenda
− About me.
− Introduction to the client’s case.
− What needs to be improved?
− How do we visualize our results?
− A closer look at the KPIs and how to gather them.
− Conclusions.
About me
− 10 years of work experience in the field of
Testing and Quality Assurance, 7.5 years in
Itera.
− Moved to Oslo in November 2014.
− Key competences:
− test management,
− test advisory on process establishment in a project and across an entire organization.
− Very passionate about the ISO 9000/ISO 29119 standards.
− Mentor and coach for junior testers.
− Love running, skiing and oil painting.
IF Skadeforsikring – general
− Waypoint is the largest digital solution at IF, built on a modern technology platform with many interfaces for clients and internal users (1000+).
− Methodology – Agile (Scrum, Kanban).
− 13 Agile teams.
− Each team has both business and IT competence.
− IT development is done at the headquarters in Oslo.
− Large Scandinavian insurance company present in the Nordic countries and the Baltics, with 3.6 million clients and 6,800 employees.
IF Skadeforsikring – as a test challenge
– System is not covered by tests.
– No professional IT testing is done in the project.
– Business users are involved in functional acceptance and regression testing.
– Long production cycle: releases happen 3-4 times a year.
– Critical issues are found in production after release.
IF Skadeforsikring – the result
− The IT test team has grown from 1 to 17 IT testers in Riga, Latvia.
− Releases now happen every 5 weeks.
− Feedback on the quality and stability of the entire system within a few hours.
− The project discovers and fixes 136 bugs on average every release.
− There are still functional modules without enough test coverage = facing a high risk of defects.
Specifically…
what are we going to improve?
1. Increase frequency of releases so that IT delivers functionality to the
business every 5 weeks by:
– Automating the regression test coverage.
– Discovering and fixing bugs earlier in the cycle.
– Improving development teams' efficiency by doing continuous IT testing.
Specifically…
what are we going to improve?
2. Decrease the number of bugs in production by:
– Running automated regression suites regularly.
– Improving the quality of testing itself.
– Introducing continuous manual testing while developing.
– Focusing business resources on acceptance testing only.
Hmmm…
but how do we prove that we succeeded?
Present the test results our client is interested in:
– Test coverage.
– # of defects found in production.
– # of defects found in sprints.
– Time spent by end users.
– Time for feedback about system’s quality.
KPI #1 – Test Coverage
What do we measure?
− Percentage of requirements covered by at least one test.
Why do we gather it in our Agile projects?
– Visualize “black spots” of our application.
– Visualize that the larger the test coverage, the lower the risk of missing defects.
– Visualize the functionality that needs more attention because it is less covered by tests.
Application test coverage – example

Module   | Total user stories | US covered (manual) | US covered (automated)
Module 1 | 1075               | 310                 | 54
Module 2 | 739                | 244                 | 13
Module 3 | 430                | 130                 | 8
Module 4 | 800                | 465                 | 69
Module 5 | 230                | 70                  | 4
Module 6 | 419                | 141                 | 10
How to gather Test Coverage KPI
1. Configure your Agile TM tool (TFS, Jira, QC) to build a traceability matrix: link epics/user stories to test cases.
2. Use the graphs module of your TM tool, or export the data to pivot tables in Excel.
3. Go for the Excel option if you do not have a TM tool.
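As a minimal sketch of the calculation (assuming a hypothetical CSV export with story_id, module, linked_manual_tests and linked_automated_tests columns – not the actual TFS/Jira/QC schema), coverage per module could be computed like this:

```python
# Hedged sketch: Test Coverage KPI from a TM-tool export.
# The column names below are illustrative assumptions, not a real tool schema.
import csv
from collections import defaultdict

def coverage_by_module(path):
    totals = defaultdict(int)    # all user stories per module
    covered = defaultdict(int)   # user stories with at least one linked test
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            module = row["module"]
            totals[module] += 1
            if int(row["linked_manual_tests"]) + int(row["linked_automated_tests"]) > 0:
                covered[module] += 1
    return {m: 100.0 * covered[m] / totals[m] for m in totals}

if __name__ == "__main__":
    for module, pct in sorted(coverage_by_module("user_stories_export.csv").items()):
        print(f"{module}: {pct:.0f}% of user stories covered by at least one test")
```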
KPI #2 – Defect open and close rates
What do we measure?
Defect open and close rate = defects found during testing in a release / (defects found during testing in a release + defects found after release) × 100
Why do we gather it in our Agile projects?
− Whatever their detailed status, defects can be categorized as either open or closed.
− Without tracking them, certain defects could slip through the cracks and show up in the finalized release.
− Comparing the rates shows how well testers and developers work together to identify and address software issues.
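A worked example with numbers taken from the trend on the next slide: if 144 defects are found during sprint testing of a release and 26 more are found after the release, the rate is 144 / (144 + 26) × 100 ≈ 85%.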
Sprint vs production defects – trend example

Per release (21 releases shown):
− Total defects: 170, 131, 85, 85, 150, 85, 105, 129, 141, 112, 182, 174, 152, 231, 175, 229, 139, 226, 175, 222, 124
− Production defects: 26, 7, 13, 7, 10, 17, 10, 29, 20, 17, 47, 20, 33, 21, 21, 25, 24, 13, 15, 18, 0
− Sprint defects: 144, 124, 72, 78, 140, 68, 95, 100, 121, 95, 135, 154, 119, 210, 154, 204, 115, 213, 160, 204, 124
How to gather defect open and close rates KPI
To get the rate:
− Get the number of defects found during a release/sprint.
− Get the number of bugs reported by the business after the release to production.
− Calculate the rate using the formula.
To build a trend graph:
− Jira + Zephyr dashboards.
− TFS + pivot tables in Excel.
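A minimal sketch of the calculation (the release names and counts below are illustrative values from the trend example, not a complete dataset):

```python
# Hedged sketch: defect open and close rate per release.
# rate = sprint_defects / (sprint_defects + production_defects) * 100
def open_close_rate(sprint_defects: int, production_defects: int) -> float:
    total = sprint_defects + production_defects
    return 0.0 if total == 0 else 100.0 * sprint_defects / total

# Illustrative (sprint, production) counts per release.
releases = [("Release 1", 144, 26), ("Release 2", 124, 7), ("Release 3", 72, 13)]
for name, in_sprint, in_prod in releases:
    print(f"{name}: {open_close_rate(in_sprint, in_prod):.0f}% of defects caught before release")
```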
KPI #3 – Issues reported by customers
What do we measure?
% of customer-reported issues = total # of issues reported by customers / total # of issues reported × 100
Why do we gather it in our Agile projects?
− Does the product meet the needs of the customer/end user?
− Effectiveness of the test team.
− Types and number of defects lingering in released products.
− Are there critical issues with the current QA processes?
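A quick illustrative calculation (hypothetical numbers): if customers report 10 of the 130 issues logged in total for a release, the KPI is 10 / 130 × 100 ≈ 8%.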
Customer issues reported – example of trend

Per release (21 releases shown):
− Customer-reported issues: 1, 12, 7, 3, 6, 4, 2, 3, 2, 1, 4, 4, 3, 1, 2, 9, 2, 14, 8, 11, 5
− All bugs: 144, 124, 72, 78, 140, 68, 95, 100, 121, 95, 135, 154, 119, 210, 154, 204, 115, 213, 160, 204, 124
Issues reported by customers per functional area – example of trend

[Stacked bar chart: customer-reported issues for Release 1–Release 12, broken down by functional module (Module 1–Module 7); scale 0–50 issues per release.]
How to gather issues reported by customers KPI
To get the rate:
− Get the number of defects found by customers.
− Get the total number of defects found in a sprint.
− Calculate the rate using the formula.
To build a trend graph:
− TFS + pivot tables in Excel.
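A hedged sketch of how the per-area trend data could be aggregated from a defect export (the CSV columns release, module and reported_by_customer are assumptions for illustration, not the actual tracker schema):

```python
# Hedged sketch: customer-reported issues per release and functional area.
import csv
from collections import defaultdict

def customer_issues_per_area(path):
    per_area = defaultdict(lambda: defaultdict(int))  # release -> module -> customer issues
    totals = defaultdict(int)                         # release -> all issues
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            totals[row["release"]] += 1
            if row["reported_by_customer"].strip().lower() == "yes":
                per_area[row["release"]][row["module"]] += 1
    return per_area, totals

per_area, totals = customer_issues_per_area("defects_export.csv")
for release, modules in per_area.items():
    customer_total = sum(modules.values())
    rate = 100.0 * customer_total / totals[release]
    print(f"{release}: {rate:.0f}% customer-reported, by area: {dict(modules)}")
```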
KPI #4 – Efficiency of end users’ testing
What do we measure?
− Time spent by end users.
− # of bugs found by end users during acceptance testing of a release vs. the total.
Why do we gather it in our Agile projects?
− Feedback about the quality of requirements/acceptance criteria.
− Feedback about the quality of the functional and regression testing.
− To see the cost (time spent vs. the number of bugs found).
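An illustrative cost calculation (hypothetical numbers, not project data): if end users spend 100 hours on acceptance testing of a release and find 10 bugs there, that feedback costs roughly 10 hours per defect found.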
Time spent by end users – example of trend

[Bar chart per release: total hours spent by end users on acceptance testing, broken down across Country Domain 1–Country Domain 5.]
KPI #5 – Regression test suite duration
What do we measure?
− Time for feedback about the system's stability and quality – TA.
− Time for feedback about the system's stability and quality – manual.
Why do we gather it in our Agile projects?
− Get an idea of the time needed to regression-test a hotfix or feature.
− Decision-making input for further planning.
− Visualize the benefit and efficiency of test automation.
Regression test suite duration – example

Test suite         | # of test cases in a suite | Time to run suite with TA, hrs | Time to run suite manually, hrs
Smoke test suite   | 6   | 0.2  | 1.5
Regression suite 1 | 191 | 5.5  | 47.75
Regression suite 2 | 204 | 5.5  | 51
Regression suite 3 | 194 | 5.5  | 48.5
Total              | 595 | 16.7 | 149
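In this example, automation returns feedback roughly nine times faster than manual execution: 16.7 hours versus 149 hours for the full set of 595 test cases.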
KPIs framework – in a few simple steps
1. Decide which KPIs are important for your client (focus on KPIs 1-3).
2. Configure your test management tool and visualize.
3. Continuously use KPIs at all levels of communication.
Summary
− A KPIs framework is a powerful tool for being an Agile test leader!
− Control the status of quality in a measurable way.
− Control the general health of the test solution and strategy in your project.
− Lead and make decisions in your project.
− Empower yourself to prove that IT testing is a must in your project!
QUESTIONS?
Contacts
• LinkedIn: https://no.linkedin.com/in/yuliazavertailo
• Email: iuliia.zavertailo@itera.no
THANK YOU!
Editor's Notes

  1. Comments: The Waypoint system has a core engine and core modules unified for everyone, though it is customized to the legal and tax rules and the insurance product specifics of each country and domain market.
  2. Comments: System is not covered by tests -> test coverage is undefined. No professional IT testing is done in the project -> neither manual nor automated. Business users are involved in functional acceptance and regression testing -> up to 5-6 people in each country. Long production cycle: releases happen 3-4 times a year -> with a system test of 3-4 weeks each time.
  3. Comments: Increase the number of releases so that IT delivers functionality to the business every 5 weeks -> instead of 3-4 times a year. Automate the regression test coverage -> get faster feedback about system status and quality. Improve development teams' efficiency through continuous IT testing -> instead of business testing at the end of the release.
  4. Focus on presenting the results the client is interested in: test coverage increased -> risk areas are covered by tests regularly; # of defects found in production reduced -> fewer critical bugs found in production; # of defects found in sprints increased -> and fixed before going to production; time spent by end users reduced.
  5. Comments: This is one of the most common metrics we can gather in our projects (metric number 1). Visualizing the "black spots" of the application shows how many functions/features the application has, which share of the system is not covered by tests (and therefore carries the risk of bugs), and in which areas defects may occur.
  6. Comment on the example graph: total features for Module 1 – 1075; covered by tests – 364; test coverage of Module 1 ≈ 33%.
  7. Comments: Configure your Agile TM tool (TFS, Jira, QC, etc.) to build a traceability matrix: make sure epics/user stories are linked to test cases (manual or TA); use a query-based filter to show epics/user stories linked to test cases; check how many epics/user stories have test cases and how many do not. Use the graphs module of your TM tool, or export to pivot tables in Excel to visualize the filtered data from the traceability matrix: connect Excel to TFS, use the filters you configured to display the traceability matrix, and build a graph of the required shape and colour.
  8. This is the second most common metric we gather in our projects. Why do we gather it? While there are numerous defect statuses that indicate the current state of an identified flaw, they can typically be categorized as either open or closed. If team members are not diligent about measuring the current status of their program's flaws, certain defects could slip through the cracks and show up in the finalized release. Comparing the frequency of open defects with close rates also provides insight into how well testers and developers work together to identify and address software issues.
  9. Comment on the example: Release 1 has 196 bugs reported in total, with a defect open and close rate of 86%. The graph shows the total number of defects (sprint + production), the sprint trend and the production trend.
  10. Comments: Issues found by the customer after IT testing; we count acceptance-testing issues.
  11. Comments: The graph shows the total defects found in a sprint (grey bars) and the issues found by customers (red).
  12. Comment: Use a defect-tracking tool – Jira, TFS or HP QC.
  13. Comments: The metric is relevant for Norway, where a lot of testing is still done by end users without the involvement of IT testers.
  14. Comments: This metric does not have to be updated every release; it is a one-time analysis. It gives an overview of the time needed to regression-test a hotfix or feature -> how long it takes QA to actually test a feature and evaluate the related risks. It gives decision-making input for further planning -> how frequently releases can be made. It visualizes the benefit and efficiency of test automation compared to manual testing in our project.
  15. Comments: Decide which KPIs are important for your client -> start with test coverage and the defect open and close rates. Use Confluence for graph visualization! Continuously use KPIs at all levels of communication: talks with developers or your test team, support for go/no-go meetings, steering committee meetings with budget owners and decision makers.
  16. Comments: KPIs let you control the status of quality in a measurable way, with numbers. KPIs let you control the general health of the test solution and strategy in your project by giving you trends. KPIs let you lead and make decisions in your project based on facts. KPIs empower you to prove that IT testing is a must in your project! KPIs empower you to be a good leader and help you adjust your Agile ship's course before it crashes.