Good feedback of software measurement data is critical for analyzing the data, drawing conclusions, and taking action. Feedback to those involved in the activities being measured also helps validate the data. In this presentation Ben Linders shows examples of how Ericsson Telecommunications delivers feedback at two levels: projects and the total development center. Although the basics are similar, the application differs, and the key success factors depend on the level and the audience. At the project level, you will see how teams review defect data, including defect classifications and test matrices. For development center feedback, you will see how line management and technical engineers review data and analyze information based on a balanced scorecard approach with measurable goals. Finally, Ben Linders shows examples, data summaries, and suggested action items that management teams at the project and development center levels review.
• Techniques used in data feedback reporting and key success factors
• How to close the feedback loop with different levels in the organization
• Human factors that play a role in feedback sessions
Experiences with Data Feedback - Better Software 2004 - Ben Linders
1. Rev A 2004-06-28 1
Make What’s Counted Count:
Experiences with Data Feedback.
2004 Better Software Conference,
September 30, San Jose, USA
Ben Linders
Operational Development & Quality
Ericsson R&D, The Netherlands
ben.linders@ericsson.com, +31 161 24 9885
Overview
• Why feedback?
• Experiences
– Product Quality
– R&D performance
• Key Success factors & Pitfalls
• Conclusions
Feedback: Bridging the gap
between data and actions!
Ericsson, The Netherlands
• Benelux Market Unit & Worldwide Main R&D Design Center
• R&D: Intelligent Networks
– Strategic Product Management
– Product marketing & technical sales support
– Provisioning & total project management
– Development & maintenance
– Integration & Verification
– Customization
– Supply & support
• 1300 employees, of which 350 in R&D
Measurements on all R&D levels
Measurements: Analyze
Data, lots of data, and nothing but data
… how can you get a meaning out of it?
We tried:
• Historical data: takes a long time to build up, and requires much
effort before the data can be used
• Industry data: hard to get, often too general
• “Brute force” SPC: conclusions didn’t match our own perception
of and insight into the situation
Measurements: Actions
The purpose of measuring is … to take actions!
Insufficient actions:
• No insight into the causes
• Debates about the measurement
• Insufficient responsibility for results
Effectiveness of Measurements
Change needed:
Show the relation between measurements and daily work;
people should gain insight into their own performance
Get people involved from definition through to results
Ensure that the “vital few” actions are done
Co-operation: Line/projects – Operational Development!
Feedback: Definition
“Information about the past, delivered in the present, which may
influence future behavior” Seashore, Seashore & Weinberg, 1992
Analyze to understand current performance
Change behavior to reach better results
“Information about collected data
delivered to the people who have
been doing the work, in order to
support their understanding of the
situation at hand and help them to
take the needed actions” Linders, 2004
Feedback: Concepts
People are capable of analyzing their own performance and results;
you just have to provide them with the right data
Empowerment: Make decisions at the lowest possible level
Ensure that you have valid data before drawing conclusions
Don’t use data to punish people
Feedback is a means to discuss data in an open
atmosphere, enabling early conclusions and actions!
Feedback: Deployment
• Feedback should be:
– on something that is considered important
– quick and frequent
– specific, valid, and understandable
Start small,
with a team that is open to it and willing to try
Get feedback on how you are doing feedback
Experience 1: Measuring Product Quality
Old approach:
Quality Engineer gathered data, did analysis, and presented
conclusions to design and test teams (often in a big report)
Drawback:
Teams didn’t understand the data
Data available only when the project was finished: too late
No insight into how the measurements related to their work
Teams didn’t feel the need for changes
Result:
Hardly any improvement of product or process quality
Project Defect Model
• Project Defect Model:
– to control quality of the product during development
– and improve development/inspection/test processes
• Way of working:
– Estimate # defects made during development per phase
– Estimate defect detection rate per phase
– Track the estimates against actual # defects found
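The way of working above can be sketched in code. This is a minimal illustration only: the phase names, injection counts, and detection rates below are invented for the example, not Ericsson's actual figures.

```python
# Minimal sketch of a project defect model (illustrative numbers only).

phases = ["design", "code", "function test", "system test"]

# Estimated defects introduced in each phase
injected = {"design": 40, "code": 80, "function test": 10, "system test": 5}

# Estimated fraction of the then-latent defects detected in each phase
detection_rate = {"design": 0.5, "code": 0.6,
                  "function test": 0.7, "system test": 0.8}

def expected_found(phases, injected, detection_rate):
    """Propagate latent defects through the phases and return the
    expected number of defects found in each phase."""
    latent = 0.0
    expected = {}
    for phase in phases:
        latent += injected[phase]          # new defects made this phase
        found = latent * detection_rate[phase]
        expected[phase] = round(found, 1)
        latent -= found                    # escaped defects carry over
    return expected

estimate = expected_found(phases, injected, detection_rate)

# Track the estimates against actuals as the project progresses;
# a large delta is a signal to analyze causes, not a verdict.
actuals = {"design": 12, "code": 70}
for phase, found in actuals.items():
    delta = found - estimate[phase]
    print(f"{phase}: expected {estimate[phase]}, found {found} "
          f"(delta {delta:+.1f})")
```

A negative delta early on (as in "design" here) is exactly the kind of signal a team would discuss in a feedback session: are fewer defects being made, or are inspections missing them?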
Project Defect Model: Feedback
New approach:
Quality Engineer provides the model, and coaches the teams in
estimating, tracking actuals, and drawing conclusions
Benefits:
Teams develop an understanding of their way of working
Teams have better insight into their progress/results
Teams feel involved; it’s their data/conclusions
Result:
Conclusions (problem/risk) lead to early actions
Teams get recognized for good results
Experience 2: Steering R&D performance
Old approach:
Quality Engineer collected target data and presented conclusions
to management (or reported on them).
Drawback:
Many debates about the data/measurements
No insight into causes when targets were not met
Blame and denial
Result:
Metrics didn’t support controlled improvement of the performance
Balanced Scorecard
• Balanced Scorecard:
– comprehensive set of measurable targets
– from different focus areas
• Way of working:
– Collection of data by the Quality Engineers
– Feedback/Interview sessions with managers
• Show the raw data
• Ask management for explanation
• Have management draw conclusions and take actions
– Document the conclusions/actions with the data
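As an illustration of this way of working, a scorecard with measurable targets from different focus areas might be checked for red flags like this. The focus areas, metric names, targets, and actuals are all made up for the example:

```python
# Illustrative balanced-scorecard check (all figures are hypothetical).
# Each focus area holds metrics as (target, actual) pairs.

scorecard = {
    "product quality":    {"fault slip-through %": (10, 14)},
    "delivery precision": {"on-time milestones %": (90, 92)},
    "cost":               {"budget deviation %": (5, 4)},
    "competence":         {"training days per engineer": (8, 5)},
}

def red_flags(scorecard):
    """Return the metrics whose actual value is worse than target.
    For most metrics higher is better; for the two listed below,
    lower is better."""
    lower_is_better = {"fault slip-through %", "budget deviation %"}
    flags = []
    for area, metrics in scorecard.items():
        for name, (target, actual) in metrics.items():
            worse = (actual > target if name in lower_is_better
                     else actual < target)
            if worse:
                flags.append((area, name, target, actual))
    return flags

for area, name, target, actual in red_flags(scorecard):
    print(f"RED: {area} / {name}: target {target}, actual {actual}")
```

The point of the feedback session is what happens after such a report: the raw data and red flags go to the managers, who explain the causes and decide the actions themselves.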
Balanced Scorecard: Feedback
New approach:
Quality Engineer shows the raw data; signals trends, anomalies,
and red flags; and asks critical questions, working towards actions
Benefits:
Management knows what happened; together with the Quality Engineer
they can pinpoint and analyze the causes of insufficient performance
Management feels involved; it’s their data/conclusions
Result:
Managers take earlier action
Management Team focuses on show stoppers, while still keeping an
overview of total performance
Key Success Factors
Data collected must relate to the organization’s goals
Management Support is crucial
Quality Engineers have a central role
People providing the data should be rewarded
Order of data – analysis – conclusions – actions – predictions
Communicate, communicate, communicate!
Pitfalls
People might distrust the data and say that it is wrong:
Ask them for the correct data (but don’t try to be perfect: optimize!)
People sometimes do not want to participate in analysis:
Make clear that only they can draw the conclusions
People sometimes do not want to take actions:
With top-down goal setting, ensure that they accept the targets
People are wary of change:
Start with a less critical measurement, and communicate successes
Conclusions
Feedback improves effectiveness of measurements:
– Earlier insight into performance & risks
– Involvement of those whose work is measured
– Actions taken by teams and middle management
Make what’s counted count!
Further reading
References
– What Did You Say? The Art of Giving and Receiving Feedback. Seashore,
Seashore & Weinberg, Douglas Charles Press, 1992.
– Getting Things Done When You Are Not in Charge. Geoffrey M. Bellman,
Fireside, 1993.
– How to Talk About Work Performance: A Feedback Primer. Esther Derby,
CrossTalk, December 2003, pages 13-16.
Papers
– Controlling Product Quality During Development with a Defect Model,
Proceedings ESEPG 2003
– Make What’s Counted Count, Better Software magazine, March 2004
Ben Linders, Ericsson R&D, The Netherlands
ben.linders@ericsson.com, +31 161 24 9885