Performance measurement of agile teams
Continuous development of a product – performance measured per sprint
Cracow, Poland, October 7, 2015
Harold van Heeringen, METRI
Theo Prins, Sogeti Nederland
Edwin van Gorp, Sogeti Nederland
|
Sizing, Estimating & Control
• Sogeti department responsible for:
  • Functional size measurement: Nesma, IFPUG and COSMIC;
  • Estimating new projects, releases and contracts;
  • Project control based on software metrics;
  • Performance measurement;
  • Benchmarking;
  • Performance reporting.
• This presentation is about one engagement in which the SEC department was
  asked to help solve issues regarding the productivity of a Sogeti agile
  team in a government project… measured per sprint!
|
Background
• RfP of a Dutch government agency in 2014:
  • Agile development of a product, taken over from another supplier;
  • Performance measurement of the team per sprint;
  • Based on an ISO/IEC norm, in this case Nesma function points;
  • Realistic targets set, based on Sogeti historical data, external
    benchmark data and historical data of the previous supplier!
• Sogeti offered to measure performance based on Project Delivery Rate
  (PDR), expressed in hours per function point delivered per sprint;
• Politics: the customer needs to prove that the new supplier performs better!
• The customer selected Sogeti because of its proven historical performance
  and its specialized department Sizing, Estimating & Control.
|
The problem
• The application turned out to be very complex; even the OTAP (development,
  test, acceptance, production) environment was hard to handle. The product
  backlog contained many non-functional backlog items;
• Functional size measurement is a powerful way to objectively size progress
  when it comes to functionality added, changed or deleted, but not for
  measuring the non-functional sprint backlog items processed by the team;
• The customer (product owner) decides which functional and non-functional
  backlog items are put on the sprint backlog;
• Traditional data and performance measurement methods turned out to be
  ineffective for measuring performance at sprint level when more
  non-functional backlog items than average are put on the sprint backlog.
|
Two types of agile projects
1. Development of a set of specific requirements, prioritized on a backlog,
   realized in a specific duration by a specific team within a specific
   amount of effort hours and cost. At some point the project, and the
   product, is finished (in traditional terms: new development or a release).
2. Continued development (evolution) of an existing application. There is no
   definite point at which the project or product is finished: a year is
   divided into X sprints of Y weeks, with a fixed team working to deliver
   the sprint backlog items. This only ends when the organization decides
   maintenance is no longer needed (in traditional terms: maintenance).
This industry paper investigates type 2.
|
First supplier
• Large international system integrator;
• Application developed from scratch;
• About 15 sprints;
• Sprints with fluctuating functional sizes in Nesma FP;
• High complexity;
• Average PDR of about 17 hours/FP;
• Almost all backlog items were functional in nature!
• However, the customer was not happy and decided to select a new supplier.
|
After transition
• From Q3 2014 onwards, Sogeti took over;
• Sprints of 3 weeks;
• Sogeti team; product owner supplied by the customer;
• Product backlog contains many non-functional items:
  • Scrum team works hard, but hardly delivers function points;
  • PDR (h/FP) relatively high, not in line with the target PDR;
• Customer contract manager blames Sogeti for not being productive;
• Contract under pressure: media attention and political issues;
• Sogeti department Sizing, Estimating & Control asked to analyze the
  performance and to propose improvements.
|
Function points
• Important to understand: using a functional size measurement method (an
  ISO/IEC standard) means measuring the size of the functional user
  requirements implemented in the software;
• Non-functional requirements are not measured at all!
• More non-functional work in a sprint means that less functionality is
  realized, and therefore a higher PDR (hours/FP), i.e. lower productivity;
• In project estimation and benchmarking, the influence of NFR is accounted
  for by the historical data used, the parametric model used, or the peer
  group constructed from projects with similar characteristics.
|
Story points
• The usual way to estimate effort in agile teams;
• Team members assign story points to each backlog item, reflecting the
  amount of work needed to realize the item;
• Subjective, not repeatable, not verifiable and not defensible, but
  mainstream practice in agile teams because of ease of use;
• Story point-based metrics cannot be compared with any measurements or
  metrics outside the team;
• Story point measurement is not a standard, but story points do take into
  account the effort spent on non-functional backlog items.
|
Non-functional backlog items are important

Sprint X         FP   SP   Effort hours
Backlog item 1    4    4     90
Backlog item 2    0    6    120
Backlog item 3    0    2     45
Backlog item 4    5    3     65
Backlog item 5    4    3     80
Total            13   18    400

PDR = 400/13 = 30,8 hours/FP
Ratio of functional/non-functional SP: 10/8
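The arithmetic behind this slide can be sketched in a few lines of Python (a minimal illustration using the backlog item figures from the table; classifying an item as non-functional when it delivers 0 FP is our reading of the example):

```python
# Backlog items from the table above: (function points, story points, hours).
items = [(4, 4, 90), (0, 6, 120), (0, 2, 45), (5, 3, 65), (4, 3, 80)]

total_fp = sum(fp for fp, _, _ in items)       # 13 FP
total_sp = sum(sp for _, sp, _ in items)       # 18 SP
total_hours = sum(h for _, _, h in items)      # 400 hours

# Items with 0 FP are non-functional: their story points and hours are real
# work, but classic PDR only divides by the function points delivered.
functional_sp = sum(sp for fp, sp, _ in items if fp > 0)   # 10 SP
non_functional_sp = total_sp - functional_sp               # 8 SP

pdr = total_hours / total_fp
print(f"PDR = {pdr:.1f} hours/FP")                            # PDR = 30.8 hours/FP
print(f"Ratio F/NF SP: {functional_sp}/{non_functional_sp}")  # Ratio F/NF SP: 10/8
```

Note how the 8 non-functional story points inflate the PDR: the team worked 400 hours, but only 13 FP count in the denominator.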
|
Sprint performance example

Sprint        1     2     3     4     5    6     7     8
Effort      345   389   367   412   365  375   390   401
Size (FP)    15     5    16     3    25    0    36    32
PDR (h/FP) 23,0  77,8  22,9 137,3  14,6  n/a  10,8  12,5

[Chart: PDR (h/FP) per sprint against the target PDR (h/FP)]
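The per-sprint PDR figures above can be reproduced with a small helper (an illustrative sketch; sprint 6 shows why the metric breaks down when no function points are delivered):

```python
# Effort hours and delivered functional size per sprint, from the example.
effort = [345, 389, 367, 412, 365, 375, 390, 401]
size_fp = [15, 5, 16, 3, 25, 0, 36, 32]

def pdr(hours: float, fp: float):
    """Project Delivery Rate in hours/FP; undefined when no FP are delivered."""
    return round(hours / fp, 1) if fp > 0 else None

pdrs = [pdr(h, fp) for h, fp in zip(effort, size_fp)]
# [23.0, 77.8, 22.9, 137.3, 14.6, None, 10.8, 12.5] -- None is the 'n/a' sprint
```

The wild swings (10,8 up to 137,3 h/FP) come from the sprint mix, not from the team suddenly becoming slower or faster.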
|
Issue
• The team spends a lot of time on non-functional backlog items;
• The PDR is not good enough to reach the target;
• Stakeholders don't understand this 'technical issue' and only see the
  metrics on the dashboard → PDR significantly worse than expected;
• But… the customer's product owner decided which product backlog items to
  put on the sprint backlog!
• So the disappointing PDR is mainly caused by the mix of functional and
  non-functional backlog items put in the sprint by the product owner
  (= the customer!);
• Sogeti SEC wishes to address this issue and to come up with a proposal for
  a more accurate performance measurement method.
|
SEC proposal
Agile Normalized Size (ANS)
The functional size that could have been realized if the product owner had
put only functional backlog items in the sprint backlog.
Based on this ANS, a PDR (hours/FP) can be determined that can be compared
to the PDRs in the databases with historical data.
|
Method
1. Measure the functional size of the realized functional backlog items with
   a standard method (Nesma/IFPUG FPA, COSMIC, …);
2. Determine whether the realized backlog items are functional or
   non-functional;
3. Determine the number of story points of the functional backlog items
   realized in the sprint;
4. Determine the total number of story points realized in the sprint;
5. Determine the agile normalized size:
   ANS = (functional size / # functional story points) * total # story points
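The five steps can be condensed into a single function (a minimal sketch; the function name is ours, not part of the method definition):

```python
def agile_normalized_size(functional_size_fp: float,
                          functional_sp: int,
                          total_sp: int) -> float:
    """Agile Normalized Size: scale the measured functional size by the
    ratio of total story points to functional story points in the sprint."""
    if functional_sp == 0:
        # A completely non-functional sprint: per-sprint ANS is undefined
        # (the case the progressive approach later solves).
        raise ValueError("no functional story points realized")
    return (functional_size_fp / functional_sp) * total_sp

# Sprint X from the example: 13 FP measured, 10 functional SP, 18 SP total,
# 400 effort hours.
ans = agile_normalized_size(13, 10, 18)   # 23.4 nFP
pdr_nfp = 400 / ans                       # ~17.1 hours/nFP
```

The implicit assumption is that functional and non-functional story points represent comparable amounts of work, so the FP-per-SP rate of the functional items can be extrapolated to the whole sprint.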
|
The example extended

Sprint X         FP   SP   Hours
Backlog item 1    4    4     90
Backlog item 2    0    6    120
Backlog item 3    0    2     45
Backlog item 4    5    3     65
Backlog item 5    4    3     80
Total            13   18    400

Functional size: 13 FP
Functional SP: 10 SP
Total SP: 18 SP
Regular PDR = 400/13 = 30,8 hours/FP
Agile normalized size = (13 / 10) * 18 = 23,4 nFP
PDR = 400/23,4 = 17,1 hours/nFP
|
The effect in multiple sprints

Sprint  Size (FP)  Functional SP  Non-functional SP  Total SP  ANS (nFP)
16         20           32               12             44       27,5
17         25           28               16             44       39,3
18         18           24               20             44       33,0
19         29           35                4             39       32,3
20          4            6               36             42       28,0
21         15           16               24             40       37,5
|
Advantages / disadvantages
Advantages
• Reduced influence of non-functional backlog items;
• The use of an ISO/IEC FSM standard – the ability to benchmark.
Disadvantages
• Depends on accurate story point assignment (subjective);
• Possible for the team to tweak the performance figures;
  • As the product owner is present, this risk is considered small;
• Impossible to measure ANS when the functional size delivered is 0.
|
Productivity measurement

Sprint  Size (FP)  ANS (nFP)  Hours  Hours/FP  Hours/nFP
16         20        27,5      500     25,0      18,2
17         25        39,3      480     19,2      12,2
18         18        33,0      530     29,4      16,1
19         29        32,3      468     16,1      14,5
20          4        28,0      534    133,5      19,1
21         15        37,5      522     34,8      13,9
|
The effect in multiple sprints
|
The example

Sprint  Size (FP)  Functional SP  Non-functional SP  Total SP  ANS (nFP)
16         20           32               12             44       27,5
17         25           28               16             44       39,3
18         18           24               20             44       33,0
19         29           35                4             39       32,3
20          4            6               36             42       28,0
21         15           16               24             40       37,5
22          0            0               41             41       t.b.d.
23         18           24               20             44       33,0
|
Sprint 22: no productivity measurement

Sprint  Size (FP)  ANS (nFP)  Hours  Hours/FP  Hours/nFP
16         20        27,5      500     25,0      18,2
17         25        39,3      480     19,2      12,2
18         18        33,0      530     29,4      16,1
19         29        32,3      468     16,1      14,5
20          4        28,0      534    133,5      19,1
21         15        37,5      522     34,8      13,9
22          0        N/A       512     N/A       N/A
23         18        33,0      508     28,2      15,4
|
Issue: completely non-functional sprints
• In sprint 22, zero function points were delivered;
• The size in FP is 0, so the ANS cannot be determined (division by zero);
• Impossible to determine productivity;
• Solution: a progressive approach.
|
Progressive approach
• Size measurement and productivity measurement not per sprint, but
  cumulatively up to the latest sprint;
• Does not focus on a single sprint, but on overall performance:
  ANS(progressive) = (Σ(1..n) functional size / Σ(1..n) functional story
  points) * Σ(1..n) total story points
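A sketch of the progressive calculation, fed with the sprint data from the earlier tables (illustrative; it assumes at least one functional story point has been realized by sprint n, which holds here from sprint 16 onwards):

```python
# Per sprint: (size in FP, functional SP, total SP, effort hours).
sprints = [
    (20, 32, 44, 500),  # sprint 16
    (25, 28, 44, 480),  # sprint 17
    (18, 24, 44, 530),  # sprint 18
    (29, 35, 39, 468),  # sprint 19
    (4,   6, 42, 534),  # sprint 20
    (15, 16, 40, 522),  # sprint 21
    (0,   0, 41, 512),  # sprint 22: completely non-functional
    (18, 24, 44, 508),  # sprint 23
]

def progressive(sprints):
    """Cumulative ANS up to each sprint n:
    (sum of FP / sum of functional SP) * sum of total SP."""
    fp = f_sp = t_sp = hours = 0
    rows = []
    for size, func_sp, total_sp, h in sprints:
        fp += size
        f_sp += func_sp
        t_sp += total_sp
        hours += h
        ans = fp / f_sp * t_sp          # cumulative sums keep f_sp > 0
        rows.append((round(ans, 1), round(hours / ans, 1)))
    return rows

rows = progressive(sprints)
# rows[6] is sprint 22: ANS ~231.4 nFP, ~15.3 hours/nFP -- no division by zero.
```

Because the cumulative functional story points stay positive, sprint 22 now gets a defined productivity figure, and the hours/nFP series settles around 15.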
|
Progressive approach

Sprint  Size (FP)  ANS (nFP)  Hours  Hours (cum)  ANS (progressive)  Hours (cum)/nFP (prog)
16         20        27,5      500        500           27,5                 18,2
17         25        39,3      480        980           66,0                 14,8
18         18        33,0      530      1.510           99,0                 15,3
19         29        32,3      468      1.978          132,2                 15,0
20          4        28,0      534      2.512          163,6                 15,4
21         15        37,5      522      3.034          199,2                 15,2
22          0        N/A       512      3.546          231,4                 15,3
23         18        33,0      508      4.054          264,3                 15,3
|
Difference between the methods
|
Starting points
Documentation
After each sprint the functional documentation should be brought up to date,
and it must be clear:
• Which functionality was added in the sprint;
• Which functionality was changed in the sprint, and in which way;
• Which functionality was deleted in the sprint;
• This should be part of the definition of done.
Effort administration
• The effort hours need to be booked in the effort administration in such a
  way that the hours in scope and out of scope of the performance
  measurement can be clearly identified.
|
Conclusions and recommendations
• The productivity of an agile team in a contract can be measured and
  benchmarked while taking into account the effect of non-functional
  requirements;
• The customer now understands that non-functional backlog items have an
  impact on the PDR when only Nesma/IFPUG function points are used to
  measure agile projects at sprint level, and is able to explain that
  internally and politically. The pressure has eased, because the targets
  are now met;
• The method can help other organizations as well!
