Harold van Heeringen,
ISBSG president
Overview
Benchmarking
Software Project Industry
Functional Size Measurement
Software Metrics
Historical Data: ISBSG
Project Benchmark Example
Organization Benchmark
Other uses for Benchmarking
Country report – Italy
Data submissions Italy
Benchmarking (wikipedia)
Benchmarking is the process of comparing one's business processes
and performance metrics to industry bests or best practices from
other industries.
Benchmarking is used to measure performance using a
specific indicator (cost per unit of measure, productivity per unit
of measure, cycle time of x per unit of measure or defects per
unit of measure) resulting in a metric of performance that is then
compared to others.
This then allows organizations to develop plans on how to make
improvements or adapt specific best practices, usually with the
aim of increasing some aspect of performance. Benchmarking
may be a one-off event, but is often treated as a continuous
process in which organizations continually seek to improve their
practices.
Where are we now?
“Even the most detailed navigation map of an
area is useless if you don’t know where you are”

Informed decisions
Senior Management of IT departments/organizations
need to make decisions based on ‘where they are’
and ‘where they want to go’.
Benchmarking is about determining ‘where you are’
compared to relevant peers, in order to make
informed decisions.
But how to measure and determine where you are?
Software project industry
Low ‘performance metrics’ maturity
Few Performance Measurement processes implemented
Few Benchmarking processes implemented

Most organizations don’t know how good or how bad
they are in delivering or maintaining software.
These organizations are not able to assess their
competitive position, nor able to make informed
strategic decisions to improve their competitive
position.
But…
Best in Class organizations deliver software up to 30
times more productively than Worst in Class
organizations
High Productivity, High Quality
More functionality for the users against lower costs – value
Shorter Time to Market – competitive advantage!

Worst in Class organizations will find themselves in
trouble in an increasingly competitive market
Outperformed by competition
Internal IT departments get outsourced
Commercial software houses fail to win new contracts

Important to know where you stand!
Benchmarking is essential!
Difficulty – low industry maturity
How to measure metrics like productivity, quality,
time-to-market in such a way that a meaningful
comparison is possible?

Comparing apples to apples
Software is not easy to compare
Functional Size Measurement
Function Point Analysis (NESMA, IFPUG or COSMIC)
Measure the functional user requirements – size in function points;
ISO standards – objective, independent, verifiable, repeatable;
Strong relation between functional size and project effort needed;

What to do with the results?
Project effort/duration/cost estimation
Benchmarking/performance measurement
Use in Request for Proposal management (answer price/FP questions)

What about historical data?
Company data (preferably for estimation)
Industry data (necessary for external benchmarking)
Unit of Measure (UoM)
Why are Function Points the best UoM to use in Benchmarking?
Functionality is of value for the client/business. More functionality means
more value. More lines of code (technical size) do not necessarily mean more value.
Function Points are measured independently of the technical requirements:
500 FP of functionality implemented in a Java SOA architecture
= 500 FP of functionality implemented in Cobol on a mainframe
Function Points are measured independently of the implementation method:
500 FP delivered in an agile development project
= 500 FP delivered in a COTS package implementation
Software metrics – some examples
Productivity
Productivity Rate: #Function points per staff month
PDR (Project Delivery Rate): #Effort hours per function point
Quality
Defect Density: #Defects delivered per 1000 function
points
Time to Market
Speed: #Function points delivered per calendar month
Performance measurement
Measure the size of completed projects
Project size in Function Points
Product size in Function Points
Collect and analyze the data
Effort hours, duration, defects
Normalize the data when necessary
Store the data in the corporate database
Benchmark the project, internally and externally
Report metrics and trends
Different reports for different stakeholders
Depending on goals of the stakeholder
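
As an illustration of the collect-and-store steps above, a minimal Python sketch of one record in such a corporate database; the fields and structure are assumptions for illustration only, not an ISBSG or NESMA data format.

# Minimal sketch of one record in a corporate measurement database.
# Field names are illustrative assumptions, not an ISBSG/NESMA format.
from dataclasses import dataclass

@dataclass
class CompletedProject:
    project_id: str
    year: int
    primary_language: str        # e.g. "Java"
    functional_size_fp: int      # project size in function points
    effort_hours: float          # total effort hours (normalized to one scope definition)
    duration_months: float       # elapsed calendar months
    defects_after_delivery: int  # defects found after go-live

corporate_db: list[CompletedProject] = []
# Example: Project X from the next slides (the year is illustrative)
corporate_db.append(CompletedProject("X", 2013, "Java", 411, 5730.0, 11.0, 23))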
External benchmark
How to benchmark your performance externally?
Gartner / McKinsey / other??
Very expensive!
No insight into the data used!

Do-it-yourself benchmarking, use:

Historical data of completed projects!
Low cost, a better feel for the data, and you decide yourself which
peer groups and which data are most relevant!
Historical data: ISBSG repositories
International Software Benchmarking Standards Group
Independent and not-for-profit
Full Members are non-profit organizations, like DASMA, IFPUG,
FiSMA, QESP and NESMA; GUFPI-ISMA is now an associate member
Grows and exploits two repositories of software data:
New development projects and enhancements (> 6000 projects)
Maintenance and support (> 1000 applications)

Everybody can submit project data
DCQ (Data Collection Questionnaire) on the site
Completely anonymous
Free benchmark report in return
ISBSG
Mission: “To improve the management of IT resources by both
business and government, through the provision and
exploitation of public repositories of software engineering
knowledge that are standardized, verified, recent and
representative of current technologies”.
All ISBSG data is
validated and rated in accordance with its quality guidelines
current
representative of the industry
independent and trusted
captured from a range of organization sizes and industries
Website and portal
Example project benchmark
Project X was completed; the following data was collected:
Primary programming language: Java
Effort hours spent: 5730
Duration: 11 months
Defects found after delivery: 23
The functional size of the project was measured: 411 FP
Software metrics:
Project Delivery Rate: 5730/411 = 13,9 h/FP
Project Speed: 411/11 = 37,4 FP per calendar month
Defect Density: (23/411) * 1000 = 56,0 defects/1000 FP
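
The same three metrics in a short Python sketch that reproduces the figures above (the slide uses decimal commas; the code prints decimal points):

# Reproduce the Project X metrics from the figures on this slide.
size_fp, effort_hours, duration_months, defects = 411, 5730, 11, 23

pdr = effort_hours / size_fp                # Project Delivery Rate: 13.9 h/FP
speed = size_fp / duration_months           # Project Speed: 37.4 FP per calendar month
defect_density = 1000 * defects / size_fp   # Defect Density: 56.0 defects per 1000 FP

print(f"PDR: {pdr:.1f} h/FP")
print(f"Speed: {speed:.1f} FP/month")
print(f"Defect density: {defect_density:.1f} defects/1000 FP")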
Example benchmark
ISBSG ‘New Developments & Enhancements’
Select the right ‘peer group’
Data Quality A or B
Count approach: IFPUG 4.x or NESMA
Primary Programming Language = ‘Java’
300 FP < Project Size < 500 FP
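
This peer-group selection can be reproduced on a licensed ISBSG data extract with a simple filter; a sketch assuming the extract is loaded into a pandas DataFrame, with file and column names that are assumptions for illustration (check the field names of the ISBSG release you use).

# Sketch of the peer-group filter on an ISBSG 'New Developments & Enhancements'
# extract. The file name and column names are illustrative assumptions only.
import pandas as pd

isbsg = pd.read_csv("isbsg_new_dev_and_enh.csv")

peer_group = isbsg[
    isbsg["Data Quality Rating"].isin(["A", "B"])
    & isbsg["Count Approach"].str.contains("IFPUG 4|NESMA", na=False)
    & (isbsg["Primary Programming Language"] == "Java")
    & isbsg["Functional Size"].between(300, 500, inclusive="neither")
]
print(len(peer_group), "projects in the peer group")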
Results project benchmark
Peer group statistics (ISBSG):
PDR (h/FP): N = 488; Minimum 0,1; P10 2,5; P25 4,7; Median 9,8; P75 18,4; P90 28,9; Maximum 476,0; Average 15,2
Speed (FP per calendar month): N = 428; Minimum 9,4; P10 23,1; P25 32,5; Median 53,8; P75 95,4; P90 130,2; Maximum 621,3; Average 70,9
Defect Density (defects/1000 FP): N = 154; Minimum 0,0; P10 0,0; P25 0,0; Median 3,7; P75 17,9; P90 40,1; Maximum 366,5; Average 18,6
Project X:
Project Delivery Rate: 5730/411 = 13,9 h/FP
Project Speed: 411/11 = 37,4 FP per calendar month
Defect Density: (23/411) * 1000 = 56,0 defects/1000 FP
This project was carried out less productively and slower
than the market average, and its quality is worse than average.
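
To position Project X against these peer-group statistics, one option is to interpolate an approximate percentile rank from the summary figures; a rough sketch, assuming only the summary above is available (with a full ISBSG extract you would rank against the raw project values instead).

# Approximate percentile rank of Project X, interpolated from the
# peer-group summary statistics above (a full extract gives exact ranks).
import numpy as np

percentiles = [0, 10, 25, 50, 75, 90, 100]
pdr_dist   = [0.1, 2.5, 4.7, 9.8, 18.4, 28.9, 476.0]      # h/FP, lower is better
speed_dist = [9.4, 23.1, 32.5, 53.8, 95.4, 130.2, 621.3]  # FP/month, higher is better
dd_dist    = [0.0, 0.0, 0.0, 3.7, 17.9, 40.1, 366.5]      # defects/1000 FP, lower is better

def pct_rank(value, dist):
    return float(np.interp(value, dist, percentiles))

print(f"PDR 13.9 h/FP       -> ~P{pct_rank(13.9, pdr_dist):.0f} (above the median: less productive)")
print(f"Speed 37.4 FP/month -> ~P{pct_rank(37.4, speed_dist):.0f} (below the median: slower)")
print(f"Defects 56.0/1000FP -> ~P{pct_rank(56.0, dd_dist):.0f} (beyond P90: worse quality)")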
Organization benchmark
[Charts: Organization Y Productivity Index, Speed Index and Quality Index per year (<2009, 2009, 2010, 2011, 2012), each plotted against the industry level, a target line (baseline +50%) and a lower bound (baseline -40%; -50% for quality)]
Analysis:
Until 2010, the organization was improving
After 2010/2011, the trends go the wrong way
Recommendation: find the cause and draw up an improvement plan
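
The index trends in these charts can be recomputed from yearly project data once an index definition is fixed; a sketch under the assumption that the productivity index is the baseline PDR divided by the year's median PDR (so higher is better), with the +50% target and -40% lower bound shown in the charts. The actual index definition and the yearly values of Organization Y are not given on the slide, so the numbers below are purely hypothetical.

# Hypothetical sketch: productivity index per year relative to a baseline.
# Assumed definition: index = baseline PDR / median PDR of that year
# (higher = better). All values are invented for illustration only.
baseline_pdr = 12.0                                              # h/FP, '<2009' baseline
pdr_per_year = {2009: 10.9, 2010: 9.2, 2011: 10.5, 2012: 13.6}   # yearly median PDR

TARGET = 1.5       # baseline +50%
LOWER_BOUND = 0.6  # baseline -40%

for year, pdr in sorted(pdr_per_year.items()):
    index = baseline_pdr / pdr
    status = "on track" if index >= LOWER_BOUND else "below lower bound"
    if index >= TARGET:
        status = "target reached"
    print(f"{year}: productivity index {index:.2f} ({status})")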
Other uses for ISBSG data
Vendor selection, based on productivity, speed or
quality metrics, compared to the industry.
Definition of SLAs (or other KPIs) based on
industry average performance.
Establish a baseline from which to measure future
improvement.
Explain to the client/business that a project was
carried out in a ‘better-than-average’ way, while the
client may perceive otherwise.
Analysis of the data
Analyze the difference in productivity or quality
between two (or more) types of projects:
Traditional vs. Agile
Outsourced vs. In-house
Government vs. Non-government
One site vs. multi-site
Reuse vs. no reuse
Etcetera.

ISBSG Special Analysis reports
Traditional vs. Agile
Agile productivity: 10 – 20% gain after 1 year of
adoption
Agile cost: 20-40% lower after 1 year of adoption
Agile time-to-market: 10-60% less
Agile quality (post production defects): 2-8% higher
10 Take Aways from Reifer Agile Report
(www.isbsg.com)
Special reports
Impact of Software Size on Productivity
Government and Non-Government Software Project
Performance
ISBSG Software Industry Performance report
ISBSG The Performance of Business Application, Real-Time and
Component Software Projects
Estimates – How accurate are they?
Planning Projects – Role Percentages
Team size impact on productivity
Manage your M&S environment – what to expect?
Many more
The value of benchmarking software projects
Country report
Italy (IFPUG)

Latest project: 2005
Italy (COSMIC)

Latest project: 2010
Government vs. Non-government
Role percentages
We need data!
Everybody wants to use data
But nobody wants to submit data… Why not?
Is it hard?
Is there a risk?
Is the reward not big enough?

Why not try it? You’ll get a free Benchmark report and
100 portal credits in return!!
Are there any factors preventing you?
WWW.ISBSG.ORG

GUFPI-ISMA Event offer
GUFPI-ISMA members
always get a 10% discount,
using the code provided
earlier this year.
Thank you!
Netherlands Software Metrics users Association (NESMA)
To celebrate our anniversary, NESMA
will organise the IWSM Mensura conference

October 6-8, 2014
2014.iwsm-mensura.org
Historical landmarks
Modern landmarks
European Cultural Capital 2001
The city of modern architecture
Largest soccer stadium in the Netherlands
Welcome to

Rotterdam
Busiest port in Europe
I hope to see you next year
October 6-8, 2014

2014.iwsm-mensura.org
