Performance Metrics Workbook for IT Projects
David Paschane, Ph.D.
Introduction
The VA is committed to fulfilling the President’s commitment for Executive departments to “use
innovative tools, methods, and systems to cooperate among themselves, across all levels of
Government, and with nonprofit organizations, businesses, and individuals in the private sector” (The
White House, January 21, 2009). Furthermore, VA is pursuing compliance with the OMB policy to create
and institutionalize a culture of open Government, one where the “integration of various disciplines
facilitates organization-wide and lasting change in the way the Government works,” and the use of best
practices “take advantage of the expertise and insight of people both inside and outside the Federal
Government, and form high-impact collaborations with researchers, the private sector, and civil society”
(OMB M-10-06, December 8, 2009). The purpose of this workbook is to support thinking, consistent with
Public Laws and Executive Orders, about the strategic, innovative, sustainable, and appropriate
research-based collaboration that strengthens the VA’s capacity to improve performance with the
assistance of IT-enhancing methodologies.
The objective is to apply the scientific method to understanding and optimizing work variability within
the VA to positively affect the outcome variability among VA customers. Integration of methodology in
large organizations is referred to as Corporate Performance Management (CPM), which aligns best
practices of Business Process Management and Re-engineering (BPM/BPR), Business Analytics and
Intelligence (BA/BI), Project Management and Knowledge Management (PM/KM), Total Quality
Management and Lean Six Sigma (TQM/6Σ), and Geographic Information Science (GIS). Sustainment of
appropriate performance metrics, by organizational unit and work conditions, will require the ability to
create repeatable, recursive analyses that diagnose performance issues and highlight high-value
trends in user improvement.
The maturity of analytic performance metrics is dependent on assessments of the needs and priorities
that range from senior managers to project managers, and across domains. Several domains are likely
candidates for performance metrics, depending on the progress of diagnosing performance trends and
meeting awareness and learning requirements. In most cases, performance metrics are based on
initiative-level information or control needs. As the performance architecture matures, performance
metrics will also be developed to support the stabilization of the conditions affecting performance,
optimization of cross-functional processes supporting executive decision-making, and contextualizing
outcomes to determine where and how performance is impacting services to customers.
This workbook examines metrics only at the project level, the point where consistency is most
important for building a sustainable performance environment. These metrics can be used to design
methodologies for controlling performance, but they must be placed in the larger context of how the
projects affect business and IT operations, and the culture of the workforce. Ultimately, these are all
parts of a learning process that sustains an analytic performance environment.
Performance Metrics at the Project Level
1.0 Project Performance Indicators
Project performance indicators are those that characterize the direct work of the project team in terms
of the project’s schedule, use of resources, and production of a quality product.
1.1 Schedule Deviation
Schedule Deviation is an indicator of the timeliness of delivering a product in terms of the customer’s
expectations. Schedule deviation metrics measure changes in project schedule, in terms of cumulative
time and percentage of schedule. These metrics are contextualized by other measurements, including
historical events and the number of rebaselines. The schedule deviation metrics can correspond to each
deliverable in a project. The metrics can also correspond to each major team, such as an employee team
or contracted team.
PROPOSED METRIC TARGET VALUES
Percent of days slippage from inception schedule 0%
Number of days slipped from inception schedule 0
Count of all re-baselines 0
Percent of days slippage in build schedule (after technical solution approval) 0%
Number of days slipped in build schedule 0
Percent of days slippage since last re-baseline (approval within governance) 0%
Number of days slipped since last re-baseline 0
Number of historical events that paused project (report events by name) 0
Estimated days historical events impacted schedule 0
Count of missed milestone dates 0
Days overrun per product components 0
Days overrun per development milestones 0
Estimated days overrun on planned work packages till close out 0
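As a minimal sketch of how the slippage rows above might be computed, the following Python derives days slipped and percent slippage from an inception baseline. The field names and dates are illustrative assumptions; the workbook does not prescribe an implementation.

```python
from datetime import date

def schedule_deviation(baseline_start: date, baseline_end: date,
                       actual_end: date) -> tuple[int, float]:
    """Days slipped and percent slippage relative to the inception schedule."""
    planned_days = (baseline_end - baseline_start).days
    slipped_days = (actual_end - baseline_end).days  # negative means early
    pct_slippage = 100.0 * slipped_days / planned_days if planned_days else 0.0
    return slipped_days, pct_slippage

# Example: a 182-day plan that finishes 30 days late
days, pct = schedule_deviation(date(2024, 1, 1), date(2024, 7, 1),
                               date(2024, 7, 31))
```

The same calculation can be re-anchored at the last approved re-baseline to produce the "since last re-baseline" rows.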
1.2 Errors Saved
Errors Saved is an indicator of how well the project is able to sustain quality in the development process.
The quality work is largely dependent on the ability of the team to manage defects and errors by
preventing them, detecting them early, detecting them before passing through a major development
lifecycle process, or establishing quick and focused correction in order to achieve delivery milestones.
Classifications of defects and errors are necessary for identifying which require the most immediate
attention within the development lifecycle. The following are key classifications by which measures are
reported:
1. Project Component (2 types)
a. In project’s components
b. In other project’s components
2. Build Task (3 types)
a. In building a function
b. In technology refresh
c. In defect correction
3. Build Process (4 types)
a. In product integration
b. In solution build
c. In technological requirements
d. In business requirements
4. Impact Severity (4 types)
a. Critical (1) – causes system crashes or loss of data or functionality
b. Major (2) – impairs system function
c. Average (3) – requires fewer than three workarounds
d. Minor (4) – minor cosmetic or documentation issues
5. Correction State (3 types)
a. Previously identified (reopened)
b. Newly identified, from a previously passed process (injected)
c. Newly identified in the current process (errors)
6. Release Priority (2 types)
a. Required for next release milestone
b. Required for subsequent release milestones
7. Problem Characteristic (3 types)
a. Inaccurate information or action
b. Inconsistent information or action
c. Incomplete information or action
PROPOSED METRIC TARGET VALUES
Defects or errors found before milestone review completed 100%
Defects or errors corrected before milestone review completed 100%
Days to correct defects 1
Count of defects or errors by testing environment 0
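The classification scheme above lends itself to a simple data structure. The sketch below is illustrative only (all type and field names are assumptions, not part of the workbook); it encodes the Impact Severity classes and computes the two percentage metrics in the table:

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    """Impact Severity classification from section 1.2 (4 types)."""
    CRITICAL = 1   # system crashes, loss of data or functionality
    MAJOR = 2      # impairs system function
    AVERAGE = 3    # workaround available
    MINOR = 4      # cosmetic or documentation issue

@dataclass
class Defect:
    severity: Severity
    found_before_review: bool      # detected before the milestone review
    corrected_before_review: bool  # corrected before the milestone review

def errors_saved(defects: list[Defect]) -> dict[str, float]:
    """Percent found/corrected before milestone review (both target 100%)."""
    n = len(defects)
    if n == 0:
        return {"found_pct": 100.0, "corrected_pct": 100.0}
    found = sum(d.found_before_review for d in defects)
    fixed = sum(d.corrected_before_review for d in defects)
    return {"found_pct": 100.0 * found / n,
            "corrected_pct": 100.0 * fixed / n}

report = errors_saved([
    Defect(Severity.CRITICAL, True, True),
    Defect(Severity.MAJOR, True, False),
    Defect(Severity.MINOR, False, False),
    Defect(Severity.AVERAGE, True, True),
])
```

The remaining classifications (Project Component, Build Task, Build Process, Correction State, Release Priority, Problem Characteristic) could be added as further enum fields in the same way.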
1.3 Earned Value
Earned Value is an indicator of how efficiently a project is using work hours to produce portions of
product according to a planned schedule. A project is efficient if it expends budget (Actual Cost) to
support work performed (Earned Value) consistent with progress expected at any point in time (Planned
Value). The purpose of these metrics is to characterize key trends based on a standard use of work and
product definitions and estimates, and compliance to standard processes. A challenge in the
interpretation of earned value trends is the impact of (1) work flow dependencies, (2) changes in human
resources, (3) historical events that change course of work, and (4) scope increases. Another
measurement consideration is to apply a different set of earned value measures to different project
team subunits, especially when there is a mix of employees and contractors, or subunits working on
independent components. The measures of earned value should also account for the additional effort
required for defect corrections and risk mitigations that were not anticipated in the project plan. An
effective set of earned value measures will take into account the requirement to weight value earned
according to the goals of the organization.
PROPOSED METRIC TARGET VALUES
All product components and work packages have an estimated cost 1; [1 (yes), 2 (drafted), 3 (no)]
All product components and work packages have a standardized definition 1; [1 (yes), 2 (drafted), 3 (no)]
The schedule for work packages has an approved earned value weighting method that is consistent with performance goals 1; [1 (yes), 2 (drafted), 3 (no)]
Estimate at Completion (actual cost plus estimate to complete, assuming past results are not typical of expected results) <5% of plan / <10% / >10%
Cost Performance Index (CPI) (identify project holds, per executive thresholds) >0.97 / >0.95 / <0.95
Schedule Performance Index (SPI) (identify project holds, per executive thresholds) >0.97 / >0.95 / <0.95
Cost Variance (earned value minus actual cost) 0
Actual cost to planned cost 100%
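The CPI, SPI, Cost Variance, and Estimate at Completion rows follow the standard earned value formulas: CPI = EV/AC, SPI = EV/PV, CV = EV − AC, and EAC = AC + (BAC − EV) when past variances are assumed atypical. A minimal Python sketch, with assumed parameter names and example figures:

```python
def earned_value_metrics(pv: float, ev: float, ac: float,
                         bac: float) -> dict[str, float]:
    """Standard EVM formulas; EAC assumes past variances are not typical,
    so remaining work is re-estimated at the original budgeted rate."""
    return {
        "CPI": ev / ac,          # cost efficiency (healthy above ~0.97)
        "SPI": ev / pv,          # schedule efficiency
        "CV":  ev - ac,          # cost variance (target 0)
        "EAC": ac + (bac - ev),  # estimate at completion, atypical variances
    }

# Example: $450k earned against $500k planned and $480k spent, $1M budget
m = earned_value_metrics(pv=500.0, ev=450.0, ac=480.0, bac=1000.0)
```

With these figures the project is both over cost (CPI < 1, negative CV) and behind schedule (SPI < 1), and the EAC exceeds the budget at completion.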
1.4 Project Condition Indicators
Project condition indicators are those that characterize factors affecting the project in terms of the
project’s staffing, scope, risks, and compliance to integration, design, and process requirements. These
indicators are often able to predict likelihood of changes in project performance indicators. While the
project team monitors these activities, changes in these indicators are often associated with the work of
other offices or managers.
1.5 Hiring Time
Hiring Time is an indicator used to assess the length of time associated with hiring employees. Hiring
Time metrics will characterize the ability to recruit and hire new employees to a team.
PROPOSED METRIC TARGET VALUES
Days between Personnel Description Signoff and Actual Hire Date <120 workdays
Time between project-required hire date and actual hire date 0 days
All hires total lag time by presence on a critical path 0 days
All hires total lag time by position categories 0 days
All hires total lag time by GS level 0 days
All hires total lag time by HR offices 0 days
All hires total lag time by occupational series 0 days
Number of all hiring stages exceeding target workdays x days
1.6 Contracting Time
Contracting Time is an indicator used to assess the length of time required to complete contract
awards relative to when they are required by the teams.
PROPOSED METRIC TARGET VALUES
Days between SOW Sign-Off and Actual Award Date <120 workdays
Time between project-required award date and actual award date 0 days each
All contracts total lag time by presence on a critical path 0 days each
All contracts total lag time by contract types (level of effort on complexity) 0 days each
All contracts total lag time by contract price categories 0 days each
All contracts total lag time by contracting offices 0 days each
All contracts total lag time by product category 0 days each
Number of all contracting stages exceeding target workdays x days each
1.7 Administrative Time
Administrative Time is an indicator of trends in planned and unplanned administrative tasks that distract
from scheduled project tasks. These metrics help identify the impact of time allocated by project
resources to tasks that are not identified in project plans.
PROPOSED METRIC TARGET VALUES
Total unplanned non-manager staff time as administrative 0%
Total planned non-manager staff time as administrative <10%
Total unplanned manager time as administrative 0%
Average time required for conducting milestone reviews <1 day
Average manager time required for preparing and participating in milestone reviews <4 hours
Closed issues of total count of all milestone review issues 100%
Total work hours required for preparing nonstandard status reports 0
Total number of reports prepared by team to senior managers a year, over two years <25 a year
1.8 Team Readiness
Team Readiness is an indicator of trends in training, alignment and continuity of project teams. Team
Readiness metrics will help determine required training investments and potential training value.
PROPOSED METRIC TARGET VALUES
Total planned staff vacancy time 0 days
Slot time filled by staff under-qualified for role 0 days
Staff formally prepared to achieve current team and organizational goals 100%
Staff formally prepared to comply with current team and organizational processes 100%
Planned project tasks without staffing 0
Planned task hours without staffing 0
Project task hours worked (in a given period) of available staff hours (in the same period) 100%
Staff certified as trained to standard skill set of respective roles 100%
Manager roles are formally authorized 1; [1 (yes), 2 (drafted), 3 (no)]
Total project manager vacancy time 0 days
Total unplanned hours of absences across the team 0
Total turnover rate of all staff <12% / <20% / >20%
Project manager turnover 0
Average leadership coaching sessions per year for key managers >11
1.9 Risk Control
Risk Control is an indicator of trends in the ability to anticipate and control issues that may negatively
affect the project. Risk Control metrics will assess the ability to provide consistency in estimating risks.
The metrics will also indicate needs to improve the quality of the risk identification, analysis, and
mitigation processes.
PROPOSED METRIC TARGET VALUES
Risk management plans approved 1; [1 (yes), 2 (drafted), 3 (no)]
Risk management meetings completed as planned 100%
Program funding matches budget at completion 100% / >95% / <95%
Operation funding matches year requirements 100% / >95% / <95%
Contractor burn rate (time and materials) < 103% of planned
Requirements and capability definitions identified, documented, and approved 100% / >95% / <95%
Risk management plans in use for all medium and high risk items 1; [1 (yes), 2 (drafted), 3 (no)]
Number of risk management plans executed late 0
Risks accepted of those identified as within tolerable range 100%
Risks avoided of those identified as avoidable with appropriate actions 100%
Risks mitigated of those identified as having impacts that can be reduced 100%
Number of unplanned risks that occurred 0
Estimated cost of unplanned risks that occurred $0
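Two of the ratios in the table above, risk coverage and contractor burn rate, can be sketched as follows. This is illustrative Python; the function names and example figures are assumptions, not workbook definitions.

```python
def risk_coverage_pct(handled: int, identified: int) -> float:
    """Percent of identified risks accepted, avoided, or mitigated
    (each coverage row in the table targets 100%)."""
    return 100.0 if identified == 0 else 100.0 * handled / identified

def burn_rate_pct(actual_spend: float, planned_spend: float) -> float:
    """Contractor burn rate as a percent of plan (target < 103%)."""
    return 100.0 * actual_spend / planned_spend

# Example: 18 of 20 mitigable risks mitigated; spend is 3% over plan
mitigated = risk_coverage_pct(handled=18, identified=20)
burn = burn_rate_pct(actual_spend=515_000, planned_spend=500_000)
```

The same coverage function applies separately to the accepted, avoided, and mitigated rows, each against its own denominator of identified risks.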
1.10 Scope Control
Scope Control is an indicator of trends in changes to the scope of work in development and rework
projects. Scope Control metrics will assess the ability of the organization to reduce lifecycle costs such as
cost of unplanned work.
PROPOSED METRIC TARGET VALUES
Count of all scope changes over development life cycle 0
Count of post-requirements approval scope change 0
Count of changes within product versions 0
Estimated added cost of scope changes $0
Functional requirements change rate <8% / <15% / >15%
Interface requirements change rate <8% / <15% / >15%
1.11 Integration Readiness
Integration Readiness is an indicator of trends in the ability to integrate products into the business
environment, given the readiness of the project, key actors, and the efficiency of gate reviews.
PROPOSED METRIC TARGET VALUES
Enterprise Architecture Review (4) issues resolved 100%
Preliminary Concept Design Review issues resolved 100%
System Requirement Review issues resolved 100%
Initial Design Review issues resolved 100%
Preliminary Design Review issues resolved 100%
Critical Design Review issues resolved 100%
Test Readiness Review issues resolved 100%
Acceptance Testing Review issues resolved 100%
Deployment Readiness Review issues resolved 100%
Integrated Baseline Review issues resolved 100%
Program Management Review (4) issues resolved 100%
Planning, Architecture, Technology, and Services Board (4) issues resolved 100%
Total reviews completed <18
Total review issues unresolved 0%
Total critical review issues requiring resolution 0
Testing strategy completed, aligned, and approved 1; [1 (yes), 2 (drafted), 3 (no)]
Testing environment complete and available for product testing 1; [1 (yes), 2 (drafted), 3 (no)]
System dependencies on schedule with product schedule 1; [1 (yes), 2 (drafted), 3 (no)]
Life cycle products are approved or in appropriate update 1; [1 (yes), 2 (drafted), 3 (no)]
Release management standards completed 100%
1.12 Reusability Compliance
Reusability Compliance is an indicator of trends in factors affecting reusability and adaptive
configuration, according to development standards. Reusability Compliance metrics will assess the
ability to build and foster an enterprise infrastructure that creates agility and responsiveness while
managing IT investments and business services.
PROPOSED METRIC TARGET VALUES
Percent of components with reusability compliance design approved 100%
Associated Services with unknown capacity requirements 0%
Associated Services with a continuity plan 100%
Product reusability plan approved 1; [1 (yes), 2 (drafted), 3 (no)]
Product reusability standards met 1; [1 (yes), 2 (drafted), 3 (no)]
Percent of components certified reusable compliant 100%
Components are platform neutral 100%
Product performance optimization plan is built on simulated parameters derived from process modeling 1; [1 (yes), 2 (drafted), 3 (no)]
Required business data is identified, categorized, and normalized 1; [1 (yes), 2 (drafted), 3 (no)]
Product uses a web services model for requesting and responding to data use 1; [1 (yes), 2 (drafted), 3 (no)]
Business process and function definitions are integrated into the services registry 1; [1 (yes), 2 (drafted), 3 (no)]
Hosting environment is approved for ensuring serviceability through automated pre-integration 1; [1 (yes), 2 (drafted), 3 (no)]
1.13 Organizational Maturity
Organizational Maturity is an indicator of trends in the maturity of the organization, in terms of adopting
process standards. Organizational Maturity metrics will assess the organization’s ability to manage work
in accordance with approved process standards. These metrics will help demonstrate progression
toward process maturity models.
PROPOSED METRIC TARGET VALUES
Percent of standard milestone reviews rated satisfactory 100%
The project has standard development processes and process goals documented and approved 1; [1 (yes), 2 (drafted), 3 (no)]
Project manager has documented process discipline improvement efforts by the team and reported to senior leadership quarterly 1; [1 (yes), 2 (drafted), 3 (no)]
Process compliance level (annual audit) 100%
Average lead time to start planned reviews 1 day
Days in review before Defined Requirements approved 1 day
Days in review before Functional Design approved 1 day
Days in review before Technical Design approved 1 day
Days in review before Configuration and Code Testing approved 1 day
Days in review before Test Script and Testing Plan approved 1 day
Days in review before Post-Testing Deployment approved 1 day
Days to deploy to production environment after Post-Testing Deployment approval 1 day
2.0 Portfolio Indicators
Portfolio indicators are those that characterize how combined projects share a common performance
domain. These are still project measures, but handled in terms of organizational goals.
2.1 Organizational Interoperability
Organizational Interoperability is an indicator of factors affecting the interoperability of developments,
in terms of data sharing and application coordination with other agencies. These metrics will assess the
ability to deploy systems that are prepared to integrate with external stakeholder agencies.
PROPOSED METRIC TARGET VALUES
Percent of required organization-interoperable components in project If 0%, these metrics do not apply to the project
System interoperability review completed 1; [1 (yes), 2 (drafted), 3 (no)]
Interoperable systems requirements completed 100%
Product interoperability plan approved 1; [1 (yes), 2 (drafted), 3 (no)]
Product interoperability standards met 1; [1 (yes), 2 (drafted), 3 (no)]
Percent of components certified interoperable 100%
Product performance optimization plan is built on simulated parameters derived from process modeling across participating agencies 1; [1 (yes), 2 (drafted), 3 (no)]
Required business data is identified, categorized, and normalized across participating agencies 1; [1 (yes), 2 (drafted), 3 (no)]
Hosting environments are approved for ensuring interoperability across participating agencies 1; [1 (yes), 2 (drafted), 3 (no)]
2.2 Online Customer Engagement
Online Customer Engagement (OCE) is an industry-standard indicator created to help assess the utility,
appeal, and customer benefit of web-enabled applications, and their impact on the customer and other
resources. OCE metrics will help assess the organization’s ability to deliver functionality to increase
stakeholder satisfaction and the ability to deliver applications that are flexible enough to integrate
future changes. The OCE metrics also help determine a design protocol for customer-facing design plans
to integrate customers into their management of services and benefits.
Metrics for OCE characterize the degree, depth, and impact of user (customer) involvement. This
extends not only to individual usage, but also behavior change, personally-realized effect, community
interaction, user perception, and peer influence. Ultimately, OCE reflects the quality of the content,
aesthetics, functionality, serviceability, infrastructure, alignment to user needs, facilitation of
communication and decision making, orientation, marketing, and promotion of organizational websites.
As these factors improve, websites will play a greater role engaging with users; consequently, (a)
transactions will become more efficient in terms of time and cost to the user; (b) user knowledge, skills,
and abilities will increase; and, (c) operational costs from other resources (e.g., call centers) should
decrease, thereby allowing resources to be reallocated to better serve customers. Associated OCE
studies can help identify evidence of how the organization enables customers’ self-empowerment,
prepares itself for adopting customer-facing IT advancements, and ensures the most effective customer-
facing products.
PROPOSED METRIC TARGET VALUES
Percent of required customer-facing products in project If 0%, these metrics do not apply to the project
Percent growth in unique online transactions >15% / >10% / <10%
Unique visitors by application vs. total call center volume >20% / >15% / <15%
Registration penetration, by application >15% / >10% / <10%
Application operational costs vs. call center operational costs >30% / >20% / <20%
Online community participation increase >20% / >15% / <15%
Average customer satisfaction rankings increase >10% / >5% / <5%
Average customer usability rankings increase >15% / >13% / <13%
Number of customer complaints reduced (via email and call center) >20% / >10% / <10%
Number of website content referral increase >15% / >5% / <5%
Annual resource redirection based on net changes in operational costs >15% / >10% / <10%
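Several OCE rows are percent-growth figures graded against three target bands. A sketch of that computation follows; the function names, example figures, and the strict-boundary handling are assumptions rather than workbook definitions.

```python
def pct_growth(current: float, prior: float) -> float:
    """Year-over-year percent growth (e.g., unique online transactions)."""
    return 100.0 * (current - prior) / prior

def tier(value: float, green: float, yellow: float) -> str:
    """Map a metric to the workbook's three-band targets
    (e.g., >15% / >10% / <10%)."""
    if value > green:
        return "green"
    return "yellow" if value > yellow else "red"

# Example: transactions grew from 1,000 to 1,150 (exactly 15%)
growth = pct_growth(current=1_150, prior=1_000)
band = tier(growth, green=15.0, yellow=10.0)  # not strictly above 15%
```

Reading the bands as strict inequalities means a value sitting exactly on the upper threshold falls into the middle band; an organization adopting these metrics would need to decide the boundary convention explicitly.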
2.3 Value Enhancement
Value Enhancement is an indicator of how well the project will achieve the goals of the proposed
initiative and support the alignment between the development portfolio and the goals of the
organization and its stakeholders. These are not measures of the total benefit of the project, but a
project-view on the value plan.
PROPOSED METRIC TARGET VALUES
An independent review of project’s value enhancement has been conducted 1; [1 (yes), 2 (drafted), 3 (no)]
Timing of value enhancement remains within achievement by the project schedule 1; [1 (yes), 2 (drafted), 3 (no)]
Number of incomplete project dependencies that will affect value enhancement estimates 0
The strategic plan for value enhancement identifies future versions of the product that will add value 1; [1 (yes), 2 (drafted), 3 (no)]
Value enhancement has been tested against a stratified representation of customer cohorts 1; [1 (yes), 2 (drafted), 3 (no)]
More Related Content

What's hot (20)

PPTX
00 safety_quality_construction
Adhitomo Wirawan
 
PPT
Episode 24 : Project Quality Management
SAJJAD KHUDHUR ABBAS
 
PDF
4 integration
Waseem Siddique
 
DOCX
Project on quality management
selinasimpson2901
 
PDF
Pmbok 5th planning process group part three
Hossam Maghrabi
 
PDF
A lean model based outlook on cost & quality optimization in software projects
Sonata Software
 
DOCX
Quality metrics in project management
selinasimpson1601
 
PPTX
PM FrameWork: Module 3
Mohammad Ashraf Khan, PMP
 
PDF
Project scope management 2
Mohammad Ashraf Khan, PMP
 
PPSX
Introduction to CMMI-DEV v1.3 - Day 4
Sherif Salah, MBA, ITIL, CMMI, MCSA, TQM
 
PPTX
Cmmi integrated work management level01
Elangovan Balusamy
 
DOCX
Quality management procedures
selinasimpson1201
 
PPT
Project Quality Management
asim78
 
PPT
Episode 23 : PROJECT TIME MANAGEMENT
SAJJAD KHUDHUR ABBAS
 
PDF
Monitor and Control Process Group - Part One
Hossam Maghrabi
 
DOCX
Plan_QualityManagement
Fahad Saleem
 
PPTX
Pre planning for large ERP/CRM initiative
sureshgk
 
PPSX
Introduction to CMMI-DEV v1.3 - Day 2
Sherif Salah, MBA, ITIL, CMMI, MCSA, TQM
 
PPTX
Factors affecting Quality of construction projects in India region
Ayush khandelwal
 
DOC
Quality assurance review check list
zeitgeistr2
 
00 safety_quality_construction
Adhitomo Wirawan
 
Episode 24 : Project Quality Management
SAJJAD KHUDHUR ABBAS
 
4 integration
Waseem Siddique
 
Project on quality management
selinasimpson2901
 
Pmbok 5th planning process group part three
Hossam Maghrabi
 
A lean model based outlook on cost & quality optimization in software projects
Sonata Software
 
Quality metrics in project management
selinasimpson1601
 
PM FrameWork: Module 3
Mohammad Ashraf Khan, PMP
 
Project scope management 2
Mohammad Ashraf Khan, PMP
 
Introduction to CMMI-DEV v1.3 - Day 4
Sherif Salah, MBA, ITIL, CMMI, MCSA, TQM
 
Cmmi integrated work management level01
Elangovan Balusamy
 
Quality management procedures
selinasimpson1201
 
Project Quality Management
asim78
 
Episode 23 : PROJECT TIME MANAGEMENT
SAJJAD KHUDHUR ABBAS
 
Monitor and Control Process Group - Part One
Hossam Maghrabi
 
Plan_QualityManagement
Fahad Saleem
 
Pre planning for large ERP/CRM initiative
sureshgk
 
Introduction to CMMI-DEV v1.3 - Day 2
Sherif Salah, MBA, ITIL, CMMI, MCSA, TQM
 
Factors affecting Quality of construction projects in India region
Ayush khandelwal
 
Quality assurance review check list
zeitgeistr2
 

Viewers also liked (20)

PDF
Peter Lik New Volcano images
ianhaight
 
PDF
Federal Injured Servicemember Programs
David Paschane, Ph.D.
 
PPS
Os dereitos lingüísticos na Universidade da Coruña
Servizo de Normalización Lingüística-Universidade da Coruña (SNL-UDC)
 
PPTX
Emberjs as a rails_developer
Sameera Gayan
 
PDF
Linkedin aanpassen
Waisheid | Wijs mat AI
 
PPTX
liofsocialemedia
Waisheid | Wijs mat AI
 
PPT
Candidateintro 2011
kensankson
 
PDF
Co-creatie 1/3; Waarom Open Innovatie En Co-creatie
Waisheid | Wijs mat AI
 
PDF
Better Government Presentation With Ivs December 21 2010 (Paschane)
David Paschane, Ph.D.
 
PDF
Peter Lik Nov08 New Release
ianhaight
 
PDF
Add slideshare 2 facebook fanpage
Waisheid | Wijs mat AI
 
PPS
As linguas minorizadas nas universidades das illas británicas
Servizo de Normalización Lingüística-Universidade da Coruña (SNL-UDC)
 
PPS
Emotional Intelligence
Beatrice Elizalde
 
PPT
299
Seaban
 
PPT
A D O P T I E
sofiedhondt
 
PDF
Veteran Patient Agency
David Paschane, Ph.D.
 
PDF
Health Outcome Infrastructure 1.2
David Paschane, Ph.D.
 
PPTX
Kontuuri loomine
Ljubov Fedotova
 
DOC
Match Foundation Business Executive Summary 12012008
David Paschane, Ph.D.
 
PPS
الضمائر
guest027a8a6
 
Peter Lik New Volcano images
ianhaight
 
Federal Injured Servicemember Programs
David Paschane, Ph.D.
 
Os dereitos lingüísticos na Universidade da Coruña
Servizo de Normalización Lingüística-Universidade da Coruña (SNL-UDC)
 
Emberjs as a rails_developer
Sameera Gayan
 
Linkedin aanpassen
Waisheid | Wijs mat AI
 
liofsocialemedia
Waisheid | Wijs mat AI
 
Candidateintro 2011
kensankson
 
Co-creatie 1/3; Waarom Open Innovatie En Co-creatie
Waisheid | Wijs mat AI
 
Better Government Presentation With Ivs December 21 2010 (Paschane)
David Paschane, Ph.D.
 
Peter Lik Nov08 New Release
ianhaight
 
Add slideshare 2 facebook fanpage
Waisheid | Wijs mat AI
 
As linguas minorizadas nas universidades das illas británicas
Servizo de Normalización Lingüística-Universidade da Coruña (SNL-UDC)
 
Emotional Intelligence
Beatrice Elizalde
 
299
Seaban
 
A D O P T I E
sofiedhondt
 
Veteran Patient Agency
David Paschane, Ph.D.
 
Health Outcome Infrastructure 1.2
David Paschane, Ph.D.
 
Kontuuri loomine
Ljubov Fedotova
 
Match Foundation Business Executive Summary 12012008
David Paschane, Ph.D.
 
الضمائر
guest027a8a6
 
Ad

Similar to Performance Methodology It Project Metrics Workbook (20)

PPTX
Software Project Management Chapter --- 4
prasannatejag25
 
PPT
Slides chapters 24-25
Priyanka Shetty
 
PPT
SE chapters 24-25
Hardik Patel
 
PPTX
Lecture 9 (02-06-2011)
love7love
 
PPT
Bsc how to fill initiatives templates-14 june10
Ajoy Jauhar
 
PPT
A metric expresses the degree to which a system, system component, or process...
preekrishiv
 
PDF
FINAL_SPM_document
Mudasser Akbar
 
DOCX
Measuring in
Davinder Singh
 
DOCX
Measuring in
Davinder Singh
 
PDF
Measuring the Results of your Agile Adoption
Software Guru
 
PDF
Successfully Integrating Agile and Earned Value
Glen Alleman
 
PDF
LRAFB_Project Profile
Dr. Robert L. Straitt
 
PPTX
PMP Exam Flashcards common definitions 7th edition original v2.0
Vinod Kumar, PMP®
 
PDF
Big Apple Scrum Day 2015 - Advanced Scrum Metrics Presentation
Jason Tice
 
PDF
Testing metrics
prats12345
 
PDF
Practical Software Development Metrics
Jari Kuusisto
 
PDF
'Metrics That Matter': Gabrielle Benefield @ Colombo Agile Con 2014
ColomboCampsCommunity
 
PPTX
Agile Metrics...That Matter
Erik Weber
 
DOC
Project management
ervinod
 
PPT
Presentation1
Preethi Subru
 
Software Project Management Chapter --- 4
prasannatejag25
 
Slides chapters 24-25
Priyanka Shetty
 
SE chapters 24-25
Hardik Patel
 
Lecture 9 (02-06-2011)
love7love
 
Bsc how to fill initiatives templates-14 june10
Ajoy Jauhar
 
A metric expresses the degree to which a system, system component, or process...
preekrishiv
 
FINAL_SPM_document
Mudasser Akbar
 
Measuring in
Davinder Singh
 
Measuring in
Davinder Singh
 

Performance Methodology IT Project Metrics Workbook

Performance Metrics Workbook for IT Projects
David Paschane, Ph.D.

Introduction

The VA is committed to fulfilling the President’s call for Executive departments to “use innovative tools, methods, and systems to cooperate among themselves, across all levels of Government, and with nonprofit organizations, businesses, and individuals in the private sector” (The White House, January 21, 2009). Furthermore, the VA is pursuing compliance with the OMB policy to create and institutionalize a culture of open Government, one where the “integration of various disciplines facilitates organization-wide and lasting change in the way the Government works,” and the use of best practices “take advantage of the expertise and insight of people both inside and outside the Federal Government, and form high-impact collaborations with researchers, the private sector, and civil society” (OMB M-10-06, December 8, 2009). The purpose of this workbook is to support thinking, consistent with Public Laws and Executive Orders, about the strategic, innovative, sustainable, and appropriate research-based collaboration that strengthens the VA’s capacity to improve performance with the assistance of IT-enhancing methodologies.

The objective is to apply the scientific method to understanding and optimizing work variability within the VA in order to positively affect the outcome variability among VA customers. The integration of methodology in large organizations is referred to as Corporate Performance Management (CPM), which aligns best practices of Business Process Management and Re-engineering (BPM/BPR), Business Analytics and Intelligence (BA/BI), Project Management and Knowledge Management (PM/KM), Total Quality Management and Lean Six Sigma (TQM/6Σ), and Geographic Information Science (GIS).
Sustainment of appropriate performance metrics, by organizational unit and work conditions, will require the ability to create repeatable, recursive analyses that diagnose performance issues and emphasize high-value trends in user improvement. The maturity of analytic performance metrics depends on assessments of needs and priorities that range from senior managers to project managers, and across domains. Several domains are likely candidates for performance metrics, depending on the progress of diagnosing performance trends and meeting awareness and learning requirements. In most cases, performance metrics are based on initiative-level information or control needs. As the performance architecture matures, performance metrics will also be developed to support the stabilization of the conditions affecting performance, the optimization of cross-functional processes supporting executive decision-making, and the contextualization of outcomes to determine where and how performance is affecting services to customers.

This workbook examines only the metrics at the project level, which is the point where consistency is most important for building a sustainable performance environment. These metrics can be used to design methodologies for controlling performance, but they need to be put into the larger context of how projects affect business and IT operations, and the culture of the workforce. Ultimately, these are all parts of a learning process that sustains an analytic performance environment.
Performance Metrics at the Project Level

1.0 Project Performance Indicators

Project performance indicators characterize the direct work of the project team in terms of the project’s schedule, use of resources, and production of a quality product.

1.1 Schedule Deviation

Schedule Deviation is an indicator of the timeliness of delivering a product in terms of the customer’s expectations. Schedule deviation metrics measure changes in the project schedule, in terms of cumulative time and percentage of schedule. These metrics are contextualized by other measurements, including historical events and the number of re-baselines. The schedule deviation metrics can correspond to each deliverable in a project. The metrics can also correspond to each major team, such as an employee team or a contracted team.

Proposed metrics and target values:
- Percent of days slippage from inception schedule: 0%
- Number of days slipped from inception schedule: 0
- Count of all re-baselines: 0
- Percent of days slippage in build schedule (after technical solution approval): 0%
- Number of days slipped in build schedule: 0
- Percent of days slippage since last re-baseline (approval within governance): 0%
- Number of days slipped since last re-baseline: 0
- Number of historical events that paused the project (report events by name): 0
- Estimated days historical events impacted the schedule: 0
- Count of missed milestone dates: 0
- Days overrun per product component: 0
- Days overrun per development milestone: 0
- Estimated days overrun on planned work packages until close-out: 0

1.2 Errors Saved

Errors Saved is an indicator of how well the project is able to sustain quality in the development process. Quality work is largely dependent on the ability of the team to manage defects and errors by preventing them, detecting them early, detecting them before they pass through a major development lifecycle process, or establishing quick and focused correction in order to achieve delivery milestones.
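The schedule deviation metrics above reduce to date arithmetic against the inception baseline. A minimal sketch (the function and example dates are illustrative, not from the workbook):

```python
from datetime import date

def schedule_deviation(baseline_start: date, baseline_end: date, actual_end: date):
    """Return (days_slipped, percent_slippage) against the inception schedule.

    Positive values mean delivery later than the baseline; the workbook's
    target for both metrics is zero.
    """
    planned_days = (baseline_end - baseline_start).days
    days_slipped = (actual_end - baseline_end).days
    percent_slippage = 100.0 * days_slipped / planned_days if planned_days else 0.0
    return days_slipped, percent_slippage

# Example: a 100-day schedule delivered 15 days late has slipped 15%.
slipped, pct = schedule_deviation(date(2010, 1, 1), date(2010, 4, 11), date(2010, 4, 26))
```

The same calculation can be repeated per deliverable or per team, and against the last approved re-baseline instead of the inception schedule, to feed the corresponding rows of the table.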
Classifications of defects and errors are necessary for identifying which ones require the most immediate attention within the development lifecycle. The following are the key classifications by which measures are reported:

1. Project Component (2 types)
   a. In the project’s components
   b. In another project’s components
2. Build Task (3 types)
   a. In building a function
   b. In technology refresh
   c. In defect correction
3. Build Process (4 types)
   a. In product integration
   b. In solution build
   c. In technological requirements
   d. In business requirements
4. Impact Severity (4 types)
   a. Critical (1) – causes system crashes or loss of data or functionality
   b. Major (2) – impairs system function
   c. Average (3) – requires fewer than three workarounds
   d. Minor (4) – minor cosmetic or documentation issues
5. Correction State (3 types)
   a. Previously identified (reopened)
   b. Newly identified, from a passed process (injected)
   c. Newly identified, in the current process (errors)
6. Release Priority (2 types)
   a. Required for the next release milestone
   b. Required for subsequent release milestones
7. Problem Characteristic (3 types)
   a. Inaccurate information or action
   b. Inconsistent information or action
   c. Incomplete information or action

Proposed metrics and target values:
- Defects or errors found before milestone review completed: 100%
- Defects or errors corrected before milestone review completed: 100%
- Days to correct defects: 1
- Count of defects or errors by testing environment: 0

1.3 Earned Value

Earned Value is an indicator of how efficiently a project is using work hours to produce portions of the product according to a planned schedule. A project is efficient if it expends budget (Actual Cost) to support work performed (Earned Value) consistent with the progress expected at any point in time (Planned Value). The purpose of these metrics is to characterize key trends based on standard definitions and estimates of work and product, and compliance with standard processes. A challenge in interpreting earned value trends is the impact of (1) work flow dependencies, (2) changes in human resources, (3) historical events that change the course of work, and (4) scope increases.
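The earned value relationships just described follow the standard definitions of Planned Value, Earned Value, and Actual Cost. A minimal sketch, using the workbook’s own Estimate at Completion definition (actual cost plus remaining budgeted work, assuming past results are not typical); variable names and figures are illustrative:

```python
def earned_value_indices(pv: float, ev: float, ac: float, bac: float) -> dict:
    """Standard earned value indices from Planned Value (PV), Earned Value (EV),
    Actual Cost (AC), and Budget at Completion (BAC)."""
    cpi = ev / ac            # Cost Performance Index; above 1 is under budget
    spi = ev / pv            # Schedule Performance Index; above 1 is ahead of plan
    cv = ev - ac             # Cost Variance: earned value minus actual cost
    # Estimate at Completion per the workbook's definition: actual cost plus
    # the estimate to complete, taken as the remaining budgeted work.
    eac = ac + (bac - ev)
    return {"CPI": cpi, "SPI": spi, "CV": cv, "EAC": eac}

# Example: $90k spent to earn $85k of value against $100k planned to date,
# on a $500k total budget.
m = earned_value_indices(pv=100_000, ev=85_000, ac=90_000, bac=500_000)
```

In practice these indices would be computed per subunit (employee teams versus contracted teams), as the next paragraph recommends, rather than only for the project as a whole.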
Another measurement consideration is to apply a different set of earned value measures to different project team subunits, especially when there is a mix of employees and contractors, or when subunits work on independent components. The measures of earned value should also account for the additional effort required for defect corrections and risk mitigations that were not anticipated in the project plan. An
effective set of earned value measures will take into account the requirement to weight value earned according to the goals of the organization.

Proposed metrics and target values:
- All product components and work packages have an estimated cost: 1; [1 (yes), 2 (drafted), 3 (no)]
- All product components and work packages have a standardized definition: 1; [1 (yes), 2 (drafted), 3 (no)]
- The schedule for work packages has an approved earned value weighting method that is consistent with performance goals: 1; [1 (yes), 2 (drafted), 3 (no)]
- Estimate at Completion (actual cost plus estimate to complete, assuming past results are not typical of expected results): <5% of plan / <10% / >10%
- Cost Performance Index (CPI) (identify project holds, per executive thresholds): >0.97 / >0.95 / <0.95
- Schedule Performance Index (SPI) (identify project holds, per executive thresholds): >0.97 / >0.95 / <0.95
- Cost Variance (earned value minus actual cost): 0
- Actual cost to planned cost: 100%

1.4 Project Condition Indicators

Project condition indicators characterize factors affecting the project in terms of the project’s staffing, scope, risks, and compliance with integration, design, and process requirements. These indicators are often able to predict the likelihood of changes in project performance indicators. While the project team monitors these activities, changes in these indicators are often associated with the work of other offices or managers.

1.5 Hiring Time

Hiring Time is an indicator used to assess the length of time associated with hiring employees. Hiring Time metrics characterize the ability to recruit and hire new employees to a team.
Proposed metrics and target values:
- Days between personnel description sign-off and actual hire date: <120 workdays
- Time between project-required hire date and actual hire date: 0 days
- All hires’ total lag time by presence on a critical path: 0 days
- All hires’ total lag time by position category: 0 days
- All hires’ total lag time by GS level: 0 days
- All hires’ total lag time by HR office: 0 days
- All hires’ total lag time by occupational series: 0 days
- Number of all hiring stages exceeding target workdays: x days

1.6 Contracting Time

Contracting Time is an indicator used to assess the impact of efforts to reduce the length of time required to complete contract awards before the contracts are required by the teams.
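Both the hiring and contracting lag metrics count workdays rather than calendar days. A minimal sketch of the two calculations (function names are illustrative, and federal holiday calendars are omitted for brevity):

```python
from datetime import date, timedelta

def workdays_between(start: date, end: date) -> int:
    """Count weekdays (Mon-Fri) after `start` up to and including `end`.

    A production version would also subtract federal holidays.
    """
    days = 0
    d = start
    while d < end:
        d += timedelta(days=1)
        if d.weekday() < 5:   # Monday=0 .. Friday=4
            days += 1
    return days

def award_lag(required: date, actual: date) -> int:
    """Calendar-day lag between the project-required date (hire or award)
    and the actual date; the workbook target is 0 days."""
    return max(0, (actual - required).days)

# Example: SOW signed off Tue 2010-06-01, awarded 2010-06-15 -> 10 workdays.
lag = workdays_between(date(2010, 6, 1), date(2010, 6, 15))
```

Totals per category (critical path, contract type, contracting office, and so on) are then simple sums of `award_lag` over the contracts or hires in that category.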
Proposed metrics and target values:
- Days between SOW sign-off and actual award date: <120 workdays
- Time between project-required award date and actual award date: 0 days each
- All contracts’ total lag time by presence on a critical path: 0 days each
- All contracts’ total lag time by contract type (level of effort on complexity): 0 days each
- All contracts’ total lag time by contract price category: 0 days each
- All contracts’ total lag time by contracting office: 0 days each
- All contracts’ total lag time by product category: 0 days each
- Number of all contracting stages exceeding target workdays: x days each

1.7 Administrative Time

Administrative Time is an indicator of trends in planned and unplanned administrative tasks that distract from scheduled project tasks. These metrics help identify the impact of time that project resources allocate to tasks not identified in project plans.

Proposed metrics and target values:
- Total unplanned non-manager staff time spent as administrative: 0%
- Total planned non-manager staff time spent as administrative: <10%
- Total unplanned manager time spent as administrative: 0%
- Average time required for conducting milestone reviews: <1 day
- Average manager time required for preparing and participating in milestone reviews: <4 hours
- Closed issues as a share of the total count of all milestone review issues: 100%
- Total work hours required for preparing nonstandard status reports: 0
- Total number of reports prepared by the team for senior managers per year, over two years: <25 a year

1.8 Team Readiness

Team Readiness is an indicator of trends in the training, alignment, and continuity of project teams. Team Readiness metrics help determine required training investments and potential training value.
Proposed metrics and target values:
- Total planned staff vacancy time: 0 days
- Slot time filled by staff under-qualified for the role: 0 days
- Staff formally prepared to achieve current team and organizational goals: 100%
- Staff formally prepared to comply with current team and organizational processes: 100%
- Planned project tasks without staffing: 0
- Planned task hours without staffing: 0
- Project task hours worked (in a given period) as a share of available staff hours (in the same period): 100%
- Staff certified as trained to the standard skill set of their respective roles: 100%
- Manager roles are formally authorized: 1; [1 (yes), 2 (drafted), 3 (no)]
- Total project manager vacancy time: 0 days
- Total unplanned hours of absence across the team: 0
- Total turnover rate of all staff: <12% / <20% / >20%
- Project manager turnover: 0
- Average leadership coaching sessions per year for key managers: >11

1.9 Risk Control

Risk Control is an indicator of trends in the ability to anticipate and control issues that may negatively affect the project. Risk Control metrics assess the ability to provide consistency in estimating risks. The metrics also indicate needs to improve the quality of the risk identification, analysis, and mitigation processes.

Proposed metrics and target values:
- Risk management plans approved: 1; [1 (yes), 2 (drafted), 3 (no)]
- Risk management meetings completed as planned: 100%
- Program funding matches budget at completion: 100% / >95% / <95%
- Operation funding matches year requirements: 100% / >95% / <95%
- Contractor burn rate (time and materials): <103% of planned
- Requirements and capability definitions identified, documented, and approved: 100% / >95% / <95%
- Risk management plans in use for all medium- and high-risk items: 1; [1 (yes), 2 (drafted), 3 (no)]
- Number of risk management plans executed late: 0
- Risks accepted, of those identified as within the tolerable range: 100%
- Risks avoided, of those identified as avoidable with appropriate actions: 100%
- Risks mitigated, of those identified as having impacts that can be reduced: 100%
- Number of unplanned risks that occurred: 0
- Estimated cost of unplanned risks that occurred: $0

1.10 Scope Control

Scope Control is an indicator of trends in changes to the scope of work in development and rework projects. Scope Control metrics assess the ability of the organization to reduce lifecycle costs, such as the cost of unplanned work.
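The Risk Control ratios described above ("risks accepted of those identified", "risks mitigated of those identified", and so on) are simple proportions over a risk register. A hypothetical sketch; the register fields and example entries are illustrative, not from the workbook:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    planned_response: str   # "accept", "avoid", or "mitigate"
    handled: bool           # whether the planned response was executed

def response_coverage(register: list, response: str) -> float:
    """Percent of risks with the given planned response that were handled;
    the workbook's target for each category is 100%."""
    planned = [r for r in register if r.planned_response == response]
    if not planned:
        return 100.0   # nothing planned in this category, so nothing missed
    handled = sum(1 for r in planned if r.handled)
    return 100.0 * handled / len(planned)

register = [
    Risk("data migration slip", "mitigate", True),
    Risk("vendor staff turnover", "mitigate", False),
    Risk("minor UI drift", "accept", True),
]
mitigated_pct = response_coverage(register, "mitigate")   # 50.0
```

Counting unplanned risks is the complement: any realized issue with no entry in the register before it occurred falls into the "number of unplanned risks that occurred" metric.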
Proposed metrics and target values:
- Count of all scope changes over the development life cycle: 0
- Count of post-requirements-approval scope changes: 0
- Count of changes within product versions: 0
- Estimated added cost of scope changes: $0
- Functional requirements change rate: <8% / <15% / >15%
- Interface requirements change rate: <8% / <15% / >15%

1.11 Integration Readiness
Integration Readiness is an indicator of trends in the ability to integrate products into the business environment, given the readiness of the project, key actors, and the efficiency of gate reviews.

Proposed metrics and target values:
- Enterprise Architecture Review (4) issues resolved: 100%
- Preliminary Concept Design Review issues resolved: 100%
- System Requirement Review issues resolved: 100%
- Initial Design Review issues resolved: 100%
- Preliminary Design Review issues resolved: 100%
- Critical Design Review issues resolved: 100%
- Test Readiness Review issues resolved: 100%
- Acceptance Testing Review issues resolved: 100%
- Deployment Readiness Review issues resolved: 100%
- Integrated Baseline Review issues resolved: 100%
- Program Management Review (4) issues resolved: 100%
- Planning, Architecture, Technology, and Services Board (4) issues resolved: 100%
- Total reviews completed: <18
- Total review issues unresolved: 0%
- Total critical review issues requiring resolution: 0
- Testing strategy completed, aligned, and approved: 1; [1 (yes), 2 (drafted), 3 (no)]
- Testing environment complete and available for product testing: 1; [1 (yes), 2 (drafted), 3 (no)]
- System dependencies on schedule with the product schedule: 1; [1 (yes), 2 (drafted), 3 (no)]
- Life cycle products are approved or in an appropriate update: 1; [1 (yes), 2 (drafted), 3 (no)]
- Release management standards completed: 100%

1.12 Reusability Compliance

Reusability Compliance is an indicator of trends in factors affecting reusability and adaptive configurations, according to development standards. Reusability Compliance metrics assess the ability to build and foster an enterprise infrastructure that creates agility and responsiveness while managing IT investments and business services.
Proposed metrics and target values:
- Percent of components with reusability-compliant design approved: 100%
- Associated services with unknown capacity requirements: 0%
- Associated services with a continuity plan: 100%
- Product reusability plan approved: 1; [1 (yes), 2 (drafted), 3 (no)]
- Product reusability standards met: 1; [1 (yes), 2 (drafted), 3 (no)]
- Percent of components certified reusability-compliant: 100%
- Components are platform-neutral: 100%
- Product performance optimization plan is built on simulated parameters derived from process modeling: 1; [1 (yes), 2 (drafted), 3 (no)]
- Required business data is identified, categorized, and normalized: 1; [1 (yes), 2 (drafted), 3 (no)]
- Product uses a web services model for requesting and responding to data use: 1; [1 (yes), 2 (drafted), 3 (no)]
- Business process and function definitions are integrated into the services registry: 1; [1 (yes), 2 (drafted), 3 (no)]
- Hosting environment is approved for ensuring serviceability through automated pre-integration: 1; [1 (yes), 2 (drafted), 3 (no)]

1.13 Organizational Maturity

Organizational Maturity is an indicator of trends in the maturity of the organization, in terms of adopting process standards. Organizational Maturity metrics assess the organization’s ability to manage work in accordance with approved process standards, and help demonstrate progression toward process maturity models.

Proposed metrics and target values:
- Percent of standard milestone reviews rated satisfactory: 100%
- The project has standard development processes and process goals documented and approved: 1; [1 (yes), 2 (drafted), 3 (no)]
- The project manager has documented the team’s process discipline improvement efforts and reported them to senior leadership quarterly: 1; [1 (yes), 2 (drafted), 3 (no)]
- Process compliance level (annual audit): 100%
- Average lead time to start planned reviews: 1 day
- Days in review before Defined Requirements approved: 1 day
- Days in review before Functional Design approved: 1 day
- Days in review before Technical Design approved: 1 day
- Days in review before Configuration and Code Testing approved: 1 day
- Days in review before Test Script and Testing Plan approved: 1 day
- Days in review before Post-Testing Deployment approved: 1 day
- Days to deploy to the production environment after Post-Testing Deployment approval: 1 day

2.0 Portfolio Indicators

Portfolio indicators characterize how combined projects share a common performance domain. These are still project measures, but they are handled in terms of organizational goals.
2.1 Organizational Interoperability

Organizational Interoperability is an indicator of factors affecting the interoperability of developments, in terms of data sharing and application coordination with other agencies. These metrics assess the ability to deploy systems that are prepared to integrate with external stakeholder agencies.

Proposed metrics and target values:
- Percent of required organization-interoperable components in the project: if 0%, these metrics do not apply to the project
- System interoperability review completed: 1; [1 (yes), 2 (drafted), 3 (no)]
- Interoperable systems requirements completed: 100%
- Product interoperability plan approved: 1; [1 (yes), 2 (drafted), 3 (no)]
- Product interoperability standards met: 1; [1 (yes), 2 (drafted), 3 (no)]
- Percent of components certified interoperable: 100%
- Product performance optimization plan is built on simulated parameters derived from process modeling across participating agencies: 1; [1 (yes), 2 (drafted), 3 (no)]
- Required business data is identified, categorized, and normalized across participating agencies: 1; [1 (yes), 2 (drafted), 3 (no)]
- Hosting environments are approved for ensuring interoperability across participating agencies: 1; [1 (yes), 2 (drafted), 3 (no)]

2.2 Online Customer Engagement

Online Customer Engagement (OCE) is an industry-standard indicator created to help assess the utility, appeal, and customer benefit of web-enabled applications, and their impact on the customer and other resources. OCE metrics help assess the organization’s ability to deliver functionality that increases stakeholder satisfaction, and the ability to deliver applications flexible enough to integrate future changes. The OCE metrics also help determine a design protocol for customer-facing design plans that integrate customers into the management of their services and benefits.

Metrics for OCE characterize the degree, depth, and impact of user (customer) involvement. This extends not only to individual usage, but also to behavior change, personally realized effect, community interaction, user perception, and peer influence. Ultimately, OCE reflects the quality of the content, aesthetics, functionality, serviceability, infrastructure, alignment to user needs, facilitation of communication and decision making, orientation, marketing, and promotion of organizational websites.
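Several of the OCE metrics are period-over-period growth rates checked against a threshold band. A minimal sketch; the band labels are an assumed reading of the workbook's three-valued targets (e.g. >15% / >10% / <10%), and the figures are illustrative:

```python
def percent_growth(previous: float, current: float) -> float:
    """Period-over-period growth, e.g. in unique online transactions."""
    if previous <= 0:
        raise ValueError("previous period must be positive")
    return 100.0 * (current - previous) / previous

def grade(growth: float, good: float, fair: float) -> str:
    """Map a growth rate onto an assumed good/fair/poor reading of a
    workbook-style target band such as >15% / >10% / <10%."""
    if growth > good:
        return "good"
    if growth > fair:
        return "fair"
    return "poor"

# Example: unique transactions rising from 80,000 to 94,000 is 17.5% growth,
# which clears the >15% band for that metric.
status = grade(percent_growth(80_000, 94_000), good=15.0, fair=10.0)
```

Ratio metrics such as "unique visitors by application vs. total call center volume" work the same way, with the ratio substituted for the growth rate before grading.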
As these factors improve, websites will play a greater role in engaging users; consequently, (a) transactions will become more efficient in terms of time and cost to the user; (b) user knowledge, skills, and abilities will increase; and (c) operational costs from other resources (e.g., call centers) should decrease, thereby allowing resources to be reallocated to better serve customers. Associated OCE studies can help identify evidence of how the organization enables customers’ self-empowerment, prepares itself for adopting customer-facing IT advancements, and ensures the most effective customer-facing products.

Proposed metrics and target values:
- Percent of required customer-facing products in the project: if 0%, these metrics do not apply to the project
- Percent growth in unique online transactions: >15% / >10% / <10%
- Unique visitors by application vs. total call center volume: >20% / >15% / <15%
- Registration penetration, by application: >15% / >10% / <10%
- Application operational costs vs. call center operational costs: >30% / >20% / <20%
- Online community participation increase: >20% / >15% / <15%
- Average customer satisfaction rankings increase: >10% / >5% / <5%
- Average customer usability rankings increase: >15% / >13% / <13%
- Number of customer complaints reduced (via email and call center): >20% / >10% / <10%
- Number of website content referrals increase: >15% / >5% / <5%
- Annual resource redirection based on net changes in operational costs: >15% / >10% / <10%

2.3 Value Enhancement

Value Enhancement is an indicator of how well the project will achieve the goals of the proposed initiative and support alignment between the development portfolio and the goals of the organization and its stakeholders. These are not measures of the total benefit of the project, but a project-level view of the value plan.

Proposed metrics and target values:
- An independent review of the project’s value enhancement has been conducted: 1; [1 (yes), 2 (drafted), 3 (no)]
- Timing of value enhancement remains achievable within the project schedule: 1; [1 (yes), 2 (drafted), 3 (no)]
- Number of incomplete project dependencies that will affect value enhancement estimates: 0
- The strategic plan for value enhancement identifies future versions of the product that will add value: 1; [1 (yes), 2 (drafted), 3 (no)]
- Value enhancement has been tested against a stratified representation of customer cohorts: 1; [1 (yes), 2 (drafted), 3 (no)]