#atlassian
Understanding Metrics: 
What to Measure and Why 
JOHN CUSTY • MANAGING CONSULTANT • JPC GROUP • @ITSMNINJA
About John Custy 
Service Management Practitioner, Consultant and Educator 
• Distinguished Professional in IT Service Management 
• ITIL Expert & ITIL Service Manager 
• ITIL Intermediate – SS, SD, ST, SO, CSI, OSA, SOA, PPO, RCV 
• KT Certified Instructor 
• ITIL Accredited Trainer 
• KCS Verified Consultant 
• ISO/IEC 20000 Consultant 
• ISFS, ISMAS based on ISO/IEC 27002 
• HDI Faculty & Certified Instructor
A story about 
metrics…
Source: Joëlle Flumet’s Broken Bed Installation
Understanding Metrics – What to Measure, and Why - John Custy
Source: betweennapsontheporch.net
The customer rating is 
5/5, but the customer 
WON’T return!
What are you measuring? 
Why are you measuring it?
What to Measure and Why 
PURPOSE OF METRICS 
TYPES OF METRICS 
COMMON SUPPORT METRICS 
METHODS OF REPORTING 
QUESTIONS
PURPOSE OF METRICS 
How to use metrics 
• Inform your stakeholders 
• Report measurements so that stakeholders can understand activities and results 
• Promote the value of the organization 
• Determine the best way to communicate the information to the stakeholders 
• Perform better stakeholder analysis to facilitate stakeholder buy-in 
• Improve performance - people do what is measured
PURPOSE OF METRICS 
What are we trying to accomplish?
ENSURE ALIGNMENT 
• Account for IT Processes and Deliverables 
• Inform stakeholders 
• Understand IT Performance 

COMPLIANCE 
• Achieve certifications: ISO/IEC 20000, COBIT 
• Measure progress to goals/objectives 

OPERATIONAL EXCELLENCE 
• Measure IT Performance 
• Control IT Processes 
• Maximize IT Productivity (people) 
• Report Costs 
• Demonstrate value of IT Organization
The Relationship of 
Metrics to Goals 
VISION 
MISSION 
GOALS 
OBJECTIVES 
CSF 
KPI 
METRICS 
MEASUREMENT
PURPOSE OF METRICS 
Sharing Accomplishments 
What should you report? 
Key performance indicators 
Critical success factors 
Variances to baseline 
Progress towards targets 
Annotate milestones and abnormalities 
Service improvement projects
Understanding the 
Different Types of Metrics
DIFFERENT TYPES OF METRICS 
Metrics & Characteristics 

METRICS 
• Performance indicators (PI) 
• Key performance indicators (KPI) 
• Key results indicators (KRI) 

CHARACTERISTICS 
• Efficiency vs. effectiveness 
• Leading vs. lagging
DIFFERENT TYPES OF METRICS 
Efficiency vs. Effectiveness 

EFFICIENCY 
• How fast? 
• How many? 
• Transactional Cost 
• Incident/Request/Access Management 
• Departmental Goals 

EFFECTIVENESS 
• Accuracy 
• Customer Satisfaction 
• Total Organizational Cost 
• Problem Management 
• Enterprise Objectives
DIFFERENT TYPES OF METRICS 
Quantitative vs. Qualitative 

QUANTITATIVE 
• How much or how many 
• Ex. The number of times customers contact the service desk 

QUALITATIVE 
• How well something or someone is performing 
• Ex. Customer Satisfaction, Employee Satisfaction, stock price
Service Metrics
• Time to process an order 
• Time to check inventory item 
• Time to send/receive an e-mail 
• Time to … 
@ITSMNinja 
End-to-End Performance
• Cost Per Transaction 
• Cost Per User 
End-to-End Performance
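As a minimal sketch of how these per-unit cost figures fall out of totals — the dollar amounts, counts, and function names below are illustrative assumptions, not figures from this deck:

```python
# Hypothetical helpers for the two end-to-end cost metrics above.
# All numbers are made-up examples; substitute your own totals.

def cost_per_contact(total_support_cost: float, contacts: int) -> float:
    # Fully loaded support cost spread across contact volume.
    return total_support_cost / contacts

def cost_per_user(total_support_cost: float, supported_users: int) -> float:
    # The same cost spread across the supported user base.
    return total_support_cost / supported_users

# e.g. $150,000/month, 5,000 contacts, 2,000 supported users
print(cost_per_contact(150_000, 5_000))  # 30.0
print(cost_per_user(150_000, 2_000))     # 75.0
```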
Uptime – compared to … 
• Downtime per Service 
• Frequency and total amount of time 
• Number of incidents (type/category) 
• Number of recurring incidents 
• Time per incident 
Service Availability
• Problems identified per Service 
• # incidents per problem 
• Lost time per problem 
• Changes Per Service 
• # successful changes (time, budget) 
• Lost time due to changes – incidents and requests 
• # Service Requests due to changes 
• # problems due to changes – IT and business lost time 
• % improvements due to changes 
Service Availability
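To make the uptime-vs-downtime comparison concrete, here is one common way availability is computed from recorded downtime — the 30-day 24x7 service window and 90 minutes of downtime are illustrative assumptions:

```python
# Minimal availability sketch: uptime as a share of the agreed
# service window, expressed as a percentage.

def availability_pct(window_minutes: int, downtime_minutes: int) -> float:
    return 100.0 * (window_minutes - downtime_minutes) / window_minutes

# 30 days of 24x7 service = 43,200 minutes; 90 minutes of downtime
print(round(availability_pct(43_200, 90), 3))  # 99.792
```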
BALANCE 
NEEDED 
Operational metrics allow 
you to understand 
where to improve 
Service metrics report on 
the overall performance 
of the service
What Type of Metrics Are Reported? 
IT INTERNAL METRICS 
SENIOR MANAGEMENT 
SERVICE MANAGEMENT 
BUSINESS UNIT METRICS 
REGULATORY/ 
COMPLIANCE
Four Types of Process Metrics 
PROGRESS – in process maturity 
EFFICIENCY – use of resources 
EFFECTIVENESS – correct and complete the first time 
COMPLIANCE – to process and regulatory requirements
Identifying 
Common Service 
Desk Metrics
Typical Operational Measurements 
• Response 
• % connected immediately (Real-Time) 
• Abandon Rate 
• Wait (hold/queue) Time 
• Average Speed to Answer (ASA) 
• Response Time service level XX% in YY seconds 
• Call-Back Time 
• Desktop (PC)
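A sketch of how ASA, abandon rate, and an "XX% in YY seconds" response service level are typically derived from a call log. The records, field names, and 30-second threshold here are illustrative assumptions:

```python
# Hypothetical call log: (wait_seconds, abandoned) pairs.
calls = [(5, False), (12, False), (45, True), (8, False), (30, False), (70, True)]

answered = [wait for wait, abandoned in calls if not abandoned]

# Average Speed to Answer (ASA): mean wait of answered calls
asa = sum(answered) / len(answered)

# Abandon rate: share of all calls where the caller gave up
abandon_rate = sum(1 for _, abandoned in calls if abandoned) / len(calls)

# Response-time service level: share of calls answered within 30 seconds
service_level = sum(1 for wait, ab in calls if not ab and wait <= 30) / len(calls)
```

Note that the service level is computed against all offered calls, so abandons count against the target; some teams exclude short abandons, which is a policy choice rather than a formula.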
Typical Operational Measurements 
• Resolution 
• Resolved First Contact 
• Resolved X hours, Y hours, Z hours 
• Cases re-opened, Repeats 
• Requests resolved without assistance (self-help) 
• Calls/Cases avoided due to self-help
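The resolution measures above can be sketched as simple rates over case records — the fields and the 8-hour band below are illustrative assumptions, not the deck's definitions:

```python
# Hypothetical resolution records.
cases = [
    {"first_contact": True,  "hours_to_resolve": 0.2,  "reopened": False},
    {"first_contact": False, "hours_to_resolve": 6.0,  "reopened": True},
    {"first_contact": True,  "hours_to_resolve": 0.1,  "reopened": False},
    {"first_contact": False, "hours_to_resolve": 30.0, "reopened": False},
]

n = len(cases)
fcr_rate = sum(c["first_contact"] for c in cases) / n   # resolved on first contact
reopen_rate = sum(c["reopened"] for c in cases) / n     # cases re-opened
within_8h = sum(c["hours_to_resolve"] <= 8 for c in cases) / n

print(fcr_rate, reopen_rate, within_8h)  # 0.5 0.25 0.75
```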
Typical Operational Measurements 
• Cost 
• Call/Contact 
• Customer/User 
• Total Cost (TCO) of Support/Service
Typical Operational Measurements 
• Service Desk 
• Volumes, trends 
• Performance to goals 
• Incident Management 
• Volumes, trends, repeat incidents 
• Reduction in restoration time 
• Performance to goals 
• Customer Satisfaction
Typical Operational Measurements 
• Request Fulfillment 
• Volumes, trends 
• Time to complete requests 
• Performance to goals 
• Customer satisfaction
Typical Operational Measurements 
• Customer Satisfaction 
• Employee Satisfaction
Typical Operational Measurements 
• Knowledge Base Usage 
• Accesses/Searches per contact 
• # solutions per search 
• # solutions searched/opened/viewed 
• Time spent reviewing solutions 
• Ease of finding solutions 
• Quality of solutions (ability to use solutions)
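One way to aggregate the knowledge-base measures above from a usage log — the session structure, counts, and the reuse-as-quality proxy are illustrative assumptions:

```python
# Hypothetical knowledge-base usage log over one reporting period.
sessions = [
    {"searches": 2, "solutions_viewed": 3, "solution_used": True},
    {"searches": 1, "solutions_viewed": 1, "solution_used": True},
    {"searches": 4, "solutions_viewed": 2, "solution_used": False},
]
contacts = 3  # contacts handled over the same period

total_searches = sum(s["searches"] for s in sessions)
searches_per_contact = total_searches / contacts
solutions_per_search = sum(s["solutions_viewed"] for s in sessions) / total_searches
# Share of sessions where a found solution was actually usable
solution_quality = sum(s["solution_used"] for s in sessions) / len(sessions)
```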
Typical Operational Measurements 
• Service Asset and Configuration Management 
• Errors in CMDB 
• Resources improvement utilizing CMDB 
• Change Management 
• Number of incidents/requests due to the change 
• Additional (reduction) workload due to changes 
• Release Management 
• Number of incidents/requests due to the release 
• Additional (reduction) in workload due to releases
SERVICE DESK METRICS 

WORKLOAD 
• Volumes 
• Calls/Cases per customer per month 
• Number of registered users / total number of users 
• Time spent contacting users 
• Time spent on change-related incidents/requests 

INDIVIDUALS 
• Number of calls taken 
• Average Handle Time (AHT) 
• Availability 
• Occupancy 
• Number of incidents/requests closed on first contact 
• Customer Satisfaction 
• Contribution to Knowledge base 

CUSTOMERS 
• Customer Satisfaction 
• Frequency of surveying, number not responding 
• Volumes 
• Calls/Cases 

RESPONSE 
• Average Speed to Answer (ASA) 
• % calls answered live vs. queued 
• Call-back time 
• Abandon Rate (ABA) 
• Responses within service level & outside service level
INCIDENT, REQUEST AND ACCESS MANAGEMENT 

RESOLUTION 
• Incident closure (from time of submission) 
• Mean Time for Service Restoration (MTSR) for Levels 1, 2, & 3 
• Incidents matched (KE) 
• Incidents re-opened 
• Closed first contact 
• Escalations for resolution 
• Remote tool utilization 
• Desk-side visits 
• Incidents closed via self-help 

VOLUME 
• Total number of incidents/requests (by priority & category) 
• Security-related incidents 

RESPONSE TIME 
• Service Desk performance 
• Level 2/3 – same as SD metrics 

ESCALATION 
• Time to escalate 
• % escalated to correct group 
• Technical & hierarchical
INCIDENT, REQUEST AND ACCESS MANAGEMENT 

CUSTOMER SATISFACTION 
• Incident, Request & Access Management processes 

SELF-SERVICE 
• Number of unique users 
• Average time per user 
• # pages viewed
Methods to Report 
Performance 
PERFORMANCE REPORTS 
BALANCED SCORECARD 
SUPPORT SCORECARD
Factors to Consider When Reporting 
• Who are the stakeholders? 
• How does what you are reporting impact the stakeholders? 
• Reports must be easy to read and understand, so they need to be developed with 
the stakeholder in mind. 
• Reports need to show how the support center is contributing to the goals of each 
stakeholder and the business. 
• Identify the appropriate channels for communicating with each of the 
stakeholders.
Daily Report Examples
Trending Report 
Examples
Activity Report Examples
Present your 
achievements with 
scorecards
Scorecards Drive Transformation 
(2×2 chart: relevance to IT, high vs. low, against general vs. specific — positioning ISO/IEC 17799, ITIL/ISO/IEC 20000, COBIT, Six Sigma, ISO 9000, the Malcolm Baldrige Award, and scorecards along a range from process improvement to business transformation) 
Standards help us to ensure that IT is aligned to meet business objectives
• Simple indicator 
• Reference base 
• Measures the key issues 
• Reports on progress to goals 
Support Scorecard
Scorecard Criteria: Operational Performance 
Overall performance 
Response time 
Resolution time 
Closed first call 
Abandon time 
Wait time 
Status time 
Backlog aging
Scorecard Criteria: Operational Performance 

SERVICE SCORECARD 
The Acme Support Center Scorecard provides a weekly report of performance on our Service Level commitments, goal vs. actual by key service area. 

Response Time 
Front-line measurements: 
• Call Pick-Up Time - all incoming calls are answered by a support consultant (goal: 80%) 
• Call Waiting Time - average is less than 3 minutes (goal: 3 minutes) 
Back-line measurements: 
• Non-Accepted Call-Back Time - all customers not responded to on the initial call by a support consultant will be called back within 30 minutes (goal: 90%) 

Resolve Time 
• Resolved on First Contact - 30% resolved first call (4-month goal is 50%) 
• Resolved Same Day - 40% resolved within 1 business day (4-month goal is 60%) 
• Resolved Same Week - 85% resolved within 5 business days 

Status (goal: 80% for each) 
• Priority 1 Issues - customer provided status every 4 hours until resolved 
• Priority 2 Issues - customer provided status every 24 hours until resolved or workaround provided 
• Call Aging - manage backlog so that no more than 20% is over 2 weeks and 5% over 30 days 

Backlog (average age of open items): goal 3 days, actual 10 days 
Event Survey: overall satisfaction rating on a 1-5 scale: 4.1
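A backlog-aging target like the one above ("no more than 20% over 2 weeks and 5% over 30 days") can be checked directly from open-item dates. The dates and function name below are illustrative assumptions:

```python
from datetime import date, timedelta

def backlog_aging(opened_dates, today):
    # Age of each open item in days, then the aggregate aging measures.
    ages = [(today - opened).days for opened in opened_dates]
    n = len(ages)
    return {
        "avg_age_days": sum(ages) / n,
        "pct_over_2_weeks": sum(a > 14 for a in ages) / n,
        "pct_over_30_days": sum(a > 30 for a in ages) / n,
    }

today = date(2024, 6, 30)
open_items = [today - timedelta(days=d) for d in (1, 3, 10, 20, 40)]
print(backlog_aging(open_items, today))
# {'avg_age_days': 14.8, 'pct_over_2_weeks': 0.4, 'pct_over_30_days': 0.2}
```

Against the Acme targets, this hypothetical backlog would fail both thresholds (40% over 2 weeks, 20% over 30 days).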
Service Desk Scorecard 
Overall performance 
Response time 
Resolution time 
Closed first call 
Abandon time 
Wait time 
Status time 
Backlog aging 
Customer sat/quality 
Overall satisfaction 
Response satisfaction 
Resolution satisfaction 
Status satisfaction 
Improvement goals 
Alignment to business
Acme Service Desk Scorecard
Balanced Scorecard 
Business Goals 
Financial Perspective 
Provide a good return on investment on IT-enabled business investments 
Manage IT-related business risks 
Improve corporate governance and transparency 
Customer Perspective 
Improve customer orientation and service 
Offer competitive products and services 
Establish service continuity and availability 
Create agility in responding to changing business requirements 
Achieve the cost optimization of service delivery 
Obtain reliable and useful information for strategic decision making 
Internal Perspective 
Improve and maintain business process functionality 
Lower process costs 
Provide compliance with external laws, regulations and contracts 
Provide compliance with internal policies 
Manage business change 
Improve and maintain operational and staff productivity 
Learning and Growth Perspective 
Manage product and business innovation 
Acquire and maintain skilled and motivated personnel
IT Balanced Scorecard 
Source: ITIL Continual Service Improvement
You cannot manage what you cannot 
CONTROL 
You cannot control what you cannot 
MEASURE 
You cannot measure what you cannot 
DEFINE
What do you need to measure? 
What should you do with the metrics you produce?
Thank you! 
Join me tomorrow @2PM to learn 
Knowledge Centered Support (KCS) – The Methodology that Really Works 
John Custy • Managing Consultant • JPC Group • @ITSMNinja

More Related Content

PDF
Knowledge-Centered Support at Atlassian - Neil Kenagy
Atlassian
 
PDF
Knowledge-Centered Support – The Methodology That Really Works - John Custy
Atlassian
 
PDF
Integrating Confluence and JIRA Service Desk for Knowledge Management - Anna ...
Atlassian
 
PPT
Delivering on the KCS promise and empowering people by tracking the evolution...
KM Chicago
 
PDF
Deflect Tickets and Stop Interruptions For Your IT Teams - Michael Knight
Atlassian
 
PPTX
Kcs overview for detroit 2010
IT Service and Support
 
PDF
K2 users group, portland intro and Project
Andy Hopkins
 
PPTX
Life of data from generation to visualization using big data
Blazeclan Technologies Private Limited
 
Knowledge-Centered Support at Atlassian - Neil Kenagy
Atlassian
 
Knowledge-Centered Support – The Methodology That Really Works - John Custy
Atlassian
 
Integrating Confluence and JIRA Service Desk for Knowledge Management - Anna ...
Atlassian
 
Delivering on the KCS promise and empowering people by tracking the evolution...
KM Chicago
 
Deflect Tickets and Stop Interruptions For Your IT Teams - Michael Knight
Atlassian
 
Kcs overview for detroit 2010
IT Service and Support
 
K2 users group, portland intro and Project
Andy Hopkins
 
Life of data from generation to visualization using big data
Blazeclan Technologies Private Limited
 

What's hot (20)

PDF
Nintex Update
EileenTan67
 
PDF
CCD2014 - Tony Atkins, Atlassian
Communardo GmbH
 
PDF
How to select the right database to empower your fundraising
Blackbaud Pacific
 
PDF
What’s hot in the world of atlassian
ACA IT-Solutions
 
PDF
Rock Solid Projects With Atlassian Dev Tools
ACA IT-Solutions
 
PDF
ITAM AUS 2017 The Long Way Round: from chaos to cohesion
Martin Thompson
 
PPTX
Real World Techniques for Enterprise Agile Adoption
Scott Richardson
 
PDF
Cirrus Insight + Spanning: 6 Ways to Cover Your SaaS
Cirrus Insight
 
PPTX
2014.07 Exec User Group - Atlassian - Sydney
ServiceRocket
 
PDF
Agile + Business = Analysis
Stephanie Lewandowski
 
PPTX
A Walk Around SQL Server Data Tools | SQL Saturday#392 by James McAuliffe
CCG
 
PDF
A leaders path to practical service management
David Mainville
 
PDF
Sage 2015 roadmap – next release and beyond
SociusPartner
 
PPTX
Group hug - Implementing Agile Across Multiple Teams
Richard Cheng
 
PPTX
AgileDC 2014: Achieving Enduring Agile Success in Large Organizations
Scott Richardson
 
PDF
Agile Analysis
Stephanie Lewandowski
 
PDF
New perspectives driving strategic initiatives
SociusPartner
 
PPTX
AgileDC 2016 - Transform the Corporate Ecosystem for Enterprise Agility
Scott Richardson
 
PDF
Integration + Automation: How Catholic Church Insurance is Streamlining its D...
EileenTan67
 
PDF
Bringing HR and payroll inhouse – the new roll of HRMS
SociusPartner
 
Nintex Update
EileenTan67
 
CCD2014 - Tony Atkins, Atlassian
Communardo GmbH
 
How to select the right database to empower your fundraising
Blackbaud Pacific
 
What’s hot in the world of atlassian
ACA IT-Solutions
 
Rock Solid Projects With Atlassian Dev Tools
ACA IT-Solutions
 
ITAM AUS 2017 The Long Way Round: from chaos to cohesion
Martin Thompson
 
Real World Techniques for Enterprise Agile Adoption
Scott Richardson
 
Cirrus Insight + Spanning: 6 Ways to Cover Your SaaS
Cirrus Insight
 
2014.07 Exec User Group - Atlassian - Sydney
ServiceRocket
 
Agile + Business = Analysis
Stephanie Lewandowski
 
A Walk Around SQL Server Data Tools | SQL Saturday#392 by James McAuliffe
CCG
 
A leaders path to practical service management
David Mainville
 
Sage 2015 roadmap – next release and beyond
SociusPartner
 
Group hug - Implementing Agile Across Multiple Teams
Richard Cheng
 
AgileDC 2014: Achieving Enduring Agile Success in Large Organizations
Scott Richardson
 
Agile Analysis
Stephanie Lewandowski
 
New perspectives driving strategic initiatives
SociusPartner
 
AgileDC 2016 - Transform the Corporate Ecosystem for Enterprise Agility
Scott Richardson
 
Integration + Automation: How Catholic Church Insurance is Streamlining its D...
EileenTan67
 
Bringing HR and payroll inhouse – the new roll of HRMS
SociusPartner
 
Ad

Viewers also liked (20)

PDF
How to Implement SLAs and Metrics in JIRA Service Desk - Lucas Dussurget
Atlassian
 
PDF
How to Graduate From Email Support - Tom Moors
Atlassian
 
PDF
Going Right! Software Delivery with Atlassian Solution
智治 長沢
 
PDF
Atlas Desk Team – A Year With JIRA Service Desk - Dan Horsfall and Nikki Nguyen
Atlassian
 
KEY
Velocity is not the Goal
Doc Norton
 
PPTX
IT Metrics Presentation
jmcarden
 
PPTX
IT governance and bal
sourov_das
 
PDF
1 workshop itil
Jones Ribeiro
 
PPT
Feb2007 Kelly Services Hdi Chapter Meeting 020807 Public Domain
IT Service and Support
 
POTX
Addicted to Busy
Rachel Leonhart
 
PPTX
Gathering Meaningful Statistics to measure excellence using Knowall Enquire
Laura Connaughton BA MLIS ALAI
 
PPTX
Gamifying Your Service Desk
Freshservice
 
PPTX
Metrics to Maturity, Intelligence for Innovation: Your Value Proposition
Cherwell Software
 
PPTX
PR Measurement Clinic: Assessing the Success of a Communications Strategy
Sandra Fathi
 
PPTX
First Contact Resolution - The Performance Driver!
HDI Orange County
 
PDF
Harvard Business Review - The New Conversation Taking Social Media from Talk ...
Alex Gonçalves
 
PDF
10 Ways To Work Effectively as a Distributed Team - Nick Pellow
Atlassian
 
POTX
Targets That Work (for the Service Desk), Susan Storey
Service Desk Institute
 
PPTX
Developing Metrics for Better ITSM
Ahmed Al-Hadidi
 
PPTX
Agile Reporting in JIRA
Cprime
 
How to Implement SLAs and Metrics in JIRA Service Desk - Lucas Dussurget
Atlassian
 
How to Graduate From Email Support - Tom Moors
Atlassian
 
Going Right! Software Delivery with Atlassian Solution
智治 長沢
 
Atlas Desk Team – A Year With JIRA Service Desk - Dan Horsfall and Nikki Nguyen
Atlassian
 
Velocity is not the Goal
Doc Norton
 
IT Metrics Presentation
jmcarden
 
IT governance and bal
sourov_das
 
1 workshop itil
Jones Ribeiro
 
Feb2007 Kelly Services Hdi Chapter Meeting 020807 Public Domain
IT Service and Support
 
Addicted to Busy
Rachel Leonhart
 
Gathering Meaningful Statistics to measure excellence using Knowall Enquire
Laura Connaughton BA MLIS ALAI
 
Gamifying Your Service Desk
Freshservice
 
Metrics to Maturity, Intelligence for Innovation: Your Value Proposition
Cherwell Software
 
PR Measurement Clinic: Assessing the Success of a Communications Strategy
Sandra Fathi
 
First Contact Resolution - The Performance Driver!
HDI Orange County
 
Harvard Business Review - The New Conversation Taking Social Media from Talk ...
Alex Gonçalves
 
10 Ways To Work Effectively as a Distributed Team - Nick Pellow
Atlassian
 
Targets That Work (for the Service Desk), Susan Storey
Service Desk Institute
 
Developing Metrics for Better ITSM
Ahmed Al-Hadidi
 
Agile Reporting in JIRA
Cprime
 
Ad

Similar to Understanding Metrics – What to Measure, and Why - John Custy (20)

PDF
Past and present | 25 years of Service Desk KPIs
MetricNet
 
PDF
Unleashing the Enormous Power of Service Desk KPIs
MetricNet
 
PPTX
it_Define_Service_Desk_Metrics_That_Matter_Storyboard.pptx
Abdulelah Aljabri
 
PDF
HDI Capital Area Meeting April 2016
hdicapitalarea
 
PDF
Rumburg seven kp_is
Baskaran Bendaiya
 
PPTX
IT Metric Portal (ITIL Final Assignment)
Abdullah Ahmet Aslan
 
PDF
Free Desktop Support Training Series | What You Need to Know About Desktop Su...
MetricNet
 
PDF
Free Service Desk Training Series | MetricNet's Service Desk Best Practices
MetricNet
 
PDF
The 80/20 Rule for Desktop KPI's: Less is More!
MetricNet
 
PDF
Dit yvol5iss4
Rick Lemieux
 
PPTX
Customer Service Analytics - Make Sense of All Your Data.pptx
Emmanuel Dauda
 
PPT
Service industry metrics
Dan Wilson
 
PDF
Free Desktop Support Training Series | The Zen of Support - The Path to Strat...
MetricNet
 
PDF
Metric Nets Seven Most Important Kp Is For The Service Desk V4
MMehterian
 
PPT
BMC BSM - Automate Service Management System
Vyom Labs
 
PDF
Cause and-effect
Karthik Arumugham
 
PDF
Metrics that Matter: Focusing on key metrics for an efficient service desk an...
Freshservice
 
PDF
The role of it leadership in service and support v2
MetricNet
 
PPTX
Slideshare_Metrics_Presentation_
Thomas Nanomantube
 
PPTX
Measuring the Success of Cloud-Based Services
Vistara
 
Past and present | 25 years of Service Desk KPIs
MetricNet
 
Unleashing the Enormous Power of Service Desk KPIs
MetricNet
 
it_Define_Service_Desk_Metrics_That_Matter_Storyboard.pptx
Abdulelah Aljabri
 
HDI Capital Area Meeting April 2016
hdicapitalarea
 
Rumburg seven kp_is
Baskaran Bendaiya
 
IT Metric Portal (ITIL Final Assignment)
Abdullah Ahmet Aslan
 
Free Desktop Support Training Series | What You Need to Know About Desktop Su...
MetricNet
 
Free Service Desk Training Series | MetricNet's Service Desk Best Practices
MetricNet
 
The 80/20 Rule for Desktop KPI's: Less is More!
MetricNet
 
Dit yvol5iss4
Rick Lemieux
 
Customer Service Analytics - Make Sense of All Your Data.pptx
Emmanuel Dauda
 
Service industry metrics
Dan Wilson
 
Free Desktop Support Training Series | The Zen of Support - The Path to Strat...
MetricNet
 
Metric Nets Seven Most Important Kp Is For The Service Desk V4
MMehterian
 
BMC BSM - Automate Service Management System
Vyom Labs
 
Cause and-effect
Karthik Arumugham
 
Metrics that Matter: Focusing on key metrics for an efficient service desk an...
Freshservice
 
The role of it leadership in service and support v2
MetricNet
 
Slideshare_Metrics_Presentation_
Thomas Nanomantube
 
Measuring the Success of Cloud-Based Services
Vistara
 

More from Atlassian (20)

PPTX
International Women's Day 2020
Atlassian
 
PDF
10 emerging trends that will unbreak your workplace in 2020
Atlassian
 
PDF
Forge App Showcase
Atlassian
 
PDF
Let's Build an Editor Macro with Forge UI
Atlassian
 
PDF
Meet the Forge Runtime
Atlassian
 
PDF
Forge UI: A New Way to Customize the Atlassian User Experience
Atlassian
 
PDF
Take Action with Forge Triggers
Atlassian
 
PDF
Observability and Troubleshooting in Forge
Atlassian
 
PDF
Trusted by Default: The Forge Security & Privacy Model
Atlassian
 
PDF
Designing Forge UI: A Story of Designing an App UI System
Atlassian
 
PDF
Forge: Under the Hood
Atlassian
 
PDF
Access to User Activities - Activity Platform APIs
Atlassian
 
PDF
Design Your Next App with the Atlassian Vendor Sketch Plugin
Atlassian
 
PDF
Tear Up Your Roadmap and Get Out of the Building
Atlassian
 
PDF
Nailing Measurement: a Framework for Measuring Metrics that Matter
Atlassian
 
PDF
Building Apps With Color Blind Users in Mind
Atlassian
 
PDF
Creating Inclusive Experiences: Balancing Personality and Accessibility in UX...
Atlassian
 
PDF
Beyond Diversity: A Guide to Building Balanced Teams
Atlassian
 
PDF
The Road(map) to Las Vegas - The Story of an Emerging Self-Managed Team
Atlassian
 
PDF
Building Apps With Enterprise in Mind
Atlassian
 
International Women's Day 2020
Atlassian
 
10 emerging trends that will unbreak your workplace in 2020
Atlassian
 
Forge App Showcase
Atlassian
 
Let's Build an Editor Macro with Forge UI
Atlassian
 
Meet the Forge Runtime
Atlassian
 
Forge UI: A New Way to Customize the Atlassian User Experience
Atlassian
 
Take Action with Forge Triggers
Atlassian
 
Observability and Troubleshooting in Forge
Atlassian
 
Trusted by Default: The Forge Security & Privacy Model
Atlassian
 
Designing Forge UI: A Story of Designing an App UI System
Atlassian
 
Forge: Under the Hood
Atlassian
 
Access to User Activities - Activity Platform APIs
Atlassian
 
Design Your Next App with the Atlassian Vendor Sketch Plugin
Atlassian
 
Tear Up Your Roadmap and Get Out of the Building
Atlassian
 
Nailing Measurement: a Framework for Measuring Metrics that Matter
Atlassian
 
Building Apps With Color Blind Users in Mind
Atlassian
 
Creating Inclusive Experiences: Balancing Personality and Accessibility in UX...
Atlassian
 
Beyond Diversity: A Guide to Building Balanced Teams
Atlassian
 
The Road(map) to Las Vegas - The Story of an Emerging Self-Managed Team
Atlassian
 
Building Apps With Enterprise in Mind
Atlassian
 

Recently uploaded (20)

PPTX
Maximizing Revenue with Marketo Measure: A Deep Dive into Multi-Touch Attribu...
bbedford2
 
DOCX
The Future of Smart Factories Why Embedded Analytics Leads the Way
Varsha Nayak
 
PDF
Microsoft Teams Essentials; The pricing and the versions_PDF.pdf
Q-Advise
 
PPTX
Explanation about Structures in C language.pptx
Veeral Rathod
 
PDF
The Role of Automation and AI in EHS Management for Data Centers.pdf
TECH EHS Solution
 
PDF
Become an Agentblazer Champion Challenge Kickoff
Dele Amefo
 
PPTX
TestNG for Java Testing and Automation testing
ssuser0213cb
 
PPTX
Role Of Python In Programing Language.pptx
jaykoshti048
 
PDF
Wondershare Filmora 14.5.20.12999 Crack Full New Version 2025
gsgssg2211
 
PDF
lesson-2-rules-of-netiquette.pdf.bshhsjdj
jasmenrojas249
 
PDF
Community & News Update Q2 Meet Up 2025
VictoriaMetrics
 
PDF
ShowUs: Pharo Stream Deck (ESUG 2025, Gdansk)
ESUG
 
PDF
PFAS Reporting Requirements 2026 Are You Submission Ready Certivo.pdf
Certivo Inc
 
PDF
Key Features to Look for in Arizona App Development Services
Net-Craft.com
 
PPTX
oapresentation.pptx
mehatdhavalrajubhai
 
PDF
Become an Agentblazer Champion Challenge
Dele Amefo
 
PPTX
Odoo Integration Services by Candidroot Solutions
CandidRoot Solutions Private Limited
 
PPTX
Visualising Data with Scatterplots in IBM SPSS Statistics.pptx
Version 1 Analytics
 
PPTX
Presentation about variables and constant.pptx
safalsingh810
 
PPTX
ConcordeApp: Engineering Global Impact & Unlocking Billions in Event ROI with AI
chastechaste14
 
Maximizing Revenue with Marketo Measure: A Deep Dive into Multi-Touch Attribu...
bbedford2
 
The Future of Smart Factories Why Embedded Analytics Leads the Way
Varsha Nayak
 
Microsoft Teams Essentials; The pricing and the versions_PDF.pdf
Q-Advise
 
Explanation about Structures in C language.pptx
Veeral Rathod
 
The Role of Automation and AI in EHS Management for Data Centers.pdf
TECH EHS Solution
 
Become an Agentblazer Champion Challenge Kickoff
Dele Amefo
 
TestNG for Java Testing and Automation testing
ssuser0213cb
 
Role Of Python In Programing Language.pptx
jaykoshti048
 
Wondershare Filmora 14.5.20.12999 Crack Full New Version 2025
gsgssg2211
 
lesson-2-rules-of-netiquette.pdf.bshhsjdj
jasmenrojas249
 
Community & News Update Q2 Meet Up 2025
VictoriaMetrics
 
ShowUs: Pharo Stream Deck (ESUG 2025, Gdansk)
ESUG
 
PFAS Reporting Requirements 2026 Are You Submission Ready Certivo.pdf
Certivo Inc
 
Key Features to Look for in Arizona App Development Services
Net-Craft.com
 
oapresentation.pptx
mehatdhavalrajubhai
 
Become an Agentblazer Champion Challenge
Dele Amefo
 
Odoo Integration Services by Candidroot Solutions
CandidRoot Solutions Private Limited
 
Visualising Data with Scatterplots in IBM SPSS Statistics.pptx
Version 1 Analytics
 
Presentation about variables and constant.pptx
safalsingh810
 
ConcordeApp: Engineering Global Impact & Unlocking Billions in Event ROI with AI
chastechaste14
 

Understanding Metrics – What to Measure, and Why - John Custy

  • 2. Understanding Metrics: What to Measure and Why JOHN CUSTY • MANAGING CONSULTANT • JPC GROUP • @ITSMNINJA
  • 3. About John Custy Service Management Practitioner, Consultant And Educator • Distinguished Professional in IT Service Management • ITIL Expert & ITIL Service Manager • ITIL Intermediate – SS, SD, ST, SO, CSI, OSA, SOA, PPO, RCV • KT Certified Instructor • ITIL Accredited Trainer • KCS Verified Consultant • ISO/IEC 20000 Consultant • ISFS, ISMAS based on ISO/IEC 27002 • HDI Faculty & Certified Instructor
  • 4. A story about metrics…
  • 5. Source: Joëlle Flumet’s Broken Bed Installation
  • 9. The customer rating is 5/5, but the customer WON’T return!
  • 10. What are you measuring? Why are you measuring it?
  • 11. What to Measure and Why PURPOSE OF METRICS TYPES OF METRICS COMMON SUPPORT METRICS METHODS OF REPORTING QUESTIONS
  • 12. PURPOSE OF METRICS How to use metrics
  • 13. PURPOSE OF METRICS How to use metrics • Inform your stakeholders • Report measurements so that stakeholders can understand activities and results • Promote the value of the organization • Determine the best way to communicate the information to the stakeholders • Perform better stakeholder analysis to facilitate stakeholder buy-in • Improve performance - people do what is measured
  • 14. PURPOSE OF METRICS What are we trying to accomplish?
  • 15. PURPOSE OF METRICS What are we trying to accomplish? ENSURE ALIGNMENT • Account for IT processes and deliverables • Inform stakeholders • Understand IT performance COMPLIANCE • Achieve certifications: ISO/IEC 20000, COBIT • Measure progress to goals/objectives OPERATIONAL EXCELLENCE • Measure IT performance • Control IT processes • Maximize IT productivity (people) • Report costs • Demonstrate value of the IT organization
  • 16. The Relationship of Metrics to Goals
  • 17. The Relationship of Metrics to Goals VISION MISSION GOALS OBJECTIVES CSF KPI METRICS MEASUREMENT
  • 18. PURPOSE OF METRICS Sharing Accomplishments What should you report?
  • 19. PURPOSE OF METRICS Sharing Accomplishments What should you report? Key performance indicators Critical success factors Variances to baseline Progress towards targets Annotate milestones and abnormalities Service improvement projects
  • 20. Understanding the Different Types of Metrics
  • 22. DIFFERENT TYPES OF METRICS Metrics & Characteristics METRICS • Performance indicators (PI) • Key performance indicators (KPI) • Key results indicators (KRI) CHARACTERISTICS • Efficiency vs. effectiveness • Leading vs. lagging
  • 24. DIFFERENT TYPES OF METRICS Efficiency vs. Effectiveness EFFICIENCY • How fast? • How many? • Transactional cost • Incident/Request/Access Management • Departmental goals EFFECTIVENESS • Accuracy • Customer satisfaction • Total organizational cost • Problem Management • Enterprise objectives
  • 26. DIFFERENT TYPES OF METRICS Quantitative vs. Qualitative QUANTITATIVE • How much or how many • Ex. The number of times customers contact the service desk QUALITATIVE • How well something or someone is performing • Ex. Customer satisfaction, employee satisfaction, stock price
  • 28. • Time to process an order • Time to check inventory item • Time to send/receive an e-mail • Time to … @ITSMNinja End-to-End Performance
  • 29. • Cost Per Transaction • Cost Per User @ITSMNinja End-to-End Performance
  • 30. Uptime – compared to … • Downtime per Service • Frequency and total amount of time • Number of incidents (type/category) • Number of recurring incidents • Time per incident @ITSMNinja Service Availability
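The availability figures above can be sketched in a few lines. This is an illustrative calculation with invented outage data, not numbers from the deck; the function name and the 24x7 service period are assumptions.

```python
# Hypothetical example: computing service availability from outage records.
# Outage durations below are invented for illustration.

def availability_pct(period_minutes, outage_minutes):
    """Availability = (agreed service time - downtime) / agreed service time."""
    return 100.0 * (period_minutes - sum(outage_minutes)) / period_minutes

# A 30-day month of 24x7 service (43,200 minutes) with three outages.
outages = [45, 10, 65]  # minutes of downtime per incident

print(len(outages))                              # number of incidents: 3
print(round(availability_pct(43_200, outages), 2))  # 99.72
```

The same record set yields frequency (number of incidents), total downtime, and time per incident, which is why raw outage records are more useful than a single uptime percentage.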
  • 31. • Problems identified per Service • # incidents per problem • Lost time per problem • Changes Per Service • # successful changes (time, budget) • Lost time due to changes – incidents and requests • # Service Requests due to changes • # problems due to changes – IT and business lost time • % improvements due to changes @ITSMNinja Service Availability
  • 32. BALANCE NEEDED Operational metrics allow you to understand where to improve; service metrics report on the overall performance of the service
  • 33. What Type of Metrics Are Reported?
  • 34. What Type of Metrics Are Reported? IT INTERNAL METRICS SENIOR MANAGEMENT SERVICE MANAGEMENT BUSINESS UNIT METRICS REGULATORY/ COMPLIANCE
  • 35. Four Types of Process Metrics • PROGRESS – in process maturity • EFFICIENCY – use of resources • EFFECTIVENESS – correct and complete the first time • COMPLIANCE – to process and regulatory requirements
  • 37. @ITSMNinja Typical Operational Measurements • Response • % connected immediately (Real-Time) • Abandon Rate • Wait (hold/queue) Time • Average Speed to Answer (ASA) • Response Time service level XX% in YY seconds • Call-Back Time • Desktop (PC)
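The response measurements above fall out of raw call records. A minimal sketch, assuming a simple record layout (the field names are illustrative, not any particular ACD system's schema):

```python
# Invented call records; "wait_sec" is time in queue, "abandoned" means the
# caller hung up before an agent answered.
calls = [
    {"wait_sec": 12, "abandoned": False},
    {"wait_sec": 95, "abandoned": True},
    {"wait_sec": 30, "abandoned": False},
    {"wait_sec": 8,  "abandoned": False},
]

answered = [c for c in calls if not c["abandoned"]]

# Average Speed to Answer (ASA): mean wait of answered calls only.
asa = sum(c["wait_sec"] for c in answered) / len(answered)

# Abandon rate: share of offered calls the caller gave up on.
abandon_rate = 100.0 * sum(c["abandoned"] for c in calls) / len(calls)

# Service level "XX% in YY seconds" (here: answered within 30 seconds).
within_30 = 100.0 * sum(c["wait_sec"] <= 30 for c in answered) / len(calls)

print(round(asa, 2), abandon_rate, within_30)  # 16.67 25.0 75.0
```

Whether the service-level denominator counts all offered calls (as here) or only answered calls is a definitional choice; agree on it before reporting the number.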
  • 38. @ITSMNinja Typical Operational Measurements • Resolution • Resolved First Contact • Resolved X hours, Y hours, Z hours • Cases re-opened, Repeats • Requests resolved without assistance (self-help) • Calls/Cases avoided due to self-help
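First-contact resolution and re-open rate from the slide above are simple ratios over the ticket population. A hedged sketch with invented tickets; the field names are assumptions, not a real tool's schema:

```python
# Illustrative ticket records for first-contact resolution (FCR) and
# re-open rate; data is invented.
tickets = [
    {"resolved_first_contact": True,  "reopened": False},
    {"resolved_first_contact": False, "reopened": False},
    {"resolved_first_contact": True,  "reopened": True},
    {"resolved_first_contact": True,  "reopened": False},
    {"resolved_first_contact": False, "reopened": False},
]

fcr = 100.0 * sum(t["resolved_first_contact"] for t in tickets) / len(tickets)
reopen_rate = 100.0 * sum(t["reopened"] for t in tickets) / len(tickets)

print(fcr, reopen_rate)  # 60.0 20.0
```

A rising FCR with a rising re-open rate usually means cases are being closed too early, which is why the two belong on the same report.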
  • 39. @ITSMNinja Typical Operational Measurements • Cost • Call/Contact • Customer/User • Total Cost (TCO) of Support/Service
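The cost measurements above are straightforward divisions once the fully loaded support cost is known. All figures below are invented for illustration:

```python
# Illustrative cost-per-contact / cost-per-user arithmetic; figures invented.
total_support_cost = 50_000.0  # monthly fully loaded cost of the support org
contacts = 2_500               # contacts handled in the month
supported_users = 800          # users the organization supports

print(total_support_cost / contacts)         # cost per contact: 20.0
print(total_support_cost / supported_users)  # cost per user: 62.5
```

Cost per contact rewards deflecting contacts; cost per user rewards supporting users cheaply overall, so the two can move in opposite directions.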
  • 40. @ITSMNinja Typical Operational Measurements • Service Desk • Volumes, trends • Performance to goals • Incident Management • Volumes, trends, repeat incidents • Reduction in restoration time • Performance to goals • Customer Satisfaction
  • 41. @ITSMNinja Typical Operational Measurements • Request Fulfillment • Volumes, trends • Time to complete requests • Performance to goals • Customer satisfaction
  • 42. @ITSMNinja Typical Operational Measurements • Customer Satisfaction • Employee Satisfaction
  • 43. @ITSMNinja Typical Operational Measurements • Knowledge Base Usage • Accesses/Searches per contact • # solutions per search • # solutions searched/opened/viewed • Time spent reviewing solutions • Ease of finding solutions • Quality of solutions (ability to use solutions)
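The knowledge-base usage ratios above (searches per contact, solutions per search) are sketched below with invented event counts:

```python
# Hedged sketch of knowledge-base usage ratios; all counts are invented.
contacts = 1200          # total support contacts in the period
kb_searches = 900        # knowledge-base searches during those contacts
solutions_viewed = 2100  # solutions opened/viewed across all searches

searches_per_contact = kb_searches / contacts
solutions_per_search = solutions_viewed / kb_searches

print(round(searches_per_contact, 2))  # 0.75
print(round(solutions_per_search, 2))  # 2.33
```

Many solutions viewed per search can indicate poor findability rather than rich content, which is why the slide pairs volume counts with "ease of finding" and "quality of solutions".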
  • 44. @ITSMNinja Typical Operational Measurements • Service Asset and Configuration Management • Errors in CMDB • Resources improvement utilizing CMDB • Change Management • Number of incidents/requests due to the change • Additional (reduction) workload due to changes • Release Management • Number of incidents/requests due to the release • Additional (reduction) in workload due to releases
  • 49. SERVICE DESK METRICS WORKLOAD • Volumes • Calls/Cases per customer per month • Number of registered users / Total number of users • Time spent contacting users • Time spent on change-related incidents/requests INDIVIDUALS • Number of calls taken • Average Handle Time (AHT) • Availability • Occupancy • Number of incidents/requests closed on first contact • Customer satisfaction • Contribution to knowledge base CUSTOMERS • Customer satisfaction • Frequency of surveying, number not responding • Volumes • Calls/Cases RESPONSE • Average Speed to Answer (ASA) • % calls answered live vs. queued • Call-back time • Abandon Rate (ABA) • Responses within and outside service level
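Two of the individual metrics above, AHT and occupancy, can be computed as below. This is a rough sketch with invented numbers; the staffed-time figure is an assumption:

```python
# Illustrative agent-level metrics; call durations and shift length invented.

def average_handle_time(handle_secs):
    """AHT: mean talk + hold + wrap-up time per handled contact."""
    return sum(handle_secs) / len(handle_secs)

def occupancy_pct(handle_secs, staffed_secs):
    """Occupancy: share of staffed time spent handling contacts."""
    return 100.0 * sum(handle_secs) / staffed_secs

handles = [300, 420, 360, 240]  # seconds per call
print(average_handle_time(handles))                # 330.0
print(round(occupancy_pct(handles, 4 * 3600), 2))  # 9.17 (a quiet 4-hour shift)
```

Occupancy is a balance metric: very high values signal burnout risk, very low values signal overstaffing, so it should never be targeted at 100%.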
  • 50. INCIDENT, REQUEST AND ACCESS MANAGEMENT
  • 54. INCIDENT, REQUEST AND ACCESS MANAGEMENT RESOLUTION • Incident closure (from time of submission) • Mean Time for Service Restoration (MTSR) for Levels 1, 2, & 3 • Incidents matched (KE) • Incidents re-opened • Closed first contact • Escalations for resolution • Remote tool utilization • Desk-side visits • Incidents closed via self-help VOLUME • Total number of incidents/requests (by priority & category) • Security-related incidents RESPONSE TIME • Service Desk performance • Level 2/3 – same as SD metrics ESCALATION • Time to escalate • % escalated to correct group • Technical & hierarchical
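MTSR per support level, as listed above, is a grouped mean over restoration times. A hedged sketch with invented incident data:

```python
# Mean Time for Service Restoration (MTSR) grouped by support level.
# Incident records below are invented for illustration.
from collections import defaultdict

incidents = [
    {"level": 1, "restore_min": 20},
    {"level": 1, "restore_min": 40},
    {"level": 2, "restore_min": 90},
    {"level": 3, "restore_min": 300},
]

by_level = defaultdict(list)
for inc in incidents:
    by_level[inc["level"]].append(inc["restore_min"])

mtsr = {lvl: sum(mins) / len(mins) for lvl, mins in by_level.items()}
print(mtsr)  # {1: 30.0, 2: 90.0, 3: 300.0}
```

Reporting MTSR per level rather than one blended mean keeps a few long Level 3 restorations from masking good front-line performance.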
  • 55. INCIDENT, REQUEST AND ACCESS MANAGEMENT
  • 56. INCIDENT, REQUEST AND ACCESS MANAGEMENT CUSTOMER SATISFACTION • Incident, Request & Access Management processes SELF-SERVICE • Number of unique users • Average time per user • # pages viewed
  • 57. Methods to Report Performance
  • 58. Methods to Report Performance PERFORMANCE REPORTS BALANCED SCORECARD SUPPORT SCORECARD
  • 59. Factors to Consider When Reporting
  • 60. Factors to Consider When Reporting • Who are the stakeholders? • How does what you are reporting impact the stakeholders? • Reports must be easy to read and understand, so they need to be developed with the stakeholder in mind. • Reports need to show how the support center is contributing to the goals of each stakeholder and the business. • Reports must identify the appropriate channels to communicate with each of the stakeholders.
  • 65. Present your achievements with scorecards
  • 66. Scorecards Drive Transformation [Chart: frameworks plotted from general to specific and by relevance to IT, from low (process improvement) to high (business transformation) – ISO/IEC 17799, COBIT, ISO/IEC 9000, ITIL/ISO/IEC 20000, Six Sigma, Malcolm Baldrige Award, Scorecards] Standards help us to ensure that IT is aligned to meet business objectives
  • 67. • Simple indicator • Reference base • Measures the key issues • Reports on progress to goals @ITSMNinja Support Scorecard
  • 69. Scorecard Criteria: Operational Performance • Overall performance • Response time • Resolution time • Closed first call • Abandon time • Wait time • Status time • Backlog aging
  • 70. Scorecard Criteria: Operational Performance SERVICE SCORECARD – The Acme Support Center Scorecard provides a weekly report of performance on our service level commitments. RESPONSE TIME • Front-line: Call Pick-Up Time – all incoming calls are answered by a support consultant (goal: 80%) • Call Waiting Time – average is less than 3 minutes • Back-line: Non-Accepted Call-Back Time – all customers not responded to on the initial call by a support consultant will be called back within 30 minutes (goal: 90%) RESOLVE TIME • Resolved on First Contact – 30% resolved first call (4-month goal is 50%) • Resolved Same Day – 40% resolved within 1 business day (4-month goal is 60%) • Resolved Same Week – 85% resolved within 5 business days STATUS • Priority 1 issues – customer provided status every 4 hours until resolved (goal: 80%) • Priority 2 issues – customer provided status every 24 hours until resolved or workaround provided (goal: 80%) • Call Aging – manage backlog so that no more than 20% is over 2 weeks and 5% over 30 days (goal: 80%) BACKLOG • Average age of open items – goal 3 days, actual 10 days EVENT SURVEY • Overall satisfaction rating on a 1-5 scale – 4.1
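Goal-vs-actual rows like these are often rolled up into a red/amber/green indicator. One possible way to do that, with assumed thresholds (the 10% amber band is an illustration, not a standard):

```python
# Hypothetical red/amber/green rollup for goal-vs-actual scorecard rows.

def rag_status(actual, goal, amber_band=0.10):
    """Green if goal met, amber if within 10% of goal, red otherwise.
    Assumes a higher-is-better metric (e.g. % resolved on first contact)."""
    if actual >= goal:
        return "green"
    if actual >= goal * (1 - amber_band):
        return "amber"
    return "red"

print(rag_status(85, 80))  # green
print(rag_status(76, 80))  # amber (within 10% of the goal)
print(rag_status(60, 80))  # red
```

Lower-is-better metrics (backlog age, wait time) need the comparisons inverted, so a real scorecard carries a direction flag per row.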
  • 71. Service Desk Scorecard OPERATIONAL PERFORMANCE • Overall performance • Response time • Resolution time • Closed first call • Abandon time • Wait time • Status time • Backlog aging CUSTOMER SAT/QUALITY • Overall satisfaction • Response satisfaction • Resolution satisfaction • Status satisfaction • Improvement goals • Alignment to business
  • 72. Acme Service Desk Scorecard
  • 73. Balanced Scorecard Business Goals FINANCIAL PERSPECTIVE • Provide a good return on investment on IT-enabled business investments • Manage IT-related business risks • Improve corporate governance and transparency CUSTOMER PERSPECTIVE • Improve customer orientation and service • Offer competitive products and services • Establish service continuity and availability • Create agility in responding to changing business requirements • Achieve the cost optimization of service delivery • Obtain reliable and useful information for strategic decision making INTERNAL PERSPECTIVE • Improve and maintain business process functionality • Lower process costs • Provide compliance with external laws, regulations and contracts • Provide compliance with internal policies • Manage business change • Improve and maintain operational and staff productivity LEARNING AND GROWTH PERSPECTIVE • Manage product and business innovation • Acquire and maintain skilled and motivated personnel
  • 74. IT Balanced Scorecard @ITSMNinja Source: ITIL Continual Service Improvement
  • 76. You cannot manage what you cannot CONTROL You cannot control what you cannot MEASURE You cannot measure what you cannot DEFINE
  • 77. What do you need to measure? What should you do with the metrics you produce?
  • 78. Thank you! Join me tomorrow @2PM to learn Knowledge Centered Support (KCS) – The Methodology that Really Works John Custy • Managing Consultant • JPC Group • @ITSMNinja