Data Management Maturity: Achieving Best Practices Using the DMM
Presented by Melanie Mecca & Peter Aiken, Ph.D.
Copyright 2013 by Data Blueprint
Outline: Data Management Maturity
• Motivation
– Are we satisfied with current performance of DM?
• How did we get here?
– Building on previous research
• What is the Data Management Maturity Model?
– Ever heard of CMM/CMMI?
• How should it be used?
– Use Cases and Value Proposition
• Where to next?
• Q & A
Guided Navigation to Lasting Solutions
• Architecture & technology neutral
• Industry independent
• Answers: "How are we doing?"
• Guides: "What should we do next?"
• Baseline for:
o Managing data as a critical asset
o Creating a tailored data management strategy
o Accelerating an existing program
o Engaging stakeholders
o Pinpointing high-value initiatives
Foundation for Business Results
• Trusted Data – demonstrated, independently measured capability to ensure customer confidence in the data
• Improved Risk and Analytics Decisions – a comprehensive and measured DM strategy ensures decisions are based on accurate data
• Cost Reduction/Operational Efficiency – identification of current and target states supports elimination of redundant data and streamlining of processes
• Regulatory Compliance – independently evaluated and measured DM capabilities to meet and substantiate industry and regulator requirements
Motivation
• "We want to move our data management program to the next level"
– Question: What level are you at now?
• You are currently managing your data
– But if you can't measure it,
– How can you manage it effectively?
• How do you know where to put time, money, and energy so that data management best supports the mission?
"One day Alice came to a fork in the road and saw a Cheshire cat in a tree. 'Which road do I take?' she asked. 'Where do you want to go?' was his response. 'I don't know,' Alice answered. 'Then,' said the cat, 'it doesn't matter.'"
– Lewis Carroll, Alice in Wonderland
DoD Origins
• US DoD Reverse Engineering Program Manager
• We sponsored research at the CMM/SEI asking:
– "How can we measure the performance of DoD and our partners?"
– "Go check out what the Navy is up to!"
• SEI responded with an integrated process/data improvement approach
– DoD required SEI to remove the data portion of the approach
– It grew into CMMI/DM BoK, etc.
Acknowledgements
• Aiken, P., Allen, M.D., Parker, B., and Mattia, A., "Measuring Data Management Practice Maturity: A Community's Self-Assessment" – increasing data management practice maturity levels can positively impact the coordination of data flow among organizations, individuals, and systems; results from a self-assessment provide a roadmap for improving organizational data management practices.
MITRE Corporation: Data Management Maturity Model
• Internal research project: Oct '94 – Sept '95
• Based on the Software Engineering Institute Capability Maturity Model (SEI CMM℠) for software development projects
• Key Process Areas (KPAs) parallel SEI CMM℠ KPAs, but with a data management focus and key practices
• A normative model for data management was required; the need was to:
– Understand the scope of data management
– Organize data management key practices
• Reported as not-done-well by those who do it
CMMI Institute Background
• Evolved from Carnegie Mellon's Software Engineering Institute (SEI), a federally funded research and development center (FFRDC)
• Continues to support and provide all CMMI offerings and services delivered over its 20+ year history at the SEI
o Industry-leading reference models – benchmarks and guidelines for improvement – Development, Acquisition, Services, People, Data Management
o Training and Certification program, Partner program
• Dedicated training, partner, and certification teams to support organizations and professionals
• Now owned by ISACA (CISO/M, COBIT, IT Governance, Cybersecurity); joint product offerings are planned
CMMI – Worldwide Process Improvement
CMMI quick stats:
• Over 10,000 organizations
• 94 countries
• 12 national governments
• 10 languages
• 500 Partners
• 1,900+ appraisals in 2016
Key Finding: Process Frameworks Are Not Created Equal
With the exception of CMM and ITIL, use of process-efficiency frameworks does not predict higher on-budget project delivery, and the same pattern generally holds true for on-time performance.
[Charts: Percentage of Projects on Budget and Percentage of Projects on Time, by Process Framework Adoption]
Source: Applications Executive Council, Applications Budget, Spend, and Performance Benchmarks: 2005 Member Survey Results, Washington, D.C.: Corporate Executive Board, 2006, p. 23.
CMMI Model Portfolio
• Establish, Manage, and Deliver Services
• Product Development / Software Engineering
• Acquire and Integrate Products / Supply Chain
• Workforce Development and Management
• Rearchitecting to present a more unified/modular offering
DMM and DMBOK
CMMI Institute and DAMA International are collaborating to:
• Eliminate any confusion between the two tools and highlight their complementarity
• Extend and enhance data management training for organizations and professionals
• Provide benefits to DAMA members (members receive a discount for our public training classes)
Data Management Maturity (DMM)℠ Model
• DMM 1.0 released August 2014
o 3.5 years in development
o Sponsors – Microsoft, Lockheed Martin, Booz Allen Hamilton
o 50+ contributing authors, 70+ peer reviewers, 80+ organizations
• Reference model framework of fundamental best practices
o 414 specific practice statements
o 596 functional work products
o Maturity practices
• Measurement instrument for organizations to evaluate capabilities and maturity, identify gaps, and incorporate guidelines for improvements
"You Are What You DO"
• The model emphasizes behavior
o Proactive, positive behavioral changes
o Creating and carrying out effective, repeatable processes
o Leveraging and extending across the organization
• Activities result in work products
o Processes, standards, guidelines, templates, policies, etc.
o Reuse and extension = maximum value, lower costs, happier staff
• Practical focus reflects real-world organizations – an enterprise program evolving to all hands on deck
The DMM is one concept for process improvement; others include:
• Norton Stage Theory
• TQM
• TQdM
• TDQM
• ISO 9000
All focus on understanding current processes and determining where to make improvements.
DMM Capability Maturity Model Levels
• Performed (1) – Our DM practices are informal and ad hoc, dependent upon "heroes" and heroic efforts
• Managed (2) – Our DM practices are defined and documented processes performed at the business unit level
• Defined (3) – Our DM efforts remain aligned with business strategy using standardized and consistently implemented practices
• Measured (4) – We manage our data as an asset using advantageous data governance practices/structures
• Optimized (5) – DM is a strategic organizational capability; most importantly, we have a process for improving our DM capabilities
DMM Capability Levels
• Five levels – Performed (1), Managed (2), Defined (3), Measured (4), Optimized (5) – progressing from risk, ad hoc effort, and stress toward quality, reuse, and clarity
• Capability – "We can do this"
o Specific Practices – "We're doing it well"
o Work Products – "We've documented the processes we are following" (processes, work products, guidelines, standards, etc.)
• Maturity – "…and we can prove it"
o Process stability and resilience – "Take it to the bank"
o Ensures repeatability
o Policy, training, quality assurance, etc.
DMM Structure
Each process area sits within a core category and contains:
• Purpose
• Introductory Notes
• Goal(s) of the Process Area
• Core Questions for the Process Area
• Functional Practices (Levels 1–5)
• Related Process Areas
• Example Work Products
• Infrastructure Support Practices
Components are designated as either explanatory or required for model compliance.
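To make the structure concrete, here is a minimal sketch of how one process area might be represented as a data structure; the class name, fields, and example values are illustrative assumptions drawn from the component list above, not an official rendering of the DMM.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ProcessArea:
    """Illustrative (hypothetical) representation of one DMM process area."""
    core_category: str
    name: str
    purpose: str
    goals: List[str] = field(default_factory=list)
    core_questions: List[str] = field(default_factory=list)
    # Functional practices keyed by capability level (1-5).
    functional_practices: Dict[int, List[str]] = field(default_factory=dict)
    related_process_areas: List[str] = field(default_factory=list)
    example_work_products: List[str] = field(default_factory=list)
    infrastructure_support_practices: List[str] = field(default_factory=list)

# Hypothetical example instance; content is invented for illustration only.
dq_strategy = ProcessArea(
    core_category="Data Quality",
    name="Data Quality Strategy",
    purpose="Define an organization-wide approach to data quality.",
    goals=["A data quality strategy is established and followed."],
    functional_practices={1: ["Data quality efforts are performed ad hoc at the project level."]},
)
```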
DMM℠ Structure of Five Integrated DM Practice Areas (plus Supporting Processes)
• Manage data coherently – Data Management Strategy
• Manage data assets professionally – Data Governance
• Maintain fit-for-purpose data, efficiently and effectively – Data Quality
• Data architecture implementation – Platform and Architecture
• Data lifecycle implementation – Data Operations
• Organizational support – Supporting Processes
Data Management Strategy
Planning for and managing data assets as a critical component of infrastructure, emphasizing an organization-wide approach and program versus project by project, data store by data store.
Data Governance
Implementing the building, nurturing, sustaining, and controlling power of collective decision-making, and harnessing staff expertise for collaborative development of knowledge management.
Data Quality
Comprises a 360-degree and extensible approach to improving the quality of data organization-wide by thoughtful planning and integrated best practices.
Data Operations
Ensures that requirements for data are specified and linked to business processes and metadata, enables data lineage and authoritative sources, and exercises controls and quality improvements for data provided.
Platform and Architecture
Key considerations for developing a well-organized data layer that meets business needs, with appropriate technologies, enabling integration, interoperability, and data provisioning.
Supporting Processes
Practices that implement organization and control for all data management processes, such as developing and monitoring metrics and managing risks, configurations, process quality, and work products.
Assessment Components
Data Management Practice Areas:
• Data Management Strategy – DM is practiced as a coherent and coordinated set of activities
• Data Quality – delivery of data in support of organizational objectives; the currency of DM
• Data Governance – designating specific individuals as caretakers for certain data
• Data Platform/Architecture – efficient delivery of data via appropriate channels
• Data Operations – ensuring reliable access to data
Capability Maturity Model Levels (examples of practice maturity):
• 1 – Performed: Our DM practices are ad hoc and dependent upon "heroes" and heroic efforts
• 2 – Managed: We have DM experience and the ability to implement disciplined processes
• 3 – Defined: We have standardized DM practices so that everyone in the organization can perform them with uniform quality
• 4 – Measured: We manage our DM processes so that the whole organization can follow our standard DM guidance
• 5 – Optimized: We have a process for improving our DM capabilities
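As a rough, hypothetical illustration of how workshop ratings could roll up into the level names above (the 1–5 rating scale per practice, the averaging rule, and the example numbers are assumptions; the DMM assessment method defines its own scoring rules), consider this sketch:

```python
from statistics import mean
from typing import Dict, List

LEVEL_NAMES = {1: "Performed", 2: "Managed", 3: "Defined", 4: "Measured", 5: "Optimized"}

def practice_area_level(practice_ratings: List[int]) -> int:
    """Roll individual practice ratings (1-5) up to a practice-area level.

    Using the floor of the average is an assumption for illustration only.
    """
    return max(1, min(5, int(mean(practice_ratings))))

def summarize(assessment: Dict[str, List[int]]) -> Dict[str, str]:
    """Return the level name achieved for each practice area."""
    return {area: LEVEL_NAMES[practice_area_level(r)] for area, r in assessment.items()}

# Hypothetical consensus ratings gathered in an assessment workshop.
assessment = {
    "Data Management Strategy": [2, 3, 2, 2],
    "Data Quality": [1, 2, 2, 1],
    "Data Governance": [3, 3, 4, 3],
    "Data Platform/Architecture": [2, 2, 3, 2],
    "Data Operations": [2, 1, 2, 2],
}
print(summarize(assessment))
```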
Industry-Focused Results
• CMU's Software Engineering Institute (SEI) collaboration
• Results from hundreds of organizations in various industries, including:
✓ Public companies
✓ State government agencies
✓ Federal government
✓ International organizations
• Defined industry standard
• Steps toward defining a data management "state of the practice"
Data Management Practices Assessment (sample results)
[Chart: client scores versus industry competition and all respondents across Data Management Strategy, Data Governance, Platform & Architecture, Data Quality, and Data Operations (focus areas: guidance and facilitation; implementation and access), rated 0–5 from Initial (I) to Optimized (V), with challenge areas highlighted.]
High Marks for IFC's Audit
[Chart: scores (0–5) for Leadership & Guidance, Asset Creation, Metadata Management, Quality Assurance, Change Management, and Data Quality, comparing TRE, ISG, IFC, industry benchmarks, and overall benchmarks.]
Comparison of DM Maturity 2007–2012
[Chart: 2007 versus 2012 maturity levels (1–5) for Data Program Coordination, Organizational Data Integration, Data Stewardship, Data Development, and Data Support Operations.]
Governor's Data Interns Program
…improving how the state prices and sells its goods and services, and more efficiently matching citizens to benefits when they enroll.
"The first year of our data internship partnership has been a success," said Governor McAuliffe. "The program has helped the state save time and money by making some of our internal processes more efficient and modern. And it has given students valuable real-world experience. I look forward to seeing what the second year of the program can accomplish."
"Data is an important resource that becomes even more critical as technology progresses," said VCU President Michael Rao, Ph.D. "VCU is uniquely positioned, both in its location and through the wealth of talent at the School of Business, to help state agencies run their data-centric systems more efficiently, while giving our students hands-on practice in the development of data systems."
During their internships, pairs of VCU students work closely with state agency CIOs to identify specific business cases in which data can be used. Participants gain practical experience in using data to drive re-engineering, while participating CIOs have concrete examples of how to make better use of data to provide innovative and less costly services to citizens.
"Working with the talented VCU students gave us a different perspective on what the data was telling us," said Dave Burhop, Deputy Commissioner/CIO of the Virginia Department of Motor Vehicles.
"The VCU interns provided an invaluable resource to the Governor's Coordinating Council on Homelessness," said Pamela Kestner, Special Advisor on Families, Children and Poverty. "They very effectively reviewed the data assets available in the participating state agencies and identified analytic content that can be used to better serve the homeless population."
"It's always useful to have 'fresh eyes' on data that we are used to seeing," said Jim Rothrock, Commissioner of the Department for Aging and Rehabilitative Services. "Our interns challenged us and the way we interpret data. It was refreshing and useful, and we cannot wait for new experiences with new students."
The data internships support Governor McAuliffe's ongoing initiative to provide easier access to open data in Virginia. The internships also support treating data as an enterprise asset, one of four strategic goals of the enterprise information architecture strategy adopted by the Commonwealth in August 2013. Better use of data allows the Commonwealth to identify opportunities to avoid duplicative costs in collecting, maintaining, and using information, and to integrate services across agencies and localities to improve responses to constituent needs and optimize government resources.
Virginia Secretary of Technology Karen Jackson and CIO of the Commonwealth Nelson Moe are leading the effort on behalf of the state. Students who want to apply for internships should contact Peter Aiken (peter.aiken@vcu.edu) for additional information.
Using DMM in the State of Arizona
• Policies drive change in state government
• Base policies on a widely accepted framework
DMM Supports Arizona Strategy
• Metrics – the DMM provides a measurement methodology
• Enterprise Architecture – the DMM provides gap analysis and a path forward
• Emphasis on Lean – the DMM drives toward eliminating silos for improved efficiency
DMM in Arizona – Current State
• Introduced the DMM at the annual Arizona Data Management Conference in January 2016
• Wide buy-in from multiple agencies
• "Building EDM Capabilities" training for 20 students from 11 agencies in March 2017
DMM in Arizona – Next Steps
• Students want advanced training
• Students want to help other agencies – a DMM "SWAT Team"
• 2nd Annual Data Management Conference – April 26–27
• Participating in the Governor's Goal Council
• Planning DMM assessments for 3–4 agencies
• The DMM adds structure and lends credibility to the state DM program
Natural Events for Employing the DMM
Use cases – assess current capabilities before:
• Developing or enhancing a DM program / strategy
• Embarking on a major architecture transformation
• Establishing data governance
• Expanding or enhancing analytics
• Implementing a data quality program
• Implementing a metadata repository
• Designing and implementing multi-LOB solutions:
o Master Data Management
o Shared Data Services
o Enterprise Data Warehouse
o Implementing an ERP
o Other multi-business-line efforts
Think of it like an energy audit or an executive physical.
Starting the Journey – DMM Assessment Method
To maximize the DMM's value as a catalyst for forging shared perspective and accelerating programs, our method provides:
– A collaborative launch event with a broad range of stakeholders
– Capabilities evaluated by consensus affirmations
– Key business input solicited through supplemental interviews
– Evaluation verified with work product reviews (evidence)
– A report and executive briefing presenting scoring, findings, observations, strengths, and customized specific recommendations
To date, over 800 assessment participants from business, IT, and data management have employed DMM 1.0 – practice by practice, work product by work product – to evaluate their capabilities.
DMM Assessment Summary – Sample Organization
[Chart: sample organization's DMM assessment scores.]
Cumulative Benchmark – Multiple Organizations
[Chart: cumulative benchmark across multiple assessed organizations.]
DMM Training and Certification – Current Offerings
• Building EDM Capabilities
o Instructor-led 3-day interactive class
o eLearning – web-based 8–10 hour class
• Advancing EDM Capabilities
o Instructor-led 5-day interactive class
• Enterprise Data Management Expert (EDME)
o Instructor-led 5-day interactive class, preparation for EDME certification
• (Near future) DMM Associate certification
ENTERPRISE DATA MANAGEMENT VISION
• The State of Arizona is conducting a multi-year statewide data management initiative to support:
o The Governor's "Arizona Management System," an innovative Lean-based approach instituted by the current governor to streamline and meet the state's goal to be "the #1 state to live, work, play, recreate, retire, visit, do business, and get an education"
o Employee education to empower them to employ standards, methods, and tools for data-driven decision-making and disciplined problem solving, which lead to greater creativity, control, and productivity in citizen service
• Arizona realizes that trusted, accurate, and sustainable data is essential to successful data-driven decision making and accurate metrics, and that a statewide enterprise data management platform is critical
• The state is also focusing on modernizing its data layer and technology stack to enable needed data sharing and protect privacy
MULTI-YEAR EDM PROGRAM
• The Arizona Strategic Enterprise Technology (ASET) office, within the Department of Administration, developed a five-year Indefinite Delivery Indefinite Quantity (IDIQ) contract to include the following services (selected):
o Review, assess, and measure data management maturity, using the CMMI Institute's Data Management Maturity Model as a framework
o Make recommendations for a roadmap to maturity for state agencies to achieve target maturity levels
o Train and educate ADOA-ASET, business data stewards, and management staff to perform DMM assessments and appraisals
o Recommend policies, standards, and processes
o Provide advisory services for best practice implementation
DMM AND DATA-DRIVEN DECISION MAKING
• The CMMI Institute was among the three vendors selected to provide the IDIQ's broad range of services over a five-year period, along with Quest Communications and Data Blueprint
• We are working with the State of Arizona to implement a statewide data management program to BUILD*IMPROVE*MEASURE its data management processes and capabilities utilizing the DMM
• We are teaming with ADOA-ASET to help construct measurable inter- and intra-agency DMM programs to:
o Provide education in data management best practices
o Evaluate data management practices in numerous agencies
o Assist the state in evolving a trusted data environment
o Help ASET develop the roles and resources to support the statewide program
USING DMM IN THE STATE OF ARIZONA
• Policies are driving collaboration and change in state government
• Basing data management policies on a widely accepted framework
DMM SUPPORTS ARIZONA STRATEGY
• Metrics – AMS requires metrics-driven decisions, and the DMM provides a measurement methodology
• Enterprise Architecture – the DMM provides gap analysis and a path forward for data sharing
• Emphasis on Lean – the DMM drives toward eliminating silos for improved effectiveness and efficiency
DMM IN ARIZONA – RECENT ACTIVITIES
• Introduced the DMM at the annual Arizona Data Management Conference in January 2016
• Wide buy-in from multiple agencies
• Held a CMMI Institute capability briefing for key state data management personnel
• "Building EDM Capabilities" training for 20 students from 11 agencies in March 2017
2ND ANNUAL DATA MANAGEMENT CONFERENCE
• Hosted by ADOA-ASET, led by our POC Jeff Wolkove, Enterprise Architect
• Over 75 attendees representing 40+ agencies
• Keynote speaker was State Chief Operating Officer Henry Darwin
• Presented at the statewide Data Management Conference, April 2017:
o Moderated data management panel discussion
o Overview – Data Management Capability Evaluation
o Role of Data Governance in EDM
o Break Down Silos by Managing Data as Meaning (seminar)
DMM IN ARIZONA – NEXT STEPS
• Building EDM Capabilities attendees want advanced training in the DMM
• Students want to help other agencies – DMM "SWAT Teams"
• ASET is participating in the Governor's Goal Council, representing data management issues
• Planning DMM assessments for 3–4 agencies:
o Department of Corrections – June 2017
o Department of Water Resources – July 2017
o Health Care Cost Containment System – August 2017
o And more in the works…
• The DMM adds structure and lends credibility to the state Data Management Program
• Arizona has set a target for 35 cabinet-level agencies to achieve DMM Level 3 in the next three years
OPPORTUNITIES FOR STATES AND TERRITORIES
• Arizona is providing leadership and vision that can be leveraged by other states and territories
• Analytics and automation, the Internet of Things, and unstructured data are testing the status quo of staffing, technology, and data management needs
• States are undertaking modernization of their technologies and processes
• The Arizona RFP has been provided to New York and Virginia, which has begun data management capability evaluation for multiple agencies through the Virginia Commonwealth University graduate school
• We presented a webinar on statewide data management programs, featuring Jeff Wolkove, to the National Association of State Chief Information Officers (NASCIO) in April 2017
• We would like to work with additional states; this will increase adoption of the DMM as a de facto global standard and provide opportunities for our Partners and EDMEs
WHAT CAN GO WRONG?
• It's estimated that the average hospital has 8–12% duplicate records, and as many as 10% of incoming patients are misidentified
• Sharing of patient data from disparate providers increases the likelihood of duplicates for HIEs due to defects in MPIs
• Preventable medical errors may include:
o Misdiagnosis and incorrect treatment procedures
o Incorrect or wrong dose of medication
o Incorrect blood type
o Unknown medication allergies
o Repeated diagnostic tests
• The cost to correct a duplicate record is estimated at $1,000
• Claims may be rejected
• Increased risk of lawsuits
IMPROVED DATA QUALITY LEADS TO MANY BENEFITS
• Decreased operational risk through improvements to the quality of patient demographic data; specifically, patient safety is protected and the delivery of patient care improves
• Increased operational efficiency, requiring less manual effort to fix data issues, fewer duplicate test orders for patients, and adoption of standard data representations
• Improved interoperability and data integration through adopting data standards and data management practices that are followed by staff across the patient lifecycle
• Improved staff productivity by expending fewer hours on detecting and remediating data defects, and increased staff awareness of contributing to and following processes that improve patient identity integrity
THE DMM WAS SELECTED AS THE FOUNDATION
• An HHS ONC Community of Practice (CoP) for patient demographic data quality was convened in 2015 with government, health care provider, and industry association members. The CoP analyzed available data management frameworks to determine which encompassed the practices that health care providers and staff needed to improve patient data quality across the entire organization.
• They determined that the DMM was most closely aligned with the needs of the health care industry for a comprehensive standard, due to its:
o Behavioral, fact-based approach
o Organization-wide scope
o Built-in path for improvements
o Technology and architecture neutrality
o Support for a rapid and precise evaluation of the current state of data management practices
DEVELOPMENT ACTIVITIES WITH HHS ONC
• CMMI Institute was commissioned to develop a practical, implementable framework for HHS ONC, focused on advancing the goal of improved patient data quality:
o Describing the organizational behavior reflecting sound data management principles in a health care context
o Specific data management practices in the context of patient demographic data
o A set of work products (policies, plans, processes, standards, etc.) supporting the new or improved capabilities
• Working with HHS ONC and our prime, Audacious Inquiry, the team researched ONC publications, industry surveys, and studies. We then analyzed the DMM to determine the process areas (topics/disciplines) and functional practices addressing the scope:
o Demographic data used for record matching
o Relevant to the patient care lifecycle and stakeholders
o Most critical to assessing and monitoring demographic data quality
Reference: ONC's "Patient Identification and Matching Final Report," February 7, 2014
DMM TRANSFORMATION RESULTS: THE PATIENT DEMOGRAPHIC DATA QUALITY FRAMEWORK
• Eliminated process areas less directly related to the scope – 19 versus 25
• Practice statements were transformed into questions – 76 questions versus 414 statements
o Invites self-administration, elicits discussion
o Encourages stakeholder representation across the patient care lifecycle
• Five capability Levels (1–5) became three capability Tiers – Foundational, Building, and Advanced
• Example work products – 152 versus 596
• 96 versus 240+ pages
• Contextual examples throughout
DESIGNED FOR A COLLABORATIVE APPROACH
• Managing data is primarily a people problem, not a 'system problem'
• Patient data primarily originates in Registration; however, it can be modified at any point throughout the patient care lifecycle
• No one individual knows everything about the patient data
• Staff who have access to the patient record should be engaged in decisions about standards, formats, terms, values, etc.
• PDDQ questions are designed to be posed in a group setting
• Consensus decisions are made about the current state
• Key processes and standards should be agreed to, standardized, and followed
• Industry-wide, realization of the long-term goal – adoption of consistent data standards and standard matching algorithms – would increase interoperability and minimize duplicates
HHS ONC ADVOCATES A NEW ROLE
• Who should lead data quality plans and improvements within a health care provider organization? ONC proposed that a Data Quality Coordinator (DQC) for patient demographic data should be designated for each organization
o Leads the PDDQ evaluation effort
o Coordinates process improvement efforts, for example:
– Establishing data governance
– Defining terms, formats, and values for demographic data
– Developing quality rules
– Creating a data quality plan for the organization
o In a small practice, one individual or a part-time individual; in a large practice, it may be a group led by a DQC
• Establishing this role demonstrates commitment
AUDIENCE FOR THE PDDQ
• The PDDQ was designed for any organization creating, managing, or aggregating patient data (e.g., a master patient index, MPI):
o Hospitals and health systems, which deliver patient care
o Health Information Organizations (HIOs), which facilitate health information exchange (HIE) among and between multiple stakeholders
o Master Data Management and Master Patient Index solution vendors, which provide patient databases that enable enterprise-wide patient data management
o Health Information Exchange (HIE) vendors, which enable doctors, nurses, pharmacists, other health care providers, and patients to electronically access and securely share patient data
o Inpatient or outpatient Electronic Health Record (EHR) vendors, who manage patient data from clinicians
EVALUATION QUESTIONS – DATA QUALITY PLANNING
[Table of Data Quality Planning evaluation questions; no context or elaborations are displayed in the table.]
CONTEXTUAL INFORMATION
• Practice evaluation questions are supported by an explanation using health care examples
• Example from Data Profiling:
2.1 Has the organization defined an approach and method for profiling a data set?
A data profiling method is a planned approach to analyzing data sets that is not restricted to a specific technology solution. The method serves as a process guide that defines the types of analyses to be performed, their rationale, relevant scenarios, high-level activity steps, tests and rules to be applied, as well as report templates for results. The goal is to define the process steps and supporting work products so they are reusable across various data stores.
One of the most common types of advanced profiling methods is aimed at the identification and resolution of duplicate records in a data set. The patient matching algorithms used across the healthcare industry are a classic example of both the target objective and the difficulty in stabilizing profiling methods that work. Most algorithms for determining duplicates may require trial and error, as well as customization that is achieved through numerous iterations of data analysis and standardization.
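To ground the example, here is a minimal, hypothetical sketch of the kind of duplicate-candidate profiling pass described above; the record fields, weights, and threshold are illustrative assumptions, not part of the PDDQ or of any production patient matching algorithm.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical demographic records; field names and values are illustrative only.
patients = [
    {"id": 1, "last": "Smith", "first": "John", "dob": "1980-04-02", "zip": "23060"},
    {"id": 2, "last": "Smyth", "first": "Jon",  "dob": "1980-04-02", "zip": "23060"},
    {"id": 3, "last": "Jones", "first": "Mary", "dob": "1975-11-30", "zip": "85001"},
]

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(p, q) -> float:
    """Weighted score across demographic fields; the weights are assumptions."""
    return (0.4 * similarity(p["last"], q["last"])
            + 0.3 * similarity(p["first"], q["first"])
            + 0.2 * (1.0 if p["dob"] == q["dob"] else 0.0)
            + 0.1 * (1.0 if p["zip"] == q["zip"] else 0.0))

# Flag candidate duplicates above an arbitrary threshold for manual review.
THRESHOLD = 0.85
for p, q in combinations(patients, 2):
    score = match_score(p, q)
    if score >= THRESHOLD:
        print(f"Possible duplicate: {p['id']} vs {q['id']} (score={score:.2f})")
```

In practice, as the text notes, such rules and thresholds are tuned iteratively against real data rather than fixed up front.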
	
SAMPLE SUMMARY DIAGRAM
• This example depicts an organization with strong data management capabilities
• HHS ONC recommends that all health care organizations with patient data work to achieve a score of Tier 3 in all PDDQ process areas
• The online PDDQ score graphic is a horizontal stacked bar chart
HOW TO INTERPRET THE SCORES
Process area scores may fall into a range from 0 to 3:
• 0–1 Foundational: data is managed as a requirement for projects and processes
• 1–2 Building: building on the successful completion of Tier 1, data is increasingly managed as a critical infrastructural asset
• 2–3 Advanced: capabilities that comprise completion of the practices needed for a sound and sustainable program for managing patient demographic data across the health care lifecycle
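A minimal sketch of the tier mapping described above follows; the function name and the handling of scores that fall exactly on a boundary are assumptions.

```python
def pddq_tier(score: float) -> str:
    """Map a PDDQ process area score (0-3) to its tier name.

    Treating each upper bound as inclusive is an assumption; the PDDQ
    states the ranges as 0-1, 1-2, and 2-3.
    """
    if not 0.0 <= score <= 3.0:
        raise ValueError("PDDQ process area scores range from 0 to 3")
    if score <= 1.0:
        return "Foundational"
    if score <= 2.0:
        return "Building"
    return "Advanced"

# Example: summarize scores for a few hypothetical process areas.
scores = {"Data Governance": 1.4, "Data Quality Planning": 2.6, "Data Profiling": 0.8}
for area, score in scores.items():
    print(f"{area}: {score} -> {pddq_tier(score)}")
```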
COLLABORATION AND COOPERATION
• Patient data is a common thread across the entire provider organization
o Health care units tend to focus on their mission – diagnostic imaging, pharmacy, laboratory, claims
o Capturing or modifying patient data differently magnifies the potential for duplicates
• PDDQ questions zero in on key processes that may already be performed, but are typically not examined organization-wide
• Participants can raise issues, make collective decisions ("How are we doing?"), and suggest improvements
• The DQC leads the coordination of implementation activities and monitors progress
PRECISE SNAPSHOT OF CURRENT STATE
• The PDDQ means that health care organizations don't have to guess about how they're managing their patient data
• A PDDQ evaluation of current practices will:
o Enable clear identification of any gaps
o Enable staff to understand that their activities affect the quality of patient data (not just 'system problems')
o Create awareness about practices not followed for all care areas
o Discover useful work products elsewhere in the organization, and discover missing work products (e.g., no data entry standards)
o Give everyone a voice – each learns what others are doing, leading to acceptance of results and further cooperation
o Set a baseline for monitoring progress
ACTIONABLE IMPROVEMENTS
• Once gaps and strengths have been identified, the organization can quickly move to establishing new capabilities and creating supporting work products – for example:
o Data Governance – determining the care areas and roles that are engaged with patient demographic data, forming a governance group
o Business Terms – collaborative agreements on demographic data – definitions and values approved by stakeholders across the lifecycle
o Data Quality Plan – an organization-wide plan for improving data quality, which may include new quality rules, registration procedures, data profiling, frequent monitoring, matching algorithm upgrades, etc.
• The organization determines its priorities and a timeline to meet objectives, including resources and roles
IF YOU'RE A HEALTH CARE DATA PROFESSIONAL
• The PDDQ can assist you in many ways:
o Speeds up your analysis of the organization's DM capabilities
o Engages stakeholders and provides consensus pre-approval for initiatives
– Stakeholders see the gaps clearly and understand the recommended improvements
– Facilitates funding and resource commitments
o Opportunity for multiple implementation projects
o Opportunity for a new role – Data Quality Coordinator
• Taken together, this helps accelerate your career
• Recommended first step – master the PDDQ
• Achieve EDME certification – master the recommended method
ACCESS THE PDDQ
• https://www.healthit.gov/playbook/pddq-framework/
LEARN MORE
• http://cmmiinstitute.com/patient-demographic-data-quality-pddq-framework – PDDQ page
• http://cmmiinstitute.com/sites/default/files/resource_asset/PDDQ_White_Paper_0.pdf – PDDQ white paper
• http://cmmiinstitute.com/sites/default/files/resource_asset/PDDQ%20Article.pdf – Article: Improving Patient Data Quality: An Introduction to the Patient Demographic Data Quality (PDDQ) Framework
Questions?
10124 W. Broad Street, Suite C
Glen Allen, Virginia 23060
804.521.4056
More Related Content

What's hot (20)

PDF
A Comparative Study of Data Management Maturity Models
Data Crossroads
 
PDF
Introduction to Data Governance
John Bao Vuu
 
PDF
Data Architecture Strategies
DATAVERSITY
 
PDF
DAS Slides: Building a Data Strategy — Practical Steps for Aligning with Busi...
DATAVERSITY
 
PDF
Review of Data Management Maturity Models
Alan McSweeney
 
PDF
DMBOK 2.0 and other frameworks including TOGAF & COBIT - keynote from DAMA Au...
Christopher Bradley
 
PDF
Master Data Management – Aligning Data, Process, and Governance
DATAVERSITY
 
PDF
Data-Ed Slides: Best Practices in Data Stewardship (Technical)
DATAVERSITY
 
PPTX
Introduction to DCAM, the Data Management Capability Assessment Model - Editi...
Element22
 
PDF
Data Governance Best Practices
DATAVERSITY
 
PDF
Data Quality Best Practices
DATAVERSITY
 
PDF
Glossaries, Dictionaries, and Catalogs Result in Data Governance
DATAVERSITY
 
PDF
Master Data Management - Aligning Data, Process, and Governance
DATAVERSITY
 
PDF
Data Architecture Strategies: Data Architecture for Digital Transformation
DATAVERSITY
 
PDF
Reference master data management
Dr. Hamdan Al-Sabri
 
PPT
MDM Strategy & Roadmap
victorlbrown
 
PDF
Mdm: why, when, how
Jean-Michel Franco
 
PDF
3 Keys To Successful Master Data Management - Final Presentation
James Chi
 
PPTX
Chapter 3: Data Governance
Ahmed Alorage
 
PDF
Data Management vs Data Strategy
DATAVERSITY
 
A Comparative Study of Data Management Maturity Models
Data Crossroads
 
Introduction to Data Governance
John Bao Vuu
 
Data Architecture Strategies
DATAVERSITY
 
DAS Slides: Building a Data Strategy — Practical Steps for Aligning with Busi...
DATAVERSITY
 
Review of Data Management Maturity Models
Alan McSweeney
 
DMBOK 2.0 and other frameworks including TOGAF & COBIT - keynote from DAMA Au...
Christopher Bradley
 
Master Data Management – Aligning Data, Process, and Governance
DATAVERSITY
 
Data-Ed Slides: Best Practices in Data Stewardship (Technical)
DATAVERSITY
 
Introduction to DCAM, the Data Management Capability Assessment Model - Editi...
Element22
 
Data Governance Best Practices
DATAVERSITY
 
Data Quality Best Practices
DATAVERSITY
 
Glossaries, Dictionaries, and Catalogs Result in Data Governance
DATAVERSITY
 
Master Data Management - Aligning Data, Process, and Governance
DATAVERSITY
 
Data Architecture Strategies: Data Architecture for Digital Transformation
DATAVERSITY
 
Reference master data management
Dr. Hamdan Al-Sabri
 
MDM Strategy & Roadmap
victorlbrown
 
Mdm: why, when, how
Jean-Michel Franco
 
3 Keys To Successful Master Data Management - Final Presentation
James Chi
 
Chapter 3: Data Governance
Ahmed Alorage
 
Data Management vs Data Strategy
DATAVERSITY
 

Similar to Implementing the Data Maturity Model (DMM) (20)

PDF
DataEd Slides: Data Management Maturity - Achieving Best Practices Using DMM
DATAVERSITY
 
PDF
Data Ed: Best Practices with the DMM
Data Blueprint
 
PDF
Data-Ed Webinar: Best Practices with the DMM
DATAVERSITY
 
PDF
Data-Ed Online: Data Management Maturity Model
DATAVERSITY
 
PDF
Data-Ed: Best Practices with the Data Management Maturity Model
Data Blueprint
 
PDF
Best Practices with the DMM
DATAVERSITY
 
PDF
Data-Ed Webinar: Implementing the Data Management Maturity Model (DMM) - With...
DATAVERSITY
 
PDF
A Data Management Maturity Model Case Study
DATAVERSITY
 
PDF
DataEd Slides: Data Management Best Practices
DATAVERSITY
 
PDF
DataEd Slides: Data Management Best Practices
DATAVERSITY
 
PPTX
Organizational maturity model pcmm
Daniel Oskooei
 
PDF
Enterprise Data Management Framework Overview
John Bao Vuu
 
PDF
Predictive Analytics - How to get stuff out of your Crystal Ball
DATAVERSITY
 
PDF
Increasing Your Business Data and Analytics Maturity
DATAVERSITY
 
PDF
DataEd Webinar: Reference & Master Data Management - Unlocking Business Value
DATAVERSITY
 
PDF
Dmmaturitymodelscomparison 190513162839
Irina Steenbeek, PhD
 
PPTX
Lgyt6ttftnjihuhunjnnjnrd6tf tfv ytgyuguy-8.pptx
VenkataSujiAparnaSri
 
PDF
Data Maturity - A Balanced Approach
DATAVERSITY
 
PDF
Increasing Your Business Data & Analytics Maturity
Mario Faria
 
PDF
Is Our Information Management Mature?  
DATAVERSITY
 
DataEd Slides: Data Management Maturity - Achieving Best Practices Using DMM
DATAVERSITY
 
Data Ed: Best Practices with the DMM
Data Blueprint
 
Data-Ed Webinar: Best Practices with the DMM
DATAVERSITY
 
Data-Ed Online: Data Management Maturity Model
DATAVERSITY
 
Data-Ed: Best Practices with the Data Management Maturity Model
Data Blueprint
 
Best Practices with the DMM
DATAVERSITY
 
Data-Ed Webinar: Implementing the Data Management Maturity Model (DMM) - With...
DATAVERSITY
 
A Data Management Maturity Model Case Study
DATAVERSITY
 
DataEd Slides: Data Management Best Practices
DATAVERSITY
 
DataEd Slides: Data Management Best Practices
DATAVERSITY
 
Organizational maturity model pcmm
Daniel Oskooei
 
Enterprise Data Management Framework Overview
John Bao Vuu
 
Predictive Analytics - How to get stuff out of your Crystal Ball
DATAVERSITY
 
Increasing Your Business Data and Analytics Maturity
DATAVERSITY
 
DataEd Webinar: Reference & Master Data Management - Unlocking Business Value
DATAVERSITY
 
Dmmaturitymodelscomparison 190513162839
Irina Steenbeek, PhD
 
Lgyt6ttftnjihuhunjnnjnrd6tf tfv ytgyuguy-8.pptx
VenkataSujiAparnaSri
 
Data Maturity - A Balanced Approach
DATAVERSITY
 
Increasing Your Business Data & Analytics Maturity
Mario Faria
 
Is Our Information Management Mature?  
DATAVERSITY
 
Ad

More from DATAVERSITY (20)

PDF
Architecture, Products, and Total Cost of Ownership of the Leading Machine Le...
DATAVERSITY
 
PDF
Data at the Speed of Business with Data Mastering and Governance
DATAVERSITY
 
PDF
Exploring Levels of Data Literacy
DATAVERSITY
 
PDF
Make Data Work for You
DATAVERSITY
 
PDF
Data Catalogs Are the Answer – What is the Question?
DATAVERSITY
 
PDF
Data Catalogs Are the Answer – What Is the Question?
DATAVERSITY
 
PDF
Data Modeling Fundamentals
DATAVERSITY
 
PDF
Showing ROI for Your Analytic Project
DATAVERSITY
 
PDF
How a Semantic Layer Makes Data Mesh Work at Scale
DATAVERSITY
 
PDF
Is Enterprise Data Literacy Possible?
DATAVERSITY
 
PDF
The Data Trifecta – Privacy, Security & Governance Race from Reactivity to Re...
DATAVERSITY
 
PDF
Emerging Trends in Data Architecture – What’s the Next Big Thing?
DATAVERSITY
 
PDF
Data Governance Trends - A Look Backwards and Forwards
DATAVERSITY
 
PDF
Data Governance Trends and Best Practices To Implement Today
DATAVERSITY
 
PDF
2023 Trends in Enterprise Analytics
DATAVERSITY
 
PDF
Data Strategy Best Practices
DATAVERSITY
 
PDF
Who Should Own Data Governance – IT or Business?
DATAVERSITY
 
PDF
Data Management Best Practices
DATAVERSITY
 
PDF
MLOps – Applying DevOps to Competitive Advantage
DATAVERSITY
 
PDF
Keeping the Pulse of Your Data – Why You Need Data Observability to Improve D...
DATAVERSITY
 
Architecture, Products, and Total Cost of Ownership of the Leading Machine Le...
DATAVERSITY
 
Data at the Speed of Business with Data Mastering and Governance
DATAVERSITY
 
Exploring Levels of Data Literacy
DATAVERSITY
 
Make Data Work for You
DATAVERSITY
 
Data Catalogs Are the Answer – What is the Question?
DATAVERSITY
 
Data Catalogs Are the Answer – What Is the Question?
DATAVERSITY
 
Data Modeling Fundamentals
DATAVERSITY
 
Showing ROI for Your Analytic Project
DATAVERSITY
 
How a Semantic Layer Makes Data Mesh Work at Scale
DATAVERSITY
 
Is Enterprise Data Literacy Possible?
DATAVERSITY
 
The Data Trifecta – Privacy, Security & Governance Race from Reactivity to Re...
DATAVERSITY
 
Emerging Trends in Data Architecture – What’s the Next Big Thing?
DATAVERSITY
 
Data Governance Trends - A Look Backwards and Forwards
DATAVERSITY
 
Data Governance Trends and Best Practices To Implement Today
DATAVERSITY
 
2023 Trends in Enterprise Analytics
DATAVERSITY
 
Data Strategy Best Practices
DATAVERSITY
 
Who Should Own Data Governance – IT or Business?
DATAVERSITY
 
Data Management Best Practices
DATAVERSITY
 
MLOps – Applying DevOps to Competitive Advantage
DATAVERSITY
 
Keeping the Pulse of Your Data – Why You Need Data Observability to Improve D...
DATAVERSITY
 
Ad

Recently uploaded (20)

PDF
20250703_A. Stotz All Weather Strategy - Performance review July
FINNOMENAMarketing
 
PDF
NewBase 03 July 2025 Energy News issue - 1799 by Khaled Al Awadi_compressed.pdf
Khaled Al Awadi
 
PDF
_How Freshers Can Find the Best IT Companies in Jaipur with Salarite.pdf
SALARITE
 
PPTX
Hackathon - Technology - Idea Submission Template -HackerEarth.pptx
nanster236
 
PDF
Cloud Budgeting for Startups: Principles, Strategies, and Tools That Scale
Amnic
 
PDF
"Complete Guide to the Partner Visa 2025
Zealand Immigration
 
PDF
Gabino Barbosa - A Master Of Efficiency
Gabino Barbosa
 
PDF
Top 10 Emerging Tech Trends to Watch in 2025.pdf
marketingyourtechdig
 
PDF
FastnersFastnersFastnersFastnersFastners
mizhanw168
 
PPTX
Phygital & Omnichannel Retail: Navigating the Future of Seamless Shopping
RUPAL AGARWAL
 
PDF
Jordan Minnesota City Codes and Ordinances
Forklift Trucks in Minnesota
 
PPTX
Sustainability Strategy ESG Goals and Green Transformation Insights.pptx
presentifyai
 
PPTX
Asia Pacific Tropical Fruit Puree Market Overview & Growth
chanderdeepseoexpert
 
PPTX
Oil and Gas EPC Market Size & Share | Growth - 2034
Aman Bansal
 
PDF
Choosing the Right Packaging for Your Products – Sriram Enterprises, Tirunelveli
SRIRAM ENTERPRISES, TIRUNELVELI
 
PDF
Step-by-Step: Buying a Verified Cash App Accounts| PDF | Payments Service
https://pvabulkpro.com/
 
PPTX
SYMCA LGP - Social Enterprise Exchange.pptx
Social Enterprise Exchange
 
PDF
Top Supply Chain Management Tools Transforming Global Logistics.pdf
Enterprise Wired
 
PPTX
Top Oil and Gas Companies in India Fuelling the Nation’s Growth.pptx
Essar Group
 
PPTX
Business profile making an example ppt for small scales
Bindu222929
 
20250703_A. Stotz All Weather Strategy - Performance review July
FINNOMENAMarketing
 
NewBase 03 July 2025 Energy News issue - 1799 by Khaled Al Awadi_compressed.pdf
Khaled Al Awadi
 
_How Freshers Can Find the Best IT Companies in Jaipur with Salarite.pdf
SALARITE
 
Hackathon - Technology - Idea Submission Template -HackerEarth.pptx
nanster236
 
Cloud Budgeting for Startups: Principles, Strategies, and Tools That Scale
Amnic
 
"Complete Guide to the Partner Visa 2025
Zealand Immigration
 
Gabino Barbosa - A Master Of Efficiency
Gabino Barbosa
 
Top 10 Emerging Tech Trends to Watch in 2025.pdf
marketingyourtechdig
 
FastnersFastnersFastnersFastnersFastners
mizhanw168
 
Phygital & Omnichannel Retail: Navigating the Future of Seamless Shopping
RUPAL AGARWAL
 
Jordan Minnesota City Codes and Ordinances
Forklift Trucks in Minnesota
 
Sustainability Strategy ESG Goals and Green Transformation Insights.pptx
presentifyai
 
Asia Pacific Tropical Fruit Puree Market Overview & Growth
chanderdeepseoexpert
 
Oil and Gas EPC Market Size & Share | Growth - 2034
Aman Bansal
 
Choosing the Right Packaging for Your Products – Sriram Enterprises, Tirunelveli
SRIRAM ENTERPRISES, TIRUNELVELI
 
Step-by-Step: Buying a Verified Cash App Accounts| PDF | Payments Service
https://pvabulkpro.com/
 
SYMCA LGP - Social Enterprise Exchange.pptx
Social Enterprise Exchange
 
Top Supply Chain Management Tools Transforming Global Logistics.pdf
Enterprise Wired
 
Top Oil and Gas Companies in India Fuelling the Nation’s Growth.pptx
Essar Group
 
Business profile making an example ppt for small scales
Bindu222929
 

Implementing the Data Maturity Model (DMM)

  • 1. Presented by Melanie Mecca & Peter Aiken, Ph.D. Data Management Maturity Achieving Best Practices using DMM Copyright 2013 by Data Blueprint • Motivation - Are we satisfied with current performance of DM? • How did we get here? - Building on previous research • What is the Data Management Maturity Model? - Ever heard of CMM/CMMI? • How should it be used? - Use Cases and Value Proposition • Where to next? • Q & A? Outline: Design/Manage Data Structures 2
  • 2. !3 Guided Navigation to Lasting Solutions • Architecture & technology neutral • Industry independent • Answers: “How are we doing?” • Guides: “What should we do next?” • Baseline for: o Managing data as a critical asset o Creating a tailored data management strategy o Accelerating an existing program o Engaging stakeholders o Pinpointing high value initiatives. !4 Foundation for Business Results • Trusted Data – demonstrated, independently measured capability to ensure customer confidence in the data • Improved Risk and Analytics Decisions –comprehensive and measured DM strategy ensures decisions are based on accurate data • Cost Reduction/Operational Efficiency –identification of current and target states supports elimination of redundant data and streamlining of processes • Regulatory Compliance – independently evaluated and measured DM capabilities to meet and substantiate industry and regulator requirements.  
  • 3. Copyright 2013 by Data Blueprint • Motivation - Are we satisfied with current performance of DM? • How did we get here? - Building on previous research • What is the Data Management Maturity Model? - Ever heard of CMM/CMMI? • How should it be used? - Use Cases and Value Proposition • Where to next? • Q & A? Outline: Data Management Maturity 5 Copyright 2013 by Data Blueprint Motivation • "We want to move our data management program to the next level" – Question: What level are you at now? • You are currently managing your data, – But, if you can't measure it, – How can you manage it effectively? • How do you know where to put time, money, and energy so that data management best supports the mission? "One day Alice came to a fork in the road and saw a Cheshire cat in a tree. Which road do I take? she asked. Where do you want to go? was his response. I don't know, Alice answered. Then, said the cat, it doesn't matter." Lewis Carroll from Alice in Wonderland 6
  • 4. Copyright 2013 by Data Blueprint DoD Origins • US DoD Reverse Engineering Program Manager • We sponsored research at the CMM/SEI asking – “How can we measure the performance of DoD and our partners?” – “Go check out what the Navy is up to!” • SEI responded with an integrated process/data improvement approach – DoD required SEI to remove the data portion of the approach – It grew into CMMI/DM BoK, etc. 7 Copyright 2013 by Data Blueprint Acknowledgements version (changing data into other forms, states, or products), or scrubbing (inspecting and manipulat- ing, recoding, or rekeying data to prepare it for sub- sequent use). • Approximately two-thirds of organizational data Increasing data management practice maturity levels can positively impact the coordination of data flow among organizations,individuals,and systems. Results from a self-assessment provide a roadmap for improving organizational data management practices. Peter Aiken, Virginia Commonwealth University/Institute for Data Research M. David Allen, Data Blueprint Burt Parker, Independent consultant Angela Mattia, J. Sergeant Reynolds Community College A s increasing amounts of data flow within and between organizations, the problems that can result from poor data management practices are becoming more apparent. Studies have shown that such poor practices are widespread. Measuring Data Management Practice Maturity: A Community’s Self-Assessment MITRE Corporation: Data Management Maturity Model • Internal research project: Oct ‘94-Sept ‘95 • Based on Software Engineering Institute Capability Maturity Model (SEI CMMSM) for Software Development Projects • Key Process Areas (KPAs) parallel SEI CMMSM KPAs, but with data management focus and key practices • Normative model for data management required; need to: – Understand scope of data management – Organize data management key practices • Reported as not-done-well by those who do it 8
  • 5. !9 CMMI Institute Background • Evolved from Carnegie Mellon’s Software Engineering Institute (SEI) - a federally funded research and development center (FFRDC) • Continues to support and provide all CMMI offerings and services delivered over its 20+ year history at the SEI o Industry leading reference models - benchmarks and guidelines for improvement – Development, Acquisition, Services, People, Data Management o Training and Certification program, Partner program • Dedicated training, partner and certification teams to support organizations and professionals • Now owned by ISACA (CISO/M, COBIT, IT Governance, Cybersecurity) and joint product offerings are planned !10 CMMI – Worldwide Process Improvement CMMI Quick Stats: • Over 10,000 organizations • 94 countries • 12 National governments • 10 languages • 500 Partners • 1900+ Appraisals in 2016
  • 6. Copyright 2013 by Data Blueprint Source: Applications Executive Council, Applications Budget, Spend, and Performance Benchmarks: 2005 Member Survey Results, Washington D.C.: Corporate Executive Board 2006, p. 23. Percentage of Projects on Budget By Process Framework Adoption …while the same pattern generally holds true for on-time performance Percentage of Projects on Time By Process Framework Adoption Key Finding: Process Frameworks are not Created Equal With the exception of CMM and ITIL, use of process-efficiency 
 frameworks does not predict higher on-budget project delivery… 11 Copyright 2013 by Data Blueprint CMMI Model Portfolio 12 Establish, Manage, and Deliver Services Product Development / Software Engineering Acquire and integrate products / supply chain Workforce development and management Rearchitecting to present a more unified/modular offering
  • 7. !13 DMM and DMBOK CMMI Institute and DAMA International are collaborating to: • Eliminate any confusion between the two tools and highlight their complementarity • Extend and enhance data management training for organizations and professionals • Provide benefits to DAMA members (members receive a discount for our public training classes) Copyright 2013 by Data Blueprint • Motivation - Are we satisfied with current performance of DM? • How did we get here? - Building on previous research • What is the Data Management Maturity Model? - Ever heard of CMM/CMMI? • How should it be used? - Use Cases and Value Proposition • Where to next? • Q & A? Outline: Data Management Maturity 14
  • 8. !15 Data Management Maturity (DMM)SM Model • DMM 1.0 released August 2014 o 3.5 years in development o Sponsors – Microsoft, Lockheed Martin, Booz Allen Hamilton o 50+ contributing authors, 70+ peer reviewers, 80+ orgs • Reference model framework of fundamental best practices o 414 specific practice statements o 596 functional work products o Maturity practices • Measurement Instrument for organizations to evaluate capabilities and maturity, identify gaps, and incorporate guidelines for improvements. !16 “You Are What You DO” • Model emphasizes behavior o Proactive positive behavioral changes o Creating and carrying out effective, repeatable processes o Leveraging and extending across the organization • Activities result in work products o Processes, standards, guidelines, templates, policies, etc. o Reuse and extension = maximum value, lower costs, happier staff • Practical focus reflects real- world organizations – enterprise program evolving to all hands on deck.
 • 9. One concept for process improvement; others include Norton Stage Theory, TQM, TQdM, TDQM, and ISO 9000, all of which focus on understanding current processes and determining where to make improvements. Copyright 2013 by Data Blueprint DMM Capability Maturity Model Levels • Performed (1) – Our DM practices are informal and ad hoc, dependent upon "heroes" and heroic efforts • Managed (2) – Our DM practices are defined and documented processes performed at the business unit level • Defined (3) – Our DM efforts remain aligned with business strategy using standardized and consistently implemented practices • Measured (4) – We manage our data as an asset using advantageous data governance practices/structures • Optimized (5) – DM is a strategic organizational capability; most importantly, we have a process for improving our DM capabilities 17 !18 DMM Capability Levels [Diagram: capability levels 1–5 (Performed, Managed, Defined, Measured, Optimized), contrasting risk, ad hoc effort, and stress against quality, reuse, and clarity as maturity increases] Capability – "We can do this" • Specific Practices – "We're doing it well" • Work Products – "We've documented the processes we are following" (processes, work products, guidelines, standards, etc.) Maturity – "…and we can prove it" • Process Stability & Resilience – "Take it to the bank" • Ensures Repeatability • Policy, Training, Quality Assurance, etc.
 • 10. DMM Structure • Each Process Area (within a Core Category) contains: Purpose, Introductory Notes, Goal(s) of the Process Area, Core Questions for the Process Area, Functional Practices (Levels 1-5), Related Process Areas, Example Work Products, and Infrastructure Support Practices • Some of these components are required for model compliance; others are explanatory model components (a purely illustrative sketch of this structure follows below) !19 Maintain fit-for-purpose data, efficiently and effectively – DMM℠ Structure of 5 Integrated DM Practice Areas: Manage data coherently • Manage data assets professionally • Data architecture implementation • Data lifecycle implementation • Organizational support 20 Copyright 2015 by Data Blueprint
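As flagged above, here is a purely illustrative sketch of that structure: one way a single DMM process area and its model components might be represented in code. The class name, field names, and example content are assumptions for illustration only, not part of the model itself.

```python
# Illustrative sketch only: a possible in-code representation of one DMM
# process area and the model components listed above. All names are assumptions.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ProcessArea:
    core_category: str                    # the practice area grouping
    name: str                             # process area name
    purpose: str                          # purpose statement
    goals: List[str]                      # goal(s) of the process area
    core_questions: List[str]             # core questions for the process area
    functional_practices: Dict[int, List[str]] = field(default_factory=dict)  # keyed by level 1-5
    infrastructure_support_practices: List[str] = field(default_factory=list)
    introductory_notes: str = ""                                   # explanatory component
    related_process_areas: List[str] = field(default_factory=list)  # explanatory component
    example_work_products: List[str] = field(default_factory=list)

# Hypothetical example instance:
profiling = ProcessArea(
    core_category="Data Quality",
    name="Data Profiling",
    purpose="Understand the content, structure, and quality of data sets",
    goals=["A reusable profiling approach is defined and followed"],
    core_questions=["Has the organization defined an approach and method for profiling a data set?"],
)
```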
 • 11. !21 Data Management Strategy – Planning for and managing data assets as a critical component of infrastructure, emphasizing an organization-wide approach and program versus project by project, data store by data store. !22 Data Governance – Implementing the building, nurturing, sustaining, and controlling power of collective decision-making, and harnessing staff expertise for collaborative development of knowledge management.
 • 12. !23 Data Quality – Comprises a 360-degree and extensible approach to improving the quality of data organization-wide through thoughtful planning and integrated best practices. !24 Data Operations – Ensures that requirements for data are specified and linked to business processes and metadata, enables data lineage and authoritative sources, and exercises controls and quality improvements for data provided.
 • 13. !25 Platform and Architecture – Key considerations for developing a well-organized data layer that meets business needs, with appropriate technologies, enabling integration, interoperability, and data provisioning. !26 Supporting Processes – Practices that implement organization and control for all data management processes, such as: developing and monitoring metrics; managing risks, configurations, process quality and work products.
 • 14. Copyright 2013 by Data Blueprint • Motivation - Are we satisfied with current performance of DM? • How did we get here? - Building on previous research • What is the Data Management Maturity Model? - Ever heard of CMM/CMMI? • How should it be used? - Use Cases and Value Proposition • Where to next? • Q & A? Outline: Data Management Maturity 27 Copyright 2013 by Data Blueprint Assessment Components • Data Management Practice Areas: Data Management Strategy – DM is practiced as a coherent and coordinated set of activities; Data Quality – Delivery of data in support of organizational objectives, the currency of DM; Data Governance – Designating specific individuals as caretakers for certain data; Data Platform/Architecture – Efficient delivery of data via appropriate channels; Data Operations – Ensuring reliable access to data • Capability Maturity Model Levels (examples of practice maturity): 1 – Performed: Our DM practices are ad hoc and dependent upon "heroes" and heroic efforts; 2 – Managed: We have DM experience and have the ability to implement disciplined processes; 3 – Defined: We have standardized DM practices so that everyone in the organization can perform them with uniform quality; 4 – Measured: We manage our DM processes so that the whole organization can follow our standard DM guidance; 5 – Optimized: We have a process for improving our DM capabilities 28
 • 15. Copyright 2013 by Data Blueprint Industry Focused Results • CMU's Software Engineering Institute (SEI) Collaboration • Results from hundreds of organizations in various industries, including: ✓ Public Companies ✓ State Government Agencies ✓ Federal Government ✓ International Organizations • Defined industry standard • Steps toward defining data management "state of the practice" 29 [Diagram: five DMM categories – Data Management Strategy, Data Governance, Platform & Architecture, Data Quality, Data Operations – grouped under two focuses: Guidance and Facilitation, and Implementation and Access] [Chart: Data Management Practices Assessment – maturity levels Initial (I) through Optimized (V) on a 0–5 scale, comparing Client, Industry Competition, and All Respondents across Data Program Coordination, Organizational Data Integration, Data Stewardship, Data Development, and Data Support Operations; challenge areas flagged; example practices include development guidance, data administration, support systems, asset recovery capability, and development training] 30 Copyright 2015 by Data Blueprint
 • 16. High Marks for IFC's Audit 31 Copyright 2015 by Data Blueprint [Chart: scores 0–5 for Leadership & Guidance, Asset Creation, Metadata Management, Quality Assurance, Change Management, and Data Quality, comparing TRE, ISG, IFC, Industry Benchmarks, and Overall Benchmarks] [Chart: Comparison of DM Maturity 2007-2012 – maturity levels 1–5 for Data Program Coordination, Organizational Data Integration, Data Stewardship, Data Development, and Data Support Operations, comparing 2007 and 2012 levels] 32 Copyright 2015 by Data Blueprint
 • 17. !33 Copyright 2015 by Data Blueprint improving how the state prices and sells its goods and services, and more efficiently matching citizens to benefits when they enroll. “The first year of our data internship partnership has been a success,” said Governor McAuliffe. “The program has helped the state save time and money by making some of our internal processes more efficient and modern. And it has given students valuable real-world experience. I look forward to seeing what the second year of the program can accomplish.” “Data is an important resource that becomes even more critical as technology progresses,” said VCU President Michael Rao, Ph.D. “VCU is uniquely positioned, both in its location and through the wealth of talent at the School of Business, to help state agencies run their data-centric systems more efficiently, while giving our students hands-on practice in the development of data systems.” During their internships, pairs of VCU students work closely with state agency CIOs to identify specific business cases in which data can be used. Participants gain practical experience in using data to drive re-engineering, while participating CIOs have concrete examples of how to make better use of data to provide innovative and less costly services to citizens. “Working with the talented VCU students gave us a different perspective on what the data was telling us,” said Dave Burhop, Deputy Commissioner/CIO of the Virginia Department of Motor Vehicles. “The VCU interns provided an invaluable resource to the Governor’s Coordinating Council on Homelessness,” said Pamela Kestner, Special Advisor on Families, Children and Poverty. “They very effectively reviewed the data assets available in the participating state agencies and identified analytic content that can be used to better serve the homeless population.” “It's always useful to have ‘fresh eyes’ on data that we are used to seeing,” said Jim Rothrock, Commissioner of the Department for Aging and Rehabilitative Services. “Our interns challenged us and the way we interpret data. It was refreshing and useful, and we cannot wait for new experiences with new students.” The data internships support Governor McAuliffe’s ongoing initiative to provide easier access to open data in Virginia. The internships also support treating data as an enterprise asset, one of four strategic goals of the enterprise information architecture strategy adopted by the Commonwealth in August 2013. Better use of data allows the Commonwealth to identify opportunities to avoid duplicative costs in collecting, maintaining and using information; and to integrate services across agencies and localities to improve responses to constituent needs and optimize government resources. Virginia Secretary of Technology Karen Jackson and CIO of the Commonwealth Nelson Moe are leading the effort on behalf of the state. Students who want to apply for internships should contact Peter Aiken ([email protected]) for additional information. Governor's Data Interns Program !34 Using DMM in the State of Arizona • Policies drive change in state government • Base policies on a widely-accepted framework
 • 18. !35 DMM supports Arizona Strategy • Metrics - DMM provides a measurement methodology • Enterprise Architecture - DMM provides gap analysis and a path forward • Emphasis on Lean - DMM drives towards eliminating silos for improved efficiency !36 DMM in Arizona – Current State • Introduced DMM at annual Arizona Data Management Conference in January 2016 • Wide buy-in from multiple agencies • “Building EDM Capabilities” training for 20 students from 11 agencies in March 2017
 • 19. !37 DMM in Arizona – Next Steps • Students want advanced training • Students want to help other agencies – DMM “Swat Team” • 2nd Annual Data Management Conference – April 26–27 • Participating in Governor’s Goal Council • Planning DMM assessments for 3-4 agencies • DMM adds structure and lends credibility to the state DM Program Natural events for employing the DMM • Use Cases - assess current capabilities before: • Developing or enhancing a DM program / strategy • Embarking on a major architecture transformation • Establishing data governance • Expansion / enhancement of analytics • Implementing a data quality program • Implementing a metadata repository • Designing and implementing multi-LOB solutions: • Master Data Management • Shared Data Services • Enterprise Data Warehouse • Implementing an ERP • Other multi-business line efforts. Like an energy audit or an executive physical !38
 • 20. Starting the Journey - DMM Assessment Method • To maximize the DMM’s value as a catalyst for forging shared perspective and accelerating programs, our method provides: – Collaboration launch event with a broad range of stakeholders – Capabilities evaluated by consensus affirmations – Solicits key business input through supplemental interviews – Verifies evaluation with work product reviews (evidence) – Report and executive briefing present Scoring, Findings, Observations, Strengths, and customized, specific Recommendations. To date, over 800 assessment participants from business, IT, and data management have employed DMM 1.0 - practice by practice, work product by work product - to evaluate their capabilities. DMM Assessment Summary
 Sample Organization !40
 • 21. !41 Cumulative Benchmark – Multiple organizations !42 DMM Training and Certification Current Offerings • Building EDM Capabilities o Instructor-led 3-day interactive class o eLearning – web-based 8-10 hour class • Advancing EDM Capabilities o Instructor-led 5-day interactive class • Enterprise Data Management Expert (EDME) o Instructor-led 5-day interactive class, preparation for EDME certification • (Near Future) DMM Associate certification.
  • 22. ENTERPRISE DATA MANAGEMENT VISION • The State of Arizona is conducting a multi-year statewide data management initiative to support: o The Governor’s “Arizona Management System,” an innovative Lean based approach instituted by the current governor to streamline & meet the state’s goal to be: “The #1 state to live, work, play, recreate, retire, visit, do business, and get an education.” o Employee education to empower them to employ standards, methods and tools for data-driven decision-making and disciplined problem solving, which lead to greater creativity, control, and productivity in citizen service o Arizona realizes that trusted, accurate and sustainable data is essential to successful data-driven decision making and accurate metrics, and that a state-wide enterprise data management platform is critical o The state is also focusing on modernizing its data layer and technology stack to enable needed data sharing and protect privacy. MULTI-YEAR EDM PROGRAM • The Arizona Strategic Enterprise Technology office, within the Department of Administration, developed a five-year Indefinite Delivery Indefinite Quantity (IDIQ) contract to include the following services (selected): o Review, assess and measure data management maturity, using the CMMI Institute’s Data Management Maturity Model as a framework o Make recommendations for a roadmap to maturity for state agencies to achieve target maturity levels o Train and educate ADOA-ASET, business data stewards, and management staff to perform DMM Assessments and appraisals o Recommend policies, standards, and processes o Provide advisory services for best practice implementation.
  • 23. DMM AND DATA DRIVEN DECISION MAKING • The CMMI Institute was among the three vendors selected to provide the IDIQ’s broad range of services over a five-year period, along with Quest Communications and Data Blueprint • We are working with the State of Arizona to implement a statewide data management program to BUILD*IMPROVE*MEASURE its data management processes and capabilities utilizing the DMM. • We are teaming with the ADOA-ASET to help construct measurable inter- and intra-agency DMM programs to: o Provide education in data management best practices o Evaluate data management practices in numerous agencies o Assist the state in evolving a trusted data environment o Help ASET develop the roles and resources to support the state-wide program. USING DMM IN THE STATE OF ARIZONA • Policies are driving collaboration and change in state government • Basing data management policies on a widely-accepted framework
 • 24. DMM SUPPORTS ARIZONA STRATEGY • Metrics – AMS requires metrics-driven decisions, and DMM provides a measurement methodology • Enterprise Architecture - DMM provides gap analysis and a path forward for data sharing • Emphasis on Lean - DMM drives towards eliminating silos for improved effectiveness and efficiency DMM IN ARIZONA – RECENT ACTIVITIES • Introduced DMM at annual Arizona Data Management Conference in January 2016 • Wide buy-in from multiple agencies • Held CMMI Institute capability briefing for key state data management personnel • “Building EDM Capabilities” training for 20 students from 11 agencies in March 2017
  • 25. 2ND ANNUAL DATA MANAGEMENT CONFERENCE • Hosted by ADOA-ASET, led by our POC Jeff Wolkove, Enterprise Architect • Over 75 attendees representing 40+ agencies • Keynote speaker was State Chief Operating Officer, Henry Darwin • Presented at state-wide Data Management Conference April 2017 o Moderated data management panel discussion o Overview - Data Management Capability Evaluation o Role of Data Governance in EDM o Break Down Silos by Managing Data as Meaning (seminar) DMM IN ARIZONA – NEXT STEPS • Building EDM Capabilities attendees want advanced training in the DMM • Students want to help other agencies – DMM “Swat Teams” • ASET is participating in Governor’s Goal Council representing data management issues • Planning DMM assessments for 3-4 agencies • Department of Corrections - Jun 2017 • Department of Water Resources - Jul 2017 • Health Care Cost Containment System – Aug 2017 • And more in the works…. • DMM adds structure and lends credibility to the state Data Management Program • Arizona has set a target for 35 cabinet-level agencies to achieve DMM Level 3 in the next three years.
 • 26. OPPORTUNITIES FOR STATES AND TERRITORIES • Arizona is providing leadership and vision which can be leveraged for other states and territories • Analytics and automation, the internet of things, and unstructured data are testing the status quo of staffing, technology, and data management needs • States are undertaking modernization of their technologies and processes • The Arizona RFP has been provided to New York and Virginia, which has begun data management capability evaluation for multiple agencies through the Virginia Commonwealth University graduate school • We presented a webinar on state-wide data management programs, featuring Jeff Wolkove, to the National Association of State Chief Information Officers (NASCIO) in April 2017 • We would like to work with additional states; this will increase adoption of the DMM as a de facto global standard and provide opportunities for our Partners and EDMEs. WHAT CAN GO WRONG? • It’s estimated that the average hospital has 8-12% duplicate records, and as many as 10% of incoming patients are misidentified • Sharing of patient data from disparate providers increases the likelihood of duplicates for HIEs due to defects in MPIs • Preventable medical errors may include: o Misdiagnosis and incorrect treatment procedures o Incorrect or wrong dose of medication o Incorrect blood type o Unknown medication allergies o Repeated diagnostic tests • Cost to correct a duplicate record estimated at $1000 • Claims may be rejected • Increased risk of lawsuits.
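To make the scale of the duplicate-record problem concrete, here is a back-of-envelope estimate using the figures cited above (8-12% duplicates, roughly $1,000 to correct each). The record count is a hypothetical assumption for illustration only.

```python
# Back-of-envelope estimate using the duplicate-rate and cost figures cited above.
# The record count is a hypothetical assumption, not a figure from the source.
records = 500_000            # assumed size of a provider's master patient index
duplicate_rate = 0.10        # within the cited 8-12% range
cost_per_duplicate = 1_000   # cited estimate to correct one duplicate record

duplicates = records * duplicate_rate
estimated_cost = duplicates * cost_per_duplicate
print(f"~{duplicates:,.0f} duplicate records, ~${estimated_cost:,.0f} to correct")
# -> ~50,000 duplicate records, ~$50,000,000 to correct
```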
  • 27. IMPROVED DATA QUALITY LEADS TO MANY BENEFITS • Decreased operational risk through improvements to the quality of patient demographic data. Specifically, patient safety is protected, and the delivery of patient care improves
 • Increased operational efficiency, requiring less manual effort to fix data issues, fewer duplicate test orders for patients, and adoption of standard data representations
 • Improved interoperability and data integration through adopting data standards and data management practices that are followed by staff across the patient lifecycle
 • Improved staff productivity, with fewer hours spent detecting and remediating data defects in order to perform their tasks • Increased staff awareness of contributing to and following processes that improve patient identity integrity THE DMM WAS SELECTED AS THE FOUNDATION • An HHS ONC Community of Practice (CoP) for patient demographic data quality was convened in 2015 with government, health care provider, and industry association members. The CoP analyzed available data management frameworks to determine which encompassed the practices that health care providers and staff needed to improve patient data quality across the entire organization.
 • They determined that the DMM was most closely aligned with the needs of the health care industry for a comprehensive standard, due to its: o Behavioral, fact-based approach o Organization-wide scope o Built-in path for improvements o Technology and architecture neutrality o Support for a rapid and precise evaluation of the current state of data management practices
  • 28. DEVELOPMENT ACTIVITIES WITH HHS ONC • CMMI Institute was commissioned to develop a practical, implementable framework for HHS ONC - focused on advancing the goal of improved patient data quality: o Describing the organizational behavior reflecting sound data management principles in a health care context o Specific data management practices in the context of patient demographic data o Set of work products (policies, plans, processes, standards, etc.) supporting the new or improved capabilities
 • Working with HHS ONC and our prime, Audacious Inquiry, the team researched ONC publications, industry surveys and studies. We then analyzed the DMM to determine the process areas (topics / disciplines) and functional practices addressing the scope: o Demographic data used for record matching o Relevant to the patient care lifecycle and stakeholders o Most critical to assessing and monitoring demographic data quality. ONC’s “Patient Identification and Matching Final Report,” February 7, 2014 DMM TRANSFORMATION RESULTS • Eliminated Process Areas less directly related to the scope - 19 versus 25
 • Practice statements were transformed into questions – 76 versus 414 statements o Invites self-administration, elicits discussion o Encourages stakeholder representation across patient care lifecycle
 • Five capability Levels 1-5 became three capability Tiers – Foundational, Building, and Advanced
 • Example work products – 152 versus 596
 • 96 versus 240+ pages
 • Contextual examples throughout Patient Demographic Data Quality Framework
 • 29. DESIGNED FOR A COLLABORATIVE APPROACH • Managing data is primarily a people problem, not a ‘system problem’ • Patient data primarily originates in Registration • However, it can be modified at any point throughout the patient care lifecycle • No one individual knows everything about the patient data • Industry-wide, realization of the long-term goal – adoption of consistent data standards and standard matching algorithms – would increase interoperability and minimize duplicates • Staff who have access to the patient record should be engaged in decisions about standards, formats, terms, values, etc. • PDDQ questions are designed to be posed in a group setting • Consensus decisions about the current state • Key processes and standards should be agreed to, standardized, and followed. HHS ONC ADVOCATES A NEW ROLE • Who should lead data quality plans and improvements within a health care provider organization? ONC proposed that a Data Quality Coordinator (DQC) for patient demographic data should be designated for each organization o Leads the PDDQ evaluation effort o Coordinates process improvement efforts, for example: • Establishing data governance • Defining terms, formats and values for demographic data • Developing quality rules • Creating a data quality plan for the organization o In a small practice, this may be one individual or a part-time role; in a large practice, it may be a group led by a DQC
 • Establishing this role demonstrates commitment
 • 30. AUDIENCE FOR THE PDDQ • The PDDQ was designed for any organization creating, managing, or aggregating patient data (e.g., master patient index, MPI): o Hospitals and Health systems which deliver patient care o Health Information Organizations (HIOs) which facilitate health information exchange (HIE) among and between multiple stakeholders o Master Data Management and Master Patient Index solution vendors which provide patient databases that enable enterprise-wide patient data management o Health Information Exchange (HIE) vendors which enable doctors, nurses, pharmacists, other health care providers and patients to electronically access and securely share patient data o Inpatient or outpatient Electronic Health Record (EHR) vendors who manage patient data from clinicians EVALUATION QUESTIONS – DATA QUALITY PLANNING [Table of sample evaluation questions not reproduced; no context or elaborations are displayed in the table]
 • 31. CONTEXTUAL INFORMATION • Practice Evaluation Questions are supported by an explanation using health care examples • Example from Data Profiling: 2.1 Has the organization defined an approach and method for profiling a data set? A data profiling method is a planned approach to analyzing data sets that is not restricted to a specific technology solution. The method serves as a process guide that defines the types of analyses to be performed, their rationale, relevant scenarios, high-level activity steps, tests and rules to be applied, as well as report templates for results. The goal is to define the process steps and supporting work products so they are reusable across various data stores. One of the most common types of advanced profiling methods is aimed at the identification and resolution of duplicate records in a data set. The patient matching algorithms used across the healthcare industry are a classic example of both the target objective and the difficulty in stabilizing profiling methods that work. Most algorithms for determining duplicates may require trial and error, as well as customization that is achieved through numerous iterations of data analysis and standardization. (A simple, illustrative sketch of one such profiling step appears after this slide.) SAMPLE SUMMARY DIAGRAM • This example depicts an organization with strong data management capabilities
 • HHS ONC recommends that all health care organizations with patient data work to achieve a score of Tier 3 in all PDDQ Process Areas
 • The online PDDQ score graphic is a horizontal stacked bar chart
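As noted in the profiling example above, here is a minimal, purely illustrative sketch of one reusable profiling step: flagging candidate duplicate patient records by grouping on a normalized demographic key. The field names, normalization rules, and matching key are assumptions; production patient-matching algorithms use far richer (often probabilistic) logic that is tuned over many iterations, as the text describes.

```python
# Illustrative sketch only: a simple profiling pass that flags candidate duplicate
# patient records by grouping on a normalized (last name, date of birth, ZIP) key.
# Field names and the blocking key are assumptions for illustration.
from collections import defaultdict

def normalize(value: str) -> str:
    """Basic standardization: trim, lowercase, and drop non-alphanumeric characters."""
    return "".join(ch for ch in value.strip().lower() if ch.isalnum())

def candidate_duplicates(records):
    """Group records whose normalized last name, date of birth, and ZIP code match."""
    groups = defaultdict(list)
    for rec in records:
        key = (normalize(rec["last_name"]), rec["dob"], rec["zip"])
        groups[key].append(rec["record_id"])
    return {key: ids for key, ids in groups.items() if len(ids) > 1}

sample = [
    {"record_id": 1, "last_name": "O'Brien", "dob": "1980-02-14", "zip": "85001"},
    {"record_id": 2, "last_name": "OBrien ", "dob": "1980-02-14", "zip": "85001"},
    {"record_id": 3, "last_name": "Smith",   "dob": "1975-07-09", "zip": "85004"},
]
print(candidate_duplicates(sample))  # records 1 and 2 are flagged as candidates
```

In practice such a pass is only a starting point: the candidates it surfaces still require human review, and the results feed the iterative tuning the framework describes.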
 • 32. HOW TO INTERPRET THE SCORES Process Area scores may fall into a range from 0 to 3: • 0 - 1 Foundational: Data is managed as a requirement for projects and processes • 1 - 2 Building: Building on the successful completion of Tier 1, data is increasingly managed as a critical infrastructural asset • 2 - 3 Advanced: Capabilities that comprise completion of the practices needed for a sound and sustainable program for managing patient demographic data across the health care lifecycle (a small illustrative score-to-tier mapping appears after this slide). COLLABORATION AND COOPERATION • Patient data is a common thread across the entire provider organization o Health care units tend to focus on their mission – diagnostic imaging, pharmacy, laboratory, claims o Capturing or modifying patient data differently magnifies the potential for duplicates
 • PDDQ questions zero in on key processes that may already be performed, but typically not examined organization-wide
 • Participants can raise issues, make collective decisions (How Are We Doing?), and suggest improvements
 • The DQC leads the coordination of implementation activities and monitors progress
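As referenced in the score-interpretation slide above, here is a tiny illustrative helper that maps a process area score to its tier label. The function name and the handling of boundary values are assumptions; the PDDQ's exact rounding rules are not given here.

```python
# Illustrative only: map a PDDQ process area score (0-3) to a tier label.
# Boundary handling is an assumption, not taken from the framework.
def tier_for_score(score: float) -> str:
    if not 0 <= score <= 3:
        raise ValueError("PDDQ process area scores fall in the range 0-3")
    if score <= 1:
        return "Foundational"
    if score <= 2:
        return "Building"
    return "Advanced"

print(tier_for_score(1.4))  # -> Building
```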
 • 33. PRECISE SNAPSHOT OF CURRENT STATE • PDDQ means that health care organizations don’t have to guess about how they’re managing their patient data • A PDDQ evaluation of current practices will: • Enable clear identification of any gaps • Enable staff to understand that their activities affect the quality of patient data (not just ‘system problems’) • Create awareness about practices not followed for all care areas • Discover useful work products elsewhere in the organization, and discover missing work products (e.g., no data entry standards) • Give everyone a voice - each learns what others are doing, leading to acceptance of results and further cooperation • Set a baseline for monitoring progress ACTIONABLE IMPROVEMENTS • Once gaps and strengths have been identified, the organization can quickly move to establishing new capabilities and creating supporting work products – for example: o Data Governance – determining the care areas and roles that are engaged with patient demographic data, forming a governance group o Business Terms – collaborative agreements on demographic data – definitions, values approved by stakeholders across the lifecycle o Data Quality Plan – organization-wide plan for improving data quality, which may include new quality rules, registration procedures, data profiling, frequent monitoring, matching algorithm upgrades, etc.
 • The organization determines its priorities and a timeline to meet objectives, including resources and roles.
 • 34. IF YOU’RE A HEALTH CARE DATA PROFESSIONAL • The PDDQ can assist you in many ways: o Speeds up your analysis of the organization’s DM capabilities o Engages stakeholders and provides consensus pre-approval for initiatives • Stakeholders see the gaps as definitive and understand the recommended improvements • Facilitates funding and resource commitments o Opportunity for multiple implementation projects o Opportunity for a new role – Data Quality Coordinator
 • Taken together, these help accelerate your career
 • Recommended first step – master the PDDQ
 • Achieve EDME Certification – master the recommended method ACCESS THE PDDQ • https://www.healthit.gov/playbook/pddq-framework/
 • 35. Copyright 2013 by Data Blueprint • Motivation - Are we satisfied with current performance of DM? • How did we get here? - Building on previous research • What is the Data Management Maturity Model? - Ever heard of CMM/CMMI? • How should it be used? - Use Cases and Value Proposition • Where to next? • Q & A? Outline: Data Management Maturity 69 LEARN MORE • http://cmmiinstitute.com/patient-demographic-data-quality-pddq-framework - PDDQ page • http://cmmiinstitute.com/sites/default/files/resource_asset/PDDQ_White_Paper_0.pdf - PDDQ white paper • http://cmmiinstitute.com/sites/default/files/resource_asset/PDDQ%20Article.pdf - Article - Improving Patient Data Quality: An Introduction to the Patient Demographic Data Quality (PDDQ) Framework
 • 36. Copyright 2013 by Data Blueprint Questions? 71 10124 W. Broad Street, Suite C, Glen Allen, Virginia 23060 804.521.4056