The Basics of Monitoring,
Evaluation and Supervision
(of Health Services in Nepal)
Deepak K Karki
National Centre for AIDS and STD Control
Outline
 What is Supervision?
 What is Monitoring and
Evaluation
 M&E of (in) health sector
Before we begin, we often hear the following:
o Reporting
o Recording and Reporting (R&R)
o Monitoring and Evaluation
o M & e
o m & E
o Accountability Framework
What is the Framework of M&E
o Logical framework – 4x4 Table
o Result framework – Result planning
o Process model for M&E – IPO
Logical Framework Matrix (4x4)
Columns: Project Summary | Objectively Verifiable Indicators (OVI) | Means of Verification (MOV) | Assumptions
Rows: Goal | Purpose/Objectives | Outputs | Activities
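For teams that keep their logframe in software rather than on paper, the 4x4 matrix can be represented as a small data structure. The sketch below is a hypothetical Python illustration (the row levels and column headings come from the matrix above; the sample "Goal" entries are invented placeholders, not a real project):

```python
# Hypothetical sketch: the 4x4 logical framework as a nested dict.
# Row levels and column headings follow the matrix above; the sample
# entries below are invented placeholders, not a real project.
LEVELS = ["Goal", "Purpose/Objectives", "Outputs", "Activities"]
COLUMNS = ["Project Summary", "OVI", "MOV", "Assumptions"]

def blank_logframe():
    """Return an empty logframe: one row per level, one cell per column."""
    return {level: {col: "" for col in COLUMNS} for level in LEVELS}

lf = blank_logframe()
lf["Goal"]["Project Summary"] = "Improved health status of the population"
lf["Goal"]["OVI"] = "Under-five mortality rate"
lf["Goal"]["MOV"] = "DHS survey"

for level in LEVELS:
    print(level, "->", lf[level]["Project Summary"] or "(empty)")
```

Filling the remaining cells level by level forces the vertical logic (activities produce outputs, outputs serve the purpose, the purpose contributes to the goal) to be written down explicitly.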
Result Planning (cycle)
Analyze problem and intervention data →
Identify results (outcomes, indicators) →
Select strategic programme →
Select critical interventions →
Formulate/revise strategy →
Monitoring of results →
Evaluation of changes → (back to analysis)
Process model
Input → Process → Output
A cause-effect relationship
Project in log-frame language
Resources map onto Results:
Goal → Impact
Purpose/Objectives → Outcome
Outputs → Output
Activities → Input
What is Monitoring?
 Continuous process of collecting and analyzing
qualitative and quantitative data to track the progress of
programs.
 Assesses the extent to which inputs are being delivered and
whether work schedules and other required actions are
proceeding according to plan.
 A process aimed at ensuring that activities are on the
right track and in case of deviation appropriate corrective
actions can be instituted.
 Managers depend on the resulting parameters to
determine which areas require greater effort and,
thereby, may contribute to an improved response
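The "deviation triggers corrective action" idea can be made concrete with a toy calculation. This is a hypothetical Python sketch, not an actual HMIS tool; the activity names, targets and the 10% tolerance threshold are all invented for illustration:

```python
# Illustrative sketch (not from the slides): comparing planned vs. actual
# progress so that deviations can trigger corrective action.
def monitor(planned, actual, tolerance=0.10):
    """Flag activities whose actual progress lags the plan by more
    than `tolerance` (expressed as a fraction of the planned amount)."""
    flags = {}
    for activity, plan in planned.items():
        done = actual.get(activity, 0)
        shortfall = (plan - done) / plan if plan else 0.0
        flags[activity] = "on track" if shortfall <= tolerance else "corrective action"
    return flags

status = monitor(
    planned={"outreach sessions": 40, "condoms distributed": 10000},
    actual={"outreach sessions": 38, "condoms distributed": 7200},
)
# 38/40 is within 10% of plan; 7200/10000 is not.
```

The manager's job, per the slide, is then to look at the flagged activities and decide where greater effort is required.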
Are Monitoring and Evaluation the same?
What is Evaluation?
 A periodic assessment of on-going or completed programs
 Links a particular output or outcome directly to particular
intervention
 Aim is to determine the relevance, efficiency, effectiveness,
impact and sustainability of the interventions.
 Goes deeper, explaining cause and effect and other
wider issues about the interventions.
 Helps to deal with problems that monitoring is not able to
address. Monitoring data is often necessary to conduct
successful evaluation
 Helps program managers determine the value of a particular
program
Difference between Monitoring and Evaluation
Monitoring answers: "What are we doing?"
Evaluation answers: "What have we achieved?"
Monitoring and Evaluation is …
A continuum of observation, information
gathering, analysis, documentation,
supervision and assessment
M&E during Program/Project
Current M&E practice in health
system
• Raw data are generated monthly, each trimester and
annually at SDP, district, regional and national levels
• Indicator-based analytical reports are generated each
trimester, applying rates, ratios and percentages.
• Comparative analysis reports of EDP-supported vs.
non-supported districts are generated periodically.
• Single integrated DoHS/Regional/District level
Annual Report
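The trimester indicator reports mentioned above boil down to turning raw service counts into rates and percentages. A minimal Python sketch, with invented field names and numbers (ANC first-visit coverage is just one plausible example indicator, not taken from the slides):

```python
# Hedged sketch of indicator calculation for a trimester report:
# raw counts become a coverage percentage. All names and numbers
# below are invented for illustration.
def coverage_percent(numerator, denominator):
    """Coverage as a percentage, guarding against a zero denominator."""
    return round(100.0 * numerator / denominator, 1) if denominator else 0.0

district_raw = {"anc_first_visits": 1800, "expected_pregnancies": 2400}

report = {
    "ANC 1st visit coverage (%)": coverage_percent(
        district_raw["anc_first_visits"],
        district_raw["expected_pregnancies"],
    )
}
```

The same pattern scales up: each row of the DoHS-style annual report is some count divided by a target or denominator population.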
Bottom up Performance Review
Process
National Review Workshop (Aswin 15-30)
Regional Review Workshop (Aswin 1-15)
District Review Workshop (Bhadra 15-30)
Review at PHC/HP (Shrawan 15-30)
VDC Level Review (Shrawan 15-30)
Uses of M&E
Strategic decisions
Project design
Resource allocation
Project process verification
Project re-alignment
Project evaluation
General Framework of M&E
Indicators along the results chain:
Input: people, money, equipment, policies, etc.
Process: training, logistics, management, IEC/BCC, etc.
Output: services
Outcome: service use, knowledge, quality, behaviour, safer practices (population level)
Impact: health status, mortality
What is the Most Important
Indicator?
Input?
Process?
Output?
Outcome?
Impact?
 What do these levels of results usually refer to …?
(Let’s discuss, with some example from a project)
M&E pipeline
Levels of monitoring and evaluation effort across Input → Process → Output → Outcome → Impact:
All: input monitoring
Most: process monitoring/evaluation
Some: outcome monitoring/evaluation
Few: impact monitoring
Monitoring: process evaluation
Evaluation: effectiveness evaluation
Result monitoring
Policy Monitoring
• HIV prevalence among FSWs
• Prevalence of IPV among young pregnant women
Programme monitoring
• Programme coverage among FSWs
• Coverage of GBV reach
Project monitoring
• No. of condoms distributed/consumed
• No. of GBV sessions conducted
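The three levels differ in the kind of figure each one tracks. A hedged Python illustration with invented numbers: a project-level raw count versus a programme-level coverage ratio (result-level prevalence would come from surveys such as IBBS, not from programme records):

```python
# Illustrative only: the monitoring levels above track different
# kinds of figures. All numbers here are invented.

# Project monitoring: a raw activity count.
condoms_distributed = 125_000

# Programme monitoring: reach relative to an estimated population size.
fsws_reached = 1_840
fsws_estimated = 2_500
programme_coverage = round(100 * fsws_reached / fsws_estimated, 1)

# Result/policy monitoring (e.g. HIV prevalence among FSWs) is not
# computed from programme records at all -- it needs survey data.
```

The point of the distinction: a project can hit its activity counts while programme coverage stays low, which is why both levels are monitored.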
Essentials of programme logic
Assumptions/Context
Problem Statement
Implementation
Inputs Activities Outputs
Outcomes
Impacts
Monitoring and evaluation have
complementary roles
Thus, we came to know that M&E is
for
Programme
Improvement
Accountability
What are the Principles of M&E?
 National M&E policies
 Fundamental principles of M&E
 Learning approach
 M&E as an integral part of overall programme planning
cycle, including costs
 Partnership and stakeholders’ engagement
 Quantitative and qualitative approaches
 Time-bound approach
 Principle of One M&E
 Principle of Accountability
 Functional M&E system principle
Where does M&E stand?
• Depends on what policies are adopted for
monitoring and evaluation
What is critical is “Accountability”
 Being CAR by All, ….. as per the roles defined and
agreed
Capable
Accountable
Responsive
12 Components of functional M&E
Why is M&E in (of) health complex?
 Health is a basic human right
 Health is the ultimate measure of development
(highest level of impact)
 Health is multi-factorial
 Health is complex
 Health is knowledge
 Good health is fairness
However, Access to Health is
determined by
 Individual and population
 Social and systems
 Endogenous and exogenous
 Proximal and distal
Social Determinants of Health (SDH)
Indeed, it is complex
So the interventions?
… for good health …
 Are the interventions/programs sensitive
enough to address the health determinants?
 What is the performance of health
interventions/programs?
 How can we know whether they are, or are not?
What is Supervision?
Supervision (of HRH)
• Way of ensuring staff competence, effectiveness and
efficiency through observation, discussion, support
and guidance.
• Management by overseeing the performance or
operation of a person or group
• Supervision is concerned with encouraging the
members of a work unit to contribute positively
toward accomplishing the organization's goals and
objectives.
Supervision is for improving the
performance of HR
Supervision Methods
Indirect (quantitative)
 by analyzing records & reports and providing feedback
Direct (qualitative)
 by observing the performance of health workers on the
job doing clinical and public health assessment,
counselling, etc.
 by observing/verifying IEC materials displayed and the
drug position;
 by discussing with service providers &
beneficiaries
Supervisory tools
 Job descriptions
 Checklists
 Supervision schedule
 Policy manuals
 Registers and records
 Charts and graphs
 Reports
 Work plans and work
schedules
 Guidelines for supervision
Supervision - activities
• Preparing checklist
• Preparing field visits
• Data collection and analysis
• Specifying training needs assessment
• Decision making for problem solving
Skills Requirement
Technical : Clinical, counseling
Human relation : Behavior, team spirit,
motivation, conflict resolution
Administrative : Planning, organizing, controlling
Decision taking : Problem solving, re-planning
Routine supervision strategy in
Nepal (times per year per institution)
Center → Region: 2
Region → District: Terai 3, Hill 2, Mountain 1
District → PHCC/HP: 6
PHCC/HP → SHP: 6
SHP → Ward: 6
Challenges of M&E in health sector
in Nepal
• Data generation - quality, coverage, and use (?)
• Using evidence to inform policies and improve
programmes (E2)
• Using data for advocacy and action (A2)
Analysis & Advocacy (A2) → Efficacy & Efficiency (E2)
• Reviewing systems, tools and functions
• Updating in time with state-of-the-art knowledge
• Rolling out again
What about Quality?
• In M&E, Quality matters the Most
• Quality is neither automatic nor free
• Quality is the composite result of quality
systems (policy and governance), tools
(design) and its execution (practice)
Quality matters …
Data are Essential … Quality is a Concern
Quality is often Subjective too … Difficult to Define
… but You Know what is Quality
Data quality = Fitness for Use
~ Tayi and Ballou
Why quality data ?
Better data → Better decisions → Better health
The whole concept is …
Using data for decision making
Problem → Problem solving → Solution
Use the Data
6 Criteria for Data Quality
• Validity
• Reliability
• Integrity
• Precision
• Timeliness
• Confidentiality
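A few of these criteria lend themselves to automated checks at data entry. The Python sketch below is illustrative only, covering validity, timeliness and a completeness proxy; the record fields and the reporting deadline are invented, and a real HMIS data-quality audit would be far richer:

```python
# Illustrative only: simple checks for some of the six criteria above
# (validity, timeliness, and a completeness proxy). The record fields
# and deadline are invented; a real HMIS audit is far richer.
from datetime import date

def check_record(record, reporting_deadline):
    """Return a list of data-quality issues found in one report record."""
    issues = []
    # Validity: service counts cannot be negative.
    if record["clients_served"] < 0:
        issues.append("invalid count")
    # Timeliness: the report must arrive by the deadline.
    if record["submitted_on"] > reporting_deadline:
        issues.append("late report")
    # Completeness: the facility identifier must be present.
    if not record.get("facility_id"):
        issues.append("missing facility id")
    return issues

rec = {"facility_id": "HP-012", "clients_served": 135,
       "submitted_on": date(2023, 8, 20)}
problems = check_record(rec, reporting_deadline=date(2023, 8, 15))
```

Criteria such as reliability and confidentiality cannot be checked this mechanically; they are properties of the system and its governance rather than of a single record.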
What makes sense in M&E
• Data use is the key (central point of interest)
• Whatever effort is placed on strengthening M&E,
it makes no sense if the data are not used
Data use?
Data use depends on:
• Behavioural: attitude
• Technical: skills
• Organizational culture: a system developed and followed
Investing in M&E
 More relevant in broader development agenda
 For health, there is no alternative
 It is a governance and accountability issue
 If so, it deserves adequate resource allocated
for M&E
 It is a reflection of the attitude and behavior of
individuals and the organizations they belong to
M&E is for Ensuring Universal
Health Coverage
• Quality M&E is to ensure the effective coverage
• Universal Health Coverage is …
• equity in access to health services - those who need the
services should get them, not only those who can pay for
them;
• that the quality of health services is good enough to
improve the health of those receiving services; and
• financial-risk protection - ensuring that the cost of using
care does not put people at risk of financial hardship.
3 dimensional pathway towards
Universal Health Coverage
Remember !
Data is for Decision Making
It is for Quality service to People
Quality is Behaviour
We have a Role to Play !
It is an Ethical Issue
M&E is the surest path to ensure Universal Health Coverage.
Thank you for your Commitment to M&E
Evaluation of Health Services
Deepak K Karki
National Centre for AIDS and STD Control
Outline
 What is Evaluation ?
 Evaluation of health service ? -
Why? and How?
What is Evaluation?
• Evaluation is the systematic investigation of the
merit, worth or significance of the service
• Effective program evaluation is a systematic way
to improve and account for public health actions.
• It is a learning process.
Evaluation is for …
• Assigning ‘value’ addressing three inter-related
domains:
• Merit (or quality)
• Worth (or value, i.e., cost-effectiveness)
• Significance (or importance)
• Evaluation involves procedures that are useful, feasible,
ethical, and accurate.
• Evaluation is in Learning framework – better future
program design and execution arrangements.
What does evaluation explore?
• What will be evaluated? (i.e., what is "the program" and in what context
does it exist?)
• What aspects of the program will be considered when judging program
performance?
• What standards (i.e., type or level of performance) must be reached for
the program to be considered successful?
• What evidence will be used to indicate how the program has performed?
• What conclusions regarding program performance are justified by
comparing the available evidence to the selected standards?
• How will the lessons learned from the inquiry be used to improve public
health effectiveness?
Why health service evaluation?
 Effectiveness
 Accountability (policy/stakeholders)
 Improvement
 Impact
Where does evaluation hit the
most?
• Implementation: Were your program’s activities put into place as
originally intended?
• Effectiveness: Is your program achieving the goals and objectives it
was intended to accomplish?
• Efficiency: Are your program’s activities being produced with
appropriate use of resources such as budget and staff time?
• Cost-Effectiveness: Does the value or benefit of achieving your
program’s goals and objectives exceed the cost of producing them?
• Attribution: Can progress on goals and objectives be shown to be
related to your program, as opposed to other things that are going
on at the same time?
Basic steps to do an evaluation
Types of evaluation
• Implementation/Process Evaluation
• Effectiveness/Outcome
• Impact
Types of Evaluation design
• Experimental
• Quasi Experimental
• Observational
Evaluation question is the key
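For the quasi-experimental design, one common analysis is a difference-in-differences estimate: compare the before/after change in intervention districts with the change in comparison districts. A minimal Python sketch with invented coverage figures (this names one widely used technique; the slides do not prescribe a specific method):

```python
# Hypothetical sketch of a quasi-experimental analysis: a simple
# difference-in-differences estimate comparing intervention and
# comparison districts before and after a programme. Numbers invented.
def diff_in_diff(treat_before, treat_after, control_before, control_after):
    """Programme effect = (change in treated) - (change in comparison)."""
    return (treat_after - treat_before) - (control_after - control_before)

# Example: ANC coverage (%) in intervention vs. comparison districts.
effect = diff_in_diff(
    treat_before=52.0, treat_after=68.0,      # +16 points
    control_before=50.0, control_after=56.0,  # +6 points
)
```

Subtracting the comparison districts' change nets out secular trends, which speaks directly to the attribution question in the standards table that follows.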
Key for a good evaluation design
Standards Questions
Utility • What is the purpose of the evaluation?
• Who will use the evaluation results and how will they use them?
• What special needs of any other stakeholders must be addressed?
Feasibility • What is the program’s stage of development?
• How intense is the program?
• How measurable are the components in the proposed focus?
Propriety • Will the focus and design adequately detect any unintended
consequences?
• Will the focus and design include examination of the experience of
those who are affected by the program?
Accuracy • Is the focus broad enough to detect success or failure of the program?
• Is the design the right one to respond to the questions such as attribution
that are being asked by stakeholders?
Remember !
Evaluation is a periodic assessment
Aim is to determine the relevance, efficiency, effectiveness,
impact and sustainability of the interventions.
Goes deeper, explaining cause and effect and other
wider issues about the interventions.
Helps program managers determine the value of a particular
program
