Afternoon Sessions – Day 2
• Introduction to Assessment
• Formative, Summative and Internal Assessment
• Assessment Planning and Quality Assurance
• Framing:
• Essay questions
• Short notes
• MCQs
Assessment: Types and Characteristics
Dr Bharath Kumar Reddy A.
Associate Professor
Dept. of Forensic Medicine & Toxicology,
Government Medical College, Nalgonda.
Learning Objectives
• Define assessment and evaluation
• Know the types of assessment
• Understand the characteristics of a good assessment:
• Validity
• Reliability
• Feasibility
• Objectivity
Assessment vs. Evaluation
• Assessment: taking a measure of effectiveness; it is ongoing.
• Evaluation: the process of observing and measuring a thing for the
purpose of judging it and determining its value, either by
comparison to similar things or to a standard.
• Evaluation is done at the end of the course.
Assessment
• Diagnostic: assessment before learning
• Formative: assessment for learning
• Summative: assessment of learning
Assessment Fundamentals
• Why do we assess?
• What should we assess?
• When should we assess?
• How should we assess?
Why Do We Assess?
• Determine whether learning outcomes are met
• Support students’ learning
• Certification and competency judgment
• Teaching program development and implementation
• Accountability
• Understanding the learning process
Assessment Serves Multiple Stakeholders
• Students
• Teachers
• Department, Faculty, University, Administrators
• Public, Governmental Agencies
• Stakeholders’ interests in assessment are not
necessarily aligned.
Effective Education
• Knowledge
• Attitude
• Skills
What Should We Assess?
Knowledge and Performance
Miller GE. The assessment of clinical
skills/competence/performance. Academic Medicine
(Supplement) 1990; 65: S63–S67.
Miller’s pyramid, in order of increasing professional authenticity:
• Knows (cognition) – factual tests
• Knows how (cognition) – context-based tests
• Shows how (performance) – performance assessment in vitro
• Does (performance) – performance assessment in vivo
Concept of Mastery
• Mastery is not really an ‘all or none’ state.
When Should We Assess?
[Figure: knowledge plotted against time, with the date of examination marked]
• Assessment drives learning
• Is this not our current scenario?
Continuum of Performance
• ‘Learning Curve’
An examination that attempts to test students’
mastery at a given point in time is less preferable
than one that tests mastery over a span of time.
How Should We Assess?
• Assessment instruments must have:
• Validity
• Reliability
• Feasibility
• Objectivity
• Educational impact
Validity
• Truthfulness; Accuracy
• Validity: Ability of the assessment instrument
to test what it is supposed to test.
Reliability
• The degree of consistency between two measures of the same
thing (Mehrens and Lehmann, 1987)
• The measure of how stable, dependable, trustworthy, and
consistent a test is in measuring the same thing each time
(Worthen et al., 1993)
Reliability
• Reliability refers to the consistency of test scores
and the concept of reliability is linked to specific
types of consistency.
• Over time
• Between different examiners
• Across different testing conditions
• Instruments for student assessment need high
reliability to ensure transparency and fairness.
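The consistency idea above can be made concrete with a small, purely illustrative calculation. The sketch below computes Cronbach’s alpha, a standard internal-consistency reliability coefficient, for a hypothetical set of student scores; the data and the function are invented for illustration, not part of the lecture material:

```python
from statistics import pvariance

# Hypothetical scores: 4 students (rows) on 3 test items (columns).
scores = [
    [2, 3, 3],
    [4, 4, 5],
    [1, 2, 2],
    [3, 3, 4],
]

def cronbach_alpha(rows):
    """Internal consistency: alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))."""
    k = len(rows[0])                              # number of items
    items = list(zip(*rows))                      # column-wise item scores
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(r) for r in rows]) # variance of each student's total
    return k / (k - 1) * (1 - item_var / total_var)

alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # 0.97 for this toy data set
```

A high alpha (conventionally above 0.7–0.8) indicates that the items measure the same underlying construct consistently; here the invented items track each other closely, so alpha is near 1.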
Knowledge and Performance
[Miller’s pyramid revisited: Knows, Knows how (cognition); Shows how, Does (performance) – annotated with validity and reliability]
Principle One: “Learning >> Assessment >>
Feedback” in that order!
• Learning should drive assessment and feedback; not
the other way round
• Teaching is the easy bit; defining the purpose of
assessment is critical
• Balance between “assessment of learning” and
“assessment for learning”
Principle Two: Validity, Validity, Validity
• Validity is the single most important determinant in
assessment
• Our interest is in the validity of inference; validity cannot
be compromised
• Balance between educational needs and institutional
needs
Principle Three: Educational Impact
• Any assessment is anxiety-provoking for students –
and now for faculty too…
• Assessment has potential positive and negative
steering effects on learning and professional
development
• “Curriculum instructs teacher what to teach; exam
instructs students what to learn.”
• Donald Melnick, 1991
Assessment is a moral activity. What
we choose to assess and how shows
quite starkly what we value.
Knight, 1995
Conclusions
• Give due importance to assessment
• Assessment methods should be in line with the
teaching/learning methodologies
• Formative over summative
• Validity, Reliability and Feasibility
• Meet the standards
Clarifications?
Thank You
