ASSESSMENT OF LEARNING 2
CHAPTER 6
WEST VISAYAS STATE UNIVERSITY – EXTENSION CAMPUS AT HIMAMAYLAN CITY
JOLIETO CAPARIDA, BPE-SPE
MADELYN VIDAJA, BSED - ENGLISH
EVALUATION

Evaluation is characterized as the systematic determination of the merit, worth, and significance of something or someone. It characterizes and appraises subjects of interest in a wide range of human enterprises, including the arts, business, computer science, criminal justice, engineering, foundations and non-profit organizations, government, healthcare, and other human services.

Evaluation is the next stage in the process: a systematic, continuous, and comprehensive process of determining the growth and progress of the pupil toward the objectives or values of the curriculum (the micro/classroom level). Book 1 and most of Chapters 1 through 5 (Advanced Methods Book) concern themselves with assessment.
A. EDUCATIONAL EVALUATION
B. EVALUATION APPROACHES
C. EVALUATION METHODS AND TECHNIQUES
D. THE CIPP EVALUATION MODEL
E. SUMMARY OF KEYWORDS AND PHRASES
A. EDUCATIONAL EVALUATION

United States
• The Joint Committee on Standards for Educational Evaluation developed standards for educational programme, personnel, and student evaluation.
• The U.S. Joint Committee Standards have four (4) sections: 1) Utility, 2) Feasibility, 3) Propriety, 4) Accuracy.

Philippines
• The Philippine Society for Educational Research and Evaluation (PSERE) is a society which looks into educational evaluation.
• The Department of Education (DepEd) mainly sets the educational evaluation standards in the Philippines.
Various European Institutions
• Their standards are more or less related to those produced by the Joint Committee in the United States.
• They provide guidelines about basing value judgments on:
  a. systematic inquiry
  b. evaluator competence and integrity
  c. respect for people, and
  d. regard for the general and public welfare.
GUIDING PRINCIPLES (for Evaluators)

Created by the American Evaluation Association, these serve as benchmarks for good practice in educational evaluation:
1. Systematic Inquiry
2. Competence
3. Integrity/Honesty
4. Respect for People
5. Responsibilities for General and Public Welfare

They can be used at various levels:
1. Institutional level, when we evaluate learning
2. Policy level, when we evaluate institutions
3. International level, when we rank/evaluate the performance of various institutions of higher learning
SYSTEMATIC INQUIRY
Evaluators conduct systematic, data-based inquiries about whatever is being evaluated. Inquiry cannot be based on pure hearsay or perception but must be based on concrete evidence and data to support the inquiry process.
Examples of systematic, data-based evaluation work include:
• Evaluation consulting and design
• Designing and administering data collection tools
• Analyzing and reporting evaluation results
• Helping organizations use results in program planning
(e.g., the California Instructional Technology Clearinghouse and Columbus Public Schools)

The software and hardware industry also publishes evaluative guides (Apple, Microsoft, and IBM software guides).
• Strengths: these booklets are distributed free of charge and can be useful for learning about the software for a particular platform.
• Weaknesses: reviews are written to favor a particular platform, and may be dated or not comprehensive.
COMPETENCE
Evaluators provide competent performance to stakeholders. The evaluators must be persons of known competence who are generally acknowledged in the educational field.
INTEGRITY/HONESTY
Evaluators ensure the
honesty and integrity of the
entire evaluation process.
As such, the integrity of
authorities who conduct the
evaluation process must be
beyond reproach.
RESPECT FOR PEOPLE
Evaluators respect the security, dignity, and self-worth of the respondents, program participants, clients, and other stakeholders with whom they interact. They cannot act as if they know everything but must listen patiently to the accounts of those whom they are evaluating.
RESPONSIBILITIES FOR GENERAL
AND PUBLIC WELFARE
Evaluators articulate and take
into account the diversity of
interests and values that may be
related to the general and public
welfare.
It is believed that an INDIVIDUAL has FREEDOM OF CHOICE and that he is UNIQUE.

THE EVALUATION PROCESS
• Guided by empirical inquiry
• Based on objective standards

ALL EVALUATION
• Based on subjectivist ethics
• Grounded in individual subjective experiences
B. EVALUATION APPROACHES

Evaluation approaches are the various conceptual arrangements made for designing and actually conducting the evaluation process. Today, in educational settings, they comprise (a) original approaches and (b) refinements/extensions.

1. LIBERAL DEMOCRACY
The first major classification of evaluation, anchored by House (1990). All major evaluation approaches are based on this common ideology.
Forms of ethics under this ideology:
1. UTILITARIANISM: "the good" is defined as that which maximizes the happiness of society as a whole.
2. INTUITIONIST OR PLURALIST (forms of subjectivist ethics): no single interpretation of "the good" is assumed; interpretations need not be explicitly stated nor justified.

Each ethical position has its own way of obtaining knowledge, or epistemology.
EPISTEMOLOGY (Ways of Obtaining Knowledge)

The objectivist epistemology is associated with the UTILITARIAN ethic. Knowledge is acquired which is capable of external verification and evidence (intersubjective agreement), through methods and techniques universally accepted and through the presentation of data.

The subjectivist epistemology is associated with the INTUITIONIST/PLURALIST ethic. It is used to acquire new knowledge based on existing personal knowledge and experiences that are (explicit) or are not (tacit) available for public inspection.

Tacit knowledge: the unwritten, unspoken, hidden vast storehouse of knowledge held by practically every normal human being, based on his or her emotions, experiences, insights, intuition, observations, and internalized information.

Explicit knowledge: knowledge that can be readily transmitted to others, such as the information contained in encyclopedias and textbooks.
House's approach further subdivides the epistemological approach in terms of TWO (2) MAIN POLITICAL PERSPECTIVES:
1. ELITIST: an approach which focuses on the perspectives of managers, top-echelon people, and professionals.
2. MASS-BASED: an approach in which the focus is on consumers and the approaches are participatory.
STUFFLEBEAM AND WEBSTER (1980)
Place approaches into one of THREE (3) GROUPS according to their orientation toward the role of values, an ethical consideration:
1. THE POLITICAL ORIENTATION (PSEUDO-EVALUATION): promotes a positive or negative view of an object regardless of what its value actually might be.
2. THE QUESTIONS ORIENTATION (QUASI-EVALUATION): includes approaches that might or might not provide answers specifically related to the value of an object.
3. THE VALUES ORIENTATION (TRUE EVALUATION): includes approaches primarily intended to determine the value of some object.
Classification of approaches for conducting evaluations based on epistemology, major perspective, and orientation:

Objectivist (Utilitarian) epistemology
• Elite (Managerial) perspective
  - Political (Pseudo-evaluation): politically controlled; public relations
  - Questions (Quasi-evaluation): experimental research; management information systems; testing programs; objectives-based; content analysis
  - Values (True evaluation): decision-oriented; policy studies
• Mass (Consumers) perspective
  - Questions (Quasi-evaluation): accountability
  - Values (True evaluation): consumer-oriented

Subjectivist (Intuitionist/Pluralist) epistemology
• Elite (Professional) perspective
  - Values (True evaluation): accreditation/certification; connoisseur
• Mass (Participatory) perspective
  - Values (True evaluation): adversary; client-centered

Note. Epistemology and major perspective from House (1978). Orientation from Stufflebeam & Webster (1980).
PSEUDO-EVALUATION APPROACHES
Classified by epistemology and major perspective (House) and orientation (Stufflebeam/Webster), the pseudo-evaluation approaches are politically controlled studies and public relations studies.
Each approach can be profiled by its organizer, purpose, key strengths, and key weaknesses:

Politically controlled
• Organizer: threats
• Purpose: get, keep, or increase influence, power, or money.
• Key strengths: secures evidence advantageous to the client in a conflict.
• Key weaknesses: violates the principle of full and frank disclosure.

Public relations
• Organizer: propaganda needs
• Purpose: create a positive public image.
• Key strengths: secures evidence most likely to bolster public support.
• Key weaknesses: violates the principles of balanced reporting, justified conclusions, and objectivity.
POLITICALLY CONTROLLED
Information obtained through politically controlled studies is released to meet the special interests of the holder. Example: public health and safety, where evaluation results will be used as a framework for public health strategies.

PUBLIC RELATIONS INFORMATION
Used to paint a positive image of an object. Customers perceive value based on the experiences they receive.
QUASI-EVALUATION APPROACHES
• Experimental research
• Management information systems
• Testing programs
• Objectives-based studies
• Content analysis

Example: a customer/constituent satisfaction survey and after-sales customer service feed into enhancing the quality of the products and services offered, which in turn helps create more services and products that will benefit the public.
Experimental research
• Organizer: causal relationships
• Purpose: determine causal relationships between variables.
• Key strengths: the strongest paradigm for determining causal relationships.
• Key weaknesses: requires a controlled setting, limits the range of evidence, focuses primarily on results.

Management information systems
• Organizer: scientific efficiency
• Purpose: continuously supply the evidence needed to fund, direct, and control programs.
• Key strengths: gives managers detailed evidence about complex programs.
• Key weaknesses: human service variables are rarely amenable to the narrow, quantitative definitions needed.

Testing programs
• Organizer: individual differences
• Purpose: compare test scores of individuals and groups to selected norms.
• Key strengths: produces valid and reliable evidence in many performance areas; very familiar to the public.
• Key weaknesses: data usually cover only testee performance; overemphasizes test-taking skills; can be a poor sample of what is taught or expected.

Objectives-based
• Organizer: objectives
• Purpose: relates outcomes to objectives.
• Key strengths: common-sense appeal; widely used; uses behavioral objectives and testing technologies.
• Key weaknesses: leads to terminal evidence often too narrow to provide a basis for judging the value of a program.

Content analysis
• Organizer: content of a communication
• Purpose: describe and draw conclusions about a communication.
• Key strengths: allows for unobtrusive analysis of large volumes of unstructured, symbolic materials.
• Key weaknesses: the sample may be unrepresentative yet overwhelming in volume; the analysis design is often overly simplistic for the question.

Accountability
• Organizer: performance expectations
• Purpose: provide constituents with an accurate accounting of results.
• Key strengths: popular with constituents; aimed at improving the quality of products and services.
• Key weaknesses: creates unrest between practitioners and consumers; politics often forces premature studies.
In norm-referenced test interpretation, your scores are compared with the test performance of a particular reference group, called the norm group. The norm group usually consists of large representative samples of individuals from specific populations, such as undergraduates, senior managers, or clerical workers. It is the average performance and distribution of their scores that become the test norms of the group.
(http://www.psychometric-success.com/aptitude-tests/interpreting-test-results.htm)
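As a minimal sketch of this idea (the scores below are invented for illustration), an individual's standing relative to a norm group can be expressed as a z-score and a percentile rank:

```python
import statistics

def norm_referenced_report(norm_scores, raw_score):
    """Compare one raw score against a norm group's score distribution."""
    mean = statistics.mean(norm_scores)
    sd = statistics.pstdev(norm_scores)  # population SD of the norm group
    z = (raw_score - mean) / sd
    # Percentile rank: share of the norm group scoring below the raw score
    pct = 100 * sum(s < raw_score for s in norm_scores) / len(norm_scores)
    return {"mean": mean, "sd": sd, "z": round(z, 2), "percentile": pct}

# Hypothetical norm group of 10 test-takers
norms = [52, 55, 58, 60, 60, 62, 65, 67, 70, 71]
print(norm_referenced_report(norms, 68))
```

A real testing program would use much larger, representative norm samples; this only shows the arithmetic behind "comparing a score to the norms."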
Experimental research proceeds in three broad steps: design the experiment, collect and analyze data, and draw conclusions.
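The three steps can be sketched with a toy two-group comparison; the group scores below are made up, and the "analysis" is simply a difference of means summarized by Welch's t statistic:

```python
import statistics

# 1. Design the experiment: one control group and one treatment group
control = [70, 72, 68, 75, 71, 69]
treatment = [78, 74, 80, 77, 73, 79]

# 2. Collect and analyze data: Welch's t statistic for the mean difference
def welch_t(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    se = (va / len(a) + vb / len(b)) ** 0.5  # standard error of the difference
    return (mb - ma) / se

t_val = welch_t(control, treatment)

# 3. Draw a conclusion (crudely: a large |t| suggests a real difference)
print(f"t = {t_val:.2f}")
```

A genuine study would also randomize assignment and report a p-value with degrees of freedom; this sketch only illustrates the design–analyze–conclude flow.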
Goals and objectives are similar in that they describe the intended purposes and expected results of teaching activities and establish the foundation for assessment. There are three types of learning objectives, which reflect different aspects of student learning:
• Cognitive objectives: "What do you want your graduates to know?"
• Affective objectives: "What do you want your graduates to think or care about?"
• Behavioral objectives: "What do you want your graduates to be able to do?"
(http://assessment.uconn.edu/primer/goals1.html)
Materials suitable for content analysis:
• Print media: newspaper items, magazine articles, books, catalogues
• Other writings: web pages, advertisements, billboards, posters, graffiti
• Broadcast media: radio programs, news items, TV programs
• Other recordings: photos, drawings, videos, films, music
• Live situations: speeches, interviews, plays, concerts
• Observations: gestures, rooms, products in shops
For a media organization, the main purpose of content analysis is to evaluate and improve its programming. All media organizations are trying to achieve some purpose. For commercial media, the purpose is simple: to make money and survive. For public and community-owned media, there are usually several purposes, sometimes conflicting, but each individual program tends to have one main purpose.
(http://www.audiencedialogue.net/kya16a.html)
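A toy sketch of quantitative content analysis (the coding scheme and sample text are invented) is to tally how often each category's keywords appear in a communication:

```python
import re
from collections import Counter

# Hypothetical coding scheme: category -> keywords to count
scheme = {
    "education": {"school", "teacher", "student"},
    "health":    {"clinic", "nurse", "patient"},
}

def code_text(text, scheme):
    """Tally how often each category's keywords appear in a text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(words)  # Counter returns 0 for absent words
    return {cat: sum(counts[w] for w in kws) for cat, kws in scheme.items()}

sample = "The teacher asked each student to visit the school clinic."
print(code_text(sample, scheme))
```

Real content analysis also requires a defensible sampling plan and coder reliability checks; keyword counting is only the simplest mechanical step.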
TRUE-EVALUATION APPROACHES
• Decision-oriented
• Policy studies
• Accreditation/certification
• Connoisseur
• Adversary
• Client-centered
One of the most important questions when working with statistics is "Why are we doing this?" Proximate answers are "To find out if this new drug works better than the established ones" or "To describe the effect of inter-cropping on plant growth," while ultimate answers are "To improve medical treatment" or "To find appropriate cultivation techniques." Statistics are compiled by an IT department and then given back to the people who initially requested them for interpretation.
(http://journal.code4lib.org/articles/1275)
An example topic for policy studies: cutting carbon emissions.
CONSUMER ORIENTATION
A service offered by companies that focuses on the internal and external needs of a business's customers. Consumer orientation establishes and monitors standards of customer satisfaction and strives to meet the clientele's needs and expectations related to the product or service sold by the business.
(http://www.businessdictionary.com/definition/consumer-orientation.html)
CHED ACCREDITATION IN THE PHILIPPINES
The CHED has its scheme of quality assurance when colleges and universities submit themselves to voluntary accreditation through the four accrediting agencies: the Philippine Accrediting Association of Schools, Colleges and Universities (PAASCU), the Philippine Association of Colleges and Universities Commission on Accreditation (PACU-COA), the Association of Christian Schools and Colleges (ACSC), and the Accrediting Association of Chartered Colleges and Universities of the Philippines (AACCUP), all under the umbrella of the Federation of Accrediting Agencies of the Philippines (FAAP).
The CHED recognizes only the FAAP-certified accreditation of the four accrediting agencies, without necessarily encroaching on the academic autonomy of the latter.
(http://stlinusonlineinstitute.com/yahoo_site_admin/assets/docs/CHED_ACCREDITATION_IN_THE_PHILIPPINES.67223608.pdf)
Accreditation is a concept of self-regulation which focuses on self-study
and evaluation and on the continuing improvement of educational
quality. It is both a process and a result.
As a process, it is a form of peer review in which an association of schools
and colleges establishes sets of criteria and procedures to encourage high
maintenance of standards of education among its affiliate members.
As a result, it is a form of certification granted by a recognized and
authorized accrediting agency to an educational program or to an
educational institution as possessing certain standards of quality which are
over and above those prescribed as minimum requirements for government
recognition. Accreditation is based upon an analysis of the merits of
educational operations in the context of the institution's philosophy and
objectives.
Membership in PACUCOA is open to all schools that are able to meet the standards and requirements of the agency.
(http://www.pacucoa.ph/general_info.htm)
THE CONNOISSEURSHIP MODEL
The connoisseurship model has two major implications: a holistic approach to the analysis and interpretation of data, and multiple perspectives in the evaluative tasks.
(http://ged550.wikispaces.com/Eisner's+Educational+Connoisseurship+Model)

Being connoisseurs and critics involves more than gaining and exercising technical knowledge and skills. It also depends on cultivating a kind of artistry. In this sense, educators are not engineers applying their skills to carry out a plan or drawing; they are artists who are able to improvise and devise new ways of looking at things.
(http://infed.org/mobi/evaluation-theory-and-practice/)
THE ADVERSARY APPROACH
The approach makes use of teams of evaluators who present two opposing views (these teams are commonly referred to as adversaries and advocates). The two sides then agree on issues to address, collect data or evidence which forms a common database, and present their arguments. A neutral party is assigned to referee the hearing and is expected to arrive at a fair verdict after consideration of all the evidence presented.
CLIENT-CENTERED EXAMPLES
"From the first day of service, and continuing through each and every session, the unique needs of the client are at the core of our treatment model. Trained therapy professionals are dedicated to the mission of HCT and to the clients we serve."
(http://healthcaretherapies.net/treatment_model.php)

Client-Centered Nutrition Education (CCNE) is a style of education that encourages participants to play an active role in their own learning and allows staff to act as a guide or facilitator. CCNE provides opportunities for group discussion, incorporates hands-on activities and, best of all, allows participants to share experiences and provide social support to each other. CCNE makes the learning experience more fun, engaging, and meaningful, not only for participants but also for staff.
(http://www.dshs.state.tx.us/wichd/nut/ccne.aspx)
C. EVALUATION METHODS AND TECHNIQUES

DETAILED LIST OF METHODS, TECHNIQUES AND APPROACHES FOR CONDUCTING EVALUATION
ACCELERATED AGING
ACTION RESEARCH
ADVANCED PRODUCT QUALITY PLANNING
ALTERNATIVE ASSESSMENT
APPRECIATIVE INQUIRY
AXIOMATIC DESIGN
BENCHMARKING
CASE STUDY
CHANGE MANAGEMENT
CLINICAL TRIAL
COHORT STUDY
COMPETITOR ANALYSIS
CONSENSUS DECISION-MAKING
CONSENSUS-SEEKING DECISION-MAKING
CONTENT ANALYSIS
CONVERSATION ANALYSIS
COST-BENEFIT ANALYSIS
COURSE EVALUATION
DELPHI TECHNIQUE
DISCOURSE ANALYSIS
ELECTRONIC PORTFOLIO
ENVIRONMENTAL SCANNING
ETHNOGRAPHY
EXPERIMENT
EXPERIMENTAL TECHNIQUES
FACTOR ANALYSIS
FACTORIAL EXPERIMENT
FEASIBILITY STUDY
FIELD EXPERIMENT
FIXTURELESS IN-CIRCUIT TEST
FOCUS GROUP
FORCE FIELD ANALYSIS
GAME THEORY
GRADING
HISTORICAL METHOD
INQUIRY
INTERVIEW
MARKETING RESEARCH
META-ANALYSIS
METRICS
MOST SIGNIFICANT CHANGE
MULTIVARIATE STATISTICS
NATURALISTIC OBSERVATION
OBSERVATIONAL TECHNIQUES
and others.
D. THE 'CIPP' MODEL OF EVALUATION

The approach essentially systematizes the way we evaluate the different dimensions and aspects of curriculum development and the sum total of student experiences in the educative process. The model examines four components: CONTEXT, INPUTS, PROCESS, and PRODUCT.
CONTEXT
• Is there a need for a course?
• Is the course relevant to job needs?
• What is the relation of the course to other courses?
• Is the time adequate?
• What are critical or important external factors (networks, ministries)?
• Should courses be integrated or separate?
• What are the links between the course and research/extension activities?

INPUTS
• What is the entering ability of students?
• What are the learning skills of the students?
• What is the motivation of the students?
• What are the living conditions of students?
• What is the students' existing knowledge?
• Are the aims suitable?
• Are the objectives SMART? Do the objectives derive from the aims?
• Is the course content clearly defined?
• Does the content (knowledge, skills, attitudes) match student abilities?
• Is the content relevant to practical problems?
• What is the theory/practice relevance?
• What resources/equipment are available?
• What books do the teachers have? What books do the students have?
• How strong are the teaching skills of the teachers?
• What knowledge, skills, and attitudes, related to the subject, do the teachers have?
• What time is available, compared with the workload, for preparation?
• How supportive is the classroom environment?
• How many students are there? How many teachers are there?
• How is the course organized?
• What regulations relate to the training?

PROCESS
• What is the workload of students?
• How well/actively do students participate?
• Are there any problems related to teaching?
• Are there any problems related to learning?
• Is there effective two-way communication?
• Is knowledge only transferred to students, or do they use and apply it?
• Are there any problems which students face in using/applying/analysing the knowledge and skills?
• Are the teaching and learning processes continuously evaluated?
• Are teaching and learning affected by practical/institutional problems?
• What is the level of cooperation/interpersonal relations between teachers/students?
• How is discipline maintained?
• Is there one final exam at the end, or several during the course?
• Is there any informal assessment?
• What is the quality of assessment (i.e., what levels of KSA are assessed)?

PRODUCT
• What are the students' KSA levels after the course?
• Has the teacher's reputation improved or been ruined as a result?
• Is the evaluation carried out for the whole process?
• How do students use what they have learned?
• How was the overall experience for the teachers and for the students?
• What are the main 'lessons learned'?
• Is there an official report?
These guiding questions are not answered by the teacher alone or by a single individual. Instead, there are many ways in which they can be answered. Some of the more common methods are listed below.
1. Discussion with class
2. Informal conversation or observation
3. Individual student interviews
4. Evaluation forms
5. Observation in class/session of teacher/trainer by colleagues
6. Video tape of own teaching (micro-teaching)
7. Organizational documents
8. Participant contract
9. Performance test
10. Questionnaire
11. Self-assessment
12. Written test
E. SUMMARY OF KEYWORDS AND PHRASES

ASSESSMENT is the process of gathering and analyzing specific information as part of an evaluation.

COMPETENCY EVALUATION is a means for teachers to determine the ability of their students in ways other than the standardized test.

COURSE EVALUATION is the process of evaluating the instruction of a given course.

EDUCATIONAL EVALUATION is evaluation that is conducted specifically in an educational setting.

IMMANENT EVALUATION is opposed by Gilles Deleuze to value judgment.

PERFORMANCE EVALUATION is a term from the field of language testing. It stands in contrast to competence evaluation.

PROGRAM EVALUATION is essentially a set of philosophies and techniques to determine if a program 'works'.
Educational Evaluation

Educational Evaluation

  • 1.
    ASSESSMENT OF LEARNING2 CHAPTER 6 WEST VISAYAS STATE UNIVERSITY – EXTENSION CAMPUS AT HIMAMAYALAN CITY JOLIETO CAPARIDA, BPE-SPE MADELYN VIDAJA, BSED - ENGLISH
  • 3.
    Characterized as thesystematic determination of merit, worth and significance of something or someone. Characterize and appraise subjects of interest in a wide range of human enterprises, including the Arts, business, computer science, criminal justice, engineering, foundations and non-profit organizations, gov’t., heatlthcare, and other human services. EVALUATION is the next stage in the process A systematic, continous & comprehensive process of determining the growth and progress of the pupil towards objectives or values of the curriculum. (micro/classroom level) Book 1 and most of Chapter s 1 through 5 (Advance Method Book) Concerns themselves w/ assessment
  • 4.
    A. EDUCATIONAL EVALUATION B. EVALUATION APPROACHES C. EVALUATIONMETHODS AND TECHNIQUES D. THE CIPP EVALUATION MODEL E. SUMMARY OF KEYWORDS AND PHRASES
  • 5.
    United States • JointCommittee on Standards for Educational Evaluation • Developed standards for educational programmes, personnel, and student evaluation. U. S. Joint Committee on Standards • Four (4) Sections • 1.) Utility 3.) Propriety • 2.)Feasibility 4.) Accuracy Philippine Society for Educational Research and Evaluation (PSERE) *A society which looks into educational evaluation. A. EDUCATIONAL EVALUATION
  • 6.
    Dept. Of Education(DepEd) They mainly set the Educational evaluation standards in the Philippines.
  • 7.
    • Various EuropeanInstitution • More or less related to those produced by the Joint Committee in the United States. • They provide guidelines about basing value judgmentts on • a. systematic inquiry • b. evaluator competence and integrity • c. respect for people, and • d. regard for the general and public welfare.
  • 8.
    3. Integrity/ Honesty 1. Systemati c Inquiry 2. Compet ence 4. Respect for People 5. Responsibilities for General and PublicWelfare GUIDING PRINCIPLES (for evaluators) Created by American Evaluation Association Can be used at various levels: (Served as Benchmarks for good practices in educational evaluation) 1. Institutional Level when we evaluate learning 2. Policy Level when we evaluate institutions 3.International Level when we rank/evaluate the performance of various institutions of higher learning
  • 9.
    SYSTEMATIC INQUIRY Evaluators conductsystematic, databased inquiries about whatever is being evaluated. Inquiry cannot be based on pure hearsay or perception but must be based concrete evidence and data to support the inquiry process.
  • 10.
    • Evaluation consulting anddesign •Designing and administering data collection tools •Analyzing and reporting evaluation results •Helping organizations use results in program planning
  • 11.
    •California Instructional TechnologyClearinghouse, Columbus Public Schools •The Software and Hardware Industry •Apple Computer Software Guides •Microsoft Software Guides •IBM Software Guides •Strengths: These booklets are distributed free of charge, and can be useful for learning about the software for a particular platform. •Weaknesses: Reviews are written to favor a particular platform. Reviews may be dated or not comprehensive.
  • 12.
    COMPETENCE Evaluators provide competent performanceto stakeholders. The evaluators must be people or persons of known competence and generally acknowledged in the educational field.
  • 13.
    INTEGRITY/HONESTY Evaluators ensure the honestyand integrity of the entire evaluation process. As such, the integrity of authorities who conduct the evaluation process must be beyond reproach.
  • 14.
    RESPECT FOR PEOPLE Evaluatorsrespect the security, dignity and self-worth of the respondents, program participants, clients and other stakeholders, w/ whom they interact. They cannot act as if they know everything but must listen patiently to the accounts of those whom they are evaluating.
  • 15.
    RESPONSIBILITIES FOR GENERAL ANDPUBLIC WELFARE Evaluators articulate and take into account the diversity of interests and values that may be related to the general and public welfare.
  • 16.
    Believed that an INDIVIDUALhas a FREEEDOM OF CHOICE • He is UNIQUE EVALUATION PROCESS • Guided by Empirical Inquiry • Based on Objective Standards ALL EVALUATION • Based on Subjectivist Ethics • Individual Subjective experiences B. EVALUATION APPROACHES Evaluation approaches are the various conceptual arrangements made for designing and actually conducting the evaluation process. Today, in educational setting (a. Original, b. Refinements/extensions) 1. LIBERAL DEMOCRACY 1st major classification of evaluation Anchored by House (1990) All major evaluation approaches are based on this common idealogy.
  • 17.
    1. UTILITARIANISM What isGood is Defined as that w/c maximizes the happiness of society as a whole. 2. INTUITIONIST OR PLURALIST No single interpretation of “the good” is assumed . Need not be explicitly stated nor justified. FORMS OF SUBJECTIVIST ETHICS EACH ETHICAL POSITION HAS ITS OWN WAYS OF OBTAINING KNOWLEDGE OR EPISTEMOLOGY
  • 18.
    EPISTEMOLOGY (Ways of ObtainingKnowledge) Knowledge is acquired w/c is capable of external verification & evidence (intersubjective agreement) thru methods and techniques universally accepted and through the presentation of data. The Objectivist Epistemology Is Associated with the UTILITARIAN ETHICS The Subjective Epistemology Is Asso. w/ the INTUITIONIST/PLURALIST ETHIC It is used to acquire new knowledge based on existing personal knowledge and experiences that are (explicit) or are not (tacit) available for public inspection.
  • 19.
    The Objectivist Epistemology IsAssociated with the UTILITARIAN ETHICS
  • 20.
    The Subjective Epistemology IsAssociated w/ the INTUITIONIST/PLURALIST ETHIC Tacit Knowledge Unwritten, unspoken, and hidden vast storehouse of knowledge held by practically every normal human being, based on his or her emotions, experiences, insights, intuition, observations and internalized information. Explicit knowledge It can be readily transmitted to others. The information contained in encyclopedias and textbooks Used to acquire new knowledge based on existing personal knowledge and experiences that are (explicit) or are not (tacit) available for public inspection.
  • 21.
    House’s approach further subdividesthe epistemological approach in terms of TWO (2) MAIN POLITICAL PERSPECTIVES 1. ELITIST=An Approach in which the idea is to focus on the perspectives of managers and top echelon people and professionals. 2. MASS-BASED = An Approach in which the focus is on consumers and the approaches are participatory.
  • 22.
    STUFFLEBEAM and WEBSTERS(1980) Place approaches into one of THREE(3) GROUPS ACCDG. TO THEIR ORIENTATION Toward the role of values, an ethical consideration 1. THE POLITICAL ORIENTATION (PSEUDO EVALUATION) Promotes a positive or negative view of an objective regardless of what its value actually might be. 2. THE QUESTION ORIENTATION (QUASI-EVALUATION) Includes approaches that might or might not provide answers specifically related to the value of an object. 3. THE VALUES ORIENTATION (TRUE EVALUATION) Includes approaches primarily intended to determine the value of some object.
  • 23.
    Classification of approachesfor conducting evaluations based on epistemology, major perspective, and orientation Epistemology (Ethic) Major perspective Orientation Political (Pseudo-evaluation) Questions (Quasi-evaluation) Values (True evaluation) Objectivist (Utilitarian) Elite (Managerial) Politically controlled Public relations Experimental research Management information systems Testing programs Objectives-based Content analysis Decision-oriented Policy studies Mass (Consumers) Accountability Consumer-oriented Subjectivist (Institutionalist/ Pluralist) Elite (Professional) Accreditation/ certification Connoisseur Mass (Participatory) Adversary Client-centered Note. Epistemology and major perspective from House (1978). Orientation from Stufflebeam & Webster (1980).
  • 24.
  • 25.
    Approach Organizer PurposeKey Strengths Key Weaknesses Politically controlled Treats Get keep or increase influence power or money Secure evidence advantages to the client in a conflict Violates the principle of full and frank disclosure Public relations Propaganda needs Create positive public mage Secure evidence most likely to bolster public support Violates the principles of balanced reporting, justified conclusions and objectivity
  • 26.
    Information obtained trhough politically controlledstudies is released to meet the speacial interests of the holder. Public Health and Safety Evaluation results will be used as framework in public health strategies. POLITICALLY CONTROLLED
  • 27.
    Used to paint positiveimage of an object. Customers perceive value based on the experiences they received. PUBLIC RELATIONS INFORMATION
Examples:
- Customer/constituent satisfaction surveys
- After-sales customer service
- Enhancing the quality of products and services offered
- Creating more services and products that will benefit the public
Approach: Experimental research
- Organizer: Causal relationships
- Purpose: Determine causal relationships between variables.
- Key strengths: Strongest paradigm for determining causal relationships.
- Key weaknesses: Requires a controlled setting, limits the range of evidence, focuses primarily on results.

Approach: Management information systems
- Organizer: Scientific efficiency
- Purpose: Continuously supply the evidence needed to fund, direct, and control programs.
- Key strengths: Gives managers detailed evidence about complex programs.
- Key weaknesses: Human service variables are rarely amenable to the narrow, quantitative definitions needed.

Approach: Testing programs
- Organizer: Individual differences
- Purpose: Compare test scores of individuals and groups to selected norms.
- Key strengths: Produces valid and reliable evidence in many performance areas; very familiar to the public.
- Key weaknesses: Data usually cover only testee performance, overemphasize test-taking skills, and can be a poor sample of what is taught or expected.

Approach: Objectives-based
- Organizer: Objectives
- Purpose: Relate outcomes to objectives.
- Key strengths: Common-sense appeal; widely used; uses behavioral objectives and testing technologies.
- Key weaknesses: Leads to terminal evidence that is often too narrow to provide a basis for judging the value of a program.

Approach: Content analysis
- Organizer: Content of a communication
- Purpose: Describe and draw conclusions about a communication.
- Key strengths: Allows unobtrusive analysis of large volumes of unstructured, symbolic materials.
- Key weaknesses: The sample may be unrepresentative yet overwhelming in volume; the analysis design is often overly simplistic for the question.

Approach: Accountability
- Organizer: Performance expectations
- Purpose: Provide constituents with an accurate accounting of results.
- Key strengths: Popular with constituents; aimed at improving the quality of products and services.
- Key weaknesses: Creates unrest between practitioners and consumers; politics often forces premature studies.
In norm-referenced test interpretation, your scores are compared with the test performance of a particular reference group, called the norm group. The norm group usually consists of large representative samples of individuals from specific populations, such as undergraduates, senior managers, or clerical workers. It is the average performance and distribution of their scores that become the test norms for the group. (http://www.psychometric-success.com/aptitude-tests/interpreting-test-results.htm)
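The norm comparison described above is usually expressed as a standardized (z) score: the raw score minus the norm group's mean, divided by the norm group's standard deviation. A minimal Python sketch, using an invented norm group for illustration:

```python
import statistics

def z_score(raw_score, norm_scores):
    """Standardize a raw score against a norm group's distribution."""
    mean = statistics.mean(norm_scores)
    sd = statistics.stdev(norm_scores)
    return (raw_score - mean) / sd

# Hypothetical norm group of test scores (invented for illustration).
norm_group = [62, 70, 74, 75, 78, 80, 81, 85, 88, 92]

# A raw score of 90 is about 1.3 standard deviations above this group's mean.
print(round(z_score(90, norm_group), 2))
```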
Steps in experimental research:
1. Design the experiment
2. Collect and analyze data
3. Draw conclusions
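Those three steps can be sketched in Python. The data here are simulated (invented treatment and control groups), and a real study would add a formal significance test at the conclusion step:

```python
import random
import statistics

random.seed(0)  # make the simulated data reproducible

# 1. Design: two groups of 30, treatment vs. control (simulated scores).
control = [random.gauss(70, 5) for _ in range(30)]
treatment = [random.gauss(75, 5) for _ in range(30)]

# 2. Collect and analyze: compare the group means.
diff = statistics.mean(treatment) - statistics.mean(control)

# 3. Draw conclusions (a real study would also test significance).
print(f"Mean difference (treatment - control): {diff:.1f}")
```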
Goals and objectives are similar in that they describe the intended purposes and expected results of teaching activities and establish the foundation for assessment. There are three types of learning objectives, which reflect different aspects of student learning:
- Cognitive objectives: "What do you want your graduates to know?"
- Affective objectives: "What do you want your graduates to think or care about?"
- Behavioral objectives: "What do you want your graduates to be able to do?"
(http://assessment.uconn.edu/primer/goals1.html)
Materials for content analysis:
- Print media: newspaper items, magazine articles, books, catalogues
- Other writings: web pages, advertisements, billboards, posters, graffiti
- Broadcast media: radio programs, news items, TV programs
- Other recordings: photos, drawings, videos, films, music
- Live situations: speeches, interviews, plays, concerts
- Observations: gestures, rooms, products in shops

For a media organization, the main purpose of content analysis is to evaluate and improve its programming. All media organizations are trying to achieve some purpose. For commercial media, the purpose is simple: to make money, and survive. For public and community-owned media, there are usually several purposes, sometimes conflicting, but each individual program tends to have one main purpose. (http://www.audiencedialogue.net/kya16a.html)
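A common first step in content analysis of such materials is simply tallying term frequencies across a body of communications. A minimal Python sketch, using invented transcripts for illustration:

```python
from collections import Counter
import re

def term_frequencies(texts):
    """Count word occurrences across a body of communications."""
    counts = Counter()
    for text in texts:
        # Lowercase and split into word tokens before counting.
        counts.update(re.findall(r"[a-z']+", text.lower()))
    return counts

# Hypothetical program transcripts (invented for illustration).
transcripts = [
    "Our program serves the community.",
    "The program improves community health.",
]

print(term_frequencies(transcripts).most_common(3))
```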
One of the most important questions when working with statistics is "Why are we doing this?" Proximate answers are "To find out if this new drug works better than the established ones" or "To describe the effect of inter-cropping on plant growth", while ultimate answers are "To improve medical treatment" or "To find appropriate cultivation techniques". Statistics are compiled by an IT department and then given back to the people who initially requested them for interpretation. (http://journal.code4lib.org/articles/1275)
A service offered by companies that focuses on the internal and external needs of a business's customers. Consumer orientation establishes and monitors standards of customer satisfaction and strives to meet the clientele's needs and expectations related to the product or service sold by the business. (http://www.businessdictionary.com/definition/consumer-orientation.html)
CHED ACCREDITATION IN THE PHILIPPINES
The CHED has its scheme of quality assurance when colleges and universities submit themselves to voluntary accreditation through the four accrediting agencies: the Philippine Association of Accrediting Agencies of Schools, Colleges and Universities (PAASCU); the Philippine Association of Colleges and Universities Commission on Accreditation (PACU-COA); the Association of Christian Schools and Colleges (ACSC); and the Accrediting Agency of Chartered Colleges and Universities in the Philippines (AACCUP), all under the umbrella of the Federation of Accrediting Agencies of the Philippines (FAAP). The CHED recognizes only the FAAP-certified accreditation of the four accrediting agencies, without necessarily encroaching on the academic autonomy of the latter. (http://stlinusonlineinstitute.com/yahoo_site_admin/assets/docs/CHED_ACCREDITATION_IN_THE_PHILIPPINES.67223608.pdf)
Accreditation is a concept of self-regulation which focuses on self-study and evaluation and on the continuing improvement of educational quality. It is both a process and a result. As a process, it is a form of peer review in which an association of schools and colleges establishes sets of criteria and procedures to encourage the maintenance of high standards of education among its affiliate members. As a result, it is a form of certification granted by a recognized and authorized accrediting agency to an educational program or to an educational institution as possessing certain standards of quality which are over and above those prescribed as minimum requirements for government recognition. Accreditation is based upon an analysis of the merits of educational operations in the context of the institution's philosophy and objectives. Membership in PACUCOA is open to all schools that are able to meet the standards and requirements of the agency. (http://www.pacucoa.ph/general_info.htm)
The connoisseurship model has two major implications: a holistic approach to the analysis and interpretation of data, and multiple perspectives in the evaluative tasks. (http://ged550.wikispaces.com/Eisner's+Educational+Connoisseurship+Model) Being connoisseurs and critics involves more than gaining and exercising technical knowledge and skills; it depends on us also cultivating a kind of artistry. In this sense, educators are not engineers applying their skills to carry out a plan or drawing; they are artists who are able to improvise and devise new ways of looking at things. (http://infed.org/mobi/evaluation-theory-and-practice/)
To this end, the approach makes use of teams of evaluators who present two opposing views (these teams are commonly referred to as adversaries and advocates). The two sides then agree on issues to address, collect data or evidence which forms a common database, and present their arguments. A neutral party is assigned to referee the hearing and is expected to arrive at a fair verdict after considering all the evidence presented.
From the first day of service, and continuing through each and every session, the unique needs of the client are at the core of our treatment model. Trained therapy professionals are dedicated to the mission of HCT and to the clients we serve. (http://healthcaretherapies.net/treatment_model.php)
Client-Centered Nutrition Education (CCNE) is a style of education that encourages participants to play an active role in their own learning and allows staff to act as guides or facilitators. CCNE provides opportunities for group discussion, incorporates hands-on activities and, best of all, allows participants to share experiences and provide social support to each other. CCNE makes the learning experience more fun, engaging, and meaningful, not only for participants but also for staff. (http://www.dshs.state.tx.us/wichd/nut/ccne.aspx)
DETAILED LIST OF METHODS, TECHNIQUES AND APPROACHES FOR CONDUCTING EVALUATION: ACCELERATED AGING, ACTION RESEARCH, ADVANCED PRODUCT QUALITY PLANNING, ALTERNATIVE ASSESSMENT, APPRECIATIVE INQUIRY, AXIOMATIC DESIGN, BENCHMARKING, CASE STUDY, CHANGE MANAGEMENT, CLINICAL TRIAL, COHORT STUDY, COMPETITOR ANALYSIS, CONSENSUS DECISION-MAKING, CONSENSUS-SEEKING DECISION-MAKING, CONTENT ANALYSIS, CONVERSATION ANALYSIS, COST-BENEFIT ANALYSIS, COURSE EVALUATION, DELPHI TECHNIQUE, DISCOURSE ANALYSIS, ELECTRONIC PORTFOLIO, ENVIRONMENTAL SCANNING, ETHNOGRAPHY, EXPERIMENT, EXPERIMENTAL TECHNIQUES, FACTOR ANALYSIS, FACTORIAL EXPERIMENT, FEASIBILITY STUDY, FIELD EXPERIMENT, FIXTURELESS IN-CIRCUIT TEST, FOCUS GROUP, FORCE FIELD ANALYSIS, GAME THEORY, GRADING, HISTORICAL METHOD, INQUIRY, INTERVIEW, MARKETING RESEARCH, META-ANALYSIS, METRICS, MOST SIGNIFICANT CHANGE, MULTIVARIATE STATISTICS, NATURALISTIC OBSERVATION, OBSERVATIONAL TECHNIQUES, and others.
THE 'CIPP' MODEL OF EVALUATION (Context, Input, Process, Product)
The approach essentially systematizes the way we evaluate the different dimensions and aspects of curriculum development and the sum total of student experiences in the educative process.
CONTEXT
- What is the relation of the course to other courses?
- Is the time adequate?
- What are critical or important external factors (networks, ministries)?
- Should courses be integrated or separate?
- What are the links between the course and research/extension activities?
- Is there a need for the course?
- Is the course relevant to job needs?
INPUTS
- What is the entering ability of students?
- What are the learning skills of the students?
- What is the motivation of the students?
- What are the living conditions of students?
- What is the students' existing knowledge?
- Are the aims suitable?
- Is the course content clearly defined?
- Does the content (knowledge, skills, attitudes) match student abilities?
- Is the content relevant to practical problems?
- What is the theory/practice relevance?
- What resources/equipment are available?
- What books do the teachers have?
INPUTS (cont.)
- What books do the students have?
- How strong are the teaching skills of the teachers?
- What time is available, compared with the workload, for preparation?
- What knowledge, skills and attitudes, related to the subject, do the teachers have?
- How supportive is the classroom environment?
- How many students are there?
- How many teachers are there?
- How is the course organized?
- What regulations relate to the training?
- Are the objectives SMART?
- Do the objectives derive from the aims?
PROCESS
- What is the workload of students?
- How well/actively do students participate?
- Are there any problems related to teaching?
- Are there any problems related to learning?
- Is there effective two-way communication?
- Is knowledge only transferred to students, or do they use and apply it?
- Are there any problems which students face in using/applying/analysing the knowledge and skills?
- Are the teaching and learning processes continuously evaluated?
- Are teaching and learning affected by practical/institutional problems?
- What is the level of cooperation/interpersonal relations between teachers and students?
- How is discipline maintained?
PRODUCT
- Is there one final exam at the end, or several during the course?
- Is there any informal assessment?
- What is the quality of assessment (i.e., what levels of KSA, knowledge, skills and attitudes, are assessed)?
- What are the students' KSA levels after the course?
- Is the evaluation carried out for the whole process?
- How do students use what they have learned?
- How was the overall experience for the teachers and for the students?
- Has the teacher's reputation improved or been ruined as a result?
- What are the main 'lessons learned'?
- Is there an official report?
These guiding questions are not answered by the teacher only, or by any single individual. Instead, there are many ways in which they can be answered. Some of the more common methods are listed below.
1. Discussion with the class
2. Informal conversation or observation
3. Individual student interviews
4. Evaluation forms
5. Observation in class/session of the teacher/trainer by colleagues
6. Video tape of one's own teaching (micro-teaching)
7. Organizational documents
8. Participant contract
9. Performance test
10. Questionnaire
11. Self-assessment
12. Written test
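Evaluation forms and questionnaires among these methods typically yield ratings that need summarizing before they can inform a judgment. A minimal Python sketch of how one questionnaire item, rated on a 1-to-5 scale, might be summarized (the ratings are invented for illustration):

```python
import statistics

def summarize_item(responses):
    """Summarize one questionnaire item rated on a 1-to-5 scale."""
    return {
        "n": len(responses),
        "mean": round(statistics.mean(responses), 2),
        # Share of respondents rating 4 or 5 ("favorable").
        "pct_favorable": round(
            100 * sum(r >= 4 for r in responses) / len(responses), 1
        ),
    }

# Hypothetical course-evaluation ratings (invented for illustration).
ratings = [5, 4, 4, 3, 5, 2, 4, 5]
print(summarize_item(ratings))
```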
E. SUMMARY OF KEYWORDS AND PHRASES
- ASSESSMENT is the process of gathering and analyzing specific information as part of an evaluation.
- COMPETENCY EVALUATION is a means for teachers to determine the ability of their students in ways other than the standardized test.
- COURSE EVALUATION is the process of evaluating the instruction of a given course.
- EDUCATIONAL EVALUATION is evaluation conducted specifically in an educational setting.
- IMMANENT EVALUATION is opposed by Gilles Deleuze to value judgment.
- PERFORMANCE EVALUATION is a term from the field of language testing; it stands in contrast to competence evaluation.
- PROGRAM EVALUATION is essentially a set of philosophies and techniques to determine if a program 'works'.
