FOREIGN LANGUAGE CLASSROOM ASSESSMENT IN SUPPORT OF TEACHING AND LEARNING
Matt Poehner (mep158@psu.edu)
Center for Advanced Language Proficiency Education and Research, The Pennsylvania State University
An assessor is: 
"an educator whose success is to be judged by what others learn rather than a referee for a basketball game who is hired to decide who is 'right' or 'wrong'" (Cronbach et al., 1980)
“Simply knowing the final score of the game after it is over is not very useful. What we need is a vivid rendering of how that game is played” (E.W. Eisner)
About Me
Relating Assessment to Teaching & Learning 
Divergence b/t activities of intervention in processes of development (i.e., teaching) & attempts to understand products of development (i.e., testing… and assessment?)
Teaching-Assessment Dualism -> both domains developed (throughout the 20th century) their own theories, research methods, and forms of practice
“Incompatible epistemological traditions” (Delandshere 2002, p. 1478) 
Esp. evident in area of Summative Assessment: at end of a period of study learners receive scores, grades, or rankings to summarize performance, often in relation to others (Bachman 1990)
◦ How helpful is this 'feedback' for learner development?
Modeling relationship between teaching and assessment 
Teaching and Assessment are separate enterprises 
Connection, or 'spill-over' between them, is often viewed as a negative (narrowing the curriculum, teaching to the test, washback)
Closer interface between assessment and teaching (Bachman & Cohen 1996); diagnostic assessment (Alderson 2006); interactive assessment (ongoing work in HK - Davison, Hamp-Lyons) 
(Diagram: Teaching and Assessment shown as separate boxes)
3 Ps of Assessment 
1. Purposes: Why are we doing the assessment? What information do we hope to obtain? What will we do with this information? What decisions need to be made?
2. Principles: Where do (language) knowledge and abilities reside? How can we get at them? Under what conditions? What/how much can be inferred? How do we know if our inferences are valid?
3. Practices: Open response or closed response? Discrete items or tasks/projects? Ongoing or one-off assessments? Timed assessments? Access to resources? Permission to interact/collaborate?
Moss (2003): Fundamental Differences b/t Psychometric Testing Paradigm & Classroom 
Discrete, one-off activity versus ongoing activity 
Opportunities to gather various kinds of info 
Formulation of kinds of information that are valued & to what degree 
Authority of 'scientific' tests to trump other indicators?
Object of assessment – ability as a stable trait in an individual's head or emerging through collaborative performance
Assessment Outcomes – reduction of the complexity of performance to a number (percentile ranking, score) or sharing of work (products & processes)
Assumptions Behind Assessment Practices (Delandshere 2002, p. 1480)
CONSIDER: when the same test is given to all sixth graders in a state to find out whether their educational experiences yield similar achievements, is it because we are working from a theory stating that if students have all been taught the same thing, they all will learn it in the same way at the same time? 
It seems unlikely that any educator would articulate such a theory. Yet without this perspective, how can current forms of state-mandated assessment be justified? 
When assumptions and theoretical explanations are not made explicit, they tend to appear to be unreasoned speculations
Classroom Assessment – Low Risk? (Rea-Dickins & Gardner 2000)
Classroom assessment, despite assumptions to the contrary, is not necessarily low-stakes, because it is often the case that high-stakes decisions are predicated on learners' in-class performance. The problem is that because it tends to be informal and unsystematic, such an assessment could well result in either an overestimation of a learner's or a group's ability or, conversely, it could underestimate their progress.
Delandshere (2002, p. 1475): From Assessment as Measurement to Assessment as Inquiry? 
Move from state of having knowledge to action involving participation, transaction, & transformation 
Move from educational practice of assessment where we have defined a priori what we are looking for to educational practice where we are participating in activities in which we formulate representations to better understand & transform world 
◦ Doesn't mean we don't have criteria to determine quality
Inquiry – open, critical, dialogic – for the purpose of understanding and supporting learning and knowing
Delandshere (2002, p. 1479): Assessment as Inquiry (cont.) 
'Typical' assessment question: What do students know?
◦ Most current educational assessment practices rest on this question, answered by administering tests and scoring responses
◦ Scores interpreted as representing the amount or level of knowledge each student possesses
◦ Consistent w/ early theoretical perspectives on learning (e.g., behaviorism); assessment questions typically defined in narrow and oversimplified ways: How is Jenny doing in spelling this grading period?
Different theoretical perspective: What do students know? Or: How do they accomplish this task?
Efforts to Relate Assessment to Teaching/Learning 
Embedded Assessments: tasks/projects designed to perform learning function but also provide basis for evaluating learner performance (Spence-Brown 2001) 
◦ But which orientation will dominate for teachers? And for students?
Task-based Pedagogy: the difference between parallel assessment tasks and learning tasks is the presence of teacher support (Candlin 2001)
-> Why must the teacher be removed from this process when it is framed as assessment?
Possibility of Integrating Assessment & Teaching? 
In classroom, a single activity performing both evaluative and instructional functions? 
Question of perspective one wishes to take (i.e., viewing it as a 'teaching' episode or an 'assessment' episode)?
Consequences -> How does one proceed to structure activities/tasks to perform a dual function? How does one approach interactions with learners? (Diagram: Assessment-Teaching)
Formative Assessment (d'Anglejan, Harley & Shapson 1990)
"…information which will inform teachers and students about the degree of success of their respective efforts in the classroom. It allows teachers to diagnose students' strengths and weaknesses in relation to specific curricular objectives and thus guides them in organizing and structuring instructional material"
Integrating Formative Assessment into Classroom Practice 
• Ellis (2003): Practice-Based or Incidental Formative Assessment -> curriculum-driven, integrated into the instructional process. May involve:
• External: teachers and students reflect on student performance while it unfolds or just after it unfolds
• Internal: occurs through teacher questioning, probing, and on-line feedback
• Aligns with calls for Assessment-for-Learning, the idea that assessment with formative feedback should be embedded in classroom activities (Black & Wiliam, 1996)
Importance of feedback in Formative Assessment 
- "Feedback, however detailed, will not lead to improvement until a pupil understands both the feedback and how to use it in the context of his/her own work" (Sadler, 1989; cited in Hall and Burke, 2003: 58)
- While formative assessment may be informally conducted, feedback and monitoring must be systematic to avoid underestimating learner abilities or overestimating learner progress (Rea-Dickins & Gardner)
Note about Feedback in Formative Assessment vs. Learner Self-Assessment
"If the learner generates the relevant information by him/herself, the procedure is a part of self-monitoring," but "if the source of information is external to the learner (for example, the teacher), then it is associated with feedback" (Sadler, 1989).
Sadler (ibid.) says that "it is important to facilitate the transition from feedback to self-monitoring"
Summary
1. Assessment need not be a stand-alone activity at odds with teaching & learning; it can be a perspective on classroom activity, a form of inquiry that accompanies daily practices & interactions
2. Teachers are implicated in this process
- source of support and feedback during activity (systematic & tailored)
3. Learners expected to move toward becoming their own source of feedback/support, guiding themselves more independently
Dynamic Assessment 
an “approach to understanding individual differences and their implications for instruction…[that] embeds intervention within the assessment procedure” (Lidz & Gindis 2003: 99) 
focuses “on modifiability and on producing suggestions for interventions that appear successful in facilitating improved learner performance” (Lidz 1991: 6) 
From this perspective, all assessments (even formal testing) must also be formative
Importance of the Zone of Proximal Development (ZPD) 
Actual level of development -> inferred according to observations of learner independent work/performances 
Proximal development -> understood according to learner responsiveness when offered support; how much more are they capable of when working cooperatively with others? 
"determining the actual level of development not only does not cover the whole picture of development, but very frequently encompasses only an insignificant [italics added] part of it" (Vygotsky, 1998)
Importance of Mediation for Diagnosis 
Responsiveness to mediation – support carefully calibrated to learner needs – indispensable for understanding cognitive ability (provides insight into the person's future potential development)
- what an individual is able to do one day with assistance, s/he is able to do tomorrow alone
ZPD always situates instruction ahead of development relative to abilities in the process of maturing. The goal of DA is not to measure these abilities but to help them mature.
ZPD and Integration of Teaching & Assessing 
ZPD indicates abilities that are still emerging & therefore ripe for instructional intervention, so development-oriented teaching should target the ZPD by providing mediation to help learners stretch beyond their current capabilities (Teaching)
To determine a learner's ZPD involves engaging him/her in activities beyond his/her current capabilities and providing mediation as needed (Assessment)
Types of DA 
Interventionist – mediation is scripted beforehand and must be administered in a standardized format (the same way for everyone)
Interactionist – mediation is emergent in the interaction between examiner & examinee (or mediator & learner)
Differences in:
◦ Scale – replicability & comparisons
◦ Planning – demands on the mediator during the procedure
◦ Effectiveness – how effectiveness is defined
(A sketch of the contrast in format follows below.)
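To make the contrast concrete, here is a minimal, hypothetical sketch (not from the talk): interventionist DA delivers a pre-scripted prompt sequence identically to every learner and records a comparable count, whereas interactionist DA lets the mediator choose each next move and records the emerging transcript. The prompt wordings, function names, and the simple counting scheme are illustrative assumptions.

```python
# Hypothetical sketch contrasting the two DA formats; not an implementation
# from the talk. Prompt texts and the counting scheme are assumptions.

from typing import Callable, List

# Interventionist DA: a prompt script fixed in advance, delivered identically to everyone.
SCRIPTED_PROMPTS: List[str] = [
    "Pause",
    "Repeat the whole phrase questioningly",
    "Point out the incorrect word",
    "Give the correct answer and explain why",
]

def interventionist_da(learner_responds: Callable[[str], bool]) -> int:
    """Deliver the same prompts, in the same order, to every learner until
    self-correction occurs. The number returned (prompts needed) supports
    replicability and cross-learner comparison."""
    for count, prompt in enumerate(SCRIPTED_PROMPTS, start=1):
        if learner_responds(prompt):
            return count
    return len(SCRIPTED_PROMPTS) + 1  # learner did not self-correct within the script

def interactionist_da(learner_responds: Callable[[str], bool],
                      choose_next_prompt: Callable[[List[str]], str],
                      max_turns: int = 8) -> List[str]:
    """Mediation emerges turn by turn: the mediator chooses each next move in
    light of the interaction so far, so the record is the transcript of prompts
    rather than a standardized score."""
    history: List[str] = []
    while len(history) < max_turns:
        prompt = choose_next_prompt(history)  # in-the-moment judgment by the mediator
        history.append(prompt)
        if learner_responds(prompt):
            break
    return history
```

The design difference shows up in the return values: a standardized count lends itself to replicability and comparison across learners, while a transcript preserves the interaction but places heavier in-the-moment demands on the mediator.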
Inventory of Mediating Prompts 
1. Pause
2. Repeat the whole phrase questioningly
3. Repeat just the part of the sentence with the error
4. Teacher asks, "What is wrong with that sentence?"
5. Teacher points out the incorrect word
6. Teacher asks an either/or question (negros o negras?)
7. Teacher identifies the correct answer
8. Teacher explains why
Sample Cumulative Mediation Chart (Poehner, 2009)
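As a rough illustration of how such a chart could be tallied (a hypothetical sketch, not the actual chart from Poehner, 2009): record, for each error episode, the most explicit prompt from the inventory above that the learner needed before self-correcting, then summarize per session. The class name, field names, and scoring convention (0 = no mediation needed; 1-8 = deepest prompt reached) are assumptions for illustration.

```python
# Hypothetical tallying sketch for a cumulative mediation chart.
# Scoring convention is an assumption: lower totals = more independent performance.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ItemRecord:
    item: str          # the task item or error episode
    prompt_level: int  # 0 = self-corrected with no mediation; 1-8 = deepest prompt needed

def cumulative_mediation(records: List[ItemRecord]) -> Dict[str, float]:
    """Summarize one learner's session as a single row of a cumulative mediation chart."""
    total = sum(r.prompt_level for r in records)
    unmediated = sum(1 for r in records if r.prompt_level == 0)
    return {
        "items": len(records),
        "unmediated_items": unmediated,
        "total_mediation": total,
        "mean_prompt_level": total / len(records) if records else 0.0,
    }

if __name__ == "__main__":
    # Hypothetical session targeting Spanish adjective agreement (cf. "negros o negras?")
    session = [
        ItemRecord("ojos negras -> ojos negros", prompt_level=3),
        ItemRecord("casa blanco -> casa blanca", prompt_level=1),
        ItemRecord("flores rojos -> flores rojas", prompt_level=0),
    ]
    print(cumulative_mediation(session))
```

Falling totals across successive sessions would then be read as movement toward self-regulation, in line with the earlier summary point about learners becoming their own source of feedback.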
Resources Online 
CALPER: 
http://calper.la.psu.edu 
1. Information regarding Dynamic Assessment, including ordering the Teacher's Guide to Dynamic Assessment, as well as free downloadable documents
2. Assessment website:
http://calper.la.psu.edu/assessment/index.php 
Including assessment terminology, overview of assessment practices, and additional readings
