Programme Assessment:
Insights about student learning
from TESTA
Dr Tansy Jessop, TESTA Project Leader
Presentation to the SDG Course Leaders
University of the West of Scotland
12 May 2014
1) Assessment drives what students pay
attention to, and defines the actual
curriculum (Ramsden 1992).
2) Feedback is significant (Hattie, 2009; Black
and Wiliam, 1998).
3) Programme is central to influencing change.
TESTA premises
Thinking about modules
modulus (Latin): small measure
“interchangeable units”
“standardised units”
“sections for easy constructions”
“a self-contained unit”
How well does IKEA 101 packaging
work for Sociology 101?
Furniture
 Bite-sized
 Self-contained
 Interchangeable
 Quick and instantaneous
 Standardised
 Comes with written
instructions
 Consumption
Student Learning
 Long and complicated
 Interconnected
 Distinctive
 Slow, needs deliberation
 Varied, differentiated
 Tacit, unfathomable,
abstract
 Production
 HEA funded research project (2009-12)
 Seven programmes in four partner universities
 Maps programme-wide assessment
 Engages with Quality Assurance processes
 Diagnosis – intervention – cure
What is TESTA?
Transforming the Experience of Students through Assessment
TESTA ‘Cathedrals Group’ Universities
Edinburgh
Edinburgh Napier
Greenwich
Canterbury Christ Church
Glasgow
Lady Irwin College University of Delhi
University of the West of Scotland
Sheffield Hallam
TESTA
“…is a way of thinking
about assessment and
feedback”
Graham Gibbs
 Time-on-task
 Challenging and high expectations
 Students need to understand goals and standards
 Prompt feedback
 Detailed, high quality, developmental feedback
 Dialogic cycles of feedback
 Deep learning – beyond factual recall
Based on assessment principles
TESTA Research Methods
(Drawing on Gibbs and Dunbar-Goddet, 2008, 2009)
ASSESSMENT
EXPERIENCE
QUESTIONNAIRE
FOCUS GROUPS
PROGRAMME AUDIT
Programme
Team
Meeting
 Number of assessment tasks
 Summative/formative
 Variety
 Proportion of exams
 Oral feedback
 Written feedback
 Speed of return of feedback
 Specificity of criteria, aims and learning outcomes.
Audit in a nutshell
 Quantity of Effort
 Coverage of content and knowledge
 Clear goals and standards
 Quantity and Quality of Feedback
 Use of feedback
 Appropriate assessment
 Learning from exams
 Deep and surface learning
Assessment Experience
Questionnaire
Focus Groups
 Student voice and narrative
 Explanation
 Corroboration & contradiction
 Compelling evidence with the stats
Case Study X: what’s going on?
 Mainly full-time lecturers
 A wide variety of assessment, no exams
 Reasonable amount of formative assessment (14 x)
 33 summative assessments
 Masses of written feedback on assignments (15,000 words)
 Learning outcomes and criteria clearly specified
….looks like a ‘model’ assessment environment
But students:
 Put in little effort, and concentrate that effort on a few topics
 Don’t think there is much feedback or that it is very useful, and don’t
make use of it
 Don’t think it is at all clear what the goals and standards are
 …are unhappy
Case Study Y: what’s going on?
 35 summative assessments
 No formative assessment specified in documents
 Learning outcomes and criteria wordy and woolly
 Marking by global, tacit, professional judgements
 Teaching staff mainly part-time and hourly paid
….looks like a problematic assessment environment
But students:
 Put in a lot of effort and distribute their effort across topics
 Have a very clear idea of goals and standards
 Are self-regulating and have a good idea of how to close the gap
Two paradigms…
Transmission Model
Social Constructivist model
 Audit
 AEQ
 Focus Groups
Research Methods
1) Any interesting patterns?
2) Anything particularly striking?
3) Any dangling questions, curiosities,
scepticisms?
4) Any predictions, hunches, thoughts about
what other data might throw up?
Task 1: Audit data
Audit Variables                   Humanities (Mean)   Sciences (Mean)   Professional (Mean)
Total number of assessments       54                  67                42
Number of summative assessments   42                  43                32
Number of formative assessments   12                  31                10
Varieties of assessment (n)       11                  15                14
Proportion of examinations        14.3%               30.6%             15%
Time to return feedback           23 days             20 days           23 days
Amount of oral feedback           3hrs 17mins         4hrs 56mins       10hrs 33mins
Amount of written feedback        7,382 words         3,615 words       7,040 words
Discipline Effects
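As an illustration only (not part of the TESTA toolkit), the discipline means in the audit table above can be tabulated programmatically to compare the formative/summative balance across disciplines; the figures are taken directly from the table, and the dictionary layout is an assumption for the sketch.

```python
# Illustrative sketch (assumed structure, not the TESTA instrument itself):
# the discipline means from the audit table above.
audit = {
    "Humanities":   {"summative": 42, "formative": 12, "exam_pct": 14.3},
    "Sciences":     {"summative": 43, "formative": 31, "exam_pct": 30.6},
    "Professional": {"summative": 32, "formative": 10, "exam_pct": 15.0},
}

def formative_ratio(counts):
    """Formative tasks per summative task -- a rough balance indicator."""
    return counts["formative"] / counts["summative"]

for discipline, counts in audit.items():
    print(f"{discipline}: {formative_ratio(counts):.2f} formative per summative")
```

On these means, the Sciences programmes run roughly 0.7 formative tasks per summative one, against about 0.3 in Humanities and Professional programmes — the kind of discipline effect the audit is designed to surface.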
a) Fill in an AEQ from the vantage point of being a
student in the past, or one of your own students
b) Cluster the questions into scales using the
Scales Exercise.
c) What’s a ‘good’ score?
d) Any thoughts, issues, comments, questions?
Task 2: The AEQ
In pairs, explore programme audit and AEQ
data from one programme.
Triangulation: Audit and AEQ data
 In pairs/groups, read through quotes from student
focus group data on a particular theme.
 What problems does the data imply?
 What solutions might a programme develop to
address some of these challenges?
 A3 sheets provided to tease out challenges and
solutions.
Focus Group data
Challenges Solutions
Student voice data
 If there weren’t loads of other assessments, I’d do it.
 If there are no actual consequences of not doing it, most
students are going to sit in the bar.
 I would probably work for tasks, but for a lot of people, if
it’s not going to count towards your degree, why bother?
 The lecturers do formative assessment but we don’t get
any feedback on it.
Theme 1: Formative is a great idea
but…
We could do with more assessments over the course of the year
to make sure that people are actually doing stuff.
We get too much of this end or half way through the term essay
type things. Continual assessments would be so much better.
So you could have a great time doing nothing until like a month
before Christmas and you’d suddenly panic. I prefer steady
deadlines, there’s a gradual move forward, rather than bam!
Theme 2: Assessment isn’t driving
and distributing student effort
 The feedback is generally focused on the module.
 It’s difficult because your assignments are so detached from
the next one you do for that subject. They don’t relate to each
other.
 Because it’s at the end of the module, it doesn’t feed into our
future work.
 You’ll get really detailed, really commenting feedback from
one tutor and the next tutor will just say ‘Well done’.
Theme 3: Feedback is disjointed
and modular
 The assessment guidelines is a formal document so the language
is quite complex and I’ve had to read it a good few times to kind
of understand what they are saying.
 Assessment criteria can make you take a really narrow approach.
 They read the essay and then they get a general impression, then
they pluck a mark from the air.
 It’s a shot in the dark.
 We’ve got two tutors – one marks completely differently to the
other and it’s pot luck which one you get.
Theme 4: Students are not clear
about goals and standards
1. Too much summative; too little formative
2. Too wide a variety of assessment
3. Lack of time on task
4. Inconsistent marking standards
5. ‘Ticking’ modules off
6. Poor feedback: too little and too slow
7. Lack of oral feedback; lack of dialogue about standards
8. Instrumental reproduction of materials for marks
Main findings
1. Students and staff can’t do more of both.
2. Reductions in summative – how many is enough?
3. Increase in formative – and make sure it is valued and
required.
4. Debunking the myth of two summative per module.
5. Articulating rationale with students, lecturers, senior
managers and QA managers.
1. Summative-formative issues
The case of the under-performing engineers (Graham,
Strathclyde)
The case of the cunning (but not litigious) lawyers (Graham,
somewhere)
The case of the silent seminar (Winchester)
The case of the lost accountants (Winchester)
The case of the disengaged Media students (Winchester)
1. Examples of ramping up formative
The case of low effort on Media Studies
The case of bunching on the BA Primary
2. Examples of improving ‘time on task’
The case of the closed door (Psychology)
The case of the one-off in History (Bath Spa)
The case of the Sports Psychologist (Winchester)
The conversation gambit
3. Engaging students in reflection
through improving feedback
 The case of the maverick History lecturer (a dove)
 The case of the highly individualistic creative
writing markers
4. Internalising goals and standards
Programmatic Assessment Design
Feedback Practice
Paper processes to people talking
Changes
 Improvements in NSS scores on A&F – from bottom
quartile in 2009 to top quartile in 2013
 Three programmes with 100% satisfaction ratings post
TESTA
 All TESTA programmes have some movement upwards
on A&F scores
 Programme teams are talking about A&F and pedagogy
 Periodic review processes are changing for the better.
Impacts
www.testa.ac.uk
Gibbs, G. & Simpson, C. (2004) Conditions under which assessment supports students' learning.
Learning and Teaching in Higher Education. 1(1): 3-31.
Gibbs, G. & Dunbar-Goddet, H. (2009) Characterising programme-level assessment environments
that support learning. Assessment & Evaluation in Higher Education. 34(4): 481-489.
Hattie, J. & Timperley, H. (2007) The Power of Feedback. Review of Educational Research. 77(1): 81-112.
Jessop, T. & Maleckar, B. (in press) The influence of disciplinary assessment patterns on student
learning: a comparative study. Studies in Higher Education.
Jessop, T., El Hakim, Y. & Gibbs, G. (2014) The whole is greater than the sum of its parts: a large-scale
study of students' learning in response to different assessment patterns. Assessment &
Evaluation in Higher Education. 39(1): 73-88.
Jessop, T., McNab, N. & Gubby, L. (2012) Mind the gap: An analysis of how quality assurance processes
influence programme assessment patterns. Active Learning in Higher Education. 13(3): 143-154.
Nicol, D. (2010) From monologue to dialogue: improving written feedback processes in mass higher
education. Assessment & Evaluation in Higher Education. 35(5): 501-517.
Sadler, D.R. (1989) Formative assessment and the design of instructional systems. Instructional
Science. 18: 119-144.
References


Editor's Notes

  • #3: Students spend most time and effort on assessment. Assessment is the cue for student learning and attention. It is also the area where students show least satisfaction on the NSS. Scores on other factors return about 85% good rankings, whereas only 75% of students find assessment and feedback ‘good’. We often think the curriculum is the knowledge, content and skills we set out in the planned curriculum, but from a student’s perspective, the assessment demands frame the curriculum. Looking at assessment from a modular perspective leads to myopia about the whole degree and the disciplinary discourse, and often prevents students from connecting and integrating knowledge and meeting progression targets. It is very difficult for individual teachers on modules to change the way a programme works through exemplary assessment practice on modules. It takes a programme team and a programme to bring about changes in the student experience. Assessment innovations at the individual module level often fail to address assessment problems at the programme level, some of which, such as too much summative assessment and not enough formative assessment, are a direct consequence of module-focused course design and innovation.
  • #4: Raise the question: are there problems with the packaging? It works for furniture – does it work for student learning? Assumptions of modularity: self-contained; disconnected; interchangeable. The next slide indicates some of the tensions of packaging learning in modules, and the tensions inherent in the metaphor.
  • #5: Originally used for furniture and prefab and modular homes – how well does it suit educational purposes? I’m not taking issue with modules per se, but want to highlight that there have been some unintended consequences – some good, some bad – of using modular systems. Many programmes have navigated through them, some haven’t. Anyone who has built IKEA furniture knows that the instructions are far from self-evident – and we have translated a lot of our instructions, criteria, programme and module documents for students in ways that may be as baffling for them. Have we squeezed learning into a mould that works better for furniture?
  • #8: Huge appetite for programme-level data in the sector. Worked with more than 100 programmes in 40 universities internationally. The timing of TESTA – many universities revisiting the design of degrees, thinking about coherence, progression and the impact of modules on student learning. The confluence of modules with semesterisation, the lack of slow learning, silo effects and the pointlessness of feedback after the end of a module…
  • #9: What started as a research methodology has become a way of thinking. David Nicol – changing the discourse, the way we think about assessment and feedback; not only a technical, research and mapping exercise, but also something that shapes our thinking. Evidence; assessment principles.
  • #11: Based on robust research methods about whole programmes - 40 audits; 2000 AEQ returns; 50 focus groups. The two triangulating methodologies of the AEQ and focus groups are student experience data – student voice etc. Three legged stool. These three elements of data are compiled into a case profile which captures the interaction of an academic’s programme view, the ‘official line’ or discourse of assessment and how students perceive it. This is a very dynamic rendering because student voice is explanatory, but also probes some of our assumptions as academics about how students work and how assessment works for them etc. Finally the case profile is subject to discussion and contextualisation by insiders – the people who teach on the programme, who prioritise interventions.
  • #12: Hard data from chat and documents
  • #14: More than 50
  • #15: Large programme; modular approaches; marker variation, late feedback; dependency on tutors
  • #20: Taster of each
  • #28: Student workloads often concentrated around two summative points per module. Sequencing, timing, bunching issues, and ticking off modules so students don’t pay attention to feedback at the end point.
  • #30: Limitations of explicit criteria, marker variation is huge, particularly in humanities, arts and professional courses (non science ones) Students haven’t internalised standards which are often tacit. Marking workshops, exemplars, peer review.
  • #34: Seminars; YouTube presentations; teaching students – map my programme; under-confident but keen journal club. Principles: make it authentic; multi-stage; public work – social pressure; spread and co-ordinate hand-in dates; formative requirements – peer marking and accountability sampling; setting first-year expectations. Brief, frequent, innovative, developmental.
  • #39: TESTA Higher Education Academy NTFS project, funded for 3 years in 2009. 4 partner universities, 7 programmes – ‘cathedrals group’. Gather data on whole programme assessment, and feed this back to teams in order to bring about changes. In the original seven programmes collected before and after data.