Rewriting the Rules 
HEIR network | September 2014 
Dr Rachel Forsyth 
PL for curriculum development and innovation, MMU 
Professor Mark Stubbs 
Head of Learning and Research Technologies, MMU 
http://twitter.com/rmforsyth 
r.m.forsyth@mmu.ac.uk
TRansforming Assessment and 
Feedback For Institutional 
Change: TRAFFIC 
Aim: 
to align assessment and 
feedback policies, processes 
and support with institutional 
goals of enhancing student 
satisfaction and success.
Context 
• 36,000 students, 2,700 of them on 
Combined Honours 
• 600,000 submissions annually 
• Big peaks around submission 
dates – systems issues 
• Adrift of the sector on the NSS 
(National Student Survey) 
• Just completed a huge 
curriculum change
Baseline report 
• Review of documentation (policies, regulations, etc.) 
• Interviews with staff involved with assessment 
(purposive sampling) 
• Interviews with innovators 
• Focus groups with student support officers, 
administrators and technology-enhanced learning 
officers 
• Thorough discussion throughout governance 
structure
Understanding how best to make a difference 
People were frustrated with: 
• Overly bureaucratic processes 
• Procedures which were not always clear 
• Lack of consistency 
• Stand-alone systems for different parts of the 
process 
• Myths about assessment
Further investigation 
1. Collection of data on assignment types and 
submission dates: 
– myth-busting, guidance for consistency, 
development of EMA (electronic management 
of assessment) requirements (see the sketch 
below) 
2. Review of student comments about assessment 
and working with the Students’ Union (SU) 
– unpicking dissatisfaction issues
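
To give a flavour of the first strand, here is a minimal sketch of how such submission data might be explored, assuming the records were exported to a CSV file; the file name and the columns assignment_type and due_date are hypothetical illustrations, not drawn from the TRAFFIC project itself.

import pandas as pd

# Hypothetical export of submission records; the column names are
# illustrative assumptions, not the project's actual schema.
submissions = pd.read_csv("submissions.csv", parse_dates=["due_date"])

# Tally assignment types: which tasks are actually set, and how often
# (the myth-busting step, and the basis for targeted guidance).
type_counts = submissions["assignment_type"].value_counts()
print(type_counts.head(10))

# Count submissions due in each ISO week to expose the peaks that
# put systems under pressure around common hand-in dates.
weekly_peaks = (
    submissions.groupby(submissions["due_date"].dt.isocalendar().week)
    .size()
    .sort_values(ascending=False)
)
print(weekly_peaks.head())

Counting by week rather than by exact day smooths minor scheduling noise while still showing where the largest peaks fall.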
Assignment types
Blog 
Poster 
Workbook 
Autobiography 
Computer Based 
Logbook 
Override 
Portfolio report 
Programming Assignment 
Programming Exercises 
Website 
Bibliography 
Documentation 
Learning Agreement 
Oral-practical 
Placement assessment 
Analysis 
Class Activity 
Data Exercise
Internal student survey 
2011/12
Removed from this section: comments from the 
student survey.
New procedures 
• Assignment briefs 
• Marking and moderation 
• Feedback planning 
• Formal annual review of 
assessment
New processes 
• Institution-wide 
coursework receipting 
system 
• Focus on Assessment 
in annual review 
• Automation where 
possible 
Photo by Freekz0r (CC licensed, from Flickr)
Are we in a better place now?
Some key points 
• Students’ Union involved throughout 
• Process approach helped 
• Sustained ‘marketing campaign’ 
• New procedures are pedagogically neutral – 
academic decision-making left to programme 
teams
Are we in a better place now? 
Also reviewing 
• Survey comments from students 
• Views of staff 
• New programme documentation 
QAA institutional review next year
Links 
New Institutional Code of Practice on Assessment 
http://www.mmu.ac.uk/academic/casqe/regulations/icp.php 
MMU assessment resources 
http://www.celt.mmu.ac.uk/assessment/index.php

HEIR conference 8-9 September 2014: Forsyth and Stubbs

Editor's Notes

  • #9 But we also have a fantastically wide range of others, as you’d expect with the wide diversity of professional skills we’re aiming to test across the institution. Tasks need to be chosen to demonstrate achievement of the learning outcomes, and also need to be practical to do and to mark, and to engage students, and to discourage plagiarism or other forms of cheating. Quite a challenge, but we do try to encourage people to be creative and to specify tasks which they will look forward to marking.
  • #10 We’ve used this data to provide targeted guidance, both for the popular assignments and for others, to encourage people to experiment.
  • #14 Used data holistically rather than reacting to individual elements