Analysing Learning Analytics
James Little
Learning Technologist
SDDU
Analysing Learning Analytics
• What is learning analytics?
• How can learning analytics be performed?
– Considerations
– Examples
• What impacts can it have on your learning,
teaching or research?
WHAT IS LEARNING ANALYTICS?
What is Learning Analytics?
The 2013 NMC Horizon Report describes
learning analytics as:
“[a] field associated with deciphering trends and
patterns from educational big data, or huge sets
of student-related data, to further the
advancement of a personalized, supportive
system of higher education.”
What is Learning Analytics?
What is the purpose of learning analytics?
A JISC CETIS (2012) report identifies a focus on using it:
• for individual learners to reflect on their achievements and patterns
of behaviour in relation to others;
• as predictors of students requiring extra support and attention;
• to help teachers and support staff plan supporting interventions
with individuals and groups;
• for functional groups such as course team seeking to improve
current courses or develop new curriculum offerings; and
• for institutional administrators taking decisions on matters such as
marketing and recruitment or efficiency and effectiveness
measures.
What is Learning Analytics?
Does Learning Analytics = Big Data?
What is Learning Analytics?
Does Learning Analytics = Big Data?
Yes.
What is Learning Analytics?
Big Data is…
• Multiple sets of data from different areas
that must be analysed together – i.e. starting
with “dirty data” from several sources (VLE/student records).
– Data has to be sorted first, then analysed together
– e.g. the proposed NHS care.data scheme, combining
health data from multiple areas
What is Learning Analytics?
Does Learning Analytics = Big Data?
Yes… and No!
What is Learning Analytics?
Big Data is not simply a large sample size
(M. Callaghan, 2014 – HPC Computer Officer)
What is Learning Analytics?
Learning Analytics can look at big data,
large sample sizes of data, or even small
sample sizes.
Its purpose is to identify trends and
patterns, which can then enable action at a
local, personal level as well as inform
wider decisions.
HOW CAN LEARNING ANALYTICS BE
PERFORMED? - CONSIDERATIONS
How can learning analytics be
performed?
• 3 considerations:
1. Context (what you want to know), which affects:
2. Sample Size (small or large)
3. Scope (a sliding scale between narrow and wide)
(Scope and sample size can be interrelated)
Context
• What do you want to find out?
– To improve something?
– To gain insight into an unknown?
– To provide detailed support?
Context
A JISC CETIS (2012) report identifies a focus on using it:
• for individual learners to reflect on their achievements and patterns
of behaviour in relation to others;
• as predictors of students requiring extra support and attention;
• to help teachers and support staff plan supporting interventions
with individuals and groups;
• for functional groups such as course team seeking to improve
current courses or develop new curriculum offerings; and
• for institutional administrators taking decisions on matters such as
marketing and recruitment or efficiency and effectiveness
measures.
Defining Sample Size
Small: Individuals on one unit (e.g.
Medical Anatomy)
Medium: Individuals on
the unit stretching back
5 years (e.g. Medical
Anatomy)
Large: Similar units
across the sector (e.g.
the Medical Anatomy unit
in all the Med Schools)
(Context: looking at a subject area within a
discipline)
Defining Scope
Narrow: Outcomes for one
session – e.g. Philosophy Logic
and Reason, session 4
Medium: Outcomes
for a whole unit – all
of Philosophy Logic
and Reason
Wide: Outcomes for
all units that are
taken by the student
at UoL.
(Context: A Philosophy student)
Defining Scope
Narrow: 1 student’s
information at UoL and their
programme of study
Medium: A whole
cohort of students’
information at UoL
on one course
Wide: All of the
students at UoL
(Context: Student experience)
Sample Size/Scope
[Diagram: a scale from small (unit activity) through medium
(programmes, schools, departments) to big (institutional data,
VLEs, MOOCs), with the scale of challenge growing with size.]
Learning Analytics can take place at every level
Sample Size/Scope
• Historically, information at the small sample size and scope
has been available for analysis – for example module
information, marks, and information within a school or
department.
• A focus on quantitative data
• New platforms such as MOOCs and other at-scale platforms
(now including VLEs) have enabled more information to be
produced and collected.
• This works well within specific contexts (such as a single
MOOC or platform)
• The challenge across education as a whole is to liberate and
combine multiple sets of institution-wide data.
HOW CAN LEARNING ANALYTICS BE
PERFORMED? – EXAMPLES
Examples
• VLE – Applied and Professional Ethics Online Masters IDEA
CETL (University of Leeds)
– Discussion forum analytics used to keep track of activity
– http://www.slideshare.net/MeganKime/meaningful-discussion-activity-for-online-distance-learners
• VLE - Advanced Nursing Studies (University of Sheffield)
– Gradebook kept track of formative and summative assignments
– Useful for predicting and supporting student needs
– http://www.sheffield.ac.uk/snm/postgraduatetaught/mmed-sci-advanced-nursing-studies
• Tools used were those available within a specific platform and for a
specific course.
• Sample size and scope small.
Examples
• MOOCS / FutureLearn – Introduction to Anatomy
• Statistics produced on:
– Signups
– Completion
– Discussion engagement
– Drop-out rate
• Used to inform next approaches to MOOC design
• Sample size large, scope small.
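The headline statistics above reduce to simple arithmetic on a few numbers. A minimal sketch (all figures are invented for illustration, not FutureLearn’s actual data):

```python
# Hypothetical MOOC headline figures of the kind the slide describes:
# signups, learners who did anything at all, and learners who completed.

def mooc_rates(signups, active, completers):
    """Return (activation, completion, drop_out) as fractions."""
    activation = active / signups           # signed up and engaged at all
    completion = completers / signups       # finished the course
    drop_out = (active - completers) / active  # started but did not finish
    return activation, completion, drop_out

activation, completion, drop_out = mooc_rates(
    signups=10_000, active=6_000, completers=1_200)
print(f"activation {activation:.0%}, "
      f"completion {completion:.0%}, drop-out {drop_out:.0%}")
```

Even this toy version shows why the denominator matters: completion looks very different as a fraction of signups (12%) than of active learners (20%).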
Examples
• MOOCS / FutureLearn – First 8 Courses
– https://about.futurelearn.com/blog/measuring-our-first-eight-courses/
Examples
• MOOCS / FutureLearn – First 8 Courses
– https://about.futurelearn.com/blog/measuring-our-first-eight-courses/
• Sample size large, scope medium.
Examples
• Quiz/Polling tool
– Can be used to analyse understanding of a group
– Often used within one session/unit
– Potential to combine information across sessions
Examples
• Past examples have been reactive and aimed at
those running the course (academics and
professional support)
• The power lies in personalising data for the individuals
taking part, and in being proactive
Examples
• Stanford Lytics Lab – Sherif Halawa
– http://web.stanford.edu/~halawa/cgi-bin/
• Developing tools that will analyse a large data set on online
courses:
– Perform analysis to predict when additional support is needed
– Predict potential drop-outs
– Offer tailored support
• Sample size large, scope large.
Examples – WorldWide
• The Glass Classroom
• Santa Monica College’s Glass Classroom initiative strives to enhance student and teacher
performance through the collection and analysis of large amounts of data. Using real-time
feedback, adaptive courseware adjusts based on an individual’s performance in the classroom in
order to meet educational objectives.
• jPoll at Griffith University
• jPoll is an enterprise-wide tool developed by Griffith University in Australia, directed at
capturing, maintaining, and engaging students in a range of interactive teaching situations.
Originally developed as a replacement for clicker-type technologies, jPoll is helping educators
identify problem areas for students via learning analytics.
• Predictive Learning Analytics Framework
• The American Public University System is working with Western Interstate Commission for Higher
Education’s Cooperative for Educational Technologies to share a large data pool of student records
across ten universities. Their goal is for this data to inform strategies for improving student learning
outcomes.
• Stanford University’s Multimodal Learning Analytics
• In partnership with the AT&T Foundation, Lemann Foundation, and National Science Foundation,
Stanford is exploring new ways to assess project-based learning activities through students’
gestures, words, and other expressions.
Information from: http://www.edudemic.com/learning-analytics-in-education/
Tools Summary
• For small and large sample sizes – but small scope:
– Analytics can take place within a specific platform
(VLE/FutureLearn)
– Also can be analysed using ‘standard’ statistical
tools such as:
• R
• SPSS
• MATLAB
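The slide’s point – that small-scope analytics need only “standard” statistical tools – can be illustrated with a minimal sketch (Python standard library here, in place of R/SPSS/MATLAB; the gradebook figures and the 10-mark threshold are invented):

```python
# Small-scope gradebook analysis: flag students whose average mark
# sits well below the cohort mean, as a prompt for extra support.
from statistics import mean

gradebook = {
    "student_a": [62, 58, 65, 60],
    "student_b": [40, 35, 38, 30],   # consistently low across assignments
    "student_c": [70, 72, 68, 75],
}

cohort_mean = mean(m for marks in gradebook.values() for m in marks)

# Flag anyone more than 10 marks below the cohort mean on average
flagged = [s for s, marks in gradebook.items()
           if mean(marks) < cohort_mean - 10]
print(flagged)
```

A spreadsheet or the VLE’s own gradebook view can do the same; the point is that the sample size and scope are small enough that no special infrastructure is needed.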
Tools Summary
• “Big Data” – (multiple sets of data from
different areas) requires:
– Sorting (combining and standardising the data)
– Analysis – driven by needs (carried out with newer techniques,
e.g. Social Network Analysis / Pattern Recognition)
– Re-surfacing
• To individuals, to see their fit within the wider context
• To individuals providing support, to identify needs within a unit of
study
• To committees making specific strategic decisions
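The sort → analyse → re-surface pipeline above can be sketched on two tiny invented data sets standing in for VLE logs and student records (all IDs, field names, and the five-login threshold are assumptions for illustration):

```python
# "Big data" pipeline sketch: two data sets from different areas with
# inconsistent student IDs are standardised (sorting), joined and
# queried (analysis), then summarised per student (re-surfacing).

vle_logins = {"s001": 42, "S002 ": 3, "s003": 17}   # logins per term
student_records = {"S001": "Philosophy", "s002": "Medicine",
                   "S003": "History"}

# 1. Sorting: standardise the keys so the sets can be analysed together
def clean(student_id):
    return student_id.strip().lower()

logins = {clean(k): v for k, v in vle_logins.items()}
records = {clean(k): v for k, v in student_records.items()}

# 2. Analysis: join the sets and flag low engagement
combined = {sid: {"programme": records[sid],
                  "logins": logins.get(sid, 0)}
            for sid in records}
at_risk = [sid for sid, row in combined.items() if row["logins"] < 5]

# 3. Re-surfacing: a per-student summary for tutors or the student
for sid in at_risk:
    print(f"{sid}: {combined[sid]['programme']} – "
          f"only {combined[sid]['logins']} logins this term")
```

Real institutional data needs far heavier cleaning and governance, but the shape of the pipeline is the same.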
WHAT IMPACTS CAN IT HAVE ON YOUR
LEARNING, TEACHING OR RESEARCH?
What impacts can it have on your
learning, teaching or research?
• Learning & Teaching:
– Using data produced by platforms or specific tools
can directly inform the learning and education
process in situ (formative assessment).
• Can be used by the academic
• Can be used by the student
– Looking at overall trends within a unit, programme
or field can inform decisions for the next time it is run.
What impacts can it have on your
learning, teaching or research?
• Research
– Connect any research data back into teaching
activities
– Use information generated from learning and
teaching to generate research
Impacts - Challenges
• Liberating and combining multiple sets of institution-wide
data.
• Re-surfacing information in a timely and specific way
Next Steps…
• What could you do with information already available to you?
– Feedback to students
– Enable students to access it
– Share with colleagues
• Could you enable more information to be generated, through
use of new tools or techniques?
• Could you use information from multiple sources to provide
support?
• What scope and sample size would be involved?
Further Resources & References
• Learning Analytics Community Exchange
– http://www.laceproject.eu/lace/
– http://www.laceproject.eu/blog/using-data-to-improve-student-success/
– http://www.laceproject.eu/blog/moocs-learning-analytics/
• Stanford Lytics Lab
– http://lytics.stanford.edu/
• Edudemic
http://www.edudemic.com/learning-analytics-in-education/
• Greller, Wolfgang; Drachsler, Hendrik (2012). “Translating Learning
into Numbers: Toward a Generic Framework for Learning Analytics.”
Educational Technology and Society 15 (3): 42–57.
• Cooper, Adam (2012). A Brief History of Analytics: A Briefing Paper.
CETIS Analytics Series. JISC CETIS, November 2012.
http://publications.cetis.ac.uk/wp-content/uploads/2012/12/Analytics-Brief-History-Vol-1-No9.pdf
THANK YOU
QUESTIONS & DISCUSSION
Editor's Notes

  • #3 An overview – much more information out there…
  • #12 Make a note that I think that learning analytics can take place on a course or local small sample size and scope level too.
  • #14 Hinting at before is scope and sample size
  • #15 Hinting at before is scope and sample size
  • #20 Historically, information at the small sample size and scope has been available for analysis in HE – thinking about module information, marks, and information within a school or department. Now, new platforms such as MOOCs and other at-scale platforms (now including VLEs) have enabled more information to be produced and collected – but this works well within specific contexts (such as a single MOOC or platform), while the challenges are across HE as a whole. Work is being done to make information transparent etc., but also at a local level.
  • #34 Follow on from
  • #35 Follow on from
  • #36 Follow on from
  • #37 Follow on from
  • #38 Follow on from
  • #42 All types of sample size/scope can be used to inform at different levels (i.e. university policy versus an individual student’s knowledge, as per 5 above). Best to contextualise the why/purpose with the tools used. Tools can be specific ones, such as those used for big data and large research sample sizes, or small/existing (and upgraded) VLE tools. Links to educational research and MOOC analysis. Split into: research tools applied to learning; educational tools that produce statistics that can be used; bespoke educational tools for a specific context (Dentistry learning dashboard).