CSEDU 2022: 14th International Conference on Computer Supported Education
Katrien Verbert
Augment / HCI – KU Leuven
katrien.verbert@kuleuven.be
Designing Learning Analytics Dashboards: Lessons Learned
Augment
2
Dashboards – explainable AI – recommender systems – visualisation – intelligent user interfaces
Human-Computer Interaction research group
intro
3
Team
Augment
4
“Learning analytics is
about collecting traces
that learners leave
behind and using those
traces to improve
learning.”
- Erik Duval
5 Duval, E., & Verbert, K. (2012). Learning analytics. E-Learning and Education, 1(8).
LEARNING ANALYTICS
intro
Src: Steve Schoettler
LEARNING ANALYTICS
intro
6
Users who bought the same product also bought products B and C
Recommender systems
RecSys
7
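The "users who bought the same product" idea above is classic item-based collaborative filtering. A minimal sketch with toy data (product and user names are illustrative, not from any of the systems discussed):

```python
from collections import defaultdict

def also_bought(purchases, product):
    """Rank other products by how often they co-occur with
    `product` in the same user's purchase history."""
    counts = defaultdict(int)
    for items in purchases.values():
        if product in items:
            for other in items:
                if other != product:
                    counts[other] += 1
    # Most frequently co-purchased first (stable order for ties).
    return sorted(counts, key=counts.get, reverse=True)

purchases = {
    "user1": ["A", "B", "C"],
    "user2": ["A", "B"],
    "user3": ["A", "C"],
    "user4": ["B", "D"],
}
recommended = also_bought(purchases, "A")  # ["B", "C"]
```

Real recommenders weight co-occurrence by rating similarity and popularity, but the explanation shown to users ("bought the same product also bought…") maps directly onto these counts.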
8
9
10
Gutiérrez, F., Seipp, K., Ochoa, X., Chiluiza, K., De Laet, T. and Verbert, K., 2018. LADA: A learning analytics
dashboard for academic advising. Computers in Human Behavior.
Academic risk prediction
RecSys
11
Illustration: Erik Blad for The Intercept
“Black box” approach does not work
RecSys
• Explaining model outcomes to increase user trust and acceptance
• Enabling users to interact with the explanation process to improve the model
Models
LA dashboards for recommender systems
RecSys
12
• Transparency
• User control
• Diversity – novelty
• Cold start
• Context-aware interfaces
13
He, C., Parra, D. and Verbert, K., 2016. Interactive recommender systems: A survey of the state of the art
and future research challenges and opportunities. Expert Systems with Applications, 56, pp.9-27.
LA dashboards for recommender systems
RecSys
14
Explaining competence-based recommendations
RecSys
15
16
Evaluation setup
• Participants: 66 job seekers
• Data Collection and measurements:
• ResQue questionnaire
• Open questions
• Logging
17
Gutierrez, F., Charleer, S., De Croon, R., Htun, N. N., Goetschalckx, G. and Verbert, K., 2019. Explaining and exploring job
recommendations: a user-driven approach for designing an interactive job recommender system. RecSys '19, pp. 1-10.
RecSys
18
• Explanations support user empowerment.
• A diverse set of actionable insights is needed.
• Differences across user groups: need for personalisation.
19
Results
RecSys
How can the exercise recommendations on Wiski automatically adapt to the level of students?
How do (placebo) explanations affect initial trust in Wiski for recommending exercises?
Goals and research questions
Automatic adaptation – Explanations & trust – Young target audience
Middle and high school students
Ooge, J., Kato, S., & Verbert, K. (2022). Explaining recommendations in e-learning: effects on adolescents’
initial trust. In Proceedings of the 27th Annual Conference on Intelligent User Interfaces. ACM.
Explaining exercise recommendations
RecSys
20
Why?
Justification
Comparison
with others
Real
explanation
Placebo
explanation
No
explanation
21
Methodology: randomised control trial
RecSys
22
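The three-condition design (real, placebo, no explanation) can be sketched as balanced random assignment. This is a minimal illustration of randomised assignment, not the study's exact procedure; participant IDs and the seed are made up:

```python
import random

CONDITIONS = ["real", "placebo", "none"]  # the three explanation conditions

def assign(participants, seed=42):
    """Shuffle participants, then deal them round-robin over the
    conditions so group sizes stay balanced."""
    rng = random.Random(seed)  # fixed seed: reproducible assignment
    shuffled = list(participants)
    rng.shuffle(shuffled)
    return {p: CONDITIONS[i % len(CONDITIONS)] for i, p in enumerate(shuffled)}

groups = assign([f"student{i}" for i in range(9)])
```

With nine participants, each condition receives exactly three students; shuffling before dealing keeps the assignment random while the round-robin keeps it balanced.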
… did increase multidimensional initial trust
… did not increase one-dimensional initial trust
… led to accepting more recommended exercises
compared to both placebo and no explanations
Real explanations
RecSys
23
… did not increase initial trust compared to no explanations
… may undermine perceived integrity
… are a useful baseline:
• how critical are students towards explanations?
• how much transparency do students need?
Placebo explanations
RecSys
24
Can be acceptable in low-stakes situations
(e.g., drilling exercises):
indications of difficulty level might suffice
Personal level indication:
Easy, Medium and Hard
tags
No explanations
RecSys
25
https://www.imec-int.com/en/research-portfolio/aida
Next steps
RecSys
26
https://www.imec-int.com/en/research-portfolio/aida
Next steps
RecSys
27
Src: Steve Schoettler
LEARNING ANALYTICS
28
Verbert, K., Govaerts, S., Duval, E., Santos Odriozola, J., Van Assche, F., Parra Chico, G., Klerkx, J. (2014). Learning dashboards: an overview and future research opportunities. Personal and Ubiquitous Computing, 18(6), 1499-1514.
29
BLENDED LEARNING
F2F GROUP WORK
STUDENT-ADVISER
30 https://www.flickr.com/photos/lockechrisj/
Sten Govaerts, Katrien Verbert, Abelardo Pardo, Erik Duval. The student activity meter for awareness and self-reflection.
CHI'12 Extended Abstracts on Human Factors in Computing Systems. ACM, 2012.
CREATING EFFECTIVE LEARNING DASHBOARDS
blended learning
abundance of data - effort - outcome
31
CREATING EFFECTIVE LEARNING DASHBOARDS
blended learning
Verbert, K., Govaerts, S., Duval, E., Santos, J. L., Van Assche, F., Parra, G., & Klerkx, J. (2013). Learning dashboards: an overview
and future research opportunities. Personal and Ubiquitous Computing, 1-16.
32
RQ1: How should we visualise learner data to support students to
explore the path from effort to outcomes?
RQ2: How can we promote students, inside and outside the
classroom, to actively explore this effort to outcomes path?
33
CREATING EFFECTIVE LEARNING DASHBOARDS
blended learning
abundance of data - effort - outcome
34
Charleer, S., Klerkx, J., Santos, J. L., & Duval, E. Improving awareness and reflection through collaborative, interactive
visualizations of badges. In Proceedings of the 3rd Workshop on Awareness and Reflection in Technology-Enhanced Learning,
pages 69-81. CEUR Workshop Proceedings, 2013
ARTEL 2014 . Graz, Austria
ARTEL 2013
35
36
EVALUATIONS
Abstract the LA data
Provide access to the artefacts
Provide access to teacher and peer feedback
37
RESULTS
RQ1: What are relevant learning traces, and how should we visualise
these data to support students to explore the path from effort to
outcomes?
38
RESULTS
RQ2: How can we promote students, inside and outside the classroom,
to actively explore this effort to outcomes path?
Visualise the learner path
Integrate LA into the workflow
Facilitate collaborative exploration of the LA data
39
Visualise the learner path
40
Visualise the learner path
41
Visualise the learner path
42
Visualise the learner path
43
BALANCED DISCUSSION IN THE CLASSROOM
F2F Group Work
RQ3: What are the design challenges for ambient Learning
Dashboards to promote balanced group participation in
classrooms, and how can they be met?
RQ4: Are ambient Learning Dashboards effective means for
creating balanced group participation in classroom settings?
over- and under-participation
44
K. Bachour, F. Kaplan, and P. Dillenbourg. An interactive table for supporting participation balance in
face-to-face collaborative learning. IEEE Trans. Learn. Technol., 3(3):203–213, July 2010.
Over-participation: “free-riders” can lead the motivated learner to reduce contributions
G. Salomon and T. Globerson. When teams do not function the way they ought to. International Journal of Educational Research, 13(1):89 – 99, 1989.
EVALUATION SETUP
45
EVALUATION SETUP
case study 1
# participants: 12 students
deployment: one 3h session with dashboard; one 3h session without dashboard
evaluation: class discussion; questionnaires (perceived distraction/awareness/usefulness); activity/quality logging

case study 2
# participants: 19 students
deployment: half of a 3h session without dashboard, half with dashboard
evaluation: questionnaires (perceived importance of feedback/motivation); activity/quality logging
46
EVALUATION SETUP
47
Visualise balance in an abstract and neutral way
Add the qualitative dimension to the visualisation
48
RESULTS
RQ3: What are the design challenges for ambient LDs to promote
balanced group participation in classrooms, and how can they be met?
Ambient dashboards as support for teacher/presenter
Ambient dashboards raise awareness of the invisible
Ambient feedback information can activate students
49
RESULTS
RQ4: Are ambient LDs effective means for creating balanced group
participation in classroom settings?
Charleer, S., Klerkx, J., Duval, E., De Laet, T. and Verbert, K. (2017) ‘Towards balanced discussions in the classroom using ambient
information visualisations’, Int. J. Technology Enhanced Learning, Vol. 9, Nos. 2/3, pp.227–253.
Next steps
https://www.imec-int.com/en/research-portfolio/steams
50
51
https://www.imec-int.com/en/research-portfolio/steams
Next steps
52
Next steps
53
Next steps
SUPPORTING ADVISER-STUDENT DIALOGUE
RQ5: What are the design challenges for creating a Learning
Dashboard to support study advice sessions, and how can they be
met?
RQ6: How does such a Learning Dashboard contribute to the role
of the adviser, student, and dialogue?
54
lack of data-based feedback
55
56
EVALUATION SETUP
design
# participants: 17 study advisers (preliminary feedback); 5 study advisers (iterative feedback)
approach: brainstorms, observations, iterative design

evaluation
# participants: 5 study advisers
deployment: Engineering Science and Engineering Science: Architecture; 97 sessions (15-30 min per session)
evaluation: 15 sessions observed; questionnaires on perceived usefulness
57
Factual Insights (-)
Interpretative Insights (+)
Reflective Insights (!)
58
Charleer, S., Moere, A. V., Klerkx, J., Verbert, K., & De Laet, T. (2017). Learning analytics dashboards to support
adviser-student dialogue. IEEE Transactions on Learning Technologies, 11(3), 389-399.
59
RESULTS
S. Claes, N. Wouters, K. Slegers, and A. V. Moere. Controlling In-the-Wild Evaluation Studies of Public Displays. pages 81–84, 2015.
60
“When students see the numbers, they are surprised, but now they believe me. Before, I used my gut feeling; now I feel more certain of what I say as well.”
“It’s like a main thread
guiding the conversation.”
“I can talk about what to do with the results, instead of each
time looking for the data and puzzling it together.”
“Students don’t know where to look during the conversation,
and avoid eye contact.
The dashboard provides them a point of focus”.
“A student changed her study
method in June and could now
see it paid off.”
LISSA supports a personal dialogue.
✓ the level of usage depends on the experience and style of the study advisers
✓ fact-based evidence at the side
✓ narrative thread
✓ key moments and the student path help to reconstruct the personal track
“I can focus on the student’s
personal path, rather than
on the facts.”
“Now, I can blame
the dashboard and
focus on collaboratively
looking for the next step to
take.”
61
Doubting whether to continue (Group 1)
Doubting which courses to take (Group 2)
Doubting which courses to deliberate (Group 3)
Martijn Millecamp, Francisco Gutiérrez, Sven Charleer, Katrien Verbert, Tinne De Laet. A qualitative evaluation of a
learning dashboard to support advisor-student dialogues. FP@LAK18.
62
                               Group 1   Group 2   Group 3
Time dashboard used              0.58      0.43      0.43
Avg. number of insights          13.8      10.1      3.67
Avg. factual insights            4.7       3.9       0.33
Avg. interpretative insights     3.33      3         3.2
Avg. reflective insights         5.8       3.2       2
63
Data Confidence
Collaboration
Adviser’s role
66
RESULTS
RQ6: How does such a Learning Dashboard contribute to the role of the
adviser, student, and dialogue?
RQ5: What are the design challenges for creating a Learning Dashboard
to support study advice sessions, and how can they be met?
Authorship
Visual Encoding
Ethics
[!] Wording matters.
73% chance of success
73% of students of earlier
cohorts with the same study
efficiency obtained the
bachelor degree
http://blog.associatie.kuleuven.be/tinnedelaet/the-nonsense-of-chances-of-success-and-predictive-models/
67
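The honest phrasing above is an empirical frequency over comparable earlier students, not a personal probability. A sketch of computing it, assuming a hypothetical record format of (study_efficiency, obtained_degree) pairs — the data, tolerance, and function name are illustrative:

```python
def cohort_success_rate(records, efficiency, tol=0.05):
    """Share of earlier-cohort students with a similar study
    efficiency (within `tol`) who obtained the bachelor degree."""
    similar = [got for eff, got in records if abs(eff - efficiency) <= tol]
    if not similar:
        return None  # no comparable students: report that, don't guess
    return sum(similar) / len(similar)

records = [(0.60, True), (0.62, False), (0.58, True),
           (0.61, True), (0.90, True), (0.30, False)]
rate = cohort_success_rate(records, 0.60)  # 3 of 4 similar students: 0.75
```

Framing the number this way keeps its origin visible: it describes past cohorts, and it honestly returns nothing when no comparable students exist.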
[!] Do not oversimplify. Show uncertainty.
• reality is complex
• measurement is limited
• individual circumstances
• need for nuance
• trigger reflection
68
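One concrete way to show uncertainty instead of a bare point estimate is an interval around the observed proportion. A sketch using the Wilson score interval — the choice of interval is an assumption for illustration, not what the dashboards above used:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for an observed proportion.
    Reporting [lo, hi] instead of a single number keeps the
    limits of the measurement visible."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

lo, hi = wilson_interval(73, 100)  # roughly (0.64, 0.81), not just "73%"
```

Displaying the interval rather than "73%" signals that the estimate rests on a limited sample and invites reflection rather than false certainty.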
[!] Feedback must be “actionable”.

“Warning! Male students have a 10% lower probability of being successful. You are male.” → no action possible ?
“Warning! Your online activity is lagging behind.” → actionable ✓
70
Take-away messages
• Involving end users has been key to designing interfaces tailored to their needs
• Actionable vs non-actionable feedback
• Need for personalisation and simplification
71
Katrien Verbert – KU Leuven
katrien.verbert@cs.kuleuven.be
Twitter: @katrien_v
Thank you! Questions?
Augment
Slide design: Sven Charleer 72
