1. Challenges of designing learner dashboards
Liz Bennett
Director of Teaching and Learning
School of Education and Professional Development
4/26/2018 1
2. Alluring potential of learning analytics
• Higher Education Commission (2016), From Bricks to Clicks: “potential
to transform the higher education sector”
• To deliver a better understanding of learners and the learning process
• To provide timely, informative and adaptive feedback, and foster
lifelong learning (Gašević, Dawson, & Siemens, 2015; Pardo et al.,
2017; Schumacher & Ifenthaler, 2016)
• To improve student outcomes (retention and attainment)
3. What are learner dashboards?
Learner dashboards are graphical
interfaces that manipulate and
present data about students’
learning behaviours (attendance,
visits to the library, which books
they take out, their attainment,
and fine-grained data from the VLE).
4. Types of dashboard
• Management information: data sets that inform institutional
knowledge, e.g. patterns of retention amongst particular groups
• Academic cohort data, leading to academic-led actions or
interventions
• Learner dashboards:
• Predictive models that send alerts to students to suggest
actionable insights
• Historic models that inform students and hence influence
behaviours
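As a loose illustration of the two student-facing types above, a descriptive (historic) view simply reports what has already happened, while a predictive view extrapolates and raises an alert. The sketch below is hypothetical: the function names, the naive projection rule, the pass-mark threshold, and the marks are all invented for illustration, not drawn from the study.

```python
# Hypothetical sketch of descriptive vs. predictive dashboard logic.
# Thresholds, field names, and the projection rule are illustrative only.
from statistics import mean

def descriptive_summary(marks):
    """Historic view: report past marks and their running average."""
    return {"marks": marks, "average": round(mean(marks), 1)}

def predictive_alert(marks, pass_mark=40, window=3):
    """Predictive view: flag a student whose recent trend projects below the pass mark."""
    recent = marks[-window:]
    projected = mean(recent)  # naive projection: recent average carries forward
    return {"projected": round(projected, 1), "alert": projected < pass_mark}

marks = [62, 50, 42, 36, 30]  # a declining trajectory
print(descriptive_summary(marks))
print(predictive_alert(marks))
```

The descriptive view leaves interpretation to the student, while the predictive view embeds an institutional judgement (the threshold) into the display, which is one reason the recommendations later in the deck favour descriptive designs.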
5. HeLF (2017)
• Rapid rise in the number of HEIs developing
learning analytics in the last 2 years, from 34%
to 66%
• Focus is on retention rather than on learning
• Descriptive (historical) dashboards: 45%
• Predictive: 25%
6. SRHE Scoping Study
Aims:
• To explore undergraduate students’ responses to receiving feedback
on progress via a dashboard
• To explore the impact that different ways of presenting data (for
example qualitative, quantitative, benchmarked, signposting further
action, mediated by personal tutors) have on students’ learning responses
7. Methods: a small-scale qualitative study of 24 final-year undergraduates
Focus groups and interviews
8. Sample
• Round 1: self-selecting, 10 out of a cohort of 180
• Ranked 1st to 168th out of 180
• Round 2: 14 out of a cohort of 16
• Ranked 1st to 16th in a group of 16 students
• Assignment data versus ‘on-track’ score
• Slightly more students were doing worse in this assignment than their
‘on-track’ score
9. Put yourself in the position of a student
Which of the 10 dashboard elements would you want to see on a
dashboard?
13. Case study: Justine
• High-attaining student: scored 75% in the most recent assignment
• On-track score: 71% (first)
• This assignment was pulling her ‘on-track’ score up
• Came 15th out of 180 students in the assignment
• How do you think she responded?
“I’m happy with that [mark of 75%] but I don’t think I need to
know what position I’m in. Because I know that I’ve done better
than the majority, so that’s fine… I was happy with the grade and
I’ve done better than the majority. I still think that [positional
data] makes me feel I could’ve done better. 14 other people have
still done better than me... I had thought I’d really, really topped it,
I’ve maxed out here. And it’s taken away a bit from that feeling of elation…”
14. Case study: India
• Scored 58% in the assignment
• On-track score: 61% (low 2:1)
• This assignment was pulling her on-track score down
• Came 16th out of 16 students in the assignment
• How do you think she responded?
India: I think it gives me motivation to try harder.
Interviewer: You pointed straight away to the on-track slider.
India: Straight away, yeah. I think this is what I’m more focused
on. Even though they’re the individual grades. But I want to see
the overall, where I am working at the moment.
15. Hence
• Dashboards are interpreted by individuals
• Studies have shown that individuals’ responses confound algorithmic
predictions (Pardo, Ellis, & Calvo, 2015; Beheshitha, Hatala, Gašević &
Joksimović, 2016)
• But they do appear to be motivating (though less so for Justine)
16. Practical recommendations for learner dashboards
Designs need to:
• Show personal trajectories
• Enable students to choose what they see
Institutional practices need to:
• Develop descriptive rather than predictive dashboards
• Ensure adoption is driven by values that are student-centred, not
just KPI-driven.
17. Criteria for evaluating learner dashboards
• Sense making
• Customisable
• Actionable insights
• Educational alliance
18. Sense making
Making sense of the dashboard element.
“That was a thrown-in essay. I knew it was going to be [lower than my
average]” (Malcolm)
What students don’t want is tools that are not personalized, “bloated (=
zillions of features we do not need), badly designed (features we need
are not streamlined, easy to use, intuitive), and cumbersome [system]”
(Dahlstrom et al., 2014, p. 10)
19. Customisable
Provides students with a sense of agency through enabling them to
choose how they see their data displayed.
“The average of everyone else doesn’t really mean anything to me
personally. As long as I’m doing what I need to do.” (Malcolm)
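A minimal sketch of what this customisation might look like in practice, assuming a dashboard where each student selects their own comparison baseline. The function name, option labels, and marks are all illustrative inventions, not drawn from the study.

```python
# Hypothetical 'customisable' criterion: the student chooses which
# comparison baseline (if any) appears alongside their mark.
from statistics import mean

def render_mark(my_mark, cohort_marks, comparison="none"):
    """Return the dashboard text for one mark, honouring the student's choice."""
    if comparison == "cohort_average":
        return f"You scored {my_mark}% (cohort average: {mean(cohort_marks):.0f}%)"
    if comparison == "top_performer":
        return f"You scored {my_mark}% (highest mark: {max(cohort_marks)}%)"
    return f"You scored {my_mark}%"  # like Malcolm, opt out of comparison entirely

cohort = [75, 58, 62, 70, 66]
print(render_mark(62, cohort, comparison="none"))
print(render_mark(62, cohort, comparison="cohort_average"))
```

Letting the default be “no comparison” reflects the agency point above: norm-referenced data appears only when the student asks for it.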
20. Actionable insights
Helps students to make informed decisions about their learning
behaviour, i.e. to identify things that they might do.
“It shows me that I should be putting in a lot more hours than I am.”
(India)
“On my average, I’d want to get that up a bit, because I don’t really like
being where I am.” (Lydia)
21. Educational alliance
The extent to which the element supports a quality relationship
between students and tutors (Ajjawi & Boud, 2017, p. 254)
“we usually do tutorials when we have assignments due, to talk about
drafts and things like that. …you’ve got that time to prepare yourself for
whatever questions you know they’re going to ask you. And then the
feedback is good as well.”
22. Activity
Discuss how you would develop the dashboard elements, and the way
that dashboards are used in practice, to enhance one of the following:
• Sense making
• Customisable
• Actionable insights
• Educational alliance
Note key points on flip-chart paper.
23. Practical recommendations for learner dashboards
Designs need to:
• Show personal trajectories
• Enable students to choose what they see
• Enable students to trust the sources of data by
avoiding aggregating data
• Provide actionable insights by avoiding aggregating
data
Institutional practices need to:
• Develop descriptive rather than predictive dashboards
• Ensure adoption is driven by values that are student-centred, not
just KPI-driven
• Embed use of learner dashboards in other supportive
processes
24. Conclusions
• People are complex and unpredictable
• In general, learner dashboards are motivating
• Comparison can be a double-edged sword: motivating for some,
demotivating for others
• Hence dashboards need to be customisable
• Students need to interpret dashboards, so they need contextual
information such as which assignment is shown
• Institutions have a role in helping students to interpret data and focus on
meaningful actions.
25. Full report on SRHE website
@lizbennett1
e.bennett@hud.ac.uk
@suefolley
s.folley@hud.ac.uk
Editor's Notes
It will cover
What are learner dashboards (proxies for learning)
Drivers for learner dashboards
Outline our study methods
Findings – elements that students liked and didn’t
Understanding student dispositions and their responses to dashboards: case studies of Justine and India
Conclusions and recommendations for practice.
The Higher Education Commission's report, From Bricks to Clicks, was published in January 2016. The report looked at the
potential of data and analytics in higher education and made several recommendations, including that HEIs should consider adopting learning analytics focusing on the improvement of learning and teaching processes and student engagement.
Liz
Context and rationale
Definition: Learning analytics is the field of collection and manipulation of data derived from students’ learning behaviour (Siemens & Gašević, 2012)
Dashboards are the graphical interface that manipulate and present data about students’ learning behaviours (attendance, visits to the library, which books they take out, their attainment etc).
Although only a few UK HEIs have developed a dashboard for students, most other UK HEIs have an aspiration to develop their use (Sclater 2014). NTU is one that has implemented a student facing dashboard – based on student engagement ie using data from attendance, library, vle, but it doesn’t include assessment data.
Data are proxies for learning so have limitations…
Rhona suggests 5 uses of learner dashboards, with the final one being to inform curriculum design.
Newland and Trueman
This presentation focuses on illustrating how particular students responded, and makes recommendations based on these responses.
Sue Methods
The study was small scale. Two methods were used to gather data: focus groups and semi-structured interviews.
A pilot focus group with a self-selecting sample asked participants to comment on the format of the dashboard design elements, e.g. pie chart, comparative data, progress, word cloud. We did this in November 2016 and presented findings at SRHE. Seven students participated. A second focus group was held with 10 students, also a self-selecting sample, who received a £10 gift voucher to thank them for participating.
The dashboard displayed the degree classification that each student was on track to achieve; this ranged from 51% (low 2:2) to 74% (1st) in the first round, and from 60% (border of 2:1 and 2:2) to 76% (1st) in the second round. Slightly more students were doing worse in the assignment presented on the dashboard than their overall on-track score. Thus the sample had the potential to uncover a range of emotional responses to the assignment data, not just pleasure that the assignment was bringing the average mark up, or just disappointment that it was lowering it.
Sue
They mostly did not like the VLE data, which they did not consider particularly useful (element figure 1i).
It had little relevance to them, e.g. "I could have bought the book."
What is the point of knowing what others feel?
VLE use is only a proxy for the learning that goes on, and students know this.
Need for “trusted, accurate, visually understandable, and relevant data that align with their needs are key factors related to adoption” (Ali et al., 2012; Austin, 2011; Dawson, McWilliam,&Tan, 2008; Klein et al., in press, 2016a, 2017)
Sue
There was a range of responses to the Personal Tutor Meeting log (element figure 1c); in particular, students were concerned that this section needed to have a clear purpose.
There was a variety of responses to the positional data, with some finding it motivating and some (even a high achiever) finding it challenging, and some low achievers finding it motivating. The majority found it motivating.
Student choice is important here, so that students can choose whether they see this data and who they are compared with.
Attendance data was liked but invoked compliant, docile behaviours, e.g. "I should ensure I don't forget my card, or authorise my absence": what MacFarlane calls student performativity or presenteeism (the equivalent of hanging your coat on the back of your chair to show you are in the office).
Student choice is important here so that students can set what comprises the Red/Amber/Green criteria.
Flags otherwise encode institutional values (performative norming).
(Justine 15th out of 178)
So we can see that dashboards are individually interpreted.
Studies have shown that individuals' responses confound algorithmic predictions:
- the digital footprint was unable to predict how well students were going to achieve on their course;
- Beheshitha, Hatala, Gašević & Joksimović (2016) failed to identify any patterns between the learner dashboard visualisations and subsequent learning behaviours.
Motivating, with the caveat of the individual who doesn't want to see comparison (Justine).
focus on a student’s personal trajectory (ipsative data) that draws attention to their past and present scores to illuminate their learning gain;
allow students to personalise their learner dashboard. In particular, allow students control over the way that comparisons are made with other students’ performance (norm-referenced data), for example by allowing students to choose whether they see their data compared to others in the cohort and, if they do, to choose who their scores are compared against. This could be the average mark for the module or cohort, or the highest performers on the module or cohort;
embed the use of dashboards into personal development planning and or personal academic tutorial processes to ensure that each student is individually and collectively supported to interpret and plan how to act on their data;
focus on the way that interventions are signposted with an awareness of the emotional component of dashboard feedback;
interrogate the institutional values that underpin the adoption of learner dashboards with a particular focus on how trust and student agency are engendered and how these are translated into the principles that are driving the adoption of dashboards.
Involves clarity and purpose.
It also involves students understanding what the purpose of the data is and that it is telling them something they think is relevant. This means relating the dashboard element to the student's own understanding: for instance, is the data correct, and how should I interpret its meaning? Students have a wealth of knowledge about their lives as students to bring into the interpretation process.
Justine too
Are India's and Lydia's responses an actionable insight or a motivation to act?
The concept of educational alliance comes from psychotherapy.
Of course this isn't really clear from a display, as it is about the relationships that underpin the LD. However, making it a criterion for the LD emphasises the socio-cultural dimension of LD design.
Strong sense of relationship and working together with tutor evident (small cohort)