8th UK Learning Analytics Network Meeting, The Open University, 2nd November 2016
Learning design meets learning analytics: Dr Bart Rienties, Open University
1. Learning design meets learning analytics
JISC/OU, 2nd of November 2016
@DrBartRienties
Reader in Learning Analytics
A special thanks to Avinash Boroowa, Aida Azadegan, Shi-Min Chua, Simon Cross, Rebecca Ferguson, Lee Farrington-Flint, Christothea Herodotou, Martin Hlosta, Wayne Holmes, Garron Hillaire, Simon Knight, Nai Li, Vicky Marsh, Kevin Mayles, Jenna Mittelmeier, Vicky Murphy, Quan Nguyen, Tom Olney, Lynda Prescott, John Richardson, Jekaterina Rogaten, Matt Schencks, Mike Sharples, Dirk Tempelaar, Lisette Toetenel, Thomas Ullmann, Denise Whitelock, John Woodthorpe, Zdenek Zdrahal, and others. A special thanks to Prof Belinda Tynan for her continuous support of analytics at the OU UK.
2. Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and Implementation of a Learning Analytics Toolkit for Teachers. Journal of Educational Technology & Society, 15(3), 58-76.
3. Agenda
1. The power of 151 Learning Designs on 113K+ students at the OU?
2. How can we use learning design to empower teachers?
3. How can Early Alert Systems improve Student Engagement and Academic Success? (Amara Atif, Macquarie University)
4. What evidence is there that learning design makes a difference over time and how students engage?
15. The seven activity types in the OU learning design taxonomy:
• Assimilative — Attending to information. Examples: Read, Watch, Listen, Think about, Access, Observe, Review, Study.
• Finding and handling information — Searching for and processing information. Examples: List, Analyse, Collate, Plot, Find, Discover, Access, Use, Gather, Order, Classify, Select, Assess, Manipulate.
• Communication — Discussing module-related content with at least one other person (student or tutor). Examples: Communicate, Debate, Discuss, Argue, Share, Report, Collaborate, Present, Describe, Question.
• Productive — Actively constructing an artefact. Examples: Create, Build, Make, Design, Construct, Contribute, Complete, Produce, Write, Draw, Refine, Compose, Synthesise, Remix.
• Experiential — Applying learning in a real-world setting. Examples: Practice, Apply, Mimic, Experience, Explore, Investigate, Perform, Engage.
• Interactive/Adaptive — Applying learning in a simulated setting. Examples: Explore, Experiment, Trial, Improve, Model, Simulate.
• Assessment — All forms of assessment, whether continuous, end of module, or formative (assessment for learning). Examples: Write, Present, Report, Demonstrate, Critique.
Toetenel, L., & Rienties, B. (2016). Analysing 157 Learning Designs using Learning Analytic approaches as a means to evaluate the impact of pedagogical decision-making. British Journal of Educational Technology, 47(5), 981-992.
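To make the taxonomy concrete, a module's learning design can be summarised as the share of planned study time falling into each activity type. The sketch below is purely illustrative (the function name, category keys and hours are invented, not the OU mapping tool's actual schema):

```python
# Hypothetical sketch: summarise a module's learning design as the
# percentage of planned study time per activity type. Category names
# follow the OU taxonomy; the example hours are invented.
from collections import defaultdict

ACTIVITY_TYPES = [
    "assimilative", "finding_info", "communication", "productive",
    "experiential", "interactive", "assessment",
]

def design_profile(activities):
    """activities: list of (activity_type, planned_hours) tuples.
    Returns the percentage of total workload per activity type."""
    hours = defaultdict(float)
    for activity_type, planned_hours in activities:
        hours[activity_type] += planned_hours
    total = sum(hours.values())
    return {t: round(100 * hours[t] / total, 1) for t in ACTIVITY_TYPES}

week_1 = [
    ("assimilative", 6.0),   # e.g. read a chapter, watch a lecture
    ("communication", 2.0),  # e.g. tutor-group forum discussion
    ("assessment", 2.0),     # e.g. formative quiz
]
print(design_profile(week_1))  # assimilative: 60.0, communication: 20.0, assessment: 20.0
```

Repeating this per week and per module yields the design profiles analysed in the studies cited on this slide.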
19. Method - data sets
• Combination of four different data sets:
  • learning design data (189 modules mapped, 276 module implementations included)
  • student feedback data (140 modules)
  • VLE data (141 modules)
  • academic performance data (151 modules)
• Data sets merged and cleaned
• 111,256 students undertook these modules
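The "merged and cleaned" step can be sketched as an inner join of the four sources on a shared module identifier, keeping only modules present in every set. Module codes and field names below are invented for illustration:

```python
# Illustrative sketch (invented module codes and fields) of merging the
# four data sets on a module identifier, then keeping complete cases only.
design = {"A100": {"communication_pct": 20}, "B200": {"communication_pct": 5},
          "C300": {"communication_pct": 35}}
feedback = {"A100": {"satisfaction": 4.2}, "B200": {"satisfaction": 3.8}}
vle = {"A100": {"clicks_per_week": 110}, "B200": {"clicks_per_week": 64},
       "C300": {"clicks_per_week": 150}}
performance = {"A100": {"completion": 0.71}, "B200": {"completion": 0.65},
               "C300": {"completion": 0.80}}

def merge_complete(*sources):
    """Inner-join several {module: fields} mappings: keep only modules
    that appear in every source (the 'merged and cleaned' step)."""
    shared = set.intersection(*(set(s) for s in sources))
    return {m: {k: v for s in sources for k, v in s[m].items()}
            for m in sorted(shared)}

analysis_set = merge_complete(design, feedback, vle, performance)
print(sorted(analysis_set))  # C300 drops out: it has no student feedback
```

This mirrors why the module counts differ per data set but the regression models use the smaller intersections (n = 140 or 150).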
20. Toetenel, L., & Rienties, B. (2016). Analysing 157 Learning Designs using Learning Analytic approaches as a means to evaluate the impact of pedagogical decision-making. British Journal of Educational Technology, 47(5), 981-992.
21. [Diagram: four learning design profiles (Constructivist, Assessment, Productive, Socio-constructivist) mapped across 151 modules, week 1 to week 30+, linked to VLE engagement, student satisfaction and student retention, with disciplines, levels and module size as controls.]
Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341.
26. Table 3: Regression model of LMS engagement predicted by institutional, satisfaction and learning design analytics

                              Model 1    Model 2    Model 3
Level 0                       -.279**    -.291**    -.116
Level 1                       -.341*     -.352*     -.067
Level 2                        .221*      .229*      .275**
Level 3                        .128       .130       .139
Year of implementation         .048       .049       .090
Faculty 1                     -.205*     -.211*     -.196*
Faculty 2                     -.022      -.020      -.228**
Faculty 3                     -.206*     -.210*     -.308**
Faculty other                  .216       .214       .024
Size of module                 .210*      .209*      .242**
Learner satisfaction (SEAM)              -.040       .103
Finding information                                  .147
Communication                                        .393**
Productive                                           .135
Experiential                                         .353**
Interactive                                         -.081
Assessment                                           .076
Adjusted R²                    18%        18%        40%
n = 140; * p < .05, ** p < .01

• Level of study predicts VLE engagement
• Faculties differ in VLE engagement
• Learning design (communication & experiential) predicts VLE engagement, with 22% unique variance explained
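The coefficients in Tables 3-5 are standardised betas from ordinary multiple regression. As a purely illustrative sketch (synthetic data and invented variable names, not the OU data set), standardised coefficients can be computed by z-scoring outcome and predictors before fitting OLS:

```python
# Toy sketch of the modelling approach behind Table 3: OLS on z-scored
# predictors and outcome, yielding standardised (beta) coefficients.
# All numbers here are simulated; this is not the OU data.
import numpy as np

rng = np.random.default_rng(0)
n = 140  # matches the module count in Model 3
communication = rng.normal(size=n)   # share of communication activities
experiential = rng.normal(size=n)    # share of experiential activities
engagement = (0.4 * communication + 0.35 * experiential
              + rng.normal(scale=0.8, size=n))

def standardised_betas(y, X):
    """OLS betas after z-scoring outcome and predictors."""
    z = lambda a: (a - a.mean(axis=0)) / a.std(axis=0)
    beta, *_ = np.linalg.lstsq(z(X), z(y), rcond=None)
    return beta

betas = standardised_betas(engagement,
                           np.column_stack([communication, experiential]))
print(betas)  # both betas positive, echoing the communication/experiential effects
```

A standardised beta of .393 (communication, Model 3) means a one-standard-deviation increase in communication activities is associated with a .393 SD increase in VLE engagement, holding the other predictors constant.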
27. Table 4: Regression model of learner satisfaction predicted by institutional and learning design analytics

                              Model 1    Model 2    Model 3
Level 0                        .284**     .304**     .351**
Level 1                        .259       .243       .265
Level 2                       -.211      -.197      -.212
Level 3                       -.035      -.029      -.018
Year of implementation         .028      -.071      -.059
Faculty 1                      .149       .188       .213*
Faculty 2                     -.039       .029       .045
Faculty 3                      .090       .188       .236*
Faculty other                  .046       .077       .051
Size of module                 .016      -.049      -.071
Finding information                      -.270**    -.294**
Communication                             .005       .050
Productive                               -.243**    -.274**
Experiential                             -.111      -.105
Interactive                               .173*      .221*
Assessment                               -.208*     -.221*
LMS engagement                                       .117
Adjusted R²                    20%        30%        31%
n = 150 (Models 1-2), 140 (Model 3); * p < .05, ** p < .01

• Level of study predicts satisfaction
• Learning design (finding information, productive, assessment) negatively predicts satisfaction
• Interactive learning design positively predicts satisfaction
• VLE engagement and satisfaction are unrelated
28. Table 5: Regression model of learning performance predicted by institutional, satisfaction and learning design analytics

                              Model 1    Model 2    Model 3
Level 0                       -.142      -.147       .005
Level 1                       -.227      -.236       .017
Level 2                       -.134      -.170      -.004
Level 3                        .059      -.059       .215
Year of implementation        -.191**    -.152*     -.151*
Faculty 1                      .355**     .374**     .360**
Faculty 2                     -.033      -.032      -.189*
Faculty 3                      .095       .113       .069
Faculty other                  .129       .156       .034
Size of module                -.298**    -.285**    -.239**
Learner satisfaction (SEAM)              -.082      -.058
LMS engagement                           -.070      -.190*
Finding information                                 -.154
Communication                                        .500**
Productive                                           .133
Experiential                                         .008
Interactive                                         -.049
Assessment                                           .063
Adjusted R²                    30%        30%        36%
n = 150 (Models 1-2), 140 (Model 3); * p < .05, ** p < .01

• Size of module and discipline predict completion
• Satisfaction is unrelated to completion
• Learning design (communication) predicts completion
29. [Diagram repeated for 150+ modules, week 1 to week 30+: the four learning design profiles (Constructivist, Assessment, Productive, Socio-constructivist) linked to VLE engagement, student satisfaction and student retention, with the Communication pathway highlighted.]
Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341.
30. Agenda
1. The power of 151 Learning Designs on 113K+ students at the OU?
2. How can we use learning design to empower teachers?
3. How can Early Alert Systems improve Student Engagement and Academic Success? (Amara Atif, Macquarie University)
4. What evidence is there that learning design makes a difference over time and how students engage?
33. Toetenel, L., Rienties, B. (2016) Learning Design – creative design to visualise learning activities. Open Learning, 31(3), 233-244.
34. Agenda
1. The power of 151 Learning Designs on 113K+ students at the OU?
2. How can we use learning design to empower teachers?
3. How can Early Alert Systems improve Student Engagement and Academic Success? (Amara Atif, Macquarie University)
4. What evidence is there that learning design makes a difference over time and how students engage?
35. Early Alert Systems using Learning Analytics to Determine (and Improve) Student Engagement and Academic Success in a Unit: Student and Teacher Perspectives
November 02, 2016 - Amara Atif [amara.atif@mq.edu.au]
36. AIMS
(1) To study the attitudes, opinions and preferences of students with respect to early alerts.
(2) To study the perspectives of teachers regarding early alerts and the potential benefits, usage, usefulness and barriers to the use of a prototype early alert system as a means to improve the engagement and academic success of students at a unit level.
37. STUDENT SURVEYS - METHODOLOGY
The purpose of this survey was to gather feedback on students' attitudes, opinions and preferences with respect to early alerts: how do students respond to receiving an early alert, and do their opinions or preferences change if they actually receive one?
We surveyed 7,035 students in 17 undergraduate units in semester 2, 2015.
639 responses were deemed complete and usable, of which 13% came from international students and 62% from first-year students.
The link to the survey was included in the unit, and convenors informed students via announcements in iLearn (the Moodle-based LMS at MQ).
38. STUDENT SURVEYS - RESULTS
Do you want to be contacted?
79% wanted to be contacted if their performance in the unit was unsatisfactory.
(If yes) When would students like to be contacted?
39. STUDENT SURVEYS - RESULTS
Reason for contact (For what specific behaviours do you want to be contacted?)
40. STUDENT SURVEYS - RESULTS
How would you like to be advised about opportunities to seek assistance?
41. STUDENT SURVEYS - RESULTS
From the following strategies, which do you think would motivate you to seek help?
42. STUDENT SURVEYS - RESULTS
88 students were contacted by teaching or student support staff about their academic performance.
65 said that they took an action, such as becoming more attentive and starting to work seriously, or emailing teaching staff.
What was your attitude towards being contacted? (marked on a scale of 1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree)
4.14 I appreciated that there was someone watching out for me
4.14 I was grateful that somebody contacted me about my academic standing in this unit
3.80 I was glad to speak to my teaching staff about my situation
What impact did receiving an email from your unit convenor have on your motivation to continue in the unit? (84 responses)
55% It made me feel better
53% It made me feel like I could improve
12% It made me feel worse
43. STUDENT SURVEYS - RESULTS
Explain why you felt this way.
“It was encouraging/motivating/felt more confident.” (13)
“I felt like the convenor cared and wanted me to do well.” (9)
“Because it was a wake up call, there is help available but I need to be receptive to this help and make sure to commit more time to the unit so that I am well prepared for my exam.”
“I know I was a capable student, who just lacked major motivation. The email basically kicked me into gear and I completed all my assessments post-email to a high level.”
“The unit convenor gave me specific advice and encouraged me and it made me feel much better.”
“I’m glad I was contacted as it motivated me to complete the work and almost “show” the unit convenor that I could do it.”
“I felt like I needed to actively work throughout the semester, rather than procrastinating.”
“Given the fact that the unit convenor took the time to send an email shows his enthusiasm about teaching this unit and his willingness to engage with the students. He goes beyond his obligations as a lecturer and shows genuine concern about the students’ performance, something that I haven’t experienced in any of my previous units.”
44. Research & Publications
1. Atif, A., Richards, D., Bilgin, A., & Marrone, M. (2013). Learning Analytics in Higher Education: A Summary of Tools and Approaches. In H. Carter, M. Gosper & J. Hedberg (Eds.), Ascilite 2013: Electric Dreams: Proceedings of the 30th Ascilite Conference, Macquarie University, Sydney, Australia, 1-4 December 2013, pp. 68-72.
2. Atif, A., Richards, D., & Bilgin, A. (2013). A Student Retention Model: Empirical, Theoretical and Pragmatic Considerations. In Hepu Deng & Craig Standing (Eds.), ACIS 2013: Information Systems: Transforming the Future: Proceedings of the 24th Australasian Conference on Information Systems, Melbourne, Australia, 4-6 December 2013, pp. 1-11.
3. Atif, A., Richards, D., & Bilgin, A. (2015). Student Preferences and Attitudes to the Use of Early Alerts. Paper presented at the 21st Americas Conference on Information Systems (AMCIS), August 13-15, Puerto Rico.
4. Liu, D., Froissard, C., Richards, D., & Atif, A. (2015). Validating the Effectiveness of the Moodle Engagement Analytics Plugin to Predict Student Academic Performance. Paper presented at the 21st Americas Conference on Information Systems (AMCIS), August 13-15, Puerto Rico.
5. Atif, A., Liu, D., Froissard, C., & Richards, D. (2015). An Enhanced Learning Analytics Plugin for Moodle - Student Engagement and Personalised Intervention. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE), Nov 30 - Dec 2 2015, Curtin University, Perth, Western Australia.
6. Liu, D., Froissard, C., Richards, D., & Atif, A. (2015). Identifying and Contacting Dis-engaged Students in Moodle. Workshop presented at the Australian Learning Analytics Summer Institute (ALASI-15), University of Sydney, Australia, Nov 26-27, 2015.
7. Liu, D. Y., Richards, D., Dawson, P., Froissard, J.-C., & Atif, A. (2016). Knowledge Acquisition for Learning Analytics: Comparing Teacher-Derived, Algorithm-Derived, and Hybrid Models in the Moodle Engagement Analytics Plugin. Paper presented at the Pacific Rim Knowledge Acquisition Workshop (PKAW).
45. Early Alert Systems using Learning Analytics to Determine (and Improve) Student Engagement and Academic Success in a Unit: Student and Teacher Perspectives
November 02, 2016 - Amara Atif [amara.atif@mq.edu.au]
46. Agenda
1. The power of 151 Learning Designs on 113K+ students at the OU?
2. How can we use learning design to empower teachers?
3. How can Early Alert Systems improve Student Engagement and Academic Success? (Amara Atif, Macquarie University)
4. What evidence is there that learning design makes a difference over time and how students engage?
47. What evidence is there that learning design makes a difference over time and how students engage?
• 40 OU modules
• 30 weeks of study analysed
• Learning design per week
• Learning activities per week
• VLE engagement per week
Nguyen, Q., Rienties, B., & Toetenel, L. (Submitted: 17-10-2016). Unravelling the dynamics of instructional practice: A longitudinal study on learning design and VLE activities. Paper submitted to LAK2017.
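The week-by-week question on this slide can be sketched as computing, for each study week, the association between planned workload and observed VLE activity across modules. All numbers below are simulated, illustrative only:

```python
# Hedged sketch of a longitudinal design-vs-engagement analysis:
# per-week Pearson correlation between planned hours and VLE clicks
# across 40 modules over 30 weeks. The data are synthetic.
import numpy as np

modules, weeks = 40, 30
rng = np.random.default_rng(1)
planned_hours = rng.uniform(4, 12, size=(modules, weeks))  # designed workload
clicks = 20 * planned_hours + rng.normal(scale=30, size=(modules, weeks))

weekly_r = [np.corrcoef(planned_hours[:, w], clicks[:, w])[0, 1]
            for w in range(weeks)]
print(f"weekly correlations range: {min(weekly_r):.2f} to {max(weekly_r):.2f}")
```

Tracking how these weekly associations rise or fall (e.g. around assessment weeks) is the kind of dynamic the cited LAK2017 submission examines.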
49. Conclusions (Part I)
1. Learning design strongly influences student engagement, satisfaction and performance
2. Visualising learning design decisions by teachers leads to more interactive/communicative designs
50. Conclusions (Part II)
1. Learning design per week strongly influences learning analytics per week
2. Visualising learning analytics data can encourage teachers to intervene in-presentation and redesign afterwards
51. Learning design meets learning analytics
JISC/OU, 2nd of November 2016
@DrBartRienties
Reader in Learning Analytics
A special thanks to Avinash Boroowa, Aida Azadegan, Shi-Min Chua, Simon Cross, Rebecca Ferguson, Lee Farrington-Flint, Christothea Herodotou, Martin Hlosta, Wayne Holmes, Garron Hillaire, Simon Knight, Nai Li, Vicky Marsh, Kevin Mayles, Jenna Mittelmeier, Vicky Murphy, Quan Nguyen, Tom Olney, Lynda Prescott, John Richardson, Jekaterina Rogaten, Matt Schencks, Mike Sharples, Dirk Tempelaar, Lisette Toetenel, Thomas Ullmann, Denise Whitelock, John Woodthorpe, Zdenek Zdrahal, and others. A special thanks to Prof Belinda Tynan for her continuous support of analytics at the OU UK.
Editor's Notes
Learning Design Team has mapped 100+ modules
Explain seven categories
For each module, the learning design team together with module chairs create activity charts of what kind of activities students are expected to do in a week.
5131 students responded – 28%, between 18-76%
Cluster analysis of 40 modules (>19k students) indicates that module teams design four different types of modules: constructivist, assessment driven, balanced, or socio-constructivist. The LAK paper by Rienties and colleagues indicates that VLE engagement is higher in modules with socio-constructivist or balanced learning designs, and lower for constructivist designs. In terms of learning outcomes, students rate constructivist modules higher and socio-constructivist modules lower. However, in terms of student retention (% of students who passed), constructivist modules have lower retention, while socio-constructivist modules have higher retention. Thus, learning design strongly influences behaviour, experience and performance (and we believe we are the first to have mapped this with such a large cohort).
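The cluster analysis described in this note can be sketched in plain NumPy: each module is a point in "design profile" space, and k-means groups modules into design types. The profiles below are synthetic and two-dimensional for brevity; a real analysis would use all seven activity types and a library such as scikit-learn:

```python
# Illustrative k-means sketch (synthetic design profiles, plain NumPy).
# Columns: % assimilative, % communication. All values are invented.
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: random init from data points, then alternate
    assignment and centroid update; empty clusters keep their centre."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        centres = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centres[j] for j in range(k)])
    return labels, centres

rng = np.random.default_rng(2)
# Two synthetic design types: assimilative-heavy vs communication-heavy
assim_heavy = rng.normal([70, 10], 3, size=(20, 2))
comm_heavy = rng.normal([30, 40], 3, size=(20, 2))
X = np.vstack([assim_heavy, comm_heavy])
labels, centres = kmeans(X, k=2)
```

With well-separated profiles like these, the two design types are recovered cleanly; on real module data the cluster count (four, in the note above) would be chosen with standard diagnostics.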
(1)
What are the opinions and preferences of students with respect to early alerts?
How do students respond (attitude) to receiving an early alert/intervention?
Do students report change in behaviour for how they studied for a unit, if they actually receive an early alert?
Do early alerts increase student performance and motivation to continue in the unit?
Do early alert notifications increase student motivation to utilise the campus student support services?
(2)
What are the perceptions of teachers with respect to early alerts?
What are the experiences and motivations of teachers with regard to usage, helpfulness and barriers/challenges to the use of a prototype early alert system?
Undergraduate units (between 59-1455 students) representing all four faculties (including arts, human sciences, business, science and engineering) delivered in either an online or blended mode at our institution.
Unit Selection Criteria:
(1) These units were selected because they consisted of a range of online activities (forum discussions, quizzes, and assignments) that students needed to complete in the LMS.
(2) They had a relatively high number of at-risk students (at least 10% non-completion and fail rate in the last study period).