SHEILA project LAK17 workshop slides
1. Supporting Higher Education to Integrate Learning Analytics
http://sheilaproject.eu/
LA Policy: Developing an Institutional Policy for Learning Analytics using the RAPID
Outcome Mapping Approach
LAK’17 Workshop
13th March 2017
Yi-Shan Tsai, Dragan Gašević, Pedro J. Muñoz-Merino, Shane Dawson, Maren Scheffel,
Alexander Whitelock-Wainwright
2. Workshop schedule
Time Schedule
13.00-13.05 Welcome & introduction
13.05-14.30 Five presentations & discussions
1. Overview & LA in Australia: Prof Dragan Gasevic (University
of Edinburgh)
2. LA in Europe (SHEILA Project): Prof Pedro J. Muñoz-Merino
(Universidad Carlos III de Madrid)
3. SHEILA – group concept mapping: Maren Scheffel (Open
University of the Netherlands)
4. SHEILA – institutional interviews & survey: Dr Yi-Shan Tsai
(University of Edinburgh)
5. SHEILA – student survey: Alexander Whitelock-Wainwright
(University of Liverpool)
14.30-15.00 Afternoon Tea
15.00-16.00 Group work on the assessment of institutional readiness
16.00-17.00 LA Policy drafting
3. National project to benchmark
LA status, policy and practices for
Australian Universities
Dragan Gasevic
(thanks Shane Dawson for the slides!)
6. Aims
understand current LA practice in Australia
unpack the challenges to institutional adoption
identify practices that can aid the implementation
of LA
7. Approach
2 complementary but separate studies
Study 1 – interviews with senior institutional leaders
Study 2 – concept mapping with LA expert panel
8. Study 1
First study
Interviews with 32 Universities:
Identification of current practice, methods and
approaches
Identification of key drivers for institutions, stage of
development, process for implementation, project leads
9. Study 1
Much interest in LA
Stated organisational priority
LA projects were in the early phases of implementation
and small scale (at time of interview July 2014)
2 distinct clusters across variables such as:
implementation, conceptualisation, readiness
Cluster 1 (n=15) – Solutions focused
Cluster 2 (n=17) – Process focused
11. Strategic capability
1. Solutions focused
› LA to address a pressing need
› Time sensitive
2. Process focused
› Networked and integrated model
› Minimal time pressures
› Innovation and experimentation
12. Study 2
What are the ideal dimensions for long term
sustainable uptake of LA?
Invitations sent to Australian and international LA experts
28 completed all concept mapping phases
Prompt: ‘for LA to make a continued impact on
learning and teaching it would need to…’
3 phases – brainstorming; sorting and ranking
of statements
14. Bringing it together
Study 1 – 2 clusters
Study 2 – 7 clusters
Essentially – how an organisation approaches its
conceptualisation of LA underpins (2 clusters) the
method for deployment and adoption (7 clusters)
18. Bringing it together
Challenges to be addressed:
Leadership awareness
Teams are seldom interdisciplinary
IT driven and system focused
Scale versus understanding
Capabilities and skills deficit
Over-reliance on current research – requires further
validation across different contexts to demonstrate
transportability of models
19. Complexity
Leveraging the outcomes of short term goals for long
term gain
How do we merge both models to gain both short and
long term impact?
20. Conclusion
LA requires alternate models for implementation
and leadership
› Enabling leadership
› Whole of organisation
› Models that are agile and research informed
Working in complexity creates friction
› Embrace the friction – generates innovation
21. Conclusion
A solutions-based model can drive change –
but be mindful of responding to changing
organisational needs
A process-based model can drive innovation and
interest – but be mindful of how to scale
22. Conclusion
Combined model framed in the organisational context
› Small, diffuse pockets of innovation to build
capacity and build interest
› View to scale adoption – demonstration of
impact (technical, pedagogical)
› Distributed enabling leadership (complexity
leadership)
23. Conclusion
Any “successful” adoption of LA will be dependent
on an institution’s ability to rapidly recognise and
respond to the organisational culture and the
concerns of all stakeholders.
55. Go Zone – Privacy & Transparency
[Go-zone scatter plot: statements plotted by ease (y-axis, 3.12 to 6.08) against importance (x-axis, 3.83 to 6.59); r = 0.45]
2. transparency, i.e. clearly informing students of how their data is collected, used and protected
88. a clear description of data protection measures taken
10. a clear description of data usage
17. being clear about the purpose for collecting certain types of data
24. aligned with data protection regulations (institutional, national, international)
34. to assure that the collected data is used only for the purpose of improving learning and instruction
96. an agreement between learners, teachers and policy makers on regulating a proper use of data
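The r value on the go-zone plot is a Pearson correlation between the statements' mean ease and importance ratings. As a rough illustration with hypothetical ratings (not the SHEILA data), it can be computed as:

```python
import numpy as np

# Hypothetical mean ratings for five statements (illustrative only)
ease = np.array([4.2, 5.1, 3.6, 5.8, 4.9])
importance = np.array([5.0, 6.2, 4.1, 6.0, 5.5])

# Pearson correlation between the two rating dimensions
r = np.corrcoef(ease, importance)[0, 1]
print(round(r, 2))  # → 0.92
```

Statements that score high on both axes (the "go zone") are the natural candidates for early policy action.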
67. Institutional interviews
• Methodology
• Topics of the questions
• Analysis
• Preliminary findings
Researchers:
Yi-Shan Tsai (University of Edinburgh)
Ioana Jivet (Open University of the Netherlands)
Pedro Manuel Moreno Marcos (Universidad Carlos III de Madrid)
Kairit Tammets (Tallinn University)
68. Interviews – methodology
• Purpose: to understand the adoption of Learning
Analytics in higher education institutions through
direct conversations with decision makers.
• Sampling: deans, vice-deans, vice-rectors, vice
principals, heads of IT or eLearning, and senior
researchers or project managers of learning
analytics.
• Length of interviews: 25 ~ 60 mins
• Period of data collection: August 2016 ~ January
2017
69. Interviews – methodology
• 64 interviews
• 51 HEIs
• 16 countries
* Two additional interviews were carried out with a national collaborative ICT organisation in the Netherlands and the
Ministry of Education and Science in Estonia.
70. Interviews – topics
• LA projects, scope, motivations, and goals
• LA strategy
• Progress and achieved goals
• Challenges
• Ethics and privacy
71. Interviews – analysis
Macfadyen, L., Dawson, S., Pardo, A., Gašević, D., (2014). The learning analytics imperative and the
sociotechnical challenge: Policy for complex systems. Research & Practice in Assessment, 9(Winter 2014), 17-28.
72. Interviews – preliminary findings
• Current state of adop4on
[Chart: the adoption of learning analytics across the 51 HEIs; statuses: no plans, in preparation, implemented; scope categories: institution-wide, partial/pilots, data exploration/cleaning, N/A; group sizes shown: 18, 12, 9, 7, 5]
73. Interviews – preliminary findings
• Step 1 – Map political context
[Diagram: LA at the centre of learner, teaching, institutional, and external contexts]
❑ Data protection regulations
• 2018 GDPR
• The strictness of existing DP regulations → a stopper
❑ Pressure to adopt LA
❑ Existing solutions focus on addressing retention problems
❑ No one-size-fits-all solutions
74. Interviews – preliminary findings
• Step 2 – Identify key stakeholders
[Diagram: stakeholders (managers, teachers, students, professional supports)]
❑ 25 institutions have established formal working groups.
❑ Not all institutions have planned to provide analytics data to students.
• Concerns about demotivation.
75. Interviews – preliminary findings
• Step 3 – Identify desired behaviour changes
[Diagram: LA across learner, teaching, and institutional levels]
❑ Interventions
• E-mail alerts and personal contacts with students (7 HEIs)
• Teaching reports (1 UK HEI)
❑ Feedback
• 3 HEIs received positive feedback
• Low response rates
How can institutions engage students with LA?
76. Interviews – preliminary findings
• Step 4 – Develop engagement strategy
[Diagram: LA linked to digitalisation strategies and teaching & learning strategies; callout: no defined strategy]
❑ Consultations with primary stakeholders
• 23 HEIs
• Surveys
• Focus groups
• Workshops
• Annual staff development conference
• Self-help toolkit
❑ Visualised dashboard
• 24 HEIs
Danger of being data driven
77. Interviews – preliminary findings
• Step 5 – Analyse internal capacity to effect change
[Diagram: institutional context (managers, teachers, students, professional supports); ethics & privacy; trust in LA]
❑ Technological resources
❑ Human resources
❑ Funding
❑ Institutional culture
• Awareness
• Buy-in
• Common understanding
• Analytics skills
➢ Clear value and relevance
➢ Institutional priorities
➢ Workload
➢ Teaching styles
➢ Monitoring
➢ Confidence in handling new technology & analytics data
➢ Passive engagement with studies
➢ Lack of interest in certain subjects
78. Interviews – preliminary findings
• Step 6 – Establish monitoring and learning
frameworks
• Few were able to talk about plans for evaluation.
• It is difficult to isolate and define the success of a
learning analytics project that is implemented
alongside other projects with the same goals to
enhance learning and teaching.
• 12 institutions have developed or planned to
develop a policy to accommodate the use of
learning analytics.
79. Conclusion
• Success to date:
• Scaled up the institutional capacity (step 5)
• Better understanding of challenges
• Gained experience
Siemens, G., Dawson, S., & Lynch, G. (2014). Improving the Quality and Productivity of the Higher Education Sector – Policy and Strategy for Systems-Level Deployment
of Learning Analytics. Canberra, Australia: Office of Learning and Teaching, Australian Government.
80. Institutional survey
• Topics investigated
The development of LA:
• Current adoption
• Institutional infrastructure and capacity
• Strategy and policy
• Legal and ethical considerations
• Evaluation
Self-evaluation of LA maturity and institutional readiness:
• LA maturity
• LA success
• Culture
• Data and research capabilities
• Legal and ethical considerations
• Training and communication
84. Institutional survey
• Strategy and evaluation
• Among institutions that have implemented LA:
§ There is a strategy: 20%
§ In the process of developing a strategy: 20%
§ No clear strategy: 46.7%
§ Have developed success criteria: 27%
88. What do students want? Towards an
instrument for students' evaluation of
quality of learning analytics services
Alexander Whitelock-Wainwright, Dragan Gašević, &
Ricardo Tejeiro
A.Wainwright@Liverpool.ac.uk
90. What is service quality?
• Subjective assessment of the degree to which a
service user's needs or expectations were met
(Parasuraman, Zeithaml, & Malhotra, 2005).
[Diagram: expectations → service usage/exposure → perceptions → attitude]
91. What is service quality? (Contd.)
• High service quality encourages users to choose a
provider's service over competitors' (Parasuraman,
Zeithaml, & Berry, 1988).
• Service quality in higher education (Spooren,
Brockx, & Mortelmans, 2013).
• Ideological gap (Ng & Forbes, 2009).
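In the SERVQUAL tradition of Parasuraman et al. (1988), service quality is commonly operationalised as a gap score: perception rating minus expectation rating, item by item. A minimal sketch with hypothetical item names and ratings (not the instrument developed here):

```python
# SERVQUAL-style gap scores: perception minus expectation per service item.
# Item names and ratings are hypothetical, not from the SHEILA instrument.
expectations = {"timely_feedback": 6.5, "data_transparency": 6.8, "personal_support": 6.0}
perceptions = {"timely_feedback": 4.9, "data_transparency": 5.2, "personal_support": 5.5}

# Negative gaps flag services falling short of students' expectations.
gaps = {item: round(perceptions[item] - expectations[item], 1) for item in expectations}
print(gaps)  # → {'timely_feedback': -1.6, 'data_transparency': -1.6, 'personal_support': -0.5}
```

The "ideological gap" noted above is a related idea: what students expect of higher education may itself diverge from what the institution intends to provide.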
92. Service Quality in Learning
Analytics
• Learning analytics services designed to support
learning.
• Various stakeholder groups within learning
analytics (e.g., students, teachers, managers)
(Clow, 2012).
• Quality indicators of learning analytics tools
(Scheffel, Drachsler, Stoyanov, & Specht, 2014).
97. Pilot Study Results
• Instrument reduced to 19 items.
• Two-factor solution for both scales:
• Service expectations:
o Desires scale – 0.88 alpha.
o Predictive scale – 0.88 alpha.
• Ethical expectations:
o Desires scale – 0.82 alpha.
o Predictive scale – 0.86 alpha.
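The reliability figures above are Cronbach's alpha values. As a hedged illustration of how such an alpha is computed (toy Likert data, not the pilot responses):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses (4 respondents x 3 items)
scores = np.array([
    [5, 4, 5],
    [4, 4, 4],
    [2, 3, 2],
    [3, 3, 3],
])
print(round(cronbach_alpha(scores), 2))  # → 0.93
```

Values around 0.8 or above, as reported for both scales, are conventionally taken to indicate good internal consistency.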
105. Workshop schedule
Time Schedule
13.00-13.05 Welcome & introduction
13.05-14.30 Five presentations & discussions
1. Overview & LA in Australia: Prof Dragan Gasevic (University
of Edinburgh)
2. LA in Europe (SHEILA Project): Prof Pedro J. Muñoz-Merino
(Universidad Carlos III de Madrid)
3. SHEILA – group concept mapping: Maren Scheffel (Open
University of the Netherlands)
4. SHEILA – institutional interviews & survey: Dr Yi-Shan Tsai
(University of Edinburgh)
5. SHEILA – student survey: Alexander Whitelock-Wainwright
(University of Liverpool)
14.30-15.00 Afternoon Tea
15.00-16.00 Group work on the assessment of institutional readiness
16.00-17.00 LA Policy drafting
106. Tell us about you
• Name
• Organisation
• Your experience with LA
• Expectations of the workshop
107. Assessment of institutional readiness
• Use the ROMA framework to map your progress
towards deploying learning analytics.
Macfadyen, L., Dawson, S., Pardo, A., Gašević, D., (2014). The learning analytics imperative and the
sociotechnical challenge: Policy for complex systems. Research & Practice in Assessment, 9 (Winter 2014), 17-28.
1) Map political context
2) Identify key stakeholders
3) Identify desired behaviour changes
4) Develop engagement strategy
5) Analyse internal capacity to effect changes
6) Establish monitoring and learning frameworks
118. • 1. Audience
• Who is this policy for? Consider whose working
activities the policy will shape.
• 2. Purpose
• What are your policy objectives? Consider your
objectives for LA and the changes you seek to achieve.
The policy will set these out as the reasons for introducing LA.
Audience & Purpose ~5 mins
Keep your group notes here:
Group 1: https://goo.gl/CcGlTk
Group 2: https://goo.gl/zrMsyi
Group 3: https://goo.gl/3WNr62
Group 4: https://goo.gl/WzLmQc
119. • 3. Process
• a) Data collection
• b) Data management
• c) Stakeholder engagement
• d) Evaluation
Process ~15 mins
Keep your group notes here:
Group 1: https://goo.gl/CcGlTk
Group 2: https://goo.gl/zrMsyi
Group 3: https://goo.gl/3WNr62
Group 4: https://goo.gl/WzLmQc
120. • 4. Policy management
Policy management ~10 mins
Final check-ups
• Will the policy need to state anything that should NOT be done in relation to
LA?
• Are there any limitations or potential challenges and risks about LA that
should be addressed in the policy?
• Can you narrow the policy down to a short list of principles?
Keep your group notes here:
Group 1: https://goo.gl/CcGlTk
Group 2: https://goo.gl/zrMsyi
Group 3: https://goo.gl/3WNr62
Group 4: https://goo.gl/WzLmQc