1. Student Digital Experience Tracker 2018
Sarah Knight, Tabetha Newman & Helen Beetham
18/10/2017
#digitalstudent https://digitalstudent.jiscinvolve.org
2. Introducing the Tracker team
Sarah Knight, Head of change: student experience, Jisc
Ruth Drysdale, Senior co-design manager, Jisc
Jess Francis, Research manager, Jisc
Helen Beetham
Tabetha Newman
Mike Gulliver
#digitalstudent http://bit.ly/trackerguide
3. Update on the tracker project
Sarah Knight
4. The ‘digital student’ project
»Reviewed students’ expectations and experiences
of the digital curriculum, environment and services:
»in HE (2013-2014)
»among school leavers (2014)
»in FE (2014) and in the adult and skills sector (2015)
»among online learners (2016)
http://digitalstudent.jiscinvolve.org
5. Outcome: student digital experience tracker
From consultations with Jisc stakeholders we found:
› interest in the ‘digital student’ studies
› appetite to find out more locally, engage
students in their own digital experience
› no existing instruments, surveys or quality
processes captured the digital experience
› organisations wanted something proven,
easy to administer, credible, and actionable.
Find out more: http://ji.sc/student-tracker
6. Tracker 2017
» 74 UK + 10 international
institutions involved
» Over 27,000 student responses
» In-depth evaluation and analysis
» New case studies showing
organisational impact
» Findings have had high impact
and sector interest, see
http://bit.ly/tracker17brief
» Full report available from
http://bit.ly/jisctracker17
7. Reasons for using the Tracker
1. Inform ourselves about the student digital experience
2. Gather evidence to support specific actions (e.g. in respect of the
curriculum, student digital support, and/or digital infrastructure)
3. Demonstrate that we are engaging with students and responding to
their feedback
4. Evaluate ourselves – against other institutions (benchmark), over time
(monitor), or in relation to our aims for a specific
initiative/strategy/project etc. (evaluate)
5. Gain intrinsic value from the project process (e.g. student engagement,
other stakeholder engagement, benefits of working with Jisc/other
institutions)
8. Evaluation 2017: what we learned
and how we responded
Helen Beetham
9. Evaluation 2017
»Analysis of student dataset
»Survey of institutional leads
»Follow-up interviews
»Analysis confirms the survey is
robust (e.g. factors hang together)
»Survey/interviews found the process is valued
»Scale of use: 84 institutions, nearly 30k students
(22k in the UK); considerably more this year.
10. Evaluation 2017
»Unexpected outcome: impact of synthesised findings
› over 14k downloads of the findings (all formats)
› 4.5k views of news story online
› high level interest from external agencies
› opportunities for research and sector intelligence
11. New for 2017-18
»Refinements to student questions (4 versions, 2 languages)
»New! Questions asked at organisational level
»Review panels to support analysis and reporting
»Option to send out individualised links to learners
»More customisation available
»BOS now a full Jisc service:
› updated guidance + contextual help
› tracker.support@jisc.org +
› webinars, FAQs, jiscmail list etc
12. New for organisational leads
»Tell us about (at least 4 of) 10 key factors
› identified from research + consultation
› may be related to the overall student digital experience
› strategy, infrastructure, LT staff, staff CPD and reward…
»Compare with sector norms
»Compare organisational with student view (and with staff)
»Option to be involved with analysis, review and reporting
at sector level
13. New in the question set
»New structure for users with clearer navigation
»Small number of questions removed
»Digital on my course:
› questions about digital environment now core
› new questions on teaching spaces, course software
»Attitude to digital learning
› new question on preference for independent/group work
› new question on ‘how much digital…?’
»Clearer questions on support and preparation to learn
»Two new questions (key/summary metrics?):
› Overall satisfaction with digital environment
› Overall satisfaction with digital learning and teaching
14. Making the questions count
Over to you
15. Activity: making the questions count
»Focus on two or three questions you find interesting or
valuable
»In pairs discuss:
› what could your organisation do with this data?
AND/OR
› What difference could this data make to learners?
»Share your thoughts on the padlet
bit.ly/experttrack
17. What organisations told us
»7 detailed case studies (2016) + 15 snapshots (2017)
»Evidence of early value:
› Engaging students in thinking about digital issues
› Raising awareness and ‘making the case’
› Identifying strengths and weaknesses relative to sector norms
› Baselining digital initiatives
»Evidence of longer-term impact…
› We hope will emerge this year with returning organisations
18. What organisations told us
»Why they did it
19. What organisations told us
»What they got
out of it
20. How can we make more of a difference…?
»Already this year: 211 registrations (2,243 views),
83 confirmations
»More international interest: Australia, Canada, US, New
Zealand, Africa, Brazil…
»Most participants have answered most or all of the 10
organisational questions
»Around half of previous users have signed up: many more
plan to run the tracker on a two-year cycle
»What impact could this data be having across the
sector(s)…?
21. Making a difference
Over to you
22. Activity: making the difference
»At your table:
› You have a folder addressed to a stakeholder
› In the folder are the findings of the 2017-18 tracker
› Add messages to the outside of the folder that will
persuade this stakeholder to open and read it
› (think format and medium as well as content)
› What do you want them to hear? What do you want
them to do (differently) as a result?
› Be ready to feed back (or act out)!
24. Next steps and opportunities to get involved
»Expert Review Panels
› Help frame headline messages & recommendations
› Student representation + tracker users + partner bodies | sector-specific but working closely together
› Review summary data | meet (May 2018) | review final report
› No identifiable institutional data will be shared with Panels
› Sign up via online form
»Staff tracker
› Complementary to the student tracker (& organisational questions)
› Help review & trial draft question set (sign up via jiscmail list)
25. Get involved
»Sign up for the 2018 Tracker
http://bit.ly/trackersignup18
»Join the tracker mailing list
http://jiscmail.ac.uk/jisc-digitalstudent-tracker
»See project website:
http://ji.sc/student-tracker for
briefing booklet and reports
»Guidance available from
http://bit.ly/trackerguide
»Follow our blog:
https://digitalstudent.jiscinvolve.org
»Follow #digitalstudent and
@jisc
»New! Apply to join the expert review panels:
https://tracker.onlinesurveys.ac.uk/tracker-panel-2018