Gallagher, S. (2015). Measuring learning engagement in online medical education. Paper presented at the AMEE eLearning Symposium, Glasgow.
DOI: 10.13140/RG.2.1.2873.7767
1. Measuring learner engagement in online medical education
Steve Gallagher
Dunedin School of Medicine
University of Otago
New Zealand
steve.gallagher@otago.ac.nz
@stevegallagher
People in education technology talk a lot about the importance of making learning engaging.
Here you can see some examples from learning delivery platforms, adaptive learning systems, and the elearning guild.
Use our platform, follow our tips, use a responsive design to make your learning engaging.
It seems clear from the previous examples that 'engagement' means different things to different people: it might refer to the visual design, the interactivity, or the time learners spend working on a learning resource.
Sometimes it’s used to talk about a student’s participation in a course or programme over a long period of time. Tying this down is difficult.
So what am I talking about?
I’m talking about the need, in my mind anyway, to operationally define engagement so it can be measured in a meaningful way, so that we can find out what students think about the online learning resources we develop and, in some cases, inflict on them.
This project started when we were about to launch an online self-directed eCase for 2nd year medical students at the University of Otago in Dunedin, New Zealand. This project was part of a collaboration with AD Instruments and at that time used their platform ‘Lab Tutor Online’.
It had been a major project and was a new direction in how case-based learning was being delivered in the curriculum. Previously, and most commonly still, case-based learning occurs using paper cases and two face-to-face tutorials in small groups. This project took the first of those two-hour blocks for one case and developed a self-directed learning experience in its place. Understandably, we were keen to know how this project was perceived by students.
When we were talking about evaluating the student experience, someone said ‘we need to ask them if they thought it was engaging’, which is a bit of a trigger word for me. Because what do we mean by engagement?
Other disciplines have been talking about this for a long time, and I looked to Information Science and Human Computer Interaction for advice.
Excellent paper. Highly recommended.
It also acknowledges the difficulty of defining engagement. The authors looked at a number of theoretical approaches, including flow theory, play theory, and aesthetic theory, and identified commonalities in their conceptualisation of engagement, which helped them arrive at the definition above.
They went on to suggest dimensions that are particularly important in education, and furthermore interviewed users of a number of different systems to develop a conceptual theory of engagement that traverses the point of engagement, continued engagement, and disengagement. It’s well worth a read.
What I’m presenting today is inspired by this approach, but not a test of their framework.
Many items were inspired by the O’Brien and Toms approach, but others (such as integration) are more specifically related to medical education.
These items, together with the aesthetic appeal and endurability items from the previous slide, make up the 11-item instrument.
Both of these were asked on 5-point scales, though they used different labels than the others.
11 items, each scored 1–5.
Total instrument score bounded by 11–55.
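To make the scoring concrete, here is a minimal sketch of how a total score on the instrument is formed; the ratings below are invented for illustration, not data from the study.

```python
# Hypothetical example of scoring the 11-item engagement instrument.
# Each item is rated on a 5-point scale (1-5), so totals range 11-55.
def total_engagement(ratings):
    assert len(ratings) == 11, "instrument has 11 items"
    assert all(1 <= r <= 5 for r in ratings), "items use a 1-5 scale"
    return sum(ratings)

example = [4, 3, 5, 3, 4, 2, 4, 3, 3, 4, 3]  # invented ratings
print(total_engagement(example))  # 38, within the 11-55 bound
```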
234 respondents, response rate = 82%.
Average engagement 36.5
Mean engagement score was related to the level of challenge reported: significant difference between challenge level 3 and levels 4–5 combined.
Significant linear trend relating effectiveness and engagement.
Little things make a difference.
Modern typeface
Whitespace
Smoother transitions between screens
Embedded media
Checkpoint not commit – improved user experience
Significant differences on many items (Mann-Whitney tests).
Total score changed from 36.5 to 38.9 between years; significant t-test, p = .001.
Cronbach's alpha = 0.83.
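For readers unfamiliar with the statistic, Cronbach's alpha can be computed from item-level responses as below. This is a generic sketch of the standard formula, using invented data, not the study's responses or analysis code.

```python
from statistics import pvariance

def cronbach_alpha(rows):
    """Cronbach's alpha for internal consistency.

    rows: one list per respondent, each containing their item scores.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    """
    k = len(rows[0])                       # number of items
    cols = list(zip(*rows))                # item-wise columns
    item_vars = sum(pvariance(c) for c in cols)
    total_var = pvariance([sum(r) for r in rows])
    return k / (k - 1) * (1 - item_vars / total_var)

# Perfectly parallel items (every respondent gives the same score on
# both items) yield the maximum consistency, alpha = 1.0.
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # 1.0
```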
Broadly, some evidence that the instrument is responsive to change. The response rate was low, so be wary of reading too much into this, but it is promising.
We can’t say if the change in engagement reported here is related to change in outcomes for students. We want to investigate this in the future.
Developed a paper-based version, adapted some questions, excluded others.
Initial psychometric analysis indicates good internal consistency (0.80), but some of the items we introduced had relatively low correlations.
However, we did have a 100% (n = 76) response rate, and the data were useful in identifying strengths and weaknesses of the system.
Mean total engagement score here was 24.8 (a bit of a hack: we had to transform the data to make the score comparable because there were fewer items). Indicative only.
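The notes do not specify the transformation used. One plausible approach, shown purely as an assumption and not the study's actual method, is to rescale the mean item score onto the 11-item (11–55) range:

```python
# Hypothetical rescaling: an assumption about how a total from a shorter
# version of the instrument could be put on the 11-item scale. The
# slides do not state the transformation actually used.
def rescale_to_11_items(raw_total, n_items):
    mean_item = raw_total / n_items   # average score per item
    return mean_item * 11             # project onto the 11-item range

print(rescale_to_11_items(18, 8))  # e.g. an 8-item total of 18 -> 24.75
```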
Small groups, paper-based. Slightly modified instrument, same number of items.
Higher overall engagement, but numbers too low for statistics.
Why? More appreciative of flexibility, more eCases highly valued, more likely to revisit.
This may reflect a change in cohort: clinical attachments.