1. Improving student engagement with the assessment process in undergraduate microbiology modules
Dr Alison Graham
School of Natural and Environmental Sciences
Newcastle University
Microbiology Society Annual Conference 2018
2. Aims of the project
• To engage students in the entire marking process, from the setting of marking criteria through to the receipt and feed-forward application of feedback
• To write/design effective marking criteria that are specific to individual pieces of work
• To engage students in the process of using marking criteria in preparation for an assignment
• To provide feedback on coursework that links directly to marking criteria
• To use Feedback Studio to develop libraries of feedback comments that can function much like dialogue with students
Implicit questions in our original proposal:
1. What do students already know about marking criteria?
2. Can typed (even repeated!) comments work like a dialogue? Will students recognise this?
3. Can we involve students in writing marking criteria?
4. Level 4: Microbiology – focus group
How can marking criteria be used to make expectations clear?

I have read a research paper published in a peer-reviewed journal.
1. Yes
2. I’ve read some but found them difficult to understand
3. No
4. I’m not sure what you mean by a peer-reviewed journal

Write your report “in the format of a scientific paper” – do you know what this means?
1. Yes
2. No
3. To some extent
6. Level 4: Microbiology – engaging with criteria
Into what grade boundary would results example 1 fall?
1. 0-39%
2. 40-49%
3. 50-59%
4. 60-69%
5. 70-100%

Which title scored the highest?
1. Example 1
2. Example 2
3. Example 3
9. Using Feedback Studio to provide feedback linked to marking criteria
Feedback Studio is:
• Part of the Turnitin software, accessed at Newcastle University through the VLE (Blackboard)
• A platform through which students submit coursework online as a Word document or PDF (or in other file formats)
• A platform through which markers can provide three types of feedback:
  o In-text comments: bubble comments, text comments and QuickMark comments
  o Rubric/grading form
  o General comments: voice comments and text comments
12. What did the students think?
• 75% found it useful to have the marking criteria in advance.
• 100% thought it was useful to see how they performed against the marking criteria.
• 100% preferred electronic feedback to feedback on a pro forma or mark sheet.
• 100% thought electronic feedback makes it easier to understand comments about grammar.
• 100% thought electronic marking encourages more positive feedback.
• 100% found the comments to be specific to the piece of work.
• 100% would like to have received more electronic feedback in other modules.
13. What did the students think?
“This [the rubric] was the most useful aspect of the electronic feedback as this helped me to gauge which areas of the assignment I was lacking and therefore where I would need to focus my improvement for future work.
It also helped me to understand why I had received the mark I had in relation to the marking criteria for each section and thus why my overall grade was within a certain grade boundary.”

“I felt like it was easier for the marker to provide positive comments and this is also important feedback - it is good to know when a specific section is very good in order to use this style/technique in another piece of work.
I think the automated comments and marking rubric make the marker more fair as it ensures they connect each section of work to the relevant criteria section.”
Overall, the students voted that the electronic comments were more positive, more fair, more thorough, more helpful, easier to understand and more specific (compared with other feedback).

“I have received helpful feedback during the module, course or unit” – mean agreement: 4.8/5.0
14. Feedback Studio analysis
• Number of students that receive different types of grammatical comments – identify common errors, e.g. punctuation
• Number of students that fall into each mark range for each criterion
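The two tallies described on this slide amount to simple aggregations over an export of marks and comments. As a minimal sketch only: the record layout below (student ID paired with a QuickMark label, and a per-criterion mark dictionary), the comment labels and the sample values are all illustrative assumptions, since the deck does not specify Feedback Studio's export format.

```python
from collections import Counter

# Hypothetical export: one row per QuickMark comment,
# as (student_id, comment_label). Values are illustrative only.
comments = [
    (1, "punctuation"), (1, "tense"), (2, "punctuation"),
    (3, "citation"), (3, "punctuation"), (4, "tense"),
]

# Tally 1: number of students receiving each type of comment
# (each student counted once per label, however often it recurs).
students_per_label = Counter()
for label in {lab for _, lab in comments}:
    students_per_label[label] = len({s for s, lab in comments if lab == label})

# Tally 2: number of students in each mark range for one criterion,
# using the grade boundaries shown elsewhere in the deck.
marks = {1: 72, 2: 55, 3: 48, 4: 63}  # student_id -> mark (illustrative)
bands = [(70, 100), (60, 69), (50, 59), (40, 49), (0, 39)]
students_per_band = Counter()
for mark in marks.values():
    for lo, hi in bands:
        if lo <= mark <= hi:
            students_per_band[f"{lo}-{hi}%"] += 1
            break

print(students_per_label)  # punctuation is the most common error here
print(students_per_band)
```

With a real Feedback Studio export the same two loops would run over the downloaded comment and rubric data; only the parsing of the input rows would change.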
15. Feedback Studio analysis
Percentage of students that viewed feedback from the microbiology report (2013-14 academic year; n = 184), after 3.5 weeks and after 6.5 months:

Grade range   % viewed (3.5 weeks)   % viewed (6.5 months)
70-100%       84                     84
60-69%        46                     64
50-59%        49                     51
40-49%        48                     52
0-39%         14                     14
16. Benefits of electronic marking
Students’ perspective
• Feedback is easier to read and is automatically saved online
• Students can access feedback in private and on their own time
• More positive feedback
• Increased perceptions of fairness and transparency with rubric
• More detailed feedback; moderation more obvious
Markers’ perspective
• No printing/scanning for retention
• Linked to originality check
• More detailed comments with less work
• Library bank of comments helps to avoid repetition/increases consistency
• Record of submission, return of feedback and feedback viewed
17. Impact and reflections
• Assessment-specific marking criteria and engagement sessions are now standard in all modules I lead, e.g.:
  o Level 4 Microbiology
  o Level 5 Employability Skills
  o Level 6 Dissertation projects
• Colleagues across the University have adopted assessment-specific marking criteria and Feedback Studio.
• Writing effective marking criteria takes time and several revisions.
• Can independent engagement with marking criteria be just as useful as classroom sessions?
• How can students be involved in writing the marking criteria?
• How can we keep feedback in students’ consciousness?
18. Any questions?
For more information, please get in touch: alison.graham@ncl.ac.uk, tweet @alisonigraham or visit https://www.slideshare.net/alisongraham15/presentations
Dr Sara Marsham, @sara_marine
Thanks to the Newcastle University Innovation Fund for funding the original work and for ongoing support.
Our thanks to all of the students who took part and shared their opinions.