Test your EMA fitness
1. Test your EMA fitness: a tailored approach to enhancing assessment through technology
2. Overview
»The story so far, background and context
»Introduction to the EMA readiness tool
»An institutional perspective
»Panel discussion
#jiscassess
4. Assessment and feedback challenges
Assessment and feedback (2012):
› Highly devolved responsibility and inconsistent practices
› Lack of developmental focus
› Traditional practices dominate
› Timeliness, quality and consistency of feedback
› Learner in passive role
› Lack of relevance to world of work
EMA (2014):
» Nuanced responses but broad trends visible
» Localised initiatives but beginning to scale up
» System integration is a key problem area
» EMA exposes variability in business processes
7. Developing solutions
Challenge → Solution:
» Variation of business processes; EMA systems not meeting UK needs → EMA processes and systems guide
» Traditional practices, assessment design, assessment literacies, risk aversion → Transforming assessment and feedback guide
Plus a third resource: the EMA readiness tool.
8. ALT member involvement so far
»Helped prioritise challenges
»Helped define themes for toolkit
»Initial feedback on self-assessment questions
»Discussed initial process mapping and identified issues
»Members involved in all working groups
9. Tackling business process issues
» Lots of local variation
» Only 12% of universities have ‘highly standardised’ processes
» Different interpretations of policy
» Variation hidden until EMA is tried
10. Clarifying business processes
Simple model to prompt process review:
» Are you doing additional tasks? If so, why?
» Are the tasks being done by the right people, e.g. do you have academic staff undertaking administrative duties that do not require academic judgement?
» Do you have systems that could carry out some of the tasks you are doing manually?
» Do you have multiple ways of performing the same task? If so, why?
11. Defining system requirements
» Generic process descriptions facilitated clear definition of system requirements
» Requirements validated with UCISA
» Suppliers responded to the requirements using a standard template
13. Technology-enhanced assessment design
» Themes:
› Assessment design
› Assessing group work
› Assessment literacy
› Pattern and scheduling
› Employability and assessment
› Feedback and feedforward
› Inclusive assessment
› Marking practice
› Peer assessment and review
› Quality assurance and standards
14. Embedded in the resources
EMA case studies and examples:
» Queen’s University Belfast
» Institute of Education (now part of UCL)
» University of Hertfordshire
» Keele University
» Manchester Metropolitan University
» Bedford College
» Walsall College
» Sheffield University
» Podcasts
› Benefits of EMA
› Managing an EMA project
› Assessment and feedback lifecycle
› Processes and systems
› Value of Jisc resources
» Other resources
› Assessment for learning principles
› Assessment timelines tool
› Previous guides, e.g. feedforward; changing practice
» Videos
› Employability and assessment
› e-Portfolios
› Peer assessment
› Reconceptualising feedback
15. You said…
»Using the resources for:
› Source for discussion for basic tutor training
› Academic unit away day sessions
› Institution-wide enhancement projects
› Articulating benefits to senior managers
»Gaps:
› Online exams / tests
“The straightforward framework from MMU has enabled us to start programme-level discussions.”
“The content is pitched at the right level to be useful to any academic/technologist facilitating or supporting assessment.”
16. EMA readiness tool
Arose from demand for internal benchmarking. Measures against:
» Strategy and policy
» Curriculum data
» Processes and working practices
» Technology
» Culture
» Student experience
http://ji.sc/emaready
17. EMA levels of maturity
» Researching: At an early stage of EMA. You do not seem to have a comprehensive view of institutional activity overall: policy, process and systems seem fragmented. Ensure you have senior management support to undertake further investigation. Find the areas of good practice you can build on. Start by defining your educational principles.
» Exploring: You are probably aware of pockets of good practice but have not really begun to try to scale this up. You will need to be clear about expected benefits in order to effect the cultural change needed.
» Embedding: You are at a tipping point where fairly widespread experimentation is close to becoming mainstream practice. A key issue will be ensuring that business processes are sufficiently consistent to support a more holistic approach.
» Enhancing: You are probably already supporting the core of the assessment and feedback lifecycle with technology and looking to fill gaps and find more elegant solutions to existing workarounds.
» Pioneering: You are looking to go beyond automation, standardisation and efficiency gains to ensuring that EMA has a truly transformative impact on learning and teaching in your institution. Your institution is probably a provider of many of the resources in our toolkit, but we hope we can still provide some inspiration and support.
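To make the mechanics of a rating like this concrete, here is a minimal sketch of how a readiness scorer might map self-assessment responses to these five levels per category. The 0-4 response scale, the averaging, the thresholds and the function names are all illustrative assumptions; the deck does not describe the Jisc tool's actual scoring logic.

```python
# Hypothetical sketch only: maps averaged self-assessment responses
# (0-4 per question) to the five EMA maturity levels, per category.
# Scale, thresholds and names are assumptions, not the Jisc tool's logic.
from statistics import mean

LEVELS = ["Researching", "Exploring", "Embedding", "Enhancing", "Pioneering"]

CATEGORIES = [
    "Strategy and policy", "Curriculum data",
    "Processes and working practices", "Technology",
    "Culture", "Student experience",
]

def maturity(scores: list[int]) -> str:
    """Average the 0-4 responses for one category and map to a level."""
    avg = mean(scores)
    # One whole point of average score per maturity level, capped at the top.
    return LEVELS[min(round(avg), len(LEVELS) - 1)]

def profile(responses: dict[str, list[int]]) -> dict[str, str]:
    """Turn one respondent's answers into a per-category maturity profile."""
    return {cat: maturity(responses[cat]) for cat in CATEGORIES}

# Example: strong on technology, early on culture, middling elsewhere.
example = {cat: [2, 2, 3] for cat in CATEGORIES}
example["Technology"] = [3, 3, 4]
example["Culture"] = [0, 1, 1]
print(profile(example))
# e.g. {'Technology': 'Enhancing', 'Culture': 'Exploring', ...}
```

The design point this mirrors from the slide is that the output is a per-category profile rather than a single overall score; as the pilot themes later note, progression isn't even across categories.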
18. Piloting the tool
»Anglia Ruskin University
»Aston University
»Birmingham City University
»Manchester Metropolitan University
»Plymouth University
»University of Bradford
»University of Edinburgh
»University of Hull
»University of Nottingham
»University of Sheffield
»University of Southampton
»University of York
19. Experiences from a pilot institution
Strategic areas within the team:
»Assessment & Feedback
»Research & Professionalism
»Stakeholder Engagement
»Blended & Distance Learning
Emma Purnell
Senior Learning Technologist
Plymouth University
24. Piloting – summary of themes
»Supporting collaborative conversations, not individual responses
»Internal knowledge-building, not external benchmarking
»Providing a cross-institutional view
»Completing from a range of perspectives
»Progression isn’t even…
25. Piloting – summary of requirements
Must haves:
» As a central team I want to share the tool with a range of depts. so they can complete it at departmental level
» As a central team I want to be able to see a collated view of results across all departments to inform conversations, and to inform how to target support (sketched below)
» As an EMA lead I want to be able to see my results as well as my answers so I can better understand the outcomes
Nice to haves:
» As an EMA lead I want to have a free-text box as part of each question to clarify how I’ve responded
» As an EMA lead I want to be able to save my responses before I complete the survey so I can return to them later
» As a central team I want to be able to tailor the questions for each departmental context so their results will be more meaningful
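The must-have about a collated, cross-departmental view suggests a simple aggregation on top of per-department profiles. The sketch below is a hypothetical illustration of that collation step, reusing the profile shape from the earlier sketch; the data shapes and names are assumptions, not the tool's actual design.

```python
# Hypothetical sketch of the "collated view" must-have: count how many
# departments sit at each maturity level per category, so a central team
# can see where to target support. Shapes and names are illustrative.
from collections import Counter

def collate(dept_profiles: dict[str, dict[str, str]]) -> dict[str, Counter]:
    """Collapse {department: {category: level}} into {category: level counts}."""
    collated: dict[str, Counter] = {}
    for levels in dept_profiles.values():
        for category, level in levels.items():
            collated.setdefault(category, Counter())[level] += 1
    return collated

# Example: two departments that diverge on culture.
profiles = {
    "History":     {"Technology": "Embedding", "Culture": "Exploring"},
    "Engineering": {"Technology": "Enhancing", "Culture": "Embedding"},
}
for category, counts in collate(profiles).items():
    print(category, dict(counts))
# Technology {'Embedding': 1, 'Enhancing': 1}
# Culture {'Exploring': 1, 'Embedding': 1}
```

A view like this is one plausible shape for the 'institutional dashboard' mentioned on the next slide.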
26. Next steps
»Inviting further feedback on this first beta version to inform future development
»Developing the ‘institutional dashboard’ views alongside other ‘discovery’ tools
FE and skills:
»Piloting with institutions to assess suitability
»FE ‘benchmarking’ tool and guide
27. Panel discussion
»Questions, reflections and thoughts
»How does the Plymouth experience relate to your contexts?
»If you were starting to consider EMA, how would you anticipate using the tool?
28. Find out more
» EMA readiness tool: http://ji.sc/emaready
» Transforming assessment and feedback with technology guide: http://ji.sc/transforming-assessment-feedback-guide
» EMA processes and systems guide: http://ji.sc/ema-processes-systems-guide
» Supplier responses to system requirements: http://ji.sc/supplier-responses-ema
» New guide for FE and skills with case studies: bit.ly/Jisc-assessment-guide-FEandSkills
» 2012 landscape report available from: bit.ly/jisc-ema
29. Find out more
» Webinar ‘Online exams: migration or transformation?’
» 21 September 2016, 12pm-1pm
» www.jisc.ac.uk/events/
» Join the conversation on the blog: ema.jiscinvolve.org/
» and on Twitter: #jiscassess
» Join the mailing list: jiscmail.ac.uk/tech-enhanced-assessment
31. jisc.ac.uk
Except where otherwise noted, this work is licensed under CC-BY-NC-ND
Find out more…
Contact
Lisa Gray
Senior Co-Design Manager,
Student Experience
lisa.gray@jisc.ac.uk
Editor's Notes
In 2012 a baseline review undertaken by 8 institutions showed:
In terms of strategy and policy, a key issue is that although there may be an overall assessment strategy, responsibility for implementing it is devolved to departments/faculties. This results in considerable variation in assessment and feedback practices, making it difficult to achieve parity of experience for learners.
Another key issue is that strategy documents tend to be quite procedural in focus and don’t reflect current thinking around effective assessment practice and the value that assessment can bring to learning.
When it comes to academic practice the issues are varied and complex but include the emphasis on summative assessment and the persistence of traditional forms such as essays/exams.
Timeliness, along with quality and consistency of feedback, was an issue across the board. Even where clear deadlines are set, feedback doesn't always arrive in time to feed into the next assignment. Curriculum design (the modular approach) can also create barriers to an ongoing developmental approach to feedback at programme level.
There is a perception that learners don’t engage with the feedback they receive. Tutors may feel they have given a lot of feedback and support but that it hasn’t been acted upon. Learners are seen as passive, waiting for feedback to be delivered to them, but the reality is less clear-cut: the value of acting on feedback is not always well communicated, and was notably absent in most induction processes. Learning design often puts the learner in a passive role.
And finally, the assessment and feedback process, particularly the emphasis on high-stakes assessment and the value that is placed on marks and grades, is very different to the formative ways professionals develop during their working life, where much value is gained from feedback from, for example, peers.
In 2014 we researched the state of play with EMA – online submission, marking and feedback
There was a recognition that a number of localised initiatives were starting to scale up, and it was clear that institutions are increasingly moving beyond experimentation and are looking to take a more strategic approach to the application of EMA, although that may not be institution-wide. There are some examples of policies in development in the report.
The challenges were evident across all three areas of technology, process and practice. It won't come as a surprise that integration of systems was a key problem area (even though there is a small common set of tools in use), as was the variation in business processes unearthed when technology solutions were put into play, as well as the challenges with assessment and feedback practice which I've already touched on.
In trying to review assessment and feedback practice overall it is easy to feel initially overwhelmed by the diversity of the landscape. Every institution develops its own policies, procedures and practices; usually the responsibility for this is highly devolved, so there can be very different policies and practices between different departments or schools even within a single university. In trying to make sense of the overall picture, a tool that has proven very useful in providing a common starting point is the assessment and feedback life-cycle shown here.
The assessment and feedback life-cycle was originally developed by Manchester Metropolitan University and it has been picked up and used or adapted by many other institutions.
The life-cycle is an academic model. It shows a high-level view of the academic processes involved in assessment and feedback and offers a ready means of mapping business processes and potential supporting technologies against this. The model can be applied to both formative and summative assessment and to any scale of learning, e.g. from whole courses/programmes of learning to short pieces of learning such as a short course that takes place over a single day. The model covers all assessment and feedback practice whether or not materials are in digital format and supported by information systems.
Use of this model has been central to our work in terms of serving as a framework to gain an overall picture of institution wide activity. It offers a means of encouraging dialogue between different types of stakeholders who may work on one aspect of assessment and thus have a view of only part of the life-cycle. So far the model has resonated with everyone we have spoken to in the course of the research and it will be interesting to see how far it extends elsewhere in Europe.
Through consultation we prioritised the challenges identified through the EMA landscape study - here you can see a top 20 listed around the assessment and feedback lifecycle.
At the start of the research there was a general feeling that stages 2-4 are better understood and less problematic than some of the other components of the life-cycle, not least because many institutions are managing all of the related information within a single VLE system. Stages 5-8 were felt to be where we begin to open Pandora's box… This was borne out by the findings of the research.
You can see the clustering around the later stages of the lifecycle, particularly around marking and production of feedback and recording grades. There are also a lot of issues that together group around the theme of student assessment literacies.
Marking and production of feedback appears to be the most problematic component of the life-cycle, as it is the area where the variety of pedagogic practice results in a situation where the fit between institutional processes and the functionality of commercially available systems is least well matched. We heard a very clear message from universities that existing systems do not adequately meet UK requirements in these areas. A basic issue is that marks and feedback are different things and need to be handled differently, whereas technology platforms tend to conflate the two. It was also observed that systems seem too often to be predicated on an assumption that 1 student = 1 assignment = 1 mark. This model is usually adequate for formative assessment but does not meet UK requirements for summative assessment processes. Systems would ideally offer a range of different workflows based on different roles, e.g. first marker, second marker, moderator, external examiner etc.
These are the top issues as mapped against the A&F lifecycle:
1. Ability of the technology to handle a variety of typical UK marking and moderation workflows
2. Reliability of submission systems
3. Lack of interoperability between marking systems and student record systems
4. Need to develop more effective student assessment literacies
5. Student engagement with feedback
6. Risk aversion
7. Ability to manage marks and feedback separately
8. Academic resistance to online marking
9. Need for greater creativity
10. Ability to gain longitudinal overview of student achievement
So, by September 2014 we had reviewed the state of play and, through a number of opportunities, had prioritised the challenges.
The next phase was, as part of a ‘co-design’ process, to collaboratively work up solution ideas to these challenges. To this end, we ran 2 participatory workshops in December and January last year.
Around 30 representatives from 30 different institutions
During our workshops over the winter participants from the sector came up with 30 solution ideas.
These fell into 5 clear groups – helpfully addressing the top few priorities.
3 of these areas were taken forward, addressing:
1. The variation in business processes, making it hard to be consistently clear about how assessments were being handled within different departments, and therefore harder to be clear on requirements to systems providers
2. The main systems in use not meeting UK needs, particularly around, for example, anonymous marking and double-blind marking
3. Broader change management issues around moving towards more ‘assessment for learning’ pedagogies, rethinking assessment design, and raising staff and student assessment literacies.
I’ll now introduce each of these 3 resources in more detail.
Business processes
Institutions reported much variation in business processes, which reflects the findings from the assessment and feedback programme: it highlighted the extent of the diversity in how a&f strategy is translated into more localised policy. This reflects two sides of the same coin: it is harder to integrate systems (and design effective workflows) when business processes are very varied.
The 12% that said their processes were highly standardised tended on the whole to be quite small/specialist organisations, or had already undertaken a lot of business process review.
The devil is in the detail: the free-text comments give a lot more insight on this, including the challenges being faced, such as time for staff to step back and look at why things are happening, and at times misunderstandings of policies.
So we focused on clarifying EMA processes, both to help universities review their processes and have something to compare theirs against, and to help provide some clarity to suppliers around UK requirements.
3 ‘to be’ process maps, reducing processes to their most efficient form: a top-level map for submission, marking and feedback, plus detailed maps for submission and for marking and feedback.
Each outlines the flow of tasks and whose responsibility this could/should be, and highlights system requirements at each stage.
Prompting the following questions.
For each of the process maps the system requirements relating to each stage are highlighted.
These requirements can either be looked at in relation to the processes, or as a full list of requirements to support the full EMA process.
This list was shared with suppliers earlier this year, and they have shared how they meet those requirements.
Please note the following caveats:
The data is based on self-reporting by the suppliers. Neither Jisc nor UCISA has tested the products, and inclusion in this listing should not be taken as an endorsement of particular products.
The individual responses are recorded under supplier name, not product name. If you are having trouble finding a product, check you have the correct supplier. For example: Banner = Ellucian; Canvas = Instructure; Moodle Coursework = University of London; URKUND = Prioinfocenter.
The list of requirements relates solely to EMA and may not represent the full functionality of the systems included here, e.g. student record systems cover many functions other than assessment.
This listing includes products intended to cover most of the EMA lifecycle as well as some more niche products. It is intended as a means of identifying which combination of products could meet your needs. It is not a like-for-like comparison of similar systems.
The third resource is aimed more at teaching and learning staff, course and curriculum teams and assessment leads, and provides guidance for staff on technology-enhanced assessment and feedback.
So, for example, if you were interested in how you could get better interaction with students around the feedback they receive, you can explore this topic via the ‘themes’ highlighted. You would see:
Why it’s important
Common problems
Tech
How it relates to the lifecycle
Examples, case studies and further resources
You can also explore by stage of the assessment and feedback lifecycle. So, for example, if you were interested in looking at how technology can best help with your marking and feedback, you could also go via the ‘marking and production of feedback’ stage of the lifecycle and get a full breakdown of the ways technology can be used to support this stage, with examples of how others have approached it.
Under each stage of the lifecycle:
What does it involve
What are we trying to achieve
How we might use tech
Benefits
Challenges
Resources
Related themes
The idea is to complete it at course, department or institutional level, to get a sense of where you currently are and some suggestions for what areas to focus on to move forward.
It doesn’t aim to tell you where you’re meant to get to, just to provide a starting point for thinking about where support can best be offered, and to inform conversations and discussion.
Intro
Practice and Technology
Electronic Submission of Coursework
The Moodle Assignment tool is our primary route.
The exception is modules that use PebblePad for portfolio assessment; they use ATLAS, the assessment space within PebblePad.
Turnitin
It is used only for generating originality reports, via the plug-in within the Moodle Assignment tool.
We don’t use TII Grademark for marking or feedback.
We recommend TII is used developmentally, and the default settings we have made on the API support this approach (though not recommended, they can be changed if needed).
Summative Computer Aided Assessment – QMP
Policy
The anonymous marking policy has been a key driver; the policy recommends electronic submission as the most efficient way to run this process, which has significantly increased e-submissions.
The marking and moderation policy incorporates both paper and online processes.
Academic regulations are currently being updated to align and to include more about e-assessment.
Assessment change management and decision making
APSG consists of all faculty registrars, the head of student experience, IT Services, learning technologists and data support.
There is a drive to provide a consistent student experience for e-submission.
Previously there were multiple tools and processes for e-submission, which led to inconsistency in the student experience; we have streamlined the routes and processes available.
Audit completed within the team
We are pleased that we are at ‘embedding’ for the majority of categories in our initial results. We spent a lot of last academic year in the exploratory phase, and during the summer we implemented a number of changes to both technology and process to enhance our overall EMA practice and to ensure alignment to policy. We still have some work to do, and alignment to policy is a key area for us.
I think the nature of the multiple-choice options in the self-assessment tool meant some of our practice didn’t quite fit into the choices given, and we are still borderline exploratory in some areas. Our pioneering level is also possibly down to this, and to me not having all the answers for that category.
With the last point in mind, it would be beneficial to do the self-assessment again with some additional stakeholders who would be better placed to give some answers.
It would help to have a ‘comment’ field, as a reminder that a question may not have been answered fully, and to be able to go back to the original answers given and make notes.
Each category result recommended tailored resources, which we have included in our help and guidance.
Initially the most beneficial use in our context for the toolkit would be a staff development activity within our ASTI team to look holistically at the assessment life cycle across the institution and see where we might need to improve (and raise awareness within the team).
There is also scope for the self-assessment tool to be used in Faculties for benchmarking and awareness raising.
Sharing resources within our help and guidance