The Learning Analytics Research Network (LEARN) invites you to join us for a talk about the exciting ways in which the University of Technology Sydney is using participatory design to augment existing classroom practices with learning analytics. Simon Knight, a LEARN Visiting Scholar from the University of Technology Sydney, will introduce a variety of projects, including their work developing analytics to support student writing.
Come meet others at NYU interested in learning analytics while learning from the examples of leading work in Australia. A light lunch will be served and the talk will be followed by a short Q&A. RSVP is required.
About Simon Knight
Simon Knight is a lecturer at the University of Technology Sydney in the Faculty of Transdisciplinary Innovation. His research investigates how people find and evaluate evidence, particularly in the context of learning and educator practices. Dr Knight received his Bachelor’s degree in Philosophy and Psychology from the University of Leeds before completing a teacher education program and a Philosophy of Education MA at the UCL Institute of Education. After teaching high school social sciences, Dr Knight completed an MPhil in Educational Research Methods at Cambridge and a PhD in Learning Analytics at the UK Open University.
About Simon’s Talk
How do we make use of data about our students to support their learning, and where does learning analytics fit in? Educators are increasingly asked to work with data and technologies such as learning analytics to support and provide evidence of student learning. However, what learning analytics developers should design for, and how educators will implement analytics, remains unclear. Learning analytics risks the same low uptake and patchy implementation as many other educational technologies if it does not align with educator practice and needs. How, then, do we close this gap, to develop technologies that are implemented in practice, with impact on learning?
At the University of Technology Sydney, we have taken a participatory-design-based approach to designing and implementing learning analytics in practice, and to understanding their impact. In this work we have identified existing practices that learning analytics can align with and augment. This talk introduces some of these projects, drawing particularly on our work developing analytics to support student writing (writing analytics), with examples of how analytics were aligned with existing pedagogic practices to support learning. Through this augmentation, supported by design-based approaches, we argue we can develop research and practice in tandem.
Aligning Learning Analytics with Classroom Practices & Needs
1. Aligning Learning Analytics
with Classroom Practices
and Needs
Dr Simon Knight
@sjgknight
www.sjgknight.com
Senior Lecturer
Faculty of Transdisciplinary Innovation
University of Technology Sydney
2. Acknowledgements
http://sjgknight.com
@sjgknight
Particular thanks to Shibani Antonette, who conducted much of the work here as part of her doctorate (papers in submission), and Simon Buckingham Shum, director of CIC and collaborator on the work.
• Academic collaborators, including:
  Law - Philippa Ryan
  Accounting – Nicole Sutton, Raechel Wight
• Colleagues in CIC, particularly Simon Buckingham Shum, Shibani Antonette, Sophie Abel
• Funding via UTS Teaching and Learning grants, and an ATN learning analytics grant
• Student participants
• Demo: http://acawriter-demo.utscic.edu.au/
• If you have a sample text, paste it in the editor and click on ‘Get Feedback & Save’.
3. Learning, Evidence, and Data
What do we want students to learn?
How do we gather and use evidence of learning, to support learning?
What’s the role of data in that?
Open Source Tool & Resources: http://goo.gl/VNNU24
4. Evidence, data, and learning
Learning analytics uses data (and data science methods) in learning contexts to understand and support that learning.
7. AI or IA
Per Baker’s “Stupid tutoring systems, intelligent humans” (2016):
• 25 years on, AIED adoption has not been widespread
• Rather than new AI technologies, focus on how to better use such intelligent systems alongside educators and students in flexible ways
• Parallel calls for learning design <-> learning analytics integration (e.g. Wise, 2014)
8. Why won’t they use it?
Integration of innovations must consider their distance from existing culture, practice, and technologies (Ferguson et al., 2014; Zhao, Pugh, Sheldon, & Byers, 2002)
9. Technology Integration…
• Learning analytics for new assessment:
E.g. ITS, choice-based assessment, etc.
• Learning analytics to automate assessment:
E.g., automated essay scoring
• Learning analytics to augment assessment:
use of learning data to enhance existing good practice
10. Balancing IA and AI
Augmentation:
• Supports and enhances
• Raises awareness & implementation
AI:
• Transformative potential
• Challenges of gathering new data types & developing new tech & contexts
11. I’ll persuade you of 2 things:
1. Augmentation
• IA (intelligence amplification/augmentation) over AI
2. Design approach (for research and practice impact)
12. I’ll persuade you of 2 things:
1. Augmentation
• IA (intelligence amplification/augmentation) over AI
2. Design approach (for research and practice impact)
3. (and to explore the full resources: http://goo.gl/VNNU24)
To do that, we’ll use a particular writing analytics context.
14. Writing skills are important for success in our school, workplace, and personal lives (Geiser & Studley, 2001; Light, 2001; Powell, 2009; Sharp, 2007)
Image: ccarlstead, CC-BY, https://www.flickr.com/photos/cristic/359572656/
15. Why Writing?
• Teaching writing is hard (Ganobcsik-Williams, 2006)
• Students often judge their work by more superficial criteria than the analytical standards that educators apply (Andrews, 2009; Lea & Street, 1998; Lillis & Turner, 2001; Norton, 1990)
17. Automated tools – issues
• Impact on student writing not studied extensively
• Gap between potential and actual use of technologies
20. A hallmark of academic writing is that it works with ideas.
Such writing typically displays specific “rhetorical moves”: a clear signal to the reader of what the sentence’s purpose is in the persuasive narrative, e.g.
UTS CIC 21
Contrast
“However, a recognized challenge is…”
“Despite repeated efforts…”
“Although it was predicted that…”
21. Signalling to readers that we’re “working with ideas”
Archetypal rhetorical moves made in academic writing
Move: Examples
Background: “While data was previously studied in educational research, analytics now enables more…” / “Recent studies indicate that the effects of the drug could be permanent.”
Summary: “This paper will examine the question of how we develop scalable learning analytics applications.”
Contrast: “However, a recognized challenge in the field of learning analytics is the uncertainty around LA’s pedagogical relevance.”
Question: “Little research exists on how automated feedback impacts student writing.”
22. Signalling to readers that we’re “working with ideas” (continued)
Move: Examples
Emphasis: “The key elements for this approach are...” / “It is important to note that the policy applies to all universities.”
Novelty: “This new model suggests a view of learning that is an embodied and relational process.”
Surprise: “Surprisingly, the results indicate a weak link between customer satisfaction and brand value.”
Trend: “With the growing quantity of data generated, there is increasing interest in analytics.”
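The marker phrases in the move tables lend themselves to a toy rule-based tagger. The sketch below is a minimal illustration with invented marker lists; it is not AcaWriter’s actual implementation, which relies on much richer sentence-level NLP.

```python
import re

# Illustrative marker lists, loosely inspired by the move examples above;
# both the lists and the function are assumptions for demonstration only.
MOVE_MARKERS = {
    "contrast": [r"\bhowever\b", r"\bdespite\b", r"\balthough\b"],
    "emphasis": [r"\bimportant to note\b", r"\bkey elements?\b"],
    "novelty": [r"\bnew (model|approach|method)\b"],
    "surprise": [r"\bsurprisingly\b"],
    "trend": [r"\bgrowing\b", r"\bincreasing(ly)? interest\b"],
}

def tag_moves(sentence):
    """Return the rhetorical moves whose markers appear in the sentence."""
    lowered = sentence.lower()
    return [move for move, patterns in MOVE_MARKERS.items()
            if any(re.search(p, lowered) for p in patterns)]

print(tag_moves("However, a recognized challenge is scalability."))  # -> ['contrast']
```

Real writing analytics must also handle negation, sentence position, and discipline-specific phrasing, which is why simple keyword lists are only a starting point.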
27. Law Essay Context
• Writing is a key disciplinary skill for law students
• Criteria require the use of rhetorical moves
28. Thumbs up from students
Students already self-assess as part of their unit. A self-selecting sample also tried out the tool (after submission) and gave feedback.
Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á., & Wang, X. (2017). Academic Writing Analytics for
Civil Law: Participatory Design Through Academic and Student Engagement. International Journal of Artificial
Intelligence in Education. https://doi.org/10.1007/s40593-016-0120-1
29. STUDENT FEEDBACK (n = 12 of 40 using tool)
• Useful to reflect
• Highlighting (and lack of it) targets attention
• Sentence-level helps see structure and style
• Immediate & not embarrassing
• Accuracy concerns
• False sense of security?
“When I compared […] essays, I didn’t see much difference in the stats analysed by the software – all my work seemed to have quite low detection rates of ‘importance’, yet on some I got 60%, while others 95%.”
“[Like human feedback] it is something to reflect on and consider in order to make decisions whether implementing the suggestions/feedback will improve your piece of writing, or your writing generally.”
One student said the feedback was instructive “because of the way the information is presented by breaking down the sentences and clearly marking those that are salient as being contrast or position etc”.
“I realise now what descriptive writing is - the software had quite a bit to say about my lack of justification - also true - pressed for time and difficult circumstances have caused this for me in this instance - good to see it sampled.”
Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á., & Wang, X. (2017). Academic Writing Analytics for Civil Law:
Participatory Design Through Academic and Student Engagement. International Journal of Artificial Intelligence in
Education. https://doi.org/10.1007/s40593-016-0121-0
31. Augmenting existing good practice
• Augmented intelligence (not artificial intelligence)
• Integrates with existing practice by representing and building on that practice
• Supports flexible use of analytics, through patterns describing this use in contexts
32. Design process to…
• Understand existing patterns of writing support in a particular institutional context, and develop abstractions of these
• Augment these patterns with additional learning analytics that complement the original designs
• Evaluate the implementation of these patterns, to understand relations among them and the development of a larger pattern-set that can be augmented with learning analytics
33. For example: Task Design – 2016
DESIGN 1: Benchmarking and Automated Writing Analytics
Problem: We wanted students to engage with exemplars and their assessment, in order that they have an
activity that (1) prompts them to critically apply the assessment criteria, (2) prompts them to engage
actively with exemplars, (3) provides us as researchers with information regarding their ability to
appropriately assess texts.
Task: The initial base task (task 2) provided students with three exemplars of varying quality and asked them to assess those exemplars using the assessment criteria. Applying the criteria involves a mediating process of evaluative judgement, which in turn should produce the outcome of improved self-assessment ability.
Tools/materials and participant structures: This task was designed for individual completion, making
use of the instructor’s rubric, and both high and low quality exemplars.
Iterations and Augmentation: The task design was modeled on an existing common practice at the institution. To augment this with writing feedback, in the initial iteration of the task, students were provided with texts that had been marked up using writing feedback (from either a tool for feedback on rhetorical structures in writing, one focusing on spelling and grammar, or the instructor), with the intent of foregrounding salient features of the texts through the provision of NLP-derived feedback in the form of highlights.
2016 Key Tasks:
1. Benchmark
2. Self-assess
34. Framework @UTS for educators to co-design: analytics/IA augment teaching practice
Shibani, A., Knight, S. and Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model, and Writing Analytics Evaluations. Proc. 9th International
Conference on Learning Analytics & Knowledge (LAK19). ACM Press, NY, pp. 210-219. DOI: https://doi.org/10.1145/3303772.3303785. Eprint: https://tinyurl.com/lak19clad
[Diagram: co-design cycle linking Student, Task Design, Feedback & User Interface, Features in the Data, Educators, Analytics/AI designers, and Assessment]
39. Intervention Design
• Consisted of several tasks co-designed with the instructor
• Student data collected for evaluation and analysis
[Task flow: introductory reading on rhetorical writing (offline) → matching rhetorical structures to instructor’s rubric → assessment of a given (low-quality) text → revision of the given text and self-assessment → viewing an exemplar revised essay → feedback survey. Augmentation is applied within this flow.]
40. Educator explains to her students why good lawyers know how to use rhetorical moves
“[rhetorical moves] indicate to the reader the writer’s attitude to the text.
Why do we worry about that? Because as lawyers, our job is to […] argue that
the way that we see the facts and the law favours a certain position or outcome.”
https://youtu.be/ruN_Vy3knB8
45. Task Design – Iteration 2
DESIGN 2: Benchmarking, Text-Revision, and Automated Writing Analytics
Problem: We wanted students to critically consider how specific features in the text instantiate responses to the assessment criteria, and to
develop the student’s interaction with the application of the criteria for building their understanding of how to – practically – improve a text.
Task: The initial task (task 2) was amended, and an additional task was added (task 1). Task 1 asked students to match excerpts from a text to the criteria that they addressed (for example, a sentence providing background information aligns with the criterion “Identification of relevant issues”, while a sentence providing evaluation or analysis of a claim or piece of evidence aligns with the criterion “Critical analysis, evaluation, original insight”). The revised task 2 involved students assessing a single exemplar text using the
assessment criteria, and being specifically asked how they would suggest improving the text. In task 3, then, the students were asked to edit the
text they were provided with (in an editable window, see Figure 2), and (task 4) to evaluate the improvements that they had made (i.e., to
provide a new assessment of the quality of the text). Following task 4 the students were provided with their own text revisions, and those of an
instructor on the same text, providing a ‘good’ exemplar to demonstrate the improvements made. While the original task (above) was intended
to produce a mediating process of evaluative judgement, the revision task was – in addition – designed to produce a mediating process of
revision strategy application, to produce the outcome of increased capacity and motivation to revise, and improved self-assessment ability. The
first task was specifically designed to develop evaluative judgement through understanding of the assessment criteria, and thus to improve self-assessment through understanding of rhetorical structures.
Tools/materials and participant structures: As in design 1, this task was designed for individual completion, making use of the instructor’s rubric, and in task 2 a lower-quality exemplar, with task 4 providing the higher-quality comparator. The instructor’s rubric and the lower-quality exemplar drive the first and second-to-fourth tasks from the task structures list respectively.
Iterations and Augmentation: This task design developed from that described in design 1. As in that case, a between-subjects design was
used to provide some students with instructor-based (static) feedback, others with dynamic feedback from AWA, and others with no feedback.
Prior work has been conducted to establish conceptual relations between the instructor’s criteria, rhetorical structures, and their specific
instantiation in AWA (Knight, Buckingham Shum, Ryan, Sándor, & Wang, 2017). These relationships were foregrounded to the AWA group
through static highlights flagging the AWA moves on the sentences to be aligned with the criteria. Then, the revision task was also augmented
by AWA, with feedback provided on-request (via a button) to students as they revised the draft they were provided with.
Key Tasks:
1. Benchmark (lite)
2. Revise a text
3. Self-assess revisions
4. Self-assess
46. Implementation
• Implemented over two semesters in a tutorial session
• Students worked under different feedback conditions (AWA, instructor, none); in semester 2, individually or in pairs
• Data from ~320 students for analysis
47. Scored revisions (r 1)
• No significant differences in scores
Task Perceptions
• No difference in ‘usefulness’ rating between groups who got: (1) automated annotations, (2) a pre-annotated instructor initial draft, (3) no feedback
48. Lessons
“I thought it was a good exercise - especially to understand the perspective a bit better from a markers point of view when marking our essays. I think the chance we had to manipulate the essay to improve it and the mark makes us think about how and what we would change to make our points clearer.”
“Possibly needs more direction post the review as the outcome highlighted specific areas, however the way to respond to the areas is not that clear.”
“I found this writing exercise very helpful. While I was naturally using discourse markers in my work, I was unaware of the mechanics. Now that I am aware of rhetorical moves, I am finding it easier to both plan and execute essays.”
“The highlighting only alerted to me what was good. However, there should be highlight to alert me to problems in the essay as well. The highlighting only showed me what was a ‘summary’ etc. There should be more categories and types of feedback such as grammar issues, sentence structure.”
• Tasks – independent of tool, acceptable
• Tool – areas for improvement, but generally appreciated
50. Writing Activity; New tool + peer discussion
More info: http://heta.io/resources/wawa-improve-sample-text-plus-peer-discussion-civil-law/
51. Educator explains to her students why good lawyers know how to use rhetorical moves
“[rhetorical moves] indicate to the reader the writer’s attitude to the text.
Why do we worry about that? Because as lawyers, our job is to […] argue that
the way that we see the facts and the law favours a certain position or outcome.”
https://youtu.be/ruN_Vy3knB8
60. • Compared ratings of exercise usefulness, with/without AcaWriter (law)
• Compared n of rhetorical moves in draft revision with/without AcaWriter (law)
• Compared scored draft revisions with/without AcaWriter (law)
What does success look like?
Shibani, A., Knight, S. and Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model, and Writing Analytics Evaluations. Proc. 9th International Conference on
Learning Analytics & Knowledge (LAK19). ACM Press, NY, pp. 210-219. DOI: https://doi.org/10.1145/3303772.3303785. Open Access Eprint: https://tinyurl.com/lak19clad
Shibani, A. (2019, In Prep). Augmenting Pedagogic Writing Practice with Contextualizable Learning Analytics. Doctoral Dissertation, Connected Intelligence Centre, University of Technology Sydney
61. • The writing exercise was meaningful without AcaWriter, but with
AcaWriter it was rated significantly more useful
• Students who used AcaWriter made significantly more academic
rhetorical moves in their revised essays
• A significantly higher proportion of AcaWriter users improved their
drafts (some students degraded them across drafts)
What does success look like?
Shibani, A., Knight, S. and Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model, and Writing Analytics Evaluations. Proc. 9th International Conference on
Learning Analytics & Knowledge (LAK19). ACM Press, NY, pp. 210-219. DOI: https://doi.org/10.1145/3303772.3303785. Open Access Eprint: https://tinyurl.com/lak19clad
Shibani, A. (2019, In Prep). Augmenting Pedagogic Writing Practice with Contextualizable Learning Analytics. Doctoral Dissertation, Connected Intelligence Centre, University of Technology Sydney
62. How useful did you find the task to improve your essay/report writing?
Statistically significant difference between the no-feedback and AcaWriter-feedback groups (p < 0.005)
Cohen’s d estimate: 0.82 (large)
Law: n = 90; Accounting: n = 302
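For readers unfamiliar with the statistic, Cohen’s d is the difference in group means divided by a pooled standard deviation. The ratings below are invented for illustration; they are not the study’s data.

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Made-up 1-5 usefulness ratings for two hypothetical feedback conditions
awa_group = [4, 5, 4, 3, 5, 4, 4, 5]
no_feedback = [3, 3, 4, 2, 3, 4, 3, 2]
print(round(cohens_d(awa_group, no_feedback), 2))  # -> 1.71
```

By Cohen’s conventional benchmarks (0.2 small, 0.5 medium, 0.8 large), the reported 0.82 counts as a large effect.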
63. Evaluating impact – Student responses
“It's like having a tutor or another person check and give constructive feedback on your work. Can be helpful for struggling students.”
“I believe this exercise may be of better use to some than others, and that it offers good information that could be of use, for me personally, the program would need to be able to help me to better understand what I'm doing incorrectly than correctly, and as such, I believe that a human reading through it is still more effective in that regard.”
Shibani, A., Knight, S. and Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model, and Writing Analytics Evaluations. Proc. 9th International Conference on
Learning Analytics & Knowledge (LAK19). ACM Press, NY, pp. 210-219. DOI: https://doi.org/10.1145/3303772.3303785. Open Access Eprint: https://tinyurl.com/lak19clad
64. Evaluating impact – Student responses
“I think what is being taught is something I was already aware of. However, by being forced to actually identify ways of arguing, along with the types of words used to do so, it has broadened my perspective. I think I will be more aware of the way I am writing now.”
“A good reminder of important elements of essay writing. However, I am not sure how useful AcaWriter actually is other than providing some general feedback.”
Shibani, A., Knight, S. and Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model, and Writing Analytics Evaluations. Proc. 9th International Conference on
Learning Analytics & Knowledge (LAK19). ACM Press, NY, pp. 210-219. DOI: https://doi.org/10.1145/3303772.3303785. Open Access Eprint: https://tinyurl.com/lak19clad
70. Task Design – Iteration 3
Shibani, A. (2017). Combining automated and peer feedback for effective learning design in writing practices. In Yu, F.Y. et
al. (Eds.). Proceedings of the 25th International Conference on Computers in Education, New Zealand.
DESIGN 3: Benchmarking, Text-Revision, Peer-Discussion, and Automated Writing Analytics
Problem: Building on the previous designs, we additionally wanted students to engage with each other around the
application of assessment criteria, to further develop their evaluative judgement, and ability to explain and justify
their judgements of texts and their revisions.
Task: The initial base tasks in design 2 were adapted such that one group of students was asked to work as dyads, submitting a single revised text, while the other group worked individually.
Tools/materials and participant structures: In this design, the participant structure varied by group, with some working in pairs and others individually. When students work in dyads, they engage in discussion consisting of reflection on, and critique of, the structure of essays and the application of automated feedback. The materials and tool for this design are the same as those in design 2.
Iterations and Augmentation: This task design developed from that described in design 2. A key consideration in this design was that peer discussion may mediate the understanding and use of the augmented feedback provided by AWA; that is, this task may develop students’ abilities to critically use such feedback, while observation of this dialogue yields research and implementation data. A further alternative design iteration (to be implemented in 2018) consists of asking students to work individually first (with, or without, augmentation), and then to work in dyads (or not) to create a hybrid revised text to submit.
Key Tasks:
1. Benchmark (lite)
2. Revise a text
3. Self-assess revisions
4. Peer assessment discussions
5. Self-assess
76. Shibani, Knight, Buckingham Shum (in submission)
“I think you could put together a couple of options in terms of the packages, what it would mean to adopt AcaWriter….. Because I think probably the biggest hurdle for adoption is in terms of getting it in place people not having a sense of what it is they’re committing to.”
“I think it’s more important to say to as many academics as possible, we’ve got this tool. This is how law used it […] but there are many other problems it could solve. Do you want to go away and think about whether you could use a writing analysis tool? […] I would also try and find out how the particular industry that they support, that that faculty delivers graduates into is already using writing analysis software to give it some practice or authentic meaning.”
“Obviously, it’s not perfect. I actually think the fact that it’s not perfect, which, let’s face it, spell check isn’t perfect, Grammarly isn’t perfect. All they ask you to do is think about it […] And I know what Grammarly’s doing, and I know why I would override what Grammarly suggests. Now if that’s what the students are doing, well, more power to them, but at least they understand what their text is doing and how it’s behaving.”
82. Dr Cherie Lucas
Lecturer
UTS School of Pharmacy
Educator: AcaWriter supports professional reflection
by Pharmacy students following work placements
https://cic.uts.edu.au/immediate-personalised-feedback-on-reflective-writing
83. Writing Context – Postgrad. Pharmacist reflection
[Diagram: assessment rubric, key to the automated annotations on writing, features in the data, feedback & user interface]
86. Designing writing activities using AcaWriter
94
1. Implementation and integration to scale
2. Learning tasks are central
3. We don’t need perfect LA to achieve impact
4. We can tune rule-based analytics for particular tasks
5. We can share augmented tasks to build technical
and social infrastructure
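Point 4 above (tuning rule-based analytics for particular tasks) can be pictured as selecting a marker vocabulary per task. The task names and marker lists below are invented for illustration and are not AcaWriter’s actual rule sets.

```python
# Hypothetical per-task rule sets: "tuning" here means choosing the
# marker vocabulary appropriate to the writing task at hand.
TASK_RULES = {
    "law_essay": {
        "contrast": ["however", "although"],
        "position": ["we argue", "it is submitted"],
    },
    "pharmacy_reflection": {
        "feelings": ["i felt", "i was concerned"],
        "learning": ["i learned", "in future i will"],
    },
}

def feedback(task, sentence):
    """Return the moves that the task-specific rule set detects."""
    lowered = sentence.lower()
    return [move for move, markers in TASK_RULES[task].items()
            if any(m in lowered for m in markers)]

print(feedback("pharmacy_reflection",
               "I felt unprepared; in future I will check doses."))
# -> ['feelings', 'learning']
```

Sharing such task-specific rule sets alongside the task designs they augment is one way to build the technical and social infrastructure the slide describes.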
87. Thank you
http://sjgknight.com
@sjgknight
Acknowledgements:
• Academic collaborators, including:
  Law - Philippa Ryan
  Accounting – Nicole Sutton, Raechel Wight
• Colleagues in CIC, particularly Simon Buckingham Shum, Shibani Antonette, Sophie Abel
• Funding via UTS Teaching and Learning grants, and an ATN learning analytics grant
• Student participants
• Demo: http://acawriter-demo.utscic.edu.au/
• If you have a sample text, paste it in the editor and click on ‘Get Feedback & Save’.
88. ACAWRITER DEMO
• Go to http://acawriter-demo.utscic.edu.au/
• If you have a sample text to try on, paste it in the editor and click on ‘Get Feedback & Save’.
Other sample texts to try:
https://tinyurl.com/yarcup6t