1. Learning Analytics for Online Discussions
A Pedagogical Model for Intervention
with Embedded and Extracted Analytics
Alyssa Friend Wise
Simon Fraser University
LAK13
Learning Analytics & Knowledge
Leuven, Belgium
April 11, 2013
2. Situating the Work
Learning context: Online discussions
Data type: Process data based on clickstream
Timeline:
In-process learning events, short
cycles of feedback
Interpretation/action:
Instructors and learners
making local pedagogical decisions
3. Three Challenges in Using Learning
Analytics to Support Decision Making
about Learning Events in Progress
1. Capturing meaningful traces of learners’ activity → Online Speaking & Listening
2. Presenting data to learners in a useful form → Embedded & Extracted Analytics
3. Supporting interpretation and use of the analytics in decision making → Pedagogical Model of Intervention
5. 1. Capturing Meaningful Traces
Need for a Learning Model
Capturing traces of activity in an online
learning environment that are meaningful
necessitates having a specific model of
learning for the particular environment
Our work investigates and supports learning
through online discussions
The learning model draws on our existing
research program focusing on how students
contribute and attend to others’ messages
6. 1. Capturing Meaningful Traces
Learning through Online
Discussions
Social constructivist perspective - learners build
understanding through dialoging with others
Varying importance of:
Sharing and supporting one’s own ideas
Being exposed to multiple viewpoints
Experiencing personal cognitive conflict
Negotiation of group understandings
Two basic (and related) underlying processes
“Speaking” - externalizing one’s ideas by contributing
posts to the discussion
“Listening”- taking in the externalizations of others by
accessing existing posts
7. 1. Capturing Meaningful Traces
“Speaking” & “Listening” Online
In online discussions learners have a high
degree of control over the timeline and pace
of their engagement
Opportunities for thoughtful listening and
reflective speaking
Challenges of time management, especially for
prolific discussions
Helping learners to actively monitor and
regulate how they speak and listen in online
discussions is an important tool for supporting
productive engagement in discussions.
8. 1. Capturing Meaningful Traces
Speaking
Mechanism for sharing ideas with others
Value in speaking that is recurring, responsive and rationaled; temporally distributed; moderately portioned
While “speaking” is visible, not all qualities are salient in the system (esp. as related to time)
Post quality info valuable, but complex to assess
Listening
Attending to the ideas of others is a critical, though “invisible,” part of learning through online discussions
Value in listening that is broad (to consider multiple ideas), integrated (so replies are informed by reads), and reflective (to provide context for discussion flow)
Early research suggested universally poor behaviors, but our recent work shows students interact with prior messages in very distinct ways (e.g. Coverage, Interactive, Self-Focused, Targeted)
9. 1. Capturing Meaningful Traces
Metric | Definition | Criteria
Range | Span of days a student logged in to the discussion | Temporal Distribution
Number of sessions | Number of times a student logged in to the discussion | Temporal Distribution
Average session length | Total time a student spent in the discussions divided by his/her number of sessions | Temporal Distribution
Percent of sessions with posts | Number of sessions in which a student made a post, divided by his/her total number of sessions | Temporal Distribution
Posts | Total number of posts a student contributed to the discussion | Speaking Quantity
Average post length | Total number of words posted by a student divided by the number of posts he/she made to the discussion | Speaking Quantity
Percent of posts read | Number of unique posts that a student read divided by the total number of posts made by others to the discussion | Listening Breadth
Number of reviews of own posts | Number of times a student revisited posts that he/she had made previously in the discussion | Listening Reflectivity
Number of reviews of others’ posts | Number of times a student revisited others’ posts that he/she had viewed previously in the discussion | Listening Reflectivity
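The metrics above can be sketched in code. This is a minimal illustration, not the study's actual pipeline; the record fields (`user`, `action`, `post_id`, `author`, `t`) and the sample data are assumptions made for the example.

```python
from datetime import datetime

# Hypothetical action records for one student, "s1" (field names are
# illustrative, not the study's real schema)
actions = [
    {"user": "s1", "action": "create-post", "post_id": 1, "author": "s1",
     "t": datetime(2013, 4, 1, 10, 0)},
    {"user": "s1", "action": "view-post", "post_id": 2, "author": "s2",
     "t": datetime(2013, 4, 1, 10, 5)},
    {"user": "s1", "action": "view-post", "post_id": 2, "author": "s2",
     "t": datetime(2013, 4, 3, 9, 0)},   # revisit of another's post
    {"user": "s1", "action": "view-post", "post_id": 1, "author": "s1",
     "t": datetime(2013, 4, 3, 9, 10)},  # view of own post
]
posts_by_others = {2, 3, 4}  # ids of posts contributed by other students

views = [a for a in actions if a["action"] == "view-post"]

# Range: span of days the student logged in to the discussion
days = sorted({a["t"].date() for a in actions})
range_days = (days[-1] - days[0]).days + 1

# Speaking quantity: total posts contributed
posts = sum(1 for a in actions if a["action"] == "create-post")

# Listening breadth: unique others' posts viewed / total posts by others
unique_views = {a["post_id"] for a in views if a["author"] != "s1"}
pct_posts_read = len(unique_views) / len(posts_by_others)

# Listening reflectivity: views of one's own posts count as self-reviews;
# repeat views of others' posts count as reviews of others' posts
reviews_of_own = sum(1 for a in views if a["author"] == "s1")
seen, reviews_of_others = set(), 0
for a in views:
    if a["author"] == "s1":
        continue
    if a["post_id"] in seen:
        reviews_of_others += 1
    seen.add(a["post_id"])

print(range_days, posts, pct_posts_read, reviews_of_own, reviews_of_others)
```

In this sample, the student spans 3 calendar days, made 1 post, viewed 1 of 3 posts by others, and revisited one of their own posts and one of another's posts.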
10. 1. Capturing Meaningful Traces
Data Processing
mySQL query merging log + post tables produces list of all actions
Action type (view-post, create-post, edit-post, delete-post)
Time-date stamp
ID of user performing the action
ID of post being acted on
Length of post being acted on
ID of user who created post being acted on
Excel VBA macros
Clean data and separate by user
Calculate action duration (subtraction of sequential time stamps)
Divide actions into sessions-of-use (60-min abandonment threshold)
Make adjusted estimates for duration of session-ending actions
View actions were sub-categorized as reads or scans based on a
maximum reading speed of 6.5 words per second (wps)
View actions on a user’s own posts re-coded as self-reviews
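The session-splitting and read/scan rules above can be sketched as follows. This is a hedged Python reimplementation of the logic described (the original used Excel VBA macros); the function names and sample timestamps are illustrative assumptions.

```python
from datetime import datetime, timedelta

ABANDON = timedelta(minutes=60)   # session abandonment threshold
MAX_WPS = 6.5                     # maximum plausible reading speed

def split_sessions(timestamps):
    """Group a user's time-ordered actions into sessions-of-use:
    a gap of more than 60 minutes starts a new session."""
    sessions, current = [], [timestamps[0]]
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > ABANDON:
            sessions.append(current)
            current = []
        current.append(cur)
    sessions.append(current)
    return sessions

def classify_view(duration_seconds, post_words):
    """A view counts as a 'read' only if the time spent permits reading
    the whole post at or below the maximum speed; otherwise 'scan'."""
    if duration_seconds <= 0:
        return "scan"
    return "read" if post_words / duration_seconds <= MAX_WPS else "scan"

ts = [datetime(2013, 4, 1, 10, 0), datetime(2013, 4, 1, 10, 20),
      datetime(2013, 4, 1, 12, 0)]   # 100-minute gap -> new session
print(len(split_sessions(ts)))       # 2 sessions
print(classify_view(10, 130))        # 13 wps > 6.5 -> 'scan'
print(classify_view(60, 130))        # ~2.2 wps -> 'read'
```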
12. 2. Presenting Data
Simple Table Format
Metric | Your Data (Week X) | Class Average (Week X) | Observations
Range of participation | 4 days | 5 days |
# of sessions | 6 | 13 |
Average session length | 33 min | 48 min |
% of sessions with posts | 67% | 49% |
# of posts made | 8 | 12 |
Average post length | 286 words | 125 words |
% of posts read | 72% | 87% |
# of reviews of own posts | 22 | 13 |
# of reviews of others’ posts | 8 | 112 |
13. 2. Presenting Data
Extracted vs Embedded Analytics
Analytics described so far are extracted traces of
the learning activity presented back to learners for
interpretation
But there is also a second class of analytics, traces
of the learning activity that can be embedded in
the discussion interface
Embedded Analytics in the Visual Discussion Forum
Viewed / unviewed posts (blue / red)
Which posts / parts of discussion attended to thus far
Own posts shown in light blue
Amount and distribution
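The colour rule above can be expressed as a small function. This is an assumed sketch of the logic the slide describes, not the Visual Discussion Forum's actual code; the function and parameter names are made up for illustration.

```python
def post_color(post_author, viewer, viewed_ids, post_id):
    """Embedded-analytics colouring: the learner's own posts are light
    blue, posts already viewed are blue, and new posts are red."""
    if post_author == viewer:
        return "lightblue"
    return "blue" if post_id in viewed_ids else "red"

viewed = {1, 2}
print(post_color("alice", "alice", viewed, 5))  # lightblue (own post)
print(post_color("bob", "alice", viewed, 1))    # blue (already viewed)
print(post_color("bob", "alice", viewed, 7))    # red (new post)
```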
16. 3. Supporting Interpretation
Concerns / Dangers
Rigidity of interpretation (e.g. more is better)
Lack of transparency with regards to data capture and access
Hegemony of optimizing to only that which can be measured
Possibly impeding learner development of metacognitive and self-regulative learning skills
17. 3. Supporting Interpretation
Pedagogical Framework for
Learning Analytics Intervention
1. Integration with the Learning Activity
2. Diversity of Metrics based on Learning Model
3. Agency in Interpreting Meaning
4. Reflection in Explicit Space|Time
5. Dialogue to Negotiate Interpretation
6. Parity between Instructor and Students
18. 3. Supporting Interpretation
Context of Initial Implementation
Blended doctoral seminar with 9 students
10 week-long online discussions about ed tech
Reflective journal (and embedded analytics) for
all 10 weeks
Extracted analytics added for weeks 5 to 10
Guidelines for participation, facilitation and
analytics based on the learning model given to
students in discussion weeks 1, 2, and 5
19. 3. Supporting Interpretation
1. Integration with the Learning Activity
Connect the purpose of the learning activity with
the instructor’s expectations for a productive
process for engaging in it, and how the
learning analytics provide indicators of this
Discussion Participation Guidelines: Attending to Others’ Posts
Broad Listening: Try to read as many posts as possible to consider everyone’s ideas in the discussion. This can help you examine and support your own ideas more deeply. However, when time is limited it is better to view a portion in depth than everything superficially.
*The visual interface shows posts that you have viewed in blue and new ones in red to help you track this.
Learning Analytics Guidelines: Attending to Others’ Posts
% of posts read: The proportion of posts you read (not scanned) at least once. It is good to read as many posts as possible to consider everyone’s ideas in the discussion. However, when time is limited it is better to view a portion in depth than everything superficially.
20. 3. Supporting Interpretation
1. Integration with the Learning Activity
Connect the purpose of the learning activity with
the instructor’s expectations for a productive
process for engaging in it, and how the
learning analytics provide indicators of this
Initial Findings
High overall student buy-in to guidelines /
metrics; it was difficult to isolate the two as
students seemed to think of them together
Students interpreted metrics in terms of the
guidelines
Students described using the guidelines and
metrics to decide how to participate
21. 2. Presenting Data
2. Diversity of Learning-Model-Based Metrics
Metric | Your Data (Week X) | Class Average (Week X) | Observations
Range of participation | 4 days | 5 days |
# of sessions | 6 | 13 |
Average session length | 33 min | 48 min |
% of sessions with posts | 67% | 49% |
# of posts made | 8 | 12 |
Average post length | 286 words | 125 words |
% of posts read | 72% | 87% |
# of reviews of own posts | 22 | 13 |
# of reviews of others’ posts | 8 | 112 |
22. 2. Presenting Data
2. Diversity of Learning-Model-Based Metrics
Metric | Your Data (Week X) | Class Average (Week X) | Observations
Range of participation | 4 days | 5 days |
# of sessions | 6 | 13 |
Average session length | 33 min | 48 min |
Initial Findings
Students found different metrics valuable –
multiple pathways
Highlighted lack of listening by some of the
vociferous speakers / honored efforts of others
Trust in the numbers was important, so calculation
choices became important
23. 3. Supporting Interpretation
3. Agency in Interpreting Meaning
Guidelines present metrics as a starting point for
consideration, not as absolute arbiters of
engagement in the activity
Use of class average to provide context for #s
Students set personal goals for participation and
use the analytics to help monitor these
Initial Findings
Students found goal-setting valuable as motivating
them to improve, used multiple strategies, drew on
metrics and tried to adjust behaviors
Validation and surprises - emotional reactions
No major “big brother” issues
Involuntary propensity to target the average
24. 3. Supporting Interpretation
4. Reflection in Explicit Space | Time
Dual danger of omnipresent analytics
Reflection “anyplace/anytime” happens nowhere/never
Attention to constantly available metrics can distract
from engagement in the activity itself
Our solution: Establish a rhythm for reflection
Place: Online reflective journal (private wiki)
Time: ~10-15 min at start of class each week
Initial Findings
Students consistently set goals and reflected; many
also reported reviewing their reflections
High student self-awareness of whether goals were being met
Dedicated time strained class time / flow
25. 3. Supporting Interpretation
5. Dialogue to Negotiate Interpretation
Reflective dialogue between students and the instructor
about their participation, grounded in the analytics
Conducted through the online reflective journal (private
wiki) shared between each student and the instructor
Both creates an audience for the reflection and allows
for feedback, suggestions etc.
Initial Findings
Having an audience for the journal mattered
Negotiation and contextualization of analytics - students
explained choices, strategies, struggles
Instructor responses seen as supportive, providing
guidance to help students move towards goals
Does this challenge agency? Some tensions…
26. 3. Supporting Interpretation
6. Parity between Instructor and Students
Instructor participates in the same practices of
goal-setting, analytics interpretation and reflective
journaling (in a wiki visible to the whole class)
Goal to create sense of openness / equity around
data use (analytics with, not on students)
Initial Findings
Instructor's reflection useful as an initial model and
reassuring comparison point, but not for parity
Instructor seen as having a positive role in overseeing
and guiding discussion related activities, perhaps
lack of parity is not problematic (in this context)
27. Future Plans
Research Completion
Textual and computational analysis to detect
actual changes in discussion participation across
the course and their alignment with goals,
following the addition of the extracted analytics
Tool Development
Adding measures of post quality
Scale-up of data processing, dialogue/negotiation
Addressing challenges of metrics based on
averages and adequate reference points
28. Alyssa Friend Wise
Simon Fraser University
afw3@sfu.ca
http://www.sfu.ca/~afw3/research/e-listening/index.html