Blackboard’s data science team conducts large-scale analysis of the relationship between the use of our academic technologies and student impact, in order to inform product design, disseminate effective practices, and advance the base of empirical research in educational technologies.
In this presentation, John Whitmer, Director of Analytics & Research, will discuss findings from 2016. Some findings challenge our conventional knowledge, while others confirm what we believed to be true.
Archived presentation made to JISC Learning Analytics workgroup on Feb 22, 2017
4. Educational Technology Assessment Hierarchy
Does it impact student learning? (Learning Analytics)
How many people use it? (Adoption)
Does it work? (SLAs)
5. What is Learning Analytics?
Learning Analytics and Knowledge Conference, 2011
“...measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”
6. Meta-questions driving our Learning Analytics research @ Blackboard
1. How is student/faculty use of Bb platforms (e.g. Learn, Collab, etc.) related to student achievement? [or satisfaction, or risk, or …]
2. Do these findings apply equally to students ‘at promise’ due to their academic achievement or background characteristics? (e.g. race, class, family education, geography)
3. What data elements, feature sets, and functionality can we create to integrate these findings into Bb products to help faculty improve student achievement?
8. Commitment to Privacy & Openness
• Analyze data records that are not only stripped of PII, but de-personalized (at individual & institutional levels)
• Share results and openly discuss analysis procedures to inform the broader educational community
• Respect territorial jurisdictions and safe harbor provisions
11. Bb Study: Relationship Between Time in Learn & Grade
• Distribution of time spent is highly skewed toward low access
• Transforming the data (log transform) can produce normal curves for analysis
• Of course, there is huge variation in quality within that time spent (of course materials, of student activity)
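The log-transform step above can be sketched as follows; the data and variable names (`time_spent_min`) are illustrative, not from Blackboard's actual pipeline.

```python
import numpy as np

# Simulated time-in-LMS data: heavily right-skewed toward low access,
# like the distribution described above (illustrative values only).
rng = np.random.default_rng(0)
time_spent_min = rng.lognormal(mean=3.0, sigma=1.0, size=10_000)

def skew(x):
    """Sample skewness: third standardized moment."""
    x = np.asarray(x)
    return np.mean((x - x.mean()) ** 3) / x.std() ** 3

# A log transform compresses the long right tail, producing an
# approximately normal curve suitable for standard analysis.
log_time = np.log(time_spent_min + 1)  # +1 guards against log(0)

print(round(skew(time_spent_min), 2))  # strongly positive (right-skewed)
print(round(skew(log_time), 2))        # near zero (roughly symmetric)
```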
12. Findings: Relationship Between Time in Learn & Grade
• Question: what is the
relationship between student
use of Learn and their course
grade?
• Investigate at student-course
level (one student, one course)
• 1.2M students, 34,519 courses,
788 institutions
• Significant, but effect size < 1%
13. Finding: Tool Use & Grade
Tool use and Final Grade do not have a linear relationship; there is a diminishing marginal effect of tool use on Final Grade.
Interpretations
• Students absent from course activity are at greatest risk of low achievement.
• The first time you read/see a PowerPoint presentation, you learn a lot, but the second time you read/see it, you learn less.
• Getting from a 90% to a 95% requires more effort than getting from a 60% to a 65%.
Log transformation shows a stronger trend.
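A diminishing-returns relationship like the one described is commonly captured by regressing grade on log-transformed tool use. The sketch below uses synthetic data to illustrate why the log scale shows a stronger trend; it is not the deck's actual model.

```python
import numpy as np

# Synthetic student-course data with a diminishing-returns shape:
# grades rise quickly at low tool use and flatten at high use.
rng = np.random.default_rng(1)
tool_use_min = rng.uniform(1, 600, size=2_000)
grade = np.clip(40 + 10 * np.log(tool_use_min) + rng.normal(0, 8, 2_000), 0, 100)

def r_squared(x, y):
    """R^2 of a simple linear fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1 - resid.var() / y.var()

# Fitting grade against log(use) captures the concave trend better
# than fitting against raw minutes.
r2_raw = r_squared(tool_use_min, grade)
r2_log = r_squared(np.log(tool_use_min), grade)
print(round(r2_raw, 2), round(r2_log, 2))
```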
17. Investigating Grade by Specific Tools Used
Question: what is the relationship between use of Learn and student grade, based on the tool used?
Analysis Steps
1. Filter data for courses with potentially meaningful use (>60 min average use, enrollment >10 and <500, gradebook used)
2. Identify most frequently used tools
3. Separate tool use into no use & quartiles
4. Divide students into 3 groups by course grade
• High (80+)
• Passing (60-79)
• Low/Failing (0-59)
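Steps 1, 3, and 4 above can be sketched with pandas; the schema (`course_id`, `tool_minutes`, `grade`) and the toy data are placeholders, not Blackboard's actual data model.

```python
import numpy as np
import pandas as pd

# Toy student-course records (placeholder schema and values).
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "course_id": rng.integers(0, 50, 3_000),
    "tool_minutes": rng.exponential(90, 3_000),
    "grade": rng.uniform(0, 100, 3_000),
})
df.loc[rng.random(len(df)) < 0.15, "tool_minutes"] = 0.0  # some non-users

# Step 1: keep courses with potentially meaningful use
# (>60 min average use, enrollment between 10 and 500).
course_stats = df.groupby("course_id").agg(
    avg_minutes=("tool_minutes", "mean"),
    enrollment=("course_id", "size"),
)
keep = course_stats.query("avg_minutes > 60 and 10 < enrollment < 500").index
df = df[df["course_id"].isin(keep)].copy()

# Step 3: separate tool use into "no use" plus quartiles of nonzero use.
df["use_bin"] = "no use"
used = df["tool_minutes"] > 0
df.loc[used, "use_bin"] = pd.qcut(
    df.loc[used, "tool_minutes"], 4, labels=["Q1", "Q2", "Q3", "Q4"]
).astype(str)

# Step 4: divide students into three groups by course grade.
df["grade_group"] = np.select(
    [df["grade"] >= 80, df["grade"] >= 60],
    ["High (80+)", "Passing (60-79)"],
    default="Low/Failing (0-59)",
)
print(df["use_bin"].value_counts())
```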
23. Research Questions
1. Are there systematic ways that instructors use LMS tools in their courses that span instructors and institutions?
2. What recommendations can be drawn for faculty, instructional designers, and other academic technology leaders seeking to increase the impact of LMS use at their institution?
Methods
1. Use the same filtered sample of student-course data
2. Calculate relative student time per tool (as % of total course time), for comparison between courses
3. Cluster by patterns in the balance of time spent in each tool (unsupervised machine learning; k-means cluster analysis)
4. Add data as relevant to patterns about enrollment, total time, etc.
5. Make up cool names for each cluster and interpret meaning
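Methods steps 2-3 (relative time per tool, then k-means on the time-share profiles) can be sketched as follows, using scikit-learn and made-up tool names; this illustrates the technique, not Blackboard's implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
tools = ["content", "assessments", "forums", "gradebook"]  # illustrative names

# Toy matrix: minutes per tool for each course (placeholder data).
minutes = rng.exponential(60, size=(500, len(tools)))

# Step 2: convert to relative time per tool (% of total course time),
# so courses with different total activity become comparable.
shares = minutes / minutes.sum(axis=1, keepdims=True)

# Step 3: unsupervised clustering on the time-share profiles.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(shares)

# Each centroid is a typical "balance of time" pattern, ready for
# interpretation (and a cool name).
for label, center in enumerate(km.cluster_centers_):
    print(label, np.round(center, 2))
```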
31. Finding: Discussions with low/high avg use
Compare courses with low forum use to courses averaging more than 1 hour of forum use per student
32. Summary & Future Directions for DS Research
Summary
• Tremendous variation in use of Learn; most use skewed toward low/very low use.
• The importance of time spent in Learn for learning is also tremendously varied (“necessary” vs. “effective” use of Learn).
• Critical to account for this variation to understand the potential importance of Learn activity.
Future Directions
• Analyze quality of activity in greater depth (e.g. content of assignments, words in forum posts) to gain insights into the quality of interactions.
• Conduct time-series analysis (quantitative methods; design also needed); when someone accesses may matter more than whether they do.
• Create proxies/derived values for behavior (above average, at average, etc.) by tool.
34. Blackboard Analytics – Product Naming
Blackboard Analytics: data warehouse products
Blackboard Analytics: suite of analytics products
35. Blackboard Analytics
Product portfolio
Blackboard Intelligence
• Analytics for Learn – LMS data
• Student Management – SIS data
• Finance, HR, Advancement – ERP data
Blackboard Predict
• Predictive analytics and early alerts for retention
• Provides data for faculty and advisors about at-risk students
• Formerly Blue Canary
X-Ray Learning Analytics
• Classroom engagement data for faculty
• Activity aggregated into 30+ visualizations
• Currently available for Moodlerooms & self-hosted Moodle only
37. Blackboard Analytics – Our Approach & Philosophy
Products that provide insight into the teaching and learning process
Our philosophy: data complements human decision-making
Core competency: learning software and academic data
A team of experts in the analytics field