Trends and Issues in Student-Facing Learning Analytics Reporting Systems Research (LAK '17)
Robert Bodily - Brigham Young University, USA
Katrien Verbert - University of Leuven, Belgium
Research Questions
1. What types of features do student-facing learning analytics
reporting systems have?
2. What are the different kinds of data collected in these
systems?
3. How are the system designs analyzed and reported on?
4. What are the perceptions of students about these systems?
5. What is the actual effect of these systems on student
behavior, student skills, and student achievement?
Inclusion Criteria
1. Track learning analytics data (e.g., time spent or resource use) beyond assessment data.
2. Report the learning analytics data directly to students.
The full list of included articles is available at http://bobbodily.com/article_list
Methodology
Sources searched, spanning education, education & computer science, and computer science:
• Conference proceedings: LAK and EDM proceedings; IEEE Xplore
• Peer-reviewed journal articles: ERIC; Google Scholar; ACM database; Computers and Applied Sciences
• Prior literature reviews: Verbert et al. 2013; Verbert et al. 2014; Schwendimann et al. 2016; Drachsler et al. 2015; Romero and Ventura 2010
945 articles were retrieved from the initial search; 94 fit the inclusion criteria.
Coding Categories
Functionality of the system
Data sources tracked and reported
Design analyses conducted
Student perceptions
Actual measured effects
Student use
Functionality
Purpose of the system
Data mining
Visualizations
Class comparison
Recommendation
Feedback
Interactivity
Purpose of the System
Category Name # of articles % of articles
Awareness or reflection 35 37%
Recommend resources 27 29%
Improve retention or engagement 18 19%
Increase online social behavior 7 7%
Recommend courses 3 3%
Other 4 4%
Data Mining
Definition used in this review: any type of statistical analysis beyond descriptive statistics (see the sketch below).
• 46 (49%) included a data mining component
• More common in recommender and data mining
systems, less common in dashboards
• Only 16 (17%) included a visualization component and
a recommendations component
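For illustration only (not taken from any reviewed system), a minimal sketch of what a "data mining" component means in this sense: fitting a predictive model on tracked learning analytics data rather than only summarizing it. The features, numbers, and library choice (scikit-learn) are my own assumptions.

```python
# Illustrative only: a "data mining" component that goes beyond descriptive statistics
# by fitting a predictive model on tracked learning analytics data (hypothetical numbers).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Per-student features: [resource views, minutes spent, forum posts]
X = np.array([[12, 340, 4], [3, 90, 0], [25, 610, 9], [7, 150, 1], [18, 420, 6], [2, 60, 0]])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = passed the course, 0 = did not

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[10, 200, 2]])[0, 1])  # predicted pass probability for a new student
```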
Visualizations
Visualization Type # of Articles
Bar chart 25
Line chart 19
Table 15
Network graph 10
Scatterplot 10
Donut graph 5
Radar chart 4
Pie chart 3
Timeline 3
Word cloud 3
Other 23
Visualizations in the Other category:
• Learning path visualization
• Box and whisker plot
• Tree map
• Explanatory decision tree
• Parallel coordinates graph
• Planning and reflection tool
• Plant metaphor visual
• Tree metaphor
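As a concrete illustration of the most common visualization type in the table above, here is a minimal bar chart comparing a student's weekly resource use with the class average. The data, labels, and library choice (matplotlib) are my own assumptions, not taken from the reviewed systems.

```python
# Minimal sketch of the most common report type in the review: a bar chart of a
# student's weekly resource use next to the class average (hypothetical data).
import matplotlib.pyplot as plt

weeks = ["Wk 1", "Wk 2", "Wk 3", "Wk 4"]
you = [12, 8, 15, 5]
class_avg = [10, 11, 13, 9]

x = range(len(weeks))
plt.bar([i - 0.2 for i in x], you, width=0.4, label="You")
plt.bar([i + 0.2 for i in x], class_avg, width=0.4, label="Class average")
plt.xticks(list(x), weeks)
plt.ylabel("Resources viewed")
plt.legend()
plt.show()
```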
Class Comparison
Definition used in this review: the system had to allow students to see other students' data in comparison with their own (see the sketch below).
• 35 articles (37%) included a class comparison feature
• Which students are motivated by comparison?
• Which students are unmotivated by comparison?
• What effect does personalizing reporting system
features have on student motivation?
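A minimal sketch of the class-comparison idea, assuming time-spent data keyed by hypothetical student IDs: the report shows the student's own value, the class median, and a percentile rank.

```python
# Sketch of a class-comparison report: a student's own value, the class median,
# and the student's percentile rank. Student IDs and minutes are hypothetical.
from statistics import median

def class_comparison(student_id, minutes_by_student):
    mine = minutes_by_student[student_id]
    values = sorted(minutes_by_student.values())
    percentile = 100 * sum(v <= mine for v in values) / len(values)
    return {"you": mine, "class_median": median(values), "percentile": round(percentile)}

minutes = {"s1": 340, "s2": 90, "s3": 610, "s4": 150, "s5": 420}
print(class_comparison("s4", minutes))  # {'you': 150, 'class_median': 340, 'percentile': 40}
```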
Recommendations
Definition used in this review: recommending or suggesting to the student what to do (see the sketch below).
• 43 articles (46%) included recommendations
• 78% of data mining articles provided
recommendations
• Future research should examine differences
between transparent recommendations and
traditional black-box recommendations
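A minimal sketch of the "transparent recommendation" idea raised above: a rule-based suggestion that states its reason, in contrast with a black-box recommender. Concept names, scores, and resources are hypothetical.

```python
# Sketch of a transparent, rule-based recommendation: suggest resources for the
# student's weakest concepts and state the reason. All data here is hypothetical.
def recommend(concept_scores, resources_by_concept, threshold=0.7, limit=2):
    weak = sorted((score, concept) for concept, score in concept_scores.items() if score < threshold)
    suggestions = []
    for score, concept in weak[:limit]:
        for resource in resources_by_concept.get(concept, []):
            suggestions.append(f"Review '{resource}' because your {concept} quiz score is {score:.0%}.")
    return suggestions

scores = {"recursion": 0.55, "loops": 0.90, "pointers": 0.40}
resources = {"recursion": ["Recursion video 3"], "pointers": ["Pointers worksheet"]}
print("\n".join(recommend(scores, resources)))
```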
Feedback
Definition used in this review: telling the user, in text, what has happened.
• 17 systems (18%) provided text feedback
• Used frequently for just-in-time feedback and rarely
for unit-level or concept-level data reports
Interactivity
Definition used in this review: allowing the user to interact with the reporting system in some way.
• 29 systems (31%) included an interactivity component
• Includes linking to content, filtering data results, or
providing simple/advanced views
• Future research should examine how students are
using these interactive features
Data Sources
Subcategory Name # of Articles % of Articles
Resource use 71 76%
Assessment data 34 36%
Social interaction 33 35%
Time spent 29 31%
Other sensor data 7 7%
Manually reported data 5 5%
Design Analysis
Needs Assessment
Information Selection
Visual Design
Usability Testing
Needs Assessment
• 6 articles (6%) included a needs assessment
• Santos, Verbert, Govaerts, & Duval (2013)
• Surveyed students to identify needs
• Had students rank needs on importance
• Targeted the most important student issues
• Future research should conduct needs assessments and report them explicitly.
Information Selection
• 14 articles (15%) included information selection
justification
• Ott, Robins, Haden, & Shephard (2015)
• Examined the literature
• Feild (2015)
• Exploratory data analysis
• Iandoli, Quinto, De Liddo, and Buckingham Shum
(2014)
• Used a theoretical framework
Visual Design
• 12 articles (13%) discussed the visual design or
recommendation design process
• Olmos & Corrin (2012)
• Iterative visual design process
• 85% of articles only presented the final visualization or
recommendation
• Future research should report on the visual design
process used
Usability Testing
• 10 articles (11%) reported a usability test
• Santos, Verbert, & Duval (2012) and Santos,
Govaerts, Verbert, & Duval (2012)
• System Usability Scale (SUS)
• Santos, Boticario, and Perez-Marin (2014)
• Usability and accessibility expert
• Future research should assess usability with the System Usability Scale (SUS), an expert evaluation, or other appropriate methods (a scoring sketch follows).
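Since several of these studies used the System Usability Scale, here is its standard scoring rule as a short sketch: odd-numbered items contribute rating − 1, even-numbered items contribute 5 − rating, and the sum is multiplied by 2.5 to give a 0–100 score. The responses below are made up for illustration.

```python
# Standard SUS scoring: 10 items rated 1-5; odd-numbered items contribute (rating - 1),
# even-numbered items contribute (5 - rating); the sum is scaled by 2.5 to 0-100.
def sus_score(ratings):
    assert len(ratings) == 10, "SUS has exactly 10 items"
    total = sum((r - 1) if i % 2 == 0 else (5 - r) for i, r in enumerate(ratings))
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # 85.0 for these made-up responses
```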
Student Perceptions
Sub-category # of articles % of articles
Usability 32 34%
Useful/Satisfaction 34 37%
Behavior 16 17%
Achievement 2 2%
Skills 15 16%
Actual Measured Effects
Student behavior changes
Student achievement changes
Student skills changes
Student Behavior Changes
1. 21% of students accepted the system recommendation to view additional content (Hsu, 2008)
2. Students participating in courses using the system were more likely to continue taking classes than those who
did not enroll in these courses (Arnold, Hall, Street, Lafayette, & Pistilli, 2012)
3. Students who enabled notifications (on 2 out of 3 systems) showed increased contributions in the social
network space (Xu & Makos, 2015)
4. Students visited the discussion space more frequently but did not post more frequently (Nakahara, Yaegashi,
Hisamatsu, & Yamauchi, 2005)
5. The percentage of posts viewed increased for all students, but there were few sustained changes (Wise, Zhao,
& Hausknecht, 2014)
6. The number of students completing assignments increased and LMS use increased (Chen, Chang, & Wang,
2008)
7. About 50% of students accepted recommendations from the system (Huang, Huang, Wang, & Hwang, 2009)
8. There was an 83.3% student interaction increase after recommendations were given (Holanda et al., 2012)
9. Students completed assignments more quickly and were able to complete the entire course more quickly
(Vesin, Klasnja-Milicevic, Ivanovic, & Budimac, 2013)
10. *For two of the three visualizations, student post quantity increased; for one of the three, student post
quantity decreased (Beheshitha, Hatala, Gašević, & Joksimović, 2016)
11. *Students logged in more frequently, completed their coursework more quickly, completed more
questions, and answered more questions correctly on assignments (Santos et al., 2014)
12. *There were no significant differences between the treatment and control groups in terms of learning
efficiency (Janssen et al., 2007)
*Sample size greater than 150 and an actual experiment (randomized controlled trial or other equivalent group mean difference testing method); a minimal example of such a test follows
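For readers unfamiliar with the "group mean difference testing" criterion above, here is a minimal example of such a test: Welch's two-sample t-test via SciPy on hypothetical final scores, not data from any of the studies.

```python
# Minimal example of a group mean difference test (Welch's two-sample t-test)
# on hypothetical final scores for a dashboard group and a control group.
from scipy import stats

treatment = [78, 85, 90, 72, 88, 81, 79, 93]  # saw the reporting system
control   = [70, 75, 82, 68, 77, 73, 80, 74]  # did not

t, p = stats.ttest_ind(treatment, control, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")  # small p suggests a difference in group means
```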
Student Behavior Changes (summary)
Criteria for inclusion in this summary: N > 150 and a randomized controlled trial or other equivalent group mean difference testing method.
1. For two of the three visualizations, student post quantity increased; for one of the three, student post quantity decreased (Beheshitha, Hatala, Gašević, & Joksimović, 2016)
2. Students logged in more frequently, completed their coursework more quickly, completed more questions, and answered more questions correctly on assignments (Santos et al., 2014)
3. There were no significant differences between the treatment and control groups in terms of learning efficiency (Janssen et al., 2007)
Student Achievement Changes
1. No significant achievement differences (Grann & Bushway, 2014)
2. More A’s and B’s and fewer C’s and D’s (Arnold et al., 2012)
3. No significant achievement differences (Park & Jo, 2015)
4. Students received more passing grades (Denley, 2014)
5. Frequency and quality of posts were affected both positively and negatively (Beheshitha, Hatala, Gašević, &
Joksimović, 2016)
6. Students performed significantly better on the evaluation task (Huang, Huang, Wang, & Hwang, 2009)
7. Treatment group performed significantly better on final exam (Wang, 2008)
8. *No significant differences between treatment and control (Santos, Boticario, & Perez-Marin, 2014)
9. *No significant achievement differences (Ott, Robins, Haden, & Shephard, 2015)
10. *No significant achievement differences, but one course had an effect with Pell eligible students (Dodge,
Whitmer, & Frazee, 2015)
11. *Treatment group performed significantly better on final exam (Kim, Jo, & Park, 2015)
*Sample size greater than 150 and an actual experiment (randomized controlled trial or other equivalent group mean difference testing method)
Student Achievement Changes (summary)
Criteria for inclusion in this summary: N > 150 and a randomized controlled trial or other equivalent group mean difference testing method.
1. No significant differences between treatment and control (Santos, Boticario, & Perez-Marin, 2014)
2. No significant achievement differences (Ott, Robins, Haden, & Shephard, 2015)
3. No significant achievement differences, but one course had an effect with Pell-eligible students (Dodge, Whitmer, & Frazee, 2015)
4. Treatment group performed significantly better on the final exam (Kim, Jo, & Park, 2015)
Student Skills Changes
1. Significant increase in student self-awareness accuracy (Kerly, Ellis, &
Bull, 2008)
2. Female students had increased interest when they had a choice to
use the system; male students reported higher interest with
mandatory notifications (Muldner, Wixon, Rai, Burleson, Woolf, &
Arroyo, 2015)
Student Use
• 12 articles (13%) tracked some form of student use
• Most articles reported on aggregate class level statistics
• Percent of class that accessed the system
• Number of interactions over the course of the semester
• Future research should investigate how students are using
visualization or recommendation reports
• Student use data could help us understand why or how a system is helping students (a logging sketch follows)
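A minimal sketch of the per-student use tracking called for above, going beyond class-level aggregates: count each student's interactions with the report by event type. Event names and student IDs are hypothetical.

```python
# Sketch of per-student use tracking: count dashboard interactions per student and
# per event type, rather than only class-level aggregates. Events are hypothetical.
from collections import Counter, defaultdict

events = [
    ("s1", "open_dashboard"), ("s1", "filter_by_week"), ("s2", "open_dashboard"),
    ("s1", "open_dashboard"), ("s3", "open_dashboard"), ("s3", "click_recommendation"),
]

use = defaultdict(Counter)
for student, action in events:
    use[student][action] += 1

for student, actions in sorted(use.items()):
    print(student, dict(actions))  # e.g. s1 {'open_dashboard': 2, 'filter_by_week': 1}
```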
Practitioner Recommendations
Question Category %
What is the intended goal of the system? Intended Goal 100
What visual techniques will best represent your data? Visualizations 13
What types of data support your goal? Information Selection 15
What do students need? Does it align with your goal? Needs Assessment 6
Is the system easy and intuitive to use? Usability Test 11
Why use the visual techniques you have chosen? Visual Design 13
How do students perceive the reporting system? Student Perceptions 17
What is the effect on student behavior/achievement? Actual Effects 18
How are students using the system? How often? Why? Student Use 13
Future Research
1. Student use: How are students using reporting systems? Are students even
using them?
2. Design process: Are some data types and visualization types better than others,
and in what contexts?
3. Design process: Only a few authors reported on conducting needs assessments
and usability tests. What effect do these methods have on experimental rigor and
accuracy of findings?
4. Experimental research: Quasi-experimental methods, such as propensity score matching, have yet to be used in a student-facing reporting tool context (a minimal sketch follows this list).
5. Experimental research: Current research shows mixed results regarding the
efficacy of these systems. More experimental research is needed on the effects of
these systems on student behavior, achievement, and skills.
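As a pointer for item 4, here is a minimal propensity score matching sketch on hypothetical data: estimate each student's probability of using the reporting system from prior covariates, then pair each user with the non-user whose estimated propensity is closest. Outcomes would then be compared within the matched pairs. Covariates, values, and the scikit-learn model choice are my own assumptions.

```python
# Minimal propensity score matching sketch (hypothetical data): model the probability
# of dashboard use from prior covariates, then match each user to the closest non-user.
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[3.2, 40], [2.1, 10], [3.8, 55], [2.5, 20], [3.0, 35], [2.8, 25]])  # [prior GPA, prior logins]
used_dashboard = np.array([1, 0, 1, 0, 1, 0])

propensity = LogisticRegression().fit(X, used_dashboard).predict_proba(X)[:, 1]
non_users = np.where(used_dashboard == 0)[0]
for i in np.where(used_dashboard == 1)[0]:
    match = non_users[np.argmin(np.abs(propensity[non_users] - propensity[i]))]
    print(f"user {i} (p={propensity[i]:.2f}) matched to non-user {match} (p={propensity[match]:.2f})")
```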
Thank you! Questions?
Articles: www.bobbodily.com/article_list
www.bobbodily.com
bodilyrobert@gmail.com