
Analysing the Use of Distributed Digital Learning Resources

Presentation at the conference on Data Science and Social Research, Naples 17-19 Feb 2016


  1. Analysing the Use of Distributed Digital Learning Resources: a Case Study on the eSchoolbag Platform in Estonia. Mart Laanpere, senior researcher @ Centre for Educational Technology, Tallinn University. Conference on Data Science and Social Research :: Naples, 19 February 2016
  2. Learning Analytics “in the Wild”
      Most Learning Analytics research is conducted on data that comes from a single closed system (e.g. Moodle, a MOOC)
      As the digital footprints of learners increasingly expand towards “the Wild” (the open Web), we need Learning Analytics that can aggregate data from a distributed environment
      National strategy for lifelong learning: a digital turn towards BYOD and digital textbooks, analytics & recommender systems
      Need for Learning Analytics that is not “pedagogically neutral”, i.e. that includes metrics and indicators drawn from contemporary learning theories
  3. Current situation with DLR in Estonia
      Koolielu.ee (since 2009): repository of teacher-created learning resources; more than half of Estonian teachers are registered users; Quality Assurance (subject moderators and a QA checklist)
      LeMill.net: 42K users, 73K learning resources, getting old
      Digital exams: the EIS prototype was received with mixed feelings
      Textbook publishers are experimenting with various e-textbook formats (ePub, Web-based, apps, eLessons, LCMS)
      The majority of actively used digital learning resources are scattered around Web 2.0 (blogs, wikis, LearningApps, Khan Academy, Kahoot, Weebly, HotPotatoes etc.)
  4. Towards a DLR cloud: requirements for eSB
      Metadata harvesting:
         Automatic, every 24 hrs, from multiple repositories (incl. Finnish)
         Content provider is responsible for interfacing and metadata quality
      Creating collections from DLR:
         Powerful metadata-based search and recommendation
         Collections created by teachers for students, for learners
         Shareable on multiple end-user platforms
      Learning analytics:
         Tracking the activities of users (Tin Can API, LRS)
         Indicators and metrics drawn from trialogical learning theory
         Recommender system
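The activity tracking requirement above relies on the Tin Can API (xAPI), in which every learner action is recorded as an actor–verb–object statement sent to a Learning Record Store. A minimal sketch of such a statement is below; the learner mailbox and resource URI are illustrative assumptions, not the actual eSchoolbag identifiers, and the verb URI comes from the standard ADL verb vocabulary.

```python
import json

# Hypothetical xAPI statement: a learner "experienced" a digital learning
# resource. Identifiers below are invented for illustration only.
statement = {
    "actor": {
        "objectType": "Agent",
        "mbox": "mailto:learner@example.ee",  # assumed learner identifier
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",  # standard ADL verb
        "display": {"en-US": "experienced"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.ee/resources/dlr-12345",  # hypothetical DLR id
    },
}

# An LRS would receive this as JSON over an HTTP POST to its statements endpoint.
payload = json.dumps(statement)
print(json.loads(payload)["verb"]["display"]["en-US"])  # → experienced
```

Because every tool emits the same actor–verb–object shape, statements from blogs, LearningApps exercises and the eSB platform itself can be aggregated in one store, which is what makes analytics over a distributed environment feasible.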
  5. Digital Learning Resource cloud
  6. Configurations of digital textbook 2.0
      Planetary system model: stable core
      Linux model: dynamic core
      Lego model: no core at all
  7. Levels of textbook co-authorship
     Level          | Learner’s contribution                           | Examples of tools
     6: Creating    | Creates a new resource from scratch              | GeoGebra, iMovie, Aurasma, PhotoStory, GarageBand, iBooks Author
     5: Remixing    | Rips, mixes, cuts, adds visuals or subtitles     | “Hitler gets angry” video, 9gag, samples, GeoGebra, GDocs
     4: Expanding   | Curates, adds external resources to a collection | Scoop.it, blog
     3: Submitting  | Solves a task, submits to teacher for feedback   | Kahoot, Khan Academy, online tests, worksheets made with GDocs
     2: Interacting | Self-test, simple game                           | LearningApps, HotPotatoes, SCORM
     1: Annotating  | Likes, bookmarks, comments                       | YouTube video, ePub, PDF, Web page
     0: Consuming   | Views, listens, reads                            | PowerPoint, PDF, video
  8. Discussion & conclusions
      Learning analytics works differently in a distributed environment; tools need adaptation
      LA becomes more relevant to teachers and students if the units of analysis relate to a theory of learning (if possible, several alternative theories)
      Open issues: privacy-preserving data mining, aggregating the data from state registries, research and Learning Analytics
