ISMAR 2016 Presentation

Presentation given by Mark Billinghurst at the ISMAR 2016 conference on September 20th, 2016. This talk describes work on using gaze tracking to enhance remote collaboration.

  1. Do You See What I See? The Effect of Gaze Tracking on Task Space Remote Collaboration • Kunal Gupta¹, Gun A. Lee², and Mark Billinghurst² • ¹HIT Lab NZ, University of Canterbury • ²Empathic Computing Lab, University of South Australia • ISMAR 2016, September 20th, 2016
  2. Motivation • Improving remote assistance from an expert user (e.g. maintenance) • Supporting rich communication cues
  3. Task Space Teleconferencing • Focus on sharing a view of the remote task space • Methods: handheld tablets with cameras + AR cues; fixed cameras in the workspace • Limitations: fixed viewpoint; difficult to know where the user is looking
  4. Head Worn Collaborative Systems • Place a camera on the head + use a head mounted display • HWC + HMD + remote pointing improves collaboration • Limitations: remote view fixed; expert doesn't know exactly where the worker is looking
  5. Gaze Tracking in Teleconferencing • Monitor based (Brennan 2008), (Carletta 2010) • Gaze provides an attention cue, significantly improved performance • Head mounted (Fussell 2003), (Ou 2005) – no HMD • No performance improvement, focus of attention can be predicted
  6. Comparison to Previous Work • Rem = remote collaboration, FtF = face-to-face collaboration • Gaze = eye tracking, HWC = head worn camera, HMD = head mounted display
  7. Key Research Questions • Q1: Will sharing of gaze and pointer cues affect the feeling of co-presence between users? • Q2: Will sharing of gaze and pointer cues improve performance in a remote collaborative task?
  8. Prototype Design • Combining the following: (1) a head mounted eye-tracker, (2) a head mounted camera, (3) a head mounted display, (4) remote viewing software • System diagram
  9. Local Worker • Brother AirScouter • Microsoft LifeCam HD-5000 • Logitech Webcam C920
  10. Pupil Labs Eye Tracking • Open source eye-tracking • Uses IR reflection from the eye • Image processing on a PC • Tracks the eye at 30 fps • Provides raw data • www.pupil-labs.com
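As an illustration of how the raw gaze stream can be consumed, here is a minimal sketch against Pupil Capture's ZMQ-based network API (assumptions: Pupil Capture is running locally with Pupil Remote on its default port 50020, and pyzmq and msgpack are installed; field names follow the Pupil documentation and may differ between versions). This is not the study's code, only a sketch of reading normalized gaze positions.

```python
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote (REQ/REP on port 50020 by default) for the data SUB port.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to gaze datums and print normalized gaze position + confidence.
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = sub.recv_multipart()
    gaze = msgpack.loads(payload)
    print(topic.decode(), gaze["norm_pos"], gaze["confidence"])
```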
  11. Remote Expert Desktop • Live camera view • Gaze shown as a red dot • Can add pointer cues • Mouse input • Green dot • Shown in the HMD
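A rough sketch of the remote expert view described above (not the study software): overlay the worker's gaze as a red dot and the expert's mouse-driven pointer as a green dot on a live camera frame with OpenCV. The fixed gaze value stands in for the position streamed from the eye tracker.

```python
import cv2

gaze = (0.5, 0.5)      # normalized gaze position (stand-in for the streamed value)
pointer = [320, 240]   # expert's pointer position in pixels, driven by the mouse

def on_mouse(event, x, y, flags, param):
    pointer[0], pointer[1] = x, y

cap = cv2.VideoCapture(0)
cv2.namedWindow("remote view")
cv2.setMouseCallback("remote view", on_mouse)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    # Red dot: local worker's gaze (normalized coordinates, bottom-left origin assumed).
    cv2.circle(frame, (int(gaze[0] * w), int((1 - gaze[1]) * h)), 8, (0, 0, 255), -1)
    # Green dot: remote expert's pointer cue.
    cv2.circle(frame, tuple(pointer), 8, (0, 255, 0), -1)
    cv2.imshow("remote view", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```

In the actual prototype the composited view is also mirrored back to the worker's HMD; this sketch only covers the desktop overlay.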
  12. User Experiment • In remote expert collaboration ... • Do pointer / eye tracker cues have a significant effect on co-presence? • Do pointer / eye tracker cues have a significant effect on task performance?
  13. Experimental Design • 2 × 2 factorial: Pointer cue (no/yes) × Eye tracker cue (no/yes) • NONE: no pointer, no eye tracker • E: eye tracker only • P: pointer only • BOTH: pointer + eye tracker
  14. Experimental Design – Setup
  15. Experimental Design – Task • Block assembly • Four different structures, 17 pieces in each • Pilot tested to balance difficulty level • Assigned to conditions with counterbalancing • Active head movement encouraged through a secondary task (timer) and an L-shaped desk setup
  16. Experimental Design – Procedure • Practice trial in face-to-face collaboration • Participants separated for the experimental trials • For each condition: • Remote helper creates the structure based on instructions • Perform the experimental task • Answer the per-condition questionnaire • Post-experiment questionnaire & debriefing
  17. Experimental Design – Participants • Within-subjects, balanced Latin square design • 30 participants (15 pairs) recruited, 26 retained • 21-33 years old, 73% male • Fluent English speakers • No one had done block assembly over video conferencing before
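For reference, condition orders for a balanced Latin square over the four conditions can be generated as below; the construction (first row 1, 2, n, 3, n-1, ..., then cyclic shifts) is standard, and the helper itself is illustrative rather than the study's script.

```python
def balanced_latin_square(conditions):
    """Rows = presentation orders; each condition precedes every other
    condition equally often across rows (works for an even number of conditions)."""
    n = len(conditions)
    first, low, high = [0], 1, n - 1
    while len(first) < n:          # build the sequence 0, 1, n-1, 2, n-2, ...
        first.append(low)
        low += 1
        if len(first) < n:
            first.append(high)
            high -= 1
    return [[conditions[(x + r) % n] for x in first] for r in range(n)]

for order in balanced_latin_square(["NONE", "P", "E", "BOTH"]):
    print(order)
```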
  18. Results – Summary • Both the POINTER and EYE TRACKER visual cues helped participants perform the task significantly faster. • The POINTER cue significantly improved both local and remote users’ perceived quality of communication, collaboration, and co-presence. • The EYE TRACKER cue significantly improved communication and collaboration quality and the sense of being focused for local workers, and enjoyment for remote helpers. • The BOTH condition was ranked the best in most aspects of user experience, while the NONE condition was ranked the worst. • Visual cues made the conversation more efficient, changed the choice of wording in deictic expressions, and helped participants feel more connected.
  19. Results – Task completion time • Repeated-measures two-way ANOVA (α = 0.05) • POINTER cue: F(1,12) = 4.908, p = .047*, 15% less time • EYE TRACKER cue: F(1,12) = 5.811, p = .033*, 10% less time • Interaction: F(1,12) = 0.566, p = .466
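The analysis above is a 2 × 2 repeated-measures ANOVA on completion time; a hedged sketch with statsmodels is shown below. The file and column names are hypothetical, and the data are assumed to be in long format with one completion time per pair per condition.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format file: columns pair, pointer (no/yes), eye (no/yes), time (sec)
df = pd.read_csv("completion_times.csv")

res = AnovaRM(df, depvar="time", subject="pair", within=["pointer", "eye"]).fit()
print(res)  # F and p for the POINTER and EYE TRACKER main effects and their interaction
```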
  20. Results – Per-condition rating questionnaire • Q1 I felt connected with my partner. • Q2 I felt I was present with my partner. • Q3 My partner was able to sense my presence. • Q4 My partner (or for Remote Helper: I) could tell when I (or for Remote Helper: my partner) needed assistance. • Q5 I enjoyed the experience. • Q6 I was able to focus on the task activity. • Q7 I am confident that we completed the task correctly. • Q8 My partner and I worked together well. • Q9 I was able to express myself clearly. • Q10 I was able to understand partner’s message. • Q11 Information from partner was helpful. • Adopted from [Kim et al. 2014]
  21. Results – Per-condition rating questionnaire • 7-point Likert scale • 1: totally disagree ~ 7: totally agree • Internal consistency: Cronbach’s α = .937 • Aggregated into a 0~100 scale
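For illustration, the two reported preprocessing steps, Cronbach's α over the eleven items and rescaling mean ratings from the 1-7 Likert range onto 0-100, can be computed as follows (illustrative code, not the authors' scripts).

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of Likert ratings."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

def likert_to_0_100(mean_rating, lo=1, hi=7):
    """Map a mean rating on a lo..hi Likert scale onto 0..100."""
    return (mean_rating - lo) / (hi - lo) * 100
```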
  22. Results – Per-condition rating questionnaire • Aligned Rank Transform (ART) + repeated-measures ANOVA (α = 0.05) • POINTER cue: F(1,12) = 7.414, p = .019* • EYE TRACKER cue: F(1,12) = 26.822, p < .001* • Interaction: F(1,12) = 2.023, p = .180
  23. Results – Per-condition rating questionnaire • Aligned Rank Transform (ART) + repeated-measures ANOVA (α = 0.05) • POINTER cue: F(1,12) = 11.914, p = .005* • EYE TRACKER cue: F(1,12) = 15.929, p = .002* • Interaction: F(1,12) = 5.157, p = .042*
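The ART analyses above are usually run with the ARTool package in R; the Python sketch below only illustrates the basic align-and-rank idea for one effect in a 2 × 2 within-subjects design, assuming one aggregated rating per pair per condition (column names are hypothetical, and this simplified alignment uses cell means and ignores subject effects).

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

def art_rm_anova(df, effect):
    """Align df['rating'] for one effect ('pointer', 'eye' or 'interaction'),
    rank the aligned values, then run a repeated-measures ANOVA on the ranks.
    Only the F-test for the aligned effect should be interpreted."""
    grand = df["rating"].mean()
    cell = df.groupby(["pointer", "eye"])["rating"].transform("mean")
    m_p = df.groupby("pointer")["rating"].transform("mean")
    m_e = df.groupby("eye")["rating"].transform("mean")
    if effect == "pointer":
        est = m_p - grand
    elif effect == "eye":
        est = m_e - grand
    else:  # interaction
        est = cell - m_p - m_e + grand
    aligned = df["rating"] - cell + est       # strip the other fixed effects
    ranked = df.assign(art=aligned.rank())    # average ranks for ties
    return AnovaRM(ranked, depvar="art", subject="pair",
                   within=["pointer", "eye"]).fit()

# df columns: pair, pointer (no/yes), eye (no/yes), rating (0-100)
# print(art_rm_anova(df, "eye"))
```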
  24. Results – Per-condition rating questionnaire • Local Workers
  25. Results – Per-condition rating questionnaire • Remote Helpers
  26. Results – Ranking
  27. Results – Ranking • Friedman test (α = 0.05) • Local worker: E > NONE on C5 • Remote helper: E > NONE on C2, C4, C5 • Ranking scale: 1 = the best ~ 4 = the worst
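For reference, the Friedman test on the per-participant condition rankings can be run with SciPy as below; the rank vectors are made-up placeholders, not the study data.

```python
from scipy.stats import friedmanchisquare

# One rank (1 = best ... 4 = worst) per participant for each condition (placeholder values).
none_r = [4, 4, 3, 4, 4, 3]
p_r    = [2, 3, 2, 2, 3, 2]
e_r    = [3, 2, 4, 3, 2, 4]
both_r = [1, 1, 1, 1, 1, 1]

stat, p = friedmanchisquare(none_r, p_r, e_r, both_r)
print(f"chi-square = {stat:.2f}, p = {p:.3f}")
```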
  28. Results – Preference and qualitative feedback • Understanding partner • Local workers: 85% preferred conditions including the POINTER cue: “With Pointer, I can relate to what he is talking about, because I could understand him more.” • Remote helpers: 70% preferred the BOTH condition: “The eye tracker helps me to look in the same view of my partner.” • Performing the task efficiently • 77% of local & 85% of remote users preferred the BOTH condition: “The eye tracker was giving my partner more information about where I looked at, while the pointer was for giving me the instruction from my partner.”
  29. Results – Behaviour observation • The pointer cue reduced the number of phrases spoken • Local worker: F(1,11) = 6.532, p = .027* • Remote helper: F(1,11) = 8.479, p = .014* • Referring to objects & directing • Without Pointer: describe features (colour, size, shape, ...); “move left/right”, “in front of”; “that” object • With Pointer: “this one”; “put it here”, “next to this”; “this” object
  30. Discussion • Performance improved by using the eye-tracker and pointer • Pointer – gave direct guidance • Eye-tracker – showed the worker’s focus of attention • Eye tracker provided benefit even without the pointer • Same view condition for the local worker, but improved communication quality • Benefits of gaze information • Improved communication, more enjoyable, better focus, reduced interruption • Benefits of the virtual pointer cue • Ease of direction, increased sense of presence, increased partner awareness • Different user needs • Local worker – understanding/empathy – benefits from eye-tracking • Remote expert – giving instruction – benefits from using the pointer cue
  31. Implications • 1. Eye-tracking can be used to change the nature of remote collaboration with head worn systems • Makes the remote user aware of implicit intentions • 2. Providing gaze cues alone can significantly improve remote collaboration even without remote pointing • Eye-tracking alone was just as beneficial as remote pointing alone • 3. Communication cues like gaze and pointing play a very important role in creating a sense of co-presence and deeper understanding • Most users preferred gaze + pointer due to the connection it created
  32. Limitations • 1. Prototype too bulky and tethered to a PC, not truly wearable • New HMDs with integrated cameras could overcome this • 2. Task limited compared to realistic remote collaboration • But it did have key elements such as object identification • 3. Experimental measures not very detailed • Detailed behavioural analysis, conversational analysis
  33. Conclusions • Does sharing eye-tracking information between a local user and a remote expert help in terms of co-presence and performance? • Using gaze and pointer attention cues improved performance time • Gaze and pointer cues improved the feeling of co-presence • Many areas for future work • Explore parallel tasks – both people with the same roles • Provide symmetric communication cues – gaze both ways • Use other physiological cues: GSR, heart rate, EEG
  34. Empathy Glasses (CHI 2016) • Combines eye-tracking, a display, and facial expression sensing • Implicit cues – eye gaze, face expression • Pupil Labs + Epson BT-200 + AffectiveWear • Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
  35. AffectiveWear – Emotion Glasses • Photo sensors to recognize expressions • User calibration • Machine learning • Recognizes 8 facial expressions
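Purely as an illustration of the calibrate-then-classify idea sketched above (not the AffectiveWear implementation), a per-user expression classifier over photo-sensor readings could look like the following; the sensor count, class count, and the random stand-in data are assumptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in calibration data: 16 photo-sensor channels, 8 expression labels.
X_calib = rng.normal(size=(400, 16))
y_calib = rng.integers(0, 8, size=400)

# Per-user model: fit on that user's calibration samples, then classify live readings.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_calib, y_calib)
print(clf.predict(X_calib[:5]))   # would be live sensor frames at runtime
```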
  36. Remote Collaboration • Eye gaze pointer and remote pointing • Face expression display • Implicit cues for remote collaboration
  37. Contact Us • www.empathiccomputing.org • @marknb00 • mark.billinghurst@unisa.edu.au • gun.lee@unisa.edu.au
