This document discusses how augmented and virtual reality technologies can be used to create more empathetic and collaborative experiences. It outlines trends in content capture, networking bandwidth, and natural interfaces that enable new types of shared experiences. Examples are presented of past and current AR/VR systems that allow remote users to share live video, 3D spaces, gestures, and physiological cues like gaze and expression. The document concludes that AR and VR are well-suited for developing empathetic computing applications by allowing users to understand, experience, and share perspectives and emotions.
9. Modern Technology Trends
1. Improved Content Capture
• Move from sharing faces to sharing places
2. Increased Network Bandwidth
• Sharing natural communication cues
3. Implicit Understanding
• Recognizing behaviour and emotion
These three trends combined with AR/MR technology
can create new types of collaborative experiences
23. Natural Gesture Input
• Freehand input
• Depth sensors, rich two handed gestures
Sharp, T., Keskin, C., Robertson, D., Taylor, J., Shotton, J., Kim, D., Rhemann, C., Leichter, I.,
... & Izadi, S. (2015, April). Accurate, Robust, and Flexible Real-time Hand Tracking. In Proc. CHI 2015.
https://www.youtube.com/watch?v=A-xXrMpOHyc
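Freehand depth-sensor input of this kind can be illustrated very simply: if the hand is assumed to be the closest object to the camera, thresholding the depth frame to a near-interaction range isolates it. The sketch below is only an illustration of that idea (the `segment_hand` function, the range values, and the synthetic frame are assumptions for this example, not the cited paper's method):

```python
import numpy as np

def segment_hand(depth_mm, near=300, far=600):
    """Segment the closest object (assumed to be the hand) in a depth
    frame by thresholding to a near-interaction range.

    depth_mm: 2D array of per-pixel depth in millimetres (0 = no reading).
    Returns a boolean mask and the (row, col) centroid of the in-range
    pixels, or (mask, None) if nothing is in range.
    """
    mask = (depth_mm >= near) & (depth_mm <= far)
    if not mask.any():
        return mask, None
    rows, cols = np.nonzero(mask)
    return mask, (rows.mean(), cols.mean())

# Synthetic 8x8 frame: background at 2000 mm, a "hand" patch at 450 mm.
frame = np.full((8, 8), 2000)
frame[2:5, 3:6] = 450
mask, centroid = segment_hand(frame)
# 9 in-range pixels; centroid at row 3.0, col 4.0
```

A real system would follow this segmentation with skeletal model fitting, as in the hand-tracking work cited above.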
27. Milgram’s Reality-Virtuality Continuum
Reality-Virtuality (RV) Continuum:
Real Environment <-> Augmented Reality (AR) <-> Augmented Virtuality (AV) <-> Virtual Environment
Mixed Reality spans everything between the two extremes:
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and F. Kishino, A Taxonomy of Mixed Reality Visual Displays.
IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
29. AR Video Conferencing (2001)
• Bringing conferencing into real world
• Using AR video textures of remote people
• Attaching AR video to real objects
Billinghurst, M., & Kato, H. (2002). Collaborative augmented reality. Communications
of the ACM, 45(7), 64-70.
31. MR Point Cloud Sharing (2016)
• Use video see-through HMD
  • Oculus Rift
• Depth camera on HMD
  • SoftKinetic
• Create point cloud
  • Remote hand gestures
  • User's space
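Building a point cloud from an HMD-mounted depth camera amounts to back-projecting each depth pixel through the pinhole camera model. A minimal sketch, assuming known camera intrinsics (`fx`, `fy`, `cx`, `cy` here are placeholders, not actual SoftKinetic calibration values):

```python
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project a depth image to a 3D point cloud with the pinhole
    model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, Z = depth.

    depth_m: 2D array of depth in metres (0 = invalid pixel).
    Returns an (N, 3) array of XYZ points in the camera frame.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    valid = z > 0                      # drop pixels with no depth reading
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=1)

# Toy 2x2 depth image, every pixel 1 m away.
depth = np.ones((2, 2))
cloud = depth_to_point_cloud(depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
print(cloud.shape)  # (4, 3)
```

The resulting points, streamed each frame, are what the remote user sees as the local user's space and hands.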
37. AR/VR Collaboration (2017)
• Make 3D copy of real space
• AR user in real world, VR user in 3D copy of real space
• Hololens for AR user, HTC Vive for VR user
• Share virtual body cues (head, hands, gaze information)
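Sharing virtual body cues between the AR and VR user comes down to streaming each user's head, hand, and gaze poses over the network. The paper does not specify its wire format; a hypothetical JSON message (all field names and the 7-tuple pose layout are assumptions for illustration) could look like:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class BodyCueUpdate:
    """Hypothetical per-frame body-cue message for AR/VR co-presence."""
    user_id: str
    head_pose: list    # [x, y, z, qx, qy, qz, qw]: position + quaternion
    left_hand: list    # same 7-element layout
    right_hand: list
    gaze_dir: list     # unit direction vector in world space

    def to_json(self) -> str:
        return json.dumps(asdict(self))

    @staticmethod
    def from_json(s: str) -> "BodyCueUpdate":
        return BodyCueUpdate(**json.loads(s))

msg = BodyCueUpdate("ar_user",
                    [0, 1.6, 0, 0, 0, 0, 1],
                    [-0.2, 1.2, 0.3, 0, 0, 0, 1],
                    [0.2, 1.2, 0.3, 0, 0, 0, 1],
                    [0, 0, -1])
decoded = BodyCueUpdate.from_json(msg.to_json())
```

On receipt, the remote client would drive an avatar's head, hands, and gaze ray from these values.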
45. Example: Google Glass
• Camera + Processing + Display + Connectivity
• Ego-Vision Collaboration (but with a fixed view)
46. Current Collaboration on Wearables
• First person remote conferencing/hangouts
• Limitations
• Single POV, no spatial cues, no annotations, etc
47. Social Panoramas (ISMAR 2014)
• Capture and share social spaces in real time
• Supports independent views into Panorama
Reichherzer, C., Nassani, A., & Billinghurst, M. (2014, September). [Poster] Social panoramas using
wearable computers. In Mixed and Augmented Reality (ISMAR), 2014 IEEE International Symposium
on (pp. 303-304). IEEE.
48. Implementation
• Google Glass
• Capture live image panorama (compass + camera)
• Remote device (tablet)
• Immersive viewing, live annotation
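The compass-plus-camera capture can be sketched as a mapping from compass heading to a horizontal offset in a cylindrical panorama, so each live frame lands at the direction the wearer is facing. A minimal illustration (not the actual Social Panoramas implementation):

```python
def heading_to_column(heading_deg, pano_width):
    """Map a compass heading (degrees, 0 = north) to the horizontal pixel
    offset of a frame's centre in a cylindrical panorama whose full width
    spans 360 degrees. Headings outside 0-360 wrap around."""
    return int((heading_deg % 360.0) / 360.0 * pano_width)

# 4096-pixel-wide panorama: east (90 deg) lands a quarter of the way in.
print(heading_to_column(90, 4096))   # 1024
print(heading_to_column(-90, 4096))  # 3072  (wraps to 270 deg)
```

The remote tablet viewer can then pan independently across the stitched image, which is what gives each side its own view into the shared space.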
55. Joseph Tame – Tokyo Marathon
• Live streaming from Tokyo marathon
• http://josephta.me/en/tokyo-marathon/
56. Empathy Glasses (CHI 2016)
• Combine eye tracking, display, and face expression sensing
• Implicit cues – eye gaze, face expression
• Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of
the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
57. AffectiveWear – Emotion Glasses
• Photo sensors to recognize expression
• User calibration
• Machine learning
• Recognizing 8 face expressions
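Per-user calibration plus a simple classifier can be illustrated with a nearest-centroid scheme: calibration averages each user's photo-sensor readings per expression, and recognition picks the closest stored prototype. This is a toy sketch only; AffectiveWear's real sensor count, calibration procedure, and classifier differ:

```python
import numpy as np

def fit_centroids(samples, labels):
    """Per-user calibration: average the sensor vectors recorded for each
    expression to get one prototype (centroid) per class."""
    classes = sorted(set(labels))
    return {c: np.mean([s for s, l in zip(samples, labels) if l == c], axis=0)
            for c in classes}

def classify(centroids, reading):
    """Label a new sensor reading by its nearest centroid (Euclidean)."""
    return min(centroids, key=lambda c: np.linalg.norm(reading - centroids[c]))

# Two toy expressions, three photo-sensor channels each.
samples = [np.array([0.9, 0.1, 0.2]), np.array([0.8, 0.2, 0.1]),
           np.array([0.1, 0.9, 0.8]), np.array([0.2, 0.8, 0.9])]
labels = ["smile", "smile", "frown", "frown"]
cents = fit_centroids(samples, labels)
print(classify(cents, np.array([0.85, 0.15, 0.15])))  # smile
```

The same calibrate-then-classify pattern scales to the eight expressions mentioned above by adding more labelled classes per user.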
58. Empathy Glasses in Use
• Eye gaze pointer and remote pointing
• Face expression display
• In future integrated eye-tracking/display
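Turning tracked eye gaze into a shared pointer is essentially a coordinate mapping from the eye tracker's normalised gaze estimate into the shared first-person video frame. A minimal sketch (the function and its clamping behaviour are assumptions for illustration, not taken from the paper):

```python
def gaze_to_pixel(nx, ny, frame_w, frame_h):
    """Map normalised gaze coordinates (0-1, origin top-left) into pixel
    coordinates of the shared first-person video, clamped to the frame
    so the pointer never leaves the image."""
    px = min(max(nx, 0.0), 1.0) * (frame_w - 1)
    py = min(max(ny, 0.0), 1.0) * (frame_h - 1)
    return round(px), round(py)

print(gaze_to_pixel(0.5, 0.5, 640, 480))   # centre of a 640x480 frame
print(gaze_to_pixel(1.5, -0.2, 640, 480))  # off-screen gaze is clamped
```

The remote side draws a cursor at this pixel position, giving the helper an implicit view of where the local user is looking.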
64. AR and VR for Empathic Computing
• VR systems are ideal for trying experiences:
• Strong storytelling medium
• Provide total immersion/3D experience
• Easy to change virtual body scale and representation
• AR systems are ideal for live sharing:
• Allow overlay on real world view/can share viewpoints
• Support remote annotation/communication
• Enhance real-world tasks