LECTURE 10: RESEARCH
DIRECTIONS IN AR AND VR
COMP 4010 – Virtual Reality
Semester 5 – 2018
Bruce Thomas, Mark Billinghurst
University of South Australia
October 30th 2018
Key Technologies for VR Systems
• Visual Display
• Stimulate visual sense
• Audio/Tactile Display
• Stimulate hearing/touch
• Tracking
• Changing viewpoint
• User input
• Input Devices
• Supporting user interaction
Many Directions for Research
• Research in each of the key technology areas for VR
• Display, input, tracking, graphics, etc.
• Research in the phenomena of VR
• Psychological experience of VR, measuring Presence, etc.
• Research in tools for VR
• VR authoring tools, automatic world creation, etc.
• Research in many application areas
• Collaborative/social, virtual characters, medical, education, etc.
Future Visions of VR: Ready Player One
• https://www.youtube.com/watch?v=LiK2fhOY0nE
Today vs. Tomorrow
VR in 2018 → VR in 2045
• Graphics: High quality → Photo-realistic
• Display: 110-150 degrees → Total immersion
• Interaction: Handheld controller → Full gesture/body/gaze
• Navigation: Limited movement → Natural
• Multiuser: Few users → Millions of users
Augmented Reality
• Defining Characteristics [Azuma 97]
• Combines Real and Virtual Images
• Both can be seen at the same time
• Interactive in real-time
• The virtual content can be interacted with
• Registered in 3D
• Virtual objects appear fixed in space
Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
Future Vision of AR: IronMan
• https://www.youtube.com/watch?v=Y1TEK2Wf_e8
Key Enabling Technologies
1. Combines Real and Virtual Images → Display Technology
2. Registered in 3D → Tracking Technologies
3. Interactive in real-time → Interaction Technologies
Future research can be done in each of these areas
DISPLAY
• Past
• Bulky head-mounted displays
• Current
• Handheld, lightweight head-mounted displays
• Future
• Projected AR
• Wide FOV see-through
• Retinal displays
• Contact lens
Evolution in Displays
Wide FOV See-Through (3+ years)
• Waveguide techniques
• Wider FOV
• Thin see-through
• Socially acceptable
• Pinlight Displays
• LCD panel + point light sources
• 110-degree FOV
• UNC/Nvidia
Lumus DK40
Maimone, A., Lanman, D., Rathinavel, K., Keller, K., Luebke, D., & Fuchs, H. (2014). Pinlight displays: wide
field of view augmented reality eyeglasses using defocused point light sources. In ACM SIGGRAPH 2014
Emerging Technologies (p. 20). ACM.
Pinlight Display Demo
https://www.youtube.com/watch?v=tJULL1Oou9k
Light Field Displays
https://www.youtube.com/watch?v=J28AvVBZWbg
Retinal Displays (5+ years)
• Photons scanned into the eye
• Infinite depth of field
• Bright outdoor performance
• Overcome visual defects
• True 3D stereo with depth modulation
• Microvision (1993-)
• Head-mounted monochrome
• Magic Leap (2013-)
• Projecting a light field into the eye
Contact Lens (10-15+ years)
• Contact lens only
• Unobtrusive
• Significant technical challenges
• Power, data, resolution
• Babak Parviz (2008)
• Contact lens + micro-display
• Wide FOV
• Socially acceptable
• Innovega (innovega-inc.com)
http://spectrum.ieee.org/biomedical/bionics/augmented-reality-in-a-contact-lens/
TRACKING
Evolution of Tracking
• Past
• Location-based, marker-based, magnetic/mechanical
• Present
• Image-based, hybrid tracking
• Future
• Ubiquitous
• Model-based
• Environmental
Model-Based Tracking (1-3 yrs)
• Track from a known 3D model
• Use depth + colour information
• Match input to the model template (pose estimation sketched below)
• Use CAD models of the targets
• Recent innovations
• Learning models online
• Tracking in cluttered scenes
• Tracking deformable objects
Hinterstoisser, S., Lepetit, V., Ilic, S., Holzer, S., Bradski, G., Konolige, K., & Navab, N. (2013).
Model based training, detection and pose estimation of texture-less 3D objects in heavily
cluttered scenes. In Computer Vision – ACCV 2012 (pp. 548-562). Springer Berlin Heidelberg.
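As a concrete illustration of the pose-estimation step above, here is a minimal Python sketch assuming that 2D-3D correspondences between image detections and points on the known CAD model have already been found (e.g. by template matching). It uses OpenCV's solvePnPRansac; the function and variable names are illustrative rather than taken from any cited system.

```python
# Minimal sketch: estimate object pose from a known 3D model, assuming
# 2D-3D correspondences between image detections and CAD-model points
# have already been established (e.g. by template matching).
import numpy as np
import cv2

def estimate_model_pose(model_pts_3d, image_pts_2d, camera_matrix):
    """model_pts_3d: (N,3) points on the CAD model; image_pts_2d: (N,2) matched detections."""
    dist_coeffs = np.zeros(5)                       # assume an undistorted image
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        model_pts_3d.astype(np.float32),
        image_pts_2d.astype(np.float32),
        camera_matrix, dist_coeffs)
    if not ok:
        return None                                 # tracking lost this frame
    R, _ = cv2.Rodrigues(rvec)                      # rotation matrix
    return R, tvec                                  # model pose in camera coordinates
```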
Deformable Object Tracking
https://www.youtube.com/watch?v=KThSoK0VTDU
Environmental Tracking (3+ yrs)
• Environment capture
• Use depth sensors to capture the scene and track from the model (fusion step sketched below)
• InfiniTAM (www.robots.ox.ac.uk/~victor/infinitam/)
• Real-time scene capture on mobile devices, dense or sparse capture
• Dynamic memory swapping allows large environments to be captured
• Cross-platform, open-source library available
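The following is an illustrative sketch, not the InfiniTAM API, of the core operation such dense-capture systems perform: fusing one depth frame into a truncated signed distance function (TSDF) voxel volume with a weighted running average. The grid layout, function name, and parameters are assumptions.

```python
# Illustrative TSDF fusion step of the kind dense reconstruction systems use
# to integrate a new depth frame into a voxel grid (KinectFusion-style).
import numpy as np

def tsdf_update(tsdf, weights, voxel_centers, depth, K, T_wc, trunc=0.05):
    """Fuse one depth image into the TSDF volume.

    tsdf, weights  : flat arrays, one value per voxel
    voxel_centers  : (N, 3) voxel centres in world coordinates
    depth          : (H, W) depth image in metres
    K, T_wc        : camera intrinsics and world-to-camera pose (4x4)
    """
    # Transform voxel centres into the camera frame and project them.
    pts_c = (T_wc[:3, :3] @ voxel_centers.T + T_wc[:3, 3:4]).T
    z = pts_c[:, 2]
    uv = (K @ pts_c.T).T
    u = np.round(uv[:, 0] / np.maximum(z, 1e-6)).astype(int)
    v = np.round(uv[:, 1] / np.maximum(z, 1e-6)).astype(int)

    H, W = depth.shape
    valid = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    d_obs = np.where(valid, depth[np.clip(v, 0, H - 1), np.clip(u, 0, W - 1)], 0.0)
    sdf = d_obs - z                              # signed distance along the viewing ray
    keep = valid & (d_obs > 0) & (sdf > -trunc)
    sdf = np.clip(sdf / trunc, -1.0, 1.0)        # truncate and normalise

    # Weighted running average per voxel.
    w_new = weights[keep] + 1.0
    tsdf[keep] = (tsdf[keep] * weights[keep] + sdf[keep]) / w_new
    weights[keep] = w_new
    return tsdf, weights
```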
InfinitAM Demo
https://www.youtube.com/watch?v=47zTHHxJjQU
Fusion4D (2016)
• Shahram Izadi (Microsoft + perceptiveIO)
• Real-time capture and dynamic reconstruction
• RGBD sensors + incremental reconstruction
Fusion4D Demo
https://www.youtube.com/watch?v=rnz0Kt36mOQ
Wide Area Outdoor Tracking (5+ yrs)
• Process
• Combine panoramas into a point cloud model (offline)
• Initialize camera tracking from the point cloud
• Update pose by aligning the camera image to the point cloud (sketched below)
• Accurate to 25 cm and 0.5 degrees over a very wide area
Ventura, J., & Hollerer, T. (2012). Wide-area scene mapping for mobile visual tracking. In Mixed
and Augmented Reality (ISMAR), 2012 IEEE International Symposium on (pp. 3-12). IEEE.
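A hedged sketch of the per-frame update described above, under the assumption that ORB descriptors are stored alongside the offline point-cloud model; this is a generic feature-matching + PnP localizer, not the authors' implementation.

```python
# Illustrative per-frame localization against a pre-built point-cloud model:
# descriptors stored with the 3D points (from the offline step) are matched
# to features in the current frame, and the pose is re-estimated.
import numpy as np
import cv2

def localize_frame(gray_frame, cloud_pts_3d, cloud_descriptors, camera_matrix):
    """cloud_pts_3d: (M,3) model points; cloud_descriptors: (M,32) ORB descriptors (uint8)."""
    orb = cv2.ORB_create(2000)
    kps, desc = orb.detectAndCompute(gray_frame, None)
    if desc is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(desc, cloud_descriptors)
    if len(matches) < 12:
        return None                                        # not enough support
    img_pts = np.float32([kps[m.queryIdx].pt for m in matches])
    obj_pts = np.float32([cloud_pts_3d[m.trainIdx] for m in matches])
    ok, rvec, tvec, _ = cv2.solvePnPRansac(obj_pts, img_pts,
                                           camera_matrix, np.zeros(5))
    return (rvec, tvec) if ok else None
```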
Wide Area Outdoor Tracking
https://www.youtube.com/watch?v=8ZNN0NeXV6s
Outdoor Localization using Maps
• Use 2D building footprints and approximate building heights
• Process
• Sensor input for initial position and orientation
• Estimate camera orientation from straight line segments
• Estimate camera translation from façade segmentation
• Use the pose estimate to initialise SLAM tracking
• Results: 90% with < 4 m position error and < 3° angular error
Arth, C., Pirchheim, C., Ventura, J., Schmalstieg, D., & Lepetit, V. (2015). Instant outdoor
localization and SLAM initialization from 2.5 D maps. IEEE transactions on visualization and
computer graphics, 21(11), 1309-1318.
Demo: Outdoor Tracking
https://www.youtube.com/watch?v=PzV8VKC5buQ
INTERACTION
Evolution of Interaction
• Past
• Limited interaction
• Viewpoint manipulation
• Present
• Screen-based, simple gesture, tangible interaction
• Future
• Natural gesture, multimodal
• Intelligent interfaces
• Physiological/sensor-based
Natural Gesture (2-5 years)
• Freehand gesture input
• Depth sensors for gesture capture
• Move beyond simple pointing
• Rich two-handed gestures (see the pinch-detection sketch below)
• E.g. Microsoft Research Hand Tracker
• 3D hand tracking, 30 fps, single sensor
• Commercial systems
• Meta, MS HoloLens, Oculus, Intel, etc.
Sharp, T., Keskin, C., Robertson, D., Taylor, J., Shotton, J., Leichter, I., ... & Izadi, S.
(2015, April). Accurate, Robust, and Flexible Real-time Hand Tracking. In Proc. CHI 2015.
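As a small example of moving beyond simple pointing, here is a minimal sketch assuming a hand tracker that reports 3D fingertip positions (the class name and thresholds are illustrative): a pinch detector with hysteresis so the gesture does not flicker near the threshold.

```python
# Minimal pinch detector over tracked fingertip positions, with hysteresis.
import numpy as np

class PinchDetector:
    def __init__(self, on_dist=0.02, off_dist=0.035):   # metres (assumed values)
        self.on_dist, self.off_dist = on_dist, off_dist
        self.pinching = False

    def update(self, thumb_tip: np.ndarray, index_tip: np.ndarray) -> bool:
        d = float(np.linalg.norm(thumb_tip - index_tip))
        if self.pinching:
            self.pinching = d < self.off_dist            # release only when clearly apart
        else:
            self.pinching = d < self.on_dist             # engage only when clearly together
        return self.pinching
```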
Hand Tracking Demo
https://www.youtube.com/watch?v=QTz1zQAnMcU
Example: Eye Tracking Input
• Smaller/cheaper eye-tracking systems
• More HMDs with integrated eye-tracking
• Research questions
• How can eye gaze be used for interaction?
• What interaction metaphors are natural?
• What technology can be used for eye-tracking?
• Etc.
Eye Gaze Interaction Methods
• Gaze for interaction
• Implicit vs. explicit input
• Exploring different gaze interaction methods
• Duo-reticles – use eye saccade input
• Radial pursuit – use smooth pursuit motion
• Nod and roll – use the vestibulo-ocular reflex
• Hardware
• HTC Vive + Pupil Labs integrated eye-tracking
• User study comparing the methods for 3DUI
Piumsomboon, T., Lee, G., Lindeman, R. W., & Billinghurst, M. (2017, March).
Exploring natural eye-gaze-based interaction for immersive virtual reality. In 3D User
Interfaces (3DUI), 2017 IEEE Symposium on (pp. 36-39). IEEE.
Duo-Reticles (DR)
• Two reticles: an Inertial Reticle (IR) and a Real-time Reticle (RR), originally called the Eye-gaze Reticle
• As RR and IR become aligned, an alignment timer counts down (panels A-1 to A-3)
• Selection is completed when the timer runs out (sketched below)
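A small sketch of the alignment countdown above (the angular threshold and dwell time are assumptions): while the eye-gaze reticle stays within a small angle of the inertial reticle, a timer counts down and triggers the selection.

```python
# Dwell-style selection: count down while the two reticles remain aligned,
# reset when they separate. Angles in degrees, dt in seconds (assumed units).
class AlignmentTimer:
    def __init__(self, dwell_s=1.0, align_deg=2.0):
        self.dwell_s, self.align_deg = dwell_s, align_deg
        self.remaining = dwell_s

    def update(self, angle_between_reticles_deg: float, dt: float) -> bool:
        if angle_between_reticles_deg <= self.align_deg:
            self.remaining -= dt                  # RR and IR aligned: count down
        else:
            self.remaining = self.dwell_s         # misaligned: reset the timer
        if self.remaining <= 0.0:
            self.remaining = self.dwell_s
            return True                           # selection completed
        return False
```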
Radial Pursuit (RP)
• The user follows a moving candidate with smooth-pursuit eye movement, tracked by the Real-time Reticle (RR) (panels B-1 to B-4)
• The candidate whose trajectory p_i(t) stays closest to the gaze trajectory p'(t) over the interaction is selected (sketched below):
  d_min = min(d_1, d_2, …, d_n), where d_i = Σ_t |p_i(t) − p'(t)|
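A minimal sketch of the selection metric above (illustrative, not the study's code): accumulate the distance between each candidate's trajectory and the gaze trajectory, then pick the smallest.

```python
# Radial pursuit selection: the object whose path best matches the gaze path wins.
import numpy as np

def radial_pursuit_select(gaze_points, object_trajectories):
    """gaze_points: (T, 2) gaze samples; object_trajectories: list of (T, 2) arrays."""
    distances = [
        np.sum(np.linalg.norm(traj - gaze_points, axis=1))  # d_i = sum_t |p_i(t) - p'(t)|
        for traj in object_trajectories
    ]
    return int(np.argmin(distances))                         # index achieving d_min
```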
Nod and Roll (NR)
• Uses the vestibulo-ocular reflex: the head nods or rolls while the eyes stay fixed on the target
• Compares a Head-gaze Reticle (HR) with the Real-time Reticle (RR) (panels C-1 to C-3)
Demo: Eye gaze interaction methods
https://www.youtube.com/watch?v=EpCGqxkmBKE
Multimodal Input (5+ years)
• Combine gesture and speech input
• Gesture good for qualitative input
• Speech good for quantitative input
• Support combined commands
• "Put that there" + pointing
• E.g. HIT Lab NZ multimodal input
• 3D hand tracking, speech
• Multimodal fusion module (see the sketch below)
• Tasks completed faster with MMI, with fewer errors
Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in Space: Gesture Interaction with
Augmented-Reality Interfaces. IEEE computer graphics and applications, (1), 77-80.
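A hedged sketch of one simple fusion step such a module could use (assumed data structures, not the HIT Lab NZ implementation): resolve a deictic word like "that" or "there" by pairing it with the pointing event closest to it in time.

```python
# Pair each deictic word in a speech command with the nearest pointing event.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PointingEvent:
    time: float          # seconds
    target_id: str       # object or location hit by the pointing ray

def resolve_deictic(word_time: float, pointing: List[PointingEvent],
                    max_gap: float = 1.0) -> Optional[str]:
    """Return the target pointed at closest in time to a spoken 'that'/'there'."""
    if not pointing:
        return None
    best = min(pointing, key=lambda p: abs(p.time - word_time))
    return best.target_id if abs(best.time - word_time) <= max_gap else None

# e.g. "put that [t=2.1 s] there [t=3.0 s]" plus pointing events at 2.0 s and 2.9 s
# resolves to a (source object, destination) pair for the move command.
```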
HIT Lab NZ Multimodal Input
https://www.youtube.com/watch?v=DSsrzMxGwcA
Intelligent Interfaces (10+ years)
• Move from explicit to implicit input
• Recognize user behaviour
• Provide adaptive feedback
• Support scaffolded learning
• Move beyond check-lists of actions
• E.g. AR + Intelligent Tutoring
• Constraint-based ITS + AR (see the sketch below)
• PC assembly (Westerfield, 2015)
• 30% faster, 25% better retention
Westerfield, G., Mitrovic, A., & Billinghurst, M. (2015). Intelligent Augmented Reality Training for
Motherboard Assembly. International Journal of Artificial Intelligence in Education, 25(1), 157-172.
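An illustrative sketch of constraint-based tutoring logic of the kind such a system could use (the constraints and feedback strings are invented for illustration, not taken from the cited system): each constraint has a relevance test and a satisfaction test, and violated constraints drive the adaptive feedback shown in AR.

```python
# Constraint-based tutoring: flag violated constraints for AR feedback.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Constraint:
    relevant: Callable[[Dict], bool]    # does this constraint apply to the current state?
    satisfied: Callable[[Dict], bool]   # is it met?
    feedback: str                       # hint overlaid in the AR view if violated

def violated_constraints(state: Dict, constraints: List[Constraint]) -> List[str]:
    return [c.feedback for c in constraints
            if c.relevant(state) and not c.satisfied(state)]

# Example (invented) constraints for a PC-assembly task.
constraints = [
    Constraint(lambda s: "cpu" in s, lambda s: s.get("cpu") == "seated",
               "Seat the CPU fully before closing the retention lever."),
    Constraint(lambda s: s.get("ram_installed", 0) > 0,
               lambda s: s.get("ram_slots") in ("A2", "A2+B2"),
               "Install RAM in slot A2 first for dual-channel operation."),
]
print(violated_constraints({"cpu": "loose", "ram_installed": 1, "ram_slots": "A1"},
                           constraints))
```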
COLLABORATION
Collaborative VR Systems
• Directions for research
• Scalability – towards millions of users
• Graphics – support for multiple different devices
• User representation – realistic face/body input
• Support for communication cues – messaging, recording, etc.
• Goal: collaboration in VR as good as or better than face-to-face (FtF)
AltspaceVR | Facebook Spaces
Demo: High Fidelity
https://www.youtube.com/watch?v=-ivL1DDwUK4
ENHANCED
EXPERIENCES
Crossing Boundaries
Jun Rekimoto, Sony CSL
Invisible Interfaces
Jun Rekimoto, Sony CSL
Milgram's Reality-Virtuality continuum
• Mixed Reality spans the Reality-Virtuality (RV) Continuum:
  Real Environment → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtual Environment
The MagicBook
• The MagicBook spans the continuum from Reality through Augmented Reality (AR) and Augmented Virtuality (AV) to Virtuality
The MagicBook
• Using AR to transition along Milgram's continuum
• Moving seamlessly from Reality to AR to VR
• Support for Collaboration
• Face to Face, Shared AR/VR, Multi-scale
• Natural interaction
• Handheld AR and VR viewer
Billinghurst, M., Kato, H., & Poupyrev, I. (2001). The MagicBook: a transitional
AR interface. Computers & Graphics, 25(5), 745-753.
Demo: MagicBook
https://www.youtube.com/watch?v=tNMljw0F-aw
Invisible Interfaces
Jun Rekimoto, Sony CSL
Example: Visualizing Sensor Networks
• Rauhala et al., 2006 (Linköping)
• Network of humidity sensors
• ZigBee wireless communication
• Use mobile AR to visualize humidity
Rauhala, M., Gunnarsson, A. S., & Henrysson, A. (2006). A novel interface to sensor
networks using handheld augmented reality. In Proceedings of the 8th conference on
Human-computer interaction with mobile devices and services (pp. 145-148). ACM.
• Humidity information overlaid on the real world in mobile AR (see the sketch below)
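A hedged sketch of the overlay step (assumed data layout, not the cited system): project each sensor's known 3D position into the current camera view and map its humidity reading to a colour for the mobile AR display.

```python
# Project sensor positions into the camera image and colour them by humidity,
# ready to be drawn over the live video frame.
import numpy as np

def humidity_to_rgb(h):
    """Map 0-100% relative humidity to a blue (dry) -> red (humid) colour."""
    t = np.clip(h / 100.0, 0.0, 1.0)
    return (int(255 * t), 0, int(255 * (1.0 - t)))

def overlay_points(sensors, K, R, t):
    """sensors: list of (world xyz, humidity %). Returns (u, v, rgb) per visible sensor."""
    out = []
    for xyz, humidity in sensors:
        p_cam = R @ np.asarray(xyz) + t
        if p_cam[2] <= 0:
            continue                                  # behind the camera
        uvw = K @ p_cam
        u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
        out.append((u, v, humidity_to_rgb(humidity)))
    return out
```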
Invisible Interfaces
Jun Rekimoto, Sony CSL
UbiVR – CAMAR
CAMAR Companion
CAMAR Viewer
CAMAR Controller
GIST - Korea
ubiHome @ GIST
• Smart-home testbed combining sensors and services: ubiKey, couch sensor, PDA, Tag-it, door sensor, ubiTrack, light service, media services, and an MR window
• Each component contributes context cues such as who/what/when/where/how
SOCIAL ACCEPTANCE
Example: Social Acceptance
• People don't want to look silly
• Only 12% of 4,600 adults would be willing to wear AR glasses
• 20% of mobile AR browser users experience social issues
• Acceptance is more a social than a technical issue
• Needs further study (ethnographic, field tests, longitudinal)
TAT Augmented ID
https://www.youtube.com/watch?v=tb0pMeg1UN0
CONCLUSIONS
Research Needed in Many Areas
• Collaborative Experiences
• VR/AR teleconferencing
• Social Acceptance
• Overcoming social problems with AR/VR
• Cloud Services
• Cloud-based storage/processing
• Authoring Tools
• Easy AR/VR content creation for non-experts
• Etc.
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
