LECTURE 4: VR SYSTEMS
COMP 4010 – Virtual Reality
Semester 5 - 2019
Bruce Thomas, Mark Billinghurst, Gun Lee
University of South Australia
August 20th 2019
• Survey of VR technologies
• Tracking
• Haptic/Tactile Displays
• Audio Displays
• Input Devices
Recap – Last Week
Tracking in VR
• Need for Tracking
• User turns their head and the VR graphics scene changes
• User wants to walk through a virtual scene
• User reaches out and grabs a virtual object
• User wants to use a real prop in VR
• All of these require technology to track the user or object
• Continuously provide information about position and orientation
Head Tracking
Hand Tracking
Tracking Technologies
§ Active (device sends out signal)
• Mechanical, Magnetic, Ultrasonic
• GPS, WiFi, cell location
§ Passive (device senses world)
• Inertial sensors (compass, accelerometer, gyro)
• Computer Vision
• Marker based, Natural feature tracking
§ Hybrid Tracking
• Combined sensors (e.g. Vision + Inertial)
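A minimal sketch of hybrid tracking, assuming a 1-D orientation estimate: the gyro is fast but drifts, the vision tracker is slow but absolute, and a complementary filter blends them. The function name and the 0.98 blend weight are illustrative, not from any real tracking SDK.

```python
# Sketch of hybrid tracking (vision + inertial) with a complementary filter.
# The blend weight alpha = 0.98 is illustrative.

def complementary_filter(angle, gyro_rate, vision_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate (high frequency) with an absolute
    vision-based angle (low frequency) to cancel gyro drift."""
    predicted = angle + gyro_rate * dt          # integrate the inertial rate
    return alpha * predicted + (1 - alpha) * vision_angle

# A gyro with a +1 deg/s bias would drift forever on its own; the vision
# measurement (true heading = 5 degrees) keeps pulling the estimate back.
angle = 0.0
for _ in range(200):                            # 2 seconds at 100 Hz
    angle = complementary_filter(angle, gyro_rate=1.0, vision_angle=5.0, dt=0.01)
```

The high-frequency inertial term keeps head tracking responsive between (slower) vision updates, which is exactly why combined sensors are used.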
Haptic Feedback
• Greatly improves realism
• Hands and wrist are most important
• High density of touch receptors
• Two kinds of feedback:
• Touch Feedback
• information on texture, temperature, etc.
• Does not resist user contact
• Force Feedback
• information on weight and inertia
• Actively resists contact motion
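Force feedback of this kind is commonly computed with a penalty (spring) model: when the haptic proxy penetrates a virtual surface, push back proportionally. A toy sketch, with an illustrative stiffness value:

```python
def contact_force(penetration_depth_m, stiffness_n_per_m=500.0):
    """Penalty-based force feedback sketch: when the haptic device
    penetrates a virtual surface, resist with a spring force F = k * x.
    The stiffness value is illustrative."""
    return stiffness_n_per_m * max(0.0, penetration_depth_m)

# 2 mm into a virtual wall -> about 1 N of resistance;
# no force at all when the user is not touching anything.
```

Touch (tactile) feedback, by contrast, would modulate vibration or temperature without resisting motion.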
Active vs. Passive Haptics
• Active Haptics
• Actively resists motion
• Key properties
• Force resistance, DOF, latency
• Passive Haptics
• Not controlled by system
• Use real props (e.g. styrofoam for walls)
Audio Displays
• Spatialization vs. Localization
• Spatialization is the processing of sound signals
to make them emanate from a point in space
• This is a technical topic
• Localization is the ability of people to identify the
source position of a sound
• This is a human topic, i.e., some people are better at it.
• Head-Related Transfer Function (HRTF)
• Models how sound from a source reaches the eardrum
• Needs to be measured for each individual
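Spatialization with an HRTF amounts to convolving the mono source signal with the measured left/right head-related impulse responses (HRIRs) for the source direction. A toy sketch with made-up impulse responses (real HRIRs are measured per individual, as noted above):

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal at a virtual position by convolving it with the
    head-related impulse responses (HRIRs) for that direction."""
    return np.convolve(mono, hrir_left), np.convolve(mono, hrir_right)

# Toy HRIRs for a source on the listener's right: the right ear hears the
# sound earlier and louder (interaural time and level differences).
hrir_right = np.array([1.0, 0.0, 0.0, 0.0])   # direct, full level
hrir_left = np.array([0.0, 0.0, 0.0, 0.5])    # 3-sample delay, attenuated
mono = np.array([1.0, 0.5, 0.25])
left, right = spatialize(mono, hrir_left, hrir_right)
```

Localization is then the listener's ability to decode those interaural cues back into a perceived source position.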
VR Input Devices
• Physical devices that convey information into the application
and support interaction in the Virtual Environment
Multiple Input Devices
• Natural
• Eye, gaze, full body tracking
• Handheld devices
• Controllers, gloves
• Body worn
• Myo armband
• Pedestrian devices
• Treadmill, ball
Mapping Between Input and Output
Comparison Between Devices
From Jerald (2015)
• Comparing hand and non-hand input
VR SYSTEMS
Creating a Good VR Experience
• Creating a good experience requires good system design
• Integrating multiple hardware, software, interaction, content elements
Example: Shard VR Slide
• Ride down the Shard at 100 mph - Multi-sensory VR
https://www.youtube.com/watch?v=HNXYoEdBtoU
Key Components to Consider
• Five key components:
• Inputs
• Outputs
• Computation/Simulation
• Content/World database
• User interaction
From: Sherman, W. R., & Craig, A. B. (2018). Understanding virtual reality:
Interface, application, and design. Morgan Kaufmann.
Typical VR System
• Combining multiple technology elements for good user experience
• Input devices, output modality, content databases, networking, etc.
From Content to User
[Diagram: Content (modelling program → 3D models, textures; CAD data → translation) feeds Software (application programming, dynamics generator, 3D/sound renderers), which connects to User I/O (input devices: gloves, mic, trackers; output devices: HMD, audio, haptics), driven by user actions (speak, grab)]
Case Study: Multimodal VR System
• US Army project
• Simulate control of an unmanned vehicle
• Sensors (input)
• Head/hand tracking
• Gesture, Speech (Multimodal)
• Displays (output)
• HMD, Audio
• Processing
• Graphics: Virtual vehicles on battlefield
• Speech processing/understanding
Neely, H. E., Belvin, R. S., Fox, J. R., & Daily, M. J. (2004, March). Multimodal interaction
techniques for situational awareness and command of robotic combat entities. In Aerospace
Conference, 2004. Proceedings. 2004 IEEE (Vol. 5, pp. 3297-3305). IEEE.
System Diagram
VR CONTENT
Types of VR Experiences
• Immersive Spaces
• 360 Panoramas/Movies
• High visual quality
• Limited interactivity
• Changing viewpoint orientation
• Immersive Experiences
• 3D graphics
• Lower visual quality
• High interactivity
• Movement in space
• Interact with objects
Types of VR Graphics Content
• Panoramas
• 360 images/video
• Captured 3D content
• Scanned objects/spaces
• Modelled Content
• Hand created 3D models
• Existing 3D assets
Capturing Panoramas
• Stitching individual photos together
• Image Composite Editor (Microsoft)
• AutoPano (Kolor)
• Using 360 camera
• Ricoh Theta-S
• Fly360
Consumer 360 Capture Devices
• Kodak 360, Fly 360, Gear 360, Theta S, Nikon
• LG 360, Pointgrey Ladybug, Panono 360, Bublcam
Example: Cardboard Camera
• Capture 360 panoramas
• Stitch together images on phone
• View in VR on Google Cardboard Viewer
Cardboard Camera
• https://www.youtube.com/watch?v=d5lUXZhWaZY
• Use camera pairs to capture stereo 360 video
• Samsung 360 round
• 17 lenses, 4K 3D images, live video streaming, $10K USD
• Vuze+ VR camera
• 8 lenses, 4K stereoscopic 3D 360° video and photo, $999 USD
Stereo Video Capture
Samsung 360 Round
• https://www.youtube.com/watch?v=X_ytJJOmVF0
3D Scanning
• A range of products support 3D scanning
• Create point cloud or mesh model
• Typically combine RGB cameras with depth sensing
• Captures texture plus geometry
• Multi-scale
• Object Scanners
• Handheld, Desktop
• Body Scanners
• Rotating platform, multi-camera
• Room scale
• Mobile, tripod mounted
Example: Matterport
• Matterport Pro2 3D scanner
• Room scale scanner, panorama and 3D model
• 360° (left-right) x 300° (vertical) field of view
• Structured light (infrared) 3D sensor
• 15 ft (4.5 m) maximum range
• 4K HDR images
Matterport Pro2 Lite
• https://www.youtube.com/watch?v=SjHk0Th-j1I
Handheld/Desktop Scanners
• Capture people/objects
• Sense 3D scanner
• accuracy of 0.90 mm, colour resolution of 1920×1080 pixels
• Occipital Structure sensor
• Add-on to iPad, mesh scanning, IR light projection, 60 Hz
Structure Sensor
• https://www.youtube.com/watch?v=7j3HQxUGvq4
3D Modelling
• A variety of 3D modelling tools can be used
• Export in VR compatible file format (.obj, .fbx, etc)
• Especially useful for animation - difficult to create from scans
• Popular tools
• Blender (free), 3DS max, Maya, etc.
• Easy to Use
• Tinkercad, Sketchup Free, Meshmixer, Fusion 360, etc.
Modelling in VR
• Several tools for modelling in VR
• Natural interaction, low polygon count, 3D object viewing
• Low end
• Google Blocks
• High end
• Quill, Tilt brush – 3D painting
• Gravity Sketch – 3D CAD
Example: Google Blocks
• https://www.youtube.com/watch?v=1TX81cRqfUU
Example: Gravity Sketch
• https://www.youtube.com/watch?v=VK2DDnT_3l0
Download Existing VR Content
• Many locations for 3D objects, textures, etc.
• Google Poly - Low polygon VR ready models
• Sketchfab, Sketchup, Free3D (www.free3d.com), etc.
• Asset stores - Unity, Unreal
• Provide 3D models, materials, code, etc..
Google Poly
• https://poly.google.com/ - search for models you’d like
SIMULATION
Typical VR Simulation Loop
• User moves head, scene updates, displayed graphics change
• Need to synchronize system to reduce delays
System Delays
Typical Delay from Tracking to Rendering
System Delay
Typical System Delays
• Total Delay = 50 + 2 + 33 + 17 = 102 ms
• 1 ms delay = 1/3 mm error for an object drawn at arm's length
• So ~34 mm total error from when the user begins moving to when the object is drawn
[Pipeline: Tracking (20 Hz = 50 ms) → Calculate Viewpoint / Simulation (500 Hz = 2 ms) → Render Scene (30 Hz = 33 ms) → Draw to Display (60 Hz = 17 ms); the tracker feeds x, y, z and roll, pitch, yaw into the application loop]
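The latency budget above can be checked directly; each stage contributes (at worst) one full period of its update rate:

```python
# Recompute the latency budget of the tracking-to-display pipeline.
stage_rates_hz = {"tracking": 20, "simulation": 500, "render": 30, "display": 60}
stage_delays_ms = {name: 1000.0 / hz for name, hz in stage_rates_hz.items()}

total_ms = sum(stage_delays_ms.values())   # 50 + 2 + 33.3 + 16.7 = 102 ms

# Rule of thumb from the slide: 1 ms of delay = 1/3 mm of error for an
# object drawn at arm's length, so ~102 ms = ~34 mm of misregistration.
error_mm = total_ms / 3.0
```

This is why speeding up any single stage (tracker, CPU, GPU, or display) directly shrinks the registration error.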
Living with High Latency (1/3 sec – 3 sec)
• https://www.youtube.com/watch?v=_fNp37zFn9Q
Effects of System Latency
• Degraded Visual Acuity
• Scene still moving when head stops = motion blur
• Degraded Performance
• As latency increases it’s difficult to select objects etc.
• If latency > 120 ms, training doesn’t improve performance
• Breaks-in-Presence
• If system delay is high, users don't believe they are in VR
• Negative Training Effects
• Users learn to operate in a world with delay
• Simulator Sickness
• Latency is greatest cause of simulator sickness
Simulator Sickness
• Visual input conflicting with vestibular system
Many Causes of Simulator Sickness
• 25-40% of VR users get Simulator Sickness, due to:
• Latency
• Major cause of simulator sickness
• Tracking accuracy/precision
• Seeing world from incorrect position, viewpoint drift
• Field of View
• Wide field of view creates more vection in the periphery = sickness
• Refresh Rate/Flicker
• Flicker/low refresh rate creates eye fatigue
• Vergence/Accommodation Conflict
• Creates eye strain over time
• Eye separation
• If IPD does not match the inter-image distance, discomfort results
Motion Sickness
• https://www.youtube.com/watch?v=BznbIlW8iqE
How to Reduce System Delays
• Use faster components
• Faster CPU, display, etc.
• Reduce the apparent lag (Time Warp)
• Take tracking measurement just before rendering
• Remove tracker from the loop
• Use predictive tracking
• Use fast inertial sensors to predict where user will be looking
• Difficult due to erratic head movements
Jerald, J. (2004). Latency compensation for head-mounted virtual reality. UNC
Computer Science Technical Report.
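The predictive-tracking idea can be sketched as simple dead reckoning: extrapolate the pose forward by the known system latency using the inertial velocity estimate (real systems smooth this with a Kalman filter, as noted above). The function and values here are illustrative:

```python
def predict_pose(position, velocity, latency_s):
    """Dead-reckoning prediction: extrapolate the head position forward by
    the known system latency using an inertial velocity estimate."""
    return [p + v * latency_s for p, v in zip(position, velocity)]

# Head moving right at 0.5 m/s; compensate for 80 ms of latency
# (about the reliable prediction horizon cited from Holloway).
predicted = predict_pose([0.0, 1.6, 0.0], [0.5, 0.0, 0.0], 0.080)
# predicted x is ~0.04 m ahead of the last raw measurement
```

Erratic head movements are what make this hard: the further ahead you predict, the more a sudden reversal turns the prediction into error.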
Reducing System Lag
[Pipeline: Tracking (faster tracker) → Calculate Viewpoint / Simulation (faster CPU) → Render Scene (faster GPU) → Draw to Display (faster display)]
Reducing Apparent Lag (Time Warp)
[Diagram: render into a 1280 x 960 virtual display, larger than the 640 x 480 physical display; the physical window is selected from the virtual display using the latest tracked position rather than the last known position]
• Create a virtual display larger than the physical display and shift it at the last moment
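The time-warp idea can be sketched as a crop of an oversized framebuffer, using the buffer sizes from the diagram. The function is illustrative, not a real compositor API:

```python
import numpy as np

def time_warp_crop(virtual_frame, offset_x, offset_y, width=640, height=480):
    """Select the physical window out of an oversized virtual framebuffer,
    using the latest head offset applied after rendering has finished."""
    return virtual_frame[offset_y:offset_y + height, offset_x:offset_x + width]

# Render once into the larger 1280 x 960 virtual display...
virtual = np.zeros((960, 1280), dtype=np.uint8)

# ...then, just before scan-out, place the 640 x 480 physical window using
# the newest tracker reading instead of the pose used at render time.
frame = time_warp_crop(virtual, offset_x=320, offset_y=240)
```

The crop costs almost nothing, so the tracker-to-photon delay shrinks to the time of this final step rather than the whole render.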
Predictive Tracking for Reducing Latency
[Plot: position vs. time, extrapolating past measurements through "now" into the future]
• Use additional sensors (e.g. inertial) to predict future position
• Can reliably predict up to 80 ms into the future (Holloway)
• Use Kalman filters or similar to smooth the prediction
Predictive Tracking Reduces Error (Azuma 94)
GRAPHICS
VR Graphics Architecture/Tools
• Rendering Layer (GPU acceleration) [OpenGL]
• Low level graphics code
• Rendering pixels/polygons
• Interface with graphics card/frame buffer
• Graphics Layer (CPU acceleration) [X3D, OSG]
• Scene graph specification
• Object physics engine
• Specifying graphics objects
• Application Layer [Unity, Unreal]
• User interface libraries
• Simulation/behaviour code
• User interaction specification
• Low level code for loading models and showing on screen
• Using shaders and low level GPU programming to improve graphics
Traditional 3D Graphics Pipeline
Graphics Challenges with VR
• Higher data throughput (> 7x desktop requirement)
• Lower latency requirements (from 150ms/frame to 20ms)
• HMD Lens distortion
• HMD may have cheap lens
• Creates chromatic aberration and distorted image
• Warp graphics images to create undistorted view
• Use low level shader programming
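The pre-warp is typically a radial polynomial evaluated per pixel in a fragment shader: push each pixel outward to cancel the pincushion distortion the lens adds. A Python sketch with illustrative coefficients:

```python
def distort(u, v, k1=0.22, k2=0.24):
    """Radial pre-warp applied per pixel (normally in a fragment shader):
    r' = r * (1 + k1*r^2 + k2*r^4), with (u, v) relative to the lens centre.
    k1, k2 are illustrative; real values come from the HMD's lens profile."""
    r2 = u * u + v * v
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return u * scale, v * scale

# The image centre is unchanged; points near the edge are pushed outward,
# cancelling the pincushion distortion introduced by the lens.
```

Chromatic aberration is corrected the same way, with slightly different coefficients per colour channel.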
Lens Distortion
VR System Pipeline
• Using time warping and lens distortion
Perception Based Graphics
• Eye Physiology
• Cones in the eye's centre (fovea) = colour vision; rods in the periphery = motion, B+W
• Foveated Rendering
• Use eye tracking to draw highest resolution where user looking
• Reduces graphics throughput
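Foveated rendering can be sketched as choosing a shading level from eccentricity relative to the tracked gaze point. The 5 and 15 degree thresholds here are illustrative, not from any specific headset:

```python
def shading_level(pixel_angle_deg, gaze_angle_deg):
    """Pick a resolution level from angular distance to the tracked gaze:
    full resolution in the fovea, progressively coarser in the periphery."""
    eccentricity = abs(pixel_angle_deg - gaze_angle_deg)
    if eccentricity < 5:
        return 1      # full resolution
    elif eccentricity < 15:
        return 2      # half resolution
    return 4          # quarter resolution

# Regions far from where the user is looking are shaded at reduced rate,
# cutting graphics throughput with little visible loss of quality.
```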
Foveated Rendering
• https://www.youtube.com/watch?v=lNX0wCdD2LA
Scene Graphs
• Tree-like structure for organising VR graphics
• e.g. VRML, OSG, X3D
• Hierarchy of nodes that define:
• Groups (and Switches, Sequences etc…)
• Transformations
• Projections
• Geometry
• …
• And states and attributes that define:
• Materials and textures
• Lighting and blending
• …
Example Scene Graph
• Car model with four wheels
• Only need one wheel geometry object in scene graph
More Complex
• Everything off the root node
• Parent/child node relationships
• Can move the car by transforming its group node
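The car example can be sketched as a tiny scene graph: one shared wheel geometry instanced under four transform nodes, with transforms accumulated during traversal. Class and method names here are illustrative, not a real scene graph API:

```python
# Minimal scene graph sketch: one wheel mesh shared by four instances.

class Node:
    def __init__(self, name, transform=(0.0, 0.0, 0.0), geometry=None):
        self.name, self.transform, self.geometry = name, transform, geometry
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def world_positions(self, parent=(0.0, 0.0, 0.0)):
        """Accumulate transforms down the tree, as a render traversal would."""
        pos = tuple(p + t for p, t in zip(parent, self.transform))
        found = [(self.name, pos)] if self.geometry else []
        for c in self.children:
            found += c.world_positions(pos)
        return found

wheel_mesh = object()                      # one geometry object, shared 4x
car = Node("car", transform=(10.0, 0.0, 0.0))
for name, offset in [("fl", (1, 0, 1)), ("fr", (1, 0, -1)),
                     ("rl", (-1, 0, 1)), ("rr", (-1, 0, -1))]:
    car.add(Node(name, transform=offset, geometry=wheel_mesh))

# Moving the car is one change to the group node's transform;
# all four wheel instances follow automatically.
wheels = car.world_positions()
```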
Adding Cameras and Lights
• Scene graph includes:
• Cameras
• Lighting
• Material properties
• Etc..
• All passed to renderer
Benefits of Using a Scene Graph
• Performance
• Structuring data facilitates optimization
• Culling, state management, etc…
• Hardware Abstraction
• Underlying graphics pipeline is hidden
• No Low-level programming
• Think about objects, not polygons
• Supports Behaviours
• Collision detection, animation, etc..
Scene Graph Libraries
• VRML/X3D
• descriptive text format, ISO standard
• OpenInventor
• based on C++ and OpenGL
• originally Silicon Graphics, 1988
• now supported by VSG3d.com
• Java3D
• provides 3D data structures in Java
• not supported anymore
• Open Scene Graph (OSG)
• Various Game Engines
• e.g. JMonkey 3 (scene graph based game engine for Java)
Creating a Scene Graph
• Creation of scene graph objects
• Authoring software (e.g. Blender, 3DS Max)
• Assets exported to exchange formats
• e.g. X3D, Wavefront OBJ (.obj), 3ds Max (.3ds), Ogre XML (.mesh)
• Objects are typically tessellated
• Polygon meshes
• Create XML file
• Specify scene graph
• Example:
• JME Scene
Scene Graph in the Rendering Pipeline
• Scene graph used to optimize scene creation in pipeline
OpenSceneGraph
• http://www.openscenegraph.org/
• Open-source scene graph implementation
• Based on OpenGL
• Object-oriented C++ following design pattern principles
• Used for simulation, games, research, and industrial projects
• Active development community
• mailing list, documentation (www.osgbooks.com)
• Uses the OSG Public License (similar to LGPL)
OpenSceneGraph Features
• Plugins for loading and saving
• 3D: 3D Studio (.3ds), OpenFlight (.flt), Wavefront (.obj)…
• 2D: .png, .jpg, .bmp, QuickTime movies
• NodeKits to extend functionality
• osgTerrain - terrain rendering
• osgAnimation - character animation
• osgShadow - shadow framework
• Multi-language support
• C++, Java, Lua and Python
• Cross-platform support:
• Windows, Linux, MacOS, iOS, Android, etc.
OpenSceneGraph Architecture
• Core: scene graph and rendering functionality
• Plugins: read and write 2D image and 3D model files
• NodeKits: extend core functionality, exposing higher-level node types
OpenSceneGraph and Virtual Reality
• Need to create VR wrapper on top of OSG
• Add support for HMDs, device interaction, etc..
• Several viewer nodes available with VR support
• OsgOpenVRViewer: viewing on VR devices compatible with OpenVR/SteamVR
• OsgOculusViewer: OsgViewer with support for the Oculus Rift
Examples
• Using OsgOculusViewer, Leap Motion and Oculus Rift HMD
• https://www.youtube.com/watch?v=xZgyOF-oT0g
High Level Graphics Tools
• Game Engines
• Powerful, need scripting ability
• Unity, Unreal, Cry Engine, etc..
• Combine with VR plugins
• HMDs, input devices, interaction, assets, etc..
Tools for Non-Programmers
• Focus on Design, ease of use
• Visual Programming, content arrangement
• Examples
• Insta-VR – 360 panoramas
• http://www.instavr.co/
• Vizor – VR on the Web
• http://vizor.io/
• A-frame – HTML based
• https://aframe.io/
• Eon Creator – Drag and drop tool for AR/VR
• http://www.eonreality.com/eon-creator/
• Amazon Sumerian – WebGL, multiplatform
• https://aws.amazon.com/sumerian/
Example: InstaVR (360 VR)
• https://www.youtube.com/watch?v=M2C8vDL0YeA
Example: Amazon Sumerian (3D VR)
• https://www.youtube.com/watch?v=_Q3QKFp3zlo
SYSTEM DESIGN
GUIDELINES
System Design Guidelines - I
• Hardware
• Choose HMDs with fast pixel response time, no flicker
• Choose trackers with high update rates, accurate, no drift
• Choose HMDs that are lightweight, comfortable to wear
• Use hand controllers with no line of sight requirements
• System Calibration
• Have virtual FOV match actual FOV of HMD
• Measure and set the user's IPD
• Latency Reduction
• Minimize overall end to end system delay
• Use displays with fast response time and low persistence
• Use latency compensation to reduce perceived latency
Jason Jerald, The VR Book, 2016
System Design Guidelines - II
• General Design
• Design for short user experiences
• Minimize visual stimuli close to the eye (vergence/accommodation conflict)
• For binocular displays, do not use 2D overlays/HUDs
• Design for sitting, or provide physical barriers
• Show virtual warning when user reaches end of tracking area
• Motion Design
• Move virtual viewpoint with actual motion of the user
• If latency is high, avoid tasks requiring fast head motion
• Interface Design
• Design input/interaction for user’s hands at their sides
• Design interactions to be non-repetitive to reduce strain injuries
Jason Jerald, The VR Book, 2016
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
Novel Interfaces for AR Systems
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole Metaverse
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional Interfaces
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR Applications
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 
Comp4010 lecture6 Prototyping
Comp4010 lecture6 PrototypingComp4010 lecture6 Prototyping
Comp4010 lecture6 Prototyping
 

Recently uploaded

New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024BookNet Canada
 
What is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfWhat is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfMounikaPolabathina
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024BookNet Canada
 
Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Hiroshi SHIBATA
 
UiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPathCommunity
 
Time Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsTime Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsNathaniel Shimoni
 
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxDigital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxLoriGlavin3
 
From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .Alan Dix
 
Scale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterScale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterMydbops
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.Curtis Poe
 
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesHow to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesThousandEyes
 
Decarbonising Buildings: Making a net-zero built environment a reality
Decarbonising Buildings: Making a net-zero built environment a realityDecarbonising Buildings: Making a net-zero built environment a reality
Decarbonising Buildings: Making a net-zero built environment a realityIES VE
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxLoriGlavin3
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxLoriGlavin3
 
Sample pptx for embedding into website for demo
Sample pptx for embedding into website for demoSample pptx for embedding into website for demo
Sample pptx for embedding into website for demoHarshalMandlekar2
 
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxPasskey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxLoriGlavin3
 
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...AliaaTarek5
 
A Framework for Development in the AI Age
A Framework for Development in the AI AgeA Framework for Development in the AI Age
A Framework for Development in the AI AgeCprime
 
Manual 508 Accessibility Compliance Audit
Manual 508 Accessibility Compliance AuditManual 508 Accessibility Compliance Audit
Manual 508 Accessibility Compliance AuditSkynet Technologies
 
Potential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and InsightsPotential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and InsightsRavi Sanghani
 

Recently uploaded (20)

New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
What is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfWhat is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdf
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024
 
UiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to Hero
 
Time Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsTime Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directions
 
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxDigital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
 
From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .
 
Scale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterScale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL Router
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.
 
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesHow to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
 
Decarbonising Buildings: Making a net-zero built environment a reality
Decarbonising Buildings: Making a net-zero built environment a realityDecarbonising Buildings: Making a net-zero built environment a reality
Decarbonising Buildings: Making a net-zero built environment a reality
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptx
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
 
Sample pptx for embedding into website for demo
Sample pptx for embedding into website for demoSample pptx for embedding into website for demo
Sample pptx for embedding into website for demo
 
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxPasskey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
 
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
 
A Framework for Development in the AI Age
A Framework for Development in the AI AgeA Framework for Development in the AI Age
A Framework for Development in the AI Age
 
Manual 508 Accessibility Compliance Audit
Manual 508 Accessibility Compliance AuditManual 508 Accessibility Compliance Audit
Manual 508 Accessibility Compliance Audit
 
Potential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and InsightsPotential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and Insights
 

Lecture 4: VR Systems

  • 1. LECTURE 4: VR SYSTEMS COMP 4010 – Virtual Reality Semester 5 - 2019 Bruce Thomas, Mark Billinghurst, Gun Lee University of South Australia August 20th 2019
  • 2. • Survey of VR technologies • Tracking • Haptic/Tactile Displays • Audio Displays • Input Devices Recap – Last Week
  • 3. Tracking in VR • Need for Tracking • User turns their head and the VR graphics scene changes • User wants to walk through a virtual scene • User reaches out and grabs a virtual object • The user wants to use a real prop in VR • All of these require technology to track the user or object • Continuously provide information about position and orientation Head Tracking Hand Tracking
  • 4. Tracking Technologies § Active (device sends out signal) • Mechanical, Magnetic, Ultrasonic • GPS, Wi-Fi, cell location § Passive (device senses world) • Inertial sensors (compass, accelerometer, gyro) • Computer Vision • Marker based, Natural feature tracking § Hybrid Tracking • Combined sensors (e.g. Vision + Inertial)
  • 5. Haptic Feedback • Greatly improves realism • Hands and wrist are most important • High density of touch receptors • Two kinds of feedback: • Touch Feedback • information on texture, temperature, etc. • Does not resist user contact • Force Feedback • information on weight, and inertia. • Actively resists contact motion
  • 6. Active vs. Passive Haptics • Active Haptics • Actively resists motion • Key properties • Force resistance, DOF, latency • Passive Haptics • Not controlled by system • Use real props (e.g. styrofoam for walls)
  • 7. Audio Displays • Spatialization vs. Localization • Spatialization is the processing of sound signals to make them emanate from a point in space • This is a technical topic • Localization is the ability of people to identify the source position of a sound • This is a human topic, i.e., some people are better at it. • Head-Related Transfer Function (HRTF) • Models how sound from a source reaches the eardrum • Needs to be measured for each individual
  • 8. VR Input Devices • Physical devices that convey information into the application and support interaction in the Virtual Environment
  • 9. Multiple Input Devices • Natural • Eye, gaze, full body tracking • Handheld devices • Controllers, gloves • Body worn • Myo armband • Pedestrian devices • Treadmill, ball
  • 10. Mapping Between Input and Output Input Output
  • 11. Comparison Between Devices • Comparing hand and non-hand input (from Jerald, 2015)
  • 13. Creating a Good VR Experience • Creating a good experience requires good system design • Integrating multiple hardware, software, interaction, content elements
  • 14. Example: Shard VR Slide • Ride down the Shard at 100 mph - Multi-sensory VR https://www.youtube.com/watch?v=HNXYoEdBtoU
  • 15. Key Components to Consider • Five key components: • Inputs • Outputs • Computation/Simulation • Content/World database • User interaction From: Sherman, W. R., & Craig, A. B. (2018). Understanding virtual reality: Interface, application, and design. Morgan Kaufmann.
  • 16. Typical VR System • Combining multiple technology elements for good user experience • Input devices, output modality, content databases, networking, etc.
  • 17. From Content to User Modelling Program Content • 3d model • Textures Translation • CAD data Application programming Dynamics Generator Input Devices • Gloves, Mic • Trackers Renderers • 3D, sound Output Devices • HMD, audio • Haptic User Actions • Speak • Grab Software Content User I/O
  • 18. Case Study: Multimodal VR System • US Army project • Simulate control of an unmanned vehicle • Sensors (input) • Head/hand tracking • Gesture, Speech (Multimodal) • Displays (output) • HMD, Audio • Processing • Graphics: Virtual vehicles on battlefield • Speech processing/understanding Neely, H. E., Belvin, R. S., Fox, J. R., & Daily, M. J. (2004, March). Multimodal interaction techniques for situational awareness and command of robotic combat entities. In Aerospace Conference, 2004. Proceedings. 2004 IEEE (Vol. 5, pp. 3297-3305). IEEE.
  • 21. Types of VR Experiences • Immersive Spaces • 360 Panoramas/Movies • High visual quality • Limited interactivity • Changing viewpoint orientation • Immersive Experiences • 3D graphics • Lower visual quality • High interactivity • Movement in space • Interact with objects
  • 22. Types of VR Graphics Content • Panoramas • 360 images/video • Captured 3D content • Scanned objects/spaces • Modelled Content • Hand created 3D models • Existing 3D assets
  • 23. Capturing Panoramas • Stitching individual photos together • Image Composite Editor (Microsoft) • AutoPano (Kolor) • Using 360 camera • Ricoh Theta-S • Fly360
  • 24. Consumer 360 Capture Devices Kodak 360 Fly 360 Gear 360 Theta S Nikon LG 360 Point Grey Ladybug Panono 360 Bublcam
  • 25. Example: Cardboard Camera • Capture 360 panoramas • Stitch together images on phone • View in VR on Google Cardboard Viewer
  • 27. Stereo Video Capture • Use camera pairs to capture stereo 360 video • Samsung 360 Round • 17 lenses, 4K 3D images, live video streaming, $10K USD • Vuze+ VR camera • 8 lenses, 4K stereoscopic 3D 360° video and photo, $999 USD Vuze Samsung
  • 28. Samsung 360 Round • https://www.youtube.com/watch?v=X_ytJJOmVF0
  • 29. 3D Scanning • A range of products support 3D scanning • Create point cloud or mesh model • Typically combine RGB cameras with depth sensing • Captures texture plus geometry • Multi-scale • Object Scanners • Handheld, Desktop • Body Scanners • Rotating platform, multi-camera • Room scale • Mobile, tripod mounted
  • 30. Example: Matterport • Matterport Pro2 3D scanner • Room scale scanner, panorama and 3D model • 360° (left-right) x 300° (vertical) field of view • Structured light (infrared) 3D sensor • 15 ft (4.5 m) maximum range • 4K HDR images
  • 31. Matterport Pro2 Lite • https://www.youtube.com/watch?v=SjHk0Th-j1I
  • 32. Handheld/Desktop Scanners • Capture people/objects • Sense 3D scanner • accuracy of 0.90 mm, colour resolution of 1920×1080 pixels • Occipital Structure sensor • Add-on to iPad, mesh scanning, IR light projection, 60 Hz
  • 34. 3D Modelling • A variety of 3D modelling tools can be used • Export in VR compatible file format (.obj, .fbx, etc) • Especially useful for animation - difficult to create from scans • Popular tools • Blender (free), 3DS max, Maya, etc. • Easy to Use • Tinkercad, Sketchup Free, Meshmixer, Fusion 360, etc.
  • 35. Modelling in VR • Several tools for modelling in VR • Natural interaction, low polygon count, 3D object viewing • Low end • Google Blocks • High end • Quill, Tilt Brush – 3D painting • Gravity Sketch – 3D CAD
  • 36. Example: Google Blocks • https://www.youtube.com/watch?v=1TX81cRqfUU
  • 37. Example: Gravity Sketch • https://www.youtube.com/watch?v=VK2DDnT_3l0
  • 38. Download Existing VR Content • Many locations for 3D objects, textures, etc. • Google Poly - Low polygon VR ready models • Sketchfab, Sketchup, Free3D (www.free3d.com), etc. • Asset stores - Unity, Unreal • Provide 3D models, materials, code, etc..
  • 39. Google Poly • https://poly.google.com/ - search for models you’d like
  • 41. Typical VR Simulation Loop • User moves head, scene updates, displayed graphics change
  • 42. • Need to synchronize system to reduce delays System Delays
  • 43. Typical Delay from Tracking to Rendering System Delay
  • 44. Typical System Delays • Total Delay = 50 + 2 + 33 + 17 = 102 ms • 1 ms delay = 1/3 mm error for object drawn at arm's length • So a total of ~34 mm error from when the user begins moving to when the object is drawn Tracking Calculate Viewpoint Simulation Render Scene Draw to Display x,y,z r,p,y Application Loop 20 Hz = 50ms 500 Hz = 2ms 30 Hz = 33ms 60 Hz = 17ms
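The delay budget on this slide can be sketched in code. The stage rates (20/500/30/60 Hz) and the "1 ms ≈ 1/3 mm at arm's length" rule of thumb come from the slide itself; the function names are illustrative:

```cpp
// Per-stage delay implied by each pipeline stage's update rate.
double stage_delay_ms(double hz) { return 1000.0 / hz; }

// Tracking (20 Hz) + viewpoint calc (500 Hz) + simulation (30 Hz) + render (60 Hz).
double total_delay_ms() {
    return stage_delay_ms(20.0) + stage_delay_ms(500.0)
         + stage_delay_ms(30.0) + stage_delay_ms(60.0);
}

// Slide's rule of thumb: 1 ms of latency is roughly 1/3 mm of positional
// error for an object drawn at arm's length.
double error_at_arms_length_mm(double delay_ms) { return delay_ms / 3.0; }
```

Summing the exact per-stage delays gives ~102 ms end to end, i.e. roughly 34 mm of error at arm's length.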
  • 45. Living with High Latency (1/3 sec – 3 sec) • https://www.youtube.com/watch?v=_fNp37zFn9Q
  • 46. Effects of System Latency • Degraded Visual Acuity • Scene still moving when head stops = motion blur • Degraded Performance • As latency increases it becomes difficult to select objects etc. • If latency > 120 ms, training doesn't improve performance • Breaks-in-Presence • If system delay is high the user doesn't believe they are in VR • Negative Training Effects • Users train to operate in a world with delay • Simulator Sickness • Latency is the greatest cause of simulator sickness
  • 47. Simulator Sickness • Visual input conflicting with vestibular system
  • 48. Many Causes of Simulator Sickness • 25-40% of VR users get Simulator Sickness, due to: • Latency • Major cause of simulator sickness • Tracking accuracy/precision • Seeing world from incorrect position, viewpoint drift • Field of View • Wide field of view creates more peripheral vection = sickness • Refresh Rate/Flicker • Flicker/low refresh rate creates eye fatigue • Vergence/Accommodation Conflict • Creates eye strain over time • Eye separation • If IPD does not match the inter-image distance, discomfort results
  • 50. How to Reduce System Delays • Use faster components • Faster CPU, display, etc. • Reduce the apparent lag (Time Warp) • Take tracking measurement just before rendering • Remove tracker from the loop • Use predictive tracking • Use fast inertial sensors to predict where user will be looking • Difficult due to erratic head movements Jerald, J. (2004). Latency compensation for head-mounted virtual reality. UNC Computer Science Technical Report.
  • 51. Reducing System Lag Tracking Calculate Viewpoint Simulation Render Scene Draw to Display x,y,z r,p,y Application Loop Faster Tracker Faster CPU Faster GPU Faster Display
  • 52. Reducing Apparent Lag (Time Warp) Tracking Update x,y,z r,p,y Virtual Display Physical Display (640x480) 1280 x 960 Last known position Virtual Display Physical Display (640x480) 1280 x 960 Latest position Tracking Calculate Viewpoint Simulation Render Scene Draw to Display x,y,z r,p,y Application Loop • Create virtual display larger than physical display and move at last minute
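A minimal sketch of the time-warp idea above: render into the oversized 1280x960 virtual display, then just before scan-out choose which 640x480 window to show based on the latest head rotation. The 640x480/1280x960 sizes come from the slide; the 60° FOV, the pixels-per-degree mapping, and the function name are illustrative assumptions:

```cpp
#include <algorithm>
#include <cmath>

struct Crop { int x, y; };  // top-left of the physical window inside the virtual display

// yawDeltaDeg/pitchDeltaDeg: head rotation measured since the frame was rendered.
Crop timewarp_crop(double yawDeltaDeg, double pitchDeltaDeg) {
    const int physW = 640, physH = 480;    // physical display (slide's numbers)
    const int virtW = 1280, virtH = 960;   // oversized virtual display
    const double fovDeg = 60.0;            // assumed horizontal FOV
    const double ppd = physW / fovDeg;     // pixels per degree

    // Start centred, then shift the crop window by the rotation delta.
    int cx = (virtW - physW) / 2 + static_cast<int>(std::lround(yawDeltaDeg * ppd));
    int cy = (virtH - physH) / 2 - static_cast<int>(std::lround(pitchDeltaDeg * ppd));

    // Keep the window inside the rendered area.
    cx = std::clamp(cx, 0, virtW - physW);
    cy = std::clamp(cy, 0, virtH - physH);
    return {cx, cy};
}
```

With no head motion since rendering, the crop stays centred at (320, 240); a late yaw shifts it sideways without re-rendering the scene.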
  • 53. Predictive Tracking for Reducing Latency • Use additional sensors (e.g. inertial) to predict future position • Can reliably predict up to 80 ms in future (Holloway) • Use Kalman filters or similar to smooth prediction [graph: predicted position over time, past/now/future]
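The simplest form of this prediction is constant-velocity extrapolation, which real systems smooth with a Kalman filter as the slide notes. This is a minimal sketch: the 80 ms horizon follows the slide, while the struct and sample values are illustrative:

```cpp
struct Sample { double t_s; double yaw_deg; };  // tracker sample: time (s), yaw (deg)

// Estimate angular velocity from the last two samples (an inertial gyro
// would report it directly) and extrapolate a fixed horizon into the future.
double predict_yaw(const Sample& prev, const Sample& curr, double horizon_s) {
    double vel = (curr.yaw_deg - prev.yaw_deg) / (curr.t_s - prev.t_s);  // deg/s
    return curr.yaw_deg + vel * horizon_s;
}
```

For erratic head motion this linear model overshoots, which is why filtering (Kalman or similar) is needed in practice.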
  • 56. VR Graphics Architecture/Tools • Rendering Layer (GPU acceleration) [OpenGL] • Low level graphics code • Rendering pixels/polygons • Interface with graphics card/frame buffer • Graphics Layer (CPU acceleration) [X3D, OSG] • Scene graph specification • Object physics engine • Specifying graphics objects • Application Layer [Unity, Unreal] • User interface libraries • Simulation/behaviour code • User interaction specification
  • 57. • Low level code for loading models and showing on screen • Using shaders and low level GPU programming to improve graphics Traditional 3D Graphics Pipeline
  • 58. Graphics Challenges with VR • Higher data throughput (> 7x desktop requirement) • Lower latency requirements (from 150ms/frame to 20ms) • HMD Lens distortion
  • 59. Lens Distortion • HMD may have cheap lenses • Creates chromatic aberration and a distorted image • Warp graphics images to create an undistorted view • Use low-level shader programming
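The pre-warp is typically a radial polynomial evaluated per pixel in a fragment shader; the same math on the CPU looks like this. The k1/k2 coefficients are illustrative, not those of any specific headset:

```cpp
struct Vec2 { double x, y; };

// Barrel pre-distortion: scale each point's distance from the lens centre
// by a radial polynomial so the lens's pincushion distortion cancels it out.
// Coordinates are normalized, with (0,0) at the lens centre.
Vec2 predistort(Vec2 p, double k1, double k2) {
    double r2 = p.x * p.x + p.y * p.y;            // squared radius
    double scale = 1.0 + k1 * r2 + k2 * r2 * r2;  // 1.0 at the centre
    return { p.x * scale, p.y * scale };
}
```

Points at the lens centre are unchanged while points near the edge are pushed outward, producing the barrel-shaped image the lens then straightens.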
  • 60. VR System Pipeline • Using time warping and lens distortion
  • 61. Perception Based Graphics • Eye Physiology • Cones in eye centre = colour vision, rods in periphery = motion, B+W • Foveated Rendering • Use eye tracking to draw highest resolution where the user is looking • Reduces graphics throughput
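A sketch of the foveated-rendering selection logic: pick a coarser shading rate as the angular distance from the tracked gaze point grows. The eccentricity thresholds below are illustrative assumptions, not values from the slide:

```cpp
// Shading rate (1 = full resolution, N = shade every Nth pixel) chosen from
// the angular distance between a pixel and the gaze point reported by the
// eye tracker. Threshold values are illustrative.
int shading_rate(double eccentricityDeg) {
    if (eccentricityDeg < 5.0)  return 1;  // fovea: full detail
    if (eccentricityDeg < 20.0) return 2;  // near periphery: half rate
    return 4;                              // far periphery: quarter rate
}
```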
  • 63. Scene Graphs • Tree-like structure for organising VR graphics • e.g. VRML, OSG, X3D • Hierarchy of nodes that define: • Groups (and Switches, Sequences etc…) • Transformations • Projections • Geometry • … • And states and attributes that define: • Materials and textures • Lighting and blending • …
  • 64. Example Scene Graph • Car model with four wheels • Only need one wheel geometry object in scene graph
  • 65. More Complex • Everything off root node • Parent/child node relationships • Can move car by transforming group node
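The car example above can be sketched as a toy scene graph: four Transform nodes all reference one shared wheel Geometry, and moving the top Transform moves the whole car. This is a generic illustration of the idea, not the OSG or X3D API:

```cpp
#include <memory>
#include <vector>

// Minimal scene-graph nodes: a Node holds children, a Transform also
// positions its subtree, a Geometry is a leaf holding mesh data.
struct Node {
    std::vector<std::shared_ptr<Node>> children;
    void add(std::shared_ptr<Node> c) { children.push_back(std::move(c)); }
    virtual ~Node() = default;
};
struct Geometry : Node {};
struct Transform : Node { double x = 0, y = 0; };

// Build the car: one body, four wheel instances sharing a single Geometry.
std::shared_ptr<Transform> build_car() {
    auto car = std::make_shared<Transform>();   // move the whole car via this node
    car->add(std::make_shared<Geometry>());     // body mesh
    auto wheel = std::make_shared<Geometry>();  // the single wheel mesh
    const double pos[4][2] = {{-1,-1}, {1,-1}, {-1,1}, {1,1}};
    for (auto& p : pos) {
        auto t = std::make_shared<Transform>();
        t->x = p[0]; t->y = p[1];
        t->add(wheel);                          // shared instance, not a copy
        car->add(t);
    }
    return car;
}
```

Because the four wheel transforms point at the same Geometry, the mesh exists once in memory and once on the GPU, which is the optimization the slide describes.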
  • 66. Adding Cameras and Lights • Scene graph includes: • Cameras • Lighting • Material properties • Etc.. • All passed to renderer
  • 67. Benefits of Using a Scene Graph • Performance • Structuring data facilitates optimization • Culling, state management, etc… • Hardware Abstraction • Underlying graphics pipeline is hidden • No Low-level programming • Think about objects, not polygons • Supports Behaviours • Collision detection, animation, etc..
  • 68. Scene Graph Libraries • VRML/X3D • descriptive text format, ISO standard • OpenInventor • based on C++ and OpenGL • originally Silicon Graphics, 1988 • now supported by VSG3d.com • Java3D • provides 3D data structures in Java • not supported anymore • Open Scene Graph (OSG) • Various Game Engines • e.g. JMonkey 3 (scene graph based game engine for Java)
  • 69. Creating a Scene Graph • Creation of scene graph objects • Authoring software (e.g. Blender, 3DS Max) • Assets exported to exchange formats • E.g. (X3D,) Wavefront OBJ (.obj), 3ds Max (.3ds), Ogre XML (.mesh) • Objects typically are tessellated • Polygon meshes • Create XML file • Specify scene graph • Example: • JME Scene
  • 70. Scene Graph in the Rendering Pipeline • Scene graph used to optimize scene creation in pipeline
  • 71. OpenSceneGraph • http://www.openscenegraph.org/ • Open-source scene graph implementation • Based on OpenGL • Object-oriented C++ following design pattern principles • Used for simulation, games, research, and industrial projects • Active development community • mailing list, documentation (www.osgbooks.com) • Uses the OSG Public License (similar to LGPL)
  • 72. OpenSceneGraph Features • Plugins for loading and saving • 3D: 3D Studio (.3ds), OpenFlight (.flt), Wavefront (.obj)… • 2D: .png, .jpg, .bmp, QuickTime movies • NodeKits to extend functionality • osgTerrain - terrain rendering • osgAnimation - character animation • osgShadow - shadow framework • Multi-language support • C++, Java, Lua and Python • Cross-platform support: • Windows, Linux, MacOS, iOS, Android, etc.
  • 73. OpenSceneGraph Architecture Scene graph and Rendering functionality Plugins read and write 2D image and 3D model files NodeKits extend core functionality, exposing higher-level node types
  • 74. OpenSceneGraph and Virtual Reality • Need to create VR wrapper on top of OSG • Add support for HMDs, device interaction, etc. • Several viewer nodes available with VR support • OsgOpenVRViewer: viewing on VR devices compatible with OpenVR/SteamVR • OsgOculusViewer: OsgViewer with support for the Oculus Rift
  • 75. Examples • Using OsgOculusViewer, Leap Motion and Oculus Rift HMD • https://www.youtube.com/watch?v=xZgyOF-oT0g
  • 76. High Level Graphics Tools • Game Engines • Powerful, need scripting ability • Unity, Unreal, Cry Engine, etc.. • Combine with VR plugins • HMDs, input devices, interaction, assets, etc..
  • 77. Tools for Non-Programmers • Focus on Design, ease of use • Visual Programming, content arrangement • Examples • InstaVR – 360 panoramas • http://www.instavr.co/ • Vizor – VR on the Web • http://vizor.io/ • A-frame – HTML based • https://aframe.io/ • Eon Creator – Drag and drop tool for AR/VR • http://www.eonreality.com/eon-creator/ • Amazon Sumerian – WebGL, multiplatform • https://aws.amazon.com/sumerian/
  • 78. Example: InstaVR (360 VR) • https://www.youtube.com/watch?v=M2C8vDL0YeA
  • 79. Example: Amazon Sumerian (3D VR) • https://www.youtube.com/watch?v=_Q3QKFp3zlo
  • 81. System Design Guidelines - I • Hardware • Choose HMDs with fast pixel response time, no flicker • Choose trackers with high update rates, accurate, no drift • Choose HMDs that are lightweight, comfortable to wear • Use hand controllers with no line-of-sight requirements • System Calibration • Have virtual FOV match actual FOV of HMD • Measure and set user's IPD • Latency Reduction • Minimize overall end-to-end system delay • Use displays with fast response time and low persistence • Use latency compensation to reduce perceived latency Jason Jerald, The VR Book, 2016
  • 82. System Design Guidelines - II • General Design • Design for short user experiences • Minimize visual stimuli close to the eye (vergence/accommodation) • For binocular displays, do not use 2D overlays/HUDs • Design for sitting, or provide physical barriers • Show a virtual warning when the user reaches the end of the tracking area • Motion Design • Move virtual viewpoint with actual motion of the user • If latency is high, avoid tasks requiring fast head motion • Interface Design • Design input/interaction for user's hands at their sides • Design interactions to be non-repetitive to reduce strain injuries Jason Jerald, The VR Book, 2016