LECTURE 5: VR AUDIO
AND TRACKING
COMP 4010 – Virtual Reality
Semester 5 – 2016
Bruce Thomas, Mark Billinghurst
University of South Australia
August 23rd 2016
Recap – Last Week
•  Visual Displays
•  Head Mounted Display
•  Vive, Mobile VR
•  Projection/Large Screen Display
•  CAVE, Allosphere
•  Haptic Displays
•  Active Haptics
•  Actively Resist Motion
•  Passive Haptics
•  Physical Props
•  Tactile Displays
•  Vibrating actuators
AUDIO DISPLAYS
Audio Displays
Definition: Computer interfaces that provide
synthetic sound feedback to users interacting with
the virtual world.
The sound can be monaural (both ears hear the
same sound) or binaural (each ear hears a
different sound)
Burdea, Coiffet (2003)
Virtual Reality Audio Overview
•  https://www.youtube.com/watch?v=yUlnMbxTuY0
Motivation
• Most of the focus in Virtual Reality is on the visuals
•  GPUs continue to drive the field
•  Users want more
•  More realism, More complexity, More speed
• However sound can significantly enhance realism
•  Example: Mood music in horror games
• Sound can provide valuable user interface feedback
•  Example: Alert in training simulation
Creating/Capturing Sounds
•  Sounds can be captured from nature (sampled) or
synthesized computationally
•  High-quality recorded sounds are
•  Cheap to play
•  Easy to create realism
•  Expensive to store and load
•  Difficult to manipulate for expressiveness
•  Synthetic sounds are
•  Cheap to store and load
•  Easy to manipulate
•  Expensive to compute before playing
•  Difficult to create realism
Types of Audio Recordings
•  Monaural: Recording with one microphone – no positioning
•  Stereo Sound: Recording with two microphones placed several
feet apart. Perceived sound position as recorded by
microphones.
•  Binaural: Recording microphones embedded in a dummy
head. Audio filtered by head shape.
•  3D Sound: Using tiny microphones in the ears of a real person.
Generate HRTF based on ear shape and audio response.
Synthetic Sounds
•  Complex sounds can be built from simple waveforms
(e.g., sawtooth, sine) and combined using operators
•  Waveform parameters (frequency, amplitude) could be
taken from motion data, such as object velocity
•  Can combine wave forms in various ways
•  This is what classic synthesizers do
•  Works well for many non-speech sounds
Combining Wave Forms
•  Adding up waves creates new waves
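As a rough illustration of this idea (a sketch, not from the lecture), the Python snippet below adds a sine and a sawtooth into a new waveform and scales the amplitude by a motion parameter such as object velocity; all names and coefficients are illustrative.

```python
import numpy as np

def synth_tone(freq=440.0, duration_s=1.0, rate=44100, velocity=1.0):
    """Additive synthesis: sum simple waveforms to build a richer one."""
    t = np.linspace(0.0, duration_s, int(rate * duration_s), endpoint=False)
    sine = np.sin(2 * np.pi * freq * t)                 # pure tone
    saw = 2.0 * (t * freq - np.floor(0.5 + t * freq))   # sawtooth in [-1, 1]
    wave = 0.6 * sine + 0.4 * saw                       # adding waves creates a new wave
    return np.clip(velocity, 0.0, 1.0) * wave           # amplitude from motion data

samples = synth_tone(freq=220.0, velocity=0.8)          # ready for an audio buffer
```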
Digital Audio Workstation Software
•  Software for recording, editing, producing audio files
•  Mixing console, synthesizer, waveform editor, etc
•  Wide variety available
•  https://en.wikipedia.org/wiki/Digital_audio_workstation
Typical Audio Display Properties
Presentation Properties
•  Number of channels
•  Sound stage
•  Localization
•  Masking
•  Amplification
Logistical Properties
•  Noise pollution
•  User mobility
•  Interface with tracking
•  Environmental requirements
•  Integration
•  Portability
•  Throughput
•  Cumber
•  Safety
•  Cost
Channels and Masking
• Number of channels
•  Stereo vs. mono vs. quadraphonic
•  2.1, 5.1, 7.1
• Two kinds of masking
•  Louder sounds mask softer ones
•  We have too many things vying for our audio attention these days!
•  Physical objects mask sound signals
•  Happens with speakers, but not with headphones
Audio Displays: Head-worn
•  Ear Buds
•  On Ear
•  Open Back
•  Closed
•  Bone Conduction
Audio Displays: Room Mounted
•  Stereo, 5.1, 7.1, 11.1, etc
•  Sound cube
11.1 Speaker Array
Spatialization vs. Localization
• Spatialization is the processing of sound signals
to make them emanate from a point in space
• This is a technical topic
• Localization is the ability of people to identify the
source position of a sound
• This is a human topic, i.e., some people are
better at it than others.
Stereo Sound
•  Seems to come from inside the user's head
•  Follows head motion as the user moves their head
3D Spatial Sound
•  Seems to be external to the head
•  Fixed in space when user moves head
•  Has reflected sound properties
Spatialized Audio Effects
• Naïve approach
•  Simple left/right shift for lateral position
•  Amplitude adjustment for distance
• Easy to produce using consumer hardware/software (see the sketch after this list)
• Does not give us "true" realism in sound
•  No up/down or front/back cues
• We can use multiple speakers for this
•  Surround the user with speakers
•  Send different sound signals to each one
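A minimal sketch of the naïve approach above (assumed code, not the lecture's): a mono source and listener on a 2D plane, with constant-power left/right panning from the lateral offset plus 1/distance attenuation. Note what it cannot do: no elevation or front/back cues, exactly the limitation listed above.

```python
import numpy as np

def naive_spatialize(mono, src_pos, listener_pos):
    """Rough stereo placement: left/right pan from lateral angle,
    1/r amplitude falloff with distance. No up/down or front/back cues."""
    dx, dy = src_pos[0] - listener_pos[0], src_pos[1] - listener_pos[1]
    dist = max(np.hypot(dx, dy), 1.0)        # clamp to avoid blow-up near source
    pan = np.clip(dx / dist, -1.0, 1.0)      # -1 = hard left, +1 = hard right
    theta = (pan + 1.0) * np.pi / 4.0        # constant-power pan law
    gain = 1.0 / dist
    left = gain * np.cos(theta) * mono
    right = gain * np.sin(theta) * mono
    return np.stack([left, right], axis=-1)  # (samples, 2) stereo buffer
```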
Example: The BoomRoom
•  Use surround speakers to create spatial audio effects
•  Gesture based interaction
•  https://www.youtube.com/watch?time_continue=54&v=6RQMOyQ3lyg
Audio Localization
• Main cues used by humans to localize sound:
1.  Interaural time differences: Time difference for
sound wave to travel between ears
2.  Interaural level differences: For high frequency
sounds (> 1.5 kHz), volume difference between
ears used to determine source direction
3.  Spectral filtering done by outer ears: Ear shape
changes frequency heard
Interaural Time Difference
•  Sound takes a fixed time to travel between the ears
•  Can use time difference to determine sound location
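The ITD cue can be approximated with the classic Woodworth spherical-head formula, ITD = (r/c)(θ + sin θ). This formula is standard in the spatial-audio literature, though it is not given on the slide; a short sketch:

```python
import math

def itd_woodworth(azimuth_rad, head_radius_m=0.0875, speed_of_sound=343.0):
    """Woodworth spherical-head approximation of interaural time difference.
    Azimuth 0 = straight ahead, pi/2 = directly to one side."""
    return (head_radius_m / speed_of_sound) * (azimuth_rad + math.sin(azimuth_rad))

print(itd_woodworth(math.pi / 2))  # ~0.00066 s: the maximum ITD, source at one side
```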
Spectral Filtering
Ear shape filters sound depending on direction it is coming from.
This change in frequency determines sound source elevation.
Natural Hearing vs. Headphones
•  Due to ear shape, natural hearing provides a different audio
response depending on sound location
Head-Related Transfer Functions (HRTFs)
• A set of functions that model how sound from a
source at a known location reaches the eardrum
More About HRTFs
• Functions take into account:
•  Individual ear shape
•  Slope of shoulders
•  Head shape
• So, each person has his/her own HRTF!
•  Need to have parameterizable HRTFs
• Some sound cards/APIs allow specifying an HRTF
Constructing HRTFs
• Small microphones placed into ear canals
• Subject sits in an anechoic chamber
•  Can use a mannequin's head instead
• Sounds played from a large number of known
locations around the chamber
•  HRTFs are constructed from this data
• Sound signal is filtered through inverse functions
to place the sound at the desired source position
Constructing HRTFs
•  Putting microphones in mannequin or human ears
•  Playing sound from fixed positions
•  Record response
How HRTFs are Used
•  HRTF is the Fourier transform of the
in-ear microphone audio response
(head related impulse response
(HRIR))
•  From HRTF we can calculate pairs
of finite impulse response (FIR)
filters for specific sound positions
•  One filter per ear
•  To place virtual sound at a position,
apply set of FIR filters for that
position to the incoming sound
HRTF Processing
•  Input sound is convolved with FIR to generate L/R outputs
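A minimal sketch of this step using SciPy (assumed tooling, not the lecture's pipeline): convolve a mono signal with the left and right head-related impulse responses (the FIR filter pair) for one source position to get the binaural output.

```python
import numpy as np
from scipy.signal import fftconvolve

def apply_hrir(mono, hrir_left, hrir_right):
    """Convolve a mono source with the per-ear HRIR FIR filters
    for one source position, producing a (samples, 2) stereo buffer."""
    left = fftconvolve(mono, hrir_left, mode="full")
    right = fftconvolve(mono, hrir_right, mode="full")
    n = max(len(left), len(right))
    out = np.zeros((n, 2))
    out[:len(left), 0] = left     # left-ear channel
    out[:len(right), 1] = right   # right-ear channel
    return out
```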
Environmental Effects
• Sound is also changed by objects in the environment
•  Can reverberate off of reflective objects
•  Can be absorbed by objects
•  Can be occluded by objects
• Doppler shift
•  Moving sound sources
• Need to simulate environmental audio properties
•  Takes significant processing power
Sound Reverberation
•  Need to consider first and second order reflections
•  Need to model material properties, objects in room, etc
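As a sketch of first-order reflection modelling (an assumed image-source scheme, not the lecture's renderer): each reflection is a delayed, attenuated copy of the source, with the wall's absorption further scaling the reflected path.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def add_first_order_reflection(mono, rate, d_direct, d_reflected, absorption=0.5):
    """Mix the direct path with one wall reflection; each path is delayed
    by distance / c and attenuated by 1/distance, and the reflection is
    also scaled by the wall's absorption coefficient."""
    def delayed(gain, dist):
        delay = int(round(rate * dist / SPEED_OF_SOUND))
        out = np.zeros(len(mono) + delay)
        out[delay:] = (gain / max(dist, 1.0)) * mono
        return out

    direct = delayed(1.0, d_direct)
    refl = delayed(1.0 - absorption, d_reflected)
    n = max(len(direct), len(refl))
    direct = np.pad(direct, (0, n - len(direct)))
    refl = np.pad(refl, (0, n - len(refl)))
    return direct + refl
```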
The Tough Part
•  All of this takes a lot of processing
•  Need to keep track of
•  Multiple (possibly moving) sound sources
•  Path of sounds through a dynamic environment
•  Position and orientation of listener(s)
•  Most sound cards only support a limited number of
spatialized sound channels
•  Increasingly complex geometry increases load on
audio system as well as visuals
•  That's why we fake it ;-)
•  GPUs might change this too!
Sound Display Hardware
•  Designed to reduce CPU load
•  Early Hardware
•  Custom HRTF
•  Crystal River Engineering Convolvotron (1988)
•  Real time 3D audio localizer, 4 sound sources
•  Lake Technology (2002)
•  Huron 20, custom DSP hardware, $40,000
•  Modern Consumer Hardware
•  Uses generic HRTF
•  SoundBlaster Audigy/EAX
•  Aureal A3D/Vortex card
Convolvotron Block Diagram
For N sound sources
GPU Based Audio Acceleration
•  Using GPU for audio physics calculations
•  AMD TrueAudio Next
•  https://www.youtube.com/watch?v=Z6nwYLHG8PU
Audio Software SDKs
•  Modern CPUs are fast enough that spatial audio can be
generated without dedicated hardware
•  Several 3D audio SDKs exist
•  OpenAL
•  www.openal.org
•  Open source, cross platform
•  Renders multichannel three-dimensional positional audio
•  Google VR SDK
•  Android, iOS, Unity
•  https://developers.google.com/vr/concepts/spatial-audio
•  Oculus
•  https://developer3.oculus.com/documentation/audiosdk/latest/
•  Microsoft DirectX, Unity, etc
Google VR Spatial Audio Demo
•  https://www.youtube.com/watch?v=I9zf4hCjRg0&feature=youtu.be
OSSIC 3D Audio Headphones
•  3D audio headphones
•  Calibrates to user – calculates HRTF
•  Integrated head tracking
•  Multi-driver array providing sound to correct part of ear
•  Raised $2.7 million on Kickstarter
•  https://www.ossic.com/3d-audio/
Ossic vs. Traditional Headphone
•  Provides frequency reproduction of real sound
OSSIC vs. Generic Headphone
•  Sound source localization (T = target)
OSSIC Technology
•  https://www.youtube.com/watch?time_continue=71&v=ko-VeQ7Aflg
Designing Spatial Audio
•  There are several tools available for designing 3D audio
•  E.g. Facebook Spatial Workstation
•  Audio tools for cinematic VR and 360 video
•  https://facebook360.fb.com/spatial-workstation/
•  Spatial Audio Designer
•  Mixing of surround sound and 3D audio
•  http://www.newaudiotechnology.com/en/products/spatial-audio-designer/
Demo: Spatial Audio In VR
•  AltspaceVR spatial audio for speaker discrimination
•  https://www.youtube.com/watch?v=dV3Qog44z6E
TRACKING
Immersion and Tracking
• Motivation: For immersion, when the user changes
position in reality the VR view also needs to change
•  Requires tracking of the user’s pose (position/orientation)
in the real world and mapping to the Virtual World
Definitions
• Tracking: measuring the
position and orientation of an
object relative to a known
frame of reference
• VR Tracker: technology used
in VR to measure the real
time change in a 3D object
position and orientation
Ivan Sutherland's mechanical tracker (1968)
•  Frames of Reference
•  Real World Coordinate System (Wcs)
•  Head Coordinate System (Hcs)
•  Eye Coordinate System (Ecs)
•  Need to create a mapping between Frames
•  E.g. Transformation from Wcs to Hcs to Ecs
•  Movement in real world maps to movement in Ecs frame
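A minimal sketch of composing these frames with 4×4 homogeneous transforms (the offsets are made-up numbers): a point known in the world frame is mapped into the eye frame via Wcs → Hcs → Ecs.

```python
import numpy as np

def transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical poses: head tracked in the world frame, eye offset in the head frame.
T_world_head = transform(np.eye(3), [0.0, 1.7, 0.0])   # head 1.7 m above world origin
T_head_eye = transform(np.eye(3), [0.03, 0.08, 0.0])   # eye offset from head centre

# Compose Wcs -> Hcs -> Ecs: T_world_eye maps eye-frame points to world frame,
# so its inverse takes a world-space point into eye space.
T_world_eye = T_world_head @ T_head_eye
p_world = np.array([0.0, 1.0, -2.0, 1.0])              # point 2 m in front of the user
p_eye = np.linalg.inv(T_world_eye) @ p_world
print(p_eye)
```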
Frames of Reference
Example Frames of Reference
Assuming head tracker mounted on HMD
Assuming tracking relative to a fixed table object
Tracking Degrees of Freedom
• Typically 6 Degrees of Freedom (DOF)
• Rotation or Translation about an Axis
1.  Moving up and down
2.  Moving left and right
3.  Moving forward and backward
4.  Tilting forward and backward (pitching)
5.  Turning left and right (yawing)
6.  Tilting side to side (rolling)
Key Tracking Performance Criteria
• Static Accuracy
• Dynamic Accuracy
• Latency
• Update Rate
• Tracking Jitter
• Signal to Noise Ratio
• Tracking Drift
Static vs. Dynamic Accuracy
•  Static Accuracy
•  Ability of tracker to determine
coordinates of a position in space
•  Depends on sensor sensitivity, errors
(algorithm, operator), environment
•  Dynamic Accuracy
•  System accuracy as sensor moves
•  Depends on static accuracy
•  Resolution
•  Minimum change sensor can detect
•  Repeatability
•  Same input giving same output
Tracker Latency, Update Rate
•  Latency: Time between change
in object pose and time sensor
detects the change
•  Large latency (> 10 ms) can cause
simulator sickness
•  Larger latency (> 50 ms) can
reduce VR immersion
•  Update Rate: Number of
measurements per second
•  Typically > 30 Hz
Tracker Jitter, Signal to Noise Ratio
•  Jitter: Change in tracker output
when tracked object is stationary
•  Range of change is sensor noise
•  A tracker with no jitter reports a constant
value when the tracked object is stationary
•  Jitter makes tracker data change
randomly about the average value
•  Signal to Noise Ratio: Signal in
data relative to noise
•  Found by calculating the mean of
samples at known positions
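Both quantities are easy to estimate from repeated readings of a stationary target; a small sketch with made-up numbers:

```python
import numpy as np

def jitter_and_snr(samples):
    """Given repeated tracker readings of a stationary target (one axis):
    jitter is the spread about the mean, SNR compares signal to that noise."""
    samples = np.asarray(samples, dtype=float)
    mean = samples.mean()          # best estimate of the true position
    noise = samples.std()          # jitter: random change about the average
    snr_db = 20 * np.log10(abs(mean) / noise) if noise > 0 else float("inf")
    return mean, noise, snr_db

readings = [100.02, 99.98, 100.05, 99.96, 100.01]  # mm, hypothetical stationary probe
print(jitter_and_snr(readings))
```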
Tracker Drift
•  Drift: Steady increase in
tracker error over time
•  Accumulative (additive) error
over time
•  Relative to dynamic sensitivity
over time
•  Controlled by periodic
recalibration (zeroing)
Tracking Technologies
•  Mechanical
•  Physical Linkage
•  Electromagnetic
•  Magnetic sensing
•  Inertial
•  Accelerometer, MEMs
•  Acoustic
•  Ultrasonic
•  Optical
•  Computer Vision
•  Hybrid
•  Combination of Technologies
Contact-less
Contact-based
Mechanical Tracker
• Idea: mechanical arms with joint sensors
• ++: high accuracy, low jitter, low latency
• -- : cumbersome, limited range, fixed position
Microscribe Sutherland
Example: Fakespace BOOM
•  BOOM (Binocular Omni-Orientation Monitor)
•  Counterbalanced arm with 100° FOV HMD mounted on it
•  6 DOF, 4mm position accuracy, 300Hz sampling, < 5 ms latency
Demo: Fakespace Telepresence
•  Using Boom with HMD to control robot view
•  https://www.youtube.com/watch?v=QpTQTu7A6SI
Magnetic Tracker
• Idea: Measure difference in current between a
magnetic transmitter and a receiver
• ++: 6DOF, robust, accurate, no line of sight needed
• -- : limited range, sensitive to metal, noisy, expensive
Flock of Birds (Ascension)
Example: Polhemus Fastrak
•  Degrees-of-Freedom: 6DOF
•  Number of Sensors: 1-4
•  Latency: 4ms
•  Update Rate: 120 Hz/(num sensors)
•  Static Accuracy Position: 0.03in RMS
•  Static Accuracy Orientation: 0.15° RMS
•  Range from Standard Source: Up to 5 feet or 1.52 meters
•  Extended Range Source: Up to 15 feet or 4.6 meters
•  Interface RS-232 or USB (both included)
•  Host OS compatibility (GUI/API toolkit): Windows 2000/XP
•  http://polhemus.com/motion-tracking/all-trackers/fastrak
Polhemus Tracker Demo
•  https://www.youtube.com/watch?v=7DlEfd0VH_o
Polhemus Magnetic Tracking Error
Example: Razer Hydra
•  Developed by Sixense
•  Magnetic source + 2 wired controllers
•  Short range (< 1 m), precision of 1mm and 1°
•  62Hz sampling rate, < 50 ms latency
•  $600 USD
Razer Hydra Demo
•  https://www.youtube.com/watch?v=jnqFdSa5p7w
Inertial Tracker
• Idea: Measuring linear acceleration and angular velocity
(accelerometer/gyroscope)
• ++: no transmitter, cheap, small, high sample rate, wireless
• -- : drift, hysteresis, noise, only 3DOF
IS300 (Intersense)
Wii Remote
Types of Inertial Trackers
• Gyroscopes
•  Measure the rate of change of object orientation
(angular velocity)
• Accelerometers
•  Measure acceleration.
•  Can be used to determine object position, if the starting
point is known.
• Inclinometer
•  Measures inclination, the "level" position
•  Like a carpenter's level, but giving an electrical signal
Example: MEMS Sensor
•  Uses spring-supported load
•  Reacts to gravity and inertia
•  Changes its electrical parameters
•  < 5 ms latency, 0.01° accuracy
•  Up to 1000Hz sampling
•  Problems
•  Rapidly accumulating errors.
•  Error in position increases with the square of time.
•  Cheap units can get position drift of 4 cm in 2 seconds.
•  Expensive units have same error in 200 seconds.
•  Not good for measuring location
•  Need to periodically reset the output
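The quadratic error growth is easy to see by dead-reckoning a biased accelerometer: a constant bias b gives a position error of ½bt². In this sketch (illustrative numbers), a 0.02 m/s² bias on a stationary sensor produces roughly the 4 cm drift after 2 seconds quoted above.

```python
import numpy as np

def dead_reckon(accel, dt):
    """Integrate acceleration twice to get position (no corrections)."""
    velocity = np.cumsum(accel) * dt
    position = np.cumsum(velocity) * dt
    return position

dt = 0.001                          # 1000 Hz sampling
t = np.arange(0.0, 2.0, dt)
true_accel = np.zeros_like(t)       # the sensor is actually stationary
bias = 0.02                         # m/s^2, constant sensor bias
pos = dead_reckon(true_accel + bias, dt)
print(pos[-1])                      # ~0.04 m: error grew with t**2
```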
Demo: MEMS Sensor Working
•  https://www.youtube.com/watch?v=9eSnxebfuxg
MEMS Gyro Bias Drift
•  Zero reading of MEMS Gyro drifts over time due to noise
Example: iPhone Sensors
•  Three-axis accelerometer
•  Gives direction of acceleration -
affected by gravity and movement
•  Three-axis gyroscope
•  Measures rate of rotation (angular
velocity) – affected by movement
•  Three axis magnetometer
•  Gives (approximate) direction of
magnetic north
•  GPS
•  Gives geolocation – multiple
samples over time can be used to
detect direction and speed
iPhone Sensor Monitor app
Acoustic - Ultrasonic Tracker
• Idea: Time-of-Flight or Phase-Coherence of Sound Waves
• ++: Small, Cheap
• -- : 3DOF, Line of Sight, Low resolution, Affected by
Environment (pressure, temperature), Low sampling rate
Ultrasonic
Logitech IS600
Acoustic Tracking Methods
•  Two approaches:
•  Time difference
•  Phase difference
•  Time-of-flight (TOF):
•  All current commercial systems
•  Time that sound pulse travels is proportional to distance from the receiver.
•  Problem: differentiating the pulse from noise.
•  Each transmitter works sequentially – increased latency.
•  Phase coherent approach (Sutherland 1968):
•  No pulse, but continuous signal (~50 kHz)
•  Many transmitters on different frequencies
•  Sent and received signal phase differences continuously give the change
in distance, with no latency
•  Only relative distance; cumulative & multi-path errors possible.
Acoustic Tracking Principles
•  Measurements are based on triangulation
•  Minimum distances at transmitter and receiver required.
•  Can be a problem if trying to make the receiver very small.
•  Each speaker is activated in turn and 3 distances from it
to the 3 microphones are calculated, 9 distances in total.
•  Tracking performance can degrade when operating in a
noisy environment.
•  Update rate about 50 datasets/s
•  Time multiplexing is possible
•  With 4 receivers, update rate drops to 12 datasets/s
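A sketch of the underlying geometry (hypothetical numbers, and SciPy's generic least-squares solver rather than a real tracker's algorithm): each time of flight gives a distance c·t to a microphone, and the transmitter position is the point that best matches all three distances.

```python
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # m/s

def locate(mic_positions, times_of_flight, guess=(0.0, 0.0, 1.0)):
    """Time-of-flight trilateration: solve for the source position whose
    distances to the known microphones best match c * t for each mic."""
    dists = SPEED_OF_SOUND * np.asarray(times_of_flight)
    mics = np.asarray(mic_positions, dtype=float)
    residual = lambda p: np.linalg.norm(mics - p, axis=1) - dists
    return least_squares(residual, np.asarray(guess, dtype=float)).x

mics = [(0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (0.0, 0.3, 0.0)]  # hypothetical 30 cm frame
tofs = [0.00350, 0.00305, 0.00332]                           # seconds, made-up readings
print(locate(mics, tofs))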
Example: Logitech Head Tracker
•  Transmitter is a set of three ultrasonic
speakers - 30cm from each other
•  Rigid and fixed triangular frame
•  50 Hz update, 30 ms latency
•  Receiver is a set of three microphones
placed at the top of the HMD
•  May be part of 3D mice, stereo glasses, or
other interface devices
•  Range typically about 1.5 m
•  Direct line of sight required
•  Accuracy 0.1° orientation, 2% distance
Optical Tracker
• Idea: Image Processing and Computer Vision
• Specialized
• Infrared, Retro-Reflective, Stereoscopic
• ++: Long range, cheap, immune to metal
• -- : Line of Sight, Visual Targets, Low Sampling rate
ART Hi-Ball
Outside-In vs. Inside-Out Tracking
Optical Tracking Technologies
• Scalable active trackers
• InterSense IS-900, 3rd Tech HiBall
• Passive optical computer vision
• Line of sight, may require landmarks
• Can be brittle.
• Computer vision is computationally intensive
3rd Tech, Inc.
Example: HiBall Tracking System (3rd Tech)
• Inside-Out Tracker
• $50K USD
• Scalable over large area
• Fast update (2000Hz)
• Latency less than 1 ms
• Accurate
• Position 0.4mm RMS
• Orientation 0.02° RMS
Example: Microsoft Kinect
•  Outside-in tracking
•  Components:
•  RGB camera
•  Range camera
•  IR light source
•  Multi-array microphone
•  Specifications
•  Range 1-6m
•  Update rate 30Hz
•  Latency 100ms
•  Tracking resolution < 5mm
•  Range Camera extracts depth information
and combines it with a video signal
Hybrid Tracking
•  Idea: Multiple technologies overcome limitations of each one
•  A system that utilizes two or more position/orientation
measurement technologies (e.g. inertial + vision)
•  ++: Robust, reduced latency, increased accuracy
•  -- : More complex, expensive
Intersense IS-900
Ascension Laser Bird
Example: Intersense IS-900
•  Inertial Ultrasonic Hybrid tracking
•  Use ultrasonic strips for position sensing
•  Inertial sensing for orientation
•  Sensor fusion to combine the two
•  Specifications
•  Latency 4ms
•  Update 180 Hz
•  Resolution 0.75mm, 0.05°
•  Accuracy 3mm, 0.25°
•  Up to 140 m² tracking volume
•  http://www.intersense.com/pages/20/14
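A toy complementary filter illustrates the sensor-fusion idea (an assumed scheme for illustration, not InterSense's actual algorithm): trust the fast inertial path short-term while the slow absolute acoustic fixes steadily pull out accumulated drift.

```python
def complementary_fuse(imu_deltas, acoustic_fixes, alpha=0.98):
    """Blend fast relative (inertial) motion with slow absolute (acoustic)
    position fixes; alpha near 1 favours the smooth inertial estimate,
    while the (1 - alpha) acoustic term corrects drift over time."""
    estimate = acoustic_fixes[0]
    fused = []
    for delta, fix in zip(imu_deltas, acoustic_fixes):
        estimate = alpha * (estimate + delta) + (1.0 - alpha) * fix
        fused.append(estimate)
    return fused

# Hypothetical 1-D example: the IMU drifts by 1 mm per step while the
# acoustic system keeps reporting the true position, 0.0.
print(complementary_fuse([0.001] * 5, [0.0] * 5))  # drift is pulled back toward 0
```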
Demo: IS-1200 and IS-900
•  https://www.youtube.com/watch?v=NkYLlTyuYkA
Example: Vive Lighthouse Tracking
•  Outside-in hybrid tracking system
•  2 base stations
•  Each with 2 laser scanners, LED array
•  Headworn/handheld sensors
•  37 photo-sensors in HMD, 17 in hand
•  Additional IMU sensors (500 Hz)
•  Performance
•  Tracking server fuses sensor samples
•  Sampling rate 250 Hz, 4 ms latency
•  2mm RMS tracking accuracy
•  Large area - 5 x 5m range
•  See http://doc-ok.org/?p=1478
Lighthouse Components
Base station:
- IR LED array
- 2 x scanned lasers
Head Mounted Display:
- 37 photo sensors
- 9-axis IMU
Lighthouse Setup
How Lighthouse Tracking Works
•  Position tracking using IMU
•  500 Hz sampling
•  But drifts over time
•  Drift correction using optical tracking
•  IR synchronization pulse (60 Hz)
•  Laser sweep between pulses
•  Photo-sensors recognize sync pulse, measure time to laser
•  Know when sensor hit and which sensor hit
•  Calculate position of sensor relative to base station
•  Use 2 base stations to calculate pose
•  Use IMU sensor data between pulses (500Hz)
•  See http://xinreality.com/wiki/Lighthouse
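The core timing-to-angle step is simple; here is a sketch assuming a rotor spinning at 60 Hz (the timings are made up, and real Lighthouse timing details differ between sweeps and base stations). With one angle from the horizontal sweep and one from the vertical sweep, each photo-sensor lies on a known ray from the station, and rays from two stations (plus the known sensor layout on the HMD) give the pose.

```python
import math

ROTOR_HZ = 60.0  # assumed rotor spin rate: 60 full turns per second

def sweep_angle(t_sync, t_hit):
    """Angle the laser has swept since the IR sync flash, for a rotor
    completing one full turn every 1/60 s."""
    return 2.0 * math.pi * ROTOR_HZ * (t_hit - t_sync)

# Hypothetical timing: a photo-sensor is hit 4.2 ms after the sync pulse.
print(math.degrees(sweep_angle(0.0, 0.0042)))  # ~90 degrees from sweep start
```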
Lighthouse Tracking
•  Base station scanning: https://www.youtube.com/watch?v=avBt_P0wg_Y
•  Room tracking: https://www.youtube.com/watch?v=oqPaaMR4kY4
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au

More Related Content

What's hot

Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsMark Billinghurst
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VRMark Billinghurst
 
Comp4010 Lecture7 Designing AR Systems
Comp4010 Lecture7 Designing AR SystemsComp4010 Lecture7 Designing AR Systems
Comp4010 Lecture7 Designing AR SystemsMark Billinghurst
 
Comp 4010 2021 - Snap Tutorial-1
Comp 4010 2021 - Snap Tutorial-1Comp 4010 2021 - Snap Tutorial-1
Comp 4010 2021 - Snap Tutorial-1Mark Billinghurst
 
COMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionCOMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionMark Billinghurst
 
Augmented Reality (AR)
Augmented Reality (AR)Augmented Reality (AR)
Augmented Reality (AR)Samsil Arefin
 
Lecture7 Example VR Applications
Lecture7 Example VR ApplicationsLecture7 Example VR Applications
Lecture7 Example VR ApplicationsMark Billinghurst
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Mark Billinghurst
 
Comp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionComp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionMark Billinghurst
 
Lecture 2 Presence and Perception
Lecture 2 Presence and PerceptionLecture 2 Presence and Perception
Lecture 2 Presence and PerceptionMark Billinghurst
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XRMark Billinghurst
 
Natural Interfaces for Augmented Reality
Natural Interfaces for Augmented RealityNatural Interfaces for Augmented Reality
Natural Interfaces for Augmented RealityMark Billinghurst
 
Mixed Reality in the Workspace
Mixed Reality in the WorkspaceMixed Reality in the Workspace
Mixed Reality in the WorkspaceMark Billinghurst
 
COMP 4010 - Lecture 5: Interaction Design for Virtual Reality
COMP 4010 - Lecture 5: Interaction Design for Virtual RealityCOMP 4010 - Lecture 5: Interaction Design for Virtual Reality
COMP 4010 - Lecture 5: Interaction Design for Virtual RealityMark Billinghurst
 
Multimodal Multi-sensory Interaction for Mixed Reality
Multimodal Multi-sensory Interaction for Mixed RealityMultimodal Multi-sensory Interaction for Mixed Reality
Multimodal Multi-sensory Interaction for Mixed RealityMark Billinghurst
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsMark Billinghurst
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsMark Billinghurst
 
Comp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRComp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRMark Billinghurst
 

What's hot (20)

Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and Systems
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR
 
Comp4010 Lecture7 Designing AR Systems
Comp4010 Lecture7 Designing AR SystemsComp4010 Lecture7 Designing AR Systems
Comp4010 Lecture7 Designing AR Systems
 
Virtual Reality(full)
Virtual Reality(full)Virtual Reality(full)
Virtual Reality(full)
 
Comp 4010 2021 - Snap Tutorial-1
Comp 4010 2021 - Snap Tutorial-1Comp 4010 2021 - Snap Tutorial-1
Comp 4010 2021 - Snap Tutorial-1
 
COMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionCOMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR Interaction
 
Augmented Reality (AR)
Augmented Reality (AR)Augmented Reality (AR)
Augmented Reality (AR)
 
Lecture7 Example VR Applications
Lecture7 Example VR ApplicationsLecture7 Example VR Applications
Lecture7 Example VR Applications
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality
 
Comp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionComp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-Perception
 
Lecture 2 Presence and Perception
Lecture 2 Presence and PerceptionLecture 2 Presence and Perception
Lecture 2 Presence and Perception
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR
 
Natural Interfaces for Augmented Reality
Natural Interfaces for Augmented RealityNatural Interfaces for Augmented Reality
Natural Interfaces for Augmented Reality
 
Mixed Reality in the Workspace
Mixed Reality in the WorkspaceMixed Reality in the Workspace
Mixed Reality in the Workspace
 
COMP 4010 - Lecture 5: Interaction Design for Virtual Reality
COMP 4010 - Lecture 5: Interaction Design for Virtual RealityCOMP 4010 - Lecture 5: Interaction Design for Virtual Reality
COMP 4010 - Lecture 5: Interaction Design for Virtual Reality
 
2013 Lecture3: AR Tracking
2013 Lecture3: AR Tracking 2013 Lecture3: AR Tracking
2013 Lecture3: AR Tracking
 
Multimodal Multi-sensory Interaction for Mixed Reality
Multimodal Multi-sensory Interaction for Mixed RealityMultimodal Multi-sensory Interaction for Mixed Reality
Multimodal Multi-sensory Interaction for Mixed Reality
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research Directions
 
Comp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VRComp4010 Lecture8 Introduction to VR
Comp4010 Lecture8 Introduction to VR
 

Viewers also liked

COMP 4010 Lecture3: Human Perception
COMP 4010 Lecture3: Human PerceptionCOMP 4010 Lecture3: Human Perception
COMP 4010 Lecture3: Human PerceptionMark Billinghurst
 
COMP 4010 Lecture6 - Virtual Reality Input Devices
COMP 4010 Lecture6 - Virtual Reality Input DevicesCOMP 4010 Lecture6 - Virtual Reality Input Devices
COMP 4010 Lecture6 - Virtual Reality Input DevicesMark Billinghurst
 
COMP 4010 - Lecture4 VR Technology - Visual and Haptic Displays
COMP 4010 - Lecture4 VR Technology - Visual and Haptic DisplaysCOMP 4010 - Lecture4 VR Technology - Visual and Haptic Displays
COMP 4010 - Lecture4 VR Technology - Visual and Haptic DisplaysMark Billinghurst
 
COMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR InteractionCOMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR InteractionMark Billinghurst
 
Building VR Applications For Google Cardboard
Building VR Applications For Google CardboardBuilding VR Applications For Google Cardboard
Building VR Applications For Google CardboardMark Billinghurst
 
COMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual RealityCOMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual RealityMark Billinghurst
 
COMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual RealityCOMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual RealityMark Billinghurst
 
COMP 4010 Lecture9 AR Displays
COMP 4010 Lecture9 AR DisplaysCOMP 4010 Lecture9 AR Displays
COMP 4010 Lecture9 AR DisplaysMark Billinghurst
 
COMP 4010 - Lecture 2: Presence in Virtual Reality
COMP 4010 - Lecture 2: Presence in Virtual RealityCOMP 4010 - Lecture 2: Presence in Virtual Reality
COMP 4010 - Lecture 2: Presence in Virtual RealityMark Billinghurst
 
COMP 4026 Lecture4: Processing and Advanced Interface Technology
COMP 4026 Lecture4: Processing and Advanced Interface TechnologyCOMP 4026 Lecture4: Processing and Advanced Interface Technology
COMP 4026 Lecture4: Processing and Advanced Interface TechnologyMark Billinghurst
 
COMP 4026 Lecture 6 Wearable Computing
COMP 4026 Lecture 6 Wearable ComputingCOMP 4026 Lecture 6 Wearable Computing
COMP 4026 Lecture 6 Wearable ComputingMark Billinghurst
 
Virtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the PossibilitiesVirtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the PossibilitiesMark Billinghurst
 
COMP 4010 Lecture12 Research Directions in AR
COMP 4010 Lecture12 Research Directions in ARCOMP 4010 Lecture12 Research Directions in AR
COMP 4010 Lecture12 Research Directions in ARMark Billinghurst
 
COMP 4026 Lecture 5 OpenFrameworks and Soli
COMP 4026 Lecture 5 OpenFrameworks and SoliCOMP 4026 Lecture 5 OpenFrameworks and Soli
COMP 4026 Lecture 5 OpenFrameworks and SoliMark Billinghurst
 
VSMM 2016 Keynote: Using AR and VR to create Empathic Experiences
VSMM 2016 Keynote: Using AR and VR to create Empathic ExperiencesVSMM 2016 Keynote: Using AR and VR to create Empathic Experiences
VSMM 2016 Keynote: Using AR and VR to create Empathic ExperiencesMark Billinghurst
 
Using AR for Vehicle Navigation
Using AR for Vehicle NavigationUsing AR for Vehicle Navigation
Using AR for Vehicle NavigationMark Billinghurst
 
Introduction to Augmented Reality
Introduction to Augmented RealityIntroduction to Augmented Reality
Introduction to Augmented RealityMark Billinghurst
 

Viewers also liked (20)

COMP 4010 Lecture3: Human Perception
COMP 4010 Lecture3: Human PerceptionCOMP 4010 Lecture3: Human Perception
COMP 4010 Lecture3: Human Perception
 
COMP 4010 Lecture6 - Virtual Reality Input Devices
COMP 4010 Lecture6 - Virtual Reality Input DevicesCOMP 4010 Lecture6 - Virtual Reality Input Devices
COMP 4010 Lecture6 - Virtual Reality Input Devices
 
COMP 4010 - Lecture4 VR Technology - Visual and Haptic Displays
COMP 4010 - Lecture4 VR Technology - Visual and Haptic DisplaysCOMP 4010 - Lecture4 VR Technology - Visual and Haptic Displays
COMP 4010 - Lecture4 VR Technology - Visual and Haptic Displays
 
COMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR InteractionCOMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR Interaction
 
Building VR Applications For Google Cardboard
Building VR Applications For Google CardboardBuilding VR Applications For Google Cardboard
Building VR Applications For Google Cardboard
 
COMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual RealityCOMP 4010 Lecture7 3D User Interfaces for Virtual Reality
COMP 4010 Lecture7 3D User Interfaces for Virtual Reality
 
COMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual RealityCOMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual Reality
 
COMP 4010 Lecture9 AR Displays
COMP 4010 Lecture9 AR DisplaysCOMP 4010 Lecture9 AR Displays
COMP 4010 Lecture9 AR Displays
 
AR-VR Workshop
AR-VR WorkshopAR-VR Workshop
AR-VR Workshop
 
COMP 4010 - Lecture 2: Presence in Virtual Reality
COMP 4010 - Lecture 2: Presence in Virtual RealityCOMP 4010 - Lecture 2: Presence in Virtual Reality
COMP 4010 - Lecture 2: Presence in Virtual Reality
 
COMP 4026 Lecture4: Processing and Advanced Interface Technology
COMP 4026 Lecture4: Processing and Advanced Interface TechnologyCOMP 4026 Lecture4: Processing and Advanced Interface Technology
COMP 4026 Lecture4: Processing and Advanced Interface Technology
 
COMP 4026 Lecture 6 Wearable Computing
COMP 4026 Lecture 6 Wearable ComputingCOMP 4026 Lecture 6 Wearable Computing
COMP 4026 Lecture 6 Wearable Computing
 
Virtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the PossibilitiesVirtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the Possibilities
 
COMP 4010 Lecture12 Research Directions in AR
COMP 4010 Lecture12 Research Directions in ARCOMP 4010 Lecture12 Research Directions in AR
COMP 4010 Lecture12 Research Directions in AR
 
COMP 4026 Lecture 5 OpenFrameworks and Soli
COMP 4026 Lecture 5 OpenFrameworks and SoliCOMP 4026 Lecture 5 OpenFrameworks and Soli
COMP 4026 Lecture 5 OpenFrameworks and Soli
 
Ismar 2016 Presentation
Ismar 2016 PresentationIsmar 2016 Presentation
Ismar 2016 Presentation
 
VSMM 2016 Keynote: Using AR and VR to create Empathic Experiences
VSMM 2016 Keynote: Using AR and VR to create Empathic ExperiencesVSMM 2016 Keynote: Using AR and VR to create Empathic Experiences
VSMM 2016 Keynote: Using AR and VR to create Empathic Experiences
 
Using AR for Vehicle Navigation
Using AR for Vehicle NavigationUsing AR for Vehicle Navigation
Using AR for Vehicle Navigation
 
Introduction to Augmented Reality
Introduction to Augmented RealityIntroduction to Augmented Reality
Introduction to Augmented Reality
 
Virtual Reality 2.0
Virtual Reality 2.0Virtual Reality 2.0
Virtual Reality 2.0
 

Similar to COMP 4010 Lecture5 VR Audio and Tracking

WebRTC, RED and Janus @ ClueCon21
WebRTC, RED and Janus @ ClueCon21WebRTC, RED and Janus @ ClueCon21
WebRTC, RED and Janus @ ClueCon21Lorenzo Miniero
 
Spatial Sound 3: Audio Rendering and Ambisonics
Spatial Sound 3: Audio Rendering and AmbisonicsSpatial Sound 3: Audio Rendering and Ambisonics
Spatial Sound 3: Audio Rendering and AmbisonicsRichard Elen
 
Build Your Own VR Display Course - SIGGRAPH 2017: Part 4
Build Your Own VR Display Course - SIGGRAPH 2017: Part 4Build Your Own VR Display Course - SIGGRAPH 2017: Part 4
Build Your Own VR Display Course - SIGGRAPH 2017: Part 4StanfordComputationalImaging
 
Interactive Voice Con
Interactive Voice ConInteractive Voice Con
Interactive Voice ConDru Wynings
 
Making Audio Engineering Learning & Practice Accessible in Virtual Reality
Making Audio Engineering Learning & Practice Accessible in Virtual RealityMaking Audio Engineering Learning & Practice Accessible in Virtual Reality
Making Audio Engineering Learning & Practice Accessible in Virtual RealitySamuel Fisher
 
Summary of the paper「PrivacyMic: Utilizing Inaudible Frequencies for Privacy ...
Summary of the paper「PrivacyMic: Utilizing Inaudible Frequencies for Privacy ...Summary of the paper「PrivacyMic: Utilizing Inaudible Frequencies for Privacy ...
Summary of the paper「PrivacyMic: Utilizing Inaudible Frequencies for Privacy ...YutaFunada
 
SoundSense
SoundSenseSoundSense
SoundSensebutest
 
VR Technical Session: Spatialized Audio Design
VR Technical Session: Spatialized Audio DesignVR Technical Session: Spatialized Audio Design
VR Technical Session: Spatialized Audio DesignOnline News Association
 
“Comparing ML-Based Audio with ML-Based Vision: An Introduction to ML Audio f...
“Comparing ML-Based Audio with ML-Based Vision: An Introduction to ML Audio f...“Comparing ML-Based Audio with ML-Based Vision: An Introduction to ML Audio f...
“Comparing ML-Based Audio with ML-Based Vision: An Introduction to ML Audio f...Edge AI and Vision Alliance
 
Audio spot light
Audio spot lightAudio spot light
Audio spot lightMansi Gupta
 
How Audio Objects Improve Spatial Accuracy / Mads Maretty Sønderup (Audiokine...
How Audio Objects Improve Spatial Accuracy / Mads Maretty Sønderup (Audiokine...How Audio Objects Improve Spatial Accuracy / Mads Maretty Sønderup (Audiokine...
How Audio Objects Improve Spatial Accuracy / Mads Maretty Sønderup (Audiokine...DevGAMM Conference
 

Similar to COMP 4010 Lecture5 VR Audio and Tracking (20)

Spatial Audio
Spatial AudioSpatial Audio
Spatial Audio
 
Spatial audio(19,24)
Spatial audio(19,24)Spatial audio(19,24)
Spatial audio(19,24)
 
WebRTC, RED and Janus @ ClueCon21
WebRTC, RED and Janus @ ClueCon21WebRTC, RED and Janus @ ClueCon21
WebRTC, RED and Janus @ ClueCon21
 
Lecture3 - VR Technology
Lecture3 - VR TechnologyLecture3 - VR Technology
Lecture3 - VR Technology
 
Spatial Sound 3: Audio Rendering and Ambisonics
Spatial Sound 3: Audio Rendering and AmbisonicsSpatial Sound 3: Audio Rendering and Ambisonics
Spatial Sound 3: Audio Rendering and Ambisonics
 
Build Your Own VR Display Course - SIGGRAPH 2017: Part 4
Build Your Own VR Display Course - SIGGRAPH 2017: Part 4Build Your Own VR Display Course - SIGGRAPH 2017: Part 4
Build Your Own VR Display Course - SIGGRAPH 2017: Part 4
 
Mit21 m 380s12_complecnot
Mit21 m 380s12_complecnotMit21 m 380s12_complecnot
Mit21 m 380s12_complecnot
 
Interactive Voice Con
Interactive Voice ConInteractive Voice Con
Interactive Voice Con
 
Making Audio Engineering Learning & Practice Accessible in Virtual Reality
Making Audio Engineering Learning & Practice Accessible in Virtual RealityMaking Audio Engineering Learning & Practice Accessible in Virtual Reality
Making Audio Engineering Learning & Practice Accessible in Virtual Reality
 
Summary of the paper「PrivacyMic: Utilizing Inaudible Frequencies for Privacy ...
Summary of the paper「PrivacyMic: Utilizing Inaudible Frequencies for Privacy ...Summary of the paper「PrivacyMic: Utilizing Inaudible Frequencies for Privacy ...
Summary of the paper「PrivacyMic: Utilizing Inaudible Frequencies for Privacy ...
 
SoundSense
SoundSenseSoundSense
SoundSense
 
Audios in Unity
Audios in UnityAudios in Unity
Audios in Unity
 
VR Technical Session: Spatialized Audio Design
VR Technical Session: Spatialized Audio DesignVR Technical Session: Spatialized Audio Design
VR Technical Session: Spatialized Audio Design
 
Stereo Microphone Techniques
Stereo Microphone TechniquesStereo Microphone Techniques
Stereo Microphone Techniques
 
Audio media
Audio mediaAudio media
Audio media
 
Speech Recognition System
Speech Recognition SystemSpeech Recognition System
Speech Recognition System
 
“Comparing ML-Based Audio with ML-Based Vision: An Introduction to ML Audio f...
“Comparing ML-Based Audio with ML-Based Vision: An Introduction to ML Audio f...“Comparing ML-Based Audio with ML-Based Vision: An Introduction to ML Audio f...
“Comparing ML-Based Audio with ML-Based Vision: An Introduction to ML Audio f...
 
Reverb w5 imp_2
Reverb w5 imp_2Reverb w5 imp_2
Reverb w5 imp_2
 
Audio spot light
Audio spot lightAudio spot light
Audio spot light
 
How Audio Objects Improve Spatial Accuracy / Mads Maretty Sønderup (Audiokine...
How Audio Objects Improve Spatial Accuracy / Mads Maretty Sønderup (Audiokine...How Audio Objects Improve Spatial Accuracy / Mads Maretty Sønderup (Audiokine...
How Audio Objects Improve Spatial Accuracy / Mads Maretty Sønderup (Audiokine...
 

More from Mark Billinghurst

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented RealityMark Billinghurst
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesMark Billinghurst
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the MetaverseMark Billinghurst
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseMark Billinghurst
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationMark Billinghurst
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseMark Billinghurst
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR SystemsMark Billinghurst
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR SystemsMark Billinghurst
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR PrototypingMark Billinghurst
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR InteractionMark Billinghurst
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: PerceptionMark Billinghurst
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseMark Billinghurst
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional InterfacesMark Billinghurst
 
Comp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research DirectionsComp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research DirectionsMark Billinghurst
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsMark Billinghurst
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignMark Billinghurst
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARMark Billinghurst
 

More from Mark Billinghurst (20)

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the Metaverse
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR Systems
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole Metaverse
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional Interfaces
 
Comp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research DirectionsComp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research Directions
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR Applications
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface Design
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 

Recently uploaded

Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Miguel Araújo
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024The Digital Insurer
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonAnna Loughnan Colquhoun
 
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...apidays
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024The Digital Insurer
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Enterprise Knowledge
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptxHampshireHUG
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Allon Mureinik
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking MenDelhi Call girls
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEarley Information Science
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processorsdebabhi2
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxFactors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxKatpro Technologies
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Developing An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilDeveloping An App To Navigate The Roads of Brazil
Developing An App To Navigate The Roads of BrazilV3cube
 

Recently uploaded (20)

Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024
 
Data Cloud, More than a CDP by Matt Robison

COMP 4010 Lecture 5: VR Audio and Tracking

  • 16. Spatialization vs. Localization • Spatialization is the processing of sound signals to make them emanate from a point in space • This is a technical topic • Localization is the ability of people to identify the source position of a sound • This is a human topic, i.e., some people are better at it than others.
  • 17. Stereo Sound •  Seems to come from inside the user's head •  Follows head motion as the user moves their head
  • 18. 3D Spatial Sound •  Seems to be external to the head •  Fixed in space when user moves head •  Has reflected sound properties
  • 19. Spatialized Audio Effects • Naïve approach •  Simple left/right shift for lateral position •  Amplitude adjustment for distance • Easy to produce using consumer hardware/software • Does not give us "true" realism in sound •  No up/down or front/back cues • We can use multiple speakers for this •  Surround the user with speakers •  Send different sound signals to each one
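To make the naïve approach above concrete, here is a minimal Python sketch; the constant-power pan law, the clamped inverse-distance falloff, and all names are illustrative choices, not taken from any particular SDK:

    import numpy as np

    def naive_pan(mono, azimuth_rad, distance_m):
        # Map azimuth in [-pi/2, +pi/2] (0 = straight ahead) to a pan
        # position in [0, 1], then apply a constant-power pan law.
        pan = np.clip(azimuth_rad / np.pi + 0.5, 0.0, 1.0)
        left_gain = np.cos(pan * np.pi / 2)
        right_gain = np.sin(pan * np.pi / 2)
        # Inverse-distance amplitude falloff, clamped below 1 m so
        # nearby sources do not blow up.
        att = 1.0 / max(distance_m, 1.0)
        return mono * left_gain * att, mono * right_gain * att

As the slide notes, this gives only lateral cues: no amount of left/right gain difference produces up/down or front/back information.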
  • 20. Example: The BoomRoom •  Use surround speakers to create spatial audio effects •  Gesture based interaction •  https://www.youtube.com/watch?time_continue=54&v=6RQMOyQ3lyg
  • 21. Audio Localization • Main cues used by humans to localize sound: 1.  Interaural time differences: Time difference for sound wave to travel between ears 2.  Interaural level differences: For high frequency sounds (> 1.5 kHz), volume difference between ears used to determine source direction 3.  Spectral filtering done by outer ears: Ear shape changes frequency heard
  • 22. Interaural Time Difference •  Sound takes a fixed time to travel between the ears •  This time difference can be used to determine the sound source direction
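A common back-of-the-envelope model for this cue is the Woodworth spherical-head approximation; the sketch below assumes a ~8.75 cm head radius and the speed of sound in air (both illustrative defaults):

    import math

    def itd_seconds(azimuth_rad, head_radius_m=0.0875, c=343.0):
        # Woodworth approximation: ITD = (r/c) * (theta + sin(theta)),
        # for source azimuth theta measured from straight ahead.
        theta = abs(azimuth_rad)
        return (head_radius_m / c) * (theta + math.sin(theta))

    # itd_seconds(math.pi / 2) is about 0.00066 s: a source directly to
    # one side arrives ~0.66 ms earlier at the nearer ear.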
  • 23. Spectral Filtering Ear shape filters sound depending on the direction it comes from. This change in frequency content is used to determine sound source elevation.
  • 24. Natural Hearing vs. Headphones •  Due to ear shape, natural hearing provides a different audio response depending on sound location
  • 25. Head-Related Transfer Functions (HRTFs) • A set of functions that model how sound from a source at a known location reaches the eardrum
  • 26. More About HRTFs • Functions take into account: •  Individual ear shape •  Slope of shoulders •  Head shape • So each person has his/her own HRTF! •  Need parameterizable HRTFs • Some sound cards/APIs allow specifying an HRTF
  • 28. Constructing HRTFs • Small microphones placed into the ear canals • Subject sits in an anechoic chamber •  Can use a mannequin's head instead • Sounds played from a large number of known locations around the chamber •  HRTFs are constructed from this data • The sound signal is then filtered through inverse functions to place the sound at the desired source position
  • 29. Constructing HRTFs •  Putting microphones in mannequin or human ears •  Playing sounds from fixed positions •  Recording the response
  • 30. How HRTFs are Used •  The HRTF is the Fourier transform of the in-ear microphone audio response (the head-related impulse response, HRIR) •  From the HRTF we can calculate pairs of finite impulse response (FIR) filters for specific sound positions •  One filter per ear •  To place a virtual sound at a position, apply the FIR filter pair for that position to the incoming sound
  • 31. HRTF Processing •  Input sound is convolved with the FIR filters to generate L/R outputs
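A minimal sketch of this step, assuming you already have a measured left/right HRIR pair of equal length for the desired source position (e.g. from a public HRTF database); the function name is illustrative:

    import numpy as np

    def spatialize(mono, hrir_left, hrir_right):
        # Convolving the dry signal with each ear's FIR filter "places"
        # the sound at the position the HRIR pair was measured at.
        left = np.convolve(mono, hrir_left)
        right = np.convolve(mono, hrir_right)
        return np.stack([left, right], axis=-1)  # (samples, 2) stereo buffer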
  • 32. Environmental Effects • Sound is also changed by objects in the environment •  Can reverberate off of reflective objects •  Can be absorbed by objects •  Can be occluded by objects • Doppler shift •  Moving sound sources • Need to simulate environmental audio properties •  Takes significant processing power
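The Doppler shift mentioned above has a simple closed form for a moving source and a stationary listener; real audio engines also handle moving listeners and clamp extreme velocities, but the core relation is:

    def doppler_frequency(f_source_hz, source_speed_ms, c=343.0):
        # Positive speed = source approaching the listener.
        # f' = f * c / (c - v): pitch rises on approach, falls on recede.
        return f_source_hz * c / (c - source_speed_ms)

    # doppler_frequency(440.0, 30.0) is about 482 Hz for a source
    # closing at 30 m/s.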
  • 33. Sound Reverberation •  Need to consider first and second order reflections •  Need to model material properties, objects in room, etc
  • 34. The Tough Part •  All of this takes a lot of processing •  Need to keep track of •  Multiple (possibly moving) sound sources •  Path of sounds through a dynamic environment •  Position and orientation of listener(s) •  Most sound cards only support a limited number of spatialized sound channels •  Increasingly complex geometry increases load on audio system as well as visuals •  That's why we fake it ;-) •  GPUs might change this too!
  • 35. Sound Display Hardware •  Designed to reduce CPU load •  Early Hardware •  Custom HRTF •  Crystal River Engineering Convolvotron (1988) •  Real time 3D audio localizer, 4 sound sources •  Lake Technology (2002) •  Huron 20, custom DSP hardware, $40,000 •  Modern Consumer Hardware •  Uses generic HRTF •  SoundBlaster Audigy/EAX •  Aureal A3D/Vortex card
  • 37. GPU Based Audio Acceleration •  Using GPU for audio physics calculations •  AMD TrueAudio Next •  https://www.youtube.com/watch?v=Z6nwYLHG8PU
  • 38. Audio Software SDKs •  Modern CPUs are fast enough that spatial audio can be generated without dedicated hardware •  Several 3D audio SDKs exist •  OpenAL •  www.openal.org •  Open source, cross platform •  Renders multichannel three-dimensional positional audio •  Google VR SDK •  Android, iOS, Unity •  https://developers.google.com/vr/concepts/spatial-audio •  Oculus •  https://developer3.oculus.com/documentation/audiosdk/latest/ •  Microsoft DirectX, Unity, etc
  • 39. Google VR Spatial Audio Demo •  https://www.youtube.com/watch?v=I9zf4hCjRg0&feature=youtu.be
  • 40. OSSIC 3D Audio Headphones •  3D audio headphones •  Calibrates to user – calculates HRTF •  Integrated head tracking •  Multi-driver array providing sound to correct part of ear •  Raised $2.7 million on Kickstarter •  https://www.ossic.com/3d-audio/
  • 41. Ossic vs. Traditional Headphone •  Provides frequency reproduction of real sound
  • 42. OSSIC vs. Generic Headphone •  Sound source localization (T = target)
  • 44. Designing Spatial Audio •  There are several tools available for designing 3D audio •  E.g. Facebook Spatial Workstation •  Audio tools for cinematic VR and 360 video •  https://facebook360.fb.com/spatial-workstation/ •  Spatial Audio Designer •  Mixing of surround sound and 3D audio •  http://www.newaudiotechnology.com/en/products/spatial-audio-designer/
  • 45. Demo: Spatial Audio In VR •  AltspaceVR spatial audio for speaker discrimination •  https://www.youtube.com/watch?v=dV3Qog44z6E
  • 47. Immersion and Tracking • Motivation: For immersion, when the user changes position in reality the VR view also needs to change •  Requires tracking of the user’s pose (position/orientation) in the real world and mapping to the Virtual World
  • 48. Definitions • Tracking: measuring the position and orientation of an object relative to a known frame of reference • VR Tracker: technology used in VR to measure the real time change in a 3D object position and orientation (1968) Ivan Sutherland Mechanical Tracker
  • 49. •  Frames of Reference •  Real World Coordinate System (Wcs) •  Head Coordinate System (Hcs) •  Eye Coordinate System (Ecs) •  Need to create a mapping between Frames •  E.g. Transformation from Wcs to Hcs to Ecs •  Movement in real world maps to movement in Ecs frame Frames of Reference
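A minimal sketch of that mapping, chaining 4x4 homogeneous transforms with NumPy; the pose and eye-offset numbers are illustrative, and in practice the world-to-head matrix is the inverse of the head pose reported by the tracker each frame:

    import numpy as np

    def make_transform(rotation_3x3, translation_3):
        T = np.eye(4)
        T[:3, :3] = rotation_3x3
        T[:3, 3] = translation_3
        return T

    # World -> Head: inverse of the tracked head pose (updated per frame).
    # Head -> Eye: fixed, measured offset inside the HMD (e.g. half IPD).
    W_to_H = make_transform(np.eye(3), [0.0, -1.7, 0.0])   # example values
    H_to_E = make_transform(np.eye(3), [0.032, 0.0, 0.0])  # right-eye offset

    p_world = np.array([1.0, 1.5, -2.0, 1.0])  # homogeneous world point
    p_eye = H_to_E @ W_to_H @ p_world          # Wcs -> Hcs -> Ecs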
  • 50. Example Frames of Reference •  Assuming Head Tracker mounted on HMD •  Assuming tracking relative to fixed table object
  • 51. Tracking Degrees of Freedom • Typically 6 Degrees of Freedom (DOF) • Rotation or Translation about an Axis 1.  Moving up and down 2.  Moving left and right 3.  Moving forward and backward 4.  Tilting forward and backward (pitching); 5.  Turning left and right (yawing); 6.  Tilting side to side (rolling).
  • 52. Key Tracking Performance Criteria • Static Accuracy • Dynamic Accuracy • Latency • Update Rate • Tracking Jitter • Signal to Noise Ratio • Tracking Drift
  • 53. Static vs. Dynamic Accuracy •  Static Accuracy •  Ability of tracker to determine coordinates of a position in space •  Depends on sensor sensitivity, errors (algorithm, operator), environment •  Dynamic Accuracy •  System accuracy as sensor moves •  Depends on static accuracy •  Resolution •  Minimum change sensor can detect •  Repeatability •  Same input giving same output
  • 54. Tracker Latency, Update Rate •  Latency: Time between change in object pose and time sensor detects the change •  Large latency (> 10 ms) can cause simulator sickness •  Larger latency (> 50 ms) can reduce VR immersion •  Update Rate: Number of measurements per second •  Typically > 30 Hz
  • 55. Tracker Jitter, Signal to Noise Ratio •  Jitter: Change in tracker output when the tracked object is stationary •  The range of change is the sensor noise •  A tracker with no jitter reports a constant value if the tracked object is stationary •  Jitter makes tracker data change randomly about an average value •  Signal to Noise Ratio: Signal in the data relative to noise •  Found by calculating the mean of samples at known positions
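A minimal sketch of how both quantities could be estimated from repeated readings of a tracker held stationary at a known position; SNR is defined in several ways in practice, and the mean-over-noise form below is just one illustrative choice:

    import numpy as np

    def jitter_and_snr(stationary_readings):
        x = np.asarray(stationary_readings, dtype=float)
        mean = x.mean()            # best estimate of the true reading
        jitter = x.std()           # random spread about the mean = noise
        snr = abs(mean) / jitter if jitter > 0 else float("inf")
        return jitter, snr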
  • 56. Tracker Drift •  Drift: Steady increase in tracker error over time •  Accumulative (additive) error over time •  Relative to dynamic sensitivity over time •  Controlled by periodic recalibration (zeroing)
  • 57. Tracking Technologies •  Mechanical •  Physical Linkage •  Electromagnetic •  Magnetic sensing •  Inertial •  Accelerometer, MEMs •  Acoustic •  Ultrasonic •  Optical •  Computer Vision •  Hybrid •  Combination of Technologies Contact-less Contact-based
  • 58. Mechanical Tracker • Idea: mechanical arms with joint sensors • ++: high accuracy, low jitter, low latency • -- : cumbersome, limited range, fixed position Microscribe Sutherland
  • 59. Example: Fake Space Boom •  BOOM (Binocular Omni-Orientation Monitor) •  Counterbalanced arm with 100° FOV HMD mounted on it •  6 DOF, 4mm position accuracy, 300Hz sampling, < 5 ms latency
  • 60. Demo: Fake Space Tele Presence •  Using Boom with HMD to control robot view •  https://www.youtube.com/watch?v=QpTQTu7A6SI
  • 61. Magnetic Tracker • Idea: Measure difference in current between a magnetic transmitter and a receiver • ++: 6DOF, robust, accurate, no line of sight needed • -- : limited range, sensitive to metal, noisy, expensive Flock of Birds (Ascension)
  • 62. Example: Polhemus Fastrak •  Degrees-of-Freedom: 6DOF •  Number of Sensors: 1-4 •  Latency: 4ms •  Update Rate: 120 Hz/(num sensors) •  Static Accuracy Position: 0.03in RMS •  Static Accuracy Orientation: 0.15° RMS •  Range from Standard Source: Up to 5 feet or 1.52 meters •  Extended Range Source: Up to 15 feet or 4.6 meters •  Interface: RS-232 or USB (both included) •  Host OS compatibility (GUI/API toolkit): Windows 2000/XP •  http://polhemus.com/motion-tracking/all-trackers/fastrak
  • 63. Polhemus Tracker Demo •  https://www.youtube.com/watch?v=7DlEfd0VH_o
  • 65. Example: Razer Hydra •  Developed by Sixense •  Magnetic source + 2 wired controllers •  Short range (< 1 m), precision of 1mm and 1° •  62Hz sampling rate, < 50 ms latency •  $600 USD
  • 66. Razer Hydra Demo •  https://www.youtube.com/watch?v=jnqFdSa5p7w
  • 67. Inertial Tracker • Idea: Measure linear acceleration and angular rate (accelerometer/gyroscope) • ++: no transmitter, cheap, small, high sample rate, wireless • -- : drift, hysteresis, noise, only 3DOF IS300 (Intersense) Wii Remote
  • 68. Types of Inertial Trackers • Gyroscopes •  Measure the rate of change in object orientation (angular velocity). • Accelerometers •  Measure acceleration. •  Can be used to determine object position, if the starting point is known. • Inclinometers •  Measure inclination, i.e. the "level" position. •  Like a carpenter's level, but giving an electrical signal.
  • 69. Example: MEMS Sensor •  Uses a spring-supported load •  Reacts to gravity and inertia •  Changes its electrical parameters •  < 5 ms latency, 0.01° accuracy •  Up to 1000Hz sampling •  Problems •  Rapidly accumulating errors. •  Error in position increases with the square of time. •  Cheap units can get position drift of 4 cm in 2 seconds. •  Expensive units have the same error in 200 seconds. •  Not good for measuring location •  Need to periodically reset the output
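The "square of time" growth follows directly from double integration. The sketch below shows how a small constant accelerometer bias, integrated twice, reproduces roughly the 4 cm / 2 s figure quoted above; the 0.02 m/s² bias is an illustrative value:

    import numpy as np

    dt = 0.001                     # 1000 Hz sampling, as on the slide
    t = np.arange(0.0, 2.0, dt)    # two seconds of data
    bias = 0.02                    # m/s^2 of uncorrected accelerometer bias

    velocity = np.cumsum(np.full_like(t, bias)) * dt  # first integration
    position = np.cumsum(velocity) * dt               # second integration

    # position[-1] ~ 0.5 * bias * t^2 = 0.5 * 0.02 * 2^2 = 0.04 m,
    # i.e. ~4 cm of apparent drift after 2 s with no actual motion.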
  • 70. Demo: MEMS Sensor Working •  https://www.youtube.com/watch?v=9eSnxebfuxg
  • 71. MEMS Gyro Bias Drift •  Zero reading of MEMS Gyro drifts over time due to noise
  • 72. Example: iPhone Sensors •  Three-axis accelerometer •  Gives acceleration direction – affected by gravity and movement •  Three-axis gyroscope •  Measures rotation rate (angular velocity) – affected by movement •  Three-axis magnetometer •  Gives (approximate) direction of magnetic north •  GPS •  Gives geolocation – multiple samples over time can be used to detect direction and speed iPhone Sensor Monitor app
  • 73. Acoustic/Ultrasonic Tracker • Idea: Time of Flight or Phase-Coherence of Sound Waves • ++: Small, Cheap • -- : 3DOF, Line of Sight, Low resolution, Affected by Environment (pressure, temperature), Low sampling rate Ultrasonic Logitech IS600
  • 74. Acoustic Tracking Methods •  Two approaches: time difference and phase difference •  Time-of-flight (TOF): •  All current commercial systems •  The time a sound pulse travels is proportional to its distance from the receiver •  Problem: differentiating the pulse from noise •  Each transmitter works sequentially – increased latency •  Phase-coherent approach (Sutherland 1968): •  No pulse, but a continuous signal (~50 kHz) •  Many transmitters on different frequencies •  Phase differences between sent and received signals continuously give the change in distance, with no latency •  Only relative distance; cumulative and multi-path errors possible
  • 75. Acoustic Tracking Principles •  Measurements are based on triangulation (see the sketch below) •  Minimum separation between transmitters and between receivers is required •  Can be a problem if trying to make the receiver very small •  Each speaker is activated in a cycle and 3 distances from it to the 3 microphones are calculated, 9 distances total •  Tracking performance can degrade when operating in a noisy environment •  Update rate about 50 datasets/s •  Time multiplexing is possible •  With 4 receivers, update rate drops to 12 datasets/s
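A minimal sketch of the geometry: convert each pulse's time of flight to a range, then recover the emitter position from several ranges (trilateration). With exactly three receivers there is a mirror ambiguity, resolved in practice by knowing which side of the receiver frame the emitter is on; the linearized least-squares form below accepts three or more receivers, and the layout values would come from the actual hardware:

    import numpy as np

    C_SOUND = 343.0  # m/s; as the slide notes, this varies with temperature

    def tof_to_range(seconds):
        return C_SOUND * seconds

    def trilaterate(receivers, ranges):
        """Least-squares position from receiver positions and ranges."""
        r = np.asarray(receivers, dtype=float)
        d = np.asarray(ranges, dtype=float)
        # Subtracting the first sphere equation |x - r_i|^2 = d_i^2 from
        # the rest linearizes the problem: 2 (r_i - r_0) . x = b_i
        A = 2.0 * (r[1:] - r[0])
        b = (d[0] ** 2 - d[1:] ** 2
             + np.sum(r[1:] ** 2, axis=1) - np.sum(r[0] ** 2))
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos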
  • 76. Example: Logitech Head Tracker •  Transmitter is a set of three ultrasonic speakers, 30cm from each other •  Rigid and fixed triangular frame •  50 Hz update, 30 ms latency •  Receiver is a set of three microphones placed at the top of the HMD •  May be part of 3D mice, stereo glasses, or other interface devices •  Range typically about 1.5 m •  Direct line of sight required •  Accuracy 0.1° orientation, 2% distance
  • 77. Optical Tracker • Idea: Image Processing and Computer Vision • Specialized: Infrared, Retro-Reflective, Stereoscopic • ++: Long range, cheap, immune to metal • -- : Line of Sight, Visual Targets, Low Sampling Rate ART Hi-Ball
  • 79. Optical Tracking Technologies • Scalable active trackers •  InterSense IS-900, 3rd Tech HiBall • Passive optical computer vision •  Line of sight, may require landmarks •  Can be brittle •  Computer vision is computationally intensive 3rd Tech, Inc.
  • 80. Example: HiBall Tracking System (3rd Tech) • Inside-Out Tracker • $50K USD • Scalable over large area • Fast update (2000Hz) • Latency less than 1 ms • Accurate •  Position 0.4mm RMS •  Orientation 0.02° RMS
  • 82. Example: Microsoft Kinect •  Outside-in tracking •  Components: •  RGB camera •  Range camera •  IR light source •  Multi-array microphone •  Specifications •  Range 1-6m •  Update rate 30Hz •  Latency 100ms •  Tracking resolution < 5mm •  Range Camera extracts depth information and combines it with a video signal
  • 83. Hybrid Tracking •  Idea: Multiple technologies overcome limitations of each one •  A system that utilizes two or more position/orientation measurement technologies (e.g. inertial + vision) •  ++: Robust, reduce latency, increase accuracy •  -- : More complex, expensive Intersense IS-900 Ascension Laser Bird
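The core fusion idea can be sketched as a complementary filter: dead-reckon with the fast inertial data between fixes, and pull the estimate toward the slower, drift-free absolute measurement (ultrasonic or optical) whenever one arrives, so inertial drift cannot accumulate. The gain value and names are illustrative; commercial systems such as the IS-900 below use more sophisticated (e.g. Kalman) filters:

    def fuse_position(prev_pos, inertial_velocity, dt,
                      absolute_pos=None, k=0.05):
        # Dead-reckon with the high-rate inertial data...
        pos = prev_pos + inertial_velocity * dt
        # ...and blend in the drift-free absolute fix when available.
        if absolute_pos is not None:
            pos = (1.0 - k) * pos + k * absolute_pos
        return pos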
  • 84. Example: Intersense IS-900 •  Inertial-ultrasonic hybrid tracking •  Uses ultrasonic strips for position sensing •  Inertial sensing for orientation •  Sensor fusion to combine the two •  Specifications •  Latency 4ms •  Update 180 Hz •  Resolution 0.75mm, 0.05° •  Accuracy 3mm, 0.25° •  Up to 140m2 tracking volume •  http://www.intersense.com/pages/20/14
  • 85. Demo: IS-1200 and IS-900 •  https://www.youtube.com/watch?v=NkYLlTyuYkA
  • 86. Example: Vive Lighthouse Tracking •  Outside-in hybrid tracking system •  2 base stations •  Each with 2 laser scanners, LED array •  Headworn/handheld sensors •  37 photo-sensors in HMD, 17 in hand •  Additional IMU sensors (500 Hz) •  Performance •  Tracking server fuses sensor samples •  Sampling rate 250 Hz, 4 ms latency •  2mm RMS tracking accuracy •  Large area - 5 x 5m range •  See http://doc-ok.org/?p=1478
  • 87. Lighthouse Components •  Base station: IR LED array, 2 x scanned lasers •  Head Mounted Display: 37 photo sensors, 9-axis IMU
  • 89. How Lighthouse Tracking Works •  Position tracking using IMU •  500 Hz sampling •  But drifts over time •  Drift correction using optical tracking •  IR synchronization pulse (60 Hz) •  Laser sweep between pulses •  Photo-sensors recognize sync pulse, measure time to laser •  Know when sensor hit and which sensor hit •  Calculate position of sensor relative to base station •  Use 2 base stations to calculate pose •  Use IMU sensor data between pulses (500Hz) •  See http://xinreality.com/wiki/Lighthouse
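Under the commonly described Lighthouse model (each rotor spinning at 60 Hz, so a full 360° laser sweep takes 1/60 s), the delay between the sync flash and a photo-sensor being hit directly encodes that sensor's angle from the base station; a sketch:

    ROTATION_HZ = 60.0  # one full laser sweep per 1/60 s, per rotor

    def sweep_angle_deg(t_hit_s, t_sync_s):
        # Fraction of a rotation completed since the sync pulse,
        # converted to degrees.
        return (t_hit_s - t_sync_s) * ROTATION_HZ * 360.0

    # One rotor sweeps horizontally and the other vertically, so each
    # base station yields two angles per sensor; many sensors plus the
    # 500 Hz IMU give the tracker enough rays to solve the 6-DOF pose.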
  • 90. Lighthouse Tracking •  Base station scanning: https://www.youtube.com/watch?v=avBt_P0wg_Y •  Room tracking: https://www.youtube.com/watch?v=oqPaaMR4kY4