LECTURE 2:
VR PERCEPTION AND
PRESENCE
COMP 4010 - Virtual Reality
Semester 5 - 2019
Mark Billinghurst, Bruce Thomas, Gun Lee
University of South Australia
August 6th 2019
Overview
•Presence in VR
•Perception and VR
•Human Perception
•VR Technology
REVIEW LECTURE 1
The Incredible Disappearing Computer
• 1960-70’s: Room
• 1970-80’s: Desk
• 1980-90’s: Lap
• 1990-2000’s: Hand
Virtual Reality
• Users immersed in Computer Generated environment
• HMD, gloves, 3D graphics, body tracking
Augmented Reality
•Virtual Images blended with the real world
• See-through HMD, handheld display, viewpoint tracking, etc.
Milgram’s Mixed Reality (MR) Continuum
Augmented Reality Virtual Reality
Real World Virtual World
Mixed Reality
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and A. F. Kishino, (1994) A Taxonomy of Mixed Reality Visual Displays
Internet of Things
VR History Timeline
https://immersivelifeblog.files.wordpress.com/2015/04/vr_history.jpg
Sutherland Display
https://www.youtube.com/watch?v=NtwZXGprxag
Oculus Rift
Sony Morpheus
HTC/Valve Vive
2016 - Rise of Consumer HMDs
Projected HMD Sales
Large Commercial Market
Conclusion
• Virtual Reality has a long history
• > 50 years of HMDs, simulators
• Key elements for VR were in place by early 1990’s
• Displays, tracking, input, graphics
• Strong support from military, government, universities
• First commercial wave failed in late 1990’s
• Too expensive, bad user experience, poor technology, etc
• We are now in second commercial wave
• Better experience, Affordable hardware
• Large commercial investment, Significant installed user base
PRESENCE
“Virtual Reality is a synthetic sensory experience which
may one day be indistinguishable from the real physical
world.”
-Roy Kalawsky (1993)
Today
Tomorrow
Presence ..
“The subjective experience of being in one place or
environment even when physically situated in another”
Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence
questionnaire. Presence: Teleoperators and virtual environments, 7(3), 225-240.
Presence Definition
“Presence is a psychological state .. in
which even though part or all of an
individual’s current experience is generated
by .. technology, part or all of the individual’s
perception fails to .. acknowledge the role of
the technology in the experience.”
International Society for Presence Research, 2016
https://ispr.info/
Immersion vs. Presence
• Immersion: the extent to which technology delivers a vivid
illusion of reality to the senses of a human participant.
• Presence: a state of consciousness, the (psychological)
sense of being in the virtual environment.
• So Immersion produces a sensation of Presence
• Goal of VR: Create a high degree of Presence
• Make people believe they are really in Virtual Environment
Slater, M., & Wilbur, S. (1997). A framework for immersive virtual environments (FIVE):
Speculations on the role of presence in virtual environments. Presence: Teleoperators and
virtual environments, 6(6), 603-616.
Three Types of Presence
• Personal Presence, the extent to which the person
feels like he or she is part of the virtual environment;
• Social Presence, the extent to which other beings
(living or synthetic) also exist in the VE;
• Environmental Presence, the extent to which the
environment itself acknowledges and reacts to the
person in the VE.
Heeter, C. (1992). Being there: The subjective experience of presence.
Presence: Teleoperators & Virtual Environments, 1(2), 262-271.
Benefits of High Presence
• Leads to greater engagement, excitement and satisfaction
• Increased reaction to actions in VR
• People more likely to behave like in the real world
• E.g. people scared of heights in real world will be scared in VR
• More natural communication (Social Presence)
• Use same cues as face to face conversation
• Note: The relationship between Presence and Performance is
unclear – still an active area of research
How to Create Strong Presence?
• Use Multiple Dimensions of Presence
• Create rich multi-sensory VR experiences
• Include social actors/agents that interact with user
• Have environment respond to user
• What Influences Presence
• Vividness – ability to provide rich experience (Steuer 1992)
• Using Virtual Body – user can see themselves (Slater 1993)
• Internal factors – individual user differences (Sadowski 2002)
• Interactivity – how much users can interact (Steuer 1992)
• Sensory, Realism factors (Witmer 1998)
Factors Contributing to Presence
• From
• Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual
environments: A presence questionnaire. Presence, 7(3), 225-240.
Presence Guidelines (Sadowski 2002)
Example: UNC Pit Room (2002)
• Key Features
• Training room and pit room
• Physical walking
• Fast, accurate, room scale tracking
• Haptic feedback – feel edge of pit, walls
• Strong visual and 3D audio cues
• Task
• Carry object across pit
• Walk across or walk around
• Dropping virtual balls at targets in pit
• http://wwwx.cs.unc.edu/Research/eve/walk_exp/
Typical Subject Behaviour
• Note – from another pit experiment
• https://www.youtube.com/watch?v=VVAO0DkoD-8
Richie’s Plank
• https://www.youtube.com/watch?v=4M92kfnpg-k
Measuring Presence
• Presence is very subjective so there is a lot of debate
among researchers about how to measure it
• Subjective Measures
• Self report questionnaire
• University College London Questionnaire (Slater 1999)
• Witmer and Singer Presence Questionnaire (Witmer 1998)
• ITC Sense Of Presence Inventory (Lessiter 2000)
• Continuous measure
• Person moves slider bar in VE depending on Presence felt
• Objective Measures
• Behavioural
• reflex/flinch measure, startle response
• Physiological measures
• change in heart rate, skin conductance, skin temperature
Presence Slider
Example: Witmer and Singer (1998)
• 32 questions in 4 categories/factors
• Control (CF), Sensory (SF), Realism (RF), Distraction factors (DF)
• Answered on Likert scale from 1 to 7 ( 1 = low, 7 = high)
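As a sketch of how such a questionnaire is typically scored, the snippet below averages hypothetical 1–7 responses per factor. The item-to-factor assignment and scores are purely illustrative, not taken from Witmer and Singer's actual instrument:

```python
from collections import defaultdict

# Hypothetical responses: (factor, score on a 1-7 Likert scale).
# Factor codes follow the slide: CF, SF, RF, DF.
responses = [("CF", 6), ("CF", 5), ("SF", 7), ("SF", 6),
             ("RF", 4), ("RF", 5), ("DF", 3), ("DF", 2)]

def factor_scores(items):
    """Mean score per factor, plus an overall mean across all items."""
    by_factor = defaultdict(list)
    for factor, score in items:
        by_factor[factor].append(score)
    means = {f: sum(s) / len(s) for f, s in by_factor.items()}
    overall = sum(score for _, score in items) / len(items)
    return means, overall

means, overall = factor_scores(responses)
```

Real use of the questionnaire reverse-scores some items and weights factors; this sketch shows only the basic averaging.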
Example: Heartrate in Pit Experiment
• Physiological measures can be used as a reliable measure
of Presence - especially change in heart rate (HR)
• Change in HR agreed with UCL subjective questionnaire
• HR increased with passive haptics, and increase in fps
Meehan, M., Insko, B., Whitton, M., & Brooks, F. P. (2001). Physiological measures of presence
in virtual environments. In Proceedings of 4th International Workshop on Presence (pp. 21-23).
PERCEPTION AND VR
How do We Perceive Reality?
• We understand the world through
our senses:
• Sight, Hearing, Touch, Taste, Smell
(and others..)
• Two basic processes:
• Sensation – Gathering information
• Perception – Interpreting information
Simple Sensing/Perception Model
Goal of Virtual Reality
“.. to make it feel like you’re actually in
a place that you are not.”
Palmer Luckey
Co-founder, Oculus
Creating the Illusion of Reality
• Fooling human perception by using
technology to generate artificial sensations
• Computer generated sights, sounds, smell, etc.
Reality vs. Virtual Reality
• In a VR system there are input and output devices
between human perception and action
• Goal is to create illusion of reality – high Presence
Example Birdly - http://www.somniacs.co/
• Create illusion of flying like a bird
• Multisensory VR experience
• Visual, audio, wind, haptic
Birdly Demo
• https://www.youtube.com/watch?v=JApQBIsCK6c
HUMAN PERCEPTION
Motivation
• Understand: In order to create a strong sense of Presence
we need to understand the Human Perception system
• Stimulate: We need to be able to use technology to provide
real world sensory inputs, and create the VR illusion
VR Hardware Human Senses
Senses
• How an organism obtains information for perception:
• Sensation part of Somatic Division of Peripheral Nervous System
• Integration and perception requires the Central Nervous System
• Five major senses (but there are more..):
• Sight (Ophthalmoception)
• Hearing (Audioception)
• Taste (Gustaoception)
• Smell (Olfacoception)
• Touch (Tactioception)
Relative Importance of Each Sense
• Percentage of neurons in
brain devoted to each sense
• Sight – 30%
• Touch – 8%
• Hearing – 2%
• Smell - < 1%
• Over 60% of brain involved
with vision in some way
Other Lesser-Known Senses..
• Proprioception = sense of body position
• what is your body doing right now
• Equilibrium = balance
• Acceleration
• Nociception = sense of pain
• Temperature
• Satiety (the quality or state of being fed or gratified to or beyond capacity)
• Thirst
• Micturition
• Amount of CO2 and Na in blood
Sight
The Human Visual System
• Purpose is to convert visual input to signals in the brain
The Human Eye
• Light passes through cornea and lens onto retina
• Photoreceptors in retina convert light into electrochemical signals
Photoreceptors – Rods and Cones
• Retina photoreceptors come in two types, Rods and Cones
• Rods – 125 million, periphery of retina, no colour detection, night vision
• Cones – 4-6 million, center of retina, colour vision, day vision
Human Horizontal and Vertical FOV
• Humans can see ~135° vertical FOV (60° above, 75° below)
• See up to ~210° horizontal FOV, ~115° stereo overlap
• Colour/stereo in centre, Black & White/mono in periphery
Vergence + Accommodation
Vergence/Accommodation Demo
• https://www.youtube.com/watch?v=p_xLO7yxgOk
Vergence-Accommodation Conflict
• Looking at real objects, vergence and focal distance match
• In VR, vergence and accommodation can mismatch
• Eyes accommodate to the HMD screen, but converge on the virtual object behind (or in front of) it
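The geometry behind the conflict can be sketched in a few lines: vergence demand grows sharply as a fixated object comes closer, while the HMD's focal distance stays fixed. The 6.5 cm IPD and 2 m focal distance below are illustrative averages, not values from this lecture:

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.065):
    """Angle between the two eyes' lines of sight when fixating a
    point straight ahead at the given distance (average IPD assumed)."""
    return math.degrees(2 * math.atan(ipd_m / (2 * distance_m)))

# Fixating a virtual object 0.5 m away demands ~7.4 degrees of
# vergence, while accommodation stays at the screen's fixed optical
# distance (here assumed ~2 m, ~1.9 degrees of vergence demand) -
# the source of the vergence-accommodation conflict.
near = vergence_angle_deg(0.5)
screen = vergence_angle_deg(2.0)
```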
Visual Acuity
Visual Acuity Test Targets
• Ability to resolve details
• Several types of visual acuity
• detection, separation, etc
• Normal eyesight can see a 50 cent coin at 80m
• Corresponds to 1 arc min (1/60th of a degree)
• Max acuity = 0.4 arc min
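A quick sanity check of the coin figure above, assuming a ~32 mm coin (roughly an Australian 50-cent piece):

```python
import math

def angular_size_arcmin(object_m, distance_m):
    """Visual angle subtended by an object, in arc minutes."""
    return math.degrees(2 * math.atan(object_m / (2 * distance_m))) * 60

# A ~32 mm coin at 80 m subtends roughly 1.4 arc min - close to the
# ~1 arc min limit of normal (20/20) acuity quoted above.
coin = angular_size_arcmin(0.032, 80)
```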
Stereo Perception/Stereopsis
• Eyes separated by IPD
• Inter-pupillary distance
• 5 – 7.5cm (avg. 6.5cm)
• Each eye sees a different image
• Separated by image parallax
• Images fused to create 3D stereo view
Depth Perception
• The visual system uses a range of different Stereoscopic
and Monocular cues for depth perception
• Stereoscopic cues:
• eye convergence angle
• disparity between left and right images
• diplopia
• Monocular cues:
• eye accommodation
• perspective
• atmospheric artifacts (fog)
• relative sizes
• image blur
• occlusion
• motion parallax
• shadows
• texture
Parallax can be more important for depth perception!
Stereoscopy is important for size and distance evaluation
Common Depth Cues
Depth Perception Distances
• i.e. convergence/accommodation used for depth perception < 10m
Properties of the Human Visual System
• visual acuity: 20/20 is ~1 arc min
• field of view: ~200° monocular, ~120° binocular, ~135° vertical
• resolution of eye: ~576 megapixels
• temporal resolution: ~60 Hz (depends on contrast, luminance)
• dynamic range: instantaneous 6.5 f-stops, adapt to 46.5 f-stops
• colour: everything in CIE xy diagram
• depth cues in 3D displays: vergence, focus, (dis)comfort
• accommodation range: ~8cm to ∞, degrades with age
Comparison between Eyes and HMD
Property | Human Eyes | HTC Vive
FOV | 200° x 135° | 110° x 110°
Stereo Overlap | 120° | 110°
Resolution | 30,000 x 20,000 | 2,160 x 1,200
Pixels/inch | >2,190 (100mm to screen) | 456
Update | 60 Hz | 90 Hz
See http://doc-ok.org/?p=1414
http://www.clarkvision.com/articles/eye-resolution.html
http://wolfcrow.com/blog/notes-by-dr-optoglass-the-resolution-of-the-human-eye/
Hearing
Anatomy of the Ear
Auditory Thresholds
• Humans hear frequencies from 20 – 22,000 Hz
• Most everyday sounds from 80 – 90 dB
Sound Localization
• Humans have two ears
• localize sound in space
• Sound can be localized
using 3 coordinates
• Azimuth, elevation,
distance
Sound Localization
• https://www.youtube.com/watch?v=FIU1bNSlbxk
Sound Localization (Azimuth Cues)
Interaural Time Difference
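One common textbook model of this azimuth cue is Woodworth's spherical-head approximation of interaural time difference; the 8.75 cm head radius below is a conventional average, not a value from this slide:

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Woodworth's spherical-head ITD model:
    ITD = (r / c) * (sin(theta) + theta), theta = azimuth in radians,
    r = head radius, c = speed of sound in air (m/s)."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (math.sin(theta) + theta)

# A source directly to the side (90 degrees) gives the largest delay,
# roughly 0.65 ms; a source straight ahead gives zero ITD - which is
# one reason front/back confusions occur.
max_itd = itd_seconds(90)
```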
HRTF (Elevation Cue)
• Pinna and head shape affect frequency intensities
• Sound intensities measured with microphones in ear and
compared to intensities at sound source
• Difference is HRTF, gives clue as to sound source location
Accuracy of Sound Localization
• People can locate sound
• Most accurately in front of them
• 2-3° error in front of head
• Least accurately to sides and behind head
• Up to 20° error to side of head
• Largest errors occur above/below elevations and behind head
• Front/back confusion is an issue
• Up to 10% of sounds presented in the front are perceived
coming from behind and vice versa (more in headphones)
Butean, A., Bălan, O., Negoi, I., Moldoveanu, F., & Moldoveanu, A. (2015). Comparative research
on sound localization accuracy in the free-field and virtual auditory displays.
In Conference Proceedings of eLearning and Software for Education (eLSE) (No. 01, pp. 540-548).
Universitatea Nationala de Aparare Carol I.
Touch
Touch
• Mechanical/Temp/Pain stimuli transduced into Action
Potentials (AP)
• Transducing structures are specialized nerves:
• Mechanoreceptors: Detect pressure, vibrations & texture
• Thermoreceptors: Detect hot/cold
• Nociceptors: Detect pain
• Proprioceptors: Detect body position (spatial awareness)
• This triggers an AP which then travels to various
locations in the brain via the somatosensory nerves
Haptic Sensation
• Somatosensory System
• complex system of nerve cells that responds to changes to
the surface or internal state of the body
• Skin is the largest organ
• 1.3 – 1.7 m² in adults
• Tactile: Surface properties
• Receptors not evenly spread
• Most densely populated area is the tongue
• Kinesthetic: Muscles, Tendons, etc.
• Also known as proprioception
Cutaneous System
• Skin – heaviest organ in the body
• Epidermis outer layer, dead skin cells
• Dermis inner layer, with four kinds of mechanoreceptors
Mechanoreceptors
• Cells that respond to pressure, stretching, and vibration
• Slow Acting (SA), Rapidly Acting (RA)
• Type I at surface – light discriminate touch
• Type II deep in dermis – heavy and continuous touch
Receptor Type | Rate of Acting | Stimulus Frequency | Receptive Field | Detection Function
Merkel discs | SA-I | 0 – 10 Hz | Small, well defined | Edges, intensity
Ruffini corpuscles | SA-II | 0 – 10 Hz | Large, indistinct | Static force, skin stretch
Meissner corpuscles | RA-I | 20 – 50 Hz | Small, well defined | Velocity, edges
Pacinian corpuscles | RA-II | 100 – 300 Hz | Large, indistinct | Acceleration, vibration
Spatial Resolution
• Sensitivity varies greatly
• Two-point discrimination
Body Site | Threshold Distance
Finger | 2-3mm
Cheek | 6mm
Nose | 7mm
Palm | 10mm
Forehead | 15mm
Foot | 20mm
Belly | 30mm
Forearm | 35mm
Upper Arm | 39mm
Back | 39mm
Shoulder | 41mm
Thigh | 42mm
Calf | 45mm
http://faculty.washington.edu/chudler/chsense.html
Proprioception/Kinaesthesia
• Proprioception (joint position sense)
• Awareness of movement and positions of body parts
• Due to nerve endings and Pacinian and Ruffini corpuscles at joints
• Enables us to touch nose with eyes closed
• Joints closer to body more accurately sensed
• Users know hand position to within ~8cm without looking
• Kinaesthesia (joint movement sense)
• Sensing muscle contraction or stretching
• Cutaneous mechanoreceptors measuring skin stretching
• Helps with force sensation
Smell
Olfactory System
• Human olfactory system. 1: Olfactory bulb 2: Mitral cells 3: Bone 4: Nasal
epithelium 5: Glomerulus 6: Olfactory receptor neurons
How the Nose Works
• https://www.youtube.com/watch?v=zaHR2MAxywg
Smell
• Smells are sensed by olfactory sensory neurons in the
olfactory epithelium
• ~10 cm² of epithelium, with hundreds of different types of olfactory receptors
• Humans can detect at least 10,000 different odors
• Some researchers say trillions of odors
• Sense of smell closely related to taste
• Both use chemo-receptors
• Olfaction + taste contribute to flavour
• The olfactory system is the only sense that bypasses the
thalamus and connects directly to the forebrain
Taste
Sense of Taste
• https://www.youtube.com/watch?v=FSHGucgnvLU
Basics of Taste
• Sensation produced when a substance in the mouth
reacts chemically with taste receptor cells
• Taste receptors mostly on taste buds on the tongue
• 2,000 – 5,000 taste buds on tongues/100+ receptors each
• Five basic tastes:
• sweetness, sourness, saltiness, bitterness, and umami
• Flavour influenced by other senses
• smell, texture, temperature, “coolness”, “hotness”
Taste Trivia
VR TECHNOLOGY
Using Technology to Stimulate Senses
• Simulate output
• E.g. simulate real scene
• Map output to devices
• Graphics to HMD
• Use devices to
stimulate the senses
• HMD stimulates eyes
Visual simulation pipeline: 3D Graphics → HMD → Vision System → Brain
Example: Visual Simulation
Human-Machine Interface
Key Technologies for VR System
• Visual Display
• Stimulate visual sense
• Audio/Tactile Display
• Stimulate hearing/touch
• Tracking
• Changing viewpoint
• User input
• Input Devices
• Supporting user interaction
What Happens When Senses Don’t Match?
• 20-30% VR users experience motion sickness
• Sensory Conflict Theory
• Visual cues don’t match vestibular cues
• Eyes – “I’m moving!”, Vestibular – “No, you’re not!”
Avoiding Motion Sickness
• Better VR experience design
• More natural movements
• Improved VR system performance
• Less tracking latency, better graphics frame rate
• Provide a fixed frame of reference
• Ground plane, vehicle window
• Add a virtual nose
• Provide peripheral cue
• Eat ginger
• Reduces upset stomach
5 Key Technical Requirements for Presence
• Persistence
• > 90 Hz refresh, < 3 ms persistence, avoid retinal blur
• Optics
• Wide FOV > 90 degrees, comfortable eyebox, good calibration
• Tracking
• 6 DOF, 360 tracking, sub-mm accuracy, no jitter, good tracking volume
• Resolution
• Correct stereo, > 1K x 1K resolution, no visible pixels
• Latency
• < 20 ms latency, fuse optical tracking and IMU, minimize tracking loop
http://www.roadtovr.com/oculus-shares-5-key-ingredients-for-presence-in-virtual-reality/
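Two of these targets can be sanity-checked with simple arithmetic; the 100°/s head rotation speed below is an illustrative value, not one from the slide:

```python
# At a 90 Hz refresh rate each frame lasts ~11.1 ms, so a <3 ms
# persistence target means each pixel is lit for under ~27% of the
# frame (reducing retinal blur during head motion).
refresh_hz = 90
frame_ms = 1000 / refresh_hz
duty_cycle = 3 / frame_ms

# With 20 ms motion-to-photon latency, a head turning at 100 deg/s
# sees imagery ~2 degrees behind where it should be - enough to
# break presence, hence the <20 ms target.
head_speed_dps = 100   # illustrative head rotation speed
latency_s = 0.020
error_deg = head_speed_dps * latency_s
```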
VISUAL DISPLAY
Creating an Immersive Experience
•Head Mounted Display
•Immerse the eyes
•Projection/Large Screen
•Immerse the head/body
•Future Technologies
•Neural implants
•Contact lens displays, etc
HMD Basic Principles
• Use display with optics to create illusion of virtual screen
Key Properties of HMDs
• Lens
• Focal length, Field of View
• Occularity, Interpupillary distance
• Eye relief, Eye box
• Display
• Resolution, contrast
• Power, brightness
• Refresh rate
• Ergonomics
• Size, weight
• Wearability
Field of View
• Monocular FOV is the angular subtense of the displayed image as measured from the pupil of one eye.
• Total FOV is the total angular size of the displayed image visible to both eyes.
• Binocular (or stereoscopic) FOV refers to the part of the displayed image visible to both eyes.
• FOV may be measured horizontally, vertically or diagonally.
Ocularity
• Monocular – HMD image to only one eye.
• Biocular – Same HMD image to both eyes.
• Binocular (stereoscopic) – Different but matched images to each eye.
Interpupillary Distance (IPD)
• IPD is the horizontal distance between a user's eyes.
• IPD is the distance between the two optical axes in a binocular viewing system.
Distortion in Lens Optics
• HMD optics distort the displayed image (a rectangle maps to a pincushion shape)
Example Distortion
Oculus Rift DK2 HTC Vive
To Correct for Distortion
• Must pre-distort image
• This is a pixel-based
distortion
• Use shader programming
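A minimal sketch of such a pre-distortion, assuming a simple two-coefficient radial polynomial; real HMD SDKs ship calibrated per-lens coefficients and evaluate this per pixel in a fragment shader:

```python
def predistort(x, y, k1=0.22, k2=0.24):
    """Map a normalized image coordinate (centered at 0,0) to its
    pre-distorted (barrel) position so the lens's pincushion
    distortion cancels out. k1, k2 are illustrative coefficients,
    not values from any real headset."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Pixels near the centre are almost unmoved; pixels near the edge
# are pushed outward before the lens pulls them back in.
center = predistort(0.0, 0.0)
edge = predistort(0.7, 0.0)
```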
HMD Design Trade-offs
• Resolution vs. field of view
• As FOV increases, resolution decreases for fixed pixels
• Eye box vs. field of view
• Larger eye box limits field of view
• Size, Weight and Power vs. everything else
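The resolution-vs-FOV trade-off can be quantified as pixels per degree, using the Vive figures from the eye/HMD comparison earlier:

```python
def pixels_per_degree(horizontal_pixels, horizontal_fov_deg):
    """Angular resolution of a display: panel width divided by FOV."""
    return horizontal_pixels / horizontal_fov_deg

# HTC Vive: 1080 horizontal pixels per eye over ~110 degrees gives
# roughly 10 px/deg - far below the ~60 px/deg needed to match the
# eye's ~1 arc min acuity, which is why individual pixels are visible.
vive = pixels_per_degree(1080, 110)

# Matching acuity over the same 110 degree FOV would need a panel
# about 6,600 pixels wide per eye.
required = 60 * 110
```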
Oculus Rift
• Cost: $399 USD
• FOV: 110° horizontal
• Refresh rate: 90 Hz
• Resolution: 1080 x 1200/eye
• 3 DOF orientation tracking
• 3 axis positional tracking
Inside an Oculus Rift
Comparison Between HMDs
Computer Based vs. Mobile VR Displays
Google Cardboard
• Released 2014 (Google 20% project)
• >5 million shipped/given away
• Easy to use developer tools
Multiple Mobile VR Viewers Available
Projection/Large Display Technologies
• Room Scale Projection
• CAVE, multi-wall environment
• Dome projection
• Hemisphere/spherical display
• Head/body inside
• Vehicle Simulator
• Simulated visual display in windows
CAVE
• Developed in 1992, EVL University of Illinois Chicago
• Multi-walled stereo projection environment
• Head tracked active stereo
Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., & Hart, J. C. (1992). The CAVE: audio
visual experience automatic virtual environment. Communications of the ACM, 35(6), 64-73.
Typical CAVE Setup
• 4 sides, rear projected stereo images
Demo Video – Wisconsin CAVE
• https://www.youtube.com/watch?v=mBs-OGDoPDY
CAVE Variations
Stereo Projection
• Active Stereo
• Active shutter glasses
• Time synced signal
• Brighter images
• More expensive
• Passive Stereo
• Polarized images
• Two projectors (one/eye)
• Cheap glasses (powerless)
• Lower resolution/dimmer
• Less expensive
Caterpillar Demo
• https://www.youtube.com/watch?v=r9N1w8PmD1E
Vehicle Simulators
• Combine VR displays with vehicle
• Visual displays on windows
• Motion base for haptic feedback
• Audio feedback
• Physical vehicle controls
• Steering wheel, flight stick, etc
• Full vehicle simulation
• Emergencies, normal operation, etc
• Weapon operation
• Training scenarios
Demo: Boeing 787 Simulator
• https://www.youtube.com/watch?v=3iah-blsw_U
mark.billinghurst@unisa.edu.au
bruce.thomas@unisa.edu.au
gun.lee@unisa.edu.au
  • 6. Augmented Reality •Virtual Images blended with the real world • See-through HMD, handheld display, viewpoint tracking, etc..
  • 7. Milgram’s Mixed Reality (MR) Continuum Augmented Reality Virtual Reality Real World Virtual World Mixed Reality "...anywhere between the extrema of the virtuality continuum." P. Milgram and A. F. Kishino, (1994) A Taxonomy of Mixed Reality Visual Displays Internet of Things
  • 10. Oculus Rift Sony Morpheus HTC/Valve Vive 2016 - Rise of Consumer HMDs
  • 13. Conclusion • Virtual Reality has a long history • > 50 years of HMDs, simulators • Key elements for VR were in place by early 1990’s • Displays, tracking, input, graphics • Strong support from military, government, universities • First commercial wave failed in late 1990’s • Too expensive, bad user experience, poor technology, etc • We are now in second commercial wave • Better experience, Affordable hardware • Large commercial investment, Significant installed user base
  • 15. “Virtual Reality is a synthetic sensory experience which may one day be indistinguishable from the real physical world.” -Roy Kalawsky (1993) Today Tomorrow
  • 16. Presence .. “The subjective experience of being in one place or environment even when physically situated in another” Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence: Teleoperators and virtual environments, 7(3), 225-240.
  • 17. Presence Definition “Presence is a psychological state .. in which even though part or all of an individual’s current experience is generated by .. technology, part or all of the individual’s perception fails to .. acknowledge the role of the technology in the experience.” International Society for Presence Research, 2016 https://ispr.info/
  • 18. Immersion vs. Presence • Immersion: the extent to which technology delivers a vivid illusion of reality to the senses of a human participant. • Presence: a state of consciousness, the (psychological) sense of being in the virtual environment. • So Immersion produces a sensation of Presence • Goal of VR: Create a high degree of Presence • Make people believe they are really in Virtual Environment Slater, M., & Wilbur, S. (1997). A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments. Presence: Teleoperators and virtual environments, 6(6), 603-616.
  • 19. Three Types of Presence • Personal Presence, the extent to which the person feels like he or she is part of the virtual environment; • Social Presence, the extent to which other beings (living or synthetic) also exist in the VE; • Environmental Presence, the extent to which the environment itself acknowledges and reacts to the person in the VE. Heeter, C. (1992). Being there: The subjective experience of presence. Presence: Teleoperators & Virtual Environments, 1(2), 262-271.
  • 20. Benefits of High Presence • Leads to greater engagement, excitement and satisfaction • Increased reaction to actions in VR • People more likely to behave like in the real world • E.g. people scared of heights in real world will be scared in VR • More natural communication (Social Presence) • Use same cues as face to face conversation • Note: The relationship between Presence and Performance is unclear – still an active area of research
  • 21. How to Create Strong Presence? • Use Multiple Dimensions of Presence • Create rich multi-sensory VR experiences • Include social actors/agents that interact with user • Have environment respond to user • What Influences Presence • Vividness – ability to provide rich experience (Steuer 1992) • Using Virtual Body – user can see themselves (Slater 1993) • Internal factors – individual user differences (Sadowski 2002) • Interactivity – how much users can interact (Steuer 1992) • Sensory, Realism factors (Witmer 1998)
  • 22. Factors Contributing to Presence • From • Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence, 7(3), 225-240.
  • 24. Example: UNC Pit Room (2002) • Key Features • Training room and pit room • Physical walking • Fast, accurate, room scale tracking • Haptic feedback – feel edge of pit, walls • Strong visual and 3D audio cues • Task • Carry object across pit • Walk across or walk around • Dropping virtual balls at targets in pit • http://wwwx.cs.unc.edu/Research/eve/walk_exp/
  • 25. Typical Subject Behaviour • Note – from another pit experiment • https://www.youtube.com/watch?v=VVAO0DkoD-8
  • 27. Measuring Presence • Presence is very subjective so there is a lot of debate among researchers about how to measure it • Subjective Measures • Self report questionnaire • University College London Questionnaire (Slater 1999) • Witmer and Singer Presence Questionnaire (Witmer 1998) • ITC Sense Of Presence Inventory (Lessiter 2000) • Continuous measure • Person moves slider bar in VE depending on Presence felt • Objective Measures • Behavioural • reflex/flinch measure, startle response • Physiological measures • change in heart rate, skin conductance, skin temperature Presence Slider
  • 28. Example: Witmer and Singer (1998) • 32 questions in 4 categories/factors • Control (CF), Sensory (SF), Realism (RF), Distraction factors (DF) • Answered on Likert scale from 1 to 7 ( 1 = low, 7 = high)
  • 29. Example: Heartrate in Pit Experiment • Physiological measures can be used as a reliable measure of Presence - especially change in heart rate (HR) • Change in HR agreed with UCL subjective questionnaire • HR increased with passive haptics, and increase in fps Meehan, M., Insko, B., Whitton, M., & Brooks, F. P. (2001). Physiological measures of presence in virtual environments. In Proceedings of 4th International Workshop on Presence (pp. 21-23).
  • 31. How do We Perceive Reality? • We understand the world through our senses: • Sight, Hearing, Touch, Taste, Smell (and others..) • Two basic processes: • Sensation – Gathering information • Perception – Interpreting information
  • 33. Goal of Virtual Reality “.. to make it feel like you’re actually in a place that you are not.” Palmer Luckey Co-founder, Oculus
  • 34. Creating the Illusion of Reality • Fooling human perception by using technology to generate artificial sensations • Computer generated sights, sounds, smell, etc.
  • 35. Reality vs. Virtual Reality • In a VR system there are input and output devices between human perception and action • Goal is to create illusion of reality – high Presence
  • 36. Example Birdly - http://www.somniacs.co/ • Create illusion of flying like a bird • Multisensory VR experience • Visual, audio, wind, haptic
  • 39. Motivation • Understand: In order to create a strong sense of Presence we need to understand the Human Perception system • Stimulate: We need to be able to use technology to provide real world sensory inputs, and create the VR illusion VR Hardware Human Senses
  • 40. Senses • How an organism obtains information for perception: • Sensation part of Somatic Division of Peripheral Nervous System • Integration and perception requires the Central Nervous System • Five major senses (but there are more..): • Sight (Ophthalmoception) • Hearing (Audioception) • Taste (Gustaoception) • Smell (Olfacoception) • Touch (Tactioception)
  • 41. Relative Importance of Each Sense • Percentage of neurons in brain devoted to each sense • Sight – 30% • Touch – 8% • Hearing – 2% • Smell - < 1% • Over 60% of brain involved with vision in some way
  • 42. Other Lesser Known Senses.. • Proprioception = sense of body position • what is your body doing right now • Equilibrium = balance • Acceleration • Nociception = sense of pain • Temperature • Satiety (the quality or state of being fed or gratified to or beyond capacity) • Thirst • Micturition • Amount of CO2 and Na in blood
  • 43. Sight
  • 44. The Human Visual System • Purpose is to convert visual input to signals in the brain
  • 45. The Human Eye • Light passes through cornea and lens onto retina • Photoreceptors in retina convert light into electrochemical signals
  • 46. Photoreceptors – Rods and Cones • Retina photoreceptors come in two types, Rods and Cones • Rods – 125 million, periphery of retina, no colour detection, night vision • Cones – 4-6 million, center of retina, colour vision, day vision
  • 47. Human Horizontal and Vertical FOV • Humans can see ~135° vertical (60° above, 75° below) • See up to ~210° horizontal FOV, ~115° stereo overlap • Colour/stereo in centre, Black & White/mono in periphery
  • 50. Vergence-Accommodation Conflict • Looking at real objects, vergence and focal distance match • In VR, vergence and accommodation can mismatch • Eyes converge on the virtual object, but accommodate to the HMD screen in front of it
  • 51. Visual Acuity Visual Acuity Test Targets • Ability to resolve details • Several types of visual acuity • detection, separation, etc • Normal eyesight can see a 50 cent coin at 80m • Corresponds to 1 arc min (1/60th of a degree) • Max acuity = 0.4 arc min
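The 1 arc-minute figure on this slide can be sanity-checked with a little trigonometry: angular size = 2·atan(half the object size ÷ distance). A minimal sketch, where the ~32 mm coin diameter is an assumption (Australian 50-cent piece), not from the slide:

```python
import math

def angular_size_arcmin(object_size_m, distance_m):
    """Visual angle subtended by an object, in arc minutes."""
    angle_rad = 2 * math.atan((object_size_m / 2) / distance_m)
    return math.degrees(angle_rad) * 60

# A 50-cent coin of ~32 mm diameter (assumed) viewed at 80 m
print(f"{angular_size_arcmin(0.032, 80):.2f} arc min")  # ~1.38 arc min
```

The result comes out close to the 1 arc-min resolution limit quoted on the slide, which is why 80 m is roughly the edge of legibility for a coin-sized target.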
  • 52. Stereo Perception/Stereopsis • Eyes separated by IPD • Inter pupillary distance • 5 – 7.5cm (avge. 6.5cm) • Each eye sees diff. image • Separated by image parallax • Images fused to create 3D stereo view
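How vergence relates to viewing distance can be illustrated with the same small-angle geometry: the angle between the two lines of sight for a fixated point is 2·atan(IPD/2 ÷ distance). A quick sketch using the slide's average 6.5 cm IPD:

```python
import math

def vergence_angle_deg(ipd_m, distance_m):
    """Angle between the eyes' lines of sight when fixating a point at distance_m."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# Average IPD of 6.5 cm; the angle shrinks rapidly with distance,
# which is why vergence is mainly a near-field depth cue
for d in (0.5, 1.0, 10.0):
    print(f"{d:>4} m -> {vergence_angle_deg(0.065, d):.2f} deg")
```

At 10 m the angle is already under half a degree, consistent with convergence/accommodation only being useful depth cues at short range.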
  • 54. Depth Perception • The visual system uses a range of different Stereoscopic and Monocular cues for depth perception
        Stereoscopic: eye convergence angle; disparity between left and right images; diplopia
        Monocular: eye accommodation; perspective; atmospheric artifacts (fog); relative sizes; image blur; occlusion; motion parallax; shadows; texture
        Parallax can be more important for depth perception! Stereoscopy is important for size and distance evaluation
  • 56. Depth Perception Distances • i.e. convergence/accommodation used for depth perception < 10m
  • 57. Properties of the Human Visual System • visual acuity: 20/20 is ~1 arc min • field of view: ~200° monocular, ~120° binocular, ~135° vertical • resolution of eye: ~576 megapixels • temporal resolution: ~60 Hz (depends on contrast, luminance) • dynamic range: instantaneous 6.5 f-stops, adapt to 46.5 f-stops • colour: everything in CIE xy diagram • depth cues in 3D displays: vergence, focus, (dis)comfort • accommodation range: ~8cm to ∞, degrades with age
  • 58. Comparison between Eyes and HMD
        Property        Human Eyes                HTC Vive
        FOV             200° x 135°               110° x 110°
        Stereo Overlap  120°                      110°
        Resolution      30,000 x 20,000           2,160 x 1,200
        Pixels/inch     >2190 (100mm to screen)   456
        Update          60 Hz                     90 Hz
        See http://doc-ok.org/?p=1414 http://www.clarkvision.com/articles/eye-resolution.html http://wolfcrow.com/blog/notes-by-dr-optoglass-the-resolution-of-the-human-eye/
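One way to make this comparison concrete is angular resolution in pixels per degree. A rough sketch, assuming pixels are spread uniformly across the FOV (real lens mappings are not exactly uniform):

```python
def pixels_per_degree(h_pixels, h_fov_deg):
    """Rough horizontal angular resolution, assuming a uniform pixel-to-angle mapping."""
    return h_pixels / h_fov_deg

vive_ppd = pixels_per_degree(1080, 110)  # per-eye panel width over per-eye FOV
eye_ppd = 60                             # 1 arc min acuity = 60 'pixels' per degree
print(f"Vive: {vive_ppd:.1f} ppd vs eye: {eye_ppd} ppd")
```

The ~10 ppd of a first-generation consumer HMD versus the eye's ~60 ppd equivalent is why individual pixels remain visible, and it motivates the resolution vs. FOV trade-off discussed later.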
  • 61. Auditory Thresholds • Humans hear frequencies from 20 – 22,000 Hz • Most everyday sounds from 80 – 90 dB
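Decibels here are a logarithmic scale relative to the standard 20 µPa threshold of hearing; converting a level back to pressure shows how small these signals are. A minimal sketch:

```python
def db_spl_to_pascals(db_spl):
    """Sound pressure (Pa) for a given dB SPL level, re 20 micropascals."""
    return 20e-6 * 10 ** (db_spl / 20)

print(f"{db_spl_to_pascals(90):.2f} Pa")  # a 90 dB everyday sound is ~0.63 Pa
```

Every +20 dB is a 10x increase in pressure, so the 80-90 dB everyday range sits many orders of magnitude above the 0 dB threshold.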
  • 62. Sound Localization • Humans have two ears • localize sound in space • Sound can be localized using 3 coordinates • Azimuth, elevation, distance
  • 64. Sound Localization (Azimuth Cues) Interaural Time Difference
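A common back-of-envelope model for the interaural time difference is Woodworth's spherical-head formula, ITD = (r/c)(sin θ + θ). A sketch; the 8.75 cm head radius is a typical textbook value, not taken from the slide:

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Woodworth's frontal-plane ITD model: (r/c) * (sin(theta) + theta)."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (math.sin(theta) + theta)

print(f"{itd_seconds(90) * 1e6:.0f} microseconds")  # ~656 us for a source at 90 deg
```

Spatial audio renderers exploit exactly this sub-millisecond delay (plus interaural level differences and HRTF filtering) to place virtual sound sources.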
  • 65. HRTF (Elevation Cue) • Pinna and head shape affect frequency intensities • Sound intensities measured with microphones in ear and compared to intensities at sound source • Difference is HRTF, gives clue as to sound source location
  • 66. Accuracy of Sound Localization • People can locate sound • Most accurately in front of them • 2-3° error in front of head • Least accurately to sides and behind head • Up to 20° error to side of head • Largest errors occur above/below elevations and behind head • Front/back confusion is an issue • Up to 10% of sounds presented in the front are perceived coming from behind and vice versa (more in headphones) Butean, A., Bălan, O., Negoi, I., Moldoveanu, F., & Moldoveanu, A. (2015). Comparative research on sound localization accuracy in the free-field and virtual auditory displays. In Conference Proceedings of eLearning and Software for Education (eLSE) (No. 01, pp. 540-548). Universitatea Nationala de Aparare Carol I.
  • 67. Touch
  • 68. Touch • Mechanical/Temp/Pain stimuli transduced into Action Potentials (AP) • Transducing structures are specialized nerves: • Mechanoreceptors: Detect pressure, vibrations & texture • Thermoreceptors: Detect hot/cold • Nociceptors: Detect pain • Proprioceptors: Detect spatial awareness • This triggers an AP which then travels to various locations in the brain via the somatosensory nerves
  • 69. Haptic Sensation • Somatosensory System • complex system of nerve cells that responds to changes to the surface or internal state of the body • Skin is the largest organ • 1.3-1.7 square m in adults • Tactile: Surface properties • Receptors not evenly spread • Most densely populated area is the tongue • Kinesthetic: Muscles, Tendons, etc. • Also known as proprioception
  • 70. Cutaneous System • Skin – heaviest organ in the body • Epidermis outer layer, dead skin cells • Dermis inner layer, with four kinds of mechanoreceptors
  • 71. Mechanoreceptors • Cells that respond to pressure, stretching, and vibration • Slow Acting (SA), Rapidly Acting (RA) • Type I at surface – light discriminative touch • Type II deep in dermis – heavy and continuous touch
        Receptor Type        Rate of Acting  Stimulus Frequency  Receptive Field      Detection Function
        Merkel Discs         SA-I            0 – 10 Hz           Small, well defined  Edges, intensity
        Ruffini corpuscles   SA-II           0 – 10 Hz           Large, indistinct    Static force, skin stretch
        Meissner corpuscles  RA-I            20 – 50 Hz          Small, well defined  Velocity, edges
        Pacinian corpuscles  RA-II           100 – 300 Hz        Large, indistinct    Acceleration, vibration
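These frequency bands suggest which receptors a vibrotactile actuator will drive; for instance, typical ~150-250 Hz vibration motors mainly target Pacinian corpuscles. A small lookup sketch using only the ranges from this slide:

```python
def receptors_for_frequency(freq_hz):
    """Mechanoreceptors whose stimulus band (per the slide) covers freq_hz."""
    bands = {
        "Merkel Discs (SA-I)": (0, 10),
        "Ruffini corpuscles (SA-II)": (0, 10),
        "Meissner corpuscles (RA-I)": (20, 50),
        "Pacinian corpuscles (RA-II)": (100, 300),
    }
    return [name for name, (lo, hi) in bands.items() if lo <= freq_hz <= hi]

print(receptors_for_frequency(200))  # ['Pacinian corpuscles (RA-II)']
```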
  • 72. Spatial Resolution • Sensitivity varies greatly • Two-point discrimination
        Body Site   Threshold Distance
        Finger      2-3mm
        Cheek       6mm
        Nose        7mm
        Palm        10mm
        Forehead    15mm
        Foot        20mm
        Belly       30mm
        Forearm     35mm
        Upper Arm   39mm
        Back        39mm
        Shoulder    41mm
        Thigh       42mm
        Calf        45mm
        http://faculty.washington.edu/chudler/chsense.html
  • 73. Proprioception/Kinaesthesia • Proprioception (joint position sense) • Awareness of movement and positions of body parts • Due to nerve endings and Pacinian and Ruffini corpuscles at joints • Enables us to touch nose with eyes closed • Joints closer to body more accurately sensed • Users know hand position accurate to 8cm without looking at them • Kinaesthesia (joint movement sense) • Sensing muscle contraction or stretching • Cutaneous mechanoreceptors measuring skin stretching • Helps with force sensation
  • 74. Smell
  • 75. Olfactory System • Human olfactory system. 1: Olfactory bulb 2: Mitral cells 3: Bone 4: Nasal epithelium 5: Glomerulus 6: Olfactory receptor neurons
  • 76. How the Nose Works • https://www.youtube.com/watch?v=zaHR2MAxywg
  • 77. Smell • Smells are sensed by olfactory sensory neurons in the olfactory epithelium • 10 cm² with hundreds of different types of olfactory receptors • Humans can detect at least 10,000 different odors • Some researchers say trillions of odors • Sense of smell closely related to taste • Both use chemo-receptors • Olfaction + taste contribute to flavour • The olfactory system is the only sense that bypasses the thalamus and connects directly to the forebrain
  • 78. Taste
  • 79. Sense of Taste • https://www.youtube.com/watch?v=FSHGucgnvLU
  • 80. Basics of Taste • Sensation produced when a substance in the mouth reacts chemically with taste receptor cells • Taste receptors mostly on taste buds on the tongue • 2,000 – 5,000 taste buds on tongues/100+ receptors each • Five basic tastes: • sweetness, sourness, saltiness, bitterness, and umami • Flavour influenced by other senses • smell, texture, temperature, “coolness”, “hotness”
  • 83. Using Technology to Stimulate Senses • Simulate output • E.g. simulate real scene • Map output to devices • Graphics to HMD • Use devices to stimulate the senses • HMD stimulates eyes • Example (Visual Simulation): 3D Graphics → HMD → Vision System → Brain, across the Human-Machine Interface
  • 84. Key Technologies for VR System • Visual Display • Stimulate visual sense • Audio/Tactile Display • Stimulate hearing/touch • Tracking • Changing viewpoint • User input • Input Devices • Supporting user interaction
  • 85. What Happens When Senses Don’t Match? • 20-30% of VR users experience motion sickness • Sensory Conflict Theory • Visual cues don’t match vestibular cues • Eyes – “I’m moving!”, Vestibular – “No, you’re not!”
  • 86. Avoiding Motion Sickness • Better VR experience design • More natural movements • Improved VR system performance • Less tracking latency, better graphics frame rate • Provide a fixed frame of reference • Ground plane, vehicle window • Add a virtual nose • Provide peripheral cue • Eat ginger • Reduces upset stomach
  • 87. 5 Key Technical Requirements for Presence • Persistence • > 90 Hz refresh, < 3 ms persistence, avoid retinal blur • Optics • Wide FOV > 90 degrees, comfortable eyebox, good calibration • Tracking • 6 DOF, 360 tracking, sub-mm accuracy, no jitter, good tracking volume • Resolution • Correct stereo, > 1K x 1K resolution, no visible pixels • Latency • < 20 ms latency, fuse optical tracking and IMU, minimize tracking loop http://www.roadtovr.com/oculus-shares-5-key-ingredients-for-presence-in-virtual-reality/
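The refresh and latency numbers above can be turned into a simple budget check. A sketch using only the slide's rule-of-thumb thresholds:

```python
def frame_interval_ms(refresh_hz):
    """Time available per frame at a given refresh rate."""
    return 1000.0 / refresh_hz

def meets_presence_targets(refresh_hz, persistence_ms, latency_ms):
    """True if a display meets the slide's targets: >= 90 Hz, < 3 ms, < 20 ms."""
    return refresh_hz >= 90 and persistence_ms < 3 and latency_ms < 20

print(f"{frame_interval_ms(90):.1f} ms per frame")  # 11.1 ms budget at 90 Hz
print(meets_presence_targets(90, 2.0, 18.0))        # True
```

Note that at 90 Hz a single dropped frame already costs 11 ms, more than half of the 20 ms motion-to-photon budget.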
  • 89. Creating an Immersive Experience •Head Mounted Display •Immerse the eyes •Projection/Large Screen •Immerse the head/body •Future Technologies •Neural implants •Contact lens displays, etc
  • 90. HMD Basic Principles • Use display with optics to create illusion of virtual screen
  • 91. Key Properties of HMDs • Lens • Focal length, Field of View • Ocularity, Interpupillary distance • Eye relief, Eye box • Display • Resolution, contrast • Power, brightness • Refresh rate • Ergonomics • Size, weight • Wearability
  • 92. Field of View • Monocular FOV is the angular subtense of the displayed image as measured from the pupil of one eye. • Total FOV is the total angular size of the displayed image visible to both eyes. • Binocular (or stereoscopic) FOV refers to the part of the displayed image visible to both eyes. • FOV may be measured horizontally, vertically or diagonally.
  • 93. Ocularity • Monocular - HMD image to only one eye. • Biocular - Same HMD image to both eyes. • Binocular (stereoscopic) - Different but matched images to each eye.
  • 94. Interpupillary Distance (IPD) • IPD is the horizontal distance between a user's eyes. • IPD is the distance between the two optical axes in a binocular view system.
  • 95. Distortion in Lens Optics • HMD optics distort images shown in them • A rectangle maps to a distorted shape
  • 97. To Correct for Distortion • Must pre-distort image • This is a pixel-based distortion • Use shader programming
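In practice this pre-distortion is often a radial polynomial applied per pixel in the shader: warp each lens-centered coordinate by 1 + k1·r² + k2·r⁴ so that the rendered barrel distortion cancels the lens's pincushion. A Python sketch of the warp; the k1/k2 values are illustrative, not real headset coefficients:

```python
def predistort(x, y, k1=0.22, k2=0.24):
    """Radially warp normalized, lens-centered coordinates (illustrative k1/k2)."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A point halfway out from the lens center gets pushed outward by ~7%
px, py = predistort(0.5, 0.0)
print(round(px, 3), py)  # 0.535 0.0
```

A real implementation runs this in a fragment shader (or bakes it into a distortion mesh) and typically warps each colour channel slightly differently to correct chromatic aberration as well.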
  • 98. HMD Design Trade-offs • Resolution vs. field of view • As FOV increases, resolution decreases for fixed pixels • Eye box vs. field of view • Larger eye box limits field of view • Size, Weight and Power vs. everything else
  • 99. Oculus Rift • Cost: $399 USD • FOV: 110° Horizontal • Refresh rate: 90 Hz • Resolution 1080x1200/eye • 3 DOF orientation tracking • 3 axis positional tracking
  • 102. Computer Based vs. Mobile VR Displays
  • 103. Google Cardboard • Released 2014 (Google 20% project) • >5 million shipped/given away • Easy to use developer tools
  • 104. Multiple Mobile VR Viewers Available
  • 105. Projection/Large Display Technologies • Room Scale Projection • CAVE, multi-wall environment • Dome projection • Hemisphere/spherical display • Head/body inside • Vehicle Simulator • Simulated visual display in windows
  • 106. CAVE • Developed in 1992, EVL University of Illinois Chicago • Multi-walled stereo projection environment • Head tracked active stereo Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., & Hart, J. C. (1992). The CAVE: audio visual experience automatic virtual environment. Communications of the ACM, 35(6), 64-73.
  • 107. Typical CAVE Setup • 4 sides, rear projected stereo images
  • 108. Demo Video – Wisconsin CAVE • https://www.youtube.com/watch?v=mBs-OGDoPDY
  • 110. Stereo Projection • Active Stereo • Active shutter glasses • Time synced signal • Brighter images • More expensive • Passive Stereo • Polarized images • Two projectors (one/eye) • Cheap glasses (powerless) • Lower resolution/dimmer • Less expensive
  • 112. Vehicle Simulators • Combine VR displays with vehicle • Visual displays on windows • Motion base for haptic feedback • Audio feedback • Physical vehicle controls • Steering wheel, flight stick, etc • Full vehicle simulation • Emergencies, normal operation, etc • Weapon operation • Training scenarios
  • 113. Demo: Boeing 787 Simulator • https://www.youtube.com/watch?v=3iah-blsw_U