EMPATHIC COMPUTING
Mark Billinghurst
mark.billinghurst@auckland.ac.nz
April 23rd 2023
Designing for the Broader Metaverse
The Metaverse is Hot!
Google Searches on the Metaverse
• Jan – Dec 2021
• Metaverse (blue) vs. Virtual Reality (red)
Publications about the Metaverse
• Papers with “Metaverse” in the abstract
• 2021 – 46, 2022 – 1065, 2023 – 422 so far
But what is the Metaverse?
• Real-time, 3D, Interactive, Social, Persistent
— John Riccitiello, CEO of Unity Technologies (AWE 2022)
Meta’s Metaverse
The Broader Metaverse
• Neal Stephenson’s “Snow Crash” (1992)
• The Metaverse is the convergence of:
• 1) virtually enhanced physical reality
• 2) physically persistent virtual space
Metaverse Taxonomy
• Four Key Components
• Virtual Worlds
• Augmented Reality
• Mirror Worlds
• Lifelogging
• Metaverse Roadmap
• http://metaverseroadmap.org/
Mirror Worlds
• Simulations of external space/content
• Capturing and sharing surroundings
• Photorealistic content
• Digital twins
• Examples: Matterport, Deep Mirror, Google Street View, Soul Machines
Lifelogging
• Measuring user’s internal state
• Capturing physiological cues
• Recording everyday life
• Augmenting humans
• Examples: Apple, Fitbit, Shimmer, OpenBCI
(Metaverse taxonomy quadrants: Augmenting, Immersing, Capturing, Sensing)
Research Opportunities
Possible Research Directions
• Lifelogging to VR
• Bringing real-world actions into VR; using VR to experience lifelogging data
• AR to Lifelogging
• Using AR to view lifelogging data in everyday life; sharing physiological data
• Mirror Worlds to VR
• VR copy of the real world; mirroring real-world collaboration in VR
• AR to Mirror Worlds
• Visualizing the past in place; asymmetric collaboration
• And more...
Modern Communication Trends
1. Improved Content Capture
• Move from sharing faces to sharing places
2. Increased Network Bandwidth
• Sharing natural communication cues
3. Implicit Understanding
• Recognizing behaviour and emotion
(Diagram: Empathic Computing sits at the intersection of Natural Collaboration, Implicit Understanding, and Experience Capture)
Empathic Computing Research Focus
Can we develop systems that allow
us to share what we are seeing,
hearing and feeling with others?
Key Elements of Empathic Systems
• Understanding
• Emotion recognition, physiological sensors
• Experiencing
• Content/environment capture, VR
• Sharing
• Communication cues, AR (see the sketch below)
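To make these elements concrete, here is a minimal Python sketch (not code from any of the systems described) of how Understanding and Sharing could fit together: a heart-rate reading is turned into a rough arousal estimate and packaged as a cue a remote AR/VR client could render. The sensor reader, baseline values, and message format are all illustrative assumptions.

```python
import time
import random  # stands in for a real physiological sensor stream

RESTING_HR = 65.0   # assumed per-user baseline (beats per minute)
MAX_HR = 140.0      # assumed ceiling used for normalisation

def read_heart_rate() -> float:
    """Placeholder for a real sensor SDK (smartwatch, Shimmer, etc.)."""
    return random.uniform(60, 120)

def arousal_from_hr(hr: float) -> float:
    """'Understanding': map heart rate onto a coarse 0..1 arousal estimate."""
    return max(0.0, min(1.0, (hr - RESTING_HR) / (MAX_HR - RESTING_HR)))

def make_cue(user_id: str, arousal: float) -> dict:
    """'Sharing': a cue message a collaborator's AR/VR view could visualise."""
    return {"user": user_id, "cue": "arousal",
            "value": round(arousal, 2), "timestamp": time.time()}

if __name__ == "__main__":
    for _ in range(3):
        print(make_cue("encoder-01", arousal_from_hr(read_heart_rate())))
        time.sleep(1)
```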
Example: Sharing Communication Cues
• Measuring non-verbal cues
• Gaze, face expression, heart rate
• Sharing in Augmented Reality
• Collaborative AR experiences
Empathy Glasses
• Combines eye tracking, a see-through display, and face-expression sensing
• Implicit cues: eye gaze, face expression
• Hardware: Pupil Labs eye tracker + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the
34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
Remote Collaboration
• Eye gaze pointer and remote pointing (see the sketch below)
• Face expression display
• Implicit cues for remote collaboration
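As a rough illustration (not the Empathy Glasses implementation), a shared gaze pointer can be as simple as mapping the eye tracker's normalised gaze sample onto the remote helper's copy of the shared camera frame; the GazeSample type and confidence threshold below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x_norm: float      # 0..1 across the wearer's scene-camera image
    y_norm: float      # 0..1 down the wearer's scene-camera image
    confidence: float  # tracker's confidence in this sample

def gaze_to_remote_pointer(sample, frame_w, frame_h, min_confidence=0.6):
    """Return pixel coordinates for the remote pointer overlay,
    or None when the gaze estimate is too unreliable to show."""
    if sample.confidence < min_confidence:
        return None
    return (int(sample.x_norm * frame_w), int(sample.y_norm * frame_h))

# Example: a confident sample near the centre of a 1280x720 shared frame.
print(gaze_to_remote_pointer(GazeSample(0.52, 0.48, 0.9), 1280, 720))
```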
Example: Connecting between Spaces
• Augmented Reality
• Bringing remote people into your real space
• Virtual Reality
• Bringing elements of the real world into VR
• AR/VR for sharing communication cues
• Sharing non-verbal communication cues
Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues
in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
Sharing Virtual Communication Cues
• AR/VR displays
• Gesture input (Leap Motion)
• Room scale tracking
• Conditions
• Baseline, FoV, Head-gaze, Eye-gaze
Sharing: Communication Cues (2018)
• What happens when you can’t see your colleague/agent?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M.
(2018, April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the
2018 CHI conference on human factors in computing systems (pp. 1-13).
(Images: collaborating; collaborator out of view)
Mini-Me Communication Cues in MR
• When the collaborator moves out of view, a Mini-Me avatar appears
• Miniature avatar placed in the real world
• Mini-Me points to shared objects and shows communication cues
• Redirected gaze, gestures
Example: Sensor Input into AR/VR
• Augmented Reality
• Bringing remote people into your real space
• Virtual Reality
• Bringing elements of the real world into VR
• AR/VR for sharing communication cues
• Sharing non-verbal communication cues
NeuralDrum
• Using brain synchronicity to increase connection
• Collaborative VR drumming experience
• Measure brain activity using 3 EEG electrodes
• Use PLV (phase-locking value) to calculate synchronization (see the sketch below)
• More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain
Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
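For reference, the phase-locking value is the magnitude of the mean phase difference between two signals, PLV = |mean(exp(i(φ1 − φ2)))|. Below is a generic Python sketch of that computation using band-passed, Hilbert-transformed EEG; the filter band, window length, and synthetic test signals are illustrative and not taken from the NeuralDrum paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x: np.ndarray, fs: float, lo: float = 8.0, hi: float = 12.0) -> np.ndarray:
    """Band-pass one EEG channel (alpha band chosen only as an example)."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def plv(x1: np.ndarray, x2: np.ndarray, fs: float) -> float:
    """PLV = |mean(exp(i*(phi1 - phi2)))| over the window; 1 = fully locked."""
    phi1 = np.angle(hilbert(bandpass(x1, fs)))
    phi2 = np.angle(hilbert(bandpass(x2, fs)))
    return float(np.abs(np.mean(np.exp(1j * (phi1 - phi2)))))

# Example with synthetic signals: a shared 10 Hz rhythm plus per-player noise.
fs = 250.0
t = np.arange(0, 4, 1 / fs)
shared = np.sin(2 * np.pi * 10 * t)
p1 = shared + 0.5 * np.random.randn(t.size)
p2 = shared + 0.5 * np.random.randn(t.size)
print(f"PLV: {plv(p1, p2, fs):.2f}")  # higher value -> stronger graphics effect
```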
Set Up
• HTC Vive HMD
• OpenBCI
• 3 EEG electrodes
Results
"It’s quite interesting, I actually felt like my
body was exchanged with my partner."
(Chart: poor player vs. good player)
Remote Communication
(Diagram: MiniMe avatars, virtual cues, enhanced emotion, brain synchronization, emotion recognition, scene capture, AI)
Accessibility and the Broader Metaverse
• Combine elements from different quadrants
• Augmenting, Immersing, Capturing, Sensing
• Example Projects
• Sharing emotions in VR
• Using VR for remote TBI (traumatic brain injury) therapy
• Intelligent agents for depression support
• Haptics for remote emotion sharing
• MiniMe agents for communication cue sharing
Non-visual Metaverse (2018)
• Using haptics/audio for remote collaboration
• Example: DualPanto (2018)
• Haptic-only VR experiences
• Designed for interaction by blind users
Schneider, O., Shigeyama,... & Baudisch, P. (2018, October). DualPanto: a haptic device that enables blind
users to continuously interact with virtual worlds. In Proceedings of the 31st Annual ACM Symposium on User
Interface Software and Technology (pp. 877-887).
Empathic AuRea (2022)
• Person in video see-through HMD (‘decoder’) looks at user wearing ECG (‘encoder’)
• AR aura showing the emotional state of the encoder (see the sketch below)
Valente, A., Lopes, D. S., Nunes, N., & Esteves, A. (2022, March). Empathic AuRea: Exploring the Effects of an
Augmented Reality Cue for Emotional Sharing Across Three Face-to-Face Tasks. In 2022 IEEE Conference on
Virtual Reality and 3D User Interfaces (VR) (pp. 158-166). IEEE.
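As a simple illustration of the idea (Empathic AuRea's actual mapping may differ), an aura colour can be interpolated from an ECG-derived heart rate; the baseline and ceiling heart rates and colour endpoints below are assumptions.

```python
def aura_rgb(heart_rate: float, calm_hr: float = 60.0, excited_hr: float = 110.0):
    """Interpolate from a cool blue (calm) to a warm red (aroused)."""
    t = (heart_rate - calm_hr) / (excited_hr - calm_hr)
    t = max(0.0, min(1.0, t))
    calm, excited = (80, 140, 255), (255, 70, 70)  # RGB endpoints (assumed)
    return tuple(round(c0 + t * (c1 - c0)) for c0, c1 in zip(calm, excited))

# Example: aura colours for low, medium, and high heart rates.
for hr in (62, 85, 115):
    print(hr, aura_rgb(hr))
```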
Delivering the Entire Metaverse
Empathic Tele-Existence
• Based on Empathic Computing
• Creating shared understanding
• Covering the entire Metaverse
• AR, VR, Lifelogging, Mirror Worlds
• Transforming collaboration
• Observer to participant
• Feeling of doing things together
• Supporting implicit collaboration
Google Searches on the Metaverse
• Jan – Dec 2021
• Metaverse (blue) vs. Virtual Reality (red)
Searches to April 2023
(Chart axis: Dec 2021 – Apr 2023)
The Metaverse Winter
VR Winter (2002-2013)
• April 2007, Computerworld
• VR voted 7th on its list of the 21 biggest technology flops
• Microsoft Bob was #1
What Happened After That
• Growth of internet
• Natural user interfaces
• Increase in graphics performance
• Explosion in mobile phones
• Pokémon GO!
Increase in VR Research
• Papers grew over fivefold
AR/VR Growth
• 171 million VR users worldwide
• More than 750 million monthly active users (Snap AR)
(Chart: Snap AR daily active users)
Lessons we can Learn
• Take advantage of newly available technology
• Eye tracking, physiological sensing, HMDs, etc.
• Significant decrease in costs
• Design for real needs
• Increasing number of people with accessibility needs
• Growing elderly population, health issues, etc.
• Focus on the research
• More people with skills available to do research
• Especially multi-disciplinary researchers
Conclusions
• Broader Metaverse
• Augmenting, Immersing, Capturing, Sensing
• Empathic Computing
• Systems that enhance understanding
• Combining AR, VR, Physiological sensors
• Opportunities for Research
• Accessibility, Empathic Tele-Existence
• Take advantage of the Metaverse winter
• Design for the whole metaverse
www.empathiccomputing.org
@marknb00
mark.billinghurst@auckland.ac.nz