Autobiographical memories in VR

  • Human memory is essential for functioning in the world
  • AI can create digital content that is encoded in our memory
  • VR is a medium for human-AI collaboration
  • VR can evoke powerful emotions and meaningful experiences

Research Question

Can we detect the recall of meaningful, self-relevant memories in VR to support human-AI collaboration?

Study design and Experiment

We created four meaningful VR environments with the help of focus groups. Participants visit these VR environments and are given the task of creating memories by taking pictures (encoding phase). Two days later, participants revisit the places where they took pictures (n=50) and randomly selected locations in the scene (n=50) (retrieval phase).

Encoding phase

In the old millride scenario, participants are seated in a roller coaster cart, holding the handle in front of them. The train makes a gentle, smooth ride that participants can control with the start and stop buttons attached to the handle; they are asked to stop the train whenever they find something of interest. The virtual environment consists of multiple scenes intended to evoke different emotional experiences. After completing each scene, participants report their subjective emotional state using the Self-Assessment Manikin (SAM) scale. Each train stop and each picture taken, together with its coordinates, is logged alongside eye-tracking data (available at https://github.com/kgupta2789/AMinVR/tree/main/data/pupil).
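As a rough illustration of how the logged encoding-phase events could be processed, the sketch below parses a small CSV-style log of train stops and pictures. The column names (`timestamp`, `event`, `x`, `y`, `z`) and the log format are assumptions for illustration only; the actual files in `data/pupil` may be structured differently.

```python
import csv
import io

# Hypothetical log excerpt; the real format in data/pupil may differ.
SAMPLE_LOG = """timestamp,event,x,y,z
12.40,train_stop,10.2,1.5,-3.8
15.10,picture,10.2,1.5,-3.8
"""

def parse_events(text):
    """Return a list of (timestamp, event, (x, y, z)) tuples."""
    rows = csv.DictReader(io.StringIO(text))
    return [
        (float(r["timestamp"]), r["event"],
         (float(r["x"]), float(r["y"]), float(r["z"])))
        for r in rows
    ]

events = parse_events(SAMPLE_LOG)
# Picture events carry the coordinates later used in the retrieval phase.
pictures = [e for e in events if e[1] == "picture"]
```

A parser like this would let the picture coordinates be extracted directly for placing participants at the same spots during retrieval.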

A video of the encoding phase can be found here: https://www.youtube.com/watch?v=mGb7Oi5CHNc

Retrieval phase

Two days later, participants are transported, for two minutes each, to the locations where they took pictures and to randomly selected spots on the map where they did not take a picture.
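One way to pick the random non-picture spots is rejection sampling: draw a point within the map bounds and keep it only if it is sufficiently far from every picture location. This is a minimal sketch of that idea; the function name, the minimum-distance threshold, and the 2D (x, z) map coordinates are assumptions, not the study's actual procedure.

```python
import math
import random

def sample_distractor(picture_spots, bounds, min_dist=5.0, rng=None):
    """Sample a random (x, z) point inside `bounds` that lies at least
    `min_dist` away from every picture location (hypothetical helper)."""
    rng = rng or random.Random(0)
    (xmin, xmax), (zmin, zmax) = bounds
    while True:
        point = (rng.uniform(xmin, xmax), rng.uniform(zmin, zmax))
        # Reject candidates that fall too close to any picture spot.
        if all(math.dist(point, spot) >= min_dist for spot in picture_spots):
            return point

spot = sample_distractor([(0.0, 0.0)], bounds=((-50, 50), (-50, 50)))
```

Keeping a minimum distance from the picture spots avoids retrieval locations that accidentally overlap with the encoded ones.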

Measurements

During the retrieval phase, we measure the electroencephalogram (EEG), galvanic skin response, heart rate variability, and pupil responses, and obtain familiarity ratings.

Project status

Data collection is currently in progress. The labeled dataset will be uploaded to this repository once collection is finished.

Acknowledgements

This research is supported by the HumaneAI Net, the DFKI (German Research Centre for Artificial Intelligence), the University of Auckland, and the LMU Munich.

About

Understanding Autobiographical Memory in Virtual Reality using Physiological Information