Visual-Search

Based on "VRception" [1], a cross-reality toolkit whose alpha-blending function lets users move freely between (simulated) Augmented Reality, (simulated) Augmented Virtuality, and Virtual Reality, I designed this visual search experiment with 6 conditions: 3 realities × 2 difficulty levels. The high-difficulty task has more distractors than the low-difficulty one.

When the search begins, the player has to find the specific target named on a whiteboard among the distractors, then point at it and click with an HTC controller before time runs out. The player must complete a training session before the 6 tasks.

Experimental data, along with EEG signals and eye-tracking data, were logged to CSV files for cognitive-load and attention analysis.
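As a rough illustration of this kind of per-trial logging (the actual Unity logger and its column names are not part of this repository; the schema and field names below are hypothetical examples), each trial could be appended to a CSV like so:

```python
import csv
import io

# Hypothetical per-trial schema; the repository's real CSV columns may differ.
FIELDS = ["participant", "condition", "difficulty", "target_found", "reaction_time_s"]

def log_trial(writer, participant, condition, difficulty, target_found, reaction_time_s):
    """Append one visual-search trial as a CSV row."""
    writer.writerow({
        "participant": participant,
        "condition": condition,          # e.g. "AR", "AV", or "VR"
        "difficulty": difficulty,        # "low" or "high"
        "target_found": target_found,    # clicked the correct target in time?
        "reaction_time_s": reaction_time_s,
    })

# Write to an in-memory buffer for demonstration; a real logger would open a file.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
log_trial(writer, "P01", "AR", "high", True, 2.31)
print(buf.getvalue())
```

A row-per-trial layout like this keeps the behavioral log easy to join with the separately timestamped EEG and eye-tracking streams during analysis.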

Note: This repository includes only the eye-tracking validation, the training session, and the questionnaire; the 6 tasks were removed. Run 03_Training.unity under Assets/Visual Search/Scenes to see how the visual search works. Pictures and a video of the visual search tasks under the different conditions are available in my portfolio: https://www.artstation.com/artwork/VJnkDP?notification_id=6033418437

References

  1. Uwe Gruenefeld, Jonas Auda, Florian Mathis, Stefan Schneegass, Mohamed Khamis, Jan Gugenheimer, and Sven Mayer. "VRception: Rapid Prototyping of Cross-Reality Systems in Virtual Reality". In: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 2022, pp. 1–15. https://doi.org/10.1145/3491102.3501821
