Contextual-Cueing-Unity

The classic contextual cueing paradigm for visual search (Chun and Jiang, 1998), implemented in the Unity game engine (v. 2019.1.0f2). This implementation aims for flexibility in the design of the search displays and will be extended continuously.

Classic 2D contextual cueing (left) and an example contextual cueing VR configuration (right).

Features:

  • The experiment can be performed either directly on a computer with mouse and keyboard or with a virtual reality headset.
  • General workflow: the number of trials within a block and the overall number of blocks can be defined flexibly via the Unity Editor interface.
  • Search displays: the distance, size, number, and positioning of individual objects can be defined flexibly via the Unity Editor interface.
  • Inter-trial gaze pointer with an adjustable sphere to detect fixations.
  • A Fisher-Yates shuffle is used to check whether the same old search display is repeated within one block.
  • Measurement of reaction times, head movements, and display configurations, saved to the /data/sub-**/ folder.
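The repetition check mentioned above can be sketched language-agnostically. The following is a minimal Python illustration, not the project's actual code (which is written in C# for Unity); the function name and the re-shuffle-on-repeat policy are assumptions:

```python
import random

def shuffle_without_repeats(displays, max_attempts=100):
    """Hypothetical sketch: Fisher-Yates shuffle, re-run until no
    display appears twice in a row within the block."""
    order = list(displays)
    for _ in range(max_attempts):
        # Classic Fisher-Yates: walk backwards, swapping each slot
        # with a uniformly chosen slot at or before it.
        for i in range(len(order) - 1, 0, -1):
            j = random.randint(0, i)
            order[i], order[j] = order[j], order[i]
        # Accept the shuffle only if no identical displays are adjacent.
        if all(a != b for a, b in zip(order, order[1:])):
            return order
    return order  # fall back to the last shuffle attempted

block = shuffle_without_repeats(["old1", "old2", "new1", "old1", "new2"])
```

Rejecting and re-shuffling keeps every adjacency-free ordering equally likely, which a local "swap the duplicate away" repair would not.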

Calibration using Pupil Labs eye tracking inside of Unity.

An example search display.

Detailed information

Currently, objects are placed on a variable number of circles (default: six) surrounding the participant; future versions will include other methods of spawning distractors and targets. The numbers of repeated and randomly generated search displays can be set separately, as can the total number of distractors and their placement in virtual space.
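The circle-based placement can be sketched as follows. This is an illustrative Python version, not the project's C# code; the radii, item counts, and angular spacing are assumed values, not the project's defaults:

```python
import math

def circle_positions(n_circles=6, items_per_circle=8,
                     base_radius=1.0, spacing=0.5):
    """Place candidate object positions on concentric circles around
    the participant at the origin (hypothetical parameter values)."""
    positions = []
    for c in range(n_circles):
        radius = base_radius + c * spacing
        for k in range(items_per_circle):
            # Evenly spaced angles around each circle.
            angle = 2 * math.pi * k / items_per_circle
            # Coordinates in the horizontal plane; in Unity these would
            # typically map to the x/z axes, with y pointing up.
            positions.append((radius * math.cos(angle),
                              radius * math.sin(angle)))
    return positions

pts = circle_positions()
```

Targets and distractors can then be assigned to a subset of these slots, which keeps eccentricity (distance from the participant) an explicit, controllable factor of the display.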
