
Object-based attention during scene perception elicits boundary contraction in memory

Journal Article link.
OSF Repository link.
Beth's website link.

If you use this code or dataset in your research, please cite this paper:

@article{hall2024object,
  title={Object-based attention during scene perception elicits boundary contraction in memory},
  author={Hall, Elizabeth H and Geng, Joy J},
  journal={Memory \& cognition},
  pages={1--13},
  year={2024},
  publisher={Springer}
}

Abstract

Boundary contraction and extension are two types of scene transformations that occur in memory. In extension, viewers extrapolate information beyond the edges of the image, whereas in contraction, viewers forget information near the edges. Recent work suggests that image composition influences the direction and magnitude of boundary transformation. We hypothesize that selective attention at encoding is an important driver of boundary transformation effects, with selective attention to specific objects at encoding leading to boundary contraction. In this study, one group of participants (N = 36) memorized 15 scenes while searching for targets, while a separate group (N = 36) just memorized the scenes. Both groups then drew the scenes from memory with as much object and spatial detail as they could remember. We asked online workers to rate the boundary transformations in the drawings, as well as how many objects they contained and the precision of remembered object size and location. We found that search-condition drawings showed significantly greater boundary contraction than drawings of the same scenes in the memorize condition. Search drawings were significantly more likely to contain target objects, and the likelihood of recalling other objects in the scene decreased as a function of their distance from the target. These findings suggest that selective attention to a specific object due to a search task at encoding will lead to significant boundary contraction.

Data

CSV files for the main analyses are included here. The full dataset can be downloaded from the OSF Repository link. It contains the drawings made from memory, the 15 scenes used in the experiment with segmentations for the objects in each scene, eye-tracking fixations from the study phase, and ratings from the three online AMT tasks.
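As an illustration only, the snippet below shows one way the main-analysis CSVs could be loaded with pandas. The file names and the data/ directory are placeholders, not the repository's actual layout; substitute the CSV names included here.

```python
# Minimal sketch of loading the main-analysis CSVs with pandas.
# File names and the "data/" directory are placeholders; substitute the
# actual CSV names included in this repository.
import pandas as pd

boundary_ratings = pd.read_csv("data/boundary_ratings.csv")  # hypothetical name
object_ratings = pd.read_csv("data/object_ratings.csv")      # hypothetical name

# Quick look at what each table contains.
print(boundary_ratings.head())
print(object_ratings.columns.tolist())
```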

Code

Settings defines the directories.
The Attention notebook includes code for the eye-tracking analyses.
Boundary calculates boundary transformations in the drawings (a rough sketch of this kind of comparison appears after this list).
Corners is used to define the scale of the scanned-in drawings.
Location calculates the shift in remembered object location from the memory drawings.
Memory includes models of what determines whether an object in the image will be drawn from memory.
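For orientation, here is a minimal sketch of the kind of condition comparison the Boundary notebook addresses: averaging rated boundary-transformation scores by encoding condition and testing the search vs. memorize difference. The file name, the condition and boundary_score columns, and the sign convention (negative = contraction) are all assumptions, and a simple t-test stands in for whatever analysis the notebook actually runs.

```python
# Minimal sketch of a search vs. memorize comparison of boundary scores.
# All file and column names are hypothetical; negative scores are assumed
# to indicate contraction and positive scores extension.
import pandas as pd
from scipy import stats

ratings = pd.read_csv("data/boundary_ratings.csv")  # hypothetical file name

# Mean boundary-transformation score per encoding condition.
print(ratings.groupby("condition")["boundary_score"].mean())

# Independent-samples t-test on the two conditions.
search = ratings.loc[ratings["condition"] == "search", "boundary_score"]
memorize = ratings.loc[ratings["condition"] == "memorize", "boundary_score"]
t_stat, p_val = stats.ttest_ind(search, memorize)
print(f"search vs. memorize: t = {t_stat:.2f}, p = {p_val:.3f}")
```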
