Type
Text
Type
Dissertation
Advisor
Zelinsky, Gregory J. | Brennan, Susan E. | Leung, Hoi-Chung | Casasanto, Daniel
Date
2012-05-01
Keywords
arrangement, configuration, guidance, relational information, visual search | Cognitive psychology--Experimental psychology
Department
Department of Experimental Psychology
Language
en_US
Source
This work is sponsored by the Stony Brook University Graduate School in compliance with the requirements for completion of degree.
Identifier
http://hdl.handle.net/11401/71405
Publisher
The Graduate School, Stony Brook University: Stony Brook, NY.
Format
application/pdf
Abstract
Objects in the real world exist relative to other objects, resulting in an intricate web of spatial relationships. Do we use this relational information when we search for objects? Current search theory suggests that object relationships can only be established using focal attention (Logan, 1994; 1995). If this is true, pre-attentive search guidance by relational information should be impossible. In a series of seven experiments, I demonstrate that search guidance by relational information is possible, even in the absence of real-world contextual constraints that may magnify relational guidance. Experiment 1 shows search guidance by relational information alone, i.e., in the absence of target feature guidance. Experiment 2 indicates that relational guidance is evident in highly heterogeneous displays as well. Experiment 3 demonstrates that relational guidance does not affect search when targets are cued using text labels referring to four object classes, suggesting that the effective coding of relational information may require highly specific target features. Experiment 4 shows that relational guidance is selectively not expressed when functional relationships between objects are contrary to real-world expectations (e.g., a hammer below a nail), suggesting that relational guidance is affected by object spatial associations in long-term memory. Experiment 5 further demonstrates that with minimal practice there is a small automatic contribution to relational guidance, though with continued practice relational guidance increases or disappears depending upon task demands. Experiments 6 and 7 show that relational guidance is unaffected by various grouping cues, suggesting that object spatial relationships are not coded by low-level visual processes, but rather by higher-order pointers that code the categorical spatial relationships between objects (above, below, left, right).
Collectively, these experiments suggest that object spatial relationships are encoded into the guiding target template at preview, thereby making this relational information available to guide search and removing the need to assume a pre-attentive coding of relational information between peripherally viewed search objects.
Extent
100 pages
Recommended Citation
Schmidt, Joseph C., "Relational information between objects is available to guide search." (2012). Stony Brook Theses and Dissertations Collection, 2006-2020 (closed to submissions). 611.
https://commons.library.stonybrook.edu/stony-brook-theses-and-dissertations-collection/611