Vision is the brain’s primary portal on the world, and research on visual perception is critical not only to understanding brain mechanisms of vision but also to understanding how people are able to optimize visually guided tasks. Our research group is devoted to understanding mid- to high-level visual processes, where vision interfaces with other cognitive and motor systems to support intelligent behavior. Specifically, we study how an image of the external world, available to the eyes, is transformed into a meaningful representation of objects, surfaces, and scenes. In addition, we focus on understanding the mechanisms of attention and attentional control that allow the brain to select objects that are relevant to current goals and behavior.
The Visual Perception Research Group is composed of four main laboratories within the Department of Psychological and Brain Sciences, led by professors Andrew Hollingworth, Cathleen Moore, J. Toby Mordkoff, and Shaun Vecera. Ph.D. students and post-docs tend to have a primary home in one of the four laboratories, but the research group is highly collaborative, and most students develop projects that span laboratories and advisors. In addition to the four labs here in Psychological and Brain Sciences, there is a rich network of collaboration with other research groups across the University of Iowa campus that study related aspects of vision and perception.
Join our group to start exploring visual perception with us at Iowa! We encourage interested graduate students to contact one or more of the primary faculty before applying to discuss research interests and opportunities. Students formally apply to one of the three broad graduate training areas (Clinical Science, Cognitive, or Behavioral and Cognitive Neuroscience), or through our Individualized Graduate Training Track. Graduate training is student-centered, with the program of study designed for each student to meet their career objectives and to prepare them for the next career stage. Students conduct research from the beginning of their graduate careers and are encouraged to develop independent lines of work as soon as possible.
Participating faculty and their interests are described below:
- Andrew Hollingworth: Prof. Hollingworth’s lab studies a broad range of topics devoted to understanding the interactions between visual perception, attention, eye movements, and visual memory. Recent work has focused on the mechanisms by which human attention and gaze are controlled to support real-world behavior and on the role of visual memory in these processes. In addition, we study the relationship between mechanisms for selection in vision and similar mechanisms for selection within visual and spatial memory.
- Cathleen Moore: Prof. Moore’s lab is concerned with understanding visual perception and attention, with an emphasis on how the visual system structures incoming visual information (perceptual organization) and how that structure affects other aspects of visual processing. We see objects in the world, but that is not the information received at the eye. The eyes, like the digital sensor in a phone’s camera, register spatial patterns of light (e.g., changes in intensity and wavelength across space). How does the visual system give rise to the experience of a world of objects from those spatial patterns of light?
- J. Toby Mordkoff: The primary focus of Prof. Mordkoff's lab is the nature of mental architecture, particularly in how modules or sub-systems are inter-connected and how information flows through the system. This work involves a variety of methods and approaches, including traditional measures of response time and accuracy, response force, mathematical modeling and simulation, and psychophysiological measures (e.g., event-related brain potentials).
- Shaun Vecera: Prof. Vecera’s lab studies many aspects of how selective attention works, with most of our recent work focusing on attentional control (how attention knows where to go) and distraction (how attention goes to the wrong place). Our most recent work has examined experience-based attentional control, that is, how experience in a task environment helps ‘tune’ attention to become more selective and resist distraction (i.e., attentional capture).
Faculty in Visual Perception
- Attention, eye movements, visual memory, scene perception, spatial cognition
- Visual perception, attention, object perception, perceptual organization
- Selective and divided attention, correlational cuing, computational modeling, and psychophysiological measures
- Ecological inference: perception of natural scenes, with a special focus on hearing and audiovisual integration; intuitive physics and causal reasoning; constructing and testing models that "hear the world like humans" using generative models, probabilistic inference, statistical signal processing, and machine learning; biologically inspired technologies to aid the hearing-impaired; models of biological information processing
- Visual attention and perception