Reynolds Receives NSF Funding for Research
Greg Reynolds received a National Science Foundation research grant, along with his co-PIs Lorraine Bahrick and Robert Lickliter of Florida International University, for their project "Selective Attention and Intersensory Processing in Infancy." The grant totals $700,000.
How do infants make sense of the world around them? This project examines the possibility that infant attention is initially drawn to the intersensory redundancy that occurs when the same information is perceived through more than one sensory system. An example of intersensory redundancy is the information common to the movements of the face and the sounds of the voice of a person speaking. Multiple measures will be used to determine: (1) whether infants pay attention to information provided by intersensory redundancy before paying attention to other types of information, (2) whether intersensory redundancy helps infants process information more efficiently, and (3) which areas of the brain are involved in intersensory processing in infancy. Answering these questions will provide insight into how infants learn from caregivers. The project will have broader impacts through training graduate and undergraduate students in cognitive neuroscience, which will increase the participation of underrepresented groups in STEM fields. Findings may also contribute to understanding the deficits that children with disabilities experience in processing audiovisual speech.
The project has three major aims, which will be addressed in a series of experiments with 5- and 10-month-old infants. Aim 1 examines how intersensory redundancy affects infants' attention to and learning from audiovisual speech, using simultaneous measures of heart-rate changes associated with attention and electroencephalogram (EEG) measures of attention and memory. Aim 2 examines whether infants' attention can be biased toward or away from intersensory redundancy by providing specific types of information during initial learning prior to EEG testing. Aim 3 determines which areas of the brain are involved in intersensory processing of audiovisual speech in infancy, using computational modeling of EEG data. Infants are expected to show enhanced brain activity in response to the redundant information provided by audiovisual speech, and areas of prefrontal cortex are expected to be involved in processing intersensory redundancy in infancy.