Auditory perceptual learning through multimodal training
Liu, Estella Huei-Mei
Sensitivity to sensory signals can be enhanced through training, which traditionally involves unimodal discrimination tasks. Recent work suggests that training with redundant multimodal stimuli can facilitate such perceptual learning. This dissertation examined how audiovisual training affects perceptual learning of small differences in the amplitude modulation of sounds, an important cue to auditory motion in depth. In five experiments, participants were trained to detect increasing or decreasing amplitude modulation either with or without concurrent presentation of growing or shrinking disks. Across the five experiments, multimodal training was accompanied by faster and more robust learning than unimodal training. Superior learning was found when the crossmodal cues were congruent (Experiments 2–5) but not when they were incongruent (Experiments 4 and 5). The successful transfer of learning across tasks (Experiments 2–5) and to untrained carrier frequencies (Experiments 3 and 5) supported the notion that the observed enhancement in perceptual sensitivity was mediated by changes at an abstract rather than a stimulus-specific or procedural level. These results suggest that multimodal perceptual training with stimuli that reflect statistical regularities in the environment can augment perceptual learning.