Speech Perception and Language Lab at Villanova University
Welcome to the Word Recognition & Auditory Perception Lab (WRAP Lab)! Our group studies how human listeners recognize speech and understand spoken language. Investigating language processing as it happens is central to our approach, and we use a combination of computational, cognitive neuroscience, and behavioral techniques to study these processes.
Find out more about our research on this site. Thanks for stopping by! — Joe Toscano
Here's what we've been up to lately
In a new paper appearing in Psychological Science, WRAP Lab post-doc Laura Getz investigated whether information about words feeds back to affect low-level speech perception. For example, seeing the word "amusement" leads to the expectation that the word "park" should come next. Are you more likely to perceive an ambiguous speech sound between "bark" and "park" as a "p" when you hear it in the context of "amusement"? We used the event-related potential (ERP) technique to measure how the brain responds to these ambiguous sounds at early stages of speech perception (100 ms after hearing the sound). We found that context does in fact change perception of ambiguous speech sounds, helping to address a long-standing debate regarding the influence of top-down information on perception.
Click here to read the paper.
WRAP Lab graduate student Agnes Gao published a paper in Attention, Perception, & Psychophysics that examines the effect of language background on speech perception. Using ERP measures to assess native and non-native speakers' perception of Mandarin tones, we found that both groups of listeners are sensitive to graded acoustic differences during speech processing, but they differ in their behavioral discrimination of the sounds. These results suggest that listeners maintain sensitivity to continuous acoustic cues even after learning phonological categories along an acoustic dimension.
Click here to read the paper.
In a paper published in Auditory Perception & Cognition, WRAP Lab grad students Olivia Pereira and Agnes Gao demonstrate that the auditory N1 ERP component can be used to measure speech sound encoding across a range of acoustic and phonological contrasts, indicating that the N1 provides a general measure of early speech processing. The results also suggest that phonetic distinctions are encoded in a way that reflects differences along specific acoustic cue dimensions. Overall, these data provide us with a more complete picture of early speech processing and shed light on which types of phonetic distinctions we can study using this technique.
Check out the paper here.
Check out the presentations from our lab from this year's Psychonomics and APCAM meetings in New Orleans!
A new paper by WRAP Lab grad student Lexie Tabachnick, published in the Journal of Speech, Language, and Hearing Research, investigates how auditory brainstem responses to tones vary as a function of stimulus frequency. This study demonstrates that the ABR can provide a useful index of perceptual encoding across a wide range of sound frequencies, with the amplitude of the ABR precisely tracking tone frequency from 500 to 8000 Hz. This provides a crucial counterpart to our work on perceptual encoding in cortical responses and offers new insights into how the ABR can be used to study auditory perception in normal-hearing listeners and to measure effects of hearing loss.
Click here to see the paper.
In a new paper published in Brain & Language, we used the fast optical imaging technique to study the time-course of speech perception. We show that the brain encodes sounds in terms of continuous acoustic cues at early stages of perception and rapidly begins to categorize them based on phonological differences. This technique allows us to study these responses non-invasively in human subjects. Check out the paper here.
Please contact us if you would like to learn more about our research, request a copy of a paper, are interested in joining the lab, or have any other questions. Our email address is firstname.lastname@example.org.
Scheduled to participate in a study? The main lab is located in Tolentine Hall, Room 231. Some of our experiments also take place in the eye-tracking lab in Tolentine 18A. If you're scheduled to participate in an experiment but aren't sure where to go, please come to the main lab, and a research assistant will meet you there!