IAI/Booz Allen Hamilton Colloquium: Jonathan Simon, "Cortical Encoding"
Friday, September 6, 2013
3:00 p.m.
1110 Kim Building (main lecture hall)
Contact: Pam White, 301-405-6615, pwhite@umd.edu
Intelligent Automation, Inc. Colloquia Series
Booz Allen Hamilton Distinguished Colloquium in Electrical and Computer Engineering
Cortical Encoding of Auditory Objects at the Cocktail Party
Jonathan Simon
Associate Professor
Department of Biology
Department of Electrical and Computer Engineering
Institute for Systems Research
Abstract
A visual scene is perceived in terms of its constituent visual objects. Similar ideas have been proposed for the analogous case of auditory scene analysis, though their hypothesized neural underpinnings have not yet been established. Here, we investigate how auditory objects are individually represented in auditory cortex, using magnetoencephalography (MEG) to record the neural responses of human listeners. In a series of experiments, subjects selectively listen to one of two competing streams in a variety of auditory scenes. First, we demonstrate that attentional gain does not act globally on the entire auditory scene, but rather acts differentially on the separate auditory streams. This stream-based attentional gain is then used as a tool to analyze the distinct neural representations of the competing auditory streams individually.
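As a rough operational illustration of stream-based attentional gain, one can correlate a cortical response with the temporal envelope of each competing stream across a range of neural delays and compare the per-stream phase-locking strengths. This is a minimal sketch, not the study's actual analysis pipeline; all names, lag ranges, and parameters below are illustrative assumptions.

```python
# Illustrative sketch: quantify how strongly one neural response channel
# phase-locks to each of two competing streams' temporal envelopes.
# Names, lags, and mixing weights are assumptions, not study parameters.
import numpy as np

def lagged_correlation(response, envelope, lags):
    """Pearson correlation between a neural response and a stimulus
    envelope, evaluated at each candidate neural delay (in samples)."""
    n = len(response)
    return np.array([
        np.corrcoef(response[lag:], envelope[: n - lag])[0, 1]
        for lag in lags
    ])

def stream_gain_ratio(response, env_attended, env_ignored, lags=range(1, 40)):
    """Peak phase-locking to each stream; a ratio well above 1 is the
    signature of stream-specific (rather than global) attentional gain."""
    g_att = np.abs(lagged_correlation(response, env_attended, lags)).max()
    g_ign = np.abs(lagged_correlation(response, env_ignored, lags)).max()
    return g_att, g_ign, g_att / g_ign

# Synthetic demo: a response driven mostly by the "attended" envelope,
# delayed by 10 samples, with a weaker contribution from the other stream.
rng = np.random.default_rng(1)
env_a = rng.standard_normal(5000)
env_b = rng.standard_normal(5000)
resp = np.roll(2.0 * env_a + 0.5 * env_b, 10) + rng.standard_normal(5000)
g_att, g_ign, ratio = stream_gain_ratio(resp, env_a, env_b)
print(f"attended r={g_att:.2f}, ignored r={g_ign:.2f}, ratio={ratio:.1f}")
```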
In the acoustically richest example, subjects selectively listen to one of two competing speakers mixed into a single channel. Individual neural representations of each speaker's speech are observed in auditory cortex: each is selectively phase-locked to the rhythm of the corresponding speech stream, and from each the temporal envelope of that stream alone can be reconstructed. The neural representation of the attended speech, originating in posterior auditory cortex, dominates the responses. Critically, when the intensities of the attended and background speakers are varied separately over a wide range, the neural representation of the attended speech adapts only to the intensity of that speaker, not to the intensity of the background speaker. This demonstrates object-level intensity gain control in addition to object-level attentional gain.
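The envelope-reconstruction step can be sketched with the generic backward-model recipe: a time-lagged linear decoder, fit by ridge regression, that maps multichannel MEG responses back to a speech envelope. The array shapes, lag range, and regularization strength below are assumptions for illustration, not the authors' exact pipeline.

```python
# Illustrative backward-model sketch: reconstruct a speech envelope from
# multichannel MEG via a time-lagged ridge-regression decoder. Shapes,
# lags, and the regularization strength are illustrative assumptions.
import numpy as np

def lagged_design(meg, max_lag):
    """Stack lags 0..max_lag of every channel into one design matrix.
    meg: (n_times, n_channels) -> (n_times, n_channels * (max_lag + 1))."""
    n_times, _ = meg.shape
    cols = []
    for lag in range(max_lag + 1):
        shifted = np.zeros_like(meg)
        shifted[: n_times - lag] = meg[lag:]  # response at t+lag predicts stimulus at t
        cols.append(shifted)
    return np.hstack(cols)

def fit_decoder(meg, envelope, max_lag=25, alpha=1e2):
    """Closed-form ridge solution: w = (X'X + alpha*I)^-1 X'y."""
    X = lagged_design(meg, max_lag)
    XtX = X.T @ X
    return np.linalg.solve(XtX + alpha * np.eye(XtX.shape[0]), X.T @ envelope)

def reconstruct(meg, w, max_lag=25):
    """Apply a fitted decoder to (held-out) MEG data."""
    return lagged_design(meg, max_lag) @ w

# Demo on synthetic data: train on the first 80%, test on the rest, and
# score with Pearson r, the usual figure of merit for such decoders.
rng = np.random.default_rng(0)
env = rng.standard_normal(6000)
meg = np.stack([np.roll(env, d) for d in range(5, 25)], axis=1)  # 20 "channels"
meg += 0.5 * rng.standard_normal(meg.shape)
split = 4800
w = fit_decoder(meg[:split], env[:split])
pred = reconstruct(meg[split:], w)
print(f"held-out r = {np.corrcoef(pred, env[split:])[0, 1]:.3f}")
```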
Overall, these results indicate that concurrent auditory objects, even when spectrally overlapping and unresolvable at the auditory periphery, are neurally encoded as individual objects in auditory cortex.