IAI/Booz Allen Hamilton Colloquium: Jonathan Simon, "Cortical Encoding"

Friday, September 6, 2013
3:00 p.m.
1110 Kim Building (main lecture hall)
Pam White
301-405-6615

Intelligent Automation, Inc. Colloquia Series
Booz Allen Hamilton Distinguished Colloquium in Electrical and Computer Engineering

Cortical Encoding of Auditory Objects at the Cocktail Party


Jonathan Simon
Associate Professor
Department of Biology
Department of Electrical and Computer Engineering
Institute for Systems Research

A visual scene is perceived in terms of its constituent visual objects. Similar ideas have been proposed for the analogous case of auditory scene analysis, though their hypothesized neural underpinnings have not yet been established. Here, we investigate how auditory objects are individually represented in auditory cortex, using magnetoencephalography (MEG) to record the neural responses of human listeners. In a series of experiments, subjects selectively listen to one of two competing streams, in a variety of auditory scenes. First, we demonstrate that attentional gain does not act globally on the entire auditory scene, but rather acts differentially on the separate auditory streams. This stream-based attentional gain is then used as a tool to individually analyze the different neural representations of the competing auditory streams.

In the acoustically richest example, subjects selectively listen to one of two competing speakers mixed into a single channel. Individual neural representations of each speaker's speech are observed in auditory cortex: each is selectively phase locked to the rhythm of the corresponding speech stream, and the temporal envelope of that stream can be exclusively reconstructed from its representation. The neural representation of the attended speech, originating in posterior auditory cortex, dominates the responses. Critically, when the intensities of the attended and background speakers are separately varied over a wide range, the neural representation of the attended speech adapts only to the intensity of that speaker, not to the intensity of the background speaker. This demonstrates object-level intensity gain control in addition to object-level attentional gain.
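The envelope reconstruction described above is commonly done with a lagged linear decoder mapping cortical responses back to the stimulus envelope. The following is a minimal, hypothetical sketch of that general stimulus-reconstruction approach on simulated data (not the authors' code or actual MEG recordings); all signals, the response kernel, and the lag window are invented for illustration.

```python
# Hypothetical sketch of stimulus reconstruction: recover a speech temporal
# envelope from a simulated MEG-like response using a lagged linear decoder.
# All data here are simulated; parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
fs = 100        # assumed sampling rate (Hz) for envelope-band signals
n = 2000        # number of samples (~20 s)
lags = 10       # decoder looks at 10 response samples (~100 ms window)

# Simulated slow temporal envelope of an attended speech stream.
env = np.abs(np.convolve(rng.standard_normal(n), np.ones(20) / 20, "same"))

# Simulated cortical response: envelope smeared by a response kernel + noise.
kernel = np.exp(-np.arange(lags) / 3.0)
resp = np.convolve(env, kernel, "full")[:n] + 0.5 * rng.standard_normal(n)

# Lagged design matrix: each row holds the response at times t .. t+lags-1,
# since the cortical response follows the stimulus in time.
X = np.stack([resp[t:t + lags] for t in range(n - lags)])
y = env[:n - lags]

# Least-squares decoder weights, then envelope reconstruction.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
recon = X @ w

# Correlation between true and reconstructed envelopes gauges how well the
# (simulated) response is phase locked to the stream's rhythm.
r = np.corrcoef(recon, y)[0, 1]
print(f"reconstruction correlation: {r:.2f}")
```

In the actual experiments, fitting one such decoder per stream is what allows the attended and background representations to be analyzed separately.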

Overall, these results indicate that concurrent auditory objects, even if spectrally overlapping and not resolvable at the auditory periphery, are indeed neurally encoded as individual objects in auditory cortex.

Audience: Graduate, Undergraduate, Faculty, Post-Docs, Alumni, Corporate


