Wednesday, November 29, 2017
Decoding the computations of high-level auditory neurons

Seminar: Redwood Seminar | November 29 | 12 p.m. | 560 Evans Hall


Joel Kaardal, Salk Institute

Sponsor: Helen Wills Neuroscience Institute


Characterizing the computations performed by high-level sensory regions of the brain remains difficult due to the many nonlinear signal transformations that separate the input sensory stimuli from the neural responses. To produce interpretable models of these computations, dimensionality reduction techniques can be employed to describe the neural computation in terms of a relevant, multicomponent subspace of the stimulus space. While a number of such techniques have been devised, many rely on computing second-order moments of the stimulus/response distribution, leading to models with many more parameters than are ultimately necessary to capture the relevant subspace. For high-level sensory neurons in particular, these models are prone to overfitting because natural stimuli sample the stimulus space only sparsely. To address this, we reformulated a maximum entropy method as a low-rank matrix factorization problem. With principled regularization, the low-rank method achieved better prediction accuracy and more accurate estimation of the relevant subspace than prior methods. We deployed the low-rank method to study the computations of neurons in high-level regions of the songbird brain, yielding multiple relevant components that span each neuron's receptive field. The relevant components were then transformed using logical OR and logical AND operations, highlighting potential differences in how regions and sensory systems process sensory information.
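To make the parameter-count argument concrete, the following is a minimal sketch of the low-rank idea, not the speaker's implementation: in a second-order maximum-entropy model, spiking probability is a logistic function of a quadratic form in the stimulus, P(spike | s) = sigmoid(a + h·s + sᵀJs). The full D×D symmetric kernel J has O(D²) parameters; factoring J = U diag(w) Uᵀ with a D×r factor U reduces this to O(Dr), so the stimulus enters only through its projection onto the r-dimensional relevant subspace. All function names, dimensions, and numbers below are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def predict_prob(s, a, h, U, w):
    """Spiking probability under a low-rank second-order model (sketch).

    The quadratic kernel is parameterized as J = U @ diag(w) @ U.T, so
    the stimulus s only enters through its projection onto the
    r-dimensional subspace spanned by the columns of U.
    """
    proj = U.T @ s                # project stimulus onto relevant subspace
    quad = np.dot(w, proj ** 2)   # s^T J s, computed in the subspace
    return sigmoid(a + h @ s + quad)

def n_params_full(D):
    # bias + linear filter + symmetric D x D quadratic kernel
    return 1 + D + D * (D + 1) // 2

def n_params_low_rank(D, r):
    # bias + linear filter + D x r factor + r subspace weights
    return 1 + D + D * r + r

if __name__ == "__main__":
    D, r = 64, 3                  # illustrative stimulus dimension and rank
    rng = np.random.default_rng(0)
    s = rng.standard_normal(D)
    a = 0.0
    h = rng.standard_normal(D) * 0.1
    U = rng.standard_normal((D, r)) * 0.1
    w = np.array([1.0, 1.0, -1.0])  # excitatory and suppressive components
    print(n_params_full(D), n_params_low_rank(D, r))
    print(predict_prob(s, a, h, U, w))
```

With D = 64 and rank 3, the full kernel needs 2,145 quadratic-and-linear parameters while the factored form needs 260, which is the sense in which the low-rank model is less prone to overfitting under limited natural-stimulus sampling.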


nrterranova@berkeley.edu