
Eva Dyer: Towards robust representations of neural activity: Why do we need them and how do we build them

Eva Dyer
January 18, 2022 - 10:00am to 11:00am
Stanford Neurosciences Building, Gunn Rotunda (E241) & via Zoom

Abstract

Understanding how neural circuits coordinate to drive behavior and decision making is a fundamental challenge in neuroscience. Unfortunately, finding a stable link between the brain and behavior has been difficult: even when behavior is consistent, neural activity can appear highly variable. In this talk, I will discuss ways that my lab is tackling this challenge to form more robust and interpretable readouts from neural circuits. The talk will focus on our recent efforts to use self-supervised learning (SSL) to decode and disentangle neural states. In SSL, invariances are achieved by encouraging “augmentations” (transformations) of the input to be mapped to similar points in the latent space. We demonstrate how this guiding principle can be used to model populations of neurons in diverse brain regions in both macaques and rodents, and to disentangle different sources of information in the neural representation of movement. Our work shows that by establishing a more stable link between the brain and behavior, we can build better brain decoders and find common neural representations of behavior across individuals.
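The augmentation-invariance principle described above can be illustrated with a toy sketch. This is not the lab's actual method; it assumes a hypothetical linear encoder, Gaussian-jitter augmentation, and a simple 1 − cosine-similarity loss, purely to show how two augmented views of the same neural activity vector are pushed toward nearby points in latent space:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(x, noise_scale=0.1):
    # "Augmentation": add small Gaussian jitter to a neural activity vector.
    return x + noise_scale * rng.standard_normal(x.shape)

def encode(x, W):
    # Toy linear encoder followed by L2 normalization onto the unit sphere.
    z = W @ x
    return z / np.linalg.norm(z)

def invariance_loss(x, W):
    # Encourage two augmented views of the same input to map to nearby
    # latent points: loss = 1 - cosine similarity of the two embeddings.
    z1 = encode(augment(x), W)
    z2 = encode(augment(x), W)
    return 1.0 - float(z1 @ z2)

x = rng.standard_normal(20)       # one population activity vector (20 neurons)
W = rng.standard_normal((5, 20))  # random projection to a 5-d latent space
loss = invariance_loss(x, W)
print(f"invariance loss: {loss:.4f}")
```

In a real SSL setup the encoder would be trained (typically with a contrastive or similarity objective over many samples) so that this loss shrinks while the latent space remains informative about behavior; the sketch only evaluates the loss for a fixed random encoder.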

Bio

Eva L. Dyer is an Assistant Professor in the Department of Biomedical Engineering at the Georgia Institute of Technology. Dr. Dyer’s research cuts across machine learning and neuroscience to understand how neural activity can be linked to behavior and to build biomarkers of disease. Dr. Dyer’s lab derives insights from the structure and function of the brain to design new artificial intelligence systems that can learn from fewer labels and adapt to changing inputs over time.

Event Sponsor: 
Wu Tsai Neurosciences Institute and Stanford Data Science
Contact Email: 
neuroscience@stanford.edu
Contact Phone: 
650-723-3573
