November 3, 2020

12:00 pm - 1:00 pm

Recorded Seminar:   https://wse.zoom.us/rec/share/XwcJaNfBrSFGZhZF5jTFWYGZd8bEcwzEXQMrwqB1gPDApLjp0YSlir4W5Q5Lro8D.iE9yj-pmZBRZ6CFr?startTime=1604422228000

Adam Charles, PhD

Assistant Professor

Department of Biomedical Engineering

Title: Data science in neuroscience: from sensors to theory

Abstract: The human brain has ~100×10^9 neurons. Unlike the liver, we typically believe that this number is important for cognition and learning, as evidenced by the significant variability of collocated neurons' activity. In furthering our understanding of neural systems, we have sought 1) technologies that constantly eclipse the capabilities of the past, and 2) meaningfully simplified models that can distill high-dimensional data into human-readable results. Data science algorithms and theory are becoming a centerpiece in both of these domains, driving new computational and adaptive data acquisition and demonstrating fundamental capabilities of mathematical models of computation. In this talk I will discuss two projects that highlight the potential for data science to have high impact on important, evolving problems in neuroscience.

First, I will discuss "ML at the sensor," i.e., adaptive sampling approaches that maximize neuron yield using state-of-the-art probes in non-human primates. This work points to the important role algorithms will have in bypassing sensing and bandwidth constraints given the unique and challenging conditions of real-time recording and experimentation with high-dimensional signals. Next, I will discuss a fundamental model for interpreting cognitive data in both neuroscience and psychology: recurrent neural networks. Such models are often used based on mechanistic arguments (i.e., the brain is recurrent); however, the fundamental properties of these mathematical objects should play a more vital role in when and how they are deployed to elicit understanding. One such property is the information retention capability, or "short-term memory" (STM), of recurrent systems: a system cannot be considered a good model for a task that fundamentally needs more information than the system can hold! To these ends, I explore the particular case of echo-state networks and rigorously analyze how properties of the data can impact the STM of random RNNs.
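To make the STM notion concrete, here is a minimal echo-state network sketch in Python/NumPy (an illustration only, not the speaker's analysis): a random reservoir is driven by i.i.d. input, and a linear readout is trained to recall the input from a few steps back. If the task demanded a delay beyond the reservoir's memory capacity, recall would fail. All sizes, scalings, and the chosen delay are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100          # reservoir size
T = 2000         # length of the input sequence
delay = 5        # how many steps back the readout tries to recall

# Random reservoir weights, rescaled so the spectral radius is < 1
# (a common sufficient condition for the echo-state property).
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.normal(size=N)

# Drive the reservoir with i.i.d. uniform input and record its states.
u = rng.uniform(-1, 1, size=T)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# Train a linear readout (least squares) to recover u[t - delay] from x[t].
X, y = states[delay:], u[:-delay]
w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ w_out

# R^2 of the delayed recall: high values mean the reservoir still
# "holds" the input from `delay` steps ago in its state.
r2 = np.corrcoef(pred, y)[0, 1] ** 2
print(f"recall R^2 at delay {delay}: {r2:.3f}")
```

Sweeping `delay` traces out a memory curve: recall quality decays as the requested delay grows, which is one way of quantifying a recurrent system's STM.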

Bio: Adam Charles is an Assistant Professor in the Department of Biomedical Engineering at Johns Hopkins University. Adam completed his Master's and Bachelor's at the Cooper Union in NYC, followed by a PhD in Electrical and Computer Engineering under the guidance of Chris Rozell at Georgia Tech. With a background in engineering, Adam continued to a post-doctoral position with Jonathan Pillow at the Princeton Neuroscience Institute. Adam's interests are at the intersection of statistical signal processing, computational and theoretical neuroscience, and data science, with a focus on computational imaging and the development of next-generation algorithms for extracting meaning from complex neural data.