Our lab works on theoretical neuroscience and machine learning, with the fundamental goal of understanding how networks of neurons and synapses cooperate across multiple scales of space and time to mediate important functions, like sensory perception, motor control, and memory. To achieve this goal, we employ and extend tools from disciplines like statistical mechanics, dynamical systems theory, machine learning, information theory, control theory, and high-dimensional statistics, and we collaborate with experimental neuroscience laboratories collecting physiological data from a range of model organisms. Some topics of interest include: how birds learn to sing, spatial memory in the rodent hippocampus, attention and motor control in macaques, memory properties of complex synapses, dynamics of plasticity in recurrent networks, signal propagation in neural circuits, the emergence of categorization in multi-layered networks, and the statistical mechanics of high-dimensional data analysis.
We employ techniques from statistical mechanics, like replica theory and random matrix theory, to analyze the complex dynamics of learning, signal propagation, and memory in neuronal networks. We are also interested in using statistical mechanics to analyze the performance of machine learning algorithms that could be implemented in neuronal architectures.
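As a flavor of how random matrix theory enters this kind of analysis, the sketch below numerically illustrates the circular law for a random recurrent connectivity matrix: eigenvalues of an N × N matrix with i.i.d. Gaussian entries of variance g²/N fill a disk of radius g in the complex plane, and in classic models of random recurrent networks (e.g. Sompolinsky, Crisanti, and Sommers, 1988) the gain g controls whether signals decay or are chaotically amplified. The specific parameter values here are illustrative, not drawn from any particular study.

```python
import numpy as np

# Build a random recurrent connectivity matrix J with i.i.d. Gaussian
# entries of variance g^2 / N. By the circular law, its eigenvalues
# concentrate in a disk of radius g as N grows; g < 1 gives stable
# (decaying) linear dynamics, g > 1 gives unstable/chaotic dynamics.
rng = np.random.default_rng(seed=0)
N, g = 1000, 0.8  # illustrative network size and gain
J = rng.normal(loc=0.0, scale=g / np.sqrt(N), size=(N, N))

# Empirical spectral radius: largest eigenvalue magnitude.
eigvals = np.linalg.eigvals(J)
spectral_radius = np.abs(eigvals).max()
print(f"empirical spectral radius: {spectral_radius:.3f} (circular-law prediction: {g})")
```

For moderate N the empirical spectral radius already sits close to g, which is why a single scalar drawn from random-matrix theory can summarize the stability of signal propagation in a large, disordered network.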