PhD, École Polytechnique Fédérale de Lausanne
Synaptic Plasticity Rules: Neural Substrates of Learning
One of the most significant characteristics of our brain is its ability to learn and adapt quickly. To unravel the mysteries behind learning, we need a strong theoretical framework that explains how a network of neurons can efficiently process huge amounts of information. In this talk, I will focus on the theoretical cornerstones of synaptic plasticity rules that can be used to describe behavioral properties of memory formation and action learning in the brain. I will describe two general classes of learning rules (Hebbian and neo-Hebbian) and demonstrate how they can be used for general paradigms of learning in an unsupervised and a reinforcement-based fashion, respectively. I will specifically focus on models of plasticity that are under the influence of global factors (such as reward or surprise), which can represent the action of one or several neuromodulators, attentional control, or feedback from large populations of neurons.
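The distinction between the two rule classes can be illustrated with a minimal sketch: a classical Hebbian update depends only on local pre- and postsynaptic activity, while a neo-Hebbian (three-factor) update additionally gates that local term by a global modulatory signal such as reward or surprise. The variable names, values, and exact update form below are illustrative assumptions, not equations from the talk:

```python
# Toy pre- and postsynaptic activity levels (illustrative values).
pre, post = 1.0, 0.5
eta = 0.1  # learning rate

def hebbian_update(w, pre, post, eta):
    """Two-factor (Hebbian) rule: weight change depends only on
    the correlation of local pre- and postsynaptic activity."""
    return w + eta * pre * post

def neo_hebbian_update(w, pre, post, m, eta):
    """Three-factor (neo-Hebbian) rule: a global factor m, e.g. a
    reward or surprise signal carried by a neuromodulator, gates
    the local Hebbian term."""
    return w + eta * m * pre * post

w = 0.2
w_hebb = hebbian_update(w, pre, post, eta)             # always updates
w_gated = neo_hebbian_update(w, pre, post, 0.0, eta)   # no global signal, no change
```

With the global factor set to zero, the three-factor rule leaves the weight unchanged, which is one simple way reinforcement signals can select which correlations get consolidated.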
Mohammadjavad Faraji obtained his B.Sc. in Electrical Engineering at Sharif University of Technology, Tehran, Iran, in 2009. He completed his M.Sc. in Communication Systems at École Polytechnique Fédérale de Lausanne (EPFL) in 2011. He finished his Ph.D. in Computer Science with a specialization in Computational Neuroscience in 2016 at EPFL, under the supervision of Prof. Wulfram Gerstner. During his doctoral studies, he worked on the computational modeling of novelty and surprise signals and how they affect learning in machines and, in particular, the brain. His research interests include computational neuroscience, statistical machine learning, and signal processing.
Watch the video here.