There is a growing overlap between Machine Learning, Neuroscience, and Network Theory. These three disciplines create a fertile interdisciplinary cycle: a) inspiration from neuroscience leads to novel machine learning models, and deep neural networks in particular, b) these networks can be better understood and designed using network theory, and c) machine learning and network theory provide new modeling tools for understanding the brain’s structure and function, closing the cycle. In this talk, we will “tour” this cross-disciplinary research agenda by focusing on three recent works: a) the design of sparse neural networks that can learn fast and generalize well (PHEW, ICML 2021), b) the use of structural adaptation for continual learning (NISPA, ICML 2022), and c) the emergence of hierarchical modularity in neural networks (Neural Sculpting, NeurIPS 2023).
Dr. Constantine Dovrolis has been the Director of the Computation-based Science and Technology Research Center (CaSToRC) at The Cyprus Institute (CyI) since January 1, 2023. He is also a Professor at the School of Computer Science at the Georgia Institute of Technology (Georgia Tech). He is a graduate of the Technical University of Crete (Engr. Dipl. 1995), the University of Rochester (M.S. 1996), and the University of Wisconsin-Madison (Ph.D. 2000).
This event will be held in English.