Large-scale learning: challenges and solutions
Halls department, Hall 5
Thursday, 28 December 2017
12:00 - 13:00
The emergence of "big data" has created a need for optimization methods that are both computationally and statistically efficient. From a statistical point of view, more observations mean more information, since one can invoke asymptotic results. However, the computational complexity of learning methods scales with the size of the dataset. In this talk, we review the "computational-statistical trade-offs" of large-scale learning. We illustrate this trade-off in the context of empirical risk minimization (ERM), and then highlight the advantages of learning with adaptive sample sizes, which strike a better statistical-computational balance for optimization methods in large-scale learning.
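The adaptive-sample-size idea mentioned above can be sketched roughly as follows: solve the ERM problem on a small subsample, then repeatedly double the sample size, warm-starting each stage from the previous solution so that later, more expensive stages need only a few iterations. This is a minimal illustrative sketch, assuming a regularized logistic-regression ERM objective solved by plain gradient descent; the function name, parameters, and doubling schedule are all illustrative choices, not the speaker's specific algorithm.

```python
import numpy as np

def adaptive_sample_size_erm(X, y, n0=32, steps_per_stage=50, lr=0.5, reg=1e-3):
    """Sketch of adaptive-sample-size ERM (illustrative, not the talk's method):
    run gradient descent on an L2-regularized logistic loss over subsamples of
    doubling size, warm-starting each stage from the previous stage's solution."""
    n, d = X.shape
    w = np.zeros(d)
    m = min(n0, n)
    while True:
        Xs, ys = X[:m], y[:m]                      # current subsample
        for _ in range(steps_per_stage):
            p = 1.0 / (1.0 + np.exp(-(Xs @ w)))    # sigmoid predictions
            grad = Xs.T @ (p - ys) / m + reg * w   # logistic loss + L2 gradient
            w -= lr * grad
        if m == n:                                  # full dataset reached
            return w
        m = min(2 * m, n)                           # double the sample size
```

The statistical rationale is that on a small subsample the statistical accuracy is low anyway, so there is no benefit in optimizing it to high precision; each stage only needs to reach the statistical accuracy of its current sample size before the sample is enlarged.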
Since September 2014 he has been a PhD student in the Data Analytics group at ETHZ. He completed his M.S. in Artificial Intelligence at Sharif University of Technology in Tehran and worked as a research intern in the Empirical Inference Department of the Max Planck Institute in Tübingen. He received his B.S. in Computer Engineering from Sharif University of Technology.