Large-Scale Machine Learning and Stochastic Algorithms by Léon Bottou, Machine Learning Summer School at Purdue, 2011. During the last decade, data sizes have outgrown processor speeds. We now frequently face statistical machine learning problems for which datasets are virtually infinite; computing time is then the bottleneck.
September 25, 2023
Description
The first part of the lecture centers on the qualitative difference between small-scale and large-scale learning problems. Whereas small-scale learning problems are subject to the usual approximation-estimation tradeoff, large-scale learning problems are subject to a qualitatively different tradeoff that involves the computational complexity of the underlying optimization algorithms in non-trivial ways. Unlikely optimization algorithms such as stochastic gradient descent show amazing performance on large-scale machine learning problems. The second part gives a detailed overview of stochastic learning algorithms applied to both linear and nonlinear models. In particular, I would like to spend time on the use of stochastic gradient for structured learning problems and on the subtle connection between nonconvex stochastic gradient and active learning.
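To give a concrete flavor of the algorithm the lectures build on, here is a minimal, illustrative sketch of stochastic gradient descent for least-squares linear regression. It is an assumption-laden toy example, not material from the lectures: the function name, learning rate, and data are all invented for illustration. The key point is that each update uses the gradient of the loss on a single example, so the per-step cost is independent of the dataset size.

```python
import random

def sgd_linear(data, lr=0.01, epochs=100):
    """Toy SGD for least-squares linear regression y ~ w*x + b.

    data: list of (x, y) pairs with scalar features.
    Each update touches exactly one example, which is what makes
    SGD attractive when the dataset is effectively infinite.
    """
    w, b = 0.0, 0.0
    for _ in range(epochs):
        random.shuffle(data)           # visit examples in random order
        for x, y in data:
            err = (w * x + b) - y      # prediction error on one example
            w -= lr * err * x          # gradient step on the single-example loss
            b -= lr * err
    return w, b

# Fit y = 2x + 1 from noise-free samples (hypothetical data).
pairs = [(x, 2 * x + 1) for x in [0.0, 0.5, 1.0, 1.5, 2.0]]
w, b = sgd_linear(pairs, lr=0.1, epochs=200)
```

On this noise-free toy data the iterates settle near w = 2, b = 1; with genuinely large or noisy data, one would typically decay the learning rate over time, a point the convergence lectures address.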
Course Curriculum
- Lecture 1 – Learning with Stochastic Gradient Descent
- Lecture 2 – The Tradeoffs of Large Scale Learning
- Lecture 3 – Experiments with SGD
- Lecture 4 – Analysis for a Simple Case, Learning with a Single Epoch
- Lecture 5 – General Convergence Results
- Lecture 6 – SGD for Neyman-Pearson Classification