
Survey of Boosting from an Optimization Perspective by Manfred K. Warmuth, Machine Learning Summer School at Purdue, 2011. Boosting has become a well-known ensemble method.


The algorithm maintains a distribution on the binary-labeled examples, and at each iteration a new base learner is added in a greedy fashion. The goal is to obtain a small linear combination of base learners that clearly separates the examples. We focus on a recent view of Boosting in which the update of the distribution on the examples is characterized by a minimization problem that uses a relative entropy as a regularizer; a sketch of this loop is given below.
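A minimal Python sketch of this greedy loop, under assumed conventions: labels y are in {-1, +1}, and each base learner is a callable mapping the examples X to ±1 predictions (this interface, and the function name boost, are illustrative, not the course's own code). The multiplicative update shown is AdaBoost's; as the lectures explain, it is exactly the solution of a relative-entropy-regularized minimization over distributions.

    import numpy as np

    def boost(X, y, base_learners, T):
        """Greedy boosting sketch: keep a distribution d over the m
        binary-labeled examples and add one base learner per round."""
        m = len(y)
        d = np.full(m, 1.0 / m)          # start with the uniform distribution
        chosen, alphas = [], []
        for _ in range(T):
            # Greedy step: pick the base learner with the largest edge
            # edge(h) = sum_i d_i * y_i * h(x_i) under the current d.
            edges = [d @ (y * h(X)) for h in base_learners]
            best = int(np.argmax(edges))
            h, edge = base_learners[best], edges[best]
            edge = min(edge, 1.0 - 1e-12)  # guard against a perfect learner
            alpha = 0.5 * np.log((1 + edge) / (1 - edge))
            # Multiplicative update of the distribution: this is the
            # closest distribution (in relative entropy) to the old one
            # under which the new learner's edge is zero.
            d *= np.exp(-alpha * y * h(X))
            d /= d.sum()                   # renormalize to a distribution
            chosen.append(h)
            alphas.append(alpha)
        # Final hypothesis: sign of a small linear combination of learners.
        return lambda Xq: np.sign(sum(a * h(Xq) for a, h in zip(alphas, chosen)))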

The best-known boosting algorithm is AdaBoost. When the data are separable, this algorithm approximately maximizes the hard margin. We focus on recent algorithms that provably maximize the soft margin when the data are noisy. We will present the new algorithms, give a unified and versatile view of Boosting in terms of relative entropy regularization, and show how to solve large-scale problems with state-of-the-art optimization techniques. We also discuss lower and upper bounds on the number of iterations required by any greedy boosting method and propose a way to circumvent these lower bounds.
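For concreteness, the relative-entropy characterization of the distribution update can be written as follows (the notation here is an assumed standard one, not copied from the lectures): with d^t the current distribution, h_t the just-added base learner, and P_m the probability simplex over the m examples,

    \[
      d^{t+1} \;=\; \operatorname*{argmin}_{\substack{d \in \mathcal{P}_m \\ \sum_i d_i\, y_i\, h_t(x_i) \,=\, 0}} \Delta(d, d^t),
      \qquad
      \Delta(d, d^t) \;=\; \sum_{i=1}^{m} d_i \ln \frac{d_i}{d_i^t}.
    \]

That is, the new distribution is the closest one, in relative entropy, to the old one among those under which the last base learner has zero edge. Capping the weights (d_i <= 1/nu) is the standard route to the soft-margin variants treated in the lectures on LPBoost.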

Course Curriculum

  • Lecture 1 – Introduction to Boosting
  • Lecture 2 – Squared Euclidean versus relative entropy regularization
  • Lecture 3 – Boosting as margin maximization with no regularization, LPBoost
  • Lecture 4 – LPBoost
  • Lecture 5 – Lower Bound and experiments
  • Lecture 6 – The Blessing and the Curse of the Multiplicative updates