Survey of Boosting from an Optimization Perspective by Manfred K. Warmuth - Machine Learning Summer School at Purdue, 2011. Boosting has become a well-known ensemble method.
Description
The algorithm maintains a distribution on the binary-labeled examples, and a new base learner is added in a greedy fashion. The goal is to obtain a small linear combination of base learners that clearly separates the examples. We focus on a recent view of Boosting in which the update algorithm for the distribution on the examples is characterized by a minimization problem that uses a relative entropy as a regularizer.
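To make this concrete, the classical exponential reweighting of the examples can be written as the closed-form solution of such a relative-entropy-regularized minimization. A minimal sketch (the function name and arguments are illustrative, not from the course):

```python
import numpy as np

def update_distribution(d, margins, alpha):
    """Multiplicative update on the example distribution (AdaBoost-style).

    d       : current distribution on the n examples (non-negative, sums to 1)
    margins : y_i * h(x_i) for the newly added base learner h
    alpha   : vote weight given to h

    The exponential form arises as the minimizer of the relative entropy to
    the old distribution d, subject to a constraint on the new learner's edge.
    """
    d_new = d * np.exp(-alpha * margins)  # down-weight correctly classified examples
    return d_new / d_new.sum()            # renormalize to a probability distribution
```

Examples the new base learner gets right lose weight, so the next greedy step concentrates on the hard examples.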
The best-known boosting algorithm is AdaBoost. This algorithm approximately maximizes the hard margin when the data is separable. We focus on recent algorithms that provably maximize the soft margin when the data is noisy. We will teach the new algorithms, give a unified and versatile view of Boosting in terms of relative entropy regularization, and show how to solve large-scale problems with state-of-the-art optimization techniques. We also discuss lower and upper bounds on the number of iterations required by any greedy boosting method, and propose a way to circumvent these lower bounds.
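For readers who want to see the greedy scheme end to end, the following is a minimal AdaBoost sketch with threshold stumps on one-dimensional data (an illustration under simplifying assumptions, not the lecture's implementation):

```python
import numpy as np

def adaboost_stumps(X, y, n_rounds=20):
    """Minimal AdaBoost with threshold stumps h(x) = s * sign(x - thr)."""
    n = len(y)
    d = np.ones(n) / n                 # uniform initial distribution on examples
    ensemble = []                      # list of (alpha, threshold, sign)
    for _ in range(n_rounds):
        best = None
        # greedily pick the stump with the smallest weighted error under d
        for thr in X:
            for s in (1.0, -1.0):
                pred = s * np.sign(X - thr)
                pred[pred == 0] = s
                err = d[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, thr, s)
        err, thr, s = best
        err = min(max(err, 1e-10), 1 - 1e-10)     # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)     # base learner's vote weight
        pred = s * np.sign(X - thr)
        pred[pred == 0] = s
        d = d * np.exp(-alpha * y * pred)         # multiplicative reweighting
        d /= d.sum()
        ensemble.append((alpha, thr, s))
    return ensemble

def predict(ensemble, X):
    """Sign of the linear combination of base learners."""
    score = np.zeros(len(X))
    for alpha, thr, s in ensemble:
        pred = s * np.sign(X - thr)
        pred[pred == 0] = s
        score += alpha * pred
    return np.sign(score)
```

On separable data the weighted vote drives the hard margin up; the soft-margin variants covered in the lectures (e.g. LPBoost) instead solve a linear program that tolerates noisy examples.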
Course content
- Lecture 1 – Introduction to Boosting
- Lecture 2 – Squared Euclidean versus relative entropy regularization
- Lecture 3 – Boosting as margin maximization with no regularization, LPBoost
- Lecture 4 – LPBoost
- Lecture 5 – Lower Bound and experiments
- Lecture 6 – The Blessing and the Curse of the Multiplicative updates