
Introduction to Supervised, Unsupervised and Partially-Supervised Training Algorithms by Dale Schuurmans, Machine Learning Summer School at Purdue, 2011. This course will provide a simple, unified introduction to batch training algorithms for supervised, unsupervised and partially-supervised learning. The concepts introduced will serve as a basis for the more advanced topics in other lectures.

FREE

This course includes:
  • Hours of videos
  • 6 units & quizzes
  • Unlimited lifetime access
  • Access on mobile app
  • Certificate of completion

The first part of the course will cover supervised training algorithms, establishing a general foundation through a series of extensions to linear prediction, including: nonlinear input transformations (features), L2 regularization (kernels), prediction uncertainty (Gaussian processes), L1 regularization (sparsity), nonlinear output transformations (matching losses), surrogate losses (classification), multivariate prediction, and structured prediction. Relevant optimization concepts will be introduced along the way.

The second part of the course will then demonstrate how unsupervised and semi-supervised formulations follow from a relationship between forward and reverse prediction problems. This connection allows dimensionality reduction and sparse coding to be unified with regression, and clustering and vector quantization to be unified with classification, even in the context of the other extensions. Current convex relaxations of such training problems will be discussed.

The last part of the course covers partially-supervised learning: the problem of learning an input representation concurrently with a predictor. A brief overview of current research will be presented, including recent work on boosting and convex relaxations.
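To make two of the themes above concrete, here is a minimal sketch in Python with numpy (not taken from the course materials; the function names and toy data are illustrative assumptions). It shows that ridge regression, i.e. linear prediction with L2 regularization, has an equivalent dual form in which only the Gram matrix of the inputs appears, which is what allows kernels to be substituted, and that rank-constrained "reverse prediction" of the inputs recovers PCA via a truncated SVD, linking dimensionality reduction to least-squares regression.

import numpy as np

def ridge_primal(X, y, lam):
    # Primal ridge regression: w = (X'X + lam*I)^{-1} X'y  (d x d linear system).
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def ridge_dual(X, y, lam):
    # Equivalent dual form: w = X'(K + lam*I)^{-1} y with K = XX'  (n x n system).
    # Replacing K by a nonlinear kernel matrix gives kernel ridge regression.
    n = X.shape[0]
    K = X @ X.T
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    return X.T @ alpha

def reverse_prediction_pca(X, k):
    # Unsupervised "reverse prediction": min_{Z,W} ||X - Z W||_F^2 with Z of rank k.
    # The optimum is the truncated SVD of the centered data (Eckart-Young),
    # so PCA falls out of a least-squares problem run in the reverse direction.
    U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    return U[:, :k] * s[:k], Vt[:k]      # latent codes Z, principal directions W

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
y = X @ rng.standard_normal(5) + 0.1 * rng.standard_normal(50)

w_primal = ridge_primal(X, y, lam=1.0)
w_dual = ridge_dual(X, y, lam=1.0)
print(np.allclose(w_primal, w_dual))     # True: primal and dual solutions coincide

Z, W = reverse_prediction_pca(X, k=2)
print(Z.shape, W.shape)                  # (50, 2) (2, 5)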

Course Curriculum

  • Lecture 1 – Course Introduction
  • Lecture 2 – Generalized domain representations and regularizations
  • Lecture 3 – Generalized domain representations and regularizations (cont.)
  • Lecture 4 – Generalized output representations and structure
  • Lecture 5 – Generalized output representations and structure (cont.)
  • Lecture 6 – Generalized output representations and structure (cont.)