Math for Machine Learning
Introduction
Introduction Lecture (2:46)
Companion Book
Linear Regression
Linear Regression (7:32)
The Least Squares Method (11:25)
Linear Algebra Solution to Least Squares Problem (12:50)
Example: Linear Regression (4:05)
Summary: Linear Regression (0:33)
Problem Set: Linear Regression
Solution Set: Linear Regression
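As a companion to the Linear Regression module above, here is a minimal sketch of the least squares solution, assuming NumPy is available; the synthetic data and coefficient values are illustrative and are not taken from the course examples.

```python
import numpy as np

# Synthetic data: y = 2 + 3x + noise (illustrative values only)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 + 3.0 * x + rng.normal(scale=1.0, size=50)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# Least squares solution beta = (X^T X)^{-1} X^T y, computed stably with lstsq
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, slope:", beta)
```

Solving the normal equations directly with a matrix inverse works on small problems, but `lstsq` (or a QR factorization) is the numerically safer route the same formula leads to.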
Linear Discriminant Analysis
Classification (1:15)
Linear Discriminant Analysis (0:44)
The Posterior Probability Functions (3:42)
Modelling the Posterior Probability Functions (7:13)
Linear Discriminant Functions (5:32)
Estimating the Linear Discriminant Functions (6:00)
Classifying Data Points Using Linear Discriminant Functions (3:09)
LDA Example 1 (13:52)
LDA Example 2 (17:38)
Summary: Linear Discriminant Analysis (1:34)
Problem Set: Linear Discriminant Analysis
Solution Set: Linear Discriminant Analysis
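A rough sketch of the linear discriminant functions covered in the module above, assuming NumPy; the helper names `lda_fit` and `lda_predict` are hypothetical, and the formulas follow the standard discriminant δ_k(x) = xᵀΣ⁻¹μ_k − ½ μ_kᵀΣ⁻¹μ_k + log π_k with a pooled covariance estimate.

```python
import numpy as np

def lda_fit(X, y):
    """Estimate the pieces of the linear discriminant functions
    delta_k(x) = x^T Sigma^{-1} mu_k - 0.5 mu_k^T Sigma^{-1} mu_k + log(pi_k)."""
    classes = np.unique(y)
    n, p = X.shape
    priors, means = [], []
    Sigma = np.zeros((p, p))
    for k in classes:
        Xk = X[y == k]
        mu = Xk.mean(axis=0)
        priors.append(len(Xk) / n)
        means.append(mu)
        Sigma += (Xk - mu).T @ (Xk - mu)
    Sigma /= (n - len(classes))              # pooled covariance estimate
    return classes, np.array(priors), np.array(means), np.linalg.inv(Sigma)

def lda_predict(X, classes, priors, means, Sigma_inv):
    # Evaluate each linear discriminant function and pick the largest
    scores = (X @ Sigma_inv @ means.T
              - 0.5 * np.sum(means @ Sigma_inv * means, axis=1)
              + np.log(priors))
    return classes[np.argmax(scores, axis=1)]
```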
Logistic Regression
Logistic Regression (1:15)
Logistic Regression Model of the Posterior Probability Function (3:02)
Estimating the Posterior Probability Function (8:57)
The Multivariate Newton-Raphson Method (9:14)
Maximizing the Log-Likelihood Function (13:51)
Example: Logistic Regression (9:55)
Summary: Logistic Regression (1:21)
Problem Set: Logistic Regression
Solution Set: Logistic Regression
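To accompany the Logistic Regression module, a minimal sketch of maximizing the log-likelihood with the multivariate Newton-Raphson method, assuming NumPy; `logistic_newton` is a hypothetical name, and the data matrix X is assumed to already include an intercept column and to be non-separable so the iterations converge.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_newton(X, y, n_iter=25):
    """Newton-Raphson update for the logistic log-likelihood:
    beta <- beta + (X^T W X)^{-1} X^T (y - p), where W = diag(p(1 - p))."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        p = sigmoid(X @ beta)
        W = p * (1.0 - p)                    # diagonal of the weight matrix
        grad = X.T @ (y - p)                 # gradient of the log-likelihood
        hess = X.T @ (X * W[:, None])        # X^T W X
        beta = beta + np.linalg.solve(hess, grad)
    return beta
```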
Artificial Neural Networks
Artificial Neural Networks (0:36)
Neural Network Model of the Output Functions (12:59)
Forward Propagation (0:51)
Choosing Activation Functions (4:30)
Estimating the Output Functions (2:17)
Error Function for Regression (2:27)
Error Function for Binary Classification (6:15)
Error Function for Multi-class Classification (4:38)
Minimizing the Error Function Using Gradient Descent (6:27)
Backpropagation Equations (4:16)
Summary of Backpropagation (1:27)
Summary: Artificial Neural Networks (1:47)
Problem Set: Artificial Neural Networks
Solution Set: Artificial Neural Networks
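A compact sketch of forward propagation, backpropagation, and gradient descent for a one-hidden-layer network with a sigmoid output and cross-entropy error, assuming NumPy; the network size, data, and learning rate are illustrative choices, not the course's examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny binary-classification problem (illustrative shapes only)
n, d, h = 200, 2, 8
X = rng.normal(size=(n, d))
y = (X[:, 0] * X[:, 1] > 0).astype(float)       # a non-linear target

W1 = rng.normal(scale=0.5, size=(d, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.5, size=(h, 1)); b2 = np.zeros(1)
lr = 0.5

for _ in range(2000):
    # Forward propagation
    A1 = sigmoid(X @ W1 + b1)                   # hidden activations
    p = sigmoid(A1 @ W2 + b2).ravel()           # output probabilities
    # Backpropagation of the cross-entropy error
    delta2 = (p - y)[:, None] / n               # error at the output layer
    dW2 = A1.T @ delta2; db2 = delta2.sum(axis=0)
    delta1 = (delta2 @ W2.T) * A1 * (1 - A1)    # error at the hidden layer
    dW1 = X.T @ delta1; db1 = delta1.sum(axis=0)
    # Gradient descent step
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("training accuracy:", ((p > 0.5) == y).mean())
```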
Maximal Margin Classifier
Maximal Margin Classifier (2:29)
Definitions of Separating Hyperplane and Margin (12:25)
Maximizing the Margin (3:36)
Definition of Maximal Margin Classifier (1:01)
Reformulating the Optimization Problem (27:33)
Solving the Convex Optimization Problem (1:05)
KKT Conditions (1:24)
Primal and Dual Problems (1:24)
Solving the Dual Problem (3:31)
The Coefficients for the Maximal Margin Hyperplane (0:29)
The Support Vectors (0:58)
Classifying Test Points (1:50)
Maximal Margin Classifier Example 1 (9:50)
Maximal Margin Classifier Example 2 (11:41)
Summary: Maximal Margin Classifier (0:31)
Problem Set: Maximal Margin Classifier
Solution Set: Maximal Margin Classifier
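Rather than solving the dual problem by hand, a quick sketch of the maximal margin classifier can lean on scikit-learn, assuming it is installed: a linear kernel with a very large cost parameter C approximates the hard-margin solution. The two well-separated clusters below are illustrative data, not the course's worked examples.

```python
import numpy as np
from sklearn.svm import SVC

# Two well-separated clusters (illustrative data)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=[-2, -2], size=(20, 2)),
               rng.normal(loc=[2, 2], size=(20, 2))])
y = np.array([-1] * 20 + [1] * 20)

# A very large C approximates the hard-margin (maximal margin) classifier
clf = SVC(kernel="linear", C=1e6).fit(X, y)
print("hyperplane coefficients beta:", clf.coef_, "beta_0:", clf.intercept_)
print("support vectors:", clf.support_vectors_)
```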
Support Vector Classifier
Support Vector Classifier (3:54)
Slack Variables: Points on Correct Side of Hyperplane (3:47)
Slack Variables: Points on Wrong Side of Hyperplane (1:37)
Formulating the Optimization Problem (3:52)
Definition of Support Vector Classifier (0:44)
A Convex Optimization Problem (1:46)
Solving the Convex Optimization Problem (Soft Margin) (6:38)
The Coefficients for the Soft Margin Hyperplane (2:09)
The Support Vectors (Soft Margin) (1:37)
Classifying Test Points (Soft Margin) (1:36)
Support Vector Classifier Example 1 (14:53)
Support Vector Classifier Example 2 (9:19)
Summary: Support Vector Classifier (0:41)
Problem Set: Support Vector Classifier
Solution Set: Support Vector Classifier
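For the soft-margin case, a similar sketch (again assuming scikit-learn, with illustrative overlapping data) shows how the cost parameter C trades margin width against slack: smaller C tolerates more margin violations and typically yields more support vectors.

```python
import numpy as np
from sklearn.svm import SVC

# Overlapping classes, so no separating hyperplane exists (illustrative data)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=[-1, -1], size=(50, 2)),
               rng.normal(loc=[1, 1], size=(50, 2))])
y = np.array([-1] * 50 + [1] * 50)

# Smaller C permits more slack (more margin violations are tolerated);
# larger C penalizes violations more heavily.
for C in (0.1, 1.0, 10.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    print(f"C={C}: {clf.support_vectors_.shape[0]} support vectors")
```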
Support Vector Machine Classifier
Support Vector Machine Classifier (1:19)
Enlarging the Feature Space (5:22)
The Kernel Trick (4:24)
Summary: Support Vector Machine Classifier (1:07)
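Finally, a sketch of the kernel trick, assuming scikit-learn and an illustrative circular decision boundary: the RBF kernel implicitly enlarges the feature space, and only the kernel values K(x, x′) are ever computed, never the enlarged features themselves.

```python
import numpy as np
from sklearn.svm import SVC

# A target that is not linearly separable in the original feature space
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 2))
y = np.where(np.sum(X**2, axis=1) < 1.0, 1, -1)   # inside vs. outside a circle

# The RBF kernel enlarges the feature space implicitly via the kernel trick
clf = SVC(kernel="rbf", gamma=1.0, C=1.0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```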
Conclusion
Concluding Letter