O'Reilly - Math for Machine Learning
by Richard Han | Released May 2018
Would you like to learn a mathematics subject that is crucial for many high-demand, lucrative career fields, such as computer science, data science, and artificial intelligence?

If you're looking to gain a solid foundation in machine learning to further your career goals, in a way that allows you to study on your own schedule at a fraction of the cost of a traditional university, this online course is for you. Whether you're a working professional who needs a refresher on machine learning or a complete beginner learning it for the first time, this online course is for you.

Why you should take this online course:
- You need to refresh your knowledge of machine learning for your career to earn a higher salary.
- You need to learn machine learning because it is a required subject in your chosen career field, such as data science or artificial intelligence.
- You intend to pursue a master's degree or PhD, and machine learning is a required or recommended subject.

Why you should choose this instructor:
- I earned my PhD in Mathematics from the University of California, Riverside.
- I have created many successful online math courses that students around the world have found invaluable, including courses in linear algebra, discrete math, and calculus.
- Course Promo 00:02:42
- Introduction
- Course Introduction 00:02:46
- Linear Regression
- Linear Regression 00:07:33
- The Least Squares Method 00:11:25
- Linear Algebra Solution to Least Squares Problem 00:12:51
- Example Linear Regression 00:04:05
- Summary Linear Regression 00:00:34
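The lectures above cover the least-squares method for fitting a line to data. As a rough illustration of what those lectures derive, here is a minimal sketch of the closed-form least-squares solution for a one-variable linear regression; the data points are hypothetical:

```python
# Hypothetical data points; the least-squares slope and intercept
# come from the normal equations derived in the lectures.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 4.1, 6.2, 8.0]

n = len(xs)
x_mean = sum(xs) / n
y_mean = sum(ys) / n

# Normal-equation solution for y = b0 + b1*x:
#   b1 = sum((x - x_mean)(y - y_mean)) / sum((x - x_mean)^2)
#   b0 = y_mean - b1 * x_mean
b1 = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
     sum((x - x_mean) ** 2 for x in xs)
b0 = y_mean - b1 * x_mean

print(round(b0, 3), round(b1, 3))  # fitted intercept and slope
```

For these four points the fitted line comes out close to y = 0.15 + 1.98x, so the residuals are small, which is what minimizing the sum of squared errors achieves.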
- Linear Discriminant Analysis
- Classification 00:01:15
- Linear Discriminant Analysis 00:00:44
- The Posterior Probability Functions 00:03:43
- Modelling the Posterior Probability Functions 00:07:13
- Linear Discriminant Functions 00:05:32
- Estimating the Linear Discriminant Functions 00:06:00
- Classifying Data Points Using Linear Discriminant Functions 00:03:09
- LDA Example 1 00:13:52
- LDA Example 2 00:17:38
- Summary Linear Discriminant Analysis 00:01:34
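The LDA lectures above build linear discriminant functions from class means, a shared variance, and prior probabilities, then classify a point by the largest discriminant value. A minimal one-dimensional sketch of that idea, using hypothetical training data:

```python
import math

# Hypothetical one-dimensional training data for two classes
class_data = {0: [1.0, 1.5, 2.0], 1: [4.0, 4.5, 5.0]}

n_total = sum(len(v) for v in class_data.values())
means = {k: sum(v) / len(v) for k, v in class_data.items()}
priors = {k: len(v) / n_total for k, v in class_data.items()}
# Pooled variance estimate, shared across classes as LDA assumes
var = sum((x - means[k]) ** 2
          for k, v in class_data.items() for x in v) / (n_total - len(class_data))

def discriminant(x, k):
    # Linear discriminant: delta_k(x) = x*mu_k/var - mu_k^2/(2*var) + log(prior_k)
    return x * means[k] / var - means[k] ** 2 / (2 * var) + math.log(priors[k])

x_new = 2.4
predicted = max(class_data, key=lambda k: discriminant(x_new, k))
print(predicted)
```

Because the test point 2.4 is much closer to the class-0 mean (1.5) than to the class-1 mean (4.5), the class-0 discriminant is larger and the point is assigned to class 0.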
- Logistic Regression
- Logistic Regression 00:01:16
- Logistic Regression Model of the Posterior Probability Function 00:03:02
- Estimating the Posterior Probability Function 00:08:57
- The Multivariate Newton-Raphson Method 00:09:14
- Maximizing the Log-Likelihood Function 00:13:52
- Logistic Regression Example 00:09:55
- Summary Logistic Regression 00:01:21
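The logistic regression lectures above estimate the posterior probability function by maximizing the log-likelihood with the multivariate Newton-Raphson method. A minimal sketch of that iteration for a one-variable model with an intercept, on hypothetical (and deliberately non-separable) data:

```python
import math

# Hypothetical 1-D data with binary labels (classes overlap, so the MLE is finite)
xs = [0.5, 1.0, 1.5, 2.5, 3.0, 4.0]
ys = [0, 0, 1, 0, 1, 1]

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

# Model: p(x) = sigmoid(b0 + b1*x); maximize the log-likelihood via Newton-Raphson
b0, b1 = 0.0, 0.0
for _ in range(15):
    ps = [sigmoid(b0 + b1 * x) for x in xs]
    # Gradient of the log-likelihood
    g0 = sum(y - p for y, p in zip(ys, ps))
    g1 = sum(x * (y - p) for x, y, p in zip(xs, ys, ps))
    # Hessian entries (weights p*(1-p) come from differentiating the sigmoid)
    w = [p * (1 - p) for p in ps]
    h00 = -sum(w)
    h01 = -sum(x * wi for x, wi in zip(xs, w))
    h11 = -sum(x * x * wi for x, wi in zip(xs, w))
    det = h00 * h11 - h01 * h01
    # Newton update: beta <- beta - H^{-1} g, with the 2x2 inverse written out
    b0 -= (h11 * g0 - h01 * g1) / det
    b1 -= (h00 * g1 - h01 * g0) / det

print(round(b0, 3), round(b1, 3))
```

After the iterations converge, b1 is positive and the fitted probabilities are below 1/2 for small x and above 1/2 for large x, matching the labels' overall trend.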
- Artificial Neural Networks
- Artificial Neural Networks 00:00:36
- Neural Network Model of the Output Functions 00:13:00
- Forward Propagation 00:00:51
- Choosing Activation Functions 00:04:30
- Estimating the Output Functions 00:02:17
- Error Function for Regression 00:02:27
- Error Function for Binary Classification 00:06:16
- Error Function for Multiclass Classification 00:04:38
- Minimizing the Error Function Using Gradient Descent 00:06:27
- Backpropagation Equations 00:04:17
- Summary of Backpropagation 00:01:27
- Summary Artificial Neural Networks 00:01:48
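The neural-network lectures above model the output functions layer by layer and compute them by forward propagation. A minimal sketch of a forward pass through a 2-input, 2-hidden-unit, 1-output network with sigmoid activations; the weights here are hypothetical, not trained:

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

# Hypothetical (untrained) weights and biases
W1 = [[0.5, -0.3], [0.8, 0.2]]   # hidden-layer weights, one row per hidden unit
B1 = [0.1, -0.1]                 # hidden-layer biases
W2 = [1.2, -0.7]                 # output-layer weights
B2 = 0.05                        # output-layer bias

def forward(x):
    # Hidden activations: a_j = sigmoid(w_j . x + b_j)
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, B1)]
    # Output: sigmoid of the weighted sum of the hidden activations
    return sigmoid(sum(w * h for w, h in zip(W2, hidden)) + B2)

y = forward([1.0, 2.0])
print(round(y, 4))
```

Training would then minimize an error function over these weights with gradient descent, using backpropagation to obtain the gradients, as the lectures describe.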
- Maximal Margin Classifier
- Maximal Margin Classifier 00:02:30
- Definitions of Separating Hyperplane and Margin 00:05:44
- Proof 1 00:06:43
- Maximizing the Margin 00:03:36
- Definition of Maximal Margin Classifier 00:01:02
- Reformulating the Optimization Problem 00:07:37
- Proof 2 00:01:14
- Proof 3 00:04:52
- Proof 4 00:08:41
- Proof 5 00:05:11
- Solving the Convex Optimization Problem 00:01:06
- KKT Conditions 00:01:25
- Primal and Dual Problems 00:01:25
- Solving the Dual Problem 00:03:31
- The Coefficients for the Maximal Margin Hyperplane 00:00:30
- The Support Vectors 00:00:58
- Classifying Test Points 00:01:51
- Maximal Margin Classifier Example 1 00:09:50
- Maximal Margin Classifier Example 2 00:11:41
- Summary Maximal Margin Classifier 00:00:31
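The maximal margin lectures above define the margin of a separating hyperplane as the smallest distance from the training points to the hyperplane. A minimal sketch of that definition, with a hypothetical hyperplane and hypothetical labeled points:

```python
import math

# Hypothetical separating hyperplane: b0 + b1*x1 + b2*x2 = 0
b0, b1, b2 = -3.0, 1.0, 1.0
# Hypothetical points with labels y in {-1, +1}
points = [([1.0, 0.5], -1), ([0.5, 1.0], -1), ([2.5, 2.0], 1), ([2.0, 3.0], 1)]

norm = math.hypot(b1, b2)
# Signed distance of each labeled point from the hyperplane;
# positive means the point lies on its correct side.
dists = [y * (b0 + b1 * x[0] + b2 * x[1]) / norm for x, y in points]
margin = min(dists)
print(round(margin, 4))
```

The points attaining this minimum distance are the support vectors; the maximal margin classifier chooses the hyperplane coefficients that make this minimum as large as possible.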
- Support Vector Classifier
- Support Vector Classifier 00:03:54
- Slack Variables Points on Correct Side of Hyperplane 00:03:47
- Slack Variables Points on Wrong Side of Hyperplane 00:01:38
- Formulating the Optimization Problem 00:03:53
- Definition of Support Vector Classifier 00:00:44
- A Convex Optimization Problem 00:01:47
- Solving the Convex Optimization Problem (Soft Margin) 00:06:38
- The Coefficients for the Soft Margin Hyperplane 00:02:09
- Classifying Test Points (Soft Margin) 00:01:36
- The Support Vectors (Soft Margin) 00:01:37
- Support Vector Classifier Example 1 00:14:53
- Support Vector Classifier Example 2 00:09:20
- Summary Support Vector Classifier 00:00:42
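The support vector classifier lectures above introduce slack variables to allow points inside the margin or on the wrong side of the hyperplane. A minimal sketch of how a slack variable is computed for each point, using a hypothetical soft-margin hyperplane:

```python
# Hypothetical soft-margin hyperplane: f(x) = b0 + b1*x1 + b2*x2
b0, b1, b2 = -3.0, 1.0, 1.0
# Hypothetical labeled points; the last one lies on the wrong side of f(x) = 0
points = [([1.0, 0.5], -1), ([2.5, 2.0], 1), ([2.6, 0.2], 1)]

# Slack variable for each point: xi_i = max(0, 1 - y_i * f(x_i)).
# xi = 0: outside the margin on the correct side; 0 < xi <= 1: inside the
# margin but correctly classified; xi > 1: on the wrong side of the hyperplane.
slacks = [max(0.0, 1.0 - y * (b0 + b1 * x[0] + b2 * x[1])) for x, y in points]
print([round(s, 2) for s in slacks])
```

The optimization problem in the lectures then minimizes the norm of the coefficients while bounding the total slack, trading margin width against violations.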
- Support Vector Machine Classifier
- Support Vector Machine Classifier 00:01:20
- Enlarging the Feature Space 00:05:23
- The Kernel Trick 00:04:25
- Summary Support Vector Machine Classifier 00:01:08
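The kernel-trick lectures above show that a kernel computes inner products in an enlarged feature space without ever constructing that space. A minimal sketch verifying this for the degree-2 polynomial kernel on hypothetical 2-D points, by comparing it against the explicit enlarged feature map:

```python
import math

def poly_kernel(x, z):
    # Degree-2 polynomial kernel: K(x, z) = (1 + x . z)^2
    return (1.0 + sum(a * b for a, b in zip(x, z))) ** 2

def phi(x):
    # Explicit enlarged feature map for 2-D input whose ordinary
    # inner product reproduces the degree-2 polynomial kernel
    x1, x2 = x
    r2 = math.sqrt(2.0)
    return [1.0, r2 * x1, r2 * x2, x1 * x1, x2 * x2, r2 * x1 * x2]

x, z = [1.0, 2.0], [3.0, -1.0]
lhs = poly_kernel(x, z)                              # kernel in the original space
rhs = sum(a * b for a, b in zip(phi(x), phi(z)))     # inner product in the enlarged space
print(abs(lhs - rhs) < 1e-9)
```

Both computations give the same number, but the kernel works in 2 dimensions while the explicit map works in 6; this gap is what makes the kernel trick efficient for high-degree or infinite-dimensional feature spaces.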