Mathematics for Machine Learning and Data Science
Calculus
Week 1
Derivatives
Average rate of change of a function
Derivative of a function
Non-differentiable functions
Proof that |x| is not differentiable at 0
Proof that ReLU is not differentiable at 0
Proof that radical functions are not differentiable at 0
The inverse function and its derivative
Derivative rules
Constant multiple rule
Sum or difference rule
Product rule
Quotient rule
Chain rule
Univariate optimization
Computational efficiency of symbolic, numerical and automatic differentiation
Week 2
Multivariate optimization
Tangent plane
Partial derivatives
Setting partial derivatives equal to 0 to find minima and maxima
Gradient
Week 3
Optimizing neural networks
Single-neuron network with linear activation and Mean Squared Error (MSE) loss function
Single-neuron network with sigmoid activation and Log loss function
Calculation of the partial derivative of Log loss with respect to the sigmoid activation
Calculation of the derivative of the sigmoid function
Neural network with 1 hidden layer of 3 neurons and sigmoid activations
Motivation
Neural network notation
Partial derivatives of a neural network
Forward and backward propagation
Building a neural network
Newton-Raphson method
Newton-Raphson method with one variable
Second derivatives
Second-order partial derivatives
Hessian matrix
Newton-Raphson method with more than one variable
Extra
Lagrange multiplier method
Convex optimization
Inequality constraints and the Karush–Kuhn–Tucker conditions
Wolfe dual problem
Optimizing SVM
Quasi-Newton methods
BFGS
Derivation of the direct inverse of the BFGS update