Mathematics for Machine Learning

Notes from the course by Imperial College London on Coursera.

This repository will cover the following topics:

  • Linear Algebra;
  • Multivariate Calculus;
  • Principal Component Analysis (PCA).

Course description (from the Coursera page)

For a lot of higher level courses in Machine Learning and Data Science, you find you need to freshen up on the basics in mathematics - stuff you may have studied before in school or university, but which was taught in another context, or not very intuitively, such that you struggle to relate it to how it’s used in Computer Science. This specialization aims to bridge that gap, getting you up to speed in the underlying mathematics, building an intuitive understanding, and relating it to Machine Learning and Data Science.

In the first course on Linear Algebra we look at what linear algebra is and how it relates to data. Then we look through what vectors and matrices are and how to work with them.

The second course, Multivariate Calculus, builds on this to look at how to optimize fitting functions to get good fits to data. It starts from introductory calculus and then uses the matrices and vectors from the first course to look at data fitting.

The third course, Dimensionality Reduction with Principal Component Analysis, uses the mathematics from the first two courses to compress high-dimensional data. This course is of intermediate difficulty and will require Python and numpy knowledge.

At the end of this specialization you will have gained the prerequisite mathematical knowledge to continue your journey and take more advanced courses in machine learning.

Course link

Linear Algebra

  • [Week 1]:

    Solving simultaneous equations

  • [Week 2]:

    Modulus and inner products

    Cosine and dot products

    Scalar and vector projections

    Basis change

    Linear dependence

  • [Week 3]:

    Matrix multiplication

    Matrix properties

    Identity matrix

    Matrix transformation

    Solving simultaneous equations using the matrix method

    Inverse matrix

  • [Week 4]:

    Einstein summation

    Symmetry of the dot product

    Notes on non-square matrix multiplication

    Changing basis in matrices

    Transformation in changed basis

    Orthogonal matrices

    Gram-Schmidt process

  • [Week 5]:

    Eigenvalues

    Eigenvectors

    Special eigen-cases

    Changing to the eigenbasis
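
To give a flavour of the material, here is a minimal NumPy sketch (not taken from the course labs; the vectors, matrix, and values are made-up examples) touching on a few of the topics above: inner products, scalar and vector projections, solving simultaneous equations with the matrix method, and eigenvalues/eigenvectors.

```python
import numpy as np

# Two example vectors (arbitrary values, for illustration only).
a = np.array([3.0, 4.0])
b = np.array([1.0, 2.0])

# Modulus (length) and inner (dot) product -- Week 2.
length_a = np.linalg.norm(a)
dot_ab = a @ b

# Cosine of the angle between a and b, and scalar/vector projections of b onto a.
cos_theta = dot_ab / (np.linalg.norm(a) * np.linalg.norm(b))
scalar_proj = dot_ab / np.linalg.norm(a)
vector_proj = (dot_ab / (a @ a)) * a

# Solving simultaneous equations A x = c with the matrix method -- Weeks 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
c = np.array([5.0, 10.0])
x = np.linalg.solve(A, c)        # equivalent to A^{-1} c, but numerically more stable

# Eigenvalues and eigenvectors -- Week 5.
eigvals, eigvecs = np.linalg.eig(A)

print(length_a, dot_ab, cos_theta, scalar_proj, vector_proj, x, eigvals)
```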

Multivariate Calculus

  • [Week 1]:

    Differentiation and definition of a derivative;

    Sum rule;

    Power rule;

    Special-case derivatives;

    Product rule;

    Chain rule;

    All-around applications.

  • [Week 2]:

    Dependent and independent variables;

    Extension to multivariate differentiation;

    A more complex multivariate example;

    Multivariate partial differentiation;

    Jacobian vector;

    Hessian matrix.

  • [Week 3]:

    Multivariate chain rule;

    Neural network in matrix form;

    Applying the chain rule to a neural network.

  • [Week 4]:

    Why approximate functions;

    Power series;

    Maclaurin series;

    Taylor series;

    Linearisation;

    Multivariate Taylor series.

  • [Week 5]:

    One-dimensional Newton-Raphson;

    Gradient descent;

    Constrained optimisation.
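
A similar minimal sketch for the calculus material, using a hand-picked example function f(x, y) = x^2 + 3*y^2 (an illustration, not an example from the course): its gradient and Hessian, a few steps of gradient descent, and one-dimensional Newton-Raphson.

```python
import numpy as np

# Example function f(x, y) = x**2 + 3*y**2 (chosen arbitrarily for illustration).
def f(v):
    x, y = v
    return x**2 + 3*y**2

# Its gradient (the Jacobian of a scalar function) and Hessian, worked out by hand -- Week 2.
def grad_f(v):
    x, y = v
    return np.array([2*x, 6*y])

hessian_f = np.array([[2.0, 0.0],
                      [0.0, 6.0]])

# Plain gradient descent -- Week 5.
v = np.array([4.0, -2.0])    # arbitrary starting point
step = 0.1
for _ in range(100):
    v = v - step * grad_f(v)
print(v, f(v))               # converges towards the minimum at (0, 0)

# One-dimensional Newton-Raphson for a root of g(x) = x**2 - 2 -- Week 5.
x = 1.0
for _ in range(10):
    x = x - (x**2 - 2) / (2*x)
print(x)                     # approaches sqrt(2)
```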

Principal Component Analysis

  • [Week 1]:

    Mean of a dataset;

    One dimensional variance;

    Covariance matrix;

    Linear transformation properties for the mean, variance and covariance;

    NumPy tutorial (from the course lab).

  • [Week 2]:

    Dot product, angles and distance between vectors;

    Inner products;

    Inner products and length of vectors;

    Inner products, orthogonality and angle between vectors.

  • [Week 3]:

    Projections onto a 1-D subspace;

    Projections in higher dimensions (N-D subspace).

  • [Week 4]:

    PCA objective and key ideas;

    Coordinates of projected data;

    Derivation of the average squared reconstruction error;

    Finding the basis vectors that span the principal subspace;

    Summary of key equations.
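
Finally, a small NumPy sketch of the PCA steps listed above, on a made-up 2-D dataset (not from the course labs): mean, covariance matrix, projection onto the 1-D principal subspace, and the average squared reconstruction error.

```python
import numpy as np

# Small synthetic 2-D dataset (rows are data points); values are arbitrary.
X = np.array([[2.5, 2.4],
              [0.5, 0.7],
              [2.2, 2.9],
              [1.9, 2.2],
              [3.1, 3.0],
              [2.3, 2.7]])

# Mean of the dataset and the covariance matrix -- Week 1.
mean = X.mean(axis=0)
Xc = X - mean                       # centre the data
cov = (Xc.T @ Xc) / X.shape[0]      # biased covariance, as in the course notation

# Eigen-decomposition of the covariance matrix; the eigenvector with the
# largest eigenvalue spans the principal subspace -- Week 4.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
B = eigvecs[:, order[:1]]           # basis of the 1-D principal subspace

# Projection of the centred data onto the principal subspace -- Week 3,
# and the average squared reconstruction error -- Week 4.
X_proj = Xc @ B @ B.T + mean
reconstruction_error = np.mean(np.sum((X - X_proj)**2, axis=1))

print(mean, cov, B.ravel(), reconstruction_error)
```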

About

Notes from Mathematics for Machine Learning course (Imperial College London, Coursera): Linear Algebra, Multivariate Calculus, Principal Component Analysis (PCA)

License: MIT

