DX 601 - Mathematical Foundations of Machine Learning
The notes linked below are intended to supplement the live sessions, Blackboard content, and assigned readings for DX 601. They aim to provide greater depth while remaining more accessible than the textbooks. Beware: these notes are AI-generated, so read them critically and please check with the instructor if you find any mistakes.
Week 8: Vectors and Matrices
- Vectors (what)
- Matrices (what)
- Dot product (what)
- Linear maps (what)
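As a concrete illustration of this week's topics, here is a minimal NumPy sketch; the vectors and matrix are invented for the example.

```python
import numpy as np

# A vector is a 1-D array of numbers; a matrix is a 2-D array.
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])

# The dot product sums elementwise products: v . w = sum_i v_i * w_i.
print(np.dot(v, w))   # 1*4 + 2*5 + 3*6 = 32.0

# A matrix represents a linear map: x -> A @ x sends R^3 to R^2.
print(A @ v)          # [1*1 + 0*2 + 2*3, 0*1 + 1*2 + 1*3] = [7. 5.]

# Linearity: A @ (a*v + b*w) == a*(A @ v) + b*(A @ w).
a, b = 2.0, -1.0
assert np.allclose(A @ (a * v + b * w), a * (A @ v) + b * (A @ w))
```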
Week 9: Systems of Linear Equations
- Systems of linear equations (what)
- Gaussian elimination (what)
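A minimal teaching sketch of Gaussian elimination with partial pivoting; the 2x2 system is made up for illustration, and the answer is checked against `np.linalg.solve`.

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve Ax = b by forward elimination and back substitution.
    Teaching version with partial pivoting; assumes A is square and
    nonsingular."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n):
        # Partial pivoting: swap in the row with the largest pivot.
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        # Eliminate the entries below the pivot.
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back substitution on the resulting upper-triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# Solve the system  2x + y = 5,  x - y = 1.
A = np.array([[2.0, 1.0], [1.0, -1.0]])
b = np.array([5.0, 1.0])
x = gaussian_elimination(A, b)
print(x)                                      # [2. 1.]
assert np.allclose(x, np.linalg.solve(A, b))  # agrees with NumPy's solver
```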
Week 10: Linear Independence
- Linear independence (what)
- Spans (what)
- Vector spaces (what)
- Bases and coordinates (what)
- Orthogonal bases (what)
- Gram-Schmidt algorithm (what)
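A small NumPy sketch of these ideas: a rank check for linear independence, followed by a classical Gram-Schmidt pass (assuming linearly independent inputs) on made-up vectors.

```python
import numpy as np

def gram_schmidt(V):
    """Orthonormalize the rows of V (classical Gram-Schmidt).
    Teaching version; assumes the rows are linearly independent."""
    basis = []
    for v in V:
        # Subtract the projection of v onto each basis vector found so far.
        for q in basis:
            v = v - (q @ v) * q
        basis.append(v / np.linalg.norm(v))
    return np.array(basis)

V = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# Three independent vectors in R^3 give rank 3, so they span R^3
# and therefore form a basis.
print(np.linalg.matrix_rank(V))   # 3

Q = gram_schmidt(V)
print(np.round(Q @ Q.T, 6))       # identity: the rows are orthonormal
```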
Week 11: Matrix Ranks and Invertibility
- Matrix rank (what)
- Matrix invertibility (what)
- Determinants (what)
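A short NumPy illustration on invented 2x2 matrices, showing how rank, determinant, and invertibility line up.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# A square matrix is invertible exactly when it has full rank,
# equivalently when its determinant is nonzero.
print(np.linalg.matrix_rank(A))   # 2 (full rank)
print(np.linalg.det(A))           # 1.0 (nonzero, so invertible)

A_inv = np.linalg.inv(A)
print(np.round(A @ A_inv, 6))     # identity matrix

# A rank-deficient matrix: the second row is a multiple of the first.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.matrix_rank(B))   # 1
print(np.linalg.det(B))           # 0.0 (singular: no inverse exists)
```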
Week 12: Eigenvectors and Eigenvalues
- Eigenvalues and eigenvectors (what)
- Eigendecomposition (what)
- Diagonalization (what)
- Singular value decomposition (what)
- Every matrix has a singular value decomposition (why)
- Spectral theorem for real symmetric matrices (why)
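A NumPy sketch of the decompositions above on made-up matrices: `eigh` for the symmetric (spectral) case and `svd` for a non-square matrix.

```python
import numpy as np

# A real symmetric matrix: the spectral theorem says it has real
# eigenvalues and an orthonormal basis of eigenvectors.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, vecs = np.linalg.eigh(S)   # eigh is specialized to symmetric matrices
print(vals)                      # [1. 3.]

# Eigendecomposition / diagonalization: S = Q diag(vals) Q^T.
assert np.allclose(S, vecs @ np.diag(vals) @ vecs.T)

# Every matrix, even a non-square one, has an SVD: M = U diag(s) V^T.
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])
U, s, Vt = np.linalg.svd(M, full_matrices=False)
assert np.allclose(M, U @ np.diag(s) @ Vt)
print(s)                         # singular values, sorted descending
```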
Week 13: Principal Components Analysis
- Covariance matrix (what)
- Principal components (what)
- Principal components are eigenvectors of the covariance (why)
- Dimensionality reduction (what)
- Linear dimensionality reduction (what)
- Low-rank approximation (what)
- Eckart-Young theorem (why)
- Denoising data (what)
- Matrix completion (what)
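A minimal PCA sketch in NumPy on synthetic data, showing that the principal components are eigenvectors of the covariance and that projecting onto them gives a low-rank reconstruction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D data stretched along one direction.
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 1.0],
                                          [1.0, 0.5]])

# Center the data, then form the covariance matrix.
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / (len(Xc) - 1)     # same as np.cov(X, rowvar=False)

# Principal components are the eigenvectors of the covariance,
# ordered by eigenvalue (variance explained).
vals, vecs = np.linalg.eigh(C)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

# Linear dimensionality reduction: project onto the top component.
Z = Xc @ vecs[:, :1]              # 200 x 1 scores
X_approx = Z @ vecs[:, :1].T      # rank-1 reconstruction; by Eckart-Young
                                  # this is the best rank-1 approximation
                                  # of the centered data in least squares
print(vals / vals.sum())          # fraction of variance per component
```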
Week 14: Introduction to Modeling
- Defining model hypotheses (what)
- Linear regression (what)
- Loss functions (what)
- Finding better hypotheses (what)
- Generalization (what)
- Bias-variance decomposition (what)
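A small end-to-end sketch in NumPy: fit a linear hypothesis by least squares on synthetic data, measure its squared loss, and check generalization on held-out data. The data and the weight vector `w_true` are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothesis class: linear models y ~ X @ w. Generate noisy data from
# a known "true" weight vector so the fit can be compared against it.
n, d = 100, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)

def squared_loss(w, X, y):
    """Mean squared error: the loss minimized by least-squares regression."""
    return np.mean((X @ w - y) ** 2)

# Finding a better hypothesis: least squares picks the w minimizing the loss.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(squared_loss(w_hat, X, y))           # small training loss

# Generalization: evaluate the same hypothesis on fresh (held-out) data.
X_test = rng.normal(size=(n, d))
y_test = X_test @ w_true + 0.1 * rng.normal(size=n)
print(squared_loss(w_hat, X_test, y_test))  # test loss should be comparable
```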