Video Recordings for Previous Semesters
Lecture | Lecture content | Fall 22 Recording | Spring 23 Recording |
---|---|---|---|
1 | Introduction, what is optimization? Least-squares, Minimum norm | Lecture 1 | Lecture 1 |
2 | Least squares review: Vector norms, Gram–Schmidt and QR, Fundamental Theorem of Linear Algebra | Lecture 2 | Lecture 2 |
3 | Symmetric Matrices + Eigenvalues + Rayleigh Quotient + Power Iteration, Matrix Norms, Matrix Square Root, PSD Matrices | Lecture 3 | Lecture 3 |
4 | SVD and PCA | Lecture 4 | Lecture 4 |
5 | Low-rank approximation — Eckart–Young theorem. Matrix Norms | Lecture 5 | Lecture 5 |
6 | Low-rank approximation — Eckart–Young theorem, part 2 | Lecture 6 | Lecture 6 |
7 | Vector Calculus | Lecture 7 | Lecture 7 |
8 | Ridge regression: 3 interpretations. (1) Ill-conditioned matrices (2) Modified Least squares (3) Ghost Data (connection to Tikhonov) | Lecture 8 | Lecture 8 |
9 | PCA–ridge connection. Least squares as MLE; ridge as MAP. | Lecture 9 | Lecture 9 |
10 | Convexity | Lecture 10 | Lecture 10 |
11 | Convexity | Lecture 11 | Lecture 11 |
12 | Gradient descent + convergence | Lecture 12 | Lecture 12 |
13 | SGD, Projected Gradient Descent, Frank–Wolfe | Lecture 13 | Lecture 13, Lecture 14, Lecture 15 |
14 | Midterm review | MT Review | |
15 | Convex Optimization | | |
16 | Weak duality | Lecture 16 | |
17 | Strong duality | Lecture 17 | |
18 | Optimality conditions, KKT | Lecture 18 | |
19 | LPs | Lecture 19, Lecture 20 | |
20 | QPs | Lecture 21 | |
21 | SOCPs | Lecture 22 | |
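As a quick taste of the material, here is a minimal sketch of power iteration (one of Lecture 3's topics): repeatedly multiply by a symmetric matrix and normalize to approximate its dominant eigenvector, then read off the eigenvalue via the Rayleigh quotient. The matrix and iteration count below are illustrative choices, not from the course.

```python
import numpy as np

# Example symmetric matrix (illustrative; eigenvalues are (5 ± sqrt(5)) / 2).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Power iteration: apply A repeatedly and renormalize.
v = np.array([1.0, 0.0])
for _ in range(100):
    v = A @ v
    v /= np.linalg.norm(v)

# Rayleigh quotient v^T A v (with ||v|| = 1) estimates the largest eigenvalue.
lam = v @ A @ v
print(lam)  # ≈ 3.618, i.e. (5 + sqrt(5)) / 2
```

The convergence rate is governed by the eigenvalue gap: each iteration shrinks the error by roughly the ratio of the second-largest to the largest eigenvalue in magnitude.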