Linear Algebra Intermediate
Diagonalization, function spaces, and various derivations of the determinant (undergraduate level)
Overview
At the intermediate level, we deepen the concepts learned in the elementary course and explore advanced topics. We study simplification of matrices through diagonalization, the geometric meaning of complex eigenvalues, an introduction to infinite-dimensional vector spaces (function spaces), and various derivations of the determinant.
Learning Objectives
- Understand the conditions and procedures for diagonalization, and efficiently compute matrix powers
- Understand the relationship between complex eigenvalues and rotations
- Work with polynomial spaces and function spaces through concrete examples
- Compare and understand multiple derivations of the determinant (geometric, linear transformation, row reduction)
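The matrix-power objective above can be previewed with a short sketch (the example matrix is an illustrative choice, not taken from the course materials): if $A = PDP^{-1}$ with $D$ diagonal, then $A^k = PD^kP^{-1}$, so powering $A$ reduces to powering the diagonal entries of $D$.

```python
import numpy as np

# An illustrative diagonalizable matrix (eigenvalues 2 and 5).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigendecomposition: columns of P are eigenvectors, eigvals fills D.
eigvals, P = np.linalg.eig(A)

# A^k via diagonalization: A^k = P D^k P^{-1},
# where D^k just raises each diagonal entry to the k-th power.
k = 10
A_power = P @ np.diag(eigvals ** k) @ np.linalg.inv(P)

# Check against direct repeated multiplication.
direct = np.linalg.matrix_power(A, k)
print(np.allclose(A_power, direct))  # True
```

Chapter 1 develops when such a decomposition exists and how to compute it by hand.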
Table of Contents
- Chapter 1. Diagonalization (conditions for diagonalizability, procedures, matrix powers, the matrix exponential)
- Chapter 2. Complex Eigenvalues (complex eigenvalues and rotations, complex eigenvalues of real matrices, oscillatory systems)
- Chapter 3. Polynomial Spaces (bases of $\mathcal{P}_n$, differential operators, polynomial operations as linear maps)
- Chapter 4. Function Spaces ($C[a,b]$, $L^2$ spaces, infinite-dimensional vector spaces, inner products)
- Chapter 5. Determinant: Why the Leibniz Formula (deriving the Leibniz formula from geometric motivation)
- Chapter 6. Determinant: The Linear Transformation Perspective (scaling factors, the IOLA (Introduction to Linear Algebra) approach)
- Chapter 7. Determinant: Stacking Bars (building the determinant by adding row vectors one at a time)
- Chapter 8. Determinant Derivation: Row Reduction and Upper Triangular Matrices (reduction to upper triangular form, computational algorithms)
- Chapter 9. Tensors (multidimensional arrays, tensor products, contraction, implementation in PyTorch)
- Chapter 10. Norms ($L^1$/$L^2$/$L^\infty$ norms, matrix norms, applications to regularization)
- Chapter 11. Inner Products (axiomatic definition, projections, cosine similarity, attention mechanisms)
- Chapter 12. Linear Transformations (definition, matrix representation, kernel and image, the rank-nullity theorem)
- Chapter 13. Kernel and Image (proofs of subspace properties, basis construction, the dimension theorem)
- Chapter 14. Eigenvalues and Machine Learning (PCA, spectral clustering, PageRank)
- Chapter 15. Applications of Matrix Decompositions (SVD/NMF/Tucker decomposition, recommender systems, latent semantic analysis)
- Chapter 16. Pauli Matrices and Qubits (Pauli matrices, quantum gates, the Bloch sphere)
- Chapter 17. Tensor Products and Multi-Qubit Systems (Kronecker products, quantum entanglement, multi-qubit gates)
- Chapter 18. Determinant: The Leibniz Formula and Rigorous Proofs (permutations and signs, definition via the Leibniz formula, multilinearity and the alternating property)
- Chapter 19. Determinant: Axiomatic Definition and Uniqueness (proving existence and uniqueness of the determinant starting from the axioms)
- Chapter 20. Applications of Eigenvalues (systems of differential equations, Markov chains, PCA, Google PageRank)
Columns
- History of Determinant Computation: From Hand Calculation to State-of-the-Art Algorithms (tracing 340 years of history from Seki Takakazu and Leibniz (1683) to modern fast matrix multiplication, with the exponent $\omega$ approaching 2)
Special Matrices & Decompositions
- Hessenberg Reduction: transformation to an "almost triangular" matrix; accelerates eigenvalue computation as preprocessing for the QR algorithm
- Companion Matrix: connects polynomial roots to matrix eigenvalues; a building block of the Frobenius normal form
- Tridiagonal Matrix: a sparse matrix with bandwidth 1; $O(n)$ direct solution via the Thomas algorithm, and preprocessing for eigenvalue computation
- Householder Transformation (Reflection): orthogonal transformation via reflection across a hyperplane; the foundation of QR decomposition, Hessenberg reduction, and tridiagonalization
- LU Decomposition: factorization into lower and upper triangular matrices; existence conditions, uniqueness proof, PLU decomposition, and the relation to determinants
- Cholesky Decomposition: the $A = LL^T$ factorization of a symmetric positive definite matrix; existence and uniqueness, the $LDL^T$ decomposition, and positive definiteness testing
- Singular Value Decomposition (SVD): the $A = U\Sigma V^T$ factorization of an arbitrary matrix; geometric interpretation, the pseudoinverse, and low-rank approximation
- Quadratic Forms: classification of $Q(x) = x^T A x$; the principal axis theorem, Sylvester's law of inertia, and applications to extremum determination
- Similarity Transformation: matrix transformation through change of basis; similarity invariants, and the relation to diagonalization and Jordan normal form
- Normal Equations: the foundation of the least squares method; derivation via the projection theorem, numerical solutions using Cholesky and QR decomposition
- Symmetric Positive Definite Matrices: quadratic forms, definitions of positive definite and positive semi-definite, Sylvester's criterion, and the relation to Cholesky decomposition
- Tensor Product: universality of bilinear maps, the Kronecker product, symmetric and alternating tensors, applications to quantum mechanics
- Rotation Matrix: 2D and 3D rotation matrices, Euler angles, Rodrigues' rotation formula, $SO(n)$ and Lie algebras, the quaternion representation
- Hadamard Matrix: definition and properties, the Sylvester and Paley constructions, the Hadamard conjecture, applications to CDMA, quantum computing, and design of experiments
- Stochastic Matrix: right, left, and doubly stochastic matrices, the Perron-Frobenius theorem, Markov chains, the convergence theorem, PageRank and MCMC
- Hermitian Matrix: definition and basic properties, the spectral theorem, positive definiteness, the Rayleigh quotient, applications to quantum mechanics, statistics, and signal processing
- Kronecker Product: definition and block matrix representation, basic properties, the mixed-product property, the relation to the vec operator, eigenvalues, the relation to the tensor product, applications
- Characteristic Polynomial: definition and basic properties, relation to eigenvalues, the Cayley-Hamilton theorem, similarity invariance, relation to the minimal polynomial, applications to inverse matrix computation and stability analysis
- Minimal Polynomial: definition, existence and uniqueness, relation to the characteristic polynomial, diagonalizability conditions, relation to Jordan normal form, concrete examples with nilpotent matrices, applications
- Matrix Norm: matrix norm axioms, operator norms, the Frobenius norm, the condition number, equivalence and inequalities, applications
- Orthogonal Projection: the projection matrix conditions $P^2 = P$ and $P^T = P$, the projection formula onto a subspace, orthogonal complements, the relation to Gram-Schmidt, applications to least squares and QR decomposition
- Sparse Matrix: definition, storage formats (COO, CSR, CSC), sparse matrix operations, fill-in and reordering, iterative solvers (CG, GMRES, preconditioning), applications
Prerequisites
- Content of the Elementary Linear Algebra course
- In particular: vector spaces, eigenvalues, and the basics of determinants