Linear Algebra Intermediate

Diagonalization, function spaces, and various derivations of the determinant (undergraduate level)

Overview

At the intermediate level, we deepen the concepts learned in the elementary course and explore advanced topics. We study simplification of matrices through diagonalization, the geometric meaning of complex eigenvalues, an introduction to infinite-dimensional vector spaces (function spaces), and various derivations of the determinant.

Learning Objectives

  • Understand the conditions and procedures for diagonalization, and efficiently compute matrix powers
  • Understand the relationship between complex eigenvalues and rotations
  • Work with polynomial spaces and function spaces through concrete examples
  • Compare and understand multiple derivations of the determinant (geometric, linear transformation, row reduction)
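The first objective can be previewed in a few lines: if $A = PDP^{-1}$, then $A^k = PD^kP^{-1}$, so computing a power of $A$ reduces to exponentiating the diagonal entries of $D$. A minimal NumPy sketch (the matrix here is an arbitrary illustrative example, not one from the course):

```python
import numpy as np

# Diagonalize A = P D P^{-1}; then A^k = P D^k P^{-1},
# so only the diagonal entries need exponentiating.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # eigenvalues 5 and 2

eigvals, P = np.linalg.eig(A)       # columns of P are eigenvectors
D_10 = np.diag(eigvals ** 10)       # D^10 is cheap: elementwise powers
A_10 = P @ D_10 @ np.linalg.inv(P)

# Agrees with repeated multiplication
assert np.allclose(A_10, np.linalg.matrix_power(A, 10))
```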

Table of Contents

  1. Chapter 1 Diagonalization

    Conditions for diagonalizability, procedures, matrix powers, matrix exponential

  2. Chapter 2 Complex Eigenvalues

    Complex eigenvalues and rotations, complex eigenvalues of real matrices, oscillatory systems

  3. Chapter 3 Polynomial Spaces

    Bases of $\mathcal{P}_n$, differential operators, polynomial operations as linear maps

  4. Chapter 4 Function Spaces

    $C[a,b]$, $L^2$ spaces, infinite-dimensional vector spaces, inner products

  5. Chapter 5 Determinant: Why the Leibniz Formula

    Deriving the Leibniz formula from geometric motivation

  6. Chapter 6 Determinant: The Linear Transformation Perspective

    Scaling factors, the IOLA (Introduction to Linear Algebra) approach

  7. Chapter 7 Determinant: Stacking Bars

    Building the determinant by adding row vectors one at a time

  8. Chapter 8 Determinant: Row Reduction and Upper Triangular Matrices

    Reduction to upper triangular form, computational algorithms

  9. Chapter 9 Tensors

    Multidimensional arrays, tensor products, contraction, implementation in PyTorch

  10. Chapter 10 Norms

    $L^1$/$L^2$/$L^\infty$ norms, matrix norms, applications to regularization

  11. Chapter 11 Inner Products

    Axiomatic definition, projections, cosine similarity, attention mechanisms

  12. Chapter 12 Linear Transformations

    Definition, matrix representation, kernel and image, rank-nullity theorem

  13. Chapter 13 Kernel and Image

    Proofs of subspace properties, basis construction, dimension theorem

  14. Chapter 14 Eigenvalues and Machine Learning

    PCA, spectral clustering, PageRank

  15. Chapter 15 Applications of Matrix Decompositions

    SVD/NMF/Tucker decomposition, recommender systems, latent semantic analysis

  16. Chapter 16 Pauli Matrices and Qubits

    Pauli matrices, quantum gates, Bloch sphere

  17. Chapter 17 Tensor Products and Multi-Qubit Systems

    Kronecker products, quantum entanglement, multi-qubit gates

  18. Chapter 18 Determinant: The Leibniz Formula and Rigorous Proofs

    Permutations and signs, definition via the Leibniz formula, multilinearity and alternating property

  19. Chapter 19 Determinant: Axiomatic Definition and Uniqueness

    Proving existence and uniqueness of the determinant starting from axioms

  20. Chapter 20 Applications of Eigenvalues

    Systems of differential equations, Markov chains, PCA, Google PageRank
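As a small taste of Chapter 8, the row-reduction derivation doubles as a practical algorithm: reduce the matrix to upper triangular form, flip the sign on every row swap, and multiply the resulting pivots. The sketch below is illustrative (function name and test matrix are our own, not from the course materials):

```python
import numpy as np

def det_by_row_reduction(A):
    """Determinant via reduction to upper triangular form.

    Row swaps flip the sign; adding a multiple of one row to
    another leaves the determinant unchanged; the result is the
    signed product of the diagonal (the pivots).
    """
    U = np.array(A, dtype=float)
    n = U.shape[0]
    sign = 1.0
    for k in range(n):
        # Partial pivoting: bring the largest entry up for stability.
        p = k + np.argmax(np.abs(U[k:, k]))
        if U[p, k] == 0:
            return 0.0              # no pivot in this column: singular
        if p != k:
            U[[k, p]] = U[[p, k]]
            sign = -sign            # each row swap flips the sign
        # Eliminate everything below the pivot.
        U[k+1:, k:] -= np.outer(U[k+1:, k] / U[k, k], U[k, k:])
    return sign * np.prod(np.diag(U))

A = [[2, 1, 3], [4, 1, 7], [-2, 5, 1]]
assert np.isclose(det_by_row_reduction(A), np.linalg.det(A))
```

This is essentially LU decomposition with partial pivoting, which is why Chapter 8 treats row reduction both as a derivation and as a computational algorithm.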

Columns

Special Matrices & Decompositions

  • Hessenberg Reduction

    Transformation to an "almost triangular" matrix. Accelerates eigenvalue computation as preprocessing for the QR algorithm

  • Companion Matrix

    Connecting polynomial roots to matrix eigenvalues. A building block of the Frobenius normal form

  • Tridiagonal Matrix

    A sparse matrix with bandwidth 1. O(n) direct solution via the Thomas algorithm and preprocessing for eigenvalue computation

  • Householder Transformation (Reflection)

    Orthogonal transformation via reflection across a hyperplane. The foundation for QR decomposition, Hessenberg reduction, and tridiagonalization

  • LU Decomposition

    Decomposition into lower and upper triangular matrices. Existence conditions, uniqueness proof, PLU decomposition, and relation to determinants

  • Cholesky Decomposition

    $A = LL^\top$ decomposition for positive definite symmetric matrices. Existence and uniqueness, $LDL^\top$ decomposition, and positive definiteness testing

  • Singular Value Decomposition (SVD)

    $A = U\Sigma V^\top$ decomposition for arbitrary matrices. Geometric interpretation, pseudoinverse, and low-rank approximation

  • Quadratic Forms

    Classification of $Q(x) = x^\top A x$. Principal axis theorem, Sylvester's law of inertia, and applications to extremum determination

  • Similarity Transformation

    Matrix transformation through change of basis. Similarity invariants, and relation to diagonalization and Jordan normal form

  • Normal Equations

    Foundation of the least squares method. Derivation via the projection theorem, numerical solutions using Cholesky and QR decomposition

  • Symmetric Positive Definite Matrices

    Quadratic forms, definitions of positive definite and positive semi-definite, Sylvester's criterion, and relation to Cholesky decomposition

  • Tensor Product

    Universality of bilinear maps, Kronecker product, symmetric and alternating tensors, applications to quantum mechanics

  • Rotation Matrix

    2D and 3D rotation matrices, Euler angles, Rodrigues' rotation formula, SO(n) and Lie algebras, quaternion representation

  • Hadamard Matrix

    Definition and properties, Sylvester construction, Paley construction, Hadamard conjecture, applications to CDMA, quantum computing, and design of experiments

  • Stochastic Matrix

    Right stochastic, left stochastic, and doubly stochastic matrices, Perron-Frobenius theorem, Markov chains, convergence theorem, PageRank and MCMC

  • Hermitian Matrix

    Definition and basic properties, spectral theorem, positive definiteness, Rayleigh quotient, applications to quantum mechanics, statistics, and signal processing

  • Kronecker Product

    Definition and block matrix representation, basic properties, mixed-product property, relation to the vec operator, eigenvalues, relation to tensor product, applications

  • Characteristic Polynomial

    Definition and basic properties, relation to eigenvalues, Cayley-Hamilton theorem, similarity invariance, relation to minimal polynomial, applications to inverse matrix computation and stability analysis

  • Minimal Polynomial

    Definition, existence and uniqueness, relation to characteristic polynomial, diagonalizability conditions, relation to Jordan normal form, concrete examples of nilpotent matrices, applications

  • Matrix Norm

    Matrix norm axioms, operator norms, Frobenius norm, condition number, equivalence and inequalities, applications

  • Orthogonal Projection

    Projection matrix conditions $P^2 = P$ and $P^\top = P$, projection formula onto subspaces, orthogonal complements, relation to Gram-Schmidt, applications to least squares and QR decomposition

  • Sparse Matrix

    Definition, storage formats (COO, CSR, CSC), sparse matrix operations, fill-in and reordering, iterative solvers (CG, GMRES, preconditioning), applications
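The tridiagonal-matrix column mentions the Thomas algorithm; as an illustrative sketch (our own, assuming a system that needs no pivoting, e.g. diagonally dominant), it solves $Ax = d$ in $O(n)$ with one forward elimination sweep and one back substitution:

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system in O(n) (Thomas algorithm).

    a: sub-diagonal (length n-1), b: diagonal (length n),
    c: super-diagonal (length n-1), d: right-hand side (length n).
    Assumes no pivoting is needed (e.g. diagonal dominance).
    """
    n = len(b)
    cp = np.empty(n - 1)
    dp = np.empty(n)
    # Forward sweep: eliminate the sub-diagonal.
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / m
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    # Back substitution.
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1D Poisson-style system: diagonally dominant, so safe without pivoting.
n = 5
a = -np.ones(n - 1); c = -np.ones(n - 1); b = 4.0 * np.ones(n)
d = np.arange(1.0, n + 1)
x = thomas_solve(a, b, c, d)
A = np.diag(b) + np.diag(a, -1) + np.diag(c, 1)
assert np.allclose(A @ x, d)
```

Because the sweep touches each row once, storage and time are both linear in $n$, compared with $O(n^3)$ for dense Gaussian elimination.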

Prerequisites