Optimization Basics
Undergraduate Level
Learning Goals
- Understand formulation and solution methods for linear programming problems
- Master the simplex method algorithm
- Learn the fundamentals of duality theory and the relationship between primal and dual problems
- Study the basic properties of convex optimization problems
- Derive the KKT conditions and understand their meaning
- Learn formulation and solution methods for quadratic programming
Prerequisites
- Introductory optimization (problem formulation, optimality conditions via derivatives)
- Linear algebra basics (matrix operations, systems of linear equations, bases)
- Calculus basics (partial derivatives, gradients, Hessian matrices)
Chapters
Ch. 1: Linear Programming
Formulation, standard and canonical forms, feasible regions and vertices, graphical solution.
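As a first taste of Ch. 1, a small LP can be solved numerically. This sketch assumes SciPy is available and uses a classic two-variable example (the problem data is illustrative, not from the text):

```python
from scipy.optimize import linprog

# maximize 3x + 5y  subject to  x <= 4,  2y <= 12,  3x + 2y <= 18,  x, y >= 0
# linprog minimizes, so we negate the objective; bounds default to x >= 0.
c = [-3, -5]
A_ub = [[1, 0], [0, 2], [3, 2]]
b_ub = [4, 12, 18]
res = linprog(c, A_ub=A_ub, b_ub=b_ub)

print(res.x)     # optimal vertex (2, 6)
print(-res.fun)  # optimal objective value 36
```

The optimum lies at a vertex of the feasible region, which is exactly the geometric fact the graphical method exploits.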
Ch. 2: Simplex Method
The simplex idea, basic and basic feasible solutions, pivot operations.
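The notion of a basic feasible solution can be made concrete by brute force: for a standard-form LP, pick every set of basis columns, solve for the basic variables, and keep the feasible ones. A minimal sketch with numpy, using an illustrative LP already converted to standard form with slack variables:

```python
import itertools
import numpy as np

# Standard form of: max 3x + 5y s.t. x <= 4, 2y <= 12, 3x + 2y <= 18
# (slack variables s1, s2, s3 appended; minimize the negated objective)
A = np.array([[1., 0., 1., 0., 0.],
              [0., 2., 0., 1., 0.],
              [3., 2., 0., 0., 1.]])
b = np.array([4., 12., 18.])
c = np.array([-3., -5., 0., 0., 0.])

best = None
for cols in itertools.combinations(range(5), 3):
    B = A[:, cols]
    if abs(np.linalg.det(B)) < 1e-12:
        continue                      # columns do not form a basis
    xB = np.linalg.solve(B, b)
    if (xB < -1e-9).any():
        continue                      # basic solution is not feasible
    x = np.zeros(5)
    x[list(cols)] = xB
    val = c @ x
    if best is None or val < best[0]:
        best = (val, x)

print(best[0], best[1][:2])  # -36.0, optimal vertex (2, 6)
```

The simplex method visits these same basic feasible solutions, but moves between adjacent ones via pivot operations instead of enumerating all of them.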
Ch. 3: Duality Theory
Primal and dual problems, weak and strong duality theorems, complementary slackness.
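Strong duality can be checked numerically: solve a primal LP and its dual and compare optimal values. A sketch assuming SciPy, with illustrative problem data:

```python
from scipy.optimize import linprog

# Primal: min 2x1 + 3x2  s.t.  x1 + x2 >= 4,  x1 + 2x2 >= 6,  x >= 0
# (">=" constraints are negated to fit linprog's "A_ub x <= b_ub" form)
p = linprog([2, 3], A_ub=[[-1, -1], [-1, -2]], b_ub=[-4, -6])

# Dual:   max 4y1 + 6y2  s.t.  y1 + y2 <= 2,  y1 + 2y2 <= 3,  y >= 0
d = linprog([-4, -6], A_ub=[[1, 1], [1, 2]], b_ub=[2, 3])

print(p.fun, -d.fun)  # both equal 10: strong duality holds
```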
Ch. 4: Convex Optimization Theory
Definition, local vs. global optimality, typical convex problems.
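For quadratic functions, convexity reduces to a linear-algebra test: f(x) = ½xᵀQx + qᵀx is convex exactly when Q is positive semidefinite. A minimal numpy check (the matrices are illustrative):

```python
import numpy as np

def is_convex_quadratic(Q, tol=1e-10):
    """Convexity test via the smallest eigenvalue of the symmetric part of Q."""
    Qs = 0.5 * (Q + Q.T)
    return bool(np.linalg.eigvalsh(Qs).min() >= -tol)

print(is_convex_quadratic(np.array([[2., 0.], [0., 1.]])))   # True
print(is_convex_quadratic(np.array([[1., 0.], [0., -1.]])))  # False (saddle)
```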
Ch. 5: KKT Conditions
Inequality-constrained optimization, derivation of KKT conditions, constraint qualifications and geometric meaning.
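The KKT conditions can be verified by hand at a known optimum. For the illustrative problem min x₁² + x₂² subject to 1 − x₁ − x₂ ≤ 0, the optimum is x* = (0.5, 0.5) with multiplier λ = 1, and all four conditions check out numerically:

```python
import numpy as np

x = np.array([0.5, 0.5])       # candidate optimum
lam = 1.0                      # candidate multiplier

grad_f = 2 * x                 # gradient of objective x1^2 + x2^2
grad_g = np.array([-1., -1.])  # gradient of constraint g(x) = 1 - x1 - x2
g = 1 - x.sum()

stationarity = grad_f + lam * grad_g  # should be the zero vector
print(stationarity)            # stationarity:            [0. 0.]
print(g)                       # primal feasibility:      g(x) <= 0
print(lam * g)                 # complementary slackness: 0.0
print(lam >= 0)                # dual feasibility:        True
```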
Ch. 6: Quadratic Programming
Formulation, convex and non-convex QP, equality- and inequality-constrained QP.
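An equality-constrained convex QP can be solved in closed form from its KKT conditions, which reduce to one linear system. A sketch with illustrative toy data:

```python
import numpy as np

# min 0.5 x^T Q x + c^T x  s.t.  A x = b
Q = np.eye(2)
c = np.zeros(2)
A = np.array([[1., 1.]])
b = np.array([1.])

n, m = Q.shape[0], A.shape[0]
# KKT system: [[Q, A^T], [A, 0]] [x; lam] = [-c; b]
K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
rhs = np.concatenate([-c, b])
sol = np.linalg.solve(K, rhs)
x, lam = sol[:n], sol[n:]

print(x)  # [0.5 0.5], the closest point to the origin on x1 + x2 = 1
```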
Ch. 7: Slack Variables and Constraint Relaxation
Introduction of slack variables, soft-margin optimization, allowing and controlling constraint violations.
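The basic slack-variable transformation in Ch. 7 turns inequalities Ax ≤ b into equalities Ax + s = b with s ≥ 0, so s = b − Ax measures the remaining room in each constraint. A minimal numerical illustration (data is hypothetical):

```python
import numpy as np

A = np.array([[1., 0.], [0., 2.], [3., 2.]])
b = np.array([4., 12., 18.])
x = np.array([2., 6.])  # some feasible point

s = b - A @ x           # slack in each constraint
print(s)                # [2. 0. 0.]: zero slack means the constraint is active
print((s >= 0).all())   # True: x is feasible
```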
Ch. 8: Kernel Methods and the Kernel Trick
Kernel functions, feature maps, and the kernel trick for efficiently solving nonlinear problems.
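The kernel trick can be demonstrated directly: the degree-2 polynomial kernel k(x, z) = (x·z + 1)² on R² reproduces inner products of an explicit 6-dimensional feature map, without ever forming that map. A self-contained check with illustrative vectors:

```python
import numpy as np

def kernel(x, z):
    """Degree-2 polynomial kernel on R^2."""
    return (x @ z + 1.0) ** 2

def phi(x):
    """Explicit feature map with phi(x) . phi(z) == kernel(x, z)."""
    r2 = np.sqrt(2.0)
    return np.array([x[0]**2, x[1]**2, r2 * x[0] * x[1],
                     r2 * x[0], r2 * x[1], 1.0])

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])
print(kernel(x, z), phi(x) @ phi(z))  # identical values (4.0)
```

Kernel methods exploit this identity: algorithms that only need inner products can work in the high-dimensional feature space at the cost of evaluating k.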
Ch. 9: Exercises
Practice problems to verify understanding at the basic level.
Ch. 10: No Free Lunch Theorem
Averaged over all possible problems, no optimization algorithm outperforms any other — the meaning and practical implications of the NFL theorem.
Ch. 11: Benchmark Functions
Rosenbrock, Ackley, Rastrigin and other standard test functions for evaluating and comparing optimization algorithms.
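Two of these test functions can be written in a few lines each; both have known global minimum value 0, which makes them convenient yardsticks. A minimal numpy sketch:

```python
import numpy as np

def rosenbrock(x):
    """Narrow curved valley; global minimum 0 at x = (1, ..., 1)."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2))

def rastrigin(x):
    """Highly multimodal; global minimum 0 at x = (0, ..., 0)."""
    x = np.asarray(x, dtype=float)
    return float(10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x)))

print(rosenbrock([1.0, 1.0, 1.0]))  # 0.0 at the global minimum
print(rastrigin([0.0, 0.0]))        # 0.0 at the global minimum
```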
Overview
This section systematically covers the fundamental theory of optimization. Optimization is the problem of finding values of variables that minimize (or maximize) an objective function subject to given constraints.
The main topics covered are:
- Linear programming: optimization with linear objective and constraints
- Simplex method: an efficient algorithm for solving linear programs
- Duality theory: the relationship between primal and dual problems, characterization of optimality
- Convex optimization: the problem class where local optima are global optima
- KKT conditions: necessary conditions for optimality in constrained problems
- Quadratic programming: quadratic objective with linear constraints
These concepts form an essential foundation for applications in machine learning, operations research, control engineering, and many other fields.