Chapter 1: Fundamentals of Vector Spaces

Axiomatic Definition and Concrete Examples

Goals

A "vector" is not just an arrow. Anything that satisfies the axioms is a vector. Understand that polynomials, functions, matrices, and more can all form vector spaces.

1. Why Do We Need Abstraction?

1.1 Arrows Are Not Enough

In introductory courses, we learn "vector = arrow." But consider:

  • What happens when you add polynomials $p(x) = 2x^2 + 3x + 1$ and $q(x) = x^2 - x$?
  • What happens when you multiply the function $f(x) = \sin x$ by 2?
  • What is the sum of two matrices $A$ and $B$?

All of these support "addition" and "scalar multiplication." They share the same structure as arrows!
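As a quick sketch (a NumPy illustration, not part of any formal definition), all three kinds of objects can be added and scaled with the same two operations:

```python
import numpy as np

# Polynomials as coefficient arrays (constant term first):
# p(x) = 2x^2 + 3x + 1 -> [1, 3, 2];  q(x) = x^2 - x -> [0, -1, 1]
p = np.array([1.0, 3.0, 2.0])
q = np.array([0.0, -1.0, 1.0])
p_plus_q = p + q            # coefficients of 3x^2 + 2x + 1

# Functions: "2f" means the function x -> 2 * f(x) (pointwise scaling)
def two_sin(x):
    return 2 * np.sin(x)

# Matrices: entrywise addition
A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
A_plus_B = A + B
```

In each case "addition" and "scalar multiplication" mean something concrete and different, yet the algebraic rules they obey are identical.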

Figure 1: Different mathematical objects (arrows, polynomials such as $2x^2 + 3x + 1$, functions, matrices) sharing the common structure of a "vector space"

1.2 Extracting the Common Structure

The concept of a vector space extracts the "rules of addition and scalar multiplication" common to these different objects.

Once we verify that the axioms are satisfied, every theorem proved from the axioms alone applies immediately, no matter what the "vectors" actually are!

2. Axioms of a Vector Space

2.1 Definition

Definition: A set $V$ is a vector space over a field $\mathbb{F}$ (usually $\mathbb{R}$ or $\mathbb{C}$) if it is equipped with "addition" and "scalar multiplication" satisfying the following axioms.

2.2 Addition Axioms (4)

For all $\boldsymbol{u}, \boldsymbol{v}, \boldsymbol{w} \in V$:

  1. Associativity: $(\boldsymbol{u} + \boldsymbol{v}) + \boldsymbol{w} = \boldsymbol{u} + (\boldsymbol{v} + \boldsymbol{w})$
  2. Commutativity: $\boldsymbol{u} + \boldsymbol{v} = \boldsymbol{v} + \boldsymbol{u}$
  3. Zero vector: There exists $\boldsymbol{0} \in V$ such that $\boldsymbol{v} + \boldsymbol{0} = \boldsymbol{v}$
  4. Additive inverse: For each $\boldsymbol{v}$, there exists $-\boldsymbol{v} \in V$ such that $\boldsymbol{v} + (-\boldsymbol{v}) = \boldsymbol{0}$

2.3 Scalar Multiplication Axioms (4)

For all $\boldsymbol{u}, \boldsymbol{v} \in V$ and $a, b \in \mathbb{F}$:

  1. Distributivity (vector): $a(\boldsymbol{u} + \boldsymbol{v}) = a\boldsymbol{u} + a\boldsymbol{v}$
  2. Distributivity (scalar): $(a + b)\boldsymbol{v} = a\boldsymbol{v} + b\boldsymbol{v}$
  3. Associativity: $(ab)\boldsymbol{v} = a(b\boldsymbol{v})$
  4. Identity element: $1 \cdot \boldsymbol{v} = \boldsymbol{v}$

2.4 Meaning of the Axioms

These eight axioms abstract the "obvious computation rules in $\mathbb{R}^n$." Any set satisfying them can be analyzed using the same methods as $\mathbb{R}^n$.
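The eight axioms can be spot-checked numerically for $\mathbb{R}^3$. A minimal NumPy sketch (random sample vectors, so a sanity check rather than a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))   # three random vectors in R^3
a, b = 2.0, -1.5
zero = np.zeros(3)

# Addition axioms
assert np.allclose((u + v) + w, u + (v + w))    # associativity
assert np.allclose(u + v, v + u)                # commutativity
assert np.allclose(v + zero, v)                 # zero vector
assert np.allclose(v + (-v), zero)              # additive inverse

# Scalar multiplication axioms
assert np.allclose(a * (u + v), a * u + a * v)  # distributivity (vector)
assert np.allclose((a + b) * v, a * v + b * v)  # distributivity (scalar)
assert np.allclose((a * b) * v, a * (b * v))    # associativity
assert np.allclose(1 * v, v)                    # identity element
```

The same eight checks pass verbatim if `u`, `v`, `w` are replaced by polynomial coefficient arrays or matrices, which is exactly the point of the abstraction.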

3. Concrete Examples of Vector Spaces

3.1 $\mathbb{R}^n$ (Coordinate Space)

The most fundamental example. Ordered $n$-tuples of real numbers:

$$\mathbb{R}^n = \left\{ \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} \,\middle|\, x_i \in \mathbb{R} \right\}$$

Component-wise addition and scalar multiplication satisfy all eight axioms.

3.2 Polynomial Space $P_n$

All polynomials of degree at most $n$:

$$P_n = \{a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n \mid a_i \in \mathbb{R}\}$$

Polynomial addition and scalar multiplication satisfy the axioms.

Example: For $p(x) = 2x^2 + 3x + 1$ and $q(x) = x^2 - x$:

$p + q = 3x^2 + 2x + 1$, $\quad 2p = 4x^2 + 6x + 2$

Zero vector: $0$ (the zero polynomial)

Dimension: $\dim(P_n) = n + 1$ (basis: $1, x, x^2, \ldots, x^n$)
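Because a polynomial in $P_n$ is determined by its $n+1$ coefficients, the vector-space operations are component-wise on coefficient lists. A sketch (the helper `poly_eval` is an illustrative name, not standard library code):

```python
import numpy as np

def poly_eval(coeffs, x):
    """Evaluate a0 + a1*x + ... + an*x^n at x (Horner's method)."""
    result = 0.0
    for c in reversed(coeffs):
        result = result * x + c
    return result

p = [1.0, 3.0, 2.0]   # 2x^2 + 3x + 1
q = [0.0, -1.0, 1.0]  # x^2 - x

# Vector-space operations act component-wise on coefficients:
p_plus_q = [pc + qc for pc, qc in zip(p, q)]   # 3x^2 + 2x + 1
two_p    = [2 * pc for pc in p]                # 4x^2 + 6x + 2

# Consistency: operating on coefficients agrees with operating on values
x = 1.7
assert np.isclose(poly_eval(p_plus_q, x), poly_eval(p, x) + poly_eval(q, x))
```

This coefficient representation is precisely the isomorphism $P_n \cong \mathbb{R}^{n+1}$ behind $\dim(P_n) = n + 1$.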

3.3 Matrix Space $M_{m \times n}$

All $m \times n$ matrices:

$$M_{m \times n} = \{A \mid A \text{ is an } m \times n \text{ matrix}\}$$

Matrix addition and scalar multiplication satisfy the axioms.

Dimension: $\dim(M_{m \times n}) = mn$

3.4 Function Space

All continuous functions on $[0, 1]$:

$$C[0, 1] = \{f: [0, 1] \to \mathbb{R} \mid f \text{ is continuous}\}$$

Pointwise addition and scalar multiplication satisfy the axioms.

Zero vector: $f(x) = 0$ (the zero function)

Dimension: Infinite-dimensional! (cannot be spanned by finitely many functions)
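The infinite-dimensionality can be made concrete: the monomials $1, x, x^2, \ldots, x^k$ are linearly independent in $C[0,1]$ for every $k$, so no finite set can span the space. A numerical spot-check (sampling the monomials at $k+1$ distinct points gives a Vandermonde matrix of full rank):

```python
import numpy as np

# For each k, the monomials 1, x, ..., x^k sampled at k+1 distinct points
# form an invertible Vandermonde matrix, so they are linearly independent.
for k in (3, 5, 10):
    xs = np.linspace(0.0, 1.0, k + 1)
    V = np.vander(xs, k + 1)          # columns: x^k, ..., x, 1
    assert np.linalg.matrix_rank(V) == k + 1
```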

3.5 Solution Space

The set of all solutions to a homogeneous linear system $A\boldsymbol{x} = \boldsymbol{0}$:

$$\ker(A) = \{\boldsymbol{x} \in \mathbb{R}^n \mid A\boldsymbol{x} = \boldsymbol{0}\}$$

This is a subspace of $\mathbb{R}^n$.
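The subspace property follows from linearity: if $A\boldsymbol{x}_1 = \boldsymbol{0}$ and $A\boldsymbol{x}_2 = \boldsymbol{0}$, then $A(\boldsymbol{x}_1 + \boldsymbol{x}_2) = \boldsymbol{0}$ and $A(c\boldsymbol{x}_1) = \boldsymbol{0}$. A sketch with a small example matrix (solutions found by inspection):

```python
import numpy as np

# A has a nontrivial kernel because its columns are linearly dependent.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# Two solutions of Ax = 0, found by inspection for this particular A:
x1 = np.array([2.0, -1.0, 0.0])   # 1*2 + 2*(-1) + 3*0 = 0
x2 = np.array([3.0, 0.0, -1.0])   # 1*3 + 2*0 + 3*(-1) = 0
assert np.allclose(A @ x1, 0) and np.allclose(A @ x2, 0)

# Closure: sums and scalar multiples of solutions are again solutions.
assert np.allclose(A @ (x1 + x2), 0)
assert np.allclose(A @ (5.0 * x1), 0)
```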

4. Subspaces

4.1 Definition

Definition: A subset $W$ of a vector space $V$ is a subspace if:

  1. $\boldsymbol{0} \in W$ (contains the zero vector)
  2. $\boldsymbol{u}, \boldsymbol{v} \in W \Rightarrow \boldsymbol{u} + \boldsymbol{v} \in W$ (closed under addition)
  3. $\boldsymbol{v} \in W, a \in \mathbb{F} \Rightarrow a\boldsymbol{v} \in W$ (closed under scalar multiplication)
Figure 2: Subspaces and non-subspaces of R³: a plane through the origin is a subspace; a plane not through the origin does not contain $\boldsymbol{0}$ and is not

4.2 Examples of Subspaces

  • A plane through the origin in $\mathbb{R}^3$
  • A line through the origin in $\mathbb{R}^3$
  • All diagonal matrices (subspace of $M_{n \times n}$)
  • All even functions (subspace of $C[-1, 1]$)

4.3 Non-Examples

  • A plane not through the origin (does not contain the zero vector)
  • Points on the unit circle (not closed under addition)
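The three subspace conditions can be spot-checked numerically. A sketch for the diagonal matrices (an example above) and the unit circle (a non-example); the helper `is_diagonal` is an illustrative name:

```python
import numpy as np

def is_diagonal(M):
    """True if all off-diagonal entries of M are zero."""
    return np.allclose(M, np.diag(np.diag(M)))

D1 = np.diag([1.0, 2.0])
D2 = np.diag([-3.0, 5.0])

# The three subspace conditions, spot-checked for diagonal 2x2 matrices:
assert is_diagonal(np.zeros((2, 2)))    # contains the zero matrix
assert is_diagonal(D1 + D2)             # closed under addition
assert is_diagonal(4.0 * D1)            # closed under scalar multiplication

# Counterexample: the unit circle in R^2 is not closed under addition.
u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])                # both on the unit circle
assert not np.isclose(np.linalg.norm(u + v), 1.0)   # |u+v| = sqrt(2) != 1
```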

5. Basis and Dimension

5.1 Linear Independence

Vectors $\boldsymbol{v}_1, \ldots, \boldsymbol{v}_k$ are linearly independent if:

$$c_1\boldsymbol{v}_1 + c_2\boldsymbol{v}_2 + \cdots + c_k\boldsymbol{v}_k = \boldsymbol{0} \quad \Rightarrow \quad c_1 = c_2 = \cdots = c_k = 0$$
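In $\mathbb{R}^n$ this condition is equivalent to the matrix with $\boldsymbol{v}_1, \ldots, \boldsymbol{v}_k$ as columns having rank $k$, which gives a mechanical test. A sketch (the helper name is illustrative):

```python
import numpy as np

def linearly_independent(vectors):
    """Independent iff the matrix with the vectors as columns has rank k."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

assert linearly_independent([np.array([1.0, 0.0]), np.array([0.0, 1.0])])
assert not linearly_independent([np.array([1.0, 0.0]), np.array([2.0, 0.0])])
```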

5.2 Basis

Definition: A basis of a vector space $V$ is a set of vectors $\{\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n\}$ satisfying both:

  1. Linearly independent — none of the $\boldsymbol{e}_i$ can be built from the others (no waste)
  2. Spans $V$ — every vector in $V$ can be written as a linear combination of $\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n$ (reaches everywhere)

Intuitively, "a minimal toolkit that covers the whole space". Too few and it fails to cover; too many and there is redundancy (linear dependence).

Examples: Basis or not?

| Set of vectors | Lin. indep.? | Spans $\mathbb{R}^2$? | Basis? |
| --- | --- | --- | --- |
| $\{(1,0), (0,1)\}$ (standard basis) | Yes | Yes | Yes |
| $\{(1,0), (1,1)\}$ | Yes | Yes | Yes (non-standard bases are also bases) |
| $\{(1,0)\}$ | Yes | No (cannot reach $y$) | No (too few) |
| $\{(1,0), (2,0)\}$ | No (on the same line) | No | No |
| $\{(1,0), (0,1), (1,1)\}$ | No (third is redundant) | Yes | No (too many) |
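For a finite family in $\mathbb{R}^2$, both conditions reduce to a rank computation: a set is a basis exactly when it has 2 vectors and the matrix with those vectors as columns has rank 2. A sketch verifying each row above (the helper name `is_basis_of_R2` is illustrative):

```python
import numpy as np

def is_basis_of_R2(vectors):
    """Basis of R^2 iff exactly 2 vectors whose column matrix has rank 2
    (with exactly 2 vectors, independence and spanning coincide)."""
    M = np.column_stack([np.asarray(v, float) for v in vectors])
    return len(vectors) == 2 and np.linalg.matrix_rank(M) == 2

assert is_basis_of_R2([(1, 0), (0, 1)])              # standard basis
assert is_basis_of_R2([(1, 0), (1, 1)])              # non-standard basis
assert not is_basis_of_R2([(1, 0)])                  # too few
assert not is_basis_of_R2([(1, 0), (2, 0)])          # dependent
assert not is_basis_of_R2([(1, 0), (0, 1), (1, 1)])  # too many
```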

Bases in Various Vector Spaces

  • $\mathbb{R}^n$: standard basis $\{\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n\}$ ($\boldsymbol{e}_i$ has a 1 in the $i$-th slot, 0 elsewhere) — $n$ vectors
  • $P_n$ (polynomials of degree $\le n$): $\{1, x, x^2, \ldots, x^n\}$ — $n+1$ vectors
  • $M_{m \times n}$ ($m \times n$ matrices): $\{E_{ij}\}$ (1 at position $(i, j)$, 0 elsewhere) — $mn$ vectors
  • $C[0, 1]$ (continuous functions): has no finite basis (infinite-dimensional)

Note: a basis is not unique. The same space admits many different bases. For example, on $\mathbb{R}^2$, each of $\{(1,0), (0,1)\}$, $\{(1,1), (1,-1)\}$, $\{(3,1), (5,2)\}$ is a basis. Which one to use depends on the problem.
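The same vector has different coordinates in different bases: writing the basis vectors as the columns of a matrix $B$, the coordinates $\boldsymbol{c}$ of $\boldsymbol{v}$ solve $B\boldsymbol{c} = \boldsymbol{v}$. A sketch using two of the bases above:

```python
import numpy as np

v = np.array([4.0, 1.0])

# Basis vectors as matrix columns; coordinates c solve B @ c = v.
B_std = np.column_stack([[1.0, 0.0], [0.0, 1.0]])   # standard basis
B_alt = np.column_stack([[1.0, 1.0], [1.0, -1.0]])  # basis {(1,1), (1,-1)}

c_std = np.linalg.solve(B_std, v)
c_alt = np.linalg.solve(B_alt, v)

# Both coordinate vectors describe the same v:
assert np.allclose(B_std @ c_std, v)
assert np.allclose(B_alt @ c_alt, v)
print(c_std)   # [4. 1.]
print(c_alt)   # [2.5 1.5]  since 2.5*(1,1) + 1.5*(1,-1) = (4,1)
```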

5.3 Dimension

The number of elements in a basis is called the dimension $\dim(V)$ of $V$.

Key Theorem

The number of elements in a basis of a finite-dimensional vector space is the same regardless of the choice of basis.

Figure 3: Linear combinations using a basis (e.g. $\boldsymbol{v} = 3\boldsymbol{e}_1 + 2\boldsymbol{e}_2$ for the standard basis of R²) and dimension as the number of basis vectors (R¹: 1, R²: 2, R³: 3)

5.4 Concrete Examples

| Vector Space | Standard Basis | Dimension |
| --- | --- | --- |
| $\mathbb{R}^n$ | $\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n$ | $n$ |
| $P_n$ | $1, x, x^2, \ldots, x^n$ | $n + 1$ |
| $M_{m \times n}$ | $E_{ij}$ (1 in position $(i,j)$, 0 elsewhere) | $mn$ |
| $C[0,1]$ | None (infinite-dimensional) | $\infty$ |

6. Why Abstraction Is Useful

6.1 Reusing Theorems

A theorem proved once applies to every vector space:

  • Every basis of an $n$-dimensional space has exactly $n$ vectors
  • Any $n+1$ vectors in an $n$-dimensional space must be linearly dependent
  • The dimension of a subspace is at most that of the whole space
Figure 4: A single theorem ("if $\dim V = n$, then any $n+1$ vectors are linearly dependent") applies to $\mathbb{R}^n$, the polynomial space $P_n$, and ODE solution spaces alike

6.2 Unifying Different Fields

  • Differential equations: The solution space is a vector space
  • Fourier analysis: Functions treated as "vectors"
  • Quantum mechanics: State vectors (wave functions) live in infinite-dimensional vector spaces

6.3 Example: Differential Equations

The solution set of $y'' + y = 0$ is a 2-dimensional vector space with basis $\sin x$ and $\cos x$.

The general solution $y = c_1 \sin x + c_2 \cos x$ is precisely a linear combination of these basis vectors, in standard vector-space form.
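Both claims can be spot-checked numerically: a finite-difference approximation confirms that $\sin$ and $\cos$ satisfy $y'' + y = 0$, and their Wronskian $W(x) = \sin x \cdot (\cos x)' - \cos x \cdot (\sin x)' = -\sin^2 x - \cos^2 x = -1 \neq 0$ confirms they are linearly independent. A sketch:

```python
import numpy as np

def second_derivative(f, x, h=1e-4):
    """Central finite-difference approximation of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

xs = np.linspace(0.0, 2 * np.pi, 9)

# sin and cos each satisfy y'' + y = 0 (up to finite-difference error)
for f in (np.sin, np.cos):
    assert np.allclose(second_derivative(f, xs) + f(xs), 0.0, atol=1e-6)

# Wronskian of sin and cos is identically -1, hence nonzero:
# the two solutions are linearly independent and form a basis.
W = np.sin(xs) * (-np.sin(xs)) - np.cos(xs) * np.cos(xs)
assert np.allclose(W, -1.0)
```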

7. Summary

Key Takeaways

  • Vector space: A set satisfying eight axioms
  • Examples: $\mathbb{R}^n$, polynomials, matrices, functions
  • Subspace: A subset closed under addition and scalar multiplication
  • Dimension: The number of basis vectors (invariant under change of basis)
  • Power of abstraction: Treat different objects with a unified theory
