Chapter 19: Determinant — Axiomatic Definition and Uniqueness
Goal of This Page
Define the determinant axiomatically. Rather than starting with an explicit formula, we begin with the properties it must satisfy and prove that such a function exists uniquely. This approach follows the linear algebra notes by Bruce Ikenaga of Millersville University.
Significance of the axiomatic approach: Mathematicians often define a concept not by writing down a formula but by identifying the properties that characterize it. The axiomatic definition of the determinant is a paradigmatic example of this methodology.
1. The Three Axioms of the Determinant
Let $M(n, R)$ denote the set of all $n \times n$ matrices with entries in $R$, where $R$ is a field (such as $\mathbb{R}$ or $\mathbb{C}$) or, more generally, a commutative ring.
Definition: A function $D: M(n, R) \to R$ is called a determinant function if it satisfies the following three axioms. (We regard $D$ as a function of the rows $\boldsymbol{r}_1, \ldots, \boldsymbol{r}_n$ of its argument.)

Axiom 1 (linearity in each row): With all other rows held fixed, $D$ is linear in each row:
\begin{equation} D(\ldots, c\,\boldsymbol{r}_i, \ldots) = c \cdot D(\ldots, \boldsymbol{r}_i, \ldots) \label{eq:ax-linear-scalar} \end{equation}
\begin{equation} D(\ldots, \boldsymbol{r}_i + \boldsymbol{r}'_i, \ldots) = D(\ldots, \boldsymbol{r}_i, \ldots) + D(\ldots, \boldsymbol{r}'_i, \ldots) \label{eq:ax-linear-add} \end{equation}

Axiom 2 (alternating): If two rows of $A$ are equal, then
\begin{equation} D(A) = 0 \label{eq:ax-alternating} \end{equation}

Axiom 3 (normalization): $D(I) = 1$, where $I$ is the identity matrix.
The question: Does a function $D$ satisfying these axioms exist? If so, is it unique?
2. Consequences of the Axioms
Several important properties follow from the three axioms.
2.1 Matrices with a zero row

Lemma 1: If some row of $A$ is the zero vector, then $D(A) = 0$.

Proof: A zero row satisfies $\boldsymbol{0} = 0 \cdot \boldsymbol{0}$, so $\eqref{eq:ax-linear-scalar}$ with $c = 0$ gives $D(A) = 0 \cdot D(A) = 0$.
2.2 Row interchange

Lemma 2: Interchanging two rows negates the determinant:
\begin{equation} D(\ldots, \boldsymbol{r}_j, \ldots, \boldsymbol{r}_i, \ldots) = -D(\ldots, \boldsymbol{r}_i, \ldots, \boldsymbol{r}_j, \ldots) \label{eq:ax-swap} \end{equation}

Proof: Put $\boldsymbol{r}_i + \boldsymbol{r}_j$ in both positions $i$ and $j$; the result is $0$ by $\eqref{eq:ax-alternating}$. Expanding by $\eqref{eq:ax-linear-add}$ gives four terms, two of which vanish by $\eqref{eq:ax-alternating}$, leaving $D(\ldots, \boldsymbol{r}_i, \ldots, \boldsymbol{r}_j, \ldots) + D(\ldots, \boldsymbol{r}_j, \ldots, \boldsymbol{r}_i, \ldots) = 0$.
2.3 Row operations and the determinant
We summarize how elementary row operations affect the determinant:
| Row operation | Effect on determinant | Justification |
|---|---|---|
| Multiply row $i$ by $c$ | $D \to c \cdot D$ | Axiom 1 $\eqref{eq:ax-linear-scalar}$ |
| Swap rows $i$ and $j$ | $D \to -D$ | Lemma 2 $\eqref{eq:ax-swap}$ |
| Add $c$ times row $j$ to row $i$ | $D$ unchanged | Axiom 1 + Axiom 2 |
Proof that row addition leaves $D$ invariant
Add $c$ times row $j$ to row $i$. By $\eqref{eq:ax-linear-add}$:
$$D\begin{pmatrix} \vdots \\ \boldsymbol{r}_i + c\boldsymbol{r}_j \\ \vdots \\ \boldsymbol{r}_j \\ \vdots \end{pmatrix} = D\begin{pmatrix} \vdots \\ \boldsymbol{r}_i \\ \vdots \\ \boldsymbol{r}_j \\ \vdots \end{pmatrix} + D\begin{pmatrix} \vdots \\ c\boldsymbol{r}_j \\ \vdots \\ \boldsymbol{r}_j \\ \vdots \end{pmatrix}$$By $\eqref{eq:ax-linear-scalar}$ and $\eqref{eq:ax-alternating}$, the second term equals $c \cdot 0 = 0$. Hence the determinant is unchanged.
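For $2 \times 2$ matrices these effects can be spot-checked numerically; the sketch below uses the familiar formula $ad - bc$ as a stand-in for $D$ (the helper name `det2` is ours, not from the notes):

```python
# Check the effect of each elementary row operation on a 2x2 determinant,
# using the familiar formula ad - bc as a stand-in for D.
def det2(m):
    (a, b), (c, d) = m
    return a * d - b * c

A = [[1, -1], [3, 2]]   # sample matrix; det2(A) == 5
c = 4

# Multiply row 0 by c: the determinant is multiplied by c (Axiom 1).
scaled = [[c * x for x in A[0]], A[1]]
assert det2(scaled) == c * det2(A)

# Swap the two rows: the determinant changes sign (Lemma 2).
swapped = [A[1], A[0]]
assert det2(swapped) == -det2(A)

# Add c times row 1 to row 0: the determinant is unchanged.
added = [[x + c * y for x, y in zip(A[0], A[1])], A[1]]
assert det2(added) == det2(A)
```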
3. Concrete Example: Computing with the Axioms
3.1 A $2 \times 2$ matrix
Let us compute the determinant of $A = \begin{pmatrix} 1 & -1 \\ 3 & 2 \end{pmatrix}$ using only the axioms.
Step 1: Decompose the second row. We can write $(3, 2) = 3(1, -1) + (0, 5)$.
By Axiom 1 $\eqref{eq:ax-linear-add}$:
$$D\begin{pmatrix} 1 & -1 \\ 3 & 2 \end{pmatrix} = D\begin{pmatrix} 1 & -1 \\ 3 & -3 \end{pmatrix} + D\begin{pmatrix} 1 & -1 \\ 0 & 5 \end{pmatrix}$$

Step 2: Evaluate the first term, whose second row is $3(1, -1) = (3, -3)$. By $\eqref{eq:ax-linear-scalar}$:
$$D\begin{pmatrix} 1 & -1 \\ 3 & -3 \end{pmatrix} = 3 \cdot D\begin{pmatrix} 1 & -1 \\ 1 & -1 \end{pmatrix}$$By Axiom 2 $\eqref{eq:ax-alternating}$, the two rows are equal, so:
$$= 3 \cdot 0 = 0$$

Step 3: Evaluate the second term. Decompose the first row as $(1, -1) = 1 \cdot (1, 0) + (-1) \cdot (0, 1)$:
$$D\begin{pmatrix} 1 & -1 \\ 0 & 5 \end{pmatrix} = D\begin{pmatrix} 1 & 0 \\ 0 & 5 \end{pmatrix} + D\begin{pmatrix} 0 & -1 \\ 0 & 5 \end{pmatrix}$$Evaluate the second term. Expanding the second row in the standard basis, $(0, 5) = 0 \cdot \boldsymbol{e}_1 + 5 \cdot \boldsymbol{e}_2$:
$$D\begin{pmatrix} 0 & -1 \\ 0 & 5 \end{pmatrix} = 5 \cdot D\begin{pmatrix} 0 & -1 \\ 0 & 1 \end{pmatrix}$$Further expanding the first row, $(0, -1) = 0 \cdot \boldsymbol{e}_1 + (-1) \cdot \boldsymbol{e}_2$. The $\boldsymbol{e}_1$ term carries the coefficient $0$ and vanishes by $\eqref{eq:ax-linear-scalar}$. The $\boldsymbol{e}_2$ term gives:
$$5 \cdot (-1) \cdot D\begin{pmatrix} 0 & 1 \\ 0 & 1 \end{pmatrix} = -5 \cdot 0 = 0$$The two rows are equal, so the determinant is 0 by Axiom 2.
Step 4: Evaluate the remaining term via $\eqref{eq:ax-linear-scalar}$:
$$D\begin{pmatrix} 1 & 0 \\ 0 & 5 \end{pmatrix} = 5 \cdot D\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = 5 \cdot D(I) = 5 \cdot 1 = 5$$This agrees with the usual formula $ad - bc = 1 \cdot 2 - (-1) \cdot 3 = 2 + 3 = 5$.
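As a quick cross-check of Steps 1–4, the intermediate terms can also be computed with the $ad - bc$ formula (a minimal sketch; the helper `det2` is ours):

```python
# Cross-check the intermediate values of Steps 1-4 against ad - bc.
def det2(m):
    (a, b), (c, d) = m
    return a * d - b * c

term1 = det2([[1, -1], [3, -3]])  # Step 2: rows proportional, so 0
term2 = det2([[1, -1], [0, 5]])   # Steps 3-4: evaluates to 5
assert term1 == 0
assert term2 == 5
# Step 1 split D(A) into term1 + term2:
assert det2([[1, -1], [3, 2]]) == term1 + term2 == 5
```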
4. The Uniqueness Theorem
This is the central result of the page. We show that any function satisfying Axioms 1 and 2 is constrained to a specific form.
4.1 Proof outline
The proof proceeds in four steps:
- Expand each row in the standard basis $\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n$
- Apply Axiom 1 (linearity) repeatedly to obtain a sum of $n^n$ terms
- Apply Axiom 2 (alternating) to eliminate every term with a repeated basis vector, leaving $n!$ terms
- Express the coefficient of each surviving term using the sign of the permutation
4.2 Detailed proof
Step 1: Basis expansion
Write the $i$-th row $\boldsymbol{r}_i$ of $A$ in the standard basis:
$$\boldsymbol{r}_i = \sum_{j=1}^{n} a_{ij} \boldsymbol{e}_j = a_{i1}\boldsymbol{e}_1 + a_{i2}\boldsymbol{e}_2 + \cdots + a_{in}\boldsymbol{e}_n$$

Step 2: Expansion by linearity
Apply Axiom 1 to the first row:
$$D(A) = D\begin{pmatrix} \sum_{j_1} a_{1j_1}\boldsymbol{e}_{j_1} \\ \boldsymbol{r}_2 \\ \vdots \\ \boldsymbol{r}_n \end{pmatrix} = \sum_{j_1=1}^{n} a_{1j_1} D\begin{pmatrix} \boldsymbol{e}_{j_1} \\ \boldsymbol{r}_2 \\ \vdots \\ \boldsymbol{r}_n \end{pmatrix}$$Then apply it to the second row:
$$= \sum_{j_1=1}^{n} \sum_{j_2=1}^{n} a_{1j_1} a_{2j_2} D\begin{pmatrix} \boldsymbol{e}_{j_1} \\ \boldsymbol{e}_{j_2} \\ \vdots \\ \boldsymbol{r}_n \end{pmatrix}$$Repeating for all rows:
$$D(A) = \sum_{j_1=1}^{n} \sum_{j_2=1}^{n} \cdots \sum_{j_n=1}^{n} a_{1j_1} a_{2j_2} \cdots a_{nj_n} \cdot D\begin{pmatrix} \boldsymbol{e}_{j_1} \\ \boldsymbol{e}_{j_2} \\ \vdots \\ \boldsymbol{e}_{j_n} \end{pmatrix}$$This is a sum of $n^n$ terms.
Step 3: Elimination by the alternating property
When is $D(\boldsymbol{e}_{j_1}, \ldots, \boldsymbol{e}_{j_n})$ nonzero?
By Axiom 2, if $j_k = j_l$ for some $k \neq l$, two rows are identical and $D = 0$.
Therefore only terms where $(j_1, j_2, \ldots, j_n)$ are all distinct survive.
An ordered tuple $(j_1, j_2, \ldots, j_n)$ of $n$ distinct elements of $\{1, 2, \ldots, n\}$ is exactly a permutation $\sigma \in S_n$, with $\sigma(i) = j_i$:
$$D(A) = \sum_{\sigma \in S_n} a_{1\sigma(1)} a_{2\sigma(2)} \cdots a_{n\sigma(n)} \cdot D\begin{pmatrix} \boldsymbol{e}_{\sigma(1)} \\ \boldsymbol{e}_{\sigma(2)} \\ \vdots \\ \boldsymbol{e}_{\sigma(n)} \end{pmatrix}$$The $n^n$ terms have been reduced to $n!$.
Step 4: Coefficient of each term
We need the value of $D(\boldsymbol{e}_{\sigma(1)}, \boldsymbol{e}_{\sigma(2)}, \ldots, \boldsymbol{e}_{\sigma(n)})$.
The matrix $(\boldsymbol{e}_{\sigma(1)}, \ldots, \boldsymbol{e}_{\sigma(n)})^T$ can be transformed back to $I = (\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n)^T$ by row swaps.
The parity of the number of transpositions (swaps of two elements) needed to return $\sigma$ to the identity is independent of the particular sequence of swaps chosen, and it defines the sign of the permutation $\mathrm{sgn}(\sigma)$:
$$\mathrm{sgn}(\sigma) = \begin{cases} +1 & \text{even number (even permutation)} \\ -1 & \text{odd number (odd permutation)} \end{cases}$$By Lemma 2 $\eqref{eq:ax-swap}$, each row swap negates the sign, so:
$$D(\boldsymbol{e}_{\sigma(1)}, \ldots, \boldsymbol{e}_{\sigma(n)}) = \mathrm{sgn}(\sigma) \cdot D(\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n) = \mathrm{sgn}(\sigma) \cdot D(I)$$(The swaps here act on the rows of $(\boldsymbol{e}_{\sigma(1)}, \ldots, \boldsymbol{e}_{\sigma(n)})^\top$, rearranging them into the identity matrix $I = (\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n)^\top$.)
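The parity in this definition can be computed by sorting: restore $(\sigma(1), \ldots, \sigma(n))$ to the identity with adjacent swaps and count them. A minimal sketch (bubble sort here, but any sequence of transpositions yields the same parity):

```python
def sgn_by_swaps(p):
    # Count the adjacent transpositions a bubble sort needs to restore the
    # identity; the parity of that count gives sgn(p).
    p = list(p)
    swaps = 0
    for _ in range(len(p)):
        for j in range(len(p) - 1):
            if p[j] > p[j + 1]:
                p[j], p[j + 1] = p[j + 1], p[j]
                swaps += 1
    return -1 if swaps % 2 else 1

assert sgn_by_swaps((0, 1, 2)) == 1    # identity: zero swaps, even
assert sgn_by_swaps((1, 0, 2)) == -1   # one transposition: odd
assert sgn_by_swaps((1, 2, 0)) == 1    # a 3-cycle = two transpositions: even
```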
4.3 Conclusion
Combining the steps above:
$$D(A) = \sum_{\sigma \in S_n} a_{1\sigma(1)} a_{2\sigma(2)} \cdots a_{n\sigma(n)} \cdot \mathrm{sgn}(\sigma) \cdot D(I)$$ $$= \left( \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} a_{i,\sigma(i)} \right) \cdot D(I)$$

Key consequence
Any function $D$ satisfying Axioms 1 and 2 must have the form:
$$D(A) = (\det A) \cdot D(I)$$where $\det A = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} a_{i,\sigma(i)}$ is the Leibniz formula.
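The Leibniz formula can be transcribed directly into code, with $\mathrm{sgn}(\sigma)$ computed from the inversion count. A sketch for illustration only (the function names are ours): the sum has $n!$ terms, so this is practical only for small $n$.

```python
# Direct transcription of det(A) = sum over sigma of sgn(sigma) * prod_i a_{i,sigma(i)}.
from itertools import permutations
from math import prod

def sgn(p):
    # sgn(p) = (-1)^(number of inversions of p).
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def leibniz_det(A):
    n = len(A)
    return sum(sgn(p) * prod(A[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

# The 2x2 example from Section 3:
assert leibniz_det([[1, -1], [3, 2]]) == 5
# A triangular 3x3: only the identity permutation survives, det = 2*3*4.
assert leibniz_det([[2, 7, 1], [0, 3, 5], [0, 0, 4]]) == 24
```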
5. Completing Existence and Uniqueness
5.1 Uniqueness
Combining Axiom 3 ($D(I) = 1$) with the conclusion $D(A) = (\det A) \cdot D(I)$ of the uniqueness theorem:
$$D(A) = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} a_{i,\sigma(i)}$$This shows that if a function satisfying all three axioms exists, it must be unique.
Leibniz Formula (Uniqueness)
If a function satisfying all three axioms exists, it must take the following form:
\begin{equation} \det(A) = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} a_{i,\sigma(i)} \label{eq:ax-leibniz} \end{equation}

5.2 Existence
Conversely, if we define $\det$ by $\eqref{eq:ax-leibniz}$, we can verify that it satisfies all three axioms.
Verification of Axiom 1 (Linearity)
Each term $\prod_{i=1}^{n} a_{i,\sigma(i)}$ contains the entry $a_{k,\sigma(k)}$ from row $k$ exactly once, to the first power. Therefore:
- Replacing row $k$ with $\boldsymbol{r}_k + \boldsymbol{r}'_k$ splits the sum
- Replacing row $k$ with $c \boldsymbol{r}_k$ factors out $c$
Verification of Axiom 2 (Alternating)
Suppose rows $i$ and $j$ are equal ($\boldsymbol{r}_i = \boldsymbol{r}_j$), so $a_{ik} = a_{jk}$ for all $k$.
Consider the permutation $\tau = \sigma \circ (i\ j)$ obtained by precomposing $\sigma$ with the transposition $(i\ j)$; then $\mathrm{sgn}(\tau) = -\mathrm{sgn}(\sigma)$.
The term for $\tau$ differs from the term for $\sigma$ only in the factors $a_{i,\tau(i)} = a_{i,\sigma(j)}$ and $a_{j,\tau(j)} = a_{j,\sigma(i)}$; since rows $i$ and $j$ are equal, these match $a_{j,\sigma(j)}$ and $a_{i,\sigma(i)}$, so the products of entries for $\sigma$ and $\tau$ are equal while their signs are opposite, and the two terms cancel. The map $\sigma \mapsto \tau$ pairs up all of $S_n$, giving $\det(A) = 0$.
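This pairing can be watched in action for a small matrix with two equal rows (a sketch; the matrix and helper names are ours):

```python
# Watch the cancellation: rows 0 and 1 of A are equal, and each permutation
# pairs with the one obtained by swapping its values at positions 0 and 1.
from itertools import permutations
from math import prod

def sgn(p):
    inv = sum(1 for a in range(len(p)) for b in range(a + 1, len(p)) if p[a] > p[b])
    return -1 if inv % 2 else 1

A = [[2, 5, 1], [2, 5, 1], [7, 0, 3]]   # rows 0 and 1 are equal
i, j = 0, 1

def term(p):
    # One summand of the Leibniz formula.
    return sgn(p) * prod(A[r][p[r]] for r in range(3))

for p in permutations(range(3)):
    q = list(p)
    q[i], q[j] = q[j], q[i]               # the paired permutation
    assert term(p) + term(tuple(q)) == 0  # each pair cancels exactly
```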
Verification of Axiom 3 (Normalization)
The entries of the identity matrix $I$ are $\delta_{ij}$ (the Kronecker delta).
$$\det(I) = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} \delta_{i,\sigma(i)}$$The product $\prod_{i=1}^{n} \delta_{i,\sigma(i)}$ equals 1 only when $\sigma = \mathrm{id}$ (the identity permutation).
$$\det(I) = \mathrm{sgn}(\mathrm{id}) \cdot 1 = 1 \cdot 1 = 1$$

5.3 Final conclusion
Characterization of the determinant
- Uniqueness: Any function satisfying the three axioms must coincide with the Leibniz formula
- Existence: The function defined by the Leibniz formula does satisfy all three axioms
- Conclusion: A function satisfying the three axioms exists uniquely, and it is the determinant $\det$
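The three verifications above can also be spot-checked numerically with a direct Leibniz implementation (a self-contained sketch on small integer matrices; all names are ours):

```python
# Spot-check the three axioms for the Leibniz determinant on 3x3 integers.
from itertools import permutations
from math import prod

def sgn(p):
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det(A):
    n = len(A)
    return sum(sgn(p) * prod(A[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

r1, r1b, r2, r3 = [2, 0, 1], [7, -2, 3], [1, 3, -1], [4, 1, 5]
c = 6

# Axiom 1: linearity in the first row (the other rows held fixed).
assert det([[a + b for a, b in zip(r1, r1b)], r2, r3]) == \
       det([r1, r2, r3]) + det([r1b, r2, r3])
assert det([[c * a for a in r1], r2, r3]) == c * det([r1, r2, r3])

# Axiom 2: two equal rows give 0.
assert det([r1, r1, r3]) == 0

# Axiom 3: det(I) = 1.
assert det([[1, 0, 0], [0, 1, 0], [0, 0, 1]]) == 1
```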
6. Equivalence of Computation Methods
An important consequence of the uniqueness theorem is that all methods of computing the determinant are guaranteed to yield the same result.
Methods for computing the determinant
- Leibniz formula (permutation formula): $\det A = \sum_{\sigma} \mathrm{sgn}(\sigma) \prod_i a_{i,\sigma(i)}$
- Cofactor expansion: expansion along any row or column
- Row reduction (Gaussian elimination): reduce to upper triangular form, take the product of the diagonal entries, and flip the sign once for each row swap used
- LU decomposition: if $A = LU$, then $\det A = \det L \cdot \det U$, each factor being a product of diagonal entries since $L$ and $U$ are triangular
Each of these methods can be shown (directly or indirectly) to satisfy the three axioms, and the uniqueness theorem then guarantees they all return the same value.
Practical implication: Since every method gives the same answer, one can simply choose whichever method is most efficient for the problem at hand.
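To illustrate, here is a sketch comparing two of these methods on the same matrix: the Leibniz sum and Gaussian elimination with exact rational arithmetic (all helper names are ours; by the uniqueness theorem the results must agree):

```python
# Two independent methods, one answer: the Leibniz sum vs. Gaussian
# elimination with exact rational arithmetic.
from fractions import Fraction
from itertools import permutations
from math import prod

def leibniz_det(A):
    n = len(A)
    def sgn(p):
        inv = sum(1 for i in range(n) for j in range(i + 1, n) if p[i] > p[j])
        return -1 if inv % 2 else 1
    return sum(sgn(p) * prod(A[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def gauss_det(A):
    # Reduce to upper triangular form, tracking sign flips from row swaps;
    # the determinant is then the signed product of the diagonal.
    n = len(A)
    M = [[Fraction(x) for x in row] for row in A]
    sign = 1
    for k in range(n):
        piv = next((r for r in range(k, n) if M[r][k] != 0), None)
        if piv is None:
            return Fraction(0)          # no pivot in this column: det = 0
        if piv != k:
            M[k], M[piv] = M[piv], M[k]
            sign = -sign
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            M[r] = [x - f * y for x, y in zip(M[r], M[k])]
    d = Fraction(sign)
    for k in range(n):
        d *= M[k][k]
    return d

A = [[1, -1, 2], [3, 2, 0], [0, 4, 1]]
assert gauss_det(A) == leibniz_det(A) == 29
```

The Gaussian route is $O(n^3)$ while the Leibniz sum is $O(n \cdot n!)$, which is exactly the "choose whichever method is most efficient" point above.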
7. Summary
What we have shown
- Axiomatic definition: defined the determinant as "the function satisfying three axioms"
- Consequences of the axioms: row swap negates the sign, row addition leaves it invariant, etc.
- Uniqueness theorem: any function satisfying Axioms 1 and 2 has the form $D(A) = (\det A) \cdot D(I)$
- Existence: verified that the Leibniz formula satisfies all three axioms
- Conclusion: the determinant function exists uniquely
The advantages of the axiomatic approach are:
- It clarifies the "essence" of the determinant — which properties are truly decisive
- It automatically guarantees that different computation methods yield the same result
- It extends the theory to arbitrary commutative rings, not just $\mathbb{R}$ or $\mathbb{C}$
Related pages:
- Deriving the determinant: a geometric approach — from volume to the Leibniz formula
- Visual understanding of the determinant: shear transformations — from parallelogram to rectangle
- Deriving the determinant: stacking bars — the row-vector perspective
- Foundations of linear algebra — properties and applications of the determinant
References:
- Bruce Ikenaga, Determinants - Axioms, Millersville University
- Bruce Ikenaga, Determinants - Uniqueness, Millersville University