Chapter 19: Determinant — Axiomatic Definition and Uniqueness

Goal of This Page

Define the determinant axiomatically. Rather than starting with an explicit formula, we begin with the properties it must satisfy and prove that such a function exists uniquely. This approach follows the linear algebra notes by Bruce Ikenaga of Millersville University.

Significance of the axiomatic approach: Mathematicians often define a concept not by writing down a formula but by identifying the properties that characterize it. The axiomatic definition of the determinant is a paradigmatic example of this methodology.

1. The Three Axioms of the Determinant

Let $M(n, R)$ denote the set of all $n \times n$ matrices with entries in $R$, where $R$ is a field (such as $\mathbb{R}$ or $\mathbb{C}$) or, more generally, a commutative ring.

Definition: A function $D: M(n, R) \to R$ is called a determinant function if it satisfies the following three axioms.

Axiom 1: Linearity in each row (Multilinearity)

$D$ is linear in each row. That is, for $a \in R$ and $\boldsymbol{x}, \boldsymbol{y} \in R^n$:

\begin{equation} D\begin{pmatrix} \vdots \\ \boldsymbol{x} + \boldsymbol{y} \\ \vdots \end{pmatrix} = D\begin{pmatrix} \vdots \\ \boldsymbol{x} \\ \vdots \end{pmatrix} + D\begin{pmatrix} \vdots \\ \boldsymbol{y} \\ \vdots \end{pmatrix} \label{eq:ax-linear-add} \end{equation} \begin{equation} D\begin{pmatrix} \vdots \\ a\boldsymbol{x} \\ \vdots \end{pmatrix} = a \cdot D\begin{pmatrix} \vdots \\ \boldsymbol{x} \\ \vdots \end{pmatrix} \label{eq:ax-linear-scalar} \end{equation}

(all other rows held fixed)

Axiom 2: Alternating property

If two rows of a matrix are equal, the determinant is 0:

\begin{equation} D\begin{pmatrix} \vdots \\ \boldsymbol{r} \\ \vdots \\ \boldsymbol{r} \\ \vdots \end{pmatrix} = 0 \label{eq:ax-alternating} \end{equation}

(rows $i$ and $j$ are the same vector $\boldsymbol{r}$)

Axiom 3: Normalization

The determinant of the identity matrix is 1:

\begin{equation} D(I) = 1 \label{eq:ax-normalization} \end{equation}

The question: Does a function $D$ satisfying these axioms exist? If so, is it unique?
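Even before settling existence, the axioms can be sanity-checked numerically in a small case. The sketch below (a hypothetical helper, not from the source) uses the familiar $2 \times 2$ formula $ad - bc$ as a stand-in for $D$ and checks all three axioms on concrete rows:

```python
def D(rows):
    # Stand-in determinant for 2x2 matrices: the familiar ad - bc.
    (a, b), (c, d) = rows
    return a * d - b * c

x, y, z = (1, 2), (3, 5), (-1, 4)
c = 7

# Axiom 1: linearity in the first row (the second row held fixed)
assert D([(x[0] + y[0], x[1] + y[1]), z]) == D([x, z]) + D([y, z])
assert D([(c * x[0], c * x[1]), z]) == c * D([x, z])

# Axiom 2: two equal rows give determinant 0
assert D([x, x]) == 0

# Axiom 3: the identity matrix has determinant 1
assert D([(1, 0), (0, 1)]) == 1
print("all three axioms hold for ad - bc")
```

Of course, a check on particular rows proves nothing in general; the point of the chapter is that the axioms pin down such a function for every $n$.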

Figure 1: The three axioms of the determinant — linearity, alternating property, and normalization uniquely characterize the determinant

2. Consequences of the Axioms

Several important properties follow from the three axioms.

2.1 Matrices with a zero row

Lemma 1: A zero row implies determinant 0

If some row of a matrix $A$ consists entirely of zeros, then $D(A) = 0$.

Proof.

Suppose the $i$-th row is $\boldsymbol{0}$. For any vector $\boldsymbol{r}_i$, we can write $\boldsymbol{0} = 0 \cdot \boldsymbol{r}_i$. By $\eqref{eq:ax-linear-scalar}$:

$$D(A) = D\begin{pmatrix} \vdots \\ 0 \cdot \boldsymbol{r}_i \\ \vdots \end{pmatrix} = 0 \cdot D\begin{pmatrix} \vdots \\ \boldsymbol{r}_i \\ \vdots \end{pmatrix} = 0$$

2.2 Row interchange

Lemma 2: Swapping two rows reverses the sign

Interchanging two rows negates the determinant:

\begin{equation} D\begin{pmatrix} \vdots \\ \boldsymbol{a} \\ \vdots \\ \boldsymbol{b} \\ \vdots \end{pmatrix} = -D\begin{pmatrix} \vdots \\ \boldsymbol{b} \\ \vdots \\ \boldsymbol{a} \\ \vdots \end{pmatrix} \label{eq:ax-swap} \end{equation}
Proof.

By Axiom 2 $\eqref{eq:ax-alternating}$, setting both row $i$ and row $j$ equal to $\boldsymbol{a} + \boldsymbol{b}$:

$$D\begin{pmatrix} \vdots \\ \boldsymbol{a}+\boldsymbol{b} \\ \vdots \\ \boldsymbol{a}+\boldsymbol{b} \\ \vdots \end{pmatrix} = 0$$

Applying Axiom 1 $\eqref{eq:ax-linear-add}$ to row $i$ and then to row $j$:

$$D\begin{pmatrix} \vdots \\ \boldsymbol{a} \\ \vdots \\ \boldsymbol{a} \\ \vdots \end{pmatrix} + D\begin{pmatrix} \vdots \\ \boldsymbol{a} \\ \vdots \\ \boldsymbol{b} \\ \vdots \end{pmatrix} + D\begin{pmatrix} \vdots \\ \boldsymbol{b} \\ \vdots \\ \boldsymbol{a} \\ \vdots \end{pmatrix} + D\begin{pmatrix} \vdots \\ \boldsymbol{b} \\ \vdots \\ \boldsymbol{b} \\ \vdots \end{pmatrix} = 0$$

The first and fourth terms vanish by Axiom 2. Therefore:

$$D\begin{pmatrix} \vdots \\ \boldsymbol{a} \\ \vdots \\ \boldsymbol{b} \\ \vdots \end{pmatrix} + D\begin{pmatrix} \vdots \\ \boldsymbol{b} \\ \vdots \\ \boldsymbol{a} \\ \vdots \end{pmatrix} = 0$$

2.3 Row operations and the determinant

We summarize how elementary row operations affect the determinant:

| Row operation | Effect on determinant | Justification |
| --- | --- | --- |
| Multiply row $i$ by $c$ | $D \to c \cdot D$ | Axiom 1 $\eqref{eq:ax-linear-scalar}$ |
| Swap rows $i$ and $j$ | $D \to -D$ | Lemma 2 $\eqref{eq:ax-swap}$ |
| Add $c$ times row $j$ to row $i$ | $D$ unchanged | Axiom 1 + Axiom 2 |
Proof that row addition leaves $D$ invariant

Add $c$ times row $j$ to row $i$. By $\eqref{eq:ax-linear-add}$:

$$D\begin{pmatrix} \vdots \\ \boldsymbol{r}_i + c\boldsymbol{r}_j \\ \vdots \\ \boldsymbol{r}_j \\ \vdots \end{pmatrix} = D\begin{pmatrix} \vdots \\ \boldsymbol{r}_i \\ \vdots \\ \boldsymbol{r}_j \\ \vdots \end{pmatrix} + D\begin{pmatrix} \vdots \\ c\boldsymbol{r}_j \\ \vdots \\ \boldsymbol{r}_j \\ \vdots \end{pmatrix}$$

By $\eqref{eq:ax-linear-scalar}$ and $\eqref{eq:ax-alternating}$, the second term equals $c \cdot 0 = 0$. Hence the determinant is unchanged.
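These three rules are exactly what determinant computation by row reduction exploits: reduce to upper triangular form while tracking swaps, then multiply the diagonal entries. A minimal sketch in Python (exact arithmetic via `Fraction`; `det_by_row_reduction` is a hypothetical name, not from the source):

```python
from fractions import Fraction

def det_by_row_reduction(A):
    """Determinant via Gaussian elimination, using the table's three rules."""
    A = [[Fraction(x) for x in row] for row in A]
    n = len(A)
    sign = 1
    for col in range(n):
        # Find a nonzero pivot; a row swap negates the determinant (Lemma 2).
        pivot = next((r for r in range(col, n) if A[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)            # no pivot: determinant is 0
        if pivot != col:
            A[col], A[pivot] = A[pivot], A[col]
            sign = -sign
        # Adding a multiple of the pivot row leaves the determinant unchanged.
        for r in range(col + 1, n):
            factor = A[r][col] / A[col][col]
            A[r] = [a - factor * p for a, p in zip(A[r], A[col])]
    # Upper triangular: the determinant is the signed product of the diagonal.
    prod = Fraction(sign)
    for i in range(n):
        prod *= A[i][i]
    return prod

print(det_by_row_reduction([[1, -1], [3, 2]]))  # prints 5
```

The only facts used are the three rows of the table, so any function satisfying the axioms is forced to this value.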

3. Concrete Example: Computing with the Axioms

3.1 A $2 \times 2$ matrix

Let us compute the determinant of $A = \begin{pmatrix} 1 & -1 \\ 3 & 2 \end{pmatrix}$ using only the axioms.

Step 1: Decompose the second row. We can write $(3, 2) = 3(1, -1) + (0, 5)$.

By Axiom 1 $\eqref{eq:ax-linear-add}$:

$$D\begin{pmatrix} 1 & -1 \\ 3 & 2 \end{pmatrix} = D\begin{pmatrix} 1 & -1 \\ 3 & -3 \end{pmatrix} + D\begin{pmatrix} 1 & -1 \\ 0 & 5 \end{pmatrix}$$

Step 2: Evaluate the first term. By $\eqref{eq:ax-linear-scalar}$:

$$D\begin{pmatrix} 1 & -1 \\ 3 & -3 \end{pmatrix} = 3 \cdot D\begin{pmatrix} 1 & -1 \\ 1 & -1 \end{pmatrix}$$

By Axiom 2 $\eqref{eq:ax-alternating}$, the two rows are equal, so:

$$= 3 \cdot 0 = 0$$

Step 3: Evaluate the second term. Decompose the first row as $(1, -1) = 1 \cdot (1, 0) + (-1) \cdot (0, 1)$:

$$D\begin{pmatrix} 1 & -1 \\ 0 & 5 \end{pmatrix} = D\begin{pmatrix} 1 & 0 \\ 0 & 5 \end{pmatrix} + D\begin{pmatrix} 0 & -1 \\ 0 & 5 \end{pmatrix}$$

Next evaluate the second of these two terms. Since $(0, 5) = 5 \cdot (0, 1)$, by $\eqref{eq:ax-linear-scalar}$:

$$D\begin{pmatrix} 0 & -1 \\ 0 & 5 \end{pmatrix} = 5 \cdot D\begin{pmatrix} 0 & -1 \\ 0 & 1 \end{pmatrix}$$

Further expanding the first row, $(0, -1) = 0 \cdot \boldsymbol{e}_1 + (-1) \cdot \boldsymbol{e}_2$. The $\boldsymbol{e}_1$ term carries the coefficient $0$ and so vanishes by $\eqref{eq:ax-linear-scalar}$. The $\boldsymbol{e}_2$ term gives:

$$5 \cdot (-1) \cdot D\begin{pmatrix} 0 & 1 \\ 0 & 1 \end{pmatrix} = -5 \cdot 0 = 0$$

The two rows are equal, so the determinant is 0 by Axiom 2.

Step 4: Evaluate the remaining term via $\eqref{eq:ax-linear-scalar}$:

$$D\begin{pmatrix} 1 & 0 \\ 0 & 5 \end{pmatrix} = 5 \cdot D\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = 5 \cdot D(I) = 5 \cdot 1 = 5$$
Combining Steps 1 through 4:

$$D\begin{pmatrix} 1 & -1 \\ 3 & 2 \end{pmatrix} = 0 + 5 = 5$$

This agrees with the usual formula $ad - bc = 1 \cdot 2 - (-1) \cdot 3 = 2 + 3 = 5$.
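The hand computation above can be mechanized for any $n$: expand each row in the standard basis (Axiom 1), return $0$ on a repeated basis row (Axiom 2), and evaluate a matrix of distinct basis rows by counting swaps back to the identity (Lemma 2) followed by Axiom 3. A sketch (`D_axiomatic` is a hypothetical helper, not from the source):

```python
def D_axiomatic(rows, prefix=()):
    """Evaluate D using only the axioms: expand rows in the standard basis."""
    n = len(rows)
    i = len(prefix)
    if i == n:
        if len(set(prefix)) < n:
            return 0  # a repeated basis row vanishes (Axiom 2)
        # Sort the basis rows back to the identity, one swap at a time;
        # each swap flips the sign (Lemma 2), and then D(I) = 1 (Axiom 3).
        p, sign = list(prefix), 1
        for k in range(n):
            while p[k] != k:
                j = p[k]
                p[k], p[j] = p[j], p[k]
                sign = -sign
        return sign
    # Expand row i in the standard basis (Axiom 1); zero coefficients drop out.
    return sum(rows[i][j] * D_axiomatic(rows, prefix + (j,))
               for j in range(n) if rows[i][j])

print(D_axiomatic([[1, -1], [3, 2]]))  # prints 5, matching the steps above
```

This brute-force expansion is exactly the argument of the uniqueness proof in the next section, executed term by term.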

4. The Uniqueness Theorem

This is the central result of the page. We show that any function satisfying Axioms 1 and 2 is constrained to a specific form.

Theorem (Uniqueness)

If $D: M(n, R) \to R$ is linear in each row (Axiom 1) and vanishes whenever two rows are equal (Axiom 2), then:

\begin{equation} D(A) = \left( \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \cdot a_{1,\sigma(1)} a_{2,\sigma(2)} \cdots a_{n,\sigma(n)} \right) \cdot D(I) \label{eq:ax-uniqueness} \end{equation}

where $S_n$ is the symmetric group of degree $n$ (the set of all $n!$ permutations), and $\mathrm{sgn}(\sigma)$ is the sign of the permutation.

4.1 Proof outline

The proof proceeds in four steps:

  1. Expand each row in the standard basis $\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n$
  2. Apply Axiom 1 (linearity) repeatedly to obtain a sum of $n^n$ terms
  3. Apply Axiom 2 (alternating) to eliminate every term with a repeated basis vector, leaving $n!$ terms
  4. Express the coefficient of each surviving term using the sign of the permutation
Figure 2: Roadmap of the uniqueness proof — multilinearity expands $D(A)$ into $n^n$ terms, then the alternating property collapses them to $n!$ (for $n = 4$: $4^4 = 256$ terms reduce to $4! = 24$, about 90% eliminated; for $n = 10$: $10^{10}$ terms reduce to $10! = 3628800$).

4.2 Detailed proof

Step 1: Basis expansion

Write the $i$-th row $\boldsymbol{r}_i$ of $A$ in the standard basis:

$$\boldsymbol{r}_i = \sum_{j=1}^{n} a_{ij} \boldsymbol{e}_j = a_{i1}\boldsymbol{e}_1 + a_{i2}\boldsymbol{e}_2 + \cdots + a_{in}\boldsymbol{e}_n$$
Step 2: Expansion by linearity

Apply Axiom 1 to the first row:

$$D(A) = D\begin{pmatrix} \sum_{j_1} a_{1j_1}\boldsymbol{e}_{j_1} \\ \boldsymbol{r}_2 \\ \vdots \\ \boldsymbol{r}_n \end{pmatrix} = \sum_{j_1=1}^{n} a_{1j_1} D\begin{pmatrix} \boldsymbol{e}_{j_1} \\ \boldsymbol{r}_2 \\ \vdots \\ \boldsymbol{r}_n \end{pmatrix}$$

Then apply it to the second row:

$$= \sum_{j_1=1}^{n} \sum_{j_2=1}^{n} a_{1j_1} a_{2j_2} D\begin{pmatrix} \boldsymbol{e}_{j_1} \\ \boldsymbol{e}_{j_2} \\ \vdots \\ \boldsymbol{r}_n \end{pmatrix}$$

Repeating for all rows:

$$D(A) = \sum_{j_1=1}^{n} \sum_{j_2=1}^{n} \cdots \sum_{j_n=1}^{n} a_{1j_1} a_{2j_2} \cdots a_{nj_n} \cdot D\begin{pmatrix} \boldsymbol{e}_{j_1} \\ \boldsymbol{e}_{j_2} \\ \vdots \\ \boldsymbol{e}_{j_n} \end{pmatrix}$$

This is a sum of $n^n$ terms.

Step 3: Elimination by the alternating property

When is $D(\boldsymbol{e}_{j_1}, \ldots, \boldsymbol{e}_{j_n})$ nonzero?

By Axiom 2, if $j_k = j_l$ for some $k \neq l$, two rows are identical and $D = 0$.

Therefore only terms where $(j_1, j_2, \ldots, j_n)$ are all distinct survive.

An index tuple $(j_1, j_2, \ldots, j_n)$ with all entries distinct is exactly a permutation of $\{1, 2, \ldots, n\}$; writing $j_i = \sigma(i)$ for $\sigma \in S_n$:

$$D(A) = \sum_{\sigma \in S_n} a_{1\sigma(1)} a_{2\sigma(2)} \cdots a_{n\sigma(n)} \cdot D\begin{pmatrix} \boldsymbol{e}_{\sigma(1)} \\ \boldsymbol{e}_{\sigma(2)} \\ \vdots \\ \boldsymbol{e}_{\sigma(n)} \end{pmatrix}$$

The $n^n$ terms have been reduced to $n!$.
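The reduction can be counted directly by enumeration; the sketch below tallies, for $n = 4$, how many of the $n^n$ index tuples survive the distinctness requirement:

```python
from itertools import product
from math import factorial

n = 4
tuples = list(product(range(n), repeat=n))           # all n^n index choices
surviving = [t for t in tuples if len(set(t)) == n]  # no repeated basis vector

print(len(tuples), len(surviving))  # prints: 256 24
assert len(tuples) == n ** n
assert len(surviving) == factorial(n)
```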

Step 4: Coefficient of each term

We need the value of $D(\boldsymbol{e}_{\sigma(1)}, \boldsymbol{e}_{\sigma(2)}, \ldots, \boldsymbol{e}_{\sigma(n)})$.

The matrix $(\boldsymbol{e}_{\sigma(1)}, \ldots, \boldsymbol{e}_{\sigma(n)})^T$ can be transformed back to $I = (\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n)^T$ by row swaps.

The parity of the number of transpositions (swaps of two elements) needed to return $\sigma$ to the identity defines the sign of the permutation $\mathrm{sgn}(\sigma)$:

$$\mathrm{sgn}(\sigma) = \begin{cases} +1 & \text{even number (even permutation)} \\ -1 & \text{odd number (odd permutation)} \end{cases}$$

By Lemma 2 $\eqref{eq:ax-swap}$, each row swap negates the sign, so:

$$D(\boldsymbol{e}_{\sigma(1)}, \ldots, \boldsymbol{e}_{\sigma(n)}) = \mathrm{sgn}(\sigma) \cdot D(\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n) = \mathrm{sgn}(\sigma) \cdot D(I)$$

Note: The operation here is row interchange. We rearrange the rows of $(\boldsymbol{e}_{\sigma(1)}, \ldots, \boldsymbol{e}_{\sigma(n)})^\top$ to recover the identity matrix $I = (\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n)^\top$, and each swap negates the sign by Lemma 2.
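That this parity is well defined can itself be checked computationally: sorting $(\sigma(1), \ldots, \sigma(n))$ back to the identity with adjacent swaps, flipping a sign per swap, agrees with the inversion-count definition of $\mathrm{sgn}$. A sketch with hypothetical helper names:

```python
from itertools import permutations

def sign_by_swaps(perm):
    """Bubble-sort perm back to the identity; each swap flips the sign (Lemma 2)."""
    p, sign = list(perm), 1
    for i in range(len(p)):
        for j in range(len(p) - 1 - i):
            if p[j] > p[j + 1]:
                p[j], p[j + 1] = p[j + 1], p[j]
                sign = -sign
    return sign

def sign_by_inversions(perm):
    """sgn(sigma) = (-1)^(number of inversions)."""
    inv = sum(1 for i in range(len(perm)) for j in range(i + 1, len(perm))
              if perm[i] > perm[j])
    return -1 if inv % 2 else 1

# The two definitions agree for every permutation of degree 4.
assert all(sign_by_swaps(p) == sign_by_inversions(p)
           for p in permutations(range(4)))
print("parity agrees across all 24 permutations")
```

Each adjacent swap removes exactly one inversion, which is why the two counts always have the same parity.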

4.3 Conclusion

Combining the steps above:

$$D(A) = \sum_{\sigma \in S_n} a_{1\sigma(1)} a_{2\sigma(2)} \cdots a_{n\sigma(n)} \cdot \mathrm{sgn}(\sigma) \cdot D(I)$$ $$= \left( \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} a_{i,\sigma(i)} \right) \cdot D(I)$$

Key consequence

Any function $D$ satisfying Axioms 1 and 2 must have the form:

$$D(A) = (\det A) \cdot D(I)$$

where $\det A = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} a_{i,\sigma(i)}$ is the Leibniz formula.

5. Completing Existence and Uniqueness

5.1 Uniqueness

Substituting Axiom 3 ($D(I) = 1$) into the uniqueness theorem $\eqref{eq:ax-uniqueness}$ gives:

$$D(A) = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} a_{i,\sigma(i)}$$

This shows that if a function satisfying all three axioms exists, it must be unique.

Leibniz Formula (Uniqueness)

If a function satisfying all three axioms exists, it must take the following form:

\begin{equation} \det(A) = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} a_{i,\sigma(i)} \label{eq:ax-leibniz} \end{equation}
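The formula translates almost verbatim into code, with the sign computed by counting inversions (a standard equivalent of the transposition-parity definition). A sketch (`leibniz_det` is a hypothetical name, not from the source):

```python
from itertools import permutations

def leibniz_det(A):
    """det A = sum over sigma in S_n of sgn(sigma) * prod_i A[i][sigma(i)]."""
    n = len(A)
    total = 0
    for sigma in permutations(range(n)):
        inv = sum(1 for i in range(n) for j in range(i + 1, n)
                  if sigma[i] > sigma[j])
        sgn = -1 if inv % 2 else 1
        prod = 1
        for i in range(n):
            prod *= A[i][sigma[i]]
        total += sgn * prod
    return total

print(leibniz_det([[1, -1], [3, 2]]))  # the worked example of Section 3: prints 5
# Two equal rows give 0, matching Axiom 2:
print(leibniz_det([[1, 2, 3], [4, 5, 6], [1, 2, 3]]))  # prints 0
```

At $n!$ terms the formula is hopeless for large $n$ in practice, but it is the theoretical anchor for uniqueness.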

5.2 Existence

Conversely, if we define $\det$ by $\eqref{eq:ax-leibniz}$, we can verify that it satisfies all three axioms.

Verification of Axiom 1 (Linearity)

Each term $\prod_{i=1}^{n} a_{i,\sigma(i)}$ contains the entry $a_{k,\sigma(k)}$ from row $k$ exactly once, to the first power. Therefore:

  • Replacing row $k$ with $\boldsymbol{r}_k + \boldsymbol{r}'_k$ splits the sum
  • Replacing row $k$ with $c \boldsymbol{r}_k$ factors out $c$
Verification of Axiom 2 (Alternating)

Suppose rows $i$ and $j$ are equal ($\boldsymbol{r}_i = \boldsymbol{r}_j$), so $a_{ik} = a_{jk}$ for all $k$.

Consider the permutation $\tau = \sigma \circ (i\ j)$ obtained by composing $\sigma$ with the transposition $(i\ j)$, so that $\tau(i) = \sigma(j)$, $\tau(j) = \sigma(i)$, and $\tau(k) = \sigma(k)$ for all other $k$. Then $\mathrm{sgn}(\tau) = -\mathrm{sgn}(\sigma)$.

Since $a_{ik} = a_{jk}$ for all $k$, we have $a_{i,\tau(i)} = a_{j,\sigma(j)}$ and $a_{j,\tau(j)} = a_{i,\sigma(i)}$, so the products of entries for $\sigma$ and $\tau$ are equal while their signs are opposite, and the two terms cancel. The map $\sigma \mapsto \sigma \circ (i\ j)$ pairs up all of $S_n$ in this way (it is an involution with no fixed points), giving $\det(A) = 0$.

Verification of Axiom 3 (Normalization)

The entries of the identity matrix $I$ are $\delta_{ij}$ (the Kronecker delta).

$$\det(I) = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} \delta_{i,\sigma(i)}$$

The product $\prod_{i=1}^{n} \delta_{i,\sigma(i)}$ equals 1 only when $\sigma = \mathrm{id}$ (the identity permutation).

$$\det(I) = \mathrm{sgn}(\mathrm{id}) \cdot 1 = 1 \cdot 1 = 1$$

5.3 Final conclusion

Characterization of the determinant

  1. Uniqueness: Any function satisfying the three axioms must coincide with the Leibniz formula
  2. Existence: The function defined by the Leibniz formula does satisfy all three axioms
  3. Conclusion: A function satisfying the three axioms exists uniquely, and it is the determinant $\det$

6. Equivalence of Computation Methods

An important consequence of the uniqueness theorem is that all methods of computing the determinant are guaranteed to yield the same result.

Methods for computing the determinant

  • Leibniz formula (permutation formula): $\det A = \sum_{\sigma} \mathrm{sgn}(\sigma) \prod_i a_{i,\sigma(i)}$
  • Cofactor expansion: expansion along any row or column
  • Row reduction (Gaussian elimination): reduce to upper triangular form and take the product of diagonal entries
  • LU decomposition: if $A = LU$, then $\det A = \det L \cdot \det U$

Each of these methods can be shown (directly or indirectly) to satisfy the three axioms, and the uniqueness theorem then guarantees they all return the same value.

Practical implication: Since every method gives the same answer, one can simply choose whichever method is most efficient for the problem at hand.
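As a small illustration of this guarantee, the sketch below implements two of the listed methods independently, the Leibniz formula and cofactor expansion along the first row, and confirms they agree on a sample matrix (both helper names are hypothetical, not from the source):

```python
from itertools import permutations

def leibniz_det(A):
    """Permutation formula: sum of sgn(sigma) * prod_i A[i][sigma(i)]."""
    n, total = len(A), 0
    for sigma in permutations(range(n)):
        inv = sum(1 for i in range(n) for j in range(i + 1, n)
                  if sigma[i] > sigma[j])
        prod = 1
        for i in range(n):
            prod *= A[i][sigma[i]]
        total += (-1 if inv % 2 else 1) * prod
    return total

def cofactor_det(A):
    """Cofactor (Laplace) expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] *
               cofactor_det([row[:j] + row[j + 1:] for row in A[1:]])
               for j in range(n))

M = [[2, 0, 1], [1, 3, -1], [0, 5, 4]]
assert leibniz_det(M) == cofactor_det(M)
print(leibniz_det(M))  # prints 39
```

The agreement is not a coincidence of this matrix: both procedures satisfy the three axioms, so the uniqueness theorem forces them to coincide on every input.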

7. Summary

What we have shown

  1. Axiomatic definition: defined the determinant as "the function satisfying three axioms"
  2. Consequences of the axioms: row swap negates the sign, row addition leaves it invariant, etc.
  3. Uniqueness theorem: any function satisfying Axioms 1 and 2 has the form $D(A) = (\det A) \cdot D(I)$
  4. Existence: verified that the Leibniz formula satisfies all three axioms
  5. Conclusion: the determinant function exists uniquely

The advantages of the axiomatic approach are:

  • It clarifies the "essence" of the determinant — which properties are truly decisive
  • It automatically guarantees that different computation methods yield the same result
  • It extends the theory to arbitrary commutative rings, not just $\mathbb{R}$ or $\mathbb{C}$
