
Properties of the determinant


In this lecture we derive several useful properties of the determinant.

In order to fully understand this lecture you need to remember the main results derived in the lecture on the determinant of an elementary matrix.


Determinant of a triangular matrix

The first result concerns the determinant of a triangular matrix.

Proposition Let $T$ be a $K\times K$ triangular matrix (either upper or lower). Then, the determinant of $T$ is equal to the product of its diagonal entries: $\det(T)=\prod_{k=1}^{K}T_{kk}$

Proof

Suppose that $T$ is lower triangular. Denote by $P$ the set of all permutations of the first $K$ natural numbers. Let $\pi_{0}\in P$ be the permutation in which the $K$ numbers are sorted in increasing order. The parity of $\pi_{0}$ is even and its sign is $\operatorname{sgn}(\pi_{0})=1$ because it does not contain any inversion (see the lecture on the sign of a permutation). Then, the determinant of $T$ is $\det(T)=\sum_{\pi\in P}\operatorname{sgn}(\pi)\prod_{k=1}^{K}T_{k\pi(k)}\overset{(A)}{=}\operatorname{sgn}(\pi_{0})\prod_{k=1}^{K}T_{kk}=\prod_{k=1}^{K}T_{kk}$, where in step $(A)$ we have used the fact that for all permutations $\pi$ except $\pi_{0}$ the product $\prod_{k=1}^{K}T_{k\pi(k)}$ involves at least one entry above the main diagonal that is equal to zero. The latter fact can be proved by contradiction. Suppose the product involves only elements on the main diagonal or below it, and at least one element below it (otherwise $\pi=\pi_{0}$). Then, $\pi(k)\leq k$ for all $k=1,\ldots,K$, but the inequality must be strict for at least one $k$. Suppose that the inequality is strict for $k_{0}$. Then, we have $\pi(k)\leq k_{0}-1$ for $k=1,\ldots,k_{0}$. In other words, the permutation $\pi$ must contain $k_{0}$ different natural numbers smaller than or equal to $k_{0}-1$, which is clearly impossible. This ends the proof by contradiction. Thus, we have proved the proposition for lower triangular matrices. The proof for upper triangular matrices is almost identical (we just need to reverse the inequalities in the last step).
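As a quick numerical check (our own sketch, not part of the lecture), the permutation formula used in the proof can be coded directly and applied to a lower triangular matrix:

```python
import math
from itertools import permutations

def det(A):
    # Permutation (Leibniz) formula: sum over pi of sgn(pi) * prod_k A[k][pi(k)],
    # where sgn(pi) = (-1)^(number of inversions of pi).
    K = len(A)
    total = 0
    for p in permutations(range(K)):
        inv = sum(1 for i in range(K) for j in range(i + 1, K) if p[i] > p[j])
        total += (-1) ** inv * math.prod(A[k][p[k]] for k in range(K))
    return total

T = [[2, 0, 0],
     [5, 3, 0],
     [1, 4, 7]]      # lower triangular
print(det(T))        # 42 = 2 * 3 * 7, the product of the diagonal entries
```

Only the identity permutation contributes a non-zero product, exactly as the proof argues.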

A corollary of the proposition above follows.

Proposition Let $I$ be an identity matrix. Then, $\det(I)=1$

Proof

The identity matrix is diagonal. Therefore, it is triangular and its determinant is equal to the product of its diagonal entries. The latter are all equal to 1. As a consequence, the determinant of I is equal to 1.

Transposition does not change the determinant

The next proposition states an elementary but important property of the determinant.

Proposition Let $A$ be a square matrix and denote its transpose by $A^{\top}$. Then, $\det(A^{\top})=\det(A)$

Proof

Denote by $P$ the set of all permutations of the first $K$ natural numbers. For any permutation $\pi\in P$, there is an inverse permutation $\pi^{-1}$ such that $\pi^{-1}(\pi(k))=k$ for $k=1,\ldots,K$. If $\pi$ is obtained by performing a sequence of transpositions, then $\pi^{-1}$ is obtained by performing the opposite transpositions in reverse order. Thus, the number of transpositions is the same and, as a consequence, we have that $\operatorname{sgn}(\pi^{-1})=\operatorname{sgn}(\pi)$. By using the concept of inverse permutation, the determinant of $A^{\top}$ can be easily calculated as follows: $\det(A^{\top})=\sum_{\pi\in P}\operatorname{sgn}(\pi)\prod_{k=1}^{K}A_{k\pi(k)}^{\top}\overset{(A)}{=}\sum_{\pi\in P}\operatorname{sgn}(\pi)\prod_{k=1}^{K}A_{\pi(k)k}\overset{(B)}{=}\sum_{\sigma\in P}\operatorname{sgn}(\sigma)\prod_{j=1}^{K}A_{j\sigma(j)}=\det(A)$, where: in step $(A)$ we have used the definition of transpose; in step $(B)$ we have set $\sigma=\pi^{-1}$ and $j=\pi(k)$ and, as a consequence, $A_{\pi(k)k}=A_{j\sigma(j)}$ and $\operatorname{sgn}(\sigma)=\operatorname{sgn}(\pi)$.
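A small numerical illustration (our own example; `det` below implements the permutation formula from the definition of the determinant):

```python
import math
from itertools import permutations

def det(A):
    # Permutation (Leibniz) formula: sum over pi of sgn(pi) * prod_k A[k][pi(k)].
    K = len(A)
    total = 0
    for p in permutations(range(K)):
        inv = sum(1 for i in range(K) for j in range(i + 1, K) if p[i] > p[j])
        total += (-1) ** inv * math.prod(A[k][p[k]] for k in range(K))
    return total

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
At = [list(row) for row in zip(*A)]  # transpose of A
print(det(A), det(At))               # both equal -3
```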

The determinant of a matrix with a zero row or column is zero

The following property, while pretty intuitive, is often used to prove other properties of the determinant.

Proposition Let $A$ be a square matrix. If $A$ has a zero row (i.e., a row whose entries are all equal to zero) or a zero column, then $\det(A)=0$

Proof

This property can be proved by using the definition of determinant: $\det(A)=\sum_{\pi\in P}\operatorname{sgn}(\pi)\prod_{k=1}^{K}A_{k\pi(k)}$. For every permutation $\pi$, we have that $\prod_{k=1}^{K}A_{k\pi(k)}=0$ because the product contains one entry from each row (column), but one of the rows (columns) contains only zeros. Therefore, $\det(A)=0$.

The determinant of a singular matrix is zero

We are now going to state one of the most important properties of the determinant.

Proposition Let $A$ be a square matrix. Then $A$ is invertible if and only if $\det(A)\neq 0$, and it is singular if and only if $\det(A)=0$

Proof

The matrix $A$ is row equivalent to a unique matrix $R_{A}$ in reduced row echelon form (RREF). Since $A$ and $R_{A}$ are row equivalent, we have that $A=E_{1}E_{2}\cdots E_{n}R_{A}$, where $E_{1},\ldots,E_{n}$ are elementary matrices. Moreover, by the properties of the determinants of elementary matrices, we have that $\det(A)=\det(E_{1})\det(E_{2})\cdots\det(E_{n})\det(R_{A})$. But the determinant of an elementary matrix is different from zero. Therefore, $\det(A)=c\det(R_{A})$, where $c$ is a non-zero constant. If $A$ is invertible, then $R_{A}$ is the identity matrix and $\det(A)=c\det(I)=c\neq 0$. If $A$ is singular, then $R_{A}$ has at least one zero row, because the only square RREF matrix that has no zero rows is the identity matrix, and the latter is row equivalent only to non-singular matrices. We have proved above that matrices that have a zero row have zero determinant. Thus, if $A$ is singular, $\det(R_{A})=0$ and $\det(A)=c\det(R_{A})=0$. To sum up, we have proved that all invertible matrices have non-zero determinant, and all singular matrices have zero determinant. Since a matrix is either invertible or singular, the two logical implications ("if and only if") follow.
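As a sanity check on small examples (our own; for a $2\times 2$ matrix the determinant reduces to $ad-bc$), a matrix with linearly dependent rows is singular and has zero determinant, while one with independent rows does not:

```python
def det2(M):
    # Determinant of a 2x2 matrix: ad - bc.
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

S = [[1, 2],
     [2, 4]]   # second row = 2 * first row, so S is singular
print(det2(S))               # 0
print(det2([[1, 2], [3, 4]]))  # -2, non-zero, so that matrix is invertible
```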

Determinant of product equals product of determinants

The next proposition shows that the determinant of a product of two matrices is equal to the product of their determinants.

Proposition Let $A$ and $B$ be two $K\times K$ matrices. Then, $\det(AB)=\det(A)\det(B)$

Proof

If one of the two matrices is singular (i.e., not full rank), then their product $AB$ is singular because $\operatorname{rank}(AB)\leq\min\{\operatorname{rank}(A),\operatorname{rank}(B)\}<K$, as explained in the lecture entitled Matrix product and rank. Therefore, $\det(AB)=0$, and at least one of $\det(A)$ or $\det(B)$ is zero, so that $\det(A)\det(B)=0=\det(AB)$. Thus, the statement in the proposition is true if at least one of the two matrices is singular. If neither of them is singular, then we can write them as products of elementary matrices: $A=E_{1}E_{2}\cdots E_{n}$ and $B=F_{1}F_{2}\cdots F_{m}$, where $E_{1},\ldots,E_{n}$ and $F_{1},\ldots,F_{m}$ are elementary matrices. Since the determinant of a product of elementary matrices is equal to the product of their determinants, we have that $\det(AB)=\det(E_{1})\cdots\det(E_{n})\det(F_{1})\cdots\det(F_{m})=\det(A)\det(B)$. Thus, we have proved that the statement in the proposition is true also in the case when the two matrices are non-singular.
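The product rule can be verified on a small example (our own; `det2` and `matmul2` are throwaway helpers for $2\times 2$ matrices):

```python
def det2(M):
    # Determinant of a 2x2 matrix: ad - bc.
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul2(A, B):
    # Product of two 2x2 matrices.
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]   # det = -2
B = [[0, 1], [5, 6]]   # det = -5
print(det2(matmul2(A, B)), det2(A) * det2(B))  # both equal 10
```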

Determinant of inverse

The previous proposition makes it easy to find the determinant of the inverse of a matrix.

Proposition Let $A$ be a $K\times K$ invertible matrix. Then, $\det(A^{-1})=\dfrac{1}{\det(A)}$

Proof

Since $AA^{-1}=I$, we have that $\det(AA^{-1})=\det(I)$. But the determinant of a product equals the product of determinants: $\det(AA^{-1})=\det(A)\det(A^{-1})$, and $\det(I)=1$. As a consequence, $\det(A)\det(A^{-1})=1$. Furthermore, the determinant of an invertible matrix is different from zero, so that we can divide both sides of the equation above by $\det(A)$ and obtain $\det(A^{-1})=\dfrac{1}{\det(A)}$.
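A worked check on a $2\times 2$ example (ours; the inverse is computed with the adjugate formula, and exact rational arithmetic avoids rounding):

```python
from fractions import Fraction

def det2(M):
    # Determinant of a 2x2 matrix: ad - bc.
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def inv2(M):
    # Inverse of a 2x2 matrix via the adjugate: (1/det) * [[d, -b], [-c, a]].
    d = Fraction(det2(M))
    return [[ M[1][1] / d, -M[0][1] / d],
            [-M[1][0] / d,  M[0][0] / d]]

A = [[4, 2], [1, 1]]    # det = 2
print(det2(inv2(A)))    # 1/2, i.e. 1/det(A)
```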

Effect of multiplying a matrix by a scalar

This sub-section presents an easy-to-prove proposition about the multiplication of a matrix by a scalar. Before reading the proof, try to prove it yourself as an exercise.

Proposition Let $A$ be a $K\times K$ matrix. Then, for any scalar $c$, $\det(cA)=c^{K}\det(A)$

Proof

This proposition is easily proved by using the definition of determinant: $\det(cA)=\sum_{\pi\in P}\operatorname{sgn}(\pi)\prod_{k=1}^{K}cA_{k\pi(k)}=c^{K}\sum_{\pi\in P}\operatorname{sgn}(\pi)\prod_{k=1}^{K}A_{k\pi(k)}=c^{K}\det(A)$
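For example (our own check, with $K=2$ and $c=5$, so the factor is $c^{K}=25$):

```python
def det2(M):
    # Determinant of a 2x2 matrix: ad - bc.
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[1, 2], [3, 4]]                       # det = -2, K = 2
cA = [[5 * a for a in row] for row in A]   # multiply every entry by c = 5
print(det2(cA), 5 ** 2 * det2(A))          # both equal -50
```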

Effect of multiplying a row or column by a scalar

This property is similar to the previous one.

Proposition Let $A$ be a $K\times K$ matrix. Let $c$ be a scalar. Let $B$ be a matrix obtained from $A$ by multiplying a row (or column) by $c$. Then, $\det(B)=c\det(A)$

Proof

Suppose the $j$-th row has been multiplied by $c$. By the definition of determinant: $\det(B)=\sum_{\pi\in P}\operatorname{sgn}(\pi)\,cA_{j\pi(j)}\prod_{k\neq j}A_{k\pi(k)}=c\sum_{\pi\in P}\operatorname{sgn}(\pi)\prod_{k=1}^{K}A_{k\pi(k)}=c\det(A)$. If instead the $j$-th column is multiplied by $c$, the same result holds because transposition does not change the determinant.

Linearity in rows and columns

The determinant is a linear function of each row (and each column) of the matrix, when the other rows (columns) are held fixed.

Proposition Let $A$ be a $K\times K$ matrix. Denote by $A_{j\bullet}$ the $j$-th row of $A$. Suppose $A_{j\bullet}=\alpha f+\beta g$, where $f$ and $g$ are two $1\times K$ vectors and $\alpha$ and $\beta$ are two scalars. Denote by $A^{f}$ the matrix obtained from $A$ by substituting $A_{j\bullet}$ with $f$. Denote by $A^{g}$ the matrix obtained from $A$ by substituting $A_{j\bullet}$ with $g$. Then, $\det(A)=\alpha\det(A^{f})+\beta\det(A^{g})$

Proof

By the definition of determinant, we have $\det(A)=\sum_{\pi\in P}\operatorname{sgn}(\pi)\,(\alpha f_{\pi(j)}+\beta g_{\pi(j)})\prod_{k\neq j}A_{k\pi(k)}=\alpha\sum_{\pi\in P}\operatorname{sgn}(\pi)\,f_{\pi(j)}\prod_{k\neq j}A_{k\pi(k)}+\beta\sum_{\pi\in P}\operatorname{sgn}(\pi)\,g_{\pi(j)}\prod_{k\neq j}A_{k\pi(k)}=\alpha\det(A^{f})+\beta\det(A^{g})$
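A numerical check of row linearity on a $2\times 2$ example (our own: the first row of $A$ is built as $\alpha f+\beta g$, the second row is fixed):

```python
def det2(M):
    # Determinant of a 2x2 matrix: ad - bc.
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

f, g = [1, 0], [0, 1]
alpha, beta = 2, 3
row = [alpha * x + beta * y for x, y in zip(f, g)]  # first row = alpha*f + beta*g

A  = [row, [5, 7]]
Af = [f,   [5, 7]]   # A with its first row replaced by f
Ag = [g,   [5, 7]]   # A with its first row replaced by g
print(det2(A), alpha * det2(Af) + beta * det2(Ag))  # both equal -1
```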

Proposition Let $A$ be a $K\times K$ matrix. Denote by $A_{\bullet j}$ the $j$-th column of $A$. Suppose $A_{\bullet j}=\alpha f+\beta g$, where $f$ and $g$ are two $K\times 1$ vectors and $\alpha$ and $\beta$ are two scalars. Denote by $A^{f}$ the matrix obtained from $A$ by substituting $A_{\bullet j}$ with $f$. Denote by $A^{g}$ the matrix obtained from $A$ by substituting $A_{\bullet j}$ with $g$. Then, $\det(A)=\alpha\det(A^{f})+\beta\det(A^{g})$

Proof

This is a consequence of the previous proposition (linearity in the rows) and of the fact that transposition does not change the determinant.

The determinant and the LU decomposition

One of the easiest and most convenient ways to compute the determinant of a square matrix $A$ is based on the LU decomposition $A=PLU$, where $P$, $L$ and $U$ are a permutation matrix, a lower triangular matrix and an upper triangular matrix, respectively. We can write $\det(A)=\det(P)\det(L)\det(U)$, and the determinants of $P$, $L$ and $U$ are easy to compute: $\det(P)$ equals the sign of the permutation represented by $P$ (hence $1$ or $-1$), while $\det(L)$ and $\det(U)$ equal the products of the diagonal entries of $L$ and $U$, because both matrices are triangular.
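The computation above can be sketched as follows (a minimal implementation of our own: Gaussian elimination with partial pivoting, which implicitly produces the $PLU$ factors; the determinant is accumulated as the sign of the row-swap permutation times the product of the pivots, since $L$ has unit diagonal):

```python
def det_lu(A):
    # Determinant via Gaussian elimination with partial pivoting:
    # det(A) = det(P) * det(L) * det(U) = (+-1) * 1 * (product of pivots).
    U = [row[:] for row in A]      # work on a copy of A
    K = len(U)
    sign = 1                       # det(P): flips with every row swap
    for j in range(K):
        # Partial pivoting: bring the largest entry in column j to the pivot row.
        p = max(range(j, K), key=lambda i: abs(U[i][j]))
        if U[p][j] == 0:
            return 0.0             # a zero pivot column: the matrix is singular
        if p != j:
            U[j], U[p] = U[p], U[j]
            sign = -sign
        # Eliminate the entries below the pivot.
        for i in range(j + 1, K):
            m = U[i][j] / U[j][j]
            for c in range(j, K):
                U[i][c] -= m * U[j][c]
    prod = 1.0
    for j in range(K):
        prod *= U[j][j]            # product of the diagonal of U
    return sign * prod

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
print(det_lu(A))   # -3, up to floating-point rounding
```

Unlike the permutation formula, which sums $K!$ terms, this approach runs in $O(K^3)$ operations.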
