
Matrix inversion lemmas

by Marco Taboga, PhD

Matrix inversion lemmas are extremely useful formulae that allow us to efficiently compute how simple changes in a matrix affect its inverse.

Table of Contents

Rank one updates
Rank one updates to the identity matrix
Sherman-Morrison formula
Woodbury matrix identity
How to cite

Rank one updates

One of the simplest changes that can be performed on a matrix is a so-called rank one update.

Definition Let $A$ be a $K\times K$ matrix and $u$ and $v$ two $K\times 1$ column vectors. Then, the transformation $$A\longmapsto A+uv^{\top}$$ is called a rank one update to $A$.

The reason why the transformation is called rank one is that the rank of the $K\times K$ matrix $uv^{\top}$ is equal to 1 (because a single vector, $u$, spans all the columns of $uv^{\top}$).
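As a concrete illustration, here is a minimal NumPy sketch (with arbitrary example numbers, not taken from the lecture) that forms a rank one update and checks that $uv^{\top}$ indeed has rank 1:

```python
import numpy as np

# Arbitrary 3x3 example (K = 3).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 2.0]])
u = np.array([[1.0], [2.0], [0.0]])   # K x 1 column vector
v = np.array([[0.5], [0.0], [1.0]])   # K x 1 column vector

outer = u @ v.T          # the K x K matrix u v^T
update = A + outer       # the rank one update to A

print(np.linalg.matrix_rank(outer))  # 1: every column of u v^T is a multiple of u
```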

Rank one updates to the identity matrix

The first inversion lemma we present is for rank one updates to identity matrices.

Proposition Let $I$ be the $K\times K$ identity matrix and $u$ and $v$ two $K\times 1$ column vectors. The matrix $$I+uv^{\top}$$ is invertible if and only if $$1+v^{\top}u\neq 0.$$ When it is invertible, its inverse is $$\left(I+uv^{\top}\right)^{-1}=I-\frac{uv^{\top}}{1+v^{\top}u}.$$

Proof

Let us first prove the "if" part. The assumption $1+v^{\top}u\neq 0$ ensures that the ratio $$\frac{1}{1+v^{\top}u}$$ and the proposed inverse $$I-\frac{uv^{\top}}{1+v^{\top}u}$$ are well-defined. Moreover, the latter satisfies the definition of inverse (a matrix multiplied by its inverse gives the identity matrix): $$\left(I+uv^{\top}\right)\left(I-\frac{uv^{\top}}{1+v^{\top}u}\right)=I+uv^{\top}-\frac{uv^{\top}+uv^{\top}uv^{\top}}{1+v^{\top}u}=I+uv^{\top}-\frac{\left(1+v^{\top}u\right)uv^{\top}}{1+v^{\top}u}=I+uv^{\top}-uv^{\top}=I.$$ Let us now prove the "only if" part. Suppose $$1+v^{\top}u=0$$ or $$v^{\top}u=-1.$$ The latter equality implies $u\neq 0$. Post-multiply the matrix $I+uv^{\top}$ by $u$: $$\left(I+uv^{\top}\right)u=u+u\left(v^{\top}u\right)=u-u=0.$$ Thus, the linear combination of the columns of $I+uv^{\top}$ with coefficients taken from the non-zero vector $u$ gives the zero vector. As a consequence, the columns of $I+uv^{\top}$ are not linearly independent, so the matrix is not full-rank and hence not invertible.
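The lemma is also easy to check numerically. The following sketch (random example vectors, purely illustrative) compares the inverse given by the formula with the inverse computed directly:

```python
import numpy as np

# Check that (I + u v^T)^(-1) = I - u v^T / (1 + v^T u) when 1 + v^T u != 0.
K = 4
rng = np.random.default_rng(0)
u = rng.standard_normal((K, 1))
v = rng.standard_normal((K, 1))

denom = 1.0 + (v.T @ u).item()       # the scalar 1 + v^T u
assert denom != 0                    # invertibility condition of the lemma

lemma_inverse = np.eye(K) - (u @ v.T) / denom
direct_inverse = np.linalg.inv(np.eye(K) + u @ v.T)

print(np.allclose(lemma_inverse, direct_inverse))  # True
```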

Note the dimensions of the matrix products involved in this inversion lemma: $uv^{\top}$ is a $K\times K$ matrix, while $v^{\top}u$ is a scalar ($1\times 1$), so dividing by $1+v^{\top}u$ is an ordinary division by a number, not a matrix inversion.

Sherman-Morrison formula

The Sherman-Morrison formula, shown in the next proposition, generalizes the previous inversion lemma by considering rank one updates to a generic invertible matrix A (instead of only considering rank one updates to the identity matrix).

Proposition Let $A$ be a $K\times K$ invertible matrix and $u$ and $v$ two $K\times 1$ column vectors. The rank one update $$A+uv^{\top}$$ is invertible if and only if $$1+v^{\top}A^{-1}u\neq 0.$$ When it is invertible, its inverse is $$\left(A+uv^{\top}\right)^{-1}=A^{-1}-\frac{A^{-1}uv^{\top}A^{-1}}{1+v^{\top}A^{-1}u}.$$

Proof

Note that $$A+uv^{\top}=A\left(I+A^{-1}uv^{\top}\right).$$ In other words, we can write a rank one update to $A$ as the product of $A$ and a rank one update to the identity matrix (the update is performed with the column vectors $A^{-1}u$ and $v$). Thus, by the standard result on the inverse of a product, we have that $A+uv^{\top}$ is invertible if and only if the two factors of the product, that is, $A$ and $I+A^{-1}uv^{\top}$, are invertible. But $A$ is invertible by assumption, and the condition for the invertibility of a rank one update to the identity matrix has been derived in the previous proposition: the rank one update is invertible if and only if $$1+v^{\top}A^{-1}u\neq 0.$$ When it is invertible, its inverse is $$\left(I+A^{-1}uv^{\top}\right)^{-1}=I-\frac{A^{-1}uv^{\top}}{1+v^{\top}A^{-1}u}.$$ As a consequence, $$\left(A+uv^{\top}\right)^{-1}=\left(I+A^{-1}uv^{\top}\right)^{-1}A^{-1}=A^{-1}-\frac{A^{-1}uv^{\top}A^{-1}}{1+v^{\top}A^{-1}u}.$$

The Sherman-Morrison formula is very useful not only because rank one updates are found in many theorems in matrix algebra, but also because it can be a computationally efficient way of calculating the inverse $\left(A+uv^{\top}\right)^{-1}$ when $A^{-1}$ has already been calculated. The number of arithmetic operations needed to compute $\left(A+uv^{\top}\right)^{-1}$ from scratch is proportional to $K^{3}$, while the number of operations needed to update the inverse with the Sherman-Morrison formula is proportional to $K^{2}$. Even if in the latter case the constant of proportionality is higher than in the former, when $K$ is sufficiently large, the computational cost of the update is much lower.
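As an illustration of this point, here is a sketch (a hypothetical helper, not part of the lecture) that updates a precomputed inverse with the Sherman-Morrison formula using only matrix-vector and outer products, and checks the result against a full re-inversion:

```python
import numpy as np

def sherman_morrison_update(A_inv, u, v):
    """Return (A + u v^T)^(-1) given A_inv = A^(-1); cost is O(K^2)."""
    Au = A_inv @ u                    # K x 1
    vA = v.T @ A_inv                  # 1 x K
    denom = 1.0 + (v.T @ Au).item()   # scalar 1 + v^T A^(-1) u
    if denom == 0:
        raise ValueError("A + u v^T is not invertible")
    return A_inv - (Au @ vA) / denom

# Quick check against recomputing the inverse from scratch (O(K^3)).
K = 5
rng = np.random.default_rng(1)
A = rng.standard_normal((K, K)) + K * np.eye(K)   # a well-conditioned example
u = rng.standard_normal((K, 1))
v = rng.standard_normal((K, 1))

A_inv = np.linalg.inv(A)
print(np.allclose(sherman_morrison_update(A_inv, u, v),
                  np.linalg.inv(A + u @ v.T)))    # True
```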

Woodbury matrix identity

Another useful matrix inversion lemma goes under the name of Woodbury matrix identity, which is presented in the following proposition.

Proposition Let $A$ be a $K\times K$ invertible matrix, $U$ and $V$ two $K\times L$ matrices, and $C$ an $L\times L$ invertible matrix. If $$C^{-1}+V^{\top}A^{-1}U$$ is invertible, then $$A+UCV^{\top}$$ is invertible and its inverse is $$\left(A+UCV^{\top}\right)^{-1}=A^{-1}-A^{-1}U\left(C^{-1}+V^{\top}A^{-1}U\right)^{-1}V^{\top}A^{-1}.$$

Proof

The condition that $$\left(C^{-1}+V^{\top}A^{-1}U\right)^{-1}$$ exists ensures that the proposed inverse is well-defined. We need to check that the proposed inverse satisfies the definition of inverse (a matrix multiplied by its inverse gives the identity matrix): $$\begin{aligned}&\left(A+UCV^{\top}\right)\left(A^{-1}-A^{-1}U\left(C^{-1}+V^{\top}A^{-1}U\right)^{-1}V^{\top}A^{-1}\right)\\&=I+UCV^{\top}A^{-1}-U\left(C^{-1}+V^{\top}A^{-1}U\right)^{-1}V^{\top}A^{-1}-UCV^{\top}A^{-1}U\left(C^{-1}+V^{\top}A^{-1}U\right)^{-1}V^{\top}A^{-1}\\&=I+UCV^{\top}A^{-1}-UC\left(C^{-1}+V^{\top}A^{-1}U\right)\left(C^{-1}+V^{\top}A^{-1}U\right)^{-1}V^{\top}A^{-1}\\&=I+UCV^{\top}A^{-1}-UCV^{\top}A^{-1}\\&=I.\end{aligned}$$

Note that when $L=1$ and $C=1$, the Woodbury matrix identity coincides with the Sherman-Morrison formula. Therefore, the latter is a special case of the former.

The reasons why this inversion lemma is worth knowing are similar to those we have explained for the Sherman-Morrison formula: it is often used in matrix algebra, and it saves computations when $A^{-1}$ is already known (and $L$ is significantly smaller than $K$).
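The following sketch (again a hypothetical helper, purely illustrative) applies the Woodbury matrix identity to update a precomputed $A^{-1}$: the only inversions performed are of $L\times L$ matrices, which is cheap when $L$ is much smaller than $K$.

```python
import numpy as np

def woodbury_update(A_inv, U, C, V):
    """Return (A + U C V^T)^(-1) given A_inv = A^(-1); only L x L inversions."""
    C_inv = np.linalg.inv(C)                        # L x L
    small = C_inv + V.T @ A_inv @ U                 # the L x L matrix to invert
    return A_inv - A_inv @ U @ np.linalg.inv(small) @ V.T @ A_inv

# Check against direct inversion of the K x K matrix.
K, L = 6, 2
rng = np.random.default_rng(2)
A = rng.standard_normal((K, K)) + K * np.eye(K)
U = rng.standard_normal((K, L))
V = rng.standard_normal((K, L))
C = np.eye(L)                                       # any invertible L x L matrix

A_inv = np.linalg.inv(A)
print(np.allclose(woodbury_update(A_inv, U, C, V),
                  np.linalg.inv(A + U @ C @ V.T)))  # True
```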

How to cite

Please cite as:

Taboga, Marco (2021). "Matrix inversion lemmas", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/matrix-inversion-lemmas.
