
QR decomposition

by Marco Taboga, PhD

The QR decomposition (or QR factorization) allows us to express a matrix having linearly independent columns as the product of 1) a matrix Q having orthonormal columns and 2) an upper triangular matrix R.

In order to fully understand how the QR decomposition is obtained, we should be familiar with the Gram-Schmidt process.


Overview of the decomposition

Remember that the Gram-Schmidt process is a procedure used to transform a set of linearly independent vectors into a set of orthonormal vectors (i.e., a set of vectors that have unit norm and are orthogonal to each other).

In the case of a $K\times L$ matrix A, denote its columns by $A_{\bullet 1},\ldots ,A_{\bullet L}$. If these columns are linearly independent, they can be transformed into a set of orthonormal column vectors $Q_{\bullet 1},\ldots ,Q_{\bullet L}$ by using the Gram-Schmidt process, which alternates normalization steps and projection steps:

  1. we start with the normalization $Q_{\bullet 1}=\dfrac{A_{\bullet 1}}{\left\Vert A_{\bullet 1}\right\Vert }$, where $\left\Vert A_{\bullet 1}\right\Vert $ denotes the norm of $A_{\bullet 1}$;

  2. we project $A_{\bullet 2}$ on $Q_{\bullet 1}$: $A_{\bullet 2}=\left\langle A_{\bullet 2},Q_{\bullet 1}\right\rangle Q_{\bullet 1}+\varepsilon _{2}$, where $\left\langle A_{\bullet 2},Q_{\bullet 1}\right\rangle $ is the inner product between $A_{\bullet 2}$ and $Q_{\bullet 1}$ and $\varepsilon _{2}$ is the residual of the projection, orthogonal to $Q_{\bullet 1}$;

  3. we normalize the residual: $Q_{\bullet 2}=\dfrac{\varepsilon _{2}}{\left\Vert \varepsilon _{2}\right\Vert }$;

  4. we project $A_{\bullet 3}$ on $Q_{\bullet 1}$ and $Q_{\bullet 2}$: $A_{\bullet 3}=\left\langle A_{\bullet 3},Q_{\bullet 1}\right\rangle Q_{\bullet 1}+\left\langle A_{\bullet 3},Q_{\bullet 2}\right\rangle Q_{\bullet 2}+\varepsilon _{3}$, where the residual $\varepsilon _{3}$ is orthogonal to $Q_{\bullet 1}$ and $Q_{\bullet 2}$;

  5. we keep on alternating normalization steps (where projection residuals are divided by their norms) and projection steps (where $A_{\bullet l}$ is projected on $Q_{\bullet 1},\ldots ,Q_{\bullet l-1}$) until we have produced a set of orthonormal vectors $Q_{\bullet 1},\ldots ,Q_{\bullet L}$.

Note that the residuals can be expressed in terms of the normalized vectors as $\varepsilon _{l}=\left\Vert \varepsilon _{l}\right\Vert Q_{\bullet l}$ for $l=1,\ldots ,L$, where we have defined $\varepsilon _{1}=A_{\bullet 1}$.

Thus, the projections can be written as
$$A_{\bullet l}=\sum_{j=1}^{l-1}\left\langle A_{\bullet l},Q_{\bullet j}\right\rangle Q_{\bullet j}+\left\Vert \varepsilon _{l}\right\Vert Q_{\bullet l}\quad \text{for }l=1,\ldots ,L. \qquad (1)$$

The orthonormal vectors can be adjoined to form a $K\times L$ matrix
$$Q=\begin{bmatrix} Q_{\bullet 1} & Q_{\bullet 2} & \cdots & Q_{\bullet L}\end{bmatrix}$$
whose columns are orthonormal.

The coefficients of the projections can be collected in an upper triangular $L\times L$ matrix
$$R=\begin{bmatrix}\left\Vert \varepsilon _{1}\right\Vert & \left\langle A_{\bullet 2},Q_{\bullet 1}\right\rangle & \cdots & \left\langle A_{\bullet L},Q_{\bullet 1}\right\rangle \\ 0 & \left\Vert \varepsilon _{2}\right\Vert & \cdots & \left\langle A_{\bullet L},Q_{\bullet 2}\right\rangle \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \left\Vert \varepsilon _{L}\right\Vert \end{bmatrix}$$

By computing the matrix product between $Q$ and $R$, we recover the projections in equation (1). As a matter of fact, each column of the product $QR$ is a linear combination of the columns of $Q$ with coefficients taken from the corresponding column of $R$ (see the lecture on matrix products and linear combinations).

Therefore, we have that $A=QR$.
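The steps above can be sketched in code. The following is a minimal NumPy implementation of the classical Gram-Schmidt process (the function name and the test matrix are ours, not the lecture's); it builds $Q$ column by column and collects the projection coefficients and residual norms in $R$, assuming the columns of the input are linearly independent.

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR decomposition via the classical Gram-Schmidt process.

    Assumes the columns of A are linearly independent, so every
    projection residual has a strictly positive norm.
    """
    A = np.asarray(A, dtype=float)
    K, L = A.shape
    Q = np.zeros((K, L))
    R = np.zeros((L, L))
    for l in range(L):
        # Projection step: subtract from A[:, l] its components along
        # the already-computed orthonormal vectors.
        residual = A[:, l].copy()
        for j in range(l):
            R[j, l] = Q[:, j] @ A[:, l]      # inner product <A_l, Q_j>
            residual -= R[j, l] * Q[:, j]
        # Normalization step: the diagonal entry of R is the residual's norm.
        R[l, l] = np.linalg.norm(residual)
        Q[:, l] = residual / R[l, l]
    return Q, R

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 2.0]])
Q, R = gram_schmidt_qr(A)
print(np.allclose(Q @ R, A))             # True: A is recovered
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: columns of Q are orthonormal
```

By construction the diagonal entries of $R$ are residual norms, hence strictly positive, matching the decomposition described here. (In floating-point practice, the modified Gram-Schmidt variant or Householder reflections are numerically more stable.)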

A formal statement

We now provide a formal statement of the QR decomposition.

Proposition Let A be a $K\times L$ matrix. If the columns of A are linearly independent, then A can be factorized as $A=QR$, where $Q$ is a $K\times L$ matrix whose columns form an orthonormal set, and $R$ is an $L\times L$ upper triangular matrix whose diagonal entries are strictly positive.


The previous section already contains a constructive proof of how the QR decomposition is obtained. The only important detail we have not mentioned is that the linear independence of the columns of A guarantees that the residuals of the projections performed in the Gram-Schmidt process are different from zero. As a consequence, the normalized vectors $Q_{\bullet l}=\varepsilon _{l}/\left\Vert \varepsilon _{l}\right\Vert $ are well-defined because the norms $\left\Vert \varepsilon _{l}\right\Vert $ are strictly positive. Moreover, the entries on the main diagonal of $R$, being equal to these norms, are strictly positive.

Note that $R$ is invertible because a triangular matrix is invertible if its diagonal entries are strictly positive.

Let us now look at an example.

Example Define the $3\times 2$ matrix [eq20]. The norm of the first column is [eq21], so that the first normalized vector is [eq22]. The inner product between $A_{\bullet 2}$ and $Q_{\bullet 1}$ is [eq23]. The projection of the second column on $Q_{\bullet 1}$ is [eq24], and the residual of the projection is [eq25]. The norm of the residual is [eq26]. Thus, [eq27]. Let us verify that $Q_{\bullet 1}$ and $Q_{\bullet 2}$ are orthogonal: [eq28]. We have now performed all the calculations that lead to the QR factorization [eq29]. The matrix with orthonormal columns is [eq30] and the upper triangular matrix is [eq31]. Let us check that indeed the product of $Q$ and $R$ equals A: [eq32]
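Calculations like those in the example can be checked against a numerical library. Since the example's matrix is rendered as an image here, the sketch below uses an assumed $3\times 2$ matrix of our own. Note that `numpy.linalg.qr` does not guarantee strictly positive diagonal entries in $R$; flipping the signs of the affected columns of $Q$ and rows of $R$ recovers the unique decomposition described in the proposition.

```python
import numpy as np

# A hypothetical 3x2 matrix with linearly independent columns
# (not the article's example, which is shown as an image).
A = np.array([[2.0, 1.0],
              [0.0, 1.0],
              [0.0, 1.0]])

Q, R = np.linalg.qr(A)   # "reduced" QR: Q is 3x2, R is 2x2

# Enforce strictly positive diagonal entries in R: multiplying a
# column of Q and the matching row of R by -1 leaves the product
# Q @ R unchanged.
signs = np.sign(np.diag(R))
Q, R = Q * signs, signs[:, None] * R

print(np.allclose(Q @ R, A))   # True: the factorization reproduces A
print(np.diag(R) > 0)          # all True after the sign fix
```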

Uniqueness of the decomposition

The QR decomposition is unique.

Proposition Under the assumptions of the previous proposition, the QR decomposition is unique, that is, the matrices $Q$ and $R$ satisfying the stated properties are unique.


Suppose that $A=QR=PS$, where $PS$ is a second decomposition into a matrix $P$ having orthonormal columns and an upper triangular matrix $S$ having strictly positive diagonal elements. Since the columns of $Q$ are orthonormal, we have that $Q^{\ast }Q=I$, where $Q^{\ast }$ is the conjugate transpose of $Q$, and I is the $L\times L$ identity matrix (see Non-square matrices with orthonormal columns). By the same token, $P^{\ast }P=I$. If we pre-multiply both sides of the equality $QR=PS$ by $P^{\ast }$ we get $P^{\ast }QR=P^{\ast }PS$, or
$$P^{\ast }QR=S. \qquad (2)$$
If we instead pre-multiply the equality by $Q^{\ast }$ we obtain $Q^{\ast }QR=Q^{\ast }PS$, or
$$R=Q^{\ast }PS. \qquad (3)$$
By plugging equation (3) into equation (2), we obtain
$$P^{\ast }QQ^{\ast }PS=S.$$
The latter equation implies that, for $l=1,\ldots ,L$, the $l$-th row of $S$ can be written as a linear combination of all the rows of $S$ with coefficients taken from the $l$-th row of the matrix $P^{\ast }QQ^{\ast }P$ (see Matrix multiplication and linear combinations). But $S$ is triangular with strictly positive diagonal entries, so its rows are linearly independent and they form a basis for the space of $1\times L$ vectors. As a consequence, the only way to represent the $l$-th row of $S$ as a linear combination of all the rows of $S$ is to put a coefficient of 1 on the $l$-th row itself and a zero coefficient on all the other rows (by the uniqueness of the representation in terms of a basis). In other words, the $l$-th row of $P^{\ast }QQ^{\ast }P$ is the $l$-th vector of the canonical basis. Since this is true for $l=1,\ldots ,L$, we have that $P^{\ast }QQ^{\ast }P=I$, or
$$\left( P^{\ast }Q\right) \left( P^{\ast }Q\right) ^{\ast }=I.$$
Thus, $P^{\ast }Q$ is a unitary matrix (its conjugate transpose is equal to its inverse). Moreover, since $QR=PS$ implies $Q=PSR^{-1}$, we have that
$$P^{\ast }Q=P^{\ast }PSR^{-1}=SR^{-1}.$$
Since the inverse of an upper triangular (UT) matrix is UT and the product of two UT matrices is UT, $SR^{-1}$ is UT. Its diagonal entries are strictly positive because they are ratios of the strictly positive diagonal entries of $S$ and $R$. To sum up, $P^{\ast }Q$ is both unitary and UT with strictly positive diagonal entries. Therefore, by a result on unitary and triangular matrices, we have that $P^{\ast }Q=SR^{-1}=I$ and, as a consequence, $S=R$ and $P=Q$. Thus, the two matrices involved in the QR decomposition are unique.

Pre-multiplication by the Q factor

An important fact that we have discussed in the previous proof but have not separately stated until now is that the $Q$ matrix in the decomposition is such that $Q^{\ast }Q=I$, where $Q^{\ast }$ is the conjugate transpose of $Q$. As a consequence, $Q^{\ast }A=Q^{\ast }QR=R$.

If $Q$ has only real entries, then the conjugate transpose coincides with the transpose and the two equations above become $Q^{\top }Q=I$ and $Q^{\top }A=R$.
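These identities are easy to verify numerically. The sketch below uses a randomly generated real matrix (our illustration, not part of the lecture); note that when $Q$ is not square, $Q^{\top }Q=I$ holds but $QQ^{\top }$ is only a projection matrix, not the identity.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))   # random 5x3 matrix; columns are
                                  # linearly independent almost surely
Q, R = np.linalg.qr(A)            # reduced QR: Q is 5x3, R is 3x3

# Q has orthonormal columns, so Q^T Q is the 3x3 identity...
print(np.allclose(Q.T @ Q, np.eye(3)))   # True
# ...and pre-multiplying A by Q^T recovers the R factor:
# Q^T A = Q^T Q R = R.
print(np.allclose(Q.T @ A, R))           # True
# But Q Q^T is a rank-3 projection, not the 5x5 identity.
print(np.allclose(Q @ Q.T, np.eye(5)))   # False
```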

Square matrices

When the matrix A being decomposed is a square $K\times K$ matrix, then $A=QR$, where $Q$ and $R$ are both square $K\times K$ matrices.

But a square matrix $Q$ having orthonormal columns is a unitary matrix.

Therefore, the QR decomposition of a square matrix having linearly independent columns is the product of a unitary matrix and an upper triangular matrix with strictly positive diagonal entries.

Application to linear regression

The QR method is often used to estimate linear regressions.

In a linear regression we have an $N\times 1$ vector $y$ of outputs and an $N\times K$ matrix X of inputs whose columns are assumed to be linearly independent. We need to find the $K\times 1$ coefficient vector $\beta $ that minimizes the mean squared errors made by using the fitted values $\widehat{y}=X\beta $ to predict the actual values $y$.

The well-known solution to this problem is the so-called ordinary least squares (OLS) estimator
$$\widehat{\beta }=\left( X^{\top }X\right) ^{-1}X^{\top }y.$$

We can simplify the formula for the OLS estimator, avoid inverting a matrix and thus reduce the computational burden (and the possible numerical instabilities) by computing the QR decomposition of X: $X=QR$, where $Q$ is $N\times K$ and $R$ is $K\times K$.

Then, the OLS estimator becomes
$$\widehat{\beta }=\left( R^{\top }Q^{\top }QR\right) ^{-1}R^{\top }Q^{\top }y=\left( R^{\top }R\right) ^{-1}R^{\top }Q^{\top }y=R^{-1}Q^{\top }y,$$
or
$$R\widehat{\beta }=Q^{\top }y.$$

The latter way of writing the solution is more convenient: since $R$ is upper triangular, we do not need to invert it; we can instead use the back-substitution algorithm to find $\widehat{\beta }$.
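A minimal sketch of this approach, with simulated data of our own (the function name `ols_via_qr` is an assumption, not a standard API): decompose $X=QR$, form $Q^{\top }y$, and solve the triangular system $R\widehat{\beta }=Q^{\top }y$ from the bottom row up.

```python
import numpy as np

def ols_via_qr(X, y):
    """OLS coefficients via the QR decomposition of X.

    Solves R beta = Q^T y by back-substitution instead of
    inverting X^T X.
    """
    Q, R = np.linalg.qr(X)   # X = QR, Q is N x K, R is K x K upper triangular
    b = Q.T @ y
    K = R.shape[0]
    beta = np.zeros(K)
    # Back-substitution: the last equation involves only beta[K-1];
    # each earlier equation uses the coefficients already computed.
    for k in range(K - 1, -1, -1):
        beta[k] = (b[k] - R[k, k + 1:] @ beta[k + 1:]) / R[k, k]
    return beta

rng = np.random.default_rng(42)
X = rng.standard_normal((50, 3))   # simulated inputs (illustration only)
y = rng.standard_normal(50)        # simulated outputs
beta_qr = ols_via_qr(X, y)

# Same solution as the normal equations (X^T X) beta = X^T y.
beta_normal_eq = np.linalg.solve(X.T @ X, X.T @ y)
print(np.allclose(beta_qr, beta_normal_eq))   # True
```

The QR route avoids forming $X^{\top }X$, whose condition number is the square of that of $X$, which is why numerical libraries generally prefer it for least-squares problems.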

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Compute the QR decomposition of [eq60]


The norm of the first column of A is [eq61]. Thus, the first orthonormal vector is [eq62]. The inner product between the second column of A and $Q_{\bullet 1}$ is [eq63]. The projection of $A_{\bullet 2}$ on $Q_{\bullet 1}$ is [eq64], and the residual of the projection is [eq65]. The norm of the residual is [eq66]. The second orthonormal vector is [eq67]. Thus, the QR decomposition $A=QR$ is given by [eq68] and [eq69]

How to cite

Please cite as:

Taboga, Marco (2021). "QR decomposition", Lectures on matrix algebra.
