Given a vector space $V$, the orthogonal complement $S^{\perp}$ of a subset $S$ is the subspace of $V$ formed by all the vectors that are orthogonal to the vectors of $S$.

Remember that two vectors $u$ and $v$ are said to be orthogonal if their inner product is equal to zero: $$\langle u, v \rangle = 0.$$

Definition Let $V$ be a vector space. Let $S$ be a subset of $V$. The orthogonal complement of $S$, denoted by $S^{\perp}$, is $$S^{\perp} = \{ u \in V : \langle u, s \rangle = 0 \text{ for all } s \in S \}.$$

Let us work through a simple example.

Example Let $V = \mathbb{R}^{2}$ be the space of all $2 \times 1$ column vectors having real entries. The inner product between two vectors $u, v \in \mathbb{R}^{2}$ is $$\langle u, v \rangle = u_{1} v_{1} + u_{2} v_{2}.$$ Consider the set $S$ formed by the single vector $$s = \begin{bmatrix} 1 \\ -1 \end{bmatrix}.$$ Then, the orthogonal complement of $S$ is $$S^{\perp} = \{ u \in \mathbb{R}^{2} : u_{1} - u_{2} = 0 \}.$$ Thus, $S^{\perp}$ is formed by all the vectors whose second entry $u_{2}$ is equal to the first entry $u_{1}$.
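
The example above can be checked numerically. The following sketch (with illustrative helper names of our own choosing) verifies that the vectors with equal entries are exactly those orthogonal to $s$:

```python
def inner(u, v):
    """Standard inner product on R^n: <u, v> = sum_i u_i * v_i."""
    return sum(ui * vi for ui, vi in zip(u, v))

s = [1.0, -1.0]  # the single vector in S

# Vectors of the form [t, t] satisfy u1 - u2 = 0, so they lie in S_perp.
for t in (-2.0, 0.0, 3.5):
    assert inner([t, t], s) == 0.0

# A vector whose entries differ is not orthogonal to s.
assert inner([1.0, 2.0], s) != 0.0
```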

No matter how the subset $S$ is chosen, its orthogonal complement $S^{\perp}$ is a subspace, that is, a set closed with respect to taking linear combinations.

Proposition Let $V$ be a vector space. Let $S$ be a subset of $V$. Then, the orthogonal complement $S^{\perp}$ is a subspace of $V$.

Proof

Arbitrarily choose $u_{1}, u_{2} \in S^{\perp}$ and two scalars $\alpha_{1}$ and $\alpha_{2}$. Then, for any $s \in S$, $$\langle \alpha_{1} u_{1} + \alpha_{2} u_{2}, s \rangle \overset{(A)}{=} \alpha_{1} \langle u_{1}, s \rangle + \alpha_{2} \langle u_{2}, s \rangle \overset{(B)}{=} \alpha_{1} \cdot 0 + \alpha_{2} \cdot 0 = 0,$$ where: in step $(A)$ we have used the linearity of the inner product in its first argument; in step $(B)$ we have used the fact that $u_{1}$ and $u_{2}$ belong to $S^{\perp}$ and are therefore orthogonal to each vector of $S$. Thus, we have proved that any linear combination of vectors of $S^{\perp}$ is orthogonal to each element of $S$. Hence, it belongs to $S^{\perp}$. As a consequence, $S^{\perp}$ is closed with respect to taking linear combinations. Thus, it is a subspace.
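
The closure argument can be illustrated numerically. In this sketch (the specific vectors are hypothetical, chosen so that both are orthogonal to a given vector of $S$), any linear combination of the two remains orthogonal, by linearity of the inner product:

```python
def inner(u, v):
    """Standard inner product on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

s = [1.0, -1.0, 2.0]  # a vector of S (hypothetical example)
u1 = [2.0, 2.0, 0.0]  # orthogonal to s: 2 - 2 + 0 = 0
u2 = [0.0, 4.0, 2.0]  # orthogonal to s: 0 - 4 + 4 = 0
assert inner(u1, s) == 0.0 and inner(u2, s) == 0.0

# Any linear combination a1*u1 + a2*u2 is again orthogonal to s.
a1, a2 = 3.0, -1.5
combo = [a1 * x + a2 * y for x, y in zip(u1, u2)]
assert inner(combo, s) == a1 * inner(u1, s) + a2 * inner(u2, s) == 0.0
```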

Remember that, given two subspaces $S_{1}$ and $S_{2}$ of $V$, their sum is the set $$S_{1} + S_{2} = \{ s_{1} + s_{2} : s_{1} \in S_{1}, s_{2} \in S_{2} \}.$$

Moreover, when $S_{1} \cap S_{2} = \{ 0 \}$, we say that the sum is direct and we write $S_{1} \oplus S_{2}$.

When, additionally, we have that $$S_{1} \oplus S_{2} = V,$$ then the two subspaces $S_{1}$ and $S_{2}$ are said to be complementary subspaces. In other words, two subspaces are complementary if their direct sum gives the whole vector space as a result.
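
A minimal toy illustration of complementarity, using the coordinate axes of $\mathbb{R}^{2}$ (the `split` helper is our own, not part of any library): $S_{1} = \operatorname{span}\{(1,0)\}$ and $S_{2} = \operatorname{span}\{(0,1)\}$ intersect only in the zero vector, and every vector splits as a sum of one element from each:

```python
def split(v):
    """Decompose v in R^2 as s1 + s2 with s1 in span{(1,0)}, s2 in span{(0,1)}."""
    return [v[0], 0.0], [0.0, v[1]]

v = [3.0, -4.0]
s1, s2 = split(v)

# The decomposition reconstructs v, so span{(1,0)} + span{(0,1)} = R^2.
assert [a + b for a, b in zip(s1, s2)] == v
```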

It turns out that if $S$ is a subspace, then $S^{\perp}$ is one of its complementary subspaces. This is the reason why $S^{\perp}$ is called an orthogonal "complement".

Proposition Let $V$ be a finite-dimensional vector space. Let $S$ be a subspace of $V$. Then, the orthogonal complement $S^{\perp}$ is a complementary subspace of $S$.

Proof

We first prove that $$V = S + S^{\perp},$$ that is, any vector $v \in V$ can be written as a sum of two vectors, one taken from $S$ and one taken from $S^{\perp}$. Since $V$ and, a fortiori, $S$ are finite-dimensional, we can find a basis $s_{1}, \ldots, s_{k}$ of $S$. By the Gram-Schmidt process, we can transform it into an orthonormal basis $e_{1}, \ldots, e_{k}$. Moreover, as explained in the lecture on the Gram-Schmidt process, any vector $v \in V$ can be decomposed as follows: $$v = \langle v, e_{1} \rangle e_{1} + \cdots + \langle v, e_{k} \rangle e_{k} + r,$$ where $r$ is orthogonal to all the vectors of the basis $e_{1}, \ldots, e_{k}$. This implies that $r$ is orthogonal to every vector of $S$. Therefore, $r \in S^{\perp}$. Moreover, the vector $$\langle v, e_{1} \rangle e_{1} + \cdots + \langle v, e_{k} \rangle e_{k}$$ belongs to $S$, because $S$, being a subspace, contains all linear combinations of its vectors. In other words, we can write any vector $v \in V$ as a sum of a vector of $S$ and a vector of $S^{\perp}$. Thus, $$V = S + S^{\perp}.$$ Now, if a vector $v$ belongs to both $S$ and $S^{\perp}$, it must be orthogonal to itself, that is, $$\langle v, v \rangle = 0.$$ But by the definiteness property of the inner product, the only such vector is the zero vector. Therefore, $$S \cap S^{\perp} = \{ 0 \}$$ and the sum is direct. Thus, $$V = S \oplus S^{\perp}.$$
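
The decomposition used in the proof can be sketched numerically. Here we pick a hypothetical orthonormal basis of a plane $S$ in $\mathbb{R}^{3}$, project a vector onto it, and check that the residual lies in $S^{\perp}$ (tolerances account for floating-point rounding):

```python
def inner(u, v):
    """Standard inner product on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

# Hypothetical orthonormal basis of a 2-dimensional subspace S of R^3.
e1 = [0.6, 0.8, 0.0]  # unit vector: 0.36 + 0.64 = 1
e2 = [0.0, 0.0, 1.0]
basis = [e1, e2]

v = [1.0, 2.0, 3.0]

# Projection of v onto S: <v, e1> e1 + <v, e2> e2, as in the proof.
proj = [0.0, 0.0, 0.0]
for e in basis:
    c = inner(v, e)
    proj = [p + c * ei for p, ei in zip(proj, e)]

# The residual r = v - proj is orthogonal to e1 and e2, hence lies in S_perp.
r = [vi - pi for vi, pi in zip(v, proj)]
for e in basis:
    assert abs(inner(r, e)) < 1e-9

# v splits as (vector of S) + (vector of S_perp).
assert all(abs(p + ri - vi) < 1e-9 for p, ri, vi in zip(proj, r, v))
```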

In the lecture on complementary subspaces, we have discussed the fact that complements are not necessarily unique, that is, there can be many different complements to a subspace $S$. On the contrary, the orthogonal complement is unique, as $S^{\perp}$ is precisely identified by the condition that it must contain all the vectors $u$ that satisfy $$\langle u, s \rangle = 0 \text{ for all } s \in S.$$

If we take the orthogonal complement twice, we get back to the original subspace.

Proposition Let $V$ be a finite-dimensional vector space. Let $S$ be a subspace of $V$. Then, $$\left( S^{\perp} \right)^{\perp} = S.$$

Proof

By the definition of $S^{\perp}$, any vector $s \in S$ is orthogonal to all vectors of $S^{\perp}$ and therefore belongs to $\left( S^{\perp} \right)^{\perp}$. Thus, $$S \subseteq \left( S^{\perp} \right)^{\perp}. \quad (1)$$ Now, choose any vector $v \in \left( S^{\perp} \right)^{\perp}$. Since $V$ is finite-dimensional and $S$ is a subspace, $$V = S \oplus S^{\perp}$$ and $v$ has the decomposition $$v = s + r,$$ where $s \in S$ and $r \in S^{\perp}$. We have that $$\langle v, r \rangle = 0$$ because $v$, being in $\left( S^{\perp} \right)^{\perp}$, is orthogonal to all elements of $S^{\perp}$. Moreover, $$\langle s, r \rangle = 0$$ because $s$, being in $S$, is orthogonal to all elements of $S^{\perp}$. Therefore, $$\langle r, r \rangle = \langle v - s, r \rangle = \langle v, r \rangle - \langle s, r \rangle = 0.$$ By the definiteness property of the inner product, this implies that $r = 0$. Therefore, $v = s \in S$. Thus, we have proved that the initial assumption that $v \in \left( S^{\perp} \right)^{\perp}$ implies that $v \in S$. In other words, $$\left( S^{\perp} \right)^{\perp} \subseteq S. \quad (2)$$ By putting the inclusion relations (1) and (2) together, we obtain $\left( S^{\perp} \right)^{\perp} = S$.
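
The double-complement property can be seen concretely in $\mathbb{R}^{2}$ with $S = \operatorname{span}\{(1,-1)\}$: its orthogonal complement is spanned by $(1,1)$, and the vectors orthogonal to $(1,1)$ are exactly the multiples of $(1,-1)$ again. A small sketch:

```python
def inner(u, v):
    """Standard inner product on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

s = [1.0, -1.0]  # spans S
r = [1.0, 1.0]   # spans S_perp: orthogonal to s
assert inner(s, r) == 0.0

# (S_perp)_perp consists of the vectors orthogonal to r, i.e. those with
# u1 + u2 = 0: exactly the multiples of s. Taking the complement twice
# therefore recovers S.
for t in (-1.0, 2.0, 0.5):
    u = [t * si for si in s]   # u in S
    assert inner(u, r) == 0.0  # hence u in (S_perp)_perp
```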

Below you can find some exercises with explained solutions.

Let $V = \mathbb{R}^{3}$ be the space of all $3 \times 1$ column vectors having real entries. Let $S$ be the subspace containing all vectors of the form $$\begin{bmatrix} a \\ b \\ c \end{bmatrix},$$ where $a$, $b$ and $c$ can be any real numbers satisfying $c = a + b$. What is the orthogonal complement of $S$? In particular, what constraints do the entries of the vectors in $S^{\perp}$ need to satisfy? Can you find a vector that spans $S^{\perp}$?

Solution

The vectors of $S$ can be written as $$\begin{bmatrix} a \\ b \\ a + b \end{bmatrix} = a \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} + b \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix}.$$ In other words, $S$ is spanned by the two vectors $$s_{1} = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}, \quad s_{2} = \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix}.$$ The orthogonal complement $S^{\perp}$ contains all the vectors $u$ that satisfy $$\langle u, a s_{1} + b s_{2} \rangle = 0$$ for any two scalars $a$ and $b$. Since this equation needs to be satisfied for every $a$ and $b$, it must be that $$\langle u, s_{1} \rangle = 0 \quad \text{and} \quad \langle u, s_{2} \rangle = 0.$$ Denote by $u_{1}, u_{2}, u_{3}$ the three components of $u$: $$u = \begin{bmatrix} u_{1} \\ u_{2} \\ u_{3} \end{bmatrix}.$$ Then, $$\langle u, s_{1} \rangle = u_{1} + u_{3}$$ and $$\langle u, s_{2} \rangle = u_{2} + u_{3}.$$ Thus, the orthogonal complement $S^{\perp}$ contains all vectors whose coordinates satisfy the two constraints $$u_{1} + u_{3} = 0$$ and $$u_{2} + u_{3} = 0.$$ These constraints are satisfied only by the vectors of the form $$u = \begin{bmatrix} t \\ t \\ -t \end{bmatrix}, \quad t \in \mathbb{R}.$$ In other words, $S^{\perp}$ is spanned by the vector $$\begin{bmatrix} 1 \\ 1 \\ -1 \end{bmatrix}.$$
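
The solution can be double-checked numerically. This sketch assumes, for concreteness, the constraint $c = a + b$, so that $S$ is spanned by $(1,0,1)$ and $(0,1,1)$ and $S^{\perp}$ by $(1,1,-1)$:

```python
def inner(u, v):
    """Standard inner product on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

# Spanning vectors of S, assuming the constraint c = a + b.
s1 = [1.0, 0.0, 1.0]
s2 = [0.0, 1.0, 1.0]

# Candidate spanning vector of S_perp found in the solution.
w = [1.0, 1.0, -1.0]
assert inner(w, s1) == 0.0 and inner(w, s2) == 0.0

# Every vector of S (any choice of a and b) is orthogonal to w.
for a, b in ((1.0, 2.0), (-3.0, 0.5)):
    v = [a, b, a + b]
    assert inner(v, w) == 0.0
```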

Please cite as:

Taboga, Marco (2017). "Orthogonal complement", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/orthogonal-complement.
