
Orthonormal basis

by Marco Taboga, PhD

A basis is orthonormal if its vectors:

  1. have unit norm;

  2. are orthogonal to each other (i.e., their inner product is equal to zero).

The representation of a vector as a linear combination of the vectors of an orthonormal basis is called the Fourier expansion of the vector. It is particularly important in applications.


Orthonormal sets

Let us start by defining orthonormality for a set of vectors (not necessarily a basis).

Definition Let $S$ be a vector space equipped with an inner product $\langle \cdot ,\cdot \rangle $. A set of $K$ vectors $s_{1},\ldots ,s_{K}\in S$ is said to be an orthonormal set if and only if
$$\langle s_{j},s_{k}\rangle =\begin{cases}1 & \text{if }j=k\\ 0 & \text{if }j\neq k\end{cases}\qquad \text{for all }j,k\in \{1,\ldots ,K\}.$$

Thus, all vectors in an orthonormal set are orthogonal to each other and have unit norm:
$$\langle s_{j},s_{k}\rangle =0\text{ whenever }j\neq k,\qquad \left\Vert s_{j}\right\Vert =\sqrt{\langle s_{j},s_{j}\rangle }=1\text{ for all }j.$$

Let us consider a simple example.

Example Consider the space $S$ of all $3\times 1$ column vectors having real entries, together with the inner product
$$\langle r,s\rangle =s^{\intercal }r,$$
where $r,s\in S$ and $s^{\intercal }$ denotes the transpose of $s$. Consider the set of two vectors [eq6]. The inner product of $s_{1}$ with itself is [eq7]. The inner product of $s_{2}$ with itself is [eq8]. The inner product of $s_{1}$ and $s_{2}$ is [eq9]. Therefore, $s_{1}$ and $s_{2}$ form an orthonormal set.
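To make the check concrete, here is a minimal NumPy sketch with an illustrative pair of unit vectors (a hypothetical choice, not necessarily the $s_{1}$ and $s_{2}$ of the example above):

```python
import numpy as np

# Two illustrative 3x1 real vectors (a hypothetical choice for this sketch).
s1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
s2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)

def inner(r, s):
    """Inner product <r, s> = s^T r on real column vectors."""
    return s @ r

# Unit norms: <s1, s1> = <s2, s2> = 1 (up to floating-point rounding).
print(inner(s1, s1), inner(s2, s2))

# Orthogonality: <s1, s2> = 0.
print(inner(s1, s2))
```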

Orthonormal sets are linearly independent

The next proposition shows a key property of orthonormal sets.

Proposition Let $S$ be a vector space equipped with an inner product $\langle \cdot ,\cdot \rangle $. The vectors of an orthonormal set $s_{1},\ldots ,s_{K}$ are linearly independent.

Proof

The proof is by contradiction. Suppose that the vectors $s_{1},\ldots ,s_{K}$ are linearly dependent. Then, there exist $K$ scalars $\alpha _{1},\ldots ,\alpha _{K}$, not all equal to zero, such that
$$\alpha _{1}s_{1}+\ldots +\alpha _{K}s_{K}=0.$$
Thus, for any $j=1,\ldots ,K$,
$$0=\langle 0,s_{j}\rangle =\left\langle \sum_{k=1}^{K}\alpha _{k}s_{k},s_{j}\right\rangle \overset{A}{=}\sum_{k=1}^{K}\alpha _{k}\langle s_{k},s_{j}\rangle \overset{B}{=}\alpha _{j}\langle s_{j},s_{j}\rangle \overset{C}{=}\alpha _{j},$$
where: in step $A$ we have used the additivity and homogeneity of the inner product in its first argument; in step $B$ we have used the fact that we are dealing with an orthonormal set, so that $\langle s_{k},s_{j}\rangle =0$ if $k\neq j$; in step $C$ we have used the fact that the vectors $s_{j}$ have unit norm. Therefore, all the coefficients $\alpha _{1},\ldots ,\alpha _{K}$ must be equal to zero. We have arrived at a contradiction and, as a consequence, the hypothesis that $s_{1},\ldots ,s_{K}$ are linearly dependent is false. Hence, they are linearly independent.
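This property can also be checked numerically. In the sketch below (which reuses the illustrative vectors from the previous snippet), the Gram matrix of the orthonormal set is the identity, so the matrix having the vectors as its columns has full column rank, which is equivalent to linear independence:

```python
import numpy as np

# Stack the illustrative orthonormal vectors as the columns of a matrix.
s1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
s2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
A = np.column_stack([s1, s2])

# Gram matrix G[j, k] = <s_k, s_j>; orthonormality makes it the identity.
G = A.T @ A
print(np.allclose(G, np.eye(2)))  # True

# Full column rank <=> the columns are linearly independent.
print(np.linalg.matrix_rank(A) == 2)  # True
```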

Basis of orthonormal vectors

If an orthonormal set is a basis for its space, then it is called an orthonormal basis.

Definition Let $S$ be a vector space equipped with an inner product $\langle \cdot ,\cdot \rangle $. A set of $K$ vectors $b_{1},\ldots ,b_{K}\in S$ is called an orthonormal basis of $S$ if and only if it is a basis for $S$ and its vectors form an orthonormal set.

In the next example we show that the canonical basis of a coordinate space is an orthonormal basis.

Example As in the previous example, consider the space $S$ of all $3\times 1$ column vectors having real entries, together with the inner product
$$\langle r,s\rangle =s^{\intercal }r$$
for $r,s\in S$. Let us consider the three vectors
$$e_{1}=\begin{bmatrix}1\\0\\0\end{bmatrix},\qquad e_{2}=\begin{bmatrix}0\\1\\0\end{bmatrix},\qquad e_{3}=\begin{bmatrix}0\\0\\1\end{bmatrix},$$
which constitute the canonical basis of $S$. We can clearly see that
$$\langle e_{j},e_{k}\rangle =\begin{cases}1 & \text{if }j=k\\ 0 & \text{if }j\neq k.\end{cases}$$
For instance,
$$\langle e_{1},e_{1}\rangle =e_{1}^{\intercal }e_{1}=1\cdot 1+0\cdot 0+0\cdot 0=1$$
and
$$\langle e_{1},e_{2}\rangle =e_{2}^{\intercal }e_{1}=0\cdot 1+1\cdot 0+0\cdot 0=0.$$
Thus, the canonical basis is an orthonormal basis.
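In NumPy the canonical basis vectors are the columns of the identity matrix, so the check is immediate (a minimal sketch):

```python
import numpy as np

# The canonical basis of the space of 3x1 real vectors:
# the columns of the 3x3 identity matrix.
E = np.eye(3)

# Gram matrix of the basis: G[j, k] = <e_k, e_j> = e_j^T e_k.
G = E.T @ E
print(np.allclose(G, np.eye(3)))  # True: the canonical basis is orthonormal
```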

Fourier expansion

It is incredibly easy to derive the representation of a given vector as a linear combination of an orthonormal basis.

Proposition Let $S$ be a vector space equipped with an inner product $\langle \cdot ,\cdot \rangle $. Let $b_{1},\ldots ,b_{K}$ be an orthonormal basis of $S$. Then, for any $s\in S$, we have
$$s=\sum_{k=1}^{K}\langle s,b_{k}\rangle b_{k}.$$

Proof

Suppose the unique representation of $s$ in terms of the basis is
$$s=\alpha _{1}b_{1}+\ldots +\alpha _{K}b_{K},$$
where $\alpha _{1},\ldots ,\alpha _{K}$ are scalars. Then, for $j=1,\ldots ,K$, we have that
$$\langle s,b_{j}\rangle =\left\langle \sum_{k=1}^{K}\alpha _{k}b_{k},b_{j}\right\rangle \overset{A}{=}\sum_{k=1}^{K}\alpha _{k}\langle b_{k},b_{j}\rangle \overset{B}{=}\alpha _{j}\langle b_{j},b_{j}\rangle \overset{C}{=}\alpha _{j},$$
where: in step $A$ we have used the additivity and homogeneity of the inner product in its first argument; in step $B$ we have used the fact that we are dealing with an orthonormal basis, so that $\langle b_{k},b_{j}\rangle =0$ if $k\neq j$; in step $C$ we have used the fact that the vectors $b_{j}$ have unit norm. Thus, we have found that $\alpha _{j}=\langle s,b_{j}\rangle $ for any $j$, which proves the proposition.

The linear combination above is called the Fourier expansion of $s$ and the coefficients $\langle s,b_{k}\rangle $ are called Fourier coefficients.

In other words, we can find the coefficient of $b_{k}$ by simply calculating the inner product of $s$ with $b_{k}$.
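As a concrete sketch, the following NumPy snippet computes the Fourier coefficients of a vector with respect to an illustrative orthonormal basis of $2\times 1$ real vectors (a hypothetical choice) and verifies that the expansion reconstructs the vector:

```python
import numpy as np

# An illustrative orthonormal basis of 2x1 real vectors (hypothetical choice).
b1 = np.array([1.0, 1.0]) / np.sqrt(2)
b2 = np.array([1.0, -1.0]) / np.sqrt(2)

# The vector to expand.
s = np.array([3.0, 5.0])

# Fourier coefficients: <s, b_k> = b_k^T s.
c1 = b1 @ s
c2 = b2 @ s

# Fourier expansion: s = <s, b1> b1 + <s, b2> b2.
print(np.allclose(s, c1 * b1 + c2 * b2))  # True
```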

Example Let $S$ be the space of all $2\times 1$ column vectors with complex entries, together with the inner product
$$\langle r,s\rangle =s^{\ast }r,$$
where $r,s\in S$ and $s^{\ast }$ is the conjugate transpose of $s$. Consider the orthonormal basis [eq36]. Consider the vector [eq37]. Then, the first Fourier coefficient of $s$ is [eq38] and the second Fourier coefficient is [eq39]. We can check that $s$ can indeed be written as a linear combination of the basis with the coefficients just derived: [eq40]
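The same computation can be sketched in NumPy over the complex field, with a hypothetical orthonormal basis chosen for illustration (not necessarily the basis denoted above); the key difference is that the inner product uses the conjugate transpose:

```python
import numpy as np

def inner(r, s):
    """Inner product <r, s> = s* r, where s* is the conjugate transpose."""
    return np.conj(s) @ r

# A hypothetical orthonormal basis of 2x1 complex vectors.
b1 = np.array([1.0, 1.0j]) / np.sqrt(2)
b2 = np.array([1.0, -1.0j]) / np.sqrt(2)

# Sanity checks: unit norms and orthogonality.
assert np.isclose(inner(b1, b1), 1) and np.isclose(inner(b2, b2), 1)
assert np.isclose(inner(b1, b2), 0)

# Fourier coefficients and expansion of an arbitrary complex vector.
s = np.array([2.0 + 1.0j, -1.0j])
c1, c2 = inner(s, b1), inner(s, b2)
print(np.allclose(s, c1 * b1 + c2 * b2))  # True
```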

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Use the orthonormal basis of two complex vectors introduced in the previous example to derive the Fourier coefficients of the vector [eq41].

Solution

The first Fourier coefficient is derived by computing the inner product of $s$ and $b_{1}$: [eq42]. The second Fourier coefficient is found by calculating the inner product of $s$ and $b_{2}$: [eq43]

Exercise 2

Verify that the Fourier coefficients found in the previous exercise are correct.

In particular, check that using them to linearly combine the two vectors of the basis gives $s$ as a result.

Solution

The Fourier representation of $s$ is [eq44], which is the desired result.

How to cite

Please cite as:

Taboga, Marco (2021). "Orthonormal basis", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/orthonormal-basis.
