
Basis of a linear space

by Marco Taboga, PhD

A set of linearly independent vectors is a basis for a given linear space if and only if every vector in the space can be written as a linear combination of the vectors in the set.


Definition

Let us start with a formal definition of basis.

Definition Let $S$ be a linear space. Let $x_1, \dots, x_n \in S$ be $n$ linearly independent vectors. Then, $x_1, \dots, x_n$ are said to be a basis for $S$ if and only if, for any $s \in S$, there exist $n$ scalars $\alpha_1, \dots, \alpha_n$ such that $s = \alpha_1 x_1 + \alpha_2 x_2 + \dots + \alpha_n x_n$.

In other words, if every vector $s \in S$ can be represented as a linear combination of $x_1, \dots, x_n$, then these vectors are a basis for $S$ (provided that they are also linearly independent).

Example Let $x_1$ and $x_2$ be two $2 \times 1$ column vectors defined as follows: [eq5] These two vectors are linearly independent (see Exercise 1 in the exercise set on linear independence). We are going to prove that $x_1$ and $x_2$ are a basis for the set $S$ of all $2 \times 1$ real vectors. Now, take a vector $s \in S$ and denote its two entries by $s_1$ and $s_2$. The vector $s$ can be written as a linear combination of $x_1$ and $x_2$ if there exist two coefficients $\alpha_1$ and $\alpha_2$ such that $s = \alpha_1 x_1 + \alpha_2 x_2$. This can be written entry by entry as [eq7], so the two coefficients $\alpha_1$ and $\alpha_2$ need to satisfy the following system of linear equations: [eq8]. From the second equation, we obtain [eq9]. By substituting it into the first equation, we get [eq10], or [eq11]. As a consequence, [eq12]. Thus, for any $s \in S$, we have been able to find two coefficients that allow us to express $s$ as a linear combination of $x_1$ and $x_2$. Furthermore, $x_1$ and $x_2$ are linearly independent. As a consequence, they are a basis for $S$.
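
The computation in the example is easy to replicate numerically. Below is a minimal Python sketch: since the example's actual vectors appear only in equation images that are not reproduced here, the values of $x_1$ and $x_2$ in the code are hypothetical stand-ins; any pair of linearly independent $2 \times 1$ vectors behaves the same way.

```python
import numpy as np

# Hypothetical stand-ins for the example's x1 and x2 (the original values
# are not reproduced here); any linearly independent pair works the same.
x1 = np.array([1.0, 2.0])
x2 = np.array([3.0, 4.0])

# Stack the candidate basis vectors as columns of X, so that
# s = alpha_1 * x1 + alpha_2 * x2 becomes the linear system X @ alpha = s.
X = np.column_stack([x1, x2])
assert np.linalg.matrix_rank(X) == 2  # linear independence of the columns

# For an arbitrary s, solve for the coefficients alpha_1 and alpha_2.
s = np.array([5.0, -7.0])
alpha = np.linalg.solve(X, s)
print(alpha)                       # the coefficients of the representation
print(np.allclose(X @ alpha, s))   # True: s = alpha_1*x1 + alpha_2*x2
```

Because the system has a solution for every $s$, the two columns span $S$, which is exactly what the example establishes.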

Uniqueness of representation in terms of a basis

An important fact is that the representation of a vector in terms of a basis is unique.

Proposition If $x_1, \dots, x_n$ are a basis for a linear space $S$, then the representation of a vector $s \in S$ in terms of the basis is unique, i.e., there exists one and only one set of coefficients $\alpha_1, \dots, \alpha_n$ such that $s = \alpha_1 x_1 + \alpha_2 x_2 + \dots + \alpha_n x_n$.

Proof

The proof is by contradiction. Suppose there were two different sets of coefficients $\alpha_1, \dots, \alpha_n$ and $\beta_1, \dots, \beta_n$ such that $s = \alpha_1 x_1 + \alpha_2 x_2 + \dots + \alpha_n x_n$ and $s = \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_n x_n$. If we subtract the second equation from the first, we obtain $0 = (\alpha_1 - \beta_1) x_1 + (\alpha_2 - \beta_2) x_2 + \dots + (\alpha_n - \beta_n) x_n$. Since the two sets of coefficients are different, there exists at least one $k$ such that $\alpha_k \neq \beta_k$, i.e., $\alpha_k - \beta_k \neq 0$. Thus, there exists a linear combination of $x_1, \dots, x_n$, with coefficients not all equal to zero, giving the zero vector as a result. But this implies that $x_1, \dots, x_n$ are not linearly independent, which contradicts our hypothesis ($x_1, \dots, x_n$ are a basis, hence they are linearly independent).
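
The same argument can be restated numerically with the hypothetical basis matrix from the sketch above: if $Xa = s$ and $Xb = s$, then $X(a - b) = 0$, and a matrix with linearly independent columns has a trivial null space, so $a = b$.

```python
import numpy as np

# Columns of X are the hypothetical basis vectors from the previous sketch.
X = np.array([[1.0, 3.0],
              [2.0, 4.0]])

# X @ a = s and X @ b = s imply X @ (a - b) = 0. With linearly independent
# columns, the only solution of X @ d = 0 is d = 0, hence a = b.
null_space_dim = X.shape[1] - np.linalg.matrix_rank(X)
print(null_space_dim)  # 0: the representation in this basis is unique
```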

Basis replacement theorem

The replacement theorem states that, under appropriate conditions, a given basis can be used to build another basis by replacing one of its vectors.

Proposition Let $x_1, \dots, x_n$ be a basis for a linear space $S$. Let $s \in S$. If $s \neq 0$, then a new basis can be obtained by replacing one of the vectors $x_1, \dots, x_n$ with $s$.

Proof

Because $x_1, \dots, x_n$ is a basis for $S$ and $s \in S$, there exist $n$ scalars $\alpha_1, \dots, \alpha_n$ such that $s = \alpha_1 x_1 + \alpha_2 x_2 + \dots + \alpha_n x_n$. At least one of the scalars must be different from zero, because otherwise we would have $s = 0$, in contradiction with our hypothesis that $s \neq 0$. Without loss of generality, we can assume that $\alpha_1 \neq 0$ (if it is not, we can re-number the vectors in the basis). Now, consider the set of vectors obtained from our basis by replacing $x_1$ with $s$: $s, x_2, \dots, x_n$. If this new set of vectors is linearly independent and spans $S$, then it is a basis and the proposition is proved.

First, we are going to prove linear independence. Suppose $\beta_1 s + \beta_2 x_2 + \dots + \beta_n x_n = 0$ for some set of scalars $\beta_1, \dots, \beta_n$. By replacing $s$ with its representation in terms of the original basis, we obtain $\beta_1 \alpha_1 x_1 + (\beta_1 \alpha_2 + \beta_2) x_2 + \dots + (\beta_1 \alpha_n + \beta_n) x_n = 0$. Because $x_1, \dots, x_n$ are linearly independent, this implies that $\beta_1 \alpha_1 = 0$ and $\beta_1 \alpha_j + \beta_j = 0$ for $j = 2, \dots, n$. But we know that $\alpha_1 \neq 0$. As a consequence, $\beta_1 \alpha_1 = 0$ implies $\beta_1 = 0$. By substitution in the other equations, we obtain $\beta_2 = \dots = \beta_n = 0$. Thus, we can conclude that $\beta_1 s + \beta_2 x_2 + \dots + \beta_n x_n = 0$ implies that all the coefficients $\beta_1, \dots, \beta_n$ are equal to zero. By the very definition of linear independence, this means that $s, x_2, \dots, x_n$ are linearly independent. This concludes the first part of our proof.

We now need to prove that $s, x_2, \dots, x_n$ span $S$. In other words, we need to prove that for any $t \in S$, we can find $n$ coefficients $\gamma_1, \dots, \gamma_n$ such that $t = \gamma_1 s + \gamma_2 x_2 + \dots + \gamma_n x_n$. Because $x_1, \dots, x_n$ is a basis, there are coefficients $\delta_1, \dots, \delta_n$ such that $t = \delta_1 x_1 + \delta_2 x_2 + \dots + \delta_n x_n$. From previous results, we have that $s = \alpha_1 x_1 + \alpha_2 x_2 + \dots + \alpha_n x_n$ and, as a consequence, $x_1 = \frac{1}{\alpha_1} \left( s - \alpha_2 x_2 - \dots - \alpha_n x_n \right)$. Thus, we can write $t = \frac{\delta_1}{\alpha_1} s + \left( \delta_2 - \frac{\delta_1 \alpha_2}{\alpha_1} \right) x_2 + \dots + \left( \delta_n - \frac{\delta_1 \alpha_n}{\alpha_1} \right) x_n$. This means that the desired linear representation $t = \gamma_1 s + \gamma_2 x_2 + \dots + \gamma_n x_n$ is achieved with $\gamma_1 = \delta_1 / \alpha_1$ and $\gamma_j = \delta_j - \delta_1 \alpha_j / \alpha_1$ for $j = 2, \dots, n$. As a consequence, $s, x_2, \dots, x_n$ span $S$. This concludes the second and last part of the proof.
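
The proof is constructive, and the construction is short enough to sketch in code. The following Python fragment (a sketch with hypothetical vectors, not part of the original lecture) expresses $s$ in the given basis, picks an index $k$ with $\alpha_k \neq 0$, and swaps $x_k$ for $s$:

```python
import numpy as np

def replace_in_basis(B, s, tol=1e-12):
    """Replace one column of the basis matrix B (columns = basis vectors)
    with the nonzero vector s, as in the replacement theorem."""
    alpha = np.linalg.solve(B, s)      # coordinates of s in the current basis
    k = int(np.argmax(np.abs(alpha)))  # any index with alpha_k != 0 works
    if abs(alpha[k]) < tol:
        raise ValueError("s is (numerically) the zero vector")
    B_new = B.copy()
    B_new[:, k] = s                    # swap x_k out, s in
    return B_new, k

# Hypothetical example: the columns of the identity form a basis of R^3.
B = np.eye(3)
s = np.array([0.0, 2.0, 5.0])
B_new, k = replace_in_basis(B, s)
print(k)                                  # index of the replaced vector
print(np.linalg.matrix_rank(B_new) == 3)  # True: the new set is a basis
```

Note that the code only ever replaces a vector whose coefficient is non-zero, in line with the remark that follows.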

By reading the proof, we notice that we cannot choose arbitrarily the vector to be replaced with $s$: only some of the vectors $x_1, \dots, x_n$ are suitable, namely, those that have a non-zero coefficient in the unique representation $s = \alpha_1 x_1 + \alpha_2 x_2 + \dots + \alpha_n x_n$.

Basis extension theorem

The basis extension theorem, also known as the Steinitz exchange lemma, says that, given a set of vectors that spans a linear space (the spanning set) and another set of linearly independent vectors (the independent set), we can form a basis for the space by picking some vectors from the spanning set and adjoining them to the independent set.

Proposition Let $x_1, \dots, x_n$ be a set of linearly independent vectors belonging to a linear space $S$. Let $y_1, \dots, y_m$ be a finite set of vectors that spans $S$. If the independent set $x_1, \dots, x_n$ is not a basis for $S$, then we can form a basis by adjoining some elements of $y_1, \dots, y_m$ to the independent set.

Proof

Define $S_0 = \{x_1, \dots, x_n\}$. For $j = 1, \dots, m$, if $y_j$ belongs to the span of $S_{j-1}$, set $S_j = S_{j-1}$; otherwise, define $S_j = S_{j-1} \cup \{y_j\}$. In the latter case, the set $S_j$ remains linearly independent because it is formed by adjoining to the linearly independent set $S_{j-1}$ a vector $y_j$ that cannot be written as a linear combination of the vectors of $S_{j-1}$. At the end of this process, we have a set of linearly independent vectors $S_m$ that spans $S$, because any $y_j$ can be written as a linear combination of the vectors of $S_m$, and any $s \in S$ can be written as a linear combination of $y_1, \dots, y_m$. Therefore, $S_m$ is a basis for $S$.
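
The proof describes an algorithm, and a direct transcription is straightforward. Here is a Python sketch (function name and example vectors are hypothetical): it scans the spanning set and adjoins $y_j$ exactly when doing so enlarges the span, which it detects through a rank computation.

```python
import numpy as np

def extend_to_basis(independent, spanning, tol=1e-10):
    """Steinitz-style extension: adjoin vectors from the spanning set
    whenever they are not already in the span of the current set."""
    S = [np.asarray(x, dtype=float) for x in independent]
    for y in spanning:
        y = np.asarray(y, dtype=float)
        rank_before = np.linalg.matrix_rank(np.column_stack(S), tol=tol)
        rank_after = np.linalg.matrix_rank(np.column_stack(S + [y]), tol=tol)
        if rank_after > rank_before:  # y lies outside span(S_{j-1}): adjoin it
            S.append(y)
    return S

# Hypothetical example in R^3: one independent vector, a spanning set.
independent = [[1.0, 1.0, 0.0]]
spanning = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
basis = extend_to_basis(independent, spanning)
print(len(basis))  # 3: the extended set is a basis of R^3
```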

Existence of a basis

The basis extension theorem implies that every finite-dimensional linear space has a basis. This is discussed in the lecture on the dimension of a linear space.

All the bases of a space have the same cardinality

Another important fact, which will also be discussed in the lecture on the dimension of a linear space, is that all the bases of a space have the same number of elements.

How to cite

Please cite as:

Taboga, Marco (2021). "Basis of a linear space", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/basis-of-a-linear-space.
