
Linear independence

by Marco Taboga, PhD

Linear independence is a central concept in linear algebra. Two or more vectors are said to be linearly independent if none of them can be written as a linear combination of the others. Conversely, if at least one of them can be written as a linear combination of the others, then they are said to be linearly dependent.

In the remainder of this lecture we will give a formal definition of linear independence, explain its meaning and provide some examples.


Linearly dependent vectors

Let us start with a formal definition of linear dependence.

Definition Let $S$ be a linear space. Some vectors $x_{1},\ldots ,x_{n}\in S$ are said to be linearly dependent if and only if there exist $n$ scalars $\alpha _{1},\ldots ,\alpha _{n}$ such that $$\alpha _{1}x_{1}+\alpha _{2}x_{2}+\ldots +\alpha _{n}x_{n}=0$$ and at least one of the $n$ scalars $\alpha _{1},\ldots ,\alpha _{n}$ is different from zero.

The requirement that at least one scalar be different from zero is fundamental.

First of all, without this requirement the definition would be trivial: we could always choose $$\alpha _{1}=\alpha _{2}=\ldots =\alpha _{n}=0$$ and obtain as a result $$\alpha _{1}x_{1}+\alpha _{2}x_{2}+\ldots +\alpha _{n}x_{n}=0$$ for any set of $n$ vectors.

Secondly, if one of the coefficients of the linear combination is different from zero (suppose, without loss of generality, it is $\alpha _{1}$), then we can write $$x_{1}=-\frac{\alpha _{2}}{\alpha _{1}}x_{2}-\ldots -\frac{\alpha _{n}}{\alpha _{1}}x_{n}$$ that is, $x_{1}$ is a linear combination of the vectors $x_{2},\ldots ,x_{n}$ with coefficients $-\frac{\alpha _{2}}{\alpha _{1}},\ldots ,-\frac{\alpha _{n}}{\alpha _{1}}$. This fact motivates the informal definition of linear dependence we have given in the introduction above: two or more vectors are linearly dependent if at least one of them can be written as a linear combination of the others.

The assumption $\alpha _{1}\neq 0$ is without loss of generality because we can always change the order of the vectors and assign the first position to a vector corresponding to a non-zero coefficient (by assumption there exists at least one such vector).

Example Let $x_{1}$ and $x_{2}$ be $2\times 1$ column vectors defined as follows: [eq10]. The linear combination [eq11] gives as a result the zero vector because [eq12]. As a consequence, $x_{1}$ and $x_{2}$ are linearly dependent.
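A claim of this kind can also be checked numerically: stack the vectors as columns of a matrix and compare its rank with the number of vectors; the vectors are linearly dependent exactly when the rank is smaller. The sketch below uses two hypothetical $2\times 1$ vectors (not necessarily those of the example above), chosen so that $x_{2}=-2x_{1}$.

```python
import numpy as np

# Hypothetical 2x1 vectors, chosen so that x2 = -2 * x1 (hence dependent).
x1 = np.array([1.0, 2.0])
x2 = np.array([-2.0, -4.0])

# Stack the vectors as columns of a matrix.
A = np.column_stack([x1, x2])

# The vectors are linearly independent exactly when the rank of A equals
# the number of vectors (columns).
rank = np.linalg.matrix_rank(A)
print(rank)                 # 1
print(rank == A.shape[1])   # False -> linearly dependent

# The linear combination 2 * x1 + 1 * x2 gives the zero vector even though
# not all of its coefficients are zero.
print(2 * x1 + 1 * x2)      # [0. 0.]
```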

Linearly independent vectors

It is now straightforward to give a definition of linear independence.

Definition Let $S$ be a linear space. Some vectors $x_{1},\ldots ,x_{n}\in S$ are said to be linearly independent if and only if they are not linearly dependent.

It follows from this definition that, in the case of linear independence, $$\alpha _{1}x_{1}+\alpha _{2}x_{2}+\ldots +\alpha _{n}x_{n}=0$$ implies $$\alpha _{1}=\alpha _{2}=\ldots =\alpha _{n}=0.$$

In other words, when the vectors are linearly independent, their only linear combination that gives the zero vector as a result has all coefficients equal to zero.

Example Let $x_{1}$ and $x_{2}$ be $2\times 1$ column vectors defined as follows: [eq16]. Consider a linear combination of these two vectors with coefficients $\alpha _{1}$ and $\alpha _{2}$: [eq17]. This is equal to [eq18]. Therefore, we have that [eq19] if and only if [eq20], that is, if and only if $\alpha _{1}=\alpha _{2}=0$. As a consequence, the two vectors are linearly independent.
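For two $2\times 1$ vectors, the criterion above can be verified by looking at the matrix whose columns are the two vectors: the homogeneous system has only the trivial solution exactly when that matrix is invertible. A minimal sketch with two hypothetical vectors (again, not necessarily those of the example):

```python
import numpy as np

# Hypothetical 2x1 vectors (for illustration only).
x1 = np.array([1.0, 0.0])
x2 = np.array([1.0, 1.0])

A = np.column_stack([x1, x2])

# For a square matrix, the homogeneous system A @ alpha = 0 has only the
# trivial solution alpha = 0 exactly when the determinant is non-zero.
print(np.linalg.det(A))                        # 1.0 -> non-zero
print(np.linalg.matrix_rank(A) == A.shape[1])  # True -> linearly independent
```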

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Define the following $2\times 1$ vectors: [eq22]. Are $A_{1}$ and $A_{2}$ linearly independent?

Solution

Consider a linear combination with coefficients $\alpha _{1}$ and $\alpha _{2}$: [eq23]. Such a linear combination gives as a result the zero vector if and only if [eq24], that is, if and only if the two coefficients $\alpha _{1}$ and $\alpha _{2}$ solve the system of linear equations [eq25]. This system can be solved as follows. From the second equation, we obtain [eq26], which, substituted in the first equation, gives [eq27]. Thus, $\alpha _{2}=0$ and $\alpha _{1}=0$. Therefore, the only linear combination of $A_{1}$ and $A_{2}$ giving the zero vector as a result has all coefficients equal to zero. This means that $A_{1}$ and $A_{2}$ are linearly independent.
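The same steps (set the linear combination equal to the zero vector and solve the resulting system) can be carried out symbolically. The sketch below uses SymPy with two hypothetical vectors standing in for $A_{1}$ and $A_{2}$; the point is the procedure, not the particular entries.

```python
import sympy as sp

a1, a2 = sp.symbols('alpha1 alpha2')

# Hypothetical 2x1 vectors standing in for A1 and A2.
A1 = sp.Matrix([1, 2])
A2 = sp.Matrix([3, 1])

# Setting the linear combination alpha1*A1 + alpha2*A2 equal to the zero
# vector gives a system of two equations in alpha1 and alpha2.
combination = a1 * A1 + a2 * A2
solution = sp.solve([combination[0], combination[1]], [a1, a2])

# Only the trivial solution exists, so A1 and A2 are linearly independent.
print(solution)   # {alpha1: 0, alpha2: 0}
```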

Exercise 2

Let $A_{1}$, $A_{2}$ and $A_{3}$ be $3\times 1$ vectors defined as follows: [eq28]. Why are these vectors linearly dependent?

Solution

Notice that the vector $A_{3}$ is a scalar multiple of $A_{2}$: [eq29], or [eq30]. As a consequence, a linear combination of $A_{1}$, $A_{2}$ and $A_{3}$ with coefficients $\alpha _{1}=0$, $\alpha _{2}=2$ and $\alpha _{3}=-1$ gives as a result [eq31]. Thus, there exists a linear combination of the three vectors such that the coefficients of the combination are not all equal to zero, but the result of the combination is equal to the zero vector. This means that the three vectors are linearly dependent.
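The argument is easy to verify numerically: whatever $A_{1}$ is, if $A_{3}=2A_{2}$ then the combination with coefficients $0$, $2$ and $-1$ returns the zero vector. The entries below are hypothetical; only the relation $A_{3}=2A_{2}$ matters.

```python
import numpy as np

# Hypothetical 3x1 vectors; only the relation A3 = 2 * A2 matters here.
A1 = np.array([1.0, 0.0, 2.0])
A2 = np.array([1.0, 2.0, 3.0])
A3 = 2.0 * A2

# The combination with coefficients 0, 2 and -1 gives the zero vector,
# although not all coefficients are zero: the vectors are dependent.
print(0.0 * A1 + 2.0 * A2 - 1.0 * A3)                        # [0. 0. 0.]

# Equivalently, the rank of the matrix having the vectors as columns is
# smaller than the number of vectors.
print(np.linalg.matrix_rank(np.column_stack([A1, A2, A3])))  # 2
```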

Exercise 3

Let $x$ be a real number. Define the following $2\times 1$ vectors: [eq32]. Are $A_{1}$ and $A_{2}$ linearly independent?

Solution

Take a linear combination with coefficients $\alpha _{1}$ and $\alpha _{2}$: [eq33]. This linear combination is equal to the zero vector if and only if [eq34], that is, if and only if the two coefficients $\alpha _{1}$ and $\alpha _{2}$ solve the system of linear equations [eq35]. A solution to this system can be found as follows. We subtract the second equation from the first and obtain [eq36], or [eq37]. By substitution into the second equation, we get [eq38], or [eq39]. Now, there are two possible cases. If $x\neq 0$ (first case), then $\alpha _{2}=0$ and, as a consequence, $\alpha _{1}=0$. Thus, in this case the only linear combination of $A_{1}$ and $A_{2}$ giving the zero vector as a result has all coefficients equal to zero. This means that $A_{1}$ and $A_{2}$ are linearly independent. If instead $x=0$ (second case), then any value of $\alpha _{2}$ will satisfy the equation [eq39]. Choose a number different from zero and denote it by $s$. Then, the system of linear equations will be solved by $\alpha _{2}=s$ and $\alpha _{1}=-2s$. Thus, in this case there are infinitely many linear combinations with at least one coefficient different from zero that give the zero vector as a result (a different combination for each choice of $s$). This means that $A_{1}$ and $A_{2}$ are linearly dependent.
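The case distinction can also be reproduced symbolically. The sketch below uses hypothetical vectors depending on $x$ (not necessarily those of the exercise) and exploits the fact that two $2\times 1$ vectors are linearly dependent exactly when the determinant of the matrix having them as columns is zero.

```python
import sympy as sp

x = sp.symbols('x')
a1, a2 = sp.symbols('alpha1 alpha2')

# Hypothetical 2x1 vectors whose (in)dependence hinges on the parameter x.
A1 = sp.Matrix([2, 1])
A2 = sp.Matrix([4 + x, 2 + x])

# Two 2x1 vectors are linearly dependent exactly when the determinant of
# the matrix having them as columns is zero.
det = sp.Matrix.hstack(A1, A2).det()
print(sp.simplify(det))   # x  -> zero only when x = 0, i.e. dependent only then

# When x = 0 the homogeneous system has infinitely many solutions.
system = (a1 * A1 + a2 * A2).subs(x, 0)
print(sp.solve([system[0], system[1]], [a1, a2]))   # {alpha1: -2*alpha2}
```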

How to cite

Please cite as:

Taboga, Marco (2021). "Linear independence", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/linear-independence.
