
Linear map

by Marco Taboga, PhD

Linear maps are transformations from one vector space to another that preserve vector addition and scalar multiplication.


Definition

Let us start with a definition.

Definition Let $S$ and $T$ be two linear spaces. Let $f:S\rightarrow T$ be a transformation that associates one and only one element of $T$ to each element of $S$. The transformation $f$ is said to be a linear map if and only if
$$f\left( \alpha _{1}s_{1}+\alpha _{2}s_{2}\right) =\alpha _{1}f\left( s_{1}\right) +\alpha _{2}f\left( s_{2}\right)$$
for any two scalars $\alpha _{1}$ and $\alpha _{2}$ and any two vectors $s_{1},s_{2}\in S$.

While "map" is probably the most commonly used term, we can interchangeably use the terms "mapping", "transformation" and "function".

Example Let $S$ be the space of all $3\times 1$ column vectors having real entries. Let $T$ be the space of all $2\times 1$ column vectors having real entries. Suppose the map $f:S\rightarrow T$ associates to each vector [eq2] a vector [eq3]. Now, take any two vectors $s_{1},s_{2}\in S$ and any two scalars $\alpha _{1}$ and $\alpha _{2}$. By repeatedly applying the definitions of vector addition and scalar multiplication, we get [eq4]. Thus, $f$ is a linear map.
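The same kind of check can also be run numerically. The following sketch (in Python with NumPy) uses a hypothetical map from $3\times 1$ to $2\times 1$ vectors, not the specific map of the example above, whose formula is not reproduced here, and verifies the defining property on randomly drawn vectors and scalars.

```python
import numpy as np

def f(s):
    # Hypothetical linear map from 3x1 to 2x1 vectors (an assumption,
    # not the specific map used in the example above).
    return np.array([s[0] + s[1], 2 * s[2]])

rng = np.random.default_rng(0)
s1, s2 = rng.normal(size=3), rng.normal(size=3)
a1, a2 = rng.normal(), rng.normal()

# Defining property: f(a1*s1 + a2*s2) = a1*f(s1) + a2*f(s2)
lhs = f(a1 * s1 + a2 * s2)
rhs = a1 * f(s1) + a2 * f(s2)
print(np.allclose(lhs, rhs))  # True (up to floating-point error)
```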

Matrix multiplication defines a linear map

We will later prove that every linear map can be represented by a matrix, but the converse is also true: pre-multiplication of vectors by a matrix defines a linear map.

Proposition Let $S$ be the linear space of all $L\times 1$ column vectors. Let $T$ be the linear space of all $K\times 1$ column vectors. Let $A$ be a $K\times L$ matrix. Consider the transformation $f:S\rightarrow T$ defined, for any $s\in S$, by
$$f\left( s\right) =As$$
where $As$ denotes the matrix product between $A$ and $s$. Then $f$ is a linear map.

Proof

For any two vectors $s_{1},s_{2}\in S$ and any two scalars $\alpha _{1}$ and $\alpha _{2}$, we have that
$$f\left( \alpha _{1}s_{1}+\alpha _{2}s_{2}\right) =A\left( \alpha _{1}s_{1}+\alpha _{2}s_{2}\right) =\alpha _{1}As_{1}+\alpha _{2}As_{2}=\alpha _{1}f\left( s_{1}\right) +\alpha _{2}f\left( s_{2}\right)$$
where in the second equality we have applied the distributive property of matrix multiplication and the fact that a scalar can be moved through a matrix product.
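As a quick numerical illustration of the proposition (a sketch, not part of the original proof), the following Python/NumPy snippet draws an arbitrary $K\times L$ matrix, two vectors and two scalars, and checks the defining property up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(1)
K, L = 4, 3
A = rng.normal(size=(K, L))                        # arbitrary K x L matrix
s1, s2 = rng.normal(size=L), rng.normal(size=L)    # L x 1 column vectors
a1, a2 = rng.normal(), rng.normal()                # arbitrary scalars

# f(s) = A s is linear: A(a1*s1 + a2*s2) = a1*(A s1) + a2*(A s2)
lhs = A @ (a1 * s1 + a2 * s2)
rhs = a1 * (A @ s1) + a2 * (A @ s2)
print(np.allclose(lhs, rhs))  # True
```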

The same result holds for post-multiplication.

Proposition Let $S$ be the linear space of all $1\times K$ row vectors. Let $T$ be the linear space of all $1\times L$ row vectors. Let $A$ be a $K\times L$ matrix. Consider the transformation $f:S\rightarrow T$ defined, for any $s\in S$, by
$$f\left( s\right) =sA$$
where $sA$ denotes the matrix product between $s$ and $A$. Then $f$ is a linear map.

Proof

Analogous to the previous proof.
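The post-multiplication case can be checked in the same way, treating $s_{1}$ and $s_{2}$ as $1\times K$ row vectors (again a small numerical sketch, not part of the proof).

```python
import numpy as np

rng = np.random.default_rng(2)
K, L = 4, 3
A = rng.normal(size=(K, L))                                  # K x L matrix
s1, s2 = rng.normal(size=(1, K)), rng.normal(size=(1, K))    # 1 x K row vectors
a1, a2 = rng.normal(), rng.normal()

# f(s) = s A is linear: (a1*s1 + a2*s2) A = a1*(s1 A) + a2*(s2 A)
print(np.allclose((a1 * s1 + a2 * s2) @ A,
                  a1 * (s1 @ A) + a2 * (s2 @ A)))  # True
```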

The definition extends to combinations of multiple terms

As one might intuitively expect, linear maps also preserve linear combinations that involve more than two terms.

Proposition Let $\alpha _{1},\ldots ,\alpha _{K}$ be $K$ scalars and let $s_{1},\ldots ,s_{K}$ be $K$ elements of a linear space $S$. If $f:S\rightarrow T$ is a linear map, then
$$f\left( \alpha _{1}s_{1}+\ldots +\alpha _{K}s_{K}\right) =\alpha _{1}f\left( s_{1}\right) +\ldots +\alpha _{K}f\left( s_{K}\right)$$

Proof

The result is obtained by applying the linearity property to one vector at a time:
$$\begin{aligned} f\left( \alpha _{1}s_{1}+\alpha _{2}s_{2}+\ldots +\alpha _{K}s_{K}\right) &=\alpha _{1}f\left( s_{1}\right) +f\left( \alpha _{2}s_{2}+\ldots +\alpha _{K}s_{K}\right) \\ &=\alpha _{1}f\left( s_{1}\right) +\alpha _{2}f\left( s_{2}\right) +f\left( \alpha _{3}s_{3}+\ldots +\alpha _{K}s_{K}\right) \\ &=\ldots \\ &=\alpha _{1}f\left( s_{1}\right) +\alpha _{2}f\left( s_{2}\right) +\ldots +\alpha _{K}f\left( s_{K}\right) \end{aligned}$$
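Numerically, the multi-term property can be illustrated as follows; the sketch uses pre-multiplication by a matrix as the linear map, an assumption made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
K, L, M = 5, 3, 2                     # K terms, vectors in R^L, images in R^M
A = rng.normal(size=(M, L))
f = lambda s: A @ s                   # a linear map (assumed only as an example)

alphas = rng.normal(size=K)           # K scalars
vectors = rng.normal(size=(K, L))     # K vectors of S

# f(alpha_1 s_1 + ... + alpha_K s_K) = alpha_1 f(s_1) + ... + alpha_K f(s_K)
lhs = f(sum(a * s for a, s in zip(alphas, vectors)))
rhs = sum(a * f(s) for a, s in zip(alphas, vectors))
print(np.allclose(lhs, rhs))  # True
```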

A linear map is completely determined by its values on a basis

A very interesting and useful property is that a linear map $f:S\rightarrow T$ is completely determined by its values on a basis of $S$ (i.e., a set of linearly independent vectors such that any vector $s\in S$ can be written as a linear combination of the basis).

Proposition Let $S$ and $T$ be linear spaces. Let $b_{1},\ldots ,b_{K}$ be a basis of $S$. Let $t_{1},\ldots ,t_{K}$ be $K$ elements of $T$. Then, there is a unique linear map $f:S\rightarrow T$ such that
$$f\left( b_{k}\right) =t_{k}$$
for $k=1,\ldots ,K$.

Proof

Any vector $s\in S$ can be written as a linear combination of the basis:
$$s=\alpha _{1}b_{1}+\ldots +\alpha _{K}b_{K}$$
where the scalars $\alpha _{1},\ldots ,\alpha _{K}$ are unique because representations in terms of a basis are unique. Then, the linearity of the map implies that
$$f\left( s\right) =f\left( \alpha _{1}b_{1}+\ldots +\alpha _{K}b_{K}\right) =\alpha _{1}f\left( b_{1}\right) +\ldots +\alpha _{K}f\left( b_{K}\right) =\alpha _{1}t_{1}+\ldots +\alpha _{K}t_{K}$$
for any $s\in S$. Thus, the value $f\left( s\right) $ of the map is uniquely determined by the vectors $t_{1},\ldots ,t_{K}$ (which were chosen in advance) and by the unique scalars $\alpha _{1},\ldots ,\alpha _{K}$.

In other words, if we know the values taken by the map at the vectors of the basis, then we can derive all the other values taken by the map.
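The following sketch illustrates the proposition numerically: given a hypothetically chosen basis of the space of $3\times 1$ vectors and arbitrary target vectors in the space of $2\times 1$ vectors, the value of the map at any vector is obtained by solving for the coordinates of the vector with respect to the basis and then combining the targets with those coordinates.

```python
import numpy as np

rng = np.random.default_rng(4)

# The columns of B form a basis of the space of 3x1 vectors (hypothetical choice).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
# Arbitrarily chosen values of the map at the basis vectors (columns of targets).
targets = rng.normal(size=(2, 3))

def f(s):
    # Coordinates of s with respect to the basis: solve B @ alpha = s.
    alpha = np.linalg.solve(B, s)
    # f(s) = alpha_1 * t_1 + ... + alpha_K * t_K
    return targets @ alpha

# The map takes the prescribed values on the basis...
print(np.allclose(f(B[:, 0]), targets[:, 0]))  # True
# ...and is linear by construction.
s1, s2 = rng.normal(size=3), rng.normal(size=3)
print(np.allclose(f(2 * s1 - 3 * s2), 2 * f(s1) - 3 * f(s2)))  # True
```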

Example Let $S$ be the space of all $2\times 1$ vectors. Let $T$ be the space of all $3\times 1$ vectors. Consider the linear map $f:S\rightarrow T$ such that [eq20]. The two vectors
$$\begin{bmatrix} 1 \\ 0 \end{bmatrix} \quad \text{and} \quad \begin{bmatrix} 0 \\ 1 \end{bmatrix}$$
form a basis for $S$ (the canonical basis of $S$). Any vector $s\in S$ can be written as a linear combination of the basis. In particular, if we denote by $s_{1}$ and $s_{2}$ the two entries of $s$, then we have that
$$s=\begin{bmatrix} s_{1} \\ s_{2} \end{bmatrix}=s_{1}\begin{bmatrix} 1 \\ 0 \end{bmatrix}+s_{2}\begin{bmatrix} 0 \\ 1 \end{bmatrix}$$
Therefore, the value of $f$ at any vector $s$ can be derived as follows:
$$f\left( s\right) =s_{1}f\left( \begin{bmatrix} 1 \\ 0 \end{bmatrix}\right) +s_{2}f\left( \begin{bmatrix} 0 \\ 1 \end{bmatrix}\right)$$
where the two values on the right-hand side are those specified in [eq20].
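In the canonical-basis case the construction is particularly simple: stacking the images of the two basis vectors as the columns of a $3\times 2$ matrix turns the map into pre-multiplication by that matrix. The values used below are placeholders, not the ones specified in [eq20].

```python
import numpy as np

# Placeholder images of the canonical basis vectors (the example above
# uses its own specific values, which are not reproduced here).
f_e1 = np.array([1.0, 2.0, 3.0])   # f applied to [1, 0]'
f_e2 = np.array([4.0, 5.0, 6.0])   # f applied to [0, 1]'

M = np.column_stack([f_e1, f_e2])  # 3 x 2 matrix with columns f(e1), f(e2)

s = np.array([7.0, -2.0])
# f(s) = s1 * f(e1) + s2 * f(e2), which is the same as M @ s
print(np.allclose(s[0] * f_e1 + s[1] * f_e2, M @ s))  # True
```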

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Let $S=T$ be the space of all $2\times 1$ vectors. Define the function $f:S\rightarrow T$ that maps each vector $s\in S$ as follows: [eq24]. Determine whether $f$ is a linear map.

Solution

Take any two vectors $s_{1},s_{2}\in S$ and any two scalars $\alpha _{1}$ and $\alpha _{2}$. We have that [eq25]. The map would be linear if the vector [eq26] were equal to zero for any two scalars $\alpha _{1}$ and $\alpha _{2}$. But the latter vector is different from zero for any choice of $\alpha _{1}$ and $\alpha _{2}$ such that [eq27]. Therefore, the map is not linear.
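Since the specific function of this exercise is not reproduced here, the following sketch uses a hypothetical affine map $f\left( s\right) =s+c$ with $c\neq 0$ to show how a numerical check can expose a failure of linearity.

```python
import numpy as np

c = np.array([1.0, -1.0])    # nonzero constant vector (hypothetical)

def f(s):
    # An affine, non-linear map, assumed only for illustration.
    return s + c

rng = np.random.default_rng(5)
s1, s2 = rng.normal(size=2), rng.normal(size=2)
a1, a2 = 2.0, 3.0

lhs = f(a1 * s1 + a2 * s2)
rhs = a1 * f(s1) + a2 * f(s2)
# The two sides differ by (1 - a1 - a2) * c, which is nonzero whenever a1 + a2 != 1.
print(np.allclose(lhs, rhs))  # False
```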

Exercise 2

Let $S=T$ be the space of all $2\times 1$ vectors. Define the function $f:S\rightarrow T$ that maps each vector $s\in S$ as follows: [eq28]. Determine whether $f$ is a linear map.

Solution

For any two vectors $s_{1},s_{2}\in S$ and any two scalars $\alpha _{1}$ and $\alpha _{2}$, we have that [eq29]. Thus, the map is linear.

How to cite

Please cite as:

Taboga, Marco (2021). "Linear map", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/linear-map.
