
Coordinate vector

by Marco Taboga, PhD

We have previously provided two definitions of a vector space: an informal one, in which vectors are arrays of numbers, and a formal one, in which a vector space is any set whose elements satisfy certain axioms.

We have also explained that the simpler informal definition is perfectly compatible with the more formal definition, as a set of numerical arrays satisfies all the properties of a vector space, provided that vector addition and scalar multiplication are defined in the usual manner and that the set is closed with respect to linear combinations.

We now introduce a new concept, that of a coordinate vector, which makes the two definitions almost equivalent: if we are dealing with an abstract vector space whose dimension is finite, and we are able to identify a basis for the space, then we can write each vector as a linear combination of the basis; as a consequence, we can represent the vector as an array, called a coordinate vector, that contains the coefficients of the linear combination. Once we have obtained this simple representation, we can apply the usual rules of matrix algebra to the coordinate vectors, even though we are dealing with an abstract vector space. Not only is this very convenient, but it also blurs the differences between the two approaches to defining vectors and vector spaces (at least in the finite-dimensional case).


Definition

We are now ready to give a definition of coordinate vector.

Definition Let $S$ be a finite-dimensional linear space. Let $B=\left\{ b_{1},\ldots ,b_{n}\right\}$ be a basis for $S$. For any $s\in S$, take the unique set of $n$ scalars $\alpha_{1},\ldots ,\alpha_{n}$ such that
$$s=\alpha_{1}b_{1}+\alpha_{2}b_{2}+\ldots +\alpha_{n}b_{n}.$$
Then, the $n\times 1$ vector
$$\left[ s\right]_{B}=\begin{bmatrix}\alpha_{1}\\\alpha_{2}\\\vdots\\\alpha_{n}\end{bmatrix}$$
is called the coordinate vector of $s$ with respect to the basis $B$.

Note that the uniqueness of the scalars $\alpha_{1},\ldots ,\alpha_{n}$ is guaranteed by the uniqueness of representations in terms of a basis.

Example Let $S$ be a vector space and [eq6] a basis for it. Suppose that a vector $s\in S$ can be written as a linear combination of the basis as follows: [eq7] Then, the coordinate vector of $s$ with respect to $B$ is [eq8]

Example Consider the space $P$ of second-order polynomials
$$p\left( z\right) =p_{0}+p_{1}z+p_{2}z^{2}$$
where the coefficients $p_{0},p_{1},p_{2}$ and the argument $z$ are scalars. As we have already discussed in the lecture on linear spaces, $P$ is a vector space, provided that the addition of polynomials and their multiplication by scalars are performed in the usual manner. Consider the polynomials
$$b_{1}\left( z\right) =1,\quad b_{2}\left( z\right) =z,\quad b_{3}\left( z\right) =z^{2}.$$
These three polynomials form a basis $B=\left\{ b_{1},b_{2},b_{3}\right\}$ for $P$ because they are linearly independent (no non-trivial linear combination of them is equal to zero for every $z$) and they can be linearly combined so as to obtain any $p\left( z\right)$ of the form above:
$$p\left( z\right) =p_{0}b_{1}\left( z\right) +p_{1}b_{2}\left( z\right) +p_{2}b_{3}\left( z\right).$$
The coordinate vector of $p\left( z\right)$ with respect to the basis we have just found is
$$\left[ p\right]_{B}=\begin{bmatrix}p_{0}\\p_{1}\\p_{2}\end{bmatrix}.$$
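As a concrete illustration (a sketch in Python with NumPy; the function name and the choice of the monomial basis $\{1,z,z^{2}\}$ are assumptions for this example), the snippet below evaluates a second-order polynomial from its coordinate vector, i.e., as the linear combination of the basis values with the coordinates as coefficients.

```python
import numpy as np

# Hypothetical encoding: a second-order polynomial p(z) = p0 + p1*z + p2*z^2
# is stored as its coordinate vector [p0, p1, p2] with respect to the
# monomial basis {1, z, z^2}.

def eval_from_coordinates(coords, z):
    """Evaluate the polynomial with coordinate vector `coords` at z,
    i.e., form the linear combination of the basis values 1, z, z^2."""
    basis_values = np.array([1.0, z, z ** 2])
    return float(coords @ basis_values)

# p(z) = 4 + 2z - 3z^2 has coordinate vector (4, 2, -3).
p_coords = np.array([4.0, 2.0, -3.0])
print(eval_from_coordinates(p_coords, 2.0))   # 4 + 4 - 12 = -4.0
```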

Addition of coordinate vectors

The addition of two vectors can be carried out by performing the usual operation of vector addition on their respective coordinate vectors.

Proposition Let $S$ be a finite-dimensional linear space and $B=\left\{ b_{1},\ldots ,b_{n}\right\}$ a basis for $S$. Let $s,t\in S$. Then, the coordinate vector of $s+t$ with respect to the basis $B$ is equal to the sum of the coordinate vectors of $s$ and $t$ with respect to the same basis, that is,
$$\left[ s+t\right]_{B}=\left[ s\right]_{B}+\left[ t\right]_{B}.$$

Proof

Suppose that the representations in terms of the basis are
$$s=\alpha_{1}b_{1}+\ldots +\alpha_{n}b_{n},\qquad t=\beta_{1}b_{1}+\ldots +\beta_{n}b_{n},$$
so that the coordinate vectors are
$$\left[ s\right]_{B}=\begin{bmatrix}\alpha_{1}\\\vdots\\\alpha_{n}\end{bmatrix},\qquad \left[ t\right]_{B}=\begin{bmatrix}\beta_{1}\\\vdots\\\beta_{n}\end{bmatrix}.$$
By the commutative and distributive properties of vector addition and scalar multiplication in abstract vector spaces, we have that
$$s+t=\left( \alpha_{1}+\beta_{1}\right) b_{1}+\ldots +\left( \alpha_{n}+\beta_{n}\right) b_{n}.$$
Thus, the coordinate vector of $s+t$ is
$$\left[ s+t\right]_{B}=\begin{bmatrix}\alpha_{1}+\beta_{1}\\\vdots\\\alpha_{n}+\beta_{n}\end{bmatrix}=\left[ s\right]_{B}+\left[ t\right]_{B}.$$
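The proposition can be checked numerically. The sketch below assumes, purely for illustration, the space of second-order polynomials with monomial basis $\{1,z,z^{2}\}$; the two polynomials are made up. It verifies that adding coordinate vectors agrees with adding the polynomials themselves.

```python
import numpy as np

s_coords = np.array([1.0, -2.0, 3.0])   # s(z) = 1 - 2z + 3z^2
t_coords = np.array([4.0, 5.0, -1.0])   # t(z) = 4 + 5z - z^2

sum_coords = s_coords + t_coords        # coordinate vector of s + t

# Compare against summing the polynomials directly at a few points.
for z in (0.0, 1.0, 2.5):
    direct = (1 - 2 * z + 3 * z ** 2) + (4 + 5 * z - z ** 2)
    via_coords = sum_coords @ np.array([1.0, z, z ** 2])
    assert np.isclose(direct, via_coords)

print(sum_coords)  # coordinate vector of (s + t)(z) = 5 + 3z + 2z^2
```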

Scalar multiplication of coordinate vectors

The multiplication of a vector by a scalar can be carried out by performing the usual operation of multiplication by a scalar on its coordinate vector.

Proposition Let $S$ be a finite-dimensional linear space and $B=\left\{ b_{1},\ldots ,b_{n}\right\}$ a basis for $S$. Let $s\in S$ and let $\alpha$ be a scalar. Then, the coordinate vector of $\alpha s$ with respect to the basis $B$ is equal to the product of $\alpha$ and the coordinate vector of $s$, that is,
$$\left[ \alpha s\right]_{B}=\alpha \left[ s\right]_{B}.$$

Proof

Suppose the representation in terms of the basis is
$$s=\alpha_{1}b_{1}+\ldots +\alpha_{n}b_{n},$$
so that the coordinate vector is
$$\left[ s\right]_{B}=\begin{bmatrix}\alpha_{1}\\\vdots\\\alpha_{n}\end{bmatrix}.$$
By the associative and distributive properties of scalar multiplication in abstract vector spaces, we have that
$$\alpha s=\left( \alpha \alpha_{1}\right) b_{1}+\ldots +\left( \alpha \alpha_{n}\right) b_{n}.$$
Thus, the coordinate vector of $\alpha s$ is
$$\left[ \alpha s\right]_{B}=\begin{bmatrix}\alpha \alpha_{1}\\\vdots\\\alpha \alpha_{n}\end{bmatrix}=\alpha \left[ s\right]_{B}.$$

Numeric arrays are coordinate vectors with respect to the canonical basis

When the elements of a linear space $S$ are one-dimensional arrays of numbers (vectors, in the simplest sense of the term), they coincide with their coordinate vectors with respect to the canonical basis. For example, let $S$ be the space of all $K\times 1$ column vectors. Let $B=\left\{ e_{1},\ldots ,e_{K}\right\}$ be its canonical basis, where $e_{k}$ is a vector whose entries are all 0, except the $k$-th, which is equal to 1:
$$e_{1}=\begin{bmatrix}1\\0\\\vdots\\0\end{bmatrix},\quad e_{2}=\begin{bmatrix}0\\1\\\vdots\\0\end{bmatrix},\quad \ldots ,\quad e_{K}=\begin{bmatrix}0\\0\\\vdots\\1\end{bmatrix}.$$
Take any
$$s=\begin{bmatrix}s_{1}\\s_{2}\\\vdots\\s_{K}\end{bmatrix}\in S.$$
Then, $s$ is the same as its coordinate vector with respect to the basis $B$, that is,
$$\left[ s\right]_{B}=s,$$
because
$$s=s_{1}e_{1}+s_{2}e_{2}+\ldots +s_{K}e_{K}.$$
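A quick numerical sketch of this fact (the value of $K$ and the entries of $s$ below are arbitrary choices): recombining the canonical basis vectors with the entries of $s$ as coefficients gives back $s$ itself, so $s$ coincides with its own coordinate vector.

```python
import numpy as np

K = 4
E = np.eye(K)                      # column k holds the canonical basis vector e_{k+1}
s = np.array([7.0, -1.0, 0.0, 3.0])

# Write s as a linear combination of the canonical basis: s = sum_k s_k e_k.
recombined = sum(s[k] * E[:, k] for k in range(K))

# The coefficients of that combination are the entries of s itself,
# so the coordinate vector of s coincides with s.
assert np.array_equal(recombined, s)
print(recombined)
```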

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Let $P$ be the vector space of all third-order polynomials. Perform the addition of the two polynomials [eq32] and [eq33] by using their coordinate vectors with respect to the basis [eq34]. Check that the result is the same as the one you would get by summing the two polynomials directly.

Solution

The representations in terms of the basis [eq35] are [eq36], so that the two coordinate vectors are [eq37]. Their sum is [eq38], so that [eq39]. This is the same result that we obtain by carrying out the addition directly: [eq40]

Exercise 2

Let $S$ be the space of all $2\times 1$ vectors. Consider the basis $B=\left\{ b_{1},b_{2}\right\}$, where [eq42]. Find the coordinate vector of [eq43] with respect to the given basis.

Solution

We have that [eq44]. Therefore, the coordinate vector of $s$ is [eq45].
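In general, finding a coordinate vector with respect to a non-canonical basis amounts to solving a linear system: if the basis vectors are stacked as the columns of a matrix, the coordinates are the solution of that system. The sketch below uses a made-up basis and vector (the exercise's actual numbers are not reproduced here).

```python
import numpy as np

# Hypothetical basis of the space of 2x1 vectors.
b1 = np.array([1.0, 1.0])
b2 = np.array([1.0, -1.0])
B = np.column_stack([b1, b2])    # basis vectors as columns

s = np.array([3.0, 1.0])

# The coordinate vector alpha satisfies B @ alpha = s.
alpha = np.linalg.solve(B, s)
print(alpha)                     # s = 2*b1 + 1*b2, so alpha = (2, 1)
```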

How to cite

Please cite as:

Taboga, Marco (2017). "Coordinate vector", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/coordinate-vector.
