
Generalized eigenvector

by Marco Taboga, PhD

The generalized eigenvectors of a matrix A are vectors that are used to form a basis together with the eigenvectors of A when the latter are not sufficient to form a basis (because the matrix is defective).

Definition

We start with a formal definition.

Definition Let A be a $K\times K$ matrix. Let $\lambda$ be an eigenvalue of A. A $K\times 1$ non-zero vector x is said to be a generalized eigenvector of A associated to the eigenvalue $\lambda$ if and only if there exists an integer $k\geq 1$ such that $\left(A-\lambda I\right)^{k}x=0$, where I is the $K\times K$ identity matrix.

Note that ordinary eigenvectors satisfy $\left(A-\lambda I\right)x=0$.

Therefore, an ordinary eigenvector is also a generalized eigenvector. However, the converse is not necessarily true.

Example Define the matrix[eq3]Its characteristic polynomial is[eq4]where in step A we have used the Laplace expansion. Thus, the only eigenvalue (with algebraic multiplicity equal to $3$) is[eq5]The vector[eq6]satisfies[eq7]Hence, x is an eigenvector of A. We have[eq8]The vector[eq9]satisfies[eq10]hence it is a generalized eigenvector.
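Computations of this kind are easy to check numerically. The following sketch uses a hypothetical defective matrix (a 3x3 Jordan block with eigenvalue 2, not the matrix of the example above) to exhibit a vector that satisfies the generalized-eigenvector condition without being an ordinary eigenvector.

```python
import numpy as np

# A hypothetical defective matrix: a 3x3 Jordan block with eigenvalue lam = 2.
# Its only eigenvalue has algebraic multiplicity 3 but geometric multiplicity 1.
lam = 2.0
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])
B = A - lam * np.eye(3)  # B = A - lam * I

e1 = np.array([1.0, 0.0, 0.0])  # an ordinary eigenvector: B e1 = 0
e2 = np.array([0.0, 1.0, 0.0])  # a generalized eigenvector that is not ordinary

print(np.allclose(B @ e1, 0))        # True: e1 is an ordinary eigenvector
print(np.allclose(B @ e2, 0))        # False: e2 is not an ordinary eigenvector
print(np.allclose(B @ (B @ e2), 0))  # True: (A - lam*I)^2 e2 = 0
```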

Equivalent definition

The following criterion can be used as an equivalent definition of generalized eigenvector.

Proposition Let A be a $K\times K$ matrix. Let $\lambda$ be an eigenvalue of A. A non-zero vector x is a generalized eigenvector of A associated to the eigenvalue $\lambda$ if and only if $\left(A-\lambda I\right)^{K}x=0$.

Proof

The "if" part is trivial, as any non-zero vector satisfying $\left(A-\lambda I\right)^{K}x=0$ is by definition a generalized eigenvector of A. Let us now prove the "only if" part. Let $S$ be the space of all $K\times 1$ vectors. Suppose that a generalized eigenvector x satisfies $\left(A-\lambda I\right)^{k}x=0$ for a given integer k. As demonstrated in the lecture on matrix powers, the null space $N\left(\left(A-\lambda I\right)^{k}\right)$ becomes larger as k increases, but it cannot become larger than $N\left(\left(A-\lambda I\right)^{K}\right)$. In other words, $N\left(\left(A-\lambda I\right)^{k}\right)\subseteq N\left(\left(A-\lambda I\right)^{K}\right)$ for any integer k. As a consequence, $x\in N\left(\left(A-\lambda I\right)^{K}\right)$, that is, x satisfies $\left(A-\lambda I\right)^{K}x=0$.
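In practice, the proposition gives a simple test: instead of searching over exponents k, raise $A-\lambda I$ to the power K once. A minimal sketch (the function name and test matrix are our own):

```python
import numpy as np

def is_generalized_eigenvector(A, lam, x, tol=1e-10):
    """Test whether x is a generalized eigenvector of A associated to lam,
    using the equivalent definition: (A - lam*I)^K x = 0 with K = A.shape[0]."""
    K = A.shape[0]
    if np.linalg.norm(x) <= tol:
        return False  # the zero vector is excluded by definition
    B = np.linalg.matrix_power(A - lam * np.eye(K), K)
    return np.linalg.norm(B @ x) <= tol

# Jordan block with eigenvalue 2: every non-zero vector is a generalized
# eigenvector for lam = 2, and none is for any other lam.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])
print(is_generalized_eigenvector(A, 2.0, np.array([0.0, 0.0, 1.0])))  # True
print(is_generalized_eigenvector(A, 3.0, np.array([0.0, 0.0, 1.0])))  # False
```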

Rank

We now define the rank of a generalized eigenvector.

Definition Let A be a $K\times K$ matrix. Let $\lambda$ be an eigenvalue of A. Let x be a generalized eigenvector of A associated to the eigenvalue $\lambda$. We say that x is a generalized eigenvector of rank k if and only if $\left(A-\lambda I\right)^{k}x=0$ and $\left(A-\lambda I\right)^{k-1}x\neq 0$.

Thus, a generalized eigenvector of rank 1 is an ordinary eigenvector.
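The rank can be computed directly from this definition as the smallest exponent that annihilates the vector. A sketch under the same convention (function name hypothetical):

```python
import numpy as np

def generalized_eigenvector_rank(A, lam, x, tol=1e-10):
    """Return the rank k of the generalized eigenvector x: the smallest k such
    that (A - lam*I)^k x = 0, or None if x is not a generalized eigenvector
    associated to lam."""
    K = A.shape[0]
    v = np.asarray(x, dtype=float)
    B = A - lam * np.eye(K)
    for k in range(1, K + 1):
        v = B @ v  # after this update, v = (A - lam*I)^k x
        if np.linalg.norm(v) <= tol:
            return k
    return None  # by the equivalent definition, k never needs to exceed K

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])  # Jordan block, eigenvalue 2
print(generalized_eigenvector_rank(A, 2.0, np.array([1.0, 0.0, 0.0])))  # 1
print(generalized_eigenvector_rank(A, 2.0, np.array([0.0, 0.0, 1.0])))  # 3
```

A rank of 1 identifies an ordinary eigenvector, matching the remark above.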

Generalized eigenspace

The set of all generalized eigenvectors associated to an eigenvalue is called a generalized eigenspace.

Definition Let A be a $K\times K$ matrix. Let $\lambda$ be an eigenvalue of A. The set of all generalized eigenvectors (plus the zero vector) $N\left(\left(A-\lambda I\right)^{K}\right)$ is called the generalized eigenspace associated to $\lambda$.

Note that we have already proved (see Equivalent definition above) that the null space $N\left(\left(A-\lambda I\right)^{K}\right)$ comprises all the generalized eigenvectors. However, it also comprises the zero vector, which is not a generalized eigenvector.

Since a generalized eigenspace is the null space of a power of $A-\lambda I$, it has two important properties:

- it is a linear subspace (because every null space is a linear subspace);

- it is invariant under the transformation $A-\lambda I$ (and hence under A), because a power of $A-\lambda I$ commutes with $A-\lambda I$ itself.

A consequence of the second point is that $\left(A-\lambda I\right)^{k}x\in N\left(\left(A-\lambda I\right)^{K}\right)$ for any $x\in N\left(\left(A-\lambda I\right)^{K}\right)$ and any k.
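Because a generalized eigenspace is the null space of $\left(A-\lambda I\right)^{K}$, a basis for it can be extracted numerically, for example from a singular value decomposition. A sketch, assuming $\lambda$ is known exactly (the matrix and tolerance are illustrative):

```python
import numpy as np

def generalized_eigenspace_basis(A, lam, tol=1e-8):
    """Orthonormal basis (as columns) of the generalized eigenspace
    N((A - lam*I)^K) associated to the eigenvalue lam."""
    K = A.shape[0]
    M = np.linalg.matrix_power(A - lam * np.eye(K), K)
    # Right singular vectors belonging to (near-)zero singular values span N(M).
    _, s, Vh = np.linalg.svd(M)
    return Vh[s <= tol * max(1.0, s[0])].T

# Hypothetical example: eigenvalue 2 with algebraic multiplicity 2 (one 2x2
# Jordan block, geometric multiplicity 1) and eigenvalue 5 with multiplicity 1.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
print(generalized_eigenspace_basis(A, 2.0).shape[1])  # 2
print(generalized_eigenspace_basis(A, 5.0).shape[1])  # 1
```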

Generalized eigenspaces have only the zero vector in common

Two generalized eigenspaces corresponding to two distinct eigenvalues have only the zero vector in common.

Proposition Let A be a $K\times K$ matrix. Let $\lambda _{1}$ and $\lambda _{2}$ be two distinct eigenvalues of A (i.e., $\lambda _{1}\neq \lambda _{2}$). Then, their generalized eigenspaces satisfy $N\left(\left(A-\lambda _{1}I\right)^{K}\right)\cap N\left(\left(A-\lambda _{2}I\right)^{K}\right)=\left\{0\right\}$.

Proof

We are going to use the following notation: $\mathcal{N}_{1}=N\left(\left(A-\lambda _{1}I\right)^{K}\right)$ and $\mathcal{N}_{2}=N\left(\left(A-\lambda _{2}I\right)^{K}\right)$. The proof is by contradiction. Suppose that x is a non-zero vector belonging to the intersection of the two generalized eigenspaces. Let k be the smallest integer such that $\left(A-\lambda _{1}I\right)^{k}x=0$, so that $\left(A-\lambda _{1}I\right)^{k-1}x\neq 0$, which implies that $y=\left(A-\lambda _{1}I\right)^{k-1}x$ is an eigenvector associated to $\lambda _{1}$. Clearly, $x\in \mathcal{N}_{2}$. Note that $y$ is obtained by repeatedly applying to x the transformation $A-\lambda _{1}I=\left(A-\lambda _{2}I\right)+\left(\lambda _{2}-\lambda _{1}\right)I$, which maps $\mathcal{N}_{2}$ into itself because both $\left(\lambda _{2}-\lambda _{1}\right)I$ and $A-\lambda _{2}I$ map $\mathcal{N}_{2}$ into itself (the latter by the invariance property discussed above). Thus, not only $x\in \mathcal{N}_{2}$, but also $y\in \mathcal{N}_{2}$, that is, $\left(A-\lambda _{2}I\right)^{K}y=0$. Since $y$ is an eigenvector, it is non-zero and $\left(A-\lambda _{1}I\right)y=0$. Moreover, it is an eigenvector associated to $\lambda _{1}$, which implies that it cannot also be an eigenvector associated to $\lambda _{2}$ (because eigenvectors corresponding to distinct eigenvalues are linearly independent). As a consequence, $\left(A-\lambda _{2}I\right)y=\left(\lambda _{1}-\lambda _{2}\right)y\neq 0$. For any $j>1$, we have $\left(A-\lambda _{2}I\right)^{j}y=\left(\lambda _{1}-\lambda _{2}\right)^{j}y$, or $\left(A-\lambda _{2}I\right)^{j}y\neq 0$. Hence, $y$ is not a generalized eigenvector associated to $\lambda _{2}$. But we have proved above that $y\in \mathcal{N}_{2}$. Thus, we have arrived at a contradiction. As a consequence, the zero vector is the only vector belonging to the intersection of the two generalized eigenspaces.

The minimal polynomial again

Thanks to what we have discovered in this lecture, we can improve our understanding of minimal polynomials.

Remember that the minimal polynomial of A is the annihilating polynomial (i.e., a polynomial p such that $p\left(A\right)=0$) having the lowest possible degree. It can be written in terms of A as $p\left(A\right)=\left(A-\lambda _{1}I\right)^{\nu _{1}}\left(A-\lambda _{2}I\right)^{\nu _{2}}\cdots \left(A-\lambda _{m}I\right)^{\nu _{m}}$, where $\lambda _{1},\ldots ,\lambda _{m}$ are the distinct eigenvalues of A.

Proposition Let A be a $K\times K$ matrix. Let p be the minimal polynomial of A: $p\left(A\right)=\left(A-\lambda _{1}I\right)^{\nu _{1}}\left(A-\lambda _{2}I\right)^{\nu _{2}}\cdots \left(A-\lambda _{m}I\right)^{\nu _{m}}$. Then, for $j=1,\ldots ,m$, the exponent $\nu _{j}$ is the highest rank attained by the generalized eigenvectors associated to $\lambda _{j}$.

Proof

We are going to prove the proposition for the case $j=1$. The other cases are similar. First of all, we are going to prove that there exists a generalized eigenvector of rank $\nu _{1}$. Define $q\left(A\right)=\left(A-\lambda _{1}I\right)^{\nu _{1}-1}\left(A-\lambda _{2}I\right)^{\nu _{2}}\cdots \left(A-\lambda _{m}I\right)^{\nu _{m}}$ and note that $q\left(A\right)\neq 0$, because otherwise p would not be minimal. Hence, there exists a non-zero $K\times 1$ vector $s$ such that $q\left(A\right)s\neq 0$. Define $x=\left(A-\lambda _{2}I\right)^{\nu _{2}}\cdots \left(A-\lambda _{m}I\right)^{\nu _{m}}s$. Then, $\left(A-\lambda _{1}I\right)^{\nu _{1}-1}x=q\left(A\right)s\neq 0$ and $\left(A-\lambda _{1}I\right)^{\nu _{1}}x=p\left(A\right)s=0$, which implies that x is a generalized eigenvector of rank $\nu _{1}$. We now need to prove that there do not exist generalized eigenvectors of higher rank. The proof is by contradiction. Suppose that there exists a generalized eigenvector $y$ of higher rank, that is, such that $\left(A-\lambda _{1}I\right)^{\nu }y=0$ and $\left(A-\lambda _{1}I\right)^{\nu -1}y\neq 0$ for some $\nu >\nu _{1}$. Define $q=\left(A-\lambda _{1}I\right)^{\nu _{1}}y$, which is non-zero because $\nu _{1}\leq \nu -1$. Factor the minimal polynomial p as $p\left(A\right)=r\left(A\right)\left(A-\lambda _{1}I\right)^{\nu _{1}}$, where $r\left(A\right)=\left(A-\lambda _{2}I\right)^{\nu _{2}}\cdots \left(A-\lambda _{m}I\right)^{\nu _{m}}$. Since p is an annihilating polynomial, we have $0=p\left(A\right)y=r\left(A\right)\left(A-\lambda _{1}I\right)^{\nu _{1}}y=r\left(A\right)q$, which implies that $q\in N\left(r\left(A\right)\right)$. But $\left(A-\lambda _{1}I\right)^{\nu -\nu _{1}}q=\left(A-\lambda _{1}I\right)^{\nu }y=0$, which implies that $q\in N\left(\left(A-\lambda _{1}I\right)^{K}\right)$. This is impossible since $q$ is non-zero, and $N\left(\left(A-\lambda _{1}I\right)^{K}\right)$ and $N\left(r\left(A\right)\right)$ have only the zero vector in common (they can be used to form a direct sum, as demonstrated in the lecture on the Primary Decomposition Theorem; therefore, they must have only the zero vector in common). Hence, we have arrived at a contradiction and the initial assumption that there exists a generalized eigenvector of rank $\nu >\nu _{1}$ must be wrong.

Thus, the exponent $\nu _{j}$ in the minimal polynomial provides two key pieces of information:

- there exists at least one generalized eigenvector of rank $\nu _{j}$ associated to $\lambda _{j}$;

- no generalized eigenvector associated to $\lambda _{j}$ can have rank greater than $\nu _{j}$.

A rather important consequence of these two points is that $N\left(\left(A-\lambda _{j}I\right)^{\nu _{j}}\right)=N\left(\left(A-\lambda _{j}I\right)^{K}\right)$, which is proved in detail in a solved exercise at the end of this lecture.

In other words, the generalized eigenspace associated to $\lambda _{j}$ is the null space of $\left(A-\lambda _{j}I\right)^{\nu _{j}}$.

We already knew that the null spaces form an increasing sequence: $N\left(A-\lambda _{j}I\right)\subseteq N\left(\left(A-\lambda _{j}I\right)^{2}\right)\subseteq N\left(\left(A-\lambda _{j}I\right)^{3}\right)\subseteq \ldots$

But the exponent $\nu _{j}$ tells us exactly when the null spaces stop growing: $N\left(\left(A-\lambda _{j}I\right)^{\nu _{j}-1}\right)\subset N\left(\left(A-\lambda _{j}I\right)^{\nu _{j}}\right)=N\left(\left(A-\lambda _{j}I\right)^{\nu _{j}+1}\right)=\ldots$, where $\subset$ denotes strict inclusion.

Thus, using the terminology introduced in the lecture on the Range null-space decomposition, $\nu _{j}$ is the index of the matrix $A-\lambda _{j}I$.
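The index of $A-\lambda _{j}I$ can be found numerically by tracking the nullity of $\left(A-\lambda _{j}I\right)^{k}$ until it stops growing. A sketch (the helper name `index_of` is our own):

```python
import numpy as np

def index_of(A, lam, tol=1e-8):
    """Return the smallest k at which the null space of (A - lam*I)^k stops
    growing, i.e., the index of A - lam*I. Returns 0 if lam is not an
    eigenvalue of A."""
    K = A.shape[0]
    B = A - lam * np.eye(K)

    def nullity(M):
        # Nullity = number of (near-)zero singular values.
        s = np.linalg.svd(M, compute_uv=False)
        return int(np.sum(s <= tol * max(1.0, s[0])))

    prev = 0
    for k in range(1, K + 1):
        cur = nullity(np.linalg.matrix_power(B, k))
        if cur == prev:  # the chain of null spaces has stabilized
            return k - 1
        prev = cur
    return K

# Jordan block of size 3 with eigenvalue 2: the null spaces grow until k = 3.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])
print(index_of(A, 2.0))  # 3
```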

The primary decomposition theorem revisited

Let $S$ be the space of all $K\times 1$ vectors and A a $K\times K$ matrix.

In a previous lecture we have proved the Primary Decomposition Theorem, which states that the vector space $S$ can be written as $S=N\left(\left(A-\lambda _{1}I\right)^{\nu _{1}}\right)\oplus \ldots \oplus N\left(\left(A-\lambda _{m}I\right)^{\nu _{m}}\right)$, where $\oplus$ denotes a direct sum, $\lambda _{1},\ldots ,\lambda _{m}$ are the distinct eigenvalues of A and $\nu _{1},\ldots ,\nu _{m}$ are the same strictly positive integers that appear in the minimal polynomial.

As a consequence, by the definition of direct sum, we are able to uniquely write each vector $s\in S$ as $s=x_{1}+x_{2}+\ldots +x_{m}$ (1), where $x_{j}\in N\left(\left(A-\lambda _{j}I\right)^{\nu _{j}}\right)$ for $j=1,\ldots ,m$.

Thus, we can restate the Primary Decomposition Theorem using the terminology introduced in this lecture: the vector space $S$ can be written as a direct sum of generalized eigenspaces, and every vector $s\in S$ can be written as a sum of generalized eigenvectors corresponding to distinct eigenvalues.

Bases of generalized eigenvectors

An immediate consequence of the Primary Decomposition Theorem, as restated above, follows.

Proposition Let $S$ be the space of all Kx1 vectors. Let A be a matrix. Then, there exists a basis for $S$ formed by generalized eigenvectors of A.

Proof

Choose a basis for each generalized eigenspace and write each vector $x_{j}$ in equation (1) as a linear combination of the basis of $N\left(\left(A-\lambda _{j}I\right)^{\nu _{j}}\right)$. Thus, we can write any $s\in S$ as a linear combination of generalized eigenvectors, and the union of the bases of the generalized eigenspaces spans $S$. The vectors of the union are linearly independent because $S$ is a direct sum of the generalized eigenspaces. Hence, the union is a basis for $S$.

It is interesting to contrast this result with the result discussed in the lecture on the linear independence of eigenvectors: while it is not always possible to form a basis of (ordinary) eigenvectors for $S$, it is always possible to form a basis of generalized eigenvectors!
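A basis of generalized eigenvectors can be assembled by stacking bases of the individual generalized eigenspaces. The sketch below does this for a hypothetical defective matrix; the eigenvalue grouping by rounding is a crude device that assumes well-separated eigenvalues.

```python
import numpy as np

def generalized_eigenvector_basis(A, tol=1e-8):
    """Stack orthonormal bases of the generalized eigenspaces of A (one per
    distinct eigenvalue) into the columns of a single K x K matrix."""
    K = A.shape[0]
    # Crude grouping of numerically equal eigenvalues by rounding.
    eigs = np.unique(np.round(np.linalg.eigvals(A), 8))
    cols = []
    for lam in eigs:
        M = np.linalg.matrix_power(A - lam * np.eye(K), K)
        _, s, Vh = np.linalg.svd(M)
        cols.append(Vh[s <= tol * max(1.0, s[0])].T)  # basis of N(M)
    return np.hstack(cols)

# Defective matrix: eigenvalue 2 (algebraic multiplicity 2, geometric 1)
# and eigenvalue 5. No basis of ordinary eigenvectors exists, but a basis
# of generalized eigenvectors does.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
V = generalized_eigenvector_basis(A)
print(V.shape)                       # (3, 3)
print(abs(np.linalg.det(V)) > 1e-8)  # True: the columns form a basis
```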

Dimensions of the generalized eigenspaces

The dimension of each generalized eigenspace is equal to the algebraic multiplicity of the corresponding eigenvalue.

Proposition Let A be a $K\times K$ matrix. Let $\lambda$ be an eigenvalue of A having algebraic multiplicity equal to $\mu$. Let $N_{\lambda }=N\left(\left(A-\lambda I\right)^{K}\right)$ be the generalized eigenspace associated to $\lambda$. Then, the dimension of $N_{\lambda }$ is $\mu$.

Proof

By the Schur decomposition theorem, there exists a unitary matrix $Q$ such that $A=QTQ^{\ast }$, where $T$ is upper triangular and $Q^{\ast }$ denotes the conjugate transpose of $Q$. Since A and $T$ are similar, they have the same eigenvalues. Moreover, the Schur decomposition can be performed in such a way that the last $\mu$ entries on the diagonal of $T$ are equal to $\lambda$. We have $A-\lambda I=Q\left(T-\lambda I\right)Q^{\ast }$ and $\left(A-\lambda I\right)^{K}=Q\left(T-\lambda I\right)^{K}Q^{\ast }$. Write the matrix $T-\lambda I$ as a block-triangular matrix $T-\lambda I=\begin{bmatrix}B&\ast \\0&C\end{bmatrix}$, where $B$ is a $\left(K-\mu \right)\times \left(K-\mu \right)$ upper triangular matrix with non-zero entries on its main diagonal, $C$ is a $\mu \times \mu$ upper triangular matrix with zero entries on its main diagonal and $\ast$ denotes a generic block of possibly non-zero entries. We have $\left(T-\lambda I\right)^{K}=\begin{bmatrix}B^{K}&\ast \\0&C^{K}\end{bmatrix}$. The block $B^{K}$ is upper triangular and its diagonal entries are non-zero, while $C^{K}=0$ (by a simple induction argument that is very similar to that used in the proof of the Cayley-Hamilton theorem). The first $K-\mu$ rows of $\left(T-\lambda I\right)^{K}$ are clearly linearly independent, while the last $\mu$ rows are zero. Therefore, the rank of $\left(T-\lambda I\right)^{K}$ is $K-\mu$. Since $Q$ is full-rank, and multiplication by a full-rank square matrix preserves rank, $\left(A-\lambda I\right)^{K}$ also has rank $K-\mu$. Then, the rank-nullity theorem allows us to obtain the desired result: $\dim N_{\lambda }=K-\operatorname{rank}\left(\left(A-\lambda I\right)^{K}\right)=K-\left(K-\mu \right)=\mu$.
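The proposition can be spot-checked numerically: the nullity of $\left(A-\lambda I\right)^{K}$ should equal the algebraic multiplicity of $\lambda$ even when the ordinary eigenspace is smaller. A sketch with a hypothetical test matrix:

```python
import numpy as np

# Hypothetical test matrix: eigenvalue 4 with algebraic multiplicity 2
# (a single 2x2 Jordan block, so geometric multiplicity 1) and eigenvalue 7.
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 4.0, 0.0],
              [0.0, 0.0, 7.0]])
K = A.shape[0]

def nullity(M, tol=1e-8):
    s = np.linalg.svd(M, compute_uv=False)
    return int(np.sum(s <= tol * max(1.0, s[0])))

geometric = nullity(A - 4.0 * np.eye(K))  # dimension of the ordinary eigenspace
algebraic = nullity(np.linalg.matrix_power(A - 4.0 * np.eye(K), K))  # generalized
print(geometric, algebraic)  # 1 2
```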

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

In an example above we have found two generalized eigenvectors of the matrix[eq86]Can you find a third generalized eigenvector so as to complete the basis of generalized eigenvectors?

Solution

We have already found the generalized eigenvector [eq6]satisfying[eq88]and the generalized eigenvector[eq9]satisfying[eq90]Now, we compute[eq91]and, for example, the vector[eq92]satisfies[eq93]Moreover, [eq94] is a basis for the space of $3\times 1$ vectors (it is the so-called standard basis).

Exercise 2

Let A be a $K\times K$ matrix. Let $\lambda _{j}$ be an eigenvalue of A and $\nu _{j}$ its corresponding exponent in the minimal polynomial. We have proved that there exists at least one generalized eigenvector of rank $\nu _{j}$ associated to $\lambda _{j}$ and that no generalized eigenvector associated to $\lambda _{j}$ can have rank greater than $\nu _{j}$. Explain in detail why these facts imply that $N\left(\left(A-\lambda _{j}I\right)^{\nu _{j}}\right)=N\left(\left(A-\lambda _{j}I\right)^{K}\right)$.

Solution

Since $\nu _{j}$ is less than or equal to the algebraic multiplicity of $\lambda _{j}$, and the latter is less than or equal to K, we have $\nu _{j}\leq K$. Hence, by the properties of matrix powers, $N\left(\left(A-\lambda _{j}I\right)^{\nu _{j}}\right)\subseteq N\left(\left(A-\lambda _{j}I\right)^{K}\right)$ (2). Now suppose that $x\in N\left(\left(A-\lambda _{j}I\right)^{K}\right)$, so that $\left(A-\lambda _{j}I\right)^{K}x=0$. Since no generalized eigenvector can have rank greater than $\nu _{j}$, it must be that $\left(A-\lambda _{j}I\right)^{\nu _{j}}x=0$. Hence, $x\in N\left(\left(A-\lambda _{j}I\right)^{\nu _{j}}\right)$ and $N\left(\left(A-\lambda _{j}I\right)^{K}\right)\subseteq N\left(\left(A-\lambda _{j}I\right)^{\nu _{j}}\right)$ (3). The stated result is obtained by combining (2) and (3).

How to cite

Please cite as:

Taboga, Marco (2021). "Generalized eigenvector", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/generalized-eigenvector.
