In this chapter, the linear transformations are from a given finite dimensional vector space to itself. Observe that in this case, the matrix of the linear transformation is a square matrix. So, in this chapter, all the matrices are square matrices and *a vector* means $\mathbf{x} \in \mathbb{F}^n$ for some positive integer $n.$

To get a solution, let $A$ be a matrix of order $n.$ In general, we ask the question:

For what values of $\lambda \in \mathbb{F}$ does there exist a non-zero vector $\mathbf{x}$ such that

$$A \mathbf{x} = \lambda \mathbf{x}? \qquad (6.1.4)$$

Here, $\mathbb{F}^n$ stands for either the vector space $\mathbb{R}^n$ over $\mathbb{R}$ or $\mathbb{C}^n$ over $\mathbb{C}.$ Equation (6.1.4) is equivalent to the equation

$$(A - \lambda I)\mathbf{x} = \mathbf{0}.$$

By Theorem 2.5.1, this system of linear equations has a non-zero solution if

$$\det(A - \lambda I) = 0.$$

So, to solve (6.1.4), we are forced to choose those values of $\lambda$ for which $\det(A - \lambda I) = 0.$ Observe that $\det(A - \lambda I)$ is a polynomial in $\lambda$ of degree $n.$ We are therefore led to the following definition.
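For a $2 \times 2$ matrix, $\det(A - \lambda I) = \lambda^2 - \text{tr}(A)\lambda + \det(A),$ so the characteristic roots come straight from the quadratic formula. A minimal Python sketch of this computation (the helper name and sample matrix are illustrative, not from the text):

```python
import math

def eigenvalues_2x2(a, b, c, d):
    """Real roots of det(A - lam*I) = lam^2 - tr(A)*lam + det(A) = 0
    for A = [[a, b], [c, d]]; raises if the roots are complex."""
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det          # discriminant of the characteristic quadratic
    if disc < 0:
        raise ValueError("no real eigenvalues (characteristic roots are complex)")
    s = math.sqrt(disc)
    return (tr + s) / 2, (tr - s) / 2

# Example: A = [[1, 2], [2, 1]] has characteristic equation lam^2 - 2*lam - 3 = 0.
print(eigenvalues_2x2(1, 2, 2, 1))   # (3.0, -1.0)
```

For larger matrices one would use a library routine rather than root formulas, but the $2 \times 2$ case makes the definition concrete.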

*Some books use the term EIGENVALUE in place of characteristic value.*

That is, $\lambda \in \mathbb{F}$ is a characteristic value of $A$ if and only if the linear system $(A - \lambda I)\mathbf{x} = \mathbf{0}$ has a non-zero solution. $\blacksquare$

If there exist a non-zero vector $\mathbf{x}$ and a scalar $\lambda$ such that $A\mathbf{x} = \lambda\mathbf{x},$ then

- $\lambda$ is called an eigenvalue of $A,$
- $\mathbf{x}$ is called an eigenvector corresponding to the eigenvalue $\lambda$ of $A,$ and
- the tuple $(\lambda, \mathbf{x})$ is called an eigenpair.
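The defining identity $A\mathbf{x} = \lambda\mathbf{x}$ is easy to check numerically. A small sketch (the helper name and tolerance are illustrative assumptions):

```python
def is_eigenpair(A, lam, x, tol=1e-9):
    """Check A @ x == lam * x entrywise, for a small dense matrix given as
    nested lists. Returns True exactly when (lam, x) is an eigenpair of A."""
    Ax = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]
    return all(abs(Ax[i] - lam * x[i]) <= tol for i in range(len(x)))

A = [[1, 2], [2, 1]]
print(is_eigenpair(A, 3, [1, 1]))    # True: A(1,1)^T = (3,3)^T = 3*(1,1)^T
print(is_eigenpair(A, 2, [1, 1]))    # False
```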

*Consider the matrix $A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}.$ Then the characteristic polynomial of $A$ is $p(\lambda) = \lambda^2 + 1.$*

- If $\mathbb{F} = \mathbb{C},$ that is, if $A$ is considered a COMPLEX matrix, then the roots of $p(\lambda) = 0$ in $\mathbb{C}$ are $\pm i.$ So, $A$ has $(i, (1, i)^T)$ and $(-i, (1, -i)^T)$ as eigenpairs.
- If $\mathbb{F} = \mathbb{R},$ that is, if $A$ is considered a REAL matrix, then $p(\lambda) = 0$ has no solution in $\mathbb{R}.$ Therefore, if $\mathbb{F} = \mathbb{R},$ then $A$ has no eigenvalue but it has $\pm i$ as characteristic values.
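Working over $\mathbb{C}$ always yields both characteristic roots, which is the cleanest way to see the real/complex distinction above. A sketch using Python's `cmath` (the helper name is an assumption; the sample matrix is any real matrix with characteristic polynomial $\lambda^2 + 1$):

```python
import cmath

def char_roots_2x2(a, b, c, d):
    """Both characteristic roots of [[a, b], [c, d]], computed over C, so
    complex roots are returned instead of raising an error."""
    tr, det = a + d, a * d - b * c
    s = cmath.sqrt(tr * tr - 4 * det)   # cmath.sqrt handles negative discriminants
    return (tr + s) / 2, (tr - s) / 2

print(char_roots_2x2(0, 1, -1, 0))   # (1j, -1j): no real eigenvalue exists
print(char_roots_2x2(1, 2, 2, 1))    # ((3+0j), (-1+0j)): real eigenvalues
```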

*Suppose $\lambda_0 \in \mathbb{F}$ is a root of the characteristic equation $\det(A - \lambda I) = 0.$ Then $A - \lambda_0 I$ is singular and $\text{rank}(A - \lambda_0 I) < n.$ Suppose $\text{rank}(A - \lambda_0 I) = r < n.$ Then by Corollary 4.3.9, the linear system $(A - \lambda_0 I)\mathbf{x} = \mathbf{0}$ has $n - r$ linearly independent solutions. That is, $A$ has $n - r$ linearly independent eigenvectors corresponding to the eigenvalue $\lambda_0$ whenever $\text{rank}(A - \lambda_0 I) = r < n.$*

- Let $A = \text{diag}(d_1, d_2, \ldots, d_n)$ with $d_i \in \mathbb{R}$ for $1 \le i \le n.$ Then $\prod_{i=1}^{n}(\lambda - d_i) = 0$ is the characteristic equation. So, the eigenpairs are $(d_1, \mathbf{e}_1), (d_2, \mathbf{e}_2), \ldots, (d_n, \mathbf{e}_n).$
- Let $A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}.$ Then $\det(A - \lambda I) = (1 - \lambda)^2.$ Hence, the characteristic equation has roots $1, 1.$ That is, $1$ is a repeated eigenvalue. Now check that the equation $(A - I)\mathbf{x} = \mathbf{0}$ for $\mathbf{x} = (x_1, x_2)^T$ is equivalent to the equation $x_2 = 0.$ And this has the solution $\mathbf{x} = (x_1, 0)^T.$ Hence, from the above remark, $(1, 0)^T$ is a representative for the eigenvector. Therefore, HERE WE HAVE TWO EIGENVALUES $1, 1$ BUT ONLY ONE EIGENVECTOR.
- Let $A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$ Then $\det(A - \lambda I) = (1 - \lambda)^2.$ The characteristic equation has roots $1, 1.$ Here, the matrix that we have is $I_2$ and we know that $I_2\mathbf{x} = \mathbf{x}$ for every $\mathbf{x} \in \mathbb{R}^2$ and we can CHOOSE ANY TWO LINEARLY INDEPENDENT VECTORS $\mathbf{x}, \mathbf{y}$ from $\mathbb{R}^2$ to get $(1, \mathbf{x})$ and $(1, \mathbf{y})$ as the two eigenpairs. In general, if $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n$ are linearly independent vectors in $\mathbb{R}^n,$ then $(1, \mathbf{x}_1), (1, \mathbf{x}_2), \ldots, (1, \mathbf{x}_n)$ are eigenpairs for the identity matrix, $I_n.$

- Let $A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}.$ Then $\det(A - \lambda I) = (\lambda - 3)(\lambda + 1).$ The characteristic equation has roots $3, -1.$ Now check that the eigenpairs are $(3, (1, 1)^T)$ and $(-1, (1, -1)^T).$ In this case, we have TWO DISTINCT EIGENVALUES AND THE CORRESPONDING EIGENVECTORS ARE ALSO LINEARLY INDEPENDENT. The reader is required to prove the linear independence of the two eigenvectors.
- Let $A = \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}.$ Then $\det(A - \lambda I) = \lambda^2 - 2\lambda + 2.$ The characteristic equation has roots $1 + i, 1 - i.$ Hence, over $\mathbb{R}$ the matrix $A$ has no eigenvalue. Over $\mathbb{C},$ the reader is required to show that the eigenpairs are $(1 + i, (i, 1)^T)$ and $(1 - i, (1, i)^T).$
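The examples above hinge on how many linearly independent solutions $(A - \lambda I)\mathbf{x} = \mathbf{0}$ has, i.e. on $n - \text{rank}(A - \lambda I).$ A sketch for the $2 \times 2$ case (helper name and tolerance are illustrative assumptions):

```python
def eigenspace_dim_2x2(A, lam, tol=1e-12):
    """Dimension of the solution space of (A - lam*I)x = 0 for 2x2 A,
    i.e. 2 - rank(A - lam*I)."""
    a, b = A[0][0] - lam, A[0][1]
    c, d = A[1][0], A[1][1] - lam
    if all(abs(t) <= tol for t in (a, b, c, d)):
        return 2                      # A - lam*I is the zero matrix: rank 0
    if abs(a * d - b * c) <= tol:
        return 1                      # singular but non-zero: rank 1
    return 0                          # non-singular: lam is not an eigenvalue

print(eigenspace_dim_2x2([[1, 1], [0, 1]], 1))   # 1: only one independent eigenvector
print(eigenspace_dim_2x2([[1, 0], [0, 1]], 1))   # 2: any basis of R^2 works
```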

- Find the eigenvalues of a triangular matrix.
- Find eigenpairs over $\mathbb{C}$ for each of the following matrices:
- Let $A$ and $B$ be similar matrices.
  - Then prove that $A$ and $B$ have the same set of eigenvalues.
  - Let $(\lambda, \mathbf{x})$ be an eigenpair for $A$ and $(\lambda, \mathbf{y})$ be an eigenpair for $B.$ What is the relationship between the vectors $\mathbf{x}$ and $\mathbf{y}$?
    [*Hint: Recall that if the matrices $A$ and $B$ are similar, then there exists a non-singular matrix $P$ such that $B = P^{-1}AP.$*]

- Let $A$ be an $n \times n$ matrix. Suppose that $\sum_{j=1}^{n} a_{ij} = a$ for all $1 \le i \le n.$ Then prove that $a$ is an eigenvalue of $A.$ What is the corresponding eigenvector?
- Prove that the matrices $A$ and $A^T$ have the same set of eigenvalues. Construct a matrix $A$ such that the eigenvectors of $A$ and $A^T$ are different.
- Let $A$ be a matrix such that $A^2 = A$ ($A$ is called an idempotent matrix). Then prove that its eigenvalues are either 0 or 1 or both.
- Let $A$ be a matrix such that $A^k = \mathbf{0}$ for some positive integer $k$ ($A$ is called a nilpotent matrix). Then prove that its eigenvalues are all 0.
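The constant-row-sum exercise above has a concrete mechanism worth seeing once: multiplying by the all-ones vector simply sums each row. A quick numerical illustration (the sample matrix is an arbitrary choice, not from the text):

```python
# Every row of A sums to 6, so the all-ones vector is an eigenvector
# with eigenvalue 6: A @ (1,1,1)^T sums each row.
A = [[2, 3, 1], [1, 4, 1], [0, 5, 1]]
ones = [1, 1, 1]
Ax = [sum(row[j] * ones[j] for j in range(3)) for row in A]
print(Ax)   # [6, 6, 6], i.e. 6 * ones
```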

Equation (6.1.5), namely $\det(A - \lambda I) = \prod_{i=1}^{n}(\lambda_i - \lambda),$ is an identity in $\lambda$ as polynomials. Therefore, by substituting $\lambda = 0$ in (6.1.5), we get

$$\det(A) = \prod_{i=1}^{n} \lambda_i.$$

Also,

$$\det(A - \lambda I) = a_0 + a_1 \lambda + \cdots + a_{n-1}\lambda^{n-1} + (-1)^n \lambda^n \qquad (6.1.7)$$

for some $a_0, a_1, \ldots, a_{n-1} \in \mathbb{F}.$ Note that the coefficient of $\lambda^{n-1}$ comes from the product

$$(a_{11} - \lambda)(a_{22} - \lambda)\cdots(a_{nn} - \lambda).$$

So, $a_{n-1} = (-1)^{n-1}(a_{11} + a_{22} + \cdots + a_{nn}) = (-1)^{n-1}\,\text{tr}(A),$ by definition of trace.

But, from (6.1.5) and (6.1.7), we get

$$\prod_{i=1}^{n}(\lambda_i - \lambda) = a_0 + a_1 \lambda + \cdots + a_{n-1}\lambda^{n-1} + (-1)^n \lambda^n.$$

Therefore, comparing the coefficient of $\lambda^{n-1},$ we have

$$(-1)^{n-1}\sum_{i=1}^{n}\lambda_i = a_{n-1} = (-1)^{n-1}\,\text{tr}(A), \quad \text{and hence} \quad \text{tr}(A) = \sum_{i=1}^{n}\lambda_i.$$

Hence, we get the required result. $\blacksquare$
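The two identities just proved, $\text{tr}(A) = \sum_i \lambda_i$ and $\det(A) = \prod_i \lambda_i,$ can be checked numerically for a $2 \times 2$ matrix (the sample matrix is illustrative):

```python
import math

# Verify tr(A) = lam1 + lam2 and det(A) = lam1 * lam2 for A = [[1, 2], [2, 1]],
# whose characteristic roots are 3 and -1.
a, b, c, d = 1, 2, 2, 1
tr, det = a + d, a * d - b * c
s = math.sqrt(tr * tr - 4 * det)          # real here since the roots are real
lam1, lam2 = (tr + s) / 2, (tr - s) / 2
print(lam1 + lam2 == tr, lam1 * lam2 == det)   # True True
```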

- Let $A$ be a skew symmetric matrix of odd order. Then prove that 0 is an eigenvalue of $A.$
- Let $B$ be a $3 \times 3$ orthogonal matrix. If $\det(B) = 1,$ then prove that there exists a non-zero vector $\mathbf{v}$ such that $B\mathbf{v} = \mathbf{v}.$

Let $A$ be an $n \times n$ matrix. Then, in the proof of the above theorem, we observed that the characteristic equation $\det(A - \lambda I) = 0$ is a polynomial equation of degree $n$ in $\lambda.$ Also, for some numbers $a_0, a_1, \ldots, a_{n-1} \in \mathbb{F},$ it has the form

$$a_0 + a_1 \lambda + \cdots + a_{n-1}\lambda^{n-1} + (-1)^n \lambda^n = 0.$$

Note that, in the expression $\det(A - \lambda I) = 0,$ $\lambda$ is an element of $\mathbb{F}.$ Thus, we can only substitute $\lambda$ by elements of $\mathbb{F}.$

It turns out that the expression

$$a_0 I + a_1 A + \cdots + a_{n-1}A^{n-1} + (-1)^n A^n = \mathbf{0}$$

holds true as a matrix identity. This is a celebrated theorem called the Cayley Hamilton Theorem. We state this theorem without proof and give some implications.
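For a $2 \times 2$ matrix the theorem reads $A^2 - \text{tr}(A)\,A + \det(A)\,I = \mathbf{0},$ which is easy to verify directly (the sample matrix is illustrative):

```python
def matmul2(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Cayley-Hamilton for 2x2: A^2 - tr(A)*A + det(A)*I should be the zero matrix.
A = [[1, 2], [3, 4]]
tr, det = A[0][0] + A[1][1], A[0][0] * A[1][1] - A[0][1] * A[1][0]
A2 = matmul2(A, A)
residue = [[A2[i][j] - tr * A[i][j] + det * (1 if i == j else 0)
            for j in range(2)] for i in range(2)]
print(residue)   # [[0, 0], [0, 0]]
```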

Some of the implications of the Cayley Hamilton Theorem are as follows.

- Let $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.$ Then its characteristic polynomial is $p(\lambda) = \lambda^2.$ Also, for the function $f(x) = x,$ we have $f(A) = A \ne \mathbf{0}$ while $f(\lambda) = 0$ for the only eigenvalue $\lambda = 0.$ This shows that the condition $f(\lambda) = 0$ for each eigenvalue $\lambda$ of $A$ does not imply that $f(A) = \mathbf{0}.$
- Suppose we are given a square matrix $A$ of order $n$ and we are interested in calculating $A^k,$ where $k$ is large compared to $n.$ Then we can use the division algorithm to find numbers $\alpha_0, \alpha_1, \ldots, \alpha_{n-1}$ and a polynomial $f(\lambda)$ such that

$$\lambda^k = f(\lambda)\det(A - \lambda I) + \alpha_0 + \alpha_1\lambda + \cdots + \alpha_{n-1}\lambda^{n-1}.$$

Hence, by the Cayley Hamilton Theorem, $A^k = \alpha_0 I + \alpha_1 A + \cdots + \alpha_{n-1}A^{n-1}.$ In the language of graph theory, it says the following:

``Let $G$ be a graph on $n$ vertices. Suppose there is no path of length $n - 1$ or less from a vertex $u$ to a vertex $v$ of $G.$ Then there is no path from $u$ to $v$ of any length. That is, the graph is disconnected and $u$ and $v$ are in different components.''
- Let $A$ be a non-singular matrix of order $n.$ Then note that $a_0 = \det(A) \ne 0$ and

$$A^{-1} = \frac{-1}{a_0}\left[a_1 I + a_2 A + \cdots + a_{n-1}A^{n-2} + (-1)^n A^{n-1}\right].$$

Note that the matrix $A^{-1}$ (as an element of the vector space of all $n \times n$ matrices) is a linear combination of the matrices $I, A, \ldots, A^{n-1}.$
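In the $2 \times 2$ case this inverse formula collapses to $A^{-1} = (\text{tr}(A)\,I - A)/\det(A)$: the inverse really is a linear combination of $I$ and $A.$ A quick sketch (the sample matrix is illustrative):

```python
# From A^2 - tr(A)*A + det(A)*I = 0 (Cayley-Hamilton for 2x2), multiply by
# A^{-1} and rearrange: A^{-1} = (tr(A)*I - A) / det(A).
A = [[1, 2], [3, 4]]
tr, det = A[0][0] + A[1][1], A[0][0] * A[1][1] - A[0][1] * A[1][0]
inv = [[(tr * (1 if i == j else 0) - A[i][j]) / det for j in range(2)]
       for i in range(2)]
print(inv)   # [[-2.0, 1.0], [1.5, -0.5]]
```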

Let the result (namely, that eigenvectors corresponding to distinct eigenvalues are linearly independent) be true for any $k$ eigenpairs, $k \ge 1.$ We prove the result for $k + 1$ eigenpairs $(\lambda_1, \mathbf{x}_1), (\lambda_2, \mathbf{x}_2), \ldots, (\lambda_{k+1}, \mathbf{x}_{k+1}).$ We consider the equation

$$c_1\mathbf{x}_1 + c_2\mathbf{x}_2 + \cdots + c_{k+1}\mathbf{x}_{k+1} = \mathbf{0} \qquad (6.1.9)$$

for the unknowns $c_1, c_2, \ldots, c_{k+1}.$ We have

$$\mathbf{0} = A\mathbf{0} = A(c_1\mathbf{x}_1 + \cdots + c_{k+1}\mathbf{x}_{k+1}) = c_1\lambda_1\mathbf{x}_1 + c_2\lambda_2\mathbf{x}_2 + \cdots + c_{k+1}\lambda_{k+1}\mathbf{x}_{k+1}. \qquad (6.1.10)$$

From Equations (6.1.9) and (6.1.10) (multiply (6.1.9) by $\lambda_1$ and subtract it from (6.1.10)), we get

$$c_2(\lambda_2 - \lambda_1)\mathbf{x}_2 + c_3(\lambda_3 - \lambda_1)\mathbf{x}_3 + \cdots + c_{k+1}(\lambda_{k+1} - \lambda_1)\mathbf{x}_{k+1} = \mathbf{0}.$$

This is an equation in the $k$ eigenvectors $\mathbf{x}_2, \ldots, \mathbf{x}_{k+1}.$ So, by the induction hypothesis, we have

$$c_i(\lambda_i - \lambda_1) = 0 \quad \text{for } 2 \le i \le k + 1.$$

But the eigenvalues being distinct implies $\lambda_i - \lambda_1 \ne 0$ for $2 \le i \le k + 1.$ We therefore get $c_i = 0$ for $2 \le i \le k + 1.$ Also, $\mathbf{x}_1 \ne \mathbf{0},$ and therefore (6.1.9) gives $c_1 = 0.$

Thus, we have the required result. $\blacksquare$

We are thus led to the following important corollary.

- For an $n \times n$ matrix $A,$ prove the following.
  - $A$ and $A^T$ have the same set of eigenvalues.
  - If $\lambda$ is an eigenvalue of an invertible matrix $A,$ then $1/\lambda$ is an eigenvalue of $A^{-1}.$
  - If $\lambda$ is an eigenvalue of $A,$ then $\lambda^k$ is an eigenvalue of $A^k$ for any positive integer $k.$
  - If $A$ and $B$ are $n \times n$ matrices with $A$ nonsingular, then $BA^{-1}$ and $A^{-1}B$ have the same set of eigenvalues.

  In each case, what can you say about the eigenvectors?

- Let $A$ and $B$ be $2 \times 2$ matrices for which $\det(A) = \det(B)$ and $\text{tr}(A) = \text{tr}(B).$
  - Do $A$ and $B$ have the same set of eigenvalues?
  - Give examples to show that the matrices $A$ and $B$ need not be similar.

- Let $(\lambda_1, \mathbf{u})$ be an eigenpair for a matrix $A$ and let $(\lambda_2, \mathbf{u})$ be an eigenpair for another matrix $B.$
  - Then prove that $(\lambda_1 + \lambda_2, \mathbf{u})$ is an eigenpair for the matrix $A + B.$
  - Give an example to show that if $\lambda_1, \lambda_2$ are respectively the eigenvalues of $A$ and $B,$ then $\lambda_1 + \lambda_2$ need not be an eigenvalue of $A + B.$

- Let $\lambda_1, \lambda_2, \ldots, \lambda_n$ be distinct non-zero eigenvalues of an $n \times n$ matrix $A.$ Let $\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_n$ be the corresponding eigenvectors. Then show that $\{\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_n\}$ forms a basis of $\mathbb{F}^n.$ If $\mathbf{b} = c_1\mathbf{u}_1 + c_2\mathbf{u}_2 + \cdots + c_n\mathbf{u}_n,$ then show that $A\mathbf{x} = \mathbf{b}$ has the unique solution

$$\mathbf{x} = \frac{c_1}{\lambda_1}\mathbf{u}_1 + \frac{c_2}{\lambda_2}\mathbf{u}_2 + \cdots + \frac{c_n}{\lambda_n}\mathbf{u}_n.$$
A K Lal 2007-09-12