Let *A* be a square matrix of order *n* and λ one of its eigenvalues. Let *X* be an eigenvector of *A* associated to λ. We must have

\[ AX = \lambda X, \quad\text{that is,}\quad (A - \lambda I_n)X = 0. \]

This is a linear system for which the matrix coefficient is \(A - \lambda I_n\). Since the zero vector is a solution, the system is consistent. In fact, we will see on a different page that the structure of the solution set of this system is very rich. On this page, we will basically discuss how to find the solutions.

**Remark.** It is quite easy to see that if *X* is a vector which satisfies

\[ AX = \lambda X, \]

then the vector *Y* = *cX* (for any arbitrary number *c*) satisfies the same equation, i.e.

\[ AY = A(cX) = c(AX) = c(\lambda X) = \lambda(cX) = \lambda Y. \]

In other words, if we know that *X* is an eigenvector, then *cX* is also an eigenvector associated to the same eigenvalue (as long as it is not the zero vector, i.e. *c* ≠ 0).
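As a quick numerical illustration of this remark, here is a small sketch using NumPy (the sample matrix below is our own choice, not one from this page):

```python
import numpy as np

# Sample matrix (our choice, not from the text):
# A has eigenvalue 3 with eigenvector X = (1, 1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
X = np.array([1.0, 1.0])
assert np.allclose(A @ X, 3 * X)  # X is an eigenvector for lambda = 3

# Every nonzero multiple cX is an eigenvector for the same eigenvalue.
for c in (2.0, -5.0, 0.5):
    Y = c * X
    assert np.allclose(A @ Y, 3 * Y)
```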

Let us start with an example.

**Example.** Consider the matrix

\[ A = \begin{pmatrix} 1 & 2 & 1 \\ 6 & -1 & 0 \\ -1 & -2 & -1 \end{pmatrix}. \]
First we look for the eigenvalues of *A*. These are the solutions of the characteristic equation

\[ \det(A - \lambda I) = \begin{vmatrix} 1-\lambda & 2 & 1 \\ 6 & -1-\lambda & 0 \\ -1 & -2 & -1-\lambda \end{vmatrix} = 0. \]

If we expand this determinant along the third column, we obtain

\[ \det(A - \lambda I) = (-13 - \lambda) + (-1-\lambda)\big(\lambda^2 - 13\big). \]

Using easy algebraic manipulations, we get

\[ \det(A - \lambda I) = -\lambda^3 - \lambda^2 + 12\lambda = -\lambda(\lambda + 4)(\lambda - 3), \]

which implies that the eigenvalues of *A* are 0, -4, and 3.
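These eigenvalues can be double-checked numerically; here is a small sketch (using NumPy, our addition) that finds the roots of the characteristic polynomial:

```python
import numpy as np

# det(A - lambda I) = -lambda^3 - lambda^2 + 12 lambda = 0
# is equivalent to lambda^3 + lambda^2 - 12 lambda = 0.
coeffs = [1.0, 1.0, -12.0, 0.0]           # lambda^3 + lambda^2 - 12 lambda
eigenvalues = sorted(np.roots(coeffs).real)
print(eigenvalues)                         # close to [-4, 0, 3]
```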
Next we look for the eigenvectors.

**1. Case λ = 0:** The associated eigenvectors are given by the linear system

\[ AX = 0, \]

which may be rewritten as

\[ \begin{cases} x + 2y + z = 0 \\ 6x - y = 0 \\ -x - 2y - z = 0. \end{cases} \]

Many ways may be used to solve this system. The third equation is equivalent to the first (it is the first multiplied by -1). Since, from the second equation, we have *y* = 6*x*, the first equation reduces to 13*x* + *z* = 0. So this system is equivalent to

\[ y = 6x, \qquad z = -13x. \]

So the unknown vector *X* is given by

\[ X = \begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} x \\ 6x \\ -13x \end{pmatrix} = x \begin{pmatrix} 1 \\ 6 \\ -13 \end{pmatrix}. \]

Therefore, any eigenvector *X* of *A* associated to the eigenvalue 0 is given by

\[ X = c \begin{pmatrix} 1 \\ 6 \\ -13 \end{pmatrix}, \]
where *c* is an arbitrary number.

**2. Case λ = -4:** The associated eigenvectors are given by the linear system

\[ (A + 4I)X = 0, \]

which may be rewritten as

\[ \begin{cases} 5x + 2y + z = 0 \\ 6x + 3y = 0 \\ -x - 2y + 3z = 0. \end{cases} \]

In this case, we will use elementary row operations to solve it. First we consider the augmented matrix \([A + 4I \,|\, 0]\), i.e.

\[ \left(\begin{array}{ccc|c} 5 & 2 & 1 & 0 \\ 6 & 3 & 0 & 0 \\ -1 & -2 & 3 & 0 \end{array}\right). \]
Then we use elementary row operations to reduce it to an upper-triangular form. First we interchange the first row with the last one to get

\[ \left(\begin{array}{ccc|c} -1 & -2 & 3 & 0 \\ 6 & 3 & 0 & 0 \\ 5 & 2 & 1 & 0 \end{array}\right). \]

Next, we use the first row to eliminate the 6 and 5 in the first column: we add 6 times the first row to the second, and 5 times the first row to the third. We obtain

\[ \left(\begin{array}{ccc|c} -1 & -2 & 3 & 0 \\ 0 & -9 & 18 & 0 \\ 0 & -8 & 16 & 0 \end{array}\right). \]

If we divide the second row by -9 and the third row by -8, we obtain

\[ \left(\begin{array}{ccc|c} -1 & -2 & 3 & 0 \\ 0 & 1 & -2 & 0 \\ 0 & 1 & -2 & 0 \end{array}\right). \]

Finally, we subtract the second row from the third to get

\[ \left(\begin{array}{ccc|c} -1 & -2 & 3 & 0 \\ 0 & 1 & -2 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right). \]
Next, we set *z* = *c*. From the second row, we get *y* = 2*z* = 2*c*. The first row then implies *x* = -2*y* + 3*z* = -*c*. Hence

\[ X = \begin{pmatrix} -c \\ 2c \\ c \end{pmatrix} = c \begin{pmatrix} -1 \\ 2 \\ 1 \end{pmatrix}. \]

Therefore, any eigenvector *X* of *A* associated to the eigenvalue -4 is given by

\[ X = c \begin{pmatrix} -1 \\ 2 \\ 1 \end{pmatrix}, \]
where *c* is an arbitrary number.

**3. Case λ = 3:** The details for this case are left to the reader. Using ideas similar to those described above, one may easily show that any eigenvector *X* of *A* associated to the eigenvalue 3 is given by

\[ X = c \begin{pmatrix} 2 \\ 3 \\ -2 \end{pmatrix}, \]
where*c*is an arbitrary number.
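All three eigenpairs found above can be checked at once. Here is a small verification sketch (using NumPy, our addition), with the matrix *A* from this example:

```python
import numpy as np

A = np.array([[ 1.0,  2.0,  1.0],
              [ 6.0, -1.0,  0.0],
              [-1.0, -2.0, -1.0]])

# (eigenvalue, eigenvector) pairs computed in the three cases above
pairs = [( 0.0, np.array([ 1.0, 6.0, -13.0])),
         (-4.0, np.array([-1.0, 2.0,   1.0])),
         ( 3.0, np.array([ 2.0, 3.0,  -2.0]))]

for lam, X in pairs:
    assert np.allclose(A @ X, lam * X)  # A X = lambda X holds for each pair
```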

**Remark.** In general, the eigenvalues of a matrix are not all distinct from each other (see the page on eigenvalues for more details). In the next two examples, we discuss how to find the eigenvectors when an eigenvalue is repeated.

**Example.** Consider the matrix

The characteristic equation of *A* is

Hence the eigenvalues of *A* are

For the repeated eigenvalue, the associated eigenvectors are given by the linear system

\[ (A - \lambda I)X = 0, \]

which may be rewritten as
Clearly, the third equation is identical to the first one, which is also a multiple of the second equation. In other words, this system is equivalent to the single equation

\[ 2x + y + 2z = 0. \]

To solve it, we need to fix two of the unknowns and deduce the third one. For example, if we set *x* = *c*₁ and *z* = *c*₂, we obtain *y* = -2*c*₁ - 2*c*₂. Therefore, any eigenvector *X* of *A* associated to this eigenvalue is given by

\[ X = \begin{pmatrix} c_1 \\ -2c_1 - 2c_2 \\ c_2 \end{pmatrix} = c_1 \begin{pmatrix} 1 \\ -2 \\ 0 \end{pmatrix} + c_2 \begin{pmatrix} 0 \\ -2 \\ 1 \end{pmatrix}. \]

In other words, any eigenvector *X* associated to this eigenvalue is a linear combination of these two vectors, where *c*₁ and *c*₂ are arbitrary numbers.

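Since the whole system collapsed to the single equation 2*x* + *y* + 2*z* = 0, its solution set (the eigenspace) can also be computed as a null space. Here is a sketch using SymPy (our addition):

```python
from sympy import Matrix

# The eigenvector system reduced to one equation: 2x + y + 2z = 0.
# Its solutions form the null space of the 1x3 coefficient matrix.
M = Matrix([[2, 1, 2]])
basis = M.nullspace()

# The null space is 2-dimensional, so the eigenspace has two basis vectors.
assert len(basis) == 2
for v in basis:
    assert M * v == Matrix([[0]])  # each basis vector solves the equation
```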
**Example.** Consider the matrix

The characteristic equation is given by

Hence the matrix

which may be rewritten as

This system is equivalent to a system with just one equation:

So if we set

Let us summarize what we did in the above examples.

**Summary**: *Let **A** be a square matrix. Assume λ is an eigenvalue of **A**. In order to find the associated eigenvectors, we do the following steps:*

**1.** Write down the associated linear system

\[ (A - \lambda I)X = 0. \]

**2.** Solve the system.

**3.** Rewrite the unknown vector *X* as a linear combination of known vectors.
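The three steps above can be sketched in code. Here is a minimal illustration using SymPy (our addition, not part of the original page), applied to the matrix from the first example:

```python
from sympy import Matrix, eye

def eigenvectors_for(A, lam):
    """Steps 1-3: form (A - lam*I)X = 0 and return a basis of its solution set."""
    system = A - lam * eye(A.shape[0])   # step 1: write down the linear system
    return system.nullspace()            # steps 2-3: basis of the solution set

A = Matrix([[1, 2, 1],
            [6, -1, 0],
            [-1, -2, -1]])
basis = eigenvectors_for(A, -4)          # eigenvalue -4 from the first example
assert len(basis) == 1
assert A * basis[0] == -4 * basis[0]     # check A X = lambda X
```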

The above examples assume that the eigenvalue λ is a real number. So one may wonder whether every eigenvalue is always real. In general, this is not the case; however, it is the case for symmetric matrices. The proof of this fact for matrices of arbitrary order is quite involved, but for square matrices of order 2 it is quite easy. Let us give it here for the sake of completeness.

Consider the symmetric square matrix

\[ A = \begin{pmatrix} a & b \\ b & c \end{pmatrix}. \]

Its characteristic equation is given by

\[ \det(A - \lambda I) = \begin{vmatrix} a-\lambda & b \\ b & c-\lambda \end{vmatrix} = \lambda^2 - (a+c)\lambda + (ac - b^2) = 0. \]

This is a quadratic equation. The nature of its roots (which are the eigenvalues of *A*) is determined by the sign of its discriminant

\[ \Delta = (a+c)^2 - 4(ac - b^2). \]

Using algebraic manipulations, we get

\[ \Delta = (a-c)^2 + 4b^2. \]

Therefore, Δ is a nonnegative number, which implies that the eigenvalues of *A* are real numbers.

**Remark.** Note that the matrix *A* will have one eigenvalue, i.e. one double root, if and only if Δ = 0. But since Δ = (*a*-*c*)² + 4*b*², this is possible only if *a* = *c* and *b* = 0. In other words, we have

\[ A = \begin{pmatrix} a & 0 \\ 0 & a \end{pmatrix} = a\,I_2. \]
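The argument can also be tested numerically. Here is a quick sketch (using NumPy, our addition) checking that randomly generated symmetric 2×2 matrices have a nonnegative discriminant and real eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    a, b, c = rng.standard_normal(3)
    A = np.array([[a, b],
                  [b, c]])
    # Discriminant (a - c)^2 + 4 b^2 of the characteristic equation:
    delta = (a - c) ** 2 + 4 * b ** 2
    assert delta >= 0
    # Hence both eigenvalues are real.
    assert np.allclose(np.linalg.eigvals(A).imag, 0)
```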
In the next page, we will discuss the case of complex eigenvalues.



**Author**: M.A. Khamsi
