# Eigenvectors and eigenvalues

Eigenvectors are vectors that a matrix's linear transformation merely rescales: multiplying the matrix by an eigenvector gives the same vector multiplied by a constant. That constant is the eigenvalue associated with the eigenvector.

In other words, eigenvectors pick out the directions in which the matrix acts as simple scaling, and the eigenvalues are the scaling factors that appear in the linear transformation of the original matrix.

The Spanish names, *vectores propios* and *valores propios* ("own vectors" and "own values"), are very descriptive; in English they are called *eigenvectors* and *eigenvalues*, from the German *eigen*, meaning "own".


## Eigenvectors

An eigenvector is a nonzero vector such that multiplying it by the original matrix gives the same result as multiplying it by some constant.

Mathematically, an eigenvector *V* = (v1, …, vn) of a square matrix *Q* is any nonzero vector *V* that satisfies the following expression for some constant *h*:

QV = hV
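This definition can be checked numerically. A minimal numpy sketch, using an assumed 2×2 matrix *Q* (this example matrix is not from the article):

```python
import numpy as np

# Q is an assumed 2x2 example matrix, chosen only for illustration.
Q = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(Q)

h = eigenvalues[0]      # an eigenvalue h
V = eigenvectors[:, 0]  # its eigenvector V

# Defining property: multiplying by the matrix equals scaling by h.
print(np.allclose(Q @ V, h * V))  # True
```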

## Eigenvalues

The constant *h* is the eigenvalue associated with the eigenvector *V*.

The eigenvalues are the roots that we find through the characteristic equation of the matrix.

### Characteristics of eigenvalues

- Each eigenvalue has infinitely many eigenvectors, since any nonzero scalar multiple of an eigenvector is again an eigenvector.
- They are scalars; they can be complex numbers (not real) and they can repeat (more than one equal eigenvalue).
- Counted with multiplicity, there are as many eigenvalues as the original matrix has rows (*m*) or columns (*n*).
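The first point can be verified directly: rescaling an eigenvector by any nonzero constant leaves the defining relation intact. A short numpy sketch, using an assumed diagonal matrix (not one from the article):

```python
import numpy as np

# Assumed diagonal example matrix; its eigenvalues are the
# diagonal entries 2 and 5.
Q = np.array([[2.0, 0.0],
              [0.0, 5.0]])

h = 5.0                   # an eigenvalue of Q
V = np.array([0.0, 1.0])  # an eigenvector for h

# Any nonzero scalar multiple of V is again an eigenvector for h.
for c in (1.0, -3.0, 0.5):
    W = c * V
    print(np.allclose(Q @ W, h * W))  # True each time
```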

## Eigenvectors and eigenvalues

Eigenvectors and eigenvalues are linked through the matrix itself: multiplying the matrix by an eigenvector produces that eigenvector scaled by its eigenvalue.

### Mathematically

If *V* is an eigenvector of the matrix *Z* and *h* is its eigenvalue, then the product *ZV* equals *hV*, the eigenvector scaled by its eigenvalue.

## Characteristic equation

The characteristic equation is used to find the eigenvalues of a square matrix *Z*.

### Mathematically

(Z - hI) V = 0

Where *Z* and *h* are defined above and *I* is the identity matrix. The eigenvalues are the values of *h* for which this system has nonzero solutions *V*, which happens exactly when the determinant of (*Z* - *hI*) is zero.

### Terms

To find vectors and eigenvalues of a matrix, the following must be satisfied:

- Matrix *Z* is square: the number of rows (*m*) equals the number of columns (*n*).
- Matrix *Z* is real. Most matrices used in finance are real. What is the advantage of real roots? The eigenvalues of the matrix will never be complex numbers, and that makes life much easier.
- Matrix (*Z* – *hI*) is non-invertible: its determinant is 0. This condition guarantees that we always find eigenvectors other than zero. If we accepted eigenvectors equal to 0, the product of eigenvalues and eigenvectors would always be zero.
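The last condition can be illustrated concretely. For a 2×2 matrix, det(Z - hI) = 0 expands to the quadratic h² - trace(Z)·h + det(Z) = 0, and each root makes Z - hI singular. A sketch assuming an illustrative 2×2 matrix (not one taken from the article):

```python
import numpy as np

# Z is an assumed illustrative 2x2 matrix, not one from the article.
Z = np.array([[5.0, 0.0],
              [-6.0, 2.0]])

# For a 2x2 matrix, det(Z - h*I) = h^2 - trace(Z)*h + det(Z),
# so the eigenvalues are the roots of this quadratic.
coeffs = [1.0, -np.trace(Z), np.linalg.det(Z)]
roots = np.roots(coeffs)
print(sorted(roots.round(10).tolist()))  # [2.0, 5.0]

# Each root makes (Z - h*I) singular, i.e. non-invertible.
for h in roots:
    print(abs(np.linalg.det(Z - h * np.eye(2))) < 1e-9)  # True
```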

## Practical example

Suppose we want to find the eigenvectors and eigenvalues of a matrix *Z* of dimension 2 × 2:

1. We substitute the matrix *Z* and *I* into the characteristic equation:

2. We arrange the factors:

3. We multiply the elements as if we were computing the determinant of the matrix:

4. The solutions of this quadratic equation are h = 2 and h = 5. There are two eigenvalues because the matrix Z has 2 rows (or columns). We have thus found the eigenvalues of the matrix Z, which in turn make the determinant 0.

5. To find the eigenvectors we will have to solve:

6. For example, (v1, v2) = for h = 2 and (v1, v2) = (-1, 2) for h = 5:
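The intermediate matrices of this worked example did not survive in this copy, but the quoted results can be reproduced end to end. A hedged numpy sketch, assuming a stand-in matrix Z chosen so that it matches the quoted results (eigenvalues 2 and 5, eigenvector (-1, 2) for h = 5):

```python
import numpy as np

# The article's original matrix did not survive in this copy. The
# matrix below is an assumed stand-in, chosen so that it reproduces
# the quoted results: eigenvalues 2 and 5, eigenvector (-1, 2) for h = 5.
Z = np.array([[5.0, 0.0],
              [-6.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(Z)
print(sorted(eigenvalues.round(10).tolist()))  # [2.0, 5.0]

# Take the eigenvector for h = 5 and rescale it so its first
# component is -1, matching the (-1, 2) form used in the text.
i = int(np.argmax(eigenvalues))  # index of the eigenvalue 5
v = eigenvectors[:, i]
v = -v / v[0]                    # normalise the first entry to -1
print(v.round(10).tolist())      # [-1.0, 2.0]
```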
