- $2 \times 2$ matrix
  - Matrix diagonalization
  - Invertible matrix to diagonalize
  - Check diagonalization
- $3 \times 3$ matrix
  - Matrix diagonalization
  - Invertible matrix to diagonalize
  - Check diagonalization

$2 \times 2$ matrix diagonalization

Let $A$ be a $2 \times 2$ matrix defined as
● Preparation

For a square matrix $A$,
matrix diagonalization is to find a diagonal matrix $\Lambda$ and an invertible matrix $P$ satisfying
$$
\Lambda = P^{-1} A P .
$$
In the following, we find the diagonal matrix $\Lambda$ for the matrix $A$ in $(1.1)$ and the invertible matrix $P$ that diagonalizes $A$. It is known that the diagonal elements of a diagonalized matrix are the eigenvalues of the original matrix. Therefore, the diagonal matrix $\Lambda$ is obtained by computing the eigenvalues of $A$ and placing them on the diagonal.
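The whole procedure can be sketched numerically with NumPy; the matrix below is a hypothetical example, not the $A$ defined in $(1.1)$:

```python
import numpy as np

# Hypothetical example matrix (stands in for A in (1.1)).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigenvalues of A, and a matrix P whose columns are the eigenvectors.
eigenvalues, P = np.linalg.eig(A)

# Diagonal matrix with the eigenvalues on the diagonal.
Lam = np.diag(eigenvalues)

# Check the diagonalization: P^{-1} A P should equal Lam.
assert np.allclose(np.linalg.inv(P) @ A @ P, Lam)
```

`np.linalg.eig` returns the eigenvalues together with a matrix whose columns are the corresponding eigenvectors, which is exactly the pair $(\Lambda, P)$ constructed step by step in the sections below.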

● Derivation of diagonal matrix $\Lambda$

In order to obtain the eigenvalues $\lambda$ of $A$,
we need to solve the characteristic equation
$$
\left| \lambda I - A \right| = 0,
\tag{1.2}
$$
which is a polynomial equation in the variable (eigenvalue) $\lambda$.
Since the left-hand side is a
$2 \times 2$ determinant,
we have
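This step can also be carried out symbolically with SymPy; the matrix used here is a hypothetical example, since the entries of $A$ in $(1.1)$ are not reproduced in this text:

```python
import sympy as sp

lam = sp.symbols('lam')

# Hypothetical example matrix (stands in for A in (1.1)).
A = sp.Matrix([[4, 1],
               [2, 3]])

# Characteristic equation |lam*I - A| = 0, a quadratic in lam.
char_poly = (lam * sp.eye(2) - A).det()   # lam**2 - 7*lam + 10
eigenvalues = sorted(sp.solve(sp.Eq(char_poly, 0), lam))
print(eigenvalues)   # → [2, 5]
```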
● Derivation of invertible matrix that diagonalizes $A$

The invertible matrix $P$
diagonalizing the matrix $A$
is the matrix whose column vectors are the eigenvectors of $A$.
Therefore, $P$ is obtained by finding an eigenvector for each eigenvalue of $A$.
We derive the eigenvectors for the eigenvalues of $A$ as follows.
For the first eigenvalue, the eigenvector $\mathbf{x}$ satisfies

For the second eigenvalue, the eigenvector $\mathbf{x}$ satisfies

By $(1.4)$ and $(1.5)$, we obtain the invertible matrix $P$ as
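The assembly of $P$ from the eigenvectors can be sketched with SymPy; the matrix below is a hypothetical example, not the $A$ of $(1.1)$:

```python
import sympy as sp

# Hypothetical example matrix (stands in for A in (1.1)).
A = sp.Matrix([[4, 1],
               [2, 3]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, basis) triples.
columns = []
for eigenvalue, multiplicity, basis in A.eigenvects():
    columns.extend(basis)

# P has the eigenvectors as its columns, one per eigenvalue.
P = sp.Matrix.hstack(*columns)

# P must be invertible for the diagonalization to exist.
assert P.det() != 0
```

The order of the columns of $P$ must match the order in which the eigenvalues are placed on the diagonal of $\Lambda$.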

● Check the answer

We will check
whether the matrix $P$ in equation $(1.6)$
actually diagonalizes the matrix $A$,
that is, whether $P$, $A$ and $\Lambda$ satisfy
We will derive the inverse matrix $P^{-1}$ by Gaussian elimination. We define the augmented matrix in which $P$ and the identity matrix $I$ are arranged side by side,

Now we can check the diagonalization as follows.
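A sketch of this check with SymPy, using hypothetical $A$, $\Lambda$, and $P$ (not the matrices of $(1.1)$–$(1.6)$): the Gaussian elimination on $(P \mid I)$ is performed by `rref`, which reduces the augmented matrix to $(I \mid P^{-1})$.

```python
import sympy as sp

# Hypothetical example: A, its eigenvalue matrix Lam, and eigenvector matrix P.
A = sp.Matrix([[4, 1], [2, 3]])
Lam = sp.diag(5, 2)
P = sp.Matrix([[1, 1], [1, -2]])   # columns: eigenvectors for 5 and for 2

# Gaussian elimination on (P | I): reduce to (I | P^{-1}).
augmented = P.row_join(sp.eye(2))
reduced, _ = augmented.rref()
P_inv = reduced[:, 2:]

# Check the diagonalization: P^{-1} A P = Lam.
assert P_inv * A * P == Lam
```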

$3 \times 3$ matrix diagonalization

Let $A$ be a $3 \times 3$ matrix defined as
● Preparation

For a square matrix $A$,
matrix diagonalization is to find a diagonal matrix $\Lambda$ and an invertible matrix $P$ satisfying
$$
\Lambda = P^{-1} A P .
$$
In the following, we find the diagonal matrix $\Lambda$ for the matrix $A$ in $(2.1)$ and the invertible matrix $P$ that diagonalizes $A$. It is known that the diagonal elements of a diagonalized matrix are the eigenvalues of the original matrix. Therefore, the diagonal matrix $\Lambda$ is obtained by computing the eigenvalues of $A$ and placing them on the diagonal.

● Derivation of diagonal matrix $\Lambda$

In order to obtain the eigenvalues $\lambda$ of $A$,
we need to solve the characteristic equation
$$
\left| \lambda I - A \right| = 0.
\tag{2.2}
$$
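For a $3 \times 3$ matrix the characteristic equation is a cubic in $\lambda$; it can be solved symbolically with SymPy, shown here for a hypothetical matrix (not the $A$ of $(2.1)$):

```python
import sympy as sp

lam = sp.symbols('lam')

# Hypothetical 3x3 example matrix (stands in for A in (2.1)).
A = sp.Matrix([[2, 0, 0],
               [1, 3, 0],
               [0, 1, 4]])

# |lam*I - A| = 0 is a cubic equation in lam.
char_poly = (lam * sp.eye(3) - A).det()
eigenvalues = sorted(sp.solve(sp.Eq(char_poly, 0), lam))
print(eigenvalues)   # → [2, 3, 4]
```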
● Derivation of invertible matrix that diagonalizes $A$

The invertible matrix $P$ that
diagonalizes the matrix $A$
is the matrix whose column vectors are the eigenvectors of $A$.
Therefore, $P$ is obtained by finding an eigenvector for each eigenvalue of $A$.
We derive the eigenvectors for the eigenvalues of $A$ as follows.
For the first eigenvalue, the eigenvector $\mathbf{x}$ satisfies

For the second eigenvalue, the eigenvector $\mathbf{x}$ satisfies

For the third eigenvalue, the eigenvector $\mathbf{x}$ satisfies

By $(2.4)$, $(2.5)$ and $(2.6)$, we obtain the invertible matrix $P$ as

● Check the answer

We will check
whether the matrix $P$ in equation $(2.7)$
actually diagonalizes the matrix $A$,
that is, whether $P$, $A$ and $\Lambda$ satisfy
We will derive the inverse matrix $P^{-1}$ by Gaussian elimination. We define the augmented matrix in which $P$ and the identity matrix $I$ are arranged side by side,

Now we can check the diagonalization as follows.
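The same numerical check works in the $3 \times 3$ case; the matrix below is again a hypothetical stand-in for the $A$ in $(2.1)$:

```python
import numpy as np

# Hypothetical 3x3 example matrix (stands in for A in (2.1)).
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 4.0]])

# Eigen decomposition: columns of P are eigenvectors of A.
eigenvalues, P = np.linalg.eig(A)
Lam = np.diag(eigenvalues)

# The eigenvalues are distinct, so P is invertible and A is diagonalizable.
assert abs(np.linalg.det(P)) > 1e-12

# Check the diagonalization: P^{-1} A P = Lam.
assert np.allclose(np.linalg.inv(P) @ A @ P, Lam)
```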