At the core of eigenvalues and eigenvectors lies the idea that matrices represent linear transformations. A matrix transforms vectors in a vector space, scaling, rotating, or shearing them according to its entries. Understanding how matrices act on vectors is crucial for grasping the subsequent concepts of eigenvalues and eigenvectors.
An eigenvector of a square matrix $A$ is a non-zero vector $\mathbf{v}$ such that when $A$ acts on $\mathbf{v}$, the vector is scaled by a scalar factor $\lambda$, known as the eigenvalue: $$ A \mathbf{v} = \lambda \mathbf{v} $$ This equation signifies that the transformation $A$ stretches or compresses $\mathbf{v}$ along its own line, reversing its direction only when $\lambda$ is negative.
To find the eigenvalues of a matrix $A$, we solve the characteristic equation: $$ \det(A - \lambda I) = 0 $$ where $I$ is the identity matrix of the same dimension as $A$. Solving this equation yields the eigenvalues $\lambda$.
**Example:** Consider matrix $A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}$. The characteristic equation is: $$ \det\left(\begin{pmatrix} 4 - \lambda & 1 \\ 2 & 3 - \lambda \end{pmatrix}\right) = (4 - \lambda)(3 - \lambda) - 2 \cdot 1 = \lambda^2 - 7\lambda + 10 = 0 $$ Solving $\lambda^2 - 7\lambda + 10 = 0$ gives $\lambda = 5$ and $\lambda = 2$.
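As a quick numerical check of this example, the eigenvalues can be computed directly; this is a minimal sketch, assuming NumPy is available:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigenvalues are the roots of det(A - lambda*I) = lambda^2 - 7*lambda + 10.
eigenvalues = np.linalg.eigvals(A)
print(np.sort(eigenvalues))  # approximately [2., 5.]
```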
Once the eigenvalues are determined, eigenvectors can be found by solving the equation: $$ (A - \lambda I)\mathbf{v} = \mathbf{0} $$ for each eigenvalue $\lambda$.
**Example:** For $\lambda = 5$ and matrix $A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}$: $$ \begin{pmatrix} 4 - 5 & 1 \\ 2 & 3 - 5 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} -1 & 1 \\ 2 & -2 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} $$ This simplifies to $-v_1 + v_2 = 0$, leading to $v_2 = v_1$. Thus, an eigenvector corresponding to $\lambda = 5$ is $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$.
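The same verification in code, again assuming NumPy; note that `np.linalg.eig` returns unit-length eigenvectors, so the $\lambda = 5$ vector appears as roughly $(0.707, 0.707)$, a scalar multiple of $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# eig returns the eigenvalues and unit-norm eigenvectors (as columns).
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Verify A v = lambda v for each eigenpair.
    assert np.allclose(A @ v, lam * v)
    print(lam, v)
```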
A matrix $A$ is said to be diagonalisable if it can be expressed in the form: $$ A = PDP^{-1} $$ where $D$ is a diagonal matrix containing the eigenvalues of $A$, and $P$ is a matrix whose columns are the corresponding eigenvectors.
Diagonalisation simplifies matrix computations, particularly in raising matrices to powers and solving systems of differential equations.
**Example:** Using the previous matrix $A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}$ with eigenvalues $\lambda_1 = 5$, $\lambda_2 = 2$ and corresponding eigenvectors $v_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$, $v_2 = \begin{pmatrix} 1 \\ -2 \end{pmatrix}$ (for $\lambda = 2$, solving $(A - 2I)\mathbf{v} = \mathbf{0}$ gives $2v_1 + v_2 = 0$): $$ P = \begin{pmatrix} 1 & 1 \\ 1 & -2 \end{pmatrix}, \quad D = \begin{pmatrix} 5 & 0 \\ 0 & 2 \end{pmatrix} $$ Thus, $$ A = PDP^{-1} $$
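A short check that this factorisation really reproduces $A$ (NumPy assumed):

```python
import numpy as np

P = np.array([[1.0, 1.0],
              [1.0, -2.0]])   # columns are the eigenvectors (1, 1) and (1, -2)
D = np.diag([5.0, 2.0])       # eigenvalues in the matching order

A = P @ D @ np.linalg.inv(P)
print(A)  # recovers [[4, 1], [2, 3]]
```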
Eigenvalues can have algebraic and geometric multiplicities. The algebraic multiplicity is the number of times an eigenvalue appears as a root of the characteristic equation, while the geometric multiplicity is the number of linearly independent eigenvectors corresponding to that eigenvalue. For a matrix to be diagonalisable, the geometric multiplicity must equal the algebraic multiplicity for each eigenvalue.
The spectral theorem states that every real symmetric matrix is diagonalisable by an orthogonal matrix. This means there exists an orthogonal matrix $Q$ such that: $$ A = QDQ^T $$ where $D$ is a diagonal matrix of eigenvalues, and $Q$ contains orthonormal eigenvectors. This theorem is pivotal in simplifying many problems in mathematics and physics.
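A small illustration on a symmetric example matrix (NumPy assumed); `np.linalg.eigh` is designed for symmetric matrices and returns orthonormal eigenvectors:

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # a symmetric matrix

# eigh exploits symmetry; Q has orthonormal columns.
eigenvalues, Q = np.linalg.eigh(S)
D = np.diag(eigenvalues)

assert np.allclose(S, Q @ D @ Q.T)       # S = Q D Q^T
assert np.allclose(Q.T @ Q, np.eye(2))   # Q is orthogonal
```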
Eigenvalues and eigenvectors have diverse applications:
- **Stability analysis:** the eigenvalues of a system matrix determine whether equilibria are stable.
- **Vibration modes:** natural frequencies and mode shapes of mechanical systems are eigenvalues and eigenvectors.
- **Quantum mechanics:** observable quantities correspond to eigenvalues of operators.
- **Computing:** Google's PageRank algorithm and facial recognition both rely on eigenvector computations.
Diagonalising a matrix involves finding a suitable basis of eigenvectors. The steps include:
1. **Calculate** the eigenvalues by solving $\det(A - \lambda I) = 0$.
2. **Assemble** the eigenvectors by solving $(A - \lambda I)\mathbf{v} = \mathbf{0}$ for each eigenvalue.
3. **Make** the matrix $P$ whose columns are the eigenvectors, and the diagonal matrix $D$ of eigenvalues in the same order.
4. **Express** $A$ as $PDP^{-1}$.
**Example:** Diagonalising $A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}$:
1. The characteristic equation $\lambda^2 - 7\lambda + 10 = 0$ gives $\lambda_1 = 5$, $\lambda_2 = 2$.
2. The corresponding eigenvectors are $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$ and $\begin{pmatrix} 1 \\ -2 \end{pmatrix}$.
3. Hence $P = \begin{pmatrix} 1 & 1 \\ 1 & -2 \end{pmatrix}$, $D = \begin{pmatrix} 5 & 0 \\ 0 & 2 \end{pmatrix}$, and $A = PDP^{-1}$.
When a matrix is not diagonalisable, the Jordan canonical form provides a generalised alternative. It is built from Jordan blocks: upper triangular blocks with a single eigenvalue repeated on the diagonal and ones on the superdiagonal. This form is essential for understanding the structure of linear operators beyond diagonalisation.
**Example:** Consider matrix $B = \begin{pmatrix} 6 & 1 \\ 0 & 6 \end{pmatrix}$. The eigenvalue is $\lambda = 6$ with algebraic multiplicity 2. However, solving $(B - 6I)\mathbf{v} = \mathbf{0}$ yields only one linearly independent eigenvector: $$ \begin{pmatrix} 1 \\ 0 \end{pmatrix} $$ Thus, $B$ is not diagonalisable. Its Jordan form is: $$ J = \begin{pmatrix} 6 & 1 \\ 0 & 6 \end{pmatrix} $$ that is, $B$ is already a single Jordan block.
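This can be confirmed symbolically; the sketch below assumes SymPy is available, whose `Matrix` API provides `is_diagonalizable` and `jordan_form`:

```python
from sympy import Matrix

B = Matrix([[6, 1],
            [0, 6]])

print(B.is_diagonalizable())  # False: only one independent eigenvector
P, J = B.jordan_form()
print(J)  # Matrix([[6, 1], [0, 6]]) -- B is already in Jordan form
```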
Diagonalisation simplifies the computation of matrix powers. If $A = PDP^{-1}$, then: $$ A^n = PD^nP^{-1} $$ Since $D$ is diagonal, $D^n$ is straightforward to compute by raising each diagonal element to the power $n$.
**Example:** Using $A = PDP^{-1}$ from earlier: $$ A^3 = PD^3P^{-1} = P \begin{pmatrix} 5^3 & 0 \\ 0 & 2^3 \end{pmatrix} P^{-1} = P \begin{pmatrix} 125 & 0 \\ 0 & 8 \end{pmatrix} P^{-1} $$ Calculating this yields: $$ A^3 = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}^3 = \begin{pmatrix} 86 & 39 \\ 78 & 47 \end{pmatrix} $$
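A numerical cross-check of this power computation (NumPy assumed):

```python
import numpy as np

P = np.array([[1.0, 1.0],
              [1.0, -2.0]])
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# A^3 via the eigendecomposition: cube only the diagonal entries.
A_cubed = P @ np.diag([5.0**3, 2.0**3]) @ np.linalg.inv(P)
print(A_cubed)  # [[86, 39], [78, 47]]

# Cross-check against direct repeated multiplication.
assert np.allclose(A_cubed, np.linalg.matrix_power(A, 3))
```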
Eigenvalues and eigenvectors play a pivotal role in solving systems of linear differential equations. By diagonalising the matrix representing the system, solutions can be expressed in terms of exponential functions scaled by eigenvalues.
**Example:** Consider the system: $$ \frac{d\mathbf{y}}{dt} = A\mathbf{y}, \quad A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix} $$ Diagonalising $A$ allows the system to be decoupled into independent equations: $$ \mathbf{y}(t) = c_1 e^{5t} \begin{pmatrix} 1 \\ 1 \end{pmatrix} + c_2 e^{2t} \begin{pmatrix} 1 \\ -2 \end{pmatrix} $$ where $c_1$ and $c_2$ are constants determined by initial conditions.
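One way to check this decoupled solution is against the matrix exponential $e^{At}\mathbf{y}_0$; the sketch below assumes SciPy is available and uses a hypothetical initial condition:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[4.0, 1.0], [2.0, 3.0]])
y0 = np.array([1.0, 0.0])   # hypothetical initial condition y(0)

# Constants from y(0) = c1*v1 + c2*v2, i.e. c = P^{-1} y0.
P = np.array([[1.0, 1.0], [1.0, -2.0]])
c1, c2 = np.linalg.solve(P, y0)

t = 0.5
y_eigen = c1 * np.exp(5 * t) * P[:, 0] + c2 * np.exp(2 * t) * P[:, 1]
y_expm = expm(A * t) @ y0   # reference solution via the matrix exponential

assert np.allclose(y_eigen, y_expm)
```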
In practical applications, especially with large matrices, numerical methods are employed to approximate eigenvalues and eigenvectors. Techniques such as the Power Iteration, QR Algorithm, and Jacobi Method are integral to computational linear algebra, facilitating applications in data science, machine learning, and engineering simulations.
**Power Iteration:** A method to find the dominant eigenvalue and its corresponding eigenvector by iteratively applying the matrix to a random vector.
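A minimal sketch of power iteration in Python (NumPy assumed); it relies on $A$ having a unique eigenvalue of largest magnitude:

```python
import numpy as np

def power_iteration(A, iters=100, seed=0):
    """Approximate the dominant eigenpair of A by repeated multiplication.

    A sketch only: assumes A has a unique eigenvalue of largest magnitude.
    """
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])   # random starting vector
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)            # renormalise to avoid overflow
    lam = v @ A @ v                       # Rayleigh quotient estimate
    return lam, v

A = np.array([[4.0, 1.0], [2.0, 3.0]])
lam, v = power_iteration(A)
print(lam)  # approximately 5, the dominant eigenvalue
```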
**QR Algorithm:** A more sophisticated technique that repeatedly factors the current iterate as $A_k = Q_k R_k$ (orthogonal $Q_k$, upper triangular $R_k$) and forms $A_{k+1} = R_k Q_k$. Each iterate is similar to $A$, and under mild conditions the sequence converges to an upper triangular matrix whose diagonal entries are the eigenvalues.
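A bare-bones, unshifted version of this iteration (NumPy assumed); production implementations add shifts and a Hessenberg reduction for speed:

```python
import numpy as np

def qr_algorithm(A, iters=200):
    """Unshifted QR iteration: A_{k+1} = R_k Q_k where A_k = Q_k R_k."""
    Ak = A.astype(float).copy()
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q              # similar to A, so the eigenvalues are preserved
    return np.diag(Ak)          # diagonal approximates the eigenvalues

A = np.array([[4.0, 1.0], [2.0, 3.0]])
print(qr_algorithm(A))  # approximately [5, 2]
```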
Eigenvalues and eigenvectors connect various disciplines by providing a common mathematical framework:
- **Physics:** eigenvalues of operators give observable quantities in quantum mechanics.
- **Engineering:** vibration modes and stability analysis of structures and control systems.
- **Computer science:** ranking algorithms such as Google's PageRank, and facial recognition systems.
- **Data science and machine learning:** large-scale eigenvalue computations underpin many numerical methods.
Understanding these connections enhances the appreciation of linear algebra's versatility and its pivotal role in solving real-world problems.
| Aspect | Eigenvalues | Eigenvectors |
|---|---|---|
| Definition | Scalars $\lambda$ satisfying $A \mathbf{v} = \lambda \mathbf{v}$. | Non-zero vectors $\mathbf{v}$ that are scaled by $A$. |
| Computation | Found by solving $\det(A - \lambda I) = 0$. | Derived by solving $(A - \lambda I)\mathbf{v} = \mathbf{0}$ for each $\lambda$. |
| Multiplicity | Each has an algebraic and a geometric multiplicity. | The number of linearly independent eigenvectors for $\lambda$ equals its geometric multiplicity. |
| Role in Diagonalisation | Populate the diagonal matrix $D$. | Form the columns of matrix $P$. |
| Applications | Used in stability analysis, vibration modes, and quantum mechanics. | Essential for transforming and simplifying linear transformations. |
Remember the Process: Use the mnemonic "CAME" to recall the steps for diagonalisation: Calculate eigenvalues, Assemble eigenvectors, Make matrix P, and Express A as $PDP^{-1}$.
Double-Check Calculations: Small arithmetic errors can lead to incorrect eigenvalues or eigenvectors. Always verify by plugging back into the original equation $A\mathbf{v} = \lambda\mathbf{v}$.
Understand Multiplicities: Before attempting diagonalisation, ensure that for each eigenvalue the number of linearly independent eigenvectors (its geometric multiplicity) matches its algebraic multiplicity.
The concept of eigenvalues and eigenvectors was first introduced in the 18th century by mathematicians like Euler and Lagrange, initially to solve problems in mechanics. Additionally, in modern times, eigenvectors are crucial in Google's PageRank algorithm, which ranks web pages based on their significance. Another fascinating application is in facial recognition technology, where eigenvectors help in identifying unique facial features.
Incorrect Characteristic Equation: Students often forget to subtract $\lambda$ from the diagonal elements when forming $A - \lambda I$. For example, for matrix $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$, the correct characteristic equation is $\det(A - \lambda I) = (2-\lambda)^2 - 1 = 0$, not $\det(A) - \lambda = 0$.
Neglecting All Eigenvalues: Students sometimes stop after finding one eigenvalue, forgetting that there may be several. Always solve the characteristic equation completely.
Assuming Diagonalisation is Always Possible: Not all matrices are diagonalisable. It's important to check the geometric multiplicity of eigenvalues before attempting to diagonalise.