Eigenvalue

A scalar λ associated with a linear transformation such that the transformation maps some non-zero vector to λ times itself, i.e., to a vector parallel to the original.

An eigenvalue is a fundamental concept in linear algebra that emerges when studying how linear transformations affect vectors in space. When a linear transformation is applied to certain special vectors (called eigenvectors), the result is simply a scaling of the original vector by a factor λ (lambda); this scaling factor is the eigenvalue.

Mathematical Definition

For a square matrix A and a non-zero vector v, if there exists a scalar λ such that:

Av = λv

Then λ is an eigenvalue of A, and v is a corresponding eigenvector.
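
This definition can be checked numerically; the sketch below assumes NumPy and an illustrative 2 × 2 matrix. np.linalg.eig returns the eigenvalues of A together with a matrix whose columns are corresponding eigenvectors, and each pair should satisfy Av = λv up to floating-point error.

```python
import numpy as np

# Small symmetric example matrix; its eigenvalues are 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of `eigenvectors` correspond to entries of `eigenvalues`.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    # Verify the defining relation Av = λv (up to rounding).
    print(f"λ = {lam:.4f}, Av ≈ λv: {np.allclose(A @ v, lam * v)}")
```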

Properties

  1. Characteristic Equation

    • Found by solving det(A - λI) = 0 (a worked example follows this list)
    • The degree of this polynomial equation equals the dimension of the matrix
    • For real matrices, any complex eigenvalues occur in conjugate pairs
  2. Key Characteristics

    • The sum of the eigenvalues equals the trace of the matrix
    • The product of the eigenvalues equals the determinant
    • An n × n matrix has at most n distinct eigenvalues
    • A matrix is invertible exactly when 0 is not one of its eigenvalues
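
As a worked example of the characteristic equation, take A = [[2, 1], [1, 2]]: det(A - λI) = (2 - λ)² - 1 = λ² - 4λ + 3 = (λ - 1)(λ - 3), so the eigenvalues are λ = 1 and λ = 3. The same computation can be sketched symbolically, assuming SymPy is available; the matrix is again just an illustration.

```python
from sympy import Matrix, symbols

lam = symbols('lambda')
A = Matrix([[2, 1],
            [1, 2]])

# Characteristic polynomial det(A - λI); its roots are the eigenvalues.
char_poly = (A - lam * Matrix.eye(2)).det()
print(char_poly.factor())   # (lambda - 3)*(lambda - 1)
print(A.eigenvals())        # {3: 1, 1: 1} maps eigenvalue -> algebraic multiplicity
```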

Applications

Physics and Engineering

Eigenvalues give the natural frequencies of vibrating structures, describe the stability of dynamical systems, identify principal stress directions in mechanics, and correspond to the measurable values of observables in quantum mechanics.

Computer Science

Eigenvalue problems underlie principal component analysis (PCA) for dimensionality reduction, Google's PageRank algorithm for ranking web pages, and spectral methods for graph partitioning and clustering.

Geometric Interpretation

Eigenvalues represent how vectors are stretched or compressed by a linear transformation along the directions of its eigenvectors. The sign and magnitude of λ lead to the following scenarios (illustrated by the sketch after this list):

  1. λ > 1: The eigenvector is stretched
  2. 0 < λ < 1: The eigenvector is compressed
  3. λ < 0: The eigenvector is reversed and scaled
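
A minimal sketch of these cases, assuming NumPy: a diagonal matrix is used so that each standard basis vector is an eigenvector, with the eigenvalues chosen to show stretching, compression, and reversal.

```python
import numpy as np

# Each diagonal entry is an eigenvalue; the matching standard basis
# vector is its eigenvector.
A = np.diag([2.0, 0.5, -1.0])

for i, lam in enumerate(np.diag(A)):
    v = np.zeros(3)
    v[i] = 1.0
    print(f"λ = {lam:+.1f}: v = {v} -> Av = {A @ v}")
# λ = +2.0 stretches v, λ = +0.5 compresses it, λ = -1.0 reverses it.
```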

Historical Context

The concept emerged from the work of Euler in the 18th century, though the "eigen" terminology (from the German for "own," "proper," or "characteristic") was introduced by David Hilbert in the early 20th century.

Computational Methods

Several algorithms exist for finding eigenvalues:

  1. Power iteration: repeatedly applies the matrix to a vector to approximate the dominant eigenvalue (a minimal sketch follows)
  2. QR algorithm: the standard method for dense matrices, based on repeated QR factorizations
  3. Jacobi method: diagonalizes symmetric matrices through a sequence of plane rotations
  4. Lanczos and Arnoldi iterations: Krylov-subspace methods suited to large sparse matrices

These methods are particularly important for large matrices, where solving the characteristic equation directly becomes numerically impractical.
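
Below is a minimal power-iteration sketch using NumPy; the function name, tolerance, and test matrix are illustrative choices rather than a fixed interface.

```python
import numpy as np

def power_iteration(A, num_iters=1000, tol=1e-10):
    """Approximate the dominant eigenvalue/eigenvector of A by
    repeatedly applying A to a vector and renormalizing."""
    v = np.random.default_rng(0).standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(num_iters):
        w = A @ v
        v_new = w / np.linalg.norm(w)
        lam_new = v_new @ A @ v_new   # Rayleigh quotient estimate of λ
        if abs(lam_new - lam) < tol:
            break
        v, lam = v_new, lam_new
    return lam_new, v_new

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, v = power_iteration(A)
print(lam)   # ≈ 3.0, the dominant eigenvalue of A
```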

See Also