Spectral Theorem
Eigenvalues and eigenvectors of symmetric matrices

Let $A$ be a square, $n \times n$ symmetric matrix. A real scalar $\lambda$ is said to be an eigenvalue of $A$ if there exists a non-zero vector $u \in \mathbb{R}^n$ such that

$$A u = \lambda u.$$

The vector $u$ is then referred to as an eigenvector associated with the eigenvalue $\lambda$. The interpretation of $u$ is that it defines a direction along which the map $x \mapsto A x$ acts like scalar multiplication, the amount of scaling being $\lambda$. The eigenvalues of $A$ are the roots of the characteristic equation

$$\det(\lambda I - A) = 0,$$

where the notation $\det$ refers to the determinant of its matrix argument. From the fundamental theorem of algebra, any polynomial of degree $n$ has $n$ (possibly not distinct, possibly complex) roots; hence $A$ has $n$ eigenvalues, counted with multiplicity.
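As a quick numerical illustration, here is a minimal Matlab sketch (the $2 \times 2$ symmetric matrix is an arbitrary illustrative choice): it checks that an eigenpair returned by eig satisfies $A u = \lambda u$, and that the eigenvalues coincide with the roots of the characteristic polynomial.

A = [2 1; 1 3];               % a small symmetric matrix (illustrative choice)
[U, D] = eig(A);              % columns of U: eigenvectors; diag(D): eigenvalues
u = U(:,1); lambda = D(1,1);  % first eigenpair
disp(norm(A*u - lambda*u))    % ~0: the pair (lambda, u) satisfies A*u = lambda*u
disp(sort(roots(poly(A))))    % roots of det(lambda*I - A) = 0 match diag(D)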
Spectral theorem

An important result of linear algebra, called the spectral theorem, or symmetric eigenvalue decomposition (SED) theorem, states that for any symmetric matrix, there are exactly $n$ (possibly not distinct) eigenvalues, all of them real; further, the associated eigenvectors can be chosen so as to form an orthonormal basis.

Theorem: Symmetric eigenvalue decomposition

We can decompose any symmetric matrix $A$ as

$$A = \sum_{i=1}^{n} \lambda_i u_i u_i^T = U \Lambda U^T, \qquad \Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n),$$

where the matrix of eigenvectors $U = [u_1, \ldots, u_n]$ is orthogonal ($U^T U = U U^T = I_n$), and $\lambda_1, \ldots, \lambda_n$ are the (real) eigenvalues of $A$. Here is a proof.

The SED provides a decomposition of the matrix in simple terms, namely dyads. We check that in the SED above, the scalars $\lambda_i$ are the eigenvalues and the $u_i$'s are associated eigenvectors: for every $j$,

$$A u_j = \sum_{i=1}^{n} \lambda_i u_i (u_i^T u_j) = \lambda_j u_j,$$

since, by orthonormality, $u_i^T u_j = 0$ for $i \ne j$ and $u_j^T u_j = 1$.

The eigenvalue decomposition of a symmetric matrix can be efficiently computed with standard software, in time that grows roughly as the cube of its dimension $n$.
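As a sanity check on the theorem, the following Matlab sketch (using a randomly generated symmetric matrix) verifies numerically that the eigenvector matrix $U$ is orthogonal and that the dyad expansion $\sum_i \lambda_i u_i u_i^T$ recovers $A$.

n = 4;
A = randn(n); A = (A + A')/2;           % a random symmetric matrix
[U, D] = eig(A);
disp(norm(U'*U - eye(n)))               % ~0: U is orthogonal
S = zeros(n);
for i = 1:n
    S = S + D(i,i) * U(:,i) * U(:,i)';  % accumulate the dyads lambda_i * u_i * u_i'
end
disp(norm(A - S))                       % ~0: the dyad expansion recovers A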
Matlab syntax

>> A = triu(A) + tril(A',-1);  % symmetrize A, keeping its upper triangle
>> [U,D] = eig(A);             % columns of U: eigenvectors; diagonal of D: eigenvalues

Example: Rayleigh quotients

Given a symmetric matrix $A$, we can express its smallest and largest eigenvalues, denoted $\lambda_{\min}$ and $\lambda_{\max}$ respectively, in the so-called variational form

$$\lambda_{\min}(A) = \min_{x \,:\, \|x\|_2 = 1} x^T A x, \qquad \lambda_{\max}(A) = \max_{x \,:\, \|x\|_2 = 1} x^T A x.$$

For a proof, see here.

The term ‘‘variational’’ refers to the fact that the eigenvalues are given as optimal values of optimization problems, which were referred to in the past as variational problems. Variational representations exist for all the eigenvalues, but are more complicated to state.

The interpretation of the above identities is that the largest and smallest eigenvalues measure the range of the quadratic function $x \mapsto x^T A x$ over the unit Euclidean ball. Equivalently, they are the maximum and minimum of the Rayleigh quotient $x^T A x / (x^T x)$ over non-zero $x$ (a numerical sketch is given at the end of this section).

Historically, David Hilbert coined the term ‘‘spectrum’’ for the set of eigenvalues of a symmetric operator (roughly, a matrix of infinite dimensions). The fact that for symmetric matrices, every eigenvalue lies in the interval $[\lambda_{\min}, \lambda_{\max}]$ somewhat justifies the terminology.

Example: Largest singular value norm of a matrix.
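To illustrate the variational identities above, here is a minimal Matlab sketch (again with a randomly generated symmetric matrix): the Rayleigh quotient of any non-zero vector lands in the interval $[\lambda_{\min}, \lambda_{\max}]$, and the upper bound is attained at an eigenvector associated with $\lambda_{\max}$.

n = 5;
A = randn(n); A = (A + A')/2;   % a random symmetric matrix
lam = sort(eig(A));             % eigenvalues in increasing order
x = randn(n,1);                 % a random non-zero direction
r = (x'*A*x)/(x'*x);            % its Rayleigh quotient
fprintf('%g <= %g <= %g\n', lam(1), r, lam(end))  % lambda_min <= r <= lambda_max
[U, D] = eig(A);
[~, k] = max(diag(D));          % index of the largest eigenvalue
u = U(:,k);                     % an associated (unit-norm) eigenvector
disp(u'*A*u - lam(end))         % ~0: the maximum is attained at this eigenvector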