**Theorem 1** (**Spectral Decomposition**): Let *A* be a symmetric *n × n* matrix. Then *A* has a spectral decomposition *A = CDC^{T}* where *C* is an *n × n* matrix whose columns are unit eigenvectors *C*_{1}, …, *C*_{n} corresponding to the eigenvalues *λ*_{1}, …, *λ*_{n} of *A*, and *D* is the *n × n* diagonal matrix whose main diagonal consists of *λ*_{1}, …, *λ*_{n}.

Proof: We prove that every symmetric *n × n* matrix is orthogonally diagonalizable by induction on *n*. The property is clearly true for *n* = 1. We assume that it is true for any *n × n* symmetric matrix and show that it is true for an (*n*+1) × (*n*+1) symmetric matrix *A*.

Let *λ* be any eigenvalue of *A* (we know by Property 1 of Symmetric Matrices that *A* has *n*+1 real eigenvalues) and let *X* be a unit eigenvector corresponding to *λ*. Thus *AX = λX*, and so *X^{T}AX = X^{T}λX = λ*(*X^{T}X*) = *λ*(*X ∙ X*) = *λ*, showing that *λ = X^{T}AX*.

By Property 3 of Linear Independent Vectors, we can construct a basis for the set of all (*n*+1) × 1 column vectors which includes *X*, and so using Theorem 1 of Orthogonal Vectors and Matrices (Gram-Schmidt), we can construct an orthonormal basis for the set of (*n*+1) × 1 column vectors which includes *X*. Now define *B* to be the matrix whose columns are the vectors in this basis excluding *X*. By Property 4 of Orthogonal Vectors and Matrices, *B* is an (*n*+1) × *n* orthogonal matrix.

Note that (*B^{T}AB*)^{T} = *B^{T}A^{T}B* = *B^{T}AB* since *A* is symmetric. This shows that *B^{T}AB* is a symmetric *n × n* matrix, and so by the induction hypothesis there is an *n × n* diagonal matrix *E* whose main diagonal consists of the eigenvalues of *B^{T}AB* and an orthogonal *n × n* matrix *P* such that *B^{T}AB = PEP^{T}*. Now define the (*n*+1) × *n* matrix *Q = BP*. Note that by Property 5 of Orthogonal Vectors and Matrices, *Q* is orthogonal.

Now define the (*n*+1) × (*n*+1) matrix *C* whose first column is *X* and whose remaining columns are those of *Q*, i.e. *C* = [*X, Q*]. We now show that *C* is orthogonal. First we note that since *X* is a unit vector, *X^{T}X = X ∙ X* = 1. Since the columns of *B* along with *X* are orthonormal, *X^{T}B*_{j} = *X ∙ B*_{j} = 0 for any column *B*_{j} in *B*, and so *X^{T}B* = 0, as well as *B^{T}X* = (*X^{T}B*)^{T} = 0. Finally, since *Q* is orthogonal, *Q^{T}Q = I*. Now *C^{T}C* = [*X, Q*]^{T}[*X, Q*], whose blocks are *X^{T}X* = 1, *X^{T}Q = X^{T}BP* = 0, *Q^{T}X* = *P^{T}B^{T}X* = 0 and *Q^{T}Q = I*, and so *C^{T}C = I*.

This completes the proof that *C* is orthogonal. We next show that *Q^{T}AQ = E*. In fact, *Q^{T}AQ* = (*BP*)^{T}*A*(*BP*) = *P^{T}*(*B^{T}AB*)*P* = *P^{T}*(*PEP^{T}*)*P* = *E*.

Next we need to show that *Q^{T}AX* = 0 and *X^{T}AQ* = 0. Since *A* is symmetric, *X^{T}AQ* = (*Q^{T}AX*)^{T}, and so it is sufficient to show that *Q^{T}AX* = 0. As we saw above, *B^{T}X* = 0. Also, since *λ* is an eigenvalue corresponding to *X*, *AX = λX*. Thus *Q^{T}AX = Q^{T}λX = λP^{T}B^{T}X* = 0. Now *C^{T}AC* = [*X, Q*]^{T}*A*[*X, Q*], whose blocks are *X^{T}AX = λ*, *X^{T}AQ* = 0, *Q^{T}AX* = 0 and *Q^{T}AQ = E*. Defining *D* to be the (*n*+1) × (*n*+1) diagonal matrix whose first diagonal entry is *λ* and whose remaining main diagonal is that of *E*, it follows that *C^{T}AC = D*, and so *A = CDC^{T}*, which completes the induction.

Note that at each stage of the induction, the next entry on the main diagonal of *D* is an eigenvalue of *A*, the next column in *C* is a corresponding eigenvector, and this eigenvector is orthogonal to all the other columns in *C*.

**Observation**: The spectral decomposition can also be expressed as *A* = *λ*_{1}*C*_{1}*C*_{1}^{T} + ⋯ + *λ*_{n}*C*_{n}*C*_{n}^{T}.
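Both forms of the decomposition can be checked numerically. The sketch below uses NumPy's `eigh` as a stand-in for the worksheet calculation; the test matrix is arbitrary and not from the original page.

```python
import numpy as np

# An arbitrary symmetric matrix for the check (values are illustrative).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh returns the eigenvalues and orthonormal eigenvectors of a
# symmetric matrix, i.e. the diagonal of D and the columns of C.
eigvals, C = np.linalg.eigh(A)
D = np.diag(eigvals)

# Theorem 1: A = C D C^T.
assert np.allclose(C @ D @ C.T, A)

# Equivalent outer-product form: A = sum of lambda_i * C_i * C_i^T.
A_sum = sum(lam * np.outer(c, c) for lam, c in zip(eigvals, C.T))
assert np.allclose(A_sum, A)
```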

**Example 1**: Find the spectral decomposition of the matrix *A* in range A4:C6 of Figure 1.

**Figure 1 – Spectral Decomposition**

We calculate the eigenvalues and eigenvectors of *A* (range E4:G7) using the supplemental function eVECTORS(A4:C6). Matrix *C* (range E10:G12) consists of the unit eigenvectors of *A* and matrix *D* (range I10:K12) is the diagonal matrix whose main diagonal consists of the eigenvalues. You can check that *A* = *CDC^{T}* using the array formula

=MMULT(E10:G12,MMULT(I10:K12,M10:O12))
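The same check can be sketched outside Excel. The matrix below is assumed to be the one in range A4:C6 of Figure 1 (it is quoted in a reader comment at the end of this page), and `np.linalg.eigh` plays the role of eVECTORS:

```python
import numpy as np

# Matrix assumed to match range A4:C6 of Figure 1 (taken from the
# reader comment at the end of this page).
A = np.array([[ 4.0, 2.0, -1.0],
              [ 2.0, 3.0,  1.0],
              [-1.0, 1.0,  9.0]])

eigvals, C = np.linalg.eigh(A)   # plays the role of eVECTORS(A4:C6)
D = np.diag(eigvals)             # diagonal matrix of eigenvalues

# Analogue of =MMULT(E10:G12,MMULT(I10:K12,M10:O12)), i.e. C * D * C^T.
assert np.allclose(C @ D @ C.T, A)
```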

**Real Statistics Function**: The Real Statistics Resource Pack provides the following function:

**SPECTRAL**(R1, *iter*): returns a 2*n* × *n* range whose top half is the matrix *C* and whose lower half is the matrix *D* in the spectral decomposition *CDC*^{T} of *A*, where *A* is the matrix of values in range R1.

Here *iter* is the number of iterations in the algorithm used to compute the spectral decomposition (default 100).
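The page does not say which algorithm SPECTRAL uses. Purely as an illustration, here is a hypothetical `spectral` function based on classical Jacobi rotations, a common iterative method for symmetric eigenproblems, mimicking the stacked 2*n* × *n* output described above; the name, signature and internals are assumptions, not the Real Statistics implementation.

```python
import numpy as np

def spectral(A, iters=100):
    """Hypothetical sketch of a SPECTRAL-like function: classical Jacobi
    rotations drive the off-diagonal entries of a symmetric matrix to
    zero. Returns the eigenvector matrix C stacked on top of the
    diagonal matrix D (a 2n x n array), mimicking the layout above."""
    B = np.array(A, dtype=float)   # working copy; stays similar to A
    n = B.shape[0]
    V = np.eye(n)                  # accumulates the product of rotations
    for _ in range(iters):
        # Locate the largest off-diagonal entry of B.
        off = np.abs(B - np.diag(np.diag(B)))
        i, j = divmod(int(off.argmax()), n)
        if off[i, j] < 1e-12:      # B is numerically diagonal: done
            break
        # Rotation angle chosen so that the (i, j) entry becomes zero.
        theta = 0.5 * np.arctan2(2.0 * B[i, j], B[i, i] - B[j, j])
        c, s = np.cos(theta), np.sin(theta)
        G = np.eye(n)
        G[i, i] = G[j, j] = c
        G[i, j], G[j, i] = -s, s
        B = G.T @ B @ G            # similarity transform: same eigenvalues
        V = V @ G
    return np.vstack([V, np.diag(np.diag(B))])
```

For a symmetric input, the top half *C* is orthogonal and *C·D·C*^{T} reproduces the input once the off-diagonal mass has been driven to zero; the `iters` cap plays the role of the *iter* argument above.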

**Real Statistics Data Analysis Tool**: The **Spectral Factorization** option of the Real Statistics **Matrix Operations** data analysis tool also provides the means to output the spectral decomposition of a symmetric matrix.

**Observation**: As we have mentioned previously, for an *n × n* matrix *A*, det(*A – λI*) is an *n*th degree polynomial in *λ* of the form (-1)^{n}(*λ – λ*_{1}) ⋯ (*λ – λ*_{n}) where *λ*_{1}, …, *λ*_{n} are the eigenvalues of *A*. If all the eigenvalues are distinct, then we have a simpler proof of Theorem 1 (see Property 4 of Symmetric Matrices). But as we observed in Symmetric Matrices, not all symmetric matrices have distinct eigenvalues. This motivates the following definition.

**Definition 1**: The (**algebraic**) **multiplicity** of an eigenvalue *λ*_{i} is the number of times that eigenvalue appears in the factorization (-1)^{n}(*λ – λ*_{1}) ⋯ (*λ – λ*_{n}) of det(*A – λI*).
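As a small illustration of Definition 1 (an assumed example, not from the original page), the matrix below has characteristic polynomial (-1)^{3}(*λ* – 5)(*λ* – 2)^{2}, so the eigenvalue 2 has algebraic multiplicity 2:

```python
import numpy as np
from collections import Counter

# Symmetric matrix with characteristic polynomial (-1)^3 (λ-5)(λ-2)^2:
# the eigenvalue 2 appears twice, so its algebraic multiplicity is 2.
A = np.array([[3.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 3.0]])

eigvals = np.linalg.eigvalsh(A)
multiplicity = Counter(np.round(eigvals, 8))
# multiplicity[2.0] == 2, multiplicity[5.0] == 1
```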

**Property 1**: For any eigenvalue *λ* of a square matrix, the number of independent eigenvectors corresponding to *λ* is at most the multiplicity of *λ*.

Proof: Suppose *λ*_{1} is an eigenvalue of the *n × n* matrix *A* and that *B*_{1}, …, *B*_{k} are *k* independent eigenvectors corresponding to *λ*_{1}. By Property 3 of Linear Independent Vectors, there are vectors *B*_{k+1}, …, *B*_{n} such that *B*_{1}, …, *B*_{n} is a basis for the set of *n* × 1 vectors. Now let *B* be the *n × n* matrix whose columns are *B*_{1}, …, *B*_{n}. Since *B*_{1}, …, *B*_{n} are independent, rank(*B*) = *n*, and so *B* is invertible. By Property 9 of Eigenvalues and Eigenvectors we know that *B*^{-1}*AB* and *A* have the same eigenvalues, and in fact they have the same characteristic polynomial.

Now consider *AB*. The first *k* columns take the form *AB*_{1}, …, *AB*_{k}, but since *B*_{1}, …, *B*_{k} are eigenvectors corresponding to *λ*_{1}, the first *k* columns are *λ*_{1}*B*_{1}, …, *λ*_{1}*B*_{k}. It now follows that the first *k* columns of *B*^{–1}*AB* consist of vectors of the form *D*_{1}, …, *D*_{k} where *D*_{j} consists of *λ*_{1} in row *j* and zeros elsewhere. This means that the characteristic polynomial of *B*^{–1}*AB* has a factor of at least (*λ* – *λ*_{1})^{k}, i.e. the multiplicity of *λ*_{1} as an eigenvalue of *B*^{–1}*AB*, and therefore of *A*, is at least *k*.
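The inequality in Property 1 can be strict when the matrix is not symmetric; a classic illustration (an assumed example, not from the original page) is a 2 × 2 Jordan block:

```python
import numpy as np

# Non-symmetric matrix where Property 1's inequality is strict:
# det(A - λI) = (λ - 1)^2, so λ = 1 has algebraic multiplicity 2, ...
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# ...but A - I has rank 1, so its null space (the eigenspace of λ = 1)
# is one-dimensional: there is only one independent eigenvector.
eigenspace_dim = 2 - np.linalg.matrix_rank(A - np.eye(2))
assert eigenspace_dim == 1
```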

**Property 2**: For each eigenvalue *λ* of a symmetric matrix there are *k* independent (real) eigenvectors, where *k* equals the multiplicity of *λ*, and there are no more than *k* such independent eigenvectors.

Proof: By Theorem 1, any symmetric *n × n* matrix *A* has *n* orthonormal eigenvectors corresponding to its *n* eigenvalues. By Property 2 of Orthogonal Vectors and Matrices, these eigenvectors are independent. This shows that the number of independent eigenvectors corresponding to *λ* is at least equal to the multiplicity of *λ*. But by Property 5 of Symmetric Matrices, it can’t be greater than the multiplicity of *λ*, and so we conclude that it is equal to the multiplicity of *λ*.

By Property 1 of Symmetric Matrices, all the eigenvalues are real and so we can assume that all the eigenvectors are real too.
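A small numerical illustration of Property 2 (an assumed example, not from the original page): for a symmetric matrix, a repeated eigenvalue has an eigenspace whose dimension matches its multiplicity.

```python
import numpy as np

# Symmetric matrix with eigenvalues 5, -1, -1 (the repeated eigenvalue
# -1 has algebraic multiplicity 2).
A = np.array([[1.0, 2.0, 2.0],
              [2.0, 1.0, 2.0],
              [2.0, 2.0, 1.0]])

# A - (-1)I = A + I has rank 1, so the eigenspace of -1 has dimension
# 3 - 1 = 2, exactly matching the multiplicity, as Property 2 asserts.
eigenspace_dim = 3 - np.linalg.matrix_rank(A + np.eye(3))
assert eigenspace_dim == 2
```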


For the spectral decomposition given in Figure 1, taking the matrix

A = [ 4 2 -1
      2 3 1
     -1 1 9 ],

when I try to find the eigenvalues and corresponding eigenvectors using eVECTORS(A), I get only one eigenvalue, 9.259961. How do I get all three eigenvalues and eigenvectors?

You need to highlight the range E4:G7, insert the formula =eVECTORS(A4:C6) and then press Ctrl-Shift-Enter. Since eVECTORS is an array function, you need to press Ctrl-Shift-Enter and not simply Enter.

Charles