Singular Value Decomposition

Theorem 1 (Singular Value Decomposition): For any m × n matrix A there exist an m × m orthogonal matrix U, an n × n orthogonal matrix V and a diagonal m × n matrix D with non-negative values on the diagonal such that A = UDV^T.

In fact, such matrices can be constructed where the columns of U are the eigenvectors of AA^T, the columns of V are the eigenvectors of A^T A and the main diagonal of D contains the square roots of the eigenvalues of A^T A (equivalently, of AA^T) in descending order.
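As a quick numerical check of Theorem 1 (a minimal sketch using NumPy, not part of the Real Statistics software), we can verify both the factorization and the eigenvalue relationship for a small matrix:

```python
import numpy as np

# A small 3 x 2 example matrix
A = np.array([[3.0, 2.0],
              [2.0, 3.0],
              [2.0, -2.0]])

# NumPy returns U (m x m), the singular values and V^T (n x n)
U, s, Vt = np.linalg.svd(A)

# Rebuild the m x n diagonal matrix D from the singular values
D = np.zeros(A.shape)
np.fill_diagonal(D, s)

# Check A = U D V^T
assert np.allclose(A, U @ D @ Vt)

# The singular values are the square roots of the eigenvalues of A^T A
lam = np.linalg.eigvalsh(A.T @ A)[::-1]        # descending order
assert np.allclose(s, np.sqrt(lam.clip(min=0)))
```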

Proof: By Property 2 of Positive Definite Matrices, A^T A is a positive semidefinite n × n matrix, and so by Property 1 of Positive Definite Matrices, it is symmetric. By Theorem 1 of Spectral Decomposition, it has a spectral decomposition A^T A = VEV^T where V is an orthogonal n × n matrix whose columns are unit eigenvectors of A^T A and E is an n × n diagonal matrix whose main diagonal consists of the eigenvalues λ1, …, λn of A^T A in descending order. Since A^T A is positive semidefinite, these eigenvalues are non-negative. Thus there is an r such that λ1 ≥ … ≥ λr > 0 and λr+1 = ⋯ = λn = 0.

Since Vj is a unit eigenvector of A^T A with eigenvalue λj,

$$\|AV_j\|^2 = (AV_j)^T (AV_j) = V_j^T A^T A V_j = \lambda_j V_j^T V_j = \lambda_j$$

and so AVj = 0 when j > r. We now construct an m × m matrix U as follows. First define the first r columns of U by Uj = (1/√λj)AVj. Since the Vj are orthogonal, so are the Uj: for i ≠ j

$$U_i^T U_j = \frac{1}{\sqrt{\lambda_i \lambda_j}} (AV_i)^T (AV_j) = \frac{\lambda_j}{\sqrt{\lambda_i \lambda_j}} V_i^T V_j = 0$$

Since

$$\|U_j\|^2 = \frac{1}{\lambda_j} \|AV_j\|^2 = \frac{\lambda_j}{\lambda_j} = 1$$

Uj is a unit vector. If r < m, then we can expand U1, …, Ur to an orthonormal basis U1, …, Um for the set of m × 1 column vectors. In either case let U be the m × m matrix whose columns are the Uj. Based on the construction described above, U is an orthogonal matrix.

Now let B = U^T AV = [bij]. Then

$$b_{ij} = U_i^T A V_j = \sqrt{\lambda_j}\, U_i^T U_j = \begin{cases} \sqrt{\lambda_j} & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases}$$

for j ≤ r, and

$$b_{ij} = U_i^T A V_j = U_i^T\, 0 = 0$$

for j > r. Let D be the m × n diagonal matrix whose main diagonal consists of √λ1, …, √λr followed by zeros (if needed). We have just shown that U^T AV = D, and so A = UDV^T.
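The construction in the proof can be carried out numerically. The following is a sketch of my own in NumPy (the orthonormal completion of U1, …, Ur is done here via a QR factorization, which is one convenient choice):

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [2.0, 3.0],
              [2.0, -2.0]])
m, n = A.shape

# Spectral decomposition of A^T A; eigh returns ascending order, so reverse
lam, V = np.linalg.eigh(A.T @ A)
lam, V = lam[::-1], V[:, ::-1]

r = int(np.sum(lam > 1e-12))          # number of positive eigenvalues

# U_j = (1/sqrt(lambda_j)) A V_j for j <= r
U = np.zeros((m, m))
U[:, :r] = (A @ V[:, :r]) / np.sqrt(lam[:r])

# If r < m, extend U_1, ..., U_r to an orthonormal basis of m x 1 vectors
if r < m:
    Q, _ = np.linalg.qr(np.hstack([U[:, :r], np.eye(m)]))
    U[:, r:] = Q[:, r:m]

# B = U^T A V is the diagonal matrix D with sqrt(lambda_j) on its diagonal
D = np.zeros((m, n))
np.fill_diagonal(D, np.sqrt(lam[:min(m, n)].clip(min=0)))
assert np.allclose(U.T @ A @ V, D)
assert np.allclose(A, U @ D @ V.T)
```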

Observation: From the proof of the theorem, it follows that AVj = √λj Uj for j ≤ r, and so

$$A = UDV^T = \sum_{j=1}^{r} \sqrt{\lambda_j}\, U_j V_j^T$$
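As an illustration (a NumPy sketch of my own, not from the text), we can confirm that A is recovered as a sum of r rank-one terms:

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [2.0, 3.0],
              [2.0, -2.0]])
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-12))

# A equals the sum of the r rank-one matrices sqrt(lambda_j) U_j V_j^T
A_sum = sum(s[j] * np.outer(U[:, j], Vt[j, :]) for j in range(r))
assert np.allclose(A, A_sum)
```

Truncating this sum after fewer than r terms yields a lower-rank approximation of A, which underlies many applications of the SVD.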

Observation: Note that AA^T = (A^T)^T(A^T) is a positive semidefinite m × m matrix. In fact, we could have used AA^T instead of A^T A in the proof of Theorem 1. Also note that

$$A^T A = (UDV^T)^T (UDV^T) = V D^T D V^T \qquad\qquad AA^T = (UDV^T)(UDV^T)^T = U D D^T U^T$$

These are simply spectral decompositions of A^T A and AA^T. Note too that the diagonal matrix D^T D for A^T A is n × n, while the diagonal matrix DD^T for AA^T is m × m, but both have the same non-zero values on their main diagonals.

If A is a symmetric positive semidefinite n × n matrix, then A^T A = A² = AA^T and the two spectral decompositions can be considered equal with U = V. In fact, the singular value decomposition of A is then A = UDU^T, which is the same as its spectral decomposition. (If a symmetric A has negative eigenvalues, the singular values are the absolute values of the eigenvalues, and the corresponding columns of U and V differ in sign, so the two decompositions no longer coincide.)
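A quick numerical check of the symmetric positive semidefinite case (a NumPy sketch; B^T B is used here just to manufacture such a matrix):

```python
import numpy as np

# A symmetric positive semidefinite matrix (B^T B is always one)
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])
A = B.T @ B

U, s, Vt = np.linalg.svd(A)
lam = np.linalg.eigvalsh(A)[::-1]     # eigenvalues in descending order

# The singular values equal the eigenvalues ...
assert np.allclose(s, lam)
# ... and A = U D U^T, i.e. the SVD doubles as the spectral decomposition
assert np.allclose(A, U @ np.diag(s) @ U.T)
```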

Observation: The columns of U corresponding to the non-zero diagonal elements of D form an orthonormal basis for the range of A, and so the rank of A = the number of non-zero diagonal elements of D. Thus a square matrix is invertible if and only if all the diagonal elements of D are positive. If A is invertible then A^-1 = (UDV^T)^-1 = VD^-1U^T.
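Here is a minimal NumPy sketch of computing an inverse this way (assuming A is square with all singular values positive):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

U, s, Vt = np.linalg.svd(A)
assert np.all(s > 1e-12)              # all diagonal elements of D positive

# A^-1 = V D^-1 U^T with 1/sqrt(lambda_j) on the diagonal of D^-1
A_inv = Vt.T @ np.diag(1.0 / s) @ U.T
assert np.allclose(A_inv, np.linalg.inv(A))
```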

A solution to the equation AX = C (when one exists) can be found as follows:

C = AX = UDV^TX

and so

X = VD*U^TC

where D* is the n × m diagonal matrix whose main diagonal consists of the reciprocals of the positive elements in D followed by zeros. We can view VD*U^T as representing a sort of inverse for A even when A is not a square matrix; it is known as the Moore-Penrose pseudoinverse of A.
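The following NumPy sketch carries out this computation (the right-hand side C is chosen so that the system is consistent; np.linalg.pinv computes the same matrix VD*U^T directly):

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [2.0, 3.0],
              [2.0, -2.0]])
C = A @ np.array([1.0, 1.0])          # consistent right-hand side [5, 5, 0]

U, s, Vt = np.linalg.svd(A)
m, n = A.shape

# D* is n x m with reciprocals of the positive elements of D on its diagonal
D_star = np.zeros((n, m))
for j, sj in enumerate(s):
    if sj > 1e-12:
        D_star[j, j] = 1.0 / sj

X = Vt.T @ D_star @ U.T @ C
assert np.allclose(A @ X, C)          # X solves AX = C
assert np.allclose(Vt.T @ D_star @ U.T, np.linalg.pinv(A))
```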

Observation: The columns of V corresponding to the zero diagonal elements of D form an orthonormal basis for the null space of A, and so the dimension of the null space of A = the number of columns in A minus the rank of A, i.e. n – r in the proof of Theorem 1. Thus any linear combination of these columns of V is a solution to the homogeneous equation AX = 0.

Note that AX = 0 if and only if AX = UDV^TX = 0 if and only if

$$DV^TX = 0$$

since U is invertible. Thus X is a solution of AX = 0 if and only if X′ is a solution of DX′ = 0 where X′ = V^TX. This means that √λj X′j = 0 for all j. But since λj > 0 for j = 1, …, r, it follows that X′j = 0 for such j, and so X = VV^TX = VX′ involves only the final n – r columns of V. Thus if AX = 0 then X is a linear combination of the final n – r columns in V.

Conversely, since λj = 0 (and so AVj = 0) for j = r+1, …, n, any linear combination of the final n – r columns in V is a solution to AX = 0. Since the columns of V are orthonormal and therefore independent, it follows that the final n – r columns of V form a basis for the null space, and so the dimension of the null space is n – r.
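The sketch below (my own NumPy illustration) extracts such a null-space basis from the final n – r columns of V for a rank-deficient matrix:

```python
import numpy as np

# A rank-1 matrix: the second column is twice the first
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-12))            # rank of A (here 1)

# The final n - r rows of V^T (i.e. columns of V) span the null space
null_basis = Vt[r:, :].T              # shape (n, n - r)
assert np.allclose(A @ null_basis, 0)
```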

Real Statistics Functions: The Real Statistics Resource Pack provides the following functions:

SVD_U(R1, iter) = U matrix of the singular value decomposition (SVD) for the matrix A corresponding to range R1; thus A = UDV^T where U and V are orthogonal matrices and D is a diagonal matrix.

SVD_D(R1, iter) = D matrix of the SVD for the matrix A corresponding to range R1

SVD_V(R1, iter) = V matrix of the SVD for the matrix A corresponding to range R1

Here iter is the number of iterations in the algorithm used to compute the SVD (default 100). The current implementation of these functions only supports square matrices.
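For example (assuming the usual Real Statistics convention for array functions), if a 3 × 3 matrix is stored in range A1:C3, you would highlight a 3 × 3 output range, enter =SVD_U(A1:C3) and press Ctrl-Shift-Enter to obtain U; SVD_D and SVD_V are used in the same way.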

Real Statistics Data Analysis Tool: The SVD Factorization option of the Real Statistics Matrix Operations data analysis tool also provides the means to output the singular value decomposition of a square matrix.
