**Theorem 1** (**Singular Value Decomposition**): For any *m × n* matrix *A* there exist an *m × m* orthogonal matrix *U*, an *n × n* orthogonal matrix *V*, and an *m × n* diagonal matrix *D* with non-negative values on the diagonal such that *A = UDV^{T}*.

In fact, such matrices can be constructed where the columns of *U* are the eigenvectors of *AA^{T}*, the columns of *V* are the eigenvectors of *A^{T}A*, and the main diagonal of *D* contains the square roots of the eigenvalues of *AA^{T}* (or *A^{T}A*) in descending order.

**Proof**: By Property 2 of Positive Definite Matrices, *A^{T}A* is a positive semidefinite *n × n* matrix, and so by Property 1 of Positive Definite Matrices, it is symmetric. By Theorem 1 of Spectral Decomposition, it has a spectral decomposition *A^{T}A = VEV^{T}* where *V* is an orthogonal *n × n* matrix whose columns are unit eigenvectors of *A^{T}A* and *E* is an *n × n* diagonal matrix whose main diagonal consists of the eigenvalues *λ_{1}, …, λ_{n}* of *A^{T}A* in descending order. Since *A^{T}A* is positive semidefinite, these eigenvalues are non-negative. Thus there is an *r* such that *λ_{1}* ≥ … ≥ *λ_{r}* > 0 and *λ_{r+1}* = ⋯ = *λ_{n}* = 0.

Since *V_{j}* is a unit vector, ‖*AV_{j}*‖^{2} = (*AV_{j}*)^{T}(*AV_{j}*) = *V_{j}^{T}A^{T}AV_{j}* = *λ_{j}V_{j}^{T}V_{j}* = *λ_{j}*, and so *AV_{j}* = 0 when *j > r*.

We now construct an *m × m* matrix *U* as follows. First define the first *r* columns of *U* by *U_{j} = AV_{j}/√λ_{j}*. Since the *V_{j}* are orthogonal, so are the *U_{j}*. Since ‖*AV_{j}*‖ = *√λ_{j}*, each *U_{j}* is a unit vector. If *r < m*, then we can expand *U_{1}, …, U_{r}* to an orthonormal basis *U_{1}, …, U_{m}* for the set of *m* × 1 column vectors. In either case let *U* be the *m × m* matrix whose columns are the *U_{j}*. Based on the construction described above, *U* is an orthogonal matrix.

Now let *B = U^{T}AV* = [*b_{ij}*]. Then *b_{ij} = U_{i}^{T}AV_{j} = √λ_{j}U_{i}^{T}U_{j}* for *j ≤ r*, and *b_{ij} = U_{i}^{T}AV_{j}* = 0 for *j > r*. Thus *b_{jj} = √λ_{j}* for *j ≤ r* and *b_{ij}* = 0 otherwise. Let *D* = the *m × n* diagonal matrix whose main diagonal consists of *√λ_{1}, …, √λ_{r}* followed by zeros (if needed). We have just shown that *U^{T}AV = D*, and so *A = UDV^{T}*.
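Readers who want to check the theorem numerically can do so in a few lines of Python. This is a sketch using NumPy (an assumption; the article itself works in Excel), verifying both the factorization *A = UDV^{T}* and the fact that the singular values are the square roots of the eigenvalues of *A^{T}A*:

```python
import numpy as np

# A small rectangular matrix (values chosen arbitrarily for illustration)
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])  # m = 3, n = 2

# NumPy returns U (m x m), the singular values, and V^T (n x n)
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Build the m x n diagonal matrix D from the singular values
D = np.zeros(A.shape)
np.fill_diagonal(D, s)

# A = U D V^T up to floating-point rounding
assert np.allclose(A, U @ D @ Vt)

# The singular values are the square roots of the eigenvalues of A^T A
eigvals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]  # descending order
assert np.allclose(s**2, eigvals)
```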

**Observation**: From the proof of the theorem, it follows that *AV_{j} = √λ_{j}U_{j}* for *j ≤ r* and *AV_{j}* = 0 for *j > r*; equivalently, *AV = UD*.

**Observation**: Note that *AA^{T}* = (*A^{T}*)^{T}(*A^{T}*) is a positive semidefinite *m × m* matrix. In fact, we could have used *AA^{T}* instead of *A^{T}A* in the proof of Theorem 1. Also note that

*A^{T}A = VD^{T}DV^{T}*      *AA^{T} = UDD^{T}U^{T}*

These are simply spectral decompositions of *A^{T}A* and *AA^{T}*. Note too that the diagonal matrix *D^{T}D* for *A^{T}A* is *n × n*, while the diagonal matrix *DD^{T}* for *AA^{T}* is *m × m*, but both have the same non-zero values on their main diagonals.
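The two spectral decompositions above can be verified numerically; again a sketch assuming NumPy (not part of Real Statistics):

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])  # m = 2, n = 3

U, s, Vt = np.linalg.svd(A, full_matrices=True)
D = np.zeros(A.shape)
np.fill_diagonal(D, s)

# A^T A = V (D^T D) V^T is n x n
assert np.allclose(A.T @ A, Vt.T @ (D.T @ D) @ Vt)

# AA^T = U (D D^T) U^T is m x m
assert np.allclose(A @ A.T, U @ (D @ D.T) @ U.T)

# Both D^T D and D D^T carry the same non-zero diagonal entries, s**2
```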

If *A* is a symmetric *n × n* matrix, then *A^{T}A* = *A*^{2} = *AA^{T}* and the two spectral decompositions can be considered equal with *U = V*. In fact, the singular value decomposition of *A* is then *A = UDU^{T}*, which is the same as its spectral decomposition (assuming the eigenvalues of *A* are non-negative).

**Observation**: The columns of *U* corresponding to the non-zero diagonal elements form an orthonormal basis for the range of *A*, and so the rank of *A* = the number of non-zero diagonal elements of *D*. Thus a square matrix is invertible if and only if all the elements in *D* are positive. If *A* is invertible then *A*^{-1} = (*UDV^{T}*)^{-1} = *VD*^{-1}*U^{T}*.
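A short sketch of the rank and inverse claims, assuming NumPy (the tolerance 1e-10 for treating a singular value as non-zero is an illustrative choice):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])  # square and invertible

U, s, Vt = np.linalg.svd(A)

# Rank = number of non-zero singular values (with a tolerance)
rank = int(np.sum(s > 1e-10))
assert rank == 2  # all singular values positive, so A is invertible

# A^{-1} = V D^{-1} U^T when all singular values are positive
A_inv = Vt.T @ np.diag(1.0 / s) @ U.T
assert np.allclose(A_inv, np.linalg.inv(A))
```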

The solutions to the equation *AX = C* can be found as follows:

*C = AX = UDV^{T}X*

and so

*X = VD^{*}U^{T}C*

where *D^{*}* is the *n × m* diagonal matrix whose main diagonal consists of the reciprocals of the positive elements in *D* followed by zeros. We can view *VD^{*}U^{T}* as representing a sort of inverse for *A* even when *A* is not a square matrix.
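The construction *X = VD^{*}U^{T}C* can be sketched as follows (NumPy assumed; the 1e-10 cutoff for deciding which singular values count as positive is an illustrative choice):

```python
import numpy as np

# Overdetermined system AX = C (more rows than columns)
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
C = np.array([[1.0], [2.0], [2.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=True)

# D* : reciprocals of the positive singular values, zeros elsewhere;
# note D* has shape n x m (transposed relative to D)
D_star = np.zeros((A.shape[1], A.shape[0]))
np.fill_diagonal(D_star, np.where(s > 1e-10, 1.0 / s, 0.0))

X = Vt.T @ D_star @ U.T @ C  # the least-squares solution

# Matches NumPy's built-in pseudoinverse
assert np.allclose(X, np.linalg.pinv(A) @ C)
```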

**Observation**: The columns of *V* corresponding to the zero diagonal elements form an orthonormal basis for the null space of *A*, and so the dimension of the null space of *A* = the number of columns in *A* minus the rank of *A*, i.e. *n* – *r* in the proof of Theorem 1. Thus any linear combination of these columns of *V* is a solution to the homogeneous equation *AX* = 0.

Note that *AX* = 0 if and only if *AX = UDV^{T}X* = 0 if and only if *DV^{T}X* = 0, since *U* is invertible. Thus *X* is a solution of *AX* = 0 if and only if *X*′ is a solution of *DX*′ = 0 where *X*′ = *V^{T}X*. This means that *√λ_{j}x′_{j}* = 0 for all *j*, where *x′_{j}* is the *j*th entry of *X*′. But since *λ_{j}* > 0 for *j* = 1, …, *r*, it follows that *x′_{j}* = 0 for such *j*, and so *X = VX*′ is a linear combination of the final *n – r* columns of *V*. Thus if *AX* = 0 then *X* is a linear combination of the final *n – r* columns in *V*.

Conversely, since *λ_{j}* = 0 for *j* = *r*+1, …, *n*, each of the final *n – r* columns *V_{j}* of *V* satisfies *AV_{j}* = 0 (as shown in the proof of Theorem 1), and so any linear combination of these columns is a solution to *AX* = 0. Since the columns of *V* are orthonormal and therefore independent, it follows that the final *n – r* columns of *V* form a basis for the null space, and so the dimension of the null space is *n – r*.
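A numerical sketch of this null-space observation, assuming NumPy (the example matrix is made rank-deficient on purpose):

```python
import numpy as np

# A rank-deficient matrix: third column = first + second, so rank r = 2
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))
assert r == 2

# The final n - r rows of V^T (i.e. columns of V) span the null space
null_basis = Vt[r:].T            # shape n x (n - r)
assert np.allclose(A @ null_basis, 0.0, atol=1e-10)

# Any linear combination of these columns also solves AX = 0
x = null_basis @ np.array([[2.5]])
assert np.allclose(A @ x, 0.0, atol=1e-10)
```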

**Real Statistics Functions**: The Real Statistics Resource Pack provides the following functions:

**SVD_U**(R1, *iter*) = *U* matrix of the singular value decomposition (SVD) for the matrix *A* corresponding to range R1; thus *A = UDV^{T}* where *U* and *V* are orthogonal matrices and *D* is a diagonal matrix.

**SVD_D**(R1, *iter*) = *D* matrix of the SVD for the matrix *A* corresponding to range R1

**SVD_V**(R1, *iter*) = *V* matrix of the SVD for the matrix *A* corresponding to range R1

Here *iter* is the number of iterations in the algorithm used to compute the SVD (default 100).
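For readers working outside Excel, rough Python equivalents of these three functions might look like the following (a sketch assuming NumPy; the names `svd_u`, `svd_d`, `svd_v` are my own, and singular vectors are only determined up to sign, so outputs may differ in sign from Real Statistics):

```python
import numpy as np

def svd_u(A):
    """U matrix of the SVD of A (analogue of SVD_U)."""
    return np.linalg.svd(A, full_matrices=True)[0]

def svd_d(A):
    """m x n diagonal matrix D of the SVD of A (analogue of SVD_D)."""
    s = np.linalg.svd(A, compute_uv=False)
    D = np.zeros(A.shape)
    np.fill_diagonal(D, s)
    return D

def svd_v(A):
    """V matrix of the SVD of A (analogue of SVD_V)."""
    return np.linalg.svd(A, full_matrices=True)[2].T

A = np.array([[2.0, 1.0], [1.0, 2.0], [0.0, 1.0]])
assert np.allclose(svd_u(A) @ svd_d(A) @ svd_v(A).T, A)
```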

**Real Statistics Data Analysis Tool**: The **SVD Factorization** option of the Real Statistics **Matrix Operations** data analysis tool also provides the means to output the singular value decomposition of a square matrix.

SVD_U is sometimes giving a #VALUE! error. It has happened with many matrices; I will give an example. Any help is greatly appreciated.

-0.088721917 -0.611548267 -0.962796502

-5.647089783 -0.502167919 -0.79059257

-3.790783085 -0.337095695 -0.530709632

-2.34760153 -0.208760657 -0.328664214

-1.431077773 -0.127258708 -0.200350888

-0.907758067 -0.08072246 -0.127086129

Rami,

Perhaps you are using an old version of the Real Statistics software. In one of the last few versions I corrected an error in the SVD_U function. The answer I get now for U is

-0.037381909 -0.999301052 -9.10094E-09 2.98519E-10 -7.01112E-11 4.20293E-10

-0.763403531 0.028557441 -0.018845734 -0.082496343 0.091553822 -0.071487468

-0.512458152 0.019170063 0.018095374 -0.401453297 -0.322848265 -0.214276705

-0.317361219 0.011871866 -0.013228788 0.874574071 -0.250001454 0.15765527

-0.193460679 0.007236988 0.040372035 0.104632223 0.908208565 -0.018597225

-0.122715547 0.00459055 0.012237016 -0.237069389 -0.006585681 0.961131622

Charles

Charles,

Thank you for your reply. I downloaded the file from http://www.real-statistics.com/free-download/real-statistics-resource-pack/ for Excel 2016. Is there any other site from which I could download the newer version of the pack?

Rami,

That contains the newest version, Release 5.2.

Charles

Thank you VERY MUCH for your e-mail reply with the attachment showing that SVD_U does indeed work properly on the example matrix. I went back to your website and downloaded the add-in anew to make sure I have the most current version. Voila, indeed, the SVD_U function works 🙂

SVD_U sometimes gives a #VALUE! error. At first, I thought this only happened when U contained near-zero values. Today, however, I found this same error on this simple matrix:

1 16.85 1.46

1 24.81 -4.61

1 18.85 -0.21

1 12.63 4.93

1 21.38 -1.36

1 18.78 -0.08

1 15.58 2.98

1 16.3 1.73

This matrix is from this article on SVD and regression analysis:

https://pdfs.semanticscholar.org/aef2/68c21be034bfd6228bf3946cb46e3c62cdb1.pdf

The comparable SVDU function from the old Digilander site’s matrix.xla returns values without any problem (but I don’t like using it in today’s Excel – it was written for an older version and seems to cause some problems in current Excel). The Digilander site is no longer up, but it appears you can download the old matrix.xla here:

http://www.bowdoin.edu/~rdelevie/excellaneous/#downloads

I just mention matrix.xla in case it may be helpful in solving the problems with SVD_U.

Monte,

The SVD_U function seems to work fine on this matrix and doesn’t return error values. I just tested it on my computer and will send you the results by email.

Charles

I’m not able to find the SVD functions in the Real Statistics function list installed on my Mac.

Pierluigi,

Unfortunately, this function is not yet supported on the Mac version of the software, only the Windows version.

Charles

Hi Charles,

Do you have a SVD function implemented?

Thanks,

Jun

Jun,

I don’t have an SVD function implemented, but it wouldn’t be hard to do. I’ll try to add it shortly.

Charles