**Definition 1**: An *n × n* symmetric matrix *A* is positive definite if for any *n* × 1 column vector *X* ≠ 0, *X ^{T}AX* > 0.

*A* is positive semidefinite if for any *n* × 1 column vector *X*, *X*^{T}*AX* ≥ 0.

**Observation**: Note that if *A* = [*a*_{ij}] and *X* = [*x*_{i}], then

*X*^{T}*AX* = Σ_{i} Σ_{j} *a*_{ij}*x*_{i}*x*_{j}

If we set *X* to be the column vector with *x*_{k} = 1 and *x*_{i} = 0 for all *i* ≠ *k*, then *X*^{T}*AX* = *a*_{kk}, and so if *A* is positive definite, then *a*_{kk} > 0, which means that all the entries on the diagonal of *A* are positive. Similarly, if *A* is positive semidefinite, then all the elements on its diagonal are non-negative.

**Property 1**: If *B* is an *m × n* matrix, then *A = B*^{T}*B* is symmetric.

Proof: If *B* = [*b*_{ij}] is an *m × n* matrix, then *A = B*^{T}*B* = [*a*_{kj}] is an *n × n* matrix where *a*_{kj} = Σ_{i} *b*_{ik}*b*_{ij}. *A* is symmetric since by Property 1 of Matrix Operations,

*A*^{T} = (*B*^{T}*B*)^{T} = *B*^{T}(*B*^{T})^{T} = *B*^{T}*B* = *A*

**Observation**: If *X* = [*x*_{i}] is an *m* × 1 column vector, then *X*^{T}*X* = Σ_{i} *x*_{i}^{2}.

**Property 2**: If *B* is an *m × n* matrix, then *A = B*^{T}*B* is positive semidefinite.

Proof: As we observed in Property 1, *A* is a symmetric *n × n* matrix. For any *n* × 1 column vector *X*, *BX* is an *m* × 1 column vector [*c*_{i}] where *c*_{i} = Σ_{j} *b*_{ij}*x*_{j}, and so

*X*^{T}*AX* = *X*^{T}*B*^{T}*BX* = (*BX*)^{T}(*BX*) = Σ_{i} *c*_{i}^{2} ≥ 0
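Properties 1 and 2 can be verified numerically. A short numpy sketch (not part of the original page; *B* here is just a random example matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 3))   # arbitrary m x n matrix (m=5, n=3)
A = B.T @ B                       # n x n

# Property 1: A = B^T B is symmetric
assert np.allclose(A, A.T)

# Property 2: the quadratic form X^T A X equals ||BX||^2 >= 0
X = rng.standard_normal(3)
assert np.isclose(X @ A @ X, np.sum((B @ X) ** 2))
assert X @ A @ X >= 0
```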

**Property 3**: If *B* is an *m × n* matrix of rank *n* where *n* ≤ *m*, then *A = B*^{T}*B* is a positive definite matrix.

Proof: From the proof of Property 2, we know that *X*^{T}*AX* = Σ_{i} *c*_{i}^{2} for any *n* × 1 column vector *X*, where *BX* = [*c*_{i}]. Now let *X* be any non-null *n* × 1 column vector. If all the *c*_{i} are zero, then *BX* = 0. But then by Property 3 of Matrix Rank, it follows that *X* = 0, which is a contradiction. Since *BX* ≠ 0, at least one of the *c*_{i} ≠ 0, and so Σ_{i} *c*_{i}^{2} > 0, which means that *X*^{T}*AX* = Σ_{i} *c*_{i}^{2} > 0, and so *A* is positive definite.
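A numerical illustration of Property 3 (a sketch, not part of the original page; the 3 × 2 matrix *B* below is a made-up full-rank example):

```python
import numpy as np

# 3 x 2 matrix with rank 2 = n <= m = 3
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
assert np.linalg.matrix_rank(B) == 2

A = B.T @ B
# all eigenvalues of B^T B are positive, so A is positive definite
eigvals = np.linalg.eigvalsh(A)
assert np.all(eigvals > 0)
```

Here *A* = *B*^{T}*B* works out to [[2, 1], [1, 2]], whose eigenvalues are 1 and 3, both positive.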

**Property 4**: The following are equivalent for a symmetric *n × n* matrix *A*:

(a) *A* is positive semidefinite
(b) There is a matrix *U* such that *A = U*^{T}*U*
(c) All the eigenvalues of *A* are non-negative

Proof: Assume (c) and show (b). Since *A* is symmetric, by Theorem 1 of Spectral Decomposition, *A* has a spectral decomposition *A = CDC*^{T} where the main diagonal of *D* consists of the eigenvalues *λ*_{1}, …, *λ*_{n} of *A*. By assumption these are all non-negative, and so there exists the diagonal matrix *D*^{½} whose main diagonal consists of √*λ*_{1}, …, √*λ*_{n}. Since *D*^{½}*D*^{½} = *D*, we have

*A* = *CDC*^{T} = *CD*^{½}*D*^{½}*C*^{T} = (*CD*^{½})(*CD*^{½})^{T} = ((*CD*^{½})^{T})^{T}(*CD*^{½})^{T}

and so the desired matrix is *U* = (*CD*^{½})^{T}.

Assume (b) and show (a). Let *X* be any *n* × 1 column vector. Then

*X*^{T}*AX* = *X*^{T}*U*^{T}*UX* = (*UX*)^{T}(*UX*) = ||*UX*||^{2} ≥ 0

Assume (a) and show (c). Let *A* be positive semidefinite and let *X* be an eigenvector corresponding to eigenvalue *λ*. Since *A* is positive semidefinite, *X*^{T}*AX* ≥ 0. Since *X* is an eigenvector corresponding to *λ*, *AX = λX*, and so 0 ≤ *X*^{T}*AX* = *X*^{T}*λX* = *λX*^{T}*X*. Since *X*^{T}*X* = ||*X*||^{2} > 0, it follows that *λ* ≥ 0.
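The (c) → (b) construction in the proof can be carried out directly with numpy. A minimal sketch (not part of the original page; the matrix *A* is a made-up example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])      # symmetric, eigenvalues 1 and 3

# (c) -> (b): build U = (C D^(1/2))^T from the spectral decomposition A = C D C^T
lam, C = np.linalg.eigh(A)      # eigenvalues and orthonormal eigenvectors
assert np.all(lam >= 0)         # (c) holds
D_half = np.diag(np.sqrt(lam))
U = (C @ D_half).T
assert np.allclose(U.T @ U, A)  # (b): A = U^T U

# (b) -> (a): X^T A X = ||UX||^2 >= 0 for any X
X = np.array([1.0, -2.0])
assert np.isclose(X @ A @ X, np.sum((U @ X) ** 2))
```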

**Property 5**: The following are equivalent for a symmetric *n × n* matrix *A*:

(a) *A* is positive definite
(b) There is an invertible matrix *U* such that *A = U*^{T}*U*
(c) All the eigenvalues of *A* are positive

Proof: Assume (c) and show (b). Since *A* is symmetric, by Theorem 1 of Spectral Decomposition, *A* has a spectral decomposition *A = CDC*^{T} where the main diagonal of *D* consists of the eigenvalues *λ*_{1}, …, *λ*_{n} of *A*. By assumption these are all positive, and so there exists the diagonal matrix *D*^{½} whose main diagonal consists of √*λ*_{1}, …, √*λ*_{n}. Since *D*^{½}*D*^{½} = *D*, we have

*A* = *CDC*^{T} = (*CD*^{½})(*CD*^{½})^{T} = ((*CD*^{½})^{T})^{T}(*CD*^{½})^{T}

and so the desired matrix is *U* = (*CD*^{½})^{T}, provided we can show that *U* is invertible. Now *C* is an orthogonal matrix, and so *C*^{-1} = *C*^{T}. Since *D*^{½} is a diagonal matrix, det *D*^{½} is the product of the elements on its main diagonal. Since all these elements are positive, it follows that det *D*^{½} ≠ 0, and so *D*^{½} is invertible. Thus *U* is invertible with inverse ((*D*^{½})^{-1}*C*^{T})^{T}, which is *CE*, where *E* is the diagonal matrix whose main diagonal consists of the elements 1/√*λ*_{1}, …, 1/√*λ*_{n}.

Assume (b) and show (a). Let *X* be any non-null *n* × 1 column vector. Then

*X*^{T}*AX* = *X*^{T}*U*^{T}*UX* = (*UX*)^{T}(*UX*) = ||*UX*||^{2}

If ||*UX*||^{2} = 0 then *UX* = 0, and since *U* is invertible, *X* = *U*^{-1}*UX* = 0, which is a contradiction. Thus *X*^{T}*AX* = ||*UX*||^{2} > 0.

Assume (a) and show (c). Let *A* be positive definite and let *X* be an eigenvector corresponding to eigenvalue *λ*. Since *A* is positive definite, *X*^{T}*AX* > 0. Since *X* is an eigenvector corresponding to *λ*, *AX = λX*, and so 0 < *X*^{T}*AX* = *X*^{T}*λX* = *λX*^{T}*X*. Since *X*^{T}*X* = ||*X*||^{2} > 0, it follows that *λ* > 0.
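The extra ingredient in Property 5, compared with Property 4, is the invertibility of *U*, with inverse *CE* where *E* = diag(1/√*λ*_{i}). A numpy sketch of this (not part of the original page; *A* is a made-up positive definite example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])          # symmetric, positive definite

lam, C = np.linalg.eigh(A)
assert np.all(lam > 0)              # (c): all eigenvalues positive

D_half = np.diag(np.sqrt(lam))
U = (C @ D_half).T                  # A = U^T U with U invertible
assert np.allclose(U.T @ U, A)

# the inverse of U is CE, where E = diag(1/sqrt(lam_i))
E = np.diag(1.0 / np.sqrt(lam))
assert np.allclose(U @ (C @ E), np.eye(2))
```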

**Property 6**: The determinant of a positive definite matrix is positive. Furthermore, a positive semidefinite matrix is positive definite if and only if it is invertible.

Proof: The first assertion follows from Property 1 of Eigenvalues and Eigenvectors and Property 5. The second follows from the first and Property 4 of Linear Independent Vectors.

**Observation**: If *A* is a positive semidefinite matrix, it is symmetric, and so it makes sense to speak about the spectral decomposition of *A*.

**Definition 2**: If *A* is a positive semidefinite matrix, then the square root of *A*, denoted *A*^{½}, is defined to be the *n × n* matrix *CD*^{½}*C*^{T} where *C* is as defined in Definition 1 of Symmetric Matrices and *D*^{½} is the diagonal matrix whose main diagonal consists of √*λ*_{1}, …, √*λ*_{n}.

**Property 7**: If *A* is a positive semidefinite matrix, then *A*^{½} is a symmetric matrix and *A* = *A*^{½}*A*^{½}.

Proof: Since a diagonal matrix is symmetric, we have

(*A*^{½})^{T} = (*CD*^{½}*C*^{T})^{T} = (*C*^{T})^{T}(*D*^{½})^{T}*C*^{T} = *CD*^{½}*C*^{T} = *A*^{½}

Also, since *C* is orthogonal, *C*^{T}*C* = *I*, and so

*A*^{½}*A*^{½} = *CD*^{½}*C*^{T}*CD*^{½}*C*^{T} = *CD*^{½}*D*^{½}*C*^{T} = *CDC*^{T} = *A*

**Property 8**: Any covariance matrix is positive semidefinite. If the covariance matrix is invertible then it is positive definite.

Proof: We will show the proof for the sample covariance matrix *S* for *X*. The proof for a population covariance matrix is similar. Note that

*S* = (1/(*n*–1)) Σ_{j} (*X*_{j} – *X̄*)(*X*_{j} – *X̄*)^{T}

where *X* = [*x*_{ij}] is a *k × n* matrix such that for each *i*, {*x*_{ij}: 1 ≤ *j* ≤ *n*} is a random sample for the random variable *x*_{i}, *X*_{j} is the *j*th column of *X*, and *X̄* is the *k* × 1 column vector of sample means. Now let *Y* be any *k* × 1 column vector. Thus

*Y*^{T}*SY* = (1/(*n*–1)) Σ_{j} *Y*^{T}(*X*_{j} – *X̄*)(*X*_{j} – *X̄*)^{T}*Y*

Now *Y*^{T}(*X*_{j} – *X̄*) and (*X*_{j} – *X̄*)^{T}*Y* can each be represented as a dot product, and so they evaluate to the same scalar *c*_{j}. Thus

*Y*^{T}*SY* = (1/(*n*–1)) Σ_{j} *c*_{j}^{2} ≥ 0

which shows that any covariance matrix is positive semidefinite. The second assertion follows from Property 6.
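A quick numerical check of Property 8 (a sketch, not part of the original page; the data matrix is randomly generated):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 10))   # 3 variables, each row a sample of size 10

# sample covariance matrix (rows treated as variables, as in np.cov)
S = np.cov(X)

# the quadratic form Y^T S Y is a sum of squares, hence non-negative
Y = rng.standard_normal(3)
assert Y @ S @ Y >= 0

# equivalently (Property 4), all eigenvalues of S are non-negative
# (allowing a tiny tolerance for floating-point rounding)
assert np.all(np.linalg.eigvalsh(S) >= -1e-12)
```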

**Observation**: A consequence of Property 4 and 8 is that all the eigenvalues of a covariance (or correlation) matrix are non-negative real numbers.

**Real Statistics Function**: The Real Statistics Resource Pack provides the following supplemental array function, where R1 is a *k × k* range in Excel

**MSQRT**(R1): Produces a *k × k* array which is the square root of the matrix represented by range R1

**Example 1**: Find the square root of the matrix in range A4:C6 of Figure 1.

**Figure 1 – Square root of a matrix**

Range A9:C9 contains the eigenvalues of matrix *A* and range A10:C12 contains the corresponding eigenvectors (which are repeated as matrix *C*). These can be calculated using eVECTORS(A4:C6). *D*^{½} is a diagonal matrix whose main diagonal consists of the square roots of the eigenvalues.

The square root of *A* is therefore given in range I4:K6, calculated by the array formula

=MMULT(E4:G6,MMULT(E9:G11,E14:G16))

The same result can be achieved using the supplemental array formula =MSQRT(A4:C6).

Note that the spectral decomposition *A = CDC*^{T} is captured by the array formula

=MMULT(E4:G6,MMULT(DIAGONAL(A9:C9),E14:G16))
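The same construction can be sketched outside of Excel. The function below (a hypothetical name, not the Real Statistics MSQRT itself) computes the square root of a positive semidefinite matrix via the spectral decomposition *A = CDC*^{T}, as in Definition 2:

```python
import numpy as np

def msqrt(A):
    """Square root of a positive semidefinite matrix via A = C D C^T."""
    lam, C = np.linalg.eigh(A)
    lam = np.clip(lam, 0.0, None)        # guard against tiny negative rounding
    return C @ np.diag(np.sqrt(lam)) @ C.T

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])               # made-up positive definite example
R = msqrt(A)
assert np.allclose(R, R.T)               # A^(1/2) is symmetric (Property 7)
assert np.allclose(R @ R, A)             # and squares back to A
```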

Charles, is there any reason why a correlation matrix would show up with eVECTORS generating all positive eigenvalues (i.e., positive semidefinite), whereas a covariance matrix created with the exact same underlying data shows up with eVECTORS generating a couple of negative eigenvalues? I have tried several different “iter” parameters (default, 1,000, 10,000 for an 11 x 11 matrix).

My takeaway from Property 8 above would be that if the covariance matrix was a valid one, it should be generating all positive eigenvalues.

Victor,

If you send me an Excel file with an example where you get negative eigenvalues from a covariance matrix I will try to figure out what is going on. See the Contact Us webpage for my email address.

Charles

Just sent you an email earlier today, Charles. Thanks very much.

Victor,

I received your email and will respond shortly.

Charles

What do you mean in Property 8 when you say:

“Now the following are the same scalar c”?

Thanks

Mark,

The two referenced matrices can be re-expressed as a dot product, which is a scalar. Since A dot B = B dot A, they are in fact the same scalar, which I will call c (actually it should be called c with a subscript i).

I have just updated the webpage to make it a little clearer.

Charles

Thank you Charles

Hi… I am trying to understand proof of Property 3: what do you mean ‘ …but by Property 4 of Matrix Operations, if follows that X = 0’ … Could you explain it in more detail? thank you

Property 4 of Matrix Operations is the wrong reference. It should be Property 3 of Rank of Matrix. I have corrected the referenced webpage. Thanks for bringing this error to my attention.

Charles

hello, lead me please

If M is spd matrix show that none of its diagonal elements can be nonpositive.

Thank you

Since A is a positive definite n × n matrix (for some n), for any n × 1 non-zero column vector X, X^{T}AX > 0. Try selecting X such that all the entries are zero except for one entry, which is 1.

Charles