# Positive Definite Matrices

Definition 1: An n × n symmetric matrix A is positive definite if for every n × 1 column vector X ≠ 0, $X^T A X > 0$. A is positive semidefinite if for every n × 1 column vector X, $X^T A X \ge 0$.

Observation: Note that if A = [a<sub>ij</sub>] and X = [x<sub>i</sub>], then

$X^T A X = \sum_{i=1}^n \sum_{j=1}^n a_{ij} x_i x_j$

If we set X to be the column vector with $x_k = 1$ and $x_i = 0$ for all i ≠ k, then $X^T A X = a_{kk}$, and so if A is positive definite, then $a_{kk} > 0$, which means that all the entries on the main diagonal of A are positive. Similarly, if A is positive semidefinite, then all the elements on its main diagonal are non-negative.
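The observation above is easy to check numerically. The following sketch (assuming NumPy is available; the matrix A is just an example chosen for illustration) verifies that choosing X as a standard basis vector picks out a diagonal entry:

```python
import numpy as np

# Hypothetical example: a symmetric positive definite matrix A.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

n = A.shape[0]
for k in range(n):
    X = np.zeros(n)
    X[k] = 1.0                    # X has a 1 in position k, 0 elsewhere
    assert X @ A @ X == A[k, k]   # X^T A X picks out the diagonal entry a_kk
    assert A[k, k] > 0            # so each diagonal entry must be positive
```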

Property 1: If B is an m × n matrix, then $A = B^T B$ is symmetric.

Proof: If B = [b<sub>ij</sub>] is an m × n matrix, then $A = B^T B = [a_{kj}]$ is an n × n matrix where $a_{kj} = \sum_{i=1}^m b_{ik} b_{ij}$. A is symmetric since by Property 1 of Matrix Operations, $A^T = (B^T B)^T = B^T (B^T)^T = B^T B = A$.

Observation: If X = [x<sub>i</sub>] is an m × 1 column vector, then $X^T X = \sum_{i=1}^m x_i^2$.

Property 2: If B is an m × n matrix, then $A = B^T B$ is positive semidefinite.

Proof: As we observed in Property 1, A is a symmetric n × n matrix. For any n × 1 column vector X, BX is an m × 1 column vector [c<sub>i</sub>] where $c_i = \sum_{k=1}^n b_{ik} x_k$, and so

$X^T A X = X^T B^T B X = (BX)^T (BX) = \sum_{i=1}^m c_i^2 \ge 0$

Property 3: If B is an m × n matrix of rank n where n ≤ m, then $A = B^T B$ is a positive definite matrix.

Proof: From the proof of Property 2, we know that $X^T A X = \sum_{i=1}^m c_i^2$ for any n × 1 column vector X, where [c<sub>i</sub>] = BX. Now let X be any non-null n × 1 column vector. If all the $c_i$ are zero, then BX = 0, and so by Property 3 of Matrix Rank it follows that X = 0, which is a contradiction. Thus BX ≠ 0, and so at least one $c_i \ne 0$, which means that $X^T A X = \sum_{i=1}^m c_i^2 > 0$, and so A is positive definite.
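Properties 1–3 can be illustrated with a small NumPy sketch (NumPy assumed; B is an arbitrary example of full column rank, not taken from the text):

```python
import numpy as np

# Hypothetical example: a 3 x 2 matrix B of rank 2 (so n = 2 <= m = 3).
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
A = B.T @ B                     # A = B^T B

assert np.allclose(A, A.T)      # A is symmetric (Property 1)

# x^T A x = ||Bx||^2 > 0 for non-zero x (Property 3); test a few vectors.
for x in [np.array([1.0, 0.0]), np.array([1.0, -1.0]), np.array([2.0, 3.0])]:
    assert x @ A @ x > 0
```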

Property 4: The following are equivalent for a symmetric n × n matrix A:

(a) A is positive semidefinite
(b) There is a matrix U such that $A = U^T U$
(c) All the eigenvalues of A are non-negative

Proof: Assume (c) and show (b). Since A is symmetric, by Theorem 1 of Spectral Decomposition, A has a spectral decomposition $A = CDC^T$ where D is the diagonal matrix whose main diagonal consists of the eigenvalues λ1, …, λn of A. By assumption these are all non-negative, and so we can define the diagonal matrix $D^{1/2}$ whose main diagonal consists of $\sqrt{\lambda_1}$, …, $\sqrt{\lambda_n}$. Since $D^{1/2} D^{1/2} = D$, we have

$A = CDC^T = CD^{1/2}D^{1/2}C^T = (CD^{1/2})(CD^{1/2})^T$

and so the desired matrix is $U = (CD^{1/2})^T$, since then $U^T U = (CD^{1/2})(CD^{1/2})^T = A$.

Assume (b) and show (a). Let X be any n × 1 column vector. Then

$X^T A X = X^T U^T U X = (UX)^T (UX) = \|UX\|^2 \ge 0$

Assume (a) and show (c). Let A be positive semidefinite and let X be an eigenvector corresponding to eigenvalue λ. Since A is positive semidefinite, $X^T A X \ge 0$. Since X is an eigenvector corresponding to λ, AX = λX, and so $0 \le X^T A X = X^T \lambda X = \lambda X^T X$. Since X is an eigenvector, X ≠ 0, and so $X^T X = \|X\|^2 > 0$, from which it follows that λ ≥ 0.
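A quick numerical check of one direction of Property 4: any matrix U yields a positive semidefinite $A = U^T U$, whose eigenvalues are therefore non-negative. A minimal sketch (NumPy assumed; U is an arbitrary made-up example):

```python
import numpy as np

# Hypothetical example: an arbitrary 3 x 2 matrix U.
U = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
A = U.T @ U                     # A = U^T U is symmetric positive semidefinite

# eigvalsh returns the eigenvalues of a symmetric matrix in ascending order.
eigenvalues = np.linalg.eigvalsh(A)
assert np.all(eigenvalues >= -1e-10)   # non-negative, up to round-off
```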

Property 5: The following are equivalent for a symmetric n × n matrix A:

(a) A is positive definite
(b) There is an invertible matrix U such that $A = U^T U$
(c) All the eigenvalues of A are positive

Proof: Assume (c) and show (b). Since A is symmetric, by Theorem 1 of Spectral Decomposition, A has a spectral decomposition $A = CDC^T$ where D is the diagonal matrix whose main diagonal consists of the eigenvalues λ1, …, λn of A. By assumption these are all positive, and so we can define the diagonal matrix $D^{1/2}$ whose main diagonal consists of $\sqrt{\lambda_1}$, …, $\sqrt{\lambda_n}$. Since $D^{1/2}D^{1/2} = D$, we have

$A = CDC^T = CD^{1/2}D^{1/2}C^T = (CD^{1/2})(CD^{1/2})^T$

and so the desired matrix is $U = (CD^{1/2})^T$, provided we can show that U is invertible. Now C is an orthogonal matrix, and so $C^{-1} = C^T$. Since $D^{1/2}$ is a diagonal matrix, $\det D^{1/2}$ is the product of the elements on its main diagonal. Since all of these are positive, $\det D^{1/2} \ne 0$, and so $D^{1/2}$ is invertible. Thus U is invertible with inverse $((D^{1/2})^{-1}C^T)^T = CE$, where E is the diagonal matrix whose main diagonal consists of $\frac{1}{\sqrt{\lambda_1}}$, …, $\frac{1}{\sqrt{\lambda_n}}$.

Assume (b) and show (a). Let X be any non-zero n × 1 column vector. Then

$X^T A X = X^T U^T U X = (UX)^T (UX) = \|UX\|^2$

If $\|UX\|^2 = 0$ then UX = 0. Since U is invertible, $X = U^{-1}UX = 0$, which is a contradiction. Thus $X^T A X = \|UX\|^2 > 0$.

Assume (a) and show (c). Let A be positive definite and let X be an eigenvector corresponding to eigenvalue λ. Since A is positive definite, $X^T A X > 0$. Since X is an eigenvector corresponding to λ, AX = λX, and so $0 < X^T A X = X^T \lambda X = \lambda X^T X$. Since $X^T X = \|X\|^2 > 0$, it follows that λ > 0.
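For a positive definite A, one concrete invertible U with $A = U^T U$ comes from the Cholesky factorization (a different factorization than the spectral one used in the proof, but it satisfies the same equation). A minimal sketch (NumPy assumed; A is an example matrix):

```python
import numpy as np

# Hypothetical example: a symmetric positive definite matrix A.
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# np.linalg.cholesky returns a lower-triangular L with A = L @ L.T,
# so U = L.T satisfies A = U^T U; U is invertible because its diagonal
# entries are positive.
L = np.linalg.cholesky(A)
U = L.T
assert np.allclose(U.T @ U, A)
assert np.all(np.linalg.eigvalsh(A) > 0)   # all eigenvalues positive
```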

Property 6: The determinant of a positive definite matrix is positive. Furthermore, a positive semidefinite matrix is positive definite if and only if it is invertible.

Proof: The first assertion follows from Property 1 of Eigenvalues and Eigenvectors and Property 5. The second follows from the first and Property 4 of Linear Independent Vectors.

Observation: If A is a positive semidefinite matrix, it is symmetric, and so it makes sense to speak about the spectral decomposition of A.

Definition 2: If A is a positive semidefinite matrix, then the square root of A, denoted $A^{1/2}$, is defined to be the n × n matrix $CD^{1/2}C^T$ where C is as defined in Definition 1 of Symmetric Matrices and $D^{1/2}$ is the diagonal matrix whose main diagonal consists of $\sqrt{\lambda_1}$, …, $\sqrt{\lambda_n}$, the square roots of the eigenvalues of A.

Property 7: If A is a positive semidefinite matrix, then $A^{1/2}$ is a symmetric matrix and $A = A^{1/2}A^{1/2}$.

Proof: Since C is orthogonal, $C^T C = I$, and so

$A^{1/2}A^{1/2} = CD^{1/2}C^T CD^{1/2}C^T = CD^{1/2}D^{1/2}C^T = CDC^T = A$

Since a diagonal matrix is symmetric, we have

$(A^{1/2})^T = (CD^{1/2}C^T)^T = C(D^{1/2})^T C^T = CD^{1/2}C^T = A^{1/2}$

Property 8: Any covariance matrix is positive semidefinite. If the covariance matrix is invertible then it is positive definite.

Proof: We will show the proof for the sample covariance matrix S; the proof for a population covariance matrix is similar. Note that

$S = \frac{1}{n-1}\sum_{j=1}^{n} (X_j - \bar{X})(X_j - \bar{X})^T$

where X = [x<sub>ij</sub>] is a k × n matrix such that for each i, {x<sub>ij</sub> : 1 ≤ j ≤ n} is a random sample for the random variable x<sub>i</sub>, $X_j$ is the jth column of X, and $\bar{X}$ is the k × 1 column vector of sample means. Thus S is a k × k matrix. Now let Y be any k × 1 column vector. Thus

$Y^T S Y = \frac{1}{n-1}\sum_{j=1}^{n} Y^T(X_j - \bar{X})(X_j - \bar{X})^T Y$

Now the matrices $Y^T(X_j - \bar{X})$ and $(X_j - \bar{X})^T Y$ can each be represented as the dot product of Y and $X_j - \bar{X}$, and so they evaluate to the same scalar $c_j$. Thus

$Y^T S Y = \frac{1}{n-1}\sum_{j=1}^{n} c_j^2 \ge 0$

which shows that any covariance matrix is positive semidefinite. The second assertion follows from Property 6.

Observation: A consequence of Properties 4 and 8 is that all the eigenvalues of a covariance (or correlation) matrix are non-negative real numbers.
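This observation is easy to verify with NumPy's covariance routine (NumPy assumed; the data below are made up for illustration):

```python
import numpy as np

# Hypothetical data: 3 variables (rows), 5 observations (columns).
rng = np.random.default_rng(0)
data = rng.normal(size=(3, 5))

S = np.cov(data)                        # 3 x 3 sample covariance matrix
eigenvalues = np.linalg.eigvalsh(S)     # eigenvalues of the symmetric S
assert np.all(eigenvalues >= -1e-10)    # non-negative, up to round-off
```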

Real Statistics Function: The Real Statistics Resource Pack provides the following supplemental array function, where R1 is a k × k range in Excel

MSQRT(R1): Produces a k × k array which is the square root of the matrix represented by range R1

Example 1: Find the square root of the matrix in range A4:C6 of Figure 1.

Figure 1 – Square root of a matrix

Range A9:C9 contains the eigenvalues of matrix A and range A10:C12 contains the corresponding eigenvectors (which are repeated as matrix C). These can be calculated using eVECTORS(A4:C6). D½ is a diagonal matrix whose main diagonal consists of the square roots of the eigenvalues.

The square root of A is therefore given in range I4:K6, calculated by the array formula

=MMULT(E4:G6,MMULT(E9:G11,E14:G16))

The same result can be achieved using the supplemental array formula =MSQRT(A4:C6).

Note that the spectral decomposition $A = CDC^T$ is captured by the array formula

=MMULT(E4:G6,MMULT(DIAGONAL(A9:C9),E14:G16))

### 11 Responses to Positive Definite Matrices

1. behnam says:

If M is an spd matrix, show that none of its diagonal elements can be nonpositive.
Thank you

• Charles says:

Since A is a positive definite nxn matrix (for some n), for any n × 1 non-zero column vector X, $X^T A X > 0$. Try selecting X such that all the entries are zero except for one entry which is 1.
Charles

2. Maricarmen says:

Hi… I am trying to understand proof of Property 3: what do you mean ‘ …but by Property 4 of Matrix Operations, if follows that X = 0’ … Could you explain it in more detail? thank you

• Charles says:

Property 4 of Matrix Operations is the wrong reference. It should be Property 3 of Rank of Matrix. I have corrected the referenced webpage. Thanks for bringing this error to my attention.
Charles

3. Mark says:

What do you mean in Property 8 when you say:

“Now the following are the same scalar c”?

Thanks

• Charles says:

Mark,
The two referenced matrices can be re-expressed as a dot product, which is a scalar. Since A dot B = B dot A, they are in fact the same scalar, which I will call c (actually it should be called c with a subscript i).
I have just updated the webpage to make it a little clearer.
Charles

• Mark says:

Thank you Charles

4. Victor says:

Charles, is there any reason why a correlation matrix would show up with eVECTORS generating all positive eigenvalues (i.e., positive semidefinite), whereas a covariance matrix created with the exact same underlying data shows up with eVECTORS generating a couple of negative eigenvalues? I have tried several different “iter” parameters (default, 1,000, 10,000 for an 11 x 11 matrix).

My takeaway from Property 8 above would be that if the covariance matrix was a valid one, it should be generating all positive eigenvalues.

• Charles says:

Victor,
If you send me an Excel file with an example where you get negative eigenvalues from a covariance matrix I will try to figure out what is going on. See the Contact Us webpage for my email address.
Charles

• Victor says:

Just sent you an email earlier today, Charles. Thanks very much.

• Charles says:

Victor,