If we have a real symmetric matrix M which is diagonalized as M = QLQ^T, where Q is orthogonal and L is diagonal (this is possible by the spectral theorem), then are Q and L necessarily unique? Is there another orthogonal diagonalization using some Q' != Q or L' != L?

I ask this because it seems many explanations of how to perform PCA with SVD rely on this as an unstated fact. I can't tell if this is because I'm misreading or because I forgot some important theorem from my linear algebra class.

Here's my attempt at a proof:

Suppose a second decomposition exists:

M = Q'L'Q'^T

= (QQ^T) Q'L'Q'^T (QQ^T)

= Q (Q^T Q'L'Q'^T Q) Q^T

Now it suffices to show that L != Q^T Q'L'Q'^T Q. I have the fact that L' and L are diagonal, so if I can show that Q^T Q'L'Q'^T Q is not diagonal, then I would be done, but I'm not sure how.
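As a quick numerical probe of this route (a NumPy sketch of my own, with an assumed random M, not part of the post): take Q' equal to Q with one column negated. Then Q'L'Q'^T still reconstructs M, and the inner matrix Q^T Q'L'Q'^T Q comes out diagonal (equal to L), so this choice of Q' cannot produce the hoped-for contradiction.

```python
import numpy as np

# Assumed setup: a random symmetric M, diagonalized with eigh.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = (A + A.T) / 2                    # real symmetric matrix

vals, Q = np.linalg.eigh(M)          # M = Q L Q^T
L = np.diag(vals)

Qp = Q.copy()
Qp[:, 0] *= -1                       # Q' != Q, but still orthogonal
Lp = L                               # same eigenvalues

assert np.allclose(M, Qp @ Lp @ Qp.T)   # a second valid decomposition

inner = Q.T @ Qp @ Lp @ Qp.T @ Q        # the matrix from the proof attempt
assert np.allclose(inner, L)            # it IS diagonal -- no contradiction here
```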

## Is an orthogonal diagonalization of a real symmetric matrix unique?


### Re: Is an orthogonal diagonalization of a real symmetric matrix unique?

No, orthogonal diagonalization is not unique (kind of). The set of eigenvalues is unique, and the multiplicity of each eigenvalue is well-defined, but these kinds of matrix decompositions are usually unique only up to permutation, and sometimes not even that. You could equivalently have the diagonal elements of L be increasing or decreasing, and all this changes is the order of the columns of Q.
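To make the ordering point concrete, here is a small NumPy illustration (my own sketch, not from the post): `eigh` returns eigenvalues in ascending order, but permuting the columns of Q together with the diagonal entries of L gives an equally valid decomposition of the same M.

```python
import numpy as np

M = np.diag([3.0, 1.0, 2.0])        # a simple symmetric matrix
vals, Q = np.linalg.eigh(M)         # eigh sorts eigenvalues ascending

# Reorder to descending: permute L's diagonal and Q's columns together.
perm = np.argsort(vals)[::-1]
L_desc = np.diag(vals[perm])
Q_desc = Q[:, perm]

assert np.allclose(M, Q @ np.diag(vals) @ Q.T)    # ascending-order version
assert np.allclose(M, Q_desc @ L_desc @ Q_desc.T) # descending-order version
```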


Another problem is if M has repeated eigenvalues. Say that there is some eigenvalue λ with orthogonal eigenvectors v_1 and v_2. Then any nontrivial linear combination of v_1 and v_2 is also an eigenvector of M. For example, you can "decompose" the zero matrix as 0 = Q0Q^T for any suitably-sized matrix Q. You can also write the identity matrix as I = QQ^T for any orthogonal matrix Q. This is the main reason that a lot of results involving matrix decompositions can run into trouble when eigenvalues are repeated.

Edit: I should also point out that since the Q matrix is constructed from the eigenvectors of M, you can multiply any column by -1 and still get a valid (and technically different) decomposition.
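The identity-matrix example above is easy to check numerically. A short NumPy sketch (my own, with an arbitrarily chosen rotation angle): since I has the single repeated eigenvalue 1, any orthogonal Q "diagonalizes" it.

```python
import numpy as np

# An arbitrary rotation matrix: orthogonal, chosen with theta = 0.7 for illustration.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

I = np.eye(2)
assert np.allclose(Q @ Q.T, I)        # Q is orthogonal
assert np.allclose(Q @ I @ Q.T, I)    # yet another "diagonalization" of I
```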
