Linear algebra - Find an orthogonal diagonalization for A=UDU^T where U is an orthogonal matrix and D is a diagonal matrix

Eagerissac · New member · Joined Jan 9, 2020 · Messages: 16
Can anyone tell me what's wrong with the answers I gave for the question in the picture below? I calculated the eigenvalues and got 9 and 12.
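A quick sanity check of those eigenvalues (a sketch with numpy; I'm assuming A is the 3×3 matrix with 10 on the diagonal and 1 everywhere else, which matches the 10 − λ entries in the row reductions below):

```python
import numpy as np

# A as read off the problem: 10s on the diagonal, 1s elsewhere (assumed).
A = np.array([[10.0, 1.0, 1.0],
              [1.0, 10.0, 1.0],
              [1.0, 1.0, 10.0]])

# eigh is for symmetric matrices; eigenvalues come back in ascending order.
eigenvalues, eigenvectors = np.linalg.eigh(A)
print(eigenvalues)  # 9, 9, 12 -- so λ = 9 has multiplicity two
```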

Subtracting each eigenvalue from the diagonal (forming A − λI), I'm left with these two augmented systems:

For λ = 9

[10-9 1 1 | 0]
[1 10-9 1 | 0]
[1 1 10-9 | 0]

RREF gives me:
[1 1 1 | 0]
[0 0 0 | 0]
[0 0 0 | 0]

This gives two basis vectors because there are two free variables, x2 and x3:
[-1]
[1]
[0]

and

[-1]
[0]
[1]

For λ = 12
[10-12 1 1 | 0]
[1 10-12 1 | 0]
[1 1 10-12 | 0]

RREF gives me:
[1 0 -1 | 0]
[0 1 -1 | 0]
[0 0 0 | 0]
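Both row reductions can be double-checked symbolically (a sketch using sympy, assuming the same A with 10s on the diagonal and 1s off it):

```python
from sympy import Matrix, eye

# A as assumed from the 10 - λ entries above.
A = Matrix([[10, 1, 1],
            [1, 10, 1],
            [1, 1, 10]])

# rref() returns the reduced matrix and the tuple of pivot columns.
R9, _ = (A - 9 * eye(3)).rref()
R12, _ = (A - 12 * eye(3)).rref()
print(R9)   # row [1, 1, 1] on top, then two zero rows
print(R12)  # rows [1, 0, -1], [0, 1, -1], [0, 0, 0]
```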

This gives the basis vector:
[1]
[1]
[1]

Normalizing these vectors I get:
√((-1)^2 + (1)^2 + 0^2) = √2

[-1/√2]
[1/√2]
[0]

√((-1)^2 + 0 ^2 + (1)^2) = √2

[-1/√2]
[0]
[1/√2]

and

√((1)^2 + (1)^2 + (1)^2) = √3

[1/√3]
[1/√3]
[1/√3]

This formed my answer for the U matrix. My D matrix is just a diagonal matrix of the eigenvalues I got. The system keeps marking my answer as wrong and I'm not sure why it's incorrect. Am I doing something wrong?
 

Attachments

  • Screenshot (2).png (38.1 KB)
If you multiply [math]UDU^{T}[/math], do you recover [math]A[/math]?

D is good.
The 1/sqrt(3)s look good.
Did you orthogonalize them, or just assume they would be orthogonal?

Maybe it also doesn't understand "sqrt". What is that note at the top of the image?
 
I don't appear to get A when I multiply UDU^T. Something's clearly wrong but I don't know what I did wrong in my steps.
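For what it's worth, here is a numerical version of that check (a sketch, assuming A is the 3×3 matrix with 10s on the diagonal and 1s elsewhere, and using the normalized vectors from my post as the columns of U):

```python
import numpy as np

A = np.array([[10.0, 1.0, 1.0],
              [1.0, 10.0, 1.0],
              [1.0, 1.0, 10.0]])

s2, s3 = np.sqrt(2), np.sqrt(3)
# Columns are the three normalized eigenvectors as computed above.
U = np.column_stack([[-1 / s2, 1 / s2, 0],
                     [-1 / s2, 0, 1 / s2],
                     [1 / s3, 1 / s3, 1 / s3]])
D = np.diag([9.0, 9.0, 12.0])

print(np.allclose(U @ D @ U.T, A))  # False: A is not recovered
print(U[:, 0] @ U[:, 1])            # about 0.5 -- columns 1 and 2 are not orthogonal
```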
 
To begin with, you can't assume two vectors are orthogonal when they are not ...
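A sketch of the fix (same assumed A): apply Gram-Schmidt to the two λ = 9 eigenvectors before normalizing. The λ = 12 eigenvector is automatically orthogonal to both, since eigenvectors of a symmetric matrix for distinct eigenvalues are orthogonal.

```python
import numpy as np

A = np.array([[10.0, 1.0, 1.0],
              [1.0, 10.0, 1.0],
              [1.0, 1.0, 10.0]])

v1 = np.array([-1.0, 1.0, 0.0])  # λ = 9 eigenvector
v2 = np.array([-1.0, 0.0, 1.0])  # λ = 9 eigenvector, not orthogonal to v1
v3 = np.array([1.0, 1.0, 1.0])   # λ = 12 eigenvector

# Gram-Schmidt inside the λ = 9 eigenspace: subtract v2's component along u1.
u1 = v1 / np.linalg.norm(v1)
v2 = v2 - (v2 @ u1) * u1
u2 = v2 / np.linalg.norm(v2)     # works out to (-1/√6, -1/√6, 2/√6)
u3 = v3 / np.linalg.norm(v3)

U = np.column_stack([u1, u2, u3])
D = np.diag([9.0, 9.0, 12.0])

print(np.allclose(U @ D @ U.T, A))  # True: A is recovered
```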
 