Square root of a matrix in linear algebra

dxzre

Let A be a 2 x 2 matrix. The matrix A is said to be a square root of the 2 x 2 matrix B if A^2 = B.

The question is:
Prove that there is no square root of [MATH]\left(\begin{matrix}0&1\\0&0\\\end{matrix}\right)[/MATH].

Find a square root of [MATH]\left(\begin{matrix}1&1\\0&0\\\end{matrix}\right)[/MATH].

I would appreciate your help with this question.
 
I would simply write A as [MATH]\left(\begin{matrix}a&b\\c&d\\\end{matrix}\right)[/MATH]. Compute [MATH]A^2[/MATH] and then equate it to, e.g., [MATH]\left(\begin{matrix}0&1\\0&0\\\end{matrix}\right)[/MATH]. This will give equations that have no solution.
Similarly for [MATH]\left(\begin{matrix}1&1\\0&0\\\end{matrix}\right)[/MATH], but there it will yield equations with solutions, which you can find.
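A quick way to make that concrete: the sketch below (assuming Python with SymPy is available) expands [MATH]A^2[/MATH] symbolically, so the four entrywise equations can be read off directly.

[CODE]
from sympy import symbols, Matrix

a, b, c, d = symbols('a b c d')
A = Matrix([[a, b], [c, d]])

# Expand A*A symbolically; each entry gives one of the four equations
print(A * A)
# Matrix([[a**2 + b*c, a*b + b*d], [a*c + c*d, b*c + d**2]])
[/CODE]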
 
Since dxzre has solved this:
Suppose \(\displaystyle \begin{bmatrix}a & b \\ c & d \end{bmatrix}^2= \begin{bmatrix}a & b \\ c & d \end{bmatrix}\begin{bmatrix}a & b \\ c & d \end{bmatrix}= \begin{bmatrix}a^2+ bc & ab+ bd \\ ac+ cd & bc+ d^2\end{bmatrix}= \begin{bmatrix}0 & 1 \\ 0 & 0 \end{bmatrix}\).

\(\displaystyle a^2+ bc= 0\), \(\displaystyle ab+ bd= 1\), \(\displaystyle ac+ cd= 0\), \(\displaystyle bc+ d^2= 0\).

From ac+ cd= c(a+ d)= 0, either c= 0 or a+ d= 0.
If c= 0, \(\displaystyle a^2+ bc= a^2= 0\) so a= 0 and \(\displaystyle bc+ d^2= d^2= 0\) so d= 0.
Then ab+ bd= 0, not 1, a contradiction.

If c is not 0, then a+ d= 0, so d= -a. Then ab+ bd= ab- ab= 0, not 1. So there is no such matrix.
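As a machine check of that conclusion, one can feed the same four equations to a solver. A minimal sketch, again assuming SymPy is available; an empty solution list confirms that no square root exists.

[CODE]
from sympy import symbols, Matrix, solve

a, b, c, d = symbols('a b c d')
A = Matrix([[a, b], [c, d]])
B = Matrix([[0, 1], [0, 0]])

# list(A*A - B) yields the four entrywise expressions set to zero
solutions = solve(list(A * A - B), [a, b, c, d], dict=True)
print(solutions)  # [] -- no solution, matching the argument above
[/CODE]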


Suppose \(\displaystyle \begin{bmatrix}a & b \\ c & d \end{bmatrix}^2= \begin{bmatrix}a & b \\ c & d \end{bmatrix}\begin{bmatrix}a & b \\ c & d \end{bmatrix}= \begin{bmatrix}a^2+ bc & ab+ bd \\ ac+ cd & bc+ d^2\end{bmatrix}= \begin{bmatrix}1 & 1 \\ 0 & 0 \end{bmatrix}\).

\(\displaystyle a^2+ bc= 1\), \(\displaystyle ab+ bd= 1\), \(\displaystyle ac+ cd= 0\), \(\displaystyle bc+ d^2= 0\).
As before, ac+ cd= c(a+ d)= 0 so either c= 0 or a+d= 0.
If c= 0, \(\displaystyle a^2+ bc= a^2= 1\) so a= 1 or -1. \(\displaystyle bc+ d^2= d^2= 0\) so d= 0.
If a= 1, then ab+ bd= b= 1.
So a= b= 1, c= d= 0, giving \(\displaystyle \begin{bmatrix}1 & 1 \\ 0 & 0 \end{bmatrix}\).

Then \(\displaystyle \begin{bmatrix}1 & 1 \\ 0 & 0 \end{bmatrix}^2= \begin{bmatrix}1 & 1 \\ 0 & 0 \end{bmatrix}\begin{bmatrix}1 & 1 \\ 0 & 0 \end{bmatrix}= \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix}\).

If c= 0 and a= -1 then ab+ bd= -b= 1 so b= -1.
a= b= -1, c= d= 0 so \(\displaystyle \begin{bmatrix}-1 & -1 \\ 0 & 0 \end{bmatrix}\).

Then \(\displaystyle \begin{bmatrix}-1 & -1 \\ 0 & 0 \end{bmatrix}^2= \begin{bmatrix}-1 & -1 \\ 0 & 0 \end{bmatrix}\begin{bmatrix}-1 & -1 \\ 0 & 0 \end{bmatrix}= \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix}\).
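Both candidates are also easy to double-check by machine; a minimal SymPy sketch:

[CODE]
from sympy import Matrix

B = Matrix([[1, 1], [0, 0]])
for R in (Matrix([[1, 1], [0, 0]]), Matrix([[-1, -1], [0, 0]])):
    # Each candidate squares back to B, so both are square roots of B
    print(R * R == B)  # prints True twice
[/CODE]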
 
Since dxzre has solved this:
He probably has by now!

And for the second part, in the case [MATH]c\ne 0[/MATH] we have [MATH]a=-d[/MATH], so the final equation [MATH]bc+d^2=0[/MATH] becomes [MATH]bc+a^2=0[/MATH], contradicting the first equation [MATH]a^2+bc=1[/MATH]. So the two square roots found above are the only ones.
 