Find the mistake with √i

chipan · New member · Joined Dec 30, 2022 · Messages: 2
So I was doing a problem which had to do with √i and √-i. I found that √i=√2(1+i)/2 and so far as I can tell this is correct.

I then figured that √-i=i√i, and this too looks correct.

The problem I get is that, given this, √-i=i√2(1+i)/2. But I've checked and this doesn't seem to be true. So where is the mistake?
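A quick numerical sanity check of the three quantities involved, sketched with Python's cmath module (its sqrt returns the principal square root):

```python
import cmath  # complex math; cmath.sqrt returns the principal square root

print(cmath.sqrt(1j))        # principal root of i:  about 0.7071 + 0.7071j, i.e. √2(1+i)/2
print(cmath.sqrt(-1j))       # principal root of -i: about 0.7071 - 0.7071j
print(1j * cmath.sqrt(1j))   # i·√i:                 about -0.7071 + 0.7071j, a different value
```

So the computer agrees that the principal root of -i and i times the principal root of i do not match.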
 
The main mistake is that the square root function is not well defined for complex numbers; so the problem itself is not valid. If you choose one of the two possible signs for the result, you can't guarantee that √(ab) = √a √b in all cases.

There are two square roots of i, and two square roots of -i. You have made choices for each that do not agree; and there is no way to make this entirely consistent. Instead, always show both possible answers, and everything should work out.
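One way to see the two roots concretely, sketched in Python (cmath.sqrt gives the principal root, and its negative is the other root):

```python
import cmath

r = cmath.sqrt(1j)       # principal square root of i
roots_of_i = [r, -r]     # the other square root is the negative of the first
for z in roots_of_i:
    print(z, z * z)      # each square comes back to i (up to rounding)
```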
 
Well, I only took the principal square root, because that's all the radical calls for. But if I'm understanding correctly, are you suggesting that distributing the radical is invalid for complex values? √(-i)≠i√i?
 
Yes. Does your textbook not explain this?

As a simple example, if you define the radical as just the "positive", then [imath]\sqrt{-1}\cdot\sqrt{-1}=i\cdot i=-1[/imath], while [imath]\sqrt{-1\cdot-1}=\sqrt{-1\cdot1}=\sqrt{1}=1[/imath].
 
Correcting a little typo,

[imath]\sqrt{-1}\cdot\sqrt{-1}=i\cdot i=-1[/imath], while [imath]\sqrt{-1\cdot-1}=\sqrt{1}=1[/imath].

If you do define a principal square root (which is no longer as simple as "take the positive root", since complex numbers are not positive or negative), then you can't assume that the root of the product is the product of the roots. As a result, in evaluating an expression with such roots, you have to evaluate the radicals first, and not take shortcuts.
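A numerical illustration of that failure, again sketched with Python's cmath:

```python
import cmath

a = b = -1
print(cmath.sqrt(a) * cmath.sqrt(b))  # product of the roots: i·i = -1
print(cmath.sqrt(a * b))              # root of the product:  √1 = 1
```

The two lines disagree, which is exactly why the radical of a product can't be split freely once negative or complex numbers are involved.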

Following through on the details, [imath]i[/imath] actually has two square roots, [imath]\pm\frac{\sqrt{2}}{2}(1+i)[/imath]; the one with positive real part is conventionally taken as the principal root in appropriate contexts. Given this, [imath]\sqrt{i}=\frac{\sqrt{2}}{2}(1+i)[/imath] and [imath]\sqrt{-i}=\frac{\sqrt{2}}{2}({\color{red}{1-i}})[/imath].

Note that this is not equal to your [imath]\sqrt{-i}=i\sqrt{i}=i\cdot\frac{\sqrt{2}}{2}(1+i)=\frac{\sqrt{2}}{2}({\color{red}{i-1}})={\color{red}{-}}\frac{\sqrt{2}}{2}({\color{red}{1-i}})[/imath]. Distributing the radical produced the other square root of [imath]-i[/imath], which differs from the principal root by a sign.
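The sign mismatch above can be confirmed numerically, sketched with Python's cmath:

```python
import cmath

lhs = cmath.sqrt(-1j)        # principal root of -i: (√2/2)(1 - i)
rhs = 1j * cmath.sqrt(1j)    # i times the principal root of i: (√2/2)(i - 1)
print(lhs, rhs)
print(abs(lhs + rhs))        # lhs = -rhs, so this sum is essentially 0
```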
 