Let $f\colon\mathbb{C}\setminus\{-1\}\to\mathbb{C}\setminus\{1\}$, $f(z)=\dfrac{z-1}{z+1}$, let $A=\{z\in\mathbb{C}\mid\Re(z)>0\}$ and let $D=\{z\in\mathbb{C}\mid|z|<1\}$. Prove that $f$ maps $A$ into $D$.
My work: let $z\in A$ be arbitrary. Writing $z=x+iy$ with $x,y\in\mathbb{R}$, we have:
$$|f(z)|=\left|\frac{z-1}{z+1}\right|=\frac{|z-1|}{|z+1|}=\frac{\sqrt{(x-1)^2+y^2}}{\sqrt{(x+1)^2+y^2}}=\sqrt{\frac{x^2-2x+1+y^2}{x^2+2x+1+y^2}}.$$

Since $z\in A$, we have $x>0$ and so $-2x<2x$. Using the facts that the square root is increasing and that the denominator of the latter fraction is positive, we get:

$$\sqrt{\frac{x^2-2x+1+y^2}{x^2+2x+1+y^2}}<\sqrt{\frac{x^2+2x+1+y^2}{x^2+2x+1+y^2}}=1.$$

So $|f(z)|<1$; hence $f(A)\subseteq D$. Is this correct?
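As a quick numerical sanity check (just one sample point of my choosing, not part of the argument), take $z=1+i\in A$:

$$|f(1+i)|=\left|\frac{(1+i)-1}{(1+i)+1}\right|=\frac{|i|}{|2+i|}=\frac{1}{\sqrt{5}}<1,$$

which agrees with the bound above, since the general formula gives $\sqrt{\frac{1-2+1+1}{1+2+1+1}}=\frac{1}{\sqrt{5}}$ for $x=y=1$.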