This is one of the questions in my probability theory textbook and I can't figure it out. Can anyone help?
Let X and Y be random variables.
a. Assuming that all of the expected values actually exist, prove that
(E(XY))² ≤ E(X²)E(Y²)
The book gives a hint: consider E((aX + bY)²) and E((aX − bY)²) for arbitrary a and b, and then apply what you find to well-chosen a and b.
If you expand the first one out you get E(a²X² + 2abXY + b²Y²), or equivalently a²E(X²) + 2abE(XY) + b²E(Y²),
but I don't know what to do with that. I realize it's quadratic in both a and b, so I could try something with the discriminant, which would get me the terms I need, but I don't know whether that's a valid approach or how to justify it if it is.
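This isn't part of the question, but before trying to prove it I ran a quick numeric sanity check of the inequality with NumPy (assuming NumPy is available; the sample distributions here are my own arbitrary choice):

```python
import numpy as np

# Sanity check: (E[XY])^2 <= E[X^2] E[Y^2], estimated from samples.
rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 2 * x + rng.normal(size=100_000)  # deliberately correlated with x

lhs = np.mean(x * y) ** 2            # (E[XY])^2
rhs = np.mean(x ** 2) * np.mean(y ** 2)  # E[X^2] E[Y^2]
print(lhs <= rhs)  # prints True
```

The sample version actually holds exactly (it's Cauchy–Schwarz applied to the data vectors), so the check passes regardless of the distributions chosen.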
b. If the standard deviations of X and Y are S_X and S_Y, prove that |Cov(X, Y)| ≤ S_X S_Y.
I really don't know what to do for this part. I was trying to rewrite the right-hand side using the fact that the standard deviation is the square root of the variance, and then expanding the definition of variance, but I didn't get anywhere.
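Again not part of the question, but here's the analogous numeric check I ran for part b, comparing a sample covariance against the product of the sample standard deviations (NumPy assumed; distributions are my own arbitrary picks):

```python
import numpy as np

# Sanity check: |Cov(X, Y)| <= S_X * S_Y, estimated from samples.
rng = np.random.default_rng(1)
x = rng.exponential(size=100_000)
y = rng.normal(size=100_000) - 0.5 * x  # deliberately correlated with x

cov = np.mean((x - x.mean()) * (y - y.mean()))  # sample covariance
sx, sy = x.std(), y.std()                       # sample std deviations
print(abs(cov) <= sx * sy)  # prints True
```

Like part a, the sample version holds exactly (Cauchy–Schwarz on the centered data vectors), so this always passes; it just convinced me the claim is plausible.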
Thanks in advance!