Correlation Question

abc4616

Given a random variable X with standard deviation x, and a random variable Y = a + bX, where a and b are constants with b ≠ 0, show that the correlation coefficient of X and Y is −1 if b < 0 and +1 if b > 0.

Does anyone know how to prove this?
 
Since \(x\) is the standard deviation of \(X\), the variance identity gives

\(\displaystyle x^2 = E(X^2 ) - E^2 (X)\quad \Rightarrow \quad E(X^2 ) = x^2 + E^2 (X)\)


\(\displaystyle \begin{array}{rcl}
Cov(X,Y) & = & E(XY) - E(X)E(Y) \\
& = & E(aX + bX^2 ) - E(X)E(a + bX) \\
& = & aE(X) + bE(X^2 ) - aE(X) - bE^2 (X) \\
& = & aE(X) + b\left( {x^2 + E^2 (X)} \right) - aE(X) - bE^2 (X) \\
& = & bx^2 \\
\end{array}\)

Since \(\displaystyle Var(Y) = Var(a + bX) = b^2 Var(X) = b^2 x^2\), the standard deviation of \(Y\) is \(|b|x\). Hence

\(\displaystyle \rho _{X,Y} = \frac{{Cov(X,Y)}}{{x \cdot |b|x}} = \frac{{bx^2 }}{{|b|x^2 }} = \frac{b}{{|b|}}\)

which equals \(-1\) when \(b < 0\) and \(+1\) when \(b > 0\).
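
As a quick numerical sanity check (not part of the proof), here is a minimal sketch in Python, assuming NumPy is available; the distribution of X and the particular values of a and b are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# X can follow any distribution with positive variance; normal is just a convenient choice
X = rng.normal(loc=2.0, scale=3.0, size=100_000)

for a, b in [(5.0, -2.0), (5.0, 2.0)]:
    Y = a + b * X
    # np.corrcoef returns the 2x2 correlation matrix; entry [0, 1] is corr(X, Y)
    rho = np.corrcoef(X, Y)[0, 1]
    print(f"a={a}, b={b}: correlation = {rho:.6f}")
```

The two printed correlations come out (up to floating-point error) as −1 and +1, matching \(b/|b|\).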
 