Normal Distribution (Solved)

ScottM (joined May 12, 2013):

Hi all,

I have a very general question about the normal distribution.
Say I have two independent random variables X and Y, and I know their means and standard deviations.
How do you find P(X>Y)?
In the book, the examples only deal with a single random variable, like P(X > c) for some constant c.
The formula for it is like this: [attached screenshot]
I feel like there should be a formula that solves this, but I really can't recall anything useful.
Thanks in advance!
 
Have you considered the distribution of X - Y?

Like tkhunny was hinting at, we can view \(\displaystyle P(X>Y)\) as \(\displaystyle P(X-Y>0)\).

If we let \(\displaystyle \mu_x \), \(\displaystyle \mu_y \), \(\displaystyle \sigma^2_x \), and \(\displaystyle \sigma^2_y \) be the means and variances of \(\displaystyle X\) and \(\displaystyle Y\), then because \(\displaystyle X\) and \(\displaystyle Y\) are independent normal random variables, the random variable \(\displaystyle X-Y\) will be normal with mean \(\displaystyle \mu_x - \mu_y\) and variance \(\displaystyle \sigma^2_x + \sigma^2_y \).

Is that enough to get you going?
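If it helps to see it as a calculation: once you know \(\displaystyle X-Y\) is normal with mean \(\displaystyle \mu_x - \mu_y\) and variance \(\displaystyle \sigma^2_x + \sigma^2_y\), then \(\displaystyle P(X>Y) = P(X-Y>0) = \Phi\!\left(\frac{\mu_x - \mu_y}{\sqrt{\sigma^2_x + \sigma^2_y}}\right)\). Here's a quick Python sketch (the function name and the example numbers are just mine for illustration):

```python
from math import erf, sqrt

def p_x_greater_y(mu_x, sigma_x, mu_y, sigma_y):
    """P(X > Y) for independent normal X and Y.

    X - Y is normal with mean mu_x - mu_y and variance
    sigma_x**2 + sigma_y**2, so P(X - Y > 0) = Phi(z) with
    z = (mu_x - mu_y) / sqrt(sigma_x**2 + sigma_y**2).
    """
    z = (mu_x - mu_y) / sqrt(sigma_x**2 + sigma_y**2)
    # Standard normal CDF written in terms of the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

# Example: X ~ N(5, 2^2), Y ~ N(3, 1^2)
print(p_x_greater_y(5, 2, 3, 1))  # ~ 0.814
```

Sanity check: if X and Y have the same mean, the answer is exactly 1/2 by symmetry, whatever the variances are.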

Thanks for the hint! I must have stat-phobia =-=
The variance \(\displaystyle \sigma^2_x + \sigma^2_y \) should be a minus, though, right? Or is there a rule for that? I know \(\displaystyle X+Y\) will be normal with mean \(\displaystyle \mu_x + \mu_y\) and variance \(\displaystyle \sigma^2_x + \sigma^2_y \).
 
Variances always add. The variance of \(\displaystyle -Y\) is the same as the variance of \(\displaystyle Y\) (that is, there's just as much variation even if the values change signs), so the variance of \(\displaystyle X-Y\) is the same as the variance of \(\displaystyle X+Y\).
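You can convince yourself of this with a quick simulation (the particular means and standard deviations below are just made-up examples):

```python
import random

random.seed(0)
n = 200_000

# Independent draws: X ~ N(5, 2^2), Y ~ N(3, 1^2).
diffs = [random.gauss(5, 2) - random.gauss(3, 1) for _ in range(n)]

mean = sum(diffs) / n
var = sum((d - mean) ** 2 for d in diffs) / n

# The sample mean of X - Y should be near 5 - 3 = 2, and the sample
# variance near 2**2 + 1**2 = 5 (the variances add, even for a difference).
print(mean, var)
```

If the minus sign carried over to the variances, you'd expect the sample variance to come out near 3; it comes out near 5 instead.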
 
Thanks! My brain literally stops functioning when I work on statistics problems T_T
 