Variance proof

estrayer7

Let a ≤ X ≤ b with a ≥ μ ≥ 0, where μ is the mean of X. Show that a - μ ≤ SD(X) ≤ b - μ, where SD(X) denotes the standard deviation of X.
 
Can you please tell us where you are stuck and what you have tried? It is hard to help if we do not know where you are stuck and which method you want to use, so please post back with your work. You might also want to read the forum guidelines; if you had, you would likely have received help by now.
 
This isn't true as stated.

Suppose X is a point value. It then has standard deviation 0, while the stated hypothesis allows a - μ > 0, so the claimed lower bound can fail:

[MATH]0 \not\geq a - \mu[/MATH]
Do you mean that X is a random variable with non-zero probability over the entire interval [MATH][a,b][/MATH]?
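Spelling out that degenerate case (a minimal check, writing c for the hypothetical single value X takes):

[MATH]P(X = c) = 1 \;\Rightarrow\; \mu = E[X] = c \;\Rightarrow\; \mathrm{Var}(X) = E[(X - \mu)^2] = (c - c)^2 = 0,[/MATH]
so SD(X) = 0 no matter where c sits in [a, b].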
 
Exactly.
 
This is for my college stats class. I asked my professor for help and was told to start from a ≤ X ≤ b and work up to (a - μ)^2 ≤ Var(X) ≤ (b - μ)^2, then take the square root to get the final answer. I am stuck on the in-between part, getting from the start to the end.
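For the last step of that outline, assuming both squared bounds are nonnegative (which the hypothesis a ≥ μ ≥ 0 supplies), taking square roots preserves the inequalities because the square root is increasing on [0, ∞):

[MATH]0 \le (a-\mu)^2 \le \mathrm{Var}(X) \le (b-\mu)^2 \;\Rightarrow\; a - \mu \le \sqrt{\mathrm{Var}(X)} = SD(X) \le b - \mu,[/MATH]
using that [MATH]\sqrt{(a-\mu)^2} = a - \mu[/MATH] and [MATH]\sqrt{(b-\mu)^2} = b - \mu[/MATH] when both quantities are nonnegative.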
 
If you are stuck in the middle and want someone here to help you, don't you think that we would need to see your work up to the middle?
 
All I have is: start from a ≤ X ≤ b, then (a - μ)^2 ≤ Var(X) ≤ (b - μ)^2, then take square roots: √((a - μ)^2) ≤ √(Var(X)) ≤ √((b - μ)^2), so a - μ ≤ SD(X) ≤ b - μ. But I feel like this is wrong; I really do not understand how to do the proof.
 
This is what I have gotten up to (see the attached image). I am still stuck on getting from one step to the next, i.e., the middle portion.
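A sketch of that middle portion, assuming the stated hypothesis a ≥ μ ≥ 0 is taken at face value (so that X - μ ≥ a - μ ≥ 0 and squaring is order-preserving):

[MATH]0 \le a - \mu \le X - \mu \le b - \mu \;\Rightarrow\; (a-\mu)^2 \le (X-\mu)^2 \le (b-\mu)^2,[/MATH]
[MATH]\text{and taking expectations:}\quad (a-\mu)^2 \le E\!\left[(X-\mu)^2\right] = \mathrm{Var}(X) \le (b-\mu)^2.[/MATH]
Taking square roots, as in the professor's outline, then yields a - μ ≤ SD(X) ≤ b - μ. Note that every step leans on a ≥ μ, which is exactly the assumption questioned earlier in the thread.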
 

Attachments

  • unnamed.jpg