Chebyshev's theorem: Suppose that IQ scores have mean 100, standard deviation 15....

Jack678

I'm currently revising for a quants exam and I'm stuck on some stats, specifically Chebyshev's theorem.

So the question is: Suppose that IQ scores have a mean of 100 and a standard deviation of 15. Use Chebyshev's theorem to calculate the percentage of people who have an IQ score between 70 and 130.

So far I have:

(1 - 1/k^2)
= 1 - 1/15^2
= 1 - 1/225
= 1 - 0.00444
≈ 0.995

So 99.5% of IQ scores fall between 70 and 130. Can anyone tell me if this is correct? And if not, where am I going wrong?

Thanks
 

No. In Chebyshev's theorem, k is the number of standard deviations from the mean, not the standard deviation itself. Here k = (130 - 100)/15 = 2, so the bound is 1 - 1/2^2 = 1 - 1/4 = 0.75; at least 75% of IQ scores fall between 70 and 130. Look at this similar example at this link:

https://www.algebra.com/algebra/hom...ility-and-statistics.faq.question.306452.html
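For a quick sanity check, here is a minimal Python sketch (the helper name chebyshev_bound is just for illustration, not from any library) that computes k from the interval endpoints and applies Chebyshev's bound:

```python
def chebyshev_bound(mean, sd, low, high):
    # k = number of standard deviations from the mean to the nearer endpoint.
    # Chebyshev guarantees at least 1 - 1/k^2 of any distribution's values
    # lie within k standard deviations of the mean (for k > 1).
    k = min(mean - low, high - mean) / sd
    return 1 - 1 / k**2

# IQ example: mean 100, sd 15, interval 70 to 130 gives k = 2
print(chebyshev_bound(100, 15, 70, 130))  # 0.75, i.e. at least 75%
```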
 