Suppose rents for apartments in NYC are normally distributed with a mean monthly rent of $880 and a standard deviation of $150. If 100 apartments in NYC are selected at random, what is the probability that rent will be greater than $900?
I'm coming up with two answers, 0.4469 or 0.0912. Which is correct?
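For what it's worth, the two numbers correspond to two different readings of the question. Here is a minimal sketch that reproduces both, assuming Python with scipy available (scipy is my choice of tooling, not part of the problem):

```python
from scipy.stats import norm

mu, sigma, n = 880.0, 150.0, 100   # population mean, population SD, sample size
x = 900.0                          # rent cutoff

# Reading 1: the rent X of one randomly chosen apartment.
# z = (900 - 880) / 150 ≈ 0.13
p_single = norm.sf(x, loc=mu, scale=sigma)

# Reading 2: the mean rent of a sample of n = 100 apartments.
# The sample mean has standard error sigma / sqrt(n) = 150 / 10 = 15,
# so z = (900 - 880) / 15 ≈ 1.33
se = sigma / n ** 0.5
p_mean = norm.sf(x, loc=mu, scale=se)

print(f"P(X > 900)    = {p_single:.4f}")  # ≈ 0.4470, i.e. the 0.4469 answer up to rounding
print(f"P(Xbar > 900) = {p_mean:.4f}")    # ≈ 0.0912
```

So 0.4469 treats $900 as a cutoff for a single apartment's rent, while 0.0912 uses the sampling distribution of the mean for the 100 sampled apartments.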