Reasons for Z-Score standard deviation ≠ 1???

rheighton

I have a series of 24 raw scores which I have standardized by individually converting them to z-scores. My assignment asks me to calculate the mean and standard deviation of these new scores (I presume to prove that the mean for a series of z-scores is always = 0 and the standard deviation for a series of z-scores is always = 1).

As expected, the mean of the scores (which ranged from -1.79 to 1.33) = 0. However, upon calculating the standard deviation for the series of scores, I am getting 0.84 as the value (which is clearly incorrect). I have run through my math thrice to ensure that I didn't make a simple calculation error, so I cannot understand why my standard deviation would not come out as = 1.

I am using the formula:

SD = √( [ Σz^2 – (Σz)^2/N ] / N )
where (Σz)^2 = 0 and N = 24
Reducing the formula to:
SD = √( [Σz^2] / N )

And in calculating Σz^2 I added the squares of each individual z-score (giving me a value of ~16.74).

If anyone could possibly give me a heads-up as to where I went wrong (or perhaps suggest other reasons why the SD wouldn't = 1), it would be GREATLY appreciated!
 
You are right to suspect such a result.

I still believe there is something wrong in your calculations; the computation is rather tedious. Are you using a spreadsheet, or are we talking paper and pencil?

Provide data and results. Someone will find the error. It's in there.
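
For what it's worth: if the individual z-scores are correct, Σz^2 should come out to N (24 here) when you standardize with the population SD, or N – 1 (23) if you used the sample SD. A value of about 16.74 therefore points to an error in the z-scores themselves rather than in your final formula; indeed √(16.74 / 24) ≈ 0.84, which matches the value you got, so that last step was applied consistently. Below is a minimal Python sketch of the kind of check a spreadsheet would do. The raw scores are made-up placeholders, not your data.

Code:
import math

# Placeholder raw scores (N = 24); substitute the actual data.
raw_scores = [12, 15, 9, 22, 18, 11, 14, 20, 17, 13, 16, 19,
              10, 21, 8, 23, 15, 12, 18, 14, 16, 20, 11, 17]

n = len(raw_scores)
mean = sum(raw_scores) / n
# Population standard deviation (divide by N); use n - 1 for the sample SD.
sd = math.sqrt(sum((x - mean) ** 2 for x in raw_scores) / n)

# Standardize each score.
z = [(x - mean) / sd for x in raw_scores]

z_mean = sum(z) / n
# The (Σz)^2 term drops out because the mean of the z-scores is 0.
z_sd = math.sqrt(sum(zi ** 2 for zi in z) / n)

print(f"sum of z^2 = {sum(zi ** 2 for zi in z):.4f}")  # should be ~ 24
print(f"mean of z  = {z_mean:.4f}")                    # should be ~ 0
print(f"SD of z    = {z_sd:.4f}")                      # should be ~ 1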
 
Thanks very much for your reply. I actually figured out that I had made a small oversight in an earlier calculation which skewed the rest of my results... easy enough to do when calculating with paper and pencil!
 