jimmyjump75
New member
- Joined
- Jul 11, 2013
- Messages
- 1
Hi,
I am trying to accurately calculate an average when there are many zero values. I have 10 computers. From these 10 computers I am collecting performance statistics for metrics such as operations per second, bytes per second, etc. I am collecting samples 3 times a minute, or every 20 seconds, for 3 days. So for each computer I have 180 samples per hour and 4320 samples per day. Every sample can range from 0 to 3000. I want to obtain the average per hour for each metric I am collecting statistics for.
The problem is that many samples have zero values, and I am not sure how to properly calculate the average per hour. Normally, for a specific computer for the hour of 11 AM to 12 PM, I would add up the 180 samples and divide the sum by 180. However, the zero values seem to skew the result to be really low. Is this the proper way to calculate an average when you have many zero values?
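A minimal sketch of the two candidate averages, assuming one hour of made-up sample data (the values below are illustrative, not your real measurements): the plain mean over all 180 samples answers "what was the average load over the whole hour?" (zeros legitimately belong in it), while the mean over nonzero samples only answers "how busy was it during the moments it was active?"

```python
# Hypothetical hour of data: 180 samples taken every 20 seconds.
# A few nonzero readings among many zeros (idle periods).
samples = [0, 0, 1200, 0, 3000, 0, 0, 450] + [0] * 172

# Plain arithmetic mean over all samples.
# Correct if a zero means "no activity during that interval".
plain_mean = sum(samples) / len(samples)

# Mean over nonzero samples only.
# Answers a different question: average intensity while active.
nonzero = [s for s in samples if s != 0]
busy_mean = sum(nonzero) / len(nonzero) if nonzero else 0.0

print(plain_mean)  # low, because idle intervals count
print(busy_mean)   # much higher, because idle intervals are excluded
```

Which one is "right" depends on the question: if the zeros are real measurements (the computer genuinely did zero operations in that 20-second window), the plain mean is not skewed at all, it is the true hourly average. Excluding zeros is only appropriate if zero actually means "no data collected" rather than "no activity".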
Thanks
Jimmy