Say we have many time series, each of length N, whose values are i.i.d. draws from a standard normal distribution (mean = 0, standard deviation = 1).
Then we calculate the slope of each time series (by ordinary least squares against the time index) and, in turn, the standard deviation S of all those slopes.
My question is: is there a function S = f(N) that describes how the standard deviation of the slopes decreases with increasing N?
I think S(N) ~ 1/(N*sqrt(N)), i.e. S ~ N^(-3/2).
Can anyone give a proof?
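For what it's worth, here is a quick Monte Carlo sketch (Python/NumPy) to check the conjectured scaling empirically. It assumes "slope" means the OLS slope fitted against the time index 0..N-1, and prints S alongside S*N^(3/2), which should stay roughly constant if S ~ N^(-3/2) holds:

```python
import numpy as np

rng = np.random.default_rng(0)

def slope_std(N, n_series=20000):
    """Std. dev. of OLS slopes fitted to n_series i.i.d. N(0,1) series of length N."""
    t = np.arange(N)                        # time index 0..N-1
    y = rng.standard_normal((n_series, N))  # each row is one time series
    tc = t - t.mean()
    # OLS slope = sum((t - tbar) * y) / sum((t - tbar)^2), vectorized over rows
    slopes = y @ tc / (tc @ tc)
    return slopes.std()

for N in [10, 20, 40, 80, 160]:
    S = slope_std(N)
    print(f"N={N:4d}  S={S:.5f}  S*N^1.5={S * N**1.5:.3f}")
```

In my runs the product S*N^(3/2) settles near a constant as N grows, which is consistent with the conjecture, but I'd still like an analytic proof.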