slope of random time-series

Flurry

Say we have many time series (of length N) which all consist of normally distributed values (mean = 0, standard deviation = 1).
Then we calculate the slope of each time series and in turn calculate the standard deviation (S) of all the slopes.

My question is: is there a function S = f(N) which describes how the standard deviation of the slopes decreases with increasing N?
I think S(N) ~ 1/(N*sqrt(N)).

Can anyone give a proof?
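To make the setup concrete, here is a quick Monte Carlo sketch (Python/NumPy, my own illustration rather than anything from the thread) that simulates many series for several values of N, fits a least-squares line to each, and prints the standard deviation S of the fitted slopes:

[code]
import numpy as np

rng = np.random.default_rng(0)
n_series = 10_000  # number of simulated series per value of N

for N in (10, 30, 100, 300):
    t = np.arange(N)
    # each row is one time series of N iid standard normal values
    data = rng.standard_normal((n_series, N))
    # least-squares slope of every series against time (degree-1 fit)
    slopes = np.polyfit(t, data.T, 1)[0]
    S = slopes.std()
    print(f"N={N:4d}  S={S:.5f}  S*N^1.5={S * N**1.5:.3f}")
[/code]

If the 1/(N*sqrt(N)) conjecture is right, the last column should settle near a constant (roughly sqrt(12) ≈ 3.46 for unit-variance noise and equally spaced times).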
 
\(\displaystyle Avg(nx) = n\cdot Avg(x) \)

\(\displaystyle Var(nx) = n^{2}\cdot Var(x) \)

...assuming they are all independent, of course.

Am I missing something?

Yes, I am missing something. In writing Var(nx) I was effectively adding the SAME random variable n times, not n different ones. Let's try that again...

\(\displaystyle Var\!\left(\sum_{i} x_{i}\right) = Var(x_{1} + x_{2} + x_{3} + \dots + x_{n}) = 1^{2}Var(x_{1}) + 1^{2}Var(x_{2}) + \dots + 1^{2}Var(x_{n}) = n\cdot Var(x_{\text{any}})\)

That's better.
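A quick numerical check of that rule (my own sketch, not from the post): the variance of a sum of n independent unit-variance normals should come out close to n.

[code]
import numpy as np

rng = np.random.default_rng(0)
n = 5
# 100,000 trials, each summing n independent standard normals
sums = rng.standard_normal((100_000, n)).sum(axis=1)
print(sums.var())  # expect a value close to n = 5
[/code]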
 
It is not clear what you mean by the slope of a normally distributed variable. I can picture the values of each series, pooled over all time points, following a unit normal distribution, and the values pooled across all the series would also follow a unit normal.

But when you do a regression against time you find a slope - for instance, earlier times may tend to lie below the mean and later times above it. The regression reports, for each time series, the fitted mean and slope together with the uncertainties of those estimates: mu ± sigmaM and b ± sigmaB. [Note that sigmaM is NOT the same as the sigma of the distribution; it is the precision with which mu is determined.]

In that case the mean slope is the average of the fitted slopes b over all the series, and the average variance is the average of the sigmaB^2 values. For the proof, you need to know that when independent data are added (or subtracted), their variances add, and that when data are divided by a constant c, the variance is divided by c^2. Put together, averaging independent estimates divides the variance of the result by the number of estimates averaged.
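To make the b ± sigmaB for a single series concrete, here is a minimal sketch (my own, assuming an ordinary least-squares fit against equally spaced times 0, 1, ..., N-1) that computes the slope and its standard error:

[code]
import numpy as np

rng = np.random.default_rng(0)
N = 100
t = np.arange(N)
y = rng.standard_normal(N)           # one simulated series

# ordinary least-squares slope and its standard error
tc = t - t.mean()                    # centred times
b = (tc * y).sum() / (tc**2).sum()   # fitted slope
resid = y - y.mean() - b * tc        # residuals about the fitted line
sigma2 = (resid**2).sum() / (N - 2)  # residual variance estimate
sigmaB = np.sqrt(sigma2 / (tc**2).sum())
print(f"b = {b:.4f} +/- {sigmaB:.4f}")
[/code]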

The general principle you are reaching for is the standard error of the mean: with great generality, the standard deviation of an estimated average decreases as 1/sqrt(N). For a slope fitted against time there is an additional factor coming from the spread of the time points, which is what turns the 1/sqrt(N) into the 1/(N*sqrt(N)) you conjectured; the explicit formula is given below.
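For completeness, here is the standard least-squares result (a sketch I am adding, assuming equally spaced times t = 1, ..., N and independent noise of standard deviation sigma) that gives exactly that scaling:

\(\displaystyle Var(\hat{b}) = \frac{\sigma^{2}}{\sum_{t=1}^{N}(t-\bar{t})^{2}} = \frac{12\,\sigma^{2}}{N(N^{2}-1)} \)

\(\displaystyle S = \sqrt{Var(\hat{b})} \approx \frac{\sqrt{12}\,\sigma}{N\sqrt{N}} \)

So for sigma = 1 the standard deviation of the slopes falls off as 1/(N*sqrt(N)), in line with the original guess.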
 