Expectation value of infinite sum

Guanno


The question:

Consider a stochastic process in which $N(0,1)$ Gaussian white noise $Z_n$ is filtered to obtain $X_n$ by an IIR filter: $x_n - \phi x_{n-1} = \sigma z_n$.
a) Expand this recurrence relation into an infinite sum $x_n = \sum_{k=0}^{\infty} a_k z_{n-k}$.
Use this infinite sum to compute $E[X_n]$. Also compute the variance of $X_n$.


Where I'm stuck:

The requested infinite sum is $x_n = \sigma \sum_{k=0}^{\infty} \phi^k z_{n-k}$, but I'm having trouble grasping how to determine the expected value $E[X_n]$. I understand that the expected value is the average value of the series, but $E[X_n] = X_n / n$ seems too simplistic. What I've found on the internet doesn't really clarify things for me, and the notes and lectures for the class leave a lot to be desired.
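
For completeness, I got that sum by unrolling the recurrence (assuming $|\phi| < 1$, so the leftover $\phi^m x_{n-m}$ term vanishes as $m \to \infty$):

$$x_n = \phi x_{n-1} + \sigma z_n = \phi^2 x_{n-2} + \sigma\phi z_{n-1} + \sigma z_n = \cdots = \phi^m x_{n-m} + \sigma\sum_{k=0}^{m-1}\phi^k z_{n-k} \;\longrightarrow\; \sigma\sum_{k=0}^{\infty}\phi^k z_{n-k}.$$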

Can anyone explain how to get $E[X_n]$?


Thanks.
 

Each $z_n$ is a zero-mean Gaussian RV. Two important properties of Gaussian RVs come into play:

a) if $Z$ is $N(0,1)$, then $aZ$ is $N(0, a^2)$ ($a^2$ is the variance);
b) if $Z_1, \dots, Z_n$ are independent with $Z_i \sim N(m_i, v_i)$, then $\sum_{i=1}^{n} Z_i \sim N\!\left(\sum_{i=1}^{n} m_i,\; \sum_{i=1}^{n} v_i\right)$, i.e. you sum the means and sum the variances to obtain the mean and variance of the sum.

Given this, we see that $x_n$ is a sum of scaled zero-mean Gaussian RVs, and thus its mean is also zero, since every addend has mean zero.
The variance is more interesting. You have a sum of $N(0,1)$ RVs scaled by $\sigma\phi^k$, so their variances are scaled by $\sigma^2\phi^{2k}$.
The resulting sum has variance equal to the sum of the individual variances. If $|\phi| < 1$, this sum converges to $\sigma^2/(1-\phi^2)$, since it is a geometric series.
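
Spelled out, applying (a) and (b) to the sum above (the white-noise assumption is what makes the $z_{n-k}$ independent, so the variances add):

$$E[x_n] = \sigma\sum_{k=0}^{\infty}\phi^k\,E[z_{n-k}] = 0, \qquad \operatorname{Var}(x_n) = \sum_{k=0}^{\infty}\operatorname{Var}\!\big(\sigma\phi^k z_{n-k}\big) = \sigma^2\sum_{k=0}^{\infty}\phi^{2k} = \frac{\sigma^2}{1-\phi^2} \quad (|\phi| < 1).$$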
 

I see, thank you. I suspected that $E[X_n] = 0$, but was second-guessing myself because I was misunderstanding the variance.

Then $\operatorname{var}(Z_n) = E[Z^2] - E[Z]^2$, with the second term equal to zero.

But then is $E[Z^2] \neq 0$ because $Z^2$ is no longer an $N(0,1)$ Gaussian?

Edit: Disregard that second question; it was fueled by a lack of sleep. (Obviously, squaring a random variable that takes both negative and positive values gives a non-negative random variable, and thus a non-zero mean.)
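
For anyone wanting to check their answer, here is a minimal simulation sketch (assuming numpy; $\phi = 0.8$ and $\sigma = 2$ are arbitrary illustrative values, not from the assignment):

```python
import numpy as np

# Sanity check of E[x_n] = 0, Var(x_n) = sigma^2 / (1 - phi^2), and E[z^2] = 1.
# phi and sigma are arbitrary illustrative values (any |phi| < 1 works).
rng = np.random.default_rng(seed=0)
phi, sigma = 0.8, 2.0
n_samples = 1_000_000

# N(0,1) white noise driving the IIR filter x[n] = phi * x[n-1] + sigma * z[n].
z = rng.standard_normal(n_samples)
x = np.zeros(n_samples)
for n in range(1, n_samples):
    x[n] = phi * x[n - 1] + sigma * z[n]

burn_in = 1_000  # drop the start-up transient so x is close to stationary
print("sample mean of x :", x[burn_in:].mean())      # ~ 0
print("sample var of x  :", x[burn_in:].var())       # ~ sigma^2 / (1 - phi^2)
print("theoretical var  :", sigma**2 / (1 - phi**2))
print("sample E[z^2]    :", np.mean(z**2))           # ~ 1
```

The sample variance should land near $\sigma^2/(1-\phi^2) \approx 11.1$ for these values, and the sample mean of $z^2$ near 1, consistent with the answers above.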
 