Convergence in probability of these random variables.

mpatch

Hey guys, here's a problem that I am working on (see the attachment below):

[attachment: j4.GIF (problem statement)]

The way I see it, each X_n takes one of the values 0, +n, or -n.
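Since the attachment didn't come through, here is the distribution I am assuming for each X_n. It is an assumption, not a quote from the problem, though it is consistent with the 1 - (1/n) expression I mention at the end:

\[
P(X_n = n) = P(X_n = -n) = \frac{1}{2n}, \qquad P(X_n = 0) = 1 - \frac{1}{n}.
\]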

Now the definition of convergence in probability says that:


for every (epsilon) > 0, lim_{n -> infinity} P(|X_n - Y| >= (epsilon)) = 0, or equivalently lim_{n -> infinity} P(|X_n - Y| < (epsilon)) = 1.


Since Y is degenerate and always equal to 0, the problem should become very simple. Yet it doesn't (for me, at least):


Given (epsilon) > 0, when X_n = 0 we're fine: |0 - 0| = 0 can never be greater than or equal to a positive number, so that outcome never lands in the event |X_n - Y| >= (epsilon), which contributes probability 0.


Now when X_n = +/-n, we have |+/-n| = n. As n approaches infinity, it seems to me that this is ALWAYS greater than (epsilon). Well, not quite: if I let (epsilon) = n + 1 then the inequality holds (n is never greater than n + 1), but I don't think letting (epsilon) depend on n is what the definition means, or what I'm supposed to do.


Furthermore, n is NEVER less than a given (epsilon) once n is large enough (at infinity especially), and yet the definition says the random variables converge when this event ends up happening with probability 1. That is, P(n < (epsilon)) -> 1 would mean convergence in probability, but n will eventually exceed any fixed (epsilon)!
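(For reference, here is the computation I suspect I'm supposed to be doing, under the distribution I assumed above; I may be wrong about this. For a fixed (epsilon) > 0 and any n > (epsilon):

\[
P(|X_n - 0| \ge \epsilon) = P(X_n = n) + P(X_n = -n) = \frac{1}{2n} + \frac{1}{2n} = \frac{1}{n} \longrightarrow 0 \quad \text{as } n \to \infty.
\]

The point, I think, is that (epsilon) stays fixed while n grows: the event |X_n| >= (epsilon) isn't "n >= (epsilon)", it's "X_n landed on +/-n rather than 0", and that happens with probability only 1/n.)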


Essentially, I don't think I get this concept. My textbook explains the definition only in passing, on the way to the central limit theorem. We discussed it more thoroughly in class, but it doesn't seem to be clicking. Wikipedia didn't help either.


I'm not asking for an explanation of convergence in probability, however; I just felt like including where my difficulty is coming from. I feel that with the problem worked out I might finally understand the concept.


I asked for help on reddit, too, and someone mentioned that I can find a simple expression for P(|X_n - Y| < (epsilon)) and take the limit of this expression as n approaches infinity. This isn't the method I learned (I think), but I went ahead and tried it: this probability might be 1 - (1/n), and clearly as n approaches infinity it becomes 1. I'm not convinced I'm correct, though; the answer just seems to fit rather than follow from anything.
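To sanity-check that limit, here is a small Monte Carlo sketch in Python. It relies on the distribution I assumed above (the attachment is lost), and sample_X / estimate_prob are just names I made up:

[CODE]
import random

def sample_X(n):
    # Assumed distribution (not from the original attachment):
    # P(X_n = +n) = P(X_n = -n) = 1/(2n), P(X_n = 0) = 1 - 1/n.
    u = random.random()
    if u < 1.0 / (2 * n):
        return n
    if u < 1.0 / n:
        return -n
    return 0

def estimate_prob(n, eps=0.5, trials=100_000):
    # Monte Carlo estimate of P(|X_n - 0| < eps) for a FIXED eps.
    hits = sum(1 for _ in range(trials) if abs(sample_X(n)) < eps)
    return hits / trials

for n in (2, 10, 100, 1000):
    # Each estimate should be close to 1 - 1/n, which tends to 1.
    print(n, estimate_prob(n))
[/CODE]

If the estimates climb toward 1 as n grows, with (epsilon) held fixed, that is exactly what lim P(|X_n - Y| < (epsilon)) = 1 asserts.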

Thanks for any hint/help you can provide!
 