Don't understand Stats Prob: Let (X_n)_{n≥1} be a Markov chain

Joystar77

New member
Joined
Jul 8, 2013
Messages
39
Let (X_n)_{n≥1} be a Markov chain with state space {1, ..., k} for some k ≥ 1. Show that
if i and j communicate, then the probability that the chain started in state i reaches
state j in k steps or fewer is greater than 0.
 

Please share your work with us.

If you are stuck at the beginning, tell us and we'll start with the definitions, e.g., the definition of a Markov chain.
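
For reference, here is one standard way to write the two definitions you'll need (assuming a time-homogeneous chain with one-step transition probabilities p_{ij}; your textbook's notation may differ):

\[
P\bigl(X_{n+1} = j \mid X_n = i_n,\, X_{n-1} = i_{n-1},\, \dots,\, X_1 = i_1\bigr)
  = P\bigl(X_{n+1} = j \mid X_n = i_n\bigr) = p_{i_n j}
  \quad \text{(Markov property)}
\]

\[
i \leftrightarrow j \quad\Longleftrightarrow\quad
  \exists\, m, n \ge 0 \ \text{with}\ p^{(m)}_{ij} > 0 \ \text{and}\ p^{(n)}_{ji} > 0
  \quad \text{(i and j communicate)}
\]

where p^{(m)}_{ij} denotes the m-step transition probability (some books require m, n ≥ 1 instead).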

You need to read the rules of this forum. Please read the post titled "Read before Posting" at the following URL:

http://www.freemathhelp.com/forum/th...217#post322217
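
Once you've had a go, a quick numerical check can help build intuition for why the statement should be true. Here is a rough Python sketch; the 3-state transition matrix and the helper function prob_reach_within are made up purely for illustration and are not part of the exercise:

import numpy as np

# A made-up 3-state chain (so k = 3); these numbers are purely illustrative.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5],
    [0.5, 0.0, 0.5],
])

def prob_reach_within(P, i, j, steps):
    # Probability that the chain started in state i visits state j within
    # `steps` steps, computed by making j absorbing and taking matrix powers.
    Q = P.copy()
    Q[j, :] = 0.0
    Q[j, j] = 1.0                      # once at j, stay at j
    start = np.zeros(P.shape[0])
    start[i] = 1.0                     # chain starts in state i
    return (start @ np.linalg.matrix_power(Q, steps))[j]

k = P.shape[0]
print(prob_reach_within(P, 0, 2, k))   # 0.5 > 0; states 1 and 3 communicate here

Of course, a numerical check for one particular matrix is not a proof; the exercise asks you to argue that this probability is positive for every chain in which i and j communicate.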
 