# Thread: Don't understand Stats Prob: Let (X_n)_{n≥1} be a Markov chain

1. ## Don't understand Stats Prob: Let (X_n)_{n≥1} be a Markov chain

Let (Xn)n1 be a Markov chain with state space f1; : : : ; kg for some k  1. Show that
if i and j communicate, then the probability that the chain started in state i reaches
state j in k steps or fewer is greater than 0.
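The key idea is that a shortest positive-probability path from i to j never revisits a state, so with only k states it has at most k − 1 edges. The following sketch (with a hypothetical 4-state transition matrix `P` chosen for illustration; `min_steps` is not from the original post) checks this bound by breadth-first search on the graph of positive-probability transitions:

```python
# Hypothetical transition matrix on states {0, 1, 2, 3}; rows sum to 1.
P = [
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [1.0, 0.0, 0.0, 0.0],
]
k = len(P)

def min_steps(P, i, j):
    """BFS over transitions with positive probability: return the
    fewest steps needed to reach j from i, or None if unreachable."""
    if i == j:
        return 0
    frontier, seen, steps = {i}, {i}, 0
    while frontier:
        steps += 1
        frontier = {b for a in frontier
                    for b in range(len(P)) if P[a][b] > 0} - seen
        if j in frontier:
            return steps
        seen |= frontier
    return None

# Whenever j is reachable from i at all, it is reachable in at most
# k steps, because a shortest path repeats no state.
for i in range(k):
    for j in range(k):
        n = min_steps(P, i, j)
        assert n is None or n <= k
```

Since a path of length n with positive one-step probabilities has probability at least the product of those probabilities, reachability in the graph is exactly reachability with positive probability, which gives the claim.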

2. Originally Posted by Joystar77
Let (X_n)_{n≥1} be a Markov chain with state space {1, …, k} for some k ≥ 1. Show that
if i and j communicate, then the probability that the chain started in state i reaches
state j in k steps or fewer is greater than 0.
Please share your work with us.

If you are stuck at the beginning, tell us and we'll start with the definitions, e.g., the definition of a Markov chain.

You need to read the rules of this forum. Please read the post titled "Read before Posting" at the following URL:

http://www.freemathhelp.com/forum/th...217#post322217
