
Thread: Don't understand Stats Prob: Let (X_n)_{n≥1} be a Markov chain

  1. #1

    Don't understand Stats Prob: Let (X_n)_{n≥1} be a Markov chain

    Let (X_n)_{n≥1} be a Markov chain with state space {1, ..., k} for some k ≥ 1. Show that
    if i and j communicate, then the probability that the chain started in state i reaches
    state j in k steps or fewer is greater than 0.
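
    Not part of the original post, but here is a quick numerical sanity check of the claim, sketched in Python. The transition matrix P below is an assumed example (a deterministic 4-state cycle, so every pair of states communicates), and reaches_within is a hypothetical helper that tests whether P^m(i, j) > 0 for some 1 ≤ m ≤ k.

        import numpy as np

        # Assumed example (not from the thread): a 4-state deterministic cycle
        # 1 -> 2 -> 3 -> 4 -> 1, written with 0-based indices.  Every pair of
        # states communicates, so the claim should hold for any i, j.
        P = np.array([
            [0.0, 1.0, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0],
            [1.0, 0.0, 0.0, 0.0],
        ])
        k = P.shape[0]  # number of states

        def reaches_within(P, i, j, k):
            """True iff P(X_m = j | X_0 = i) > 0 for some 1 <= m <= k."""
            Pm = np.eye(P.shape[0])
            for _ in range(k):
                Pm = Pm @ P          # Pm now holds P^m
                if Pm[i, j] > 0:
                    return True
            return False

        # Every communicating pair (here: all pairs) is reached in at most k steps.
        print(all(reaches_within(P, i, j, k) for i in range(k) for j in range(k)))  # True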

  2. #2
    Elite Member
    Join Date
    Jun 2007
    Posts
    12,871
    Quote Originally Posted by Joystar77 View Post
    Let (X_n)_{n≥1} be a Markov chain with state space {1, ..., k} for some k ≥ 1. Show that
    if i and j communicate, then the probability that the chain started in state i reaches
    state j in k steps or fewer is greater than 0.
    Please share your work with us.

    If you are stuck at the beginning, tell us and we'll start with the definitions, e.g., define a Markov chain.

    You need to read the rules of this forum. Please read the post titled "Read before Posting" at the following URL:

    http://www.freemathhelp.com/forum/th...217#post322217
    “... mathematics is only the art of saying the same thing in different words” - B. Russell
