Markov Chain: Given the 1st random probability of 0.3586, am I moving to S2 or S3?

odumath

Hello Experts,

I'm trying to solve a problem using a Markov chain. The URL below provides a snapshot of the problem.

http://img237.imageshack.us/my.php?image=02ct7.gif

Here's my question:
- I'm in state S1
- I have 2 options.
a) Go to state S2 (probability of 0.20) or
b) Go to state S3 (probability of 0.80)

- Given the 1st random probability of 0.3586, am I moving to S2 or S3?
- Whichever state I'm moving to, what was the basis for this decision? (I don't see the connection, since no "<", "<=", or ">" symbols are used.)

odumath
 
I have looked at what you have posted. Sorry to say that I have no clue as to what that notation means. I say that having done some work with Markov chains, but I have never encountered that notation. It is probably unique to your field or to the particular textbook that you are using.
 
pka,

thanks anyway... if you had to guess, would you say:

a) 0.3586 is greater than 0.2; hence, the next state should be S3, or
b) although 0.3586 is greater than 0.2, it is NOT greater than 0.8; hence, the next state should be S2
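For what it's worth, here is a minimal Python sketch of the convention usually used when simulating a Markov chain: the random number is treated as a uniform draw on [0, 1) and compared against cumulative probability thresholds. The function name next_state and the "<" comparison are assumptions on my part, since the textbook's notation is unknown; under this convention, interpretation (a) is the one that matches.

```python
def next_state(u, transitions):
    """Pick the next state by comparing u against cumulative probabilities.

    u: a uniform random draw from [0, 1)
    transitions: list of (state, probability) pairs that sum to 1
    """
    cumulative = 0.0
    for state, prob in transitions:
        cumulative += prob
        if u < cumulative:  # the "<" the original notation leaves implicit
            return state
    return transitions[-1][0]  # guard against floating-point round-off

# From S1: S2 with probability 0.20, S3 with probability 0.80.
print(next_state(0.3586, [("S2", 0.20), ("S3", 0.80)]))  # -> S3
```

Since 0.3586 falls in the interval [0.20, 1.00), the chain moves to S3; in a longer simulation, each step would use a fresh uniform draw (e.g. random.random()).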
 
pka,

you actually helped me solve this problem. Given your answer, I realized that it was all a matter of identifying and listing the missing/ambiguous assumptions.

So, once I wrote out these assumptions, solving it was really easy.

Thanks,
odumath
 