Markov Chains: Using P = [[ 0.3 0.7 ][ 0.6 0.4 ]], find prob

Angela123
Junior Member · Joined Oct 9, 2008 · Messages: 54
P = [ .3  .7
      .6  .4 ]

I don't understand how to answer these. You don't have to give me the answers but I really want to understand how to solve them.

(1) If, on the first observation the system is in state 1, what is the probability that it is in state 1 on the second observation?

(2) If, on the first observation the system is in state 1, what is the probability that it is in state 1 on the third observation?

(3) If, on the first observation, the system is in state 2, what state is the system most likely to occupy on the third observation? (If there is more than one such state, which is the first one.)

(4) If, on the first observation, the system is in state 2, what is the probability that it alternates between states 1 and 2 for the first four observations (i.e., it occupies state 2, then state 1, then state 2, and finally state 1 again)?
 
Re: Markov Chains

Angela123 said:
(1) If, on the first observation the system is in state 1, what is the
probability that it is in state 1 on the second observation?
I have taught Markov Chains. But you are using a vocabulary that I do not understand. Could you tell me what the above statement means?
What does state 1 mean?

Let me give you an example. Suppose we have a rose plant, and each generation of plants works this way: a white rose gives a white offspring with probability .3 and a red offspring with probability .7, while a red rose gives a white offspring with probability .6 and a red offspring with probability .4.
Thus we have a transition matrix of
\(\displaystyle \begin{array}{ccc} {} & W & R \\ W & {.3} & {.7} \\ R & {.6} & {.4} \\ \end{array}\)
In this case, what is state 1?
 
Re: Markov Chains

I wish I could explain the vocabulary, but that's the way the problem is written. That's why I need help; I have no idea what it's asking. In your example, state 1 would be .3, right?
 
Re: Markov Chains

I have seen this terminology before.

Here is an example using a 3×3 transition matrix:



\(\displaystyle \;\ \;\ \;\ \;\ \text{preceding state}\)

\(\displaystyle \begin{bmatrix}p_{11}&p_{12}&p_{13}\\p_{21}&p_{22}&p_{23}\\p_{31}&p_{32}&p_{33}\end{bmatrix} \;\ \;\ \text{new state}\)


In this matrix \(\displaystyle p_{32}\) is the probability that the system will change from state 2 to state 3, \(\displaystyle p_{11}\) is the probability that the system

will still be in state 1 if it was previously in state 1, and so on.

Hope this helps a little.
 
Hello, Angela123!

\(\displaystyle P\:=\:\begin{bmatrix}0.3 &0.7 \\ 0.6 & 0.4\end{bmatrix}\)

We are expected to understand that this transition matrix is a "chart".

. . . . . . . . . . . . .\(\displaystyle \text{To}\)
. . \(\displaystyle \begin{array}{c} \\ \text{From}\end{array}\;\begin{array}{c||c|c|} & a_1 & a_2 \\ \hline\hline a_1 & 0.3 & 0.7 \\ \hline a_2 & 0.6 & 0.4 \\ \hline \end{array}\)

It gives us the probabilities of going from one state to another.

\(\displaystyle \text{The four entries in the chart mean: }\;\begin{array}{ccccccc} P(a_1\to a_1) \:=\:0.3 & & P(a_1\to a_2) \:=\:0.7 \\ P(a_2\to a_1)\:=\:0.6 & & P(a_2\to a_2) \:=\:0.4 \end{array}\)
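Not part of the original thread, but here is a quick Python sketch of the same chart, with a sanity check that the matrix really is a transition matrix:

```python
# Transition matrix from the thread: row = current state, column = next state.
P = [[0.3, 0.7],   # from a1: P(a1 -> a1) = 0.3, P(a1 -> a2) = 0.7
     [0.6, 0.4]]   # from a2: P(a2 -> a1) = 0.6, P(a2 -> a2) = 0.4

# Sanity check: each row must sum to 1, because from any state the
# system must land in *some* state on the next observation.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-9
```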




(1) If, on the first observation the system is in \(\displaystyle a_1\),
what is the probability that it is in \(\displaystyle a_1\) on the second observation?

\(\displaystyle \text{We see above that: }\:P(a_1\to a_1) \:=\:0.3\)




(2) If, on the first observation the system is in \(\displaystyle a_1\),
what is the probability that it is in \(\displaystyle a_1\) on the third observation?

\(\displaystyle \text{Consider: }P^2 \;=\;\begin{bmatrix}0.3&0.7 \\ 0.6&0.4\end{bmatrix} \begin{bmatrix}0.3&0.7\\ 0.6&0.4\end{bmatrix} \;=\;\begin{bmatrix}0.51&0.49\\0.42&0.58 \end{bmatrix}\)

This displays the probabilities of going from one state to another in two steps.

\(\displaystyle \text{Therefore: }\:P(a_1\to a_1\text{, 2 steps}) \:=\:0.51\)
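As a numeric check (a sketch, not from the original thread), squaring \(P\) in plain Python reproduces the two-step matrix:

```python
# Question (2): the two-step probabilities are the entries of P^2.
# A small plain-Python matrix multiply keeps the sketch self-contained.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[0.3, 0.7],
     [0.6, 0.4]]

P2 = matmul(P, P)
# Entry [0][0] is P(a1 -> a1 in 2 steps) = 0.3*0.3 + 0.7*0.6 = 0.51
print(round(P2[0][0], 2))  # 0.51
```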



(3) If, on the first observation, the system is in \(\displaystyle a_2\),
what state is the system most likely to occupy on the third observation?
(If there is more than one such state, which is the first one?)

\(\displaystyle \text{Consider: }\;P^3 \;=\;\begin{bmatrix}0.447 & 0.553\\0.474&0.526\end{bmatrix}\)

This displays the probabilities of going from one state to another in three steps.

\(\displaystyle \text{The second row gives us: }\:\begin{array}{ccc}P(a_2\to a_1\text{, 3 steps}) \:=\:0.474 \\ P(a_2\to a_2\text{, 3 steps}) \:=\:0.526 \end{array}\)

Therefore, it is more likely to be in \(\displaystyle a_2\).
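Again as a check (Python sketch, not part of the original posts), cubing \(P\) and reading off the second row confirms the answer:

```python
# Question (3): three-step probabilities are the entries of P^3.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[0.3, 0.7],
     [0.6, 0.4]]

P3 = matmul(matmul(P, P), P)

# Row 2 of P^3 holds the three-step probabilities starting from a2:
# P(a2 -> a1) ~ 0.474 and P(a2 -> a2) ~ 0.526, so a2 is most likely.
start = 1                      # index 0 = a1, index 1 = a2
most_likely = max(range(2), key=lambda j: P3[start][j])
print(most_likely + 1)  # prints 2: state a2
```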



(4) If, on the first observation, the system is in \(\displaystyle a_2\),
what is the probability that it alternates between \(\displaystyle a_1\) and \(\displaystyle a_2\) for the first four observations?
(i.e., it occupies \(\displaystyle a_2\), then \(\displaystyle a_1\), then \(\displaystyle a_2\), and finally \(\displaystyle a_1\) again)

This can be answered using the one-step matrix, \(\displaystyle P.\)

. . \(\displaystyle \begin{array}{ccccccccc}P(a_2\to a_1) & \wedge & P(a_1\to a_2) & \wedge & P(a_2\to a_1) \\ 0.6 & \times & 0.7 & \times & 0.6 & = & \boxed{0.252} \end{array}\)
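The chain of one-step factors above can also be multiplied out in a short Python sketch (not from the original thread):

```python
# Question (4) is a path probability: multiply the one-step probabilities
# along the specific path a2 -> a1 -> a2 -> a1.
P = [[0.3, 0.7],
     [0.6, 0.4]]

path = [1, 0, 1, 0]            # state indices: a2, a1, a2, a1
path_prob = 1.0
for s, t in zip(path, path[1:]):
    path_prob *= P[s][t]       # one step of the path

print(round(path_prob, 3))  # 0.252
```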

 