Is this a Markov chain?

Win_odd Dhamnekar
An urn always contains 2 balls. Ball colors are red and blue. At each stage a ball is randomly chosen and then replaced by a new ball, which with probability 0.8 is the same color, and with probability 0.2 is the opposite color, as the ball it replaces. Now, if we define [imath]X_n =1 [/imath] if the nth selection is red, and [imath]X_n=0 [/imath] if the nth selection is blue, will [imath]\{X_n, n\geqslant 1\}[/imath] be a Markov chain?

In my opinion, it will be a Markov chain.

But the author says it is not a Markov chain, because information about previous color selections would affect the probabilities about the current makeup of the urn, which would affect the probability that the next selection is red.

I don't understand what the author meant here.

Would any member of Free MHF explain the author's answer?
 
Example 4.10 An urn always contains 2 balls. Ball colors are red and blue. At each stage a ball is randomly chosen and then replaced by a new ball, which with probability 0.8 is the same color, and with probability 0.2 is the opposite color, as the ball it replaces. If initially both balls are red, find the probability that the fifth ball selected is red.
Question:
If in Example 4.10 we had defined [imath]X_n[/imath] to equal 1 if the nth selection were red and to equal 0 if it were blue, would [imath]\{X_n, n \geqslant 1\}[/imath] be a Markov chain?

Note: I give the original example and question here again, for a better understanding of what the author is asking.
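
For what it's worth, here is a small Python sketch (mine, not from the text) of one way to compute the answer to Example 4.10: track the number of red balls in the urn, which is a Markov chain on the states 0, 1, 2, and then average over its distribution at the fifth draw. It gives [imath]P(\text{5th selection is red}) = 0.7048[/imath].

```python
# A sketch (not from the text): solve Example 4.10 by tracking the
# number of red balls in the urn, which IS a Markov chain on {0, 1, 2}.
#
# One draw-and-replace step:
#   from 2 (RR): a red ball is drawn; the new ball is red w.p. 0.8 (-> 2)
#                or blue w.p. 0.2 (-> 1)
#   from 1 (RB): -> 2 w.p. 0.1, -> 1 w.p. 0.8, -> 0 w.p. 0.1
#   from 0 (BB): a blue ball is drawn; -> 0 w.p. 0.8, -> 1 w.p. 0.2
P = {
    2: {2: 0.8, 1: 0.2, 0: 0.0},
    1: {2: 0.1, 1: 0.8, 0: 0.1},
    0: {2: 0.0, 1: 0.2, 0: 0.8},
}

dist = {2: 1.0, 1: 0.0, 0: 0.0}    # initially both balls are red
for _ in range(4):                 # 4 replacements happen before the 5th draw
    dist = {j: sum(dist[i] * P[i][j] for i in P) for j in P}

# With k red balls in the urn, the next selection is red w.p. k/2.
print(sum(dist[k] * k / 2 for k in dist))   # 0.7048
```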
 
For independent events, [imath]\Pr(B|A)=\Pr(B)[/imath].
Can you show that [imath]\Pr(X_2=\text{Red}|X_1=\text{Red})=\Pr(X_2=\text{Red})?[/imath]
 
If initially both balls are red, then [imath]X_1 = \text{red}[/imath] with [imath]P(X_1 = \text{red}) = 1[/imath]. But the chosen red ball is then replaced by a new red ball with probability 0.8 and by a blue ball with probability 0.2. Naturally, that affects [imath]P(X_2 = \text{red})[/imath], and thereby [imath]P(X_3 = \text{red})[/imath] as well.

So, is the author saying that is why it is not a Markov chain?
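
To make that concrete (my own arithmetic, following the same reasoning): after the first replacement the urn is red-red with probability 0.8 and red-blue with probability 0.2, so [imath]P(X_2 = \text{red}) = 0.8(1) + 0.2(0.5) = 0.9[/imath].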
 
Markov chains are processes in which the distribution of future steps depends only on the present state of the process.
The next outcome depends on what is currently in the urn, but the urn's contents are not determined by the most recent selection alone; they are determined by the whole sequence of past draws. Thus, [imath]\{X_n\}[/imath] is not a Markov chain. (The composition of the urn itself, by contrast, does form a Markov chain.)
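
As a quick numerical check (a sketch of my own, not from the text), one can simulate the urn and compare [imath]P(X_4=\text{red} \mid X_3=\text{red}, X_2=\text{red})[/imath] with [imath]P(X_4=\text{red} \mid X_3=\text{red}, X_2=\text{blue})[/imath]. If [imath]\{X_n\}[/imath] were a Markov chain, the two would be equal; in fact they come out to roughly 0.82 and 0.57.

```python
# A Monte Carlo sketch (my own, not from the text).  If {X_n} were a
# Markov chain, conditioning on X_2 as well as X_3 could not change
# the distribution of X_4.  It does.
import random

def draws(n):
    """Simulate n selections from the urn; return the colors drawn."""
    urn = ['R', 'R']                      # initially both balls are red
    out = []
    for _ in range(n):
        i = random.randrange(2)           # choose a ball at random
        out.append(urn[i])
        if random.random() < 0.2:         # opposite color w.p. 0.2
            urn[i] = 'B' if urn[i] == 'R' else 'R'
    return out

# Tally X_4 given (X_2, X_3) = (R, R) and given (X_2, X_3) = (B, R).
tallies = {('R', 'R'): [0, 0], ('B', 'R'): [0, 0]}   # [X_4 = red, total]
for _ in range(200_000):
    x = draws(4)
    key = (x[1], x[2])                    # (X_2, X_3)
    if key in tallies:
        tallies[key][1] += 1
        tallies[key][0] += (x[3] == 'R')

for (x2, x3), (red, total) in tallies.items():
    print(f"P(X4=R | X3={x3}, X2={x2}) ~ {red / total:.3f}")
# Typical output: about 0.821 when X2=R, but about 0.567 when X2=B,
# so selections before the most recent one do carry information.
```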
 