Hello, I am new here and hope this is the right section for this.
I have two players (A and B), who can each pick between two options (A1 and A2 for player A, B1 and B2 for player B). They both select an option at the same time. Depending on which options they choose, a certain outcome occurs. I have tried to make this clearer with the payoff matrix below. In the case of A2 / B2, the players have to repeat the game.
I wanted to find a mixed-strategy Nash equilibrium, giving me the probabilities with which each player should pick their options. My idea was to replace the 'Repeat' cell with the expected value of the game - which runs into the issue that I need the probabilities to calculate the expected value in the first place. Can I use an equation system to solve this, or will I fall short of enough equations to get a solution?
I hope this question isn't trivial; sadly, my knowledge of game theory is not very deep. I'm happily looking forward to any kind of help!
Payoff matrix (entries: B's payoff \ A's payoff):

B \ A | A1     | A2
B1    | -1 \ 1 | 1 \ -1
B2    | 2 \ -2 | Repeat
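For what it's worth, the "replace Repeat with the unknown value v and solve self-consistently" idea can be sketched numerically. This assumes my reading of the matrix is right: entries are "B's payoff \ A's payoff", the game is zero-sum, and the equilibrium is interior enough that the usual indifference conditions apply. Making B indifferent between B1 and B2 gives A's probability p of playing A1 as p = (1+v)/(4+v); making A indifferent gives B's probability q of playing B1 as q = (2+v)/(4+v); the value to A is then v' = q·1 + (1-q)·(-2) = (v-2)/(4+v), and consistency requires v = v'. A simple fixed-point iteration on that map:

```python
# Sketch under the assumptions above (payoffs to A, zero-sum):
#   B1/A1: +1, B1/A2: -1, B2/A1: -2, B2/A2: repeat (worth v again)
# Indifference conditions give:
#   p = (1 + v) / (4 + v)   # prob. A plays A1 (makes B indifferent)
#   q = (2 + v) / (4 + v)   # prob. B plays B1 (makes A indifferent)
# and the implied value to A is v' = (v - 2) / (4 + v).
# A self-consistent v satisfies v = (v - 2) / (4 + v).

def solve_game_value(v0=0.0, tol=1e-12, max_iter=10_000):
    """Iterate v -> (v - 2) / (4 + v) until it stops changing."""
    v = v0
    for _ in range(max_iter):
        v_next = (v - 2) / (4 + v)
        if abs(v_next - v) < tol:
            return v_next
        v = v_next
    return v

v = solve_game_value()
p = (1 + v) / (4 + v)   # prob. A plays A1
q = (2 + v) / (4 + v)   # prob. B plays B1
print(f"v = {v:.6f}, p = {p:.6f}, q = {q:.6f}")
# Converges to v = -1, i.e. p = 0 (A always plays A2) and q = 1/3.
```

Equivalently, v = (v-2)/(4+v) rearranges to the quadratic v² + 3v + 2 = 0 with roots v = -1 and v = -2; only v = -1 yields valid probabilities, so the equation system does close, and the iteration above just finds that root.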