Mr. Adams and Ms. Smith are betting on repeated flips of a coin. At the start of the game Mr. Adams has a dollars and Ms. Smith has b dollars; at each flip the loser pays the winner one dollar, and the game continues until either player is "ruined". Making use of the fact that in an equitable game each player's mathematical expectation is zero, find the probability that Mr. Adams will win Ms. Smith's b dollars before he loses his a dollars.


This is essentially the classic Gambler's ruin problem with two players.

We assume that Mr. Adams wins or loses $1 on each flip, independently of the past, with probabilities p and q respectively, where p + q = 1. His objective is to reach a + b dollars before being ruined.
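This setup can be checked numerically. The sketch below is a simple Monte Carlo simulation of the game (the function name and parameters are illustrative, not from the original solution); for a fair coin the estimate should approach a/(a+b).

```python
import random

def ruin_probability(a, b, p=0.5, trials=100_000, seed=0):
    """Estimate the probability that Mr. Adams reaches a+b dollars
    (i.e. wins Ms. Smith's b dollars) before hitting 0.

    a, b: starting fortunes of Mr. Adams and Ms. Smith
    p:    probability Mr. Adams wins a single flip
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        fortune = a
        # Play until one of the absorbing states 0 or a+b is reached.
        while 0 < fortune < a + b:
            fortune += 1 if rng.random() < p else -1
        if fortune == a + b:
            wins += 1
    return wins / trials

# Fair game with a = 3, b = 2: the estimate should be close to 3/5.
print(ruin_probability(3, 2))
```

Running this with p = 0.5 gives values close to a/(a+b), consistent with the zero-expectation argument the problem asks for.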

Let Xn denote Mr. Adams's fortune after the nth flip.

Then (Xn) is a Markov chain (MC) on the state space States = {0, 1, ..., a+b}.

The transition probabilities reflect winning or losing $1 per flip: for every interior state i (0 < i < a+b), Pi,i+1 = p and Pi,i-1 = q = 1 - p, while the states 0 and a+b are absorbing, since the game ends when either player is ruined.
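The problem asks us to use the fact that in an equitable game each player's expectation is zero; that hinted argument can be completed as follows for the fair case p = q = 1/2. Let P be the probability that Mr. Adams wins Ms. Smith's b dollars before losing his a dollars. His net gain is +b with probability P and -a with probability 1 - P, and the zero-expectation condition gives:

```latex
\begin{aligned}
E[\text{gain}] &= P\cdot b - (1 - P)\cdot a = 0 \\
\Rightarrow\quad P(a + b) &= a \\
\Rightarrow\quad P &= \frac{a}{a+b}.
\end{aligned}
```

So in the fair game Mr. Adams wins with probability a/(a+b), proportional to his share of the total capital.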
