Date | May Example question | Marks available | 3 | Reference code | EXM.3.AHL.TZ0.4 |
Level | Additional Higher Level | Paper | Paper 3 | Time zone | Time zone 0 |
Command term | Write down | Question number | 4 | Adapted from | N/A |
Question
This question will connect Markov chains and directed graphs.
Abi is playing a game that involves a fair coin, with heads on one side and tails on the other, together with two tokens: one with a fish's head on it and one with a fish's tail on it. She starts off with no tokens and wishes to win them both. On each turn she tosses the coin: if she gets a head she can claim the fish's head token, provided that she does not have it already, and if she gets a tail she can claim the fish's tail token, provided that she does not have it already. There are 4 states to describe the tokens in her possession; A: no tokens, B: only a fish's head token, C: only a fish's tail token, D: both tokens. So, for example, if she is in state B and tosses a tail she moves to state D, whereas if she tosses a head she remains in state B.
After \(n\) throws the probability vector, for the 4 states, is given by \(s_n = \begin{pmatrix} a_n \\ b_n \\ c_n \\ d_n \end{pmatrix}\), where the entries represent the probability of being in that particular state, e.g. \(b_n\) is the probability of being in state B after \(n\) throws. Initially \(s_0 = \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix}\).
Draw a transition state diagram for this Markov chain problem.
Explain why, for any transition state diagram, the probabilities on the directed edges out of a vertex (state) must add up to 1.
Write down the transition matrix M, for this Markov chain problem.
Find the steady state probability vector for this Markov chain problem.
Explain which part of the transition state diagram confirms this.
Explain why having a steady state probability vector means that the matrix \(M\) must have an eigenvalue of \(\lambda = 1\).
Find \(M^2\).
Hence, deduce the form of \(M^n\).
Explain how your answer to part (f) fits with your answer to part (c).
Find the minimum number of tosses of the coin that Abi will have to make to be at least 95% certain of having finished the game by reaching state D.
Markscheme
[Transition state diagram: vertices A, B, C, D with directed edges A→B (0.5), A→C (0.5), B→B (0.5), B→D (0.5), C→C (0.5), C→D (0.5) and a loop D→D (1).] M1A2
[3 marks]
On each turn you must leave the state along exactly one of the edges directed out of the vertex, so the probabilities on those edges must sum to 1. R1
[1 mark]
\(M = \begin{pmatrix} 0 & 0 & 0 & 0 \\ 0.5 & 0.5 & 0 & 0 \\ 0.5 & 0 & 0.5 & 0 \\ 0 & 0.5 & 0.5 & 1 \end{pmatrix}\) (states in the order A, B, C, D) M1A2
[3 marks]
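As a quick numerical sanity check (not part of the markscheme; a sketch assuming NumPy is available), the matrix above can be entered and each column verified to sum to 1, which is exactly the out-edge property from part (b):

```python
import numpy as np

# Transition matrix for the token game, states ordered A, B, C, D.
# Columns are the current state, rows the next state, so
# s_{n+1} = M @ s_n for column probability vectors.
M = np.array([
    [0.0, 0.0, 0.0, 0.0],  # to A: impossible once a toss is made
    [0.5, 0.5, 0.0, 0.0],  # to B: head from A, or head while already in B
    [0.5, 0.0, 0.5, 0.0],  # to C: tail from A, or tail while already in C
    [0.0, 0.5, 0.5, 1.0],  # to D: absorbing once both tokens are held
])

# Every column sums to 1: you must leave each state along some out-edge.
print(M.sum(axis=0))  # [1. 1. 1. 1.]
```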
Solving \(Ms = s\), i.e. \(\begin{pmatrix} 0 & 0 & 0 & 0 \\ 0.5 & 0.5 & 0 & 0 \\ 0.5 & 0 & 0.5 & 0 \\ 0 & 0.5 & 0.5 & 1 \end{pmatrix} \begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix} = \begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix}\) M1
since the first row gives \(a = 0\), the second and third rows then give \(b = 0\) and \(c = 0\), and \(a + b + c + d = 1\), so the steady state vector is \(\begin{pmatrix} 0 \\ 0 \\ 0 \\ 1 \end{pmatrix}\). A1R1A1
[4 marks]
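The steady state can also be confirmed numerically (a sketch, not part of the markscheme; assumes NumPy): since D is absorbing, repeatedly applying \(M\) to \(s_0\) converges to \(\begin{pmatrix} 0 \\ 0 \\ 0 \\ 1 \end{pmatrix}\):

```python
import numpy as np

M = np.array([
    [0.0, 0.0, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.5, 1.0],
])

s = np.array([1.0, 0.0, 0.0, 0.0])  # s_0: Abi starts with no tokens (state A)
for _ in range(100):                # iterate s_{n+1} = M s_n
    s = M @ s

# The transient states die off like (1/2)^n, so after 100 tosses s is
# (0, 0, 0, 1) to machine precision.
print(np.round(s, 6))  # [0. 0. 0. 1.]
```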
There is a loop with probability of 1 from state D to itself. A1
[1 mark]
Let the steady state probability vector be \(s\); then \(Ms = 1s\), showing that \(\lambda = 1\) is an eigenvalue with associated eigenvector \(s\). A1R1
[2 marks]
\(M^2 = \begin{pmatrix} 0 & 0 & 0 & 0 \\ 0.25 & 0.25 & 0 & 0 \\ 0.25 & 0 & 0.25 & 0 \\ 0.5 & 0.75 & 0.75 & 1 \end{pmatrix}\) A1A1A1A1
[4 marks]
\(M^n = \begin{pmatrix} 0 & 0 & 0 & 0 \\ \left(0.5\right)^n & \left(0.5\right)^n & 0 & 0 \\ \left(0.5\right)^n & 0 & \left(0.5\right)^n & 0 \\ 1 - \left(0.5\right)^{n-1} & 1 - \left(0.5\right)^n & 1 - \left(0.5\right)^n & 1 \end{pmatrix}\) A2
[2 marks]
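The deduced form of \(M^n\) can be cross-checked against direct matrix powers (a verification sketch, not part of the markscheme; the helper name `M_power_formula` is our own):

```python
import numpy as np
from numpy.linalg import matrix_power

M = np.array([
    [0.0, 0.0, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.5, 1.0],
])

def M_power_formula(n):
    # Closed form read off from the pattern in M and M^2:
    # transient entries decay like (1/2)^n; the D row absorbs the rest.
    p = 0.5 ** n
    return np.array([
        [0.0,                0.0,   0.0,   0.0],
        [p,                  p,     0.0,   0.0],
        [p,                  0.0,   p,     0.0],
        [1 - 0.5 ** (n - 1), 1 - p, 1 - p, 1.0],
    ])

ok = all(np.allclose(matrix_power(M, n), M_power_formula(n)) for n in range(1, 11))
print(ok)  # True
```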
As \(n \to \infty\), \(\left(0.5\right)^n \to 0\), so every column of \(M^n\) tends to \(\begin{pmatrix} 0 \\ 0 \\ 0 \\ 1 \end{pmatrix}\), the steady state probability vector. M1R1
[2 marks]
Require \(1 - \left(0.5\right)^{n-1} \geqslant 0.95\) (e.g. by use of a table), so \(\left(0.5\right)^{n-1} \leqslant 0.05\), giving \(n = 6\). R1M1A2
[4 marks]
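The final inequality can be confirmed with a short loop (a sketch, not part of the markscheme): starting from state A, the probability of being in state D after \(n\) tosses is \(1 - \left(0.5\right)^{n-1}\), and we want the first \(n\) for which this reaches 0.95.

```python
# d_n = 1 - (1/2)^(n-1) is the probability Abi holds both tokens
# after n tosses, read off from the first column of M^n.
n = 1
while 1 - 0.5 ** (n - 1) < 0.95:
    n += 1
print(n)  # 6  (d_5 = 0.9375 < 0.95, d_6 = 0.96875 >= 0.95)
```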