STAT 334
ASSIGNMENT 2
1) (10) You are in an Escape Room with three doors. You start from position A and each time you return to A, all doors look identical to you.
If you take Door 1, you return to A in 4 minutes.
If you take Door 2, you return to A in 5 minutes.
If you take Door 3, you spend 2 minutes and then you can turn right or left. If you turn right, you escape the room in 3 minutes. If you turn left, you escape the room in 1 minute.
You choose doors at random. Let Y = number of minutes required to escape.
a) Find E(Y).
b) Find Var(Y).
c) Suppose you decide to cheat and mark the doors you already tried. (You will not choose a door if you were unsuccessful with it in the past, but will choose randomly among the remaining doors.)
Find E(Y) and Var(Y). (Hint: In this case, it may be easy to list all the possible paths.)
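The answers to (a) and (b) can be sanity-checked with a short simulation sketch. Note one assumption the problem does not state: the code below takes the right and left turns behind Door 3 to be equally likely.

```python
import random

def escape_time(rng):
    """One simulated escape, choosing a door uniformly at random each
    time we are back at A. Assumption (not stated in the problem):
    turning right or left behind Door 3 is equally likely."""
    t = 0
    while True:
        door = rng.randint(1, 3)
        if door == 1:
            t += 4                 # Door 1: back at A after 4 minutes
        elif door == 2:
            t += 5                 # Door 2: back at A after 5 minutes
        else:
            t += 2                 # Door 3: 2 minutes to the fork...
            t += 3 if rng.random() < 0.5 else 1  # ...then escape right (3 min) or left (1 min)
            return t

rng = random.Random(0)
times = [escape_time(rng) for _ in range(200_000)]
mean = sum(times) / len(times)
var = sum((t - mean) ** 2 for t in times) / len(times)
print(mean, var)
```

The sample mean and variance should land close to your first-step-analysis answers.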
2) (15) I play video games every night. Each time I play a game, I have a probability p of winning. The games are independent.
a) Suppose I play until I win k games. It takes me Y (a r.v.) games to get these k wins; then I play Y more games and then stop. Find the mean total number of wins.
b) Find the variance of the total number of wins in (a).
c) Suppose instead I decide to play n games. I win X (a r.v.) of these games, and then I play an additional X games. Find the mean total number of wins.
d) Find the variance of the total number of wins in part (c).
e) Find the probability that I win 0 additional games in the situation in part (c).
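Parts (a) and (b) can be checked against a simulation of the two-phase scheme; the values of k and p below are arbitrary choices for the check.

```python
import random

def total_wins(k, p, rng):
    """Phase 1: play until the k-th win, counting the games Y it takes.
    Phase 2: play Y more games. Return total wins over both phases."""
    wins, games = 0, 0
    while wins < k:
        games += 1
        if rng.random() < p:
            wins += 1
    extra = sum(rng.random() < p for _ in range(games))  # wins in the Y extra games
    return wins + extra

rng = random.Random(1)
k, p = 5, 0.4
sims = [total_wins(k, p, rng) for _ in range(100_000)]
mean_total = sum(sims) / len(sims)
print(mean_total)
```

Compare the printed estimate with your conditional-expectation answer for the same k and p.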
3) (9) Suppose the number of accidents during a month, M , has a Poisson distribution with mean
100. Each accident independently leads to an insurance claim with probability
0.8 or does not lead to an insurance claim with probability 0.2. Let N = number of insurance claims during a month.
(a) Find the mean and variance of N . (Hint: let Yi = 1 if accident i leads to an insurance claim and 0 otherwise.)
(b) Find the mgf of N . What distribution is this? Why does it make logical sense?
(c) If the amount paid for each insurance claim is Xi ∼ Gam(5, 0.01), find the mean and standard deviation of the total amount paid for all claims during a month.
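As a numerical check on (a) and (b), the thinning suggested by the hint can be simulated directly; the simulation size and seed are my own choices.

```python
import numpy as np

rng = np.random.default_rng(0)
months = 200_000
M = rng.poisson(100, size=months)   # accidents per month, M ~ Poisson(100)
N = rng.binomial(M, 0.8)            # each accident yields a claim independently w.p. 0.8
print(N.mean(), N.var())            # compare with your answers to part (a)
```

The sample mean and variance of N should agree with what part (b)'s distributional answer predicts.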
4) (7) Suppose that if Surya holds tutorial one week, he will hold a tutorial the next week with probability 0.6. But if he does not hold a tutorial one week, he will hold one the next week with probability 0.3.
a) Define the state space and the transition probability matrix for this Markov chain.
b) Do you think the conditions of a Markov chain would actually hold in this situation in reality?
c) Surya did not hold a tutorial this week. What is the probability he will hold a tutorial five weeks from today?
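Part (c) can be checked numerically by raising the transition matrix to the 5th power; the state ordering below is my own convention.

```python
import numpy as np

# State ordering (my convention): index 0 = holds tutorial (T), index 1 = does not (N).
P = np.array([[0.6, 0.4],
              [0.3, 0.7]])
P5 = np.linalg.matrix_power(P, 5)
# Starting from N (no tutorial this week), the chance of a tutorial
# five weeks from now is the (N, T) entry of P^5.
print(P5[1, 0])
```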
5) (15) Suppose the number of errors in the STAT 334 course notes can be modeled by a Poisson process with rate 2 per page. 75% of the errors will be discovered by the proofreading process, with each error being discovered independently of the others.
(a) Find the probability that there are 8 errors in 2 pages.
(b) Find the probability that 5 errors are discovered in 2 pages.
(c) Given that there are 8 errors in 2 pages, find the distribution of the number that are discovered on page 1.
(d) Given that 5 errors are discovered in 2 pages, find the expected actual number of errors on those 2 pages.
(e) The time to fix an error has mean 10 minutes and standard deviation 5 minutes. Find the mean and variance of the total time to fix all discovered errors in t pages.
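Monte Carlo estimates for (a) and (b) are easy to generate from the process description (sample size and seed are arbitrary choices); they should match your Poisson calculations.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
errors = rng.poisson(2 * 2, size=n)       # errors in 2 pages: rate 2 per page * 2 pages
discovered = rng.binomial(errors, 0.75)   # each error discovered independently w.p. 0.75
print((errors == 8).mean())               # Monte Carlo estimate for part (a)
print((discovered == 5).mean())           # Monte Carlo estimate for part (b)
```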
6) (12) Consider games played by a basketball team which has P(Win) = 0.5. A string of consecutive games with identical outcomes is called a "streak". Let X1 = 1 and Xn = the length of the current streak up to and including game n. So, for example, if the outcomes were
W, W, L, W, W, W, W, L, ... then
X1 = 1, X2 = 2, X3 = 1, X4 = 1, X5 = 2, X6 = 3, X7 = 4, X8 = 1, etc. The state space is S = {1, 2, ...}.
(a) Write down the first 5×5 entries of the transition probability matrix P.
(b) Is this chain irreducible? Is it aperiodic? Briefly justify each.
(c) Solve for the stationary distribution and use this to determine the proportion of time the team is on a 5-game streak.
(d) Now suppose the probability that the team wins is not 0.5. Is this process still a Markov chain? Either prove that it is, or explain why not and how you could change/enlarge the state space so that it is.
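The stationary distribution in (c) can be checked numerically by truncating the infinite state space; the truncation level K below is an arbitrary choice, with the tail lumped into the last state.

```python
import numpy as np

K = 30                      # truncate the infinite state space {1, 2, ...} at K states
P = np.zeros((K, K))
for k in range(K):
    P[k, 0] = 0.5                        # outcome flips: streak resets to length 1
    P[k, min(k + 1, K - 1)] += 0.5       # outcome repeats: streak grows by 1 (tail lumped at K)
pi = np.ones(K) / K
for _ in range(200):                     # power iteration converges to the stationary distribution
    pi = pi @ P
print(pi[:6])                            # long-run proportions for streak lengths 1..6
```

The entry for streak length 5 gives the long-run proportion asked for in (c).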
7) (12) Consider the following Markov chain (Figure 1).
a) Let X = the number of transitions up to and including the transition in which the process leaves State 3 for the last time. Find Var(X).
b) Let Y = the number of transitions up to and including the transition in which the process enters State 4 for the first time. Find E(Y).
c) Let πi = the probability that the process is in state i after 100^100 periods. Find πi for each state.
8) (15) Vivek is currently in Room 1.
On each day he is in Room 1, he has a probability of ½ of remaining in Room 1 the next day.
Otherwise, he has an equally likely chance of being in Room 2, Room 3, Room A, or Room B the next day.
On any day he is in Room 2, he has a probability of ½ of switching to Room B, a probability of 3/8 of switching to Room 1, and a probability of 1/8 of switching to Room 3 the next day.
On any day he is in Room 3, he has a probability of ¼ of switching to Room A, a probability of 3/8 of switching to Room 1, and a probability of 3/8 of switching to Room 2 the next day.
If he finds himself in Rooms A or B, he stays there forever.
a) Draw the Markov chain and write down its transition matrix. Identify all the transient and recurrent states.
b) What is the probability that he eventually will leave the numbered Rooms 1, 2 and 3?
c) What is the probability that he will eventually be in Room B?
d) What is the expected number of days until he leaves the numbered rooms?
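Parts (b)-(d) can be checked with the standard absorbing-chain (fundamental matrix) computation; the matrices below encode the transition probabilities stated above, with transient rooms ordered 1, 2, 3 and absorbing rooms ordered A, B.

```python
import numpy as np

# Transient rooms 1, 2, 3; absorbing rooms A, B.
Q = np.array([[1/2, 1/8, 1/8],
              [3/8, 0,   1/8],
              [3/8, 3/8, 0  ]])   # transitions among rooms 1-3
R = np.array([[1/8, 1/8],
              [0,   1/2],
              [1/4, 0  ]])        # transitions from rooms 1-3 into rooms A, B
N = np.linalg.inv(np.eye(3) - Q)  # fundamental matrix
absorb = N @ R                    # absorption probabilities (rows: start 1,2,3; cols: A,B)
days = N.sum(axis=1)              # expected days until absorption, by starting room
print(absorb[0], days[0])         # starting from Room 1, as in the problem
```

Row sums of `absorb` answer (b), the B column answers (c), and `days[0]` answers (d).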