
If such a chain is irreducible and aperiodic and consists of M + 1 states 0, 1, …, M, show that the long-run proportions are given by

π_j = 1/(M + 1),  j = 0, 1, …, M

*21. A DNA nucleotide has any of four values. A standard model for a mutational change of the nucleotide at a specific location is a Markov chain model that supposes that in going from period to period the nucleotide does not change with probability 1 − 3α, and if it does change then it is equally likely to change to any of the other three values, for some 0 < α < 1/3.

(a) Show that P^n_{1,1} = 1/4 + (3/4)(1 − 4α)^n.

(b) What is the long-run proportion of time the chain is in each state?
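
Part (a) can be checked numerically. The sketch below (not part of the exercise) powers the 4 × 4 nucleotide matrix and compares entry (1, 1) with the closed form; the value α = 0.05 is an arbitrary illustrative choice.

```python
# Transition matrix of the nucleotide chain: stay with probability
# 1 - 3a, move to each of the other three values with probability a.
# a = 0.05 is an arbitrary illustrative choice.
a = 0.05
P = [[1 - 3 * a if i == j else a for j in range(4)] for i in range(4)]

def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

Pn = [[float(i == j) for j in range(4)] for i in range(4)]  # P^0 = I
for n in range(1, 11):
    Pn = mat_mult(Pn, P)
    closed_form = 0.25 + 0.75 * (1 - 4 * a) ** n
    assert abs(Pn[0][0] - closed_form) < 1e-12
print("P^n_{1,1} matches 1/4 + (3/4)(1 - 4a)^n for n = 1, ..., 10")
```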

22. Let Y_n be the sum of n independent rolls of a fair die. Find

lim_{n→∞} P{Y_n is a multiple of 13}


Hint: Define an appropriate Markov chain and apply the results of Exercise 20.
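
Following the hint, the chain X_n = Y_n mod 13 has a doubly stochastic transition matrix (each row and each column contains six entries of 1/6), so Exercise 20 gives limiting probability 1/13 for every residue. A numerical sketch of this:

```python
# X_n = Y_n mod 13 is a Markov chain on {0, ..., 12}: from s it moves to
# (s + d) mod 13 with probability 1/6 for each die face d = 1, ..., 6.
P = [[1/6 if 1 <= (j - s) % 13 <= 6 else 0.0 for j in range(13)]
     for s in range(13)]

# Each column also sums to 1, i.e. the matrix is doubly stochastic.
for j in range(13):
    assert abs(sum(P[s][j] for s in range(13)) - 1) < 1e-12

def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

Q = P
for _ in range(100):
    Q = mat_mult(Q, P)

# Every entry of the high matrix power is close to 1/13.
assert all(abs(Q[i][j] - 1/13) < 1e-9 for i in range(13) for j in range(13))
print("limiting probability that Y_n is a multiple of 13:", round(Q[0][0], 6))
```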

23. In a good weather year the number of storms is Poisson distributed with mean 1; in a bad year it is Poisson distributed with mean 3. Suppose that any year’s weather conditions depend on past years only through the previous year’s condition. Suppose that a good year is equally likely to be followed by either a good or a bad year, and that a bad year is twice as likely to be followed by a bad year as by a good year. Suppose that last year—call it year 0—was a good year.

(a) Find the expected total number of storms in the next two years (that is, in years 1 and 2).

(b) Find the probability there are no storms in year 3.

(c) Find the long-run average number of storms per year.
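
A sketch for checking one’s answers to (a)–(c); the two-state chain (good = 0, bad = 1) and the Poisson means come from the statement, and the asserted values 25/6 and 11/5 reflect one possible hand computation.

```python
import math

# States: 0 = good year, 1 = bad year.
P = [[1/2, 1/2], [1/3, 2/3]]
mean_storms = [1.0, 3.0]

def step(dist):
    return [dist[0] * P[0][j] + dist[1] * P[1][j] for j in (0, 1)]

# (a) expected storms in years 1 and 2, given year 0 was good.
dist = [1.0, 0.0]
total = 0.0
for _ in range(2):
    dist = step(dist)
    total += dist[0] * mean_storms[0] + dist[1] * mean_storms[1]
assert abs(total - 25 / 6) < 1e-12

# (b) P(no storms in year 3): condition on year 3's weather.
dist3 = step(dist)  # distribution of year 3's condition
p_none = dist3[0] * math.exp(-1) + dist3[1] * math.exp(-3)

# (c) long-run average: the stationary distribution is (2/5, 3/5).
d = [0.5, 0.5]
for _ in range(200):
    d = step(d)
assert abs(d[0] - 2 / 5) < 1e-12
long_run = d[0] * 1 + d[1] * 3
assert abs(long_run - 11 / 5) < 1e-12
print(f"(a) {total:.4f}  (b) {p_none:.4f}  (c) {long_run:.4f}")
```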

24. Consider three urns, one colored red, one white, and one blue. The red urn contains 1 red and 4 blue balls; the white urn contains 3 white balls, 2 red balls, and 2 blue balls; the blue urn contains 4 white balls, 3 red balls, and 2 blue balls. At the initial stage, a ball is randomly selected from the red urn and then returned to that urn. At every subsequent stage, a ball is randomly selected from the urn whose color is the same as that of the ball previously selected and is then returned to that urn. In the long run, what proportion of the selected balls are red? What proportion are white? What proportion are blue?
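
Since the urn sampled next is the one matching the color just drawn, the selected colors form a 3-state Markov chain. A numerical sketch for checking a hand computation (it asserts only that the computed vector is stationary, not any particular answer):

```python
# States, in order: red, white, blue = color of the ball just selected
# (equivalently, the urn sampled next).  Row entries come from the urn
# compositions given in the exercise.
P = [
    [1/5, 0.0, 4/5],  # red urn: 1 red, 0 white, 4 blue (5 balls)
    [2/7, 3/7, 2/7],  # white urn: 2 red, 3 white, 2 blue (7 balls)
    [3/9, 4/9, 2/9],  # blue urn: 3 red, 4 white, 2 blue (9 balls)
]

def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

Q = P
for _ in range(200):
    Q = mat_mult(Q, P)
pi = Q[0]  # all rows agree in the limit

# Sanity checks: pi is a distribution and solves pi = pi P.
assert abs(sum(pi) - 1) < 1e-10
for j in range(3):
    assert abs(pi[j] - sum(pi[i] * P[i][j] for i in range(3))) < 1e-10
print("long-run proportions (red, white, blue):", [round(x, 4) for x in pi])
```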

25. Each morning an individual leaves his house and goes for a run. He is equally likely to leave either from his front or back door. Upon leaving the house, he chooses a pair of running shoes (or goes running barefoot if there are no shoes at the door from which he departed). On his return he is equally likely to enter, and leave his running shoes, either by the front or back door. If he owns a total of k pairs of running shoes, what proportion of the time does he run barefoot?

26. Consider the following approach to shuffling a deck of n cards. Starting with any initial ordering of the cards, one of the numbers 1, 2, …, n is randomly chosen in such a manner that each one is equally likely to be selected. If number i is chosen, then we take the card that is in position i and put it on top of the deck—that is, we put that card in position 1. We then repeatedly perform the same operation. Show that, in the limit, the deck is perfectly shuffled in the sense that the resultant ordering is equally likely to be any of the n! possible orderings.

*27. Each individual in a population of size N is, in each period, either active or inactive. If an individual is active in a period then, independent of all else, that individual will be active in the next period with probability α. Similarly, if an individual is inactive in a period then, independent of all else, that individual will be inactive in the next period with probability β. Let X_n denote the number of individuals that are active in period n.

(a) Argue that {X_n, n ≥ 0} is a Markov chain.

(b) Find E[X_n | X_0 = i].

(c) Derive an expression for its transition probabilities.

(d) Find the long-run proportion of time that exactly j people are active.
Hint for (d): Consider first the case where N = 1.

28. Every time that the team wins a game, it wins its next game with probability 0.8; every time it loses a game, it wins its next game with probability 0.3. If the team wins a game, then it has dinner together with probability 0.7, whereas if the team loses then it has dinner together with probability 0.2. What proportion of games result in a team dinner?

29. An organization has N employees, where N is a large number. Each employee has one of three possible job classifications and changes classifications (independently) according to a Markov chain with transition probabilities

    P =  .7  .2  .1
         .2  .6  .2
         .1  .4  .5

What percentage of employees are in each classification?
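
A sketch for checking a hand computation; the assertion reflects the values 6/17, 7/17, 4/17 that come out of the stationarity equations for this matrix.

```python
# Power iteration on the distribution, starting uniform over the three
# classifications.
P = [[0.7, 0.2, 0.1],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]

pi = [1/3, 1/3, 1/3]
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# The stationarity equations give pi = (6/17, 7/17, 4/17).
expected = [6/17, 7/17, 4/17]
assert all(abs(pi[j] - expected[j]) < 1e-10 for j in range(3))
print("percentages:", [round(100 * x, 2) for x in pi])
```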

30. Three out of every four trucks on the road are followed by a car, while only one out of every five cars is followed by a truck. What fraction of vehicles on the road are trucks?

31. A certain town never has two sunny days in a row. Each day is classified as being either sunny, cloudy (but dry), or rainy. If it is sunny one day, then it is equally likely to be either cloudy or rainy the next day. If it is rainy or cloudy one day, then there is one chance in two that it will be the same the next day, and if it changes then it is equally likely to be either of the other two possibilities. In the long run, what proportion of days are sunny? What proportion are cloudy?

*32. Each of two switches is either on or off during a day. On day n, each switch will independently be on with probability

(1 + number of on switches during day n − 1)/4

For instance, if both switches are on during day n − 1, then each will independently be on during day n with probability 3/4. What fraction of days are both switches on? What fraction are both off?

33. A professor continually gives exams to her students. She can give three possible types of exams, and her class is graded as either having done well or badly. Let p_i denote the probability that the class does well on a type i exam, and suppose that p_1 = 0.3, p_2 = 0.6, and p_3 = 0.9. If the class does well on an exam, then the next exam is equally likely to be any of the three types. If the class does badly, then the next exam is always type 1. What proportion of exams are type i, i = 1, 2, 3?

34. A flea moves around the vertices of a triangle in the following manner: Whenever it is at vertex i it moves to its clockwise neighbor vertex with probability p_i and to the counterclockwise neighbor with probability q_i = 1 − p_i, i = 1, 2, 3.

(a) Find the proportion of time that the flea is at each of the vertices.

(b) How often does the flea make a counterclockwise move that is then followed by five consecutive clockwise moves?

35. Consider a Markov chain with states 0, 1, 2, 3, 4. Suppose P_{0,4} = 1; and suppose that when the chain is in state i, i > 0, the next state is equally likely to be any of the states 0, 1, …, i − 1. Find the limiting probabilities of this Markov chain.

36. The state of a process changes daily according to a two-state Markov chain. If the process is in state i during one day, then it is in state j the following day with probability P_{i,j}, where

P_{0,0} = 0.4,  P_{0,1} = 0.6,  P_{1,0} = 0.2,  P_{1,1} = 0.8

Every day a message is sent. If the state of the Markov chain that day is i, then the message sent is “good” with probability p_i and is “bad” with probability q_i = 1 − p_i, i = 0, 1.

(a) If the process is in state 0 on Monday, what is the probability that a good message is sent on Tuesday?

(b) If the process is in state 0 on Monday, what is the probability that a good message is sent on Friday?

(c) In the long run, what proportion of messages are good?

(d) Let Y_n equal 1 if a good message is sent on day n and let it equal 2 otherwise. Is {Y_n, n ≥ 1} a Markov chain? If so, give its transition probability matrix. If not, briefly explain why not.

37. Show that the stationary probabilities for the Markov chain having transition probabilities P_{i,j} are also the stationary probabilities for the Markov chain whose transition probabilities Q_{i,j} are given by

Q_{i,j} = P^k_{i,j}

for any specified positive integer k.

38. Capa plays either one or two chess games every day, with the number of games that she plays on successive days being a Markov chain with transition probabilities

P_{1,1} = .2,  P_{1,2} = .8,  P_{2,1} = .4,  P_{2,2} = .6

Capa wins each game with probability p. Suppose she plays two games on Monday.

(a) What is the probability that she wins all the games she plays on Tuesday?

(b) What is the expected number of games that she plays on Wednesday?

(c) In the long run, on what proportion of days does Capa win all her games?

39. Consider the one-dimensional symmetric random walk of Example 4.18, which was shown in that example to be recurrent. Let π_i denote the long-run proportion of time that the chain is in state i.

(a) Argue that π_i = π_0 for all i.

(b) Show that Σ_i π_i ≤ 1.

(c) Conclude that this Markov chain is null recurrent, and thus all π_i = 0.

40. A particle moves on 12 points situated on a circle. At each step it is equally likely to move one step in the clockwise or in the counterclockwise direction. Find the mean number of steps for the particle to return to its starting position.
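
By symmetry the stationary distribution is uniform, so the mean return time should be 1/(1/12) = 12. A sketch that confirms this by solving the expected-hitting-time equations by value iteration:

```python
# h[i] = expected number of steps to reach position 0 from position i.
# h[0] = 0 and h[i] = 1 + (h[i-1] + h[i+1]) / 2 (indices mod 12); the
# mean return time to 0 is then 1 + (h[1] + h[11]) / 2.
N = 12
h = [0.0] * N
for _ in range(20000):  # value iteration; converges to the minimal solution
    h = [0.0] + [1 + (h[(i - 1) % N] + h[(i + 1) % N]) / 2
                 for i in range(1, N)]

mean_return = 1 + (h[1] + h[N - 1]) / 2
assert abs(mean_return - 12) < 1e-6
print("mean steps to return:", round(mean_return, 6))
```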

*41. Consider a Markov chain with states equal to the nonnegative integers, and suppose its transition probabilities satisfy P_{i,j} = 0 if j ≤ i. Assume X_0 = 0, and let e_j be the probability that the Markov chain is ever in state j. (Note that e_0 = 1 because X_0 = 0.) Argue that for j > 0

e_j = Σ_{i=0}^{j−1} e_i P_{i,j}

If P_{i,i+k} = 1/3, k = 1, 2, 3, find e_i for i = 1, …, 10.
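
A sketch of the numerical part: since P_{i,j} = 0 unless j − i ∈ {1, 2, 3}, state j can be entered only from j − 1, j − 2, j − 3, and the recursion can be evaluated directly with exact fractions.

```python
from fractions import Fraction

# e_0 = 1; for j >= 1, e_j = (1/3) * sum of e_i over the nonnegative
# states i in {j-3, j-2, j-1}.
e = {0: Fraction(1)}
for j in range(1, 11):
    e[j] = Fraction(1, 3) * sum(e[i] for i in range(max(0, j - 3), j))

assert e[1] == Fraction(1, 3)
assert e[2] == Fraction(4, 9)
assert e[3] == Fraction(16, 27)
for j in range(1, 11):
    print(f"e_{j} = {e[j]} = {float(e[j]):.4f}")
```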

42. Let A be a set of states, and let A^c be the remaining states.

(a) What is the interpretation of

Σ_{i∈A} Σ_{j∈A^c} π_i P_{ij}?

(b) What is the interpretation of

Σ_{i∈A^c} Σ_{j∈A} π_i P_{ij}?

(c) Explain the identity

Σ_{i∈A} Σ_{j∈A^c} π_i P_{ij} = Σ_{i∈A^c} Σ_{j∈A} π_i P_{ij}

43. Each day, one of n possible elements is requested, the ith one with probability P_i, i ≥ 1, Σ_{i=1}^n P_i = 1. These elements are at all times arranged in an ordered list that is revised as follows: The element selected is moved to the front of the list with the relative positions of all the other elements remaining unchanged. Define the state at any time to be the list ordering at that time and note that there are n! possible states.

(a) Argue that the preceding is a Markov chain.

(b) For any state i_1, …, i_n (which is a permutation of 1, 2, …, n), let π(i_1, …, i_n) denote the limiting probability. In order for the state to be i_1, …, i_n, it is necessary for the last request to be for i_1, the last non-i_1 request for i_2, the last non-i_1 or i_2 request for i_3, and so on. Hence, it appears intuitive that

π(i_1, …, i_n) = P_{i_1} · P_{i_2}/(1 − P_{i_1}) · P_{i_3}/(1 − P_{i_1} − P_{i_2}) ⋯ P_{i_{n−1}}/(1 − P_{i_1} − ⋯ − P_{i_{n−2}})

Verify when n = 3 that the preceding are indeed the limiting probabilities.
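
A numerical check of the n = 3 verification, using an arbitrary request distribution (P_1, P_2, P_3) = (0.5, 0.3, 0.2): build the 6-state chain on orderings, power its transition matrix, and compare with the conjectured product form.

```python
from itertools import permutations

# Request probabilities: an arbitrary illustrative choice.
p = {1: 0.5, 2: 0.3, 3: 0.2}
states = list(permutations([1, 2, 3]))
idx = {s: k for k, s in enumerate(states)}

def move_to_front(state, i):
    return (i,) + tuple(x for x in state if x != i)

P = [[0.0] * 6 for _ in range(6)]
for s in states:
    for i, pi in p.items():
        P[idx[s]][idx[move_to_front(s, i)]] += pi

def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

Q = P
for _ in range(200):
    Q = mat_mult(Q, P)

# For n = 3 the product form is P_{i1} * P_{i2} / (1 - P_{i1}).
for s in states:
    i1, i2, _ = s
    conjectured = p[i1] * p[i2] / (1 - p[i1])
    assert abs(Q[0][idx[s]] - conjectured) < 1e-10
print("product-form limiting probabilities confirmed for n = 3")
```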

44. Suppose that a population consists of a fixed number, say, m, of genes in any generation. Each gene is one of two possible genetic types. If exactly i (of the m) genes of any generation are of type 1, then the next generation will have j type 1 (and m − j type 2) genes with probability

(m choose j) (i/m)^j ((m − i)/m)^{m−j},  j = 0, 1, …, m

Let X_n denote the number of type 1 genes in the nth generation, and assume that X_0 = i.

(a) Find E[X_n].

(b) What is the probability that eventually all the genes will be type 1?

45. Consider an irreducible finite Markov chain with states 0, 1, …, N.

(a) Starting in state i, what is the probability the process will ever visit state j? Explain!

(b) Let x_i = P{visit state N before state 0 | start in i}. Compute a set of linear equations that the x_i satisfy, i = 0, 1, …, N.

(c) If Σ_j j P_{ij} = i for i = 1, …, N − 1, show that x_i = i/N is a solution to the equations in part (b).

46. An individual possesses r umbrellas that he employs in going from his home to office, and vice versa. If he is at home (the office) at the beginning (end) of a day and it is raining, then he will take an umbrella with him to the office (home), provided there is one to be taken. If it is not raining, then he never takes an umbrella. Assume that, independent of the past, it rains at the beginning (end) of a day with probability p.

(a) Define a Markov chain with r + 1 states, which will help us to determine the proportion of time that our man gets wet. (Note: He gets wet if it is raining, and all umbrellas are at his other location.)

(b) Show that the limiting probabilities are given by

π_i = q/(r + q) if i = 0,  and  π_i = 1/(r + q) if i = 1, …, r,  where q = 1 − p

(c) What fraction of time does our man get wet?

(d) When r = 3, what value of p maximizes the fraction of time he gets wet?

*47. Let {X_n, n ≥ 0} denote an ergodic Markov chain with limiting probabilities π_i. Define the process {Y_n, n ≥ 1} by Y_n = (X_{n−1}, X_n). That is, Y_n keeps track of the last two states of the original chain. Is {Y_n, n ≥ 1} a Markov chain? If so, determine its transition probabilities and find

lim_{n→∞} P{Y_n = (i, j)}

48. Consider a Markov chain in steady state. Say that a k length run of zeroes ends at time m if

X_{m−k−1} ≠ 0,  X_{m−k} = X_{m−k+1} = ⋯ = X_{m−1} = 0,  X_m ≠ 0

Show that the probability of this event is π_0 (P_{0,0})^{k−1} (1 − P_{0,0})², where π_0 is the limiting probability of state 0.

49. Let P^{(1)} and P^{(2)} denote transition probability matrices for ergodic Markov chains having the same state space. Let π^1 and π^2 denote the stationary (limiting) probability vectors for the two chains. Consider a process defined as follows:

(a) X_0 = 1. A coin is then flipped and if it comes up heads, then the remaining states X_1, … are obtained from the transition probability matrix P^{(1)} and if tails from the matrix P^{(2)}. Is {X_n, n ≥ 0} a Markov chain? If p = P{coin comes up heads}, what is lim_{n→∞} P(X_n = i)?

(b) X_0 = 1. At each stage the coin is flipped and if it comes up heads, then the next state is chosen according to P^{(1)} and if tails comes up, then it is chosen according to P^{(2)}. In this case do the successive states constitute a Markov chain? If so, determine the transition probabilities. Show by a counterexample that the limiting probabilities are not the same as in part (a).

50. In Exercise 8, if today’s flip lands heads, what is the expected number of additional flips needed until the pattern t, t, h, t, h, t, t occurs?

51. In Example 4.3, Gary is in a cheerful mood today. Find the expected number of days until he has been glum for three consecutive days.

52. A taxi driver provides service in two zones of a city. Fares picked up in zone A will have destinations in zone A with probability 0.6 or in zone B with probability 0.4. Fares picked up in zone B will have destinations in zone A with probability 0.3 or in zone B with probability 0.7. The driver’s expected profit for a trip entirely in zone A is 6; for a trip entirely in zone B is 8; and for a trip that involves both zones is 12. Find the taxi driver’s average profit per trip.
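
A sketch for checking one’s answer; the assertions reflect the stationary split (3/7, 4/7) between zones and an average profit of 62/7 per trip obtained by hand.

```python
# State = zone of the current pickup; the next pickup occurs in the
# destination zone of the current fare.
pA, pB = 0.5, 0.5
for _ in range(200):
    pA, pB = 0.6 * pA + 0.3 * pB, 0.4 * pA + 0.7 * pB
assert abs(pA - 3 / 7) < 1e-12

profit_A = 0.6 * 6 + 0.4 * 12   # expected profit for a fare picked up in A
profit_B = 0.7 * 8 + 0.3 * 12
avg_profit = pA * profit_A + pB * profit_B
assert abs(avg_profit - 62 / 7) < 1e-9
print(f"average profit per trip = {avg_profit:.4f}")
```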

53. Find the average premium received per policyholder of the insurance company of Example 4.27 if λ = 1/4 for one-third of its clients, and λ = 1/2 for two-thirds of its clients.

54. Consider the Ehrenfest urn model in which M molecules are distributed between two urns, and at each time point one of the molecules is chosen at random and is then removed from its urn and placed in the other one. Let X_n denote the number of molecules in urn 1 after the nth switch and let μ_n = E[X_n]. Show that

(a) μ_{n+1} = 1 + (1 − 2/M)μ_n.

(b) Use (a) to prove that

μ_n = M/2 + ((M − 2)/M)^n (E[X_0] − M/2)
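
A numerical check of part (b): propagate the exact distribution of X_n for a small M and compare E[X_n] with the closed form. M = 6 and X_0 = 1 are arbitrary illustrative choices.

```python
# Ehrenfest chain: from state i, move to i - 1 w.p. i/M and to i + 1
# w.p. (M - i)/M.  M = 6 and X_0 = 1 are arbitrary choices.
M, x0 = 6, 1
dist = [0.0] * (M + 1)
dist[x0] = 1.0

for n in range(1, 21):
    new = [0.0] * (M + 1)
    for i, pr in enumerate(dist):
        if i > 0:
            new[i - 1] += pr * i / M        # chosen molecule leaves urn 1
        if i < M:
            new[i + 1] += pr * (M - i) / M  # chosen molecule enters urn 1
    dist = new
    mu_n = sum(i * pr for i, pr in enumerate(dist))
    closed = M / 2 + ((M - 2) / M) ** n * (x0 - M / 2)
    assert abs(mu_n - closed) < 1e-12
print("mu_n matches the closed form for n = 1, ..., 20")
```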

55. Consider a population of individuals each of whom possesses two genes that can be either type A or type a. Suppose that in outward appearance type A is dominant and type a is recessive. (That is, an individual will have only the outward characteristics of the recessive gene if its pair is aa.) Suppose that the population has stabilized, and the percentages of individuals having respective gene pairs AA, aa, and Aa are p, q, and r. Call an individual dominant or recessive depending on the outward characteristics it exhibits. Let S_{11} denote the probability that an offspring of two dominant parents will be recessive; and let S_{10} denote the probability that the offspring of one dominant and one recessive parent will be recessive. Compute S_{11} and S_{10} to show that S_{11} = S_{10}². (The quantities S_{10} and S_{11} are known in the genetics literature as Snyder’s ratios.)

56. Suppose that on each play of the game a gambler either wins 1 with probability p or loses 1 with probability 1 − p. The gambler continues betting until she or he is either up n or down m. What is the probability that the gambler quits a winner?

57. A particle moves among n + 1 vertices that are situated on a circle in the following manner. At each step it moves one step either in the clockwise direction with probability p or the counterclockwise direction with probability q = 1 − p. Starting at a specified state, call it state 0, let T be the time of the first return to state 0. Find the probability that all states have been visited by time T.
Hint: Condition on the initial transition and then use results from the gambler’s ruin problem.

58. In the gambler’s ruin problem of Section 4.5.1, suppose the gambler’s fortune is presently i, and suppose that we know that the gambler’s fortune will eventually reach N (before it goes to 0). Given this information, show that the probability he wins the next gamble is

p[1 − (q/p)^{i+1}]/[1 − (q/p)^i]  if p ≠ 1/2
(i + 1)/(2i)  if p = 1/2

Hint: The probability we want is

P{X_{n+1} = i + 1 | X_n = i, lim_{m→∞} X_m = N}
  = P{X_{n+1} = i + 1, lim_{m→∞} X_m = N | X_n = i} / P{lim_{m→∞} X_m = N | X_n = i}

59. For the gambler’s ruin model of Section 4.5.1, let M_i denote the mean number of games that must be played until the gambler either goes broke or reaches a fortune of N, given that he starts with i, i = 0, 1, …, N. Show that M_i satisfies

M_0 = M_N = 0;  M_i = 1 + pM_{i+1} + qM_{i−1},  i = 1, …, N − 1

Solve these equations to obtain

M_i = i(N − i)  if p = 1/2
M_i = i/(q − p) − (N/(q − p)) · [1 − (q/p)^i]/[1 − (q/p)^N]  if p ≠ 1/2
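
A numerical check of the closed form: solve the recursion of the first display by value iteration for sample values N = 10, p = 0.6 (an arbitrary choice) and compare.

```python
# Value iteration on M_i = 1 + p M_{i+1} + q M_{i-1}, M_0 = M_N = 0,
# for the sample values N = 10, p = 0.6.
N, p = 10, 0.6
q = 1 - p

M = [0.0] * (N + 1)
for _ in range(100000):
    M = [0.0] + [1 + p * M[i + 1] + q * M[i - 1] for i in range(1, N)] + [0.0]

def closed_form(i):
    if p == 0.5:
        return i * (N - i)
    r = q / p
    return i / (q - p) - (N / (q - p)) * (1 - r ** i) / (1 - r ** N)

assert all(abs(M[i] - closed_form(i)) < 1e-6 for i in range(N + 1))
print("M_i:", [round(m, 3) for m in M])
```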

60. The following is the transition probability matrix of a Markov chain with states 1, 2, 3, 4

    P =  .4   .3   .2   .1
         .2   .2   .2   .4
         .25  .25  .5   0
         .2   .1   .4   .3

If X_0 = 1,

(a) find the probability that state 3 is entered before state 4;

(b) find the mean number of transitions until either state 3 or state 4 is entered.
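
For X_0 = 1 both parts reduce to two linear equations in the unknowns attached to the transient states 1 and 2. A sketch that solves them by substitution and checks the result against the original equations (note that it gives away the answers 11/21 and 55/21):

```python
# f_i = P(enter state 3 before state 4 | X_0 = i), i = 1, 2, satisfies
#   f_1 = .4 f_1 + .3 f_2 + .2,   f_2 = .2 f_1 + .2 f_2 + .2,
# and the mean number of transitions m_i until 3 or 4 is entered satisfies
#   m_1 = 1 + .4 m_1 + .3 m_2,    m_2 = 1 + .2 m_1 + .2 m_2.
# Substituting f_2 = (f_1 + 1)/4 and m_2 = 1.25 + 0.25 m_1:
f1 = (1 / 8 + 1 / 3) / (1 - 1 / 8)          # = 11/21
f2 = (f1 + 1) / 4
m1 = (1 + 0.3 * 1.25) / (0.6 - 0.3 * 0.25)  # = 55/21
m2 = 1.25 + 0.25 * m1

# Check the solutions against the original equations.
assert abs(f1 - (.4 * f1 + .3 * f2 + .2)) < 1e-12
assert abs(f2 - (.2 * f1 + .2 * f2 + .2)) < 1e-12
assert abs(m1 - (1 + .4 * m1 + .3 * m2)) < 1e-12
assert abs(m2 - (1 + .2 * m1 + .2 * m2)) < 1e-12
print(f"(a) P(enter 3 before 4) = {f1:.4f}; (b) mean transitions = {m1:.4f}")
```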

61. Suppose in the gambler’s ruin problem that the probability of winning a bet depends on the gambler’s present fortune. Specifically, suppose that α_i is the probability that the gambler wins a bet when his or her fortune is i. Given that the gambler’s initial fortune is i, let P(i) denote the probability that the gambler’s fortune reaches N before 0.

(a) Derive a formula that relates P(i) to P(i − 1) and P(i + 1).

(b) Using the same approach as in the gambler’s ruin problem, solve the equation of part (a) for P(i).

(c) Suppose that i balls are initially in urn 1 and N − i are in urn 2, and suppose that at each stage one of the N balls is randomly chosen, taken from whichever urn it is in, and placed in the other urn. Find the probability that the first urn becomes empty before the second.

*62. Consider the particle from Exercise 57. What is the expected number of steps the particle takes to return to the starting position? What is the probability that all other positions are visited before the particle returns to its starting state?

63. For the Markov chain with states 1, 2, 3, 4 whose transition probability matrix P is as specified below, find f_{i3} and s_{i3} for i = 1, 2, 3.

    P =  0.4  0.2  0.1  0.3
         0.1  0.5  0.2  0.2
         0.3  0.4  0.2  0.1
         0    0    0    1

64. Consider a branching process having μ < 1. Show that if X_0 = 1, then the expected number of individuals that ever exist in this population is given by 1/(1 − μ). What if X_0 = n?

65. In a branching process having X_0 = 1 and μ > 1, prove that π_0 is the smallest positive number satisfying Equation (4.20).
Hint: Let π be any solution of π = Σ_{j=0}^∞ π^j P_j. Show by mathematical induction that π ≥ P{X_n = 0} for all n, and let n → ∞. In using the induction argue that

P{X_n = 0} = Σ_{j=0}^∞ (P{X_{n−1} = 0})^j P_j

66. For a branching process, calculate π_0 when

(a) P_0 = 1/4, P_2 = 3/4.

(b) P_0 = 1/4, P_1 = 1/2, P_2 = 1/4.

(c) P_0 = 1/6, P_1 = 1/2, P_3 = 1/3.
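
By Exercise 65, π_0 is the smallest positive root of π = Σ_j π^j P_j, and fixed-point iteration started at 0 increases to that root. A sketch (the asserted limits reflect one possible hand computation):

```python
# Fixed-point iteration x <- sum_j P_j x^j, starting from x = 0,
# converges upward to the smallest positive root, i.e. to pi_0.
def extinction_prob(P, iters=200000):
    """P maps an offspring count j to its probability P_j."""
    x = 0.0
    for _ in range(iters):
        x = sum(pj * x ** j for j, pj in P.items())
    return x

pa = extinction_prob({0: 1/4, 2: 3/4})
pb = extinction_prob({0: 1/4, 1: 1/2, 2: 1/4})
pc = extinction_prob({0: 1/6, 1: 1/2, 3: 1/3})

assert abs(pa - 1/3) < 1e-9                   # (a): root of x = 1/4 + (3/4)x^2
assert abs(pb - 1.0) < 1e-3                   # (b): mu = 1, so pi_0 = 1 (slow convergence)
assert abs(pc - (3 ** 0.5 - 1) / 2) < 1e-9    # (c): root of 2x^3 - 3x + 1 = 0
print(round(pa, 4), round(pb, 4), round(pc, 4))
```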

67. At all times, an urn contains N balls—some white balls and some black balls. At each stage, a coin having probability p, 0 < p < 1, of landing heads is flipped. If heads appears, then a ball is chosen at random from the urn and is replaced by a white ball; if tails appears, then a ball is chosen from the urn and is replaced by a black ball. Let X_n denote the number of white balls in the urn after the nth stage.

(a) Is {X_n, n ≥ 0} a Markov chain? If so, explain why.

(b) What are its classes? What are their periods? Are they transient or recurrent?

(c) Compute the transition probabilities P_{ij}.

(d) Let N = 2. Find the proportion of time in each state.

(e) Based on your answer in part (d) and your intuition, guess the answer for the limiting probability in the general case.

(f) Prove your guess in part (e) either by showing that Theorem 4.1 is satisfied or by using the results of Example 4.35.

(g) If p = 1, what is the expected time until there are only white balls in the urn if initially there are i white and N − i black?

*68.

(a) Show that the limiting probabilities of the reversed Markov chain are the same as for the forward chain by showing that they satisfy the equations

π_j = Σ_i π_i Q_{ij}

(b) Give an intuitive explanation for the result of part (a).

69. M balls are initially distributed among m urns. At each stage one of the balls is selected at random, taken from whichever urn it is in, and then placed, at random, in one of the other m − 1 urns. Consider the Markov chain whose state at any time is the vector (n_1, …, n_m), where n_i denotes the number of balls in urn i. Guess at the limiting probabilities for this Markov chain and then verify your guess and show at the same time that the Markov chain is time reversible.

70. A total of m white and m black balls are distributed among two urns, with each urn containing m balls. At each stage, a ball is randomly selected from each urn and the two selected balls are interchanged. Let X_n denote the number of black balls in urn 1 after the nth interchange.

(a) Give the transition probabilities of the Markov chain {X_n, n ≥ 0}.

(b) Without any computations, what do you think are the limiting probabilities of this chain?

(c) Find the limiting probabilities and show that the stationary chain is time reversible.

71. It follows from Theorem 4.2 that for a time reversible Markov chain

P_{ij} P_{jk} P_{ki} = P_{ik} P_{kj} P_{ji},  for all i, j, k

It turns out that if the state space is finite and P_{ij} > 0 for all i, j, then the preceding is also a sufficient condition for time reversibility. (That is, in this case, we need only check Equation (4.26) for paths from i to i that have only two intermediate states.) Prove this.
Hint: Fix i and show that the equations

π_j P_{jk} = π_k P_{kj}

are satisfied by π_j = cP_{ij}/P_{ji}, where c is chosen so that Σ_j π_j = 1.

72. For a time reversible Markov chain, argue that the rate at which transitions from i to j to k occur must equal the rate at which transitions from k to j to i occur.

73. Show that the Markov chain of Exercise 31 is time reversible.

74. A group of n processors is arranged in an ordered list. When a job arrives, the first processor in line attempts it; if it is unsuccessful, then the next in line tries it; if it too is unsuccessful, then the next in line tries it, and so on. When the job is successfully processed or after all processors have been unsuccessful, the job leaves the system. At this point we are allowed to reorder the processors, and a new job appears. Suppose that we use the one-closer reordering rule, which moves the processor that was successful one closer to the front of the line by interchanging its position with the one in front of it. If all processors were unsuccessful (or if the processor in the first position was successful), then the ordering remains the same. Suppose that each time processor i attempts a job then, independently of anything else, it is successful with probability p_i.

(a) Define an appropriate Markov chain to analyze this model.

(b) Show that this Markov chain is time reversible.

(c) Find the long-run probabilities.

75. A Markov chain is said to be a tree process if

(i) P_{ij} > 0 whenever P_{ji} > 0,

(ii) for every pair of states i and j, i ≠ j, there is a unique sequence of distinct states i = i_0, i_1, …, i_{n−1}, i_n = j such that

P_{i_k, i_{k+1}} > 0,  k = 0, 1, …, n − 1

In other words, a Markov chain is a tree process if for every pair of distinct states i and j there is a unique way for the process to go from i to j without reentering a state (and this path is the reverse of the unique path from j to i). Argue that an ergodic tree process is time reversible.

76. On a chessboard compute the expected number of plays it takes a knight, starting in one of the four corners of the chessboard, to return to its initial position if we assume that at each play it is equally likely to choose any of its legal moves. (No other pieces are on the board.)
Hint: Make use of Example 4.36.

77. In a Markov decision problem, another criterion often used, different than the expected average return per unit time, is that of the expected discounted return. In this criterion we choose a number α, 0 < α < 1, and try to choose a policy so as to maximize E[Σ_{i=0}^∞ α^i R(X_i, a_i)] (that is, rewards at time n are discounted at rate α^n). Suppose that the initial state is chosen according to the probabilities b_i. That is,

P{X_0 = i} = b_i,  i = 1, …, n

For a given policy β let y_{ja} denote the expected discounted time that the process is in state j and action a is chosen. That is,

y_{ja} = E_β[Σ_{n=0}^∞ α^n I{X_n = j, a_n = a}]

where for any event A the indicator variable I_A is defined by

I_A = 1 if A occurs, and 0 otherwise

(a) Show that

Σ_a y_{ja} = E[Σ_{n=0}^∞ α^n I{X_n = j}]

or, in other words, Σ_a y_{ja} is the expected discounted time in state j under β.

(b) Show that

Σ_j Σ_a y_{ja} = 1/(1 − α),  Σ_a y_{ja} = b_j + α Σ_i Σ_a y_{ia} P_{ij}(a)


Hint: For the second equation, use the identity

I{X_{n+1} = j} = Σ_i Σ_a I{X_n = i, a_n = a} I{X_{n+1} = j}

Take expectations of the preceding to obtain

E[I{X_{n+1} = j}] = Σ_i Σ_a E[I{X_n = i, a_n = a}] P_{ij}(a)

(c) Let {y_{ja}} be a set of numbers satisfying

Σ_j Σ_a y_{ja} = 1/(1 − α),  Σ_a y_{ja} = b_j + α Σ_i Σ_a y_{ia} P_{ij}(a)  (4.38)

Argue that y_{ja} can be interpreted as the expected discounted time that the process is in state j and action a is chosen when the initial state is chosen according to the probabilities b_j and the policy β, given by

β_i(a) = y_{ia}/Σ_a y_{ia}

is employed.


Hint: Derive a set of equations for the expected discounted times when policy β is used and show that they are equivalent to Equation (4.38).

(d) Argue that an optimal policy with respect to the expected discounted return criterion can be obtained by first solving the linear program

maximize  Σ_j Σ_a y_{ja} R(j, a)
such that  Σ_j Σ_a y_{ja} = 1/(1 − α),
           Σ_a y_{ja} = b_j + α Σ_i Σ_a y_{ia} P_{ij}(a),
           y_{ja} ≥ 0,  all j, a;

and then defining the policy β by

β_i(a) = y_{ia}/Σ_a y_{ia}

where the y_{ja} are the solutions of the linear program.

78. For the Markov chain of Exercise 5, suppose that p(s|j) is the probability that signal s is emitted when the underlying Markov chain state is j, j = 0, 1, 2.

(a) What proportion of emissions are signal s?

(b) What proportion of those times in which signal s is emitted is the underlying state 0?

79. In Example 4.43, what is the probability that the first 4 items produced are all acceptable?