
Renewal Theory and Its Applications

Abstract

We have seen that a Poisson process is a counting process for which the times between successive events are independent and identically distributed exponential random variables. One possible generalization is to consider a counting process for which the times between successive events are independent and identically distributed with an arbitrary distribution. Such a counting process is called a renewal process.

Keywords

Renewal Process; Limit Theorems; Elementary Renewal Theorem; Wald's Equation; Renewal Reward Process; Regenerative Process; Alternating Renewal Process; Semi-Markov Process; Patterns

7.1 Introduction

We have seen that a Poisson process is a counting process for which the times between successive events are independent and identically distributed exponential random variables. One possible generalization is to consider a counting process for which the times between successive events are independent and identically distributed with an arbitrary distribution. Such a counting process is called a renewal process.

Let $\{N(t), t \ge 0\}$ be a counting process and let $X_n$ denote the time between the $(n-1)$st and the $n$th event of this process, $n \ge 1$.

Definition 7.1

If the sequence of nonnegative random variables $\{X_1, X_2, \ldots\}$ is independent and identically distributed, then the counting process $\{N(t), t \ge 0\}$ is said to be a renewal process.

Thus, a renewal process is a counting process such that the time until the first event occurs has some distribution $F$, the time between the first and second event has, independently of the time of the first event, the same distribution $F$, and so on. When an event occurs, we say that a renewal has taken place.

For an example of a renewal process, suppose that we have an infinite supply of lightbulbs whose lifetimes are independent and identically distributed. Suppose also that we use a single lightbulb at a time, and when it fails we immediately replace it with a new one. Under these conditions, $\{N(t), t \ge 0\}$ is a renewal process when $N(t)$ represents the number of lightbulbs that have failed by time $t$.
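This lightbulb scheme translates directly into a short simulation: draw i.i.d. lifetimes, accumulate the partial sums, and count how many renewals occur by time $t$. A minimal sketch in Python (the function name and the choice of an exponential lifetime distribution are illustrative, not from the text):

```python
import random

def renewal_count(t, draw_interarrival):
    """Return N(t), the number of renewals by time t."""
    n, s = 0, 0.0
    while True:
        s += draw_interarrival()   # s is the partial sum S_n
        if s > t:
            return n               # N(t) = max{n : S_n <= t}
        n += 1

random.seed(1)
# Lightbulb lifetimes ~ exponential with mean 2 (an arbitrary choice);
# over 100 time units we expect roughly 100/2 = 50 failures.
n_failed = renewal_count(100.0, lambda: random.expovariate(0.5))
print(n_failed)
```

Any other lifetime sampler can be passed in; only the i.i.d. assumption matters.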

For a renewal process having interarrival times $X_1, X_2, \ldots$, let

$$S_0 = 0, \qquad S_n = \sum_{i=1}^{n} X_i, \quad n \ge 1$$

That is, $S_1 = X_1$ is the time of the first renewal; $S_2 = X_1 + X_2$ is the time until the first renewal plus the time between the first and second renewal, that is, $S_2$ is the time of the second renewal. In general, $S_n$ denotes the time of the $n$th renewal (see Figure 7.1).

We shall let $F$ denote the interarrival distribution and, to avoid trivialities, we assume that $F(0) = P\{X_n = 0\} < 1$. Furthermore, we let

$$\mu = E[X_n], \quad n \ge 1$$

be the mean time between successive renewals. It follows from the nonnegativity of $X_n$ and the fact that $X_n$ is not identically 0 that $\mu > 0$.

The first question we shall attempt to answer is whether an infinite number of renewals can occur in a finite amount of time. That is, can $N(t)$ be infinite for some (finite) value of $t$? To show that this cannot occur, we first note that, as $S_n$ is the time of the $n$th renewal, $N(t)$ may be written as

$$N(t) = \max\{n : S_n \le t\} \tag{7.1}$$

To understand why Equation (7.1) is valid, suppose, for instance, that $S_4 \le t$ but $S_5 > t$. Then the fourth renewal had occurred by time $t$ but the fifth renewal occurred after time $t$; in other words, $N(t)$, the number of renewals that occurred by time $t$, must equal 4. Now, by the strong law of large numbers it follows that, with probability 1,

$$\frac{S_n}{n} \to \mu \quad \text{as } n \to \infty$$

But since $\mu > 0$ this means that $S_n$ must be going to infinity as $n$ goes to infinity. Thus, $S_n$ can be less than or equal to $t$ for at most a finite number of values of $n$, and hence by Equation (7.1), $N(t)$ must be finite.

However, though $N(t) < \infty$ for each $t$, it is true that, with probability 1,

$$N(\infty) \equiv \lim_{t \to \infty} N(t) = \infty$$

This follows since the only way in which $N(\infty)$, the total number of renewals that occur, can be finite is for one of the interarrival times to be infinite.

Therefore,

$$P\{N(\infty) < \infty\} = P\{X_n = \infty \text{ for some } n\} = P\left\{\bigcup_{n=1}^{\infty} \{X_n = \infty\}\right\} \le \sum_{n=1}^{\infty} P\{X_n = \infty\} = 0$$

7.2 Distribution of $N(t)$

The distribution of $N(t)$ can be obtained, at least in theory, by first noting the important relationship that the number of renewals by time $t$ is greater than or equal to $n$ if and only if the $n$th renewal occurs before or at time $t$. That is,

$$N(t) \ge n \iff S_n \le t \tag{7.2}$$

From Equation (7.2) we obtain

$$P\{N(t) = n\} = P\{N(t) \ge n\} - P\{N(t) \ge n+1\} = P\{S_n \le t\} - P\{S_{n+1} \le t\} \tag{7.3}$$

Now, since the random variables $X_i$, $i \ge 1$, are independent and have a common distribution $F$, it follows that $S_n = \sum_{i=1}^{n} X_i$ is distributed as $F_n$, the $n$-fold convolution of $F$ with itself (Section 2.5). Therefore, from Equation (7.3) we obtain

$$P\{N(t) = n\} = F_n(t) - F_{n+1}(t)$$

Example 7.1

Suppose that $P\{X_n = i\} = p(1-p)^{i-1}$, $i \ge 1$. That is, suppose that the interarrival distribution is geometric. Now $S_1 = X_1$ may be interpreted as the number of trials necessary to get a single success when each trial is independent and has a probability $p$ of being a success. Similarly, $S_n$ may be interpreted as the number of trials necessary to attain $n$ successes, and hence has the negative binomial distribution

$$P\{S_n = k\} = \binom{k-1}{n-1} p^n (1-p)^{k-n} \quad \text{for } k \ge n, \qquad P\{S_n = k\} = 0 \quad \text{for } k < n$$

Thus, from Equation (7.3) we have that

$$P\{N(t) = n\} = \sum_{k=n}^{[t]} \binom{k-1}{n-1} p^n (1-p)^{k-n} - \sum_{k=n+1}^{[t]} \binom{k-1}{n} p^{n+1} (1-p)^{k-n-1}$$

Equivalently, since an event independently occurs with probability $p$ at each of the times $1, 2, \ldots$,

$$P\{N(t) = n\} = \binom{[t]}{n} p^n (1-p)^{[t]-n}$$
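The claim that $N(t)$ is binomial when the interarrivals are geometric is easy to check by simulation. The sketch below (function names and parameter values are my own) estimates $P\{N(t) = n\}$ empirically and compares it with the binomial formula:

```python
import random
from math import comb, floor

def simulate_N(t, p):
    """N(t) for geometric(p) interarrivals: renewals occur at the
    successive success times of Bernoulli(p) trials."""
    n, s = 0, 0
    while True:
        x = 1                      # geometric interarrival time
        while random.random() >= p:
            x += 1
        s += x
        if s > t:
            return n
        n += 1

random.seed(2)
t, p, trials = 10.5, 0.3, 20000
counts = [simulate_N(t, p) for _ in range(trials)]
k = floor(t)                       # [t]
for n in range(4):
    exact = comb(k, n) * p**n * (1 - p)**(k - n)
    est = counts.count(n) / trials
    print(n, round(exact, 4), round(est, 4))
```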

Another expression for $P\{N(t) = n\}$ can be obtained by conditioning on $S_n$. This yields

$$P\{N(t) = n\} = \int_0^{\infty} P\{N(t) = n \mid S_n = y\}\, f_{S_n}(y)\,dy$$

Now, if the $n$th event occurred at time $y > t$, then there would have been fewer than $n$ events by time $t$. On the other hand, if it occurred at a time $y \le t$, then there would be exactly $n$ events by time $t$ if the next interarrival time exceeds $t - y$. Consequently,

$$P\{N(t) = n\} = \int_0^t P\{X_{n+1} > t - y \mid S_n = y\}\, f_{S_n}(y)\,dy = \int_0^t \bar{F}(t-y)\, f_{S_n}(y)\,dy$$

where $\bar{F} = 1 - F$.

Example 7.2

If $F(x) = 1 - e^{-\lambda x}$, then $S_n$, being the sum of $n$ independent exponentials with rate $\lambda$, will have a gamma $(n, \lambda)$ distribution. Consequently, the preceding identity gives

$$P\{N(t) = n\} = \int_0^t e^{-\lambda(t-y)}\, \lambda e^{-\lambda y} \frac{(\lambda y)^{n-1}}{(n-1)!}\,dy = \frac{\lambda^n e^{-\lambda t}}{(n-1)!} \int_0^t y^{n-1}\,dy = e^{-\lambda t} \frac{(\lambda t)^n}{n!}$$
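This recovers, as it must, the Poisson distribution, and a quick simulation confirms it. The following sketch (parameter values arbitrary) estimates $P\{N(t) = n\}$ from exponential interarrivals and compares it with $e^{-\lambda t}(\lambda t)^n/n!$:

```python
import random
from math import exp, factorial

def poisson_count(t, lam):
    """N(t) for exponential(lam) interarrivals."""
    n, s = 0, 0.0
    while True:
        s += random.expovariate(lam)
        if s > t:
            return n
        n += 1

random.seed(3)
lam, t, trials = 2.0, 3.0, 20000
counts = [poisson_count(t, lam) for _ in range(trials)]
for n in (4, 6, 8):
    exact = exp(-lam * t) * (lam * t) ** n / factorial(n)
    est = counts.count(n) / trials
    print(n, round(exact, 4), round(est, 4))
```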

By using Equation (7.2) we can calculate $m(t)$, the mean value of $N(t)$, as

$$m(t) = E[N(t)] = \sum_{n=1}^{\infty} P\{N(t) \ge n\} = \sum_{n=1}^{\infty} P\{S_n \le t\} = \sum_{n=1}^{\infty} F_n(t)$$

where we have used the fact that if $X$ is nonnegative and integer valued, then

$$E[X] = \sum_{k=1}^{\infty} k P\{X = k\} = \sum_{k=1}^{\infty} \sum_{n=1}^{k} P\{X = k\} = \sum_{n=1}^{\infty} \sum_{k=n}^{\infty} P\{X = k\} = \sum_{n=1}^{\infty} P\{X \ge n\}$$

The function $m(t)$ is known as the mean-value or the renewal function.

It can be shown that the mean-value function $m(t)$ uniquely determines the renewal process. Specifically, there is a one-to-one correspondence between the interarrival distributions $F$ and the mean-value functions $m(t)$.

Another interesting result that we state without proof is that

$$m(t) < \infty \quad \text{for all } t < \infty$$

Remarks

An integral equation satisfied by the renewal function can be obtained by conditioning on the time of the first renewal. Assuming that the interarrival distribution $F$ is continuous with density function $f$, this yields

$$m(t) = E[N(t)] = \int_0^{\infty} E[N(t) \mid X_1 = x]\, f(x)\,dx \tag{7.4}$$

Now suppose that the first renewal occurs at a time $x$ that is less than $t$. Then, using the fact that a renewal process probabilistically starts over when a renewal occurs, it follows that the number of renewals by time $t$ would have the same distribution as 1 plus the number of renewals in the first $t - x$ time units. Therefore,

$$E[N(t) \mid X_1 = x] = 1 + E[N(t-x)] \quad \text{if } x < t$$

Since, clearly,

$$E[N(t) \mid X_1 = x] = 0 \quad \text{when } x > t$$

we obtain from Equation (7.4) that

$$m(t) = \int_0^t [1 + m(t-x)]\, f(x)\,dx = F(t) + \int_0^t m(t-x)\, f(x)\,dx \tag{7.5}$$

Equation (7.5) is called the renewal equation and can sometimes be solved to obtain the renewal function.

Example 7.3

One instance in which the renewal equation can be solved is when the interarrival distribution is uniform, say, uniform on (0, 1). We will now present a solution in this case when $t \le 1$. For such values of $t$, the renewal function becomes

$$m(t) = t + \int_0^t m(t-x)\,dx = t + \int_0^t m(y)\,dy \quad \text{by the substitution } y = t - x$$

Differentiating the preceding equation yields

$$m'(t) = 1 + m(t)$$

Letting $h(t) = 1 + m(t)$, we obtain

$$h'(t) = h(t)$$

or

$$\log h(t) = t + C$$

or

$$h(t) = K e^t$$

or

$$m(t) = K e^t - 1$$

Since $m(0) = 0$, we see that $K = 1$, and so we obtain

$$m(t) = e^t - 1, \quad 0 \le t \le 1$$
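The closed form $m(t) = e^t - 1$ on $0 \le t \le 1$ can be verified by estimating $E[N(t)]$ directly from simulated uniform interarrivals. A small sketch (sample sizes arbitrary):

```python
import random
from math import e

def N_of_t(t):
    """Number of renewals by time t with Uniform(0,1) interarrivals."""
    n, s = 0, 0.0
    while True:
        s += random.random()
        if s > t:
            return n
        n += 1

random.seed(4)
trials = 100000
for t in (0.5, 1.0):
    m_est = sum(N_of_t(t) for _ in range(trials)) / trials
    print(t, round(m_est, 3), round(e**t - 1, 3))
```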

7.3 Limit Theorems and Their Applications

We have shown previously that, with probability 1, $N(t)$ goes to infinity as $t$ goes to infinity. However, it would be nice to know the rate at which $N(t)$ goes to infinity. That is, we would like to be able to say something about $\lim_{t \to \infty} N(t)/t$.

As a prelude to determining the rate at which $N(t)$ grows, let us first consider the random variable $S_{N(t)}$. In words, just what does this random variable represent? Proceeding inductively, suppose, for instance, that $N(t) = 3$. Then $S_{N(t)} = S_3$ represents the time of the third event. Since there are only three events that have occurred by time $t$, $S_3$ also represents the time of the last event prior to (or at) time $t$. This is, in fact, what $S_{N(t)}$ represents: the time of the last renewal prior to or at time $t$. Similar reasoning leads to the conclusion that $S_{N(t)+1}$ represents the time of the first renewal after time $t$ (see Figure 7.2). We are now ready to prove the following.

Proposition 7.1

With probability 1,

$$\frac{N(t)}{t} \to \frac{1}{\mu} \quad \text{as } t \to \infty$$

Proof

Since $S_{N(t)}$ is the time of the last renewal prior to or at time $t$, and $S_{N(t)+1}$ is the time of the first renewal after time $t$, we have

$$S_{N(t)} \le t < S_{N(t)+1}$$

or

$$\frac{S_{N(t)}}{N(t)} \le \frac{t}{N(t)} < \frac{S_{N(t)+1}}{N(t)} \tag{7.6}$$

However, since $S_{N(t)}/N(t) = \sum_{i=1}^{N(t)} X_i / N(t)$ is the average of $N(t)$ independent and identically distributed random variables, it follows by the strong law of large numbers that $S_{N(t)}/N(t) \to \mu$ as $N(t) \to \infty$. But since $N(t) \to \infty$ when $t \to \infty$, we obtain

$$\frac{S_{N(t)}}{N(t)} \to \mu \quad \text{as } t \to \infty$$

Furthermore, writing

$$\frac{S_{N(t)+1}}{N(t)} = \frac{S_{N(t)+1}}{N(t)+1} \cdot \frac{N(t)+1}{N(t)}$$

we have that $S_{N(t)+1}/(N(t)+1) \to \mu$ by the same reasoning as before and

$$\frac{N(t)+1}{N(t)} \to 1 \quad \text{as } t \to \infty$$

Hence,

$$\frac{S_{N(t)+1}}{N(t)} \to \mu \quad \text{as } t \to \infty$$

The result now follows by Equation (7.6) since $t/N(t)$ is between two random variables, each of which converges to $\mu$ as $t \to \infty$. ■

Example 7.4

Beverly has a radio that works on a single battery. As soon as the battery in use fails, Beverly immediately replaces it with a new battery. If the lifetime of a battery (in hours) is distributed uniformly over the interval (30, 60), then at what rate does Beverly have to change batteries?

Solution: If we let $N(t)$ denote the number of batteries that have failed by time $t$, we have by Proposition 7.1 that the rate at which Beverly replaces batteries is given by

$$\lim_{t \to \infty} \frac{N(t)}{t} = \frac{1}{\mu} = \frac{1}{45}$$

That is, in the long run, Beverly will have to replace one battery every 45 hours. ■
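The long-run rate $1/45$ can be confirmed by simulating a long horizon of battery replacements. A sketch (the horizon length is an arbitrary choice):

```python
import random

random.seed(5)
t = 1_000_000.0                   # hours simulated
count, s = 0, 0.0
while True:
    s += random.uniform(30, 60)   # one battery lifetime
    if s > t:
        break
    count += 1
rate = count / t
print(round(rate, 5), round(1 / 45, 5))
```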

Example 7.5

Suppose in Example 7.4 that Beverly does not keep any surplus batteries on hand, and so each time a failure occurs she must go and buy a new battery. If the amount of time it takes for her to get a new battery is uniformly distributed over (0, 1), then what is the average rate at which Beverly changes batteries?

Solution: In this case the mean time between renewals is given by

$$\mu = E[U_1] + E[U_2]$$

where $U_1$ is uniform over (30, 60) and $U_2$ is uniform over (0, 1). Hence,

$$\mu = 45 + \tfrac{1}{2} = \tfrac{91}{2}$$

and so in the long run, Beverly will be putting in a new battery at the rate of $\tfrac{2}{91}$. That is, she will put in two new batteries every 91 hours. ■

Example 7.6

Suppose that potential customers arrive at a single-server bank in accordance with a Poisson process having rate $\lambda$. However, an arriving customer will enter the bank only if the server is free when he or she arrives; a potential customer who finds the server busy leaves without entering. If the service times are independent with distribution $G$, at what rate do customers enter the bank, and what proportion of potential customers actually enter?

Solution: In answering these questions, let us suppose that at time 0 a customer has just entered the bank. (That is, we define the process to start when the first customer enters the bank.) If we let $\mu_G$ denote the mean service time, then, by the memoryless property of the Poisson process, it follows that the mean time between entering customers is

$$\mu = \mu_G + \frac{1}{\lambda}$$

Hence, the rate at which customers enter the bank will be given by

$$\frac{1}{\mu} = \frac{\lambda}{1 + \lambda \mu_G}$$

On the other hand, since potential customers will be arriving at a rate $\lambda$, it follows that the proportion of them entering the bank will be given by

$$\frac{\lambda/(1 + \lambda \mu_G)}{\lambda} = \frac{1}{1 + \lambda \mu_G}$$

In particular, if $\lambda = 2$ and $\mu_G = 2$, then only one customer out of five will actually enter the system. ■

A somewhat unusual application of Proposition 7.1 is provided by our next example.

Example 7.7

A sequence of independent trials, each of which results in outcome number $i$ with probability $P_i$, $i = 1, \ldots, n$, $\sum_{i=1}^{n} P_i = 1$, is observed until the same outcome occurs $k$ times in a row; this outcome then is declared to be the winner of the game. For instance, if $k = 2$ and the sequence of outcomes is 1, 2, 4, 3, 5, 2, 1, 3, 3, then we stop after nine trials and declare outcome number 3 the winner. What is the probability that $i$ wins, $i = 1, \ldots, n$, and what is the expected number of trials?

Solution: We begin by computing the expected number of coin tosses, call it $E[T]$, until a run of $k$ successive heads occurs when the tosses are independent and each lands on heads with probability $p$. By conditioning on the time of the first nonhead, we obtain

$$E[T] = \sum_{j=1}^{k} (1-p) p^{j-1} (j + E[T]) + k p^k$$

Solving this for $E[T]$ yields

$$E[T] = k + \frac{1-p}{p^k} \sum_{j=1}^{k} j p^{j-1}$$

Upon simplifying, we obtain

$$E[T] = \frac{1 + p + \cdots + p^{k-1}}{p^k} = \frac{1 - p^k}{p^k (1-p)} \tag{7.7}$$
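Equation (7.7) can be sanity-checked by simulation: toss a $p$-coin until $k$ heads in a row and average the number of tosses. In the sketch below (function names are mine), $p = 1/2$ and $k = 3$ give $E[T] = (1 - 1/8)/((1/8)(1/2)) = 14$:

```python
import random

def expected_run_time(p, k):
    """Closed form (7.7): expected tosses until k heads in a row."""
    return (1 - p**k) / (p**k * (1 - p))

def simulate_run_time(p, k):
    """Toss p-coins until k successive heads; return the number of tosses."""
    run = tosses = 0
    while run < k:
        tosses += 1
        run = run + 1 if random.random() < p else 0
    return tosses

random.seed(6)
p, k, trials = 0.5, 3, 50000
est = sum(simulate_run_time(p, k) for _ in range(trials)) / trials
print(expected_run_time(p, k), round(est, 2))  # exact value is 14.0
```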


Now, let us return to our example, and let us suppose that as soon as the winner of a game has been determined we immediately begin playing another game. For each $i$, let us determine the rate at which outcome $i$ wins. Now, every time $i$ wins, everything starts over again, and thus wins by $i$ constitute renewals. Hence, from Proposition 7.1, the

$$\text{rate at which } i \text{ wins} = \frac{1}{E[N_i]}$$

where $N_i$ denotes the number of trials played between successive wins of outcome $i$. Hence, from Equation (7.7) we see that

$$\text{rate at which } i \text{ wins} = \frac{P_i^k (1 - P_i)}{1 - P_i^k} \tag{7.8}$$

Hence, the long-run proportion of games that are won by number $i$ is given by

$$\text{proportion of games } i \text{ wins} = \frac{\text{rate at which } i \text{ wins}}{\sum_{j=1}^{n} \text{rate at which } j \text{ wins}} = \frac{P_i^k (1-P_i)/(1-P_i^k)}{\sum_{j=1}^{n} P_j^k (1-P_j)/(1-P_j^k)}$$

However, it follows from the strong law of large numbers that the long-run proportion of games that $i$ wins will, with probability 1, be equal to the probability that $i$ wins any given game. Hence,

$$P\{i \text{ wins}\} = \frac{P_i^k (1-P_i)/(1-P_i^k)}{\sum_{j=1}^{n} P_j^k (1-P_j)/(1-P_j^k)}$$

To compute the expected time of a game, we first note that the

$$\text{rate at which games end} = \sum_{i=1}^{n} \text{rate at which } i \text{ wins} = \sum_{i=1}^{n} \frac{P_i^k (1-P_i)}{1 - P_i^k} \quad \text{(from Equation (7.8))}$$


Now, as everything starts over when a game ends, it follows by Proposition 7.1 that the rate at which games end is equal to the reciprocal of the mean time of a game. Hence,

$$E[\text{time of a game}] = \frac{1}{\text{rate at which games end}} = \frac{1}{\sum_{i=1}^{n} P_i^k (1-P_i)/(1-P_i^k)}$$
Figure 7.2
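Both the winning probabilities and the mean game length derived in Example 7.7 can be checked numerically. The sketch below (function names and the probabilities $P = (0.5, 0.3, 0.2)$ with $k = 2$ are illustrative choices) plays many games and compares empirical frequencies with the renewal-argument formulas:

```python
import random

def play_game(P, k):
    """Play one game: i.i.d. trials with outcome probabilities P,
    stopping when some outcome occurs k times in a row."""
    last, run, trials = None, 0, 0
    while run < k:
        trials += 1
        i = random.choices(range(len(P)), weights=P)[0]
        run = run + 1 if i == last else 1
        last = i
    return last, trials

def win_prob(P, k, i):
    """P{i wins} from the renewal-rate argument of Example 7.7."""
    f = [P[j]**k * (1 - P[j]) / (1 - P[j]**k) for j in range(len(P))]
    return f[i] / sum(f)

random.seed(7)
P, k, games = [0.5, 0.3, 0.2], 2, 20000
results = [play_game(P, k) for _ in range(games)]
for i in range(len(P)):
    est = sum(1 for w, _ in results if w == i) / games
    print(i, round(win_prob(P, k, i), 3), round(est, 3))
avg_len = sum(t for _, t in results) / games
print(round(avg_len, 3))  # compare with 1 / (sum of the winning rates)
```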

Proposition 7.1 says that the average renewal rate up to time $t$ will, with probability 1, converge to $1/\mu$ as $t \to \infty$. What about the expected average renewal rate? Is it true that $m(t)/t$ also converges to $1/\mu$? The answer is yes, and the result

$$\frac{m(t)}{t} \to \frac{1}{\mu} \quad \text{as } t \to \infty$$

is known as the elementary renewal theorem.

Remark

At first glance it might seem that the elementary renewal theorem should be a simple consequence of Proposition 7.1. That is, since the average renewal rate will, with probability 1, converge to $1/\mu$, should this not imply that the expected average renewal rate also converges to $1/\mu$? We must, however, be careful; consider the next example.

Example 7.8

Let $U$ be a random variable that is uniformly distributed on (0, 1), and define the random variables $Y_n$, $n \ge 1$, by

$$Y_n = \begin{cases} 0, & \text{if } U > 1/n \\ n, & \text{if } U \le 1/n \end{cases}$$

Now, since, with probability 1, $U$ will be greater than 0, it follows that $Y_n$ will equal 0 for all sufficiently large $n$. That is, $Y_n$ will equal 0 for all $n$ large enough so that $1/n < U$. Hence, with probability 1,

$$Y_n \to 0 \quad \text{as } n \to \infty$$

However,

$$E[Y_n] = n P\left\{U \le \frac{1}{n}\right\} = n \cdot \frac{1}{n} = 1$$

Therefore, even though the sequence of random variables $Y_n$ converges to 0, the expected values of the $Y_n$ are all identically 1. ■
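A short computation makes the point concrete: for a fixed large $n$, the sample mean of $Y_n$ stays near 1 even though $Y_n = 0$ on the overwhelming majority of draws. A sketch with the assumed value $n = 50$:

```python
import random

random.seed(8)
# Y_n = n if U <= 1/n, else 0: Y_n -> 0 a.s., yet E[Y_n] = 1 for every n.
n, samples = 50, 100000
total = zeros = 0
for _ in range(samples):
    y = n if random.random() <= 1 / n else 0
    total += y
    zeros += (y == 0)
print(round(total / samples, 2))   # sample mean, close to E[Y_50] = 1
print(round(zeros / samples, 3))   # fraction of zeros, close to 0.98
```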

To prove the elementary renewal theorem we will make use of an identity known as Wald’s equation. Before stating Wald’s equation we need to introduce the concept of a stopping time for a sequence of independent random variables.

Definition

The nonnegative integer-valued random variable $N$ is said to be a stopping time for a sequence of independent random variables $X_1, X_2, \ldots$ if the event $\{N = n\}$ is independent of $X_{n+1}, X_{n+2}, \ldots$ for all $n = 1, 2, \ldots$.

The idea behind a stopping time is that we imagine that the $X_i$ are observed in sequence, first $X_1$, then $X_2$, and so on, and that $N$ denotes the number of them observed before stopping. Because the event that we stop after having observed $X_1, \ldots, X_n$ can depend only on these $n$ values, and not on future unobserved values, it must be independent of these future values.

Example 7.9

Suppose that $X_1, X_2, \ldots$ is a sequence of independent and identically distributed random variables with

$$P(X_i = 1) = p = 1 - P(X_i = 0)$$

where $p > 0$. If we define

$$N = \min\{n : X_1 + \cdots + X_n = r\}$$

then $N$ is a stopping time for the sequence. If we imagine that trials are being performed in sequence and that $X_i = 1$ corresponds to a success on trial $i$, then $N$ is the number of trials needed until there have been a total of $r$ successes, when each trial is independently a success with probability $p$. ■

Example 7.10

Suppose that $X_1, X_2, \ldots$ is a sequence of independent and identically distributed random variables with

$$P(X_i = 1) = \tfrac{1}{2} = 1 - P(X_i = -1)$$

If

$$N = \min\{n : X_1 + \cdots + X_n = 1\}$$

then $N$ is a stopping time for the sequence. $N$ can be regarded as the stopping time for a gambler who on each play is equally likely to win or lose 1, and who is going to stop the first time he is winning money. (Because the successive winnings of the gambler form a symmetric random walk, which we showed in Chapter 4 to be a recurrent Markov chain, it follows that $P(N < \infty) = 1$.) ■

We are now ready for Wald’s equation.

Theorem 7.2

Wald’s Equation

If $X_1, X_2, \ldots$ is a sequence of independent and identically distributed random variables with finite expectation $E[X]$, and if $N$ is a stopping time for this sequence such that $E[N] < \infty$, then

$$E\left[\sum_{n=1}^{N} X_n\right] = E[N]\,E[X]$$

Proof

For $n = 1, 2, \ldots$, let

$$I_n = \begin{cases} 1, & \text{if } n \le N \\ 0, & \text{if } n > N \end{cases}$$

and note that

$$\sum_{n=1}^{N} X_n = \sum_{n=1}^{\infty} X_n I_n$$

Taking expectations gives

$$E\left[\sum_{n=1}^{N} X_n\right] = E\left[\sum_{n=1}^{\infty} X_n I_n\right] = \sum_{n=1}^{\infty} E[X_n I_n]$$

Now, $I_n = 1$ if $N \ge n$, which means that $I_n = 1$ if we have not yet stopped after having observed $X_1, \ldots, X_{n-1}$. But this implies that the value of $I_n$ is determined before $X_n$ has been observed, and thus $X_n$ is independent of $I_n$. Consequently,

$$E[X_n I_n] = E[X_n] E[I_n] = E[X] E[I_n]$$

showing that

$$E\left[\sum_{n=1}^{N} X_n\right] = E[X] \sum_{n=1}^{\infty} E[I_n] = E[X]\,E\left[\sum_{n=1}^{\infty} I_n\right] = E[X]\,E[N]$$

■
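Wald's equation can be verified numerically with the stopping time of Example 7.9: stop at the $r$th success of Bernoulli($p$) trials. There $\sum_{n=1}^{N} X_n = r$ by construction, while $E[N]E[X] = (r/p)p = r$, so the two sides must agree. A sketch (parameter values arbitrary):

```python
import random

def stopped_sum(p, r):
    """Run Bernoulli(p) trials until r successes; return (sum, N)."""
    total = n = 0
    while total < r:
        n += 1
        total += (random.random() < p)
    return total, n

random.seed(9)
p, r, trials = 0.3, 5, 20000
sums, ns = zip(*(stopped_sum(p, r) for _ in range(trials)))
lhs = sum(sums) / trials            # E[sum of the X_n up to N]; always r here
rhs = (sum(ns) / trials) * p        # E[N] * E[X]
print(round(lhs, 3), round(rhs, 3)) # both close to r = 5
```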

To apply Wald's equation to renewal theory, let $X_1, X_2, \ldots$ be the sequence of interarrival times of a renewal process. If we observe these one at a time and then stop at the first renewal after time $t$, then we would stop after having observed $X_1, \ldots, X_{N(t)+1}$, showing that $N(t)+1$ is a stopping time for the sequence of interarrival times. For a more formal argument that $N(t)+1$ is a stopping time for the sequence of interarrival times, note that $N(t) = n - 1$ if and only if the $(n-1)$st renewal occurs by time $t$ and the $n$th renewal occurs after time $t$. That is,

$$N(t) + 1 = n \iff N(t) = n - 1 \iff X_1 + \cdots + X_{n-1} \le t,\; X_1 + \cdots + X_n > t$$

showing that the event $\{N(t)+1 = n\}$ depends only on the values of $X_1, \ldots, X_n$. We thus have the following corollary of Wald's equation.

Proposition 7.2

If $X_1, X_2, \ldots$ are the interarrival times of a renewal process, then

$$E[X_1 + \cdots + X_{N(t)+1}] = E[X]\,E[N(t)+1]$$

That is,

$$E[S_{N(t)+1}] = \mu\,[m(t) + 1]$$
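Proposition 7.2 can be tested in a case where $m(t)$ is known explicitly. For exponential interarrivals with rate $\lambda$ we have $m(t) = \lambda t$ and $\mu = 1/\lambda$, so the identity predicts $E[S_{N(t)+1}] = t + 1/\lambda$. A sketch (parameter values arbitrary):

```python
import random

def first_renewal_after(t, lam):
    """Return S_{N(t)+1}, the time of the first renewal after t."""
    s = 0.0
    while s <= t:
        s += random.expovariate(lam)
    return s

random.seed(10)
lam, t, trials = 1.5, 4.0, 20000
est = sum(first_renewal_after(t, lam) for _ in range(trials)) / trials
# For exponential interarrivals m(t) = lam*t, so mu*(m(t)+1) = t + 1/lam
print(round(est, 3), round(t + 1 / lam, 3))
```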

We are now ready to prove the elementary renewal theorem.

Proof of Elementary Renewal Theorem

Because $S_{N(t)+1}$ is the time of the first renewal after $t$, it follows that

$$S_{N(t)+1} = t + Y(t)$$

where $Y(t)$, called the excess at time $t$, is defined as the time from $t$ until the next renewal. Taking expectations of the preceding yields, upon applying Proposition 7.2, that

$$\mu(m(t) + 1) = t + E[Y(t)] \tag{7.9}$$

which can be written as

$$\frac{m(t)}{t} = \frac{1}{\mu} + \frac{E[Y(t)]}{t\mu} - \frac{1}{t}$$

Because $Y(t) \ge 0$, the preceding yields that $\frac{m(t)}{t} \ge \frac{1}{\mu} - \frac{1}{t}$, showing that

$$\liminf_{t \to \infty} \frac{m(t)}{t} \ge \frac{1}{\mu}$$

To show that $\limsup_{t \to \infty} m(t)/t \le 1/\mu$, let us suppose that there is a value $M < \infty$ such that $P(X_i < M) = 1$. Because this implies that $Y(t)$ must also be less than $M$, we have that $E[Y(t)] < M$, and so

$$\frac{m(t)}{t} \le \frac{1}{\mu} + \frac{M}{t\mu} - \frac{1}{t}$$

which gives that

$$\limsup_{t \to \infty} \frac{m(t)}{t} \le \frac{1}{\mu}$$

and thus completes the proof of the elementary renewal theorem when the interarrival times are bounded. When the interarrival times $X_1, X_2, \ldots$ are unbounded, fix $M > 0$, and let $\{N_M(t), t \ge 0\}$ be the renewal process with interarrival times $\min(X_i, M)$, $i \ge 1$. Because $\min(X_i, M) \le X_i$ for all $i$, it follows that $N_M(t) \ge N(t)$ for all $t$. (That is, because each interarrival time of $N_M(t)$ is smaller than its corresponding interarrival time of $N(t)$, it must have at least as many renewals by time $t$.) Consequently, $E[N(t)] \le E[N_M(t)]$, showing that

$$\limsup_{t \to \infty} \frac{E[N(t)]}{t} \le \lim_{t \to \infty} \frac{E[N_M(t)]}{t} = \frac{1}{E[\min(X_i, M)]}$$

where the equality follows because the interarrival times of $N_M(t)$ are bounded. Using that $\lim_{M \to \infty} E[\min(X_i, M)] = E[X_i] = \mu$, we obtain from the preceding, upon letting $M \to \infty$, that

$$\limsup_{t \to \infty} \frac{m(t)}{t} \le \frac{1}{\mu}$$

and the proof is completed. ■

Equation (7.9) shows that if we can determine $E[Y(t)]$, the mean excess at time $t$, then we can compute $m(t)$, and vice versa.

Example 7.11

Consider the renewal process whose interarrival distribution is the convolution of two exponentials; that is,

$$F = F_1 * F_2, \quad \text{where } F_i(t) = 1 - e^{-\mu_i t},\; i = 1, 2$$

We will determine the renewal function by first determining $E[Y(t)]$. To obtain the mean excess at $t$, imagine that each renewal corresponds to a new machine being put in use, and suppose that each machine has two components: initially component 1 is employed and it lasts an exponential time with rate $\mu_1$; then component 2, which functions for an exponential time with rate $\mu_2$, is employed. When component 2 fails, a new machine is put in use (that is, a renewal occurs). Now consider the process $\{X(t), t \ge 0\}$ where $X(t)$ is $i$ if a type $i$ component is in use at time $t$. It is easy to see that $\{X(t), t \ge 0\}$ is a two-state continuous-time Markov chain, and so, using the results of Example 6.11, its transition probabilities are

$$P_{11}(t) = \frac{\mu_1}{\mu_1 + \mu_2} e^{-(\mu_1 + \mu_2)t} + \frac{\mu_2}{\mu_1 + \mu_2}$$

To compute the expected remaining life of the machine in use at time $t$, we condition on whether it is using its first or second component: if it is still using its first component, then its remaining life is $1/\mu_1 + 1/\mu_2$, whereas if it is already using its second component, then its remaining life is $1/\mu_2$. Hence, letting $p(t)$ denote the probability that the machine in use at time $t$ is using its first component, we have

$$E[Y(t)] = \left(\frac{1}{\mu_1} + \frac{1}{\mu_2}\right) p(t) + \frac{1 - p(t)}{\mu_2} = \frac{1}{\mu_2} + \frac{p(t)}{\mu_1}$$

But, since at time 0 the first machine is utilizing its first component, it follows that $p(t) = P_{11}(t)$, and so, upon using the preceding expression for $P_{11}(t)$, we obtain

$$E[Y(t)] = \frac{1}{\mu_2} + \frac{1}{\mu_1 + \mu_2} e^{-(\mu_1 + \mu_2)t} + \frac{\mu_2}{\mu_1(\mu_1 + \mu_2)} \tag{7.10}$$

Now it follows from Equation (7.9) that

$$m(t) + 1 = \frac{t}{\mu} + \frac{E[Y(t)]}{\mu} \tag{7.11}$$

where $\mu$, the mean interarrival time, is given in this case by

$$\mu = \frac{1}{\mu_1} + \frac{1}{\mu_2} = \frac{\mu_1 + \mu_2}{\mu_1 \mu_2}$$

Substituting Equation (7.10) and the preceding equation into (7.11) yields, after simplifying,

$$m(t) = \frac{\mu_1 \mu_2}{\mu_1 + \mu_2}\, t - \frac{\mu_1 \mu_2}{(\mu_1 + \mu_2)^2}\left[1 - e^{-(\mu_1 + \mu_2)t}\right]$$
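This closed form for $m(t)$ can be checked by simulating the renewal process whose interarrivals are sums of two independent exponentials. A sketch (rates and horizon are arbitrary choices):

```python
import random
from math import exp

def count_renewals(t, mu1, mu2):
    """N(t) when each interarrival is Exp(mu1) + Exp(mu2)."""
    n, s = 0, 0.0
    while True:
        s += random.expovariate(mu1) + random.expovariate(mu2)
        if s > t:
            return n
        n += 1

random.seed(11)
mu1, mu2, t, trials = 1.0, 2.0, 5.0, 20000
est = sum(count_renewals(t, mu1, mu2) for _ in range(trials)) / trials
a = mu1 * mu2 / (mu1 + mu2)
exact = a * t - (a / (mu1 + mu2)) * (1 - exp(-(mu1 + mu2) * t))
print(round(est, 3), round(exact, 3))
```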