
5.3.4 Further Properties of Poisson Processes

Consider a Poisson process $\{N(t), t \ge 0\}$ having rate $\lambda$, and suppose that each time an event occurs it is classified as either a type I or a type II event. Suppose further that each event is classified as a type I event with probability $p$ or a type II event with probability $1-p$, independently of all other events. For example, suppose that customers arrive at a store in accordance with a Poisson process having rate $\lambda$; and suppose that each arrival is male with probability $\frac{1}{2}$ and female with probability $\frac{1}{2}$. Then a type I event would correspond to a male arrival and a type II event to a female arrival.

Let $N_1(t)$ and $N_2(t)$ denote respectively the number of type I and type II events occurring in $[0,t]$. Note that $N(t) = N_1(t) + N_2(t)$.

Proposition 5.2

$\{N_1(t), t \ge 0\}$ and $\{N_2(t), t \ge 0\}$ are both Poisson processes having respective rates $\lambda p$ and $\lambda(1-p)$. Furthermore, the two processes are independent.

Example 5.14

If immigrants to area $A$ arrive at a Poisson rate of ten per week, and if each immigrant is of English descent with probability $\frac{1}{12}$, then what is the probability that no people of English descent will immigrate to area $A$ during the month of February?

Solution: By the previous proposition it follows that the number of English immigrants to area $A$ during the month of February is Poisson distributed with mean $4 \cdot 10 \cdot \frac{1}{12} = \frac{10}{3}$. Hence, the desired probability is $e^{-10/3}$. ■
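This thinning argument is easy to check numerically. The sketch below (simulation parameters are my own choices) generates the rate-10 arrival stream over four weeks, classifies each arrival as English with probability $1/12$, and compares the empirical chance of zero English arrivals with $e^{-10/3}$:

```python
import math
import random

random.seed(1)
rate, p, weeks = 10.0, 1 / 12, 4.0   # arrivals/week, P(English), February ≈ 4 weeks
trials, no_english = 200_000, 0
for _ in range(trials):
    t, english = random.expovariate(rate), 0   # exponential interarrival times
    while t <= weeks:
        if random.random() < p:                # classify each arrival independently
            english += 1
        t += random.expovariate(rate)
    no_english += (english == 0)

exact = math.exp(-10 / 3)                      # Poisson(40/12) puts this mass at zero
print(f"simulated {no_english / trials:.4f}  vs  exact {exact:.4f}")
```

With 200,000 replications the empirical frequency agrees with the exact value to within roughly $10^{-3}$.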

Example 5.15

Suppose nonnegative offers to buy an item that you want to sell arrive according to a Poisson process with rate $\lambda$. Assume that each offer is the value of a continuous random variable having density function $f(x)$. Once the offer is presented to you, you must either accept it or reject it and wait for the next offer. We suppose that you incur costs at a rate $c$ per unit time until the item is sold, and that your objective is to maximize your expected total return, where the total return is equal to the amount received minus the total cost incurred. Suppose you employ the policy of accepting the first offer that is greater than some specified value $y$. (Such a type of policy, which we call a $y$-policy, can be shown to be optimal.) What is the best value of $y$? What is the maximal expected net return?

Solution: Let us compute the expected total return when you use the $y$-policy, and then choose the value of $y$ that maximizes this quantity. Let $X$ denote the value of a random offer, and let $\bar F(x) = P\{X > x\} = \int_x^\infty f(u)\,du$ be its tail distribution function. Because each offer will be greater than $y$ with probability $\bar F(y)$, it follows that such offers occur according to a Poisson process with rate $\lambda \bar F(y)$. Hence, the time until an offer is accepted is an exponential random variable with rate $\lambda \bar F(y)$. Letting $R(y)$ denote the total return from the policy that accepts the first offer that is greater than $y$, we have

$$
\begin{aligned}
E[R(y)] &= E[\text{accepted offer}] - cE[\text{time to accept}]\\
&= E[X \mid X > y] - \frac{c}{\lambda \bar F(y)}\\
&= \int_0^\infty x f_{X \mid X > y}(x)\,dx - \frac{c}{\lambda \bar F(y)}\\
&= \int_y^\infty x \frac{f(x)}{\bar F(y)}\,dx - \frac{c}{\lambda \bar F(y)}\\
&= \frac{\int_y^\infty x f(x)\,dx - c/\lambda}{\bar F(y)}
\end{aligned} \tag{5.14}
$$

Differentiation yields

$$
\frac{d}{dy}E[R(y)] = 0 \;\Longleftrightarrow\; -\bar F(y)\, y f(y) + \left(\int_y^\infty x f(x)\,dx - \frac{c}{\lambda}\right) f(y) = 0
$$

Therefore, the optimal value of $y$ satisfies

$$
y \bar F(y) = \int_y^\infty x f(x)\,dx - \frac{c}{\lambda}
$$

or

$$
y \int_y^\infty f(x)\,dx = \int_y^\infty x f(x)\,dx - \frac{c}{\lambda}
$$

or

$$
\int_y^\infty (x - y) f(x)\,dx = \frac{c}{\lambda}
$$

It is not difficult to show that there is a unique value of $y$ that satisfies the preceding. Hence, the optimal policy is the one that accepts the first offer that is greater than $y^*$, where $y^*$ is such that

$$
\int_{y^*}^\infty (x - y^*) f(x)\,dx = c/\lambda
$$

Putting $y = y^*$ in Equation (5.14) shows that the maximal expected net return is

$$
\begin{aligned}
E[R(y^*)] &= \frac{1}{\bar F(y^*)}\left(\int_{y^*}^\infty (x - y^* + y^*) f(x)\,dx - c/\lambda\right)\\
&= \frac{1}{\bar F(y^*)}\left(\int_{y^*}^\infty (x - y^*) f(x)\,dx + y^* \int_{y^*}^\infty f(x)\,dx - c/\lambda\right)\\
&= \frac{1}{\bar F(y^*)}\left(c/\lambda + y^* \bar F(y^*) - c/\lambda\right)\\
&= y^*
\end{aligned}
$$

Thus, the optimal critical value is also the maximal expected net return. To understand why this is so, let $m$ be the maximal expected net return, and note that when an offer is rejected the problem basically starts anew and so the maximal expected additional net return from then on is $m$. But this implies that it is optimal to accept an offer if and only if it is at least as large as $m$, showing that $m$ is the optimal critical value. ■
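For a concrete illustration of the optimality equation (a sketch under an assumed offer distribution, not part of the original example): if offers are exponential with mean 1, then $\int_y^\infty (x-y)e^{-x}\,dx = e^{-y}$, so $e^{-y^*} = c/\lambda$ gives $y^* = \ln(\lambda/c)$ in closed form, which we can use to check a generic numerical solver:

```python
import math

# Assumed offer law: X ~ Exp(1), so ∫_y^∞ (x-y)e^{-x} dx = e^{-y} and the
# optimality equation has closed-form solution y* = ln(λ/c).
lam, c = 2.0, 0.5

def excess(y, n=20_000, upper=50.0):
    # trapezoid rule for ∫_y^upper (x - y) e^{-x} dx
    h = (upper - y) / n
    vals = [i * h * math.exp(-(y + i * h)) for i in range(n + 1)]
    return h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

# bisection on the decreasing function excess(y) - c/λ
lo, hi = 0.0, 20.0
for _ in range(50):
    mid = (lo + hi) / 2
    if excess(mid) > c / lam:
        lo = mid
    else:
        hi = mid
y_star = (lo + hi) / 2
print(f"numerical y* {y_star:.4f}  vs  closed form {math.log(lam / c):.4f}")
```

Both values come out near $\ln 4 \approx 1.386$, which, by the argument above, is also the maximal expected net return under these assumed parameters.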

It follows from Proposition 5.2 that if each of a Poisson number of individuals is independently classified into one of two possible groups with respective probabilities $p$ and $1-p$, then the number of individuals in each of the two groups will be independent Poisson random variables. Because this result easily generalizes to the case where the classification is into any one of $r$ possible groups, we have the following application to a model of employees moving about in an organization.

Example 5.16

Consider a system in which individuals at any time are classified as being in one of $r$ possible states, and assume that an individual changes states in accordance with a Markov chain having transition probabilities $P_{ij}$, $i,j = 1, \ldots, r$. That is, if an individual is in state $i$ during a time period then, independently of its previous states, it will be in state $j$ during the next time period with probability $P_{ij}$. The individuals are assumed to move through the system independently of each other. Suppose that the numbers of people initially in states $1, 2, \ldots, r$ are independent Poisson random variables with respective means $\lambda_1, \lambda_2, \ldots, \lambda_r$. We are interested in determining the joint distribution of the numbers of individuals in states $1, 2, \ldots, r$ at some time $n$.

Solution: For fixed $i$, let $N_j(i)$, $j = 1, \ldots, r$, denote the number of those individuals, initially in state $i$, that are in state $j$ at time $n$. Now each of the (Poisson distributed) number of people initially in state $i$ will, independently of the others, be in state $j$ at time $n$ with probability $P^n_{ij}$, where $P^n_{ij}$ is the $n$-stage transition probability for the Markov chain having transition probabilities $P_{ij}$. Hence, the $N_j(i)$, $j = 1, \ldots, r$, will be independent Poisson random variables with respective means $\lambda_i P^n_{ij}$, $j = 1, \ldots, r$. Because the sum of independent Poisson random variables is itself a Poisson random variable, it follows that the numbers of individuals in state $j$ at time $n$, namely $\sum_{i=1}^r N_j(i)$, will be independent Poisson random variables with respective means $\sum_i \lambda_i P^n_{ij}$, for $j = 1, \ldots, r$. ■
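A small simulation makes this concrete. The two-state chain, transition matrix, and Poisson means below are assumed for illustration (states labeled 0 and 1); the script checks that the average number of individuals in state 0 at time $n$ matches $\sum_i \lambda_i P^n_{i0}$:

```python
import math
import random

# Illustrative two-state chain (my own numbers): transition matrix P,
# initial counts Poisson(λ_i). The mean count in state 0 at time n
# should equal λ0*P^n[0][0] + λ1*P^n[1][0].
random.seed(2)
P = [[0.7, 0.3], [0.4, 0.6]]
lam = [3.0, 2.0]
n = 4

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Pn = [[1.0, 0.0], [0.0, 1.0]]
for _ in range(n):
    Pn = mat_mul(Pn, P)               # n-stage transition probabilities
mean_state0 = lam[0] * Pn[0][0] + lam[1] * Pn[1][0]

def poisson(mu):                      # inversion sampling, fine for small mu
    x, p = 0, math.exp(-mu)
    c, u = p, random.random()
    while u > c:
        x += 1
        p *= mu / x
        c += p
    return x

trials, total = 50_000, 0
for _ in range(trials):
    for i in (0, 1):
        for _ in range(poisson(lam[i])):
            state = i
            for _ in range(n):        # move each individual independently
                state = 0 if random.random() < P[state][0] else 1
            total += (state == 0)

print(f"simulated mean {total / trials:.3f}  vs  exact {mean_state0:.4f}")
```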

Example 5.17

The Coupon Collecting Problem

There are $m$ different types of coupons. Each time a person collects a coupon it is, independently of those previously obtained, a type $j$ coupon with probability $p_j$, where $\sum_{j=1}^m p_j = 1$. Let $N$ denote the number of coupons one needs to collect in order to have a complete collection of at least one of each type. Find $E[N]$.

Solution: If we let $N_j$ denote the number one must collect to obtain a type $j$ coupon, then we can express $N$ as

$$
N = \max_{1 \le j \le m} N_j
$$

However, even though each $N_j$ is geometric with parameter $p_j$, the foregoing representation of $N$ is not that useful, because the random variables $N_j$ are not independent.

We can, however, transform the problem into one of determining the expected value of the maximum of independent random variables. To do so, suppose that coupons are collected at times chosen according to a Poisson process with rate $\lambda = 1$. Say that an event of this Poisson process is of type $j$, $1 \le j \le m$, if the coupon obtained at that time is a type $j$ coupon. If we now let $N_j(t)$ denote the number of type $j$ coupons collected by time $t$, then it follows from Proposition 5.2 that $\{N_j(t), t \ge 0\}$, $j = 1, \ldots, m$, are independent Poisson processes with respective rates $\lambda p_j = p_j$. Let $X_j$ denote the time of the first event of the $j$th process, and let

$$
X = \max_{1 \le j \le m} X_j
$$

denote the time at which a complete collection is amassed. Since the $X_j$ are independent exponential random variables with respective rates $p_j$, it follows that

$$
P\{X < t\} = P\Bigl\{\max_{1 \le j \le m} X_j < t\Bigr\} = P\{X_j < t \text{ for } j = 1, \ldots, m\} = \prod_{j=1}^m \bigl(1 - e^{-p_j t}\bigr)
$$

Therefore,

$$
E[X] = \int_0^\infty P\{X > t\}\,dt = \int_0^\infty \Bigl(1 - \prod_{j=1}^m \bigl(1 - e^{-p_j t}\bigr)\Bigr)dt \tag{5.15}
$$

It remains to relate $E[X]$, the expected time until one has a complete set, to $E[N]$, the expected number of coupons it takes. This can be done by letting $T_i$ denote the $i$th interarrival time of the Poisson process that counts the number of coupons obtained. Then it is easy to see that

$$
X = \sum_{i=1}^N T_i
$$

Since the $T_i$ are independent exponentials with rate $1$, and $N$ is independent of the $T_i$, we see that

$$
E[X \mid N] = N E[T_i] = N
$$

Therefore,

$$
E[X] = E[N]
$$

and so $E[N]$ is as given in Equation (5.15).
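Equation (5.15) is easy to evaluate numerically. As a sketch, take equal probabilities $p_j = 1/m$, for which $E[N]$ is known independently to equal $m(1 + \frac{1}{2} + \cdots + \frac{1}{m})$:

```python
import math

# Numerical check of Equation (5.15) in the equal-probability case p_j = 1/m,
# where E[N] = m * (1 + 1/2 + ... + 1/m) by the classical coupon argument.
m = 6
p = [1 / m] * m

def tail(t):          # P{X > t} = 1 - prod_j (1 - e^{-p_j t})
    prod = 1.0
    for pj in p:
        prod *= 1.0 - math.exp(-pj * t)
    return 1.0 - prod

# trapezoid rule on [0, 400]; the integrand decays like m * e^{-t/m}
n, upper = 100_000, 400.0
h = upper / n
e_n = h * (0.5 * (tail(0.0) + tail(upper)) +
           sum(tail(i * h) for i in range(1, n)))

harmonic = m * sum(1 / k for k in range(1, m + 1))
print(f"integral {e_n:.3f}  vs  m*H_m {harmonic:.3f}")  # both ≈ 14.7
```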
Let us now compute the expected number of types that appear only once in the complete collection. Letting $I_i$ equal $1$ if there is only a single type $i$ coupon in the final set, and letting it equal $0$ otherwise, we thus want

$$
E\Bigl[\sum_{i=1}^m I_i\Bigr] = \sum_{i=1}^m E[I_i] = \sum_{i=1}^m P\{I_i = 1\}
$$

Now there will be a single type $i$ coupon in the final set if a coupon of each type has appeared before the second coupon of type $i$ is obtained. Thus, letting $S_i$ denote the time at which the second type $i$ coupon is obtained, we have

$$
P\{I_i = 1\} = P\{X_j < S_i \text{ for all } j \ne i\}
$$

Using that $S_i$ has a gamma distribution with parameters $(2, p_i)$, this yields

$$
\begin{aligned}
P\{I_i = 1\} &= \int_0^\infty P\{X_j < S_i \text{ for all } j \ne i \mid S_i = x\}\, p_i e^{-p_i x}\, p_i x\,dx\\
&= \int_0^\infty P\{X_j < x \text{ for all } j \ne i\}\, p_i^2 x e^{-p_i x}\,dx\\
&= \int_0^\infty \prod_{j \ne i}\bigl(1 - e^{-p_j x}\bigr)\, p_i^2 x e^{-p_i x}\,dx
\end{aligned}
$$

Therefore, we have the result

$$
\begin{aligned}
E\Bigl[\sum_{i=1}^m I_i\Bigr] &= \int_0^\infty \sum_{i=1}^m \prod_{j \ne i}\bigl(1 - e^{-p_j x}\bigr)\, p_i^2 x e^{-p_i x}\,dx\\
&= \int_0^\infty x \prod_{j=1}^m \bigl(1 - e^{-p_j x}\bigr) \sum_{i=1}^m \frac{p_i^2 e^{-p_i x}}{1 - e^{-p_i x}}\,dx \qquad\blacksquare
\end{aligned}
$$

The next probability calculation related to Poisson processes that we shall determine is the probability that $n$ events occur in one Poisson process before $m$ events have occurred in a second and independent Poisson process. More formally, let $\{N_1(t), t \ge 0\}$ and $\{N_2(t), t \ge 0\}$ be two independent Poisson processes having respective rates $\lambda_1$ and $\lambda_2$. Also, let $S_n^1$ denote the time of the $n$th event of the first process, and $S_m^2$ the time of the $m$th event of the second process. We seek

$$
P\{S_n^1 < S_m^2\}
$$

Before attempting to calculate this for general $n$ and $m$, let us consider the special case $n = m = 1$. Since $S_1^1$, the time of the first event of the $N_1(t)$ process, and $S_1^2$, the time of the first event of the $N_2(t)$ process, are both exponentially distributed random variables (by Proposition 5.1) with respective means $1/\lambda_1$ and $1/\lambda_2$, it follows from Section 5.2.3 that

$$
P\{S_1^1 < S_1^2\} = \frac{\lambda_1}{\lambda_1 + \lambda_2} \tag{5.16}
$$

Let us now consider the probability that two events occur in the $N_1(t)$ process before a single event has occurred in the $N_2(t)$ process, that is, $P\{S_2^1 < S_1^2\}$. To calculate this we reason as follows: In order for the $N_1(t)$ process to have two events before a single event occurs in the $N_2(t)$ process, it is first necessary for the initial event that occurs to be an event of the $N_1(t)$ process (and this occurs, by Equation (5.16), with probability $\lambda_1/(\lambda_1 + \lambda_2)$). Now, given that the initial event is from the $N_1(t)$ process, the next thing that must occur for $S_2^1$ to be less than $S_1^2$ is for the second event also to be an event of the $N_1(t)$ process. However, when the first event occurs both processes start all over again (by the memoryless property of Poisson processes) and hence this conditional probability is also $\lambda_1/(\lambda_1 + \lambda_2)$; thus, the desired probability is given by

$$
P\{S_2^1 < S_1^2\} = \left(\frac{\lambda_1}{\lambda_1 + \lambda_2}\right)^2
$$

In fact, this reasoning shows that each event that occurs is going to be an event of the $N_1(t)$ process with probability $\lambda_1/(\lambda_1 + \lambda_2)$ or an event of the $N_2(t)$ process with probability $\lambda_2/(\lambda_1 + \lambda_2)$, independent of all that has previously occurred. In other words, the probability that the $N_1(t)$ process reaches $n$ before the $N_2(t)$ process reaches $m$ is just the probability that $n$ heads will appear before $m$ tails if one flips a coin having probability $p = \lambda_1/(\lambda_1 + \lambda_2)$ of a head appearing. But by noting that this event will occur if and only if the first $n + m - 1$ tosses result in $n$ or more heads, we see that our desired probability is given by

$$
P\{S_n^1 < S_m^2\} = \sum_{k=n}^{n+m-1} \binom{n+m-1}{k} \left(\frac{\lambda_1}{\lambda_1 + \lambda_2}\right)^k \left(\frac{\lambda_2}{\lambda_1 + \lambda_2}\right)^{n+m-1-k}
$$

5.3.5 Conditional Distribution of the Arrival Times

Suppose we are told that exactly one event of a Poisson process has taken place by time $t$, and we are asked to determine the distribution of the time at which the event occurred. Now, since a Poisson process possesses stationary and independent increments it seems reasonable that each interval in $[0,t]$ of equal length should have the same probability of containing the event. In other words, the time of the event should be uniformly distributed over $[0,t]$. This is easily checked since, for $s \le t$,

$$
\begin{aligned}
P\{T_1 < s \mid N(t) = 1\} &= \frac{P\{T_1 < s,\, N(t) = 1\}}{P\{N(t) = 1\}}\\
&= \frac{P\{1 \text{ event in } [0,s),\ 0 \text{ events in } [s,t]\}}{P\{N(t) = 1\}}\\
&= \frac{P\{1 \text{ event in } [0,s)\}\, P\{0 \text{ events in } [s,t]\}}{P\{N(t) = 1\}}\\
&= \frac{\lambda s e^{-\lambda s}\, e^{-\lambda(t-s)}}{\lambda t e^{-\lambda t}}\\
&= \frac{s}{t}
\end{aligned}
$$

This result may be generalized, but before doing so we need to introduce the concept of order statistics.

Let $Y_1, Y_2, \ldots, Y_n$ be $n$ random variables. We say that $Y_{(1)}, Y_{(2)}, \ldots, Y_{(n)}$ are the order statistics corresponding to $Y_1, Y_2, \ldots, Y_n$ if $Y_{(k)}$ is the $k$th smallest value among $Y_1, \ldots, Y_n$, $k = 1, 2, \ldots, n$. For instance, if $n = 3$ and $Y_1 = 4$, $Y_2 = 5$, $Y_3 = 1$ then $Y_{(1)} = 1$, $Y_{(2)} = 4$, $Y_{(3)} = 5$. If the $Y_i$, $i = 1, \ldots, n$, are independent identically distributed continuous random variables with probability density $f$, then the joint density of the order statistics $Y_{(1)}, Y_{(2)}, \ldots, Y_{(n)}$ is given by

$$
f(y_1, y_2, \ldots, y_n) = n! \prod_{i=1}^n f(y_i), \qquad y_1 < y_2 < \cdots < y_n
$$

The preceding follows since

(i) $(Y_{(1)}, Y_{(2)}, \ldots, Y_{(n)})$ will equal $(y_1, y_2, \ldots, y_n)$ if $(Y_1, Y_2, \ldots, Y_n)$ is equal to any of the $n!$ permutations of $(y_1, y_2, \ldots, y_n)$;

and

(ii) the probability density that $(Y_1, Y_2, \ldots, Y_n)$ is equal to $(y_{i_1}, \ldots, y_{i_n})$ is $\prod_{j=1}^n f(y_{i_j}) = \prod_{j=1}^n f(y_j)$ when $i_1, \ldots, i_n$ is a permutation of $1, 2, \ldots, n$.

If the $Y_i$, $i = 1, \ldots, n$, are uniformly distributed over $(0,t)$, then we obtain from the preceding that the joint density function of the order statistics $Y_{(1)}, Y_{(2)}, \ldots, Y_{(n)}$ is

$$
f(y_1, y_2, \ldots, y_n) = \frac{n!}{t^n}, \qquad 0 < y_1 < y_2 < \cdots < y_n < t
$$

We are now ready for the following useful theorem.

Theorem 5.2

Given that $N(t) = n$, the $n$ arrival times $S_1, \ldots, S_n$ have the same distribution as the order statistics corresponding to $n$ independent random variables uniformly distributed on the interval $(0,t)$.

Proof

To obtain the conditional density of $S_1, \ldots, S_n$ given that $N(t) = n$, note that for $0 < s_1 < \cdots < s_n < t$ the event that $S_1 = s_1, S_2 = s_2, \ldots, S_n = s_n, N(t) = n$ is equivalent to the event that the first $n+1$ interarrival times satisfy $T_1 = s_1, T_2 = s_2 - s_1, \ldots, T_n = s_n - s_{n-1}, T_{n+1} > t - s_n$. Hence, using Proposition 5.1, we have that the conditional joint density of $S_1, \ldots, S_n$ given that $N(t) = n$ is as follows:

$$
\begin{aligned}
f(s_1, \ldots, s_n \mid n) &= \frac{f(s_1, \ldots, s_n, n)}{P\{N(t) = n\}}\\
&= \frac{\lambda e^{-\lambda s_1}\, \lambda e^{-\lambda(s_2 - s_1)} \cdots \lambda e^{-\lambda(s_n - s_{n-1})}\, e^{-\lambda(t - s_n)}}{e^{-\lambda t}(\lambda t)^n / n!}\\
&= \frac{n!}{t^n}, \qquad 0 < s_1 < \cdots < s_n < t
\end{aligned}
$$

which proves the result. ■

Remark

The preceding result is usually paraphrased as stating that, under the condition that $n$ events have occurred in $(0,t)$, the times $S_1, \ldots, S_n$ at which events occur, considered as unordered random variables, are distributed independently and uniformly in the interval $(0,t)$.
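The case $n = 1$ worked out above, $P\{T_1 < s \mid N(t) = 1\} = s/t$, can be verified empirically by generating the process from its exponential interarrival times and conditioning on exactly one event in $[0,t]$ (a sketch with arbitrary parameters):

```python
import random

# Empirical check that P{T1 < s | N(t) = 1} = s/t: generate the process from
# exponential interarrival times and keep only runs with exactly one event.
random.seed(3)
lam, t, s = 1.5, 2.0, 0.5
hits = total = 0
for _ in range(400_000):
    arrivals = []
    clock = random.expovariate(lam)
    while clock <= t:
        arrivals.append(clock)
        clock += random.expovariate(lam)
    if len(arrivals) == 1:          # condition on N(t) = 1
        total += 1
        hits += (arrivals[0] < s)
print(f"conditional frequency {hits / total:.3f}  vs  s/t = {s / t:.3f}")
```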

Application of Theorem 5.2 (Sampling a Poisson Process)

In Proposition 5.2 we showed that if each event of a Poisson process is independently classified as a type I event with probability $p$ and as a type II event with probability $1-p$ then the counting processes of type I and type II events are independent Poisson processes with respective rates $\lambda p$ and $\lambda(1-p)$. Suppose now, however, that there are $k$ possible types of events and that the probability that an event is classified as a type $i$ event, $i = 1, \ldots, k$, depends on the time the event occurs. Specifically, suppose that if an event occurs at time $y$ then it will be classified as a type $i$ event, independently of anything that has previously occurred, with probability $P_i(y)$, $i = 1, \ldots, k$, where $\sum_{i=1}^k P_i(y) = 1$. Upon using Theorem 5.2 we can prove the following useful proposition.

Proposition 5.3

If $N_i(t)$, $i = 1, \ldots, k$, represents the number of type $i$ events occurring by time $t$ then $N_i(t)$, $i = 1, \ldots, k$, are independent Poisson random variables having means

$$
E[N_i(t)] = \lambda \int_0^t P_i(s)\,ds
$$

Before proving this proposition, let us first illustrate its use.

Example 5.18

An Infinite Server Queue

Suppose that customers arrive at a service station in accordance with a Poisson process with rate $\lambda$. Upon arrival the customer is immediately served by one of an infinite number of possible servers, and the service times are assumed to be independent with a common distribution $G$. What is the distribution of $X(t)$, the number of customers that have completed service by time $t$? What is the distribution of $Y(t)$, the number of customers that are being served at time $t$?

To answer the preceding questions let us agree to call an entering customer a type I customer if he completes his service by time $t$ and a type II customer if he does not complete his service by time $t$. Now, if the customer enters at time $s$, $s \le t$, then he will be a type I customer if his service time is less than $t-s$. Since the service time distribution is $G$, the probability of this will be $G(t-s)$. Similarly, a customer entering at time $s$, $s \le t$, will be a type II customer with probability $\bar G(t-s) = 1 - G(t-s)$. Hence, from Proposition 5.3 we obtain that the distribution of $X(t)$, the number of customers that have completed service by time $t$, is Poisson with mean

$$
E[X(t)] = \lambda \int_0^t G(t-s)\,ds = \lambda \int_0^t G(y)\,dy \tag{5.17}
$$

Similarly, the distribution of $Y(t)$, the number of customers being served at time $t$, is Poisson with mean

$$
E[Y(t)] = \lambda \int_0^t \bar G(t-s)\,ds = \lambda \int_0^t \bar G(y)\,dy \tag{5.18}
$$

Furthermore, $X(t)$ and $Y(t)$ are independent.
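As a sanity check on Equation (5.18), the sketch below simulates the infinite server queue with an assumed service distribution $G$ that is exponential with mean 1, for which $E[Y(t)] = \lambda(1 - e^{-t})$:

```python
import math
import random

# Infinite server queue with assumed service law G = Exp(1), so that
# E[Y(t)] = λ ∫_0^t e^{-y} dy = λ(1 - e^{-t}).
random.seed(11)
lam, t = 4.0, 2.0
trials, in_service = 100_000, 0
for _ in range(trials):
    arrival = random.expovariate(lam)
    while arrival <= t:
        if arrival + random.expovariate(1.0) > t:   # still being served at t
            in_service += 1
        arrival += random.expovariate(lam)
mean_y = in_service / trials
print(f"simulated {mean_y:.3f}  vs  exact {lam * (1 - math.exp(-t)):.3f}")
```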

Suppose now that we are interested in computing the joint distribution of $Y(t)$ and $Y(t+s)$, that is, the joint distribution of the number in the system at time $t$ and at time $t+s$. To accomplish this, say that an arrival is

type 1: if he arrives before time $t$ and completes service between $t$ and $t+s$,

type 2: if he arrives before $t$ and completes service after $t+s$,

type 3: if he arrives between $t$ and $t+s$ and completes service after $t+s$,

type 4: otherwise.

Hence, an arrival at time $y$ will be type $i$ with probability $P_i(y)$ given by

$$
\begin{aligned}
P_1(y) &= \begin{cases} G(t+s-y) - G(t-y), & \text{if } y < t\\ 0, & \text{otherwise}\end{cases}\\
P_2(y) &= \begin{cases} \bar G(t+s-y), & \text{if } y < t\\ 0, & \text{otherwise}\end{cases}\\
P_3(y) &= \begin{cases} \bar G(t+s-y), & \text{if } t < y < t+s\\ 0, & \text{otherwise}\end{cases}\\
P_4(y) &= 1 - P_1(y) - P_2(y) - P_3(y)
\end{aligned}
$$

Thus, if $N_i = N_i(s+t)$, $i = 1, 2, 3$, denotes the number of type $i$ events that occur, then from Proposition 5.3, $N_i$, $i = 1, 2, 3$, are independent Poisson random variables with respective means

$$
E[N_i] = \lambda \int_0^{t+s} P_i(y)\,dy, \qquad i = 1, 2, 3
$$

Because

$$
Y(t) = N_1 + N_2, \qquad Y(t+s) = N_2 + N_3
$$

it is now an easy matter to compute the joint distribution of $Y(t)$ and $Y(t+s)$. For instance,

$$
\begin{aligned}
\operatorname{Cov}[Y(t), Y(t+s)] &= \operatorname{Cov}(N_1 + N_2,\, N_2 + N_3)\\
&= \operatorname{Cov}(N_2, N_2) \qquad \text{by independence of } N_1, N_2, N_3\\
&= \operatorname{Var}(N_2)\\
&= \lambda \int_0^t \bar G(t+s-y)\,dy = \lambda \int_0^t \bar G(u+s)\,du
\end{aligned}
$$

where the last equality follows since the variance of a Poisson random variable equals its mean, and from the substitution $u = t - y$. Also, the joint distribution of $Y(t)$ and $Y(t+s)$ is as follows:

$$
\begin{aligned}
P\{Y(t) = i,\, Y(t+s) = j\} &= P\{N_1 + N_2 = i,\, N_2 + N_3 = j\}\\
&= \sum_{l=0}^{\min(i,j)} P\{N_2 = l,\, N_1 = i - l,\, N_3 = j - l\}\\
&= \sum_{l=0}^{\min(i,j)} P\{N_2 = l\}\, P\{N_1 = i - l\}\, P\{N_3 = j - l\} \qquad\blacksquare
\end{aligned}
$$

Example 5.19

A One Lane Road with No Overtaking

Consider a one lane road with a single entrance and a single exit point which are at distance $L$ from each other (see Figure 5.2). Suppose that cars enter this road according to a Poisson process with rate $\lambda$, and that each entering car has an attached random value $V$ which represents the velocity at which the car will travel, with the proviso that whenever the car encounters a slower moving car it must decrease its speed to that of the slower moving car. Let $V_i$ denote the velocity value of the $i$th car to enter the road, and suppose that $V_i$, $i \ge 1$, are independent and identically distributed and, in addition, are independent of the counting process of cars entering the road. Assuming that the road is empty at time $0$, we are interested in determining

(a) the probability mass function of $R(t)$, the number of cars on the road at time $t$; and

(b) the distribution of the road traversal time of a car that enters the road at time $y$.

Solution: Let $T_i = L/V_i$ denote the time it would take car $i$ to travel the road if the road were empty when car $i$ arrived. Call $T_i$ the free travel time of car $i$, and note that $T_1, T_2, \ldots$ are independent with distribution function

$$
G(x) = P(T_i \le x) = P(L/V_i \le x) = P(V_i \ge L/x)
$$

Let us say that an event occurs each time a car enters the road. Also, let $t$ be a fixed value, and say that an event occurring at time $s$ is a type 1 event if both $s \le t$ and the free travel time of the car entering the road at time $s$ exceeds $t - s$. In other words, a car entering the road is a type 1 event if the car would still be on the road at time $t$ even if the road were empty when it entered. Note that, independent of all that occurred prior to time $s$, an event occurring at time $s$ is a type 1 event with probability

$$
P(s) = \begin{cases} \bar G(t-s), & \text{if } s \le t\\ 0, & \text{if } s > t\end{cases}
$$

Letting $N_1(y)$ denote the number of type 1 events that occur by time $y$, it follows from Proposition 5.3 that $N_1(y)$ is, for $y \le t$, a Poisson random variable with mean

$$
E[N_1(y)] = \lambda \int_0^y \bar G(t-s)\,ds, \qquad y \le t
$$

Because there will be no cars on the road at time $t$ if and only if $N_1(t) = 0$, it follows that

$$
P(R(t) = 0) = P(N_1(t) = 0) = e^{-\lambda \int_0^t \bar G(t-s)\,ds} = e^{-\lambda \int_0^t \bar G(u)\,du}
$$

To determine $P(R(t) = n)$ for $n > 0$ we will condition on when the first type 1 event occurs. With $X$ equal to the time of the first type 1 event (or to $\infty$ if there are no type 1 events), its distribution function is obtained by noting that

$$
X \le y \Longleftrightarrow N_1(y) > 0
$$

thus showing that

$$
F_X(y) = P(X \le y) = P(N_1(y) > 0) = 1 - e^{-\lambda \int_0^y \bar G(t-s)\,ds}, \qquad y \le t
$$

Differentiating gives the density function of $X$:

$$
f_X(y) = \lambda \bar G(t-y)\, e^{-\lambda \int_0^y \bar G(t-s)\,ds}, \qquad y \le t
$$

To use the identity

$$
P(R(t) = n) = \int_0^t P(R(t) = n \mid X = y)\, f_X(y)\,dy \tag{5.19}
$$

note that if $X = y \le t$ then the leading car that is on the road at time $t$ entered at time $y$. Because all other cars that arrive between $y$ and $t$ will also be on the road at time $t$, it follows that, conditional on $X = y$, the number of cars on the road at time $t$ will be distributed as $1$ plus a Poisson random variable with mean $\lambda(t-y)$. Therefore, for $n > 0$,

$$
P(R(t) = n \mid X = y) = \begin{cases} e^{-\lambda(t-y)}\, \dfrac{(\lambda(t-y))^{n-1}}{(n-1)!}, & \text{if } y \le t\\[2pt] 0, & \text{if } y = \infty\end{cases}
$$

Substituting this into Equation (5.19) yields

$$
P(R(t) = n) = \int_0^t e^{-\lambda(t-y)}\, \frac{(\lambda(t-y))^{n-1}}{(n-1)!}\, \lambda \bar G(t-y)\, e^{-\lambda \int_0^y \bar G(t-s)\,ds}\,dy
$$

(b) Let $T$ be the free travel time of the car that enters the road at time $y$, and let $A(y)$ be its actual travel time. To determine $P(A(y) < x)$, let $t = y + x$ and note that $A(y)$ will be less than $x$ if and only if both $T < x$ and there have been no type 1 events (using $t = y + x$) before time $y$. That is,

$$
A(y) < x \Longleftrightarrow T < x,\ N_1(y) = 0
$$

Because $T$ is independent of what has occurred prior to time $y$, the preceding gives

$$
P(A(y) < x) = P(T < x)\, P(N_1(y) = 0) = G(x)\, e^{-\lambda \int_0^y \bar G(y+x-s)\,ds} = G(x)\, e^{-\lambda \int_x^{y+x} \bar G(u)\,du} \qquad\blacksquare
$$

Example 5.20

Tracking the Number of HIV Infections

There is a relatively long incubation period from the time when an individual becomes infected with the HIV virus, which causes AIDS, until the symptoms of the disease appear. As a result, it is difficult for public health officials to be certain of the number of members of the population that are infected at any given time. We will now present a first approximation model for this phenomenon, which can be used to obtain a rough estimate of the number of infected individuals.

Let us suppose that individuals contract the HIV virus in accordance with a Poisson process whose rate $\lambda$ is unknown. Suppose that the time from when an individual becomes infected until symptoms of the disease appear is a random variable having a known distribution $G$. Suppose also that the incubation times of different infected individuals are independent.

Let $N_1(t)$ denote the number of individuals who have shown symptoms of the disease by time $t$. Also, let $N_2(t)$ denote the number who are HIV positive but have not yet shown any symptoms by time $t$. Now, since an individual who contracts the virus at time $s$ will have symptoms by time $t$ with probability $G(t-s)$ and will not with probability $\bar G(t-s)$, it follows from Proposition 5.3 that $N_1(t)$ and $N_2(t)$ are independent Poisson random variables with respective means

$$
E[N_1(t)] = \lambda \int_0^t G(t-s)\,ds = \lambda \int_0^t G(y)\,dy
$$

and

$$
E[N_2(t)] = \lambda \int_0^t \bar G(t-s)\,ds = \lambda \int_0^t \bar G(y)\,dy
$$

Now, if we knew $\lambda$, then we could use it to estimate $N_2(t)$, the number of individuals infected but without any outward symptoms at time $t$, by its mean value $E[N_2(t)]$. However, since $\lambda$ is unknown, we must first estimate it. Now, we will presumably know the value of $N_1(t)$, and so we can use its known value as an estimate of its mean $E[N_1(t)]$. That is, if the number of individuals who have exhibited symptoms by time $t$ is $n_1$, then we can estimate that

$$
n_1 \approx E[N_1(t)] = \lambda \int_0^t G(y)\,dy
$$

Therefore, we can estimate $\lambda$ by the quantity $\hat\lambda$ given by

$$
\hat\lambda = n_1 \Big/ \int_0^t G(y)\,dy
$$

Using this estimate of $\lambda$, we can estimate the number of infected but symptomless individuals at time $t$ by

$$
\text{estimate of } N_2(t) = \hat\lambda \int_0^t \bar G(y)\,dy = \frac{n_1 \int_0^t \bar G(y)\,dy}{\int_0^t G(y)\,dy}
$$

For example, suppose that $G$ is exponential with mean $\mu$. Then $\bar G(y) = e^{-y/\mu}$, and a simple integration gives that

$$
\text{estimate of } N_2(t) = \frac{n_1 \mu \bigl(1 - e^{-t/\mu}\bigr)}{t - \mu\bigl(1 - e^{-t/\mu}\bigr)}
$$

If we suppose that $t = 16$ years, $\mu = 10$ years, and $n_1 = 220$ thousand, then the estimate of the number of infected but symptomless individuals at time 16 is

$$
\text{estimate} = \frac{2{,}200\bigl(1 - e^{-1.6}\bigr)}{16 - 10\bigl(1 - e^{-1.6}\bigr)} = 218.96
$$

That is, if we suppose that the foregoing model is approximately correct (and we should be aware that the assumption of a constant infection rate $\lambda$ that is unchanging over time is almost certainly a weak point of the model), then if the incubation period is exponential with mean 10 years and if the total number of individuals who have exhibited AIDS symptoms during the first 16 years of the epidemic is 220 thousand, then we can expect that approximately 219 thousand individuals are HIV positive though symptomless at time 16. ■
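The closing arithmetic is a one-liner to reproduce:

```python
import math

# t = 16 years, incubation mean μ = 10 years, n1 = 220 thousand symptomatic
t, mu, n1 = 16.0, 10.0, 220.0
estimate = n1 * mu * (1 - math.exp(-t / mu)) / (t - mu * (1 - math.exp(-t / mu)))
print(round(estimate, 2))  # 218.96
```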

Proof of Proposition 5.3

Let us compute the joint probability $P\{N_i(t) = n_i,\, i = 1, \ldots, k\}$. To do so note first that in order for there to have been $n_i$ type $i$ events for $i = 1, \ldots, k$ there must have been a total of $\sum_{i=1}^k n_i$ events. Hence, conditioning on $N(t)$ yields

$$
\begin{aligned}
P\{N_1(t) = n_1, \ldots, N_k(t) = n_k\} = {}& P\Bigl\{N_1(t) = n_1, \ldots, N_k(t) = n_k \,\Bigl|\, N(t) = \sum_{i=1}^k n_i\Bigr\}\\
&\times P\Bigl\{N(t) = \sum_{i=1}^k n_i\Bigr\}
\end{aligned}
$$

Now consider an arbitrary event that occurred in the interval $[0,t]$. If it had occurred at time $s$, then the probability that it would be a type $i$ event would be $P_i(s)$. Hence, since by Theorem 5.2 this event will have occurred at some time uniformly distributed on $[0,t]$, it follows that the probability that this event will be a type $i$ event is

$$
P_i = \frac{1}{t} \int_0^t P_i(s)\,ds
$$

independently of the other events. Hence,

$$
P\Bigl\{N_i(t) = n_i,\, i = 1, \ldots, k \,\Bigl|\, N(t) = \sum_{i=1}^k n_i\Bigr\}
$$

will just equal the multinomial probability of $n_i$ type $i$ outcomes for $i = 1, \ldots, k$ when each of $\sum_{i=1}^k n_i$ independent trials results in outcome $i$ with probability $P_i$, $i = 1, \ldots, k$. That is,

PN1(t)=n1,,Nk(t)=nkN(t)=i=1kni=i=1kni!n1!nk!P1n1Pknk

image

Consequently,

$$
\begin{aligned}
P\{N_1(t) = n_1, \ldots, N_k(t) = n_k\} &= \frac{\bigl(\sum_i n_i\bigr)!}{n_1! \cdots n_k!}\, P_1^{n_1} \cdots P_k^{n_k}\, e^{-\lambda t} \frac{(\lambda t)^{\sum_i n_i}}{\bigl(\sum_i n_i\bigr)!}\\
&= \prod_{i=1}^k e^{-\lambda t P_i} (\lambda t P_i)^{n_i} / n_i!
\end{aligned}
$$

and the proof is complete. ■