The notion of martingales and related concepts seem to have originated in studies of games of chance similar to the following. Suppose
Y_n = a gambler's "gain" on the nth play of a game
Y_0 = the original capital or "bankroll"
Set
X_n = Y_0 + Y_1 + ⋯ + Y_n, with X_n = 0 for n < 0
Thus, X_n is the capital after n plays, and Y_n = X_n − X_{n−1} for all n ∈ N.
Put Y_N = {Y_n : n ∈ N} and X_N = {X_n : n ∈ N}.
For any n ∈ N, (X_0, X_1, ⋯, X_n) is a function of (Y_0, Y_1, ⋯, Y_n) and (Y_0, Y_1, ⋯, Y_n) is a function of (X_0, X_1, ⋯, X_n), or, equivalently, each history determines the other. Hence, for any integrable random variable H,
E[H | X_0, X_1, ⋯, X_n] = E[H | Y_0, Y_1, ⋯, Y_n]
If Y_N is an independent class with E[Y_n] = 0 for all n ≥ 1, the game is considered fair. In this case, we have by (CE5), (CE7), and hypothesis
E[X_{n+1} | Y_0, Y_1, ⋯, Y_n] = E[X_n | Y_0, ⋯, Y_n] + E[Y_{n+1} | Y_0, ⋯, Y_n] = X_n + E[Y_{n+1}] = X_n a.s.
Also
E[X_{n+1}] = E[X_n] = ⋯ = E[X_0] = E[Y_0]
Gamblers seek to develop a “system” to improve expected earnings. We examine some typical approaches and show their futility. To keep the analysis simple, consider a simple coin-flipping game. Let
H_k = the event of a "head" on the kth component trial
T_k = H_k^c = the event of a "tail" on the kth component trial
The player has a system. He decides how much to bet on each play from the pattern of previous events. Let Y_n be the result of the nth play, where |B_n| is the amount of the bet; B_n > 0 indicates a bet on a head; B_n < 0 indicates a bet on a tail; B_n = 0 indicates a decision not to bet.
Systems take various forms; here we consider two possibilities.
The amount of the bet is determined by the pattern of outcomes of previous tosses
The amount bet is determined by the pattern of previous payoffs
Let Y_0 = X_0 = C, a constant. Since C is independent of any random variable, E[H|C] = E[H]. In either scheme, the amount of the next bet is determined by the past, while the outcome of the next toss is independent of the past and equally likely to be a head or a tail. Hence, by (CE8), (CI5), and this fact,
E[Y_{n+1} | Y_0, Y_1, ⋯, Y_n] = 0 a.s.
It follows that
E[X_{n+1} | Y_0, Y_1, ⋯, Y_n] = X_n a.s., just as for constant bets.
The “fairness” of the game is not altered by the betting scheme, since decisions must be based on past performance. In spite of simple beginnings, the extension and analysis of these patterns form a major thrust of modern probability theory.
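A quick numerical illustration of this futility (an added sketch, not part of the original text): simulate a fair coin and two representative betting rules that look only at past outcomes. The particular rules ("double after a loss", "follow the streak") are hypothetical examples; the average final capital stays at the starting bankroll in each case.

```python
import random

def play(n_tosses, bet_rule, bankroll=100.0, seed=None):
    """Simulate one session of a fair coin game with a betting rule.

    bet_rule(history) -> signed bet B: B > 0 bets on heads, B < 0 on tails,
    B = 0 means no bet.  history lists previous outcomes (+1 head, -1 tail).
    """
    rng = random.Random(seed)
    capital = bankroll
    history = []
    for _ in range(n_tosses):
        B = bet_rule(history)
        toss = 1 if rng.random() < 0.5 else -1   # fair coin
        capital += B * toss                      # gain B if the bet is right, -B if not
        history.append(toss)
    return capital

# Two "systems" based only on past outcomes (illustrative assumptions):
double_after_loss = lambda h: 1.0 if (not h or h[-1] == 1) else 2.0  # bet on heads, double after a tail
follow_the_streak = lambda h: float(h[-1]) if h else 1.0             # bet on whatever just happened

for name, rule in [("double after loss", double_after_loss), ("follow the streak", follow_the_streak)]:
    trials = [play(50, rule, seed=k) for k in range(20000)]
    print(name, "average final capital:", sum(trials) / len(trials))   # ~ 100 in both cases
```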
In the following treatment,
X_N = {X_n : n ∈ N} is the basic sequence
Y_N = {Y_n : n ∈ N}, with Y_0 = X_0 and Y_n = X_n − X_{n−1} for n ≥ 1, is the incremental sequence
We suppose Z_N is a decision sequence and X_N ∼ Z_N; that is, each X_n is a function of W_n = (Z_0, Z_1, ⋯, Z_n).
X_N ∼ Z_N iff Y_N ∼ Z_N
If X_N ∼ H_N and H_N ∼ Z_N, then X_N ∼ Z_N. In particular, if , then X_N ∼ H_N.
Definition. If X_N is integrable and Z_N is a decision sequence, then
X_N is a martingale (MG) relative to Z_N iff E[X_{n+1} | W_n] = X_n a.s. for all n ∈ N
X_N is a submartingale (SMG) relative to Z_N iff E[X_{n+1} | W_n] ≥ X_n a.s. for all n ∈ N
X_N is a supermartingale (SRMG) relative to Z_N iff E[X_{n+1} | W_n] ≤ X_n a.s. for all n ∈ N
Notation. When we write (X_N, Z_N) is a martingale (submartingale, supermartingale), we are asserting X_N is integrable, Z_N is a decision sequence, X_N ∼ Z_N, and X_N is a MG (SMG, SRMG) relative to Z_N.
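As an informal numerical check of the defining condition (an added illustration, not part of the original development), simulate a fair ±1 random walk, group the transitions by the current value, and compare the average next value with the current one; for a martingale the two agree up to sampling error. The walk and sample sizes are arbitrary choices.

```python
import random
from collections import defaultdict

rng = random.Random(1)
n_paths, n_steps = 100000, 6

# For a fair +/-1 walk, group each observed transition X_n -> X_{n+1} by the value of X_n.
next_given_current = defaultdict(list)
for _ in range(n_paths):
    x = 0
    for _ in range(n_steps):
        x_next = x + (1 if rng.random() < 0.5 else -1)
        next_given_current[x].append(x_next)
        x = x_next

# For a martingale, E[X_{n+1} | X_n = x] = x; the sample averages should be close to x.
for x in sorted(next_given_current):
    vals = next_given_current[x]
    if len(vals) >= 5000:
        print(x, round(sum(vals) / len(vals), 3))
```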
Definition. If Y_N is integrable and Z_N is a decision sequence, then
Y_N is absolutely fair relative to Z_N iff E[Y_{n+1} | W_n] = 0 a.s. for all n ∈ N
Y_N is favorable relative to Z_N iff E[Y_{n+1} | W_n] ≥ 0 a.s. for all n ∈ N
Y_N is unfavorable relative to Z_N iff E[Y_{n+1} | W_n] ≤ 0 a.s. for all n ∈ N
Notation. When we write (Y_N, Z_N) is absolutely fair (favorable, unfavorable), we are asserting Y_N is integrable, Z_N is a decision sequence, Y_N ∼ Z_N, and Y_N is absolutely fair (favorable, unfavorable) relative to Z_N.
Theorem 1.1. IXA2-1
If XN is a basic sequence and YN is the corresponding incremental sequence, then
(X_N, Z_N) is a martingale iff (Y_N, Z_N) is absolutely fair.
(X_N, Z_N) is a submartingale iff (Y_N, Z_N) is favorable.
(X_N, Z_N) is a supermartingale iff (Y_N, Z_N) is unfavorable.
Remarks
(X_N, Z_N) is a SMG iff (−X_N, Z_N) is a SRMG
We write (S)MG to indicate the same statement can be made for a MG or a SMG with the appropriate choice of = or ≥
We write ( ≥ ) to indicate simultaneously two cases:
read as = in all places (for a MG)
read as ≥ in all places (for a SMG)
Theorem 1.2. IXA3-1
Theorem 1.3. IXA3-2
For integrable X_N ∼ Z_N, the following are equivalent:
Corollary 1.1. IXA3-3
If (X_N, Z_N) is a (S)MG, then
Theorem 1.4. IXA3-4
(X_N, Z_N) is a (S)MG iff
EXERCISE. Note
Theorem 1.5. IXA3-5
If (X_N, Z_N) is an L^2 MG, then
by (CE1b), (CE8), and Thm IXA3-4
As in b, since X_n − X_m ∼ W_n
Suppose p < q. Then, since X_p ∼ W_p, by definition of MG.
For q < p, interchange p and q in the argument above.
by d, above
By c, for j ≠ k. Hence,
A variety of weighted sums of increments are useful.
Theorem 1.6. IXA3-6
Suppose (X_N, Z_N) is a (S)MG and Y_N is the incremental sequence. Let H_0 be a (nonnegative) constant and let H_n, determined by W_{n−1} for n ≥ 1, be bounded (nonnegative). Set
X*_n = H_0 Y_0 + H_1 Y_1 + ⋯ + H_n Y_n, n ∈ N
Then (X*_N, Z_N) is a (S)MG.
by (CE8)
For MG case: for arbitrary bounded H_n
For SMG case: for H_n ≥ 0, bounded
The conclusion follows from Theorem 1.1. IXA2-1
Remark. This result extends the pattern in the introductory gambling example (compare Theorem 1.4 IXA3-4 and Corollary 1.1 IXA3-3).
Some important special cases
Theorem 1.8. IXA3-8
Suppose integrable X_N ∼ Z_N. If X_{n+1} − X_n ( ≥ ) 0 a.s. for all n ∈ N, then (X_N, Z_N) is a (S)MG.
Apply monotonicity and Theorem IXA3-4
Theorem 1.9. IXA3-9
Suppose X_N has independent increments.
If E[X_n] is constant, invariant with n, then X_N is a MG.
If E[X_{n+1}] = ( ≥ ) E[X_n] for all n ∈ N, then X_N is a (S)MG.
For any n, by independent increments, the pair {X_{n+1} − X_n, W_n} is independent. Hence,
E[X_{n+1} − X_n | W_n] = E[X_{n+1} − X_n] a.s.
The desired result follows from Theorem IXA3-2(c).
Theorem 1.10. IXA3-10
Suppose g is a convex Borel function on an interval I which contains the range of all X_n, and suppose g(X_n) is integrable for each n ∈ N. Let g(X_N) denote the sequence {g(X_n) : n ∈ N}.
If (X_N, Z_N) is a MG, then (g(X_N), Z_N) is a SMG.
If g is nondecreasing and (X_N, Z_N) is a SMG, then so is (g(X_N), Z_N).
a. By Jensen's inequality and the definition of a MG,
E[g(X_{n+1}) | W_n] ≥ g(E[X_{n+1} | W_n]) = g(X_n) a.s. (1.20)
b. By Jensen's inequality,
E[g(X_{n+1}) | W_n] ≥ g(E[X_{n+1} | W_n]) a.s. (1.21)
Since E[X_{n+1} | W_n] ≥ X_n a.s. and g is nondecreasing,
g(E[X_{n+1} | W_n]) ≥ g(X_n) a.s. (1.22)
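To see the submartingale effect of a convex g concretely (an added illustration; the fair ±1 walk is an arbitrary choice): along such a walk E[X_n] stays at 0, while the means of |X_n| and X_n^2 are nondecreasing in n, as the theorem predicts for convex images of a MG.

```python
import random

rng = random.Random(7)
n_paths, n_steps = 50000, 10

sums = [[0.0, 0.0, 0.0] for _ in range(n_steps)]   # running totals of X_n, |X_n|, X_n**2
for _ in range(n_paths):
    x = 0
    for n in range(n_steps):
        x += 1 if rng.random() < 0.5 else -1
        sums[n][0] += x
        sums[n][1] += abs(x)
        sums[n][2] += x * x

for n, (s, sa, sq) in enumerate(sums, start=1):
    # E[X_n] stays near 0; E[|X_n|] and E[X_n^2] are nondecreasing in n (E[X_n^2] = n).
    print(n, round(s / n_paths, 3), round(sa / n_paths, 3), round(sq / n_paths, 3))
```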
Some commonly utilized convex functions
g(t) = |t|
g(t) = t^2; g is increasing for t ≥ 0
g(t) = u(t) t, where u is the unit step function; g is nondecreasing for all t
g nonincreasing for all t
g is increasing for all t
Theorem 1.11. IXA3-11
Consider integrable X_N ∼ Z_N.
If and , then is a MG.
If and , then is a SMG.
The restriction a > 0 is needed in the ≥ case.
Theorem 1.12. A4-1 Sums of Independent Random Variables
Suppose Y_N is an independent, integrable sequence. Set
X_n = Y_0 + Y_1 + ⋯ + Y_n, n ∈ N
If E[Y_n] = ( ≥ ) 0 for all n ≥ 1, then X_N is a (S)MG.
Theorem 1.13. A4-2 Products of nonnegative random variables
Suppose Y_N ∼ Z_N, with Y_n ≥ 0 a.s. for all n. Consider
X_n = Y_0 Y_1 ⋯ Y_n, n ∈ N
If E[Y_{n+1} | W_n] = ( ≥ ) 1 a.s. for all n, then (X_N, Z_N) is a (S)MG.
X_n ∼ W_n and X_{n+1} = Y_{n+1} X_n. Hence,
E[X_{n+1} | W_n] = X_n E[Y_{n+1} | W_n] = ( ≥ ) X_n a.s.
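A small sketch of this product construction (an added illustration; the two-point distribution is an arbitrary choice): multiply independent nonnegative factors with mean 1 and the running product keeps a constant mean, as a martingale should.

```python
import random

rng = random.Random(3)
n_paths, n_steps = 100000, 8

# Independent factors Y with Y >= 0 and E[Y] = 1: here Y is 0.5 or 1.5 with equal probability.
totals = [0.0] * n_steps
for _ in range(n_paths):
    x = 1.0
    for n in range(n_steps):
        x *= 0.5 if rng.random() < 0.5 else 1.5
        totals[n] += x

print([round(t / n_paths, 3) for t in totals])   # each entry is close to 1.0
```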
Theorem 1.14. A4-3 Discrete random walk
Consider Y_0 = 0 and {Y_n : n ≥ 1} iid. Set X_n = Y_0 + Y_1 + ⋯ + Y_n. Suppose the Y_n are integer-valued, and let g_Y(s) = E[s^{Y_n}] for s > 0.
Now g_Y is convex on (0, ∞), with g_Y(1) = 1.
Hence, g_Y(s) = 1 has at most two roots, one of which is s = 1.
s = 1 is a minimum point iff E[Y_n] = 0, in which case X_N is a MG (see A4-1).
If g_Y(r) = 1 for 0 < r < 1, then E[r^{Y_n}] = 1. Let
Z_n = r^{X_n} = r^{Y_1} r^{Y_2} ⋯ r^{Y_n}
By A4-2, Z_N is a MG.
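A concrete instance (an added sketch; the ±1 step distribution is an assumed special case, not stated in the text): for a walk with P(Y = 1) = p and P(Y = −1) = q = 1 − p, g_Y(s) = ps + q/s, so the second root of g_Y(s) = 1 is r = q/p, and r^{X_n} keeps a constant mean.

```python
import random

p, q = 0.6, 0.4
r = q / p                      # second root of g_Y(s) = p*s + q/s = 1 (the first is s = 1)
print(p * r + q / r)           # = 1.0, so E[r**Y] = 1

rng = random.Random(11)
n_paths, n_steps = 100000, 10
totals = [0.0] * n_steps
for _ in range(n_paths):
    x = 0
    for n in range(n_steps):
        x += 1 if rng.random() < p else -1
        totals[n] += r ** x

print([round(t / n_paths, 3) for t in totals])   # each ~ 1.0: Z_n = r**X_n behaves as a MG
```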
For the MG case in Theorem IXA3-6, the Y_n are centered at conditional expectation; that is, E[Y_n | W_{n−1}] = 0 a.s. for n ≥ 1. The following is an extension of that pattern.
Theorem 1.15. A4-4 More general sums
Consider integrable Y_N ∼ Z_N and bounded H_N ∼ Z_N. Let W_n = a constant for n < 0 and H_n = 1 for n < 0. Set
X_n = Σ_{k = 0}^{n} H_{k−1} {Y_k − E[Y_k | W_{k−1}]}, n ∈ N
Then (X_N, Z_N) is a MG.
X_n ∼ W_n for all n ≥ 0 and
Theorem 1.16. A4-5 Sums of products
Suppose Y_N is absolutely fair relative to Z_N, with E[|Y_{i_1} Y_{i_2} ⋯ Y_{i_k}|] < ∞ for each choice of indices, fixed k > 0. For n ≥ k, set
X_n = Σ_{0 ≤ i_1 < i_2 < ⋯ < i_k ≤ n} Y_{i_1} Y_{i_2} ⋯ Y_{i_k}
Then (X_N, Z_N) is a MG,
X_{n+1} = X_n + K_{n+1}, where K_{n+1} = Y_{n+1} Σ_{0 ≤ i_1 < ⋯ < i_{k−1} ≤ n} Y_{i_1} ⋯ Y_{i_{k−1}}
We consider, next, some relationships with homogeneous Markov sequences.
Suppose {X_n : n ∈ N} is a homogeneous Markov sequence with finite state space E = {1, 2, ⋯, m} and transition matrix P = [p(i, j)].
A function f on E is represented by a column matrix f = [f(1) f(2) ⋯ f(m)]^T. Then f(X_n) has value f(k) when X_n = k. Pf is an m × 1 column matrix and Pf(j) is the jth element of that matrix. Consider E[f(X_{n+1}) | X_n = j]. Now
E[f(X_{n+1}) | X_n = j] = Σ_{k ∈ E} p(j, k) f(k) = Pf(j)
A nonnegative function f on E is called (super)harmonic for P iff Pf = f (Pf ≤ f).
Theorem 1.17. A4-6 Positive supermartingales and superharmonic functions.
Suppose {X_n : n ∈ N} is a homogeneous Markov sequence with finite state space E and transition matrix P. For nonnegative f on E, the sequence {f(X_n) : n ∈ N} is a positive (super)martingale P(SR)MG iff f is (super)harmonic for P.
As noted above, E[f(X_{n+1}) | X_n = j] = Pf(j).
If f is (super)harmonic, Pf(j) = ( ≤ ) f(j) for all j ∈ E, so that E[f(X_{n+1}) | X_n] = ( ≤ ) f(X_n) a.s.
If {f(X_n) : n ∈ N} is a P(SR)MG, then E[f(X_{n+1}) | X_n = j] = ( ≤ ) f(j) for all j, so that Pf = ( ≤ ) f and f is (super)harmonic.
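An added numeric illustration (the chain below is an assumed example, not from the text): for a gambler's-ruin chain on {0, 1, 2, 3} with absorbing ends and fair interior steps, f(j) = j satisfies Pf = f, so f is harmonic and f(X_n) is a martingale; any nonnegative f with Pf ≤ f elementwise would likewise give a supermartingale.

```python
# Transition matrix for a fair gambler's-ruin chain on states 0..3 (0 and 3 absorbing).
P = [
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
]
f = [0.0, 1.0, 2.0, 3.0]        # candidate harmonic function f(j) = j

Pf = [sum(P[j][k] * f[k] for k in range(4)) for j in range(4)]
print(Pf)                                      # [0.0, 1.0, 2.0, 3.0] = f, so Pf = f (harmonic)
print(all(Pf[j] <= f[j] for j in range(4)))    # True: f is (trivially) superharmonic as well
```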
An eigenfunction f and associated eigenvalue λ for P satisfy P f = λ f (i.e., (λ I – P)f = 0). In most cases, |λ| < 1. For real λ, 0 < λ < 1, the eigenfunctions are superharmonic functions. We may use the construction of Theorem IXA3-12 to obtain the associated MG.
Theorem 1.18. A4-7 Martingales induced by eigenfunctions for homogeneous Markov sequences
Let {X_n : n ∈ N} be a homogeneous Markov sequence, and f be an eigenfunction with eigenvalue λ. Put, for each n ∈ N, the random variable f(X_n)/λ^n. Then, by Theorem IXA3-12, {f(X_n)/λ^n : n ∈ N} is a MG.
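A numeric sketch of this construction (added; the 2-state matrix and the names lam, f, M_n are assumptions for illustration): find a real eigenvalue λ of P with 0 < λ < 1 and an associated eigenfunction f, confirm Pf = λf, and note that E[f(X_{n+1}) | X_n = j] = λ f(j), which is exactly what makes f(X_n)/λ^n a martingale.

```python
# Two-state homogeneous chain; P has eigenvalues 1 and 0.7.
P = [[0.9, 0.1],
     [0.2, 0.8]]
lam = 0.7
f = [1.0, -2.0]                 # eigenfunction for lam: check (P f)(j) = lam * f(j)

Pf = [sum(P[j][k] * f[k] for k in range(2)) for j in range(2)]
print(Pf, [lam * v for v in f])          # both [0.7, -1.4]

# Hence E[f(X_{n+1}) | X_n = j] = lam * f(j), so with M_n = f(X_n) / lam**n,
# E[M_{n+1} | X_n = j] = lam * f(j) / lam**(n+1) = f(j) / lam**n = M_n: a martingale.
```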
Theorem 1.19. A4-8 A dynamic programming example.
We consider a horizon of N stages and a finite state space E.
Observe the system at prescribed instants
Take action on the basis of previous states and actions.
Suppose the observed state is j and the action is a ∈ A . Two results ensue:
A return r(j,a) is realized
The system moves to a new state
Let:
Y_n = state in the nth period,
A_n = action taken on the basis of the previous states and actions (Y_0, A_0, ⋯, Y_{n−1}, A_{n−1}, Y_n)
[A_0 is the initial action based on the initial state Y_0]
A policy π is a set of functions π_0, π_1, ⋯, one for each stage, such that A_n = π_n(Y_0, A_0, ⋯, Y_{n−1}, A_{n−1}, Y_n).
The expected return under policy π, when Y_0 = j_0, is the expected value of the total return over the N stages.
The goal is to determine π to maximize this expected return.
Let Z_n = (Y_n, A_n) and W_n = (Z_0, Z_1, ⋯, Z_n). If the state sequence is Markov, then use of (CI9) and (CI11) shows that for any policy the Z-process is Markov. Hence the distribution of the next state depends only on the current state and action.
We assume time homogeneity in the sense that these transition probabilities do not depend on the period n.
We take a dynamic programming approach.
Define recursively f_N, f_{N−1}, ⋯, f_0 as follows:
f_N(j) = 0, ∀ j ∈ E. For n = N, N − 1, ⋯, 2, 1, set
Put
Then, by A4-2, is a MG, with
and
We may therefore assert
Hence, . For . If a policy π* can be found which yields equality, then π* is an optimal policy.
The following procedure leads to such a policy.
For each j ∈ E, let be the action which maximizes
Thus, .
Now, , which yields equality in the argument above. Thus, and π* is optimal.
Note that π* is a Markov policy. The functions f_n depend on the future stages, but once determined, the policy is Markov.
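A sketch of the standard finite-horizon backward recursion of this type (the names p, r, f, pi_star and the small numerical instance below are assumptions for illustration, not the text's notation): with f_N ≡ 0, each earlier value function is obtained by maximizing the one-period return plus the expected value of the next stage, and a maximizing action in each state gives a Markov policy.

```python
def backward_induction(E, A, p, r, N):
    """Finite-horizon dynamic programming (standard form; notation assumed).

    E: list of states; A: list of actions
    p[a][j][k]: probability of moving j -> k under action a (time homogeneous)
    r[j][a]: one-period return for action a in state j
    Returns value functions f[0..N] and a Markov policy pi_star[0..N-1].
    """
    f = [{j: 0.0 for j in E} for _ in range(N + 1)]      # f[N][j] = 0 for all j
    pi_star = [dict() for _ in range(N)]
    for n in range(N - 1, -1, -1):                        # stages n = N-1, ..., 0
        for j in E:
            best_a, best_v = None, float("-inf")
            for a in A:
                v = r[j][a] + sum(p[a][j][k] * f[n + 1][k] for k in E)
                if v > best_v:
                    best_a, best_v = a, v
            f[n][j], pi_star[n][j] = best_v, best_a
    return f, pi_star

# Tiny illustrative instance (assumed data): two states, two actions, three stages.
E, A, N = [0, 1], ["go", "stop"], 3
p = {"go":   [[0.2, 0.8], [0.6, 0.4]],
     "stop": [[1.0, 0.0], [0.0, 1.0]]}
r = [{"go": 1.0, "stop": 0.0},
     {"go": 2.0, "stop": 0.5}]
f, pi_star = backward_induction(E, A, p, r, N)
print(f[0], pi_star[0])
```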
Theorem 1.20. A4-9 Doob's martingale
Let X be an integrable random variable and Z_N an arbitrary sequence of random vectors. For each n, let X_n = E[X | W_n]. Then (X_N, Z_N) is a MG.
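A one-step check of the assertion (added for clarity; it is the standard tower-property argument): since W_n is a function of W_{n+1}, repeated conditioning gives
E[X_{n+1} | W_n] = E{ E[X | W_{n+1}] | W_n } = E[X | W_n] = X_n a.s.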
Theorem 1.21. A4-9a Best mean-square estimators
If X ∈ L^2, then X_n = E[X | W_n] is the best mean-square estimator of X, given W_n. (X_N, Z_N) is a MG.
Theorem 1.22. A4-9b Futures pricing
Let X_N be a sequence of "spot" prices for a commodity. Let t_0 be the present and t_0 + T be a fixed future time. The agent can be expected to know the past history U_{t_0} = (X_0, X_1, ⋯, X_{t_0}), and will update as t increases beyond t_0. Put, for 0 ≤ k ≤ T, the futures price equal to E[X_{t_0 + T} | U_{t_0 + k}], the expected futures price, given the history up to t_0 + k. Then this sequence of futures prices is a Doob's MG, with Y = X_{t_0 + T}, relative to Z_N, where Z_0 = U_{t_0} and Z_k = X_{t_0 + k} for 1 ≤ k ≤ T.
Theorem 1.23. A4-9c Discounted futures
Assume the rate of return is r per unit time. Then α = 1/(1 + r) is the discount factor. Let
Then
Thus the discounted sequence is a SMG relative to Z_N.
The implication from martingale theory is that all methods of determining profitable patterns of prediction from past history are doomed to failure.
Theorem 1.24. A4-10 Present discounted value of capital
If α = 1 / (1 + r) is the discount factor, Xn is the dividend at time n, and Vn is the present value, at time n, of all future returns, then
Note that iff
.
Set
. Then
so that
Thus, is a (S)MG iff
The submartingale convergence theorem
Theorem 1.25.
If (X_N, Z_N) is a SMG with sup_n E[X_n^+] < ∞, then there exists X_∞ ∼ W_∞ such that X_n → X_∞ a.s.
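A standard illustration of this convergence (added; not from the text): in a Polya urn the proportion of red balls is a martingale confined to [0, 1], so the boundedness condition holds trivially and each path settles down to its own (random) limit.

```python
import random

def polya_path(n_draws, seed):
    """Fractions of red balls in a Polya urn started with one red and one black ball."""
    rng = random.Random(seed)
    red, black = 1, 1
    fracs = []
    for _ in range(n_draws):
        if rng.random() < red / (red + black):
            red += 1          # drew red: return it and add another red
        else:
            black += 1        # drew black: likewise
        fracs.append(red / (red + black))
    return fracs

for seed in range(4):
    path = polya_path(5000, seed)
    print(path[99], path[999], path[4999])   # each row approaches its own limit
```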
Uniform integrability and some convergence conditions
Definition. The class {X_t : t ∈ T} is uniformly integrable (u.i.) iff
sup_{t ∈ T} E[I_{|X_t| > a} |X_t|] → 0 as a → ∞
Theorem 1.26.
Any of the following conditions ensures uniform integrability:
The class is dominated by an integrable random variable Y.
The class is finite and integrable.
There is a u.i. class {Y_t : t ∈ T} such that |X_t| ≤ |Y_t| a.s. for all t ∈ T.
X integrable implies Doob's MG is u.i.
Definition. The class {X_t : t ∈ T} is uniformly absolutely continuous (u.a.c.) iff for each ϵ > 0 there is a δ > 0 such that P(A) < δ implies E[I_A |X_t|] < ϵ for all t ∈ T.
Theorem 1.27.
The class X_T is u.i. iff both (i) X_T is u.a.c., and (ii) sup_{t ∈ T} E[|X_t|] < ∞.
Definition. X_n → X in probability iff P(|X_n − X| > ϵ) → 0 as n → ∞ for all ϵ > 0.
Theorem 1.28.
implies
Theorem 1.29.
If (i) , (ii)
and (iii)
exists a.s., then
Theorem 1.30.
Suppose (X_N, Z_N) is a (S)MG. Consider the following conditions:
(A): sup_n E[X_n^+] < ∞
(a): X_N is uniformly integrable.
(a+): X_N^+ is uniformly integrable.
(b): X_n converges in L^1; that is, there is an integrable X_∞ with E[|X_n − X_∞|] → 0.
(b+): X_n^+ converges in L^1.
(c): There is an integrable X_∞ ∼ W_∞ such that
E[X_∞ | W_n] ( ≥ ) X_n a.s. for all n ∈ N (1.53)
(c'): Condition (c) with ≥ even for a MG.
(d): There is an integrable X with E[X | W_n] ( ≥ ) X_n a.s. for all n ∈ N.
(d'): Condition (d) with ≥ even for a MG.
Then
Each of the propositions (a) through (d') implies (A), hence SMG convergence.
(a) ⇒ (a+)
(a) ⇔ (b) ⇒ (c) ⇒ (d)
(a+) ⇔ (b+) ⇔ (c') ⇔ (d')
For a MG, (d) ⇒ (a), so that (a) ⇔ (b) ⇔ (c) ⇔ (d)
The notion of regularity is characterized in terms of the conditions in the theorem.
Definition. A martingale is said to be
martingale regular iff the equivalent conditions (a), (b), (c), (d) in
the theorem hold.
A submartingale is said to be submartingale regular iff the equivalent conditions (a+), (b+), (c'), (d') in the theorem hold.
Remarks
Since a MG is a SMG, a martingale regular MG is also submartingale regular.
It is not true, in general, that a submartingale regular SMG is martingale regular. We do have, for a SMG, (a) ⇔ (b) ⇒ (c) ⇒ (d).
Regularity may be viewed in terms of membership of X_∞ in the (S)MG.
The condition E[X_∞ | W_n] ( ≥ ) X_n a.s. for all n ∈ N is indicated by saying X_∞ belongs to the (S)MG, or by saying the (S)MG is closed (on the right) by X_∞.
Summary
For a martingale
If martingale regular, then X_n → X_∞ ∼ W_∞ a.s. and E[|X_n − X_∞|] → 0 and X_n = E[X_∞ | W_n] a.s. for all n ∈ N.
If submartingale regular, but not martingale regular, then X_n → X_∞ ∼ W_∞ a.s., but E[|X_n − X_∞|] does not converge to zero, and X_n ≤ E[X_∞ | W_n] a.s. for all n ∈ N.
For a submartingale
Either martingale regularity or submartingale regularity implies X_n → X_∞ ∼ W_∞ a.s. and X_n ≤ E[X_∞ | W_n] a.s. for all n ∈ N.
If X_N is uniformly integrable, then E[|X_n − X_∞|] → 0 as well.
Theorem 1.31.
If (X_N, Z_N) is a MG with , then the process is MG regular, with