Absolute Continuity
We say $\nu$ is absolutely continuous with respect to $\mu$, denoted by $\nu \ll \mu$, if $\mu(A)=0$ implies $\nu(A)=0$.
Radon-Nikodym Theorem
Let $\nu$ and $\mu$ be $\sigma$-finite measures on $(\Omega, \mathcal{F})$. If $\nu$ is absolutely continuous with respect to $\mu$, i.e., $\nu \ll \mu$, then there exists a function $f \in \mathcal{F}$ such that for all $A \in \mathcal{F}$,
$$\nu(A) = \int_A f \, d\mu.$$
$f$ is denoted by $\frac{d\nu}{d\mu}$ and called the Radon-Nikodym derivative.
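On a finite sample space with $\mu(\{\omega\}) > 0$ everywhere, the Radon-Nikodym derivative reduces to the pointwise ratio $f(\omega) = \nu(\{\omega\})/\mu(\{\omega\})$. The following Python sketch (the measures and names are invented for illustration) verifies $\nu(A) = \int_A f \, d\mu$ for every subset $A$:

```python
from itertools import chain, combinations

# Radon-Nikodym derivative on a finite sample space: a minimal sketch.
# Here mu({w}) > 0 for every w, so nu << mu holds automatically and the
# derivative is the pointwise ratio f(w) = nu({w}) / mu({w}).

omega = ["a", "b", "c", "d"]
mu = {"a": 0.1, "b": 0.2, "c": 0.3, "d": 0.4}      # reference measure
nu = {"a": 0.05, "b": 0.15, "c": 0.45, "d": 0.35}  # target measure, nu << mu

f = {w: nu[w] / mu[w] for w in omega}  # Radon-Nikodym derivative dnu/dmu

def integrate(g, measure, A):
    """Integral of g over the set A with respect to `measure`."""
    return sum(g[w] * measure[w] for w in A)

# Verify nu(A) = \int_A f dmu for every subset A of omega.
subsets = chain.from_iterable(combinations(omega, r) for r in range(5))
for A in subsets:
    assert abs(sum(nu[w] for w in A) - integrate(f, mu, A)) < 1e-12
print("nu(A) = int_A f dmu holds for all A")
```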
Conditional Expectation
Consider a probability space $(\Omega, \mathcal{F}, P)$ and a random variable $X$ on it. Let $\mathcal{F}^{*} \subset \mathcal{F}$ be a sub-$\sigma$-field. Define the measure $\nu$ on $\mathcal{F}^{*}$:
$$\nu(A) = \int_A X \, dP, \quad \forall A \in \mathcal{F}^{*}.$$
If $X \geq 0$, countable additivity of $\nu$ follows from the monotone convergence theorem, so $\nu$ is a measure on $\mathcal{F}^{*}$. For the general case, write $X = X^{+} - X^{-}$ and treat the two parts separately. Since $\nu \ll P$, the Radon-Nikodym theorem gives a $Y \in \mathcal{F}^{*}$ such that $Y = \frac{d\nu}{dP}$. $Y$ is called the conditional expectation of $X$ given $\mathcal{F}^{*}$ and denoted by $Y = E[X|\mathcal{F}^{*}]$.
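When $\mathcal{F}^{*}$ is generated by a finite partition, $E[X|\mathcal{F}^{*}]$ is simply the $P$-weighted average of $X$ on each block. A small sketch (the probabilities, values, and partition below are illustrative) checking the defining property $\int_A Y \, dP = \int_A X \, dP$ on the generating sets:

```python
import numpy as np

# Conditional expectation w.r.t. a partition-generated sub-sigma-field:
# on each block B, E[X|F*] equals the P-weighted average of X over B.
# Finite-space sketch with made-up numbers.

P = np.array([0.1, 0.2, 0.3, 0.4])   # probabilities of 4 outcomes
X = np.array([1.0, 3.0, 2.0, 5.0])   # a random variable
blocks = [[0, 1], [2, 3]]            # partition generating F*

Y = np.empty_like(X)                 # Y = E[X|F*]
for B in blocks:
    Y[B] = np.dot(P[B], X[B]) / P[B].sum()   # block average

# Defining property: integrals of Y and X agree on every A in F*.
for B in blocks:
    assert np.isclose(np.dot(P[B], Y[B]), np.dot(P[B], X[B]))
print("partial-averaging property verified; Y =", Y)
```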
Conditional expectation satisfies nearly all the properties of ordinary expectation (linearity, monotonicity, Jensen's inequality, and the convergence theorems).
The Smaller $\sigma$-field Always Wins
If $\mathcal{F}_1 \subset \mathcal{F}_2$, then
$$E[E[X|\mathcal{F}_1]|\mathcal{F}_2] = E[X|\mathcal{F}_1] = E[E[X|\mathcal{F}_2]|\mathcal{F}_1].$$
Let $Y$ be a random variable and $Z = g(Y)$, where $g$ is a measurable function from $(\mathbb{R}, \mathcal{B})$ to $(\mathbb{R}, \mathcal{B})$. Then $\sigma(Z) = Y^{-1}g^{-1}(\mathcal{B}) \subset Y^{-1}(\mathcal{B}) = \sigma(Y)$. As a result,
$$E[E[X|Y]|Z] = E[X|Z].$$
In addition, $E[Z|Y] = Z$.
Conditional Expectation is the Projection in $L_2$ Space
Suppose $EX^2 < \infty$. Then $E[X|\mathcal{F}]$ is the $\mathcal{F}$-measurable $Y$ that minimizes $E(X-Y)^2$.
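On a finite space this projection property can be checked directly: among candidates constant on each partition block, the block averages minimize $E(X-Y)^2$. A sketch with invented numbers:

```python
import numpy as np

# E[X|F] as the L2 projection: among F-measurable Y (constant on each
# partition block), the block-average Y minimizes E(X-Y)^2.

P = np.array([0.25, 0.25, 0.25, 0.25])
X = np.array([0.0, 4.0, 1.0, 3.0])
blocks = [[0, 1], [2, 3]]

def mse(a, b):
    """E(X-Y)^2 for Y taking value a on the first block and b on the second."""
    Y = np.array([a, a, b, b])
    return np.dot(P, (X - Y) ** 2)

# Conditional expectation: block averages are 2.0 and 2.0.
best = mse(2.0, 2.0)
# No other F-measurable candidate does better.
rng = np.random.default_rng(0)
for a, b in rng.uniform(-10, 10, size=(100, 2)):
    assert mse(a, b) >= best - 1e-12
print("block averages minimize E(X-Y)^2:", best)
```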
Filtration
A filtration is an increasing sequence of $\sigma$-fields $\mathcal{F}_n \subset \mathcal{F}_{n+1}$.
Martingale
Let $\mathcal{F}_n$ be a filtration. A random sequence $X_n$ with

1. $E|X_n| < \infty$ for all finite $n$,
2. $X_n \in \mathcal{F}_n$ (i.e., $X_n$ is adapted),
3. $E[X_{n+1}|\mathcal{F}_n] = X_n$ for all $n$,

is said to be a martingale. If in the last equation $=$ is replaced by $\leq$ or $\geq$, then $\{X_n\}$ is said to be a supermartingale or submartingale, respectively.
Equivalent Conditions
For a supermartingale, $E[X_n|\mathcal{F}_m] \leq X_m$ for all $n > m$. For a submartingale, $E[X_n|\mathcal{F}_m] \geq X_m$ for all $n > m$.
Examples of Martingale
1. Sum of independent zero-mean RVs, $S_n$.
2. Products of non-negative independent RVs of mean 1, $\prod_{i=1}^n X_i$.
3. Accumulating data about a random variable, $E[X|\mathcal{F}_n]$.
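A quick Monte Carlo sanity check of the first two examples (a sketch, with step distributions chosen for illustration): a martingale keeps a constant expectation, so the sample means should sit near $0$ and $1$:

```python
import numpy as np

# Sanity check: a sum of independent zero-mean steps and a product of
# independent mean-1 factors both keep constant expectation over time.

rng = np.random.default_rng(42)
n_paths, n_steps = 200_000, 10

steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))   # zero-mean steps
S = steps.cumsum(axis=1)                                   # S_n, a martingale

factors = rng.choice([0.5, 1.5], size=(n_paths, n_steps))  # mean-1 factors
M = factors.cumprod(axis=1)                                # product martingale

print("E[S_n] ~", S[:, -1].mean())   # near 0
print("E[M_n] ~", M[:, -1].mean())   # near 1
assert abs(S[:, -1].mean()) < 0.05
assert abs(M[:, -1].mean() - 1.0) < 0.05
```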
Previsible Process
A process $C_n$ is previsible (predictable) if $C_n \in \mathcal{F}_{n-1}$ for all $n$.
Martingale Transformation
$$(C \cdot X)_n = \sum_{k=1}^n C_k (X_k - X_{k-1})$$
is the martingale transform of $X$ by $C$.
You Can’t Beat the System!
If $C_n$ is previsible, non-negative, and bounded, and $X$ is a supermartingale, then $(C \cdot X)_n$ is a supermartingale.
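A simulation sketch of this: the betting rule below (an invented previsible strategy that stakes 1 after a loss and 2 after a win) is applied to a fair game, and the resulting winnings $(C \cdot X)_n$ still have mean zero:

```python
import numpy as np

# "You can't beat the system": a bounded, non-negative, previsible stake C_n
# (here: bet 1 after a loss, 2 after a win; C_n depends only on the past)
# applied to a fair game X leaves the winnings (C.X)_n with mean 0.

rng = np.random.default_rng(1)
n_paths, n_steps = 200_000, 20
xi = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))  # fair-game increments

winnings = np.zeros(n_paths)
stake = np.ones(n_paths)                  # C_1 = 1, known before play starts
for k in range(n_steps):
    winnings += stake * xi[:, k]          # (C.X)_k - (C.X)_{k-1} = C_k (X_k - X_{k-1})
    stake = np.where(xi[:, k] > 0, 2.0, 1.0)   # C_{k+1} is F_k-measurable

print("mean winnings ~", winnings.mean())  # near 0: the strategy has no edge
assert abs(winnings.mean()) < 0.1
```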
Stopping Time
In discrete time, $T$ is a stopping time iff $\{T = n\} \in \mathcal{F}_n$ for all $n$. That is, $T$ is a time at which you can decide whether or not to stop after the $n$-th game based only on the history up to time $n$.
A Proposition
If $N$ is a stopping time and $X_n$ is a supermartingale, then the stopped process $X_{n \wedge N}$ is a supermartingale.
Martingale Convergence Theorem
If $X_n$ is a submartingale with $\sup_n EX_n^{+} < \infty$, then $X_n$ converges a.s. to a limit $X$ with $E|X| < \infty$. A corollary follows:

If $X_n \geq 0$ is a supermartingale, then $X_n \rightarrow X$ a.s. and $EX \leq EX_0$.
Doob’s Decomposition
Any submartingale can be written in a unique way as $X_n = M_n + A_n$, where $M_n$ is a martingale and $A_n$ is a predictable increasing sequence with $A_0 = 0$. In fact,
$$A_n = \sum_{k=1}^n E[X_k - X_{k-1}|\mathcal{F}_{k-1}].$$
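A concrete instance: for the submartingale $X_n = S_n^2$ with $S_n$ a simple $\pm 1$ random walk, the formula gives $E[X_k - X_{k-1}|\mathcal{F}_{k-1}] = E[2 S_{k-1}\xi_k + \xi_k^2|\mathcal{F}_{k-1}] = 1$, so $A_n = n$ and $M_n = S_n^2 - n$ is a martingale. A Monte Carlo sketch checking that $M_n$ has constant mean $0$:

```python
import numpy as np

# Doob decomposition of X_n = S_n^2 (S_n a simple +-1 random walk):
# the compensator is A_n = n, so M_n = S_n^2 - n should be a martingale.
# We check the weaker consequence E[M_n] = 0 for several n.

rng = np.random.default_rng(7)
n_paths, n_steps = 200_000, 15
S = rng.choice([-1.0, 1.0], size=(n_paths, n_steps)).cumsum(axis=1)

for n in (5, 10, 15):
    M_n = S[:, n - 1] ** 2 - n           # X_n - A_n
    print(f"E[M_{n}] ~ {M_n.mean():.4f}")   # near 0 for every n
    assert abs(M_n.mean()) < 0.3
```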
$L_p$ Convergence Theorem
If $X_n$ is a martingale with $\sup_n E|X_n|^p < \infty$ for some $p > 1$, then $X_n$ converges a.s. and in $L_p$.
Uniform Integrability
A collection $\{X_i, i \in I\}$ is said to be uniformly integrable iff
$$\lim_{M \rightarrow \infty} \sup_{i \in I} \int_{\{|X_i| > M\}} |X_i| \, dP = 0.$$
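The definition can be computed exactly for simple families. The sketch below (both families invented for illustration) contrasts $X_n = n \cdot 1_{A_n}$ with $P(A_n) = 1/n$, where every $E|X_n| = 1$ yet the tail integrals never decay, against a uniformly bounded family, which is trivially uniformly integrable:

```python
# Uniform integrability computed exactly for two toy families.
# Family 1: X_n = n with probability 1/n, else 0. Then E|X_n| = 1, but
# int_{|X_n|>M} |X_n| dP = n * (1/n) = 1 whenever n > M, so the sup over
# n stays 1 for every M: NOT uniformly integrable.
# Family 2: |Y_n| <= 5 for all n, so the tail integral is 0 once M >= 5: UI.

def tail_integral_family1(n, M):
    """int_{|X_n|>M} |X_n| dP for X_n = n * 1_{A_n}, P(A_n) = 1/n."""
    return 1.0 if n > M else 0.0

for M in (10, 100, 1000):
    sup_tail = max(tail_integral_family1(n, M) for n in range(1, 10 * M))
    assert sup_tail == 1.0        # stays 1 no matter how large M is

def tail_integral_family2(n, M):
    """Bounded family: the tail integral vanishes once M >= the bound."""
    return 0.0 if M >= 5 else 5.0

assert max(tail_integral_family2(n, 10) for n in range(1, 100)) == 0.0
print("family 1 is not UI; family 2 is UI")
```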
An Example
Given a probability space $(\Omega, \mathcal{F}_0, P)$ and $X \in L_1$, the family $\{E[X|\mathcal{F}] : \mathcal{F} \text{ is a } \sigma\text{-field} \subset \mathcal{F}_0\}$ is uniformly integrable.
Equivalence of Uniform Integrability I
If $X_n \rightarrow X$ in probability and $E|X_n| < \infty$ for all $n$, then the following are equivalent:

1. $\{X_n\}$ is uniformly integrable.
2. $X_n \rightarrow X$ in $L_1$.
3. $E|X_n| \rightarrow E|X| < \infty$.
Equivalence of Uniform Integrability II
For a submartingale, the following are equivalent:
1. uniform integrability.
2. a.s. convergence and convergence in $L_1$.
3. convergence in $L_1$.
A Lemma
If a martingale $X_n$ converges to $X$ in $L_1$, then $X_n = E[X|\mathcal{F}_n]$.
Equivalence of Uniform Integrability III
For a martingale, the following are equivalent:
1. uniform integrability.
2. a.s. convergence and convergence in $L_1$.
3. convergence in $L_1$.
4. There is an integrable random variable $X$ such that $X_n = E[X|\mathcal{F}_n]$.
The Limit of $E[X|\mathcal{F}_n]$
If $\mathcal{F}_n$ is an increasing sequence of $\sigma$-fields with limit $\mathcal{F}_\infty = \sigma\{\cup_n \mathcal{F}_n\}$, then
$$E[X|\mathcal{F}_n] \rightarrow E[X|\mathcal{F}_\infty] \quad \text{a.s. and in } L_1.$$
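A classical illustration: on $([0,1], \mathcal{B}, \text{Lebesgue})$ with $\mathcal{F}_n$ generated by the $2^n$ dyadic intervals, $E[X|\mathcal{F}_n]$ is the average of $X$ over each interval and converges to $X$ itself. The sketch below discretizes $[0,1]$ by a fine grid standing in for the continuum (the choice $X = U^2$ and the grid size are illustrative):

```python
import numpy as np

# E[X|F_n] -> E[X|F_inf] for the dyadic filtration on [0,1]:
# F_n is generated by the 2^n dyadic intervals, so E[X|F_n] replaces X
# by its average on each interval; as n grows it approaches X pointwise.

def f(u):
    return u ** 2                        # X = f(U), U uniform on [0,1]

grid = (np.arange(2 ** 12) + 0.5) / 2 ** 12   # fine grid standing in for [0,1]
X = f(grid)

def cond_exp(n):
    """E[X|F_n]: average X over each of the 2^n dyadic blocks."""
    blocks = X.reshape(2 ** n, -1)
    return np.repeat(blocks.mean(axis=1), blocks.shape[1])

errors = [np.abs(cond_exp(n) - X).max() for n in (1, 3, 5, 8)]
print("sup|E[X|F_n] - X|:", errors)      # decreasing toward 0
assert all(a > b for a, b in zip(errors, errors[1:]))
assert errors[-1] < 0.01
```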
Lévy's 0-1 Law
If $\mathcal{F}_n \uparrow \mathcal{F}_\infty$ and $A \in \mathcal{F}_\infty$, then $E[1_A|\mathcal{F}_n] \rightarrow 1_A$ a.s.
A Theorem
If $X_n$ is a submartingale and $N$ is a stopping time with $P(N \leq k) = 1$, then $EX_0 \leq EX_N \leq EX_k$.
For a uniformly integrable submartingale $X_n$ and any stopping time $N$, we have $EX_0 \leq EX_N \leq EX_\infty$.
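A simulation sketch of the bounded-stopping-time statement: take the submartingale $X_n = S_n^2$ for a simple random walk $S_n$ and the bounded stopping time $N = \min\{\text{first } n \text{ with } |S_n| = 2,\ k\}$ (both chosen for illustration). Then $EX_0 = 0$, $EX_k = k$, and the sampled $EX_N$ should land in between:

```python
import numpy as np

# Optional stopping check: X_n = S_n^2 is a submartingale, and
# N = min(first n with |S_n| = 2, k) is a bounded stopping time,
# so the theorem predicts E[X_0] <= E[X_N] <= E[X_k] = k.

rng = np.random.default_rng(3)
n_paths, k = 100_000, 10
S = rng.choice([-1, 1], size=(n_paths, k)).cumsum(axis=1)

hit = np.abs(S) >= 2
# First index with |S| = 2; paths that never hit use the final index.
# (argmax of an all-False row is 0, so guard with any().)
first = np.where(hit.any(axis=1), hit.argmax(axis=1), k - 1)
X_N = S[np.arange(n_paths), first].astype(float) ** 2
X_k = S[:, -1].astype(float) ** 2

print("E[X_N] ~", X_N.mean(), " E[X_k] ~", X_k.mean())
assert 0.0 <= X_N.mean() <= X_k.mean()   # E[X_0] <= E[X_N] <= E[X_k]
assert abs(X_k.mean() - k) < 0.3
```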