# Kolmogorov extension theorem

*This article is about a theorem on stochastic processes. For a theorem on extension of pre-measures, see Hahn–Kolmogorov theorem.*

In mathematics, the Kolmogorov extension theorem (also known as the Kolmogorov existence theorem, the Kolmogorov consistency theorem, or the Daniell–Kolmogorov theorem) is a theorem that guarantees that a suitably "consistent" collection of finite-dimensional distributions will define a stochastic process. It is credited to the English mathematician Percy John Daniell and the Russian mathematician Andrey Nikolaevich Kolmogorov.[1]

## Statement of the theorem

Let $T$ denote some interval (thought of as "time"), and let $n \in \mathbb{N}$. For each $k \in \mathbb{N}$ and each finite sequence of distinct times $t_1, \dots, t_k \in T$, let $\nu_{t_1 \dots t_k}$ be a probability measure on $(\mathbb{R}^n)^k$. Suppose that these measures satisfy two consistency conditions:

1. for all permutations $\pi$ of $\{1, \dots, k\}$ and measurable sets $F_i \subseteq \mathbb{R}^n$,
$$\nu_{t_{\pi(1)} \dots t_{\pi(k)}}\left(F_{\pi(1)} \times \dots \times F_{\pi(k)}\right) = \nu_{t_1 \dots t_k}\left(F_1 \times \dots \times F_k\right);$$

2. for all measurable sets $F_i \subseteq \mathbb{R}^n$ and all $m \in \mathbb{N}$,
$$\nu_{t_1 \dots t_k}\left(F_1 \times \dots \times F_k\right) = \nu_{t_1 \dots t_k, t_{k+1}, \dots, t_{k+m}}\left(F_1 \times \dots \times F_k \times \underbrace{\mathbb{R}^n \times \dots \times \mathbb{R}^n}_{m}\right).$$

Then there exists a probability space $(\Omega, \mathcal{F}, \mathbb{P})$ and a stochastic process $X \colon T \times \Omega \to \mathbb{R}^n$ such that
$$\nu_{t_1 \dots t_k}\left(F_1 \times \dots \times F_k\right) = \mathbb{P}\left(X_{t_1} \in F_1, \dots, X_{t_k} \in F_k\right)$$
for all $t_i \in T$, $k \in \mathbb{N}$ and measurable sets $F_i \subseteq \mathbb{R}^n$; i.e. $X$ has $\nu_{t_1 \dots t_k}$ as its finite-dimensional distributions relative to the times $t_1, \dots, t_k$.
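To make the two conditions concrete, here is a minimal numerical sketch (not part of the original statement) for a hypothetical independent process on the state space $\{0, 1\}$ with one-step law $P(X_t = 1) = t/(t+1)$. For such a process the finite-dimensional distributions are products of the one-step marginals, and both conditions can be checked directly.

```python
import itertools

def marginal(t):
    # Hypothetical one-step law at time t: P(X_t = 1) = t / (t + 1).
    q = t / (t + 1)
    return {0: 1 - q, 1: q}

def nu(times, sets):
    """nu_{t_1...t_k}(F_1 x ... x F_k) for an independent process:
    the product of the one-step marginals."""
    prob = 1.0
    for t, F in zip(times, sets):
        m = marginal(t)
        prob *= sum(m[x] for x in F)
    return prob

times = [1, 2, 3]
sets = [{0}, {1}, {0, 1}]

# Condition 1: permuting the times together with their control sets
# leaves the measure unchanged.
for perm in itertools.permutations(range(3)):
    assert abs(nu([times[i] for i in perm], [sets[i] for i in perm])
               - nu(times, sets)) < 1e-12

# Condition 2: appending extra times with the full state space {0, 1}
# as the control set does not change the measure.
assert abs(nu(times + [4, 5], sets + [{0, 1}, {0, 1}]) - nu(times, sets)) < 1e-12
```

The process and its one-step law here are invented purely for illustration; any independent family works the same way.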

In fact, it is always possible to take as the underlying probability space $\Omega = (\mathbb{R}^n)^T$ and to take for $X$ the canonical process $X \colon (t, Y) \mapsto Y_t$. Therefore, an alternative way of stating Kolmogorov's extension theorem is that, provided the above consistency conditions hold, there exists a (unique) measure $\nu$ on $(\mathbb{R}^n)^T$ with marginals $\nu_{t_1 \dots t_k}$ for any finite collection of times $t_1, \dots, t_k$. Kolmogorov's extension theorem applies when $T$ is uncountable, but the price to pay for this level of generality is that the measure $\nu$ is only defined on the product σ-algebra of $(\mathbb{R}^n)^T$, which is not very rich.

## Explanation of the conditions

The two conditions required by the theorem are trivially satisfied by any stochastic process. For example, consider a real-valued discrete-time stochastic process $X$. Then the probability $\mathbb{P}(X_1 > 0, X_2 < 0)$ can be computed either as $\nu_{1,2}(\mathbb{R}_+ \times \mathbb{R}_-)$ or as $\nu_{2,1}(\mathbb{R}_- \times \mathbb{R}_+)$. Hence, for the finite-dimensional distributions to be consistent, it must hold that $\nu_{1,2}(\mathbb{R}_+ \times \mathbb{R}_-) = \nu_{2,1}(\mathbb{R}_- \times \mathbb{R}_+)$. The first condition generalizes this statement to hold for any number of time points $t_i$ and any control sets $F_i$. Continuing the example, the second condition implies that $\mathbb{P}(X_1 > 0) = \mathbb{P}(X_1 > 0, X_2 \in \mathbb{R})$. This, too, is a trivial condition that will be satisfied by any consistent family of finite-dimensional distributions.
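The equality in this example can be illustrated numerically. The sketch below assumes, purely for concreteness, that $X_1$ and $X_2$ are independent standard normals (so each probability is exactly $0.25$), and estimates both sides by Monte Carlo over the same sample.

```python
import random

random.seed(0)
N = 200_000
samples = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]

# nu_{1,2}(R_+ x R_-): coordinates listed in the order (X_1, X_2).
nu_12 = sum(1 for x1, x2 in samples if x1 > 0 and x2 < 0) / N
# nu_{2,1}(R_- x R_+): the same event with coordinates listed as (X_2, X_1).
nu_21 = sum(1 for x1, x2 in samples if x2 < 0 and x1 > 0) / N
assert nu_12 == nu_21  # identical event, so the counts agree exactly

# Second condition: P(X_1 > 0) = P(X_1 > 0, X_2 in R),
# since the constraint "X_2 in R" excludes nothing.
p_k = sum(1 for x1, x2 in samples if x1 > 0) / N
p_km = sum(1 for x1, x2 in samples
           if x1 > 0 and -float("inf") < x2 < float("inf")) / N
assert p_k == p_km
```

Both assertions hold by construction: reordering the time points merely relabels the same event, which is exactly what the first consistency condition demands.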

## Implications of the theorem

Since the two conditions are trivially satisfied for any stochastic process, the power of the theorem is that no other conditions are required: for any reasonable (i.e., consistent) family of finite-dimensional distributions, there exists a stochastic process with these distributions.

The measure-theoretic approach to stochastic processes starts with a probability space and defines a stochastic process as a family of functions on this probability space. However, in many applications the starting point is really the finite-dimensional distributions of the stochastic process. The theorem says that provided the finite-dimensional distributions satisfy the obvious consistency requirements, one can always identify a probability space to match the purpose. In many situations, this means that one does not have to be explicit about what the probability space is. Many texts on stochastic processes do, indeed, assume a probability space but never state explicitly what it is.

The theorem is used in one of the standard proofs of the existence of Brownian motion, by specifying the finite-dimensional distributions to be Gaussian random variables satisfying the consistency conditions above. Since most definitions of Brownian motion require that the sample paths be continuous almost surely, one then uses the Kolmogorov continuity theorem to construct a continuous modification of the process produced by the Kolmogorov extension theorem.
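As a rough illustration of the consistency of the Brownian finite-dimensional distributions, the sketch below samples the Gaussian family with mean $0$ and covariance $\operatorname{Cov}(B_s, B_t) = \min(s, t)$ via independent increments (a standard construction, here with zero drift and unit variance) and checks that covariance empirically.

```python
import random

random.seed(1)
times = [0.5, 1.0, 2.5]
N = 100_000

# Sample (B_{0.5}, B_{1.0}, B_{2.5}) via independent Gaussian increments.
paths = []
for _ in range(N):
    b, prev, vals = 0.0, 0.0, []
    for t in times:
        b += random.gauss(0.0, (t - prev) ** 0.5)  # increment with variance t - prev
        vals.append(b)
        prev = t
    paths.append(vals)

def emp_cov(i, j):
    """Empirical covariance of coordinates i and j over the sampled paths."""
    mi = sum(p[i] for p in paths) / N
    mj = sum(p[j] for p in paths) / N
    return sum((p[i] - mi) * (p[j] - mj) for p in paths) / N

# The empirical covariance matches Cov(B_s, B_t) = min(s, t) for every pair;
# in particular, the law of any sub-vector is the corresponding marginal of
# the law of the full vector, which is the consistency the theorem requires.
for i, s in enumerate(times):
    for j, t in enumerate(times):
        assert abs(emp_cov(i, j) - min(s, t)) < 0.08
```

The tolerance is generous because this is a Monte Carlo estimate; the exact covariance identity holds by the Gaussian marginalization property.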

## General form of the theorem

The Kolmogorov extension theorem gives us conditions for a collection of measures on Euclidean spaces to be the finite-dimensional distributions of some $\mathbb{R}^n$-valued stochastic process, but the assumption that the state space be $\mathbb{R}^n$ is unnecessary. In fact, any collection of measurable spaces together with a collection of inner regular measures defined on the finite products of these spaces would suffice, provided that these measures satisfy a certain compatibility relation. The formal statement of the general theorem is as follows.[2]

Let $T$ be any set. Let $\{(\Omega_t, \mathcal{F}_t)\}_{t \in T}$ be some collection of measurable spaces, and for each $t \in T$, let $\tau_t$ be a Hausdorff topology on $\Omega_t$. For each subset $J \subseteq T$, define
$$\Omega_J := \prod_{t \in J} \Omega_t.$$

For subsets $I \subset J \subseteq T$, let $\pi_I^J \colon \Omega_J \to \Omega_I$ denote the canonical projection map $\omega \mapsto \omega|_I$.

For each finite subset $F \subset T$, suppose we have a probability measure $\mu_F$ on $\Omega_F$ which is inner regular with respect to the product topology (induced by the $\tau_t$) on $\Omega_F$. Suppose also that this collection $\{\mu_F\}$ of measures satisfies the following compatibility relation: for finite subsets $F \subset G \subset T$, we have that
$$\mu_F = (\pi_F^G)_* \mu_G,$$
where $(\pi_F^G)_* \mu_G$ denotes the pushforward measure of $\mu_G$ induced by the canonical projection map $\pi_F^G$.

Then there exists a unique probability measure $\mu$ on $\Omega_T$ such that $\mu_F = (\pi_F^T)_* \mu$ for every finite subset $F \subset T$.
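On finite spaces the compatibility relation $\mu_F = (\pi_F^G)_* \mu_G$ can be spelled out directly. The sketch below (a hypothetical example, not from the text) represents measures on finite products as dicts keyed by tuples, and implements the pushforward under the canonical projection as a sum over the dropped coordinates.

```python
from collections import defaultdict
from itertools import product

states = (0, 1)
G = ["s", "t", "u"]

# mu_G: an arbitrary probability measure on Omega_G = {0, 1}^3,
# normalized from strictly positive weights (chosen only to break symmetry).
raw = {omega: 1.0 + omega[0] + 2 * omega[1] + 4 * omega[2]
       for omega in product(states, repeat=len(G))}
Z = sum(raw.values())
mu_G = {omega: w / Z for omega, w in raw.items()}

def pushforward(mu, index, sub):
    """(pi_sub^index)_* mu: sum mu over the coordinates dropped by the
    canonical projection from the times `index` onto the times `sub`."""
    keep = [index.index(x) for x in sub]
    out = defaultdict(float)
    for omega, p in mu.items():
        out[tuple(omega[i] for i in keep)] += p
    return dict(out)

mu_F = pushforward(mu_G, G, ["s", "t"])         # marginal on {s, t}
mu_step = pushforward(mu_F, ["s", "t"], ["s"])  # then marginal on {s}
mu_direct = pushforward(mu_G, G, ["s"])         # direct marginal on {s}

# Compatibility: projecting in two steps agrees with projecting directly,
# exactly the relation mu_F = (pi_F^G)_* mu_G for nested finite subsets.
for omega in mu_direct:
    assert abs(mu_step[omega] - mu_direct[omega]) < 1e-12
```

The extension theorem asserts the converse direction: given such a compatible family over all finite subsets, a single measure on the full product exists with these as its marginals.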

As a remark, all of the measures $\mu_F, \mu$ are defined on the product σ-algebra on their respective spaces, which (as mentioned before) is rather coarse. The measure $\mu$ may sometimes be extended appropriately to a larger σ-algebra, if there is additional structure involved.

Note that the original statement of the theorem is just a special case of this theorem with $\Omega_t = \mathbb{R}^n$ for all $t \in T$, and $\mu_{\{t_1, \dots, t_k\}} = \nu_{t_1 \dots t_k}$ for $t_1, \dots, t_k \in T$. The stochastic process would simply be the canonical process $(\pi_t)_{t \in T}$, defined on $\Omega = (\mathbb{R}^n)^T$ with probability measure $P = \mu$. The reason that the original statement of the theorem does not mention inner regularity of the measures $\nu_{t_1 \dots t_k}$ is that this would follow automatically, since Borel probability measures on Polish spaces are automatically Radon.

This theorem has many far-reaching consequences; for example, it can be used to prove the existence of the following, among others:

- Brownian motion, i.e., the Wiener process,
- a Markov chain taking values in a given state space with a given transition matrix,
- infinite products of (inner-regular) probability spaces.

## History

According to John Aldrich, the theorem was independently discovered by the British mathematician Percy John Daniell in the slightly different setting of integration theory.[3]

## References

1. Øksendal, Bernt (2003). *Stochastic Differential Equations: An Introduction with Applications* (Sixth ed.). Berlin: Springer. p. 11. ISBN 3-540-04758-1.
2. Tao, T. (2011). *An Introduction to Measure Theory*. Graduate Studies in Mathematics. Vol. 126. Providence: American Mathematical Society. p. 195. ISBN 978-0-8218-6919-2.
3. Aldrich, J. (2007). "But you have to remember P. J. Daniell of Sheffield". *Electronic Journal for History of Probability and Statistics*. Vol. 3, No. 2, December 2007.
