Kolmogorov's three-series theorem

In probability theory, Kolmogorov's three-series theorem, named after Andrey Kolmogorov, gives a criterion for the almost sure convergence of an infinite series of random variables in terms of the convergence of three different series involving properties of their probability distributions. Kolmogorov's three-series theorem, combined with Kronecker's lemma, can be used to give a relatively easy proof of the strong law of large numbers.[1]

Statement of the theorem

Let $(X_n)_{n \in \mathbb{N}}$ be independent random variables. The random series $\sum_{n=1}^{\infty} X_n$ converges almost surely in $\mathbb{R}$ if the following conditions hold for some $A > 0$, and only if the following conditions hold for every $A > 0$:

(i) $\sum_{n=1}^{\infty} \mathbb{P}(|X_n| \geq A)$ converges.
(ii) Let $Y_n = X_n \mathbf{1}_{\{|X_n| \leq A\}}$; then $\sum_{n=1}^{\infty} \mathbb{E}[Y_n]$, the series of expected values of $Y_n$, converges.
(iii) $\sum_{n=1}^{\infty} \operatorname{var}(Y_n)$ converges.

Proof

Sufficiency of conditions ("if")

Condition (i) and the Borel–Cantelli lemma give that $X_n = Y_n$ for $n$ large, almost surely. Hence $\sum_{n=1}^{\infty} X_n$ converges if and only if $\sum_{n=1}^{\infty} Y_n$ converges. Conditions (ii) and (iii), together with Kolmogorov's two-series theorem, give the almost sure convergence of $\sum_{n=1}^{\infty} Y_n$.
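As a quick illustration of how the three series in the statement are evaluated for a concrete distribution, here is a minimal Monte Carlo sketch; it is not part of the original argument and assumes NumPy together with the hypothetical choice $X_n \sim N(0, 1/n^2)$ and truncation level $A = 1$, for which all three series converge.

```python
import numpy as np

# Minimal Monte Carlo sketch (not from the original article): estimate the
# terms of the three series for the hypothetical choice X_n ~ Normal(0, 1/n^2),
# independent, with truncation level A = 1.
rng = np.random.default_rng(0)
A = 1.0
N = 200            # number of terms inspected
samples = 100_000  # Monte Carlo samples per term

s_tail = s_mean = s_var = 0.0
for n in range(1, N + 1):
    x = rng.normal(0.0, 1.0 / n, size=samples)  # samples of X_n (standard deviation 1/n)
    y = np.where(np.abs(x) <= A, x, 0.0)        # truncation Y_n = X_n * 1{|X_n| <= A}
    s_tail += np.mean(np.abs(x) >= A)           # term of series (i): P(|X_n| >= A)
    s_mean += y.mean()                          # term of series (ii): E[Y_n]
    s_var += y.var()                            # term of series (iii): var(Y_n)

print(f"partial sums after {N} terms:")
print(f"  sum P(|X_n| >= A) ~ {s_tail:.4f}")
print(f"  sum E[Y_n]        ~ {s_mean:.4f}")
print(f"  sum var(Y_n)      ~ {s_var:.4f}")
```

For this family all three partial sums stay bounded as the number of terms grows, so the theorem predicts that $\sum_{n=1}^{\infty} X_n$ converges almost surely.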

Necessity of conditions ("only if")

Suppose that $\sum_{n=1}^{\infty} X_n$ converges almost surely.

Without condition (i), by Borel–Cantelli there would exist some $A > 0$ such that the event $\{|X_n| \geq A\}$ occurs for infinitely many $n$, almost surely. But then the series would diverge. Therefore, we must have condition (i).

We see that condition (iii) implies condition (ii): Kolmogorov's two-series theorem along with condition (i) applied to the case $A = 1$ gives the convergence of $\sum_{n=1}^{\infty} (Y_n - \mathbb{E}[Y_n])$. So given the convergence of $\sum_{n=1}^{\infty} Y_n$, we have that $\sum_{n=1}^{\infty} \mathbb{E}[Y_n]$ converges, so condition (ii) is implied.

Hence it only remains to demonstrate the necessity of condition (iii), and then we will have obtained the full result. It is equivalent to check condition (iii) for the series $\sum_{n=1}^{\infty} Z_n = \sum_{n=1}^{\infty} (Y_n - Y'_n)$, where for each $n$ the variables $Y_n$ and $Y'_n$ are IID; that is, to employ the assumption that $\mathbb{E}[Y_n] = 0$, since $(Z_n)$ is a sequence of random variables bounded by 2, converging almost surely, and with $\operatorname{var}(Z_n) = 2\operatorname{var}(Y_n)$. So we wish to check that if $\sum_{n=1}^{\infty} Z_n$ converges, then $\sum_{n=1}^{\infty} \operatorname{var}(Z_n)$ converges as well. This is a special case of a more general result from martingale theory, with summands equal to the increments of a martingale sequence and the same conditions ($\mathbb{E}[Z_n] = 0$, the series of the variances converges, and the summands are bounded).[2][3][4]

Example

As an illustration of the theorem, consider the example of the harmonic series with random signs:
$$\sum_{n=1}^{\infty} \pm \frac{1}{n}.$$
Here "$\pm$" means that each term $1/n$ is taken with a random sign that is either $1$ or $-1$ with respective probabilities $1/2,\ 1/2$, and all random signs are chosen independently. Let $X_n$ in the theorem denote a random variable that takes the values $1/n$ and $-1/n$ with equal probabilities. With $A = 2$ the summands of the first two series are identically zero and $\operatorname{var}(Y_n) = n^{-2}$. The conditions of the theorem are then satisfied, so it follows that the harmonic series with random signs converges almost surely. On the other hand, the analogous series of (for example) square-root reciprocals with random signs, namely
$$\sum_{n=1}^{\infty} \pm \frac{1}{\sqrt{n}},$$
diverges almost surely, since condition (iii) in the theorem is not satisfied for any $A$. Note that this is different from the behavior of the analogous series with alternating signs, $\sum_{n=1}^{\infty} (-1)^n/\sqrt{n}$, which does converge.
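The example can also be checked by simulation. The following is a minimal sketch, not part of the original article, assuming NumPy: it tracks the partial sums of both random-sign series over many independent sign sequences and reports how spread out they are at a few cut-offs.

```python
import numpy as np

# Minimal simulation sketch of the example above: compare partial sums of
# sum_n ±1/n and sum_n ±1/sqrt(n) across many independent sign sequences.
rng = np.random.default_rng(1)
n_terms, n_paths = 10_000, 200
signs = rng.choice([-1.0, 1.0], size=(n_paths, n_terms))
n = np.arange(1, n_terms + 1)

harmonic = np.cumsum(signs / n, axis=1)             # partial sums of sum ±1/n
sqrt_recip = np.cumsum(signs / np.sqrt(n), axis=1)  # partial sums of sum ±1/sqrt(n)

for cutoff in (100, 1_000, 10_000):
    print(f"N = {cutoff:6d}:  std of sum ±1/n = {harmonic[:, cutoff - 1].std():.3f},  "
          f"std of sum ±1/sqrt(n) = {sqrt_recip[:, cutoff - 1].std():.3f}")
```

The spread of the first partial sums levels off near $\pi/\sqrt{6} \approx 1.28$, matching the almost sure convergence discussed above, while the spread of the second keeps growing (roughly like $\sqrt{\ln N}$), consistent with almost sure divergence.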

Notes

1. ^ Durrett, Rick. Probability: Theory and Examples. Duxbury Advanced Series, Third Edition, Thomson Brooks/Cole, 2005, Section 1.8, pp. 60–69.
2. ^ Sun, Rongfeng. Lecture notes. http://www.math.nus.edu.sg/~matsr/ProbI/Lecture4.pdf Archived 2018-04-17 at the Wayback Machine.
3. ^ M. Loève, "Probability theory", Princeton Univ. Press (1963), Sect. 16.3.
4. ^ W. Feller, "An introduction to probability theory and its applications", Vol. 2, Wiley (1971), Sect. IX.9.
