Glivenko–Cantelli theorem

In probability theory, the Glivenko–Cantelli theorem (sometimes referred to as the Fundamental Theorem of Statistics), named after Valery Ivanovich Glivenko and Francesco Paolo Cantelli, describes the asymptotic behaviour of the empirical distribution function as the number of independent and identically distributed observations grows.[1] The uniform convergence of more general empirical measures is an important property of the Glivenko–Cantelli classes of functions or sets.[2] Glivenko–Cantelli classes arise in Vapnik–Chervonenkis theory, with applications to machine learning; applications can also be found in econometrics through the use of M-estimators.

Statement

Assume that $X_1, X_2, \dots$ are independent and identically distributed random variables in $\mathbb{R}$ with common cumulative distribution function $F(x)$. The empirical distribution function for $X_1, \dots, X_n$ is defined by

$$F_n(x) = \frac{1}{n} \sum_{i=1}^{n} I_{[X_i, \infty)}(x) = \frac{1}{n} \left| \left\{ 1 \le i \le n \mid X_i \le x \right\} \right|,$$

where $I_C$ is the indicator function of the set $C$. For every (fixed) $x$, $F_n(x)$ is a sequence of random variables that converges to $F(x)$ almost surely by the strong law of large numbers; that is, $F_n$ converges to $F$ pointwise. Glivenko and Cantelli strengthened this result by proving uniform convergence of $F_n$ to $F$.
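The definition of $F_n$ translates directly into code. A minimal sketch in Python (the function name `ecdf` is illustrative, not a standard API):

```python
import numpy as np

def ecdf(samples, x):
    """Empirical distribution function F_n(x) = (1/n) * |{i : X_i <= x}|."""
    samples = np.asarray(samples)
    return np.mean(samples <= x)

# For the sample (0.2, 0.5, 0.9, 1.4), three of the four observations are <= 0.9:
print(ecdf([0.2, 0.5, 0.9, 1.4], 0.9))  # 0.75
```

For each fixed $x$, the value returned is a sample mean of i.i.d. indicator variables, which is exactly why the strong law of large numbers gives pointwise convergence to $F(x)$.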

Theorem

$$\|F_n - F\|_\infty = \sup_{x \in \mathbb{R}} |F_n(x) - F(x)| \longrightarrow 0$$ almost surely.[3]

This theorem originates with Valery Glivenko[4] and Francesco Cantelli,[5] in 1933.
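Because $F_n$ is a step function, the supremum in the theorem is attained at (or just before) one of the order statistics, so it can be computed exactly from the sorted sample when $F$ is continuous and known. A sketch for Uniform(0,1) samples; the helper name `sup_distance` is made up for illustration:

```python
import numpy as np

def sup_distance(samples, cdf):
    """Compute sup_x |F_n(x) - F(x)| for a continuous cdf: the supremum
    is attained at, or just before, one of the jumps of F_n."""
    xs = np.sort(np.asarray(samples))
    n = len(xs)
    F = cdf(xs)
    upper = np.arange(1, n + 1) / n   # value of F_n at each jump
    lower = np.arange(0, n) / n       # value of F_n just before each jump
    return max(np.abs(upper - F).max(), np.abs(F - lower).max())

rng = np.random.default_rng(0)
for n in (100, 10_000):
    # Uniform(0,1) samples; the true CDF on [0, 1] is F(x) = x
    print(n, sup_distance(rng.uniform(size=n), lambda x: x))
```

The printed distance shrinks as $n$ grows, as the theorem predicts; the Dvoretzky–Kiefer–Wolfowitz inequality quantifies the rate of this convergence.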

Remarks

If $X_n$ is a stationary ergodic process, then $F_n(x)$ converges almost surely to $F(x) = \operatorname{E}(1_{X_1 \le x})$; the Glivenko–Cantelli theorem gives a stronger mode of convergence than this in the i.i.d. case. An even stronger uniform convergence result for the empirical distribution function is available in the form of an extended type of law of the iterated logarithm.[6] See asymptotic properties of the empirical distribution function for this and related results.

Proof

For simplicity, consider the case of a continuous random variable $X$. Fix $-\infty = x_0$
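The proof above is truncated, but the standard argument reduces the supremum over all $x$ to finitely many points: choosing grid points $x_j$ so that $F$ increases by at most $1/k$ between them, monotonicity of $F$ and $F_n$ gives $\sup_x |F_n(x) - F(x)| \le \max_j |F_n(x_j) - F(x_j)| + 1/k$, and each grid deviation vanishes almost surely by the strong law of large numbers. A numerical illustration of that bound, assuming $F$ is the Uniform(0,1) CDF (the grid choice $F(x_j) = j/k$ and all names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5000, 20
samples = np.sort(rng.uniform(size=n))

cdf = lambda x: np.clip(x, 0.0, 1.0)                            # F for Uniform(0,1)
ecdf = lambda x: np.searchsorted(samples, x, side="right") / n  # F_n

# Grid points x_j with F(x_j) = j/k; for Uniform(0,1) these are simply j/k.
grid = np.arange(1, k) / k
grid_dev = np.abs(ecdf(grid) - cdf(grid)).max()

# Monotonicity bound: sup_x |F_n(x) - F(x)| <= grid_dev + 1/k.
dense = np.linspace(0.0, 1.0, 20001)          # fine approximation of the sup
sup_dev = np.abs(ecdf(dense) - cdf(dense)).max()
print(sup_dev <= grid_dev + 1 / k)  # True
```

Since $k$ is arbitrary, letting the grid deviation go to zero for each fixed $k$ and then taking $k \to \infty$ yields the almost-sure uniform convergence.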
