Cramér's theorem (large deviations)

Cramér's theorem is a fundamental result in the theory of large deviations, a subdiscipline of probability theory. It gives the rate function governing the large-deviation behaviour of the empirical mean of a sequence of iid random variables. A weak version of this result was first shown by Harald Cramér in 1938.

Statement

The logarithmic moment generating function (which is the cumulant-generating function) of a random variable is defined as

$$\Lambda(t) = \log \operatorname{E}[\exp(t X_1)].$$

Let $X_1, X_2, \dots$ be a sequence of iid real random variables with finite logarithmic moment generating function, i.e. $\Lambda(t) < \infty$ for all $t \in \mathbb{R}$. Then for every $x > \operatorname{E}[X_1]$,

$$\lim_{n \to \infty} \frac{1}{n} \log P\!\left(\sum_{i=1}^{n} X_i \geq n x\right) = -\Lambda^*(x),$$

where $\Lambda^*$ is the Legendre transform of $\Lambda$,

$$\Lambda^*(x) = \sup_{t \in \mathbb{R}} \bigl( t x - \Lambda(t) \bigr).$$

In the terminology of the theory of large deviations the result can be reformulated as follows: if $X_1, X_2, \dots$ is a sequence of iid random variables, then the distributions $\bigl( \mathcal{L}\bigl( \tfrac{1}{n} \sum_{i=1}^{n} X_i \bigr) \bigr)_{n \in \mathbb{N}}$ satisfy a large deviation principle with rate function $\Lambda^*$.
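As a worked illustration (the specific distribution is an assumption for this example and is not taken from the article), the rate function can be computed explicitly when the $X_i$ are standard normal:

$$\Lambda(t) = \log \operatorname{E}[\exp(t X_1)] = \frac{t^2}{2}, \qquad \Lambda^*(x) = \sup_{t \in \mathbb{R}} \Bigl( t x - \frac{t^2}{2} \Bigr) = \frac{x^2}{2},$$

with the supremum attained at $t = x$. Cramér's theorem then gives, for every $x > 0 = \operatorname{E}[X_1]$,

$$\lim_{n \to \infty} \frac{1}{n} \log P\!\left(\sum_{i=1}^{n} X_i \geq n x\right) = -\frac{x^2}{2},$$

so the probability that the empirical mean exceeds $x$ decays exponentially in $n$ at rate $x^2/2$.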

References

Klenke, Achim (2008). Probability Theory. Berlin: Springer. p. 508. doi:10.1007/978-1-84800-048-3. ISBN 978-1-84800-047-6.
"Cramér theorem", Encyclopedia of Mathematics, EMS Press, 2001 [1994].
