Laplace principle (large deviations theory)
In mathematics, Laplace's principle is a basic theorem in large deviations theory which is similar to Varadhan's lemma. It gives an asymptotic expression for the Lebesgue integral of exp(−θφ(x)) over a fixed set A as θ becomes large. Such expressions can be used, for example, in statistical mechanics to determine the limiting behaviour of a system as the temperature tends to absolute zero.
Statement of the result
Let A be a Lebesgue-measurable subset of d-dimensional Euclidean space Rd and let φ : Rd → R be a measurable function with

\[
\int_A e^{-\varphi(x)} \, \mathrm{d}x < +\infty .
\]

Then

\[
\lim_{\theta \to \infty} \frac{1}{\theta} \log \int_A e^{-\theta \varphi(x)} \, \mathrm{d}x = - \operatorname*{ess\,inf}_{x \in A} \varphi(x),
\]

where ess inf denotes the essential infimum. Heuristically, this may be read as saying that for large θ,

\[
\int_A e^{-\theta \varphi(x)} \, \mathrm{d}x \approx \exp\!\Bigl( -\theta \operatorname*{ess\,inf}_{x \in A} \varphi(x) \Bigr).
\]
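A minimal numerical sketch of this limit (not drawn from the cited reference; the choices φ(x) = x² and A = [1, 2] are illustrative assumptions): since the essential infimum of x² over [1, 2] is 1, the quantity (1/θ) log ∫_A e^{−θx²} dx should approach −1 as θ grows.

```python
import numpy as np

def scaled_log_integral(theta, a=1.0, b=2.0, n=200_000):
    """(1/theta) * log of the integral of exp(-theta * x**2) over [a, b],
    computed with a midpoint Riemann sum on a uniform grid."""
    dx = (b - a) / n
    x = a + dx * (np.arange(n) + 0.5)   # midpoints of the n subintervals
    return np.log(np.sum(np.exp(-theta * x**2)) * dx) / theta

# ess inf of x**2 over [1, 2] is 1, so the values should approach -1.
for theta in (1, 10, 100, 500):
    print(f"theta = {theta:4d}:  {scaled_log_integral(theta):+.2f}")
# Prints values approaching -1 (roughly -2.00, -1.30, -1.05, -1.01).
```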
Application
The Laplace principle can be applied to the family of probability measures Pθ given by

\[
\mathbf{P}_\theta(A) = \left( \int_A e^{-\theta \varphi(x)} \, \mathrm{d}x \right) \left( \int_{\mathbf{R}^d} e^{-\theta \varphi(y)} \, \mathrm{d}y \right)^{-1}
\]

to give an asymptotic expression for the probability of some event A as θ becomes large. For example, if X is a standard normally distributed random variable on R, then

\[
\lim_{\varepsilon \downarrow 0} \varepsilon \log \mathbf{P} \bigl[ \sqrt{\varepsilon}\, X \in A \bigr] = - \operatorname*{ess\,inf}_{x \in A} \frac{x^2}{2}
\]

for every measurable set A.
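A corresponding numerical sketch of the Gaussian example (the set A = [2, 3] is an illustrative assumption, not taken from the article): here the essential infimum of x²/2 over A is 2, so ε log P[√ε X ∈ A] should tend to −2 as ε decreases to 0.

```python
import numpy as np
from scipy.stats import norm

def scaled_log_prob(eps, a=2.0, b=3.0):
    """eps * log P[ sqrt(eps) * X in [a, b] ] for X ~ N(0, 1)."""
    # P[sqrt(eps) X in [a, b]] = P[X > a/sqrt(eps)] - P[X > b/sqrt(eps)],
    # evaluated with the survival function for accuracy in the tail.
    p = norm.sf(a / np.sqrt(eps)) - norm.sf(b / np.sqrt(eps))
    return eps * np.log(p)

# ess inf of x**2 / 2 over [2, 3] is 2, so the values should approach -2.
for eps in (1.0, 0.1, 0.01, 0.003):
    print(f"eps = {eps:5.3f}:  {scaled_log_prob(eps):+.2f}")
# Prints values approaching -2 (roughly -3.84, -2.28, -2.04, -2.01).
```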
References
- Dembo, Amir; Zeitouni, Ofer (1998). Large deviations techniques and applications. Applications of Mathematics (New York) 38 (Second ed.). New York: Springer-Verlag. pp. xvi+396. ISBN 0-387-98406-2. MR1619036