Concentration Inequalities #
Pure mathematical infrastructure for concentration inequalities. No learning-theory types.
Main definitions #
- `BoundedRandomVariable`: typeclass for random variables bounded in `[a, b]` a.e.
- `chebyshev_majority_bound`: Chebyshev-based majority bound for independent events
Main results #
chebyshev_majority_bound: if k ≥ 9/δ independent events each have probability ≥ 2/3, then the probability that strictly more than k/2 of them hold is ≥ 1-δ.
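The constant `9/δ` can be motivated by the standard Chebyshev computation named in the docstring below; the following is a sketch of that argument, writing $S$ for the number of events that hold (the symbols $X_j$ and $S$ are local to this sketch, not names from the source).

```latex
Let $X_j = \mathbf{1}_{\mathrm{events}_j}$ and $S = \sum_{j=1}^{k} X_j$. Then
\[
  \mathbb{E}[S] \ge \tfrac{2k}{3}, \qquad
  \operatorname{Var}[S] = \sum_{j=1}^{k} \operatorname{Var}[X_j] \le \tfrac{k}{4},
\]
using independence for the variance of the sum and Popoviciu's bound
$\operatorname{Var}[X_j] \le \tfrac14$. Since $S \le k/2$ forces
$|S - \mathbb{E}[S]| \ge \mathbb{E}[S] - k/2 \ge k/6$, Chebyshev's inequality gives
\[
  \mathbb{P}\!\left(S \le \tfrac{k}{2}\right)
  \le \frac{\operatorname{Var}[S]}{(\mathbb{E}[S] - k/2)^2}
  \le \frac{k/4}{(k/6)^2} = \frac{9}{k} \le \delta
  \quad\text{when } k \ge \tfrac{9}{\delta}.
\]
```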
References #
- Boucheron, Lugosi, Massart, "Concentration Inequalities", Chapter 2
```lean
class BoundedRandomVariable {Ω : Type u_1} [MeasurableSpace Ω] (f : Ω → ℝ)
    (μ : MeasureTheory.Measure Ω) (a b : ℝ)
```

A random variable `f : Ω → ℝ` is bounded in `[a, b]` almost everywhere.

- `measurable : Measurable f`

Instances
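Only the `measurable` field survives in the listing above. A plausible reconstruction of the full class, in which the second field's name `bounded` and its exact statement are assumptions rather than the source's definition, might read:

```lean
-- Hypothetical sketch: only `measurable` appears in the source listing;
-- the `bounded` field's name and statement are assumptions.
class BoundedRandomVariable {Ω : Type*} [MeasurableSpace Ω] (f : Ω → ℝ)
    (μ : MeasureTheory.Measure Ω) (a b : ℝ) : Prop where
  measurable : Measurable f
  bounded : ∀ᵐ ω ∂μ, f ω ∈ Set.Icc a b
```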
```lean
theorem chebyshev_majority_bound {Ω : Type u_1} [MeasurableSpace Ω]
    {μ : MeasureTheory.Measure Ω} [MeasureTheory.IsProbabilityMeasure μ]
    {k : ℕ} {δ : ℝ} (h_delta_pos : 0 < δ) (hk : 9 / δ ≤ ↑k)
    (events : Fin k → Set Ω)
    (hevents_meas : ∀ (j : Fin k), MeasurableSet (events j))
    (hindep : ProbabilityTheory.iIndepSet (fun (j : Fin k) => events j) μ)
    (hprob : ∀ (j : Fin k), μ (events j) ≥ ENNReal.ofReal (2 / 3)) :
```
Chebyshev majority bound: if k ≥ 9/δ independent events each have probability ≥ 2/3, then the probability that strictly more than k/2 of them hold is ≥ 1-δ. Uses indicator random variables, Popoviciu's variance bound, independence for variance of sums, and Chebyshev's inequality.
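The Popoviciu step cited in the docstring reduces, for the indicator variables used in this proof, to the familiar Bernoulli variance bound; a sketch of that reduction:

```latex
For an event $A$ with $p = \mathbb{P}(A)$ and indicator $X = \mathbf{1}_A$,
\[
  \operatorname{Var}[X] = \mathbb{E}[X^2] - (\mathbb{E}[X])^2
  = p - p^2 = p(1-p) \le \tfrac14,
\]
which is Popoviciu's inequality
$\operatorname{Var}[X] \le \tfrac{(b-a)^2}{4}$ specialized to $[a, b] = [0, 1]$.
```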