FLT_Proofs.Learner.Bayesian

Bayesian Inference and Learners #

BayesianInference bundles a prior, a likelihood model, and posterior computation. BayesianLearner extends BatchLearner with Bayesian inference machinery. GibbsPosterior adds an inverse-temperature parameter for PAC-Bayes bound optimization.

structure BayesianInference (X : Type u) (Y : Type v) [MeasurableSpace X] :
Type (max u v)

Bayesian inference: bundles prior, likelihood model, and posterior computation.

  • prior : Concept X Y → ℝ

    Prior distribution over hypotheses

  • likelihood : Concept X Y → X × Y → ℝ

    Likelihood: probability of data given hypothesis

  • posterior (h : Concept X Y) (data : List (X × Y)) : ℝ

    Posterior: prior(h) × ∏ likelihood(h, dᵢ). Unnormalized; the normalization constant Z = Σ_h' prior(h') × ∏ likelihood(h', dᵢ) is omitted because computing it requires summing over all hypotheses (which may be uncountable). Downstream definitions that need a proper probability must normalize explicitly. This is the standard "unnormalized posterior" used in computational Bayesian inference.

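The unnormalized posterior described above can be sketched in Lean. This is a hypothetical illustration, not the library's actual definition: the names `unnormalizedPosterior` and `normalizedPosterior`, the type variables `H` and `D`, and the Mathlib import are all assumptions made for the example; the library's fields take `Concept X Y` and `X × Y` instead.

```lean
import Mathlib

variable {H D : Type*}

/-- Hypothetical sketch of the unnormalized posterior:
`prior h * ∏ᵢ likelihood h dᵢ` over a list of observations. -/
noncomputable def unnormalizedPosterior (prior : H → ℝ)
    (likelihood : H → D → ℝ) (h : H) (data : List D) : ℝ :=
  prior h * (data.map (likelihood h)).prod

/-- Normalizing explicitly is only possible over a finite hypothesis
set `hs`, which is why the structure omits the constant `Z`. -/
noncomputable def normalizedPosterior (hs : Finset H) (prior : H → ℝ)
    (likelihood : H → D → ℝ) (h : H) (data : List D) : ℝ :=
  unnormalizedPosterior prior likelihood h data /
    ∑ h' ∈ hs, unnormalizedPosterior prior likelihood h' data
```

The split mirrors the design choice in the docstring: the unnormalized form is always computable in principle, while normalization requires an enumerable hypothesis space.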
Instances For

structure BayesianLearner (X : Type u) (Y : Type v) [MeasurableSpace X] :
Type (max u v)

A Bayesian learner carries a prior and updates via Bayes' rule.

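Bayes-rule updating can be illustrated with a short Lean sketch. The names `bayesStep` and `bayesBatch` are hypothetical, as is modeling hypotheses and data by abstract types `H` and `D`; the point is only that the (unnormalized) posterior after one observation serves as the prior for the next, so batch updating is a fold.

```lean
import Mathlib

variable {H D : Type*}

/-- Hypothetical sketch of one Bayes-rule update: reweight the prior
by the likelihood of the observed datum `d` (unnormalized). -/
noncomputable def bayesStep (likelihood : H → D → ℝ)
    (prior : H → ℝ) (d : D) : H → ℝ :=
  fun h => prior h * likelihood h d

/-- Batch update: fold the single-datum step over the data list. -/
noncomputable def bayesBatch (likelihood : H → D → ℝ)
    (prior : H → ℝ) (data : List D) : H → ℝ :=
  data.foldl (bayesStep likelihood) prior
```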
Instances For

structure GibbsPosterior (X : Type u) (Y : Type v) [MeasurableSpace X] :
Type (max u v)

Gibbs posterior: a Bayesian learner that uses a tempered posterior, as in PAC-Bayes bound optimization.

  • base : BayesianLearner X Y

    Base Bayesian learner

  • lambda : ℝ

    Inverse temperature parameter

  • lambda_pos : 0 < self.lambda

    Temperature is positive

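The tempered posterior can be sketched as follows. This is a hypothetical illustration: the name `gibbsDensity`, the abstract types `H` and `D`, and the use of a generic loss function are assumptions, chosen to show how `lambda` enters the density.

```lean
import Mathlib

variable {H D : Type*}

/-- Hypothetical sketch of the Gibbs (tempered) posterior density,
unnormalized: `prior h * exp (-λ * Σᵢ loss h dᵢ)`. Taking
`loss h d = -Real.log (likelihood h d)` with `λ = 1` recovers the
ordinary Bayes posterior; larger `λ` concentrates the posterior on
low-loss hypotheses. -/
noncomputable def gibbsDensity (prior : H → ℝ) (loss : H → D → ℝ)
    (lam : ℝ) (h : H) (data : List D) : ℝ :=
  prior h * Real.exp (-lam * (data.map (loss h)).sum)
```

The positivity field `lambda_pos` matters here: with `λ ≤ 0` the exponential reweighting would favor high-loss hypotheses and the PAC-Bayes bound being optimized would not apply.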
Instances For