Bayesian Inference and Learners #
BayesianInference bundles prior, likelihood, and posterior computation. BayesianLearner extends BatchLearner with Bayesian inference machinery. GibbsPosterior adds a temperature parameter for PAC-Bayes bound optimization.
Bayesian inference: bundles prior, likelihood model, and posterior computation.
Prior distribution over hypotheses
Likelihood: probability of data given hypothesis
Posterior: prior(h) × ∏ᵢ likelihood(h, dᵢ). Unnormalized; the normalization constant Z = Σ_h' prior(h') × ∏ᵢ likelihood(h', dᵢ) is omitted because computing it requires summing (or integrating) over the entire hypothesis space, which may be uncountable. Downstream definitions that need a proper probability distribution must normalize explicitly. This is the standard "unnormalized posterior" used in computational Bayesian inference.
Instances For
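The bundled structure and its unnormalized posterior can be sketched as follows. The structure name, field names, and types below are illustrative assumptions about the formalization, not its actual signatures.

```lean
import Mathlib

/-- Illustrative sketch (assumed names): a prior over hypotheses `H`
and a likelihood of a datum `D` given a hypothesis. -/
structure BayesianInferenceSketch (H D : Type*) where
  /-- Prior distribution over hypotheses. -/
  prior : H → ℝ
  /-- Probability of a datum given a hypothesis. -/
  likelihood : H → D → ℝ

/-- Unnormalized posterior: `prior h * ∏ᵢ likelihood h dᵢ`.
The normalization constant `Z` is deliberately omitted. -/
def BayesianInferenceSketch.posterior {H D : Type*}
    (I : BayesianInferenceSketch H D) (data : List D) (h : H) : ℝ :=
  I.prior h * (data.map (I.likelihood h)).prod
```

Representing data as a `List D` makes the product over observations a plain `List.prod`; a finitely supported or measure-theoretic encoding would work equally well.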
A Bayesian learner carries a prior and updates via Bayes' rule.
- hypotheses : HypothesisSpace X Y
The hypothesis space
- inference : BayesianInference X Y
Bayesian inference engine
MAP learner: outputs a maximum a posteriori hypothesis
The output lies in the hypothesis space
Instances For
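The MAP condition can be phrased as attaining the maximum of the unnormalized posterior over the hypothesis space. The predicate below is an assumed rendering for illustration, not the file's actual definition.

```lean
import Mathlib

/-- Assumed sketch: `h` is a MAP hypothesis over `S` for the
unnormalized posterior `post` iff it attains the maximum of
`post` on `S`. -/
def IsMAP {H : Type*} (S : Set H) (post : H → ℝ) (h : H) : Prop :=
  h ∈ S ∧ ∀ h' ∈ S, post h' ≤ post h
```

Because the omitted normalization constant Z is a positive factor shared by all hypotheses, maximizing the unnormalized posterior yields the same argmax set as maximizing the true posterior, so no normalization is needed for MAP.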
Gibbs posterior: a Bayesian learner that uses a tempered posterior, as in PAC-Bayes bound optimization.
- base : BayesianLearner X Y
Base Bayesian learner
- lambda : ℝ
Inverse temperature parameter
The inverse temperature is positive
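The tempered density can be sketched as the prior reweighted by an exponential of the (negated, scaled) empirical loss. The names `gibbsDensity` and `lossSum` below are hypothetical; the positivity hypothesis `hlam` mirrors the field above.

```lean
import Mathlib

/-- Assumed sketch of the Gibbs posterior density, up to normalization:
the prior reweighted by an exponentially tempered empirical loss,
`prior h * exp (-λ * lossSum h)`, with inverse temperature `λ > 0`. -/
noncomputable def gibbsDensity {H : Type*}
    (prior : H → ℝ) (lossSum : H → ℝ)
    (lam : ℝ) (hlam : 0 < lam) (h : H) : ℝ :=
  prior h * Real.exp (-lam * lossSum h)
```

Two limiting behaviors make the temperature's role concrete: as λ → 0 the density reduces to the prior, and as λ → ∞ it concentrates on empirical-loss minimizers. By the Donsker–Varadhan variational formula, the (normalized) Gibbs posterior is exactly the distribution ρ minimizing λ·E_ρ[L̂] + KL(ρ ‖ prior), which is why it optimizes PAC-Bayes bounds.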