FLT_Proofs.Criterion.Extended

Extended and Cross-Paradigm Criteria #

This module covers EX-learning under drift, universal learning, and Bayesian criteria (posterior consistency, PAC-Bayes bounds, and information-theoretic bounds).

def EXUnderDrift (X : Type u) (C : ConceptClass X Bool) (_driftRate : ) :

EX-learning under drift: the target concept changes over time. The learner must track the drifting target. At each time step t, the current target is targets(t), and the learner must eventually output a hypothesis matching the current target before it drifts again.

Equations
  • One or more equations did not get rendered due to their size.
Instances For
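Since the equation itself is not rendered above, the shape of the definition can be sketched in Lean as follows. This is an illustrative reconstruction, not the file's actual code: `EXUnderDriftSketch`, `targets`, and `learner` are assumed names, and the blockwise drift schedule is one plausible reading of "before it drifts again".

```lean
import Mathlib

-- Illustrative sketch only: the real EXUnderDrift equation is not rendered.
-- The target is held fixed on blocks of length `driftRate`; within each
-- block the learner must, at some step, output exactly the current target.
def EXUnderDriftSketch {X : Type*} (C : Set (X → Bool)) (driftRate : ℕ)
    (targets : ℕ → (X → Bool)) (learner : ℕ → (X → Bool)) : Prop :=
  (∀ t, targets t ∈ C) ∧
    ∀ k : ℕ, ∃ t, k * driftRate ≤ t ∧ t < (k + 1) * driftRate ∧
      learner t = targets (k * driftRate)
```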

    Universal learning: distribution-free convergence rates. Strictly stronger than PAC.

    Sample space: Fin m → X with i.i.d. product measure D^m (matching PACLearnable). Labels: derived deterministically from target concept c (realizable case). The rate function converges to 0, and for every m, with probability ≥ 2/3 over D^m, the learner's error is at most rate(m).

    Γ₄₈ fix: changed from existential Dm to Measure.pi (CNA₁₁ definition repair).

    Equations
    • One or more equations did not get rendered due to their size.
    Instances For
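The prose above can be sketched in Lean as follows, under stated assumptions: `UniversallyLearnableSketch`, `learner`, and `err` are hypothetical names, while the rate function tending to 0, the ≥ 2/3 success probability, and the `Measure.pi` product sample space all come from the description above.

```lean
import Mathlib
open MeasureTheory Filter

-- Illustrative sketch: distribution-free rates. A single rate function,
-- tending to 0, bounds the learner's error with probability ≥ 2/3 under
-- every distribution D, with samples drawn from the product measure D^m.
def UniversallyLearnableSketch {X : Type*} [MeasurableSpace X]
    (C : Set (X → Bool))
    (learner : ∀ m : ℕ, (Fin m → X) → (X → Bool))
    (err : Measure X → (X → Bool) → (X → Bool) → ℝ) : Prop :=
  ∃ rate : ℕ → ℝ,
    Tendsto rate atTop (nhds 0) ∧
      ∀ c ∈ C, ∀ D : Measure X, ∀ m : ℕ,
        (2 : ℝ) / 3 ≤
          ((Measure.pi fun _ : Fin m => D)
            {S | err D c (learner m S) ≤ rate m}).toReal
```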

      PAC learning with concept-dependent advice: for every target concept c in C, there exists an advice value a(c) in A such that the advice-augmented learner achieves PAC learning of C when given advice a(c). The advice may depend on the target concept but not on the distribution or the sample.

      When A is finite, advice can be eliminated (Ben-David & Dichterman 1998): PACLearnableWithAdvice X C A → PACLearnable X C (see advice_elimination).

      Equations
      • One or more equations did not get rendered due to their size.
      Instances For
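A minimal Lean sketch of the advice criterion, with `pacWith c a` standing in (as an assumed predicate, not the file's definition) for "the advice-augmented learner PAC-learns C on target c when given advice a":

```lean
import Mathlib

-- Illustrative sketch: advice may depend on the target concept c,
-- but not on the distribution or the sample.
def PACLearnableWithAdviceSketch {X : Type*} (C : Set (X → Bool)) (A : Type*)
    (pacWith : (X → Bool) → A → Prop) : Prop :=
  ∀ c ∈ C, ∃ a : A, pacWith c a
```

When `A` is finite, one standard route to eliminating the advice is to enumerate the finitely many advice values and select among the resulting candidate hypotheses, which is the general shape of the `advice_elimination` result cited above.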

        Bayesian Criteria #

        Posterior consistency: as sample size grows, the posterior concentrates on the true parameter/concept.

        Equations
        Instances For
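Since the equation is not rendered, here is an illustrative Lean sketch of posterior consistency; `PosteriorConsistentSketch` and the parameter names are assumptions, not the file's code.

```lean
import Mathlib
open MeasureTheory Filter

-- Illustrative sketch: the posterior after n observations assigns mass
-- tending to 1 to every measurable neighborhood of the true parameter θ₀.
def PosteriorConsistentSketch {Θ : Type*} [TopologicalSpace Θ] [MeasurableSpace Θ]
    (posterior : ℕ → Measure Θ) (θ₀ : Θ) : Prop :=
  ∀ U ∈ nhds θ₀, MeasurableSet U →
    Tendsto (fun n => (posterior n U).toReal) atTop (nhds 1)
```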
          def PACBayesBound (X : Type u) (Y : Type v) [MeasurableSpace X] [MeasurableSpace Y] (prior posterior : Concept X Y) (S : IIDSample X Y) :

          PAC-Bayes bound: generalization error bounded by the prior-posterior KL divergence. For any data-dependent posterior Q over hypotheses and any prior P (chosen before seeing data), with probability ≥ 1 - δ over S ~ D^m:

          E_{h~Q}[err_D(h)] ≤ E_{h~Q}[err_S(h)] + √((KL(Q ‖ P) + ln(m/δ)) / (2m))

          We express this as three conjuncts: absolute continuity, finiteness of the KL divergence, and the bound inequality itself.

          Equations
          • One or more equations did not get rendered due to their size.
          Instances For
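The three conjuncts named above can be sketched in Lean as follows; this is an assumed shape, with `kl`, `errD`, and `errS` supplied as parameters rather than computed as in the file's actual (unrendered) definition.

```lean
import Mathlib
open MeasureTheory

-- Illustrative sketch of the three conjuncts: absolute continuity Q ≪ P,
-- finiteness of KL(Q ‖ P), and the PAC-Bayes inequality itself.
def PACBayesBoundSketch {H : Type*} [MeasurableSpace H]
    (P Q : Measure H) (kl : ENNReal) (errD errS : ℝ) (m : ℕ) (δ : ℝ) : Prop :=
  Q ≪ P ∧ kl ≠ ⊤ ∧
    errD ≤ errS +
      Real.sqrt ((kl.toReal + Real.log ((m : ℝ) / δ)) / (2 * (m : ℝ)))
```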

            Information-theoretic bound: generalization bounds based on mutual information.

            Equations
            Instances For
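One common shape for such a bound is the Xu–Raginsky mutual-information bound for σ-sub-Gaussian losses, sketched below; the file's actual statement is not rendered, and `genGap` and `mutualInfo` here are assumed inputs rather than definitions from this module.

```lean
import Mathlib

-- Illustrative sketch: expected generalization gap bounded by
-- √(2σ² · I(S; h) / m), the Xu–Raginsky form for σ-sub-Gaussian losses.
def InfoTheoreticBoundSketch (genGap mutualInfo σ : ℝ) (m : ℕ) : Prop :=
  |genGap| ≤ Real.sqrt (2 * σ ^ 2 * mutualInfo / (m : ℝ))
```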