Formal Learning Theory Kernel: Blueprint v1

3.6 Computational Interlude

The Fundamental Theorem is an information-theoretic characterization: it says when enough data exists to learn, regardless of computational cost. The computational question (when can learning be done in polynomial time?) is a separate and largely open problem.

Theorem 3.26 (Computational Hardness).

Under standard cryptographic assumptions (specifically, the existence of one-way functions), there exist concept classes \(\mathcal{C}\) with \(\mathrm{VCdim}(\mathcal{C}) = O(\log n)\) that are PAC learnable (information-theoretically) but not efficiently PAC learnable (no polynomial-time algorithm achieves PAC guarantees).

This result separates the information-theoretic and computational landscapes of PAC learning. The Fundamental Theorem characterizes the information boundary; it says nothing about the computational one.
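The flavor of this separation can be seen in a toy sketch (an illustration of exhaustive ERM, not the cryptographic construction behind Theorem 3.26): for the class of parity functions over \(\{0,1\}^n\), the VC dimension is \(n\), so \(O(n)\) samples suffice information-theoretically, yet naive ERM enumerates all \(2^n\) hypotheses and its running time is exponential in \(n\).

```python
import itertools
import random

def parity(a, x):
    # <a, x> mod 2
    return sum(ai & xi for ai, xi in zip(a, x)) % 2

def erm_brute_force(sample, n):
    """Exhaustive ERM over the parity class {x -> <a,x> mod 2 : a in {0,1}^n}.
    Examines all 2^n hypotheses, so time is exponential in n, even though
    the class has VC dimension n and O(n) samples already suffice."""
    best_a, best_err = None, float("inf")
    for a in itertools.product([0, 1], repeat=n):
        err = sum(parity(a, x) != y for x, y in sample)
        if err < best_err:
            best_a, best_err = a, err
    return best_a, best_err

n = 8
target = tuple(random.getrandbits(1) for _ in range(n))
sample = []
for _ in range(3 * n):  # O(n) examples, in line with the VC bound
    x = tuple(random.getrandbits(1) for _ in range(n))
    sample.append((x, parity(target, x)))

a_hat, err = erm_brute_force(sample, n)
assert err == 0  # realizable case: the target itself has zero empirical error
```

Parities themselves are not a hard class: in the noiseless case they can be learned efficiently by Gaussian elimination, and the point of the sketch is only that generic ERM can be exponentially slower than the sample complexity suggests. (With classification noise the problem becomes Learning Parities with Noise, which is conjectured to be computationally hard.)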

The gap between information and computation is the subject of the textbook’s Chapter 16 (Computational Hardness). The key examples:

  • Properly learning DNF formulas is computationally hard under cryptographic assumptions, even though DNF has polynomial VC dimension.

  • Improperly learning DNF can be done efficiently via boosting (using a richer hypothesis class).

  • Learning intersections of halfspaces is hard even improperly, under stronger assumptions.

The proper/improper distinction (Definition 2.2) is computationally sharp even when it is information-theoretically irrelevant.
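A classic instance of this sharpness is the Pitt–Valiant example, sketched below under assumptions of my own (the class names and helper functions are illustrative): properly learning 3-term DNF is NP-hard, but every 3-term DNF is equivalent to a 3-CNF, and conjunctions of 3-literal clauses can be learned improperly by elimination in polynomial time.

```python
import itertools

def lit_val(lit, x):
    # literal (i, s): true iff bit i of x equals s (s=1 positive, s=0 negated)
    i, s = lit
    return x[i] == s

def clause_val(clause, x):
    return any(lit_val(l, x) for l in clause)

def learn_3cnf(positives, n):
    """Improper learner for 3-term DNF via its 3-CNF representation.
    Start with all O(n^3) clauses of 3 distinct literals and delete any
    clause falsified by a positive example. The surviving clauses are a
    superset of the target's CNF clauses, so their conjunction implies
    the target: the hypothesis never errs on a negative example."""
    lits = [(i, s) for i in range(n) for s in (0, 1)]
    clauses = list(itertools.combinations(lits, 3))
    for x in positives:
        clauses = [c for c in clauses if clause_val(c, x)]
    return clauses

def predict(clauses, x):
    return all(clause_val(c, x) for c in clauses)

# Hypothetical target: the 3-term DNF (x0 AND x1) OR (NOT x2) OR (x3 AND x4)
n = 6
terms = [[(0, 1), (1, 1)], [(2, 0)], [(3, 1), (4, 1)]]
target = lambda x: any(all(lit_val(l, x) for l in t) for t in terms)

xs = list(itertools.product([0, 1], repeat=n))
positives = [x for x in xs if target(x)]
h = learn_3cnf(positives, n)
assert all(predict(h, x) for x in positives)          # consistent on positives
assert all(target(x) for x in xs if predict(h, x))    # no false positives
```

The hypothesis is a 3-CNF, not a 3-term DNF, which is exactly what makes the learner improper: the richer hypothesis class turns an NP-hard search into a simple elimination over polynomially many clauses.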