Preprints, Working Papers, ... Year: 2021

On Margins and Derandomisation in PAC-Bayes

Abstract

We develop a framework for derandomising PAC-Bayesian generalisation bounds achieving a margin on training data, relating this process to the concentration-of-measure phenomenon. We apply these tools to linear prediction, single-hidden-layer neural networks with an unusual erf activation function, and deep ReLU networks, obtaining new bounds. The approach is also extended to the idea of "partial-derandomisation", where only some layers are derandomised while the others remain stochastic. This allows empirical evaluation of single-hidden-layer networks on more complex datasets, and helps bridge the gap between generalisation bounds for non-stochastic deep networks and those for randomised deep networks as generally examined in PAC-Bayes.
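As background for the notions named above (a generic sketch, not a statement taken from the paper), the empirical margin loss of a predictor f on a binary-classification sample S = {(x_i, y_i)}_{i=1}^n with labels y_i ∈ {−1, +1} and margin level γ > 0 can be written as

\[
\widehat{L}^{\gamma}_{S}(f) \;=\; \frac{1}{n}\sum_{i=1}^{n} \mathbf{1}\!\left[\, y_i f(x_i) \le \gamma \,\right],
\]

and a standard PAC-Bayesian bound on the randomised (Gibbs) predictor, in the form due to Maurer (2004), states that for a prior P fixed before seeing S, with probability at least 1 − δ over the draw of S, simultaneously for all posteriors Q,

\[
\mathrm{kl}\!\left( \mathbb{E}_{f\sim Q}\,\widehat{L}_{S}(f) \;\Big\|\; \mathbb{E}_{f\sim Q}\, L(f) \right)
\;\le\; \frac{\mathrm{KL}(Q \,\|\, P) + \ln\!\left(2\sqrt{n}/\delta\right)}{n},
\]

where kl denotes the Bernoulli relative entropy and L(f) the population risk. Derandomisation, in the sense used in the abstract, converts such a bound on the Q-average into a bound on a single deterministic predictor; loosely speaking, the margin achieved on the training data absorbs the discrepancy between the stochastic predictor and its deterministic counterpart. The notation above is assumed for illustration only and need not match the paper's own statements.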
Main file: 2107.03955.pdf (992.24 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03282597, version 1 (09-07-2021)
hal-03282597, version 2 (24-02-2022)

Identifiers

HAL Id: hal-03282597
arXiv: 2107.03955

Cite

Felix Biggs, Benjamin Guedj. On Margins and Derandomisation in PAC-Bayes. 2021. ⟨hal-03282597v1⟩