A note on generalization bounds for losses with finite moments
Borja Rodríguez-Gálvez, KTH Royal Institute of Technology, Sweden; Omar Rivasplata, UCL, United Kingdom; Ragnar Thobaben, Mikael Skoglund, KTH Royal Institute of Technology, Sweden
Session: Generalization Bounds
Track: 8: Machine Learning
Location: Ballroom II & III
Presentation Time: Thu, 11 Jul, 16:45 - 17:05
Session Chair: Abdellatif Zaidi
Abstract
This paper studies the truncation method from [1] to derive high-probability PAC-Bayes bounds for unbounded losses with heavy tails. Assuming that the $p$-th moment of the loss is bounded, the resulting bounds interpolate between a slow rate $\nicefrac{1}{\sqrt{n}}$ when $p=2$ and a fast rate $\nicefrac{1}{n}$ when $p \to \infty$ and the loss is essentially bounded. Moreover, the paper derives a high-probability PAC-Bayes bound for losses with bounded variance. This bound has an exponentially better dependence on the confidence parameter and on the dependency measure than previous bounds in the literature. Finally, the paper extends all results to guarantees in expectation and to single-draw PAC-Bayes. In order to do so, it obtains analogues of the PAC-Bayes fast-rate bound for bounded losses from [2] in these settings. The full version of the paper can be found at https://arxiv.org/abs/2403.16681.
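As a rough illustration of the stated interpolation (a sketch only; the precise constants, complexity terms, and truncation thresholds are in the full paper), a bound on the population risk with a bounded $p$-th moment typically decays with the sample size $n$ at rate

$$
n^{-\frac{p-1}{p}},
$$

so that $p = 2$ recovers the slow rate $\nicefrac{1}{\sqrt{n}}$ (exponent $\nicefrac{1}{2}$) and $p \to \infty$ recovers the fast rate $\nicefrac{1}{n}$ (exponent $1$), matching the two endpoints described in the abstract.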