
Generalization bounds for uniformly stable algorithms


Presentation Transcript


1. Generalization bounds for uniformly stable algorithms
Vitaly Feldman (Google Brain), with Jan Vondrak

2. Uniform stability
• Domain Z (e.g. Z = X × Y)
• Dataset S = (z_1, …, z_n) ∈ Z^n
• Learning algorithm A : Z^n → W
• Loss function ℓ : W × Z → [0, 1]
• Uniform stability [Bousquet, Elisseeff 02]: A has uniform stability γ w.r.t. ℓ if for all neighboring S, S′ ∈ Z^n (datasets differing in a single example) and all z ∈ Z:
|ℓ(A(S), z) − ℓ(A(S′), z)| ≤ γ
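
As a quick illustration (mine, not from the talk), here is a minimal Python sketch that estimates a lower bound on the uniform stability of a given algorithm by swapping out single examples; `A`, `loss`, `z_pool`, and `test_points` are hypothetical stand-ins for the objects defined above.

    def empirical_stability(A, loss, S, z_pool, test_points):
        """Estimate gamma by maximizing |loss(A(S), z) - loss(A(S'), z)|
        over single-example swaps S -> S' and test points z.
        This only lower-bounds the true gamma (a sup over all S, S', z)."""
        w = A(S)
        gamma = 0.0
        for i in range(len(S)):
            for z_new in z_pool:
                S_swapped = list(S)
                S_swapped[i] = z_new      # neighboring dataset: one example replaced
                w_swapped = A(S_swapped)
                for z in test_points:
                    gamma = max(gamma, abs(loss(w, z) - loss(w_swapped, z)))
        return gamma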

3. Generalization error
• Probability distribution P over Z; S ∼ P^n
• Population loss: ℓ_P(w) = E_{z∼P}[ℓ(w, z)]
• Empirical loss: ℓ_S(w) = (1/n) Σ_{i=1}^n ℓ(w, z_i)
• Generalization error/gap: Δ_S(w) = ℓ_P(w) − ℓ_S(w)
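
A small sketch of these quantities in code, assuming hypothetical helpers `loss(w, z)` and `sample_P(k)` that draws k i.i.d. examples from P:

    import numpy as np

    def generalization_gap(w, S, loss, sample_P, k=100_000):
        empirical = np.mean([loss(w, z) for z in S])             # ell_S(w)
        population = np.mean([loss(w, z) for z in sample_P(k)])  # Monte Carlo estimate of ell_P(w)
        return population - empirical                            # Delta_S(w), up to O(1/sqrt(k)) MC error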

4. Stochastic convex optimization
• For all z ∈ Z, ℓ(w, z) is convex and 1-Lipschitz in w over the unit ball W = B_2^d(1)
• Goal: minimize ℓ_P(w) over w ∈ W
• For all P, A being SGD with rate η = 1/√n: E_S[ℓ_P(A(S))] ≤ min_{w∈W} ℓ_P(w) + O(1/√n)
• Uniform convergence error: Θ̃(√(d/n))
• ERM might have generalization error Ω(√(d/n)) [Shalev-Shwartz, Shamir, Srebro, Sridharan '09; Feldman 16]
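
A sketch of the SGD baseline from this slide: one pass of projected SGD over the unit ball with rate η = 1/√n. The subgradient oracle `grad(w, z)` is a hypothetical stand-in for the convex 1-Lipschitz loss.

    import numpy as np

    def project_unit_ball(w):
        norm = np.linalg.norm(w)
        return w if norm <= 1.0 else w / norm

    def sgd(S, grad, d):
        n = len(S)
        eta = 1.0 / np.sqrt(n)            # the rate from the slide
        w = np.zeros(d)
        avg = np.zeros(d)
        for z in S:                       # one pass: each example used once
            w = project_unit_ball(w - eta * grad(w, z))
            avg += w
        return avg / n                    # averaged iterate achieves excess loss O(1/sqrt(n))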

5. Stable optimization
• Strongly convex ERM [BE 02, SSSS 09]: A(S) = argmin_{w∈W} ℓ_S(w) + λ‖w‖² has uniform stability O(1/(λn)) and, with λ = 1/√n, minimizes ℓ_P within O(1/√n)
• Gradient descent on smooth losses [Hardt, Recht, Singer 16]: T steps of GD on ℓ_S with step size η has uniform stability O(ηT/n) and, with T = n and η = 1/√n, minimizes ℓ_P within O(1/√n)
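
A sketch of these two stable optimizers, with hypothetical gradient oracles `grad_reg(w, S, lam)` (gradient of ℓ_S(w) + λ‖w‖²) and `grad_emp(w, S)` (gradient of ℓ_S):

    import numpy as np

    def strongly_convex_erm(S, grad_reg, d, lam, steps=1000, eta=0.01):
        """Approximate argmin of ell_S(w) + lam * ||w||^2; the exact
        minimizer has uniform stability O(1/(lam * n))."""
        w = np.zeros(d)
        for _ in range(steps):
            w = w - eta * grad_reg(w, S, lam)
        return w

    def gd_smooth(S, grad_emp, d, T, eta):
        """T steps of GD on ell_S; uniformly stable with gamma = O(eta * T / n)
        for smooth convex losses [Hardt, Recht, Singer 16]."""
        w = np.zeros(d)
        for _ in range(T):
            w = w - eta * grad_emp(w, S)
        return w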

6. Generalization bounds
For ℓ w/ range [0, 1] and uniform stability γ, with probability ≥ 1 − δ over S ∼ P^n:
• [Rogers, Wagner 78]: Δ_S(A(S)) ≤ O(√((γ + 1/n)/δ))
• [Bousquet, Elisseeff 02]: Δ_S(A(S)) ≤ O((γ√n + 1/√n)·√(log(1/δ))) — vacuous when γ ≥ 1/√n
• NEW: Δ_S(A(S)) ≤ O(√((γ + 1/n)·log(1/δ)))
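
To see how the three bounds compare numerically, here is a sketch that evaluates them with all hidden constants set to 1 (an assumption; the talk does not give constants):

    import numpy as np

    def bound_rw78(gamma, n, delta):   # Chebyshev-style: 1/sqrt(delta) dependence
        return np.sqrt((gamma + 1.0 / n) / delta)

    def bound_be02(gamma, n, delta):   # vacuous once gamma >= 1/sqrt(n)
        return (gamma * np.sqrt(n) + 1.0 / np.sqrt(n)) * np.sqrt(np.log(1.0 / delta))

    def bound_new(gamma, n, delta):    # this work
        return np.sqrt((gamma + 1.0 / n) * np.log(1.0 / delta))

    n, delta = 10_000, 1e-6
    for gamma in [1e-4, 1e-2, 1e-1]:
        print(gamma, bound_rw78(gamma, n, delta),
              bound_be02(gamma, n, delta), bound_new(gamma, n, delta))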

7. Comparison
[Plot: bound on the generalization error as a function of the stability parameter γ, comparing [BE 02] with this work.]

8. Second moment
• Previously [Devroye, Wagner 79; BE 02]: E_S[Δ_S(A(S))²] ≤ O(γ + 1/n)
• Chebyshev: Pr[Δ_S(A(S)) ≥ √((γ + 1/n)/δ)] ≤ δ
• NEW: E_S[Δ_S(A(S))²] ≤ O(γ² + 1/n) — TIGHT!
• Chebyshev: Pr[Δ_S(A(S)) ≥ (γ + 1/√n)/√δ] ≤ δ
[Plot: second-moment-based bounds on the generalization error, [BE 02] vs. this work.]
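
A sketch of the two Chebyshev tails implied by the old and new second-moment bounds (constants again suppressed):

    import numpy as np

    def tail_old(gamma, n, delta):   # from E[Delta^2] <= gamma + 1/n
        return np.sqrt((gamma + 1.0 / n) / delta)

    def tail_new(gamma, n, delta):   # from the tight E[Delta^2] <= gamma^2 + 1/n
        return (gamma + 1.0 / np.sqrt(n)) / np.sqrt(delta)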

9. Implications
• For any loss ℓ with range [0, 1], an ε-DP prediction algorithm has uniform stability e^ε − 1 ≈ ε
• ⇒ Stronger generalization bounds for DP prediction algorithms
• Differentially private prediction [Dwork, Feldman 18]: a randomized algorithm M : Z^n × X → Y has ε-DP prediction if for all x ∈ X and all neighboring S, S′, the output distributions are ε-close: for every set of outputs Y′, Pr[M(S, x) ∈ Y′] ≤ e^ε · Pr[M(S′, x) ∈ Y′]
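
As a concrete example (mine, not from the talk), randomized response on top of any base predictor gives ε-DP prediction for binary labels:

    import numpy as np

    def dp_predict(S, x, predict, eps, rng=np.random.default_rng()):
        """eps-DP prediction via randomized response: the output distribution
        changes by a factor of at most e^eps between any two datasets, so the
        expected loss has uniform stability at most e^eps - 1.
        `predict(S, x)` is a hypothetical deterministic base predictor in {0, 1}."""
        y = predict(S, x)
        p_keep = np.exp(eps) / (1.0 + np.exp(eps))  # keep the base label w.p. e^eps/(1+e^eps)
        return y if rng.random() < p_keep else 1 - y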

10. Data-dependent functions
• Consider f : Z^n × Z → [0, 1], e.g. f(S, z) = ℓ(A(S), z)
• f has uniform stability γ if for all neighboring S, S′ ∈ Z^n and all z ∈ Z: |f(S, z) − f(S′, z)| ≤ γ
• Generalization error/gap: Δ_S(f) = E_{z∼P}[f(S, z)] − (1/n) Σ_{i=1}^n f(S, z_i)

11. Generalization in expectation
Goal: for all P and all f with uniform stability γ,
E_{S∼P^n}[Δ_S(f)] ≤ γ
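
The standard one-line proof of this slide, sketched in the deck's notation (S^{i←z} denotes S with z_i replaced by a fresh z ∼ P):

    \[
    \mathbb{E}_S[f(S, z_i)]
      \;\ge\; \mathbb{E}_{S,\, z\sim P}\big[f(S^{i\leftarrow z}, z_i)\big] - \gamma
      \;=\; \mathbb{E}_{S,\, z\sim P}[f(S, z)] - \gamma ,
    \]

where the inequality is uniform stability (S and S^{i←z} are neighbors) and the equality holds because z_i and the fresh z are exchangeable. Averaging over i = 1, …, n gives E_S[Δ_S(f)] ≤ γ.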

12. Concentration via McDiarmid
For all neighboring S, S′:
1. |E_{z∼P}[f(S, z)] − E_{z∼P}[f(S′, z)]| ≤ γ
2. |(1/n) Σ_i f(S, z_i) − (1/n) Σ_i f(S′, z′_i)| ≤ γ + 1/n
McDiarmid: with probability ≥ 1 − δ, Δ_S(f) ≤ E_S[Δ_S(f)] + O((γ + 1/n)·√(n·log(1/δ))), where E_S[Δ_S(f)] ≤ γ — this recovers the [BE 02] bound
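
Filling in the McDiarmid step explicitly (a sketch; the constants are mine, not the talk's). For neighboring S, S′ differing at index i, write g(S) := Δ_S(f); then

    \[
    |g(S) - g(S')| \le \big|\mathbb{E}_{z}[f(S,z)] - \mathbb{E}_{z}[f(S',z)]\big|
      + \frac{1}{n}\sum_{j \ne i} |f(S, z_j) - f(S', z_j)|
      + \frac{1}{n}\big|f(S, z_i) - f(S', z_i')\big|
      \le \gamma + \gamma + \frac{1}{n},
    \]
    so McDiarmid's inequality with sensitivity c = 2\gamma + 1/n yields, w.p. \ge 1 - \delta,
    \[
    \Delta_S(f) \le \mathbb{E}_S[\Delta_S(f)] + \Big(2\gamma + \frac{1}{n}\Big)\sqrt{\frac{n}{2}\ln\frac{1}{\delta}} .
    \]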

13. Proof technique
Based on [Nissim, Stemmer 15; BNSSSU 16]
• Let S = (S¹, …, Sᵐ) ∼ P^{n·m} for m ≈ 1/δ
• Need to bound E[max_{i∈[m]} Δ_{Sⁱ}(f)]
• Let 𝒟 be a distribution over [m] used to select an index; putting all mass on the argmax is UNSTABLE!

14. Stable max
Exponential mechanism [McSherry, Talwar 07]: sample i ∈ [m] with probability ∝ exp(ε·Δ_{Sⁱ}(f) / (2·(γ + 1/n)))
• Stable: ε-differentially private (w.r.t. replacing one example)
• Approximates max: the selected gap is within O((γ + 1/n)·log(m)/ε) of max_{i∈[m]} Δ_{Sⁱ}(f)
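
A sketch of this selection rule in code; `scores` would be the m empirical gaps Δ_{Sⁱ}(f) and `sensitivity` their per-example sensitivity (≈ γ + 1/n):

    import numpy as np

    def exponential_mechanism(scores, eps, sensitivity, rng=np.random.default_rng()):
        """Sample index i with prob. proportional to exp(eps * score_i / (2 * sensitivity)).
        The choice is eps-DP, and the selected score is within
        O(sensitivity * log(m) / eps) of the max with high probability."""
        scores = np.asarray(scores, dtype=float)
        logits = eps * scores / (2.0 * sensitivity)
        logits -= logits.max()            # subtract max for numerical stability
        p = np.exp(logits)
        p /= p.sum()
        return rng.choice(len(scores), p=p)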

15. Game over
• Let the distribution over [m] be the exponential mechanism and pick ε appropriately: the selected index is stable, so generalization in expectation applies to the composition
• Get E[max_{i∈[m]} Δ_{Sⁱ}(f)] ≤ O(√((γ + 1/n)·log(1/δ))) for m ≈ 1/δ, i.e., with probability ≥ 1 − δ, Δ_S(f) ≤ O(√((γ + 1/n)·log(1/δ)))

16. Conclusions
• Better understanding of uniform stability
• New technique
• Open problems:
• Gap between the upper and lower bounds
• High-probability generalization without strong uniformity
