A Vector-Contraction Inequality for Rademacher Complexities

Bibliographic Details
Published in: Algorithmic Learning Theory, pp. 3-17
Main Author: Maurer, Andreas
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing, 2016
Series: Lecture Notes in Computer Science
ISBN: 9783319463780; 3319463780
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-46379-7_1

Summary: The contraction inequality for Rademacher averages is extended to Lipschitz functions with vector-valued domains, and it is also shown that in the bounding expression the Rademacher variables can be replaced by arbitrary iid symmetric and sub-gaussian variables. Example applications are given for multi-category learning, K-means clustering and learning-to-learn.
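For reference, the chapter's central inequality is commonly cited in roughly the form sketched below in LaTeX; the notation here is a paraphrase (F is a class of functions f from a set X into l2, the functions h_i are L-Lipschitz on l2, epsilon_i and epsilon_{ik} are independent Rademacher variables, and f_k(x_i) is the k-th coordinate of f(x_i)), and the precise conditions are those stated in the chapter itself.

% Sketch (paraphrased statement): vector-valued contraction for Rademacher averages.
% h_i : \ell^2 \to \mathbb{R} are L-Lipschitz, \epsilon_i and \epsilon_{ik} are
% independent Rademacher variables, and f_k(x_i) is the k-th coordinate of f(x_i).
\[
\mathbb{E}\,\sup_{f \in \mathcal{F}} \sum_{i=1}^{n} \epsilon_i \, h_i\bigl(f(x_i)\bigr)
\;\le\; \sqrt{2}\, L \,\mathbb{E}\,\sup_{f \in \mathcal{F}} \sum_{i=1}^{n} \sum_{k} \epsilon_{ik}\, f_k(x_i)
\]

The doubly indexed Rademacher sum on the right-hand side is what distinguishes this bound from the scalar contraction lemma, and, as the summary notes, those Rademacher variables can in turn be replaced by arbitrary iid symmetric sub-gaussian variables.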