A Vector-Contraction Inequality for Rademacher Complexities
The contraction inequality for Rademacher averages is extended to Lipschitz functions with vector-valued domains, and it is also shown that in the bounding expression the Rademacher variables can be replaced by arbitrary iid symmetric and sub-gaussian variables. Example applications are given for multi-category learning, K-means clustering and learning-to-learn.
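For context, the chapter's main contraction result can be stated informally as follows. This is a paraphrase from memory, not a verbatim quote of the chapter's theorem; the precise hypotheses are given in the text itself:

```latex
% Vector-contraction inequality (informal paraphrase).
% X is any set, x_1, \dots, x_n \in X, F a class of functions
% f : X \to \ell_2, and each h_i : \ell_2 \to \mathbb{R} is
% L-Lipschitz. The \epsilon_i and \epsilon_{ik} are independent
% Rademacher variables, and f_k(x_i) denotes the k-th coordinate
% of f(x_i).
\mathbb{E}\,\sup_{f \in F} \sum_{i=1}^{n} \epsilon_i\, h_i\!\left(f(x_i)\right)
\;\le\;
\sqrt{2}\, L\; \mathbb{E}\,\sup_{f \in F} \sum_{i,k} \epsilon_{ik}\, f_k(x_i)
```

The point of the bound is that the nonlinear, vector-valued Lipschitz maps $h_i$ on the left are removed at the cost of a constant $\sqrt{2}L$ and a doubly indexed Rademacher average over the coordinates of $f$ on the right.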
| Published in | Algorithmic Learning Theory, pp. 3–17 |
|---|---|
| Format | Book Chapter |
| Language | English |
| Published | Cham: Springer International Publishing, 2016 |
| Series | Lecture Notes in Computer Science |
| ISBN | 9783319463780; 3319463780 |
| ISSN | 0302-9743; 1611-3349 |
| DOI | 10.1007/978-3-319-46379-7_1 |