Abstract
In this paper, we summarize some recent results of Li et al. (2012), which can be used to extend an important PAC-Bayesian approach, namely the Gibbs posterior, to the study of the nonadditive ranking risk. The methodology is based on assumption-free risk bounds and nonasymptotic oracle inequalities, which lead to nearly optimal convergence rates and to optimal model selection balancing the approximation error against the stochastic error.
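For context, the Gibbs posterior referred to above is the standard exponentially weighted distribution over a class of ranking rules. The sketch below uses illustrative notation (prior $\pi$, pairwise loss $\ell$, temperature $\lambda$) that is not taken verbatim from the paper; it only shows the usual form of the construction when the empirical risk is the nonadditive (U-statistic) ranking risk.

```latex
% Sketch with illustrative notation; not verbatim from the paper.
% Pairwise empirical ranking risk -- a U-statistic, hence nonadditive:
\[
  R_n(f) \;=\; \binom{n}{2}^{-1} \sum_{1 \le i < j \le n} \ell\bigl(f;\, Z_i, Z_j\bigr).
\]
% Gibbs posterior over a class \mathcal{F} with prior \pi and temperature \lambda > 0:
\[
  \widehat{\pi}_n(\mathrm{d}f) \;=\;
  \frac{\exp\!\bigl(-\lambda\, R_n(f)\bigr)\, \pi(\mathrm{d}f)}
       {\int_{\mathcal{F}} \exp\!\bigl(-\lambda\, R_n(g)\bigr)\, \pi(\mathrm{d}g)}.
\]
% An oracle inequality for an estimator \hat{f}_n drawn from \widehat{\pi}_n then takes the
% generic form  E\, R(\hat{f}_n) \le \inf_{f \in \mathcal{F}} \{ R(f) + \text{stochastic term} \},
% which is what balancing the approximation error against the stochastic error refers to.
```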
| Original language | English (US) |
|---|---|
| Pages (from-to) | 512-521 |
| Number of pages | 10 |
| Journal | Journal of Machine Learning Research |
| Volume | 30 |
| State | Published - Jan 1 2013 |
| Event | 26th Conference on Learning Theory, COLT 2013 - Princeton, NJ, United States. Duration: Jun 12 2013 → Jun 14 2013 |
Keywords
- Gibbs posterior
- Model selection
- Oracle inequalities
- Ranking
- Risk minimization
ASJC Scopus subject areas
- Control and Systems Engineering
- Software
- Statistics and Probability
- Artificial Intelligence