
Data sparse nonparametric regression with epsilon-insensitive losses

Abstract: Leveraging the celebrated support vector regression (SVR) method, we propose a unifying framework to deliver regression machines in reproducing kernel Hilbert spaces (RKHSs) with data sparsity. The central point is a new definition of epsilon-insensitivity, valid for many regression losses (including quantile and expectile regression) and their multivariate extensions. We show that the dual optimization problem to empirical risk minimization with epsilon-insensitivity involves a data-sparse regularization. We also provide an analysis of the excess risk as well as a randomized coordinate descent algorithm for solving the dual. Numerical experiments validate our approach.
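To make the notion concrete, here is a minimal sketch of the standard epsilon-insensitive loss used in classical SVR (zero inside a tube of width epsilon, linear outside), alongside the pinball loss of quantile regression, one of the losses the paper's generalized epsilon-insensitivity is designed to cover. This is an illustration of the standard definitions only, not the authors' unified framework; all names and values are illustrative.

```python
import numpy as np

def eps_insensitive_loss(y, pred, eps=0.1):
    """Classical epsilon-insensitive loss of SVR:
    zero when |y - pred| <= eps, linear beyond the tube."""
    return np.maximum(0.0, np.abs(y - pred) - eps)

def pinball_loss(y, pred, tau=0.5):
    """Pinball (quantile) loss at level tau, one of the regression
    losses covered by the paper's generalized epsilon-insensitivity."""
    r = y - pred
    return np.maximum(tau * r, (tau - 1.0) * r)

y = np.array([1.0, 2.0, 3.0])
pred = np.array([1.05, 2.5, 2.0])
# Residuals inside the eps-tube incur zero loss,
# which is what induces sparsity in the dual solution.
print(eps_insensitive_loss(y, pred, eps=0.1))  # → [0.  0.4 0.9]
print(pinball_loss(y, pred, tau=0.5))          # → [0.025 0.25  0.5 ]
```

The flat region of the loss is what yields data sparsity: points whose residuals fall inside the tube receive zero dual coefficients.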
Contributor: Maxime Sangnier
Submitted on: Thursday, August 2, 2018
HAL Id: hal-01593459, version 1


Maxime Sangnier, Olivier Fercoq, Florence d'Alché-Buc. Data sparse nonparametric regression with epsilon-insensitive losses. 9th Asian Conference on Machine Learning (ACML 2017), Nov 2017, Seoul, South Korea. pp. 192-207. ⟨hal-01593459⟩


