We study functional regression with random subgaussian design and real-valued response. The focus is on problems in which the regression function can be well approximated by a functional linear model whose slope function is “sparse” in the sense that it can be represented as a sum of a small number of well-separated “spikes”. This can be viewed as an extension of the now classical sparse estimation problems to the case of infinite dictionaries. We study an estimator of the regression function based on penalized empirical risk minimization with quadratic loss and a complexity penalty defined in terms of the $L_1$-norm (a continuous version of LASSO). The main goal is to introduce several important parameters characterizing sparsity in this class of problems and to prove sharp oracle inequalities showing how the $L_2$-error of the continuous LASSO estimator depends on the underlying sparsity of the problem.
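To fix ideas, the estimator described in the abstract can be sketched schematically as follows; the notation (a sample $(X_1,Y_1),\dots,(X_n,Y_n)$, a slope function $\lambda$ on a domain $S$, and a regularization parameter $\varepsilon>0$) is shorthand introduced here for illustration and is not taken from the paper:
$$
\hat\lambda_\varepsilon \;\in\; \operatorname*{arg\,min}_{\lambda}\;\Bigl\{\frac{1}{n}\sum_{j=1}^{n}\Bigl(Y_j-\int_S X_j(t)\,\lambda(t)\,dt\Bigr)^{2}+\varepsilon\int_S|\lambda(t)|\,dt\Bigr\},
$$
that is, penalized empirical risk minimization with quadratic loss and an $L_1$-type penalty on the slope function, the continuous analogue of the LASSO referred to above.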
DOI: 10.5802/jep.11
Keywords: Functional regression, sparse recovery, LASSO, oracle inequality, infinite dictionaries
Vladimir Koltchinskii; Stanislav Minsker
@article{JEP_2014__1__269_0,
  author    = {Vladimir Koltchinskii and Stanislav Minsker},
  title     = {$L_1$-penalization in functional linear regression with subgaussian design},
  journal   = {Journal de l{\textquoteright}\'Ecole polytechnique {\textemdash} Math\'ematiques},
  pages     = {269--330},
  publisher = {\'Ecole polytechnique},
  volume    = {1},
  year      = {2014},
  doi       = {10.5802/jep.11},
  mrnumber  = {3322790},
  zbl       = {1308.62143},
  language  = {en},
  url       = {https://jep.centre-mersenne.org/articles/10.5802/jep.11/}
}
Vladimir Koltchinskii; Stanislav Minsker. $L_1$-penalization in functional linear regression with subgaussian design. Journal de l’École polytechnique — Mathématiques, Volume 1 (2014), pp. 269-330. doi: 10.5802/jep.11. https://jep.centre-mersenne.org/articles/10.5802/jep.11/