Rank theory approach to ridge, LASSO, preliminary test and Stein-type estimators: Comparative study
Kybernetika, Volume 54 (2018) no. 5, pp. 958-977
This article was harvested from the source Czech Digital Mathematics Library.


In the development of efficient predictive models, the key is to identify suitable predictors for a given linear model. For the first time, this paper provides a comparative study of ridge regression, LASSO, preliminary test and Stein-type estimators based on the theory of rank statistics. Under an orthonormal design matrix of a given linear model, we find that the rank-based ridge estimator uniformly outperforms the usual rank estimator, the restricted R-estimator, the rank-based LASSO, and the preliminary test and Stein-type R-estimators. On the other hand, none of LASSO, the usual R-estimator, and the preliminary test and Stein-type R-estimators uniformly dominates the others. The region where LASSO dominates all the R-estimators (except the ridge R-estimator) is an interval around the origin of the parameter space. Finally, we observe that the L$_2$-risk of the restricted R-estimator equals the lower bound on the L$_2$-risk of LASSO. Our conclusions are based on L$_2$-risk analysis and relative L$_2$-risk efficiencies, with supporting tables and graphs.
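As a minimal orienting sketch (standard textbook forms under an orthonormal design, not the paper's rank-based derivation; the symbols $\tilde\beta_j$, $k$ and $\lambda$ are illustrative assumptions): writing $\tilde\beta_j$ for the unrestricted estimate of the $j$-th coefficient, the ridge and LASSO solutions reduce to coordinatewise proportional shrinkage and soft thresholding,

$$\hat\beta_j^{\mathrm{ridge}}(k)=\frac{\tilde\beta_j}{1+k}, \qquad \hat\beta_j^{\mathrm{LASSO}}(\lambda)=\operatorname{sgn}(\tilde\beta_j)\,\bigl(|\tilde\beta_j|-\lambda\bigr)_{+},$$

so the comparisons above amount to contrasting the L$_2$-risks of these coordinatewise rules (and of their preliminary test and Stein-type counterparts) as functions of the true coefficients.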
DOI : 10.14736/kyb-2018-5-0958
Classification : 62G05, 62J05, 62J07
Keywords: efficiency of LASSO; penalty estimators; preliminary test; Stein-type estimator; ridge estimator; L$_2$-risk function
@article{10_14736_kyb_2018_5_0958,
     author = {Saleh, A. K. Md. Ehsanes and Navr\'atil, Radim},
     title = {Rank theory approach to ridge, {LASSO,} preliminary test and {Stein-type} estimators: {Comparative} study},
     journal = {Kybernetika},
     pages = {958--977},
     year = {2018},
     volume = {54},
     number = {5},
     doi = {10.14736/kyb-2018-5-0958},
     mrnumber = {3893130},
     zbl = {07031754},
     language = {en},
     url = {http://geodesic.mathdoc.fr/articles/10.14736/kyb-2018-5-0958/}
}
TY  - JOUR
AU  - Saleh, A. K. Md. Ehsanes
AU  - Navrátil, Radim
TI  - Rank theory approach to ridge, LASSO, preliminary test and Stein-type estimators: Comparative study
JO  - Kybernetika
PY  - 2018
SP  - 958
EP  - 977
VL  - 54
IS  - 5
UR  - http://geodesic.mathdoc.fr/articles/10.14736/kyb-2018-5-0958/
DO  - 10.14736/kyb-2018-5-0958
LA  - en
ID  - 10_14736_kyb_2018_5_0958
ER  - 
%0 Journal Article
%A Saleh, A. K. Md. Ehsanes
%A Navrátil, Radim
%T Rank theory approach to ridge, LASSO, preliminary test and Stein-type estimators: Comparative study
%J Kybernetika
%D 2018
%P 958-977
%V 54
%N 5
%U http://geodesic.mathdoc.fr/articles/10.14736/kyb-2018-5-0958/
%R 10.14736/kyb-2018-5-0958
%G en
%F 10_14736_kyb_2018_5_0958
Saleh, A. K. Md. Ehsanes; Navrátil, Radim. Rank theory approach to ridge, LASSO, preliminary test and Stein-type estimators: Comparative study. Kybernetika, Volume 54 (2018) no. 5, pp. 958-977. doi: 10.14736/kyb-2018-5-0958

[1] Belloni, A., Chernozhukov, V.: Least squares after model selection in high-dimensional sparse models. Bernoulli 19 (2013), 521-547. | DOI | MR

[2] Breiman, L.: Heuristics of instability and stabilization in model selection. Ann. Statist. 24 (1996), 2350-2383. | DOI | MR

[3] Donoho, D. L., Johnstone, I. M.: Minimax estimation via wavelet shrinkage. Ann. Statist. 26 (1998), 879-921. | DOI | MR

[4] Draper, N. R., Van Nostrand, R. C.: Ridge regression and James-Stein estimation: review and comments. Technometrics 21 (1979), 451-466. | DOI | MR

[5] Fan, J., Li, R.: Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96 (2001), 1348-1360. | DOI | MR

[6] Frank, I. E., Friedman, J. H.: A statistical view of some chemometrics regression tools. Technometrics 35 (1993), 109-135. | DOI

[7] Hoerl, A. E., Kennard, R. W.: Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12 (1970), 55-67. | DOI

[8] James, W., Stein, C.: Estimation with quadratic loss. In: Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics, University of California Press 1961, pp. 361-379. | MR

[9] Jurečková, J.: Nonparametric estimate of regression coefficients. Ann. Math. Statist. 42 (1971), 1328-1338. | DOI | MR

[10] Hansen, B. E.: The risk of James-Stein and Lasso shrinkage. Econometric Rev. 35 (2015), 456-470. | MR

[11] Saleh, A. K. Md. E.: Theory of Preliminary Test and Stein-Type Estimators with Applications. John Wiley and Sons, New York 2006. | DOI | MR

[12] Saleh, A. K. Md. E., Arashi, M., Norouzirad, M., Kibria, B. M. G.: On shrinkage and selection: ANOVA MODEL. J. Statist. Res. 51 (2017), 165-191. | MR

[13] Stein, C.: Inadmissibility of the usual estimator for the mean of a multivariate normal distribution. In: Proc. Third Berkeley Symposium on Mathematical Statistics and Probability, University of California Press 1956, pp. 197-206. | MR

[14] Tibshirani, R.: Regression shrinkage and selection via the lasso. J. Royal Statist. Soc., Series B (Methodological) 58 (1996), 267-288. | DOI | MR

[15] Tikhonov, A. N.: Solution of incorrectly formulated problems and the regularization method. Doklady Akademii Nauk SSSR 151 (1963), 501-504. | MR

[16] Zou, H.: The adaptive lasso and its oracle properties. J. Amer. Statist. Assoc. 101 (2006), 1418-1429. | DOI | MR

[17] Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. J. Royal Stat. Soc. Ser. B Stat. Methodol. 67 (2005), 301-320. | DOI | MR
