Model Selection via Bayesian Information Criterion for Quantile Regression Models
Citations

Web of Science: 155
Scopus: 163

Abstract

Bayesian information criterion (BIC) is known to identify the true model consistently as long as the predictor dimension is finite. Recently, its moderate modifications have been shown to be consistent in model selection even when the number of variables diverges. Those works have been done mostly in mean regression, but rarely in quantile regression. The best-known results about BIC for quantile regression are for linear models with a fixed number of variables. In this article, we investigate how BIC can be adapted to high-dimensional linear quantile regression and show that a modified BIC is consistent in model selection when the number of variables diverges as the sample size increases. We also discuss how it can be used for choosing the regularization parameters of penalized approaches that are designed to conduct variable selection and shrinkage estimation simultaneously. Moreover, we extend the results to structured nonparametric quantile models with a diverging number of covariates. We illustrate our theoretical results via some simulated examples and a real data analysis on human eye disease. Supplementary materials for this article are available online.
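To make the idea concrete, here is a minimal sketch of model selection with a high-dimensional BIC of the general form described in the abstract. It assumes the criterion is log(check loss) + |S| · log(n) · C_n / (2n), with a slowly diverging factor C_n; this form, the helper names, and the use of an ordinary least-squares fit as a crude stand-in for the actual quantile fit are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def check_loss(r, tau):
    # Quantile check loss: rho_tau(r) = r * (tau - 1{r < 0})
    return np.sum(r * (tau - (r < 0)))

def quantile_bic(y, X, tau, Cn=1.0):
    """Score a candidate model S with a modified BIC of the assumed form
    log(sum_i rho_tau(residual_i)) + |S| * log(n) * Cn / (2n),
    where Cn diverging slowly accommodates a growing number of variables.
    (Assumption: this matches the general shape of the criterion in the
    abstract; it is not a transcription of the paper's formula.)
    """
    n, p = X.shape
    # OLS fit used here as a rough stand-in for the tau-th quantile fit,
    # purely so the sketch stays dependency-free (illustration only).
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return np.log(check_loss(r, tau)) + p * np.log(n) * Cn / (2 * n)

rng = np.random.default_rng(0)
n = 200
X_full = rng.normal(size=(n, 6))
# True model uses only the first two predictors.
y = 1.5 * X_full[:, 0] - 2.0 * X_full[:, 1] + rng.normal(size=n)

# Score nested candidate models S = {1}, {1,2}, ..., {1,...,6}.
scores = [quantile_bic(y, X_full[:, :k], tau=0.5, Cn=np.log(np.log(n)))
          for k in range(1, 7)]
best = int(np.argmin(scores)) + 1
print("selected model size:", best)
```

The same scoring function can be reused to tune a penalized estimator: fit the model along a grid of regularization parameters, treat each fitted active set as a candidate model S, and keep the value of the tuning parameter whose fit minimizes the criterion.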

Keywords

High dimension; Linear quantile regression; Model selection consistency; Nonparametric quantile regression; Regularization parameter selection; Shrinkage method

Index terms: NONCONCAVE PENALIZED LIKELIHOOD; VARIABLE SELECTION; DIVERGING NUMBER; SHRINKAGE; LASSO; PARAMETERS; ESTIMATORS; DIMENSION; PENALTY
Title
Model Selection via Bayesian Information Criterion for Quantile Regression Models
Authors
Lee, Eun Ryung; Noh, Hohsuk; Park, Byeong U.
DOI
10.1080/01621459.2013.836975
Publication Date
2014-03
Type
Article
Journal
Journal of the American Statistical Association
Volume
109
Issue
505
Pages
216–229