
Cited 0 times in Web of Science; cited 2 times in Scopus

Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage (open access)

Authors
Choi, Young-Geun; Lim, Johan; Roy, Anindya; Park, Junyong
Issue Date
May-2019
Publisher
ELSEVIER INC
Citation
JOURNAL OF MULTIVARIATE ANALYSIS, v.171, pp. 234-249
Pages
16
Journal Title
JOURNAL OF MULTIVARIATE ANALYSIS
Volume
171
Start Page
234
End Page
249
URI
https://scholarworks.sookmyung.ac.kr/handle/2020.sw.sookmyung/1842
DOI
10.1016/j.jmva.2018.12.002
ISSN
0047-259X
1095-7243
Abstract
This paper is concerned with the positive definiteness (PDness) problem in covariance matrix estimation. For high-dimensional data, many regularized estimators have been proposed under structural assumptions on the true covariance matrix, including sparsity. They were shown to be asymptotically consistent and rate-optimal in estimating the true covariance matrix and its structure. However, many of them do not take into account the PDness of the estimator and produce a non-PD estimate. To achieve PDness, researchers considered additional regularizations (or constraints) on eigenvalues, which make both the asymptotic analysis and computation much harder. In this paper, we propose a simple modification of the regularized covariance matrix estimator to make it PD while preserving the support. We revisit the idea of linear shrinkage and propose to take a convex combination between the first-stage estimator (the regularized covariance matrix without PDness) and a given form of diagonal matrix. The proposed modification, which we call the FSPD (Fixed Support and Positive Definiteness) estimator, is shown to preserve the asymptotic properties of the first-stage estimator if the shrinkage parameters are carefully selected. It has a closed form expression and its computation is optimization-free, unlike existing PD sparse estimators. In addition, the FSPD is generic in the sense that it can be applied to any non-PD matrix, including the precision matrix. The FSPD estimator is numerically compared with other sparse PD estimators to understand its finite-sample properties as well as its computational gain. It is also applied to two multivariate procedures relying on the covariance matrix estimator – the linear minimax classification problem and the Markowitz portfolio optimization problem – and is shown to improve substantially the performance of both procedures.
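The linear-shrinkage idea in the abstract — taking a convex combination of the first-stage estimator and a diagonal matrix so that the smallest eigenvalue is lifted above a threshold while the support (zero pattern) is preserved — can be sketched as follows. This is a minimal illustration of the convex-combination principle, not the paper's exact FSPD rule: the diagonal target `mu * I` with `mu` set to the average diagonal entry and the eigenvalue floor `eps` are simplifying assumptions made here for the example.

```python
import numpy as np

def fspd_sketch(sigma_hat, eps=1e-2, mu=None):
    """Sketch of a fixed-support positive-definite modification.

    Given a symmetric, possibly non-PD estimator sigma_hat, return
    alpha * sigma_hat + (1 - alpha) * mu * I, with alpha chosen so the
    smallest eigenvalue of the result is at least eps. Off-diagonal
    zeros are only scaled by alpha, so the support is preserved.
    The choice mu = average diagonal entry is an assumption for this
    sketch, not the paper's prescribed selection rule.
    """
    p = sigma_hat.shape[0]
    lam_min = np.linalg.eigvalsh(sigma_hat)[0]  # eigvalsh returns ascending order
    if mu is None:
        mu = np.trace(sigma_hat) / p
    if lam_min >= eps:
        return sigma_hat  # already sufficiently PD; no modification needed
    # Eigenvalues of the combination are alpha * lam_i + (1 - alpha) * mu,
    # so solving alpha * lam_min + (1 - alpha) * mu = eps gives:
    alpha = (mu - eps) / (mu - lam_min)
    return alpha * sigma_hat + (1.0 - alpha) * mu * np.eye(p)
```

As the abstract notes, the modification is optimization-free: one eigenvalue computation and a closed-form shrinkage weight suffice, in contrast to sparse PD estimators that impose eigenvalue constraints inside the fitting problem.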
Appears in
Collections
College of Science > Department of Statistics > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
