Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection

Lisha Chen, Jianhua Z. Huang



Abstract

Reduced-rank regression is an effective method for predicting multiple response variables from a common set of predictor variables. It reduces the number of model parameters and exploits interrelations among the response variables, thereby improving predictive accuracy. We propose to select relevant variables for reduced-rank regression by using a sparsity-inducing penalty. We apply a group-lasso type penalty that treats each row of the matrix of regression coefficients as a group and show that this penalty satisfies certain desirable invariance properties. We develop two numerical algorithms to solve the penalized regression problem and establish the asymptotic consistency of the proposed method. In particular, the manifold structure of the reduced-rank regression coefficient matrix is considered and studied in our theoretical analysis. In our simulation study and real data analysis, the new method is compared with several existing variable selection methods for multivariate regression and exhibits competitive performance in prediction and variable selection. © 2012 American Statistical Association.
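
To make the penalized criterion concrete, the following is a minimal illustrative sketch, not the authors' algorithms: it heuristically minimizes ||Y - XC||_F^2 + lambda * sum_j ||C[j,:]||_2 subject to rank(C) <= r by alternating a proximal gradient step for the row-wise group-lasso penalty with a truncated-SVD projection onto the rank constraint. The function name `sparse_rrr`, the fixed step size, and the iteration count are assumptions for illustration only.

```python
import numpy as np

def sparse_rrr(X, Y, rank, lam, n_iter=500):
    """Heuristic sketch of sparse reduced-rank regression (illustration only).

    Approximately minimizes ||Y - X C||_F^2 + lam * sum_j ||C[j, :]||_2
    subject to rank(C) <= rank, by alternating a proximal gradient step
    for the row-wise group-lasso penalty with a rank-r SVD projection.
    """
    p, q = X.shape[1], Y.shape[1]
    C = np.zeros((p, q))
    # Step size from the Lipschitz constant of the gradient of the fit term.
    step = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        grad = 2.0 * X.T @ (X @ C - Y)          # gradient of ||Y - XC||_F^2
        Z = C - step * grad                      # gradient descent step
        # Row-wise soft-thresholding: proximal map of the group-lasso penalty,
        # which zeros out entire rows (i.e., deselects predictors).
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        Z *= np.maximum(0.0, 1.0 - step * lam / np.maximum(norms, 1e-12))
        # Project onto the set of coefficient matrices with rank <= `rank`.
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s[rank:] = 0.0
        C = (U * s) @ Vt
    return C

# Example usage on simulated data with a rank-2, row-sparse coefficient matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
B_true = np.zeros((20, 8))
B_true[:5] = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 8))
Y = X @ B_true + 0.5 * rng.standard_normal((200, 8))
C_hat = sparse_rrr(X, Y, rank=2, lam=5.0)
print("selected rows:", np.flatnonzero(np.linalg.norm(C_hat, axis=1) > 1e-8))
```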
Original language: English (US)
Pages (from-to): 1533-1545
Number of pages: 13
Journal: Journal of the American Statistical Association
Volume: 107
Issue number: 500
DOIs
State: Published - Oct 8 2012
Externally published: Yes
