Sparse principal component regression for generalized linear models

Shuichi Kawano, Hironori Fujisawa, Toyoyuki Takada, Toshihiko Shiroishi

Research output: Contribution to journal › Article › peer-review

18 Citations (Scopus)

Abstract

Principal component regression (PCR) is a widely used two-stage procedure: principal component analysis (PCA), followed by regression in which the selected principal components are regarded as new explanatory variables in the model. Note that PCA is based only on the explanatory variables, so the principal components are not selected using information from the response variable. We propose a one-stage procedure for PCR in the framework of generalized linear models. The basic loss function is a combination of the regression loss and the PCA loss. An estimate of the regression parameter is obtained as the minimizer of the basic loss function with a sparse penalty. We call the proposed method sparse principal component regression for generalized linear models (SPCR-glm). By taking the two loss functions into consideration simultaneously, SPCR-glm yields sparse principal component loadings that are related to the response variable. A combination of loss functions may cause a parameter identification problem, but this potential problem is avoided by virtue of the sparse penalty; thus, the sparse penalty plays two roles in this method. We apply SPCR-glm to two real datasets, doctor visits data and mouse consomic strain data.
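
The abstract describes the combined objective only in words. As a rough illustration (not the authors' implementation), the sketch below writes down one possible form of such an objective for a binomial (logistic) GLM: a weighted sum of the GLM negative log-likelihood evaluated on the component scores, a PCA-type reconstruction loss, and an L1 penalty on the loadings. The weight w, the penalty parameter lam_B, and the use of a generic smooth optimizer are assumptions made for this sketch; a faithful implementation would follow the estimation algorithm in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def spcr_glm_objective(theta, X, y, k, w=0.5, lam_B=0.1):
    """Hypothetical combined loss for a logistic-regression variant of SPCR-glm.

    B is the p x k loading matrix, A a p x k reconstruction matrix,
    gamma the k-vector of regression coefficients, gamma0 the intercept.
    """
    n, p = X.shape
    # Unpack parameters from the flat vector used by the optimizer.
    B = theta[:p * k].reshape(p, k)
    A = theta[p * k:2 * p * k].reshape(p, k)
    gamma = theta[2 * p * k:2 * p * k + k]
    gamma0 = theta[-1]

    Z = X @ B                       # principal component scores
    eta = gamma0 + Z @ gamma        # linear predictor of the GLM
    # Negative log-likelihood for a binomial GLM with logit link.
    nll = np.mean(np.log1p(np.exp(eta)) - y * eta)
    # PCA-type reconstruction loss tying B to the structure of X.
    recon = np.mean(np.sum((X - Z @ A.T) ** 2, axis=1))
    # L1 penalty inducing sparse loadings (and, as noted in the abstract,
    # helping to resolve the identification problem of the combined loss).
    penalty = lam_B * np.abs(B).sum()
    return w * nll + (1 - w) * recon + penalty

# Toy usage with random data; a real implementation would use the paper's
# algorithm (e.g., coordinate-descent updates with constraints on A)
# rather than L-BFGS-B on the non-smooth penalized objective.
rng = np.random.default_rng(0)
n, p, k = 100, 10, 2
X = rng.standard_normal((n, p))
y = rng.integers(0, 2, size=n).astype(float)
theta0 = rng.standard_normal(2 * p * k + k + 1) * 0.01
res = minimize(spcr_glm_objective, theta0, args=(X, y, k), method="L-BFGS-B")
```

The dual role of the penalty mentioned in the abstract is visible here: the same L1 term that produces sparse loadings also breaks the rotational ambiguity that would otherwise make B and gamma non-identifiable in the combined loss.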

Original language: English
Pages (from-to): 180-196
Number of pages: 17
Journal: Computational Statistics and Data Analysis
Volume: 124
DOI
Publication status: Published - Aug 2018
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Computational Mathematics
  • Computational Theory and Mathematics
  • Applied Mathematics
