Equivalence between adaptive Lasso and generalized ridge estimators in linear regression with orthogonal explanatory variables after optimizing regularization parameters

Mineaki Ohishi, Hirokazu Yanagihara, Shuichi Kawano

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

In this paper, we deal with penalized least-squares (PLS) methods for a linear regression model with orthogonal explanatory variables. The penalties used are an adaptive Lasso (AL)-type ℓ1 penalty (AL penalty) and a generalized ridge (GR)-type ℓ2 penalty (GR penalty). Since the estimators obtained by minimizing the PLS criteria depend strongly on the regularization parameters, we optimize them by a model selection criterion (MSC) minimization method. The estimators based on the AL penalty and the GR penalty have different properties, and they are widely regarded as completely different estimators. However, in this paper, we show the interesting result that the two estimators are exactly equal when the explanatory variables are orthogonal, once the regularization parameters are optimized by the MSC minimization method.
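The mechanism behind such an equivalence can be illustrated numerically. The sketch below is not the paper's derivation; it only assumes the standard fact that with an orthonormal design a PLS problem decouples coordinate-wise, where an ℓ1 penalty yields soft-thresholding and an ℓ2 penalty yields multiplicative shrinkage of the OLS coordinate. With per-coordinate penalty weights (as in AL and GR), any soft-thresholded value can be matched by a suitable ridge weight:

```python
import numpy as np

# Minimal sketch (hypothetical helper names, not the paper's notation):
# for an OLS coordinate b, per-coordinate penalized least squares with
# an l1 weight lam gives soft-thresholding, and with an l2 weight lam
# gives multiplicative shrinkage.

def al_coord(b, lam):
    """Soft-thresholding: argmin over beta of (b - beta)^2 / 2 + lam * |beta|."""
    return np.sign(b) * max(abs(b) - lam, 0.0)

def gr_coord(b, lam):
    """Ridge shrinkage: argmin over beta of (b - beta)^2 / 2 + lam * beta^2 / 2."""
    return b / (1.0 + lam)

# Any soft-thresholded value t = al_coord(b, lam1) with 0 < |t| <= |b|
# is reproduced by the ridge weight lam2 = b / t - 1 (and t = 0 is the
# limit lam2 -> infinity), so coordinate-wise tuning of the weights can
# make the two estimators coincide.
b, lam1 = 2.0, 0.5
t = al_coord(b, lam1)        # soft-thresholded coordinate
lam2 = b / t - 1.0           # matching ridge weight
assert np.isclose(gr_coord(b, lam2), t)
```

The paper's contribution is stronger than this matching argument: it shows the weights selected by the MSC minimization method themselves produce identical estimators.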

Original language: English
Pages (from-to): 1501-1516
Number of pages: 16
Journal: Annals of the Institute of Statistical Mathematics
Volume: 72
Issue number: 6
DOI
Publication status: Published - Dec 1 2020
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Statistics and Probability

