TY - GEN
T1 - Higher order fused regularization for supervised learning with grouped parameters
AU - Takeuchi, Koh
AU - Kawahara, Yoshinobu
AU - Iwata, Tomoharu
N1 - Funding Information:
This work was partially supported by JSPS KAKENHI Grant Numbers 14435225 and 14500801.
Publisher Copyright:
© Springer International Publishing Switzerland 2015.
PY - 2015
Y1 - 2015
N2 - We often encounter situations in supervised learning where there may exist groups consisting of more than two parameters. For example, we might work with parameters that correspond to words expressing the same meaning, music pieces in the same genre, or books released in the same year. Based on such auxiliary information, we can suppose that parameters in a group play similar roles in a problem and take similar values. In this paper, we propose Higher Order Fused (HOF) regularization, which can incorporate smoothness among grouped parameters as prior knowledge in supervised learning. We define the HOF penalty as the Lovász extension of a submodular higher-order potential function, which, when used as a regularizer, encourages parameters in a group to take similar estimated values. Moreover, we develop an efficient network flow algorithm for calculating the proximity operator of the regularized problem. We investigate the empirical performance of the proposed method on synthetic and real-world data.
AB - We often encounter situations in supervised learning where there may exist groups consisting of more than two parameters. For example, we might work with parameters that correspond to words expressing the same meaning, music pieces in the same genre, or books released in the same year. Based on such auxiliary information, we can suppose that parameters in a group play similar roles in a problem and take similar values. In this paper, we propose Higher Order Fused (HOF) regularization, which can incorporate smoothness among grouped parameters as prior knowledge in supervised learning. We define the HOF penalty as the Lovász extension of a submodular higher-order potential function, which, when used as a regularizer, encourages parameters in a group to take similar estimated values. Moreover, we develop an efficient network flow algorithm for calculating the proximity operator of the regularized problem. We investigate the empirical performance of the proposed method on synthetic and real-world data.
UR - http://www.scopus.com/inward/record.url?scp=84984650204&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84984650204&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-23528-8_36
DO - 10.1007/978-3-319-23528-8_36
M3 - Conference contribution
AN - SCOPUS:84984650204
SN - 9783319235271
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 577
EP - 593
BT - Machine Learning and Knowledge Discovery in Databases - European Conference, ECML PKDD 2015, Proceedings
A2 - Appice, Annalisa
A2 - Gama, João
A2 - Costa, Vitor Santos
A2 - Jorge, Alípio
A2 - Rodrigues, Pedro Pereira
A2 - Soares, Carlos
PB - Springer Verlag
T2 - European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2015
Y2 - 7 September 2015 through 11 September 2015
ER -