TY - JOUR
T1 - Unsupervised Learning of Domain-Independent User Attributes
AU - Ishikawa, Yuichi
AU - Legaspi, Roberto
AU - Yonekawa, Kei
AU - Nakamura, Yugo
AU - Ishida, Shigemi
AU - Mine, Tsunenori
AU - Arakawa, Yutaka
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Learning user attributes is essential for providing users with a service. In particular, for e-commerce portals that deal in a variety of goods ranging from clothes to foods to home electronics, it is especially important to learn 'domain-independent' attributes such as age, gender, and personality that affect people's behavior across various domains of daily life (e.g., clothing, eating, and housing), because these attributes can be used for personalization in the diverse domains their services cover. Thus far, researchers have proposed approaches to learn user representations (URs) from user-item interactions, trying to embed rich information about user attributes in the URs. However, very few can learn URs that are domain-independent without confounding them with domain-specific attributes (e.g., food preferences). This can undermine the URs' utility for personalizing services in other domains from which the URs were not learned. To address this, we propose an approach to learn URs that exclusively reflect domain-independent attributes. Our approach introduces a novel multi-layer RNN with two types of layers: Domain Specific Layers (DSLs) for modeling behavior in individual domains and a Domain Independent Layer (DIL) for modeling attributes that affect behavior across multiple domains. By exchanging hidden states between these layers, the RNN implements the process by which domain-independent attributes affect domain-specific behavior and makes the DIL learn URs that capture domain-independent attributes. Our evaluation results confirmed that the URs learned by our approach have greater utility in predicting behavior in domains from which these URs were not learned, thereby demonstrating adaptability to various domains.
AB - Learning user attributes is essential for providing users with a service. In particular, for e-commerce portals that deal in a variety of goods ranging from clothes to foods to home electronics, it is especially important to learn 'domain-independent' attributes such as age, gender, and personality that affect people's behavior across various domains of daily life (e.g., clothing, eating, and housing), because these attributes can be used for personalization in the diverse domains their services cover. Thus far, researchers have proposed approaches to learn user representations (URs) from user-item interactions, trying to embed rich information about user attributes in the URs. However, very few can learn URs that are domain-independent without confounding them with domain-specific attributes (e.g., food preferences). This can undermine the URs' utility for personalizing services in other domains from which the URs were not learned. To address this, we propose an approach to learn URs that exclusively reflect domain-independent attributes. Our approach introduces a novel multi-layer RNN with two types of layers: Domain Specific Layers (DSLs) for modeling behavior in individual domains and a Domain Independent Layer (DIL) for modeling attributes that affect behavior across multiple domains. By exchanging hidden states between these layers, the RNN implements the process by which domain-independent attributes affect domain-specific behavior and makes the DIL learn URs that capture domain-independent attributes. Our evaluation results confirmed that the URs learned by our approach have greater utility in predicting behavior in domains from which these URs were not learned, thereby demonstrating adaptability to various domains.
UR - http://www.scopus.com/inward/record.url?scp=85141613890&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85141613890&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2022.3220781
DO - 10.1109/ACCESS.2022.3220781
M3 - Article
AN - SCOPUS:85141613890
SN - 2169-3536
VL - 10
SP - 119649
EP - 119665
JO - IEEE Access
JF - IEEE Access
ER -