Estimating user satisfaction impact in cities using physical reaction sensing and multimodal dialogue system

Yuki Matsuda, Dmitrii Fedotov, Yuta Takahashi, Yutaka Arakawa, Keiichi Yasumoto, Wolfgang Minker

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

With the increasing use of smart devices, a wide variety of real-time environmental information has become available everywhere. To provide more context-aware information, we also need to know users’ emotions and satisfaction levels from their point of view. In this paper, we define this as the “user satisfaction impact (USI)” and propose a method to estimate USI by combining dialogue features and physical reaction features. As dialogue features, facial expressions and acoustic features are extracted from a multimodal dialogue system running on a smartphone. As physical reactions, head motion, eye motion, and heartbeat are collected by wearable devices. We conducted preliminary experiments in the real world to confirm the feasibility of this approach in the tourism domain. Among the various features, we confirmed that eye motion correlates with satisfaction level with a correlation coefficient of up to 0.36.
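The reported relationship between eye motion and satisfaction level (up to 0.36) is a correlation coefficient; a minimal sketch of how such a value can be computed is shown below, assuming a Pearson correlation over an already-extracted per-spot eye-motion feature and matching self-reported satisfaction ratings. The function name and the data are illustrative assumptions, not taken from the paper.

# Minimal sketch (hypothetical): Pearson correlation between an eye-motion
# feature and self-reported satisfaction ratings. Feature extraction from
# the wearable sensors is assumed to have been done already.
import numpy as np

def satisfaction_correlation(eye_motion_feature, satisfaction_ratings):
    """Return the Pearson correlation between one eye-motion feature
    (e.g., a per-spot motion statistic) and per-spot satisfaction ratings."""
    x = np.asarray(eye_motion_feature, dtype=float)
    y = np.asarray(satisfaction_ratings, dtype=float)
    return np.corrcoef(x, y)[0, 1]

# Example with made-up values for five sightseeing spots:
print(satisfaction_correlation([0.8, 1.1, 0.6, 1.4, 0.9], [3, 4, 2, 5, 3]))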

Original language: English
Title of host publication: 9th International Workshop on Spoken Dialogue System Technology, IWSDS 2018
Editors: Luis Fernando D’Haro, Rafael E. Banchs, Haizhou Li
Publisher: Springer
Pages: 177-183
Number of pages: 7
ISBN (Print): 9789811394423
DOIs
Publication status: Published - 2019
Event: 9th International Workshop on Spoken Dialogue System Technology, IWSDS 2018 - Singapore, Singapore
Duration: Apr 18, 2018 - Apr 20, 2018

Publication series

Name: Lecture Notes in Electrical Engineering
Volume: 579
ISSN (Print): 1876-1100
ISSN (Electronic): 1876-1119

Conference

Conference: 9th International Workshop on Spoken Dialogue System Technology, IWSDS 2018
Country/Territory: Singapore
City: Singapore
Period: 4/18/18 - 4/20/18

All Science Journal Classification (ASJC) codes

  • Industrial and Manufacturing Engineering
