Multiparty Communication with a Tour Guide ECA. Aleksandra Čereković, HOTLab group, Department of Telecommunications, Faculty of Electrical Engineering and Computing, University of Zagreb. http://hotlab.tel.fer.hr
Outline • Project overview • Background Information • eNTERFACE ‘08 system • features • scenario • configuration • Project members
Project Overview • Project target • A system with an Embodied Conversational Agent (ECA) • The ECA is a tour guide which describes the city of Dubrovnik to two visitors and interacts with them • We aim to explore issues of multiparty support in an ECA system, with a focus on: • Dialogue support • Participant roles • Interaction management • Dialogue structure • Nonverbal behavior model of the ECA • Getting attention, a model of gazing, maintaining the conversational flow
Background Information • During the eNTERFACE ’06 workshop we developed a Tour Guide system • The system is based on the common GECA framework • GECA Platform, GECA Protocol, GECA Plugs • Connects the various system components • Transfers messages between components in real time • Enables simple system integration
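To make the GECA message passing concrete, here is a minimal sketch of a component publishing an event to the rest of the system. The DemoPlug class, its publish method, and the message fields are assumptions made for illustration only; the actual GECA Plugs and the GECA Protocol message format are supplied by the framework itself.

# Minimal sketch of a GECA-style plug publishing a message to other
# components. The transport and field names are illustrative assumptions;
# the real GECA Plugs and Protocol are provided by the framework.
import json
import time


class DemoPlug:
    """Hypothetical stand-in for a GECA Plug connected to the platform."""

    def __init__(self, component_name):
        self.component_name = component_name

    def publish(self, message_type, content):
        # A real plug would serialize this to the GECA Protocol and send it
        # over the platform; here we only print a timestamped envelope.
        envelope = {
            "sender": self.component_name,
            "type": message_type,
            "timestamp": time.time(),
            "content": content,
        }
        print(json.dumps(envelope))


if __name__ == "__main__":
    # e.g. the input component reporting a recognized visitor utterance
    plug = DemoPlug("SpeechInput")
    plug.publish("input.speech", {"user": "visitor_1", "text": "Can we see the city walls?"})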
eNTERFACE ’08 system scenario • In our system scenario the ECA has the role of a narrator • The ECA takes the visitors through 5 different scenes in the city • Users can interrupt the ECA • They can raise their hands to take the speaker’s role or just say something • User utterances are predefined: users can make comments, make requests, or express their willingness to leave the session • During the interaction the ECA keeps the initiative and gets the users’ attention
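Because the user utterances come from a predefined set, a simple keyword match is enough to illustrate how spoken input could be mapped onto the three categories above (comment, request, leaving the session). The phrase lists and the classify_utterance function below are assumptions made for this sketch, not the grammar actually used in the system.

# Sketch of classifying a predefined visitor utterance during the tour.
# The keyword lists and labels are illustrative assumptions only.
LEAVE_PHRASES = ("we have to go", "goodbye", "that is enough")
REQUEST_PHRASES = ("can we see", "please show", "tell us more about")


def classify_utterance(text):
    """Map a recognized utterance to one of the scenario's categories."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in LEAVE_PHRASES):
        return "leave_session"
    if any(phrase in lowered for phrase in REQUEST_PHRASES):
        return "request"
    return "comment"


if __name__ == "__main__":
    print(classify_utterance("Can we see the old harbour?"))   # request
    print(classify_utterance("Goodbye, we have to go now."))   # leave_session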
Component functionalities (I) • Input • Captures speech and the users’ position, size, eye and mouth openness, and facial orientation (OKAO Vision) • Deliberative phase components • Input Understanding • Interprets raw messages coming from the system input • Detects the speaker and the type of utterance (request, comment, leaving the session) • Detects speech collisions and mutual conversation between the users • Decision Making Planner (Dialogue Management) • Generates transitions between dialogue states in a multiparty interaction (Information State, Nonverbal Event, Specifying addressee, Grounding judgment, Assuming the next speaker) • The MIDIKI DM toolkit will be extended to manage multiparty interaction
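As a rough illustration of the input-understanding step, the sketch below decides who currently holds the floor and whether a speech collision occurred, based on the kind of per-user observations the input component delivers. The UserObservation fields and the detect_speaker function are assumptions made for this example, not the component’s real interface.

# Sketch of speaker and speech-collision detection from per-user input.
# Field names and the decision rule are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class UserObservation:
    user_id: str
    is_speaking: bool
    hand_raised: bool


def detect_speaker(observations):
    """Return (speaker_id, collision) for the current input frame."""
    speaking = [o.user_id for o in observations if o.is_speaking]
    if len(speaking) > 1:
        return None, True            # speech collision between the users
    if len(speaking) == 1:
        return speaking[0], False    # a single speaker takes the floor
    raised = [o.user_id for o in observations if o.hand_raised]
    return (raised[0] if raised else None), False


if __name__ == "__main__":
    frame = [UserObservation("visitor_1", True, False),
             UserObservation("visitor_2", True, False)]
    print(detect_speaker(frame))     # (None, True): collision detected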
Component functionalities (II) • Deliberative phase components • Behavior Planner • Generates an appropriate response by the ECA • Dynamically changes the ECA’s behaviors according to the Information State • Output • Player • Generates the ECA’s behaviors selected by the Behavior Planner • Supports stopping the current utterance and starting a new one
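The last point, stopping an ongoing utterance and starting a new one, can be illustrated with the minimal sketch below. The InterruptiblePlayer class and its word-by-word playback on a background thread are assumptions made purely for illustration; the actual player renders the agent’s speech and animation.

# Sketch of a player that can interrupt the ECA's current utterance and
# start a new one. The class and its interface are illustrative assumptions.
import threading
import time


class InterruptiblePlayer:
    def __init__(self):
        self._stop = threading.Event()
        self._thread = None

    def speak(self, words):
        self.stop()                   # interrupt whatever is currently playing
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._play, args=(words, self._stop))
        self._thread.start()

    def stop(self):
        if self._thread and self._thread.is_alive():
            self._stop.set()
            self._thread.join()

    @staticmethod
    def _play(words, stop_event):
        # Simulate speech by printing one word at a time until interrupted.
        for word in words.split():
            if stop_event.is_set():
                print("[utterance interrupted]")
                return
            print(word, end=" ", flush=True)
            time.sleep(0.2)
        print()


if __name__ == "__main__":
    player = InterruptiblePlayer()
    player.speak("Dubrovnik is surrounded by well-preserved medieval city walls")
    time.sleep(1.0)                   # a visitor interrupts after one second
    player.speak("Of course, let us move on to the next scene")
    time.sleep(3.0)
    player.stop()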
Project Members • Project Researchers • Takuya Furukawa* (Input, Input Understanding) • Hung-Hsuan Huang* (DM, Behavior Planner), Shinya Takeda** and Yuji Yamaoka** (DM) • Aleksandra Čereković*** (Output) • Project supervisors • Toyoaki Nishida* • Yukiko Nakano** • Igor S. Pandžić*** *Kyoto University, Graduate School of Informatics, Nishida-Sumi Lab **Seikei University (Tokyo), Faculty of Science and Technology ***University of Zagreb, Faculty of Electrical Engineering and Computing, HOTLab group