
Presentation Transcript


1. Design and First Tests of a Chatter
Hans Dybkjær, SpeechLogic™, Prolog Development Center A/S
& Laila Dybkjær, NISLab, University of Southern Denmark

2. Chatting
• Dialogue type not common in the state of the art
• Eliza, chatbots: written interaction
• New kinds of application:
  • edutainment
  • chat with a character from a series of commercials
  • small talk while waiting, instead of music
• Test-bed for new conversational techniques:
  • express feelings
  • understand feelings
  • non-task-oriented dialogue
  • other new features
How far can we push current technology towards free conversation?

3. Kurt
• Entertains users through chat (in Danish)
• Limited vocabulary (350 words)
• Phone-based
• Preferences for food, notably fruit and vegetables
• Kurt himself, e.g. his name, his age, and where he works
• Personality:
  • childish
  • affective
  • self-centred
  • defensive, with an underlying uncertainty
  • evasive
The personality is designed to hide the shortcomings of the understanding level.

4. Features for emotion modelling
Available:
• (Phonetic) lexicon
• Grammar
• Recognition scores
• Phrasing
• Dialogue flow
Available, but not used:
• n-best ambiguity
• barge-in
• event handling
• complex task domain
Not available (input):
• Glottal stop
• Stress
• Prosody
• Non-linguistic vocal phenomena, e.g. laughter
• Mood (anger, joy, ...)
• Aware sites
• Overlapping speech (back-channelling)
• ...
The platform allows limited emotion modelling features.

5. Interaction model
[Architecture diagram: user input (e.g. "You are stupid") passes through the linguistic personality component, affect computation, the flow model (dialogue management) and output generation (e.g. "Fool yourself, …"), all reading and updating a shared state.]
A standard dialogue model extended with affective state and handling; a code sketch of one turn follows below.
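
To make the pipeline concrete, here is a minimal Python sketch of one dialogue turn. It assumes each box in the diagram collapses to a single function; all function names, the state dictionary, and the stub behaviours are illustrative, not taken from the actual system.

```python
# Minimal sketch of the extended dialogue loop: a standard flow model plus an
# affective state that every turn reads and updates. All names are illustrative.

def interpret(utterance, state):
    # Linguistic personality: tag the input using a (here trivial) lexicon.
    return {"text": utterance, "face_value": -1 if "stupid" in utterance else 0}

def compute_affect(interp, state):
    # Affect computation: negative face value lowers satisfaction.
    state["satisfaction"] += interp["face_value"]
    return state

def manage_dialogue(interp, state):
    # Flow model: react to insults, otherwise keep the chat going.
    return "insult_reply" if interp["face_value"] < 0 else "ask_question"

def generate_output(move, state):
    # Output generation: canned phrases per dialogue move.
    return {"insult_reply": "Fool yourself!", "ask_question": "Do you like apples?"}[move]

def dialogue_turn(utterance, state):
    interp = interpret(utterance, state)
    state = compute_affect(interp, state)
    move = manage_dialogue(interp, state)
    return generate_output(move, state)

print(dialogue_turn("you are stupid", {"satisfaction": 0}))  # -> Fool yourself!
```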

6. Linguistic personality
Lexicon tagged with:
• Face value
• Preference
• Embarrassment
Used for input interpretation.
Face value:
• Kurt is sensitive to losing face
• Negative face value: e.g. corrections and insults
• Positive face value: e.g. praise
Preference:
• Words are liked, disliked or neutral
Embarrassment:
• Certain words are embarrassing
• All other words are neutral
Context-independent assumption (see the lexicon sketch below).
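
As an illustration of the tagging, a small Python sketch follows; the example words, numeric face/preference values and dictionary layout are assumptions, not the actual 350-word Danish lexicon.

```python
# Sketch of a context-independent personality lexicon: each known word carries a
# face value, a preference and an embarrassment flag; unknown words are neutral.
# Words and values are illustrative examples only.

LEXICON = {
    "stupid":    {"face": -1, "pref":  0, "embarrassing": False},  # insult
    "clever":    {"face": +1, "pref":  0, "embarrassing": False},  # praise
    "apples":    {"face":  0, "pref": +1, "embarrassing": False},  # liked food
    "liver":     {"face":  0, "pref": -1, "embarrassing": False},  # disliked food
    "underwear": {"face":  0, "pref":  0, "embarrassing": True},   # awkward topic
}

NEUTRAL = {"face": 0, "pref": 0, "embarrassing": False}

def tag_input(utterance):
    """Input interpretation: look up each word independently of context."""
    return [LEXICON.get(word, NEUTRAL) for word in utterance.lower().split()]

print(tag_input("you are stupid"))  # last word is tagged with face value -1
```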

7. Negation
• Changes face value and preference
• Does not affect embarrassment
• Syntactic negation: "you are not stupid"
• Semantic negation: "you hate apples"
• The implication of a negation may depend on whether it is a question or a statement:
  • you hate apples = don't you hate apples
  • you are not stupid ≠ aren't you stupid
Though = and ≠ are not fully semantically correct, they hold with respect to face value and preference.
More complex logical negation is not useful for spoken language (see the sketch below).
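
A possible reading of this rule in code, assuming the same illustrative tag representation as above:

```python
# Sketch of the negation rule: flip face value and preference, leave
# embarrassment untouched. The tag representation is an assumption.

def negate(tags):
    return {
        "face": -tags["face"],                  # "not stupid" loses its sting
        "pref": -tags["pref"],                  # "hate apples" inverts the liking
        "embarrassing": tags["embarrassing"],   # an awkward topic stays awkward
    }

stupid = {"face": -1, "pref": 0, "embarrassing": False}
print(negate(stupid))  # {'face': 1, 'pref': 0, 'embarrassing': False}
```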

8. Affect computation
Self-confidence:
• Based on recognition scores
• Changed by accept/reject
Embarrassment:
• Triggers a topic change
Face value:
• Complex; simplified:
  • if any input value is negative, take the minimum
  • otherwise take the maximum
Preference:
• Positive/negative face value => knock-on effect
• Not a function of single words
• But, as for face value:
  • if any input value is negative, take the minimum
  • otherwise take the maximum
Simplified but transparent (aggregation rule sketched below).
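
The min/max rule can be written down directly; the numeric scores below are illustrative assumptions.

```python
# Sketch of the simplified aggregation rule used for both face value and
# preference: any negative evidence dominates (take the minimum), otherwise
# the most positive value wins.

def aggregate(scores):
    negatives = [s for s in scores if s < 0]
    return min(negatives) if negatives else max(scores, default=0)

print(aggregate([+1, -1, 0]))  # -1: a single insult outweighs the praise
print(aggregate([+1, 0, +1]))  # +1: purely positive input
```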

9. Affective state
Self-confidence:
• Influences the magnitude of satisfaction changes
• Influences the flow
Satisfaction:
• Main personality control
• Scale from angry (low) to exalted (high); at the angry end Kurt gets angry and hangs up
• Overflow at both ends
• Initial level is neutral
• Changes computed from:
  • input preference
  • input face value
  • self-confidence level
A two-parameter model (update sketched below).
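
A minimal sketch of the satisfaction update, assuming a 0..1 scale (angry .. exalted) starting at 0.5, with self-confidence scaling the size of each change and clamping standing in for "overflow at both ends"; the scale and weights are assumptions, not the system's actual numbers.

```python
# Satisfaction moves up or down depending on input face value and preference,
# scaled by self-confidence, and is clamped at both ends of the scale.

def update_satisfaction(satisfaction, face_value, preference, self_confidence):
    delta = self_confidence * 0.1 * (face_value + preference)
    return min(1.0, max(0.0, satisfaction + delta))

s = 0.5                                   # initial level is neutral
s = update_satisfaction(s, -1, -1, 0.8)   # insult about a disliked topic
print(round(s, 2))                        # 0.34: drifting towards "angry"
```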

10. Dialogue management
Flow model:
• Questions
• Answers
• Statements
• Jokes
• Feedback (implicit, explicit)
Embarrassment:
• Joke and change topic
Satisfaction:
• "Underflow" leads to hangup
• No other flow effect
Self-confidence (scale 0–1):
• low: explicit feedback; joke at accept
• medium: implicit feedback; joke at accept
• high: no feedback; no joke at accept
Simple task solving plus some more chat-like interaction (decision rules sketched below).
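
A sketch of these flow decisions in code; the numeric thresholds for low / medium / high self-confidence are assumptions.

```python
# Feedback style follows self-confidence, embarrassment triggers a joke plus a
# topic change, and satisfaction "underflow" ends the call.

def feedback_policy(self_confidence):
    if self_confidence < 0.33:
        return {"feedback": "explicit", "at_accept": "joke"}
    if self_confidence < 0.66:
        return {"feedback": "implicit", "at_accept": "joke"}
    return {"feedback": "none", "at_accept": "none"}

def flow_effect(satisfaction, embarrassed):
    if satisfaction <= 0.0:
        return "hang_up"                  # "underflow" leads to hangup
    if embarrassed:
        return "joke_and_change_topic"
    return "continue"

print(feedback_policy(0.2))     # {'feedback': 'explicit', 'at_accept': 'joke'}
print(flow_effect(0.0, False))  # hang_up
```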

11. Generate output
Phrases:
• Canned
• Composed of:
  • change marker
  • insults and jokes
  • answers and feedback
  • prompts
Change marker:
• Notifies the user of the system's emotional state
• Function of (satisfaction state, satisfaction change):
  • high, high: Happy
  • low, low: Angry
  • high, low: Forbearing
  • low, high: Distrustful
Random phrases:
• Variation, less rigid
A simple scheme with large variability (marker selection sketched below).
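
One way to implement the marker choice; reading "high change" as a rise and "low change" as a drop, and the 0.5 cut-off between high and low satisfaction, are assumptions.

```python
# Choose the change marker from the satisfaction level and the direction of
# its latest change.

def change_marker(satisfaction, change):
    high_state = satisfaction >= 0.5
    rising = change >= 0
    if high_state and rising:
        return "happy"
    if not high_state and not rising:
        return "angry"
    if high_state and not rising:
        return "forbearing"
    return "distrustful"                  # low state, but things are improving

print(change_marker(0.8, +0.1))  # happy
print(change_marker(0.3, +0.1))  # distrustful
```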

  12. Example dialogue

13. Data collection
• No controlled experiments
• Dialogues collected from a demo line
• 86 dialogues transcribed across 3 system iterations
• Many dialogues performed by children
• First output voice: a 40-year-old male
• Second output voice: a 14-year-old boy
A small corpus, but sufficient to give an impression.

14. Learned from the dialogues (1)
Start of a call:
• identity
• age
• location
• what he knows about
• how are you
During the call:
• mostly questions concerning Kurt
• perhaps a search for common ground
• little volunteered information
• dialogue about the conversation itself
Dinner-party conversation with a twist.

15. Learned from the dialogues (2)
Topics asked about by users:
• personal (where he works, where he lives, childhood, wife, children, parents, health, hair, eye colour, glasses, whether he smokes, …)
• adjective descriptions (stupid, clever, handsome, …)
• likes and dislikes (alcohol, food, football, music, work, sex, …)
• utterances related to what the system says (insults, long input, …)
Topics depend on the modelled person.

16. Next steps
• Extend grammar coverage
• Extend Kurt’s knowledge about himself
• Provide him with interests
• Let Kurt ask questions about the user
• Experiment with additional parameters (patience, balance, self-esteem, pessimism/optimism)
• Weighting of parameters depending on personality
• New kinds of interaction patterns (handing over the phone, detection of repeated calls from the same number)
Extended conversational and emotional coverage.

17. Conclusion
• Vocabulary and grammar clearly too small for longer interactions
• Entertaining despite all shortcomings
• In particular:
  • repetition of what was understood
  • reactions to insults
Simple, but with entertaining aspects.
