Conversational Applications Workshop Introduction

Presentation Transcript


  1. Conversational Applications Workshop Introduction. Jim Larson

  2. W3C started with the Speech Interface Framework
     (Architecture diagram) Components: Dialog Manager, Speech Recognizer, DTMF Tone Recognizer, Speech Synthesizer, Prerecorded Audio Player, Telephone System, World Wide Web, User
     Languages: VoiceXML 2.0/2.1, SRGS 1.0, Semantic Interpretation 1.0, SSML 1.0, PLS 1.0, CCXML 1.0
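
     To make the framework concrete, here is a minimal sketch of a VoiceXML 2.0 document that ties several of these languages together: the dialog manager interprets the VoiceXML, the speech recognizer is constrained by an SRGS grammar, and the prompts are rendered by the speech synthesizer. The dialog and the grammar file name (cities.grxml) are illustrative, not taken from the slides.

     <?xml version="1.0" encoding="UTF-8"?>
     <vxml version="2.0" xmlns="http://www.w3.org/2001/vxml" xml:lang="en-US">
       <form id="travel">
         <field name="city">
           <!-- Prompt rendered by the speech synthesizer -->
           <prompt>Which city would you like to fly to?</prompt>
           <!-- Recognition constrained by an SRGS 1.0 grammar (hypothetical file name) -->
           <grammar src="cities.grxml" type="application/srgs+xml"/>
           <filled>
             <prompt>Flying to <value expr="city"/>.</prompt>
           </filled>
         </field>
       </form>
     </vxml>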

  3. Next came the W3C Multimodal Interaction Framework

  4. Input Components

  5. Output Components

  6. World Wide Web Consortium Standardizes Languages
     • Voice Browser Working Group
       • VoiceXML 2.0 & 2.1
       • Speech Recognition Grammar Specification 1.0
       • Speech Synthesis Markup Language 1.1
       • Semantic Interpretation for Speech Recognition 1.0
       • Pronunciation Lexicon 1.0
       • Call Control XML 1.0
       • State Chart XML 1.0
     • Multimodal Interaction Working Group
       • Multimodal Architecture and Interfaces 1.0
       • Extended Multimodal Architecture 1.0
       • Emotion Markup Language 1.0
       • InkML 1.0
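
     As an illustration of two of these languages working together, the following is a small sketch of an SRGS 1.0 grammar (XML form) that uses Semantic Interpretation for Speech Recognition 1.0 string literals to return airport codes. The rule name and vocabulary are invented for this example.

     <?xml version="1.0" encoding="UTF-8"?>
     <grammar version="1.0" xmlns="http://www.w3.org/2001/06/grammar"
              xml:lang="en-US" root="city" mode="voice"
              tag-format="semantics/1.0-literals">
       <rule id="city" scope="public">
         <one-of>
           <!-- Each <tag> holds the SISR literal returned when the item is matched -->
           <item>Portland <tag>PDX</tag></item>
           <item>Seattle <tag>SEA</tag></item>
           <item>San Francisco <tag>SFO</tag></item>
         </one-of>
       </rule>
     </grammar>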

  7. Goal of this Workshop
     • Advise W3C Voice Browser and Multimodal Working Groups what to do next to better enable conversational voice systems
     • Identify and justify new languages, for example:
       • Context-Sensitive Grammar Language
       • Statistical Markup Language
       • Semantic Representation Language
     • Identify and justify extensions to existing languages, for example:
       • PLS 1.0 (see the sketch below)
         • Parts of speech, grammatical features
       • SRGS 1.1
         • Boolean constraints
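
     For context on the proposed PLS extension, this is what a lexeme looks like in PLS 1.0 today: a spelling paired with a pronunciation. PLS 1.0 itself does not define a part-of-speech or grammatical-feature vocabulary, which is the kind of extension listed above. The entry is an invented example.

     <?xml version="1.0" encoding="UTF-8"?>
     <lexicon version="1.0"
              xmlns="http://www.w3.org/2005/01/pronunciation-lexicon"
              alphabet="ipa" xml:lang="en-US">
       <lexeme>
         <!-- PLS 1.0 pairs graphemes with phonemes; parts of speech and other
              grammatical features are the proposed extension, not shown here -->
         <grapheme>multimodal</grapheme>
         <phoneme>ˌmʌltiˈmoʊdəl</phoneme>
       </lexeme>
     </lexicon>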

  8. Not a goal of this workshop
     • Do not specify architectures
       • Languages should work under multiple architectures
       • Vendors are free to design their own architectures to support W3C languages
     • Do not specify the language details
       • This is the responsibility of W3C working groups
       • Take care to avoid IP issues
     We may need to discuss architectures and language details to provide context for the use of a new language and to explain its purpose

  9. To justify a new language or language extension
     • Explain what new applications are enabled by the language
       • Use cases
       • Concrete examples
     • Identify existing implementations of the language
       • Demonstrate that it is implementable and useful
     • Demonstrate real interest in the language among vendors

  10. Prioritize new languages and language extensions • Must have, should have, nice to have

  11. Workshop Deliverables
     • Summary of discussions
       • Minute takers send minutes to Kazuyuki, who will integrate them into a web page
     • Document list of new languages and extensions to existing languages
       • Brief description
       • Use cases and concrete examples
       • Justification
       • Existing implementations

  12. Our agenda
     • First day
       • Identify suggestions for new languages and extensions to existing languages by reviewing position papers
     • Second day
       • Justify each new language and extension to an existing language
         • Brief description (one paragraph)
         • Use cases
         • Identify existing implementations
       • Prioritize recommendations
