Transformation-based Learning for Semantic Parsing
F. Jurcicek, M. Gasic, S. Keizer, F. Mairesse, B. Thomson, K. Yu, S. Young
Cambridge University Engineering Department | {fj228, mg436, sk561, f.mairesse, brmt2, ky219, sjy}@eng.cam.ac.uk

Goals
• Develop a fast semantic decoder
• Robust to speech recognition noise
• Trainable on different domains:
  • Tourist information (TownInfo)
  • Air travel information system (ATIS)

Motivation
• Semantic parsing maps natural language to a formal language; most of the mapping in current dialogue is fairly simple (cities, times, ...). For example, "what are the lowest airfare from Washington DC to Boston" maps to:
  GOAL = airfare
  airfare.type = lowest
  from.city = Washington
  from.state = DC
  to.city = Boston
• To develop a fast semantic parser, we have to take advantage of this feature.
• Inference of a compact set of transformation rules which transform the initial naive semantic hypothesis (a learning sketch is given at the end of this transcript).

Transformation parsing
• The parser assigns initial naive semantics to the input sentence.
• Rules whose triggers match the sentence and the hypothesised semantics are applied sequentially (a parsing sketch is given at the end of this transcript).
• Example triggers and transformations:
  "tickets" → replace the goal with "airfare"
  "flights * from" & GOAL=airfare → replace the goal with "flight"
  "Seattle" → add the slot "to.city=Seattle"
  "connecting" → replace the slot "to.city=*" with "stop.city=*"

Example of parsing
• Input: "find all the flights between Toronto and San Diego that arrive on Saturday"
• Initial semantics: GOAL = flight
• Applied rules (trigger → transformation):
  "between toronto" → add the slot "from.city=Toronto"
  "and san diego" → add the slot "to.city=San Diego"
  "Saturday" → add the slot "departure.day=Saturday"
• Resulting semantics:
  GOAL = flight
  from.city = Toronto
  to.city = San Diego
  departure.day = Saturday

Evaluation
• Compare semantic tuple classifiers with:
  • A handcrafted Phoenix grammar
  • The Hidden Vector State model (He & Young, 2006)
  • Probabilistic Combinatory Categorial Grammar Induction (Zettlemoyer & Collins, 2007)
  • Markov Logic networks (Meza-Ruiz et al., 2008)
• Evaluation metrics:
  • Dialogue act type accuracy (e.g. inform or request)
  • Precision, recall and F-measure of dialogue act items (e.g. food=Chinese, toloc.city=New York); a metric sketch is given at the end of this transcript.
• Conclusion: semantic tuple classifiers are robust to noise and competitive with the state of the art; they provide a simple yet efficient solution to the spoken language understanding problem!

Open source
• The code is available at http://code.google.com/p/tbed-parser.

Acknowledgements
This research was partly funded by the UK EPSRC under grant agreement EP/F013930/1 and by the EU FP7 Programme under grant agreement 216594 (CLASSiC project: www.classic-project.org).
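
Parsing sketch. The following is a minimal Python sketch of the trigger-and-transformation mechanism described in the Transformation parsing and Example of parsing panels. The Rule and Hypothesis classes, the helper names (add_slot, replace_goal) and the plain substring matching are illustrative assumptions, not the actual tbed-parser implementation.

```python
# Illustrative sketch only: data structures and matching are assumptions,
# not the tbed-parser code.
from dataclasses import dataclass, field
from typing import Callable, Optional


@dataclass
class Hypothesis:
    """Semantic hypothesis: a goal plus a set of slot=value items."""
    goal: str = "flight"                       # initial naive hypothesis
    slots: dict = field(default_factory=dict)


@dataclass
class Rule:
    """A transformation rule: a trigger plus an edit of the hypothesis."""
    phrase: str                                # substring that must occur in the utterance
    goal: Optional[str]                        # optional constraint on the current goal
    edit: Callable[[Hypothesis], Hypothesis]   # the transformation itself


def add_slot(name: str, value: str) -> Callable[[Hypothesis], Hypothesis]:
    def edit(hyp: Hypothesis) -> Hypothesis:
        hyp.slots[name] = value
        return hyp
    return edit


def replace_goal(new_goal: str) -> Callable[[Hypothesis], Hypothesis]:
    def edit(hyp: Hypothesis) -> Hypothesis:
        hyp.goal = new_goal
        return hyp
    return edit


def parse(utterance: str, rules: list) -> Hypothesis:
    """Assign the initial naive semantics, then apply every rule whose
    trigger matches both the utterance and the current hypothesis."""
    hyp = Hypothesis()
    for rule in rules:
        if rule.phrase in utterance and rule.goal in (None, hyp.goal):
            hyp = rule.edit(hyp)
    return hyp


# Reproduces the flight example from the poster.
rules = [
    Rule("between toronto", None, add_slot("from.city", "Toronto")),
    Rule("and san diego", None, add_slot("to.city", "San Diego")),
    Rule("saturday", None, add_slot("departure.day", "Saturday")),
]
print(parse("find all the flights between toronto and san diego "
            "that arrive on saturday", rules))
# Hypothesis(goal='flight', slots={'from.city': 'Toronto',
#   'to.city': 'San Diego', 'departure.day': 'Saturday'})
```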
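
Learning sketch. The Motivation panel mentions inferring a compact set of rules. A greatly simplified, error-driven greedy learner in the spirit of transformation-based learning is sketched below, reusing parse and Hypothesis from the parsing sketch; the error measure, candidate handling and stopping criterion are assumptions, not the published training procedure.

```python
# Illustrative greedy rule learner; reuses parse() and Hypothesis from the
# parsing sketch above. The error measure is an assumption.
def count_errors(dataset, rules):
    """Number of missing or spurious goal/slot items after parsing with `rules`."""
    errors = 0
    for utterance, reference in dataset:
        hyp = parse(utterance, rules)
        predicted = set(hyp.slots.items()) | {("GOAL", hyp.goal)}
        gold = set(reference.slots.items()) | {("GOAL", reference.goal)}
        errors += len(predicted ^ gold)        # symmetric difference
    return errors


def learn_rules(dataset, candidates, max_rules=100):
    """Greedily grow an ordered rule list: at each step keep the candidate
    that most reduces the error count, and stop when nothing helps."""
    learned = []
    best = count_errors(dataset, learned)
    while candidates and len(learned) < max_rules:
        error, rule = min(
            ((count_errors(dataset, learned + [r]), r) for r in candidates),
            key=lambda pair: pair[0])
        if error >= best:
            break
        learned.append(rule)
        best = error
    return learned
```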
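
Metric sketch. Finally, the item-level evaluation metrics listed above can be made concrete with a short sketch; representing a dialogue act item as a (slot, value) pair is an assumption made here for illustration.

```python
# Illustrative item-level metrics; the (slot, value) representation is assumed.
def item_prf(reference_items, predicted_items):
    """Precision, recall and F-measure over dialogue act items."""
    ref, hyp = set(reference_items), set(predicted_items)
    correct = len(ref & hyp)
    precision = correct / len(hyp) if hyp else 0.0
    recall = correct / len(ref) if ref else 0.0
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f


# One reference act vs. a deliberately noisy hypothesis:
reference = [("food", "Chinese"), ("toloc.city", "New York")]
predicted = [("food", "Chinese"), ("toloc.city", "Boston")]
print(item_prf(reference, predicted))   # (0.5, 0.5, 0.5)
```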
