
QUESTION ANSWERING


Presentation Transcript


  1. QUESTION ANSWERING

  2. Overview • What is Question Answering? • Why use it? • How does it work? • Problems • Examples • Future

  3. What is it? • Definition of Question Answering • Examples • AskJeeves is probably the best-known example • AnswerBus is an open-domain question answering system • Ionaut, EasyAsk, AnswerLogic, AnswerFriend, Start, LCC, Quasm, Mulder, Webclopedia, etc.

  4. Why use it? • From AskJeeves: “Search engines do not speak your language. They make you speak their language; a language that's strange, confusing, and includes words that no one is entirely sure of their meaning.” • QA engines attempt to let you ask your question the way you'd normally ask it. • Inexperienced users • Document = Answer?

  5. How does it work? • Natural Language Processing • Semantic Processing • Syntactic Processing • Parsing • Knowledge Base • Answer Processing
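
To make these stages concrete, here is a minimal sketch of how such a pipeline might fit together. Everything in it (the stopword list, the toy knowledge base, the function names) is invented for illustration; no particular engine works exactly this way.

```python
# Hypothetical sketch of the pipeline stages named on this slide.
# The stopword list and toy knowledge base are invented for illustration.
import re

KNOWLEDGE_BASE = {
    ("river", "big muddy"): "the Mississippi River",
    ("head", "dime"): "Franklin D. Roosevelt",
}

def parse(question):
    """Syntactic processing: reduce the question to content words."""
    stopwords = {"what", "is", "the", "a", "in", "us", "known",
                 "as", "on", "person", "s", "whose", "do"}
    tokens = re.findall(r"[a-z]+", question.lower())
    return [t for t in tokens if t not in stopwords]

def lookup(keywords):
    """Knowledge-base stage: match question keywords against stored facts."""
    for key, fact in KNOWLEDGE_BASE.items():
        if all(any(k in kw or kw in k for kw in keywords) for k in key):
            return fact
    return None

def answer(question):
    """Answer processing: return the retrieved fact, or admit failure."""
    return lookup(parse(question)) or "No answer found."

print(answer("What river in the US is known as the Big Muddy?"))
# -> the Mississippi River
```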

  6. Natural Language Processing (NLP) • Each engine has its own process • START Natural Language System • Parsing • Natural Language Annotation • Processing Component
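
START (MIT's natural language system, linked in the sources) matches questions against natural language annotations: plain sentences describing each piece of content, which are parsed into subject-relation-object triples. A toy sketch of that idea follows; the triple extractor is deliberately naive and the annotation catalogue is invented.

```python
# Toy illustration of START-style natural language annotation matching.
# Annotations and questions are reduced to (subject, relation, object)
# triples; the naive splitter below stands in for START's real parser.

def to_triple(sentence):
    """Assume simple 'subject verb object...' word order."""
    words = sentence.lower().rstrip("?.").split()
    return (words[0], words[1], " ".join(words[2:]))

# Each content segment is annotated with a descriptive sentence.
ANNOTATIONS = {
    to_triple("Monet painted water lilies"): "images/monet/",
    to_triple("Lobsters inhabit cold coastal waters"): "articles/lobster.html",
}

def answer(question):
    subj, rel, obj = to_triple(question)
    for (s, r, o), segment in ANNOTATIONS.items():
        # Question words like 'who'/'what' act as wildcard subjects.
        if rel == r and obj == o and subj in (s, "who", "what"):
            return "See " + segment
    return "No matching annotation."

print(answer("Who painted water lilies?"))  # -> See images/monet/
```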

  7. Answer Processing
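
The original slide was a diagram. As one concrete example of answer processing, the AskMSR paper listed in the sources ranks candidate answers by how often short n-grams recur across retrieved snippets; a toy version of that voting step:

```python
# Toy redundancy-based answer processing in the spirit of the AskMSR
# paper cited in the sources: candidate n-grams from retrieved snippets
# are tallied, so answers repeated across many snippets rank first.
from collections import Counter

def rank_candidates(snippets, n=2):
    votes = Counter()
    for snippet in snippets:
        words = snippet.lower().split()
        for i in range(len(words) - n + 1):
            votes[" ".join(words[i:i + n])] += 1
    return votes.most_common(3)

snippets = [
    "Mississippi River is often called the Big Muddy",
    "Big Muddy is a nickname for the Mississippi River",
    "The Mississippi River drains much of the central US",
]
print(rank_candidates(snippets))
# 'mississippi river' lands at the top of the vote tally.
```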

  8. AskJeeves • Has its own knowledge base and uses partners to answer questions • Catalogues previous questions • Answer-processing engine • Question template response
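
A hedged sketch of the question-template idea mentioned above: an incoming question is matched against a catalogue of templates, each mapped to a prepared answer source. The templates and answer targets here are invented for the example.

```python
# Illustrative question-template routing of the kind the slide
# attributes to AskJeeves. Templates and targets are invented.
import re

TEMPLATES = [
    (re.compile(r"where (can i|do i|to) (find|buy) (?P<thing>.+?)\??$", re.I),
     "shopping results for {thing}"),
    (re.compile(r"who is (?P<person>.+?)\??$", re.I),
     "biography page for {person}"),
]

def route(question):
    for pattern, target in TEMPLATES:
        m = pattern.match(question.strip())
        if m:
            return target.format(**m.groupdict())
    return "fall back to partner search engines"

print(route("Who is Claude Monet?"))   # -> biography page for Claude Monet
print(route("How tall is Everest?"))   # -> fall back to partner search engines
```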

  9. AnswerBus
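
This slide showed a screenshot. Per the WWW2002 poster listed in the sources, AnswerBus forwards the question to several web search engines and scores sentences from the hits by question-word overlap. A simplified sketch with the search calls mocked out:

```python
# Simplified sketch of the AnswerBus flow: query several search
# engines, then score candidate sentences by question-word overlap.
# mock_search stands in for real search-engine calls.

def mock_search(engine, query):
    """Stand-in for querying a real search engine; returns sentences."""
    canned = {
        "google": ["Hyenas live in Africa and parts of Asia."],
        "yahoo": ["The spotted hyena lives in sub-Saharan Africa."],
    }
    return canned.get(engine, [])

def score(sentence, question):
    """Count question words that reappear in the candidate sentence."""
    qwords = set(question.lower().rstrip("?").split())
    return len(qwords & set(sentence.lower().rstrip(".").split()))

def answerbus_style(question):
    candidates = []
    for engine in ("google", "yahoo"):
        candidates += mock_search(engine, question)
    return max(candidates, key=lambda s: score(s, question))

print(answerbus_style("Where do hyenas live?"))
# -> Hyenas live in Africa and parts of Asia.
```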

  10. Problems • “How” and “Why” questions • “What” questions, e.g., “What happened?” or “What did we do?” • Answer quality: is the answer correct? • Answer presentation

  11. Correct? (From Webclopedia) • Question: Where do lobsters like to live? Answer: on a Canadian airline • Question: Where do hyenas live? Answers: in Saudi Arabia; in the back of pick-up trucks • Question: Where are zebras most likely found? Answers: near dumps; in the dictionary • Question: Why can't ostriches fly? Answer: Because of American economic sanctions • Collected by Ulf Hermjakob, November 29, 2001

  12. Text Retrieval Conference (TREC) • Yearly information retrieval competition • Began in 1992; the QA track was added in 1999 to encourage research into systems that return answers rather than document lists • Questions are open-domain but closed-class • Answers must be under 50 characters: entities or noun phrases
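
The length constraint is easy to check mechanically; a minimal illustration (the entity/noun-phrase test is omitted, since it would need a real parser):

```python
# Minimal check of the TREC QA answer format described above:
# answers had to be short strings of fewer than 50 characters.
def valid_trec_answer(answer, limit=50):
    # Entity/noun-phrase validation would need a parser; length only.
    return len(answer) < limit

print(valid_trec_answer("the Mississippi River"))             # True
print(valid_trec_answer("a long explanatory paragraph " * 4)) # False
```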

  13. Text Retrieval Conference (TREC) • 500 questions in 2001 • Some answers were nil (no answer exists in the collection), which added considerable difficulty • Lots of definition questions • QA list tasks, e.g., “Name 4 cities that have a ‘Shubert’ theater.” • QA context tasks, e.g., a linked series: “How many species of spiders are there?” “How many are poisonous to humans?” “What percentage of spider bites in the US are fatal?”

  14. Example Questions and Results • What river in the US is known as the Big Muddy? • AskJeeves • AnswerBus • Google

  15. Example Questions and Results • What person’s head is on a dime? • AskJeeves • AnswerBus • AltaVista

  16. Example Questions and Results • Show some paintings by Claude Monet • START

  17. Looking Ahead • User demand • Enormous interest in the problem • Successes

  18. Conclusion • Question Answering and search engines • Why it's used • The future • A Moore's Law for QA?

  19. Sources • AskMSR: Question Answering Using the Worldwide Web • Michele Banko, Eric Brill, Susan Dumais, Jimmy Lin • http://www.ai.mit.edu/people/jimmylin/publications/Banko-etal-AAAI02.pdf • In Proceedings of the 2002 AAAI Symposium on Mining Answers from Text and Knowledge Bases, March 2002 • Web Question Answering: Is More Always Better? • Susan Dumais, Michele Banko, Eric Brill, Jimmy Lin, Andrew Ng • http://research.microsoft.com/~sdumais/SIGIR2002-QA-Submit-Conf.pdf

  20. Sources • AnswerBus • www.answerbus.com • http://misshoover.si.umich.edu/~zzheng/qa-new/ • http://www2002.org/CDROM/poster/203/ • AskJeeves • http://www.ask.co.uk/docs/about/what_is.asp • Webclopedia • http://trec.nist.gov/pubs/trec9/papers/webclopedia.pdf • http://www.isi.edu/natural-language/projects/webclopedia/ • Start • http://www.ai.mit.edu/projects/infolab/ailab.html • Text Retrieval Conference • http://trec.nist.gov/presentations/TREC10/qa/
