
Tensor Query Expansion: a cognitively motivated relevance model


Presentation Transcript


  1. Tensor Query Expansion: a cognitively motivated relevance model. Mike Symonds, Peter Bruza, Laurianne Sitbon and Ian Turner, Queensland University of Technology

  2. Introduction
  • We use a formal model of word meaning to simulate the cognitive processes a user engages when formulating a query.
  • We apply this approach to query expansion in an ad hoc retrieval task.
  • Our approach shows significant improvements in retrieval effectiveness over the state of the art for:
  • short queries, and
  • newswire TREC data sets.

  3. Query Expansion (QE)
  • Geometric representations:
  • Rocchio (Rocchio, 1971)
  • Probabilistic representations:
  • relevance models (Lavrenko and Croft, 2001), which estimate P(w|R) (sketched below)
  • Term dependency approaches:
  • latent concept expansion (Metzler and Croft, 2007) and the positional relevance model (Lv and Zhai, 2010)
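The relevance modelling framework named above is the baseline that tensor query expansion later plugs into. As a point of reference, here is a minimal Python sketch of RM1-style estimation of P(w|R) from pseudo-relevant feedback documents, with RM3-style interpolation back into the original query. The function names and parameters (k, alpha) are illustrative assumptions, not details taken from the talk.

```python
from collections import Counter

def estimate_relevance_model(feedback_docs, query_likelihoods):
    """RM1: estimate P(w|R) from the top-k pseudo-relevant documents,
    weighting each document's term distribution by its query likelihood
    P(Q|D), then normalising."""
    p_w_r = Counter()
    for doc, p_q_d in zip(feedback_docs, query_likelihoods):
        counts = Counter(doc)
        length = sum(counts.values())
        for w, c in counts.items():
            p_w_r[w] += (c / length) * p_q_d      # P(w|D) * P(Q|D)
    total = sum(p_w_r.values()) or 1.0
    return {w: v / total for w, v in p_w_r.items()}

def expand_query(original_query, p_w_r, k=10, alpha=0.5):
    """RM3-style interpolation: mix the original query terms with the
    top-k expansion terms drawn from P(w|R); alpha weights the original."""
    top = dict(sorted(p_w_r.items(), key=lambda t: -t[1])[:k])
    top_total = sum(top.values()) or 1.0
    model = Counter()
    for w in original_query:
        model[w] += alpha / len(original_query)
    for w, p in top.items():
        model[w] += (1 - alpha) * p / top_total
    return dict(model)
```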

  4. Motivation
  • The user's information need is a cognitive construct.
  • The use of cognitive models in query expansion has not been studied extensively.
  • Trend in QE research: term dependency approaches are outperforming term-independent ones.
  • However, the semantic features they use have little, if any, linguistic meaning.

  5. Hypothesis
  • Using a cognitively motivated model of word meaning within the query expansion process can significantly improve retrieval effectiveness.
  • Model of word meaning:
  • the Tensor Encoding model (Symonds, 2011)
  • Structural linguistic theory:
  • Ferdinand de Saussure (1916)
  • syntagmatic associations (hot-sun)
  • paradigmatic associations (quick-fast)

  6. Modeling word meaning
  • Syntagmatic associations, scored as S_syn(Q, w)
  • use an efficient cosine measure
  • Paradigmatic associations
  • use a probability-based measure
  (Both kinds of association are illustrated in the sketch below.)
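The talk's cosine-based syntagmatic measure and probability-based paradigmatic measure come from the Tensor Encoding model and are not reproduced in the transcript. The sketch below only illustrates the two kinds of association, using first-order co-occurrence as a proxy for syntagmatic strength (words that occur together, like hot-sun) and second-order context similarity as a proxy for paradigmatic strength (words that occur in similar contexts, like quick-fast); it is not the model's actual formulation.

```python
import math
from collections import Counter, defaultdict

def cooccurrence_vectors(corpus, window=2):
    """Co-occurrence count vector for every word, built from a tokenised
    corpus (a list of token lists) with a symmetric context window."""
    vectors = defaultdict(Counter)
    for tokens in corpus:
        for i, w in enumerate(tokens):
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    vectors[w][tokens[j]] += 1
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse count vectors (Counters)."""
    num = sum(u[w] * v[w] for w in set(u) & set(v))
    den = math.sqrt(sum(c * c for c in u.values())) * \
          math.sqrt(sum(c * c for c in v.values()))
    return num / den if den else 0.0

def syntagmatic_score(query_terms, w, vectors):
    """Syntagmatic proxy: how often w co-occurs directly with the query
    terms (the hot-sun kind of association)."""
    return sum(vectors[q][w] for q in query_terms)

def paradigmatic_score(query_terms, w, vectors):
    """Paradigmatic proxy: how similar w's contexts are to each query
    term's contexts (the quick-fast kind of association)."""
    return sum(cosine(vectors[q], vectors[w]) for q in query_terms)
```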

  7. Tensor Query Expansion
  • Formally combine the syntagmatic and paradigmatic features:
  • using a Markov random field, and
  • fitting the result into the relevance modeling framework by replacing P(w|R) with P_{G,Γ}(w|Q)
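The Markov random field combination itself is not spelled out in the transcript. As a stand-in, the sketch below mixes the two illustrative association scores from the previous example with a weight gamma, normalises the result into a distribution over candidate terms, and uses that distribution in place of P(w|R) in the earlier RM3-style interpolation; the linear mix is an assumption made only for illustration.

```python
def _normalise(scores):
    """Scale a dict of non-negative scores so that it sums to one."""
    total = sum(scores.values()) or 1.0
    return {w: v / total for w, v in scores.items()}

def tensor_expansion_distribution(query_terms, vocab, vectors, gamma=0.5):
    """Illustrative stand-in for P_{G,Gamma}(w|Q): a linear mix of the
    normalised syntagmatic and paradigmatic scores, weighted by gamma.
    The actual model combines the features with a Markov random field."""
    syn = _normalise({w: syntagmatic_score(query_terms, w, vectors) for w in vocab})
    par = _normalise({w: paradigmatic_score(query_terms, w, vectors) for w in vocab})
    return {w: gamma * syn[w] + (1 - gamma) * par[w] for w in vocab}

# Used in place of P(w|R) within the relevance modelling framework:
# p_wq = tensor_expansion_distribution(query, vocab, vectors, gamma=0.6)
# expanded = expand_query(query, p_wq, k=10, alpha=0.5)  # from the RM3 sketch above
```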

  8. Ad Hoc Retrieval Results
  • Mean average precision (MAP)

  9. Ad Hoc Retrieval Results
  • Robustness, on the Associated Press and Wall Street Journal collections

  10. Ad Hoc Retrieval Results
  • Parameter sensitivity
  • Observe the change in MAP for different mixes of syntagmatic and paradigmatic evidence (i.e., gamma), on the Associated Press and Wall Street Journal collections (a sketch of such a sweep follows below)
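A minimal sketch of how such a sensitivity sweep could be run, reusing the illustrative helpers above (tensor_expansion_distribution, expand_query). The query-likelihood ranking, the Dirichlet smoothing parameter mu, and the queries/qrels/docs data structures are all assumptions for the sake of the example, not details from the talk.

```python
import math
from collections import Counter

def average_precision(ranked_doc_ids, relevant_ids):
    """Standard average precision for a single query."""
    hits, score = 0, 0.0
    for rank, doc_id in enumerate(ranked_doc_ids, start=1):
        if doc_id in relevant_ids:
            hits += 1
            score += hits / rank
    return score / len(relevant_ids) if relevant_ids else 0.0

def rank_documents(expanded_query, docs, mu=2500):
    """Query-likelihood ranking with Dirichlet smoothing; the expanded
    query is a dict of term weights (e.g. from expand_query above)."""
    collection = Counter()
    for tokens in docs.values():
        collection.update(tokens)
    coll_len = sum(collection.values()) or 1
    scores = {}
    for doc_id, tokens in docs.items():
        tf, dl = Counter(tokens), len(tokens)
        score = 0.0
        for w, weight in expanded_query.items():
            p = (tf[w] + mu * collection[w] / coll_len) / (dl + mu)
            if p > 0:
                score += weight * math.log(p)
        scores[doc_id] = score
    return [d for d, _ in sorted(scores.items(), key=lambda t: -t[1])]

def sweep_gamma(queries, qrels, docs, vocab, vectors,
                gammas=(0.0, 0.25, 0.5, 0.75, 1.0)):
    """MAP at each gamma setting, mirroring the parameter-sensitivity plots."""
    results = {}
    for gamma in gammas:
        aps = []
        for qid, query in queries.items():
            p_wq = tensor_expansion_distribution(query, vocab, vectors, gamma)
            expanded = expand_query(query, p_wq, k=10, alpha=0.5)
            ranking = rank_documents(expanded, docs)
            aps.append(average_precision(ranking, qrels.get(qid, set())))
        results[gamma] = sum(aps) / len(aps) if aps else 0.0
    return results
```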

  11. Summary of contribution
  • A cognitively motivated approach to performing query expansion.
  • Use of semantic features that have explicit linguistic meaning.
  • Demonstrated a significant improvement in retrieval effectiveness over the unigram relevance model.

  12. Future Work
  • Evaluate on larger data sets:
  • TREC GOV2, ClueWeb
  • Compare to the positional relevance model and latent concept expansion (LCE)
  • Evaluate on verbose queries:
  • longer queries carry more semantic information
  Questions?
