
Evaluating Implicit Measures to Improve the Search Experience

Presentation Transcript


  1. Evaluating Implicit Measures to Improve the Search Experience SIGIR 2003 Steve Fox

  2. Outline • Background • Approach • Data Analysis • Value-Add Contributions • Result-Level Findings • Session-Level Findings

  3. Background • Interested in implicit measures to improve the user’s search experience • What the user wants • What satisfies them • Believed certain implicit measures were significant • Needed to prove it • Two goals: • Test the association between implicit measures and user satisfaction • Understand which implicit measures were useful within this association

  4. Approach • Architecture • Internet Explorer add-in • Client-Server • Configured for MSN Search and Google • Deployment • Internal MS employees (n = 146) – work environment • Implicit measures and explicit feedback • SQL Server back-end
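The architecture on this slide is a client-server pipeline: an Internet Explorer add-in records implicit measures and explicit feedback on each client and ships them to a SQL Server back-end. A minimal Python sketch of one result-level log record is below; the field names, example values, and the SQLite stand-in for SQL Server are illustrative assumptions, not the study’s actual schema.

```python
# Hypothetical result-level log record; field names such as dwell time, scroll
# count and exit type are assumptions, and SQLite stands in for SQL Server.
import sqlite3
from dataclasses import dataclass, astuple

@dataclass
class ResultEvent:
    session_id: str
    query: str
    result_url: str
    dwell_time_secs: float    # time spent on the clicked result
    scroll_count: int         # scrolling activity on the result page
    exit_type: str            # e.g. back to results, new query, window closed
    printed: bool             # explicit actions the later slides call highly predictive
    added_to_favorites: bool
    explicit_sat: str         # VSAT / PSAT / DSAT label from the feedback prompt

conn = sqlite3.connect("implicit_measures.db")
conn.execute("""CREATE TABLE IF NOT EXISTS result_events (
    session_id TEXT, query TEXT, result_url TEXT, dwell_time_secs REAL,
    scroll_count INTEGER, exit_type TEXT, printed INTEGER,
    added_to_favorites INTEGER, explicit_sat TEXT)""")

event = ResultEvent("s001", "implicit measures", "http://example.org/result",
                    dwell_time_secs=42.5, scroll_count=3, exit_type="back_to_results",
                    printed=False, added_to_favorites=True, explicit_sat="VSAT")
conn.execute("INSERT INTO result_events VALUES (?,?,?,?,?,?,?,?,?)", astuple(event))
conn.commit()
```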

  5. Approach, cont’d

  6. Data Analysis • Bayesian modeling at the result and session level • Trained on 80% of the data, tested on 20% • Three levels of SAT: VSAT (very satisfied), PSAT (partially satisfied) and DSAT (dissatisfied) • Implicit measures (listed in a table on the slide)
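As a concrete illustration of the modelling setup on this slide, the sketch below predicts one of the three SAT labels from a handful of implicit measures using an 80/20 train/test split. A naive Bayes classifier and randomly generated feature values stand in for the study’s Bayesian models and logged data, so the numbers it prints only demonstrate the pipeline.

```python
# Sketch of the data-analysis pipeline: implicit-measure features, three SAT
# classes, 80/20 train/test split. Synthetic data and a naive Bayes stand-in;
# not the study's actual models or measurements.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.exponential(30, n),    # dwell time (seconds)
    rng.integers(0, 2, n),     # clickthrough (0/1)
    rng.integers(0, 3, n),     # exit type (coded)
    rng.poisson(2, n),         # scroll count
])
y = rng.choice(["VSAT", "PSAT", "DSAT"], size=n)     # explicit satisfaction labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)             # 80/20 split

model = GaussianNB().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```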

  7. Data Analysis, cont’d

  8. Result-Level Findings • Dwell time, clickthrough and exit type strongest predictors of SAT • Printing and Adding to Favorites highly predictive of SAT when present • Combined measures predict SAT better than clickthrough

  9. Result-Level Findings, cont’d • Models compared (80-20 train/test split): • Clickthrough only • Combined measures • Combined measures with confidence > 0.5
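The comparison on this slide can be mocked up as follows: a model given only clickthrough versus one given the combined implicit measures, plus the combined model scored only on predictions it makes with confidence above 0.5. This repeats the synthetic setup from the previous sketch, so the printed numbers are illustrative only and will not match the paper’s results.

```python
# Illustrative comparison: clickthrough-only vs. combined implicit measures,
# and the combined model restricted to confident (>0.5) predictions.
# Synthetic data; a naive Bayes classifier stands in for the study's models.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([rng.exponential(30, n),   # dwell time
                     rng.integers(0, 2, n),    # clickthrough
                     rng.integers(0, 3, n),    # exit type (coded)
                     rng.poisson(2, n)])       # scroll count
y = rng.choice(["VSAT", "PSAT", "DSAT"], size=n)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

click_only = GaussianNB().fit(X_tr[:, [1]], y_tr)   # clickthrough column only
combined = GaussianNB().fit(X_tr, y_tr)             # all implicit measures
print("clickthrough only :", click_only.score(X_te[:, [1]], y_te))
print("combined measures :", combined.score(X_te, y_te))

# Combined model, keeping only predictions whose top-class probability > 0.5
proba = combined.predict_proba(X_te)
keep = proba.max(axis=1) > 0.5
preds = combined.classes_[proba.argmax(axis=1)]
print("combined, conf>0.5:",
      (preds[keep] == y_te[keep]).mean() if keep.any() else "no confident predictions")
```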

  10. Session-Level Findings • Four findings: • Strong predictor of session-level SAT was result-level SAT • Dwell time strong predictor of SAT • Combination of (slightly different) implicit measures could predict SAT better than clickthrough • Some gene sequences predict SAT (preliminary and descriptive)

  11. Session-Level Findings, cont’d • Common patterns in gene analysis, e.g. SqLrZ • Session starts (S) • Submit a query (q) • Result list returned (L) • Click a result (r) • Exit on result (Z)
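The ‘gene’ encoding decoded on this slide can be sketched in a few lines: map each session action to a single character, turn a session into a string such as SqLrZ, and count recurring patterns. Only the five letter codes come from the slide; the action names, example sessions, and the 3-character pattern count are invented for illustration.

```python
# Sketch of the gene encoding: one character per session action, then simple
# pattern counting over the resulting strings. Action names and sessions are
# hypothetical; only the S/q/L/r/Z codes come from the slide.
from collections import Counter

CODES = {
    "session_start": "S",
    "submit_query": "q",
    "result_list_returned": "L",
    "click_result": "r",
    "exit_on_result": "Z",
}

def encode_session(actions):
    """Turn an ordered list of action names into a gene string."""
    return "".join(CODES[a] for a in actions)

sessions = [
    ["session_start", "submit_query", "result_list_returned",
     "click_result", "exit_on_result"],
    ["session_start", "submit_query", "result_list_returned", "click_result",
     "result_list_returned", "click_result", "exit_on_result"],
]
genes = [encode_session(s) for s in sessions]
print(genes)                       # ['SqLrZ', 'SqLrLrZ']

# Count recurring 3-character subsequences across all session genes
patterns = Counter(g[i:i + 3] for g in genes for i in range(len(g) - 2))
print(patterns.most_common(3))
```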

  12. Value-Add Contributions • Deployed in the work setting • Collected data in context of web search • Rich user behavior data stream • Annotated data stream with explicit judgment • Used new methodology to analyze the data • ‘Gene analysis’ to analyze usage patterns • Mapped usage patterns to SAT

  13. Question(s)
