

Temporal Dynamics and Information Retrieval
Susan Dumais, Microsoft Research
http://research.microsoft.com/~sdumais



Presentation Transcript


  1. Temporal Dynamics and Information Retrieval Susan Dumais Microsoft Research http://research.microsoft.com/~sdumais In collaboration with: Eric Horvitz, Jaime Teevan, Eytan Adar, Jon Elsas, Ed Cutrell, Dan Liebling, Richard Hughes, Merrie Ringel Morris, Evgeniy Gabrilovich, Krysta Svore, Anagha Kulkarni

  2. Change is Everywhere in IR • Change is everywhere in digital information systems • New documents appear all the time • Document content changes over time • Queries and query volume change over time • What’s relevant to a query changes over time • E.g., U.S. Open 2011 (in May vs. Sept) • User interaction changes over time • E.g., tags, anchor text, social networks, query-click streams, etc. • Relations between entities change over time • E.g., President of the US in 2008 vs. 2004 vs. 2000 • Change is pervasive in digital information systems … yet, we’re not doing much about it!

  3. Information Dynamics • Today’s browse and search experiences ignore: • Content changes over time (1996–2009 timeline) • User visitation/re-visitation over time (1996–2009 timeline)

  4. Digital Dynamics • Easy to capture • But … few tools support dynamics

  5. Overview • Change on the desktop and news • Desktop: Stuff I’ve Seen; Memory Landmarks; LifeBrowser • News: Analysis of novelty (e.g., NewsJunkie) • Change on the Web • Content changes over time • User interaction varies over time (queries, re-visitation, anchor text, query-click stream, “likes”) • Tools for understanding Web change (e.g., Diff-IE) • Improving Web retrieval using dynamics • Query trends over time • Retrieval models that leverage dynamics • Task evolution over time

  6. Stuff I’ve Seen (SIS) [Dumais et al., SIGIR 2003] • Many silos of information • SIS: unified access to distributed, heterogeneous content (mail, files, web, tablet notes, RSS, etc.) • Index full content + metadata • Fast, flexible search • Information re-use • SIS -> Windows Desktop Search

  7. Example Desktop Searches: lots of metadata … especially time • Looking for: recent email from Fedor that contained a link to his new demo • Initiated from: Start menu • Query: from:Fedor • Looking for: the pdf of a SIGIR paper on context and ranking (not sure it used those words) that someone (don’t remember who) sent me about a month ago • Initiated from: Outlook • Query: SIGIR • Looking for: meeting invite for the last intern handoff • Initiated from: Start menu • Query: intern handoff kind:appointment • Looking for: C# program I wrote a long time ago • Initiated from: Explorer pane • Query: QCluster*.*

  8. Stuff I’ve Seen: Findings • Evaluation: • Internal to Microsoft, ~3000 users in 2004 • Methods: free-form feedback, questionnaires, usage patterns from log data, in situ experiments, lab studies for richer data • Personal store characteristics: • 5k–1500k items • Information needs: • Desktop search != Web search • Short queries (1.6 words) • Few advanced operators in the initial query (~7%) • But … many advanced operators and query iteration in UI (48%) • Filters (type, date, people); modify query; re-sort results • People know a lot about what they are looking for, and we need to provide a way to express it!

  9. Stuff I’ve Seen: Findings • Information needs: • People are important – 29% of queries involve names/aliases • Date is the most common sort order • Even w/ “best-match” default • Few searches for “best” matching object • Many other criteria (e.g., time, people, type), depending on task • Need to support flexible access • Abstraction is important – “useful” date, people, pictures • Age of items retrieved • Today (5%), Last week (21%), Last month (47%) • Need to support episodic access to memory

  10. Memory Landmarks • Importance of episodes in human memory • Memory organized into episodes (Tulving, 1983) • People-specific events as anchors (Smith et al., 1978) • Time of events often recalled relative to other events, historical or autobiographical (Huttenlocher & Prohaska, 1997) • Identify and use landmarks to facilitate search and information management • Timeline interface, augmented w/ landmarks • Learn Bayesian models to identify memorable events • Extensions beyond search, e.g., LifeBrowser
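
To make the "learn Bayesian models to identify memorable events" idea concrete, here is a minimal naive-Bayes-style sketch that scores a calendar event's memorability from a few boolean features. The feature names, prior, and likelihoods are invented for illustration; the actual Memory Landmarks models are learned from users' own labeled data.

```python
def memorability_odds(event, likelihoods, prior_memorable=0.2):
    """Tiny naive-Bayes-style score: start from the prior odds that an event
    is a memory landmark, then multiply in a likelihood ratio per boolean
    feature. Probabilities here are illustrative, not learned."""
    odds = prior_memorable / (1 - prior_memorable)
    for feature, present in event.items():
        p_mem, p_not = likelihoods[feature]          # P(f|memorable), P(f|not memorable)
        odds *= (p_mem / p_not) if present else ((1 - p_mem) / (1 - p_not))
    return odds / (1 + odds)                         # convert odds back to a probability

# Hypothetical features and likelihoods
likelihoods = {"atypical_location": (0.7, 0.2),
               "many_attendees":    (0.6, 0.3),
               "recurring_meeting": (0.1, 0.5)}
event = {"atypical_location": True, "many_attendees": True, "recurring_meeting": False}
print(round(memorability_odds(event, likelihoods), 2))
```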

  11. Memory Landmarks [Ringel et al., 2003] • Timeline view showing the distribution of search results over time • Memory landmarks alongside results: • General (world, calendar) • Personal (appts, photos) • Linked to results by time

  12. Memory Landmarks: Findings [Bar chart: search time in seconds, comparing a Dates Only condition with a Landmarks + Dates condition]

  13. Memory Landmarks: Learned models of memorability [Horvitz et al., 2004]

  14. LifeBrowser [Horvitz & Koch, 2010] • Data sources: images & videos, desktop & search activity, appts & events, locations, whiteboard capture

  15. LifeBrowser: Learned models of selective memory

  16. NewsJunkie: Personalized news via information novelty [Gabrilovich et al., WWW 2004] • News is a stream of information w/ evolving events • But, it’s hard to consume it as such • Personalized news using information novelty • Identify clusters of related articles • Characterize what a user knows about an event • Compute the novelty of new articles, relative to this background knowledge (relevant & novel) • Novelty = KL divergence(article || current_knowledge) • Use novelty score and user preferences to guide what, when, and how to show new information
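
The novelty measure on this slide, KL divergence of an article against the user's current knowledge of the story, can be sketched directly. A minimal illustration, assuming whitespace tokenization and additive smoothing; NewsJunkie's actual preprocessing and smoothing may differ, and the example strings are hypothetical:

```python
import math
from collections import Counter

def unigram_dist(tokens, vocab, alpha=0.01):
    """Additively smoothed unigram distribution over a shared vocabulary."""
    counts = Counter(tokens)
    total = len(tokens) + alpha * len(vocab)
    return {w: (counts[w] + alpha) / total for w in vocab}

def novelty_score(article_tokens, background_tokens):
    """KL(article || background): how much the new article diverges from
    what the user has already read about the event (in bits)."""
    vocab = set(article_tokens) | set(background_tokens)
    p = unigram_dist(article_tokens, vocab)      # the candidate article
    q = unigram_dist(background_tokens, vocab)   # the user's current knowledge
    return sum(p[w] * math.log(p[w] / q[w], 2) for w in vocab)

# Hypothetical example: a copycat-case article vs. coverage read so far
background = "pizza delivery man bomb bank robbery collar device".split()
article = "copycat case missouri man fake bomb bank robbery".split()
print(round(novelty_score(article, background), 3))
```

Higher scores mean more new material relative to the background; the score (plus user preferences) then guides what, when, and how to show new articles.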

  17. NewsJunkie in Action [Plot: novelty score vs. article sequence by time for the pizza-delivery-man-with-bomb incident; annotated articles: “Friends say Wells is innocent”, “Looking for two people”, “Copycat case in Missouri”, “Gun disguised as cane”]

  18. NewsJunkie in Action

  19. NewsJunkie Evaluation • Experiment to evaluate algorithms for detecting novelty • Task: Given a background article, select the set of articles you would recommend to a friend who wants to find out what’s new about the story • KL and Named Entity algorithms better than temporal • But, many types of “differences” • Recap, review of prior information • Elaboration, new information • Offshoot, related but mostly about something else • Irrelevant, not related to main story

  20. NewsJunkie: Types of novelty, via intra-article novelty dynamics [Four plots of novelty score vs. word position: on-topic, recap; on-topic, elaboration (“SARS patient’s wife held under quarantine”); offshoot (“Swiss company develops SARS vaccine”); offshoot (“SARS impact on Asian stock markets”)]

  21. Overview • Change on the desktop and news • Desktop: Stuff I’ve Seen; Memory Landmarks; LifeBrowser • News: Analysis of novelty (e.g., NewsJunkie) • Change on the Web • Content changes over time • User interaction varies over time (queries, re-visitation, anchor text, query-click stream, “likes”) • Tools for understanding Web change (e.g., Diff-IE) • Improving Web retrieval using dynamics • Query trends over time • Retrieval models that leverage dynamics • Task evolution over time Questions?

  22. Characterizing Web Change [Adar et al., WSDM 2009] • Large-scale Web crawls, over time • Revisited pages: 55,000 pages crawled hourly for 18+ months • Unique users, visits/user, time between visits • Pages returned by a search engine (for ~100k queries): 6 million pages crawled every two days for 6 months

  23. Measuring Web Page Change • Summary metrics • Number of changes • Amount of change • Time between changes • Change curves • Fixed starting point • Measure similarity over different time intervals • Within-page changes

  24. Measuring Web Page Change • Summary metrics: number of changes, amount of change, time between changes • 33% of Web pages change; 66% of visited Web pages change • 63% of these change every hr. • Avg. Dice coeff. = 0.80 • Avg. time between changes = 123 hrs. • .edu and .gov pages change infrequently, and not by much • Popular pages change more frequently, but not by much
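
A rough sketch of the summary metrics above: Dice similarity between successive crawl snapshots of a page, plus the hours between detected changes. Treating a snapshot as a set of terms is an assumption for illustration; the study's change measure may use a different text unit.

```python
def dice_similarity(old_tokens, new_tokens):
    """Dice coefficient between the term sets of two page snapshots:
    1.0 means identical term sets, lower means more change."""
    a, b = set(old_tokens), set(new_tokens)
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

def change_summary(snapshots):
    """snapshots: list of (timestamp_in_hours, tokens) in crawl order.
    Returns each snapshot's similarity to the previous one, and the
    number of hours between successive detected changes."""
    sims, gaps = [], []
    last_change = snapshots[0][0]
    for (t0, prev), (t1, curr) in zip(snapshots, snapshots[1:]):
        sim = dice_similarity(prev, curr)
        sims.append(sim)
        if sim < 1.0:                 # any term-level change counts as a change
            gaps.append(t1 - last_change)
            last_change = t1
    return sims, gaps
```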

  25. Measuring Web Page Change • Summary metrics: number of changes, amount of change, time between changes • Change curves: fixed starting point, measure similarity over different time intervals [Plot: similarity vs. time from starting point, with a “knot point”]

  26. Measuring Within-Page Change [Timeline: Sep.–Dec.] • DOM-level changes • Term-level changes • Divergence from norm (example terms: cookbooks, salads, cheese, ingredient, bbq, …) • “Staying power” of terms in the page
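
The "staying power" idea, separating durable page vocabulary from transient churn, can be approximated from a sequence of crawl snapshots. A sketch under the assumption that staying power means how consistently a term is present; the papers' exact definition of term longevity may differ.

```python
def staying_power(snapshots):
    """snapshots: list of token lists for one page, in crawl order.
    For each term, return the fraction of snapshots it appears in and its
    longest consecutive run of snapshots, a rough staying-power signal."""
    vocab = set().union(*(set(s) for s in snapshots))
    stats = {}
    for w in vocab:
        flags = [w in set(s) for s in snapshots]
        frac = sum(flags) / len(flags)
        run = best = 0
        for present in flags:
            run = run + 1 if present else 0
            best = max(best, run)
        stats[w] = (frac, best)
    return stats

# Hypothetical snapshots: 'cheese' persists, 'bbq' appears only once
snaps = [["cheese", "salads", "bbq"], ["cheese", "salads"], ["cheese", "ingredient"]]
print(staying_power(snaps)["cheese"], staying_power(snaps)["bbq"])
```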

  27. Example Term Longevity Graphs

  28. Revisitation on the Web [Adar et al., CHI 2009] • Revisitation patterns • Log analyses • Toolbar logs for revisitation • Query logs for re-finding • User survey to understand intent in revisitation: “What was the last Web page you visited?” “Why did you visit (re-visit) the page?”

  29. Measuring Revisitation • 60–80% of the Web pages you visit, you’ve visited before • Many motivations for revisits • Summary metrics: unique visitors, visits/user, time between visits • Revisitation curves: histogram of revisit intervals (by time interval), normalized
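
A small sketch of the revisitation-curve computation described above: bucket the intervals between successive visits to a page and normalize the histogram so curves are comparable across pages with different traffic. The bin edges and visit times in the example are assumptions, not the study's.

```python
from collections import Counter

def revisitation_curve(visit_times_hours, bin_edges_hours):
    """Normalized histogram of the intervals between successive visits to
    one page. visit_times_hours must be sorted."""
    intervals = [b - a for a, b in zip(visit_times_hours, visit_times_hours[1:])]
    bins = list(zip(bin_edges_hours, bin_edges_hours[1:]))
    counts = Counter()
    for dt in intervals:
        for lo, hi in bins:
            if lo <= dt < hi:
                counts[(lo, hi)] += 1
                break
    total = sum(counts.values()) or 1
    return {b: counts[b] / total for b in bins}

# Example with assumed bin edges: <1h, 1h-1d, 1d-1wk, 1wk-6wk
print(revisitation_curve([0, 0.2, 0.5, 26, 200], [0, 1, 24, 168, 1008]))
```

The shape of the normalized curve (mass near short vs. long intervals) is what distinguishes the fast, hybrid, medium, and slow patterns on the next slide.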

  30. Four Revisitation Patterns • Fast: hub-and-spoke; navigation within site • Hybrid: high-quality fast pages • Medium: popular homepages; mail and Web applications • Slow: entry pages, bank pages; accessed via search engine

  31. Relationships Between Change and Revisitation • Interested in change: monitor; effect change / transact • Change unimportant: re-find old content; change can interfere with re-finding

  32. Revisitation and Search (Re-Finding) [Teevan et al., SIGIR 2007] [Tyler et al., WSDM 2010] [Teevan et al., WSDM 2011] • Repeat query (33%) • Q: microsoft research • Click same and different URLs • Repeat click (39%) • http://research.microsoft.com/ • Q: microsoft research; msr • Big opportunity (43%) • 24% “navigational revisits”

  33. Building Support for Web Dynamics [Diagram: content-change and user visitation/re-visitation timelines, 1996–2009, with Diff-IE and Temporal IR]

  34. Diff-IE [Teevan et al., UIST 2009] [Teevan et al., CHI 2010] • Diff-IE toolbar: highlights changes to a page since your last visit

  35. Interesting Features of Diff-IE • New to you • Always on • Non-intrusive • In situ • Try it: http://research.microsoft.com/en-us/projects/diffie/default.aspx

  36. Examples of Diff-IE in Action

  37. Expected New Content

  38. Monitor

  39. Serendipitous Encounters

  40. Unexpected Important Content

  41. Understand Page Dynamics

  42. [Diagram: reactions to expected vs. unexpected page change: expected new content, monitor, edit, attend to activity, understand page dynamics, unexpected important content, unexpected unimportant content, serendipitous encounter]

  43. Studying Diff-IE: in situ, representative, longitudinal • Feedback buttons • Survey: prior to installation; after a month of use • Logging: URLs visited; amount of change when revisited • Experience interview

  44. People Revisit More (14%) • Perception of revisitation remains constant • How often do you revisit? • How often are revisits to view new content? • Actual revisitation increases • First week: 39.4% of visits are revisits • Last week: 45.0% of visits are revisits (a 14% relative increase) • Why are people revisiting more with Diff-IE?

  45. Revisited Pages Change More [Chart callouts: 8%, 17%, 51+%] • Perception of change increases • What proportion of pages change regularly? • How often do you notice unexpected change? • Amount of change seen increases • First week: 21.5% of revisits changed, by 6.2% • Last week: 32.4% of revisits changed, by 9.5% • Diff-IE is driving visits to changed pages • It supports people in understanding change

  46. Other Examples of Dynamics and User Experience • Content changes • Diff-IE (Teevan et al., 2008) • Zoetrope (Adar et al., 2008) • Diffamation (Chevalier et al., 2010) • Temporal summaries and snippets … • Interaction changes • Explicit annotations, ratings, wikis, etc. • Implicit interest via interaction patterns • Edit wear and read wear (Hill et al., 1992)

  47. Overview • Change on the desktop and news • Desktop: Stuff I’ve Seen; Memory Landmarks; LifeBrowser • News: Analysis of novelty (e.g., NewsJunkie) • Change on the Web • Content changes over time • User interaction varies over time (queries, re-visitation, anchor text, query-click stream, “likes”) • Tools for understanding Web change (e.g., Diff-IE) • Improving Web retrieval using dynamics • Query trends over time • Retrieval models that leverage dynamics • Task evolution over time Questions?

  48. Leveraging Dynamics for Retrieval [Diagram: temporal IR over content-change and user visitation/re-visitation timelines, 1996–2009]

  49. Temporal IR • Query frequency over time • Retrieval models that incorporate time • Ranking algorithms typically look only at a single snapshot in time • But, both content and user interaction with the content change over time • Model content change on a page • Model user interactions • Tasks evolve over time
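
As one illustration of a retrieval model that incorporates time, the sketch below folds term staying power into a query-likelihood-style score, so terms that persist on a page count more than transient ones. This is a simplified, assumed formulation for illustration, not the specific ranking model developed in the work behind this talk.

```python
import math

def time_aware_score(query_terms, doc_tf, staying, collection_prob, lam=0.8):
    """Query-likelihood-style score in which each document term's count is
    scaled by its staying power on the page (0..1), then smoothed with
    collection statistics. All inputs are plain dicts."""
    weighted = {t: tf * staying.get(t, 0.0) for t, tf in doc_tf.items()}
    total = sum(weighted.values()) or 1.0
    score = 0.0
    for t in query_terms:
        p_doc = weighted.get(t, 0.0) / total
        p = lam * p_doc + (1 - lam) * collection_prob.get(t, 1e-6)
        score += math.log(p)
    return score

# Hypothetical page: 'salads' is durable content, 'bbq' appeared briefly
doc_tf = {"salads": 5, "bbq": 5}
staying = {"salads": 1.0, "bbq": 0.2}
print(time_aware_score(["salads"], doc_tf, staying, {"salads": 0.001}))
```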

  50. Query Dynamics • Queries sometimes mention time, but often don’t • Explicit time (e.g., World Cup Soccer 2011) • Explicit news (e.g., earthquake news) • Implicit time (e.g., Harry Potter reviews; implicit “now”) • Queries are not uniformly distributed over time • Often triggered by events in the world • Using temporal query patterns to: • Cluster similar queries • Identify events and find related news
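
A minimal sketch of using temporal query patterns to find related queries: normalize each query's daily count series and keep the queries whose temporal profiles track a target query's spike. The query strings, counts, and cosine threshold here are hypothetical.

```python
import math

def normalize(series):
    """Turn a raw daily query-count series into a volume-independent shape."""
    total = sum(series) or 1.0
    return [x / total for x in series]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def similar_queries(target, candidates, threshold=0.9):
    """Return candidate queries whose normalized temporal profiles are
    highly similar to the target's, e.g. spiking around the same event."""
    t = normalize(target[1])
    return [q for q, series in candidates
            if cosine(t, normalize(series)) >= threshold]

# Hypothetical daily counts around a news event
earthquake = ("earthquake", [2, 3, 2, 50, 80, 40, 10])
candidates = [("tsunami warning", [1, 1, 1, 30, 60, 25, 5]),
              ("pizza recipe",    [9, 8, 9, 10, 9, 9, 8])]
print(similar_queries(earthquake, candidates))
```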
