
Information Foraging & Information Scent: Theory, Models, and Applications


Presentation Transcript


  1. Information Foraging & Information Scent: Theory, Models, and Applications. Peter Pirolli, User Interface Research. Work supported in part by the Office of Naval Research

  2. Aim of this Talk • Overview • Information foraging theory • Information scent • Sample of psychological investigations • Sample of applications

  3. Human-Information Interaction: can be approached from the user side or the producer side. [Diagram: server, database, web pages]

  4. Overview • Motivations, origins, assumptions • Initial development: Scatter/Gather use • Extension: WIF-ACT model of WWW use • Information scent as a critical parameter of the large-scale shape of WWW use

  5. Motivations & origins • Humans are informavores (George Miller, 1983) • Organisms that hunger for information about the world and themselves • Humans seek, gather, share, and consume information in order to adapt

  6. Growth of available information. Pressures of the information environment. [Chart: number of journals (and journals/people ×10^6) vs. year, 1750–2000, log scale from 0.01 to 1,000,000. Source: Price (1963)]

  7. Growth in attention. Pressures of the information environment. [Chart: capacity of human working memory vs. year, 1750–2000, flat on the same log scale as the previous slide]

  8. Pressures of the information environment. “A wealth of information creates a poverty of attention and a need to allocate it efficiently” (Herbert A. Simon)

  9. WWW challenges HCI theory • 2003 e-commerce revenue = $1 Trillion (est.) BUT • 65% of virtual shopping trips end in failure (Souza, 2000) • 1M site visitors, 40% do not return, cost=$2.8 M (Manning, 1998) • WWW site redesigns = $1.5 M/yr to $2.8 M/yr (Manning, 1998)

  10. Information Foraging Theory • Take concept of informavores seriously • Key ideas • Information scent. Local cues used to explore and search information spaces • Economics of attention and the cost structure of information • Optimal foraging models

  11. Take concept of informavores seriously • Information processing systems evolve so as to maximize the gain of valuable information per unit cost • Sensory systems (vision, hearing) • Information access (card catalogs, offices) • Natural selection has made animals (and our human ancestors) very good at searching for food (foraging) • Modern information foragers use problem-solving abilities with deep evolutionary roots in food foraging. maximize [ information value / cost of interaction ]

  12. Time scales of analysis: psychological domain vs. user interface domain. Problem solving and decision making operate at roughly 10–1000 s, visual search and motor behavior at 1–100 s, visual attention and perceptual judgment at 0.1–1 s. [Example UI element: a search hit-list entry for Pete Pirolli's home page, www.parc.xerox.com/istl/members/pirolli/pirolli.html]

  13. Example: Scatter/Gather • Information scent • Optimal foraging analyses • ACT-IF cognitive model • Evaluation by user simulation

  14. Example: Scatter/Gather • Information scent • Optimal foraging analyses • ACT-IF cognitive model • Evaluation by user simulation

  15. Information scent: cues that facilitate orientation, navigation, and assessment of information value. [Image: signpost with directions to Tokyo, New York, San Francisco]

  16. Scatter/Gather • supports exploration/browsing of very large full-text collections (~ 1,000,000) • creates clusters of content-related documents • presents users with overviews of cluster contents • allows user to navigate through clusters and overviews • More recently extended to multi-modal Scatter/Gather (Chen et al., 1999) • Images + text
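As a concrete illustration of that browsing loop, here is a minimal sketch in Python, assuming TF-IDF vectors and k-means stand in for the original clustering algorithm (the document list, cluster counts, and top-term summaries are illustrative choices, not details from the talk):

```python
# Minimal Scatter/Gather loop: cluster, summarize, let the user gather
# clusters of interest, then re-scatter the gathered subset at a finer grain.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
import numpy as np

def scatter(documents, n_clusters=5):
    """Cluster documents; return (cluster labels, top terms per cluster)."""
    vectorizer = TfidfVectorizer(stop_words="english")
    X = vectorizer.fit_transform(documents)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    terms = np.array(vectorizer.get_feature_names_out())
    summaries = []
    for c in range(n_clusters):
        center = km.cluster_centers_[c]
        summaries.append(terms[np.argsort(center)[::-1][:5]].tolist())
    return km.labels_, summaries

def gather(documents, labels, chosen_clusters):
    """Collect the documents from the clusters the user selected."""
    return [d for d, l in zip(documents, labels) if l in chosen_clusters]

# One round of interaction (hypothetical usage):
# docs = [...]                          # full-text collection
# labels, summaries = scatter(docs)     # overviews of cluster contents
# subset = gather(docs, labels, {0, 2}) # user selects clusters 0 and 2
# labels2, summaries2 = scatter(subset, n_clusters=3)  # re-scatter the subset
```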

  17. Scatter/Gather task. [Screenshot: Scatter/Gather window and Display Titles window; example cluster labels: Law, Nat. Lang., World News, Robots, AI, Expert Sys, CS, Planning, Medicine, Bayes. Nets]

  18. Information scent • Spreading activation • Derived from models of human memory • Activation reflects likelihood of relevance given past history and current context • Approximates Bayesian network treatments. [Diagram: an information need (terms such as medical, patient, dose, procedures, beam, cell) spreads activation to text-snippet terms. Chart: observed vs. predicted rating of probability of relevance by rank 1–10]

  19. Activation of node i: A_i = B_i + Σ_j W_j · S_ji, where B_i is the base-level activation and the sum is the activation spread from linked nodes j. Base-level activation B_i = ln( Pr(i) / Pr(not i) ) reflects the likelihood of occurrence; the strength of spread along a link, S_ji = ln( Pr(j|i) / Pr(j|not i) ), reflects the likelihood of co-occurrence (example nodes: i = “bread”, j = “butter”)
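A minimal sketch of these equations in Python, with made-up probability values purely for illustration (the node names and numbers are not from the talk):

```python
import math

def base_level(p_i):
    """B_i = ln( Pr(i) / Pr(not i) ): log odds that node i occurs at all."""
    return math.log(p_i / (1.0 - p_i))

def strength(p_j_given_i, p_j_given_not_i):
    """S_ji = ln( Pr(j|i) / Pr(j|not i) ): log likelihood ratio of co-occurrence."""
    return math.log(p_j_given_i / p_j_given_not_i)

def activation(b_i, linked):
    """A_i = B_i + sum over linked source nodes j of W_j * S_ji."""
    return b_i + sum(w_j * s_ji for w_j, s_ji in linked)

# Illustrative numbers only: "bread" as node i, "butter" as a linked node j.
b_bread = base_level(0.01)               # how often "bread" occurs overall
s_butter_bread = strength(0.30, 0.005)   # "butter" is far likelier given "bread"
print(activation(b_bread, [(1.0, s_butter_bread)]))
```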

  20. Spreading activation networks (for modeling “scent”): document corpus -> word statistics -> spreading activation network
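One plausible way to carry out that corpus-to-network step is to estimate the base-level and link strengths from document co-occurrence counts; the tokenization and smoothing below are assumptions for the sketch, not the original system's statistics:

```python
import math
from collections import Counter
from itertools import combinations

def build_scent_network(docs):
    """Estimate base-level log odds and pairwise association strengths
    from which documents each word appears in."""
    n = len(docs)
    doc_words = [set(d.lower().split()) for d in docs]
    df = Counter(w for ws in doc_words for w in ws)                 # document frequency
    co = Counter(p for ws in doc_words for p in combinations(sorted(ws), 2))

    def smooth(count, total):
        return (count + 0.5) / (total + 1.0)                        # avoid log(0)

    base = {w: math.log(smooth(c, n) / (1 - smooth(c, n))) for w, c in df.items()}

    strength = {}
    for (a, b), c_ab in co.items():
        # S(b <- a) = ln( Pr(b|a) / Pr(b|not a) ), estimated from counts
        strength[(a, b)] = math.log(smooth(c_ab, df[a]) /
                                    smooth(df[b] - c_ab, n - df[a]))
        # and the symmetric direction
        strength[(b, a)] = math.log(smooth(c_ab, df[b]) /
                                    smooth(df[a] - c_ab, n - df[b]))
    return base, strength

# Hypothetical usage:
# base, strength = build_scent_network(["patient dose beam", "patient procedures", ...])
```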

  21. The interface provides good scent of the underlying document clustering. [Chart: cluster relevance as perceived by the model vs. as identified by the computer]

  22. Summary: Information Scent • Spreading activation predicts user judgments • Networks built a priori. Only need to estimate one scaling parameter from user data • Can be used to assess “goodness of links”

  23. Example: Scatter/Gather • Information scent • Optimal foraging analyses • ACT-IF cognitive model • Evaluation by user simulation

  24. Cost/value estimates • TREC: queries and expert-identified relevant documents • Analysis of clustering algorithm: distribution of relevant information over clusters • Time costs

  25. Foraging evaluations: choose cluster, enrich, exploit. R_SG = (number of relevant documents in a cluster) / (time to process the cluster); the number of relevant documents is estimated from the activation a of the cluster text, and the processing time from the times t1 and t2 to process relevant documents vs. all N documents. R_D = (total relevant documents) / (task time)

  26. Cluster selection (optimal diet model). Profitability of a cluster p = (number of relevant documents in the cluster) / (time to process the cluster); the optimum overall rate R = (total relevant documents) / (total time). Choose clusters in descending rank of profitability p as long as p > R. [Chart: relevant documents/second by cluster rank 1–10]
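A sketch of the diet rule as stated: rank clusters by profitability (relevant documents per second) and keep adding clusters while each one's profitability exceeds the overall rate of the diet chosen so far. The cluster names and numbers are invented:

```python
def choose_clusters(clusters):
    """clusters: list of (name, relevant_docs, processing_time_sec).
    Returns the chosen subset and its overall rate R = total relevant / total time."""
    ranked = sorted(clusters, key=lambda c: c[1] / c[2], reverse=True)  # by profitability
    chosen, total_docs, total_time = [], 0.0, 0.0
    for name, docs, time in ranked:
        profitability = docs / time
        rate_so_far = total_docs / total_time if total_time else 0.0
        if profitability > rate_so_far:       # diet rule: add while p > R
            chosen.append(name)
            total_docs += docs
            total_time += time
        else:
            break                             # lower-ranked clusters would only dilute R
    return chosen, (total_docs / total_time if total_time else 0.0)

# Invented numbers: (cluster, expected relevant documents, seconds to process)
print(choose_clusters([("medicine", 12, 60), ("AI", 6, 60), ("law", 1, 60)]))
```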

  27. Enrichment vs. exploitation. Rate of gain R = (relevant documents) / (time cost). R*_D is the rate if the user chooses to display clusters now; R*_SG is the rate if the user chooses to display later (after more Scatter/Gather). Enrich while R*_SG > R*_D; exploit once R*_D > R*_SG. [Chart: rate of gain (roughly 0–0.06 relevant documents/second) vs. time, 0–1000 sec]
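A sketch of that enrichment-versus-exploitation choice: estimate the rate of gain from displaying titles now (R*_D) and from doing more Scatter/Gather first (R*_SG), and take the larger. The counts and times below are invented for illustration:

```python
def rate(relevant_docs, time_cost_sec):
    """Rate of gain R = relevant documents / time cost."""
    return relevant_docs / time_cost_sec

def choose_action(docs_if_display_now, time_if_display_now,
                  docs_if_more_sg, time_if_more_sg):
    """Enrich (more Scatter/Gather) when R*_SG exceeds R*_D, else exploit."""
    r_display = rate(docs_if_display_now, time_if_display_now)
    r_sg = rate(docs_if_more_sg, time_if_more_sg)
    return "enrich: Scatter/Gather again" if r_sg > r_display else "exploit: display titles now"

# Invented numbers: early in a session the better clusters gained later
# outweigh the extra time spent, so enrichment wins.
print(choose_action(docs_if_display_now=3, time_if_display_now=120,
                    docs_if_more_sg=10, time_if_more_sg=300))
```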

  28. Example: Scatter/Gather • Information scent • Optimal foraging analyses • ACT-IF cognitive model • Evaluation by user simulation

  29. Model-Tracing Method. [Diagram: Users, System, Cognitive Model, Trace; the cognitive model draws on psychology and optimal foraging theory, and its trace is compared with user behavior]

  30. ACT-IF production system. [Diagram: Declarative Memory and Procedural Memory (Condition -> Action rules, selected by foraging evaluation heuristics) linked to Perceptual Input and Motor Output]

  31. Production rule evaluations • SELECT-RELEVANT-CLUSTER (evaluated by scent p): IF the goal is to process the Scatter/Gather window & there is a query & there is an unselected cluster, THEN select the cluster • DO-SCATTER/GATHER (evaluated by R_SG): IF the goal is to process the Scatter/Gather window & there is a query & some clusters have been selected, THEN Scatter/Gather the window • DO-DISPLAY-TITLES (evaluated by R_D): IF the goal is to process the Scatter/Gather window & there is a query & some clusters have been selected, THEN display the titles
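A minimal sketch of conflict resolution in this spirit: each rule whose condition matches the current state is scored by its foraging evaluation (scent p, R_SG, or R_D), and the highest-scoring rule fires. The state fields and evaluation numbers are hypothetical, not ACT-IF's actual representation:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]    # does the rule match the current goal/state?
    evaluation: Callable[[dict], float]  # foraging evaluation (p, R_SG, or R_D)
    action: str

RULES = [
    Rule("SELECT-RELEVANT-CLUSTER",
         lambda s: s["goal"] == "process-sg-window" and s["query"] and s["unselected_clusters"],
         lambda s: max(s["cluster_scent"]),          # scent p of the best unselected cluster
         "select the cluster"),
    Rule("DO-SCATTER/GATHER",
         lambda s: s["goal"] == "process-sg-window" and s["query"] and s["selected_clusters"],
         lambda s: s["rate_sg"],                     # estimated R_SG
         "scatter/gather the window"),
    Rule("DO-DISPLAY-TITLES",
         lambda s: s["goal"] == "process-sg-window" and s["query"] and s["selected_clusters"],
         lambda s: s["rate_d"],                      # estimated R_D
         "display the titles"),
]

def step(state):
    """Fire the matching rule with the highest foraging evaluation."""
    matching = [r for r in RULES if r.condition(state)]
    best = max(matching, key=lambda r: r.evaluation(state))
    return best.name, best.action

state = {"goal": "process-sg-window", "query": "medical treatments",
         "unselected_clusters": ["medicine", "law"], "selected_clusters": ["medicine"],
         "cluster_scent": [0.7, 0.1], "rate_sg": 0.03, "rate_d": 0.05}
print(step(state))
```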

  32. Model predicts user action. [Histogram: frequency (0–250) of observed user actions by rank of the predicted production, ranks 1–10 and more]

  33. Example: Scatter/Gather • Information scent • Optimal foraging analyses • ACT-IF cognitive model • Evaluation by user simulation

  34. Evaluation by user simulation. [Charts: percent change from baseline (relevant documents gained) for Faster Interaction and Improved Clustering, as a function of how many repository results are relevant to the task (few vs. many) and the task deadline condition (soft vs. hard)]

  35. Summary: Scatter/Gather • ACT-IF model matches user behavior • (most of) Model specified a priori • People optimize value/cost using foraging heuristics

  36. Overview • Motivations, origins, assumptions • Initial development: Scatter/Gather use • Extension: WIF-ACT model of WWW use • Information scent as a critical parameter of the large-scale shape of WWW use

  37. WIF-ACT • Web Information Foraging - ACT • Not a reality yet • Preliminary version interacts with Internet Explorer • What we have done: • Specialized instrumentation • Methodology • Preliminary analysis of information foraging and information scent

  38. Instrumentation. [Diagram: WebLogger records an event log and cached pages; an eye tracker records points of regard; WebEyeMapper maps points of regard onto interface objects to produce a fixation table; results feed a database & statistics and visualizations]

  39. WebLogger Event File

  40. Study • 6 “Find information” tasks, e.g., • “You are Chair of Comedic events for Louisiana State University in Baton Rouge. Your computer has crashed and you have lost several advertisements for upcoming events. You know that the Second City tour is coming to your theatre in the spring, but you do not know the precise date. Find the date the comedy troupe is playing on your campus. Also find a photograph of the group to put on the advertisement.” • 12 Stanford University students • 2 tasks (CITY, ANTZ) analyzed for 4 participants

  41. Analysis • Task/Information environment • Information patch structure • Problem space structure • Information scent

  42. Information structure • Web sites: portals, search engines • Pages: website home page, search engine page, hitlist page • Content elements. [Example sites: Yahoo, Movie Posters Archive, 123 Posters]

  43. Problem space structure: operators include • URL • Link • Keyword • Visual Search. [Diagram: example states and operators for the Antz task across www.antzthemovie.com, www.google.com, www.antz.com, www.antz.com/antzstore, and 123 Posters]

  44. Web Behavior Graph

  45. Web Behavior Graph. [Annotated detail: each node is a state in the problem space, e.g., a hit list]

  46. Web Behavior Graph. [Annotated detail: edges mark execution of an operator and returns to a previous state]
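One way to record a Web Behavior Graph as data, assuming each node is a problem-space state (a visited page) and each edge records the operator executed, with returns to earlier states flagged. The field names and the example trace are illustrative, since the WBG on these slides is a visual notation rather than a data format:

```python
from dataclasses import dataclass, field

@dataclass
class WebBehaviorGraph:
    """Record states (visited pages) and operator executions, flagging
    returns to previously visited states."""
    states: list = field(default_factory=list)   # pages in order of first visit
    edges: list = field(default_factory=list)    # (from_page, operator, to_page, is_return)

    def visit(self, from_page, operator, to_page):
        if from_page not in self.states:
            self.states.append(from_page)
        is_return = to_page in self.states        # backtracking to an earlier state
        if not is_return:
            self.states.append(to_page)
        self.edges.append((from_page, operator, to_page, is_return))

# Hypothetical trace loosely modeled on the Antz task: search, follow a link, back out.
wbg = WebBehaviorGraph()
wbg.visit("www.google.com", "keyword-search", "hitlist")
wbg.visit("hitlist", "click-link", "www.antzthemovie.com")
wbg.visit("www.antzthemovie.com", "back", "hitlist")      # return to a previous state
for edge in wbg.edges:
    print(edge)
```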
