
Tagging done by YOU. Milan Vojnović, MSRC, Systems and Networking.



Presentation Transcript


  1. Tagging done by YOU Milan Vojnović MSRC, Systems and Networking

  2. Thanks • TagBooster project: Dinan Gunawardena, James Cruise (U Cambridge), Peter Marbach (U Toronto), Fabian Suchanek (MPI) • Product groups: O14, SharePoint, Communities, Tagspace, Officelabs, MSR Tagging Summit • TagBooster User Study: Nick Duffield, John Mulgrew, Andy Slowey • This talk: Abi, Alex, Chris, Peter, Stephen

  3. Social tagging in web2.0

  4. Why tag?

  5. Tagging: what and why • Tag suggestions • Conclusion

  6. In this talk, we’ll find relations among the following (diagram: objects x, y and functions f, g)

  7. Discover, filter, share

  8. Faceted browsing (example: query “bbc” → BBC radio, BBC news, BBC shop; refined with “palin” → Michael Palin)

  9. Tagging vs. traditional classification • Traditional classification: pre-defined vocabulary, structured, done by authors/librarians, non-trivial task • Social tagging: use any words, no structure, done by anyone, easy

  10. Systems with controlled vocabulary

  11. Social tagging challenges • Vocabulary evolution • Filtering tags, tag suggestions, tagging metaphors • Uncontrolled vocabulary: scalable, mitigate vocabulary problem, but tag noise • User interface design • Tagcloud, tag clustering • Cold start • Lack of prior knowledge about tags for an object • Participation incentive • Scale • More tagging events, easier filtering • Making use of tags • Related tags for navigation, expertise tracking, tag meta-data for search, scoped rankings of items, faceted browsing

  12. TagBooster User Study • Sept–Oct 2007 • 4000+ participants • Tagging web pages • Questionnaire

  13. Tagging done by YOU

  14. Analogous to voting — tag selections act as votes, with feedback (example tags: music, soul, london, jazz, black artist, british, singer)

  15. Why suggest tags? • Positive: less effort (cognitive, typing); encourages users to tag (cold start); conformance in vocabulary • Negative: hides users’ true preference over tags (“I picked a suggested tag that now I can’t remember”); users tend to overuse the same tags (“exploit” vs. “explore”)

  16. Top Popular: classical suggestion method. Selection counts: music 174, radio 110, internet radio 96, online radio 77, last.fm 49, online music 40, fm 34, streaming music 33, streaming 31, last fm 28, web radio 22, scrobbling 19, lastfm 18, listen 12, new music 12, mp3 10, stream 10, streaming radio 9. Suggested tags: radio, music, online radio, internet radio
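The Top Popular method simply suggests the most frequently selected tags so far. A minimal sketch (function and variable names are ours, not from the talk):

```python
from collections import Counter

def top_popular(selections, k=4):
    """Suggest the k most frequently selected tags (Top Popular)."""
    counts = Counter(selections)
    return [tag for tag, _ in counts.most_common(k)]

# Toy data echoing the slide's top counts:
selections = (["music"] * 174 + ["radio"] * 110 +
              ["internet radio"] * 96 + ["online radio"] * 77)
print(top_popular(selections))
# ['music', 'radio', 'internet radio', 'online radio']
```

As the slides go on to show, always suggesting the current top tags feeds back into what users select, which is why the randomised rules below are considered.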

  17. Users’ generation of tags (diagram: each selection is drawn either from the set of all tags or from the suggested tags; example tags: music, singer, soul, rehab, jazz, black artist, british, London)

  18. Simple user model: with probability p the user imitates, selecting a tag from the suggestion set; with probability 1 − p the user does not imitate, selecting tag i from the set of all tags with probability proportional to its rank score r_i
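The two-branch model above can be sketched as a single selection step. This is our reading of the slide; in particular, the uniform choice within the suggestion set is an assumption:

```python
import random

def select_tag(rank_scores, suggested, p):
    """One tag selection under the simple user model.

    With probability p the user imitates, picking (here: uniformly,
    an assumption) from the suggestion set; with probability 1 - p
    the user picks tag i from the full vocabulary with probability
    proportional to its rank score r_i.
    """
    if random.random() < p:
        return random.choice(sorted(suggested))        # imitation
    tags = list(rank_scores)
    weights = [rank_scores[t] for t in tags]
    return random.choices(tags, weights=weights)[0]    # non-imitation
```

With p = 1 every selection comes from the suggestion set; with p = 0 the suggestion set is ignored and selections reflect the true rank scores.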

  19. Users’ tag selection affected by tag suggestions (figure: the frequency of selecting the tag “apollo” is markedly higher conditional on the tag being suggested than unconditionally)

  20. The imitation rate • Portion of tag selections not in the suggestion set S • Boes’ estimate: portion of tag selections not in S when suggestions are not made • Estimates: 0.32, 0.34, 0.31, 0.4
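One way to turn the two measured portions into an estimate of p (our reading of the slide, not a formula stated in it): selections outside S can only come from non-imitation steps, so the portion outside S with suggestions shown is roughly (1 − p) times the baseline portion without suggestions.

```python
def imitation_rate(frac_out_with_suggestions, frac_out_without):
    """Estimate the imitation probability p.

    Assumes (our reading): selections outside the suggestion set S
    occur only on non-imitation steps, so
        frac_out_with_suggestions ≈ (1 - p) * frac_out_without,
    hence p ≈ 1 - frac_out_with_suggestions / frac_out_without.
    """
    return 1.0 - frac_out_with_suggestions / frac_out_without

print(imitation_rate(0.3, 0.6))  # 0.5
```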

  21. Move-to-Set: simple randomised rule. Selection counts: music 174, radio 110, internet radio 96, online radio 77, last.fm 49, online music 40, fm 34, streaming music 33, streaming 31, last fm 28, web radio 22, scrobbling 19, lastfm 18, listen 12, new music 12, mp3 10, stream 10, streaming radio 9. Suggested tags: last.fm, music, online radio, web radio
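A minimal sketch of the Move-to-Set update, under our assumption that a newly selected tag evicts a uniformly random member of the suggestion set:

```python
import random

def move_to_set(suggested, selected_tag):
    """Move-to-Set: if the selected tag is not currently suggested,
    it replaces a uniformly random tag in the suggestion set
    (uniform eviction is our assumption); otherwise the set is
    left unchanged."""
    suggested = list(suggested)
    if selected_tag not in suggested:
        suggested[random.randrange(len(suggested))] = selected_tag
    return suggested
```

So a tag already in the set keeps its slot, while a selection from outside the set randomly perturbs it, which is what prevents the early leaders from being locked in.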

  22. Correctness of popularity order: under the user model, for any imitation probability p < 1, the long-run frequency of tag selections induces the true popularity ranking (a sufficient condition is given on the slide)

  23. Simple update rule • Converges to sampling the suggestion set proportional to the product of true rank scores (example evolution of the set: last.fm, music, online radio, web radio → last.fm, music, radio, web radio) • Same as “show most recent item” for suggestion set size 1

  24. Analogous to exclusion process (diagram: tags i and j with rate r_j)

  25. Frequency Move-to-Set. Rank counters per tag: music 174, radio 110, internet radio 96, online radio 77, last.fm 49, online music 40, fm 34, streaming music 33, streaming 31, last fm 28, web radio 22, scrobbling 19, lastfm 18, listen 12, new music 12, mp3 10, stream 10, streaming radio 9. Suggested tags: radio, music, online radio, internet radio. If “radio” is selected, Rank(radio) remains unchanged (“radio” suggested); if “last.fm” is selected, Rank(last.fm)++ (“last.fm” NOT suggested)
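The counter update on this slide can be sketched as follows; suggesting the top-k counters afterwards is our reading of the slides, not something stated explicitly:

```python
def fmts_update(rank, suggested, selected_tag, k=4):
    """Frequency Move-to-Set: a selected tag's counter increases only
    when the tag was NOT in the suggestion set; counters of suggested
    tags stay unchanged. Returning the k tags with the highest counters
    as the next suggestion set is our assumption."""
    if selected_tag not in suggested:
        rank[selected_tag] = rank.get(selected_tag, 0) + 1
    top = sorted(rank, key=rank.get, reverse=True)[:k]
    return rank, top
```

Discounting selections of already-suggested tags removes the self-reinforcement that plain Top Popular suffers from: a tag only gains rank through selections the suggestion box cannot have prompted.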

  26. Only sufficiently popular tags eventually suggested (slide defines: suggestion set size; harmonic mean of r_1, ..., r_|C|; frequency of suggesting tag i; and the competing set — tag i is in the competing set iff a threshold condition on r_i, given on the slide, holds)

  27. Suggestion methods in action (figures: frequency of tag suggestion and normalised frequency of tag selection vs. tag rank i, for NONE, TOP, MTS and FMTS)

  28. Suggestion methods in action (cont’d) (figures comparing NONE, TOP, MTS and FMTS)

  29. How did users appreciate the suggested tags?

  30. Why did I select these tags? (questionnaire example — Tags: gadgets, technology, engadget, blog; options: (1) “I thought these are keywords that I would likely use later to find this item”, (2) “I thought these are categories that best describe the object”, or “else”)

  31. Why did I select these tags? (cont’d) • Frequent answer terms: find, search, describe, categorise, identify, remember, organise, classify • YOU describing the item — cf. the Wikipedia definition of a tag (meta-data) • Keyword-based classification and search

  32. Why did I select these tags? (cont’d) Semantic analysis of tags, search and content keywords • May 2007 popular Web searches + delicious tags • Tags similar to categories • Small overlap with search keywords

  33. Summary • Social tagging poses interesting research challenges • Space for innovation • A mix of control theory, user behaviour, information retrieval and interface design • Aim at the best design of tagging systems to support particular user tasks

  34. Sample of research challenges • Rate of convergence • Asymptotically accurate algorithms • User model? • Tag to attract • Select from the list only (e.g. remote controller / mobile device) — Faces project • Ranking across multiple lists • Beyond popularity ranking: what does it mean that a tag is relevant? (ongoing work) • Make suggestions to improve users’ tasks (e.g. search, faceted browsing)?

  35. Familiarity with tagging: tagging is still used infrequently by many
