
Recommender Systems


Presentation Transcript


  1. Recommender Systems Robin Burke DePaul University Chicago, IL

  2. About myself • PhD 1993 Northwestern University • Intelligent Multimedia Retrieval • 1993-1998 • Post-doc at University of Chicago • Kristian Hammond • Helped found Recommender, Inc. • became Verb, Inc. • 1998-2000 • Dir. of Software Development • Adjunct at University of California, Irvine • 2000-2002 • California State University, Fullerton • 2002-present • DePaul University

  3. My Interests • Memory • How do we remember the right thing at the right time? • Why is it that computers are so bad at this? • How does knowledge of different types shape the activity of memory?

  4. Organization • 3 days • 21 hours • Not me talking all the time! • Partners • For in-class activities • For coding labs • For labs • Must be one laptop per pair • Using Eclipse / Java

  5. Activity 1 • With your partner • One person should recommend a movie or DVD to the other • asking questions as necessary • in the end, you should be confident that they are right • No right or wrong way to do this! • Take note • the questions you ask • the reasons for the recommendation

  6. Discussion • Recommender • What did you have to ask? • How did you use this information? • Recommendee • What made you sure the recommendation was good?

  7. Example: Amazon.com

  8. Product similarity

  9. Market-basket analysis

  10. Profitability analysis

  11. Sequential pattern mining

  12. Application: Recommender.com

  13. Similar movies

  14. Applying a critique

  15. New results

  16. Knowledge employed • Similarity metric • what makes something "alike"? • # of features in common is not sufficient • Movies • genres of movies • types of actors • directorial styles • meaning of ratings • NR could mean adult, but it could just be a foreign movie
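
Since the labs use Java, here is a minimal sketch of a similarity metric that goes beyond counting features in common, by weighting genre overlap more heavily than shared actors. The Movie class, the Jaccard measure, and the 0.7 / 0.3 weights are illustrative assumptions, not the metric used in the examples above.

    // Sketch: weighted feature similarity between two movies.
    import java.util.HashSet;
    import java.util.Set;

    class Movie {
        Set<String> genres = new HashSet<>();
        Set<String> actors = new HashSet<>();
        String mpaaRating;  // e.g. "PG-13", "NR" -- note NR is ambiguous, as above
    }

    class MovieSimilarity {
        // Jaccard overlap of two feature sets: |A ∩ B| / |A ∪ B|
        static double jaccard(Set<String> a, Set<String> b) {
            if (a.isEmpty() && b.isEmpty()) return 0.0;
            Set<String> inter = new HashSet<>(a);
            inter.retainAll(b);
            Set<String> union = new HashSet<>(a);
            union.addAll(b);
            return (double) inter.size() / union.size();
        }

        // Genre overlap counts more than shared actors (assumed weights).
        static double similarity(Movie x, Movie y) {
            return 0.7 * jaccard(x.genres, y.genres)
                 + 0.3 * jaccard(x.actors, y.actors);
        }
    }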

  17. This class Tuesday • 8:00 – 10:30 • 10:45 – 13:00 • 15:00 – 18:00 Wednesday • 8:00 – 10:00 • 10:15 – 13:00 • 17:00 – 19:00 Thursday • 8:00 – 11:00 • 14:30 – 16:00 • 18:00 – 20:00

  18. Roadmap • Session A: Basic Techniques I • Introduction • Knowledge Sources • Recommendation Types • Collaborative Recommendation • Session B: Basic Techniques II • Content-based Recommendation • Knowledge-based Recommendation • Session C: Domains and Implementation I • Recommendation domains • Example Implementation • Lab I • Session D: Evaluation I • Evaluation • Session E: Applications • User Interaction • Web Personalization • Session F: Implementation II • Lab II • Session G: Hybrid Recommendation • Session H: Robustness • Session I: Advanced Topics • Dynamics • Beyond accuracy

  19. Recommender Systems • Wikipedia: • Recommendation systems are programs which attempt to predict items (movies, music, books, news, web pages) that a user may be interested in, given some information about the user's profile. • My definition • Any system that guides the user in a personalized way to interesting or useful objects in a large space of possible options or that produces such objects as output.

  20. Historical note • Used to be a more restrictive definition • “people provide recommendations as inputs, which the system then aggregates and directs to appropriate recipients” (Resnick & Varian 1997)

  21. Aspects of the definition • basis for recommendation • personalization • process of recommendation • interactivity • results of recommendation • interest / useful objects

  22. Personalization • Any system that guides the user in a personalized way to interesting or useful objects in a large space of possible options or that produces such objects as output. • Definitions agree that recommendations are personalized • Some might say that suggesting a best-seller to everyone is a form of recommendation • Meaning • the process is guided by some user-specific information • could be a long-term model • could be a query

  23. Interactivity • Any system that guides the user in a personalized way to interesting or useful objects in a large space of possible options or that produces such objects as output. • Many possible interaction styles • query / retrieve • recommendation list • predicted rating • dialog

  24. Results • Any system that guides the user in a personalized way to interesting or useful objects in a large space of possible options or that produces such objects as output. • Recommendation = Search? • Search • a query matching process • given a query • return all items that match it • Recommendation • a need satisfaction process • given a need • return items that are likely to satisfy it

  25. Some definitions • Recommendation • Items • Domain • Users • Ratings • Profile

  26. Recommendation • A prediction of a given user's likely preference regarding an item • Issues • Negative prediction • Presentation / Interface • Notation • Pred(u,i)
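
One way to read the Pred(u,i) notation as code, sketched in Java for the labs; the interface name and the rating scale are assumptions:

    // Pred(u,i): the system's predicted preference of user u for item i.
    interface Recommender {
        // Returns a prediction on the same scale as the ratings, e.g. 1-5 stars.
        double pred(String userId, String itemId);
    }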

  27. Items • The things being recommended • can be products • can be documents • Assumption • Discrete items are being recommended • Not, for example, contract terms • Issues • Cost • Frequency of purchase • Customizability • Configurations • Notation • I = set of all items • i = an individual item

  28. Recommendation Domain • What is being recommended? • a $0.99 music track? • a $1.9 M luxury condo? • Much depends on the characteristics of the domain • cost • how costly is a false positive? • how costly is a false negative? • portfolio • OK to recommend something that the user has already seen? • compatibility with owned items? • individual vs group • are we recommending something for individual or group consumption? • single item vs configuration • are we recommending a single item or a configuration of items? • what are the constraints that tie configurations together? • constraints • what types of constraints are users likely to impose (hard vs soft)?

  29. Example 1 • Music track (ala iTunes) • low cost • individual • configuration • fit into existing playlist? • portfolio • should not be already owned • constraints • likely to be soft

  30. Example 2 • Course advising • high cost • individual • configuration • must fit with other courses • prerequisites • portfolio • should not have already been taken • constraints • may be hard • graduation requirements • time and day

  31. Example 3 • DVD rental • low cost • group consumption • no configuration issues • portfolio • possible to recommend a favorite title again • Christmas movies • constraints • likely to be soft • some could be hard like maximum allowed rating

  32. Users • People who need / want items • Assumption • (Usually) repeat users • Issues • Portfolio effects • Notation • U = set of all users • u = a particular user

  33. Ratings • A (numeric) score given by a user to a particular item representing the user's preference for that item. • Assumption • Preferences are static (or at least of long duration) • Issues • Multi-dimensional ratings • Context-dependencies • Notation • r_{u,i} = a rating of item i by user u • R_{U,i} = R_i = the ratings of item i by all users
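
A possible in-memory representation of r_{u,i} and R_i, sketched in Java for the labs; the class and method names are assumptions:

    // Sketch: ratings stored as userId -> (itemId -> rating).
    import java.util.HashMap;
    import java.util.Map;

    class RatingStore {
        private final Map<String, Map<String, Double>> ratings = new HashMap<>();

        void addRating(String user, String item, double value) {
            ratings.computeIfAbsent(user, u -> new HashMap<>()).put(item, value);
        }

        // r_{u,i}: null if the user has not rated the item
        Double getRating(String user, String item) {
            Map<String, Double> userRatings = ratings.get(user);
            return userRatings == null ? null : userRatings.get(item);
        }

        // R_i: the ratings of item i by all users who rated it
        Map<String, Double> getItemRatings(String item) {
            Map<String, Double> result = new HashMap<>();
            ratings.forEach((user, ur) -> {
                Double r = ur.get(item);
                if (r != null) result.put(user, r);
            });
            return result;
        }
    }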

  34. Explicit vs Implicit Ratings • An explicit rating is one that has been provided by a user • via a user interface • An implicit rating is inferred from user behavior • for example, as recorded in web log data • Issues • effort threshold • noise
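
A rough sketch of how an implicit rating might be inferred from logged behavior; the event types and weights are invented for illustration and would need tuning (and noise handling) in practice:

    // Sketch: map a user's logged events for one item onto a 0..5 scale.
    import java.util.List;

    class ImplicitRating {
        enum Event { PAGE_VIEW, ADD_TO_CART, PURCHASE }

        static double infer(List<Event> eventsForItem) {
            double score = 0.0;
            for (Event e : eventsForItem) {
                switch (e) {
                    case PAGE_VIEW:   score += 0.5; break;
                    case ADD_TO_CART: score += 1.5; break;
                    case PURCHASE:    score += 3.0; break;
                }
            }
            return Math.min(5.0, score);
        }
    }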

  35. Collecting Explicit Ratings

  36. Profile • A user profile is everything that the system knows about a particular user • Issues • profile dimensionality • Notation • P = all profiles • P_u = the profile of user u

  37. Knowledge Sources • An AI system requires knowledge • Takes various forms • raw data • algorithm • heuristics • ontology • rule base

  38. In Recommendation • Social knowledge • User knowledge • Content knowledge

  39. Knowledge source: Collaborative • A collaborative knowledge source is one that holds information about peer users in a system • Examples • ratings of items • age, sex, income of other users

  40. Knowledge source: User • A user knowledge source is one that holds information about the current user • the one who needs a recommendation • Example • a query the user has entered • a model of the user's preferences

  41. Knowledge source: Content • A content knowledge source holds information about the items being recommended • Example • knowledge about how items satisfy user needs • knowledge about the attributes of items

  42. Recommendation Knowledge Sources Taxonomy • Recommendation Knowledge • Collaborative • opinion profiles • demographic profiles • User • opinions • query • demographics • constraints • requirements • preferences • Content • item features • context • domain knowledge • means-ends knowledge • feature ontology • contextual knowledge • domain constraints

  43. Break

  44. Roadmap • Session A: Basic Techniques I • Introduction • Knowledge Sources • Recommendation Types • Collaborative Recommendation • Session B: Basic Techniques II • Content-based Recommendation • Knowledge-based Recommendation • Session C: Domains and Implementation I • Recommendation domains • Example Implementation • Lab I • Session D: Evaluation I • Evaluation • Session E: Applications • User Interaction • Web Personalization • Session F: Implementation II • Lab II • Session G: Hybrid Recommendation • Session H: Robustness • Session I: Advanced Topics • Dynamics • Beyond accuracy

  45. Recommendation Types • Default (non-personalized) • “Would you like fries with that?” • Collaborative • “Most people who bought hamburgers also bought fries.” • Demographic • “Most 45-year-old computer scientists buy fries.” • Content-based • “You usually buy fries with your burgers.” • Knowledge-based • “A large order of curly fries would really complement the flavor of a Western Bacon Cheeseburger.”

  46. Collaborative • Key knowledge source • opinion database • Process • given a target user, find similar peer users • extrapolate from peer user ratings to the target user
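
The collaborative process above can be sketched in Java (as used in the labs). This is a simplified user-based scheme: cosine similarity over co-rated items to find peers, then a similarity-weighted average for Pred(u,i). The data structures and the choice of similarity measure are assumptions, not a reference implementation.

    // Sketch: user-based collaborative recommendation.
    import java.util.Map;

    class CollaborativeRecommender {
        // profiles: userId -> (itemId -> rating), i.e. the opinion database
        private final Map<String, Map<String, Double>> profiles;

        CollaborativeRecommender(Map<String, Map<String, Double>> profiles) {
            this.profiles = profiles;
        }

        // Cosine similarity over the items both users have rated.
        private static double similarity(Map<String, Double> a, Map<String, Double> b) {
            double dot = 0, normA = 0, normB = 0;
            for (Map.Entry<String, Double> e : a.entrySet()) {
                Double rb = b.get(e.getKey());
                if (rb == null) continue;
                dot += e.getValue() * rb;
                normA += e.getValue() * e.getValue();
                normB += rb * rb;
            }
            return (normA == 0 || normB == 0) ? 0 : dot / Math.sqrt(normA * normB);
        }

        // Pred(u,i): similarity-weighted average of the peers' ratings of item i.
        double pred(String user, String item) {
            Map<String, Double> target = profiles.get(user);
            if (target == null) return 0;
            double weighted = 0, weights = 0;
            for (Map.Entry<String, Map<String, Double>> peer : profiles.entrySet()) {
                if (peer.getKey().equals(user)) continue;
                Double r = peer.getValue().get(item);
                if (r == null) continue;
                double sim = similarity(target, peer.getValue());
                if (sim <= 0) continue;
                weighted += sim * r;
                weights += sim;
            }
            return weights == 0 ? 0 : weighted / weights;
        }
    }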

  47. Demographic • Key knowledge sources • Demographic profiles • Opinion profiles • Process • for target user, find users of similar demographic • extrapolate from similar users to target user
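
A correspondingly simple sketch of the demographic process: match peers on exact demographic attributes, then average their ratings of the item. The attributes, their encoding, and the matching rule are assumptions.

    // Sketch: predict by averaging ratings from demographically similar peers.
    import java.util.List;
    import java.util.Map;

    class DemographicRecommender {
        static class User {
            String id;
            int ageBand;        // e.g. 40-49 encoded as 4 (assumed encoding)
            String occupation;  // e.g. "computer scientist"
            User(String id, int ageBand, String occupation) {
                this.id = id; this.ageBand = ageBand; this.occupation = occupation;
            }
        }

        // ratings: userId -> (itemId -> rating)
        static double pred(User target, String item, List<User> peers,
                           Map<String, Map<String, Double>> ratings) {
            double sum = 0;
            int count = 0;
            for (User p : peers) {
                boolean sameDemographic = p.ageBand == target.ageBand
                        && p.occupation.equals(target.occupation);
                if (!sameDemographic || p.id.equals(target.id)) continue;
                Double r = ratings.getOrDefault(p.id, Map.of()).get(item);
                if (r != null) { sum += r; count++; }
            }
            return count == 0 ? 0 : sum / count;
        }
    }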

  48. Content-based • Key knowledge sources • User’s opinion • Item features • Process • learn a function that maps from item features to user’s opinion • apply this function to new items
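
A minimal sketch of the content-based process: here the learned "function" is simply the user's average rating per item feature, which is one of the simplest possible choices and is assumed for illustration only.

    // Sketch: learn a per-user profile from rated items' features, then
    // score unseen items by the profile weights of their features.
    import java.util.HashMap;
    import java.util.Map;
    import java.util.Set;

    class ContentBasedRecommender {
        // feature -> {sum of ratings, count}, learned from the user's opinions
        private final Map<String, double[]> featureStats = new HashMap<>();

        // Train on one rated item: distribute the rating over its features.
        void learn(Set<String> itemFeatures, double rating) {
            for (String f : itemFeatures) {
                double[] stats = featureStats.computeIfAbsent(f, k -> new double[2]);
                stats[0] += rating;
                stats[1] += 1;
            }
        }

        // Apply the learned function to a new item.
        double pred(Set<String> itemFeatures) {
            double sum = 0;
            int known = 0;
            for (String f : itemFeatures) {
                double[] stats = featureStats.get(f);
                if (stats == null) continue;
                sum += stats[0] / stats[1];
                known++;
            }
            return known == 0 ? 0 : sum / known;
        }
    }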

  49. Knowledge-based • Key knowledge source • Domain knowledge • Process • determine user’s requirements • apply domain knowledge to determine best item
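
A sketch of the knowledge-based process as constraint filtering plus preference scoring: hard requirements must hold, soft preferences rank the survivors. The item representation and the one-point-per-preference scoring rule are assumptions.

    // Sketch: filter the catalog by hard constraints, rank by soft preferences.
    import java.util.Comparator;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.Optional;
    import java.util.function.Predicate;

    class KnowledgeBasedRecommender {
        static class Item {
            String name;
            // attributes the domain knowledge (predicates below) can test
            Map<String, Object> attributes = new HashMap<>();
            Item(String name) { this.name = name; }
        }

        static Optional<Item> recommend(List<Item> catalog,
                                        List<Predicate<Item>> hardConstraints,
                                        List<Predicate<Item>> softPreferences) {
            return catalog.stream()
                    .filter(i -> hardConstraints.stream().allMatch(c -> c.test(i)))
                    .max(Comparator.<Item>comparingLong(
                            i -> softPreferences.stream().filter(p -> p.test(i)).count()));
        }
    }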
