Preference elicitation
Presentation Transcript

  1. Preference elicitation: Communication Burden. Based on work by Nisan, Segal, Lahaie and Parkes. October 27th, 2004, Jella Pfeiffer

  2. Outline • Motivation • Communication • Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Conclusion • Future Work

  3. Outline • Motivation • Communication • Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Conclusion • Future Work

  4. Motivation • Exponential number of bundles in the number of goods • Communication of values • Determination of valuations • Reluctance to reveal valuations entirely → minimize communication and information revelation* (* incentives are not considered)

  5. Agenda • Motivation • Communication • Burden • Protocols • Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Conclusion • Future Work

  6. Communication burden • Minimum number of messages • transmitted in a (nondeterministic) protocol • realizing the choice correspondence. Here: "worst-case" burden = the maximum such number of messages

  7. Communication protocols Sequential message sending • Deterministic protocol: the message sent is determined by the agent's type and the preceding messages • Nondeterministic protocol: omniscient oracle • knows the state of the world ≽ and • a desirable alternative x ∈ F(≽)

  8. Definition Nondeterministic protocol A nondeterministic protocol is a triple Γ = (M, μ, h) where M is the message set, μ: ℜ ⇉ M is the message correspondence, and h: M → X is the outcome function. The message correspondence μ has the following two properties: • Existence: μ(≽) ≠ ∅ for all ≽ ∈ ℜ, • Privacy preservation: μ(≽) = ∩i μi(≽i) for all ≽ ∈ ℜ, where μi: ℜi ⇉ M for all i ∈ N (each agent can check, using only its own preference, whether a message is acceptable to it).

  9. Agenda • Motivation • Communication • Lindahl prices • Equilibria • Importance of Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Conclusion • Future Work

  10. Lindahl Equilibria Lindahl prices: nonlinear and non-anonymous Definition: a pair (p, x) with personalized price functions p = (p1, …, pn), pi: X → ℝ, is a Lindahl equilibrium in state ≽ ∈ ℜ if • x ≽i y for every y ∈ X with pi(y) ≤ pi(x), for all i ∈ N, (L1) • ∑i pi(x) ≥ ∑i pi(y) for all y ∈ X. (L2) Lindahl equilibrium correspondence E: ℜ ⇉ X
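In the quasi-linear combinatorial-auction setting used in the second half of the talk, these two conditions are usually written in terms of bundle values vi and personalized bundle prices pi; this specialization is standard but is not spelled out on the slide:

$$\text{(L1)}\quad S_i \in \arg\max_{S}\,\bigl(v_i(S) - p_i(S)\bigr)\ \text{for all } i, \qquad \text{(L2)}\quad (S_1,\dots,S_n) \in \arg\max_{(T_1,\dots,T_n)\,\text{feasible}}\ \sum_i p_i(T_i).$$

Each bidder receives a bundle maximizing its utility at its personal prices, and the chosen allocation maximizes the seller's revenue at those prices.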

  11. Importance of Lindahl prices Protocol <M, μ, h> realizes the weakly Pareto efficient correspondence F* if and only if there exists an assignment of budget sets to messages such that protocol <M, μ, (B, h)> realizes the Lindahl equilibrium correspondence E. Communication burden of efficiency = burden of finding Lindahl prices!

  12. Agenda • Motivation • Communication • Lindahl prices • Communication complexity • Alice and Bob • Proof for Lower Bound • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Conclusion • Future Work

  13. Alice and Bob

  14. Communication Complexity (1) Finding a lower bound from "Alice and Bob": • Including auctioneer • Larger number of bidders • Queries to the bidders • Communicating real numbers • Deterministic protocols

  15. The proof Lemma: Let v ≠ u be arbitrary 0/1 valuations. Then the sequence of bits transmitted on inputs (v, v*) is not identical to the sequence of bits transmitted on inputs (u, u*), where v* is the dual valuation v*(S) = 1 − v(S^c). Theorem: Every protocol that finds the optimal allocation for every pair of 0/1 valuations v1, v2 must use at least (m choose m/2) bits of total communication in the worst case, i.e. exponentially many in the number of items m.
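A short worked step behind the lemma (a gloss of the standard fooling-set argument, not on the slide): under a dual pair (v, v*) every split of the items has the same welfare, since for any bundle S

$$v(S) + v^*(S^c) \;=\; v(S) + \bigl(1 - v(S)\bigr) \;=\; 1 .$$

So all allocations are optimal on (v, v*). If two distinct pairs (v, v*) and (u, u*) produced the same transcript, the mixed inputs (v, u*) and (u, v*) would produce it too, and on one of them the protocol would miss an allocation of welfare 2. Counting the distinguishable pairs yields the exponential bound.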

  16. Comments on the proof • In the main paper: obtaining a better allocation than auctioning off all objects as a single bundle in a two-bidder auction already needs an exponential number of bits. Holds for valuations with: • no externalities • normalization • With L = 50 items, the number of bits amounts to about 500 Gigabytes of data

  17. Communication Complexity (2) Theorem*: Exact efficiency requires communicating at least one price for each of the 2^L possible bundles, i.e. the dimension of the message space is exponential in the number of items L. *Holds for general valuations.

  18. Agenda • Motivation • Communication • Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Conclusion • Future Work

  19. Preference Classes • Submodular valuations: the dimension of the message space in any efficient protocol is still exponential in L • Homogeneous valuations: agents care only about the number of items received → dimension L • Additive valuations → dimension L

  20. Agenda • Motivation • Communication • Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Learning algorithms • Preference elicitation • Parallels (polynomial query learnable/elicitation) • Converting learning algorithms • Applications • Conclusion • Future Work

  21. Applying Learning Algorithms: learning theory (membership query, equivalence query) vs. preference elicitation (value query, demand query)

  22. What is a Learning Algorithm? • Learning an unknown function f: X → Y via questions to an oracle • Known function class C • Typically X = {0,1}^m, Y either {0,1} or ⊆ ℝ • Manifest hypothesis: the learner's current guess of f • size(f) is measured with respect to the chosen representation • Example: f: {0,1}^m → ℝ; f(x) = 2 if x consists of m 1's, and f(x) = 0 otherwise. Possible representations: 1) a list of all 2^m values, 2) the single term 2·x1x2⋯xm

  23. Learning Algorithm - Queries • Membership query: the learner asks the oracle for the value f(x) of a chosen instance x • Equivalence query: the learner proposes a hypothesis; the oracle either confirms that it equals f or returns a counterexample on which they differ
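To make the two query types concrete, here is a minimal Python sketch of an exact-learning oracle; the names (ExactLearningOracle, membership_query, equivalence_query) and the brute-force search inside the equivalence query are illustrative, not taken from the paper:

```python
from itertools import product
from typing import Callable, Optional, Tuple

Instance = Tuple[int, ...]   # an element of {0,1}^m

class ExactLearningOracle:
    """Answers membership and equivalence queries for a hidden target function f."""

    def __init__(self, target: Callable[[Instance], float], m: int):
        self.target = target
        self.m = m

    def membership_query(self, x: Instance) -> float:
        # Membership query: reveal the target's value on one chosen instance.
        return self.target(x)

    def equivalence_query(self, hypothesis: Callable[[Instance], float]) -> Optional[Instance]:
        # Equivalence query: answer None ("YES, correct") if the hypothesis agrees
        # with the target everywhere; otherwise return a counterexample instance.
        for x in product((0, 1), repeat=self.m):
            if hypothesis(x) != self.target(x):
                return x
        return None

# The slide's example target: worth 2 only on the all-ones instance.
oracle = ExactLearningOracle(lambda x: 2.0 if all(x) else 0.0, m=3)
print(oracle.membership_query((1, 1, 1)))       # -> 2.0
print(oracle.equivalence_query(lambda x: 0.0))  # -> (1, 1, 1), a counterexample
```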

  24. Preference elicitation Assumptions: • Normalized • No externalities • Quasi-linear utility functions (utility = vi(S) − payment) • Bundle values can be computed from the representation in polynomial time Goal: a set of manifest valuations sufficient to compute an optimal allocation.

  25. Preference elicitation - Queries • Value query: agent i is asked for its value vi(S) of a chosen bundle S • Demand query: agent i is quoted (personalized) prices and asked for a bundle it demands at those prices, i.e. one maximizing vi(S) − pi(S)

  26. Parallels: learning & preference elicitation • Membership query ↔ value query • Equivalence query ↔? demand query • Lindahl prices are only a constant away from manifest valuations • From the preferred bundle S' reported in a demand query, a counterexample for the learning algorithm can be computed

  27. Polynomial-query learnable Definition: The representation class C is polynomial-query exactly learnable from membership and equivalence queries if there is a fixed polynomial p and an algorithm L with access to membership and equivalence queries of an oracle such that, for any target function f ∈ C, L outputs after at most p(size(f), m) queries a function f' ∈ C with f'(x) = f(x) for all instances x.

  28. Polynomial-query elicited Similar to the definition of polynomial-query learnable, but: • value and demand queries • the agents' valuations are the target functions • the algorithm outputs an optimal allocation after at most p(size(v1,...,vn), m) queries • the valuation functions need not be determined exactly!

  29. Converting learning algorithms Result proved in the paper: If each representation class V1, …, Vn can be polynomial-query exactly learned from membership and equivalence queries, then V1, …, Vn can be polynomial-query elicited from value and demand queries.

  30. Converted Algorithm 1) Run the learning algorithms on the valuation classes until each one requires a response to an equivalence query

  31. Converted Algorithm 2) Compute an optimal allocation S* and Lindahl prices L* with respect to the manifest valuations 3) Present the demand query built from S* and L* to each agent

  32. Converted Algorithm 4) Quit if all agents answer YES; otherwise hand the counterexample from agent i to learning algorithm i and go to 1)
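Putting slides 30-32 together, here is a small runnable Python sketch of the conversion loop. It is a toy instantiation, not the paper's general construction: the learner is the trivial list-of-values learner, the Lindahl prices are taken to be the manifest valuations themselves (slide 26's "only a constant away"), and the optimal allocation is found by brute force; all names (Agent, elicit, bundles, allocations) are mine.

```python
from itertools import chain, combinations, product

def bundles(items):
    """All subsets of the item set, as frozensets."""
    return [frozenset(c) for c in
            chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))]

def allocations(items, n):
    """Every way of giving each item to exactly one of n agents."""
    items = list(items)
    for owners in product(range(n), repeat=len(items)):
        yield [frozenset(it for it, o in zip(items, owners) if o == i) for i in range(n)]

class Agent:
    """Answers value and demand queries from a privately known valuation."""

    def __init__(self, valuation):
        self.valuation = valuation                          # dict: bundle -> value

    def value_query(self, bundle):
        return self.valuation[bundle]

    def demand_query(self, bundle, prices):
        # Answer None ("YES") if `bundle` maximizes v(T) - p(T) at the quoted
        # prices; otherwise return a strictly preferred bundle as counterexample.
        surplus = lambda T: self.valuation[T] - prices.get(T, 0.0)
        best = max(self.valuation, key=surplus)
        return None if surplus(bundle) >= surplus(best) else best

def elicit(agents, items):
    """Converted elicitation loop of slides 30-32 (toy instantiation)."""
    n = len(agents)
    all_bundles = bundles(items)
    manifest = [dict() for _ in range(n)]                   # learned values, default 0

    def mval(i, T):
        return manifest[i].get(T, 0.0)

    while True:
        # 1) The "learning algorithm" here is the trivial list-of-values learner:
        #    its hypothesis is the manifest valuation, so it is always ready to
        #    pose an equivalence query.
        # 2) Optimal allocation w.r.t. the manifest valuations (brute force);
        #    supporting Lindahl prices are taken to be the manifest values.
        alloc = max(allocations(items, n),
                    key=lambda a: sum(mval(i, a[i]) for i in range(n)))
        prices = [{T: mval(i, T) for T in all_bundles} for i in range(n)]

        # 3)-4) Demand queries; quit if everyone says YES, otherwise hand each
        #       counterexample back to the learner via a value query.
        done = True
        for i, agent in enumerate(agents):
            preferred = agent.demand_query(alloc[i], prices[i])
            if preferred is not None:
                manifest[i][preferred] = agent.value_query(preferred)
                done = False
        if done:
            return alloc

# Two agents, two items: agent 0 wants 'a' (and the pair), agent 1 wants 'b'.
f = frozenset
agent0 = Agent({f(): 0, f("a"): 2, f("b"): 0, f("ab"): 3})
agent1 = Agent({f(): 0, f("a"): 0, f("b"): 2, f("ab"): 2})
print(elicit([agent0, agent1], ("a", "b")))                 # [frozenset({'a'}), frozenset({'b'})]
```

When every agent answers YES, an allocation that is optimal for the manifest valuations is also optimal for the true ones, which is the termination argument behind step 4.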

  33. Agenda • Motivation • Communication • Lindahl prices • Communication complexity • Preference Classes • Applying Learning Algorithms to Preference elicitation • Applications • Polynomial representation • XOR/DNF • Linear-Threshold • Conclusion • Future Work

  34. Polynomials • t-sparse multivariate polynomials: • at most t terms • a term is a product of variables (e.g. x1x3x5) • "Every valuation function can be uniquely written as a polynomial" [Schapire and Sellie] • Example: additive valuations • polynomials of size m (m = number of items) • x1 + … + xm • Learning algorithm: • a number of equivalence queries polynomial in m and t • a number of membership queries polynomial in m and t
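A small illustration of the encoding (my example, not the one on the slide): a valuation in which items 1 and 2 are complements worth 3 together, and item 3 is worth 1 on its own, is the 2-sparse polynomial 3·x1x2 + x3; evaluating it on the indicator vector of a bundle returns that bundle's value, e.g. the bundle {1, 3} gives 3·1·0 + 1 = 1.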

  35. XOR/DNF Representations (1) • XOR bids represent valuations which have free disposal • Analogue in learning theory: DNF formulae • disjunction of conjunctions of unnegated variables (monotone DNF; see the example below) • Atomic bids in the XOR have value 1
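An illustrative correspondence (my example): the XOR bid ({A, B}, 1) XOR ({C}, 1), "I pay 1 for any bundle that contains both A and B, or that contains C", matches the monotone DNF formula xA xB ∨ xC; a bundle has value 1 exactly when its indicator vector satisfies the formula, and 0 otherwise.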

  36. XOR/DNF Representations (2) • An XOR bid containing t atomic bids can be exactly learned with t+1 equivalence queries and at most tm membership queries • Each equivalence query leads to one new atomic bid • found with at most m membership queries (excluding the items of the counterexample which do not belong to the atomic bid)
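A minimal Python sketch of that per-counterexample step, assuming the counterexample is a positive example (value 1 under the target but 0 under the current hypothesis); extract_atomic_bid and the toy target are illustrative names, not from the paper:

```python
def extract_atomic_bid(counterexample, membership_query):
    """Shrink a positive counterexample down to one atomic bid.

    `counterexample` is a bundle on which the target valuation is 1 but the
    current hypothesis is 0; `membership_query(bundle)` returns the target's
    value. Uses at most len(counterexample) <= m membership (value) queries.
    """
    bid = set(counterexample)
    for item in sorted(counterexample):
        trial = bid - {item}
        if membership_query(frozenset(trial)) == 1:   # item not needed for value 1
            bid = trial
    return frozenset(bid)

# Toy target: the XOR bid ({A,B},1) XOR ({C},1), i.e. value 1 iff the bundle
# contains both A and B, or contains C.
target = lambda T: 1 if {"A", "B"} <= T or {"C"} <= T else 0
print(extract_atomic_bid({"A", "B", "D"}, target))     # frozenset({'A', 'B'})
```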

  37. Linear-Threshold Representations • r-of-S valuation: a bundle has value 1 if it contains at least r items from a target set S, and 0 otherwise • analogue in learning theory: r-of-k threshold functions (k = |S|) • If r is known: learnable/elicitable with polynomially many equivalence queries or demand queries
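A concrete instance (illustrative): with S = {A, B, C} and r = 2, the bundle {A, C} has value 1 while {A, D} has value 0; as a threshold function it evaluates to 1 exactly when xA + xB + xC ≥ 2.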

  38. Important Results by Nisan, Segal • Important role of prices (an efficient allocation must reveal supporting Lindahl prices) • Efficient communication must name at least one Lindahl price for each of the bundles • Lower bound: no generally good communication design exists → focus on specific classes of preferences

  39. Important Results by Lahaie, Parkes • Learning algorithms with membership and equivalence queries serve as the basis for preference elicitation algorithms • If a polynomial-query learning algorithm exists for the valuation classes, preferences can be elicited with a number of queries polynomial in m and size(v1,…,vn) → such algorithms exist for polynomials, XOR, and linear-threshold representations

  40. Future Work • Find more specific classes of preferences that can be elicited efficiently • Address the issue of incentives • Determine which Lindahl prices should be used for the queries

  41. Thank you for your attention. Any questions?