
An Improved Approximation Algorithm for Combinatorial Auctions with Submodular Bidders



  1. An Improved Approximation Algorithm for Combinatorial Auctions with Submodular Bidders

  2. Combinatorial Auctions • A set M={1,…,m} of items for sale. • n bidders; each bidder i has a valuation function vi: 2^M → R+. Common assumptions: • Normalization: vi(∅)=0 • Free disposal: S ⊆ T ⟹ vi(S) ≤ vi(T) • Goal: find a partition S1,…,Sn of M such that the social welfare Σi vi(Si) is maximized
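  To make the objective concrete, here is a minimal brute-force sketch (not from the slides; the valuations and names are illustrative) that enumerates every way of assigning the items to bidders and returns a welfare-maximizing allocation:

```python
from itertools import product

def optimal_welfare(valuations, items):
    """Brute force over all assignments of items to bidders.
    Exponential in the number of items -- for illustration only."""
    n = len(valuations)
    best_welfare, best_bundles = None, None
    for assignment in product(range(n), repeat=len(items)):
        bundles = [frozenset(j for j, owner in zip(items, assignment) if owner == i)
                   for i in range(n)]
        welfare = sum(valuations[i](bundles[i]) for i in range(n))
        if best_welfare is None or welfare > best_welfare:
            best_welfare, best_bundles = welfare, bundles
    return best_welfare, best_bundles

# Two budget-additive (hence submodular) bidders over items a, b, c.
v1 = lambda S: min(2, len(S))                         # a, b, c worth 1 each, cap 2
v2 = lambda S: min(1, sum(1 for j in S if j == "b"))  # only b is worth 1, cap 1
print(optimal_welfare([v1, v2], ["a", "b", "c"]))     # welfare 3
```

  Since this enumeration is exponential in m, the talk is about polynomial-time approximation algorithms that access the valuations only through queries.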

  3. Combinatorial Auctions • Problem 1: finding an optimal allocation is NP-hard. Therefore, we are interested in the possible approximation ratios. • Problem 2: the valuations’ length is exponential in m, while we wish our algorithms to be polynomial in m and n. • Problem 3: how can we be certain that the bidders do not lie?

  4. Access Models • Common types of queries: • Value: given a bundle S, return v(S). • Demand: given a vector of prices (p1,…,pm), return a bundle S that maximizes v(S) − Σj∈S pj. (Demand queries are strictly more powerful than value queries [Blumrosen-Nisan, Dobzinski-Schapira].) • General: any possible type of query (the communication model).
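  As an illustration (not from the slides; the dictionaries and helper names are hypothetical), a bidder with an additive valuation can answer both query types easily; for a demand query the utility-maximizing bundle is simply every item whose value exceeds its price:

```python
def value_query(values, bundle):
    """Value query for an additive bidder: v(S) = sum of per-item values."""
    return sum(values[j] for j in bundle)

def demand_query(values, prices):
    """Demand query for an additive bidder: maximize v(S) - sum of prices in S."""
    return {j for j in values if values[j] > prices[j]}

values = {"a": 3.0, "b": 1.0, "c": 2.0}
prices = {"a": 2.0, "b": 1.5, "c": 0.5}
print(value_query(values, {"a", "c"}))  # 5.0
print(demand_query(values, prices))     # {'a', 'c'}
```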

  5. The Hierarchy of CF Valuations (Lehmann-Lehmann-Nisan) OXS ⊂ GS ⊂ SM ⊂ XOS ⊂ CF • Complement-Free (CF): v(S∪T) ≤ v(S) + v(T). • XOS • Submodular (SM): v(S∪T) + v(S∩T) ≤ v(S) + v(T). • Semantic characterization: decreasing marginal utilities. • 2-approximation (Lehmann-Lehmann-Nisan). • Recent result: an e/(e-1)-approximation (Dobzinski-Schapira). • GS (Gross Substitutes): solvable in polynomial time.
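  A brute-force check of the submodularity condition above, as a sketch (not from the slides; only practical for a handful of items):

```python
from itertools import chain, combinations

def all_subsets(items):
    """All subsets of `items`, as frozensets."""
    items = list(items)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

def is_submodular(v, items):
    """Check v(S | T) + v(S & T) <= v(S) + v(T) for every pair of subsets."""
    subsets = all_subsets(items)
    return all(v(S | T) + v(S & T) <= v(S) + v(T) + 1e-9
               for S in subsets for T in subsets)

# Example: a budget-additive valuation v(S) = min(2, |S|) is submodular.
print(is_submodular(lambda S: min(2, len(S)), {"a", "b", "c"}))  # True
```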

  6. Part I: Approximations Using Demand Queries • An e/(e-1)-approximation for XOS • Also holds for submodular valuations. • The previously known upper bound is 2 (Lehmann-Lehmann-Nisan, Dobzinski-Nisan-Schapira) • An e/(e-1) communication lower bound for XOS

  7. XOS • The maximum over additive valuations: (a:1 ∨ b:2 ∨ c:3) ⊕ (a:2) • Examples: v({a}) = 2, v({a,b}) = 3, v({a,b,c}) = 6
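  In code, an XOS valuation can be represented as a list of additive clauses (one dict per clause); a sketch reproducing the slide's example:

```python
def xos_value(clauses, bundle):
    """XOS valuation: the maximum over additive clauses of the clause's
    value on the bundle (sum of its per-item prices restricted to the bundle)."""
    return max(sum(clause.get(j, 0) for j in bundle) for clause in clauses)

# The clauses (a:1 ∨ b:2 ∨ c:3) and (a:2) from the slide.
clauses = [{"a": 1, "b": 2, "c": 3}, {"a": 2}]
print(xos_value(clauses, {"a"}))             # 2
print(xos_value(clauses, {"a", "b"}))        # 3
print(xos_value(clauses, {"a", "b", "c"}))   # 6
```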

  8. Intuition for the XOS algorithm • We exploit the syntax of the XOS class. • We can regard the value each bidder assigns to a bundle as a sum of the values he assigns to the items in that bundle. • We will analyze the expected contribution of each item separately.

  9. The XOS Algorithm – Step 1 • Solve the linear relaxation of the problem: Maximize: Σi,S xi,S·vi(S) Subject to: • For each item j: Σi,S:j∈S xi,S ≤ 1 • For each bidder i: ΣS xi,S ≤ 1 • For each i,S: xi,S ≥ 0
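  The LP has a variable xi,S for every bidder i and bundle S, so it cannot be written down explicitly for large m; the paper solves it with demand queries. Purely for illustration, here is a brute-force sketch (assumed helper names, tiny instances only) that enumerates all bundles and hands the LP to a generic solver:

```python
from itertools import combinations
import numpy as np
from scipy.optimize import linprog

def solve_lp_relaxation(valuations, m):
    """Configuration LP by brute force: one variable x[i,S] per bidder i and
    non-empty bundle S.  Only feasible for tiny m; the algorithm in the talk
    solves this LP using demand queries instead."""
    items = range(m)
    bundles = [frozenset(c) for r in range(1, m + 1)
               for c in combinations(items, r)]
    keys = [(i, S) for i in range(len(valuations)) for S in bundles]
    idx = {k: t for t, k in enumerate(keys)}

    c = np.array([-valuations[i](S) for (i, S) in keys])   # linprog minimizes

    rows, rhs = [], []
    for j in items:                    # each item fractionally allocated <= 1
        rows.append([1.0 if j in S else 0.0 for (_, S) in keys])
        rhs.append(1.0)
    for i in range(len(valuations)):   # each bidder gets <= 1 bundle in total
        rows.append([1.0 if i2 == i else 0.0 for (i2, _) in keys])
        rhs.append(1.0)

    res = linprog(c, A_ub=np.array(rows), b_ub=np.array(rhs),
                  bounds=(0, None), method="highs")
    x = {k: res.x[idx[k]] for k in keys if res.x[idx[k]] > 1e-9}
    return -res.fun, x

# Tiny example: two items {0,1}, two XOS (unit-demand) bidders.
vals = [lambda S: max(1.0 if 0 in S else 0.0, 2.0 if 1 in S else 0.0),
        lambda S: 1.0 if S else 0.0]
print(solve_lp_relaxation(vals, 2))   # optimal fractional welfare 3.0
```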

  10. The XOS Algorithm – Steps 2-3 • Randomized rounding: for each bidder i, let Si be the bundle S with probability xi,S, and the empty set with probability 1 − ΣS xi,S. • The expected value of vi(Si) is ΣS xi,S·vi(S). • Suppose bidder i got the bundle Si, with maximizing additive clause (x1:p1^i ∨ … ∨ xm:pm^i). • Give item j to the bidder i with pj^i ≥ pj^i' for all i'.
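  A sketch of these two steps (not the paper's exact code; it assumes each bidder's XOS valuation is given explicitly as a list of additive clauses, and x is the LP solution keyed by (bidder, bundle)):

```python
import random

def maximizing_clause(clauses, bundle):
    """The additive clause attaining the XOS maximum on `bundle`."""
    return max(clauses, key=lambda cl: sum(cl.get(j, 0) for j in bundle))

def round_and_resolve(xos_bidders, x, items, rng=random):
    """Step 2: sample a tentative bundle Si for each bidder from x[i,S].
    Step 3: give each contested item to the bidder whose maximizing clause
    assigns it the highest price."""
    n = len(xos_bidders)
    tentative = []
    for i in range(n):
        r, chosen = rng.random(), frozenset()
        for (i2, S), p in x.items():
            if i2 != i:
                continue
            if r < p:                 # S is chosen with probability x[i,S]
                chosen = S
                break
            r -= p
        tentative.append(chosen)
    prices = [maximizing_clause(xos_bidders[i], tentative[i]) if tentative[i] else {}
              for i in range(n)]
    allocation = [set() for _ in range(n)]
    for j in items:
        bids = [(prices[i].get(j, 0), i) for i in range(n) if j in tentative[i]]
        if bids:
            allocation[max(bids)[1]].add(j)   # highest supporting price wins
    return allocation

# Example: the LP put bidder 0 on {a,b} and bidder 1 on {b} with probability 1.
bidders = [[{"a": 1, "b": 2}], [{"b": 1}]]
x = {(0, frozenset({"a", "b"})): 1.0, (1, frozenset({"b"})): 1.0}
print(round_and_resolve(bidders, x, ["a", "b"], random.Random(0)))
```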

  11. The XOS Algorithm • Theorem: the algorithm is an e/(e-1)-approximation. • Proof: only for the special case where all prices are equal. • Example: (x1:1 ∨ x2:1) ⊕ (x1:1) • We now only need to prove that the expected number of allocated items is ≥ (1−(1−1/n)^n)·Σi,S xi,S·|S|. • We will prove that each item j is allocated with probability ≥ (1−(1−1/n)^n)·Σi,S:j∈S xi,S.

  12. The XOS Algorithm Proof • Pr[item j is not allocated] ≤ Πi=1..n (1 − ΣS:j∈S xi,S) = ( (Πi=1..n (1 − ΣS:j∈S xi,S))^(1/n) )^n • By the arithmetic/geometric mean inequality this is ≤ ( (Σi=1..n (1 − ΣS:j∈S xi,S)) / n )^n = (1 − (Σi,S:j∈S xi,S)/n)^n • Pr[item j is allocated] ≥ 1 − (1 − (Σi,S:j∈S xi,S)/n)^n ≥ (1 − (1−1/n)^n)·Σi,S:j∈S xi,S
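  The final step uses the fact that f(z) = 1 − (1 − z/n)^n is concave, equals 0 at z=0 and 1−(1−1/n)^n at z=1, and therefore lies above the line (1−(1−1/n)^n)·z on [0,1]; note that z = Σi,S:j∈S xi,S ≤ 1 by the LP's item constraint. A quick numerical sanity check of this inequality, not from the slides:

```python
import numpy as np

n = 5
z = np.linspace(0.0, 1.0, 1001)              # z stands for the sum over i,S:j in S of x[i,S]
lhs = 1.0 - (1.0 - z / n) ** n               # lower bound on Pr[item j allocated]
rhs = (1.0 - (1.0 - 1.0 / n) ** n) * z       # the claimed linear lower bound
print(bool(np.all(lhs >= rhs - 1e-12)))      # True on the whole interval
```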

  13. An e/(e-1) Lower Bound for XOS • Theorem: any approximation better than e/(e-1) for combinatorial auctions with XOS bidders requires exponential communication. • Unconditional lower bound. • We will prove the lower bound for the MCG problem (Chekuri-Kumar): • We are given a set M of items, and n groups of subsets of M. • The goal is to choose one subset from each group so that the size of their union is maximized. • [Figure: an MCG instance over items A–F and the corresponding auction with n XOS bidders, e.g. v1 = (A:1 ∨ D:1) ⊕ (D:1 ∨ E:1 ∨ F:1), v2 = (B:1 ∨ C:1) ⊕ (C:1 ∨ F:1).]

  14. Approximate Disjointness • n players, each holds a string of length t. • The string of player i specifies a subset Ai ⊆ {1,…,t}. • The goal is to distinguish between the following two extreme cases: • NO: ∩i Ai ≠ ∅ • YES: for every i≠j, Ai∩Aj = ∅ • Theorem: requires t/n^4 bits of communication (Alon-Matias-Szegedy)

  15. The Reduction • Denote a partition C of M into n parts by {C1,…,Cn}. • We build a family of partitions F = {C^1,…,C^exp(m/n)} such that any n sets taken from different partitions cover at most (1−(1−1/n)^n)m elements. • Existence is proved by a probabilistic construction: • Build each partition at random: place each item in exactly one of the n parts, uniformly. • Given n such sets, the probability that a fixed item is covered is (1−(1−1/n)^n). • The expected coverage is therefore (1−(1−1/n)^n)m. • By the Chernoff bound, the probability of covering significantly more is exponentially small ⟹ a union bound allows an exponential number of partitions. • Each player i who got Ai as input constructs the collection Bi = {Ci^s : s ∈ Ai}. • If the Ai's have a common element, all m elements can be covered (take the common partition). • If the Ai's are disjoint, the construction guarantees that no more than (1−(1−1/n)^n)m elements can be covered. • Corollary: exponential communication is required for any approximation better than (1−(1−1/n)^n).
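  A small Monte Carlo illustration of the coverage bound (a sketch, not the paper's construction): take one part from each of n independent uniformly random partitions and measure the fraction of items covered; it concentrates around 1−(1−1/n)^n.

```python
import random

def random_partition(m, n, rng):
    """Place each of m items into one of n parts uniformly at random."""
    parts = [set() for _ in range(n)]
    for item in range(m):
        parts[rng.randrange(n)].add(item)
    return parts

rng = random.Random(0)
m, n, trials = 1000, 5, 200
covered = 0
for _ in range(trials):
    union = set()
    for i in range(n):               # part i of an independent random partition
        union |= random_partition(m, n, rng)[i]
    covered += len(union)
print(covered / (trials * m))        # empirical coverage fraction
print(1 - (1 - 1 / n) ** n)          # predicted bound, about 0.672
```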

  16. Part II: Approximations Using Value Queries • An m^(1/4−ε) lower bound for XOS. • An m^(1/2)-approximation algorithm for CF is known (Dobzinski-Nisan-Schapira). • A (2−1/n)-approximation for submodular valuations. • The previously known upper bound for submodular valuations is 2 (Lehmann-Lehmann-Nisan). • A 1+1/(2m) communication lower bound for submodular valuations is known (Nisan-Segal). • An e/(e-1) lower bound, conditional on P≠NP (Khot-Lipton-Markakis-Mehta). Reminder: OXS ⊂ GS ⊂ SM ⊂ XOS ⊂ CF

  17. An m^(1/4−ε) Lower Bound for XOS • Setting: m items, m^(1/2) XOS bidders. • Choose, uniformly at random, a partition T1,…,Tn, where |Ti| = m^(1/2). • Valuations: vi = (∨j∈Ti j:m^(−1/2)) ⊕ (⊕ over all S with |S|=2m^(1/4+ε) of ∨j∈S j:m^(−1/4)) ⊕ (⊕ over all S with |S|=m^(3/4) of ∨j∈S j:m^(−1/4)) • The optimal allocation has value m^(1/2) (allocate according to the Ti's). • Lemma: an exponential number of value queries is required to find a bundle R, |R| < m^(3/4), for which the maximizing clause is (∨j∈Ti j:m^(−1/2)). • Corollary: the best allocation the algorithm can find has value 2m^(1/4+ε). • Proof (of lemma): • The expected intersection between a random bundle and Ti is m^(1/4). • By the Chernoff bound, the probability that a fixed bundle's intersection with Ti is significantly larger than this expectation is exponentially small. • By the union bound, an exponential number of value queries is required to find such a bundle.

  18. A (2−1/n)-Approximation • An equivalent definition of submodular valuations ("decreasing marginal utilities"): • Marginal utility of j given S: v(j|S) := v(S∪{j}) − v(S) • For T ⊆ S ⊆ M: v(j|S) ≤ v(j|T) • Fact: the marginal valuation of a submodular valuation is also submodular. • The greedy algorithm provides a 2-approximation (Lehmann-Lehmann-Nisan). • We use randomization to improve the approximation ratio.

  19. The Algorithm • For each item j = 1..m: • For each bidder i, let ti = vi(j|Si)^(n−1), where Si is the set of items bidder i has received so far. • Assign item j to exactly one bidder, where bidder i is chosen with probability ti / Σk tk. • Theorem: the algorithm produces an allocation that is, in expectation, a (2−1/n)-approximation to the optimal social welfare. • We will prove the theorem for n=2.
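  A sketch of the algorithm (not the paper's code; valuations are callables, marginals are assumed non-negative by free disposal, and zero-marginal ties are broken arbitrarily), reusing the running example from the next slides:

```python
import random

def marginal(v, j, S):
    """Marginal value of item j given bundle S: v(S + {j}) - v(S)."""
    return v(S | {j}) - v(S)

def randomized_greedy(valuations, items, rng=random):
    """Assign each item j to bidder i with probability proportional to
    vi(j|Si)^(n-1), where Si is the bundle bidder i has accumulated so far."""
    n = len(valuations)
    bundles = [frozenset() for _ in range(n)]
    for j in items:
        t = [marginal(valuations[i], j, bundles[i]) ** (n - 1) for i in range(n)]
        total = sum(t)
        if total == 0:
            winner = 0                        # nobody values j: assign arbitrarily
        else:
            r = rng.random() * total          # sample winner with prob. t[i]/total
            winner = 0
            while r >= t[winner]:
                r -= t[winner]
                winner += 1
        bundles[winner] = bundles[winner] | {j}
    return bundles

# Running example from the proof sketch: two budget-additive bidders.
v1 = lambda S: min(2, len(S))                          # a, b, c worth 1 each, cap 2
v2 = lambda S: min(1, sum(1 for j in S if j == "b"))   # only b worth 1, cap 1
alloc = randomized_greedy([v1, v2], ["a", "b", "c"], random.Random(1))
print(alloc, sum(v(S) for v, S in zip([v1, v2], alloc)))
```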

  20. Proof Sketch • Running example: v1(a)=1, v1(b)=1, v1(c)=1, v1(S)=min(2, Σj∈S v1(j)); v2(a)=0, v2(b)=1, v2(c)=0, v2(S)=min(1, Σj∈S v2(j)) • Let OPTj denote the value of the optimal solution without the first (j−1) items.

  21. Proof Sketch • Let OPTj denote the value of the optimal solution without the first (j−1) items, • with the submodular valuations v1(·|S1),…,vn(·|Sn). • Running example, after item a is assigned to bidder 1: v1(b|a)=1, v1(c|a)=1, v1(S|a)=min(1, Σj∈S v1(j|a)); v2 is unchanged: v2(a)=0, v2(b)=1, v2(c)=0, v2(S)=min(1, Σj∈S v2(j))

  22. Proof Sketch • Let Pj denote the random variable which indicates the "price" we got for item j, • i.e. the contribution of item j to the total social welfare. • Observe that E[ALG] = Σj E[Pj]. • Let OPTij denote the optimal solution given that item j was assigned to bidder i. • Lj denotes the random variable OPTj − OPTj+1, • i.e. how much did we lose by assigning item j to bidder i? • We will prove that E[Lj] / E[Pj] ≤ 1.5, and the theorem will follow.

  23. Proof Sketch • Lemma: E[Lj] / E[Pj] ≤ 1.5 • Proof: Notation: vi := vi(j|Si). • E[Pj] = v1·(v1/(v1+v2)) + v2·(v2/(v1+v2)) = (v1² + v2²) / (v1+v2)

  24. Proof Sketch • WLOG bidder 2 gets item j in OPTj. • If we assign item j to bidder 2: L = OPTj − OPT2j ≤ v2 • This happens with probability v2 / (v1+v2)

  25. Proof Sketch • Suppose we assign item j to bidder 1: • Bidder 1 loses at most v1 in OPT1j • (the marginal value of j given the bundle he gets in OPT1j is at most v1). • Bidder 2 loses at most v2 in OPT1j. • ⟹ L ≤ v1+v2 • This happens with probability v1 / (v1+v2) • E[Lj] ≤ v2·(v2/(v1+v2)) + (v1+v2)·(v1/(v1+v2)) = (v1²+v2²+v1·v2) / (v1+v2)

  26. Proof Sketch • We have: • E[Lj] ≤ (v1²+v2²+v1·v2) / (v1+v2) • E[Pj] = (v1² + v2²) / (v1+v2) • E[Lj] / E[Pj] ≤ (v1²+v2²+v1·v2) / (v1²+v2²) = 1 + v1·v2 / (v1²+v2²) ≤ 1.5
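  The last inequality is the AM-GM step 2·v1·v2 ≤ v1²+v2², written out here (a standard step not spelled out on the slide; when v1·v2 = 0 the ratio is at most 1 and the bound is immediate):

```latex
\[
\frac{E[L_j]}{E[P_j]}
  \le \frac{v_1^2 + v_2^2 + v_1 v_2}{v_1^2 + v_2^2}
  = 1 + \frac{v_1 v_2}{v_1^2 + v_2^2}
  \le 1 + \frac{v_1 v_2}{2\, v_1 v_2}
  = \frac{3}{2},
\qquad \text{since } 2\, v_1 v_2 \le v_1^2 + v_2^2 .
\]
```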

  27. Online Combinatorial Auctions • Items arrive one by one. • Each item must be assigned as it arrives. • The type of queries the algorithm is allowed to ask is restricted. • We suggest two natural restrictions. • Our algorithm provides a (2−1/n) upper bound for both variants.

  28. Variant I: Look Backwards • Before assigning item j the algorithm may only query bundles S ⊆ {1,…,j}. • Online Matching (Karp-Vazirani-Vazirani) • Bipartite graph; the goal is to find a maximum bipartite matching. Vertices from side I arrive one by one, and the edges of a vertex are revealed as the vertex arrives. • Reduction: the set of vertices from side I is the set of items, and the set of vertices from side II is the set of bidders. Vi(S)=1 if there exists some v∈S such that the edge (v,i) exists; otherwise Vi(S)=0. • e/(e-1) randomized upper bound. • Other problems: Online b-Matching (Kalyanasundaram-Pruhs), Adwords (Mehta-Saberi-Vazirani-Vazirani). • All have an e/(e-1) randomized upper bound.

  29. Variant II: Look Ahead • Before assigning item j the algorithm may only query the marginal value of item j given any bundle S ⊆ M. • Bounded-Delay Buffer (Kesselman et al.) • Packets arrive one by one, each with a value and a deadline. We can transmit one packet at a time. The goal is to maximize the total value of packets transmitted before their deadlines. • Reduction: let the set of time slots be the set of items; each packet corresponds to a bidder. Vi(S)=1 if S contains a time slot between the arrival and the expiration of the corresponding packet; otherwise Vi(S)=0. • e/(e-1) randomized upper bound (Bartal et al.)

  30. Summary • Demand queries: • e/(e-1) upper bound for XOS valuations • Also holds for submodular valuations • e/(e-1) lower bound for XOS valuations • Holds for any type of queries • Value queries: • An m^(1/4−ε) lower bound for approximating CF valuations using value queries only. • (2−1/n)-approximation for submodular valuations. • An e/(e-1) lower bound is known (Khot-Lipton-Markakis-Mehta). Reminder: OXS ⊂ GS ⊂ SM ⊂ XOS ⊂ CF

  31. Open Questions • Is there an e/(e-1) upper bound for combinatorial auctions with submodular valuations using value queries only? • An upper bound of e/(e-1) is known for many special cases. • Online: online matching, bounded-delay buffer, … • Offline: budget-additive valuations (Andelman-Mansour), coverage valuations. • Is there a constant lower bound for approximating submodular valuations using demand oracles? • Close the gap between the O(log m)-approximation for CF valuations and the 2−ε lower bound. • Incentive compatible auctions with better approximation ratios.
