
Understanding #P Complexity in Counting Problems

Explore the complexity of #P, range problems, NP relations, and complete problems. Learn about counting functions, #SAT, NP reductions, and more. Dive into the world of #P-complete problems and their implications.


Presentation Transcript


  1. The Counting Class #P. Slides by Vera Asodi & Tomer Naveh. Updated by Avi Ben-Aroya & Alon Brook. Adapted from Oded Goldreich's course lecture notes by Oded Lachish, Yoav Rodeh and Yael Tauman.

  2. Introduction. In this lecture we'll cover:
  • Definition and characterization of #P
  • Completeness in #P
  • Range problems
  • Promise problems
  • The relation between NP and #P
  • Unique SAT

  3. 10.1 #P
  Def: An NP-relation is a relation R ⊆ Σ* × Σ* s.t.:
  • R is polynomial-time decidable.
  • There exists a polynomial p(.) such that for every (x,y) ∈ R, |y| ≤ p(|x|).
  Def: L_R = { x ∈ Σ* | ∃y s.t. (x,y) ∈ R }. Such a y is a witness to the membership of x in L_R.
  The class NP relates to whether such a y exists. Another natural question is: "how many such y's exist?" This is the question the class #P is concerned with.

  4. #P - Def
  Def: For every binary relation R ⊆ Σ* × Σ*, the counting function f_R: Σ* → ℕ is defined by: f_R(x) = |{ y | (x,y) ∈ R }|.
  Def: #P = { f_R : R is an NP-relation }.
  Def (counting language): The language of thresholds for the number of witnesses for each x ∈ Σ*: #R = { (x,k) : |{ y : (x,y) ∈ R }| ≥ k }.

  5. Plan of the lecture
  • We are interested in how the power of #P compares with that of other complexity classes.
  • This will be shown via the following chain of reductions:
  StrongRange_poly(#P) ≤_Cook StrongRange_poly(#SAT) ≤_Cook WeakRange_ε(#SAT) ≤_Cook ConstantRange_64(#SAT) ≤_Cook Gap8#SAT ≤_prob.Cook SAT ≤_Cook Gap8#SAT' ≤_prob.Cook fewSAT ≤_Cook uniqueSAT

  6. Some simple properties of #P
  Prop: For any NP-relation R:
  • #R is Cook-reducible to f_R
  • f_R is Cook-reducible to #R
  Proof: The first is trivial. For the second, we know that f_R(x) is in the range {0,…,2^p(|x|)}, where p(.) is the polynomial bounding the length of the witnesses in the NP-relation. We use binary search to compute f_R(x), given an oracle for #R, in time polynomial in |x|.
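To make the binary-search step concrete, here is a minimal Python sketch. It assumes a hypothetical threshold oracle sharp_R(x, k) answering whether |{y : (x,y) ∈ R}| ≥ k, and a function p giving the witness-length bound; neither name appears in the slides.

```python
def count_via_threshold(x, sharp_R, p):
    """Compute f_R(x) with O(p(|x|)) calls to a threshold oracle.

    sharp_R(x, k) is assumed to return True iff |{y : (x,y) in R}| >= k.
    p(n) bounds the witness length, so f_R(x) lies in {0, ..., 2**p(|x|)}.
    """
    lo, hi = 0, 2 ** p(len(x))      # f_R(x) is somewhere in [lo, hi]
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if sharp_R(x, mid):         # at least `mid` witnesses exist
            lo = mid
        else:                       # fewer than `mid` witnesses
            hi = mid - 1
    return lo
```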

  7. Some simple properties of #P (2) 10.2
  Prop: For every NP-relation R, the corresponding language L_R Cook-reduces to f_R.
  Corollary: NP Cook-reduces to #P.

  8. Some simple properties of #P (3)
  Claim: #P Cook-reduces to PSPACE.
  Proof: Let p(.) be the polynomial bounding the length of the witnesses of R. We run over all possible witnesses of length ≤ p(|x|), checking for each one whether it is indeed a witness, and counting the number of witnesses. Checking a witness can be done in polynomial time (since R is an NP-relation) and hence in polynomial space. Both the witness and the working area are polynomial in |x|, so the problem is in PSPACE.
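For intuition, a sketch of this counting-in-PSPACE argument, assuming R is given as a polynomial-time predicate R(x, y) over bit-strings and p is the witness-length bound (both hypothetical names):

```python
from itertools import product

def count_witnesses(x, R, p):
    """Count witnesses by enumeration: only the current candidate y and a
    counter are stored, so the space used is polynomial in |x| even though
    the running time is exponential."""
    count = 0
    for length in range(p(len(x)) + 1):
        for bits in product("01", repeat=length):
            if R(x, "".join(bits)):
                count += 1
    return count
```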

  9. Completeness in #P 10.3
  Def: f is #P-complete if:
  • f is in #P
  • For every g in #P, g Cook-reduces to f.
  Def: A reduction φ: Σ* → Σ* is parsimonious with respect to NP-relations Q and R if for every x: |{ y : (x,y) ∈ Q }| = |{ y : (φ(x),y) ∈ R }|. That is, the number of witnesses is unchanged.
  Corollary: If R is an NP-relation, and for every NP-relation Q there exists a reduction φ_Q from Q to R such that φ_Q is parsimonious with respect to Q and R, then f_R is #P-complete.

  10. A #P-Complete problem 10.4
  Def: R_SAT = { (φ,τ) : φ is a Boolean formula and τ is a satisfying assignment of φ }.
  Def: #SAT = f_R_SAT, i.e. #SAT(φ) = |{ τ : τ satisfies φ }|.

  11. A #P-Complete problem (2)
  Thm: #SAT is #P-complete.
  Proof (outline):
  • #SAT is in #P since R_SAT is an NP-relation.
  • We proved that SAT is NP-complete by a reduction from any NP-relation to bounded halting, a reduction from bounded halting to circuit-SAT, and a reduction from circuit-SAT to SAT (the reductions we used at the time). If we look at these reductions, we can see that the correspondence between the witnesses is 1-1, and therefore the reductions are parsimonious. By the previous corollary, it follows that #SAT is #P-complete.

  12. Example: f_R is #P-complete, L_R ∈ P 10.5
  Thm: There exists an NP-relation R s.t. f_R is #P-complete, and L_R is in P.
  Proof: We define the relation R'_SAT = { (φ,(τ,σ)) | [(φ(τ) = 1) ∧ (σ = 1)] ∨ (σ = 0) }.
  Every φ has a witness (any τ together with σ = 0), so L_R'_SAT is in P.
  But f_R'_SAT(φ) = #SAT(φ) + 2^|variables(φ)|, so #SAT(φ) can be calculated from f_R'_SAT(φ); therefore f_R'_SAT is #P-complete.

  13. 10.6 Perfect Matching
  Def: R_PM = { (G,f) : G is a bipartite graph and f is a perfect matching of G }.
  Prop: R_PM is polynomial-time decidable.
  Thm: f_R_PM is #P-complete.
  This result is proved by showing that calculating the permanent of a {0,1} matrix is #P-complete (not proven here), and computationally equivalent to counting perfect matchings.

  14. Perfect Matching (2)
  Def: The permanent of an n×n matrix A is: Perm(A) = Σ_{σ∈S_n} Π_{i=1..n} A_{i,σ(i)}, where S_n is the set of permutations of {1,…,n}.
  • For a {0,1} matrix, Perm(A) is, in fact, the number of ways to choose n 1's from the matrix, one in each column and one in each row.
  • If we look at a bipartite graph with n vertices on each side, then we can represent its edges as a {0,1} matrix of size n×n. Each perfect matching is a choice of n 1's from this matrix, one in each column and one in each row. Therefore computing Perm(A) and counting perfect matchings are equivalent problems.
  Hot news: a fully polynomial randomized approximation scheme for the permanent (of matrices with nonnegative entries) has been discovered recently.
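A brute-force sketch (exponential, for intuition only) of the equivalence just described; the 4×4 biadjacency matrix is a made-up example:

```python
from itertools import permutations
from math import prod

def permanent(A):
    """Perm(A) = sum over all permutations sigma of prod_i A[i][sigma(i)]."""
    n = len(A)
    return sum(prod(A[i][s[i]] for i in range(n)) for s in permutations(range(n)))

def count_perfect_matchings(A):
    """For a {0,1} biadjacency matrix, a permutation sigma with
    A[i][sigma(i)] = 1 for every i is exactly one perfect matching."""
    n = len(A)
    return sum(1 for s in permutations(range(n))
               if all(A[i][s[i]] == 1 for i in range(n)))

A = [[1, 1, 0, 0],   # made-up biadjacency matrix of an 8-cycle
     [0, 1, 1, 0],
     [0, 0, 1, 1],
     [1, 0, 0, 1]]
assert permanent(A) == count_perfect_matchings(A) == 2
```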

  15. Perfect matching (3)
  [Figure: a bipartite graph with vertices 1–4 on each side, illustrating a perfect matching.]

  16. Cycle Cover 10.7
  Def: A cycle cover of a directed graph G is a set of vertex-disjoint simple cycles that cover all the vertices of G.
  Prop: For every directed graph G, Perm(A(G)) = #Cycle(G), where A(G) is the adjacency matrix of G and #Cycle(G) is the number of cycle covers of G.
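A matching brute-force check of this proposition (again exponential; the 3-vertex digraph is made up, with self-loops counted as cycles of length 1): a cycle cover is exactly a permutation σ with an edge (i, σ(i)) for every vertex i, which is the same object the permanent sums over.

```python
from itertools import permutations

def cycles_of(sigma):
    """Decompose a permutation (given as a tuple) into its cycles."""
    seen, cycles = set(), []
    for start in range(len(sigma)):
        v, cycle = start, []
        while v not in seen:
            seen.add(v)
            cycle.append(v)
            v = sigma[v]
        if cycle:
            cycles.append(tuple(cycle))
    return cycles

def cycle_covers(adj):
    """Yield the cycle covers of a digraph given by its adjacency matrix."""
    n = len(adj)
    for sigma in permutations(range(n)):
        if all(adj[i][sigma[i]] == 1 for i in range(n)):
            yield cycles_of(sigma)

A = [[1, 1, 0],   # made-up digraph: edges 0->0, 0->1, 1->0, 1->2, 2->0, 2->2
     [1, 0, 1],
     [1, 0, 1]]
print(list(cycle_covers(A)))   # 2 covers, matching Perm(A) = 2
```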

  17. Cycle Cover (2)
  [Figure: a directed graph on vertices 1–7, illustrating a cycle cover.]

  18. How Close is #P to NP? 10.8
  • We've seen that NP ≤_Cook #P ≤_Cook PSPACE.
  • We'll now try to refine these bounds by showing that a counting problem in #P can be probabilistically approximated in polynomial time using an NP oracle.

  19. 10.9 Range Problems
  Def: A range problem Π is defined by two functions Π = (l,u), l,u: Σ* → ℕ, with l(x) ≤ u(x) for every x, such that on input x ∈ Σ* the problem is to find some t ∈ [l(x), u(x)].
  Def (strong range): For f: Σ* → ℕ and a polynomial p(.), we define the range problem StrongRange_p(f) = (l,u), where l(x) = (1 - 1/p(|x|))·f(x) and u(x) = (1 + 1/p(|x|))·f(x). Meaning that on input x ∈ Σ* the problem is to find t ∈ [(1 - 1/p(|x|))·f(x), (1 + 1/p(|x|))·f(x)].

  20. Range Problems (2)
  Prop: If we can approximate #SAT strongly, we can just as strongly approximate any f in #P, i.e. StrongRange_p(f) ≤_Cook StrongRange_p(#SAT).
  Proof: As we have seen, for every f = f_R in #P there is a reduction φ_f that is parsimonious with respect to R and R_SAT, so #SAT(φ_f(x)) = f(x). We may assume that |φ_f(x)| ≥ |x|, because we can always pad φ_f(x) with something that does not change the number of witnesses. We now use our oracle to StrongRange_p(#SAT) on φ_f(x), and get a result t that satisfies: (1 - 1/p(|φ_f(x)|))·f(x) ≤ t ≤ (1 + 1/p(|φ_f(x)|))·f(x). Since |φ_f(x)| ≥ |x|, t also lies in [(1 - 1/p(|x|))·f(x), (1 + 1/p(|x|))·f(x)], as required.

  21. Range Problems (3)
  Def (constant range): For f: Σ* → ℕ and a constant c > 0, we define the range problem ConstantRange_c(f) = (l,u), where: l(x) = f(x)/c, u(x) = c·f(x).
  Def (weak range): For f: Σ* → ℕ and a constant ε > 0, we define the range problem WeakRange_ε(f) = (l,u), where: l(x) = (1-ε)·f(x), u(x) = (1+ε)·f(x).

  22. Reminder of the plan: StrongRange_poly(#P) ≤_Cook StrongRange_poly(#SAT) ≤_Cook WeakRange_ε(#SAT) ≤_Cook ConstantRange_64(#SAT) ≤_Cook Gap8#SAT ≤_prob.Cook SAT ≤_Cook Gap8#SAT' ≤_prob.Cook fewSAT ≤_Cook uniqueSAT
  Range Problems (4)
  Prop: For every polynomial p(.) and constant 0 < ε < 1: StrongRange_p(#SAT) ≤_Cook WeakRange_ε(#SAT).
  Proof:
  • Given φ, a Boolean formula on variables x_1,…,x_n, we define a polynomial q(.) (depending on p and ε), and build φ' = φ_1 ∧ φ_2 ∧ … ∧ φ_q(|φ|), where each φ_i is a copy of φ on a distinct copy of the variables of φ.
  • Obviously #SAT(φ') = (#SAT(φ))^q(|φ|).
  • Notice that |φ'| ≤ 2·|φ|·q(|φ|).

  23. Reminder of the plan: StrongRange_poly(#P) ≤_Cook StrongRange_poly(#SAT) ≤_Cook WeakRange_ε(#SAT) ≤_Cook ConstantRange_64(#SAT) ≤_Cook Gap8#SAT ≤_prob.Cook SAT ≤_Cook Gap8#SAT' ≤_prob.Cook fewSAT ≤_Cook uniqueSAT
  Range Problems (5)
  Claim: For every 0 < ε < 1 and c > 0: WeakRange_ε(#SAT) ≤_Cook ConstantRange_c(#SAT).
  Proof: For a large enough (constant) number k of copies, c^(1/k) ≤ 1 + ε; applying the copying construction of the previous proposition with k copies, the k-th root of a factor-c approximation of #SAT(φ') = (#SAT(φ))^k is within a factor of (1±ε) of #SAT(φ).

  24. Range Problems (6)
  • Now, assuming we have an oracle to WeakRange_ε(#SAT), we can call it on φ' to get a value t' with: (1-ε)·(#SAT(φ))^q(|φ|) ≤ t' ≤ (1+ε)·(#SAT(φ))^q(|φ|).
  • Our result would be t = t'^(1/q(|φ|)). Then we have: (1-ε)^(1/q(|φ|))·#SAT(φ) ≤ t ≤ (1+ε)^(1/q(|φ|))·#SAT(φ), and for a suitable choice of q(.) (large enough that (1±ε)^(1/q(|φ|)) lies within 1 ± 1/p(|φ|)) this is inside the strong range.
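A tiny numerical sketch (with made-up numbers) of why the q-th root turns a weak, constant-factor-style error into a very tight one:

```python
# Pretend #SAT(phi) = N, and phi' is q independent copies, so #SAT(phi') = N**q.
# A weak-range answer t' may be off by a factor (1 + eps); after a q-th root the
# error factor shrinks to (1 + eps)**(1/q), which tends to 1 as q grows.
N, q, eps = 1000, 50, 0.5
t_prime = (1 + eps) * N ** q      # worst-case weak-range answer for phi'
t = t_prime ** (1.0 / q)          # our estimate of N = #SAT(phi)
print(t / N)                      # about 1.008, i.e. far tighter than 1.5
```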

  25. Promise Problems 10.10
  Def: A promise problem Π = (Π_Y, Π_N), where Π_Y, Π_N ⊆ Σ* and Π_Y ∩ Π_N = ∅, is the question: given x ∈ Promise(Π) = Π_Y ∪ Π_N, decide whether x ∈ Π_Y.
  Def (Gap8#SAT): The promise problem Gap8#SAT = (Gap8#SAT_Y, Gap8#SAT_N), where
  Gap8#SAT_Y = { (φ,K) : #SAT(φ) > 8K }
  Gap8#SAT_N = { (φ,K) : #SAT(φ) < K/8 }

  26. Reminder of the plan: StrongRange_poly(#P) ≤_Cook StrongRange_poly(#SAT) ≤_Cook WeakRange_ε(#SAT) ≤_Cook ConstantRange_64(#SAT) ≤_Cook Gap8#SAT ≤_prob.Cook SAT ≤_Cook Gap8#SAT' ≤_prob.Cook fewSAT ≤_Cook uniqueSAT
  Promise Problems (2)
  Prop: ConstantRange_64(#SAT) ≤_Cook Gap8#SAT.
  Proof: We run the following algorithm on input φ:
  • i = 0
  • while Gap8#SAT answers YES on (φ, 8^i) do i = i + 1
  • return 8^(i - ½)
  Denote α = log_8(#SAT(φ)). For all i < α - 1 we have #SAT(φ) > 8·8^i, so (φ, 8^i) ∈ Gap8#SAT_Y. Therefore, the algorithm will increment such an i and will not stop there.

  27. Promise Problems (3)
  For all i > α + 1 we have #SAT(φ) < (1/8)·8^i, so (φ, 8^i) ∈ Gap8#SAT_N. Therefore, the algorithm must stop at the first such i or before it. Thus, the algorithm stops when i ≤ α + 2.
  Thus, the result 8^(k-½) (where k is the final value of i) satisfies: α - 1 ≤ k ≤ α + 2 ⟹ α - 2 < k - ½ < α + 2 ⟹ #SAT(φ)/64 < 8^(k-½) < 64·#SAT(φ).
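A sketch of this algorithm in Python, assuming a hypothetical oracle gap8_sat(phi, K) that answers the promise problem correctly on promise instances (and arbitrarily otherwise); phi is treated as an opaque object here.

```python
def approx_sharp_sat(phi, num_vars, gap8_sat):
    """Return a value within a factor of 64 of #SAT(phi), for satisfiable phi,
    using an oracle for Gap8#SAT.

    gap8_sat(phi, K) must answer YES whenever #SAT(phi) > 8*K and NO whenever
    #SAT(phi) < K/8; in between, its answer may be arbitrary."""
    i = 0
    while i <= num_vars and gap8_sat(phi, 8 ** i):
        i += 1
    return 8 ** (i - 0.5)   # lies strictly between #SAT(phi)/64 and 64*#SAT(phi)
```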

  28. Promise Problems (4)
  Comment (amplification): For every promise problem P, and machine M that satisfies Pr[M(x) = P(x)] ≥ 2/3 for every x ∈ Promise(P), if we run M on x ∈ Promise(P) O(n) times, then the majority of the results will equal P(x) with probability greater than 1 - 2^(-n).
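A minimal simulation of this amplification comment; the 2/3-correct machine is modeled as a biased coin whose correct answer is True (all names made up):

```python
import random
from collections import Counter

def amplified(machine, x, n):
    """Run a (2/3)-correct probabilistic machine O(n) times on x and return
    the majority answer; by a Chernoff bound the error drops to 2**(-Omega(n))."""
    runs = [machine(x) for _ in range(18 * n + 1)]   # odd, so no ties
    return Counter(runs).most_common(1)[0][0]

noisy = lambda x: random.random() < 2 / 3   # correct answer True w.p. 2/3
print(amplified(noisy, None, 20))           # True, except with tiny probability
```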

  29. Promise Problems (5)
  Prop: Given a problem P and a promise problem Q such that P ≤_Cook Q, if we have a probabilistic machine Q' s.t. Pr[Q'(x) = Q(x)] ≥ 2/3 for every x ∈ Promise(Q), then for every polynomial p(.) we have a probabilistic polynomial-time machine M that uses an oracle to Q' s.t. Pr[M^Q'(y) is a solution of P on input y] > 1 - 2^(-p(|y|)).
  Proof:
  • Since the reduction from P to Q is polynomial, there exists a polynomial q(.) s.t. the oracle Q is called fewer than q(|y|) times.
  • We can amplify the probability of success of Q' to 1 - 2^(-p(|y|))/q(|y|) (by the previous comment).
  • The probability of M being correct is at least the probability that all the oracle calls are correct, which by a union bound is greater than 1 - 2^(-p(|y|)).

  30. 10.11 Probabilistic Cook Reductions
  Def (probabilistic Cook reduction): Given promise problems P and Q, we say that there is a probabilistic Cook reduction from P to Q if there is a probabilistic polynomial-time oracle machine M that uses Q as an oracle and satisfies, for every x ∈ Promise(P): Pr[M^Q(x) = P(x)] > 2/3 [where M^Q(x) denotes the computation of machine M on input x when given oracle access to Q].
  Notice that in this definition the oracle has no probability of error, but using amplification we can show that an oracle with bounded error probability can be used as well.

  31. Universal_2 Hashing 10.12
  Def (universal_2 hashing): A family of functions H_n,m mapping {0,1}^n to {0,1}^m is called universal_2 if, for a uniformly selected h in H_n,m, the random variables {h(e)}_{e ∈ {0,1}^n} are pairwise independent and each uniformly distributed over {0,1}^m.
  • An efficient construction of such a family should have polynomial-time algorithms for selecting and evaluating functions in the family.

  32. Universal_2 Hashing (2)
  • A popular example is the family of all affine transformations from {0,1}^n to {0,1}^m. That is, all functions of the form h_A,b(x) = Ax + b, where A is an m×n {0,1} matrix, b is an m-dimensional {0,1} vector, and the arithmetic is modulo 2.
  Lemma (leftover hash): Let H_n,m be a family of universal_2 hash functions mapping {0,1}^n to {0,1}^m, and let ε > 0. Let S ⊆ {0,1}^n be arbitrary, provided that |S| ≥ ε^(-3)·2^m. Then: Pr_h[ | |{e ∈ S : h(e) = 0^m}| - |S|/2^m | > ε·|S|/2^m ] < ε.
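A sketch of this affine family in Python; both selecting h (choosing A and b at random) and evaluating h(x) take polynomial time. Bit-vectors are tuples of 0/1, and the function names are made up.

```python
import random

def random_affine_hash(n, m):
    """Pick h(x) = Ax + b over GF(2) uniformly: A is a random m x n {0,1}
    matrix and b a random m-bit vector; the resulting family is universal_2."""
    A = [[random.randrange(2) for _ in range(n)] for _ in range(m)]
    b = [random.randrange(2) for _ in range(m)]
    def h(x):   # x is an n-tuple of 0/1
        return tuple((sum(A[i][j] * x[j] for j in range(n)) + b[i]) % 2
                     for i in range(m))
    return h

h = random_affine_hash(8, 3)
print(h((1, 0, 1, 1, 0, 0, 1, 0)))   # some value in {0,1}^3
```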

  33. Universal_2 Hashing (3)
  Proof: Define, for each e ∈ S, a random variable X_e:
  X_e = 1 if h(e) = 0^m, and 0 otherwise.
  For each e_1 ≠ e_2, X_e1 and X_e2 are independent (pending homework assignment).
  Now define the random variable Y = Σ_{e ∈ S} X_e, the number of elements of S mapped to 0^m.

  34. Universal_2 Hashing (4)
  Since the X's are pairwise independent we get: E[Y] = |S|/2^m and Var[Y] = Σ_{e ∈ S} Var[X_e] ≤ |S|/2^m.
  By Chebyshev's inequality we get: Pr[ |Y - |S|/2^m| > ε·|S|/2^m ] ≤ Var[Y]/(ε·|S|/2^m)^2 ≤ 2^m/(ε^2·|S|) ≤ ε.

  35. Reminder of the plan: StrongRange_poly(#P) ≤_Cook StrongRange_poly(#SAT) ≤_Cook WeakRange_ε(#SAT) ≤_Cook ConstantRange_64(#SAT) ≤_Cook Gap8#SAT ≤_prob.Cook SAT ≤_Cook Gap8#SAT' ≤_prob.Cook fewSAT ≤_Cook uniqueSAT
  Gap8#SAT reduces to SAT 10.13
  Thm: Gap8#SAT ≤_R SAT (a probabilistic Cook reduction).
  Proof: We construct a probabilistic polynomial-time machine M which is given oracle access to SAT. On input (φ, 2^m), where φ has n variables, M operates as follows:
  • Select uniformly h ∈ H_n,m = {affine transformations from {0,1}^n to {0,1}^m}. The function h is represented by a {0,1} matrix A_m×n = (a_ij) and a {0,1} vector b = (b_i)_{i=1,…,m}.

  36. Gap8#SAT reduces to SAT (2)
  • We construct a formula φ_h on variables x_1,…,x_n, y_1,…,y_t such that for every x ∈ {0,1}^n:
  • h(x) = 0^m ⟺ there exists an assignment to the y_i's s.t. φ_h is true;
  • moreover, if h(x) = 0^m, there is only one such assignment to the y_i's.
  Now: h(x) = 0^m ⟺ for every i ∈ {1,…,m}: a_i1·x_1 ⊕ … ⊕ a_in·x_n ⊕ b_i = 0.
  The condition on the right can be transformed into a CNF formula (using the additional variables y_1,…,y_t) which has the above properties.

  37. Gap8#SAT reduces to SAT (3)
  • Define φ' = φ ∧ φ_h. Use the oracle to SAT on φ', and return its answer.
  Claim 1: If (φ, 2^m) ∈ Gap8#SAT_Y then φ' ∈ SAT with probability greater than ½.
  Proof: We define S = { x : φ(x) = 1 }. |S| > 8·2^m because (φ, 2^m) ∈ Gap8#SAT_Y.
  Pr_h[φ' ∈ SAT] = Pr_h[{x : φ(x) = 1 ∧ h(x) = 0^m} ≠ ∅] = Pr_h[{x ∈ S : h(x) = 0^m} ≠ ∅] ≥ Pr_h[|{x ∈ S : h(x) = 0^m}| ≥ (1 - ½)·|S|/2^m] > ½, by the Leftover Hash Lemma with ε = ½.

  38. Gap8#SAT reduces to SAT (4)
  Claim 2: If (φ, 2^m) ∈ Gap8#SAT_N then φ' ∈ SAT with probability less than ⅛.
  Proof: We define S = { x : φ(x) = 1 }. |S| < 2^m/8 because (φ, 2^m) ∈ Gap8#SAT_N.
  Pr_h[φ' ∈ SAT] = Pr_h[{x ∈ S : h(x) = 0^m} ≠ ∅] ≤ Σ_{x ∈ S} Pr_h[h(x) = 0^m] = |S|·2^(-m) < (2^m/8)·2^(-m) = ⅛.
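A brute-force sanity check of the two claims (made-up parameters; exponential, for intuition only). Here S stands for the set of satisfying assignments of φ, and we estimate the probability over a random affine h (the same construction as in the hashing sketch above, repeated here for self-containment) that some x ∈ S hashes to 0^m:

```python
import itertools, random

def random_affine_hash(n, m):               # h(x) = Ax + b over GF(2)
    A = [[random.randrange(2) for _ in range(n)] for _ in range(m)]
    b = [random.randrange(2) for _ in range(m)]
    return lambda x: tuple((sum(A[i][j] * x[j] for j in range(n)) + b[i]) % 2
                           for i in range(m))

def hit_probability(S, n, m, trials=500):
    """Estimate Pr_h[ some x in S has h(x) = 0^m ] over a random affine h."""
    zero = (0,) * m
    hits = 0
    for _ in range(trials):
        h = random_affine_hash(n, m)
        if any(h(x) == zero for x in S):
            hits += 1
    return hits / trials

n, m = 9, 5
points = list(itertools.product((0, 1), repeat=n))
S_yes = random.sample(points, 9 * 2 ** m)        # |S| > 8 * 2^m  (YES side)
S_no = random.sample(points, 2 ** m // 8 - 1)    # |S| < 2^m / 8  (NO side)
print(hit_probability(S_yes, n, m))   # comfortably above 1/2 (in fact near 1)
print(hit_probability(S_no, n, m))    # below 1/8 (true probability <= 3/32)
```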

  39. Gap8#SAT reduces to SAT (5)
  [Figure: h hashes {0,1}^n down to {0,1}^m; only the x's mapped to 0^m survive in φ'.]

  40. Reminder of the plan: StrongRange_poly(#P) ≤_Cook StrongRange_poly(#SAT) ≤_Cook WeakRange_ε(#SAT) ≤_Cook ConstantRange_64(#SAT) ≤_Cook Gap8#SAT ≤_prob.Cook SAT ≤_Cook Gap8#SAT' ≤_prob.Cook fewSAT ≤_Cook uniqueSAT
  uniqueSAT 10.14
  We will now see a more restricted version of Gap8#SAT called uniqueSAT, and show that SAT probabilistically Cook-reduces to uniqueSAT.
  Def: Gap8#SAT' is the promise problem on input pairs (φ, k) defined by:
  Gap8#SAT'_Y = { (φ,k) : 8k < #SAT(φ) < 32k }
  Gap8#SAT'_N = { (φ,k) : #SAT(φ) < k }
  Claim: SAT Cook-reduces to Gap8#SAT'.
  Proof: Given φ, we will create φ' s.t. #SAT(φ') = 15·#SAT(φ), using 4 new variables x_1,…,x_4 that do not appear in φ.

  41. uniqueSAT (2)
  • We define: φ' = φ ∧ (x_1 ∨ x_2 ∨ x_3 ∨ x_4).
  • Notice that: #SAT(φ') ≥ 15 ⟺ φ ∈ SAT, and #SAT(φ') = 0 ⟺ φ ∉ SAT.
  • We call the oracle on Gap8#SAT'(φ', 2^i) for every 0 ≤ i ≤ |Variables(φ')|.
  • One of the answers is YES iff φ ∈ SAT:
  • If φ ∉ SAT then all the queries are NO-instances, so all the answers are NO.
  • If φ ∈ SAT then #SAT(φ') ≥ 15, and therefore log_2(#SAT(φ')) ≥ log_2(15) > 3. So there exists i ≥ 0 s.t. i < log_2(#SAT(φ')) - 3 < i + 2 ⟺ 8·2^i < #SAT(φ') < 32·2^i, and for that i we get a YES answer.
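A sketch of this Cook reduction, assuming a hypothetical oracle gap8_sat_prime(psi, k) for Gap8#SAT' (correct on promise instances, arbitrary otherwise), and that phi_prime and its variable count come from the construction above:

```python
def sat_via_gap_oracle(phi_prime, num_vars, gap8_sat_prime):
    """Decide whether phi is satisfiable, given phi' with
    #SAT(phi') = 15 * #SAT(phi) and num_vars = |Variables(phi')|.

    If phi is unsatisfiable, every query (phi', 2**i) is a NO-instance, so all
    answers are NO.  If phi is satisfiable, #SAT(phi') >= 15 > 8, so some i
    satisfies 8 * 2**i < #SAT(phi') < 32 * 2**i and that query answers YES."""
    return any(gap8_sat_prime(phi_prime, 2 ** i) for i in range(num_vars + 1))
```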

  42. Reminder of the plan: StrongRange_poly(#P) ≤_Cook StrongRange_poly(#SAT) ≤_Cook WeakRange_ε(#SAT) ≤_Cook ConstantRange_64(#SAT) ≤_Cook Gap8#SAT ≤_prob.Cook SAT ≤_Cook Gap8#SAT' ≤_prob.Cook fewSAT ≤_Cook uniqueSAT
  uniqueSAT (3)
  Def: fewSAT is the following promise problem:
  fewSAT_Y = { φ : 1 ≤ #SAT(φ) < 100 }
  fewSAT_N = { φ : #SAT(φ) = 0 }
  Prop: Gap8#SAT' probabilistically Cook-reduces to fewSAT.
  The proof uses the same hashing reduction used when proving that Gap8#SAT probabilistically Cook-reduces to SAT.

  43. Reminder of the plan: StrongRange_poly(#P) ≤_Cook StrongRange_poly(#SAT) ≤_Cook WeakRange_ε(#SAT) ≤_Cook ConstantRange_64(#SAT) ≤_Cook Gap8#SAT ≤_prob.Cook SAT ≤_Cook Gap8#SAT' ≤_prob.Cook fewSAT ≤_Cook uniqueSAT
  uniqueSAT (4)
  Def: uniqueSAT is the following promise problem:
  uniqueSAT_Y = { φ : #SAT(φ) = 1 }
  uniqueSAT_N = { φ : #SAT(φ) = 0 }
  Prop: fewSAT Cook-reduces to uniqueSAT.
  Proof: Given a formula φ, we construct formulas φ_i for 1 ≤ i < 100 in the following way: φ_i = φ^(1) ∧ φ^(2) ∧ … ∧ φ^(i), i.e. i copies of φ, each on a separate set of variables.

  44. uniqueSAT (5)
  Now, if φ ∈ SAT, then so is φ_i. Every satisfying assignment of φ_i corresponds to a choice of i satisfying assignments of φ (one per copy), but these assignments are not necessarily different. So we define formulas ψ_i that force them to be different, and even impose a lexicographic order on the i solutions. Thus, ψ_i has a unique satisfying assignment iff φ has exactly i satisfying assignments. The reduction queries the uniqueSAT oracle on ψ_1,…,ψ_99 and answers YES iff at least one of the answers is YES.
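A small sketch of the counting behind this last step, under the assumption that ψ_i is built so that the i copies must carry pairwise-distinct, lexicographically increasing satisfying assignments of φ; its satisfying assignments then correspond to i-element subsets of φ's solutions:

```python
from itertools import combinations
from math import comb

def num_sat_psi(solutions_of_phi, i):
    """With the ordering constraints, the satisfying assignments of psi_i are
    exactly the strictly increasing i-tuples of solutions of phi, i.e. there
    are C(#SAT(phi), i) of them."""
    return sum(1 for _ in combinations(sorted(solutions_of_phi), i))

sols = ["001", "011", "110"]          # pretend phi has exactly these 3 solutions
for i in range(1, 6):
    print(i, num_sat_psi(sols, i), comb(len(sols), i))
# psi_i is uniquely satisfiable exactly when i equals #SAT(phi) (= 3 here)
```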

  45. The End 10.15
  Let's look at what we've done:
  StrongRange_poly(#P) ≤_Cook StrongRange_poly(#SAT) ≤_Cook WeakRange_ε(#SAT) ≤_Cook ConstantRange_64(#SAT) ≤_Cook Gap8#SAT ≤_prob.Cook SAT ≤_Cook Gap8#SAT' ≤_prob.Cook fewSAT ≤_Cook uniqueSAT
