
Soft constraints








  1. Soft constraints

  2. Classical (hard) Constraints • Variables {X1,…,Xn} = X • Domains {D(X1),…,D(Xn)} = D • Constraints: each constraint c connects a subset of X and is a subset of the Cartesian product of the domains of its variables (allowed combinations) • Example: graph coloring • Variables: nodes of the graph • Domains: sets of colours • Constraints: two variables, set of pairs of colours (different colours)

  3. Classical constraints are useful for ... • resource allocation • scheduling • vehicle routing • car sequencing • timetabling • graph coloring • VLSI design • vision • robotics • ... However, in most real-life situations we need to express fuzziness, possibilities, preferences, probabilities, costs, ...

  4. Why is this not enough? • Over-constrained problems • it does not make sense to just say that there is no solution • Optimization problems • also multi-criteria ones • Problems with both preferences and hard statements • e.g. time-tabling (hard constraints about space and time, soft constraints about teachers) • Problems with uncertainty

  5. A problem with preferences • The problem: to decide what to eat at a restaurant, given some choices for drinks and dishes • The client has some preferences over drinks, dishes, and their combinations • Examples: • White wine is the most preferred drink • Pizza is the most preferred dish • Fish is less preferred • white wine and fish is the most preferred combination • red wine and meat is less preferred • We look for a combination which “maximizes” the overall preference of the client

  6. Soft Constraints: the C-semiring framework • Variables {X1,…,Xn} = X • Domains {D(X1),…,D(Xn)} = D • C-semiring <A,+,x,0,1>: • A set of preferences • + additive operator, inducing the ordering: a ≥ b iff a + b = a (idempotent, commutative, associative, unit element 0) • x multiplicative operator: combines preferences (commutative, associative, unit element 1, absorbing element 0) • 0, 1 respectively the bottom and top element • x distributes over + [Bistarelli, Montanari, Rossi, IJCAI 1995]

  7. Soft Constraints: the C-semiring framework • + and x are monotone on ≤ • for all a,b in A, a x b ≤ a • for all a in A, 0 ≤ a ≤ 1 • Lattice <A,≤> • + is the lub • x is the glb if x is idempotent • When ≤ is a total order and x is idempotent: • if a ≤ b, then a + b = b and a x b = a, thus + = max and x = min
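The c-semiring structure and its induced ordering can be sketched in Python; the class and instance names below are illustrative assumptions, not from an existing library.

```python
# Sketch of the c-semiring <A,+,x,0,1> from the slides; names are
# illustrative assumptions, not from an existing library.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass(frozen=True)
class CSemiring:
    plus: Callable[[Any, Any], Any]   # additive op: induces the ordering
    times: Callable[[Any, Any], Any]  # multiplicative op: combines preferences
    zero: Any                         # bottom element (worst preference)
    one: Any                          # top element (best preference)

    def geq(self, a: Any, b: Any) -> bool:
        # the induced ordering: a >= b iff a + b = a
        return self.plus(a, b) == a

# Fuzzy constraints: <[0,1], max, min, 0, 1>
fuzzy = CSemiring(max, min, 0.0, 1.0)
# Weighted constraints: <N U {+inf}, min, +, +inf, 0> (lower cost is better)
weighted = CSemiring(min, lambda a, b: a + b, float("inf"), 0)

print(fuzzy.geq(0.9, 0.2))   # higher preference wins under + = max
print(weighted.geq(3, 7))    # lower cost wins under + = min
```

Note that for the weighted semiring the bottom element is +∞ (worst possible cost) and the top element is cost 0, matching the instances slide.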

  8. Soft constraints • Soft constraint: a pair c=<f,con> where: • Scope: con = {Xc1,…,Xck}, a subset of X • Preference function: f: D(Xc1) x … x D(Xck) → A, mapping each tuple (v1,…,vk) to a preference p • Hard constraint: a soft constraint where, for each tuple (v1,…,vk), either f(v1,…,vk) = 1 (the tuple is allowed) or f(v1,…,vk) = 0 (the tuple is forbidden)

  9. Instances of soft constraints • Each instance is characterized by a c-semiring <A, +, x, 0, 1> • Classical constraints: <{0,1},or,and,0,1> • Fuzzy constraints: <[0,1],max,min,0,1> • Lexicographic CSPs: <[0,1]k,lex-max,min,0k,1k> • Weighted constraints (N): <N ∪ {+∞},min,+,+∞,0> • Weighted constraints (R): <R+ ∪ {+∞},min,+,+∞,0> • Max CSP: weight = 1 when the constraint is not satisfied and 0 when it is satisfied • Probabilistic constraints: <[0,1],max,x,0,1> • Valued CSPs: any semiring with a total order • Multicriteria problems: Cartesian product of semirings

  10. Example of hard constraint • The fast VGA is compatible only with at least 512MB of memory • VGA ∈ {s=slow, f=fast}, MB ∈ {256, 512, 1024} • <s,256> → 1, <s,512> → 1, <s,1024> → 1, <f,256> → 0, <f,512> → 1, <f,1024> → 1

  11. Example of soft (fuzzy) constraint • The VGA performs better with more memory • VGA ∈ {s=slow, f=fast}, MB ∈ {256, 512, 1024} • <s,256> → 0.6, <s,512> → 0.7, <s,1024> → 0.9, <f,256> → 0.1, <f,512> → 0.9, <f,1024> → 1 • Soft constraints are a quantitative formalism

  12. Solution ordering • A soft CSP induces an ordering over the solutions, from the ordering of the semiring • Totally ordered semiring ⇒ total order over solutions (possibly with ties) • Partially ordered semiring ⇒ total or partial order over solutions (possibly with ties) • Any ordering can be obtained

  13. Expressive power • A → B iff from a problem P in A it is possible to build in polynomial time a problem P’ in B s.t. the optimal solutions are the same (but not necessarily the solution ordering!) • A ⇒ B iff from a problem P in A it is possible to build in polynomial time a problem P’ in B s.t. opt(P’) ⊆ opt(P)

  14. Expressive power • [Diagram: hierarchy of the formalisms by expressive power — Classical, Fuzzy, Lexicographic, weighted (N), weighted (R), Prob, Valued, Semiring-based]

  15. Projection • If c=<f,con>, then c|I = <f’, I ∩ con> • f’(t’) = +(f(t)) over the tuples of values t s.t. t|I ∩ con = t’ • Example (fuzzy constraints: <[0,1],max,min,0,1>): VGA ∈ {s=slow, f=fast}, MB ∈ {256, 512, 1024}, with <s,256> → 0.6, <s,512> → 0.7, <s,1024> → 0.9, <f,256> → 0.1, <f,512> → 0.9, <f,1024> → 1 • Projection over VGA: • s → max(0.6,0.7,0.9) = 0.9 • f → max(0.1,0.9,1) = 1
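A minimal sketch of this projection on the slide's fuzzy VGA/MB constraint (+ = max); the helper name is an assumption for illustration.

```python
# Projection of the fuzzy VGA/MB constraint over VGA, with + = max.
# Preference values are the ones on the slide.
c = {("s", 256): 0.6, ("s", 512): 0.7, ("s", 1024): 0.9,
     ("f", 256): 0.1, ("f", 512): 0.9, ("f", 1024): 1.0}

def project_first(constraint):
    # f'(t') = +(f(t)) over all tuples t extending t'; here + = max
    out = {}
    for (v, _), p in constraint.items():
        out[v] = max(out.get(v, 0.0), p)
    return out

print(project_first(c))  # s -> 0.9, f -> 1.0, as computed on the slide
```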

  16. Combination • If ci=<fi,coni>, then c1 x c2 = <f, con1 ∪ con2> • f(t) = f1(t|con1) x f2(t|con2) • Example: cVGA: s=slow → 0.2, f=fast → 0.9 • cMB: 256 → 0.5, 512 → 0.8, 1024 → 0.7 • cP: P4 → 0.7, AMD → 0.4 • cVGA,MB: <s,256> → 0.6, <s,512> → 0.7, <s,1024> → 0.9, <f,256> → 0.1, <f,512> → 0.9, <f,1024> → 1 • cMB,P: <256,P4> → 0.5, <256,AMD> → 0.5, <512,P4> → 0.7, <512,AMD> → 0.5, <1024,P4> → 0.9, <1024,AMD> → 0.5 • f(s,256,P4) = min(0.2,0.6,0.5,0.5) = 0.2 • f(f,1024,P4) = min(0.9,1,0.7,0.9,0.7) = 0.7

  17. Solutions • Sol(C) = x(C), the combination of all the constraints in C • an element of A for each tuple of elements of D • val(t) = f1(t|con1) x ... x fk(t|conk) • Example (same constraints as on the previous slide): • val(s,256,P4) = min(0.2,0.6,0.5,0.5) = 0.2 • val(f,1024,P4) = min(0.9,1,0.7,0.9,0.7) = 0.7
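Under the fuzzy semiring (x = min), the solution value val(t) of the slides' VGA/MB/P problem can be computed directly; the variable names below are illustrative.

```python
# All fuzzy constraints of the VGA/MB/P example, with x = min; the
# numbers reproduce the slide's values.
c_vga = {"s": 0.2, "f": 0.9}
c_mb = {256: 0.5, 512: 0.8, 1024: 0.7}
c_p = {"P4": 0.7, "AMD": 0.4}
c_vga_mb = {("s", 256): 0.6, ("s", 512): 0.7, ("s", 1024): 0.9,
            ("f", 256): 0.1, ("f", 512): 0.9, ("f", 1024): 1.0}
c_mb_p = {(256, "P4"): 0.5, (256, "AMD"): 0.5, (512, "P4"): 0.7,
          (512, "AMD"): 0.5, (1024, "P4"): 0.9, (1024, "AMD"): 0.5}

def val(vga, mb, p):
    # val(t) = f1(t|con1) x ... x fk(t|conk), with x = min for fuzzy
    return min(c_vga[vga], c_mb[mb], c_p[p],
               c_vga_mb[(vga, mb)], c_mb_p[(mb, p)])

print(val("s", 256, "P4"))   # 0.2, as on the slide
print(val("f", 1024, "P4"))  # 0.7, as on the slide
```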

  18. Multi-criteria problems • Main idea: one semiring for each criterion • Given n c-semirings Si = <Ai, +i, xi, 0i, 1i>, we can build the c-semiring <<A1,...,An>, +, x, <01,...,0n>, <11,...,1n>> • + and x obtained by pointwise application of +i and xi on each semiring • A tuple of values associated with each variable instantiation • A partial order, even if all the criteria are totally ordered

  19. Example • The problem: choosing a route between two cities • Each piece of highway has a preference and a cost • We want to both minimize the sum of the costs and maximize the preference • Semiring: obtained by putting together one fuzzy semiring and one weighted semiring: • <[0,1],max,min,0,1> • <N, min, +, +∞, 0> • Best solutions: routes such that there is no other route with a better semiring value • <0.8,$10> is better than <0.7,$15> • Two total orders, but the resulting order is partial: • <0.6,$10> and <0.4,$5> are not comparable
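The dominance test induced by the pointwise product of the fuzzy and weighted semirings can be sketched as follows (the function name is an illustrative assumption): one pair dominates another iff it is at least as good on both criteria.

```python
# Dominance in the pointwise product of a fuzzy and a weighted semiring,
# as in the route example: pairs <preference, cost>; higher preference
# and lower cost are better.
def dominates_or_equal(a, b):
    pref_a, cost_a = a
    pref_b, cost_b = b
    return pref_a >= pref_b and cost_a <= cost_b

print(dominates_or_equal((0.8, 10), (0.7, 15)))   # True: better on both
# <0.6,$10> vs <0.4,$5>: neither dominates, so they are incomparable
print(dominates_or_equal((0.6, 10), (0.4, 5)))    # False
print(dominates_or_equal((0.4, 5), (0.6, 10)))    # False
```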

  20. Typical questions • Find an optimal solution • Difficult: NP-hard (branch and bound + constraint propagation) • Is t an optimal solution? • Difficult: NP-hard (we have to find the optimal preference level) • Is t better than t’? • Easy: Linear in the number of constraints (compute the two pref. levels and compare them)

  21. How to find optimal solutions • Classical constraints: • depth-first backtracking search • Constraint propagation • Variable elimination • Local search • ... • Is it possible to extend/adapt these techniques to soft constraints?

  22. Branch and bound • Same scheme as for optimization CSPs • Lower bound = preference of the best solution so far (0 at the beginning) • Upper bound for each node • If ub > lb ⇒ enter the subtree

  23. Upper bounds • Based on preferences • t = assignment to the past variables (P) • ub1 = x c(t), over the constraints c fully assigned by t • ub2 = x ub(cji), where xi in F (future variables), xj in P • ub(cji) = + cji(vj,a) over a in Di • ub3 = x (+ cij(a,b)) • x over xi, xj in F • + over a in Di, b in Dj • ub = ub1 x ub2 x ub3

  24. Upper bounds: weighted constraints (min, +) • t = assignment to the past variables • ub1 = + c(t), over the constraints c fully assigned by t • ub2 = + ub(cji), for xi in F, xj in P • ub(cji) = min cji(vj,a) over a in Di • ub3 = + (min cij(a,b)) • + over xi, xj in F • min over a in Di, b in Dj • ub = ub1 + ub2 + ub3

  25. Upper bounds: fuzzy constraints (max, min) • t = assignment to the past variables • ub1 = min c(t), over the constraints c fully assigned by t • ub2 = min ub(cji), for xi in F, xj in P • ub(cji) = max cji(vj,a) over a in Di • ub3 = min (max cij(a,b)) • min over xi, xj in F • max over a in Di, b in Dj • ub = min(ub1, ub2, ub3)
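A minimal branch and bound for fuzzy constraints (max-min) can be sketched on the slides' VGA/MB problem; for simplicity only ub1 (the preference of the constraints already fully assigned) is used as the bound, so the ub2/ub3 terms from the slides would prune even more. Function and data names are assumptions, not from a specific solver.

```python
# Branch and bound for fuzzy constraints: maximize the min over all
# constraint preferences. Only ub1 is used as the upper bound.
def fuzzy_bb(variables, domains, constraints):
    # constraints: list of (scope, table) with table[tuple of values] = pref
    best = (0.0, None)  # lb = preference of the best solution so far

    def ub1(assignment):
        p = 1.0
        for scope, table in constraints:
            if all(v in assignment for v in scope):
                p = min(p, table[tuple(assignment[v] for v in scope)])
        return p

    def search(i, assignment):
        nonlocal best
        if ub1(assignment) <= best[0]:
            return  # prune: enter the subtree only if ub > lb
        if i == len(variables):
            best = (ub1(assignment), dict(assignment))
            return
        x = variables[i]
        for v in domains[x]:
            assignment[x] = v
            search(i + 1, assignment)
            del assignment[x]

    search(0, {})
    return best

# The slides' fuzzy VGA/MB problem:
doms = {"VGA": ["s", "f"], "MB": [256, 512, 1024]}
cons = [(("VGA",), {("s",): 0.2, ("f",): 0.9}),
        (("MB",), {(256,): 0.5, (512,): 0.8, (1024,): 0.7}),
        (("VGA", "MB"), {("s", 256): 0.6, ("s", 512): 0.7, ("s", 1024): 0.9,
                         ("f", 256): 0.1, ("f", 512): 0.9, ("f", 1024): 1.0})]
print(fuzzy_bb(["VGA", "MB"], doms, cons))  # (0.8, {'VGA': 'f', 'MB': 512})
```

The optimum 0.8 at VGA=f, MB=512 matches the optimal solution reported on the later example slides.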

  26. Constraint propagation • Constraint propagation (e.g. arc-consistency): • Deletes an element a from the domain of a variable x if, according to a constraint between x and y, it does not have any compatible element b in the domain of y • Iterate until stability • Polynomial time • Very useful at each node of the search tree to prune subtrees

  27. Example • [Picture: two variables X and Y with domains {a,b}; the constraint between them allows only the pairs <a,a> and <a,b>] • No matter what the other constraints are, X=b cannot participate in any solution. So we can delete it without changing the set of solutions.

  28. Properties of GI algorithm • Equivalence: each step preserves the set of solutions • Termination (with finite domains) • Order-independence: no matter how constraints are chosen, when stability is achieved we always have the same problem

  29. Soft constraint propagation • Deleting a value means passing from 1 to 0 in the semiring <{0,1},or,and,0,1> • In general, constraint propagation can change preferences to lower values in the ordering • Soft arc-consistency: given cx, cxy, and cy, compute cx := (cx x cxy x cy)|x • Iterate until stability according to algorithm GI

  30. Example: fuzzy arc-consistency • cx := (cx x cxy x cy)|x • Fuzzy semiring = <[0,1],max,min,0,1> ⇒ + = max and x = min • cVGA: s=slow → 0.2, f=fast → 0.9 • cMB: 256 → 0.5, 512 → 0.8, 1024 → 0.7 • cVGA,MB: <s,256> → 0.6, <s,512> → 0.7, <s,1024> → 0.9, <f,256> → 0.1, <f,512> → 0.9, <f,1024> → 1 • VGA=s → max(min(0.2,0.6,0.5), min(0.2,0.7,0.8), min(0.2,0.9,0.7)) = max(0.2,0.2,0.2) = 0.2 • VGA=f → max(min(0.9,0.1,0.5), min(0.9,0.9,0.8), min(0.9,1,0.7)) = max(0.1,0.8,0.7) = 0.8
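The arc-consistency step cx := (cx x cxy x cy)|x on the slide's data can be checked with a short sketch (x = min, projection + = max); the helper name is an assumption.

```python
# Fuzzy arc-consistency revision of c_VGA against c_VGA,MB and c_MB,
# with the preference values from the slide.
c_vga = {"s": 0.2, "f": 0.9}
c_mb = {256: 0.5, 512: 0.8, 1024: 0.7}
c_vga_mb = {("s", 256): 0.6, ("s", 512): 0.7, ("s", 1024): 0.9,
            ("f", 256): 0.1, ("f", 512): 0.9, ("f", 1024): 1.0}

def revise(cx, cxy, cy):
    # cx := (cx x cxy x cy)|x  with x = min and + = max
    return {v: max(min(cx[v], cxy[(v, w)], cy[w]) for w in cy) for v in cx}

print(revise(c_vga, c_vga_mb, c_mb))  # s -> 0.2, f -> 0.8, as on the slide
```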

  31. Properties of soft GI • If x is idempotent (e.g. fuzzy, classical): • Equivalence • Termination • Order-independence • If x is not idempotent (e.g. weighted CSPs, prob.), we could count the same constraint more than once ⇒ we need to compensate by subtracting appropriate quantities somewhere else ⇒ we need an additional property (fairness = presence of a difference operator) • Equivalence • Termination • Not order-independence

  32. Branch and bound with soft constraint propagation • Soft constraint propagation lowers the preferences • ub1, ub2, ub3 can be lower • more probable that ub < lb • more probable to prune

  33. Cuts • Given a soft problem P, and given a level α in the semiring, build Pα: • Same graph as P • In each constraint, keep only the tuples with preference > α

  34. Example • P0.5 (keep only the tuples with preference > 0.5): • cVGA: s (0.2) → 0, f (0.9) → 1 • cMB: 256 (0.5) → 0, 512 (0.8) → 1, 1024 (0.7) → 1 • cVGA,MB: <s,256> (0.6) → 1, <s,512> (0.7) → 1, <s,1024> (0.9) → 1, <f,256> (0.1) → 0, <f,512> (0.9) → 1, <f,1024> (1) → 1

  35. Properties • Let t be an optimal solution of P with preference v • For each α < v, Pα is consistent and t is a solution of Pα • For each α ≥ v, Pα is inconsistent • These properties hold if x is idempotent

  36. Solution method • To find the optimal solutions of P: • Find the consistent Pα such that, for each β > α, Pβ is inconsistent • The solutions of Pα are the optimal solutions of P • It suffices to check the Pα with α occurring in P ⇒ O(kd²) calls to a classical constraint solver • k = number of constraints • d = domain size • Binary search ⇒ O(log(kd²)) calls
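The cut-based method can be sketched on the slides' fuzzy VGA/MB problem: scan the preference levels occurring in P from best to worst and stop at the first consistent cut (helper names are assumptions for illustration).

```python
# Alpha-cut search on the fuzzy VGA/MB problem from the slides.
c_vga = {"s": 0.2, "f": 0.9}
c_mb = {256: 0.5, 512: 0.8, 1024: 0.7}
c_vga_mb = {("s", 256): 0.6, ("s", 512): 0.7, ("s", 1024): 0.9,
            ("f", 256): 0.1, ("f", 512): 0.9, ("f", 1024): 1.0}

def cut_solutions(alpha):
    # solutions of P_alpha: every constraint preference must exceed alpha
    return [(v, m) for v in c_vga for m in c_mb
            if min(c_vga[v], c_mb[m], c_vga_mb[(v, m)]) > alpha]

# only the alphas occurring in P need to be checked, from best to worst
levels = sorted({*c_vga.values(), *c_mb.values(), *c_vga_mb.values()},
                reverse=True)
for alpha in levels:
    sols = cut_solutions(alpha)
    if sols:
        break  # first consistent cut: its solutions are optimal in P

print(alpha, sols)  # 0.7 [('f', 512)]: only VGA=f, MB=512, as in P0.7
```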

  37. Example • Optimal solution: VGA=f, MB=512 (val = 0.8) • P0.5: cVGA: s → 0, f → 1 • cMB: 256 → 0, 512 → 1, 1024 → 1 • cVGA,MB: <s,256> → 1, <s,512> → 1, <s,1024> → 1, <f,256> → 0, <f,512> → 1, <f,1024> → 1 • VGA=f,MB=512 is a solution, but also VGA=f,MB=1024 is a solution

  38. Example • Optimal solution: VGA=f, MB=512 (val = 0.8) • P0.6: cVGA: s → 0, f → 1 • cMB: 256 → 0, 512 → 1, 1024 → 1 • cVGA,MB: <s,256> → 0, <s,512> → 1, <s,1024> → 1, <f,256> → 0, <f,512> → 1, <f,1024> → 1 • VGA=f,MB=512 is a solution, but also VGA=f,MB=1024 is a solution

  39. Example • Optimal solution: VGA=f, MB=512 (val = 0.8) • P0.7: cVGA: s → 0, f → 1 • cMB: 256 → 0, 512 → 1, 1024 → 0 • cVGA,MB: <s,256> → 0, <s,512> → 0, <s,1024> → 1, <f,256> → 0, <f,512> → 1, <f,1024> → 1 • Only VGA=f,MB=512 is a solution

  40. Example • Optimal solution: VGA=f, MB=512 (val = 0.8) • P0.8: cVGA: s → 0, f → 1 • cMB: 256 → 0, 512 → 0, 1024 → 0 • cVGA,MB: <s,256> → 0, <s,512> → 0, <s,1024> → 1, <f,256> → 0, <f,512> → 1, <f,1024> → 1 • No solution

  41. Variable elimination • Choose a variable, x • x is connected via constraints C to the set of variables Y • Compute x(C), the combination of the constraints in C, project it over Y, and add the new constraint over Y (we need an idempotent x) • Forget about x and C • Repeat for all other variables • When just one variable is left, its highest preference is the preference of the optimal solutions • To find an optimal solution, instantiate the variables in the order opposite to elimination, always taking the value with the highest preference
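On the fuzzy VGA/MB problem, eliminating MB amounts to combining every constraint that mentions MB (x = min) and projecting the result onto VGA (+ = max); a sketch with illustrative variable names:

```python
# Variable elimination of MB in the fuzzy VGA/MB problem from the slides.
c_vga = {"s": 0.2, "f": 0.9}
c_mb = {256: 0.5, 512: 0.8, 1024: 0.7}
c_vga_mb = {("s", 256): 0.6, ("s", 512): 0.7, ("s", 1024): 0.9,
            ("f", 256): 0.1, ("f", 512): 0.9, ("f", 1024): 1.0}

# combine the constraints on MB (x = min) and project onto VGA (+ = max)
elim_mb = {v: max(min(c_mb[m], c_vga_mb[(v, m)]) for m in c_mb)
           for v in c_vga}
# fold the result into the remaining unary constraint on VGA
final = {v: min(c_vga[v], elim_mb[v]) for v in c_vga}

best_pref = max(final.values())
best_vga = max(final, key=final.get)
print(best_pref, best_vga)  # 0.8 f: the optimal preference, at VGA=f
```

Instantiating in the reverse order then picks MB=512, the value achieving preference 0.8 with VGA=f, reproducing the optimum of the earlier slides.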
