
CGRASS



Presentation Transcript


  1. CGRASS Alan Frisch, Ian Miguel AI Group University of York Toby Walsh 4C A System for Transforming Constraint Satisfaction Problems

  2. Overview • Motivation • Proof Planning • CGRASS • The Golomb Ruler • Results • Conclusion/Future Work

  3. Motivation • Transforming a model can greatly reduce the effort of solving a problem. • So, experts work hard to identify useful transformations, e.g. • Adding implied constraints. • Breaking symmetry. • Removing redundant constraints. • Replacing constraints with their logical equivalents.

  4. CGRASS • Constraint GeneRation And Symmetry-breaking. • Designed to automate the task of finding useful model transformations. • Based on, and extends, Bundy’s proof planning. • Modelling expertise captured in proof-planning methods. • Methods selectively applied to transform the problem.

  5. Challenges • Want to make only useful transformations. • Not always easy to tell what will help. • How much work to do before we resort to search?

  6. Proof Planning • Used to guide search for proof in ATP. • Patterns in proofs captured in methods. • Strong preconditions limit applicability. • Prevent combinatorially explosive search. • Easily adaptable to problem transformation. • Methods now encapsulate common patterns in hand-transformation.

  7. Proof Planning - Advantages • Strong preconditions restrict us to useful model transformations. • Methods act at a high level. • Search control separate from inference steps.

  8. Proof Planning - Operation • Given a goal to prove: • Select a method. • Check pre-conditions. • Execute post-conditions. • Construct output goal(s). • Associated tactic constructs actual proof. • CGRASS makes transformations rather than constructing sub-goals to prove.

  9. An Example – Strengthen Inequality • Preconditions • There exist two constraints of the form x ≤ y and x ≠ y. • Post-conditions • Add a new constraint of the form x < y. • Delete x ≤ y and x ≠ y. • Example: x ≤ y, x ≠ y transformed to x < y.
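The Strengthen Inequality rewrite can be sketched as a small rule over a toy constraint representation. The tuple encoding and function name below are our own illustration, not CGRASS's actual data structures, and it assumes constraints have already been normalised so that both constraints mention x and y in the same order:

```python
def strengthen_inequality(constraints):
    """Replace each pair {x <= y, x != y} with the single constraint x < y.

    A constraint is a tuple (op, lhs, rhs) with op in {"<=", "!=", "<"}.
    """
    cs = set(constraints)
    for (op, x, y) in list(cs):
        # Precondition: both x <= y and x != y are present.
        if op == "<=" and ("!=", x, y) in cs:
            # Post-conditions: delete the pair, add the strict inequality.
            cs.discard(("<=", x, y))
            cs.discard(("!=", x, y))
            cs.add(("<", x, y))
    return cs
```

Because the method deletes both of its input constraints, its preconditions no longer hold afterwards and it cannot fire again on the same pair.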

  10. Non-monotonicity • Sometimes, methods might add a new constraint. • At others, they might: • Replace one constraint by a tighter one. • Eliminate a redundant constraint. • The set of constraints may increase or decrease. • We replace the single output slot of a method with add and delete lists. • Cf. classical planning.

  11. Looping and History • Unless a method deletes at least one of its input constraints, its preconditions continue to hold. • Hence the method can repeatedly fire. • The History mechanism prevents this. • A list of all constraints ever added (including initial set). • Intuition: constraints removed when redundant or transformed into a more useful form • Restoring a previously removed constraint is a retrograde step.

  12. Pattern Matching • Other proof planners use first order unification. • CGRASS uses a richer pattern matching language. • A method’s input slot can specify: • Any single algebraic constraint (irrespective of type). • No individual methods looking for equality, inequations, inequalities… • Subsets of constraints (e.g. all inequations).

  13. Other Extensions • Constraint Utility: • Constraint arity, complexity and tightness. • Remains difficult. • Termination: • At some point, must stop inferring and start searching. • Need an executive to terminate when future rewards look poor. • Explanation: • Tactics write out text explaining method application.

  14. Syntax • Simple input: problem variables and domains, a list of constraints. • In future: OPL. • Internally simplified further: • Inequalities always re-arranged to x < y or x ≤ y. • Subtraction replaced by a sum and a –1 coefficient. • Promotes efficiency. • Reduces number of methods required.

  15. Normalisation • Normal form inspired by that used in HartMath: • www.hartmath.org • Necessary to deal with associative and commutative operators (e.g. +, *). • Can replace the test for semantic equivalence with a much simpler syntactic comparison (to an extent). • A combination of lexicographic ordering and simplification.

  16. Lexicographic Ordering • We define an order over CGRASS’ types: • Constants < variables < fractions < … < equality < … • Objects of different type compared using this order. • Objects of same type compared recursively. • Each has an associated method of self-comparison • Base case: two constants or variables compared. • Sums/Products/Conjunction/Disjunction in flattened form: • Simply sort to normalise

  17. Lexicographic Ordering Example • Given: x8 + x7 ≠ x6 + x5 and x4*2 + x3 = 2*x1 + x2 • In normal form: x2 + 2*x1 = x3 + 2*x4 and x5 + x6 ≠ x7 + x8 • Equality is higher than an inequation. • The sums are ordered internally and recursively. • The lhs of an equation is lexicographically least.
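The ordering of flattened sums can be illustrated with a toy key function. The type ranks (constants < variables < products) and tuple encoding below are simplified stand-ins for CGRASS's full type order, purely for illustration:

```python
# Rank terms by type, then recursively by content: an int is a constant,
# a str is a variable, and a tuple like ("*", 2, "x1") is a product.
TYPE_RANK = {int: 0, str: 1, tuple: 2}

def term_key(t):
    if isinstance(t, tuple):
        # Compare compound terms recursively on their arguments.
        return (TYPE_RANK[tuple], tuple(term_key(x) for x in t[1:]))
    return (TYPE_RANK[type(t)], t)

def normalise_sum(terms):
    """Normalise a flattened sum by simply sorting its terms."""
    return sorted(terms, key=term_key)
```

Because sums are kept in flattened form, normalisation really is just a sort, as the slide says.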

  18. Simplification • Collection of like terms. • Cancellation. • Removal of common factors. • Example: 2*6*x1 + 4*x2 = 6*x1 + x3*2*2 + 6*x1 • Normalise: 2*6*x1 + 4*x2 = 6*x1 + 6*x1 + 2*2*x3

  19. Simplification Example • Given: 2*6*x1 + 4*x2 = 6*x1 + 6*x1 + 2*2* x3 • Collect constants and occurrences of x1: 4*x2 + 12*x1 = 4*x3 + 12*x1 • Next we perform cancellation: 4*x2 = 4*x3 • Remove common factor: x2 = x3
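The three steps above can be sketched over a toy representation in which each side of an equation is a dict mapping a variable name to its collected integer coefficient (so collection of like terms is implicit). This encoding is our assumption, not CGRASS's:

```python
from math import gcd
from functools import reduce

def simplify_eq(lhs, rhs):
    """Cancel shared terms across an equation, then remove common factors.

    Assumes positive integer coefficients, as in the slide's example.
    """
    # Cancellation: subtract the common part of each shared variable.
    for v in set(lhs) & set(rhs):
        common = min(lhs[v], rhs[v])
        lhs[v] -= common
        rhs[v] -= common
    lhs = {v: c for v, c in lhs.items() if c}
    rhs = {v: c for v, c in rhs.items() if c}
    # Remove the common factor of all remaining coefficients.
    coeffs = list(lhs.values()) + list(rhs.values())
    if coeffs:
        g = reduce(gcd, coeffs)
        lhs = {v: c // g for v, c in lhs.items()}
        rhs = {v: c // g for v, c in rhs.items()}
    return lhs, rhs
```

On the slide's example, 4*x2 + 12*x1 = 4*x3 + 12*x1 cancels to 4*x2 = 4*x3 and then reduces to x2 = x3.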

  20. Normalisation - Summary • These simple steps performed recursively until no further change. • Reduce the workload of CGRASS substantially: • Provide a syntactic test for equality. • Avoid such simplification being written as explicit methods.

  21. The Golomb Ruler • A set of n ticks at integer points on a ruler of length m. • All inter-tick distances must be unique. • Practical applications: astronomy, crystallography. • Previously, useful implied constraints found by hand [Smith et al 2000].

  22. Golomb Ruler – A Concise Model • Minimise: max_i(x_i) • Constraints of the form: x_i – x_j ≠ x_k – x_l • Subject to: i ≠ j, k ≠ l, i ≠ k ∨ j ≠ l • Domains: [0, n²]
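Enumerating the concise model's difference constraints makes its size concrete. The sketch below pairs every ordered pair of distinct indices with every other such pair, which reproduces the figure of 30 constraints for 3 ticks quoted on slide 24; the exact index conventions are our assumption:

```python
from itertools import permutations

def concise_constraints(n):
    """List the index tuples ((i, j), (k, l)) of the constraints
    x_i - x_j != x_k - x_l in the concise n-tick Golomb model."""
    pairs = list(permutations(range(1, n + 1), 2))   # all (i, j) with i != j
    return [((i, j), (k, l))
            for (i, j) in pairs
            for (k, l) in pairs
            if (i, j) != (k, l)]                     # i != k or j != l
```

For n = 3 this yields 30 constraints, and the double counting visible here (each constraint also appears with its sides swapped) is exactly the symmetry discussed on the next slide.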

  23. The Concise Model is Poor • The constraints are quaternary: • Delayed by most solvers. • Symmetry: x1 – x2 ≠ x3 – x4 • Is the same as: x3 – x4 ≠ x1 – x2 • Serves to illustrate how CGRASS can automatically improve a basic model.

  24. The 3-tick Golomb Ruler • Basic model produces 30 constraints. • Initial normalisation of the constraint set reduces this to just 12. • Constraints with reflection symmetry across an inequation are identical following normalisation. • Some cancellation also possible, e.g. x1 – x2 ≠ x1 – x3 cancels to x2 ≠ x3.

  25. Initial State (After Normalisation)

  26. Symmetry-breaking • Often useful implied constraints can be derived only when some/all symmetry broken. • CGRASS detects/breaks symmetry as a pre-processing step. • Symmetrical variables. • Symmetrical sub-terms.

  27. Symmetrical Variables • Have identical domains. • If all occurrences of x1, x2 are exchanged, re-normalised constraint set returns to its original state. • Transitivity of symmetry reduces number of comparisons: • x1 ≡ x2, x2 ≡ x3 → x1 ≡ x3 • Pairwise comparisons of normalised constraint sets also increases efficiency.
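The symmetry test above can be sketched directly: swap the two variables everywhere, re-normalise, and compare constraint sets syntactically. The tuple constraint encoding, and normalisation that only sorts the arguments of the symmetric operator ≠, are simplifying assumptions:

```python
def normalise(c):
    """Sort the arguments of symmetric operators so comparison is syntactic."""
    op, a, b = c
    return (op, *sorted((a, b))) if op == "!=" else c

def symmetric(constraints, x, y):
    """True if exchanging all occurrences of x and y, then re-normalising,
    returns the constraint set to its original state."""
    swap = {x: y, y: x}
    swapped = {normalise((op, swap.get(a, a), swap.get(b, b)))
               for (op, a, b) in constraints}
    return swapped == {normalise(c) for c in constraints}
```

Note how normalisation does the real work here: without it, the swapped set would differ textually even when the problem is unchanged.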

  28. Breaking Variable Symmetry • Result of symmetry detection: a set of lists of symmetrical variables. • Break by creating a partial order between adjacent pairs of variables in each list. • Bounds consistency maintains consistency on transitive closure.
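Given the detected lists of symmetrical variables, the symmetry-breaking step just chains a partial order between adjacent pairs. A minimal sketch, with our own tuple encoding for the ordering constraints:

```python
def break_symmetry(symmetric_vars):
    """For each list of mutually symmetrical variables, add an ordering
    constraint between each adjacent pair (the transitive closure is then
    maintained by bounds consistency, so no further pairs are needed)."""
    return [(xs[i], "<=", xs[i + 1])
            for xs in symmetric_vars
            for i in range(len(xs) - 1)]
```

A list of k symmetrical variables thus contributes only k − 1 constraints rather than one per pair.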

  29. Golomb Ruler: Symmetrical Variables • Symmetry-testing on the 3-tick version: • x1, x2, x3 are symmetrical. • Hence we add: • x1 ≤ x2 • x2 ≤ x3

  30. Symmetrical Sub-terms • Potentially expensive. • Heuristics based on structural equivalence. • Sub-terms are identical when explicit variable names are replaced by a common `marker’. • Swap corresponding pairs throughout and re-normalise.

  31. Firing `Strengthen Inequality` • Preconditions • There exist two constraints of the form x ≤ y and x ≠ y. • Post-conditions • Add a new constraint of the form x < y. • Delete x ≤ y and x ≠ y. • Here: x1 ≤ x2, x1 ≠ x2, x2 ≤ x3, x2 ≠ x3 transformed to x1 < x2, x2 < x3.

  32. Method Application Order • `StrengthenInequality` an example of a number of simple but useful methods: • CGRASS ascribes a high priority to these when making method selection. • Other examples: `NodeConsistency`, `BoundsConsistency`. • Cheap to fire, often reduce constraint set size: • Promotes efficiency, leaving fewer constraints for more complicated methods to match against.

  33. Firing `ArcConsistency` • Given x1 < x2, x2 < x3, update domains: • x1 ∈ {0, 1, 2, 3, 4, 5, 6, 7} • x2 ∈ {1, 2, 3, 4, 5, 6, 7, 8} • x3 ∈ {2, 3, 4, 5, 6, 7, 8, 9}
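The domain updates above can be reproduced with a simple propagation loop over the chain of strict inequalities, starting from the shared domain {0, …, 9} (i.e. [0, n²] for n = 3). This is a sketch of the pruning effect, not CGRASS's propagation code, and it assumes non-empty domains:

```python
def propagate_less(domains, chain):
    """Prune domains to enforce x < y for each (x, y) in `chain`,
    iterating until a fixed point is reached."""
    changed = True
    while changed:
        changed = False
        for x, y in chain:
            dx, dy = domains[x], domains[y]
            nx = {v for v in dx if v < max(dy)}   # x needs some larger y
            ny = {v for v in dy if v > min(dx)}   # y needs some smaller x
            if nx != dx or ny != dy:
                domains[x], domains[y] = nx, ny
                changed = True
    return domains
```

Running it on x1 < x2, x2 < x3 with all domains {0, …, 9} produces exactly the domains shown on the slide.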

  34. Some Redundancy • Given x1 < x2 and x2 < x3, the constraint x1 ≠ x3 is redundant. • Could add a simple method: • Input: a set of strict inequalities and an inequation. • Removes the redundant inequation.

  35. Variable Introduction • The model still contains 9 quaternary constraints. • Can reduce arity by introducing new variables: • New variable bound to a sub-expression of the quaternary constraints. • E.g. z0 = x1 – x2 • `Eliminate` method substitutes new variables into the quaternary constraints, reducing their arity.

  36. The `Introduce` Method • Preconditions • Exp has arity greater than 1, occurs more than once in the constraint set. • someVariable = Exp not already present. • Post-conditions • Generate new var, x, domain defined by Exp. • Add constraint: x = Exp.

  37. Application of `Introduce` • Potentially explosive. • Only applied when all simpler (reductive) methods inapplicable. • Golomb example: • Sub-term: x1 – x2 meets preconditions. • Hence: z0 = x1 – x2, z0 ∈ {-8, …, 6}. • In order to make use of z0, the companion `Eliminate` method is necessary.

  38. An Instance of `Eliminate` • Preconditions: • Lhs = CommonExp s.t. CommonExp also present in another constraint c. • cnew obtained by replacing all occurrences of CommonExp by Lhs in c. • Complexity of cnew less than that of c. • cnew not obviously redundant. • Post-conditions • Add cnew, remove c.
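The preconditions above can be sketched over a deliberately crude string representation of constraints, with string length standing in for the complexity measure. Both choices are illustrative assumptions, not CGRASS's term language:

```python
def eliminate(definition, constraint):
    """Given a definition (z, expr), replace occurrences of expr by z in
    `constraint` — but only when that actually lowers its complexity."""
    z, expr = definition                     # e.g. ("z0", "x1 - x2")
    replaced = constraint.replace(expr, z)
    # Preconditions: the common expression occurs in the constraint, and
    # the new constraint is strictly simpler than the old one.
    if expr in constraint and len(replaced) < len(constraint):
        return replaced                      # post-condition: add c_new
    return constraint                        # otherwise leave c untouched
```

Applied to the Golomb example, substituting z0 = x1 - x2 into a quaternary difference constraint yields a ternary one, which is why `Eliminate` must be exhausted before `Introduce` fires again.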

  39. Application of `Eliminate` • Strong preconditions mean this method simplifies the constraint set. • Hence, must be exhausted before `Introduce` applied. • Combination of `Introduce` and `Eliminate` gives the reduced constraint set.

  40. Other Instances of `Eliminate` • Can also eliminate with inequalities. • Given: z0 = x1 – x2 • Eliminate x1 in favour of x2 using x1 < x2. • Giving: z0 < 0. • `NodeConsistency` method immediately halves the domain of z0.

  41. The `All-different` Method • Generates an all-different constraint from a clique of inequations. • Maximal clique identification is NP-complete • CGRASS uses a greedy procedure. • Input: inequations with single variables on both sides. • Generates a list for each variable of the variables with which it is not equal. • Looks at variables with equal-sized lists.
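A greedy clique-growing procedure over the disequality graph can be sketched as follows. The slide confirms CGRASS uses a greedy procedure; this particular heuristic (try the most-constrained variables first) is our own simplification of the list-based one described above:

```python
def greedy_alldiff(inequations):
    """Grow a clique greedily in the graph whose edges are the inequations
    x != y (single variables on both sides); the clique's variables can be
    covered by one all-different constraint."""
    neighbours = {}
    for x, y in inequations:
        neighbours.setdefault(x, set()).add(y)
        neighbours.setdefault(y, set()).add(x)
    clique = []
    # Consider variables with the most disequalities first.
    for v in sorted(neighbours, key=lambda v: -len(neighbours[v])):
        if all(u in neighbours[v] for u in clique):
            clique.append(v)
    return sorted(clique)
```

Greedy growth is not guaranteed to find a maximum clique (that problem is NP-hard), but any clique it finds justifies a sound all-different constraint.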

  42. Final Problem State

  43. Results

  44. Results - Analysis • Size of input generated by the basic model increases dramatically with number of ticks. • 6 ticks: 870 constraints. • Hence the increase in effort for CGRASS. • Once generated, the new models are far easier to solve. • Gap increases rapidly with number of ticks.

  45. Future Work • Direct support for quantified expressions. • Reduces size of input to a single constraint in this case. • Allows reasoning about a class of problems. • Writing methods typically more complicated, though some become easier. • First steps: arrays.
