CGRASS
Presentation Transcript


  1. CGRASS A System for Transforming Constraint Satisfaction Problems Alan Frisch, Ian Miguel (York) Toby Walsh (Cork)

  2. Motivation • Modelling experts carefully transform problems to reduce greatly the effort required to solve them. • The challenge: • Automate these transformations. • Identify useful transformations.

  3. Types of Transformation • Add constraints to eliminate symmetrical solutions, e.g. ordering constraints over interchangeable variables. • Add implied constraints: from two given constraints, infer a new constraint that they entail.
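
As a toy illustration (the problem and variable names here are mine, not from the slides), the following Python sketch shows both kinds of transformation on a small example: an ordering constraint removes mirror-image solutions, and an implied constraint removes nothing semantically but is available for extra pruning.

```python
from itertools import product

domain = range(7)

# Toy CSP: x + y == 6; x and y are interchangeable, so solutions come in mirrored pairs.
solutions = [(x, y) for x, y in product(domain, repeat=2) if x + y == 6]
print(len(solutions))                                  # 7, including mirrors such as (1, 5)/(5, 1)

# Symmetry breaking: adding x <= y keeps one solution per mirrored pair.
print(len([(x, y) for x, y in solutions if x <= y]))   # 4

# Implied constraint: from x < y and y < z we may infer x < z; it removes no
# solutions, but gives a solver an extra constraint to propagate.
triples = [(x, y, z) for x, y, z in product(domain, repeat=3) if x < y and y < z]
assert all(x < z for x, y, z in triples)
```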

  4. Types of Transformation (2) • Replacing constraints with equivalents, e.g. by introducing new variables, as we will see later. • Removing redundant constraints: given two constraints where one entails the other, the entailed constraint is redundant and can be removed.

  5. Overview of Rest of Talk • Case Study: The Golomb Ruler. • Transformation at 3 Levels of Abstraction. • Conclusions/Future Work

  6. The Golomb Ruler • A set of n ticks at integer points on a ruler, length m. • All inter-tick distances must be unique. • Minimise m. • Practical applications: astronomy, crystallography • Previously, useful implied constraints found by hand [Smith et al 2000].

  7. The Golomb Ruler – Characteristics of a Good Model • Break symmetry: all n ticks are symmetrical. • Introduce distance variables: allows us to use the all-different constraint.

  8. Approach 1: Instance Level Transformation • Minimise: max_i(xi) • Constraints of the form: (xi – xj ≠ xk – xl) • Subject to: i ≠ j, k ≠ l, i ≠ k ∨ j ≠ l • Domains: [0, n²] • Poor Model: • Quaternary Constraints. • Symmetry.
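
A small Python sketch (my own reconstruction of the naive instance-level model, not CGRASS output) that simply enumerates and counts these quaternary disequalities:

```python
from itertools import permutations

def naive_golomb_constraints(n):
    """All quaternary disequalities xi - xj != xk - xl of the naive model."""
    pairs = list(permutations(range(n), 2))             # ordered pairs (i, j) with i != j
    return [(p, q) for p in pairs for q in pairs if p != q]

for n in (3, 4, 5, 6):
    print(n, len(naive_golomb_constraints(n)))
# prints: 3 30, 4 132, 5 380, 6 870 -- O(n^4) constraints over domains [0, n²]
```

The counts of 30 (3 ticks) and 870 (6 ticks) match the figures quoted later in the talk.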

  9. CGRASS • Constraint GeneRation And Symmetry-breaking. • Based on, and extends, Bundy’s proof planning. • Patterns in proofs captured in methods. • Strong preconditions limit applicability. • Prevent combinatorially explosive search. • Easily adaptable to problem transformation. • Methods now encapsulate common patterns in hand-transformation.

  10. Proof Planning - Operation • Given a goal to prove: • Select a method. • Check pre-conditions. • Execute post-conditions. • Construct output goal(s). • Associated tactic constructs actual proof. • CGRASS makes transformations rather than constructing sub-goals to prove.
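
The following Python sketch shows the shape of this control loop; the `Method` dataclass and the `transform` driver are illustrative names of mine, not CGRASS internals.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Method:
    """A transformation method: strong preconditions guard a rewrite step."""
    name: str
    applicable: Callable[[frozenset], bool]   # preconditions on the constraint set
    apply: Callable[[frozenset], frozenset]   # post-conditions: the transformed set

def transform(constraints: frozenset, methods: list, max_steps: int = 100) -> frozenset:
    """Repeatedly apply the first applicable method, proof-planning style,
    producing a transformed problem rather than sub-goals to prove."""
    for _ in range(max_steps):
        method = next((m for m in methods if m.applicable(constraints)), None)
        if method is None:
            break                             # no preconditions hold: stop
        constraints = method.apply(constraints)
    return constraints
```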

  11. The Introduce Method • Preconditions • Exp has arity greater than 1, occurs more than once in the constraint set. • someVariable = Exp not already present. • Post-conditions • Generate new var, x, domain defined by Exp. • Add constraint: x = Exp.
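
A minimal sketch of an Introduce-style rule, assuming constraints represented as nested tuples; the representation and helper names (`subexprs`, `introduce`) are my own, not the CGRASS data structures, and the domain computation for the new variable is omitted.

```python
from collections import Counter
from itertools import count

# Constraints as nested tuples, e.g. ('!=', ('-', 'x1', 'x2'), ('-', 'x1', 'x3')).
_fresh = count()

def subexprs(expr):
    """Yield every compound sub-expression below the top-level relation."""
    if isinstance(expr, tuple):
        for arg in expr[1:]:
            if isinstance(arg, tuple):
                yield arg
                yield from subexprs(arg)

def introduce(constraints):
    """Add new_var = Exp when a compound Exp occurs more than once and is undefined."""
    defined = {c[2] for c in constraints if c[0] == '='}      # some x = Exp already present?
    counts = Counter(e for c in constraints for e in subexprs(c))
    for exp, occurrences in counts.items():
        if occurrences > 1 and exp not in defined:            # preconditions
            new_var = f"v{next(_fresh)}"
            return constraints | {('=', new_var, exp)}        # post-condition: add x = Exp
    return constraints

cs = {('!=', ('-', 'x1', 'x2'), ('-', 'x1', 'x3')),
      ('<=', ('-', 'x1', 'x2'), 'x3')}
print(introduce(cs))    # adds ('=', 'v0', ('-', 'x1', 'x2'))
```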

  12. An Instance of Eliminate • Preconditions: • Lhs = CommonExp s.t. CommonExp also present in another constraint c. • cnew obtained by replacing all occurrences of CommonExp by Lhs in c. • Size of cnew less than that of c. • cnew not obviously redundant. • Post-conditions • Add cnew, remove c.
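
A matching sketch of an Eliminate-style rule on the same tuple representation (again my reconstruction; the "not obviously redundant" precondition is omitted for brevity):

```python
def replace(expr, old, new):
    """Replace every occurrence of sub-expression `old` in `expr` by `new`."""
    if expr == old:
        return new
    if isinstance(expr, tuple):
        return tuple(replace(a, old, new) for a in expr)
    return expr

def size(expr):
    """Number of nodes in an expression tree."""
    return 1 + sum(size(a) for a in expr[1:]) if isinstance(expr, tuple) else 1

def eliminate(constraints):
    """Substitute Lhs for CommonExp in another constraint when that shrinks it."""
    for d in constraints:
        if d[0] != '=':
            continue
        lhs, common = d[1], d[2]                      # Lhs = CommonExp
        for c in constraints - {d}:
            c_new = replace(c, common, lhs)
            if c_new != c and size(c_new) < size(c):  # preconditions
                return (constraints - {c}) | {c_new}  # post-conditions: swap c for c_new
    return constraints

cs = {('=', 'v0', ('-', 'x1', 'x2')),
      ('!=', ('-', 'x1', 'x2'), ('-', 'x1', 'x3'))}
print(eliminate(cs))    # the disequality becomes ('!=', 'v0', ('-', 'x1', 'x3'))
```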

  13. Normalisation • Lexicographic ordering + Simplification. • Inspired by HartMath: www.hartmath.org. • Necessary to deal with associative/commutative operators (e.g. +, *). • Replaces semantic equivalence test with simple syntactic comparison (to an extent).
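
A toy normaliser along these lines (not the HartMath-based implementation): flatten nested applications of associative operators and sort the arguments of commutative ones lexicographically, so that equivalent expressions become syntactically identical.

```python
COMMUTATIVE_ASSOCIATIVE = {'+', '*'}

def normalise(expr):
    """Flatten nested +/* and put their arguments into a canonical order."""
    if not isinstance(expr, tuple):
        return expr
    op, args = expr[0], [normalise(a) for a in expr[1:]]
    if op in COMMUTATIVE_ASSOCIATIVE:
        flat = []
        for a in args:
            # associativity: (x + (y + z)) becomes (x + y + z)
            flat.extend(a[1:] if isinstance(a, tuple) and a[0] == op else [a])
        args = sorted(flat, key=repr)    # commutativity: canonical argument order
    return (op, *args)

a = ('+', 'x2', ('+', 'x3', 'x1'))
b = ('+', ('+', 'x1', 'x2'), 'x3')
print(normalise(a) == normalise(b))      # True: syntactic comparison now suffices
```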

  14. 3-tick Golomb Ruler • Basic model produces 30 constraints. • Initial normalisation of the constraint set reduces this to just 12. • Constraints that are reflections of each other across an inequation become identical following normalisation. • Some cancellation is also possible, e.g. x1 – x2 ≠ x1 – x3 simplifies to x2 ≠ x3.

  15. Symmetry-breaking • Often useful implied constraints can be derived only when some/all symmetry is broken. • Symmetry-breaking method used as pre-processing. • Symmetrical variables: • Have identical domains. • If we exchange x1, x2 throughout, the re-normalised constraint set returns to its original state. • Symmetry-testing on the 3-tick instance: • x1, x2, x3 are symmetrical. • Hence we add: • x1 ≤ x2, x2 ≤ x3
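
A sketch of this symmetry test on the tuple representation used above; `norm` is a deliberately minimal stand-in for full normalisation, and equal domains are assumed rather than checked.

```python
def rename(expr, a, b):
    """Swap the variable names a and b throughout an expression."""
    if expr == a:
        return b
    if expr == b:
        return a
    if isinstance(expr, tuple):
        return tuple(rename(e, a, b) for e in expr)
    return expr

def norm(expr):
    """Minimal normaliser: order the arguments of symmetric operators."""
    if not isinstance(expr, tuple):
        return expr
    op, args = expr[0], [norm(e) for e in expr[1:]]
    return (op, *sorted(args, key=repr)) if op in {'+', '*', '!='} else (op, *args)

def symmetric(constraints, a, b):
    """a, b are symmetrical if swapping them and re-normalising leaves the set unchanged."""
    original = {norm(c) for c in constraints}
    return {norm(rename(c, a, b)) for c in original} == original

cs = [('!=', 'x1', 'x2'), ('!=', 'x2', 'x3'), ('!=', 'x1', 'x3'),
      ('<=', 'x1', 'm'), ('<=', 'x2', 'm'), ('<=', 'x3', 'm')]
print(symmetric(cs, 'x1', 'x2'))   # True: we may order them, e.g. x1 <= x2
print(symmetric(cs, 'x1', 'm'))    # False: m plays a different role
```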

  16. Variable Introduction/Elimination • After symmetry breaking, the inequalities can be strengthened. • Still 9 quaternary constraints. • Reduce arity by introducing new variables: • E.g. z0 = x1 – x2 • Eliminate then substitutes the new variables into the quaternary constraints, reducing their arity.

  17. Final Problem State • Combination of Introduce and Eliminate • GenAllDiff greedily checks for cliques of inequations.
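
A sketch of a greedy clique check over binary disequalities, in the spirit of GenAllDiff (my reconstruction, not the actual method): treat each x ≠ y as an edge, grow a clique greedily, and replace it by a single all-different.

```python
def gen_all_diff(disequalities):
    """Greedily grow a clique in the disequality graph; return it as an all-different."""
    edges = {frozenset(e) for e in disequalities}
    nodes = sorted({v for e in edges for v in e})
    clique = []
    for v in nodes:                                   # greedy, so not maximal in general
        if all(frozenset((v, u)) in edges for u in clique):
            clique.append(v)
    return ('all-different', tuple(clique)) if len(clique) > 2 else None

diseqs = [('z0', 'z1'), ('z0', 'z2'), ('z1', 'z2')]
print(gen_all_diff(diseqs))   # ('all-different', ('z0', 'z1', 'z2'))
```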

  18. Results

  19. Analysis of Results • Cost of making inferences far outweighs the benefit gained. • An instance of a bad model may have huge numbers of constraints. • E.g. bad model of 6-tick ruler has 870 quaternary constraints. • Conclusion: Inference at instance level alone unlikely to be cost-effective in general.

  20. Approach 2: Transformation of Quantified Constraint Expressions • Minimise: max_i(xi) • This model is still in terms of CSP variables. • The input is small but represents the whole class. • Fairly straightforward to justify the distance variables. • The symmetry of the ticks is difficult to detect at this level: swapping ticks and re-normalising doesn’t work.

  21. Approach 3: High Level Transformation of the Golomb Ruler • Put n ticks on a ruler of size m such that all inter-tick distances are unique. Minimise m. • Find T ⊆ {0, …, m}, |T| = n. • Minimise m (upper bound: n²). • distance({x, y}) = |x – y| • {distance({x, y}) ≠ distance({x’, y’}) | {x, y} ⊆ T, {x’, y’} ⊆ T, {x, y} ≠ {x’, y’}}
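
This high-level specification can be run directly by brute force; the sketch below (illustration only, not part of CGRASS) searches for the smallest m admitting such a set T.

```python
from itertools import combinations

def smallest_golomb(n):
    """Smallest m admitting T ⊆ {0, …, m}, |T| = n, with distinct inter-tick distances."""
    m = n - 1                                   # n ticks need length at least n - 1
    while True:                                 # n² is a safe upper bound on m
        for T in combinations(range(m + 1), n):
            dists = [y - x for x, y in combinations(T, 2)]
            if len(dists) == len(set(dists)):   # all inter-tick distances unique
                return m, T
        m += 1

print(smallest_golomb(4))   # (6, (0, 1, 4, 6)) -- the optimal 4-tick ruler
```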

  22. Initial Transformations • Given the recurrence of distance, natural to introduce distance variables: • {dxy = distance({x, y}) | {x, y} ⊆ T} • Substituting: • {dxy ≠ dx’y’ | {x, y} ⊆ T, {x’, y’} ⊆ T, {x, y} ≠ {x’, y’}} • This defines a clique. Lifted GenAllDiff: • All-diff({dxy | {x, y} ⊆ T})

  23. Refinement • We use refinement methods to move to lower levels of abstraction: • E.g. to pick a subset, size n, from a set A, size m, totally ordered by <: • Introduce a set S of n variables, {s1, …, sn}. • Domain of each si is A. • Assignments to elements of S form a set totally ordered by <, i.e. s1 < s2 < … < sn

  24. Application of refine(subset) • Assign S = {s1, s2, …, sn} • Each si has domain {0, …, m} • s1 < s2 < … < sn • {dij = distance({si, sj}) | {si, sj} ⊆ S} • All-diff({dij | {si, sj} ⊆ S}) • Starting to look more like a CSP.

  25. Final Transformations • Domain of each si is bounded above by m, which is to be minimised. • Refine {si, sj} from a set to an ordered pair. • Assign S = {s1, s2, …, sn} • Each si has domain {0, …, m} • s1 < s2 < … < sn • Minimise(sn) • {dij = sj – si | <si, sj> ∈ S × S, i < j} • All-diff({dij | <si, sj> ∈ S × S, i < j})
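
A sketch of this final refined model run as a concrete search, with explicit distance variables and an all-different check; plain enumeration stands in for a real constraint solver.

```python
from itertools import combinations

def shortest_ruler(n):
    """Refined model: s1 < … < sn over {0, …, m}, dij = sj - si (i < j),
    all-different over the dij, minimise sn."""
    for m in range(n - 1, n * n + 1):            # n² is the model's upper bound
        for s in combinations(range(m + 1), n):  # combinations already satisfy s1 < … < sn
            if s[-1] != m:                       # shorter rulers were tried at earlier m
                continue
            d = {(i, j): s[j] - s[i]             # dij = sj - si, i < j
                 for i, j in combinations(range(n), 2)}
            if len(set(d.values())) == len(d):   # All-diff({dij})
                return s
    return None

print(shortest_ruler(4))   # (0, 1, 4, 6): minimal sn is 6
```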

  26. Conclusions/Future Work • CGRASS: Automatic transformation of instances. • Based on proof planning. • Transformation at instance level alone not practical. • Transformation at the ‘right’ level is much easier. • Levels of abstraction (connected by Abstract/Refine steps): 1 Individual Instances; 2 Quantified Constraints; 3 Sets/Mappings; 4+ Higher Levels…
