CGRASS: A System for Transforming Constraint Satisfaction Problems • Alan Frisch, Ian Miguel (AI Group, University of York) • Toby Walsh (4C)
Overview • Motivation • Proof Planning • CGRASS • The Golomb Ruler • Results • Conclusion/Future Work
Motivation • Transforming a model can greatly reduce the effort of solving a problem. • So, experts work hard to identify useful transformations, e.g. • Adding implied constraints. • Breaking symmetry. • Removing redundant constraints. • Replacing constraints with their logical equivalents.
CGRASS • Constraint GeneRation And Symmetry-breaking. • Designed to automate the task of finding useful model transformations. • Based on, and extends, Bundy’s proof planning. • Modelling expertise captured in PP methods. • Methods selectively applied to transform the problem.
Challenges • Want to make only useful transformations. • Not always easy to tell what will help • How much work to do before we resort to search?
Proof Planning • Used to guide search for proof in ATP. • Patterns in proofs captured in methods. • Strong preconditions limit applicability. • Prevent combinatorially explosive search. • Easily adaptable to problem transformation. • Methods now encapsulate common patterns in hand-transformation.
Proof Planning - Advantages • Strong preconditions restrict us to useful model transformations. • Methods act at a high level. • Search control separate from inference steps.
Proof Planning - Operation • Given a goal to prove: • Select a method. • Check pre-conditions. • Execute post-conditions. • Construct output goal(s). • Associated tactic constructs actual proof. • CGRASS makes transformations rather than constructing sub-goals to prove.
An Example – Strengthen Inequality • Preconditions • There exist two constraints of the form x ≤ y and x ≠ y. • Post-conditions • Add a new constraint of the form x < y. • Delete x ≤ y and x ≠ y. • x ≤ y, x ≠ y Transformed to x < y
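The rewrite above can be sketched as a pass over a constraint set. This is a minimal illustration, not CGRASS's implementation: the tuple representation and function name are invented, and constraints are assumed already normalised so that both inputs mention x and y in the same order.

```python
# Hedged sketch of the Strengthen Inequality method. Constraints are
# modelled as (op, lhs, rhs) tuples, e.g. ('<=', 'x1', 'x2').

def strengthen_inequality(constraints):
    """If both x <= y and x != y are present, replace them with x < y."""
    cs = set(constraints)
    for (op, x, y) in list(cs):
        if op == '<=' and ('!=', x, y) in cs:
            # Post-conditions: delete both input constraints,
            # add the strict form.
            cs.discard(('<=', x, y))
            cs.discard(('!=', x, y))
            cs.add(('<', x, y))
    return cs
```

Because both inputs are deleted, the method's preconditions no longer hold after it fires, so it cannot loop.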
Non-monotonicity • Sometimes, methods might add a new constraint. • At others, they might: • Replace one constraint by a tighter one. • Eliminate a redundant constraint. • The set of constraints may increase or decrease. • We replace the single output slot of a method with add and delete lists. • Cf. classical planning.
Looping and History • Unless a method deletes at least one of its input constraints, its preconditions continue to hold. • Hence the method can repeatedly fire. • The History mechanism prevents this. • A list of all constraints ever added (including initial set). • Intuition: constraints removed when redundant or transformed into a more useful form • Restoring a previously removed constraint is a retrograde step.
Pattern Matching • Other proof planners use first order unification. • CGRASS uses a richer pattern matching language. • A method’s input slot can specify: • Any single algebraic constraint (irrespective of type). • No individual methods looking for equality, inequations, inequalities… • Subsets of constraints (e.g. all inequations).
Other Extensions • Constraint Utility: • Constraint arity, complexity and tightness. • Remains difficult. • Termination: • At some point, must stop inferring and start searching. • Need an executive to terminate when future rewards look poor. • Explanation: • Tactics write out text explaining method application.
Syntax • Simple input: problem variables and domains, a list of constraints. • In future: OPL. • Internally simplified further: • Inequalities always re-arranged to x < y or x ≤ y. • Subtraction replaced by a sum and a –1 coefficient. • Promotes efficiency. • Reduces number of methods required.
Normalisation • Normal form inspired by that used in HartMath: • www.hartmath.org • Necessary to deal with associative and commutative operators (e.g. +, *). • Can replace the test for semantic equivalence with a much simpler syntactic comparison (to an extent). • A combination of lexicographic ordering and simplification.
Lexicographic Ordering • We define an order over CGRASS’ types: • Constants < variables < fractions < … < equality < … • Objects of different type compared using this order. • Objects of same type compared recursively. • Each has an associated method of self-comparison • Base case: two constants or variables compared. • Sums/Products/Conjunction/Disjunction in flattened form: • Simply sort to normalise
Lexicographic Ordering Example • Given: x8 + x7 ≠ x6 + x5, x4*2 + x3 = 2*x1 + x2 • In normal form: x2 + 2*x1 = x3 + 2*x4, x5 + x6 ≠ x7 + x8 • Equality is higher than an inequation. • The sums are ordered internally and recursively. • The lhs of an equation is lexicographically least.
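Normalising a flattened sum or product then amounts to sorting its arguments under the type order. The sketch below assumes expressions are nested tuples such as ('+', 'x2', ('*', 2, 'x1')); the type ranks and sort key are illustrative simplifications, not CGRASS's actual ordering.

```python
# Hedged sketch of normalisation by lexicographic ordering.
# Constants rank below variables, which rank below compound terms.
TYPE_RANK = {int: 0, str: 1, tuple: 2}

def sort_key(e):
    if isinstance(e, tuple):
        # Compound terms: compare operator, then arguments recursively.
        return (TYPE_RANK[tuple], e[0], tuple(sort_key(a) for a in e[1:]))
    # Base case: a constant or variable (string comparison is a
    # simplification here).
    return (TYPE_RANK[type(e)], str(e))

def normalise(e):
    """Sort the arguments of flattened associative/commutative operators."""
    if isinstance(e, tuple) and e[0] in ('+', '*'):
        args = sorted((normalise(a) for a in e[1:]), key=sort_key)
        return (e[0],) + tuple(args)
    return e
```

After this pass, two semantically equal sums compare equal syntactically, which is what makes the simple equality test of the next slides possible.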
Simplification • Collection of like terms. • Cancellation. • Removal of common factors. • Example: 2*6*x1 + 4*x2 = 6*x1 + x3*2*2 + 6*x1 • Normalise: 2*6*x1 + 4*x2 = 6*x1 + 6*x1 + 2*2*x3
Simplification Example • Given: 2*6*x1 + 4*x2 = 6*x1 + 6*x1 + 2*2*x3 • Collect constants and occurrences of x1: 12*x1 + 4*x2 = 12*x1 + 4*x3 • Next we perform cancellation: 4*x2 = 4*x3 • Remove common factor: x2 = x3
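The cancellation and common-factor steps can be sketched over a linear equation represented as a coefficient map per side (collection of like terms is what builds those maps in the first place). The representation and function name are illustrative assumptions.

```python
# Hedged sketch of two simplification steps on a linear equation,
# each side given as {variable: integer coefficient}.
from math import gcd
from functools import reduce

def simplify(lhs, rhs):
    # Cancellation: subtract terms common to both sides.
    for v in set(lhs) & set(rhs):
        common = min(lhs[v], rhs[v])
        lhs[v] -= common
        rhs[v] -= common
    lhs = {v: c for v, c in lhs.items() if c}
    rhs = {v: c for v, c in rhs.items() if c}
    # Removal of common factors: divide through by the gcd of all
    # remaining coefficients.
    g = reduce(gcd, list(lhs.values()) + list(rhs.values()), 0)
    if g > 1:
        lhs = {v: c // g for v, c in lhs.items()}
        rhs = {v: c // g for v, c in rhs.items()}
    return lhs, rhs
```

On the slide's example, 12*x1 cancels from both sides and the common factor 4 is removed, leaving x2 = x3.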
Normalisation - Summary • These simple steps are performed recursively until no further change occurs. • They reduce the workload of CGRASS substantially: • Provide a syntactic test for equality. • Avoid such simplification being written as explicit methods.
The Golomb Ruler • A set of n ticks at integer points on a ruler of length m. • All inter-tick distances must be unique. • Practical applications: astronomy, crystallography • Previously, useful implied constraints found by hand [Smith et al 2000].
Golomb Ruler – A Concise Model • Minimise: maxi(xi) • Constraints of the form: xi – xj ≠ xk – xl • Subject to: i ≠ j, k ≠ l, (i ≠ k ∨ j ≠ l) • Domains: [0, n²]
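The concise model's constraint generator can be sketched directly from those conditions. This is an illustrative reading of the model (one inequation per ordered choice of two distinct tick pairs), which reproduces the 30 constraints the next slides report for 3 ticks.

```python
# Hedged sketch of the concise Golomb model's constraint generator:
# one quaternary inequation x_i - x_j != x_k - x_l per ordered pair of
# distinct (i, j) index pairs.
from itertools import permutations

def concise_model(n):
    pairs = [(i, j) for i in range(1, n + 1)
                    for j in range(1, n + 1) if i != j]
    # permutations(pairs, 2) enumerates distinct ordered pairs of pairs,
    # i.e. i != j, k != l and (i, j) != (k, l).
    return [((i, j), '!=', (k, l))
            for (i, j), (k, l) in permutations(pairs, 2)]
```

For n = 3 there are 6 ordered tick pairs, hence 6 × 5 = 30 constraints, most of them symmetric duplicates, which is exactly what normalisation then collapses.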
The Concise Model is Poor • The constraints are quaternary: • Delayed by most solvers. • Symmetry: x1 – x2 ≠ x3 – x4 • Is the same as: x3 – x4 ≠ x1 – x2 • Serves to illustrate how CGRASS can automatically improve a basic model.
The 3-tick Golomb Ruler • Basic model produces 30 constraints. • Initial normalisation of the constraint set reduces this to just 12. • Constraints with reflection symmetry across an inequation are identical following normalisation. • Some cancellation is also possible, e.g. x1 – x2 ≠ x1 – x3.
Symmetry-breaking • Often useful implied constraints can be derived only when some/all symmetry broken. • CGRASS detects/breaks symmetry as a pre-processing step. • Symmetrical variables. • Symmetrical sub-terms.
Symmetrical Variables • Have identical domains. • If all occurrences of x1, x2 are exchanged, the re-normalised constraint set returns to its original state. • Transitivity of symmetry reduces the number of comparisons: • x1 ≡ x2, x2 ≡ x3 → x1 ≡ x3 • Pairwise comparisons of normalised constraint sets also increase efficiency.
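The swap-and-renormalise test can be sketched as follows. Constraints are again illustrative (op, lhs, rhs) tuples, and normalisation is reduced here to sorting the two sides of a commutative inequation; CGRASS's real normal form is richer.

```python
# Hedged sketch of the variable-symmetry test: exchange all occurrences
# of two variables, re-normalise, and compare with the original set.

def swap(term, a, b):
    """Exchange variables a and b everywhere in a nested tuple term."""
    if isinstance(term, tuple):
        return tuple(swap(e, a, b) for e in term)
    return b if term == a else (a if term == b else term)

def normalise_set(constraints):
    # '!=' is commutative, so sort its sides; equal sets then compare
    # equal syntactically.
    return frozenset(('!=',) + tuple(sorted(c[1:])) for c in constraints)

def symmetric(constraints, a, b):
    swapped = [swap(c, a, b) for c in constraints]
    return normalise_set(constraints) == normalise_set(swapped)
```

On a clique of inequations over x1, x2, x3, every pair passes the test; with an edge missing, the swap changes the set and the test fails.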
Breaking Variable Symmetry • Result of symmetry detection: a set of lists of symmetrical variables. • Break by creating a partial order between adjacent pairs of variables in each list. • Bounds consistency maintains consistency on transitive closure.
Golomb Ruler: Symmetrical Variables • Symmetry-testing on the 3-tick version: • x1, x2, x3 are symmetrical. • Hence we add: • x1 ≤ x2 • x2 ≤ x3
Symmetrical Sub-terms • Potentially expensive. • Heuristics based on structural equivalence. • Sub-terms are identical when explicit variable names are replaced by a common `marker’. • Swap corresponding pairs throughout and re-normalise.
Firing `Strengthen Inequality` • Preconditions • There exist two constraints of the form x ≤ y and x ≠ y. • Post-conditions • Add a new constraint of the form x < y. • Delete x ≤ y and x ≠ y. • x1 ≤ x2, x1 ≠ x2, x2 ≤ x3, x2 ≠ x3 Transformed to x1 < x2, x2 < x3
Method Application Order • `StrengthenInequality` is an example of a number of simple but useful methods: • CGRASS ascribes a high priority to these when making method selection. • Other examples: `NodeConsistency`, `BoundsConsistency`. • Cheap to fire, often reduce constraint set size: • Promotes efficiency, leaving fewer constraints for more complicated methods to match against.
Firing `ArcConsistency` • Given x1 < x2, x2 < x3, update domains: • x1 ∈ {0, 1, 2, 3, 4, 5, 6, 7} • x2 ∈ {1, 2, 3, 4, 5, 6, 7, 8} • x3 ∈ {2, 3, 4, 5, 6, 7, 8, 9}
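Those domain updates can be sketched as simple bounds propagation over the chain of strict inequalities; this is an illustrative reduction of the consistency method to interval bounds, with invented names.

```python
# Hedged sketch of propagation over x < y constraints: raise lower
# bounds left-to-right, then lower upper bounds right-to-left.

def propagate(domains, chain):
    """domains: {var: (lo, hi)}; chain: list of (x, y) meaning x < y."""
    for x, y in chain:                   # x < y forces lo(y) > lo(x)
        lo_x, _ = domains[x]
        lo_y, hi_y = domains[y]
        domains[y] = (max(lo_y, lo_x + 1), hi_y)
    for x, y in reversed(chain):         # x < y forces hi(x) < hi(y)
        lo_x, hi_x = domains[x]
        _, hi_y = domains[y]
        domains[x] = (lo_x, min(hi_x, hi_y - 1))
    return domains
```

Starting all three variables at [0, 9] and propagating x1 < x2 < x3 reproduces the domains shown on the slide.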
Some Redundancy • Given: x1 < x2 and x2 < x3, x1 ≠ x3 is redundant. • Could add a simple method: • Input: a set of strict inequalities and an inequation. • Removes the redundant inequation.
Variable Introduction • The model still contains 9 quaternary constraints. • Can reduce arity by introducing new variables: • New variable bound to a sub-expression of the quaternary constraints. • E.g. z0 = x1 – x2 • `Eliminate` method substitutes new variables into the quaternary constraints, reducing their arity.
The `Introduce` Method • Preconditions • Exp has arity greater than 1, occurs more than once in the constraint set. • someVariable = Exp not already present. • Post-conditions • Generate new var, x, domain defined by Exp. • Add constraint: x = Exp.
Application of `Introduce` • Potentially explosive. • Only applied when all simpler (reductive) methods are inapplicable. • Golomb example: • Sub-term x1 – x2 meets the preconditions. • Hence: z0 = x1 – x2, z0 ∈ {-8, …, 6}. • In order to make use of z0, the companion `Eliminate` method is necessary.
An Instance of `Eliminate` • Preconditions: • Lhs = CommonExp s.t. CommonExp also present in another constraint c. • cnew obtained by replacing all occurrences of CommonExp by Lhs in c. • Complexity of cnew less than that of c. • cnew not obviously redundant. • Post-conditions • Add cnew, remove c.
Application of `Eliminate` • Strong preconditions mean this method simplifies the constraint set. • Hence, it must be exhausted before `Introduce` is applied. • The combination of `Introduce` and `Eliminate` replaces the quaternary constraints with lower-arity constraints over the new variables.
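The substitution step at the heart of `Eliminate` can be sketched as a recursive replacement over tuple expressions; the representation and function name are illustrative, and the complexity and redundancy preconditions are omitted here.

```python
# Hedged sketch of Eliminate's core operation: replace every occurrence
# of a common sub-expression with the variable bound to it.

def eliminate(constraint, common, var):
    """Substitute `var` for sub-expression `common` throughout."""
    if constraint == common:
        return var
    if isinstance(constraint, tuple):
        return tuple(eliminate(e, common, var) for e in constraint)
    return constraint   # base case: a variable or constant
```

Substituting z0 for x1 – x2 in a quaternary inequation yields a ternary one, which is how the pair of methods reduces constraint arity.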
Other Instances of `Eliminate` • Can also eliminate with inequalities. • Given: z0 = x1 – x2 • Eliminate x1 in favour of x2 using x1 < x2. • Giving: z0 < 0. • `NodeConsistency` method immediately halves the domain of z0.
The `All-different` Method • Generates an all-different constraint from a clique of inequations. • Maximal clique identification is NP-complete • CGRASS uses a greedy procedure. • Input: inequations with single variables on both sides. • Generates a list for each variable of the variables with which it is not equal. • Looks at variables with equal-sized lists.
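A greedy procedure of this kind can be sketched as follows; this is an illustrative single-pass greedy clique search, not CGRASS's actual algorithm (in particular, the list-size heuristic described above is omitted).

```python
# Hedged sketch of a greedy clique search over the inequation graph:
# grow a clique by adding any variable known not-equal to every member.

def greedy_alldiff(variables, diseq):
    """diseq: set of frozensets {a, b}, one per constraint a != b."""
    clique = []
    for v in variables:
        if all(frozenset((v, u)) in diseq for u in clique):
            clique.append(v)
    return clique
```

A clique of size k found this way justifies posting one all-different constraint over its k variables in place of the pairwise inequations.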
Results - Analysis • Size of input generated by the basic model increases dramatically with number of ticks. • 6 ticks: 870 constraints. • Hence the increase in effort for CGRASS. • Once generated, the new models are far easier to solve. • Gap increases rapidly with number of ticks.
Future Work • Direct support for quantified expressions. • Reduces the size of the input to a single constraint in this case. • Allows reasoning about a class of problems. • Writing methods typically becomes more complicated, though some become easier. • First steps: arrays.