
When Ants Attack! Ant Algorithms for Subset Selection Problems


Presentation Transcript


  1. When Ants Attack! Ant Algorithms for Subset Selection Problems

  2. Overview • Ant Algorithms for Best Hamiltonian Path-Finding Problems • Subset Selection Problems • Definitions and examples • Ant-SS, an Ant Algorithm for Subset Selection Problems • Solution construction step • Pheromone updating step • Parameters • Experimental Results • Conclusions

  3. Ant Algorithms • Find solutions or near-solutions to intractable discrete optimization problems • Inspired by the behaviour of real ants • Ants communicate indirectly by laying trails of pheromone • “stigmergy” • Enables the ants to find shortest paths between nest and food

  4. Ant Algorithms • Problem is modelled as a search for a minimum cost path in a graph • Artificial ants complete a series of walks of the graph • Each path corresponds to a potential solution • They deposit pheromone on their path in proportion to its quality • Unlike real ants, typically • the pheromone is deposited on completion of the path • only the best paths receive pheromone (“elitism”)

  5. Ant Algorithms • The ants choose between destinations probabilistically based on a combination of • the accumulated pheromone τ, and • a heuristic factor η • [Figure: from vertex A, the choice between B and C depends on τ<A,B>, η<A,B> and τ<A,C>, η<A,C>]
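A minimal sketch of this probabilistic choice, assuming the standard ACO transition rule in which the pheromone and heuristic factors are raised to weights alpha and beta (the factor weights that appear later in the experiment slides); the function and parameter names are illustrative, not taken from the slides:

```python
import random

def choose_destination(candidates, tau, eta, alpha=1.0, beta=1.0):
    """Pick one candidate with probability proportional to tau^alpha * eta^beta.

    candidates : list of candidate destinations
    tau, eta   : dicts mapping each candidate to its pheromone and heuristic value
    """
    weights = [(tau[c] ** alpha) * (eta[c] ** beta) for c in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]
```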

  6. Ant Algorithms • The pheromone intensifies search around the most promising paths • But to avoid stagnation, we need techniques to diversify search, e.g: • raise the importance of the heuristic factor • evaporate some pheromone • impose lower and upper bounds on the pheromone (MAX-MIN Ant Systems)
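A sketch of what such a pheromone maintenance step might look like, combining evaporation with MAX-MIN clamping; the default values follow the parameters given later in the experiments, while the simple elitist reward scheme is an assumption:

```python
def update_pheromone(tau, rewarded, rho=0.01, tau_min=0.01, tau_max=6.0, reward=1.0):
    """Evaporate all trails, reward the components of the best solution(s),
    and clamp every trail into [tau_min, tau_max] (MAX-MIN Ant System style).

    tau      : dict mapping each pheromone component to its current value
    rewarded : set of components belonging to the solution(s) being rewarded
    """
    for key in tau:
        tau[key] = (1.0 - rho) * tau[key]                 # evaporation
        if key in rewarded:
            tau[key] += reward                            # elitist reward
        tau[key] = max(tau_min, min(tau_max, tau[key]))   # MAX-MIN bounds
    return tau
```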

  7. Travelling salesperson problem • Find the minimum length route that: • starts in, e.g., Cork; • visits each town exactly once; • returns to Cork.

  8. And now the bad news... • Number of cities, number of distinct routes, and time to check them all at one route per microsecond: • 7 cities: 360 routes, 360 microsecs • 14 cities: about 3 billion routes (a 10-digit number), 52 mins • 21 cities: a 19-digit number of routes, nearly 40,000 years • 28 cities: a 28-digit number of routes, 2 trillion centuries
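These figures can be reproduced with a short calculation, assuming the number of distinct tours of a symmetric TSP is (n-1)!/2 and that one route is checked per microsecond (consistent with 360 routes taking 360 microseconds):

```python
import math

def tsp_route_count(n_cities):
    """Number of distinct tours for a symmetric TSP: (n-1)!/2."""
    return math.factorial(n_cities - 1) // 2

for n in (7, 14, 21, 28):
    routes = tsp_route_count(n)
    seconds = routes * 1e-6                  # one route per microsecond
    years = seconds / 3.15e7                 # roughly 3.15e7 seconds per year
    print(f"{n} cities: {routes:.3e} routes, about {years:.3e} years")
```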

  9. Demo

  10. Best Hamiltonian Path-Finding Problems • Ant algorithms have been used for • Travelling salesperson problems • Quadratic assignment problems • Vehicle routing problems • Permutation constraint satisfaction problems • Characteristics • ants must visit every vertex in the graph • heuristic factor is often inverse of edge cost

  11. Subset Selection Problems • Intuition: • Given a set S, find a subset S' ⊆ S • that satisfies certain properties and/or • optimizes some objective function • Example: the multidimensional 0-1 knapsack problem • Choose which objects to put into the knapsack • but total weight < 15kg (and perhaps other properties) • and maximise monetary value

  12. Subset Selection Problems • An SS problem may be defined by a triple (S, Sfeasible, f) where • S is a set of objects; • Sfeasible ⊆ ℘(S) is the set that contains all feasible subsets of S; • f : Sfeasible → ℝ is an objective function that associates a real-valued cost f(S') with every feasible subset of objects S' ∈ Sfeasible. • The goal of an SS problem (S, Sfeasible, f) is to find S* ⊆ S such that S* ∈ Sfeasible and f(S*) is maximal.

  13. Multidimensional knapsack as an SS Problem • S is the set of objects; • Sfeasible contains every subset that satisfies all the resource constraints, i.e. Sfeasible = {S' ⊆ S | ∀i ∈ 1..m, Σj∈S' rij ≤ bi} where m is the number of resources, rij is the consumption of resource i by object j, and bi is the available quantity of resource i; • f returns the total profit, i.e., ∀S' ∈ Sfeasible, f(S') = Σj∈S' pj where pj is the profit associated with object j.
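A minimal Python sketch of this triple for the multidimensional knapsack; the instance data (r, b, p) below is hypothetical and only illustrates the feasibility test defining Sfeasible and the objective function f:

```python
def feasible(subset, r, b):
    """A subset is feasible iff, for every resource i, the total consumption
    of the chosen objects stays within the available quantity b[i]."""
    return all(sum(r[i][j] for j in subset) <= b[i] for i in range(len(b)))

def profit(subset, p):
    """Objective function f: total profit of the chosen objects."""
    return sum(p[j] for j in subset)

# Hypothetical instance: 4 objects, 2 resources
r = [[3, 5, 2, 4],    # consumption of resource 0 by each object
     [1, 6, 3, 2]]    # consumption of resource 1 by each object
b = [9, 7]            # available quantity of each resource
p = [10, 30, 14, 16]  # profit of each object

print(feasible({0, 2, 3}, r, b), profit({0, 2, 3}, p))  # True 40
```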

  14. Other SS Problems • Maximum Clique • Maximum Boolean Satisfiability • Maximum Constraint Satisfaction • Minimum Vertex Cover • Graph Matching • Edge-Weighted k-Cardinality Tree

  15. Ant-SS
      Initialise pheromone
      repeat
        for each ant k, construct a solution Sk as follows:
          Randomly choose a first object oi ∈ S
          Sk ← {oi}
          Candidates ← {oj ∈ S \ {oi} | Sk ∪ {oj} ∈ Sfeasible}
          while Candidates ≠ ∅
            Choose an object oi ∈ Candidates with probability p(oi, Sk) based on τ(oi, Sk) and η(oi, Sk)
            Sk ← Sk ∪ {oi}
            Remove oi from Candidates
            Remove from Candidates every object oj such that Sk ∪ {oj} ∉ Sfeasible
          end while
        end for
        Optionally, apply local search
        Update the pheromone
      until maximum number of cycles reached or acceptable solution found
      Note: Sfeasible must be defined in such a way that feasible subsets can be constructed incrementally.
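A minimal Python sketch of one ant's construction step in this loop; the helper names is_feasible, pheromone_factor and heuristic_factor are assumptions standing in for the problem-specific definitions of Sfeasible, τ and η:

```python
import random

def construct_solution(objects, is_feasible, pheromone_factor, heuristic_factor,
                       alpha=1.0, beta=1.0):
    """One ant builds a feasible subset incrementally, as in the Ant-SS loop above.

    objects          : list of all objects in S
    is_feasible      : is_feasible(subset) -> bool, membership test for Sfeasible
    pheromone_factor : tau(o, subset) -> float
    heuristic_factor : eta(o, subset) -> float
    """
    first = random.choice(objects)
    solution = {first}
    candidates = [o for o in objects if o != first and is_feasible(solution | {o})]
    while candidates:
        weights = [pheromone_factor(o, solution) ** alpha *
                   heuristic_factor(o, solution) ** beta for o in candidates]
        chosen = random.choices(candidates, weights=weights, k=1)[0]
        solution.add(chosen)
        candidates = [o for o in candidates
                      if o != chosen and is_feasible(solution | {o})]
    return solution
```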

  16. Pheromone update • In SS Problems, the order in which objects are selected is not significant • hence rewarding consecutively visited vertices (by placing pheromone on visited edges) is not meaningful • Vertex pheromone strategy • reward solution Sk by associating pheromone with each object oi ∈ Sk • Clique pheromone strategy • reward solution Sk by associating pheromone with each pair of different objects (oi, oj) ∈ Sk × Sk

  17. Vertex Pheromone Strategy • Pheromone is laid on objects • represents the learned desirability of selecting that object • [Figure: the ant's walk over objects 1-5, and where pheromone is laid (on the visited objects themselves)]

  18. How does this affect candidate selection? • (The Ant-SS construction loop from slide 15 is repeated here, highlighting the step that chooses an object oi ∈ Candidates with probability p(oi, Sk) based on τ(oi, Sk) and η(oi, Sk).)

  19. Vertex Pheromone Strategy: how it affects candidate selection • Suppose the ant has selected objects 1, 2 and 4. • The probability of selecting object 5 is based on • the pheromone on object 5, τ5; • the heuristic value of object 5, η5

  20. Clique Pheromone Strategy • Pheromone is laid on pairs of objects • represents the learned desirability of selecting one object given that another has been selected • [Figure: the ant's walk over objects 1-5, and where pheromone is laid (on the pairs of selected objects)]

  21. Clique Pheromone Strategy: how it affects candidate selection • Suppose the ant has selected objects 1, 2 and 4. • The probability of selecting object 5 is based on • the pheromone between objects 1 & 5, 2 & 5, and 4 & 5, i.e. τ5 = τ(1,5) + τ(2,5) + τ(4,5); • the heuristic value of object 5, η5
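A small sketch contrasting how the two strategies would compute the pheromone factor τ(oi, Sk) used during candidate selection; the dictionary representations of the trails are assumptions for illustration:

```python
def tau_vertex(candidate, solution, vertex_tau):
    """Vertex strategy: pheromone associated with the candidate object itself."""
    return vertex_tau[candidate]

def tau_clique(candidate, solution, pair_tau):
    """Clique strategy: sum of the pheromone on the pairs linking the candidate
    to every object already in the partial solution, e.g.
    tau(5, {1, 2, 4}) = tau(1,5) + tau(2,5) + tau(4,5)."""
    return sum(pair_tau[frozenset((candidate, o))] for o in solution)

# Example with the situation on this slide
pair_tau = {frozenset(pair): 1.0 for pair in [(1, 5), (2, 5), (4, 5)]}
print(tau_clique(5, {1, 2, 4}, pair_tau))  # 3.0
```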

  22. Comparison • Vertex Pheromone Strategy • evaporation: O( |S| ) • reward of subset Sk: O( |Sk| ) • Clique Pheromone Strategy • evaporation: O( |S|2 ) • reward of subset Sk: O( |Sk|2 )

  23. Experiments • τmin = 0.01; τmax = problem-dependent • Evaporation rate, ρ = 0.01 • Number of ants = 30 • Pheromone factor weight = 1 • Heuristic factor = problem-dependent • Heuristic factor weight = problem-dependent • Pheromone update strategy: tried both • Local search: tried without/with
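Gathered into one place, the settings listed on this slide might look like the following hypothetical configuration dictionary (problem-dependent entries left as placeholders, filled in on the next slides):

```python
ANT_SS_PARAMS = {
    "tau_min": 0.01,
    "tau_max": None,                 # problem-dependent (6 for max clique, 4 for CSP)
    "rho": 0.01,                     # evaporation rate
    "n_ants": 30,
    "alpha": 1,                      # pheromone factor weight
    "beta": None,                    # heuristic factor weight, problem-dependent
    "pheromone_strategy": "vertex",  # or "clique"
    "local_search": False,           # tried both without and with
}
```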

  24. Maximum Clique Problem • τmax = 6 • Heuristic factor, η = 1 (in effect, none) • Heuristic factor weight = 0 • Results averaged over 50 runs

  25. Solution Quality Results • All variants find optimal solutions to ‘easy’ instances • On harder instances, Clique outperforms Vertex • Incorporating local search improves solution quality

  26. Solution Time Results • Vertex needs fewer cycles than Clique • Hence, and because it is cheaper, Vertex converges in less time than Clique • Local search reduces Cycles but not always Time

  27. Convergence (C500.9)

  28. Comparisons with other algorithms • In other experiments on maximum clique problems (Solnon), Ant-SS • outperforms a genetic algorithm • is comparable to an adaptive greedy search • is slightly outperformed by a tabu search

  29. Constraint Satisfaction Problems • τmax = 4 • Heuristic factor η: • variables are selected using the smallest-domain heuristic • the heuristic factor of a variable's value is inversely proportional to the number of additional constraints that would be violated • Heuristic factor weight = 10 • Results averaged over 100 runs

  30. Solution Quality Results • All variants find solutions to ‘easy’ instances • On harder instances, Clique outperforms Vertex • Incorporating local search improves solution quality • All instances are solvable. • We show % of runs that succeeded in finding a solution

  31. Solution Time Results • Clique needs fewer cycles than Vertex!!! • But, because it is cheaper, Vertex converges in less time than Clique • Local search reduces Cycles but not always Time

  32. Comparisons with other algorithms • In other experiments on constraint satisfaction problems (Solnon), Ant-SS • outperforms a genetic algorithm because Clique nearly always finds a solution even on harder instances, but the genetic algorithm does not • is slower than a complete search on smaller instances but much faster on larger instances

  33. Conclusions • We defined Subset Selection Problems • We defined the Ant-SS algorithm • with two pheromone strategies, Vertex and Clique • We evaluated it on a number of problem instances • good performance relative to other algorithms • Clique typically finds better quality solutions but Vertex generally requires less CPU time
