
Subject Name: Operations Research • Subject Code: 10CS661 • Prepared By: Mrs. Annapoorani • Department: CSE

Explore the nature of metaheuristics in operations research, including simulated annealing, tabu search, and genetic algorithms. Discover the advantages and disadvantages of metaheuristics and learn about their applications in solving optimization problems.





Presentation Transcript


  1. Subject Name: Operations Research • Subject Code: 10CS661 • Prepared By: Mrs. Annapoorani • Department: CSE

  2. UNIT 8 - Metaheuristics • 1. The Nature of Metaheuristics • 2. Simulated Annealing • 3. Tabu Search • 4. Genetic Algorithms

  3. Heuristics • Heuristics are rules of thumb used in a search to find optimal or near-optimal solutions. Examples are FIFO, LIFO, earliest due date first, largest processing time first, shortest distance first, etc. • Heuristics can be constructive (build a solution piece by piece) or improvement heuristics (take a solution and alter it to find a better one).

  4. Many constructive heuristics are greedy or myopic; that is, they take the best choice next without regard for the rest of the solution. Example: a constructive heuristic for the TSP is to visit the nearest unvisited city next. An improvement heuristic for the TSP is to take a tour and swap the order of two cities, as in the sketch below.
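As an illustration of the two heuristic types, here is a minimal sketch for the TSP in Python; the function names and the symmetric distance matrix `dist` are assumptions for illustration, not part of the original slides.

```python
import random

def nearest_neighbor_tour(dist, start=0):
    """Constructive (greedy) heuristic: always visit the nearest unvisited city next."""
    n = len(dist)
    tour, unvisited = [start], set(range(n)) - {start}
    while unvisited:
        nxt = min(unvisited, key=lambda c: dist[tour[-1]][c])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_cost(dist, tour):
    """Cost of the closed tour (returns to the starting city at the end)."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def swap_improvement(dist, tour, iterations=1000):
    """Improvement heuristic: repeatedly swap two cities and keep the change if it helps."""
    best, best_cost = list(tour), tour_cost(dist, tour)
    for _ in range(iterations):
        i, j = random.sample(range(1, len(tour)), 2)   # keep the starting city fixed
        candidate = list(best)
        candidate[i], candidate[j] = candidate[j], candidate[i]
        cost = tour_cost(dist, candidate)
        if cost < best_cost:
            best, best_cost = candidate, cost
    return best, best_cost
```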

  5. 1. Meta-Heuristics • An iterative generation process that guides a subordinate heuristic by intelligently combining different concepts derived from classical heuristics, artificial intelligence, biological evolution, and the natural and physical sciences, exploring and exploiting the search space. Learning strategies are used to structure information so as to find near-optimal solutions efficiently.

  6. 1.1 Advantages of Meta-Heuristics • Very flexible • Often global optimizers • Often robust to problem size, problem instance, and random variables • May be the only practical alternative

  7. 1.2 Disadvantages of Meta-Heuristics • Often need problem-specific information / techniques • Optimality (convergence) may not be guaranteed • Lack of a theoretical basis • Different searches may yield different solutions to the same problem (stochastic) • Stopping criteria must be chosen • Multiple search parameters must be tuned

  8. 1.3 Hill climbing
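Hill climbing repeatedly moves to the best neighbouring solution and stops when no neighbour improves on the current one. A minimal sketch, where the neighbourhood and objective function below are illustrative assumptions:

```python
def hill_climb(initial, neighbors, f, max_iters=10_000):
    """Greedy local search: move to the best neighbor while it improves f (maximization)."""
    current = initial
    for _ in range(max_iters):
        candidates = neighbors(current)
        if not candidates:
            return current
        best_neighbor = max(candidates, key=f)
        if f(best_neighbor) <= f(current):
            return current                 # local optimum: no neighbor is better
        current = best_neighbor
    return current

# Illustrative use: maximize f(x) = -(x - 3)^2 over the integers by stepping +/- 1.
print(hill_climb(0, lambda x: [x - 1, x + 1], lambda x: -(x - 3) ** 2))   # prints 3
```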

  9. 1.4 Simulated Annealing (figure: local search over the solution space, with the cost function on the vertical axis)

  10. 2. Simulated Annealing • What: exploits an analogy between the physical annealing process and the search for the optimum in a more general system.

  11. 2.1 Annealing Process • Raising the temperature to a very high level (the melting temperature, for example), the atoms reach a higher energy state and have a high probability of rearranging the crystalline structure. • Cooling down slowly, the atoms reach lower and lower energy states and have a smaller and smaller probability of rearranging the crystalline structure.

  12. 2.2 Simulated Annealing Algorithm
  Initialize: an initial solution x, the highest temperature Th, and the coolest temperature Tl
  T = Th
  While the temperature T is higher than Tl
      While not in equilibrium
          Search for a new solution x'
          Accept or reject x' according to the Metropolis criterion
      End
      Decrease the temperature T
  End
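A minimal runnable sketch of this loop in Python. The Metropolis criterion (accept a worse solution with probability exp(-ΔE/T)) is standard; the geometric cooling schedule and all parameter values below are illustrative assumptions rather than values from the slides.

```python
import math
import random

def simulated_annealing(x0, neighbor, cost, T_high=100.0, T_low=1e-3,
                        alpha=0.95, steps_per_temp=100):
    """Minimize cost(x) starting from x0; neighbor(x) proposes a random nearby solution."""
    x, T = x0, T_high
    best, best_cost = x0, cost(x0)
    while T > T_low:                                 # loop until the coolest temperature
        for _ in range(steps_per_temp):              # "while not in equilibrium" (fixed budget here)
            x_new = neighbor(x)
            delta = cost(x_new) - cost(x)
            # Metropolis criterion: always accept improvements,
            # accept worse moves with probability exp(-delta / T).
            if delta <= 0 or random.random() < math.exp(-delta / T):
                x = x_new
                if cost(x) < best_cost:
                    best, best_cost = x, cost(x)
        T *= alpha                                   # geometric cooling: decrease the temperature
    return best, best_cost
```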

  13. 2.3 Example of Simulated Annealing • Traveling Salesman Problem (TSP) • Given 6 cities and the traveling cost between any two cities • A salesman needs to start from city 1, travel through all the other cities, and then return to city 1 • Minimize the total traveling cost

  14. Contd… • Solution representation: an integer list, e.g., (1,4,2,3,6,5) • Search mechanism: swap any two integers (except for the first one), e.g., (1,4,2,3,6,5) → (1,4,3,2,6,5) • Cost function: the total traveling cost of the tour, i.e., the sum of the costs between consecutive cities, returning to city 1 at the end (see the sketch below)
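Plugging this representation into the simulated annealing sketch above might look as follows; the 6×6 cost matrix is an illustrative placeholder, not the data from the slides.

```python
import random

# Illustrative symmetric cost matrix for 6 cities (placeholder values).
COST = [
    [0, 4, 7, 3, 8, 5],
    [4, 0, 2, 6, 9, 4],
    [7, 2, 0, 5, 3, 6],
    [3, 6, 5, 0, 4, 7],
    [8, 9, 3, 4, 0, 2],
    [5, 4, 6, 7, 2, 0],
]

def tsp_cost(tour):
    """Total cost of visiting the cities in order and returning to the first city (1-based labels)."""
    return sum(COST[tour[i] - 1][tour[(i + 1) % len(tour)] - 1] for i in range(len(tour)))

def swap_neighbor(tour):
    """Swap two cities at random, keeping city 1 fixed in the first position."""
    i, j = random.sample(range(1, len(tour)), 2)
    new_tour = list(tour)
    new_tour[i], new_tour[j] = new_tour[j], new_tour[i]
    return new_tour

# Example call, using the simulated_annealing sketch defined earlier:
# best_tour, best_cost = simulated_annealing([1, 4, 2, 3, 6, 5], swap_neighbor, tsp_cost)
```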

  15. 3. Tabu Search • What: neighborhood search + memory • Neighborhood search: move from the current solution to the best solution in its neighborhood • Memory: record the search history in the "tabu list" and forbid moves that would cycle back to recently visited solutions. This is the main idea of tabu search.

  16. 3.1 Algorithm of Tabu Search • Choose an initial solution x. • Find the subset of N(x), the neighbors of x, that are not in the tabu list. • Find the best solution x' in this subset. • If F(x') > F(x) then set x = x'. • Update the tabu list. • If a stopping condition is met then stop, else go to the second step.
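A minimal Python sketch of these steps for a maximization problem. It follows the common variant that always moves to the best non-tabu neighbour (even when it does not improve F) while separately tracking the best solution seen; the fixed tabu-list length and the generic neighbors(x) function are illustrative assumptions.

```python
from collections import deque

def tabu_search(x0, neighbors, F, tabu_size=7, max_iters=1000):
    """Move to the best non-tabu neighbor each step; remember recent solutions to avoid cycling."""
    x, best = x0, x0
    tabu = deque([x0], maxlen=tabu_size)       # fixed-size tabu list of recently visited solutions
    for _ in range(max_iters):
        candidates = [n for n in neighbors(x) if n not in tabu]
        if not candidates:
            break                              # every neighbor is tabu: stop
        x = max(candidates, key=F)             # best non-tabu neighbor, even if not improving
        tabu.append(x)                         # modify the tabu list
        if F(x) > F(best):
            best = x                           # keep the best solution found so far
    return best
```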

  17. 3.2 Effective Tabu Search • Effective modeling • Neighborhood structure • Objective function (fitness or cost) • Example: graph coloring problem: find the minimum number of colors needed such that no two connected nodes share the same color. • Aspiration criteria: the criteria for overruling the tabu constraints and differentiating the preference among the neighbors

  18. Contd… • Effective computing • A "move" may be easier to store and compute than a complete solution • Move: the process of constructing x' from x • Computing and storing the fitness difference may be easier than recomputing the full fitness function.

  19. Contd… • Effective memory use • Variable tabu list size • For a constant-size tabu list: too long deteriorates the search results; too short cannot effectively prevent cycling • Intensification of the search: decrease the tabu list size • Diversification of the search: increase the tabu list size; penalize frequent moves or unsatisfied constraints

  20. 3.3 Example of Tabu Search • A hybrid approach for the graph coloring problem • Given an undirected graph G=(V,E), with V={v1,v2,…,vn} and E={eij} • Determine a partition of V into a minimum number of color classes C1,C2,…,Ck such that for each edge eij, vi and vj are not in the same color class • NP-hard
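One common tabu-search formulation fixes the number of colors k and minimizes the number of conflicting edges (edges whose endpoints share a color). A small illustrative sketch of that cost function and the recoloring move; the data structures are assumptions, not from the slides.

```python
def conflicts(coloring, edges):
    """Cost for tabu search on k-coloring: the number of edges whose endpoints share a color."""
    return sum(1 for (u, v) in edges if coloring[u] == coloring[v])

def recolor_move(coloring, vertex, new_color):
    """A 'move' recolors a single vertex; only (vertex, old_color) needs to go on the tabu list."""
    new_coloring = dict(coloring)
    new_coloring[vertex] = new_color
    return new_coloring

# Example: a triangle cannot be 2-colored without conflicts.
edges = [(0, 1), (1, 2), (0, 2)]
coloring = {0: 0, 1: 1, 2: 0}
print(conflicts(coloring, edges))                      # 1 conflicting edge remains
print(conflicts(recolor_move(coloring, 2, 1), edges))  # recoloring vertex 2 just moves the conflict
```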

  21. 4. Genetic Algorithm • Reproduction (Selection) • Crossover • Mutation

  22. Example 1: Maximize f(x) = x² where x ∈ I and 0 ≤ x ≤ 31

  23. Coding of a solution: a five-bit integer, e.g., 01101 • Fitness function: F(x) = f(x) = x² • Initial population (randomly generated): 01101, 11000, 01000, 10011

  24. Reproduction Roulette Wheel

  25. Reproduction

  26. Crossover

  27. Mutation • The probability of mutation is Pm = 0.001 • With 4 strings of 5 bits each, the expected number of mutated bits per generation is 20 bits × 0.001 = 0.02 bits

  28. Genetic Algorithm (5/5)
  begin
      t ← 0
      initialize P(t)
      evaluate P(t)
      while (not terminated) do
      begin
          t ← t + 1
          select P(t) from P(t-1)
          apply crossover and mutation to P(t)
          evaluate P(t)
      end
  end
  Worked data shown on the slide (fitness = number of 1 bits):
  Population: p1 = 01110, p2 = 01110, p3 = 11100, p4 = 00010
  Fitness: f1 = 3, f2 = 3, f3 = 3, f4 = 1
  Selection probabilities: s1 = 0.3, s2 = 0.3, s3 = 0.3, s4 = 0.1 (after selection, the fourth slot holds 11100 with s4 = 0.3)
  Single-point crossover: 011|10, 01|110, 111|00, 11|100 → c1 = 01100, c2 = 01100, c3 = 11110, c4 = 11110
  After mutation: c1 = 01101, c2 = 01110, c3 = 11010, c4 = 11111
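A minimal runnable GA sketch in Python for Example 1 (maximize f(x) = x² with a five-bit encoding). The population size, crossover probability, and number of generations are illustrative assumptions; the mutation probability follows the slide.

```python
import random

BITS = 5                      # x is encoded as a 5-bit string, so 0 <= x <= 31
POP_SIZE = 4
P_CROSSOVER = 0.9
P_MUTATION = 0.001
GENERATIONS = 50

def fitness(chrom):
    """F(x) = f(x) = x^2, where x is the integer value of the bit string."""
    return int(chrom, 2) ** 2

def roulette_select(pop):
    """Reproduction: pick a parent with probability proportional to its fitness."""
    total = sum(fitness(c) for c in pop)
    if total == 0:
        return random.choice(pop)
    return random.choices(pop, weights=[fitness(c) for c in pop], k=1)[0]

def crossover(p1, p2):
    """Single-point crossover of two bit strings."""
    if random.random() > P_CROSSOVER:
        return p1, p2
    point = random.randint(1, BITS - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(chrom):
    """Flip each bit independently with probability P_MUTATION."""
    return "".join(b if random.random() > P_MUTATION else "10"[int(b)] for b in chrom)

pop = ["".join(random.choice("01") for _ in range(BITS)) for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    next_pop = []
    while len(next_pop) < POP_SIZE:
        c1, c2 = crossover(roulette_select(pop), roulette_select(pop))
        next_pop += [mutate(c1), mutate(c2)]
    pop = next_pop[:POP_SIZE]

best = max(pop, key=fitness)
print(best, int(best, 2), fitness(best))   # expect to approach 11111 -> 31 -> 961
```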

  29. Ant Colony Optimization (outline)
  Initialization
  Loop
      Loop
          Each ant applies a state transition rule to incrementally build a solution
          and applies a local updating rule to the pheromone
      Until all of the ants have built a complete solution
      A global pheromone updating rule is applied
  Until End_Condition
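This outline corresponds to ant colony optimization. A compact sketch for the TSP follows; it implements only the global pheromone update (the local updating rule mentioned above is omitted for brevity), and every parameter value, as well as the reuse of the illustrative COST matrix from the SA example, is an assumption.

```python
import random

ALPHA, BETA = 1.0, 2.0        # relative influence of pheromone vs. heuristic (1 / cost)
RHO = 0.1                     # evaporation rate used by the global pheromone update
Q = 1.0                       # amount of pheromone deposited per unit of tour quality
N_ANTS, N_ITERS = 6, 100

def ant_colony_tsp(cost):
    n = len(cost)
    tau = [[1.0] * n for _ in range(n)]              # pheromone level on each edge
    best_tour, best_len = None, float("inf")

    def tour_length(tour):
        return sum(cost[tour[i]][tour[(i + 1) % n]] for i in range(n))

    for _ in range(N_ITERS):
        tours = []
        for _ in range(N_ANTS):                      # each ant builds a complete solution
            tour = [random.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                choices = [j for j in range(n) if j not in tour]
                # State transition rule: prefer edges with high pheromone and low cost.
                weights = [(tau[i][j] ** ALPHA) * ((1.0 / cost[i][j]) ** BETA) for j in choices]
                tour.append(random.choices(choices, weights=weights, k=1)[0])
            tours.append(tour)

        # Global pheromone update: evaporate everywhere, then reinforce the edges of each tour.
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1 - RHO)
        for tour in tours:
            length = tour_length(tour)
            if length < best_len:
                best_tour, best_len = tour, length
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += Q / length
                tau[b][a] += Q / length
    return best_tour, best_len

# Example (reusing the illustrative COST matrix defined in the SA example above):
# print(ant_colony_tsp(COST))
```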
