
Optimization Problems


Presentation Transcript


  1. Optimization Problems Greg Stitt ECE Department University of Florida

  2. Introduction • Why are we studying optimization problems? • Just about every RC-tool related issue is an optimization problem • This lecture should provide abstract solutions to many RC tool problems

  3. Time Complexity • Time Complexity - Informal definition: • Defines how execution time increases as input size increases • We are interested in worst case execution time • Big O Notation • O(n) • Execution time increases linearly with input size • O(n^2) • Quadratic growth • O(2^n) • Exponential growth
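For rough intuition, a minimal Python sketch (not part of the original slides) that prints how these operation counts grow with n:

```python
# Rough illustration of Big O growth rates; constants and lower-order
# terms are ignored, so the numbers only show the trend.
import math

for n in (10, 20, 40):
    print(f"n={n:>3}  O(n)={n:<4}  O(n^2)={n**2:<6}  "
          f"O(2^n)={2**n:<15,}  O(n!)={math.factorial(n):.1e}")
```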

  4. Problem Classes • P • A problem is “in P” if it has a polynomial time solution • O(n), O(n^2), O(n^3), etc. • Undecidable • Provably impossible to solve • Example: halting problem • NP • Does not mean “Not-Polynomial”!!!! • An NP problem has a non-deterministic polynomial time solution • NP-hard • A problem is NP-hard if every problem in NP is reducible to it • Basically means that NP-hard problems are at least as hard as the hardest problems in NP • NP-complete • In NP and NP-hard • Most interesting problems are NP-complete!!!! • Traveling salesman, subset sum, 0-1 knapsack, graph coloring, vertex cover • Place and route, logic minimization, minimum resource scheduling, hw/sw partitioning, etc.

  5. The Big Question • Does P = NP? • Been studied for a long time • Never been proven either true or false • Why do we care? • Currently, best known solutions for NP-complete problems typically have complexity of O(2^n), O(n!) • Known as intractable – takes more than your lifetime to solve • If one NP-complete problem can be solved in polynomial time (is in P), then all NP-complete problems can be solved in polynomial time • Remember, many interesting problems are NP-complete • If you can find a polynomial time solution to an NP-complete RC problem: • I’ll give you an A • And I’ll get a million dollars (www.claymath.org/millennium/P_vs_NP/ )

  6. Problem Classes

  7. Optimization Problems • Informal definition • Problem of finding the best solution from all possible solutions • Typically involves finding best solution for millions, billions, or even more possibilities • Possible solution: exhaustive search • Check every possible solution, save best one • Works, but • Many optimization problems are NP-complete • Not feasible to generate all possible solutions • Example: O(2^n) • n = 5 => 32, n = 10 => 1024, n = 100 => 1.3 * 10^30 • Problem: If most optimization problems are intractable, how do we solve them? • Could use solution to any NP-complete problem! • Remember: any NP-complete problem can be reduced to any other NP-complete problem • But, this doesn’t really help • There is no known polynomial solution to any NP-complete problem • So, what can we do?
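To make the combinatorial blow-up concrete, here is a minimal exhaustive-search sketch in Python (my illustration, not from the slides) for the 0-1 knapsack problem mentioned earlier; it enumerates all 2^n subsets of items:

```python
from itertools import combinations

def knapsack_exhaustive(items, capacity):
    """Exhaustive search over all subsets of (value, weight) items: O(2^n) subsets."""
    best_value, best_subset = 0, ()
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            weight = sum(w for _, w in subset)
            value = sum(v for v, _ in subset)
            if weight <= capacity and value > best_value:
                best_value, best_subset = value, subset
    return best_value, best_subset

# Fine for a handful of items; hopeless for n = 100 (~1.3 * 10^30 subsets).
print(knapsack_exhaustive([(60, 10), (100, 20), (120, 30)], capacity=50))
```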

  8. Branch and Bound • Another solution: • Branch and bound • Idea: Try to eliminate many possibilities without evaluating them • How it works: • Imagine a tree representing all possible solutions • Algorithm progressively builds tree • Maintains best found solution so far • When considering a branch in the tree, determines best possible solution represented by branch • If branch can’t possibly be better than current best, don’t bother to explore its possible solutions • “Prunes” solution space • Result • + Very often, can eliminate large percentage of possible solutions • + Still finds optimal solution • + Usually, much faster than exhaustive search • - But, still has exponential complexity • O(2^n) • - Still not feasible for large input sizes
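A minimal branch-and-bound sketch for the same 0-1 knapsack instance (again my illustration, not from the slides). The bound here is deliberately simple: the value collected so far plus the total value of all remaining items; if even that optimistic estimate cannot beat the best complete solution found so far, the branch is pruned.

```python
def knapsack_bnb(items, capacity):
    """Branch and bound over include/exclude decisions for each (value, weight)
    item. Worst case is still O(2^n), but branches whose optimistic bound
    cannot beat the best complete solution found so far are pruned."""
    # suffix_value[i] = total value of items i..end, used as a simple upper bound.
    suffix_value = [0] * (len(items) + 1)
    for i in range(len(items) - 1, -1, -1):
        suffix_value[i] = suffix_value[i + 1] + items[i][0]

    best = 0

    def branch(i, value, weight):
        nonlocal best
        if weight > capacity:
            return                                  # infeasible branch
        best = max(best, value)
        if i == len(items) or value + suffix_value[i] <= best:
            return                                  # leaf, or bound can't beat best
        v, w = items[i]
        branch(i + 1, value + v, weight + w)        # include item i
        branch(i + 1, value, weight)                # exclude item i

    branch(0, 0, 0)
    return best

print(knapsack_bnb([(60, 10), (100, 20), (120, 30)], capacity=50))  # 220
```

Tighter bounds (e.g. a fractional-knapsack relaxation) prune more aggressively, but the overall structure stays the same.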

  9. Heuristics • Another solution: • Forget about trying to find the best solution • Focus on finding a “good” solution quickly • Known as heuristics • Good candidates for heuristics • Map problem to other NP-complete problem, use heuristic for that problem • Source of way too many publications • Use generalized optimization problem heuristic • Due to large number of interesting optimization problems, research has introduced general heuristics that apply to all optimization problems • Examples: • Hill Climbing • Simulated Annealing • Genetic Algorithms

  10. Hill Climbing • Background: Graph of solution space • x axis = possible solutions • y axis = quality of solution • Height of solution represents goodness • Peak represents best solution • Informal description: • 1) Choose a solution s (usually randomly) • 2) Find neighboring solution n (i.e. change 1 thing) • 3) If n better than s (climbs the hill) • s = n, Repeat from 2) • 4) If all neighboring solutions worse than s • s is answer (s is a peak) • Example: Traveling Salesman • 1) Find a solution • 2) Swap 2 cities; if the swap improves the solution, keep it, otherwise reject it • 3) Repeat 2) until no improvement can be found
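A minimal Python sketch of this hill-climbing procedure applied to the traveling salesman example (random 2-D city coordinates and the two-city-swap neighborhood are my illustrative choices):

```python
import random

def tour_length(tour, cities):
    """Total length of a closed tour over 2-D city coordinates."""
    return sum(((cities[tour[i]][0] - cities[tour[i - 1]][0]) ** 2 +
                (cities[tour[i]][1] - cities[tour[i - 1]][1]) ** 2) ** 0.5
               for i in range(len(tour)))

def hill_climb_tsp(cities):
    # 1) Start from a random tour.
    tour = list(range(len(cities)))
    random.shuffle(tour)
    improved = True
    while improved:
        improved = False
        # 2) Neighbor: swap two cities; 3) accept the swap only if it helps.
        for i in range(len(tour)):
            for j in range(i + 1, len(tour)):
                neighbor = tour[:]
                neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
                if tour_length(neighbor, cities) < tour_length(tour, cities):
                    tour, improved = neighbor, True
    return tour  # 4) a local optimum: no single swap improves it

cities = [(random.random(), random.random()) for _ in range(15)]
best = hill_climb_tsp(cities)
print(round(tour_length(best, cities), 3))
```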

  11. Hill Climbing • Advantages: • Very fast, works well for certain problems • Disadvantages: • What if there are multiple peaks? • Hill climbing gets stuck at whatever peak it reaches first – known as a local maximum • Optimal solution is highest peak – global maximum • May result in extremely suboptimal solution if there are many peaks

  12. Simulated Annealing • Problem: Hill climbing suffers from local peaks • Need way of escaping local peaks • => Have to accept some bad moves • Solution: Simulated annealing • Annealing is the process of heating and cooling metals in order to improve strength • Idea: Controlled heating and cooling of metal • When hot, atoms move around • When cooled, atoms find configuration with lower internal energy • i.e. makes metal stronger • Analogy: • Temperature = probability of accepting worse neighboring solution • When temperature is high, likely to accept worse neighboring solutions (which may lead to a better overall solution) • Analogous to atoms wandering around • Cooling represents shrinking probability of accepting worse solutions

  13. Simulated Annealing • Informal description • 1) Set initial temperature and probability p • 2) Find initial solution s • 3) Find neighboring solution n • 4) If n better than s • s = n (always accept better solutions) • 5) If n worse than s • Accept n with probability p • 6) Reduce “temperature” and p by some amount • 7) If final temperature not reached, repeat from 3) • 8) If final temperature reached, report best solution found
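A minimal Python sketch of this loop (my illustration, not from the slides). It uses the standard Metropolis-style acceptance rule, where the probability of taking a worse move is exp(-delta/temperature), as one common way to realize the “probability p” and cooling schedule in the outline above:

```python
import math
import random

def simulated_annealing(initial, neighbor, cost,
                        temp=1.0, final_temp=1e-3, cooling=0.995):
    """Generic simulated annealing loop following the outline above."""
    s = best = initial                       # 1)-2) initial temperature and solution
    while temp > final_temp:                 # 7) keep going until final temperature
        n = neighbor(s)                      # 3) neighboring solution
        delta = cost(n) - cost(s)
        # 4) always accept better solutions; 5) accept worse ones with a
        #    probability that shrinks as the temperature drops.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            s = n
            if cost(s) < cost(best):
                best = s
        temp *= cooling                      # 6) reduce the temperature
    return best                              # 8) report best solution found

# Toy usage: minimize a bumpy 1-D function with many local minima.
f = lambda x: (x - 2) ** 2 + 3 * math.sin(5 * x)
print(round(simulated_annealing(0.0,
                                lambda x: x + random.uniform(-0.5, 0.5),
                                f), 3))
```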

  14. Simulated Annealing • Advantages • Can find near optimal solutions – escapes local maxima • Disadvantages • Takes a long time to find near optimal solution • But, still fast compared to exhaustive search • Very sensitive to input parameters – must be configured well to get good results • How long to run? • Initial temperature/final temperature • What should initial probability be? • Cooling schedule • How much to reduce temperature and probability at each step?

  15. Genetic Algorithms • Another possible solution: • Imitate evolution – survival of the fittest • Assumption: evolution produces “better” humans/animals • Implies genetic processes must work well • Genetic Algorithms • Generates a “population” of solutions • Selection chooses members of population (solutions) to survive with probability based on fitness function (i.e. natural selection) • Bad solutions less likely to “survive” • Reproduction combines attributes of selected population • Crossover/Inheritance – Combines traits from each parent (solution) • Mutation – Random change to a characteristic • Repeat until solution has “evolved” to acceptable level
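A minimal Python sketch of the selection / crossover / mutation loop (the bit-string encoding, population size, mutation rate, and toy “one-max” fitness function are my own illustrative choices, not from the slides):

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30,
                      generations=100, mutation_rate=0.02):
    """Tiny genetic algorithm over fixed-length bit strings."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: fitter individuals are more likely to "survive".
        weights = [fitness(ind) + 1e-9 for ind in pop]
        parents = random.choices(pop, weights=weights, k=pop_size)
        # Reproduction: crossover combines traits from two parents,
        # mutation randomly flips bits.
        next_pop = []
        for i in range(0, pop_size, 2):
            a, b = parents[i], parents[i + 1]
            cut = random.randrange(1, n_bits)
            for child in (a[:cut] + b[cut:], b[:cut] + a[cut:]):
                next_pop.append([bit ^ (random.random() < mutation_rate)
                                 for bit in child])
        pop = next_pop
    return max(pop, key=fitness)

# Toy fitness: count of 1 bits ("one-max"); the optimum is all ones.
best = genetic_algorithm(fitness=sum)
print(sum(best), best)
```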

  16. Genetic Algorithms • Advantages • Can find near optimal solutions • Disadvantages • Takes a long time to find near optimal solution • But, still fast compared to exhaustive search • Very sensitive to input parameters – must be configured well to get good results • How large should population be? • How does reproduction occur? • How many generations should be considered? • How much of each generation should be killed off? • Same advantages/disadvantages as simulated annealing

  17. Other Heuristics • Ant colony optimization • Tabu search • Stochastic optimization • Many others

  18. Summary • Optimization problems search a huge solution space for the best solution • Many optimization problems are NP-complete • No known efficient solution • Branch and bound helps a little • Heuristics must be used • Find a “good” solution in reasonable time • General optimization problem heuristics • Hill climbing – fast, but gets stuck at local maxima • Simulated annealing – works well but sensitive to parameters • Genetic algorithms – work well but sensitive to parameters • Why should you care about any of this? • Most RC tool problems are NP-complete optimization problems • You now know how to solve almost all tool problems • You can use branch and bound, hill climbing, simulated annealing, genetic algorithms, ant colony optimization, tabu search, etc.
