
A Study of Genetic Algorithms for Parameter Optimization



  1. A Study of Genetic Algorithms for Parameter Optimization Mac Newbold

  2. Introduction • Many algorithms have constant values that affect how they work • Sometimes we choose them arbitrarily, or based on limited experimentation • Their interactions are often not well understood • Use a Genetic Algorithm to optimize an algorithm's parameters

  3. Background • Utah Network Testbed (www.emulab.net) • Map a “virtual” topology graph to the physical topology graph • NP-Hard, 30+ degrees of freedom • “assign” – Simulated Annealing (AI algo.) • 19 constants control behaviors • 1 boolean, 4 integers, 15 floating point • 1 integer and 3 floats are scaling factors • 15 parameters need to be optimized

  4. Genetic Algorithm • Evolution, “survival of the fittest” • Genetic Algorithm control – “tune” • Calls object methods • Replaceable object • Obj->Random() – returns a random object • Obj->Fitness() – calculate fitness of object • Obj->Cross(obj2) – crossover (returns 2 objs) • Obj->Mutate() – mutate • Obj->Display() – Show the object • Very Flexible Framework
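
  The replaceable-object interface above can be sketched in Python. The slides do not show "tune" itself, so the driver below, the toy `Bits` candidate used to exercise it, and the extra `copy` helper (not in the slide's method list) are all hypothetical illustrations of the idea:

```python
import random

class Tuner:
    """Sketch of a GA driver in the spirit of "tune": it calls only the
    candidate object's methods, so any class that implements
    random/fitness/cross/mutate can be plugged in."""

    def __init__(self, obj_class, pop_size=100, cross_rate=0.5,
                 mutate_rate=0.3, threshold=1.0, max_gens=200):
        self.obj_class = obj_class
        self.pop_size = pop_size
        self.cross_rate = cross_rate
        self.mutate_rate = mutate_rate
        self.threshold = threshold
        self.max_gens = max_gens

    def run(self):
        pop = [self.obj_class.random() for _ in range(self.pop_size)]
        for gen in range(self.max_gens):
            pop.sort(key=lambda o: o.fitness(), reverse=True)
            if pop[0].fitness() >= self.threshold:
                return pop[0], gen
            # "Survival of the fittest": the top half become parents.
            parents = pop[: self.pop_size // 2]
            children = []
            while len(parents) + len(children) < self.pop_size:
                a, b = random.sample(parents, 2)
                if random.random() < self.cross_rate:
                    children.extend(a.cross(b))  # crossover returns 2 objects
                else:
                    children.extend([a.copy(), b.copy()])
            children = children[: self.pop_size - len(parents)]
            for child in children:
                if random.random() < self.mutate_rate:
                    child.mutate()
            pop = parents + children
        pop.sort(key=lambda o: o.fitness(), reverse=True)
        return pop[0], self.max_gens


class Bits:
    """Toy candidate: maximize the fraction of 1-bits in a bit string."""
    LENGTH = 16

    def __init__(self, bits):
        self.bits = bits

    @classmethod
    def random(cls):
        return cls([random.randint(0, 1) for _ in range(cls.LENGTH)])

    def fitness(self):
        return sum(self.bits) / self.LENGTH

    def cross(self, other):
        # Single-point crossover, returning two children.
        cut = random.randrange(1, self.LENGTH)
        return (Bits(self.bits[:cut] + other.bits[cut:]),
                Bits(other.bits[:cut] + self.bits[cut:]))

    def mutate(self):
        i = random.randrange(self.LENGTH)
        self.bits[i] ^= 1  # flip one random bit

    def copy(self):
        return Bits(list(self.bits))
```

  Because the driver never looks inside the object, swapping `Bits` for a "Params"-style class changes what gets optimized without touching the GA itself.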

  5. Parameter Optimization • “Params” object • Specialized for “assign” • Contains the 15 variables we want to tune • One extra value caches fitness calculations • Ensures that values “make sense” using domain-specific constraints • Uniform crossover • Random mutation • Performance-based fitness measure
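
  A minimal sketch of a "Params"-style chromosome, assuming uniform crossover and random mutation as described above. The parameter names and ranges here are hypothetical (the slides do not list assign's actual 15 constants); the real object would hold all 15:

```python
import random

# Hypothetical subset of assign's tunable constants and their legal ranges.
RANGES = {
    "init_temp": (1.0, 100.0),
    "cool_rate": (0.80, 0.999),
    "min_temp":  (0.001, 1.0),
}

class Params:
    def __init__(self, values):
        self.values = dict(values)
        self._fitness = None  # one extra slot caches the fitness score

    @classmethod
    def random(cls):
        return cls({k: random.uniform(lo, hi)
                    for k, (lo, hi) in RANGES.items()})

    def clamp(self):
        # Domain-specific constraint: keep every value inside its range,
        # so the parameters always "make sense" to assign.
        for k, (lo, hi) in RANGES.items():
            self.values[k] = min(max(self.values[k], lo), hi)

    def cross(self, other):
        # Uniform crossover: each gene comes from either parent with p=0.5;
        # the two children receive complementary picks.
        a, b = {}, {}
        for k in RANGES:
            if random.random() < 0.5:
                a[k], b[k] = self.values[k], other.values[k]
            else:
                a[k], b[k] = other.values[k], self.values[k]
        return Params(a), Params(b)

    def mutate(self):
        # Random mutation: re-draw one gene uniformly from its range.
        k = random.choice(list(RANGES))
        lo, hi = RANGES[k]
        self.values[k] = random.uniform(lo, hi)
        self._fitness = None  # invalidate the cached fitness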

  6. Fitness Function • For “assign”, we care about running time • Choice of constants has a huge effect • Fitness calculation: • Run “assign” on a set of N problems, using the object’s parameters • Allow S seconds for each run • X = sum of execution times • Fitness = (S*N) – X • S*N = maximum possible total time • Higher scores are better • Could take a long time, so cache the result
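
  The fitness calculation above can be written down directly. In this sketch, `run_assign` is a hypothetical stand-in for invoking assign on one problem with the object's parameters under an S-second cap:

```python
def fitness(params, problems, S, run_assign):
    """Performance-based fitness: run assign on N problems, S seconds
    allowed each; fitness = S*N - X, so faster parameter sets score
    higher and S*N (the maximum possible total time) is the ceiling."""
    if params.cached_fitness is not None:
        return params.cached_fitness  # runs are slow, so reuse the score
    # X = sum of execution times, each capped at S seconds.
    X = sum(min(run_assign(p, params), S) for p in problems)
    N = len(problems)
    params.cached_fitness = S * N - X
    return params.cached_fitness
```

  With the cache in place, repeated fitness queries during sorting and selection cost nothing; only a fresh or mutated object triggers new assign runs.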

  7. G.A. Results • Tested genetic algorithm with “random” objects • Same as “Params” object, except for fitness • Random fitness, updated after cross/mutate • 1000 member population • Crossover rate of 0.50 • Mutation rate of 0.30 • Threshold = 999.995/1000 • Took 7 generations, about 5 seconds
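
  The "random" test object described above might look like the following sketch (names hypothetical). Its fitness is an arbitrary random draw, re-rolled after every cross or mutate, so the run exercises only the GA machinery, not any real scoring:

```python
import random

class RandomObj:
    """Test stand-in for "Params": same interface, but fitness is just a
    random number in [0, 1), refreshed after every cross or mutate.
    With a threshold of 999.995/1000 = 0.999995, the GA halts as soon
    as any member draws a fitness above it."""

    def __init__(self):
        self.f = random.random()

    @classmethod
    def random(cls):
        return cls()

    def fitness(self):
        return self.f

    def cross(self, other):
        return RandomObj(), RandomObj()  # children get fresh fitness

    def mutate(self):
        self.f = random.random()         # re-roll this object's fitness
```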

  8. Results

  9. What’s Next • Finish setting up actual scoring using “assign” runtimes…
