
A review of Parallel Genetic Algorithms & Particle Swarm Optimization



  1. A review of Parallel Genetic Algorithms & Particle Swarm Optimization Jeonghwa Moon Advisor: Andreas A. Linninger 6/14/2006 Laboratory for Product and Process Design, Department of Chemical Engineering, University of Illinois, Chicago, IL 60607, U.S.A.

  2. Contents • Parallel Genetic Algorithms • Introduction to GA / Overview of parallel genetic algorithms • Classification of parallel GAs • Global Master-Slave Method • Coarse-Grained Parallel GAs • Fine-Grained Parallel GAs • Hierarchical Parallel Algorithms • Conclusion • Particle Swarm Optimization • Overview • Equations of PSO • Flowchart of PSO • PSO for Multiobjective Optimization • Weighted aggregation approach • Comparison of results of each weighted aggregation approach • Vector Evaluated Particle Swarm Optimization (VEPSO) • Conclusion

  3. PART 1: Parallel Genetic Algorithms

  4. Introduction to GA • GA is an optimization and search technique based on the principles of genetics and natural selection. • It works on a population of possible solutions, whereas other heuristic methods iterate on a single solution. • GAs are probabilistic (stochastic), not deterministic. • GAs are easy to parallelize. [Flowchart of a continuous GA: define cost function, cost, and variables → select GA parameters → generate initial population → find the cost of each chromosome → select mates → mating → mutation → convergence check → done]
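
A minimal Python sketch of the continuous-GA loop in the flowchart above; the function names, operator choices (blend crossover, Gaussian mutation), and parameter values are illustrative assumptions, not taken from the slides:

    import random

    def ga_minimize(cost, bounds, pop_size=20, generations=100,
                    mutation_rate=0.1, step=0.1):
        # Generate the initial population of real-valued chromosomes.
        pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        for _ in range(generations):
            # Find the cost of each chromosome; keep the better half as mates.
            pop.sort(key=cost)
            parents = pop[:pop_size // 2]
            # Mating: blend crossover of two randomly selected parents.
            children = []
            while len(parents) + len(children) < pop_size:
                a, b = random.sample(parents, 2)
                t = random.random()
                children.append([t * x + (1 - t) * y for x, y in zip(a, b)])
            # Mutation: Gaussian perturbation, clipped to the variable bounds.
            for child in children:
                for d, (lo, hi) in enumerate(bounds):
                    if random.random() < mutation_rate:
                        child[d] = min(hi, max(lo, child[d] + random.gauss(0, step * (hi - lo))))
            pop = parents + children   # fixed budget stands in for a convergence check
        return min(pop, key=cost)

    # Usage: minimize the 2-D sphere function.
    best = ga_minimize(lambda x: sum(v * v for v in x), [(-5, 5), (-5, 5)])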

  5. Parallel Genetic Algorithm • GAs are easily parallelized!!! • Advantages of parallelization: • Speed-up: in Cantú-Paz's timing model, the per-generation time of a master-slave GA is Tp = ρPTc + nTf/P, where Tf is the time to evaluate the fitness of one chromosome, Tc is the average time to communicate with one processor, P is the number of processors, n is the population size, and ρ is a parameter dependent on the selection and parallelization method. • Separation into subpopulations can prevent premature convergence by allowing each island to search a different part of the space (yielding multiple solutions). • Data parallelism involves the execution of the same procedure on multiple large data subsets at the same time. • Control parallelism involves the concurrent execution of multiple different procedures.
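
A small numeric sketch of this timing model (a hedged reading of the Cantú-Paz survey in the references; the example numbers are assumptions for illustration):

    import math

    def master_slave_time(n, P, T_f, T_c, rho=1.0):
        # Per-generation wall time: communication cost grows with P, while
        # the evaluation work n * T_f is divided among the P slaves.
        return rho * P * T_c + n * T_f / P

    def optimal_slaves(n, T_f, T_c, rho=1.0):
        # Setting dTp/dP = 0 gives the slave count that minimizes the time above.
        return math.sqrt(n * T_f / (rho * T_c))

    # Example: 200 chromosomes, 50 ms per evaluation, 1 ms per message.
    print(optimal_slaves(200, 0.05, 0.001))   # -> 100.0 slaves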

  6. Classification of parallel GAs [Taxonomy diagram: master-slave (fitness distribution); coarse-grained GAs ((static) subpopulations with migration); fine-grained GAs (overlapping subpopulations without migration; massively parallel genetic algorithms); hybrid methods] • The way in which GAs can be parallelized depends on the following elements: • How fitness is evaluated and mutation is applied • Whether a single population or multiple subpopulations (demes) are used • If multiple populations are used, how individuals are exchanged • How selection is applied (globally or locally) • This classification yields three main types: master-slave, coarse-grained, and fine-grained.

  7. Global Master-Slave Method • Single population; selection and mating are global • The master stores the population and the slaves evaluate the fitness • Easy to implement • It can be a very efficient method of parallelization when evaluation requires considerable computation. • It can be applied directly to an existing GA. • Two types of M-S parallelization A schematic of a master-slave parallel GA: the master stores the population, executes GA operations, and distributes individuals to the slaves; the slaves only evaluate the fitness of the individuals.
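
A minimal sketch of the master-slave scheme using Python's standard multiprocessing module; the fitness function is a placeholder and the worker count is an arbitrary choice:

    from multiprocessing import Pool

    def fitness(chromosome):
        # Placeholder for an expensive evaluation; runs on a slave process.
        return sum(v * v for v in chromosome)

    def evaluate_population(population, workers=4):
        # The master stores the population and farms out only the fitness
        # evaluations; selection and mating remain global on the master.
        with Pool(processes=workers) as pool:
            return pool.map(fitness, population)

    if __name__ == "__main__":
        population = [[1.0, 2.0], [0.5, 0.5], [3.0, -1.0]]
        print(evaluate_population(population))   # -> [5.0, 0.5, 10.0]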

  8. Coarse-Grained Parallel GAs • Also called the island model, distributed GAs, or multiple-deme GAs • A subpopulation model with a relatively small number of demes, each containing many individuals • These models are characterized • by the relatively long time they require for processing a generation within each ("sequential") deme, • and by their occasional communication for exchanging individuals. • From the implementation point of view, multiple-deme GAs are simple extensions of the serial GA. • They are usually implemented on distributed-memory MIMD (multiple instructions, multiple data) computers. • They are well suited for heterogeneous networks. A schematic of a multiple-population parallel GA: each process is a simple GA, and there is (infrequent) communication between the populations.

  9. Migration • "Coarse-grained" is a general term for subpopulation-based parallel GAs; migration is what connects the subpopulations. • The parameters that control migration: • The topology that defines the connections between the subpopulations • A migration rate that controls how many individuals migrate • A migration scheme that controls which individuals from the source deme migrate (best, worst, random) and which individuals in the target deme are replaced (worst, random, etc.) • A migration interval that determines the frequency of migrations • Synchronous migration: occurs at predetermined, constant intervals. • Asynchronous migration: the demes communicate only after certain events occur.
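
A sketch tying the four parameters together for one common configuration: synchronous migration on a ring topology with a best-replace-worst scheme. The names, the ring choice, and the default rate are illustrative assumptions:

    def migrate(islands, cost, rate=2):
        # Ring topology: deme i receives the `rate` best individuals of
        # deme i-1 and replaces its own `rate` worst individuals with them.
        emigrants = []
        for island in islands:
            island.sort(key=cost)                       # best first
            emigrants.append([ind[:] for ind in island[:rate]])
        for i, island in enumerate(islands):
            island[-rate:] = emigrants[(i - 1) % len(islands)]
        return islands

    # Synchronous migration: call inside the main loop at fixed intervals, e.g.
    #   if generation % interval == 0:
    #       migrate(islands, cost)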

  10. Fine-Grained Parallel GAs • Also called cellular GAs. • Each subpopulation is very small, often composed of a single individual. • This individual can only compete and interact with its neighbors. • They require a large number of processors because the population is divided into a large number of small demes. • The characteristics of the best individuals diffuse slowly through the grid. A schematic of a fine-grained parallel GA: this class of parallel GAs has one spatially distributed population, and it can be implemented very efficiently on massively parallel computers.
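
A sketch of the local interaction rule of a fine-grained GA with one individual per cell of a 2-D grid; the von Neumann neighborhood with wrap-around is an assumption chosen for illustration:

    def neighbors(r, c, rows, cols):
        # Up, down, left, right neighbors on a toroidal grid.
        return [((r - 1) % rows, c), ((r + 1) % rows, c),
                (r, (c - 1) % cols), (r, (c + 1) % cols)]

    def local_parent(grid, r, c, cost):
        # Selection is purely local: the best individual among the cell's
        # neighbors. Good traits therefore diffuse slowly across the grid.
        rows, cols = len(grid), len(grid[0])
        candidates = [grid[i][j] for i, j in neighbors(r, c, rows, cols)]
        return min(candidates, key=cost)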

  11. Hierarchical Parallel Algorithms • Combine two of the methods of parallelizing GAs • These combined GAs form a hierarchy. • (i) This hierarchical GA combines a multi-deme GA (at the upper level) and a fine-grained GA (at the lower level). • (ii) A schematic of a hierarchical parallel GA: at the upper level this hybrid is a multi-deme parallel GA in which each node is a master-slave GA. • (iii) This hybrid uses multi-deme GAs at both the upper and the lower levels; at the lower level the migration rate is faster and the communication topology is much denser than at the upper level.

  12. Conclusion • Genetic algorithms are powerful search techniques that are used successfully to solve problems in many disciplines. • Parallel genetic algorithms are becoming more important as the number of parallel machines increases. • Parallel genetic algorithms are easy to implement and are used for speed-up and for finding multiple solutions. • Global master-slave, fine-grained, multiple-deme, and hierarchical PGAs are the four categories. • Research on parallel GAs is dominated by multiple-deme methods. • They are very complex, and their behavior is affected by many parameters.

  13. PART 2: Particle Swarm Optimization

  14. Particle Swarm Optimization • Based on the social-behavior metaphor of organisms such as bird flocking and fish schooling (moving in synchrony without colliding) • Originally proposed by J. Kennedy as a simulation of social behavior. • Each individual has memory, remembering the best position of the search space it has ever visited. • Easy to implement, computationally inexpensive. • It does not require gradient information. • An evolutionary algorithm that does not use "survival of the fittest" • Two variants of the PSO algorithm: • Global neighborhood: each particle moves towards the best previous position in the whole swarm • Local neighborhood: each particle moves towards the best previous position in a restricted neighborhood

  15. Equations of PSO • Notation • The search space is D-dimensional • The i-th particle of the swarm is represented by the D-dimensional vector xi = (xi1, xi2, ..., xiD) • The velocity of the particle is represented by the vector vi = (vi1, vi2, ..., viD) • The best particle in the swarm is denoted by g • The best previous position of the i-th particle is Pi = (pi1, pi2, ..., piD) • Update equations (standard inertia-weight form): vi ← w·vi + c1·r1·(Pi − xi) + c2·r2·(Pg − xi), then xi ← xi + vi, where r1, r2 are uniform random numbers in [0, 1] and c1, c2 are acceleration constants. • (1) w·vi: the particle's previous velocity • (2) c1·r1·(Pi − xi): distance between the particle's best previous position and its current position • (3) c2·r2·(Pg − xi): distance between the swarm's best experience and the particle's current position
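
These update equations translate directly into code; the sketch below assumes the standard inertia-weight form with illustrative constants w, c1, c2:

    import random

    def pso_update(x, v, p_best, g_best, w=0.7, c1=1.5, c2=1.5):
        # Terms (1), (2), (3): inertia, pull toward the particle's own best
        # position, and pull toward the best position found by the swarm.
        new_v = [w * v[d]
                 + c1 * random.random() * (p_best[d] - x[d])
                 + c2 * random.random() * (g_best[d] - x[d])
                 for d in range(len(x))]
        new_x = [x[d] + new_v[d] for d in range(len(x))]
        return new_x, new_v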

  16. Flowchart of PSO: START → initialize variables → initialize swarm and velocities → evaluate initial population → find best positions and the best particle → EVALUATION LOOP: update velocities → update swarm → evaluate new swarm cost → update the best position of each particle and the index of the best particle → convergence check → done
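
Putting the flowchart together, a self-contained Python sketch of the whole loop (a fixed iteration budget stands in for the convergence check; all parameter values are illustrative assumptions):

    import random

    def pso_minimize(cost, bounds, swarm=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        dim = len(bounds)
        # Initialize swarm and velocities; evaluate the initial population.
        xs = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(swarm)]
        vs = [[0.0] * dim for _ in range(swarm)]
        p_best = [x[:] for x in xs]
        p_cost = [cost(x) for x in xs]
        # Find the best positions and the best particle.
        g = min(range(swarm), key=lambda i: p_cost[i])
        g_best, g_cost = p_best[g][:], p_cost[g]
        for _ in range(iters):                       # evaluation loop
            for i in range(swarm):
                for d in range(dim):                 # update velocity, update swarm
                    r1, r2 = random.random(), random.random()
                    vs[i][d] = (w * vs[i][d]
                                + c1 * r1 * (p_best[i][d] - xs[i][d])
                                + c2 * r2 * (g_best[d] - xs[i][d]))
                    xs[i][d] += vs[i][d]
                c = cost(xs[i])                      # evaluate new swarm cost
                if c < p_cost[i]:                    # update bests
                    p_best[i], p_cost[i] = xs[i][:], c
                    if c < g_cost:
                        g_best, g_cost = xs[i][:], c
        return g_best

    best = pso_minimize(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)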

  17. Particle Swarm for Multiobjective Optimization • Several objective functions have to be considered and minimized simultaneously. • Traditional gradient-based optimization techniques can be used to detect Pareto-optimal solutions, but: • the objectives have to be aggregated into one single objective function, and • only one solution can be detected per optimization run. • PSO is suited to MO because it can search for multiple Pareto-optimal solutions in a single run. • Problem formulation of MO: • Let x be a point in an n-dimensional search space • Objective functions fi(x), i = 1..k; inequality constraints gi(x), i = 1..m • The goal of MO is to provide a set of solutions that are Pareto-optimal.
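
The notion of Pareto optimality used here can be made concrete with a small dominance check (minimization assumed; the function names are illustrative):

    def dominates(a, b):
        # a dominates b if a is no worse in every objective and
        # strictly better in at least one.
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(points):
        # Keep only the objective vectors that no other vector dominates.
        return [p for p in points if not any(dominates(q, p) for q in points)]

    print(pareto_front([(1, 4), (2, 2), (3, 3), (4, 1)]))  # (3, 3) is dominated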

  18. MO: Weighted aggregation approach • The most common approach for coping with MO • Definition: the objectives are combined into a single function F(x) = Σi wi fi(x), i = 1..k, with the weights summing to one (Σi wi = 1). • Three types of weighted aggregation: conventional (CWA, fixed weights), bang-bang (BWA), and dynamic (DWA).
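
A sketch of the three weight schedules, following the CWA/BWA/DWA naming used by Parsopoulos and Vrahatis (see references); the switching period and the restriction to two objectives are simplifying assumptions:

    import math

    def cwa_weights(t, w1=0.5):
        # Conventional: the weights stay fixed for the whole run.
        return (w1, 1.0 - w1)

    def bwa_weights(t, period=100):
        # Bang-bang: all weight jumps abruptly from one objective to the other.
        w1 = 1.0 if (t // period) % 2 == 0 else 0.0
        return (w1, 1.0 - w1)

    def dwa_weights(t, period=100):
        # Dynamic: the weights change gradually, sweeping along the front.
        w1 = abs(math.sin(2.0 * math.pi * t / period))
        return (w1, 1.0 - w1)

    def aggregated(fs, ws):
        # Single-objective surrogate F(x) = sum_i w_i * f_i(x).
        return sum(w * f for w, f in zip(ws, fs))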

  19. Comparison of results of each weighted aggregation approach-1 • Test problems

  20. Comparison of results of each weighted aggregation approach-2

  21. Comparison of results of each weighted aggregation approach-3 • Left: the Pareto fronts obtained using only the best particle of the other swarm. • Right: the Pareto fronts obtained using both the best particle and the best previous positions of the other swarm.

  22. Vector Evaluated Particle Swarm Optimization (VEPSO) • The main idea of VEGA is adopted and modified to fit the PSO framework. • N swarms are used for solving a problem with N objective functions. • Each swarm is evaluated according to one of the objectives, • but information coming from the other swarms is used to determine the change of velocities. • Equations for the i-th particle in the j-th swarm (ring migration scheme): the social term of swarm j uses the best particle of a neighboring swarm in the ring, e.g. vij ← w·vij + c1·r1·(Pij − xij) + c2·r2·(Pg(j−1) − xij), then xij ← xij + vij.
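
A sketch of one VEPSO iteration with a ring migration scheme, in which swarm j takes its social term from the best particle of swarm j − 1; names and constants are illustrative assumptions:

    import random

    def vepso_step(swarms, vels, p_bests, g_bests, w=0.7, c1=1.5, c2=1.5):
        # swarms[j] is evaluated only on objective f_j, but the social term
        # of swarm j uses the best particle of the neighboring swarm j - 1.
        n = len(swarms)
        for j in range(n):
            neighbor_best = g_bests[(j - 1) % n]   # ring migration of information
            for i, x in enumerate(swarms[j]):
                for d in range(len(x)):
                    r1, r2 = random.random(), random.random()
                    vels[j][i][d] = (w * vels[j][i][d]
                                     + c1 * r1 * (p_bests[j][i][d] - x[d])
                                     + c2 * r2 * (neighbor_best[d] - x[d]))
                    x[d] += vels[j][i][d]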

  23. Results of VEPSO (1) • Left: the Pareto fronts obtained using only the best particle of the other swarm. • Right: the Pareto fronts obtained using both the best particle and the best previous positions of the other swarm.

  24. Results of VEPSO (2)

  25. Conclusion • PSO is a very useful technique for solving global optimization problems and a good alternative in cases where other methods fail. • Three types of weighted aggregation methods were presented; DWA gives the best results when the Pareto front is convex. • The PSO method efficiently solved well-known test problems. • A modified version of PSO that resembles the VEGA ideas was also developed. • FUTURE WORK • Comparisons with other multiobjective genetic algorithms are needed: • VEGA (Schaffer 1984), the niching method (Goldberg and Richardson 1987), MOGA (Fonseca and Fleming 1993), NSGA, NSGA-II • Parallelization of PSO • PSO is also easy to parallelize

  26. References • Parallelization • Nowostawski, M. and R. Poli (1999). Parallel genetic algorithm taxonomy. KES'99, Adelaide, South Australia. • Gordon, V.S. and D. Whitley (1993). Serial and parallel genetic algorithms as function optimizers. ICGA-93: 5th Int. Conf. on Genetic Algorithms. • Zdeněk Konfršt (2004). Parallel genetic algorithms: advances, computing trends, applications and perspectives. IPDPS'04. • Erick Cantú-Paz (1997). A survey of parallel genetic algorithms. IlliGAL Report 97003, University of Illinois. • PSO • K.E. Parsopoulos and M.N. Vrahatis (2002). Particle swarm optimization method in multiobjective problems. • K.E. Parsopoulos and M.N. Vrahatis (2002). Recent approaches to global optimization problems through particle swarm optimization. • K.E. Parsopoulos, D.K. Tasoulis and M.N. Vrahatis (?). Multiobjective optimization using parallel vector evaluated particle swarm optimization. • Others • Carlos M. Fonseca and Peter J. Fleming (1993). Genetic algorithms for multiobjective optimization: formulation, discussion and generalization.
