
Computational Intelligence Winter Term 2011/12



  1. Computational Intelligence Winter Term 2011/12. Prof. Dr. Günter Rudolph, Lehrstuhl für Algorithm Engineering (LS 11), Fakultät für Informatik, TU Dortmund

  2. Plan for Today • Evolutionary Algorithms (EA) • Optimization Basics • EA Basics

  3. Optimization Basics: modelling, simulation, optimization. [Diagram: a system maps input to output; modelling = system unknown (input and output known), simulation = output unknown (system and input known), optimization = input unknown (system and desired output known).]

  4. Optimization Basics. Given: an objective function f: X → R and a feasible region X (a nonempty set). Objective: find a solution with minimal or maximal value! Optimization problem: find x* ∈ X such that f(x*) = min{ f(x) : x ∈ X }; then x* is a global solution and f(x*) the global optimum. Note: max{ f(x) : x ∈ X } = −min{ −f(x) : x ∈ X }, so it suffices to treat minimization.

  5. Optimization Basics. Local solution x* ∈ X: ∀x ∈ N(x*): f(x*) ≤ f(x), where the neighborhood N(x*) is a bounded subset of X. Example: X = R^n, N(x*) = { x ∈ X : ||x − x*||_2 ≤ ε }. If x* is a local solution, then f(x*) is a local optimum / minimum. Remark: evidently, every global solution / optimum is also a local solution / optimum; the reverse is wrong in general! Example: f: [a,b] → R with a local solution and the global solution at distinct points x* (figure omitted from the transcript).

  6. Optimization Basics. What makes optimization difficult? Some causes:
• local optima (is it a global optimum or not?)
• constraints (ill-shaped feasible region)
• non-smoothness (weak causality, but strong causality is needed!)
• discontinuities (⇒ nondifferentiability, no gradients)
• lack of knowledge about the problem (⇒ black / gray box optimization)
Example: f(x) = a_1 x_1 + ... + a_n x_n → max! with x_i ∈ {0,1}, a_i ∈ R ⇒ x_i* = 1 iff a_i > 0. Add the constraint g(x) = b_1 x_1 + ... + b_n x_n ≤ b ⇒ NP-hard. Add a capacity constraint to the TSP ⇒ CVRP ⇒ still harder. (See the sketch below.)
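
To make the jump in difficulty concrete, here is a minimal Python sketch (function name and example data are ours, not from the slides) of the unconstrained case, which is solvable by inspection; with one added linear constraint the problem becomes the NP-hard 0/1 knapsack problem, for which no such shortcut is known.

```python
def maximize_linear(a):
    """Unconstrained max of f(x) = a_1*x_1 + ... + a_n*x_n over {0,1}^n:
    set x_i = 1 exactly when a_i > 0 (each term can be optimized separately)."""
    return [1 if ai > 0 else 0 for ai in a]

# example: a = (3, -1, 0, 2)  =>  x* = (1, 0, 0, 1)
print(maximize_linear([3, -1, 0, 2]))
```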

  7. Optimization Basics. When to use which optimization method?
Mathematical algorithms: the problem is explicitly specified, a problem-specific solver is available, the problem is well understood, resources for designing an algorithm are affordable, and a solution with proven quality is required ⇒ don't apply EAs.
Randomized search heuristics: the problem is given as a black / gray box, no problem-specific solver is available, the problem is poorly understood, resources for designing an algorithm are insufficient, and a solution of satisfactory quality is sufficient ⇒ EAs are worth a try.

  8. Evolutionary Algorithm Basics. Idea: use biological evolution as a metaphor and as a pool of inspiration ⇒ interpret biological evolution as an iterative method of improvement. Feasible solution x ∈ X = S_1 × ... × S_n = chromosome of an individual; a multiset of feasible solutions = population (a multiset of individuals); the objective function f: X → R = fitness function. Often X = R^n, X = B^n = {0,1}^n, or X = P^n = { π : π is a permutation of {1,2,...,n} }; also non-Cartesian sets or combinations like X = R^n × B^p × P^q ⇒ the structure of the feasible region / search space defines the representation of an individual.

  9. Evolutionary Algorithm Basics. Algorithmic skeleton: initialize population → evaluation → parent selection → variation (yields offspring) → evaluation (of offspring) → survival selection (yields new population) → stop? If no, loop back to parent selection; if yes, output the best individual found. (A generic sketch follows below.)
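
The skeleton translates directly into code; the following Python rendering is a sketch only (the five operator functions and their signatures are our assumptions), written for minimization.

```python
def evolutionary_algorithm(init, evaluate, select_parents, vary,
                           select_survivors, stopping):
    """Generic EA skeleton; the five operators are problem-specific plug-ins."""
    population = init()                                  # initialize population
    fitness = [evaluate(x) for x in population]          # evaluation
    generation = 0
    while not stopping(generation, fitness):             # stop?
        parents = select_parents(population, fitness)    # parent selection
        offspring = vary(parents)                        # variation (yields offspring)
        off_fitness = [evaluate(y) for y in offspring]   # evaluation (of offspring)
        population, fitness = select_survivors(          # survival selection
            population, fitness, offspring, off_fitness)
        generation += 1
    i = min(range(len(population)), key=lambda j: fitness[j])
    return population[i], fitness[i]                     # output: best individual found
```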

  10. Evolutionary Algorithm Basics. Specific example: (1+1)-EA in B^n for minimizing some f: B^n → R. Population size = 1, number of offspring = 1, selects the best from the 1+1 individuals (parent and offspring).
1. initialize X(0) ∈ B^n uniformly at random, set t = 0
2. evaluate f(X(t))
3. select parent: Y = X(t) (no choice here)
4. variation: flip each bit of Y independently with probability p_m = 1/n
5. evaluate f(Y)
6. selection: if f(Y) ≤ f(X(t)) then X(t+1) = Y, else X(t+1) = X(t)
7. if not stopping, set t = t+1 and continue at (3)
(See the sketch below.)
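
A direct Python transcription of these seven steps might look as follows; a minimal sketch in which the stopping rule (a fixed iteration budget) and the test function are our assumptions.

```python
import random

def one_plus_one_ea_bool(f, n, max_iters=10_000):
    """(1+1)-EA on {0,1}^n minimizing f, bit-flip mutation with p_m = 1/n."""
    x = [random.randint(0, 1) for _ in range(n)]   # step 1: uniform initialization
    fx = f(x)                                      # step 2: evaluate parent
    for _ in range(max_iters):                     # step 7: stopping = budget
        y = [b ^ 1 if random.random() < 1.0 / n else b for b in x]  # step 4
        fy = f(y)                                  # step 5: evaluate offspring
        if fy <= fx:                               # step 6: accept if not worse
            x, fx = y, fy
    return x, fx

# example: minimize the number of zero bits
best, value = one_plus_one_ea_bool(lambda x: x.count(0), n=20)
```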

  11. Evolutionary Algorithm Basics. Specific example: (1+1)-EA in R^n for minimizing some f: R^n → R. Population size = 1, number of offspring = 1, selects the best from the 1+1 individuals.
1. initialize X(0) ∈ C ⊂ R^n uniformly at random, where C is a compact (= closed and bounded) set; set t = 0
2. evaluate f(X(t))
3. select parent: Y = X(t) (no choice here)
4. variation = add a random vector: Y = Y + Z, e.g. Z ∼ N(0, I_n)
5. evaluate f(Y)
6. selection: if f(Y) ≤ f(X(t)) then X(t+1) = Y, else X(t+1) = X(t)
7. if not stopping, set t = t+1 and continue at (3)
(See the sketch below.)
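
The real-valued variant differs only in initialization and mutation. A sketch, assuming C = [−5, 5]^n and a fixed mutation strength sigma (both our choices):

```python
import random

def one_plus_one_ea_real(f, n, sigma=1.0, max_iters=10_000):
    """(1+1)-EA on R^n minimizing f; mutation Y = X + Z, Z ~ N(0, sigma^2 * I_n)."""
    x = [random.uniform(-5.0, 5.0) for _ in range(n)]    # init in compact C
    fx = f(x)
    for _ in range(max_iters):
        y = [xi + random.gauss(0.0, sigma) for xi in x]  # additive Gaussian mutation
        fy = f(y)
        if fy <= fx:                                     # plus-selection
            x, fx = y, fy
    return x, fx

# example: minimize the sphere function f(x) = sum_i x_i^2
best, value = one_plus_one_ea_real(lambda x: sum(xi * xi for xi in x), n=5)
```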

  12. Evolutionary Algorithm Basics. Selection: select parents that generate offspring → selection for reproduction; select individuals that proceed to the next generation → selection for survival. Necessary requirements: selection steps must not favor worse individuals; one selection step may be neutral (e.g. select uniformly at random); at least one selection step must favor better individuals. Typically, selection is based only on the fitness values f(x) of the individuals; seldom, it is additionally based on the individuals' chromosomes x (→ maintain diversity).

  13. Evolutionary Algorithm Basics. Selection methods. Population P = (x_1, x_2, ..., x_μ) with μ individuals. Two approaches: 1. repeatedly select individuals from the population with replacement; 2. rank the individuals somehow and choose those with the best ranks (no replacement).
• uniform / neutral selection: choose index i with probability 1/μ.
• fitness-proportional selection (don't use!): choose index i with probability s_i = f(x_i) / Σ_j f(x_j). Problems: f(x) > 0 is required for all x ∈ X ⇒ take g(x) = exp(f(x)) > 0, but selection is already sensitive to additive shifts g(x) = f(x) + c; it is almost deterministic if fitness differences are large and almost uniform if they are small. (See the sketch below.)
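
For illustration only (the slide explicitly warns against this method), a roulette-wheel sketch; maximization with strictly positive fitness values is assumed:

```python
import random

def fitness_proportional_select(fitness):
    """Return index i with probability s_i = f(x_i) / sum_j f(x_j).
    Requires fitness[i] > 0 for all i; note that adding a constant c to
    every fitness value changes all selection probabilities."""
    total = sum(fitness)
    r = random.uniform(0.0, total)
    acc = 0.0
    for i, fi in enumerate(fitness):
        acc += fi
        if acc >= r:
            return i
    return len(fitness) - 1  # guard against floating-point round-off
```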

  14. Evolutionary Algorithm Basics. Selection methods. Population P = (x_1, x_2, ..., x_μ) with μ individuals.
• rank-proportional selection (outdated!): order the individuals according to their fitness values and assign ranks; then apply fitness-proportional selection based on ranks ⇒ avoids all problems of fitness-proportional selection, but: the best individual has only a small selection advantage (it can be lost!).
• k-ary tournament selection: draw k individuals uniformly at random (typically with replacement) from P and choose the one with the best fitness (break ties at random) ⇒ has all advantages of rank-based selection; the probability that the best individual is never drawn in any of μ such tournaments, and hence does not survive, is (1 − 1/μ)^(kμ). (See the sketch below.)
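
A sketch of k-ary tournament selection for minimization (drawing with replacement; the default k = 2 is a common choice, not prescribed by the slide):

```python
import random

def tournament_select(fitness, k=2):
    """Draw k indices uniformly at random with replacement and return the
    index with the best (smallest) fitness; ties fall to the earlier draw,
    which is itself random."""
    candidates = [random.randrange(len(fitness)) for _ in range(k)]
    return min(candidates, key=lambda i: fitness[i])
```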

  15. Evolutionary Algorithm Basics. Selection methods without replacement. Population P = (x_1, x_2, ..., x_μ) with μ parents and population Q = (y_1, y_2, ..., y_λ) with λ offspring.
• (μ, λ)-selection (truncation selection on offspring, "comma-selection"): rank the offspring according to their fitness and select the μ offspring with the best ranks ⇒ the best individual may get lost; λ ≥ μ is required.
• (μ + λ)-selection (truncation selection on parents + offspring, "plus-selection"): merge offspring and parents, rank them according to their fitness, and select the μ individuals with the best ranks ⇒ the best individual survives for sure. (See the sketch below.)
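
Both truncation schemes are essentially one-liners; a sketch for minimization, with the objective f passed explicitly:

```python
def comma_selection(offspring, f, mu):
    """(mu, lambda)-selection: survivors come from the offspring only
    (requires len(offspring) >= mu)."""
    return sorted(offspring, key=f)[:mu]

def plus_selection(parents, offspring, f, mu):
    """(mu + lambda)-selection: survivors come from parents and offspring;
    the best individual always survives (elitist)."""
    return sorted(parents + offspring, key=f)[:mu]
```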

  16. Evolutionary Algorithm Basics. Selection methods: elitism. Elitist selection: the best parent is not replaced by a worse individual.
• Intrinsic elitism: the method selects from parents and offspring; the best survives with probability 1.
• Forced elitism: if the best individual has not survived, re-inject it into the population, i.e., replace the worst selected individual by the previously best parent. (See the sketch below.)
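
Forced elitism as a small repair step after survival selection; a sketch for minimization (function and argument names are ours):

```python
def forced_elitism(survivors, best_parent, f):
    """If the previously best parent has been lost, replace the worst
    selected individual with it; otherwise leave the survivors unchanged."""
    if f(best_parent) < min(f(x) for x in survivors):
        worst = max(range(len(survivors)), key=lambda i: f(survivors[i]))
        survivors[worst] = best_parent
    return survivors
```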

  17. Evolutionary Algorithm Basics. Variation operators depend on the representation: mutation → alters a single individual; recombination → creates a single offspring from two or more parents. They may be applied
• exclusively (either recombination or mutation), chosen in advance,
• exclusively (either recombination or mutation), in a probabilistic manner,
• sequentially (typically recombination before mutation), for each offspring, or
• sequentially (typically recombination before mutation), with some probability.
(See the sketch below.)
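
The last, most common mode as a sketch; the rates p_c and p_m are illustrative values, not taken from the slides:

```python
import random

def make_offspring(parent_a, parent_b, recombine, mutate, p_c=0.7, p_m=1.0):
    """Sequential variation: recombination (applied with probability p_c)
    before mutation (applied with probability p_m)."""
    child = recombine(parent_a, parent_b) if random.random() < p_c else list(parent_a)
    if random.random() < p_m:
        child = mutate(child)
    return child
```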

  18. Evolutionary Algorithm Basics. Variation in B^n; individuals ∈ {0,1}^n.
● Mutation:
a) local → choose index k ∈ {1, …, n} uniformly at random and flip bit k, i.e., x_k = 1 − x_k (slide example: 10011 → 11011 with k = 2)
b) global → for each index k ∈ {1, …, n}: flip bit k with probability p_m ∈ (0,1)
c) "nonlocal" → choose K indices at random and flip the bits at these indices (slide example: 10011 → 11001 with K = 2)
d) inversion → choose a start index k_s and an end index k_e at random and invert the order of the bits between start and end index
(See the sketch below.)
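
A sketch of all four operators (bit strings as lists of 0/1 ints; helper names are ours):

```python
import random

def mutate_local(x):
    """a) flip exactly one uniformly chosen bit."""
    y = list(x)
    y[random.randrange(len(y))] ^= 1
    return y

def mutate_global(x, pm=None):
    """b) flip each bit independently with probability p_m (default 1/n)."""
    pm = 1.0 / len(x) if pm is None else pm
    return [b ^ 1 if random.random() < pm else b for b in x]

def mutate_nonlocal(x, K):
    """c) flip the bits at K distinct randomly chosen indices (K <= n)."""
    y = list(x)
    for k in random.sample(range(len(y)), K):
        y[k] ^= 1
    return y

def mutate_inversion(x):
    """d) reverse the order of the bits between two random indices."""
    ks, ke = sorted(random.sample(range(len(x)), 2))
    return x[:ks] + x[ks:ke + 1][::-1] + x[ke + 1:]
```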

  19. Evolutionary Algorithm Basics. Variation in B^n; individuals ∈ {0,1}^n.
● Recombination (two parents):
a) 1-point crossover → draw a cut-point k ∈ {1,…,n−1} uniformly at random; take the first k bits from the 1st parent and the last n−k bits from the 2nd parent
b) K-point crossover → draw K distinct cut-points uniformly at random; take bits 1 to k_1 from the 1st parent, bits k_1+1 to k_2 from the 2nd parent, bits k_2+1 to k_3 from the 1st parent, and so forth
c) uniform crossover → for each index i: take bit i with equal probability from the 1st or the 2nd parent
(Slide example with parents 1001 and 0111: the transcript's offspring 1111, 1101, 0001 are consistent with 1-point, 2-point, and uniform crossover, respectively; see the sketch below.)
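
A sketch of the three operators (parents as equal-length lists of 0/1 ints):

```python
import random

def one_point(p1, p2):
    """a) cut at k in {1, ..., n-1}; prefix from p1, suffix from p2."""
    k = random.randint(1, len(p1) - 1)
    return p1[:k] + p2[k:]

def k_point(p1, p2, K):
    """b) K distinct cut points; alternate the donating parent per chunk."""
    cuts = sorted(random.sample(range(1, len(p1)), K)) + [len(p1)]
    child, parents, start = [], (p1, p2), 0
    for j, cut in enumerate(cuts):
        child += parents[j % 2][start:cut]
        start = cut
    return child

def uniform(p1, p2):
    """c) each bit independently from either parent with probability 1/2."""
    return [random.choice(pair) for pair in zip(p1, p2)]
```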

  20. Evolutionary Algorithm Basics. Variation in B^n; individuals ∈ {0,1}^n.
● Recombination (multiparent; ρ = number of parents):
diagonal crossover (2 < ρ < n) → choose ρ − 1 distinct cut points and select the chunks along the diagonals; e.g., parents AAAAAAAAAA, BBBBBBBBBB, CCCCCCCCCC, DDDDDDDDDD yield the offspring ABBBCCDDDD, BCCCDDAAAA, CDDDAABBBB, DAAABBCCCC. Diagonal crossover can generate ρ offspring; otherwise, choose the initial chunk at random for a single offspring.
gene pool crossover (ρ > 2) → for each gene: choose the donating parent uniformly at random.
(See the sketch below.)
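
A sketch of both multiparent operators (parents as equal-length lists; for bit strings the "genes" are single bits). The diagonal scheme reproduces the A/B/C/D example above.

```python
import random

def diagonal_crossover(parents):
    """rho - 1 distinct cut points; offspring d takes chunk j from
    parent (d + j) mod rho, i.e. the chunks along the diagonals."""
    rho, n = len(parents), len(parents[0])
    cuts = [0] + sorted(random.sample(range(1, n), rho - 1)) + [n]
    return [sum((parents[(d + j) % rho][cuts[j]:cuts[j + 1]]
                 for j in range(rho)), [])
            for d in range(rho)]

def gene_pool_crossover(parents):
    """For each gene, choose the donating parent uniformly at random."""
    return [random.choice(column) for column in zip(*parents)]
```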

  21. Evolutionary Algorithm Basics. Variation in P^n; individuals X = π(1, …, n).
● Mutation:
local → 2-swap / 1-translocation (slide example: 53241 → 54231 by a 2-swap; 53241 → 52431 by a 1-translocation)
global → draw a number K of 2-swaps and apply 2-swaps K times; K is a positive random variable, its distribution may be uniform, binomial, geometric, …; the expectation E[K] and the variance V[K] may control the mutation strength.
(See the sketch below.)
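
A sketch of the permutation operators (permutations as Python lists; the K-sampler is passed in as a callable, since the slide leaves its distribution open):

```python
import random

def two_swap(perm):
    """Exchange the entries at two distinct random positions."""
    y = list(perm)
    i, j = random.sample(range(len(y)), 2)
    y[i], y[j] = y[j], y[i]
    return y

def one_translocation(perm):
    """Remove one entry and re-insert it at a random position."""
    y = list(perm)
    v = y.pop(random.randrange(len(y)))
    y.insert(random.randrange(len(y) + 1), v)
    return y

def global_mutation(perm, draw_K=lambda: random.randint(1, 3)):
    """Apply K 2-swaps, K a positive random variable (here uniform on {1,2,3})."""
    y = list(perm)
    for _ in range(draw_K()):
        y = two_swap(y)
    return y
```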

  22. Evolutionary Algorithm Basics. Variation in P^n; individuals X = π(1, …, n).
● Recombination (two parents): order-based crossover (OB), partially mapped crossover (PMX), cycle crossover (CX).

  23. Evolutionary Algorithm Basics. Variation in P^n; individuals X = π(1, …, n).
● Recombination (multiparent): [the slide lists three multiparent crossover variants; their names are not recoverable from the transcript]

  24. Evolutionary Algorithm Basics. Variation in R^n; individuals X ∈ R^n.
● Mutation, additive: Y = X + Z (Z an n-dimensional random vector), i.e., offspring = parent + mutation.
Definition: let f_Z: R^n → R_+ be the p.d.f. of the r.v. Z. The set { x ∈ R^n : f_Z(x) > 0 } is termed the support of Z.
local → Z with bounded support; nonlocal → Z with unbounded support (most frequently used!). [Figure: density sketches with bounded and unbounded support omitted.] (See the sketch below.)
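
A sketch of both mutation types; uniform noise on [−ε, ε]^n as the bounded-support example and Gaussian noise as the unbounded one (the concrete distributions are our choices):

```python
import random

def mutate_bounded(x, eps=0.1):
    """Local mutation: Z uniform on [-eps, eps]^n, i.e. bounded support."""
    return [xi + random.uniform(-eps, eps) for xi in x]

def mutate_unbounded(x, sigma=1.0):
    """Nonlocal mutation (most frequently used): Z ~ N(0, sigma^2 * I_n),
    i.e. unbounded support; any point is reachable in a single step."""
    return [xi + random.gauss(0.0, sigma) for xi in x]
```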

  25. Evolutionary Algorithm Basics. Variation in R^n; individuals X ∈ R^n.
● Recombination (two parents): all crossover variants adapted from B^n; intermediate; intermediate (per dimension); discrete; simulated binary crossover (SBX). (The defining formulas are not contained in the transcript.)

  26. Evolutionary Algorithm Basics. Variation in R^n; individuals X ∈ R^n.
● Recombination (multiparent), ρ ≥ 3 parents:
intermediate → z = λ_1 x_1 + … + λ_ρ x_ρ, where λ_i ≥ 0 and λ_1 + … + λ_ρ = 1 (all points in the convex hull of the parents);
intermediate (per dimension) → the same convex combination with independent weights in each coordinate.
(See the sketch below.)
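
A sketch of both intermediate variants, generating the convex weights by normalizing uniform random numbers (one of several possible choices):

```python
import random

def intermediate(parents):
    """z = sum_i lambda_i * x_i with lambda_i >= 0 and sum_i lambda_i = 1:
    the offspring lies in the convex hull of the parents."""
    w = [random.random() for _ in parents]
    lam = [wi / sum(w) for wi in w]
    return [sum(l * xi for l, xi in zip(lam, coord)) for coord in zip(*parents)]

def intermediate_per_dimension(parents):
    """Independent convex weights in every coordinate."""
    child = []
    for coord in zip(*parents):
        w = [random.random() for _ in coord]
        child.append(sum(wi / sum(w) * xi for wi, xi in zip(w, coord)))
    return child
```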

  27. Evolutionary Algorithm Basics. Theorem: let f: R^n → R be a strictly quasiconvex function. If f(x) = f(y) for some x ≠ y, then every offspring generated by intermediate recombination is better than its parents.
Proof (sketch; the original proof is not contained in the transcript): an intermediate offspring has the form z = λx + (1−λ)y with λ ∈ (0,1), and strict quasiconvexity yields f(z) < max{ f(x), f(y) } = f(x) = f(y). ■

  28. Evolutionary Algorithm Basics. Theorem: let f: R^n → R be a differentiable function with f(x) < f(y) for some x ≠ y. If (y − x)′ ∇f(x) < 0, then there is a positive probability that an offspring generated by intermediate recombination is better than both parents.
Proof (sketch; the original proof is not contained in the transcript): for z = x + λ(y − x), a Taylor expansion gives f(z) = f(x) + λ (y − x)′ ∇f(x) + o(λ) < f(x) < f(y) for all sufficiently small λ > 0, and intermediate recombination generates such z with positive probability. ■

  29. Evolutionary Algorithms: Historical Notes. The idea emerged independently several times, about the late 1950s / early 1960s; three branches / "schools" are still active today.
● Evolutionary Programming (EP). Pioneers: Lawrence Fogel, Alvin Owen, Michael Walsh (New York, USA). Original goal: generate intelligent behavior through simulated evolution. Approach: evolution of finite state machines predicting symbols. Later (~1990s) specialized to optimization in R^n by David B. Fogel.
● Genetic Algorithms (GA). Pioneer: John Holland (Ann Arbor, MI, USA). Original goal: analysis of adaptive behavior. Approach: viewing evolution as adaptation; simulated evolution of bit strings. Applied to optimization tasks by PhD students (Kenneth De Jong, 1975; et al.).
● Evolution Strategies (ES). Pioneers: Ingo Rechenberg, Hans-Paul Schwefel, Peter Bienert (Berlin, Germany). Original goal: optimization of complex systems. Approach: viewing variation/selection as an improvement strategy; first in Z^n, then in R^n.

  30. Evolutionary Algorithms: Historical Notes. "Offspring" from the GA branch:
● Genetic Programming (GP). Pioneers: Nichael Lynn Cramer (1985), then John Koza (Stanford, USA). Original goal: evolve programs (parse trees) that must accomplish a certain task. Approach: the GA mechanism transferred to parse trees. Later: programs as successive statements → linear GP (e.g. Wolfgang Banzhaf).
Already beginning in the early 1990s, the borders between EP, GA, ES, and GP started to blur ⇒ the common term Evolutionary Algorithm embraces all kinds of approaches ⇒ the broadly accepted name for the field is Evolutionary Computation. Scientific journals: Evolutionary Computation (MIT Press) since 1993, IEEE Transactions on Evolutionary Computation since 1997, and several more specialized journals started since then.
