Presentation Transcript


  1. G5BAIM Artificial Intelligence Methods Evolutionary Algorithms Dr. Rong Qu

  2. Learning in Computer Programs • How do computer programs learn? • Another aspect of AI • This lecture is only an introduction • These techniques can be used as an optimisation strategy, but we are going to look at a learning example

  3. Evolutionary Algorithms • Population based algorithm • Genetic algorithm • Ant algorithm • Evolutionary strategies

  4. Evolutionary Algorithms • Chapter 8 of Michalewicz, Z. (1996). Genetic Algorithms + Data Structures = Evolution Programs, Springer-Verlag, ISBN 3-540-60676-9 • Good description of EA • History of ES • GA vs. ES

  5. Evolutionary Algorithms • Almost anything by David Fogel • Fogel, D. B. (1994) IEEE Transactions on Neural Networks, Vol 5:1, pp 3-14 • Fogel, D. B. (1998) Evolutionary Computation: The Fossil Record, IEEE Press, ISBN 0-7803-3481-7, pp 3-14 • Michalewicz, Z. and Fogel, D. (2000). How to Solve It: Modern Heuristics. Springer-Verlag, ISBN 3-540-66061-5

  6. Learning in Computer Programs • There are other ways that we could design computer programs so that they “learn” • For example, knowledge represented in some suitable logical symbolism • Use inference rules • Learning by remembering and forgetting • Using memory

  7. Learning in Computer Programs • We look at how evolutionary strategies (ES’s) can be used as evolving learning strategies • They are easily converted into optimisation tools

  8. Learning in Computer Programs • ES’s vs. EP • Do not confuse evolutionary strategies (ES’s) with evolutionary programming (EP) • EP is about writing programs that write programs

  9. Evolutionary Algorithms vs. GA’s • GA’s • Population of chromosomes • Reproduction operators (more details in GAs section) • Mutation • Crossover • Selection strategy • Generations

  10. Evolutionary Algorithms vs. GA’s • ES’s are algorithms that only use mutation and do not use crossover • This is not a formal definition and there is no reason why we cannot incorporate crossover (as Michalewicz, 1996 shows)

  11. Evolutionary Algorithms vs. GA’s • ES’s are normally applied to real numbers (continuous variables) rather than discrete values. • Again, this is not a strict definition and work has been done on using ES’s for discrete problems (Bäck, 1991) and (Herdy, 1991)

  12. Evolutionary Algorithms vs. GA’s • ES’s are a population based approach • Originally only a single solution was maintained and this was improved upon.

  13. Evolutionary Algorithms vs. GA’s • In summary ES’s are • Like genetic algorithms but only use mutation and not crossover • They operate on real numbers • They are a population based approach • But we can break any, or all, of these rules if we wish!

  14. Evolutionary Algorithms - How They Work • An individual in an ES is represented as a pair of real vectors, v = (x, σ) • x represents a point in the search space and consists of a number of real-valued variables • The second vector, σ, is a vector of standard deviations, which control how spread out the mutations applied to x are (see the sketch below)
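A minimal sketch of how such an individual might be represented in C++; the struct name and the use of std::vector are illustrative assumptions rather than anything prescribed in the lecture:

```cpp
#include <vector>

// One ES individual: a point x in the search space plus a vector of
// per-variable standard deviations sigma used as mutation step sizes.
struct Individual {
    std::vector<double> x;      // object variables (the candidate solution)
    std::vector<double> sigma;  // standard deviations, one per variable
};
```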

  15. Evolutionary Algorithms - How They Work • Mutation is performed by replacing xt with xt+1 = xt + N(0, σ) • N(0, σ) is a vector of random Gaussian numbers with a mean of zero and standard deviations σ • In other words, each variable is changed by adding random noise drawn from a normal distribution (see the sketch below)
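A hedged sketch of this mutation step using std::normal_distribution from the C++ standard library; the function name and per-variable σ follow the representation sketched above and are illustrative:

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Mutate each variable by adding noise drawn from N(0, sigma[i]).
std::vector<double> mutate(const std::vector<double>& x,
                           const std::vector<double>& sigma,
                           std::mt19937& rng) {
    std::vector<double> child(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) {
        std::normal_distribution<double> noise(0.0, sigma[i]);
        child[i] = x[i] + noise(rng);   // xt+1 = xt + N(0, sigma)
    }
    return child;
}
```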

  16. Evolutionary Algorithms - How They Work • This mimics the evolutionary process, in which small changes occur more often than large ones • An algorithm (in C++) that produces Gaussian random numbers is supplied in the handout; one common approach is sketched below
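The handout's code is not reproduced here; as an illustration, one common way to generate Gaussian random numbers in C++ is the Box-Muller transform (the function name is an assumption):

```cpp
#include <cmath>
#include <cstdlib>

// Box-Muller transform: turn two uniform random numbers in (0, 1] into one
// sample from N(0, sigma).
double gaussian(double sigma) {
    const double pi = 3.14159265358979323846;
    double u1 = (std::rand() + 1.0) / (RAND_MAX + 1.0);  // +1 avoids log(0)
    double u2 = (std::rand() + 1.0) / (RAND_MAX + 1.0);
    return sigma * std::sqrt(-2.0 * std::log(u1)) * std::cos(2.0 * pi * u2);
}
```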

  17. Evolutionary Algorithms - How They Work
  Set t = 0
  Create initial point xt = ⟨x1t, …, xnt⟩
  REPEAT
    Draw ni from a normal distribution for all i = 1, …, n
    yit = xit + ni
    New generation
    Set t = t + 1
  UNTIL (stopping condition)

  18. Evolutionary Algorithms - How They Work • In the earliest ES’s (where only a single solution was maintained), the new individual replaced its parent if it had a higher fitness • This is the two-membered evolution scheme: the two individuals compete and the survivor becomes the new parent • A minimal sketch of this (1 + 1) loop is given below
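A minimal, self-contained sketch of this (1 + 1) scheme with a constant σ, minimising a hypothetical sphere function f(x) = Σ xi²; the test function, σ value, starting point and iteration budget are all illustrative assumptions:

```cpp
#include <random>
#include <vector>

// Fitness: lower is better for this illustrative test function.
double f(const std::vector<double>& x) {
    double s = 0.0;
    for (double xi : x) s += xi * xi;
    return s;
}

int main() {
    std::mt19937 rng(42);
    std::normal_distribution<double> noise(0.0, 1.0);
    const double sigma = 0.1;                  // constant step size, as in early ES's
    std::vector<double> parent = {5.0, -3.0};  // initial point

    for (int t = 0; t < 10000; ++t) {          // stopping condition: fixed budget
        std::vector<double> child = parent;
        for (double& xi : child) xi += sigma * noise(rng);  // x + N(0, sigma)
        if (f(child) <= f(parent)) parent = child;  // survivor becomes the new parent
    }
    return 0;
}
```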

  19. Evolutionary Algorithms - How They Work • In addition, these early ES’s maintained the same value for σ throughout the duration of the algorithm • It has been proven that if this vector remains constant throughout the run then the algorithm will converge to the optimal solution

  20. Evolutionary Algorithms - How They Work • Problem • Although the theorem proves that the global optimum is found with probability one, it only holds given a sufficiently long search time • The theorem tells us nothing about how long that search time might be

  21. Evolutionary Algorithms - How They Work • To try and speed up convergence, Rechenberg proposed the “1/5 success rule”. It can be stated as follows: the ratio, φ, of successful mutations to all mutations should be 1/5. Increase the variance of the mutation operator if φ is greater than 1/5; otherwise, decrease it

  22. Evolutionary Algorithms - How They Work • Motivation behind the 1/5 rule • If we are finding lots of successful moves then we should try larger steps in order to improve the efficiency of the search • If we are not finding many successful moves then we should proceed in smaller steps

  23. Evolutionary Algorithms - How They Work • The 1/5 rule is applied as follows: if φ(k) < 1/5 then σ = σ · cd; if φ(k) > 1/5 then σ = σ · ci; if φ(k) = 1/5 then σ = σ

  24. Evolutionary Algorithms - How They Work • if φ(k) < 1/5 then σ = σ · cd; if φ(k) > 1/5 then σ = σ · ci; if φ(k) = 1/5 then σ = σ • k dictates how many generations should elapse before the rule is applied • cd and ci determine the rate of decrease or increase for σ • ci must be greater than one and cd must be less than one • Schwefel (1981) used cd = 0.82 and ci = 1.22 (≈ 1/0.82) • A sketch of this update is given below
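A hedged sketch of the σ update, applied once every k generations using counts of successful and total mutations; Schwefel's constants are used, while the function name and counting convention are assumptions:

```cpp
// Apply the 1/5 success rule: shrink sigma when successes are rare,
// grow it when they are common, leave it alone at exactly 1/5.
double adapt_sigma(double sigma, int successes, int trials) {
    const double cd = 0.82, ci = 1.22;  // Schwefel (1981)
    const double phi = static_cast<double>(successes) / trials;
    if (phi < 0.2) return sigma * cd;   // too few successes: smaller steps
    if (phi > 0.2) return sigma * ci;   // many successes: larger steps
    return sigma;
}
```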

  25. Evolutionary Algorithms - How They Work • Problem with applying the 1/5 rule • It may lead to premature convergence for some problems • One remedy is to increase the population size, which turns ES’s into a population-based search mechanism

  26. Evolutionary Algorithms - How They Work • Increase population size • The population size is now (obviously) > 1. • All members of the population have an equal probability of mating - compare with GA’s • We could now introduce the possibility of crossover

  27. Evolutionary Algorithms - How They Work • Increase population size • As we have more than one individual we have the opportunity to alter σ independently for each member • We have more options with regards to how we control the population (discussed next)

  28. Evolutionary Algorithms - How They Work • In evolutionary computation there are two variations as to how we create the new generation

  29. Evolutionary Algorithms - How They Work • (μ + λ) uses μ parents and creates λ offspring • After mutation, there will be μ + λ members in the population • All these solutions compete for survival, with the μ best selected as parents for the next generation

  30. Evolutionary Algorithms - How They Work • (μ, λ) works by the μ parents producing λ offspring (where λ > μ) • Only the λ offspring compete for survival. Thus, the parents are completely replaced at each new generation • Or, to put it another way, a single solution only has a life span of a single generation • Both schemes are contrasted in the sketch below
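A hedged sketch contrasting the two survivor-selection schemes, assuming the Individual struct sketched earlier and a "lower is better" fitness function; the signature and names are illustrative:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

std::vector<Individual> select_survivors(const std::vector<Individual>& parents,
                                         const std::vector<Individual>& offspring,
                                         std::size_t mu, bool plus_strategy,
                                         double (*fitness)(const Individual&)) {
    std::vector<Individual> pool = offspring;
    if (plus_strategy)  // (mu + lambda): parents also compete for survival
        pool.insert(pool.end(), parents.begin(), parents.end());
    // (mu, lambda): only the offspring compete; parents live one generation
    std::sort(pool.begin(), pool.end(),
              [&](const Individual& a, const Individual& b) {
                  return fitness(a) < fitness(b);
              });
    pool.resize(mu);    // keep the mu best as the next generation's parents
    return pool;
}
```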

  31. Evolutionary Algorithms - How They Work • The original work on evolution strategies (Schwefel, 1965) used a (1 + 1) strategy • This took a single parent and produced a single offspring • Both these solutions competed to survive to the next generation

  32. Evolutionary Algorithms - Case Study • Develop an agent that • knows how to play poker • learns to adapt its play when faced with different playing styles

  33. Evolutionary Algorithms - Case Study • Barone (1999) • Calculate the probability x of winning at any position • By enumerating all possible hands that can be held by the opponents

  34. Evolutionary Algorithms - Case Study • Barone (1999) • Learning how to play poker • fold(x) = exp(-eval(b) * (x - a)) • call(x) = eval(c) * exp(-eval(b)² * (x - a)²) • raise(x) = exp(eval(b) * (x + a - 1.0)) • a, b, c: constants that shape the functions, which need to be learnt (see the illustrative sketch below)
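An illustrative C++ rendering of these three curves, treating the evolved constants a, b and c directly as real numbers (Barone's eval() mapping is not reproduced here); this sketches the shape of the model rather than the paper's exact implementation:

```cpp
#include <cmath>

// Betting curves as functions of x, the estimated probability of winning.
struct BettingModel {
    double a, b, c;  // constants to be learnt by the ES
    double fold(double x)  const { return std::exp(-b * (x - a)); }
    double call(double x)  const { return c * std::exp(-(b * b) * (x - a) * (x - a)); }
    double raise(double x) const { return std::exp(b * (x + a - 1.0)); }
};
```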

  35. Evolutionary Algorithms - Case Study • Barone (1999) • (1+1) strategy • Mean of zero • Pre-defined standard deviation • Using the above model • Evolve a poker player • Adapt to the playing styles of four different players

  36. Evolutionary Algorithms - Case Study • Simplified Blackjack • Blackjack is a two-player game comprising a dealer and a player. The dealer deals two cards (from a normal pack of 52 cards) to the player and one card to himself. All cards are dealt face up. All cards take their face value, except Jack, Queen and King, which count as 10, and Aces, which can count as one or eleven

  37. Evolutionary Algorithms - Case Study • Simplified Blackjack • The aim for the player is to draw as many cards as he/she wishes (zero if they wish) in order to get as close as possible to 21 without exceeding it • Other concepts exist (e.g. doubling down)

  38. Evolutionary Algorithms - Case Study • Simplified Blackjack • Known good strategies • Thorp (1966) – basic strategy • What to do for every possible pair of cards • Based on what the dealer’s “up card” is • As more cards are drawn, what to do when approaching 21

  39. Evolutionary Algorithms - Case Study • Simplified Blackjack • How might we write an agent that learns how to play blackjack? • Specify the rules exactly • Make it able to learn a new set of rules • Without re-writing the program

  40. Evolutionary Algorithms - Case Study • Simplified Blackjack • Identify each possible situation • How many potential hands we may have • Taking into account doubling down • Assign each of these hands the same initial probability of what to do • The agent learns the probabilities by playing many (millions of) hands • One way this might be encoded is sketched below
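One hedged way such an agent's strategy table might be encoded and mutated; the situation encoding, table layout and clamping of probabilities to [0, 1] are all illustrative assumptions:

```cpp
#include <algorithm>
#include <random>
#include <vector>

// One candidate strategy: a probability of drawing another card for each
// game situation (an assumed encoding of player hand vs. dealer up card).
struct BlackjackStrategy {
    std::vector<double> p_hit;
};

// Gaussian mutation of every probability, clipped back into [0, 1].
BlackjackStrategy mutate_strategy(const BlackjackStrategy& s, double sigma,
                                  std::mt19937& rng) {
    std::normal_distribution<double> noise(0.0, sigma);
    BlackjackStrategy child = s;
    for (double& p : child.p_hit)
        p = std::clamp(p + noise(rng), 0.0, 1.0);
    return child;
}
```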

  41. Evolutionary Algorithms - Your Go • Questions • Do you think this would work? • Should we use a single candidate for each probability or should we have a population greater than one? • What sort of evolutionary scheme should we use, (μ + λ) or (μ, λ), and what values should we give μ and λ? • Can you come up with a better representation, other than trying to learn probabilities?

  42. Evolutionary Algorithms - Finally • Evolutionary algorithms can be used as search methods as well as learning mechanisms • It just needs saying!

  43. Summary • Learning in computer programs • Evolutionary strategies • Knowledge-based systems • ES’s vs. GA’s • Mutation and crossover • Real and discrete variables • Probability of selecting parents

  44. Summary • How ES’s work • v = (x, σ) • xt+1 = xt + N(0, σ) • 1/5 success rule • Population: (μ + λ), (μ, λ) • Case study • Poker • Blackjack

  45. G5BAIM Artificial Intelligence Methods End of Evolutionary Algorithms Dr. Rong Qu
