
STAT131 W12L1b Markov Chains 2


Presentation Transcript


  1. STAT131 W12L1b Markov Chains 2 by Anne Porter alp@uow.edu.au

  2. Definition: Markov Chain A Markov Chain or Markov Process exists if the following conditions are satisfied: • There is a finite number of ‘states’ of the experimental system, and the system is in exactly one of these states after each repetition of the experiment. The different states are denoted by E1, E2, …, En, and each repetition of the experiment must result in one of these states. • The state of the process after a repetition of the experiment depends (probabilistically) only on the state of the process immediately after the previous experiment, not on the states after earlier experiments. That is, the process has no memory of the past beyond the previous experiment.
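
To make the definition concrete, here is a minimal simulation sketch in Python (not from the original slides). The state labels E1 and E2 are placeholders, and the probabilities happen to match the weather example that follows; the point is that the next state is drawn using only the current state, which is exactly the "no memory" condition.

```python
import random

# Placeholder two-state chain: row for state s gives P(next state | current = s).
transition = {
    "E1": {"E1": 0.7, "E2": 0.3},
    "E2": {"E1": 0.4, "E2": 0.6},
}

def step(current):
    # The next state depends only on `current`, never on earlier states:
    # this is the Markov ("no memory") property.
    probs = transition[current]
    return random.choices(list(probs), weights=list(probs.values()))[0]

state = "E1"
for _ in range(5):
    state = step(state)
    print(state)
```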

  3. Example 1: Markov Chain Source: Griffiths Weather example (Additional notes) • Starting on a Wednesday the weather has an initial probability of 0.8 of being 'fine' and 0.2 of being 'rain'. If the weather is fine on any day then the conditional probability that it will be fine the next day is 0.7, whereas if it rains on one day the conditional probability of it being fine the next is 0.4. (1) What are the states of this system? S = {fine, rain}

  4. To describe a Markov chain, two sets of probabilities must be known: • the initial probability vector, and • the transition probability matrix.

  5. Initial probability vector • The initial probability vector p0 describes the initial state (S) of the process: p0 = [P(initial S is E1), P(initial S is E2), …, P(initial S is En)] • If the initial state is known, the initial vector will have one of the probabilities equal to 1 and the rest equal to 0.

  6. Example 1: Markov Chain Source: Griffiths Weather example (Additional notes) • Starting on a Wednesday the weather has an initial probability of 0.8 of being 'fine' and 0.2 of 'rain'. If the weather is fine on any day then the conditional probability that it will be fine the next day is 0.7, whereas if it rains on one day the conditional probability of it being fine the next is 0.4. (1) What is the initial probability vector to start? [P(fine) P(rain)] = [0.8 0.2]
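
As a quick sketch (using NumPy, an assumption of this note), the initial vector is just an array whose entries sum to 1:

```python
import numpy as np

# Initial probability vector for Wednesday: [P(fine), P(rain)].
p0 = np.array([0.8, 0.2])
assert np.isclose(p0.sum(), 1.0)  # probabilities must sum to 1
```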

  7. Transition probability matrix • The (conditional) probability that the process moves from state i to state j is called a (one-step) transition probability and is denoted by pij; that is, pij = P(Ej next | Ei before). • It is usual to display the values in an m (rows) × m (columns) matrix, that is, a square matrix.

  8. Transition probability matrix Rows are indexed by the state before the experiment (1, 2, …, m) and columns by the state after it (1, 2, …, m); the entry in row i, column j is pij = P(Ej next | Ei before).
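
A sketch of this structure in code (NumPy assumed): because each "before" state must lead to some "after" state, every row of a valid transition matrix sums to 1, which is easy to check.

```python
import numpy as np

def is_transition_matrix(P, tol=1e-9):
    """Check that P is square, non-negative, and row-stochastic."""
    P = np.asarray(P, dtype=float)
    return (
        P.ndim == 2
        and P.shape[0] == P.shape[1]                    # square: m x m
        and np.all(P >= 0)                              # entries are probabilities
        and np.allclose(P.sum(axis=1), 1.0, atol=tol)   # each row sums to 1
    )
```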

  9. Example 1: Markov Chain Source: Griffiths Weather example (Additional notes) • Starting on a Wednesday the weather has an initial probability of 0.8 of being 'fine' and 0.2 of 'rain'. If the weather is fine on any day then the conditional probability that it will be fine the next day is 0.7, whereas if it rains on one day the conditional probability of it being fine the next is 0.4. (1) What is the transition matrix, with entries P(end | start)? With rows indexed by the starting state (fine, rain) and columns by the end state (fine, rain):
[0.7 0.3]
[0.4 0.6]
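
In code (a sketch; NumPy assumed), the weather transition matrix is:

```python
import numpy as np

# Rows: today's weather (fine, rain); columns: tomorrow's (fine, rain).
P = np.array([[0.7, 0.3],   # P(fine|fine), P(rain|fine)
              [0.4, 0.6]])  # P(fine|rain), P(rain|rain)
assert np.allclose(P.sum(axis=1), 1.0)  # each row sums to 1
```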

  10. Example 1: Markov Chain Source: Griffiths Weather example (Additional notes) • Starting on a Wednesday the weather has an initial probability of 0.8 of being 'fine' and 0.2 of 'rain'. If the weather is fine on any day then the conditional probability that it will be fine the next day is 0.7, whereas if it rains on one day the conditional probability of it being fine the next is 0.4. • This can be represented in matrix notation (we previously did it as a tree diagram). To do this we use the Law of Total Probability.

  11. Probability: Using Tree Diagrams (Wednesday → Thursday)
• P(fine) = 0.8, then P(fine|fine) = 0.7, so P(fine and fine) = 0.8 × 0.7 = 0.56
• P(fine) = 0.8, then P(rain|fine) = 0.3, so P(fine and rain) = 0.8 × 0.3 = 0.24
• P(rain) = 0.2, then P(fine|rain) = 0.4, so P(rain and fine) = 0.2 × 0.4 = 0.08
• P(rain) = 0.2, then P(rain|rain) = 0.6, so P(rain and rain) = 0.2 × 0.6 = 0.12
Adding the branches that end fine: P(fineT) = 0.56 + 0.08 = 0.64; adding those that end rain: P(rainT) = 0.24 + 0.12 = 0.36
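
The same tree can be enumerated in a few lines (a sketch mirroring the diagram above):

```python
# Branch probabilities from the tree: P(Wednesday) * P(Thursday | Wednesday).
p_wed = {"fine": 0.8, "rain": 0.2}
p_cond = {"fine": {"fine": 0.7, "rain": 0.3},
          "rain": {"fine": 0.4, "rain": 0.6}}

p_thu = {"fine": 0.0, "rain": 0.0}
for wed, pw in p_wed.items():
    for thu, pt in p_cond[wed].items():
        p_thu[thu] += pw * pt   # multiply along the branch, then add

print(p_thu)  # fine ≈ 0.64, rain ≈ 0.36 (up to floating-point rounding)
```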

  12. Probability: Law of Total Probability • What is the probability that it will be fine on Thursday? Wet on Thursday? Represented in matrix form: PB = PA · PB|A, where PB = [P(B) P(not B)], PA = [P(A1) P(A2) … P(Am)], and PB|A is the matrix of conditional probabilities whose row i holds P(each end state | Ai).

  13. Probability: Law of Total Probability • What is the probability that it will be fine on Thursday? Wet on Thursday? • Initial probability vector (fine, rain): [0.8 0.2] • Transition matrix (rows: start fine, rain; columns: end fine, rain):
[0.7 0.3]
[0.4 0.6]

  14. Probability: Law of Total Probability Represented in matrix form: PB = PA · PB|A. • What is the probability that it will be fine on Thursday? Wet on Thursday? • Initial probability vector × transition matrix:
[P(fineT) P(rainT)] = [0.8 0.2] ×
[0.7 0.3]
[0.4 0.6]
= [0.64 0.36]
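
The same one-step computation as a sketch with NumPy's vector-matrix product:

```python
import numpy as np

p0 = np.array([0.8, 0.2])        # Wednesday: [P(fine), P(rain)]
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])       # transition matrix
p_thu = p0 @ P                   # law of total probability in matrix form
print(p_thu)                     # ≈ [0.64 0.36]
```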

  15. Now predict P(fine) and P(rain) on Friday • What was the probability of fine and rain on Thursday? [0.64 0.36] • What is the initial probability vector starting on Thursday? [0.64 0.36] • What else do we need? The transition matrix. So
[P(fineF) P(rainF)] = [0.64 0.36] ×
[0.7 0.3]
[0.4 0.6]

  16. Now predict P(fine) and P(rain) on Friday
[P(fineF) P(rainF)] = [0.64 0.36] ×
[0.7 0.3]
[0.4 0.6]
The size of the result will be 1 × 2, that is, [P(fineF) P(rainF)]. Its first entry:
[P(fineF) P(rainF)] = [0.64 × 0.7 + 0.36 × 0.4   …]

  17. Now predict P(fine) and P(rain) on Friday
[P(fineF) P(rainF)] = [0.64 0.36] ×
[0.7 0.3]
[0.4 0.6]
The size of the result will be 1 × 2, that is, [P(fineF) P(rainF)]:
[P(fineF) P(rainF)] = [0.64 × 0.7 + 0.36 × 0.4   0.64 × 0.3 + 0.36 × 0.6] = [0.592 0.408]
The sum of these two values, P(fineF) and P(rainF), should equal 1.
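
Continuing the sketch, the Friday forecast is one more vector-matrix product, and the result still sums to 1:

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
p_thu = np.array([0.64, 0.36])   # Thursday, from the previous step
p_fri = p_thu @ P                # [0.64*0.7 + 0.36*0.4, 0.64*0.3 + 0.36*0.6]
print(p_fri)                     # ≈ [0.592 0.408]
assert np.isclose(p_fri.sum(), 1.0)
```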

  18. Probability: n-step transition Writing P for the transition matrix:
[P(fineT) P(rainT)] = [0.8 0.2] P = [0.64 0.36]
[P(fineF) P(rainF)] = [0.64 0.36] P = [0.8 0.2] P² = [0.592 0.408]
In general, the state probabilities after n steps are the initial vector multiplied by the n-th power of P.
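
This n-step pattern can be checked with a matrix power (a short sketch using numpy.linalg.matrix_power):

```python
import numpy as np
from numpy.linalg import matrix_power

p0 = np.array([0.8, 0.2])
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

p_thu = p0 @ matrix_power(P, 1)   # one step:  ≈ [0.64  0.36]
p_fri = p0 @ matrix_power(P, 2)   # two steps: ≈ [0.592 0.408]
print(p_thu, p_fri)
```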

  19. Probability: Using Tree Diagrams (Wednesday → Thursday → Friday) [Tree diagram: starting from P(fine) = 0.8 and P(rain) = 0.2 on Wednesday, each day branches with P(fine|fine) = 0.7, P(rain|fine) = 0.3, P(fine|rain) = 0.4, P(rain|rain) = 0.6.] And we would multiply through each branch, then add all the branch probabilities ending in fine, and then those ending in rain, to get P(fineF) and P(rainF).

  20. Example 2: Markov Chain Rules for Snakes and Ladders • If you land on the bottom of a ladder, you automatically go to its top • If you land on a snake's head, you automatically slide down to its tail • You must land exactly on square 7 to finish; if your move would take you beyond square 7, then you cannot take the move, so you remain on the same square. • (1) What is the state space? S = {0, 1, 3, 5, 7}

  21. Example 2: Markov Chain Rules • If you land on the bottom of a ladder, you automatically go to its top • If you land on a snake's head, you automatically slide down to its tail • You must land exactly on square 7 to finish; if your move would take you beyond square 7, then you cannot take the move, so you remain on the same square. • (2) If we start on 0 in snakes and ladders, what is the initial vector? States: 0 1 3 5 7 → [1 0 0 0 0]

  22. Example 2: Markov Chains Rules for Snakes and Ladders • If you land on the bottom of a ladder, you automatically go to its top • If you land on a snake's head, you automatically slide down to its tail • You must land exactly on square 7 to finish; if your move would take you beyond square 7, then you cannot take the move, so you remain on the same square. • (3) Represent the conditional probabilities of the end states given the starting states.
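
Because the board layout and the spinner are in the course materials rather than on these slides, the following is only a generic sketch: the `jumps` map (ladder and snake redirections) and the `moves` distribution below are hypothetical placeholders, not the actual game. It shows how the three rules above turn into a transition matrix, row by row.

```python
import numpy as np

states = [0, 1, 3, 5, 7]                 # state space from slide 20
index = {s: i for i, s in enumerate(states)}

# HYPOTHETICAL board: every square outside the state space redirects,
# e.g. a ladder 2 -> 5 and snakes 4 -> 1, 6 -> 3. HYPOTHETICAL moves:
# advance 1 or 2 squares, each with probability 0.5. Replace with the real rules.
jumps = {2: 5, 4: 1, 6: 3}
moves = {1: 0.5, 2: 0.5}

P = np.zeros((len(states), len(states)))
for s in states:
    for move, prob in moves.items():
        target = s + move
        if target > 7:                       # would overshoot square 7: stay put
            target = s
        target = jumps.get(target, target)   # follow a ladder/snake if landed on one
        P[index[s], index[target]] += prob

# Square 7 is absorbing: every move overshoots, so its row is [0 0 0 0 1].
assert np.allclose(P.sum(axis=1), 1.0)       # each row sums to 1
```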

  23. Example 2: Markov Chains • Transition Matrix - homework

  24. We will continue...
