
Chapter 17


Presentation Transcript


  1. Chapter 17 Markov Processes – Part 1

  2. Markov Processes • Markov process models are useful in studying the evolution of systems over repeated trials or sequential time periods or stages. • Examples: • Brand Loyalty • Equipment performance • Stock performance

  3. Markov Processes • When utilized, they give the probability of switching from one state to another in a given time period • Examples: • The probability that a person buying Colgate this period will purchase Crest next period • The probability that a machine that is working properly this period will break down the next period

  4. Markov Processes • A Markov system (or Markov process or Markov chain) is a system that can be in one of several (numbered) states, and can pass from one state to another each time step according to fixed probabilities. • If a Markov system is in state i, there is a fixed probability, pij, of it going into state j the next time step, and pij is called a transition probability.
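As a concrete sketch of this definition (the two states and the probabilities below are made up for illustration, not taken from the slides), the fixed transition probabilities pij can be stored per state and used to sample the next state at each time step:

    import random

    # Hypothetical transition probabilities p_ij for a two-state system.
    P = {
        1: {1: 0.80, 2: 0.20},
        2: {1: 0.35, 2: 0.65},
    }

    def next_state(current):
        # Sample the next state using the fixed probabilities p_ij for the current state.
        states = list(P[current])
        weights = [P[current][s] for s in states]
        return random.choices(states, weights=weights, k=1)[0]

    state = 1
    for _ in range(5):
        state = next_state(state)   # each time step depends only on the current state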

  5. Markov Processes • A Markov system can be illustrated by means of a state transition diagram, which shows all the states and the transition probabilities (the probabilities of switching from one state to another).

  6. Transition Diagram [State transition diagram showing three numbered states (1, 2, 3) and the transition probabilities between them] What does the diagram mean?

  7. Transition Matrix • The matrix P whose ijth entry is pij is called the transition matrix associated with the system. • The entries in each row add up to 1. • Thus, for instance, a 2 × 2 transition matrix P would be set up as shown at the right, with rows indexed by the "From" state and columns by the "To" state.
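A minimal sketch of such a matrix in Python with NumPy (the numbers are assumed for illustration): rows are indexed by the "From" state, columns by the "To" state, and each row must sum to 1.

    import numpy as np

    # Rows = "From" state, columns = "To" state; the values are assumed for illustration.
    P = np.array([[0.80, 0.20],
                  [0.35, 0.65]])

    assert np.allclose(P.sum(axis=1), 1.0)   # entries in each row add up to 1
    assert np.all(P >= 0)                    # all transition probabilities are nonnegative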

  8. Diagram & Matrix [The three-state transition diagram shown alongside its transition matrix, with rows labeled "From" and columns labeled "To"]

  9. Vectors & Transition Matrix • A probability vector is a row vector in which the entries are nonnegative and add up to 1. • The entries in a probability vector can represent the probabilities of finding a system in each of the states.
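For instance (values assumed for illustration), a probability vector over two states can be checked for these two properties:

    import numpy as np

    pi = np.array([0.4, 0.6])   # assumed probabilities of finding the system in states 1 and 2
    assert np.all(pi >= 0) and np.isclose(pi.sum(), 1.0)   # nonnegative entries that add up to 1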

  10. Probability Vector • Let P =

  11. State Probabilities • The state probabilities at any stage of the process can be calculated recursively by multiplying the state probabilities at the previous stage by the transition matrix P.

  12. State Probabilities

  13. State Probabilities • Example: • π(n) = [π1(n) π2(n)] • π(1) = π(0) P • π(2) = π(1) P • π(3) = π(2) P • π(n+1) = π(n) P
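A short sketch of this recursion in Python (the transition matrix and initial vector π(0) are assumed for illustration):

    import numpy as np

    P = np.array([[0.80, 0.20],    # assumed transition matrix
                  [0.35, 0.65]])
    pi = np.array([1.0, 0.0])      # assumed initial state probabilities pi(0)

    for n in range(5):
        pi = pi @ P                # pi(n+1) = pi(n) P
        print(n + 1, pi)           # state probabilities at stage n+1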

  14. Steady State Probabilities • The probabilities that we approach after a large number of transitions are referred to as steady state probabilities. • As n gets large, the state probabilities at the (n+1)th period are very close to those at the nth period.
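One way to see this numerically is to keep applying π(n+1) = π(n) P until the vector stops changing; a sketch with assumed values:

    import numpy as np

    P = np.array([[0.80, 0.20],    # assumed transition matrix
                  [0.35, 0.65]])
    pi = np.array([0.5, 0.5])      # any starting probability vector

    while True:
        new_pi = pi @ P                            # pi(n+1) = pi(n) P
        if np.allclose(new_pi, pi, atol=1e-12):    # (n+1)th period is very close to nth period
            break
        pi = new_pi
    print(pi)                      # approximate steady state probabilities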

  15. Steady State Probabilities • Knowing this, we can compute steady state probabilities without having to carry out a large number of calculations. With π(n) = [π1(n) π2(n)]: [π1(n+1) π2(n+1)] = [π1(n) π2(n)] [ p11 p12 ; p21 p22 ]
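Equivalently, the steady state vector can be computed directly by solving π P = π together with π1 + π2 = 1 as a small linear system; a sketch with the same assumed matrix:

    import numpy as np

    P = np.array([[0.80, 0.20],    # assumed transition matrix
                  [0.35, 0.65]])
    n = P.shape[0]

    # Stack the equations (P^T - I) pi^T = 0 with the normalization sum(pi) = 1
    # and solve the resulting over-determined system by least squares.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)                      # steady state probabilities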

  16. Example • Henry, a persistent salesman, calls North's Hardware Store once a week hoping to speak with the store's buying agent, Shirley. If Shirley does not accept Henry's call this week, the probability she will do the same next week (and not accept his call) is .35. On the other hand, if she accepts Henry's call this week, the probability she will not accept his call next week is .20.

  17. Example: Transition Matrix
                                 Next Week's Call
                                 Refuses    Accepts
      This Week's Call  Refuses    .35        .65
                        Accepts    .20        .80

  18. Example • How many times per year can Henry expect to talk to Shirley? • Answer: To find the expected number of accepted calls per year, find the long-run proportion (probability) of a call being accepted and multiply it by 52 weeks.

  19. Example Let π1 = long-run proportion of refused calls and π2 = long-run proportion of accepted calls. Then: [π1 π2] [ .35 .65 ; .20 .80 ] = [π1 π2]

  20. Example .35π1 + .20π2 = π1 (1) .65π1 + .80π2 = π2 (2) π1 + π2 = 1 (3) Solve for π1 and π2.

  21. Example From (1), .20π2 = .65π1, so π2 = 3.25π1. Substituting into (3): π1 + 3.25π1 = 1, which gives π1 = 1/4.25 ≈ .2353 and π2 ≈ .7647. Henry can therefore expect to talk to Shirley about .7647 × 52 ≈ 40 times per year.
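As a numerical check of this result, a short sketch that solves the same three equations with NumPy (the values come from the transition matrix above):

    import numpy as np

    # Transition matrix from the example; state order is (refuses, accepts).
    P = np.array([[0.35, 0.65],
                  [0.20, 0.80]])

    # Solve [pi1 pi2] P = [pi1 pi2] together with pi1 + pi2 = 1.
    A = np.vstack([P.T - np.eye(2), np.ones(2)])
    b = np.array([0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    print(pi)            # approximately [0.2353, 0.7647]
    print(52 * pi[1])    # roughly 40 accepted calls per year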

  22. The probability of the system being in a particular state after a large number of stages is called a steady-state probability.
