
Markov Processes System Change Over Time


Presentation Transcript


  1. Scene from the television series "The Time Tunnel" (1960s). Data Mining and Forecast Management, MGMT E-5070. Markov Processes: System Change Over Time.

  2. Markov Process Models
  • Also known as Markov chains.
  • Analyze how systems change over time.
  • Common applications include:
    • consumer brand loyalty tendencies
    • consumer brand-switching tendencies
    • reliability analysis for equipment
    • the aging / write-off of accounts receivable
    • the spoilage tendencies of perishable items over time

  3. Andrei Andreyevich Markov (1856 - 1922)
  • Ph.D., St. Petersburg University (1884); studied under Pafnuty Chebyshev.
  • Professor, St. Petersburg University (1886-1905), but continued teaching informally until 1922.
  • Early work in number theory, algebraic continued fractions, limits of integrals, and the method of least squares.
  • Launched the theory of stochastic processes (Markov chains): an entirely new branch of probability theory.
  Андрей Андреевич Марков

  4. Markov Process Models - THE SCOPE OF STUDY
  We will consider only the simplest types, which are:
  • discrete
  • finite
  • stationary
  • first-order

  5. Markov Process Models - DISCRETE
  The states and transitions are "discrete." This means, for example, that market share among different stores can change only once per week or once per month, not second-to-second.

  6. Markov Process Models - FINITE
  The number of states is "finite." This means, for example, that an account receivable can age only 3 months before it is written off as a bad debt, or that once a bottle of wine spoils, there are no additional aging periods for it.

  7. Markov Process Models - STATIONARY
  Transition probabilities remain constant over time. For example, the monthly defection rates of customers from a local supermarket to its competitors always stay the same.

  8. Markov Process Models - FIRST-ORDER PROCESSES
  Transitions depend only on the current state, not on prior states. For example, the chance that an account receivable is paid off depends only on how old it currently is (the chances are higher at one month old than at two months old), not on its earlier history.

  9. Markov Process Models - BASIC EXAMPLE
  Two barbers in a small town each have a 50% market share: 50% for barber "A" and 50% for barber "B." Therefore, the vector of state probabilities for period number one is:
    π(1) = ( 0.5 , 0.5 )

  10. Markov Process Models - BASIC EXAMPLE
  The matrix of transition probabilities per month is:
        P = [ .90  .10 ]
            [ .25  .75 ]
  Row 1: loyalty probability (.90) and defection probability (.10) for barber "A" in any given period.
  Row 2: defection probability (.25) and loyalty probability (.75) for barber "B" in any given period.
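
  As a quick aside (not from the original slides), a transition matrix like this can be stored and sanity-checked in a few lines of code. A minimal sketch, assuming Python with NumPy:

```python
import numpy as np

# Monthly transition matrix from slide 10: rows are the barber a customer
# uses this month, columns the barber used next month.
P = np.array([[0.90, 0.10],    # barber A: .90 loyalty, .10 defection to B
              [0.25, 0.75]])   # barber B: .25 defection to A, .75 loyalty

# Every row of a valid transition matrix must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
```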

  11. Markov Process Models - BASIC EXAMPLE
  The market share each barber has in the 2nd month is the product of the vector of state probabilities in period (month) 1 and the monthly matrix of transition probabilities:
  π(2) = π(1) x P
                    [ .90  .10 ]
  ( 0.5 , 0.5 )  x  [ .25  .75 ]  =  ( .45 + .125 , .05 + .375 )  =  ( 0.575 , 0.425 )
  Barber 'A': .575     Barber 'B': .425

  12. Markov Process Models - BASIC EXAMPLE
  The market share each barber has in the 3rd month is the product of the vector of state probabilities in period (month) 2 and the monthly matrix of transition probabilities:
  π(3) = π(2) x P
                        [ .90  .10 ]
  ( 0.575 , 0.425 )  x  [ .25  .75 ]  =  ( .5175 + .10625 , .0575 + .31875 )  =  ( .62375 , .37625 )  ≈  ( 0.62 , 0.38 )
  Barber 'A': .62375     Barber 'B': .37625

  13. Markov Process Models - BASIC EXAMPLE
  The market share each barber has in the 4th month is the product of the vector of state probabilities in period (month) 3 and the monthly matrix of transition probabilities:
  π(4) = π(3) x P
                      [ .90  .10 ]
  ( 0.62 , 0.38 )  x  [ .25  .75 ]  =  ( .558 + .095 , .062 + .285 )  =  ( 0.653 , 0.347 )
  Barber 'A': .653     Barber 'B': .347
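
  These period-by-period multiplications are easy to script. The following is a minimal sketch, assuming Python with NumPy (neither appears in the original slides); it reproduces the vectors on slides 11-13.

```python
import numpy as np

P = np.array([[0.90, 0.10],
              [0.25, 0.75]])        # monthly matrix of transition probabilities
pi = np.array([0.50, 0.50])         # pi(1): initial market shares for barbers A and B

for month in range(2, 5):           # compute pi(2), pi(3), pi(4)
    pi = pi @ P                     # row vector times transition matrix
    print(f"pi({month}) = {pi}")

# pi(2) = (0.575, 0.425); pi(3) ≈ (0.624, 0.376); pi(4) ≈ (0.655, 0.345).
# Slide 13 shows (0.653, 0.347) because it rounds pi(3) to (0.62, 0.38) before multiplying.
```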

  14. Steady-State Probabilities
  Reached when the before and after state probabilities stay the same forever, assuming no changes in the matrix of transition probabilities. Here, that is the eventual market share in the "barber" problem.
  Also known as the equilibrium or steady-state solution.

  15. Equilibrium Condition - BARBER EXAMPLE
  • The eventual market shares of the two barbers can be calculated directly from the matrix of transition probabilities.
  • Prior-period vectors of state probabilities are not required.
  • The equations:
      .90 π1 + .25 π2 = π1
      .10 π1 + .75 π2 = π2
    where the matrix of transition probabilities is
      [ .90  .10 ]
      [ .25  .75 ]

  16. Equilibrium Condition - BARBER EXAMPLE
  Since π1 + π2 = 1.0, π1 = 1.0 - π2. Therefore, "1.0 - π2" may be substituted for π1 in either of the two equations below:
      .90 ( 1 - π2 ) + .25 π2 = 1 - π2
  or
      .10 ( 1 - π2 ) + .75 π2 = π2

  17. Equilibrium Condition - BARBER EXAMPLE
  Substituting in the 1st equation, we get:
      .90 ( 1 - π2 ) + .25 π2 = 1 - π2
      .90 - .90 π2 + .25 π2 = 1 - π2
      - .10 = - .35 π2
      π2 = .2857  and  π1 = ( 1 - .2857 ) = .7143
  Eventually barber 'A' will have 71% of the market while barber 'B' will have the remaining 29%.

  18. Equilibrium Condition - BARBER EXAMPLE
  Substituting in the 2nd equation, we get the same result:
      .10 ( 1 - π2 ) + .75 π2 = π2
      .10 - .10 π2 + .75 π2 = π2
      .10 = .10 π2 - .75 π2 + 1.0 π2
      .10 = .35 π2
      π2 = .2857
      π1 = ( 1 - .2857 ) = .7143
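
  The same hand algebra can also be delegated to a solver. Below is a minimal sketch, assuming Python with NumPy (not the QM for Windows or Excel tools used later in the slides); it stacks the balance equations π x P = π with the normalizing equation π1 + π2 = 1 and solves by least squares, since one balance equation is redundant.

```python
import numpy as np

P = np.array([[0.90, 0.10],
              [0.25, 0.75]])

# Steady state: pi @ P = pi and pi1 + pi2 = 1.
# Rearranged as pi @ (P - I) = 0 plus a normalizing row of ones.
A = np.vstack([(P - np.eye(2)).T, np.ones(2)])   # 3 equations in 2 unknowns
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)       # least squares absorbs the redundant equation

print(pi)   # approximately [0.7143, 0.2857]: barber 'A' ~71%, barber 'B' ~29%
```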

  19. Markov Processes with QM for WINDOWS

  20. We scroll to "Markov Analysis."

  21. We click on "New" to solve a new problem.

  22. We specify the number of states. Here, the two states are the market shares of the two barbers.

  23. We want to find the market shares over 4 periods (months). The initial market shares (loyalty rates) are 50% / 50%, respectively (barber "A" is state "1", barber "B" is state "2"). The matrix of transition probabilities is inserted to the right.

  24. The "End of Period 1" column is actually the end of period 2, "End of Period 2" is actually the end of period 3, and so on. The market shares (loyalty rates) after 4 months: Barber A (1) - 66%, Barber B (2) - 34%.

  25. The "steady-state" or "equilibrium" market shares (loyalty rates) are: Barber 'A' - 71%, Barber 'B' - 29%.

  26. Markov Processes using Excel

  27. Template

  28. Insert the matrix of transition probabilities here; the matrix is then applied over the next three periods.

  29. To obtain the steady-state market shares (loyalty rates), go to Tools, Solver.

  30. Steady-state or equilibrium market shares (loyalty rates).

  31. Gas Station Example - MARKOV PROCESSES
  A town has three gas stations: A, B, and C. The only factor influencing the choice of station for the next purchase is the prior purchase. Each station is concerned about its brand share. The town's weekly gas sales are $10,000.00, and we assume each driver buys gas once per week.

  32. Gas Station Example - MARKOV PROCESSES
  The market shares at this particular time are:
    Station 'A' - 30%
    Station 'B' - 40%
    Station 'C' - 30%
  Therefore, the vector of state probabilities for period number one is:
    π(1) = ( 0.3 , 0.4 , 0.3 )

  33. Gas Station Example - MARKOV PROCESSES
  The matrix of transition probabilities per week is:
        [ .90  .05  .05 ]
    P = [ .10  .80  .10 ]
        [ .20  .10  .70 ]
  The diagonal entries (.90, .80, .70) are the loyalty rates for stations 'A', 'B', and 'C'; all other entries are defection rates.

  34. Gas Station Example
  The market share that each gas station has in the next week is the product of the vector of state probabilities in period (week) 1 and the weekly matrix of transition probabilities:
  π(2) = π(1) x P
    P(A) = 0.30        A [ .90  .05  .05 ]      .27   .015  .015
    P(B) = 0.40   x    B [ .10  .80  .10 ]  =   .04   .32   .04
    P(C) = 0.30        C [ .20  .10  .70 ]      .06   .03   .21
                                  column totals:  .37   .365  .265
                                                   A     B     C

  35. Gas Station Example
  The market share that each gas station has in the 3rd week is the product of the vector of state probabilities in period (week) 2 and the weekly matrix of transition probabilities:
  π(3) = π(2) x P
    P(A) = 0.370       A [ .90  .05  .05 ]      .333  .018  .018
    P(B) = 0.365  x    B [ .10  .80  .10 ]  =   .037  .292  .037
    P(C) = 0.265       C [ .20  .10  .70 ]      .053  .027  .185
                                  column totals:  .423  .337  .24
                                                   A     B     C

  36. Gas Station Example
  The market share that each gas station has in the 4th week is the product of the vector of state probabilities in period (week) 3 and the weekly matrix of transition probabilities:
  π(4) = π(3) x P
    P(A) = 0.423       A [ .90  .05  .05 ]      .381  .021  .021
    P(B) = 0.337  x    B [ .10  .80  .10 ]  =   .034  .269  .034
    P(C) = 0.240       C [ .20  .10  .70 ]      .048  .024  .168
                                  column totals:  .463  .314  .223
                                                   A     B     C
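
  As with the barber example, the week-by-week propagation can be scripted. A minimal sketch, again assuming Python with NumPy (not part of the original slides), reproducing slides 34-36:

```python
import numpy as np

P = np.array([[0.90, 0.05, 0.05],   # station A: loyalty .90
              [0.10, 0.80, 0.10],   # station B: loyalty .80
              [0.20, 0.10, 0.70]])  # station C: loyalty .70
pi = np.array([0.30, 0.40, 0.30])   # pi(1): current market shares of A, B, C

for week in range(2, 5):            # compute pi(2), pi(3), pi(4)
    pi = pi @ P
    print(f"pi({week}) = {pi}")

# pi(2) = (0.370, 0.365, 0.265); pi(3) ≈ (0.4225, 0.337, 0.2405); pi(4) ≈ (0.462, 0.315, 0.223).
# Slide 36 shows (0.463, 0.314, 0.223) because it rounds each week's vector before multiplying.
```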

  37. Steady-State Probabilities
  We know that we have arrived at the eventual market shares when the starting state probabilities equal the next-state probabilities:
  π( final state ) = π( starting ) x P
    P(A) = 0.589       A [ .90  .05  .05 ]      .530  .029  .029
    P(B) = 0.235  x    B [ .10  .80  .10 ]  =   .024  .188  .024
    P(C) = 0.176       C [ .20  .10  .70 ]      .035  .018  .123
                                  column totals:  .589  .235  .176
                                                   A     B     C

  38. Gas Station Example - CONCLUSION
  Unless there is some change, the market shares will stay at approximately 58.9% for station 'A', 23.5% for station 'B', and 17.6% for station 'C'. Given that total weekly gas sales are $10,000.00, the average weekly sales per station are:
    A : ( 0.589 ) ( 10,000 ) = $5,890.00
    B : ( 0.235 ) ( 10,000 ) = $2,350.00
    C : ( 0.176 ) ( 10,000 ) = $1,760.00

  39. Calculation of Steady-State Probabilities - GAS STATION EXAMPLE
  Let the steady-state probabilities be P(A) = X1, P(B) = X2, and P(C) = X3.
  New state probabilities = initial state probabilities x transition probabilities:
                            A [ .90  .05  .05 ]     .90X1  .05X1  .05X1
    ( X1 , X2 , X3 )   x    B [ .10  .80  .10 ]  =  .10X2  .80X2  .10X2
                            C [ .20  .10  .70 ]     .20X3  .10X3  .70X3
                                                      A      B      C

  40. Calculation of Steady-State Probabilities - GAS STATION EXAMPLE
    P(A) = .90X1 + .10X2 + .20X3 = 1X1     (dependent equation)
    P(B) = .05X1 + .80X2 + .10X3 = 1X2     (dependent equation)
    P(C) = .05X1 + .10X2 + .70X3 = 1X3     (dependent equation)
           1X1 + 1X2 + 1X3 = 1             (independent equation)

  41. Equation Conversion
  Set all 3 dependent equations equal to zero:
    P(A):  .9X1 + .1X2 + .2X3 = 1X1    =>   .9X1 - 1.0X1 + .1X2 + .2X3 = 0    =>   -.1X1 + .1X2 + .2X3 = 0
    P(B):  .05X1 + .8X2 + .1X3 = 1X2   =>   .05X1 + .8X2 - 1.0X2 + .1X3 = 0   =>   .05X1 - .2X2 + .1X3 = 0
    P(C):  .05X1 + .1X2 + .7X3 = 1X3   =>   .05X1 + .1X2 + .7X3 - 1.0X3 = 0   =>   .05X1 + .1X2 - .3X3 = 0

  42. Summary Equations
    -.1X1 + .1X2 + .2X3 = 0     (dependent)
    .05X1 - .2X2 + .1X3 = 0     (dependent)
    .05X1 + .1X2 - .3X3 = 0     (dependent)
    1X1 + 1X2 + 1X3 = 1         (independent)
  The 3 dependent equations alone will not solve for the values of X1, X2, and X3. Therefore, we add the independent equation.

  43. The Solution
  To eliminate X1 among the dependent equations, subtract the third dependent equation from the second:
        .05X1 - .2X2 + .1X3 = 0
    - ( .05X1 + .1X2 - .3X3 = 0 )
    -------------------------------
              - .3X2 + .4X3 = 0
  Next, multiply the independent equation by .05 and subtract it from the second dependent equation:
        .05X1 - .2X2 + .1X3 = 0
    - ( .05X1 + .05X2 + .05X3 = .05 )
    -------------------------------
            - .25X2 + .05X3 = - .05
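
  Rather than continuing the elimination by hand, the system can be handed to a linear-algebra routine. Below is a minimal sketch, assuming Python with NumPy (not part of the original slides), that solves two of the dependent equations together with the independent equation and then converts the shares to weekly sales.

```python
import numpy as np

# Two of the (dependent) balance equations plus the independent normalizing equation.
A = np.array([[-0.10,  0.10,  0.20],   # -.1X1 + .1X2 + .2X3 = 0
              [ 0.05, -0.20,  0.10],   # .05X1 - .2X2 + .1X3 = 0
              [ 1.00,  1.00,  1.00]])  #   X1  +  X2  +  X3  = 1
b = np.array([0.0, 0.0, 1.0])

x = np.linalg.solve(A, b)
print(x)              # ≈ [0.5882, 0.2353, 0.1765], i.e. 10/17, 4/17, 3/17,
                      # in line with the slides' approximate (0.589, 0.235, 0.176)

weekly_sales = 10_000 * x
print(weekly_sales)   # approximate average weekly sales per station, in dollars
```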
