
Solutions Markov Chains 1





Presentation Transcript


  1. Solutions Markov Chains 1. 16.4-1) Given the following one-step transition matrices of a Markov chain, determine the classes of the Markov chain and whether they are recurrent. [The transition matrices and state diagrams (states 0, 1, 2, 3) are not reproduced in this transcript.]
a. All states communicate, so the chain forms a single closed class; since the chain is finite, all states are recurrent.
b. The classes are {0}, {1, 2}, and {3}. States 0, 1, and 2 are recurrent; state 3 is transient.
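The classification used above (find the communicating classes, then check which classes are closed) can be automated. The sketch below is a minimal Python illustration; since the exercise's matrices are not in the transcript, `P_b` is a made-up example chosen only to reproduce the class structure of part b.

```python
def classify(P):
    """Classify the states of a finite Markov chain with transition matrix P.

    Returns a list of (sorted class, 'recurrent' or 'transient') pairs.
    In a finite chain, a communicating class is recurrent iff it is
    closed (no probability leaves the class).
    """
    n = len(P)
    # reach[i][j]: state j reachable from state i in zero or more steps
    reach = [[P[i][j] > 0 or i == j for j in range(n)] for i in range(n)]
    for k in range(n):  # Floyd-Warshall-style transitive closure
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    seen, classes = set(), []
    for i in range(n):
        if i in seen:
            continue
        # communicating class of i: mutually reachable states
        cls = {j for j in range(n) if reach[i][j] and reach[j][i]}
        seen |= cls
        closed = all(P[s][j] == 0 for s in cls for j in range(n) if j not in cls)
        classes.append((sorted(cls), 'recurrent' if closed else 'transient'))
    return classes

# Hypothetical matrix (not from the exercise) with the part-b structure:
# classes {0}, {1, 2}, {3}; states 0, 1, 2 recurrent; state 3 transient.
P_b = [[1.0, 0.0, 0.0, 0.0],
       [0.0, 0.5, 0.5, 0.0],
       [0.0, 0.5, 0.5, 0.0],
       [0.25, 0.25, 0.25, 0.25]]
print(classify(P_b))  # -> [([0], 'recurrent'), ([1, 2], 'recurrent'), ([3], 'transient')]
```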

  2. Solutions Markov Chains 2. 16.4-3) Given the following one-step transition matrix of a Markov chain, determine the classes of the Markov chain and whether they are recurrent. [The transition matrix and state diagram (states 0-4) are not reproduced in this transcript.]
Drawing the transition diagram, we see that states 0 and 1 communicate. Once the chain leaves state 2, there is a positive probability it will never return; therefore, state 2 is transient. States 3 and 4 communicate with each other but not with states 0, 1, or 2. Consequently:
{0, 1} is recurrent
{2} is transient
{3, 4} is recurrent

  3. Solutions Markov Chains 3. 16.5-4) The leading brewery on the West Coast (A) has hired a TM specialist to analyze its market position. It is particularly concerned about its major competitor (B). The analyst believes that brand switching can be modeled as a Markov chain using three states, with states A and B representing customers drinking beer produced by the aforementioned breweries and state C representing all other brands. Data are taken monthly, and the analyst has constructed a one-step transition probability matrix over the states A, B, C. [The matrix itself is not reproduced in this transcript.] What are the steady-state market shares for the two major breweries?
Solution: Write the steady-state (balance) equations p = pP together with the normalization pA + pB + pC = 1, and solve. [Equations (1)-(4) and the substitutions between them are not reproduced in this transcript.]

  4. Solutions Markov Chains 4. 16.5-4) (cont.) [The algebraic steps solving equations (1)-(4) for pB and pC are garbled in this transcript and are not reproduced.]

  5. Solutions Markov Chains 5. 16.5-4) (cont.) [The final steady-state market shares are not reproduced in this transcript.] Note: the result could also be checked by computing Pn for some large n, since the rows of Pn converge to the steady-state distribution.
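Because the 16.5-4 matrix is missing from the transcript, the sketch below only illustrates the general method with a made-up placeholder matrix: repeatedly multiplying a distribution by P (the "check Pn for large n" idea from the slide) converges to the steady-state market shares.

```python
def steady_state(P, iters=2000):
    """Approximate the steady-state distribution of transition matrix P
    by repeated multiplication (power iteration), i.e. the P**n check."""
    n = len(P)
    pi = [1.0 / n] * n  # any starting distribution works for a regular chain
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Placeholder matrix (NOT the exercise's data, which is missing here),
# with the same A/B/C structure as the brand-switching problem.
P = [[0.7, 0.2, 0.1],
     [0.2, 0.6, 0.2],
     [0.1, 0.3, 0.6]]
pi = steady_state(P)
print(pi)  # approximate market shares pi_A, pi_B, pi_C; they sum to 1
```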

  6. Solutions Markov Chains 6. 16.6-1) A computer is inspected at the end of every hour. It is found to be either working (up) or failed (down). If the computer is found to be up, the probability of its remaining up for the next hour is 0.90. If it is down, the computer is repaired, which may require more than one hour. Whenever the computer is down (regardless of how long it has been down), the probability of its still being down 1 hour later is 0.35.
a. Construct the one-step transition probability matrix.
b. Find the expected first passage time from i to j for all i, j.
Solution: Let S = 0 if the computer is down and S = 1 if the computer is up. Then the one-step transition matrix (rows and columns ordered 0, 1) is
P = | 0.35 0.65 |
    | 0.10 0.90 |
b. Find the expected first passage times.

  7. Solutions Markov Chains 7. 16.6-1) (cont.) b. Find the expected first passage times, using mij = 1 + (sum over k not equal to j of) pik mkj:
m10 = 1 + 0.90 m10, so m10 = 1/0.10 = 10 hours
m01 = 1 + 0.35 m01, so m01 = 1/0.65 = 1.54 hours (approx.)
m00 = 1 + 0.65 m10 = 7.5 hours
m11 = 1 + 0.10 m01 = 1.15 hours (approx.)

  8. Solutions Markov Chains 8. 16.6-1) (cont.) Alternative solution to b: find the steady-state probabilities and use mii = 1/pi.
p0 = 0.35 p0 + 0.10 p1 (1)
p1 = 0.65 p0 + 0.90 p1 (2)
p0 + p1 = 1 (3)
Equation (1) reduces to 0.65 p0 = 0.10 p1 (4). Substituting (4) into (3) gives p1, and from (4), p0.

  9. Solutions Markov Chains 9. 16.6-1) (cont.) Alternative solution to b.
p0 = 0.13
p1 = 0.87
so m00 = 1/p0 = 7.5 and m11 = 1/p1 = 1.15 (approx.), using the exact values p0 = 2/15 and p1 = 13/15.
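The first passage times above can be checked numerically. The sketch below solves the linear systems mij = 1 + (sum over k not equal to j of) pik mkj for each fixed target j, using the two-state up/down matrix reconstructed from the slide's 0.90 and 0.35 probabilities.

```python
def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    A = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(n):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [a - f * ac for a, ac in zip(A[r], A[c])]
    return [A[i][n] / A[i][i] for i in range(n)]

def first_passage_times(P):
    """Solve m_ij = 1 + sum_{k != j} p_ik m_kj for every target state j.

    For fixed j this is (I - Q) m = 1, where Q is P with column j zeroed.
    """
    n = len(P)
    M = [[0.0] * n for _ in range(n)]
    for j in range(n):
        A = [[(1.0 if i == k else 0.0) - (P[i][k] if k != j else 0.0)
              for k in range(n)] for i in range(n)]
        x = solve(A, [1.0] * n)
        for i in range(n):
            M[i][j] = x[i]
    return M

# State 0 = down, state 1 = up, as in the slide.
P = [[0.35, 0.65],
     [0.10, 0.90]]
M = first_passage_times(P)
print([[round(x, 3) for x in row] for row in M])  # -> [[7.5, 1.538], [10.0, 1.154]]
```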

  10. Solutions Markov Chains 10. 16.6-2) A manufacturer has a machine that, when operational at the beginning of a day, has a probability of 0.1 of breaking down sometime during the day. When this happens, the repair is done the next day and completed at the end of that day.
a. Formulate the evolution of the status of the machine as a 3-state Markov chain.
b. Find the expected first passage times from i to j.
c. Suppose the machine has gone 20 full days without a breakdown since the last repair was completed. How many days do we expect until the next breakdown/repair?
Solution: a. Let S = 0 if the machine is running at day's end, given it was running at the start; S = 1 if the machine is down at day's end, given it was running at the start; S = 2 if the machine is running at day's end, given it was down at the start (i.e., it was repaired that day).

  11. Solutions Markov Chains 11. 16.6-2) (cont.)
a. With the states defined above, a running machine stays running with probability 0.9 and breaks down with probability 0.1, and a down machine is always repaired by the end of the next day. Continuing in this fashion gives the one-step transition probability matrix (rows and columns ordered 0, 1, 2):
P = | 0.9 0.1 0 |
    | 0   0   1 |
    | 0.9 0.1 0 |
b. m01 = 1 + 0.9 m01, so m01 = 10. Note: this makes intuitive sense. If the machine has a 10% chance of failing on any given day, then the expected number of days between failures is 10 (m01 = 10).

  12. Solutions Markov Chains 12. 16.6-2) (cont.) The remaining first passage times satisfy mij = 1 + (sum over k not equal to j of) pik mkj; most terms drop out because of the zero entries of P:
m12 = 1 (a repair always completes the next day)
m02 = 1 + 0.9 m02 + 0.1 m12
m10 = 1 + m20
m20 = 1 + 0.1 m10
m11 = 1 + m21
m21 = 1 + 0.9 m01

  13. Solutions Markov Chains 13. 16.6-2) (cont.) Back-substituting gives m02, m10, and m11:
m02 = 1.1/0.1 = 11
m10 = 2/0.9 = 2.22 (approx.)
m11 = 1 + m21 = 11

  14. Solutions Markov Chains 14. 16.6-2) (cont.) c. Suppose the machine has gone 20 full days without a breakdown since the last repair was completed. How many days do we expect until the next breakdown/repair? By the Markov property, the 20 breakdown-free days are irrelevant. If we read this as the expected number of days to breakdown since the last repair, we are asking for m21, in which case the answer is m21 = 10 days. If we read this as the expected number of days to breakdown and subsequent repair since the last repair, we are asking for m22 = 11 days. Again, this should make intuitive sense: a machine has a 10% chance of breaking down on any given day, so the expected time between failures is 10 days. Since the repair takes 1 day, the time from repair to repair is 10 + 1 = 11 days.
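The part b and c numbers can be verified directly from the first-passage equations, using the transition matrix implied by the slide's state definitions:

```python
# Transition matrix reconstructed from the slide's state definitions:
# 0 = running at end | running at start, 1 = down at end | running at start,
# 2 = running at end | down at start (repair finishes that day).
P = [[0.9, 0.1, 0.0],
     [0.0, 0.0, 1.0],
     [0.9, 0.1, 0.0]]

# First-passage equations m_ij = 1 + sum_{k != j} p_ik m_kj, solved by hand:
m01 = 1 / 0.1                    # m01 = 1 + 0.9*m01 -> 10 days running to breakdown
m21 = 1 + 0.9 * m01              # rows 2 and 0 of P are identical -> also 10
m12 = 1.0                        # a repair always completes the next day
m02 = (1 + 0.1 * m12) / 0.1      # m02 = 1 + 0.9*m02 + 0.1*m12 -> 11
m22 = 1 + 0.9 * m02 + 0.1 * m12  # breakdown-plus-repair cycle -> 11

print(round(m21, 2), round(m22, 2))  # -> 10.0 11.0
```

This confirms the slide's intuition: 10 days to the next breakdown (m21), 11 days from repair to repair (m22).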
