10.5 Model Solution, 10.6 Model Interpretation, 10.7 Assumptions and Limitations

  1. 10.5 Model Solution • 10.6 Model Interpretation • 10.7 Assumptions and Limitations

  2. 10.5 Model Solution • Three steps to create a Markov model: • 1) Construct the state diagram by identifying all possible states that the modeled system may find itself in. • 2) Identify the state connections (or transitions). • 3) Parameterize the model by specifying the length of time spent in each state once it is entered (or the probability of transitioning from one state to another within the next time period).
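
As a minimal sketch (not part of the slides), the three steps can be captured directly in code; the two-state machine below and its transition probabilities are purely hypothetical:

```python
# A hypothetical two-state "machine up / machine down" system, used only to
# illustrate the three steps; the numbers are invented for this sketch.

# Step 1: identify all possible states the modeled system may find itself in.
states = ["up", "down"]

# Steps 2 and 3: identify the transitions and parameterize them, here as the
# probability of moving from one state to another within the next time period.
transitions = {
    ("up", "down"): 0.1,   # assumed chance the machine fails this period
    ("down", "up"): 0.6,   # assumed chance a repair completes this period
}

# Any remaining probability corresponds to staying in the current state:
# P(up -> up) = 0.9 and P(down -> down) = 0.4.
```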

  3. 10.5 Model Solution • Definition of "model solution": to find the long-term (i.e., the "steady state") probability of being in any particular state. • This steady-state solution gives the overall probability of being in each system state. • In general, there is one balance equation for each system state: flows in = flows out. • These N balance equations are not independent, so one of them is replaced by the condition that the probabilities sum to 1, leaving N unknowns and N independent linear equations, which is a straightforward linear algebra problem.
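
A minimal sketch of this solution step, assuming a discrete-time chain described by a transition-probability matrix; the helper name steady_state and the two-state matrix are ours (the matrix is the hypothetical up/down machine from the previous sketch), not an example from the text:

```python
import numpy as np

def steady_state(P):
    """Steady-state probabilities of a discrete-time Markov chain.

    P[i][j] is the probability of moving from state i to state j within one
    time period.  The balance equations "flows in = flows out" read p = pP;
    they are not independent, so the last one is replaced by sum(p) = 1.
    """
    n = P.shape[0]
    A = P.T - np.eye(n)   # balance equations as (P^T - I) p = 0
    A[-1, :] = 1.0        # replace one equation with the normalization
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

# The hypothetical up/down machine from the previous sketch.
P = np.array([[0.9, 0.1],
              [0.6, 0.4]])
print(steady_state(P))    # -> approximately [0.857, 0.143]
```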

  4. Random walk through England: Model Solution

  5. Random walk through England: Model Solution • Let pi denote the probability of being in state i, so p1, p2, p3, p4 correspond to the four states. • The balance equations for this model are: • 0.2*p2 + 0.1*p3 + 0.3*p4 = 0.6*p1 • 0.6*p1 = p2 • 0.2*p4 = 0.3*p3 • 0.8*p2 + 0.2*p3 = 0.5*p4

  6. Random walk through England: Model Solution • The final set of equations (the fourth balance equation is replaced by the normalization condition) is: • 0.2*p2 + 0.1*p3 + 0.3*p4 = 0.6*p1 • 0.6*p1 = p2 • 0.2*p4 = 0.3*p3 • p1 + p2 + p3 + p4 = 1 • The results are: • p1 = 0.2644 • p2 = 0.1586 • p3 = 0.2308 • p4 = 0.3462
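
This system can be handed directly to a linear-algebra routine; a minimal numpy sketch using only the coefficients shown on the slide:

```python
import numpy as np

# The final system from the slide, written as A p = b with p = (p1, p2, p3, p4).
A = np.array([
    [0.6, -0.2, -0.1, -0.3],   # 0.2 p2 + 0.1 p3 + 0.3 p4 = 0.6 p1
    [0.6, -1.0,  0.0,  0.0],   # 0.6 p1 = p2
    [0.0,  0.0, -0.3,  0.2],   # 0.2 p4 = 0.3 p3
    [1.0,  1.0,  1.0,  1.0],   # p1 + p2 + p3 + p4 = 1
])
b = np.array([0.0, 0.0, 0.0, 1.0])

print(np.linalg.solve(A, b))
# -> approximately [0.2644, 0.1587, 0.2308, 0.3462]
#    (the slide reports p2 as 0.1586)
```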

  7. Database Server Support: Model Solution

  8. Database Server Support: Model Solution • The balance equations for the six states are: • 4*p(1,1,0) + 2*p(1,0,1) = 6*p(2,0,0) • 3*p(2,0,0) + 4*p(0,2,0) + 2*p(0,1,1) = 10*p(1,1,0) • 3*p(2,0,0) + 4*p(0,1,1) + 2*p(0,0,2) = 8*p(1,0,1) • 3*p(1,1,0) + 3*p(1,0,1) = 6*p(0,1,1) • 3*p(1,1,0) = 4*p(0,2,0) • 3*p(1,0,1) = 2*p(0,0,2)

  9. Database Server Support: Model Solution • With the last balance equation replaced by the normalization condition, the system to solve is: • 4*p(1,1,0) + 2*p(1,0,1) = 6*p(2,0,0) • 3*p(2,0,0) + 4*p(0,2,0) + 2*p(0,1,1) = 10*p(1,1,0) • 3*p(2,0,0) + 4*p(0,1,1) + 2*p(0,0,2) = 8*p(1,0,1) • 3*p(1,1,0) + 3*p(1,0,1) = 6*p(0,1,1) • 3*p(1,1,0) = 4*p(0,2,0) • p(2,0,0) + p(1,1,0) + p(1,0,1) + p(0,2,0) + p(0,1,1) + p(0,0,2) = 1 • Results: • p(2,0,0) = 0.1391; p(1,1,0) = 0.1043; p(1,0,1) = 0.2087; p(0,2,0) = 0.0783; p(0,1,1) = 0.1565; p(0,0,2) = 0.3131
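
The same approach in matrix form; a minimal numpy sketch built only from the coefficients on the slide:

```python
import numpy as np

# States ordered (2,0,0), (1,1,0), (1,0,1), (0,2,0), (0,1,1), (0,0,2).
# Each row is one equation from the slide, rewritten as "flow in minus flow
# out = 0"; the last row is the normalization condition.
A = np.array([
    [-6,   4,  2,  0,  0, 0],   # 4 p(1,1,0) + 2 p(1,0,1) = 6 p(2,0,0)
    [ 3, -10,  0,  4,  2, 0],   # 3 p(2,0,0) + 4 p(0,2,0) + 2 p(0,1,1) = 10 p(1,1,0)
    [ 3,   0, -8,  0,  4, 2],   # 3 p(2,0,0) + 4 p(0,1,1) + 2 p(0,0,2) = 8 p(1,0,1)
    [ 0,   3,  3,  0, -6, 0],   # 3 p(1,1,0) + 3 p(1,0,1) = 6 p(0,1,1)
    [ 0,   3,  0, -4,  0, 0],   # 3 p(1,1,0) = 4 p(0,2,0)
    [ 1,   1,  1,  1,  1, 1],   # probabilities sum to 1
], dtype=float)
b = np.array([0, 0, 0, 0, 0, 1], dtype=float)

print(np.linalg.solve(A, b))
# -> approximately [0.1391, 0.1043, 0.2087, 0.0783, 0.1565, 0.3130]
```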

  10. 10.6 Model Interpretation • Example 1: Random Walk Through England • Father's question: • What percentage of days is the son actually not drinking in Leeds? • Answer: 74%. • Interpretation: p1 = 0.2644, or about 26%. So the percentage of days that the son is not drinking in Leeds is 1 - 26% = 74%.

  11. Example 1: Random Walk Through England • Lake District relative's question: • Once the son finishes a day of kayaking in the Lake District, how long will it typically be before he returns? • Answer: 3.33 days. • Interpretation: The mean time between entries into a particular state (i.e., the state's "cycle time") is the inverse of the steady-state probability of being in that state. • p3 = 0.2308; cycle time = 1/0.2308 = 4.33 days. • It takes 1 day for the lad to kayak. • So the time from when he finishes a day of kayaking until he typically starts kayaking again is 4.33 - 1 = 3.33 days.

  12. Example 1: Random Walk Through England • Policeman's question: • How many days each month can the bobbies expect to see the son driving to London after drinking in Leeds? • Answer: 4.76 days. • Interpretation: p1 = 0.2644. • The number of days per month the lad is found drinking in Leeds is 0.2644*30 = 7.93. • Since the probability that the lad goes to state 2 after state 1 is 0.6, the number of days the bobbies can expect to find the lad on the road to London is 7.93*0.6 = 4.76 days.

  13. Example 1: Random Walk Through England • Kayak renters' question: • How many times each month does the son typically visit their shop, and how long does he typically keep their kayak out on each visit? • Answer: 2.08 visits per month, keeping the kayak an average of 3.33 days each visit. • Interpretation: • The only way to enter state 3 from another state is from state 4. • The number of days per month spent in state 4 is 0.3462*30 = 10.39.

  14. Example 1: Random Walk Through England • The probability of going to state 3 after state 4 is 0.2, so the lad typically starts a new visit to the Lake District 10.39*0.2 = 2.08 times each month. • p3 = 0.2308. • The number of days per month the lad can be expected to be kayaking is 30*0.2308 = 6.92. • The average duration of each visit is therefore 6.92/2.08 = 3.33 days.
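
A short sketch collecting the interpretation arithmetic from slides 10 through 14, starting from the steady-state probabilities computed earlier:

```python
# Steady-state probabilities for the random walk, as computed earlier.
p1, p2, p3, p4 = 0.2644, 0.1586, 0.2308, 0.3462

# Father: fraction of days NOT drinking in Leeds (state 1).
print(1 - p1)                       # -> about 0.74, i.e. 74%

# Lake District relative: cycle time of state 3, minus the one day of kayaking.
cycle_time = 1 / p3                 # -> about 4.33 days between entries to state 3
print(cycle_time - 1)               # -> about 3.33 days before he returns

# Policeman: days per month in state 1, times the 0.6 chance of heading to London.
print(p1 * 30 * 0.6)                # -> about 4.76 days per month

# Kayak renters: state 3 is entered only from state 4, with probability 0.2.
visits = p4 * 30 * 0.2              # -> about 2.08 visits per month
print(visits, (p3 * 30) / visits)   # -> about 2.08 visits of about 3.33 days each
```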

  15. Database Server Support: Solution Interpretation

  16. Example 2: Database Server Support • User's question: • What response time can the typical user expect? • Answer: 44.24 seconds per transaction. • Interactive Response Time Law: R = M/X0 - Z, with think time Z = 0. • X0, the throughput of the system measured at the CPU, is the product of the CPU's utilization and its service rate. • The CPU is busy in states (2,0,0), (1,1,0), and (1,0,1). • The utilization of the CPU is p(2,0,0) + p(1,1,0) + p(1,0,1) = 0.4521. • The service rate of the CPU is 6 transactions/minute. • X0 = 0.4521*6 = 2.7126 transactions/minute. • R = M/X0 - Z = 2/2.7126 = 0.7373 minutes/transaction = 44.24 seconds/transaction.
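
A minimal sketch of this calculation; the variable names are ours, the numbers come straight from the slides:

```python
# Steady-state probabilities of the states in which the CPU is busy.
p200, p110, p101 = 0.1391, 0.1043, 0.2087

M = 2            # number of active users
Z = 0            # think time (none in this model)
mu_cpu = 6       # CPU service rate, transactions per minute

U_cpu = p200 + p110 + p101    # CPU utilization
X0 = U_cpu * mu_cpu           # throughput measured at the CPU
R = M / X0 - Z                # Interactive Response Time Law, in minutes

print(U_cpu, X0, R, R * 60)
# -> about 0.4521, 2.7126 tx/min, 0.7373 min, 44.24 seconds per transaction
```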

  17. Example 2: Database Server Support • System administrator's question: • How near capacity (i.e., how highly utilized) is each of the system resources? • Answer: Ucpu = 0.4521, Ufast = 0.3391, Uslow = 0.6783. • These are found as direct sums of the relevant steady-state probabilities. • Ufast = p(1,1,0) + p(0,2,0) + p(0,1,1) = 0.3391 • Uslow = p(1,0,1) + p(0,1,1) + p(0,0,2) = 0.6783
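
A small sketch of these sums, assuming (as the balance equations imply) that the state components are the numbers of customers at the CPU, the fast disk, and the slow disk:

```python
# Steady-state probabilities keyed by (customers at CPU, fast disk, slow disk).
p = {(2, 0, 0): 0.1391, (1, 1, 0): 0.1043, (1, 0, 1): 0.2087,
     (0, 2, 0): 0.0783, (0, 1, 1): 0.1565, (0, 0, 2): 0.3131}

# A device is utilized in every state where at least one customer is at it.
U_cpu  = sum(prob for (c, f, s), prob in p.items() if c > 0)
U_fast = sum(prob for (c, f, s), prob in p.items() if f > 0)
U_slow = sum(prob for (c, f, s), prob in p.items() if s > 0)
print(U_cpu, U_fast, U_slow)   # -> about 0.4521, 0.3391, 0.6783
```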

  18. Example 2: Database Server Support • Company president's question: If I capture Company X's clientele, the number of active users on my system will likely double. What new performance levels should I spin in my speech to the newly acquired customers? • Answer: • The throughput is predicted to go from 2.7126 to 3.4768 transactions/minute. • The response time is predicted to go from 44.24 to 69.03 seconds per transaction. • With 4 active users, the model now has 15 states.
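
A sketch of where these predictions come from, assuming the service rates implied by the earlier balance equations (CPU 6/min, fast disk 4/min, slow disk 2/min) and an even split of CPU completions between the two disks; under those assumptions, enumerating the 15 states and solving the balance equations reproduces the slide's numbers:

```python
import itertools
import numpy as np

M = 4      # active users after capturing Company X's clientele

# Enumerate all (CPU, fast disk, slow disk) populations summing to M.
states = [s for s in itertools.product(range(M + 1), repeat=3) if sum(s) == M]
index = {s: i for i, s in enumerate(states)}
print(len(states))                              # -> 15 states

# Build the transition-rate matrix under the assumed rates and routing.
Q = np.zeros((len(states), len(states)))
for (c, f, s), i in index.items():
    if c > 0:                                   # CPU completion, split 50/50
        Q[i, index[(c - 1, f + 1, s)]] += 6 * 0.5
        Q[i, index[(c - 1, f, s + 1)]] += 6 * 0.5
    if f > 0:                                   # fast disk completion -> CPU
        Q[i, index[(c + 1, f - 1, s)]] += 4
    if s > 0:                                   # slow disk completion -> CPU
        Q[i, index[(c + 1, f, s - 1)]] += 2
    Q[i, i] = -Q[i].sum()

# Balance equations p Q = 0, with one equation replaced by normalization.
A = Q.T.copy()
A[-1, :] = 1.0
b = np.zeros(len(states))
b[-1] = 1.0
p = np.linalg.solve(A, b)

U_cpu = sum(p[i] for (c, f, s), i in index.items() if c > 0)
X0 = U_cpu * 6                                  # throughput at the CPU
R = M / X0                                      # minutes per transaction
print(round(X0, 4), round(R * 60, 2))           # -> about 3.4768 and 69.03
```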

  19. Example 2: Database Server Support • Company pessimist's question: Since I know that the fast disk is about to fail and all the files will need to be moved to the slow disk, what will the new response time be? • Answer: 65.00 seconds/transaction. • With only 2 devices, the model has 3 states: (2,0), (1,1), and (0,2). • These two examples demonstrate how knowledge of the steady-state probabilities leads to more meaningful and more useful performance metrics.
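
A matching sketch for this scenario, under the same assumed rates (CPU 6/min, slow disk 2/min) with every CPU completion now going to the slow disk:

```python
import numpy as np

# States are (customers at CPU, customers at slow disk): (2,0), (1,1), (0,2).
A = np.array([
    [-6,  2,  0],   # balance for (2,0): 2 p(1,1) = 6 p(2,0)
    [ 0, -6,  2],   # balance for (0,2): 6 p(1,1) = 2 p(0,2)
    [ 1,  1,  1],   # normalization
], dtype=float)
b = np.array([0.0, 0.0, 1.0])
p20, p11, p02 = np.linalg.solve(A, b)

U_cpu = p20 + p11            # CPU busy in (2,0) and (1,1)
X0 = U_cpu * 6               # throughput, transactions per minute
R = 2 / X0                   # Response Time Law with M = 2, Z = 0
print(round(R * 60, 2))      # -> about 65.0 seconds per transaction
```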

  20. 10.7 Model Assumptions and Limitations • Markov models are quite robust. However, there are key assumptions and resulting limitations. • Memoryless Assumption: • It is assumed that all the important system information is captured in the state descriptors of a Markov model. That is, simply knowing which state the system is in uniquely defines all relevant information. • Knowing the current state alone is sufficient; this is the defining Markov characteristic. Any other information is unnecessary as far as the system's future behavior is concerned. • That is, previous history can be forgotten.

  21. 10.7 Model Assumptions and Limitations • Resulting Limitation: • Because everything must be captured in the state descriptor, Markov models are susceptible to state space explosion. • Having large state spaces implies additional complexity and a potential loss of accuracy.

  22. 10.7 Model Assumptions and Limitations • Exponential Assumption: • The exponential distribution is the only continuous distribution that is memoryless. • For example, suppose the average service time is 10 seconds. Even knowing that a customer has already received 5 seconds' worth of CPU time but has not yet finished (previous history, which is irrelevant under the Markov memoryless assumption), the average amount of CPU time still needed is again 10 seconds. • Markov models assume that the time spent between relevant events, such as job arrival times and job service times, is exponentially distributed.
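
A quick numerical sketch of that memoryless property (not from the slides); the random seed and sample size are arbitrary:

```python
import numpy as np

# Exponential service times with a 10-second mean; condition on 5 seconds
# having already been received and look at the remaining time.
rng = np.random.default_rng(0)
samples = rng.exponential(scale=10.0, size=1_000_000)
remaining = samples[samples > 5.0] - 5.0

print(samples.mean(), remaining.mean())   # both come out close to 10 seconds
```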

  23. 10.7 Model Assumptions and Limitations • Resulting Limitation: • To mitigate the limitation imposed by exponential assumptions, the concept of phases (or stages) can be introduced. • For example, a service time can be partitioned into two phases, each phase being exponentially distributed with an average of five seconds. • However, the price is again a potential state space explosion, since the state descriptor must now contain this additional phase information.
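
A small sketch contrasting the two representations (not from the slides): the two-phase service keeps the 10-second mean but has lower variability than a single exponential, illustrating how phases let the model move away from purely exponential times:

```python
import numpy as np

rng = np.random.default_rng(1)

# One exponential service time with a 10-second mean ...
one_phase = rng.exponential(scale=10.0, size=1_000_000)
# ... versus two exponential phases of 5 seconds each (an Erlang-2 time).
two_phases = rng.exponential(scale=5.0, size=(1_000_000, 2)).sum(axis=1)

print(one_phase.mean(), one_phase.var())     # mean ~10, variance ~100
print(two_phases.mean(), two_phases.var())   # mean ~10, variance ~50
```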
