
Intro. to Stochastic Processes




  1. Intro. to Stochastic Processes Cheng-Fu Chou

  2. Outline • Stochastic Process • Counting Process • Poisson Process • Markov Process • Renewal Process

  3. Stochastic Process • A stochastic process N = {N(t), t ∈ T} is a collection of random variables, i.e., for each t in the index set T, N(t) is a random variable • t: time • N(t): state at time t • If T is a countable set, N is a discrete-time stochastic process • If T is continuous, N is a continuous-time stochastic process

  4. Counting Process • A stochastic process {N(t), t ≥ 0} is said to be a counting process if N(t) is the total number of events that have occurred up to time t. Hence, some properties of a counting process are • N(t) ≥ 0 • N(t) is integer valued • If s < t, then N(s) ≤ N(t) • For s < t, N(t) – N(s) equals the number of events occurring in the interval (s, t]

  5. Counting Process • Independent increments • If the numbers of events that occur in disjoint time intervals are independent • Stationary increments • If the distribution of the number of events that occur in any interval of time depends only on the length of the interval

  6. Poisson Process • Def. A: the counting process {N(t), t ≥ 0} is said to be a Poisson process having rate λ, λ > 0, if • N(0) = 0; • The process has independent increments • The number of events in any interval of length t is Poisson distributed with mean λt, that is, for all s, t ≥ 0, P[N(t+s) − N(s) = n] = e^(−λt)(λt)^n / n!, n = 0, 1, …

  7. Poisson Process • Def. B: the counting process {N(t), t ≥ 0} is said to be a Poisson process with rate λ, λ > 0, if • N(0) = 0 • The process has stationary and independent increments • P[N(h) = 1] = λh + o(h) • P[N(h) ≥ 2] = o(h) • A function f is said to be o(h) if lim_{h→0} f(h)/h = 0 • Def. A ⇔ Def. B, i.e., they are equivalent • We show Def. B ⇒ Def. A • Def. A ⇒ Def. B is HW

  8. Important Properties • Property 1: the mean number of events for any t ≥ 0 is E[N(t)] = λt • Property 2: the inter-arrival time distribution of a Poisson process with rate λ is an exponential distribution with parameter λ • Property 3: the superposition of two independent Poisson processes with rates λ1 and λ2 is a Poisson process with rate λ1 + λ2
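
A quick simulation, sketched in Python, illustrating Properties 1–3 (the rates λ1 = 2, λ2 = 3, the horizon and the number of runs are arbitrary choices): inter-arrival times are drawn as Exponential(λ) per Property 2, and the superposed stream should have mean count (λ1 + λ2)t per Properties 1 and 3.

```python
import random

def poisson_arrivals(lam, t_end, rng):
    """Arrival times of a Poisson process on [0, t_end], built from
    i.i.d. Exponential(lam) inter-arrival times (Property 2)."""
    arrivals, t = [], 0.0
    while True:
        t += rng.expovariate(lam)
        if t > t_end:
            return arrivals
        arrivals.append(t)

rng = random.Random(0)
lam1, lam2, t_end, runs = 2.0, 3.0, 10.0, 2000
# Superpose two independent streams (Property 3) and average the counts.
counts = [len(poisson_arrivals(lam1, t_end, rng))
          + len(poisson_arrivals(lam2, t_end, rng)) for _ in range(runs)]
mean_count = sum(counts) / runs   # close to (lam1 + lam2) * t_end = 50
```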

  9. Properties (cont.) • Property 4: if we perform Bernoulli trials to make independent random erasures from a Poisson process, the remaining arrivals also form a Poisson process • Property 5: the time until the rth arrival, tr, known as the rth-order waiting time, is the sum of r independent exponential inter-arrival times and is described by an Erlang pdf

  10. Ex 1 • Suppose that X1 and X2 are independent exponential random variables with respective means 1/λ1 and 1/λ2. What is P{X1 < X2}?
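
Conditioning on X1 gives the standard answer P{X1 < X2} = λ1/(λ1 + λ2); a Monte Carlo sketch checking it (the rates 1 and 3 are arbitrary):

```python
import random

rng = random.Random(1)
lam1, lam2, n = 1.0, 3.0, 200_000
# Draw independent exponentials and count how often X1 wins the race.
hits = sum(rng.expovariate(lam1) < rng.expovariate(lam2) for _ in range(n))
estimate = hits / n
exact = lam1 / (lam1 + lam2)   # conditioning on X1: lam1 / (lam1 + lam2) = 0.25
```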

  11. Conditional Dist. of the Arrival Time • Suppose we are told that exactly one event of a Poisson process has taken place by time t; what is the distribution of the time at which the event occurred? (Using independent increments, it turns out to be uniform on (0, t].)

  12. Ex 2 • Consider the failure of a link in a communication network. Failures occur according to a Poisson process with rate 4.8 per day. Find • P[time between failures ≥ 10 days] • P[5 failures in 20 days] • the expected time between 2 consecutive failures • P[0 failures in the next day] • Suppose 12 hours have elapsed since the last failure; find the expected time to the next failure
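
Each part reduces to the exponential/Poisson formulas above. A sketch of the computations (interpreting "next day" as a one-day window, and using memorylessness of the exponential for the last part):

```python
import math

lam = 4.8  # failure rate, per day

def poisson_pmf(k, mean):
    """P[N = k] for N ~ Poisson(mean)."""
    return math.exp(-mean) * mean**k / math.factorial(k)

p_gap_ge_10 = math.exp(-lam * 10)      # inter-failure time ~ Exp(4.8): P[T >= 10]
p_5_in_20 = poisson_pmf(5, lam * 20)   # N(20) ~ Poisson(96)
mean_gap = 1 / lam                     # expected days between failures
p_none_next_day = math.exp(-lam)       # P[N(1) = 0]
residual = 1 / lam                     # memoryless: the elapsed 12 hours don't matter
```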

  13. Poisson, Markov, Renewal Processes • Poisson process: counting process with i.i.d. exponential times between arrivals • Relax the counting-process structure, keep exponential times between transitions → continuous-time Markov chain • Relax the exponential inter-arrival times, keep the counting process → renewal process: counting process with i.i.d. times between arrivals

  14. Markov Process

  15. Markov Process • P[X(tn+1) ≤ xn+1 | X(tn) = xn, X(tn-1) = xn-1, …, X(t1) = x1] = P[X(tn+1) ≤ xn+1 | X(tn) = xn] • The probabilistic future of the process depends only on the current state, not on the history • We are mostly concerned with discrete-space Markov processes, commonly referred to as Markov chains • Discrete-time Markov chains • Continuous-time Markov chains

  16. DTMC • Discrete-time Markov chain: • P[Xn+1 = j | Xn = kn, Xn-1 = kn-1, …, X0 = k0] = P[Xn+1 = j | Xn = kn] • Discrete time, discrete space • A finite-state DTMC if its state space is finite • A homogeneous DTMC if P[Xn+1 = j | Xn = i] does not depend on n for all i, j, i.e., Pij = P[Xn+1 = j | Xn = i], where Pij is the one-step transition prob.

  17. Definition • P = [Pij] is the transition matrix, with Pij ≥ 0 and Σj Pij = 1 for every i • A matrix that satisfies these conditions is called a stochastic matrix • n-step transition prob.: Pij(n) = P[Xn+m = j | Xm = i]
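
For a small hypothetical chain, the n-step probabilities are just entries of the matrix power P^n, and every power of a stochastic matrix is again stochastic (each row is a distribution). A pure-Python sketch (the 2-state matrix is an arbitrary example):

```python
def mat_mult(A, B):
    """Product of two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(P, n):
    """n-step transition matrix P^n, via repeated multiplication."""
    R = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        R = mat_mult(R, P)
    return R

P = [[0.9, 0.1],   # a hypothetical 2-state stochastic matrix
     [0.5, 0.5]]
P5 = n_step(P, 5)
row_sums = [sum(row) for row in P5]   # each row of P^5 still sums to 1
```

Multiplying `n_step(P, 2)` by `n_step(P, 3)` reproduces `P5`, which is exactly the Chapman-Kolmogorov identity.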

  18. Chapman-Kolmogorov Eq. • Def.: Pij(n+m) = Σk Pik(n) Pkj(m) for all n, m ≥ 0 • Proof: condition on the state occupied after the first n steps and apply the Markov property

  19. Question • We have only been dealing with conditional prob., but what we want is to compute the unconditional prob. that the system is in state j at time n, i.e., pn(j) = P[Xn = j]

  20. Result 1 • For all n ≥ 1, pn = p0 P^n, where pm = (pm(0), pm(1), …) for all m ≥ 0. From the above equation, we deduce that pn+1 = pn P. Assume that lim_{n→∞} pn(i) exists for all i, and refer to it as p(i). The remaining question is how to compute p • Reachable: a state j is reachable from i if Pij(n) > 0 for some n ≥ 0 • Communicate: if j is reachable from i and i is reachable from j, then we say that i and j communicate (i ↔ j)

  21. Result 1 (cont.) • Irreducible: • A M.C. is irreducible if i ↔ j for all i, j ∈ I • Aperiodic: • For every state i ∈ I, define d(i) to be the greatest common divisor of all integers n s.t. Pii(n) > 0; the chain is aperiodic if d(i) = 1 for all i

  22. Result 2 • Invariant measure of a M.C.: if a M.C. with transition matrix P is irreducible and aperiodic, and if the system of equations p = pP and p·1 = 1 has a strictly positive solution, then p(i) = lim_{n→∞} pn(i), independently of the initial dist. • Invariant equation: p = pP • Invariant measure: p
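
For a 2-state example (the matrix below is an arbitrary irreducible, aperiodic choice), the invariant equations p = pP and p·1 = 1 give p = (5/6, 1/6), and iterating pn+1 = pn P converges to that limit from any starting distribution, as Result 2 predicts. A sketch:

```python
P = [[0.9, 0.1],   # hypothetical irreducible, aperiodic 2-state chain
     [0.5, 0.5]]

pi = [0.5, 0.5]    # arbitrary initial distribution p0
for _ in range(200):   # power iteration: p_{n+1} = p_n P
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

# Invariant equations: pi0 * 0.1 = pi1 * 0.5 and pi0 + pi1 = 1,
# hence pi = (5/6, 1/6), independent of the starting distribution.
```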

  23. Gambler’s Ruin Problem • Consider a gambler who at each play of game has probability p of winning one unit and probability q=1-p of losing one unit. Assuming that successive plays of the game are independent, what is the probability that, starting with i units, the gambler’s fortune will reach N before reaching 0?

  24. Ans • If we let Xn denote the player's fortune at time n, then the process {Xn, n = 0, 1, 2, …} is a Markov chain with transition probabilities: • p00 = pNN = 1 • pi,i+1 = p = 1 − pi,i-1, i = 1, 2, …, N−1 • This Markov chain has 3 classes of states: {0}, {1, 2, …, N−1}, and {N}

  25. Let Pi, i = 0, 1, 2, …, N, denote the prob. that, starting with i, the gambler's fortune will eventually reach N. • By conditioning on the outcome of the initial play of the game we obtain • Pi = pPi+1 + qPi-1, i = 1, 2, …, N−1. Since p + q = 1, Pi+1 − Pi = (q/p)(Pi − Pi-1). Also, P0 = 0, so P2 − P1 = (q/p)(P1 − P0) = (q/p)P1, P3 − P2 = (q/p)(P2 − P1) = (q/p)^2 P1, and in general Pi+1 − Pi = (q/p)^i P1. Summing these differences and using PN = 1 gives Pi = (1 − (q/p)^i)/(1 − (q/p)^N) if p ≠ 1/2, and Pi = i/N if p = 1/2.

  26. If p > ½, there is a positive prob. that the gambler’s fortune will increase indefinitely • Otherwise, the gambler will, with prob. 1, go broke against an infinitely rich adversary.
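A simulation check of the closed form Pi = (1 − (q/p)^i)/(1 − (q/p)^N) (the parameters i = 3, N = 10, p = 0.6 below are arbitrary):

```python
import random

def exact_P(i, N, p):
    """P[fortune reaches N before 0 | start at i], from the closed form."""
    if p == 0.5:
        return i / N
    r = (1 - p) / p
    return (1 - r**i) / (1 - r**N)

def simulate(i, N, p, trials, rng):
    """Play the game to absorption, many times, and count wins."""
    wins = 0
    for _ in range(trials):
        x = i
        while 0 < x < N:
            x += 1 if rng.random() < p else -1
        wins += (x == N)
    return wins / trials

rng = random.Random(2)
est = simulate(3, 10, 0.6, 20_000, rng)
exact = exact_P(3, 10, 0.6)   # about 0.716
```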

  27. CTMC • Continuous-time Markov chain • Continuous time, discrete space • P[X(t) = j | X(s) = i, X(sn-1) = in-1, …, X(s0) = i0] = P[X(t) = j | X(s) = i] • A continuous-time M.C. is homogeneous if • P[X(t+u) = j | X(s+u) = i] = P[X(t) = j | X(s) = i] = Pij(t − s), where t > s • Chapman-Kolmogorov equ.: Pij(t + s) = Σk Pik(t) Pkj(s)

  28. CTMC (cont.) • p(t) = p(0) e^{Qt} • Q is called the infinitesimal generator • Proof: differentiate the Chapman-Kolmogorov equation to obtain p′(t) = p(t)Q and solve this linear system.

  29. Result 3 • If a continuous-time M.C. with infinitesimal generator Q is irreducible and if the system of equations pQ = 0 and p·1 = 1 has a strictly positive solution, then p(i) = lim_{t→∞} P[X(t) = i] for all i ∈ I, independently of the initial dist.

  30. Renewal Process

  31. Renewal Process A counting process {N(t), t ≥ 0} is a renewal process if, for each n, Xn is the time between the (n−1)st and nth arrivals and {Xn, n ≥ 1} are independent with the same distribution F. The time of the nth arrival is Sn = X1 + … + Xn, with S0 = 0. We can write N(t) = max{n : Sn ≤ t}, and if m = E[Xn], n ≥ 1, then the strong law of large numbers says that Sn/n → m as n → ∞ with probability 1. Note: m is now a time interval, not a rate; 1/m will be called the rate of the r.p.
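
The rate interpretation can be checked numerically: with Uniform(1, 3) inter-arrival times (an arbitrary choice with m = 2), N(t)/t should approach 1/m = 0.5 for large t. A sketch:

```python
import random

def renewal_count(t_end, draw_gap):
    """N(t_end): the number of renewals n with S_n <= t_end."""
    n, s = 0, 0.0
    while True:
        s += draw_gap()
        if s > t_end:
            return n
        n += 1

rng = random.Random(3)
t_end = 10_000.0
n = renewal_count(t_end, lambda: rng.uniform(1.0, 3.0))  # m = E[X] = 2
rate = n / t_end   # strong law: converges to 1/m = 0.5
```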

  32. Fundamental Relationship It follows that P{N(t) = n} = P{Sn ≤ t} − P{Sn+1 ≤ t} = Fn(t) − Fn+1(t), where Fn(t) is the n-fold convolution of F with itself. The mean value of N(t) is m(t) = E[N(t)] = Σ_{n≥1} Fn(t). Condition on the time of the first renewal to get the renewal equation: m(t) = F(t) + ∫_0^t m(t − x) dF(x).

  33. Exercise 1 Is it true that:

  34. Exercise 2 If the mean-value function of the renewal process {N(t), t ≥ 0} is given, then what is P{N(5) = 0}?

  35. Exercise 3 Consider a renewal process {N(t), t ≥ 0} having a gamma(r, λ) interarrival distribution with density f(t) = λe^(−λt)(λt)^(r−1)/(r − 1)!, t > 0 • Show that • Show that Hint: use the relationship between the gamma(r, λ) distribution and the sum of r independent exponentials with rate λ to define N(t) in terms of a Poisson process with rate λ.

  36. Limit Theorems • With probability 1, N(t)/t → 1/m as t → ∞ • Elementary renewal theorem: E[N(t)]/t → 1/m as t → ∞ • Central limit theorem: for large t, N(t) is approximately normally distributed with mean t/m and variance tσ²/m³, where σ² is the variance of the time between arrivals; in particular, P{(N(t) − t/m)/(σ√(t/m³)) ≤ y} → Φ(y) as t → ∞.

  37. Exercise 4 A machine in use is replaced by a new machine either when it fails or when it reaches the age of T years. If the lifetimes of successive machines are independent with a common distribution F with density f, show that • the long-run rate at which machines are replaced is 1 / ∫_0^T (1 − F(x)) dx • the long-run rate at which machines in use fail equals F(T) / ∫_0^T (1 − F(x)) dx Hint: condition on the lifetime of the first machine

  38. Renewal Reward Processes Suppose that each time a renewal occurs we receive a reward. Assume Rn is the reward earned at the nth renewal and {Rn, n ≥ 1} are independent and identically distributed (Rn may depend on Xn). The total reward up to time t is R(t) = Σ_{n=1}^{N(t)} Rn. If E[R] = E[Rn] < ∞ and E[X] = E[Xn] < ∞, then with probability 1, R(t)/t → E[R]/E[X] and E[R(t)]/t → E[R]/E[X] as t → ∞.

  39. Age & Excess Life of a Renewal Process The age at time t is A(t) = t − SN(t), the amount of time elapsed since the last renewal. The excess life Y(t) = SN(t)+1 − t is the time until the next renewal. [Figure: a timeline marking SN(t) and t, with the age A(t) behind t and the excess Y(t) ahead of it.] What is the average value of the age?

  40. Average Age of a Renewal Process Imagine we receive payment at a rate equal to the current age of the renewal process. Our total reward up to time s is ∫_0^s A(t) dt, and the average reward up to time s is (1/s) ∫_0^s A(t) dt. If X is the length of a renewal cycle, then the total reward during the cycle is ∫_0^X t dt = X²/2. So, the average age is E[X²]/(2E[X]).
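
The renewal-reward argument can be checked by accumulating the age integral cycle by cycle: over one cycle of length X the area under A(t) is X²/2, so the time average tends to E[X²]/(2E[X]). With Uniform(0, 2) cycle lengths (an arbitrary choice: E[X] = 1, E[X²] = 4/3), the average age should be 2/3. A sketch:

```python
import random

rng = random.Random(4)
total_time = total_area = 0.0
for _ in range(200_000):           # one term per renewal cycle
    x = rng.uniform(0.0, 2.0)      # cycle length X ~ Uniform(0, 2)
    total_time += x
    total_area += x * x / 2        # integral of the age over this cycle
avg_age = total_area / total_time  # tends to E[X^2] / (2 E[X]) = 2/3
```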

  41. Average Excess or Residual Now imagine we receive payment at a rate equal to the current excess of the renewal process. Our total reward up to time s is ∫_0^s Y(t) dt, and the average reward up to time s is (1/s) ∫_0^s Y(t) dt. If X is the length of a renewal cycle, then the total reward during the cycle is ∫_0^X (X − t) dt = X²/2. So, the average excess is (also) E[X²]/(2E[X]).

  42. Inspection Paradox Suppose that the distribution of the time between renewals, F, is unknown. One way to estimate it is to choose some sampling times t1, t2, etc., and for each ti record the total amount of time between the renewals just before and just after ti. This scheme will overestimate the inter-renewal times – why? For each sampling time t, we will record XN(t)+1 = SN(t)+1 − SN(t). Find its distribution by conditioning on the time of the last renewal prior to time t.

  43. Inspection Paradox (cont.) [Figure: a timeline from 0 to beyond t, with the last renewal before t at SN(t) = t − s and the next renewal at SN(t)+1.]

  44. Inspection Paradox (cont.) For any s, P{XN(t)+1 > x | SN(t) = t − s} = P{X > x | X > s} ≥ P{X > x}, so P{XN(t)+1 > x} ≥ P{X > x}, where X is an ordinary inter-renewal time. Intuitively, by choosing "random" times, it is more likely we will choose a time that falls in a long time interval.
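
The length bias is easy to see numerically: sampling a rate-1 Poisson renewal process at uniform random times, the interval containing the sample has mean near 2, twice the ordinary mean inter-arrival time 1/λ = 1 (length-biased sampling inflates the mean to E[X²]/E[X]). A sketch:

```python
import bisect
import random

rng = random.Random(5)
lam, t_end = 1.0, 100_000.0

# One long realization of the renewal (Poisson) process.
arrivals, t = [0.0], 0.0
while t < t_end:
    t += rng.expovariate(lam)
    arrivals.append(t)

# Record the length of the interval containing each sampling time.
lengths = []
for _ in range(20_000):
    u = rng.uniform(0.0, t_end)
    k = bisect.bisect_right(arrivals, u)   # u lies in (S_{k-1}, S_k]
    lengths.append(arrivals[k] - arrivals[k - 1])

sampled_mean = sum(lengths) / len(lengths)   # near 2, not 1/lam = 1
```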
