
Workshop on Stochastic Differential Equations and Statistical Inference for Markov Processes


Presentation Transcript


  1. Workshop on Stochastic Differential Equations and Statistical Inference for Markov Processes January 19th – 22nd 2012 Lahore University of Management Sciences

  2. Schedule • Day 1 (Saturday 21st Jan): Review of Probability and Markov Chains • Day 2 (Saturday 28th Jan): Theory of Stochastic Differential Equations • Day 3 (Saturday 4th Feb): Numerical Methods for Stochastic Differential Equations • Day 4 (Saturday 11th Feb): Statistical Inference for Markovian Processes

  3. Today • Review of Probability • Simulation of Random Variables • Review of Discrete Time Markov Chains • Review of Continuous Time Markov Chains

  4. Review of Probability

  5. Why Probability Models? • Are laws of nature truly probabilistic? • Encoding uncertainty in models • Financial Markets, Biological Processes, Turbulence, Statistical Physics, Quantum Physics

  6. Mathematical Foundations • S is a collection of elements (the outcomes of an experiment) • Each (nice) subset of S is an event • A is a collection of (nice) subsets of S • The set function P: A → [0, 1] is called a probability measure iff P(S) = 1, P(E) ≥ 0 for every event E, and P(∪_i E_i) = Σ_i P(E_i) for any countable collection of mutually exclusive events E_1, E_2, …

  7. Independence • Two events A and B are independent iff P(A ∩ B) = P(A)P(B) • This means that the occurrence of one does not affect the probability of occurrence of the other

  8. Conditional Probability • Probability of A given that B has occurred • Denoted by P(A | B); P(A | B) = P(A ∩ B) / P(B) when P(B) > 0 • Independence can be reformulated as P(A | B) = P(A)

  9. Random Variables • A random variable X is a real-valued function defined on the sample space such that {X ≤ x} is an event for every real x • A is the state space of the random variable • If A is finite or countably infinite, X is discrete • If A is an interval, X is continuous

  10. Cumulative Distribution Function • The cumulative distribution function of X is the function F(x) = P(X ≤ x) • F is non-decreasing and right-continuous, with F(x) → 0 as x → −∞ and F(x) → 1 as x → ∞

  11. Probability Mass Function • If X is a discrete random variable, the function p(x) = P(X = x) is called the probability mass function of X • We also have Σ_x p(x) = 1 • The cdf satisfies F(x) = Σ_{y ≤ x} p(y)

  12. Probability Density Function • If X is a continuous random variable, the probability density function f is a non-negative function with P(a ≤ X ≤ b) = ∫_a^b f(x) dx • The cdf satisfies F(x) = ∫_{−∞}^x f(t) dt, so f(x) = F′(x) wherever F is differentiable

  13. Discrete Distributions • Uniform on {1, 2, …, n}: P(X = k) = 1/n for k = 1, …, n • Bernoulli(p): P(X = 1) = p, P(X = 0) = 1 − p • Binomial(n, p): P(X = k) = C(n, k) p^k (1 − p)^(n−k) for k = 0, …, n • Poisson(λ): P(X = k) = e^{−λ} λ^k / k! for k = 0, 1, 2, …

  14. Continuous Random Variables • Uniform on (a, b): f(x) = 1/(b − a) for a < x < b • Exponential(λ): f(x) = λ e^{−λx} for x ≥ 0 • Gaussian N(μ, σ²): f(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²))

  15. Expectation of a R.V. • The expectation is defined as E[X] = ∫ x f(x) dx for a continuous random variable • For a discrete random variable, E[X] = Σ_x x p(x) • What is it? A probability-weighted average of the values that X takes

  16. Expectation of a Function of a R.V. • “Law of the unconscious statistician”: E[g(X)] = ∫ g(x) f(x) dx in the continuous case, and E[g(X)] = Σ_x g(x) p(x) in the discrete case

  17. Moments • The nth moment is given by E[Xⁿ] • What do they ‘mean’? The first moment is the mean (a measure of location) and the second central moment E[(X − E[X])²] is the variance (a measure of spread)

  18. Multivariate Distributions • Several random variables can be associated with the same sample space • Can define a joint pmf or pdf • In the case of a bivariate random vector (X1, X2), the joint pdf f(x1, x2) satisfies P((X1, X2) ∈ B) = ∬_B f(x1, x2) dx1 dx2

  19. Marginal pdf • The marginal pdf of X1 is given by f_{X1}(x1) = ∫ f(x1, x2) dx2 • The marginal pdf of X2 is given by f_{X2}(x2) = ∫ f(x1, x2) dx1

  20. Conditional Expectation • The conditional expectation of X1 given X2 = x2 is E[X1 | X2 = x2] = ∫ x1 f(x1 | x2) dx1, where f(x1 | x2) = f(x1, x2) / f_{X2}(x2) • Note that E[X1 | X2] is a function of the random variable X2, and hence a random variable itself!

  21. Probability Generating Function • The pgf of a random variable X taking values in {0, 1, 2, …} is given by G(s) = E[s^X] = Σ_k P(X = k) s^k • The pmf can be recovered by taking derivatives evaluated at 0: P(X = k) = G^(k)(0) / k!

  22. Central Limit Theorem • Why are many physical processes well modeled by Gaussians? • Let X1, X2, … be i.i.d. random variables with finite mean μ and variance σ²; then, as n → ∞, the limiting distribution of (X1 + … + Xn − nμ) / (σ√n) is a standard normal N(0, 1)
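
A minimal numerical illustration of the theorem (assuming NumPy; the Exponential(1) summands, n = 1000 and the number of replications are arbitrary choices): standardized sums of i.i.d. exponentials are compared with the standard normal cdf at a few points.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
n, reps = 1000, 20_000          # summands per sum, number of replications
mu, sigma = 1.0, 1.0            # mean and std of the Exponential(1) summands

x = rng.exponential(scale=1.0, size=(reps, n))
z = (x.sum(axis=1) - n * mu) / (sigma * sqrt(n))   # standardized sums

# Compare empirical probabilities P(Z <= t) with the standard normal cdf.
for t in (-2.0, -1.0, 0.0, 1.0, 2.0):
    phi = 0.5 * (1.0 + erf(t / sqrt(2.0)))
    print(f"t = {t:+.1f}: empirical {np.mean(z <= t):.4f}  normal {phi:.4f}")
```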

  23. Law of Large Numbers • Let X1, X2, … be i.i.d. random variables with finite mean μ and variance; then the sample mean (X1 + … + Xn) / n converges to μ as n → ∞
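
A short sketch of the law in action (Uniform(0, 1) variables are an arbitrary choice; their true mean is 0.5): the running sample mean settles near the true mean as n grows.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, size=100_000)     # i.i.d. Uniform(0,1), true mean 0.5

# Running sample mean after the first n observations.
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)
for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n = {n:>6d}: sample mean = {running_mean[n - 1]:.4f}")
```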

  24. Numerics • Simulate a 1-D random Walk • Calculate the mean • Calculate the Variance • Simulate a 2D random walk • Calculate the mean • Calculate the Variance
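
A minimal sketch for these exercises (NumPy; the number of steps and of walks are illustrative): simulate many simple symmetric random walks in 1-D and 2-D and estimate the mean and variance of the final position (for the 1-D symmetric walk these are 0 and the number of steps).

```python
import numpy as np

rng = np.random.default_rng(2)
n_steps, n_walks = 1000, 5000              # illustrative sizes

# 1-D simple symmetric random walk: each step is +1 or -1 with probability 1/2.
steps_1d = rng.choice([-1, 1], size=(n_walks, n_steps))
final_1d = steps_1d.sum(axis=1)
print("1-D final position: mean", final_1d.mean(), " variance", final_1d.var())

# 2-D walk: each step moves one unit in one of the four lattice directions.
moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
idx = rng.integers(0, 4, size=(n_walks, n_steps))
final_2d = moves[idx].sum(axis=1)                   # shape (n_walks, 2)
print("2-D final position: mean", final_2d.mean(axis=0))
print("2-D per-coordinate variance:", final_2d.var(axis=0))
```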

  25. Simulating a Binomially Distributed Random Variable • Note that a sum of independent Bernoulli trials is binomial • Let Xi be a Bernoulli trial with probability ‘p’ of success • Then X1 + X2 + … + Xn is binomial with parameters ‘n’ and ‘p’
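
A sketch of this construction (illustrative parameters n = 10, p = 0.3): each row of uniforms compared against p gives n Bernoulli trials, and the row sums are Binomial(n, p) draws whose sample mean and variance can be checked against np and np(1 − p).

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, reps = 10, 0.3, 100_000              # illustrative parameters

# Each entry is a Bernoulli(p) trial; each row sum is one Binomial(n, p) draw.
bernoulli = (rng.random(size=(reps, n)) < p).astype(int)
binomial = bernoulli.sum(axis=1)

print("sample mean    :", binomial.mean(), "  theory:", n * p)
print("sample variance:", binomial.var(), "  theory:", n * p * (1 - p))
```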

  26. Continuous Random Variables • Inverse Transform Method • Suppose a random variable has a continuous, strictly increasing cdf ‘F(x)’ and U is uniform on (0, 1) • Then Y = F⁻¹(U) also has cdf F • Generating the exponential: F(x) = 1 − e^{−λx}, so Y = −ln(1 − U)/λ is Exponential(λ) • Generate the exponential, compare with the exact cdf • Generate a r.v. with a given cdf
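
A sketch of the inverse transform method for the exponential case (the rate λ = 2 and the sample size are arbitrary choices): apply F⁻¹(u) = −ln(1 − u)/λ to uniforms and compare the empirical cdf with the exact cdf 1 − e^{−λx} at a few points.

```python
import numpy as np

rng = np.random.default_rng(4)
lam, n = 2.0, 100_000                    # illustrative rate and sample size

u = rng.random(n)                        # U ~ Uniform(0, 1)
y = -np.log(1.0 - u) / lam               # Y = F^{-1}(U) is Exponential(lam)

# Compare the empirical cdf with the exact cdf F(x) = 1 - exp(-lam * x).
for x in (0.25, 0.5, 1.0, 2.0):
    print(f"x = {x:.2f}: empirical {np.mean(y <= x):.4f}"
          f"  exact {1.0 - np.exp(-lam * x):.4f}")
```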

  27. Rejection Method • To simulate X with density f, choose a density g that is easy to sample and a constant c with f(x) ≤ c g(x) for all x • Simulate Y from g and U uniform on (0, 1) • If U ≤ f(Y) / (c g(Y)), accept X = Y; else reject and repeat • To simulate N(0, 1), take the half-normal density f(x) = √(2/π) e^{−x²/2} as target and the Exponential(1) density as proposal, with c = √(2e/π) • If Y is accepted, set X = Y or X = −Y with probability 1/2 each
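
A sketch of the scheme just described (half-normal target, Exponential(1) proposal, c = √(2e/π), so the acceptance ratio simplifies to exp(−(y − 1)²/2); the sample size is an arbitrary choice), with a random sign at the end to recover N(0, 1).

```python
import numpy as np

rng = np.random.default_rng(5)

def sample_standard_normal(n: int) -> np.ndarray:
    """Rejection sampling of N(0,1): half-normal target, Exponential(1) proposal."""
    out = []
    while len(out) < n:
        y = rng.exponential(1.0)                     # proposal Y ~ Exponential(1)
        u = rng.random()
        # Accept if U <= f(Y) / (c g(Y)) = exp(-(Y - 1)^2 / 2).
        if u <= np.exp(-0.5 * (y - 1.0) ** 2):
            sign = 1.0 if rng.random() < 0.5 else -1.0   # random sign gives N(0,1)
            out.append(sign * y)
    return np.array(out)

z = sample_standard_normal(50_000)
print("mean:", z.mean(), " variance:", z.var())      # close to 0 and 1
```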

  28. Section Challenge • Kruskal’s Paper and Simulation of the Kruskal Count • The n-hat problem through various approaches and simulating the n-hat problem

  29. Stochastic Processes

  30. Boring Definitions • A stochastic process is a collection of random variables {X(t) : t ∈ T} • T is the index set, S is the common sample space • For each fixed t ∈ T, X(t) denotes a single random variable • For each fixed outcome in S, X(t) is a function defined on T (a sample path)

  31. Types of Stochastic Processes • Discrete Time Discrete Space (DTMC) • Discrete Time Continuous Space (Time Series) • Continuous Time Discrete Space (CTMC) • Continuous Time Continuous Space (SDE)

  32. Discrete Time Discrete Space Processes Discrete Time Markov Chains

  33. Discrete Time Markov Chain • The index set is discrete (finite or infinite), e.g. n = 0, 1, 2, … • Markov Property: P(X_{n+1} = j | X_n = i, X_{n−1} = i_{n−1}, …, X_0 = i_0) = P(X_{n+1} = j | X_n = i)

  34. Transition Probability Matrix • The one-step transition probability is defined as p_{ij} = P(X_{n+1} = j | X_n = i) • If the transition probability does not depend on n, the process is stationary or homogeneous • The transition matrix is P = (p_{ij}), with Σ_j p_{ij} = 1 for every i
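
As a small illustration (the 3-state matrix below is hypothetical, chosen only for the example; its rows sum to 1), a homogeneous DTMC can be simulated by repeatedly sampling the next state from the row of P indexed by the current state.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical 3-state transition matrix; each row sums to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

def simulate_dtmc(P: np.ndarray, x0: int, n_steps: int) -> np.ndarray:
    """Simulate one path of a homogeneous DTMC with transition matrix P."""
    path = np.empty(n_steps + 1, dtype=int)
    path[0] = x0
    for n in range(n_steps):
        # Draw the next state from the row of P indexed by the current state.
        path[n + 1] = rng.choice(P.shape[0], p=P[path[n]])
    return path

print(simulate_dtmc(P, x0=0, n_steps=20))
```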

  35. N-step Transition Probability • The n-step transition probability is p_{ij}^{(n)} = P(X_{m+n} = j | X_m = i) • How is this related to the one-step transition probability? • Guess: perhaps as the nth power of P?

  36. Chapman–Kolmogorov Equations • Getting from i to j in n steps is equivalent to getting from i to k in s steps and from k to j in n − s steps, summed over all possible intermediate states k: p_{ij}^{(n)} = Σ_k p_{ik}^{(s)} p_{kj}^{(n−s)} • The n-step transition matrix is just the nth power of the one-step transition matrix: P^{(n)} = Pⁿ!
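
A quick numerical check of this identity (reusing the hypothetical 3-state matrix from the earlier sketch): multiplying the one-step matrix into itself n times, as the Chapman–Kolmogorov equations prescribe, agrees with the direct matrix power Pⁿ.

```python
import numpy as np

# Hypothetical row-stochastic transition matrix (for illustration only).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

n = 5
# Build the n-step matrix via Chapman-Kolmogorov with s = 1: P^(k) = P^(k-1) P.
Pn = np.eye(P.shape[0])
for _ in range(n):
    Pn = Pn @ P

print(np.allclose(Pn, np.linalg.matrix_power(P, n)))   # True
print(Pn)
```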

  37. Communication Classes • Two states i and j ‘communicate’ (i ↔ j) if p_{ij}^{(m)} > 0 and p_{ji}^{(n)} > 0 for some m and n • Communication is an equivalence relation • The equivalence classes are called the ‘classes’ of the DTMC • If there is only one class, the MC is irreducible

  38. Class Properties • Periodicity: the period of state i, ‘d(i)’, is the GCD of all n for which p_{ii}^{(n)} > 0 • First return time: τ_{ii} = min{n ≥ 1 : X_n = i | X_0 = i} • Transience & recurrence • Transience: P(τ_{ii} < ∞) < 1 • Recurrence: P(τ_{ii} < ∞) = 1

  39. Mean Return Time • Let τ_{ii} be the random variable defining the first return time to state i • The mean μ_{ii} = E[τ_{ii}] is the mean return time • For a transient state μ_{ii} = ∞; a recurrent state is positive recurrent if μ_{ii} < ∞ and null recurrent if μ_{ii} = ∞

  40. First Passage Time • The first passage time from state i to state j is defined as τ_{ij} = min{n ≥ 1 : X_n = j | X_0 = i}

  41. Stationary Distribution • For a DTMC, a stationary distribution is a non-negative vector π satisfying π P = π and Σ_i π_i = 1 • i.e. π is a left eigenvector of P corresponding to eigenvalue 1, normalized to sum to 1
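
A sketch of this computation (assuming the row-stochastic convention πP = π and reusing the hypothetical 3-state matrix): a left eigenvector of P with eigenvalue 1 is an eigenvector of Pᵀ, normalized so its entries sum to 1.

```python
import numpy as np

# Hypothetical row-stochastic transition matrix (illustration only).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Left eigenvectors of P are eigenvectors of P transpose.
eigvals, eigvecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(eigvals - 1.0))        # eigenvalue closest to 1
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()                          # normalize to a probability vector

print("stationary distribution:", pi)
print("check pi P = pi:", np.allclose(pi @ P, pi))
```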

  42. Existence Theorem for Stationary Distribution • For a positive recurrent, aperiodic and irreducible DTMC there exists a unique stationary distribution π such that π_j = lim_{n→∞} p_{ij}^{(n)} = 1/μ_{jj} > 0 for all i and j

  43. Logistic Growth • A birth-and-death DTMC on {0, 1, …, N}: the transition probabilities are given by p_{i,i+1} = b(i)Δt, p_{i,i−1} = d(i)Δt and p_{i,i} = 1 − (b(i) + d(i))Δt, where b(i) and d(i) are the birth and death probabilities for population size i • Note the correspondence with the deterministic logistic model dn/dt = b(n) − d(n)

  44. DTMC SIS Epidemic Model • Compartmental model with a susceptible class S and an infected class I, with transitions S → I (infection) and I → S (recovery)

  45. The Infected Class • I is a random variable that describes the number of infected individuals, with state space I = {0, 1, 2, …, N} • Two classes: {0} and {1, 2, …, N} • {0} is the absorbing class (the disease eventually dies out) • The average time spent in the infected states is obtained from the fundamental matrix (Id − F)⁻¹, where F is the sub-matrix of the transition matrix corresponding to the transient states
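
A sketch of this calculation for a generic absorbing chain (the 4-state matrix below is hypothetical and not the SIS model itself): F is the sub-matrix over the transient states, and the row sums of (Id − F)⁻¹ give the expected number of steps spent in the transient states before absorption.

```python
import numpy as np

# Hypothetical absorbing DTMC: state 0 is absorbing, states 1-3 are transient.
P = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.3, 0.4, 0.3, 0.0],
              [0.0, 0.3, 0.4, 0.3],
              [0.0, 0.0, 0.4, 0.6]])

F = P[1:, 1:]                                         # transient-state sub-matrix
fundamental = np.linalg.inv(np.eye(F.shape[0]) - F)   # (Id - F)^{-1}

# Row i sums to the expected number of steps before absorption from state i+1.
print("expected time to absorption from states 1..3:", fundamental.sum(axis=1))
```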

  46. DTMC SIR Epidemic Model • Compartmental model S → I → R • The pair (S, I) is a bivariate DTMC whose one-step transition probabilities are built from the infection and recovery probabilities

  47. Section Challenge • Simulate: • Logistic Growth • SIS Model • SIR Model • Compare the mean of the MC simulation with the solution of the corresponding deterministic model (a sketch for the SIS case is given below)
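
A sketch for the SIS part of the challenge, assuming infection and recovery probabilities βI(N − I)/N · Δt and γI · Δt per time step (the forms and parameter values used in the workshop may differ): the mean of many DTMC paths is compared with an Euler solution of the deterministic SIS equation dI/dt = βI(N − I)/N − γI.

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed illustrative parameters (not necessarily those used in the workshop).
N, beta, gamma = 100, 1.0, 0.5     # population size, infection and recovery rates
dt, T, i0, n_paths = 0.01, 10.0, 5, 1000
n_steps = int(T / dt)

def b(i):   # probability of one new infection during a step of length dt
    return beta * i * (N - i) / N * dt

def d(i):   # probability of one recovery during a step of length dt
    return gamma * i * dt

# Simulate many DTMC paths of the infected class I.
I = np.full(n_paths, i0, dtype=int)
for _ in range(n_steps):
    u = rng.random(n_paths)
    up = u < b(I)                              # I -> I + 1
    down = (u >= b(I)) & (u < b(I) + d(I))     # I -> I - 1
    I = I + up.astype(int) - down.astype(int)

# Deterministic SIS model dI/dt = beta*I*(N - I)/N - gamma*I, Euler's method.
i_det = float(i0)
for _ in range(n_steps):
    i_det += (beta * i_det * (N - i_det) / N - gamma * i_det) * dt

print("mean of the DTMC paths at time T:", I.mean())
print("deterministic I(T):              ", i_det)
```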

  48. Continuous Time Discrete Space Processes Continuous Time Markov Chains

  49. Definitions • The index set is an interval, e.g. [0, ∞) • States are discrete • Markov Property: for any sequence of times 0 ≤ t_0 < t_1 < … < t_n < t_{n+1}, P(X(t_{n+1}) = j | X(t_n) = i_n, …, X(t_0) = i_0) = P(X(t_{n+1}) = j | X(t_n) = i_n)

  50. Transition Probability • The transition probability is given by p_{ij}(s, t) = P(X(t) = j | X(s) = i) for s ≤ t • If this depends only on the length t − s of the time interval, the chain is homogeneous
