## Chapter 8 Martingale & Brownian motion


**Chapter 8 Martingale & Brownian motion**
Prof. Bob Li

**Markov property of BM**
• Definition. A stochastic process {X(t)}t≥0 is called a Markov process if, for any t > 0 and s > 0, the conditional distribution of X(t+s) given the process on the interval [0, s] is the same as the conditional distribution of X(t+s) given just the end point X(s). // Only the newest info matters
• Property 11. A BM is a Markov process.
• Proof. Write X(t+s) = X(s) + [X(t+s) − X(s)]. The increment X(t+s) − X(s) is independent of the process on [0, s]. // Independent increment

**Transition probability function of BM**
The transition probability function of a Markov process {X(t)}t≥0 is formulated as the conditional distribution function of the process at time t given that it is at the point b at a time s < t:
p(a, t | b, s) = P{X(t) ≤ a | X(s) = b}
When the Markov process is a BM, the transition probability function does not change under time shift because of stationary increments. That is,
P{X(t) ≤ a | X(s) = b} = P{X(t−s) ≤ a | X(0) = b}
This, in the case of a standard BM, is the Normal(b, t−s) distribution function:
P{X(t) ≤ a | X(s) = b} = Φ((a − b)/√(t−s)), where Φ denotes the standard normal distribution function.
The transition probability function is a form of conditioning on a past event.

**Backward conditioning of standard BM**
Before proceeding onto further properties of BM, we show that conditioning on a future event leads to a different normal distribution. Again, let s < t. Writing fu for the Normal(0, u) density, the conditional density of X(s) given X(t) = b is
fs(x) f(t−s)(b − x) / ft(b) // Independent increment // Stationary increment
This is the Normal(bs/t, (t−s)s/t) distribution.
// bs/t means the time-prorated average
// vs. the Normal(b, t−s) distribution of forward conditioning; the variance differs by the factor s/t.

**Notation for r.v. with randomness in condition**
This is the Normal(bs/t, (t−s)s/t) distribution. We may write
E[X(s) | X(t) = b] = bs/t
The latter expression is a function of b, which denotes the value of the r.v. X(t). Bypassing the notation b, the expression is conceptually a function of X(t). This leads to the r.v.
E[X(s) | X(t)] = X(t)·s/t
of which the randomness comes from X(t) in the condition, while the randomness of X(s) is neutralized by E[·]. Similar notation will apply later on.
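The backward-conditioning distribution lends itself to a quick numerical check. Below is a minimal Monte Carlo sketch (the values s = 1, t = 4 and all variable names are illustrative assumptions, not from the slides): it samples the pair (X(s), X(t)) of a standard BM via independent increments and confirms that the best linear predictor of X(s) from X(t) has slope s/t with residual variance (t − s)s/t.

```python
import numpy as np

rng = np.random.default_rng(0)
s, t, n = 1.0, 4.0, 200_000  # illustrative times and sample size

# Sample (X(s), X(t)) of a standard BM via independent increments.
xs = np.sqrt(s) * rng.standard_normal(n)
xt = xs + np.sqrt(t - s) * rng.standard_normal(n)

# Backward conditioning predicts E[X(s) | X(t)] = X(t) * s/t,
# so the regression slope of X(s) on X(t) should be s/t = 0.25 ...
slope = np.cov(xs, xt)[0, 1] / np.var(xt)

# ... with conditional variance (t - s) * s / t = 0.75.
resid_var = np.var(xs - slope * xt)

print(slope, resid_var)
```

Since (X(s), X(t)) is jointly normal, the conditional mean is linear in X(t), so the least-squares slope identifies the factor s/t directly.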
**Stopping time**
• Next, we shall look into the strong Markov property of BM. First, we need the following notion:
• Definition. A random time T is called a stopping time for a continuous-time process {X(t)}t≥0 if, for any t, it is possible to decide whether T ≤ t by observing the process over [0, t].

**Strong Markov property of BM**
• The strong Markov property is the same as the Markov property, except that the fixed-time condition in the Markov property is replaced by a stopping-time one:
• Property 12. A BM {X(t)}t≥0 has the strong Markov property. That is, for any finite stopping time T and any t > 0, the conditional distribution of X(T+t) given the process on the interval [0, T] is the same as the conditional distribution of X(T+t) given just the stopping point X(T).
• Equivalently, the conditional r.v. of X(T+t) given X(T) is independent of the initial part of the process {X(t)} during the time interval (0, T).
• Corollary. Let T be a finite stopping time of the BM {X(t)}t≥0. Then the new process {X*(t)}t≥0 defined via X*(t) = X(T+t) − X(T) is a BM.
• Intuitive proof. Condition on X(T) = b. The new process is a BM for every b because of independent increments. Hence the new process meets whatever qualification is required of a BM under the condition X(T) = b, for any b.

**Hitting times**
Definition. For a drift-free BM {X(t)}t≥0 with X(0) = 0, the hitting time of a real value a is
Ta = min{t ≥ 0 : X(t) = a}
It is clearly a stopping time of the BM. Hereafter we assume that a > 0 in Ta, while the symmetric hitting time T−a has the identical distribution because X(0) = 0.
Theorem. For a standard BM, P{Ta ≤ t} = 2P{X(t) ≥ a}.
Proof.
P{X(t) ≥ a} = P{X(t) ≥ a | Ta ≤ t} P{Ta ≤ t} // Ta ≤ t is an à priori of the event X(t) ≥ a because X(0) = 0
= (1/2) P{Ta ≤ t} // Having hit the value a by the time t, the BM becomes a BM starting at the value a afterwards, hence is equally likely to be above or below a at time t

**Hitting times (continued)**
// Conditioning on Ta = s and applying the strong Markov property at time s
In conclusion, for a standard BM,
P{Ta ≤ t} = 2P{X(t) ≥ a} = 2[1 − Φ(a/√t)], where Φ denotes the standard normal distribution function.

**Zero crossings**
There will surely be a hit sooner or later: letting t → ∞ above gives P{Ta < ∞} = 1. However, this recurrence is non-positive, as we are to show that the average hitting time is ∞. // Recall the notion of a positive recurrent state of a Markov chain.
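The reflection-principle identity P{Ta ≤ t} = 2P{X(t) ≥ a} can be illustrated by simulation. The sketch below uses assumed values a = 1, t = 1 (not from the slides); note that a discrete time grid slightly undershoots the hitting probability, because crossings between grid points are missed.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
a, t = 1.0, 1.0                  # illustrative level and horizon
n_paths, n_steps = 10_000, 1_000
dt = t / n_steps

# Simulate standard-BM paths on a time grid; a path "hits a by time t"
# when its running maximum reaches a.
paths = np.cumsum(rng.standard_normal((n_paths, n_steps)) * sqrt(dt), axis=1)
estimate = (paths.max(axis=1) >= a).mean()

# Reflection principle: P{T_a <= t} = 2 * (1 - Phi(a / sqrt(t))).
Phi = 0.5 * (1.0 + erf(a / (sqrt(t) * sqrt(2.0))))
exact = 2.0 * (1.0 - Phi)

print(estimate, exact)  # estimate sits slightly below exact due to the grid
```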
**Non-positive recurrence of hitting times**
ETa = ∫0∞ P{Ta > t} dt = ∫0∞ [2Φ(a/√t) − 1] dt = ∞
since 2Φ(a/√t) − 1 behaves like a√(2/(πt)) as t → ∞, whose integral over t diverges.

**Zero crossings**
Letting a ↓ 0 in P{Ta ≤ t} = 2[1 − Φ(a/√t)] gives the limit 1. This new limit says that within any fixed amount of time t > 0, the standard BM almost surely will move to the right by some distance a. // To the left too, by symmetry
Almost surely, the standard BM will not stay on either one side of the point X(t) throughout a time interval (t, t+h), no matter how small h is. The same fact easily extends to a drift-free BM. This proves:
Property 13 (Zero crossings). The set of zero crossings on a sample path of a drift-free BM is almost surely dense around any zero-crossing time. That is, no sample path has isolated zero crossings.

**Zero crossings (continued)**
Proposition. For 0 < s < t, let O(s, t) denote the event that the standard BM hits the zero value at least once in the interval (s, t). Then // O stands for Origin
P{O(s, t)} = 1 − (2/π) arcsin √(s/t)
Proof. Writing fs for the Normal(0, s) density of X(s),
P{O(s, t)} = ∫ P{O(s, t) | X(s) = x} fs(x) dx // Conditioning on X(s)
= 2 ∫0∞ P{Tx ≤ t − s} fs(x) dx // Symmetry; P{Tx ≤ t − s} is the probability of moving by at least x to the left within the time t − s
Evaluating the integral yields the arcsine expression.

**Interval exiting of drift-free BM**
Let T(a, b) denote min{t ≥ 0 : X(t) = a or X(t) = b} for a < 0 < b. It means the waiting time until a BM exits from the interval [a, b]. // min = inf when continuous
Theorem. ET(a, b) = a² ET(b/a, 1) for a drift-free BM.
Proof. By Brownian scaling, {X(a²t)/a}t≥0 is again a drift-free BM (note a < 0; the negative of a BM is a BM). It reaches b/a exactly when X reaches b and reaches 1 exactly when X reaches a, so T(a, b) = a² T(b/a, 1) in distribution.
Theorem. For a standard BM, ET(a, b) = −ab.
Proof. Applying the martingale stopping theorem to the martingale {X(t)}t≥0 gives EX(T(a, b)) = 0, whence P{X(T(a, b)) = b} = −a/(b − a). Applying it to the martingale {X²(t) − t}t≥0 below then gives
ET(a, b) = EX²(T(a, b)) = a²·b/(b − a) + b²·(−a)/(b − a) = −ab

**Continuous-time martingale**
Definition. A continuous-time process {X(t)}t≥0 is called a martingale when E|X(t)| < ∞ for all t and
E[X(t) | X(u), 0 ≤ u ≤ s] = X(s) for all s < t

**Martingale property of drift-free BM**
Proposition. A drift-free BM {B(t)}t≥0 is a martingale.
Proof. E[B(t) | B(u), 0 ≤ u ≤ s] = E[B(t) − B(s) | B(u), 0 ≤ u ≤ s] + B(s) = B(s) // Drift-free: every increment has mean 0
• Lévy's characterization of BM: A standard BM is equivalent to an almost surely continuous martingale with X(0) = 0 and quadratic variation t.

**Martingales and drift-free BM**
Proposition. Let {B(t)}t≥0 be a drift-free BM with Var(B(t)) = t. Then {B²(t) − t}t≥0 defines a martingale.
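The arcsine formula for zero crossings admits a simulation check. The sketch below uses assumed values s = 1, t = 4 (not from the slides), for which the exact answer is 1 − (2/π)arcsin(1/2) = 2/3; sign changes detected on a discrete grid slightly undercount zero hits.

```python
import numpy as np

rng = np.random.default_rng(2)
s, t = 1.0, 4.0                  # illustrative times; exact answer is 2/3
n_paths, n_steps = 5_000, 2_000
dt = t / n_steps

paths = np.cumsum(rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt), axis=1)

# Event O(s, t): the path touches 0 somewhere in (s, t).  On the grid,
# detect a sign change (or exact zero) between consecutive points after time s.
i0 = int(s / dt)
seg = paths[:, i0:]
estimate = (seg[:, :-1] * seg[:, 1:] <= 0).any(axis=1).mean()

exact = 1.0 - (2.0 / np.pi) * np.arcsin(np.sqrt(s / t))
print(estimate, exact)
```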
// Note that EB²(t) = Var(B(t)) = t. Thus E[B²(t) − t] = 0.
Proof. For s < t,
E[B²(t) | B(u), u ≤ s] = E[(B(t) − B(s))² | B(u), u ≤ s] + 2B(s) E[B(t) − B(s) | B(u), u ≤ s] + B²(s) = (t − s) + B²(s)
so E[B²(t) − t | B(u), u ≤ s] = B²(s) − s.
Inverse fact. If a process {B(t)}t≥0 is a continuous-time martingale such that B²(t) − t is also a martingale, then {B(t)}t≥0 is a BM.

**Exponential martingale of BM**
Theorem. Let {B(t)}t≥0 be a drift-free BM with Var(B(t)) = t. Then, for every constant c,
exp(cB(t) − c²t/2)
defines a martingale, which is called the exponential martingale. Proof. Skipped.
Inverse theorem. Let {B(t)}t≥0 be a continuous process such that exp(cB(t) − c²t/2) is a martingale for all c. Then {B(t)}t≥0 is a BM. Proof. Skipped.

**Brownian motion with drift**
Definition. A stochastic process {X(t)}t≥0 is a BM with the drift coefficient μ if
X(t) = B(t) + μt
where {B(t)}t≥0 is a drift-free BM.

**Brownian motion with drift (continued)**
Let {B(t)}t≥0 be a standard BM and X(t) = B(t) + μt + x. Thus {X(t)}t≥0 is a BM with drift (coefficient) μ started at x. Let A, B > 0. We are to calculate, for all x between −B and A, the probability P(x) for {X(t)}t≥0 to hit A before −B. For this purpose, we shall derive a differential equation for P(x) by conditioning on Y = X(h) − X(0) = X(h) − x, which is Normal(μh, h) distributed.
P(x) = E[P(x + Y)] + o(h) // o(h) probability for hitting A or −B by the time h already
= E[P(x) + P′(x)Y + (1/2)P″(x)Y² + …] + o(h) // Taylor expansion w.r.t. the variable Y
= P(x) + μh P′(x) + (1/2)h P″(x) + o(h)
Hence (1/2)P″(x) + μP′(x) = 0 with P(−B) = 0 and P(A) = 1, giving
P(x) = [e^(−2μx) − e^(2μB)] / [e^(−2μA) − e^(2μB)]

**Brownian motion with down drift**
Assume that X(0) = 0 and μ < 0. Letting B → ∞ in P(0),
P{X(t) ever hits A} = e^(2μA) // Since μ < 0, e^(2μB) → 0 as B → ∞
• Equivalently, P{max over t ≥ 0 of X(t) ≥ A} = e^(−2|μ|A)
• In other words, the all-time maximum of X(t) is an exponential r.v. with parameter 2|μ|.

**Application of down-drift BM to stock option**
• Problem. Consider a stock call option with:
• a striking price $A above the current price, and no expiration date;
• a stock price that follows a BM with a drift −d, d > 0.
• The strategy is to set a target level x and exercise the option as soon as the stock price hits this level. What should x be?
Solution. Exercising the option at the target price x (above the striking price A, which is above the current price 0) would create a profit of $(x − A).
The expected gain = P{The target x is ever hit} × (x − A) = e^(−2dx) (x − A)
Setting the derivative to zero, e^(−2dx)[1 − 2d(x − A)] = 0, so the optimal target level is x = A + 1/(2d).
// An à priori of exercising is to cross the striking price A first.
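The optimal target level x = A + 1/(2d) can be confirmed numerically. The sketch below uses hypothetical numbers (d = 0.05, A = 10, not from the slides) and simply takes the argmax of the expected gain e^(−2dx)(x − A) on a fine grid.

```python
import numpy as np

d, A = 0.05, 10.0  # hypothetical down-drift rate and striking price

# Expected gain from target level x: P{x ever hit} * (x - A),
# with P{x ever hit} = exp(-2*d*x) for a BM with drift -d started at 0.
x = np.linspace(A, A + 5.0 / d, 200_001)
gain = np.exp(-2.0 * d * x) * (x - A)
x_star = x[np.argmax(gain)]

print(x_star, A + 1.0 / (2.0 * d))  # numeric argmax vs. closed form
```

As the slide notes, the optimal differential x − A = 1/(2d) does not depend on the striking price A itself.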
// In setting the price differential between x and A, we may wait until the price A is reached. This is why x − A has turned out independent of A.

**Stock market application**
The mathematical model of Brownian motion has many real-world applications, and stock market fluctuations are often cited. However, B. Mandelbrot (of the "theory of roughness" and fractal geometry) rejected its applicability to stock price movements, in part because these are discontinuous.

**Martingales and BM**
Let X(t) = μt + Y(t), where Y(t) is a standard BM. For −B < 0 < A, the interval exiting time T = min{t ≥ 0 : X(t) = A or X(t) = −B} is a stopping time. The probability P = P{X(T) = A} was previously calculated as
P = [1 − e^(2μB)] / [e^(−2μA) − e^(2μB)]
Alternative derivation. Apply the martingale stopping theorem to the exponential martingale exp(cY(t) − c²t/2):
E[exp(cY(T) − c²T/2)] = 1
Taking c = −2μ, we get cY(t) − c²t/2 = −2μ[X(t) − μt] − 2μ²t = −2μX(t), so {e^(−2μX(t))}t≥0 is a martingale. Thus
1 = E[e^(−2μX(T))] = P e^(−2μA) + (1 − P) e^(2μB)
and solving for P recovers the formula above.

**Martingales and BM (continued)**
Calculation of the average exiting time. Since {Y(t)}t≥0 is a martingale, the martingale stopping theorem gives EY(T) = 0, that is, E[X(T) − μT] = 0. Hence
μ ET = EX(T) = PA − (1 − P)B
so ET = [PA − (1 − P)B]/μ.

**Thank you**
for being here.
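Both conclusions, the exit probability P and the average exit time ET, can be checked by simulating the drifted process until it leaves the interval. This is a minimal sketch with assumed parameters μ = 0.5, A = B = 1 (not from the slides); the discrete grid slightly overshoots both barriers, so agreement is only approximate.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, A, B = 0.5, 1.0, 1.0         # illustrative drift and barriers
n, dt = 20_000, 0.002

x = np.zeros(n)                  # all paths start at 0
t_exit = np.zeros(n)
hit_A = np.zeros(n, dtype=bool)
active = np.ones(n, dtype=bool)
t = 0.0
while active.any():
    # Advance only the paths still inside (-B, A).
    x[active] += mu * dt + np.sqrt(dt) * rng.standard_normal(active.sum())
    t += dt
    up, down = active & (x >= A), active & (x <= -B)
    hit_A[up] = True
    t_exit[up | down] = t
    active &= ~(up | down)

P_exact = (1 - np.exp(2*mu*B)) / (np.exp(-2*mu*A) - np.exp(2*mu*B))
ET_exact = (P_exact * A - (1 - P_exact) * B) / mu

print(hit_A.mean(), P_exact)    # exit-at-A probability vs. martingale formula
print(t_exit.mean(), ET_exact)  # mean exit time vs. [PA - (1-P)B] / mu
```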
