
SS 2008 LV 1837



Presentation Transcript


    1. SS 2008 LV 1837 Computersimulation for Finance

    2. Folie 2 Riccardo Gismondi Index Elements of Probability I Random Numbers Elements of Statistics Elements of Probability II Generating Random Variables Brownian Motion I Geometric Brownian Motion, B&S Model, Pricing of (exotic) options: 1-dim.

    3. Folie 3 Riccardo Gismondi Index Brownian Motion II Geometric Brownian Motion, pricing of (exotic) options: n-dim. Square root diffusion process: CIR Model Heston Model and stochastic Volatility Pricing of Exotic Options: Structured Products (Equity)

    4. Folie 4 Riccardo Gismondi Elements of Probability I

    5. Folie 5 Riccardo Gismondi Introduction Ground-up review of probability necessary to do and understand simulation Assume familiarity with Mathematical analysis (especially derivatives and integrals) Some probability ideas (especially probability spaces, random variables, probability distributions) Outline Probability – basic ideas, terminology Random variables, joint distributions

    6. Folie 6 Riccardo Gismondi Probability Basics (1) Experiment – activity with uncertain outcome Flip coins, throw dice, pick cards, draw balls from urn, … Drive to work tomorrow – Time? Accident? Operate a (real) call center – Number of calls? Average customer hold time? Number of customers getting busy signal? Simulate a call center – same questions as above Sample space – complete list of all possible individual outcomes of an experiment Could be easy or hard to characterize May not be necessary to characterize

    7. Folie 7 Riccardo Gismondi Probability Basics (2) Event – a subset of the sample space Describe by either listing outcomes, “physical” description, or mathematical description Usually denote by E, F, E1, E2, etc. Union, intersection, complementation operations Probability of an event is the relative likelihood that it will occur when you do the experiment A real number between 0 and 1 (inclusively) Denote by P(E), P(E ∪ F), etc. Interpretation – proportion of time the event occurs in many independent repetitions (replications) of the experiment May or may not be able to derive a probability

    8. Folie 8 Riccardo Gismondi Probability Basics (3) Some properties of probabilities If S is the sample space, then P(S) = 1 Can have event E ≠ S with P(E) = 1 If Ø is the empty event (empty set), then P(Ø) = 0 Can have event E ≠ Ø with P(E) = 0 If EC is the complement of E, then P(EC) = 1 – P(E) P(E ∪ F) = P(E) + P(F) – P(E ∩ F) If E and F are mutually exclusive (i.e., E ∩ F = Ø), then P(E ∪ F) = P(E) + P(F) If E is a subset of F (i.e., the occurrence of E implies the occurrence of F), then P(E) ≤ P(F) If o1, o2, … are the individual outcomes in the sample space, then P(E) = Σ P(oi), summing over those oi in E

    9. Folie 9 Riccardo Gismondi Probability Basics (4) Conditional probability Knowing that an event F occurred might affect the probability that another event E also occurred Reduce the effective sample space from S to F, then measure “size” of E relative to its overlap (if any) in F, rather than relative to S Definition (assuming P(F) ≠ 0): P(E|F) = P(E ∩ F) / P(F) E and F are independent if P(E ∩ F) = P(E) P(F) Implies P(E|F) = P(E) and P(F|E) = P(F), i.e., knowing that one event occurs tells you nothing about the other If E and F are mutually exclusive, are they independent?

    10. Folie 10 Riccardo Gismondi Random Variables One way of quantifying, simplifying events and probabilities A random variable (RV) “is a number whose value is determined by the outcome of an experiment”. Technically, a function or mapping from the sample space to the real numbers, but can usually define and work with a RV without going all the way back to the sample space Think: RV is a number whose value we don’t know for sure but we’ll usually know something about what it can be or is likely to be Usually denoted as capital letters: X, Y, W1, W2, etc. Probabilistic behavior described by distribution function

    11. Folie 11 Riccardo Gismondi Discrete vs. Continuous RVs Two basic “flavors” of RVs, used to represent or model different things Discrete – can take on only certain separated values Number of possible values could be finite or infinite Continuous – can take on any real value in some range Number of possible values is always infinite Range could be bounded on both sides, just one side, or neither

    12. Folie 12 Riccardo Gismondi Discrete Distributions (1) Let X be a discrete RV with possible values (range) x1, x2, … (finite or infinite list) Probability mass function (PMF) p(xi) = P(X = xi) for i = 1, 2, ... The statement “X = xi” is an event that may or may not happen, so it has a probability of happening, as measured by the PMF Can express PMF as numerical list, table, graph, or formula Since X must be equal to some xi, and since the xi’s are all distinct, Σi p(xi) = 1

    13. Folie 13 Riccardo Gismondi Discrete Distributions (2) Cumulative distribution function (CDF) – probability that the RV will be ≤ a fixed value x: F(x) = P(X ≤ x) = Σ p(xi), summing over all xi ≤ x Properties of discrete CDFs 0 ≤ F(x) ≤ 1 for all x As x → –∞, F(x) → 0 As x → +∞, F(x) → 1 F(x) is nondecreasing in x F(x) is a step function continuous from the right with jumps at the xi’s of height equal to the PMF at that xi

    14. Folie 14 Riccardo Gismondi Discrete Distributions (3) Computing probabilities about a discrete RV – usually use the PMF Add up p(xi) for those xi’s satisfying the condition for the event With discrete RVs, must be careful about weak vs. strong inequalities – endpoints matter!

    15. Folie 15 Riccardo Gismondi Discrete Expected Values Data set has a “center” – the average (mean) RVs have a “center” – expected value E(X) = Σi xi p(xi) Also called the mean or expectation of the RV X Weighted average of the possible values xi, with weights being their probability (relative likelihood) of occurring What expectation is not: The value of X you “expect” to get E(X) might not even be among the possible values x1, x2, … What expectation is: Repeat “the experiment” many times, observe many X1, X2, …, Xn E(X) is what the sample average (X1 + X2 + … + Xn)/n converges to (in a certain sense) as n → ∞

    16. Folie 16 Riccardo Gismondi Discrete Variances and Standard Deviations Data set has measures of “dispersion” – Sample variance s², Sample standard deviation s RVs have corresponding measures: Var(X) = Σi (xi – E(X))² p(xi) Other common notation: σ²(X) Weighted average of squared deviations of the possible values xi from the mean Standard deviation of X is σ(X) = √Var(X) Interpretation analogous to that for E(X)

    17. Folie 17 Riccardo Gismondi Continuous Distributions (1) Now let X be a continuous RV Possibly limited to a range bounded on left or right or both No matter how small the range, the number of possible values for X is always (uncountably) infinite Not sensible to ask about P(X = x) even if x is in the possible range Technically, P(X = x) is always 0 Instead, describe behavior of X in terms of its falling between two values

    18. Folie 18 Riccardo Gismondi Continuous Distributions (2) Probability density function (PDF) is a function f(x) with the following three properties: f(x) ≥ 0 for all real values x The total area under f(x) is 1: ∫ f(x) dx = 1 over (–∞, +∞) For any fixed a and b with a ≤ b, the probability that X will fall between a and b is the area under f(x) between a and b: P(a ≤ X ≤ b) = ∫a^b f(x) dx Fun facts about PDFs Observed X’s are denser in regions where f(x) is high The height of a density, f(x), is not the probability of anything – it can even be > 1 With continuous RVs, you can be sloppy with weak vs. strong inequalities and endpoints

    19. Folie 19 Riccardo Gismondi Continuous Distributions (3) Cumulative distribution function (CDF) - probability that the RV will be ≤ a fixed value x: F(x) = P(X ≤ x) = ∫–∞^x f(t) dt Properties of continuous CDFs 0 ≤ F(x) ≤ 1 for all x As x → –∞, F(x) → 0 As x → +∞, F(x) → 1 F(x) is nondecreasing in x F(x) is a continuous function with slope equal to the PDF: f(x) = F'(x)

    20. Folie 20 Riccardo Gismondi Continuous Expected Values, Variances and Standard Deviations Expectation or mean of X is E(X) = ∫ x f(x) dx Roughly, a weighted “continuous” average of possible values for X Same interpretation as in discrete case: average of a large number (infinite) of observations on the RV X Variance of X is Var(X) = ∫ (x – E(X))² f(x) dx Standard deviation of X is σ(X) = √Var(X)

    21. Folie 21 Riccardo Gismondi Joint Distributions (1) So far: Looked at only one RV at a time But they can come up in pairs, triples, …, tuples, forming jointly distributed RVs or random vectors Input: (T, P, S) = (type of part, priority, service time) Output: {W1, W2, W3, …} = output process of times in system of exiting parts One central issue is whether the individual RVs are independent of each other or related Will take the special case of a pair of RVs (X1, X2) Extends naturally (but messily) to higher dimensions

    22. Folie 22 Riccardo Gismondi Joint Distributions (2) Joint CDF of (X1, X2) is a function of two variables: F(x1, x2) = P(X1 ≤ x1, X2 ≤ x2) Same definition for discrete and continuous If both RVs are discrete, define the joint PMF p(x1, x2) = P(X1 = x1, X2 = x2) If both RVs are continuous, define the joint PDF f(x1, x2) as a nonnegative function with total volume below it equal to 1, and P((X1, X2) ∈ A) = ∫∫A f(x1, x2) dx1 dx2 Joint CDF (or PMF or PDF) contains a lot of information – usually won’t have in practice

    23. Folie 23 Riccardo Gismondi Marginal Distributions What is the distribution of X1 alone? Of X2 alone? Jointly discrete Marginal PMF of X1 is p1(x1) = Σx2 p(x1, x2) Marginal CDF of X1 is F1(x1) = P(X1 ≤ x1) Jointly continuous Marginal PDF of X1 is f1(x1) = ∫ f(x1, x2) dx2 Marginal CDF of X1 is F1(x1) = ∫–∞^x1 f1(t) dt Everything above is symmetric for X2 instead of X1 Knowledge of joint ⇒ knowledge of marginals – but not vice versa (unless X1 and X2 are independent)

    24. Folie 24 Riccardo Gismondi Covariance Between RVs Measures linear relation between X1 and X2 Covariance between X1 and X2 is Cov(X1, X2) = E[(X1 – μ1)(X2 – μ2)] = E(X1X2) – μ1μ2 If large (resp. small) X1 tends to go with large (resp. small) X2, then covariance > 0 If large (resp. small) X1 tends to go with small (resp. large) X2, then covariance < 0 If there is no tendency for X1 and X2 to occur jointly in agreement or disagreement over being big or small, then Cov = 0 Interpreting value of covariance – difficult since it depends on units of measurement

    25. Folie 25 Riccardo Gismondi Correlation Between RVs Correlation (coefficient) between X1 and X2 is ρ = Cov(X1, X2) / (σ1 σ2) Has same sign as covariance Always between –1 and +1 Numerical value does not depend on units of measurement Dimensionless – universal interpretation

    26. Folie 26 Riccardo Gismondi Independent RVs X1 and X2 are independent if their joint CDF factors into the product of their marginal CDFs: F(x1, x2) = F1(x1) F2(x2) Equivalent to use PMF or PDF instead of CDF Properties of independent RVs: They have nothing (linearly) to do with each other Independence ⇒ uncorrelated But not vice versa, unless the RVs have a joint normal distribution Important in probability – factorization simplifies greatly Tempting just to assume it whether justified or not Independence in simulation Input: Usually assume separate inputs are indep. – valid? Output: Standard statistics assumes indep. – valid?!?

    27. Folie 27 Riccardo Gismondi Random Numbers

    28. Folie 28 Riccardo Gismondi Introduction The building block of a simulation study is the ability to generate random numbers. Random numbers represent the values of a random variable uniformly distributed on (0,1)

    29. Folie 29 Riccardo Gismondi Pseudorandom Number Generation Multiplicative congruential method: xn+1 = a xn mod m Mixed congruential method: xn+1 = (a xn + c) mod m The xn are scaled to pseudorandom numbers on (0,1) via un = xn / m
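
The congruential recursions are easy to prototype. Below is a minimal Python sketch (the course exercises use MATLAB; the function name `lcg` and the parameter defaults – the classic Lewis-Goodman-Miller "minimal standard" choice a = 16807, c = 0, m = 2^31 – 1 – are my own illustration, not taken from the slides):

```python
def lcg(seed, a=16807, c=0, m=2**31 - 1):
    """Mixed congruential generator: x_{n+1} = (a*x_n + c) mod m.
    With c = 0 this reduces to the multiplicative method."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m  # scale the integer state to a number in (0, 1)

gen = lcg(seed=12345)
u = [next(gen) for _ in range(5)]  # five pseudorandom uniforms
```

The quality of the stream depends entirely on the choice of a, c, and m; the defaults above give a full period of m – 1 for the multiplicative case.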

    30. Folie 30 Riccardo Gismondi Computer exercises ( MATLAB ) Let find Let find

    31. Folie 31 Riccardo Gismondi Using random numbers to evaluate Integrals (1) Let g(x) be a function and suppose we want to compute θ = ∫0^1 g(x) dx If U is uniformly distributed over (0,1), then we can express θ as θ = E[g(U)] If U1, …, Uk are independent uniform (0,1) random variables, it follows that the random variables g(U1), …, g(Uk) are i.i.d variables with mean θ

    32. Folie 32 Riccardo Gismondi Using random numbers to evaluate Integrals (2) Therefore, by the strong law of large numbers, it follows that, with probability 1: (g(U1) + … + g(Uk)) / k → E[g(U)] = θ as k → ∞ Thus, we can approximate θ by generating a large number of random numbers ui and taking as our approximation the average value of g(ui). This approach to approximating integrals is called the Monte Carlo method
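
The estimator above can be sketched in a few lines of Python (the course exercises use MATLAB; the helper name `mc_integrate`, the seed, and the test integrand x² are my own assumptions):

```python
import random

def mc_integrate(g, n=100_000, seed=42):
    """Monte Carlo estimate of the integral of g over (0,1):
    average g(U_i) over n i.i.d. uniform(0,1) draws U_i."""
    rng = random.Random(seed)
    return sum(g(rng.random()) for _ in range(n)) / n

# Example: the integral of x^2 over (0,1) is exactly 1/3,
# so the estimate should land close to 0.3333.
estimate = mc_integrate(lambda x: x * x)
```

The standard error shrinks like 1/√n, so a tenfold increase in accuracy costs a hundredfold increase in samples.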

    33. Folie 33 Riccardo Gismondi Computer exercises (MATLAB) Use simulation (Monte Carlo method) to approximate the following integrals : 1. 2.

    34. Folie 34 Riccardo Gismondi Using random numbers to evaluate Integrals (3) If we want to compute: θ = ∫a^b g(x) dx then, by the substitution y = (x – a)/(b – a), θ = ∫0^1 (b – a) g(a + y(b – a)) dy

    35. Folie 35 Riccardo Gismondi Computer exercises (MATLAB) Use simulation (Monte Carlo method) to approximate the following integrals : 1. 2.

    36. Folie 36 Riccardo Gismondi Using random numbers to evaluate Integrals (4) Similarly, if we want to compute: θ = ∫0^∞ g(x) dx then, by the substitution y = 1/(x + 1), dy = –y² dx, θ = ∫0^1 h(y) dy with h(y) = g(1/y – 1) / y²

    37. Folie 37 Riccardo Gismondi Example: the estimation of π Suppose that the random vector (X,Y) is uniformly distributed in the square of area 4 centered in the origin. Let us consider the probability that this random point is contained within the inscribed circle of radius 1: P(X² + Y² ≤ 1) = (area of circle)/(area of square) = π/4 If U is uniform on (0,1) then 2U is uniform on (0,2), and so 2U-1 is uniform on (-1,1)

    38. Folie 38 Riccardo Gismondi Example: the estimation of π Therefore generate random numbers U1, U2 and set X = 2U1 – 1, Y = 2U2 – 1, and define: I = 1 if X² + Y² ≤ 1, I = 0 otherwise Hence we can estimate π/4, and hence π, by the fraction of pairs for which X² + Y² ≤ 1
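
The hit-or-miss scheme above translates directly into Python (a sketch in place of the MATLAB exercises; the function name, seed, and sample size are my own choices):

```python
import random

def estimate_pi(n=200_000, seed=1):
    """Estimate pi as 4 times the fraction of uniform points in the
    square [-1,1]^2 that fall inside the inscribed unit circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n):
        x = 2 * rng.random() - 1  # uniform on (-1, 1)
        y = 2 * rng.random() - 1
        if x * x + y * y <= 1:
            inside += 1
    return 4 * inside / n
```

With n = 200 000 pairs the standard error is roughly 0.004, so the estimate typically agrees with π to two decimal places.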

    39. Folie 39 Riccardo Gismondi Home assignment (Matlab) Write a code to generate Uniform Random Numbers Write a code to generate Uniform Random Vectors Write a code to estimate π and study the convergence

    40. Folie 40 Riccardo Gismondi Elements of Statistics

    41. Folie 41 Riccardo Gismondi Sampling Statistical analysis – estimate or infer something about a population or process based on only a sample from it Think of a RV with a distribution governing the population Random sample is a set of independent and identically distributed (IID) observations X1, X2, …, Xn on this RV In simulation, sampling is making some runs of the model and collecting the output data Don’t know parameters of population (or distribution) and want to estimate them or infer something about them based on the sample

    42. Folie 42 Riccardo Gismondi Sampling (cont’d.) Population parameter Population mean μ = E(X) Population variance σ² Population proportion Parameter – need to know whole population Fixed (but unknown) Sample estimate Sample mean Sample variance Sample proportion Sample statistic – can be computed from a sample Varies from one sample to another – is a RV itself, and has a distribution, called the sampling distribution

    43. Folie 43 Riccardo Gismondi Sampling Distributions Have a statistic, like sample mean or sample variance Its value will vary from one sample to the next Some sampling-distribution results Sample mean X̄: E(X̄) = μ, Var(X̄) = σ²/n If X is normal, so is X̄ Regardless of the distribution of X, X̄ is approximately normal for large n (central limit theorem) Sample variance s²: E(s²) = σ² Sample proportion p̂: E(p̂) = p

    44. Folie 44 Riccardo Gismondi Point Estimation A sample statistic that estimates (in some sense) a population parameter Properties Unbiased: E(estimate) = parameter Efficient: Var(estimate) is lowest among competing point estimators Consistent: Var(estimate) decreases (usually to 0) as the sample size increases

    45. Folie 45 Riccardo Gismondi Confidence Intervals A point estimator is just a single number, with some uncertainty or variability associated with it Confidence interval quantifies the likely imprecision in a point estimator An interval that contains (covers) the unknown population parameter with specified (high) probability 1 – α Called a 100(1 – α)% confidence interval for the parameter Confidence interval for the population mean μ: X̄ ± t(n–1, 1–α/2) s/√n CIs for some other parameters – in text

    46. Folie 46 Riccardo Gismondi Confidence Intervals in Simulation Run simulations, get results View each replication of the simulation as a data point Random input → random output Form a confidence interval Brackets (with probability 1 – α) the “true” expected output (what you’d get by averaging an infinite number of replications)

    47. Folie 47 Riccardo Gismondi Hypothesis Tests Test some assertion about the population or its parameters Can never determine truth or falsity for sure – only get evidence that points one way or another Null hypothesis (H0) – what is to be tested Alternate hypothesis (H1 or HA) – denial of H0 H0: μ = 6 vs. H1: μ ≠ 6 H0: σ < 10 vs. H1: σ ≥ 10 H0: μ1 = μ2 vs. H1: μ1 ≠ μ2 Develop a decision rule to decide on H0 or H1 based on sample data

    48. Folie 48 Riccardo Gismondi Errors in Hypothesis Testing Type I error: reject H0 when H0 is true – probability α Type II error: do not reject H0 when H0 is false – probability β

    49. Folie 49 Riccardo Gismondi p-Values for Hypothesis Tests Traditional method is “Accept” or Reject H0 Alternate method – compute p-value of the test p-value = probability of getting a test result more in favor of H1 than what you got from your sample Small p (like < 0.01) is convincing evidence against H0 Large p (like > 0.20) indicates lack of evidence against H0 Connection to traditional method If p < α, reject H0 If p ≥ α, do not reject H0 p-value quantifies confidence about the decision

    50. Folie 50 Riccardo Gismondi Hypothesis Testing in Simulation Input side Specify input distributions to drive the simulation Collect real-world data on corresponding processes “Fit” a probability distribution to the observed real-world data Test H0: the data are well represented by the fitted distribution Output side Have two or more “competing” designs modeled Test H0: all designs perform the same on output, or test H0: one design is better than another Selection of a “best” model scenario

    51. Folie 51 Riccardo Gismondi Elements of Probability II

    52. Folie 52 Riccardo Gismondi Discrete Random Variables Binomial Random Variables Suppose that n independent trials, each of which results in success (1) with probability p, are to be performed. If X represents the number of successes in the n trials, then X is said to be a Binomial Random Variable with parameters (p, n). Its PMF is given by P(X = i) = C(n, i) pⁱ (1 – p)ⁿ⁻ⁱ, i = 0, 1, …, n A binomial (1, p) random variable is called a Bernoulli random variable
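
Since a binomial variate is just a count of successes in n Bernoulli trials, it can be simulated directly. A Python sketch (the function name, seed, and the example parameters n = 10, p = 0.3 are my own; the exercises in this course use MATLAB):

```python
import random

def binomial_sample(n, p, rng):
    """Draw a binomial(n, p) variate as the number of successes
    in n independent Bernoulli(p) trials."""
    return sum(1 for _ in range(n) if rng.random() < p)

rng = random.Random(7)
draws = [binomial_sample(10, 0.3, rng) for _ in range(50_000)]
# sample mean should be close to E(X) = n*p = 3
```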

    53. Folie 53 Riccardo Gismondi Discrete Random Variables Binomial Random Variables For a Binomial Random Variable with parameters (p, n) expectation and variance are given by E(X) = np, Var(X) = np(1 – p) (Home assignment!) Poisson Random Variables A random variable that takes on one of the values 0, 1, 2, 3, … is said to be a Poisson Random Variable with parameter λ if its PMF is given by P(X = i) = e^(–λ) λⁱ / i!, i = 0, 1, 2, …

    54. Folie 54 Riccardo Gismondi Discrete Random Variables Poisson Random Variables Poisson random variables have a wide range of applications. One reason is that such random variables may be used to approximate the distribution of the number of successes in a large number of trials, when each trial has a small probability of success. To see why, suppose that X is a binomial random variable with parameters (n, p) and let λ = np. Then

    55. Folie 55 Riccardo Gismondi Discrete Random Variables Poisson Random Variables

    56. Folie 56 Riccardo Gismondi Discrete Random Variables Poisson Random Variable Now for large n and small p the binomial (n, p) distribution is approximately Poisson with parameter λ = np Since the mean and variance of a binomial random variable Y are given by E(Y) = np, Var(Y) = np(1 – p) then for a Poisson variable with parameter λ, we get E(X) = λ, Var(X) = λ

    57. Folie 57 Riccardo Gismondi Continuous Random Variables Uniformly Distributed Random Variables A random variable X is said to be Uniformly Distributed over the interval (a,b) if its pdf is given by f(x) = 1/(b – a) for a < x < b, and 0 otherwise For the mean we have: E(X) = ∫a^b x/(b – a) dx = (a + b)/2

    58. Folie 58 Riccardo Gismondi Continuous Random Variables Uniformly Distributed Random Variables For the variance we have: Var(X) = E(X²) – (E(X))² = (b – a)²/12

    59. Folie 59 Riccardo Gismondi Continuous Random Variables Normal Random Variables A random variable X is said to be normally distributed with parameters (μ, σ²) if its pdf is given by f(x) = (1/(σ√(2π))) e^(–(x–μ)²/(2σ²)) It is not difficult to show (Home assignment!) that: E(X) = μ, Var(X) = σ²

    60. Folie 60 Riccardo Gismondi Continuous Random Variables Normal Random Variables An important fact about normal random variables is that if X is normally distributed with parameters (μ, σ²) then aX + b is normally distributed with parameters (aμ + b, a²σ²) (Home assignment!) It follows from this that if X ~ N(μ, σ²), then Z = (X – μ)/σ is normal with mean 0 and variance 1 Such a random variable is said to have standard normal distribution

    61. Folie 61 Riccardo Gismondi Continuous Random Variables Normal Random Variables Let Φ denote the CDF of a N(0,1), that is Φ(x) = (1/√(2π)) ∫–∞^x e^(–t²/2) dt We have the Central Limit Theorem: Let X1, X2, … be a sequence of i.i.d random variables with finite mean μ and finite variance σ²: then P((X1 + … + Xn – nμ)/(σ√n) ≤ x) → Φ(x) as n → ∞

    62. Folie 62 Riccardo Gismondi Continuous Random Variables Exponential Random Variables A random variable X is said to be exponentially distributed with parameter λ if its pdf is given by f(x) = λ e^(–λx), x ≥ 0 Its CDF is given by F(x) = 1 – e^(–λx), x ≥ 0 It is not difficult to show (Home assignment!) that: E(X) = 1/λ, Var(X) = 1/λ²

    63. Folie 63 Riccardo Gismondi Continuous Random Variables Exponential Random Variables The key property of the exponential random variable is the “memoryless property”: P(X > s + t | X > s) = P(X > t) This equation is equivalent to P(X > s + t) = P(X > s) P(X > t) which is clearly satisfied whenever X is an exponential random variable, since e^(–λ(s+t)) = e^(–λs) e^(–λt)

    64. Folie 64 Riccardo Gismondi Generating Random Variables

    65. Folie 65 Riccardo Gismondi Discrete Random Variables The Inverse Transform Method (1) Suppose we want to generate a DRV X having PMF P(X = xj) = pj, j = 0, 1, …, with Σj pj = 1 To accomplish this, we generate a random number U (uniform on (0,1)) and set X = xj if p0 + … + p(j–1) ≤ U < p0 + … + pj

    66. Folie 66 Riccardo Gismondi Discrete Random Variables The Inverse Transform Method (2) Since for 0 ≤ a < b ≤ 1 we have that P(a ≤ U < b) = b – a, it follows that P(X = xj) = P(p0 + … + p(j–1) ≤ U < p0 + … + pj) = pj And so X has the desired distribution
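
The discrete inverse transform can be sketched in Python as follows (the slides' exercises use MATLAB; the function name, the example distribution over {1, 2, 3, 4}, and the seed are my own assumptions):

```python
import random

def discrete_inverse_transform(values, probs, rng):
    """Discrete inverse transform: return the first value whose
    cumulative probability exceeds U ~ uniform(0,1)."""
    u = rng.random()
    cum = 0.0
    for x, p in zip(values, probs):
        cum += p
        if u < cum:
            return x
    return values[-1]  # guard against floating-point round-off

rng = random.Random(0)
draws = [discrete_inverse_transform([1, 2, 3, 4],
                                    [0.20, 0.15, 0.25, 0.40], rng)
         for _ in range(100_000)]
```

Sorting the values by decreasing probability reduces the expected number of comparisons, a standard efficiency tweak.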

    67. Folie 67 Riccardo Gismondi Computer exercises (MATLAB) Simulate a RV X such that Simulate a RV X such that Simulate a RV X such that

    68. Folie 68 Riccardo Gismondi Home assignment (Matlab) Write a code to generate a DRV with the Inverse Transform Method [U,Hist(U)]=f(data,N)

    69. Folie 69 Riccardo Gismondi Continuous Random Variables The Inverse Transform Algorithm (2) Consider a CRV having CDF F. A general method for generating such a random variable is based on the following proposition. Proposition: Let U be a uniform (0,1) RV. For any continuous distribution function F the RV X defined by X = F⁻¹(U) has distribution F Proof: Let FX denote the CDF of X = F⁻¹(U). Then FX(x) = P(X ≤ x) = P(F⁻¹(U) ≤ x)

    70. Folie 70 Riccardo Gismondi Continuous Random Variables The Inverse Transform Algorithm (3) Now since F is a CDF, hence monotone increasing, it follows that F⁻¹(U) ≤ x if and only if U ≤ F(x) Hence, we see that FX(x) = P(U ≤ F(x)) = F(x)

    71. Folie 71 Riccardo Gismondi Continuous Random Variables The Inverse Transform Algorithm (4) 1. Suppose we want to generate a CRV X with a given CDF F: if we let X = F⁻¹(U), then X has distribution F 2. If X is an exponential RV with rate 1, then its CDF is given by F(x) = 1 – e^(–x); setting U = F(X) gives X = –log(1 – U), and since 1 – U is also uniform on (0,1) we may take X = –log U
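
The exponential case is the textbook application of the continuous inverse transform. A Python sketch (function name, rate parameter, and seed are my own; MATLAB is used in the course exercises):

```python
import math
import random

def exponential_inverse(rate, rng):
    """Exponential(rate) via inverse transform: F(x) = 1 - exp(-rate*x),
    so F^{-1}(u) = -log(1 - u)/rate. Using 1 - U (also uniform) keeps
    the argument of log strictly positive."""
    return -math.log(1.0 - rng.random()) / rate

rng = random.Random(5)
draws = [exponential_inverse(2.0, rng) for _ in range(100_000)]
# sample mean should be close to 1/rate = 0.5
```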

    72. Folie 72 Riccardo Gismondi Continuous Random Variables The Box-Muller method for generating Standard Normal Random Variables (1) Let X and Y be independent standard normal RVs and let R and Θ denote the polar coordinates of the vector (X,Y), that is R² = X² + Y², tan Θ = Y/X

    73. Folie 73 Riccardo Gismondi Continuous Random Variables The Box-Muller method for generating Standard Normal Random Variables (2) Since X and Y are independent, their joint PDF is given (why?) by f(x, y) = (1/(2π)) e^(–(x²+y²)/2) To determine the joint density of R² and Θ we make the change of variables: d = x² + y², θ = arctan(y/x)

    74. Folie 74 Riccardo Gismondi Continuous Random Variables The Box-Muller method for generating Standard Normal Random Variables (3) Since the Jacobian of the transformation is equal to 2 (home assignment!) it follows that the joint PDF of R² and Θ is f(d, θ) = (1/2) e^(–d/2) · (1/(2π)), 0 < d < ∞, 0 < θ < 2π However, this is equal to the product of an exponential density having mean 2 and the uniform density on (0, 2π)! It follows that R² and Θ are independent with R² ~ exponential with mean 2 and Θ ~ uniform on (0, 2π)

    75. Folie 75 Riccardo Gismondi Continuous Random Variables The Box-Muller method for generating Standard Normal Random Variables (4) We can now generate a pair of independent standard normal RVs with the following algorithm: Step 1: generate uniform random numbers U1 and U2 Step 2: R² = –2 log U1, Θ = 2π U2 Step 3: X = √(–2 log U1) cos(2π U2), Y = √(–2 log U1) sin(2π U2)
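
The three steps above can be sketched in Python like this (the course uses MATLAB; the function name, seed, and the use of 1 – U1 to avoid log(0) are my own choices):

```python
import math
import random

def box_muller(rng):
    """Box-Muller: R^2 ~ exponential(mean 2), angle ~ uniform(0, 2*pi);
    returns a pair of independent standard normals."""
    u1, u2 = rng.random(), rng.random()
    r = math.sqrt(-2.0 * math.log(1.0 - u1))  # 1-u1 keeps log's argument > 0
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)

rng = random.Random(11)
draws = [z for _ in range(50_000) for z in box_muller(rng)]
```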

    76. Folie 76 Riccardo Gismondi Write a code to generate a standard normal RV with the Box-Muller Method [U,hist(U)]=f(N) Home assignment (Matlab)

    77. Folie 77 Riccardo Gismondi Continuous Random Variables The Polar method for generating Standard Normal Random Variables (1) Unfortunately, the Box-Muller transformation is computationally not very efficient: it requires evaluating the sine and cosine trigonometric functions Idea: indirect computation of the sine and cosine of a random angle.

    78. Folie 78 Riccardo Gismondi Continuous Random Variables The Polar method for generating Standard Normal Random Variables (2) Thus: If U1 and U2 are Uniform over (0,1), then V1 = 2U1 – 1 and V2 = 2U2 – 1 are Uniform over (–1,1); setting S = V1² + V2² we obtain that, conditional on S ≤ 1, the point (V1, V2) is uniformly distributed in the unit disc, so that V1/√S and V2/√S are the cosine and sine of a uniformly distributed random angle

    79. Folie 79 Riccardo Gismondi Continuous Random Variables The Polar method for generating Standard Normal Random Variables (3) X = V1 √(–2 log S / S) and Y = V2 √(–2 log S / S) are independent standard normal when (V1, V2) is a randomly chosen point in the circle of radius 1 centered at the origin Summing up we have the following algorithm to generate a pair of independent standard normals

    80. Folie 80 Riccardo Gismondi Continuous Random Variables The polar method Algorithm STEP 1: Generate random numbers U1 and U2 STEP 2: Set V1 = 2U1 – 1, V2 = 2U2 – 1, S = V1² + V2² STEP 3: if S > 1 go to STEP 1, else return the independent standard normals X = V1 √(–2 log S / S), Y = V2 √(–2 log S / S)
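
The accept/reject loop above (often called the Marsaglia polar method) can be sketched in Python as follows (function name and seed are my own; the exercises use MATLAB):

```python
import math
import random

def polar_method(rng):
    """Polar method: accept (V1, V2) uniform in the unit disc, then
    scale to obtain two independent standard normals without sin/cos."""
    while True:
        v1 = 2.0 * rng.random() - 1.0
        v2 = 2.0 * rng.random() - 1.0
        s = v1 * v1 + v2 * v2
        if 0.0 < s < 1.0:  # reject points outside the disc (and s == 0)
            factor = math.sqrt(-2.0 * math.log(s) / s)
            return v1 * factor, v2 * factor

rng = random.Random(13)
draws = [z for _ in range(50_000) for z in polar_method(rng)]
```

The acceptance probability is π/4 ≈ 0.785, so on average fewer than 1.3 iterations are needed per pair.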

    81. Folie 81 Riccardo Gismondi Write a code to generate a standard normal RV with the Polar Method [U,hist(U)]=f(N) Home assignment (Matlab)

    82. Folie 82 Riccardo Gismondi Continuous Random Variables Generating Multivariate Normal Variables (1) A d-dimensional multivariate normal distribution is characterized by a d-vector μ and a d×d matrix Σ: we abbreviate it as N(μ, Σ). Σ must be symmetric and positive semidefinite, meaning that xᵀΣx ≥ 0 for all x If Σ is positive definite, then the normal distribution has density f(x) = (2π)^(–d/2) |Σ|^(–1/2) exp(–(x – μ)ᵀ Σ⁻¹ (x – μ)/2)

    83. Folie 83 Riccardo Gismondi Continuous Random Variables Generating Multivariate Normal Variables (2) The standard d-dimensional multivariate normal distribution N(0, I), with I the identity matrix, is the special case: If X ~ N(μ, Σ), then its i-th component has distribution N(μi, σi²) with σi² = Σii. The i-th and j-th components have covariance Cov(Xi, Xj) = Σij

    84. Folie 84 Riccardo Gismondi Continuous Random Variables Generating Multivariate Normal Variables (3) Any linear transformation of a normal vector is again normal: if X ~ N(μ, Σ), then AX ~ N(Aμ, AΣAᵀ) for any k×d matrix A, for any k. Using any of the methods described before, we can generate standard normal variables Z1, …, Zd and assemble them into a vector Z ~ N(0, I). Thus the problem of sampling X from the multivariate normal N(μ, Σ) reduces to finding a matrix A for which AAᵀ = Σ, since then X = μ + AZ has the desired distribution

    85. Folie 85 Riccardo Gismondi Continuous Random Variables Generating Multivariate Normal Variables (4) Among all such A, a lower triangular one is particularly convenient because it reduces the calculation of μ + AZ to the following: X1 = μ1 + A11 Z1, X2 = μ2 + A21 Z1 + A22 Z2, …, Xd = μd + Ad1 Z1 + … + Add Zd A full multiplication of the vector Z by the matrix A would require approximately twice as many multiplications and additions.

    86. Folie 86 Riccardo Gismondi Continuous Random Variables Generating Multivariate Normal Variables (5) A representation of Σ as Σ = AAᵀ with A lower triangular is a Cholesky factorization of Σ. Consider a 2×2 covariance matrix represented as Σ = [σ1², ρσ1σ2; ρσ1σ2, σ2²] Assuming σ1, σ2 > 0 and –1 < ρ < 1, the Cholesky factor is A = [σ1, 0; ρσ2, σ2√(1 – ρ²)] (verify!)

    87. Folie 87 Riccardo Gismondi Continuous Random Variables Generating Multivariate Normal Variables (6) Thus we can generate a bivariate normal distribution by setting X1 = μ1 + σ1 Z1, X2 = μ2 + ρσ2 Z1 + σ2√(1 – ρ²) Z2 For a d-dimensional case we need to solve AAᵀ = Σ for a lower triangular A
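
The bivariate recipe with the 2×2 Cholesky factor can be sketched in Python (a sketch in place of the MATLAB assignment; the function name, seed, and the Box-Muller helper used to supply the standard normals are my own choices):

```python
import math
import random

def bivariate_normal(mu1, mu2, s1, s2, rho, rng):
    """Sample (X1, X2) with means mu, std devs s, correlation rho,
    using the 2x2 lower-triangular Cholesky factor of the covariance."""
    # two independent standard normals via Box-Muller
    u1, u2 = rng.random(), rng.random()
    r = math.sqrt(-2.0 * math.log(1.0 - u1))
    z1 = r * math.cos(2.0 * math.pi * u2)
    z2 = r * math.sin(2.0 * math.pi * u2)
    x1 = mu1 + s1 * z1
    x2 = mu2 + s2 * (rho * z1 + math.sqrt(1.0 - rho * rho) * z2)
    return x1, x2

rng = random.Random(17)
pairs = [bivariate_normal(0.0, 0.0, 1.0, 1.0, 0.8, rng)
         for _ in range(50_000)]
# sample covariance should be close to rho * s1 * s2 = 0.8
```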

    88. Folie 88 Riccardo Gismondi Write a code to generate Multivariate Normal RVs Home assignment (Matlab)

    89. Folie 89 Riccardo Gismondi Brownian Motion I

    90. Folie 90 Riccardo Gismondi Brownian Motion One Dimension: a standard one-dimensional Brownian motion on [0, T] is a stochastic process {B(t), 0 ≤ t ≤ T} with the following properties: B(0) = 0 t ↦ B(t) is, with probability 1, a continuous function on [0, T] the increments B(t1) – B(t0), …, B(tk) – B(tk–1) are independent for any 0 ≤ t0 < t1 < … < tk ≤ T and any k B(t) – B(s) ~ N(0, t – s) for any 0 ≤ s < t ≤ T

    91. Folie 91 Riccardo Gismondi Brownian Motion One Dimension: because of its mentioned properties, the following condition is valid: B(t) ~ N(0, t) for 0 ≤ t ≤ T for constants μ and σ > 0, the process X(t) = μt + σB(t) is a Brownian Motion with drift μ and diffusion coefficient σ², if B is a Standard Brownian Motion X(t) ~ N(μt, σ²t) represents a Brownian Motion with drift μ and diffusion coefficient σ²

    92. Folie 92 Riccardo Gismondi Brownian Motion Standard Brownian Motion

    93. Folie 93 Riccardo Gismondi Brownian Motion Brownian Motion

    94. Folie 94 Riccardo Gismondi Brownian Motion One Dimension: Consequently: X(t) = μt + σB(t) solves the stochastic differential equation (SDE) dX(t) = μ dt + σ dB(t) for deterministic, but time-varying μ(t) and σ(t), the SDE reads dX(t) = μ(t) dt + σ(t) dB(t), and through integration X(t) = X(0) + ∫0^t μ(s) ds + ∫0^t σ(s) dB(s) we come to the results.

    95. Folie 95 Riccardo Gismondi Brownian Motion One Dimension: The process X(t) has continuous sample paths and independent increments Each increment X(t) – X(s) is normally distributed with mean ∫s^t μ(u) du and variance ∫s^t σ²(u) du

    96. Folie 96 Riccardo Gismondi Brownian Motion Random Walk Construction: subsequent values B(t1), …, B(tn) for a standard Brownian motion from 0 = t0 < t1 < … < tn and i.i.d. standard normals Z1, …, Zn can be generated as follows: B(ti+1) = B(ti) + √(ti+1 – ti) Zi+1 for i = 0, …, n – 1 with time-dependent coefficients μ(t), σ(t): X(ti+1) = X(ti) + μ(ti)(ti+1 – ti) + σ(ti) √(ti+1 – ti) Zi+1 (Euler approximation)
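
The random walk recursion for the standard case can be sketched in Python (MATLAB is the course language; the function name, seed, and the inline Box-Muller normal generator are my own choices):

```python
import math
import random

def standard_bm_path(T, n, rng):
    """Random-walk construction on an equally spaced grid:
    B(t_{i+1}) = B(t_i) + sqrt(dt) * Z, Z standard normal."""
    dt = T / n
    path = [0.0]  # B(0) = 0
    for _ in range(n):
        u1, u2 = rng.random(), rng.random()
        z = math.sqrt(-2.0 * math.log(1.0 - u1)) * math.cos(2.0 * math.pi * u2)
        path.append(path[-1] + math.sqrt(dt) * z)
    return path

rng = random.Random(19)
terminals = [standard_bm_path(1.0, 50, rng)[-1] for _ in range(20_000)]
# B(1) should be distributed N(0, 1)
```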

    97. Folie 97 Riccardo Gismondi Home assignment (Matlab) Write a code to generate a Standard Brownian Motion Write a code to generate a Brownian Motion

    98. Folie 98 Riccardo Gismondi Brownian Motion Random Walk Construction: for a standard Brownian motion, we know that E[B(ti)] = 0 and for the covariance we get Cov(B(ti), B(tj)) = min(ti, tj) for the covariance matrix of (B(t1), …, B(tn)) we denote: C, with Cij = min(ti, tj)

    99. Folie 99 Riccardo Gismondi Brownian Motion Cholesky Factorization the vector (B(t1), …, B(tn)) has the distribution N(0, C) and we may simulate this as the vector AZ, where Z ~ N(0, I) and A satisfies AAᵀ = C by applying the Cholesky Factor, we get A as a lower triangular matrix of the form Aij = √(tj – tj–1) for j ≤ i, and Aij = 0 for j > i

    100. Folie 100 Riccardo Gismondi Home assignment (Matlab) Write a code to generate a Standard Brownian Motion with Random Walk construction ( Cholesky factor )

    101. Folie 101 Riccardo Gismondi Brownian Motion Brownian Bridge Construction besides generating the vector from left to right, there also exists another method, e.g. generating the last sample step B(tn) first: then generate an intermediate value conditional on the value of B(tn) and proceed with filling in the intermediate values this method is useful for variance reduction and low-discrepancy methods but it does not give us any computational advantage

    102. Folie 102 Riccardo Gismondi Brownian Motion Brownian Bridge Construction suppose 0 < s < u < t consider the problem to generate B(u) conditional on B(s) = x and B(t) = y the unconditional distribution is given by (B(s), B(u), B(t)) ~ N(0, Σ) with Σ = [s, s, s; s, u, u; s, u, t]

    103. Folie 103 Riccardo Gismondi Brownian Motion Brownian Bridge Construction first we permute the entries so that B(u) comes last, and by applying the conditioning formula to find the distribution of B(u) conditional on the values of B(s) and B(t) we get the mean ((t – u) x + (u – s) y) / (t – s)

    104. Folie 104 Riccardo Gismondi Brownian Motion Brownian Bridge Construction and the variance (u – s)(t – u) / (t – s) since s < u < t, the conditional mean is obtained by linearly interpolating between (s, x) and (t, y)

    105. Folie 105 Riccardo Gismondi Brownian Motion Brownian Bridge Construction suppose that Z is standard normal; combining the observations on the mean and the variance we can sample B(u) = ((t – u) B(s) + (u – s) B(t)) / (t – s) + √((u – s)(t – u)/(t – s)) Z and finally fill in all remaining intermediate points recursively

    106. Folie 106 Riccardo Gismondi Brownian Motion Brownian Bridge Construction Brownian Bridge construction of a Brownian path. Conditional on B(s) = x and B(t) = y, the value B(u), s < u < t, is normally distributed with the mean and variance given above
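
The conditional sampling step can be sketched in Python (a sketch in place of the MATLAB assignment; the function name, seed, and the Box-Muller normal generator are my own assumptions):

```python
import math
import random

def bridge_point(s, x, t, y, u, rng):
    """Sample B(u) conditional on B(s) = x and B(t) = y, s < u < t:
    mean is the linear interpolation between (s, x) and (t, y),
    variance is (u - s)(t - u)/(t - s)."""
    mean = ((t - u) * x + (u - s) * y) / (t - s)
    var = (u - s) * (t - u) / (t - s)
    a, b = rng.random(), rng.random()
    z = math.sqrt(-2.0 * math.log(1.0 - a)) * math.cos(2.0 * math.pi * b)
    return mean + math.sqrt(var) * z

rng = random.Random(23)
# midpoint of a bridge pinned at B(0) = 0 and B(1) = 0: N(0, 0.25)
mids = [bridge_point(0.0, 0.0, 1.0, 0.0, 0.5, rng) for _ in range(50_000)]
```

Applying the same step recursively to each sub-interval fills in a whole path from the endpoints inward.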

    107. Folie 107 Riccardo Gismondi Home assignment (Matlab) Write a code to generate a Standard Brownian Motion with Brownian Bridge Construction

    108. Folie 108 Riccardo Gismondi Brownian Motion Brownian Bridge Construction How could the construction be modified for a Brownian motion with drift μ? only sampling the rightmost point would change: sample B(tn) from N(μ tn, tn) instead of from N(0, tn); the conditional distribution of the intermediate values given the endpoints is unchanged

    109. Folie 109 Riccardo Gismondi Brownian Motion Principal Components Construction to visualize the construction of a single Brownian path we can write in vector form: (B(t1), …, B(tn)) = AZ, with A originally derived from C through Cholesky factorization now, we want to derive A through principal components construction, such that A = (√λ1 v1, …, √λn vn), where λ1 ≥ … ≥ λn are the eigenvalues of C and v1, …, vn are the orthonormal eigenvectors

    110. Folie 110 Riccardo Gismondi Brownian Motion Principal Components Construction it can be shown that for an n-step path with equal spacing the eigenvalues and eigenvectors have an explicit form because of C v = λ v we can write for the discrete case Σj min(ti, tj) vj = λ vi and for the continuous limit ∫0^1 min(s, t) ψ(s) ds = λ ψ(t) with the eigenfunction ψ on [0,1]

    111. Folie 111 Riccardo Gismondi Brownian Motion Principal Components Construction the solution to this equation is the Karhunen-Loève expansion of Brownian Motion

    112. Folie 112 Riccardo Gismondi Home assignment (Matlab) Write a code to generate a Standard Brownian Motion with the Principal Components Construction (Karhunen-Loève)
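As a reference for the assignment (which asks for Matlab), here is a NumPy sketch of the truncated Karhunen-Loève expansion of standard Brownian motion on [0, 1], W(t) ≈ Σ_k Z_k · √2 · sin((k − 1/2)πt) / ((k − 1/2)π) with i.i.d. standard normal Z_k. The truncation level n_terms and the evaluation grid are illustrative assumptions.

```python
import numpy as np

def kl_brownian_path(times, n_terms=200, rng=None):
    """Approximate standard BM on [0, 1] by truncating the Karhunen-Loeve
    expansion; the k-th eigenfunction is sqrt(2)*sin((k - 1/2)*pi*t) and the
    k-th eigenvalue is 1/((k - 1/2)*pi)^2."""
    if rng is None:
        rng = np.random.default_rng()
    Z = rng.standard_normal(n_terms)
    k = np.arange(1, n_terms + 1)
    freq = (k - 0.5) * np.pi                    # eigenfrequencies (k - 1/2)*pi
    # basis[i, k] = sqrt(lambda_k) * psi_k(t_i)
    basis = np.sqrt(2.0) * np.sin(np.outer(times, freq)) / freq
    return basis @ Z

times = np.linspace(0.0, 1.0, 33)
W = kl_brownian_path(times, rng=np.random.default_rng(0))
```

Unlike the bridge construction, the terms here are ordered by decreasing variance contribution, which is what makes this expansion attractive for low-discrepancy methods.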

    113. Folie 113 Riccardo Gismondi Geometric Brownian Motion, B&S Model, Pricing of (exotic) options: 1-dim

    114. Folie 114 Riccardo Gismondi Geometric Brownian Motion a stochastic process is a geometric Brownian Motion if is a Brownian motion with initial value in other words: a geometric Brownian motion is simply an exponentiated Brownian motion all methods for simulating Brownian motion become methods for simulating geometric Brownian motion through exponentiation a geometric Brownian motion is always positive, because the exponential function takes only positive values the relative changes over disjoint intervals are independent, rather than the absolute changes

    115. Folie 115 Riccardo Gismondi Geometric Brownian Motion Basic properties suppose is a standard Brownian motion and a Brownian Motion generated from , such that we set and with application of Itô’s formula

    116. Folie 116 Riccardo Gismondi Geometric Brownian Motion Basic properties in contrast, a geometric Brownian motion is specified through the SDE ( B&S Model ) there seems to be an ambiguity in the role of the drift of a geometric Brownian Motion is and implies as can be verified through Itô’s formula we will use the notation

    117. Folie 117 Riccardo Gismondi Geometric Brownian Motion Basic properties if has initial value , then and respectively, if this provides a recursive procedure for simulating values of at

    118. Folie 118 Riccardo Gismondi Home assignment (Matlab) Write a code to generate a Geometric Brownian Motion
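A minimal NumPy sketch of the recursive procedure from the preceding slides (the assignment itself asks for Matlab): the exact log-space recursion S(t_{i+1}) = S(t_i) · exp((μ − σ²/2)Δt + σ√Δt · Z), which involves no discretization error at the grid points. All parameter values are illustrative.

```python
import numpy as np

def gbm_path(S0, mu, sigma, T, n, rng=None):
    """Simulate a GBM at t_i = i*T/n via the exact recursion in log space."""
    if rng is None:
        rng = np.random.default_rng()
    dt = T / n
    Z = rng.standard_normal(n)
    # log-increments of the solution S(t) = S0 * exp((mu - sigma^2/2)t + sigma W(t))
    log_incr = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z
    return S0 * np.exp(np.concatenate(([0.0], np.cumsum(log_incr))))

path = gbm_path(S0=100.0, mu=0.05, sigma=0.2, T=1.0, n=252,
                rng=np.random.default_rng(0))
```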

    119. Folie 119 Riccardo Gismondi Geometric Brownian Motion B&S Model if has initial value , then and respectively, if this provides a recursive procedure for simulating values of at

    120. Folie 120 Riccardo Gismondi Geometric Brownian Motion Lognormal Distribution if , then the marginal distribution of is that of the exponential of a normal random variable, which is called a lognormal distribution , if the random variable has the distribution of the distribution thus is given by and the density

    121. Folie 121 Riccardo Gismondi Geometric Brownian Motion Lognormal Distribution for Expected Value: Variance: if then and

    122. Folie 122 Riccardo Gismondi Geometric Brownian Motion Lognormal Distribution in fact, we have the first equality is the Markov property (follows from the fact that S is a one-to-one transformation of a Brownian motion, itself a Markov process) acts as an average growth rate for for a standard Brownian motion , we have with probability 1 for , we therefore find that with probability 1

    123. Folie 123 Riccardo Gismondi Geometric Brownian Motion Lognormal Distribution if the expression (growth rate) is positive as if it is negative as

    124. Folie 124 Riccardo Gismondi Geometric Brownian Motion Risk-Neutral Dynamics the difficulty within this model lies in finding the correct probability measure for building the expectation and for finding the appropriate discount rate this bears on how the paths of the underlying are to be generated and on how the drift parameter is chosen we assume the existence of a continuous compounding interest rate r for riskless borrowing and lending therefore a dollar investment grows as follows at time t we call the numeraire asset

    125. Folie 125 Riccardo Gismondi Geometric Brownian Motion Risk-Neutral Dynamics suppose the existence of an asset that does not pay dividends then under the risk-neutral measure, the discounted price process is a martingale if is a GBM under the risk-neutral measure, then it must have and further in a risk-neutral world, all assets would have the same average rate of return

    126. Folie 126 Riccardo Gismondi Geometric Brownian Motion Risk-Neutral Dynamics therefore the drift parameter for will equal the risk-free rate should an asset pay dividends, then the martingale property continues to hold, but with replaced by the sum of , any dividends paid and any interest earned from investing these dividends at let be the value of all dividends paid over [0, t] and assume that the asset pays a continuous dividend yield of such that it pays dividends of at time t

    127. Folie 127 Riccardo Gismondi Geometric Brownian Motion Risk-Neutral Dynamics grows at if , then its drift in is now applying the martingale property to the combined process requires that this drift equal therefore, we must have , i.e.

    128. Folie 128 Riccardo Gismondi Geometric Brownian Motion Risk-Neutral Dynamics Example: Futures Contract commits the holder to buying an underlying asset at a fixed price at a fixed date in the future both the buyer and the seller agree to the futures price, specified in the contract, without either party paying anything immediately to the other

    129. Folie 129 Riccardo Gismondi Geometric Brownian Motion Risk-Neutral Dynamics for a futures contract with a zero value at the inception time t entails where is the history of the market prices up to time t at maturity the spot and futures prices must agree, so and we may rewrite the condition as the futures price is a martingale under the risk-neutral measure when modeling a futures price with a GBM, you should therefore set the drift parameter to zero: and finally this reduces to

    130. Folie 130 Riccardo Gismondi Geometric Brownian Motion B&S Model The Black-Scholes model rests on many theoretical assumptions: 1. the short interest rate is known and constant through time; 2. the underlying asset pays no dividends; 3. the option is European; 4. there are no transaction costs; 5. it is possible to borrow any fraction of the price of a security to buy it or to hold it, at the short interest rate; 6. trading can be carried out continuously

    131. Folie 131 Riccardo Gismondi Geometric Brownian Motion B&S Model With the assumption that the underlying follows a GBM model, , and with a no-arbitrage argument, Black and Scholes obtained the amazing formula for the European call option price :

    132. Folie 132 Riccardo Gismondi Home assignment (Matlab) Write a code for pricing a European call/put option simulating as and study the asymptotic convergence to the theoretical B&S price. Describe and plot the convergence of the MC estimator. What is the variance of the estimator? When will it slow down? How many simulations do you need to be “sure”? Write a report about your research.
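A NumPy sketch of this assignment (the course uses Matlab): a Monte Carlo estimator for the European call together with its standard error, and the closed-form Black-Scholes price for comparison. By the central limit theorem the estimation error shrinks like the payoff standard deviation divided by √n, which is what the requested convergence plot should show. All numerical parameters are illustrative.

```python
import numpy as np
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S0, K, r, sigma, T):
    """Closed-form Black-Scholes price of a European call."""
    d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S0 * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def mc_call(S0, K, r, sigma, T, n_paths, rng=None):
    """Monte Carlo price of the same call from exact terminal samples of S(T);
    returns the estimator and its standard error (payoff std / sqrt(n))."""
    if rng is None:
        rng = np.random.default_rng()
    Z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)
    return payoff.mean(), payoff.std(ddof=1) / np.sqrt(n_paths)

price, se = mc_call(100.0, 100.0, 0.05, 0.2, 1.0, 200_000,
                    rng=np.random.default_rng(0))
exact = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)
```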

    133. Folie 133 Riccardo Gismondi Geometric Brownian Motion Path-Dependent Options the reason for simulating GBM paths is that they can be used for pricing options (e.g. not only vanillas), particularly those whose payoff depends on the path of an underlying asset and not simply its value the price of an option may be represented as an expected discounted payoff, so we simulate by generating paths of the underlying asset, evaluating the discounted payoff on each path and averaging over all paths

    134. Folie 134 Riccardo Gismondi Geometric Brownian Motion Path-Dependent Payoffs an option price could in principle depend on the complete path over an interval e.g. Asian options – discrete monitoring: an Asian option is an option on a time average of the underlying asset. Payoff Asian Call: Payoff Asian Put: with constant strike price and

    135. Folie 135 Riccardo Gismondi Geometric Brownian Motion Path-Dependent Payoffs Asian options – continuous monitoring: just replace the discrete average with a continuous average over an integral

    136. Folie 136 Riccardo Gismondi Geometric Brownian Motion Path-Dependent Payoffs Geometric average option: replacing the arithmetic average with such options are useful as test cases for computational procedures they are mathematically convenient to work with because of the geometric average we find (with replaced by ), that

    137. Folie 137 Riccardo Gismondi Home assignment (Matlab) Write a code to price an Asian Option with arithmetic/geometric mean and discrete monitoring:
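A NumPy sketch of this assignment (Matlab in the course): a discretely monitored Asian call priced by averaging the underlying at the monitoring dates t_j = jT/n, with either the arithmetic or the geometric mean. Strike, rate, volatility and monitoring frequency are illustrative assumptions.

```python
import numpy as np

def asian_call_mc(S0, K, r, sigma, T, n_obs, n_paths, kind="arithmetic", rng=None):
    """Discretely monitored Asian call under risk-neutral GBM dynamics:
    average S at t_j = j*T/n_obs, payoff (avg - K)+, discounted at r."""
    if rng is None:
        rng = np.random.default_rng()
    dt = T / n_obs
    Z = rng.standard_normal((n_paths, n_obs))
    # exact log-space path of the GBM at the monitoring dates
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z,
                          axis=1)
    S = S0 * np.exp(log_paths)
    if kind == "arithmetic":
        avg = S.mean(axis=1)
    else:                                        # geometric average
        avg = np.exp(np.log(S).mean(axis=1))
    return np.exp(-r * T) * np.maximum(avg - K, 0.0).mean()

p_arith = asian_call_mc(100, 100, 0.05, 0.2, 1.0, 12, 100_000,
                        "arithmetic", np.random.default_rng(0))
p_geo = asian_call_mc(100, 100, 0.05, 0.2, 1.0, 12, 100_000,
                      "geometric", np.random.default_rng(0))
```

By the AM-GM inequality the geometric average lies below the arithmetic average on every path, so (with the same seed, hence the same paths) the geometric price is strictly smaller.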

    138. Folie 138 Riccardo Gismondi Geometric Brownian Motion Path-Dependent Payoffs Barrier Options: the option gets “knocked out”, if the underlying asset crosses a prespecified level. e.g. a “down-and-out call” with barrier strike and expiration has the payoff where is the first time that the price of the underlying drops below and represents the indicator function of the event in the braces

    139. Folie 139 Riccardo Gismondi Geometric Brownian Motion Path-Dependent Payoffs Barrier Options: respectively, a “down-and-in call” has the payoff and gets “knocked-in” only when the underlying asset crosses the barrier. Up-and-out and up-and-in calls and puts can be derived analogously!

    140. Folie 140 Riccardo Gismondi Home assignment (Matlab) Write a code to price a Barrier Option (down-and-in, down-and-out) with discrete monitoring:
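A NumPy sketch of this assignment (the course asks for Matlab): both down-and-out and down-and-in calls estimated on the same simulated paths, with the barrier checked only at the discrete monitoring dates. A useful sanity check is in-out parity: on the same paths the two prices sum exactly to the vanilla call estimate. Barrier level, strike and monitoring frequency are illustrative assumptions.

```python
import numpy as np

def barrier_calls_mc(S0, K, B, r, sigma, T, n_obs, n_paths, rng=None):
    """Discretely monitored down-and-out / down-and-in calls (barrier B < S0);
    the barrier is checked only at t_j = j*T/n_obs."""
    if rng is None:
        rng = np.random.default_rng()
    dt = T / n_obs
    Z = rng.standard_normal((n_paths, n_obs))
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                              + sigma * np.sqrt(dt) * Z, axis=1))
    crossed = (S.min(axis=1) <= B)               # did the path reach the barrier?
    vanilla = np.maximum(S[:, -1] - K, 0.0)
    disc = np.exp(-r * T)
    out_price = disc * (vanilla * ~crossed).mean()   # pays only if never crossed
    in_price = disc * (vanilla * crossed).mean()     # pays only if crossed
    return out_price, in_price

out_p, in_p = barrier_calls_mc(100, 100, 90, 0.05, 0.2, 1.0, 50, 100_000,
                               rng=np.random.default_rng(0))
```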

    141. Folie 141 Riccardo Gismondi Geometric Brownian Motion Path-Dependent Payoffs Lookback options: like barrier options, lookback options depend on extremal values of the underlying asset price. Such puts and calls expiring at have payoffs: e.g. a lookback call can be viewed as the profit from buying at the lowest price over and selling at the final price

    142. Folie 142 Riccardo Gismondi Geometric Brownian Motion Incorporating a Term Structure until now, we assumed that the risk-free rate is constant therefore the price of a (riskless) zero-coupon bond maturing at can be calculated as follows: now we assume that on the markets we observe bond prices that are incompatible with the above-mentioned pricing formula therefore, to use this term structure in option pricing, we have to introduce a deterministic, but time-varying, risk-free rate

    143. Folie 143 Riccardo Gismondi Geometric Brownian Motion Incorporating a Term Structure then we get with this time-varying risk-free rate , the dynamics of an asset price under the risk-neutral measure can be described by the SDE (stochastic differential equation) with solution and this process can be simulated over

    144. Folie 144 Riccardo Gismondi Home assignment (Matlab) Write a code to price an Asian Option with discrete monitoring and a time-varying interest rate: e.g., you can take:

    145. Folie 145 Riccardo Gismondi Geometric Brownian Motion Incorporating a Term Structure if we observe bond prices we can calculate the term structure as follows and we can simulate using

    146. Folie 146 Riccardo Gismondi Geometric Brownian Motion Simulating Off a Forward Curve now we assume not just the observation of but also a series of forward prices for the asset under the risk-neutral measure, this implies and we can simulate

    147. Folie 147 Riccardo Gismondi Geometric Brownian Motion Deterministic Volatility Functions on the markets, it has been widely observed that option prices are incompatible with a GBM model for the underlying asset applying the Black-Scholes model to option prices would assume using the same volatility for all traded options with different strikes and different maturities on the same underlying asset. in practice, the observed volatility of these traded options varies with strike and maturity

    148. Folie 148 Riccardo Gismondi Geometric Brownian Motion Deterministic Volatility Functions Black-Scholes has to be modified to find the real market prices! consequently, we let our volatility depend on and , such that it looks like the following this leads to the model assuming a deterministic volatility function, an option could be hedged through a position in the underlying asset this would not be the case in a stochastic volatility model

    149. Folie 149 Riccardo Gismondi Geometric Brownian Motion Deterministic Volatility Functions to get a numerical optimization problem has to be solved. in general, there is no exact simulation procedure for these models and it is necessary to use an Euler scheme of the form or the Euler scheme for

    150. Folie 150 Riccardo Gismondi Home assignment (Matlab) Write a code to price an Asian Option with arithmetic/geometric mean, discrete monitoring and time-varying volatility:

    151. Folie 151 Riccardo Gismondi Brownian Motion II

    152. Folie 152 Riccardo Gismondi Brownian Motion Multiple Dimensions a process , is called a standard Brownian Motion on , if it has continuous sample paths, independent increments and for all , where is the identity matrix it follows that each process is a standard one-dimensional Brownian motion and that and are independent for

    153. Folie 153 Riccardo Gismondi Brownian Motion Multiple Dimensions suppose: is a vector in and is a matrix, positive definite or semidefinite is Brownian Motion with drift and covariance …. whereby this process solves the SDE (stochastic differential equation) for deterministic, but time-varying and : … whereby

    154. Folie 154 Riccardo Gismondi Brownian Motion Multiple Dimensions this process has continuous sample paths, independent increments and

    155. Folie 155 Riccardo Gismondi Brownian Motion Random Walk Construction let be independent random vectors in we can construct a standard d-dimensional Brownian motion at times by setting and to simulate we first have to find a matrix for which set and

    156. Folie 156 Riccardo Gismondi Brownian Motion Random Walk Construction thus, simulation of is straightforward once has been factored for the case of time-dependent coefficients, we may set with

    157. Folie 157 Riccardo Gismondi Home assignment (Matlab) Write a code to generate a d-dimensional Standard Brownian Motion Write a code to generate a d-dimensional Brownian Motion
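A NumPy sketch of both parts of this assignment (Matlab in the course): the random-walk construction of a d-dimensional Brownian motion with drift vector μ and covariance matrix Σ, using a Cholesky factor A with A·Aᵀ = Σ as on the preceding slides (μ = 0, Σ = I gives the standard case). The example dimension, drift and covariance are illustrative assumptions.

```python
import numpy as np

def multivariate_bm(mu, Sigma, T, n, rng=None):
    """Random-walk construction of a d-dim BM with drift mu and covariance
    Sigma: X(t_{i+1}) = X(t_i) + mu*dt + sqrt(dt) * A Z, where A A^T = Sigma."""
    if rng is None:
        rng = np.random.default_rng()
    mu = np.asarray(mu, float)
    d = len(mu)
    A = np.linalg.cholesky(np.asarray(Sigma, float))   # lower-triangular factor
    dt = T / n
    Z = rng.standard_normal((n, d))
    incr = mu * dt + np.sqrt(dt) * Z @ A.T
    return np.vstack([np.zeros(d), np.cumsum(incr, axis=0)])

Sigma = [[1.0, 0.5], [0.5, 1.0]]
X = multivariate_bm([0.0, 0.0], Sigma, T=1.0, n=100,
                    rng=np.random.default_rng(0))
```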

    158. Folie 158 Riccardo Gismondi Brownian Motion Brownian Bridge Construction we can apply independent one-dimensional constructions also for the multi-dimensional case to include a drift vector, it suffices to add to at the first step of the construction of the ith coordinate let be a standard k-dimensional Brownian motion and a matrix, with , satisfying this results into as and are recovered through a linear transformation

    159. Folie 159 Riccardo Gismondi Brownian Motion Principal Components Construction by application of the principal components construction to the multidimensional case the covariance matrix can be represented as the eigenvectors of this matrix can be written as where are the eigenvectors from are the eigenvectors from and the eigenvalues where are the eigenvalues from are the eigenvalues from

    160. Folie 160 Riccardo Gismondi Brownian Motion Principal Components Construction ranking the products of the eigenvalues then for any the variability of the first k factors is always smaller for a d-dimensional Brownian motion than for a scalar Brownian motion over the same time points since the d-dimensional process has greater total variability

    161. Folie 161 Riccardo Gismondi Geometric Brownian Motion, Pricing of (exotic) options : n-dim

    162. Folie 162 Riccardo Gismondi Geometric Brownian Motion Multiple Dimensions a multidimensional geometric Brownian motion can be written by the SSDE (system of stochastic differential equations) where each is a standard one-dimensional Brownian motion and and have the correlation

    163. Folie 163 Riccardo Gismondi Geometric Brownian Motion Multiple Dimensions by setting (variance-covariance-matrix) then and with the actual drift vector is given by and the covariances are given by with

    164. Folie 164 Riccardo Gismondi Geometric Brownian Motion Multiple Dimensions with Cholesky factorization from we get a matrix such that and we can formulate this leads to an algorithm for simulating at the times

    165. Folie 165 Riccardo Gismondi Home assignment (Matlab) Write a code to generate a d-dimensional Geometric Brownian Motion
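The algorithm just described can be sketched in NumPy (the assignment asks for Matlab): each asset follows the exact GBM log-recursion, and the Brownian increments are correlated through a Cholesky factor L of the correlation matrix, L·Lᵀ = corr. The two-asset parameter values are illustrative assumptions.

```python
import numpy as np

def multi_gbm(S0, mu, sigma, corr, T, n, rng=None):
    """Correlated GBMs: dS_i = mu_i S_i dt + sigma_i S_i dW_i with
    corr(dW_i, dW_j) = corr[i, j]; exact log-space recursion."""
    if rng is None:
        rng = np.random.default_rng()
    S0, mu, sigma = map(lambda a: np.asarray(a, float), (S0, mu, sigma))
    L = np.linalg.cholesky(np.asarray(corr, float))
    dt = T / n
    Z = rng.standard_normal((n, len(S0))) @ L.T        # correlated normals
    log_incr = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z
    logS = np.log(S0) + np.vstack([np.zeros(len(S0)),
                                   np.cumsum(log_incr, axis=0)])
    return np.exp(logS)

paths = multi_gbm([100, 50], [0.05, 0.03], [0.2, 0.3],
                  [[1.0, 0.7], [0.7, 1.0]], T=1.0, n=252,
                  rng=np.random.default_rng(0))
```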

    166. Folie 166 Riccardo Gismondi Geometric Brownian Motion Multiple Dimensions options depending on multiple assets e.g. Spread Option: a call option on the spread between two assets and has the payoff Basket Option: an option on a portfolio of underlying assets and has the payoff

    167. Folie 167 Riccardo Gismondi Home assignment (Matlab) Write a code to price a Basket Option with discrete monitoring:
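A simplified NumPy sketch for this assignment (Matlab in the course): a European call on a weighted basket Σ_i w_i S_i(T), with each asset a risk-neutral GBM and correlation handled via a Cholesky factor. For brevity this version samples the terminal values in one exact step; the multi-step paths needed for discrete monitoring follow the multi-asset construction above. Weights, volatilities and correlation are illustrative assumptions.

```python
import numpy as np

def basket_call_mc(S0, w, sigma, corr, K, r, T, n_paths, rng=None):
    """European call on the basket sum_i w_i S_i(T) under correlated
    risk-neutral GBMs, sampled exactly at maturity."""
    if rng is None:
        rng = np.random.default_rng()
    S0, w, sigma = map(lambda a: np.asarray(a, float), (S0, w, sigma))
    L = np.linalg.cholesky(np.asarray(corr, float))
    Z = rng.standard_normal((n_paths, len(S0))) @ L.T  # correlated normals
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    basket = ST @ w
    return np.exp(-r * T) * np.maximum(basket - K, 0.0).mean()

price = basket_call_mc(S0=[100, 100], w=[0.5, 0.5], sigma=[0.2, 0.3],
                       corr=[[1.0, 0.3], [0.3, 1.0]], K=100, r=0.05, T=1.0,
                       n_paths=200_000, rng=np.random.default_rng(0))
```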

    168. Folie 168 Riccardo Gismondi Geometric Brownian Motion Multiple Dimensions Outperformance Option options on the maximum or minimum of multiple assets; e.g. the payoff can be Barrier Options e.g. a two-asset barrier – here a down-and-in put – with payoff this is a down-and-in put on that knocks in when drops below the barrier

    169. Folie 169 Riccardo Gismondi Home assignment (Matlab) Write a code to price an Outperformance Option with discrete monitoring:

    170. Folie 170 Riccardo Gismondi Geometric Brownian Motion Multiple Dimensions Quantos options that are sensitive to a stock price and an exchange rate (e.g. different currencies for the option and the stock) with as the stock price and as the exchange rate (expressed as the quantity of domestic currency required per unit of the foreign currency) the payoff of the option in the domestic currency is given by

    171. Folie 171 Riccardo Gismondi Geometric Brownian Motion Multiple Dimensions Quantos (cont) respectively, the payoff corresponds to Quanto, whereby the strike is fixed in the domestic currency and the payoff of the option is made in foreign currency

    172. Folie 172 Riccardo Gismondi Home assignment (Matlab) Write a code to price a Quanto Option with discrete monitoring:

    173. Folie 173 Riccardo Gismondi Square root diffusion process: CIR Model

    174. Folie 174 Riccardo Gismondi Square-Root Diffusions Basic properties Consider a class of processes that includes the square-root diffusion with W a standard one-dimensional Brownian motion. We consider the case in which: If then for all If then remains strictly positive for all All of the coefficients could in principle be time-dependent, i.e. , with

    175. Folie 175 Riccardo Gismondi Square-Root Diffusions Applications CIR Model Heston Model

    176. Folie 176 Riccardo Gismondi Square-Root Diffusions Applications CIR Model The square-root diffusion process can be used as a model of the short rate. This model was proposed by Cox, Ingersoll, and Ross (CIR, 1985) to illustrate the workings of a general equilibrium model.

    177. Folie 177 Riccardo Gismondi Square-Root Diffusions Applications Heston Model Heston proposed a stochastic volatility model in which the price of an asset is governed by the squared volatility follows a square-root diffusion and is a two-dimensional Brownian motion with

    178. Folie 178 Riccardo Gismondi Square-Root Diffusions Simulating with Euler Method Discretization at times by setting: with independent random variables Notice: we have taken the positive part of inside the square root. Some modification of this form is necessary because the values of produced by the Euler discretization may become negative.
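The Euler scheme just described can be sketched in NumPy for the square-root diffusion dX = a(b − X)dt + σ√X dW, taking the positive part of X inside the square root exactly as the slide notes. The parameter values (a, b, σ, step count) are illustrative assumptions.

```python
import numpy as np

def cir_euler(x0, a, b, sigma, T, n, n_paths, rng=None):
    """Euler discretization of dX = a(b - X)dt + sigma*sqrt(X)dW; the
    positive part X^+ keeps the square root defined even if the discretized
    process dips below zero."""
    if rng is None:
        rng = np.random.default_rng()
    dt = T / n
    X = np.full(n_paths, float(x0))
    for _ in range(n):
        Z = rng.standard_normal(n_paths)
        Xp = np.maximum(X, 0.0)                  # positive part inside sqrt
        X = X + a * (b - Xp) * dt + sigma * np.sqrt(Xp * dt) * Z
    return X

XT = cir_euler(x0=0.04, a=2.0, b=0.04, sigma=0.3, T=1.0, n=100,
               n_paths=50_000, rng=np.random.default_rng(0))
```

Starting the process at its long-run level b, the sample mean at T should stay close to b, up to the discretization bias the slide warns about.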

    179. Folie 179 Riccardo Gismondi Square-Root Diffusions Simulating with Transition Density Method A noncentral chi-square random variable with degrees of freedom and noncentrality parameter has distribution for The transition law of can be expressed as where

    180. Folie 180 Riccardo Gismondi Square-Root Diffusions Simulating with Transition Density Method This says that, given , is distributed as times a noncentral chi-square random variable with degrees of freedom and noncentrality parameter Equivalently

    181. Folie 181 Riccardo Gismondi Square-Root Diffusions Simulating with Transition Density Method Chi-Square If is a positive integer and are independent random variables, then the distribution of is called the chi-square distribution with degrees of freedom. The chi-square distribution is given by where denotes the gamma function and for

    182. Folie 182 Riccardo Gismondi Square-Root Diffusions Simulating with Transition Density Method Noncentral Chi-Square For integer and constants the distribution of is noncentral chi-square with degrees of freedom and noncentrality parameter if then Consequently

    183. Folie 183 Riccardo Gismondi Square-Root Diffusions Simulating with Transition Density Method If N is a Poisson random variable with mean then Conditional on N, the random variable has the ordinary chi-square distribution with degrees of freedom: The unconditional distribution is thus given by

    184. Folie 184 Riccardo Gismondi Square-Root Diffusions Simulating with Transition Density Method Simulation of square-root diffusion on time grid with by sampling from the transition density

    185. Folie 185 Riccardo Gismondi Square-Root Diffusions Simulating with Transition Density Method Simulation of square-root diffusion on time grid with by sampling from the transition density
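The transition-density method can be sketched with NumPy's noncentral chi-square sampler. For dX = a(b − X)dt + σ√X dW, the standard transition law (as in the noncentral chi-square slides above) is X(t+Δ) | X(t) = c · χ'²(df, λ) with scale c = σ²(1 − e^{−aΔ})/(4a), degrees of freedom df = 4ab/σ² and noncentrality λ = X(t)e^{−aΔ}/c; being exact, the method allows arbitrarily large time steps. The parameter values are illustrative assumptions.

```python
import numpy as np

def cir_exact_step(x, a, b, sigma, dt, rng):
    """One exact transition of the square-root diffusion
    dX = a(b - X)dt + sigma*sqrt(X)dW via the noncentral chi-square law."""
    c = sigma**2 * (1.0 - np.exp(-a * dt)) / (4.0 * a)   # scale factor
    df = 4.0 * a * b / sigma**2                          # degrees of freedom
    nc = x * np.exp(-a * dt) / c                         # noncentrality
    return c * rng.noncentral_chisquare(df, nc)

rng = np.random.default_rng(0)
X = np.full(50_000, 0.04)
for _ in range(4):                     # simulate to T = 1 in 4 exact steps
    X = cir_exact_step(X, a=2.0, b=0.04, sigma=0.3, dt=0.25, rng=rng)
```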

    186. Folie 186 Riccardo Gismondi Square-Root Diffusions Simulating with Transition Density Method Comparison of exact distribution (solid) and one-step Euler approximation (dashed) for a square-root diffusion:

    187. Folie 187 Riccardo Gismondi Square-Root Diffusions Sampling Gamma and Poisson Gamma Distribution The gamma distribution with shape parameter and scale parameter has the density with mean and variance The chi-square distribution is a special case of the gamma distribution with scale parameter

    188. Folie 188 Riccardo Gismondi Square-Root Diffusions Sampling Gamma and Poisson Gamma Distribution Algorithms GKM1 for sampling from the gamma distribution with parameters

    189. Folie 189 Riccardo Gismondi Square-Root Diffusions Sampling Gamma and Poisson Gamma Distribution Algorithms (Ahrens-Dieter) for sampling from the gamma distribution with parameters

    190. Folie 190 Riccardo Gismondi Square-Root Diffusions Sampling Gamma and Poisson Poisson Distribution The Poisson distribution with mean is given by We write Poisson is the distribution of the number of events in when the times between consecutive events are independent and exponentially distributed with mean

    191. Folie 191 Riccardo Gismondi Square-Root Diffusions Sampling Gamma and Poisson Poisson Distribution Inverse transform method for sampling from the Poisson distribution with mean

    192. Folie 192 Riccardo Gismondi Heston Model and stochastic Volatility

    193. Folie 193 Riccardo Gismondi The Heston Model Is a stochastic volatility model in which the price of an asset is governed by the squared volatility follows a square-root diffusion and is a two-dimensional Brownian motion with

    194. Folie 194 Riccardo Gismondi The Heston Model Equations description The first equation gives the dynamics of the stock price The parameters represent the stock price at time , the risk-neutral drift and the volatility

    195. Folie 195 Riccardo Gismondi The Heston Model Equations description The second equation gives the evolution of the variance which follows the square-root process The parameters represent the long vol, or long-run average price volatility (as tends to infinity, the expected value of tends to it); the mean reversion parameter, the rate at which reverts to ; and the vol of vol, or volatility of the volatility, which determines the variance of

    196. Folie 196 Riccardo Gismondi The Heston Model Simulating 1. Simulate (e.g. with random walk construction) by setting with independent standard normal random vectors

    197. Folie 197 Riccardo Gismondi The Heston Model Simulating 2. Simulate ( e.g. with Euler discretization ) at times by setting 3. Simulate at times by setting
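The three simulation steps above can be sketched in NumPy: correlated normals for the two Brownian motions, an Euler step for the variance (with the positive part inside the square roots, as in the square-root-diffusion section), and an exponential Euler step for the price. All Heston parameter values in the example are illustrative assumptions.

```python
import numpy as np

def heston_euler(S0, V0, r, kappa, theta, xi, rho, T, n, n_paths, rng=None):
    """Euler scheme for the Heston model:
      dS = r S dt + sqrt(V) S dW1,
      dV = kappa (theta - V) dt + xi sqrt(V) dW2,  corr(dW1, dW2) = rho.
    Positive parts V^+ keep the square roots defined."""
    if rng is None:
        rng = np.random.default_rng()
    dt = T / n
    S = np.full(n_paths, float(S0))
    V = np.full(n_paths, float(V0))
    for _ in range(n):
        Z1 = rng.standard_normal(n_paths)
        # correlate the second driver with the first
        Z2 = rho * Z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        Vp = np.maximum(V, 0.0)
        S = S * np.exp((r - 0.5 * Vp) * dt + np.sqrt(Vp * dt) * Z1)
        V = V + kappa * (theta - Vp) * dt + xi * np.sqrt(Vp * dt) * Z2
    return S, V

S_T, V_T = heston_euler(S0=100.0, V0=0.04, r=0.05, kappa=2.0, theta=0.04,
                        xi=0.3, rho=-0.7, T=1.0, n=200, n_paths=50_000,
                        rng=np.random.default_rng(0))
```

As a sanity check, the discounted price remains a martingale under this scheme (E[e^{−rT} S(T)] = S0), and with V0 = θ the variance should stay near its long-run level.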

    198. Folie 198 Riccardo Gismondi The Heston Model Option pricing Call Option The time price of a European call option with time to maturity denoted is given by Put Option The price of a European put option at time is obtained through put-call parity and is given by or

    199. Folie 199 Riccardo Gismondi The Heston Model Option pricing are given by is given by

    200. Folie 200 Riccardo Gismondi The Heston Model Option pricing with

    201. Folie 201 Riccardo Gismondi The Heston Model Option pricing and solutions where

    202. Folie 202 Riccardo Gismondi The Heston Model Option pricing The parameters represent the following: the spot price of the asset, the strike price of the option, the interest rate, the dividend yield, a discount factor from time to time ; and are the probabilities that the call option expires in-the-money.

    203. Folie 203 Riccardo Gismondi The Heston Model Exact Simulation The stock price at time given the values of and for can be written as: where the variance at time is given by the equation:

    204. Folie 204 Riccardo Gismondi The Heston Model Exact Simulation (Algorithm) Input: Output: with 1. Generate a sample from the distribution of given 2. Generate a sample from the distribution of given and 3. Recover given and 4. Generate a sample from the distribution of given and

    205. Folie 205 Riccardo Gismondi Pricing of Exotic Options: Structured Products (Equity)
