
1 SDE & Statistics MiniCourse. Topics Roadmap: contents, goals, tools. Università di Verona, May 6th, 2015. Michele Bonollo (michele.bonollo@imtlucca.it)

2 The general goals
What does "application" exactly mean? It does not mean "a new mathematics for the applications", but mathematics in the applications.
Application is different from practical implementation. Practical implementation implies algorithms and software development.
The applied process must deal with daily, weekly, or continuous runs.
The software execution requires data, and the data could be missing or of low quality. Hence the practical implementation must be robust (resilient) with respect to the data, and we need to manage the model sensitivity.
Moreover, the software must be general/flexible. Along with data, we have the parameters. Example: the number of simulations is a parameter.
Algorithms imply computational complexity/effort. They must perform well in the accuracy/cost tradeoff.

3 Agenda
SDE Topics
SDE and closed form solutions: Black and Scholes model review
SDE approximations: from the Euler scheme to some higher order techniques
The stochastic processes simulation
Discrete state and discrete time processes
Some useful Brownian motion functionals: first hitting time, occupation time
Application 1: Exotic derivative evaluation
Application 2: Full evaluation VaR versus Delta-Gamma-Vega VaR
Application 3: The Compound Poisson process and the Operational VaR estimation
STAT Topics
Parameter estimation and calibration review
Volatility surface: tricks and practical problems
Principal Component Analysis: applications to the term structure and to the volatility surface

4 SDE and closed form solutions: Black and Scholes model review
The Model
The model is dX_t = r X_t dt + σ X_t dB_t. The dynamics may be assigned in differential or explicit form. Very often in finance the drift of X depends in some way on the current level X. Some other very popular diffusions: from Vasicek to CIR to Hull-White to Heston.
The Problem/Goal
How to show (Itô) that the lognormal diffusion is the solution. Properties of the solution. The practical management (hedging) of a (small) book of options.
The Tools
Evaluation, Delta/Gamma/Vega hedging. Delta equivalent & Delta cash. The volatility estimation update. The market (implied) volatility. A market tool.
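A minimal sketch (not part of the original slides), assuming Python and illustrative parameter values, of the closed form the slide refers to: the Black and Scholes call price and its Delta, obtained from the lognormal solution of dX_t = r X_t dt + σ X_t dB_t.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S0, K, T, r, sigma):
    """Black-Scholes price of a European call under dX_t = r X_t dt + sigma X_t dB_t."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def bs_delta(S0, K, T, r, sigma):
    """Call Delta, the first-order hedge ratio: N(d1)."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return norm_cdf(d1)

# Illustrative at-the-money one-year call
print(bs_call(100.0, 100.0, 1.0, 0.02, 0.20), bs_delta(100.0, 100.0, 1.0, 0.02, 0.20))
```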

5 SDE approximations: Euler scheme and higher order techniques
The Model
The model is dX_t = a(X_t, θ) dt + b(X_t, θ) dB_t. We are not always able to get the explicit closed form solution.
The Problem/Goal
Continuous time is an elegant and useful set up. But in real time, when we feed data and set up some deals, we have discrete (possibly high frequency) sampling. Hence one needs some tricks and schemes to model or to simulate the process over time.
The Tools
The Euler scheme ΔX_t = a(X_t, θ) Δt + b(X_t, θ) ΔB_t is the simplest one. Which properties hold on average and for each sample path w.r.t. the general diffusion? The Milstein scheme as a simple improvement that relies on the Itô formula.
Application
Numerical comparison in the B&S model between the «exact» simulation and the discrete scheme.
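A possible numerical comparison along the lines of the application above, as a hedged sketch in Python/NumPy with illustrative parameters: the Euler scheme and the «exact» lognormal solution are driven by the same Brownian increments, so the gap between them isolates the discretization error.

```python
import numpy as np

def simulate_gbm_paths(S0, r, sigma, T, n_steps, n_paths, seed=0):
    """Simulate dX_t = r X_t dt + sigma X_t dB_t with the Euler scheme and,
    on the same Brownian increments, with the exact lognormal solution."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    euler = np.full(n_paths, S0, dtype=float)
    for k in range(n_steps):
        euler += r * euler * dt + sigma * euler * dB[:, k]          # Euler step
    exact = S0 * np.exp((r - 0.5 * sigma ** 2) * T + sigma * dB.sum(axis=1))  # closed form at T
    return euler, exact

euler, exact = simulate_gbm_paths(100.0, 0.02, 0.20, 1.0, 50, 100_000)
print("mean absolute terminal error:", np.mean(np.abs(euler - exact)))
```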

6 The stochastic processes simulation
The Model
The general Monte Carlo (MC) simulation approach is very popular. Some problems arise from the dimension and from the complexity of the model. Moreover, we need to control the simulation variance.
The Problem/Goal
To build a simulation of a process: B&S, CIR, H&W. To implement a sensitivity model for the parameters.
The Tools
A pricing simulation model. Comparison between the B&S formula and the MC estimation. A confidence interval for the MC estimation. The greeks (Delta) in the MC approach.
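A minimal Monte Carlo pricing sketch (illustrative Python/NumPy, not course material): the MC estimate of a European call with its 95% confidence interval, to be compared against the bs_call closed form shown earlier.

```python
import numpy as np

def mc_call_price(S0, K, T, r, sigma, n_sims, seed=0):
    """Monte Carlo price of a European call under the Black-Scholes dynamics,
    with a 95% confidence interval for the estimator."""
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal(n_sims)
    ST = S0 * np.exp((r - 0.5 * sigma ** 2) * T + sigma * np.sqrt(T) * Z)
    payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)
    price = payoff.mean()
    half_width = 1.96 * payoff.std(ddof=1) / np.sqrt(n_sims)
    return price, (price - half_width, price + half_width)

price, ci = mc_call_price(100.0, 100.0, 1.0, 0.02, 0.20, 200_000)
print(price, ci)
```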

7 Discrete state and discrete time processes
The Model
The Euler scheme is the natural way to discretize a diffusion. On the other hand, one can wonder about a structurally discrete time and/or discrete space model. The simplest one is the binomial tree, where at each step X_{t+1} may move just UP or DOWN from the previous level X_t, according to a given probability (p, 1-p).
The Problem/Goal
The binomial tree as a process converging to the diffusion. European and American options pricing by the binomial tree. Computational effort.
The Tools
A pricing simulation model. Some alternatives: trinomial tree, quantization. Examples.
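A possible sketch of the binomial pricing mentioned above (the Cox-Ross-Rubinstein parametrization is assumed here, with illustrative inputs); the same backward induction handles European and American exercise.

```python
import math

def crr_option_price(S0, K, T, r, sigma, n_steps, american=False, call=True):
    """Cox-Ross-Rubinstein binomial tree: at each step the price moves UP by u
    or DOWN by d = 1/u with risk-neutral probability p."""
    dt = T / n_steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)
    disc = math.exp(-r * dt)
    sign = 1.0 if call else -1.0
    # terminal payoffs at step n_steps
    values = [max(sign * (S0 * u ** j * d ** (n_steps - j) - K), 0.0)
              for j in range(n_steps + 1)]
    # backward induction through the tree
    for i in range(n_steps - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1.0 - p) * values[j])
            if american:
                S = S0 * u ** j * d ** (i - j)
                cont = max(cont, max(sign * (S - K), 0.0))  # early exercise check
            values[j] = cont
    return values[0]

print(crr_option_price(100.0, 100.0, 1.0, 0.02, 0.20, 500))
```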

8 Some useful Brownian motion functionals
The Model
B_t, the Brownian motion, is a stochastic process with independent increments over time, distributed as N(0, Δt) over a step Δt. It is the (main) random source of the stochastic processes used in finance. The pricing processes (or interest rate processes, ...) are built on it by SDEs, where in some lucky cases we have explicit solutions. In finance and many other fields it can be useful to calculate some quantities, such as:
the time spent below/over a given threshold (occupation time)
the time before a boundary is touched for the first time (hitting time)
the time spent «near» a given point (local time).
The Problem/Goal
To know some of the above quantities and the related probabilities/metrics.
The Tools
An application to the B&S diffusion model.
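As a rough illustration only (a crude path-simulation estimate in Python/NumPy, with hypothetical barrier and threshold levels, not the closed-form distributions discussed in the course), the first hitting probability and the occupation time of a drifted Brownian motion can be approximated as follows.

```python
import numpy as np

def hitting_and_occupation(x0, barrier, threshold, mu, sigma, T, n_steps, n_paths, seed=0):
    """For X_t = x0 + mu*t + sigma*B_t, estimate by simulation the probability that
    the barrier is hit before T and the mean time spent below the threshold."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    dX = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
    paths = x0 + np.cumsum(dX, axis=1)
    hit_prob = np.mean((paths >= barrier).any(axis=1))          # first hitting time <= T
    occupation = dt * np.mean((paths < threshold).sum(axis=1))  # occupation time below threshold
    return hit_prob, occupation

print(hitting_and_occupation(0.0, 1.0, 0.0, 0.05, 1.0, 1.0, 2_000, 50_000))
```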

9 Applications
Application 1: Exotic derivative evaluation
A derivative may be exotic because of its payoff, i.e. the function f for which we must evaluate the expected value E[f(X_T)] at the maturity T, and/or because of the underlying definition, i.e. what exactly X represents (a single stock price, a basket, a function of the order statistics of a basket, ...). Going through some details, both in the mathematical features and in the data structure and data flow strategies, we will manage some exotic derivatives such as Asian, rainbow, ...
Application 2: Full evaluation VaR versus Delta-Gamma-Vega VaR
The VaR is the (adverse) quantile of the future value V_{t+h} of a financial instrument or a portfolio. Given the model, from an algorithmic and computational perspective, one has two very different principles to calculate it: full evaluation, i.e. re-pricing the derivatives over all the scenarios, or approximating ΔV by first and second order sensitivities (greeks). On one hand, greeks are very fast (arithmetic instead of complex evaluation functions); on the other hand, they could be inaccurate. Practical applications are carried out.
Application 3: The Compound Poisson Process and the Operational VaR
The operational loss is defined by L = Σ_{n=1}^{N} X_n, where in simple cases N is a Poisson random variable and the X_n are lognormal LGE (loss given event) amounts, exactly as in the insurance sector. The (deterministic or Poisson driven) sum of lognormals is not lognormal. How to model/approximate the compound Poisson process? Several approaches are explained and implemented, such as the moment matching method, approximations from insurance, and some tricks using the Brownian motion functionals.
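For Application 3, a minimal simulation sketch (illustrative frequency/severity parameters, not calibrated figures): the compound Poisson loss with lognormal severities, and its 99.9% quantile taken as the Operational VaR.

```python
import numpy as np

def operational_var(lam, mu, sigma, alpha=0.999, n_sims=200_000, seed=0):
    """Simulate the compound Poisson loss L = sum_{n=1}^{N} X_n, with
    N ~ Poisson(lam) event counts and X_n ~ Lognormal(mu, sigma) severities,
    and return the alpha-quantile of L (Operational VaR)."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, size=n_sims)
    losses = np.array([rng.lognormal(mu, sigma, size=n).sum() for n in counts])
    return np.quantile(losses, alpha)

# Hypothetical inputs: 20 events per year on average, lognormal severities
print(operational_var(lam=20.0, mu=8.0, sigma=1.5))
```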

10 Volatility Surface
The Model
In the B&S model we have «THE» volatility σ. If we extract from the actual market prices the «implied» (or market) volatilities, we observe that we get different σ(T, K) values, according to strikes and maturities. Market illiquidity? Is the B&S model inadequate? Is the market segmented? In the literature, several extensions have been suggested, from the local volatility models up to the Heston model and so on. Moreover, volatility can be actively traded by means of volatility futures (VIX, VSTOXX, ...) and variance swap products.
The Problem/Goal
To give an overall picture of the theoretical/applied context, showing some actual equity volatility surfaces, discussing the shape and the calibration issues, verifying arbitrage opportunities.
The Tools
Real world market tools, spreadsheets.
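A small sketch of the implied volatility extraction behind the surface (assuming the Black-Scholes call formula and a simple bisection; the quoted price below is made up for illustration): repeating the inversion over the strike/maturity grid yields the σ(T, K) values discussed above.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S0, K, T, r, sigma):
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d1 - sigma * math.sqrt(T))

def implied_vol(price, S0, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Invert Black-Scholes by bisection: find the sigma(T, K)
    that reproduces the quoted market price."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S0, K, T, r, mid) > price:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

print(implied_vol(price=9.0, S0=100.0, K=100.0, T=1.0, r=0.02))
```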

11 PCA – Principal Components Analysis
The Model
From a front office point of view, it is very important to handle data and models in a very granular and accurate way. Examples: for each interest rate curve 40-50 buckets are used, the volatility surface may have K x T = 720 points, and each underlying, even if illiquid, must be used without any proxy or mapping. But the risk manager or the top management of the bank wants a high-level picture: where are the risks? How much are we exposed to the risk drivers? What is the effective position of the bank?
The Problem/Goal
The PCA, born in economic contexts, allows us to reduce dramatically the dimension of the time series. The new «components» very often give useful insights about the market movements and the bank exposures.
The Tools
PCA applications to the term structure and to the volatility surfaces. High level dashboard examples.
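A compact PCA sketch (Python/NumPy, on synthetic data standing in for a real curve history): eigendecomposition of the covariance of daily bucket changes; on market data the leading components usually read as level, slope and curvature of the term structure.

```python
import numpy as np

def pca_components(rate_changes, n_components=3):
    """PCA of daily rate changes (n_days x n_buckets): eigendecomposition of the
    covariance matrix, components sorted by explained variance."""
    X = rate_changes - rate_changes.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]           # re-sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = eigvals / eigvals.sum()
    return eigvecs[:, :n_components], explained[:n_components]

# Illustrative synthetic history: 250 days of changes on a 40-bucket curve
rng = np.random.default_rng(0)
fake_changes = rng.normal(0.0, 0.01, size=(250, 40)).cumsum(axis=1)
loadings, explained = pca_components(fake_changes)
print("explained variance ratios:", explained)
```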

