I never seem to be entirely happy with any of the Linux desktops these days. I have used XFCE on Ubuntu quite a bit in the past year; it mostly works, but I still had minor annoyances:

- sometimes (rarely) my laptop would not wake up from sleep.

- notifications sometimes kept popping up too insistently.

- on my desktop, I experienced strong tearing issues with the Radeon graphics card, except with some very specific combination of video player and desktop settings (and even then, I had annoying redraw issues when changing the volume during movies).

Since then, I have been satisfied with two different approaches:

- OpenSuse 13.2 with KDE 4. I use that on my desktop; all issues are gone, and the integration of KDE in OpenSuse is clearly the best I have experienced. In contrast, KDE 5 on Ubuntu was a disaster for me. I also managed to mess up the apt dependencies so badly that I thought it would be simpler to reinstall a fresh distribution.

- Mate on Ubuntu 15.04. Very impressed so far. It's probably what Gnome should have become instead of going to 3.0. Even if there are nice aspects of the Gnome shell, Mate is fast, pretty, and user friendly, much better than Cinnamon. There are even a few layouts to choose from (most of them are good); here is "Eleven with Mate menu" (it installed and set up the Plank dock automatically for that layout; more traditional layouts without a dock are available):

## Wednesday, June 24, 2015

## Friday, June 19, 2015

### Square Root Crank-Nicolson

C. Reisinger kindly pointed me to this paper on square root Crank-Nicolson. The idea is to apply a square-root-of-time transformation to the PDE and discretize the resulting PDE with Crank-Nicolson. Two reasons come to mind to try this:

- the square root transform will result in small time steps initially, where the solution is potentially not so smooth, making Crank-Nicolson behave better.
- it is the natural time scale of Brownian motion.
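Concretely (a sketch in my own notation, not the paper's: \(t\) is time-to-maturity and \(\mathcal{L}\) a generic pricing operator), substituting \(t = \tau^2\) gives

$$\frac{\partial V}{\partial t} = \mathcal{L}V \quad \longrightarrow \quad \frac{\partial V}{\partial \tau} = \frac{\partial V}{\partial t}\frac{dt}{d\tau} = 2\tau\,\mathcal{L}V,$$

so a uniform grid in \(\tau\) corresponds to \(t_i = T\,(i/N)^2\): quadratically small steps near \(t = 0\), where the discontinuity sits.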

Out of curiosity I tried it to price a one-touch barrier option. Of course there is an analytical solution in my test case (Black-Scholes assumptions), but as soon as rates are not assumed constant or a local volatility is used, there is no alternative to a numerical method. In the latter case, finite difference methods are quite good in terms of performance versus accuracy.

The classic Crank-Nicolson gives a reasonable price, but the strong oscillations near the barrier at every time step are not very comforting.

Crank-Nicolson Prices near the Barrier. Each line is a different time.

Moving to square root of time removes nearly all oscillations on this problem, even with a relatively low number of time steps compared to the number of space steps.

Square Root Crank-Nicolson Prices near the Barrier. Each line is a different time.
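To make the comparison concrete, here is a minimal one-touch pricer sketch under the same Black-Scholes assumptions. All function names and parameter values are mine, chosen for illustration and not taken from the post: the same Crank-Nicolson loop is run once on a uniform time grid and once on a grid uniform in \(\sqrt{t}\).

```python
# Sketch: one-touch (down) option paying 1 at expiry if the barrier was hit,
# priced by Crank-Nicolson on the Black-Scholes PDE, with either a uniform
# time grid or a grid that is uniform in sqrt(time). Illustrative only.
import numpy as np
from scipy.linalg import solve_banded

def one_touch_cn(r, sigma, barrier, s_max, m, tau_grid):
    """tau_grid is a grid of time-to-maturity values starting at 0."""
    s = np.linspace(barrier, s_max, m)
    h = s[1] - s[0]
    v = np.zeros(m)   # at expiry, worth 0 if the barrier was never hit
    v[0] = 1.0        # at the barrier itself, the option has been touched
    # Black-Scholes operator: L v = 0.5 sigma^2 s^2 v_ss + r s v_s - r v
    lo = 0.5 * sigma**2 * s**2 / h**2 - 0.5 * r * s / h   # sub-diagonal
    mid = -(sigma**2) * s**2 / h**2 - r                   # diagonal
    up = 0.5 * sigma**2 * s**2 / h**2 + 0.5 * r * s / h   # super-diagonal
    for i in range(1, len(tau_grid)):
        dt = tau_grid[i] - tau_grid[i - 1]
        # Crank-Nicolson: (I - dt/2 L) v_new = (I + dt/2 L) v_old
        rhs = v.copy()
        rhs[1:-1] += 0.5 * dt * (lo[1:-1] * v[:-2]
                                 + mid[1:-1] * v[1:-1]
                                 + up[1:-1] * v[2:])
        ab = np.zeros((3, m))            # banded storage for solve_banded
        ab[1, :] = 1.0
        ab[0, 2:] = -0.5 * dt * up[1:-1]
        ab[1, 1:-1] -= 0.5 * dt * mid[1:-1]
        ab[2, :-2] = -0.5 * dt * lo[1:-1]
        rhs[0] = np.exp(-r * tau_grid[i])  # hit: 1 discounted from expiry
        rhs[-1] = 0.0                      # far field: barrier never hit
        v = solve_banded((1, 1), ab, rhs)
    return s, v

T, n = 1.0, 20
uniform = T * np.arange(n + 1) / n
sqrt_time = T * (np.arange(n + 1) / n) ** 2   # uniform in sqrt(tau)
s, v_uni = one_touch_cn(0.0, 0.2, barrier=90.0, s_max=300.0, m=200,
                        tau_grid=uniform)
_, v_sqrt = one_touch_cn(0.0, 0.2, barrier=90.0, s_max=300.0, m=200,
                         tau_grid=sqrt_time)
```

Both grids use the same number of steps; the difference shows up in the oscillations near the barrier at early times, not in the price far from it.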

We can see that the second step prices are a bit higher than the third step prices (the lines cross), which looks like a small numerical oscillation in time, even if there is no oscillation in space.

TR-BDF2 Prices near the Barrier. Each line is a different time.

As a comparison, the TR-BDF2 scheme does relatively well: oscillations are gone after the second step, even with the extreme ratio of time steps to space steps used in this example (chosen so that the illustrations are clearer). With 10 times fewer space steps, Crank-Nicolson would still oscillate a lot, while the square root Crank-Nicolson would show no oscillation and TR-BDF2 only a very mild one.
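For reference, one TR-BDF2 step from \(v^n\) to \(v^{n+1}\) over \(\Delta t\), with \(\gamma = 2 - \sqrt{2}\) and a spatial operator \(\mathcal{L}\) (my notation), is a trapezoidal stage followed by a BDF2 stage:

$$\left(I - \frac{\gamma \Delta t}{2}\mathcal{L}\right) v^{*} = \left(I + \frac{\gamma \Delta t}{2}\mathcal{L}\right) v^{n},$$

$$\left(I - \frac{(1-\gamma)\Delta t}{2-\gamma}\mathcal{L}\right) v^{n+1} = \frac{1}{\gamma(2-\gamma)}\, v^{*} - \frac{(1-\gamma)^2}{\gamma(2-\gamma)}\, v^{n}.$$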

The LMG2 scheme (a local Richardson extrapolation) does not oscillate at all on this problem, but it is the slowest:

LMG2 Prices near the Barrier. Each line is a different time.
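If LMG2 denotes a local Richardson extrapolation of implicit Euler (my reading of the parenthetical above; the function names below are mine), one step combines a full step with two half steps:

```python
# Hypothetical sketch of one local-Richardson-extrapolation step:
# combining one full implicit Euler step with two half steps cancels the
# leading local error term, giving second order without Crank-Nicolson's
# oscillations, at the cost of three linear solves per step.
def extrapolated_euler_step(solve_ie, v, dt):
    # solve_ie(v, dt) solves (I - dt*L) v_new = v, one implicit Euler step
    v_full = solve_ie(v, dt)
    v_half = solve_ie(solve_ie(v, dt / 2), dt / 2)
    return 2.0 * v_half - v_full
```

On the scalar test equation \(v' = -v\), where implicit Euler is `v / (1 + dt)`, the extrapolated step is noticeably closer to \(e^{-\Delta t}\) than plain implicit Euler.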

The square root Crank-Nicolson is quite elegant. It cannot, however, be applied to that many problems in practice, as grid times are often imposed by the payoff to evaluate, for example with a discrete weekly barrier. But for continuous-time problems (density PDE, vanilla, American, continuous barriers) it's quite good.

In reality, with a continuous barrier, the payoff is not discontinuous at every step; it is discontinuous only at the first step. So Rannacher smoothing would work very well on that problem:

Rannacher Prices near the Barrier. Each line is a different time.
The somewhat interesting payoff left for the square root Crank-Nicolson is then the American option.

## Friday, May 08, 2015

### Decoding Hagan's arbitrage-free SABR PDE derivation

Here are the main steps of Hagan's derivation. Let's recall his notation for the SABR model, where typically \(C(F) = F^\beta\).

First, he defines the moments of stochastic volatility:
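The defining equations were images in the original post; presumably, and consistent with \(Q^{(0)}\) and \(Q^{(2)}\) appearing below as the denominator and numerator of a conditional expectation, the \(k\)-th moment of the joint density \(Q(T, F, A)\) over the volatility variable \(A\) is

$$Q^{(k)}(T, F) = \int_0^{\infty} A^k\, Q(T, F, A)\, dA, \qquad k = 0, 1, 2, \ldots$$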

On the backward Kolmogorov equation, he applies a Lamperti-transform-like change of variable:

And then makes another change of variable so that the PDE has the same initial conditions for all moments:

This leads to

It turns out that there is a magical symmetry for k=0 and k=2.

Note that in the second equation, the second derivative applies to the whole expression.

Because of this, he can express \(Q^{(2)}\) in terms of \(Q^{(0)}\):

And he plugs that back into the integrated Fokker-Planck equation to obtain the arbitrage-free SABR PDE:

There is a simpler, more common explanation for what's going on in the world of local stochastic volatility. For example, in the particle method paper of Guyon-Labordère, we have the following expression for the true local volatility.

In the first equation, the numerator is simply \(Q^{(2)}\) and the denominator \(Q^{(0)}\). Of course, the integrated Fokker-Planck equation can be rewritten as:

$$Q^{(0)}_T = \frac{1}{2}\epsilon^2 \left[C^2(F) \frac{Q^{(2)}}{Q^{(0)}} Q^{(0)}\right]_{FF}$$
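The ratio appearing above is just the conditional second moment of the volatility; in my phrasing of the link the text describes,

$$\epsilon^2 C^2(F)\,\frac{Q^{(2)}(T,F)}{Q^{(0)}(T,F)} = \epsilon^2 C^2(F)\, E\!\left[A^2 \mid F_T = F\right],$$

so the rewritten Fokker-Planck equation is exactly the Dupire-style forward equation for the marginal density \(Q^{(0)}\) under that effective local volatility.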

Karlsmark uses this approach directly in his thesis, using Doust's expansions for \(Q^{(k)}\). Looking at Doust's expansions, the fraction reduces straightforwardly to the same expression as Hagan's, and the symmetry in the equations appears a bit less coincidental.
