## Friday, April 19, 2013

### A Fast Exponential Function in Java

In finance, one often discretizes the log process rather than the process itself for Monte-Carlo simulation, so the Math.exp function can be called a lot (millions of times in a single simulation) and can become a bottleneck. I have noticed that the simpler Euler discretization was around 30% faster for a local volatility Monte-Carlo, because it avoids calls to Math.exp.

Can we improve on the speed of the JDK's exp? At first glance it seems difficult to beat: one would expect the JDK to call the processor's exp through an intrinsic function call. But what if one is willing to accept slightly lower accuracy? Could a simple Chebyshev polynomial expansion be faster?

Out of curiosity, I tried a Chebyshev polynomial expansion with 10 coefficients stored in a final double array. I computed the coefficients using a precise quadrature (Newton-Cotes) and ended up with around 1E-9 to 1E-10 absolute and relative accuracy on [-1,1].
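A minimal sketch of what such an approximation can look like. The post does not show its implementation, so the details here are assumptions: the coefficients are computed with the standard discrete cosine formula at Chebyshev nodes rather than Newton-Cotes quadrature (the results are comparable), and the series is evaluated with Clenshaw's recurrence.

```java
// Hypothetical sketch of a Chebyshev approximation of exp on [-1,1].
// Coefficients come from the discrete orthogonality relation at Chebyshev
// nodes (an assumption; the post used Newton-Cotes quadrature instead),
// and evaluation uses Clenshaw's recurrence.
public final class ChebyshevExp {
    private static final int N = 10;                 // number of coefficients
    private static final double[] C = coefficients(N);

    // Chebyshev interpolation coefficients for f(x) = exp(x) on [-1,1]:
    // c_j = (2/N) * sum_k f(cos(pi*(k+0.5)/N)) * cos(j*pi*(k+0.5)/N)
    private static double[] coefficients(int n) {
        double[] c = new double[n];
        for (int j = 0; j < n; j++) {
            double sum = 0;
            for (int k = 0; k < n; k++) {
                double theta = Math.PI * (k + 0.5) / n;
                sum += Math.exp(Math.cos(theta)) * Math.cos(j * theta);
            }
            c[j] = 2.0 * sum / n;
        }
        return c;
    }

    // Clenshaw recurrence for sum c_j*T_j(x) - c_0/2; valid only on [-1,1].
    public static double exp(double x) {
        double b1 = 0, b2 = 0;
        for (int j = N - 1; j >= 1; j--) {
            double tmp = 2 * x * b1 - b2 + C[j];
            b2 = b1;
            b1 = tmp;
        }
        return x * b1 - b2 + 0.5 * C[0];
    }
}
```

With N = 10 the error against Math.exp is on the order of 1E-9 on [-1,1]; dropping to N = 5 degrades it to roughly 1E-4, consistent with the second set of timings below.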

Here are the results of a simple sum of 10M random numbers:

```
0.75s for Math.exp      sum=1.7182816693332244E7
0.48s for ChebyshevExp  sum=1.718281669341388E7
0.40s for FastMath.exp  sum=1.7182816693332244E7
```

So while this simple implementation is actually faster than Math.exp (though it only works on [-1,1]), FastMath from Apache Commons Math, which relies on a table lookup algorithm, is faster still, in addition to being more precise and not limited to [-1,1].
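The timings above come from summing exp over 10M random numbers. A sketch of such a micro-benchmark (class and method names are hypothetical, not the post's code; a fixed seed is used for reproducibility, and a few warm-up passes let the JIT compile the hot loop before timing, since naive one-shot timings of this kind are otherwise unreliable):

```java
import java.util.Random;
import java.util.function.DoubleUnaryOperator;

public class ExpBenchmark {
    // Sums f over the array and prints the elapsed wall-clock time.
    static double timeSum(String label, DoubleUnaryOperator f, double[] x) {
        long t0 = System.nanoTime();
        double sum = 0;
        for (double v : x) sum += f.applyAsDouble(v);
        long t1 = System.nanoTime();
        System.out.printf("%.2fs for %s sum=%s%n", (t1 - t0) * 1e-9, label, sum);
        return sum;
    }

    public static void main(String[] args) {
        int n = 10_000_000;
        double[] x = new double[n];
        Random rng = new Random(42);                      // fixed seed
        for (int i = 0; i < n; i++) x[i] = rng.nextDouble(); // uniform in [0,1)

        // Warm-up passes so the JIT compiles the loop before the measured run.
        for (int w = 0; w < 3; w++) timeSum("warm-up", Math::exp, x);
        timeSum("Math.exp", Math::exp, x);
        // The sum should be close to n * (e - 1), i.e. about 1.718E7,
        // matching the sums reported in the tables.
    }
}
```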

Of course, if I use only 5 coefficients, the speed is better still, but the relative error grows to around 1E-4, which is unlikely to be satisfying for a finance application.

```
0.78s for Math.exp      sum=1.7182816693332244E7
0.27s for ChebyshevExp  sum=1.718193001875838E7
0.40s for FastMath.exp  sum=1.7182816693332244E7
```