SimianQuant

Bringing Back Moore's Law

Harshad Deo

One-dimensional interpolators are among the simplest functions implemented by mathematical libraries. This article compares Strata’s implementations of linear and cubic spline interpolation against their equivalents generated using the SimianQuant library.

For an alternate open source reference, benchmarks for implementations provided by Apache Commons Math (ACM) are also included. Two classes of generated implementations are considered:

  1. A Standard Variant, in which the data points are only known at runtime. This variant is usable with any dataset.
  2. A Mass Customized Variant, in which the data points are known at generation time. This is only usable for the chosen dataset.

As this and other articles illustrate, mass customization can significantly increase runtime performance for a small upfront cost (typically ~10 ms), but requires a license to a runtime instance of the library. The standard variant does not require a runtime library instance and can be exported from the web client.
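
To make the distinction concrete, the sketch below contrasts the two variants for the simplest case, linear interpolation. It is an illustration of the idea only; the class name, knot values, and code shapes are assumptions, not the output of the SimianQuant generator.

```java
import java.util.Arrays;

// Sketch only: illustrates the standard vs mass-customized split, not the
// code actually emitted by the SimianQuant library.
final class VariantSketch {

  // Standard variant: the knots are supplied at runtime, so the same code
  // works with any dataset.
  static double standard(double[] xs, double[] ys, double x) {
    int i = Arrays.binarySearch(xs, x);
    if (i < 0) i = -i - 2;                        // index of the left knot
    i = Math.max(0, Math.min(i, xs.length - 2));  // clamp to a valid segment
    double w = (x - xs[i]) / (xs[i + 1] - xs[i]);
    return (1 - w) * ys[i] + w * ys[i + 1];
  }

  // Mass-customized variant: the knots are known at generation time, so the
  // segment bounds and slopes are constants folded into the generated source.
  // The numbers below are placeholders.
  static double customized(double x) {
    if (x < 1.0) return 2.0 + 0.50 * x;           // segment [x0, x1)
    if (x < 2.0) return 2.5 + 1.50 * (x - 1.0);   // segment [x1, x2)
    return 4.0 + 0.25 * (x - 2.0);                // last segment
  }
}
```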

All of the considered interpolators were calibrated on 20 data points. To summarize the results, the SimianQuant implementations for linear interpolation are about 2x faster than the corresponding Strata implementations, and those for cubic spline interpolation are about 800x faster.

All measurements were made on an AWS c5 instance with a clock speed of 3 GHz.
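
The article does not describe the harness itself. A typical way to measure a JVM interpolator at this granularity is a JMH benchmark along the following lines; the class name, knot values, and evaluation point are assumptions, and the Apache Commons Math interpolator is used here only because its API is public.

```java
import java.util.concurrent.TimeUnit;
import org.apache.commons.math3.analysis.interpolation.LinearInterpolator;
import org.apache.commons.math3.analysis.polynomials.PolynomialSplineFunction;
import org.openjdk.jmh.annotations.*;

@State(Scope.Benchmark)
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.NANOSECONDS)
public class LinearInterpolationBench {

  private PolynomialSplineFunction interpolator;
  private double x;

  @Setup
  public void setup() {
    // Calibrate on 20 knots, matching the setup described above.
    double[] xs = new double[20];
    double[] ys = new double[20];
    for (int i = 0; i < 20; i++) {
      xs[i] = i;
      ys[i] = Math.sin(i);
    }
    interpolator = new LinearInterpolator().interpolate(xs, ys);
    x = 7.3;
  }

  @Benchmark
  public double value() {
    return interpolator.value(x);
  }
}
```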

Linear Interpolation

The algebraic and computational simplicity of linear interpolation limits the performance step that can be achieved.

Value

The SimianQuant implementations are between two and three times faster, mainly due to a better memory layout. Mass customization does not significantly increase performance, and those results are omitted.
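
The article does not show either library's internals, so the following is only a plausible picture of what a "better memory layout" means on the JVM: keeping the knots in primitive arrays rather than in per-knot objects.

```java
// Illustration only, not the actual data structures of either library.
final class KnotLayouts {

  // Object-per-knot layout: every knot is a separate heap object, so each
  // lookup follows a reference and the knots may be scattered across the heap.
  record Knot(double x, double y) {}
  static final Knot[] BOXED = new Knot[20];

  // Flat layout: primitive arrays keep all 20 knots contiguous in memory,
  // so a segment lookup touches only a couple of cache lines.
  static final double[] XS = new double[20];
  static final double[] YS = new double[20];
}
```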

Parametric Sensitivity

The SimianQuant implementations are reliably faster, although the performance step is lower. Sensitivity analysis requires keeping track of more information, which limits the tricks that the library can use. The results for the JVM implementations are omitted for clarity.
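
A rough sketch of why the sensitivity path carries more state, assuming the sensitivities of interest are those of the value with respect to the node ordinates (the method below is illustrative, not the generated code):

```java
import java.util.Arrays;

final class LinearSensitivitySketch {

  // Returns { value, d(value)/d(ys[0]), ..., d(value)/d(ys[n-1]) }.
  // For linear interpolation the non-zero sensitivities are just the two
  // interpolation weights, but a full gradient array still has to be
  // allocated, filled, and returned alongside the value.
  static double[] valueAndNodeSensitivity(double[] xs, double[] ys, double x) {
    int i = Arrays.binarySearch(xs, x);
    if (i < 0) i = -i - 2;
    i = Math.max(0, Math.min(i, xs.length - 2));
    double w = (x - xs[i]) / (xs[i + 1] - xs[i]);

    double[] out = new double[ys.length + 1];
    out[0] = (1 - w) * ys[i] + w * ys[i + 1]; // interpolated value
    out[1 + i] = 1 - w;                       // sensitivity to the left node
    out[2 + i] = w;                           // sensitivity to the right node
    return out;
  }
}
```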

Cubic Spline Interpolation

From an algebraic perspective, cubic spline interpolation is marginally more interesting than linear interpolation. From an implementation perspective, comparing the two kinds of implementations illustrates an important point: the SimianQuant library reliably does not make mistakes, while even good human programmers, working within the constraints of a large system, often do.

Value

Strata’s implementation of cubic spline interpolation uses the same functions for both evaluation and sensitivity analysis (using algorithmic differentiation). Although this makes sense (and is probably necessary) from the perspective of consistency and maintainability, it results in code that, while correct, is abysmally slow.
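
A generic picture of the pattern described above (the class below is a stand-in, not Strata's actual types): when a single routine produces the value together with its node sensitivities, a caller that only needs the value still pays for the gradient allocation and the extra arithmetic of the algorithmic-differentiation sweep.

```java
// Stand-in for the value-plus-sensitivities pattern; not Strata's classes.
final class ValueWithSensitivity {

  final double value;
  final double[] nodeSensitivity; // d(value)/d(y_i) for every calibration node

  ValueWithSensitivity(double value, double[] nodeSensitivity) {
    this.value = value;
    this.nodeSensitivity = nodeSensitivity;
  }
}
```

A value-only code path can skip the gradient entirely, which is the opportunity an evaluation-only implementation exploits.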

The performance delta is so large that it, and the deltas for the subsequent cases, need to be visualized on a logarithmic scale.

As the reference benchmark for Apache Commons Math shows, the SimianQuant implementation, though better, is not magical. The performance step is solely because the library does not make mistakes when implementing models.
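
For reference, the Apache Commons Math baseline can be constructed as below; the knot values are placeholders rather than the dataset used in the benchmarks. Calibration solves for the per-segment cubics once, after which evaluation is a segment lookup plus a cubic polynomial.

```java
import org.apache.commons.math3.analysis.interpolation.SplineInterpolator;
import org.apache.commons.math3.analysis.polynomials.PolynomialSplineFunction;

final class SplineBaseline {
  public static void main(String[] args) {
    // Placeholder knots: 20 points, as in the benchmarks described above.
    double[] xs = new double[20];
    double[] ys = new double[20];
    for (int i = 0; i < 20; i++) {
      xs[i] = i;
      ys[i] = Math.sin(i);
    }
    PolynomialSplineFunction spline = new SplineInterpolator().interpolate(xs, ys);
    System.out.println(spline.value(7.3)); // piecewise-cubic evaluation
  }
}
```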

Cubic spline interpolation also nicely illustrates the benefits of mass customization, even for simple models. The performance improves by about 40%.
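
One plausible way to picture where the gain comes from (the coefficients below are placeholders, not generator output): with the knots fixed at generation time, each segment of the spline reduces to a cubic in Horner form with pre-computed coefficients, and the segment search is specialized to the known break points.

```java
// Hypothetical sketch of a mass-customized cubic spline; the coefficients
// are placeholders, not output of the SimianQuant library.
final class CustomizedCubicSketch {
  static double value(double x) {
    if (x < 1.0) { double t = x;       return 2.00 + t * (0.48 + t * (0.00 + t * 0.02)); }
    if (x < 2.0) { double t = x - 1.0; return 2.50 + t * (0.54 + t * (0.06 - t * 0.10)); }
    // ... one branch per remaining segment ...
    double t = x - 19.0;
    return 4.10 + t * (0.12 + t * (-0.03 + t * 0.01));
  }
}
```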

Parametric Sensitivity

This section carries forward the themes of the previous one. The performance delta, though still more than 500x, is lower than in the scalar case for essentially the same reason as with linear interpolation: more information needs to be tracked, which limits the tricks that the library can use. The results for the JVM implementations are omitted for clarity.


About Strata

Strata is an award-winning open source analytics and market risk library published by OpenGamma. The company, however, is opaque about the library's performance and that of its subcomponents. This article is the fourth in a series that benchmarks the library against open source alternatives and against code generated using the SimianQuant library. The source code for the majority of the benchmarks can be found in the companion GitHub repository. Other articles in this series are:

  1. Elementary Functions
  2. Black Scholes
  3. SABR
  4. Two Dimensional Interpolation
