In the famous scene from the movie “This Is Spinal Tap,” Nigel Tufnel shows that the volume controls on his amplifier go up to 11, one more than the standard 10. Having that extra power to “push over the cliff,” says Nigel, gives the band a unique edge. It’s a metaphor that applies well to risk management. Indeed, the risk space has always placed a persistent emphasis on maximum data precision.
Axioma’s delivery of large data sets to clients predated the Global Financial Crisis, beginning with the release of its daily fundamental factor equity risk models in 2007. Axioma was the first firm in this space to provide daily models, and while some considered them overkill at the time, that view changed abruptly with the onset of the crisis. The power of these models to expose sources of risk across a parsimonious set of risk factors depends on accurate inputs and reasonable pricing models.
Post-crisis, managing large data sets on financial instruments has become even more critical, as regulatory reforms (e.g., MiFID II, Money Market Reform) now require firms to capture additional data points. Moreover, security reference data (terms and conditions) are key inputs, not only in Axioma’s equity risk models, but also in stress-testing calculations that are increasingly in demand, whether driven by regulatory requirements or by concerns about the end of the longest-running bull market. Front-office staff at hedge funds, asset managers and other financial institutions look to Axioma to help them perform proactive stress testing and gain real-time intuition into what may happen in the markets. Axioma’s clients work directly with our specialists to construct predictive scenarios that help them evaluate the ramifications of various shocks on their portfolios. These shocks include movements within equity or bond markets, volatility changes, parallel or non-parallel rate curve shifts, widening of spreads, or historical stress events, such as the 2015 China Bubble Burst, the 2015 Swiss FX Intervention, or even the 2008 Bear Market Rally. (For more on stress testing, please see Axioma’s paper “Stress Testing Best Practices.”) And, bringing it back to Nigel: to do this properly, clients need controls that go up to 11 in terms of data precision.
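To make the rate-curve shocks above concrete, a parallel yield shift can be sketched with a first-order, duration-based approximation, where each position’s price impact is roughly −duration × Δy × market value. The portfolio and figures below are hypothetical illustrations, not Axioma data or methodology:

```python
# Minimal sketch of a parallel rate-curve shock using the first-order
# (duration-based) approximation: dP ≈ -D * dy * P.
# The positions and durations below are hypothetical, for illustration only.

def shock_portfolio(positions, shift_bps):
    """Approximate the P&L of a parallel yield-curve shift on bond positions.

    positions: list of (name, market_value, modified_duration) tuples
    shift_bps: parallel shift in basis points (positive = rates up)
    """
    shift = shift_bps / 10_000  # basis points -> decimal yield change
    pnl = 0.0
    for _name, market_value, duration in positions:
        pnl += -duration * shift * market_value
    return pnl

positions = [
    ("UST 10Y", 1_000_000, 8.5),  # (name, market value, modified duration)
    ("Corp 5Y",   500_000, 4.2),
]

# Impact of a +100bp parallel shift on the sample portfolio
print(round(shock_portfolio(positions, 100), 2))
```

A non-parallel shift would replace the single `shift_bps` with a per-tenor shock vector, but the same first-order logic applies.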
To support the growing post-crisis data requirements, there has been a deeper industry-wide focus on data lineage: the full life cycle of data, including its origins, flows, transformations and end usage within an organization or by its customers. Organizational behavior around data is changing, with firms starting to treat data as an asset that drives every business decision. We are all too familiar with the actors of the crisis who inadvertently bet the company because they lacked the models or the data to measure the risk they were taking on. These ran the gamut from insurers that wrote CDS protection essentially for free to banks that knowingly originated bad loans and simply sold them on. Everyone in the food chain believed they carried no risk, even as they waded into toxic sludge.
In the July whitepaper “CDO 2.0: Managing the New ‘Data-to-Value’ Equation,” Michael Atkin, a thought leader in Enterprise Data Management and professor at Columbia University, argues that the new mandate for data requires two components:
Atkin argues that firms need to reinvent their processes to provide customers with real-time, accurate information. At Axioma, the sophistication of our risk platform is powered by the cloud, which can scale to handle very large numbers of risk calculations. Additionally, Axioma Risk exposes an API, meaning clients can make calls on Axioma Risk to extract information directly into their own programs almost immediately. As Sam Solomon, the angel investor who managed large data and analytics teams pre-crisis at Lehman Brothers and Standard & Poor’s, puts it: “Data is only as good as 5 seconds ago.” At Axioma, the combination of world-class risk models and best-of-breed technological solutions gives clients the power to extract what they need quickly and accurately.
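As a rough illustration of the API-driven workflow described above, the sketch below builds a JSON request for a stress scenario and parses a canned response. The endpoint, field names, and scenario identifier are hypothetical assumptions for illustration, not Axioma Risk’s actual API:

```python
import json

# Hypothetical sketch of pulling risk results over a REST-style API.
# The base URL, request fields, and response shape are illustrative
# assumptions only; they do not describe Axioma Risk's real interface.

BASE_URL = "https://api.example.com/risk"  # placeholder, not a real endpoint

def build_stress_request(portfolio_id, scenario):
    """Construct a JSON body requesting a stress-test run (hypothetical schema)."""
    return json.dumps({"portfolioId": portfolio_id, "scenario": scenario})

def parse_response(raw_json):
    """Extract the headline P&L figure from a response payload (hypothetical schema)."""
    payload = json.loads(raw_json)
    return payload["results"]["pnl"]

body = build_stress_request("PORT-001", "parallel_shift_+100bp")
# In practice the body would be POSTed to the service (e.g. via
# urllib.request); here we simply parse a canned sample response.
sample_response = '{"results": {"pnl": -106000.0}}'
print(parse_response(sample_response))
```

The point of such an integration is that results land in the client’s own programs as structured data, rather than being re-keyed from reports.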