The amount of collateral received and posted by market participants has been climbing steadily since the margin rules for non-cleared derivatives took effect in September 2016. Asset owners have to ensure they can satisfy every collateral call over the lifetime of each OTC derivative. How can they manage the huge amounts of data and complex calculation models required to simulate a multitude of “what if” scenarios across the lifetime of their derivatives portfolios?
The 2018 ISDA margin survey found the 20 largest market participants collected over USD 1 trillion of initial margin (IM) and variation margin (VM) for their non-cleared derivatives transactions at year-end 2018 (Key Trends in the Size and Composition of OTC Derivatives Markets, ISDA, 20 May 2019). Another USD 218 billion in IM was posted by all market participants to the major central counterparties for their cleared interest rate derivative and credit default swap transactions, up from USD 124 billion three years earlier.
International rules on collateral use have been a central policy plank in authorities’ efforts to mitigate counterparty risk in over-the-counter (OTC) trades. “As more firms and transactions become subject to the margin requirements, ISDA expects IM and VM to continue to grow,” the report added.
As market participants take steps to better manage their counterparty exposures and strengthen internal risk policies, demand for collateral continues to increase. Yet there is a trade-off: liquidity risk rises as firms seek to meet all their collateral calls, especially during periods of market volatility, when participants must post higher margins to offset increased uncertainty.
Optimising collateral coverage is no easy task
The challenge for asset owners is to ensure they can satisfy every collateral call over the lifetime of each OTC derivative. At the same time, asset owners want to optimise their collateral use, often across multiple portfolios, locations and trading entities, to avoid keeping hold of too much collateral and increasing portfolio ‘drag’.
The key collateral optimisation goals for firms are therefore to:
- Evaluate the value of both their OTC derivatives and collateral portfolios in different economic scenarios (i.e. how they will react to a range of yield curve and FX rate changes), in order to size and adjust the portfolios accordingly
- Calculate risk metrics, such as collateral coverage ratios (the ratio of the collateral held to the exposure it is meant to cover)
- Enable investment and risk managers to simulate, through comprehensive “what if” scenarios, the impact of adding positions to the portfolio
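The coverage-ratio goal above reduces to a simple calculation once exposures and post-haircut collateral values are known. A minimal sketch (function name and figures are hypothetical, for illustration only):

```python
def collateral_coverage_ratio(collateral_value: float, exposure: float) -> float:
    """Ratio of post-haircut collateral held to the current net exposure.
    A ratio of 1.0 or above means the exposure is fully covered."""
    if exposure <= 0:
        return float("inf")  # nothing to cover
    return collateral_value / exposure

# Example: USD 95m of collateral against USD 100m of net derivative exposure
ratio = collateral_coverage_ratio(95.0, 100.0)
print(f"coverage ratio: {ratio:.2f}")  # 0.95 -> under-collateralised
```

In practice the inputs themselves are the hard part: exposures must be netted per agreement and collateral values adjusted for haircuts before the ratio is meaningful.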
However, there are significant operational complexities in calculating existing position exposures and collateral obligations, and in predicting future demands so as to optimise an institution’s collateral allocations. It requires:
- Collection and manipulation of huge amounts of data
- The ability to conduct daily collateral coverage calculations to keep pace with changing derivatives positions, along with rapid calculation turnaround times
- Enhanced pricing methodologies to produce optimised collateral allocation models
- Scenario generation, as well as the computation of key risk metrics such as the collateral coverage ratio, worst collateral coverage ratio and liquidity coverage ratio
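The scenario-generation requirement can be illustrated with a toy model. The sketch below random-walks two hypothetical risk factors (a parallel yield-curve shift and an FX move) over quarterly steps; production systems use calibrated multi-factor models, so this is illustrative only:

```python
import random

def generate_scenarios(n: int, horizon_quarters: int, seed: int = 42):
    """Toy economic scenario generator: random-walk paths for a parallel
    yield-curve shift (in basis points) and an FX rate move (in percent),
    one pair per quarterly time step."""
    rng = random.Random(seed)  # seeded for reproducibility
    scenarios = []
    for _ in range(n):
        path = []
        rate_shift, fx_move = 0.0, 0.0
        for _ in range(horizon_quarters):
            rate_shift += rng.gauss(0, 25)  # bp shock per quarter
            fx_move += rng.gauss(0, 2)      # % move per quarter
            path.append((rate_shift, fx_move))
        scenarios.append(path)
    return scenarios

paths = generate_scenarios(n=100, horizon_quarters=40)
print(len(paths), len(paths[0]))  # 100 scenarios x 40 quarterly steps
```

Each path would then drive a repricing of every instrument in the portfolio at every time step, from which the coverage metrics are computed.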
And as more exotic OTC instruments are engineered, these tasks become increasingly complicated.
Existing tools fall short
The price for an asset owner of failing to cover all its collateral needs, and of deploying collateral inefficiently, is high. In response, firms are looking for improved operational capabilities to support the investment process, which in turn will enhance their business risk management.
Some asset owners have developed in-house solutions in an effort to track their OTC derivative exposures and post collateral more effectively. However, while these tools are highly bespoke and tailored to specific needs, they may prove difficult to scale, especially in light of increasingly complex investment markets and regulatory pressures. In addition, they risk introducing considerable key-person and operational risk.
Co-developing a predictive collateral coverage solution
Faced with complex and growing collateral coverage obligations, one of BNP Paribas Securities Services’ clients, a large pan-Asian insurance company, approached us for help.
Collaborating closely with multiple stakeholders across the client’s organisation, we created a digital Predictive Collateral Coverage Reporting solution to meet the specific needs of the different stakeholders in the investment office, risk teams and compliance department.
By accurately measuring and monitoring the counterparty and liquidity risks that arise from the insurer’s exposures, the client is now able to make more informed decisions about the trades it should or shouldn’t undertake, and the collateral coverage and liquidity it will need in any given situation.
Marrying client experience with BNP Paribas’ expertise and technology ecosystem
True predictive collateral coverage reporting requires a huge amount of data. It also demands complex calculation models to simulate a multitude of “what if” scenarios across the lifetime of a derivatives portfolio.
At the heart of our solution is our Trusted Data Factory, a data hub that employs big data principles and advanced technologies to produce “what if” simulations and calculate long-term risk projections.
The solution generates more than 3,000 economic scenarios spanning up to 10 years, with at least quarterly time steps. That equates to 240,000 data points, for a dozen risk factors. If 1,000 financial instruments are repriced in each scenario and time step, the result is hundreds of millions of repricings. These repricing results are then consolidated, taking into account counterparty risk business rules such as netting sets, haircuts and so on.
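A back-of-envelope sketch of the scale involved, assuming exactly quarterly steps over the full 10-year horizon (the figures above note "at least quarterly" and a dozen risk factors, which would push the totals higher):

```python
# Illustrative scale arithmetic for the simulation described above
scenarios = 3_000
time_steps = 10 * 4    # 10 years at quarterly steps
instruments = 1_000    # instruments repriced per scenario/time step

scenario_steps = scenarios * time_steps    # scenario/time-step pairs
repricings = scenario_steps * instruments  # individual repricings
print(f"{scenario_steps:,} scenario-steps, {repricings:,} repricings")
```

Even under these conservative assumptions the repricing count lands above a hundred million, which is why the consolidation step (netting sets, haircuts and so on) must be engineered for throughput rather than run as an afterthought.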
The complexity, technological sophistication and robustness involved in such an undertaking make it hard to successfully complete a project of this magnitude alone. But through co-innovation and co-development, the results have been transformative.
We are now able to provide decision-ready information within seconds, giving the client access to their positions and on-the-fly what if calculations on a daily basis. The system’s scalability and adaptability also ensure it can support any growth in our client’s derivatives volumes, which have already expanded by more than 50%.
Effective collateral management is an industry-wide issue
The collateral coverage challenges facing our insurance company client, with its sizable OTC derivatives book, were clearly significant. But they are by no means unusual.
With regulatory rules and industry requirements continuing to fuel demand for collateralisation, many asset owners and managers around the world face similar collateral management problems. Accurately predicting their collateral needs, and optimising what is becoming an increasingly scarce resource, is therefore vital to the success of firms’ investment and risk management practices.
Key features of an advanced collateral coverage ratio solution
Users can access our Predictive Collateral Coverage Reporting solution through their laptop or tablet. Self-service capabilities allow them to:
- Produce full, transparent and auditable exposure analyses of their current OTC derivatives positions by counterparty, country, firm entity, etc
- Determine the peak credit and liquidity exposures based on those positions
- Calculate their worst-case collateral coverage ratio, to identify how under-collateralised they may become
- Employ on-the-fly “what if” analysis to evaluate the impact of hypothetical trades on the firm’s exposures, risk positions and coverage ratio
- Predict the institution’s long-term collateral needs based on its current and hypothetical positions
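The worst-case coverage metric in the list above can be sketched as a scan over all simulated paths. The helper below is hypothetical; a real implementation would apply netting sets and haircuts before taking the ratio:

```python
def worst_coverage_ratio(exposure_paths, collateral_paths):
    """Worst collateral coverage ratio across all simulated scenarios and
    time steps. Inputs are parallel lists of per-scenario value paths."""
    worst = float("inf")
    for exposures, collateral in zip(exposure_paths, collateral_paths):
        for e, c in zip(exposures, collateral):
            if e > 0:  # only time steps with something to cover
                worst = min(worst, c / e)
    return worst

# Two toy scenarios, three quarterly time steps each
exposures  = [[100.0, 120.0, 90.0], [100.0, 150.0, 110.0]]
collateral = [[100.0, 110.0, 95.0], [100.0, 120.0, 115.0]]
print(worst_coverage_ratio(exposures, collateral))  # 0.8 (step 2 of scenario 2)
```

A "what if" analysis then amounts to re-running the same scan with the hypothetical trade added to the exposure paths and comparing the two worst-case ratios.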
The service is informational only, and is not to be relied upon for trading or portfolio construction purposes. Our experts conduct workshops with clients to support their data, rules and scenario definition.
While every effort is made to check and validate the data flows, responsibility for the data's origin rests with the data provider.