The global financial crisis revealed that many banks, including global systemically important banks (G-SIBs), were unable to aggregate exposures fully and quickly. Their ability to use risk analysis and reporting to inform decisions in a timely fashion was therefore seriously impaired, with wide-ranging consequences both for individual firms and for the stability of the financial system as a whole. Since then, regulators have placed scenario planning and stress testing front and center of their drive to strengthen financial institutions’ risk management capabilities. Most national regulators now require firms to provide assurance that they can absorb additional losses under adverse circumstances while retaining sufficient liquidity.
Doom scenarios that could cripple a financial institution will always exist; the role of scenario analysis and stress testing is to ensure the organization is robust against a broad range of potential stress scenarios and to identify the conditions under which it would come under duress (so-called reverse stress testing). This leads to a better understanding of the institution’s dynamics and its interactions with the wider environment, which in turn guides the management and regulation of the organization.
In practice, stress testing exercises rely heavily on firms’ underlying risk models and on how those models interact with the specifics of the group-wide portfolio. The tests are only as good as the sum of these interdependent parts. The analytics carry embedded assumptions, a variety of methodologies with distinct domains of applicability and non-applicability, and a variety of data needs, both internal and external. The data will be dispersed around the organization in multiple silos and typically replicated within several data warehouses. Data definitions and taxonomy generally vary across legacy, new and vendor systems, which creates a multitude of issues when it comes to gathering data that is complete, reliable and timely for analytics and stress testing.
All of the above are interconnected, yet firms continue to proceed as if they were separate or impossible to reconcile fully. At the individual business unit level, a broad variety of “fixes” may make this workable. For group-wide analyses, however, such fixes, although generally applied, are clearly insufficient. And it is exactly these group-wide activities that post-crisis regulatory regimes demand in order to improve overall risk management.
This paper considers the challenges financial institutions face when addressing the problems set out above. It reviews the overall regulatory requirements and then elaborates on the wider business benefits and the additional analyses these capabilities facilitate. It does so in the context of wider industry and regulatory initiatives, as well as their interactions.
Data, Analytics, Stress Testing Interactions
Consider first the role of analytics within a financial institution. The purposes are many (see Figure 1).
For each of the above purposes, there will be multiple data stores, e.g., for risk measurement these will span credit, market, operational and liquidity risk across banking and trading books for each business line and geography. The dimensionality quickly scales in such a way that, for even modest banks, the number of permutations is large, and for the most sophisticated Tier 1 banks it will be enormous.
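The combinatorial growth described above can be made concrete with a short sketch. The dimensions and their values below are invented for illustration; real institutions will have many more business lines, geographies and product-level splits:

```python
from itertools import product

# Illustrative (hypothetical) dimensions along which risk data is typically partitioned.
risk_types = ["credit", "market", "operational", "liquidity"]
books = ["banking", "trading"]
business_lines = ["retail", "corporate", "markets", "wealth"]
geographies = ["EMEA", "Americas", "APAC"]

# Each combination potentially maps to its own data store or reporting slice.
slices = list(product(risk_types, books, business_lines, geographies))
print(len(slices))  # 4 * 2 * 4 * 3 = 96 slices, even for this toy setup
```

Adding just one more dimension, or a few more values per dimension, multiplies the count, which is why even modest banks face a large number of permutations.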
Even the most apparently straightforward question – such as, “What is the firm’s group-wide exposure to client XYZ?” – can be difficult to answer within reasonable timescales.
During the financial crisis, it was these difficulties in determining exposures that resulted in insufficient information being available in a timely manner to guide decision making.
In practice, such group-wide exposure questions can be extremely difficult to answer, this being particularly so when legacy systems and processes are taken into account.
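The crux of the difficulty is that each silo identifies the same legal entity differently. The following minimal sketch, with entirely invented silo names, field names and identifiers, shows why a canonical-identifier mapping is the prerequisite for answering the group-wide exposure question at all:

```python
# Hypothetical sketch: aggregating group-wide exposure to one counterparty
# when each silo uses its own identifier and field names for the same client.

# Per-silo records (illustrative; all names and numbers are invented).
loan_book = [{"cust_id": "C-001", "drawn": 25.0}, {"cust_id": "C-002", "drawn": 10.0}]
derivatives = [{"cpty": "XYZ Corp", "mtm": 4.5}, {"cpty": "ABC Ltd", "mtm": 1.2}]
securities = [{"issuer_code": "XYZ01", "market_value": 7.3}]

# The canonical-identifier mapping is the crux of the aggregation problem:
# without it, the same legal entity cannot be recognised across systems.
canonical = {"C-001": "XYZ", "XYZ Corp": "XYZ", "XYZ01": "XYZ",
             "C-002": "ABC", "ABC Ltd": "ABC"}

def exposure_to(entity: str) -> float:
    """Sum exposures to one canonical entity across all silos."""
    total = 0.0
    total += sum(r["drawn"] for r in loan_book if canonical.get(r["cust_id"]) == entity)
    total += sum(r["mtm"] for r in derivatives if canonical.get(r["cpty"]) == entity)
    total += sum(r["market_value"] for r in securities if canonical.get(r["issuer_code"]) == entity)
    return total

print(exposure_to("XYZ"))  # 25.0 + 4.5 + 7.3 = 36.8
```

In a real institution each silo is a separate system with its own extract process and refresh cycle, so building and maintaining the equivalent of the `canonical` mapping is a substantial data governance exercise, not a ten-line script.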
To address such issues, the Basel Committee on Banking Supervision (BCBS) released BCBS 239 (Principles for effective risk data aggregation and risk reporting, Bank for International Settlements) in January 2013.
The 14 Principles of BCBS 239
- Governance
- Data architecture and IT infrastructure
- Accuracy and integrity
- Completeness
- Timeliness
- Adaptability
- Accuracy (of reporting)
- Comprehensiveness
- Clarity and usefulness
- Frequency
- Distribution
- Review
- Remedial actions and supervisory measures
- Home/host cooperation
Clearly, if an organization is well aligned with the 14 principles, it will be in good shape to use its data and infrastructure for wider risk analytics and business management requirements. However, the effect of these principles goes much further, touching upon several other regulatory initiatives for financial services firms (see Figure 2).
Each of these requirements involves significant risk analytics capabilities, ultimately driving the firm’s risk appetite and strategic decision making. This is achieved via the derivation of risk measures from the aggregated data in a consistent and synchronous manner.
The measures are frequently derived from the data using risk analytics, be that for credit, market or operational risk. Applying scenarios to this combination allows the portfolio of the organization to be stressed, leading to stressed measures of capital and hence a means for the firm to achieve risk appetite management. In essence, it allows “cause and effect” to be established for the organization’s business objectives and its interactions with the wider environment.
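The “cause and effect” chain from data to stressed measures can be sketched in miniature. Everything below is invented for illustration: the portfolio positions, the risk parameters and the scenario shocks are hypothetical, and real stress testing uses far richer models than a simple expected-loss formula:

```python
# Minimal sketch of applying a scenario to a portfolio: stressed risk
# parameters feed stressed expected loss, which in turn feeds the capital
# and risk-appetite discussion. All numbers are illustrative.

portfolio = [
    {"segment": "corporate", "ead": 100.0, "pd": 0.02, "lgd": 0.45},
    {"segment": "retail",    "ead": 250.0, "pd": 0.01, "lgd": 0.30},
]

# A scenario here is a set of multiplicative shocks to PD and LGD per segment.
adverse_scenario = {"corporate": {"pd": 2.5, "lgd": 1.2},
                    "retail":    {"pd": 3.0, "lgd": 1.1}}

def expected_loss(book, scenario=None):
    """Sum EAD * PD * LGD over the book, applying scenario shocks if given."""
    total = 0.0
    for pos in book:
        shock = (scenario or {}).get(pos["segment"], {})
        pd = pos["pd"] * shock.get("pd", 1.0)
        lgd = pos["lgd"] * shock.get("lgd", 1.0)
        total += pos["ead"] * pd * lgd
    return total

baseline = expected_loss(portfolio)                   # 100*0.02*0.45 + 250*0.01*0.30 = 1.65
stressed = expected_loss(portfolio, adverse_scenario) # rises to 5.175 under the shocks
```

The gap between the baseline and stressed figures is what feeds capital planning: applying many such scenarios across the full group-wide portfolio is exactly the exercise that depends on consistent, aggregated data.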
With the BCBS 239 January 2016 deadline for G-SIBs looming, such financial institutions are grappling with the demands of compliance. The industry progress reports
of January 2014 and January 2015 (BCBS 268 and BCBS 308: Progress in adopting the principles for effective risk data aggregation and risk reporting, Bank for International Settlements) suggest that many institutions have shortfalls in key areas, and that this situation persisted and, in some cases, deteriorated between the 2014 and 2015 updates. Furthermore, numerous domestic SIBs are now also required to comply, albeit over longer timescales. The details vary across countries, but the general theme is that SIBs, whether global or domestic, are gradually being pulled into this regime. This suggests increased impetus both up to and beyond January 2016, given that the principles imply a process of continuous improvement rather than a static finishing line. Firms nearer to compliance will be better positioned to leverage the resulting data improvements: further applications of analytics, and more efficient use of them, will materialize, leading in turn to more efficient risk management processes and decision making. This spans not only risk management but also wider applications of predictive analytics for the organization’s management.
The Portfolio Stress Testing Process
The more enlightened institutions will already have the main ingredients to construct a comprehensive, interactive and timely stress testing environment. The architecture will need to be constructed in such a way that all key risks across the firm are considered (see Figure 3).
Implications for Analytics
At present, most of an analyst’s time during financial institution stress testing exercises is often spent on manual data aggregation and validation checks, usually with several fixes and workarounds along the way. Far less time may be spent on the underlying model build, the application of the scenarios and the subsequent aggregation across the portfolio. This is especially cumbersome because the process is not a one-off: it involves multiple scenarios and numerous iterations of each. Convergence towards BCBS 239 should gradually shift this balance, so that the model build and validation process is better supported and key aspects of the stress testing environment can be automated. More comprehensive models can then be created and their implications evaluated more rapidly across the portfolio. This, in turn, enables an enhanced feedback loop (Figure 3) and further scenarios to be applied within the overall modeling process. Equally, should market conditions suddenly change, models can be revalidated or rebuilt more rapidly, with the results feeding through to portfolio-level outcomes.
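The manual validation work mentioned above is precisely the kind of step that lends itself to automation as data quality converges on the BCBS 239 principles. The sketch below, with invented field names and tolerances, illustrates typical pre-model checks: required fields, reconciliation to a control total, and staleness flags:

```python
# Hedged sketch of automated pre-model data checks: completeness,
# reconciliation to a ledger control total, and staleness flags,
# run before any scenario is applied. Field names are illustrative.

from datetime import date

def validate(records, ledger_total, as_of, tolerance=0.01, max_age_days=5):
    """Return a list of issues found; an empty list means the feed passes."""
    issues = []
    required = {"exposure_id", "amount", "reported_on"}
    for r in records:
        missing = required - r.keys()
        if missing:
            issues.append(f"{r.get('exposure_id', '?')}: missing {sorted(missing)}")
    aggregated = sum(r.get("amount", 0.0) for r in records)
    if abs(aggregated - ledger_total) > tolerance:
        issues.append(f"reconciliation break: {aggregated} vs ledger {ledger_total}")
    for r in records:
        reported = r.get("reported_on")
        if reported and (as_of - reported).days > max_age_days:
            issues.append(f"{r['exposure_id']}: stale data ({reported})")
    return issues

feed = [
    {"exposure_id": "E1", "amount": 10.0, "reported_on": date(2015, 6, 1)},
    {"exposure_id": "E2", "amount": 5.0},  # missing reporting date
]
print(validate(feed, ledger_total=15.0, as_of=date(2015, 6, 3)))  # flags E2's gap
```

Once such checks run automatically on every refresh, analyst time shifts from hunting data breaks to building and interpreting models, which is the balance the convergence towards BCBS 239 is intended to achieve.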
The industry is facing a wide variety of regulatory and marketplace demands, which pose significant challenges for financial institutions’ risk management and strategic planning processes. Many of these rely on the underlying risk analytics and their application to the portfolio via stress testing exercises. Applying these analytics to such differing requirements is presently cumbersome, both in the model build process and in the subsequent analyses using those models. However, the common themes across the regulatory requirements, and institutions’ convergence over time on the principles of BCBS 239, will ease these processes and open up opportunities for enhanced analytics with more diverse applications, resulting in better insights for management and improved oversight from regulators.