The Changing Face of Risk Management in 2020

During the last decade, the conventional approach to risk management in financial institutions has changed dramatically. In the early nineties, the role of risk management was primarily to monitor and limit potential losses in trading and banking book positions. The prevailing paradigm was to ‘buy-and-hold’ a position in the book until maturity.

The way in which financial institutions conduct their business has since shifted to a short-term orientation. Trading book positions have always had predominantly short investment horizons. Today, sufficient liquidity makes it equally possible to off-load banking book positions in secondary markets, thereby making ‘buy-and-hold’ strategies dispensable.

The off-loading may be done either by selling the position itself in secondary markets or by hedging the risky part through credit derivatives. Risk is no longer a characteristic that must simply be controlled and limited, but rather something that may be retained in a position as long as it holds its value, until it can be sold for the right price. Risk is also a characteristic that may be bought or sold in its own right as a traded commodity.

The development described here has changed the role of risk management drastically. Instead of a middle-office function focused primarily on end-of-day reporting, risk-related calculation and profiling are now increasingly a significant part of the overall pricing process for financial instruments. One consequence is that the demand for overnight batch processing has been replaced by demands for on-line, real-time calculations with very quick response times. Meeting these demands is a major challenge for risk systems architecture, particularly for systems built on a data warehouse.

There are other demands on modern risk systems architecture. Financial instruments are becoming increasingly complex. Highly structured derivative products are becoming standard in investment and trading portfolios.

Analytical, sensitivity-based and covariance-matrix-based approaches do not allow for the degree of precision desired when assessing the associated risk; to produce meaningful figures with the desired precision, a simulation-based approach is warranted. However, with most available risk systems, the calculation time for a bank-wide simulation with five thousand paths, for example, is likely to run into several days.
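To make the contrast concrete, here is a minimal sketch of a full-revaluation Monte Carlo risk calculation. The single-risk-factor model, the parameter values and the `revalue()` pricing stub are hypothetical stand-ins for a real pricing library:

```python
# A minimal sketch of full-revaluation Monte Carlo VaR, assuming a single
# risk factor following geometric Brownian motion; the position size,
# parameter values and revalue() pricer are hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(seed=42)

S0, mu, sigma, horizon = 100.0, 0.0, 0.25, 10 / 250  # spot, drift, vol, 10 days
n_paths = 5_000

# Simulate risk-factor scenarios at the horizon.
shocks = rng.standard_normal(n_paths)
S_h = S0 * np.exp((mu - 0.5 * sigma**2) * horizon
                  + sigma * np.sqrt(horizon) * shocks)

def revalue(spot):
    """Hypothetical full-revaluation pricer: a long position of 1,000 units."""
    return 1_000.0 * spot

pnl = revalue(S_h) - revalue(S0)   # one profit-and-loss figure per path
var_99 = -np.percentile(pnl, 1)    # 99% Value-at-Risk from the P&L distribution
print(f"99% 10-day VaR: {var_99:,.0f}")
```

Every transaction in the portfolio must be revalued on every path, which is precisely why bank-wide runs become so expensive.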

In view of such performance implications, historical simulation is widely used by banks, even though the superiority of full Monte Carlo simulation as a risk assessment tool is well documented. In business today there is a demand to deliver superior risk figures in real time, and as a result the demands on risk systems are more stringent.
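For comparison, historical simulation replays observed market moves against today's position instead of generating scenarios from a model; a minimal sketch, with randomly generated returns standing in for real market history:

```python
# A minimal historical-simulation sketch: apply each observed historical
# return directly to today's position. The return series is randomly
# generated here purely as a stand-in for real market data.
import numpy as np

rng = np.random.default_rng(seed=7)
hist_returns = rng.normal(0.0, 0.015, size=500)  # stand-in for 500 daily returns

position_value = 1_000_000.0
pnl = position_value * hist_returns              # one P&L per historical day
var_99 = -np.percentile(pnl, 1)
print(f"99% 1-day historical VaR: {var_99:,.0f}")
```

The appeal is obvious: one pass over a few hundred historical days is far cheaper than thousands of model-generated paths.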

If, for example, one has to calculate potential future exposure for credit risk, the simulation must be extended over at least ten time nodes, increasing response times by a factor of ten.
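The following sketch illustrates why: computing potential future exposure means repeating the scenario valuation at each time node and taking a high percentile of the positive exposure at each one. The exposure model here is a purely illustrative placeholder, not an actual pricer:

```python
# A sketch of potential future exposure (PFE): the simulation is repeated
# at ten time nodes, taking the 95th percentile of exposure at each node.
# The mark-to-market model below is a hypothetical placeholder.
import numpy as np

rng = np.random.default_rng(seed=1)
n_paths = 5_000
nodes = np.linspace(0.5, 5.0, 10)        # ten semi-annual time nodes (years)

for t in nodes:
    # Assumed MtM distribution whose dispersion widens with sqrt(t).
    mtm = rng.normal(0.0, 100_000.0 * np.sqrt(t), size=n_paths)
    exposure = np.maximum(mtm, 0.0)      # only positive MtM is credit exposure
    pfe = np.percentile(exposure, 95)
    print(f"t={t:.1f}y  95% PFE: {pfe:,.0f}")
```

Ten nodes means ten full revaluation sweeps, hence the factor-of-ten increase in response time.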

A new generation of risk systems architecture is required to cope with these demands. The typical monolithic architectures that sit on top of data warehouses cannot be scaled up in the manner required. Massively parallel computation is mandatory. In addition, risk systems must be supplied with real-time data feeds, and the necessary pre-deal limit-check functionality must be supported through incremental simulations.

A clear vision exists for this next generation of risk architectures. They have to be event-driven systems with service-oriented frameworks. The individual calculation services are coarse-grained components, each of which performs a complete step of the business logic, with communication between these components managed via a message broker and a middleware backbone.

The message broker can allocate the necessary calculations to multiple instances of the same service in parallel and takes care of dynamic load balancing between machines, resulting in nearly unlimited linear scalability. Services can be executed either on large multi-processor systems or on a large network of single-processor boxes such as PCs.
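As a toy illustration of this pattern, the sketch below uses Python's multiprocessing pool as a stand-in for the message broker and middleware backbone; the scenario-valuation service is a hypothetical placeholder:

```python
# A toy stand-in for broker-managed parallelism: scenario valuations are
# farmed out to worker processes, with the pool handling load balancing.
# A real system would use a message broker across machines, not a local pool.
import numpy as np
from multiprocessing import Pool

def value_scenario(seed):
    """One coarse-grained calculation service: revalue the portfolio
    under a single simulated scenario (placeholder arithmetic)."""
    rng = np.random.default_rng(seed)
    spot = 100.0 * np.exp(rng.normal(0.0, 0.02))
    return 1_000.0 * spot            # hypothetical portfolio value

if __name__ == "__main__":
    with Pool() as pool:             # dynamic load balancing over workers
        values = pool.map(value_scenario, range(1_000))
    print(f"Mean scenario value: {np.mean(values):,.0f}")
```

Because each scenario valuation is independent, adding machines scales throughput almost linearly, which is the property the broker-based design exploits.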

High-performance systems today can already value more than fifty thousand transactions per second, even on a single-CPU machine. This means a market risk calculation for a portfolio of a hundred thousand transactions with one thousand scenarios can be performed within approximately half an hour.

Spreading the calculations across a PC farm of four hundred boxes brings the response time down to a mere five seconds, a reasonably acceptable figure for OTC and other non-electronic trading.
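The arithmetic behind these figures is worth spelling out:

```python
# Back-of-the-envelope arithmetic for the figures quoted above.
transactions, scenarios = 100_000, 1_000
valuations_per_second = 50_000                 # single-CPU throughput

total_valuations = transactions * scenarios    # 100 million revaluations
single_cpu_seconds = total_valuations / valuations_per_second
print(single_cpu_seconds / 60)                 # ~33 minutes on one CPU

farm_size = 400
print(single_cpu_seconds / farm_size)          # ~5 seconds across the farm
```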

However, there is still room for improvement. It is not necessary to re-simulate the whole portfolio for every new or trial transaction. Incremental calculations are possible if the results of the individual simulation paths are stored, either at a transaction or a portfolio/sub-portfolio level.

In addition, either the simulated risk-factor scenarios themselves must be stored, or the parameters necessary to reproduce them must be retained. If these prerequisites are fulfilled, the response time for an incremental risk calculation, for market as well as credit risk, falls to less than a second.
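A minimal sketch of such an incremental calculation, assuming per-path portfolio results have been stored and the scenario seed retained; all names and the one-line pricer are hypothetical:

```python
# Incremental risk calculation under the prerequisites above: per-path
# portfolio P&L is stored, and the scenarios can be reproduced from a
# retained seed, so a trial trade needs only its own per-path valuation.
import numpy as np

n_paths = 5_000

# Stored from the last full simulation: one portfolio P&L per path
# (randomly generated here as a stand-in for real stored results).
stored_portfolio_pnl = np.random.default_rng(seed=3).normal(
    0.0, 50_000.0, size=n_paths)

# Reproduce the same scenarios from the retained seed and value only
# the new trade on them (placeholder pricer: 500 units, spot 100).
scenario_rng = np.random.default_rng(seed=99)
spot_moves = scenario_rng.normal(0.0, 0.02, size=n_paths)
new_trade_pnl = 500.0 * 100.0 * spot_moves

# Incremental VaR: add the new per-path results, re-take the quantile.
var_before = -np.percentile(stored_portfolio_pnl, 1)
var_after = -np.percentile(stored_portfolio_pnl + new_trade_pnl, 1)
print(f"Incremental 99% VaR impact: {var_after - var_before:,.0f}")
```

The expensive part, revaluing the whole portfolio, never recurs: only the candidate trade is priced on the stored paths, which is why sub-second pre-deal checks become feasible.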

Such response times are far beyond the reach of present risk architectures: conventional risk systems typically require approximations to bring response times down to what risk simulations demand. For instance, one known approach is a variant of stratified sampling in which the simulation concentrates around a VaR value estimated through second-order approximations.

This approach works well at a portfolio or sub-portfolio level but does not allow for drill-down or flexible aggregation. Other systems use a second-order approximation for the value function of the transactions. This approach is very fast but lacks accuracy, especially in specific situations such as options close to maturity or non-linear instruments whose value functions have singularities.
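For reference, the second-order approximation amounts to a Taylor expansion of the value function in the risk-factor move; a sketch with assumed portfolio Greeks:

```python
# A sketch of the second-order (delta-gamma) approximation: fast, because
# it avoids full revaluation, but inaccurate where the value function is
# strongly non-linear. The Greeks below are assumed, illustrative inputs.
import numpy as np

rng = np.random.default_rng(seed=5)
dS = 100.0 * rng.normal(0.0, 0.02, size=5_000)   # simulated spot moves

delta, gamma = 600.0, -25.0                      # assumed portfolio Greeks
pnl_approx = delta * dS + 0.5 * gamma * dS**2    # Taylor expansion, 2nd order

var_99 = -np.percentile(pnl_approx, 1)
print(f"Approximate 99% VaR: {var_99:,.0f}")
```

The expansion is only as good as its first two derivatives: near a kink or singularity in the value function, no choice of delta and gamma reproduces the true P&L distribution.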

These approaches are acceptable for a broad range of users. However, for out-of-the-ordinary portfolios or investment/trading strategies, such as hedge funds, where the requirement may be to capitalise on the non-linearities and singularities, approximations may not be acceptable. The same holds for AAA-rated special purpose vehicles, where the rating agencies require very accurate risk calculations.

Migrating to the required new generation of system architecture is not an easy task. It requires huge development efforts that absorb all available resources. While some vendors will be unable to make these changes, this will be a chance for new players to enter the field.

Therefore, we have to expect that not only the face of risk management is changing, but also the landscape of software vendors active in this area.
