Professional Wealth Management

Change of focus to credit and operational risk

By PWM Editor

Earlier efforts by investment banks highlight the importance of data management as a central pillar of risk management operations. The resulting solution has applications far beyond risk management.

Risk management, involving measures of market risk such as value at risk (VaR), has been a highly visible topic for some time in the investment banking community. The Bank for International Settlements and national regulators have defined acceptable methodologies for calculating and reporting market risk, and an army of software vendors and consultants offer a wide range of solutions and services to meet these regulatory requirements. For many investment banks, the controversial VaR measurement is an issue that has already been addressed; the resulting processes are in a “production and refinement” stage. With VaR addressed, much of the effort within the investment banks today focuses on credit and operational risks.

Correct data

A clear trend within asset management and private banking is the move towards the adoption of some of the VaR methodologies and concepts implemented by the investment banks several years ago. Many asset managers and private banks are creating new risk management departments and mandating them with oversight responsibilities, using VaR as a basis for market risk measurement and management. For the software vendors and consultants this represents a second ride on the gravy train. For the asset managers and private banks, however, it is an opportunity to benefit from the experiences of an earlier generation of technology users. After all, it may be the early bird that gets the worm, but it is always the second mouse that gets the cheese.

Countless hours of conference discussion, miles of column inches and shelves of books have been devoted to extolling the virtues – or otherwise – of the numerous statistical methods and software systems that can provide an institution with a VaR solution. All of them, without exception, rely on correct and well-maintained data.

Vulnerable models

The collection and management of market data – that is, prices, identification codes, reference data and static data such as the terms and conditions for bonds – is a key function of any risk management architecture. The importance of this function is often overlooked or underestimated. Many of the models and methodologies used in market risk management are particularly vulnerable to poor-quality market data: many simply break down if a data point is missing. Moreover, spikes or errors in pricing data can dramatically overstate the volatility of a given instrument and hence the VaR associated with it, as the short sketch below illustrates. The costs of such errors in terms of capital adequacy and capital allocation are not trivial, so efforts to reduce them offer a rapid return on investment. The requirement for clean and correct data extends beyond the capture of the latest values, as most risk models require historical data, or data derived from historical data (e.g. correlation, volatility, beta, tracking error). It is inevitable that even the highest-quality data vendors will have errors in their data, and many regulators and financial institutions rightly view this as one of the principal operational risks facing financial institutions.
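To make the point concrete, the sketch below (not part of the original article, and not FAME code) estimates daily volatility and a simple parametric 99 per cent VaR from a price history, then repeats the calculation after a single erroneous price spike. The price series, position size and the normal-distribution VaR formula are illustrative assumptions only.

```python
# Illustrative only: how one bad price point inflates volatility and parametric VaR.
import math

def daily_returns(prices):
    return [(p1 - p0) / p0 for p0, p1 in zip(prices, prices[1:])]

def volatility(returns):
    mean = sum(returns) / len(returns)
    variance = sum((r - mean) ** 2 for r in returns) / (len(returns) - 1)
    return math.sqrt(variance)

def parametric_var(position_value, vol, z=2.33):
    # Simple normal 99% one-day VaR: position value * z-score * daily volatility.
    return position_value * z * vol

clean = [100 + 0.1 * i for i in range(60)]   # a quiet, slowly drifting price series
spiked = clean.copy()
spiked[30] = clean[30] * 1.5                 # one mis-captured price, 50% too high

for label, prices in (("clean", clean), ("spiked", spiked)):
    vol = volatility(daily_returns(prices))
    var_99 = parametric_var(1_000_000, vol)
    print(f"{label:7s} daily vol = {vol:.4%}  99% VaR on a 1m position = {var_99:,.0f}")
```

The single spike creates one large up-move and one large down-move in the return series, so the measured volatility, and with it the reported VaR and any capital held against it, is multiplied many times over.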
Many institutions are unwilling to leave their exposure to this operational risk in the hands of external data vendors, and several regulators are encouraging financial institutions to take direct control of, and responsibility for, the quality of the market data that their risk systems consume. The implication is that a vital component of any risk management system is a market database in which the necessary market data can be cleansed and its quality assured by the institution’s own personnel. Accurate histories can be maintained, derived values can be calculated and the whole risk management process can be performed in the knowledge that the quality of the underlying data is guaranteed.

Solutions

Creating and maintaining such a solution may sound a significant undertaking, and starting from scratch it would be. Fortunately the “second mouse rule” benefits the latest generation of institutions requiring market risk systems. FAME Information Services has spent the last 20 years providing historical market database solutions to the world’s financial, public sector and energy institutions. FAME’s clients range from investment banks, private banks, asset managers and insurance companies through to public sector institutions such as the European Central Bank, the Federal Reserve and the International Monetary Fund.

The FAME Data Manager (FDM), as the name suggests, is an off-the-shelf market database management system. Developed and refined over many years of working with FAME clients to meet their need for cleaned and filtered market databases, the FDM represents the sum of their experiences and requirements. The guiding principles of the solution are flexibility and transparency. The FDM is data-source independent, allowing the institution to define its preferred sources and the times and frequency of data capture. The FDM then processes this data according to logical and statistical rules defined by the user.

Cascading hierarchies of these rules allow the user to ensure that all errors and gaps in the data are captured and corrected. For example, a rule may define that for a given group of instruments data vendor A is the preferred source, and that if vendor A is unavailable or incorrect, vendor B or C should be used. If both are unavailable, the rule might stipulate the use of a proxy, for example mapping the price change from a B share, American depositary receipt or index. If these too are unavailable, the next step might be the generation of a value based on a statistical forecast or simply on the preceding day’s price.

Errors can be detected using a variety of logical or statistical rules, depending on the data in question. The most obvious method is the comparison of several sources for the same instrument; for corporate actions, static data and reference data this can be especially effective. However, several data vendors may use the same exchange as the original source of a price, so an error at the exchange may be replicated across several vendors. Further statistical checks can therefore be performed to ensure the “reasonability” of a particular value. A simplified sketch of such a cascading rule and reasonability check appears below.

In parallel with this data processing, an audit trail is generated. All raw data is retained, and a comprehensive, detailed log of all system and user actions is stored to ensure that every single data point can be traced back, through any changes, to its original source.
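By way of illustration only, the following sketch shows how such a cascading source rule with a basic reasonability check might look in code. It is not the FDM’s rule language or API; the vendor names, the 20 per cent day-on-day move threshold and the fallback to the previous day’s price are all assumptions made for the example.

```python
# Hypothetical sketch of a cascading data-sourcing rule with a simple reasonability check.
# It does not represent the FDM rule language; vendor names and thresholds are illustrative.

def reasonable(price, previous_close, max_move=0.20):
    """Accept a price only if it exists and moved less than max_move versus the prior close."""
    if price is None:
        return False
    if previous_close is None:
        return True
    return abs(price / previous_close - 1.0) <= max_move

def resolve_price(quotes, previous_close, proxy_price=None):
    """Walk the vendor hierarchy A -> B -> C, then a proxy, then carry the prior close forward."""
    for vendor in ("vendor_A", "vendor_B", "vendor_C"):
        price = quotes.get(vendor)
        if reasonable(price, previous_close):
            return price, vendor
    if reasonable(proxy_price, previous_close):
        return proxy_price, "proxy"
    return previous_close, "previous_close"   # last resort: yesterday's value

# Example: vendor A has a fat-finger spike, vendor B is missing, vendor C is usable.
quotes = {"vendor_A": 151.0, "vendor_B": None, "vendor_C": 101.2}
price, source = resolve_price(quotes, previous_close=100.0)
print(price, source)   # 101.2 vendor_C
```

In a production system each accepted value would also be written to the audit trail alongside the raw quotes and the rule that selected it, so that the choice can be traced back later.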
The FDM can then derive any values required by the risk methodology, such as correlation matrices and tracking errors, before all necessary data is copied directly into the main risk system. A short illustration of such derived values follows.
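As an indication of what such derived values involve (and not a description of how the FDM computes them), the sketch below calculates a correlation matrix and an annualised tracking error from two return series using standard definitions; the sample figures are invented, and daily data is assumed for the annualisation factor.

```python
# Illustrative derivation of a correlation matrix and tracking error from return series.
# Standard textbook definitions, not FDM internals; the sample returns are invented.
import numpy as np

fund_returns      = np.array([0.012, -0.004, 0.008, 0.003, -0.010, 0.006])
benchmark_returns = np.array([0.010, -0.002, 0.007, 0.004, -0.008, 0.005])

# Correlation matrix across the two series.
corr = np.corrcoef(fund_returns, benchmark_returns)

# Tracking error: standard deviation of active (fund minus benchmark) returns,
# annualised by sqrt(252) on the assumption that the inputs are daily.
active = fund_returns - benchmark_returns
tracking_error = active.std(ddof=1) * np.sqrt(252)

print(corr)
print(f"annualised tracking error: {tracking_error:.2%}")
```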

Back-office concerns

Market data is consumed in all areas of a financial institution, so the operational risk exposure associated with incorrect or missing data pervades the entire organisation. Several areas are especially vulnerable to poor-quality data and represent significant operational and reputational risks, notably back-office operations, accounting, portfolio management and fund administration. For fund administrators, one of the largest potential exposures is the operational risk of miscalculating a fund’s net asset value (NAV). Missed or incorrectly captured corporate actions, as the simple example below shows, can have dire implications for back-office and accounting operations. The FDM market database can be used to feed the entire organisation with quality-assured market data, removing one of the largest operational risks facing financial organisations.
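The figures below are entirely hypothetical, but they show how missing a single corporate action, here a 2-for-1 stock split, distorts a fund’s NAV per unit when the post-split price is applied to an unadjusted holding.

```python
# Hypothetical example: the NAV impact of a missed 2-for-1 stock split.
# All holdings, prices and the fund itself are invented for illustration.

shares_held_before_split = 10_000
other_assets = 1_200_000.0
units_in_issue = 100_000

# After a 2-for-1 split the holding doubles and the quoted price roughly halves.
price_after_split = 40.0

# Correct treatment: adjust the position for the split.
correct_value = (shares_held_before_split * 2) * price_after_split
correct_nav = (correct_value + other_assets) / units_in_issue

# Missed corporate action: the new (halved) price applied to the old share count.
wrong_value = shares_held_before_split * price_after_split
wrong_nav = (wrong_value + other_assets) / units_in_issue

print(f"correct NAV per unit: {correct_nav:.2f}")   # 20.00
print(f"stale NAV per unit:   {wrong_nav:.2f}")     # 16.00
```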
