Risks must be continually monitored, evaluated, and controlled by instrument, counterparty, transaction desk, and on a corporate-wide basis. This requires a comprehensive process, as well as the understanding that its proper execution leads to more effective management of exposure. Which is the best organizational form?
Some institutions have chosen a centralized approach, where risk management and the finance division responsible for economic capital allocation act in unison. Others prefer to assign primary responsibility to the individual business units. These units are expected to manage their risks by adhering to:
● Established internal policies
● Internal control guidelines
● Risk management milestones, and
● Financial staying power expressed in respectability capital.
Explicit internal policies, established by the board of directors, must emphasize the need for balancing risk by appropriate liquidity supported by both sides of the balance sheet (see section 3). Internal controls must be rigorous in all channels, from loans to investments, trading, and the management of inventoried positions in the banking book and trading book.
As has been explained on several occasions, risk management must be exercised by desk, instrument, counterparty, and any other variable important to the bank’s staying power. A crucial question to be answered in real time (see Chapter 15) is:
What can happen with the mispricing of inventoried derivatives contracts because of adverse movement in:
● Interest rates
● Exchange rates
● Or, changing market psychology?
Apart from credit risk, interest rate risk, and other exposures affecting the bank’s portfolio in its home country, foreign exchange risk can negatively impact the value of foreign assets and liabilities, damaging the bank’s balance sheet and P&L.
While adverse market effects are bound to happen, and nobody in the banking business can be immune to potential financial losses, management should be in charge at all times. This it can do only through a thoroughly studied system of internal controls which operate in parallel channels to lines of authority, like the sympathetic and parasympathetic systems in the human body.
A basic organizational principle is that the person who is in charge of trading should have no control over the back office. This is a well-known principle, but it is rarely observed. Even after the flagrant case of Nick Leeson in Barings’ Singapore office, which brought down the venerable bank, few financial institutions rethought:
● The structure of their internal controls, and
● The existing conflicts of interest between operations, accounting, and supervision.
Not only must parallel controls form a structured system able to assure independence of opinion, but control systems should also be regularly tested, both quantitatively and qualitatively. A good way of testing how the system of internal controls works is analysing credits for risk management. Here is a sample of critical questions:
● Are our credits diversified or concentrated in a few names?
● How are our credits distributed by counterparty? By currency? By interest rate? By maturity?
● What’s the pattern of our credits by credit officer? By branch? By foreign subsidiary?
There are plenty of crucial questions to be asked in testing how well the internal controls work: Is there an abnormal number of ‘weak credits’? How much of the bank’s loans business is done with the same counterparty? How much faster is the derivatives business growing than the more classical business lines like loans, investments, and personal banking?
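Questions like those above — diversification versus concentration, distribution by counterparty — lend themselves to a simple quantitative screen. The following is a minimal sketch, assuming a hypothetical loan book and an illustrative single-name limit; the names, figures, and thresholds are not from the text, and a real control function would use far richer criteria.

```python
# Sketch: screening a loan book for name concentration.
# Data and the 25% single-name limit are illustrative assumptions.
from collections import defaultdict

def concentration_report(loans, single_name_limit=0.25):
    """loans: list of (counterparty, exposure) pairs."""
    by_name = defaultdict(float)
    for name, exposure in loans:
        by_name[name] += exposure
    total = sum(by_name.values())
    shares = {name: exposure / total for name, exposure in by_name.items()}
    # Herfindahl-Hirschman index: sum of squared shares; equals 1/N for
    # N equally sized names, and 1.0 for full concentration in one name.
    hhi = sum(s * s for s in shares.values())
    flags = [name for name, s in shares.items() if s > single_name_limit]
    return hhi, flags

loans = [("Acme", 40.0), ("Bravo", 25.0), ("Cobalt", 20.0), ("Delta", 15.0)]
hhi, flags = concentration_report(loans)
```

Run periodically by an independent control unit, such a screen turns the qualitative question ‘are our credits concentrated in a few names?’ into a number that can be tracked over time.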
Other crucial queries for internal control and risk management reasons are oriented towards the traders, loans officers, and senior executives. Is the same credit officer dealing with the same counterparty all the time? Is the same dealer following a similar pattern with the same counterparty in regard to derivative financial instruments?
One of the major contributions of Basel II is the awareness it brought to the banking industry about the aftermath of operational risk, and the capital requirement to confront it.10 Figure 14.3 brings three patterns to the reader’s attention in regard to the risk exposure characteristic of a credit institution:
● The more classical one, where credit risk accounted for two-thirds of assumed exposure, and market risk for the balance.
● The portfolio heavily weighted in derivatives, where market risk is in excess of credit risk.
● The pattern of the global money centre bank, where business risk is king, because sprawling operations in 80, 100, or more countries make the institution most vulnerable to political events and to litigation resulting from its own mistakes.
[Figure: three pie charts. Classical pattern: credit risk two-thirds, market risk one-third. High derivatives exposure: credit risk below 50%, market risk above 50%. Accounting for business risk and operational risk: business risk 50%, credit risk 30%, market risk 15%, operational risk 5%. Just note the differences.]
Figure 14.3 Risk exposure according to estimates made by banks
Given its potential magnitude, business risk must be controlled. Respectability capital is the way of doing so at the financial end. The able management of credit risk, market risk, and operational risk is also important. While legal risk has been part of operational risk, its recent magnitude and business impact lead me to classify it as part of business risk.
In mid-June 2005, JP Morgan Chase agreed to pay $2.2 billion to settle its part in a class action lawsuit, led by the University of California, that accused several banks of aiding Enron in defrauding investors before the energy trader went bankrupt in December 2001. A week earlier, Citigroup said it would pay $2.0 billion to settle its part in the suit. Both banks denied any wrongdoing.
Business risk of that magnitude brings our discussion back to the query asked in the opening paragraph of this section about centralization vs decentralization of crucial functions. The arguments for decentralization revolve around the fact that a most important tool in any process is on-the-job experience and judgment, enhanced through direct and constant communication. The pros say that while awareness of risk must be continuously emphasized throughout the company, local exercise of risk control can provide a clear and simple statement as to what should not be done in committing capital.
The arguments for the centralization of risk control start with the fact that risk policies and procedures must be clear, homogeneous, and well understood. If this responsibility is dispersed, inevitably there will be heterogeneity and miscommunication. Some local risk managers may not consider the unexpected, and therefore they may not constantly:
● Probe for potential problems
● Test for weaknesses, and
● Identify potential for loss.
To my mind, whether the chosen solution is centralization or decentralization of risk management and capital adequacy duties, the system to be established should be flexible, permitting adaptation to changing environments, including the evolving goals of the institution. Whichever organizational solution is chosen, the key objective is that of minimizing the possibility of incurring exposures outside the board’s and CEO’s guidelines, and supervisory rules. And because there will always be risks arising from rare or extreme events, the bank must have a policy of being adequately equipped with respectability capital.
Notes
1 D.N. Chorafas, The 1996 Market Risk Amendment: Understanding the Marking-to-Model and Value-at-Risk, McGraw-Hill, Burr Ridge, IL, 1998.
2 D.N. Chorafas, Economic Capital Allocation with Basel II: Cost and Benefit Analysis, Butterworth-Heinemann, London and Boston, 2004.
3 The Joint Forum ‘Risk Management Practices and Regulatory Capital’, BIS, November 2001.
4 ECB, Monthly Bulletin, February 2005.
5 D.N. Chorafas, After Basel II: Assuring Compliance and Smoothing the Rough Edges, Lafferty/VRL Publishing, London, 2005.
6 Basel Committee, The Joint Forum: Trends in Risk Integration and Aggregation, BIS, August 2003.
7 EIR, 14 January 2005.
8 BusinessWeek, 21 February 2005.
9 D.N. Chorafas, Economic Capital Allocation with Basel II: Cost and Benefit Analysis, Butterworth-Heinemann, London and Boston, 2004.
10 D.N. Chorafas, Operational Risk Control with Basel II: Basic Principles and Capital Requirements, Butterworth-Heinemann, London and Boston, 2004.
15
The Real-Time Management Report
1. Introduction
Typically, at the higher organizational layers of a corporation information gets distilled and reported in summary and/or by exception. Emphasis is on accuracy rather than on precision. By contrast, great detail and precision characterize the information requirements of the middle layer. The advent of on-line real-time response to management information requirements means that this process is about to change.
Whether we talk of budgets, balance sheets, P&L, or any other type of financial information, a basic characteristic of interactive information technology, at both the top and middle layer, is that database access should be ad hoc. Response must be given in real time with fully updated information:
● Using visualization, by turning numbers into figures
● Having built-in intelligence to identify exceptions and outliers
● Detecting evolving features and patterns, such as trends, spikes, head-and-shoulders formations, and confidence intervals.
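The ‘built-in intelligence to identify exceptions and outliers’ mentioned above can be sketched very simply. The following is a minimal illustration, assuming a hypothetical series of daily P&L figures and a standard-deviation rule; the data and the multiplier are illustrative choices, not from the text, and a production system would use more robust statistics.

```python
# Sketch: flag exceptions in a series of reported figures.
# Sample data and the k-sigma threshold are illustrative assumptions.
import statistics

def flag_outliers(values, k=2.0):
    """Return the values lying more than k standard deviations from the mean."""
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)
    return [v for v in values if abs(v - mean) > k * sd]

daily_pnl = [1.1, 0.9, 1.0, 1.2, 0.8, 1.0, 9.5]  # one suspicious spike
exceptions = flag_outliers(daily_pnl, k=2.0)
```

In an interactive management report, figures flagged this way would be highlighted for drill-down rather than buried in a summary.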
Knowledge artifacts are necessary to sort, combine, and prove transactions, as well as to validate general ledger account numbers and pinpoint personal responsibilities. Filters should be used in connection with all entries, including accounting, financial, statistical, and other issues. Data input should be on-line, under the ‘one entry, many uses’ principle.
A great deal of attention must be paid to system design. Parametric solutions permit flexible transfer of information from and to various applications. High technology should be used as a competitive weapon, to promote the automation of accounting operations. A modern organization cannot afford the luxury of mediocre technology or of obsolete solutions.
By emphasizing the benefits to be obtained from fully interactive approaches, and by assuring that these are properly implemented, an able management provides itself with the means to develop and sustain successful business operations. A good example is real-time balance sheet reporting, the theme of this chapter. Financial models should be designed and implemented with the aim of bridging the gap that often exists between:
● Those people whose job is to develop and supply knowledge, in order to enhance the competitiveness of the firm, and
● Those who must manage output, assuring an uninterrupted flow of high-quality products or services – and of reliable financial information.
Increasingly, the distinction between well-managed and poorly managed entities lies in the ability of the former to experiment prior to commitments. Enriched through real-time response, a dynamic IT system can be instrumental in assuring that the enterprise does not get out of control. A significant part of what-if experimentation with balance sheets (see section 2) rests on the foregoing requirements, which have been met by top-tier banks.
For instance, since the late 1990s, Boston’s State Street Bank has been able to produce a virtual balance sheet (VBS) for its world-wide operations within 30 minutes (in fact, since then, the time lag has shrunk). A virtual balance sheet is management accounting, not financial accounting (see Part 1). It has all the characteristics of a classical balance sheet, but it accepts up to 4% error as the price for immediate response.
● This is not acceptable for financial reporting purposes
● But it is perfectly alright for an internal accounting management information system (IAMIS).
Notice that a level of accuracy of ±4% has nothing to do with ‘cooking the books’. This is fast-response, internal management information. For example, when Saddam Hussein invaded Kuwait on 2 August 1990, the top management of Bankers Trust was able to reposition the bank on the right side of the B/S, by having a global virtual balance sheet at short notice, experimenting on alternatives, and using the time window offered by the bank’s London operations before the New York market opened for business.
2. ‘What-if’ experimentation with balance sheets
The serious user of financial or accounting statements is not a passive reader of figures, who does so just to kill time. He or she will typically ask a series of questions aimed at answering professional worries, or at providing the insight necessary for important decisions. Meaningful questions are never asked in the abstract:
● They typically reflect a specific situation
● Inquire into what, when, who, why, or the way in which things evolve.
Many questions have no straight answer. A factual and documented response to them requires investigation, at least of the what-if type. What-if experimentation started in the early 1980s with the spreadsheet, and since that time it has made great strides. More than two decades down the line, experimentation has become a ‘must’. As an example, we will follow a scenario on the insurance industry.
Legitimate questions in evaluating projected profitability are of the kind:
● What if inflation rises by x% over the next two years but premiums only increase half that much?
● What are the effects on the company if the probability of natural and/or man-made disaster rises by y% but, because of competition, premiums remain the same as in the last period?
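The first of these questions can be sketched with a few lines of arithmetic. The following is a minimal illustration, assuming hypothetical starting figures for premiums, claims, and expenses; the numbers, the two-year horizon, and the function name are illustrative choices, not drawn from any actual insurer’s books.

```python
# Sketch of the first 'what-if': inflation raises claims and expenses by
# x% a year, while premiums rise only half as fast. All figures are
# hypothetical.
def two_year_underwriting_result(premiums, claims, expenses, inflation):
    premium_growth = inflation / 2.0  # premiums lag inflation, per the scenario
    for _ in range(2):
        premiums *= 1.0 + premium_growth
        claims *= 1.0 + inflation
        expenses *= 1.0 + inflation
    return premiums - claims - expenses

base = two_year_underwriting_result(100.0, 70.0, 20.0, inflation=0.0)
stressed = two_year_underwriting_result(100.0, 70.0, 20.0, inflation=0.06)
```

Even this toy calculation makes the point of the section: a seemingly modest lag between premium growth and cost inflation compounds, and the underwriting result erodes visibly within two years.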
Some of the answers to queries of this kind, particularly if there is precedent for x, y, and the other conditions, can be provided by information in the database handled through a spreadsheet. More sophisticated, and better documented, replies will require mathematical models, which map into the computer the:
● Range of operations of the company
● Market and the way the company interacts with its market
● Composition of the company’s investment portfolio in fixed income and equities
● Risk-sensitivity of the company’s insurance products and effect of rise in probabilities.
Other modules of an experimental system should simulate the money flows that arise as the result of risks taken in underwriting. These flows typically include premium receipts, claims payments, investment of funds, investment income, expenses, taxes, and dividends. Most of the factors outlined in the preceding paragraphs impact on the:
● Balance sheet, and
● Profit and loss statement.
Experimentation on different probabilities of underwriting risk and return is necessary because the net result of all money flows occurring in a given period of time is ultimately reflected in an insurance company’s assets and liabilities. In fact, this statement is valid for any firm, though each has its own ways and means of management analysis for accounting and financial reasons.
Changes occurring in the balance sheet and P&L that result from money flows must be calculated according to IFRS. But for management accounting purposes there exist considerable degrees of freedom, and real-time interactive reports should preferably be structured in a way that allows changing some of the parameters on-line, and experimenting further with the interim results obtained.
Clearly, this approach requires considerable system support. Figure 15.1 presents an example from an insurance application which capitalized on networked databases to provide a rich environment for experimentation. Important elements in this process have been aggregate flows such as underwriting profits and total earnings.
● The primary flows, which contribute to aggregate flows, are generally calculated from simple basic equations and numerical parameters specified by the experimenter.
[Figure: a workstation with on-line access to networked databases covering customer relationships, reports on claims, premium allocation budget (historical and flexible), profitability reporting, contractual clauses, cash flow and other financial data, standards, and profit goals.]
Figure 15.1 ‘What-if’ experimentation requires on-line access to databases and artifacts which simulate or optimize business conditions
● This contrasts with econometric models, which attempt to forecast the values of such items as gross profits by relating them directly to important economic indicators and their own past values.
Another major domain where experimentation assists management in the insurance industry is claims processing. This is a highly repetitive job, but each case has its own characteristics, and estimation needs a fair amount of knowledge and calculation. Hence, it is an ideal application domain for expert systems. However, to effectively contribute to profitable results:
● The knowledge-based artifact must operate on-line and access rich databases, and
● Be enriched by knowledge engineering tools that go beyond the capabilities of early constructs, utilizing genetic algorithms, neural networks, and fuzzy sets (more on this in section 3).
A similar approach to that shown in Figure 15.1 can be used in connection with other financial activities, such as loans. In the mid-1980s, Japan’s Mitsui Bank was one of the first to build an expert system for scoring company loans, which significantly improved upon past practices. It analysed balance sheets, using public databases and Mitsui’s own data warehouse. It also compared companies applying for loans to standard credit criteria, and to one another. The model reflected on:
● Company profits
● Acid test (current assets, net of inventory, over current liabilities)
● Liquidity and cash flow
● Long-term assets
● Capital ratio, and
● Future business perspectives.
Other critical variables, or sensitivities, used by the Mitsui model included company size, annual growth, productivity, and qualitative criteria such as quality of management. This has been one of the early success stories of system solutions capitalizing on knowledge engineering.
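The general shape of such a loan-scoring system can be sketched as a rule-based scorecard over the ratios listed above. This is not Mitsui’s actual model: the weights, thresholds, the sample applicant, and the function names below are all illustrative assumptions, meant only to show how standard credit criteria translate into a mechanical first screen.

```python
# Hedged sketch of a rule-based loan scorecard; all weights and
# thresholds are illustrative, not the bank's actual criteria.
def score_applicant(profit_margin, current_ratio, cash_flow_ratio, capital_ratio):
    score = 0
    if profit_margin > 0.05:
        score += 25          # profitability
    if current_ratio > 1.5:
        score += 25          # liquidity screen
    if cash_flow_ratio > 0.10:
        score += 25          # cash generation
    if capital_ratio > 0.08:
        score += 25          # capital strength
    return score

applicant = dict(profit_margin=0.07, current_ratio=1.8,
                 cash_flow_ratio=0.12, capital_ratio=0.06)
score = score_applicant(**applicant)
decision = "approve" if score >= 75 else "refer to credit officer"
```

The point of the expert-system approach was precisely this: routine cases clear the mechanical screen automatically, while borderline scores are routed to a credit officer, whose judgment the system supports rather than replaces.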
Whether in the sciences, in engineering, in insurance, or in banking, experimen- tation is a culture that characterizes the person willing and able to challenge the obvious, as distinct from the bureaucrat who can only follow the beaten path.
● The experimenter is not after petty details.
● What he or she wants is better insight and foresight, to be obtained through investigation.
The cornerstone of all experimentation is a principle formulated by Dr Enrico Fermi, the nuclear physicist. The Fermi principle states that if our assumptions make sense, the errors they possibly contain will average out; they will not always load the results on the same side, thereby helping the process we put in motion keep a sense of balance, and therefore of accuracy.
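The Fermi principle can be demonstrated numerically. The sketch below, with hypothetical factors and an assumed ±20% unbiased error on each, shows that when independent errors are equally likely to fall on either side, the average error of the combined estimate stays close to zero rather than piling up.

```python
# Illustration of the Fermi principle: independent, unbiased errors in
# the factors of an estimate tend to cancel. Factors and error band
# are illustrative assumptions.
import random

random.seed(42)
true_factors = [10.0, 4.0, 25.0]    # hypothetical components of an estimate
true_value = 1.0
for f in true_factors:
    true_value *= f                  # 10 * 4 * 25 = 1000

relative_errors = []
for _ in range(10_000):
    estimate = 1.0
    for f in true_factors:
        # each factor guessed with an unbiased error of up to +/-20%
        estimate *= f * random.uniform(0.8, 1.2)
    relative_errors.append(estimate / true_value - 1.0)

mean_error = sum(relative_errors) / len(relative_errors)
```

Any single estimate may be off by a fair margin, but across many sensible guesses the mean error is small: exactly the sense of balance, and therefore of accuracy, that the principle promises.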
The Fermi concept is very important to management, both in exploiting business opportunities and in controlling exposure. Our bank’s risk managers never really have all the data they need when a decision is made. The balance is provided through reasonable assumptions. The careful reader will also note that both internal and external auditors operate in a similar way. Therefore the method has polyvalent applications. A person who never made a mistake never did anything; but a person who has no control over his/her actions is an even greater disaster.