
A V Rajwade: Uncertainty and mathematical models

WORLD MONEY

A V Rajwade New Delhi
 
When I did my post-graduation in statistics in the mid-1950s, the only calculation aid we had was a Facit mechanical calculating machine. It was often a laborious task to compute basic measures of central tendency and dispersion like mean and variance. Things have, of course, progressed since then.
 
There is another major change. At that time, the principal uses of statistical methods we were taught were in sampling, quality control, data tabulation and analysis, and so on. Although Harry Markowitz's portfolio theory was propounded in 1952, I do not recall being taught anything about the possible uses of statistical methods in financial markets.
 
Today, probability theory, confidence levels and statistical methods are at the heart of financial economics and risk measurement. Even regulators use statistical models to prescribe capital norms for financial intermediaries. And the availability of PCs and spreadsheets permits even the most elaborate calculations in a flash.
 
While I have been using spreadsheets for the past two decades (remember Lotus 1-2-3?), some recent experiences have started me thinking about the weaknesses of the spreadsheet culture. Specifically, do fast calculations, accurate to the nth decimal, create an aura of reliability that leads to unquestioning acceptance of the output?
 
In turn, does this render redundant the need to compare the output with rough independent estimates, or even to check the direction in which the numbers should move? The problem is even greater when one is using a readymade IT system rather than a spreadsheet.
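 
A rough independent check need not be elaborate. The sketch below is purely illustrative (the bond, its numbers and the function names are my own assumptions, not anything from a real system): a price produced by a spreadsheet or pricing system is compared with a back-of-envelope estimate, and the direction of the change is verified.

```python
# Illustrative only: sanity-checking a "system" bond price against a
# back-of-envelope estimate.  All figures are hypothetical.

def bond_price(face, coupon_rate, yield_rate, years):
    """Full present-value calculation, standing in for the spreadsheet output."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + yield_rate) ** t for t in range(1, years + 1))
    pv_face = face / (1 + yield_rate) ** years
    return pv_coupons + pv_face

# "System" price of a 5-year, 7% coupon bond at an 8% yield.
full = bond_price(100, 0.07, 0.08, 5)

# Rough independent estimate: the yield exceeds the coupon by about 1% a year
# for 5 years, so the price should sit roughly 5 points below par.
rough = 100 - 5 * (0.08 - 0.07) * 100

print(f"system price = {full:.2f}, back-of-envelope = {rough:.1f}")

# Direction check: a higher yield must give a lower price.
assert bond_price(100, 0.07, 0.09, 5) < full
```

If the two numbers disagree wildly, or the price moves the wrong way when the yield rises, it is the inputs or the model that deserve scrutiny, not the rough estimate.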
 
Occasionally, one finds a disturbing tendency on the part of dealers and other users to input the numbers prompted by the menu of the IT system and accept the outcome as the gospel truth.
 
The whole exercise has become so simple that not many seem to take the trouble to look at, or understand from first principles, the underlying pricing concepts and algorithms, preferring instead to rely on readymade models.
 
Since one cannot know exactly what happens inside the "Black Box" of the system, one is seduced into overlooking the need to have some idea of what the answers should be, or what they imply: such reliance solely on the system's output can be dangerous.
 
Internationally, the most celebrated case of the failure of models is, of course, the spectacular collapse in 1998 of Long Term Capital Management, a hedge fund so egregiously leveraged that its likely failure posed a systemic risk and forced the US Federal Reserve to organise a rescue operation.
 
And LTCM could hardly be accused of not knowing what it was doing or not understanding the underlying models: after all, it had Nobel laureates on its board, and they were involved in developing its risk management systems.
 
The provocation for recalling this chain from Facit machines to LTCM comes from recent reports of the huge losses apparently sustained by some funds, this time in the market for collateralised debt obligations (CDOs). Low interest rates in the major markets have intensified investors' search for higher yields.
 
This, in turn, has led to investment banks designing more complex credit derivatives. Since the supply of actual debt instruments to form the pool of debt obligations underlying a CDO is limited, synthetic CDOs have gained popularity. The underlying is not the actual debt instruments but, say, credit default swaps on the instruments.
 
Going further, so-called CDO-squared (CDO²) instruments have come into vogue: the underlying is not a pool of debt securities, or even of credit default swaps, but a portfolio of CDO issues.
 
This makes the pricing of individual tranches even more difficult, particularly when ratings change. The recent downgrade of General Motors and Ford debt paper to junk grade has led to huge price swings, particularly in the riskier tranches of CDOs. Those speculating on a narrowing of the spreads seem to have suffered large losses.
 
All such complex plays require sophisticated mathematical models, and anyone who surrenders common sense to faith in the "Black Box" runs a real risk. The pricing of the different credit tranches typically depends on assumptions about the probability of default, the loss given default, the correlations amongst the various instruments, and so on.
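 
To make the point concrete, the sketch below is my own illustration, not any fund's or vendor's model: a simple one-factor simulation of correlated defaults, with entirely hypothetical parameters, that computes the expected loss on a mezzanine tranche under two different correlation assumptions.

```python
# Illustrative sketch: expected loss on a CDO tranche under a one-factor
# Gaussian model.  Every parameter (pool size, default probability, loss
# given default, correlation, tranche boundaries) is hypothetical.
import math
import random

def norm_inv(p, lo=-8.0, hi=8.0):
    """Crude inverse of the standard normal CDF, by bisection."""
    for _ in range(80):
        mid = (lo + hi) / 2
        if 0.5 * math.erfc(-mid / math.sqrt(2)) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def tranche_expected_loss(n_names=100, pd=0.02, lgd=0.6, rho=0.3,
                          attach=0.03, detach=0.07, n_sims=5000, seed=1):
    """Expected loss, as a fraction of tranche notional, on a 3%-7% tranche."""
    random.seed(seed)
    threshold = norm_inv(pd)          # a name defaults if its latent variable < threshold
    total = 0.0
    for _ in range(n_sims):
        market = random.gauss(0, 1)   # common factor shared by all names
        defaults = sum(
            1 for _ in range(n_names)
            if math.sqrt(rho) * market + math.sqrt(1 - rho) * random.gauss(0, 1) < threshold
        )
        pool_loss = lgd * defaults / n_names
        # The tranche absorbs pool losses between its attachment and detachment points.
        tranche_loss = min(max(pool_loss - attach, 0.0), detach - attach)
        total += tranche_loss / (detach - attach)
    return total / n_sims

# The same pool under two correlation assumptions: the tranche's expected
# loss, and hence its fair spread, changes dramatically.
for rho in (0.1, 0.5):
    print(f"correlation {rho}: expected tranche loss {tranche_expected_loss(rho=rho):.1%}")
```

Nothing in the pool changes between the two runs; only the assumed correlation does. A user who never looks inside the "Black Box" never sees how much of the "price" is really an assumption.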
 
The assumptions and relationships are complex, and what works in normal conditions may fail in illiquid or volatile markets. In general, relying on quantitative models alone, and overlooking the qualitative appreciation of risk, ignores the basic risk management principle: asking what happens in "the other 1 per cent of the time".
 
Financial economics makes a distinction between risk and uncertainty: in the case of the former, while the outcome is not known, the probability distribution of the various outcomes is; in the latter case, neither the outcome nor the probability distribution is known.
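 
In symbols (a textbook formulation of the distinction, often attributed to Frank Knight, and not the column's own notation): under risk an expected loss can honestly be computed because the probabilities are known; under uncertainty it cannot.

```latex
\text{Risk:}\quad \mathbb{E}[L] = \sum_{i=1}^{n} p_i \, L_i
\quad\text{with the probabilities } p_1,\dots,p_n \text{ known;}
\qquad
\text{Uncertainty:}\quad \text{the } p_i \text{ (and perhaps the outcomes } L_i \text{) are themselves unknown.}
```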
 
It is useful to remember that the greater the complexity of the underlying instruments, the greater the possibility that, when market conditions change, risk turns into uncertainty.

Email: avrco@vsnl.com

 
 


Disclaimer: These are personal views of the writer. They do not necessarily reflect the opinion of www.business-standard.com or the Business Standard newspaper

First Published: May 30 2005 | 12:00 AM IST
