Thursday, October 16, 2008

The psychology behind the mess

By Richard Arvey & Michael Frese

THE root cause of the current global financial crisis is that American lending institutions gave out mortgage loans to millions of home owners who could not afford them. When interest rates rose, these owners defaulted on their loans and their homes were foreclosed, leaving banks, investors and other purchasers of the loans saddled with huge losses.

How could such well-known and reputable firms engage in such unwise lending practices? As Fortune magazine asked last year: 'What were they smoking?'

We propose to answer this question through the lens of organisational psychology. What they were smoking wasn't the problem; what they were thinking was.

To begin with, there is what organisational psychologists have identified as the problem of common mode errors. An example of a common mode error is the over-optimism and overconfidence that executives in the financial industry all came to share. As a result, they overestimated their ability to cope with the potential negative consequences of the risks they were assuming. Managers saw other firms enjoying high success rates despite the risks they undertook, and tended to assume they could take the same risks themselves. Once individuals adjusted to a certain risk level - and got away with it unscathed - they adopted even more risky practices.

The tendency to extrapolate from the past to the future accentuates such risky behaviour. First, financial executives assumed past profit rates could be sustained into the future. Second, this orientation was reinforced by the high bonuses that executives enjoyed for short-term 'good performance'. The result was that short-term profits began to drive their investment decisions, and they ceased to look at long-range risks.

Risk assessment involves estimating the probability of particular outcomes and their consequences. Financial executives made a number of errors when assessing the risk they were taking on. They underestimated or even ignored the probabilities of certain outcomes. They did not have an accurate picture of the likelihood that things could go wrong. They formed judgments based on their own personal experiences and perceptions. They used data from their own firms from previous years, indicating high returns, and discounted the possibility of the recent past not being prologue to the indefinite future.
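
The arithmetic behind this kind of mis-estimate can be sketched in a few lines. The figures below are invented for illustration - they are not data from any firm - but they show how treating a tail event as near-impossible shrinks the expected loss by an order of magnitude:

```python
# Illustrative sketch (all probabilities and loss figures are assumptions,
# not data from any firm): how underestimating the probability of a
# worst-case scenario distorts an expected-loss calculation.

def expected_loss(scenarios):
    """Sum of probability * loss over all (probability, loss) scenarios."""
    return sum(p * loss for p, loss in scenarios)

# Perceived risk: a housing downturn judged almost impossible,
# and its cost modest. Losses in millions.
perceived = [(0.97, 0), (0.03, 10)]

# Actual risk: the 'improbable' crash has a real 5% chance
# and a far larger loss attached to it.
actual = [(0.90, 0), (0.05, 10), (0.05, 200)]

print(expected_loss(perceived))  # 0.3  - looks safely small
print(expected_loss(actual))     # 10.5 - 35 times larger
```

A small error in the probability assigned to the extreme outcome, multiplied by a very large loss, dominates the whole calculation - which is why ignoring tail scenarios is so costly.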

By and large, financial executives, like most people, tend to look for confirming rather than disconfirming evidence when discussing potential scenarios: evidence that supports happy expectations, no matter how flimsy, is highlighted; evidence that undermines them is set aside.

People also have difficulty believing that drastic worst-case scenarios can ever happen. That is why they live in earthquake zones and other dangerous places. Even though financial executives may have understood the risks intellectually, they didn't quite believe the worst-case scenario - like home prices falling drastically - would actually occur.

In his excellent book, The Black Swan: The Impact Of The Highly Improbable, Nassim Nicholas Taleb describes the ingrained tendency of humans to underestimate the improbable. That is, they fail to realise that incidents that have a low probability of occurring do in fact occur.

Similarly, executives may not have believed anything bad would happen to their own firms, although other firms may have come into harm's way. Psychologists call this tendency 'discounting', where individuals diminish the risk of extreme events occurring because the probability of their occurring is too low to evaluate intuitively.

Financial executives also seem to have believed that they could pass off risks to other agents - by securitising mortgages, for example, and selling off the securities to others. And the agents who bought these securities had an inaccurate assessment of their risks, because of what they were told by the rating agencies. It was a case of passing the buck, up and down the line.

On a more sinister note, it is also possible that executives did not care if things went awry because they were so well off financially that they personally would not suffer any consequences if 'black swans' - the unexpected - were to turn up. Executives may have assumed too that even if things were to go wrong, other institutions - governments in this case - would pick up the losses. This is the problem of moral hazard.

Finally, there were external pressures to conform. The Economist carried a report recently of a risk manager who described his experiences in a bank. Risk managers are paid to identify and assess risks. Some risk managers did actually warn of problems that have since come to light associated with some risky instruments. But there was enormous pressure on them from their peers, as well as senior management, to play down the risks. As a result, the risks were not widely known until it was too late.

The point here is that human errors in judgment can cause colossal messes. We cannot change human nature, but we can provide checks and balances to ensure that egregious errors occur only rarely. Understanding the psychology behind decision-making would help in designing such checks and balances for the financial sector.

Richard Arvey is head of the Department of Management and Organisation, National University of Singapore. Michael Frese is chair for Work and Organisational Psychology, University of Giessen, and Visiting Professor of Organisational Behaviour at the London Business School. This is the eighth article in the ST-NUS Business School series on the financial crisis.