Einstein’s equation E = mc² is one of the best known in science. However, this elegant formula conceals a web of mathematical and philosophical complexity. In a similar manner, the simple term ‘risk management’ camouflages the complex nature of managing risk within organisational settings.
People, the organisations in which they work and the environments in which they exist are dynamic and non-linear. However, when organisations address the topic of risk management, discussion usually revolves around checklists, procedures and systems. Rarely questioned are the innate decision-making processes we each possess and whether they themselves can be compromised.
For example, Barings suffered a massive financial loss as a result of its directors’ failure to perceive that flawed managerial controls in its Singapore office could jeopardise the bank’s financial viability. The Financial Times(1) reported that internal auditors had explicitly warned the Barings board in August 1994 that the management controls in place in Singapore were defective and posed a threat. Given what subsequently took place, the board appears to have believed that the risk of a collapse arising from this hazard was so remote that extensive remedial measures were not justified. Unfortunately, the board was wrong.
Decision-making
Newell and Simon(2) suggest that: ‘Man is a mirror of the universe in which he lives, and all that he knows shapes his psychology, not just in superficial ways but almost indefinitely and subject only to a few basic constraints.’ Hence, for human beings, perception is all. When our perception of a situation is incongruent with ‘reality’, the decisions we make and the actions we subsequently take may be inappropriate. The evidence suggests that there are a number of hidden socio-psychological pathologies that can prevent individuals and groups of people from making optimum decisions.
Ignoring the signals
When discussing why warnings of disaster are sometimes ignored, Turner and Pidgeon(3) point to the work of psychiatrist Martha Wolfenstein. She suggests that one of the reasons for such behaviour is: ‘...the sense of personal invulnerability which is essential for most individuals if they are to be able to go about their daily business without constantly worrying about all the possible dangers that could threaten them.’
An example of such thinking can be found in the words of Dr Brooke(4), a member of the team that responded to a fire at the Allied Colloids chemical plant in Bradford, West Yorkshire, in 1992: “Never in my worst nightmare did I think that sort of thing could happen, and I’m sure you think that about your organisation. But there it was - happening.”
Similarly, Mr W Lucas(5), a mining financier, said after calling in the liquidators to one of the companies in which he had invested: “In mining you are always saying there is a risk the roof will fall in and you’ll be left without a mine at all. But you don’t expect it to actually happen.”
This tendency to dismiss potentially injurious events, in order to make the world seem a safer place, can cause people to overlook the significance of warning signals and thereby, paradoxically, allow a dangerous situation to worsen.
Failure to learn
Even where warning signals or past errors are recognised, there is strong evidence to suggest that people do not enjoy learning from negative events, even when the lessons to be gained may be advantageous.
In a series of experiments, Wason(6) found that when subjects were presented with a simple conceptual task, where the most efficient solution involved eliminating hypotheses (learning from negative outcomes), they demonstrated a strong aversion to such a strategy. They preferred instead to search for evidence that would confirm their hypothesis (learning from a positive association).
Wason observed that such a defensive mechanism allowed individuals to seek out confirmatory evidence to support their world view rather than accept challenges to it. Where there is a strong aversion to learning from negative events, it may be difficult for individuals to learn from their own or other people’s mistakes.
Similarly, Levitt and March(7) conclude that, in general, people prefer to search for successful outcomes and learn to copy that behaviour, rather than look for actions they should avoid repeating. However, as Miller(8) observes: ‘Failure teaches leaders valuable lessons, but good results only reinforce their preconceptions and tether them more firmly to their “tried-and-true” recipes.’
It would appear that, when seeking to make a decision regarding the viability of any given project, people should look for examples of both failure and success.
Denial of failure
Another way in which people’s world view can lead to undesirable outcomes is the tendency to attribute success to oneself but to blame external factors for failure. For instance, it was reported that, when Tracy Shaw, the actress who played the role of Maxine in Coronation Street, was facing a six-month driving ban, she claimed that ‘... it was her anorexia that was to blame ... not her lack of general driving ability and care ...’(9).
A person whose locus of control is predominantly external may take a fatalistic view of the world and assume that nothing can be done about untoward events. Such a person may make no effort to stop an event, even though it may be eminently preventable.
The physical and attitudinal change that takes place in an organisation after an untoward event may, in turn, depend upon whether those in positions of power attribute it to bad luck, or whether they are able to change their views in the face of new evidence.
This is demonstrated by a comment in a study reported by Toft and Reynolds(10). When a senior manager was asked whether he thought bad luck had played a part in creating an incident in which his organisation had been involved, he replied: “I think the feeling was, a year or two later, that it was very much a one-off accident that had happened before, and was very unlikely ever to happen again, even if we had taken no action”. This comment was made despite the fact that two similar incidents had previously occurred and had been reported in the press.
Conclusion
Every day, people make scores of decisions, trusting that they are the most appropriate ones. However, there are powerful inherent socio-psychological processes that affect both individuals and groups, and if these are not recognised and managed effectively, they can increase the risk of inappropriate decisions. Fortunately, there are a number of tactics that organisations can implement to help reduce this risk and thus improve the probability of success.
--
Professor Brian Toft is managing consultant, Marsh, Tel: 01793 871809,
Email: Brian.Toft@marsh.com
References: 1) Financial Times, 3/03/95. 2) Newell, A and Simon, H (1972) Human Problem Solving, Prentice-Hall, p.95. 3) Turner, B A and Pidgeon, N (1997) Man-Made Disasters, Butterworth-Heinemann, p.34. 4) Health and Safety Practitioner, January 1999. 5) Financial Times, 24/04/97. 6) Wason, P C (1960) ‘On the failure to eliminate hypotheses in a conceptual task’, The Quarterly Journal of Experimental Psychology, Vol. 12, Part 3. 7) Levitt, B and March, J G (1988) ‘Organizational Learning’, Annual Review of Sociology, 14, pp. 319-40. 8) Miller, D (1997) ‘The Icarus paradox’, in Carnall, C A, Strategic Change, Butterworth-Heinemann, p.88. 9) Evening Standard, 16/10/96. 10) Toft, B and Reynolds, S (1997) Learning from Disasters: A Management Approach, 2nd edition, Perpetuity Press, p.66.
DEFENSIVE TACTICS
Each organisation needs to address these issues within the context of its own circumstances. Failing to take account of local conditions and problems may well create difficulties that increase the risk of failure from other, unforeseen sources.
Ways to counter socio-psychological pathologies include: