Suppose an individual believes something with his whole heart; suppose further that he has commitment to this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: What will happen?
The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervour about convincing and converting other people to his view.

Leon Festinger, When Prophecy Fails, 1956
THE stock market will crash on December 21.
In the few seconds it took you to read that first sentence, your innate survival instinct will have kicked in. It is telling you: something bad is going to happen, you need to act. The rational part of your brain is pushing back, saying: nonsense, he has no idea what will happen to the stock market on December 21. But you still have that nagging worry.
Grabbing people’s attention and coercing them into action with frightening stories and doomsday prophecies is an old trick. It exploits our risk-aversion bias, the part of our survival instinct that tells us to look out for danger. The scary-story trick is ubiquitous: journalists use it, religions use it, campaign groups use it and, of course, governments use it. Learning when to ignore such stories and when to see through them is an important life skill, and for investors an essential one.
In 1954, a small team of social scientists conducted a remarkable piece of research which taught us a lot about how we respond to scary stories. The scientists infiltrated a doomsday cult and were able to observe at first hand how the beliefs and behaviours of the cult members evolved in response to a series of failed prophecies.
The central doomsday prophecy of the cult was that a devastating flood would sweep through North America on December 21. The members believed they were a select few who had been chosen by spacemen from the planet Clarion to be saved from the apocalypse. They believed the spacemen, or Guardians, would transport them in flying saucers to safety on alien planets.
Leon Festinger and his team joined the cult in early November, in time to see the cult members preparing for rescue at 4pm on December 17. The event was watched by reporters, TV crews and other curious individuals.
The cult members were told to wait for the flying saucers in the back yard of the headquarters, and to remove any metal objects from their clothing as these would heat up during the space flight, causing nasty burns.
Shortly after the flying saucers failed to arrive at 4pm, an inquisitive teenager turned up at the house. Without a shred of supporting evidence, and despite the teenager’s protestations to the contrary, the group decided he had been sent by the spacemen with instructions for the group. His inability or refusal to deliver the instructions was taken as a test of their faith.
After lengthy discussion, the group convinced themselves their spacemen masters were conducting a practice run and the flying saucers would arrive at another time. This invented explanation did not just allow the group to retain their beliefs; perversely, it appeared to strengthen their conviction. Festinger and his team observed this pattern many times. The disconfirmatory evidence – the absence of flying saucers – was ignored while confirmatory evidence – the teenage messenger – was invented. It was an extreme example of selection bias and confirmation bias, where any inconvenient data is simply discarded.
A few hours later the group’s leader announced that the flying saucers had been delayed and would arrive later that night. Again, the group waited outside the house, this time into the cold small hours, and again nothing happened. This time the group coped with the disappointment by simply ignoring it. They returned to the house without discussion.
In response to being proved wrong, the group did not change their beliefs, rather they stepped up their proselytising efforts, actively engaging with outsiders in an attempt to win new converts. One of the leaders explained his continued faith to one of the undercover observers: ‘I’ve had to go a long way. I’ve given up just about everything. I’ve cut every tie: I’ve burned every bridge. I’ve turned my back on the world. I cannot afford to doubt. I have to believe.’
The leader felt he had to keep the faith because abandoning it would require recognising the huge cost of his previous mistakes. The loss of status and damage to his ego were too much to contemplate. We now call this type of flawed thinking the sunk-cost fallacy. The greater the sunk cost, whether financial or reputational, the more difficult it is to acknowledge the mistake and correct it.
There were two more no-shows by the flying saucers, after which the group set about the tortuous process of explaining away their failed prophecy.
The absence of the December 21 flood required an especially clever story. The explanation they settled on was an ingenious logical contortion: the spacemen had decided to save the world from the flood as a reward for the cult’s faith. The flood did not happen because they believed it would happen. The logic was entirely circular but, to the cult members, it provided the confirmatory narrative they required.
Rather than being beaten down by the failed prophecy the group emerged jubilant with their faith further reinforced. In Festinger’s words, ‘they had a satisfying explanation of the disconfirmation . . . From this point on their behaviour toward the newspapers showed an almost violent contrast to what it had been. Instead of avoiding newspaper reporters and feeling the attention they were getting in the press was painful, they almost instantly became avid seekers of publicity.’
As an interesting aside, the researchers noted that cult members who remained living within the main group after the December 21 disconfirmation tended to emerge from the experience with their faith preserved or enhanced. On the other hand, those who had to cope with their disappointment alone tended to lose their faith. Social confirmation, groupthink and peer pressure were important in preserving and reinforcing the members’ beliefs.
Following this research, Festinger developed the theory of cognitive dissonance: the mental discomfort a person feels when holding contradictory beliefs, values or ideas, and the lengths they will go to in order to make them consistent.
Festinger’s story provides some useful hints on how to spot, and when to ignore, false doomsday prophecies.
Are those making the doomsday prediction heavily invested in it? Does the rationalisation of the scary story keep changing? Do the proponents have a substantial financial or reputational sunk cost making it impossible to admit they may be wrong? Is contradictory (positive) data being ignored? Is supportive (negative) data being manufactured? Are the protagonists listening only to those who share their own views? Are the protagonists evangelical, feeling the need to proselytise and win converts?
If you see these behaviours, you should probably nod politely, ignore the story, and move on. Most likely it is nothing more than an attempt to grab your attention and control your behaviour with scaremongering. Do not waste time trying to convince the believers with science; a true believer will have too much of their reputation invested in the story to listen to reason.
I will leave it to you to judge if Festinger’s findings have any relevance to current events.
This article was first published by Equitile Investments on October 27, 2020, and is republished by kind permission.