SINCE the Government insists that science is our best weapon against ‘the pandemic’, it is surely only reasonable to expect its decision-making processes to be based on science too. Yet this could hardly be further from the truth.
Our democracy does not stretch far enough to allow us to scrutinise in fine detail how our government arrives at its decisions. But it seems to go something like this. When a problem emerges, committees and sub-committees made up of a handful of experts in fields considered relevant meet privately to discuss what to do.
These committees give a snapshot of current knowledge and recommend strategies, which are then discussed in secret by an even smaller group of Cabinet ministers. The Cabinet’s collective recommendation may then be put to a vote by the 650 MPs, who are usually told how to vote by their party.
The politicians at the sharp end of the decision-making process mostly have no special expertise in the subject they are assessing. They do not use a validated methodology. They are advised by people who often have significant vested interests. And there is no input from the public, despite there being vastly more expertise in the community than any Cabinet could possibly muster.
Despite the blatant inadequacies of this approach, once ‘the policy’ is proclaimed it is treated as if forged from steel: as strong as religious conviction, reappraisal out of the question, criticism serving only to strengthen ministers’ faith in it.
It is difficult to overestimate the stubbornness with which we cling to our treasured truths. Imagine Greta Thunberg saying: ‘I’ve had another look at the data, and I think I’ve been exaggerating things.’
Our psychological biases act like fairground mirrors, bending and twisting our perceptions of ourselves and the external world. We tend to believe what we want to believe. We see what we want to see.
Yet once we muster sufficient willpower and humility, we can recognise, at least occasionally, our subconscious inclination to seek evidence that confirms our favoured point of view while ignoring data we don’t want to accept.
It is forbiddingly hard for human beings to admit we are mistaken. This is a scientific fact. We pay attention to only a few aspects of life while ignoring the rest. We bow to pressure to make decisions favoured by the groups we belong to, despite nagging doubts. We place more faith in our own opinions than in those of others, however cogent those opinions may be.
But there are tried and trusted ways to counter these biases. Deliberately look for ways to challenge the patterns you think you see in the data. Obtain information and opinions from a variety of sources. Try to refute your favourite theories. Consider circumstances from multiple perspectives. Find people who disagree with you and try to understand why they think as they do. Surround yourself with diversity. Entertain the possibility that the truth is the exact opposite of what you believe. And never be afraid to admit that you may be wrong.
Such self-criticism is tough even to initiate, never mind sustain. But if the Government were to invest even the smallest fraction of its vaccine funds in a Standing Body of Advisers, drawn from disparate walks of life, to help policymakers deliberate as broadly as possible, everything might change. It is simply not acceptable for whole nations to remain subject to such obstinately limited outlooks.
To government ministers and their advisers this will likely seem a naïve, possibly even insulting, suggestion. Yet our leaders are so obviously subject to tunnel vision, their constricted outlook has caused so much damage, and they are so plainly incapable of seeing beyond the ends of their noses, that something must change.
At the very least they must learn to ask themselves a set of essential questions and give and defend their answers in the public arena.
As a start, why not routinely consider ten simple questions, working through them with support from the Standing Body if necessary, facilitated by a multidisciplinary research team?
1. What is the problem?
2. To whom is it a problem?
3. Why have we defined this problem as most pressing rather than another?
4. What is our solution?
5. What evidence and what value judgements support our solution?
6. What benefits (medical, social, economic, personal) can be expected from our solution?
7. What costs (medical, social, economic, personal) can be expected from our solution?
8. How have we compared and balanced possible outcomes?
9. What new problems does our solution cause?
10. What are the alternatives to our preferred solution?
Psychological science shows how difficult it can be to think straight. It also shows us how to be honest with ourselves.