Climate researcher and science communicator John Cook has a reminder, and a fresh proposal.
The reminder is this: when you present people with evidence that contradicts their deeply held beliefs, they don’t change their minds. Instead, their beliefs become more entrenched – a phenomenon that has been demonstrated repeatedly in research.
The proposal is this: to counter this tendency, people need to be ‘inoculated’ against misinformation in the same way we are inoculated against infectious diseases – with a weakened form of the pseudoscience:
You might have a healthy understanding of the science. But if you encounter a myth that distorts the science, you’re confronted with a conflict between the science and the myth. If you don’t understand the technique used to distort the science, you have no way to resolve that conflict.
Half a century of research into inoculation theory has found that the way to neutralise misinformation is to expose people to a weak form of the misinformation. The way to achieve this is to explain the fallacy employed by the myth. Once people understand the techniques used to distort the science, they can reconcile the myth with the fact.
The metaphor is nice, and the approach is probably sound, although I’ve never looked at logical fallacies as weak forms of misinformation. To me, they are vital tools in the trade of argumentation – things to be spotted, avoided, and called out.
John Cook uses the example of anti-vaxxers who believe that vaccines cause autism because of a discredited paper by Andrew Wakefield, and notes that the myth persists because the science in this case is distorted by the post hoc, ergo propter hoc fallacy, in which causation is wrongly inferred simply because one event happens before another.
If that is so, he believes, recognising the fallacy would help people realise that the science has been distorted.
In some cases that might very well work.
However, I have seen anti-vaxxers attempt to use that same tool – pointing out fallacies – to prove their own point. While most of their reasoning is indeed fallacious, the more devious and entrenched among them will throw a good old “correlation is not causation” right back at you. Which makes me think that there are lost causes too, particularly when you are dealing with a conspiracy theorist.
While I strongly support teaching critical thinking and the recognition of logical fallacies to anyone from the age of six onwards, I’m not sure it really will inoculate everyone against pseudoscience.
It’s useful, for sure. But I suspect that there are many more factors at play when it comes to entrenched beliefs in utter nonsense.
What do you think?