Last Updated on 2022-10-21 by Joop Beris
We live in the Age of Information. Never before in the history of human civilization have so many people had access to such a wealth of information. For many of us, knowledge is quite literally at our fingertips. I used to believe that if only everyone had access to information, we would all benefit as society became more knowledgeable with each generation. Unfortunately, if the recent pandemic has taught us anything, it is that this is a rather idealistic portrayal. There is a lot of misinformation out there and it seems to spread more readily than actual information. Resisting misinformation is a much-needed skill. And this is where a book from 1995 can still help us today!
The problem of misinformation
One of the things I want to do with this blog is to address the problem of misinformation and pseudoscience. You might wonder why I think those are a problem, so let me explain briefly. The things we believe about the world generally inform our actions. For instance, if I believe the building I’m in is on fire, I will do my best to get out as quickly as possible. If my belief is correct, that may just save my life. If the building is on fire but I incorrectly believe that it isn’t, I may end up dead. Or another example, closer to the current situation: if I correctly believe that a vaccination against Covid-19 will protect me and those around me, I will be more inclined to get vaccinated. The closer the things we believe align with reality, the better our decisions will generally be.
This works on a wider scale too. The more people believe true things and reject false things, the better the decisions we make as a whole. For a functioning democracy, an educated, well-informed public is crucial, because such a public is most likely to make educated, well-informed decisions that benefit not only themselves but others as well. This is where misinformation gets in the way and why resisting misinformation is such an important skill.
In 1995, Random House published “The Demon-Haunted World”, the last book Carl Sagan would ever write. In this book, subtitled “Science as a Candle in the Dark”, the famous astronomer, astrophysicist, astrobiologist and science communicator explains the scientific method to the general public, advocating critical and sceptical thinking. As such, the book is just as relevant today as it was back in 1995.
Sagan offers the reader what he calls the Baloney Detection Kit. Because I am convinced that critical and sceptical thinking is vital to our societies and ultimately to our survival, I want to share this toolkit with you now, and I sincerely hope that you will use it and spread it.
The Baloney Detection Kit
- Wherever possible there must be independent confirmation of the “facts.”
If several sources report on the same event and they agree about most of the details, the reports are probably reliable. But these sources must be independent of each other.
- Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
A substantive debate means that the discussion is based on reality and is meaningful. It doesn’t include idle speculation or outlandish ideas. “Knowledgeable proponents” is also key. There’s no point listening to social media personalities for ways to combat Covid-19. Models, singers and actors aren’t doctors or virologists.
- Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
This one is perhaps difficult to grasp for many. However, it’s important to keep in mind that even the most knowledgeable people make mistakes. Even Albert Einstein, a name that’s almost synonymous with genius, made several mistakes during his career. So the argument of one person, no matter how much of an expert they are, should not carry much weight. It’s what you can show to be true that matters, not who said it.
- Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
An event or fact often has more than one possible cause. The explanation or hypothesis you prefer isn’t necessarily the right one. So if multiple explanations for an event could be true, it’s best to try to disprove all of them. The ones that survive this test will be all the more credible.
- Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
A major difference between the scientific community and the rest of society is that scientists try to prove themselves wrong instead of right. Even if you are wrong, it means you have learned something, and every hypothesis eliminated brings you one step closer to a better answer. The process of peer review is tough, but it leads to better understanding.
- Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
Put simply, if you can measure it in any meaningful way: measure it! It’s much easier to compare numbers than it is to evaluate people’s often subjective arguments.
- If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
Each link in a chain of argument must follow logically from the previous one, links should not be skipped, and if even one link fails, the whole chain of argument is invalid. For instance, this is a valid chain: Tabby is a cat. Cats are mammals. Therefore, Tabby is a mammal.
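For readers who like to see the mechanics, a valid chain such as the Tabby syllogism can even be checked mechanically by a proof assistant. This is a minimal sketch in Lean 4; the names are purely illustrative and not from Sagan’s book:

```lean
-- The Tabby syllogism, formalized: the conclusion follows only if every link holds.
variable (Animal : Type) (Cat Mammal : Animal → Prop)

theorem tabby_is_mammal
    (tabby : Animal)
    (cats_are_mammals : ∀ a, Cat a → Mammal a)  -- link 1: cats are mammals
    (tabby_is_cat : Cat tabby)                  -- link 2: Tabby is a cat
    : Mammal tabby :=                           -- conclusion: Tabby is a mammal
  cats_are_mammals tabby tabby_is_cat
```

Remove either hypothesis and the proof no longer type-checks, which is exactly the point: every link in the chain must work, not just most of them.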
- Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.
Ideally, a hypothesis that includes the fewest assumptions or steps should be preferred over one that makes many assumptions or needs many steps. Sir Isaac Newton phrased it in the following way: “We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances. Therefore, to the same natural effects we must, as far as possible, assign the same causes.”
- Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle — an electron, say — in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate sceptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.
Simply put: if there’s no effective way to prove your hypothesis wrong, it’s a weak hypothesis. If I say there’s a dragon in my attic but that it is invisible, doesn’t require food, makes no sounds and doesn’t show up on infrared imaging, how could anyone prove the dragon is not there? There’s no reason to consider such a hypothesis. So whenever you hear someone make a statement, consider how that statement could be falsified. If it can’t, there’s little reason to accept the statement as true.
Resisting misinformation belongs on school curriculums
I hope that you also see value in developing critical thinking skills. I wish schools would focus more on how to think in their curricula. Resisting misinformation should be taught in school, just as children are taught to show their working in maths. Critical thinking is also a prerequisite for free thought, because it lets you examine your own thinking. If you think the Baloney Detection Kit is a valuable tool, please feel free to share this post.