Science Seer
Collectively we are not very effective at critical thinking. For starters, we immediately jump to conclusions, our default reaction to new information. Daniel Kahneman explains this in his essential book *Thinking, Fast and Slow*. If the subject is not clear-cut, take a bit of time to think it through.
Most people don’t have a good grasp of probability; it is something scientists are specifically trained in. We think in terms of “N,” the number of replications. An event happens, N = 1, and there is essentially zero confidence in predicting future results. If the same event happens three times in a row, people automatically think “cause and effect.” That’s not good enough. When we do experiments, we try for at least 25 replications, and even that gives us only around 95% confidence in the results; one time in 20 we will still be wrong. So the next time something unusual happens to you, think “N = 1” and be patient.
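To make the “N” point concrete, here is a minimal sketch in Python (my own illustration, not part of the original argument), using the standard normal approximation for an observed proportion. The uncertainty shrinks only as 1/√N, so N = 1 tells you almost nothing, while N = 25 starts to be useful:

```python
# A minimal sketch of why small N tells you little: the half-width of a
# rough 95% confidence interval for a proportion shrinks only as 1/sqrt(N).
import math

def ci_halfwidth(p_hat: float, n: int, z: float = 1.96) -> float:
    """Half-width of the normal-approximation 95% CI for a proportion."""
    return z * math.sqrt(p_hat * (1.0 - p_hat) / n)

for n in (1, 3, 25, 100):
    hw = ci_halfwidth(0.5, n)  # worst case: observed proportion of 0.5
    print(f"N = {n:3d}: estimate is 0.50 +/- {hw:.2f}")
# N = 1 gives +/- 0.98 (useless); N = 3 still +/- 0.57;
# N = 25 gives roughly +/- 0.20, the "wrong one time in 20" regime.
```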
If the event is truly impressive or frightening, we tend to give it more importance. Every time a plane goes down, we all shudder and wonder about the safety of flying. But your probability of dying on a commercial airline flight is about one in seven million. You are 19 times more likely to die stepping into a car.
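For the record, here is the back-of-the-envelope arithmetic behind that comparison, a quick sketch using only the figures quoted above (the per-flight framing is my assumption):

```python
# Back-of-the-envelope arithmetic using the figures quoted above.
plane_risk = 1 / 7_000_000      # quoted odds of dying on a commercial flight
car_risk = 19 * plane_risk      # "19 times more likely" stepping into a car
print(f"flying:  1 in {1 / plane_risk:,.0f}")   # 1 in 7,000,000
print(f"driving: 1 in {1 / car_risk:,.0f}")     # about 1 in 368,421
```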
And perhaps the best example of a skewed understanding of probabilities is nuclear power. Chernobyl, the product of clearly improper procedures, caused about sixty deaths directly attributed to the accident, roughly half of them occurring well after the event. There is a lot of debate over possible subsequent cancer deaths, with no resolution possible because the radiation doses were low enough that accurate damage prediction is impossible.
Fukushima also scared a lot of people, although there were no immediate radiation-caused deaths. Several dozen people died from evacuation-related stress; fifteen thousand died as a result of the earthquake and tsunami itself. Nuclear waste bothers a lot of people, yet if you got your entire lifetime of electricity from nuclear power, the resulting waste would be about the size of a doorknob (at the moment it is usually stored on site in pools of water). Nuclear has by far the lowest death rate of the nonrenewable sources, and even that is almost entirely due to mining accidents. It has prevented millions of pollution-related deaths, and its footprint is far smaller than that of solar, wind, or hydroelectric.
But what about critical thinking in daily events? We all read the news in various formats, and much of what we browse is exaggerated or untrue. How to know? First, consider the source. A reputable news outlet counts for a lot; if you get all your information from Fox (as my brother does), at best you’re getting shoddy information. Then check the underlying source: ideally a reputable scientific or medical journal, or perhaps a doctor or scientist. If you’re doubtful, follow the money. Is it climate denial from someone funded by Exxon? Ha. And if it’s completely undocumented and a conspiracy theory to boot, stop reading.
I still find it incredible that any rational person would believe QAnon… but then, they aren’t rational, are they?
The study of risk *perception*, i.e., doing what some think is impossible (the objective study of subjectivity), is an interesting field. What I call the “Slovic school” asserts that risk perception is mediated by “world-view threats”: objectively small risks are magnified if the events would threaten a world view; similarly, large risks can be ignored if they are unrelated to the world view. Of course, this requires (something the article, interesting as it is, leaves out) a discussion of the nature of probability. I happen to be a “propensity” partisan, like the philosophers Bunge, Popper, and Mellor.
Frequentism, not to be confused with the *statistical* view of the same name, is provably wrong, as shown in 1939 by Jean Ville (_Étude critique de la notion de collectif_). Similarly, personalism, i.e., most varieties of Bayesianism, is also wrong. The best way to see why, in my opinion, is to note that there is actually no agreement on what the view involves beyond a few elementary matters. Even the theorem that gives the movement its name is not used in the same way by all Bayesians; so-called Jeffrey conditionalization is also in use, and that is a whole continuum of methods. (The theorem is a theorem in the mathematical theory of probability, which by itself has no factual reference; the different interpretations are different “semantic assumptions” that link it to other fields, such as classical thermodynamics or genetics.)
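Since Jeffrey conditionalization may be unfamiliar, here is a minimal sketch (my own illustration, using Jeffrey’s classic cloth-by-candlelight example; nothing here comes from Ville or any particular Bayesian text). Evidence shifts your weights over a partition {E_i} to new values q_i instead of making one E_i certain; ordinary Bayesian conditioning is the special case where some q_i = 1, and because the q_i can be anything, this really is a continuum of update rules:

```python
# Jeffrey conditionalization: evidence moves belief over a partition {E_i}
# to new weights q_i rather than making one E_i certain.
#   P_new(A) = sum_i q_i * P(A | E_i)
# Plain Bayesian conditioning is the special case where some q_i = 1.

def jeffrey_update(p_a_given_e: list[float], q: list[float]) -> float:
    """New probability of A after the partition's weights move to q."""
    assert abs(sum(q) - 1.0) < 1e-9, "q must be a probability distribution"
    return sum(qi * pa for qi, pa in zip(q, p_a_given_e))

# A = "the cloth is green"; partition = its apparent colour by candlelight.
p_a_given_e = [0.9, 0.1]   # P(A | looks green), P(A | looks blue)
print(jeffrey_update(p_a_given_e, [0.7, 0.3]))  # uncertain glimpse -> 0.66
print(jeffrey_update(p_a_given_e, [1.0, 0.0]))  # certainty = plain Bayes -> 0.9
```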
For those who are interested and know where to find me, I do have a paper on all of this from my student days that I can pass along.
Of course, critical thinking skills are essential if you want to live well, happily, and effectively. Not to mention just staying alive. The examples in this article point to conclusion-jumping in the direction of over-estimating risk. There is also under-estimation, as in the majority’s decision not to wear COVID masks in public wherever it is merely legal to go without one. A bit of critical thinking, along with watching a few reliable newscasts, will tell you clearly that “It ain’t over, folks!”