It looks like 2023 will be the hottest year on record, to go along with all the cyclones, hurricanes, floods, and bushfires we have already had. Those who study climate change and take it seriously agree unanimously that man-made changes to the environment are causing the climate to change so fast that Mother Nature will struggle to keep up.
And for the minority (yes, it is a minority, as has been confirmed by many surveys and studies across the world), their arguments against climate change go something like this …
It might not be because of us ...
...so it's definitely not because of us.
That’s right. Whatever Twitter is called now is full of amateur opinions about the ‘natural’ cycle of temperatures and climate over millions of years, and how this is just another part of that. Of course, it is only in the last 200 years or so that the atmosphere has had to deal with ozone-depleting chlorofluorocarbons, carbon dioxide from all our vehicles and power plants, the sulfur and nitrogen oxides from those same sources that fall as acid rain and kill the plants that make oxygen, the ruthlessly efficient removal of even more plants and forests, plastics in the ocean as far as the eye can see, and so on. And of course, it is only in the last 200 years that the rate of change has been so pronounced that things that are usually resilient (think corals) simply cannot adapt quickly enough to the rising temperatures in the water.
We often hear of this thing called ‘confirmation bias.’ That is, we tend to believe only the statistics that support our pre-existing beliefs. And we also know that, again thanks to whatever Twitter is called now, there is a never-ending stream of statistics. Some of them are even valid. So we are forced to choose which statistics to take seriously.
But confirmation bias is a normal human behavior. There is nothing inherently wrong with it. For example, let’s say that you have a bird feeder in your backyard. You put bird seed in it to attract songbirds and make your backyard a more beautiful place to sit in. But there is a problem. Your bird seed is going missing. More often than not, all the bird seed you put in your bird feeder in the morning is gone by lunchtime. And this cannot be down to birds alone.
So you borrow a camera – like a ‘trail cam’ that hunters use. The camera is designed to take pictures every time it senses movement. And so you point this camera at your bird feeder. For ten days it takes pictures. After ten days, you have to return your trail camera to the person you borrowed it from. And now you have a chance to look at the pictures.
Sure enough, there is a pesky squirrel (let’s call him ‘Cyril’) that regularly appears, chases the birds away, and gorges on the bird seed himself. Now the next day, you fill up your bird feeder in the morning. And lo and behold, the bird seed is gone by lunchtime.
Technically, if you pin this on Cyril the Squirrel, then you are guilty of confirmation bias. The loss of bird seed could plausibly be down to high winds, a broken bird feeder, an unusually high number of birds, and so on. But it is not wrong to blame Cyril for the loss of bird seed on the eleventh day … given you have confirmed that he was behind the loss of bird seed on the previous ten.
So confirmation bias serves a useful purpose. True, it can go too far. But it is very human to look back on your experiences (which have this annoying tendency to motivate you to form opinions) and then scrutinize statistics that violate your beliefs more vigorously than statistics that confirm them. We can’t spend every moment of every day examining every idea, opinion, or conclusion exhaustively. So we need to choose which claims are worthy of skepticism and which are not.
But today, we have something more sinister at work. The concept of ‘identity politics’ touches on this, but it doesn’t explain the whole story.
Put simply, humans tend to let emotions (not experience) drive decision-making. Let’s say that twenty years ago you had a bad experience with a ‘militant environmentalist’ that motivated you to join an online forum where the claims of climate change appeared to be questioned in a valid way. There was nothing sinister or overwhelming about this choice. It is just that you were yelled at for no apparent reason by some lunatic who claimed that ‘you were part of the problem.’ And this really ruined your birthday. It felt wrong to you, and you wanted to see if others felt it was wrong too.
And you have been an active participant in this forum ever since. You have put so much effort into feeling (and then reinforcing) the sweet sensation of vindication over the years that it is now borderline impossible for you to leave. You even count some forum members as ‘close electronic friends.’ But the evidence supporting a conclusion of disastrous man-made climate change is orders of magnitude more persuasive now than it was back then. So you and the other members of that forum find ever more ‘fringe’ statistics, from whatever source, that keep the forum, and the social support structure it brings, alive.
In this scenario, the existing belief is not so much based on a detailed study of the facts. It is instead a necessary foundation for personal and professional validation amongst the ‘peers’ of the forum. If your opinion changes, you lose your membership in the forum. So your opinion doesn’t change. And you have lots of ‘statistics’ to back it up.
Our emotions dress up as reason everywhere.
Take engineering, for example: a discipline with a reputation for objectivity and thorough investigation. We routinely see seemingly smart men and women make boneheaded decisions that result in unmitigated disaster. Whether it was Boeing deciding to halve the amount of money it paid its suppliers (and hoping nothing would change or go wrong). Or TEPCO ignoring its own risk assessment regarding tsunamis before the Fukushima disaster. Or NASA and its toxic management culture that unambiguously resulted in two Space Shuttle disasters. Or the Royal Australian Navy cutting back on maintenance so much that it ‘informally’ decided that ships and boats didn’t need working redundant systems when they were put to sea.
This is not confirmation bias. These disasters came down to the selfish prioritization of personal goals at the expense of other people. Some of these selfish motivations are down to culture. NASA, for example, created a ‘space flight’ culture where people who got in the way of launches were professionally ostracized. This included engineers who tried to explain that the Challenger and Columbia launches were dangerously risky. It also included the ‘bulk’ of NASA’s organization, which quickly overruled them. So people prioritized promotion and ‘belonging’ over common sense and science.
But it is not always culture.
The then-CEO of Boeing, Dennis Muilenburg, was the unambiguous boss of the organization. But he chased the adulation of shareholders by generating one high (short-term) profit after another. The problem was that this came at the expense of engineering and physics. So pretty soon, aircraft started falling out of the sky.
These scenarios have been studied. And all these studies essentially point out how wrong the individuals (and in some cases the cultures that cultivated them) were in the lead-up to these problems.
But these studies also found something else. They found that the decision-makers of disaster always clung to some barely verifiable statistic that justified those boneheaded decisions. They were so concerned about what they were doing that they needed the security blanket of ‘someone else’s numbers’ to give them comfort as they plowed through barrier after barrier towards the precipice of disaster.
The antidote to this is, of course, critical thinking. That takes time and effort. And it sometimes pushes us outside our comfort zone. There are lots of tools in our toolbox for sidestepping the need for critical thinking. We might chase group consensus (where, for some reason, the shared opinion of many fools can feel far more enlightened than the reasoned conclusions of a single person who was motivated to ask questions). And of course, gathering statistics from fools is not research. It is not even a literature review.
So remember that 2023 is on track to be the hottest year since records began. And it is true that the climate has naturally cycled over millions of years. But here is another statistic for you …
It took the dinosaurs no more than 33,000 years to die off
after the asteroid hit the Yucatan Peninsula
And that is considered to be a blink of an eye in the geological history of the world. In geological terms, 200 years is about how long it takes for a bomb to explode or a car to crash. So when you design the next plane, train, or automobile, make corporate decisions, or do anything where the impact of being ‘wrong’ is substantial … use critical thinking, and don’t just gather the statistics of fools.