Just over two months later, there have been more than 30 acts of arson and vandalism against telecoms sites and around 80 recorded acts of harassment against telecoms engineers. In one viral incident, the harasser was confident enough to film the exchange themselves, pivoting quickly from social-distancing concerns to conspiracy theories.
It’s easy to dismiss those who fall for these theories as dumb, or even malicious. But they’re not. We are all irrational, and conspiracy theories are an irrational way to make sense of the chaos in the world. What’s different now is a potent cocktail: social media mixed with a chaotic, random event that is killing thousands. People feel better when there’s an explanation; a lone gunman killing a president or a princess dying in a car crash feels too simple, too random. It doesn’t help that some conspiracy theories, such as MK-Ultra and PRISM, turned out to be true.
Why do we fall for conspiracy theories?
I am irrational. You are irrational. Everyone is. You may believe you think through every choice and arrive at the best decision, but it’s not true. We’re manipulated by advertising, by rhetoric, and by ourselves.
B. F. Skinner showed this with his Superstition Experiment, in which he placed hungry pigeons in an empty box and fed them at random times. The randomness bred new behaviours in the pigeons: they seemingly came to believe that certain actions they were taking were producing the food. Three-quarters of the birds took to moving their heads in patterns or spinning in circles, in the apparent belief that this was the trigger for being fed. Just as Ivan Pavlov’s dogs had linked the ticking of a metronome (not a bell, as is usually believed) to feeding time, Skinner showed that brains, even tiny ones, will stitch randomness together into patterns.
Our brains may be bigger than a pigeon’s, but we do the same thing. Pareidolia is a near-universal phenomenon: we see faces in car grilles, in the moon, even a deity in a piece of toast. We look for patterns, even where there are none.
Random patterns also don’t look the way we assume. Ask someone to toss a coin a hundred times and write down the results, and ask someone else to fake the same exercise, and it’s usually easy to tell who did what: the real sequence will contain long streaks of heads and tails that the faked one lacks. It’s the same fallacy that leads gamblers to believe they’re on a “hot streak”. That’s just what randomness looks like sometimes.
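This is easy to check for yourself. The short Python sketch below (the trial count and the streak threshold are illustrative choices, not figures from any study) simulates thousands of sequences of 100 fair coin tosses and measures the longest streak in each:

```python
import random

def longest_run(flips):
    """Length of the longest streak of identical outcomes."""
    best = current = 1
    for prev, nxt in zip(flips, flips[1:]):
        current = current + 1 if nxt == prev else 1
        best = max(best, current)
    return best

trials = 10_000
runs = [longest_run([random.choice("HT") for _ in range(100)])
        for _ in range(trials)]

# Roughly 8 in 10 genuine sequences contain a streak of 6 or more --
# far longer than the runs of 3 or 4 people tend to write when faking.
print(sum(r >= 6 for r in runs) / trials)
```

Fakers instinctively break up any run longer than three or four, and that caution is precisely what gives them away.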
What has this got to do with 5G? Conspiracy theories around mobile phone masts have been around longer than 5G, and they took hold because of “clusters” of disease, particularly cancer. In a year, a certain number of people will fall victim to diseases like cancer, and while genetics and lifestyle increase the risk, bad luck plays a part. As a result, a map of cancer cases across the country will have a great deal of randomness to it—and therefore won’t look random to us at all.
The “Texas Sharpshooter Fallacy” is named after a sharpshooter who fires randomly at a wall and then draws a bullseye around wherever most of his bullets happened to land. Pure chance means that cases of a disease will not be spread evenly across the country. Instead there will be clusters of cases, and it’s easy to see why people living in those areas will look for an explanation. Blaming the devices and masts that apparently work using radiation is an obvious next step.
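The clustering is easy to reproduce. In the sketch below, 1,000 hypothetical cases land uniformly at random across a 10 × 10 grid of districts with identical underlying risk (all numbers invented for illustration), and a “cluster” appears anyway:

```python
import random
from collections import Counter

# 1,000 cases, each landing in a uniformly random district.
cases = [(random.randrange(10), random.randrange(10)) for _ in range(1000)]
counts = Counter(cases)

print("average cases per district:", 1000 / 100)
print("busiest district:", counts.most_common(1))
print("quietest district:", min(counts.values()))
# Typically the busiest district sees around twice the average and the
# quietest around a third of it, despite every district carrying the
# same underlying risk.
```

Draw that grid as a map and the busiest square looks exactly like the kind of “cancer cluster” residents would demand an explanation for.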
How do we fight conspiracy theories?
If you want the truth about 5G, it’s easily available. Technology magazines, populist newspapers, communiqués from mobile operators: they all say the same thing. The conspiracy theories are bunk, and the science doesn’t support them.
But the problem is that, to believe a conspiracy theory, you need to have already embraced “alternative news sources” and rejected anything seen as the mainstream view. Anyone who argues against the theory is then dismissed as part of it. The whole premise may be built on sand, but that just makes it easier to shift as new facts are presented: “Well, they would say that, they’ve been bribed/have too much to lose/are one of the conspirators.”
Presenting facts that debunk a person’s belief may actually be counterproductive. If you’ve ever wondered why political debates never seem to go anywhere, it’s down to something called the “backfire effect”: facts that don’t fit with someone’s beliefs are rejected and end up strengthening their conviction instead. If you’re thinking this can’t be true, that facts will surely trump supposition, there’s a chance you’re falling into the same trap.
So, what can be done? Professors Stephan Lewandowsky and John Cook, authors of The Conspiracy Theory Handbook, might have an idea. They trace the roots of conspiracy theories to feelings of powerlessness (real or imagined), combined with the experience of unlikely, threatening events. Conspiracy theories are a coping mechanism for making sense of the world, and social media spreads them faster than ever before.
Tackling this sense of powerlessness is, according to Lewandowsky and Cook, key to defeating the spread of conspiracy theories. By promoting the views of “trusted messengers” who once held the same views but have since rejected them, by showing empathy and understanding, and by affirming and redirecting critical thinking, individuals can be brought round. They also warn against ridicule and trying to “win the argument”: that will simply backfire.
We shouldn’t rule out ridicule entirely, however; it may help. Not poking fun at individuals who have fallen into the trap of conspiratorial thinking, but ridiculing the ideas themselves. People don’t like to be part of an out-group. The old Candid Camera gag, with everyone but the dupe facing the wrong way in an elevator, shows just how powerful social cues can be. There’s precedent here, too: the “meme-ification” of 9/11 conspiracies turned them into a joke and essentially neutered them.
Social media needs to play a part, too. Facebook, YouTube, and Twitter are acting against those propagating the theories, but many think they are being neither fast nor strict enough. WhatsApp’s new forwarding limit may help: adding just a little friction has slowed the spread of misinformation by a quarter. Friction may be key here. Studies have shown that slowing people down and adding extra steps (“Are you sure you want to retweet this?”) makes them more capable of distinguishing truth from fiction, and less likely to share easily debunked nonsense.
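To see why a small cap can have an outsized effect, here is a deliberately crude branching model (every parameter is invented for illustration; this is not how WhatsApp actually works). Each recipient forwards a message with some probability, to at most a capped number of chats:

```python
import random

def spread(forward_cap, share_prob=0.3, hops=8, trials=500):
    """Average total recipients in a toy forwarding cascade."""
    totals = []
    for _ in range(trials):
        active, total = 1, 1
        for _ in range(hops):
            # Each active recipient forwards to up to `forward_cap` chats.
            new = sum(forward_cap for _ in range(active)
                      if random.random() < share_prob)
            active, total = new, total + new
        totals.append(total)
    return sum(totals) / trials

print(spread(forward_cap=5))  # ~1.5 new shares per share: it snowballs
print(spread(forward_cap=1))  # ~0.3 new shares per share: it fizzles out
```

The specific numbers don’t matter; the point is that a cap can push the average number of new shares per share below one, at which point a cascade dies out instead of growing.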
Factsheets and scientific debunking will help the fight against conspiracy theories, but they are ineffective tools on their own. With its infrastructure under threat and its personnel in potential danger, the telecoms industry needs to recognise this and act accordingly, and quickly. These conspiracy theories are creeping into the mainstream via celebrity Instagram posts and even breakfast TV hosts, emboldening those who are used to being on the fringe. In the long term, without action, mobile operators could find government policy turning against them if the mainstream sours on 5G and votes accordingly.