Disinformation, the election, and you
The internet and social media enable disinformation – knowing how and why can ‘inoculate’ us against the virus of lies during the federal election campaign.
The collision of the internet and politics has brought riots in Washington and Melbourne and people dying from a disease they refused to accept existed. Radical opportunists now spread disinformation – information deliberately aimed to deceive – which undermines democracy, spreads disorder, and erodes trust in institutions and between citizens.
Activist and author Ed Coper describes the internet as a place that “favours extreme opinion and falsehoods”.
“Disinformation spreads six times faster on social media than information.”
The Information Age hasn’t broken down cultural and political barriers; it has made us retreat into “like-minded bubbles of groupthink”. Autocrats have prospered and knowledge has been usurped by outlandish lies.
In his book Facts and Other Lies: Welcome to the Disinformation Age, Coper explains how this has occurred, and offers suggestions on how to “inoculate” us against the “virus” of disinformation. These lessons will be vital in the lead up to an election that could be decided as much by how candidates game social media algorithms as by their policies.
How does disinformation spread?
Google searches personalise results, providing the user with more of what they’ve already engaged with (which is usually what they already believe).
Muneera Bano, Senior Lecturer in Software Engineering, Deakin University, says this process leads to a “dangerous cycle” that can polarise people’s views, and keep them from the truth.
Coper says social media also uses algorithms that favour “inflammation, outrage, hyper-partisanship and false information”. Users are also shielded from information which may contradict a mistaken belief.
So anti-vaxxers see anti-vaccination content. Vaccine advocates see pro-vaccine posts.
This is called the filter bubble effect, where platforms only show you things you engage with. It can usher people down informational rabbit holes.
Coper says a lot of people who are members of extremist groups on Facebook were recommended those groups by the algorithm, not by people.
“It was the platform itself that said, ‘Hey, you sound like a neo-Nazi. You might be interested in joining this neo-Nazi group’...”
But worse than the prejudices of the technology are people prepared to exploit those biases to make money or foment unrest. Coper says those spreading false information have found incredibly powerful ways to “manipulate online platforms and manipulate us by spreading false information”.
A reasoned discussion of disinformation, for example, will not be able to compete against a “far-right meme about some wild conspiracy theory”.
Disinformation is cheap; it spreads organically, not through advertising, and it is contributing to a “very rapid breakdown in traditional social and ideological groupings across society” in western democracies.
Coper says one of the first realisations of this came with Brexit, where traditional UK political divides were “completely irrelevant”.
In Australia during the pandemic, far-right white nationalists allied with far-left hippie health Instagram influencers over vaccine mandates.
It’s all in our heads
We like to think we’re rational and assess information well, but that’s just not how our brains work, says Coper.
We’ll ignore or dismiss new information if it challenges held beliefs.
We are emotional, irrational decision-makers, and our brains favour colourful stories over bland facts.
“You know, conspiracy theories and disinformation wins, usually because it's a better story.”
Colourful stories beat “peer reviewed scientific articles” hands down.
To counter disinformation, we must impart an engaging story, because our brains respond to emotive triggers.
Not surprisingly, it is “Trump-style voices” which do best at taking advantage of our “mental shortcuts”. In part, this is because our brains tend to fit new information into something we already think is true, a pre-existing worldview.
The Coper soundbites are confronting, but revealing:
“Facts don’t cure falsehoods.
“Our brains aren’t wired for truth.
“Social media rewards lies.”
Our brain wants us to get along with our peers, so adopting their opinions is much more important to us than whether information we encounter is true.
Though many of us were raised to hope the best argument wins, Coper says we no longer have a “shared reality” where we can disagree about things and debate reasonably.
“Instead, we live in completely parallel realities that never intersect.”
He sees the upcoming election as a contest for the unpredictable new political constituency that coalesced around the pandemic, which has nothing to do with traditional party lines.
Inoculating against the disinformation virus
Coper’s first strategy to counter balderdash is to “prebunk”: anticipate what the disinformation will be and put out a stronger, dominant narrative first. That way, when the lies arrive, audiences are more resistant to them, because the lies contradict something they already believe.
“The other thing we can do is to talk about disinformation in general and to warn about disinformation and the sorts of tactics that we will see in the upcoming election.”
It is important to remember the overarching narrative you want to impart, rather than getting stuck arguing specifics.
“Facts are normally left to speak for themselves or in dull articles. When you try to get an issue across you must find good stories.”
One big no-no is to mock an opposing or nutty point of view – mocking disinformation “makes it less likely someone will change their mind”.
It’s very difficult to change someone’s mind. Think of yourself – what level of information would you need to have your mind changed on a key issue?
The secret is to find shared values and uncover the good intentions behind a person’s beliefs. Aim to have a conversation that doesn’t trigger what is central to someone’s identity.
Try to frame messages in a way which plays into their worldview, rather than contradicting it. Find common ground first. Ask yourself why that person would hold those views. Don’t go straight to the point of difference. Find points of overlap. Lower the stakes for them being wrong.
It is also crucial to make conversations about contentious issues private – take the discussion out of the publicly visible comments section.
Coper says the best response to disinformation, in most cases, is to do nothing – disinformation thrives on outrage and argument. Evaluate the disinformation when you see it. Ask yourself if it is harmful to your cause. Is it widespread? Is it being voiced by only a vocal minority? Would responding bring a benefit?
If you decide a response is warranted, it should be a deliberate effort through peer-to-peer spreading of a positive alternative narrative.
Unions and unionists are well placed to counter disinformation – member-based organisations have the advantage of having ears and eyes in many (work)places. We can identify threats early and make a timely response.