Claire Wardle
Ph.D., Professor
Brown University
US
As this is a column about misinformation, most people probably expect a deep dive into the perils of generative AI, or into the chaos that disinformation might cause in one of the many elections scheduled to take place this year.
But as I sit at my laptop preparing to write this column, my eyes are drawn to X and an example of disinformation that has just appeared on my feed. It is a video that opens with an image of President Zelensky in army fatigues. The image looks identical to a BBC News report, with the same font, graphics and official BBC logo. The video claims that Zelensky received a severance package of $53 million to become UK ambassador. It’s not a deepfake. It’s a cheap fake that uses the tried-and-tested method we call ‘imposter content’: using a known logo or name to bypass someone’s training to ‘investigate the source’.
It’s an important reminder that for all the discussion of generative AI and election-related disinformation, every day brings a continuous stream of falsehoods, much of it not very sophisticated: cheaply made, cheaply disseminated, and all of it polluting our information ecosystem. While it’s tempting to focus on what’s new and shiny, we cannot forget the harm caused by the simplest techniques, which still have the potential to shape the way people think about almost any issue a society cares about.
I use this Zelensky example to underscore that while AI technology is evolving at a worrying speed (the new OpenAI tool Sora, unveiled last month, showcased how easy it is to create highly realistic 60-second videos from a one-sentence prompt), the biggest challenge remains our psychological biases. For pro-Kremlin supporters who desperately want to believe Zelensky might soon be out of the picture, this rumor does exactly what it needs to do. Almost a decade after misinformation became a global talking point, it is still causing harm, because we haven’t sufficiently invested in cradle-to-grave education programs to help people understand how their brains are being targeted and how vulnerable we all are.
So while we absolutely need to be prepared for the impact of generative AI tools, we also need to remember how easy it is to cause harm with little or no technology. Back in 2019, a political operative created a 24-hour news cycle after he took a video of the then US House Speaker Nancy Pelosi and slowed it down slightly to make her appear drunk and slurring her words. In 2018, an impersonator who sounded identical to the then Brazilian presidential candidate Jair Bolsonaro recorded what sounded like a voice note from his hospital bed (he’d just been stabbed on the campaign trail), and it took audio forensics specialists three days to figure out it wasn’t him. This past week, an image of Donald Trump surrounded by a small group of Black supporters was being shared. It turned out to have been created by generative AI, but the same result could have been achieved using Photoshop.
So when it comes to mitigating misinformation, the aim shouldn’t be to wait for fancy new AI-detection tools or new election-related misinformation initiatives. We have to keep pumping resources into educational initiatives on a continuous basis, not just when an election is approaching. The only way we build resilience in communities is by teaching people not only the tactics and techniques that might be used against them, but also how our brains too often work against us. We need people to be much more aware of their own biases, and of the power of existing worldviews to shape the way they see any new information. We need to teach people that disinformation is rarely about persuading people to change their minds; it’s about strengthening their pre-existing beliefs in the hope of widening existing divisions within society. Chaos, confusion and division are always the goal.
As many countries see increased levels of polarization, it becomes easier for disinformation actors to cause harm. When the goal is to widen existing divisions, growing distrust and even hatred of the ‘other side’ provide ripe conditions for disinformation campaigns to be effective. That is why we need to educate people that all of us are vulnerable to believing information that reinforces our worldview. Educational initiatives, aimed at under-10s and over-70s alike, should focus on how our brains are being hijacked, not on how to google a headline more effectively or whether to trust Wikipedia.
The technologies will continue to get smarter, but we all need to understand that our brains won’t. We’ll always be hardwired to connect with others in our ‘in-group’ over those in the ‘out-group’. Understanding that is the only way we build resilience against whatever the latest tool makes possible.