The rise of social media platforms has given us instant news updates and direct digital lines to brands, celebrities and politicians. Recent statistics suggest there will be 4.89 billion social media users worldwide in 2023 - a staggering 59.9% of the world's population. Yet this hasn't come without challenges: in the digital world, misinformation and fake news can snowball, in some cases proving dangerous and harmful.
The power of lies
Misinformation can influence critical topics such as elections and public health (as we saw during Covid-19). It can drive people to take the law into their own hands (Pizzagate), and it can cause real financial damage to brands (Pepsi, MetroBank) or harm their credibility and reputation. It can be used to influence public opinion, spread hate and sow division through clickbait and sensationalism, and it undermines trust in legitimate journalism.
Social media platforms still provide an avenue to quickly disseminate and amplify this kind of content, and with the emergence of powerful artificial intelligence tools like ChatGPT and Midjourney, realistic, credible-looking imagery and even video can now be produced to seemingly substantiate false claims (the Pope in a puffer jacket, anyone?). More concerning still, professional disinformation businesses now operate in 48 different countries.
PR can help
Tackling the creation and distribution of misinformation and fake news is going to require a united effort from governments, technology companies and the media, and that includes the PR and communications industry. Cision's recent Global State of the Media Report found that journalists view newswires, industry experts and press releases as the most trustworthy ways to gather information. Communications professionals need to take the time to verify the credibility of their data and corroborate facts before providing reporters with information.
Communications professionals also need to be prepared for their organisation to be caught in the crossfire of a disinformation campaign or conspiracy theory, and there are things you can do before it happens. Pre-bunking has been shown to be effective: if you know of risks, or false claims are already circulating about your brand or industry, don't wait for them to become so widely circulated that the lie becomes the truth. Get ahead of the story by using your own channels to explain what the claim is, where it comes from and why it's untrue.
Whilst it's not possible to predict the topic or timing of a misinformation campaign against your brand, brands are increasingly politicised around elections and international sporting events, and ESG topics are a frequent target. So, before the next election, make sure you've effectively shared your ESG commitments and accomplishments, because anything you reveal after an accusation will be seen through the lens of the lie. You'll be accused of inventing facts to defend the brand, so point to evidence of real activities that took place before the incident or accusation. And don't be afraid to work with competitors in your market to agree on 'universal' truths about your industry - subtly different lines only create more space for misinformation to thrive.
Whilst AI is almost certainly going to be involved in the development of harmful content, it can also be part of the solution. We are implementing content classification algorithms to help us rapidly detect controversial content and provide early warning of risks. Identifying harmful narratives in real time is crucial, so you can form a response and debunk them before the media or an influencer accidentally amplifies them. As we've seen with stories around the world, it may also be worth meeting with your legal colleagues now to agree what the red lines are and when you'll take legal action.
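To make the early-warning idea concrete: at its very simplest, content classification can be a weighted watch-list of risky phrases that flags posts for human review. Production systems use machine-learned models rather than keyword lists, and the terms, weights and threshold below are all invented for illustration - this is a hypothetical sketch of the principle, not Cision's actual system.

```python
# Hypothetical early-warning scorer for potentially harmful narratives.
# Terms, weights and the threshold are illustrative placeholders only.

RISK_TERMS = {
    "hoax": 2.0,
    "cover-up": 2.0,
    "exposed": 1.5,
    "scandal": 1.5,
    "they don't want you to know": 3.0,
}

def risk_score(text: str) -> float:
    """Sum the weights of watch-list terms found in the text (case-insensitive)."""
    lowered = text.lower()
    return sum(weight for term, weight in RISK_TERMS.items() if term in lowered)

def flag_for_review(text: str, threshold: float = 2.5) -> bool:
    """Flag content whose cumulative risk score meets the threshold."""
    return risk_score(text) >= threshold

posts = [
    "Quarterly results announced today",
    "EXPOSED: the scandal they don't want you to know about",
]
flagged = [post for post in posts if flag_for_review(post)]
```

The point of the sketch is the workflow, not the keywords: content is scored as it arrives, and only items crossing a threshold are surfaced to the comms team, which is what makes real-time monitoring of thousands of posts practical.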
As humans, we tend to spot trends only once they've happened, but with the help of AI we can process thousands of data points almost instantly and identify fake news as it starts to form. It's that relationship between people and technology - AI's speed with a layer of our own insight and expertise - that will be vital in the fight against misinformation and fake news.
Written by Antony Cousins, executive director of AI strategy at Cision