There has always been fake news, but in the age of the smartphone and platforms like Facebook and Google, it has taken on a new character, as reported by nesta.org.uk.
Democracy depends on open public debate. Arguments, opinions and emotions form part of that debate, and facts may be disputed or interpreted in different ways for partisan purposes, but the trust that underpins democratic decision-making, including the outcomes of elections, requires some consensus on what is generally regarded to be ‘true’.
Over time, established media organisations can be said to have built up trust with their audiences in the accuracy of the information they provide.
This has historically been true of broadcast news in the UK, the impartiality of which has been both regulated and prized.
Major platforms have facilitated the distribution of fake news; the smartphone has enabled individuals to share it rapidly; and the algorithms underpinning the platforms serve to maximise consumption of fake news items and enable their monetisation.
Meanwhile, fake news pays. There are clear economic incentives for producers of fake news, tied to the level of engagement they can generate among social media users. Fake news websites can raise money from advertising on their sites through, for example, Google AdSense, or through Facebook advertising on their pages.
The creation of smart ‘bots’ and ‘troll factories’ has led to industrial-scale production of ‘fake news’ and ‘fake information’ in some instances.
Fake news arguably becomes more of a problem when the economic viability of mature mainstream news organisations is threatened by the two major platforms – Facebook and Google – which are devouring advertising revenue, especially online. The platforms hold significant – indeed, strategic – market power, and regulatory intervention is needed to strengthen the real news market.
So what can we do?
1. UK (and EU) regulatory frameworks require updating
Specifically, Ofcom needs to be given powers in this field: after all, the founding legislation for the current UK regulatory regime predates the creation of Facebook and the smartphone.
There needs to be effective recognition that certain Internet Intermediaries or Online Platforms are media organisations — in Ofcom’s words, editorially responsible providers — and should at least carry responsibilities in respect of protection of minors and prevention of hate speech: this has implications for the regulation of fake news.
Different aspects of these platforms may need specific regulation to ensure a level playing field in respect of the data held by the platforms and by users; the APIs which allow interconnectivity for other app developers; and the advertising metrics which the platforms report.
Facebook should be compelled to display the original logos of originating news organisations for items in its news feed, rather than a generic branding. The powers of the Electoral Commission also need to be revisited, especially in the context of personally targeted social media advertising. It is not clear that the scope and power of Facebook and Google have been considered by regulators on a cross-regulatory basis, and some of the issues raised by data, algorithms and advertising could cross the boundaries of regulatory knowledge and capacity.
2. We need to strengthen trusted media outlets
That means support for public service broadcasters like the BBC, which is itself investing significantly in fact-checking false claims on social media.
Facebook and Google make much of their money from content developed by others: consequently, a levy on their advertising receipts should be introduced to support media outlets hit by the platforms’ dominance of advertising.
3. Self-regulation
There are also measures which the platforms themselves could take to address some of these issues (self-regulation). Employing an Executive Editor, editors for different segments, and more internal human fact-checkers would help. Using the actual brands of originating media in the Facebook news feed or Google search results would also make it more immediately discernible whether a source was likely to be truthful.
Facebook needs to recognise its responsibilities and invest some of its profits in employing a strong editorial team or teams headed by an Executive Editor, supported by fact-checkers. Facebook should dedicate itself to being a champion of the truth, more actively support its users in taking action on fake news, and collaborate with genuine media organisations to ensure recognition for their content.
4. Digital literacy
Programmes of digital literacy could help citizens to gain greater understanding of the problems of fake news.
There is no silver bullet for the problems of fake news, but these actions would be a start. Tackling fake news requires more than voluntary action by the major online platforms, as there is substantial evidence that they face financial disincentives to act.
We have reached a situation in which Facebook and Google in particular hold unquestionable dominance of online advertising revenue, and many of the media organisations on which we could have relied to challenge fake news robustly have been badly hit by the impact on their own advertising revenues.
Now is the time to begin a full debate on the future of news and the best means to protect it, and regulators themselves should be challenged as to whether they believe a regulatory regime put in place before Facebook was founded and before the smartphone existed is still relevant in every respect to the challenges society now faces.