David Colon: "Information is a common good, a public good, as essential as the air we breathe."
David Colon during Funders' Day. © Cyril Marcilhacy
David Colon, senior professor of history and researcher at Sciences Po Paris, is a media specialist and author of "The Information War: States in the Battle for our Minds". In this interview, he sheds light on the mechanisms of disinformation and mass manipulation, as well as solutions to counter these threats to democratic regimes.
In The Information War: States in the Battle for our Minds, your latest book, you analyse the mechanisms of this new form of war being waged today in the information space, with weapons such as disinformation, misinformation and information manipulation. What do these concepts cover?
Disinformation is the dissemination of information known to be false, with the intent to cause harm. Misinformation, by contrast, is the act of spreading false information without knowing it is false. Information manipulation covers both, and also includes the deliberate use of entirely authentic information to distort the perception of reality; it is therefore the broadest of the three concepts.
Who are the main actors behind disinformation today?
These are primarily people who spread false information to generate advertising revenue. Disinformation is a huge market, worth over €10 billion a year. Traditional media also relied on advertising revenue and, in some cases, resorted to disinformation, but the rise of digital media, capable of reaching billions of people each day, has taken us into a whole new dimension.
The design of algorithmic systems and the pursuit of maximum growth encourage the viral spread of hateful content and content that undermines social cohesion. The system would, however, be easy to change: removing the simple "share" button, for example, would not prevent anyone from passing information along by copy and paste, but it would curb the spread of the most toxic content, since that is the kind shared most. It would simply require adjusting the algorithms, but there is no will to do so. The trade-off between platform growth and advertising revenue on the one hand, and content integrity on the other, is always decided in favour of growth.
Disinformation also comes from malicious actors acting on behalf of states, non-state organisations or industries. Russia, China, Iran... authoritarian regimes around the world realised 30 years ago that the rise of the internet and social media posed a threat to their power. They feared the spread of this virus of freedom, human rights and multiparty politics, and so set out not only to control the internet within their own territory but also to turn this informational tool against democratic regimes themselves. The best way to survive, in their view, is to kill democracies from within. We are therefore witnessing an information war, a war to the death between democratic and authoritarian regimes.
How does information manipulation threaten democratic regimes?
The strategy is to attack the very foundations of any effective democracy. The first foundation is trust in institutions, the electoral process, the results of the ballots and even the media. Undermining this trust is the surest way to destabilise democracies.
Next, the goal is to weaken social cohesion by fostering divisions wherever they already exist in society and amplifying their perception, ensuring that a divisive issue becomes central in public debate and encourages growing political polarisation.
Finally, another very powerful means used by anti-democratic actors is to undermine confidence in the accuracy of reported facts, to weaken the boundary between fact and opinion, to weaken the distinction between what is true and what is false, and thus to deprive citizens in democratic societies of the ability to make rational decisions. The rise of conspiracy theories is due to a multitude of actors who aim to create doubt, cutting off an increasing number of individuals from their perception of reality.
This information manipulation has become even easier since we switched to a new informational regime and now mostly access information through social media. We saw this clearly during the pandemic, for example: a study published one year ago found a very significant difference between those who had viewed traditional online media and those who had obtained their information via Facebook. The former were exposed to less than 3% unreliable content, while the latter encountered more than 20%.
By allowing these platforms to form an oligopoly and acquire a power unprecedented in human history, we are now faced with an unprecedented systemic threat to our democracies.
Are there any solutions to counter this informational and democratic threat?
Regulation, particularly at a European level, is an illusion. The DSA (Digital Services Act) has the merit of existing, but due to intense lobbying at all levels, the political will to apply it is not there. In Brussels, around 150 employees are responsible for implementing the DSA, while the social media platforms, all of them non-European, are hiring 1,000 people to obstruct its implementation.
In this context, raising awareness is crucial, but it is also necessary to provide immediate, concrete alternatives and ways to navigate an increasingly destabilising information chaos. For example, the influencer marketing market has grown 10-fold in eight years. Much of the economic, political and social behaviour of the younger generation in particular is now shaped by influencers on YouTube, TikTok and Instagram. This is why, as proposed by the Etats Généraux de l'Information, we could consider certifying influencers who would commit not to promote a product they cannot vouch for, not to spread information they know to be false, and not to act on behalf of a foreign state to interfere in an electoral process or democratic debate. If we simply allowed social media users to distinguish between certified influencers and those who are not, we would solve part of the problem.
On the business side, directing advertising campaign budgets towards these ethical influencers would significantly reduce the toxicity of information environments. The same applies to media advertising investment: if major companies were to direct their advertising investment towards quality or certified media, this would reduce the proportion of their advertising budgets that ends up in the pockets of disinformers.
But the most practical and effective solution would be to create European social media platforms whose algorithmic design complies with European regulations and upholds our values and freedoms of information, opinion and expression.
What virtuous initiatives are emerging from civil society?
Recently, the Hello Quitte X collective set an example. It has designed a data portability tool that enables users to leave a platform or social network without losing their subscribers or subscriptions. Data portability has been enshrined in European and French regulation since 2016, yet for eight years it was never implemented, because the platforms refused to let their users leave. The Hello Quitte X collective has just changed the game.
Another example, in the field of artificial intelligence, is the recent partnership between Agence France-Presse and Mistral AI, which will allow Mistral to develop its machine learning systems using reliable AFP content. These are highly promising alternatives for escaping disinformation narratives.
How can philanthropy contribute to these major challenges?
Philanthropy can help protect the democratic space by supporting initiatives aimed at safeguarding the integrity of the information ecosystem or by assisting struggling local media. It can also help to strengthen the reliability of information, particularly digital content, by creating new social media platforms.
We must uphold the idea that information should not be solely a commercial commodity but a common good, a public good, as essential as the air we breathe. We do not accept – and rightly so – that our air be polluted. So why should we accept the pollution of information?
Given the seriousness of the situation, both geopolitical and digital, I have no doubt that there will soon be a surge of activity in this area and that the combination of goodwill, know-how and external funding will enable us to create the social media we need.
FURTHER READING
→ Futures Philanthropy: Anticipation for the Common Good
→ Sylvain A. Lefèvre: "Should we trust 'trust-based philanthropy'?"
→ There's No Philanthropy Without Trust
→ Philanthropy: an ally of democracy