The British government announced on Monday that officials are investigating the potential role of foreign states in spreading disinformation online, which may have contributed to recent violent
protests. The government also warned social media companies to intensify efforts to curb the spread of false information.
The unrest erupted last week after the murder of three girls at a Taylor Swift-themed event in Southport, a seaside town in northern England. Misinformation on social media falsely identified the suspected killer as an Islamist migrant, leading to protests by anti-Islam and anti-immigration groups across various towns and cities in Britain. These protests targeted mosques and hotels housing migrants, resulting in violent clashes with police.
Jacob Davey, Director of Policy and Research at the Institute for Strategic Dialogue (ISD), emphasized the significant impact of online disinformation and the role of social media platforms in the unrest. "The spread of this information was central to the horrific events of the weekend," he told Reuters.
The government, which has previously accused countries like Russia of sowing discord, is now assessing the extent of foreign involvement in amplifying false messages. A spokesperson for Prime Minister Keir Starmer mentioned, "We have seen bot activity online, possibly involving state actors, amplifying disinformation. It is something we are investigating."
Elon Musk, owner of X (formerly Twitter), responded to a post blaming mass migration and open borders for the disorder in Britain by stating, "Civil war is inevitable."
Davey pointed out that disinformation is spread not only by those intending to cause trouble but also by social media platforms themselves due to algorithms designed to amplify certain narratives. "You saw that in the trending topics in the UK, with disinformation appearing in searches related to Southport. The business model of these platforms is crucial."
High-profile anti-immigrant activists also played a role in spreading misinformation. Stephen Yaxley-Lennon, known as Tommy Robinson and a former leader of the anti-Islam English Defence League, has been cited by the media for spreading false information on X. Although banned from the platform in 2018 for producing hateful content, he was reinstated after Musk acquired the platform.
Challenges and Responses
Britain's new Online Safety Act, introduced last year to tackle issues such as child sexual abuse and the promotion of suicide, may not be effective in addressing this situation. Professor Matthew Feldman, a specialist on right-wing extremism at the University of York, noted that the act does not appear to cover "online incitement to offline criminality or disorder."
Although far-right groups are less organized than they were a decade ago, modern technology has enabled extremists and influencers to gain visibility and capture public attention. The ISD's Davey said the recent unrest is part of a longer process in which extremist groups have grown more confident. Incidents of unrest outside migrant centers, disorder at Remembrance Day events, and large gatherings in support of Yaxley-Lennon in central London illustrate this trend.
"This is the culmination of a longer process where extremist groups have become more emboldened," Davey said. Photo by Police Barricade, Parliament Square by Nigel Mykura, Wikimedia commons.