
Factum Perspective: Fueling the Flames – Riots in the UK

 


By Amantha Gunarathna

The recent UK riots have highlighted the destructive power of social media disinformation and its ability to incite real-world violence.


Platforms like X and Telegram played crucial roles in disseminating this misinformation, with far-right groups using them to organize and incite violence. The riots draw disturbing parallels to the race riots of the 1970s, when similar racial tensions were stoked by inflammatory rhetoric, most notably Enoch Powell’s 1968 “Rivers of Blood” speech.

The recent riots were fueled by anti-immigrant sentiment and misinformation. They erupted following a tragic incident in Southport, where three young girls were fatally stabbed during a public event. Despite the suspect being a UK national, false information rapidly spread on social media, claiming that the attacker was an asylum seeker of Muslim origin. This disinformation galvanized far-right groups and individuals across the country, leading to violent protests targeting Muslims, migrants, and their places of worship.

Far-right activists, including well-known figures like Tommy Robinson, used these platforms to circulate false narratives and inflammatory rhetoric designed to provoke anger and other strong emotions, significantly contributing to the unrest. Telegram, with its less regulated ecosystem, has become a hub for far-right groups to coordinate actions and spread extremist views, further exacerbating the situation.

The UK’s Online Safety Act, passed in 2023, was designed to address harmful content on the internet. Yet while the Act has been hailed as a significant step towards regulating such content, its practical effectiveness has been called into question.

Its primary purpose is to make social media platforms and tech companies more accountable for the content they host, aiming to reduce the spread of illegal and harmful material, including disinformation and hate speech. The Act empowers the media regulator Ofcom to enforce these rules, with the ability to impose fines on, or even ban, platforms that fail to comply. On paper, it promises safer online environments for users.

Although the Act gives Ofcom the power to enforce rules and impose fines on non-compliant platforms, many of its key provisions are not yet fully operational, leaving a significant gap in enforcement. Moreover, the Act’s reliance on tech companies to self-police their platforms has proven to be problematic, as these companies often prioritize profit over stringent content moderation. The slow rollout of the Act’s enforcement mechanisms has also allowed harmful content, including disinformation and hate speech, to flourish.

Far-right groups and individuals took to the streets, targeting Muslim communities, migrants, and their places of worship. Specific incidents included attacks on mosques, looting of minority-owned businesses, and arson at hotels housing asylum seekers. The riots caused extensive property damage, with cars and buildings set on fire, and disrupted several cities, including London, Manchester, and Belfast. Over 700 arrests have been made, with more than 300 people charged and dozens of police officers injured.

Elon Musk has come under significant criticism for his comments on these riots, particularly for posts on X (formerly Twitter) that many have deemed inflammatory. Musk went as far as to describe the situation as an inevitable “civil war,” a statement condemned by UK government officials as “deeply irresponsible.” His remarks were seen as escalating tensions during a period of intense unrest.

Further concerns have been raised about Musk’s platform, X, which has allowed far-right figures like Tommy Robinson to amplify disinformation and hate speech that contributed to the unrest. According to research by the Center for Countering Digital Hate, posts from figures like Robinson have garnered millions of views, with little intervention from the platform to mitigate the spread of these harmful narratives.

Critics argue that Musk’s actions, including the reinstatement of previously banned users such as Tommy Robinson, have exacerbated the spread of disinformation. His engagement with inflammatory content, including retweeting a video from Robinson, is seen to have boosted the reach of far-right rhetoric, contributing to the unrest.

The UK government has responded by urging social media platforms to take action against the spread of hate speech and disinformation. There are discussions about whether X could face legal consequences, or even be banned in the UK, if it fails to comply with upcoming regulations under the Online Safety Act. The unrest has intensified calls for regulation of, and accountability from, social media platforms in handling such sensitive situations. Musk and others have responded by claiming that such measures would infringe on freedom of expression.

The hate speech fueling the recent UK riots originated primarily from far-right groups, which exploited Telegram’s less regulated environment to circulate racist, Islamophobic, and anti-immigrant rhetoric and mobilize their followers towards violence. Examples include inflammatory posts blaming migrants for social issues.

Ofcom, the UK’s media regulator, was granted the authority to enforce the Act’s rules, with the power to levy fines or even ban noncompliant platforms. However, the Online Safety Act, introduced by the previous Conservative government, has faced criticism, as many of its enforcement mechanisms are not yet fully operational.

These riots draw parallels to the race riots of the 1970s, which were exacerbated by inflammatory rhetoric, notably Enoch Powell’s “Rivers of Blood” speech of 1968. Powell’s speech, which warned of Britain being inundated by immigration, stoked resentment among white Britons and provoked attacks against Asian and African communities.

Today, similar dynamics are at play, with social media amplifying divisive narratives that echo Powell’s speech. The recent riots were triggered by misinformation and hate speech targeting immigrants and minorities, much like the racial scapegoating in the 1970s.

Yet while the mediums have evolved, the underlying fears and tensions remain strikingly similar. The key difference lies in the scale and speed at which disinformation can spread today, fueled by digital platforms like Telegram and X, making the situation more volatile. Despite these differences, the core issues of racism, immigration, and social division persist, revealing deep-rooted challenges in British society that have yet to be fully addressed.

Communities need to be protected from misinformation. This is crucial for fostering social harmony and ensuring that immigrants are treated with dignity and respect; regardless of their status, every person deserves basic human rights. As the riots in the UK recede, it has become clear that these rights must be safeguarded by a more effective regulatory ecosystem, one that combats violent forms of disinformation and hate speech aimed at minority communities.

Amantha Gunarathna is an educational consultant, a member of the British Computer Society, and an alumnus of the Open University of Sri Lanka.

Factum is an Asia-Pacific focused think tank on International Relations, Tech Cooperation, Strategic Communications, and Climate Outreach accessible via www.factum.lk.

 

The views expressed here are the author’s own and do not necessarily reflect the organization’s.

