
Did social media fan the flames of riot?


A crowd of people, some with their faces covered, standing in front of a police van. Some appear to be throwing a wheelie bin while many others are holding up their phones to film the scene (Getty Images)

Police said online disinformation helped to spark a riot after three children were killed in Southport

It’s become a familiar pattern of events: a violent, terrifying attack unfolds, innocent people are killed, and social media is set alight with unfounded – and often incorrect – accusations about the assailant’s identity and motivation.

Think back to the stabbing attacks in Sydney earlier this year, falsely blamed on a Jewish student, or even the assassination attempt on Donald Trump in July.

It’s the same with Monday’s attack on a children’s holiday dance and yoga session in Southport, England.

A false name – attributed to the 17-year-old accused of killing three little girls as well as injuring eight other children and two adults – spread like wildfire across X, formerly known as Twitter.

The BBC is not repeating the false name here to avoid spreading misinformation. Media organisations cannot give the suspect’s real name for legal reasons, but Merseyside Police have said the name shared on social media is incorrect.

Nevertheless, posts on X sharing the fake name were actively promoted to users and racked up millions of views.

Just as with the Sydney attacks and the attempted shooting of Mr Trump, X was the focal point for untrue claims before they spilled onto other sites.

Police officers wearing riot gear, facing the camera with shields and batons raised, while in the background a police van is burning (Getty Images)

Police say about 200 to 300 people were involved in the violence

It wasn’t just this fake name either. There were false claims the attacker was a refugee who arrived in the UK by boat in 2023 and unfounded speculation he is Muslim. Some of these posts were accompanied by Islamophobic and racist hate.

Merseyside Police have confirmed that the 17-year-old they have arrested was born in Cardiff to Rwandan parents, that he appears to have no known links to Islam, and that they are not currently investigating the attack as terror-related.

All of this contradicts many of those claims – but it didn’t stop them from poisoning an already toxic online atmosphere.

So did these false and unfounded social media posts fan the flames of unrest in Southport?

Many different factors contributed to the riot – which was led by protesters who expressed anti-immigrant and anti-Muslim views – not just social media.

After all, protests and violence like this long predate the existence of the internet. Everything from political rhetoric, racism and wider anger about immigration to questions about government and police transparency and timing likely played a part.

The fact that some of those at the rally decided to target Southport Mosque suggests that they may have been influenced by the unfounded online accusations that this had been an Islamist terror attack. The police themselves pointed to disinformation online as playing a part in the violence.

There was discussion of the rally on regional anti-immigration channels on the Telegram app. Protest movements are often organised in closed chat groups we can’t access.

But false and unfounded claims about the attack expanded beyond the usual online spaces where these kinds of protests are organised.

Content being seen by millions of X users, rather than just the fringes of social media, could also normalise some of the hate being pushed.

Some of those amplifying these ideas included prominent political commentators and politicians. Others were less well-known, but with a reputation for promoting evidence-free conspiracy theories.

An imam at Southport Mosque inspecting the damage inside, where missing windows and damaged ceiling tiles can be seen (PA Media)

Rioters smashed windows and burned fences at the mosque

Many of these accounts have purchased blue ticks, which means their posts feature more prominently on others’ feeds. They are based all around the world too, with several right-leaning American profiles boasting hundreds of thousands of followers becoming very involved.

They reshared variations of the false claims, using them to push anti-immigration views and ideas. That was in turn met by a backlash denouncing the speculation – which also racked up millions of views, though proportionally fewer than the original claims.

Because the issue was picking up so much engagement, X’s algorithm promoted it further – so that my own feed, and those of several others who reached out to me, were dominated by these posts as soon as we logged on to our X accounts.

Some of the false claims, though, seemed to originate not from prominent political commentators or known conspiracy accounts, but from anonymous profiles and pseudo-news accounts.

Among the first accounts to share the false name, for example, was Channel 3 Now, which purports to be a legitimate news outlet, but whose origins are very unclear. The social media profiles belonging to the channel suggest it is based in the US or Pakistan, with little information about who actually works for the site.

The site has since issued a “sincere apology and correction” for its article and posts, saying “we deeply regret any confusion or inconvenience this may have caused” and admitting the content was “not accurate”. It has not replied directly to the BBC’s questions.

Other profiles – not using real names or images – copy-and-pasted the fake name to their own timelines across different, separate accounts to share the claim more widely.

Some of these profiles seemed to belong to real users based in the UK, but others sharing this had the hallmarks of inauthentic accounts. These accounts can be automated or run by groups aiming to manipulate the online debate, and have a track record of posting only divisive content about issues like immigration.

A crowd of people, ranging from young children to grey-haired adults, light candles and leave floral tributes to the victims of the Southport attack, some bowing their heads in sadness, others holding each other for comfort (EPA)

More than a thousand people attended a vigil for the victims of the attack – separate from the protest

This isn’t just about pushing political agendas, either. Tragedies like this can also be exploited online to accumulate likes, views and follows – which in turn can be monetised.

X has again been a focal point of this online frenzy, since the platform began allowing its paid blue-tick users to earn a portion of ad revenue for their posts.

Their posts are more likely to be recommended on people’s feeds, and they may be incentivised to post more controversial or sensational content to boost their views.

This all seems to have had an effect on how attacks such as the one in Southport are discussed in the hours and days after they happen: these kinds of frenzies have become more frequent and intense since the changes to X.

On top of this, since Elon Musk took over the social media company, he has changed how it moderates false and hateful content – sacking dozens of employees who worked in these areas and introducing new measures such as community notes, in which users fact-check posts.

X has not responded to the BBC’s request for comment. The social media company says publicly that it defends and protects users’ voices on the site.

And while this is about a whole lot more than a tidal wave of social media posts, it’s yet more evidence of the way the online world can inflame pre-existing tensions in the real world.

And that’s a world where three little girls have had their lives cut short.
