UK government says online platforms must act to prevent hateful and violent content

The UK government on Wednesday urged social media platforms to take responsibility for harmful online content amid the surge of violent anti-immigration riots sweeping the UK and the circulation of an online anti-immigration “hit list.” Government regulator Ofcom has released an open letter to UK online service providers about content on their platforms that may incite violence and criminal offences in the UK.

The ongoing violent far-right riots and a series of online incidents have drawn concern about the risk online platforms may pose in inciting hatred, violence, and criminal offences. The riots were initially sparked by online disinformation claiming the perpetrator of a violent attack in Southport was a Muslim migrant. Following this, a list of immigration lawyers and services throughout the UK was circulated on Telegram as anti-immigration protest targets.

UK Law Society president Nick Emmerson wrote to the prime minister, lord chancellor, and home secretary on Monday, sounding the alarm on the list. He said:

I have written […] asking that the threats against the legal profession be treated with the utmost seriousness. A direct assault on our legal profession is a direct assault on our democratic values, and we are supporting our members who are being targeted.

The Law Society reportedly said, “39 immigration centers have been named in online discussions, as well as the names of firms and individual solicitors.”

The UK government says social media platforms “clearly need to do far more.” Jim McMahon, minister for Housing, Communities and Local Government, told the BBC, “We don’t know if they will transpire to be protests like we’ve seen in other places or whether it’s a list that’s intended just to cause alarm and distress or even to provoke.” The BBC says it approached Telegram for comment but has not yet received a response.

The Institute for Strategic Dialogue (ISD) wrote, at the time of the Southport attack, that “algorithms amplify false information” and that “anti-Muslim and anti-migrant users weaponize false information.” The ISD found that Telegram is largely unmoderated and “served as a hub for domestic and international far-right communities” to spread hateful rhetoric, “distribute locations and targets for further action,” provide “practical advice for would-be rioters,” and “encourage extremist violence.”

Ofcom said it has been “engaging with various online services” to see how best they can prevent or mitigate the risk of their platforms being used to spread disinformation and hatred, “provoke violence and commit other offences under UK law.” It outlined existing online safety obligations, such as the requirement that “UK-based video-sharing platforms must protect their users from videos likely to incite violence or hatred.”

Ofcom also discussed the impact of the Online Safety Act, which has yet to come into force. It said that while the act will set out responsibilities for online services and platforms, “[s]ome of the most widely-used online sites and apps will in due course need to go even further” to protect users. It called on online platforms to “act now”:

New safety duties under the Online Safety Act will be in place in a few months, but you can act now to make your sites and apps safer for users.

The Crown Prosecution Service also warned against sharing online content that incites violence. It wrote on X (formerly Twitter) that it is taking “online violence seriously” and “will prosecute when the legal test is met.”