New UK regulations to hold social media companies accountable for their content
Online Safety Act set to address rise in disinformation, says expert
By Zuhal Demirci
LONDON - (AA) - The UK's Online Safety Act will impose new responsibilities on social media companies and make them accountable for their content, according to an expert.
The new law adopts a zero-tolerance approach to protecting children on social media platforms.
The bill became law on Oct. 26, 2023, and will be fully enforced next year. The UK's media regulator, Ofcom, will then oversee its implementation and will have the authority to take action against companies that fail to meet their new obligations.
Speaking to Anadolu, Callum Hood, head of research at the Center for Countering Digital Hate (CCDH), said that while the internet and social media have brought great benefits to societies around the world, online platforms can also lead to some unwanted consequences.
Hood highlighted that it is becoming increasingly clear that these consequences can cause “real damage” to societies and democracies, pointing to the far-right violence in the UK that followed the spread of disinformation on social media as one such example.
He explained that the unrest was driven by accounts spreading disinformation, and that the Muslims and immigrants targeted on social media were the ones harmed as a result.
“I think in the UK, the Online Safety Act, what it's trying to do, on a simple level, is what many forms of social media regulation that have been gradually inching forward in various parts of the world are now trying to do, which is have some kind of accountability for social media platforms,” said Hood.
He noted that the negative impact of disinformation seen in the UK is also being observed in other parts of the world.
“There's no accountability for them. There's no price to pay. So that's what needs to change. There needs to be some kind of accountability when they take these very deliberate decisions that cause harm,” Hood added.
- Far-right groups use social media to incite violence
The act aims to curb unverified reports and online hate speech, and it has gained renewed attention after far-right groups used social media to incite violence.
Following a stabbing in the English seaside town of Southport on July 29, in which 17-year-old Axel Rudakubana killed three children and injured 10 other people, far-right groups used platforms such as Telegram, TikTok and X to spread unverified claims and coordinate protests against migrants and Muslims.
The events led to widespread clashes with police, numerous arrests and detentions, and significant property damage across the UK.
British Prime Minister Keir Starmer hinted at tougher measures if social media companies do not do more to remove harmful content, accusing them of fueling violence instigated by far-right groups.
- New Online Safety Act
The new law will criminalize sharing false or threatening content intended to cause psychological or physical harm, and will impose new responsibilities on social media platforms to remove illegal content such as incitement to racial hatred and other criminal activity.
Aimed at protecting both children and adults online, the 286-page law, set to fully come into effect in the second half of next year, will require technology companies to take greater measures to safeguard children from harmful material.
Social media platforms will need to remove content related to child sexual exploitation and abuse, coercive behavior, promoting or facilitating suicide or self-harm, animal cruelty, illegal drug or weapon sales, and terrorism.
Providers will also need to implement systems to reduce the risk of their services being used for illegal activities.
Companies failing to comply could face fines of up to £18 million ($24 million) or 10% of their global revenue, whichever is greater.
Senior executives may also face criminal charges if they fail to comply with Ofcom's information requests.
Ofcom will also be able to hold companies and senior executives criminally liable, where they are found culpable, if platforms fail to comply with enforcement notices related to child sexual exploitation and abuse.
The regulator is continuing public consultations on the act’s obligations.