Technology

How Will An Online Harms Regulator Work?

Government wants to introduce a regulator that makes tech firms legally responsible for a wide range of online harms.


Plans for new laws making tech giants and social networks more accountable for harmful content online have been set out by the Government, in a bid to make the UK one of the safest places in the world to be online.

Here is everything you need to know about the long-awaited white paper:

– Will there be regulation?

An independent regulator will be responsible for ensuring tech companies abide by a new duty of care and code of practice.

The Government is currently consulting on whether this should mean the creation of a brand new regulator or whether it should be housed within an existing regulator, such as Ofcom.

Culture Secretary Jeremy Wright met tech bosses, including Facebook’s Mark Zuckerberg, in February to discuss the plans to tackle online harms (Niall Carson/PA)

– What will the regulator do?

It is proposed that the regulator be given powers to ensure all companies affected by a new regulatory framework fulfil their duty of care.

Clear safety standards will be set out, and companies will be required to report to the regulator on how they are meeting them.

Tech firms could be issued with substantial fines for any proven failures, along with a requirement to publish a public notice detailing where they went wrong.

The Government is also consulting on giving the regulator even tougher powers to make individual senior managers criminally liable for any breaches.

This could extend to preventing offenders from appearing in search results, app stores or links in social media posts, before requiring internet service providers to block non-compliant websites or apps entirely as a last resort.

– What is considered an online harm?

The Government has set out a wide list of what it considers to be online harms, covering both content that is clearly illegal and content with a less clear legal definition.

Illegal harms that will be tackled include:

– Child sexual abuse and exploitation
– Terrorist content and activity
– Organised immigration crime
– Modern slavery
– Extreme pornography
– Revenge pornography
– Harassment and cyberstalking
– Hate crime
– Encouraging or assisting suicide
– Incitement of violence
– Sale of illegal goods or services, such as drugs and weapons
– Contempt of court and interference with legal proceedings
– Sexting of indecent images by under-18s

Many of the measures are designed to protect children and the vulnerable online (Peter Byrne/PA)

The harms to be covered that have a less clear legal definition include:

– Cyberbullying and trolling
– Extremist content and activity
– Coercive behaviour
– Disinformation
– Violent content
– Advocacy of self-harm
– Promotion of Female Genital Mutilation

The framework will also make companies liable for exposing children to content that is legal for adults, such as pornography.

– Who will regulation affect?

Any companies that let users share or discover user-generated content or interact with others online will be affected by the regulations – particularly social networks such as Facebook, Instagram and Twitter.

However, it will also extend to other parts of the web, including file-hosting sites, forums, messaging services and search engines.

The proposals will affect a number of web-based firms (Nick Ansell/PA)

– Why is the Government cracking down on online content?

The Government wants to stamp out a host of online harms, including illegal activity and content ranging from terrorism-related material to child sexual exploitation and abuse, and inciting or assisting suicide.

It also wants to tackle areas that are not illegal but it believes could be damaging to individuals, particularly children and vulnerable people.

It has concluded that self-regulation is no longer working and therefore wants to introduce new legally binding measures that make the tech companies hosting such content responsible for blocking it or removing it swiftly.

Molly Russell, 14, who took her own life in November 2017 (Family handout/PA)

The urgency to act has been highlighted by a number of cases, such as that of 14-year-old Molly Russell, who was found to have viewed content linked to self-harm and suicide on Instagram before taking her own life in 2017.

More recently, terrorism-related material has also been a concern, following the mosque attack in New Zealand, which was livestreamed on Facebook.

– What next?

A 12-week consultation on the proposals will now take place, after which the Government will publish its final proposals for legislation.
