What is the Online Safety Bill, who is in favour, who opposes it, and how will it be enforced?

The long-delayed Online Safety Bill has become law after receiving royal assent.

It is one of the government’s flagship pieces of legislation for this term, but it arrives after several delays caused by controversy over its potential privacy implications.

Here’s what you need to know about the Online Safety Bill.

What does the Online Safety Bill aim to do?

The government never shies away from an opportunity to present the UK as a global leader, and has said the bill will make this country “the safest place in the world to be online”.

It aims to do this by imposing rules upon companies like Meta, Apple, and even Wikipedia, with the goal of keeping inappropriate and potentially dangerous content away from vulnerable eyes.

This includes things like self-harm material, which a coroner ruled last year contributed to teenager Molly Russell taking her own life.

The bill also aims to hold platforms responsible for illegal content such as child sexual abuse images, make adult websites properly enforce age limits, and stop underage users from creating social media accounts.

Perhaps most controversially, one of the proposals would force platforms like WhatsApp and Signal to undermine messaging encryption so private chats could be checked for illegal content.

I’ve been reading about this for ages – why’s it taken so long?

As that last section indicates, this is a very wide-ranging piece of legislation.

Other illegal content it wants to crack down on includes selling drugs and weapons, inciting or planning terrorism, sexual exploitation, hate speech, scams, and revenge porn.

Then there’s the potentially harmful but not illegal material, like eating disorder content and alleged bullying.

There have been concerns within the Tory Party that it is simply too far-reaching, potentially to the point of threatening free speech online.

Those worries weren’t enough to deter the bill’s former chief advocate, the then culture secretary Nadine Dorries.

Indeed, proposals got even tougher between the bill’s first pitch in 2019 and eventual parliamentary debut in 2022, adding measures like criminalising cyber-flashing.

That already long three-year gap was blamed on the pandemic, and subsequent delays have been exacerbated by prime ministerial downfalls – first Boris Johnson and then Liz Truss.

The bill now falls under the watch of Michelle Donelan, the technology secretary, who’s made some changes to alleviate criticism while still satisfying its supporters.

Who’s in favour?

Among the bill’s backers have been charities like the NSPCC, safety group the Internet Watch Foundation (IWF), bereaved parents who say harmful online content contributed to their child’s death, and sexual abuse survivors.

Ahead of the bill facing its final stages in parliament last month, a woman who suffered years of abuse on an encrypted messaging app was one of more than 100 people who signed a letter to big tech bosses aimed at highlighting the need for action.

And the IWF released new figures a day before the bill’s passage through the House of Lords warning of “unprecedented” numbers of children falling victim to online sexual extortion.

The NSPCC’s recent campaigning cited reports of a rise in online child grooming cases, which the charity said showed the legislation is “desperately needed”.

Sir Peter Wanless, the charity’s chief executive, said the bill becoming law is a “watershed moment” that will keep children safer online.

The father of Molly Russell is one of several parents who have voiced their support for the bill, and welcomed an amendment filed during its committee stage that could grant coroners and bereaved families access to data on deceased children’s phones.

Four in five UK adults are also said to support making senior managers at tech firms legally responsible for children who are harmed by what they see on their platforms.

Away from child safety, consumer group Which? said the bill would help tackle online fraud by forcing tech firms to “take more responsibility” for fraudulent adverts on their platforms.

Who has opposed it?

Aside from Tory MPs, the main opposition has unsurprisingly come from tech companies.

They had long expressed concerns about the rules around legal but harmful content, suggesting it would make them unfairly liable for material on their platforms.

Ms Donelan acknowledged the issue and removed the requirement, but the bill still tasks them with protecting children from damaging content, such as material promoting suicide and eating disorders.

The update also saw material encouraging self-harm made illegal.

Much of the recent criticism from tech firms has centred on messaging encryption, with major platforms like WhatsApp even threatening to leave the UK if they are forced to scan users’ messages.

End-to-end encryption ensures that only the participants in a chat can read its messages; not even the platform carrying them can.

Advocates of the technology say any attempt by government to mandate a “backdoor” would compromise people’s privacy and potentially let bad actors exploit the same weakness.
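The privacy argument hinges on who holds the key. A toy sketch in Python (a one-time pad, purely illustrative; real messengers such as WhatsApp use the far more complex Signal protocol) shows the principle: a platform relaying the ciphertext learns nothing without the key shared between the participants.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the corresponding key byte.
    # Applying the same key a second time returns the original data.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
# Key known only to the two chat participants, never to the platform.
key = secrets.token_bytes(len(message))

ciphertext = xor_cipher(message, key)    # what the platform relays
recovered = xor_cipher(ciphertext, key)  # what the recipient reads

assert recovered == message
```

A mandated “backdoor” would amount to giving a third party access to that key, or a way around it, which is why critics argue such access could not be limited to law enforcement alone.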

Ministers have sought to downplay the chances of this measure ever actually being used, but it remains in the bill.

How will the bill be enforced?

Enforcement will fall to media regulator Ofcom.

Companies found to be in breach of the bill can be fined up to £18m or 10% of their annual global turnover, whichever’s greater (and in the case of a company like Meta, it’s comfortably the latter).
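The “whichever’s greater” rule can be made concrete with a quick calculation (turnover figures are invented for illustration):

```python
def max_fine(annual_global_turnover: int) -> int:
    # Ofcom's ceiling: the greater of a flat £18m or 10% of global turnover.
    return max(18_000_000, annual_global_turnover // 10)

# A firm turning over £50m hits the flat £18m floor (10% would be only £5m).
assert max_fine(50_000_000) == 18_000_000
# For a platform with £100bn turnover, the 10% figure dominates: up to £10bn.
assert max_fine(100_000_000_000) == 10_000_000_000
```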

Firms and senior managers could also be held criminally liable if found not to be doing enough to protect children.

In extreme cases, platforms may even be completely blocked from operating in the UK.

Ofcom boss Dame Melanie Dawes said the regulator would not act as a “censor” seeking to take content down, instead focusing on setting new standards that make platforms “safer by design”.

“Importantly, we’ll also take full account of people’s rights to privacy and freedom of expression,” she added.

Ofcom will take a phased approach to bringing the act into force after a consultation period, with the majority of measures expected to commence within two months of royal assent.

