The Online Safety Bill was approved by both houses of parliament last month and will receive royal assent shortly. But that, in itself, will not achieve the government’s laudable aim of making the UK ‘the safest place in the world to be online’. 

Joshua Rozenberg

The legislation, which runs to nearly 300 pages, imposes new safety duties on service providers and grants new powers to Ofcom. We can expect the communications regulator to begin consulting soon on the first set of standards it expects service providers to meet in tackling illegal online harms, including child sexual exploitation, fraud and terrorism. Further consultations will follow and Ofcom hopes to finalise its advice to ministers next spring.

Once the legislation is fully in force, Ofcom will be able to impose penalties for non-compliance of up to £18m or, if higher, 10% of the service provider’s ‘qualifying worldwide revenue’. In theory, at least, tech companies could be ordered to pay billions of pounds. Ultimately, Ofcom could ask the courts to prevent a service provider from operating in the UK.

The government’s first consultation paper on internet safety was published six years ago. A relentless series of responses and proposals has resulted in legislation that was finally agreed by MPs and peers last month. What would otherwise be a daunting bill of more than 240 sections and 17 schedules is made more palatable by section 1, which explains its purpose, and section 2, which tells readers how the legislation is organised. Even so, there will be plenty of scope for solicitors who practise in this field to publish written guidance for prospective clients.

The main challenge for the drafter has been to define services that have never before been the subject of legislation. A ‘user-to-user service’ is one that allows users to generate or upload content that can be seen by other users. But that doesn’t include services that merely allow users to exchange emails and messages. The legislation also imposes requirements on ‘search services’ – such as Google – and on providers of pornographic services, who must ensure that children do not encounter their content.

In policy terms, the challenge has been to balance freedom of expression against the need to protect adults as well as children. The largest platforms will have to give adult users greater control over the kinds of content they see and whom they engage with online. Regulated services will have to take steps to prevent their services being used for illegal activities and remove illegal content when it appears.

Some content is already illegal. Perhaps the most obvious example is child sexual abuse, but terrorism is another important area. Others highlighted by the government include: controlling or coercive behaviour; extreme sexual violence; fraud; hate crime; inciting violence; illegal immigration; and facilitating suicide.

Other content may be legal for adults but harmful for children. That could cover bullying as well as pornography. Content providers will have to show how they are keeping under-age children off their platforms. Among ‘priority content that is harmful to children’ is content that encourages a challenge or stunt highly likely to result in serious injury and content that depicts real or realistic serious violence against a person or an animal – or even a fictional creature.

On the other hand, user-to-user services that meet a specified threshold will be under a duty to protect free speech. They will not be allowed to remove content, or ban users, except when permitted by their terms of service or required by law.

There will be additional protections for journalists and recognised news publishers. Major search services must have systems to prevent users encountering fraudulent advertising.

The legislation will also make important changes to the law on malicious communications, replacing legislation originally passed in the 1930s to stop pranksters who used to annoy telephone operators.

There’s a new offence of sending pictures of genitals to another person with the intention of causing alarm, distress or humiliation.

Another new offence involves sharing or threatening to share an intimate photograph or film. There are exceptions for people who choose to take their clothes off in public and the offence does not apply to the kind of image ordinarily shared between family and friends. But the offence applies to images that merely appear to show another person in an intimate state. So the legislation would apply to computer-generated video images of people that appeared to be real – so-called deepfakes.

It will be an offence to send flashing images electronically to someone with epilepsy with the intention of causing harm. Another new offence will be committed by a person who intentionally encourages or helps someone else to carry out an act of serious self-harm.

Although these reforms will not be fully implemented before next year, service providers are already taking action to protect children from harm. And so they should.


joshua@rozenberg.net
