
Ofcom finalises rules for tech firms to protect children online


Social media platforms and websites will be legally required to protect children from accessing harmful content online or risk facing fines, the communications watchdog has said.

Sites must comply with Ofcom’s new regulations – known as the Children’s Codes – by 25 July, and will be required to introduce age verification checks and adjust their recommendation algorithms in order to continue operating in the UK.

Any site which hosts pornography, or content that encourages self-harm, suicide or eating disorders, must have robust age checks in place to prevent children from accessing that content.

Ofcom boss Dame Melanie Dawes said it was a “gamechanger”, but critics say the restrictions do not go far enough and are “a bitter pill to swallow”.

Ian Russell, chairman of the Molly Rose Foundation, which was set up in honour of his daughter, who took her own life aged 14, said he was “dismayed by the lack of ambition” in the codes.

But Dame Melanie told BBC Radio 4’s Today programme that age checks were a first step, as “unless you know where children are, you can’t give them a different experience to adults.

“There is never anything on the internet or in real life that is foolproof… [but] this represents a gamechanger.”

She said she was “under no illusions” that some companies “simply either don’t get it or don’t want to” comply, but stressed that the Codes are UK law.

“If they want to serve the British public and if they want the privilege in particular in offering their services to under 18s, then they are going to need to change the way those services operate.”

Prof Victoria Baines, a former safety officer at Facebook, told the BBC it is “a step in the right direction”.

Speaking to the Today programme, she said: “Big tech companies are really getting to grips with it, so they are putting money behind it, and more importantly they’re putting people behind it.”

Under the Codes, algorithms must also be configured to filter out harmful content from children’s feeds and recommendations.

As well as the age checks, there will be more streamlined reporting and complaints systems, and platforms will be required to act faster in assessing and tackling harmful content when they are made aware of it.

All platforms must also have a “named person accountable for children’s safety”, and the management of risk to children should be reviewed annually by a senior body.

If companies fail to abide by the regulations by 25 July, Ofcom said it has “the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK.”

