
Tech firms must clamp down on illegal online materials, says Ofcom

Wed 15 Nov 2023

Tech firms will have to use a range of measures to protect their users from illegal content online under detailed plans set out by Ofcom.

The communications watchdog, appointed as the UK’s new online safety regulator, has drafted Codes of Practice for social media, gaming, pornography, search, and sharing platforms to comply with their responsibilities under the Online Safety Act.

Melanie Dawes, Chief Executive of Ofcom, said the watchdog is wasting no time in setting out how it expects tech firms to protect people from illegal harm online.

“Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular,” said Dawes.

The focus is on the ‘priority offences’ set out in the legislation, including child abuse, grooming, and the encouragement of suicide, although the codes also extend to other illegal content.

Ofcom Drafts Codes for Tech Firms

Ofcom said it is not taking a ‘one size fits all’ approach. The communications watchdog is proposing some measures applicable to all services in scope, and others that depend on the risks identified in a service’s illegal content risk assessment and on its size.

Larger and high-risk services should ensure children are not presented with lists of suggested friends, do not appear in other users’ lists of suggested friends, are not visible in other users’ connection lists, and more.

“It is right that protecting children and ensuring the spread of child sexual abuse imagery is stopped is top of the agenda. It is vital companies are proactive in assessing and understanding the potential risks on their platforms, and taking steps to make sure safety is designed in,” said Susie Hargreaves, Chief Executive of the Internet Watch Foundation.

Ofcom also suggested that these services should use a technology called ‘hash matching’ and automated tools to detect URLs that have been identified as hosting child sexual abuse material.

‘Hash matching’ is a way of identifying illegal images of child sexual abuse by matching them to a database of illegal images. This technology helps detect and remove child sexual abuse material circulating online.
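To illustrate the idea only, here is a minimal sketch of hash matching in Python. It compares the SHA-256 digest of an uploaded file against a set of known hashes; this is a simplification, since deployed systems typically use perceptual hashes (such as Microsoft’s PhotoDNA) that still match an image after resizing or re-encoding. The hash database and function names below are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known illegal images, of the kind
# maintained by bodies such as the Internet Watch Foundation.
# The value below is a placeholder for illustration only.
KNOWN_ILLEGAL_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def file_sha256(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def matches_known_material(path: str) -> bool:
    """Return True if the file's hash appears in the known-hash database."""
    return file_sha256(path) in KNOWN_ILLEGAL_HASHES
```

In practice, a platform would run such a check at upload time and route any match to human review and reporting, relying on perceptual rather than cryptographic hashing so that minor edits do not defeat the match.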

“In the five years it has taken to get the Online Safety Act onto the statute book, grooming crimes against children on social media have increased by a staggering 82%.

“That is why Ofcom’s focus on tackling online grooming is so welcome, with this code of practice outlining the minimum measures companies should be taking to better protect children,” said Peter Wanless, Chief Executive of the NSPCC.

Large general search services should also offer crisis prevention information when responding to search requests related to suicide. They should also address queries seeking specific, practical, or instructive information on suicide methods.

Ofcom Targets Child Sexual Abuse and Grooming

Protecting children will be Ofcom’s priority. The communications watchdog said friend requests are frequently used by adults looking to groom children for sexual abuse.

According to Ofcom’s research, three in five secondary-school-aged children (11-18 years) have been contacted online in a way that potentially made them feel uncomfortable. 

Approximately 30% have received an unwanted friend or follow request. A total of one in six secondary school students (16%) have either been sent naked or half-dressed photos, or been asked to share these themselves.

“For many, it happens repeatedly. If these unwanted approaches occurred so often in the outside world, most parents would hardly want their children to leave the house. Yet somehow, in the online space, they have become almost routine. That cannot continue,” said Dawes.

New Codes to Fight Fraud and Terrorism

Ofcom’s draft codes also suggested targeted steps to combat fraud and terrorism. 

New measures for larger, higher-risk services involve automatic detection to find and remove posts associated with the sale of stolen credentials, such as credit card details. Other measures include verifying accounts to mitigate the risk of fraud and foreign interference in UK processes, such as elections.

Services must block accounts operated by proscribed terrorist organisations. To mitigate the risk of various types of illegal harm, services are encouraged to implement a core set of measures.

Suggested measures include appointing a person accountable for compliance with the illegal content, reporting, and complaints duties; ensuring moderation teams are well resourced, trained, and given performance targets; and making it easy for users to report content and block other accounts, among other steps.

What Happens Now?

Ofcom will now consult with industry experts before finalising the Codes of Practice.

Services will then have three months to carry out their illegal content risk assessments, while Ofcom’s final Codes of Practice will be subject to Parliamentary approval. Ofcom expects this process to conclude by the end of 2024, at which point the communications watchdog will begin enforcing the Codes.

Companies that do not meet the approved expectations will face enforcement actions, including fines.

In spring 2024, Ofcom will publish a consultation on additional protections for children from harmful content, including content that promotes suicide, self-harm, and eating disorders, as well as cyberbullying.



