
Pressure Mounts on Tech Giants to Tackle Illegal Content Online

Ofcom has proposed additional online safety rules that could compel tech firms to stop illegal content from going viral and to restrict the ability to send virtual gifts to, or record, a child's livestream.

On Monday, the UK agency released a consultation to gather opinions on further safeguards to keep people, especially children, safer online.

As part of these additional online safety measures, the rules could also require some larger platforms to assess whether they need to proactively detect terrorist content.

According to Oliver Griffiths, director of Ofcom's online safety branch, the proposed measures aim to keep pace with "constantly evolving" threats while building on the UK's existing online safety regulations.

He declared, “Where we have concerns, we’re taking swift enforcement action and holding platforms accountable.”

“But technology and harms are constantly evolving, and we’re always looking at how we can make life safer online.”

Three key areas were identified in the consultation where Ofcom believes more might be done:

  • Preventing the spread of harmful content
  • Addressing the causes of harm, and
  • Providing children with additional precautions

The BBC has asked for comments from TikTok, the livestreaming website Twitch, and Meta, the company that owns Facebook, Instagram, and Threads.

Ofcom's proposals cover a wide range of topics, from the misuse of intimate images to the risk of viewers witnessing physical harm during livestreams. They also vary in which kinds of platforms they would apply to.

One example is the proposal that providers give users a way to report a livestream if it "depicts the risk of imminent physical harm". This would apply to all user-to-user services that let one person livestream to many, where there may be a risk of unlawful activity being shown.

Meanwhile, possible requirements to use proactive technology to detect content deemed harmful to children would apply only to the largest platforms, which pose a greater risk of relevant harms.

“Sticking Plasters”

As part of efforts to enhance online safety, Ofcom’s proposals aim to build on the current policies.

Some platforms have already taken action to restrict features, such as livestreaming, that experts have warned could expose children to grooming.

In 2022, shortly after a BBC investigation found hundreds of accounts going live from Syrian refugee camps with children pleading for money, TikTok raised the minimum age to go live on the platform from 16 to 18.

YouTube recently announced that starting on July 22, users would need to be 16 years old to livestream.

However, several organizations argue that the regulator's proposed additional measures highlight fundamental problems with the Online Safety Act, the UK's comprehensive online safety legislation, which Ofcom is responsible for enforcing.


Ian Russell, the chair of the Molly Rose Foundation, an organization founded in memory of his 14-year-old daughter Molly Russell, who took her own life after viewing thousands of images encouraging suicide and self-harm, said that while additional measures are always welcome, they will not address the systemic weaknesses of the Online Safety Act.

Mr. Russell claimed, “As long as the focus is on sticking plasters, not comprehensive solutions, regulation will fail to keep up with current levels of harm and major new suicide and self-harm threats.”

Ofcom, he continued, demonstrated a “lack of ambition” in its approach to regulating.

“It’s time for the prime minister to intervene and introduce a strengthened Online Safety Act that can tackle preventable harm head-on by fully compelling companies to identify and fix all the risks posed by their platforms.”

Instead of requiring “incremental changes,” the regulator should force businesses to “think more holistically” about safeguards for children, according to Leanda Barrington-Leach, executive director of the children’s rights charity 5Rights.

She stated, “Children’s safety should be embedded into tech companies’ design of features and functionalities from the outset.”

However, as stated by Rani Govender of the NSPCC, Ofcom’s decision to mandate additional protections for livestreaming “could make a real difference to protecting children in these high-risk spaces.”

Ofcom aims to receive input from service providers, civil society, law enforcement, and the general public throughout the consultation, which is open until October 20, 2025.


Clan Reporters is a Nigerian newspaper founded in 2014 by Paul Omo Okojie, a media consultant, communicator, and entrepreneur. Published in hard copy print format, the newspaper was established to deliver timely news, in-depth reporting, and relevant commentary on issues affecting Nigerian communities, with a focus on politics, society, business, and grassroots affairs. As both the founder and the guiding force behind the newspaper, Paul Omo Okojie also leads OMC Okojie Media Consultants (often shortened to OMC), the media firm responsible for the editorial direction, strategic communications, and overall operations of Clan Reporters. Under his leadership, the newspaper has aimed to blend professional journalism with community engagement, giving voice to local stories and perspectives often overlooked in mainstream media. Okojie’s background in journalism and media consultancy has shaped Clan Reporters into a platform committed to credibility, accountability, and service to its readership. Over the years, the publication has sought to uphold high standards of reporting while fostering informed public discourse in Nigeria.