Tech platforms may be required to stop illegal content from going viral and to limit people's ability to send virtual gifts to children or record a child's livestream, under new online safety measures proposed by Ofcom.
The UK regulator released a consultation on Monday seeking feedback on additional measures to protect citizens, especially children, online.
The measures could also include requiring larger platforms to proactively detect terrorist material.
According to Oliver Griffiths, the director of Ofcom’s online safety group, the proposed measures aim to build upon existing UK online safety regulations while keeping up with constantly evolving risks.
Griffiths stated, “We are holding platforms accountable and taking swift enforcement action when we have concerns. However, technology and the associated risks are constantly evolving, and we are continuously exploring ways to make the online world safer.”
The consultation focuses on three main areas where Ofcom believes more can be done:
– Preventing the spread of illegal content
– Addressing harms at the source
– Providing additional protections for children
The BBC has reached out to TikTok, livestreaming platform Twitch, and Meta – the parent company of Instagram, Facebook, and Threads – for comment.
Ofcom’s proposed measures cover a variety of issues, from intimate image abuse to the risk of individuals witnessing physical harm on livestreams, and vary in terms of the type or size of platform they may apply to.
For example, the proposal that providers give users a mechanism to report a livestream containing content that "depicts the risk of imminent physical harm" would apply to all user-to-user sites that allow a single user to livestream to multiple viewers, where there is a risk of illegal activity being shown.
On the other hand, the potential requirement for platforms to use proactive technology to identify harmful content for children would only apply to larger tech companies that pose a higher risk of relevant harms.
Ian Russell, the chair of the Molly Rose Foundation – an organization established in memory of his 14-year-old daughter Molly Russell, who took her own life after being exposed to thousands of images promoting suicide and self-harm – stated that “further measures are always welcome, but they will not address the systemic weaknesses in the Online Safety Act.”
He also criticized Ofcom for lacking ambition in its approach to regulation and called for the prime minister to intervene and introduce a strengthened Online Safety Act that would compel companies to identify and address all risks posed by their platforms.
The consultation will remain open until October 20, 2025, and Ofcom hopes to receive feedback from service providers, civil society, law enforcement, and the general public.
This comes as tech platforms strive to comply with the UK’s comprehensive online safety regulations, which Ofcom has been tasked with enforcing.
Some platforms have already taken steps to address features that experts have warned could expose children to grooming, such as livestreaming.
In 2022, TikTok raised its minimum age for going live on the platform from 16 to 18, shortly after a BBC investigation revealed hundreds of accounts going live from Syrian refugee camps, with children begging for donations.
YouTube recently announced that it will increase its minimum age for livestreaming to 16, starting on July 22.