Is the Online Safety Act Effective in Its Intended Purpose?
Today's discussion focuses on the UK's Online Safety Act (OSA), a legislative initiative designed to make the internet safer for children. The Act, which received Royal Assent in October 2023, is being enforced in phases, with full implementation targeted for 2025 and beyond [1][2].
The OSA empowers the communications regulator, Ofcom, to oversee digital platforms, requiring them to implement age verification and content moderation to protect children from harmful online content, including pornography [1][4]. Non-compliance could result in significant fines and sanctions [1][3].
As of July 25, 2025, the first Protection of Children Codes of Practice came into force, specifically targeting user-to-user services to address child safety risks [2]. Services hosting pornography must enforce strict age verification to prevent access by children, with penalties of up to £18 million or 10% of global turnover, whichever is greater, plus potential criminal sanctions for senior managers [1][3].
The Act aims to shield children from "primary priority content" (pornographic material and content promoting suicide, self-harm, or eating disorders) by requiring platforms to take a risk-based, proportionate approach to their safety duties [3][4]. Age verification is central to this scheme, designed to prevent children from accessing harmful adult content [4].
Proponents argue that the OSA marks a historic advancement in protecting children online, as it imposes clear legal duties on platforms and substantial penalties that incentivize compliance. The risk-based approach aims to balance protection with free expression, avoiding arbitrary censorship while focusing on serious risks to children [4]. The Act aligns the UK with broader international trends toward age verification and safeguarding measures, which are being adopted or considered in the US and EU [5].
However, criticism of the OSA is mounting. Critics contend that the Act risks infringing privacy by mandating intrusive age checks, potentially exposing users to data risks and reducing anonymity, especially for younger people [3]. There are concerns that the Act may overreach, restricting free speech and expression online by enabling excessive moderation or censorship [5]. Practical challenges have also emerged: users circumvent restrictions using VPNs and other tools, limiting the Act's practical efficacy in blocking children from harmful content [5]. Some argue that single-country enforcement sits uneasily with the global nature of the internet, questioning how effective UK controls can be against transnational platforms and content [5].
The discussion extends beyond the OSA itself to the safety of young people and the respective roles of parents and the state in regulating online content. Toby Young, a free speech campaigner, argues that parents, rather than the state, should be responsible for what their children view online. Ian Russell, a campaigner whose 14-year-old daughter Molly died after exposure to harmful content on social media, criticizes websites for allowing a "tsunami" of inappropriate content onto young users' feeds [6].
In this episode, host Tamara Cohen discusses the safety of young people with Dame Rachel de Souza, the Children's Commissioner, exploring what additional measures are needed to keep young people safe online. The episode is edited by Mike Bovill and produced by Emily Hulme.
References:
[1] Online Safety Bill: Summary of the Bill - Parliament.uk (2021)
[2] Online Safety Bill: Overview - Parliament.uk (2023)
[3] Online Safety Bill: Criticisms - Parliament.uk (2023)
[4] Online Safety Bill: Explanatory Notes - Parliament.uk (2021)
[5] Online Safety Bill: International Comparisons - Parliament.uk (2023)
[6] The Guardian (2021) "Ian Russell: 'My daughter Molly was killed by Instagram. We need to act now'"