Social media executives from Meta, Snap, YouTube, TikTok and X have been summoned to Downing Street on Thursday for a high-stakes meeting with Prime Minister Sir Keir Starmer and Technology Secretary Liz Kendall over children’s safety online. The tech bosses will be questioned about the steps they are taking to protect young users and respond to parents’ concerns, as the government continues its consultation on whether to impose a complete ban on social media for under-16s, in line with Australia’s approach. Sir Keir has emphasised that the meeting will centre on ensuring “social media companies step up and take responsibility”, warning that “the consequences of failing to act are stark” and that the government has a duty to parents and the next generation to put children’s safety first.
The Downing Street Showdown
Thursday’s meeting represents a pivotal moment in the government’s push to hold tech giants to account for their part in protecting vulnerable young users. The gathering comes at a crucial juncture, with Parliament having dismissed calls for a complete ban on social media for those under 16 just hours earlier, despite support from the House of Lords. Instead of introducing a broad prohibition, MPs voted to grant ministers authority to establish their own restrictions, signalling the government’s preference for a more bespoke regulatory approach over a comprehensive legislative ban.
The scheduling of the Downing Street summit reflects the government’s determination to appear firm on online safety whilst navigating complex commercial and political pressures. Professor Gina Neff of the University of Cambridge’s Minderoo Centre for Technology and Democracy said the meeting allows the government to demonstrate it is acting proactively on online harms. Downing Street has already acknowledged that some services have made progress, taking steps such as disabling autoplay for children by default and giving parents improved controls over device usage, though observers argue substantially more must be done.
- Tech executives to be questioned on child protections and their responses to parents’ concerns
- Ministers exploring restrictions on social platforms for children under 16, following Australia’s example
- MPs voted against an outright ban but gave ministers the power to impose controls
- Some platforms have already introduced safeguards, such as disabling autoplay for young users
Parliament’s Rejection and the Wider Debate
Wednesday evening’s parliamentary vote proved a blow to supporters of a complete ban on social media for under-16s, marking the second time MPs have rejected such proposals despite strong support from the upper chamber. The government’s decision to favour ministerial discretion over legislative action reflects a more cautious approach, with officials contending that a complete prohibition would be premature given ongoing policy considerations. This strategy gives the government flexibility to design tailored controls rather than a sweeping ban that some worry could be hard to enforce and oversee effectively across various platforms.
The rejection has intensified debate over whether the UK is adequately protecting its children from online harms. Whilst the government maintains that granting ministers powers to set customised regulations represents a more pragmatic solution, critics argue this approach lacks the decisive intervention the situation demands. Recent research from Australia, where a ban on social media for under-16s came into force in December 2025, reveals that over 60 per cent of underage users continue accessing platforms regardless, raising serious doubts about the effectiveness of legislative restrictions and suggesting the challenge extends well beyond straightforward bans.
Bipartisan Criticism
The parliamentary decision has drawn sharp criticism from the opposition benches. Conservative shadow education secretary Laura Trott accused Labour MPs of failing parents and children by rejecting the ban, arguing that other nations are recognising social media’s harms whilst the UK falls behind under the current government. Liberal Democrat education spokeswoman Munira Wilson echoed these concerns, stating that “the time for partial solutions is over” and calling for immediate measures to restrict the most harmful platforms for young users rather than gradual policy tweaks.
Australia’s Cautionary Example
Australia’s experience with online platform restrictions provides a sobering case study for policymakers considering similar measures in the UK. When the country’s ban on social media for under-16s came into force in December 2025, it was celebrated as a significant milestone in safeguarding young users from digital risks. However, emerging research from the Molly Rose Foundation has revealed a troubling reality: more than 60 per cent of young Australians continue using social media platforms despite the legislative prohibition. This high rate of non-compliance suggests that legislative bans alone may prove insufficient to deter determined young users from reaching the services they want.
The Australian results hold considerable implications for the UK’s continuing policy debates. If a comparable ban were implemented in Britain, the evidence indicates implementation would present substantial challenges, with young people likely finding ways to circumvent age-verification systems and restrictions through various technical means. The data challenges arguments that a simple legislative prohibition represents a silver-bullet solution to digital safety issues, instead highlighting the need for a more holistic approach integrating regulatory measures, platform accountability, parental oversight tools, and digital literacy training to meaningfully address the risks young people face online.
| Key Finding | Implication |
|---|---|
| Over 60% of underage Australians still access social media despite ban | Legislative prohibitions alone cannot effectively prevent determined young users from accessing platforms |
| Ban introduced in December 2025 has failed to achieve widespread compliance | Enforcement mechanisms remain weak and young people find workarounds to restrictions |
| Blanket bans do not address underlying appeal of social media to young people | Multi-faceted approach combining regulation, platform accountability, and education is necessary |
Experts Urge Substantive Measures
Child safety advocates and digital rights experts have stepped up demands for tech companies to take meaningful action beyond voluntary measures. The Molly Rose Foundation, established in memory of 14-year-old Molly Russell, who died by suicide after viewing harmful content online, has been especially outspoken in calling for structural reform. Rather than pursuing blanket bans that prove difficult to enforce, campaigners argue the focus must shift towards holding platforms accountable for the systems that drive dangerous material to vulnerable users.
Andy Burrows, head of the Molly Rose Foundation, has stressed that Thursday’s meeting at Downing Street represents a pivotal moment for state intervention. The charity has consistently argued that social media companies possess the technological means to implement robust safeguards, yet often prioritise engagement figures over user wellbeing. Experts emphasise that real safeguarding requires platforms to redesign their algorithmic recommendations, improve content moderation, and provide parents with meaningful tools to monitor their children’s internet use effectively.
The Algorithm Problem
At the heart of these concerns lie the algorithmic systems that determine what content younger audiences see. These algorithms are engineered to maximise engagement, often pushing sensational, harmful, or addictive content to at-risk groups. Reforming these systems constitutes one of the most critical challenges in digital safety, demanding transparency from platforms about how their recommendation engines operate and what protective measures are in place.
- Algorithms prioritise user engagement over user safety and wellbeing
- Platforms should be more transparent about their content recommendation systems
- Independent audits of algorithmic harms are vital to ensuring accountability
What’s Coming Next
Thursday’s summit at Downing Street will set the tone for the government’s approach to online child safety in the months ahead. Following the meeting, Sir Keir Starmer and Liz Kendall are expected to outline their findings and determine whether the voluntary measures tech companies have taken are sufficient or whether stronger statutory intervention is needed. The government remains in the midst of its public consultation on whether to implement an Australia-style ban on social media for under-16s, with the conclusions from this week’s talks likely to shape the final policy direction.
Ministers have signalled their preference for granting themselves powers to impose restrictions rather than introducing a complete ban, citing concerns about enforceability and effectiveness. However, mounting pressure from opposition parties, child protection advocates, and parents suggests the government may face continued calls for stronger action. The weeks ahead will be crucial in determining whether tech companies can demonstrate a genuine commitment to protecting young users or whether the government will legislate to force compliance with stricter safety standards.