Demand a safer online world
Parents and families are worried about the unacceptable harm that children are being exposed to on social media platforms every day.
Time and again, tech companies have wilfully ignored real harm being caused to young people – including inappropriate content, platforms deliberately designed to keep kids hooked for hours, and devices that do nothing to stop children receiving or sending nude images.
We’ve seen that tech companies will do anything to avoid acting unless they’re forced. All over the world, tech companies have knowingly sacrificed children’s safety for profit. In Australia, despite a social media ban, children continue to access unsuitable sites. We can’t let children in the UK pay the price for tech companies’ failings.
Time is up for tech companies, and we need to see serious and urgent change. Join us in demanding a safer online world for children.
You can tell the Government what you think by responding to their consultation and sharing your own thoughts and feelings about what needs to change. The louder our voices are, the harder they are to ignore.
What we’re calling for
We believe there are three key, urgent changes needed to protect children – changes that go beyond what a simple social media ban could achieve.
1. Enforced, risk-based age limits

While the Online Safety Act means platforms must now check that users are over 18 before they can access pornography and harmful content, we need to go much further.

We want to see platforms set risk-based age limits for children – like we see with film ratings. And we’re calling for these age limits to be properly enforced, with tech companies required to use highly effective age assurance to stop children using sites that aren’t suitable for them. If existing age limits were enforced effectively right now, 2.5 million children under the age of 13 would instantly be better protected.

2. An end to addictive design

Features like infinite scrolling, addictive algorithms and auto-playing videos are having a huge impact on children. In fact, children often tell us that feeling addicted to their phones is a serious worry, and they want to see this change.

For too long, tech companies have prioritised growth and engagement over children’s wellbeing. Banning the design tricks that keep young people glued to their devices will help to create a healthier relationship with all online spaces, not just social media.

3. Protection from nude images and harmful AI content

In 2025, police forces logged almost 37,000 child sexual abuse image crimes across the UK. That number has been rising, and many more cases are likely to go unreported.

Technology to block illegal images in real time already exists and must be built in as standard on children’s phones and devices to help stop them from receiving, viewing or sending nude images. Alongside this, children must be protected from receiving dangerous or misleading information from AI chatbots.
Children and young people are also invited to respond to the consultation, and it’s important that their voices are heard. We’ve written some advice to help them share their thoughts.
The consultation closes at 11:59pm on 26 May 2026.