
Online Safety Bill: Balancing safety with opportunity

A hand holds out a mobile phone. Illustrated above it are a mixture of "Like", email, and "Friend Request" symbols.

At Catch22, we deliver over 100 services across the country that aim to build resilience and aspiration in people and communities. We have a major interest in the Online Safety Bill, as our practitioners see on a daily basis how what happens online can translate into harm offline:

  • We run services supporting victims of child exploitation (including County Lines) – and 97% of referrals to our exploitation and missing services involve online harm. These children and young people are victims of some of the most complex and sophisticated online schemes and grooming. 
  • We also run gangs services, supporting young people at risk of falling into gangs, or who are already involved in gangs. 
  • We’re proud to run, alongside our partners RedThread, the successful Social Switch Project, which has trained over 1,400 frontline professionals in London (police, youth workers, teachers and health staff) in how to spot and address online harms. It has also supported nearly 100 young people interested in a career in social media to fulfil their ambition. 
  • We have published online harms research, informed and led by the voices of young people. 

But we also know the significant opportunities that the online world presents. We have come together with 36 other organisations as part of the Children’s Coalition, to endorse a joint amendment package, which seeks to not only give children the online protection that they need and deserve – but also allows them to thrive online. 

Empowering young people 

We firmly believe that children’s experiences and needs must be at the heart of the Bill. Their right to freedom of expression must be upheld, and they must be supported and encouraged to participate actively in the whole host of opportunities that the online world presents. 

So how can safety be balanced with opportunity? 

Legislation is an important mechanism. We want to see: 

  • A safety-by-design focus: there is currently a focus on content over safe and protective technology. The technology to stop children from viewing explicit images, for example, does exist, but investment is needed to ensure the tools are widely used. As new apps and features are developed, particularly as things like the ‘metaverse’ grow, there should be a legal design requirement to demonstrate that if children use these tools – which we know they do – their safety is paramount. If tools are not designed with child safety in mind, then they must be designed in a way that stops children using them. The toy car principle, which we’ve written about before, applies here.  
  • Transparency from platforms, at minimum in the form of annual reports, to enable impactful academic research into how online harm can be tackled. We shouldn’t be relying on whistle-blowers. Academics need to know more about the behaviours of children and young people online if they are to suggest evidence-based interventions for the rest of society.  
  • Empowerment for the individual, enabling them to complain to the regulator or a representative body. In our research, children and young people told us that they want to be able to report serious harm and get swift responses. Instead, they receive automated replies weeks later, rarely believe a person has seen the report, and do not feel acknowledged. Currently, the legislation suggests that official complaints can only be made to the regulator through an organisation representing many young people’s complaints, rather than through a fit-for-purpose individual complaints system. 
  • Age assurance in a manner which protects privacy and data and is accessible for every individual – not just those with passports or ID. We are cautious of social media platforms implementing age assurance themselves, as it could mean they hold even more data on users than they already do. However, there are other approaches to age verification, such as device-level checks and age-predictive facial recognition software. And of course we need the so-called ‘rules of the road’ – minimum standards set by Ofcom so these systems can be trusted.  
  • Media literacy for all – with a requirement on these hugely influential platforms to fund training and advocacy programmes. Good training exists for schools, frontline professionals and parents – like our programme, The Social Switch Project. Uptake is strong in the communities it is offered to, with excellent impact stats. But without funding, many are missing out, and schools cannot be expected to front the full cost of training whose need is borne out of increased use of these social media platforms. In our view, the onus should be on platforms and literacy by design, rather than the burden being placed on users, parents and teachers to upskill. 

The Online Safety Bill is complex but, with the above principles in place, this vital piece of legislation has a much stronger chance of success. Technology is of course rapidly evolving, and social media platforms will come and go, but strong principles will endure.