
Catch22 gives oral evidence to the Online Safety Bill Committee hearing


This week, Catch22’s Chief Development Officer, Mat Ilic, was invited to the Public Bill Committee at the House of Commons to share Catch22’s thoughts on the latest draft of the Online Safety Bill. Mat was joined by Full Fact’s Will Moy, and by Professor Lorna Woods and William Perrin of the Carnegie UK Trust.


Catch22’s work across child sexual and criminal exploitation, our efforts to improve digital skills and digital access, and our delivery of alternative provision schools across England and Wales mean we are well placed both to protect children and young people and to prepare them for a thriving future.

Since 2015, we have also been researching the impact of social media on young people’s lives. In our report released last year, we set out children’s and young people’s expectations for the protections they want and the kind of online world they want to see.

It found:

  • Online harms are the primary reason for referral in approximately half of our child criminal and sexual exploitation cases.
  • Young people want to see better training for professionals and guardians in relation to online behaviour.
  • Children and young people want to see improved monitoring, swift action and accountability from tech organisations, rather than the responsibility being placed on the user.

The Bill aims to provide a higher level of protection for children than adults but, as was highlighted in yesterday’s Committee hearing, there are flaws in the Bill which could prevent this from being the case:

1. A lack of focus on safety-by-design, and instead a reliance on content regulation

Much of how safety-by-design will work in practice is yet to be clarified. As Mat said at the Committee:

“We have previously called this the toy car principle, which means any content or product that is designed with children in mind needs to be tested in a way that is explicitly for children, as Mr Moy talked about. It needs to have some age-specific frameworks built in, but we also need to go further than that by thinking about how we might raise the floor, rather than necessarily trying to tackle explicit harms. Our point is that we need to remain focused on online safety for children and the drivers of online harm and not the content.”

2. The need for greater transparency from platforms

This can be achieved through legislative requirements demanding that platforms are transparent about how they make decisions relating to children’s safety, including whether or not to remove content, and through greater collaboration with academics in the field.

“I want to underline the point about empowerment for children who have been exposed to or experience harm online, or offline as a result of online harm. It should include the ability to bring forward cases where complaints, or other issues, were not taken seriously by the platforms.”

3. Stronger enforcement of regulation

Currently, even the companies’ own attempts to self-regulate are inconsistent and inadequate. Even when children and young people report harm, they seldom receive an adequate response.

“We would try to raise the floor rather than chase specific threats and harms with the legislation.”

On this final point, the discussion returned to safety-by-design and empowering users of all ages. For young users in particular, Catch22 wants to see a much higher expectation of media literacy. The Social Switch Project, funded by the Mayor of London’s Violence Reduction Unit, is providing this training for frontline professionals; Greater Manchester’s Violence Reduction Unit is also funding school pilots to deliver training to young people in a peer-to-peer model.

Alongside improved media literacy, users will feel empowered to raise complaints if they know doing so might improve the design of a platform and prevent future harm. Yet, as it stands, platforms’ complaint systems are reactive, impersonal and slow, meaning that platforms, police and legislation are always a step behind and harm continues.

“Rather than prejudging types of content, I think it would be more helpful to look at what is there and what the system is doing. Then we could look at what a proportionate response would be—looking, as people have said, at the design and the features. Rather than waiting for content to be created and then trying to deal with it, we could look at more friction at an earlier stage.”

– Professor Lorna Woods