
Online Safety Bill: Why a proactive duty of care is so important

[Image: a group of young people browse their phones while sitting on a wall, overlaid with the text "Online Harms Research Hub".]

Today’s announcement offers a strong starting point, but in order to protect the most vulnerable and to stay one step ahead of future developments, there must be a focus on prevention – and a responsibility to provide a proactive duty of care.

At a time when children and young people are more reliant on online platforms than ever, we are pleased to see the Government propose a system of independent regulation, with Ofcom in a position to remove harmful content in order to protect vulnerable users.

“I think with big online platforms, because they are so powerful, they’ve got so much influence on the world, there’s no one really going up against them… I think if there was a law put in place, that’s the best route for it, because then it’s like, hang on, there’s a government that can say, no, this is wrong, and you have to change.”

– Young person involved in Catch22’s Online Harms research

While responsibility for reacting to harmful content – quickly and decisively – is essential, there must also be a focus on a proactive duty of care. This means placing responsibility on platforms to play their part in preventing harms before they happen. The status quo places children in a position where, whether they report it or not, they will be forced to relive the harm – be it exposure to confronting content or cyberbullying. By the time fines and damages are imposed, the harm has already been done and has often been relived several times. Catch22’s victim services witness the long-term support that is often required by children and young people who have been the victims of the most serious online harms.

Without proactive responsibility on these platforms, we are leaving many children and young people to navigate confronting and potentially violent situations by themselves, often without guidance, prompt support, or adequate responses from platforms.

At Catch22, we’re also concerned about the ever-expanding ‘grey harms’: behaviours which are not illegal but which we know can lead to harmful, and even illegal, behaviour – unknown adults messaging children, abusive language used in online games, and, as specifically addressed in Parliament today, content which encourages self-harm. It is essential that we get the principles of this Bill right, so that as new technologies emerge and new harms evolve, society is prepared to respond and platforms are clear on the expectations imposed on them.

“I think there does need to be someone else that holds them to account a little bit higher up because them being accountable, obviously, isn’t working … there are so many examples of that which happen every single day and if [companies] were accountable for it, then it wouldn’t be happening. So I do think that something else does need to be done to stop it from happening.”

– Young person involved in Catch22’s Online Harms research

There must be a commitment to long-term preventative training for professionals, robust education for young people in all age groups, and, as stated in Oliver Dowden’s Parliamentary statement today, real responsibility on organisations to “engineer the harm out of their platforms”.

Catch22 Online Harms research

Dr Faith Gordon is the lead researcher of Catch22’s study into children and young people’s experiences of social media and online behaviour.

Speaking with 50 children and young people from across Catch22’s services and programmes, the study gives children and young people the opportunity to share their perceptions of social media, define what is deemed ‘acceptable use’ across various platforms, and suggest the measures they think could work to make social media safer.

A Senior Lecturer in Law at ANU and an experienced researcher on online harms, Dr Gordon said:

“All of the children and young people we spoke to highlighted the prevalence, extent and impact of online harms on their lives, and in particular the damaging consequences to their mental health and well-being. Unanimously, young people felt that social media companies and other online platforms were clearly not able to prevent harms and, in noting how ‘powerful’ these platforms are, they suggested that a system which has ‘teeth’ in holding companies to account was urgently needed.”

Dr Gordon’s research also includes qualitative interviews with other researchers, tech organisations, educators, Catch22’s victim support services, and those working in safeguarding and law enforcement, to discuss the challenges they experience in terms of how acceptable behaviour is monitored and enforced, and the wider societal measures that need to be considered.

Dr Gordon continued:

“Introducing these new proposals, and also the raft of potential secondary legislation mentioned today by the UK Secretary of State, may take considerable time. During this time, more children and young people will continue to experience online harms and more victims will require support. It is vitally important that reforms and new measures are well thought-out and appropriately implemented, but there must not be extensive delays. The COVID-19 pandemic has shone a light on just how prevalent online harms are and just how urgent the need for action on these issues is in the UK.”

Catch22 held an event last week with Victims’ Commissioner Dame Vera Baird, sharing the interim findings of the research. The event, chaired by Catch22’s Naomi Hulston, also included panellists Dr Faith Gordon, Professor Lorna Woods from the Carnegie UK Trust, and Jordan Khanu – a member of the Young People’s Action Group for the Mayor of London’s Violence Reduction Unit, which is currently funding The Social Switch Project.