The Government is set to publish its decision on Online Harms regulation in June, and current discussions are focused on the parameters of such legislation. Catch22 brought leaders in the field together to discuss what we all want to see.
Catch22 wants to see Online Harms regulation have an impact on young people’s lives, online and offline.
Our first online harms panel discussion, titled Education or Regulation?, was due to be held at the House of Lords, but following the lockdown we quickly transformed it into a digital event. There couldn’t be a more relevant time to hear what leaders in this arena are thinking – from techUK, Ofcom and Google, alongside researchers and youth workers.
There was a strong consensus that tech companies and frontline workers want this regulation to offer clear direction and expectations for every player – which of course is no easy feat.
Why now?
The current environment, in which youth workers and teachers must now operate almost entirely remotely to engage with young people, has forced a much-needed recognition that the world of social media, and our reliance on it, is here to stay. There has never been more urgency to understand what young people are doing online and how that space can be kept as safe as possible.
While it is yet to be officially confirmed, Ofcom is likely to be the regulator of online harms. Ofcom’s Content Policy Director Mark Bunting began the discussion by highlighting that people have, over time, become more pessimistic about the benefits of being online for their children. He says that Ofcom recognises that a strategy based simply on blocking or removing illegal or harmful content will not be sufficient to deal with the underlying causes of those problems.
Franklyn Addo, Team Leader at Redthread’s Youth Violence Intervention Programme, agrees that a blanket approach is impractical and wants the voices of young users to influence any major decisions:
“When I meet a young person who’s just been seriously injured, their first request is for the hospital Wi-Fi … it’s their main way of contacting their friends and family to let them know what has happened. The cohort of young people that we are working with live and breathe the digital world.”
What do we know?
Dr Keir Irwin-Rogers has been researching how we can ensure the latest empirical research on online harms is used to shape future regulation. His research highlights the online glamorisation of group rivalries and drug dealing, the social pressures that come with a larger audience, people’s increased exposure to violence, and the perceived ‘hidden space’ of an online world.
It’s why Google.org supported The Social Switch Project in 2019, a collaboration between Catch22 and Redthread to open up digital career opportunities for the most vulnerable young people. The project empowers the most knowledgeable frontline practitioners who work with young people to talk about social media behaviour in an open and honest way – so they are in a position to discuss the repercussions of what is posted, the longevity of that content, and the alternative routes young people could take to positively shape their lives. Every session is booked out, and there’s a reason why.
Keir says that, nationally, we are well behind the curve in terms of education and regulation related to online harms, and that we need to focus on what is already out there:
“Because regulation is so complex, we have to be careful not to rush it through. In terms of education, we could be scaling things up like the training delivered by The Social Switch Project. I attended one of those sessions and every single professional there, dozens in the room, went home that day with ways in which they were going to improve their practice and shape their organisation’s policies and practices. It was incredibly useful for all those professionals but it’s been delivered on a face-to-face basis, primarily in London.
“There’s no reason why this can’t be scaled up nationally and I think probably consideration would need to be given to why we can’t deliver this training online as well – you’ll reach a lot more people faster that way. The training possibly shouldn’t just be for professionals working with young people but for parents as well.”
As an advisor to the Youth Violence Commission, he adds, “many adults are thoroughly clueless about social media. Not in a disparaging way – it’s understandable why – but I think if we want to get on top of this, then our education initiatives are going to have to be rolled out at scale and much more quickly than they have been.”
techUK’s Vinous Ali wants consideration given to the diversity of players in this space. techUK is the primary membership body for digital companies, with the likes of Microsoft and Google as members, but about two-thirds of its membership are smaller organisations that provide the hardware and infrastructure our digital worlds are built on. She says they must all be involved:
“There’s no one-size-fits-all model that applies to both illegal and legal content with equal weight, nor one that applies to both the largest companies and the very smallest. It’s vital to ensure that we create a regulatory system that is flexible and adaptable. Regulation will only be one piece of this; education and awareness are going to be incredibly important.”
She says the tools exist, but awareness of how to use them, and how to use them well, is lacking, especially in the households of the most at-risk young people.
Tom Morrison-Bell from Google’s Policy team adds that we need to see more research and more education on the positive things young people are doing online too:
“many assume that only bad things are going to happen to children if they’re online for more hours than they would be normally … I think we need to understand the evidence base more which requires everybody to come together.”
What should we expect from regulation?
Ofcom’s Mark Bunting says regulation should be about establishing the standards of accountability and transparency that we expect from the organisations involved:
“to disclose what they’re doing to protect their users; how they’re assessing the effectiveness of those systems and processes; and perhaps most importantly, to be open about where things are going wrong because it’s only by learning from where systems and processes aren’t sufficient to protect users that we can consider fixes and learn collectively.
“one of the jobs for the regulator, I think, is to create a safe space for companies themselves to disclose what they’re learning and what they’re observing in their own environments, and then to work with us, with civil society, and with users themselves to try to find ways of addressing problems.”
Tom Morrison-Bell compares the coming regulation to the implementation of GDPR:
“It’s not just about penalising… in the way that the ICO approached GDPR, [ICO] were very explicit in how GDPR was to be implemented and that they wanted to help companies become compliant. That’s because the outcomes of compliance are so valuable. I think there will be huge benefit in having a regulator that any organisation that is in scope of the regulation can seek guidance from.”
Catch22 agrees – it’s the job of any good regulator. We want to see collaboration between all parties on what the outcomes are when things do go wrong and what steps everyone involved could take next time, from both a safety-net perspective and an educational perspective.
How can we, working with students, young offenders, and children at risk of exploitation, continue to improve the way we minimise future harm if we’re not hearing about the extent to which harm is committed and how it occurred?
How can we ensure programmes like The Social Switch Project are addressing the issues they actually need to address unless we hear about what goes wrong in the first place?
We don’t need to wait until regulation arrives to take action. This conversation, the first of many, made one thing abundantly clear: we need more frank and honest conversations between all players, and now is the time to talk.