
‘Digital by Default’: impact of the pandemic on children’s and young people’s experiences online, and what can a children’s rights-based approach offer?

This blog is based on Dr Faith Gordon's keynote speech at the Euromet 2021 symposium. Over three days, attendees heard from a range of experts on topics under the theme of "The impact of the COVID-19 pandemic on vulnerable young people".

03 August 2021

‘Digital by Default’ in the COVID-19 Context

The COVID-19 lockdowns led to physical distancing measures, restrictions on movement and school closures affecting more than 1.5 billion children and young people (UNICEF, 2020). Additional aspects of life have become ‘digital by default’, with children and young people having to turn to digital solutions for education, socialisation and play.

Reports estimate that internet usage has increased by over 50 percent in some parts of the world (World Economic Forum, March 2020), while others continue to experience the ‘digital divide’. Catch22 has helped highlight the impact on some of the most marginalised children and young people in the UK, including young care leavers.

Online platforms have provided positive opportunities for engagement; however, increased time spent online has also resulted in increased exposure to risks and harm. This was confirmed by reported statistics in the United Kingdom, such as those from the Internet Watch Foundation (2020) and the UK Home Office (2020).

Children’s and Young People’s Participation

During one of the COVID-19 lockdowns in the UK, focus groups were conducted with 42 children and young people aged 10-22 years, as well as 15 interviews with key professionals including senior police, educators, safeguarding experts, youth workers, victim service providers, tech and gaming companies, regulators and representatives from the wider tech industry.

The research captured their opinions on online harms and the impact on their lives, on service provision and on responses to online harms. The study also asked participants to describe what ‘acceptable use’ is in online spaces and what they thought about law enforcement’s current role in addressing online harms. Lastly, it considered what changes are needed in order to make online spaces safer.

What have children and young people experienced online during COVID?

Children and young people said online platforms were positive for communicating with friends, keeping in contact with people in other places, for educational purposes, finding new hobbies and having a sense of belonging as part of an online “community”. Some young people, in particular those who identify as LGBTQI, felt it was easier to talk to people online than in person and this assisted them with participation. 

Cyberbullying, threats, harassment, unwanted contact, unwanted content, negative consequences for mental health, the “toxic” nature of interactions and content, the lack of boundaries and the nature of restrictions for adults and youth, as well as the levels of surveillance, were some of the negative aspects identified by children and young people:

I don’t think I know one person who hasn’t had something bad go on online. (Young Person).

Young people questioned the safety of platforms and placed the onus on companies to address these issues:

Like if the platforms are safe and that, how does child porn get on to platforms, and how does the grooming happen and that? (Young Person).

The emphasis placed on the platforms to take responsibility was a consistent theme in the focus groups with children and young people. 

Children and young people asserted that the current system of self-regulation clearly was not working and that more was required to ensure online spaces were safe and positive for them.

Regulation not a panacea

Regulation and the creation of a regulatory framework have been the central focus throughout the discussions and debates in the United Kingdom. The proposed legislative framework of the draft Online Safety Bill includes details of an independent regulator.

Regulation, however, is not a panacea. It will not address each and every aspect of online harm. It is one element in an array of required measures, including education, the need to address social inequalities, the need for transparency by companies and partnership working, and recognition of and respect for the rights children have under the United Nations Convention on the Rights of the Child 1989.

The ‘grey areas’ that exist pose challenges. Policy professionals interviewed for this study felt that a distinction should be made between extreme harms and these more “grey areas”. Grouping all online harms together to provide broad responses would not work in practice.

Rather, the policy professionals interviewed advocated a principles-based approach to devising a legislative framework, which would be more likely to adapt to the ever-changing nature of tech and to the new kinds of harms that may emerge in the future.

Ethics and ethical professional practices have been missing from the discussions. In other domains, such as mainstream media regulation, professional ethical guidelines for industry exist and professionals can be held to account against them.

Education is also key. Children and young people feel that the education and training they receive on online safety is “outdated” and does not keep up with latest developments. Better innovation in education is important, including the development of a clear and effective education system for adults, as well as for children and young people. 

Children and young people suggested that peer-led training, varied formats such as interactive videos, and keeping information up to date are essential for the education of children, young people and adults alike.

A discussion of children’s rights in the digital environment has largely been missing from UK policy discussions. Children and young people in this study wanted to be better informed about their rights online, and they felt that this was closely related to “acceptable use” and consent.

What can a children’s rights-based approach offer?

The United Nations Convention on the Rights of the Child 1989, which the United Kingdom has ratified, was drafted long before the digital environment that we know and use today.

In recognition of this, the UN Committee on the Rights of the Child produced General Comment No. 25 on children’s rights in relation to the digital environment, published in March 2021. It offers up-to-date guidance and recommendations on aspects such as children’s rights to privacy, non-discrimination, protection, education and play.

A rights-based approach to decision-making can be extremely beneficial in the discussions about online harms, online safety and related policy proposals. General Comment No. 25 outlines how States parties should implement the Convention in relation to the digital environment. It also sets out clear guidance on how legislation, policies and other measures can be fully compliant with the Convention on the Rights of the Child and its Optional Protocols, in order to fully promote and fulfil all of the rights children have in the digital age.

Below are some examples drawn from the General Comment: 

Best interests

    • States parties should ensure that, in all actions regarding the provision, regulation, design, management and use of the digital environment, the best interests of every child is a primary consideration (General Comment, No. 25).

Design and Development of legislation: Participation

    • When developing legislation, policies, programmes, services and training on children’s rights in relation to the digital environment, States parties should involve all children, listen to their needs and give due weight to their views. They should ensure that digital service providers actively engage with children, applying appropriate safeguards, and give their views due consideration when developing products and services (General Comment, No. 25).
    • States parties should review, adopt and update national legislation in line with international human rights standards, to ensure that the digital environment is compatible with the rights set out in the Convention and the Optional Protocols thereto. Legislation should remain relevant, in the context of technological advances and emerging practices (General Comment, No. 25).
    • They should mandate the use of child rights impact assessments to embed children’s rights into legislation, budgetary allocations and other administrative decisions relating to the digital environment and promote their use among public bodies and businesses relating to the digital environment (General Comment, No. 25).

Redress and Remedy 

    • States parties should ensure that the mandates of national human rights institutions and other appropriate independent institutions cover children’s rights in the digital environment and that they are able to receive, investigate and address complaints from children and their representatives. Where independent oversight bodies exist to monitor activities in relation to the digital environment, national human rights institutions should work closely with such bodies on effectively discharging their mandate regarding children’s rights (General Comment, No. 25).
    • Businesses should respect children’s rights and prevent and remedy abuse of their rights in relation to the digital environment. States parties have the obligation to ensure that businesses meet those responsibilities (General Comment, No. 25).

Suggestions by young people in our study include:

    • children and young people leading panels;
    • meaningful involvement in the consultation on and design of platforms and products;
    • emphasis on company responsibility and being held to account for actions or inaction;
    • education on rights and on navigating terms and conditions for platforms and products in the digital environment.

A children’s rights-based framework has a lot to offer, and policymakers working in the online harms and online safety spaces in the UK should be encouraged to urgently engage with it. 


Resources:

Archbold, L, Verdoodt, V., Gordon, F. and Clifford, D. (2021) ‘Children’s Privacy in Lockdown: Intersections between Privacy, Participation and Protection Rights in a Pandemic’, Law, Technology and Humans, 3(1), pp. 18-34, https://lthj.qut.edu.au/article/view/1803

Gordon, F. and Cochrane, J. (2020) ‘Submission in Response to the United Nations Committee on the Rights of the Child’s Call for Comments on the General Comment on Children’s Rights in Relation to the Digital Environment’, November 2020,  https://www.ohchr.org/EN/HRBodies/CRC/Pages/GCChildrensRightsRelationDigitalEnvironment.aspx?mc_cid=25c645450b&mc_eid=2d4eeed250

Forthcoming: Gordon, F. (2021) Online Harms Experienced by Children and Young People: ‘Acceptable Use’ and Regulation. London: Catch22. 

Verdoodt, V., Fordye, R., Archbold, L., Gordon, F. and Clifford, D. (2021) ‘Esports and Platforming of Children’s Play during COVID-19’, International Journal of Children’s Rights. Contact the author for a copy.