Children’s rights in the digital environment have been defined – now they need defending


MSc candidate Shrenya Soni details some of the discussions at The Digital Futures for Children centre’s Annual Research Insights Day, hosted by the London School of Economics and Political Science and the 5Rights Foundation with collaborators across academia, policy, and civil society.

A few hours of a child’s attention can generate measurable revenue for a platform. That is not a hypothetical warning: it is already happening, at scale, by design.

At the Digital Futures for Children Annual Research Insights Day, hosted by the LSE on 13th March, a key tension emerged: children are increasingly navigating digital systems that shape how they learn, interact, and understand the world, while the standards intended to safeguard them remain inconsistently applied. Five years after the UN’s General Comment No. 25, the challenge is no longer defining children’s digital rights, but enforcing them.

A global framework exists, full adoption does not

The opening panel, chaired by LSE’s Professor Sonia Livingstone and featuring Baroness Beeban Kidron of the UK House of Lords and 5Rights Foundation, Professor Eva Lievens of Ghent University, and Dr Kim R. Sylwander of the Digital Futures for Children Centre at LSE, reflected on the last five years of General Comment No. 25 (GC25). Widely regarded as the most comprehensive articulation of children’s rights in digital environments, the comment challenges the idea that technology operates outside established social and legal norms.

As Baroness Kidron argued, the notion of “tech exceptionalism” has allowed digital systems to evade the standards applied elsewhere. The term describes a broader ideological framing in which digital technologies are constructed as distinct from, and not fully subject to, existing legal, ethical, and social norms. This framing not only delays regulation but also reshapes expectations of responsibility, particularly for corporate actors. GC25 rejects this framing, asserting that children’s rights, including privacy, participation, and freedom of thought, apply fully in digital spaces.

Yet implementation remains partial. Professor Lievens pointed to the tendency of policymakers to focus narrowly on protection while overlooking participation and agency, treating children as passive subjects rather than recognising them as active stakeholders with a right to influence how digital systems are designed and governed. References to GC25 are emerging across laws, policy initiatives, and public debates in countries such as Brazil, Spain, and Indonesia, but these rarely translate into consistent enforcement. The issue is not the absence of legal tools: regulatory bodies already have the authority to act, yet this authority is unevenly exercised.

Dr Sylwander highlighted a further complication. Without global benchmarks or consistent data, it remains difficult to assess how children’s rights are actually being realised in digital environments. The result is a fragmented evidence base that makes accountability harder to sustain.

Responsibility is therefore displaced, rather than meaningfully enforced. Parents are expected to manage risks they cannot fully understand, and schools are left navigating complex systems without clear standards. Meanwhile, companies continue to design products that depend on the very practices GC25 was intended to challenge. The implication is clear: access to children should be conditional, granted only to services that can meet rights-respecting standards.

EdTech is everywhere. Accountability isn’t.

The second session, chaired by LSE’s Dr Alison Powell, brought together Dr Sandra El Gemayel of the Digital Futures for Children Centre, Dr Ayça Atabey of the University of Edinburgh, Kasia Suliga, a computing lead and educator, and Amelie, a 5Rights Global Youth Ambassador, to examine how EdTech and AI are reshaping learning environments.

Dr El Gemayel’s research, based on work with primary and secondary schools in the UK, shows how deeply embedded these tools have become. Teachers value the efficiencies they provide, particularly in feedback and assessment, while children often assume their data is secure. Beneath this, however, the experience is far less consistent.

Students frequently encounter automated feedback that is unclear or misaligned with their needs. In Dr El Gemayel’s research, children described AI tools as “glitching” or failing to understand what they were trying to do. What is framed as “personalised learning” can instead make the learning process more opaque. In response, many students turn to external tools to generate explanations or practice materials when formal systems fall short, creating an informal layer of support alongside official platforms.

Dr Atabey’s work reframes this as a structural issue rather than a technical one. Educational benefits of EdTech, she argued, are often limited, while commercial incentives remain deeply embedded. The presence of data extraction and advertising infrastructures within learning environments challenges the idea of education as a protected public space. The question is not whether these tools should be removed, but whether they can be redesigned to align with children’s rights.

As an educator, Suliga described the speed at which these tools are being adopted in the classroom, often without clear frameworks to guide their use. Teachers are already integrating AI into everyday tasks, from drafting professional communication to preparing learning materials, yet the broader implications for learning remain uncertain. The challenge is not whether classrooms will adopt AI, but how to achieve balance and meaningful pedagogical integration.

Amelie’s perspective made that tension explicit. For students like herself, these systems are not optional, yet their limitations are widely recognised. Concerns about bias, data use, and uneven outcomes are not abstract. As Amelie noted, students are often left to teach themselves how to navigate the risks of the tools they are required to use.

Screen time is the wrong measure

The afternoon session on mental health and resilience, chaired by LSE’s Dr Mariya Stoilova, shifted the focus from systems to experience. Presentations by Dr Kasia Kostyrka-Allchorne of Queen Mary University of London and Damon De Ionno of Revealing Reality introduced new approaches to understanding children’s digital lives through projects such as ORChiD and DigiPulse.

A central argument across both projects is that “screen time” is an inadequate measure. What matters is not how long children spend online, but what they encounter and how they respond to it. Dr Kostyrka-Allchorne’s research shows that social comparison, exposure to upsetting content, and difficult interactions are associated with increased anxiety and depression, both across individuals and within the same individual over time.

By contrast, general use, including leisure and everyday interaction, shows little consistent link to negative outcomes. This distinction shifts attention away from quantity and towards experience, suggesting that reducing time online alone is unlikely to address harm.

The DigiPulse project extends this point by capturing real-time engagement and revealing the effects of smartphone use on mental health. As De Ionno’s findings make clear, children’s participation takes place within systems designed to extract value from attention, shaping their experiences in ways that are not always apparent.

Understanding wellbeing in the digital environment, then, requires a shift in perspective. It is not only about limiting exposure to risk, but about recognising the conditions under which participation occurs and the incentives shaping those environments.

Regulation exists, but enforcement hesitates

The final session, chaired by Baroness Beeban Kidron and featuring Steve Wood of PrivacyX Consulting, Beckett LeClair of the 5Rights Foundation, and Jasmina Byrne, former Chief of Foresight and Policy at UNICEF, returned to the question of regulation. If the evidence is clear, why does progress remain slow?

Wood pointed to the limits of current approaches. He noted that while regulation has driven some change, it is not yet working as effectively as intended. Much of the current model relies on engagement with companies, encouraging compliance rather than enforcing it.

LeClair emphasised that regulation’s influence is visible precisely because companies invest heavily in lobbying against it. The issue is not whether regulation matters, but whether it is applied with sufficient urgency.

Byrne’s intervention pointed to where this could shift. Rather than responding after harm occurs, regulatory thinking needs to move upstream, shaping systems before they are deployed. This requires working alongside developers and embedding children’s rights into design processes from the outset, rather than treating regulation as a corrective.

Across the discussion, one point remained consistent. Children are rarely included in these processes, despite being most affected by them. When they are consulted, their insights are immediate and grounded in experience, making their exclusion increasingly difficult to justify.

What acting on this evidence requires

The challenge is no longer identifying the issues, but responding to them at the scale they demand. Frameworks such as GC25 already define what children are entitled to, and research continues to document where current systems fall short. Acting on this knowledge would mean enforcing existing regulations, designing digital systems that embed children’s rights from the outset, and recognising children as participants in shaping the environments they use.

Children do not experience the digital world in policy timelines or research cycles. They experience it in real time, within systems that are already shaping their lives.

This post gives the views of the author and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.


