DFC Chair Baroness Beeban Kidron noted positively that the bill signals that governments are ‘beginning to percolate’ on the impact of technologies on children, and she believes the call for smartphone bans has been the catalyst for this interest. The meetup’s discussion points can be grouped into the following themes: EdTech, AI and universal values.
DFC Chair Baroness Beeban Kidron shares her thoughts
EdTech and schools
EdTech has been a stealth issue, founded on the assumption that its adoption is “better and pedagogically sound.” However, research has yet to support this assumption, and a gap remains between people’s expectations and what EdTech actually delivers. The DFC’s project on EdTech and children’s rights aims to give voice to children’s experience of EdTech and provide guidance for policy moving forward – see a summary of DFC’s previous EdTech work.
AI
5Rights’ Children & AI Design Code is an important ‘first word’ to embed children’s rights into AI. AI must be thought of in relation to specific domains, as technology changes across contexts: what AI achieves in the health sector and what it achieves in schools, for example, has different advantages and risks. One interesting idea discussed is the importance of understanding children's lives, over and above understanding the implications of the digital. For example, in the context of friendships being complemented or substituted by AI chatbots. The task is not merely to understand the consequences of increased use of chatbots in children's lives, but also to understand, from an account of children's friendships, whether the chatbots add or detract from the quality of their experiences.
AI’s impact on children must also be considered from the ‘maker’s experience’, which is often missing from the conversation. There are reports of children digging for cobalt in the Democratic Republic of Congo, and with AI requiring more cobalt, there are concerns that the expansion of AI will deepen children’s exposure to forced labour.
International decisions concerning AI are not made with due consideration for children. We must consider the implications of technological progression for children’s rights globally.
Universal values and local implementation
Whether rights set out universal priorities for children was debated, as ‘universality’ creates tension in the implementation of technology across diverse national and local contexts. Careful and tailored local implementation is currently insufficient, with predominantly English-language data and widespread use of Western-centric models. These problems are likely to worsen with the introduction of AI.
Given the acknowledged crisis at the UN, in both its funding and its authority, there is value in working horizontally to achieve systemic change in national and regional law and policies. This could include sharing good practice and learning from each other.
Overall, around the world there are sufficient commonalities to sustain a shared agenda of ideas and activities.
A number of DFC reports in progress were highlighted:
Other DFC projects referred to included:
Other projects and research shared during the meetup:
- ParentZone: new research on online child financial harms.
- Work of the Digital Child Rights Foundation.
- Online Safety Foundation Uganda, working to promote responsible digital use, child protection and digital literacy across communities.
- Stoilova, M., Bulger, M., & Livingstone, S. (2024). Do parental control tools fulfil family expectations for child protection? A rapid evidence review of the contexts and outcomes of use. Journal of Children and Media, 18(1), 29-49.
- 5Es framework for EdTech – helping developers consider ethical and environmental factors in EdTech design in a practical way.
- A list of organisations involved in protecting young people online in various ways.
- Quebec’s (Canada) recent special commission on the impact of screens on the health and development of young people, driving a national strategy that aims to centre the voice, testimony and vision of children.
- On AI relationship bots: Voice box research from young people’s perspectives.
- The State of Illinois’ bill that heavily regulates AI therapy innovations (for all ages), in effect reproducing current therapy experiences. This makes therapy innovations difficult to access, which poses particular limitations for teens.
- Common Sense Media research that addresses problems with YouTube.
- A study of the tensions that led to the failure of inBloom, which also addresses the tension between Silicon Valley’s rapid development pace and the precautionary principle when considering the needs of schools and children.