EdTech needs a code of practice

  • Children in education deserve high standards of privacy and data protection, and EdTech needs to be safe and trusted.
  • EdTech used in schools results in widespread invasion of children’s privacy through data collection, and there is insufficient evidence to support the claimed learning benefits.
  • Schools and parents lack the technical and legal expertise to understand the data collected, who has access to it, and the impact on children and their futures; the regulator and government must take action.

The Digital Futures for Children (DFC) proposal for a code of practice for EdTech would address the urgent challenges of today’s data-driven education. Supported by a certification scheme, it would reduce the burden on schools as data controllers and enable uses of EdTech that benefit children: respecting their rights while allowing beneficial sharing of education data for innovation. It advances our Blueprint for education data, itself the culmination of three years of multidisciplinary, multistakeholder work setting out clear criteria for a pro-innovation, rights-respecting framework for EdTech. It is high time the UK took action to protect children’s data and rights while children are learning.

The DFC proposal for a code of practice for EdTech is supported by evidence from two new research briefs:

International regulatory decisions concerning EdTech companies’ data practices

This shows the extent of recent regulatory action against Google over Workspace for Education’s treatment of children’s data, and raises a pressing question: why is the UK not taking similar action?

  • Recent regulatory decisions highlight growing global concern about EdTech companies’ data practices in schools. These include unfair practices such as a lack of transparency; complexity around what data is handled and how; failure to take appropriate measures to comply with the law; deceptive data practices; abuse of dominant market positions; and increasingly pervasive data practices as AI use spreads in schools.
  • Regulatory and legal actions worldwide show that EdTech companies have breached data protection laws, while schools and local authorities bear the consequences, since onerous legal obligations fall on them as data controllers. It is striking that many countries have taken legal action against EdTech companies, especially Google, yet there has been little such action in the UK as yet.

Enforcement action improves privacy for children in education: more is needed. A brief analysis of recent changes to policies and practice in Google’s Workspace for Education

This shows that recent enforcement actions have led to improvements in Google’s policies, but these remain insufficient.

  • We found that regulation and enforcement actions, mainly in the Netherlands, have led to improvements in Google’s policies. Although these are steps in the right direction, they are not sufficient to address the challenges schools face as data controllers. Schools still struggle with Google’s complex, opaque data practices and policies, while bearing responsibility for Google’s compliance.
  • The distinction between Google Workspace for Education policies and Google’s general Privacy Policy remains unclear, particularly regarding how children’s data is handled for educational purposes, adding further confusion for users.
  • Schools, acting as data controllers, face challenges in navigating Google’s privacy policies and lack meaningful control over how Google processes data, leaving schools with a significant burden under the law.

Together, the DFC proposals for A blueprint for education data and a code of practice for EdTech would address the urgent challenges we see in today’s data-driven education. There are also lessons here for regulators and child rights advocates in other countries.

The background

These new research briefs build on the DFC’s research on education data. They document the extent of the legal and compliance problems with EdTech’s commercial processing of children’s data while they learn, as well as the regulatory improvements that schools want and that children deserve.
