Key learning points
- Children in education deserve high standards of privacy and data protection; therefore, EdTech needs to be safe and trustworthy.
- The use of EdTech in schools often leads to a significant invasion of children's privacy through data collection, yet there is insufficient evidence to support the claimed learning benefits associated with these practices.
- Schools and parents often lack the technical and legal expertise to understand what data has been collected, who has access to it and its impact on children and their futures. Regulators and governments must take action.
- A code of practice for EdTech, supported by a certification scheme, would make it easier for schools to confidently use technology in a rights-respecting manner, for the benefit of children.
The Digital Futures for Children centre (DFC)'s proposal for a code of practice for EdTech would address the urgent challenges posed by today's data-driven education.
- A code of practice, supported by a certification scheme, would reduce the burden on schools, which currently act as data controllers on top of their educational responsibilities. By delegating part of the responsibility for demonstrating compliance to the companies that profit from children's data obtained in education, a more equitable playing field could be established. Schools, freed from these overwhelming responsibilities, could then utilise the real benefits of the available technology while ensuring the protection of children.
- This code would help schools identify products that protect children's data rights, uphold the rights of the child and provide educational benefits. Building trust in EdTech would boost its adoption by schools and parents. EdTech providers would understand the standards they should meet and how to meet them.
- This proposal further advances the DFC's Blueprint for education data, itself the culmination of three years' multidisciplinary, multistakeholder work that sets out clear criteria for a pro-innovation, rights-respecting framework for EdTech.
The DFC proposal for a code of practice for EdTech is supported by evidence from three research briefs:
1. International regulatory decisions concerning EdTech companies’ data practices
This brief highlights international government actions against Google Workspace for Education, regarding the platform’s handling of children’s data. This raises a pressing question: why is the UK not taking similar action?
- Recent regulatory decisions underscore growing global concern about the data practices of EdTech companies that have access to schools. These concerns include unfair data practices such as lack of transparency, complexity around what data is handled and how, failure to take appropriate measures to comply with the law, deceptive data practices, abuse of dominant market positions and the pervasive data practices that are increasingly common as AI use spreads in schools.
- International regulatory and legal actions demonstrate that many EdTech companies have breached data protection laws. As a result, schools and local authorities (as responsible data controllers) are left to shoulder burdensome legal obligations. It is striking that while many countries have taken legal action against EdTech companies, especially Google, there has been little such action in the UK to date.
2. Enforcement action improves privacy for children in education: more is needed. A brief analysis of recent changes to policies and practice in Google’s Workspace for Education
This research showed that although international government actions led to improvements in Google’s policies, these are still insufficient.
- We found that regulation and enforcement actions, mainly in the Netherlands, have led to improvements in Google's policies. Although these are steps in the right direction, they are not sufficient to address the challenges schools face as data controllers. Schools still struggle with Google's complex, opaque data practices and policies, while bearing the onus for Google's compliance.
- The distinction between Google Workspace for Education policies and Google's general Privacy Policy remains unclear, particularly regarding how children's data is handled for educational purposes, adding further confusion for users.
- Schools, acting as data controllers, face challenges in navigating Google's privacy policies and lack meaningful control over how Google processes data, leaving schools with a significant burden under the law.
3. In support of a Code of Practice for Education Technology
This research summarises the conversation around EdTech:
- Children in education deserve strong protection of their rights, including privacy and data protection. Core concerns around EdTech include social profiling, data collection and leaks of children's data.
- There is limited evidence to support the claimed learning benefits of education technology in the classroom. Moreover, there are no benchmarks for assessing how EdTech interactions truly benefit children, which raises concerns for educators, parents and children themselves.
- That said, EdTech used in schools can offer significant administrative and educational support to teachers. Moreover, education-based data collection could offer benefits such as improving educational and other outcomes for children. Good regulation should be fair and help build trust in EdTech to ensure the continued use of innovative technologies by schools, children, parents and caregivers.
The DFC proposals address the urgent challenges we face in today’s landscape of data-driven education. It is high time the UK took action to protect children’s data and rights while children are learning. There are also lessons here for regulators and child rights advocates in other countries.
The background
These research briefs build on the DFC's research on education data. They document the extent of the legal and compliance problems with EdTech's commercial processing of children's data while they learn, as well as the regulatory improvements that schools want and that children deserve.