
Government Urged to Drop Data Protection Reforms Which May Make Children an Easy Target

An open letter has been sent to Secretary of State for Science, Innovation and Technology, Michelle Donelan, urging her to ditch the bill which is back before the House of Lords today

Michelle Donelan leaves a cabinet meeting in November 2022. Photo: Sipa US / Alamy


The Government is being urged to drop data protection reforms which appear to put commercial interests ahead of protecting children’s data, and instead support action aimed at better protecting the education sector under existing law.

The Data Protection and Digital Information Bill, currently at the Committee stage in the House of Lords, aims to update and simplify the UK’s data protection framework, according to the Government, but has been controversial with data protection experts and campaigners.

In a joint effort led by Defend Digital Me, expert groups, teaching unions and academics with a focus on state education, data, technology, and human rights have written an open letter to the Secretary of State for Science, Innovation and Technology, Michelle Donelan, urging her to ditch the Bill.

“Overall the Bill is a significant shift away from a rights-based regime towards a set of market standards which treats data as a product,” Stephen Cragg KC notes in a legal opinion on the bill.

“If the new definition of personal data … is enacted that will also, of course, mean that fewer data of children will be protected under the new law.”


The open letter explains how the Bill undermines every one of the seven key data protection principles, lowering today’s standards of obligations on lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; confidentiality and security; and accountability.

Several clauses in the Bill appear to put commercial interests ahead of protecting children’s data, allowing companies to train their AI products on students’ data and to investigate the opportunities of combining it with genomic data, for which a contract has already been awarded.

Clause two of the Bill redefines the terms “scientific research” and “scientific research purposes” to mean “any research that can reasonably be described as scientific, whether publicly or privately funded, and carried out as a commercial or noncommercial activity”. It reduces people’s rights to see a copy of their data, ask for corrections, and object to re-uses, and could result in reduced data security if data is kept indefinitely in fully identifiable formats rather than anonymised, as it should be now.

The letter states this is “a seismic shift” removing layers of protection that may open the door to commercial and other third parties exploiting “those weak spots to intrude into our lives”.


Clauses three and six of the Bill create opportunities for more unexpected uses of our information without informed consent, and therefore less protection from re-uses. The Bill elevates a list of legitimate interests to a position where the fundamental rights of data subjects, including children, can effectively be ignored where the processing of personal data is concerned, and gives the Secretary of State the power to add to this list without primary legislation, bypassing Parliamentary controls.

Business-friendly interests, such as direct marketing, have been added to this list without provisos, which Defend Digital Me warns will give “succour to commercial organisations to increase levels of spam”. There are no added safeguards to protect people from it.

The Bill permits targeted political marketing at children aged 14-18 – with no fact-checking or oversight measures – meaning teenagers could be exposed to content containing political extremism.

It fails to address the lack of oversight in England of widespread profiling,
data mining, marketing, and school data agreements that can leave children of all ages
open to commercial exploitation. The Bill further shifts the imbalance of power away from
school staff, families and learners, by removing the obligation to have a Data Protection
Officer, and reducing the accountability for data processing.



The Department for Education (DfE) has a woeful track record with data: a 2020 Information Commissioner’s Office (ICO) audit made 139 recommendations for improvement, over 60% of them classified as urgent or high priority and representing “clear and immediate risks”.

In 2022, the DfE was reprimanded after gambling companies misused a learning records
database, and in March 2024, the ICO took regulatory action against five public
authorities under the FOI Act including the DfE.

The open letter suggests children may become an easy market for data brokering, increasing the volume of spam and upselling within EdTech products.

The EdTech sector is 70% start-ups, which can fail to meet cybersecurity standards. Their products are often still in development, with companies using children as free data producers to train and develop new AI products, and teachers providing free digital labour for the companies. Many start-ups are bought out, with the data they have collected changing hands multiple times, often between foreign investors without values directly connected to education or pedagogy.

The DfE is reportedly considering “a number of questions” regarding the re-use of national pupil data for AI development, including data ownership and IP. Defend Digital Me has identified potentially unsafe technology products which fall under the Bill’s “safeguarding vulnerable individuals” umbrella.



Many such products may soon be banned in educational settings in the EU under the EU AI Act, but will be allowed in the UK. They include those that claim to be able to identify mood and emotions using “pose estimation” based on data from pupils’ faces, and products marketed as being able to identify and profile pupils’ “hidden social-emotional” traits.

If marketed under the Bill’s safeguarding umbrella, these products will continue to be permitted in UK classrooms, colleges or universities. The changes mean they could even skip the risk assessment known as ‘the balancing test’ in future. In addition to biometric data, increasingly sensitive bodily data is collected by emerging technologies through haptics, immersive tech, robot sensors and voice-assisted tools.

The DfE and Government Office for Science recently awarded a contract to look at the implications of future genomic technologies on the education sector. Some researchers want to see genetic data population-wide joined with educational records.

Dr Helen Wallace, Director of GeneWatch UK, has said the Bill as drafted “is a short-sighted and extremely dangerous attempt to tear up existing safeguards for people’s DNA and genetic information”.

“If passed, these changes will damage people’s trust in health, research and police uses of their DNA, perhaps for generations,” she said.
