
Exams Watchdog Boss Flouted Concerns on Algorithm Bias Raised in his Own Report

The chair of Ofqual didn’t follow his own advice on how algorithms can reinforce discrimination because of “biases in underlying datasets”

Students and teachers take part in a demonstration outside the Department for Education. Photo: unreguser/Xinhua News Agency/PA Images


Roger Taylor is the chair of Ofqual, the exam watchdog that has come under heavy fire for its deployment of an algorithm to decide the results of ‘A’ Level and GCSE students unable to sit exams due to the Coronavirus pandemic – a decision the Government has U-turned on this afternoon.

Taylor – a former private schoolboy who, ironically, flunked his ‘A’ Levels – is also chair of the Government’s Centre for Data Ethics and Innovation (CDEI), which advises ministers on how new technologies should be managed, harnessed and regulated.

In March 2019, the CDEI was commissioned by the Government to investigate how algorithms can reinforce human biases, including racial and socioeconomic discrimination. The full report hasn’t been published yet, despite the body’s stated aim to present its evidence to the Government in March this year.

However, the CDEI produced an interim report in July last year that delivered some preliminary conclusions – many of which make for embarrassing reading, given that Ofqual’s algorithm has been accused of entrenching structural inequalities in the education system.


Coding Bias

The CDEI interim report states that faults arise in algorithms when they “begin to reinforce problematic biases” due to design errors or “biases in underlying datasets”.

This becomes particularly pernicious, the report claims, when these algorithms are “then used to support important decisions about people’s lives, for example determining whether they are invited to a job interview”.

For context, the Ofqual exams algorithm attempted to prevent grade inflation – to ‘standardise’ the results of 2020 students in relation to previous years – by adjusting predicted grades based on two central factors: the past performance of schools, and teachers’ assessments of their students.

Because private schools have historically performed better than state schools, the algorithm dramatically downgraded the performance of teenagers from comprehensive schools. Many of the highest-performing students at historically lower-performing schools saw their grades tumble, while kids in the private sector had their results bumped up.
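
To illustrate the mechanism being described, here is a minimal sketch of how anchoring this year’s results to a school’s historical grade distribution can cap even a school’s strongest pupils. It is not Ofqual’s published model: the data, names, grade scale and function are illustrative assumptions only.

```python
# Illustrative sketch only: a toy version of standardising a cohort
# against a school's historical grade distribution. The data, names
# and grade scale are assumptions, not Ofqual's actual model.

def standardise(teacher_ranking, historical_grades):
    """Map the teacher's rank order of pupils onto the distribution of
    grades the school achieved in previous years."""
    # Sort the school's past grades from best to worst (A* = 6 ... U = 0).
    anchor = sorted(historical_grades, reverse=True)
    n = len(teacher_ranking)
    grades = {}
    for position, pupil in enumerate(teacher_ranking):
        # Each pupil inherits the grade at the same relative position in
        # the school's historical distribution, regardless of how strong
        # the teacher's assessment of that individual pupil was.
        idx = min(position * len(anchor) // n, len(anchor) - 1)
        grades[pupil] = anchor[idx]
    return grades

# A historically low-performing school: even its top-ranked pupil this
# year cannot receive a grade above the best grade the school achieved
# in previous years.
past = [4, 3, 3, 2, 2, 1]          # best previous result was a B (4)
ranking = ["Asha", "Ben", "Cara"]  # teacher's order, strongest first
print(standardise(ranking, past))  # Asha is capped at 4, however able she is
```
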

The Ofqual algorithm used a dataset with in-built biases – the centuries-old dominance of the private school system, combined with the strong performance of schools in more affluent areas – to decide one of the most important, life-shaping moments for thousands of young people. Taylor’s magical sorting hat flouted the advice of the CDEI, unequivocally.

In fact, the interim report explicitly acknowledges that class discrimination can be compounded by ill-judged algorithms.

“Decision-making, algorithmic or otherwise, can of course also be biased against characteristics which may not be protected in law,” it states, “but which may be considered unfair, such as socioeconomic background.”


Agents or Subjects?

Algorithms are impersonal, complex forces that shape almost every aspect of modern life – from social media feeds to the people you’re likely to go on a date with.

Therefore, the CDEI report argues, “a certain level of transparency about the performance of algorithms will be necessary for customers and citizens to be able to trust that they are fair”.

But this hasn’t exactly been the model followed by Ofqual during the exam results debacle. It appears as though few teachers or students were consulted on the algorithm before it was deployed, leading to a widespread backlash when people eventually realised its effects.

This point – about involving the subjects of algorithms when they are designed – was similarly stressed by Taylor in an interview with Computer Weekly in November 2018.

Speaking about his objectives for the CDEI, Taylor said his overriding ambition “is to make people feel that these technologies are not something that is just going to be done to them, but something they are going to have agency in how they are deployed across society, whether it’s collectively or individually, and how it’s going to impact their lives”.

With young people now out on the streets protesting against their treatment, and chants of “f*ck the algorithm” being heard outside Parliament, it is safe to conclude that students have not felt a sense of agency as a result of the methods used by Taylor’s exam watchdog.

The Department for Education and the CDEI have been contacted for comment.


