An independent inquiry has adopted nearly all of our recommendations in our first challenge to the misuse of Public Impact Algorithms. The inquiry gives strong guidance to the UK’s Bar Standards Board on the use of “remote proctoring software”, guidance that should now inform how others use this technology.


About The Justice Programme 

The Justice Programme is a project of the Open Knowledge Foundation that works to ensure Public Impact Algorithms do no harm. Find out more about The Justice Programme here, and about Public Impact Algorithms here.

 

The story so far

During the Covid pandemic, many educational institutions started using remote proctoring software to monitor students during their exams, i.e. watching students through their webcams in combination with facial recognition and behavioural recognition technology.

Remote proctoring software invades students’ privacy and, because it relies on opaque algorithmic systems, runs a serious risk of replicating discrimination. Read more about it here.

In the UK, the Bar Standards Board (BSB) contracted with Pearson VUE to provide such software for the vocational exams for barristers in 2020. The BSB justified the use of remote proctoring software on the grounds that it was necessary to ensure the ‘integrity’ of the exams.

With funding from the Digital Freedom Fund, we notified the BSB that we intended to bring legal action, due to concerns that the procedural protections against the use of opaque systems, namely a data protection impact assessment and an equality impact assessment, had not been properly conducted, if at all.

 

The Independent Inquiry

In response, the BSB announced that use of remote proctoring software would be paused whilst an independent expert inquiry took place.

The inquiry was run by Professor Huxley-Binns, an expert in legal education, working alongside Dr Sarabajaya Kumar, an expert in diversity and disability.

The focus of the inquiry was to find out what happened, why it happened, who was to blame, and what could be done to prevent it from happening again.

 

Our submissions to the inquiry

The Justice Programme Litigation Team gave extensive evidence to the inquiry, culminating in a set of recommendations. When the inquiry’s findings were published, seven of our eight recommendations had been adopted by Professor Huxley-Binns.

This is a big achievement! The BSB has agreed to adopt these recommendations in the form of an action plan. The report and action plan should act as a safeguard: future students who experience problems with remote proctoring software can use the recommendations to hold the BSB to account. At a time when so many new algorithmic decision-making systems remain unregulated, safeguards such as guidelines and recommendations are extremely important.

Our recommendations included:

  • putting the voices, needs and experiences of students at the centre of any future procurement and/or deployment of exam solutions based on emerging technologies;
  • consolidating and simplifying the data protection framework, with clear data protection standards for all course and exam providers and the timely use of data protection impact assessments to identify and mitigate risks before contracts are put in place;
  • ensuring open access to the type of technology being used, and in all cases ensuring it is transparent and explained to the end user.

What’s Next?

Use of remote proctoring technology is expanding fast, and this case is only a drop in the ocean.

We need to raise awareness of the potential harms of these opaque technologies and challenge further misuse.

Here at The Justice Programme, we are already researching the use of remote proctoring in migrant language testing in the UK, as well as in student doctors’ vocational exams, where harms have been widely reported.

Stay tuned for further news, here and on the OKFJP Twitter channel.

 


Meg designed the Open Knowledge Justice Programme in response to the important questions posed by the increasing use of information technology, data and algorithms in the justice system. The resulting training curriculum has developed from her experience within Open Knowledge’s School of Data project team, supporting the delivery of data-driven projects aimed at governments, journalists and citizens. Prior to joining the Open Knowledge Foundation, Meg worked as a legal adviser to detained asylum seekers.