Today we’re pleased to share more details of the Justice Programme’s new strategic litigation project: challenging the (mis)use of remote proctoring software.

What is remote proctoring?

Proctoring software uses a variety of techniques to ‘watch’ students as they take exams. These exam-invigilating software products claim to detect, and therefore prevent, cheating. Whether or not the software can actually do what it claims, there is concern that it breaches privacy, data and equality rights, and that the negative impacts of its use on students are significant and serious.

Case study: Bar Exams in the UK

In the UK, barristers are lawyers who specialise in courtroom advocacy. The Bar Professional Training Course (BPTC) is run by the professional regulatory body, the Bar Standards Board (BSB).

In August 2020, because of COVID-19, the BPTC exams took place remotely, using a proctoring app from the US company Pearson VUE.

Students taking exams had to allow their room to be scanned and an unknown, unseen exam invigilator to surveil them. Students had to submit a scan of their face to verify their identity, and were prohibited from leaving their seat for the duration of the exam. That meant up to 5 hours (!) without a toilet break. Some students had to relieve themselves in bottles and buckets under their desks whilst maintaining ‘eye contact’ with their faceless invigilator. Muslim women were forced to remove their hijabs, and at least one individual withdrew from sitting the exam rather than compromise their faith. The software itself suffered numerous faults, including freezing suddenly without warning and deleting text. One third of students were unable to complete their exam due to technical errors.

Our response

The student reports, alongside our insight into the potential harms caused by public impact algorithms, prompted us to take action. In our view, what students were subjected to breached data protection, privacy and other legal rights, as follows:

Data Protection and Privacy Rights

  • Unfair and opaque algorithms. The software used algorithmic decision-making to identify students through facial recognition and/or facial matching, and to analyse their behaviour during the exams. How these algorithms worked was neither known nor disclosed.
  • The app’s privacy notices were inadequate. There was insufficient protection of the students’ personal data. For example, students were expressly required to confirm that they had ‘no right to privacy at your current location during the exam testing session’ and to ‘explicitly waive any and all claims asserting a right to individual privacy or other similar claims’. Students were asked to agree to these terms just moments before starting an extremely important exam, without any advance warning.
  • The intrusion involved was disproportionate. The software required all students to carry out a ‘room scan’ (showing the remote proctor around their room). They were then surveilled by an unseen human proctor for the duration of the exam. Many students felt this was unsettling and intrusive.
  • Excessive data collection. The Pearson VUE privacy notice reserved the power to collect very broad classes of personal data, including biometric information, internet activity information (gleaned through cookies or otherwise), “inferences about preferences, characteristics, psychological trends, preferences, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes” and protected characteristics.
  • Inadequately limited purposes. Students were required to consent to Pearson VUE disclosing their personal data to third parties “in order to manage day to day business needs”, and to consent to the future use of “images of your IDs for the purpose of further developing, upgrading, and improving our applications and systems”.
  • Unlawful data retention. Pearson VUE’s privacy notice stated that “We will retain your Personal Data for as long as needed to provide our services and for such period of time as instructed by the test sponsor.” This sets no defined retention period, leaving students’ data to be held indefinitely at the test sponsor’s discretion.
  • Data security risks. Given the sensitivity of the data students had to provide in order to take the exam, high standards of data security were essential. Pearson VUE gave no assurances regarding the use of encryption. Instead, there was a disclaimer that “Information and Personal Data transmissions to this Site and emails sent to us may not be secure. Given the inherent operation and nature of the Internet, all Internet transmissions are done at the user’s own risk.”
  • Mandatory ‘opt-ins’. The consent sought from students was illusory, as it did not enable them to exert any control over the use of their personal data. If they did not tick all the boxes, they could not sit the exam. Students could not give valid consent to the invasion of privacy occasioned by online proctoring when their professional qualification depended on it; they were, in effect, coerced into surrendering their privacy rights. Under the GDPR, consent is only valid if it is ‘freely given’, which it cannot be when agreement is imposed as a condition of sitting the exam.

Equality Rights

Public bodies in the UK have a legal duty to carefully consider the equalities impacts of the decisions they make. This means that a policy, project or scheme must not unlawfully discriminate against individuals on the basis of a ‘protected characteristic’: their race, religion or belief, disability, sex, gender reassignment, sexual orientation, age, marriage or civil partnership and/or pregnancy and maternity.

In our letter to the BSB, we said that the BSB had breached its equality duties by using software that featured facial recognition and/or matching processes, which have been widely shown to discriminate against people with darker skin.

The facial recognition process also required female students to remove their religious dress, breaching the protections afforded to people to observe their religion. Female Muslim students were unable to request a female proctor, despite the cultural and religious significance of being observed by an unknown man in their own homes.

We also raised the fact that some disabled students and pregnant women were unfairly and excessively impacted by the absence of toilet breaks for the duration of the assessment. The use of novel and untested software, we said, had the potential to discriminate against older students with fewer IT skills.

The BSB’s Reply

After we wrote to express these concerns, the BSB:

  • paused the use of remote proctoring apps that had been scheduled for the next round of bar exams
  • announced an inquiry into its use of remote proctoring apps in August 2020, to produce an independent account of the facts, circumstances and reasons why things went wrong. The BSB invited us to make submissions to this inquiry, which we have done. You can read them here.

Next steps

Here at the Open Knowledge Justice Programme, we’re delighted that the BSB has paused the use of remote proctoring, and we keenly await the publication of the independent inquiry’s findings.

However, we have recently been concerned to discover that the BSB has delegated decision-making authority for the use of remote proctoring apps to individual educational providers, e.g. universities and law schools, and that many of these providers are scheduling exams using remote proctoring apps.

We hope that the independent inquiry’s findings will conclusively determine that this must not continue.

 

Sign up to our mailing list or follow the Open Knowledge Justice Programme on Twitter to receive updates.

Meg designed the Open Knowledge Justice Programme in response to the important questions posed by the increasing use of information technology, data and algorithms in the justice system. The resulting training curriculum has developed from her experience within Open Knowledge’s School of Data project team, supporting the delivery of data-driven projects aimed at governments, journalists and citizens. Prior to joining the Open Knowledge Foundation, Meg worked as a legal adviser to detained asylum seekers.