200 students failed their exams. Automated proctoring could be to blame, but doubts remain 

In Spain, 200 students at the International University of La Rioja failed their exams. Some blame a glitch in the proctoring software, but the cause might have been a change in the system’s rules. University officials gave contradictory explanations, leaving students to fight bureaucracy and the verdict of a machine.

Image (MidJourney): A student takes an exam at her computer, seen from the back. Next to her, a cell phone on a tripod films her.

Taking exams online is common in many higher education institutions, especially after the coronavirus pandemic forced millions of students around the globe to stay at home. Some of these institutions opted to buy software that, once installed on students’ computers at home, monitors them throughout the exam. This means that supervision during the test is mostly or fully automated – and there are times when this can affect people’s futures.

These monitoring systems are known as ‘proctoring’ programs. In Spain, several universities started using them during the pandemic and kept them afterward for distance learning. Many of these universities relied on proctoring software provided by the Spanish company Smowltech. Among them is the International University of La Rioja (UNIR), which a group of students has accused of failing around 200 people who took online exams last May. The reason could have been a glitch in Smowltech’s program.

For several days, most of these students had a zero in several subjects, and it was not until they organized and reported the situation to the university that reviews started to take place. Some of them thought that they would have to pay for and repeat the whole academic year. In the end, they were told that there had been a problem with the camera system that monitored their exams.

Automated supervision

The proctoring system, called Smowl, is used in this case to raise an alert if someone is cheating during a test. It consists of an application that is installed on personal computers for the exam. It can then monitor which programs and apps are open, while also recording the ambient sound of the room and images of what the person is doing the whole time.

Students are required, among other things, to set up a two-camera arrangement: the front camera monitors the person’s face and the entrance to the room at all times, while a second device, such as a cell phone or tablet, films them from a side angle. Its purpose is to capture the surroundings, the hands and body, the computer screen, and the surface of the table where they are taking the test. These instructions are compiled in a student manual they receive when enrolling.

If Smowl detects that one of these cameras is not connected or positioned appropriately, the button to access the exam turns red, and students are advised not to start the exam. However, if this happens during the test, or the system detects that the camera position does not match what it is programmed to accept as correct, it might mark the test as failed with a zero.

Some of the affected students told AlgorithmWatch that around 200 schoolmates experienced the latter during the last exam session of the academic year. Apparently, Smowl detected a problem with the external camera – the one connected from a separate device – which could mean that people had cheated.

“I went through one of my exams and everything turned out well. I started another one and a red light lit up. If that happens you’re supposed to contact the technical support team at UNIR, but when I called they told me that it was not a problem, that the incident had been registered and that it wouldn’t affect my grades”, says Itsasne, one of the affected students, who preferred not to give her last name out of concern for her relationship with the university. “But when I got my grades, everything had been graded with a zero, and even though my teachers said that the images recorded by Smowl looked correct, they couldn’t grant me a pass right away”.

Fighting the machine

The university has not clarified to AlgorithmWatch whether the failed grades were caused entirely by an automated error in Smowl or by something else: “The system detects evidence of actions that are not permitted under the conduct rules for exams, which is then collected by the IT service staff and the teachers. They analyze the evidence and decide whether the student passes or fails the exam”, explained Adela López, UNIR’s vice chancellor for students.

Another student, Laura, says that her teacher told her she had failed – like other classmates – because of an incorrect position of the external camera: “A second review has been accepted only in those cases where the technical system did not detect and raise an alert at the moment the second camera stopped working”, she said, according to an email AlgorithmWatch had access to.

Some students claimed that, after they received their grades, the manual explaining how to place the external camera was modified. AlgorithmWatch asked the university whether this change had anything to do with the number of failed grades and the students’ complaints, but received no specific answer. We were able to check the manual and saw that the instructions on camera placement had indeed been modified. We also accessed an internal statement signed by López after the exam session, which stated that “no changes had been made to the rules or the guidelines applied”.

“The academic assessment regulation provides that the student may ask the lecturer to review the grade. If the student is still not satisfied, they may request that a committee appointed by the dean or director review their case”, said López.

While most of the affected students AlgorithmWatch talked to managed to get their exams reviewed, at the time this report was written others still had a zero in some subjects or had failed anyway during the review process.

Ban on face recognition – is it enough?

The use of Smowl – particularly at this university – has mobilized plenty of students this year, but it is not a first. Smowl also offers face recognition to verify a student’s identity. While plenty of universities still use this feature, UNIR does not. It dropped it after complaints were filed with the Spanish data protection agency (AEPD, for its Spanish acronym) by the former student association HUxIR. In 2021, around one thousand students organized under this association to denounce the use of this technology during exams. Today, identity is verified with a selfie and a picture of an ID.

Face recognition is widely used to monitor exams, even though authorities like the AEPD have concluded that it is not a proportionate measure, and it has become just another add-on to proctoring systems, as denounced by the Alianza por la Justicia Algorítmica (AxJA, Alliance for Algorithmic Justice), a group that brings together associations and experts to litigate cases where technology is causing some type of discrimination. While the group is looking into the recent case at UNIR, it is also behind one of the sanctions imposed on the Universitat Oberta de Catalunya (UOC) for the use of face recognition in examination processes.

“Face recognition is being implemented as a way to guarantee that grades are correct”, says Mireia Orra, coordinator of AxJA and Head of Policy at Eticas Tech. “The main problem, as we saw even in the rebuttals that the UOC sent us, is that they argue that only a few exams are done with face recognition because they grade students in in-person tests or through continuous assessment. But that makes face recognition even less necessary and proportionate, because there is an alternative”.

Alternatives in this case mean the use of other monitoring tools, like the two-camera setup, which HUxIR has also denounced. Here, data protection authorities do not see a privacy violation, but the use of such tools leads to bureaucratic mazes that can still mean a couple of hundred students might not graduate from university.

Naiara Bellio (she/her)

Reporter

Naiara Bellio covers privacy, automated decision-making systems, and digital rights. Before she joined AlgorithmWatch, she coordinated the technology section of the Maldita.es foundation, addressing disinformation related to people's digital lives and leading international research on surveillance and data protection. She also worked for Agencia EFE in Madrid and Argentina and for elDiario.es, and collaborated with organizations such as Fair Trials and AlgoRace in researching the use of algorithmic systems by administrations.
