The European Union poured 5 million euros into the development of a border surveillance system called NESTOR. When we tried to look into it, we were presented with hundreds of redacted, blacked-out pages.
The research and innovation project NESTOR, funded within the HORIZON 2020 framework, ended in April 2023. It is, according to the project deliverables, an assemblage of unmanned vehicles, Artificial Intelligence technology, social media analysis, and data-based risk assessments designed to protect EU borders.
When we filed the usual, legally guaranteed information requests, we ran into a solid wall of opacity, built to shield the workings of this EU border security project from public scrutiny.
As a result, we cannot assess NESTOR’s technological output. We do not know whether the systems worked as intended, what risks come with them, or whether mitigation measures were put in place to avoid such risks. Some documents we obtained left us in the dark: When we requested access to NESTOR’s Grant Agreement, we were given a heavily redacted document that included 170 fully blacked-out pages in a row.
Our research was recently the main topic of the ZDF Magazin Royale show. You can find the show (in German only) in the ZDF media library:
We did manage to obtain a full list of project deliverables and learned that NESTOR alone produced 88 documents, ranging in length from 10 to 388 pages. But we also found that 40 of them are not meant to be shared with the public, and we were ultimately only able to request access to 10 project documents in total.
This is hardly a way to provide actual transparency. And this structural opacity does not only concern NESTOR, AlgorithmWatch found after more than a year of regular interactions with the EU Commission’s Research Executive Agency (REA), which manages such projects and reviews their deliverables. It applies especially to ethics-related deliverables, precisely the documents to which, in the public interest, the broadest possible access should be provided.
In theory, EU regulation mandates disclosure of requested documents within 15 working days. In practice, lawful exceptions grounded in public security, commercial interests, and even privacy grant the agency “discretionary powers.” The agency might then arbitrarily limit the scope of requests, arbitrarily bundle multiple requests together, or systematically prevent access to information concerning essential aspects of the researched and developed systems. Consequently, informed and independent public scrutiny is impossible.
Examples from the requests we sent to the REA abound. In some projects, such as BORDERUAS, redactions extended even to the Table of Contents. When we tried to look into the contentious (due to, for example, an intended deployment of emotion recognition-based lie detectors) and long-completed iBorderCtrl project, the documents we received disclosed no information on topics such as “ethical risk awareness and monitoring” or even “compliance with legal obligations.” In the case of the ROBORDER project, which developed swarms of unmanned vehicles to patrol EU borders, everything concerning “Full details of demos/operational tests” was blacked out in the documents we received – even after we contested a first batch of extremely redacted documents.
Redactions in the BORDERUAS project’s Table of Contents.
We found extreme redactions to consistently concern:

a) Grant Agreements, b) details of the outcomes of pilots and tests, c) considerations on dual-use items (applications that can be both civilian and military; note that Horizon 2020 and Horizon Europe-funded projects are restricted to civilian applications), d) identities and affiliations of the authors and reviewers of project deliverables, e) detailed costs and funding allocation, f) specific assessments of risks and potential for misuse, g) specific assessments of envisioned mitigation strategies to avoid such risks.
Such consistent redactions concerned documents from both long-completed and still-running projects.
This is not all. Project websites disappeared or were suddenly inaccessible, deliverables that should long have been in the public domain were missing, and the REA constantly asked us to reduce the scope of our access-to-information requests. All this contributed to building a wall behind which the most relevant information on EU border security projects disappeared.
It will likely remain so. The AI Act failed to properly subject harmful and high-risk automated systems in migration to stringent transparency rules and oversight obligations. A full, evidence-based assessment of EU-funded research and innovation projects concerning people on the move is a long way off.
Want to learn more about EU-funded experiments at our borders? Read our long-read article: