This Wednesday, the “data ethics commission” of the German government released a 240-page report (PDF, English version here). It contains 75 concrete recommendations regarding the digitization of society, many of which have to do with algorithmic decision-making.
The 16-strong commission, which included 9 men, started work in 2018. It was composed mostly of scholars, along with data protection officers and a few industry representatives. While its work addressed several aspects of digitization, including a recommendation to abandon any plan to treat personal data as property that could be bought and sold, the third part of the report was devoted entirely to algorithmic systems.
On this point, the central recommendation of the commission is to apply different regulations to autonomous systems based on a 5-point scale:
- Systems with low potential harm such as drink dispensers should not be regulated.
- Systems with some potential harm such as dynamic pricing in e-commerce should be lightly regulated and post-hoc controls should be set up.
- Systems with regular or obvious potential harm such as personalized pricing should undergo an approval procedure combined with regular controls.
- Systems with considerable potential harm, such as those run by companies holding quasi-monopolies in credit scoring, should publish the details of their algorithms, including the factors used in the calculations and their weights, the data processed and an explanation of their inner logic. Controls should be possible via a real-time interface.
- Systems with unwarranted potential harm such as autonomous weapons should be “fully or partially” forbidden.
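To make the scale concrete, the five tiers above can be encoded as a simple lookup. This is an illustrative sketch only: the level names, example systems and response strings are paraphrased from the bullets, not taken from the report itself.

```python
from enum import IntEnum

class HarmLevel(IntEnum):
    """Hypothetical encoding of the commission's five-level scale."""
    LOW = 1           # e.g. drink dispensers
    SOME = 2          # e.g. dynamic pricing in e-commerce
    REGULAR = 3       # e.g. personalized pricing
    CONSIDERABLE = 4  # e.g. quasi-monopoly credit scoring
    UNWARRANTED = 5   # e.g. autonomous weapons

# Regulatory response sketched in the report, per level
RESPONSE = {
    HarmLevel.LOW: "no regulation",
    HarmLevel.SOME: "light regulation with post-hoc controls",
    HarmLevel.REGULAR: "approval procedure with regular controls",
    HarmLevel.CONSIDERABLE: (
        "publication of factors, weights, data and inner logic; "
        "real-time control interface"
    ),
    HarmLevel.UNWARRANTED: "full or partial ban",
}

def required_response(level: HarmLevel) -> str:
    """Return the regulatory response for a given harm level."""
    return RESPONSE[level]

print(required_response(HarmLevel.CONSIDERABLE))
```

Note that a single platform could span several levels at once, which is exactly the difficulty raised for media intermediaries below.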
Licenses for social networks?
Media intermediaries such as Facebook and YouTube (Google) could fall anywhere from category 1 to 4 depending on which part of their platforms is considered, the commission wrote. One option would be to require licenses, as in the broadcast industry, where TV and radio channels must obtain and regularly renew the approval of a central authority.
Algorithmic systems could be graded according to an “overarching model” to be developed by lawmakers, and displaying the category a system belongs to could be made mandatory. Whether this measure would look like the energy consumption scale affixed to fridges and light bulbs remains unclear.
The commissioners encouraged the government to provide more funding to current oversight bodies and to support self-regulation initiatives, in particular by publishing an “Algorithmic Accountability Codex” to be drafted by a new commission. Companies might be required to appoint a person in charge of algorithmic accountability, following the model of the data protection officers introduced by the GDPR.
Largely positive recommendations
AlgorithmWatch welcomes some of the commission’s recommendations, especially that socially important automated systems be made accessible to journalists and researchers. AlgorithmWatch’s OpenSchufa project, an investigation into Germany’s largest credit score provider conducted in partnership with the Open Knowledge Foundation, was a first step in this direction. Companies like Schufa should be more tightly regulated, as the commissioners rightly argue.
However, many aspects of the commission’s report remain vague. Exactly how an “overarching model” to sort algorithms by their potential harm should be developed will raise new questions. Existing systems number in the thousands, and several of them might fall between categories. A loophole could also open up: systems with unwarranted potential harm might be allowed to go forward if they are forbidden only in part, with rubber-stamping humans added as tokens of non-automation to legitimize the rest.
Considering that the new president of the European Commission, Ursula von der Leyen, has announced that she would propose EU-wide regulations before next March, and given her former role as a member of the German government, the data ethics commission’s recommendations can be expected to find their way to the European level in the near future.
Photo: BMI (edited)
Read more on our policy & advocacy work on ADM in the public sector.