Limits of the law in fighting the Technopolice
The AI Now Institute has just published a report offering an overview of how biometric systems are regulated around the world. The report's coordinator sums up its findings in this interview.
Selected excerpts on the limits of the law in protecting civil liberties:
What are the major gaps you see in approaches to biometric regulation across the board?
A good example to illustrate that point is: How is the law dealing with this whole issue of bias and accuracy? In the last few years we’ve seen so much foundational research from people like Joy Buolamwini, Timnit Gebru, and Deb Raji that existentially challenges: Do these systems work? Who do they work against? And even when they pass these so-called accuracy tests, how do they actually perform in a real-life context?
Data privacy doesn’t concern itself with these types of issues. So what we’ve seen now—and this is mostly legislative efforts in the US—is bills that mandate accuracy and nondiscrimination audits for facial-recognition systems. Some of them say: We’re pausing facial-recognition use, but one condition for lifting this moratorium is that you will pass this accuracy and nondiscrimination test. And the tests that they often refer to are technical standards tests like NIST’s face-recognition vendor test.
But as I argue in that first chapter, these tests are evolving; they have been proven to underperform in real-life contexts; and most importantly, they are limited in their ability to address the broader discriminatory impact of these systems when they’re applied to practice. So I’m really worried in some ways about these technical standards becoming a kind of checkbox that needs to be ticked, and that then ignores or obfuscates the other forms of harms that these technologies have when they’re applied.
How did this compendium change the way you think about biometric regulation?
The most important thing it did for me is to not think of regulation just as a tool that will help limit these systems. It can be a tool to push back against these systems, but equally it can be a tool to normalize or legitimize them. It’s only when we look at examples like the ones in India or Australia that we start to see law as a multifaceted instrument, which can be used in different ways. At a moment when we’re really pushing to ask “Do these technologies need to exist at all?”, the law, and especially weak regulation, can really be weaponized. That was a good reminder for me. We need to be careful about that.
This conversation has definitely been revelatory for me, because as someone who covers the way that tech is weaponized, I’m often asked, “What’s the solution?” and I always say, “Regulation.” But now you’re saying, “Regulation can be weaponized too.”
That’s so true! This makes me think of these groups that used to work on domestic violence in India. And I remember they said that at the end of decades of fighting for the rights of survivors of domestic violence, the government finally said, “Okay, we’ve passed this law.” But after that, nothing changed. I remember thinking even then, we sometimes glorify the idea of passing laws, but what happens after that?