Report of the European Union Agency for Fundamental Rights
Facial recognition technology (FRT) makes it possible to compare digital facial images to determine whether they show the same person. Comparing footage obtained from video surveillance cameras (CCTV) with images in databases is referred to as 'live facial recognition technology'. Few national law enforcement authorities in the EU currently use such technology, but several are testing its potential. This paper therefore examines the fundamental rights implications of relying on live FRT, focusing on its use for law enforcement and border-management purposes.

EU law recognises people's facial images, a form of biometric data, as 'sensitive data'. Yet such images are also quite easy to capture in public places. Although the accuracy of matches is improving, the risk of errors remains real, particularly for certain minority groups. Moreover, people whose images are captured and processed might not know this is happening, and so cannot challenge possible misuses. The paper outlines and analyses these and other fundamental rights challenges that arise when public authorities deploy live FRT for law enforcement purposes. It also briefly presents steps that can help avoid rights violations.