There is little concern here over personal privacy, only whether the technology is accurate enough to minimize wrongful arrests. The focus is on police use of the technology to contain and predict criminal activity, thus promoting a police state. ⁃ TN Editor
In April 2021, the European Commission (EC) released its much-awaited Artificial Intelligence Act, a comprehensive regulatory proposal that classifies AI applications under distinct categories of risk. Among the identified high-risk applications, remote biometric systems, which include facial recognition technology (FRT), were singled out as particularly concerning: their deployment, specifically in law enforcement, may lead to human rights abuses in the absence of robust governance mechanisms.
Law enforcement and facial recognition technology
Across jurisdictions, policymakers are increasingly aware of both the opportunities and risks associated with law enforcement's use of FRT. Here, facial recognition refers to the (possible) identification of a person by comparing a probe image (a photo, or a still from video footage, of a suspect or person of interest) against facial images of criminals and missing persons stored in one or more reference databases, in order to advance a police investigation.