Facial Recognition Technology: Turning Innocent People Into Suspects

by Zeb Holmes & Ugur Nedim
Information on this page was reviewed by a specialist defence lawyer before being published.

The South Wales police are standing by their facial recognition software despite the fact that its use during last year’s Champions League Final yielded a 92% rate of false positives.

Data published on the force’s website suggests that of the 170,000 people who arrived in Cardiff for the match between Real Madrid and Juventus, 2,470 were identified as potential criminals. However, the force has since conceded that 92%, or 2,297, of these supposed identifications were wrong.
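For readers who want to check the arithmetic, the short Python sketch below (purely illustrative, and not part of any police reporting) recomputes the false positive rate from the figures quoted above.

# Illustrative only: recompute the false positive rate from the reported Cardiff figures.
total_alerts = 2470    # people flagged by the system as potential criminals
false_alerts = 2297    # flags the force later conceded were wrong
rate = false_alerts / total_alerts
print(f"False positive rate: {rate:.0%}")  # prints roughly 93%; widely reported as 92%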

Police response

South Wales police point out that “no facial recognition system is 100% accurate”, and that a number of arrests have been made as a result of the program.

“Over 2,000 positive matches have been made using our ‘identify’ facial recognition technology, with over 450 arrests,” a spokesperson said, without mentioning the impact on those who were misidentified.

The police force blamed the high rate of false positives at the football final on “poor quality images” supplied by agencies including UEFA and Interpol.

Similar incidents

The South Wales police have had similar issues at other high-profile events.

Their figures revealed that 46 people were wrongly identified at an Anthony Joshua boxing match, and 42 false positives were registered at a rugby match between Wales and Australia in November.

A recent London police trial of facial recognition technology at a Six Nations rugby match had an even worse hit-rate, generating 104 “alerts”, of which 102 were false.

Criticism of inaccuracy

Civil liberties campaign group Big Brother Watch has been a vocal critic of the technology since its inception.

“Not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool”, the group says.

The group highlights that the technology consumes a great deal of police time and resources, as officers are required to sift through large numbers of flagged images in search of a genuine match.

It points out that facial scanners misidentified more people at one event than were correctly spotted in nine months of use.

The president-elect of the Law Council of Australia, Arthur Moses, says:

“If you have technology that is not up to scratch and it is bringing back high returns of false positives then you really need to go back to the drawing board”.

Civil liberties

Big Brother Watch has also criticised the technology for curtailing freedoms of assembly and free speech.

“In the UK, we have already seen how real-time facial recognition was shamefully deployed at a peaceful demonstration and used to identify individuals with mental health issues at a public event,” said Big Brother Watch lead researcher Jennifer Krueckeberg. “This shows that not only criminals but people who are perceived as troublemakers [to government power] can easily be targeted.”

She believes a further risk is the facilitation of “politically motivated surveillance”, whereby citizens are prevented from protesting against the state and its agencies.

Such surveillance is already prevalent in China, and even in the US, where the notorious COINTELPRO and similar programs have been used since the 1950s to spy on non-violent protest groups.

It was also recently revealed that the FBI tracked Black Lives Matter protesters for no other reason than their involvement in anti-racism protests.

Racial bias

In a submission to the Parliamentary Joint Committee on Intelligence and Security, the Human Rights Law Centre stated that facial recognition is more likely to produce both false positive and false negative results when matching members of ethnic minorities in Australia.

It cited studies which found that facial recognition has “a bias towards the dominant ethnic group in the area in which it is developed”, with the rate of false positives, and the resulting police interactions, increasing for racial minorities.

Closer to home

A parliamentary committee is currently reviewing submissions on the Identity-matching Services Bill 2018.

This Bill, introduced into Federal Parliament in February, would establish a national facial recognition system by giving police access to a hub linking all identification photo databases, the images from which could be matched against CCTV footage.

While the UK database only contains images of citizens who have been arrested, the Australian scheme would include all state and territory driver licence photos, along with passport, visa and citizenship images.

These images could then be matched against live images of faces in crowds and public places.

Such technology has the potential to greatly increase the level of surveillance in Australia, a country that already has the most pervasive metadata retention laws in the developed world.


Authors

Zeb Holmes

Zeb Holmes is a lawyer with a passion for social justice who advocates criminal law reform, and a member of the content team at Sydney Criminal Lawyers®.
Ugur Nedim

Ugur Nedim is an Accredited Criminal Law Specialist with 25 years of experience as a Criminal Defence Lawyer. He is the Principal of Sydney Criminal Lawyers®.
