A parliamentary committee is currently reviewing a bill, introduced into federal parliament in February, that would establish a national facial recognition system. It would give federal and state police access to a hub linking all identification photo databases, whose images could be matched against CCTV footage.
The interoperability hub will be run by Dutton’s home affairs super-ministry, which recently boasted that the algorithms used to train its facial recognition systems are superior because of the wide range of datasets available in Australia’s multicultural society.
However, a report released last week found that UK police use of facial recognition technology is deeply flawed. Produced by civil liberties group Big Brother Watch, the Face Off report also found that the technology disproportionately misidentifies minority ethnic groups and women.
Big Brother Watch sent freedom of information requests to all UK police forces. The responses revealed that police use of automated facial recognition technology misidentified innocent people an average of 95 percent of the time.
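Figures like this largely reflect the base-rate problem: when genuine watchlist targets are a tiny fraction of the crowd being scanned, even a system with a low per-face error rate produces mostly false alerts. A rough illustrative calculation (all numbers here are assumptions for the sketch, not figures from the Big Brother Watch report):

```python
# Hypothetical base-rate arithmetic: why most facial recognition
# alerts can point at innocent people even if the system seems accurate.

crowd_size = 100_000          # faces scanned at an event (assumed)
targets_present = 20          # people actually on the watchlist (assumed)
true_positive_rate = 0.90     # chance a real target is correctly flagged (assumed)
false_positive_rate = 0.005   # chance an innocent face is wrongly flagged (assumed)

true_alerts = targets_present * true_positive_rate
false_alerts = (crowd_size - targets_present) * false_positive_rate

share_false = false_alerts / (true_alerts + false_alerts)
print(f"{share_false:.0%} of alerts point at innocent people")  # roughly 97%
```

Even with an assumed 0.5 percent false-flag rate per face, almost every alert is a misidentification, simply because innocent faces vastly outnumber targets.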
The technology has so far been trialled at shopping centres, sporting events, festivals and community events. And it has the potential to turn all CCTV cameras into identity checkpoints.
A flagrant breach of civil liberties
“In the UK, we have already seen how real-time facial recognition was shamefully deployed at a peaceful demonstration and used to identify individuals with mental health issues at a public event,” said Big Brother Watch lead researcher Jennifer Krueckeberg.
“This shows that not only criminals but people who are perceived as troublemakers can easily be targeted, generating a severe chilling effect on free speech.”
The main images police in Britain are accessing at present are stored in the Police National Database. These images are custody shots of individuals following their arrest.
But, the photos of innocent people who never end up being charged or convicted of crimes are being retained in this database unlawfully, along with those of convicted criminals.
Ms Krueckeberg further stressed that the “covert ID checks are fundamentally incompatible with civil liberties.”
The report outlines that the use of live automated facial recognition cameras in the UK is undermining the right to privacy, the right to freedom of expression and the right to freedom from discrimination, which are all protected under the UK Human Rights Act 1998.
But, as facial recognition technology makes inroads into the Australian scene, there’s no federal bill of rights to protect against incursions upon basic freedoms. And as the actions of the state Coalition government have recently shown, there are no protections at the state level in NSW either.
All pervasive checkpoints
There are two types of facial recognition technology. Facial matching involves the matching of still images against photos in a database, while automated facial recognition uses cameras that can scan crowds and public places, matching faces against a database in real-time.
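The distinction can be sketched in code. In broad terms, both approaches compare a face "embedding" (a numeric vector produced by a neural network) against enrolled photos; facial matching is a one-to-one check, while automated recognition is a one-to-many search. The vectors, names and threshold below are toy assumptions for illustration, not drawn from any real system:

```python
import math

def cosine(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

THRESHOLD = 0.8  # assumed decision threshold

# Toy enrolled database of face embeddings.
database = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.2, 0.8, 0.5],
}

def verify(probe, claimed_id):
    """Facial matching: one-to-one check of a still image against one record."""
    return cosine(probe, database[claimed_id]) >= THRESHOLD

def identify(probe):
    """Automated recognition: one-to-many search across the whole database,
    as a live camera would run for every face in a crowd."""
    best_id = max(database, key=lambda k: cosine(probe, database[k]))
    return best_id if cosine(probe, database[best_id]) >= THRESHOLD else None
```

The civil-liberties stakes differ accordingly: verification answers "is this person who they claim to be?", while identification runs the one-to-many search continuously against everyone a camera sees.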
In October last year, when the Council of Australian Governments (COAG) agreed to the establishment of the National Facial Biometric Matching Capability, prime minister Turnbull made assurances that the system would not be used on live CCTV footage.
The Identity-Matching Services Bill 2018 only provides for the matching of still images taken from CCTV footage with those in the national database. But, once such a framework is in place, it’s easy to imagine that amendments could be made to allow the system to be upgraded to real-time scanning.
And while the primary database in the UK only contains images of citizens who have been arrested, the federated system in this country will contain all state and territory drivers licence photos, along with passport, visa and citizenship images.
“Creating a biometric database of law-abiding citizens for the police is a dangerous intrusion into their privacy and makes a mockery of the principle of innocent until proven guilty,” Ms Krueckeberg told Sydney Criminal Lawyers®.
“With such an all-encompassing database, authoritarian surveillance tools like real-time facial recognition could easily be rolled out across the national CCTV network in the very near future.”
It’s already a reality elsewhere
China has a network of 170 million CCTV cameras. In March this year, cameras in sixteen parts of the country were upgraded with automatic facial recognition technology. Known as Skynet, this system can scan faces and compare them with a database at a rate of three billion per second.
Police in certain parts of China have also been issued with facial recognition smart glasses since earlier this year. This enables officers to check the identity of individuals on the spot, independently from security cameras.
And in the far western province of Xinjiang – one of the most heavily policed regions in the world – authorities have been using automated facial recognition technology as part of a highly pervasive data surveillance system that collates information from a range of technologies.
Since April last year, it’s estimated that up to one million of the majority-Muslim Uyghur population in the remote region have been or are being detained in political re-education camps by the Chinese government.
And the way in which the Australian government says facial recognition technology will be utilised in this country keeps morphing.
Counterterrorism was the initial justification for the system. But, the COAG agreement of last October stated the technology would be used in the investigation of offences punishable by three or more years’ imprisonment.
But this restriction is not included in the legislation, which has led critics to warn that it could be used in the prosecution of minor crimes, such as littering or jaywalking.
The Australian facial recognition system will allow government agencies and private sector organisations to use the identity-matching services to verify a known or claimed identity.
And the legislation also provides the home affairs minister with the power to expand the types of identification information used in the system, as well as add new services.
Live automation creeping in
And while the bill’s explanatory notes only mention that images taken from CCTV footage will be matched with identification photos in national databases, there have been some peculiar goings-on with the use of live cameras in the capital.
Facial recognition technology is being trialled at Canberra airport, allowing passengers to walk directly through the terminal after a flight without producing their passports. And this system sounds suspiciously similar to the live automated system being utilised by police in the UK.
Minister Dutton told the National Press Club in February that he believes the large-scale use of this airport facial recognition system is just around the corner. “We’re probably maybe a generation – a technology generation – off it. So, a couple of years,” he said. “It’s very close indeed.”
Paul Gregoire is a Sydney-based journalist and writer. He has a focus on human rights issues, encroachments on civil liberties, drug law reform, gender diversity and First Nations rights. Prior to Sydney Criminal Lawyers®, he wrote for VICE and was the news editor at Sydney’s City Hub.