Technology is constantly evolving in all walks of life, including law enforcement. CCTV footage, telephone intercepts, DNA profiling and GPS tracking have all become an integral part of the police force’s crime-fighting arsenal.
And starting this October, Japanese giant Hitachi will be trialling technology which it says can predict where crime is likely to occur.
The company claims that the system can pinpoint the likely time and location of crime within a 200-metre radius, and even assign a threat level so that law enforcement knows how many officers to dispatch.
The innovation is called “Hitachi Visualisation Predictive Crime Analytics” (“VPCA”), and police departments including the NYPD are already showing interest. The system is said to work by processing enormous amounts of publicly available data – including social media posts, historical crime statistics, public transit maps and weather reports – at a scale the company says no team of human analysts could match.
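Hitachi has not published how VPCA actually combines these data feeds, but the general idea of fusing several public signals into a threat score for each 200-metre grid square can be sketched as follows. The signal names, weights and thresholds below are all invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical illustration only: Hitachi has not disclosed VPCA's model.
# The sketch fuses public data feeds into a coarse threat level for one
# 200-metre grid cell of a city.

@dataclass
class CellSignals:
    historical_incidents: int  # past crimes recorded in this cell
    flagged_posts: int         # geotagged posts matching watch terms
    transit_footfall: float    # relative passenger volume, 0.0 to 1.0
    bad_weather: bool          # heavy rain tends to suppress street crime

def threat_level(s: CellSignals) -> str:
    """Combine the signals into a threat level (weights are invented)."""
    score = (0.5 * s.historical_incidents
             + 2.0 * s.flagged_posts
             + 3.0 * s.transit_footfall)
    if s.bad_weather:
        score *= 0.7  # dampen the score when conditions deter street crime
    if score >= 10:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

print(threat_level(CellSignals(6, 3, 0.8, bad_weather=False)))  # high
```

A threat level like this could then tell dispatchers how many officers a cell warrants, which matches the company's description of the output, even though the internals shown here are guesswork.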
Like others before it, Hitachi recognises that people who engage in crime often post information on social media in the belief that no one is watching, or that their code words and slang will not be picked up or understood by law enforcement.
The company says that its system has an enormous bank of commonly used terms and codes, and that such information is easily deciphered and decoded by its advanced software.
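The actual term bank and matching logic are proprietary, but the core of such a decoder can be as simple as a lookup table that rewrites known code words into plain language. The slang entries below are invented examples, not terms from Hitachi's system.

```python
import re

# Illustrative only: VPCA's real term bank is not public.
# A minimal decoder maps code words (invented entries) to plain meanings.
SLANG = {
    "strap": "firearm",
    "plug": "supplier",
    "trap": "drug house",
}

def decode(post: str) -> str:
    """Replace known code words in a post with their plain equivalents."""
    pattern = re.compile(r"\b(" + "|".join(SLANG) + r")\b", re.IGNORECASE)
    return pattern.sub(lambda m: SLANG[m.group(0).lower()], post)

print(decode("meeting the plug at the trap tonight"))
# meeting the supplier at the drug house tonight
```

A production system would need far more than substitution – slang shifts quickly and is heavily context-dependent – which is presumably why Hitachi emphasises the size of its term bank and its "advanced software" rather than the lookup itself.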
Hitachi acknowledges that information from social media is vital to the system’s success, improving accuracy by 15%.
Criticisms and Concerns
Unlike the ‘precogs’ in the 2002 film “Minority Report”, the Hitachi system does not claim to be a foolproof mechanism for predicting the precise nature of a future crime.
Which raises the question: how accurate will the system be?
Hitachi says that it will be holding trials in half-a-dozen US cities and releasing the results publicly, so that people can judge for themselves. It also proposes to conduct “blind” trials with police departments – whereby police look at the results only after the trial period has concluded, to determine how accurate the predictions turned out to be.
Concerns have also been raised that the system might falsely implicate innocent people – leading to unjustified arrests – and lead police to unfairly target certain areas or groups based on general patterns of conduct, potentially resulting in over-policing and racial profiling.
But Hitachi has hit back, saying that its technology actually removes bias because it is based upon real data, rather than personal experiences and prejudices.
The results of the trials are eagerly anticipated by law enforcement agencies across the US. And if nothing else, publicity surrounding the new technology should cause potential offenders to think twice about what they post online.