According to the UK civil liberties group Big Brother Watch, facial recognition systems used by UK police are inaccurate, “flagging up” an array of innocent people as suspects.
The high-definition cameras detect all the faces in a crowd, comparing them with existing police photographs from previous arrests and mug shots.
Any potential matches are then flagged for a police officer to investigate further.
The facial recognition cameras have been trialled by the police at events where incidents are likely to occur, such as festivals and football matches.
At London’s Notting Hill Carnival in 2016 and 2017, the system is said to have incorrectly flagged 102 people as potential suspects, none of which led to an arrest.
South Wales Police also revealed to the BBC that its technology had made 2,685 matches between May 2017 and March 2018 – but 2,451 of those flagged as potential “criminals” were the wrong suspects.
Big Brother Watch has called on the government to ensure that the police do not retain the photos of innocent people, and wants police to stop using the technology altogether.
Despite this, police have defended the technology, saying “safeguards are in place”.
Written by Leah Alger