Artificial Intelligence & Machine Learning , Governance & Risk Management , Next-Generation Technologies & Secure Development

London Police Lack Records of Facial Recognition Tests

Metropolitan Police Service Shared Seven Facial Recognition Images With Private Developer

London's Metropolitan Police Service now says it shared seven images with a private developer during tests of facial recognition technology for crime prevention in the city's King's Cross section. There are, however, no records of the outcome of those pilot program tests, authorities write in a letter to Mayor Sadiq Khan.


There is also no record of whether the facial recognition technology used within the King's Cross section of London produced a match against the seven images that were provided, the letter notes. Nor is it clear whether any official action was taken based on the results, although that appears unlikely, according to the letter.

This new update comes more than a month after Khan's office announced that the city's police service had shared some images with the private developer of a 67-acre section of King's Cross. Since the story first broke, the use of facial recognition technology in this part of London has stirred an outcry over personal privacy in public spaces (see: Facial Recognition Use in UK Continues to Stir Controversy).

While much of the King's Cross development is privately owned, most of it is open and accessible to the public. All use of facial recognition technology within the development has now been discontinued, authorities say.

Argent, the developer behind the King's Cross Central Limited Partnership, had been using facial recognition technology in two cameras that were part of the neighborhood's CCTV system to scan pedestrians near the King's Cross railway station as part of a crime prevention effort, according to reports in the Guardian.

As more information about the use of facial recognition technology in King's Cross became available, the Metropolitan Police Service acknowledged that it had developed a partnership with Argent that included help with test programs for the technology. That relationship ended in March 2018, however, when the pilot program stopped, according to the BBC.

Use of Images

In the letter to the mayor, the Metropolitan Police Service says that the department's Camden Borough provided King's Cross Estate Services with files containing images and reference numbers of people who had either been arrested by police and charged with a crime, or issued a formal warning.

The letter adds that the names and other personal information of these people were not shared. The images were taken from three police databases: The Central Registration and Identification Scheme, the Police National Computer and the Custody Imaging System.

"On each occasion, the images were shared to prevent crime, to protect vulnerable members of the community or to support the safety strategy," according to the letter.

The Agreement

The letter from the Metropolitan Police Service also shows that King's Cross Estate Services was using facial recognition technology developed by Japanese firm NEC. Previously, the developer had declined to say which company developed the facial recognition system used in King's Cross, according to the BBC.

The BBC also reported that police in other parts of the U.K. have used NEC's facial recognition technology.

Under the agreement between the police and the King's Cross developer, the facial recognition system was based on an "independent server," which could be accessed by only one authorized individual. The server was located in the CCTV control room at the King's Cross Estate.

In the letter to the mayor, the Metropolitan Police Service notes that its agreement with the developer ran from May 2016 to March 2018, and that a new agreement was put into place in January of this year, although no images have been shared under it.

Sophie Linden, London's deputy mayor for policing and crime, added in her own letter to the Metropolitan Police Service that the department should not enter into local agreements involving the use of facial recognition technology.

"Going forward, the MPS has written to all BCU commanders making it clear that there should be no local agreements on the use of live facial technology," Linden says.

Need for Regulation

Following the first reports of this use of facial recognition technology in King's Cross, the U.K. Information Commissioner's Office - Britain's chief privacy watchdog - launched an investigation into the developer's use of facial recognition technology. The ICO has the authority to levy fines for privacy violations under the European Union's General Data Protection Regulation.

The BBC previously reported that Britain's surveillance camera commissioner, who oversees the use of surveillance cameras within the country, is also investigating the use of facial recognition technology within King's Cross.

Outside the U.K., the use of facial recognition technology by private firms and government agencies is stirring controversy.

For instance, in India, privacy advocates have raised concerns about the increasing use of facial recognition technology at airports to verify and authenticate passengers (see: Facial Recognition: Balancing Security vs. Privacy).

At a recent event in India, Kay Firth-Butterfield, who studies artificial intelligence for the World Economic Forum, said that governments need to carefully weigh privacy issues as they consider using facial recognition technology to beef up security, according to a report by CNBC.

"When does use (of facial recognition technology) by the government amount to security compared to the invasion of our civil liberties?" she asks.


About the Author

Apurva Venkat


Special Correspondent

Venkat is a special correspondent for Information Security Media Group's global news desk. She previously worked at companies such as IDG and Business Standard, where she reported on developments in technology, business, startups, fintech, e-commerce, cybersecurity, civic news and education.



