Information Commissioner Elizabeth Denham has launched an investigation into the use of facial recognition around the King’s Cross area of London.
The investigation was sparked by the controversy over property developer Argent’s use of the technology, and it has implications for the wider application of facial recognition in public spaces.
Denham issued a statement saying: “I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector.
“My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.
“Facial recognition technology is a priority area for the ICO and when necessary, we will not hesitate to use our investigative and enforcement powers to protect people’s legal rights.
“We have launched an investigation following concerns reported in the media regarding the use of live facial recognition in the King’s Cross area of central London, which thousands of people pass through every day.
“As well as requiring detailed information from the relevant organisations about how the technology is used, we will also inspect the system and its operation on-site to assess whether or not it complies with data protection law.
“Put simply, any organisations wanting to use facial recognition technology must comply with the law, and they must do so in a fair, transparent and accountable way. They must have documented how and why they believe their use of the technology is legal, proportionate and justified.”
Her announcement comes shortly after Biometrics Commissioner Paul Wiles added his voice to the concerns and called on the Government to update the laws surrounding the technology.
Earlier this week he told the BBC: "There's no point in having facial matching tech unless you are matching it against some kind of database - now what is that database?
"It's a concern whether they have constructed their own database or got it from somewhere else. There is a police database which I very much hope they don't have access to.”
Argent told The Guardian that it was using the technology “in the interest of public safety and to ensure that everyone who visits has the best possible experience”.
Last month Denham added her voice to the warnings about the possible dangers of police forces using facial recognition technology, and said her organisation was investigating how it had been used in trials by South Wales Police.
She pointed to the potential for inherent bias against certain ethnic groups in the algorithms used in facial recognition.