Critics have warned facial recognition technology risks undermining human rights

The experiment was conducted discreetly. Between 2016 and 2018, two surveillance cameras were installed in the Kings Cross area of London to analyse and track passers-by using facial recognition technology.

The deployment of the cutting-edge technology at one of the British capital's busiest transport hubs, first revealed in the Financial Times, has fuelled controversy in Britain, where its use is not yet governed by a clear legal framework.

The company in charge of the project argued it had acted "only to help the ... prevent and detect crimes in the area", and that the technology had no commercial use.

But data watchdog the Information Commissioner's Office (ICO) seized on the case.

It opened an investigation and expressed concern about the increasing use of facial recognition technology, which allows faces captured on camera to be compared with images stored in databases.

The Kings Cross case is not isolated, with shopping centres in Manchester and Sheffield, and a museum in Liverpool, also reportedly trialling the technology.

Privacy group Big Brother Watch has denounced the trend as an "epidemic", with other critics warning it risks undermining human rights.

Police surveillance?

Ed Bridges, 36, has sued Welsh police for targeting him with this technology while he was Christmas shopping in 2017 and at a protest in 2018, with his case working its way through the High Court in Cardiff.

This is the first time that such action has been brought in British courts.

Although the police use was authorised and advertised, the Cardiff University employee told AFP it felt like "being robbed".

"This is not something I had consented to," Bridges said.

"People have a reasonable expectation of privacy and the state should be supporting the right... not undermining it."

Megan Goulding, his lawyer from the human rights organisation Liberty, said facial recognition creates concerns about "self-censorship and free expression".

"We think it's wrong that people are made to change how they live to try to protect themselves from unwarranted police surveillance," she added.

However, according to a survey commissioned by ICO earlier this year, facial recognition enjoys widespread public support, with more than 80 percent of respondents apparently backing its use by the police.

'Slow down'

When using facial recognition, police utilise a "watch list" of wanted suspects.

The cameras translate the facial features of passers-by into a digital version, which is compared with the data on the list.

If the "similarity score" is high enough, police perform a check.
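The matching step described above can be sketched in a few lines. This is a simplified illustration, not any police system's actual code: it assumes each face has already been reduced to a plain numeric vector (an "embedding"), uses cosine similarity as the similarity score, and treats the threshold as an arbitrary tunable value.

```python
import math

def similarity_score(a, b):
    """Cosine similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(probe, watchlist, threshold=0.8):
    """Return the watch-list entries whose score meets the threshold.

    probe: embedding of a face captured by the camera.
    watchlist: dict mapping an identifier to a stored embedding.
    """
    return [name for name, stored in watchlist.items()
            if similarity_score(probe, stored) >= threshold]
```

Raising the threshold trades missed matches for fewer false alerts, which is why a flagged face still triggers a manual check by an officer rather than an automatic action.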

In Bridges' case, the Cardiff court dismissed his complaint, finding the watchlist was "clearly targeted" on people "suspected of involvement in crimes".

He is appealing against the decision, and UK Information Commissioner Elizabeth Denham has warned that the ruling should not be taken to justify indiscriminate use of the technology.

She has called on the authorities to "slow down" its development pending the rollout of a clear framework for its use.

London is a potential hotspot for the deployment of facial recognition, with its 420,000 surveillance cameras, according to a 2017 study by the Brookings Institution think-tank.

That places the British capital just behind the 470,000 cameras found in China's capital, Beijing.

London's Metropolitan Police has carried out a number of trials of the technology.

"Anyone can refuse to be scanned," the force states in its guidance posted online.

"It's not an offence or considered 'obstruction' to actively avoid being scanned."

But Daragh Murray, a human rights specialist at the University of Essex, is sceptical.

"People who refused consent, who covered their face to (avoid) the camera's system, their behaviour was treated as suspicious and they were engaged by the police," he told AFP.

Murray called the technology "a fundamental shift in the balance of power between the state and the citizen".

Echoing ICO, he is pressing for a legal framework, including an oversight agency.

"We have seen the first generation of facial recognition, where you're using it to identify people who you already know," Murray said.

"The next generation, you'll be able to use it to identify people who you don't know."