A Cardiff resident who lost a High Court challenge over police deployment of automated facial recognition technology has begun the appeal process, his solicitor revealed this week.
In Bridges v CCSWP and SSHD, the High Court ruled in September that South Wales Police’s use of the technology was lawful, rejecting a challenge brought by Ed Bridges, who believed his face was scanned by South Wales Police at a protest and while he was doing his Christmas shopping.
Bridges was represented by civil liberties group Liberty, which says thousands of people have been scanned without their knowledge or consent at high-profile events such as the 2017 Champions League final and Ed Sheeran concerts.
Bridges’s solicitor, Megan Goulding, told the Public Law Project’s annual conference that an application had been submitted to the Court of Appeal.
Goulding told a packed breakout session on automated technology that the judicial review had faced evidential difficulties because the police force immediately deletes data if no match is detected against a watchlist of suspects. ‘All we could prove is that the client had been in the range of cameras. The police’s answer was “you still cannot prove the camera was pointed towards the client, and the image was captured and processed”. On that logic, not many people could challenge facial recognition because their data would be deleted too soon.’
Another problem was the lack of data on the total number of faces scanned at every deployment. ‘All we could do was estimate…that half a million faces had been scanned’. Liberty asked the court to be ‘bold’, Goulding said, and take into account the negative, as well as positive, impact of the technology on the community.
Another evidentiary challenge was accessing information held by the manufacturer of the facial recognition system. Goulding said: ‘In terms of what we were arguing for our public sector equality duty challenge, the police should have investigated for the potential of indirect discrimination before using this technology. We were also saying to do that assessment, it is necessary for you to get access to the training dataset.’
The use of automated decision tools could be limited, Goulding told the conference, ‘if you can establish it’s a breach of the public sector equality duty for a public authority not to get information or take steps to get information from its private manufacturer that will show whether the system will discriminate.’