Fingerprints, retinal scans and blood samples – the digital age has extended our perceived identity into data that can be analysed and stored. This data can be a powerful tool for government authorities, who must formulate pre-emptive and reactive measures to protect citizens. While it is always reassuring to see someone protected by a successful measure taken by a legal authority, we have to ask: at what cost?

In December 2017, Ed Bridges “popped out of the office to do a bit of Christmas shopping”. That was the first occasion on which his face was captured by the Automated Facial Recognition (AFR) software being trialled by South Wales Police (SWP). The next time was at a peaceful anti-arms protest, in the middle of a large crowd. This technology is far more sophisticated than an ordinary surveillance camera, which simply allows the police to monitor public areas. It essentially creates a biometric map of an individual from the camera feed: the face of each person passing by is converted into a numerical code and stored as a unique identifier that the individual has no access to.

The police claim that the technology scans multiple faces in crowds and compares the data against their ‘watchlists’ to find a match and thus track criminals. Liberty, the human rights group supporting Bridges, argues that there is no regulation of what determines these ‘watchlists’ or of when the SWP can deploy AFR. It also argues that there is no accountability for the storage of all this private data, which has been taken without consent.
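For readers wondering what “a numerical code of the face” means in practice, the sketch below is a purely illustrative example of how faces can be reduced to numbers and compared against a watchlist. It uses the open-source Python face_recognition library; it is not the system used by South Wales Police, and the image filenames are hypothetical.

# Illustrative sketch only: reducing faces to numerical codes and
# comparing them against a "watchlist" with the open-source
# face_recognition library (not the SWP's own system).
import face_recognition

# Build the "watchlist": one 128-number encoding per known face.
watchlist_images = ["suspect_1.jpg", "suspect_2.jpg"]  # hypothetical files
watchlist_encodings = [
    face_recognition.face_encodings(face_recognition.load_image_file(path))[0]
    for path in watchlist_images
]

# A frame captured by a camera in a public place (hypothetical file).
frame = face_recognition.load_image_file("street_frame.jpg")

# Every face found in the frame is converted to the same kind of code...
for encoding in face_recognition.face_encodings(frame):
    # ...and compared against the whole watchlist in one step.
    matches = face_recognition.compare_faces(watchlist_encodings, encoding, tolerance=0.6)
    if any(matches):
        print("Possible watchlist match found in frame")

Each encoding here is simply 128 numbers, and a “match” is nothing more than the distance between two such codes falling below a threshold – which is also why such systems can be wrong.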

The two opposing arguments were considered in the appeal brought by Mr Bridges against South Wales Police, decided by the Court of Appeal on 11th August 2020. On the first ground of appeal – whether the interference with Article 8 of the European Convention on Human Rights was ‘in accordance with the law’ – the court held that “AFR is a novel technology” and that the Divisional Court had overlooked the deficiencies in the current legislative framework: too often the questions of “who can be placed on the watchlist” and “when AFR can be deployed” are left to the discretion of individual police officers. Even Gerry Facenna, counsel for Britain’s Information Commissioner, conceded that a proper legal framework for AFR needed to be drawn up and that the current rules are “all a bit ad hoc”. A lacuna in the legislative framework has therefore been recognised, and this may lead to more comprehensive legislation.

There remains a question of how urgent this issue really is if the authorities have, essentially, just taken a picture. Beyond the obvious interference with the right to a private life, the problem with this technology is that it builds a highly specific biometric map of your face. That specificity makes it analogous to a blood sample or a retinal scan; this information is closely bound up with our sense of autonomy and privacy and should remain within our control. With enough AFR cameras installed, this software would allow police authorities to track your whereabouts completely, without you even noticing. Unlike the location services you can switch off on your phone, this system would keep tracking you with or without your permission. This demonstrates an urgent need for legislation that ensures we can still retain our privacy and dignity.

This may seem like just another instance of courts balancing security interests against privacy, but I believe this technology has exposed a gaping hole in the data privacy legislative framework. We have legislation protecting our personal data and unique identifiers. Simultaneously, we permit the surveillance of our activities in public areas in the interests of security. This technology sits on the cusp of personal data and general surveillance, making it very hard for the law to pin down and regulate. The gap also shows how the law develops after technology, when the two should develop in parallel: there was roughly a ten-year lag between the creation of YouTube and a coherent legislative framework like the GDPR. As technology advances, we lose track of the vast amounts of data being stored about us, and that ignorance prevents us from protecting our data in the future.

The Bridges case has brought the age-old fight between security and privacy to the forefront, and it will shape the future of privacy in the UK. The Data Protection Act 2018 and the Convention provide some hope of finding a balance. But finding the right balance is crucial to avoid further exploitation through the collection of personal data; the future is bright, until we stop paying attention.

(Image credit: https://commons.wikimedia.org/wiki/File:Face_Recognition_3252983.png)

