
If worries about these potent technologies aren't taken seriously, there are real risks of the UK slipping into an "all-encompassing" surveillance state, according to outgoing biometrics and surveillance camera commissioner Fraser Sampson. This puts future control over the police's increasingly sophisticated biometric surveillance capabilities in jeopardy.

Sampson stated in an exclusive interview with Computer Weekly that while he anticipated widespread polarisation regarding the police's use of technology, "the kind of isolated approach the government has to this" certainly came as a surprise.

He continues: "I've been shocked at how little the police and local authorities know about the equipment they're using, where it is, what it can do, who they bought it from, why people might be concerned about it, how much it's accomplishing, and how comfortable people are with it doing more things in the future."

Sampson notes that there are a number of ways facial recognition and other AI-driven technologies can help, for example, in catching people who use and contribute to websites dedicated to child sexual abuse material, finding lost or vulnerable people, and locating terror suspects.

Sampson has even stated in the past that the state can fulfil its positive duty to shield its citizens from cruel or degrading treatment by using biometric technology. But Sampson claims the UK police's current approach to this technology may put those advantages out of reach, because of the way it has been used in other, more contentious contexts.

Sampson also dismissed the cliché that "if you've done nothing wrong, you have nothing to worry about", saying: "You've completely missed the point if you actually raise that as a defence."

Sampson was appointed to the dual role in March 2021, and has since been responsible for monitoring how police gather, store, and use a variety of biometric data (including digital facial images), and for promoting their adherence to the surveillance camera code of practice.

Sampson has issued several warnings about the state of police biometrics and surveillance during that time. He has, for instance, raised concerns about the lack of a clear legal framework to control how the police use biometric material and artificial intelligence (AI), as well as the size and scope of public space surveillance.

Sampson has expressed further worries about the police's "culture of retention" around biometric data, as well as the dubious legality of using hyperscale public cloud infrastructure to store and process law enforcement data.

On the data retention issue in particular, Sampson has repeatedly raised the continued and unlawful retention of millions of custody images of people who have never been charged with a crime.

He noted, for instance, that despite the High Court ruling in 2012 that these images must be deleted, the Home Office (which owns most of the biometric databases used by British police) says this cannot be done because the database they are held on has no bulk deletion capability.

Sampson has long criticised the government's data reform proposals, under which the Data Protection and Digital Information (DPDI) Bill would hand biometric oversight to the Investigatory Powers Commissioner and eliminate the requirement to publish a Surveillance Camera Code of Practice. He claims that both of these measures would make his roles redundant.

Although his initial appointment had been for two years, Sampson agreed near the end of 2022 to a short-term reappointment until the DPDI Bill had received royal assent. However, Sampson submitted his resignation in August 2023, citing issues with the bill.

In a letter to Suella Braverman, the then home secretary, Sampson wrote that because of changes to the Parliamentary timetable, the bill was not anticipated to pass until spring 2024 at the earliest, and that he was therefore "unable to find any practical way in which I can continue to discharge the functions of these two roles".

"Trawling the digital ocean"

Sampson has spent a significant amount of the past few years tracking police deployments of both live and retrospective facial-recognition technology.

Regarding the practical benefits of facial-recognition technology, the Met Police's director of intelligence Lindsey Chiswick recently told MPs that "a number of important arrests" had already resulted from its use, including for conspiracy to supply class A drugs, assault on emergency workers, possession with intent to supply class B drugs, grievous bodily harm, and being unlawfully at large after escaping from prison.

"These are just a few of the examples I've provided today, but there are more benefits than simply the number of arrests the technology alerts police officers to make," she said. "One instance of where deterrence was advantageous is the Coronation of the King. You'll have noticed, we made it very clear in advance that we would be there as part of that deterrence effect."

"If I remember my time up in Camden, when I went to see one of the facial-recognition deployments, there was a greater benefit to the community in that area at the time. We received a lot of very good feedback from shop owners and locals because of the effect it was having on crime in that area."

Computer Weekly has previously reported on a number of concerns raised by civil society organisations, lawyers, and politicians about the Met Police's approach to the technology and its deployment.

They emphasised issues with automation bias, the unlawful retention of millions of custody images used to compile facial-recognition watchlists, and the inaccurate description of its live facial-recognition (LFR) deployments as "trials", despite them being used in operational contexts to make arrests.

They also questioned its proportionality and necessity because, despite scanning 144,366 people's biometric information over the course of six deployments in the first half of 2022, just eight arrests were made.

They also emphasised the social power dynamics surrounding the technology, arguing that even if it were 100% accurate every single time, it would still be used against groups such as protesters or ethnic minorities.

Sampson also raises concerns about how British police have approached facial-recognition deployments, noting the weakness of the evidence around its efficacy in solving serious crimes and highlighting the possibility of slipping into "all-encompassing" facial-recognition surveillance.

On the one hand, Sampson says, detractors claim that UK police "never actually seem to catch anyone significant using it, let alone really dangerous or high-harm offenders"; on the other, proponents of policing counter that this is because it has been used so sparingly on relatively few people: "We're not going to have very spectacular results, so we've got to use it more to prove the case."

In response to the Home Office's repeated assertion that LFR is a useful crime prevention tool capable of stopping terrorists, rapists and other violent offenders, Sampson argues that the overt nature of the deployments (police forces are required to publicly state when and where they are using it) means wanted people will simply avoid the area.

It is worth noting that few real-world arrests resulting from police use of the technology support the Home Office's claim that it can stop the most serious crimes, such as rape and murder.

"If you [as the police] are going to advertise where you're using it and between what hours of the day, I'm just not going to go to those places," he says, "unless they want me and I have no idea that I'm wanted, which is a little strange."

"My concern is similar with the ANPR [Automatic Number Plate Recognition] system, in that it is incredibly simple to defeat if you really want to, because I'll clone the plates, use stealth tape, or use some other obscurant tactic. In the end, you have a mass trawling of the digital ocean in the full knowledge that almost everything you find isn't what you're looking for."

The argument then shifts to making the capability more covert in order to avoid this trap: "Then it becomes very sinister… you can't just avoid one town, because it could be looking for you anywhere," he continues. The use case, he says, then rests on that justification.

A chilling effect

Sampson further questions the technology's ability to prevent crime, arguing that authorities rely primarily on its chilling effect rather than on its actual success in locating wanted people. The reasoning behind this, he claims, is that people "might behave" if they are aware that the police have the capability and might use it.

Sampson describes this as "heading into George Orwell territory", adding: "It's truly difficult for the police to find the proof that it can work when used correctly, without having to throw away all the safeguards to prove it, because once they're gone, they're gone."

Sampson asserts that there are significant risks of the technology being misused, even in circumstances where it can be used effectively to, for instance, locate missing children or identify at-large terror suspects in crowds.

You have to wonder whether that intrusion is proportionate if it has a chilling effect where people are extremely uncomfortable
Fraser Sampson, outgoing biometrics and surveillance camera commissioner

"If you've got one missing, vulnerable person, whether it's an infant or an elderly person, you have a relatively small window within which catastrophic events can occur, and you need to find that one face among many," he says.

"Technically speaking, it's quite simple to do that now if you have the ability to turn that on across integrated camera networks within the search area, whether in a town or city. In some circumstances, I believe you would then be explaining to the public why you hadn't turned it on.

"The question is, how do we know you've only turned it on during those times? What are you going to do with it afterwards?"

Sampson further cautions against taking action "just because you can", citing the government's plan to integrate the UK passport database with facial-recognition systems, and once more compares this to "trawling the whole digital ocean in the dragnet" in an attempt to catch someone.

He continues: "You have to wonder whether that is proportionate if that intrusion is felt in a chilling effect where people are really nervous about it."

A fragmented landscape

According to Sampson, the answer to such complexities surrounding facial recognition and other biometric capture technologies is "a very strong, very clear, logical oversight accountability framework, so that when people have questions about this stuff, as they naturally will, they know where to go".

However, the DPDI Bill's changes mean the already uneven regulatory framework for biometrics and surveillance is "being further scattered and broken up… To say we can leave it all to other existing bodies, I've never seen any evidence in support of that. In fact, every piece of evidence I've seen points in the opposite direction."

Sampson claims the government's data reforms will further splinter what is already a fairly fragmented regulatory landscape, noting that his appointment to the dual role was in recognition of the growing convergence between new biometric capabilities and surveillance techniques.

In an independent report Sampson commissioned on the effects of these data reforms on surveillance oversight, academics Pete Fussey and William Webster note that the DPDI Bill, if passed as is, will weaken oversight of the police's intrusive surveillance capabilities.

"The possibilities for integrated surveillance technology, driven by AI and supported by the internet, create genuine public anxieties over political freedoms… In its current form, the bill will delete various surveillance oversight activities and mechanisms that are set out in legislation and arise from the fulfilment of legal duties placed on commissioners," it said.

Noting that surveillance oversight has previously been "overburdened and under-resourced", the report added: "The bill contains no provision for continuing the work of driving up standards for the development, procurement, adoption and use of surveillance cameras, a programme of work frequently applauded across police, practitioner and industry communities."

According to one of the report's interviewees, the bill also makes no provision for the absorption of Sampson's roles by the ICO, and "only deals with extinction".

Sampson is not alone in calling for clear legal frameworks to govern police use of biometrics, with both Parliament and civil society groups making repeated calls for new regulation.

This includes a House of Lords inquiry into police use of advanced algorithmic technologies, an independent legal review by Matthew Ryder QC, the UK's Equality and Human Rights Commission, the former biometrics commissioner for England and Wales, Paul Wiles, and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.

However, the government maintains that there is "currently a comprehensive framework" in place.

New legal frameworks needed

Commenting on the various calls for biometric regulation made by divergent voices, and whether those at the helm of decision-making in this area are heeding such calls, Sampson says: "I don't think that they're actually getting it."

He adds that while senior policing officials and local authorities are "starting to get it", they generally look to central government "for a steer on this, and the government clearly don't get it".

Sampson further adds that "there is no underpinning or rationale to the [government's] direction of travel" with oversight of police technology, and that it has so far been unable to provide evidence attesting to the efficacy of its current frameworks: "Just saying it a number of times in a pre-rehearsed form of words doesn't make it so, and I just don't see the evidential basis for this [approach]."

Giving the example of closed-door meetings he has been involved in regarding the government's overall project to embed facial-recognition capabilities across UK policing, Sampson says there is a "deep tone deafness" among key decision-makers about how to approach the roll-out of new technologies in a law enforcement context.

Just building something and then… waiting for the public to eventually cotton on to why it's good for them… is simply not the right way around
Fraser Sampson, outgoing biometrics and surveillance camera commissioner

He says that while "public trust and confidence is [seen as] a desired outcome", it should instead be an "essential input" that precedes the roll-out of new tech: "You can't hope to build this and run it successfully, particularly not in a jurisdiction that relies on policing by consent, unless you build that trust and confidence first."

He adds: "Just building something and then… waiting for the public to eventually cotton on to why it's good for them – which is the presumption at the heart of it – is simply not the right way around."

On building trust and confidence in policing rather than in technologies, Sampson also warned against "predictive policing" practices and the dangers of using algorithms or AI to make predictions about people's potential for criminal behaviour. He says such approaches rely heavily on assumption and create a baseline level of suspicion around members of the public.

"We failed magnificently as a nation trying to predict whether a 17-year-old was going to pass their A-levels in a couple of subjects using algorithms, so why would we presume to think we're going to be any better at predicting whether that exact teenager is going to be convicted of street robbery in the next five years?" he says.

"And actually, why would we even presume to try when there are so many other areas we could use this technology for, that don't rely on those assumptions, don't have that track record, and don't have that existing level of suspicion?"

After leaving his dual roles on 31 October 2023, Sampson will continue his research into cutting-edge biometric and AI technologies in a variety of security and policing contexts at Sheffield Hallam's Centre of Excellence in Terrorism, Resilience, Intelligence and Organised Crime Research (CENTRIC).
