Guilty until proven innocent: shoppers falsely identified by facial recognition system struggle to clear their names

When Ian Clayton, a retired health and safety professional from Chester, popped into Home Bargains one February lunchtime, he was suddenly approached by a stern-looking member of staff. “Excuse me, can you please put everything down and leave the shop now?” she said. Clayton recalled how he was stunned, and it was only as he was briskly walked past the tills towards the exit that he stopped to ask what he had done. “You’ve come up on our system called Facewatch as a shoplifter,” came the reply. “There’s a poster in the window.”

With that, he was left outside the shop alone, with a QR code to scan and no idea what had happened.

He is one of a number of people who have spoken to the Guardian after being falsely identified as a thief by shops using Facewatch, a live facial recognition system being rolled out across the UK to clamp down on retail crime. The company’s website claims that its system has a 99.98% accuracy rate and that last month it sent 50,288 alerts of “known offenders” to shops including B&M, Home Bargains, Sports Direct, Farm Foods and Spar, which all now use the software. But those who have been wrongly identified and forced to leave shops, either via the technology itself or human error, say they were given no support, and did not know how to complain about their treatment or prove their innocence.

Clayton, 67, said that after he was ejected from Home Bargains he tried calling a phone number on a Facewatch poster, and was sent through to a message saying the company did not take calls and he had to send an email instead. He was only able to get answers after submitting a subject access request – a formal request under data protection laws for personal information – that revealed he had been incorrectly associated with a shoplifting incident on a previous visit to the shop.

“It was like I was guilty until proven innocent. It’s an awful feeling. It leaves a pit in your stomach and when I look back now I can feel it again,” he said.

“It feels very Orwellian. We’re constantly being recorded and put on these systems but should we be there? It feels like spying without cause. I’m hyper-aware of cameras everywhere now, I’m so aware of them.”

Home Bargains eventually issued him an apology and a £100 voucher as a “gesture of goodwill without admission”, on the condition that the details of the incident remain confidential. Clayton declined: “I just thought: ‘Really, you’re trying to buy my silence?’”

As facial recognition spreads across police forces and retail stores, UK biometrics commissioners are warning that national oversight is lagging far behind the technology’s rapid expansion.

Last year, the Home Office admitted facial recognition cameras were more likely to incorrectly identify black and Asian people than their white counterparts, and women more than men, and there have been conflicting studies on their overall accuracy.

Warren Rajah, a data strategist in south London, was asked to put down his shopping basket and leave his local Sainsbury’s store in February after being told he had come up on the Facewatch system.

“For me, this is a civil rights issue that we are slow-waltzing into because if you are just removed without question, your civil rights are being impacted,” he said. “We already live in a country that has issues with racism, it’s an unavoidable issue. And we know cameras cannot pick up features of people that have darker features with as much accuracy. And this could be happening to people who are much more vulnerable than me.”

He said he had major concerns about this technology being rolled out in police forces, as well as in the retail sector. “Who is regulating these companies and can they be trusted with our information? And more importantly, no one has actually defined what your recourse is when something goes wrong,” he said.

After countless emails, he eventually found out he was not on the Facewatch database and staff members had misidentified him. He was offered a £75 voucher as an apology – when he said he did not feel comfortable returning to the store, he was told to use it online.

Jennie Sanders, 48, from Birmingham, was browsing in B&M on a Saturday afternoon last year when a security guard told her she had been flagged up on the Facewatch system and he had to escort her around the store to check she was not stealing.

“I was really upset. It was in front of loads of people, and I was really embarrassed. I said I wanted to leave and he escorted me out of the shop,” she said. “It was scary but what was more scary was when I got home and started looking into Facewatch, I saw they share the information between loads of retailers.

“I thought: ‘I’m going to be treated like a shoplifter in every store. I’m not going to be able to do any shopping in person ever again.’”

She was told she had to send a copy of her passport to Facewatch to prove her identity before she could find out that she was on the system for stealing a bottle of wine from B&M, which she said never happened. B&M told her it no longer had any evidence, including CCTV footage from the day, so she was taken off the system and offered a £25 voucher.

“I took a couple of days off work, I was absolutely beside myself. Why was I on a database of criminals without my knowledge?” she said. “I’m never going into B&M again. I try to stay away from places with cameras at all – it has really affected me.”

Sanders said she complained to the Information Commissioner’s Office (ICO), the watchdog that monitors how personal information is used in facial recognition technology, but seven months later she had yet to hear back. She added: “We’re told to raise complaints and send all correspondence to the information commissioner, but they don’t get back to you.

What the hell is happening with any sort of response to the victims of this?”

Rajah had also considered complaining to the ICO, but could find no information on how to do so. “They are so toothless,” he said. “And this issue has been well reported, and they haven’t publicised a formal complaints process. Where’s that information? How can you complain when there are no avenues to follow?”

A Sainsbury’s spokesperson said: “We have sincerely apologised to Mr Rajah for his experience in our Elephant and Castle store. This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store.

“The Facewatch system has a 99.98% accuracy rate and all matches are reviewed by trained managers, with additional training provided after this incident to ensure our safeguards are consistently followed.”

Nick Fisher, the chief executive of Facewatch, said: “We are aware of the matters referenced and in each case, we acted promptly once they contacted the Facewatch data protection team.

“These cases relate to human error in the way processes were carried out in-store, rather than any failure of Facewatch’s technology. We are sorry these individuals experienced being challenged while shopping and understand why this would have been upsetting.

“These three errors are extremely rare cases when viewed in the context of the more than 500,000 alerts we send to retailers each year, but we recognise that any mistake is upsetting for the individual concerned. The system is designed to support, not replace, human decision-making.”

A spokesperson for the ICO said: “We recognise the harm and upset that can be caused by misidentification. For this reason, use of facial recognition technology must strictly comply with data protection law and be handled with care and transparency.

“If someone has concerns about how their data has been collected, used, or shared, and those concerns cannot be resolved with the retailer directly, they have the right to raise a complaint with us.

“We also continue to actively regulate in this area and will be publishing further retail-focused guidance to support retailers in understanding and meeting their data protection obligations, while ensuring the public is properly protected.”

Home Bargains and B&M declined to comment.