Tennessee grandmother wrongly jailed for six months, latest victim of AI-driven misidentification — facial recognition is jailing the wrong people, but police keep using it anyway

Angela Lipps speaks with WDAY News during an interview about her wrongful arrest.
(Image credit: Matt Henson / WDAY)

A Tennessee grandmother, Angela Lipps, spent nearly six months in jail after police in Fargo, North Dakota, used facial recognition software to identify her as the primary suspect in a bank fraud case, according to reporting by WDAY News.

Fargo police were investigating a series of bank fraud incidents in April and May last year, in which a woman used a fake U.S. Army ID to withdraw tens of thousands of dollars. Detectives ran surveillance footage through facial recognition software, which returned a match to Lipps. A detective then compared her Tennessee driver's license and social media images to the suspect and concluded that she was the perpetrator based on facial features, body type, and hair. Nobody from the department contacted Lipps before U.S. Marshals arrested her at gunpoint on July 14 while she was babysitting four children.


Lipps sat in a Tennessee county jail for 108 days before North Dakota officers collected her. Her attorney, Jay Greenwood, immediately requested her bank records. When Fargo police finally met with Greenwood and Lipps on December 19, five months after her arrest, those records showed she had been buying cigarettes and depositing Social Security checks in Tennessee at the times police placed her in Fargo. The case was dismissed on Christmas Eve, but the damage had already been done: Lipps was released with no money, no coat, and no way home, and she subsequently lost her house, her car, and her dog.

It's not unusual

Shockingly, this is just the latest in a series of structural failures that have led to innocent people being prosecuted for crimes they didn't commit. A January 2025 Washington Post investigation documented at least eight instances of Americans wrongfully arrested after police found a possible facial recognition technology (FRT) match; in every case, investigators skipped fundamental steps, such as checking alibis and comparing physical descriptions, that would have cleared the suspect before arrest.

The facial recognition vendors themselves, such as Clearview AI, even attach explicit caveats to their systems. Clearview requires agencies to acknowledge that results "are indicative and not definitive" and that officers must conduct further research before acting on them. According to an April 2024 ACLU submission to the U.S. Commission on Civil Rights, in at least five of seven wrongful arrest cases, police had received explicit warnings that FRT results don’t constitute probable cause but made arrests anyway.

Robert Williams, whose 2020 wrongful arrest in Detroit was the first publicly reported FRT false-positive case, reached a landmark settlement with the city in June 2024 that now requires independent corroborating evidence before any FRT match can be used to seek an arrest warrant. However, only 15 states had enacted any FRT legislation covering law enforcement at the start of 2025, and North Dakota is not among them.

As for Lipps, she is now back home in Tennessee, awaiting an apology from the Fargo Police Department that hasn’t yet come.


Luke James
Contributor

Luke James is a freelance writer and journalist. Although his background is in law, he has a personal interest in all things tech, especially hardware and microelectronics, and anything regulatory.

  • Ralston18
    AI (all caveats aside) can certainly make a mess of things. Not a fan by any means.

    However, the real problem is the breakdown of the judicial system and the lack of "common sense" overall.

    If the facial recognition system had picked out some celebrity that would have been resolved within hours. Or even a family member of such celebrity.

    But find someone old, poor, defenseless, etc. - you can see what happens.

AI / FRT only started the problem. The judicial system, the police, and likely bureaucratic others, made the situation worse, and put it on a path to get even worse. As what happened.

    Others need to be held accountable as well.
    Reply
  • SonoraTechnical
    Luke,
    Thank you for posting this story. They literally ruined the woman's life. It's disgusting isn't it?

    Ralston18 said:
    However, the real problem is the breakdown of the judicial system and the lack of "common sense" overall.

    But find someone old, poor, defenseless, etc. - you can see what happens.

The judicial system, the police, and likely bureaucratic others, made the situation worse, and put it on a path to get even worse. As what happened.

    Others need to be held accountable as well.
    The level of injustice served this woman is beyond belief. Being poor, she didn't have the resources to fight it.
    Reply
  • ravewulf
    We live in such a dark timeline...
    Reply
  • PEnns
    Unbelievable!!

    This woman needs some bad a$$ lawyer and sue the heck out of everybody involved: The Facial rec. company, the bank and the police. And the horses they rode on too!!
    Reply
  • abufrejoval
She should count herself lucky: just standing close to false positives means you lose the ability to complain in other parts of the world.
    Reply
  • DougMcC
    PEnns said:
    Unbelievable!!

    This woman needs some bad a$$ lawyer and sue the heck out of everybody involved: The Facial rec. company, the bank and the police. And the horses they rode on too!!
    This is how you solve this problem. Get her a multimillion dollar judgement and Fargo will be thinking twice about the value their facial recognition software is bringing them.
    Reply
  • QuarterSwede
    DougMcC said:
    This is how you solve this problem. Get her a multimillion dollar judgement and Fargo will be thinking twice about the value their facial recognition software is bringing them.
I would certainly find a lawyer who would make a mint on suing the pants off of them as an example. I wouldn't go for a $ amount, it would be a percentage of the company to actually hurt them. And if that's against the law, I would challenge that. The absolute lack of critical thinking and outright laziness is disgusting, disgraceful, and embarrassing.
    Reply
  • palladin9479
    PEnns said:
    Unbelievable!!

    This woman needs some bad a$$ lawyer and sue the heck out of everybody involved: The Facial rec. company, the bank and the police. And the horses they rode on too!!

    Guaranteed she is getting calls from lawyers to represent her. There are at least two governments she can sue for seven digit figures.
    Reply
  • Ralston18
    Lawsuits etc. aside, the real shame is with those who used facial recognition and immediately failed to investigate further (as detectives should do) or otherwise follow-up in some manner to determine if it all made sense.

    That failure is the root and the shame of it all.

    Overall, taking the report at face value, it narrows down to too many people simply not doing their jobs or doing their jobs properly. Mostly just common sense was needed.....

    There may eventually (years likely) be monetary settlements but the damage is done.

    People must be held fully accountable for their actions or lack of actions per situations and circumstances.

    It was not AI who locked the proverbial "cell and threw away the key".
    Reply
  • gamerk316
Welcome to America, where you are thrown in jail for months on end awaiting a hearing, despite literally all the evidence showing there's no way you could have committed the crime you are accused of. And you get out, and find out you lost your job, your house, and who knows what else, with little to no means of getting financial restitution.
    Reply