What happened to Angela Lipps reads less like a cautionary tale about technology and more like a full collapse of common sense.
In reporting for the Grand Forks Herald, Matt Henson laid out how a 50-year-old grandmother from north-central Tennessee ended up spending nearly six months in jail after Fargo police used facial recognition software in a bank fraud investigation and concluded she was the woman in the surveillance footage. She says she had never been to North Dakota, did not know anyone there, and had never even been on an airplane until authorities flew her there in custody to face charges.
Attorney and commentator Steve Lehto, reacting to the same case on Lehto’s Law, put it in even plainer language: an innocent woman was dragged more than 1,200 miles away from a crime scene, jailed for months, and had her life wrecked because officials appear to have trusted a computer match more than basic detective work.
And the worst part may be how ordinary the failure now sounds.
This was not a science-fiction glitch or some futuristic robot making an arrest on its own. It was a chain of human choices, each one apparently accepting the previous mistake as good enough. By the time anyone seriously looked into whether Angela Lipps could have committed the crime, she had already lost half a year of her life.
A Tennessee Grandmother Suddenly Became A Fugitive
Matt Henson reports that Lipps was arrested on July 14 at her Tennessee home by a team of U.S. Marshals while she was babysitting four young children. She says she was taken away at gunpoint and booked into her county jail as a fugitive from justice from North Dakota.
Her reaction to the whole thing, as Henson quoted her, was immediate and simple: “I’ve never been to North Dakota, I don’t know anyone from North Dakota.”
That should have been the beginning of a serious pause. Instead, it appears to have been treated as background noise.

Lipps sat in jail in Tennessee for nearly four months, held without bail as a fugitive, while the extradition process played out. The charges out of North Dakota were serious: four counts of unauthorized use of personal identifying information and four counts of theft tied to a Fargo-area bank fraud investigation. As Henson explains, she was told that if she wanted to fight the case, she would have to go to North Dakota to do it.
That is the kind of bureaucratic cruelty that can destroy a person long before a courtroom ever fixes the mistake.
Steve Lehto, discussing the same events, notes how absurd that sounds on its face. A woman who says she has never even visited the state is thrown in jail, left there for months, and effectively told that the only way to clear her name is to let the system drag her even farther into the error.
The AI Match Became The Case
Henson reports that the Fargo police investigation began in April and May 2025, when detectives were looking into several bank fraud cases involving a woman seen on surveillance video using a fake U.S. Army military ID card to withdraw tens of thousands of dollars.
To identify the woman, Fargo police turned to facial recognition software. According to court documents cited by Henson, that software identified the suspect as Angela Lipps.
From there, the case seems to have moved with alarming speed and very little skepticism. Henson reports that the detective then compared the surveillance images with Lipps’ social media accounts and Tennessee driver’s license photo, later writing in charging documents that she appeared to match the suspect based on facial features, body type, and hairstyle and color.
That last part jumps out because it sounds less like hard investigative proof and more like someone talking themselves into a conclusion the machine had already suggested.
Lehto seized on exactly that point in his video, mocking the idea that “hairstyle” could be part of the serious justification for uprooting someone’s life across state lines. His sarcasm works because the reasoning sounds so thin. If the only thing holding the case together was software, a look at social media, and a vague sense that the person kind of looked similar, then it was never a strong case to begin with.
And yet it was strong enough, somehow, to bring armed federal officers to her door.
Months Passed Before Anyone Did The Obvious
One of the most disturbing details in Henson’s reporting is that no one from the Fargo Police Department ever contacted Lipps before she was arrested, at least according to her account.
No interview. No phone call. No chance to say she had never been to North Dakota. No simple early effort to test whether the AI result made any real-world sense.
Then, even after the arrest, the pace remained astonishingly slow. Henson reports that North Dakota authorities did not pick her up from her Tennessee jail cell until October 30, a full 108 days after her arrest. The next day, she made her first appearance in a North Dakota courtroom.
Think about that for a moment. More than three months passed between the arrest and the transfer, and still nobody had apparently done the basic work that might have saved her from ever being sent there in the first place.

Lehto hammers that point repeatedly. He says this case is a perfect example of people overestimating what AI can do and then skipping the boring but necessary steps of real investigation. If the software says “possible match,” that might justify a deeper look. It should not be treated like a final answer.
And that is what seems so maddening here. There were so many chances to catch the mistake.
Her Bank Records Blew The Case Apart
According to Henson, everything changed only after North Dakota defense attorney Jay Greenwood got involved.
Greenwood’s reaction, as quoted in the article, was exactly what many readers likely thought as soon as they heard the story: “If the only thing you have is facial recognition, I might want to dig a little deeper.”
That line should probably be engraved somewhere in every police department using this technology.
Greenwood asked Lipps for her bank records, and once those records were obtained, the case began to unravel almost immediately. Henson reports that Fargo police met with Greenwood and Lipps at the Cass County jail on Dec. 19, after she had already been locked up for more than five months. It was the first time police interviewed her.
Her bank records showed she was in Tennessee, not North Dakota, at the very times police said she was in Fargo committing fraud. Greenwood told Henson that around the same period when the fraud was happening, Lipps was depositing Social Security checks, buying cigarettes at a gas station, ordering pizza, and using Cash App for Uber Eats – all back home, more than 1,200 miles away.
Five days later, on Christmas Eve, the case was dismissed.
That sequence is almost unbearable to read because the solution was not hidden in some advanced forensic mystery. It was sitting in routine bank records. Once someone bothered to ask for them, the whole thing collapsed.
Lehto makes the same point with some anger, and rightly so. He says it would not have taken much to check where she was, to contact her, or even to send detectives to speak with her before using the full force of extradition. Instead, the system acted first and verified later.
The Price Of Being Wrong Was Paid By Her, Not By The System
Getting the case dismissed did not repair what had already been done.
Matt Henson reports that when Lipps was released on Christmas Eve, Fargo police did not cover the cost of getting her home. She was left stranded in North Dakota in summer clothes, with no coat, in winter conditions, frightened and unsure how she was going to make it back to Tennessee.

Local defense attorneys gave her money for a hotel room and food on Christmas Eve and Christmas Day. Then Adam Martin, founder of the F5 Project, drove her to Chicago so she could get home.
That detail says more about the system than any official statement could. The same machinery that had enough confidence to haul her across the country apparently felt no obligation, and showed no interest, in helping once it became clear it had jailed the wrong woman.
Lehto sounds particularly disgusted by that part. In his retelling, once the authorities finally admitted they had the wrong person, they effectively shrugged and left her in North Dakota to figure it out for herself.
By then, the damage was already catastrophic. Henson reports that because she could not pay her bills while jailed, Lipps lost her home, her car, and even her dog. She says no one from the Fargo Police Department has apologized.
That may be the most chilling detail in the whole story. A machine made a bad match, people chose to trust it, and a real woman lost nearly everything. Yet there is still no apology.
This Was Not Just An AI Failure, But A Human One
It is tempting to call this an AI error and stop there, because the software clearly played the starring role in the misidentification. But that description is too generous to the humans involved.
The software did not extradite Angela Lipps. The software did not decide to skip a phone call. The software did not choose to sit on the case for months without confirming her whereabouts. The software did not refuse to speak on camera afterward.
Those were human decisions.
Henson notes that when Fargo Police Chief David Zibolski was asked at his retirement news conference why nobody from Fargo police ever spoke with Lipps during the five months she was in jail, he declined to engage, saying they were not there to talk about that.
That answer may be legally cautious, but morally it sounds awful.
Lehto, for his part, says the police should be on the hook for some kind of damages, though he acknowledges that legal immunity rules may make that difficult. Still, his broader point is sound: if officials rely on AI as if it were foolproof, and do so without the ordinary safeguards of investigation, then the consequences are not “oops” consequences. They are life-altering harms.
And that is exactly what this became.
The Technology Is Not The Scariest Part
The frightening thing about this case is not that facial recognition made a mistake. Any technology can fail.
The frightening thing is how little resistance there seemed to be once the error entered the system.
A computer suggested a suspect. A detective thought the person looked close enough. No one seriously checked the alibi until months later. Armed officers made the arrest. Courts processed the extradition. Jails held her. And only when a defense lawyer pulled ordinary bank records did the state finally discover what should have been obvious much earlier.
That is not a story about one flawed piece of software. It is a story about how easily an institution can mistake speed for certainty and confidence for proof.
Angela Lipps was not merely inconvenienced by that error. According to Matt Henson’s reporting and Steve Lehto’s commentary, she lost her freedom, her home, her car, her dog, and nearly six months of her life because a machine got it wrong and too many people around it stopped asking questions.
That should scare far more people than the phrase “artificial intelligence” ever could.
