True Crime: The AI Algorithm That Stole Angela Lipps’ Freedom

How a flawed facial recognition match sent a Tennessee grandmother to a North Dakota jail for crimes she didn't commit.

True Crime Suspect Angela Lipps Misidentified by AI - Via NBC News - YouTube

Like so many other areas of life in the 2020s, true crime is changing as AI facial recognition becomes a tool for police forces. It's also a bit of a terrifying reality because, as ChatGPT itself will tell you, AI "can make mistakes." And that's what helped rob Angela Lipps of her freedom. Read on to find out more.

The Algo That Got It Wrong

It's not a made-up story designed to make folks quake in their boots about the power of AI. And that's the real tragedy, because it actually happened. Many outlets, including The Daily Mail, reported that Angela Lipps, a grandmother, was babysitting her grandkids in Tennessee last year.

The outlet noted that:

Angela Lipps, 50, was left astonished when US Marshals showed up to her home in Elizabethton on July 14, 2025, while she was babysitting four children, and arrested her at gunpoint.

Angela treated as a criminal on minimal proof – NBC – YouTube

Can you even imagine her fear as she was accused of "felony theft and felony unauthorized use of personal identifying information"? Later, she ended up being moved to North Dakota. But during the legal process, she sat for "108 days without bail."

Notably, the scariest part was that she'd never even visited North Dakota. And presumably, she'll strike it off her bucket list for future vacation dates.

As an aside, it's worth noting that another interesting report about AI and the legal system involved Darron Lee. He used the tech tool to try and cover up his crimes. And failed.

Trending On Social Media

The entire debacle that stole the freedom of an innocent woman began because an AI decided she looked like a wanted felon. Based on that alone, the cops arrested her, treating the "match" as established fact. Right now, it's trending on social media after NBC News shared the news of her December release.

 

Across social media, and not just on YouTube, the news fuels anger and a deeper mistrust of AI as a crime-fighting tool. While the crime really happened, it wasn't committed by Angela. The basic facts of the case involved Fargo Police investigating a 2025 fraud in which someone used a fake ID to withdraw cash from a bank.

Lazy Police Work?

What made many people furious is that the cops seemed to skip routine investigative work. Instead, they hit the "search" button on specialized software.

Apparently, when the AI red-flagged Lipps, investigators seemed uninterested in the grind of gathering other evidence. Despite being over a thousand miles away, she was extradited and stuck behind bars for nearly half a year.

The Huge Fail Hurt

Basically, to summarize what happened, the AI software flagged a "potential match" based on a surveillance photo. Arguably, that was only the first mistake. Why? Because the police then relied solely on the tech to make an arrest. The alarming part is that they apparently didn't even check whether she'd been in North Dakota at the time the crime happened.

At the end of the day, apart from the harrowing ordeal of false imprisonment, Angela Lipps lost her home, her car, and her dog during her incarceration. It took time, but finally her attorney got her released after finding proof that she had stayed in Tennessee the whole time.

AI Scares People

Obviously, folks now question the use of such tech in police work. Even if the AI match looked close, is that good enough? Should warrants be issued based on that alone? Admittedly, Fargo police have since changed their policies, but it came too late for grandma Angela.

Who’s gonna give her back her peace of mind? Notably, everyone knows that software can make mistakes, so critics feel it’s way too soon to employ the technology.

Reactions Come Fast & Furious

Apart from shocked comments on the NBC report, people are talking about it across social media. On Reddit, one person wrote, "That's honestly terrifying. Imagine being locked up for months in a place you've never even been to, all because an algorithm said 'close enough.' AI is supposed to assist, not replace common sense."

Here are a few more reactions in the discussion:

  • Institutions are rushing to use AI to replace the human mind. Police don’t bother with further investigation because the AI told them what they want to know, and they can’t wrap their heads around the thought that the tech might be wrong.
  • Since when has ‘looks like the suspect’ been sufficient grounds for arrest? AI could give you that with 100% accuracy and it would still be dumb to rely only on that one thing.
  • This is just laziness and incompetence. Not only on the part of the police but on the judge who would sign a warrant simply based on AI and apparently nothing else.
  • Have y’all seen that movie Mercy? AI decides if you’re guilty or not. Pretty scary because it seems like we could be there in 5-10 years.

What are your thoughts? Do you agree that investigators need to be more cautious about using AI to close true crime cases? Do you agree that things seem to be getting rather too dystopian?

Let us know in the comments below, and remember to come back here often for all your true crime news and updates. Also, remember, we have a TikTok crime channel you can follow.
