
A police report-writing AI unexpectedly malfunctioned and claimed a Utah officer had turned into a frog.
A strange problem came to light in December when the Heber City Police Department in Utah was trying out new AI tools to reduce its paperwork load.
Rather than providing a clear summary of what happened, the software produced a report claiming an officer had transformed into a frog.
It turned out the police body camera had mistakenly recorded audio from a Disney movie that was playing nearby.
Disney film causes AI police report mix-up
Sergeant Keel told FOX 13 News that the AI-powered report-writing software had identified the movie playing in the background as ‘The Princess and the Frog.’
“That’s when we learned the importance of correcting these AI-generated reports,” he added.
The department started testing two different software programs, Draft One and Code Four, in early December.

George Cheng and Dylan Nguyen, both 19-year-old MIT dropouts, developed Code Four. This program uses audio from body cameras to automatically generate complete police reports.
Axon, a company specializing in police technology, unveiled Draft One last year. Like Code Four, it automatically generates written reports from body camera footage.
Draft One is powered by OpenAI’s GPT technology. Axon says this will significantly reduce the amount of paperwork and administrative tasks for police officers.
In the frog incident, Draft One was the platform responsible for drafting the faulty report.
The error became apparent during a practice traffic stop used to showcase what the tool could do.
FOX 13 reported that the AI-generated document contained many errors and needed significant editing before anyone could actually use it.
Keel said the tool still offers clear benefits despite occasional glitches. The Heber City sergeant says the AI software now saves him six to eight hours of work every week.

“I’m not the most tech-savvy person, so it’s very user-friendly,” he told the outlet.
This situation shows the challenges law enforcement faces as they quickly try to integrate artificial intelligence into their everyday work.
Law enforcement and security agencies are increasingly using artificial intelligence for tasks like report writing, facial recognition, and surveillance support. But mistakes made by these automated systems remain a concern, and AI has caused issues for police work before.
In 2025, an AI-powered security system incorrectly identified a student’s bag of Doritos as a gun.
In another case, a man was wrongly arrested after an AI system mistakenly flagged him as someone barred from a casino, despite reporting a 99% match in its incorrect identification.
As such, human oversight remains essential, something Keel made clear following the frog fiasco.
2026-01-07 22:50