
A police report-writing AI malfunctioned and claimed a Utah officer had turned into a frog.
A strange problem came to light in December, when the Heber City Police Department in Utah was trying out new AI tools to reduce the amount of paperwork its officers had to handle.
Rather than providing a clear summary of what happened, the software produced a report claiming an officer had transformed into a frog.
It turned out the police body camera had mistakenly recorded audio from a Disney movie that was playing nearby.
Disney film causes AI police report mix-up
Sergeant Keel told FOX 13 News that the AI-powered report-writing software had identified the movie playing in the background as ‘The Princess and the Frog.’
“That’s when we learned the importance of correcting these AI-generated reports,” he added.
The department started testing two different software programs, Draft One and Code Four, in early December.

George Cheng and Dylan Nguyen, both 19-year-old MIT dropouts, developed Code Four. This program uses audio from body cameras to automatically generate complete police reports.
Axon, a company specializing in police technology, unveiled Draft One last year. Like Code Four, it automatically generates written reports from body camera footage.
Draft One is powered by OpenAI’s GPT technology, which Axon says will significantly reduce officers’ paperwork and administrative tasks.
In the frog incident, Draft One was the platform responsible for drafting the faulty report.
The error became apparent during a practice traffic stop used to showcase what the tool could do.
FOX 13 reported that the AI-generated document contained many errors and needed significant editing before anyone could actually use it.
Keel said the tool still offers clear benefits, despite occasional glitches.
The Heber City sergeant says the AI software now saves him six to eight hours of work every week.

“I’m not the most tech-savvy person, so it’s very user-friendly,” he told the outlet, emphasizing that the program is simple to learn and genuinely saves time each week.
The incident illustrates the challenges law enforcement agencies face as they race to integrate artificial intelligence into their everyday work.
Law enforcement and security agencies are increasingly using artificial intelligence for tasks such as report writing, facial recognition, and surveillance. But mistakes by these automated systems remain a concern, and AI has caused problems for police work before.
In 2025, an AI-powered security system made an error and incorrectly identified a student’s bag of Doritos as a gun.
In another case, a man was wrongly arrested after an AI system flagged him as someone barred from a casino, reporting a 99% match on the incorrect identification.
As such, human oversight remains essential, something Keel made clear following the frog fiasco.
2026-01-07 22:50