In an attempt to look more professional, the Westbrook Police Department ended up looking like amateurs instead. Their brilliant idea? Using AI to slap the department patch onto a drug bust photo. What could possibly go wrong?
The department posted the altered image on Facebook, apparently thinking nobody would notice the weird distortions and blurry details left by the AI makeover. Surprise! People noticed. The internet did what it does best: catching authorities with their digital pants down, pointing out every flaw, and questioning the department's credibility.
When confronted, the cops doubled down. No AI here! Just regular photo editing! Their denial only made things worse. Social media exploded. Citizens wondered what else the department might be fudging. Not a great look for people whose testimony needs to be believed in court.
Eventually, someone at headquarters connected the dots: maybe lying about technological incompetence isn't the best strategy. The department finally admitted that yes, they'd used ChatGPT to alter the evidence photo. Oops. They apologized for the “oversight” and offered to share the original photo with media outlets. Too little, too late.
The incident raises serious questions about evidence integrity. Courts don’t typically smile upon doctored evidence, AI-generated or otherwise. The department’s technological misstep could have legal implications far beyond embarrassing Facebook comments.
It's also a stark reminder of AI's limitations. These tools are far from flawless, especially in the hands of users who don't understand their capabilities or risks. In this case, a simple attempt to add a departmental patch snowballed into a full-blown credibility crisis, and it highlighted how AI can make unpredictable alterations to evidence that could undermine legal proceedings.
The seized evidence was substantial: 61 grams of fentanyl and 23 grams of methamphetamine confiscated during the June 24 bust. That's exactly the kind of case where you don't want a defense attorney asking why the photos were run through a chatbot.

The episode also echoes broader ethical concerns about AI systems operating as black boxes with diminished human oversight in critical areas. The Westbrook incident serves as a cautionary tale for other departments: AI might seem like a handy tool for making your social media posts look cooler, but when it comes to evidence, maybe stick to the unaltered truth. Novel concept for law enforcement, right?
References
- https://opentools.ai/news/maine-polices-ai-mishap-a-case-of-altered-evidence-raises-eyebrows
- https://keyt.com/news/2025/07/02/police-department-apologizes-for-sharing-ai-altered-photo-of-seized-drugs/
- https://www.vice.com/en/article/cops-apologize-after-altering-drug-bust-photo-with-ai/
- https://futurism.com/police-ai-slop-drug-bust
- https://www.everlaw.com/blog/ai-and-law/unlocking-justice-ai-evidence-analysis-forensics/