We found an undocumented bug in the Apollo 11 guidance computer code

https://www.juxt.pro/blog/a-bug-on-the-dark-side-of-the-moon/

JUXT Blog: A bug on the dark side of the Moon

How a specification found what fifty-seven years of scrutiny missed.

Super interesting. I wish this article wasn’t written by an LLM though. It feels soulless and plastic.
For what it’s worth, Pangram thinks this article is fully human-written: https://www.pangram.com/history/f5f68ce9-70ac-4c2b-b0c3-0ca8...
A bug on the dark side of the | Pangram Labs

Does "A bug on the dark side of the " contain AI-generated text? Pangram finds that this document is fully human-written.

Then Pangram isn't very good, because that article is full of Claude-isms.

Is it even possible for a tool to determine with high confidence that something is AI-written? LLMs can be tuned or instructed to write in an infinite number of styles.

Don't understand how these tools exist.

The WikiEDU project has some thoughts on this. They found Pangram good enough to detect LLM usage while teaching editors to make their first Wikipedia edits, at least enough to intervene and nudge the student. They didn't use it punitively or expect authoritative results, however. https://wikiedu.org/blog/2026/01/29/generative-ai-and-wikipe...

They found that Pangram suffers from false positives in non-prose contexts like bibliographies, outlines, formatting, etc. The article does not touch on Pangram’s false negatives.

I personally think it's an intractable problem, but I do feel Pangram gives some useful signal, albeit not reliably.

Generative AI and Wikipedia editing: What we learned in 2025

Like many organizations, Wiki Education has grappled with generative AI, its impacts, opportunities, and threats, for several years. As an organization that runs large-scale programs to bring new e…

Wiki Education