A strange juxtaposition: the war in Iran unfolding at the same time as the controversy around Anthropic, OpenAI, and the Pentagon. OpenAI has published a post with the details of its agreement, which also includes red lines on mass surveillance and autonomous weapons:

"We think our agreement has more guardrails than any previous agreement for classified AI deployments, including Anthropic’s."

I’m confused about how Anthropic’s proposed contract differs from the contract that OpenAI has shared. If the Pentagon offered the same agreement to Anthropic, would they accept it? Lots of questions.

Our agreement with the Department of War

Details on OpenAI’s contract with the Department of War, outlining safety red lines, legal protections, and how AI systems will be deployed in classified environments.

@manton The provided contract language only says that the DoW can’t cross those red lines “where law, regulation, or Department policy” says that they can’t.

That sounds to me like a wordier way of saying “for all lawful purposes,” which amounts to “you’re not allowed to do things you already aren’t allowed to do,” which is basically no restriction at all.

@carljonard My reading is that OpenAI thinks the combination of existing rules (e.g. DoD Directive 3000.09) plus OpenAI’s safety checks will prevent autonomous use. That’s hardly no restriction. But we can’t compare with Anthropic’s language.
Leo Gao (@nabla_theta) on X

the contract snippet from the openai dow blog post is so obviously just "all lawful use" followed by a bunch of stuff that is not really operative except as window dressing. the referenced DoD Directive 3000.09 basically says the DoD gets to decide when autonomous weapons systems …

Mike Masnick (@masnick.com) on Bluesky

I saw some folks asking what the difference was between what OpenAI signed with the DoD and what Anthropic said they wanted, and Sam more or less admits here the key point: OpenAI's deal requires them to trust the NSA. Anthropic's contract had real safeguards.
