#AI #nukes #Hegseth #bullies #bullying
"Hegseth Demands Anthropic Let Military Use AI However It Wants—Even for Autonomous Killer Drones and Spying On Americans
Secretary of Defense Pete Hegseth said the company that owns the AI assistant Claude would be punished unless it drops all ethical guidelines.
(. . .)
Anthropic’s powerful AI model, Claude, is currently the only one permitted to handle classified military data, and the company was awarded a $200 million contract last year to develop AI capabilities for the Department of Defense to use alongside other AI firms.
However, the company’s usage policy prohibits its use for mass surveillance and for the development of autonomous weapons—such as drones that attack targets without a human operator.
These limitations have infuriated the Defense Department leadership. On Tuesday, Hegseth called Anthropic’s CEO, Dario Amodei, to a meeting at the Pentagon, where he demanded 'unfettered' access to Claude without any guardrails.
This goal was outlined last month in the department’s 'AI Strategy' memo, which called for the US to adopt an 'AI-first warfighting force' and for companies to allow their technology to be deployed for 'any lawful use,' free from ethical safeguards.
While it would not be unusual for the Pentagon to cut ties with Anthropic, its threat to declare the company a supply chain risk has been described as extraordinary.
Jessica Tillipman, the associate dean for government procurement law studies at George Washington University, who specializes in AI governance, wrote on social media that the threat of 'declaring Anthropic a supply chain risk is deeply problematic,' as it’s 'generally something we reserve for products that create security risks, and using it in this way undermines its purpose.'
As Elizabeth Nolan Brown wrote on Wednesday for Reason, it 'would mean anyone who wants to work with the US military in any capacity must sever ties with the AI company,' which could deal a major blow to the business."
https://www.commondreams.org/news/hegseth-jawbones-anthropic