Anthropic denies being able to sabotage AI tools during war

📖 **Source**: WIRED

📂 **Category**: Business / Artificial Intelligence, Model Manipulation


Anthropic cannot manipulate its Claude generative artificial intelligence models once the U.S. military has them operational, an executive wrote in a court filing on Friday. The statement came in response to the Trump administration’s accusations that the company could tamper with its artificial intelligence tools during wartime.

“Anthropic never had the ability to cause Claude to stop working, change its functionality, close access to it, or impact or jeopardize military operations,” Thiago Ramasamy, head of public sector at Anthropic, wrote. “Anthropic does not have the access required to disable the technology or change model behavior prior to or during ongoing operations.”

The Pentagon has been sparring with a leading AI lab for months over how its technology should be used for national security, and what the limits should be on that use. This month, Defense Secretary Pete Hegseth described Anthropic as a supply chain risk, a designation that will prevent the Defense Department from using the company’s software, including through contractors, in the coming months. Other federal agencies are also abandoning Claude.

Anthropic has filed two lawsuits challenging the constitutionality of the ban and is seeking an emergency order to overturn it. However, customers have already started canceling deals. A hearing in one of the cases is scheduled for March 24 in federal district court in San Francisco, and the judge could rule on a temporary block shortly after.

In a filing earlier this week, government lawyers wrote that the Department of Defense “is not required to tolerate the risk of critical military systems being compromised at pivotal moments for national defense and active military operations.”

WIRED magazine reported that the Pentagon was using Claude to analyze data, write memos, and help develop battle plans. The government’s argument is that Anthropic could disrupt active military operations by stopping access to Claude or pushing out malicious updates if the company doesn’t approve certain uses.

Ramasamy rejected this possibility. “Anthropic does not maintain any backdoor or remote kill switch,” he wrote. “Anthropic personnel, for example, cannot log into the DoW’s systems to modify or disable models during operations; the technology simply does not work that way.”

He went on to say that Anthropic would only be able to provide updates with the approval of the government and its cloud provider, in this case Amazon Web Services, though he did not name it specifically. Ramasamy added that Anthropic does not have access to the prompts or other data that military users enter into Claude.

Anthropic executives assert in court filings that the company does not want veto power over military tactical decisions. Anthropic was willing to guarantee as much in the contract it proposed on March 4, Sarah Heck, chief policy officer, wrote in a court filing on Friday. Anthropic “understands that this authorization does not grant or confer any right to control or veto lawful operational decision-making of the War Department,” the proposed contract said, according to the filing, which used an alternative name for the Pentagon.

Heck claimed the company is also willing to accept language that would address its concerns about using Claude to help carry out lethal strikes without human supervision. But the negotiations eventually collapsed.

For now, the Department of Defense said in court filings that it is “taking additional measures to mitigate the supply chain risks” posed by the company, including working with third-party cloud providers to ensure the company cannot make unilateral changes to existing Claude systems.





