Tech insiders are urging the Department of Defense and Congress to withdraw Anthropic's "supply chain risk" label

🚀 Read this must-read post from TechCrunch 📖

📂 **Category**: AI, Government & Policy, Anthropic, DoD, supply chain risk

💡 **What You’ll Learn**:

Hundreds of tech workers have signed an open letter urging the Department of Defense to withdraw its designation of Anthropic as a “supply chain risk.” The letter also calls on Congress to intervene and “consider whether the use of these extraordinary powers against a US technology company is appropriate.”

The letter includes signatories from major technology and venture capital firms including OpenAI, Slack, IBM, Cursor, Salesforce Ventures, and more. It follows a dispute between the Department of Defense and Anthropic after the AI lab last week refused to give the military unrestricted access to its AI systems.

Anthropic’s two red lines in its negotiations with the Pentagon were that it did not want its technology used for mass surveillance of Americans or to operate autonomous weapons that make targeting and firing decisions without a human in the loop. The Department of Defense said it has no plans to do either, but does not believe it should be constrained by vendor rules.

In response to Anthropic CEO Dario Amodei’s refusal to bow to Hegseth’s threats, President Donald Trump on Friday directed federal agencies to stop using Anthropic’s technology after a six-month transition period. Hegseth said he would follow through on his threats and designate Anthropic as a supply chain risk — a designation typically reserved for foreign adversaries that would blacklist the AI company, preventing it from working with any agency or company that does business with the Pentagon.

In a post on Friday, Hegseth wrote: “As of now, no contractor, supplier or partner that does business with the US military may conduct any business activity with Anthropic.”

But the post on X doesn’t automatically make Anthropic a supply chain risk. The government needs to complete a risk assessment and notify Congress before military partners are forced to cut ties with Anthropic or its products. Anthropic said in a blog post that the designation was “legally unsound” and that it would “challenge any supply chain risk classification in court.”

Many in the industry see the US administration’s treatment of Anthropic as harsh and as clear retaliation.

“When two parties cannot agree on terms, the natural course is to separate and work with a competitor,” the open letter said. “This situation sets a dangerous precedent. Punishing a US company for refusing to accept contract changes sends a clear message to every technology company in America: Accept whatever terms the government demands, or face retaliation.”

Beyond worrying about the government’s harsh treatment of Anthropic, many in the industry remain concerned about potential government overreach and the use of AI for nefarious purposes.

Preventing governments from using AI to conduct mass surveillance is also a “personal red line” that “should be the line of all of us,” Boaz Barak, a researcher at OpenAI, wrote in a social media post on Monday.

Moments after Trump publicly attacked Anthropic, OpenAI announced it had reached its own deal to deploy its models in the Department of Defense’s classified environments. Sam Altman, CEO of OpenAI, said last week that the company has the same red lines as Anthropic.

“If anything good can come out of the events of the past week, it will be if we in the AI industry begin to treat the use of AI by governments to abuse and surveil their people as a catastrophic risk in itself,” Barak wrote. “We’ve done a good job of assessments, mitigation and operations for risks like bioweapons and cybersecurity. Let’s use similar processes here.”


🕒 **Posted on**: March 2, 2026
