Judge Blocks Pentagon’s Anthropic Supply Chain Designation

A US federal judge in San Francisco has granted Anthropic’s request for temporary reprieve after the Pentagon’s designation of the company as a supply chain risk.

In an order on Thursday, Judge Rita Lin of the District Court for the Northern District of California issued a preliminary injunction against the Pentagon over the label. The order also temporarily halts a directive from US President Donald Trump ordering federal agencies to stop using Anthropic’s chatbot, Claude.

“Nothing in the governing statute supports the Orwellian notion that an American company may be branded a potential adversary and saboteur of the US for expressing disagreement with the government,” said Judge Lin.

Anthropic led the enterprise AI market with a 32% share as of 2025, ahead of OpenAI at 25%, according to Menlo Ventures. A government-wide ban on Anthropic would severely erode that position.


The judge said that these “broad punitive measures” taken against Anthropic by the Trump administration and Defense Secretary Pete Hegseth appeared “arbitrary, capricious, [and] an abuse of discretion.”

The order came after Anthropic filed a lawsuit in a federal court in the District of Columbia on March 9, alleging that Hegseth overstepped his authority when he designated the company a national security supply-chain risk.

Screenshot from court ruling. Source: CourtListener

Anthropic opposed autonomous weapons and mass surveillance

The dispute stems from a July 2025 contract between the AI firm and the Pentagon that made Claude the first frontier AI model approved for use on classified networks. 

Negotiations collapsed in February when the Pentagon sought to renegotiate, insisting that Anthropic allow military use of Claude “for all lawful purposes” and without restrictions.

Anthropic maintained that its technology should not be used for lethal autonomous weapons and mass domestic surveillance of Americans.

On Feb. 27, Trump ordered all federal agencies to cease using Anthropic products. “The Leftwing nut jobs at Anthropic have made a DISASTROUS MISTAKE trying to STRONG-ARM the Department of War,” he wrote on Truth Social. 

A 90-minute court hearing took place in San Francisco on March 24, during which Judge Lin pressed government lawyers on whether Anthropic was being punished for publicly criticizing the Pentagon.

Classic illegal First Amendment retaliation

“Punishing Anthropic for bringing public scrutiny to the government’s contracting position is classic illegal First Amendment retaliation,” the March 26 ruling stated. 

Anthropic said in a statement that it was “grateful to the court for moving swiftly, and pleased it agrees Anthropic is likely to succeed on the merits.” 
