U.S. District Judge Rita Lin granted a preliminary injunction on Thursday blocking the Pentagon from classifying AI company Anthropic as a supply chain risk. The ruling hands Anthropic an early legal victory in a dispute that has forced the company into federal court on two fronts.
The designation did more than bar the Pentagon from using Anthropic's AI product, Claude. It required any company doing business with the Pentagon to sever ties with Anthropic entirely, a sweeping measure that rippled across the company's commercial relationships and market position.
That distinction matters. The Pentagon didn't just walk away from a vendor. It branded one.
In her written order, as reported by Breitbart, Judge Lin addressed the constitutional implications of the Pentagon's move. She found that nothing in the governing statute supports what she called an "Orwellian concept," one that would allow the government to label an American company a potential adversary and saboteur of the United States simply for expressing disagreement with government policies.
That language is striking from the bench. Federal judges do not toss around the word "Orwellian" casually. Judge Lin had already telegraphed her skepticism during a hearing earlier in the week, questioning why the Pentagon would need such extensive restrictions when it could simply stop using Claude's services directly.
It is a fair question. If the government doesn't want to buy a product, it doesn't have to. That is vendor selection. But compelling every Pentagon contractor to blacklist a company across the board is something else entirely.
Anthropic challenged the designation on both constitutional and procedural grounds, arguing that War Secretary Pete Hegseth and the Pentagon violated First Amendment protections and procurement law. The company claimed the classification was causing immediate and irreparable damage: business partners were reconsidering contracts, and federal agencies were dropping Claude.
An Anthropic spokesperson welcomed the ruling:
"We're grateful to the court for moving swiftly, and pleased they agree Anthropic is likely to succeed on the merits."
The spokesperson added that the company's focus remains on "working productively with the government to ensure all Americans benefit from safe, reliable AI."
That is careful corporate language, but the lawsuit itself tells a different story. Anthropic isn't just asking to keep its contracts. It argues that the federal government weaponized a procurement designation to punish speech.
Pentagon lawyers pushed back in a Tuesday filing, framing the dispute as a contract negotiation issue rather than a free speech matter. They argued that the conflict stemmed from Anthropic's behavior during contract negotiations, not from the specific limitations the company proposed regarding mass surveillance and autonomous weapons.
The government's position is straightforward: the Pentagon has legitimate authority to select its own vendors, and the First Amendment does not grant companies the right to unilaterally impose contract terms on the federal government. Government attorneys argued that Anthropic offered no legal precedent for what they characterized as a "radical interpretation" of First Amendment protections.
There is a version of this argument that makes sense. The government is not obligated to buy from anyone. Procurement decisions are executive functions, and companies do not have a constitutional right to a federal contract.
But that argument collides with the scope of what actually happened here. The Pentagon didn't decline to purchase a product. It classified a company as a supply chain risk, a designation that forced every Pentagon contractor to cut ties with Anthropic. That goes well beyond choosing a different vendor. It is an active measure designed to isolate a company from the defense marketplace entirely.
This case sits at the intersection of two issues conservatives should watch closely: the government's expanding use of regulatory and procurement tools to discipline private companies, and the question of who controls the most consequential technology of the coming decade.
Silicon Valley AI firms have earned conservative skepticism. Many of them have embedded ideological biases into their products, censored viewpoints, and built systems that reflect the political monoculture of San Francisco rather than the country they serve. Anthropic itself proposed contract limitations on mass surveillance and autonomous weapons applications, restrictions that raise obvious questions about whether the company is fit to serve national defense needs.
Those are legitimate concerns. But legitimate concerns require legitimate tools. If the government's position is that Anthropic's proposed contract terms were unacceptable, the answer is to reject those terms. If Anthropic's product poses genuine security risks, the answer is a transparent review process with due process protections. Slapping a supply chain risk label on a company and forcing the entire defense contractor ecosystem to cut ties is the kind of administrative overreach conservatives have spent decades fighting when it comes from the EPA, the DOJ, or any other agency.
The principle does not change because the target is a tech company that probably votes blue.
A parallel case is proceeding in a D.C. court, meaning Anthropic is fighting this battle on two legal fronts simultaneously. Judge Lin's preliminary injunction provides temporary relief, but the underlying dispute is far from resolved. The injunction signals that the court believes Anthropic is likely to succeed on the merits, which puts real pressure on the Pentagon to either justify its designation through proper channels or find another path forward.
The AI race is too important for the United States to lose ground over a procurement fight that a federal judge just called Orwellian. The government has every right to demand that its AI tools serve national security without ideological strings attached. It does not have the right to blacklist American companies through administrative fiat.
There is a difference between choosing your vendors and destroying one. Judge Lin seems to know which one this was.