Anthropic Sues Department of Defense Over Supply-Chain Risk Designation

Anthropic filed a federal lawsuit against the US Department of Defense and other federal agencies on Monday, challenging the Pentagon’s designation of the AI company as a “supply-chain risk.”

The Pentagon formally sanctioned Anthropic last week, capping a weeks-long, publicly aired disagreement over the limits Anthropic places on the use of its generative AI technology for military applications such as autonomous weapons.

“We do not believe this action is legally sound, and we see no choice but to challenge it in court,” Anthropic CEO Dario Amodei wrote in a blog post on Thursday.

The lawsuit, which was filed in a federal court in California, requested that a judge reverse the designation and stop federal agencies from enforcing it. “The Constitution does not allow the government to wield its enormous power to punish a company for its protected speech,” Anthropic said in the filing. “Anthropic turns to the judiciary as a last resort to vindicate its rights and halt the Executive’s unlawful campaign of retaliation.”

The AI startup, which develops a suite of AI models called Claude, is facing the possibility of losing hundreds of millions of dollars in annual revenue from the Pentagon and the rest of the US government. It also may lose the business of software companies that incorporate Claude into services they sell to federal agencies. Several Anthropic customers have reportedly said they are pursuing alternatives due to the Defense Department’s risk designation.

Amodei wrote that the “vast majority” of Anthropic’s customers will not have to make changes. The US government’s designation “plainly applies only to the use of Claude by customers as a direct part of contracts with the” military, he said. General use of Anthropic technologies by military contractors should be unaffected.

The Department of Defense, which also goes by the Department of War, and the White House did not immediately respond to requests for comment about Anthropic’s lawsuit.

Attorneys with expertise in government contracting say Anthropic faces a difficult battle in court. The rules that authorize the Department of Defense to label a tech company as a supply-chain risk don’t allow for much in the way of an appeal. “It’s 100 percent in the government’s prerogative to set the parameters of a contract,” says Brett Johnson, a partner at the law firm Snell & Wilmer. The Pentagon, he says, also has the right to express that a product of concern, if used by any of its suppliers, “hurts the government’s ability to effectuate its mission.”

Anthropic’s best chance of success in court could be proving it was singled out, Johnson says. Soon after Defense Secretary Pete Hegseth announced that he was designating Anthropic a supply-chain risk, rival OpenAI announced it had struck a new contract with the Pentagon. That could be instrumental to Anthropic’s legal argument if the company can demonstrate it was seeking terms similar to those of the ChatGPT developer.

OpenAI said its deal included contractual and technical means of ensuring its technology would not be used for mass domestic surveillance or to direct autonomous weapons systems. It added that it opposed the action against Anthropic and did not know why its rival could not reach the same deal with the government.

Military Priority

Hegseth has prioritized military adoption of AI technologies; posters recently seen in the Pentagon show him pointing and read, “I want you to use AI.” The dispute with Anthropic kicked off in January after Hegseth ordered several AI suppliers to agree that the department was free to use their technologies for any lawful purpose.

Anthropic, which is the only company currently providing AI chatbot and analysis tools for the military’s most sensitive use cases, pushed back. It contends that its technologies are not yet capable enough to be used for mass domestic surveillance of Americans or fully autonomous weapons. Hegseth has said Anthropic wants veto power over judgments that should be left to the Defense Department.
