
For the first time, Pentagon tells why Anthropic’s Claude AI models have been designated as 'national security risk' for US military
Defense Department’s chief technology officer Emil Michael has outlined for the first time why Anthropic’s Claude AI models have been designated a ‘national security risk’. Speaking on CNBC’s Squawk Box, Michael said the ‘different policy preferences’ embedded in Claude through its constitution and training could ‘pollute’ the Pentagon’s supply chain. “We can’t have a company that has a different policy preference that is baked into the model through its constitution, its soul, its policy preferences, pollute the supply chain so our warfighters are getting ineffective weapons, ineffective body armor, ineffective protection,” Michael explained.

Michael added that the decision to label the AI giant a ‘national security risk’ was not meant to punish Anthropic, noting that the company has a “huge commercial business” and only a small fraction of its revenue comes from U.S. government contracts. He also dismissed claims that the Pentagon has actively discouraged companies from using Anthropic outside of defense supply chains, calling such reports “rumours”.

Anthropic is the first American company to be designated a supply chain risk

Anthropic was recently designated a supply chain risk, a label previously reserved for foreign adversaries. The move requires defense contractors and vendors to certify that they are not using Claude in Pentagon-related work. In response, Anthropic sued the Trump administration, calling the designation “unprecedented and unlawful” and warning that hundreds of millions of dollars in contracts were at risk.

Why Anthropic has sued the government

The Pentagon wanted Anthropic to remove hard limits on deploying its AI for fully autonomous weapons and for domestic surveillance of American citizens. Anthropic refused, saying that current AI models are not reliable enough for autonomous weapons and that using them in that way would be dangerous. It also called domestic surveillance a violation of fundamental rights.

When those negotiations broke down, Defense Secretary Pete Hegseth formally designated Anthropic a national security supply-chain risk. President Donald Trump then directed the government to stop working with Anthropic altogether, with a six-month phase-out announced for existing contracts.

The Defense Department has been equally firm, saying that US law – not a private company – should determine how America defends itself, and that the military needs full flexibility to use AI for “any lawful use.” The Pentagon warned that Anthropic’s self-imposed restrictions could endanger American lives.

Transition plan in place at Pentagon

Michael acknowledged that the Pentagon cannot “just rip out” Anthropic’s technology overnight, emphasising that a transition plan is in place. “This is not just Outlook where you could delete it from your desktop,” he said, underscoring the complexity of replacing AI systems integrated into defense operations.
About the Author: TOI Tech Desk

The TOI Tech Desk is a dedicated team of journalists committed to delivering the latest and most relevant news from the world of technology to readers of The Times of India. TOI Tech Desk’s news coverage spans a wide spectrum across gadget launches, gadget reviews, trends, in-depth analysis, exclusive reports and breaking stories that impact technology and the digital universe. Be it how-tos or the latest happenings in AI, cybersecurity, personal gadgets, platforms like WhatsApp, Instagram, Facebook and more; TOI Tech Desk brings the news with accuracy and authenticity.
