
Anthropic's Opus 4.6 Reshapes Coding as Pentagon Rift Grows

Anthropic launched Claude Opus 4.6 with agent teams and advanced coding capabilities, revealing that up to 90% of its code is AI-written, while a $200 million Pentagon contract hangs in the balance over military usage restrictions.

Redakcia

A New Model Amid High-Stakes Tensions

Anthropic released Claude Opus 4.6 on February 5, 2026, delivering its most capable AI model yet for coding, agentic workflows, and enterprise tasks. But the technical milestone arrived against a turbulent backdrop: a deepening dispute with the Pentagon over military use of its technology, and a high-profile public feud with OpenAI over advertising in AI chatbots.

Opus 4.6 and the AI-Written Codebase

The new model introduces agent teams -- groups of AI agents that can split large tasks into parallel jobs, communicate with each other, and share task lists. Opus 4.6 also features a 1-million-token context window in beta, improved debugging and code review, and deeper integration with tools like Microsoft PowerPoint. On the GDPval-AA benchmark, which measures performance on economically valuable knowledge work, Opus 4.6 outperforms OpenAI's GPT-5.2 by roughly 144 Elo points.
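The agent-team pattern described above -- one large task split into parallel subtasks, with results collected on a shared task list -- can be sketched in ordinary Python. This is an illustrative toy only; the function and field names (`agent_worker`, `run_agent_team`, `shared_task_list`) are invented for this sketch and do not reflect Anthropic's actual API.

```python
from concurrent.futures import ThreadPoolExecutor

def agent_worker(subtask: str) -> str:
    # Stand-in for one agent handling a single subtask;
    # a real system would call a coding agent here.
    return f"done: {subtask}"

def run_agent_team(task: str, subtasks: list[str]) -> dict:
    # Shared task list the "team" writes its results into.
    shared_task_list = {"task": task, "results": []}
    # Fan the subtasks out to parallel workers, then merge.
    with ThreadPoolExecutor(max_workers=len(subtasks)) as pool:
        for result in pool.map(agent_worker, subtasks):
            shared_task_list["results"].append(result)
    return shared_task_list

team = run_agent_team(
    "refactor payment module",
    ["update tests", "migrate schema", "review diffs"],
)
print(team["results"])
```

The point of the pattern is the fan-out/fan-in shape: independent subtasks run concurrently, and the shared list is the only coordination point.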

Perhaps more striking than the model itself is what Anthropic revealed about its own development process. The company disclosed that 70 to 90 percent of its codebase is now AI-generated, depending on the team. Boris Cherny, head of Claude Code, said he has not written any code himself in over two months. When Anthropic launched Cowork -- a general computing tool -- in January, four engineers built it in 10 days, with most of the code written by Claude itself. Anthropic's product leaders describe this as the dawn of "vibe working" -- a step beyond vibe coding, where professionals can realize complex ideas without deep technical expertise.

The Pentagon Showdown

While Anthropic celebrates technical breakthroughs, a $200 million contract with the Department of Defense, signed in July 2025, is under serious threat. According to Axios, the Pentagon is pushing AI companies to allow military use of their tools for "all lawful purposes." Anthropic has refused to fully comply, insisting that mass surveillance of Americans and fully autonomous weapons remain off limits.

Defense Secretary Pete Hegseth is reportedly "close" to severing ties with Anthropic and designating it a "supply chain risk" -- a classification typically reserved for foreign adversaries that would force any Pentagon contractor to cut business with the company. The dispute intensified after reports that Claude was used in the U.S. military operation to capture then-Venezuelan President Nicolas Maduro, raising questions about how Anthropic's usage policies were applied in practice.

The Super Bowl Salvo

Adding another front to an eventful month, Anthropic ran satirical Super Bowl advertisements attacking OpenAI's decision to introduce ads inside ChatGPT. The spots -- titled "Deception," "Betrayal," "Treachery," and "Violation" -- carried the tagline: "Ads are coming to AI. But not to Claude." One ad depicted an AI therapist pivoting mid-session to pitch a dating app.

The campaign worked. CNBC reported an 11% jump in Claude's daily active users and a 6.5% spike in site traffic following the game. OpenAI CEO Sam Altman called the ads "clearly dishonest," and OpenAI officially unveiled ChatGPT's advertising features hours after the spots aired.

What It All Means

Anthropic finds itself at the center of three defining questions in AI: how far autonomous coding can go, where to draw the line on military applications, and whether AI assistants should serve users or advertisers. Its willingness to risk a $200 million defense contract over ethical red lines, while simultaneously demonstrating that AI can write most of the code powering a frontier AI company, positions Anthropic as both a technical leader and a lightning rod in the intensifying debate over AI governance.
