
No one has a good plan for how AI companies should work with governments

As Sam Altman found out Saturday night, it’s a tough time to be doing work for the US government. Around 7 p.m., OpenAI’s chief executive announced that he would be taking public questions on X, an effort to defend his company’s decision to take over the Pentagon contract that Anthropic had just vacated.

Most of the questions focused on OpenAI’s willingness to participate in mass surveillance and automated killing — the activities Anthropic had drawn red lines around in its negotiations with the Pentagon. Altman was largely deferential to the public sector, saying it was not his role to set national policy.

“I strongly believe in the democratic process,” he wrote in response, “that our elected leaders have the power, and that we should all respect the Constitution.”

An hour later, he admitted to his surprise that many people seemed to disagree. “There’s more open debate than I thought there would be,” Altman said, “about whether democratically elected governments or unelected private corporations should have more power. I had thought that was something people agreed on.”

It’s a telling moment for both OpenAI and the technology industry as a whole. In his Q&A, Altman leaned on a posture common in the defense industry, where military leaders and industry partners alike are expected to defer to civilian leadership.

But what the exchange really revealed is that, as OpenAI transitions from a successful consumer startup into part of the national security infrastructure, the company seems ill-equipped to handle its new responsibilities.

Altman’s public town hall came at a fraught time for his company. The Pentagon had recently blacklisted OpenAI competitor Anthropic for insisting on contract restrictions around surveillance and autonomous weapons. Hours later, OpenAI announced that it had won the same contract Anthropic had given up. Altman portrayed the deal as a quick way to end the standoff — and it certainly paid off. But he seemed unprepared for how much blowback it would generate among the company’s users and employees.


OpenAI has been working with the US government for years – but not like this. When Altman made his case to Congressional committees in 2023, for example, he still largely followed the social media playbook. He talked up the technology’s power to change the world while acknowledging its risks and engaging earnestly with lawmakers — the perfect combination for exciting investors while fending off regulation.

Less than three years later, that approach no longer works. AI is now too powerful, and its capital requirements too great, for companies to avoid deeply sensitive entanglements with the government. What’s surprising is how unprepared both sides seem to be.

The biggest controversy right now concerns Anthropic itself, and US Defense Secretary Pete Hegseth’s plan, announced Friday, to designate the lab as a procurement risk. The threat hangs over every conversation like a loaded gun. As former Trump aide Dean Ball wrote over the weekend, such a designation would cut Anthropic off from its hardware and hosting partners, effectively destroying the company. It would be an unprecedented move against an American company, and while it may be overturned in court, it would wreak havoc in the meantime and send shockwaves through the industry.

As Ball laid out the sequence of events, Anthropic was fulfilling an existing contract under terms that had been established years earlier — until the administration insisted on changing those terms. It’s not something that would fly in the private sector, and it sends a chilling message to other government vendors.

“Even if Secretary Hegseth walks back his broad threat against Anthropic, a great deal of damage has been done,” Ball wrote. “Many companies, political actors, and others will have to operate under the assumption that this is the mentality that now rules.”

It’s a direct threat to Anthropic, but it’s also a big problem for OpenAI. The company is already under pressure from employees to maintain red lines of its own. At the same time, right-wing media will be watching for any sign of OpenAI becoming an unreliable political ally. In the middle of it all is the Trump administration, doing everything in its power to make the situation as difficult as possible.

OpenAI may never have set out to be a defense contractor, but its sheer ambition has forced it into the same game as Palantir and Anduril. Dealing with the Trump administration means choosing sides. There are no apolitical actors here, and winning some friends will mean alienating others. It remains to be seen how high a price OpenAI will pay, whether in lost business or lost employees, but it’s unlikely to come out unscathed.

It may seem strange that this disruption comes at a time when more prominent tech investors hold positions of influence in Washington than ever before, but most of them seem perfectly content with the situation. Among Trump-affiliated venture capitalists, Anthropic has long been seen as having curried favor with the Biden administration in ways that could hurt the broader industry — a view underscored by Trump adviser David Sacks’ reaction to the ongoing dispute. Now that the shoe is on the other foot, few seem willing to stand up for a broad principle of free enterprise.

This is a difficult position for any company to be in – and while politically aligned players may benefit in the short term, they will be just as exposed when the political winds inevitably shift. There’s a reason why, for decades, the defense industry has been dominated by slow-moving, heavily regulated conglomerates like Raytheon and Lockheed Martin. Acting as the Pentagon’s industrial arm gave them the cover they needed to stay out of partisan fights, remaining focused on technology without a forced reset every time the White House changed hands.

Today’s challengers may move faster than their predecessors – but they aren’t built for that kind of long haul.
