Sam Altman got seriously worked up over Claude’s Super Bowl commercials

Anthropic’s Super Bowl commercial, one of four commercials for the AI lab that dropped on Wednesday, begins with the word “BETRAYAL” boldly splashed across the screen. The camera pans to a man earnestly asking a chatbot (apparently meant to represent ChatGPT) for advice on how to talk to his mother.
The bot, voiced by a woman, offers standard advice. Start by listening. Try a nature walk! Then it pivots to an ad for a (hopefully!) fake cougar-dating site called Golden Encounters. Anthropic ends the spot by saying that while ads are coming to AI, they won’t be coming to its chatbot, Claude.
Another spot features a young man looking for advice on how to build a six-pack. After he provides his height, age, and weight, the bot serves him an ad for height-enhancing insoles.
Anthropic’s ads are shrewdly aimed at OpenAI’s users, following that company’s recent announcement that ads are coming to ChatGPT’s free tier. And they quickly caused an uproar, prompting articles declaring that Anthropic “FUDs,” “skewers,” and “dunks on” OpenAI.
They were funny enough that even Sam Altman admitted on X that he laughed at them. But he obviously didn’t actually find them funny: they provoked him into writing a novella-sized screed that, among other things, called his rival “dishonest” and “authoritarian.”
In that post, Altman explains that the ad-supported tier is meant to shoulder the cost of providing free ChatGPT to its many millions of users. ChatGPT is still the most popular chatbot by a wide margin.
But the OpenAI executive insisted that the ads were “dishonest” for suggesting that ChatGPT would twist a conversation to include an ad (and a tasteless product, at that). “We obviously won’t be running ads the way Anthropic portrayed,” Altman wrote on social media. “We are not stupid, and we know that our users would reject that.”
Indeed, OpenAI has promised that ads will be clearly labeled and will never influence the conversation. But the company has also said it plans to make ads relevant to the conversation, which is precisely what Anthropic’s ads target. As OpenAI explained on its blog, “We plan to test ads below responses on ChatGPT if there is a suitable sponsored product or service based on your current conversation.”
Altman then went on to make other questionable assertions about his rival. “Anthropic is giving rich people an expensive product,” he wrote. “We also feel strongly that we need to bring AI to the billions of people who can’t afford subscriptions.”
But Claude has a free tier, too, with subscriptions at $0, $17, $100, and $200. ChatGPT’s tiers are $0, $8, $20, and $200. One could argue the two pricing ladders are roughly equivalent.
Altman also alleged in his post that “Anthropic wants to control what people do with AI.” He says it prevents companies it doesn’t like, such as OpenAI, from using Claude Code, and that Anthropic tells people what they can and can’t use AI for.
In fact, Anthropic’s entire marketing pitch from day one has been “responsible AI.” The company was founded by former OpenAI employees, after all, who said they were alarmed by the state of AI safety when they worked there.
Both chatbot companies have usage policies and AI guardrails, however, and both talk up AI safety. And while OpenAI allows ChatGPT to be used for erotica and Anthropic does not, OpenAI, like Anthropic, has decided that some content should be blocked, especially where mental health is concerned.
Yet Altman took this Anthropic-tells-you-what-to-do argument to an extreme when he accused Anthropic of authoritarianism.
“An authoritarian company won’t leave us alone, to say nothing of other obvious dangers. It is a dark path,” he wrote.
Invoking authoritarianism over a Super Bowl ad he considers deceptive is hyperbolic, to say the least. It makes little sense given the current state of the world, where protesters are being killed by agents of their own governments. While business rivals have been duking it out in ads since the beginning of time, Anthropic has clearly hit a nerve.



