Five builders of the AI economy explain where the wheels could come off

Earlier this week, five people involved in all layers of the AI supply chain sat down at the Milken Global Conference in Beverly Hills, where they talked to this editor about everything from chip shortages to orbital data centers to whether the entire underlying tech architecture is wrong.
On stage with TechCrunch: Christophe Fouquet, CEO of ASML, the Dutch company that makes the extreme ultraviolet lithography equipment without which modern chips would not exist; Francis deSouza, COO of Google Cloud, who is helping oversee one of the biggest infrastructure bets in business history; Qasar Younis, founder and CEO of Applied Intuition, a $15 billion autonomous-vehicle AI company that started in simulation and has since moved into defense; Dmitry Shevelenko, chief business officer of Perplexity, an AI-native search-to-agents company; and Eve Bodnia, a quantum physicist who left academia to challenge a foundation much of the AI industry takes for granted with her startup, Logical Intelligence. (Meta’s former chief AI scientist, Yann LeCun, signed on as founding chairman of the startup’s research board earlier this year.)
Here’s what the five had to say:
The bottlenecks are real
The AI boom is running into the hard limits of the physical world, and the constraints start much lower in the stack than many realize. Fouquet was the first to say it, describing the enormous pace at which chip production must grow, while expressing his “firm belief” that despite all that effort, “for the next two, three, maybe five years, the market will be supply-limited,” which means the hyperscalers – Google, Microsoft, Amazon, Meta – will not get all the chips they’re willing to pay for.
DeSouza emphasized how big – and how fast-growing – the demand is, reminding the audience that Google Cloud’s revenue exceeded $20 billion last quarter, up 63%, while its backlog – revenue committed but not yet recognized – nearly doubled in a single quarter, from $250 billion to $460 billion. “The demand is real,” he said calmly.
For Younis, the constraint lies elsewhere. Applied Intuition builds autonomous systems for cars, trucks, drones, mining equipment, and defense vehicles, and its bottleneck is not silicon but data – the kind that can only be collected by sending machines into the real world and watching what happens. “You have to find it in the real world,” he said, and no amount of synthetic simulation fully closes that gap. “It will be a long time before you can fully train models in a purely synthetic world.”
The power problem is also real
If chips are the first bottleneck, power is the one right behind them. DeSouza confirmed that Google is exploring data centers in space as a serious answer to its energy problem. “You get a lot more energy,” he noted. Of course, even in orbit, it’s not easy. DeSouza noted that space is a vacuum, which eliminates convection, leaving radiation as the only way to shed heat to the environment (a process slower and harder to engineer than the air and liquid cooling systems data centers rely on today). But the company still considers it a legitimate option.
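A back-of-envelope calculation shows why radiation-only cooling is so demanding. This sketch uses the Stefan–Boltzmann law with illustrative numbers (the emissivity and radiator temperature are assumptions, and real orbital designs must also account for absorbed solar and Earth flux):

```python
# Back-of-envelope: radiator area needed to cool an orbital data center.
# In vacuum there is no convection, so the only heat path is radiation:
# P = emissivity * sigma * A * T^4 (Stefan-Boltzmann law).
# Numbers here are illustrative assumptions, not a real design.

SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/(m^2 * K^4)
EMISSIVITY = 0.9     # assumed radiator emissivity
T_RADIATOR = 300.0   # assumed radiator temperature, K (~27 C)

def radiator_area(power_watts):
    """Radiator area needed to shed a given heat load at the assumed temperature."""
    flux = EMISSIVITY * SIGMA * T_RADIATOR**4   # W radiated per m^2
    return power_watts / flux

area = radiator_area(1e6)  # a modest 1 MW compute load
print(f"~{area:,.0f} m^2 of radiator per megawatt")
```

At these assumed values the radiator sheds only a few hundred watts per square meter, so even a single megawatt of compute needs on the order of a few thousand square meters of radiator – a concrete sense of why engineers call this harder than terrestrial air or liquid cooling.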
The most striking argument deSouza made was, arguably, about efficiency through integration. Google’s strategy of integrating its full AI stack — from custom TPUs to models and agents — is paying dividends in flops per watt (more computation per unit of power) that companies buying off-the-shelf components can’t replicate, he suggested. “Gemini running on TPUs uses a lot less power than any other configuration,” because the chip designers know what’s coming in the model before it ships, he said.
Fouquet made a similar point later in the conversation: “Nothing is free. The industry is in a strange period right now – it’s investing so much, driven by strategic demand. But more compute means more power, and more power has a cost.”
A different kind of intelligence
While the rest of the industry debates scale, architecture, and efficiency within the large language model paradigm, Bodnia is building something very different.
Her company, Logical Intelligence, is built on so-called energy-based models (EBMs), a class of AI that does not predict the next token in a sequence but instead tries to learn the rules underlying the data – in a way that she argues is closer to how the human brain actually works. “Language is the connection between my brain and yours,” she said. “Imagination itself has nothing to do with any language.”
Her largest model has up to 200 million parameters – compared with hundreds of billions in the top LLMs – and she says it runs thousands of times faster. More importantly, it is designed to update its knowledge as the data changes, rather than requiring retraining from scratch.
In chip design, robotics, and other domains where a system needs to obey natural laws rather than linguistic patterns, she says EBMs are a natural fit. “When you drive a car, you don’t look for patterns in any language. You look around you, understand the laws of the world around you, and make a decision.” It’s an interesting debate, and one likely to attract more attention in the coming months as the AI field begins to question whether scale alone is enough.
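The core distinction can be sketched in a few lines. This toy example illustrates the general energy-based idea only – scoring whole candidate answers with an energy function and picking the one the rules of the problem make cheapest – and says nothing about Logical Intelligence’s actual system (the rule and the energy function here are invented for illustration):

```python
# Toy sketch of the energy-based idea: rather than predicting the next
# token, assign an energy to each (input, candidate answer) pair and
# return the lowest-energy answer. The "rule" is a made-up example.

def energy(x, y):
    """Zero energy when y is consistent with the rule y = (x1 + x2) % 2."""
    x1, x2 = x
    return 0.0 if y == (x1 + x2) % 2 else 1.0

def infer(x, candidates=(0, 1)):
    """Inference is a search for the minimum-energy answer, not a token guess."""
    return min(candidates, key=lambda y: energy(x, y))

print(infer((1, 0)))  # -> 1: the answer that satisfies the rule
print(infer((1, 1)))  # -> 0
```

The design consequence is the one Bodnia emphasizes: the model’s knowledge lives in the energy function, so when the underlying rules change you can adjust that function rather than retrain a sequence predictor from scratch.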
Agents, guardrails, and trust
Shevelenko spent much of the conversation explaining how Perplexity has evolved from a search product into what he now calls a “digital worker.” The Perplexity Computer, its newest offering, is designed not as a tool for the knowledge worker to use, but as a worker the knowledge worker directs. “Every day you wake up and have 100 employees on your team,” he said of the opportunity. “What will you do to make the most of them?”
It’s a compelling pitch, and it raises obvious questions about control, so I asked about them. His answer: granularity. Business administrators can specify not only which connectors and tools an agent can access, but whether those permissions are read-only or allow writes – a crucial distinction when agents operate inside business systems. When Comet, Perplexity’s agentic browser, takes action on behalf of a user, it pauses and asks for authorization first. Some users find the friction annoying, Shevelenko said, but he sees it as essential, especially after joining Lazard’s board, where he found himself newly sympathetic to the CISOs protecting a 180-year-old brand built entirely on client trust. “Granularity is the foundation of security hygiene,” he said.
Sovereignty, not just security
Younis offered what may have been the panel’s most geopolitically charged observation: physical AI and national sovereignty are entangled in ways that digital AI never was.
The internet spread as an American technology and met resistance only at the application layer – the Ubers and DoorDashes – where its offline effects were visible. Physical AI is different. Autonomous vehicles, security drones, mining equipment, agricultural machinery – these manifest in the real world in ways governments cannot ignore, raising questions about security, data collection, and who ultimately controls the systems operating within national borders. “Almost always, each country says: we don’t want this intelligence inside our borders, controlled by another country.” Some countries, he told the crowd, would sooner have robots today than nuclear weapons.
Fouquet framed it differently. China’s AI progress is real — the release of DeepSeek earlier this year sent something close to panic through parts of the industry — but that progress is capped at the hardware layer. Without access to EUV lithography, Chinese chipmakers can’t manufacture the most advanced semiconductors, and models built on older hardware run at a compounding disadvantage no matter how much better the software gets. “Today, in the United States, you have data, you have compute, you have chips, you have talent. China is doing a great job at the top of the stack, but it lacks the other elements below,” said Fouquet.
It’s a generational question
Towards the end of our panel, someone in the audience asked an obviously uncomfortable question: will all this affect the next generation’s capacity for critical thinking?
The responses were optimistic, as you’d expect from people who have staked their careers on this technology. DeSouza quickly pointed to the magnitude of the problems these powerful tools might allow humanity to tackle: neurological diseases whose biological mechanisms we don’t understand, greenhouse gas emissions, grid infrastructure decades behind where it needs to be. “This should take us to another level of innovation,” he said.
Shevelenko made a subtler point: entry-level work may disappear, but building something independently has never been easier. “[For] anyone with Perplexity Computer . . . the constraint is your curiosity and your agency.”
Younis drew a sharp distinction between knowledge work and physical work. He pointed out that the average American farmer is 58 years old and that labor shortages in mining, long-haul trucking, and agriculture are chronic and growing – not because wages are too low, but because people don’t want those jobs. In those domains, physical AI is not displacing willing workers. It is filling a void that already exists and looks set to deepen.



