Coby Adcock’s Scout AI raises $100 million to train its battle models. We visited its training camp.

At a US military base in central California, four-seater vehicles roam the hilly roads. This is a training exercise, but not for the humans in the cars: it’s an effort to train AI models to operate in conflict zones.
The military-style ATVs belong to Scout AI, a startup founded in 2024 by Coby Adcock and Collin Otis that calls itself a “security lab.” The company said Wednesday that it has raised a $100 million Series A round, led by Align Ventures and Draper Associates, following a $15 million seed round in January 2025.
Scout invited TechCrunch on an exclusive tour of its training operations at an unnamed military base.
The company is building an AI model it calls “Fury” to operate and command military equipment, first for logistical support but eventually for autonomous weapons. CTO Collin Otis compares the process, which builds on existing LLMs, to training a soldier.
“They start when they’re 18 years old, and sometimes they even start after college, so you want to start with that level of intellectual foundation,” Otis told TechCrunch. “It’s helpful to start with someone who’s already invested and say, hey, what do I have to do to teach this thing to be an amazing military AGI, versus just being a super smart AGI?”
Scout has received military technology development contracts worth up to $11 million from DARPA, the Army Applications Laboratory, and other Department of Defense clients. It is one of 20 private companies whose technology the US Army’s 1st Cavalry Division is using during its regular training cycle at Ft. Hood in Texas, with the expectation that the unit will carry proven products when it next deploys in 2027.
At Scout’s test site, the rubber meets the dirt. There, the company’s team, led by ex-military personnel, puts the vehicles through their paces in simulated missions.
While autonomous vehicles are beginning to appear in cities around the world, they operate there in highly structured and regulated environments. Driving autonomously on unmapped or off-road routes is another challenge entirely. Otis, a former executive at the autonomous trucking company Kodiak, said he was inspired to start Scout when he realized that the software he was helping to build wasn’t intelligent enough to operate in an unpredictable war zone.
A new approach to autonomy
Scout is betting on a new autonomy technology: vision-language-action models, or VLAs, which build on LLMs and are used to control robots. First demonstrated by Google DeepMind in 2023, the technique powers robots at startups like Physical Intelligence and Figure AI, the humanoid robotics company led by Adcock’s brother, Brett.
Adcock sits on Figure’s board, and he says that experience convinced him of the opportunity to bring more intelligence to the growing fleet of autonomous military vehicles. His brother introduced him to Otis, and the two began applying the latest in AI to military problems.
“If I gave you a drone controller right now and strapped a headset on you, you could learn to fly that thing in minutes,” Otis said. “You’re really just learning how to connect your prior knowledge to these little joysticks. It’s not a huge leap. That’s the way to think about VLAs and why they’re so powerful.”
Indeed, I had the opportunity to drive one of Scout’s ATVs on its rutted roads, and the terrain is difficult: steep hills, turns in loose sand, tracks that disappear, confusing intersections. I’m not an experienced ATV rider, but I held my own on my first attempt (if I do say so myself). That’s the kind of common sense the company wants in its models, which have been training on these ATVs for just six weeks, after the program started out with stock ATVs.
I also rode in an ATV under autonomous control, and I could feel the difference: it drives faster than one might expect for rider comfort. The team points out how the vehicles keep to the right on wide roads but take the middle of narrow ones, like the human drivers that trained them. When confused, the vehicle slowed down to consider its next move, something that happened a few times as it carried us around a 6.5 km loop before returning to base.
VLAs are still too new for any company to have used them in an operational situation, but “the technology is good enough to do that field testing with the military to see how it can work effectively for the US military,” said Stuart Young, a former DARPA program manager who worked on autonomous ground vehicles. And like other companies in the space, Scout complements its agents with deterministic programming and other flavors of AI in its full autonomy stack.
Young left DARPA this month to join Field AI after managing a program called RACER, which called on companies to develop high-speed, autonomous off-road vehicles, seeding investment in the space much as the agency’s earlier Grand Challenge did for self-driving cars. Two competitors in the space, Field AI and Overland AI, came out of that program, and Scout later participated as an additional performer.
The first applications of ground autonomy, according to Scout executives and military experts, will be autonomous resupply: moving water or ammunition to remote observation posts, or convoys in which a crewed lead truck is followed by six to ten autonomous vehicles, saving precious human labor for more important tasks. Brian Mathwich, an infantry officer who now works at Scout, recalled a recent exercise in Alaska where he led a resupply team in complete darkness and wished he could have called on autonomous vehicles to assist.

Adding intelligence to the military motor pool
Scout sees itself primarily as a software company, building the intelligence layer for military equipment. It does not aim to make autonomous vehicles, but to build on top of them.
Adcock expects the startup’s first widely adopted product to be “Ox,” the company’s command-and-control software, integrated with ruggedized computing hardware (GPUs, communications, cameras). It is intended to let individual soldiers orchestrate multiple drones and autonomous ground vehicles with quick commands: “Go to this area and look for enemy forces.”
Getting that software to work, however, requires training on real vehicles. Hence the Foundry, the company’s name for its training range on the base. There, drivers spend eight-hour shifts putting the ATVs through their paces, generating data that a reinforcement learning pipeline then uses to improve the model. At times, base security forces have asked the company’s drivers to turn their ATVs around.
One hypothesis Scout is exploring is that VLAs will allow this relatively small dataset, combined with training data from simulation, to produce a fully capable driving agent. Although the vehicle already seems comfortable on the trails, for example, it is not yet ready to go fully off-road.
Scout also trains drones for reconnaissance and as weapons, giving them intelligence with vision-language models, a multimodal variant of LLMs.
Scout is working on a concept in which drone squadrons fly with a large “quarterback” platform that commands multiple smaller craft. In one mission, drones would search the terrain for hidden enemy tanks and attack them, perhaps without human intervention. Otis argues that the alternative in that situation may be indirect artillery fire, which is less accurate than drone strikes.
Although autonomous weapons are a flashpoint in the politics of defense technology, experts note that the concept is old: heat-seeking missiles and mines have been in use for years. The real question is how the weapons are controlled, Jay Adams, a retired US Army captain who leads Scout’s operations team, told TechCrunch.
He notes that the company’s armed drones can be programmed to attack only threats in a specific area, or only with human authorization. He also argues that autonomous weapons platforms are less likely to fire out of fear, the way an eighteen-year-old soldier might.
VLAs also offer the promise of better targeting. Scout says its models are trained on a specific set of military data to prepare them for, say, encountering an enemy tank on a reconnaissance mission. Lt. Col. Nick Rinaldi, who oversees the Army Applications Laboratory’s work with Scout, says that although autonomous targeting is difficult and unlikely to be used outside restricted areas in the near term, VLAs’ ability to reason about threats makes them a promising technology that should be investigated.
Adams says the promise of drones that can identify their own targets is key to future wars: although Russia’s invasion of Ukraine has generated enormous interest in drone warfare, he believes that individual operators piloting individual UAVs won’t be enough for the US to counter large numbers of inexpensive systems threatening its forces.
Combating anti-military sentiment in AI

Like many defense startups, Scout wears its mission on its sleeve, and its executives openly criticize companies reluctant to supply their technology to the government. Google, for example, has reportedly dropped out of a Pentagon competition to develop autonomous drone swarm systems, which Scout is also pursuing.
“The AI people don’t want to work with the military,” Otis told TechCrunch, referring to Anthropic’s clash with the Pentagon over its usage policies. “None of them are open to agents using one-way attack drones, or agents using missile systems.”
For now, though, Scout builds its agents on top of existing LLMs, although it declined to say which ones. Otis says it has agreements with “well-known hyperscalers” to provide pre-trained intelligence for the Scout base model. He also declined to say whether it uses open-weight models, such as those offered by Chinese companies; many companies that rely on AI reasoning build on such models to operate at lower cost than frontier-lab models from Anthropic or OpenAI.
Scout expects to address this by building its own model from the ground up in the coming years, and the founders say most of the new funding will go toward those training and compute costs. Indeed, Otis speculates that Scout could beat the current leaders to AGI, because its model will be constantly interacting with the real world.
“There’s an argument in the AGI community that you can only get so smart by learning from the Internet, and that more intelligence comes from connecting to the world,” Otis said.
Does that mean Adcock is competing against his brother’s army of humanoid robots at Figure? No, Otis says, but “we can scale very quickly because our customer has assets,” he said, referring to the Pentagon.



