📖 **Source**: TechCrunch
📂 **Category**: AI, Robotics, Exclusive
At a US military base in central California, four-seat all-terrain vehicles cruise the hillside trails. This is a training exercise, but not for the people in the vehicles: it’s an effort to train AI models to enter conflict zones.
The autonomous military ATVs are powered by Scout AI, a startup founded in 2024 by Colby Adcock and Collin Otis that calls itself the “frontier laboratory of defense.” The company said Wednesday it has raised a $100 million funding round led by Align Ventures and Draper Associates, following a $15 million seed round in January 2025.
Scout invited TechCrunch for an exclusive tour of its training operations at a military base that it asked us not to name.
The company is building an AI model it calls “Fury” to operate and command military assets, first for logistics support and soon for autonomous weapons. CTO Collin Otis compares this work, which builds on existing LLMs, to training soldiers.
“They start when they’re 18, and sometimes they start after college, so you have to start with that basic level of intelligence,” Otis told TechCrunch. “It’s helpful to start with someone who’s already made an investment and then say, hey, what do I need to do to teach this thing to be an amazing military general, versus just being a broad general intelligence?”
Scout has secured military technology development contracts totaling $11 million from organizations such as DARPA, the Army Applications Laboratory, and other Department of Defense customers. It is one of 20 companies whose technology is being tried by the U.S. Army’s 1st Cavalry Division during its regular training rotation at Ft. Hood in Texas, with the division expected to take proven products with it when it deploys in 2027.
For Scout, rubber meets dirt in the hilly terrain of the base, where the company’s operations team, led by former soldiers, puts the vehicles through their paces in simulated missions.
While self-driving cars are appearing in more cities around the world, they operate in structured environments with road markings and traffic rules. Operating autonomously on unmarked, off-road trails is another challenge entirely. Otis, a former executive at self-driving trucking company Kodiak, said he was motivated to start Scout when he realized the system he helped build there wasn’t smart enough to operate in an unpredictable war zone.

A new approach to autonomy
Scout is turning to a newer technology for autonomy: vision-language-action models, or VLAs, which build on LLMs and are used to control robots. First introduced by Google DeepMind in 2023, the approach has since spread to robotics startups like Physical Intelligence and Figure AI, the humanoid robotics company led by Adcock’s brother Brett.
Adcock sits on Figure’s board, and he says that experience convinced him of the opportunity to bring broader intelligence to the Army’s growing fleet of autonomous vehicles. His brother introduced him to Otis, who had consulted for Figure, and the two set about applying the latest AI techniques to military problems.
“If I handed you a drone controller right now and strapped a headset on you, you could learn to fly this thing in minutes,” Otis said. “You’re really just learning how to connect your prior knowledge to these two little joysticks. It’s not a big jump. That’s the way to think about VLAs and why they’re such an unlock.”
In fact, I had the opportunity to drive one of Scout’s ATVs around the rough trails, and the terrain was challenging: steep hills, loose sand in the turns, disappearing trails, and confusing intersections. I’m not an experienced ATV driver, but I acquitted myself respectably on my first attempt (if I do say so myself). This is the kind of general intelligence the company wants in its models, which at that point had been trained on these ATVs for just six weeks, after initial training on civilian ATVs to begin the process.
I also rode in an ATV under autonomous control and felt the difference: it accelerates harder than a human driver mindful of rider comfort would. The operations team pointed out how the vehicles hug the right side of wider lanes but stay in the middle of narrow ones, just like the human drivers who trained them. And when disoriented, they slow down abruptly to consider their next move, something that happened several times as the vehicle carried us on a 6.5 km loop before returning to base.
Although VLAs are new enough that no company has yet deployed them in an operational environment, “the technology is good enough to trial in the field with soldiers to see where it is most effective for U.S. forces,” said Stuart Young, a former DARPA program manager who has worked on ground-vehicle autonomy. As with other autonomy companies, Scout’s full stack also includes deterministic systems and other flavors of AI to complement the capabilities of its agents.
Young left DARPA this month to join Field AI after running a program called RACER, which asked companies to build high-speed autonomous off-road vehicles, helping seed this space much as the agency’s Grand Challenge boosted self-driving cars. Two competitors in the space, Field AI and Overland AI, emerged from that program, and Scout participated as a later addition.
The first applications of ground autonomy, according to Scout executives and military technologists, will be automated resupply: carrying water or ammunition to remote observation posts, or running convoys in which one crewed truck leads six to ten autonomous vehicles, saving valuable human labor for more important missions. Brian Matwich, an active-duty infantry officer who is also a military fellow at Scout, recalls a recent training exercise in Alaska where he led a resupply convoy in complete darkness and wished autonomous vehicles could have helped.

Adding intelligence to the Army’s vehicle fleet
Scout sees itself primarily as a software company, building the intelligence layer for military machines. It does not intend to make self-driving vehicles itself, but rather build on top of them.
Adcock expects the startup’s first widely adopted product to be “Ox,” the company’s command-and-control software, which is bundled with hardware (graphics processing units, communications gear, cameras). Its purpose is to let individual soldiers direct multiple autonomous drones and ground vehicles with quick commands: “Go to this waypoint and observe enemy forces.”
Making this software work, however, requires training on real vehicles. Hence the Foundry, the company’s name for its training range on the military base. There, drivers spend eight-hour shifts putting ATVs through their paces while a system records where they need to take over; those interventions are then used to improve the model through reinforcement learning. The base’s commander has even asked the company’s ATVs to pitch in on security patrols.
One hypothesis Scout is testing is that VLAs will allow this relatively limited dataset, combined with training data from simulation, to yield a fully capable driving agent. While the vehicle seemed comfortable on rough trails, for example, it is not yet ready to go fully off-road.
Scout also trains with drones, both for reconnaissance and as weapons, giving them intelligence using vision-language models, a multimodal LLM variant.
Scout is working on a system in which groups of munitions drones fly with a larger “quarterback” platform that provides extra computational resources to pilot them. In one envisioned mission, the drones would search a geographic area for hidden enemy tanks and attack them, possibly without human intervention. Otis stresses that the alternative in this scenario might be indirect artillery fire, which is imprecise compared with drone strikes.
While autonomous weapons are a flashpoint in defense technology policy, experts point out that the concept is old: heat-seeking missiles and mines have been in use for decades. The question for technologists is how the weapons are controlled, Jay Adams, a retired U.S. Army captain who leads Scout’s operations team, told TechCrunch.
He points out that the company’s drones can be programmed to attack threats only within a specific geographic area, or only with human confirmation. He also says autonomous weapons platforms won’t fire out of fear, as an 18-year-old soldier might.
VLAs also offer the promise of better targeting. Scout says its models are trained on a specific set of military data to prepare them, for example, for encountering an enemy tank while on a resupply mission. Although automated targeting is difficult and unlikely to be used outside constrained environments in the near term, VLAs’ ability to reason about threats makes them a promising technology to investigate, says Lt. Col. Nick Rinaldi, who oversees Scout’s work at the Army Applications Laboratory.
Adams says drones that can pick out their own targets are key to future warfare: while the Russian invasion of Ukraine has generated enormous interest in drone warfare, he believes that humans piloting individual drones won’t be enough for the United States to counter large numbers of low-cost unmanned systems threatening American forces.
A mission to counter anti-military sentiment

Like many defense startups, Scout wears its mission on its sleeve, and its executives freely criticize companies that are reluctant to put their technology in the government’s hands. Google, for example, has reportedly withdrawn from a Pentagon competition to develop control systems for autonomous drone swarms, a capability Scout is also working on.
“AI people don’t want to work for the military,” Otis told TechCrunch, referring to Anthropic’s dispute with the Pentagon over usage terms. “None of them are open to operating agents on one-way attack drones, or operating agents on missile systems.”
However, Scout does use existing LLMs as the base for its agents, though it declines to identify which ones. Otis says it has agreements with “well-known hyperscalers” to provide pre-trained intelligence for Scout’s base model. He also declined to say whether it uses open-weight models, such as those released by Chinese companies; many companies that rely on AI inference use these models because they cost less to run than models from frontier labs like Anthropic or OpenAI.
Scout expects to address this by building its own model from the ground up in the coming years, and the founders say much of its new capital will go toward training and compute costs. In fact, Otis muses that Scout could outpace the current leaders in the race to artificial general intelligence (AGI), because its model will constantly interact with the real world.
“There’s an argument in the AGI community that you can’t get to intelligence just by reading the internet, that most intelligence comes from interacting with the world,” Otis said.
Does this mean Adcock is competing with his brother’s army of humanoid robots at Figure? No, says Otis, but “we can scale much faster because our customer has the assets,” referring to the Pentagon.
🕒 **Posted on**: April 29, 2026
