
driveblocks lands €3.5M to scale off-road AI
driveblocks raises €3.5M to advance mapless, certifiable AI perception for off-road vehicles in agriculture, construction, mining and defense worldwide
TL;DR
Munich-based driveblocks raised €3.5M (Pre-Series A) in a FORWARD.one-led round to push certifiable, mapless perception for off-road autonomy. The stack fuses camera and LiDAR data into a real-time 3D view and can complement GNSS-based driving. Funding will support product development, safety certification and OEM production agreements, with 2026 pilots across agriculture, construction and defense.
Off-road autonomy is still the hard mode
Autonomy has made huge strides on highways and in well-structured urban environments, but the moment you leave paved roads the rules change. In the field, a vehicle can’t assume crisp lane markings, predictable curbs, or consistent signage; instead it has to interpret uneven ground, shifting surfaces, vegetation that intrudes into the drivable corridor, and obstacles that don’t look like anything from a standard on-road dataset. Those conditions are exactly why off-road autonomy often lags behind on-road driver assistance in maturity, even though demand is strong in industries that can benefit the most—agriculture, construction, mining, and defense.
A major blocker is reliability under real-world variability. Dust, fog, rain, glare, shadows, and low light can degrade perception; terrain itself can confuse models because slopes and ruts change what “drivable” means from one meter to the next. Even if a system “works most of the time,” that is not enough when you need to certify safe behavior and deploy at scale, because edge cases are not rare off-road—they are the job. That’s why the industry keeps returning to the same core question: can you build perception that stays stable when the environment is unstructured, the sensor data is messy, and maps are incomplete or outdated?
This is where the latest funding momentum around off-road perception is interesting for anyone tracking industrial AI. In our work at the ai world organisation, we see a clear pattern: investment is flowing to companies that can turn perception into a certifiable, product-ready module—something OEMs and integrators can actually buy, integrate, and support across vehicle lines, rather than a one-off demo that only works on a sunny test track. That practical, “production-first” mindset is also why topics like industrial autonomy, robotics safety, and perception under adverse conditions keep showing up across the ai world organisation events calendar and the programming conversations shaping the ai world summit and ai conferences by ai world.
What driveblocks is building: perception that works without perfect maps
Munich-based driveblocks is positioning itself around a perception-centric approach to off-road autonomy, focused on converting sensor streams into a real-time, three-dimensional understanding of the world around the vehicle. The company describes its stack as a perception platform built on “Physical AI,” structured around three steps—detect, understand, and navigate—so perception outputs can translate into local driving behavior rather than stopping at raw detections.
At a technical level, driveblocks emphasizes multi-modal perception: combining camera and LiDAR inputs and fusing them into a 3D model that can support navigation decisions in real time. The company also positions the system for environments where you can’t rely on pristine maps, explaining that off-road needs a purpose-built tech stack because conditions like dust, vegetation, and hilly terrain are not exceptions—they’re routine. In other words, the target is not “autonomy that looks good on a pre-mapped route,” but autonomy that can keep perceiving and planning when the ground itself is unpredictable.
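To make the multi-modal idea concrete, here is a minimal, hypothetical sketch of one common camera–LiDAR fusion step: projecting LiDAR points into the camera image so each 3D point can inherit the class of a 2D detection box. This is our illustration of the general technique, not driveblocks' actual pipeline; the pinhole matrix `K`, the box format, and the function names are all assumptions.

```python
import numpy as np

# Illustrative camera-LiDAR fusion sketch (not driveblocks' pipeline):
# label 3D LiDAR points using 2D camera detections.

def project_points(points_xyz, K):
    """Project 3D points (camera frame) to pixel coords via a pinhole model."""
    pts = np.asarray(points_xyz, dtype=float)
    uvw = (K @ pts.T).T               # homogeneous image coords, shape (N, 3)
    return uvw[:, :2] / uvw[:, 2:3]   # divide by depth -> (u, v) pixels

def label_points(points_xyz, K, boxes):
    """Assign each point the class of the first 2D box containing its
    projection; points outside every box are labeled 'unknown'.
    boxes: list of (u_min, v_min, u_max, v_max, class_name)."""
    labels = []
    for u, v in project_points(points_xyz, K):
        label = "unknown"
        for (u0, v0, u1, v1, cls) in boxes:
            if u0 <= u <= u1 and v0 <= v <= v1:
                label = cls
                break
        labels.append(label)
    return labels
```

In a real stack the labeled points would then feed a 3D terrain or obstacle model; the sketch only shows the geometric association step.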
A practical detail that matters for adoption is integration. driveblocks explicitly frames its offering as modular and plug-and-play, designed to combine with a customer’s existing technology rather than forcing a full rip-and-replace. It also states that it can integrate with existing autonomy approaches such as GNSS-based driving to enhance capabilities, which is important because many industrial deployments start with assisted functions and gradually expand autonomy as confidence and safety evidence grow. For industrial customers, that gradual path—adding perception and safety features first, then stepping toward more autonomous behaviors—can be the difference between a pilot that stalls and a program that goes into production.
The product framing on safety is equally deliberate. driveblocks highlights obstacle detection paired with a “safe stop” behavior, where a dynamic safety area is calculated and the vehicle can execute a stop if an obstacle becomes relevant. It also describes capabilities like terrain-aware path following, obstacle-aware behavior, and even multi-vehicle convoy/formation concepts, which signals a direction toward more sophisticated off-road operational scenarios beyond single-vehicle point-to-point driving. From an industry perspective, that scope matters: off-road autonomy becomes far more valuable when it can handle interactions—workers nearby, machines moving around each other, and variable terrain—without requiring a perfect external infrastructure layer.
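The "dynamic safety area" idea can be sketched in a few lines: the stopping envelope grows with speed, and any obstacle inside it triggers a safe stop. This is our simplified illustration of the general concept, not driveblocks' published algorithm; the deceleration, reaction-time, and margin values are assumed placeholders.

```python
# Hedged sketch of a speed-dependent safety area (illustrative values).

def stopping_distance(speed_mps, decel_mps2=3.0, reaction_s=0.5):
    """Worst-case stop distance: travel during reaction time + braking
    distance v^2 / (2a)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2.0 * decel_mps2)

def should_safe_stop(speed_mps, obstacle_distances_m, margin_m=1.0):
    """True if any detected obstacle lies inside the dynamic safety
    area (stopping distance plus a fixed margin)."""
    envelope = stopping_distance(speed_mps) + margin_m
    return any(d <= envelope for d in obstacle_distances_m)
```

At 6 m/s with these placeholder parameters the envelope is 10 m, so an obstacle at 8 m would trigger a stop while one at 20 m would not; a production system would of course use validated vehicle dynamics rather than constants like these.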
The €3.5M raise and why investors care
driveblocks has raised €3.5 million in a Pre-Series A round led by FORWARD.one, with participation from Bayern Kapital, rethink Ventures, and angel investor Joachim Drees. The round brings the company’s total funding to €7.5 million. Funding numbers alone don’t explain product viability, but they do show that investors see a near-term commercialization path—especially in industrial autonomy, where budgets can be meaningful and ROI can be clear when automation reduces downtime, improves safety, or addresses labor gaps.
According to the company’s stated plans, the new capital is aimed at product development, progress toward safety certification, and kicking off production agreements with OEMs and system integrators. That’s a notable trio of priorities because it reflects the shift from “build the model” to “prove it can be deployed responsibly.” Safety certification work is slow and evidence-heavy, and it often forces engineering teams to create clearer interfaces, more interpretable behavior, better monitoring, and robust fail-safes—all of which improve product quality even outside formal certification.
The company also points to a hiring and go-to-market push in 2026, including adding capabilities across commercial, operations, and R&D in Munich, alongside production pilots with OEMs in agriculture, construction, and defense. For the off-road autonomy market, pilots are where perception startups either earn trust or discover their real constraints. A perception stack can look strong in curated tests, but production pilots force it to survive the “boring disasters” of the field: sensor occlusion, repeated exposure to dust, unexpected object types, shifting work zones, and operational edge cases that weren’t in the training set.
From a broader European industrial-tech lens, this raise also sits inside a strategic narrative about robotics capability and supply chain independence. The company notes that commercialization and certification would support Europe’s goals around robotics independence while also responding to labor shortages. Whether you frame it as productivity, resilience, or workforce reality, the direction is consistent: industrial AI is being asked to leave the lab and show measurable value in harsh environments.
From autonomous racing to industrial-grade reliability
driveblocks was founded by Dr. Alexander Wischnewski and Dr. Stephan Matz, following their experience in autonomous racing at the Technical University of Munich. Their background includes winning a $1 million prize in the Indy Autonomous Challenge at speeds around 280 km/h and placing second in a multi-vehicle race in Las Vegas—achievements that highlight deep capability in high-performance autonomy under pressure. The key question, of course, is how that expertise translates to off-road perception, where the problem is less about top speed and more about robust understanding in messy environments.
The company’s rationale is that off-road and commercial autonomy have fundamentally different characteristics than consumer automotive autonomy, and that typical on-road assumptions don’t transfer cleanly. It positions its platform around camera and LiDAR perception, sensor fusion, and local navigation that is purpose-built for off-road conditions. It also emphasizes a hybrid approach that blends AI with more geometrically interpretable algorithms, with the explicit goal of making certification feasible in the near term while improving over time as data grows.
That hybrid framing is important because it reflects a pragmatic view of safety: purely end-to-end AI can be powerful, but certification and incident analysis often benefit from components that can be inspected, bounded, and tested with clearer guarantees. driveblocks also stresses embedded deployment—running on production-grade devices—because industrial autonomy can’t depend on a fragile “server-in-the-loop” setup when connectivity is limited and latency matters. In short, the pitch is not “the biggest model,” but “the most dependable perception module that fits inside real machines.”
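The hybrid pattern described above can be illustrated with a toy gate: a learned drivability score is only trusted when an inspectable geometric check also passes, so the safety argument can lean on the bounded, testable component. This is our assumed simplification, not driveblocks' actual architecture; the slope bound, threshold, and function names are hypothetical.

```python
import math

# Illustrative hybrid gate: AI score AND geometric bound must both agree.

MAX_SLOPE_DEG = 15.0   # hypothetical certifiable slope bound

def slope_ok(rise_m, run_m, max_slope_deg=MAX_SLOPE_DEG):
    """Geometric check: terrain slope within a fixed, verifiable bound."""
    return math.degrees(math.atan2(rise_m, run_m)) <= max_slope_deg

def cell_drivable(model_score, rise_m, run_m, threshold=0.7):
    """Hybrid decision: a terrain cell counts as drivable only when the
    learned score clears the threshold AND the geometric bound holds."""
    return model_score >= threshold and slope_ok(rise_m, run_m)
```

The appeal for certification is that the geometric branch can be exhaustively tested against its stated bound, even while the learned branch keeps improving as data grows.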
Another signal of ecosystem thinking is driveblocks’ stated engagement with the Autoware Foundation, positioned as a way to support modular, compatible autonomy stacks and reduce vendor lock-in. Autonomy adoption in industry often happens through ecosystems—OEMs, integrators, sensor vendors, and software platforms—so interoperability and standards can become commercial advantages, not just technical ideals. If perception outputs can plug into established pipelines, it lowers integration friction and helps customers move faster from prototype to production.
What comes next and how it connects to AI World
driveblocks’ near-term direction is gradual commercialization: starting with use cases that deliver immediate value and carry limited safety requirements, then expanding toward more demanding autonomy as confidence and validation grow. This sequencing is a common pattern in industrial AI because it lets customers capture ROI early (for example, safer operations or better situational awareness) while building the evidence base needed for higher-autonomy modes. Over time, the companies that win in this space are usually the ones that treat deployment as a product discipline—updates, monitoring, safety processes, and customer integration—not just a one-time model release.
For industry watchers and practitioners, the larger story is that off-road perception is becoming a stand-alone category with its own requirements, buyers, and certification pathways. The “mapless” theme is especially relevant because industrial environments change constantly: a mine face evolves, a construction site reorganizes weekly, and fields look different across seasons. That is why perception that can operate without relying on perfect maps—and that can still behave safely—has become a focal point for both startups and OEM programs.
At the same time, this is exactly the kind of applied AI narrative we spotlight through the ai world organisation events ecosystem: where AI meets operations, safety, embedded compute, and real deployment constraints. When we shape sessions for the ai world summit, the ai world summit 2025 / 2026 edition themes, and our broader ai conferences by ai world, we look for case studies that connect funding announcements to real engineering trade-offs—sensor fusion strategies, validation in harsh conditions, certifiable architectures, and the business reality of working with OEMs.
If your organization is building industrial autonomy, perception modules, robotics safety frameworks, or next-gen sensor stacks, stories like driveblocks’ raise are a useful marker of where the market is heading: toward production-ready, certifiable perception that can survive dust, vegetation, uneven terrain, and the operational chaos that defines real off-road work.