www.ptreview.co.uk
08/2025
Advancing 3D Sensor Fusion with Au-Zone
NXP and Au-Zone introduce the Raivin, a production-ready 3D sensor fusion module combining radar, vision, and AI for robust real-time edge perception.
www.nxp.com

Dusty construction sites. Fog-covered fields. Crowded warehouses. Heavy rain. Uneven terrain. What does it take for an autonomous machine to perceive and navigate challenging real-world environments like these, reliably and in real time? Together with Au-Zone Technologies, we set out to build a perception system that not only performs under operational stress but is also fast to integrate and easy to scale.
The result: the Raivin module, a 3D perception system that fuses radar sensing, vision processing and edge AI inference into a single, production-ready unit. Designed for operational complexity, the Raivin enables machines to process and act on complex environmental data in real time.
With pre-trained AI perception models and a unified hardware-software stack, the Raivin simplifies the deployment of intelligent perception, marking a step forward in bringing scalable autonomy to the edge.
Challenge: Overcoming Traditional Perception Limits
The push toward autonomy and physical AI is outpacing the readiness of traditional perception solutions. Many still rely on single-sensor stacks that falter in complex, unpredictable environments.
- Camera-only systems degrade in low visibility and poor lighting
- LiDAR is precise but costly and power-hungry
- Radar is reliable in poor weather but lacks resolution for precise object classification
Together with Au-Zone, we set out to solve this problem by co-developing an edge AI sensor fusion system designed to deliver high-confidence, low-latency perception.
Vision delivers rich semantic understanding: object detection, classification and segmentation. Radar adds continuous depth and motion tracking, even through obscured environments. Fused with AI inference, these signals form a synchronized, context-aware 3D model of the world. This enables real-time decision-making with a high degree of confidence.
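One common way to fuse these modalities, sketched below, is to project radar returns into the camera frame and associate them with vision detections, so each classified object carries radar-derived range and radial velocity. This is a minimal illustration of the general technique, not Au-Zone's actual fusion pipeline; the camera intrinsics, detection format, and all numbers are hypothetical.

```python
import numpy as np

# Hypothetical pinhole-camera intrinsics (fx, fy in pixels; cx, cy principal point).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project_radar_points(points_xyz, K):
    """Project radar returns (x right, y down, z forward, in metres)
    into pixel coordinates with a pinhole model."""
    pts = np.asarray(points_xyz, dtype=float)
    uv = (K @ pts.T).T             # homogeneous image coordinates
    return uv[:, :2] / uv[:, 2:3]  # perspective divide by depth

def fuse(detections, radar_pts, radar_vel, K):
    """Attach range and radial velocity to each camera detection whose
    bounding box contains at least one projected radar return."""
    uv = project_radar_points(radar_pts, K)
    fused = []
    for label, (u0, v0, u1, v1) in detections:
        inside = ((uv[:, 0] >= u0) & (uv[:, 0] <= u1) &
                  (uv[:, 1] >= v0) & (uv[:, 1] <= v1))
        if inside.any():
            rng = float(np.linalg.norm(radar_pts[inside], axis=1).min())
            vel = float(radar_vel[inside].mean())
            fused.append((label, rng, vel))
    return fused

# One pedestrian box and one radar return ~10 m ahead, closing at 1.5 m/s.
dets = [("person", (300.0, 200.0, 400.0, 480.0))]
pts  = np.array([[0.1, 0.5, 10.0]])
vels = np.array([-1.5])
print(fuse(dets, pts, vels, K))
```

A production system would add temporal tracking and uncertainty-aware association, but the core idea is the same: vision supplies the "what", radar supplies the "where and how fast".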
But even with the benefits of multi-sensor systems, autonomous applications are still limited by how quickly they can respond to real-world stimuli. Latency matters, and these workloads can’t tolerate delays caused by cloud processing or slow sensor refresh rates.
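The latency argument is easy to make concrete with back-of-the-envelope arithmetic: the distance a machine travels "blind" is simply speed times perception latency. The figures below are illustrative assumptions, not measured Raivin numbers.

```python
def blind_distance(speed_mps, latency_s):
    """Distance travelled (metres) before a perception result arrives."""
    return speed_mps * latency_s

# A warehouse robot moving at 2 m/s (~7 km/h):
cloud = blind_distance(2.0, 0.250)  # assumed ~250 ms cloud round trip
edge  = blind_distance(2.0, 0.030)  # assumed ~30 ms on-device inference
print(cloud, edge)  # the cloud path leaves roughly 8x more uncovered distance
```

Under these assumptions the cloud path leaves half a metre uncovered per decision cycle, versus a few centimetres on-device, which is why the processing has to live at the edge.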
Only by combining radar and vision with edge AI processing in one unit could we deliver a system fast enough, reliable enough and robust enough to meet the demands of next-generation autonomy.
Solution: A Full-Stack System for Edge Deployment
From the start, the Raivin was a co-development effort: a full-stack design process in which every layer, from silicon to software, was built collaboratively to deliver unified performance at the edge.
The Raivin module is a commercially available AI perception solution that combines low-level radar-cube and vision data processing with edge AI inference in a single, deployable unit.

Au-Zone’s Edge AI Software Stack and Development Tools
The Raivin module was developed using Au-Zone’s EdgeFirst Studio™, a platform that simplifies multimodal data collection, AI-assisted labeling, training, validation and deployment, without requiring deep ML expertise.
Within EdgeFirst Studio, the EdgeFirst Perception Stack helps developers accelerate sensor fusion design through pre-trained models and a workflow-optimized environment. Teams can label datasets, fine-tune models and validate performance within a single toolchain, significantly reducing development time and lowering the barrier to entry.
The result is a tightly integrated 3D perception system optimized for low latency and low power, and ready for deployment in edge environments.
Results: Trusted Performance in Real-World Conditions
The Raivin was put to the test in a live demo replicating the types of environmental stressors autonomous machines face every day, from weather and motion to visual obstructions:
- In fog, radar maintained object detection, tracking and spatial awareness
- In glare, the fusion engine maintained accurate object tracking
- In simulated rainfall, radar and AI worked together to retain accurate perception
- In cluttered scenes, radar tracked velocity, while AI and vision segmented and classified people, equipment and obstacles in real time
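The behavior in these tests reflects a general principle of fusion: as one modality degrades, its contribution should be discounted rather than trusted blindly. The toy weighting scheme below illustrates that principle only; it is not Au-Zone's algorithm, and the visibility factor and confidence values are invented for the example.

```python
def fused_confidence(cam_conf, radar_conf, visibility):
    """Blend per-sensor confidences, discounting the camera as optical
    visibility drops (visibility: 0 = total whiteout, 1 = clear)."""
    w_cam = visibility  # camera weight shrinks with visibility
    w_rad = 1.0         # radar is largely weather-independent
    return (w_cam * cam_conf + w_rad * radar_conf) / (w_cam + w_rad)

print(fused_confidence(0.9, 0.8, 1.0))  # clear conditions: both sensors count
print(fused_confidence(0.2, 0.8, 0.1))  # fog: radar dominates the estimate
```

In fog, a fixed equal weighting would let the camera's unreliable low score drag the fused estimate down; discounting it keeps the radar's valid detection in charge.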
What’s Next: Simplifying Sensor Fusion at Scale
Historically, sensor fusion has been complex, requiring fragmented tools, custom pipelines and deep domain expertise. The Raivin changes that.
With pretrained AI models integrated into Au-Zone’s EdgeFirst Studio, engineers can implement radar and vision integration without starting from scratch. The software supports dataset management, training and validation, enabling fast iteration with minimal coding or ML infrastructure. It can also be used as a data collection platform to explore custom solutions for different objects and working environments.
The ready-built hardware solution is optimized for edge AI processing, eliminating concerns about custom implementations and hardware tradeoffs.
The Raivin is already commercially available, giving OEMs a validated 3D perception system that can scale. Whether deployed in mobile robots, precision agriculture or fleet vehicles, the Raivin module enables fast integration of AI-powered perception through a single platform.