Software Defined Sensor Fusion for Autonomous Systems
Session details:
Autonomous systems depend on an expanding array of sensors—vision, depth, lidar, radar, tactile, and more—to perceive and act in complex environments. Yet traditional fusion pipelines are often bound to fixed hardware, limiting adaptability and slowing innovation. This session presents a software‑defined approach to sensor fusion that decouples perception logic from hardware, enabling rapid iteration, scalable integration of new sensor modalities, and consistent performance across diverse platforms. We’ll explore how modern software‑centric fusion frameworks allow humanoids, mobile robots, and next‑generation aerial and ground systems to combine heterogeneous sensing inputs into a coherent world model and a reliable understanding of their own state.
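To make the idea of decoupling perception logic from hardware concrete, here is a minimal, hypothetical sketch (not code from the session or any specific framework): modalities plug into an abstract sensor interface, and a hardware-agnostic fusion step combines their estimates by inverse-covariance weighting, a simplified stand-in for a full filter. All names here (Sensor, Measurement, fuse, FakeLidar, FakeRadar) are invented for illustration.

# Hypothetical sketch: a software-defined fusion layer where perception logic
# depends only on an abstract Sensor interface, not on specific hardware.
# All names are illustrative, not taken from the session.
from dataclasses import dataclass
from typing import Protocol
import numpy as np

@dataclass
class Measurement:
    value: np.ndarray       # estimated quantity, e.g. a 2-D position [x, y]
    covariance: np.ndarray  # uncertainty of that estimate

class Sensor(Protocol):
    """Any modality (vision, depth, lidar, radar, tactile, ...) participates
    by implementing read(); the fusion code never sees the hardware."""
    def read(self) -> Measurement: ...

def fuse(measurements: list[Measurement]) -> Measurement:
    """Inverse-covariance (information-form) fusion of independent estimates.

    Each sensor contributes in proportion to its confidence, regardless of
    which device or modality produced the measurement.
    """
    info = sum(np.linalg.inv(m.covariance) for m in measurements)
    fused_cov = np.linalg.inv(info)
    fused_val = fused_cov @ sum(
        np.linalg.inv(m.covariance) @ m.value for m in measurements
    )
    return Measurement(fused_val, fused_cov)

# Usage: two simulated modalities observing the same 2-D position.
class FakeLidar:
    def read(self) -> Measurement:
        return Measurement(np.array([1.00, 2.05]), np.diag([0.01, 0.01]))

class FakeRadar:
    def read(self) -> Measurement:
        return Measurement(np.array([1.10, 1.90]), np.diag([0.10, 0.10]))

if __name__ == "__main__":
    sensors: list[Sensor] = [FakeLidar(), FakeRadar()]
    estimate = fuse([s.read() for s in sensors])
    print("fused position:", estimate.value)

Because the fusion logic depends only on the Sensor interface, a new modality can be integrated by writing one adapter class, with no change to the perception pipeline itself, which is the adaptability argument the abstract makes.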