SLAM module senses depth with ±0.17% RMS accuracy

Article By : Julien Happich

The module merges Occipital’s Structure Core embeddable depth sensor with Inuitive’s NU3000 depth processing chip.

Occipital and Inuitive have teamed up to develop a complete hardware and software module to enable efficient room-scale sensing and SLAM (simultaneous localisation and mapping) for mixed reality, augmented reality and virtual reality (MR/AR/VR) headsets and robotics.

The two companies have worked together to tightly integrate Structure Core and NU3000 to minimise host-system load while delivering an exceptional user experience. The solution senses depth from 30cm to greater than 5m, with accuracy as high as ±0.17% RMS at 1m (measured fit-to-plane), and reduces CPU load during 6-DoF tracking to less than 25% of a recent dual-core ARM CPU.
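The fit-to-plane figure quoted above is a standard way to characterise depth-sensor noise against a flat target: fit a plane to the point cloud and report the RMS residual as a percentage of the target distance. The sketch below illustrates the metric on synthetic data (the noise level and patch size are hypothetical, not Structure Core measurements):

```python
import numpy as np

# Illustrative fit-to-plane RMS sketch, not Occipital's actual test pipeline.
# Simulate a flat target patch at ~1 m with hypothetical Gaussian depth noise,
# fit a plane z = a*x + b*y + c by least squares, report RMS residual in %.
rng = np.random.default_rng(0)
n = 5000
x = rng.uniform(-0.2, 0.2, n)           # metres across the target patch
y = rng.uniform(-0.2, 0.2, n)
z = 1.0 + rng.normal(0.0, 0.0017, n)    # assumed ~1.7 mm noise at 1 m

A = np.column_stack([x, y, np.ones(n)])
coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
residuals = z - A @ coeffs
rms_pct = 100.0 * np.sqrt(np.mean(residuals**2)) / z.mean()
print(f"RMS error at ~1 m: {rms_pct:.2f}%")
```

With the assumed ~1.7mm noise, the computed figure lands near the ±0.17% RMS the module is specified at.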

System latency, from camera capture to fully-tracked pose, is just 10ms. Even at this performance level, power consumption for depth plus visible-light sensing is between 1.3W and 2.0W, depending on the configuration selected.

Along with Structure Core, Occipital also offers Bridge Engine, an advanced MR software engine and development platform. Available on multiple platforms, Bridge Engine allows manufacturers to accelerate development of market-ready devices with advanced position-tracked VR and mixed reality capabilities. Structure Core’s dual infrared cameras can be used for stereo depth sensing when ambient sunlight would otherwise blind robotic navigation systems that rely on time-of-flight or structured light depth sensors.
