We built a test device with two different IMUs to study how the quality of the IMU affects the accuracy of inside-out tracking (VISLAM) in the Spectacular AI SDK.
The device has two global-shutter camera sensors (OV9281) with wide-angle lenses (ArduCam B0223), and two inertial measurement units (IMUs):
* TDK/InvenSense MPU6050, a popular and inexpensive MEMS IMU
* Murata SCHA634-D03, a high-quality MEMS IMU
The cameras and IMUs are synchronized with a Raspberry Pi Pico microcontroller, and the data is recorded on an NVIDIA Jetson Xavier NX developer kit. For ground truth, we used an HTC VIVE tracker with four base stations.
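Because the VIVE tracker and the VISLAM output are sampled at different rates, an evaluation like this typically interpolates the ground-truth poses to the estimate's timestamps first. A minimal position-only sketch of that step (the function and toy data are illustrative, not part of the SDK; orientations would additionally need SLERP):

```python
import numpy as np

def interp_positions(query_t, gt_t, gt_pos):
    """Linearly interpolate ground-truth positions to query timestamps.

    gt_t: (M,) sorted timestamps in seconds, gt_pos: (M, 3) positions.
    """
    return np.column_stack(
        [np.interp(query_t, gt_t, gt_pos[:, i]) for i in range(3)]
    )

# Toy data: ground truth at ~100 Hz, tracker output at 30 FPS,
# moving along a straight line so interpolation is exact.
gt_t = np.linspace(0.0, 1.0, 101)
gt_pos = np.column_stack([gt_t, 2 * gt_t, np.zeros_like(gt_t)])
query_t = np.linspace(0.0, 1.0, 31)

aligned = interp_positions(query_t, gt_t, gt_pos)
print(aligned[15])  # midpoint of the line: (0.5, 1.0, 0.0)
```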
We then replayed two variants of the data through the Spectacular AI SDK: one using the data from the MPU6050 IMU and another using the data from the SCHA634 IMU. The same image data was used in both variants.
The test device is also capable of running VISLAM in real time. We used our default algorithm settings, which are lightweight enough to achieve this without any GPU acceleration: the results seen in the video could be produced in real time using only the two ARM Cortex-A57 cores on the Xavier. The stereo image data was processed at 30 FPS in VGA resolution and the IMU data at 600 Hz.
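At these rates, each 30 FPS frame interval brackets exactly 20 of the 600 Hz IMU samples, which is the kind of bookkeeping a VISLAM front end does when grouping IMU data between frames. A toy sketch using integer sample ticks (the helper is illustrative, not SDK code):

```python
import numpy as np

# Timestamps in units of one IMU sample period (1/600 s),
# so the arithmetic below is exact.
imu_t = np.arange(600)            # 600 IMU samples in one second
frame_t = np.arange(0, 600, 20)   # every 20th tick: 30 frames per second

def imu_samples_per_frame(frame_times, imu_times):
    """Count IMU samples falling in each inter-frame interval."""
    # searchsorted gives each frame time's insertion index in the IMU
    # timeline; successive differences are the per-interval counts.
    idx = np.searchsorted(imu_times, frame_times)
    return np.diff(idx)

counts = imu_samples_per_frame(frame_t, imu_t)
print(counts[:5])  # → [20 20 20 20 20]
```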
In the last part of the video with the augmented reality (AR) visualization, we modified the setup slightly for improved visual quality: the image data used as the AR background comes from an OAK-D Pro Wide device attached to the same rig.
#computervision #sensorfusion #artificialintelligence #slam #augmentedreality
MPU6050 vs. SCHA634 - Testing the effect of IMU quality on VISLAM accuracy