Unfortunately, there is no publicly available documentation or technical paper that specifically details the sensor fusion routines, and the underlying mathematics, applied to the IMU and camera data on the VOXL flight board. This information is typically treated as proprietary by the manufacturer and the research institutions involved.
However, some resources can help you understand the general principles and approaches used in sensor fusion for attitude estimation and motion tracking:
- PX4 Autopilot Documentation:
The PX4 Autopilot is an open-source autopilot system commonly used in research and hobbyist drones. It implements sensor fusion algorithms for a range of sensors, including IMUs, magnetometers, GPS, and optical flow. While not specific to the VOXL flight board, the PX4 documentation provides a good overview of sensor fusion concepts and of the underlying Kalman filter and complementary filter techniques (minimal sketches of both appear after this list). You can find the relevant documentation here: https://docs.px4.io/main/en/
- Research Papers on Sensor Fusion:
Several research papers have been published on the topic of sensor fusion for attitude estimation and motion tracking. These papers often discuss different algorithms and their mathematical formulations.
"Sensor Fusion for Attitude Estimation: A Tutorial" by Mahony et al. (2008): https://arxiv.org/pdf/1912.13077
"A Comparative Study of Kalman Filter and Extended Kalman Filter for Attitude Estimation" by Julier and Uhlmann (1997): https://www.researchgate.net/publication/269254094_Extended_Kalman_Filter_vs_Error_State_Kalman_Filter_for_Aircraft_Attitude_Estimation
"Real-Time Attitude Estimation from Gyroscopes and Accelerometers" by Madgwick et al. (2011): https://arxiv.org/pdf/2002.10205