Hello everyone.
I am struggling with the Visual Inertial Odometry on my VOXL2 board together with MAVROS, and I cannot tell whether it is working correctly. This is the behavior I do not understand.
I am using the tracking camera to estimate the position and orientation of my UAV. The resulting pose seems correct and the quality is good as well. I also start MAVROS, since I want to integrate my application with the ROS framework.
I am comparing the value published on the /qvio/pose topic with the one on /mavros/local_position/pose, which should be the position as estimated by the UAV itself.
Results:
- Orientation: I am mainly looking at the yaw. The yaw tracks the IMU data directly, without taking the value from /qvio/pose into account. There is an offset due to the initial rotation of the UAV, and the sign of the rotation is inverted.
- Position: there is an offset. It is not very large, but the position seems to be fused with other sources.
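To make the sign inversion concrete, this is the plain-Python helper I use to convert the quaternions from both topics into yaw angles for comparison (the example quaternion values are made up, not from my logs):

```python
import math

def quat_to_yaw(x, y, z, w):
    """Extract yaw (rotation about Z) in radians from a quaternion (x, y, z, w)."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# Hypothetical example: a pure yaw rotation of +1.0 rad ...
qvio_yaw = quat_to_yaw(0.0, 0.0, math.sin(0.5), math.cos(0.5))
# ... versus the same rotation with inverted sign, -1.0 rad
mavros_yaw = quat_to_yaw(0.0, 0.0, -math.sin(0.5), math.cos(0.5))

print(qvio_yaw, mavros_yaw)  # prints 1.0 -1.0 (opposite signs)
```

With values echoed from /qvio/pose and /mavros/local_position/pose plugged in, the two yaws come out with opposite signs plus a constant offset.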
Is this behavior expected, and if so, why?
Regards.