Mix multiple sensors for position estimation
-
Hi all,
I am trying to add different sensors to improve the local position estimate (i.e. LOCAL_POSITION_NED). However, every change I make to the EKF2 configuration (using the EKF2_EV_* parameters and similar) seems to be ignored by the autopilot. I would also like to understand how the vision hub shares information with the PX4 control stack. Is there any reference?
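For reference, this is the kind of configuration I am attempting (parameter names from PX4 v1.14; the values shown are only examples, entered via the MAVLink console in QGroundControl):

```shell
# Example EKF2 external-vision settings (PX4 v1.14 parameter names)
param set EKF2_EV_CTRL 15      # bitmask: fuse EV horiz. pos, vert. pos, velocity, yaw
param set EKF2_HGT_REF 3       # use vision as the height reference
param save                     # persist the parameters across reboots
```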
Regards,
-
@jocacace13 Yes, the source code is available here: https://gitlab.com/voxl-public/voxl-sdk/services/voxl-vision-hub/-/blob/master/src/vio_manager.c?ref_type=heads
A high level overview is here: https://docs.modalai.com/configure-extrinsics/
-
@Moderator Thanks for the quick reply. Do you think that, inspecting that source code and studying the High level architecture, I am also able to completely substitute the vision hub input, to feed the autopilot with a slam-generated odometry?
-
@jocacace13 Yes. A better approach would be to use voxl-qvio-server as a template and replace the qvio algorithm with your own.
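One practical detail if you go this route: PX4's local frame is NED (North-East-Down) with an FRD (Forward-Right-Down) body frame, while most SLAM stacks output ENU/FLU (the ROS convention), so a replacement odometry source typically has to convert frames. A minimal sketch in plain Python (the helper names are mine, not from voxl-qvio-server):

```python
# Frame conversions needed when feeding SLAM odometry to PX4.
# ENU: x=East, y=North, z=Up   ->   NED: x=North, y=East, z=Down
# FLU: Forward-Left-Up (ROS)   ->   FRD: Forward-Right-Down (PX4)

def enu_to_ned(v):
    """v is an (x, y, z) tuple in ENU; returns the same vector in NED."""
    x, y, z = v
    return (y, x, -z)

def flu_to_frd(v):
    """v is an (x, y, z) tuple in the FLU body frame; returns FRD."""
    x, y, z = v
    return (x, -y, -z)

if __name__ == "__main__":
    # A point 1 m East, 2 m North, 3 m up in ENU...
    print(enu_to_ned((1.0, 2.0, 3.0)))  # -> (2.0, 1.0, -3.0)
```

The same swap-and-negate pattern applies to velocities; orientations additionally need the corresponding quaternion rotation.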
-
Dear @Moderator, many thanks again for the reply. The code structure is more or less clear to me now. However, I am struggling with something I cannot understand.
I am connected with QGroundControl to the VOXL2 board running PX4 v1.14. I opened the MAVLink Inspector pane to listen to LOCAL_POSITION_NED. Even if I stop or disable voxl-qvio-server, the position is still received from the autopilot. How is this possible? Who is sending this data?
-
@jocacace13 PX4 sends out its local position (as the MAVLink LOCAL_POSITION_NED message).
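Even with voxl-qvio-server stopped, EKF2 keeps running on the remaining sensors (IMU, barometer, etc.), so it continues to publish an estimate. To check what the estimator is doing, the standard PX4 v1.14 console commands below can help (run from the MAVLink console in QGroundControl; output details vary by build):

```shell
# Inspect the running EKF2 instance and which sources it is fusing
ekf2 status

# Print the current local position estimate from the uORB bus
listener vehicle_local_position
```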