Implementing PX4 avoidance in mission mode using the VOXL and QGC



  • I have set up the collision prevention system using the documentation given. Still, I see that there is no documentation for obstacle avoidance in mission mode, particularly for the local planner node PX4 created. I have been able to fly a drone with PX4 avoidance using the Jetson TX2, but I do not know how to set this up on the VOXL and get all the necessary dependencies installed for the local planner to function. For example, I wasn't able to install the pcl-ros package on the VOXL. Is it possible to implement the mission-mode version of obstacle avoidance using the VOXL as the companion computer with the stereo pair, tracking camera, and the Flight Core? If not, are there workarounds or different nodes I could use, at least for the stereo camera pair? I'd still like to use the algorithm implemented in the PX4 avoidance repo if possible.



  • You should be able to use the open-source PX4 avoidance stack, but it may require a somewhat atypical architecture. For large ROS projects like this, we recommend running Ubuntu on VOXL in Docker.

    The base Yocto layer should run roscore and expose the VIO and depth-from-stereo modules.

    The Ubuntu OS inside the Docker container should then subscribe to the appropriate hardware-layer topics and run the PX4 avoidance stack.
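
    As a rough sketch, assuming the Docker image already has ROS installed (the launch file names and image name below are placeholders, not actual ModalAI names):

        # Yocto layer: one roscore plus the camera/VIO/DFS nodes
        roscore &
        roslaunch voxl_cam_ros stereo.launch &     # placeholder launch name
        roslaunch snap_vio vio.launch &            # placeholder launch name

        # Start the Ubuntu container on the host network so its nodes can
        # reach the Yocto roscore directly
        docker run -it --rm --net=host --privileged <your-ubuntu-ros-image> bash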

    We should have this better documented, but keep asking on this thread and we'll try to get you going. Shouldn't be any major blockers, but a few tweaks are probably necessary.



  • Thank you, I'll follow this and will let you know if I do it successfully or if there are any roadblocks in the way!



  • Alright, my first roadblock! I was able to set up a Docker image running Ubuntu, and I have installed voxl-cam-ros, snap_vio, and the DFS node, and the catkin workspace built with seemingly no problems. My question is what to do next. I could continue with the PX4 avoidance README and try SITL, but I'm hesitant because that involves installing the PX4 firmware and running a Gazebo simulation, which seems too heavy for the VOXL in both storage and graphics. The second option would be to run on hardware. Part of running the PX4 local planner requires specifying the camera serial number along with the vendor name before running "generate_launchfile.sh", which adds several parameters so that a launch file specific to that camera is generated. The second option seems like the more likely one, but I'm not sure.

    So my question is two-fold: should I try to run this in SITL at all, or just run it on hardware, and if the latter, what parameters/launch files do I need to include? Do I even need to set up the node for the stereo camera the same way it is done for the Realsense or Occipital cameras? (For reference, I'm specifically referring to the Run on Hardware section of the PX4 avoidance repo.)
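
    For reference, the step I'm referring to in the README looks roughly like this (quoting from memory, so the exact CAMERA_CONFIGS fields may be off):

        # from the "Run on Hardware" section of the PX4 avoidance repo (approximate)
        export CAMERA_CONFIGS="camera_main,realsense,<serial_number>,0.3,0,0,0,0,0"
        ./tools/generate_launchfile.sh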

    Thanks for the help already! Hopefully I won't run into too many more issues that need extra insight.



  • We haven't tried the Avoidance stack in a while, so it's probably going to come down to what you are more comfortable with. If you are comfortable with Gazebo, SITL might be a good way to go.

    The trick with ROS always comes down to making sure all of the nodes are talking to each other. There is a good tutorial on debugging ROS here.

    You'll want a list of the ROS topics the Ubuntu PX4 avoidance stack requires, and a list of the ROS topics the Yocto base layer is providing. Then you'll need to map the two together.
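
    As a hedged illustration of that mapping (the package, node, and topic names here are placeholders rather than the real avoidance or DFS names), a command-line remap looks like:

        # If the avoidance node subscribes to /camera/cloud but the DFS node
        # publishes /dfs/point_cloud, remap when launching the avoidance node:
        rosrun local_planner local_planner_node /camera/cloud:=/dfs/point_cloud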

    You should only have one roscore, which should likely live in the Yocto layer. Your PC should be able to see it if you have ROS_MASTER_URI, ROS_IP, etc. configured properly.
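
    A minimal sketch of that networking setup, assuming the roscore lives on the VOXL (the IP addresses are placeholders):

        # On the PC workstation (and in the Ubuntu Docker if it is not on the host network),
        # point ROS at the roscore running in the Yocto layer
        export ROS_MASTER_URI=http://<voxl-ip>:11311
        export ROS_IP=<this-machine-ip>
        rostopic list    # should now show the topics published on the VOXL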

    The most valuable tools right now will probably be rostopic list and rostopic echo. Run rostopic on each of the layers:

    • yocto base layer
    • Ubuntu Docker
    • PC workstation

    Make sure you see the proper topics in each location and that they are publishing data.
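
    For example, something like this in each of the three shells (topic names are placeholders):

        rostopic list                      # which topics are visible from this layer
        rostopic echo -n 1 /<some_topic>   # confirm messages are actually arriving
        rostopic hz /<some_topic>          # confirm the publish rate looks sane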

    Regarding stereo, I don't think that section of the PX4 docs is applicable. The RosDfsExample node should already be publishing a disparity map and point cloud; you can see where it publishes the disparity map here and the point cloud here.



  • Sounds easy enough, thanks for the quick reply!



  • Another roadblock! I decided to go straight to running it on hardware rather than in the simulator, and I have been running into issues setting up the serial connection between the flight controller and the VOXL. How do I specify the fcu_url for a serial port, specifically UART J10, if I cannot talk to the ports with reads and writes? I would rather not use UDP and just use a serial connection, but I'm confused about how to set up the fcu_url for serial.



  • I'm not exactly familiar with that parameter. Have you seen how we implement mavros here: https://docs.modalai.com/mavros? That might help. You might also be able to use our voxl-vision-px4 to communicate with the flight controller, along with the VIO and mavros example code: https://gitlab.com/voxl-public/ros/mavros_test
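
    If it helps, mavros generally accepts a serial fcu_url of the form <device>:<baudrate>; the device node and baud rate below are assumptions, so substitute whatever UART J10 actually maps to on VOXL and the baud rate configured on the flight controller:

        # Placeholder device and baud rate; check the VOXL docs for the J10 UART device
        roslaunch mavros px4.launch fcu_url:=serial:///dev/ttyHS1:921600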

