ModalAI Forum

    Testing voxl-vision-hub offboard code without arming the drone

    Starling & Starling 2
• Hector Gutierrez

Hello,

Voxl-vision-hub is, to my knowledge, the most important tool for developing autonomous flight applications on the VOXL2 platform.
One can develop a new application, modeled on the existing templates, as "offboard_mycode.c", change the config parameter "offboard_mode" to "mycode", and the code will execute when the drone enters offboard mode - which requires having the drone armed and flying!
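
For reference, the switch I am describing is just one field in the vision hub config file (I believe /etc/modalai/voxl-vision-hub.conf, following the old voxl-vision-px4.conf layout - the exact path and the other fields may differ, so treat this as an assumption), after which the service needs to be restarted to pick up the change. The real file contains many more fields than shown here:

```json
{
    "offboard_mode": "mycode"
}
```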

It would be of great value for development, test, and debug purposes to be able to run offboard_mycode when the drone is NOT armed. This is not exactly HITL using Gazebo; the idea is to run the program while printing debug values to the monitor, to check that intermediate variables have the correct values and that the code is running correctly.
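
Concretely, something as simple as this hook inside offboard_mycode.c is what I have in mind (the function and variable names are made up for illustration, not real voxl-vision-hub symbols):

```c
/* Minimal illustration of "printing debug values to the monitor".
 * All names here are placeholders, not actual voxl-vision-hub symbols. */
#include <stdio.h>

void debug_print_state(int state, double sp_x, double sp_y, double sp_z)
{
    /* intermediate values watched in the terminal while the drone is disarmed */
    fprintf(stderr, "state=%d setpoint=(%.2f, %.2f, %.2f)\n",
            state, sp_x, sp_y, sp_z);
}
```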

Is there a way to run and test offboard code on VOXL2 while the drone is not armed, by sending debug text to the terminal? Please let me know. Thanks.

• Eric Katzfey (ModalAI Team)

@Hector-Gutierrez We have actually used HITL for exactly this purpose, which is nice because then you can see the flight behavior in the Gazebo visualization. Unfortunately, it's not a trivial matter to get everything set up to make it work with offboard mode, since you need the VIO data that Gazebo generates to go to voxl-vision-hub.

• Hector Gutierrez

@Eric-Katzfey - thanks for the prompt response.
My test scenario is as follows. I need to modify the follow_tag code to include a search maneuver (search for a tag while following a square or circular search path), followed by flying towards the tag.
I would like to test, for instance, whether the tag is being properly detected by moving the drone by hand along the search pattern.
If the tag is found, I can send a debug text message to the terminal.
This is much simpler than HITL: just check that the code logic is correct by looking at certain variables while moving the drone by hand.
This would require running offboard code while the drone is not armed and sending debug text to the terminal; see the rough sketch below.
It doesn't need fake sensor values (as in HITL): the drone is disarmed and being moved by hand, so VIO would still be working. Is this possible?
Please let me know - thanks.
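
Here is that rough sketch of the logic I want to exercise by hand. Every type and helper below is a placeholder, not the real voxl-vision-hub or libmodal_pipe API; in practice it would be grafted onto the existing follow_tag template:

```c
/* Sketch of the search-then-follow logic to debug while disarmed.
 * All types and helpers are placeholders, not real voxl-vision-hub APIs. */
#include <stdbool.h>
#include <stdio.h>

typedef struct { bool detected; double x, y, z; } tag_fix_t;

/* placeholder: in the real code this would be filled by the tag-detection pipe */
static tag_fix_t latest_tag;

/* placeholder: the real code would read the armed state from PX4 telemetry */
static bool drone_is_armed(void) { return false; }

/* one iteration of the search state machine: while disarmed it only prints,
 * so I can carry the drone along the square/circular path and watch the output */
static void search_step(int step)
{
    if (latest_tag.detected) {
        fprintf(stderr, "[step %d] tag found at (%.2f, %.2f, %.2f)\n",
                step, latest_tag.x, latest_tag.y, latest_tag.z);
        if (drone_is_armed()) {
            /* only here would real setpoints be sent towards the tag */
        }
    } else {
        fprintf(stderr, "[step %d] no tag yet, continuing search pattern\n", step);
    }
}

int main(void)
{
    /* bench run: pretend the tag shows up on the third step */
    for (int step = 0; step < 5; step++) {
        if (step == 3) latest_tag = (tag_fix_t){ true, 1.0, 0.5, -1.2 };
        search_step(step);
    }
    return 0;
}
```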

• Eric Katzfey (ModalAI Team)

@Hector-Gutierrez Can you just remove the propellers and actually arm?
