ModalAI Forum

    Posts made by nickyblanch

    • RE: Starling - 3D Mapping & Path Planning features aren't working

      @muhammadbilal1 I'm glad you got VOXL-MAPPER installed and running! Unfortunately, I'm not familiar with the path planning functionality as I have never used it. One of ModalAI's people will need to chime in about this.

      However, it is certainly true that the drone will need to be in OFFBOARD mode for path planning to work. PX4 won't accept external commands otherwise.

      posted in Starling & Starling 2
    • RE: Is the Flight Core V2 a direct replacement for V1?

      @Vinny Thank you for providing all of this info! Extremely helpful response. FCV2 isn't quite as backward-compatible as I had hoped, so I will still try to track down a FCV1 - if I can't get one, then I will be upgrading to the FCV2 per your notes here. I'll update the thread if I discover any other required modifications.

      posted in Ask your questions right here!
    • RE: Starling - 3D Mapping & Path Planning features aren't working

      @muhammadbilal1 Hi, did you install voxl-mapper on the VOXL2? I didn't realize it had to be installed separately until I had spent a while trying to get output (although I was on VOXL, not VOXL2).

      If it's already installed, make sure it's enabled and running with

      voxl-inspect-services
      

      If the voxl-mapper service isn't enabled or running, you may need to run something like

      systemctl enable voxl-mapper
      systemctl start voxl-mapper
      
      posted in Starling & Starling 2
    • RE: Is the Flight Core V2 a direct replacement for V1?

      @Vinny [attached image: IMG_3251.jpg]

      posted in Ask your questions right here!
    • RE: Is the Flight Core V2 a direct replacement for V1?

      @Vinny Hi Vinny, thank you for your response. I will update with a list and picture of the connections in a few hours when I am back in my lab. The connections are all the default connections used for the Flight Core V1 Seeker drone.

      There are connections for the GPS on the back, the ESC, the VOXL, and several other peripherals. I believe the connector that broke is J1 [VOXL Communications Interface Connector (TELEM2, /dev/ttyS4, UART5)].

      posted in Ask your questions right here!
    • Is the Flight Core V2 a direct replacement for V1?

      Hi,

      I damaged the Flight Core V1 board inside my Seeker drone (VOXL CAM). I see that the V1 boards are no longer available for purchase. If I purchase a V2, will I be able to use it as a drop-in replacement without modifying any other hardware or software?

      Best,
      Nick B

      posted in Ask your questions right here!
    • RE: Expand /dev/root storage space on VOXL2 for custom ROS package?

      @Moderator I see, thank you for the clarification - I think I was confused because I saw the HW info returned Flight Core V2.

      I will try to run everything from /data/ , but I was initially worried about path errors because the other ROS packages and Python dependencies appeared to be installed in root. Will update here.
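
      In case it helps anyone else, the rough plan is to install the Python dependencies into a directory under /data/ and point PYTHONPATH at it (the paths below are hypothetical, and this assumes pip3 is available on the VOXL2):

      mkdir -p /data/python_deps
      pip3 install --target=/data/python_deps -r requirements.txt
      export PYTHONPATH=/data/python_deps:$PYTHONPATH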

      posted in Ask your questions right here!
    • Expand /dev/root storage space on VOXL2 for custom ROS package?

      Hi all,

      I'm trying to run a custom ROS package onboard my Seeker drone (VOXL2). The custom ROS package has a lot of Python dependencies - I'd estimate up to 10 GB in total, based on installing the same dependencies on my Ubuntu machine.

      However, there is not enough free space in the /dev/root filesystem to handle the Python dependencies and the custom ROS node.

      [attached screenshot]

      Could there be some way to re-allocate space from another partition (say, /dev/sda9) to /dev/root?

      For reference, the custom ROS package that I'm trying to install is a modified version of the yolov7-ros package.

      [I'm also looking into the voxl-tflite-server for a custom TFLite model, but the documentation for voxl-tflite-server makes me think it would be difficult to get my custom model to be compatible]

      Any suggestions are appreciated!
      -Nick B

      posted in Ask your questions right here!
    • RE: VOXL mpa-to-ros & Odometry Transform (MISSING TRANSFORM)

      @nickyblanch For anyone else working with ToF over voxl-mpa-to-ros, I'll document what I ended up doing. Also consider this my application to ModalAI's dev team 😉

      1.) One of the first things you will notice when retrieving ToF data over voxl-mpa-to-ros is that the ToF PointCloud2 message is stamped with the frame world in its header. However, this frame will not exist by default in most ROS ecosystems, so to even visualize the data you will need to create a world frame. The most basic way to do this is with a static TF on either the ROS host or a remote computer connected to it (e.g. a laptop running some version of ROS1). This is simply a debug step to make sure you're getting data; it won't be necessary later.

      rosrun tf static_transform_publisher 0 0 0 0 0 0 1 map world 10
      

      The frame name world is confusing here, because the world frame generally refers to a global frame in a ROS ecosystem. However, in this case, the world frame is local to the ToF sensor.

      2.) Another thing you'll notice is that the timestamp in the header of each message published by voxl-mpa-to-ros is relative to boot time on the VOXL, not the standard Unix epoch time that ROS expects. Be careful to avoid timing errors by either:

      • (a) Republishing incoming messages with a new timestamp on a laptop (not recommended; it may cause slight timing mismatches; this approach is discussed later in step 7), or

      • (b) Using the timestamps from the ROS messages originating from the VOXL for all transforms and any additional ROS operations (see the sketch after this list). Do not use

      ros::Time::now()
      

      because it will cause messages to be ignored by the message filter.
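
      As a rough illustration (the frame names and node skeleton here are placeholders, and it assumes a tf2_ros::TransformListener is already filling the buffer), do any TF lookups at the message's own timestamp:

      #include <ros/ros.h>
      #include <sensor_msgs/PointCloud2.h>
      #include <geometry_msgs/TransformStamped.h>
      #include <tf2/exceptions.h>
      #include <tf2_ros/buffer.h>
      #include <tf2_ros/transform_listener.h>

      tf2_ros::Buffer tf_buffer;

      // Assumes tf2_ros::TransformListener listener(tf_buffer); was constructed in main().
      void tof_callback(const sensor_msgs::PointCloud2ConstPtr& msg) {
          try {
              // Look up the transform at the VOXL's own (boot-relative) timestamp.
              // Using ros::Time::now() here would mix two different clocks, and the
              // lookup / message filters would silently drop the data.
              geometry_msgs::TransformStamped tf =
                  tf_buffer.lookupTransform("map", msg->header.frame_id,
                                            msg->header.stamp, ros::Duration(0.1));
              (void)tf; // apply it to the cloud here (e.g. tf2::doTransform from tf2_sensor_msgs)
          } catch (const tf2::TransformException& ex) {
              ROS_WARN("%s", ex.what());
          }
      }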

      3.) If you're interested in stabilizing the ToF messages into a global frame to create a 3D map (which was my goal), then there is a series of transforms that you will need to publish. The transform tree looks like:

      [ToF Frame] -> [IMU Frame] -> [Map Frame] -> [NED Frame]
      

      The NED frame is optional, but neither the ToF frame nor the IMU frame are in NED, so it makes it easier to interpret the data if we standardize it to Z down, Y to the right, and X forward at the end.

      4.) The first transform takes the ToF frame (world) to the IMU frame. This is a static transform. This transform will vary based on the specific setup of your VOXL and FC. For my Seeker drones, I pulled the extrinsics file and saw that the IMU and ToF frames have a 180 degree rotation in Yaw relative to each other. Therefore, a ROS transform like this is required:

      void ToF_IMU_static_broadcaster(void) {
      
          // SETUP ////////////////////////////////////////////////////////////////////////////////
          static tf2_ros::StaticTransformBroadcaster static_broadcaster;
          geometry_msgs::TransformStamped static_transformStamped;
      
          // HEADER TIMESTAMP AND TF FRAMES ///////////////////////////////////////////////////////
          static_transformStamped.header.stamp.sec = 0;
          static_transformStamped.header.stamp.nsec = 0;
          static_transformStamped.header.frame_id = [INSERT ToF FRAME NAME];
          static_transformStamped.child_frame_id = [INSERT IMU FRAME NAME]; // CAN BE ANYTHING; I USED BASE_LINK
      
          // TRANSLATION; NO TRANSLATION //////////////////////////////////////////////////////////
          static_transformStamped.transform.translation.x = 0;
          static_transformStamped.transform.translation.y = 0;
          static_transformStamped.transform.translation.z = 0;
          
          // ORIENTATION //////////////////////////////////////////////////////////////////////////
          // See ModalAI VOXL SEEKER CAM V1 extrinsics for more information on the relationship
          // between the ToF and the VIO coordinate frames. Essentially, the yaw axes are offset
          // by 180 degrees (1 pi rads).
          tf2::Quaternion quat;
          quat.setRPY(0, 0, 3.14);
          quat.normalize();
          static_transformStamped.transform.rotation.x = quat.x();
          static_transformStamped.transform.rotation.y = quat.y();
          static_transformStamped.transform.rotation.z = quat.z();
          static_transformStamped.transform.rotation.w = quat.w();
      
          // SEND //////////////////////////////////////////////////////////////////////////////////
          static_broadcaster.sendTransform(static_transformStamped);
      }
      

      5.) Next, you will need a transform from the IMU frame to a global frame. Although this global frame can have any name, you should not use world because it conflicts with the frame the ToF data is provided in. I used map; note that this matches the frame the VIO data is provided in, so we get the added benefit of being able to plot VIO data directly in this frame.

      This transform is not static like the previous one. Rather, we need to dynamically change the transform to reflect the changing relationship between the frames attached to the VOXL itself and the unmoving world frame. VIO data allows us to do that. Essentially, the odometry xyz and rpy values are used as translations and rotations, respectively.

      It is critical that we do rotation BEFORE translation. Therefore, this transform is split into two distinct transforms in ROS to ensure rotation is applied before translation.

      void poseCallback(const nav_msgs::Odometry::ConstPtr& msg) {
      
          // RETRIEVE POSE DATA ///////////////////////////////////////////////////////////////////
          float x = msg->pose.pose.position.x;
          float y = msg->pose.pose.position.y;
          float z = msg->pose.pose.position.z;
          float x_o = msg->pose.pose.orientation.x;
          float y_o = msg->pose.pose.orientation.y;
          float z_o = msg->pose.pose.orientation.z;
          float w_o = msg->pose.pose.orientation.w;
      
          // TRANSFORM HANDLERS ////////////////////////////////////////////////////////////////////
          static tf2_ros::TransformBroadcaster br;
          geometry_msgs::TransformStamped transform_rotation;
          geometry_msgs::TransformStamped transform_translation;
      
          // ROTATION //////////////////////////////////////////////////////////////////////////////
          tf2::Quaternion q(-1.0 * x_o, -1.0 * y_o, -1.0 * z_o, 1.0 * w_o);
      
          // SEND TRANSFORMS ////////////////////////////////////////////////////////////////////////
          // ROTATION
          transform_rotation.header.stamp = msg->header.stamp;
          transform_rotation.header.frame_id = [INSERT IMU FRAME NAME];
          transform_rotation.child_frame_id = [INSERT INTERMEDIATE FRAME NAME];
          transform_rotation.transform.translation.x = 0; // rotation only; translation is applied in the next transform
          transform_rotation.transform.translation.y = 0;
          transform_rotation.transform.translation.z = 0;
          transform_rotation.transform.rotation.x = q.x();
          transform_rotation.transform.rotation.y = q.y();
          transform_rotation.transform.rotation.z = q.z();
          transform_rotation.transform.rotation.w = q.w();
          br.sendTransform(transform_rotation);
      
          // TRANSLATION
          transform_translation.header.stamp = msg->header.stamp;
          transform_translation.header.frame_id = [INSERT INTERMEDIATE FRAME NAME];
          transform_translation.child_frame_id = [INSERT MAP FRAME NAME];
          transform_translation.transform.translation.x = -1.0 * x;
          transform_translation.transform.translation.y = -1.0 * y;
          transform_translation.transform.translation.z = -1.0 * z;
          transform_translation.transform.rotation.x = 0;
          transform_translation.transform.rotation.y = 0;
          transform_translation.transform.rotation.z = 0;
          transform_translation.transform.rotation.w = 1;
          br.sendTransform(transform_translation);
      
      }
      
      

      6.) Lastly, we should make a transform to take our current frame into a more standard NED frame. This is similar to the first transform because it is static.

      void map_NED_static_broadcaster(void) {
      
          // SETUP ////////////////////////////////////////////////////////////////////////////////
          static tf2_ros::StaticTransformBroadcaster static_broadcaster;
          geometry_msgs::TransformStamped static_transformStamped_NED;
      
          // HEADER TIMESTAMP AND TF FRAMES ///////////////////////////////////////////////////////
          static_transformStamped_NED.header.stamp.sec = 0;
          static_transformStamped_NED.header.stamp.nsec = 0;
          static_transformStamped_NED.header.frame_id = [INSERT MAP FRAME NAME];
          static_transformStamped_NED.child_frame_id = [INSERT NED FRAME NAME];
      
          // TRANSLATION; NO TRANSLATION //////////////////////////////////////////////////////////
          static_transformStamped_NED.transform.translation.x = 0;
          static_transformStamped_NED.transform.translation.y = 0;
          static_transformStamped_NED.transform.translation.z = 0;
          
          // ORIENTATION //////////////////////////////////////////////////////////////////////////
          static_transformStamped_NED.transform.rotation.x = -0.5;
          static_transformStamped_NED.transform.rotation.y = 0.5;
          static_transformStamped_NED.transform.rotation.z = 0.5;
          static_transformStamped_NED.transform.rotation.w = 0.5;
      
          // SEND //////////////////////////////////////////////////////////////////////////////////
          static_broadcaster.sendTransform(static_transformStamped_NED);
      }
      

      7.) If you find yourself wanting to mess around with the names of the frames that each message is published in, or with the timestamps of each message, there is an alternative to editing and re-building the voxl-mpa-to-ros package. Using a custom class like the one below, you can republish messages with a new timestamp or new frame, while keeping the same data.

      class pc_subscribe_and_publish {
          public:
              pc_subscribe_and_publish() {
      
                  // TOPIC WE ARE SUBSCRIBING TO //////////////////////////////////////////////////////////
                  sub = node.subscribe("/tof_pc", 10, &pc_subscribe_and_publish::pc_callback, this);
      
                  // TOPIC WE ARE PUBLISHING //////////////////////////////////////////////////////////////
                  pub = node.advertise<sensor_msgs::PointCloud2>("[INSERT NEW TOPIC NAME]", 100);
      
              }
              void pc_callback(const sensor_msgs::PointCloud2ConstPtr& msg) {
      
                  // KEEP ALL DATA THE SAME, ONLY CHANGE TIMESTAMP ////////////////////////////////////////
                  sensor_msgs::PointCloud2 msg_out;
                  msg_out.header = std_msgs::Header();
                  msg_out.header.frame_id = [INSERT NEW FRAME NAME];
                  msg_out.header.stamp = msg->header.stamp;
                  msg_out.height = msg->height;
                  msg_out.width = msg->width;
                  msg_out.fields = msg->fields;
                  msg_out.is_bigendian = msg->is_bigendian;
                  msg_out.point_step = msg->point_step;
                  msg_out.row_step = msg->row_step;
                  msg_out.data = msg->data;
                  msg_out.is_dense = msg->is_dense;
                  pub.publish(msg_out);
              }
          private:
              ros::NodeHandle node;
              ros::Publisher pub;
              ros::Subscriber sub;
      };
      

      8.) The last step to get everything running is to add some ROS requirements. In your main function, you'll need to make sure the above transform publishers are subscribing to the odom messages. You'll also need to create an instance of the custom class if you want to do any republishing.

      int main(int argc, char** argv) {
      
          // SETUP ////////////////////////////////////////////////////////////////////////////////
          ros::init(argc, argv, "tof_tf_broadcaster");
          ros::NodeHandle node;
      
          // BASE LINK TF BROADCASTER /////////////////////////////////////////////////////////////
          ros::Subscriber sub = node.subscribe("/qvio/odometry", 10, &poseCallback);
      
          // SENSOR FRAME TF BROADCASTER //////////////////////////////////////////////////////////
          map_NED_static_broadcaster();
          ToF_IMU_static_broadcaster();
      
          // PC2 SUBSCRIBER/PUBLISHER /////////////////////////////////////////////////////////////
          pc_subscribe_and_publish sapobject;
      
          // RUN
          ros::spin();
          return 0;
      
      }
      

      9.) Create a catkin workspace and set up your custom ROS package like any online tutorial will show you. Build the package, create a launch file, and run it. By visualizing the ToF data in the NED frame in RViz, you should see that objects are stabilized while the pointcloud moves around with the drone. A mapping package such as octomap_server can then create an occupancy map from all of the incoming ToF pointcloud data.
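
      For reference, the workspace setup is roughly the following (the package name, dependency list, and launch file name are just placeholders; adapt them to your own node):

      mkdir -p ~/catkin_ws/src && cd ~/catkin_ws/src
      catkin_create_pkg tof_tf_broadcaster roscpp tf2 tf2_ros geometry_msgs sensor_msgs nav_msgs std_msgs
      # copy the source above into tof_tf_broadcaster/src/ and declare the executable in CMakeLists.txt
      cd ~/catkin_ws && catkin_make
      source devel/setup.bash
      roslaunch tof_tf_broadcaster tof_tf_broadcaster.launch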

      [attached screenshots: LAB_ROOFLESS_ZOOM.png, LAB_ANGLE_ZOOM.png]

      This code is edited slightly from the exact code running in my ROS ecosystem, so it's not tested as-is, but let me know if you try it out and something isn't working. Good luck!

      posted in Ask your questions right here!
    • VOXL mpa-to-ros & Odometry Transform (MISSING TRANSFORM)

      Hi all,

      I have a quick question that has me stuck and I would really appreciate some input. I'm trying to run an external occupancy grid mapping program (octomap_server to be exact) to take the ToF data from my Seeker drone and create a 3D map. I can receive data over ROS just fine using mpa-to-ros, but I'm stuck getting the ToF data to display in RViz or be usable in any way.

      It appears that my Seeker drone isn't publishing any TF (transform) data at all over ROS. The ToF PointCloud2 message has the frame_id 'world', but it needs a transform to a fixed frame ('Map' for example) to display. Because the vehicle is moving, this transform should use the odometry data instead of being a static transform.

      1. Am I correct that voxl-mpa-to-ros does not publish any TF data whatsoever? rosrun tf view_frames shows no TF data being published. The documentation here makes it seem like a TF should be published by the VOXL itself.
        [attached screenshot]

      2. Does anyone have a suggestion on how to handle the transform? The only thing I need to do is get the ToF PointCloud2 message translated into a fixed frame so that I can display it. I figured out how to do a static transform in my ToF Pointcloud on ROS (RVIZ) thread, but that won't cut it.

      New to ROS and VOXL-SDK, any help is very appreciated!

      Happy Holidays,
      Nick B

      posted in Ask your questions right here!
    • RE: ToF Pointcloud on ROS (RVIZ)

      @Amber-Elaine-Parker It's missing a TF. After launching ROS Indigo on the VOXL, open a new bash session and type:

      rosrun tf static_transform_publisher 0 0 0 0 0 0 1 map world 10
      

      This is enough to create the world frame and make the ToF sensor's PointCloud2 message display in RViz (if you select the world frame under Global Options in RViz). However, it looks like the frame isn't oriented correctly in my case. Editing the parameters of this command might change the orientation of the new world frame.
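
      For example, the six-argument form of static_transform_publisher takes yaw/pitch/roll in radians instead of a quaternion, so something like the following (the 180 degree yaw is just an illustrative guess) may orient the frame better:

      rosrun tf static_transform_publisher 0 0 0 3.14159 0 0 map world 10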

      I don't know enough about ROS/VOXL SDK to confidently call this an oversight by the ModalAI team, but there should definitely be a mention that this is required somewhere within the documentation.

      posted in VOXL-CAM
    • RE: Troubleshooting/Improving VIO Accuracy

      @AP123 Hi, I have not seen this before, but I also haven't tried enabling obstacle avoidance. I can't pretend to know much about the obstacle avoidance code running in the VOXL SDK, but I suppose this error could be caused by some obstacle avoidance service not running on the VOXL when it should be, which would explain QGroundControl being confused.

      posted in GPS-denied Navigation (VIO)
    • RE: Troubleshooting/Improving VIO Accuracy

      @nickyblanch I'll send an update here for anyone else with bad VIO quality that may be affected by vibration - tearing apart/reassembling the VOXL Cams may have helped a small amount, but the majority of the problem was definitely centered around the battery. The battery seemed fine and secure, but after adding foam all the way around the battery, the VIO quality improved significantly.

      Curiously, although the vibration data is lower now, it still shows red ( > 5 m/s^2) in all three directions. I compared this against my brand new VOXL2 Seeker drone that doesn't have any VIO problems and that drone also shows red ( > 5 m/s^2) for vibration data. However, despite the high vibration, both the new VOXL2 drone and the older, previously unflyable VOXL1 drones fly semi-well in position mode.

      posted in GPS-denied Navigation (VIO)
    • RE: can't get rviz working

      @fjksalf If the IP address of the VOXL is 192.168.1.1, then you need to execute export ROS_MASTER_URI=http://192.168.1.1:11311 on the Ubuntu system and then run rviz from the same terminal.
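
      For example (replace the placeholder with your laptop's IP on the same network, so the VOXL can reach your machine):

      export ROS_MASTER_URI=http://192.168.1.1:11311
      export ROS_IP=<LAPTOP_IP>
      rviz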

      posted in VOXL-CAM
    • RE: Troubleshooting/Improving VIO Accuracy

      @Moderator Hi, just wanted to follow up - have you seen this problem before, and if so, how was it fixed? I tore the entire VOXL Cam portion apart, tightened everything, and re-secured the VOXL Cam to the airframe with a lot of thick foam between the VOXL and the airframe. Vibration still shows red (> 5.00 m/s^2) after takeoff in Position mode but is fine before takeoff. VIO flight technically works when moving slowly, but it is extremely unstable and the Seeker is constantly jerking up/down and left/right. The propellers are brand new and the motors appear to work fine. The drone has sustained a few indoor crashes in the past.

      I don't have the budget for a new VOXL Cam and need to get the drone flying within the next week so I can finish my thesis and graduate - any suggestions? Is there possibly some software filtering in the VOXL SDK that I can check?

      Thanks!

      posted in GPS-denied Navigation (VIO)
    • RE: Troubleshooting/Improving VIO Accuracy

      @Moderator Hi, thank you for your response. The VOXL cam is firmly attached to the frame, the four main mounting screws are tight and the rubber washers are under compression. I will go through and make sure everything else is tight, but it all appears to be. I swapped out the propellers for brand new ones yesterday.

      I added foam padding between the VOXL CAM and the frame in an attempt to dampen vibration but it may have made it worse. VIO failed after about a minute of slow flying.

      [attached images]

      posted in GPS-denied Navigation (VIO)
    • RE: Troubleshooting/Improving VIO Accuracy

      @tom It's been a bit since the last update, but after lots of fresh installs, calibrations, etc., I think the issue has been narrowed down to vibration. Shown below is the output of voxl-inspect-vibration for IMU0 during a relatively stable flight. The results are the same for IMU1.
      [attached screenshot: voxl-inspect-vibration output for IMU0]

      It's clear that the vibration on the IMU is high, but is there anything I can do about it? The frame of the Seeker is unmodified, exactly as it came. Also, do you know if this vibration is high enough to cause occasional VIO failures?

      Thanks,
      Nick

      posted in GPS-denied Navigation (VIO)