ModalAI Forum

    Best posts made by Jetson Nano

    • Offboard mode not working

      Hey @Eric-Katzfey @James-Strawson ,

      Thanks a lot for helping me with the integration of my VOXL2 with Cube Orange. It is working well. I am also using it with the ToF, HighRes, and tracking camera from ModalAI.
      Position hold using VIO and Semi-Autonomous (VOA) are also working well.

      Now, I am facing an issue with offboard mode. The moment I switch to offboard, the drone moves aggressively and crashes (in both "figure_eight" and "trajectory" modes). [With offboard mode set to "off", I am not able to switch to offboard at all.]

      There are no error messages in QGC or elsewhere. Everything except offboard works.
      Could you please help with this issue?
      I'm working on this as a priority, so any suggestion will be tested quickly. I hope to hear from you.
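      For context, here is a minimal fragment of the voxl-vision-hub config showing how the behavior is selected. The key name is assumed from the stock /etc/modalai/voxl-vision-hub.conf; the value shown is the one being tested:

      ```
      {
      	"offboard_mode":	"figure_eight"
      }
      ```

      If offboard_mode is "off", voxl-vision-hub presumably sends no setpoints, which would explain not being able to switch into offboard at all in that case.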

      Thank you

      posted in VOXL 2
      Jetson Nano
    • Odometry message not displayed with VIO

      Hey @Alex-Kushleyev @Eric-Katzfey @tom

      I'm using a VOXL2 compute board along with a Flight Core V2 as an external flight controller.

      PX4 version: v1.15.1 (Stable)

      The VOXL2 communicates with FC as mentioned in this documentation.

      For position hold I have set the EV parameters to use the tracking camera, fused with an onboard optical flow sensor and rangefinder. I am able to see the odometry message in the MAVLink console on QGC. I also tried streaming the message from the VOXL2:

      mavlink stream -r 30 -s ODOMETRY -d /dev/ttyS1

      In voxl-vision-hub I have set the mode to figure_eight. But once I switch from position mode to offboard mode, the drone stays in place and does not fly the figure 8.

      Once I SSH into the VOXL2 and check the MAVLink status, I see LOCAL_POSITION_NED but not the #331 ODOMETRY message. How can I get that message?

      Without that message I'm not sure whether the VIO inputs from the tracking camera are being passed on to the FC for fusion when switching to offboard mode.
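      One way to check whether the external-vision data actually reaches PX4 is from the PX4 shell (e.g. via the QGC MAVLink console). These are standard PX4 console commands; exact output depends on the PX4 version:

      ```
      listener vehicle_visual_odometry   # prints samples if EV data reaches PX4's uORB side
      uorb top -1                        # one-shot list of active uORB topics and rates
      mavlink status streams             # per-stream TX rates on each MAVLink instance
      ```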

      posted in GPS-denied Navigation (VIO)
      Jetson Nano
    • ToF Depth Sensor support

      We have been looking to order the VOXL Time of Flight (ToF) Depth Sensor from the ModalAI products page to integrate with our VOXL2 board, but the sensor is currently unavailable during our critical time period.

      The sensor ModalAI provides is a ToF depth sensor from PMD. Could the same sensor module sold by PMD (i.e. the PMD Flexx2 3D Camera) be integrated directly with the VOXL2?

      posted in Ask your questions right here! voxl2 tof depth sensor
      Jetson Nano
    • Drift issue with Openvins

      @zauberflote1 @Cliff-Wong @Alex-Kushleyev @Eric-Katzfey

      Hey guys, I am using voxl-open-vins-server.
      I am facing a constant drift issue: the drone drifts during takeoff itself. Right after takeoff, the VINS position values vary from 1 m to 80 m. I take off in stabilized mode for safety. Once switched to position mode, the drone constantly moves in one direction until the pilot gives a different command, after which it continues in the direction of that command.
      I am including my config file here; please guide me toward achieving a strong position hold.

      /**
       * This file contains configuration that's specific to voxl-open-vins-server.
       * 
       * *NOTE*: all time variables are measured in seconds
       * 
       * OpenVins param breakdown:
       * 
       * do_fej: whether or not to do first estimate Jacobians
       * imu_avg: whether or not use imu message averaging
       * use_rk4_integration: if we should use RK4 imu integration.
       * cam_to_imu_refinement: whether or not to refine the imu-to-camera pose
       * cam_intrins_refinement: whether or not to refine camera intrinsics
       * cam_imu_ts_refinement: whether or not to calibrate cam to IMU time offset
       * max_clone_size: max clone size of sliding window
       * max_slam_features: max number of estimated SLAM features
       * max_slam_in_update: max number of SLAM features in a single EKF update
       * max_msckf_in_update: max number of MSCKF features used at an image timestep
      
       * 
       * Feature Reps can be any of the following:
       * 0 - GLOBAL_3D
       * 1 - GLOBAL_FULL_INVERSE_DEPTH
       * 2 - ANCHORED_3D
       * 3 - ANCHORED_FULL_INVERSE_DEPTH
       * 4 - ANCHORED_MSCKF_INVERSE_DEPTH
       * 5 - ANCHORED_INVERSE_DEPTH_SINGLE
       * feat_rep_msckf: (int) what representation our msckf features are in
       * feat_rep_slam: (int) what representation our slam features are in
      
       * cam_imu_time_offset: time offset between camera and IMU
       * slam_delay: delay that we should wait from init before estimating SLAM features
       * gravity_mag: gravity magnitude in the global frame
       * init_window_time: amount of time to initialize over
       * init_imu_thresh: variance threshold on our accel to be classified as moving
       * 
       * imu_sigma_w: gyroscope white noise (rad/s/sqrt(hz))
       * imu_sigma_wb: gyroscope random walk (rad/s^2/sqrt(hz))
       * imu_sigma_a: accelerometer white noise (m/s^2/sqrt(hz))
       * imu_sigma_ab: accelerometer random walk (m/s^3/sqrt(hz))
       * imu_sigma_w_2: gyroscope white noise covariance
       * imu_sigma_wb_2: gyroscope random walk covariance
       * imu_sigma_a_2: accelerometer white noise covariance
       * imu_sigma_ab_2: accelerometer random walk covariance
       * 
       * ****_chi2_multiplier: what chi-squared multiplier we should apply
       * ****_sigma_px: noise sigma for our raw pixel measurements
       * ****_sigma_px_sq: covariance for our raw pixel measurements
       * use_stereo: if feed_measurement_camera is called with more than one
       * image, this determines behavior. if true, they are treated as a stereo
       * pair, otherwise treated as binocular system
       * if you enable a camera with stereo in the name, this will be set to true
       * automatically
       * 
       * try_zupt: if we should try to use zero velocity update
       * zupt_max_velocity: max velocity we will consider to try to do a zupt
       * zupt_only_at_beginning: if we should only use the zupt at the very beginning
       * zupt_noise_multiplier: multiplier of our zupt measurement IMU noise matrix
       * zupt_max_disparity: max disparity we will consider to try to do a zupt
       * *NOTE*: set zupt_max_disparity to 0 for only imu based zupt, and
       * zupt_chi2_multiplier to 0 for only disparity based zupt
       * 
       * num_pts: number of points we should extract and track in each image frame
       * fast_threshold: fast extraction threshold
       * grid_x: number of column-wise grids to do feature extraction in
       * grid_y: number of row-wise grids to do feature extraction in
       * min_px_dist: after doing KLT track will remove any features closer than this
       * knn_ratio: KNN ratio between the top two descriptor matches for a good match
       * downsample_cams: will half image resolution
       * use_multithreading: if we should use multi-threading for stereo matching
       * use_mask: if we should load a mask and use it to reject invalid features
       */
      {
      	"en_auto_reset":	true,
      	"auto_reset_max_velocity":	20,
      	"auto_reset_max_v_cov_instant":	0.10000000149011612,
      	"auto_reset_max_v_cov":	0.10000000149011612,
      	"auto_reset_max_v_cov_timeout_s":	0.5,
      	"auto_reset_min_features":	1,
      	"auto_reset_min_feature_timeout_s":	3,
      	"auto_fallback_timeout_s":	3,
      	"auto_fallback_min_v":	0.600000023841858,
      	"en_cont_yaw_checks":	false,
      	"fast_yaw_thresh":	5,
      	"fast_yaw_timeout_s":	1.75,
      	"do_fej":	true,
      	"imu_avg":	true,
      	"use_rk4_integration":	true,
      	"cam_to_imu_refinement":	true,
      	"cam_intrins_refinement":	true,
      	"cam_imu_ts_refinement":	true,
      	"max_clone_size":	8,
      	"max_slam_features":	35,
      	"max_slam_in_update":	10,
      	"max_msckf_in_update":	10,
      	"feat_rep_msckf":	4,
      	"feat_rep_slam":	4,
      	"cam_imu_time_offset":	0,
      	"slam_delay":	1,
      	"gravity_mag":	9.80665,
      	"init_window_time":	1,
      	"init_imu_thresh":	1,
      	"imu_sigma_w":	0.00013990944749616306,
      	"imu_sigma_wb":	4.1189724174615527e-07,
      	"imu_sigma_a":	0.0038947538150776763,
      	"imu_sigma_ab":	5.538346201712153e-05,
      	"msckf_chi2_multiplier":	1,
      	"slam_chi2_multiplier":	40,
      	"zupt_chi2_multiplier":	1,
      	"msckf_sigma_px":	4,
      	"slam_sigma_px":	4,
      	"zupt_sigma_px":	4,
      	"try_zupt":	true,
      	"zupt_max_velocity":	0.02,
      	"zupt_only_at_beginning":	true,
      	"zupt_noise_multiplier":	1.5,
      	"zupt_max_disparity":	8,
      	"init_dyn_use":	true,
      	"triangulate_1d":	false,
      	"refine_features":	true,
      	"max_runs":	5,
      	"init_lamda":	0.001,
      	"max_lamda":	10000000000,
      	"min_dx":	1e-06,
      	"min_dcost":	1e-06,
      	"lam_mult":	10,
      	"min_dist":	0.1,
      	"max_dist":	60,
      	"max_baseline":	40,
      	"max_cond_number":	600000,
      	"use_mask":	false,
              "use_multithreading": true,
      	"use_stereo":	true,
      	"use_baro":	false,
      	"num_opencv_threads":	4,
      	"fast_threshold":	30,
      	"histogram_method":	1,
      	"knn_ratio":	0.7,
      	"takeoff_accel_threshold":	0.80,
      	"takeoff_threshold":	0.3,
      	"use_stats":	false,
      	"max_allowable_cep":	1,
      	"en_force_init":	false,
      	"en_force_ned_2_flu":	false,
      	"track_frequency":	40,
      	"publish_frequency":	15,
      	"en_vio_always_on":	false,
      	"en_ext_feature_tracker":	false,
              "en_gpu_for_tracking": true,
      	"num_features_to_track":	40
      }
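      Note that the file as a whole is not valid JSON because of the C-style comment header, so a plain JSON parser rejects it. A minimal sketch of how to inspect it offline in Python (the embedded sample is an abbreviated copy of the config above):

      ```python
      import json
      import re

      # Abbreviated copy of the config file: a /* ... */ comment header followed by JSON.
      raw = """
      /**
       * This file contains configuration specific to voxl-open-vins-server.
       */
      {
          "gravity_mag": 9.80665,
          "init_imu_thresh": 1,
          "zupt_max_velocity": 0.02
      }
      """

      # Strip /* ... */ comment blocks, then parse the remaining JSON.
      cleaned = re.sub(r"/\*.*?\*/", "", raw, flags=re.DOTALL)
      cfg = json.loads(cleaned)

      print(cfg["gravity_mag"])  # 9.80665
      ```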
      
      posted in Ask your questions right here!
      Jetson Nano