ModalAI Forum

    Best posts made by James Strawson

    • RE: ToF v2 outdoor noise

      That's a typo on the website we've just corrected. The Starling 2 Max does not include a TOF sensor, and this sensor is not intended for outdoor use.

      I spent some time tuning the TOF point cloud processing filter params on Friday to reduce noise. These changes are in the dev branch now: https://gitlab.com/voxl-public/voxl-sdk/services/voxl-camera-server/-/commit/919c24bef21cbea2eb7be33853884f5f606a0763

      These changes are likely to help in the outdoor use case too, so I suggest giving them a go.

      posted in Image Sensors
      James Strawson
    • RE: VOXL ToF sensor rotated

      Hello Aks,

      I'm afraid there is no clean way in software to rotate the debug images other than copying the image byte-by-byte.

      Typically we only use the tof_depth and tof_confidence image pipes for quick debug. voxl-vision-px4 makes real use of the sensor by subscribing to its full point cloud pipe and then projecting the 3D points into space using the known location and orientation of the TOF sensor, as specified in /etc/modalai/extrinsics.conf.
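      The projection is a standard rigid-body transform, p_body = R · p_cam + T, with R and T taken from the extrinsics file. A minimal sketch of that step (the rotation and offset below are made-up illustration values, not actual Starling extrinsics):

```python
# Transform a ToF point from the camera frame into the body frame with
# an extrinsic rotation R and translation T: p_body = R @ p_cam + T.
# The values below are illustrative only; real ones come from
# /etc/modalai/extrinsics.conf.

def transform_point(R, T, p):
    """Apply p_out = R @ p + T using plain nested lists."""
    return [sum(R[i][j] * p[j] for j in range(3)) + T[i] for i in range(3)]

# Example: a 90-degree rotation about the x axis plus a small forward offset.
R = [[1, 0, 0],
     [0, 0, -1],
     [0, 1, 0]]
T = [0.05, 0.0, 0.0]

p_cam = [0.0, 0.0, 1.0]   # a point 1 m straight out of the lens
p_body = transform_point(R, T, p_cam)
print(p_body)  # -> [0.05, -1.0, 0.0]
```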

      Best,
      James

      posted in VOXL-CAM
    • RE: Crash during first flight - Log included

      @peterkrull

      Yes, the yaw estimate will track VIO unless you turn off the Yaw bit in EKF2_EV_CTRL and have EKF2_MAG_TYPE set to 0 (which it is on Starling by default). If you are doing mostly indoor flights you can completely disable the mag by setting EKF2_MAG_TYPE to 5 and SYS_HAS_MAG to 0.

      The default EKF2 setup for Starling is focused on indoor flight. It enables the magnetometer for calibration and logging but does not use the data. Every time I power on a Starling, QGC reports 0 degrees yaw until I rotate it. This EKF2 setup is described by the following file, which is loaded during the flashing process:

      https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-px4-params/-/blob/master/params/v1.14/EKF2_helpers/indoor_vio.params?ref_type=heads

      Note that we are very close to the tip of PX4 development, and PX4's integration of EKF2 is undergoing a lot of rework right now, including restructuring the EKF2_AID_MASK param into several other EKF2_XYZ_CTRL params.

      Please put the CAL_MAG0_ROT rotation parameter back to the default of 0. The driver handles rotating the sensor data from the IC into the frame of reference of the GPS/mag module; the CAL_MAG0_ROT parameter exists for the case where you mount the GPS/mag module in an unusual orientation.

      Finally, looking through the logs it seems you experienced a typical VIO failure, most likely due to a lack of visible features on takeoff. Starling is a very small drone with the tracking camera close to the ground, so I recommend trying again by taking off on a feature-rich surface or off of a stand that lets the camera see more of the room during the critical takeoff stage. Once it's up in the air with a good view it will lock in.
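      For reference, the full mag-disable setup described above boils down to these parameter values (a hypothetical snippet written in plain form; see the linked indoor_vio.params file for the exact syntax used during flashing):

```
EKF2_MAG_TYPE  5   # 5 = none: EKF2 ignores mag data entirely
SYS_HAS_MAG    0   # declare that the system has no usable mag
CAL_MAG0_ROT   0   # leave the driver-side rotation at its default
```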

      posted in Starling & Starling 2
    • RE: voxl-vision-px4 crashing potentially when px4 cpu load too high

      This is a known bug that was fixed in voxl-vision-px4 v0.8.8 which is currently on the development repo, scheduled to be merged into the stable repo in a week or so.

      The bug is not caused by PX4; it was due to the handling of the specific mavlink packet containing console data. It should behave normally when not using the PX4 console.

      posted in VOXL
    • RE: MAVLink Odometry X Y value 0

      Yes, it seems your installed voxl-vision-px4 and libmodal_pipe versions are from SDK 0.8, whereas your voxl-qvio-server is from SDK 0.7; that is the source of the mismatch.

      You can upgrade qvio server with opkg install voxl-qvio-server to fix the mismatch quickly, but I would recommend just flashing back to the SDK 0.7 platform release, since SDK 0.8 does not support TOF on VOXL1. The platform releases are tested as a whole and include a set of packages all known to work together. When individual packages get upgraded instead of the whole SDK, things tend to break.

      posted in Ask your questions right here!
    • RE: How to use voxl-logger to sample hires camera?

      Hi Matthew,

      When you tried to run voxl-logger with the hires camera it should have given the following error message:

      voxl:/$ voxl-logger --cam hires --samples 1
      connected to ch 0 cam server: /run/mpa/hires/
      ERROR only support RAW8, RAW16, and FLOAT32 images right now
      got NV21 instead
      

      The reason these are the only formats currently supported is that voxl-logger uses OpenCV to save and load images. OpenCV is quick enough for black-and-white images, but simply too slow to be practical for large color images.

      As an experiment I just made a new branch of voxl-mpa-tools with a slow but functional switch-case for the NV21 images produced by the hires camera. This will work for saving one or two images, but won't be able to keep up with a steady 30fps stream.

      Here is the code if you want to try:
      https://gitlab.com/voxl-public/modal-pipe-architecture/voxl-mpa-tools/-/commit/633adfdaf8162548d5df4218243a32e06d2ea7b1
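      For background on why NV21 needs its own path: the format stores a full-resolution 8-bit Y (luma) plane followed by interleaved V/U chroma at quarter resolution, so grabbing just the first width × height bytes already yields a usable grayscale image. A minimal sketch of that idea (not the code in the branch above):

```python
# NV21 layout: width*height bytes of Y (luma), followed by
# width*height/2 bytes of interleaved V/U chroma subsampled 2x2.
# The Y plane alone is a valid 8-bit grayscale image.

def nv21_luma(buf: bytes, width: int, height: int) -> bytes:
    expected = width * height * 3 // 2
    if len(buf) != expected:
        raise ValueError(f"expected {expected} bytes, got {len(buf)}")
    return buf[:width * height]

# Tiny 4x2 test frame: 8 luma bytes followed by 4 chroma bytes.
frame = bytes(range(8)) + bytes([128] * 4)
print(list(nv21_luma(frame, 4, 2)))  # -> [0, 1, 2, 3, 4, 5, 6, 7]
```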

      Migrating voxl-logger to use a faster image compression library and/or hardware acceleration is in the pipeline but I can't promise a release date for that I'm afraid.

      I hope this helps,
      James

      posted in VOXL m500 Reference Drone
    • RE: Depth Information Source Unclear

      Hi Griffin,

      DFS stands for depth-from-stereo and is only used on the M500 and Sentinel platforms that have stereo cameras.

      Starlings use the PMD TOF camera, which outputs 3D points directly from voxl-camera-server. That data is in the reference frame of the camera, with Z pointing out the lens, X to the right of the image, and Y pointing downwards.

      voxl-vision-hub then consumes that data and rotates it based on the mounting location described in /etc/modalai/extrinsics.conf and the current position/orientation of the drone as reported by VIO.

      voxl-mapper is not installed by default as it's not part of the core SDK; however, it is locally available and can be installed with apt install voxl-mapper without needing an internet connection.

      Best,
      James

      posted in Starling & Starling 2
    • RE: Starling 2 Inverted Yaw and Pitch

      @Kashish-Garg-0

      Yes, everything in the aerospace industry, including PX4 and ArduPilot, uses the standard NED coordinate frame with Z pointing down. A significant portion of the code in MAVROS seems to be flipping the direction of the Y and Z axes back and forth when converting from MAVLink to ROS topics.

      I suggest using the vvhub_body_wrt_local pipe instead of the qvio pipe, as the qvio data is centered around the IMU, whereas voxl-vision-hub transforms it to be centered around the body (COG) and corrects for gravity alignment.

      You can inspect the local pose on VOXL natively with

      voxl-inspect-pose vvhub_body_wrt_local
      OR
      voxl-inspect-pose --local

      Note this pipe was called vvpx4_body_wrt_local in SDK0.9 and older.

      I don't think ROS libraries like TF2 care which way gravity points. A transform is a transform; you can orient things however you prefer when you do your math. You should only need to change coordinate frames for the sake of visualization, if RViz doesn't let you define a default orientation or rotate the ground plane around in the UI.
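      For anyone untangling those axis flips: converting a position vector between NED (the aerospace/PX4 convention) and ENU (the ROS convention) swaps the first two axes and negates the third. A minimal sketch of just the position-vector case (MAVROS's real conversion also handles body frames and quaternions):

```python
# Swap a 3-vector between NED (north, east, down) and ENU (east, north, up).
# The mapping is its own inverse, so one function covers both directions.

def ned_enu_swap(v):
    x, y, z = v
    return (y, x, -z)

p_ned = (10.0, 5.0, -2.0)   # 10 m north, 5 m east, 2 m above the origin
p_enu = ned_enu_swap(p_ned)
print(p_enu)  # -> (5.0, 10.0, 2.0)
```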

      posted in Ask your questions right here!
    • RE: voxl 2 Failed to appy GPU delegate

      Models must contain only operations from a limited supported set and be quantized properly to run on the GPU. Please refer to the TensorFlow docs for details:

      https://www.tensorflow.org/lite/performance/gpu

      posted in VOXL 2
    • RE: Apriltag relocalization not relocalizing?

      Yes, that's the correct rotation matrix for a tag on the ground.

      For a tag on the wall with its center 1 m off the ground you can use:
      "T_tag_wrt_fixed": [0, 0, -1],
      "R_tag_to_fixed": [[0, 0, 1], [1, 0, 0], [0, 1, 0]]

      The second video of this docs page walks you through the voxl-vision-hub command line args to help you debug relocalization.

      posted in AprilTag Relocalization
    • RE: Creating a new client to read from voxl-tflite-server pipe

      Hi Steve,

      You would likely want your own separate project that uses libmodal_pipe to create the client interface; that can be a systemd service if you want it to start on boot. MPA encourages lots of separate microservices and projects, all compiled and run independently, to avoid monolithic programs that grow too big and become hard to maintain. Examples such as the voxl-inspect-*** tools in voxl-mpa-tools serve as good starting points, along with the modal-hello-client example you already found.
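      If you do go the systemd route, a minimal unit file for such a client might look like the following (the service name, binary path, and dependency here are placeholders, not shipped files):

```ini
# /etc/systemd/system/my-pipe-client.service  (hypothetical)
[Unit]
Description=Example MPA pipe client
After=voxl-tflite-server.service

[Service]
ExecStart=/usr/bin/my-pipe-client
Restart=on-failure

[Install]
WantedBy=multi-user.target
```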

      I hope this helps,
      James

      posted in Modal Pipe Architecture (MPA)
    • RE: voxl-dfs-server: stereo pointcloud coordinate frame

      Hi Eric,

      The stereo camera pair intrinsics and extrinsics are saved to /data/modalai/opencv_stereo_intrinsics.yaml and /data/modalai/opencv_stereo_extrinsics.yaml

      We then use the following OpenCV calls to generate the rectification maps for the left and right images:

      cv::Mat R1, P1, R2, P2;
      cv::Mat map11, map12, map21, map22;
      cv::Mat Q; // Output 4×4 disparity-to-depth mapping matrix, to be used later
      double alpha = -1; // default scaling, similar to our "zoom" parameter
      cv::stereoRectify(M1, D1, M2, D2, img_size, R, T, R1, R2, P1, P2, Q, cv::CALIB_ZERO_DISPARITY, alpha, img_size);

      // make sure this uses the CV_32FC2 map format so we can read it and convert to our own
      // MCV undistortion map format; map12 and map22 are empty/unused!!
      cv::initUndistortRectifyMap(M1, D1, R1, P1, img_size, CV_32FC2, map11, map12);
      cv::initUndistortRectifyMap(M2, D2, R2, P2, img_size, CV_32FC2, map21, map22);
      

      Those maps then get converted into a format that can be used by the Qualcomm hardware acceleration blocks for image dewarping, but the fundamental matrices and undistortion coefficients stay the same. I think the lines above are what you are after.

      You are correct that the disparity map is centered about the left camera; I believe this is typical in OpenCV land too.
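      The Q matrix that stereoRectify fills in encodes the standard pinhole relation between disparity and depth, Z = f · B / d, with f the focal length in pixels and B the baseline. A small sketch of that relation (the numbers are illustrative, not an actual VOXL calibration):

```python
# Depth from disparity for a rectified stereo pair: Z = f * B / d,
# where f is focal length in pixels, B baseline in meters, and
# d disparity in pixels. Values are illustrative only.

def depth_from_disparity(f_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for finite depth")
    return f_px * baseline_m / disparity_px

print(depth_from_disparity(500.0, 0.08, 10.0))  # -> 4.0 (meters)
```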

      Note that we do not do a precise calibration between the stereo and hires cameras at the factory; depending on how precise you need the alignment to be, you may need to use a tool like Kalibr to do that.

      posted in Ask your questions right here!