ModalAI Forum
    • Nikos MavrN

      I2C reading of 12S battery

      Support Request Format for Best Results
      • • • Nikos Mavr
      19
      0
      Votes
      19
      Posts
      387
      Views

      Eric KatzfeyE

      @Nikos-Mavr You're welcome! And can you please start a new forum post for this new issue? Thanks!

    • D

Starling - Path planning (blue line) is erratic and drone moves to wrong locations

      Ask your questions right here!
      • • • DronAlan
      19
      0
      Votes
      19
      Posts
      489
      Views

      Cliff WongC

      @DronAlan
      Hi there, the configuration & extrinsics files look fine (just switch offboard mode from figure_eight back to trajectory mode). Since you confirmed figure 8 and position mode are working fine and reflect true ground truth in voxl-portal, the next step is to dive a bit deeper into the processes:

      Just to confirm: are you using a loaded saved map? Orientation can differ in a loaded map depending on how it was saved. To move debugging forward, I would create the map while flying, then test plan-to-point to ensure it is working properly.

      Before takeoff, ssh onto the drone and run voxl-vision-hub --debug_offboard, then rerun the mapping test (plan a point 1 m in front of the hovering drone). You don't need to map the entire room, just the general area. In the ssh session, monitor the output from voxl-vision-hub. Take off and fly forward and backward in position mode, with some slight yaw motions, to generate a decent map in voxl-portal (recall we're generating a map in flight and then planning to a point, not using an old saved map). Then switch into trajectory mode; in the ssh session you should see a "Received trajectory has duration" message printed. Go to voxl-portal, plan a point, then execute go-to-point. In the ssh session you should see "Received insert command." printed on the terminal, followed by the drone's forward commands (i.e. setpoints) toward your planned point. I expect the commanding XYZ values to increase in the positive X direction. Please post the output of the ssh session here.

      If the drone moves backwards as you've been seeing while the commanding XYZs are moving forward, we have found our problem (a PX4 issue, in which case I'll need your params file). If the drone moves forward while the commanding XYZs are moving forward, then it's how your older maps are being saved, and we can go from there.
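      The check above can be sketched as a small helper that scans a captured debug log for the commanding setpoints and verifies X is trending forward. This is a hypothetical helper, not part of the SDK; the exact "commanding:" line format is an assumption based on the description above, so adjust the regex to the real voxl-vision-hub output.

      ```python
      # Hypothetical helper for checking a captured voxl-vision-hub
      # --debug_offboard log. ASSUMPTION: setpoint lines contain
      # "commanding:" followed by X, Y, Z values; adapt to the real format.
      import re

      _SETPOINT = re.compile(
          r"commanding:\s*(-?\d+\.?\d*)\s+(-?\d+\.?\d*)\s+(-?\d+\.?\d*)")

      def commanded_x_is_increasing(log_lines):
          """Return True if the commanded X setpoints increase monotonically."""
          xs = [float(m.group(1)) for line in log_lines
                if (m := _SETPOINT.search(line))]
          # Need at least two setpoints to judge a trend
          return len(xs) >= 2 and all(b > a for a, b in zip(xs, xs[1:]))
      ```

      If this returns True while the drone physically flies backwards, that points at the PX4 side rather than the planner, matching the split described above.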

    • R

      Hadron ov64b snapshots have a vertical image artifact

      Video and Image Sensors
      • • • restore
      17
      0
      Votes
      17
      Posts
      364
      Views

      C

      @Alex-Kushleyev Thank you!

      I was able to get both the full YUV capture and the JPEG conversion working. We also noticed that the vertical artifact seems to disappear in these images. Regarding the resolution we were using when we saw the artifacts: it was 9248 × 6944 on the old drivers.

    • syamala kotireddyS

      Starling 2 / VOXL2 M0129 ESC not detected during voxl-esc scan or firmware upgrade

      ESCs
      • • • syamala kotireddy
      9
      0
      Votes
      9
      Posts
      172
      Views

      ModeratorM

      @boron Please submit an RMA at https://modalai.com/rma and refer to this thread. Include your shipping information and we'll send you a new one.

    • R

      Starling 2 Max Motor Catches

      Starling & Starling 2
      • • • RyanH
      9
      0
      Votes
      9
      Posts
      249
      Views

      Alex KushleyevA

      @RyanH, from the spin log above, I can only tell that the motor attempts to spin up but restarts after the open-loop sinusoidal spinup is done and the ESC detects that the motor is not spinning.

      You should try to clear the debris by spinning the motor by hand first (or use compressed air) and then use the test script to spin for longer periods of time without propeller (changing speeds, etc).

      You could use a spin step command to automate increasing and decreasing spin speed (without rpm control, since it's not needed):

      ./voxl-esc-spin-step.py --id 0 --step-delay 2.5 --step-frequency 1 --power 20 --step-amplitude 30

      Alex

    • C

      Running 4 Ar0144s on M0188

      VOXL 2 Mini
      • • • cbay
      8
      0
      Votes
      8
      Posts
      198
      Views

      C

      @Alex-Kushleyev Thanks so much, we have the resistors on the way, will let you know how this goes.

      Once this has been done, I assume we should have no issues getting all of the cameras working with OpenVINS.

    • Dan JenningsD

      Voxl2 + M0041 RevB Battery Monitor on Arducopter

      Ask your questions right here!
      • • • Dan Jennings
      8
      0
      Votes
      8
      Posts
      214
      Views

      Eric KatzfeyE

      @Dan-Jennings I'm also guessing that the driver may not be detecting the hardware and so is not able to get the data for battery_status. Have you tried swapping in some other hardware to see if the problem is associated with a particular unit?

    • AniruddhaA

      PX4 calibration

      Starling & Starling 2
      • • • Aniruddha
      7
      0
      Votes
      7
      Posts
      169
      Views

      Aaron PorterA

      @Aniruddha
      On QGC, do the parameters ever fully load? I ask because if the connection to the GCS is not fast or strong enough for QGC to fully download the parameters, then the other tabs in QGC, like Sensors and Actuators, will not appear, which is why you are having an issue doing the level-horizon calibration from QGC. I know QGC is seeing the video feed, meaning you are connected; is the green bar going across the full length of the fly view bar in QGC?

    • J

      How to fix the UVC camera DEVICE ID

      Video and Image Sensors
      • • • Jskim
      7
      0
      Votes
      7
      Posts
      231
      Views

      J

      Thank you for the quick reply.
      I will try doing as you instructed and post the results.

      Thank you.
      Kim

    • Daehan WonD

      Issue with USB Camera Disconnecting on VOXL2

      Ask your questions right here!
      • • • Daehan Won
      6
      0
      Votes
      6
      Posts
      130
      Views

      VinnyV

      OK @Daehan-Won
      Yeah, I'd try to run one first on its own, without the hub, to rule that out.
      Keep us posted.
      Thanks!

    • Daehan WonD

      Question about sonar sensor(distance sensor) in voxl2

      VOXL 2
      • • • Daehan Won
      6
      0
      Votes
      6
      Posts
      132
      Views

      Daehan WonD

      @Eric-Katzfey Thank you for the reply!

      I will try creating a custom build using the document you provided.

    • J

      Image Stabilization calibration and pipe size clarification

      Ask your questions right here!
      • • • jameskuesel
      6
      0
      Votes
      6
      Posts
      208
      Views

      Alex KushleyevA

      @jameskuesel ,

      For item 1 (a frame from the hires recording with EIS), you can do this already by capturing the YUV from the hires_misp_color stream using the existing tool voxl-record-raw-image.

      For item 2, the full frame can also be captured (using the same tool), but only in raw Bayer format. This is good for offline processing if you want the maximum image quality; however, this image will not have any processing applied from MISP.

      Additionally, since MISP supports multiple outputs (MISP channels), as you already know, you could set up one channel as a full-frame image (which you normally don't stream), with EIS off.

      So with the correct voxl-camera-server.conf, you should be able to get all the streams you want:

      • full frame
      • hires recording
      • low-res streaming

      And grab YUVs from any of those streams. The last part would then just be encoding them to JPG (which could be done separately or as part of voxl-record-raw-image, which we could add).

      If you send me your voxl-camera-server.conf (or the part specific to the hires camera), I can update it to show you how to get the three streams.

      Alex

    • R

      Trigger Hadron camera with DO_SET_CAM_TRIGG_DIST

      VOXL 2
      • • • restore
      6
      0
      Votes
      6
      Posts
      222
      Views

      Eric KatzfeyE

      @restore There are a few examples of code that listens for Mavlink messages in our SDK. For example, in voxl-mpa-tools, take a look at the source code for voxl-inspect-mavlink.c. https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-mpa-tools/-/blob/master/tools/voxl-inspect-mavlink.c?ref_type=heads

      It would probably be a good idea to run voxl-inspect-mavlink on the mavlink_onboard and mavlink_to_gcs pipes to see which one carries the desired Mavlink message. Then create your own application that listens for that Mavlink message on that pipe.
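      Once voxl-inspect-mavlink tells you which pipe carries the message, a custom listener mostly needs to pull message IDs out of the raw byte stream it reads from that pipe. As a minimal sketch (the pipe-reading loop itself is omitted, and it assumes unsigned MAVLink v2 frames), the framing is standard: a 10-byte header starting with magic 0xFD, with a 24-bit little-endian message ID at header offset 7:

      ```python
      # Minimal sketch of picking MAVLink v2 message IDs out of a raw byte
      # stream, e.g. bytes read from an MPA pipe such as mavlink_onboard.
      # ASSUMPTION: unsigned frames (no 13-byte signature trailer).
      MAVLINK2_MAGIC = 0xFD

      def extract_msg_ids(data: bytes):
          """Return (msgid, payload_len) for each MAVLink v2 frame header found."""
          ids = []
          i = 0
          while i + 10 <= len(data):          # a v2 header is 10 bytes
              if data[i] != MAVLINK2_MAGIC:
                  i += 1
                  continue
              payload_len = data[i + 1]
              # msgid is a 24-bit little-endian field at header offset 7..9
              msgid = data[i + 7] | (data[i + 8] << 8) | (data[i + 9] << 16)
              ids.append((msgid, payload_len))
              i += 10 + payload_len + 2       # header + payload + CRC
          return ids
      ```

      Filtering on the msgid you saw in voxl-inspect-mavlink then tells you whether your pipe choice was right before you wire up real message decoding.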

    • Jesus CardenasJ

      Cannot Download QDL Image File

      Ask your questions right here!
      • • • Jesus Cardenas
      5
      0
      Votes
      5
      Posts
      109
      Views

      Z

      @Jesus-Cardenas There seems to be some work that still needs to be done on that folder. Please allow us a couple of days and we will update this inquiry!

    • C

      Two-camera VIO non-functional in voxl-open-vins-server 0.6.0 (SDK 1.6.3)

      GPS-denied Navigation (VIO)
      • • • cbay
      5
      0
      Votes
      5
      Posts
      112
      Views

      Alex KushleyevA

      I have uploaded the latest ar0144 drivers with fsin versions for all camera slots here : https://storage.googleapis.com/modalai_public/temp/ar0144/ar0144_drivers_20260402.zip

      There are two additional files (inside the zip), which you should copy to /usr/lib/camera to make sure you have the latest updates:

      • com.qti.sensor.ar0144.so -- contains functions for exposure / gain control (we recently made some improvements to make gain control smoother)
      • com.qti.tuned.default.bin -- fixed gain scaling so that min gain (1.0x analog gain) is equal to 100 in the HAL3 gain units, not 54 (and the max gain will be 29.6 = 29600 for AR0144). You can also double-check this using voxl-camera-server -l, and update the min/max gain settings in your voxl-camera-server.conf to make sure you are using the full range.

      My colleague will follow up with a diagram showing the locations of the DNI resistors (0402, 0-ohm) that need to be installed to enable the sync signal for camera slots 1 and 3.

      Alex

    • N

      Yaw Error Estimate on PX4v1.15

      Ask your questions right here!
      • yaw flight core v2 • • ndwe
      5
      0
      Votes
      5
      Posts
      129
      Views

      Eric KatzfeyE

      @ndwe Whether you use a VOXL or a Jetson as a companion computer shouldn't make any difference unless you want to use the VOXL SDK. The version you run is totally up to you; if our v1.14.0-based version works for you, then use it. We do plan to transition to v1.17.x in the near future. I don't think the issues you are seeing are related to anything specific in the Flight Core v2 hardware setup; I would expect that if you swapped it out for another brand you would see the same issue. I would ask on the PX4 forums to see if others have experienced the same issues.

    • I

      Starling 2 loses all cameras; voxl-camera-server -l reports 0 cameras even after voxl-configure-cameras 27, voxl-configure-mpa, and reflash

      Ask your questions right here!
      • • • irw
      4
      0
      Votes
      4
      Posts
      130
      Views

      VinnyV

      Hi @irw
      Can you please post High Res and clear photos of both sides of the VOXL 2?

    • M

      Which STEP file do I need for Starling2 MAX GPS mast?

      3D Models
      • • • MikeD
      3
      0
      Votes
      3
      Posts
      38
      Views

      M

      @Alex-Kushleyev Awesome, thank you!

    • Q

      Time Of Flight (TOF) camera output FPS divided by 5 after upgrading from SDK 1.5.0 to SDK 1.6.3 (Starling2 Max C29)

      Support Request Format for Best Results
      • • • qt
      3
      0
      Votes
      3
      Posts
      83
      Views

      Q

      @Alex-Kushleyev, thank you for your answer.
      I can confirm your assumption: when I set the decimator to 1, the fps is not divided.
      Here are the tests I did on SDK 1.5.0 and SDK 1.6.3:

      test with SDK 1.5.0:

      fps = 10 | standby_enabled = false or true | decimator = 5
      Pipe Name | bytes | wide | hgt |exp(ms)| gain | frame id |latency(ms)| fps | mbps | format
      tof_depth | 43200 | 180 | 240 | 2.90 | 0 | 1540 | 28.0 | 10.0 | 3.5 | RAW8
      timestamp(ms)| w | h | Zmax | center point (m) (conf)
      7170622 | 240 | 180 | 7.1 | -0.0 0.0 0.0 0

      fps = 60 | standby_enabled = false or true | decimator = 5
      Pipe Name | bytes | wide | hgt |exp(ms)| gain | frame id |latency(ms)| fps | mbps | format
      tof_depth | 43200 | 180 | 240 | 1.30 | 0 | 2155 | 9.3 | 59.9 | 20.7 | RAW8
      timestamp(ms)| w | h | Zmax | center point (m) (conf)
      10666991 | 240 | 180 | 3.1 | 0.0 0.0 0.0 0

      fps = 60 | standby_enabled = false or true | decimator = 5
      Pipe Name | bytes | wide | hgt |exp(ms)| gain | frame id |latency(ms)| fps | mbps | format
      tof_depth | 43200 | 180 | 240 | 1.18 | 0 | 869 | 9.4 | 59.9 | 20.7 | RAW8
      timestamp(ms)| w | h | Zmax | center point (m) (conf)
      10753813 | 240 | 180 | 3.1 | 0.0 0.0 0.0 0

      test with SDK 1.6.3:

      fps = 10 | standby_enabled = true or false | decimator = 1
      Pipe Name | bytes | wide | hgt |exp(ms)| gain | frame id |latency(ms)| fps | mbps | format
      tof_depth | 43200 | 180 | 240 | 3.02 | 0 | 260 | 29.7 | 10.0 | 3.5 | RAW8
      timestamp(ms)| w | h | Zmax | center point (m) (conf)
      2052194 | 240 | 180 | 7.1 | -0.0 0.0 0.0 0

      fps = 60 | standby_enabled = true or false | decimator = 1
      Pipe Name | bytes | wide | hgt |exp(ms)| gain | frame id |latency(ms)| fps | mbps | format
      tof_depth | 43200 | 180 | 240 | 1.18 | 0 | 7532 | 9.3 | 59.9 | 20.7 | RAW8
      timestamp(ms)| w | h | Zmax | center point (m) (conf)
      2190721 | 240 | 180 | 3.1 | 0.0 0.0 0.0 0

      fps = 60 | standby_enabled = true or false | decimator = 10
      Pipe Name | bytes | wide | hgt |exp(ms)| gain | frame id |latency(ms)| fps | mbps | format
      tof_depth | 43200 | 180 | 240 | 1.20 | 0 | 83 | 10.1 | 6.0 | 2.1 | RAW8
      timestamp(ms)| w | h | Zmax | center point (m) (conf)
      2469720 | 240 | 180 | 3.1 | 0.0 0.0 0.0 0

      As you can see, for SDK 1.5.0 the fps is NEVER divided, regardless of the value of standby_enabled.
      For SDK 1.6.3, the fps is ALWAYS divided, regardless of the value of standby_enabled.
      So my problem is solved, but I think you have a bug in the handling of the parameter 'standby_enabled'.
      I quickly investigated the code of voxl-camera-server and I can't find where you are using 'standby_enabled'. Here is the search result across the whole project:

      # Query: standby_en
      # ContextLines: 2
      6 results - 2 files

      include/common_defs.h:
        370    modal_exposure_msv_config_t ae_msv_info; ///< ModalAI AE data (MSV)
        371
        372:   int standby_enabled; ///< Standby enabled for lidar
        373    int decimator; ///< Decimator to use for standby
        374

      src/config_file.cpp:
        129    printf(" gain_min : %d\n", cams[i].ae_msv_info.gain_min);
        130    printf(" gain_max : %d\n", cams[i].ae_msv_info.gain_max);
        131:   printf(" standby_enabled: %d\n", cams[i].standby_enabled);
        132    printf(" decimator: %d\n", cams[i].decimator);
        133    printf(" independent_exposure:%d\n", cams[i].ind_exp);

        642    // standby settings for tof only
        643    if(is_tof_sensor(cam->type)) {
        644:   json_fetch_bool_with_default(item, "standby_enabled", (int*)&cam->standby_enabled, cam->standby_enabled);
        645    json_fetch_int_with_default (item, "decimator", &cam->decimator, cam->decimator);
        646    }

      Maybe you should add a condition in your function PerCameraMgr::ProcessTOFPreviewFrame():

      void PerCameraMgr::ProcessTOFPreviewFrame(mpa_ion_buf_t* buffer_info, camera_image_metadata_t meta)
      {
          tofFrameCounter++;
          if(grab_cpu_pitmode_active() && tofFrameCounter % (int)configInfo.decimator != 0){
              return;
          }

          auto noStridePlaneSize = static_cast<size_t>(pre_width * pre_height * 1.5);
          auto realWidth = static_cast<uint32_t>(pre_width * 1.5);
          uint8_t* noStridePlane;

          if (buffer_info->stride != realWidth) {
              noStridePlane = new uint8_t[noStridePlaneSize];
              removePlaneStride(buffer_info->stride, realWidth, buffer_info->height,
                                (uint8_t*) buffer_info->vaddress, noStridePlane);
          } else {
              noStridePlane = static_cast<uint8_t*>(buffer_info->vaddress);
          }

          uint16_t srcPixel16[pre_width * pre_height] = {0};

          // NOTE: we don't actually publish this particular metadata to the pipe.
          // TOF data is published separately, in a very different way to cameras.
          meta.format     = IMAGE_FORMAT_RAW8;
          meta.size_bytes = pre_width * pre_height;
          meta.stride     = pre_width;

          Mipi12ToRaw16(meta.size_bytes, noStridePlane, srcPixel16);
          tof_interface->ProcessRAW16(srcPixel16, meta.timestamp_ns);

          if (buffer_info->stride != realWidth) {
              delete[] noStridePlane;
          }

          M_VERBOSE("Sent tof data to royale for processing\n");
          return;
      }
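      A minimal sketch of the behavior being suggested, namely that decimation should only kick in when standby_enabled is set. The names here are illustrative, not the actual voxl-camera-server symbols; the 60 → 6.0 fps figure matches the decimator = 10 test above.

      ```python
      # Hypothetical sketch of TOF frame decimation gated on standby_enabled.
      # Function and parameter names are illustrative, not SDK symbols.
      def keep_frame(frame_counter: int, decimator: int, standby_enabled: bool) -> bool:
          """Drop frames per the decimator only when standby is enabled."""
          if not standby_enabled or decimator <= 1:
              return True
          return frame_counter % decimator == 0

      def effective_fps(configured_fps: float, decimator: int, standby_enabled: bool) -> float:
          """Resulting output rate, e.g. 60 fps with decimator 10 -> 6.0 fps."""
          if standby_enabled and decimator > 1:
              return configured_fps / decimator
          return configured_fps
      ```

      With a guard like this, standby_enabled = false would leave the fps untouched, which is the SDK 1.5.0 behavior shown in the tests above.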

      Best regards
      Quentin

    • David AveryD

      HDMI output from Seeker Vision FPV Goggles

      FPV Drones
      • • • David Avery
      3
      0
      Votes
      3
      Posts
      89
      Views

      David AveryD

      @tom Thank you!