ModalAI Forum

    Posts made by Alex Kushleyev

    • RE: Problem of overconsumption. HELP!

      @Daniel-Rincon ,

      If you create a diagram of your image processing pipeline, to the best of your knowledge (which streams come from which cameras and where they go), we may be able to suggest some optimizations in voxl-camera-server settings or elsewhere.

      Also, please consider that VOXL2 cannot run at full power continuously without cooling (which is generally true for most computing devices). The challenge is to find a balance between the computing needs while not flying vs flying. The drone in flight can provide airflow for cooling, or an additional fan can be added to help reduce the VOXL2 temperature.

      When the VOXL2 CPU reaches around 95 degrees C, the system will start throttling the CPU cores to prevent overheating, which will impact processing speeds, and some software components may no longer be able to run properly.
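
      If you want to keep an eye on this, here is a quick way to check temperatures and clock speeds on VOXL2 (standard Linux sysfs paths; the exact thermal zone layout varies by platform, so treat this as a rough sketch):

      # print each thermal zone type and temperature (millidegrees C)
      for z in /sys/class/thermal/thermal_zone*; do echo "$(cat $z/type): $(cat $z/temp)"; done
      # current CPU core frequencies, to see whether throttling has kicked in
      cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq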

      Alex

      posted in Ask your questions right here!
    • RE: MDK-M0107-2-06 VOXL Hi-res RGB Sensor Datasheet

      @alexvolt , the datasheet for M0107 can be found here : https://docs.modalai.com/M0107/ . This is a Sony IMX412 camera.

      The connector pin-out provides the power supply voltages.

      This module is designed to work with VOXL products, and we would not be able to provide much support for connecting this camera module to another platform. The main reason is that our inability to test the custom setup ourselves would make any debugging very difficult.

      Alex

      posted in Image Sensors
    • RE: Help with pympa

      Hello @itzikwds,

      Sorry for the delay. We can extend pympa to support the ai_detection class.

      Here is an example of an application that receives the detection data: https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-mpa-tools/-/blob/dev/tools/voxl-inspect-detections.c

      However, since you need this in python, pympa would need to be extended to support the detection data.

      Can you please confirm that you still need this functionality or did you find another solution?

      Alex

      posted in VOXL SDK
    • RE: FLIR Lepton on Starling 2 Max

      @RyanH , the FLIR Lepton socket should be included with your Starling 2 Max; you should be able to see it at the bottom. Assuming the socket is there, you just need to plug in the FLIR Lepton 3.5 and enable voxl-lepton-server.

      After installing voxl-lepton-server, you will need to configure the correct I2C and SPI ports, as described here : https://docs.modalai.com/M0173/#downward-range-finder-and-flir-lepton

      The link you provided seems to be a standard FLIR Lepton 3.5, so it should work (160x120 resolution).
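
      The typical steps look something like this (a sketch, assuming voxl-lepton-server follows the usual VOXL systemd service pattern; see the docs link above for the port configuration details):

      apt-get update && apt-get install -y voxl-lepton-server
      # configure the I2C / SPI ports per the M0173 docs, then:
      systemctl enable voxl-lepton-server
      systemctl start voxl-lepton-server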

      posted in Starling & Starling 2
    • RE: Stereo Image sensor selection

      @SKA ,

      If you would like to get the joined stereo image, I believe you need to disable MISP. Currently you have en_misp set to true; please set it to false and try again.

      MISP is not set up to join two images into one. Disabling MISP will fall back to the original behavior that you are expecting.
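
      For reference, a quick way to locate the flag and restart the service (assuming the config lives at the standard /etc/modalai path):

      grep -n "en_misp" /etc/modalai/voxl-camera-server.conf
      # edit the file, set "en_misp" to false for the stereo entry, then:
      systemctl restart voxl-camera-server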

      I should add a warning about this, when MISP is enabled. Thank you.

      Please let me know if disabling MISP works for you.

      Alex

      posted in Video and Image Sensors
    • RE: Stereo Image sensor selection

      @SKA , how many cameras do you have connected?

      Also, what does “voxl-camera-server -l” show and what sensormodules do you have in /usr/lib/camera?

      Alex

      posted in Video and Image Sensors
    • RE: AR0144 RGB output on VOXL2

      @Jordyn-Heil ,

      I am working on some examples of how to use the GPU with images received from the camera server via MPA (raw bayer, yuv, rgb).

      I just wanted to clarify your use case.

      • Currently, the camera server receives a RAW8 bayer image from the AR0144 camera (it can be RAW10/12, but that is not yet working reliably; a different story)
      • the camera server then uses the CPU to de-bayer the image into RGB
      • RGB is then sent out via MPA and can be viewed by voxl-portal (which supports RGB) or can be encoded via voxl-streamer, which also supports RGB input.

      Now, in order to do any image stitching, the following would need to be done in the client application:

      • subscribe to the multiple camera streams (whether bayer or RGB)
      • load camera calibration (intrinsic) for each camera
      • load extrinsic parameters for each camera
      • load LSC (lens shading correction) tables for each camera (or could be the same for all, close enough)
      • for simplicity, let's assume that one large output buffer is allocated for all cameras to be pasted into (panorama image). When individual images come in, the job of the app is to do the following:
        • apply LSC corrections to the whole image
        • (optional) perform white balance corrections (to be consistent across all cameras) (would need some analysis across all images)
        • undistort the image according to the fisheye calibration params
        • project the image according to the extrinsics calibration into the stitched image

      This ignores the time synchronization aspect for now; basically, as each image comes in, the stitched image is updated.

      Also, for your algorithm development, do you work with RGB image or YUV?

      I am planning to share something simple first, which will subscribe to the 3 or 4 AR0144 cameras, load some intrinsics and extrinsics and overlay the images in a larger image (without explicitly calibrating the extrinsics). Then we can go from there.

      Alex

      posted in Image Sensors
    • RE: AR0144 RGB output on VOXL2

      @Jordyn-Heil ,

      Sorry for the delay.

      In order to test the AR0144 color RTSP stream, you can use voxl-streamer. I just pushed a small change that fixes the pipe description for the AR0144 color output from YUV to RGB. The current implementation publishes RGB, but the MPA pipe description was set to YUV, so voxl-streamer did not work.

      So right now, I can run voxl-streamer:

      voxl2:/$ voxl-streamer -i tracking_front_misp_color
      Waiting for pipe tracking_front_misp_color to appear
      Found Pipe
      detected following stats from pipe:
      w: 1280 h: 800 fps: 30 format: RGB
      Stream available at rtsp://127.0.0.1:8900/live
      A new client rtsp://<my ip>:36952(null) has connected, total clients: 1
      Camera server Connected
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      rtsp client disconnected, total clients: 0
      no more rtsp clients, closing source pipe intentionally
      Removed 1 sessions
      

      voxl-streamer will receive the RGB image and use the hardware encoder to encode the images into the video stream.
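
      To view it from a host PC on the same network, any RTSP client should work; for example, with ffplay installed (replace <voxl2-ip> with the VOXL2's IP address):

      ffplay rtsp://<voxl2-ip>:8900/live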

      The corresponding changes are on the dev branch of voxl-camera-server.

      Will that work for you, or do you need voxl-camera-server to publish the encoded stream? In that case, since voxl-camera-server only supports encoding YUVs, I would need to convert to YUV and then feed the image into the encoder.

      I will follow up regarding GPU stuff in the next post.

      Alex

      posted in Image Sensors
    • RE: voxl-camera-server saying Boson doesn't support 640x512 resolution when starting

      @Matthew-Wellner

      Please try this kernel (it tells the system to use CCI3 instead of CCI2 when using camera in slot 2) : https://storage.googleapis.com/modalai_public/temp/test_kernels/qti-ubuntu-robotics-image-m0054-boot-slot2-cci3.img

      With this change, the CCI communication to the Boson will actually happen on the VOXL2's J7 upper path (using CCI3) and will go over the hardware path that is known to work for the EO camera. By the way, after the initial probing of the Boson during start-up of the camera server, there is no CCI communication to it at all.

      I just tested it on my setup with Hadron in J7.

      adb reboot bootloader
      fastboot boot qti-ubuntu-robotics-image-m0054-boot-slot2-cci3.img
      

      Then ADB into VOXL2; when you run voxl-camera-server -l, you will not detect any cameras yet.

      enable CCI mux on M0159 for J7:

      voxl-gpio -m 6 out && voxl-gpio -w 6 1
      

      check if cameras are detected

      voxl2:/$ voxl-camera-server -l
      DEBUG:   Attempting to open the hal module
      DEBUG:   SUCCESS: Camera module opened on attempt 0
      DEBUG:   ----------- Number of cameras: 2
      
      DEBUG:   Cam idx: 0, Cam slot: 2, Slave Address: 0x00D4, Sensor Id: 0x00FF
      DEBUG:   Cam idx: 1, Cam slot: 3, Slave Address: 0x006C, Sensor Id: 0x6442
      DEBUG:   Note: This list comes from the HAL module and may not be indicative
      DEBUG:   	of configurations that have full pipelines
      
      DEBUG:   Number of cameras: 2
      

      Then, run voxl-camera-server and view the streams via voxl-portal.
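
      For example (a sketch; stopping the background service first makes it easier to watch the debug output in the foreground):

      systemctl stop voxl-camera-server
      voxl-camera-server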

      Please let me know whether this works.

      Alex

      posted in Ask your questions right here!
    • RE: voxl-camera-server saying Boson doesn't support 640x512 resolution when starting

      @Matthew-Wellner ,

      Yes, it looks like the cable is not an issue. There is something else we can try: I can build a kernel that uses CCI3 for camera slot 2, so the communication with the Boson would go through the EO camera connector between M0181 and M0159. If this works, we can at least confirm that everything else works.

      If this works, you may at least be able to use the Hadron while you request a replacement.

      Can you let me know which SDK you are using and your kernel version (mach.var) when you run voxl-version?

      Alex

      posted in Ask your questions right here!
    • RE: voxl-camera-server saying Boson doesn't support 640x512 resolution when starting

      @Matthew-Wellner , don't worry about the OV64B not showing up when plugged into J6 (first, a different GPIO would need to be turned on to enable the CCI mux, but even if we did that, the camera still would not work because, as I explained earlier, J6 on VOXL2 with the default kernel is set up for a single camera or a stereo combo).

      In any case, yes, it looks like the CCI connection in the Boson path is broken somewhere between M0181 and M0159.

      If you are comfortable swapping ucoax cables, you could try that and see if the cable is the issue (swap the places of the two cables between the Boson and the OV64B). Just make sure you plug them in correctly and do not cross the connections; see https://docs.modalai.com/voxl2-hadron/#warning

      It sounds like a case for an RMA, which is unfortunate. We do test all of these assemblies. Sorry about that!

      Alex

      posted in Ask your questions right here!
    • RE: Rear Tracking Camera Not Working

      @george-kollamkulam , I am assuming the drone is already set up with the correct camera config (you can also re-do it using voxl-configure-cameras and select 27). If your drone was incorrectly set up in the C28 configuration, you would see the same error.

      One way to check this before re-running voxl-configure-cameras is to list the contents of /usr/lib/camera and see which sensormodule files are in there. For C27, you should see three AR0144 sensormodules (for slots 0, 2, 6), one IMX412 sensormodule for slot 1, and another one for the TOF in slot 3. From the debug info when running voxl-camera-server -l, the three types of cameras have different sensor IDs, as follows (the commands to check this are shown after the list):

      AR0144: 0x0356
      IMX412: 0x0577
      TOF : 0x2975
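
      For example:

      ls /usr/lib/camera
      voxl-camera-server -l    # look for sensor ID 0x0356 in camera slot 2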

      Assuming the step above results in the same issue (the AR0144 camera (sensor ID 0x0356) not detected in camera slot 2), then I think you need to inspect the M0173 board, specifically the connection of the camera in port J5 of M0173. Make sure the ucoax cable is plugged in properly, similar to the others. Unfortunately, that will require some disassembly.

      You could also first double check the ucoax connection at the camera itself if that is easier to access. If you have a spare AR0144 camera, you could disconnect the camera cable from the non-functioning camera and connect the spare AR0144 to see whether the issue is the camera or the cable (or downstream connections).

      Hopefully it is just a software mis-configuration, but please inspect the cables as suggested.

      Alex

      posted in Ask your questions right here!
    • RE: voxl-camera-server saying Boson doesn't support 640x512 resolution when starting

      @Matthew-Wellner , sorry for the delay.

      First off, please update your libmodal-journal package. You are using a version that has a bug that prevents most of the output messages from voxl-camera-server from being printed. You can just use apt to update it, or download it directly from : http://voxl-packages.modalai.com/dists/qrb5165/dev/binary-arm64//

      Now, as I understand it, you are only able to detect the Boson in slot 5, which is not where it is actually connected.

      Let me explain why Boson is successfully detected in slot 5 (but won't stream) and how this can be used to diagnose the issue.

      • Inside the Hadron camera, the two cameras (IR and EO) have a single I2C (CCI) bus that they use to communicate - both cameras are connected to the same bus
      • Each VOXL2 camera slot has an I2C (CCI) port assigned to it in hardware
      • VOXL2 only has 4 independent CCI controllers, but 6 physical camera slots (in the 3 camera connectors, 2 per connector)
      • the CCI ports are shared between several camera slots, as follows (mapping of camera slot to CCI #):
      slot 0 : CCI 0
      slot 1 : CCI 1
      slot 2 : CCI 2
      slot 3 : CCI 3
      slot 4 : CCI 1
      slot 5 : CCI 3
      

      This can also be seen from the voxl2 connector pinouts for J6, J7, J8 (look for CAMx_CCIy.. pins on each connector) : https://docs.modalai.com/voxl2-connectors/#j6-pin-out

      This means that CCI1 is shared between slots 1 and 4, and CCI3 is shared between slots 3 and 5.

      Now, on the M0159 adapter, the single CCI connection coming from the Hadron is split into two, and the command (from the instructions) to enable the mux that bridges the two CCI busses on M0159 (from the two camera slots) is the following: voxl-gpio -m 6 out && voxl-gpio -w 6 1. Then, from M0159, you have two sets of cables that connect to VOXL2 via M0181.

      If the Hadron is connected to J7, then the single CCI connection will be bridged into slot 2's and slot 3's CCI ports (which are CCI 2 and CCI 3). Since they are bridged, both cameras can be detected on either CCI 2 or CCI 3.

      Also, one important detail is that both the IR and EO cameras in the Hadron are always on, as opposed to the typical cameras we use, which have a reset signal that is held off when the camera is off. When a camera is probed / used, the camera pipeline first turns it on and then uses it. Normally, in order to probe a camera in slot 5, it would first need to be turned on via its reset pin, and since the Boson is actually connected to slot 2, that reset pin would have been the wrong one. But because the Boson is always on, the lack of a reset plus the shared CCI bus allows you to probe it in the wrong slot.

      The fact that you were able to detect the Boson in slot 5 (CCI3) but not in slot 2 (CCI2) means that the CCI2 hardware path is broken (connector or wire issue). Also, if you remove the OV64B sensormodule, you should be able to probe the Boson in slot 3 :).
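
      If you want to try that slot 3 probe, something along these lines should work (the exact OV64B sensormodule file name is an assumption; check what is actually in /usr/lib/camera first):

      ls /usr/lib/camera | grep -i ov64b
      mkdir -p /data/sensormodule-backup
      mv /usr/lib/camera/com.qti.sensormodule.ov64b*.bin /data/sensormodule-backup/
      voxl-camera-server -l
      # restore afterwards:
      mv /data/sensormodule-backup/com.qti.sensormodule.ov64b*.bin /usr/lib/camera/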

      Detecting the camera in the wrong slot will not get you any images, since the MIPI data connection goes to slot 2, not 5, even though the CCI busses are shared / bridged.

      You should do a quick test: plug your M0181 adapter into J6 and see if the Boson is detected in slot 0. If yes, then the issue may be in the VOXL2's J7 connector. If not, then the issue is upstream (between M0181 and the Hadron). Please note that the EO camera will not work in slot 1 (J6U) because, with the default kernel, J6U is set up to be used for the second camera in a stereo pair. A hires camera in J6U will be detected (probed successfully) but will not stream.

      Please inspect the wiring, try to probe the Boson in J6L (slot 0), and let me know the result.

      Alex

      posted in Ask your questions right here!
    • RE: EIS merge

      Hello @SKA,

      I just posted the IMX214 sensormodules here : https://storage.googleapis.com/modalai_public/temp/imx214_test_bins/imx214_eis_drivers_20250815.zip

      These support the new resolutions 4196x3120 and 2096x1560 needed for MISP / EIS.

      IMX214 (M0025-2)
      
      4196x3120 - calibration file resolution 4196x3120 or 2098x1560
      2096x1560 - calibration file resolution 2096x1560 or 2098x1560
      

      (calibration file is optional, EIS will run without one)
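
      To install the sensormodules on VOXL2, something along these lines should work (the exact .bin file names inside the zip are an assumption, so adjust to match what you extract):

      wget https://storage.googleapis.com/modalai_public/temp/imx214_test_bins/imx214_eis_drivers_20250815.zip
      unzip imx214_eis_drivers_20250815.zip
      adb push com.qti.sensormodule.imx214*.bin /usr/lib/camera/
      adb shell sync && adb reboot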

      I also just updated voxl-camera-server (dev branch) to enable MISP support for these resolutions, and added a new feature to allow specifying the bayer pattern (in case you use the flipped sensormodule).

      Here is an example snippet of voxl-camera-server.conf. The parameter bayer_type is optional and hidden by default; you can use rggb or bggr for IMX214. Change this parameter if you see that your colors are flipped (red and blue).

                              "type": "imx214",
                              "name": "hires0",
                              "bayer_type": "rggb"
      

      The performance of EIS with IMX214 has not been tuned, specifically the behavior of the ROI controller. We can come back to that if there is interest; we are mostly focusing on IMX412 with the fisheye lens for now.

      Please try it and make sure you can at least get it working, and then we can discuss performance tuning.

      Alex

      posted in VOXL 2
    • RE: Calibration yaml for M0061

      Hello @daz_22 ,

      The ideal intrinsics of the IMX412 camera module (M0161) are as follows:

      Full resolution:

      • focal length : 1999 (~2000) pixels
      • principal points: 2028, 1520 (for full frame image, such as 4056x3040 or 4040x3040)

      2x2 Binned resolution

      • focal length : 1000 pixels
      • principal points : 2028/2, 1520/2

      For other resolutions, you can adjust the principal points to be half of the width and height, because the cropped image (in the case of 3840x2160 or 1920x1080) will be centered (for example, for a 1920x1080 image the principal points would be 960, 540).

      The lens model is fisheye and you should set the distortion coefficients to zeros unless you calibrate them.

      If you are going to calibrate the camera using voxl-calibrate-camera, note that it is very slow at full resolution, so I suggest calibrating at half resolution.

      Here is an old calibration result I found for one of these cameras, just for reference:

      %YAML:1.0
      ---
      M: !!opencv-matrix
         rows: 3
         cols: 3
         dt: d
         data: [ 999.5391440104593, 0.0, 1016.522606127024,
                 0.0, 998.2572561064064, 750.2875966666587,
                 0.0, 0.0, 1.0 ]
      D: !!opencv-matrix
         rows: 4
         cols: 1
         dt: d
         data: [ -0.02176119101007071, 0.01025514886831165, 
                 -0.01688007838706262, 0.006525270203215156 ]
      reprojection_error: 0.72238
      width: 2020
      height: 1520
      distortion_model: fisheye
      calibration_time: "2024-11-15 16:38:58"
      
      posted in FAQs
    • RE: Flir Boson+ Issues on VOXL2 Mini

      @jonathankampia ,

      Have you checked that your Boson actually works (over USB, using either the FLIR GUI or a UVC viewer)?
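
      For example, on a Linux host PC you can do a quick UVC check like this (assuming the Boson enumerates as /dev/video0; the device node is an assumption):

      ffplay -f v4l2 /dev/video0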

      Also, is your Boson configured to send MIPI data in 8-bit raw format?

      The camera server cannot change the FPS of the Boson (or any of its streaming parameters). In fact, there is no communication from VOXL2 to the Boson at all (besides probing / detection); all the settings have to be configured by the user prior to connecting it to VOXL2 (including resolution, fps, etc.).

      https://docs.modalai.com/M0153/#boson-software-setup

      Please double check these points.

      Alex

      posted in Ask your questions right here!
    • RE: Waterproofing M0166 Camera Lens

      @Alex-May , our lenses do not have any IP rating for waterproofing, so we cannot provide any guarantee regarding the performance when exposed to water.

      Alex

      posted in Image Sensors
    • RE: Rear Tracking Camera Not Working

      @george-kollamkulam , did you have a working configuration that stopped working after some event? Please provide some more details.

      Thank you

      Alex

      posted in Ask your questions right here!
    • RE: Using VOXL Mini 4-in-1 ESC (UART) + VOXL 2 IO (PWM) at the same time

      @Gary-Holmgren,

      Yes the two can work together.

      You will need to modify the /usr/bin/voxl-px4-start file which loads all the modules based on the contents of /etc/modalai/voxl-px4.conf.

      The VOXL ESC + VOXL 2 IO combination is not a standard configuration, but you just need to make sure the following modules are started:

      qshell voxl_esc start #(will start on default QUP2 / slpi uart port 2)
      qshell voxl2_io start -p 7 #(use QUP7)
      

      You can also manually start these modules in the PX4 shell for testing. Watch the output of both modules in the PX4 console; each module will print useful information about whether the corresponding board has been detected on the specified port. For this purpose, it may be easier to run voxl-px4 in the foreground (instead of as a service):

      voxl-px4 -d
      

      (-d disables the daemon mode and will let you interact with the px4 shell)

      Alex

      posted in Ask your questions right here!
    • RE: Unresponsive polling from FPV Racing 4-in-1 ESC

      @shawn_ricardo , just to close the loop on the original issue: the delay in the calibration procedure was caused by the UART data (or the processing of the ESC feedback) backing up. The fix is to slow down the command rate, as seen in this commit : https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/commit/b512f9e3d5e695868775de3d40eedb2ad15cf6d9 . Additionally, passing the command line argument --cmd-rate 100 when you run the calibration script on VOXL2 will fix the issue.

      Regarding the issue in the step response where the feedback data appears not smooth, I believe it is related to the FTDI's buffering / delay. FTDI adapters have a latency_timer parameter, which is 16 ms by default; it is the amount of time the FTDI adapter holds received data before forwarding it to the PC (and vice versa, I believe). You can set that value to 1:

      You can check this value:

      cat /sys/bus/usb-serial/devices/ttyUSB0/latency_timer
      

      And set it:

      sudo su
      echo 1 > /sys/bus/usb-serial/devices/ttyUSB0/latency_timer
      

      You would need to do this every time you unplug / plug the FTDI device back in.
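
      If you want this to apply automatically, a udev rule should do it (a sketch, assuming the stock ftdi_sio driver; the rule file name is arbitrary, and it takes effect the next time the device is plugged in):

      echo 'ACTION=="add", SUBSYSTEM=="usb-serial", DRIVER=="ftdi_sio", ATTR{latency_timer}="1"' | sudo tee /etc/udev/rules.d/99-ftdi-latency.rules
      sudo udevadm control --reload-rules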

      (note that this applies only to Linux PC, not VOXL2).

      Besides this, I think the plots look good. It looks like you have set kp pretty low, which keeps the response softer (less chance of de-sync).

      Alex

      posted in Ask your questions right here!