ModalAI Forum

    Alex Kushleyev

    @Alex Kushleyev

    ModalAI Team

    Best posts made by Alex Kushleyev

    • RE: ToF v2 keeps crashing because of high temperature

      @dlee ,

      Yes, the new ToF sensor (IRS2975C) is more powerful than the previous generation. What I mean by that is that it can emit more IR power, but it also heats up more. Emitting more power allows the sensor to detect objects at larger distances, or objects that are not as reflective.

      In the current operating mode, auto exposure control is enabled inside the sensor itself, which modulates the emitted IR power based on the returns the sensor is getting. That is to say, the power draw will vary depending on what is in the view of the sensor: if there are obstacles nearby, the output power should be low; otherwise it can be high. At full power, the module can consume close to 0.8-0.9W.

      So the first solution, if your design allows, is to add a heat spreader to dissipate the heat, which you have already started experimenting with. The sensor has a large exposed copper pad on the back for exactly this heat-sinking purpose. Just be careful not to short this pad to anything; use a non-conducting (but heat-transferring) adhesive pad between the sensor and the heat spreader.

      In terms of a software solution, we can query the temperature of the emitter. We can also control the maximum emitted power used by the auto exposure algorithm. That is to say, still leave auto exposure running in the sensor, but limit the maximum power it is allowed to use.

      We are planning to add some software protection that limits the maximum output power as a function of the emitter temperature. This will require some implementation and testing.
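
      For illustration, here is a minimal sketch of what such a temperature-based limit could look like. This is not our implementation; the thresholds and the query/control hooks (read_emitter_temp_c, set_max_emitter_power_pct) are hypothetical placeholders.

      # Hypothetical sketch of temperature-based emitter power limiting.
      # read_emitter_temp_c() and set_max_emitter_power_pct() stand in for
      # the query/control hooks described above; thresholds are made up.

      def power_limit_pct(temp_c, t_start=70.0, t_max=90.0, floor_pct=10.0):
          # full power below t_start, linear derating up to t_max,
          # then clamp to a small floor instead of shutting off completely
          if temp_c <= t_start:
              return 100.0
          if temp_c >= t_max:
              return floor_pct
          frac = (temp_c - t_start) / (t_max - t_start)
          return 100.0 - frac * (100.0 - floor_pct)

      # in a control loop:
      # set_max_emitter_power_pct(power_limit_pct(read_emitter_temp_c()))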

      Meanwhile, please consider using a heat spreader, which will be the best solution if you want to make use of the sensor's full operating range and not have our software limit the output power in order to prevent overheating.

      posted in Image Sensors
    • RE: Propeller Coefficients for Starling V2

      Hello @Kashish-Garg-0

      We have a curve that is "motor voltage vs RPM", meaning that for a desired RPM, it tells the ESC what average motor voltage should be applied. The average motor voltage is defined as battery_voltage * motor_pwm_duty_cycle. The battery voltage in this curve is in millivolts. Since you are typically controlling the desired RPM, as a user you do not need to worry about what "throttle" or voltage to apply; the ESC does this automatically in order to achieve the desired RPM. This calibration curve is used as a feed-forward term in the RPM controller. The ESC does support an "open loop" control mode where you specify the power from 0 to 100%, which is similar to a standard ESC, but PX4 does not use that ESC control mode.

      By the way, you can test the ESC directly (without PX4) using our voxl-esc tools (https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/tree/master/voxl-esc-tools), which work directly on VOXL2 or a standalone Linux PC (or Mac). voxl-esc-spin.py has a --power argument where you specify the power from 0 to 100, which translates directly to the average duty cycle applied to the motor.

      Here is the calibration for the Starling V2 motor / propeller that we use:
      https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/blob/master/voxl-esc-params/mavic_mini_2/mavic_mini_2.xml?ref_type=heads#L63

      Also, you can take a look at this post to see how to interpret those parameters a0, a1, a2: https://forum.modalai.com/topic/2522/esc-calibration/2
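
      To make that concrete, here is a small sketch of how such coefficients act as the feed-forward term, assuming the quadratic voltage-vs-RPM form described in the linked post. The values you pass in would come from the XML calibration file above; this is an illustration, not the actual ESC firmware code.

      def feedforward_voltage_mv(rpm, a0, a1, a2):
          # average motor voltage (millivolts) to apply for a desired RPM
          return a0 + a1 * rpm + a2 * rpm * rpm

      def feedforward_duty_cycle(rpm, battery_voltage_mv, a0, a1, a2):
          # average motor voltage = battery_voltage * duty cycle, so the
          # ESC converts the feed-forward voltage into a PWM duty cycle
          v_mv = feedforward_voltage_mv(rpm, a0, a1, a2)
          return min(1.0, v_mv / battery_voltage_mv)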

      We also have some dyno tests for this motor / propeller: https://gitlab.com/voxl-public/flight-core-px4/dyno_data/-/blob/master/data/mavic_mini2_timing_test/mavic_mini2_modal_esc_pusher_7.4V_timing0.csv . We are not sure how accurate that is, but it can be used as a starting point. @James-Strawson, can you please confirm that is the correct dyno data for the Starling V2 motors?

      Alex

      posted in Ask your questions right here!
    • RE: Sending Recorded Video Though Camera Server on VOXL2

      @reber34 , perhaps this approach can work for you:

      • record a video encoded at a high bit rate (using voxl-camera-server and voxl-record-video). Please note that the output of voxl-record-video will not be in a standard container (such as mp4), but you can fix that with ffmpeg: ffmpeg -r 30 -i voxl-record-video.h264 -codec copy videofile.mp4
      • re-encode the video offline with desired codecs / bit rates / resolutions
      • install gst-rtsp-launch which uses gstreamer to set up an RTSP stream https://github.com/sfalexrog/gst-rtsp-launch/
        • you will first need to figure out what gstreamer pipeline to use on VOXL2 that will load your video and parse the h264/h265 frames (you can use a null sink for testing; see the sketch after this list), and then use that pipeline with gst-rtsp-launch, which will take the encoded frames and serve them over an RTSP stream.
      • gstreamer may be more flexible for tuning the encoding parameters of h264/h265 (compared to voxl-camera-server) and you can also use it in real time later (using voxl-streamer, which uses gstreamer under the hood)
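
      As a starting point for that experiment, here is a hedged sketch of exercising the file-loading and parsing stage against a null sink before wiring it into gst-rtsp-launch (the file name is a placeholder):

      import subprocess

      # Parse H.264 frames from the recorded file and discard them with
      # fakesink, just to verify the pipeline elements work on VOXL2.
      pipeline = "filesrc location=videofile.h264 ! h264parse ! fakesink"
      subprocess.run(["gst-launch-1.0"] + pipeline.split(), check=True)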

      Another alternative is to use voxl-record-raw-image to save the raw YUVs coming from voxl-camera-server and then use voxl-replay and voxl-streamer; the latter will accept YUVs from the MPA pipe and encode them using the bit rate that you want. Note that depending on the image resolution, YUV images will take a lot more space than encoded video, but maybe that is OK since VOXL2 has lots of storage.

      Alex

      posted in Ask your questions right here!
    • RE: voxl_mpa_to_ros2 camera_interface timestamp

      @smilon ,

      I believe you are correct! Thank you. We will double-check this and fix it.

      posted in ROS
    • RE: UAV climbs out of control in POSITION mode (All QVIO sensors successfully calibrated)

      @rdjarvis , the CPU (by default) will run in auto mode, meaning it will slow down under light load. Can you try setting the CPU to performance mode using voxl-set-cpu-mode perf and see if your performance improves?

      For the front and rear stereo pairs, which camera outputs do you use? The code does debayering to mono and color for each camera, so disabling unneeded debayering can help.

      posted in PX4 Autonomy Developer Kit
    • RE: OV7251 RAW10 format

      Hello @Gicu-Panaghiu,

      I am going to assume you are using VOXL1, since you did not specify.

      We do have RAW8 and RAW10 support for OV7251. The selection of the format has to be done in several places.

      First, you have to select the correct camera driver, specifically:

      ls /usr/lib/libmmcamera_ov7251*.so
      /usr/lib/libmmcamera_ov7251.so
      /usr/lib/libmmcamera_ov7251_8bit.so
      /usr/lib/libmmcamera_ov7251_hflip_8bit.so
      /usr/lib/libmmcamera_ov7251_rot180_8bit.so
      /usr/lib/libmmcamera_ov7251_vflip_8bit.so
      

      There are 5 options, and one of them is _8bit.so, which means it will natively output 8-bit data (all the others output 10-bit data).

      The driver name, such as ov7251_8bit, has to match the sensor name (<SensorName>ov7251_8bit</SensorName>) in /system/etc/camera/camera_config.xml.

      You can check camera_config.xml for what sensor library is used for your OV7251.

      When you run the voxl-configure-cameras script, it will actually copy one of the default camera_config.xml files that are set up for a particular use case, and I believe it will indeed select the 8-bit one. This was done to save the CPU cycles needed to convert 10-bit to 8-bit, since the majority of the time only 8-bit pixels are used.

      Now, you mentioned that HAL_PIXEL_FORMAT_RAW10 is passed to the stream config; unfortunately, this does not have any effect on what the driver outputs. If the low-level driver (e.g. libmmcamera_ov7251_8bit.so) is set up to output RAW8, it will output RAW8 whether you request HAL_PIXEL_FORMAT_RAW8 or HAL_PIXEL_FORMAT_RAW10.

      So if you update camera_config.xml to the 10-bit driver and keep HAL_PIXEL_FORMAT_RAW10 in the stream config (then sync and reboot), you should be getting a 10-bit RAW image from the camera. But since the camera server currently expects an 8-bit image, the image will appear garbled if you just interpret it as 8-bit, so you will need to handle the 10-bit image (decide what you want to do with it) in the camera server.
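
      If it helps, here is a rough sketch of unpacking such an image on the receiving side, assuming the standard MIPI RAW10 packing (4 pixels in 5 bytes); please verify the actual packing your pipeline delivers:

      import numpy as np

      def unpack_mipi_raw10(packed_bytes, width, height):
          # MIPI RAW10: every 5 bytes hold 4 pixels; bytes 0-3 carry the
          # 8 MSBs of pixels 0-3 and byte 4 packs the 2 LSBs of each.
          data = np.frombuffer(packed_bytes, dtype=np.uint8).reshape(-1, 5)
          msb = data[:, :4].astype(np.uint16)
          lsb = data[:, 4].astype(np.uint16)
          low = np.stack([(lsb >> (2 * i)) & 0x3 for i in range(4)], axis=1)
          return ((msb << 2) | low).reshape(height, width)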

      posted in Image Sensors
    • RE: VOXL ESC V1 or V2

      @wilkinsaf , M0027 was never shipped to any customers, as it was a test version of the ESC. So there should only be M0049, M0117, M0134 and M0129 (mini) ESCs out there. Like Vinny said, all of those ESCs have a blue status LED for each MCU.

      If your ESC has a larger rectangular shape (as opposed to a square), it could be a really old ESC (Atmel-based, not STM32-based), which we do not really support any more. I hope this helps!

      Alex

      posted in VOXL 2
    • RE: Cannot change TOF framerate

      The ipk is available here now: http://voxl-packages.modalai.com/stable/voxl-hal3-tof-cam-ros_0.0.5.ipk . You should be able to use the launch file to choose between the two modes (5 = short range, 9 = long range) and the fps values listed in the launch file.

      posted in Ask your questions right here!
    • RE: Onboard Image Processing using ROS + OpenCV (+ possibly some ML library in the future)

      @Prabhav-Gupta , yes, it seems the OpenCV and ROS YUV_NV12 formats do not match up. I will take a look at it. It seems the ROS YUV is packed (interleaved), while the standard layout for YUV NV12 is two planes: plane 1: Y (size: width*height), plane 2: interleaved UV (size: width*height/2).

      In the meantime, you can stream RTSP h264/h265 from VOXL (use decent quality so that the image looks good) and use OpenCV to receive the stream and get decompressed images: https://stackoverflow.com/questions/40875846/capturing-rtsp-camera-using-opencv-python
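
      A minimal receiving-side sketch (the URL is an example; substitute your VOXL's IP and the stream path reported by voxl-streamer):

      import cv2

      # Open the RTSP stream from VOXL; OpenCV/FFmpeg decodes the
      # h264/h265 frames into BGR images.
      cap = cv2.VideoCapture("rtsp://192.168.8.1:8900/live")
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          cv2.imshow("voxl", frame)
          if cv2.waitKey(1) == 27:  # Esc to quit
              break
      cap.release()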

      Would that work for you? (Unfortunately, with an RTSP stream you will not get the full image metadata, like exposure, gain, timestamp, etc.)

      RTSP streaming can be done using voxl-streamer, which can accept either a YUV stream (and encode it) or an already encoded h264/5 stream from voxl-camera-server.

      Alex

      posted in ROS
    • RE: Poor GPS Fix

      @Rodrigo-Betances , yes, absolutely. I will provide your name to the team.

      posted in PX4 Autonomy Developer Kit

    Latest posts made by Alex Kushleyev

    • RE: Problem of overconsumption. HELP!

      @Daniel-Rincon ,

      If you create a diagram of your image processing pipeline (which streams from which cameras go where), to the best of your knowledge, we may be able to suggest some optimizations in the voxl-camera-server settings or elsewhere.

      Also, please consider that VOXL2 cannot run at full power continuously without cooling (which is generally true for most computing devices). The challenge is to find a balance between the computing needs while not flying vs flying. The drone in flight can provide airflow for cooling, or an additional fan can be added to help reduce the VOXL temperature.

      When VOXL2 CPU reaches around 95 degrees C, the system will start slowing down the CPU cores to prevent overheating, which will have an impact on processing speeds and some software components may no longer be able to run properly.
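
      If you want to keep an eye on the temperature yourself, something along these lines can help. This is a minimal sketch assuming the standard Linux thermal sysfs nodes (zone numbering varies by platform, so treat it as an example rather than a VOXL2-specific tool):

      import time
      from pathlib import Path

      def max_thermal_zone_c():
          # Read all thermal zones (reported in millidegrees C) and
          # return the hottest one.
          temps = []
          for zone in Path("/sys/class/thermal").glob("thermal_zone*/temp"):
              try:
                  temps.append(int(zone.read_text()) / 1000.0)
              except (OSError, ValueError):
                  pass
          return max(temps, default=float("nan"))

      while True:
          print(f"hottest thermal zone: {max_thermal_zone_c():.1f} C")
          time.sleep(2)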

      Alex

      posted in Ask your questions right here!
    • RE: MDK-M0107-2-06 VOXL Hi-res RGB Sensor Datasheet

      @alexvolt , the datasheet for M0107 can be found here : https://docs.modalai.com/M0107/ . This is a Sony IMX412 camera.

      The connector pin-out provides the power supply voltages.

      This module is designed to work with VOXL products, and we would not be able to provide much support for connecting this camera module to another platform. The main reason is that our inability to test the custom setup ourselves would make any debugging very difficult.

      Alex

      posted in Image Sensors
    • RE: Help with pympa

      Hello @itzikwds,

      Sorry for the delay. We can extend pympa to support the ai_detection class.

      Here is an example of an application that receives the detection data: https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-mpa-tools/-/blob/dev/tools/voxl-inspect-detections.c

      However, since you need this in python, pympa would need to be extended to support the detection data.

      Can you please confirm that you still need this functionality or did you find another solution?

      Alex

      posted in VOXL SDK
    • RE: FLIR Lepton on Starling 2 Max

      @RyanH , the FLIR Lepton socket should be included with your Starling 2 Max; you should be able to see it at the bottom. Assuming the socket is there, you just need to plug in the FLIR Lepton 3.5 and enable voxl-lepton-server.

      After installing voxl-lepton-server, you will need to configure the correct i2c and spi port, as described here : https://docs.modalai.com/M0173/#downward-range-finder-and-flir-lepton

      The link you provided seems to be a standard FLIR Lepton 3.5, so it should work (160x120 resolution).


      posted in Starling & Starling 2
    • RE: Stereo Image sensor selection

      @SKA ,

      If you would like to get the joined stereo image, I believe you need to disable MISP. Currently you have en_misp set to true; please set it to false and try again.

      MISP is not set up to work with joining two images into one. Disabling MISP will restore the original behavior that you are expecting.

      I should add a warning about this for when MISP is enabled. Thank you.

      Please let me know if disabling MISP works for you.

      Alex

      posted in Video and Image Sensors
    • RE: Stereo Image sensor selection

      @SKA , how many cameras do you have connected?

      Also, what does voxl-camera-server -l show, and what sensor modules do you have in /usr/lib/camera?

      Alex

      posted in Video and Image Sensors
    • RE: AR0144 RGB output on VOXL2

      @Jordyn-Heil ,

      I am working on some examples of how to use the GPU with images received from the camera server via MPA (raw bayer, yuv, rgb).

      I just wanted to clarify your use case.

      • Currently, the camera server receives a RAW8 bayer image from the AR0144 camera (it can be RAW10/12, but that is not yet working reliably; a different story)
      • the camera server then uses CPU to de-bayer the image into RGB
      • the RGB is then sent out via MPA and can be viewed in voxl-portal (which supports RGB) or encoded via voxl-streamer, which also supports RGB input.

      Now, in order to do any image stitching, the following would need to be done in the client application:

      • subscribe to the multiple camera streams (whether bayer or RGB)
      • load camera calibration (intrinsic) for each camera
      • load extrinsic parameters for each camera
      • load the LSC (lens shading correction) tables for each camera (or the same table could be used for all; close enough)
      • for simplicity, let's assume that one large output buffer is allocated for all cameras to be pasted into (the panorama image). When individual images come in, the job of the app is to do the following:
        • apply LSC corrections to the whole image
        • (optional) perform white balance corrections, to be consistent across all cameras (this would need some analysis across all images)
        • undistort the image according to the fisheye calibration params
        • project the image into the stitched image according to the extrinsics calibration

      This would ignore the time synchronization aspect for now; basically, as each image comes in, the stitched image is updated. A rough per-image sketch is below.
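
      All the parameter values here are placeholders (the real intrinsics, distortion coefficients, LSC table, and extrinsics-derived homography would come from your calibration files); it is an illustration of the steps, not a tested implementation:

      import cv2
      import numpy as np

      # Placeholder calibration data for one camera (not real values).
      K = np.array([[285.0, 0.0, 640.0],
                    [0.0, 285.0, 400.0],
                    [0.0, 0.0, 1.0]])            # intrinsics
      D = np.array([0.01, -0.002, 0.0, 0.0])     # fisheye distortion
      H = np.eye(3)                              # homography into panorama frame
      lsc_gain = np.ones((800, 1280), np.float32)  # lens shading gain table
      panorama = np.zeros((1000, 3000, 3), np.uint8)

      def paste_into_panorama(rgb):
          # 1. lens shading correction (per-pixel gain)
          img = np.clip(rgb.astype(np.float32) * lsc_gain[..., None],
                        0, 255).astype(np.uint8)
          # 2. undistort using the fisheye model
          img = cv2.fisheye.undistortImage(img, K, D, Knew=K)
          # 3. project into the shared panorama using the extrinsics
          warped = cv2.warpPerspective(img, H,
                                       (panorama.shape[1], panorama.shape[0]))
          mask = warped.any(axis=2)
          panorama[mask] = warped[mask]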

      Also, for your algorithm development, do you work with RGB images or YUV?

      I am planning to share something simple first, which will subscribe to the 3 or 4 AR0144 cameras, load some intrinsics and extrinsics and overlay the images in a larger image (without explicitly calibrating the extrinsics). Then we can go from there.

      Alex

      posted in Image Sensors
    • RE: AR0144 RGB output on VOXL2

      @Jordyn-Heil ,

      Sorry for the delay.

      In order to test the AR0144 color RTSP stream, you can use voxl-streamer. I just pushed a small change that fixes the pipe description for the AR0144 color output from YUV to RGB. The current implementation publishes RGB, but the MPA pipe description was set to YUV, so voxl-streamer did not work.

      So right now, I can run voxl-streamer:

      voxl2:/$ voxl-streamer -i tracking_front_misp_color
      Waiting for pipe tracking_front_misp_color to appear
      Found Pipe
      detected following stats from pipe:
      w: 1280 h: 800 fps: 30 format: RGB
      Stream available at rtsp://127.0.0.1:8900/live
      A new client rtsp://<my ip>:36952(null) has connected, total clients: 1
      Camera server Connected
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      gbm_create_device(156): Info: backend name is: msm_drm
      rtsp client disconnected, total clients: 0
      no more rtsp clients, closing source pipe intentionally
      Removed 1 sessions
      

      voxl-streamer will receive the RGB images and use the hardware encoder to encode them into the video stream.

      The corresponding changes are on the dev branch of voxl-camera-server.

      Will that work for you, or do you need voxl-camera-server to publish the encoded stream? In that case, since voxl-camera-server only supports encoding YUVs, I would need to convert to YUV and then feed the image into the encoder.

      I will follow up regarding GPU stuff in the next post.

      Alex

      posted in Image Sensors
    • RE: voxl-camera-server saying Boson doesn't support 640x512 resolution when starting

      @Matthew-Wellner

      Please try this kernel (it tells the system to use CCI3 instead of CCI2 when using a camera in slot 2): https://storage.googleapis.com/modalai_public/temp/test_kernels/qti-ubuntu-robotics-image-m0054-boot-slot2-cci3.img

      With this change, the CCI communication to the Boson will actually happen on the VOXL's J7 upper path (using CCI3) and will go over the hardware path that is working for the EO camera. By the way, after the initial probing of the Boson during camera server start-up, there is no communication to it at all via CCI.

      I just tested it on my setup with Hadron in J7.

      adb reboot bootloader
      fastboot boot qti-ubuntu-robotics-image-m0054-boot-slot2-cci3.img
      

      Then adb into the VOXL2; when you run voxl-camera-server -l, you will not detect any cameras yet.

      Enable the CCI mux on M0159 for J7:

      voxl-gpio -m 6 out && voxl-gpio -w 6 1
      

      Check if the cameras are detected:

      voxl2:/$ voxl-camera-server -l
      DEBUG:   Attempting to open the hal module
      DEBUG:   SUCCESS: Camera module opened on attempt 0
      DEBUG:   ----------- Number of cameras: 2
      
      DEBUG:   Cam idx: 0, Cam slot: 2, Slave Address: 0x00D4, Sensor Id: 0x00FF
      DEBUG:   Cam idx: 1, Cam slot: 3, Slave Address: 0x006C, Sensor Id: 0x6442
      DEBUG:   Note: This list comes from the HAL module and may not be indicative
      DEBUG:   	of configurations that have full pipelines
      
      DEBUG:   Number of cameras: 2
      

      Then, run voxl-camera-server and view the streams via voxl-portal.

      Please let me know whether this works.

      Alex

      posted in Ask your questions right here!
    • RE: voxl-camera-server saying Boson doesn't support 640x512 resolution when starting

      @Matthew-Wellner ,

      Yes, it looks like the cable is not the issue. There is something else we can try: I can build a kernel that uses CCI3 for camera slot 2, so the communication with the Boson would go through the EO camera connector between M0181 and M0159. If this works, we can at least confirm that everything else works.

      And if it does work, you may be able to keep using the Hadron while you request a replacement.

      Can you let me know which SDK you are using and your kernel version (mach. var) when you run voxl-version?

      Alex

      posted in Ask your questions right here!