ModalAI Forum

    Alex Kushleyev

    @Alex Kushleyev

    ModalAI Team

    97 Reputation | 261 Profile views | 1834 Posts | 11 Followers | 0 Following


    Best posts made by Alex Kushleyev

    • RE: ToF v2 keeps crashing because of high temperature

      @dlee ,

      Yes, the new TOF sensor (IRS2975C) is more powerful than the previous generation. What I mean is that it can emit more IR power, but it also heats up more. Emitting more power allows the sensor to detect objects at larger distances, or objects that are not as reflective.

      In the current operating mode, auto exposure control is enabled inside the sensor itself, which modulates the emitted IR power based on the returns that the sensor is getting. That is to say, the power draw will vary depending on what is in the view of the sensor. If there are obstacles nearby, the output power should be low; otherwise it can be high. At full power, the module can consume close to 0.8-0.9 W.

      So the first solution, if your design allows, is to add a heat spreader to dissipate the heat, which you have already started experimenting with. The sensor has a large exposed copper pad on the back for exactly this heat-sinking purpose. Just be careful not to short this pad to anything; use a non-conductive (but thermally conductive) adhesive pad between the sensor and the heat spreader.

      In terms of a software solution, we can query the temperature of the emitter. We can also control the maximum emitted power used by the auto exposure algorithm - that is, still leave auto exposure running in the sensor, but limit the maximum power that it is allowed to use.

      We are planning to add some software protection that limits the maximum output power as a function of the emitter temperature. This will require some implementation and testing.
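
      To illustrate the idea, the protection could look something like the sketch below (illustrative only - the thresholds are made up and this function is not part of any released ToF driver API):

      def max_power_for_temperature(temp_c):
          """Allowed max emitter power (0.0-1.0) for a given emitter temperature."""
          T_SAFE = 60.0   # below this, no limiting (hypothetical threshold)
          T_MAX = 80.0    # at/above this, clamp to minimum power (hypothetical)
          P_MIN = 0.2     # keep at least 20% power so the sensor still functions

          if temp_c <= T_SAFE:
              return 1.0
          if temp_c >= T_MAX:
              return P_MIN
          # linear ramp-down between the two thresholds
          frac = (T_MAX - temp_c) / (T_MAX - T_SAFE)
          return P_MIN + frac * (1.0 - P_MIN)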

      Meanwhile, please consider using a heat spreader - it will be the best solution if you want to make use of the sensor's full operating range and not have our software limit the output power to prevent overheating.

      posted in Image Sensors
    • RE: Propeller Coefficients for Starling V2

      Hello @Kashish-Garg-0

      We have a curve that is "motor voltage vs RPM", meaning that for a desired RPM, it tells the ESC what average motor voltage should be applied. The average motor voltage is defined as battery_voltage * motor_pwm_duty_cycle. The battery voltage in this curve is in millivolts. Since you are typically controlling the desired RPM, as a user you do not need to worry about what "throttle" or voltage to apply - the ESC does this automatically in order to achieve the desired RPM. This calibration curve is used as a feed-forward term in the RPM controller. The ESC does support an "open loop" type of control where you specify the power from 0 to 100%, which is similar to a standard ESC, but PX4 does not use that ESC control mode.

      By the way, you can test the ESC directly (without PX4) using our voxl-esc tools (https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/tree/master/voxl-esc-tools), which work directly on VOXL2 or a standalone Linux PC (or Mac). voxl-esc-spin.py has a --power argument where you specify the power from 0 to 100, which translates directly to the average duty cycle applied to the motor.

      Here is the calibration for the Starling V2 motor / propeller that we use:
      https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/blob/master/voxl-esc-params/mavic_mini_2/mavic_mini_2.xml?ref_type=heads#L63

      Also, you can take a look at this post to see how to interpret the parameters a0, a1, a2: https://forum.modalai.com/topic/2522/esc-calibration/2
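
      Putting those pieces together, here is a sketch of the feed-forward term, assuming the quadratic form a0 + a1*rpm + a2*rpm^2 described in the linked post (the actual coefficients come from the calibration XML; treat this as illustrative, not the ESC firmware itself):

      def feedforward_duty_cycle(desired_rpm, battery_mv, a0, a1, a2):
          """Duty cycle (0..1) expected to produce desired_rpm, before feedback correction."""
          # calibration curve: average motor voltage (in millivolts) needed for this RPM
          motor_mv = a0 + a1 * desired_rpm + a2 * desired_rpm ** 2
          # average motor voltage = battery_voltage * duty_cycle, so:
          duty = motor_mv / battery_mv
          return min(max(duty, 0.0), 1.0)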

      We also have some dyno tests for this motor / propeller: https://gitlab.com/voxl-public/flight-core-px4/dyno_data/-/blob/master/data/mavic_mini2_timing_test/mavic_mini2_modal_esc_pusher_7.4V_timing0.csv . We are not sure how accurate that is, but it can be used as a starting point. @James-Strawson, can you please confirm that this is the correct dyno data for the Starling V2 motors?

      Alex

      posted in Ask your questions right here!
    • RE: Sending Recorded Video Though Camera Server on VOXL2

      @reber34 , perhaps this approach can work for you:

      • record a video encoded at a high bit rate (using voxl-camera-server and voxl-record-video). Please note that the output of voxl-record-video will not be in a standard container (such as mp4), but you can fix that with ffmpeg: ffmpeg -r 30 -i voxl-record-video.h264 -codec copy videofile.mp4
      • re-encode the video offline with desired codecs / bit rates / resolutions
      • install gst-rtsp-launch, which uses gstreamer to set up an RTSP stream: https://github.com/sfalexrog/gst-rtsp-launch/
        • you will first need to figure out what gstreamer pipeline to use on VOXL2 that will load your video and parse the h264/h265 frames (you can use a null sink for testing), and then use that pipeline with gst-rtsp-launch, which will take the encoded frames and serve them over an RTSP stream.
      • gstreamer may be more flexible for tuning the h264/h265 encoding parameters (compared to voxl-camera-server), and you can also use it in real time later (via voxl-streamer, which uses gstreamer under the hood)

      Another alternative is to use voxl-record-raw-image to save the raw YUVs coming from voxl-camera-server and then use voxl-replay and voxl-streamer - the latter will accept YUVs from the MPA pipe and encode them at the bit rate that you want. Note that, depending on the image resolution, YUV images will take a lot more space than encoded video, but maybe that is OK since VOXL2 has lots of storage.
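
      For a rough sense of the storage difference (illustrative numbers, assuming NV12 YUV at 1.5 bytes per pixel and a 10 Mbps encoded stream):

      width, height, fps = 1024, 768, 30
      yuv_frame_bytes = width * height * 3 // 2      # NV12: 1.5 bytes per pixel
      yuv_mb_per_s = yuv_frame_bytes * fps / 1e6     # ~35 MB/s of raw YUV
      h264_mb_per_s = 10e6 / 8 / 1e6                 # a 10 Mbps h264 stream is ~1.25 MB/s
      print(f"raw YUV: {yuv_mb_per_s:.1f} MB/s, 10 Mbps h264: {h264_mb_per_s:.2f} MB/s")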

      Alex

      posted in Ask your questions right here!
    • RE: voxl_mpa_to_ros2 camera_interface timestamp

      @smilon ,

      I believe you are correct! Thank you. We will double-check this and fix it.

      posted in ROS
    • RE: HiRes camera extrinsics config

      @Gary-Holmgren , you are correct - the high resolution camera is not used for VIO; we typically use it just for video recording / streaming.

      You can certainly add a new transform to the extrinsics file and use it in your application. Use the same name for the camera in the extrinsics file as in voxl-camera-server.conf, just to be consistent.

      posted in Support Request Format for Best Results
    • RE: OV7251 RAW10 format

      Hello @Gicu-Panaghiu,

      I am going to assume you are using VOXL1, since you did not specify.

      We do have RAW8 and RAW10 support for OV7251. The selection of the format has to be done in several places.

      First, you have to select the correct camera driver. Specifically:

      ls /usr/lib/libmmcamera_ov7251*.so
      /usr/lib/libmmcamera_ov7251.so
      /usr/lib/libmmcamera_ov7251_8bit.so
      /usr/lib/libmmcamera_ov7251_hflip_8bit.so
      /usr/lib/libmmcamera_ov7251_rot180_8bit.so
      /usr/lib/libmmcamera_ov7251_vflip_8bit.so
      

      There are 5 options; one of them is _8bit.so, which means it will natively output 8-bit data (all the others output 10-bit data).

      The driver name, such as ov7251_8bit, has to match the sensor name (<SensorName>ov7251_8bit</SensorName>) in /system/etc/camera/camera_config.xml.

      You can check camera_config.xml for what sensor library is used for your OV7251.

      When you run the voxl-configure-cameras script, it will copy one of the default camera_config.xml files that are set up for a particular use case, and I believe it will indeed select the 8-bit one. This was done to save the CPU cycles needed to convert 10-bit data to 8-bit, since the majority of the time only 8-bit pixels are used.

      Now, you mentioned that HAL_PIXEL_FORMAT_RAW10 is passed to the stream config; unfortunately, this does not have any effect on what the driver outputs. If the low-level driver (e.g. libmmcamera_ov7251_8bit.so) is set up to output RAW8, it will output RAW8 whether you request HAL_PIXEL_FORMAT_RAW8 or HAL_PIXEL_FORMAT_RAW10.

      So if you update camera_config.xml to the 10-bit driver and keep HAL_PIXEL_FORMAT_RAW10 in the stream config (then sync and reboot), you should get a 10-bit RAW image from the camera. But since the camera server currently expects an 8-bit image, the image will appear garbled if you simply interpret it as 8-bit, so you will need to handle the 10-bit image (decide what you want to do with it) in the camera server.
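
      In case it helps, here is a sketch of unpacking MIPI-packed RAW10 (4 pixels in 5 bytes: four bytes of MSBs followed by one byte holding the 2 LSBs of each pixel) into 16-bit pixels. This assumes the common MIPI packing, so please verify it against what the 10-bit driver actually outputs:

      import numpy as np

      def unpack_raw10(buf, width, height):
          """Unpack a MIPI RAW10 buffer into a (height, width) uint16 array."""
          data = np.frombuffer(buf, dtype=np.uint8).reshape(-1, 5)
          msb = data[:, :4].astype(np.uint16)   # 8 MSBs of 4 consecutive pixels
          lsb = data[:, 4]                      # 2 LSBs of each pixel, packed into one byte
          pixels = np.empty((data.shape[0], 4), dtype=np.uint16)
          for i in range(4):
              pixels[:, i] = (msb[:, i] << 2) | ((lsb >> (2 * i)) & 0x3)
          return pixels.reshape(height, width)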

      posted in Image Sensors
    • RE: Tracking camera calibration not progressing

      @KnightHawk06 , use voxl-calibrate-camera tracking_down_misp_grey <remaining options>

      posted in VOXL-CAM
    • RE: Cannot change TOF framerate

      The ipk is available here now: http://voxl-packages.modalai.com/stable/voxl-hal3-tof-cam-ros_0.0.5.ipk - you should be able to use the launch file to choose between the two modes (5 = short range, 9 = long range) and the fps, which are listed in the launch file.

      posted in Ask your questions right here!
    • RE: Onboard Image Processing using ROS + OpenCV (+ possibly some ML library in the future)

      @Prabhav-Gupta , yes, it seems like the OpenCV and ROS YUV_NV12 formats do not match up. I will take a look at it. It seems the ROS YUV is packed (interleaved), while the standard for storing YUV NV12 is two planes - plane 1: Y (size: width x height), plane 2: interleaved UV (size: width x height / 2).
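
      If you do end up with a proper two-plane NV12 buffer, OpenCV can convert it directly; a minimal sketch (the buffer layout is assumed, so check it against what your pipe actually delivers):

      import cv2
      import numpy as np

      def nv12_to_bgr(nv12_bytes, width, height):
          # OpenCV expects NV12 stacked as a single (height * 3/2, width) uint8 array:
          # the Y plane on top, the interleaved UV plane below it.
          yuv = np.frombuffer(nv12_bytes, dtype=np.uint8).reshape(height * 3 // 2, width)
          return cv2.cvtColor(yuv, cv2.COLOR_YUV2BGR_NV12)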

      In the meantime, you can stream RTSP h264/h265 from VOXL (use decent quality so that the image looks good) and use OpenCV to receive the stream and get decompressed images: https://stackoverflow.com/questions/40875846/capturing-rtsp-camera-using-opencv-python

      Would that work for you? (Unfortunately, with an RTSP stream you will not get the full image metadata, like exposure, gain, timestamp, etc.)

      RTSP streaming can be done using voxl-streamer, which can accept either a YUV stream (and encode it) or an already-encoded h264/5 stream from voxl-camera-server.
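
      The receive side, following the linked StackOverflow approach, can be as simple as the sketch below (the URL is a placeholder - use the address that voxl-streamer reports when it starts):

      import cv2

      cap = cv2.VideoCapture("rtsp://<voxl-ip>:8900/live")
      while True:
          ok, frame = cap.read()    # frame arrives as a decoded BGR numpy array
          if not ok:
              break
          cv2.imshow("voxl stream", frame)
          if cv2.waitKey(1) == 27:  # Esc to quit
              break
      cap.release()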

      Alex

      posted in ROS
    • RE: Image streaming slowdown when using voxl-mpa-to-ros2

      @cfirth , if you are streaming raw RGB images, then 640 x 480 x 3 bytes is about 921 KB per frame, which at 30 fps is about 27 MB/s - definitely more than you want to transfer over wifi for a single stream. You can measure the available bandwidth using iperf, as I suggested in a previous post. Consider using a compressed image format in ROS, or use h264/5 encoded streams. You can use voxl-streamer to encode the tracking frames into a streaming video. If you describe your use case in more detail, we can provide additional suggestions.

      posted in Support Request Format for Best Results

    Latest posts made by Alex Kushleyev

    • RE: M0172 CAD File

      @aheyne ,

      Sorry for the delay.

      You can find the M0172 CAD file here

      Alex

      posted in 3D Models
    • RE: Starling 2 Max Questions before Purchase

      @jimbow77 , regarding #4, there are four batteries (two sets).

      #3 - we don't offer an option to install the FLIR Lepton, but you can purchase one separately (from another vendor) and install it. The adapter is present in the Starling 2 Max, so you would just need to install the Lepton sensor into the socket.

      Alex

      posted in Starling & Starling 2
    • RE: Python MPA image

      @l05 , can you try running the ./make_package.sh script inside the docker? Also, if you are just trying to deploy your app, you can copy it directly from voxl-mpa-tools/build64/tools to VOXL2 using adb or ssh (assuming the build itself succeeded).

      Alex

      posted in Modal Pipe Architecture (MPA)
    • RE: Starling 2 Max Questions before Purchase

      @jimbow77 ,

      1. Here is what the TOF sensor mounting looks like, right next to the front-facing IMX412 and AR0144 cameras:
        Starling2_MAX_C29_TOF.png

      2. Starling 2 and Starling 2 Max are very similar in terms of the software they run. If you are able to run mapping / navigation on a Starling 2, then you should be able to follow the same steps to get it working on a Starling 2 Max (sorry for the lack of details, but it seems you just want to confirm that the same functionality is supported, and the answer is yes). Please note that getting navigation and mapping to work well in your application will require testing and tuning.

      I will check regarding 3 and 4 and get back to you.

      Alex

      posted in Starling & Starling 2
    • RE: Python MPA image

      @l05 , can you please clarify: do you need to receive images from the camera server in a Python script or a C/C++ application?

      The recommended way is to use a C/C++ application, for which we can provide some examples. The Python extension via Pympa makes it easier to quickly test / prototype something, but it may have additional overhead to receive / convert images.

      For example, there is a tool called voxl-inspect-cam-ascii, installed by default with the VOXL2 SDK, which subscribes to an image stream and renders it in your terminal as ASCII. It uses OpenCV to downscale the image. You can find the source code here: https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-mpa-tools/-/blob/master/tools/voxl-inspect-cam-ascii.cpp . This app also shows how to publish an image (it optionally publishes the down-scaled image as well).

      You can test this app by running voxl-inspect-cam-ascii <camera_stream> in a terminal, and you should see an ASCII version of your image. You can get the name of the camera stream from voxl-portal (camera tab); for example, it would be something like tracking or hires_front_color, depending on the cameras / streams that you have enabled.

      If you want to develop your own application, you can start by building the voxl-mpa-tools package from source and then adding your own application into the existing build structure.

      Alex

      posted in Modal Pipe Architecture (MPA)
    • RE: How to upgrade OpenSSL

      @chengyouzen , have you tried building OpenSSL from source (you should be able to do it directly on VOXL1)? That may be the only way.

      Alex

      posted in FAQs
    • RE: EIS functionality

      @SKA ,

      In your case, the issue should be solved by commenting out one line in https://gitlab.com/voxl-public/voxl-sdk/services/voxl-camera-server/-/blob/eis/src/misp.cpp :

             //TODO: FIXME
             //in roll following mode, the roll is not stabilized to the horizon, so it does not need to be flipped
             //need to resolve this in a better way because Rout is meant for another purpose
             if (!follow_roll)
                 rc_matrix_right_multiply_inplace(&eis_ctx.H.m, eis_ctx.Rout.m);    //rotate the image according to desired output rotation (nominally identity)
      

      Specifically, comment out the if statement (if (!follow_roll)) so that the Rout matrix is always applied to the full transform H.

      The issue is that I tried to solve another use case using this parameter (when the VOXL2 / IMU is flipped upside down while the camera is right side up). I just tested this and it appears to be working.

      I will figure out a better way of handling the upside down IMU in this case so that the output transform can be used correctly, but for now you can just comment out that if statement.

      Can you please let me know if that fix worked for you?

      Alex

      posted in Video and Image Sensors
    • RE: EIS functionality

      @SKA , OK, so what you are saying is that in full-follow mode, the -90 degree rotation parameter in the hires -> hires_eis extrinsics is not respected, correct?

      posted in Video and Image Sensors
    • RE: EIS functionality

      @SKA , I thought I fixed that issue a while ago; let me double check. Are you using voxl-camera-server from the eis branch or an official release from an SDK?

      Alex

      posted in Video and Image Sensors
    • RE: Python MPA image

      @l05 , please see the following post; you can download the voxl-opencv library with python3 bindings from here.

      https://forum.modalai.com/topic/3519/help-with-pip-installing-opencv

      This version of opencv was used for testing the pympa-experimental tools.

      The script you were looking at is set up to receive image frames from an RTSP stream - is that your goal? (As opposed to receiving images from a MIPI camera connected to VOXL2 via voxl-camera-server.)

      Alex

      posted in Modal Pipe Architecture (MPA)