ModalAI Forum

    Alex Kushleyev

    @Alex Kushleyev

    ModalAI Team

    Reputation: 107 · Profile views: 281 · Posts: 1925 · Followers: 11 · Following: 0


    Best posts made by Alex Kushleyev

    • RE: ToF v2 keeps crashing because of high temperature

      @dlee ,

      Yes, the new ToF sensor (IRS2975C) is more powerful than the previous generation. What I mean by that is that it can emit more IR power, but it also heats up more. Emitting more power allows the sensor to detect objects at larger distances or objects that are not as reflective.

      In the current operating mode, auto exposure control is enabled inside the sensor itself, which modulates the emitted IR power based on the returns that the sensor is getting. That is to say, the power draw will vary depending on what is in the view of the sensor. If there are obstacles nearby, the output power should be low; otherwise it can be high. At full power, the module can consume close to 0.8-0.9W.

      So the first solution, if your design allows, is to add a heat spreader to dissipate the heat, which you have already started experimenting with. The sensor has a large exposed copper pad on the back for heat sinking purposes for this exact reason. Just be careful not to short this pad to anything; use a non-conducting (but heat-transferring) adhesive pad between the sensor and the heat spreader.

      In terms of a software solution to the issue, we can query the temperature of the emitter. We can also control the maximum emitted power used by the auto exposure algorithm. That is to say, still leave the auto exposure running in the sensor, but limit the maximum power that it is allowed to use.

      We are planning to add some software protection that limits the maximum output power as a function of the emitter temperature. This will require some implementation and testing.

      Meanwhile, please consider using a heat spreader, which will be the best solution if you want to make use of the sensor's full operating range and not have our software limit the output power in order to prevent overheating.
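
      To illustrate the idea, here is a rough sketch of the kind of temperature-based limit described above. This is not our actual implementation, and the thresholds are made-up placeholders:

      # Illustrative only: derate the maximum IR output power as the emitter heats up.
      # The threshold values below are hypothetical placeholders, not real sensor limits.
      def max_power_fraction(emitter_temp_c: float,
                             derate_start_c: float = 70.0,
                             derate_end_c: float = 90.0) -> float:
          """Return the fraction (0..1) of full IR power the auto exposure may use."""
          if emitter_temp_c <= derate_start_c:
              return 1.0                  # cool enough: allow full power
          if emitter_temp_c >= derate_end_c:
              return 0.1                  # very hot: clamp to a small floor
          # linear derating between the two thresholds
          frac = (emitter_temp_c - derate_start_c) / (derate_end_c - derate_start_c)
          return 1.0 - 0.9 * frac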

      posted in Image Sensors
    • RE: Propeller Coefficients for Starling V2

      Hello @Kashish-Garg-0

      We have a curve that is "motor voltage vs RPM", meaning that for a desired RPM, it tells the ESC what average motor voltage should be applied. The average motor voltage is defined as battery_voltage * motor_pwm_duty_cycle. The voltage in this curve is in millivolts. Since you are typically controlling the desired RPM, as a user you do not need to worry about what "throttle" or voltage to apply - the ESC does this automatically in order to achieve the desired RPM. This calibration curve is used as a feed-forward term in the RPM controller. The ESC does support an "open loop" type of control where you specify the power from 0 to 100%, which is similar to a standard ESC, but PX4 does not use that ESC control mode.
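
      Just to make the math concrete, here is a small sketch of how that feed-forward term can be evaluated. I am assuming the calibration is a quadratic in RPM (coefficients a0, a1, a2 from the XML linked below) that yields an average motor voltage in millivolts - treat this as an illustration, not the exact ESC firmware code:

      # Sketch of the feed-forward term: desired RPM -> average motor voltage -> duty cycle.
      # Assumes voltage_mv = a0 + a1*rpm + a2*rpm^2, with coefficients from the calibration XML.
      def feedforward_duty_cycle(desired_rpm: float, a0: float, a1: float, a2: float,
                                 battery_voltage_mv: float) -> float:
          """Return the PWM duty cycle (0..1) applied as the feed-forward term."""
          voltage_mv = a0 + a1 * desired_rpm + a2 * desired_rpm ** 2
          # average motor voltage = battery_voltage * duty_cycle, so invert that relation
          duty = voltage_mv / battery_voltage_mv
          return min(max(duty, 0.0), 1.0)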

      By the way, you can test the ESC directly (not using PX4) using our voxl-esc tools (https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/tree/master/voxl-esc-tools), which work directly on VOXL2 or a standalone Linux PC (or Mac). voxl-esc-spin.py has a --power argument where you specify the power from 0 to 100, which translates directly to the average duty cycle applied to the motor.

      Here is the calibration for the Starling V2 motor / propeller that we use:
      https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/blob/master/voxl-esc-params/mavic_mini_2/mavic_mini_2.xml?ref_type=heads#L63

      Also, you can take a look at this post to see how to interpret those parameters a0, a1, a2 : https://forum.modalai.com/topic/2522/esc-calibration/2

      We also have some dyno tests for this motor / propeller : https://gitlab.com/voxl-public/flight-core-px4/dyno_data/-/blob/master/data/mavic_mini2_timing_test/mavic_mini2_modal_esc_pusher_7.4V_timing0.csv . We are not sure how accurate that is, but it can be used as a starting point. @James-Strawson can you please confirm that is the correct dyno data for the Starling V2 motors?

      Alex

      posted in Ask your questions right here!
    • RE: Sending Recorded Video Though Camera Server on VOXL2

      @reber34 , perhaps this approach can work for you:

      • record a video encoded at a high bit rate (using voxl-camera-server and voxl-record-video). Please note that the output of voxl-record-video will not be in a standard container (such as mp4, etc.), but you can fix that with ffmpeg: ffmpeg -r 30 -i voxl-record-video.h264 -codec copy videofile.mp4
      • re-encode the video offline with desired codecs / bit rates / resolutions
      • install gst-rtsp-launch which uses gstreamer to set up an RTSP stream https://github.com/sfalexrog/gst-rtsp-launch/
        • you will first need to figure out what gstreamer pipeline to use on VOXL2 that will load your video and parse the h264/h265 frames (you can use a null sink for testing; see the sketch after this list), and then use that pipeline with gst-rtsp-launch, which will take the encoded frames and serve them over an RTSP stream.
      • gstreamer may be more flexible for tuning the encoding parameters of h264/h265 (compared to voxl-camera-server) and you can also use it in real time later (using voxl-streamer, which uses gstreamer under the hood)
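
      For example, a minimal sanity check of that null-sink test pipeline (before handing it to gst-rtsp-launch) could look like the sketch below. The file path is a placeholder and I am assuming a raw H.264 elementary stream; the elements used are the standard GStreamer filesrc, h264parse and fakesink:

      # Sketch: verify that GStreamer can load and parse the recorded H.264 elementary stream.
      # fakesink discards the parsed frames; replace the path with your own recording.
      import subprocess

      pipeline = "filesrc location=/data/recording.h264 ! h264parse ! fakesink"
      subprocess.run(["gst-launch-1.0", *pipeline.split()], check=True)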

      Another alternative is to use voxl-record-raw-image to save raw YUVs coming from voxl-camera-server and then use voxl-replay and voxl-streamer - the latter will accept YUVs from the MPA pipe and encode them using the bit rate that you want. Note that depending on the image resolution, YUV images will take a lot more space than encoded video, but maybe that is also OK since VOXL2 has lots of storage.

      Alex

      posted in Ask your questions right here!
    • RE: Recording tracking camera video compressed and coloured

      @vjuliani , the tracking camera (I assume OV7251 or AR0144) is monochrome, so you cannot make the image colored. You can install the latest version of voxl-streamer from dev (nightly builds) at http://voxl-packages.modalai.com/dists/qrb5165/dev/binary-arm64/ and it should allow you to encode h265.

      Alex

      posted in Qualcomm Flight RB5 5G Drone
    • RE: GPIO missing in /sys/class/gpio

      @psafi , before calling voxl-gpio read, you need to enable the pin by setting it to input or output. Please check voxl-gpio -h. (hint: use voxl-gpio -m / voxl-gpio mode)

      Alex

      posted in VOXL 2 Mini
    • RE: OV7251 RAW10 format

      Hello @Gicu-Panaghiu,

      I am going to assume you are using VOXL1, since you did not specify.

      We do have RAW8 and RAW10 support for OV7251. The selection of the format has to be done in several places.

      First, you have to select the correct camera driver, specifically:

      ls /usr/lib/libmmcamera_ov7251*.so
      /usr/lib/libmmcamera_ov7251.so
      /usr/lib/libmmcamera_ov7251_8bit.so
      /usr/lib/libmmcamera_ov7251_hflip_8bit.so
      /usr/lib/libmmcamera_ov7251_rot180_8bit.so
      /usr/lib/libmmcamera_ov7251_vflip_8bit.so
      

      There are 5 options, and one of them is _8bit.so, which means it will natively output 8-bit data (all the others output 10-bit data).

      The driver name, such as ov7251_8bit, has to be the sensor name <SensorName>ov7251_8bit</SensorName> in /system/etc/camera/camera_config.xml.

      You can check camera_config.xml for what sensor library is used for your OV7251.
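
      For example, a quick way to list which sensor drivers camera_config.xml currently references (a small sketch, assuming the file parses as plain XML; run it on the VOXL itself):

      # Print every <SensorName> entry in camera_config.xml.
      import xml.etree.ElementTree as ET

      tree = ET.parse("/system/etc/camera/camera_config.xml")
      for sensor in tree.getroot().iter("SensorName"):
          print(sensor.text)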

      When you run the voxl-configure-cameras script, it will actually copy one of the default camera_config.xml files that are set up for a particular use case, and I believe it will indeed select the 8-bit one - this was done to save the CPU cycles needed to convert 10-bit to 8-bit, since the majority of the time only 8-bit pixels are used.

      Now, you mentioned that HAL_PIXEL_FORMAT_RAW10 is passed to the stream config, and unfortunately this does not have any effect on what the driver outputs. If the low-level driver (e.g. libmmcamera_ov7251_8bit.so) is set up to output RAW8, it will output RAW8 whether you request HAL_PIXEL_FORMAT_RAW8 or HAL_PIXEL_FORMAT_RAW10.

      So if you update camera_config.xml to the 10-bit driver and just keep HAL_PIXEL_FORMAT_RAW10 in the stream config (then sync and reboot), you should be getting a 10-bit RAW image from the camera. But since the camera server currently expects an 8-bit image, if you just interpret the buffer as 8-bit it will appear garbled, so you will need to handle the 10-bit image (decide what you want to do with it) in the camera server.
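
      If it helps, here is one way to handle a 10-bit frame in your own code. This is only a sketch: it assumes the buffer is MIPI-packed RAW10 (4 pixels per 5 bytes) with no per-row padding, which you should verify against the stride reported for your stream.

      import numpy as np

      def unpack_raw10(buf: bytes, width: int, height: int) -> np.ndarray:
          """Unpack MIPI-packed RAW10 (4 pixels per 5 bytes) into a (height, width) uint16 array."""
          data = np.frombuffer(buf, dtype=np.uint8).reshape(-1, 5)
          msbs = data[:, :4].astype(np.uint16)     # upper 8 bits of pixels 0..3
          lsbs = data[:, 4].astype(np.uint16)      # 2 LSBs of each of the 4 pixels, packed
          pixels = np.empty((data.shape[0], 4), dtype=np.uint16)
          for i in range(4):
              pixels[:, i] = (msbs[:, i] << 2) | ((lsbs >> (2 * i)) & 0x3)
          return pixels.reshape(height, width)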

      posted in Image Sensors
    • RE: Can't run the voxl-emulator docker image

      It looks like your host machine does not have QEMU support or is missing ARM support. Please try the instructions here: https://www.stereolabs.com/docs/docker/building-arm-container-on-x86/ (they also show how to run ARM64 docker images on x86).

      posted in VOXL
    • RE: Cannot change TOF framerate

      The ipk is available here now : http://voxl-packages.modalai.com/stable/voxl-hal3-tof-cam-ros_0.0.5.ipk - you should be able to use the launch file to choose between two modes (5=short range and 9=long range) and fps, which are listed in the launch file.

      posted in Ask your questions right here!
    • RE: Minimal example of using camera_cb

      Hi

      If you are looking for lower-level functionality that is a bit easier to experiment with, you may want to check out https://gitlab.com/voxl-public/utilities/voxl-rtsp/-/tree/dev . This tool is good for testing or running simple scenarios. It accepts some commands (like exposure control) via the command line, so you could add a simple feature to save an image upon a command-line input. Please see the README for how the exposure control input works and the code for how it is implemented.

      Please make sure that voxl-camera-server is disabled when you run voxl-rtsp.

      If you still would like to go through the voxl-camera-server approach, we may need some more input from respective devs :).

      I hope this helps..

      Alex

      posted in VOXL m500 Reference Drone
    • RE: Rotate imx412 stream

      @TomP , oh yeah, my bad.. you found the right one 🙂

      posted in Video and Image Sensors

    Latest posts made by Alex Kushleyev

    • RE: Powering Servos from M0065/PWN Breakout

      @MattB69 ,

      The 5V output on M0065 (the 10-pin connector with PWM outputs) should not be used to power any servos. The 5V signal is generated from 3.3V (coming from VOXL2) using a step-up converter that is only capable of 80mA output.

      https://docs.modalai.com/voxl2-io-datasheet/

      The 3.3V (which comes from VOXL2) also should not be used to power servos, as that would risk bringing down a major power rail on VOXL2.

      Alex

      posted in Support Request Format for Best Results
    • RE: Triple IMX412 with Dual AR0144 on VOXL2?

      @Gary-Holmgren ,

      The configuration you are asking about is the standard C28 (https://docs.modalai.com/voxl2-coax-camera-bundles/#mdk-m0173-1-03) with the addition of another hires camera.

      Using MISP (not the built-in ISP), VOXL2 supports as many hires cameras as can be physically connected (you may run into some limitations if you are running at full resolution and encoding all streams at the same time).

      The C28 config uses VOXL2 + M0173, and you are left with VOXL2 J8 (which has two camera ports), so we just need to add another IMX412 to VOXL2 J8. The best way to do this would be using M0181 (which supports a Boson + hires camera); it would be connected like this (you can just ignore the Boson):

      [connection diagram image]

      What is not shown here is the standard C28 config (using M0173 adapter, plugged into VOXL2 J6, J7).

      Just to make sure there will be enough processing resources for the three IMX412s, can you briefly describe what resolution you plan to use and whether you need encoded video, etc.? I could test this configuration for you.

      Alex

      posted in Ask your questions right here!
    • RE: IRCUT FILTER

      @YUUJI-INOUE ,

      Yes, the square glass piece is the IR cut filter. It should be glued to the lens assembly at the four corners of the square. You could remove it yourself, at your own risk. Please see a similar discussion and results in the following thread: https://forum.modalai.com/topic/4826/msu-m0149-1-ir-filter (in that case, the lens is slightly different and it was much more difficult to remove the IR filter). In your case, the filter may pop out without breaking, but it is likely to break. You may want to put some tape on the filter before removing it, which would help avoid having loose pieces of glass.

      Alex

      posted in Ask your questions right here!
    • RE: Color Tracking Camera w/ Global Shutter

      also see discussion here : https://forum.modalai.com/topic/4491/ar0144-rgb-output-on-voxl2/

      posted in Image Sensors
    • RE: Minimizing voxl-camera-server CPU usage in SDK1.6

      Hi @Rowan-Dempster,

      Thanks for the reminder. I will try to get this in by early next week. Sorry for the delay!

      Happy New Year!

      Alex

      posted in Video and Image Sensors
    • RE: New Official Stereo Module?

      @john_t , you can use two AR0144 modules (M0166) in a stereo configuration. When connected to VOXL2 using M0173, the two cameras will be synchronized (start of exposure).

      Even though we are not actively using dual AR0144 for DFS, we have tested it and it works with voxl-dfs-server with some tweaks.

      If there is interest in this, we can document how to set it up (using voxl-dfs-server).

      Would you be using your own DFS processing?

      Alex

      posted in Image Sensors
    • RE: CANT ACCESS CAMERA

      @taiwohazeez ,

      One thing I noticed is that you don't have en_misp set to 1 for the tracking_front camera. I don't think this is the reason for the failure, but it should be set to 1.

      Meanwhile, please try to identify which camera is causing the issue by disabling cameras and testing the camera server (you can set the enabled flag to 0 so that a camera is not started).

      Also, when you installed SDK 1.6.2, did you run voxl-configure-cameras and which option did you select for the camera configuration?

      Alex

      posted in FPV Drones
    • RE: Camera Calibration

      @taiwohazeez ,

      It is possible, as the output suggests that the motion blur is causing the issue.

      You can override the auto exposure algorithm and reduce the exposure down to 5ms or so to help with this. There are two ways of doing this:

      • using voxl-portal: open the regular camera view for your tracking camera and click the small check box in the bottom left corner to enable the advanced control panel. This will allow you to set the exposure and gain; try setting the gain to the max (should be 1600 (16x) for AR0144) and then play around with the exposure to find the lowest exposure that still keeps the checkerboard clear. After you are done changing the manual exposure, you can run the calibrator and switch voxl-portal to the calibrator output.

        • please note that until you touch the sliders for exposure and gain, they will not correspond to the current values that are being used by auto-exposure algorithm.
      • from command line you can send a command to camera server: voxl-send-command tracking_front set_exp_gain 5.0 1600
        (this would set exposure to 5.0ms and gain to 1600 ISO (16x))

      Please note that the manual exposure settings / mode will persist until you reboot the camera server.

      Alex

      posted in FPV Drones
    • RE: Stinger & Hadron 640r

      @dstylesunf ,

      M0188 does not have support for the Boson part of the Hadron (mainly due to a difference in power requirements).

      However, M0195 supports Boson, which means you can also plug in the Hadron by using the M0202 (or older M0159) adapter instead of M0201 (which would be used for a standalone Boson). The RGB camera inside the Hadron (which is an OV64B) would be treated as an independent hires camera.

      Basically, M0202 (M0159) plugs into the back of the Hadron, and then on the other end you could connect both coax cables to M0181 straight into VOXL2 Mini, or use M0195 instead of M0181.

      You can also use the M0194 adapter for connecting the Hadron to VOXL2 Mini. In the diagram below, a Boson with a hires camera is shown, but you can replace Boson + M0201 + hires with Hadron + M0202, which would plug into M0194:

      [M0194 Boson + Hires connection example - diagram image]

      M0195, however, would allow you to connect one extra camera (if you are using a dual tracking camera setup) compared to individual adapters plugged in to VOXL2 J6 and J7.

      Please let us know if you have any other questions.

      Alex

      posted in FPV Drones
    • RE: Color Tracking Camera w/ Global Shutter

      @jakkkkobo , the color AR0144 support (very basic) is in voxl-camera-server: https://gitlab.com/voxl-public/voxl-sdk/services/voxl-camera-server/-/blob/master/src/hal3_camera_mgr.cpp?ref_type=heads#L1405 (PerCameraMgr::ProcessAR0144ColorFrame).

      Once you connect the color version of the AR0144 camera, just change the camera type from ar0144 to ar0144-color in voxl-camera-server.conf. The camera server will then publish raw Bayer (8-bit), mono and RGB (not YUV) images.
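
      If you want to script that change, something like the sketch below could work. Note that the conf path and the key names ("cameras", "type") are my assumptions here, and it assumes your voxl-camera-server.conf parses as plain JSON - please double-check against your own file:

      # Rough sketch: flip any ar0144 entries to the color type in voxl-camera-server.conf.
      # The path and key names are assumptions - verify them against your actual config file.
      import json

      CONF = "/etc/modalai/voxl-camera-server.conf"

      with open(CONF) as f:
          conf = json.load(f)

      for cam in conf.get("cameras", []):
          if cam.get("type") == "ar0144":
              cam["type"] = "ar0144-color"

      with open(CONF, "w") as f:
          json.dump(conf, f, indent=4)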

      If there is still interest in using the color version of AR0144, we can develop the code further to follow our standard color camera pipeline:

      • debayer on GPU
      • add LSC correction for R, G and B
      • output YUV
      • add encoded video output

      Right now the implementation is very basic, just for simple testing.

      Alex

      posted in Image Sensors