ModalAI Forum

    Alex Kushleyev (@Alex Kushleyev), ModalAI Team

    Reputation: 79 | Profile views: 203 | Posts: 1610 | Followers: 10 | Following: 0

    Best posts made by Alex Kushleyev

    • RE: ToF v2 keeps crashing because of high temperature

      @dlee ,

      Yes, the new ToF sensor (IRS2975C) is more powerful than the previous generation. What I mean by that is that it can emit more IR power, but it also heats up more. Emitting more power allows the sensor to detect objects at larger distances, or objects that are not as reflective.

      In the current operating mode, auto exposure control is enabled inside the sensor itself, which modulates the emitted IR power based on the returns the sensor is getting. That is to say, the power draw will vary depending on what is in the view of the sensor. If there are obstacles nearby, the output power should be low; otherwise it can be high. At full power, the module can consume close to 0.8-0.9W.

      So the first solution, if the design allows, is to add a heat spreader to dissipate the heat, which you have already started experimenting with. The sensor has a large exposed copper pad on the back for heat sinking purposes for this exact reason. Just be careful not to short this pad to anything; use a non-conductive (but heat-transferring) adhesive pad between the sensor and the heat spreader.

      In terms of a software solution to the issue, we can query the temperature of the emitter. We can also control the maximum emitted power used by the auto exposure algorithm. That is to say, still leave the auto exposure running in the sensor, but limit the maximum power that it is allowed to use.

      We are planning to add some software protection that limits the maximum output power as a function of the emitter temperature. This will require some implementation and testing.

      Meanwhile, please consider using a heat spreader, which will be the best solution if you want to make use of the sensor's full operating range and not have our software limit the output power in order to prevent overheating.
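
      To illustrate the idea only (this is a sketch with made-up thresholds, not our actual implementation or API), a temperature-based cap on the emitter power could look like this:

      # Hypothetical sketch: limit the maximum IR emitter power allowed to the
      # auto exposure algorithm as a function of emitter temperature.
      # Thresholds and percentages are illustrative, not ModalAI's real values.

      MAX_POWER_PCT = 100   # normal upper bound for auto exposure
      MIN_POWER_PCT = 20    # never throttle below this
      TEMP_SOFT_C = 70.0    # start throttling at this emitter temperature
      TEMP_HARD_C = 90.0    # clamp to the minimum at or above this temperature

      def power_limit_for_temp(temp_c: float) -> int:
          """Linearly reduce the allowed maximum power between the soft and hard limits."""
          if temp_c <= TEMP_SOFT_C:
              return MAX_POWER_PCT
          if temp_c >= TEMP_HARD_C:
              return MIN_POWER_PCT
          frac = (temp_c - TEMP_SOFT_C) / (TEMP_HARD_C - TEMP_SOFT_C)
          return int(MAX_POWER_PCT - frac * (MAX_POWER_PCT - MIN_POWER_PCT))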

      posted in Image Sensors
    • RE: Propeller Coefficients for Starling V2

      Hello @Kashish-Garg-0

      We have a curve that is "motor voltage vs RPM", meaning that for a desired RPM, it tells the ESC what average motor voltage should be applied. The average motor voltage is defined as battery_voltage * motor_pwm_duty_cycle. The battery voltage in this curve is in millivolts. Since you are typically controlling the desired RPM, as a user you do not need to worry about what "throttle" or voltage to apply - the ESC does this automatically in order to achieve the desired RPM. This calibration curve is used as a feed-forward term in the RPM controller. The ESC does support an "open loop" type of control where you specify the power from 0 to 100%, which is similar to a standard ESC, but PX4 does not use that ESC control mode.

      By the way, you can test the ESC directly (not using PX4) using our voxl-esc tools (https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/tree/master/voxl-esc-tools), which work directly on VOXL2 or a standalone Linux PC (or Mac). voxl-esc-spin.py has a --power argument where you specify the power from 0 to 100, which translates directly to the average duty cycle applied to the motor.

      Here is the calibration for the Starling V2 motor / propeller that we use:
      https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/blob/master/voxl-esc-params/mavic_mini_2/mavic_mini_2.xml?ref_type=heads#L63

      Also, you can take a look at this post to see how to interpret those parameters a0, a1, a2 : https://forum.modalai.com/topic/2522/esc-calibration/2
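
      As a rough illustration (assuming, per the linked post, that a0, a1, a2 are coefficients of a quadratic in RPM that yields the average motor voltage in millivolts - please verify the exact meaning and ordering against that post before relying on it):

      # Sketch of how the ESC's feed-forward term could be evaluated.
      # Assumes voltage_mv = a0 + a1*rpm + a2*rpm^2 (assumption, see linked post).

      def feed_forward_voltage_mv(rpm, a0, a1, a2):
          """Average motor voltage (mV) the ESC would apply for a desired RPM."""
          return a0 + a1 * rpm + a2 * rpm * rpm

      def feed_forward_duty_cycle(rpm, battery_voltage_mv, a0, a1, a2):
          """Convert the feed-forward voltage into an average PWM duty cycle (0..1)."""
          duty = feed_forward_voltage_mv(rpm, a0, a1, a2) / battery_voltage_mv
          return min(1.0, max(0.0, duty))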

      We also have some dyno tests for this motor / propeller : https://gitlab.com/voxl-public/flight-core-px4/dyno_data/-/blob/master/data/mavic_mini2_timing_test/mavic_mini2_modal_esc_pusher_7.4V_timing0.csv . We are not sure how accurate that is, but it can be used as a starting point. @James-Strawson can you please confirm that is the correct dyno data for the Starling V2 motors?

      Alex

      posted in Ask your questions right here!
    • RE: Sending Recorded Video Though Camera Server on VOXL2

      @reber34 , perhaps this approach can work for you:

      • record a video encoded at a high bit rate (using voxl-camera-server and voxl-record-video). Please note that the output of voxl-record-video will not be in a standard container (such as mp4), but you can fix that with ffmpeg : ffmpeg -r 30 -i voxl-record-video.h264 -codec copy videofile.mp4
      • re-encode the video offline with desired codecs / bit rates / resolutions
      • install gst-rtsp-launch which uses gstreamer to set up an RTSP stream https://github.com/sfalexrog/gst-rtsp-launch/
        • you will first need to figure out what gstreamer pipeline to use on VOXL2 that will load your video and parse the h264/h265 frames (you can use a null sink for testing), and then use that pipeline with gst-rtsp-launch, which will take the encoded frames and serve them over an RTSP stream.
      • gstreamer may be more flexible for tuning the encoding parameters of h264/h265 (compared to voxl-camera-server) and you can also use it in real time later (using voxl-streamer, which uses gstreamer under the hood)

      Another alternative is to use voxl-record-raw-image to save raw YUVs coming from voxl-camera-server and then use voxl-replay and voxl-streamer - the latter will accept YUVs from the MPA pipe and encode them using the bit rate that you want. Note that depending on the image resolution, YUV images will take a lot more space than encoded video, but maybe that is also OK since VOXL2 has lots of storage.

      Alex

      posted in Ask your questions right here!
    • RE: ESC failure error after SDK 1.1.2 upgrade

      @smilon , voxl-esc-calibrate.py is a script that runs a test procedure on a single motor (with propeller mounted) to calibrate the behavior of the motor / propeller. This procedure only needs to be run once if you change the motor or propeller type from a default configuration. The output of this script is just 3 coefficients a1, a2, a3, which you would need to manually enter into an ESC calibration xml file and then upload the xml parameter file to the ESC. Full details about the ESC calibration (when to do it and how) can be found here : https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/blob/master/voxl-esc-tools/calibration.md?ref_type=heads

      If you are using standard motors and propellers (one of the standard ModalAI drones), you do not need to run this calibration procedure.

      It sounds like you got it working; I believe voxl-configure-mpa took care of it. You can see what voxl-configure-mpa typically does here : https://docs.modalai.com/voxl-configure-mpa/ , which includes running voxl-esc to upload the latest firmware and params for a specific vehicle.

      posted in ESCs
    • RE: voxl_mpa_to_ros2 camera_interface timestamp

      @smilon ,

      I believe you are correct! Thank you. We will double check this and fix.

      posted in ROS
    • RE: OV7251 RAW10 format

      Hello @Gicu-Panaghiu,

      I am going to assume you are using VOXL1, since you did not specify..

      We do have RAW8 and RAW10 support for OV7251. The selection of the format has to be done in several places.

      First, you have to select the correct camera driver, specifically..

      ls /usr/lib/libmmcamera_ov7251*.so
      /usr/lib/libmmcamera_ov7251.so
      /usr/lib/libmmcamera_ov7251_8bit.so
      /usr/lib/libmmcamera_ov7251_hflip_8bit.so
      /usr/lib/libmmcamera_ov7251_rot180_8bit.so
      /usr/lib/libmmcamera_ov7251_vflip_8bit.so
      

      There are 5 options, and one of them is _8bit.so, which means it will natively output 8-bit data (all the others output 10-bit data).

      The driver name, such as ov7251_8bit, has to be the sensor name <SensorName>ov7251_8bit</SensorName> in /system/etc/camera/camera_config.xml.

      You can check camera_config.xml for what sensor library is used for your OV7251.

      When you run the voxl-configure-cameras script, it will actually copy one of the default camera_config.xml files that are set up for a particular use case, and I believe it will indeed select the 8bit one - this was done to save the CPU cycles needed to convert 10-bit to 8-bit, since the majority of the time only 8-bit pixels are used.

      Now, you mentioned that HAL_PIXEL_FORMAT_RAW10 is passed to the stream config, and unfortunately this does not have any effect on what the driver outputs. If the low-level driver (e.g. libmmcamera_ov7251_8bit.so) is set up to output RAW8, it will output RAW8 whether you request HAL_PIXEL_FORMAT_RAW8 or HAL_PIXEL_FORMAT_RAW10.

      So if you update camera_config.xml to point to the 10-bit driver and just keep HAL_PIXEL_FORMAT_RAW10 in the stream config (then sync and reboot), you should be getting a 10-bit RAW image from the camera. But since the camera server currently expects an 8-bit image, interpreting the image as 8-bit will make it appear garbled, so you will need to handle the 10-bit image (decide what you want to do with it) in the camera server.
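
      If the 10-bit frames come out MIPI-packed (4 pixels in 5 bytes - this packing and the lack of per-line padding are assumptions, so verify them for your pipeline), a small numpy sketch for unpacking into 16-bit pixels might look like:

      import numpy as np

      def unpack_mipi_raw10(buf, width, height):
          """Unpack MIPI RAW10 (4 pixels per 5 bytes) into a uint16 image.
          Assumes no per-line padding; adjust if your frames have a stride."""
          data = np.frombuffer(buf, dtype=np.uint8)
          data = data[: width * height * 5 // 4].reshape(-1, 5).astype(np.uint16)
          msb = data[:, :4]                           # upper 8 bits of each of the 4 pixels
          lsb = data[:, 4:5]                          # packed lower 2 bits of all 4 pixels
          shifts = np.arange(4, dtype=np.uint16) * 2
          pixels = (msb << 2) | ((lsb >> shifts) & 0x3)
          return pixels.reshape(height, width)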

      posted in Image Sensors
    • RE: Minimal example of using camera_cb

      Hi

      If you are looking for lower-level functionality that is a bit easier to experiment with, you may want to check out https://gitlab.com/voxl-public/utilities/voxl-rtsp/-/tree/dev . This tool is good for testing or running simple scenarios. It accepts some commands (like exposure control) via the command line, so you could add a simple feature to save an image upon a command-line input. Please see the README for how the exposure control input works and the code for how it is implemented.

      Please make sure that voxl-camera-server is disabled when you run voxl-rtsp.

      If you still would like to go through the voxl-camera-server approach, we may need some more input from respective devs :).

      I hope this helps..

      Alex

      posted in VOXL m500 Reference Drone
    • RE: Cannot change TOF framerate

      The ipk is available here now : http://voxl-packages.modalai.com/stable/voxl-hal3-tof-cam-ros_0.0.5.ipk - you should be able to use the launch file to choose between two modes (5=short range and 9=long range) and fps, which are listed in the launch file.

      posted in Ask your questions right here!
    • RE: VOXL ESC Mini 4-in-1 Current per Motor

      @Moderator said in VOXL ESC Mini 4-in-1 Current per Motor:

      Is it possible to step up voltage?

      Can you please clarify the question? 🙂

      The Mini ESC is designed for small drones (< 500g). The ESC has been tested to handle 15A continuous per channel at 15V input for 60+ seconds, but with full direct airflow from the propellers. This would simulate a full-throttle "punch-out" on a small FPV drone (high current, but also lots of direct airflow = cooling). Do not use this ESC if the drone needs 10-15A per channel just to hover. Use it in applications where the hover current per motor is less than 5A (ideally 2-3A, which is very typical) and the absolute maximum continuous current per motor is 10-15A.

      For example, motors used for small FPV drones are often around 1306 size (3-4S LiPo). Those motors are usually rated for up to 10-12A continuous (for 30-60 seconds). Larger motors can be used as long as the maximum motor current does not exceed 10-15A (still 2-3A at hover) and there is sufficient cooling.

      Always check the ESC board temperature during initial flights and tuning. The temperature must stay below 110C at all times (critical); it is typically in the range of 40-70C for most applications. The ESC will most likely fail above 125C.

      The temperature of the ESC board is the limiting factor because the board is so small. The MOSFETs can handle a lot of current as long as they do not overheat. So the design of the drone is very important: either use low current so that temperature is not an issue, or properly design airflow from the propellers and/or add a heat spreader to keep the ESC board temperature in the normal range for higher current draw applications.

      ESC provides real time temperature feedback and it can be viewed in PX4 / QGC. Additionally, the PX4 logs contain the temperature information.

      posted in ESCs
    • RE: Onboard Image Processing using ROS + OpenCV (+ possibly some ML library in the future)

      @Prabhav-Gupta , yes, it seems like the OpenCV and ROS YUV_NV12 formats do not match up. I will take a look at it.. it seems the ROS YUV is packed (interleaved), while the standard for storing YUV NV12 is to have two planes : plane 1 : Y (size: width*height), plane 2 : interleaved UV (size: width*height/2)
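
      For reference, the two-plane NV12 layout can be converted with OpenCV like this (the resolution and buffer here are just placeholders):

      import numpy as np
      import cv2

      width, height = 640, 480                      # example resolution (placeholder)
      nv12_size = width * height * 3 // 2           # Y plane + interleaved UV half-plane

      nv12_bytes = bytes(nv12_size)                 # placeholder; use the real frame buffer
      yuv = np.frombuffer(nv12_bytes, dtype=np.uint8).reshape(height * 3 // 2, width)
      bgr = cv2.cvtColor(yuv, cv2.COLOR_YUV2BGR_NV12)   # OpenCV splits the two planes internally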

      In the meantime.. you can stream an RTSP h264/h265 feed from VOXL (use decent quality so that the image looks good) and use OpenCV to receive the stream and get decompressed images: https://stackoverflow.com/questions/40875846/capturing-rtsp-camera-using-opencv-python

      Would that work for you ? (unfortunately with rtsp stream, you will not get the full image metadata, like exposure, gain, timestamp, etc).

      RTSP streaming can be done using voxl-streamer, which can accept either a YUV (and encode it) or already encoded h264/5 stream from voxl-camera-server.
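
      A minimal receiving-side sketch with OpenCV, assuming the stream is served at rtsp://<voxl-ip>:8900/live (the address, port and mount point are assumptions - substitute whatever your voxl-streamer configuration actually serves):

      import cv2

      # Placeholder URL - replace with the stream your voxl-streamer config reports.
      url = "rtsp://192.168.8.1:8900/live"

      cap = cv2.VideoCapture(url)
      if not cap.isOpened():
          raise RuntimeError("could not open RTSP stream: " + url)

      while True:
          ok, frame = cap.read()                # decoded BGR image, no camera metadata
          if not ok:
              break
          cv2.imshow("voxl", frame)
          if cv2.waitKey(1) & 0xFF == ord('q'):
              break

      cap.release()
      cv2.destroyAllWindows()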

      Alex

      posted in ROS

    Latest posts made by Alex Kushleyev

    • RE: EIS merge

      Hi All,

      Sorry for the delay again. We were chasing down a vibration issue which was apparently causing significant rolling shutter artifacts on the front camera in my tests. Apparently, some of the lenses for the IMX412 cameras have loose internal components, causing vibration that is impossible to correct. On my test platform (Starling 2 Max), I had this issue and no frame modifications improved anything. The last thing to try was to swap out the lens, and the issue was completely gone.

      We will investigate the reason for the loose lenses and the solution. In my case, shaking the lens would produce an audible rattle inside it. If anyone has a "shaky" lens, we will send a replacement.

      I will post some sample videos of what a bad lens might look like.

      The good news is that there is actually no major vibration issue on the Starling 2 Max, and I can move forward with the EIS software release.. stay tuned for another day or so! 🙂

      Alex

      posted in VOXL 2
    • RE: AR0144 RGB output on VOXL2

      @Jordyn-Heil

      If you use 4 AR0144 cameras connected to M0173 and a 5th hires IMX412 (M0161) attached to J8 via M0155, then everything should work out of the box without any hw or kernel changes. I can actually test this pretty easily. Let me do that.

      Also I can send you some sample images comparing mono and color AR0144.

      We do have limited quantity of the color AR0144, let me check on that as well..

      Alex

      posted in Image Sensors
    • RE: Teledyne FLIR Lepton 3.5 Integration

      @george-kollamkulam , can you please check if you already have the M0157 adapter installed in your Starling 2?


      Just for completeness, we are updating M0157 to M0187, which has identical functionality but adds an i2c-gpio circuit to perform a hardware reset of the Lepton and ToF sensors if needed. There is also a small change in dimensions.

      However, if you already have M0157, which you should, it should work just fine.

      Alex

      posted in Ask your questions right here!
    • RE: Teledyne FLIR Lepton 3.5 Integration

      @george-kollamkulam ,

      Please take a look at this document : https://docs.modalai.com/voxl2-d0014/#downward-range-finder-and-flir-lepton

      You will need M0157 adapter + actual Lepton sensor. The M0157 adapter will plug into the M0173 camera front end. Then it will be pretty much plug and play.

      Let me check how to get M0157 - it does not appear to be in the shop.

      Alex

      posted in Ask your questions right here!
    • RE: Starling 2: TOF sensor not recognized by VOXL Portal

      @Hector-Gutierrez , can you please send a picture of the TOF sensor connected to M0173? Just to make sure the connections are correct.

      posted in Starling & Starling 2
    • RE: VOXL2 with M0172 Add-On: M0166 & M0178 Cameras Not Detected (SDK 1.4.5)

      @yardy In order to update the kernel variant without re-installing the whole SDK, please follow these instructions:

      cd <voxl-sdk-release>/system-image
      
      # reboot voxl2 into fastboot mode
      adb reboot bootloader
      
      # flash VOXL2 kernel variant 0
      fastboot flash boot_a m0054-1-var00.0-kernel.img
      fastboot flash boot_b m0054-1-var00.0-kernel.img
      fastboot reboot
      
      # OR flash VOXL2 kernel variant 1
      fastboot flash boot_a m0054-1-var00.1-kernel.img
      fastboot flash boot_b m0054-1-var00.1-kernel.img
      fastboot reboot
      

      You can also test the kernel without overwriting it (the original kernel will be retained after the VOXL2 reboots):

      adb reboot bootloader
      fastboot boot m0054-1-var00.0-kernel.img
      
      # OR
      fastboot boot m0054-1-var00.1-kernel.img
      

      The latter method is useful when you are not sure if the kernel you are about to try actually works (when you are experimenting with building your own kernel) - you can try the new kernel without the risk of temporarily bricking your board.

      Alex

      posted in Support Request Format for Best Results
    • RE: Starling 2: TOF sensor not recognized by VOXL Portal

      @Hector-Gutierrez , in order to focus the hires camera (I am assuming IMX412), you can twist the lens in the M12 lens holder. Normally we do focus the hires cameras to a nominal focus distance, but maybe you had one from an earlier batch which was not focused. Also, the focus of the camera depends on the application, so it is recommended that users double check and correct the focus for their needs. If there is some sealant on the threads, it is not permanent and can be removed using tweezers - loosening it up a bit should allow you to twist the lens inside the holder.

      Regarding the PMD TOF, not sure yet. Has it worked before?

      Alex

      posted in Starling & Starling 2
    • RE: AR0144 RGB output on VOXL2

      (it seems the M0166 CAD file does not include a lens holder and lens.. will check if we have one)

      posted in Image Sensors
    • RE: AR0144 RGB output on VOXL2

      @Jordyn-Heil ,

      VOXL2 can support 6 independent cameras (up to 4K each, with some limitations). Depending on the adapters used, a small kernel change may be needed. The kernel for VOXL2 is public and I could point you to the changes if needed.

      Can you please clarify how exactly you are connecting six M0149 cameras to VOXL2? I don't think it's possible to use them all without a kernel change.

      Using 6 uCoax versions of the AR0144 may not be possible at the moment, but we can get close (5). You could use M0173 and connect 4 AR0144 cameras there (+ TOF), but then use VOXL2 J8L to connect a 5th AR0144 without any kernel changes (just the 1.0.1 kernel variant). A single camera could be connected using either the M0172 adapter or M0155 (uCoax), or M0076 / M0135 for the old-style connectors. For dual uCoax on J8, I don't think we have an adapter right now, but you could connect two M0149's using an M0135 interposer.

      If you use VOXL2 mini + M0188, you could actually connect 5 ucoax cameras (there are 6 connectors, but the 6th sensor is not yet enabled in the kernel). https://docs.modalai.com/M0188/#image-sensor-interfaces

      *** EDIT: due to a subtle detail (VOXL2 has only 4 independent CCI (I2C) buses for camera control), connecting more than 4 cameras with the same slave ID would create a conflict during communication. So, connecting 5 or 6 AR0144 cameras would actually be an issue.. However, the M0166 camera does support a change of CCI slave address via a resistor on the camera PCB. That would require removing one 0402 resistor and installing another resistor on the camera PCB (and would void the warranty...). Additionally, it would require a small change in the camera driver, which is not difficult to support.. You would modify the CCI slave ID for 2 of the cameras so that they do not conflict with the other 4 cameras. So if you REALLY need more than 4 AR0144 cameras connected to a single VOXL2, it is possible.. However, I still do not see how the 6th M0166 camera can be enabled just yet, so 5 should be possible.

      Let's discuss further..

      The 3D CAD drawing of M0166 can be found here : https://storage.googleapis.com/modalai_public/modal_drawings/M0166_3D.stp

      Alex

      posted in Image Sensors
    • RE: VOXL2 with M0172 Add-On: M0166 & M0178 Cameras Not Detected (SDK 1.4.5)

      @yardy ,

      I believe the issue is that when using M0173, you need to use the .1 kernel variant (mach.var: 1.0.1) because M0173 uses different GPIOs for the camera resets, along with some other changes specific to signal routing on M0173.

      When using M0172, the adapter acts similar to the M0135 dual sensor expander, so you would need to switch to kernel variant 1.0.0. You can select the kernel variant when you install the SDK, but there is also a way to just install the kernel variant directly (let me know if you need more info).

      My suggestion is to switch to kernel variant 1.0.0 and use J7. On M0172, J1L should map to slot id 2 and J2U would map to the upper slot id 3 (when connected to J7). You would need the AR0144 on sensormodule id 3 and the irs2975c on sensormodule id 2.

      The reason why you should not use J6 or J8 is that in the 1.0.0 kernel variant, the J6U and J8U camera slots (slots 1 and 5) are configured for an OV7251 stereo setup with lane merging using a single CSI port (combo mode with slots 0 and 4 respectively), so you technically cannot use 6 independent cameras with the 1.0.0 kernel variant.

      Please try and let me know if it works or if you have more questions. I will double check this myself as well.

      Alex

      posted in Support Request Format for Best Results