ModalAI Forum

    Alex Kushleyev

    @Alex Kushleyev

    ModalAI Team

    81 Reputation · 207 Profile views · 1651 Posts · 10 Followers · 0 Following

    Best posts made by Alex Kushleyev

    • RE: ToF v2 keeps crashing because of high temperature

      @dlee ,

      Yes, the new TOF sensor (IRS2975C) is more powerful than the previous generation: it can emit more IR power, but it also heats up more. Emitting more power allows the sensor to detect objects at larger distances, or objects that are not as reflective.

      In the current operating mode, auto exposure control is enabled inside the sensor itself, which modulates the emitted IR power based on the returns the sensor is getting. That is to say, the power draw will vary depending on what is in the sensor's view: if there are obstacles nearby, the output power should be low; otherwise it can be high. At full power, the module can consume close to 0.8-0.9W.

      So the first solution, if your design allows, is to add a heat spreader to dissipate the heat, which you have already started experimenting with. The sensor has a large exposed copper pad on the back for exactly this heat-sinking purpose. Just be careful not to short this pad to anything; use a non-conducting (but heat-transferring) adhesive pad between the sensor and the heat spreader.

      In terms of a software solution to the issue, we can query the temperature of the emitter. We can also control the maximum emitted power used by the auto exposure algorithm. That is to say, still leave the auto exposure running in the sensor, but limit the maximum power that it is allowed to use.

      We are planning to add some software protection that limits the maximum output power as a function of the emitter temperature. This will require some implementation and testing.
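      To illustrate the idea (this is only a sketch, not our actual implementation; the function name and thresholds are made up for illustration), such a protection could linearly derate the allowed power once the emitter gets hot:

```python
# Hypothetical emitter-power protection sketch: linearly reduce the maximum
# IR power the auto-exposure loop may use as the emitter temperature rises.
# The thresholds below are illustrative only.

def max_power_fraction(temp_c, derate_start_c=70.0, cutoff_c=90.0):
    """Return the allowed power fraction [0..1] for a given emitter temp."""
    if temp_c <= derate_start_c:
        return 1.0                      # cool enough: full power allowed
    if temp_c >= cutoff_c:
        return 0.0                      # too hot: stop emitting entirely
    # linear ramp between the two thresholds
    return (cutoff_c - temp_c) / (cutoff_c - derate_start_c)
```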

      Meanwhile, please consider using a heat spreader, which will be the best solution if you want to use the sensor's full operating range and not have our software limit the output power to prevent overheating.

      posted in Image Sensors
      Alex Kushleyev
    • RE: Propeller Coefficients for Starling V2

      Hello @Kashish-Garg-0

      We have a curve that is "motor voltage vs RPM", meaning that for a desired RPM, it tells the ESC what average motor voltage should be applied. The average motor voltage is defined as battery_voltage * motor_pwm_duty_cycle. The battery voltage in this curve is in millivolts. Since you are typically controlling the desired RPM, as a user you do not need to worry about what "throttle" or voltage to apply; the ESC does this automatically in order to achieve the desired RPM. This calibration curve is used as a feed-forward term in the RPM controller. The ESC does support an "open loop" type of control where you specify the power from 0 to 100%, which is similar to a standard ESC, but PX4 does not use that ESC control mode.

      By the way, you can test the ESC directly (without PX4) using our voxl-esc tools (https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/tree/master/voxl-esc-tools), which work directly on VOXL2 or a standalone Linux PC (or Mac). voxl-esc-spin.py has a --power argument where you specify the power from 0 to 100, which translates directly to the average duty cycle applied to the motor.

      Here is the calibration for the Starling V2 motor / propeller that we use:
      https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/blob/master/voxl-esc-params/mavic_mini_2/mavic_mini_2.xml?ref_type=heads#L63

      Also, you can take a look at this post to see how to interpret those parameters a0, a1, a2 : https://forum.modalai.com/topic/2522/esc-calibration/2
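      As a rough sketch of how such coefficients could be applied as a feed-forward term (assuming the quadratic voltage-vs-RPM form; the function and any values you pass it are illustrative, not numbers from a real params file):

```python
# Hypothetical feed-forward sketch: map a desired RPM to an average motor
# voltage using an assumed quadratic calibration curve (in millivolts),
# then convert to a PWM duty cycle for the current battery voltage.

def feedforward_duty(rpm, battery_mv, a0, a1, a2):
    """Return the average PWM duty cycle [0..1] for a desired RPM."""
    voltage_mv = a0 + a1 * rpm + a2 * rpm * rpm  # calibration curve output
    duty = voltage_mv / battery_mv               # avg motor V = batt V * duty
    return max(0.0, min(1.0, duty))              # clamp to a valid duty cycle
```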

      We also have some dyno tests for this motor / propeller : https://gitlab.com/voxl-public/flight-core-px4/dyno_data/-/blob/master/data/mavic_mini2_timing_test/mavic_mini2_modal_esc_pusher_7.4V_timing0.csv . We are not sure how accurate that is, but it can be used as a starting point. @James-Strawson can you please confirm that is the correct dyno data for the Starling V2 motors?

      Alex

      posted in Ask your questions right here!
    • RE: Sending Recorded Video Though Camera Server on VOXL2

      @reber34 , perhaps this approach can work for you:

      • record a video encoded at a high bit rate (using voxl-camera-server and voxl-record-video). Please note that the output of voxl-record-video will not be in a standard container (such as mp4), but you can fix that with ffmpeg: ffmpeg -r 30 -i voxl-record-video.h264 -codec copy videofile.mp4
      • re-encode the video offline with desired codecs / bit rates / resolutions
      • install gst-rtsp-launch which uses gstreamer to set up an RTSP stream https://github.com/sfalexrog/gst-rtsp-launch/
        • you will first need to figure out which gstreamer pipeline to use on VOXL2 to load your video and parse the h264/h265 frames (you can use a null sink for testing), and then use that pipeline with gst-rtsp-launch, which will take the encoded frames and serve them over an RTSP stream.
      • gstreamer may be more flexible for tuning the encoding parameters of h264/h265 (compared to voxl-camera-server) and you can also use it in real time later (using voxl-streamer, which uses gstreamer under the hood)

      Another alternative is to use voxl-record-raw-image to save raw YUVs coming from voxl-camera-server and then use voxl-replay and voxl-streamer - the latter will accept YUVs from the MPA pipe and encode them using the bit rate that you want. Note that depending on the image resolution, YUV images will take a lot more space than encoded video, but maybe that is also OK since VOXL2 has lots of storage.

      Alex

      posted in Ask your questions right here!
    • RE: ESC failure error after SDK 1.1.2 upgrade

      @smilon , voxl-esc-calibrate.py is a script that runs a test procedure on a single motor (with propeller mounted) to calibrate the behavior of the motor / propeller. This procedure only needs to be run once, and only if you change the motor or propeller type from a default configuration. The output of this script is just 3 coefficients (a0, a1, a2), which you need to manually enter into an ESC calibration xml file and then upload that xml parameter file to the ESC. Full details about the ESC calibration (when to do it and how) can be found here : https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-esc/-/blob/master/voxl-esc-tools/calibration.md?ref_type=heads

      If you are using standard motors and propellers (one of the standard ModalAI drones), you do not need to run this calibration procedure.

      It sounds like you got it working; I believe voxl-configure-mpa took care of it. You can see what voxl-configure-mpa typically does here : https://docs.modalai.com/voxl-configure-mpa/ , which includes running voxl-esc to upload the latest firmware and params for a specific vehicle.

      posted in ESCs
    • RE: voxl_mpa_to_ros2 camera_interface timestamp

      @smilon ,

      I believe you are correct! Thank you. We will double check this and fix.

      posted in ROS
    • RE: OV7251 RAW10 format

      Hello @Gicu-Panaghiu,

      I am going to assume you are using VOXL1, since you did not specify.

      We do have RAW8 and RAW10 support for OV7251. The selection of the format has to be done in several places.

      First, you have to select the correct camera driver, specifically..

      ls /usr/lib/libmmcamera_ov7251*.so
      /usr/lib/libmmcamera_ov7251.so
      /usr/lib/libmmcamera_ov7251_8bit.so
      /usr/lib/libmmcamera_ov7251_hflip_8bit.so
      /usr/lib/libmmcamera_ov7251_rot180_8bit.so
      /usr/lib/libmmcamera_ov7251_vflip_8bit.so
      

      There are 5 options; the one ending in _8bit.so natively outputs 8-bit data (all the others output 10-bit data).

      The driver name, such as ov7251_8bit, has to match the sensor name <SensorName>ov7251_8bit</SensorName> in /system/etc/camera/camera_config.xml.

      You can check camera_config.xml for what sensor library is used for your OV7251.

      When you run the voxl-configure-cameras script, it actually copies one of the default camera_config.xml files that are set up for a particular use case, and I believe it will indeed select the 8-bit one. This was done to save the CPU cycles needed to convert 10-bit to 8-bit, since the majority of the time only 8-bit pixels are used.

      Now, you mentioned that HAL_PIXEL_FORMAT_RAW10 is passed to the stream config; unfortunately, this does not have any effect on what the driver outputs. If the low-level driver (e.g. libmmcamera_ov7251_8bit.so) is set up to output RAW8, it will output RAW8 whether you request HAL_PIXEL_FORMAT_RAW8 or HAL_PIXEL_FORMAT_RAW10.

      So if you update camera_config.xml to use the 10-bit driver and keep HAL_PIXEL_FORMAT_RAW10 in the stream config (then sync and reboot), you should get a 10-bit RAW image from the camera. But since the camera server currently expects an 8-bit image, the image will appear garbled if you simply interpret it as 8-bit, so you will need to handle the 10-bit image (and decide what you want to do with it) in the camera server.
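      For reference, RAW10 data in these pipelines is typically MIPI-packed (4 pixels in every 5 bytes). A minimal sketch of unpacking it, assuming that packing (this helper is illustrative, not part of voxl-camera-server):

```python
def unpack_raw10(packed):
    """Unpack MIPI-packed RAW10 (5 bytes -> 4 pixels) into 10-bit values.

    Each group of 5 bytes holds the 8 MSBs of 4 pixels, followed by one
    byte containing the 2 LSBs of each of those pixels.
    """
    out = []
    for i in range(0, len(packed), 5):
        b0, b1, b2, b3, lsbs = packed[i:i + 5]
        for j, msb in enumerate((b0, b1, b2, b3)):
            out.append((msb << 2) | ((lsbs >> (2 * j)) & 0x3))
    return out
```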

      posted in Image Sensors
    • RE: Minimal example of using camera_cb

      Hi

      If you are looking for lower-level functionality that is a bit easier to experiment with, you may want to check out https://gitlab.com/voxl-public/utilities/voxl-rtsp/-/tree/dev . This tool is good for testing or running simple scenarios. It accepts some commands (like exposure control) via the command line, so you could add a simple feature to save an image upon a command-line input. Please see the README for how the exposure control input works, and the code for how it is implemented.

      Please make sure that voxl-camera-server is disabled when you run voxl-rtsp.

      If you still would like to go through the voxl-camera-server approach, we may need some more input from respective devs :).

      I hope this helps..

      Alex

      posted in VOXL m500 Reference Drone
    • RE: Cannot change TOF framerate

      The ipk is available here now : http://voxl-packages.modalai.com/stable/voxl-hal3-tof-cam-ros_0.0.5.ipk - you should be able to use the launch file to choose between the two modes (5 = short range, 9 = long range) and the fps, which are listed in the launch file.

      posted in Ask your questions right here!
    • RE: VOXL ESC Mini 4-in-1 Current per Motor

      @Moderator said in VOXL ESC Mini 4-in-1 Current per Motor:

      Is it possible to step up voltage?

      Can you please clarify the question? 🙂

      The Mini ESC is designed for small drones ( < 500g ). The ESC has been tested to handle 15A continuous at 15V input (60+ seconds), but with full direct airflow from the propellers. This would simulate a full-throttle "punch-out" on a small FPV drone (high current, but also lots of direct airflow = cooling). Do not use this ESC if the drone needs 10-15A per channel just to hover. Use it in applications where the hover current per motor is less than 5A (ideally 2-3A, which is very typical) and the absolute maximum continuous current per motor is 10-15A.

      For example, motors used for small FPV drones are often around 1306 size (3-4S LiPo). Those motors are usually rated for up to 10-12A continuous (for 30-60 seconds). Larger motors can be used as long as the maximum motor current does not exceed 10-15A (still 2-3A at hover) and there is sufficient cooling.

      Always check the ESC board temperature during initial flights / tuning. The temperature must stay below 110C at all times (critical); it is typically in the range of 40-70C for most applications. The ESC will most likely fail above 125C.

      The temperature of the ESC board is the limiting factor because the board is so small. The MOSFETs can handle a lot of current as long as they do not overheat, so the design of the drone is very important: either keep the current low so that temperature is not an issue, or properly design the airflow from the propellers and/or add a heat spreader to keep the ESC board temperature in the normal range for higher-current applications.

      The ESC provides real-time temperature feedback, which can be viewed in PX4 / QGC. Additionally, the PX4 logs contain the temperature information.
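      Based on the limits above, a trivial monitoring sketch (the function name and bands are illustrative, not part of any ESC API) could classify the reported temperature like this:

```python
# Illustrative classifier for ESC board temperature, using the limits
# quoted above: typical operating range 40-70C, hard limit 110C.

def esc_temp_status(temp_c):
    """Map an ESC board temperature (C) to a coarse health status."""
    if temp_c >= 110.0:
        return "critical"   # must never be reached in flight
    if temp_c > 70.0:
        return "warning"    # above the typical 40-70C operating range
    return "normal"
```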

      posted in ESCs
    • RE: Onboard Image Processing using ROS + OpenCV (+ possibly some ML library in the future)

      @Prabhav-Gupta , yes, it seems the OpenCV and ROS YUV_NV12 formats do not match up. I will take a look at it. It seems the ROS YUV is packed (interleaved), while the standard for storing YUV NV12 is two planes : plane 1 : Y (size: width*height), plane 2 : interleaved UV (size: width*height/2)
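      The two-plane layout described above can be sketched like this (a minimal illustration in plain Python, not the actual conversion code):

```python
def split_nv12(buf, width, height):
    """Split a planar NV12 buffer into its Y plane and U/V samples."""
    y_size = width * height          # plane 1: full-resolution luma
    uv_size = width * height // 2    # plane 2: interleaved U/V, half size
    assert len(buf) == y_size + uv_size
    y = buf[:y_size]
    uv = buf[y_size:]
    u = uv[0::2]                     # de-interleave: U samples
    v = uv[1::2]                     # de-interleave: V samples
    return y, u, v
```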

      In the meantime, you can stream RTSP h264/h265 from VOXL (use decent quality so that the image looks good) and use OpenCV to receive the stream and get decompressed images: https://stackoverflow.com/questions/40875846/capturing-rtsp-camera-using-opencv-python

      Would that work for you? (Unfortunately, with an RTSP stream you will not get the full image metadata, like exposure, gain, timestamp, etc.)

      RTSP streaming can be done using voxl-streamer, which can accept either a YUV (and encode it) or already encoded h264/5 stream from voxl-camera-server.

      Alex

      posted in ROS

    Latest posts made by Alex Kushleyev

    • RE: Rotate imx412 stream

      @TomP , oh yeah, my bad.. you found the right one 🙂

      posted in Video and Image Sensors
    • RE: Rotate imx412 stream

      @TomP , actually there is already non-flip version in /etc/modalai/chi-cdk/imx412 . You can just use the sensormodule from that location with the same slot id (index at the end of sensormodule). Back up the current sensormodule (move it out of /usr/lib/camera/ or rename to .bak) and put the new one in the same location.

      Alex

      posted in Video and Image Sensors
    • RE: Rotate imx412 stream

      @TomP , ok no problem. I can add rotated sensormodules to our release and send you a copy. One more question - can you please provide me exact sensormodule name for imx412 that you have in /usr/lib/camera ?

      posted in Video and Image Sensors
    • RE: Connecting MSU-M0149-1, MSU-M0107, and VOXL 2 Time of Flight (TOF) Depth Sensor

      @QSL ,

      TOF is running at a pretty low MIPI rate (I can't recall exactly what it is), and it is easier to make a longer flex for lower transmission rates. I think that for experimentation we could try connecting several M0170 back to back, but that would not be recommended for anything other than one-off testing, because you would need about 3 of them back-to-back, and TOF already has a number of connections / interposers. Daisy-chaining several extension cables will contribute to signal and power losses and will make the whole assembly less reliable.

      For making a custom flex (a longer version of M0170), let's check with @Vinny :).

      Maybe you can get a vendor to make a longer version of the cable based on our design.

      Alex

      posted in System Architecture Design Reviews
    • RE: Starling 2: TOF sensor not recognized by VOXL Portal

      @Hector-Gutierrez , sorry about the delay. Before submitting an RMA (I will provide instructions), can you please share the output of dmesg after you run voxl-camera-server -l ? This will let me confirm that there is an attempt to ping the TOF sensor and that it fails. You can save the output of dmesg into a text file and share it, or paste it here.

      A successful dmesg output should look something like this (this is from a Starling 2 Max, which has an extra IMX412 camera):

      [ 1712.362675] CAM_ERR: CAM-MEM: cam_mem_mgr_create_debug_fs: 126 failed to create dentry
      [ 1712.383711] CAM_INFO: CAM-HFI: cam_hfi_init: 878 Init IO1 : [0x10c00000 0xcf300000] IO2 [0xe0200000 0x1ed00000]
      [ 1712.393864] CAM_INFO: CAM-ICP: cam_icp_mgr_process_dbg_buf: 2572 FW_DBG:CICP_FW_E : HFI  :QC_IMAGE_VERSION_STRING=CICP.FW.1.0-00079,OEM_IMAGE_VERSION_STRING=CRM,BUILD_TIME: Oct 17 2019 05:49:19,CACHE_ENABLED at icphostinterface.c:636 QC_IMAGE_VERSION_STRING=CICP.FW.1.0-00079 OEM_IMAGE_VERSION_STRING=CRM
      [ 1712.393868] CAM_INFO: CAM-ICP: cam_icp_mgr_process_dbg_buf: 2572 FW_DBG:CICP_FW_E : HFI  :ELF variant: CACHE-ENABLED:T480:API_V2:USE_CDM_1_1: , API version: 0x2000049 at icphostinterface.c:637 QC_IMAGE_VERSION_STRING=CICP.FW.1.0-00079 OEM_IMAGE_VERSION_STRING=CRM
      [ 1712.393916] CAM_INFO: CAM-ICP: cam_icp_mgr_hw_open: 3879 FW download done successfully
      [ 1712.470777] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor0: Linked as a consumer to regulator.58
      [ 1712.472096] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor0: Linked as a consumer to regulator.56
      [ 1712.474369] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor0: Linked as a consumer to regulator.60
      [ 1712.474408] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor0: Linked as a consumer to regulator.79
      [ 1712.498846] CAM_INFO: CAM-SENSOR: cam_sensor_driver_cmd: 918 Probe success,slot:0,slave_addr:0x30,sensor_id:0x356
      [ 1712.498997] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor0: Dropping the link to regulator.79
      [ 1712.502731] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor0: Dropping the link to regulator.60
      [ 1712.502868] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor0: Dropping the link to regulator.56
      [ 1712.502981] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor0: Dropping the link to regulator.58
      [ 1712.526976] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor1: Linked as a consumer to regulator.58
      [ 1712.527116] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor1: Linked as a consumer to regulator.60
      [ 1712.527175] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor1: Linked as a consumer to regulator.56
      [ 1712.527326] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor1: Linked as a consumer to regulator.79
      [ 1712.541852] CAM_INFO: CAM-SENSOR: cam_sensor_driver_cmd: 918 Probe success,slot:1,slave_addr:0x34,sensor_id:0x577
      [ 1712.541967] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor1: Dropping the link to regulator.79
      [ 1712.543257] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor1: Dropping the link to regulator.56
      [ 1712.543316] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor1: Dropping the link to regulator.60
      [ 1712.543392] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor1: Dropping the link to regulator.58
      [ 1712.561608] qcom,camera ac50000.qcom,cci:qcom,cam-sensor3: Linked as a consumer to regulator.60
      [ 1712.563718] qcom,camera ac50000.qcom,cci:qcom,cam-sensor3: Linked as a consumer to regulator.59
      [ 1712.565875] qcom,camera ac50000.qcom,cci:qcom,cam-sensor3: Linked as a consumer to regulator.55
      [ 1712.567770] qcom,camera ac50000.qcom,cci:qcom,cam-sensor3: Linked as a consumer to regulator.79
      [ 1712.575483] CAM_INFO: CAM-SENSOR: cam_sensor_driver_cmd: 918 Probe success,slot:3,slave_addr:0x7a,sensor_id:0x2975
      [ 1712.577616] qcom,camera ac50000.qcom,cci:qcom,cam-sensor3: Dropping the link to regulator.79
      [ 1712.579717] qcom,camera ac50000.qcom,cci:qcom,cam-sensor3: Dropping the link to regulator.55
      [ 1712.579758] qcom,camera ac50000.qcom,cci:qcom,cam-sensor3: Dropping the link to regulator.60
      [ 1712.579812] qcom,camera ac50000.qcom,cci:qcom,cam-sensor3: Dropping the link to regulator.59
      [ 1712.604128] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor6: Linked as a consumer to regulator.58
      [ 1712.606318] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor6: Linked as a consumer to regulator.56
      [ 1712.608394] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor6: Linked as a consumer to regulator.60
      [ 1712.608473] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor6: Linked as a consumer to regulator.79
      [ 1712.631999] CAM_INFO: CAM-SENSOR: cam_sensor_driver_cmd: 918 Probe success,slot:6,slave_addr:0x30,sensor_id:0x356
      [ 1712.632517] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor6: Dropping the link to regulator.79
      [ 1712.636492] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor6: Dropping the link to regulator.60
      [ 1712.636595] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor6: Dropping the link to regulator.56
      [ 1712.636686] qcom,camera ac4f000.qcom,cci:qcom,cam-sensor6: Dropping the link to regulator.58
      [ 1712.662182] qcom,camera ac50000.qcom,cci:qcom,cam-sensor2: Linked as a consumer to regulator.59
      [ 1712.662338] qcom,camera ac50000.qcom,cci:qcom,cam-sensor2: Linked as a consumer to regulator.60
      [ 1712.662399] qcom,camera ac50000.qcom,cci:qcom,cam-sensor2: Linked as a consumer to regulator.56
      [ 1712.662543] qcom,camera ac50000.qcom,cci:qcom,cam-sensor2: Linked as a consumer to regulator.79
      [ 1712.677605] CAM_INFO: CAM-SENSOR: cam_sensor_driver_cmd: 918 Probe success,slot:2,slave_addr:0x34,sensor_id:0x577
      [ 1712.677728] qcom,camera ac50000.qcom,cci:qcom,cam-sensor2: Dropping the link to regulator.79
      [ 1712.679699] qcom,camera ac50000.qcom,cci:qcom,cam-sensor2: Dropping the link to regulator.56
      [ 1712.679736] qcom,camera ac50000.qcom,cci:qcom,cam-sensor2: Dropping the link to regulator.60
      [ 1712.679789] qcom,camera ac50000.qcom,cci:qcom,cam-sensor2: Dropping the link to regulator.59
      [ 1712.739998] CAM_WARN: CAM-CRM: cam_req_mgr_close: 160 release invoked associated userspace process has died
      

      Specifically, the following line says that the TOF sensor has been probed successfully:

      [ 1712.575483] CAM_INFO: CAM-SENSOR: cam_sensor_driver_cmd: 918 Probe success,slot:3,slave_addr:0x7a,sensor_id:0x2975
      

      If the probe fails, you would see some CCI errors. If you do not see anything related to slave address 0x7a, then the sensor is not being probed and the issue is on the SW configuration side, which we can investigate further.
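      If it helps, a small scan of the dmesg text for those probe lines could look like this (an illustrative helper, not a ModalAI tool; it matches the CAM-SENSOR "Probe success" format shown above):

```python
import re

# Matches lines like:
#   CAM_INFO: CAM-SENSOR: ... Probe success,slot:3,slave_addr:0x7a,sensor_id:0x2975
PROBE_RE = re.compile(
    r"Probe success,slot:(\d+),slave_addr:(0x[0-9a-f]+),sensor_id:(0x[0-9a-f]+)")

def probed_sensors(dmesg_text):
    """Return {slot: (slave_addr, sensor_id)} for every successful probe."""
    return {int(m.group(1)): (m.group(2), m.group(3))
            for m in PROBE_RE.finditer(dmesg_text)}
```

For example, running it over the dmesg snippet above and checking for slave address 0x7a would confirm whether the TOF sensor was probed.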

      Alex

      posted in Starling & Starling 2
    • RE: Rotate imx412 stream

      @TomP ,

      Are you using MISP in your config file or the Qualcomm ISP?

      There are typically two ways to rotate the image stream:

      • read out the image backwards (typically a one or two register change in the camera configuration)
        • reverse readout will also typically reverse the bayer pattern (RGGB vs BGGR), so whatever is doing the debayering also needs to be aware of the bayer pattern change
      • rotate the image post-readout in the ISP or GPU; I believe even the Qualcomm pipeline does the rotation in the GPU (however, we have never used the rotation in the QC pipeline within voxl-camera-server).

      For some cameras, like the OV7251, OV9782 and AR0144, the en_rotate flag from voxl-camera-server.conf causes the appropriate registers to be updated to set the reverse readout, and the same flag is also used to debayer correctly (in the OV9782 case). https://gitlab.com/voxl-public/voxl-sdk/services/voxl-camera-server/-/blob/dev/src/cci_direct_helpers.cpp?ref_type=heads#L73
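      The Bayer-pattern change caused by reverse readout can be illustrated with a tiny sketch (illustrative only; the 2x2 tile is written row-major, e.g. "RGGB"):

```python
# Illustrative only: how flipping the readout direction changes the
# 2x2 Bayer tile. "RGGB" means rows [R, G] then [G, B].

def flip_bayer(pattern, hflip=False, vflip=False):
    """Return the Bayer pattern seen after horizontal/vertical flips."""
    tile = [[pattern[0], pattern[1]], [pattern[2], pattern[3]]]
    if hflip:
        tile = [row[::-1] for row in tile]  # reverse column order
    if vflip:
        tile = tile[::-1]                   # reverse row order
    return "".join(tile[0]) + "".join(tile[1])
```

So a 180-degree (reverse) readout of an RGGB sensor produces BGGR data, which is why the debayering stage must be told about the change.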

      For the IMX412 we have not made that option available yet, but I know it is trivial to do, so I can make that change.

      If you do use the QC ISP pipeline (the small_encoded or large_encoded streams), then in order to make this work we actually generate another sensormodule driver that sets the reverse-readout registers and informs the ISP pipeline that the Bayer pattern is reversed. This is not ideal, because we need to maintain another set of sensormodule drivers. Ideally there would be a flag in the ISP pipeline to cause the rotation, but the following does not seem to work: https://gitlab.com/voxl-public/voxl-sdk/services/voxl-camera-server/-/blob/dev/src/hal3_camera_mgr.cpp?ref_type=heads#L87 (the flag is used to set rotation in stream requests below in the same file)

      Please let me know which streams you are actually using for IMX412 and I can help you.

      Alex

      posted in Video and Image Sensors
    • RE: Teledyne FLIR Lepton 3.5 Integration

      @george-kollamkulam , the lepton tracker is an implementation of an optic-flow tracker for the Lepton data; you can see the details here : https://gitlab.com/voxl-public/voxl-sdk/services/voxl-lepton-tracker/-/blob/master/src/feature_tracker.cpp

      posted in Ask your questions right here!
    • RE: AR0144 RGB output on VOXL2

      @Jordyn-Heil , you will need to use a recent version of voxl-camera-server (you can build it from dev or get a package from the nightly builds).

      Then edit the camera server config file (/etc/modalai/voxl-camera-server.conf) and replace the camera type ar0144 with ar0144-color. If you get an error (unknown camera type), it means you are using an older version of the camera server.

      Currently, the debayering is happening on the CPU.

      Here is the place where processing is happening for the color version of AR0144. https://gitlab.com/voxl-public/voxl-sdk/services/voxl-camera-server/-/blob/dev/src/hal3_camera_mgr.cpp?ref_type=heads#L1477

      • debayer from GRBG to RGB
      • run AWB (auto white balance)
      • run AE (auto exposure)
      • send out RGB, grey, and original raw bayer frames
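      As a rough illustration of the AWB step above (a generic gray-world sketch, not the actual camera-server code; see the linked hal3_camera_mgr.cpp for the real implementation):

```python
# Gray-world auto white balance, illustrative only: assume the scene
# averages to gray, so scale R and B so their means match the G mean.

def gray_world_gains(r_avg, g_avg, b_avg):
    """Per-channel gains that pull the R and B means toward the G mean."""
    return g_avg / r_avg, 1.0, g_avg / b_avg

def apply_gains(pixel, gains):
    """Apply (r, g, b) gains to one 8-bit RGB pixel, clipping at 255."""
    return tuple(min(255, int(c * g)) for c, g in zip(pixel, gains))
```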

      Please try it out.

      Alex

      posted in Image Sensors
    • RE: Camera not being identified on J7U

      @Idan-Fiksel , it looks like it worked, since your machine variant is now 1.0.0 (I assume it was 1.0.1 before). Sorry, the numbering in the name of the kernel is 00.0 vs 00.1…

      posted in Video and Image Sensors
    • RE: Camera not being identified on J7U

      @Idan-Fiksel , yes two identical cameras (TOF in this case) should work on J6L and J8L. Two things to check in this case (I already verified):

      • make sure there is no CCI (I2C) slave address conflict; in other words, the camera ports should use different CCI busses. I verified this using https://docs.modalai.com/voxl2-connectors/ - J6L uses the CCI0 bus (CAM0_CCI0) and J8L uses the CCI1 bus (CAM4_CCI1)
      • make sure the camera driver is present for the camera slots you need (0 and 4):
      ls /usr/share/modalai/chi-cdk/irs2975c
      com.qti.sensormodule.irs2975c_0.bin  com.qti.sensormodule.irs2975c_2.bin  com.qti.sensormodule.irs2975c_4.bin
      com.qti.sensormodule.irs2975c_1.bin  com.qti.sensormodule.irs2975c_3.bin  com.qti.sensormodule.irs2975c_5.bin
      

      The last thing: you may need to limit the maximum "exposure" of the TOF sensor (maximum emitted power) using voxl-camera-server.conf in order to prevent drawing too much current in bursts. This will depend on the other load on VOXL2's power supply, etc., and will need testing to make sure that VOXL2's power supply is not affected by drawing too much current under full system load.

      Please consult the following document for the power consumption and different operating modes of the PMD TOF sensor : https://docs.modalai.com/M0178/#currentpower-consumption

      Alex

      posted in Video and Image Sensors