@박지현 Unfortunately, for Ardupilot, our ESCs are only supported over the UART interface when using the VOXL 2 / VOXL 2 Mini as the flight controller. When using Flight Core v2, that is not supported.
ModalAI Team
-
RE: Using Ardupilot with Flight Core v2 + VOXL Mini 4-in-1 ESC: Min/Max PWM Mismatch and motors keep spinning after disarm posted in ESCs
Tagging @Eric-Katzfey for the Ardupilot-related question.
Some answers:
3-1: VOXL ESC supports PWM input via the aux pads (as you probably already know, but just noting it here): https://docs.modalai.com/voxl-mini-esc-datasheet/#pwm-inputs--outputs. The command range is 1000us = 0% duty cycle, 2000us = 100% duty cycle (linear mapping). No calibration is needed or supported (the MCU clock is quite accurate).
3-2: Recommended PWM frequency: up to 1/(2000us) = 500Hz, as long as the pulses have some off time between them; one-shot is supported.
3-3: I don't think we have tested the mini ESC with Ardupilot in PWM mode, but the ESC should behave very similarly to a standard ESC in PWM control mode.
3-4: there is no way to set the zero-throttle PWM value in the ESC params, however here is my recommendation:
- To stop the motor and keep it off, send a 950us pulse.
- To start the motor, send a 1050us pulse and operate in the 1050-2000us range.
- The turn-off point is 1000us with some hysteresis, but you don't want to operate close to that point and risk the motors shutting off in flight.
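The mapping above can be sketched in code. This is a minimal illustration of the recommendation (950us stop, 1050-2000us operating range); the function name and normalized-throttle convention are mine, not a ModalAI API:

```python
def throttle_to_pulse_us(throttle: float) -> int:
    """Map a normalized throttle (0.0-1.0) to a PWM pulse width in
    microseconds for the VOXL Mini ESC in PWM mode.

    Per the notes above: the ESC turns off around 1000us (with some
    hysteresis), so zero throttle is sent as 950us and the operating
    range starts at 1050us to keep margin from the turn-off point.
    Pulses can be sent at up to ~500Hz (1/2000us) as long as there is
    some off time between them.
    """
    STOP_PULSE_US = 950    # safely below the ~1000us turn-off point
    MIN_PULSE_US = 1050    # lowest recommended running pulse
    MAX_PULSE_US = 2000    # full throttle

    if throttle <= 0.0:
        return STOP_PULSE_US
    throttle = min(throttle, 1.0)
    return round(MIN_PULSE_US + throttle * (MAX_PULSE_US - MIN_PULSE_US))
```

For example, 50% throttle maps to 1525us, and any non-positive throttle maps to the 950us stop pulse.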
Also, we do not support DShot input on VOXL ESCs. Spin direction reversal is not supported with the PWM commands.
The UART protocol is recommended because you avoid the PWM-related range issues, you get telemetry back from the ESCs, and a much higher command rate is supported.
Alex
-
RE: Python Programmatic GStreamer Access for Hardware Encoded Acceleration and Low Latency posted in Ask your questions right here!
You did not provide the actual error you are seeing, but I can try to guess what it is (even if I guess wrong, the details below should help you anyway). The default build of the `voxl-opencv` package does not have python3 support. So if you are using GStreamer with OpenCV in Python and that is the error, you should install the `voxl-opencv` package that I built with python3 support: https://storage.googleapis.com/modalai_public/temp/voxl-opencv_4.5.5-3_arm64.deb
- source : https://gitlab.com/voxl-public/voxl-sdk/third-party/voxl-opencv/-/tree/add-python3-bindings
Below, I will assume that you want to grab images from the Boson part of the Hadron; a similar approach should apply to the RGB camera in the Hadron.
First, you should test the ability to grab images without Python. You may need to change the camera number depending on which camera ID your Boson is assigned.
IMPORTANT: make sure that `voxl-camera-server` is not running while you are trying to use GStreamer:

```shell
systemctl stop voxl-camera-server
```

Tip: you can actually stream video using X forwarding with ssh. This should stream a live Boson feed from the VOXL 2 to your Linux machine:

```shell
ssh -Y username@<voxl-ip> gst-launch-1.0 qtiqmmfsrc camera=0 ! "video/x-raw,width=640,height=512,framerate=30/1" ! videoconvert ! autovideosink
```

Or display the image directly in the terminal as ASCII:

```shell
gst-launch-1.0 qtiqmmfsrc camera=0 ! "video/x-raw,width=640,height=512,framerate=30/1" ! autovideoconvert ! aasink
```

And then finally, a Python script that grabs h264 video from qtiqmmfsrc, decodes it, and returns the frames to Python:
```python
import time
import cv2

# get RGB (BGR?) directly
#stream = 'gst-launch-1.0 qtiqmmfsrc camera=0 ! video/x-raw,width=640,height=512,framerate=30/1 ! autovideoconvert ! appsink'

# get h264 -> decode -> BGR
stream = ('gst-launch-1.0 qtiqmmfsrc camera=0 ! '
          'video/x-h264,format=NV12,profile=high,width=640,height=512,framerate=30/1 ! '
          'h264parse ! qtivdec ! qtivtransform ! '
          'video/x-raw,format=BGR,width=640,height=512 ! autovideoconvert ! appsink')
print(stream)

vcap = cv2.VideoCapture(stream, cv2.CAP_GSTREAMER)
frame_cntr = 0
while 1:
    ret, frame = vcap.read()
    if ret:
        frame_cntr += 1
        print('got frame %d with dims' % frame_cntr, frame.shape)
```

Hopefully, that works for you.
Final recommendation: if you use `qtiqmmfsrc` this way, the Boson data is processed by the Qualcomm ISP, and unless you have a special tuning file for the Boson, the processed output will have degraded quality. The Boson, by default, outputs a post-AGC 8-bit image which is already processed and does not need to be processed further by the ISP. I am not sure whether you can get RAW8 data (unmodified data from the Boson) out of qtiqmmfsrc.

We handle the above issue in `voxl-camera-server` by working with the RAW8 data directly. We also recently started experimenting with 14-bit pre-AGC data from the Boson, which would need some processing before it is usable (if you are interested in that, I can share some more information).

Finally, if you would like to use `voxl-camera-server`, which is what we recommend and support, there is also a way to get encoded h264/h265 data into Python (using our experimental pympa (Python MPA bindings)). That is a topic for a discussion in another post, if you are interested.

Alex
-
RE: Python Programmatic GStreamer Access for Hardware Encoded Acceleration and Low Latency posted in Ask your questions right here!
Hi @joseph-vale
This is not my wheel house, but maybe @Alex-Kushleyev can give you some pointers.
Thanks!
Vinny -
RE: Starling 2 Max doesn't report battery information posted in Starling & Starling 2
@Hunter-Scott Can you update to SDK 1.6.0? Then run the `ver all` and `qshell voxl_esc status` commands again and attach the output?
-
RE: Running QVIO on a hires camera posted in GPS-denied Navigation (VIO)
@Rowan-Dempster, you should use a monochrome stream (`_grey`), since QVIO needs a RAW8 image. If you are not using MISP on the hires cameras, that is fine; you can start off using the output of the ISP.
You should calibrate the camera using whatever resolution you decide to try. This avoids any confusion, since if you are using the ISP pipeline, the camera pipeline may select a higher resolution and then downscale + crop. So whenever you change resolutions, it is always good to do a quick camera calibration to confirm the camera parameters.
When using MISP, we have more control over which camera mode is selected, because MISP gets the RAW data, not processed by the ISP, so we know the exact dimensions of the image sent from the camera.
Alex
-
RE: VOXL2 Time posted in Ask your questions right here!
@voxltester It uses NTP to update the time and date. So, once you have an internet connection, it will update.
-
RE: Running QVIO on a hires camera posted in GPS-denied Navigation (VIO)
We have not tried this recently, but it should work. Here are some tips:
- Use IMX412 camera (M0161 or similar) because it has great image quality and the fastest readout speed of all of our cameras (IMX214 is not recommended for this, it is an old and "slow" camera sensor)
- faster readout = less rolling shutter skew
- use the latest camera drivers, which max out the camera operating speed in all modes : https://storage.googleapis.com/modalai_public/temp/imx412_test_bins/20250919/imx412_fpv_eis_20250919_drivers.zip
- The readout times are documented here for all modes : https://docs.modalai.com/camera-video/low-latency-video-streaming/#imx412-operating-modes
- for example, the 1996x1520 (2x2 binned) mode has about 5.5ms readout time, which is pretty short
- QVIO (mvVISLAM.h) has a "readout time" parameter, which suggests that it supports rolling shutter. I have not tried it myself, but I heard that it does work:

```
mvVISLAM_Initialize(... float32_t readoutTime ...)
@param readoutTime Frame readout time (seconds). n times row readout time.
Set to 0 for global shutter camera. Frame readout time should be (close to)
but smaller than the rolling shutter camera frame period.
```

Here is where this param is currently set to 0 in `voxl-qvio-server`: https://gitlab.com/voxl-public/voxl-sdk/services/voxl-qvio-server/-/blob/master/server/main.cpp?ref_type=heads#L371
- in order to correctly use the readout time, you have to ensure that the camera pipeline indeed selects the correct camera mode (for which there is the corresponding readout time): https://docs.modalai.com/camera-video/low-latency-video-streaming/#how-to-confirm-which-camera-resolution-was-selected-by-the-pipeline
- also, the readout time is printed out by `voxl-camera-server` when you run it in `-d` mode (readout time in nanoseconds here):

```
VERBOSE: Received metadata for frame 86 from camera imx412
VERBOSE: Timestamp: 69237313613
VERBOSE: Gain: 1575
VERBOSE: Exposure: 22769384
VERBOSE: Readout Time: 16328284
```

- keep the exposure low to avoid motion blur (IMX412 has quite a high analog gain, up to 22x, plus 16x digital gain). If you want to prioritize gain vs exposure, you would need to tweak the auto exposure params in the camera server (when you get to that point, I can help you).
- it would be interesting to compare performance against QVIO with AR0144 - that would probably require collecting images from AR0144 and IMX412 (side by side) + IMU data and running QVIO offline with each camera.
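One small bookkeeping detail worth making explicit: `voxl-camera-server` prints the readout time in nanoseconds, while the mvVISLAM `readoutTime` parameter expects seconds. A minimal sketch of the conversion (the function name here is mine, not part of any ModalAI or Qualcomm API):

```python
def readout_ns_to_seconds(readout_ns: int) -> float:
    """Convert a readout time in nanoseconds (as printed by
    voxl-camera-server in -d mode) to the seconds expected by the
    mvVISLAM_Initialize readoutTime parameter."""
    return readout_ns * 1e-9

# Example using the value from the VERBOSE log above:
print(readout_ns_to_seconds(16328284))  # ~0.0163 s, i.e. about 16.3 ms
```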
Good luck if you try it! Let me know if you have any other questions. Please keep in mind that QVIO is based on a closed-source library from Qualcomm, so our support of QVIO is limited.
Alex
-
RE: tracking down pipe switching to images of tracking front camera posted in Video and Image Sensors
@mark, thank you for confirming. The error is the same one I am observing. It should not matter whether it is an 8- or 12-bit image (in your case it is 8-bit).
In my case the issue happens on M0154 board. I don't think M0054 vs M0154 makes any difference.
Just to clarify what is actually happening:
The issue starts with some CRC errors (interference between multiple cameras, it seems). In my case, the rate of CRC errors depends on the position of the ucoax cables (you can also watch these CRC errors while moving the coax cables for the second tracking camera).
If an error occurs at a critical point (such as the beginning of a frame), the camera pipeline reports a critical problem and should raise an error in the camera pipeline.
The correct behavior would be for the problematic camera to stop streaming; in the latest SDK it will actually be restarted and the stream will recover. However, the camera is never stopped; instead, it reports a duplicate image from the other tracking camera.
We are investigating the root cause, as it is somewhere low-level in the camera stack (not in `voxl-camera-server`).

Alex
-
RE: Flashing Custom Ardupilot Firmware posted in Ask your questions right here!
@clange Yes, definitely. All the support to build a Debian package for installation is in https://github.com/ArduPilot/ardupilot/tree/master/libraries/AP_HAL_QURT/packaging