@restore I don't think this is possible today. How did you set it up to take the snapshot? How is it triggered?
ModalAI Team
-
RE: Trigger Hadron camera with DO_SET_CAM_TRIGG_DIST posted in VOXL 2
-
RE: voxl io server posted in VOXL Accessories
Hi @qubotics-admin ,
Our understanding is that the voxl2_io driver should present itself as a generic PWM output (up to 8 channels), and you should be able to map any function you need to it, although we have not explicitly tested every function. Can you please clarify which version of voxl-px4 you are using and elaborate on the "limited options" you have for selecting the PWM function (what are you expecting to be able to set, and what are the options)?
Alex
-
RE: Yaw Error Estimate on PX4 v1.15 posted in Ask your questions right here!
@ndwe You could also ask on the PX4 forum to see what ideas the PX4 community has on this issue.
-
RE: Yaw Error Estimate on PX4 v1.15 posted in Ask your questions right here!
@ndwe I would first try upgrading to v1.16.1 and see if the issue persists. Some other ideas: make sure EKF2_MAG_DECL is set properly for your area, try setting EKF2_MAG_TYPE to 1, and/or increase COM_ARM_EKF_YAW to a higher value (like 0.7).
-
RE: VOXL2 Mini – Royale createDevice failed when starting ToF camera posted in Ask your questions right here!
Which TOF sensor do you have plugged into VOXL2 Mini (there are two options), and what is the camera type listed for the TOF entry in voxl-camera-server.conf? There may be a mismatch. The config file TOF options are:
- pmd-tof for the (EOL) TOF V1: https://docs.modalai.com/M0040/
- pmd-tof-liow2 for the new TOF V2: https://docs.modalai.com/M0169/ (https://docs.modalai.com/M0178/), two variants that differ in the connector type that connects them to VOXL2
Alex
-
RE: Running QVIO on a hires camera posted in GPS-denied Navigation (VIO)
Hi @Rowan-Dempster , you are right: there is currently no option to log from an ion buffer using voxl-logger. However, voxl-replay has an option to send camera frames as ion buffers. I have not tested it recently, but you could give it a try: https://gitlab.com/voxl-public/voxl-sdk/utilities/voxl-logger/-/blob/extend-cam-logging/tools/voxl-replay.cpp
For offline processing, it should not matter much whether you are using ion buffers or not. There would be a bit more CPU usage, but hopefully not too much. Having voxl-replay support camera playback as ion buffers is probably more important than using ion buffers for logging, since then your offline processing pipeline uses the same flow (ion buffers) as the live VIO pipeline. We may add logging from ion buffers, but it's probably not a high priority.
By the way, I wanted to mention one detail. I recently made a change in the camera server dev branch to allocate the ion buffers used for incoming raw images as uncached buffers. This happens to reduce CPU usage (the CPU does not have to check / flush the cache before sending the buffer to the GPU). The reduction was about 5% of one core per camera (for a large 4K image). For the majority of hires camera use cases this is beneficial, because the CPU usually never touches the raw10 image before it is sent to the GPU.
However, when you are logging the raw10 image to disk using voxl-logger, the CPU has to read the contents of the whole image and send it via the pipe, and uncached reads are more expensive. There will be increased CPU usage (I don't remember how much), but it should still be fine unless you are trying to log very large images at a high rate. If you want to profile the CPU usage while logging, you can simply disable making the raw buffers uncached and see if that helps. I have not yet figured out a clean way to handle this; maybe I will add a param for the type of caching to use for the raw camera buffer.
Look for the following comment in https://gitlab.com/voxl-public/voxl-sdk/services/voxl-camera-server/-/blob/dev/src/hal3_camera_mgr.cpp :
//make raw preview stream uncached for Bayer cameras to avoid cpu overhead mapping mipi data to cpu, since it will go to gpu directly
Alex
-
RE: Running QVIO on a hires camera posted in GPS-denied Navigation (VIO)
Hi @Rowan-Dempster ,
Yes, there are still some perf optimizations that I need to merge to dev. I will take a look at what is left...
Regarding the intrinsics question: in a perfect world, if you have a 4040x3040 image and perform intrinsic calibration (focal length, principal point, distortion), then when you scale the resolution by a factor of N, the focal length and principal point scale by exactly N while the fisheye distortion parameters remain the same (fisheye distortion is a function of angle, not pixels, and downscaling the image does not change the angle of a pixel, as long as the focal length and principal point are downscaled accordingly).
In practice, if you want to map a calibration from one resolution to another, you have to be careful and know exactly how the two resolutions relate to each other. Specifically, there may be a pixel offset in the second resolution. Take a look at the two resolutions we are dealing with, 4040x3040 and 1996x1520, and you will notice that 4040/1996 != 3040/1520. So what happened? Why aren't we using 2020, the exact factor-of-2 binning of 4040? The answer relates to the OpenCL debayering function, which uses hw-optimized GPU instructions and only accepts certain widths without having to adjust the image stride before feeding the buffer to the GPU (which would have some CPU overhead). It so happened that 1996 was the closest width to 2020 that was acceptable for the specific OpenCL debayering function. The camera would happily output 2020x1520 if we configured it to do so.
The next question is how 1996 pixels relate to the ideal width of 2020 (the exact N=2 downscale of 4040). The answer (very specific to how we implemented it at the time of writing the driver) is that to get from 2020 to 1996, we cut off pixels on the right of the image (essentially offset 0, but the image width is reduced to 1996). This means the x principal point stays in exactly the same location (divided by 2). This would not be true if the 1996-pixel span were shifted (centered around 2020/2 = 1010, which is arguably the more reasonable thing to do).
So you should be able to scale the focal length AND principal point by 2 to get the calibration for the 1996x1520 resolution, if you are converting intrinsics calibrated at 4040x3040. But don't take my word for it: double check by calibrating at both resolutions. After confirming, you should not have to do two calibrations for each camera.
In future, we may enable the full (uncropped) binned resolution, but it's a pretty small crop, so it may not be worth it.
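The scaling argument above can be sketched numerically. This is a hypothetical illustration, not ModalAI code; the function name and the calibration numbers are made up:

```python
# Hypothetical sketch (not ModalAI code): scaling pinhole intrinsics from
# 4040x3040 down to the cropped, binned 1996x1520 mode described above.
def scale_intrinsics(fx, fy, cx, cy, n=2.0):
    """Downscale by factor n; fisheye distortion coefficients are unchanged
    because they are a function of ray angle, not pixel coordinates."""
    return fx / n, fy / n, cx / n, cy / n

# Illustrative (made-up) intrinsics calibrated at 4040x3040.
fx, fy, cx, cy = 2800.0, 2800.0, 2020.0, 1520.0
fx2, fy2, cx2, cy2 = scale_intrinsics(fx, fy, cx, cy)

# The binned mode is 1996 wide because pixels are cut off on the RIGHT
# (offset 0), so the principal point needs no shift -- it only scales.
# If the crop were centered instead, cx would need an extra offset:
shift_if_centered = (2020 - 1996) / 2  # 12 px, only if the crop were centered
print(fx2, cx2)  # 1400.0 1010.0
```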
Alex
-
RE: Running QVIO on a hires camera posted in GPS-denied Navigation (VIO)
Here is an outline of the data flow that you may want to start with:
Choose the camera resolution, since it cannot be changed after logging
- IMX412: 4040x3040 or 1996x1520 (2x2 binned), use full frame to maximize FOV (4:3 aspect ratio, not 16:9, which will crop the image)
voxl-logger + copy intrinsics / extrinsics
- log raw10 bayer and grey or normalized (or all 3)
- log imu data
- save camera intrinsics
- save camera and imu extrinsics (extrinsics.conf)
voxl-playback
-
option 1: simple:
- playback grey / normalized, feed directly into voxl-qvio server
-
option 2: more complex
- playback raw10 bayer
- run offline misp implementation to debayer + AWB + normalize
- publish grey / normalized
-
both options:
- run voxl-qvio-server, which will load voxl-qvio-server.conf
- playback imu into voxl-qvio-server
- qvio server loads camera calibration and extrinsics
- qvio server outputs vio pose
- use voxl-logger to log the output of qvio
- analyze the results of vio logs
Misc Notes
- QVIO performs better if the imu data is temperature compensated
- during drone take-off, the imu temperature and therefore imu biases can change quickly and QVIO may have trouble tracking the biases
- the bias correction can also be done offline and applied to the logged imu data to produce better bias-compensated imu data (gyro, accel)
- Consider also logging the global shutter camera (AR0144?) at the same time as IMX412, so that it is possible to compare output of QVIO using a global shutter camera vs rolling shutter.
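Option 2's offline debayer + normalize step could look roughly like the following toy sketch. This is a generic illustration, not the actual misp implementation; it assumes an RGGB pattern and raw10 data already unpacked into a 16-bit numpy array, and the black/white levels are placeholder values:

```python
import numpy as np

def debayer_green_channel(raw, black_level=64, white_level=1023):
    """Toy stand-in for misp's debayer + normalize: extract a grey image
    from a Bayer mosaic by averaging the two green sites of each 2x2 cell,
    then normalize to 8 bit. Assumes RGGB and raw10 values already unpacked
    into uint16 (this is NOT the real misp pipeline)."""
    g1 = raw[0::2, 1::2].astype(np.float32)  # green sites on red rows
    g2 = raw[1::2, 0::2].astype(np.float32)  # green sites on blue rows
    grey = (g1 + g2) / 2.0
    grey = (grey - black_level) / (white_level - black_level)  # 0..1 range
    return (np.clip(grey, 0.0, 1.0) * 255.0).astype(np.uint8)

# A 4040x3040 mosaic yields a half-resolution 2020x1520 grey image.
raw = np.random.randint(0, 1024, (3040, 4040), dtype=np.uint16)
grey = debayer_green_channel(raw)
print(grey.shape)  # (1520, 2020)
```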
-
RE: Running QVIO on a hires camera posted in GPS-denied Navigation (VIO)
Hi @Rowan-Dempster ,
The only things that you cannot change after the raw10 images are collected are:
- raw frame resolution and any other camera settings (which we typically don't change anyway)
- exposure and gain, as they are controlled by the auto exposure algorithm in real time
- the auto exposure algorithm only runs if at least one processed stream is requested (yuv, encoded, normalized, etc), you can see the logic here. If the camera server only has a client for the bayer image, the Auto Exposure will not be updated, which is not good. The solution should be:
- either log or use voxl-inspect-camera or any other client to get the yuv or grey stream going (or even encoded)
- log raw10 bayer only, but you could also log the output video or YUVs
- *** I will look into potentially adding a param to force the camera to always do AE processing even if there are no clients for the processed images (essentially disabling idling).
I would suggest not collecting too many data sets until you actually get the pipeline working, as you may realize that there is an issue in the data sets or something is missing. Probably best to focus on the processing pipeline first.
- You cannot simultaneously receive 2x2 binned and unbinned images from the camera, so it either has to be in unbinned mode (4040x3040) or binned mode (1996x1520). You can always do binning / resizing in misp, but the readout time will change for the larger resolution, as we discussed before. So one thing to test would be putting two IMX412 cameras side by side, simultaneously logging 4040x3040 (cam1) and 1996x1520 (cam2), and then running the offline processing pipeline to see if QVIO can indeed compensate properly for the larger rolling shutter skew. If you are able to run QVIO at the full 4040x3040 resolution, then you can have EVERYTHING: 4K video, EIS, VIO... You can still run EIS and save videos from the binned resolution, but they will be lower quality.
Another thing to keep an eye on is what you are logging (bandwidth). I did some tests a while ago, and VOXL2 write speed to disk is quite high, about 1.5GB/s, but you can run out of disk space pretty quickly. It should definitely be able to log 4040x3040@30 + 1996x1520@30 (or @60). You will probably need to use ion buffers to log the raw10 images (which is supported, I just need to test it) to skip the overhead of sending huge images over the pipes. The camera server already publishes the raw bayer via regular pipes and also via ion buffers.
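For a rough sense of the bandwidth involved, here is a back-of-the-envelope check assuming packed raw10 at 1.25 bytes per pixel (actual sizes depend on stride and padding, so treat these as ballpark figures):

```python
# Ballpark write-rate estimate for the dual-camera logging scenario above,
# assuming packed raw10 (10 bits per pixel = 1.25 bytes); real buffer sizes
# depend on stride/padding.
BYTES_PER_PIXEL = 10 / 8  # packed raw10

def raw_rate_mb_s(width, height, fps):
    """Approximate raw10 write rate in MB/s for one camera stream."""
    return width * height * BYTES_PER_PIXEL * fps / 1e6

full = raw_rate_mb_s(4040, 3040, 30)    # ~461 MB/s
binned = raw_rate_mb_s(1996, 1520, 30)  # ~114 MB/s
total = full + binned                   # ~574 MB/s, well under ~1.5 GB/s disk speed
print(round(full), round(binned), round(total))  # 461 114 574
```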
I am going to look over the components needed for this and make sure they are in good state:
- merge the new logging modes to dev (voxl-logger, voxl-replay)
- make a cleaner example of simple standalone misp to include AWB, although you don't need AWB for VIO
But either way, you should be able to start with logging tests and just see if you can play back the logs and get QVIO to do something reasonable from log playback on target. To bypass MISP for now, you could just log the output of MISP (grey or normalized), so you have fewer components in the initial test pipeline. Then build it up to include debayering, etc.
Perhaps it is easiest to log the raw10 bayer plus the grey or normalized image, so that the AE issue is solved as well (making sure AE is running); then for playback you can choose either the normalized stream (feed directly into voxl-qvio-server) or raw10, which will need debayering and other processing. It is good to have choices. But I would suggest starting with the lower resolution first until you have also double-checked the ability to log 4K raw images.
Alex