M0149 camera refocusing and tuning parameters
-
@Alex-Kushleyev Okay, thanks for the feedback. In that case, should I increase the reprojection error threshold in the voxl-camera-calibration package and save the calibration as long as it stays below 0.6? Any other suggestions are welcome. I will also perform handheld QVIO tests with this calibration. Is there any specific error I should be looking for in QVIO, like BAD_CAM_CAL?
-
@Aaky , yes, you can adjust the threshold, since it was tuned for the ov7251. A reprojection error of 0.5 or 0.6 is not going to matter too much.
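For context, the reprojection error reported by calibration is just the RMS pixel distance between the detected target corners and the corners re-projected through the fitted camera model. A rough OpenCV sketch (generic pinhole chessboard calibration, not the exact voxl-camera-calibration internals; the board geometry and image glob are placeholders):

```python
# Minimal sketch of where the reprojection error number comes from.
# Board size, square size and the image directory are placeholders.
import glob

import cv2
import numpy as np

pattern = (9, 6)   # inner corners of the chessboard
square = 0.025     # square size in meters
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts = [], []
for path in sorted(glob.glob("calib_images/*.png")):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# rms is the RMS reprojection error in pixels -- the 0.5 / 0.6 being discussed
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print(f"RMS reprojection error: {rms:.2f} px")
```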
QVIO will complain about BAD_CAM_CAL if the calibration is REALLY bad, so that probably won't be the case here.
The ultimate metric for QVIO is drift vs distance traveled (and stability); in good, feature-rich environments it can get as low as a few percent drift over distance.
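As a rough illustration of that metric (hypothetical log format, not a ModalAI tool), drift percentage can be computed from a logged closed-loop trajectory like this:

```python
# Sketch: drift as a percentage of distance traveled. `positions` is a
# hypothetical Nx3 array of logged VIO positions sampled over a loop that
# starts and ends at the same physical spot.
import numpy as np

positions = np.loadtxt("vio_positions.csv", delimiter=",")  # placeholder log
steps = np.diff(positions, axis=0)
distance_traveled = np.sum(np.linalg.norm(steps, axis=1))
end_error = np.linalg.norm(positions[-1] - positions[0])
print(f"drift: {100.0 * end_error / distance_traveled:.1f}% "
      f"over {distance_traveled:.1f} m")
```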
-
@Alex-Kushleyev Thanks for your feedback, Alex. I got a reprojection error of 0.51 on the AR0144 and have now started testing QVIO indoors. On a few occasions I got BAD_CAM_CAL while testing handheld. My MSV was around 70 and the extrinsics are calibrated correctly as per my UAV's IMU and camera positions. Please look at the image below. I didn't get any error other than BAD_CAM_CAL (of course discounting IMU and vibration errors).
As an observation, I didn't see odometry drift in QVIO: when I roamed around and came back to the same origin, my odometry reported x, y and z to be almost 0.
Even though the quality in the above image looks fine and there are enough points for tracking, why does QVIO report BAD_CAM_CAL? I am aware QVIO is a black box for everyone, but I just wanted your analysis.
Also, on a separate note, I read somewhere on the forum that there is some work happening on dual-camera support for QVIO. Can you provide any early-stage update on that? And how is the software architecture for a dual QVIO instance and fusion?
-
@Aaky ,
I am not sure why QVIO is reporting bad cam cal. Please double check that you entered the camera calibration parameters correctly for QVIO.
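One quick sanity check (a generic OpenCV sketch with placeholder values, not the VOXL tooling) is to undistort a saved tracking frame using the exact intrinsics you entered and confirm that straight edges in the scene come out straight; a typo in fx/fy/cx/cy or the distortion coefficients usually shows up as obvious residual warping:

```python
# Sketch: undistort a logged tracking frame with the intrinsics entered for
# QVIO and visually verify straight lines come out straight.
# K and dist below are placeholders -- substitute the exact values you
# configured for your AR0144 (and the distortion model you calibrated with).
import cv2
import numpy as np

K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])     # fx, fy, cx, cy placeholders
dist = np.array([-0.05, 0.01, 0.0, 0.0])  # placeholder distortion coeffs

img = cv2.imread("tracking_frame.png", cv2.IMREAD_GRAYSCALE)  # saved frame
undist = cv2.undistort(img, K, dist)
cv2.imwrite("tracking_frame_undistorted.png", undist)
```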
In your test image, this is actually not a great scenario for VIO because all the points are roughly in the same location and at the same distance. Try to test in a more feature rich environment and see if the bad camera calibration warning is still there.
Sometimes it is possible that something else is off and the algorithm "thinks" that it is caused by a bad camera calibration. It is trying to provide a solution within a large state space (many parameters) so it could incorrectly estimate some parameter and think the camera calibration is off.
We are not working on a dual QVIO fusion, but we are working on another flavor of VIO algorithm that supports dual cameras.
Alex
-
@Alex-Kushleyev Thanks, Alex. I will test this further. Any update on the lens shading correction for the corners of the AR0144 camera stream that we discussed previously?
-
@Aaky ,
Sorry, there is no update just yet. It is in the queue to experiment with, but we are testing the AR0144 with the same lens shading issues and the performance seems good. It could potentially be better with the shading corrected, but as we understand it, it is not a blocking issue at this point. Just to be realistic, I would estimate at least a few weeks before I have something to test for this feature.
Alex
-
@Alex-Kushleyev Thanks for the update. Can you provide the camera parameters for the AR0144 with which you got good results? In my tests I am getting good results indoors, but it fails outdoors (imagine flat grassland); in both cases the camera is looking 35 degrees down from the horizon and ideally should be able to pick up some features. My camera reprojection error came out to be 0.51. If you can share any test bench data and the parameters with which you tested QVIO with the AR0144, it would act as a good reference point for my development.
Thank you!
-
@Aaky , camera calibration parameters are specific to each camera module, so I don't think any other calibration parameters would work better for you than what you have. Does that make sense? Or are you asking about different parameters (please specify)?
I will check to see if we can provide reference data sets running QVIO with the AR0144.
-
@Alex-Kushleyev Alex, I am actually asking for the voxl-camera-server parameters for the AR0144, and of course not the camera calibration parameters. Along with that, as you mentioned, good test datasets recorded by you would also be beneficial, with all the camera server parameters and maybe the qvio-server parameters except the intrinsics and extrinsics. Let me know.
Thank you!
-
Having very similar issues to you, Aaky; QVIO does not like this M0149 camera whatsoever.
-
@Gary-Holmgren , I replied to your new thread; we can discuss your issues there. https://forum.modalai.com/topic/3522/vio-errors-with-new-ar0144-tracking-camera-unstable-flight-crashing
-
@Alex-Kushleyev I wanted some quick help. I am running my drone with the AR0144 in a dusty environment; imagine an under-construction building with lots of dust around, which the UAV's prop wash stirs up even more. VIO resets more often in this situation. I am totally aware that computer vision algorithms won't work if they cannot find any features, but can the camera parameters be tuned to make it work in a dusty environment? I have a very important demo tomorrow with this setup, so any help identifying whether anything can be done with the AR0144 camera would be highly appreciated. My MSV in the camera-server configuration is set to 80 since my environment is a bit darker. Let me know if anything else can be done.
-
@Aaky , as you already mentioned, dusty environments can be very difficult for computer vision. I would say this is more of an open-ended research topic than a parameter tuning task in the current VIO. I imagine you would need a different approach to feature detection and tracking, depending on the particular environment.
If you want to push VIO to the limit, my suggestion would be to do this offline with collected data logs and playback (raw images and IMU data) recorded in the same environment at different exposures. During data collection, you could fly the drone manually or even do hand-held tests without spinning propellers, although spinning propellers would generate even more dust movement.
Without running offline tests and seeing how the algorithm performs with different parameters across many data sets, there is little hope of magically getting it right and improving the performance. There is no shortcut here - it is a complicated problem and needs to be approached in a structured way.
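For example (a rough sketch using hypothetical log directories, not a ModalAI replay tool), you could score each collected log by how many trackable features survive per frame and compare exposure settings that way:

```python
# Sketch: score logged image sets by trackable-feature count so that
# different exposure settings can be compared offline for the dusty
# environment. The directory layout is made up for illustration.
import glob

import cv2
import numpy as np

for log_dir in sorted(glob.glob("logs/exposure_*")):
    counts = []
    for path in sorted(glob.glob(f"{log_dir}/*.png")):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        corners = cv2.goodFeaturesToTrack(
            gray, maxCorners=200, qualityLevel=0.01, minDistance=10)
        counts.append(0 if corners is None else len(corners))
    print(f"{log_dir}: mean trackable features = {np.mean(counts):.1f}")
```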
Also, we have never tested VIO in very dusty conditions (I am not sure how much dust you are talking about, but I am assuming it is significant, since you mentioned a complete lack of features due to dust). Therefore, we don't have a quick suggestion on how to improve performance in these conditions.
Alex
-
@Alex-Kushleyev Thanks Alex for your inputs.
On a separate note, I wanted to discuss one thing. Can we expand the current QVIO architecture to take input from two cameras? Or run two QVIO instances and fuse their odometries into PX4 (maybe when one fails, the other odometry is fused, and vice versa)? I just wanted to check whether any such concept would work on the current QVIO pipeline.
-
@Aaky , please see the following comments:
- The QVIO architecture does not support multiple cameras; there is no way for us to make it work.
- You can certainly set up and run multiple instances of QVIO using the same IMU and two different cameras. These instances would not know anything about each other, which is not an ideal use of the dual-camera setup, but they could be used in conjunction with higher-level fusion software to produce a single estimate (see the sketch after this list). Setting up / running dual (independent) QVIO is the easier part; fusing that information is more complicated. However, we do not have any support for this approach (mainly because it has limited value).
- If you REALLY want to try this approach, I can help you set it up (running two instances of QVIO and publishing the independent QVIO outputs), but I would not be able to provide further guidance on actually fusing the outputs of the two instances. Please note that we have not done this recently, so there could be some complications that I am not aware of. So there is some risk involved here.
- We are currently working on a dual-camera VIO solution (using open vins), but it is not officially released yet. I will check the timeline for that.
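To be concrete about the simplest possible form of that higher-level fusion, a failover selector could look something like the sketch below. The VioSample type and its fields are made up for illustration (this is not our MPA / QVIO API); a real setup would read both QVIO output pipes and publish the selected odometry to PX4.

```python
# Sketch: failover selector that forwards whichever independent QVIO
# instance currently reports a healthy state. All types and fields here
# are hypothetical placeholders.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VioSample:
    ok: bool        # e.g. tracking state is good and no reset in progress
    quality: float  # e.g. tracked-feature count or a covariance proxy
    pose: tuple     # (x, y, z, qw, qx, qy, qz)

def select_odometry(cam0: VioSample, cam1: VioSample) -> Optional[VioSample]:
    """Prefer a healthy instance; if both are healthy, take the higher
    quality one; if both failed, report nothing."""
    candidates = [s for s in (cam0, cam1) if s.ok]
    if not candidates:
        return None
    return max(candidates, key=lambda s: s.quality)

# Example: camera 0 just reset, camera 1 is still tracking.
selected = select_odometry(
    VioSample(ok=False, quality=0.0, pose=(0, 0, 0, 1, 0, 0, 0)),
    VioSample(ok=True, quality=42.0, pose=(1.2, 0.3, -0.1, 1, 0, 0, 0)))
print(selected)
```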
Alex
-
@Alex-Kushleyev Thanks Alex. I anticipate challenges in the fusion part for sure.
Regarding the two-camera VIO using open vins, are there any initial results you can share, such as its performance compared to QVIO? Also, will this be part of future SDK releases, and will we have full control over the codebase as users? Timelines would be highly appreciated.
-
@Aaky , I just checked the status of open vins and we are currently testing it on several drone configurations (using the Starling drone) with single and dual cameras. However, the main issue is that the documentation for using openvins on voxl2 is not ready, so we cannot really support it until the project is formally released. I don't have an ETA for the release - if I find out something, I will definitely let you know.
Alex
-
@Alex-Kushleyev Thanks for providing the status, Alex. Can I get a pre-release version of open vins? Even that might be fruitful. I can figure out the bring-up and execution on VOXL2.
-
@Aaky , I know this would be an exciting feature to get your hands on, but unfortunately we cannot release it until it is ready with documentation. We need to get it to a point where there is enough documentation that it can be set up and tested without additional support; otherwise the support process will become very inefficient.
Our goal is to make the open vins application available with the next major SDK release, but I do not have an ETA yet (we are at SDK 1.3). So it would be SDK 1.4, or whatever the next one will be called.
Alex
-
@Alex-Kushleyev No problem Alex. Let me know once it's released. Thank you!