Fact Checking my Understanding of the Extrinsic Configuration
-
Hello,
Setup: VOXL 2, SDK 1.4.1, OpenVINS, dual AR0144 cameras
One of my setups will require the VOXL 2 to be mounted upside down, with the MIPI connectors for the cameras facing upwards. I've been preparing the extrinsics file for this setup and wanted to confirm my understanding of how to properly configure it for the upside-down mounting.
My understanding is this: you must configure PX4 to know the flight controller is upside down, which is simple enough. With that done, I then go into the extrinsics file and begin setting the rotation and position offsets.
My question is this: for the "parent": "body", "child": "imu1" entry, do you set a rotation value of 180 degrees to let the system know the IMU is upside down? Then, do you set the camera rotation values to also include the board's rotation?
For example, starting from the default extrinsics for a forward-facing camera on a right-side-up VOXL 2, what would the corresponding values be if the VOXL 2 were upside down? Roughly, I'm asking whether the change amounts to something like the sketch below (rotations only, offsets omitted).
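A rough sketch of what I mean, in Python, with rotations in degrees as [roll, pitch, yaw]; the "tracking" camera name and the omitted translation offsets are placeholders, not my actual config:

```python
# Illustrative only: the rotation values I believe apply, translations omitted.
right_side_up = {
    ("body", "imu1"):     [0, 0, 0],     # IMU aligned with the body frame
    ("imu1", "tracking"): [0, 90, 90],   # default forward-facing tracking camera
}
upside_down = {
    ("body", "imu1"):     [180, 0, 0],   # my guess: board rolled 180 degrees about its X axis
    ("imu1", "tracking"): [180, 90, 90], # my guess: same camera, expressed in the flipped IMU frame
}
```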
-
@Jeremy-Frederick, just to clarify: there are two IMUs on VOXL 2. One is connected to the DSP and used by PX4, and the second is connected to the Application Processor (CPU). The data flow from the two IMUs normally does not cross, that is to say, the PX4 IMU is used by PX4 and the "apps proc" IMU is used for things like VIO, etc.
PX4 does not use the extrinsics.conf file, since PX4 has its own configuration for IMU rotation; I believe you already know that. Now, regarding the second IMU: if you flip your VOXL 2 board upside down (let's say you roll it 180 degrees around its X axis), then your body->imu0 transform should reflect that: [180, 0, 0].
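As a quick illustration of what that [180, 0, 0] rotation means geometrically, here is a minimal sketch (assuming numpy and scipy are available; it shows only the rotation itself, not the extrinsics.conf syntax):

```python
# A 180-degree roll about X, i.e. the [180, 0, 0] body->imu rotation described above.
import numpy as np
from scipy.spatial.transform import Rotation as R

body_R_imu = R.from_euler('x', 180, degrees=True).as_matrix()

# Express each IMU axis in the body frame.
print(np.rint(body_R_imu @ [1, 0, 0]).astype(int))  # [1, 0, 0]  -> IMU X still points along body X (forward)
print(np.rint(body_R_imu @ [0, 1, 0]).astype(int))  # [0, -1, 0] -> IMU Y now points along body -Y
print(np.rint(body_R_imu @ [0, 0, 1]).astype(int))  # [0, 0, -1] -> IMU Z now points along body -Z (board flipped)
```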
When you specify the transforms between the IMU and the cameras, you need to do that in the IMU frame. It may be easier to flip the drone upside down so that the VOXL 2 is in its nominal orientation, and then work out the transforms to the cameras by following the rotations described in our docs (see the link below).
The transform tree typically looks like this:

        Body
          |
         IMU
          |
   +------+------+
   |      |      |
 cam0   cam1   cam2
If your tracking camera is initially in the forward-facing orientation while the VOXL 2 is "right side up", then you have the [0, 90, 90] rotation with respect to the IMU. If you roll the VOXL 2 board 180 degrees around its X axis (the X axis still points forward) but the camera remains in the same orientation, then your imu->camera transform becomes [180, 90, 90]. You could verify this by using the red-green-blue marker helper tool as shown in the video.
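If you want a quick numerical version of that axis check, here is a minimal sketch (assuming numpy and scipy, plus the usual conventions that the body frame is FRD with X forward, Y right, Z down, and that the camera frame has Z out of the lens, X right, Y down; those conventions are my assumption here, not something stated above):

```python
# Check where each camera axis points in the body frame for the default forward-facing camera.
import numpy as np
from scipy.spatial.transform import Rotation as R

def rot(axis, deg):
    return R.from_euler(axis, deg, degrees=True).as_matrix()

# Default case: body->imu is identity, so body_R_cam is just the camera rotation Ry(90) * Rz(90).
body_R_cam = rot('y', 90) @ rot('z', 90)

print(np.rint(body_R_cam @ [0, 0, 1]).astype(int))  # [1, 0, 0] -> camera optical axis points body-forward
print(np.rint(body_R_cam @ [1, 0, 0]).astype(int))  # [0, 1, 0] -> camera X (image right) points body-right
print(np.rint(body_R_cam @ [0, 1, 0]).astype(int))  # [0, 0, 1] -> camera Y (image down) points body-down
```

Each camera axis, expressed in the body frame, should point where you physically expect it to on the vehicle.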
One way to understand this a bit better, and to do a sanity check: when rotations are composed, they can be multiplied together as rotation matrices. When you compute the rotation from the drone's "body" to the tracking camera, it looks like

body_R_cam = body_R_imu * imu_R_cam

which in the default case is equal to

body_R_cam = [Ry(90) * Rz(90)]

because body_R_imu is the identity matrix (no rotation). For the use case where the VOXL 2 is rotated upside down around the X axis, both the body_R_imu and imu_R_cam transforms have changed, but they changed in complementary (opposite) ways, and it just so happens that a 180-degree rotation is the same as a -180-degree rotation. So now we have

body_R_cam = [Rx(180)] * [Rx(-180) * Ry(90) * Rz(90)]

where the first Rx(180) is body_R_imu and Rx(-180) is the part of the new imu_R_cam transform. As a result, the 180 and -180 degree rotations cancel out and you end up with the original body_R_cam, because you have not actually rotated the camera with respect to the body. Long explanation, but worth the sanity check. More details here: https://docs.modalai.com/configure-extrinsics/
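Here is that sanity check done numerically, as a minimal sketch (assuming numpy and scipy; it verifies the matrix algebra written above, not the exact RPY ordering convention used by extrinsics.conf):

```python
# Numerical version of the composition sanity check described above.
import numpy as np
from scipy.spatial.transform import Rotation as R

def rot(axis, deg):
    return R.from_euler(axis, deg, degrees=True).as_matrix()

# Default: body->imu is identity, imu->cam is Ry(90) * Rz(90).
body_R_cam_default = np.eye(3) @ rot('y', 90) @ rot('z', 90)

# Upside down: body->imu is Rx(180), imu->cam picks up an extra Rx(-180).
body_R_cam_flipped = rot('x', 180) @ (rot('x', -180) @ rot('y', 90) @ rot('z', 90))

# The Rx(180) and Rx(-180) cancel, so the body->camera rotation is unchanged.
print(np.allclose(body_R_cam_default, body_R_cam_flipped))  # True
```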
Now, regarding whether OpenVINS actually uses the upside-down IMU-to-body rotation correctly, I will need to double-check with my colleagues. @Cliff-Wong, can you confirm?