
T265 External Calibration


External calibration of the T265 estimates the transform between the T265 and the Pixhawk FCU.

Estimating Projection Matrix

Before introducing how to estimate the transform between the camera and the UAV, let us first define some mathematical notation for conciseness.

p_cc: camera position in camera coordinates.
T_rc: transform from camera coordinates to UAV coordinates. (UAV coordinates align with FCU coordinates.)
p_rc: camera position in UAV coordinates.
p_rr: UAV position in UAV coordinates.

We can get the camera pose in UAV coordinates,

Eq.(1): p_rc = T_rc*p_cc.

T_rc = [R|t] is the transform to be estimated. If we have a set of corresponding samples {p_cc(t)} and {p_rc(t)}, T_rc can be estimated from the above linear system. However, {p_rc(t)} is unknown.
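For reference, Eq.(1) written out with T_rc = [R | t] shows why the system is linear in the unknowns (this is just Eq.(1) restated in affine and homogeneous form):

```latex
% Eq.(1) with T_rc = [R | t]:
p_{rc} = R\,p_{cc} + t,
\qquad
\begin{bmatrix} p_{rc} \\ 1 \end{bmatrix}
=
\begin{bmatrix} R & t \\ 0^{\top} & 1 \end{bmatrix}
\begin{bmatrix} p_{cc} \\ 1 \end{bmatrix}
```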

1. Estimating Rotation Matrix

To solve this problem, we assume the UAV only translates during calibration. Since the camera is rigidly mounted on the UAV, the relative position between p_rr and p_rc is constant (as long as no rotational motion is involved).

Eq.(2): p_rc - p_rr = p_const
or
Eq.(3): Delta(p_rc) = Delta(p_rr)

Substituting Eq.(3) into Eq.(1) and taking differences,

Eq.(4): Delta(p_rr) = Delta(p_rc) = R*Delta(p_cc),

where R is the rotation matrix of T_rc; the translation t cancels when taking differences.

From a set of matched Delta(p_rr) and Delta(p_cc) pairs, which can be sampled separately from the FCU odometry and the VIO odometry, we can estimate only the rotation matrix in T_rc using the above linear system, shown as follows.
[Figure: linear system for estimating the rotation matrix]
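Concretely (a sketch reconstructed from Eq.(4)), stacking n matched delta pairs column-wise yields a linear system in R alone:

```latex
% Stacked form of Eq.(4) over n sampled delta pairs; t has already
% cancelled, so only the rotation R is observable.
\begin{bmatrix}
\Delta p_{rr}^{(1)} & \Delta p_{rr}^{(2)} & \cdots & \Delta p_{rr}^{(n)}
\end{bmatrix}
=
R
\begin{bmatrix}
\Delta p_{cc}^{(1)} & \Delta p_{cc}^{(2)} & \cdots & \Delta p_{cc}^{(n)}
\end{bmatrix}
```

This can be solved for R in the least-squares sense, for example with the SVD-based projection described in the next section.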

Note that only the rotation matrix can be estimated when the motion is purely translational; the translation part of the transform cannot be recovered from translation-only data. The reason is explained in the following subsection.

2. Estimating Translation Vector

Let us denote the UAV pose in UAV coordinates as P_rr = [R_rr | p_rr]. The corresponding rotation-only transform is then T_R = [R_rr | 0] in SE(3). When rotation is taken into account, Eq.(2) extends to any type of motion,

Eq.(5): inv(P_rr) * p_rc - inv(P_rr) * p_rr = p_const

Note that p_const is the position of the camera w.r.t. the UAV, so Eq.(5) can also be written in transformation form,

Eq.(6): inv(P_rr) * p_rc = [I | t] * inv(P_rr) * p_rr,

where [I | t] denotes the transform with identity rotation and translation t = p_const.
Substituting Eq.(6) into Eq.(1),

p_rc = P_rr * [I | t] * inv(P_rr) * p_rr = [R | t] * p_cc

[Figure: linear system for estimating the translation vector]
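To see why, expand the system (my derivation, assuming the t in T_rc = [R | t] equals p_const): since inv(P_rr) * p_rr = 0, the middle expression reduces to R_rr*t + p_rr, and equating it with Eq.(1) gives a linear system in t:

```latex
% p_rc = P_rr [I|t] inv(P_rr) p_rr = R_rr t + p_rr, and p_rc = R p_cc + t,
% so equating the two:
R_{rr}\,t + p_{rr} = R\,p_{cc} + t
\quad\Longrightarrow\quad
(R_{rr} - I)\,t = R\,p_{cc} - p_{rr}
```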

From the above system, we can see there is no solution for t when P_rr involves no rotation (its rotation matrix is the identity). In practice, if the rotation matrix in P_rr is close to the identity due to noise, we discard that data.

3. RANSAC

In practice, there may be large noise in p_rr(t) and p_cc(t) due to inaccurate estimation by the FCU and the VIO. RANSAC is used to reduce the effect of outliers and to find estimates of the rotation matrix and the translation vector. We denote the resulting projection matrix as P, which is composed of the estimated rotation matrix and translation vector. We then need to find the closest transform T_rc (in SE(3)) to the projection matrix P, which is introduced in the next section.

To estimate the rotation matrix and the translation vector, the minimal number of samples depends on the number of unknown parameters in the above linear systems. However, since we are using RANSAC, many more sample points are collected and used. A sketch of the procedure is given below.
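A minimal numpy sketch of RANSAC over the delta pairs for the rotation (an assumed implementation, not the exact code used on this page; the minimal sample size of 3, the iteration count, and the inlier threshold `tol` are illustrative choices):

```python
import numpy as np

def nearest_rotation(M):
    """Project a 3x3 matrix onto SO(3) via SVD, enforcing det(R) = +1."""
    U, _, Vt = np.linalg.svd(M)
    return U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt

def ransac_rotation(d_prr, d_pcc, iters=500, tol=0.02, seed=0):
    """Estimate R such that d_prr ~= R @ d_pcc from noisy 3xN delta pairs.

    d_prr, d_pcc: 3xN arrays of matched position deltas (FCU and VIO).
    tol: inlier residual threshold in meters (tune for your data).
    """
    n = d_prr.shape[1]
    rng = np.random.default_rng(seed)
    best_R, best_count = None, -1
    for _ in range(iters):
        idx = rng.choice(n, size=3, replace=False)             # minimal sample
        R = nearest_rotation(d_prr[:, idx] @ d_pcc[:, idx].T)  # Procrustes fit
        residuals = np.linalg.norm(d_prr - R @ d_pcc, axis=0)
        inliers = residuals < tol
        if inliers.sum() > best_count:
            best_count = inliers.sum()
            # Refit on all inliers for the final estimate.
            best_R = nearest_rotation(d_prr[:, inliers] @ d_pcc[:, inliers].T)
    return best_R
```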

Estimating Transform from Projection

There are several ways to estimate a transform (in SE(3)) from a projection matrix.

1. Rotation matrix (SO(3)) SVD

This method is straightforward: compute [U, S, V] = SVD(M), where M is the 3x3 rotation block of the projection matrix. The rotation matrix is then R = U*V^T, and the translation vector is taken directly as [t_1, t_2, t_3] from the projection matrix. When using this method, we need to confirm that det(R) = 1 instead of det(R) = -1.
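A minimal sketch of this projection (assuming `P` is the 3x4 projection matrix estimated above):

```python
import numpy as np

def project_to_so3(M):
    """Nearest rotation to a 3x3 matrix: R = U V^T from its SVD.

    If det(U V^T) = -1 the result is a reflection, so flip the sign
    of the last column of U to enforce det(R) = +1.
    """
    U, _, Vt = np.linalg.svd(M)
    if np.linalg.det(U @ Vt) < 0:
        U[:, -1] *= -1
    return U @ Vt

# P is the 3x4 projection matrix from RANSAC: [rotation block | t].
# R = project_to_so3(P[:, :3]); t = P[:, 3]
```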

2. Nearest orthogonal matrix

http://people.csail.mit.edu/bkph/articles/Nearest_Orthonormal_Matrix.pdf
This method also requires checking the sign of det(R).
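A sketch of that construction (the linked note derives the nearest orthonormal matrix to M as M*(M^T M)^(-1/2); `scipy` is used here for the matrix square root):

```python
import numpy as np
from scipy.linalg import sqrtm

def nearest_orthogonal(M):
    """Nearest orthonormal matrix to M: M (M^T M)^(-1/2).

    sqrtm may return a complex array with negligible imaginary parts,
    so take the real part. The sign of det(R) must still be checked
    afterwards, as noted above.
    """
    R = M @ np.linalg.inv(sqrtm(M.T @ M))
    return np.real(R)
```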

3. Frobenius-norm sense with SVD

[Figure: Frobenius-norm projection onto SO(3) via SVD]
More details can be found in Pose from known 3D points by Trym Vegard Haavardsholm (see the Appendix).
Note: there is an error in the slides. From the SVD, U should be 3x3 and S should be 3x2, so [r_1, r_2] = U*S'*V^T, where S' is a 3x2 identity matrix. The sign of c_3 is determined by requiring det(R) = 1. Because we only estimate the rotation matrix and then use it to find the translation vector, we do not need to find the scalar lambda to scale t.

4. Iterative rotation matrix and translation

We first use method 1 or method 2 to find the rotation matrix, then fix the rotation matrix and run RANSAC to estimate the translation vector. In the next iteration, we fix the new translation vector and run RANSAC again to re-estimate the rotation matrix, repeating this process. A sketch of one alternation step is given below.
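A compact sketch of one alternation step (assumptions: the stacked system (R_rr - I)*t = R*p_cc - p_rr derived in the translation section, with plain least squares standing in for the RANSAC wrappers):

```python
import numpy as np

def nearest_rotation(M):
    """Project a 3x3 matrix onto SO(3) via SVD with det(R) = +1."""
    U, _, Vt = np.linalg.svd(M)
    return U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt

def alternate(R, R_rr, p_cc, p_rr, iters=5):
    """Alternately refine t and R.

    R: initial 3x3 rotation (from method 1 or 2).
    R_rr: (n, 3, 3) FCU rotations; p_cc, p_rr: (n, 3) positions.
    """
    for _ in range(iters):
        # Fix R, solve (R_rr_i - I) t = R p_cc_i - p_rr_i in least squares.
        A = (R_rr - np.eye(3)).reshape(-1, 3)
        b = (p_cc @ R.T - p_rr).reshape(-1)
        t, *_ = np.linalg.lstsq(A, b, rcond=None)
        # Fix t, refit R via Procrustes: R p_cc_i ~= p_rr_i + (R_rr_i - I) t.
        targets = p_rr + (R_rr - np.eye(3)) @ t
        R = nearest_rotation(targets.T @ p_cc)
    return R, t
```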

Sampling Strategy

We introduce a sampling strategy that provides more accurate and less noisy data for {Delta(p_rr)_i} and {Delta(p_cc)_i}. Since VIO pose estimation is good locally, Delta(p_cc) should be locally accurate and reliable. Thus, we sample position points from the two odometry streams and then obtain {Delta(p_rr)_i} and {Delta(p_cc)_i} from the differences of successive sampled points. The position points are sampled using a random Euclidean distance threshold, d ~ U(0, m), where m is the largest allowed distance between two successive sample points. A position point is added to the sample data once its distance to the previous sampled point exceeds the current threshold d, after which a new threshold is drawn. A minimal sketch of this sampling rule follows.
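In this sketch the function name and defaults are illustrative; both odometry streams should be sampled at matched timestamps so that the deltas correspond:

```python
import numpy as np

def sample_indices(positions, m=0.5, seed=0):
    """Subsample an odometry track by random distance thresholds.

    positions: (n, 3) positions in time order.
    m: upper bound of the uniform threshold, d ~ U(0, m), in meters.
    Returns indices of kept points; differences of successive kept
    points give the {Delta(p)} samples used for calibration.
    """
    rng = np.random.default_rng(seed)
    kept = [0]
    d = rng.uniform(0.0, m)                  # threshold for the next point
    for i in range(1, len(positions)):
        if np.linalg.norm(positions[i] - positions[kept[-1]]) > d:
            kept.append(i)
            d = rng.uniform(0.0, m)          # redraw after each accepted point
    return np.asarray(kept)
```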

Simulation

Given T_rc and the camera odometry p_cc, how can we get p_rr? This is useful for simulating one odometry stream when only the other is available. Treating the odometry samples as poses, p_rr = T_rc * p_cc * T_cr, where T_cr = inv(T_rc).
[Figure: camera-UAV transform]
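Reading the formula as a conjugation of 4x4 odometry poses, a minimal sketch (helper names are illustrative):

```python
import numpy as np

def se3(R, t):
    """Assemble a 4x4 homogeneous transform from R (3x3) and t (3,),
    e.g., to build T_rc from the estimated rotation and translation."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def simulate_uav_pose(T_rc, P_cc):
    """Simulated UAV odometry pose from a camera odometry pose:
    P_rr = T_rc * P_cc * T_cr, with T_cr = inv(T_rc)."""
    return T_rc @ P_cc @ np.linalg.inv(T_rc)
```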

Appendix

PX4 and Realsense T265 coordinate systems

PX4 coordinate system
ENU (X East, Y North, Z Up) is used here.

Realsense T265 coordinate system
[Figure: Realsense T265 coordinate system]

Pose from known 3D points by Trym Vegard Haavardsholm