LiDAR_cal

This section consists of two main parts: instructions for running the calibration, and the theoretical background behind the algorithm. For further details, refer to the following paper.

How to Run

0. Calibration Requirements

A single calibration board with circular patterns is required, and it must satisfy the following two conditions:

  1. Punched-through Circles
    The circles must be physically punched through so that the LiDAR can capture a point cloud with visible circular holes.
    To enhance contrast in RGB images, place a black plate behind the board so the holes appear dark in the image.

  2. Minimum Circle Spacing
    The distance between adjacent circle centers should be greater than three times the circle radius; this ensures robust detection and minimizes clustering ambiguity.

1. Data Collection

Prepare multiple pairs of:

  • RGB images: .png format
  • LiDAR point clouds: .pcd format (required)

Capture the board from a variety of positions and angles to maximize calibration robustness.
DiscoCal is designed to work reliably even when the board appears at arbitrary poses.

2. Modify the config file

  • lidar.yaml: the configuration template for LiDAR-camera calibration. Adjust its parameters to make data processing more reliable for your sensor setup.
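The exact keys depend on the repository version; the fragment below is a hypothetical illustration of the kinds of parameters such a config typically exposes (all key names here are assumptions, not the actual schema):

```yaml
# Hypothetical example; key names are illustrative, not the actual schema.
lidar:
  plane_ransac:
    distance_threshold: 0.02   # [m] max point-to-plane distance for inliers
    max_iterations: 1000
  board:
    circle_radius: 0.08        # [m] radius of the punched-through circles
    center_spacing: 0.25       # [m] at least 3x the radius, per the requirement above
  boundary_filter:
    num_neighbors: 20          # k used for the boundary-point statistics
```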

3. Run
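The exact entry point depends on how the repository is built; the command below is purely illustrative (binary and flag names are assumptions, not the actual CLI):

```bash
# Hypothetical invocation; names are illustrative only.
./lidar_camera_calibration --config lidar.yaml
```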

Algorithm

The overall process consists of four main stages:

1. Board Detection

To localize the calibration board in the LiDAR point cloud, we apply a combination of segmentation and geometric constraints:

  • Plane segmentation: RANSAC-based plane fitting is applied to extract dominant planar surfaces. Among the candidate planes, we select the most planar, appropriately sized one using PCA and simple geometric filters.
  • Perspective projection: The points on the detected board are perspectively projected onto the fitted plane. This projection removes per-point range error, so accurate circular patterns can be recovered.
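The two steps above can be sketched in a few lines. This is a minimal illustration of RANSAC plane fitting followed by projection of each point along its viewing ray onto the fitted plane; it is a sketch of the general technique, not the project's actual implementation:

```python
import random
from math import sqrt

def fit_plane(p1, p2, p3):
    # Plane through three points: normal n = (p2-p1) x (p3-p1), equation n . x = d.
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1],
         u[2]*v[0] - u[0]*v[2],
         u[0]*v[1] - u[1]*v[0]]
    norm = sqrt(sum(c*c for c in n))
    if norm < 1e-12:          # nearly collinear sample: no unique plane
        return None, None
    n = [c / norm for c in n]
    return n, sum(n[i]*p1[i] for i in range(3))

def ransac_plane(points, threshold=0.02, iters=200, seed=0):
    # Keep the plane hypothesis with the most inliers.
    rng = random.Random(seed)
    best_n, best_d, best_inliers = None, None, []
    for _ in range(iters):
        n, d = fit_plane(*rng.sample(points, 3))
        if n is None:
            continue
        inliers = [p for p in points
                   if abs(sum(n[i]*p[i] for i in range(3)) - d) < threshold]
        if len(inliers) > len(best_inliers):
            best_n, best_d, best_inliers = n, d, inliers
    return best_n, best_d, best_inliers

def project_to_plane(p, n, d):
    # Perspective projection: scale the viewing ray (sensor origin -> p) so it
    # lands exactly on the plane n . x = d; this removes per-point range error.
    s = d / sum(n[i]*p[i] for i in range(3))
    return [s * p[i] for i in range(3)]
```

Because the projection moves each point along its own ray, the holes keep their circular shape on the plane instead of being distorted by range noise.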

2. Circle Detection

We introduce a robust filtering-based detection approach that does not rely on the ring channel, enabling wider LiDAR compatibility:

  • Centroid Distance Ratio (CDR): For each point p, the CDR is computed from p and its neighbors p_i; it quantifies how one-sided the local neighborhood is, which is pronounced at hole boundaries.

  • Directional Variance: Using the same notation, the variance of the normalized direction vectors from p to its neighbors is computed; it measures how evenly the neighbors are spread around p.

Points that satisfy these conditions are extracted as boundary points.
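For concreteness, one plausible way to formulate the two statistics (an assumption on our part; see the paper for the exact definitions) is:

```latex
% Plausible formulations (assumptions), with {p_i}_{i=1}^{k} the k nearest neighbors of p:
\mathrm{CDR}(p) =
  \frac{\bigl\lVert p - \tfrac{1}{k}\sum_{i=1}^{k} p_i \bigr\rVert}
       {\tfrac{1}{k}\sum_{i=1}^{k} \lVert p - p_i \rVert}
\qquad
\sigma^2_{\mathrm{dir}}(p) =
  \frac{1}{k}\sum_{i=1}^{k} \lVert u_i - \bar{u} \rVert^2,
\quad u_i = \frac{p_i - p}{\lVert p_i - p \rVert},
\quad \bar{u} = \frac{1}{k}\sum_{i=1}^{k} u_i
```

Under this formulation, interior points have roughly symmetric neighborhoods (CDR near zero, high directional variance), while boundary points have one-sided neighborhoods (large CDR, low directional variance), so thresholding both statistics separates the boundary.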

3. Center Detection

After boundary extraction and projection, circle centers are estimated through:

  • Clustering: Boundary points are spatially grouped into clusters, each representing one circle.
  • Circle Fitting: For each cluster, the ransacCircleFit() function selects the best inlier set, which is then refined by a constrained least-squares 2D fit (circleFitConstrained2D()); the circle center is obtained by solving a small linear system.
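A common way to set up such a linear system is the algebraic (Kasa) circle fit: writing the circle as x^2 + y^2 + a*x + b*y + c = 0 and minimizing the algebraic residual gives 3x3 linear normal equations in (a, b, c), with center (-a/2, -b/2). Whether this is the exact system the project uses is an assumption; the sketch below is not circleFitConstrained2D() itself:

```python
def solve3(M, v):
    # Gaussian elimination with partial pivoting for a 3x3 system M x = v.
    A = [M[i][:] + [v[i]] for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for j in range(col, 4):
                A[r][j] -= f * A[col][j]
    x = [0.0] * 3
    for i in (2, 1, 0):
        x[i] = (A[i][3] - sum(A[i][j] * x[j] for j in range(i + 1, 3))) / A[i][i]
    return x

def circle_fit_linear(points):
    """Algebraic least-squares circle fit (Kasa method).

    Solves the normal equations of
        minimize sum_i (x_i^2 + y_i^2 + a*x_i + b*y_i + c)^2
    and returns the center (-a/2, -b/2) and the radius.
    """
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for x, y in points:
        row = (x, y, 1.0)
        rhs = -(x * x + y * y)
        for i in range(3):
            v[i] += row[i] * rhs
            for j in range(3):
                M[i][j] += row[i] * row[j]
    a, b, c = solve3(M, v)
    cx, cy = -a / 2.0, -b / 2.0
    return (cx, cy), (cx * cx + cy * cy - c) ** 0.5
```

In the pipeline above, a fit like this would run on each cluster's RANSAC inliers to produce one refined center per circle.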

4. 6-DoF Estimation

Using the sorted 3D circle centers and the known board layout, we estimate the 6-DoF pose of the calibration board with respect to the LiDAR frame as follows:

  • A 2D rigid transformation (rotation + translation) is optimized to align the detected circle centers with the known board layout in 2D space.

  • This alignment is formulated as a least-squares problem and solved using Ceres Solver to obtain the optimal rotation and translation.

  • The resulting 2D transform is lifted to 3D using the known board plane, and combined with the plane-to-LiDAR transform to compute the final 6-DoF pose of the board in the LiDAR frame.
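The pipeline solves the alignment with Ceres; for illustration, the same least-squares problem also has a closed-form solution (2D Procrustes alignment). The sketch below uses that closed form in place of Ceres:

```python
from math import atan2, cos, sin

def align_2d(src, dst):
    """Closed-form least-squares 2D rigid alignment (rotation + translation).

    Finds R(theta), t minimizing sum_i || R p_i + t - q_i ||^2 for matched
    point pairs (src[i], dst[i]); this is the 2D Procrustes solution of the
    same objective a solver like Ceres would minimize iteratively.
    """
    n = len(src)
    sx = sum(p[0] for p in src) / n
    sy = sum(p[1] for p in src) / n
    dx = sum(q[0] for q in dst) / n
    dy = sum(q[1] for q in dst) / n
    # Cross-covariance terms of the centered point sets.
    sxx = sxy = syx = syy = 0.0
    for (px, py), (qx, qy) in zip(src, dst):
        px -= sx; py -= sy; qx -= dx; qy -= dy
        sxx += px * qx; sxy += px * qy
        syx += py * qx; syy += py * qy
    # Optimal rotation angle for the 2D Procrustes problem.
    theta = atan2(sxy - syx, sxx + syy)
    c, s = cos(theta), sin(theta)
    # Translation maps the rotated source centroid onto the target centroid.
    return theta, (dx - (c * sx - s * sy), dy - (s * sx + c * sy))
```

Lifting the recovered (theta, t) to 3D then proceeds as described above: embed it in the board plane and compose with the plane-to-LiDAR transform.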