open3d.t.pipelines.registration.correspondences_from_features
- open3d.t.pipelines.registration.correspondences_from_features(source_features: open3d.core.Tensor, target_features: open3d.core.Tensor, mutual_filter: bool = False, mutual_consistency_ratio: float = 0.1) → open3d.core.Tensor
Function to query nearest neighbors of source_features in target_features.
- Parameters:
source_features (open3d.core.Tensor) – The source features in shape (N, dim).
target_features (open3d.core.Tensor) – The target features in shape (M, dim).
mutual_filter (bool, optional) – Filter correspondences and return the collection of (i, j) s.t. source_features[i] and target_features[j] are mutually the nearest neighbor. Default is False.
mutual_consistency_ratio (float, optional) – Threshold to decide whether the number of filtered correspondences is sufficient. Only used when mutual_filter is enabled. Default is 0.1.
- Returns:
Tensor of shape (K, 2), where each row is a (source_index, target_index) pair and K is the number of correspondences.
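Conceptually, the mutual filter keeps only pairs (i, j) where j is the nearest target feature to source i and i is, in turn, the nearest source feature to target j. The following plain-NumPy sketch illustrates this idea with brute-force L2 nearest neighbors; it is an illustration only, not the Open3D implementation (which uses an accelerated nearest-neighbor search and also applies mutual_consistency_ratio):

```python
import numpy as np

def correspondences_from_features_np(source_features, target_features,
                                     mutual_filter=False):
    """Conceptual sketch of nearest-neighbor correspondence search.

    source_features: (N, dim) array, target_features: (M, dim) array.
    Returns an (K, 2) array of (source_index, target_index) pairs.
    """
    # (N, M) matrix of squared L2 distances between all feature pairs
    d2 = ((source_features[:, None, :] - target_features[None, :, :]) ** 2).sum(-1)
    nn_in_target = d2.argmin(axis=1)  # best target for each source
    corres = np.stack([np.arange(len(source_features)), nn_in_target], axis=1)
    if mutual_filter:
        nn_in_source = d2.argmin(axis=0)  # best source for each target
        # keep (i, j) only if i is also the nearest source to target j
        keep = nn_in_source[nn_in_target] == corres[:, 0]
        corres = corres[keep]
    return corres
```

With mutual_filter=True, several source points that all map to the same target collapse to at most one surviving pair, which is why the filtered correspondence set is typically much smaller and cleaner.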
Example
This example shows how to compute the features and correspondences for two point clouds:
import open3d as o3d

# read and downsample point clouds
paths = o3d.data.DemoICPPointClouds().paths
voxel_size = 0.01
pcd1 = o3d.t.io.read_point_cloud(paths[0]).voxel_down_sample(voxel_size)
pcd2 = o3d.t.io.read_point_cloud(paths[1]).voxel_down_sample(voxel_size)

# compute FPFH features
pcd1_fpfh = o3d.t.pipelines.registration.compute_fpfh_feature(pcd1, radius=5*voxel_size)
pcd2_fpfh = o3d.t.pipelines.registration.compute_fpfh_feature(pcd2, radius=5*voxel_size)

# compute correspondences
matches = o3d.t.pipelines.registration.correspondences_from_features(pcd1_fpfh, pcd2_fpfh, mutual_filter=True)

# visualize a subset of the correspondences
matches = matches[::500]
pcd2.translate([0, 2, 0])  # offset pcd2 so the line set is visible
lines = o3d.t.geometry.LineSet()
lines.point.positions = o3d.core.Tensor.zeros((matches.num_elements(), 3))
lines.point.positions[0::2] = pcd1.point.positions[matches[:, 0]]
lines.point.positions[1::2] = pcd2.point.positions[matches[:, 1]]
lines.line.indices = o3d.core.Tensor.arange(matches.num_elements()).reshape((-1, 2))
o3d.visualization.draw([pcd1, pcd2, lines])