PCL/OpenNI tutorial 3: Cloud processing (advanced)
 

Go to root: PhD-3D-Object-Tracking




Most of the techniques seen in the previous tutorial focused on preprocessing, that is, performing certain operations on the cloud to get it ready for further analysis or work. Downsampling, removing outliers, smoothing the surfaces, estimating the normals...

This new tutorial will teach you many interesting things that can be done with point clouds after the preparation step, such as registration, segmentation and model matching.

Registration

Registration is the technique of aligning two point clouds, like pieces of a puzzle. To be precise, the algorithm finds a set of correspondences between them, which would mean that some area of the scene has been captured in both clouds. A rigid transformation is then computed, yielding a 4×4 matrix that encodes a rotation and a translation. Applying this transformation to one of the clouds puts it in place with respect to the other, with the overlapping areas matching up.
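
To make the idea of the transformation matrix more concrete, here is a minimal sketch (not part of the original tutorial; the rotation angle and translation values are arbitrary) that builds such a 4×4 matrix by hand and applies it to a cloud with pcl::transformPointCloud():

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/common/transforms.h>

#include <cmath>

int
main()
{
	// A cloud with a single point, just to have something to transform.
	pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
	pcl::PointCloud<pcl::PointXYZ>::Ptr transformed(new pcl::PointCloud<pcl::PointXYZ>);
	cloud->push_back(pcl::PointXYZ(1.0f, 0.0f, 0.0f));

	// 4x4 homogeneous transformation: the upper-left 3x3 block is the rotation,
	// the last column holds the translation (example values, chosen arbitrarily).
	Eigen::Matrix4f transformation = Eigen::Matrix4f::Identity();
	float theta = M_PI / 4.0f; // Rotate 45 degrees around the Z axis...
	transformation(0, 0) = std::cos(theta);
	transformation(0, 1) = -std::sin(theta);
	transformation(1, 0) = std::sin(theta);
	transformation(1, 1) = std::cos(theta);
	transformation(0, 3) = 0.5f; // ...and translate 0.5 meters along X.

	// Apply the transformation to every point of the cloud.
	pcl::transformPointCloud(*cloud, *transformed, transformation);

	return 0;
}

Registration algorithms estimate exactly this kind of matrix automatically; applying it to the source cloud is what "registering" it means.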

The best results are achieved with clouds that are very similar, so you should try to minimize the transformations between them. Meaning, do not run like crazy with a Kinect in your hands, grabbing frames, and expect PCL to match the clouds. It is better to move the sensor gingerly and at steady intervals. If you use a robotic arm or a rotating tabletop with precise angles, even better.

Registration is a very useful technique because it lets you retrieve full, continuous models of a scene or object. It is also expensive for the CPU; optimizations have been developed that make it possible to match clouds in real time on the GPU, as KinFu does.

Flowchart diagram showing an iteration of the registration algorithm (image from http://pointclouds.org/).

The previous diagram illustrates how the procedure is done. There are mainly two different methods to choose from:

  • ICP registration: ICP stands for Iterative Closest Point. It is an algorithm that will find the best transformation that minimizes the distance from the source point cloud to the target one. The problem is that it does this by associating every point of the source cloud with its closest "twin" in the target cloud, so it can be considered a brute force method. If the clouds are too big, the algorithm will take a long time to finish, so try downsampling them first (see the sketch below).
  • Feature-based registration: the algorithm finds a set of keypoints in each cloud, computes a descriptor for each (we mentioned what descriptors were in the previous tutorial; they are like individual signatures of points), and then performs a search to see if the clouds have keypoints in common. If common keypoints are found, a transformation is computed. For accurate results, enough correspondences must be found. This method is (or should be) faster than the first, because matching is only done for the keypoints, not the whole cloud.

The second method is basically identical to the 3D object recognition problem, so we will not look into it here. Features, descriptors and recognition will be dealt with in the next tutorials.
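
As mentioned in the ICP item above, downsampling both clouds before registering them usually speeds things up a lot. A minimal sketch (not part of the original tutorial; the 1 cm leaf size is just an example value) using the voxel grid filter from the previous tutorial could look like this:

#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>

int
main(int argc, char** argv)
{
	// Objects for storing the point clouds.
	pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
	pcl::PointCloud<pcl::PointXYZ>::Ptr downsampled(new pcl::PointCloud<pcl::PointXYZ>);

	// Read a PCD file from disk.
	if (pcl::io::loadPCDFile<pcl::PointXYZ>(argv[1], *cloud) != 0)
	{
		return -1;
	}

	// Voxel grid filter: keep one point (the centroid) for every 1 cm cube.
	pcl::VoxelGrid<pcl::PointXYZ> filter;
	filter.setInputCloud(cloud);
	filter.setLeafSize(0.01f, 0.01f, 0.01f);
	filter.filter(*downsampled);

	// "downsampled" can now be given to the registration object
	// instead of the full-resolution cloud.

	return 0;
}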


Iterative Closest Point (ICP)

PCL offers several implementations of the ICP algorithm. We will see the general, unoptimized one, but you may want to try one of the optimized variants listed below in the API section. The code for aligning (registering) two point clouds is the following:

#include <pcl/io/pcd_io.h>
#include <pcl/registration/icp.h>

#include <iostream>

int
main(int argc, char** argv)
{
	// Objects for storing the point clouds.
	pcl::PointCloud<pcl::PointXYZ>::Ptr sourceCloud(new pcl::PointCloud<pcl::PointXYZ>);
	pcl::PointCloud<pcl::PointXYZ>::Ptr targetCloud(new pcl::PointCloud<pcl::PointXYZ>);
	pcl::PointCloud<pcl::PointXYZ>::Ptr finalCloud(new pcl::PointCloud<pcl::PointXYZ>);

	// Read two PCD files from disk.
	if (pcl::io::loadPCDFile<pcl::PointXYZ>(argv[1], *sourceCloud) != 0)
	{
		return -1;
	}
	if (pcl::io::loadPCDFile<pcl::PointXYZ>(argv[2], *targetCloud) != 0)
	{
		return -1;
	}

	// ICP object.
	pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> registration;
	registration.setInputSource(sourceCloud);
	registration.setInputTarget(targetCloud);
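	// Optional (not in the original snippet): the convergence behaviour can be
	// tuned before calling align(). The values below are only illustrative.
	// registration.setMaximumIterations(50);             // Hard limit on the number of iterations.
	// registration.setMaxCorrespondenceDistance(0.05);   // Ignore correspondences farther apart than 5 cm.
	// registration.setTransformationEpsilon(1e-8);       // Stop when the transformation barely changes.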
	
	registration.align(*finalCloud);
	if (registration.hasConverged())
	{
		std::cout << "ICP converged." << std::endl
		          << "The score is " << registration.getFitnessScore() << std::endl;
		std::cout << "Transformation matrix:" << std::endl;
		std::cout << registration.getFinalTransformation() << std::endl;
	}
	else std::cout << "ICP did not converge." << std::endl;

	return 0;
}

The cloud saved in "finalCloud" contains the same points as "sourceCloud", but with the transformation found by ICP applied to them, so when it is visualized next to "targetCloud" the two should match properly. If you registered downsampled copies of the clouds, you can apply the matrix returned by getFinalTransformation() to the original, full-resolution source cloud with pcl::transformPointCloud() to obtain the same alignment.

  • Input: Source cloud, Target cloud
  • Output: Registered source cloud, Transformation matrix
  • Tutorial: How to use iterative closest point (http://pointclouds.org/documentation/tutorials/iterative_closest_point.php#iterative-closest-point), How to incrementally register pairs of clouds (http://pointclouds.org/documentation/tutorials/pairwise_incremental_registration.php#pairwise-incremental-registration), How to use Normal Distributions Transform (http://pointclouds.org/documentation/tutorials/normal_distributions_transform.php#normal-distributions-transform)
  • API: pcl::IterativeClosestPoint (http://docs.pointclouds.org/trunk/a00565.html), pcl::NormalDistributionsTransform (http://docs.pointclouds.org/trunk/a00741.html), pcl::IterativeClosestPointWithNormals (http://docs.pointclouds.org/trunk/a00567.html), pcl::IterativeClosestPointNonLinear (http://docs.pointclouds.org/trunk/a00566.html)
  • Download: http://robotica.unileon.es/~victorm/PCL_registration.tar.gz




Go to root: PhD-3D-Object-Tracking

Links to articles:

PCL/OpenNI tutorial 0: The very basics

PCL/OpenNI tutorial 1: Installing and testing

PCL/OpenNI tutorial 2: Cloud processing (basic)

PCL/OpenNI tutorial 3: Cloud processing (advanced)

PCL/OpenNI troubleshooting