TY - JOUR
T1 - Error propagation and uncertainty analysis between 3D laser scanner and camera
AU - García-Moreno, Angel Iván
AU - Hernandez-García, Denis Eduardo
AU - Gonzalez-Barbosa, José Joel
AU - Ramírez-Pedraza, Alfonso
AU - Hurtado-Ramos, Juan B.
AU - Ornelas-Rodriguez, Francisco Javier
N1 - Funding Information:
The authors wish to acknowledge the financial support for this work from the Consejo Nacional de Ciencia y Tecnología (CONACYT) through project SEP-2005-O1-51004/25293 and from the Instituto Politécnico Nacional through project SIP-20130165. This work was carried out at the Centro de Investigación en Ciencia Aplicada y Tecnología Avanzada (CICATA), Querétaro, México. The authors acknowledge the government agency CONACYT for financial support through scholarship 237751.
PY - 2014/6
Y1 - 2014/6
N2 - In this work we present an in-situ method to compute the calibration of two sensors, a LIDAR (Light Detection and Ranging) and a spherical camera. Both sensors are used in urban environment reconstruction tasks. In this scenario the speed at which the various sensors acquire and merge information is very important; however, reconstruction accuracy, which depends on sensor calibration, is also of high relevance. Here, a new calibration pattern, visible to both sensors, is proposed. By this means, the correspondence between each laser point and its position in the camera image is obtained, so that the texture and color of each LIDAR point can be known. Experimental results for the calibration and uncertainty analysis are presented for data collected by a platform integrating a LIDAR and a spherical camera.
AB - In this work we present an in-situ method to compute the calibration of two sensors, a LIDAR (Light Detection and Ranging) and a spherical camera. Both sensors are used in urban environment reconstruction tasks. In this scenario the speed at which the various sensors acquire and merge information is very important; however, reconstruction accuracy, which depends on sensor calibration, is also of high relevance. Here, a new calibration pattern, visible to both sensors, is proposed. By this means, the correspondence between each laser point and its position in the camera image is obtained, so that the texture and color of each LIDAR point can be known. Experimental results for the calibration and uncertainty analysis are presented for data collected by a platform integrating a LIDAR and a spherical camera.
KW - Camera calibration
KW - Effect of noise
KW - LIDAR calibration
KW - Sensors calibration
KW - Uncertainty analysis
UR - http://www.scopus.com/inward/record.url?scp=84899546462&partnerID=8YFLogxK
U2 - 10.1016/j.robot.2014.02.004
DO - 10.1016/j.robot.2014.02.004
M3 - Article
SN - 0921-8890
VL - 62
SP - 782
EP - 793
JO - Robotics and Autonomous Systems
JF - Robotics and Autonomous Systems
IS - 6
ER -