TY - GEN
T1 - LIDAR and panoramic camera extrinsic calibration approach using a pattern plane
AU - García-Moreno, Angel Iván
AU - Gonzalez-Barbosa, José Joel
AU - Ornelas-Rodriguez, Francisco Javier
AU - Hurtado-Ramos, Juan B.
AU - Primo-Fuentes, Marco Neri
PY - 2013
Y1 - 2013
N2 - Mobile platforms typically combine several data acquisition systems such as lasers, cameras and inertial systems. However, the geometric combination of the different sensors requires their calibration, at least through the definition of the extrinsic parameters, i.e., the transformation matrices that register all sensors in the same coordinate system. Our system generates an accurate association between the platform sensors; the estimated parameters include rotation, translation, focal length, and the world and sensor reference frames. The extrinsic camera parameters are computed by Zhang's method using a pattern composed of white rhombuses and rhombus-shaped holes, and the LIDAR parameters using the results of previous work. Points acquired by the LIDAR are projected into images acquired by the Ladybug cameras. A new calibration pattern, visible to both sensors, is used. Correspondence is obtained between each laser point and its position in the image, so the texture and color of each LIDAR point can be known.
AB - Mobile platforms typically combine several data acquisition systems such as lasers, cameras and inertial systems. However, the geometric combination of the different sensors requires their calibration, at least through the definition of the extrinsic parameters, i.e., the transformation matrices that register all sensors in the same coordinate system. Our system generates an accurate association between the platform sensors; the estimated parameters include rotation, translation, focal length, and the world and sensor reference frames. The extrinsic camera parameters are computed by Zhang's method using a pattern composed of white rhombuses and rhombus-shaped holes, and the LIDAR parameters using the results of previous work. Points acquired by the LIDAR are projected into images acquired by the Ladybug cameras. A new calibration pattern, visible to both sensors, is used. Correspondence is obtained between each laser point and its position in the image, so the texture and color of each LIDAR point can be known.
KW - LIDAR
KW - extrinsic calibration
KW - panoramic camera
KW - sensor calibration
UR - http://www.scopus.com/inward/record.url?scp=84888259153&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-38989-4_11
DO - 10.1007/978-3-642-38989-4_11
M3 - Conference contribution
SN - 9783642389887
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 104
EP - 113
BT - Pattern Recognition - 5th Mexican Conference, MCPR 2013, Proceedings
T2 - 5th Mexican Conference on Pattern Recognition, MCPR 2013
Y2 - 26 June 2013 through 29 June 2013
ER -