TY - GEN
T1 - Mobile remote sensing platform
T2 - 2014 11th International Joint Conference on Computer Science and Software Engineering: "Human Factors in Computer Science and Software Engineering" - e-Science and High Performance Computing: eHPC, JCSSE 2014
AU - Garcia-Moreno, Angel Ivan
AU - Gonzalez-Barbosa, Jose Joel
AU - Hurtado-Ramos, Juan B.
AU - Ornelas-Rodriguez, Francisco Javier
PY - 2014
Y1 - 2014
N2 - This paper presents a method to estimate the uncertainty in the calibration of two sensors, a laser rangefinder and a multi-camera system. Both sensors were used in urban environment reconstruction tasks. A new calibration pattern, visible to both sensors, is proposed. By this means, the correspondence between each laser point and its position in the camera image is obtained, so that the texture and color of each LIDAR point can be known. Moreover, this allows systematic errors in individual sensors to be minimized and, when multiple sensors are used, minimizes the systematic contradiction between them to enable reliable multisensor data fusion. A practical methodology is presented to predict the uncertainty within the calibration process between both sensors. Statistics of the behavior of the camera calibration parameters and of the relationship between the camera, the LIDAR, and the noise of both sensors were calculated using simulations, an analytical analysis, and experiments. Results for the calibration and uncertainty analysis are presented for data collected by the platform integrating a LIDAR and the panoramic camera.
UR - http://www.scopus.com/inward/record.url?scp=84904569360&partnerID=8YFLogxK
U2 - 10.1109/JCSSE.2014.6841843
DO - 10.1109/JCSSE.2014.6841843
M3 - Conference contribution
AN - SCOPUS:84904569360
SN - 9781479958221
T3 - 2014 11th Int. Joint Conf. on Computer Science and Software Engineering: "Human Factors in Computer Science and Software Engineering" - e-Science and High Performance Computing: eHPC, JCSSE 2014
SP - 64
EP - 69
BT - 2014 11th Int. Joint Conference on Computer Science and Software Engineering
PB - IEEE Computer Society
Y2 - 14 May 2014 through 16 May 2014
ER -