In this paper, we compute the precision with which 3D points, camera orientation, position, and calibration are estimated for a laser rangefinder and a multi-camera system. Both sensors are used to digitize urban environments. Errors in the localization of image features introduce errors in the reconstruction. Some algorithms are intrinsically numerically unstable, or become unstable for particular configurations of points and/or cameras. A practical methodology is presented to predict the error propagation inside the calibration process between the two sensors. Performance charts of the error propagation in the intrinsic camera parameters, and of the relationship between the laser measurements and the noise of both sensors, were computed using both simulations and an analytical analysis. Results for the calibration, the camera-laser projection error, and the uncertainty analysis are presented for data collected by a mobile terrestrial platform.
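The core idea of predicting error propagation can be illustrated with a minimal sketch of first-order (Jacobian-based) covariance propagation through a pinhole projection. This is a generic illustration, not the paper's actual method; the focal length, principal point, point coordinates, and noise level below are assumed values chosen for the example.

```python
import numpy as np

def project(point, f, c):
    # Pinhole projection: u = f*X/Z + cx, v = f*Y/Z + cy
    X, Y, Z = point
    return np.array([f * X / Z + c[0], f * Y / Z + c[1]])

def numerical_jacobian(fn, x, eps=1e-6):
    # Forward-difference Jacobian of fn at x
    x = np.asarray(x, dtype=float)
    y0 = fn(x)
    J = np.zeros((y0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (fn(x + dx) - y0) / eps
    return J

# Hypothetical 3D point with isotropic position uncertainty
p = np.array([1.0, 0.5, 4.0])
Sigma_p = (0.01 ** 2) * np.eye(3)   # 1 cm std. dev. per axis (assumed)

f, c = 800.0, (320.0, 240.0)        # assumed intrinsics
J = numerical_jacobian(lambda q: project(q, f, c), p)

# First-order propagation of the point covariance to pixel space:
# Sigma_pix = J * Sigma_p * J^T
Sigma_pix = J @ Sigma_p @ J.T
print(Sigma_pix)
```

The resulting 2x2 covariance gives the predicted pixel-level uncertainty induced by the 3D point noise; the same pattern extends to the camera and laser calibration parameters.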