View/state planning for three-dimensional object reconstruction under uncertainty

J. Irving Vasquez-Gomez, L. Enrique Sucar, Rafael Murrieta-Cid

Research output: Contribution to journal › Article

12 Citations (Scopus)

Abstract

We propose a holistic approach for three-dimensional (3D) object reconstruction with a mobile manipulator robot equipped with an eye-in-hand sensor, considering both the plan to reach the desired view/state and the uncertainty in observations and controls. This is one of the first methods that determines the next best view/state in the state space, following a methodology in which a set of candidate views/states is generated directly in the state space, and only a subset of these views is then kept by filtering the original set. It also determines the controls that yield a collision-free trajectory to reach a state using rapidly-exploring random trees. To decrease the processing time we propose an efficient evaluation strategy based on filters, and a 3D visibility calculation with hierarchical ray tracing. The next best view/state is selected based on the expected utility, generating samples in the control space from an error distribution that follows the dynamics of the robot. This makes the method robust to positioning error, significantly reducing the collision rate and increasing the coverage, as shown in the experiments. Several experiments in simulation and with a real mobile manipulator robot with 8 degrees of freedom show that the proposed approach is an effective and fast way for a mobile manipulator to build 3D models of unknown objects. To our knowledge, this is one of the first works that demonstrates the reconstruction of complex objects with a real mobile manipulator considering uncertainty in the controls.
Original language: American English
Pages (from-to): 89-109
Number of pages: 21
Journal: Autonomous Robots
DOI: 10.1007/s10514-015-9531-3
State: Published - 1 Jan 2017
Externally published: Yes

Fingerprint

Manipulators
Planning
Robots
Ray tracing
End effectors
Visibility
Experiments
Trajectories
Uncertainty
Sensors
Processing
Industry

Cite this

@article{9304a147bd9142f694238c1a93afa37b,
title = "View/state planning for three-dimensional object reconstruction under uncertainty",
abstract = "{\circledC} 2015, Springer Science+Business Media New York. We propose a holistic approach for three-dimensional (3D) object reconstruction with a mobile manipulator robot with an eye-in-hand sensor; considering the plan to reach the desired view/state, and the uncertainty in both observations and controls. This is one of the first methods that determines the next best view/state in the state space, following a methodology in which a set of candidate views/states is directly generated in the state space, and later only a subset of these views is kept by filtering the original set. It also determines the controls that yield a collision free trajectory to reach a state using rapidly-exploring random trees. To decrease the processing time we propose an efficient evaluation strategy based on filters, and a 3D visibility calculation with hierarchical ray tracing. The next best view/state is selected based on the expected utility, generating samples in the control space based on an error distribution according to the dynamics of the robot. This makes the method robust to positioning error, significantly reducing the collision rate and increasing the coverage, as shown in the experiments. Several experiments in simulation and with a real mobile manipulator robot with 8 degrees of freedom show that the proposed method provides an effective and fast method for a mobile manipulator to build 3D models of unknown objects. To our knowledge, this is one of the first works that demonstrates the reconstruction of complex objects with a real mobile manipulator considering uncertainty in the controls.",
author = "Vasquez-Gomez, {J. Irving} and Sucar, {L. Enrique} and Rafael Murrieta-Cid",
year = "2017",
month = jan,
day = "1",
doi = "10.1007/s10514-015-9531-3",
language = "American English",
pages = "89--109",
journal = "Autonomous Robots",
issn = "0929-5593",
publisher = "Springer Netherlands",
}

View/state planning for three-dimensional object reconstruction under uncertainty. / Vasquez-Gomez, J. Irving; Sucar, L. Enrique; Murrieta-Cid, Rafael.

In: Autonomous Robots, 01.01.2017, p. 89-109.

Research output: Contribution to journal › Article

TY  - JOUR
T1  - View/state planning for three-dimensional object reconstruction under uncertainty
AU  - Vasquez-Gomez, J. Irving
AU  - Sucar, L. Enrique
AU  - Murrieta-Cid, Rafael
PY  - 2017/1/1
Y1  - 2017/1/1
N2  - © 2015, Springer Science+Business Media New York. We propose a holistic approach for three-dimensional (3D) object reconstruction with a mobile manipulator robot with an eye-in-hand sensor; considering the plan to reach the desired view/state, and the uncertainty in both observations and controls. This is one of the first methods that determines the next best view/state in the state space, following a methodology in which a set of candidate views/states is directly generated in the state space, and later only a subset of these views is kept by filtering the original set. It also determines the controls that yield a collision free trajectory to reach a state using rapidly-exploring random trees. To decrease the processing time we propose an efficient evaluation strategy based on filters, and a 3D visibility calculation with hierarchical ray tracing. The next best view/state is selected based on the expected utility, generating samples in the control space based on an error distribution according to the dynamics of the robot. This makes the method robust to positioning error, significantly reducing the collision rate and increasing the coverage, as shown in the experiments. Several experiments in simulation and with a real mobile manipulator robot with 8 degrees of freedom show that the proposed method provides an effective and fast method for a mobile manipulator to build 3D models of unknown objects. To our knowledge, this is one of the first works that demonstrates the reconstruction of complex objects with a real mobile manipulator considering uncertainty in the controls.
AB  - © 2015, Springer Science+Business Media New York. We propose a holistic approach for three-dimensional (3D) object reconstruction with a mobile manipulator robot with an eye-in-hand sensor; considering the plan to reach the desired view/state, and the uncertainty in both observations and controls. This is one of the first methods that determines the next best view/state in the state space, following a methodology in which a set of candidate views/states is directly generated in the state space, and later only a subset of these views is kept by filtering the original set. It also determines the controls that yield a collision free trajectory to reach a state using rapidly-exploring random trees. To decrease the processing time we propose an efficient evaluation strategy based on filters, and a 3D visibility calculation with hierarchical ray tracing. The next best view/state is selected based on the expected utility, generating samples in the control space based on an error distribution according to the dynamics of the robot. This makes the method robust to positioning error, significantly reducing the collision rate and increasing the coverage, as shown in the experiments. Several experiments in simulation and with a real mobile manipulator robot with 8 degrees of freedom show that the proposed method provides an effective and fast method for a mobile manipulator to build 3D models of unknown objects. To our knowledge, this is one of the first works that demonstrates the reconstruction of complex objects with a real mobile manipulator considering uncertainty in the controls.
UR  - https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=84951781373&origin=inward
UR  - https://www.scopus.com/inward/citedby.uri?partnerID=HzOxMe3b&scp=84951781373&origin=inward
U2  - 10.1007/s10514-015-9531-3
DO  - 10.1007/s10514-015-9531-3
M3  - Article
SP  - 89
EP  - 109
JO  - Autonomous Robots
JF  - Autonomous Robots
SN  - 0929-5593
ER  -