Space dimension perception from the multimodal sensorimotor flow of a naive robotic agent

Alban Laflaquière, Sylvain Argentieri, Bruno Gas, Eduardo Castillo-Castenada

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

9 Scopus citations

Abstract

Perception and action are fundamental tasks for autonomous robots. Traditionally, they rely on theoretical models built by the system's designer. But is a naive agent able to learn by itself the structure of its interaction with the environment, without any a priori information? This knowledge should be extracted through the analysis of the only information it has access to: its high-dimensional sensorimotor flow. Recent works, based on the sensorimotor contingencies theory, allow a simulated agent to extract the dimensionality of geometrical space without any model of itself or of the environment. In this paper, these results are validated using a more sophisticated auditory modality. The question of multimodal fusion is then addressed by equipping the agent with vision. Finally, preliminary experimental results on a real robotic platform are presented.
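The dimension-extraction idea summarized in the abstract can be illustrated with a small sketch. The code below is not the paper's implementation; it is a minimal toy in the spirit of the sensorimotor contingencies line of work (Philipona, O'Regan & Nadal), where the agent compares the dimensions of the sensory variations produced by motor-only changes, environment-only changes, and joint changes, and estimates the space dimension as d_motor + d_env - d_both. Every name, parameter, and the toy sensorimotor law (a redundant 4-DOF head observing a single source through a random smooth map) are illustrative assumptions, not the paper's robot.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (not the paper's actual robot): a redundant
# 4-DOF motor command q moves a sensor head in 3-D space, and the
# environment state e is the 3-D position of a single source.
DIM_Q, DIM_SPACE, DIM_S = 4, 3, 40
A = rng.standard_normal((DIM_SPACE, DIM_Q))  # motor-to-position map
B = rng.standard_normal((DIM_S, DIM_SPACE))  # sensory embedding

def sensory_response(q, e):
    """Toy sensorimotor law: the sensors only see the source position
    *relative* to the head, so motor and environment displacements
    can compensate each other (the hallmark of spatial structure)."""
    head = np.tanh(A @ q)
    return np.tanh(B @ (e - head))

def jacobian(f, x, h=1e-5):
    """Central finite-difference Jacobian of f at x."""
    cols = []
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = h
        cols.append((f(x + d) - f(x - d)) / (2 * h))
    return np.stack(cols, axis=1)

def rank(M, tol=1e-6):
    """Numerical rank, with a tolerance loose enough to absorb
    finite-difference error."""
    s = np.linalg.svd(M, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

q0 = rng.standard_normal(DIM_Q)
e0 = rng.standard_normal(DIM_SPACE)

# Dimensions of sensory variations under motor-only, environment-only,
# and joint changes around the working point (q0, e0).
J_motor = jacobian(lambda q: sensory_response(q, e0), q0)
J_env = jacobian(lambda e: sensory_response(q0, e), e0)
d_m, d_e = rank(J_motor), rank(J_env)
d_b = rank(np.hstack([J_motor, J_env]))

print(f"motor-only: {d_m}, environment-only: {d_e}, joint: {d_b}")
print(f"estimated space dimension: {d_m + d_e - d_b}")  # expect 3 here
```

Because the toy sensors respond only to the relative source position, motor and environment variations overlap in sensory space: the joint manifold is smaller than the sum of the two, and the deficit d_motor + d_env - d_both recovers the 3 dimensions of the underlying space, with no model of the body or the environment.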

Original language: English
Title of host publication: IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings
Pages: 1520-1525
Number of pages: 6
DOIs
State: Published - 2010
Event: 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Taipei, Taiwan, Province of China
Duration: 18 Oct 2010 → 22 Oct 2010

Publication series

Name: IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings

Conference

Conference: 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010
Country/Territory: Taiwan, Province of China
City: Taipei
Period: 18/10/10 → 22/10/10
