
Memory-based sensor integration

Giordani, Dimitri;
1994-01-01

Abstract

A sonar array and a camera are the sensors of a robot that learns to move inside a building. Learning is based on the association of ultrasonic and visual images with motor commands. The two sensors extract different information from the environment: the sonars produce a short-range depth image in the plane of the sonar array, while vision gives a wide-range flat image of the space. In this work, a memory-based probabilistic framework is developed that integrates the information gathered by the sensors.
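
The abstract does not give implementation details, but the kind of memory-based, probabilistic integration it mentions can be illustrated with a small sketch. The sketch below assumes a simple kernel-weighted memory: stored pairs of sonar depth profiles and visual feature vectors are associated with motor commands, and a Gaussian kernel over the distance to each stored experience supplies the probabilistic weighting. The class MemoryBasedFusion, its parameters, and the feature dimensions are illustrative assumptions, not the paper's actual method.

    import numpy as np

    class MemoryBasedFusion:
        """Illustrative sketch (assumed structure, not the paper's method):
        kernel-weighted memory lookup associating sonar depth profiles and
        visual features with motor commands."""

        def __init__(self, sigma_sonar=0.5, sigma_vision=1.0):
            self.sigma_sonar = sigma_sonar    # kernel width for sonar distances (assumed value)
            self.sigma_vision = sigma_vision  # kernel width for visual distances (assumed value)
            self.sonar_mem, self.vision_mem, self.command_mem = [], [], []

        def store(self, sonar_profile, visual_features, motor_command):
            """Memorise one experience: both sensor readings plus the command taken."""
            self.sonar_mem.append(np.asarray(sonar_profile, dtype=float))
            self.vision_mem.append(np.asarray(visual_features, dtype=float))
            self.command_mem.append(np.asarray(motor_command, dtype=float))

        def query(self, sonar_profile, visual_features):
            """Return a command as the probability-weighted average of stored commands.
            Each stored experience is weighted by a Gaussian kernel on its distance
            to the current sonar and visual readings (sensor independence assumed)."""
            s = np.asarray(sonar_profile, dtype=float)
            v = np.asarray(visual_features, dtype=float)
            d_sonar = np.array([np.linalg.norm(s - m) for m in self.sonar_mem])
            d_vision = np.array([np.linalg.norm(v - m) for m in self.vision_mem])
            w = np.exp(-0.5 * (d_sonar / self.sigma_sonar) ** 2) \
              * np.exp(-0.5 * (d_vision / self.sigma_vision) ** 2)
            w /= w.sum()                       # normalise weights to a probability distribution
            return w @ np.stack(self.command_mem)

In this sketch a query with the current sonar profile and visual feature vector returns a blended motor command; in the framework described by the abstract, the weighting would come from the learned probabilistic model rather than from this fixed kernel.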
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11582/1001
