EDEN / Rover Navigation / Environment Perception

Entering the Perception / Decision / Action loop

Perception is the means by which robots gather knowledge about their environment. But before this knowledge can be structured into models or exploited, some basic perception functionalities are required. These pages gather some of our work on such functionalities. They are all related to vision, as it is the main sensor we currently use within the Eden project:
  • Stereo-vision describes how an image of 3D points is produced from a pair of calibrated images.
  • Image segmentation describes how color and texture attributes can be exploited to segment the perceived scenes into regions.
  • Panoramic vision describes the principle of panoramic image acquisition (work that exploits such images to localize the rover is presented here).
  • Interest point matching is an important basic algorithm, whose results can be exploited by several other functionalities.
  • Some visual feature tracking algorithms are also described here.
[Betge-Brezetz 1995]
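To illustrate the stereo-vision functionality above, here is a minimal sketch of back-projecting a disparity map into an image of 3D points. It assumes a rectified, calibrated pair and a pinhole model; the parameter names are hypothetical, and the actual EDEN algorithm is described on the stereo-vision page.

```python
import numpy as np

def stereo_to_3d(disparity, baseline, focal_px, cx, cy):
    """Back-project a dense disparity map into an image of 3D points.

    Assumes a rectified stereo pair with focal length `focal_px` (pixels),
    principal point (cx, cy) and baseline `baseline` (meters).
    Pixels with non-positive disparity are marked invalid (NaN).
    """
    h, w = disparity.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Depth from disparity: z = f * b / d (NaN where disparity is invalid)
    z = np.where(disparity > 0, focal_px * baseline / np.maximum(disparity, 1e-12), np.nan)
    # Back-project pixel coordinates through the pinhole model
    x = (u - cx) * z / focal_px
    y = (v - cy) * z / focal_px
    return np.stack([x, y, z], axis=-1)  # (h, w, 3) image of 3D points
```

Each pixel of the result holds the 3D coordinates of the corresponding scene point in the camera frame, which is the representation the terrain-modeling functionalities consume.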
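The interest point matching step can likewise be sketched with a standard nearest-neighbour search over descriptors with a ratio test (a common technique shown here for illustration; the matcher actually used within EDEN is described on the interest points matching page):

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Match interest point descriptors between two images.

    desc_a: (n, d) array, desc_b: (m, d) array of descriptors.
    A point in image A is matched to its nearest neighbour in image B
    only if that neighbour is clearly closer than the second-nearest
    one (ratio test), which rejects ambiguous matches.
    Returns a list of (index_in_a, index_in_b) pairs.
    """
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        if len(order) >= 2 and dists[order[0]] < ratio * dists[order[1]]:
            matches.append((i, int(order[0])))
    return matches
```

The resulting pairs can then feed higher-level processes such as visual tracking or rover localization.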

Related Publications

[Betge-Brezetz 1995]

S. Betge-Brezetz, R. Chatila and M. Devy. Object-Based Modelling and Localization in Natural Environments. In International Conference on Robotics and Automation, pages 2920-2927, Nagoya (Japan), 1995.


