Self-localization and Map building
| Title | Self-localization and Map building |
| Publication Type | Book Chapter |
| Year of Publication | 2018 |
| Editor | Goswami, A, Vadakkepat, P |
| Book Title | Humanoid Robots: A Reference |
| Chapter | Self-localization and Map building |
For humanoid robots to operate autonomously in a complex environment, they must perceive it, build an appropriate representation of it, localize within it, and decide which motion to execute. The relationship between the environment and the robot is rather complex: some parts are obstacles to avoid, others are possible supports for locomotion or objects to manipulate. The affordances of the objects and the environment may give rise to quite complex motions, ranging from bi-manual manipulation to whole-body motion generation. In this chapter, we introduce tools for realizing vision-based humanoid navigation. The general structure of such a system is depicted in Fig. 1. It represents the classical perception-action loop: based on the sensor signals, a set of features is extracted; this information is then used to localize the robot and to build a representation of the environment.
This process is the subject of the second section. Finally, a motion is planned and sent to the robot's control system. The third section describes several approaches to implementing visual navigation in the context of humanoid robotics.
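The perception-action loop described above can be sketched as a minimal program. Everything here is an illustrative placeholder (the function names `sense`, `localize_and_map`, `plan_motion`, the dummy landmark data, and the returned command string are all assumptions, not an API from the chapter); a real system would run feature extraction and SLAM inside these stubs.

```python
# Minimal sketch of a perception-action loop: sense -> localize/map -> plan.
# All names and data structures are hypothetical placeholders.
from dataclasses import dataclass, field


@dataclass
class RobotState:
    pose: tuple = (0.0, 0.0, 0.0)  # (x, y, heading) estimate


@dataclass
class Map:
    landmarks: list = field(default_factory=list)


def sense():
    """Stand-in for reading camera/IMU signals; returns dummy features."""
    return [("landmark", (1.0, 2.0))]


def localize_and_map(state, world_map, features):
    """Update the pose estimate and extend the map from extracted features."""
    for kind, pos in features:
        if pos not in world_map.landmarks:
            world_map.landmarks.append(pos)
    # A real system would refine the pose with SLAM; we keep it unchanged here.
    return state, world_map


def plan_motion(state, world_map):
    """Choose the next motion command from the current estimate and map."""
    return "step_forward"


def perception_action_loop(steps=3):
    state, world_map = RobotState(), Map()
    commands = []
    for _ in range(steps):
        features = sense()
        state, world_map = localize_and_map(state, world_map, features)
        commands.append(plan_motion(state, world_map))
    return commands, world_map


commands, world_map = perception_action_loop()
```

The key design point the chapter's Fig. 1 conveys is the separation of concerns: perception feeds a state estimate and a map, and only the planner decides on motion, which keeps each stage replaceable.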