Ground and Aerial Mutual Localization using Anonymous Relative-Bearing Measurements

Title: Ground and Aerial Mutual Localization using Anonymous Relative-Bearing Measurements
Publication Type: Journal Article
Year of Publication: 2016
Authors: Stegagno, P, Cognetti, M, Oriolo, G, Bülthoff, HH, Franchi, A
Journal: IEEE Transactions on Robotics
Volume: 32
Issue: 5
Pagination: 1133-1151
Date Published: 10/2016
Abstract

We present a decentralized algorithm for estimating mutual poses (i.e., relative positions and orientations) in a group of mobile robots. The algorithm uses only anonymous relative-bearing measurements obtainable, e.g., using onboard monocular cameras, and onboard motion measurements, such as inertial ones (acceleration and angular velocity).
Onboard relative-bearing sensors supply anonymous measurements, i.e., they provide the directions along which other robots are located, but no direction is associated with a particular robot (identities are unknown).
The issue of anonymity is often overlooked in theory but represents a real problem in practice, especially when employing onboard vision.
The solution is first presented for ground robots, in SE(2), and then for aerial robots, in SE(3), in order to emphasize the difference between the two cases.
The proposed method is based on a two-step approach: the first step uses instantaneous geometric arguments on the anonymous measurements to retrieve the most likely unscaled relative configurations together with the identities; the second step uses numerical Bayesian filtering to exploit the motion model over time and to retrieve the scale.
The proposed method is robust to false positives and false negatives of the robot detector.
An extensive experimental validation of the algorithm is performed using Khepera III ground mobile robots and quadrotor aerial robots.
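The following Python sketch is only a rough illustration of the two-step idea summarized above, not the authors' algorithm: a greedy nearest-angle matching stands in for the geometric identity-recovery step, and a small particle filter stands in for the Bayesian filtering that exploits known relative motion to make the range (i.e., the scale) observable from bearings alone. The state is a planar relative position only; orientation estimation, false detections, and the paper's multi-hypothesis geometry are omitted, and all function names and noise levels are assumptions.

# Illustrative sketch only, NOT the authors' implementation.
import numpy as np

def wrap(a):
    # Wrap an angle to (-pi, pi].
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def assign_identities(anonymous_z, predicted):
    # Greedily match each anonymous bearing to the closest predicted bearing
    # (a crude stand-in for the paper's geometric identity recovery).
    match, used = {}, set()
    for i, z in enumerate(anonymous_z):
        errs = [np.inf if j in used else abs(wrap(z - b)) for j, b in enumerate(predicted)]
        j = int(np.argmin(errs))
        match[i] = j
        used.add(j)
    return match

def pf_step(particles, weights, odom, z, sigma=0.05, rng=None):
    # One predict/update/resample cycle on the relative position of one robot.
    rng = rng if rng is not None else np.random.default_rng(0)
    particles = particles + odom + rng.normal(0.0, 0.02, particles.shape)   # predict
    pred = np.arctan2(particles[:, 1], particles[:, 0])                     # expected bearings
    weights = weights * np.exp(-0.5 * (wrap(z - pred) / sigma) ** 2) + 1e-300
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)        # resample
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_rel = [np.array([2.0, 1.0]), np.array([-1.0, 3.0])]   # unknown relative positions
    odoms = [np.array([0.05, 0.0]), np.array([0.0, -0.05])]    # known relative motion per step
    N = 800

    # Initialize each filter along its first bearing with unknown range, mimicking
    # an unscaled relative configuration handed over by the geometric step.
    filters = []
    for p in true_rel:
        b = np.arctan2(p[1], p[0])
        r = rng.uniform(0.5, 6.0, N)
        pts = np.column_stack([r * np.cos(b), r * np.sin(b)]) + rng.normal(0.0, 0.1, (N, 2))
        filters.append((pts, np.full(N, 1.0 / N)))

    for _ in range(100):
        true_rel = [p + o for p, o in zip(true_rel, odoms)]
        # Anonymous measurements: bearings arrive in random order, without identities.
        order = rng.permutation(len(true_rel))
        anon_z = [np.arctan2(true_rel[k][1], true_rel[k][0]) + rng.normal(0.0, 0.02)
                  for k in order]
        # Step 1 (toy version): recover identities against the filters' predicted bearings.
        predicted = []
        for parts, wts in filters:
            mean = (parts * wts[:, None]).sum(axis=0)
            predicted.append(np.arctan2(mean[1], mean[0]))
        match = assign_identities(anon_z, predicted)
        # Step 2: per-robot Bayesian filtering; the known relative motion over time
        # makes the range observable, i.e., it retrieves the scale.
        for i, j in match.items():
            parts, wts = filters[j]
            filters[j] = pf_step(parts, wts, odoms[j], anon_z[i], rng=rng)

    for j, (parts, _) in enumerate(filters):
        print(f"robot {j}: true {true_rel[j]}, estimate {parts.mean(axis=0).round(2)}")

Running the sketch prints the true and estimated relative positions of both robots after the simulated run; the point is only to show why identity recovery must precede the filter update and why relative motion over time is what makes the scale observable from bearing measurements.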

Citation Key: 2016h-SteCogOriBueFra
Attachments:
preprint (PDF): 4.74 MB
video: 10.02 MB