A distributed vision-based consensus model for aerial-robotic teams
Fabio Poiesi
2018-01-01
Abstract
We present a distributed model for a team of autonomous aerial robots to collaboratively track a target without external control. The model uses distributed consensus to coordinate actions and to maintain formation via geometric constraints. Each robot uses its ego-centric view of the target and the relative distances to its two closest neighbors to infer its steering commands. To account for noisy and missing target detections, the robots exchange their estimated target position and formation configuration through shared PID-controlled steering responses. We show that the proposed model enables the team to maintain view of a maneuvering target with varying acceleration under noisy detections and failures, up to the case in which all robots but one lose the target from their field of view.
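The sketch below illustrates, for a single robot, how an ego-centric target bearing and the distances to the two closest neighbors could feed PID-controlled steering responses. It is a minimal example under stated assumptions, not the paper's implementation: the class names, gains, and the mean-spacing formation error are hypothetical choices, and the inter-robot consensus exchange is omitted.

```python
# Illustrative sketch (not the paper's method): PID-controlled steering that
# fuses an ego-centric target bearing with spacing errors to the two closest
# neighbors. Names, gains, and the spacing rule are hypothetical.

import numpy as np


class PID:
    """Basic PID controller acting on a scalar error signal."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def steering_command(bearing_to_target, neighbor_dists, desired_spacing,
                     target_pid, spacing_pid):
    """Combine the ego-centric target bearing with a formation-spacing error.

    bearing_to_target : angle (rad) of the detected target in the robot's frame
    neighbor_dists    : distances (m) to the two closest neighbors
    desired_spacing   : spacing (m) imposed by the geometric formation constraint
    """
    # Turn toward the target: the bearing itself is the heading error.
    turn_rate = target_pid.step(bearing_to_target)

    # Keep formation: penalize deviation of the mean neighbor distance
    # from the desired spacing (a simplified geometric constraint).
    spacing_error = desired_spacing - np.mean(neighbor_dists)
    speed_adjust = spacing_pid.step(spacing_error)

    return turn_rate, speed_adjust


if __name__ == "__main__":
    dt = 0.05
    target_pid = PID(kp=1.2, ki=0.0, kd=0.3, dt=dt)
    spacing_pid = PID(kp=0.8, ki=0.05, kd=0.1, dt=dt)

    # Example: target seen 0.4 rad to the left, neighbors at 4.5 m and 5.5 m,
    # desired spacing of 5 m.
    turn, speed = steering_command(0.4, [4.5, 5.5], 5.0,
                                   target_pid, spacing_pid)
    print(f"turn rate: {turn:.3f} rad/s, speed adjustment: {speed:.3f} m/s")
```

In the model described in the abstract, robots would additionally share their estimated target positions and formation configuration so that a robot with a noisy or missing detection can fall back on its neighbors' shared steering responses; that consensus step is not shown here.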