Reinforcement learning-based aggregation for robot swarms
Published in: Adaptive Behavior, Vol. 32, No. 3, pp. 265-281
Format: Journal Article
Language: English
Published: SAGE Publications, London, England, 01.06.2024
ISSN: 1059-7123, 1741-2633
DOI: 10.1177/10597123231202593
Summary: Aggregation, the gathering of individuals into a single group as observed in animals such as birds, bees, and amoebae, is known to provide the whole with protection against predators or resistance to adverse environmental conditions. Cue-based aggregation, where environmental cues determine the location of aggregation, is known to be challenging when the swarm density is low. Here, we propose a novel aggregation method applicable to real robots in low-density swarms. Previously, the Landmark-Based Aggregation (LBA) method used odometric dead-reckoning coupled with visual landmarks and yielded better aggregation in low-density swarms. However, the method's performance was adversely affected by odometry drift, jeopardizing its application in real-world scenarios. In this article, a novel Reinforcement Learning-based Aggregation method, RLA, is proposed to increase aggregation robustness, thus making aggregation possible for real robots in low-density swarm settings. Systematic experiments conducted in a kinematic-based simulator and on real robots have shown that the RLA method yields larger aggregates, is more robust to odometry noise than the LBA method, and adapts better to environmental changes while not being sensitive to parameter tuning, making it better deployable under real-world conditions.
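The abstract does not describe the RLA algorithm itself. As a rough illustration of what reinforcement learning for a cue-based aggregation decision can look like, the sketch below trains a single robot's stay/leave policy with tabular Q-learning. The state discretization (on-cue flag plus a coarse neighbor count), the two-action set, the reward shaping, and the toy environment dynamics are all assumptions made for this example only; they are not the method proposed in the article.

```python
# Minimal tabular Q-learning sketch for a stay/leave aggregation decision.
# All modeling choices here (states, actions, rewards, dynamics) are
# illustrative assumptions, NOT the RLA method from the article.

import random
from collections import defaultdict

ACTIONS = ("stay", "leave")              # assumed action set
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1    # learning rate, discount, exploration

Q = defaultdict(float)                   # Q[(state, action)] -> value


def choose_action(state):
    """Epsilon-greedy selection over the two aggregation actions."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])


def toy_env_step(state, action):
    """Toy dynamics: staying on the cue near neighbors is rewarded.

    State is (on_cue: bool, neighbors: int in 0..3). This model is made up
    purely so the example runs end to end.
    """
    on_cue, neighbors = state
    if action == "stay" and on_cue:
        reward = 1.0 + 0.5 * neighbors   # staying on the cue near others pays off
        next_state = (True, min(3, neighbors + (random.random() < 0.3)))
    elif action == "stay":
        reward = -0.2                    # idling off the cue is wasteful
        next_state = (False, max(0, neighbors - (random.random() < 0.3)))
    else:                                # leave: wander and maybe (re)find the cue
        reward = -0.1
        next_state = (random.random() < 0.4, random.randint(0, 1))
    return next_state, reward


def train(episodes=2000, steps=50):
    for _ in range(episodes):
        state = (False, 0)               # start off-cue and alone
        for _ in range(steps):
            action = choose_action(state)
            next_state, reward = toy_env_step(state, action)
            # One-step Q-learning update
            best_next = max(Q[(next_state, a)] for a in ACTIONS)
            Q[(state, action)] += ALPHA * (reward + GAMMA * best_next
                                           - Q[(state, action)])
            state = next_state


if __name__ == "__main__":
    train()
    for on_cue in (False, True):
        for n in range(4):
            s = (on_cue, n)
            print(s, max(ACTIONS, key=lambda a: Q[(s, a)]))
```

After training, the printed greedy policy typically prefers "stay" when the robot is on the cue (especially with neighbors present) and "leave" otherwise, which is the qualitative behavior a cue-based aggregation controller aims for; the article's actual method, state space, and learning setup should be taken from the full text.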