Sun X, Hu C, Liu T, Yue S, Peng J, Fu Q. Translating Virtual Prey-Predator Interaction to Real-World Robotic Environments: Enabling Multimodal Sensing and Evolutionary Dynamics. Biomimetics (Basel) 2023;8:580. [PMID: 38132519; PMCID: PMC10742093; DOI: 10.3390/biomimetics8080580]
[Received: 09/18/2023; Revised: 10/18/2023; Accepted: 11/29/2023]
Abstract
Prey-predator interactions play a pivotal role in elucidating the evolution and adaptation of various organisms' traits. Numerous approaches have been employed to study the dynamics of prey-predator interaction systems, with agent-based methodologies gaining popularity. However, existing agent-based models are limited in their ability to handle multimodal interactions, which are believed to be crucial for understanding living organisms. Conversely, prevailing prey-predator interaction studies often rely on mathematical models and computer simulations, neglecting real-world constraints and noise. These elusive attributes, which are challenging to model, can lead to emergent behaviors and embodied intelligence. To bridge these gaps, our study designs and implements a prey-predator interaction scenario that incorporates visual and olfactory sensory cues, not only in computer simulations but also in a real multi-robot system. The observed emergent spatial-temporal dynamics demonstrate a successful transition of prey-predator interaction research from virtual simulations to the tangible world. This highlights the potential of multi-robot approaches for studying prey-predator interactions and lays the groundwork for future investigations involving multimodal sensory processing under real-world constraints.
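The agent-based scenario summarized above can be illustrated with a minimal sketch. The code below is an assumed, simplified rendering of a two-agent pursuit in which the predator fuses a visual cue (direct sight within a limited range) with an olfactory cue (the gradient of a distance-decaying scent field); the parameters, the exponential scent model, and the helper names (scent_gradient, predator_step, prey_step) are hypothetical and are not taken from the paper.

```python
# Minimal agent-based sketch of a prey-predator interaction with two sensory
# modalities (vision and olfaction) on a 2D plane. All parameter values and
# helper names are illustrative assumptions, not the authors' model.
import numpy as np

RNG = np.random.default_rng(0)
ARENA = 10.0          # side length of the square arena (assumed units)
VISION_RANGE = 3.0    # agents see each other only within this radius
SCENT_DECAY = 0.5     # exponential decay rate of the prey's scent field
SPEED_PRED, SPEED_PREY = 0.15, 0.12

def scent_gradient(pred_pos, prey_pos):
    """Gradient of an assumed exponential scent field emitted by the prey."""
    diff = prey_pos - pred_pos
    dist = np.linalg.norm(diff) + 1e-9
    # Scent intensity ~ exp(-SCENT_DECAY * dist); gradient points toward prey.
    return SCENT_DECAY * np.exp(-SCENT_DECAY * dist) * diff / dist

def predator_step(pred_pos, prey_pos):
    """Fuse visual and olfactory cues: follow sight if available, else scent."""
    diff = prey_pos - pred_pos
    dist = np.linalg.norm(diff)
    if dist < VISION_RANGE:                      # visual cue dominates
        heading = diff / (dist + 1e-9)
    else:                                        # fall back to olfaction
        grad = scent_gradient(pred_pos, prey_pos)
        heading = grad / (np.linalg.norm(grad) + 1e-9)
    return np.clip(pred_pos + SPEED_PRED * heading, 0, ARENA)

def prey_step(prey_pos, pred_pos):
    """Prey flees when the predator is visible, otherwise wanders randomly."""
    diff = prey_pos - pred_pos
    dist = np.linalg.norm(diff)
    if dist < VISION_RANGE:
        heading = diff / (dist + 1e-9)           # move directly away
    else:
        heading = RNG.normal(size=2)
        heading /= np.linalg.norm(heading) + 1e-9
    return np.clip(prey_pos + SPEED_PREY * heading, 0, ARENA)

if __name__ == "__main__":
    pred = RNG.uniform(0, ARENA, 2)
    prey = RNG.uniform(0, ARENA, 2)
    for t in range(500):
        pred, prey = predator_step(pred, prey), prey_step(prey, pred)
        if np.linalg.norm(pred - prey) < 0.2:    # capture threshold (assumed)
            print(f"capture at step {t}")
            break
    else:
        print("prey escaped within the time limit")
```

In this sketch the switch from olfaction to vision at a fixed range stands in for the multimodal sensory fusion the abstract refers to; the paper's actual sensory processing, evolutionary dynamics, and robot controllers are not reproduced here.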