1. Chen Q, Guo W, Fang Y, Tong Y, Lu T, Jin X, Deng Z. A Bio-Inspired Model for Bee Simulations. IEEE Trans Vis Comput Graph 2025; 31:2073-2085. [PMID: 38502620; DOI: 10.1109/TVCG.2024.3379080]
Abstract
As eusocial creatures, bees display unique macro collective behavior and local body dynamics that hold potential applications in various fields, such as computer animation, robotics, and social behavior. Unlike birds and fish, bees fly in a low-aligned zigzag pattern. Additionally, bees rely on visual signals for foraging and predator avoidance, exhibiting distinctive local body oscillations, such as body lifting, thrusting, and swaying. These inherent features pose significant challenges to realistic bee simulations in practical animation applications. In this article, we present a bio-inspired model for bee simulations capable of replicating both macro collective behavior and local body dynamics of bees. Our approach utilizes a visually-driven system to simulate a bee's local body dynamics, incorporating obstacle perception and body rolling control for effective collision avoidance. Moreover, we develop an oscillation rule that captures the dynamics of the bee's local bodies, drawing on insights from biological research. Our model extends beyond simulating individual bees' dynamics; it can also represent bee swarms by integrating a fluid-based field with the bees' innate noise and zigzag motions. To fine-tune our model, we utilize pre-collected honeybee flight data. Through extensive simulations and comparative experiments, we demonstrate that our model can efficiently generate realistic low-aligned and inherently noisy bee swarms.
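As a loose illustration of the swarm-level recipe described above (a smooth guiding field combined with innate noise and zigzag oscillation), consider the toy sketch below. The swirl field, gains, and parameters are invented placeholders, not the authors' fluid-based model.

```python
import numpy as np

def step_swarm(pos, vel, t, dt=0.02, v0=1.5,
               zig_amp=0.6, zig_freq=8.0, noise_std=0.4):
    """Advance a toy bee swarm one step: a smooth guiding field plus
    per-bee zigzag oscillation and noise (illustrative only)."""
    n = len(pos)
    # Placeholder "fluid-based" guiding field: a swirl around the origin.
    swirl = np.stack([-pos[:, 1], pos[:, 0]], axis=1)
    swirl /= np.linalg.norm(swirl, axis=1, keepdims=True) + 1e-9
    # Zigzag: oscillate perpendicular to each bee's heading, one phase per bee.
    heading = vel / (np.linalg.norm(vel, axis=1, keepdims=True) + 1e-9)
    perp = np.stack([-heading[:, 1], heading[:, 0]], axis=1)
    phase = 2 * np.pi * zig_freq * t + np.arange(n)
    zigzag = zig_amp * np.sin(phase)[:, None] * perp
    # Innate noise keeps the group low-aligned.
    noise = np.random.normal(0.0, noise_std, size=(n, 2))
    vel = v0 * swirl + zigzag + noise
    return pos + vel * dt, vel

pos = np.random.uniform(-1, 1, size=(50, 2))
vel = np.random.normal(0, 1, size=(50, 2))
for k in range(1000):
    pos, vel = step_swarm(pos, vel, t=k * 0.02)
```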
2. Jansen AJ, Fajen BR. Prospective control of steering through multiple waypoints. J Vis 2024; 24(8):1. [PMID: 39087937; PMCID: PMC11305437; DOI: 10.1167/jov.24.8.1]
Abstract
Some locomotor tasks involve steering at high speeds through multiple waypoints within cluttered environments. Although in principle actors could treat each individual waypoint in isolation, skillful performance would seem to require them to adapt their trajectory to the most immediate waypoint in anticipation of subsequent waypoints. To date, there have been few studies of such behavior, and the evidence that does exist is inconclusive about whether steering is affected by multiple future waypoints. The present study was designed to address the need for a clearer understanding of how humans adapt their steering movements in anticipation of future goals. Subjects performed a simulated drone flying task in a forest-like virtual environment that was presented on a monitor while their eye movements were tracked. They were instructed to steer through a series of gates while the distance at which gates first became visible (i.e., lookahead distance) was manipulated between trials. When gates became visible at least 1.5 segments in advance, subjects successfully flew through a high percentage of gates, rarely collided with obstacles, and maintained a consistent speed. They also approached the most immediate gate in a way that depended on the angular position of the subsequent gate. However, when the lookahead distance was less than 1.5 segments, subjects followed longer paths and flew at slower, more variable speeds. The findings demonstrate that the control of steering through multiple waypoints does indeed depend on information from beyond the most immediate waypoint. Discussion focuses on the possible control strategies for steering through multiple waypoints.
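One way to make the anticipation idea concrete: a controller whose turn-rate command blends the bearing of the current gate with that of the next. This is an illustrative toy, not the study's fitted model; the blending weight and gains are assumptions.

```python
import numpy as np

def steer(pos, heading, gate_now, gate_next, w_next=0.35,
          k=2.5, dt=0.05, speed=4.0):
    """Toy multi-waypoint steering: turn-rate command from a weighted
    blend of bearing errors to the current and next gate."""
    def bearing_err(target):
        d = target - pos
        return np.arctan2(d[1], d[0]) - heading
    wrap = lambda a: (a + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi]
    err = (1 - w_next) * wrap(bearing_err(gate_now)) \
        + w_next * wrap(bearing_err(gate_next))
    heading += k * err * dt                  # proportional turn rate
    pos = pos + speed * dt * np.array([np.cos(heading), np.sin(heading)])
    return pos, heading
```

With w_next = 0, the agent treats each gate in isolation; a positive w_next reproduces the qualitative signature reported above, in which the approach to the immediate gate depends on where the subsequent gate lies.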
Affiliation(s)
- A J Jansen, Cognitive Science Department, Rensselaer Polytechnic Institute, Troy, NY, USA
- Brett R Fajen, Cognitive Science Department, Rensselaer Polytechnic Institute, Troy, NY, USA
3. Xiao Y, Lei X, Zheng Z, Xiang Y, Liu YY, Peng X. Perception of motion salience shapes the emergence of collective motions. Nat Commun 2024; 15:4779. [PMID: 38839782; PMCID: PMC11153630; DOI: 10.1038/s41467-024-49151-x]
Abstract
Despite the profound implications of self-organization in animal groups for collective behaviors, our understanding of its fundamental principles, and their application to swarm robotics, remains incomplete. Here we propose a heuristic measure of perception of motion salience (MS) to quantify relative motion changes of neighbors from a first-person view. Leveraging three large bird-flocking datasets, we explore how this perception of MS relates to the structure of leader-follower (LF) relations, and further perform an individual-level correlation analysis between past perception of MS and the future change rate of velocity consensus. We observe a prevalence of positive correlations in real flocks, demonstrating that individuals accelerate the convergence of velocity with neighbors who have higher MS. This empirical finding motivates us to introduce the concept of adaptive MS-based (AMS) interaction in a swarm model. Finally, we implement AMS in a swarm of ~10² miniature robots. Swarm experiments show the significant advantage of AMS in enhancing self-organization of the swarm for smooth evacuations from confined environments.
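A rough sketch of the concept (the paper's precise MS definition and AMS update differ; the weighting rule and gain here are assumptions): weight velocity alignment toward neighbours by how strongly their motion relative to the focal agent changed between steps.

```python
import numpy as np

def ms_weights(vel_self, vel_nbrs, vel_nbrs_prev, eps=1e-9):
    """Toy motion-salience weights: neighbours whose velocity relative to
    the focal agent changed most between steps get the largest weight."""
    rel_now = vel_nbrs - vel_self
    rel_prev = vel_nbrs_prev - vel_self
    ms = np.linalg.norm(rel_now - rel_prev, axis=1)
    return ms / (ms.sum() + eps)

def ams_update(vel_self, vel_nbrs, vel_nbrs_prev, k=0.5):
    """Adaptive MS-based alignment: pull the focal velocity toward a
    salience-weighted average of neighbour velocities (illustrative)."""
    w = ms_weights(vel_self, vel_nbrs, vel_nbrs_prev)
    target = (w[:, None] * vel_nbrs).sum(axis=0)
    return vel_self + k * (target - vel_self)
```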
Affiliation(s)
- Yandong Xiao, College of System Engineering, National University of Defense Technology, Changsha, Hunan, China
- Xiaokang Lei, College of Information and Control Engineering, Xi'an University of Architecture and Technology, Xi'an, Shaanxi, China
- Zhicheng Zheng, School of Marine Science and Technology, Northwestern Polytechnical University, Xi'an, Shaanxi, China
- Yalun Xiang, School of Marine Science and Technology, Northwestern Polytechnical University, Xi'an, Shaanxi, China
- Yang-Yu Liu, Channing Division of Network Medicine, Department of Medicine, Brigham and Women's Hospital and Harvard Medical School, Boston, MA, USA; Center for Artificial Intelligence and Modeling, The Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign, Champaign, IL, USA
- Xingguang Peng, School of Marine Science and Technology, Northwestern Polytechnical University, Xi'an, Shaanxi, China
4. Kittelmann M, McGregor AP. Looking across the gap: Understanding the evolution of eyes and vision among insects. Bioessays 2024; 46:e2300240. [PMID: 38593308; DOI: 10.1002/bies.202300240]
Abstract
The compound eyes of insects exhibit stunning variation in size, structure, and function, which has allowed these animals to use their vision to adapt to a huge range of different environments and lifestyles, and evolve complex behaviors. Much of our knowledge of eye development has been learned from Drosophila, while visual adaptations and behaviors are often more striking and better understood from studies of other insects. However, recent studies in Drosophila and other insects, including bees, beetles, and butterflies, have begun to address this gap by revealing the genetic and developmental bases of differences in eye morphology and key new aspects of compound eye structure and function. Furthermore, technical advances have facilitated the generation of high-resolution connectomic data from different insect species that enhances our understanding of visual information processing, and the impact of changes in these processes on the evolution of vision and behavior. Here, we review these recent breakthroughs and propose that future integrated research from the development to function of visual systems within and among insect species represents a great opportunity to understand the remarkable diversification of insect eyes and vision.
Affiliation(s)
- Maike Kittelmann, Department of Biological and Medical Sciences, Oxford Brookes University, Oxford, UK
5. Singh S, Garratt M, Srinivasan M, Ravi S. Analysis of collision avoidance in honeybee flight. J R Soc Interface 2024; 21:20230601. [PMID: 38531412; PMCID: PMC10973882; DOI: 10.1098/rsif.2023.0601]
Abstract
Insects are excellent at flying in dense vegetation and navigating through other complex spatial environments. This study investigates the strategies used by honeybees (Apis mellifera) to avoid collisions with an obstacle encountered frontally during flight. Bees were trained to fly through a tunnel that contained a solitary vertically oriented cylindrical obstacle placed along the midline. Flight trajectories of bees were recorded for six conditions in which the diameter of the obstructing cylinder was systematically varied from 25 mm to 160 mm. Analysis of salient events during the bees' flight, such as the deceleration before the obstacle and the initiation of the deviation in flight path to avoid collisions, revealed a strategy for obstacle avoidance that is based on the relative retinal expansion velocity generated by the obstacle when the bee is on a collision course. We find that a quantitative model, featuring a controller that extracts specific visual cues from the frontal visual field, provides an accurate characterization of the geometry and the dynamics of the manoeuvres adopted by honeybees to avoid collisions. This study paves the way for the design of unmanned aerial systems by identifying the visual cues that honeybees use to perform robust obstacle avoidance in flight.
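Relative retinal expansion velocity is commonly taken as the expansion rate of the obstacle's retinal image divided by its angular size, i.e. the inverse of time-to-contact. A minimal sketch of computing and thresholding it for a head-on approach follows; the threshold and deceleration rule are illustrative assumptions, not the fitted controller from the paper.

```python
import numpy as np

def rrev(obstacle_radius, distance, approach_speed):
    """Relative retinal expansion velocity of a cylinder seen head-on:
    angular-size rate divided by angular size (approx. 1/time-to-contact)."""
    theta = 2 * np.arctan(obstacle_radius / distance)       # angular size (rad)
    # d(theta)/dt for closing speed v, from differentiating theta w.r.t. d.
    dtheta_dt = 2 * obstacle_radius * approach_speed / (
        distance**2 + obstacle_radius**2)
    return dtheta_dt / theta

# Toy trigger: decelerate once RREV exceeds a set point (value assumed).
v, d, r, dt, rrev_threshold = 0.5, 2.0, 0.04, 0.01, 0.8
while d > 0.2:
    if rrev(r, d, v) > rrev_threshold:
        v *= 0.98                        # smooth deceleration
    d -= v * dt
```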
Affiliation(s)
- Shreyansh Singh, School of Engineering and Technology, University of New South Wales, Canberra, Australia
- Matthew Garratt, School of Engineering and Technology, University of New South Wales, Canberra, Australia
- Mandyam Srinivasan, Queensland Brain Institute, University of Queensland, Brisbane, Australia
- Sridhar Ravi, School of Engineering and Technology, University of New South Wales, Canberra, Australia
6. Schoepe T, Janotte E, Milde MB, Bertrand OJN, Egelhaaf M, Chicca E. Finding the gap: neuromorphic motion-vision in dense environments. Nat Commun 2024; 15:817. [PMID: 38280859; PMCID: PMC10821932; DOI: 10.1038/s41467-024-45063-y]
Abstract
Animals have evolved mechanisms to travel safely and efficiently within different habitats. On a journey through dense terrain, animals avoid collisions and cross narrow passages while maintaining an overall course. Multiple hypotheses address how animals solve the challenges faced during such travel. Here we show that a single mechanism enables safe and efficient travel. We developed an insect-inspired robot with remarkable capabilities to travel in dense terrain, avoiding collisions, crossing gaps, and selecting safe passages. These capabilities are accomplished by a neuromorphic network that steers the robot toward regions of low apparent motion. Our system leverages knowledge about vision processing and obstacle avoidance in insects. Our results demonstrate how insects might safely travel through diverse habitats, and we anticipate that our system will serve as a working hypothesis for studying insects' travels in dense terrains. Furthermore, it illustrates that novel hardware systems can be designed by understanding the mechanisms driving behaviour.
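The core steering rule, turning toward the region of lowest apparent motion, can be sketched in a few lines. The paper realizes this with a spiking neuromorphic network; the dense version below, with assumed smoothing and gain, only illustrates the principle that nearby obstacles generate high flow and gaps generate low flow.

```python
import numpy as np

def steer_to_gap(flow_mag, azimuths, k_turn=1.5):
    """Turn toward the azimuthal sector with the least apparent motion.
    flow_mag: per-sector optic-flow magnitude from a forward-facing view.
    Returns a turn-rate command (illustrative proportional gain)."""
    kernel = np.ones(5) / 5.0                          # smooth sector noise
    smooth = np.convolve(flow_mag, kernel, mode="same")
    gap_azimuth = azimuths[np.argmin(smooth)]
    return k_turn * gap_azimuth                        # steer toward the gap

azimuths = np.linspace(-np.pi / 2, np.pi / 2, 61)      # frontal field
flow = np.abs(np.sin(azimuths)) + 0.2 * np.random.rand(61)
turn_cmd = steer_to_gap(flow, azimuths)
```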
Affiliation(s)
- Thorben Schoepe, Peter Grünberg Institut 15, Forschungszentrum Jülich, Aachen, Germany; Faculty of Technology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany; Bio-Inspired Circuits and Systems (BICS) Lab, Zernike Institute for Advanced Materials, University of Groningen, Groningen, Netherlands; CogniGron (Groningen Cognitive Systems and Materials Center), University of Groningen, Groningen, Netherlands
- Ella Janotte, Event Driven Perception for Robotics, Italian Institute of Technology, iCub facility, Genoa, Italy
- Moritz B Milde, International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Penrith, Australia
- Martin Egelhaaf, Neurobiology, Faculty of Biology, Bielefeld University, Bielefeld, Germany
- Elisabetta Chicca, Faculty of Technology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany; Bio-Inspired Circuits and Systems (BICS) Lab, Zernike Institute for Advanced Materials, University of Groningen, Groningen, Netherlands; CogniGron (Groningen Cognitive Systems and Materials Center), University of Groningen, Groningen, Netherlands
7. Glass JR, Burnett NP, Combes SA, Weisman E, Helbling A, Harrison JF. Flying, nectar-loaded honey bees conserve water and improve heat tolerance by reducing wingbeat frequency and metabolic heat production. Proc Natl Acad Sci U S A 2024; 121:e2311025121. [PMID: 38227669; PMCID: PMC10823226; DOI: 10.1073/pnas.2311025121]
Abstract
Heat waves are becoming increasingly common due to climate change, making it crucial to identify and understand the capacities of insect pollinators, such as honey bees, to avoid overheating. We examined the effects of hot, dry air temperatures on the physiological and behavioral mechanisms that honey bees use to fly when carrying nectar loads, to assess how foraging is limited by overheating or desiccation. We found that flight muscle temperatures increased linearly with load mass at air temperatures of 20 or 30 °C, but, remarkably, there was no change with increasing nectar loads at an air temperature of 40 °C. Flying, nectar-loaded bees were able to avoid overheating at 40 °C by reducing their flight metabolic rates and increasing evaporative cooling. At high body temperatures, bees apparently increase flight efficiency by lowering their wingbeat frequency and compensating with increased stroke amplitude, reducing the need for evaporative cooling. However, even with reductions in metabolic heat production, desiccation likely limits foraging at temperatures well below bees' critical thermal maxima in hot, dry conditions.
Affiliation(s)
- Jordan R. Glass, School of Life Sciences, Arizona State University, Tempe, AZ 85281; Department of Zoology and Physiology, University of Wyoming, Laramie, WY 82071
- Nicholas P. Burnett, Department of Neurobiology, Physiology and Behavior, University of California, Davis, CA 95616
- Stacey A. Combes, Department of Neurobiology, Physiology and Behavior, University of California, Davis, CA 95616
- Ethan Weisman, School of Life Sciences, Arizona State University, Tempe, AZ 85281
- Alina Helbling, School of Life Sciences, Arizona State University, Tempe, AZ 85281
- Jon F. Harrison, School of Life Sciences, Arizona State University, Tempe, AZ 85281
8. Stiemer LN, Thoma A, Braun C. MBT3D: Deep learning based multi-object tracker for bumblebee 3D flight path estimation. PLoS One 2023; 18:e0291415. [PMID: 37738269; PMCID: PMC10516433; DOI: 10.1371/journal.pone.0291415]
Abstract
This work presents the Multi-Bees-Tracker (MBT3D) algorithm, a Python framework implementing a deep association tracker for tracking-by-detection, to address the challenging task of tracking the flight paths of bumblebees in a social group. While tracking algorithms for bumblebees exist, they often come with severe restrictions, such as the need for sufficient lighting, high contrast between the animal and the background, absence of occlusion, or significant user input. Tracking the flight paths of bumblebees in a social group is challenging: the animals suddenly adjust their movements, change their appearance across different wing-beat states, and look highly similar to one another. The MBT3D tracker developed in this research adapts an existing ant-tracking algorithm to bumblebee tracking. It combines an offline-trained appearance descriptor with a Kalman filter for appearance and motion matching. Different detector architectures for the upstream detections (You Only Look Once (YOLOv5), Faster Region Proposal Convolutional Neural Network (Faster R-CNN), and RetinaNet) are investigated in a comparative study to optimize performance. The detection models were trained on a dataset containing 11,359 labeled bumblebee images. On the bumblebee validation dataset of 1,323 labeled images, YOLOv5 reaches an average precision of AP = 53.8%, Faster R-CNN achieves AP = 45.3%, and RetinaNet AP = 38.4%. The tracker's appearance model is trained on 144 samples. The tracker (with Faster R-CNN detections) reaches a Multiple Object Tracking Accuracy of MOTA = 93.5% and a Multiple Object Tracking Precision of MOTP = 75.6% on a validation dataset containing 2000 images, competing with state-of-the-art computer vision methods. The framework allows reliable tracking of different bumblebees in the same video stream with rarely occurring identity switches (IDS): MBT3D has far fewer IDS than other commonly used algorithms and one of the lowest false positive rates, competing with state-of-the-art animal tracking algorithms. The framework reconstructs the 3-dimensional (3D) flight paths of the bumblebees by triangulation, and it can handle and compare two alternative stereo camera pairs if desired.
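The final reconstruction step, triangulating matched 2D detections from a calibrated stereo pair into 3D flight paths, can be sketched with standard OpenCV calls. The camera matrices below are placeholder values for an ideal rectified pair; the framework's actual calibration and matching pipeline is more involved.

```python
import cv2
import numpy as np

def triangulate_track(P1, P2, pts_cam1, pts_cam2):
    """Triangulate matched 2D track points (pixel coords) from two
    calibrated cameras into 3D. P1, P2: 3x4 projection matrices."""
    pts1 = np.asarray(pts_cam1, dtype=np.float64).T    # shape (2, N)
    pts2 = np.asarray(pts_cam2, dtype=np.float64).T
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)    # homogeneous (4, N)
    return (X_h[:3] / X_h[3]).T                        # (N, 3) metric points

# Placeholder projection matrices for an ideal rectified stereo pair.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0], [0]])])  # 10 cm baseline
flight_path_3d = triangulate_track(P1, P2, [(310, 200)], [(290, 200)])
```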
Affiliation(s)
- Luc Nicolas Stiemer, Department of Aerospace Engineering, FH Aachen, Aachen, North Rhine-Westphalia, Germany
- Andreas Thoma, Department of Aerospace Engineering, FH Aachen, Aachen, North Rhine-Westphalia, Germany; Department of Aerospace Engineering, RMIT University, Melbourne, Victoria, Australia
- Carsten Braun, Department of Aerospace Engineering, FH Aachen, Aachen, North Rhine-Westphalia, Germany
9. Yadipour M, Billah MA, Faruque IA. Optic flow enrichment via Drosophila head and retina motions to support inflight position regulation. J Theor Biol 2023; 562:111416. [PMID: 36681182; DOI: 10.1016/j.jtbi.2023.111416]
Abstract
Developing a functional description of the neural control circuits and visual feedback paths underlying insect flight behaviors is an active research area. Feedback controllers incorporating engineering models of the insect visual system outputs have described some flight behaviors, yet they do not explain how insects are able to stabilize their body position relative to nearby targets such as neighbors or forage sources, especially in challenging environments in which optic flow is poor. The insect experimental community is simultaneously recording a growing library of in-flight head and eye motions that may be linked to increased perception. This study develops a quantitative model of the optic flow experienced by a flying insect or robot during head yawing rotations (distinct from the lateral peering motions in previous work) with a single other target in view. It then applies a model of insect visuomotor feedback to show, via analysis and simulation of five species, that these head motions sufficiently enrich the optic flow and that the output feedback can regulate position relative to the single target (asymptotic stability). In the simplifying case of pure rotation relative to the body, theoretical analysis provides a stronger stability guarantee. The results are shown to be robust to anatomical neck angle limits and body vibrations, persist in more detailed simulations of Drosophila lateral-directional flight dynamics, and generalize to recent retinal motion studies. Together, these results suggest that the optic flow enrichment provided by head or pseudopupil rotation could be used in an insect's neural processing circuit to enable position regulation.
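A toy version of the underlying geometry: the bearing rate of a single target mixes a translational term that carries relative-position information with a rotational term; superimposing a known head-yaw oscillation injects a known rotational component that helps separate the two. The paper's model and stability analysis are far more detailed; the amplitude, frequency, and scenario below are illustrative assumptions.

```python
import numpy as np

def target_optic_flow(rel_pos, self_vel, yaw_rate_head):
    """Bearing rate of a single static target seen by a yawing head.
    rel_pos: target position relative to the insect (x, y);
    self_vel: insect translational velocity; yaw_rate_head: known head yaw.
    The translational term depends on relative position; the head term is
    self-generated and known, so it can be subtracted or exploited."""
    x, y = rel_pos
    r2 = x * x + y * y
    bearing_rate_trans = (y * self_vel[0] - x * self_vel[1]) / r2
    return bearing_rate_trans - yaw_rate_head

# Sinusoidal head yaw at 5 Hz (illustrative amplitude and frequency).
t = np.linspace(0, 2, 400)
head_yaw_rate = 0.3 * 2 * np.pi * 5 * np.cos(2 * np.pi * 5 * t)
flow = [target_optic_flow((2.0, 0.5), (0.4, 0.0), w) for w in head_yaw_rate]
```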
Affiliation(s)
- Mehdi Yadipour, School of Mechanical and Aerospace Engineering, Oklahoma State University, Stillwater, OK 74078, USA
- Md Arif Billah, School of Mechanical and Aerospace Engineering, Oklahoma State University, Stillwater, OK 74078, USA
- Imraan A Faruque, School of Mechanical and Aerospace Engineering, Oklahoma State University, Stillwater, OK 74078, USA
10. Through Hawks’ Eyes: Synthetically Reconstructing the Visual Field of a Bird in Flight. Int J Comput Vis 2023; 131:1497-1531. [PMID: 37089199; PMCID: PMC10110700; DOI: 10.1007/s11263-022-01733-2]
Abstract
Birds of prey rely on vision to execute flight manoeuvres that are key to their survival, such as intercepting fast-moving targets or navigating through clutter. A better understanding of the role played by vision during these manoeuvres is not only relevant within the field of animal behaviour, but could also have applications for autonomous drones. In this paper, we present a novel method that uses computer vision tools to analyse the role of active vision in bird flight, and demonstrate its use to answer behavioural questions. Combining motion capture data from Harris’ hawks with a hybrid 3D model of the environment, we render RGB images, semantic maps, depth information and optic flow outputs that characterise the visual experience of the bird in flight. In contrast with previous approaches, our method allows us to consider different camera models and alternative gaze strategies for the purposes of hypothesis testing, allows us to consider visual input over the complete visual field of the bird, and is not limited by the technical specifications and performance of a head-mounted camera light enough to attach to a bird’s head in flight. We present pilot data from three sample flights: a pursuit flight, in which a hawk intercepts a moving target, and two obstacle avoidance flights. With this approach, we provide a reproducible method that facilitates the collection of large volumes of data across many individuals, opening up new avenues for data-driven models of animal behaviour.
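One of the rendered outputs, the optic-flow field, follows from the classic motion-field equations once depth and camera motion are known. A per-pixel sketch under a pinhole model is shown below; the paper's pipeline renders these quantities from motion capture and a 3D environment model, so this only illustrates the underlying geometry, with placeholder intrinsics and a toy flat-depth scene.

```python
import numpy as np

def motion_field(depth, K, v_cam, omega_cam):
    """Per-pixel optic flow (motion field) for a pinhole camera with known
    translational velocity v_cam, angular velocity omega_cam, and depth map,
    using the Longuet-Higgins/Prazdny equations in normalized coordinates."""
    h, w = depth.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x, y = (u - cx) / fx, (v - cy) / fy          # normalized image coords
    vx, vy, vz = v_cam
    wx, wy, wz = omega_cam
    Z = depth
    flow_x = (x * vz - vx) / Z + x * y * wx - (1 + x**2) * wy + y * wz
    flow_y = (y * vz - vy) / Z + (1 + y**2) * wx - x * y * wy - x * wz
    return flow_x * fx, flow_y * fy              # back to pixels per second

K = np.array([[600.0, 0, 320], [0, 600.0, 240], [0, 0, 1]])
depth = np.full((480, 640), 5.0)                 # toy flat scene at 5 m
fx_px, fy_px = motion_field(depth, K, v_cam=(0, 0, 8.0), omega_cam=(0, 0.2, 0))
```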
11. Egelhaaf M. Optic flow based spatial vision in insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. [PMID: 36609568; DOI: 10.1007/s00359-022-01610-w]
Abstract
Optic flow, i.e., the displacement of the retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings to several hundred metres or even kilometres, is necessary for mediating behaviours such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations, and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies. These result in translations and rotations being largely separated by a saccadic flight and gaze mode. Only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic spatial proximity map of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge is that the distance information extracted from the optic flow does not represent distances unambiguously; rather, they are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
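The scaling ambiguity discussed here is easy to state concretely: during pure translation at speed v, a point at distance d and angle θ from the flight direction generates flow of magnitude (v/d)·sin θ, so the flow encodes nearness 1/d only up to the speed factor v. A toy check (numbers arbitrary):

```python
import numpy as np

def translational_flow(speed, distance, angle_from_heading):
    """Optic-flow magnitude for a point during pure translation:
    proportional to nearness (1/distance), scaled by flight speed."""
    return (speed / distance) * np.sin(angle_from_heading)

# A near, slow pass and a far, fast pass produce identical flow:
print(translational_flow(1.0, 2.0, np.pi / 2))   # 0.5 rad/s
print(translational_flow(2.0, 4.0, np.pi / 2))   # 0.5 rad/s as well
```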
Affiliation(s)
- Martin Egelhaaf, Neurobiology and Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany