1. Egelhaaf M, Lindemann JP. Path integration and optic flow in flying insects: a review of current evidence. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2025; 211:375-401. PMID: 40053081. DOI: 10.1007/s00359-025-01734-9.
Abstract
Path integration is a key navigation mechanism used by many animals, involving the integration of direction and distance of path segments to form a goal vector that allows an animal to return directly to its starting point. While well established for animals walking on solid ground, evidence for path integration in animals moving without ground contact, such as flying insects, is less clear. The review focuses on flying Hymenoptera, particularly bees, which are extensively studied. Although bees can use flight distance and direction information, evidence for genuine path integration is limited. Accurately assessing distance travelled is a major challenge for flying animals, because it relies on optic flow, the movement of visual patterns across the eye caused by locomotion. Optic flow depends on both the animal's speed and the spatial layout of the environment, making it ambiguous for precise distance measurement. While path integration is crucial for animals like desert ants navigating sparse environments with few navigational cues, we argue that flying Hymenopterans in visually complex environments, rich in objects and textures, rely on additional navigational cues rather than precise path integration. As they become more familiar with an environment, they may iteratively refine unreliable distance estimates derived from optic flow. By combining this refined information with directional cues, they could determine a goal vector and improve their ability to navigate efficiently between key locations. In the case of honeybees, this ability also enables them to communicate these refined goal vectors to other bees through the waggle dance.
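
To make the path-integration bookkeeping discussed above concrete, the Python sketch below accumulates a goal vector from per-segment headings and optic-flow odometry, with an unknown flow gain standing in for the distance ambiguity the authors highlight. It is a minimal illustration under assumed conventions; the function name, segment format, and the `flow_gain` constant are not from the paper.

```python
import math

def integrate_path(segments, flow_gain=1.0):
    """Accumulate a home vector from flight segments.

    Each segment is (heading_rad, summed_optic_flow). The distance estimate is
    the integrated translational optic flow times an unknown gain that depends
    on the distance to surrounding structures, which is the ambiguity discussed
    in the review. Returns the vector pointing from the end point back to the
    start.
    """
    x = y = 0.0
    for heading, flow in segments:
        distance = flow_gain * flow        # flow gives distance only up to a scale
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
    return -x, -y                          # home vector = minus the net displacement

# Example outbound path: two legs with equal optic-flow odometry readings.
print(integrate_path([(0.0, 10.0), (math.pi / 2, 10.0)]))  # approximately (-10.0, -10.0)
```
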
Affiliation(s)
- Martin Egelhaaf: Neurobiology, Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany
- Jens P Lindemann: Neurobiology, Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany

2. Kagioulis E, Knight J, Graham P, Nowotny T, Philippides A. Adaptive Route Memory Sequences for Insect-Inspired Visual Route Navigation. Biomimetics (Basel) 2024; 9:731. PMID: 39727735. DOI: 10.3390/biomimetics9120731.
Abstract
Visual navigation is a key capability for robots and animals. Inspired by the navigational prowess of social insects, a family of insect-inspired route navigation algorithms, known as familiarity-based algorithms, has been developed; these use stored panoramic images collected during a training route to subsequently derive directional information during route recapitulation. However, unlike the ants that inspire them, these algorithms ignore the sequence in which the training images are acquired, so that all temporal information and correlation is lost. In this paper, the benefits of incorporating sequence information in familiarity-based algorithms are tested. To do this, instead of comparing a test view to all the training route images, a window of memories is used to restrict the number of comparisons that need to be made. As ants are able to visually navigate when odometric information is removed, the window position is updated via visual matching information only, not odometry. The performance of an algorithm without sequence information is compared to the performance of window methods with different fixed lengths as well as a method that adapts the window size dynamically. All algorithms were benchmarked on a simulation of an environment used for ant navigation experiments; the results showed that sequence information can boost performance and reduce computation. A detailed analysis of successes and failures highlights the interaction between the length of the route memory sequence and environment type and shows the benefits of an adaptive method.
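
The window-of-memories idea described in this abstract can be sketched in a few lines of Python. The familiarity measure here is a plain sum of absolute pixel differences, and all names and window sizes are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def best_match_in_window(view, route_views, centre, half_width):
    """Compare the current view only against a window of stored route views.

    Lower difference = more familiar. The returned index becomes the new window
    centre, so progress along the route is tracked from visual matches alone,
    without odometry.
    """
    lo = max(0, centre - half_width)
    hi = min(len(route_views), centre + half_width + 1)
    diffs = [np.abs(view - rv).sum() for rv in route_views[lo:hi]]
    best = lo + int(np.argmin(diffs))
    return best, diffs[best - lo]

# Toy usage: 50 stored 8x32 "panoramas"; the current view resembles stored view 21.
rng = np.random.default_rng(0)
route = [rng.random((8, 32)) for _ in range(50)]
current = route[21] + 0.05 * rng.random((8, 32))
centre, score = best_match_in_window(current, route, centre=19, half_width=5)
print(centre)   # 21: the window centre is pulled along the route
```
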
Affiliation(s)
- Efstathios Kagioulis: Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton BN1 9QJ, UK
- James Knight: Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton BN1 9QJ, UK
- Paul Graham: Sussex Neuroscience, School of Life Sciences, University of Sussex, Brighton BN1 9QG, UK
- Thomas Nowotny: Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton BN1 9QJ, UK
- Andrew Philippides: Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton BN1 9QJ, UK

3. Moura PA, Cardoso MZ, Montgomery SH. Heliconius butterflies use wide-field landscape features, but not individual local landmarks, during spatial learning. R Soc Open Sci 2024; 11:241097. PMID: 39507999. DOI: 10.1098/rsos.241097.
Abstract
Spatial learning is vital in foraging ecology. Many hymenopteran insects are adept spatial foragers that rely on visual cues contained within broader wide-field scenes for central place foraging from a central nest. By contrast, for butterflies, which lack central nest sites, visual cue use during spatial foraging is less understood. Heliconius butterflies, however, exhibit stable nocturnal roosts, strong site fidelity and a sophisticated capacity for spatial navigation. This study furthers our understanding of Heliconius spatial learning by testing whether H. erato can associate a spatially informative visual cue with artificial feeders. We explored the relative importance of a visual local landmark compared with broader, wide-field visual cues, through experiments with (i) a fixed rewarded feeder with a local landmark; (ii) a mobile rewarded feeder with the landmark as the sole reliable cue; (iii) the same setup while blocking visual access to external landscape features. Our data suggest that Heliconius butterflies learn static feeder locations without relying on a local individual landmark. Instead, we suggest they integrate broader landscape and celestial cues. This suggests that Heliconius butterflies and central place foraging hymenopterans likely share similar visual navigation strategies, using wide-field, low-resolution views rather than focusing on specific individual landmarks.
Affiliation(s)
- P. A. Moura: Departamento de Ecologia, Universidade Federal do Rio Grande do Norte, Natal, Brazil
- M. Z. Cardoso: Departamento de Ecologia, Universidade Federal do Rio Grande do Norte, Natal, Brazil; Departamento de Ecologia, Instituto de Biologia, Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brazil
- S. H. Montgomery: School of Biological Sciences, University of Bristol, Bristol, UK

4. Lochner S, Honerkamp D, Valada A, Straw AD. Reinforcement learning as a robotics-inspired framework for insect navigation: from spatial representations to neural implementation. Front Comput Neurosci 2024; 18:1460006. PMID: 39314666. DOI: 10.3389/fncom.2024.1460006.
Abstract
Bees are among the master navigators of the insect world. Despite impressive advances in robot navigation research, the performance of these insects is still unrivaled by any artificial system in terms of training efficiency and generalization capabilities, particularly considering the limited computational capacity. On the other hand, computational principles underlying these extraordinary feats are still only partially understood. The theoretical framework of reinforcement learning (RL) provides an ideal focal point to bring the two fields together for mutual benefit. In particular, we analyze and compare representations of space in robot and insect navigation models through the lens of RL, as the efficiency of insect navigation is likely rooted in an efficient and robust internal representation, linking retinotopic (egocentric) visual input with the geometry of the environment. While RL has long been at the core of robot navigation research, current computational theories of insect navigation are not commonly formulated within this framework, but largely as an associative learning process implemented in the insect brain, especially in the mushroom body (MB). Here we propose specific hypothetical components of the MB circuit that would enable the implementation of a certain class of relatively simple RL algorithms, capable of integrating distinct components of a navigation task, reminiscent of hierarchical RL models used in robot navigation. We discuss how current models of insect and robot navigation are exploring representations beyond classical, complete map-like representations, with spatial information being embedded in the respective latent representations to varying degrees.
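
As a concrete example of the "relatively simple RL algorithms" the abstract refers to, the sketch below runs generic tabular Q-learning on a toy corridor task. It only illustrates the algorithm class, not the proposed mushroom body circuit; the environment, hyperparameters, and names are assumptions.

```python
import random

def q_learning(env_step, n_states, n_actions, episodes=300, max_steps=50,
               alpha=0.1, gamma=0.9, epsilon=0.2):
    """Generic tabular Q-learning, one member of the class of simple RL
    algorithms the review relates to mushroom-body associative learning.
    env_step(state, action) must return (next_state, reward, done)."""
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        for _ in range(max_steps):
            if random.random() < epsilon or len(set(Q[s])) == 1:
                a = random.randrange(n_actions)          # explore / break ties
            else:
                a = max(range(n_actions), key=lambda i: Q[s][i])
            s2, r, done = env_step(s, a)
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
            if done:
                break
    return Q

# Toy corridor: five states, action 1 steps toward the rewarded goal at state 4.
def corridor(s, a):
    s2 = min(4, s + 1) if a == 1 else max(0, s - 1)
    return s2, (1.0 if s2 == 4 else 0.0), s2 == 4

Q = q_learning(corridor, n_states=5, n_actions=2)
print([round(max(q), 2) for q in Q[:4]])   # learned values rise toward the goal state
```
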
Affiliation(s)
- Stephan Lochner: Institute of Biology I, University of Freiburg, Freiburg, Germany
- Daniel Honerkamp: Department of Computer Science, University of Freiburg, Freiburg, Germany
- Abhinav Valada: Department of Computer Science, University of Freiburg, Freiburg, Germany
- Andrew D. Straw: Institute of Biology I, University of Freiburg, Freiburg, Germany; Bernstein Center Freiburg, University of Freiburg, Freiburg, Germany

5. Patel RN, Roberts NS, Kempenaers J, Zadel A, Heinze S. Parallel vector memories in the brain of a bee as foundation for flexible navigation. Proc Natl Acad Sci U S A 2024; 121:e2402509121. PMID: 39008670. DOI: 10.1073/pnas.2402509121.
Abstract
Insects rely on path integration (vector-based navigation) and landmark guidance to perform sophisticated navigational feats, rivaling those seen in mammals. Bees in particular exhibit complex navigation behaviors including creating optimal routes and novel shortcuts between locations, an ability historically indicative of the presence of a cognitive map. A mammalian cognitive map has been widely accepted. However, in insects, the existence of a centralized cognitive map is highly contentious. Using a controlled laboratory assay that condenses foraging behaviors to short distances in walking bumblebees, we reveal that vectors learned during path integration can be transferred to long-term memory, that multiple such vectors can be stored in parallel, and that these vectors can be recalled at a familiar location and used for homeward navigation. These findings demonstrate that bees meet the two fundamental requirements of a vector-based analog of a decentralized cognitive map: Home vectors need to be stored in long-term memory and need to be recalled from remembered locations. Thus, our data demonstrate that bees possess the foundational elements for a vector-based map. By utilizing this relatively simple strategy for spatial organization, insects may achieve high-level navigation behaviors seen in vertebrates with the limited number of neurons in their brains, circumventing the computational requirements associated with the cognitive maps of mammals.
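
A minimal data-structure reading of the result, storing several path-integration vectors in long-term memory keyed by the familiar location at which each was acquired and recalling one for homeward travel, might look like the sketch below. Class and label names are illustrative, not taken from the paper.

```python
import math

class VectorMemoryStore:
    """Several home vectors held in parallel, each tied to a familiar place."""

    def __init__(self):
        self.memories = {}                      # location label -> home vector

    def store(self, location, home_vector):
        self.memories[location] = home_vector   # vectors persist side by side

    def recall(self, recognized_location):
        return self.memories.get(recognized_location)

store = VectorMemoryStore()
store.store("feeder_A", (-3.0, -4.0))
store.store("feeder_B", (2.0, -6.0))
vec = store.recall("feeder_A")                  # recall at a remembered location
print(vec, round(math.degrees(math.atan2(vec[1], vec[0])), 1))   # steer along this bearing
```
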
Affiliation(s)
- Rickesh N. Patel: Lund Vision Group, Department of Biology, Lund University, 22362 Lund, Sweden
- Natalie S. Roberts: Lund Vision Group, Department of Biology, Lund University, 22362 Lund, Sweden
- Julian Kempenaers: Lund Vision Group, Department of Biology, Lund University, 22362 Lund, Sweden
- Ana Zadel: Lund Vision Group, Department of Biology, Lund University, 22362 Lund, Sweden
- Stanley Heinze: Lund Vision Group, Department of Biology, Lund University, 22362 Lund, Sweden; NanoLund, Centre for Nanoscience, Lund University, 22362 Lund, Sweden

6. van Dijk T, De Wagter C, de Croon GCHE. Visual route following for tiny autonomous robots. Sci Robot 2024; 9:eadk0310. PMID: 39018372. DOI: 10.1126/scirobotics.adk0310.
Abstract
Navigation is an essential capability for autonomous robots. In particular, visual navigation has been a major research topic in robotics because cameras are lightweight, power-efficient sensors that provide rich information on the environment. However, the main challenge of visual navigation is that it requires substantial computational power and memory for visual processing and storage of the results. As of yet, this has precluded its use on small, extremely resource-constrained robots such as lightweight drones. Inspired by the parsimony of natural intelligence, we propose an insect-inspired approach toward visual navigation that is specifically aimed at extremely resource-restricted robots. It is a route-following approach in which a robot's outbound trajectory is stored as a collection of highly compressed panoramic images together with their spatial relationships as measured with odometry. During the inbound journey, the robot uses a combination of odometry and visual homing to return to the stored locations, with visual homing preventing the buildup of odometric drift. A main advancement of the proposed strategy is that the number of stored compressed images is minimized by spacing them apart as far as the accuracy of odometry allows. To demonstrate the suitability for small systems, we implemented the strategy on a tiny 56-gram drone. The drone could successfully follow routes up to 100 meters with a trajectory representation that consumed less than 20 bytes per meter. The presented method forms a substantial step toward the autonomous visual navigation of tiny robots, facilitating their more widespread application.
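
The abstract's central idea, storing snapshots only as far apart as odometric accuracy allows, can be sketched as follows. The drift model and the constants are illustrative assumptions; the published system also compresses the panoramic images and combines odometry with visual homing in ways not shown here.

```python
def record_route(poses, drift_per_metre=0.05, catchment_radius=1.0):
    """Space stored snapshots as far apart as odometry accuracy allows.

    A new (highly compressed) snapshot is stored only when the odometric
    position uncertainty accumulated since the previous snapshot approaches
    the radius within which visual homing can still correct it.
    """
    keyframes = [poses[0]]
    travelled = 0.0
    for prev, cur in zip(poses, poses[1:]):
        travelled += ((cur[0] - prev[0]) ** 2 + (cur[1] - prev[1]) ** 2) ** 0.5
        if travelled * drift_per_metre >= catchment_radius:
            keyframes.append(cur)       # store snapshot + relative odometry here
            travelled = 0.0
    return keyframes

# Straight 100 m outbound path sampled every metre -> one keyframe every ~20 m.
route = [(float(i), 0.0) for i in range(101)]
print(record_route(route))
```
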
Affiliation(s)
- Tom van Dijk: Control and Operations Department, Faculty of Aerospace Engineering, Delft University of Technology, Delft, Netherlands
- Christophe De Wagter: Control and Operations Department, Faculty of Aerospace Engineering, Delft University of Technology, Delft, Netherlands
- Guido C H E de Croon: Control and Operations Department, Faculty of Aerospace Engineering, Delft University of Technology, Delft, Netherlands

7. Beetz MJ. A perspective on neuroethology: what the past teaches us about the future of neuroethology. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2024; 210:325-346. PMID: 38411712. DOI: 10.1007/s00359-024-01695-5.
Abstract
For 100 years, the Journal of Comparative Physiology-A has significantly supported research in the field of neuroethology. The celebration of the journal's centennial is a great time point to appreciate the recent progress in neuroethology and to discuss possible avenues of the field. Animal behavior is the main source of inspiration for neuroethologists. This is illustrated by the huge diversity of investigated behaviors and species. To explain behavior at a mechanistic level, neuroethologists combine neuroscientific approaches with sophisticated behavioral analysis. The rapid technological progress in neuroscience makes neuroethology a highly dynamic and exciting field of research. To summarize the recent scientific progress in neuroethology, I went through all abstracts of the last six International Congresses for Neuroethology (ICNs 2010-2022) and categorized them based on the sensory modalities, experimental model species, and research topics. This highlights the diversity of neuroethology and gives us a perspective on the field's scientific future. At the end, I highlight three research topics that may, among others, influence the future of neuroethology. I hope that sharing my roots may inspire other scientists to follow neuroethological approaches.
Affiliation(s)
- M Jerome Beetz: Zoology II, Biocenter, University of Würzburg, 97074 Würzburg, Germany

8. Freas CA, Spetch ML. Directed retreat and navigational mechanisms in trail following Formica obscuripes. Learn Behav 2024; 52:114-131. PMID: 37752304. DOI: 10.3758/s13420-023-00604-1.
Abstract
Ant species exhibit behavioural commonalities when solving navigational challenges for successful orientation and to reach goal locations. These behaviours rely on a shared toolbox of navigational strategies that guide individuals under an array of motivational contexts. The mechanisms that support these behaviours, however, are tuned to each species' habitat and ecology with some exhibiting unique navigational behaviours. This leads to clear differences in how ant navigators rely on this shared toolbox to reach goals. Species with hybrid foraging structures, which navigate partially upon a pheromone-marked column, express distinct differences in their toolbox, compared to solitary foragers. Here, we explore the navigational abilities of the Western Thatching ant (Formica obscuripes), a hybrid foraging species whose navigational mechanisms have not been studied. We characterise their reliance on both the visual panorama and a path integrator for orientation, with the pheromone's presence acting as a non-directional reassurance cue, promoting continued orientation based on other strategies. This species also displays backtracking behaviour, which occurs with a combination of unfamiliar terrestrial cues and the absence of the pheromone, thus operating based upon a combination of the individual mechanisms observed in solitarily and socially foraging species. We also characterise a new form of goalless orientation in these ants, an initial retreating behaviour that is modulated by the forager's path integration system. The behaviour directs disturbed inbound foragers back along their outbound path for a short distance before recovering and reorienting back to the nest.
Affiliation(s)
- Cody A Freas: Department of Psychology, University of Alberta, Edmonton, Alberta, Canada; School of Natural Sciences, Macquarie University, Sydney, NSW 2113, Australia
- Marcia L Spetch: Department of Psychology, University of Alberta, Edmonton, Alberta, Canada

9. Freas CA, Spetch ML. Route retracing: way pointing and multiple vector memories in trail-following ants. J Exp Biol 2024; 227:jeb246695. PMID: 38126715. DOI: 10.1242/jeb.246695.
Abstract
Maintaining positional estimates of goal locations is a fundamental task for navigating animals. Diverse animal groups, including both vertebrates and invertebrates, can accomplish this through path integration. During path integration, navigators integrate movement changes, tracking both distance and direction, to generate a spatial estimate of their start location, or global vector, allowing efficient direct return travel without retracing the outbound route. In ants, path integration is accomplished through the coupling of pedometer and celestial compass estimates. Within path integration, it has been theorized navigators may use multiple vector memories for way pointing. However, in many instances, these navigators may instead be homing via view alignment. Here, we present evidence that trail-following ants can attend to segments of their global vector to retrace their non-straight pheromone trails, without the confound of familiar views. Veromessor pergandei foragers navigate to directionally distinct intermediate sites via path integration by orienting along separate legs of their inbound route at unfamiliar locations, indicating these changes are not triggered by familiar external cues, but by vector state. These findings contrast with path integration as a singular memory estimate in ants and underscore the system's ability to way point to intermediate goals along the inbound route via multiple vector memories, akin to trapline foraging in bees visiting multiple flower patches. We discuss how reliance on non-straight pheromone-marked trails may support attending to separate vectors to remain on the pheromone rather than attempting straight-line shortcuts back to the nest.
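
A schematic of way-pointing with multiple vector memories, as opposed to collapsing the route into one straight global vector, is sketched below. The leg representation, step size, and function names are assumptions for illustration only, not the authors' model.

```python
import math

def retrace_route(legs, step=0.1):
    """Follow a multi-leg inbound route as a list of separate vector memories.

    Each leg is (heading_rad, length); the navigator works through the legs in
    order, switching to the next one once the current leg is run down, instead
    of heading straight home along a single summed global vector.
    """
    x = y = 0.0
    path = [(x, y)]
    for heading, length in legs:
        remaining = length
        while remaining > 1e-9:
            d = min(step, remaining)
            x += d * math.cos(heading)
            y += d * math.sin(heading)
            remaining -= d
        path.append((round(x, 2), round(y, 2)))   # waypoint reached, next vector
    return path

# Two inbound legs: 5 m east, then 3 m north; the corner and endpoint are waypoints.
print(retrace_route([(0.0, 5.0), (math.pi / 2, 3.0)]))
```
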
Affiliation(s)
- Cody A. Freas: Department of Psychology, University of Alberta, Edmonton, AB T6G 2E9, Canada; School of Natural Sciences, Macquarie University, Sydney, NSW 2109, Australia
- Marcia L. Spetch: Department of Psychology, University of Alberta, Edmonton, AB T6G 2E9, Canada

10. Zhu L, Mangan M, Webb B. Neuromorphic sequence learning with an event camera on routes through vegetation. Sci Robot 2023; 8:eadg3679. PMID: 37756384. DOI: 10.1126/scirobotics.adg3679.
Abstract
For many robotics applications, it is desirable to have relatively low-power and efficient onboard solutions. We took inspiration from insects, such as ants, that are capable of learning and following routes in complex natural environments using relatively constrained sensory and neural systems. Such capabilities are particularly relevant to applications such as agricultural robotics, where visual navigation through dense vegetation remains a challenging task. In this scenario, a route is likely to have high self-similarity and be subject to changing lighting conditions and motion over uneven terrain, and the effects of wind on leaves increase the variability of the input. We used a bioinspired event camera on a terrestrial robot to collect visual sequences along routes in natural outdoor environments and applied a neural algorithm for spatiotemporal memory that is closely based on a known neural circuit in the insect brain. We show that this method is plausible to support route recognition for visual navigation and more robust than SeqSLAM when evaluated on repeated runs on the same route or routes with small lateral offsets. By encoding memory in a spiking neural network running on a neuromorphic computer, our model can evaluate visual familiarity in real time from event camera footage.
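
For readers unfamiliar with sequence-based place recognition, the sketch below scores stored positions by comparing a short aligned sequence of frames rather than a single frame, in the spirit of the SeqSLAM baseline mentioned in the abstract. The paper's own model is a spiking network on neuromorphic hardware; this dense NumPy version only illustrates the sequence idea and uses invented data.

```python
import numpy as np

def sequence_match(query_seq, stored_frames):
    """Score each stored position by the summed difference between a short
    sequence of query frames and the correspondingly aligned stored frames,
    then return the best-fitting position."""
    k = len(query_seq)
    scores = []
    for start in range(len(stored_frames) - k + 1):
        diff = sum(np.abs(q - s).mean()
                   for q, s in zip(query_seq, stored_frames[start:start + k]))
        scores.append(diff)
    return int(np.argmin(scores))

rng = np.random.default_rng(1)
stored = [rng.random((16, 16)) for _ in range(100)]
query = [f + 0.1 * rng.random((16, 16)) for f in stored[40:45]]   # noisy revisit
print(sequence_match(query, stored))   # 40: the sequence locks onto the right place
```
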
Affiliation(s)
- Le Zhu: School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, UK
- Michael Mangan: Sheffield Robotics, Department of Computer Science, University of Sheffield, Sheffield S1 4DP, UK
- Barbara Webb: School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, UK

11. Shigaki S, Ando N, Sakurai T, Kurabayashi D. Analysis of Odor-Tracking Performance of Silk Moth Using a Sensory-Motor Intervention System. Integr Comp Biol 2023; 63:343-355. PMID: 37280186. DOI: 10.1093/icb/icad055.
Abstract
Animals can behave adaptively under different environmental conditions by converting information obtained from their sensory organs into actions. This sensory-motor integration enables the accomplishment of various tasks and is essential for animal survival. It also plays an important role in locating females, which relies on sex pheromones carried through the air. In this study, we focused on the localization behavior of the adult male silk moth, Bombyx mori. We investigated how sensory-motor integration tolerates time delays, using odor plume tracking performance as an index while imposing defined delays on the sensory and motor responses. Given that it is difficult to intervene directly in the sensory and motor functions of the silk moth, we constructed an intervention system based on a mobile behavior measurement system controlled by the moth. With this system, we could manipulate not only the timing between odor detection in the environment and odor presentation to the silk moth, but also the timing with which the silk moth's own movements were reflected by the system. We analyzed the extent to which the localization strategy of the silk moth could tolerate sensory delays by imposing a delay on the odor presentation, and we evaluated behavioral compensation by odor sensory feedback by imposing a delay on the motor output. The localization experiments showed that the success rate did not decrease when there was a motor delay; with a sensory delay, however, the success rate decreased depending on the length of the delay. Analysis of the change in behavior after detection of the odor stimulus showed that movement was more linear under a motor delay but was accompanied by large rotational movements under a sensory delay. These results suggest that delays in motor function are compensated for by feedback control based on odor sensation, whereas sensory delays are not. To compensate for sensory delays, the silk moth may acquire appropriate information from the environment by making large body movements.
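
What "imposing delays on the sensory and motor responses" means can be illustrated with a generic closed-loop simulation in which FIFO buffers delay the sensory and motor pathways independently. This is a schematic of delayed sensorimotor integration, not the authors' intervention system; the stimulus, controller, and gains are arbitrary assumptions.

```python
from collections import deque

def run_loop(stimulus, controller, sensory_delay=0, motor_delay=0, steps=20):
    """Closed loop with FIFO delay lines on the sensory and motor pathways.

    stimulus(t, state) is what the world provides, controller(obs) maps a
    (possibly delayed) observation to a command, and the command itself only
    takes effect after motor_delay steps.
    """
    sens_buf = deque([0.0] * sensory_delay)
    mot_buf = deque([0.0] * motor_delay)
    state, trace = 0.0, []
    for t in range(steps):
        sens_buf.append(stimulus(t, state))
        obs = sens_buf.popleft()            # delayed observation
        mot_buf.append(controller(obs))
        cmd = mot_buf.popleft()             # delayed command
        state += cmd
        trace.append(round(state, 2))
    return trace

# A proportional controller tracking a unit set point: with a 3-step sensory
# delay the state overshoots and oscillates before settling near 1.
print(run_loop(lambda t, s: 1.0 - s, lambda err: 0.3 * err, sensory_delay=3))
```
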
Affiliation(s)
- Shunsuke Shigaki: Principles of Informatics Research Division, National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda, Tokyo 101-8430, Japan
- Noriyasu Ando: Department of Systems Life Engineering, Maebashi Institute of Technology, 460-1 Kamisadori-cho, Maebashi, Gunma 371-0816, Japan
- Takeshi Sakurai: Department of Agricultural Innovation for Sustainable Society, Tokyo University of Agriculture, 1737 Funako, Atsugi, Kanagawa 243-0034, Japan
- Daisuke Kurabayashi: Department of Systems and Control Engineering, Tokyo Institute of Technology, 2-12-1 Ookayama, Meguro, Tokyo 152-8552, Japan

12. Argote K, Albert CH, Geslin B, Biryol C, Santonja M. Effects of litter quality on foraging behaviour and demographic parameters in Folsomia candida (Collembola). Ecol Evol 2023; 13:e10420. PMID: 37600492. DOI: 10.1002/ece3.10420.
Abstract
Litter quality has long been associated with demographic parameters of Collembola populations. However, little is known about the capacity of Collembola to perceive and seek better litter quality. To address this gap, three complementary laboratory experiments were carried out with the Collembola Folsomia candida. First, populations were fed on three different types of leaf litter (Quercus pubescens, Acer opalus and Prunus avium) and a control (agar-agar-brewer's yeast mixture) for 6 weeks to assess their impacts on demography (reproduction rate and population size). Second, the body length of individuals differentially fed with the same four types of resources was measured to assess a functional trait that can potentially affect movement parameters such as prospected area or foraging speed. Third, single F. candida individuals were exposed to the same litter quality gradient and placed at an increasing distance from the litter (from 1 to 5 cm). Their foraging behaviour was recorded for 10 min, including prospected area, foraging speed, perception distance and success in reaching the litter (foraging success). As expected, low-quality litter (i.e. Q. pubescens) contributed to low population growth compared to the control treatment and the high-quality litters (P. avium and A. opalus). In the third experiment, the probability of finding the resource was negatively correlated with distance but was unrelated to litter quality and Collembola body length. When the resource was perceived, F. candida was able to switch from non-directional to directional movements, with large variability in the perception distance, from a few millimetres to several centimetres. Taken together, our results indicate that litter quality plays a relevant role in Collembola demographic parameters once a population settles on a litter patch, but not in the foraging behaviour used to select high-quality resources.
Affiliation(s)
- Karolina Argote: Aix Marseille Université, CNRS, Université Avignon, IRD, IMBE, Marseille, France
- Cecile H. Albert: Aix Marseille Université, CNRS, Université Avignon, IRD, IMBE, Marseille, France
- Benoît Geslin: Aix Marseille Université, CNRS, Université Avignon, IRD, IMBE, Marseille, France
- Charlotte Biryol: Aix Marseille Université, CNRS, Université Avignon, IRD, IMBE, Marseille, France
- Mathieu Santonja: Aix Marseille Université, CNRS, Université Avignon, IRD, IMBE, Marseille, France

13. Gilad T, Bahar O, Hasan M, Bar A, Subach A, Scharf I. The combined role of visual and olfactory cues in foraging by Cataglyphis ants in laboratory mazes. Curr Zool 2023; 69:401-408. PMID: 37614920. DOI: 10.1093/cz/zoac058.
Abstract
Foragers use several senses to locate food, and many animals rely on vision and smell. It is beneficial not to rely on a single sense, which might fail under certain conditions. We examined the contribution of vision and smell to foraging and maze exploration under laboratory conditions using Cataglyphis desert ants as a model. Foraging intensity, measured as the number of workers entering the maze and arriving at the target, as well as target arrival time, was greater when food, blue light, or both were presented than in a control. Workers trained to forage for a combined food and light cue elevated their foraging intensity with experience. However, foraging intensity was not higher when using both cues simultaneously than in either one of the two alone. Following training, we decoupled the two cues by moving either the food or the blue light to the opposite maze corner. This manipulation impaired foraging success, either leading to fewer workers arriving at the target cell (when the light stayed and the food was moved) or to more workers arriving at the opposite target cell, empty of food (when the food stayed and the light was moved). This result indicates that ant workers use both senses when foraging for food and readily associate light with food.
Affiliation(s)
- Tomer Gilad: School of Zoology, George S Wise Faculty of Life Sciences, Tel Aviv University, 69978 Tel Aviv, Israel
- Ori Bahar: School of Zoology, George S Wise Faculty of Life Sciences, Tel Aviv University, 69978 Tel Aviv, Israel
- Malak Hasan: School of Zoology, George S Wise Faculty of Life Sciences, Tel Aviv University, 69978 Tel Aviv, Israel
- Adi Bar: School of Zoology, George S Wise Faculty of Life Sciences, Tel Aviv University, 69978 Tel Aviv, Israel
- Aziz Subach: School of Zoology, George S Wise Faculty of Life Sciences, Tel Aviv University, 69978 Tel Aviv, Israel
- Inon Scharf: School of Zoology, George S Wise Faculty of Life Sciences, Tel Aviv University, 69978 Tel Aviv, Israel

14. Menzel R. Navigation and dance communication in honeybees: a cognitive perspective. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023; 209:515-527. PMID: 36799987. DOI: 10.1007/s00359-023-01619-9.
Abstract
Flying insects like the honeybee experience the world as a metric layout embedded in a compass, the time-compensated sun compass. The focus of the review lies on the properties of the landscape memory as accessible by data from radar tracking and analyses of waggle dance following. The memory formed during exploration and foraging is thought to be composed of multiple elements, the aerial pictures that associate the multitude of sensory inputs with compass directions. Arguments are presented that support retrieval and use of landscape memory not only during navigation but also during waggle dance communication. I argue that bees expect landscape features that they have learned and that are retrieved during dance communication. An intuitive model of the bee's navigation memory is presented that assumes the picture memories form a network of geographically defined locations, nodes. The intrinsic components of the nodes, particularly their generalization process leads to binding structures, the edges. In my view, the cognitive faculties of landscape memory uncovered by these experiments are best captured by the term cognitive map.
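
Read as a data structure, the "network of geographically defined locations" sketched in the abstract could resemble the toy graph below, in which picture-memory nodes are bound by edges whenever their view descriptors generalize to one another. This is purely an illustrative reading of the verbal model; the descriptors, the similarity measure, and the threshold are invented for the example.

```python
class NavigationMemory:
    """Toy graph of picture memories: nodes are geographically anchored views,
    edges are bindings between nodes whose views generalize to each other."""

    def __init__(self):
        self.nodes = {}                  # label -> view descriptor (a vector)
        self.edges = set()               # frozenset({a, b}) bindings

    def add_node(self, label, descriptor):
        self.nodes[label] = descriptor

    def bind_similar(self, similarity, threshold=0.8):
        labels = list(self.nodes)
        for i, a in enumerate(labels):
            for b in labels[i + 1:]:
                if similarity(self.nodes[a], self.nodes[b]) >= threshold:
                    self.edges.add(frozenset((a, b)))

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / ((sum(a * a for a in u) * sum(b * b for b in v)) ** 0.5)

mem = NavigationMemory()
mem.add_node("hive", [1.0, 0.0, 0.2])
mem.add_node("lake_edge", [0.9, 0.1, 0.3])
mem.add_node("feeder", [0.0, 1.0, 0.8])
mem.bind_similar(cosine)
print(sorted(tuple(sorted(e)) for e in mem.edges))   # only hive and lake_edge bind
```
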
Affiliation(s)
- Randolf Menzel: Fachbereich Biologie, Chemie, Pharmazie, Institut für Biologie, Freie Universität Berlin, Königin-Luise-Str. 1-3, 14195 Berlin, Germany

15. Bertrand OJN, Sonntag A. The potential underlying mechanisms during learning flights. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 37204434. DOI: 10.1007/s00359-023-01637-7.
Abstract
Hymenopterans, such as bees and wasps, have long fascinated researchers with their sinuous movements at novel locations. These movements, such as loops, arcs, or zigzags, serve to help insects learn their surroundings at important locations. They also allow the insects to explore and orient themselves in their environment. After they have gained experience with their environment, the insects fly along optimized paths guided by several guidance strategies, such as path integration, local homing, and route-following, forming a navigational toolkit. Whereas experienced insects combine these strategies efficiently, naive insects need to learn about their surroundings and tune the navigational toolkit. We will see that the structure of the movements performed during the learning flights leverages the robustness of certain strategies within a given scale to tune other strategies which are more efficient at a larger scale. Thus, an insect can explore its environment incrementally without risking losing its way back to essential locations.
Affiliation(s)
- Olivier J N Bertrand: Neurobiology, Bielefeld University, Universitätstr. 25, 33615 Bielefeld, NRW, Germany
- Annkathrin Sonntag: Neurobiology, Bielefeld University, Universitätstr. 25, 33615 Bielefeld, NRW, Germany

16. Egelhaaf M. Optic flow based spatial vision in insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 36609568. DOI: 10.1007/s00359-022-01610-w.
Abstract
The optic flow, i.e., the displacement of retinal images of objects in the environment induced by self-motion, is an important source of spatial information, especially for fast-flying insects. Spatial information over a wide range of distances, from the animal's immediate surroundings over several hundred metres to kilometres, is necessary for mediating behaviours, such as landing manoeuvres, collision avoidance in spatially complex environments, learning environmental object constellations and path integration in spatial navigation. To facilitate the processing of spatial information, the complexity of the optic flow is often reduced by active vision strategies. These result in translations and rotations being largely separated by a saccadic flight and gaze mode. Only the translational components of the optic flow contain spatial information. In the first step of optic flow processing, an array of local motion detectors provides a retinotopic spatial proximity map of the environment. This local motion information is then processed in parallel neural pathways in a task-specific manner and used to control the different components of spatial behaviour. A particular challenge here is that the distance information extracted from the optic flow does not represent the distances unambiguously, but these are scaled by the animal's speed of locomotion. Possible ways of coping with this ambiguity are discussed.
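
The speed-scaling ambiguity described above follows directly from the geometry of translational optic flow, as the short sketch below shows; the variable names and example values are only illustrative.

```python
import math

def relative_nearness(flow_magnitudes, view_angles, speed=None):
    """Nearness map from translational optic flow.

    For pure translation at speed v, a point at distance d seen at angle theta
    from the direction of motion moves across the retina at
    omega = (v / d) * sin(theta). Hence nearness 1/d = omega / (v * sin(theta));
    without knowing v, only the speed-scaled nearness omega / sin(theta) is
    available, which is the ambiguity discussed in the review.
    """
    nearness = []
    for omega, theta in zip(flow_magnitudes, view_angles):
        scaled = omega / math.sin(theta)          # = v / d, speed-scaled nearness
        nearness.append(scaled / speed if speed else scaled)
    return nearness

# Two image regions with equal flow: the one closer to the flight direction
# (smaller theta) must be nearer.
print(relative_nearness([0.5, 0.5], [math.radians(30), math.radians(90)]))  # [1.0, 0.5]
```
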
Affiliation(s)
- Martin Egelhaaf: Neurobiology and Center for Cognitive Interaction Technology (CITEC), Bielefeld University, Universitätsstraße 25, 33615 Bielefeld, Germany

17. Martin-Ordas G. Frames of reference in small-scale spatial tasks in wild bumblebees. Sci Rep 2022; 12:21683. PMID: 36522430. DOI: 10.1038/s41598-022-26282-z.
Abstract
Spatial cognitive abilities are fundamental to foraging animal species. In particular, being able to encode the location of an object in relation to another object (i.e., spatial relationships) is critical for successful foraging. Whether egocentric (i.e., viewer-dependent) or allocentric (i.e., dependent on external environment or cues) representations underlie these behaviours is still a highly debated question in vertebrates and invertebrates. Previous research shows that bees encode spatial information largely using egocentric information. However, no research has investigated this question in the context of relational similarity. To test this, a spatial matching task previously used with humans and great apes was adapted for use with wild-caught bumblebees. In a series of experiments, bees first experienced a rewarded object and then had to spontaneously (Experiment 1) find or learn (Experiments 2 and 3) to find a second one, based on the location of the first one. The results showed that bumblebees predominantly exhibited an allocentric strategy in the three experiments. These findings suggest that egocentric representations alone might not be evolutionarily ancestral and clearly indicate similarities between vertebrates and invertebrates when encoding spatial information.
Affiliation(s)
- Gema Martin-Ordas: Department of Psychology, University of Oviedo, Oviedo, Spain; Division of Psychology, University of Stirling, Stirling, UK

18. Baran B, Krzyżowski M, Rádai Z, Francikowski J, Hohol M. Geometry-based navigation in the dark: layout symmetry facilitates spatial learning in the house cricket, Acheta domesticus, in the absence of visual cues. Anim Cogn 2022; 26:755-770. PMID: 36369419. DOI: 10.1007/s10071-022-01712-7.
Abstract
The capacity to navigate by layout geometry has been widely recognized as a robust strategy of place-finding. It has been reported in various species, although most studies were performed with vision-based paradigms. In the presented study, we aimed to investigate layout symmetry-based navigation in the house cricket, Acheta domesticus, in the absence of visual cues. For this purpose, we used a non-visual paradigm modeled on the Tennessee Williams setup. We ensured that the visual cues were indeed inaccessible to insects. In the main experiment, we tested whether crickets are capable of learning to localize the centrally positioned, inconspicuous cool spot in heated arenas of various shapes (i.e., circular, square, triangular, and asymmetric quadrilateral). We found that the symmetry of the arena significantly facilitates crickets' learning to find the cool spot, indicated by the increased time spent on the cool spot and the decreased latency in locating it in subsequent trials. To investigate mechanisms utilized by crickets, we analyzed their approach paths to the spot. We found that crickets used both heuristic and directed strategies of approaching the target, with the dominance of a semi-directed strategy (i.e., a thigmotactic phase preceding direct navigation to the target). We propose that the poor performance of crickets in the asymmetrical quadrilateral arena may be explained by the difficulty of encoding its layout with cues from a single modality.

19. Flores-Valle A, Seelig JD. A place learning assay for tethered walking Drosophila. J Neurosci Methods 2022; 378:109657. PMID: 35760146. DOI: 10.1016/j.jneumeth.2022.109657.
Abstract
BACKGROUND: Drosophila shows a range of visually guided memory and learning behaviors, including place learning. Investigating the dynamics of neural circuits underlying such behaviors requires learning assays in tethered animals, compatible with in vivo imaging experiments.
NEW METHOD: Here, we introduce an assay for place learning for tethered walking flies. A cylindrical arena is rotated and translated in real time around the fly in concert with the rotational and translational walking activity measured with an air-supported ball, resulting in a mechanical virtual reality (VR).
RESULTS: Navigation together with heat-based operant conditioning allows flies to learn the location of a cool spot with respect to a visual landmark. Flies optimize the time and distance required to find the cool spot over a similar number of trials as observed in assays with freely moving flies. Additionally, a fraction of flies remembers the location of the cool spot after the conditioning heat is removed.
COMPARISON WITH EXISTING METHODS: Learning tasks have been implemented in tethered flying as well as walking flies. Mechanically translating and rotating an arena in concert with the fly's walking activity enables navigation in a three-dimensional environment.
CONCLUSION: In the developed mechanical VR, flies can learn to remember the location of a cool place within an otherwise hot environment with respect to a visual landmark. Implementing place learning in a tethered walking configuration is a precondition for investigating the underlying circuit dynamics using functional imaging.
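
The geometric core of such a mechanical VR, moving the arena by the inverse of the fly's ball-measured fictive motion so that the tethered animal experiences a stationary arena, can be sketched as below. Sign conventions, frame definitions, and names are assumptions for illustration, not the published implementation.

```python
import math

def update_arena_pose(arena_x, arena_y, arena_rot, d_forward, d_side, d_heading):
    """One closed-loop update of a mechanical VR.

    The fly's fictive motion (forward/side translation and heading change,
    measured from the treadmill ball) is applied to the arena with the opposite
    sign, so that relative to the fly the arena behaves like a fixed environment
    the fly is walking through.
    """
    heading = -arena_rot                       # fly's heading in the arena frame
    dx = d_forward * math.cos(heading) - d_side * math.sin(heading)
    dy = d_forward * math.sin(heading) + d_side * math.cos(heading)
    return arena_x - dx, arena_y - dy, arena_rot - d_heading

pose = (0.0, 0.0, 0.0)
for step in [(1.0, 0.0, 0.0), (1.0, 0.0, math.pi / 2)]:   # walk forward, then turn
    pose = update_arena_pose(*pose, *step)
print(tuple(round(v, 2) for v in pose))   # arena shifted and rotated opposite to the fly
```
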
Affiliation(s)
- Andres Flores-Valle: Max Planck Institute for Neurobiology of Behavior - caesar (MPINB), Bonn, Germany; International Max Planck Research School for Brain and Behavior, Bonn, Germany
- Johannes D Seelig: Max Planck Institute for Neurobiology of Behavior - caesar (MPINB), Bonn, Germany

20. de Croon GCHE, Dupeyroux JJG, Fuller SB, Marshall JAR. Insect-inspired AI for autonomous robots. Sci Robot 2022.
Abstract
Autonomous robots are expected to perform a wide range of sophisticated tasks in complex, unknown environments. However, available onboard computing capabilities and algorithms represent a considerable obstacle to reaching higher levels of autonomy, especially as robots get smaller and the end of Moore's law approaches. Here, we argue that inspiration from insect intelligence is a promising alternative to classic methods in robotics for the artificial intelligence (AI) needed for the autonomy of small, mobile robots. The advantage of insect intelligence stems from its resource efficiency (or parsimony) especially in terms of power and mass. First, we discuss the main aspects of insect intelligence underlying this parsimony: embodiment, sensory-motor coordination, and swarming. Then, we take stock of where insect-inspired AI stands as an alternative to other approaches to important robotic tasks such as navigation and identify open challenges on the road to its more widespread adoption. Last, we reflect on the types of processors that are suitable for implementing insect-inspired AI, from more traditional ones such as microcontrollers and field-programmable gate arrays to unconventional neuromorphic processors. We argue that even for neuromorphic processors, one should not simply apply existing AI algorithms but exploit insights from natural insect intelligence to get maximally efficient AI for robot autonomy.
Affiliation(s)
- G C H E de Croon: Micro Air Vehicle Laboratory, Faculty of Aerospace Engineering, TU Delft, Delft, Netherlands
- J J G Dupeyroux: Micro Air Vehicle Laboratory, Faculty of Aerospace Engineering, TU Delft, Delft, Netherlands
- S B Fuller: Autonomous Insect Robotics Laboratory, Department of Mechanical Engineering and Paul G. Allen School of Computer Science, University of Washington, Seattle, WA, USA
- J A R Marshall: Opteran Technologies, Sheffield, UK; Complex Systems Modeling Group, Department of Computer Science, University of Sheffield, Sheffield, UK

21. Moreyra S, Lozada M. Spatial configuration learning in Vespula germanica forager wasps. Ethology 2022. DOI: 10.1111/eth.13312.
Affiliation(s)
- Sabrina Moreyra: Laboratorio Ecotono, Instituto de Investigaciones en Biodiversidad y Medio Ambiente (INIBIOMA), CONICET - Universidad Nacional del Comahue (CRUB), Bariloche, Argentina
- Mariana Lozada: Laboratorio Ecotono, Instituto de Investigaciones en Biodiversidad y Medio Ambiente (INIBIOMA), CONICET - Universidad Nacional del Comahue (CRUB), Bariloche, Argentina

22. Gonzales D, Hempel de Ibarra N, Anderson K. Remote Sensing of Floral Resources for Pollinators – New Horizons From Satellites to Drones. Front Ecol Evol 2022. DOI: 10.3389/fevo.2022.869751.
Abstract
Insect pollinators are affected by the spatio-temporal distribution of floral resources, which are dynamic across time and space, and also influenced heavily by anthropogenic activities. There is a need for spatial data describing the time-varying spatial distribution of flowers, which can be used within behavioral and ecological studies. However, this information is challenging to obtain. Traditional field techniques for mapping flowers are often laborious and limited to relatively small areas, making it difficult to assess how floral resources are perceived by pollinators to guide their behaviors. Conversely, remote sensing of plant traits is a relatively mature technique now, and such technologies have delivered valuable data for identifying and measuring non-floral dynamics in plant systems, particularly leaves, stems and woody biomass in a wide range of ecosystems from local to global scales. However, monitoring the spatial and temporal dynamics of plant floral resources has been notably scarce in remote sensing studies. Recently, lightweight drone technology has been adopted by the ecological community, offering a capability for flexible deployment in the field, and delivery of centimetric resolution data, providing a clear opportunity for capturing fine-grained information on floral resources at key times of the flowering season. In this review, we answer three key questions of relevance to pollination science – can remote sensing deliver information on (a) how isolated are floral resources? (b) What resources are available within a flower patch? And (c) how do floral patches change over time? We explain how such information has potential to deepen ecological understanding of the distribution of floral resources that feed pollinators and the parameters that determine their navigational and foraging choices based on the sensory information they extract at different spatial scales. We provide examples of how such data can be used to generate new insights into pollinator behaviors in distinct landscape types and their resilience to environmental change.

23. Islam M, Deeti S, Murray T, Cheng K. What view information is most important in the homeward navigation of an Australian bull ant, Myrmecia midas? J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2022; 208:545-559. PMID: 36048246. DOI: 10.1007/s00359-022-01565-y.
Abstract
Many insects orient by comparing current panoramic views of their environment to memorised views. We tested the navigational abilities of night-active Myrmecia midas foragers while we blocked segments of their visual panorama. Foragers failed to orient homewards when the front view, lower elevations, entire terrestrial surround, or the full panorama was blocked. Initial scanning increased whenever the visual panorama was blocked but scanning only increased along the rest of the route when the front, back, higher, or lower elevations were blocked. Ants meandered more when the front, the back, or the higher elevations were obscured. When everything except the canopy was blocked, the ants were quick and direct, but moved in random directions, as if to escape. We conclude that a clear front view, or a clear lower panorama is necessary for initial homeward headings. Furthermore, the canopy is neither necessary nor sufficient for homeward initial heading, and the back and upper segments of views, while not necessary, do make finding home easier. Discrepancies between image analysis and ant behaviour when the upper and lower views were blocked suggest that ants are selective in what portions of the scene they attend to or learn.
Affiliation(s)
- Muzahid Islam: School of Natural Sciences, Macquarie University, Sydney, NSW 2109, Australia
- Sudhakar Deeti: School of Natural Sciences, Macquarie University, Sydney, NSW 2109, Australia
- Trevor Murray: School of Natural Sciences, Macquarie University, Sydney, NSW 2109, Australia
- Ken Cheng: School of Natural Sciences, Macquarie University, Sydney, NSW 2109, Australia

24. Goulard R, Buehlmann C, Niven JE, Graham P, Webb B. A unified mechanism for innate and learned visual landmark guidance in the insect central complex. PLoS Comput Biol 2021; 17:e1009383. PMID: 34555013. DOI: 10.1371/journal.pcbi.1009383.
Abstract
Insects can navigate efficiently in both novel and familiar environments, and this requires flexibility in how they are guided by sensory cues. A prominent landmark, for example, can elicit strong innate behaviours (attraction or menotaxis) but can also be used, after learning, as a specific directional cue as part of a navigation memory. However, the mechanisms that allow both pathways to co-exist, interact or override each other are largely unknown. Here we propose a model for the behavioural integration of innate and learned guidance based on the neuroanatomy of the central complex (CX), adapted to control landmark guided behaviours. We consider a reward signal provided either by an innate attraction to landmarks or a long-term visual memory in the mushroom bodies (MB) that modulates the formation of a local vector memory in the CX. Using an operant strategy for a simulated agent exploring a simple world containing a single visual cue, we show how the generated short-term memory can support both innate and learned steering behaviour. In addition, we show how this architecture is consistent with the observed effects of unilateral MB lesions in ants that cause a reversion to innate behaviour. We suggest the formation of a directional memory in the CX can be interpreted as transforming rewarding (positive or negative) sensory signals into a mapping of the environment that describes the geometrical attractiveness (or repulsion). We discuss how this scheme might represent an ideal way to combine multisensory information gathered during the exploration of an environment and support optimal cue integration.
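
A deliberately simplified caricature of the proposed integration, blending an innate landmark-attraction term with a learned directional memory whose weight stands in for mushroom body modulation, is given below. It compresses the circuit-level model into two scalar terms and should not be read as the published CX architecture; all gains are invented.

```python
import math

def steering_command(landmark_bearing, memory_bearing, learned_weight,
                     innate_gain=1.0, learned_gain=2.0):
    """Blend innate landmark attraction with a learned directional memory.

    Both cues are bearings relative to the current heading, each turned into a
    turn command proportional to sin(bearing). learned_weight (0..1) stands in
    for reward/familiarity modulation; setting it to 0 reverts behaviour to the
    innate term, loosely mirroring the lesion result described above.
    """
    innate = innate_gain * math.sin(landmark_bearing)
    learned = learned_gain * learned_weight * math.sin(memory_bearing)
    return innate + learned

# Landmark 30 deg to the right, learned goal 45 deg to the left.
print(round(steering_command(math.radians(30), math.radians(-45), 1.0), 2))  # turns left
print(round(steering_command(math.radians(30), math.radians(-45), 0.0), 2))  # turns right
```
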
Affiliation(s)
- Roman Goulard: Institute for Perception, Action, and Behaviour, School of Informatics, University of Edinburgh, Edinburgh, Scotland, United Kingdom
- Cornelia Buehlmann: School of Life Sciences, University of Sussex, John Maynard Smith Building, Falmer, Brighton, United Kingdom
- Jeremy E. Niven: School of Life Sciences, University of Sussex, John Maynard Smith Building, Falmer, Brighton, United Kingdom
- Paul Graham: School of Life Sciences, University of Sussex, John Maynard Smith Building, Falmer, Brighton, United Kingdom
- Barbara Webb: Institute for Perception, Action, and Behaviour, School of Informatics, University of Edinburgh, Edinburgh, Scotland, United Kingdom

25. Stankiewicz J, Webb B. Looking down: a model for visual route following in flying insects. Bioinspir Biomim 2021; 16:055007. PMID: 34243169. DOI: 10.1088/1748-3190/ac1307.
Abstract
Insect visual navigation is often assumed to depend on panoramic views of the horizon, and how these change as the animal moves. However, it is known that honey bees can visually navigate in flat, open meadows where visual information at the horizon is minimal, or would remain relatively constant across a wide range of positions. In this paper we hypothesise that these animals can navigate using view memories of the ground. We find that in natural scenes, low resolution views from an aerial perspective of ostensibly self-similar terrain (e.g. within a field of grass) provide surprisingly robust descriptors of precise spatial locations. We propose a new visual route following approach that makes use of transverse oscillations to centre a flight path along a sequence of learned views of the ground. We deploy this model on an autonomous quadcopter and demonstrate that it provides robust performance in the real world on journeys of up to 30 m. The success of our method is contingent on a robust view matching process which can evaluate the familiarity of a view with a degree of translational invariance. We show that a previously developed wavelet based bandpass orientated filter approach fits these requirements well, exhibiting double the catchment area of standard approaches. Using a realistic simulation package, we evaluate the robustness of our approach to variations in heading direction and aircraft height between inbound and outbound journeys. We also demonstrate that our approach can operate using a vision system with a biologically relevant visual acuity and viewing direction.
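
One schematic way to use transverse oscillations for route centring, in the spirit of the approach described above, is to compare view familiarity at the left and right extremes of the oscillation and nudge the centreline toward the more familiar side. The update rule, gains, and the convention that lower image difference means higher familiarity are assumptions, not the paper's controller.

```python
def centreline_update(centre, diff_left, diff_right, gain=0.2, amplitude=0.5):
    """Shift the flight centreline toward the laterally offset sample that
    looked more familiar (lower image-difference score), as a schematic of
    using transverse oscillations to stay centred on a learned ground-view
    route."""
    error = diff_left - diff_right      # positive error pushes the path rightward
    return centre + gain * error * amplitude

centre = 0.0
for d_left, d_right in [(0.8, 0.4), (0.7, 0.5), (0.6, 0.6)]:  # right side more familiar
    centre = centreline_update(centre, d_left, d_right)
print(round(centre, 3))   # 0.06: the path drifts toward the more familiar side
```
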
Collapse
Affiliation(s)
- J Stankiewicz
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
| | - B Webb
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
| |
Collapse
|
26
|
Paffhausen BH, Petrasch J, Wild B, Meurers T, Schülke T, Polster J, Fuchs I, Drexler H, Kuriatnyk O, Menzel R, Landgraf T. A Flying Platform to Investigate Neuronal Correlates of Navigation in the Honey Bee ( Apis mellifera). Front Behav Neurosci 2021; 15:690571. [PMID: 34354573 PMCID: PMC8329708 DOI: 10.3389/fnbeh.2021.690571] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/03/2021] [Accepted: 06/24/2021] [Indexed: 11/13/2022] Open
Abstract
Navigating animals combine multiple perceptual faculties, learn during exploration, retrieve multi-facetted memory contents, and exhibit goal-directedness as an expression of their current needs and motivations. Navigation in insects has been linked to a variety of underlying strategies such as path integration, view familiarity, visual beaconing, and goal-directed orientation with respect to previously learned ground structures. Most works, however, study navigation either from a field perspective, analyzing purely behavioral observations, or combine computational models with neurophysiological evidence obtained from lab experiments. The honey bee (Apis mellifera) has long been a popular model in the search for neural correlates of complex behaviors and exhibits extraordinary navigational capabilities. However, the neural basis for bee navigation has not yet been explored under natural conditions. Here, we propose a novel methodology to record from the brain of a copter-mounted honey bee. This way, the animal experiences natural multimodal sensory inputs in a natural environment that is familiar to her. We have developed a miniaturized electrophysiology recording system which is able to record spikes in the presence of time-varying electric noise from the copter's motors and rotors, and devised an experimental procedure to record from mushroom body extrinsic neurons (MBENs). We analyze the resulting electrophysiological data combined with a reconstruction of the animal's visual perception and find that the neural activity of MBENs is linked to sharp turns, possibly related to the relative motion of visual features. This method is a significant technological step toward recording brain activity of navigating honey bees under natural conditions. By providing all system specifications in an online repository, we hope to close a methodological gap and stimulate further research informing future computational models of insect navigation.
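The reported link between mushroom-body output activity and sharp turns is the sort of relationship one could probe with a generic peri-event analysis: detect turns from the yaw trace, then average spike counts around them. The sketch below is an assumed, illustrative analysis of that kind (thresholds, bin widths and the synthetic data are not from the study), not the authors' processing pipeline.

```python
import numpy as np

def sharp_turn_times(t, yaw, thresh=2.0):
    """Times at which |angular velocity| exceeds a threshold (rad/s).
    Threshold and detection method are illustrative assumptions."""
    omega = np.gradient(np.unwrap(yaw), t)
    return t[np.flatnonzero(np.abs(omega) > thresh)]

def peri_event_rate(spike_times, event_times, window=0.5, bins=20):
    """Average spike rate in a +/- window around each event (a simple PSTH)."""
    edges = np.linspace(-window, window, bins + 1)
    counts = np.zeros(bins)
    for ev in event_times:
        counts += np.histogram(spike_times - ev, bins=edges)[0]
    width = edges[1] - edges[0]
    return edges[:-1] + width / 2, counts / (len(event_times) * width)

# Toy usage with synthetic data standing in for copter yaw and MBEN spikes
rng = np.random.default_rng(2)
t = np.linspace(0, 60, 6000)
yaw = np.cumsum(rng.normal(scale=0.02, size=t.size))
spikes = np.sort(rng.uniform(0, 60, 300))
turns = sharp_turn_times(t, yaw)
centers, rate = peri_event_rate(spikes, turns)
print(f"{len(turns)} turn events, peak peri-turn rate {rate.max():.1f} Hz")
```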
Collapse
Affiliation(s)
- Benjamin H Paffhausen
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
| | - Julian Petrasch
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
| | - Benjamin Wild
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
| | - Thierry Meurers
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
| | - Tobias Schülke
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
| | - Johannes Polster
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
| | - Inga Fuchs
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
| | - Helmut Drexler
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
| | - Oleksandra Kuriatnyk
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
| | - Randolf Menzel
- Department of Biology, Chemistry and Pharmacy, Institute of Neurobiology, Free University of Berlin, Berlin, Germany
| | - Tim Landgraf
- Dahlem Center for Machine Learning and Robotics, Department of Mathematics and Computer Science, Institute of Computer Science, Free University of Berlin, Berlin, Germany
| |
Collapse
|
27
|
Islam M, Deeti S, Kamhi JF, Cheng K. Minding the gap: learning and visual scanning behaviour in nocturnal bull ants. J Exp Biol 2021; 224:270965. [PMID: 34142708 PMCID: PMC8325935 DOI: 10.1242/jeb.242245] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/14/2021] [Accepted: 06/14/2021] [Indexed: 01/17/2023]
Abstract
Insects possess small brains but exhibit sophisticated behaviour, notably the ability to learn to navigate within complex environments. To understand how they learn to navigate in a cluttered environment, we focused on learning and visual scanning behaviour in the Australian nocturnal bull ant, Myrmecia midas, an exceptional visual navigator. We tested how individual ants learn to detour via a gap and how they cope with substantial spatial changes over trips. Homing M. midas ants encountered a barrier on their foraging route and had to find a 50 cm gap between symmetrical large black screens, located 1 m from the centre of the release platform in the nest direction, in both familiar (on-route) and semi-familiar (off-route) environments. Foragers were tested for up to three learning trips with the changed conditions in both environments. The results showed that on the familiar route, individual foragers learned the gap location more quickly than when they were tested in the semi-familiar environment. When the route was less familiar, and the panorama was changed, foragers were less successful at finding the gap and performed more scans on their way home. Scene familiarity thus played a significant role in visual scanning behaviour. In both on-route and off-route environments, panoramic changes significantly affected learning, initial orientation and scanning behaviour. Nevertheless, over a few trips, success at gap finding increased, visual scans were reduced, the paths became straighter, and individuals took less time to reach the goal. Summary: Investigation of how nocturnal bull ants learn to move around obstacles in familiar and semi-familiar environments reveals that scene familiarity plays a significant role in navigation.
Collapse
Affiliation(s)
- Muzahid Islam
- Department of Biological Sciences, Macquarie University, Sydney, NSW 2109, Australia
| | - Sudhakar Deeti
- Department of Biological Sciences, Macquarie University, Sydney, NSW 2109, Australia
| | - J Frances Kamhi
- Department of Biological Sciences, Macquarie University, Sydney, NSW 2109, Australia.,Neuroscience Department, Oberlin College, Oberlin, OH 44074, USA
| | - Ken Cheng
- Department of Biological Sciences, Macquarie University, Sydney, NSW 2109, Australia
| |
Collapse
|
28
|
Parlevliet PP, Kanaev A, Hung CP, Schweiger A, Gregory FD, Benosman R, de Croon GCHE, Gutfreund Y, Lo CC, Moss CF. Autonomous Flying With Neuromorphic Sensing. Front Neurosci 2021; 15:672161. [PMID: 34054420 PMCID: PMC8160287 DOI: 10.3389/fnins.2021.672161] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2021] [Accepted: 04/07/2021] [Indexed: 11/17/2022] Open
Abstract
Autonomous flight for large aircraft appears to be within our reach. However, launching autonomous systems for everyday missions still requires an immense interdisciplinary research effort supported by pointed policies and funding. We believe that concerted endeavors in the fields of neuroscience, mathematics, sensor physics, robotics, and computer science are needed to address remaining crucial scientific challenges. In this paper, we argue for a bio-inspired approach to solve autonomous flying challenges, outline the frontier of sensing, data processing, and flight control within a neuromorphic paradigm, and chart directions of research needed to achieve operational capabilities comparable to those we observe in nature. One central problem of neuromorphic computing is learning. In biological systems, learning is achieved by adaptive and relativistic information acquisition characterized by near-continuous information retrieval with variable rates and sparsity. This results in savings of both energy and computational resources, making biological learning an inspiration for autonomous systems. We consider pertinent features of insect, bat and bird flight behavior as examples to address various vital aspects of autonomous flight. Insects exhibit sophisticated flight dynamics despite comparatively simple brains, making them excellent subjects for the study of navigation and flight control. Bats and birds enable more complex models of attention and point to the importance of active sensing for conducting more complex missions. The implementation of neuromorphic paradigms for autonomous flight will require fundamental changes in both traditional hardware and software. We provide recommendations for sensor hardware and processing algorithm development to enable energy-efficient and computationally effective flight control.
Collapse
Affiliation(s)
| | - Andrey Kanaev
- U.S. Office of Naval Research Global, London, United Kingdom
| | - Chou P. Hung
- United States Army Research Laboratory, Aberdeen Proving Ground, MD, United States
| | | | - Frederick D. Gregory
- U.S. Army Research Laboratory, London, United Kingdom
- Department of Bioengineering, Imperial College London, London, United Kingdom
| | - Ryad Benosman
- Institut de la Vision, INSERM UMRI S 968, Paris, France
- Biomedical Science Tower, University of Pittsburgh, Pittsburgh, PA, United States
- Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, United States
| | - Guido C. H. E. de Croon
- Micro Air Vehicle Laboratory, Department of Control and Operations, Faculty of Aerospace Engineering, Delft University of Technology, Delft, Netherlands
| | - Yoram Gutfreund
- The Neuroethological lab, Department of Neurobiology, The Rappaport Institute for Biomedical Research, Technion – Israel Institute of Technology, Haifa, Israel
| | - Chung-Chuan Lo
- Brain Research Center/Institute of Systems Neuroscience, National Tsing Hua University, Hsinchu, Taiwan
| | - Cynthia F. Moss
- Laboratory of Comparative Neural Systems and Behavior, Department of Psychological and Brain Sciences, Neuroscience and Mechanical Engineering, Johns Hopkins University, Baltimore, MD, United States
| |
Collapse
|
29
|
Grob R, el Jundi B, Fleischmann PN. Towards a common terminology for arthropod spatial orientation. ETHOL ECOL EVOL 2021. [DOI: 10.1080/03949370.2021.1905075] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
Affiliation(s)
- Robin Grob
- Behavioral Physiology and Sociobiology (Zoology II), Biocenter, University of Würzburg, Würzburg 97074, Germany
| | - Basil el Jundi
- Behavioral Physiology and Sociobiology (Zoology II), Biocenter, University of Würzburg, Würzburg 97074, Germany
| | - Pauline N. Fleischmann
- Behavioral Physiology and Sociobiology (Zoology II), Biocenter, University of Würzburg, Würzburg 97074, Germany
| |
Collapse
|
30
|
Sergi CM, Antonopoulos T, Rodríguez RL. Black widow spiders use path integration on their webs. Behav Ecol Sociobiol 2021. [DOI: 10.1007/s00265-021-03009-0] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
|
31
|
Doussot C, Bertrand OJN, Egelhaaf M. The Critical Role of Head Movements for Spatial Representation During Bumblebees Learning Flight. Front Behav Neurosci 2021; 14:606590. [PMID: 33542681 PMCID: PMC7852487 DOI: 10.3389/fnbeh.2020.606590] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/15/2020] [Accepted: 11/23/2020] [Indexed: 11/20/2022] Open
Abstract
Bumblebees perform complex flight maneuvers around the barely visible entrance of their nest upon their first departures. During these flights bees learn visual information about the surroundings, possibly including its spatial layout. They rely on this information to return home. Depth information can be derived from the apparent motion of the scenery on the bees' retina. This motion is shaped by the animal's flight and orientation: Bees employ a saccadic flight and gaze strategy, where rapid turns of the head (saccades) alternate with flight segments of apparently constant gaze direction (intersaccades). When during intersaccades the gaze direction is kept relatively constant, the apparent motion contains information about the distance of the animal to environmental objects, and thus, in an egocentric reference frame. Alternatively, when the gaze direction rotates around a fixed point in space, the animal perceives the depth structure relative to this pivot point, i.e., in an allocentric reference frame. If the pivot point is at the nest-hole, the information is nest-centric. Here, we investigate in which reference frames bumblebees perceive depth information during their learning flights. By precisely tracking the head orientation, we found that half of the time, the head appears to pivot actively. However, only few of the corresponding pivot points are close to the nest entrance. Our results indicate that bumblebees perceive visual information in several reference frames when they learn about the surroundings of a behaviorally relevant location.
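Whether the head pivots around a fixed point in space can be tested geometrically: each intersaccadic sample defines a gaze ray (head position plus gaze direction), and the least-squares intersection of these rays gives the candidate pivot point. The sketch below shows that standard computation in 2D as an illustration; it is not the tracking and analysis pipeline used in the study.

```python
import numpy as np

def pivot_point(positions, gaze_angles):
    """Least-squares intersection of 2D gaze rays: the point minimising the
    summed squared distance to every gaze line.
    positions: array of shape (N, 2); gaze_angles: radians."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, ang in zip(positions, gaze_angles):
        d = np.array([np.cos(ang), np.sin(ang)])
        P = np.eye(2) - np.outer(d, d)     # projector onto the line's normal space
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Toy usage: gaze directions whose lines all pass through a pivot at (1, 2)
rng = np.random.default_rng(3)
pivot = np.array([1.0, 2.0])
angles = rng.uniform(0, 2 * np.pi, 30)
pos = pivot + np.column_stack([np.cos(angles), np.sin(angles)]) * rng.uniform(0.2, 0.5, (30, 1))
print("estimated pivot:", pivot_point(pos, angles))   # ~ [1, 2]
```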
Collapse
Affiliation(s)
- Charlotte Doussot
- Department of Neurobiology, University of Bielefeld, Bielefeld, Germany
| | | | | |
Collapse
|
32
|
Kaushik PK, Olsson SB. Using virtual worlds to understand insect navigation for bio-inspired systems. CURRENT OPINION IN INSECT SCIENCE 2020; 42:97-104. [PMID: 33010476 DOI: 10.1016/j.cois.2020.09.010] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/14/2020] [Revised: 09/18/2020] [Accepted: 09/22/2020] [Indexed: 06/11/2023]
Abstract
Insects perform a wide array of intricate behaviors over large spatial and temporal scales in complex natural environments. A mechanistic understanding of insect cognition has direct implications on how brains integrate multimodal information and can inspire bio-based solutions for autonomous robots. Virtual Reality (VR) offers an opportunity assess insect neuroethology while presenting complex, yet controlled, stimuli. Here, we discuss the use of insects as inspiration for artificial systems, recent advances in different VR technologies, current knowledge gaps, and the potential for application of insect VR research to bio-inspired robots. Finally, we advocate the need to diversify our model organisms, behavioral paradigms, and embrace the complexity of the natural world. This will help us to uncover the proximate and ultimate basis of brain and behavior and extract general principles for common challenging problems.
Collapse
Affiliation(s)
- Pavan Kumar Kaushik
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, GKVK Campus, Bellary Road, Bengaluru, 560064, India.
| | - Shannon B Olsson
- National Centre for Biological Sciences, Tata Institute of Fundamental Research, GKVK Campus, Bellary Road, Bengaluru, 560064, India.
| |
Collapse
|
33
|
Pfeffer S, Wolf H. Arthropod spatial cognition. Anim Cogn 2020; 23:1041-1049. [PMID: 33170438 PMCID: PMC7700064 DOI: 10.1007/s10071-020-01446-4] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/23/2020] [Revised: 10/23/2020] [Accepted: 10/28/2020] [Indexed: 12/14/2022]
Abstract
The feats of arthropods, and of the well-studied insects and crustaceans in particular, have fascinated scientists and laymen alike for centuries. Arthropods show a diverse repertoire of cognitive feats, of often unexpected sophistication. Despite their smaller brains and resulting lower neuronal capacity, the cognitive abilities of arthropods are comparable to, or may even exceed, those of vertebrates, depending on the species compared. Miniature brains often provide parsimonious but smart solutions for complex behaviours or ecologically relevant problems. This makes arthropods inspiring subjects for basic research, bionics, and robotics. Investigations of arthropod spatial cognition have originally concentrated on the honeybee, an animal domesticated for several thousand years. Bees are easy to keep and handle, making this species amenable to experimental study. However, there are an estimated 5–10 million arthropod species worldwide, with a broad diversity of lifestyles, ecology, and cognitive abilities. This high diversity provides ample opportunity for comparative analyses. Comparative study, rather than focusing on single model species, is well suited to scrutinise the link between ecological niche, lifestyle, and cognitive competence. It also allows the discovery of general concepts that are transferable between distantly related groups of organisms. With species diversity and a comparative approach in mind, this special issue compiles four review articles and ten original research reports from a spectrum of arthropod species. These contributions range from the well-studied hymenopterans, and ants in particular, to chelicerates and crustaceans. They thus present a broad spectrum of glimpses into current research on arthropod spatial cognition, and together they cogently emphasise the merits of research into arthropod cognitive achievements.
Collapse
Affiliation(s)
- Sarah Pfeffer
- Institute of Neurobiology, Ulm University, Albert-Einstein-Allee 11, 89081, Ulm, Germany.
| | - Harald Wolf
- Institute of Neurobiology, Ulm University, Albert-Einstein-Allee 11, 89081, Ulm, Germany
| |
Collapse
|
34
|
Doussot C, Bertrand OJN, Egelhaaf M. Visually guided homing of bumblebees in ambiguous situations: A behavioural and modelling study. PLoS Comput Biol 2020; 16:e1008272. [PMID: 33048938 PMCID: PMC7553325 DOI: 10.1371/journal.pcbi.1008272] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/20/2019] [Accepted: 08/19/2020] [Indexed: 11/19/2022] Open
Abstract
Returning home is a crucial task accomplished daily by many animals, including humans. Because of their tiny brains, insects, like bees or ants, are good study models for efficient navigation strategies. Bees and ants are known to rely mainly on learned visual information about the nest surroundings to pinpoint their barely visible nest entrance. During the return, when the actual sight of the insect matches the learned information, the insect is easily guided home. Occasionally, modifications to the visual environment may take place while the insect is on a foraging trip. Here, we addressed the ecologically relevant question of how bumblebees' homing is affected by such a situation. In an artificial setting, we habituated bees to be guided to their nest by two constellations of visual cues. After habituation, these cues were displaced during foraging trips into a conflict situation. We recorded bumblebees' return flights in such circumstances and investigated where they searched for their nest entrance as a function of the degree of displacement between the two cue constellations. Bumblebees mostly searched at the fictive nest location indicated by either cue constellation, but never at a compromise location between them. We compared these experimental results to the predictions of different types of homing models. We found that models guiding an agent by a single holistic view of the nest surroundings could not account for the bumblebees' search behaviour in cue-conflict situations. Instead, homing models relying on multiple views were sufficient. We could further show that homing models required fewer views and became more robust to height changes if optic flow-based spatial information, rather than just brightness information, was encoded and learned. Returning home sounds trivial, but returning to a concealed underground location such as a burrow is less easy. For buff-tailed bumblebees, this task is routine. After collecting pollen in gardens or flowered meadows, bees must return to their underground nest to feed the queen's larvae. The nest entrance is almost invisible to a returning bee; therefore, it guides its flight by information about the surrounding visual environment. Since the seminal work of Tinbergen, many experiments have focused on how visual information guides foraging insects back home. In these experiments, returning foragers were confronted with a coherent displacement of the entire nest surroundings, leading the bees to a single new location. In nature, however, the objects constituting the visual environment may be displaced independently of one another, as they differ in how strongly they are affected by factors such as wind. In our study, we displaced objects so as to create two fictitious nest entrances. The bees searched at the fictitious nest entrances, but never in between. The distance between the fictitious nests affected the bees' search. Finally, we could predict the search locations using bio-inspired homing models that may also be of interest for implementation in autonomous robots.
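The multi-view homing models referred to above share a simple core: store several panoramic snapshots around the nest and, at test time, head in whatever direction makes the current (rotated) view most similar to any stored one. The sketch below expresses that core as a rotational image-difference search over one-dimensional panoramas; the plain pixel-difference measure stands in for either the brightness-based or the optic flow-based encodings compared in the paper, and all names and resolutions are illustrative.

```python
import numpy as np

def best_heading(current, snapshots):
    """Rotational image-difference search over stored views.
    current: 1-D panorama (one value per azimuth); snapshots: list of panoramas.
    Returns the azimuthal rotation (degrees) giving the best match overall."""
    n = current.size
    best = (np.inf, 0)
    for snap in snapshots:
        for shift in range(n):                            # try every heading
            err = np.mean((np.roll(current, shift) - snap) ** 2)
            if err < best[0]:
                best = (err, shift)
    return best[1] * 360.0 / n

# Toy usage: the current view is a stored view seen from a rotated heading
rng = np.random.default_rng(4)
snaps = [rng.random(72) for _ in range(4)]                # 5-degree-resolution panoramas
test = np.roll(snaps[2], 9) + 0.02 * rng.random(72)       # rotated by 45 degrees, plus noise
print("best rotation:", best_heading(test, snaps), "degrees")   # ~315, i.e. a -45 degree offset
```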
Collapse
Affiliation(s)
- Charlotte Doussot
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
| | | | - Martin Egelhaaf
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
| |
Collapse
|
35
|
Sun X, Yue S, Mangan M. A decentralised neural model explaining optimal integration of navigational strategies in insects. eLife 2020; 9:e54026. [PMID: 32589143 PMCID: PMC7365663 DOI: 10.7554/elife.54026] [Citation(s) in RCA: 44] [Impact Index Per Article: 8.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/28/2019] [Accepted: 06/26/2020] [Indexed: 12/12/2022] Open
Abstract
Insect navigation arises from the coordinated action of concurrent guidance systems, but the neural mechanisms through which each functions, and how they are then coordinated, remain unknown. We propose that insects require distinct strategies to retrace familiar routes (route-following) and directly return from novel to familiar terrain (homing) using different aspects of frequency-encoded views that are processed in different neural pathways. We also demonstrate how the Central Complex and Mushroom Bodies regions of the insect brain may work in tandem to coordinate the directional output of different guidance cues through a contextually switched ring-attractor inspired by neural recordings. The resultant unified model of insect navigation reproduces behavioural data from a series of cue-conflict experiments in realistic animal environments and offers testable hypotheses of where and how insects process visual cues, utilise the different information that they provide and coordinate their outputs to achieve the adaptive behaviours observed in the wild.
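The "optimal integration" aspect can be illustrated separately from the full ring-attractor circuit: if each guidance system reports a direction together with a confidence, summing unit vectors weighted by confidence yields the compromise heading (and, for von Mises-distributed cues, approximates the maximum-likelihood combination). The sketch below shows only that weighted summation and is an assumed simplification, not the published decentralised network.

```python
import numpy as np

def integrate_cues(directions_rad, weights):
    """Weighted circular mean of directional cues.
    For von Mises cues, weighting by concentration approximates the
    maximum-likelihood combined heading (our simplifying assumption)."""
    vec = np.sum(np.array(weights)[:, None] *
                 np.column_stack([np.cos(directions_rad), np.sin(directions_rad)]),
                 axis=0)
    return np.arctan2(vec[1], vec[0]), np.linalg.norm(vec)

# Toy cue conflict: path integration says 10 deg (confident), visual memory says 60 deg (less so)
heading, certainty = integrate_cues(np.radians([10, 60]), weights=[2.0, 1.0])
print(f"combined heading: {np.degrees(heading):.1f} deg, strength {certainty:.2f}")
```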
Collapse
Affiliation(s)
- Xuelong Sun
- Computational Intelligence Lab & L-CAS, School of Computer Science, University of Lincoln, Lincoln, United Kingdom
| | - Shigang Yue
- Computational Intelligence Lab & L-CAS, School of Computer Science, University of Lincoln, Lincoln, United Kingdom
- Machine Life and Intelligence Research Centre, Guangzhou University, Guangzhou, China
| | - Michael Mangan
- Sheffield Robotics, Department of Computer Science, University of Sheffield, Sheffield, United Kingdom
| |
Collapse
|
36
|
Islam M, Freas CA, Cheng K. Effect of large visual changes on the navigation of the nocturnal bull ant, Myrmecia midas. Anim Cogn 2020; 23:1071-1080. [PMID: 32270349 DOI: 10.1007/s10071-020-01377-0] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2019] [Accepted: 03/30/2020] [Indexed: 11/25/2022]
Abstract
Nocturnal insects have remarkable visual capacities in dim light. They can navigate using both the surrounding panorama and celestial cues. Individual foraging ants are efficient navigators, able to accurately reach a variety of goal locations. During navigation, foragers compare the current panoramic view to previously learnt views. In this natural experiment, we observed the effects of large panorama changes, the addition of a fence and the removal of several trees near the nest site, on the navigation of the nocturnal bull ant Myrmecia midas. We examined how the ants' navigational efficiency and behaviour changed in response to changes in ~ 30% of the surrounding skyline, following them over multiple nights. Foragers were displaced locally off-route where we collected initial orientations and homing paths both before and after large panorama changes. We found that immediately after these changes, foragers were unable to initially orient correctly to the nest direction and foragers' return paths were less straight, suggesting increased navigational uncertainty. Continued testing showed rapid recovery in both initial orientation and path straightness.
Collapse
Affiliation(s)
- Muzahid Islam
- Department of Biological Sciences, Macquarie University, Sydney, NSW, 2109, Australia.
| | - Cody A Freas
- Department of Psychology, University of Alberta, Edmonton, AB, Canada
| | - Ken Cheng
- Department of Biological Sciences, Macquarie University, Sydney, NSW, 2109, Australia
| |
Collapse
|
37
|
Freas CA, Congdon JV, Plowes NJR, Spetch ML. Pheromone cue triggers switch between vectors in the desert harvest ant, Veromessor pergandei. Anim Cogn 2020; 23:1087-1105. [PMID: 32078060 DOI: 10.1007/s10071-020-01354-7] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/18/2019] [Revised: 01/07/2020] [Accepted: 01/25/2020] [Indexed: 11/27/2022]
Abstract
The desert harvester ant (Veromessor pergandei) employs a mixture of social and individual navigational strategies at separate stages of their foraging trip. Individuals leave the nest along a pheromone-based column, travelling 3-40 m before spreading out to forage individually in a fan. Foragers use path integration while in this fan, accumulating a direction and distance estimate (vector) to return to the end of the column (column head), yet foragers' potential use of path integration in the pheromone-based column is less understood. Here we show foragers rely on path integration both in the foraging fan and while in the column to return to the nest, using separate vectors depending on their current foraging stage in the fan or column. Returning foragers displaced while in the fan oriented and travelled to the column head location while those displaced after reaching the column travel in the nest direction, signifying the maintenance of a two-vector system with separate fan and column vectors directing a forager to two separate spatial locations. Interestingly, the trail pheromone and not the surrounding terrestrial cues mediate use of these distinct vectors, as fan foragers briefly exposed to the pheromone cues of the column in isolation altered their paths to a combination of the fan and column vectors. The pheromone acts as a contextual cue triggering both the retrieval of the column-vector memory and its integration with the forager's current fan-vector.
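The two-vector bookkeeping described here can be made concrete with a minimal path-integration sketch: one accumulator runs from the nest along the column, a second runs from the column head through the fan, and a pheromone context flag selects which vector is read out for homing. The code below is our hypothetical rendering of that logic; the class name, the Cartesian accumulation and the read-out rule are assumptions, not the authors' model.

```python
import numpy as np

class TwoVectorPI:
    """Sketch of separate column and fan path-integration accumulators,
    with pheromone contact selecting which home vector is read out."""

    def __init__(self):
        self.column = np.zeros(2)   # nest -> column head
        self.fan = np.zeros(2)      # column head -> current position
        self.on_pheromone = True    # the trip starts on the column

    def step(self, heading_rad, distance):
        d = distance * np.array([np.cos(heading_rad), np.sin(heading_rad)])
        if self.on_pheromone:
            self.column += d
        else:
            self.fan += d

    def homing_vector(self):
        # With pheromone contact: aim for the nest (column and fan combined).
        # In the fan without pheromone: aim back to the column head only.
        return -(self.column + self.fan) if self.on_pheromone else -self.fan

# Toy trip: 5 m out along the column, then a 3 m fan excursion at 45 degrees
ant = TwoVectorPI()
ant.step(0.0, 5.0)
ant.on_pheromone = False
ant.step(np.radians(45), 3.0)
print("fan homing vector (to column head):", ant.homing_vector())
ant.on_pheromone = True           # pheromone contact triggers the switch
print("pheromone homing vector (toward nest):", ant.homing_vector())
```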
Collapse
Affiliation(s)
- Cody A Freas
- Department of Psychology, University of Alberta, Edmonton, AB, T6G 2R3, Canada.
| | - Jenna V Congdon
- Department of Psychology, University of Alberta, Edmonton, AB, T6G 2R3, Canada
| | | | - Marcia L Spetch
- Department of Psychology, University of Alberta, Edmonton, AB, T6G 2R3, Canada
| |
Collapse
|
38
|
Abstract
Many animals use an internal sense of direction to guide their movements through the world. Neurons selective to head direction are thought to support this directional sense and have been found in a diverse range of species, from insects to primates, highlighting their evolutionary importance. Across species, most head-direction networks share four key properties: a unique representation of direction at all times, persistent activity in the absence of movement, integration of angular velocity to update the representation, and the use of directional cues to correct drift. The dynamics of theorized network structures called ring attractors elegantly account for these properties, but their relationship to brain circuits is unclear. Here, we review experiments in rodents and flies that offer insights into potential neural implementations of ring attractor networks. We suggest that a theory-guided search across model systems for biological mechanisms that enable such dynamics would uncover general principles underlying head-direction circuit function.
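The four shared properties listed in this abstract (a unique bump, persistence, angular-velocity integration and cue-based drift correction) can be demonstrated with a deliberately simple, algorithmic caricature of a ring attractor in which the bump is shifted by integrated angular velocity and nudged toward a directional cue. The sketch below is that caricature, with arbitrary parameters; it is not a dynamical network model of any circuit discussed in the review.

```python
import numpy as np

N = 60
prefs = np.linspace(0, 2 * np.pi, N, endpoint=False)   # preferred directions

def bump(center, kappa=4.0):
    """Von Mises activity profile centred on `center` (the heading bump)."""
    a = np.exp(kappa * np.cos(prefs - center))
    return a / a.max()

def decode(activity):
    """Population-vector readout of the bump position."""
    return np.arctan2(activity @ np.sin(prefs), activity @ np.cos(prefs))

def update(activity, omega, dt, cue=None, gain=0.2):
    """One step of a simplified ring-attractor update: shift the bump by the
    integrated angular velocity, then nudge it toward a directional cue to
    correct drift. Persistence is implicit: with omega = 0 and no cue, the
    bump stays where it is."""
    center = decode(activity) + omega * dt              # angular-velocity integration
    if cue is not None:
        err = np.angle(np.exp(1j * (cue - center)))     # wrapped error to the cue
        center += gain * err                            # drift correction
    return bump(center)

# Toy usage: turn at 1 rad/s for 1 s with a noisy directional cue at 1 rad
act = bump(0.0)
for _ in range(100):
    act = update(act, omega=1.0, dt=0.01, cue=1.0 + np.random.normal(scale=0.1))
print("decoded heading (rad):", decode(act))            # close to the cue direction (~1.0 rad)
```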
Collapse
Affiliation(s)
- Brad K Hulse
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, Virginia 20147, USA
| | - Vivek Jayaraman
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, Virginia 20147, USA
| |
Collapse
|
39
|
Collett TS. Path integration: how details of the honeybee waggle dance and the foraging strategies of desert ants might help in understanding its mechanisms. J Exp Biol 2019; 222:jeb205187. [PMID: 31152122 DOI: 10.1242/jeb.205187] [Citation(s) in RCA: 26] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/20/2022]
Abstract
Path integration is a navigational strategy that gives an animal an estimate of its position relative to some starting point. For many decades, ingenious and probing behavioural experiments have been the only window onto the operation of path integration in arthropods. New methods have now made it possible to visualise the activity of neural circuits in Drosophila while they fly or walk in virtual reality. Studies of this kind, as well as electrophysiological recordings from single neurons in the brains of other insects, are revealing details of the neural mechanisms that control an insect's direction of travel and other aspects of path integration. The aim here is first to review the major features of path integration in foraging desert ants and honeybees, the current champion path integrators of the insect world, and second to consider how the elaborate behaviour of these insects might be accommodated within the framework of the newly understood neural circuits. The discussion focuses particularly on the ability of ants and honeybees to use a celestial compass to give direction in Earth-based coordinates, and of honeybees to use a landscape panorama to provide directional guidance for path integration. The possibility is raised that well-ordered behaviour might in some cases substitute for complex circuitry.
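One concrete link between path integration and the waggle dance, as discussed in this review, is that the dance reports the direction of the goal relative to the sun's azimuth (transposed to gravity on the vertical comb) and its distance via waggle duration. The sketch below converts a nest-to-goal vector into those two dance parameters; the roughly one millisecond of waggle per metre scaling is only a commonly quoted rule of thumb, used here for illustration, and the function name is hypothetical.

```python
import numpy as np

def dance_parameters(goal_vector_m, sun_azimuth_deg, ms_per_metre=1.0):
    """Convert a nest-to-goal vector (metres, east/north components) into
    waggle-dance parameters: the waggle-run angle relative to vertical
    (gravity standing in for the sun's azimuth on the comb) and the waggle
    duration. The ~1 ms-per-metre scaling is a rough rule of thumb, used
    here purely for illustration."""
    east, north = goal_vector_m
    bearing_deg = np.degrees(np.arctan2(east, north)) % 360    # compass bearing of the goal
    dance_angle = (bearing_deg - sun_azimuth_deg) % 360        # angle relative to the sun / vertical
    distance = np.hypot(east, north)
    return dance_angle, distance * ms_per_metre

# Toy usage: a feeder about 400 m to the north-east, sun azimuth 135 degrees
angle, duration_ms = dance_parameters((283.0, 283.0), sun_azimuth_deg=135.0)
print(f"waggle angle: {angle:.0f} deg from vertical, duration: {duration_ms:.0f} ms")
```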
Collapse
Affiliation(s)
- Thomas S Collett
- School of Life Sciences, University of Sussex, Brighton BN1 9QG, UK
| |
Collapse
|