1
El Boukhrissi A, Taheri A, Bennas N, Belkhiri A, El Ajjouri B, Reyes-López JL. Foraging trail traffic rules: a new study method of trajectories of the harvester ants. Insect Sci 2025; 32:687-700. PMID: 38961518. DOI: 10.1111/1744-7917.13411.
Abstract
Harvester ants are among the most extensively studied groups of ants, especially group-foraging species such as Messor barbarus (Linnaeus, 1767), which constructs long-lasting trunk trails. Few laboratory investigations have examined head-on encounters along foraging trails between workers moving in opposite directions, and even fewer such studies have been conducted in the natural environment. To address this gap, we devised an in-field experimental design to induce lane segregation on the foraging trunk trail of M. barbarus. Using an image-based tracking method, we analyzed the foraging behavior of this species to assess the costs associated with head-on encounters and to understand how outgoing and returning workers naturally coexist on a bidirectional route. Our results consistently reveal greater straightness and speed in unidirectional test lanes, accompanied by a higher foraging rate than in bidirectional lanes. This suggests a potential impact of head-on collisions on foraging behavior, and on foraging efficiency in particular. Additionally, kinematic analysis revealed distinct movement patterns between outbound and inbound flows, particularly the low speed and sinuous trajectories of inbound unladen workers. The comparison of encounter rates in the two traffic systems points to the plausible use of individual memory by workers within trails, underscoring the pivotal role of encounters in information exchange and load transfer.
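The straightness and speed measures referred to in this abstract can be computed directly from image-based tracking output. The sketch below (Python; the function name, units, and frame-rate handling are illustrative assumptions, not taken from the paper) uses a common definition of straightness, net displacement divided by path length, for a single tracked trajectory.

    import numpy as np

    def straightness_and_speed(xy, fps):
        """Straightness index (net displacement / path length) and mean speed
        for one tracked trajectory; xy is an (N, 2) array of positions in cm."""
        steps = np.diff(xy, axis=0)                       # per-frame displacement vectors
        step_len = np.linalg.norm(steps, axis=1)
        path_length = step_len.sum()                      # total distance travelled
        net_displacement = np.linalg.norm(xy[-1] - xy[0])
        straightness = net_displacement / path_length if path_length > 0 else np.nan
        mean_speed = step_len.mean() * fps                # cm per second
        return straightness, mean_speed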
Affiliation(s)
- Ahmed Taheri
- Faculty of Sciences, Chouaïb Doukkali University, El Jadida, Morocco
- Nard Bennas
- LESCB URL-CNRST N° 18, FS, Abdelmalek Essaadi University, Tetouan, Morocco
- Abdelkhalek Belkhiri
- Natural Resources Management and Development Team, Environment and Health Laboratory, Department of Biology, Faculty of Sciences, Moulay Ismaïl University, Meknes, Morocco
- Bilal El Ajjouri
- Faculty of Sciences, Chouaïb Doukkali University, El Jadida, Morocco
- Joaquín L Reyes-López
- Área de Ecología, Facultad de Ciencias, Campus de Rabanales, Universidad de Córdoba, Córdoba, Spain
2
Kagioulis E, Knight J, Graham P, Nowotny T, Philippides A. Adaptive Route Memory Sequences for Insect-Inspired Visual Route Navigation. Biomimetics (Basel) 2024; 9:731. PMID: 39727735. DOI: 10.3390/biomimetics9120731.
Abstract
Visual navigation is a key capability for robots and animals. Inspired by the navigational prowess of social insects, a family of insect-inspired route navigation algorithms, familiarity-based algorithms, has been developed that uses stored panoramic images collected during a training route to subsequently derive directional information during route recapitulation. However, unlike the ants that inspire them, these algorithms ignore the sequence in which the training images are acquired, so all temporal information and correlation is lost. In this paper, the benefits of incorporating sequence information into familiarity-based algorithms are tested. To do this, instead of comparing a test view to all the training route images, a window of memories is used to restrict the number of comparisons that need to be made. Because ants are able to visually navigate when odometric information is removed, the window position is updated via visual matching information only, not odometry. The performance of an algorithm without sequence information is compared to that of window methods with different fixed lengths, as well as a method that adapts the window size dynamically. All algorithms were benchmarked on a simulation of an environment used for ant navigation experiments; the results showed that sequence information can boost performance and reduce computation. A detailed analysis of successes and failures highlights the interaction between the length of the route memory sequence and the environment type and shows the benefits of the adaptive method.
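A minimal sketch of the windowing idea described above: the current view is compared only to a window of stored route images centred on the last best match, and the window is advanced from visual matching alone, with no odometry. Function and variable names are illustrative, and the pixel-difference measure is a generic stand-in for the familiarity measure used in the paper.

    import numpy as np

    def best_heading(view_rotations, route_images, window_centre, window_size):
        """Pick the most familiar rotation of the current view by comparing it
        only to a window of stored route images around the last best match.
        view_rotations: (R, H, W) rotated copies of the current view.
        route_images:   (N, H, W) training images stored in route order."""
        lo = max(0, window_centre - window_size // 2)
        hi = min(len(route_images), window_centre + window_size // 2 + 1)
        window = route_images[lo:hi]
        # image difference between every rotation and every stored view in the window
        diffs = np.abs(view_rotations[:, None] - window[None]).mean(axis=(2, 3))
        rot_idx, mem_idx = np.unravel_index(np.argmin(diffs), diffs.shape)
        new_centre = lo + mem_idx          # window advances from visual matching only
        return rot_idx, new_centre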
Affiliation(s)
- Efstathios Kagioulis
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton BN1 9QJ, UK
- James Knight
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton BN1 9QJ, UK
- Paul Graham
- Sussex Neuroscience, School of Life Sciences, University of Sussex, Brighton BN1 9QG, UK
- Thomas Nowotny
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton BN1 9QJ, UK
- Andrew Philippides
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton BN1 9QJ, UK
3
Dauzere-Peres O, Wystrach A. Ants integrate proprioception as well as visual context and efference copies to make robust predictions. Nat Commun 2024; 15:10205. PMID: 39617774. PMCID: PMC11609268. DOI: 10.1038/s41467-024-53856-4.
Abstract
Forward models are mechanisms enabling an agent to predict the sensory outcomes of its actions. They can be implemented through efference copies: copies of motor signals that inhibit the expected sensory stimulation, cancelling the perceptual outcome of the predicted action. Efference copies are known to modulate optic flow detection for flight control in flies. Here we investigate whether forward models account for the detection of optic flow in walking ants, and how this information is integrated for locomotion control. We mounted Cataglyphis velox ants in a virtual reality setup and manipulated the relationship between the ants' movements and the optic flow they perceived. Our results show that ants compute predictions of the optic flow expected from their own movements. However, the prediction is not based solely on efference copies; it also involves proprioceptive feedback and is fine-tuned by the visual structure of the panorama. Mismatches between prediction and perception are computed for each eye, and the error signals are integrated to adjust locomotion through the modulation of internal oscillators. Our work reveals that insects' forward models are non-trivial and compute predictions based on multimodal information.
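The forward-model computation described above can be summarised as a per-eye prediction error: an expected optic flow is formed from the motor command (efference copy) and proprioceptive feedback, and the mismatch with the perceived flow adjusts locomotion. The sketch below is purely illustrative; the weights, sign conventions, and scalar flow representation are assumptions, not values or code from the study.

    def forward_model_error(motor_command, proprioception, perceived_flow,
                            w_efference=0.5, w_proprio=0.5):
        """Per-eye prediction error between expected and perceived optic flow.
        All inputs are dicts of signed scalars per eye ('left'/'right');
        positive values stand for progressive (front-to-back) flow."""
        errors = {}
        for eye in ("left", "right"):
            predicted = w_efference * motor_command[eye] + w_proprio * proprioception[eye]
            errors[eye] = perceived_flow[eye] - predicted   # mismatch drives steering
        return errors

    # The left/right error signals could then bias the phase of an internal
    # oscillator that alternates turning, as the abstract suggests.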
Affiliation(s)
- Océane Dauzere-Peres
- Centre de Recherches sur la Cognition Animale, CBI, CNRS, Université Paul Sabatier, Toulouse, France
- Antoine Wystrach
- Centre de Recherches sur la Cognition Animale, CBI, CNRS, Université Paul Sabatier, Toulouse, France
4
Jesusanmi OO, Amin AA, Domcsek N, Knight JC, Philippides A, Nowotny T, Graham P. Investigating visual navigation using spiking neural network models of the insect mushroom bodies. Front Physiol 2024; 15:1379977. PMID: 38841209. PMCID: PMC11151298. DOI: 10.3389/fphys.2024.1379977.
Abstract
Ants are capable of learning long, visually guided foraging routes with limited neural resources. The visual scene memory needed for this behaviour is mediated by the mushroom bodies, an insect brain region important for learning and memory. In a visual navigation context, the mushroom bodies are theorised to act as familiarity detectors, guiding ants towards views that are similar to those previously learned when first travelling along a foraging route. Evidence from behavioural experiments, computational studies, and brain lesions all supports this idea. Here we further investigate the role of the mushroom bodies in visual navigation with a spiking neural network model learning complex natural scenes. By implementing these networks in GeNN, a library for building GPU-accelerated spiking neural networks, we were able to test the models offline on an image database representing navigation through a complex outdoor natural environment, and also online, embodied on a robot. The mushroom body model successfully learnt a large series of visual scenes (400 scenes corresponding to a 27 m route) and used these memories to choose accurate heading directions during route recapitulation in both complex environments. By analysing the model's Kenyon cell (KC) activity, we demonstrated that KC activity is directly related to the novelty of the input images. A parameter search showed a non-linear dependence between the optimal KC-to-visual projection neuron (VPN) connection sparsity and the length of time the model is presented with an image stimulus. The parameter search also showed that training the model on a lower proportion of a route generally produced better accuracy when testing on the entire route. We embodied the mushroom body model and comparator visual navigation algorithms on a Quanser Q-car robot, with all processing running on an Nvidia Jetson TX2. On a 6.5 m route, the mushroom body model had a mean distance to the training route (error) of 0.144 ± 0.088 m over 5 trials, performance comparable to standard visual-only navigation algorithms. Thus, we have demonstrated that a biologically plausible model of the ant mushroom body can navigate complex environments both in simulation and in the real world. Understanding the neural basis of this behaviour will provide insight into how neural circuits are tuned to rapidly learn behaviourally relevant information from complex environments and will provide inspiration for creating bio-mimetic computer and robotic systems that can learn rapidly with low energy requirements.
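The familiarity-detector role described above can be illustrated with a simplified, rate-based (non-spiking) analogue of the mushroom body model: visual input is projected into a sparse Kenyon cell code, learning depresses the output weights of active KCs, and the output response then signals novelty. Network sizes and sparsity values below are illustrative, not the parameters used in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    N_INPUT, N_KC, KC_ACTIVE = 400, 10_000, 200        # illustrative sizes, not the paper's

    # sparse random projection from the visual input to Kenyon cells
    W_in = (rng.random((N_KC, N_INPUT)) < 0.05).astype(float)
    W_out = np.ones(N_KC)                               # KC -> output weights, depressed by learning

    def kc_code(view):
        """Sparse KC representation of a flattened low-resolution view
        (N_INPUT pixels): only the most strongly driven cells stay active."""
        drive = W_in @ view.ravel()
        active = np.argsort(drive)[-KC_ACTIVE:]
        code = np.zeros(N_KC)
        code[active] = 1.0
        return code

    def novelty(view):
        """Output of the familiarity neuron: high for novel views, low for learned ones."""
        return W_out @ kc_code(view)

    def learn(view):
        """Depress output weights of active KCs so a repeated view evokes less response."""
        W_out[kc_code(view) > 0] = 0.0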
Affiliation(s)
- Amany Azevedo Amin
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
- Norbert Domcsek
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
- James C. Knight
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
- Andrew Philippides
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
- Thomas Nowotny
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
- Paul Graham
- Sussex Neuroscience, School of Life Sciences, University of Sussex, Brighton, United Kingdom
5
Freas CA, Spetch ML. Directed retreat and navigational mechanisms in trail following Formica obscuripes. Learn Behav 2024; 52:114-131. PMID: 37752304. PMCID: PMC10923983. DOI: 10.3758/s13420-023-00604-1.
Abstract
Ant species exhibit behavioural commonalities when solving navigational challenges, allowing them to orient successfully and reach goal locations. These behaviours rely on a shared toolbox of navigational strategies that guide individuals under an array of motivational contexts. The mechanisms that support these behaviours, however, are tuned to each species' habitat and ecology, and some species exhibit unique navigational behaviours. This leads to clear differences in how ant navigators rely on this shared toolbox to reach goals. Species with hybrid foraging structures, which navigate partly along a pheromone-marked column, show distinct differences in their toolbox compared to solitary foragers. Here, we explore the navigational abilities of the Western Thatching ant (Formica obscuripes), a hybrid-foraging species whose navigational mechanisms have not been studied. We characterise their reliance on both the visual panorama and a path integrator for orientation, with the pheromone's presence acting as a non-directional reassurance cue that promotes continued orientation based on other strategies. This species also displays backtracking behaviour, which occurs with a combination of unfamiliar terrestrial cues and the absence of the pheromone, thus operating through a combination of the individual mechanisms observed in solitarily and socially foraging species. We also characterise a new form of goalless orientation in these ants, an initial retreating behaviour that is modulated by the forager's path integration system. This behaviour directs disturbed inbound foragers back along their outbound path for a short distance before they recover and reorient towards the nest.
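The path integrator invoked in this abstract amounts to a running vector sum of compass headings and odometric distances, whose inverse points back to the nest. The sketch below is an illustrative implementation of that textbook computation, not the authors' analysis code.

    import numpy as np

    def home_vector(headings_deg, step_lengths):
        """Accumulate a path-integration vector from a sequence of compass
        headings and odometric step lengths; the negated sum points from the
        ant's current position back to the nest."""
        headings = np.radians(headings_deg)
        dx = np.sum(step_lengths * np.cos(headings))
        dy = np.sum(step_lengths * np.sin(headings))
        distance = np.hypot(dx, dy)
        bearing_home = np.degrees(np.arctan2(-dy, -dx))   # direction back to the nest
        return distance, bearing_home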
Affiliation(s)
- Cody A Freas
- Department of Psychology, University of Alberta, Edmonton, Alberta, Canada.
- School of Natural Sciences, Macquarie University, Sydney, NSW, 2113, Australia.
- Marcia L Spetch
- Department of Psychology, University of Alberta, Edmonton, Alberta, Canada
6
Zhu L, Mangan M, Webb B. Neuromorphic sequence learning with an event camera on routes through vegetation. Sci Robot 2023; 8:eadg3679. PMID: 37756384. DOI: 10.1126/scirobotics.adg3679.
Abstract
For many robotics applications, it is desirable to have relatively low-power and efficient onboard solutions. We took inspiration from insects, such as ants, that are capable of learning and following routes in complex natural environments using relatively constrained sensory and neural systems. Such capabilities are particularly relevant to applications such as agricultural robotics, where visual navigation through dense vegetation remains a challenging task. In this scenario, a route is likely to have high self-similarity, to be subject to changing lighting conditions and to motion over uneven terrain, and to show increased input variability from the effects of wind on leaves. We used a bioinspired event camera on a terrestrial robot to collect visual sequences along routes in natural outdoor environments and applied a neural algorithm for spatiotemporal memory that is closely based on a known neural circuit in the insect brain. We show that this method can plausibly support route recognition for visual navigation and is more robust than SeqSLAM when evaluated on repeated runs on the same route or on routes with small lateral offsets. By encoding memory in a spiking neural network running on a neuromorphic computer, our model can evaluate visual familiarity in real time from event camera footage.
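The benefit of sequence matching over single-frame matching, which motivates both this model and the SeqSLAM baseline, can be illustrated with a simple non-spiking sketch: a short sequence of recent views is compared against every contiguous stretch of the stored route, and the best-aligned stretch localises the robot. Names and the pixel-difference measure are illustrative assumptions, not the paper's method.

    import numpy as np

    def sequence_match(recent_views, route_views, seq_len=5):
        """Locate the current position on a stored route by matching a short
        sequence of recent views against every contiguous stretch of the route,
        rather than matching single frames. recent_views is (seq_len, H, W),
        route_views is (N, H, W)."""
        n = len(route_views)
        best_idx, best_score = None, np.inf
        for start in range(n - seq_len + 1):
            stretch = route_views[start:start + seq_len]
            score = np.abs(recent_views - stretch).mean()   # mean difference over the sequence
            if score < best_score:
                best_idx, best_score = start, score
        return best_idx + seq_len - 1, best_score            # route index of the current view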
Affiliation(s)
- Le Zhu
- School of Informatics, University of Edinburgh, EH8 9AB Edinburgh, UK
- Michael Mangan
- Sheffield Robotics, Department of Computer Science, University of Sheffield, S1 4DP Sheffield, UK
- Barbara Webb
- School of Informatics, University of Edinburgh, EH8 9AB Edinburgh, UK
7
Freas CA, Spetch ML. Varieties of visual navigation in insects. Anim Cogn 2023; 26:319-342. PMID: 36441435. PMCID: PMC9877076. DOI: 10.1007/s10071-022-01720-7.
Abstract
The behaviours and cognitive mechanisms animals use to orient, navigate, and remember spatial locations exemplify how cognitive abilities have evolved to suit a number of different mobile lifestyles and habitats. While spatial cognition in vertebrates has been well characterised in recent decades, great strides have also been made in characterising and understanding the behavioural and cognitive basis of orientation and navigation in invertebrate models, and in insects in particular. Insects exhibit remarkable spatial cognitive abilities and are able to migrate successfully over long distances or pinpoint known locations, relying on multiple navigational strategies similar to those found in vertebrate models, all while operating under the constraint of relatively limited neural architectures. Insect orientation and navigation systems are often tailored to each species' ecology, yet common mechanistic principles can be observed repeatedly. Of these, reliance on visual cues is observed across a wide range of insect groups. In this review, we characterise some of the behavioural strategies used by insects to solve navigational problems, including orientation over short distances, maintenance of migratory headings over long distances, and homing to known locations. We describe behavioural research using examples from a few well-studied insect species to illustrate how visual cues are used in navigation and how they interact with non-visual cues and strategies.
Affiliation(s)
- Cody A. Freas
- Department of Psychology, University of Alberta, Edmonton, AB, Canada
- School of Natural Sciences, Macquarie University, Sydney, NSW, Australia
- Marcia L. Spetch
- Department of Psychology, University of Alberta, Edmonton, AB, Canada
8
Freas CA, Wystrach A, Schwarz S, Spetch ML. Aversive view memories and risk perception in navigating ants. Sci Rep 2022; 12:2899. PMID: 35190612. PMCID: PMC8861035. DOI: 10.1038/s41598-022-06859-4.
Abstract
Many ants establish foraging routes by learning views of the visual panorama. Route models have focused primarily on the use of attractive views, which experienced foragers orient towards in order to return to known sites. However, aversive views have recently been uncovered as a key component of route learning. Here, Cataglyphis velox foragers rapidly learned aversive views when they were associated with a negative outcome, a period of captivity in vegetation, triggering increases in hesitation behavior. These memories were based on the accumulation of experiences over multiple trips, with each new experience regulating forager hesitancy. Foragers were also sensitive to differences in captivity time, suggesting they possess some mechanism to quantify duration. Finally, we analyzed foragers' perception of risky (i.e., variable) versus stable aversive outcomes by associating two sites along the route with distinct captivity schedules, of fixed or variable duration, with the same mean across training. Foragers exhibited fewer hesitations in response to risky outcomes than to fixed ones, indicating that they perceived risky outcomes as less severe. The results align with a logarithmic relationship between captivity duration and hesitations, suggesting that the perceived magnitude of the aversive stimulus scales with the logarithm of its actual value. We discuss how aversive view learning could be executed within the mushroom body circuitry following a prediction-error rule.
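The logarithmic perception suggested above also explains why a variable schedule with the same mean is perceived as milder: by Jensen's inequality, the mean of the logarithm of the captivity durations is lower than the logarithm of their mean. A small numeric illustration (the durations are invented, not the paper's data):

    import numpy as np

    fixed = np.array([60.0, 60.0, 60.0, 60.0])        # seconds of captivity, fixed schedule
    variable = np.array([10.0, 110.0, 10.0, 110.0])   # same mean (60 s), variable schedule

    print(fixed.mean(), variable.mean())              # 60.0 60.0 -> identical objective mean
    print(np.log(fixed).mean())                       # ~4.09
    print(np.log(variable).mean())                    # ~3.50 -> lower perceived severity
    # Since mean(log(t)) <= log(mean(t)), the risky schedule is perceived as
    # milder, consistent with the fewer hesitations reported above.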
9
Woodgate JL, Perl C, Collett TS. The routes of one-eyed ants suggest a revised model of normal route following. J Exp Biol 2021; 224:271814. PMID: 34382659. DOI: 10.1242/jeb.242167.
Abstract
The prevailing account of visually controlled routes is that an ant learns views as it follows a route while guided by other path-setting mechanisms. Once a set of route views is memorised, the insect follows the route by turning and moving forwards when the view on the retina matches a stored view. To discover whether there may be additional components to route performance, we engineered a situation in which this account cannot suffice. One-eyed wood ants were trained to navigate a short route in the laboratory, guided by a single black vertical bar placed in the blinded visual field. Ants thus had to turn away from the route to see the bar. They often turned to look at or beyond the bar and then turned to face in the direction of the goal. Tests in which the bar was shifted to be more peripheral or more frontal than in training produced a corresponding directional change in the ants' paths, demonstrating that they were guided by the bar. Examination of the endpoints of turns towards and away from the bar indicates that ants use the bar for guidance by learning how large a turn-back is needed to face the goal. We suggest that the ants' zigzag paths are, in part, controlled by turns of a learnt amplitude and that these turns are an integral component of visually guided route following.
Affiliation(s)
- Joseph L Woodgate
- School of Biological and Chemical Sciences, Queen Mary University of London, London E1 4NS, UK
- Craig Perl
- School of Life Sciences, University of Sussex, Brighton BN1 9QG, UK
- Thomas S Collett
- School of Life Sciences, University of Sussex, Brighton BN1 9QG, UK
10
Stankiewicz J, Webb B. Looking down: a model for visual route following in flying insects. Bioinspir Biomim 2021; 16:055007. PMID: 34243169. DOI: 10.1088/1748-3190/ac1307.
Abstract
Insect visual navigation is often assumed to depend on panoramic views of the horizon and how these change as the animal moves. However, it is known that honey bees can visually navigate in flat, open meadows where visual information at the horizon is minimal or would remain relatively constant across a wide range of positions. In this paper we hypothesise that these animals can navigate using view memories of the ground. We find that in natural scenes, low-resolution views from an aerial perspective of ostensibly self-similar terrain (e.g., within a field of grass) provide surprisingly robust descriptors of precise spatial locations. We propose a new visual route-following approach that uses transverse oscillations to centre a flight path along a sequence of learned views of the ground. We deploy this model on an autonomous quadcopter and demonstrate that it provides robust performance in the real world on journeys of up to 30 m. The success of our method is contingent on a robust view-matching process that can evaluate the familiarity of a view with a degree of translational invariance. We show that a previously developed wavelet-based bandpass orientated filter approach fits these requirements well, exhibiting double the catchment area of standard approaches. Using a realistic simulation package, we evaluate the robustness of our approach to variations in heading direction and aircraft height between inbound and outbound journeys. We also demonstrate that our approach can operate with a vision system of biologically relevant visual acuity and viewing direction.
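The requirement of view matching with translational invariance can be illustrated with a crude stand-in for the paper's wavelet-based approach: familiarity is taken as the best match over small pixel shifts between the current ground view and a stored one. This sketch only illustrates the idea of translation-tolerant matching; it is not the authors' filter-based method.

    import numpy as np

    def familiarity(view, memory, max_shift=3):
        """Translation-tolerant familiarity: the best (lowest) mean absolute
        difference between the current ground view and a stored view over
        small pixel offsets. Higher return values mean more familiar."""
        best = np.inf
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                shifted = np.roll(np.roll(view, dy, axis=0), dx, axis=1)
                best = min(best, np.abs(shifted - memory).mean())
        return -best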
Affiliation(s)
- J Stankiewicz
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
- B Webb
- School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9AB, United Kingdom
11
Grob R, el Jundi B, Fleischmann PN. Towards a common terminology for arthropod spatial orientation. Ethol Ecol Evol 2021. DOI: 10.1080/03949370.2021.1905075.
Affiliation(s)
- Robin Grob
- Behavioral Physiology and Sociobiology (Zoology II), Biocenter, University of Würzburg, Würzburg 97074, Germany
- Basil el Jundi
- Behavioral Physiology and Sociobiology (Zoology II), Biocenter, University of Würzburg, Würzburg 97074, Germany
- Pauline N. Fleischmann
- Behavioral Physiology and Sociobiology (Zoology II), Biocenter, University of Würzburg, Würzburg 97074, Germany