1
Johnson E, Campos-Cerqueira M, Jumail A, Yusni ASA, Salgado-Lynn M, Fornace K. Applications and advances in acoustic monitoring for infectious disease epidemiology. Trends Parasitol 2023;39:386-399. PMID: 36842917. DOI: 10.1016/j.pt.2023.01.008.
Abstract
Emerging infectious diseases continue to pose a significant burden on global public health, and there is a critical need to better understand transmission dynamics arising at the interface of human activity and wildlife habitats. Passive acoustic monitoring (PAM), more typically applied to questions of biodiversity and conservation, provides an opportunity to collect and analyse audio data in relative real time and at low cost. Acoustic methods are increasingly accessible, with the expansion of cloud-based computing, low-cost hardware, and machine learning approaches. Paired with purposeful experimental design, acoustic data can complement existing surveillance methods and provide a novel toolkit to investigate the key biological parameters and ecological interactions that underpin infectious disease epidemiology.
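To make the PAM workflow concrete, the sketch below implements a naive short-term energy detector of the kind an acoustic monitoring pipeline might use as a cheap first-pass screen before classification. It is illustrative only and not from the paper: the function names, frame size, and threshold are all assumptions.

```python
import numpy as np

def detect_events(signal, sr, frame_ms=50, threshold_db=10.0):
    """Flag frames whose energy exceeds the median noise floor by
    `threshold_db` dB -- a crude stand-in for the first-pass event
    detectors used in passive acoustic monitoring pipelines."""
    frame_len = int(sr * frame_ms / 1000)
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    energy_db = 10 * np.log10(np.mean(frames ** 2, axis=1) + 1e-12)
    noise_floor = np.median(energy_db)
    return energy_db > noise_floor + threshold_db

# Synthetic example: 2 s of low-level noise with a loud 0.2 s tone at t = 1 s
sr = 8000
rng = np.random.default_rng(0)
audio = 0.01 * rng.standard_normal(2 * sr)
t = np.arange(int(0.2 * sr)) / sr
audio[sr:sr + len(t)] += 0.5 * np.sin(2 * np.pi * 1000 * t)

events = detect_events(audio, sr)  # boolean mask over 50 ms frames
```

In a real deployment the flagged frames would be passed on to a trained classifier; the point here is only that frame-level screening of continuous recordings is cheap and simple.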
Affiliation(s)
- Emilia Johnson
- School of Biodiversity, One Health & Veterinary Medicine, University of Glasgow, Glasgow G12 8QQ, UK.
- Amaziasizamoria Jumail
- Danau Girang Field Centre c/o Sabah Wildlife Department, Wisma Muis, Block B, 5th Floor, 88100 Kota Kinabalu, Sabah, Malaysia; Organisms and Environment Division, Cardiff School of Biosciences, Cardiff University, Sir Martin Evans Building, Museum Avenue, Cardiff CF10 3AX, UK
- Ashraft Syazwan Ahmady Yusni
- Danau Girang Field Centre c/o Sabah Wildlife Department, Wisma Muis, Block B, 5th Floor, 88100 Kota Kinabalu, Sabah, Malaysia; Institute for Tropical Biology and Conservation, Universiti Malaysia Sabah, Jalan UMS, 88400 Kota Kinabalu, Sabah, Malaysia
- Milena Salgado-Lynn
- Danau Girang Field Centre c/o Sabah Wildlife Department, Wisma Muis, Block B, 5th Floor, 88100 Kota Kinabalu, Sabah, Malaysia; Organisms and Environment Division, Cardiff School of Biosciences, Cardiff University, Sir Martin Evans Building, Museum Avenue, Cardiff CF10 3AX, UK; Wildlife Health, Genetic and Forensic Laboratory, c/o Sabah Wildlife Department, Wisma Muis, Block B, 5th Floor, 88100 Kota Kinabalu, Sabah
- Kimberly Fornace
- School of Biodiversity, One Health & Veterinary Medicine, University of Glasgow, Glasgow G12 8QQ, UK; Centre for Climate Change and Planetary Health and Faculty of Infectious and Tropical Diseases, London School of Hygiene and Tropical Medicine, London WC1E 7HT, UK; Saw Swee Hock School of Public Health, National University of Singapore, Singapore; National University Health System, Singapore 117549, Singapore
2
Bergler C, Smeele SQ, Tyndel SA, Barnhill A, Ortiz ST, Kalan AK, Cheng RX, Brinkløv S, Osiecka AN, Tougaard J, Jakobsen F, Wahlberg M, Nöth E, Maier A, Klump BC. ANIMAL-SPOT enables animal-independent signal detection and classification using deep learning. Sci Rep 2022;12:21966. PMID: 36535999. PMCID: PMC9763499. DOI: 10.1038/s41598-022-26429-y.
Abstract
Bioacoustic research spans a wide range of biological questions and applications, relying on identification of target species or smaller acoustic units, such as distinct call types. However, manually identifying the signal of interest is time-intensive, error-prone, and becomes unfeasible with large data volumes. Therefore, machine-driven algorithms are increasingly applied to various bioacoustic signal identification challenges. Nevertheless, biologists still have major difficulties trying to transfer existing animal- and/or scenario-related machine learning approaches to their specific animal datasets and scientific questions. This study presents an animal-independent, open-source deep learning framework, along with a detailed user guide. Three signal identification tasks, commonly encountered in bioacoustics research, were investigated: (1) target signal vs. background noise detection, (2) species classification, and (3) call type categorization. ANIMAL-SPOT successfully segmented human-annotated target signals in data volumes representing 10 distinct animal species and 1 additional genus, resulting in a mean test accuracy of 97.9%, together with an average area under the ROC curve (AUC) of 95.9%, when predicting on unseen recordings. Moreover, an average segmentation accuracy and F1-score of 95.4% were achieved on the publicly available BirdVox-Full-Night data corpus. In addition, multi-class species and call type classification resulted in 96.6% and 92.7% accuracy on unseen test data, as well as 95.2% and 88.4% on previous animal-specific machine-based detection excerpts. Furthermore, an Unweighted Average Recall (UAR) of 89.3% outperformed the multi-species classification baseline system of the ComParE 2021 Primate Sub-Challenge. Besides animal independence, ANIMAL-SPOT does not rely on expert knowledge or special computing resources, thereby making deep-learning-based bioacoustic signal identification accessible to a broad audience.
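ANIMAL-SPOT itself is a deep neural network; as a schematic stand-in for its first task (target signal vs. background noise segmentation) and its evaluation metrics, the sketch below runs a naive spectrogram-energy detector over synthetic audio and scores it with frame-level accuracy and F1. Everything here (the function names, threshold, and toy signal) is an illustrative assumption, not the framework's code.

```python
import numpy as np

def stft_power(signal, n_fft=256, hop=128):
    """Magnitude-squared STFT with a Hann window (minimal, numpy-only)."""
    win = np.hanning(n_fft)
    n_frames = 1 + (len(signal) - n_fft) // hop
    frames = np.stack([signal[i * hop:i * hop + n_fft] * win
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1)) ** 2

def f1_score(pred, truth):
    """F1 from boolean per-frame predictions vs. ground truth."""
    tp = np.sum(pred & truth)
    prec = tp / max(pred.sum(), 1)
    rec = tp / max(truth.sum(), 1)
    return 2 * prec * rec / max(prec + rec, 1e-12)

# Toy "target vs. noise" segmentation: a 1.5 kHz whistle buried in noise.
sr, n_fft, hop = 8000, 256, 128
rng = np.random.default_rng(1)
audio = 0.02 * rng.standard_normal(sr)       # 1 s of background noise
start, dur = 3200, 2400                       # whistle spans samples 3200-5599
t = np.arange(dur) / sr
audio[start:start + dur] += 0.4 * np.sin(2 * np.pi * 1500 * t)

spec = stft_power(audio, n_fft, hop)
frame_energy = spec.sum(axis=1)
pred = frame_energy > 4 * np.median(frame_energy)  # crude energy detector

# Ground truth: frames whose window overlaps the whistle
starts = np.arange(len(pred)) * hop
truth = (starts + n_fft > start) & (starts < start + dur)

accuracy = np.mean(pred == truth)
f1 = f1_score(pred, truth)
```

A deep model replaces the hand-set threshold with learned features, but the segmentation task and the accuracy/F1 scoring shown here mirror how such detectors are evaluated.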
Affiliation(s)
- Christian Bergler
- Pattern Recognition Lab, Department of Computer Science, Friedrich-Alexander-Universität Erlangen-Nürnberg, 91058 Erlangen, Germany
- Simeon Q. Smeele
- Cognitive and Cultural Ecology Lab, Max Planck Institute of Animal Behavior, 78315 Radolfzell, Germany; Department of Human Behavior, Ecology and Culture, Max Planck Institute for Evolutionary Anthropology, 04103 Leipzig, Germany; Biology Department, University of Konstanz, 78464 Constance, Germany
- Stephen A. Tyndel
- Cognitive and Cultural Ecology Lab, Max Planck Institute of Animal Behavior, 78315 Radolfzell, Germany; Department of Natural Resources and Environmental Sciences, University of Illinois Urbana-Champaign, Champaign, IL, United States
- Alexander Barnhill
- Pattern Recognition Lab, Department of Computer Science, Friedrich-Alexander-Universität Erlangen-Nürnberg, 91058 Erlangen, Germany
- Sara T. Ortiz
- Max Planck Institute for Biological Intelligence, in Foundation, Seewiesen, Eberhard-Gwinner-Strasse, 82319 Starnberg, Germany
- Ammie K. Kalan
- Department of Anthropology, University of Victoria, Victoria, BC V8P 5C2, Canada
- Rachael Xi Cheng
- Leibniz Institute for Zoo and Wildlife Research, Alfred-Kowalke-Straße 17, 10315 Berlin, Germany
- Signe Brinkløv
- Department of Bioscience, Wildlife Ecology, Aarhus University, 8410 Rønde, Denmark
- Anna N. Osiecka
- Department of Vertebrate Ecology and Zoology, Faculty of Biology, University of Gdańsk, 80-308 Gdańsk, Poland
- Jakob Tougaard
- Department of Bioscience, Marine Mammal Research, Aarhus University, 4000 Roskilde, Denmark
- Freja Jakobsen
- Department of Biology, University of Southern Denmark, 5230 Odense, Denmark
- Magnus Wahlberg
- Department of Biology, University of Southern Denmark, 5230 Odense, Denmark
- Elmar Nöth
- Pattern Recognition Lab, Department of Computer Science, Friedrich-Alexander-Universität Erlangen-Nürnberg, 91058 Erlangen, Germany
- Andreas Maier
- Pattern Recognition Lab, Department of Computer Science, Friedrich-Alexander-Universität Erlangen-Nürnberg, 91058 Erlangen, Germany
- Barbara C. Klump
- Cognitive and Cultural Ecology Lab, Max Planck Institute of Animal Behavior, 78315 Radolfzell, Germany
3
Cannet A, Simon-Chane C, Akhoundi M, Histace A, Romain O, Souchaud M, Jacob P, Delaunay P, Sereno D, Bousses P, Grebaut P, Geiger A, de Beer C, Kaba D, Sereno D. Wing Interferential Patterns (WIPs) and machine learning, a step toward automatized tsetse (Glossina spp.) identification. Sci Rep 2022;12:20086. PMID: 36418429. PMCID: PMC9684539. DOI: 10.1038/s41598-022-24522-w.
Abstract
A simple method for accurately identifying Glossina spp. in the field is a challenge for sustaining the future elimination of Human African Trypanosomiasis (HAT) as a public health scourge, as well as for the sustainable management of African Animal Trypanosomiasis (AAT). Current methods for Glossina species identification rely heavily on a few well-trained experts. Molecular approaches such as DNA barcoding or mass spectrometry protein profiling (MALDI-TOF) have not been thoroughly investigated for Glossina spp. Moreover, because they are destructive, time-consuming, and costly in infrastructure and materials, they may not be well suited to surveying the arthropod vectors involved in the transmission of pathogens responsible for Neglected Tropical Diseases, like HAT. This study demonstrates a new methodology for classifying Glossina species: a database of Wing Interference Patterns (WIPs) representative of the Glossina species involved in the transmission of HAT and AAT, used in conjunction with a deep learning architecture. The database contains 1766 pictures covering 23 Glossina species. This cost-effective methodology, which requires only mounting wings on slides and a commercially available microscope, demonstrates that WIPs are an excellent medium for automatically recognizing Glossina species with very high accuracy.
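The study pairs WIP photographs with a deep CNN; the sketch below substitutes a far simpler technique (nearest-centroid classification on synthetic "wing" images) purely to illustrate the template-like image classification task. All names, shapes, and data here are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-ins for WIP photographs: each "species" gets a fixed random
# template pattern; individual wings are the template plus pixel noise.
n_species, img_shape = 3, (16, 16)
templates = rng.standard_normal((n_species, *img_shape))

def make_wing(species, noise=0.3):
    return templates[species] + noise * rng.standard_normal(img_shape)

# "Training": 20 images per species, summarised by class centroids.
X = np.stack([make_wing(s) for s in range(n_species) for _ in range(20)])
y = np.repeat(np.arange(n_species), 20)
centroids = np.stack([X[y == s].mean(axis=0) for s in range(n_species)])

def classify(img):
    """Assign the class whose centroid is nearest in pixel space."""
    dists = ((centroids - img) ** 2).sum(axis=(1, 2))
    return int(np.argmin(dists))

# "Test": 10 fresh images per species
test_acc = np.mean([classify(make_wing(s)) == s
                    for s in range(n_species) for _ in range(10)])
```

A CNN earns its keep when species differ by subtle, spatially variable interference fringes rather than by a fixed template, but the input/output contract (wing image in, species label out) is the same.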
Affiliation(s)
- Arnaud Cannet
- Direction des affaires sanitaires et sociales de la Nouvelle-Calédonie, Nouméa, New Caledonia, France
- Camille Simon-Chane
- ETIS UMR 8051, Cergy Paris University, ENSEA, CNRS, 95000 Cergy, France
- Mohammad Akhoundi
- Parasitology-Mycology, Hôpital Avicenne, AP-HP, Bobigny, France
- Aymeric Histace
- ETIS UMR 8051, Cergy Paris University, ENSEA, CNRS, 95000 Cergy, France
- Olivier Romain
- ETIS UMR 8051, Cergy Paris University, ENSEA, CNRS, 95000 Cergy, France
- Marc Souchaud
- ETIS UMR 8051, Cergy Paris University, ENSEA, CNRS, 95000 Cergy, France
- Pierre Jacob
- ETIS UMR 8051, Cergy Paris University, ENSEA, CNRS, 95000 Cergy, France
- Pascal Delaunay
- Inserm U1065, Centre Méditerranéen de Médecine Moléculaire (C3M), Université de Nice-Sophia Antipolis, Nice, France; Parasitologie-Mycologie, Hôpital de L'Archet, Centre Hospitalier Universitaire de Nice (CHU), Nice, France; MIVEGEC, Univ Montpellier, CNRS, IRD, Montpellier, France
- Darian Sereno
- InterTryp, Univ Montpellier, IRD-CIRAD, Parasitology Infectiology and Public Health Research Group, Montpellier, France
- Philippe Bousses
- MIVEGEC, Univ Montpellier, CNRS, IRD, Montpellier, France
- Pascal Grebaut
- InterTryp, Univ Montpellier, IRD-CIRAD, Parasitology Infectiology and Public Health Research Group, Montpellier, France
- Anne Geiger
- InterTryp, Univ Montpellier, IRD-CIRAD, Parasitology Infectiology and Public Health Research Group, Montpellier, France
- Chantel de Beer
- Insect Pest Control Laboratory, Joint FAO/IAEA Center of Nuclear Techniques in Food and Agriculture, Vienna, Austria; Epidemiology, Parasites & Vectors, Agricultural Research Council - Onderstepoort Veterinary Research (ARC-OVR), Onderstepoort, South Africa
- Dramane Kaba
- Institut Pierre Richet, Institut National de Santé Publique, Abidjan, Côte d'Ivoire
- Denis Sereno
- InterTryp, Univ Montpellier, IRD-CIRAD, Parasitology Infectiology and Public Health Research Group, Montpellier, France; MIVEGEC, Univ Montpellier, CNRS, IRD, Montpellier, France
5
Stowell D. Computational bioacoustics with deep learning: a review and roadmap. PeerJ 2022;10:e13152. PMID: 35341043. PMCID: PMC8944344. DOI: 10.7717/peerj.13152.
Abstract
Animal vocalisations and natural soundscapes are fascinating objects of study, and contain valuable evidence about animal behaviours, populations and ecosystems. They are studied in bioacoustics and ecoacoustics, with signal processing and analysis an important component. Computational bioacoustics has accelerated in recent decades due to the growth of affordable digital sound recording devices, and to huge progress in informatics such as big data, signal processing and machine learning. Methods are inherited from the wider field of deep learning, including speech and image processing. However, the tasks, demands and data characteristics are often different from those addressed in speech or music analysis. There remain unsolved problems, and tasks for which evidence is surely present in many acoustic signals, but not yet realised. In this paper I perform a review of the state of the art in deep learning for computational bioacoustics, aiming to clarify key concepts and identify and analyse knowledge gaps. Based on this, I offer a subjective but principled roadmap for computational bioacoustics with deep learning: topics that the community should aim to address, in order to make the most of future developments in AI and informatics, and to use audio data in answering zoological and ecological questions.
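Most of the deep-learning systems surveyed in this review consume log-mel spectrograms rather than raw audio. As an illustration of that standard front-end, the numpy-only sketch below builds a triangular mel filterbank; real pipelines would typically use librosa or torchaudio equivalents, and all parameter choices here are assumptions.

```python
import numpy as np

def mel_filterbank(sr, n_fft, n_mels=8, fmin=0.0, fmax=None):
    """Triangular mel filterbank: maps an FFT power spectrum of
    n_fft // 2 + 1 bins down to n_mels perceptually spaced bands,
    the usual input features for deep bioacoustic models."""
    fmax = fmax or sr / 2
    def hz_to_mel(f):
        return 2595.0 * np.log10(1.0 + f / 700.0)
    def mel_to_hz(m):
        return 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    # n_mels filters need n_mels + 2 equally spaced mel anchor points
    mels = np.linspace(hz_to_mel(fmin), hz_to_mel(fmax), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        lo, c, hi = bins[m - 1], bins[m], bins[m + 1]
        for k in range(lo, c):            # rising edge of the triangle
            fb[m - 1, k] = (k - lo) / max(c - lo, 1)
        for k in range(c, hi):            # falling edge
            fb[m - 1, k] = (hi - k) / max(hi - c, 1)
    return fb

fb = mel_filterbank(sr=16000, n_fft=512)
# log-mel features would then be: np.log(power_spectrum @ fb.T + 1e-10)
```

The matrix multiplication at the end is the whole trick: each output band is a weighted sum of FFT bins, with wider bands at higher frequencies to mimic auditory resolution.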
Affiliation(s)
- Dan Stowell
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Tilburg, The Netherlands; Naturalis Biodiversity Center, Leiden, The Netherlands