1. A Review of Unmanned System Technologies with Its Application to Aquaculture Farm Monitoring and Management. Drones 2022. DOI: 10.3390/drones6010012
Abstract
This paper provides an overview of the capabilities of unmanned systems to monitor and manage aquaculture farms in support of precision aquaculture using the Internet of Things. Aquaculture farms are located in diverse environments, which poses a significant accessibility challenge. For offshore fish cages, continuous monitoring is difficult and risky because of waves, water currents, and other underwater environmental factors. Aquaculture farm management and surveillance operations require collecting data on water quality, water pollutants, water temperature, fish behavior, and current/wave velocity, which demands tremendous labor cost and effort. Unmanned vehicle technologies can execute these functions with greater efficiency and accuracy; equipped with suitable sensors, they are even capable of cage detection and illegal-fishing surveillance. Additionally, to address larger-scale deployments, this document explores the capacity of unmanned vehicles to act as communication gateways for offshore cages equipped with robust, low-cost sensors capable of underwater and in-air wireless connectivity. The capabilities of existing commercial systems, the Internet of Things, and artificial intelligence combined with drones are also presented to outline a precision aquaculture framework.
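The gateway role described in this abstract can be made concrete with a small sketch. Everything below, the node identifiers, the reading fields, and the JSON batch format, is a hypothetical illustration of a UAV data-mule pattern for cage-mounted IoT sensors; the paper does not specify a protocol.

```python
# Minimal sketch: a UAV acting as a data-mule gateway for offshore
# cage sensors. Node names, fields, and the uplink format are
# assumptions for illustration, not taken from the paper.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class WaterQualityReading:
    node_id: str          # hypothetical sensor node identifier
    timestamp: float      # UNIX time of the measurement
    temperature_c: float  # water temperature
    dissolved_o2_mgl: float
    turbidity_ntu: float

def poll_node(node_id: str) -> WaterQualityReading:
    """Stand-in for a short-range (acoustic or Wi-Fi) query to one cage node."""
    # A real system would exchange packets with the buoy modem here.
    return WaterQualityReading(node_id, time.time(), 18.4, 7.9, 3.2)

def collect_and_forward(node_ids: list[str]) -> str:
    """Visit each cage node, buffer readings, and serialize one uplink batch."""
    batch = [asdict(poll_node(n)) for n in node_ids]
    # The UAV transmits this batch once it regains shore connectivity.
    return json.dumps({"gateway": "uav-01", "readings": batch})

if __name__ == "__main__":
    print(collect_and_forward(["cage-A", "cage-B", "cage-C"]))
```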
2. Chavez AG, Mueller CA, Doernbach T, Birk A. Underwater navigation using visual markers in the context of intervention missions. Int J Adv Robot Syst 2019. DOI: 10.1177/1729881419838967
Abstract
Intervention missions, that is, underwater manipulation tasks such as those arising in oil and gas production, require highly precise, robust navigation. In this article, we describe an advanced vision system suited for deep-sea operations which, in combination with artificial markers on target structures such as oil and gas production Christmas trees, significantly boosts navigation performance. The system is validated in two intensive field tests off the coast of Marseille, France. In the experiments, a commercial remotely operated vehicle equipped with the system and a mock-up structure with an oil and gas production panel are used to evaluate the navigation performance.
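The authors' deep-sea vision system is not reproduced here, but the general marker-based pose estimation it builds on can be sketched with OpenCV's ArUco module (assuming OpenCV >= 4.7 for the ArucoDetector API). The dictionary choice, marker size, and camera intrinsics below are illustrative assumptions, not values from the paper.

```python
# Illustrative marker-based pose estimation with OpenCV's ArUco module.
# This is NOT the authors' deep-sea vision system; the marker dictionary,
# marker size, and camera intrinsics are assumed values for the sketch.
import cv2
import numpy as np

MARKER_SIZE_M = 0.20  # assumed physical marker edge length in meters

# Assumed pinhole intrinsics; a real system would use calibrated values.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
DIST = np.zeros(5)  # assume negligible lens distortion for the sketch

def estimate_marker_pose(frame: np.ndarray):
    """Detect ArUco markers and return (rvec, tvec) of the first one found."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None
    # 3D corner coordinates of a square marker centered at its origin,
    # in the order detectMarkers reports: TL, TR, BR, BL.
    half = MARKER_SIZE_M / 2.0
    obj_pts = np.array([[-half, half, 0], [half, half, 0],
                        [half, -half, 0], [-half, -half, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0].reshape(4, 2), K, DIST)
    return (rvec, tvec) if ok else None
```

The returned rotation and translation give the marker pose in the camera frame; inverting that transform yields the vehicle pose relative to the marked structure, which is what marker-aided navigation feeds back to the controller.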
Affiliation(s)
- Arturo Gomez Chavez, Christian A Mueller, Tobias Doernbach, Andreas Birk: Department of Computer Science and Electrical Engineering, Jacobs University Bremen, Bremen, Germany
3. Multi-data sensor fusion framework to detect transparent object for the efficient mobile robot mapping. International Journal of Intelligent Unmanned Systems 2019. DOI: 10.1108/ijius-05-2018-0013
Abstract
Purpose
Efficient perception of complex environments is a foremost requirement in mobile robotics. Glass walls and automated transparent doors have become a highlight of modern interior design, and they cause various range sensors to perceive the environment incorrectly. The perception generated by multi-data sensor fusion (MDSF) of sonar and laser is fairly consistent in detecting glass but is still affected by issues such as sensor inaccuracies, sensor reliability, scan mismatching due to glass, the sensor model, probabilistic approaches to sensor fusion, and sensor registration. This paper aims to address these issues.
Design/methodology/approach
This paper presents a modified framework, the Advanced Laser and Sonar Framework (ALSF), which fuses the sensory information of a laser scanner and sonar to reduce the uncertainty caused by glass in an environment by selecting the optimal range information corresponding to a selected threshold value; a minimal sketch of this fusion rule follows the abstract below. In the proposed approach, the conventional sonar sensor model is also modified to reduce incorrect perception in sonar arising from diverse range measurements, and the laser scan matching algorithm is modified by removing small clusters of laser points (with respect to range information) to obtain efficient perception.
Findings
With the modified sonar sensor model, the probability of occupied cells becomes consistent across diverse sonar range measurements. The scan matching technique is also modified to reduce the uncertainty caused by glass and the high computational load, enabling efficient and fast pose estimation of the laser sensor/mobile robot and robust mapping. These modifications are combined in the proposed ALSF technique to reduce the uncertainty caused by glass, inconsistent probabilities, and heavy computation during the generation of occupancy grid maps with MDSF. Various real-world experiments are performed with the proposed approach on a mobile robot fitted with laser and sonar, and the obtained results are compared qualitatively and quantitatively with conventional approaches.
Originality/value
The proposed ALSF approach generates an efficient perception of complex environments containing glass and can be implemented in various robotics applications.
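As promised above, here is a minimal sketch of the general idea the abstract describes: a threshold-based choice between laser and sonar ranges feeding a standard log-odds occupancy grid. The threshold value, log-odds increments, and grid parameters are assumptions for illustration, not the authors' ALSF implementation.

```python
# Minimal sketch of threshold-based laser/sonar fusion feeding a
# log-odds occupancy grid. Threshold and increments are assumed values,
# not parameters from the ALSF paper.
import numpy as np

L_OCC, L_FREE = 0.85, -0.4   # assumed log-odds increments
GLASS_GAP_M = 0.5            # assumed disagreement threshold in meters

def fuse_ranges(laser_r: float, sonar_r: float) -> float:
    """Pick the more trustworthy range. A laser beam passes through glass
    and over-reads, so when the two sensors disagree by more than the
    threshold, prefer the (shorter) sonar return."""
    if abs(laser_r - sonar_r) > GLASS_GAP_M:
        return min(laser_r, sonar_r)
    return laser_r  # sensors agree: the laser is the more precise one

def update_cell(logodds: np.ndarray, cell: tuple[int, int], occupied: bool):
    """Standard log-odds occupancy update for one grid cell."""
    logodds[cell] += L_OCC if occupied else L_FREE

# Example: one beam where the laser sees through a glass pane (4.8 m)
# but the sonar reflects off it (1.9 m); the fused range keeps the glass.
grid = np.zeros((100, 100))
fused = fuse_ranges(laser_r=4.8, sonar_r=1.9)
update_cell(grid, (10, 20), occupied=True)
print(f"fused range: {fused:.1f} m")
```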
4. Map-based localization and loop-closure detection from a moving underwater platform using flow features. Auton Robots 2018. DOI: 10.1007/s10514-018-9797-3
5. Muhammad N, Toming G, Tuhtan JA, Musall M, Kruusmaa M. Underwater map-based localization using flow features. Auton Robots 2016. DOI: 10.1007/s10514-016-9558-0
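Entries 4 and 5 both concern localizing an underwater platform against a precomputed map of flow features. As a generic illustration of that idea, not the authors' pipeline, the sketch below runs a 1D bootstrap particle filter against an assumed scalar flow-speed map; the map shape, noise levels, and motion model are all illustrative assumptions.

```python
# Generic sketch of map-based localization from scalar flow readings.
# The flow map, noise parameters, and motion model are assumptions
# for illustration; they do not come from either paper.
import numpy as np

rng = np.random.default_rng(0)
flow_map = np.sin(np.linspace(0, 3 * np.pi, 200)) + 2.0  # assumed m/s along a channel
SIGMA_MEAS, SIGMA_MOVE = 0.1, 1.0

def pf_step(particles, weights, move_cells, measured_speed):
    """One predict/update/resample cycle of a bootstrap particle filter."""
    # Predict: propagate particles with noisy platform motion.
    particles = particles + move_cells + rng.normal(0, SIGMA_MOVE, particles.size)
    particles = np.clip(particles, 0, flow_map.size - 1)
    # Update: weight by how well the mapped flow matches the sensed flow.
    expected = flow_map[particles.astype(int)]
    weights = weights * np.exp(-0.5 * ((expected - measured_speed) / SIGMA_MEAS) ** 2)
    weights /= weights.sum()
    # Resample to avoid weight degeneracy.
    idx = rng.choice(particles.size, particles.size, p=weights)
    return particles[idx], np.full(particles.size, 1.0 / particles.size)

particles = rng.uniform(0, flow_map.size, 500)
weights = np.full(500, 1.0 / 500)
true_pos = 20
for _ in range(30):
    true_pos += 2
    z = flow_map[true_pos] + rng.normal(0, SIGMA_MEAS)
    particles, weights = pf_step(particles, weights, 2.0, z)
print(f"true cell: {true_pos}, estimate: {particles.mean():.1f}")
```

Because many cells of a smooth flow map share the same flow speed, a single reading is ambiguous; it is the sequence of readings along the platform's motion that collapses the particle cloud, which is the core idea behind map-based localization from flow features.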
6.
Affiliation(s)
- Brian Claus, Ralf Bachmayer: Department of Ocean and Naval Architecture Engineering, Memorial University, St. John's, Newfoundland, Canada