1. Jing X, Li S, Zhu R, Ning X, Lin J. Miniature bioinspired artificial compound eyes: microfabrication technologies, photodetection and applications. Front Bioeng Biotechnol 2024;12:1342120. PMID: 38433824; PMCID: PMC10905626; DOI: 10.3389/fbioe.2024.1342120.
Abstract
As an outstanding visual system that enables insects and crustaceans to cope with the challenges of survival, the compound eye has many unique advantages, such as a wide field of view, rapid response, infinite depth of field, low aberration and fast motion capture. However, the complex composition of its optical system also presents significant manufacturing challenges. With the continuous development of advanced materials, complex 3D manufacturing technologies and flexible electronic detectors, various ingenious and sophisticated compound eye imaging systems have been developed. This paper provides a comprehensive review of the microfabrication technologies, photoelectric detection and functional applications of miniature artificial compound eyes. Firstly, a brief introduction to the types and structural composition of compound eyes in the natural world is provided. Secondly, 3D forming manufacturing techniques for miniature compound eyes are discussed. Subsequently, photodetection technologies for miniature curved compound eye imaging are introduced. Lastly, with reference to existing prototypes of functional applications of miniature compound eyes, their future development is discussed.
Affiliation(s)
- Xian Jing
- College of Electronic Science and Engineering, Jilin University, Changchun, China
- Jilin Provincial Key Laboratory of Micro/Nano and Ultra-precision Manufacturing, School of Mechatronic Engineering, Changchun University of Technology, Changchun, China
- Shitao Li
- Jilin Provincial Key Laboratory of Micro/Nano and Ultra-precision Manufacturing, School of Mechatronic Engineering, Changchun University of Technology, Changchun, China
- Rongxin Zhu
- Jilin Provincial Key Laboratory of Micro/Nano and Ultra-precision Manufacturing, School of Mechatronic Engineering, Changchun University of Technology, Changchun, China
- Xiaochen Ning
- Jilin Provincial Key Laboratory of Micro/Nano and Ultra-precision Manufacturing, School of Mechatronic Engineering, Changchun University of Technology, Changchun, China
- Jieqiong Lin
- Jilin Provincial Key Laboratory of Micro/Nano and Ultra-precision Manufacturing, School of Mechatronic Engineering, Changchun University of Technology, Changchun, China
2. de Croon GCHE, Dupeyroux JJG, Fuller SB, Marshall JAR. Insect-inspired AI for autonomous robots. Sci Robot 2022;7(67):eabl6334. DOI: 10.1126/scirobotics.abl6334.
Abstract
Autonomous robots are expected to perform a wide range of sophisticated tasks in complex, unknown environments. However, available onboard computing capabilities and algorithms represent a considerable obstacle to reaching higher levels of autonomy, especially as robots get smaller and the end of Moore's law approaches. Here, we argue that inspiration from insect intelligence is a promising alternative to classic methods in robotics for the artificial intelligence (AI) needed for the autonomy of small, mobile robots. The advantage of insect intelligence stems from its resource efficiency (or parsimony) especially in terms of power and mass. First, we discuss the main aspects of insect intelligence underlying this parsimony: embodiment, sensory-motor coordination, and swarming. Then, we take stock of where insect-inspired AI stands as an alternative to other approaches to important robotic tasks such as navigation and identify open challenges on the road to its more widespread adoption. Last, we reflect on the types of processors that are suitable for implementing insect-inspired AI, from more traditional ones such as microcontrollers and field-programmable gate arrays to unconventional neuromorphic processors. We argue that even for neuromorphic processors, one should not simply apply existing AI algorithms but exploit insights from natural insect intelligence to get maximally efficient AI for robot autonomy.
Affiliation(s)
- G C H E de Croon
- Micro Air Vehicle Laboratory, Faculty of Aerospace Engineering, TU Delft, Delft, Netherlands
- J J G Dupeyroux
- Micro Air Vehicle Laboratory, Faculty of Aerospace Engineering, TU Delft, Delft, Netherlands
- S B Fuller
- Autonomous Insect Robotics Laboratory, Department of Mechanical Engineering and Paul G. Allen School of Computer Science, University of Washington, Seattle, WA, USA
- J A R Marshall
- Opteran Technologies, Sheffield, UK
- Complex Systems Modeling Group, Department of Computer Science, University of Sheffield, Sheffield, UK
3. Cheng Y, Cao J, Zhang Y, Hao Q. Review of state-of-the-art artificial compound eye imaging systems. Bioinspir Biomim 2019;14:031002. PMID: 30654337; DOI: 10.1088/1748-3190/aaffb5.
Abstract
The natural compound eye has received much attention in recent years due to its remarkable properties, such as its large field of view (FOV), compact structure, and high sensitivity to moving objects. Many studies have been devoted to mimicking the imaging system of the natural compound eye. This paper reviews state-of-the-art artificial compound eye imaging systems. Firstly, we introduce the imaging principles of the three types of natural compound eye. Then, we divide current artificial compound eye imaging systems into four categories according to their structural composition, so that readers can easily grasp how to build such a system from a structural perspective. Moreover, we compare the imaging performance of state-of-the-art artificial compound eye imaging systems, which provides a reference for designing the system parameters of an artificial compound eye imaging system. Next, we present applications of the artificial compound eye imaging system, including imaging with a large FOV, imaging with high resolution, object distance detection, medical imaging, egomotion estimation, and navigation. Finally, an outlook for the artificial compound eye imaging system is given.
Affiliation(s)
- Yang Cheng
- Key Laboratory of Biomimetic Robots and Systems, Ministry of Education, Beijing Institute of Technology, Beijing, People's Republic of China
4. Colonnier F, Ramirez-Martinez S, Viollet S, Ruffier F. A bio-inspired sighted robot chases like a hoverfly. Bioinspir Biomim 2019;14:036002. PMID: 30654332; DOI: 10.1088/1748-3190/aaffa4.
Abstract
Here we present a novel bio-inspired visual processing system that enables a robot to locate and follow a target using an artificial compound eye called CurvACE. This visual sensor actively scanned the environment at an imposed frequency (50 Hz) with an angular scanning amplitude of [Formula: see text] and succeeded in locating a textured cylindrical target with hyperacuity, i.e. at a much finer resolution than the coarse inter-receptor angle of the compound eye. Equipped with this small, lightweight visual scanning sensor, a Mecanum-wheeled mobile robot named ACEbot was able to follow a target at a constant distance by localizing the right and left edges of the target. The localization of the target's contrasted edges is based on a bio-inspired summation of Gaussian receptive fields in the visual system. By means of its auto-adaptive pixels, ACEbot consistently achieved similar pursuit performance under various lighting conditions with a high level of repeatability. The robotic pursuit pattern finely mimics the behavior of the male hoverfly Syritta pipiens L. pursuing a female. The high similarity of the trajectories, as well as the biomimicry of the visual system, provides strong support for the hypothesis that flies keep the target centered and its subtended angle constant during smooth pursuit. Moreover, we discuss how such a simple strategy can also produce trajectories compatible with motion camouflage.
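The edge localization by summed Gaussian receptive fields mentioned above can be illustrated with a minimal numerical sketch (not the authors' implementation; the inter-receptor and acceptance angles below are assumed, illustrative values): two adjacent photoreceptors with overlapping Gaussian angular sensitivities respond to a contrasting edge, and their contrast-normalised difference varies smoothly and monotonically with the edge's angular position, so positions far finer than the inter-receptor angle can be recovered after calibration.

```python
import numpy as np

INTER_RECEPTOR_ANGLE = 4.2  # deg; assumed inter-receptor angle (illustrative)
FWHM = 4.2                  # deg; assumed Gaussian acceptance angle (illustrative)

def receptor_response(edge_pos_deg, center_deg):
    """Response of one photoreceptor with a Gaussian angular sensitivity
    centred on center_deg, viewing a dark-to-bright edge at edge_pos_deg."""
    sigma = FWHM / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    angles = np.linspace(center_deg - 4 * FWHM, center_deg + 4 * FWHM, 4001)
    sensitivity = np.exp(-0.5 * ((angles - center_deg) / sigma) ** 2)
    scene = (angles >= edge_pos_deg).astype(float)  # 0 = dark, 1 = bright
    return float((sensitivity * scene).sum() / sensitivity.sum())

def edge_signal(edge_pos_deg):
    """Contrast-normalised difference of two adjacent receptors. The result
    varies smoothly and monotonically with edge position, so an edge can be
    localised far more finely than the inter-receptor angle (hyperacuity)."""
    r1 = receptor_response(edge_pos_deg, -INTER_RECEPTOR_ANGLE / 2)
    r2 = receptor_response(edge_pos_deg, +INTER_RECEPTOR_ANGLE / 2)
    return (r2 - r1) / (r1 + r2)
```

Because the signal is monotonic over roughly one inter-receptor angle, a lookup table built during calibration maps it back to an edge position with sub-receptor precision.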
Affiliation(s)
- Fabien Colonnier
- Aix Marseille Univ, CNRS, ISM, Marseille, France. Temasek Labs, National University of Singapore, Singapore, Singapore
5. A Hybrid Bionic Image Sensor Achieving FOV Extension and Foveated Imaging. Sensors 2018;18:1042. PMID: 29601531; PMCID: PMC5948721; DOI: 10.3390/s18041042.
Abstract
Based on the bionic compound eye and human foveated imaging mechanisms, a hybrid bionic image sensor (HBIS) is proposed in this paper to extend the field of view (FOV) with high resolution. First, the hybrid bionic imaging model was developed and the structure parameters of the HBIS were deduced. Second, the properties of the HBIS were simulated, including FOV extension, super-resolution imaging, and the foveal ratio. Third, a prototype of the HBIS was developed to validate the theory. Imaging experiments were carried out, and the results agree with the simulations, demonstrating the potential of the HBIS for low-cost, large-FOV, high-resolution imaging.
6. Zhang K, Jung YH, Mikael S, Seo JH, Kim M, Mi H, Zhou H, Xia Z, Zhou W, Gong S, Ma Z. Origami silicon optoelectronics for hemispherical electronic eye systems. Nat Commun 2017;8:1782. PMID: 29176549; PMCID: PMC5701179; DOI: 10.1038/s41467-017-01926-1.
Abstract
Digital image sensors in hemispherical geometries offer unique imaging advantages over their planar counterparts, such as a wide field of view and low aberrations. Deforming miniature, high-spatial-resolution semiconductor-based sensors into such a format is challenging. Here we report a simple origami approach for fabricating single-crystalline silicon-based focal plane arrays and artificial compound eyes that have hemisphere-like structures. Convex isogonal polyhedral concepts allow certain combinations of polygons to fold into spherical formats. Using each polygon block as a sensor pixel, the silicon-based devices are shaped into maps of a truncated icosahedron, fabricated on flexible sheets and then folded into either a concave or a convex hemisphere. These two electronic-eye prototypes represent simple, low-cost methods with flexible optimization parameters in terms of pixel density and design. The results demonstrated in this work, combined with the miniature size and simplicity of the design, establish a practical technology for integration with conventional electronic devices.
Affiliation(s)
- Kan Zhang
- Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI, 53706, USA
- Yei Hwan Jung
- Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI, 53706, USA
- Solomon Mikael
- Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI, 53706, USA
- Jung-Hun Seo
- Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI, 53706, USA
- Munho Kim
- Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI, 53706, USA
- Hongyi Mi
- Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI, 53706, USA
- Han Zhou
- Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI, 53706, USA
- Zhenyang Xia
- Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI, 53706, USA
- Weidong Zhou
- Department of Electrical Engineering, University of Texas at Arlington, Arlington, TX, 76019, USA
- Shaoqin Gong
- Department of Biomedical Engineering and Wisconsin Institutes for Discovery, University of Wisconsin-Madison, Madison, WI, 53706, USA
- Zhenqiang Ma
- Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI, 53706, USA
7. Vanhoutte E, Mafrica S, Ruffier F, Bootsma RJ, Serres J. Time-of-Travel Methods for Measuring Optical Flow on Board a Micro Flying Robot. Sensors 2017;17:571. PMID: 28287484; PMCID: PMC5375857; DOI: 10.3390/s17030571.
Abstract
For use in autonomous micro air vehicles, visual sensors must not only be small, lightweight and insensitive to light variations; on-board autopilots also require fast and accurate optical flow measurements over a wide range of speeds. Using an auto-adaptive bio-inspired Michaelis-Menten Auto-adaptive Pixel (M²APix) analog silicon retina, in this article we present comparative tests of two optical flow calculation algorithms operating under lighting conditions from 6×10⁻⁷ to 1.6×10⁻² W·cm⁻² (i.e., from 0.2 to 12,000 lux for human vision). The contrast "time of travel" between two adjacent light-sensitive pixels was determined both by thresholding and by cross-correlating the two pixels' signals, with a measurement frequency of up to 5 kHz for the 10 local motion sensors of the M²APix sensor. While both algorithms adequately measured optical flow between 25°/s and 1000°/s, thresholding gave rise to lower precision, mainly due to a larger number of outliers at higher speeds. Compared to thresholding, cross-correlation also allowed a higher optical flow output rate (1195 Hz versus 99 Hz) but required substantially more computational resources.
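The cross-correlation variant of the "time of travel" scheme can be sketched as follows (a simplified model, not the paper's on-board code; the sampling rate and inter-pixel angle below are assumed, illustrative values): the lag that maximises the cross-correlation of two adjacent pixels' signals gives the travel time of a contrast feature, and the local optical flow is the inter-pixel angle divided by that time.

```python
import numpy as np

DELTA_PHI = 4.0  # deg; assumed angular separation of the two pixels (illustrative)
FS = 2000.0      # Hz; assumed sampling rate of the pixel signals (illustrative)

def time_of_travel_xcorr(p1, p2, fs=FS):
    """Travel time of a contrast from pixel 1 to pixel 2, taken as the lag
    that maximises the cross-correlation of the two mean-removed signals."""
    a = p1 - p1.mean()
    b = p2 - p2.mean()
    corr = np.correlate(b, a, mode="full")     # lags from -(N-1) to N-1 samples
    lag = int(np.argmax(corr)) - (len(a) - 1)  # positive if p2 lags p1
    return lag / fs                            # seconds

def optic_flow_deg_per_s(p1, p2, fs=FS, delta_phi=DELTA_PHI):
    """Local optical flow: inter-pixel angle divided by the time of travel."""
    dt = time_of_travel_xcorr(p1, p2, fs)
    return delta_phi / dt if dt != 0 else float("inf")
```

The thresholding variant would instead timestamp the moment each signal crosses a fixed level and subtract the two timestamps, which is cheaper but, as the abstract notes, more outlier-prone.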
Affiliation(s)
- Erik Vanhoutte
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288 Marseille Cedex 09, France.
- Stefano Mafrica
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288 Marseille Cedex 09, France.
- Franck Ruffier
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288 Marseille Cedex 09, France.
- Reinoud J Bootsma
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288 Marseille Cedex 09, France.
- Julien Serres
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288 Marseille Cedex 09, France.
8. Wu S, Jiang T, Zhang G, Schoenemann B, Neri F, Zhu M, Bu C, Han J, Kuhnert KD. Artificial compound eye: a survey of the state-of-the-art. Artif Intell Rev 2016. DOI: 10.1007/s10462-016-9513-7.
9. Pericet-Camara R, Dobrzynski MK, Juston R, Viollet S, Leitel R, Mallot HA, Floreano D. An artificial elementary eye with optic flow detection and compositional properties. J R Soc Interface 2016. PMID: 26202684; DOI: 10.1098/rsif.2015.0414.
Abstract
We describe a 2 mg artificial elementary eye whose structure and functionality are inspired by compound eye ommatidia. Its optical sensitivity and electronic architecture are sufficient to generate the signals required to measure local optic flow vectors in multiple directions. Multiple elementary eyes can be assembled to create a compound vision system of any desired shape and curvature spanning large fields of view. The system's configurability is validated by the fabrication of a flexible linear array of artificial elementary eyes capable of extracting optic flow over multiple visual directions.
Affiliation(s)
- Ramon Pericet-Camara
- Laboratory of Intelligent Systems, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Michal K Dobrzynski
- Laboratory of Intelligent Systems, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Raphaël Juston
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288, Marseille CEDEX 09, France
- Stéphane Viollet
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288, Marseille CEDEX 09, France
- Robert Leitel
- Fraunhofer Institute for Applied Optics and Precision Engineering, Jena, Germany
- Hanspeter A Mallot
- Laboratory of Cognitive Systems, Department of Biology, University of Tübingen, Tübingen, Germany
- Dario Floreano
- Laboratory of Intelligent Systems, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
10. Mafrica S, Godiot S, Menouni M, Boyron M, Expert F, Juston R, Marchand N, Ruffier F, Viollet S. A bio-inspired analog silicon retina with Michaelis-Menten auto-adaptive pixels sensitive to small and large changes in light. Opt Express 2015;23:5614-5635. PMID: 25836794; DOI: 10.1364/oe.23.005614.
Abstract
In this paper, we present: (i) a novel analog silicon retina featuring auto-adaptive pixels that obey the Michaelis-Menten law, i.e. V = V_m·I^n/(I^n + σ^n); (ii) a method of characterizing silicon retinas that makes it possible to accurately assess the pixels' response to transient luminous changes in a ±3-decade range, as well as changes in the initial steady-state intensity in a 7-decade range. The novel pixel, called M²APix, which stands for Michaelis-Menten Auto-Adaptive Pixel, can auto-adapt over a 7-decade range and responds appropriately to step changes of up to ±3 decades without saturating the Very Large Scale Integration (VLSI) transistors. Thanks to the intrinsic properties of the Michaelis-Menten equation, the pixel output always remains within a constant, limited voltage range. The range of the Analog-to-Digital Converter (ADC) was therefore adjusted so as to obtain a Least Significant Bit (LSB) voltage of 2.35 mV and an effective resolution of about 9 bits. The results presented here show that the M²APix produces a quasi-linear contrast response once it has adapted to the average luminosity. Unlike its biological counterparts, neither the light-change sensitivity nor the contrast response of the M²APix depends on the mean luminosity (i.e. the ambient lighting conditions). Lastly, a full comparison between the M²APix and the Delbrück auto-adaptive pixel is provided.
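The bounded output guaranteed by the Michaelis-Menten law can be checked with a short sketch (a first-order behavioral model, not the VLSI circuit; V_m, n and the adaptation rule below are illustrative assumptions): the response stays within [0, V_m) for any intensity, and letting σ track the local mean intensity makes the response to a fixed contrast independent of the ambient light level.

```python
import numpy as np

def mm_response(intensity, sigma, v_m=1.0, n=1.0):
    """Michaelis-Menten (Naka-Rushton) pixel response
    V = V_m * I**n / (I**n + sigma**n); v_m and n are illustrative values.
    The output is bounded in [0, V_m) for any input intensity."""
    return v_m * intensity**n / (intensity**n + sigma**n)

def adapted_response(intensity, mean_intensity, v_m=1.0, n=1.0):
    """Auto-adaptation sketch (an assumed first-order model, not the circuit):
    letting sigma track the local mean intensity makes the response to a
    fixed contrast (here, intensity / mean_intensity) independent of the
    ambient light level."""
    return mm_response(intensity, sigma=mean_intensity, v_m=v_m, n=n)
```

With sigma equal to the mean intensity, a stimulus at twice the mean always maps to V = 2/3·V_m (for n = 1), whether the scene is at 10⁻⁶ or 1 in intensity units, which mirrors the luminosity-invariant contrast response described above.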
11. Colonnier F, Manecy A, Juston R, Mallot H, Leitel R, Floreano D, Viollet S. A small-scale hyperacute compound eye featuring active eye tremor: application to visual stabilization, target tracking, and short-range odometry. Bioinspir Biomim 2015;10:026002. PMID: 25712307; DOI: 10.1088/1748-3190/10/2/026002.
Abstract
In this study, a miniature artificial compound eye (15 mm in diameter) called the curved artificial compound eye (CurvACE) was endowed for the first time with hyperacuity, using micro-movements similar to those occurring in the fly's compound eye. A periodic micro-scanning movement of only a few degrees enables the vibrating compound eye to locate contrasting objects with a 40-fold finer resolution than that imposed by the interommatidial angle. We developed a new algorithm merging the outputs of 35 local processing units consisting of adjacent pairs of artificial ommatidia. The local measurements performed by each pair are processed in parallel with very few computational resources, making it possible to reach a high refresh rate of 500 Hz. An aerial robotic platform with two degrees of freedom equipped with the active CurvACE, placed over naturally textured panels, was able to accurately assess its linear position with respect to the environment thanks to its efficient gaze stabilization system. The algorithm performed robustly under different lighting conditions and distance variations relative to the ground, with small closed-loop positioning errors of the robot in the range of 45 mm. In addition, three tasks of interest were performed without changing the algorithm: short-range odometry, visual stabilization, and tracking contrasting objects (hands) moving over a textured background.
Affiliation(s)
- Fabien Colonnier
- Aix-Marseille Université, CNRS, ISM UMR 7287, 13288, Marseille cedex 09, France