1. Sukhavasi SB, Sukhavasi SB, Elleithy K, Abuzneid S, Elleithy A. CMOS Image Sensors in Surveillance System Applications. Sensors 2021; 21:488. PMID: 33445557; PMCID: PMC7827463; DOI: 10.3390/s21020488.
Abstract
Recent advances in CMOS image sensor (CIS) technology enable their use in the most demanding surveillance fields: visual surveillance and intrusion detection in intelligent surveillance systems, aerial surveillance in war zones, Earth environmental surveillance by satellites in space monitoring, agricultural monitoring using wireless sensor networks and the Internet of Things, and driver assistance in the automotive field. This paper presents an overview of CMOS image sensor-based surveillance applications over the last decade, tabulating the design characteristics related to image quality, such as resolution, frame rate, dynamic range, and signal-to-noise ratio, as well as the processing technology. The CMOS image sensor models used in these applications are surveyed and tabulated by year and application.
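The kind of per-sensor tabulation the survey describes (resolution, frame rate, dynamic range, SNR, process technology, grouped by year and application) can be sketched as a simple record type. The field values and the helper below are illustrative only, not data from the paper.

```python
from dataclasses import dataclass

@dataclass
class SensorRecord:
    """One surveyed CMOS image sensor entry (all values here are made up)."""
    year: int
    application: str        # e.g. "intrusion detection", "aerial surveillance"
    resolution: str         # pixel array size
    frame_rate_fps: float
    dynamic_range_db: float
    snr_db: float
    process_nm: int         # CMOS process technology node

# Hypothetical entries, only to show the shape of the tabulation
records = [
    SensorRecord(2015, "visual surveillance", "1920x1080", 60.0, 70.0, 40.0, 90),
    SensorRecord(2019, "driver assistance", "1280x960", 45.0, 120.0, 42.0, 65),
]

def by_year(rows, year):
    """Filter surveyed sensors by publication year, as the survey's tables do."""
    return [r for r in rows if r.year == year]

print(len(by_year(records, 2019)))   # 1 matching entry
```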
Affiliation(s)
- Susrutha Babu Sukhavasi
- Department of Computer Science and Engineering, University of Bridgeport, Bridgeport, CT 06604, USA
- Suparshya Babu Sukhavasi
- Department of Computer Science and Engineering, University of Bridgeport, Bridgeport, CT 06604, USA
- Khaled Elleithy
- Department of Computer Science and Engineering, University of Bridgeport, Bridgeport, CT 06604, USA
- Corresponding author; Tel.: +1-203-576-4703
- Shakour Abuzneid
- Department of Computer Science and Engineering, University of Bridgeport, Bridgeport, CT 06604, USA
- Abdelrahman Elleithy
- Department of Computer Science, William Paterson University, Wayne, NJ 07470, USA
2. Steland A. Shrinkage for covariance estimation: asymptotics, confidence intervals, bounds and applications in sensor monitoring and finance. Stat Pap (Berl) 2018. DOI: 10.1007/s00362-018-1040-y.
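The general linear-shrinkage idea named in this title can be sketched as pulling the sample covariance toward a scaled identity target. The fixed weight `rho` below is an illustrative stand-in; the paper concerns data-driven choices, their asymptotics, and confidence intervals, none of which are reproduced here.

```python
def sample_covariance(xs, ys):
    """2x2 sample covariance of paired observations (minimal case, p = 2)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    return [[sxx, sxy], [sxy, syy]]

def shrink(S, rho):
    """Linear shrinkage toward a scaled identity target:
    (1 - rho) * S + rho * mu * I, with mu the average variance.
    rho here is a fixed illustrative weight, not the paper's estimator."""
    mu = (S[0][0] + S[1][1]) / 2.0
    return [[(1 - rho) * S[i][j] + (rho * mu if i == j else 0.0)
             for j in range(2)] for i in range(2)]

S = sample_covariance([1.0, 2.0, 4.0, 7.0], [1.0, 3.0, 2.0, 9.0])
Sigma = shrink(S, rho=0.5)
# Off-diagonal entries shrink toward zero while the trace is preserved.
```

A useful sanity check on this construction: the trace of the shrunk matrix equals the trace of the sample covariance, since the identity target carries the same average variance.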
3. Chen YL, Chiang HH, Chiang CY, Liu CM, Yuan SM, Wang JH. A vision-based driver nighttime assistance and surveillance system based on intelligent image sensing techniques and a heterogamous dual-core embedded system architecture. Sensors 2012; 12:2373-99. PMID: 22736956; PMCID: PMC3376597; DOI: 10.3390/s120302373.
Abstract
This study proposes a vision-based intelligent nighttime driver assistance and surveillance system (VIDASS) implemented as a set of embedded software components and modules, integrated into a component-based system framework on an embedded heterogeneous dual-core platform. To this end, the study develops and implements computer vision and sensing techniques for nighttime vehicle detection, collision-warning determination, and traffic-event recording. The proposed system processes the road-scene frames ahead of the host car, captured from CCD sensors mounted on the host vehicle. These vision-based sensing and processing technologies are integrated and implemented on an ARM-DSP heterogeneous dual-core embedded platform. Peripheral devices, including image-grabbing devices, communication modules, and other in-vehicle control devices, are also integrated to form an in-vehicle embedded vision-based nighttime driver assistance and surveillance system.
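The collision-warning determination step can be illustrated with a simple time-to-collision check. The distance and closing-speed inputs, and the 2-second threshold, are hypothetical stand-ins; in the paper these quantities would come from the vision-based nighttime vehicle detector, whose method is not reproduced here.

```python
def collision_warning(distance_m, closing_speed_mps, ttc_threshold_s=2.0):
    """Warn when time-to-collision with the lead vehicle drops below a
    threshold. Inputs would come from the vision-based vehicle detection
    stage; the values and threshold here are illustrative only."""
    if closing_speed_mps <= 0:       # lead vehicle holding distance or pulling away
        return False
    ttc = distance_m / closing_speed_mps
    return ttc < ttc_threshold_s

print(collision_warning(30.0, 20.0))   # TTC = 1.5 s -> True
print(collision_warning(30.0, 5.0))    # TTC = 6.0 s -> False
```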
Affiliation(s)
- Yen-Lin Chen
- Department of Computer Science and Information Engineering, National Taipei University of Technology, 1, Sec. 3, Chung-hsiao E. Rd., Taipei 10608, Taiwan
- Hsin-Han Chiang
- Department of Electrical Engineering, Fu Jen Catholic University, New Taipei City 24205, Taiwan
- Chuan-Yen Chiang
- Department of Computer Science, National Chiao Tung University, 1001 University Road, Hsinchu 30050, Taiwan
- Chuan-Ming Liu
- Department of Computer Science and Information Engineering, National Taipei University of Technology, 1, Sec. 3, Chung-hsiao E. Rd., Taipei 10608, Taiwan
- Shyan-Ming Yuan
- Department of Computer Science, National Chiao Tung University, 1001 University Road, Hsinchu 30050, Taiwan
- Jenq-Haur Wang
- Department of Computer Science and Information Engineering, National Taipei University of Technology, 1, Sec. 3, Chung-hsiao E. Rd., Taipei 10608, Taiwan
- Corresponding author; Tel.: +886-2-2771-2171 ext. 4238
4. Chen YL, Liang WY, Chiang CY, Hsieh TJ, Lee DC, Yuan SM, Chang YL. Vision-based finger detection, tracking, and event identification techniques for multi-touch sensing and display systems. Sensors 2011; 11:6868-92. PMID: 22163990; PMCID: PMC3231698; DOI: 10.3390/s110706868.
Abstract
This study presents efficient vision-based finger detection, tracking, and event identification techniques and a low-cost hardware framework for multi-touch sensing and display applications. The proposed approach uses a fast bright-blob segmentation process based on automatic multilevel histogram thresholding to extract the pixels of touch blobs obtained from scattered infrared light captured by a video camera. The advantage of this automatic multilevel thresholding approach is its robustness and adaptability when dealing with various ambient lighting conditions and spurious infrared noise. To extract the connected components of these touch blobs, a connected-component analysis procedure is applied to the bright pixels acquired in the previous stage. After extracting the touch blobs from each captured image frame, a blob tracking and event recognition process analyzes the spatial and temporal information of these touch blobs across consecutive frames to determine the possible touch events and actions performed by users. This process also refines the detection results and corrects for errors and occlusions introduced by noise during the blob extraction process. The proposed blob tracking and touch event recognition process includes two phases. First, the blob tracking phase associates the motion correspondence of blobs in successive frames by analyzing their spatial and temporal features. Second, the touch event recognition phase identifies meaningful touch events from the motion information of the touch blobs, such as finger moving, rotating, pressing, hovering, and clicking actions. Experimental results demonstrate that the proposed vision-based finger detection, tracking, and event identification system is feasible and effective for multi-touch sensing applications in various operational environments and conditions.
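The segmentation-then-grouping stages described above can be sketched as follows: threshold the frame to bright pixels, then group them with 4-connected component labeling and report one centroid per blob. A fixed brightness threshold stands in for the paper's automatic multilevel histogram thresholding, and the frame values are illustrative.

```python
def extract_blobs(frame, threshold):
    """Segment bright pixels and group them by 4-connected component
    labeling; returns one (row, col) centroid per blob. A minimal sketch
    of the bright-blob stage only; the paper derives the threshold
    automatically from a multilevel histogram, which is not done here."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one connected component of bright pixels
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                           frame[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                centroids.append((sum(p[0] for p in pixels) / len(pixels),
                                  sum(p[1] for p in pixels) / len(pixels)))
    return centroids

# Two bright touch blobs on a dark 5x6 frame (made-up intensities)
frame = [
    [0,   0,   0, 0, 0,   0],
    [0, 255, 255, 0, 0,   0],
    [0, 255, 255, 0, 0, 200],
    [0,   0,   0, 0, 0, 200],
    [0,   0,   0, 0, 0,   0],
]
print(len(extract_blobs(frame, 128)))   # 2 blobs
```

The per-blob centroids produced here are exactly the spatial features a subsequent frame-to-frame tracking phase would associate.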
Affiliation(s)
- Yen-Lin Chen
- Department of Computer Science and Information Engineering, National Taipei University of Technology, 1, Sec. 3, Chung-hsiao E. Rd., Taipei 10608, Taiwan
- Wen-Yew Liang
- Department of Computer Science and Information Engineering, National Taipei University of Technology, 1, Sec. 3, Chung-hsiao E. Rd., Taipei 10608, Taiwan
- Chuan-Yen Chiang
- Department of Computer Science, National Chiao Tung University, 1001 University Road, Hsinchu 30050, Taiwan
- Tung-Ju Hsieh
- Department of Computer Science and Information Engineering, National Taipei University of Technology, 1, Sec. 3, Chung-hsiao E. Rd., Taipei 10608, Taiwan
- Da-Cheng Lee
- Department of Computer Science and Information Engineering, National Taipei University of Technology, 1, Sec. 3, Chung-hsiao E. Rd., Taipei 10608, Taiwan
- Shyan-Ming Yuan
- Department of Computer Science, National Chiao Tung University, 1001 University Road, Hsinchu 30050, Taiwan
- Yang-Lang Chang
- Department of Electrical Engineering, National Taipei University of Technology, 1, Sec. 3, Chung-hsiao E. Rd., Taipei 10608, Taiwan
- Corresponding author; Tel.: +886-2-2771-2171 ext. 2156