1
Malligere Shivanna V, Guo JI. Object Detection, Recognition, and Tracking Algorithms for ADASs - A Study on Recent Trends. Sensors (Basel). 2023;24:249. PMID: 38203111; PMCID: PMC10781282; DOI: 10.3390/s24010249. Received 2023-09-28; revised 2023-12-13; accepted 2023-12-20; indexed 2024-01-12.
Abstract
Advanced driver assistance systems (ADASs) are becoming increasingly common in modern vehicles, as they not only improve safety and reduce accidents but also aid in smoother and easier driving. ADASs rely on a variety of sensors, such as cameras, radar, and lidar, often used in combination, to perceive their surroundings and identify and track objects on the road. The key components of ADASs are object detection, recognition, and tracking algorithms, which allow a vehicle to identify and track road users and objects such as other vehicles, pedestrians, cyclists, obstacles, traffic signs, and traffic lights. This information is then used to warn the driver of potential hazards or by the ADAS itself to take corrective action and avoid an accident. This paper reviews prominent state-of-the-art object detection, recognition, and tracking algorithms used in different ADAS functionalities. The paper begins by introducing the history and fundamentals of ADASs, followed by a review of recent trends in ADAS algorithms and their functionalities, along with the datasets employed. The paper concludes by discussing the future of object detection, recognition, and tracking algorithms for ADASs, and highlights the need for more research on detection, recognition, and tracking in challenging environments, such as those with low visibility or high traffic density.
Grants
- 112-2218-E-A49-027-, National Science and Technology Council (NSTC), Taiwan, R.O.C.
- 112-2218-E-002-042-, National Science and Technology Council (NSTC), Taiwan, R.O.C.
- 111-2622-8-A49-023-, National Science and Technology Council (NSTC), Taiwan, R.O.C.
- 111-2221-E-A49-126-MY3, National Science and Technology Council (NSTC), Taiwan, R.O.C.
- 111-2634-F-A49-013-, National Science and Technology Council (NSTC), Taiwan, R.O.C.
- 110-2221-E-A49-145-MY3, National Science and Technology Council (NSTC), Taiwan, R.O.C.
Affiliation(s)
- Vinay Malligere Shivanna: Department of Electrical Engineering, Institute of Electronics, National Yang Ming Chiao Tung University, Hsinchu City 30010, Taiwan
- Jiun-In Guo: Department of Electrical Engineering, Institute of Electronics, National Yang Ming Chiao Tung University, Hsinchu City 30010, Taiwan; Pervasive Artificial Intelligence Research (PAIR) Labs, National Yang Ming Chiao Tung University, Hsinchu City 30010, Taiwan; eNeural Technologies Inc., Hsinchu City 30010, Taiwan
2
Haider A, Pigniczki M, Koyama S, Köhler MH, Haas L, Fink M, Schardt M, Nagase K, Zeh T, Eryildirim A, Poguntke T, Inoue H, Jakobi M, Koch AW. A Methodology to Model the Rain and Fog Effect on the Performance of Automotive LiDAR Sensors. Sensors (Basel). 2023;23:6891. PMID: 37571674; PMCID: PMC10422612; DOI: 10.3390/s23156891. Received 2023-06-20; revised 2023-07-24; accepted 2023-07-28; indexed 2023-08-13.
Abstract
In this work, we introduce a novel approach to modeling the effect of rain and fog on light detection and ranging (LiDAR) sensor performance for the simulation-based testing of LiDAR systems. The proposed methodology simulates the rain and fog effect through rigorous application of Mie scattering theory, in the time domain for transient analyses and on the point cloud level for spatial analyses. The time-domain analysis allows us to benchmark the attenuation and signal-to-noise ratio (SNR) of the virtual LiDAR signal caused by rain and fog droplets. In addition, the detection rate (DR), false detection rate (FDR), and distance error (d_error) of the virtual LiDAR sensor due to rain and fog droplets are evaluated on the point cloud level. The mean absolute percentage error (MAPE) is used to quantify the agreement between simulation and real measurement results on the time-domain and point cloud levels for the rain and fog droplets. The simulation and real measurement results match well on both levels when the simulated and real rain distributions are the same. Both the real and virtual LiDAR sensor performance degrade more under the influence of fog droplets than under rain.
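The evaluation metrics named in this abstract (MAPE, DR, FDR) are standard quantities; as an illustration only, a minimal sketch of how they could be computed over simulated versus measured LiDAR returns. All array names, values, and thresholds here are hypothetical and not taken from the paper:

```python
import numpy as np

def mape(measured, simulated):
    """Mean absolute percentage error between real and simulated signals."""
    measured = np.asarray(measured, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 100.0 * np.mean(np.abs((measured - simulated) / measured))

def detection_rates(expected_hits, detected_hits, false_hits):
    """Detection rate (DR) and false detection rate (FDR) on the point cloud level."""
    dr = detected_hits / expected_hits
    fdr = false_hits / (detected_hits + false_hits)
    return dr, fdr

# Hypothetical peak intensities of real vs. simulated returns under rain
real_peaks = [0.82, 0.75, 0.69]
sim_peaks = [0.80, 0.77, 0.66]
print(round(mape(real_peaks, sim_peaks), 2))  # percentage error between the two traces
```

The same MAPE function can be applied to transient waveforms (time domain) or to per-target range estimates (point cloud level), which is the two-level comparison the abstract describes.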
Affiliation(s)
- Arsalan Haider: Institute for Driver Assistance Systems and Connected Mobility (IFM), Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany; Institute for Measurement Systems and Sensor Technology, Technical University of Munich, Theresienstrasse 90, 80333 Munich, Germany
- Marcell Pigniczki: Institute for Driver Assistance Systems and Connected Mobility (IFM), Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany
- Shotaro Koyama: Advanced Vehicle Research Institute, Kanagawa Institute of Technology, Shimoogino 1030, Atsugi 243-0292, Japan
- Lukas Haas: Institute for Driver Assistance Systems and Connected Mobility (IFM), Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany
- Maximilian Fink: Institute for Measurement Systems and Sensor Technology, Technical University of Munich, Theresienstrasse 90, 80333 Munich, Germany
- Koji Nagase: Advanced Vehicle Research Institute, Kanagawa Institute of Technology, Shimoogino 1030, Atsugi 243-0292, Japan
- Thomas Zeh: Institute for Driver Assistance Systems and Connected Mobility (IFM), Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany
- Tim Poguntke: Institute for Driver Assistance Systems and Connected Mobility (IFM), Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany
- Hideo Inoue: Advanced Vehicle Research Institute, Kanagawa Institute of Technology, Shimoogino 1030, Atsugi 243-0292, Japan
- Martin Jakobi: Institute for Measurement Systems and Sensor Technology, Technical University of Munich, Theresienstrasse 90, 80333 Munich, Germany
- Alexander W. Koch: Institute for Measurement Systems and Sensor Technology, Technical University of Munich, Theresienstrasse 90, 80333 Munich, Germany
3
Haider A, Cho Y, Pigniczki M, Köhler MH, Haas L, Kastner L, Fink M, Schardt M, Cichy Y, Koyama S, Zeh T, Poguntke T, Inoue H, Jakobi M, Koch AW. Performance Evaluation of MEMS-Based Automotive LiDAR Sensor and Its Simulation Model as per ASTM E3125-17 Standard. Sensors (Basel). 2023;23:3113. PMID: 36991824; PMCID: PMC10056070; DOI: 10.3390/s23063113. Received 2023-02-09; revised 2023-03-07; accepted 2023-03-08; indexed 2023-06-19.
Abstract
Measurement performance evaluation of real and virtual automotive light detection and ranging (LiDAR) sensors is an active area of research. However, no commonly accepted automotive standards, metrics, or criteria exist to evaluate their measurement performance. ASTM International released the ASTM E3125-17 standard for the operational performance evaluation of 3D imaging systems, commonly referred to as terrestrial laser scanners (TLS). This standard defines the specifications and static test procedures for evaluating the 3D imaging and point-to-point distance measurement performance of TLS. In this work, we assessed the 3D imaging and point-to-point distance estimation performance of a commercial micro-electro-mechanical system (MEMS)-based automotive LiDAR sensor and its simulation model according to the test procedures defined in this standard. The static tests were performed in a laboratory environment; a subset was also performed at the proving ground under natural environmental conditions to determine the performance of the real LiDAR sensor. The real scenarios and environmental conditions were then replicated in the virtual environment of commercial simulation software to verify the working performance of the LiDAR model. The evaluation results show that the LiDAR sensor and its simulation model under analysis pass all the tests specified in the ASTM E3125-17 standard. The standard also helps determine whether sensor measurement errors stem from internal or external influences. We further show that the 3D imaging and point-to-point distance estimation performance of LiDAR sensors significantly impacts the working performance of object recognition algorithms. The standard can therefore be beneficial for validating real and virtual automotive LiDAR sensors, at least in the early stages of development. Furthermore, the simulation and real measurements show good agreement at the point cloud and object recognition levels.
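The point-to-point distance test this abstract refers to reduces to comparing a measured inter-target distance against a reference value from a higher-accuracy instrument. A minimal sketch of that comparison; the point coordinates and reference distance below are invented for illustration and are not taken from the paper or from the standard's test geometry:

```python
import numpy as np

def point_to_point_error(p1, p2, reference_distance):
    """Error between the measured distance of two scanned target points
    and a reference distance obtained from a higher-accuracy instrument."""
    measured = float(np.linalg.norm(np.asarray(p1, dtype=float) - np.asarray(p2, dtype=float)))
    return measured - reference_distance, measured

# Hypothetical target centers extracted from a LiDAR point cloud (meters)
error, measured = point_to_point_error([0.0, 0.0, 0.0], [3.0, 4.0, 0.0], 5.002)
print(f"measured = {measured:.3f} m, error = {error * 1000:.1f} mm")
```

In practice the target centers would be fitted from many returns on each target, and the pass/fail decision would follow the tolerances the standard specifies; this sketch only shows the core distance comparison.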
Affiliation(s)
- Arsalan Haider: IFM—Institute for Advanced Driver Assistance Systems and Connected Mobility, Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany; Institute for Measurement Systems and Sensor Technology, Technical University of Munich, Theresienstr. 90, 80333 Munich, Germany
- Yongjae Cho: IFM—Institute for Advanced Driver Assistance Systems and Connected Mobility, Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany
- Marcell Pigniczki: IFM—Institute for Advanced Driver Assistance Systems and Connected Mobility, Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany
- Lukas Haas: IFM—Institute for Advanced Driver Assistance Systems and Connected Mobility, Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany
- Ludwig Kastner: IFM—Institute for Advanced Driver Assistance Systems and Connected Mobility, Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany
- Maximilian Fink: Institute for Measurement Systems and Sensor Technology, Technical University of Munich, Theresienstr. 90, 80333 Munich, Germany
- Yannik Cichy: IPG Automotive GmbH, Bannwaldallee 60, 76185 Karlsruhe, Germany
- Shotaro Koyama: Department of Vehicle System Engineering, Kanagawa Institute of Technology, Shimoogino 1030, Atsugi 243-0292, Kanagawa, Japan
- Thomas Zeh: IFM—Institute for Advanced Driver Assistance Systems and Connected Mobility, Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany
- Tim Poguntke: IFM—Institute for Advanced Driver Assistance Systems and Connected Mobility, Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany
- Hideo Inoue: Department of Vehicle System Engineering, Kanagawa Institute of Technology, Shimoogino 1030, Atsugi 243-0292, Kanagawa, Japan
- Martin Jakobi: Institute for Measurement Systems and Sensor Technology, Technical University of Munich, Theresienstr. 90, 80333 Munich, Germany
- Alexander W. Koch: Institute for Measurement Systems and Sensor Technology, Technical University of Munich, Theresienstr. 90, 80333 Munich, Germany