Sensors in ADAS
I. Introduction
Advanced Driver Assistance Systems (ADAS) have transformed the automotive
industry by integrating a diverse array of sensors to enhance vehicle safety, precision,
and functionality. Among these sensors, radar, cameras, ultrasonic sensors, LIDAR,
GNSS, GPS, and IMU play pivotal roles, enabling vehicles to detect, analyze, and
respond to their surroundings. This comprehensive training delves into the intricate
workings of these sensors, their specific applications in ADAS, and their collective
contribution to the evolution of increasingly autonomous driving.
II. Automotive Radar
1. What is Radar?
RADAR (RAdio Detection And Ranging) uses radio waves to measure the distance,
angle, and velocity of objects. It operates on the principle of electromagnetic radiation
to detect and track objects around the vehicle.
2. Basic Working of Radar:
Radar functions by emitting radio waves and detecting their reflections. The process
involves four stages: transmission, reflection, reception, and analysis.
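To make the analysis stage concrete, here is a minimal Python sketch of the two basic
relations a radar exploits: range from the round-trip echo delay (R = c·t/2) and relative
speed from the Doppler shift (v = f_d·λ/2). The 77 GHz carrier and the sample values are
illustrative assumptions, not figures from this training.

    # Illustrative sketch: how an echo maps to range and relative speed.
    C = 3.0e8          # speed of light, m/s
    F_CARRIER = 77e9   # assumed carrier: 77 GHz, a common automotive band

    def range_from_delay(round_trip_s):
        """Target range from the round-trip echo delay: R = c * t / 2."""
        return C * round_trip_s / 2.0

    def speed_from_doppler(doppler_hz):
        """Relative radial speed from the Doppler shift: v = f_d * lambda / 2."""
        wavelength = C / F_CARRIER
        return doppler_hz * wavelength / 2.0

    print(range_from_delay(1e-6))      # a 1 us round trip -> 150.0 m
    print(speed_from_doppler(5000.0))  # a 5 kHz shift -> ~9.7 m/s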
3. Radar Field of View (FoV):
A crucial aspect of any radar system is its Field of View (FoV), which determines the
area within which the sensor can effectively detect objects. If an object falls completely
outside the FoV, the sensor cannot detect it. Radars are categorized based on their
specific characteristics and applications.
These include Short Range Radar (SRR), which operates within a maximum range of
approximately 50 meters and features a wide opening angle. Medium Range Radar
(MRR) covers a maximum range of approximately 80-100 meters with a wide to
medium opening angle. Long Range Radar (LRR) extends its reach to a maximum
range of approximately 250-300 meters but has a narrower opening angle.
Each category of radar is designed to suit different operational requirements, with
varying detection ranges and angles optimized for specific use cases. Understanding
the Field of View is essential for deploying radar systems effectively in diverse
scenarios.
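The sketch below illustrates the FoV idea under simple assumptions: a target is
detectable only if it lies within both the sensor's maximum range and its opening angle.
The maximum ranges follow the approximate figures above; the half-angles are placeholder
assumptions, since the text gives no exact values.

    # Approximate (max range m, opening half-angle deg) per radar class.
    # Ranges follow the text above; the half-angles are assumed placeholders.
    RADAR_CLASSES = {
        "SRR": (50.0, 75.0),    # short range, wide angle
        "MRR": (100.0, 45.0),   # medium range, medium angle
        "LRR": (300.0, 15.0),   # long range, narrow angle
    }

    def in_fov(radar, target_range_m, target_azimuth_deg):
        """A target is detectable only if inside both range and angle limits."""
        max_range, half_angle = RADAR_CLASSES[radar]
        return target_range_m <= max_range and abs(target_azimuth_deg) <= half_angle

    print(in_fov("LRR", 250.0, 5.0))   # True: far ahead, nearly on boresight
    print(in_fov("LRR", 250.0, 40.0))  # False: outside the narrow opening angle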
4. Why Radar?
Radar is a pivotal sensor choice for reasons both functional and physical. Functionally,
radar systems excel at measuring the relative speed, distance, and angle of a target,
and can also determine the Radar Cross Section (RCS), a characteristic related to the
type of object. Notably, radar performs robustly in adverse weather and varying light
conditions, making it reliable in challenging environments.
Moreover, radars offer versatility in their specifications, providing options with different
ranges, angles, and speed coverage to suit diverse application needs. On the physical
front, these sensors are compact, making them easy to mount within vehicles. Mounted
around the vehicle, multiple radar sensors can together cover a full 360° around the
car, contributing to enhanced situational awareness. Furthermore, radar
systems are characterized by their robust construction, ensuring durability and
reliability in diverse operational conditions. Overall, radar emerges as a valuable
technology due to its functional capabilities and physical attributes, making it a
preferred choice in a range of applications.
5. Radar Digital Signal Processing
Radar Digital Signal Processing (DSP) plays a pivotal role in target analysis and
detection. The process involves a sequential application of key algorithms: Range
Fast Fourier Transform (FFT) for determining target distance, Doppler FFT for
calculating relative speed, Angle Estimation through angle FFT to ascertain target
angle, and finally, Detection Generation for precise identification. This streamlined
approach enables radar systems to efficiently process incoming signals, providing
accurate and timely information essential for effective target detection and tracking.
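As a rough illustration of the first two steps, the numpy sketch below builds a synthetic
single-antenna FMCW frame and recovers a planted target's range and Doppler bins. The
angle FFT requires multiple antennas and is omitted, and the bare peak pick stands in for
a real detector such as CFAR, an assumption since the text does not name the detection
algorithm.

    import numpy as np

    # Synthetic single-antenna FMCW frame: 64 chirps x 256 samples per chirp.
    # The bin indices of the planted target are arbitrary illustrative values.
    n_chirps, n_samples = 64, 256
    range_bin, doppler_bin = 40, 10

    t = np.arange(n_samples) / n_samples
    chirp_idx = np.arange(n_chirps)[:, None]
    # Beat signal: a tone in fast time (range) whose phase advances from
    # chirp to chirp (Doppler).
    cube = np.exp(2j * np.pi * (range_bin * t + doppler_bin * chirp_idx / n_chirps))

    # Step 1: range FFT along each chirp (fast time).
    range_fft = np.fft.fft(cube, axis=1)
    # Step 2: Doppler FFT across chirps (slow time) for each range bin.
    range_doppler = np.fft.fft(range_fft, axis=0)

    # Step 3: detection - a bare peak pick standing in for CFAR thresholding.
    peak = np.unravel_index(np.abs(range_doppler).argmax(), range_doppler.shape)
    print("(doppler bin, range bin):", peak)  # recovers (10, 40)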
III. Camera (Vision System)
1. Camera:
A camera is a passive light sensor that produces a digital image of the region it
covers. Its capabilities extend to identifying road signs, traffic lights, and lane
markings, contributing significantly to Advanced Driver Assistance Systems (ADAS).
2. Types of Cameras for ADAS:
ADAS vision systems commonly use several camera types, including monocular front
cameras, stereo cameras for depth estimation, surround-view cameras for near-field
coverage, and infrared cameras for night vision.
4. Open Source Computer Vision Library (OpenCV) – for Algorithm
Development:
The Open Source Computer Vision Library (OpenCV) serves as a cross-platform
library for algorithm development. Its versatility makes it a valuable tool for creating
and implementing algorithms in computer vision applications for ADAS, contributing
to the advancement of vision-based technologies in the automotive industry.
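As a small illustration of the kind of algorithm OpenCV supports, the sketch below finds
lane-like line segments with a Canny edge detector and a probabilistic Hough transform.
This is a minimal example rather than a production lane-detection pipeline; the file
names and thresholds are placeholder assumptions.

    import cv2
    import numpy as np

    # "road.jpg" is a placeholder path; any forward-facing road image works.
    frame = cv2.imread("road.jpg")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)

    # Edge detection, then a probabilistic Hough transform to pick out
    # straight, lane-like segments. Thresholds are typical starting values.
    edges = cv2.Canny(blurred, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=20)

    # Draw the detected segments back onto the frame and save the result.
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.imwrite("lanes.jpg", frame)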
By integrating cameras into ADAS, vehicles gain the ability to interpret and respond
to visual cues, enhancing overall safety and contributing to the realization of
increasingly autonomous driving capabilities.
IV. Ultrasonic Sensor
1. Ultrasonic Sensor:
The ultrasonic sensor utilizes sound waves to measure distances, exhibiting robust
performance even in adverse weather conditions. Typically employed for short-range
applications, such as automated systems, it plays a crucial role in enhancing safety
and precision.
2. Waves Used in Ultrasonic Sensors:
Ultrasonic waves are acoustic waves with frequencies above 20 kHz. Their
characteristics, including inaudibility to humans, high directivity, propagation as
compressional vibration of matter, and a propagation speed far lower than that of light
or radio waves (approximately 340 m/s in air), make them highly suitable for diverse
applications; the low speed in particular makes round-trip echo times easy to measure.
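The minimal sketch below shows this measurement principle: the sensor times the echo and
halves the round trip. The 340 m/s figure comes from the text; the sample echo time is
illustrative.

    SPEED_OF_SOUND = 340.0  # m/s in air, the figure quoted above

    def distance_from_echo(echo_time_s):
        """Obstacle distance from a round-trip ultrasonic echo: d = c * t / 2."""
        return SPEED_OF_SOUND * echo_time_s / 2.0

    # A 12 ms echo is about 2 m - near the range quoted later for rear sonar.
    print(distance_from_echo(0.012))  # 2.04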
3. Non-Automotive Applications:
Ultrasonic sensors find application in various non-automotive settings, including
checking diameter, double sheet control, foil tear monitoring, height and width
measurements, level control, quality control, robotic sensing, and wire break
detection.
4. Main Usage in Parking Assist:
Primarily employed in parking assist systems, ultrasonic sensors were initially used
for obstacle detection during parking maneuvers. However, advancements have led
to the integration of ultrasonic sensors into automatic parking systems. These systems
take control of steering, acceleration, and braking based on the parking zone, utilizing
location information gathered from ultrasonic sensors.
In the context of "rear sonar," 2 to 4 sensors on the rear bumper detect obstacles up
to 2 to 2.5 meters away, providing a reliable and efficient solution for obstacle
detection during parking maneuvers.
The versatility of ultrasonic sensors extends beyond the automotive sector,
demonstrating their value in a wide range of applications, from industrial quality control
to robotic sensing. As technology continues to evolve, ultrasonic sensors remain at
the forefront of innovations aimed at enhancing safety and automation in various
domains.
V. LIDAR
1. Automotive LIDAR:
LIDAR, an acronym for Light Detection And Ranging, is a remote sensing technology
crucial for measuring distances. It utilizes active sensors that emit their own energy,
primarily operating on the principle of time of flight (TOF). By emitting laser beams,
LIDAR systems can generate high-resolution, densely spaced networks of elevation
points, forming point clouds that facilitate the detection and identification of objects.
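To illustrate how time-of-flight returns become a point cloud, the sketch below converts
one laser return (echo time plus the beam's azimuth and elevation) into an (x, y, z)
point in sensor coordinates. This is a simplified geometric sketch; real LIDARs apply
per-beam calibration corrections that are omitted here.

    import math

    C = 3.0e8  # speed of light, m/s

    def return_to_point(tof_s, azimuth_deg, elevation_deg):
        """Convert one laser return into an (x, y, z) point.

        Range comes from time of flight (r = c * t / 2); the beam's azimuth
        and elevation angles place the point in 3D sensor coordinates.
        """
        r = C * tof_s / 2.0
        az = math.radians(azimuth_deg)
        el = math.radians(elevation_deg)
        x = r * math.cos(el) * math.cos(az)
        y = r * math.cos(el) * math.sin(az)
        z = r * math.sin(el)
        return (x, y, z)

    # A return after ~0.33 microseconds is a point roughly 50 m ahead.
    print(return_to_point(3.33e-7, 0.0, -1.0))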
2. LIDAR - Explanation from Velodyne:
LIDAR, unlike cameras, excels in low-light conditions, addressing challenges cameras
face in darkness. High-quality LIDAR systems carry between 8 and 128 laser beams,
enabling high-resolution, real-time environmental measurement that yields up to millions
of data points per second. Unlike the 2D view from cameras, LIDAR captures a full 3D
image of the environment, overcoming issues related to optical illusions, measurement
limitations, and blind spots.
3. Automotive LIDAR Examples:
Notable examples of automotive LIDAR systems include the Velodyne HDL-64E (High
Definition Real-Time 3D LIDAR) and the VLP-16 Puck (Real-Time 3D LIDAR sensor),
showcasing advancements in real-time, high-definition environmental perception.
4. LIDAR Types Based on Field of View (FoV):
LIDAR systems are categorized based on their Field of View (FoV), with some offering
a 360° FoV, such as Velodyne, while others, like Blickfeld, provide a specific angle
FoV.
5. LIDAR in ADAS:
LIDAR's integration into Advanced Driver Assistance Systems (ADAS) enhances the
overall capabilities of the system by providing accurate and detailed three-dimensional
environmental information. This contributes significantly to the enhancement of safety
and the development of increasingly autonomous driving technologies.
VI. GNSS, GPS, IMU:
Global Navigation Satellite System (GNSS) satellites broadcast their positions and
precise timing information using atomic clocks, allowing receivers to trilaterate their
position based on signals from at least four GNSS satellites.
3. GPS:
The Global Positioning System (GPS) provides users with Positioning, Navigation, and
Timing (PNT) services. Operating through trilateration (often loosely called
triangulation), a GPS receiver measures signals from multiple satellites to determine
its position on the ground accurately.
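The sketch below illustrates idealized trilateration: given satellite positions and
measured ranges, it solves for the receiver position by iterative least squares. The
satellite coordinates are made-up values, and the receiver clock error that a real GPS
solution estimates from the fourth satellite is ignored for simplicity.

    import numpy as np

    # Made-up satellite positions (m) and the exact ranges to a known truth
    # point, so the solver's answer can be checked.
    sats = np.array([[15e6, 0, 21e6], [-15e6, 5e6, 21e6],
                     [0, -15e6, 21e6], [5e6, 15e6, 20e6]])
    truth = np.array([1e6, 2e6, 0.0])
    ranges = np.linalg.norm(sats - truth, axis=1)

    # Gauss-Newton: repeatedly linearize the range equations around the guess.
    pos = np.zeros(3)
    for _ in range(10):
        diffs = pos - sats
        predicted = np.linalg.norm(diffs, axis=1)
        jac = diffs / predicted[:, None]          # d(range)/d(position)
        residual = ranges - predicted
        pos += np.linalg.lstsq(jac, residual, rcond=None)[0]

    print(pos)  # converges to roughly (1e6, 2e6, 0)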
4. IMU:
Inertial Measurement Unit (IMU) devices consist of accelerometers, gyroscopes, and
magnetometers for the orthogonal X, Y, and Z axes. They are crucial for control and
guidance in autonomous vehicles and are integrated into Inertial Navigation Systems
(INS). In ADAS, IMUs increase the accuracy of GNSS/GPS vehicle positioning, measure
vehicle states such as position, velocity, and acceleration, and facilitate on-the-run
sensor calibration.
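The toy 1-D sketch below illustrates this complementarity: the IMU integrates
acceleration at a high rate (dead reckoning, which drifts because of sensor bias), while
each slower, noisier GNSS fix pulls the estimate back. The rates, bias, noise, and
blending gain are illustrative assumptions; production systems use a Kalman filter
rather than this simple blend.

    import random

    # Toy 1-D fusion: the IMU integrates at 100 Hz; once a second a noisy
    # GNSS fix corrects the drifting dead-reckoning estimate.
    DT, GNSS_EVERY, BLEND = 0.01, 100, 0.3
    true_accel = 0.5                       # constant acceleration, m/s^2

    pos_est, vel_est = 0.0, 0.0
    for step in range(1, 1001):
        t = step * DT
        accel_meas = true_accel + 0.05     # accelerometer with a small bias
        vel_est += accel_meas * DT         # integrate once for velocity
        pos_est += vel_est * DT            # twice for position (drifts alone)
        if step % GNSS_EVERY == 0:
            gnss_pos = 0.5 * true_accel * t ** 2 + random.gauss(0, 0.5)
            pos_est += BLEND * (gnss_pos - pos_est)  # GNSS bounds the drift

    print(round(pos_est, 2), "vs true", round(0.5 * true_accel * 10.0 ** 2, 2))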
5. GNSS+IMU Unit in ADAS:
ADMA (Automotive Dynamic Motion Analyzer) is a precision IMU coupled with Differential
GNSS (DGNSS), employed for dynamic testing in the automotive sector. This unit provides
accurate measurement of vehicle dynamics, contributing to the continuous improvement of
ADAS technologies.
VII. Conclusion
In conclusion, the integration of various sensors in ADAS is fundamental for the
advancement of automotive technology. From radar and cameras to ultrasonic
sensors, LIDAR, GNSS, GPS, and IMU, each sensor type contributes unique
advantages that collectively enhance vehicle safety and pave the way for autonomous
driving. The synergy of these sensors creates a robust perception system, enabling
vehicles to navigate complex environments with precision and reliability.
Moreover, the fusion of radar, camera, and LIDAR technologies further amplifies the
perceptual capabilities of vehicles. This sensor fusion is pivotal for both ADAS and
fully Autonomous Driving (AD), providing a comprehensive understanding of the
vehicle's surroundings. The combination of radar's long-range detection, camera's
detailed vision, and LIDAR's 3D environmental mapping forms a powerful alliance for
enhanced perception, ensuring a safer and more efficient driving experience.
This training not only equips individuals with a profound understanding of sensor
technologies but also underscores their collective impact on reshaping the future of
automotive transportation.