ZF, a developer of autonomous car sensors, offers an insight into the coordination needed to make those sensors work together. Advanced driver assistance systems must be capable of navigating congested areas, and the information from cameras and sensors must be intelligently combined for the vehicle to perform reliably in all lighting and weather conditions.

A camera that leaves a blank gap in every picture would normally be considered flawed. Yet the human eye does exactly that: there are no receptors at the point where the optic nerve leaves the retina, so no light stimulus is recorded there for the image the brain assembles.


This area of the eye is known as the blind spot. It has no practical effect, however, because the surrounding retinal receptors and, above all, the visual input from the other eye compensate for the missing picture points. Two eyes set close together also allow humans to perceive spatial depth, which is essential for judging distances.

The same applies in technology: the data from two sensors is merged, or fused, to create a more comprehensive picture containing more information.
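To make the principle concrete, here is a minimal sketch of one common fusion idea, inverse-variance weighting, with made-up numbers rather than any real sensor specification:

```python
# Minimal sketch of fusing two noisy sensor readings into one estimate.
# Numbers are hypothetical; real fusion stacks (e.g., Kalman filters)
# track full state vectors, but the inverse-variance idea is the same.

def fuse(estimate_a: float, var_a: float,
         estimate_b: float, var_b: float) -> tuple[float, float]:
    """Combine two measurements of the same quantity.

    Each measurement is weighted by the inverse of its variance, so the
    more trustworthy sensor contributes more to the fused result.
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # fused estimate is more certain than either input
    return fused, fused_var

# Example: the camera says the car ahead is 48 m away (noisy in depth),
# the radar says 50 m (very accurate in depth).
distance, variance = fuse(48.0, 4.0, 50.0, 0.25)
print(f"fused distance: {distance:.1f} m (variance {variance:.2f})")
# fused distance: 49.9 m -- dominated by the more precise radar reading
```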

Sensors and cameras need to work in concert

Developers of autonomous driving functions apply the same idea. A self-driving car needs several different sensors to interpret every traffic scenario without ambiguity, especially in poor light and bad weather. Each sensor type, whether camera, radar, or lidar, has its own strengths, and together they can be cleverly packaged to provide complete 360-degree vision.

“As system architects of autonomous driving, we have developed a sensor set that equips vehicles with all necessary senses in order to be able to perceive their environment digitally,” explains Torsten Gollewski, head of ZF Advanced Engineering and managing director of Zukunft Ventures GmbH.

Autonomous car sensors include cameras

Cameras are required for object detection. Using artificial intelligence, they recognize objects along the roadside, such as pedestrians or trash cans, and deliver this essential information to the car. The camera’s greatest strength, moreover, is its ability to measure angles accurately.


This lets the car anticipate whether an oncoming vehicle is about to turn. On highways, a long range of up to 300 meters and a narrow field of view are needed to capture traffic far ahead, while city driving demands a wide field of view to capture pedestrians and vehicles close by.
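As a rough illustration of that range-versus-angle trade-off (the figures below are invented, not ZF camera specifications), one can check whether an object falls inside a given camera’s coverage:

```python
import math

# Illustrative sketch: does an object fall inside a camera's coverage,
# given the camera's maximum range and field of view?

def in_coverage(obj_x: float, obj_y: float,
                max_range_m: float, fov_deg: float) -> bool:
    """obj_x is forward distance, obj_y is lateral offset, in meters."""
    distance = math.hypot(obj_x, obj_y)
    bearing = math.degrees(math.atan2(obj_y, obj_x))  # angle off the optical axis
    return distance <= max_range_m and abs(bearing) <= fov_deg / 2

highway_cam = dict(max_range_m=300, fov_deg=30)  # long range, narrow view
city_cam = dict(max_range_m=80, fov_deg=120)     # short range, wide view

# A pedestrian 10 m ahead and 15 m to the side: visible to the wide-angle
# city camera, but outside the narrow highway camera's field of view.
print(in_coverage(10, 15, **highway_cam))  # False (bearing is about 56 degrees)
print(in_coverage(10, 15, **city_cam))     # True
```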

The wide range of camera systems offered by ZF is critical for adaptive cruise control, automatic emergency braking, and the Lane Keeping Assist feature.

Interior cameras are important for passenger safety

Cameras, however, not only monitor the vehicle’s outer surroundings but also keep watch on the driver and passengers inside. They can detect, for example, not just whether the driver is distracted or drowsy, but also which seats are occupied. This information is a significant safety benefit, since the seat belt and airbag functions can adjust accordingly in the event of an accident.
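The article does not describe ZF’s actual logic, but a hypothetical sketch shows how such occupancy information could feed restraint settings:

```python
# Hypothetical sketch of adapting restraint systems to interior-camera
# output. The states and settings below are invented for illustration;
# the point is that interior perception feeds directly into safety actuation.

def restraint_settings(seat_occupied: bool, occupant_leaning_forward: bool) -> dict:
    if not seat_occupied:
        # No occupant: deploying an airbag would be pointless or harmful.
        return {"airbag": "suppressed", "belt_pretensioner": "off"}
    if occupant_leaning_forward:
        # Out-of-position occupant: deploy with reduced force.
        return {"airbag": "low_power", "belt_pretensioner": "early"}
    return {"airbag": "full_power", "belt_pretensioner": "standard"}

print(restraint_settings(seat_occupied=True, occupant_leaning_forward=True))
# {'airbag': 'low_power', 'belt_pretensioner': 'early'}
```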

Low visibility is countered with radar

Unlike cameras, which passively record image information, radar systems are an active technology. These sensors emit electromagnetic waves and receive the “echo” reflected back from surrounding objects. Radar sensors can therefore determine, in particular, the distance and relative speed of these objects with a high degree of accuracy.
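The underlying physics is simple enough to sketch: range follows from the round-trip time of the wave, and relative speed from the Doppler shift. The timing and frequency values below are illustrative only, not taken from any specific ZF sensor:

```python
# Back-of-the-envelope sketch of the radar echo principle.

C = 299_792_458.0  # speed of light, m/s

def radar_range(round_trip_s: float) -> float:
    """The wave travels to the object and back, so halve the path."""
    return C * round_trip_s / 2

def relative_speed(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Classic Doppler relation for a reflecting target: v = df * c / (2 * f0)."""
    return doppler_shift_hz * C / (2 * carrier_hz)

# An echo returning after 0.5 microseconds -> target about 75 m away.
print(f"range: {radar_range(0.5e-6):.1f} m")
# A 77 GHz automotive radar seeing a +5.13 kHz shift -> closing at about 10 m/s.
print(f"relative speed: {relative_speed(5130, 77e9):.1f} m/s")
```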

That makes them ideal for maintaining distance, issuing collision warnings, and supporting emergency brake assist systems. Another decisive benefit of radar sensors compared to optical systems is that, because they use radio waves, they function regardless of weather, light, or visibility conditions. That makes them an important component in the sensor set.


As with its camera systems, ZF offers a broad assortment of radar sensors with different ranges and opening angles (beam widths). The imaging Gen21 Full Range Radar, for example, is well suited to highly automated and autonomous driving thanks to its high resolution.

Lidar is the high-tech counterpart to radar

Lidar sensors also apply the echo principle, but they use laser pulses instead of radio waves. They therefore record distances and relative speeds just as well as radar, while recognizing objects and angles with far higher accuracy. That is also why they handle complex traffic situations in the dark so well.

Unlike with cameras and radar sensors, the angle of view is not critical, because lidar sensors capture the vehicle’s surroundings in 360 degrees. ZF’s high-resolution 3D solid-state lidar sensors can even render pedestrians and smaller objects three-dimensionally, which is essential for Level 4 automation.
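A single lidar return can be turned into a 3D point from its time of flight and beam direction; the following sketch uses invented timing and angles:

```python
import math

# Sketch of how one lidar return becomes a 3D point: the laser pulse's
# time of flight gives range; the beam's azimuth/elevation give direction.

C = 299_792_458.0  # speed of light, m/s

def lidar_point(round_trip_s: float, azimuth_deg: float,
                elevation_deg: float) -> tuple[float, float, float]:
    r = C * round_trip_s / 2            # time of flight -> range
    az, el = map(math.radians, (azimuth_deg, elevation_deg))
    x = r * math.cos(el) * math.cos(az)  # forward
    y = r * math.cos(el) * math.sin(az)  # left
    z = r * math.sin(el)                 # up
    return x, y, z

# One return: 133 ns round trip at 30 deg azimuth, 2 deg elevation -> ~20 m out.
x, y, z = lidar_point(133e-9, 30.0, 2.0)
print(f"point: ({x:.1f}, {y:.1f}, {z:.1f}) m")
# Sweeping this across the full 360 degrees of azimuth produces the dense
# point cloud that makes small objects recognizable in 3D.
```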


Because it has no moving components, the solid-state technology is considerably more robust than previous solutions.

“It is good to see that solid-state lidar is hitting the road together with our partner Ibeo. And we have shown here on the Consumer Electronics Show the new full-range radar. That is a high-resolution radar technology that overcomes limitations of previous generations,” says Martin Randler, Director Sensor Technologies and Perception System.

Autonomous cars will “hear” as well

ZF’s technology solutions enable cars to perceive their environment, as its tagline “see. think. act.” suggests. The company also installs its “Sound.AI” system in vehicles so that they can hear. Among other things, the technology identifies approaching emergency vehicles, such as police cars, ambulances, and fire engines, by their acoustic signals.

A car equipped with Sound.AI can then also pull over to the side of the road.
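ZF has not published how Sound.AI works internally; purely as an illustration, a naive acoustic detector might look for spectral energy in a typical siren band:

```python
import numpy as np

# Purely illustrative sketch of acoustic emergency-vehicle detection --
# NOT ZF's Sound.AI implementation, which the article does not detail.
# Many European sirens alternate tones in roughly the 400-500 Hz region;
# here we just measure how much spectral power sits in that band.

def siren_band_ratio(samples: np.ndarray, sample_rate: int,
                     band=(350.0, 550.0)) -> float:
    power = np.abs(np.fft.rfft(samples)) ** 2  # power spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return power[in_band].sum() / (power.sum() + 1e-12)

# Synthetic test signal: a 435 Hz tone buried in noise.
rate = 16_000
t = np.arange(rate) / rate
rng = np.random.default_rng(0)
audio = np.sin(2 * np.pi * 435 * t) + 0.3 * rng.standard_normal(rate)

if siren_band_ratio(audio, rate) > 0.2:  # hypothetical threshold
    print("possible siren detected -- consider yielding")
```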

AI will link everything together


Combined into a single sensor set, the technologies described above prevent blind spots when detecting the vehicle’s surroundings, even in complicated conditions. A “brain” is also required to merge the sensor data from the lidar, radar, and camera systems into one complete picture. ZF’s answer to this task is its “ProAI RoboThink” computer.
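As a conceptual sketch of that fusion step (a greedy distance-based association, not ZF’s actual algorithm, with invented coordinates), detections from several sensors can be merged into one object list:

```python
# Conceptual sketch of the "brain's" job: merge object detections from
# camera, radar, and lidar into one environment model. Production systems
# use far more sophisticated tracking and association.

def merge_detections(detections, match_radius_m=2.0):
    """detections: list of (sensor_name, x, y) tuples in vehicle coordinates."""
    clusters = []  # each cluster: list of (sensor, x, y) believed to be one object
    for sensor, x, y in detections:
        for cluster in clusters:
            cx = sum(p[1] for p in cluster) / len(cluster)
            cy = sum(p[2] for p in cluster) / len(cluster)
            if (x - cx) ** 2 + (y - cy) ** 2 <= match_radius_m ** 2:
                cluster.append((sensor, x, y))
                break
        else:
            clusters.append([(sensor, x, y)])
    # One fused object per cluster, confirmed by however many sensors saw it.
    return [
        {
            "x": sum(p[1] for p in c) / len(c),
            "y": sum(p[2] for p in c) / len(c),
            "sensors": sorted({p[0] for p in c}),
        }
        for c in clusters
    ]

observed = [
    ("camera", 49.2, 1.0), ("radar", 50.1, 0.8), ("lidar", 49.8, 0.9),  # same car
    ("lidar", 12.0, -6.5),                                              # lone object
]
for obj in merge_detections(observed):
    print(obj)
```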

According to ZF, it is currently the most powerful supercomputer in the automotive sector. Once vehicles are equipped with this artificial brain, drivers will eventually be able to close their eyes, or do something else entirely, while the car drives itself.

Source: ZF.



