Yes, we’re talking about radar, the same technology that dates back to the early 20th century.
Today, radar is no longer just for aircraft and military installations. A number of newer companies, on both the hardware and the software side, are making radar an integral part of safety systems for detecting cyclists and pedestrians.
The need for such technology is urgent: the number of pedestrian deaths in the United States has risen in recent years even as Americans drive fewer miles. And the growing autonomy of new vehicles, which enables features like collision warning, automatic braking and blind-spot detection (not to mention the driverless cars of the future), depends entirely on advanced sensor systems. Such systems are also essential if automakers are to deliver on their promise to make automatic braking standard on all vehicles by 2022.
To reduce the carnage on our streets, companies like Mobileye, an Intel subsidiary, are working on chips bristling with tiny radar antennas. General Motors recently invested in Oculii, a startup that uses machine learning to shape the signals that automotive radar systems emit, an approach built on pure math and software. The software company MathWorks develops algorithms that let carmakers fuse data from radar and other sensors into a trustworthy picture of the world around the vehicle.
This is a time of rapid change for engineers working on vehicle sensors, said Erez Dagan, executive vice president of products and strategy at Mobileye. The cameras used in cars offer ever higher resolution and can capture a wider range of light than before. Lidar, which bounces lasers off surrounding objects to “see” the world in 3-D, is getting cheaper. (Lidar is common in robotaxi prototypes such as those from Cruise and Amazon’s Zoox.)
Radar, which bounces radio waves off objects (the term began as an acronym for “radio detection and ranging”), has been used in first-generation vehicle safety systems since the 1990s. Automotive radar has a number of advantages. It is robust enough to withstand years of bumps and temperature swings when mounted on a car. It is much, much cheaper than lidar, can measure the speed of objects instantly, and can peer through harsh weather like fog and rain, which can thwart both cameras and lidar. But until recently, radar had one major drawback: it offers only a fraction of the resolution of those other systems, meaning the images it produces are far blurrier.
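To make those two measurements concrete, here is a minimal, hypothetical sketch (not from the article) of how a radar turns an echo into distance and speed. The function names and example numbers are invented; 77 GHz is the carrier band commonly used for automotive radar.

```python
# Hypothetical sketch: the two basic quantities an automotive radar measures.
# Range comes from the echo's round-trip delay; speed comes from the Doppler
# shift of the returned wave.
C = 299_792_458.0  # speed of light, m/s

def target_range(round_trip_delay_s: float) -> float:
    """Distance to a target, from the time the echo takes to come back."""
    return C * round_trip_delay_s / 2  # halve it: the wave travels out and back

def radial_speed(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative speed of a target, from the Doppler shift of its echo."""
    return doppler_shift_hz * C / (2 * carrier_hz)

print(round(target_range(400e-9), 1))   # echo after 400 ns: about 60 m away
print(round(radial_speed(5132), 1))     # ~5.1 kHz shift: closing at ~10 m/s
```

Note that a single radar measures speed directly from one echo, which is why the article can say radar measures the speed of objects “instantly”; cameras and lidar must instead compare positions across successive frames.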
Oculii’s technology works by changing the shape, known as the waveform, of the signal emitted by the radar units mounted on cars. The physics is complicated, but by adapting the nature of the radar signal to the kinds of objects it bounces off, the system can resolve objects whose shapes would otherwise be impossible to “see.” The result, says Chief Executive Steven Hong, is that existing $50 automotive radar sensors can produce three-dimensional images of a car’s surroundings at much higher resolution. The technology is expected to appear in vehicles in 2023.
Mobileye leverages the chip-manufacturing capabilities of its parent company, Intel, to work on single microchips covered with nearly 100 tiny antennas. By using artificial-intelligence software to process the noisy signals they receive, Mobileye’s systems can identify pedestrians, for example, at least in the laboratory. Until now, that was possible only with cameras and lidar.
There is no consensus among automotive technologists about which configuration of cameras, lidar and radar will become standard for different safety systems or for autonomous driving, but nearly all agree that the best solution will be some combination of them.
The resolution that even the best automotive radar can achieve is only about as good as that of the worst lidar systems available, says Matthew Weed, an engineer and senior director of product management at Luminar, which makes lidar systems for automobiles. Luminar’s system, which Mr. Weed says is superior to radar for most uses, costs $1,000, however.
Mr. Weed says Luminar’s lidar-based systems could justify their cost by being good enough to cut drivers’ insurance costs through the accidents and pedestrian deaths they prevent. Even in a car with such a system, radar would be a useful backup if the lidar fails or cannot cope with bad weather, he adds.
Mobileye uses lidar, cameras and radar in its most advanced systems. Chief Executive Amnon Shashua has said that while the prices of lidar systems have fallen, they are still ten times the cost of radar and are likely to remain so for the foreseeable future because of the complexity of the hardware involved.
Elon Musk’s Tesla has staked its bet that the company can achieve true autonomous driving in its vehicles with cameras alone.
Cameras have the advantage of extremely high resolution and, thanks to years of advances in smartphone cameras, they are affordable and compact. But for a system that can meet the highest safety standards, and ultimately achieve full autonomy, cameras need backup from sensors that fail under different conditions than they do, Mr. Dagan adds.
Take fog, which is an obstacle to both camera-based and lidar-based systems and can cause vehicles to stop when they shouldn’t. In research published in 2020, radar-based automotive sensors had no trouble penetrating fog and correctly identifying stopped vehicles hidden in it, says Dinesh Bharadia, an assistant professor of engineering at the University of California San Diego who worked on the research.
Dr. Bharadia says his team found that one key was using multiple radars spaced at least 1.5 meters apart on a vehicle. The same principle is at work in the ever-growing number of cameras on the backs of our smartphones, he adds. Just as a phone combines the images from several small, inexpensive cameras into one much sharper picture, multiple inexpensive radar sensors can be used together to create a sharper “image” of the area around a car.
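The gain from combining several cheap sensors can be shown with a toy simulation (invented numbers, not the UCSD team’s actual method): averaging independent noisy range readings shrinks the typical error roughly with the square root of the sensor count.

```python
# Toy illustration of fusing several cheap, noisy radars (hypothetical
# noise figures): each sensor's reading is off by random noise, but
# averaging independent readings cancels much of it out.
import random

def noisy_reading(true_range_m: float, noise_m: float = 0.5) -> float:
    """One radar's range estimate, corrupted by Gaussian noise."""
    return random.gauss(true_range_m, noise_m)

def fused_reading(true_range_m: float, num_radars: int) -> float:
    """Average the independent readings from several spaced-apart radars."""
    readings = [noisy_reading(true_range_m) for _ in range(num_radars)]
    return sum(readings) / num_radars

# Compare the typical error of one radar vs. four, over many trials.
random.seed(0)
trials = 2000
err_one = sum(abs(noisy_reading(60.0) - 60.0) for _ in range(trials)) / trials
err_four = sum(abs(fused_reading(60.0, 4) - 60.0) for _ in range(trials)) / trials
print(f"single radar: {err_one:.2f} m, four radars: {err_four:.2f} m")
```

Real multi-radar imaging exploits the spacing between the sensors (the different viewing angles), not just averaging, but the noise-reduction intuition is the same.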
To turn all of a car’s sensors into a single, coherent view of the world outside the vehicle, all of that data must be fused, says Rick Gentile, an engineer who previously worked on radar systems for defense applications and is now a product manager at MathWorks, which develops software tools for processing such data. Radar can see that there is a sign ahead, for example, but it cannot see the sign’s color, which is essential to quickly determining what kind of sign it is.
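A drastically simplified sketch of that fusion step (field names invented for illustration, not MathWorks’ API): the camera supplies what the object is and its color, while the radar supplies range and speed, and the fused view carries all four.

```python
# Toy sensor fusion (hypothetical field names): merge what each sensor
# can observe about the same object into one combined view.
def fuse_detections(radar: dict, camera: dict) -> dict:
    """Combine a radar detection (geometry) with a camera detection (appearance)."""
    fused = dict(camera)   # the camera knows what the object is and its color
    fused.update(radar)    # the radar adds precise range and relative speed
    return fused

radar_det = {"range_m": 42.0, "speed_mps": 0.0}
camera_det = {"object": "stop sign", "color": "red"}
print(fuse_detections(radar_det, camera_det))
# {'object': 'stop sign', 'color': 'red', 'range_m': 42.0, 'speed_mps': 0.0}
```

Production fusion stacks must also solve the harder problem this sketch skips: deciding which radar detection and which camera detection correspond to the same physical object.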
So-called robotaxis fill the gaps in each sensor type’s capabilities by using all of them. The goal is “full redundancy,” says Mr. Dagan, so that even if one sensor fails, the others can still perceive the world correctly. That is the quickest way to give vehicles senses at least as good as our own (whether those vehicles have the judgment to actually drive safely is another matter).
Until we get truly autonomous vehicles, which could be years if not decades away, automakers must choose among radar, lidar and cameras, or some combination of the three, to build the safety systems that deliver on their promise to make automatic braking standard by 2022, and to keep improving those systems. All three sensor types keep getting better, but the price differences among them have led automakers to favor one technology or another based on how well they think they can make up for its shortcomings with software and AI.
This has made for healthy competition among the makers of safety systems, sensors and supporting software; whoever you talk to claims their systems are the best.
With all of these companies scrambling for a place in your car, these technologists share a nearer-term goal: sharply reducing road deaths of all kinds while a human is still behind the wheel. It is a goal that everyone agrees is much closer than fully autonomous vehicles.