From Audi to Volvo, most self-driving cars use the same hardware
by Jonathan M. Gitlin | ars technica
On Sunday, my colleague Lee Hutchinson regaled you all with a tale of his semi-autonomous driving adventure in one of Tesla’s high-speed electric chariots. But that’s not the only semi-autonomous (Level 2 self-driving, according to the National Highway Traffic Safety Administration) road trip we’ve conducted here at Ars. You can read all about how we got on with Volvo’s latest and greatest XC90 SUVs in a week or so. Plus, there’s the new Audi A4, which in Dynamic mode really puts the mantra of “trust the machine” to the test as it late-brakes for exits at up to 0.5G. And finally, I was fortunate enough to put many miles on an Audi A7 TDI, driving from DC to Columbus, Ohio, and back when I went to visit the Venturi Buckeye Bullet 3.
Much of the technology that underpins these systems is shared among the industry. A handful of companies like Bosch, Delphi, and Mobileye provide sensors, control units, and even algorithms to car makers, who then integrate and refine those systems.
Depending on the make of car, these advanced driver assistance systems—ADAS in industry speak—might be called Traffic-Aware Cruise Control and Autosteer (Tesla), IntelliSafe Assist and Pilot Assist (Volvo), Distronic Plus with Steering Assist (Mercedes-Benz), Adaptive Cruise Control with Lane Assist and Traffic Jam Assist (Audi), and so on. But all of them work on the same basic principles. A fusion of sensors identifies the lane markings on the road, the cars around you, and now even road signs like speed limits or school zones, and the system uses this information to maintain your speed and a safe distance from those other cars.
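That follow-distance logic is easy to sketch in code. The toy controller below is purely illustrative—the gains, the 1.8-second time gap, and the function name are our own assumptions, not any manufacturer’s actual implementation—but it captures the basic idea: regulate toward the driver’s set speed when the road is clear, and otherwise close or open the gap to the car ahead.

```python
def acc_command(set_speed, ego_speed, lead_speed=None, gap=None,
                time_gap=1.8, k_gap=0.5, k_speed=0.8):
    """Toy adaptive-cruise controller returning a target acceleration (m/s^2).

    Speeds are in m/s, gap in meters; time_gap is the desired
    following distance expressed as travel time in seconds.
    """
    if lead_speed is None or gap is None:
        # Road clear ahead: simply regulate toward the driver's set speed.
        return k_speed * (set_speed - ego_speed)

    desired_gap = time_gap * ego_speed  # safe distance grows with speed
    accel = k_gap * (gap - desired_gap) + k_speed * (lead_speed - ego_speed)
    # Never accelerate past the driver's set speed while following.
    return min(accel, k_speed * (set_speed - ego_speed))

# Cruising at 30 m/s with a slower car only 40 m ahead: the
# controller commands braking (a negative acceleration).
print(acc_command(set_speed=30, ego_speed=30, lead_speed=25, gap=40))
```

Production systems layer far more on top of this—sensor filtering, comfort limits, cut-in handling—but the proportional structure is the common core.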
Mobileye is one of the most important suppliers of ADAS technology. The Israeli company brought out its first EyeQ system-on-chip back in 2004, and on May 17 it announced details of the fifth generation, which should start showing up in late 2020. EyeQ5 will allow a higher level of autonomous driving (NHTSA Level 3) and, therefore, has been designed to cope with high-bandwidth sensor fusion of up to 40Gbps, fed from a combination of optical cameras, radar, and lidar sensors.
At this year’s CES, Mobileye’s CTO Amnon Shashua explained that the company has been using deep learning for the past few years to refine its systems, which are capable of 3D vehicle detection. This is important because, unlike road furniture, the cars on our roads aren’t static and will present different angles to the sensor at different times. Being able to do this well is what enables systems like Audi’s PreSense City or Volvo’s City Safety to detect cars that aren’t directly ahead of the vehicle but which could cross its path. Nor is it only about identifying other cars on the road: EyeQ has also benefited from deep learning in finding free space on the road ahead, as well as in what Shashua called holistic path planning—predicting where the lanes on a road ought to be in cases where the lines are unpainted or obscured. (For a more in-depth read on Shashua’s presentation, we recommend this piece by WCCFTech.)
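To make the "where the lane ought to be" idea concrete, here’s a deliberately crude stand-in: fit a curve through the recent path of a tracked car ahead and extrapolate it forward. This is not Mobileye’s algorithm—their approach is a learned, holistic one—just a minimal sketch of the underlying intuition that other traffic reveals the lane when the paint doesn’t. The function name and sample data are invented for illustration.

```python
import numpy as np

def predict_lane_center(trail, lookahead_xs):
    """Fit a quadratic through the observed path of a lead vehicle and
    extrapolate it forward, as a crude guess at where the lane runs
    when markings are missing or obscured.

    trail: (x, y) points (meters ahead, lateral offset) left by a
    tracked car; lookahead_xs: distances at which to predict the lane.
    """
    xs, ys = zip(*trail)
    coeffs = np.polyfit(xs, ys, deg=2)   # y = a*x^2 + b*x + c
    return np.polyval(coeffs, lookahead_xs)

# A lead car drifting steadily rightward suggests the lane curves right.
trail = [(5, 0.0), (10, 0.1), (15, 0.35), (20, 0.8)]
print(predict_lane_center(trail, [25, 30]))
```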
Mobileye’s founder and CEO, Ziv Aviram, told Ars that the company “works with most of the world’s leading car manufacturers to supply advanced driver assistance systems.”
“For the near future, we have agreements with several leading automakers on systems that will allow drivers to fully disengage during highway driving beginning in 2018,” he continued. “Additionally, we are working with GM, Nissan, and Volkswagen to incorporate our recently announced high-definition mapping technology called Road Experience Management into their vehicles, a significant step on the road to fully autonomous driving everywhere. The backbone of Mobileye’s software is the EyeQ system-on-chip family, jointly developed with STMicroelectronics. The new EyeQ5, expected to launch in 2020, will be the flagship system that turns the concept of fully autonomous cars we envision into a reality.”
While the OEMs are happy to discuss with us how their systems work, they appear somewhat more reticent when it comes to their suppliers. We reached out to a few of them in advance of this piece to see if we could find out more specifics about the technology that underpins their various implementations of ADAS, with varying success.
Audi’s Justin Goduto told Ars that, “Like many other OEMs, we work with suppliers for many of the sensors used in our Advanced Driver Assistance Systems (ADAS), such as the MobileEye camera system used for features like PreSense City and Audi Traffic Jam Assist. However, we develop and design the implementation of these technologies in-house. A great example of the in-house innovation happening in the ADAS and Autonomous space is the development of the Audi zFAS controller. The design for the zFAS occurred here at Audi, and we work with NVIDIA as a key partner for the manufacturing process.”
Volvo was reluctant to go on record about its suppliers. Tesla wouldn’t go into specifics about which suppliers it sources ADAS components from, but it’s well known that the company uses Mobileye EyeQ in the Model S and Model X for detecting pedestrians and traffic signs. Tesla also told Ars that both models are equipped with a long-distance all-weather radar, a cluster of 12 ultrasonic sensors that detect anything within 16 feet (4.8m) around the vehicle, and a high-precision GPS module. And the fleet of customer cars is helping develop future autonomous Teslas. “These mutually reinforcing systems offer realtime data feedback from the Tesla fleet, ensuring that the system is continually learning and improving upon itself,” a Tesla spokesperson told us.
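Those ultrasonic sensors work by timing an echo: a pulse goes out, bounces off an obstacle, and the round-trip time gives the distance. A quick back-of-the-envelope conversion (assuming sound travels at roughly 343 m/s in air at 20°C; the function name is ours) shows why a range in the region of 16 feet is a natural fit for short-range parking-style sensors:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_to_distance(round_trip_s):
    """Convert an ultrasonic echo's round-trip time into object distance.
    The pulse travels out and back, so the one-way distance is half."""
    return SPEED_OF_SOUND * round_trip_s / 2

# A 28 ms round trip corresponds to about 4.8 m -- right around the
# 16-foot envelope Tesla quotes for its ultrasonic cluster.
print(round(echo_to_distance(0.028), 2))  # → 4.8
```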
As that last quote suggests, as good as some ADAS implementations are, they remain works in progress. For example, we’ve noticed a significant improvement in Audi’s Lane Assist from older models like the A7 and A8 to the new A4. The newer car keeps you well-centered in your lane; the older models (and Volvo’s XC90) aren’t quite as steady and will constantly correct your course from one side of the lane to the other. Speaking to Ars, Mary Gustanski, Delphi’s VP of engineering and product management, likened the evolution of these systems to a learner driver—like Tesla (and presumably the other OEMs), Delphi is constantly refining and iterating algorithms to give the cars more natural behavior.
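The difference between a car that stays centered and one that ping-pongs between the lines comes down to control strategy. The pair of toy steering functions below (our own illustration—no OEM publishes its controllers, and the gains here are arbitrary) contrasts a naive "wait until you near the edge, then nudge back" rule, which produces exactly that side-to-side bouncing, with a proportional-derivative rule that corrects continuously and damps itself as the car drifts back toward center:

```python
def steer_threshold(offset, threshold=0.3, correction=0.05):
    """Naive lane keeping: do nothing until the car nears a lane edge,
    then apply a fixed nudge. Produces the side-to-side 'bouncing'
    feel of earlier systems.  offset is lateral position in meters
    (positive = right of center); returns a steering correction."""
    if offset > threshold:
        return -correction
    if offset < -threshold:
        return correction
    return 0.0

def steer_pd(offset, offset_rate, kp=0.15, kd=0.4):
    """Proportional-derivative lane centering: corrects continuously,
    scaled by how far off-center the car is, and damps the correction
    as the car is already moving back toward center."""
    return -kp * offset - kd * offset_rate

# Drifting 0.2 m right of center, not yet moving back:
print(steer_threshold(0.2))               # no action until the edge
print(round(steer_pd(0.2, 0.0), 3))       # → -0.03, a gentle correction
```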
Lastly, even if OEMs source components from the same suppliers, the end results can feel quite different in action. Each vehicle has its own flavor of semi-autonomous cruising—you needn’t be clairvoyant to guess that semi-autonomous cruising is a more relaxed experience in the Volvo than the Audi, for example.