For the past decade, the easiest way to spot a self-driving car was to look for the distinctive spinning bucket mounted to its roof. The classic lidar design pioneered by Velodyne spins 64 lasers over 360 degrees, producing a three-dimensional view of the car’s surroundings from the reflected laser beams.
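The underlying principle is simple: each laser pulse's round-trip time gives a range, and the firing angles place that range in 3D. The following sketch is purely illustrative (not any vendor's actual processing), showing how one echo becomes one point in the cloud.

```python
# Illustrative sketch of time-of-flight lidar ranging: a spinning lidar
# measures the round-trip time of each laser pulse and the angles at
# which it was fired, then converts those into a 3D point.
import math

C = 299_792_458.0  # speed of light, m/s

def point_from_echo(round_trip_s, azimuth_deg, elevation_deg):
    """Convert one laser echo into an (x, y, z) point in meters."""
    r = C * round_trip_s / 2.0  # halve: light travels out and back
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (r * math.cos(el) * math.cos(az),
            r * math.cos(el) * math.sin(az),
            r * math.sin(el))

# A pulse returning after roughly 66.7 nanoseconds corresponds to a
# target about 10 meters away, straight ahead at zero elevation.
x, y, z = point_from_echo(66.7e-9, azimuth_deg=0.0, elevation_deg=0.0)
```

Sweeping 64 such lasers through 360 degrees many times per second is what produces the dense point clouds these systems rely on.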
That complicated, bulky setup has traditionally also been expensive. Velodyne’s US $75,000 lidar famously cost several times the sticker price of the Toyota Priuses that formed the nucleus of Google’s original self-driving car fleet.
Those days are long gone. Velodyne now sells its cheapest 16-laser lidar for $4,000, and a host of startups are nipping at its heels with solid-state lidars that could soon cost as little as $100 each. Many of the new lidars work in fundamentally different ways from Velodyne’s bucket, a shift that brings new capabilities and new challenges.
With the notable exception of Tesla, almost every company working on self-driving vehicles uses three core sensor technologies: cameras to identify road users, signs, and traffic lights; radars for accurate ranging out to long distances; and lidars for high-resolution 3D imaging. (Tesla insists that only cameras, radars, and ultrasonic sensors are necessary, although many in the industry think the company will adopt lidar in future vehicles.)
“Cameras are in a really good place and will continue to get better, automotive radar has been well established… but lidar is an area that has been underexplored,” said Chris Urmson at CES. Urmson was long the head of Google’s self-driving car program and is now CEO of Aurora Innovation, an autonomous technology startup that recently announced partnerships with Volkswagen and Hyundai. “The most exciting thing about lidar is that suddenly, there are a whole lot of people working on it,” he said.
The primary motivation for carmakers is to move away from sensitive electromechanical systems to something more durable, said Jim Zizelman, vice president of engineering and program management for Aptiv, a global automotive parts company: “Think about a mechanical spinning system going over potholes and through temperature extremes. A solid-state system should give orders of magnitude more reliability.”
CES this year saw multiple lidar companies demonstrating solid-state lidars. LeddarTech was showing a lidar system-on-a-chip that it says will cost less than $100 at production volumes, while rival Innoviz unveiled a high-resolution lidar based on MEMS, tiny microelectromechanical mirrors that steer a single fixed laser.
Innoviz boasts that its unit has a field of view greater than 70 degrees. While that is better than some other solid-state lidars, it falls well short of Velodyne’s 360-degree bucket. Solid-state lidars, then, generally have to build up a panoramic view by fusing data from multiple units mounted on the front, sides, and rear of a car. That requires additional computation but also gives car designers a little more freedom.
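A quick back-of-the-envelope calculation shows why cars end up wearing several of these units. The sketch below is a hypothetical illustration (the 10-degree overlap per seam is an assumed figure, allowing neighboring views to be registered against each other), not any manufacturer's layout.

```python
# Hypothetical sketch: how many solid-state lidar units with a given
# field of view are needed for full 360-degree coverage around a car,
# allowing some angular overlap between neighbors for data fusion.
import math

def units_for_panorama(fov_deg, overlap_deg=10.0):
    effective = fov_deg - overlap_deg  # unique coverage per unit
    return math.ceil(360.0 / effective)

# A 70-degree unit with 10 degrees of overlap at each seam: six units.
n = units_for_panorama(70.0)
```

That rough count of around half a dozen units is consistent with the multi-lidar vehicles described below.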
Legendary car designer Henrik Fisker was at CES talking about his latest vehicle, the Fisker EMotion, a luxury all-electric sedan that uses solid-state lidar from Quanergy. “For 100 years, the front end of cars has been designed around the radiator,” he told IEEE Spectrum. “Now we don’t need radiators. I thought, why don’t I design the front end around the lidar? It was really liberating, to design around a brand new technology.”
A Quanergy lidar takes pride of place on the nose of the EMotion, with four other units around the vehicle. Aptiv’s test BMW has no fewer than nine lidars built into it. Toyota’s latest self-driving research vehicle, codenamed Platform 3.0, has a total of eight lidars. Four of these are MEMS-based solid-state lidars from Luminar, a startup run by 22-year-old entrepreneur Austin Russell.
Another approach to steering lasers in solid-state lidar is to use a phased array. Strobe, a startup acquired by GM in October, likely uses one, along with an innovative frequency-modulated technology that lets the lidar measure the velocity as well as the range of other road users.
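The reason a frequency-modulated (FMCW) lidar can read velocity directly is that a moving target Doppler-shifts the returned light, and that shift separates cleanly from the range-induced delay when the laser frequency is swept up and then down. The sketch below illustrates the idea only; the chirp bandwidth, sweep time, and wavelength are assumed values, not Strobe's actual design.

```python
# Hedged sketch of triangular-chirp FMCW ranging: the beat frequencies
# measured on the up-chirp and down-chirp encode both range (delay)
# and closing velocity (Doppler shift). All parameters are illustrative.
C = 299_792_458.0     # speed of light, m/s
WAVELENGTH = 1550e-9  # a common telecom-band lidar wavelength, m
BANDWIDTH = 1e9       # chirp sweep bandwidth, Hz (assumed)
SWEEP_TIME = 10e-6    # duration of one chirp ramp, s (assumed)

def range_and_velocity(f_up_hz, f_down_hz):
    """Recover target range (m) and closing speed (m/s) from the beat
    frequencies on the up- and down-chirp. For an approaching target
    the Doppler shift lowers the up-chirp beat and raises the
    down-chirp beat, so the two components can be separated."""
    f_range = (f_up_hz + f_down_hz) / 2.0    # delay-induced component
    f_doppler = (f_down_hz - f_up_hz) / 2.0  # Doppler-induced component
    r = f_range * C * SWEEP_TIME / (2.0 * BANDWIDTH)
    v = f_doppler * WAVELENGTH / 2.0
    return r, v

# Illustrative beat frequencies for a car about 30 m away closing
# at roughly 10 m/s under the assumed chirp parameters.
r, v = range_and_velocity(7.10e6, 32.90e6)
```

Pulsed time-of-flight lidars, by contrast, must infer velocity by differencing positions across successive scans, which is noisier and slower.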
“Phased array steering for lidar is interesting,” said Urmson. “But the challenge with that approach is the loss you get through the system. Can you really suppress the sidelobes enough so that you can trust the measurements that come out? There’s a lot of interesting innovation in lidar and probably 20 to 50 startups trying different flavors.”
Some self-driving companies are already looking beyond lidar to other novel sensors. Mike Fleming was on one of the three winning teams of the DARPA Urban Challenge in 2007, and has been working on autonomous technology ever since. He is now CEO of Torc, a company that has built self-driving Humvees for the U.S. military and is now turning its attention to passenger cars.
“There’s so much competition within the lidar space, we’re seeing more capabilities and the cost coming down,” he said during a test ride in Torc’s self-driving Lexus at CES. “We’ve used infrared cameras and lidars at different wavelengths on other projects, and believe that self-driving car companies will have to design their architecture to support sensors that we haven’t even thought of yet.”