Tesla vs. Waymo Approach: Cameras Only/First vs. LiDAR

Starsky Robotics, a startup developing autonomous trucks, had difficulty raising money, as co-founder and CTO Kartik Tiwari reported. Although self-driving technology is a hot topic among venture capitalists and startups in the space can usually raise money easily, Starsky struggled.

The reason: the promise of developing autonomous vehicles without the use of LiDARs. The majority opinion in the self-driving car scene today is that LiDARs are a must, alongside cameras, radar, and ultrasound sensors. LiDAR is considered one of the key technologies, and billions are being poured into its development. It has advantages that other sensors cannot deliver – most importantly, the capability of creating a 3D image of the environment.

LiDAR First/Only

LiDARs send out light pulses and detect the reflected light. The delay and the attenuation of the reflection are then used to calculate the distance to an object, which makes it possible to create a 3D map. LiDARs are also not very sensitive to sunlight or darkness, which makes them reliable sensors under changing light conditions.
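The time-of-flight calculation behind this is simple: the pulse travels to the object and back, so the distance is half the round-trip delay times the speed of light. A minimal sketch of that relationship (illustrative only, not any vendor's actual signal processing):

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_delay(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, given the pulse's round-trip delay."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection arriving one microsecond after emission corresponds
# to an object roughly 150 meters away.
d = distance_from_delay(1e-6)
```

The extremely short delays involved – a 300-meter return takes about two microseconds – are one reason LiDAR electronics are demanding and expensive.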

But they do have some disadvantages. One is the price: high-performance LiDARs cost up to 70,000 dollars, and multiple units are mounted on a single car. Waymo, for instance, uses six on each of its currently 600 Pacifica minivans. Range is another challenge. Although the best-performing LiDARs from Velodyne and Waymo can see up to 300 meters, most available LiDARs are in the 100-meter range – not enough for driving on highways at higher speeds. The high-performance LiDARs also rely on a mechanical principle, with rotating lasers. These types of LiDARs last no more than a year, far too short for the 10-to-15-year lifecycle expected of automotive-grade hardware. Solid-state LiDARs would be the solution, but they are still under development and not ready for prime time.

Alongside Starsky Robotics, there is a whole range of companies – AutoX, Aurora, Mobileye, and most prominently Tesla – who believe, although still a minority, that they can forgo LiDARs, or use them only as a backup, and go with cameras first. New algorithms currently under development are expected to create 3D maps similar to those from LiDARs, but from these sensors alone and without LiDAR's cost.

Behavioral Cloning

That’s why Tesla cars produced and delivered since October 2016 have cameras, radar, and ultrasound sensors, but no LiDAR. On the one hand, algorithms and Tesla’s neural network are expected to generate that 3D information; on the other, something called behavioral cloning. Tens of thousands of Tesla owners already send gigabytes of driving data to Tesla every month, including millions of video clips and records of the owners’ driving styles. This data makes it possible to compute an average for a good human driver, which can then be fed into Tesla’s Autopilot.

Instead of driving a few hundred cars with great effort in a geofenced area – as Waymo does – and training a safe driver through machine learning, Tesla uses crowd intelligence and clones the driving behavior of good human drivers.
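In its simplest form, the "average of good human drivers" idea can be sketched as follows. The state and action names here are hypothetical placeholders, not Tesla's actual pipeline: discretize the driving situation and average the steering commands that logged human drivers applied in each situation.

```python
from collections import defaultdict

def fit_average_policy(logs):
    """Average the human action recorded for each discretized state.

    logs: iterable of (state, action) pairs collected from many drivers.
    Returns a dict mapping state -> mean action (the 'cloned' behavior).
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for state, action in logs:
        sums[state] += action
        counts[state] += 1
    return {state: sums[state] / counts[state] for state in sums}

# Hypothetical logs: state = binned lateral offset from the lane center,
# action = steering correction in degrees, from several drivers.
logs = [("left_of_center", -2.0), ("left_of_center", -3.0),
        ("right_of_center", 2.5), ("centered", 0.0)]
policy = fit_average_policy(logs)
```

Real behavioral cloning replaces the lookup table with a neural network trained on raw sensor input, but the principle is the same: supervised learning from recorded human behavior rather than trial and error in a geofenced test area.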

The Race

Who will win that race is not yet clear. It’s a race between falling prices in LiDAR technology and the development of LiDAR-less algorithms – but also between the use cases for autonomous vehicles.

Starsky Robotics argues that forgoing LiDARs on its trucks is justified by where trucks operate. Long-haul trucks typically drive only short distances on urban roads before entering long stretches of highway. And highways have no oncoming traffic, pedestrians, or traffic signals. Vehicles on highways just have to stay in their lanes and keep their distance from other cars, a relatively simple task.

Starsky’s trucks, for instance, cover more than 99 percent of their distance on highways. From the warehouse in Hayward it’s just 5 kilometers to the highway, then a 4,200-kilometer drive on the highway across several states to Georgia, and then another 2 kilometers to the destination. On the highway, cameras, radar, and ultrasound sensors are sufficient; for the few kilometers on urban roads, a remote vehicle operator takes control of the truck.
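A quick back-of-the-envelope check of the route figures above confirms the "more than 99 percent" claim:

```python
urban_km = 5 + 2      # warehouse to highway, plus highway exit to destination
highway_km = 4_200    # cross-country highway stretch to Georgia

highway_share = highway_km / (highway_km + urban_km) * 100
# Roughly 99.8 percent of the total distance is on highways.
```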

It’ll be interesting to see who wins the race. But my guess is that both sides will be proven right, depending on the use case.

This article was also published in German.