
Elon Musk is lying about Tesla’s self-driving and I have the DMs to prove it

Over the last few days, Elon Musk has been making several statements claiming that autonomous driving systems that use lidar and radar sensors are more dangerous than Tesla’s camera-only computer vision approach because the system gets confused when interpreting data from different sensors.

Not only is that false, but Musk told me directly that he agreed radar plus vision could be safer than vision alone, right after he had Tesla remove the radars from its vehicles.

Tesla has taken a controversial approach, using only cameras as sensors for driving inputs in its self-driving technology. In contrast, most other companies use cameras in conjunction with radar and lidar sensors.

When Tesla first announced in 2016 that all its cars produced from that point onward had the hardware capable of “full self-driving” up to level 5 autonomous capability, it included a front-facing radar in its self-driving hardware suite.


However, in 2021, having achieved nothing more than a level 2 driver-assist (ADAS) system with its self-driving effort, Elon Musk announced a move he called “Tesla Vision”: shifting Tesla’s self-driving effort to rely solely on camera inputs.

Here’s what I wrote in 2021 about Musk sharing his plan for Tesla to only use cameras and neural nets:

CEO Elon Musk has been hyping the vision-only update as “mind-blowing.” He insists that it will lead to a true level 5 autonomous driving system by the end of the year, but he has gotten that timeline wrong before.

By May 2021, Tesla had begun removing the radar sensor from its lineup, starting with the Model 3 and Model Y, and later the Model S and Model X in 2022.

Tesla engineers reportedly attempted to convince Musk to retain the use of radar, but the CEO overruled them.

We are now in 2025, and contrary to Musk’s claims, Tesla has yet to deliver on its self-driving promises, but the CEO is doubling down on his vision-only approach.

The controversial billionaire is making headlines this week for a series of new statements attacking Tesla’s self-driving rivals and their use of radar and lidar sensors.

Earlier this week, Musk took a jab at Waymo and claimed that “lidar and radar reduce safety”:

Lidar and radar reduce safety due to sensor contention. If lidars/radars disagree with cameras, which one wins? This sensor ambiguity causes increased, not decreased, risk. That’s why Waymos can’t drive on highways. We turned off radars in Teslas to increase safety. Cameras ftw.

The assertion that “Waymos can’t drive on highways” is simply false. Waymo has been conducting fully driverless employee testing on freeways in Phoenix, San Francisco, and Los Angeles for years, and it is expected to extend the capability to rider-only trips soon.

Tesla is in a similar situation with its Robotaxi: its vehicles don’t drive on freeways without an employee supervisor on board.

Musk later added:

LiDAR also does not work well in snow, rain or dust due to reflection scatter. That’s why Waymos stop working in any heavy precipitation. As I have said many times, there is a role for LiDAR in some circumstances and I personally oversaw the development of LiDAR for the SpaceX Dragon docking with Space Station. I am well aware of its strengths and weaknesses.

It’s not true that Waymos can’t work in “any heavy precipitation.”

Here’s a video of a Waymo vehicle driving by itself in heavy rain:

In comparison, Tesla’s own Robotaxi terms of service mention that it “may be limited or unavailable in inclement weather.”

Last month, Tesla Robotaxi riders had their rides cut short, and they were told it was due to the rain.

There’s plenty of evidence that Musk’s statements are wrong and misleading, but beyond that, he himself has admitted that radar sensors can make Tesla’s vision system safer.

‘Vision with high-res radar would be better than pure vision’

In May 2021, as Tesla began removing radar sensors from its vehicle lineup and transitioning to a vision-only approach, I was direct messaging (DMing) Musk to learn more about the surprising move.

In the conversation, he was already making the same claim he made this week in his comments attacking Waymo: that sensor contention lowers safety.

He wrote at the time:

The probability of safety will be higher with pure vision than vision+radar, not lower. Vision has become so good that radar actually reduces signal/noise.

However, what was more interesting is what he said shortly afterward:

Musk admitted that “vision with high-resolution radar would be better than pure vision”. However, he claimed that such a radar didn’t exist.

In the same conversation, I pointed Musk to existing high-definition millimeter wave radars, but he didn’t respond.

It was still early for that technology in 2021, but high-definition millimeter wave radars are now commonly used by companies developing autonomous driving technologies, including Waymo.

Waymo uses six high-definition radars in its system.

In short, Musk was already concerned about sensor contention in 2021, but he admitted that the problem would be worth solving with higher-definition radars, which already existed then and are becoming more common now.

Yet he criticizes companies that use radar and lidar, which works on the same principle as high-resolution radar but at different wavelengths, for even attempting sensor fusion.

It’s not impossible because Tesla can’t do it

Part of the problem here appears to be that Musk thinks something doesn’t work because Tesla can’t make it work, and he doesn’t want to admit that others are solving the sensor fusion problem.

Tesla simply couldn’t solve sensor fusion, so it focused on achieving autonomy solely through camera vision. However, those who continued to work on the issue have made significant progress and are now reaping the rewards.

Waymo and Baidu, both of which have level 4 autonomous driving systems currently commercially operating without supervision, unlike Tesla, have heavily invested in sensor fusion.

Amir Husain, an AI entrepreneur who sits on the Boards of Advisors for IBM Watson and the Department of Computer Science at UT Austin, points to advancements in the use of Kalman filters and Bayesian techniques to solve sensor noise covariance.

He commented on Musk’s statement regarding the use of radar and lidar sensors:

The issue isn’t a binary disagreement between two sensors. It generates a better estimate than any individual sensor can produce on its own. They all have a margin of error. Fusion helps reduce this.

If Musk’s argument held, why would the human brain use eyes, ears, and touch to estimate object location? Why would aircraft combine radar, IRST, and other passive sensors to estimate object location? This is a fundamental misunderstanding of information theory. Every channel has noise. But redundancy reduces uncertainty.
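Husain’s point, that fusing noisy sensors reduces uncertainty rather than adding ambiguity, can be sketched with the simplest possible case: combining two independent, noisy estimates of the same distance using an inverse-variance weighting, which is the one-dimensional static form of the Kalman update he references. The sensor values below are made-up illustrative numbers, not figures from Tesla or Waymo:

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance (Bayesian) fusion of two independent, noisy
    estimates of the same quantity -- the 1-D static Kalman update.
    The fused variance is always at most the smaller input variance,
    so the combined estimate is never worse than the better sensor."""
    k = var_a / (var_a + var_b)              # Kalman gain
    fused_est = est_a + k * (est_b - est_a)  # weighted toward the less noisy sensor
    fused_var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    return fused_est, fused_var

# Hypothetical readings: a camera depth estimate (noisier in range)
# and a radar range estimate (direct, more precise ranging).
camera_est, camera_var = 49.0, 4.0   # metres, variance in m^2
radar_est, radar_var = 50.2, 1.0

est, var = fuse(camera_est, camera_var, radar_est, radar_var)
print(f"fused estimate: {est:.2f} m, variance: {var:.2f} m^2")
# -> fused estimate: 49.96 m, variance: 0.80 m^2
```

There is no “which one wins” in this picture: the answer is a weighted blend, and the fused variance (0.80) is lower than either sensor’s alone, which is the redundancy-reduces-uncertainty argument in miniature.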

Musk’s main argument for focusing on cameras and neural nets has been that roads are designed for humans to drive, and humans drive using their eyes and brain, the hardware and software equivalents of cameras (eyes) and neural nets (brain).

Now, most other companies developing autonomous driving technologies are also focusing on this, but to surpass humans and achieve greater levels of safety through precision and redundancy, they are also adding radar and lidar sensors to their systems.

Electrek’s Take

Musk painted Tesla into a corner with its vision-only approach, and now he is trying to mislead people into thinking that it is the only one that can work, when there’s no substantial evidence to support this claim.

Now, let me be clear, Musk is partly correct. When poorly fused, multi-sensor data introduces noise, making it more challenging to operate an autonomous driving system.

However, who said that this is an unsolvable problem? Others appear to be solving it, and we are seeing the results in Waymo’s and Baidu’s commercially available rider-only taxi services.

If you can take advantage of radar’s ability to detect distance and speed as well as work through rain, fog, dust, and snow, why wouldn’t you use it?

As he admitted in his 2021 DMs with me, Musk is aware of this, which is why he acknowledged that high-resolution radar combined with vision would be safer than vision alone.

The problem is that Tesla hasn’t focused on improving sensor fusion and radar integration in the last four years because it has been all-in on vision.

Now, Tesla could potentially still solve self-driving with its vision system, but there’s no evidence that it is close to happening or any safer than other systems, such as Waymo’s, which use radar and lidar sensors.

In fact, Tesla is still only operating an autonomous driving system under the supervision of in-car employees with a few dozen cars, while Waymo has been doing rider-only rides for years and operates over 1,500 autonomous vehicles in the US.

Just like with his “Robotaxi” with supervisors, Musk is trying to create the illusion that Tesla is not only leading in autonomy, but it is the only one that can solve it.

FTC: We use income earning auto affiliate links.

