InsideEVs
Technology
Steven Loveday

Tesla Is Now Removing Ultrasonic Sensors In Move To Vision Only

Tesla made a bold move by removing the radar from its cars in favor of its camera-based, vision-only Tesla Vision approach. Critics called it a bad idea and doubted it could work, but it didn't take long before safety organizations tested the vision-only features and approved them. Now, Tesla has announced that it will take the vision-only setup even further by eliminating ultrasonic sensors from its cars.

Much like the situation with the removal of radar, Tesla will certainly face some pushback here. Its cars' safety features will also need to be retested and approved all over again, since the systems will work differently. That means its cars will likely lose various awards and recommendations until they're proven to work as intended without the ultrasonic sensors.

Tesla's original Autopilot suite featured eight cameras, forward-facing radar, and a host of ultrasonic sensors. The company insists that if a car is going to drive like a human, it needs to "see" the world around it and react accordingly. Tesla has also noted that the various sensor technologies can contradict one another, and that camera-based vision, paired with AI and a neural network (essentially the car's brain), has proven the most successful and can see far more than any human.

Tesla announced this week:

"Today, we are taking the next step in Tesla Vision by removing ultrasonic sensors (USS) from Model 3 and Model Y. We will continue this rollout with Model 3 and Model Y, globally, over the next few months, followed by Model S and Model X in 2023."

As with the removal of radar, Tesla is starting with the Model 3 and Model Y, and will begin removing the USS from its flagship Model S and Model X vehicles next year.

According to Electrek, the USS were especially helpful for short-range object detection. For example, they helped the car with Autopark and collision warnings. However, if the car can "see" the world around it, render it, and analyze distances, Tesla could argue that the sensors create unnecessary redundancy.

Tesla went on to explain that it's using software and its neural network to replace the data provided by the USS with data collected via its vision-based system. The company shared that "this approach gives Autopilot high-definition spatial positioning, longer range visibility and ability to identify and differentiate between objects. As with many Tesla features, our occupancy network will continue to improve rapidly over time."

During the transition period, Tesla notes that certain features, including Park Assist, Autopark, Summon, and Smart Summon, will either be limited or inactive. Once the company can prove performance parity without the USS, it will restore the features.

What say you? Do you agree with and support Tesla's transition to advanced driver-assist systems powered solely by vision? Let us know your thoughts in the comment section below.
