
Tesla responds to Autopilot issues with major new radar-reliant update

Tesla owners have come to expect frequent, and sometimes substantial, over-the-air (OTA) updates to their cars, especially for systems like Autopilot, which is technically still in beta. But there is little doubt that the scope and timing of its latest release, version 8, were driven in part by the widely covered fatal crash in Florida, where a Model S failed to detect a white truck crossing its path against a bright sky. For those tracking the details, the issue wasn't really related to the car's autonomous features, but to its automatic emergency braking (AEB) system. Either way, the blame was placed on a shortcoming in Mobileye's camera-based object detection system, leading to a parting of ways between the two companies.

Unlike typical cameras, radar is unaffected by lighting and much less sensitive to atmospheric conditions like fog. Tesla models compatible with Autopilot (those built since October 2014) include a front-facing radar, but until now the system relied primarily on camera input. With Version 8.0, which Tesla CEO Elon Musk said required an effort to fit into the memory capacity of some models, the radar will be pulsed up to 10 times per second and used to create a 3D image of what is in front of the car.

Overhead signs present a challenge for autopilot systems

While radar is much better than a camera in poor conditions, and is great at detecting metal, it has a harder time seeing people, wood, and plastic. That has meant Tesla has had to put a lot of work into the signal-processing software that operates on the raw radar data, both to avoid over-reacting to small pieces of metal like a soda can and to correctly detect non-metallic objects. Part of the update increases the density of the raw point cloud from the radar by a factor of six.
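Tesla hasn't published its filtering logic, but the trade-off described above can be illustrated with a minimal sketch: only treat a radar return as a braking threat if it both reflects strongly enough and persists across several pulses. All names, thresholds, and fields here are illustrative assumptions, not Tesla's actual code.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One raw radar return (all fields are illustrative)."""
    x: float          # meters ahead of the car
    rcs: float        # radar cross-section in dBsm (metal reflects strongly)
    hits: int = 1     # consecutive radar pulses this object has appeared in

def is_braking_threat(det: Detection,
                      min_rcs: float = -5.0,
                      min_hits: int = 3) -> bool:
    """Flag a return as a threat only if it reflects strongly enough AND
    persists across several pulses, so a momentary glint off a soda can
    doesn't trigger emergency braking while a truck still does."""
    return det.rcs >= min_rcs and det.hits >= min_hits

# A strong, persistent return (e.g. a stopped truck) is a threat...
assert is_braking_threat(Detection(x=40.0, rcs=10.0, hits=5))
# ...but a one-pulse glint, however bright, is not.
assert not is_braking_threat(Detection(x=40.0, rcs=12.0, hits=1))
```

Pulsing the radar up to 10 times per second gives the persistence check many chances per second to confirm or reject an object before the car must act.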

One interesting aspect of Tesla’s system is that if one car detects an object (for example, an overhead sign) that turns out not to be an issue, the data is geotagged and shared. If several cars report the same experience, the entire fleet can be taught to ignore it. Personally, while this type of learning works great for marking speed traps and red-light cameras, it is a little spooky to think about an emergency system ignoring an input because of cloud-sourced data. Nvidia’s Jen-Hsun Huang demonstrated a different version of this technique at GTC, where vehicles share image data that can be post-processed and used to provide an entire fleet of cars with improved “ground truth” maps.
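The fleet-learning loop described above is essentially a voting whitelist keyed on location. A minimal sketch, assuming a coarse geographic grid and a report threshold of three cars (both numbers invented for illustration):

```python
from collections import defaultdict

REPORTS_NEEDED = 3            # assumed threshold before the fleet learns
_reports = defaultdict(set)   # grid cell -> IDs of cars reporting a false alarm
_whitelist = set()            # cells the whole fleet should ignore

def _cell(lat: float, lon: float) -> tuple:
    # Round to ~100 m so nearby reports from different cars land in
    # the same bucket (grid resolution is an assumption).
    return (round(lat, 3), round(lon, 3))

def report_false_alarm(car_id: str, lat: float, lon: float) -> None:
    """A car reports that an object here (e.g. an overhead sign) was harmless."""
    cell = _cell(lat, lon)
    _reports[cell].add(car_id)
    if len(_reports[cell]) >= REPORTS_NEEDED:
        _whitelist.add(cell)  # enough independent reports: teach the fleet

def should_ignore(lat: float, lon: float) -> bool:
    """Any car in the fleet checks this before reacting to a detection here."""
    return _cell(lat, lon) in _whitelist
```

Requiring multiple distinct cars before whitelisting is what keeps one car's sensor glitch from silencing an emergency system for everyone, though, as noted above, it is still unsettling that the final decision rests on shared data.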

Tesla is clearly both concerned and frustrated by drivers who abuse the Autopilot system by repeatedly taking their hands off the wheel for long periods. So Version 8.0 also includes a feature that will disable the system after three such incidents in an hour; the driver will need to pull over and put the car in park to re-enable it. Musk is upfront about his enthusiasm for Tesla’s Autopilot offering, as it is part of what he says makes “the Model S and X by far the safest cars on the road.” Making better use of the radar should help make them even safer. A variety of other small upgrades and bug fixes in the release are detailed by Tesla.
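The three-strikes lockout amounts to counting warnings in a rolling one-hour window and requiring an explicit park to reset. A sketch of that logic, with the class name and API invented for illustration:

```python
from collections import deque

class AutopilotLockout:
    """Sketch of a three-strikes-per-hour lockout (parameters assumed)."""

    def __init__(self, max_strikes: int = 3, window_s: float = 3600.0):
        self.max_strikes = max_strikes
        self.window_s = window_s          # rolling window: one hour
        self._strikes = deque()           # timestamps of hands-off incidents
        self.locked = False

    def record_warning(self, now: float) -> None:
        """Called each time the driver ignores a hands-on-wheel warning."""
        self._strikes.append(now)
        # Drop incidents that have aged out of the rolling window.
        while self._strikes and now - self._strikes[0] > self.window_s:
            self._strikes.popleft()
        if len(self._strikes) >= self.max_strikes:
            self.locked = True            # Autopilot refuses to re-engage

    def park(self) -> None:
        """Pulling over and shifting into park clears the lockout."""
        self.locked = False
        self._strikes.clear()
```

Making the reset require a deliberate physical action (parking) rather than a timeout is the point: the driver cannot simply wait out the lockout while still moving.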
