Tesla to take back some Autopilot controls

Tesla CEO Elon Musk believes that in 15 to 20 years, owning a car that can't drive itself will be like using a horse for travel today


Just a couple of weeks after releasing new self-driving features for its Model S vehicles through a software upgrade, Tesla will place new constraints on when Autopilot can be used.

The comments about Autopilot came from Tesla CEO Elon Musk during a question-and-answer session after the company reported its third-quarter financial results yesterday.


The Model S now has automatic steering, lane keeping and automatic lane change.

Asked whether he'd seen videos of drivers taking their hands off the steering wheels of Model S sedans and allowing the Autopilot feature to take control, Musk said he thinks changes must be made for safety.

"There's been some fairly crazy videos on YouTube.... This is not good. And we will be putting some additional constraints on when Autopilot can be activated to minimize the possibility of people doing crazy things with it," he said.

According to Musk, nearly one million cars have already installed the over-the-air software upgrade to the Model S Autopilot feature, which includes a beta version of Autosteer and Auto Lane Change.

While Autosteer is more akin to enhanced adaptive cruise control with lane-keeping assist than to a fully self-driving system, it does automatically maintain the Model S's distance from surrounding cars.

Autosteer uses a variety of inputs, including steering angle, steering rate and speed, to determine the appropriate maneuver.

Another feature, Auto Lane Change, is engaged when the turn signal is used. The Model S will move itself to the adjacent lane when it's safe.


The Tesla Model S instrument panel and tablet-like infotainment center.

Model S drivers have taken to posting videos to YouTube showing how their vehicle can drive hands free, a practice Tesla discourages.

Tesla has been tracking data on accidents involving Model S P85D vehicles using Autopilot, and while it's still early, Musk said the data is "very positive."

Musk, however, admitted that reports of Autopilot errors should come as no surprise as it's a beta program that will need to "learn over time."

"And the system is getting better with each passing week. I think it will start to feel quite refined within a few months," he said.

Far from being dangerous, Autopilot has caused no reported accidents, Musk said, and has, in fact, prevented some.

"This is still early, but it's a good indication. So it appears to be quite beneficial from a safety standpoint, and I believe some of our customers have posted videos to this fact," Musk said.

Unlike traditional car manufacturers, which tend to bundle upgrades into a model year, Tesla constantly engineers upgrades for both its existing fleet and the vehicles it is building.

According to Musk, the company's philosophy is to continuously make improvements, "so every week there are approximately 20 engineering changes made to the car. So model year doesn't mean as much. There are cases where that step change may be a little higher than normal as, for example, with having the Autopilot camera, radar, and ultrasonics. But we try to actually keep those step changes as small as possible."

While Autopilot's features allow for driver assist functions today, Musk said he expects Tesla to produce a fully autonomous vehicle within about three years.

In 15 to 20 years, it will be "quite unusual" to even see a car rolling off an assembly line that's not fully autonomous, he said.

"And for Tesla, it will be a lot sooner than that," Musk said, adding that cars without full autonomy will be seen as having a negative value. "It will be like owning a horse or not - you're really owning it for sentimental reasons."

Google, which has been building and testing its own self-driving vehicle, released a report this week indicating it would be safer to ensure drivers cannot take control of a self-driving car.

In its report, Google said it had spent time considering features that would allow its autonomous car to "hand off" control to the driver. But Google called the handoff a real problem, because drivers' reaction times proved markedly slow.

Google referenced a study by the Virginia Tech Transportation Institute that found drivers required somewhere between five and eight seconds to safely regain control of a semi-autonomous system.

In August, the National Highway Traffic Safety Administration published a study that revealed some participants took up to 17 seconds to respond to autonomous vehicle alerts and retake control of the vehicle.

"There's also the challenge of context -- once you take back control, do you have enough understanding of what's going on around the vehicle to make the right decision?" Google stated. "In the end, our tests led us to our decision to develop vehicles that could drive themselves from point A to B, with no human intervention."

"Everyone thinks getting a car to drive itself is hard. It is. But we suspect it's probably just as hard to get people to pay attention when they're bored or tired and the technology is saying, 'Don't worry, I've got this..., for now.' "
