Consumer Reports has publicly called on Tesla to disable the automatic steering portion of Autopilot in the wake of the fatal accident that took the life of Joshua Brown. Tesla’s Autopilot allows the vehicle to automatically steer, accelerate and brake when navigating highways with lane markings. It should be deactivated “until it can be reprogrammed to require drivers to keep their hands on the steering wheel,” says the consumer watchdog organization.
The editors of Consumer Reports say the name Autopilot is “misleading and potentially dangerous.” They want Tesla to block its automatic steering technology, overhaul it, and rename it. Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports, said in a statement that self-driving systems “could make our roads safer” eventually, “but today, we’re deeply concerned that consumers are being sold a pile of promises about unproven technology.”
That’s quite a reversal for an organization that tested a Tesla with Autopilot last October and reported that it “worked quite well,” given its limitations.
Tesla and Elon Musk are sticking to their guns. “Tesla is constantly introducing enhancements proven over millions of miles of internal testing to ensure that drivers supported by Autopilot remain safer than those operating without assistance,” Tesla said in a statement on July 14. “We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well meaning advice from any individual or group, we make our decisions on the basis of real world data, not speculation by media.”
At issue are how long the car will continue to drive in semi-autonomous mode even when the system detects no hands on the wheel, and how the system alerts drivers that it is time for them to resume direct control of the car. In a recent crash involving a Model X driving on a twisty road in Montana, the company says there were no hands on the wheel for more than 2 minutes. The car was traveling at 60 miles an hour, which means it went more than 2 miles with no human input. The driver says he was unaware the car was directing him to take control because his native language is Mandarin, not English.
Also, some drivers report they were unaware the system had handed back control to them, leaving them responsible for driving the car. Ambiguity is not in anyone’s best interests when it comes to driving a motor vehicle.
“Tesla Autopilot functions like the systems that airplane pilots use when conditions are clear,” Tesla said. “The driver is still responsible for, and ultimately in control of, the car. This is enforced with onboard monitoring and alerts. To further ensure drivers remain aware of what the car does and does not see, Tesla Autopilot also provides intuitive access to the information the car is using to inform its actions.” Some drivers feel that “intuitive access” is less successful than it could be. That’s an area Tesla could address fairly easily by making warnings clearer and less ambiguous.
Consumer Reports’ suggestion seems more than a little over the top. Still, Tesla has to tread carefully here. Rumor and innuendo can have a strongly negative effect on consumer opinion. Some people may remember the maelstrom surrounding the Audi 5000 sudden unintended acceleration controversy in the 1980s. 60 Minutes got involved and people started calling it a “death car.” Audi sales plummeted and the company nearly went out of business.
There are hundreds of thousands of motor vehicle accidents every year on America’s roads. Few ever garner any media attention. Why is this one crash causing such a commotion? “If it bleeds, it leads,” is a popular expression in the news business, and the media have been quick to make a cause célèbre out of Brown’s death.
Elon is not easily dissuaded from his chosen course. But there is ample evidence to suggest that human drivers are not as alert and tech-savvy as the company perhaps assumes they are. The trick is to satisfy any safety concerns without stripping the Autopilot system of its life-saving features. Ultimately, the question comes down to whether the death of one driver should be an excuse for failing to protect hundreds if not thousands of other drivers from injury or death.