Friday, July 15, 2016

TESLA KAMIKAZE PILOT

You have to hand it to Tesla. They have no problem admitting that they use their customers as lab rats in testing their technology.

In this instance, the test subject lost their life in a beta test of what Tesla calls "autopilot." Tesla nonetheless cautions drivers to stay alert and keep their hands on the wheel, a warning that flatly contradicts the definition of "auto" pilot. This raises a very serious question: should Tesla be allowed to work out the flaws in its technology using its customers as test subjects?

Following a series of crashes, one of which was fatal, Tesla Motors, the automaker known for its high-performance electric vehicles and envelope-pushing technology, is now under intense scrutiny for the way it deployed and marketed its Autopilot driving-assist system.

The company’s aggressive roll-out of self-driving technology—in what it calls a “beta-test”—is forcing safety agencies and automakers to reassess the basic relationship between human drivers and their increasingly sophisticated cars. Last week, the National Highway Traffic Safety Administration (NHTSA) sent a letter to Tesla requesting detailed information about Autopilot, including any design changes and updates to the system, as well as detailed logs of when the system has prompted drivers to take over steering.

The most serious of the Autopilot crashes happened in Florida on May 7. According to the accident report, 40-year-old Ohio resident Joshua Brown died in a collision near Williston, Fla., with a tractor trailer that was making a left turn in front of his Model S. Tesla later acknowledged that the car was in Autopilot mode at the time. On June 30, Tesla published a blog post about the accident, stating “neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

Autopilot comprises multiple systems (including Autosteer and Auto Lane Change) that use cameras, radar, ultrasonic sensors, and data to, in Tesla’s words, “automatically steer down the highway, change lanes, and adjust speed in response to traffic.” The company also claims the features “help the car avoid hazards and reduce the driver’s workload.”

The Florida crash has prompted investigations by NHTSA and the National Transportation Safety Board (NTSB). Meanwhile, The Wall Street Journal reported the Securities and Exchange Commission is investigating whether Tesla failed to tell investors about the crash in a timely fashion.

While the exact cause of the fatal accident is not yet known, the incident has caused safety advocates, including Consumer Reports, to question whether the name Autopilot, as well as the marketing hype of its roll-out, promoted a dangerously premature assumption that the Model S was capable of truly driving on its own. Tesla’s own press release for the system announced “Your Autopilot has arrived” and promised to relieve drivers “of the most tedious and potentially dangerous aspects of road travel.” But the release also states that the driver “is still responsible for, and ultimately in control of, the car.”

Consumer Reports experts believe that these two messages—your vehicle can drive itself, but you may need to take over the controls at a moment’s notice—create potential for driver confusion. The mixed message also increases the possibility that drivers using Autopilot may not be engaged enough to react quickly to emergency situations. Many automakers are introducing this type of semi-autonomous technology into their vehicles at a rapid pace, but Tesla has been uniquely aggressive in its deployment. It is the only manufacturer that allows drivers to take their hands off the wheel for significant periods of time, and the fatal crash has brought the potential risks into sharp relief.

"By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security," says Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports. "In the long run, advanced active safety technologies in vehicles could make our roads safer. But today, we're deeply concerned that consumers are being sold a pile of promises about unproven technology. 'Autopilot' can't actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time. Tesla should disable automatic steering in its cars until it updates the program to verify that the driver's hands are on the wheel."

Companies must commit immediately to name automated features with descriptive—not exaggerated—titles, MacCleery adds, noting that automakers should roll out new features only when they're certain they are safe.

“Consumers should never be guinea pigs for vehicle safety 'beta' programs,” she says. “At the same time, regulators urgently need to step up their oversight of cars with these active safety features. NHTSA should insist on expert, independent third-party testing and certification for these features, and issue mandatory safety standards to ensure that they operate safely."


READ MORE: Consumer Reports urges Tesla to disable autopilot after driver’s death
