04/04/2018 / By Lance D Johnson
Tesla’s Autopilot feature isn’t foolproof, and drivers must be ready to take control of the wheel at all times. In one man’s test video, the Autopilot feature became confused by the lines on the road and nearly sent the driver head-on into a concrete divider. The man was trying to demonstrate what could have gone wrong in a fatal accident involving a Tesla Model X on Autopilot in Mountain View just a month earlier.
An investigation by the U.S. National Transportation Safety Board found that the Tesla Model X in question was indeed on Autopilot when it crashed. The adaptive cruise control had been set to follow at the minimum distance before the fatal accident occurred. The data logs confirmed that the driver did not have his hands on the wheel in the six seconds before the crash. Earlier in the drive, the driver had been given an audible warning that his hands were not on the wheel. According to the logs, the driver had five seconds and 150 meters of unobstructed view of the divider before impact, but “no action was taken.”
This brings up the real problem with self-driving vehicles. Drivers who trust the autopilot feature will naturally pay less attention to the road and their surroundings. If a fault arises in the autopilot technology, the driver could be caught unaware, not ready to take control of the vehicle. No action was taken in this crash likely because the driver wasn’t paying attention and didn’t expect to need to steer clear of a concrete divider. In the real world, distracted drivers won’t see these new, unexpected problems coming. Drivers who enjoy autopilot won’t be fully aware of their surroundings because they trust the technology to take care of them, and at the last second they won’t be able to make important judgment calls. (Related: Safety not a major concern as House gives self-driving cars the fast lane.)
In the video re-enactment of the crash, the driver never grabs the steering wheel in time, ignoring the alert sent just a few seconds before the car reaches the concrete barrier head-on. Apparently, Tesla’s Autosteer feature became confused and locked onto the far-left line, which was actually the right line of the exit ramp. That line was clearly marked and easily detected, unlike the actual left line of the lane the car was supposed to follow. The confused Autosteer feature directed the car straight into the barrier, giving the driver just a few seconds to react. Many drivers could be distracted, texting, or spacing out in a circumstance like this. By the time they figure out where Autosteer went wrong, it could be too late to react.
Unknown flaws in autopilot technology + blind trust in the technology + distracted drivers + overmedicated, ailing cognition will inevitably add up to more accidents and fatalities on the road. To keep up to date with autonomous vehicle concerns, visit Glitch.News.