Drivers said they are willing to take on the risk, even if they have to intervene, because they believe they are on a world-changing mission
Smith is now part of a group of at least 12,000 beta testers for Tesla’s polarizing “Full Self-Driving” software, which can attempt many everyday driving tasks, albeit sometimes unpredictably. Despite its flaws, Smith believes the software makes him a safer driver. He is willing to take on the risk even though he knows he might have to intervene when the software makes mistakes: running a red light, driving onto light-rail tracks or nearly striking a person in a crosswalk, all scenarios that beta testers interviewed by The Washington Post have encountered on the road.
“It de-stresses me,” he said in an interview. “I observe more. I’m more aware of everything around. I feel safer with it on.”
At the heart of Tesla’s strategy is a bold bet that the thousands of chosen test drivers, many of whom passed a safety screening that monitored their driving for a week or more, will scoop up enough real-world data to rapidly improve the software on the fly. In navigating public roads with unproven software, Tesla’s Full Self-Driving beta testers have not just volunteered to help, but have taken on the liability for any mistakes the software might make.