The automaker may have undermined safety in designing its Autopilot driver-assistance system to fit its chief executive’s vision, former employees say.

Elon Musk built his electric car company, Tesla, around the promise that it represented the future of driving — a phrase emblazoned on the automaker’s website, The New York Times reports.

Much of that promise was centered on Autopilot, a system of features that could steer, brake and accelerate the company’s sleek electric vehicles on highways. Over and over, Mr. Musk declared that truly autonomous driving was nearly at hand — the day when a Tesla could drive itself — and that the capability would be whisked to drivers over the air in software updates.

Unlike technologists at almost every other company working on self-driving vehicles, Mr. Musk insisted that autonomy could be achieved solely with cameras tracking a car's surroundings. But many Tesla engineers questioned whether it was safe enough to rely on cameras without the benefit of other sensing devices — and whether Mr. Musk was promising drivers too much about Autopilot's capabilities.

Now those questions are at the heart of an investigation by the National Highway Traffic Safety Administration after at least 12 accidents in which Teslas using Autopilot drove into parked fire trucks, police cars and other emergency vehicles, killing one person and injuring 17 others.

Families are suing Tesla over fatal crashes, and Tesla customers are suing the company for misrepresenting Autopilot and a set of sister services called Full Self-Driving, or F.S.D.


© Copyright LaPresse