Tesla Inc. filed a recall covering more than 2 million vehicles after the top US auto-safety regulator determined its driver-assistance system Autopilot doesn’t do enough to prevent misuse.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety, called the software update a compromise that addresses neither the lack of night-vision cameras to monitor drivers’ eyes nor Teslas’ failure to spot and stop for obstacles.
“The compromise is disappointing because it does not fix the problem that the older cars do not have adequate hardware for driver monitoring,” Koopman said.
Koopman and Michael Brooks, executive director of the nonprofit Center for Auto Safety, contend that crashing into emergency vehicles is a safety defect that isn’t addressed. “It’s not digging at the root of what the investigation is looking at,” Brooks said. “It’s not answering the question of why are Teslas on Autopilot not detecting and responding to emergency activity?”
Koopman said NHTSA apparently decided that the software change was the most it could get from the company, “and the benefits of doing this now outweigh the costs of spending another year wrangling with Tesla.”
Disappointing article. The Associated Press’s coverage is better: