The family of a North Carolina man who died after driving his car off a collapsed bridge while following Google Maps directions is suing the technology giant for negligence.
I’m going to get downvoted to hell here, but if you defend Google here, you should also defend Tesla when someone severely misuses Autopilot.

Play games on AP and don’t pay attention, causing a crash? Not Tesla’s fault. Drive off a bridge because the GPS tells you to? Not Google’s fault.
You’re responsible for driving your car at all times.
At least AP presents itself (somewhat) as an autonomous system, though… even the best GPS obviously still requires you to look at the road.
When you sit in the car, it really doesn’t, though. When you enable it, it clearly warns you about the dangers involved and tells you to always pay attention. The radar versions even warned you about specific situations where it would fail and could potentially cause a fatal accident. All cars that rely on radar have that issue and warn their users.

Anyone who’s used it knows it clearly has problems, and honestly, it can be a little nerve-racking to get used to at first, because it does have problems and you need to learn them. My partner doesn’t like using it because of those problems.
Most of the people causing accidents on it have simply grown accustomed to it. They know when it will usually fail, then make poor choices and end up in a rare circumstance. People are just people and make all sorts of bad choices. Some people follow the GPS off a bridge or into a lake.
That’s not to say there’s no room for Tesla to improve on this, like using the in-cabin camera to help detect whether someone is paying attention, but ultimately it falls on the driver to pay attention.
If you happen to be given a Tesla with AP already enabled on your profile, and you’ve only gone off what you heard in the media, then sure, maybe. But those aren’t the people causing problems. And really, if you rent a Tesla, I really do hope it’s all disabled by default so you have to turn it on and go through the setup yourself. Otherwise, that would be a legitimate problem in my mind.
The biggest fault here lies with whoever was in charge of that bridge. If it collapsed nine years ago, why was it not blocked off?
My exact first thought. And why not a billion BIG red signs saying shit like: “Collapsed bridge ahead”, “Warning: immediate death ahead”, “What the fuck are you doing?! Turn around”, etc.
That’s the biggest question for me.
A GPS is a tool that aids a person.

Tesla FSD is marketed as Self-Driving.
User error caused this man to drive off a bridge. User error does not excuse Tesla accidents when the user is supposed to be hands-off. One was an accident caused by a person; the other was an accident caused by a machine attempting to make human decisions. There’s a huge difference.
I wasn’t talking about FSD, I was talking about AP.
Although if you use FSD, to sign up you need to acknowledge this (among other things):

“Full Self-Driving is in early limited access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent.”
If it leaves Beta in V12 and that warning is gone, there will probably be problems =( It’s not ready to lose such an extreme warning. And it really shouldn’t leave Beta until Tesla takes on liability and it’s legitimately full self-driving.