Tesla’s Self-Driving Car Involved in First Fatality

The first fatality in a self-driving car has been recorded.

The victim, Joshua Brown, was driving a Tesla Model S equipped with the beta Autopilot system. Brown was using the feature when an 18-wheel tractor-trailer crossed the highway in front of him; the Model S passed under the trailer, which impacted the roof and windshield. The car then traveled roughly 100 feet before striking a utility pole.

As Tesla stated in its blog post, “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

Tesla also went on to say that if the Model S had crashed into the trailer’s front or rear end, the car’s crash safety features would have engaged and likely prevented serious injury. Unfortunately, in this case, they did not.

So who’s at fault?

Prior to the fatality, Joshua Brown was a huge advocate of the car. He appeared on YouTube praising the vehicle, nicknaming it “Tessy” and boasting that it had already saved him from an accident. Tesla has also made it clear that these cars are still in beta testing, and while Autopilot has logged 130 million miles without a fatality, drivers still need to keep their hands on the wheel at all times and be prepared to take over if necessary. In fact, the car is supposed to gradually slow down when the driver’s hands are not detected on the wheel.

As a Tesla owner myself, I am a huge fan of the brand. However, self-driving cars create a unique set of problems for consumers and the potential victims of an accident. At the end of the day, computers are prone to failure like any other technological device. With that in mind, should cars in beta testing be allowed on the road? And most importantly, who is responsible when a self-driving car crashes into another vehicle? Does liability fall on the driver or on the brand that created the autonomous vehicle? Should blame be shared if the self-driving features don’t do exactly what they are marketed to do? In this particular case, were the driver’s hands on the wheel, and if not, did the safety features warn him of his improper use of the Autopilot feature?

Another question is one of morality. Should self-driving cars be programmed solely to protect their owners, or do they need to factor in a wider range of circumstances, such as a child running across the street, a situation in which human drivers will often swerve to avoid a collision? These are questions we need to consider as the legal debate over self-driving accidents intensifies over the next decade.

As for the company’s share of the blame, Tesla had this to say in a statement last Friday: “Autopilot is by far the most advanced driver assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility.”

Tesla’s stance is clear. But what if you’re the victim of a self-driving car accident? The answer might not be so clear.