
Joshua Brown, 40, Tesla Model S Driver, Killed in Williston, FL

Who:

Joshua Brown, 40, of Canton, Ohio, died on May 7th, 2016, at 3:40 PM in Williston, Florida, when his 2015 Tesla Model S, operating in “Autopilot” mode, collided with a tractor-trailer driven by Frank Baressi, 62. Brown died at the scene; Baressi was uninjured.

How:

Per Florida Highway Patrol reports, the crash occurred on US 27A, a divided highway. The semi was west-bound while the Model S was east-bound and traveling with “Autopilot,” a semi-autonomous driving mode, activated. The semi turned left across the car’s path, and the car continued at an estimated 65 mph into and underneath the trailer at a broadside angle (i.e., a side underride). Tesla’s electronic records indicate that the brakes were applied by neither the vehicle nor the driver, and that none of the car’s systems detected the semi ahead of it.

The windshield was penetrated and the roof was torn off at impact, presumably killing the driver at this point. The car continued beneath the trailer, staying east-bound on the roadway, until it eventually left the road on the right side (the south shoulder), struck multiple fences and a power pole, and came to rest about 100 feet from the highway.

The driver of the semi stated that Brown was watching a film (Harry Potter) at the time of the crash; he later clarified that he had not seen the film but had heard it playing. Police later found a portable DVD player in the vehicle, but did not note whether it had been playing at the time of the collision.

Tesla blogged about the crash in late June, shortly before the NHTSA announced its intent to open a preliminary evaluation of the Autopilot system. The company stated that the Autopilot camera failed to pick out the white trailer against a brightly lit sky, so the vehicle never applied the brakes. Elon Musk, Tesla’s CEO, added that the vehicle’s radar did not register the trailer because it was designed to ignore objects resembling overhead road signs (i.e., the trailer’s height fooled the system into discounting it). Tesla ultimately placed responsibility on the driver, noting that Autopilot is a semi-autonomous system and was never designed to replace attentive driving.
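To make that reported failure mode concrete, here is a toy sketch of how a perception filter that discards high-mounted returns as “overhead structures” could also discard the raised side of a trailer. This is purely illustrative Python under my own assumptions; the data structure, threshold, and logic are invented for the example and are in no way Tesla’s actual code.

```python
from dataclasses import dataclass

# Toy illustration only: a hypothetical filter that drops radar returns whose
# lowest edge sits well above the road, the way Autopilot reportedly
# discounted objects resembling overhead signs. All values are invented.
@dataclass
class RadarReturn:
    distance_m: float       # range to the detected object
    bottom_height_m: float  # height of the object's lowest edge above the road

OVERHEAD_IGNORE_HEIGHT_M = 1.0  # assumed cutoff: treat anything this high as a sign or bridge

def is_braking_relevant(ret: RadarReturn) -> bool:
    """A return only triggers braking if its lowest edge is near road level."""
    return ret.bottom_height_m < OVERHEAD_IGNORE_HEIGHT_M

# A trailer bed raised ~1.2 m above the pavement is filtered out exactly like
# a genuine overhead sign, leaving nothing for the car to brake for.
print(is_braking_relevant(RadarReturn(distance_m=50, bottom_height_m=1.2)))  # False (ignored)
print(is_braking_relevant(RadarReturn(distance_m=50, bottom_height_m=0.3)))  # True (e.g., a car bumper)
```

The point of the sketch is simply that a filter tuned to reject overhead signs has no way, on height alone, to distinguish them from the underside of a high trailer, which is exactly the gap a side underride guard would physically fill.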

Why:
A side underride guard (standard in the EU) could have significantly mitigated the severity of this collision.

This is an unfortunate case that has sent ripples through the Tesla and autonomous driving communities, and for good reason. I’ve written about underride collisions before, and my reaction to this crash is no different in this respect. Responsibility can be pinned on the driver of the Tesla (who was inattentive enough to have made no attempt to brake or to steer out of the path of the semi-trailer), on the driver of the semi (whom the Tesla community judges most strongly at fault, under the tenet in US traffic law that a vehicle that fails to yield to one with the right of way is generally automatically responsible for a collision), or on Tesla and its designers (given the vehicle’s inability to detect a massive and massively dangerous roadway obstacle). Regardless of how that blame is divided, I firmly believe the ultimate responsibility rests with our government’s refusal to mandate side underride guards and stronger rear underride guards, as the European Union has done.

Trucks in the EU are required to have rear and side underride guards, dramatically reducing underride injury and fatality rates compared to in the US.

In the US, these guards are known as RUPS (Rear Underrun Protection Systems), FUPS (Front Underrun Protection Systems), and SUPS (Side Underrun Protection Systems). In this case, the missing guard was the SUPS, which was unfortunately to be expected: unlike RUPS, side guards aren’t required in the United States.

While such guards might not have prevented the collision (Brown was apparently unaware that he was being driven into a semi-trailer, and the Autopilot may have been just as blind to a white underride guard as it was to a white trailer), they would almost certainly have reduced the severity of the impact, perhaps to the point where the underride could have been avoided or mitigated to survivable levels.

The vehicle and both drivers were (also) at fault

Beyond the need for comprehensive side and rear underride guards, I completely agree that Brown (and Baressi) should have been paying greater attention while driving. An outspoken proponent of the Model S and of Tesla itself, Brown made a number of YouTube videos in which he discussed his appreciation of the Model S’ Autopilot feature, and even credited it with saving him from a crash shortly before his death.

Unfortunately, the system he trusted with his life was ultimately partly responsible for claiming it. This was a complete failure of the Autopilot system; any system incapable of seeing an effectively stationary, massive, life-threatening road obstacle is not a system that should be made widely available to the driving population. While the Autopilot documentation repeatedly states that the driver must remain ready to take control, the truth of human nature is that people have, do, and will continue to try to get away with as much as they believe they safely can. Tesla dared its drivers to take this risk while hiding behind legalese about the need to avoid it. This is unacceptable.

Tesla additionally intimated that, despite Autopilot’s failure here, Teslas with Autopilot activated were still safer than any other group of human-driven vehicles:

This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.

Unfortunately, this claim rings hollow, even if technically accurate. Autopilot is activated under extremely controlled conditions: by a self-selected community of generally educated, wealthy, and older drivers, and almost exclusively on divided highways, which have the lowest fatality rates per mile of any large driving environment in the US. Setting that against the general battlefield of drivers, vehicles, and roadways throughout the country (and the world) isn’t a fair comparison.

Additionally, due to the vanishingly small sample size involved, the claim is also at risk of being debunked and openly ridiculed at any moment. If only 1 Autopilot fatality has occurred in 130 million miles of activation, a second fatality tomorrow would drop the ratio to 1 per 65 million miles, a statistic that would make Autopilot roughly 45% more dangerous than not buying or driving a Tesla at all, but driving whatever you wanted on whatever road you wanted, on average. It would also make an Autopiloted Tesla virtually identical in risk to the average driver throughout the globe. A third Autopilot fatality would drop the ratio to 1 per 43 million miles, making an Autopiloted Tesla more than twice as dangerous as the average driver/vehicle combination in the US and nearly 40% more dangerous than the average global driver. This isn’t the kind of math Tesla wants to get itself into.
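For readers who want to check these figures, the arithmetic is simple enough to recompute. Here is a short Python sketch using only the mileage numbers from Tesla’s statement quoted above; the comparison method (a ratio of fatalities per mile) is my own framing of the calculation rather than anything Tesla published.

```python
# Mileage figures from Tesla's statement quoted above: ~130M Autopilot miles
# with one fatality, vs. one fatality per 94M miles in the US and one per
# roughly 60M miles worldwide.
AUTOPILOT_MILES = 130e6
US_MILES_PER_FATALITY = 94e6
WORLD_MILES_PER_FATALITY = 60e6

for fatalities in (1, 2, 3):
    ap_miles_per_fatality = AUTOPILOT_MILES / fatalities
    vs_us = US_MILES_PER_FATALITY / ap_miles_per_fatality     # >1.0 means worse than the US average
    vs_world = WORLD_MILES_PER_FATALITY / ap_miles_per_fatality
    print(f"{fatalities} fatality(ies): 1 per {ap_miles_per_fatality / 1e6:.0f}M miles; "
          f"{vs_us:.2f}x the US rate, {vs_world:.2f}x the global rate")

# Output: 1 -> 0.72x US / 0.46x global; 2 -> 1.45x US / 0.92x global;
#         3 -> 2.17x US / 1.38x global.
```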

I have no doubt that autonomous driving, in general, will be far, far safer than human-led driving once the technology is mature and widely available. However, what Tesla is pushing isn’t autonomous driving. To be fair, Tesla doesn’t claim that it is. But until the real thing is available, half measures can be far more dangerous than either the real thing or nothing at all. A little knowledge can get you into a lot of trouble. You can’t jump a chasm halfway. This is one of those areas where an “all or nothing” approach is much safer than a system that lulls people into a false sense of security.

We still need underride guards

As I’ve written before, and will continue to write about every underride crash until the US brings its underride guard regulations into the 21st century, this crash was preventable. Yet virtually no passenger vehicle exists that would have protected Brown at the speed at which he likely crashed, because the defining feature of trailer underride crashes is that the part that results in death, the trailer itself, sits above the crash-absorbing structures of virtually every vehicle on the road. This has been proven true yet again, regardless of whatever marketing Tesla will likely continue to push about the S (and X, and 3) being the safest vehicles in their class. It’s not about the vehicles; it’s about the guards.
