

Autopilot Click-Bait Drama

2016-07-24 22:52:23 - wk057

Plenty of people have written about Tesla’s Autopilot over the last month due to the publicity around a fatal accident where the feature was in use.  I’ve got a few minutes, so I guess it’s time I jump on the bandwagon… hopefully with a reasonable viewpoint that’s not really represented in the general media aside from a few Tesla enthusiast sites.


First a little background.  I personally own a 2014 Tesla Model S P85D that has the autopilot features.  I also recently retrofitted that same hardware onto my wife’s 2014 Tesla Model S P85, which predates the release of the hardware by about 11 months.  Overall, since the feature was finally made available on October 15, 2015, about a year after buying my P85D, I’ve used the autopilot package’s autosteer feature personally for nearly 15,000 miles of driving.  It's probably also worth noting that I've been quite critical of Tesla in the past on some topics, and I don't think anyone would consider me biased towards Tesla's side of things.


I've not stopped using autopilot since day one.  After just a short time using it, though, its shortcomings were pretty clear.  It certainly is not perfect for hands-off use in all conditions, and Tesla specifically displays a message to keep your hands on the wheel at all times every time it’s activated.  I won’t lie, I almost never do.  I’m pretty sure most people who use autopilot often also don't keep their hands on the wheel at all times unless the system nags them to do so, which it does periodically if it doesn’t detect any torque on the wheel.  I keep my hands close and my eyes on the road, however.


I’m pretty comfortable with the system and familiar with its behavior at this point.  It performs better than I originally expected given the limited sensor inputs available to it.  I can pretty easily predict how well the system will behave on different stretches of road and handle things accordingly.  For example, there is a spot near my home on a 2-lane undivided road where the system works pretty much perfectly every time.  However, there is another spot where it consistently freaks out when rounding a moderate hill in the road.  Over 90% of the time it will try to steer off to the right for some unknown reason, and I have to take over.  It’s odd, because the car is well aware of the lane markings and even shows on the instrument cluster that it is exiting its lane.  I have no real explanation for why it does this.  For contrast, there’s a long stretch of a local divided highway where the system does so well that if I were to doze off for a moment I’d probably be perfectly fine.  I’m definitely not suggesting I would ever do so; I'm just using the hypothetical for emphasis.  For more confessions: I admittedly do check some things on my phone occasionally on this stretch, but only after I've looked ahead and determined that all will be well for the next few seconds, and even then I never fully take my attention away from the road.


Which prompts some reiterating: Autopilot is not perfect.  More importantly, it is NOT fully autonomous driving.  When autopilot is engaged in my car, who is driving the car?  I’ll give you a hint: he's flesh and blood with a valid driver’s license.  The car is not driving itself.  It is helping me, the actual driver, drive with less fatigue.  You know, the whole driver assistance section of the car's settings where autosteer had to be activated?  And you know what?  It does its job impeccably and successfully reduces fatigue.  I’ve driven long stretches by myself for many years now, sometimes 500-800 miles at a time with infrequent stops.  At the end of the day I would be pretty drained, my arms and legs would be stiff and sore, and I would feel like I needed a day to recover.  Doing the same trips with autopilot helping out is simply a night-and-day difference.  Even though I’m still driving the car and still paying attention to the road like I should be, I don’t end up feeling the need to just sleep for a day.


I could detail the dozens of times autopilot has probably saved my ass from various incidents over the last 9 months, but that’s not what people seem to want to hear lately.  Those stories rarely make the news and are rarely even reported at all.  The media is making autopilot out to be the most dangerous thing ever to hit the road, when all the facts point to the contrary.  Why?  Simple.  Because if you throw “Tesla” and “accident” or “fatal” into the same headline, it draws attention.  Most people are completely unaware of how autopilot actually works.  So when the misinformed, advertising-revenue-starved media gets the chance to capitalize on some sensational click-bait headlines, they tend to do so.  In this case, like so many others these days, they simply don’t know what they’re talking about.  I doubt most of the people writing and supporting these overly dramatic takes on the recent incident have ever even been in an autopilot-enabled Tesla, let alone driven one for hundreds or thousands of miles.  So they seem to fill in the gaps in the information with made-up nonsense.  It’s pretty sad.


So, let’s talk about the accident everyone’s in a tizzy over.  I’m sure you all know the details by now.  If not, just Google “fatal Tesla autopilot” and I’m sure you’ll find 100+ stories about it.


Before I go on, my condolences to the family and friends of Joshua Brown.  I had the pleasure of exchanging a few emails with him earlier in the year relating to one of my videos, and he definitely seemed like a man I should have gotten to know.  That said, I apologize in advance for having to write what I have to write below.


I won’t claim to be an expert on this sort of thing, but this accident seems like it should have been easily avoided.  I looked at the scene of the incident via Google Earth and Street View.  The crash took place shortly after the Model S rounded a hill in the highway, which would have obstructed the driver’s view of the intersection where the truck crossed until the car was rounding the hill.  However, looking at the distances involved, even in the worst-case scenario with cruise control engaged at the maximum speed (90 MPH), the driver would have had something like a full five seconds to react from the time the car rounded the hill until the situation became unavoidable.  That might not sound like much, but five seconds is quite a long time in a situation like this.  I’ve been in situations in vehicles where split-second reactions were needed to avoid disaster; five seconds is an eternity to react to such danger.  Why didn’t the driver simply notice the danger and react?
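

A quick sanity check on that arithmetic: at 90 MPH the car covers about 132 feet every second, so five seconds corresponds to roughly 660 feet of sightline.  The sightline figure below is my own assumed number for illustration, not a measured or official one.

# Back-of-the-envelope reaction-time check (Python).
# The ~660 ft sightline is an assumed figure for illustration only.

MPH_TO_FPS = 5280 / 3600  # 1 mph = ~1.467 ft/s

def reaction_window(sightline_ft, speed_mph):
    """Seconds available from first sight of the hazard to impact."""
    return sightline_ft / (speed_mph * MPH_TO_FPS)

# Worst case: cruise control pinned at the 90 MPH maximum.
print(f"{reaction_window(660, 90):.1f} s")  # -> 5.0 s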


You may have noticed that I haven't even mentioned autopilot in the previous paragraph.  That's because the fact that autopilot was in use at the time of this accident is, in my opinion, entirely irrelevant.  Everyone seems so quick to just blame everything on Tesla and the autopilot feature.  But really, and I’m sorry to say it, this accident is nothing special.  In the time it’s taken you to read this post, several people in the United States alone have died in car accidents.  And it’s pretty likely that one or more of those deaths involved a driver who simply was not paying attention when they should have been.  The accident involving Mr. Brown is no different.  But you won’t even hear about any of the other nearly 100 fatal accidents that happened in the USA that day, just because they didn’t involve the words “Tesla” and “autopilot.”


“But wk, the car should have prevented the accident!”  No, it shouldn’t have.  If it had, that would have been great, and we likely wouldn’t even know about it.  But the car has no obligation whatsoever to prevent accidents.  It’s pretty advanced, likely the most advanced system of its kind, but it simply doesn’t replace a driver yet, nor is it intended to.  It can help the driver in some cases, yes, but not even a fully alert human driver can prevent every accident.  For whatever reason, people expect the Model S to do impossible things, but it simply can’t, and it was never meant to.


“But it was a huge truck!  It should have been easy to prevent.”  Well, for this I’m going to have to go into some theory on how the sensors on the car actually work in situations like this.  Looking forward, the car has a camera, a radar, and a few ultrasonic sensors.  The ultrasonic sensors are pretty useless in this situation (~16 ft max range), so they’re out.  The camera was apparently suffering from a contrast issue (reportedly the white trailer against a brightly lit sky), so it wasn’t able to do object recognition.  And then there’s the radar, which was probably getting returns off the truck the entire time.


“Wait, you just said the radar could probably see the truck!”  Sure, but the radar has no way of knowing on its own what it is seeing, just that there is an object ahead that the car is closing on at some speed.  A truck crossing perpendicular to the car has essentially no motion along the radar’s line of sight, so its return looks exactly like that of a stationary object.
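

To illustrate that with a toy calculation (my own made-up numbers, not anything from Tesla): a Doppler radar directly measures only the component of relative velocity along its line of sight.  Run the numbers, and a crossing truck dead ahead is indistinguishable from a fixed roadside sign at the same spot.

import math

# Toy radial-speed calculation.  x points ahead of the car, y to the left.
# All positions and speeds are hypothetical numbers for illustration.

def radial_speed(rel_pos, rel_vel):
    """Component of relative velocity along the line of sight, in m/s.
    Positive means closing."""
    dist = math.hypot(rel_pos[0], rel_pos[1])
    return -(rel_pos[0] * rel_vel[0] + rel_pos[1] * rel_vel[1]) / dist

car_speed = 33.0  # ~74 MPH, in m/s

# Truck 100 m dead ahead, crossing the road at 15 m/s.
# Relative velocity = (truck velocity) - (car velocity).
truck = radial_speed((100.0, 0.0), (-car_speed, 15.0))

# Fixed roadside sign 100 m dead ahead, not moving at all.
sign = radial_speed((100.0, 0.0), (-car_speed, 0.0))

print(truck, sign)  # 33.0 33.0 -- identical closing speeds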


“So you’re saying the car’s radar probably saw a stationary object that it was closing on rapidly, and it didn’t stop?  Blame Tesla!  Blame Autopilot!”  Here’s the issue with looking at the scenario that way: the radar alone simply cannot be relied on to make decisions about the vehicle’s behavior.  Let me give you an example as to why.




(Google Street View photo from a local highway: https://goo.gl/maps/mgFgEFSn1zG2)


Now let’s pretend you’re the forward-facing radar in the Model S.  What’s directly in front of the vehicle?  A big stationary sign.  Now, obviously you’re not going to hit this sign, because the car will take the curve in the road and the sign is never actually in the vehicle’s path.  But at this moment the radar is getting returns off this sign as a stationary object directly in front of the vehicle, one that the car is approaching at full speed.  Yet the car doesn’t slam on the brakes when it detects this.  Why should it?  The camera sees that this is just a sign, the road curves, and all is well in the world.


You know what else the radar picks up?  Changes in the elevation of the road itself.  For example, when you’re coming down a hill and the road levels out, the radar picks up the road ahead as a stationary object directly in the path of the vehicle, one it is rapidly closing on.  The same goes for overpasses, vehicles on the shoulder, cars in other lanes while rounding curves, and so on.  The radar simply can’t sanely decide what the car should do from its data alone.  It needs the camera’s cooperation to make that determination.
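

To make that concrete, here is a minimal sketch of the gating idea.  This is my own simplification with invented thresholds, not Tesla's actual logic; the point is just that the sign on the curve, an overpass, and a crossing truck can all produce the same radar return, and only the camera separates them.

from dataclasses import dataclass

# Toy model of radar/camera fusion -- a simplification for illustration,
# not Tesla's implementation.  Thresholds are invented.

@dataclass
class RadarReturn:
    range_m: float
    closing_mps: float      # radial speed; positive = closing

@dataclass
class CameraView:
    object_in_path: bool    # does vision confirm an obstacle in our lane?

def radar_only_brake(r):
    # Naive rule: brake for any fast-closing return.  This would slam
    # the brakes for signs, overpasses, and hill crests constantly.
    return r.closing_mps > 25 and r.range_m < 120

def fused_brake(r, cam):
    # Gated rule: radar raises the alarm, camera must confirm.
    return radar_only_brake(r) and cam.object_in_path

# A sign on a curve, an overpass, and a crossing truck can all look
# like this to the radar at highway speed:
ping = RadarReturn(range_m=100.0, closing_mps=33.0)

print(radar_only_brake(ping))                               # True: phantom braking
print(fused_brake(ping, CameraView(object_in_path=False)))  # False: sign/overpass
print(fused_brake(ping, CameraView(object_in_path=True)))   # True: real obstacle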


So, personally, I wouldn't have expected autopilot to react any differently than it did.  The camera is not the best at quickly distinguishing stationary objects, but this is generally fine for its intended usage.  There aren't many stationary obstacles on the interstate highways where the system shines and where it's designed to be used.


I suppose in a lot of cases it’s inevitable that people will continue to try to blame Tesla’s autopilot for their own failures as drivers, and likely get a lot of attention in the process.  I guess it’s just normal for such people to try to assign blame rather than take responsibility.  That doesn't make their claims any more valid.


The best and most humorous example is the driver who keeps trying to publicly blame Tesla and autopilot for his Model X running off the road into twelve wooden posts… TWELVE POSTS.  I mean, really: even if you weren’t paying attention and somehow managed to hit one or two, why would you let the car continue to plow through them instead of getting back in your lane?  It just seems like stupidity to me, with the driver looking to get out of taking responsibility.


In any case, I really hope the drama around this dies down soon.  Probably not, since drama sells.  In the meantime, we should just blame everything on autopilot! :D


Me personally, I’d gladly sign a waiver releasing Tesla from any responsibility or liability for any accident that happens to occur while my own cars have autopilot engaged, and maybe this should become part of the purchase agreement or something similar in the future.  It’s a great system that has already saved lives and is going to save many more.  So, Tesla, I'm very glad you're not giving in to the media.  Stick to your guns, keep calling out these drivers who try to claim nonsense, and keep on rolling.


Sorry for the rather lengthy post.  Looks like I went over quite a bit!


-wk


P.S. - You should probably just buy a Tesla if you don't already have one.  If you do, please use my referral link in the site footer to save a few bucks.  I don't really put advertisements or anything like that on my sites and videos, but I'll make an exception for the Tesla referral program, since it actually benefits the person using it, unlike 99.9% of the crappy ads you'll find all over the internet.