Tesla Model 3 inside an overturned truck. The car was on Autopilot at speed and spotted the truck too late to avoid a collision. The driver saw the truck in time but didn't react, putting his faith in Autopilot.

Why Tesla's Autopilot should NOT be called an autopilot

Not a rant, but a look at the bigger picture at hand

Baby steps towards autonomy

Let's start with the basics. The grandfather of the autopilot is cruise control, and the thought behind it was to maintain a constant speed without any input from the driver. The idea was born in the early 1900s at the English company Wilson-Pilcher and refined by the American company Peerless, whose patent included sustaining speed on uphill and downhill grades. Modern cruise control was invented in 1948 by the blind inventor and mechanical engineer Ralph Teetor. His idea was reportedly born out of the frustration of riding in a car driven by his lawyer, who kept speeding up and slowing down as he talked. A mechanism controlled by the driver provided resistance to further pressure on the accelerator pedal when the vehicle reached the desired speed. Teetor's idea of a dashboard speed selector with a mechanism connected to the driveshaft and a device able to push against the gas pedal was patented in 1950. He later added a speed-lock capability that would maintain the car's speed until the driver tapped the brake pedal or turned off the system.
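
For the technically curious, here's a minimal sketch in Python of the control logic Teetor's mechanical speed lock embodied. To be clear, the class, the gain value and the numbers are my own illustration, not anything taken from his actual patent.

```python
# A minimal sketch of classic cruise-control logic, loosely modelled on
# Teetor's mechanical speed lock. All names and the gain value are
# illustrative assumptions, not taken from the actual patent.

class CruiseControl:
    def __init__(self, gain=0.5):
        self.gain = gain          # how aggressively we correct speed error
        self.target_speed = None  # mph; None means the system is off

    def set_speed(self, speed_mph):
        """Driver picks a speed on the dashboard selector."""
        self.target_speed = speed_mph

    def brake_tapped(self):
        """Tapping the brake disengages the speed lock, as in Teetor's design."""
        self.target_speed = None

    def throttle_adjustment(self, current_speed_mph):
        """Return extra throttle (positive) or release (negative).

        The mechanical version pushed back against the accelerator pedal;
        here we just compute a proportional correction.
        """
        if self.target_speed is None:
            return 0.0
        return self.gain * (self.target_speed - current_speed_mph)


cc = CruiseControl()
cc.set_speed(60)
print(cc.throttle_adjustment(55))  # below target -> add throttle (+2.5)
cc.brake_tapped()
print(cc.throttle_adjustment(55))  # system off -> no input (0.0)
```

Note what's missing: the system has no idea what's in front of the car. It only knows a number on a dial, which is exactly why everything that follows had to be invented.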

Tesla Model S on Autopilot collided with a fire truck that was traveling towards an accident scene.

How did we get here

With the major technological advances of the last two decades, cruise control started to evolve rapidly. With added cameras, radar and LiDAR sensors, the level of autonomy rose to unprecedented levels. First came the adaptive version, which automatically adjusts the vehicle's speed to maintain a safe distance from vehicles ahead. Then lane-keeping assist improved the picture even more, adding a further degree of autonomy by applying steering input to keep the car between the road lines. From there on, things started to go beyond the act of driving itself: automatic lane changing at the press of the indicator, automatic steering through corners, following both the road lines and the navigation data. Sounds great, but where's the driver in this picture?
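
To make that "safe distance" idea concrete, here's a toy sketch of how an adaptive cruise controller might pick its target speed. The two-second gap rule and every name and number here are my own assumptions for illustration; real systems are far more sophisticated.

```python
# A toy sketch of adaptive cruise control: hold the driver's set speed
# unless a vehicle ahead forces a larger gap. The two-second rule and
# all parameter names are illustrative assumptions, not any real system.

def acc_target_speed(set_speed, lead_distance_m, lead_speed, ego_speed,
                     time_gap_s=2.0):
    """Pick a target speed (m/s) that keeps roughly time_gap_s behind
    the vehicle ahead, capped at the driver's set speed."""
    if lead_distance_m is None:          # sensors see no one ahead
        return set_speed
    desired_gap = ego_speed * time_gap_s
    if lead_distance_m < desired_gap:    # too close: slow toward lead speed
        return min(set_speed, lead_speed)
    return set_speed


# Cruising at 30 m/s (~67 mph), set speed 33 m/s, slower car 50 m ahead:
print(acc_target_speed(33.0, 50.0, 25.0, 30.0))  # 25.0 -> back off
print(acc_target_speed(33.0, None, 0.0, 30.0))   # 33.0 -> open road
```

Even in this caricature, the whole scheme hinges on the sensors actually seeing the obstacle, which is precisely where the crashes in the photos above happened.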

The camper van before the accident

Back seat driver

We've all heard the story about the woman whose camper van veered off the road at 40 mph and crashed into a tree near Orford (in Suffolk, UK, not the one in Canada) while she was making tea in the back. The former librarian claimed she was confused by the word "Autopilot" on the stalk of the second-hand vehicle she'd put her life savings into.

"I thought it was like an autopilot that you get on airplanes."

"I turned it on at what I thought was a sensible 40 mph, then stepped away from the driver's seat into the back of the motorhome to make a cup of tea."

And this is where the issue lies. The word "autopilot" is wrongly used in cars these days. The very definition of an autopilot excludes the driver's input. And no manufacturer is more avid than Tesla in promoting autonomous vehicle capabilities. But how autonomous is the current batch of cars making those claims?

What's left of the Tesla after the accident in Spring, Texas. Credit: Matt Dougherty/KHOU 11

Too much faith, too little sense

In recent years internet users have shared a massive amount of content promoting Tesla's Autopilot functions, including some seriously reckless and deeply disturbing images: moving out of the driver's seat while on the highway, sleeping in the back seat on a motorway, having intercourse in the passenger seat on the autobahn, just to mention a few. But this is just a negligibly small part of the craziness that's happening on a daily basis. People are getting hurt and killed! A recent incident in Spring, Texas (near Houston) caught my attention. A Tesla with no one in the driver's seat crashed into a tree and burst into flames. Two people were killed, and the fire was extinguished only after four hours and some 30,000 gallons of water.

"Investigators are 100-percent certain that no one was in the driver seat driving that vehicle at the time of impact"

Preliminary reports suggest the car was traveling at a high rate of speed and failed to make a turn, then drove off the road into a tree. One of the men killed was in the front passenger seat of the car, the other was in the back seat, according to KHOU 11. And this accident is not a one-off: there are at least 23 Autopilot-related crashes under investigation by the National Highway Traffic Safety Administration at this very moment. And those are just the ones the authorities know of. So what's the deal with Autopilot here? Looking cool while crashing and potentially killing yourself, your passengers or even some innocent bystanders?

Tesla Model 3 on Autopilot struck a police cruiser on a highway while trying to overtake autonomously. The driver was using only the indicator...

Whose fault is it then

To answer that question, we should really start looking at the bigger picture, because pointing fingers solves nothing! In recent times Tesla has been under fire precisely for its Autopilot and the driver behaviour the tech promotes. So much so that Elon Musk finally budged a bit, and Tesla cautioned its customers that Autopilot is not an autonomous driving system and still requires constant attention to the road while in use. But that's still too little, too late. The company's cars only check that attention with a sensor that measures torque in the steering wheel, though, leaving room for misuse, something the National Transportation Safety Board admonished Tesla for last year. In the past, Tesla's CEO has rejected calls from his own engineers to add better safety monitoring when a vehicle is on Autopilot, such as eye-tracking cameras or additional sensors on the steering wheel, dismissing such tech as ineffective.
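
To see why torque-only monitoring leaves that room for misuse, consider a toy sketch of such a check. This is emphatically not Tesla's actual code; the threshold and the nag interval are invented for illustration. The point is simply that the sensor cannot tell a hand from a weight strapped to the rim.

```python
# A sketch of why torque-only attention monitoring is easy to fool.
# NOT Tesla's actual implementation; the threshold and timing are
# illustrative assumptions based on the article's description.

def hands_on_wheel(torque_nm, threshold_nm=0.2):
    """The system only sees torque on the steering column. Anything that
    twists the wheel slightly, a hand or a weight hung on the rim, passes."""
    return abs(torque_nm) > threshold_nm

def should_nag(seconds_since_torque, nag_interval_s=30):
    """Escalate a visual/audible warning if no torque is felt for a while."""
    return seconds_since_torque > nag_interval_s

# A driver resting one hand on the wheel and a cheat weight strapped to
# the rim look identical to this check, which is the gap the NTSB flagged.
print(hands_on_wheel(0.3))  # True: could be a hand, could be a weight
print(should_nag(45))       # True: time to warn... unless torque is faked
```

An eye-tracking camera would close exactly this loophole, which is why the engineers asked for one.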

Maybe we should take a look at ourselves then. More specifically, at the trust we put in a technology that's just not there yet. Knowingly putting our lives at risk for an Instagram photo or a TikTok/YouTube video is irresponsible and dangerous, to say the least. As you can see above, it can be deadly! And we're not putting only our own lives on the line here, but also those of our friends and family as passengers, and even of people on the sidewalks, minding their own business. This is no Domino's delivery robot, trundling along at 3 mph, carrying a pizza to your home. We're talking about a two-tonne metal box with combustible batteries, capable of traveling at speeds that could send it through the wall of a house! We bear the responsibility for our own actions and we suffer the consequences! It's our choice to make, not the car's. The word "autopilot" may be misleading, but it's our job as drivers to get informed on how to use it properly. And any misuse of it means we've made the wrong choice.

So please, realise your personal responsibility today and don't put lives at risk for the sake of social network likes.

Do as I say, not as I do

I'm also guilty of misusing this tech. Not in a Tesla, but in a Mercedes-Benz GLE, about a year or so ago. To be clear, Mercedes does not claim complete autonomous driving and does not use the word "autopilot", but rather presents its interpretation as a safety feature. And it does feature an eye-tracking camera, but it's only used to measure driver fatigue.

With all adaptive driving aids on, I was driving my friend's car after we'd been out and about. She'd had a couple of pints, with the agreement that I would not drink, so I could drive her home in her own car and get a taxi home afterwards. It was getting dark, we were on our way, when a heated discussion started. Trusting the German tech, I took my hands off the steering wheel and my eyes off the road, and started passionately using hand gestures to explain my point in the argument. She warned me not to do that, but I completely ignored her. Then the GLE started beeping at me and the screen started flashing rapidly, but I ignored that as well, knowing that we were on a straight road with a very gentle kink ahead and that the systems should be able to deal with it. Our speed was around 80 mph at the time. As she was starting to freak out a little, because the kink was already visible up ahead and fast approaching, the car cut all throttle, no matter that my foot was still on the pedal. The hazard lights went on, we started braking without me pressing the brakes, and the car pulled to the side, then came to a complete stop.

I realised then that I'd made a mistake, because we'd stopped within the last metres where there were markings on the side of the road. A brief look ahead showed the markings missing completely, the barriers torn, and a river on the other side! Someone had gone into that river relatively recently and the barriers hadn't been repaired. A couple more metres and the car would've put us both in the river, not knowing where the side of the road was.

So do as I say and not as I do! Get a car, drive it, don't leave your life in the hands of computers. They are there to help you out in a difficult situation, not to chauffeur you around safely. They just can't do that! The tech is not there yet, and there is no reason to put your complete faith in it. Where faith begins, reason ends! Your safety is your responsibility!

Please, be responsible and drive safely! Emphasis on DRIVE!

Comments (3)

  • Autocrasher.

      23 days ago

    Um, "I thought it was like an autopilot that you get on airplanes." Well, it is. Pilots don't go in the back to make a cup of tea. The issue here is in folks not caring to understand terms or read up on limitations.

    For those of us who have flown... you know... airplanes, it (autopilot) is an apt name.

    In both cases it's a pilot/driver assistance feature that automates some portion of routine tasks, and it can be misused. In both cases, familiarize yourself with what it does well and what it doesn't. The pilot has to monitor the autopilot in a plane, and the driver has to monitor Autopilot in a Tesla.

    FWIW, pilot and Tesla owner here.

      23 days ago
    • I know, there's a need for two pilots in the cockpit: one flying and one monitoring. But this 62-year-old lady had a different understanding of the term. I agree with you; it's not as simple as the lady put it in front of the court while pleading...
