
Tesla Recalls 363,000 Vehicles As Self-Driving Issues Persist

and the state of play for autonomous cars

Both the concept and the application of self-driving technology are touchy subjects to begin with. Things take on a different tone entirely when the topic is hotly debated among driving enthusiasts like yourself, on a website called supercars.net, like this one.

Well, perhaps "debate" is the wrong word, as the idea of a car doing all the driving for you goes against the basic tenets of why we love cars. There is little purpose in entertaining any back-and-forth over such heretical, sacrilegious ideas.

But for the general commuter, which is to say the vast majority of people on the road, the automation of personal transportation has been met with interest, if not outright intrigue, and it has the potential to transform the broader global economic landscape as we know it.

Living With Auto-Automation

This means that self-driving automobiles are something purists and traditionalists will need to be ready to understand, manage, and cope with, at least to a moderate degree, at some point in the not-too-distant future.

Or maybe not.

If the recent slew of events is anything to go by, proliferation is still a long way off, especially where the widespread use of fully autonomous vehicles is concerned. Tesla has just recalled nearly 363,000 of its production cars equipped with what the company has dubbed "Full Self-Driving Capability".

This comes amid plenty of fresh controversy as well: just last week, a Tesla operating in self-driving mode crashed head-on into a parked fire truck in Walnut Creek, California. The driver was pronounced dead at the scene, and the passenger was left in critical condition.

At its worst, the current level of sophistication in autonomous driving software has cost people their lives. Sure, such incidents are rare, but not so rare that the average person can call them unheard of, and that in itself is alarming.

Then there are, of course, the well-documented "sleeping drivers" who have come to caricature the current state of self-driving technology in a way that is as hilarious as it is frightening.

But What Does Self-Driving Really Mean in Today’s Context?

Call it false advertising, or what have you, but there is still no such thing as a 100% self-driving vehicle legally operating on public roads today. In real-world application, drivers (yes, drivers) are still essential to the safe operation of the vehicle, even when the self-driving feature is in use.

In that sense, today's autonomous car tech really isn't all that different from the autopilot software on a commercial airliner, where an actual pilot and first officer remain fundamentally responsible for, and intimately involved in, getting everyone to their destination in one piece.

Suffice it to say, it's not so much that fully self-driving technology sucks per se; it's that it isn't "fully" anything at all. This is partial autonomy at best, what the SAE scale would classify as Level 2 driver assistance, and the terminology Tesla uses is misleading.

This folly (partial pun intended) has led prospective buyers, and sadly many current owners, to be under the impression that their Teslas double as full-time personal chauffeurs.

Rather than revising the wording or nomenclature of its products to provide clarity, Tesla chalks the current state of affairs up to end users not understanding their car's "Full Self-Driving Capability" and where its limits lie. Basically, they're saying, "you should've read the fine print."

As long as self-driving features ship with such a bold "use at your own risk" message from the very entities that created them, should they not be kept off public roads and away from any setting where lives can be lost? Surely, there are better ways to test the software.

Could Things Look Different in the Long-Term?

It wouldn't be naïve to expect the technology to keep improving, though I think this will be a matter of artificial intelligence more broadly than of self-driving software in isolation.

After all, today's autonomous driving systems are limited by their programming, so when the software encounters something "off-script", tragic events like the one mentioned earlier become more likely.
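To make the "off-script" problem concrete, here is a deliberately simplified Python sketch. It is not Tesla's actual software: the scenario names, confidence threshold, and function names are all invented for illustration. It only shows the core limitation: behavior is defined for the situations the software was built to recognize, and everything else must trigger a handover to the human.

```python
# Toy illustration of the "off-script" problem in a partially
# autonomous driving stack. Purely hypothetical: the scenarios,
# threshold, and handler names are invented for this example.

HANDLED_SCENARIOS = {
    "lane_keeping",
    "adaptive_cruise",
    "highway_lane_change",
}

CONFIDENCE_THRESHOLD = 0.90  # below this, the system should disengage


def plan_next_action(scenario: str, confidence: float) -> str:
    """Return a driving action, or hand control back to the human.

    A real system is vastly more complex, but the limitation is the
    same: the software only has defined behavior for situations it
    was built (or trained) to recognize.
    """
    if scenario not in HANDLED_SCENARIOS or confidence < CONFIDENCE_THRESHOLD:
        # The "off-script" branch: something like a fire truck parked
        # across a lane may be classified poorly, or not at all. The
        # only safe move is an immediate handover to the driver.
        return "ALERT_DRIVER_AND_DISENGAGE"
    return f"execute:{scenario}"


if __name__ == "__main__":
    print(plan_next_action("highway_lane_change", 0.97))
    # -> execute:highway_lane_change
    print(plan_next_action("stationary_vehicle_in_lane", 0.55))
    # -> ALERT_DRIVER_AND_DISENGAGE
```

The uncomfortable part is the gap in that last branch: the system bails out in an instant, while a human who has been told the car is "fully self-driving" may not be ready to take over in time.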

For all of this to work as intended, it seems we need something that can, at the very least, "self-learn" to a level many would categorize as "sentient". The best way to fill the role of a regular human driver is to bring in a practically human one instead, right? Now that's a discussion to be had over a beer, or seven.

Oh, and of course, someone is already on the case to make this a reality.

*shudders*