We live in an age of technological progress the likes of which no previous period of human invention has seen: reusable orbital rockets, breakthroughs in neurosurgery, and deep-scan imaging uncovering lost civilizations. But the promise of the self-driving car has yet to be realized, and that’s not for lack of trying.
Self-driving cars, also known as autonomous cars, have been in the zeitgeist since the late 1950s. Automakers saw the space age’s marketing and technological advancements capture the world’s imagination and started hyping “Cars that Drive Themselves!” Unfortunately, no one, and we mean no one, has been able to crack the code, but they’re getting closer.
In recent years, thanks to newly miniaturized sensors such as radar, sonar, and lidar, along with advances in computing and artificial intelligence, self-driving cars have reentered the public consciousness. Add fully electric vehicles with onboard systems capable of crunching all that sensor data, and you can see why everyone’s hyped.
Nearly every major vehicle manufacturer has poured in money, retooled manufacturing facilities, devoted entire engineering departments, and purchased self-driving startups in an attempt to be the first to offer a functional self-driving car. Tesla even started selling a “Full Self-Driving” package in 2017 for its lineup, though the company hasn’t shown a functional prototype as of this writing.
Due to the glut of promises, media hype, and Elon Musk’s Twitter, The Drive’s crack info team thought it necessary to build a comprehensive guide to the future’s most-hyped tech. We determine whether or not you can actually buy a self-driving car, explain the levels of autonomy, and answer all of your questions about systems such as ADAS. So let’s dive in!
Can I Buy a Self-Driving Car?
To borrow a phrase from Lana Kane from FX’s most-excellent Archer, “Nooooooooooooope.”
But Really, Can I Buy a Self-Driving Car?
If that was unclear, no, you absolutely cannot purchase a self-driving/autonomous vehicle right now. They don’t exist, despite the marketing, billion-dollar valuations, and thousands of articles detailing our autonomous future being “just around the corner,” a phrase that pops up all too frequently in tech and transportation coverage.
Nor will you be able to purchase a self-driving car in the foreseeable future, as no company, not even Tesla, has been able to bring a self-driving/autonomous car to market. Even the company’s recent Full Self-Driving Beta release states in its terms and conditions, “Full self-driving is in early limited access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent.” Emphasis ours.
We’re sorry to pop your Jetsons bubble.
What Is a Self-Driving Car?
A car that requires no human input to drive other than selecting a destination.
What Are Advanced Driver Assistance Systems (ADAS)?
Advanced Driver Assistance Systems, commonly known as ADAS, are the safety, accident-mitigation (automatic emergency braking, lane-departure assist), and adaptive cruise and steering systems found on most new cars. These systems assist the driver in the car’s regular operation. They do not drive the car for you. Again, they do not drive the car for you; they merely reduce the fatigue some feel behind the wheel.
Tesla’s Autopilot, Nissan’s ProPilot Assist, and GM’s Super Cruise use ADAS to function and are consequently not autonomous, as the driver still needs to maintain vigilance and be ready to take over operating the car when the systems can no longer operate.
What Is the Purpose of a Self-Driving Car?
Marketing says it’s to make for a more peaceful ride to work. Reality says it’s likely great for companies and companies alone: they’ll be able to eke out more work from you for the same money because you can start working even earlier! Don’t you want that? Don’t you?!
A significant portion of the population, the media, and the companies developing the technology also tout that autonomous cars will be safer than human-controlled vehicles because they’re less likely to get into accidents. That, however, simply cannot be proven yet due to a lack of real-world data from large numbers of autonomous vehicles on the road.
What Makes a Self-Driving Car Work?
Well, nothing at the moment because there are no self-driving cars available. There are prototypes from Uber, Waymo, Tesla, and others that have been running partially without human intervention, though there’s still a human involved who can kill the car if necessary.
Enabling those prototypes to operate, as well as all the ADAS found on almost every new car, are two primary technologies: hardware and software. Without either, these semi-autonomous and autonomous prototypes would not be able to operate. Here’s how each breaks down.
Software
The software involved in automating a vehicle’s functions connects a suite of sensors to the car’s brain, telling it when to brake, accelerate, and, in some applications, steer. The design, coding, and functionality of this software vary from manufacturer to manufacturer. This isn’t like picking between a Microsoft or Apple operating system.
Most engineers, as well as companies, also assume that a fully autonomous vehicle will require some basic form of artificial intelligence (AI) to function. It won’t be Skynet (let’s hope), but it will have to be advanced enough to navigate complex terrain, weather, and traffic patterns; mitigate the anarchy caused by human drivers; and safely get you from Point A to Point B without the exterior of your vehicle resembling a bumper car.
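If that still sounds abstract, here’s what the basic sense-plan-act pattern looks like in its most skeletal form. This is a hypothetical Python toy, not any manufacturer’s actual code, and every number and name in it is made up:

```python
# A toy "sense -> plan -> act" cycle, hypothetical and wildly simplified.
# Real driving stacks fuse dozens of sensors and run trained models; this
# only illustrates the software pattern described above.

def plan(radar_gap_m: float, lane_offset_m: float) -> dict:
    """Turn fused sensor readings into brake/steer commands (toy logic)."""
    command = {"brake": 0.0, "steer": 0.0}
    if radar_gap_m < 30.0:  # car ahead is closer than our safe gap: brake
        command["brake"] = min(1.0, (30.0 - radar_gap_m) / 30.0)
    command["steer"] = -0.1 * lane_offset_m  # nudge back toward lane center
    return command

# One simulated tick: 18 m to the car ahead, drifting 0.5 m right of center.
print(plan(radar_gap_m=18.0, lane_offset_m=0.5))
# {'brake': 0.4, 'steer': -0.05}
```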
Hardware
The hardware involved in automated and autonomous driving is the physical sensor array described above, which can feature radar, sonar, or lidar, as well as the car’s physical onboard computers. Each company uses different hardware, with Tesla famously coming out against the use of lidar, though most use a combination of the sensors above.
Those computers collect the relayed data and make sense of it all so the car can function without human intervention.
Terms You Should Know Related to Self-Driving Cars
In the age of Twitter, Facebook, and the internet giving anyone and everyone a platform, a little knowledge (or lack thereof) can be used to proclaim expertise even where there is none.
To better insulate yourself against those reckless few who spout nonsense, here are the terms you should know and understand whenever you’re pulled into a conversation about self-driving cars. (This doesn’t make you an expert, just more informed—Ed.)
Autonomy
Autonomy describes a closed-loop system that doesn’t require interference or input from an outside system, e.g., an autonomous car doesn’t need input from a human driver.
Automation
Automation is when a single system can, with a selected prompt or scheduled trigger, become a closed-loop system, e.g., engaging radar-guided cruise control.
Partial Automation
A semi-closed-loop system in which a prompt is needed to execute an initial action and additional prompts may, though not necessarily, be required, e.g., lane-keep assist.
Artificial Intelligence (AI)
A likewise hyped technology that will either save humanity or doom it for all eternity. AI also features heavily in autonomous vehicle literature, as an autonomous vehicle’s computer will need to compute millions of variables from a plethora of sensors and interpret and judge what’s the best course of action. In short, the Terminator. Don’t worry, the Matrix isn’t real…yet.
Car-to-Car/V2V Communication
Car-to-car (C2C) or vehicle-to-vehicle (V2V) communication is when one car can communicate with another. Many proponents and engineers working in the autonomous vehicle industry believe car-to-car communication will be necessary to make autonomy function as predicted, with each car talking to the car behind it, warning it of road hazards, weather, or accidents ahead.
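As an illustration only, a V2V hazard warning could be as simple as a timestamped position plus a hazard label. The sketch below is hypothetical; real V2V standards, such as the basic safety messages defined for DSRC and C-V2X, carry far more data:

```python
# A hypothetical V2V hazard broadcast, sketched as a simple JSON message.
# Real standards carry speed, heading, brake status, and much more.
import json
import time

def hazard_message(lat: float, lon: float, hazard: str) -> str:
    """Package a road-hazard warning for the cars behind (illustrative)."""
    return json.dumps({
        "timestamp": time.time(),              # when the hazard was seen
        "position": {"lat": lat, "lon": lon},  # where it was seen
        "hazard": hazard,                      # e.g., "ice", "crash", "debris"
    })

# Lead car spots a crash and warns followers.
print(hazard_message(34.0522, -118.2437, "crash"))
```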
Driver Monitoring Systems
Driver monitoring systems (DMS) have been around for a while now. Mercedes-Benz had one of the first systems that monitored whether or not the driver was tired and alerted them that they likely needed to take a break, stretch, or pull over and rest.
The influx of ADAS and automated systems has made DMS absolutely necessary, as people have shown wanton disregard for reality and repeatedly abused these systems to sleep, eat, text, watch movies, and even have carnal relations with another consenting adult. Every car fitted with ADAS should have some form of DMS.
Radar-Guided Cruise Control
A recent advancement in car technology, radar-guided cruise control is one of life’s newest little pleasures. Using a sensor suite located either in the car’s bumper or in a housing near the rearview mirror at the top of the windshield, it detects cars ahead and maintains your distance from them, braking and accelerating as necessary.
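Under the hood, the simplest version of this is a feedback controller: measure the gap to the car ahead and adjust speed in proportion to the error. The snippet below is a bare-bones, hypothetical sketch with made-up numbers; production systems use far more sophisticated control and sensor fusion:

```python
# A toy proportional controller for following distance. The gain and
# target gap are illustrative values only.

def cruise_adjust(gap_m: float, target_gap_m: float = 40.0,
                  gain: float = 0.5) -> float:
    """Return a speed change in m/s: negative = brake, positive = accelerate."""
    return gain * (gap_m - target_gap_m)

print(cruise_adjust(25.0))  # -7.5 -> too close to the car ahead, shed speed
print(cruise_adjust(55.0))  #  7.5 -> gap has opened up, accelerate
```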
Lidar
At its core, lidar (light detection and ranging) measures distance by firing laser pulses and timing their reflections, with the returning data then used to 3D-map the scanned area—think of it as a rapid topographic map-generating tool. Archaeologists have used lidar to scan through rainforest canopies and across sand-strewn plateaus to detect ancient civilizations with seriously impressive accuracy and detail.
Because of that accuracy and the full 3D map it builds of the surrounding area, autonomous vehicle researchers have adopted and miniaturized the tech to provide better forward, side, and rearward scanning, letting autonomous car prototypes locate themselves and obstacles in 3D space.
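The ranging math itself is simple time-of-flight arithmetic: distance equals the speed of light multiplied by the pulse’s round-trip time, divided by two, because the pulse travels out and back. A minimal worked example:

```python
# Lidar ranging via time-of-flight: d = (c * t) / 2.

C = 299_792_458  # speed of light in m/s

def lidar_distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting object from one pulse's round-trip time."""
    return C * round_trip_s / 2

# A pulse that returns after 200 nanoseconds hit something ~30 meters away.
print(f"{lidar_distance_m(200e-9):.2f} m")  # 29.98 m
```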
Geo-Fencing
Geo-fencing is when an autonomous car can only operate autonomously within prescribed areas or on certain roads set by the manufacturer. It would not be able to function autonomously elsewhere.
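At its simplest, enforcing a geo-fence means checking whether the car’s GPS fix falls inside an approved zone. The toy sketch below uses a circular zone and the haversine great-circle formula; the coordinates and radius are hypothetical, and real systems rely on detailed road-level maps:

```python
# Toy geo-fence: is the car within radius_km of an approved zone's center?
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6,371 km

def inside_geofence(car_lat, car_lon, zone_lat, zone_lon, radius_km):
    """True if the car may operate autonomously here (toy circular zone)."""
    return haversine_km(car_lat, car_lon, zone_lat, zone_lon) <= radius_km

# Hypothetical 10 km zone centered on downtown Phoenix.
print(inside_geofence(33.45, -112.07, 33.4484, -112.0740, 10.0))  # True
```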
Levels of Autonomy
To properly explain the levels of autonomy, we need more room, so scroll down and dig in.
Levels of Autonomy for Self-Driving Cars
To better understand where we currently are in terms of automation, it’s best to learn the levels of autonomy. Although many trust and use the Society of Automotive Engineers’ (SAE) defined levels of autonomy, there are some holes in its definitions.
The SAE-defined levels also often let marketing speak seep in where it shouldn’t, i.e., into the definitions the public and media use to illustrate autonomy. There should be a separation of church and state.
To address those issues, we’ve adopted the SAE’s framework of Level 0 through Level 5 but replaced the definitions with our own to better explain each level to the average reader. Let’s do this!
Level 0
To borrow from Alex Roy, who is highly educated on autonomy and supremely funny, Level 0 autonomy is a horse. While this level is recognized by the SAE, the group defines it as the driver “must constantly steer, brake, and accelerate.” Our definition is easier: you’re the driver, you’re driving, nothing is there to help you drive, so drive the dang car.
TL;DR: The car provides no interference.
Level 1
Similar to Level 0, Level 1 is where the driver is still in control, but the car features advanced driver assistance systems such as adaptive cruise control or lane-centering, as well as braking and acceleration support such as automatic emergency braking.
TL;DR: The car provides some interference.
Level 2
Level 2 is essentially Level 1, where the driver is still in control, but unlike Level 1, Level 2 features both adaptive cruise control and lane-centering, and the two can be used at the same time.
TL;DR: The car provides some interference. Tesla’s Autopilot, Mercedes-Benz’s Drive Pilot, Audi’s Adaptive Cruise with Traffic Jam Assist, and GM’s Super Cruise are examples of Level 2.
Level 3
So here’s where the levels get tricky and where The Drive and the SAE diverge. The SAE says of Level 3 autonomy, “You are not driving when these automated features are engaged—even if you are seated in the driver’s seat.” To most people, that vague definition means the car will drive itself but the driver can be requested to take over if something calamitous arises. In reality, it means the system likely can’t cope with inclement weather, poor infrastructure, or other conditions that make it unsafe for the system to function.
TL;DR: That’s not driving itself; that’s still assisting the driver, and it should not purport to be self-driving in any regard. This is especially egregious given the abuse Level 2 systems have enjoyed due to erroneous proclamations about the capabilities of systems such as Autopilot. SAE needs to redefine the level and clear up its nomenclature.
Level 4
Levels 4 and 5 are fabled things, indeed. According to SAE, cars capable of Level 4 autonomy may or may not have “pedals/steering wheel.” These cars do not require a human to intervene at all. The engineers use the example of “local driverless taxis,” which DO NOT EXIST, no matter what Elon says.
TL;DR: Level 4 is hybrid autonomy, as it features fully autonomous capabilities but still has the functionality for you to drive the car. We could even see a scenario where Level 4 and Level 5 are combined and segmented into two groups: those with controls and those without.
Level 5
Once again borrowing from the indomitable Alex Roy, “BS, at least for the foreseeable future.”
In a more structured breakdown, Level 5 autonomy means that the car drives, brakes, and steers itself. There’s no steering wheel, no brake or accelerator, no controls at all. According to the SAE, Level 5 refers to a car that can drive itself everywhere in all conditions. As such, the car is fully autonomous. You just plug in your destination and it whisks you away. THIS DOES NOT EXIST.
TL;DR: An Uber without the chatty driver, and a long shot to become reality before 2077.
Why Is Driver Monitoring Necessary?
Because too many people think their cars are autonomous when they’re not and can’t be trusted to sit behind the wheel without doing something stupid.
Why Is It Irresponsible To Allow Autonomous Prototypes On The Road?
Let me tell you a little story. Nearly two years ago, as the author was driving in Los Angeles with his wife, infant daughter, and brother in the car, a crimson Tesla Model 3 began to merge onto the highway. To mitigate a potential crash, the author accelerated. The Model 3 did too and nearly side-swiped the author’s car, which could’ve injured, or worse, everyone inside. Just before the two cars came together, the author saw the Tesla driver’s hand jump up from his lap and onto the steering wheel. The car was on Autopilot.
Borrowing from the resulting column, “My daughter, your daughter, your son, your wife, your husband, your brother and sister, your father and mother, every single person who shares the road with an Autopilot-equipped car are in essence Tesla’s lab rats. What’s a few deaths when you’re advancing technological progress? Tesla covers its ass by giving those “futurists” willing to use Autopilot—again, not a fully autonomous vehicle—a Terms and Conditions prompt before drivers are able to engage the system. The dialogue-box informs drivers that they need to agree to “Keep your hands on the steering wheel at all times and to always maintain control and responsibility for your vehicle.” Yet, unlike the Terms and Conditions we accept on a regular basis—those that almost no one ever reads—its effects can reach beyond the user. There are in fact other people on the road who haven’t given their tacit agreement to be beta testers, like my daughter. No amount of Tesla legalese can refute that.”
Essentially, public roads aren’t a vacuum. Others exist and beta-testing technology on them is inherently irresponsible.
FAQs About Self-Driving Cars
You’ve got questions, The Drive has answers!
Q. So Mr. Smarty Pants, in What Year Will Cars Drive Themselves?
A. Honestly, we don’t know. Companies have made real progress in automating a number of systems, but every single “autonomous” experiment still requires a human in the driver’s seat monitoring everything. And if they aren’t, people can be hurt. We’re in an age where companies can promise a lot and deliver a little, but because of the information age, we’re hyper-aware of those launches and think everything is “just around the corner.” Right now, it isn’t.
Q. Then What Is Musk Talking About With Full Self-Driving?
A. To be frank, it’s marketing spin. Full Self-Driving technically “debuted” in 2016 and has been promised to be “just around the corner” ever since, with Musk even saying it’d be available to “certain customers” in 2018 and 2019. As of this writing, it’s only available as a Beta test, as mentioned above, to very few users.
It is, however, a helluva cash flow stream for Tesla, a company that has charged between $1,000 and $7,000 for the option since it debuted FSD on all new Teslas in 2019. We’re not saying that people are paying a lot of money for something that doesn’t exist, but people are paying a lot of money for something that doesn’t exist.
Q. Ok, But Why Does Autopilot Work in Airplanes and Not Cars?
A. Because planes are easy, relatively speaking. Planes, unlike cars, have the freedom to move through three-dimensional space, maneuvering on the X, Y, and Z axes in much less crowded skies. That freedom allows the, again, relatively small number of planes to easily pilot themselves to their destinations.
Cars, however, don’t operate in such environments. They have to contend with millions of other cars, pedestrians, obstacles in the road, poor infrastructure, the chaotic impulses and actions of drivers, and weather, along with a host of other variables that are extremely hard to code around. That’s also the impetus behind the famous “Trolley Problem.”
Q. The “Trolley Problem?”
A. Oh, baby, strap yourself in, we’re going for a ride. The trolley problem is a thought experiment that the people tasked with bringing about our autonomous future cling to. The problem boils down to this: You’re the conductor of a trolley, and there’s a fork in the tracks ahead. Tied to one track is a single person. On the other, five people. You, the conductor, have to make the decision to take one track or the other. The conceit is that you’d make the decision that does the least amount of harm and, using pure logic, choose the track with the single individual. This is the same logic coders are attempting to give the AI computers that run self-driving cars.
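Reduced to code, that “pure logic” is almost insultingly simple, which is exactly the point the next answer makes. A hypothetical toy version:

```python
# The utilitarian trolley rule as a toy function: minimize the body count.

def choose_track(track_a_people: int, track_b_people: int) -> str:
    """Return the track with fewer people in harm's way."""
    return "A" if track_a_people <= track_b_people else "B"

print(choose_track(track_a_people=1, track_b_people=5))  # "A"
```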
Q. That Sounds Awful, But I Guess the Logic Is Sound?
A. Yeah, it doesn’t work. As a thought problem, totally sound. In reality, as soon as you put a name to someone, the entire problem shreds itself.
Imagine the problem again, but this time that one person is your spouse, mother, father, brother, daughter, son, or grandparent and the other people are random strangers. Or an even split between the two tracks. The AI program driving the car won’t care, but you know who will? That person’s family. This leaves the company behind the software open to litigation regarding how the AI decided who lives and who dies. People aren’t computers and won’t see a computer deciding who lives and who dies as “logical.”
Q. If That’s The Case, Are Any Companies Even Close To Providing Self-Driving Cars?
A. Some say they’re getting close, but every single company building supposedly self-driving/autonomous cars still has a squishy meat-bag in the driver’s seat in case some programming error occurs and a human is needed to halt the car’s momentum. This was supposed to be the case with Uber’s prototype when it killed a pedestrian. And it’s most definitely the case with every single Tesla on the road, whether you purchase Full Self-Driving or not.
Q. Wait, Has Tesla’s Autopilot Killed Anyone?
A. Because of Tesla’s poor marketing and Elon’s public statements to just about every outlet on the capabilities of Autopilot, many people abuse the system on the regular. Those are the videos you’ve seen with people sleeping, eating, watching movies, and even having sex while Autopilot is engaged. As a result, a handful of deaths have occurred.
The National Highway Traffic Safety Administration (NHTSA) even found that Autopilot was likely at fault in a fatal accident in 2018. Further, people continue to abuse Tesla’s Autopilot and use defeat devices to thwart its driver-monitoring steering wheel torque sensor, allowing prolonged Autopilot driving without driver input or attention. These are affectionately known as “Autopilot Buddies.”
Q. What Is an Autopilot Buddy?
A. An Autopilot Buddy is a weight that you clip onto your steering wheel to defeat Tesla’s driver monitoring steering wheel torque sensor. You know, the system that keeps you alert and attentive rather than filming a video for Pornhub.
Q. You’ve Been a Great Help, How Can I Ever Repay You?
A. Fact-check your friends whenever they decide to say their Tesla “drives itself!”
Let’s Talk, Comment Below To Talk With The Drive’s Editors!
We’re here to be expert guides in everything How-To related. Use us, compliment us, yell at us. Comment below and let’s talk! You can also shout at us on Twitter or Instagram; here are our profiles.
Jonathon Klein: Twitter (@jonathon.klein), Instagram (@jonathon_klein)
Tony Markovich: Twitter (@T_Marko), Instagram (@t_marko)
Chris Teague: Twitter (@TeagueDrives), Instagram (@TeagueDrives)
Got a question? Got a pro tip? Send us a note: guidesandgear@thedrive.com