Self-driving cars have been talked about for as long as flying ones, but while the latter still seems, for now at least, relegated to science fiction, the autonomous vehicle is fast becoming a reality. This does not come without some serious philosophical problems, both ethical and linguistic. For instance, how exactly are we defining ‘autonomous’? Do we mean a car you can literally get into, plug in a destination, and sit back with a glass of champagne, without ever having learned to drive? Or do we simply mean the sum of a collection of automatic features, such as an automatic gearbox, handbrake, or cruise control? And if an accident occurs on auto-drive, who is liable? The driver? The manufacturer? It is a minefield, and this article explores some of these difficult questions.
Strictly speaking, the prefix ‘auto’ derives from the Greek ‘autos’, meaning ‘self’. It says nothing about the extent of automation ascribed to a thing, only that automation is a characteristic of it. An automatic gearbox, for example, does not mean there are no gears, only that the driver does not need to change them manually via a clutch: the gearbox uses a planetary gearset and a torque converter to shift up or down according to the vehicle’s speed. An automatic handbrake, similarly, is applied electronically when the vehicle stops. The basic idea, then, is that there is a spectrum of automation, whereby more and more of what constitutes ‘driving’ is managed by a machine, freeing the driver to manage everything else.
So, how close are we to ‘total’ automation? The Society of Automotive Engineers (SAE) defines six levels of driving automation, from Level 0 (no automation) to Level 5, which is described as ‘full’ in the sense that, in principle, a driver ought not to be required at all, the vehicle’s navigation and hazard perception being handled by sensors on the vehicle’s exterior and a system of sophisticated computer algorithms. Fully autonomous vehicles also draw on cloud computing to act upon things like traffic data, weather conditions, and city maps.
Ethics & Liability
However, it is not the case that you can get in a self-driving car and take charge of it without a license, for much the same reason that a Boeing 747 capable of landing via autopilot still needs a pilot. In driving, just as in flying, awkward situations invariably arise in which there simply is no objective solution, situations that call for judgment, and these are the ones that require an experienced, sentient mind to resolve. Legally, therefore, no matter the level of automation, it is still the driver who is ultimately responsible for what happens, not the car.