The Stack Archive

Should self-driving cars have to pass a driving test?

Thu 22 Oct 2015

The University of Michigan Transportation Research Institute has released a white paper calling for investigation into whether autonomous vehicles should have to pass a standardised driving test in the same way that people do in the United States, and it raises many interesting questions about the prospects of a release-to-market for self-drive technology – as well as implications for the people who will occupy the vehicles.

Should We Require Licensing Tests And Graduated Licensing For Self-Driving Vehicles? (PDF abstract), by Dr. Michael Sivak and Brandon Schoettle, Project Manager in UMTRI’s Human Factors Group, suggests that although the United States’ graduated driver licensing system (GDLS) cannot be applied directly to autonomous vehicles, the disparity in standards and criteria among self-driving vehicle (SDV) manufacturers makes the establishment of common standards a likely priority in the near future.

One of several areas the paper addresses is the acknowledged limitations of visual recognition systems in SDVs, a factor which, for human drivers, is handled entirely within the GDLS by an eye test and an optional glasses requirement on the driving licence:

‘Visual performance of self-driving vehicles in inclement weather is currently a problem. For example, a Google spokesperson was quoted recently as saying that Google “doesn’t intend to offer a self-driving car to areas where it snows in the near term” (Trudell, 2015). Furthermore, even rain can be a problem for some current prototype systems (Sutherland, 2015).’

In January Chris Urmson, director of Google’s self-driving car project, told reporters in Detroit that tests taking place on Google’s home turf in Mountain View are in effect first-phase trials in ideal conditions, observing, on the subject of SDVs negotiating fog and rain: “There are a lot of places where we can get an initial deployment, understand the tests, see how people use it and then push the technological boundaries into these more challenging situations.”

‘Fair-weather’ self-driving

The urgency to commercialise SDVs suggests that they may enter the market with many caveats, and that early permissions for autonomous driving will be restricted to the same conditions in which the vehicles have proven themselves. There are clearly not going to be many early launches in February in Alaska.

This would seem to suggest that any standardised SDV testing procedures will have to be graduated or qualified, as the paper suggests – or else that the need for self-driving cars to prove themselves under far more challenging driving conditions could greatly delay any eventual roll-out. If governments prove as skittish about SDVs as they currently are about drones, it is even possible that we are living through the SDV equivalent of the false dawn of consumer enthusiasm that surrounded VR in the 1990s, and that self-drive will be something for your kids rather than for you.

Alternatively we have the possibility of ‘fair-weather’ SDVs going to market in the ‘early years’ of self-driving; vehicles licensed either for certain regions or for certain conditions, or both. That brings up the issue of what happens when an occupant attempts to ride an SDV out of its comfort zone – into pile-driving rain, for instance, or into fogged-out Detroit, or even into a GPS zone noted for a high incidence of traffic accidents. Would the vehicle report itself to authorities, quietly park itself and make the occupant aware of nearby motels with vacancies, or simply stop and tell the occupant to take over the driving duties?
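As a thought experiment, the fallback choices above can be sketched as a simple decision rule. Everything here is hypothetical – the condition fields, the `fallback_action` function and the approved-weather list are illustrative assumptions, not any manufacturer’s actual logic:

```python
from dataclasses import dataclass

# Hypothetical snapshot of driving conditions as the vehicle perceives them.
@dataclass
class Conditions:
    weather: str           # e.g. "clear", "light_rain", "rain", "snow", "fog"
    region_licensed: bool  # is the car licensed to self-drive in this region?
    accident_hotspot: bool # GPS zone with a high incidence of accidents?

# Conditions our imaginary 'fair-weather' SDV has been certified for.
APPROVED_WEATHER = {"clear", "light_rain"}

def fallback_action(c: Conditions) -> str:
    """Decide what a 'fair-weather' SDV might do at the edge of its envelope."""
    if c.weather not in APPROVED_WEATHER or not c.region_licensed:
        return "hand_over"   # stop and ask the occupant to take over driving
    if c.accident_hotspot:
        return "park"        # quietly pull over and wait (or suggest a motel)
    return "continue"
```

Even this toy version makes the licensing question concrete: the `"hand_over"` branch is only viable if the occupant actually holds a driving licence.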

In the latter case, what are your options if you don’t actually have a driving licence? In The Pathway to Driverless Cars [PDF] the British Government envisions a future where self-driving cars do not require that the occupants be able to drive, though it adds “Emergent properties of the way automated systems interact…may potentially [require] changes to driver training, testing, and licensing.”

Self-driving vehicles – too good to be safe?

A crucial issue for the prospect of SDV testing is whether a self-driving car should pass or fail the test if it breaks traffic laws. The answer isn’t as obvious as it seems: although the report notes that SDV guidance and driving systems could easily incorporate the traffic laws of all 50 states and apply them based on GPS position, the laws themselves contain no instruction about when breaking them is necessary to avoid injuring or even killing people:

‘Self-driving vehicles may follow the letter of the law too strictly, compared to what people typically do (Richtel and Dougherty, 2015). Let us consider two examples. The first example comes from Visnic (2015). He points out that “merging at the speed limit onto a highway of cars zipping past at well over the speed limit is just plain dangerous.” The second example is from Knight (2015): “Another clip…showed the first time one of Google’s cars encountered a traffic roundabout, when it decided the safest thing to do was to keep going around,” presumably because of the aggressive behavior of other traffic participants. Other examples of traffic-law violations by human drivers include going over the speed limit by a few miles per hour, performing rolling stops at stop signs, and leaving less-than-recommended spacing between vehicles.

‘Consequently, the newest versions of self-driving software attempt to make vehicles “drive more like humans” (Barr and Ramsey, 2015). This is achieved, for example, “by cutting corners, edging into intersections and crossing double-yellow lines” (Barr and Ramsey, 2015).’

This problem means that traffic-law violations would need to be accounted for not only in the software but also in the legally binding tests that would constitute any proposed evaluation of self-driving vehicles. One possible solution would be to enforce ‘zero tolerance’ for this kind of laissez-faire driving style at a systems level, forcibly pushing the ‘better behaviour’ of SDVs onto the general driving population, which could by then be presumed to be entirely controllable – and repressible.
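The report’s two points – that SDV software could carry the traffic laws of all 50 states and select among them by GPS, and that ‘human-like’ driving must be allowed to bend those laws within some bound – can be sketched roughly as follows. The rule table, function names and 10% tolerance are invented for illustration, not drawn from any real system:

```python
# Hypothetical per-state rule table; a real system would carry the full
# statutes of all 50 states, which the report suggests is feasible.
STATE_RULES = {
    "MI": {"max_speed_mph": 70},
    "CA": {"max_speed_mph": 65},
}

def speed_limit_for(state: str, default: int = 55) -> int:
    """Look up the applicable limit once GPS has resolved the state."""
    return STATE_RULES.get(state, {}).get("max_speed_mph", default)

def merge_target_speed(limit: float, traffic_speed: float,
                       tolerance: float = 0.10) -> float:
    """Match prevailing traffic when merging onto a fast highway, but cap
    the deviation at a fixed fraction above the posted limit."""
    return min(traffic_speed, limit * (1 + tolerance))
```

On this sketch, merging into 75 mph traffic on a 65 mph highway yields a target of roughly 71.5 mph – neither the dangerously slow letter of the law nor an unbounded violation. Whether a licensing test should pass or fail that behaviour is exactly the paper’s open question.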

The no-win scenario

At a regulatory and software level, self-driving cars will eventually have to answer some questions which are very tough, and about which even human driving pupils are given no instruction, such as which of two bad choices to make in extreme driving circumstances:

“On rare occasions, self-driving vehicles will face ethical dilemmas of having to choose the lesser of two evils (e.g., Newcomb, 2015). (An example is being forced to decide between two inevitable crashes that involve different participants.) It would be desirable if the resolutions of such ethical dilemmas were consistent with societal norms, as is hopefully the case with human drivers.”
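Stripped of its ethics, the ‘lesser of two evils’ choice reduces to minimising some harm estimate over the available manoeuvres – and the hard, unsolved part is assigning those estimates in a way consistent with societal norms. The option names and scores below are purely illustrative:

```python
def lesser_evil(options: dict) -> str:
    """Choose the manoeuvre with the lowest estimated harm score.

    The scores themselves are the contentious input: nothing in traffic
    law or the GDLS says how to weigh one inevitable crash against another.
    """
    return min(options, key=options.get)

# Hypothetical pre-crash assessment of two unavoidable collisions.
choice = lesser_evil({"swerve_left": 0.8, "brake_straight": 0.3})
```

The one-line `min` hides the entire problem the report raises: who writes the numbers, and against what norms they are validated.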

It’s in considering such cases that the huge gulf between human driver training and automotive software testing is most apparent, along with how interpretive and subjective human driving tests are. Humans are not required to prove themselves in all the conditions they are likely to encounter in a lifetime of driving, very few of which can be arranged on the day of a driving test: night-time driving, snow and other poor-visibility weather, highly variable traffic conditions and ethical pre-crash decisions. A truly comprehensive human driving test in the United States would likely cost $100,000 and would need to take place in a number of states at different times of the year. And factoring in reaction times and reaction choices in anomalous situations would require VR-based testing, where the cost of establishing baseline performance alone would make the test impractical.

Standard driving tests evaluate base competencies but are in effect professional estimations of ‘common sense’ – presumptions about how likely the pupil is to be able to cope with and adapt to unforeseen and untested conditions. In many ways the process tests the learning ability of the pupil more than it tests what the pupil has learned to date. That’s a very abstract methodology to apply to analyses of self-driving vehicles.

