News programs worldwide expressed concern about the burgeoning driverless car industry on Friday, November 10, when the first self-driving taxi making its way through a half-mile loop in Las Vegas was involved in a crash with a human-driven truck. No injuries occurred, but the incident reveals that autonomous driving is still very much in the experimental stage. According to John Moreno, spokesman for AAA (the American Automobile Association), “This is exactly the kind of real-world scenario that this pilot is attempting to learn from. This is one of the most advanced pieces of technology on the planet, and it’s just now learning how to interact with humans and human driving.” In this case, the ‘autonomous’ taxi was driving behind a truck that stopped and reversed in order to turn into an alley. The truck driver was at fault, of course, and was given a ticket, but the outcome is still considered a glitch standing in the way of the lofty safety aims of autonomous car manufacturers.

Human vs Tech Expectations on the Road

The taxi was part of AAA’s effort to raise awareness of the benefits of autonomous, or driverless, cars. The operators explained that the taxi failed to move because there was another vehicle behind it; still, there was enough space to avoid an accident if the vehicle had reacted quickly enough. The accident also brings up a specific problem: the differing expectations of autonomous driving systems and human drivers. In this case, the truck driver continued because he assumed the taxi would back up to give him space. The National Transportation Safety Board (NTSB) immediately sent staff to investigate what had occurred. This follows a report issued after a 2016 accident in Florida in which a Tesla Model S that was steering itself crashed into a truck, killing the driver.

The Future is Still Bright for Driverless Cars

Although the NTSB has been critical of specific autopilot designs, it is very much encouraging the car industry to install features such as auto-braking and lane departure warning systems in all vehicles. This only makes sense, since autonomous systems are purposely designed to avoid accidents and reduce road deaths. Remember the Bill Busbice-produced film Left Behind (starring Nicolas Cage), in which millions of people suddenly disappear from the face of the earth? Remember the number of car crashes that suddenly occurred as drivers vanished from their vehicles?

Imagine a world where, in such a situation, a car would automatically brake before crashing into other vehicles. Companies such as Fiat, BMW, Intel, and Tesla are well on their way toward achieving fully autonomous cars. Safety features expected to appear between 2020 and 2025 include autonomous long-range driving at motorway speeds, automatic emergency calls in the event of an accident, and warning systems alerting drivers to dangers, particularly the dreaded blind spot.
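For readers curious how an auto-braking decision can be reduced to something a computer evaluates many times per second, the short Python sketch below shows a simplified time-to-collision check; the function name, parameters, and 1.5-second threshold are hypothetical illustrations, not any carmaker's actual system.

# Illustrative sketch only: a simplified time-to-collision (TTC) check of the
# kind an automatic emergency braking feature might rely on. The names and the
# threshold below are hypothetical, not drawn from any manufacturer's system.

def should_auto_brake(gap_m: float, closing_speed_mps: float,
                      ttc_threshold_s: float = 1.5) -> bool:
    """Return True when the time to collision with the obstacle ahead
    drops below the safety threshold."""
    if closing_speed_mps <= 0:
        # The gap is constant or growing, so no collision is imminent.
        return False
    time_to_collision_s = gap_m / closing_speed_mps
    return time_to_collision_s < ttc_threshold_s

# A vehicle 8 m ahead closing at 6 m/s gives a TTC of about 1.3 s: brake.
print(should_auto_brake(gap_m=8.0, closing_speed_mps=6.0))   # True
# The same closing speed at 30 m gives a TTC of 5 s: no intervention yet.
print(should_auto_brake(gap_m=30.0, closing_speed_mps=6.0))  # False

Real systems combine many such signals from radar, cameras, and lidar, but even this toy check shows why a car that never looks away from the road can react faster than a distracted human driver.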

Autonomous driving has many aims, above all promoting safety in both ideal and risky driving conditions. Thus far there have been a couple of glitches, yet even negative outcomes help identify the improvements that still need to be made in autonomous-human driver interaction.