By: Jack Barkenbus, Vanderbilt University

Every day about 100 people die in car crashes on U.S. roads. That death toll is a major reason why both Congress and the Trump administration are backing automotive efforts to develop and deploy self-driving cars as quickly as possible.

However, officials’ eagerness far exceeds the degree to which the public views road deaths as a serious concern, and it overestimates the public’s willingness to see its driving patterns radically altered. As those of us involved in studies of technology and society have come to understand, foisting a technical fix on a skeptical public can lead to a backlash that sets back the cause indefinitely. The backlashes against nuclear power and genetically modified organisms exemplify the problems that arise from rushing technology in the face of public fears. Public safety on the roads is too important to risk a consumer backlash.

I recommend industry, government and consumers take a more measured and incremental approach to full autonomy. Initially emphasizing technologies that can assist human drivers – rather than cars’ ability to drive themselves – will somewhat delay the day when all those lives are saved on U.S. roads. But it will start saving some lives right away, and it is more likely to avoid mass rejection of the new technology.

(more…)

By: Srikanth Saripalli, Texas A&M University

In early November, a self-driving shuttle and a delivery truck collided in Las Vegas. The event, in which no one was injured and no property was seriously damaged, attracted media and public attention in part because one of the vehicles was driving itself – and because the shuttle had been in service for less than an hour before the crash.

It’s not the first collision involving a self-driving vehicle. Other crashes have involved Ubers in Arizona, a Tesla in “autopilot” mode in Florida and several others in California. But in nearly every case, it was human error, not the self-driving car, that caused the problem.

In Las Vegas, the self-driving shuttle noticed a truck up ahead was backing up, and stopped and waited for it to get out of the shuttle’s way. But the human truck driver didn’t see the shuttle, and kept backing up. As the truck got closer, the shuttle didn’t move – forward or back – so the truck grazed the shuttle’s front bumper.

As a researcher working on autonomous systems for the past decade, I find that this event raises a number of questions: Why didn’t the shuttle honk, or back up to avoid the approaching truck? Was stopping and not moving the safest procedure? If self-driving cars are to make the roads safer, the bigger question is: What should these vehicles do to reduce mishaps?

In my lab, we are developing self-driving cars and shuttles. We’d like to solve the underlying safety challenge: Even when autonomous vehicles are doing everything they’re supposed to, the drivers of nearby cars and trucks are still flawed, error-prone humans.
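To make those questions concrete, here is a minimal sketch – in Python, with invented names and thresholds rather than anything from the shuttle’s actual software – of the kind of defensive fallback logic a stopped vehicle could apply: if something keeps approaching, honk, and back away when there is room behind.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    HOLD = auto()      # stay stopped
    HONK = auto()      # alert the other driver
    REVERSE = auto()   # back away to keep a buffer


@dataclass
class Perception:
    gap_to_obstacle_m: float   # distance to the approaching vehicle
    obstacle_closing: bool     # is the gap shrinking?
    rear_clear_m: float        # free space behind the shuttle


def defensive_policy(p: Perception,
                     honk_gap_m: float = 5.0,
                     reverse_gap_m: float = 2.0,
                     min_rear_clear_m: float = 3.0) -> Action:
    """Pick an evasive action while stopped and being approached.

    All thresholds here are illustrative assumptions, not tuned values.
    """
    if not p.obstacle_closing:
        return Action.HOLD
    # Close call and room behind: retreat to maintain a safety buffer.
    if p.gap_to_obstacle_m <= reverse_gap_m and p.rear_clear_m >= min_rear_clear_m:
        return Action.REVERSE
    # Otherwise, warn the other driver before the gap closes.
    if p.gap_to_obstacle_m <= honk_gap_m:
        return Action.HONK
    return Action.HOLD


# A Las Vegas-style scenario: truck closing in, open space behind the shuttle.
print(defensive_policy(Perception(gap_to_obstacle_m=1.5,
                                  obstacle_closing=True,
                                  rear_clear_m=6.0)))
# Action.REVERSE
```

A real system would also have to confirm that reversing is legal and safe in context; the point of the sketch is only that freezing in place need not be the vehicle’s final answer.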

(more…)