The self-driving, “autonomous” car seems like something out of The Jetsons. Or Blade Runner. A future that used to exist only in the movies is now arriving. Companies like Google, Apple, GM and Honda – those with a finger on the pulse of the industry – predict that self-driving autos will be available to everyday consumers by 2020. Two decades from now, 1 in 10 vehicles on the road could be self-driving.
But there are significant roadblocks at present that may keep this revolutionary idea from being fully embraced by the people who matter most – the consumers who might buy them.
How much trust do you have?
First and foremost is a lack of trust. A self-driving car is such a new phenomenon that it could take a decade or more for consumers to become comfortable enough with the idea to put down money for one. Riding in an autonomous vehicle means giving up control to a machine that you have to trust to make the right decisions to keep you out of an accident. Machines can process information much faster than humans can, and computer and GPS location technologies have advanced far beyond expectations in a remarkably short period of time. But do you trust a machine, even with access to all that information and the ability to process it at lightning speed, to read situations and make a nuanced decision about the best course of action? It’s one thing to say you do; it’s another to put that belief into action by spending your money and trusting your safety to it.
Right now, the autonomous cars consumers think of most often are the Google cars that drive around logging pictures for Google Maps. And they’re awesome, no doubt about it. They’re the reason we have Street View, where you can access a 360-degree view of almost any paved street in the country. But some of the Google cars have been involved in accidents. Most, if not all, of those accidents were caused by other drivers. But people don’t think about that – they just hear that a driverless car got in an accident.
What if someone else took control?
Then there’s the concern that the computer systems in these vehicles could be vulnerable to outside influences like hackers. Earlier this year, it was reported that hackers were able to remotely take over the computer system of a Tesla Model S and shut the car down. Tesla doesn’t make autonomous vehicles, but the incident reinforced concerns about how to prevent the same kind of thing from happening in one.
As of right now, the National Highway Traffic Safety Administration has not given its seal of approval for driverless vehicles to operate on public roads in any full-time capacity. But the test models currently out there are providing valuable feedback and advancing the technology ever closer to the day when people don’t think twice about riding in a car they don’t have to drive themselves.
You may be interested in these other posts:
- One step closer to autonomous driving
- Ready for a game changer? How about "Google in your Car"?
- Drivers loosen the purse strings - car loans top $1 trillion
This post was published on January 7, 2016 and was updated on April 15, 2021.