Self-Driving Cars and Trust

I covered autonomous driving briefly in my CES trends post on Monday, but I wanted to return to the topic for a somewhat deeper dive on a specific issue: trust. The context here is Google’s insistence on moving straight to entirely autonomous vehicles (ones without steering wheels) and its disparaging comments about more incremental approaches.

Users don’t trust machines

The starting point here has to be the fact that users don’t generally trust machines, at least at first. Most users take quite some time to warm up to machines and computers: experience teaches them that machines often don’t work the way they expect and often malfunction. In the context of a machine that’s supposed to drive you places at dangerous speeds, that’s problematic. When the worst outcome we’re used to with most computers is a figurative crash, literal ones are pretty intimidating.

Incrementalism is the key

In this context, incrementalism is not an inferior approach but likely the key to gaining user trust. Can you imagine walking into a car dealership tomorrow and buying a self-driving car without ever having driven one? Would you even feel comfortable test-driving one without a steering wheel, with no previous experience of such a thing? Such a prospect would be utterly intimidating. However, if you were to make small, incremental steps in this direction first, over time you’d probably be quite prepared to make the smaller leap to a self-driving car. The US Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) has defined five levels of automated driving as follows:

  • No-Automation (Level 0)
  • Function-specific Automation (Level 1)
  • Combined Function Automation (Level 2)
  • Limited Self-Driving Automation (Level 3)
  • Full Self-Driving Automation (Level 4)
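
For readers who think in code, the taxonomy above can be sketched as a simple ordered enumeration (the member names are my own shorthand, not NHTSA’s official labels):

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """NHTSA's five levels of automated driving."""
    NO_AUTOMATION = 0         # driver controls everything
    FUNCTION_SPECIFIC = 1     # a single function automated, e.g. cruise control
    COMBINED_FUNCTION = 2     # two or more functions automated together
    LIMITED_SELF_DRIVING = 3  # car drives itself; driver must be ready to take over
    FULL_SELF_DRIVING = 4     # car performs all driving functions

# Trust, on the argument above, builds one level at a time:
for level in AutomationLevel:
    print(level.value, level.name)
```

The ordering matters: the argument of this post is that drivers climb this ladder one rung at a time rather than jumping straight from 0 or 1 to 4.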

Most drivers, accustomed to vehicles operating at Level 0 or Level 1, will not have developed the trust necessary to climb into a car operating at Level 4. But if they move through the stages in between, they may gain that trust, assuming they don’t have a poor experience along the way. That likely means starting with cruise control and moving on to brake assistance, electronic stability control, smart cruise control, and so on from there. Google’s biggest challenge will be that it appears to be working only on the ultimate goal of Level 4 automation, without the ability to take users through the intermediate steps.

Good early experiences are key

Those early experiences, though, are critical in developing user trust; incrementalism alone won’t cut it. If the car performs poorly at those lower-level automation tasks, users will never trust it to do more. Tesla’s recent move to Level 2 automation with its Autopilot feature was exciting for drivers, some of whom posted videos to YouTube showing the technology in action. But search for “tesla autopilot fail” on YouTube and you’ll get quite a few results too, demonstrating the technology isn’t quite ready in some cases. These types of failures, if they become frequent or tragic enough, will start to erode trust even in Level 2 automation, which will make it much harder to move users towards Levels 3 and 4.

Ina Fried of Re/code recently posted a video of a self-driving car demo in which the driver frequently had to intervene: another example of a poor experience. Conversely, in a simulator demo at CES last week, the “car” allowed me to relinquish control only under certain circumstances in which it would do a better job than I would. When I did so, it performed well every time, and that kind of consistently good experience is critical to gaining user trust too.

Mimicking (some aspects of) human drivers

Interestingly, another important aspect of developing driver (or passenger) trust is mimicking some of the characteristics of human drivers. Obviously, that doesn’t include the more dangerous aspects of human driving, but it does mean machines can’t simply take what appears to be the logical approach, and can’t always drive at the maximum safe speed. Some of the people I’ve talked to who work in this field tell me human drivers tend to go slower than automated cars would when exiting a freeway, for example. The car may be traveling perfectly safely but, if it doesn’t feel that way to the driver or passenger based on her personal experience, then it doesn’t matter. The bar for developing trust is not merely driving safely, but driving in such a way that the occupants of the vehicle feel safe.

Since not everyone drives the same way, over time this mimicking of human drivers needs to move beyond average driver behavior and towards multiple different driver profiles, some driving faster and some slower, some prioritizing earliest arrival time and others maximum fuel efficiency, for example.

A long game

We already know the technological side of developing truly autonomous vehicles is a long game. Though the tech is moving quickly, mapping technology, LiDAR, regulations and so many other aspects of this field have a long way to go. But gaining user trust is also going to be a long game, and one companies can’t short-circuit.

Thankfully, companies have plenty of time to train users to trust these computers in a way they haven’t been able to trust other computers in the past.

Published by

Jan Dawson

Jan Dawson is Founder and Chief Analyst at Jackdaw Research, a technology research and consulting firm focused on consumer technology. During his sixteen years as a technology analyst, Jan has covered everything from DSL to LTE, and from policy and regulation to smartphones and tablets. As such, he brings a unique perspective to the consumer technology space, pulling together insights on communications and content services, device hardware and software, and online services to provide big-picture market analysis and strategic advice to his clients. Jan has worked with many of the world’s largest operators, device and infrastructure vendors, online service providers and others to shape their strategies and help them understand the market. Prior to founding Jackdaw, Jan worked at Ovum for a number of years, most recently as Chief Telecoms Analyst, responsible for Ovum’s telecoms research agenda globally.

856 thoughts on “Self-Driving Cars and Trust”

  1. “users don’t generally trust machines”

    Considering the failure rate of both cars and computers, I don’t see why this is so difficult to understand.

    That said, I can only hope it will clarify for everyone how to behave at a four-way stop. The concept seems to escape so many.

    Joe

    1. Also, buyers don’t trust companies, car makers in particular. In recent memory
      – GM sold millions of cars with a known defective ignition switch, then weaseled its way out of compensating most of the tens of people it predictably killed
      – Volkswagen cheated on pollution scores for hundreds of thousands of cars
      – Takata sold millions of known defective airbags

      These are not bottom-of-the-barrel, price-oriented, fly-by-night companies, but what I’m sure some here would call “premium” suppliers.

      The trust issue is maybe with tech, certainly with carmakers, and possibly with regulators (has anyone gone to jail over any of those? How come they weren’t caught earlier?).

      Tech is just a tool, and it can be a very dangerous tool. Companies are not even held accountable for willful harm, so there’s little hope regarding (thankfully more widespread) accidental malfunction. Tech is already the biggest cause of car repairs.

      1. And then there is Toyota, which continues to blame unintended acceleration on floor mats and sticky pedals. Another near-fatality last week.

    2. hoverboards catching fire won’t stop people from buying hoverboards. and people trust elevators. why? everybody says that you can trust them. the problem won’t be the technology but the uninformed people who will say it’s not safe even though the tech is ready (sometime in the future ofc).

      1. It certainly has stopped some people and retailers from buying and selling them.

        All I said is the lack of trust is understandable. Not sure why you need to conjecture any meaning beyond that.

        Joe

  2. As someone who has lived in and driven in extremely nasty winter weather, I can say with absolute certainty and no fear of contradiction that level four autonomous cars are a complete fantasy. Period.

    George Jetson was a cartoon, folks – not a roadmap.

    Hidden black ice, sensors covered in ice from freezing rain and sideways snow, roadways with no discernible lane markings. No machine intelligence will function better than a human in these conditions. No radar can discover that snowplow-induced pothole, coming into view from under the 18-wheeler, that is about to mangle your car’s alignment.

    Then there is the simple fact that these machines would require constant connectivity to networks to function properly (road maps, speed limit data, construction project data, software updates, etc.). Connectivity is the ultimate deal-breaker. Repeat after me: Data security is a myth. EVERY networked system can and will be broken. That pricey autonomous car will be the stupidest–and riskiest– thing you’ve ever bought.

    Trust me on that. Or not. It’s your life–go ahead and bet against the inevitable. We are DEVO.

      1. The TV crime show practically writes itself. In 70’s crime dramas, if you wanted to kill someone, you cut their brake lines. Looks like an accident.

        With Level 4 Googlemobiles, just hack the car, program it to think the exit is 20 feet further away than it really is, and watch the car go off the bridge. Try to solve that, Barnaby Jones…

    1. now that’s what makes you an absolute expert on self-driving cars… driving in extremely nasty weather. congratulations.

          1. Oh. I should have guessed by the scholarly approach you employed as you deftly addressed every issue raised. I mean, wow, man! You really put me in my place!

            How foolish I look next to a guy like you who employs scientific methodology. Boy Howdy! You are a shining example of rigorous, thorough scientific methodology if there ever was one.

            Hats off to you, bud. Can I like you on Facebook? Please?!? I mean, I want to be able to tell my friends that I know a guy who is an expert in scientific methodology! They will all be so impressed! This is like a dream come true to me!

    2. I’ve been saying this for a while. I have no problem with the concept of a fully-autonomous car but it won’t be possible to have it work in every driving situation for many many years.

      If you are driving on California roads with no severe weather, sure it might work to have a car with no manual controls but try it in Boston. Roads there are narrow, pothole filled, cow-paths that have poorly parked cars on both sides of nearly every street. Now add in something like the snow storms we received last year and your poor fully-autonomous car will just have a nervous breakdown. There were many roads in Boston last winter where only a single car in one direction could navigate at a time because the amount of snow piled on the sides of the road.

      So, give me a car in a couple of years that maintains my lane on a clear highway and I will believe it. Try and sell me on a car that has no manual controls or that can be summoned without a driver to my location and I’m betting that won’t work in many common situations.

      1. “… I’m betting that won’t work in many common situations.”

        Jamesbailey, my friend, you and I worry too much.

        I have it on good authority (from an expert in scientific methodology, no less) that soon people native to older cities in the northern climes will be able to sit back, smoke weed, and text their friends while doing crossword puzzles during rush hour winter weather events.

        Manual controls. Atrophied driving skills. Pshaw!

  3. I don’t think Level 3 autonomous cars (cars that can drive themselves nearly all the time, but may need occasional human input) are more than a few years away. Google and others are hot on the heels of the traditional car manufacturers, and they are feeling a lot of pressure. None of them wants to be seen as behind the times with technology.

    As far as the trust issue goes, maybe the cars will sell in small volumes at first. We have to make a distinction between availability of self-driving cars, and purchase of them.

    1. Engineers have already pointed out the severe, possibly fatal flaw in Level 3: the handover, when the computer surrenders control of the car to the driver. It usually happens in a hazardous situation that is unfolding too fast for the computer to process. But by then the driver, who has to make rapid-fire decisions, may already have been lulled into inattention and mental lethargy by the boredom of sitting in the driver’s seat doing nothing. So in an emergency, the computer that can’t think fast enough will be turning the controls over to a driver who can’t think, period.

          1. “i’m glad there are less people in fatal car accidents because humans are more reliable.”

            You make such a strong argument. You’re really smart.

          2. Yeah, and we all know a gun in the house raises the risk of you or a family member killing themselves far more than it lowers the risk of getting injured by an intruder. And yet people want their guns.

            Bottom line: human nature is what autonomous driving is up against.

  4. Manufacturers of autonomous cars will have to explicitly declare what their decision model is for accident avoidance or mitigation. Is it to save lives overall, or to save the passengers’ lives above all?

    If they say it’s the former, how will you feel about that as the owner of a self-driving car? Or worse, as the owner of your wife’s/daughter’s/son’s self-driving car? Do you really want to put them in a vehicle that will, or might, decide to sacrifice your kin to spare a vanful of carousing, inattentive drunks?

    If they say it’s the latter, is it good public policy to let car makers program self-driving cars to think “screw you all, I’m saving my (passenger’s) ass”?

    The factors that need to be taken into account in any accident-avoidance decision model involve ethical choices and dilemmas that no computer can ever evaluate.

    Level 4 autonomous cars will never happen. I keep saying it: how can we dream of Level 4 in cars when we can’t even have it in planes, trains and ships, where the technological obstacles are far smaller? Some even say there are no technological obstacles left there at all.

      1. i’m sure everybody would rather risk their own lives spontaneously than the lives of others 😀 that’s how the phenotype survives.

      1. The ass is a lovely creature, don’t be so harsh. Just as four wheeled non-autonomous vehicles are more trustworthy than the two wheeled kind, the four legged ass is far more dependable than the two legged version of the species. 😉

  5. “Can you imagine walking into a car dealership tomorrow and buying a self-driving car without ever having driven one?”

    i’m 33, never had a driver’s license, never drove a car. and yes, i can imagine using a self-driving car. make it cool, make the taxis cheap(er) and young people will use self-driving cars.

    1. Having never driven a car in your life, I can see why you would have total faith in a computer doing the driving for you. Or in computers doing the driving for the other cars on the road with you.