Afraid Of Self-Driving Car Testing On Public Roads? Compared To What?


A front-page article in last Sunday’s Washington Post beat an old drum: “Some Silicon Valley Residents Anxious Over Self-Driving Cars.” The usual concerns were expressed regarding how these vehicles can’t be trusted to be safe, plus objections to corporations like Lyft, Waymo, and Cruise developing robo-taxis that could put drivers out of work. And one cannot deny that a fatality occurred in Uber’s case, when a safety driver wasn’t doing their job. However, the new wrinkle in this article was its focus on Silicon Valley residents with some understanding of computer science who nevertheless don’t feel safe around Self-Driving Cars (SDCs).

The content in articles like these is usually based on a skewed perspective. People who are happy with life where they live typically do not go to Town Halls and City Council meetings. The ones who do show up have a concern they want to express, which has happened frequently in the Valley regarding SDCs. Good for them, but this results in more quotable content from the unhappy than from the happy. I’ll give credit to the Post writer for noting that “Some residents are proponents – or at least indifferent – to the autonomous cars on their streets.”

In this article, longtime robocar specialist Brad Templeton bolstered the supportive view of current SDC testing, proffering the excellent point that society already accepts risk on the road with teenage drivers, who need time to become better drivers. Do we expect teen drivers to zip around for months on a test track? No, the needed learning can only happen with on-road driving. The teenage driver’s learning curve benefits only one driver, whereas the learning of a few hundred SDCs under development can be transferred to millions of SDCs for long-term safety benefit. Well done, Brad.

To illustrate my point, envision a series of cars driving down a residential street in Mountain View during the busy morning commute/school hours. One has a driver distracted by work issues, urgently trying to read a text that just came in. Another driver is exhausted and just barely “with it.” Another is looking at the left side mirror to merge safely into a left-turn lane, not viewing the road ahead for a couple of seconds. Another is a teenage driver, paying close attention given that they only started driving last week. And then there’s the SDC, seeing 360 degrees with full attention on relevant objects in every direction. For each car, imagine a kid on a skateboard zipping into the vehicle’s path from between parked cars on the right. Which will respond best? The current SDC’s perception of and reaction to this event could be flawed, but every human process in this example has “flaws” too. The difference: the SDC is getting better at its job every day, until it far surpasses the skill of the human driver.

But I left out the most dangerous player: a Tesla driving that same road, occupied by an alert and rested driver who chooses to dwell on his or her personal screen, never looking up. This is sheer recklessness, endangering both themselves and everyone else on the road, because these vehicle systems aren’t designed to handle all obstacles and events. Tesla is very clear about this in customer communications. What about when any of us take our eyes off the road for a few seconds to, for instance, select new music? In that case, a large slice of our brain capacity is still devoted to driving, and our peripheral vision supports driving as well. However, a Tesla driver looking down at their screen for an extended period is almost completely out of the driving game. We have yet to see harm come to someone who is the victim of a negligent Tesla driver; so far such drivers have managed to kill only themselves.
