
Autonomous Cars: Taking a “Yellow Light” Approach

In 2016, a Tesla Model S in “Autopilot” mode crashed into a semi truck in China, killing the driver.

Another fatal Model S crash occurred while in autopilot in Florida a few months later.

In March 2018, a self-driving Uber struck a woman in Tempe, AZ. She later died at a hospital.

On March 23, 2018, a Tesla in Autopilot mode crashed into a highway barrier, killing the 38-year-old driver.

Those are the fatalities so far. There have been dozens of non-fatal accidents and other incidents.

Regarding the Tesla crash, NTSB (National Transportation Safety Board) spokesperson Chris O’Neil said, “The NTSB is looking into all aspects of this crash including the driver’s previous concerns about the autopilot. We will work to determine the probable cause of the crash and our next update of information about our investigation will likely be when we publish a preliminary report, which generally occurs within a few weeks of completion of field work.”

Though an acquaintance of the victim said the driver had complained that the car would sometimes veer toward the barrier, Tesla representatives said there were no logged complaints regarding Autopilot in their system.

Tesla’s announcement included this:

There was a concern raised once about navigation not working correctly, but Autopilot’s performance is unrelated to navigation.

However, according to Tesla’s own website, when their vehicle is set in Autopilot mode:

Once on the freeway, your Tesla will determine which lane you need to be in and when. In addition to ensuring you reach your intended exit, Autopilot will watch for opportunities to move to a faster lane when you’re caught behind slower traffic. When you reach your exit, your Tesla will depart the freeway, slow down and transition control back to you.

And…

All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed.

The word “navigate” is used 4 times on Tesla’s Autopilot page alone.

How, dear Tesla representative, is Autopilot NOT related to navigation?

What other navigation issues have been reported that would, according to you, have no bearing on Autopilot’s function?

Let us count the ways:

And this does not address the question of why Tesla’s camera and radar did not detect the barrier in the first place.

All of the positive momentum that has been building up will come crashing to a halt (pun intended).

Tesla, and other autonomous vehicle makers, need to be honest and open about the issues and the solutions; otherwise they risk public and political ire. Why would Tesla try to say Autopilot performance is unrelated to navigation when it clearly relies on the proper functioning of the nav system to do its job? It is such an obvious attempt at misdirection (pun intended) that we have to question how Tesla’s PR department let it slide.

I must come clean. I love the goals and ambition behind Elon Musk’s 21st-century companies. I am a big fan of SpaceX, SolarCity and Tesla. But I need to know —

Where is the truth, Mr. Musk & co.? Tesla Motors?

We knew this was going to happen. It was inevitable.

The rationale behind self-driving cars has always been sound: people tend to react too slowly or act illogically on the road. They can be reckless, and they don’t follow all of the rules. Take people out of the driving equation, let really smart computers control the vehicles, and everything should improve: fewer crashes, less traffic congestion, less fuel consumption.

In May 2017, a Morgan Stanley analyst team projected that Alphabet Inc.’s self-driving car venture Waymo could have an 84,000-strong fleet by 2022 that had driven almost 4 billion miles; at the current rate of auto fatalities (one death for every 80 million miles), that would mean nearly 50 deaths.
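The arithmetic behind that projection is simple division. Here is a minimal sketch, using only the figures quoted above (the fleet mileage and fatality rate are Morgan Stanley’s numbers, not mine):

```python
# Back-of-the-envelope version of the estimate quoted above.
# All figures come from the article, not from any dataset of my own.

miles_driven = 4_000_000_000       # ~4 billion miles projected for Waymo's fleet by 2022
miles_per_fatality = 80_000_000    # current rate: roughly one death per 80 million miles

expected_deaths = miles_driven / miles_per_fatality
print(f"Fatalities expected at the human-driver rate: {expected_deaths:.0f}")  # -> 50
```

The point of the projection is the baseline: if autonomous fleets can’t beat roughly one death per 80 million miles, they aren’t improving on human drivers.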

But the systems being used by the self-driving experts at Tesla, Ford, Waymo and Uber are being built with the express purpose of being safer.

Ford is working on a video- and radar-based system for pedestrian detection, built on 500K miles of driving data gathered from a dozen cars driving in various conditions on three continents. Unlike Tesla’s Autopilot, it will be self-contained and unable to download software updates.

This stuff works. It has been tested, re-tested, and refined thousands of times.

People Problems

The trouble in the case of pedestrian accidents will, most likely, come from the pedestrians themselves, or other “human” drivers sharing the road with the autonomous vehicles.

Why? Because our self-driving cars will, barring severe hardware failures or software bugs not covered by backups, follow the rules they are programmed to follow. And they will do so without being affected by emotion (read: road rage), fatigue, or distraction. Or by medical conditions, random last-second decisions, or curiosity.

And people are affected by those things all the time. Human factors contribute to roughly 90% of traffic accidents today, and those factors would be almost entirely eliminated if self-driving cars ruled the roads.

One of the biggest problems, then, is that they won’t be the only kind of car on the road for a long time. For years to come, traffic will be a mixture of autonomous and human-controlled vehicles in significant proportions. In essence, the highways and avenues of the world will be shared by huge numbers of 100% logical multi-ton machines and huge numbers of fallible, human-driven multi-ton machines.

There will be accidents. A lot of them. Perhaps fewer than now, but they will happen.

The makers of self-driving tech are working hard right now to help their cars better learn and understand the myriad forms that weird, random, unpredictable human behavior can take. What they’ve made will continue to improve and become safer. But it’s not really the autonomous vehicles we need to worry about. Just as it always has been, it’s the human factor.

Man vs. Machine

There will always be the matter of a natural human tendency toward mistrust of new technology.

Even the (comparatively) simple technology of many electrical systems in modern cars has resulted in a veritable pile-up of accidents and recalls.

Toyota’s “sticky pedal” incidents, which caused unintended acceleration, ended up killing almost 90 people. The problem was traced to a confusing muddle of what programmers call “spaghetti code” in the cars’ software: so many people had written so much code over such a long period of time, without proper documentation, that it became almost impossible to find the fault.

There is a lesson in that.

The cars didn’t screw up. The people who made the cars screwed up.

We can’t let the few accidents that do occur, and that will occur in the future, scare us away from advancing this technology. When it is fully adopted, people in general WILL be safer. You will hear of fewer friends and relatives dying or being badly injured in wrecks.

But we need time and support to get there. Once we iterate enough and improve the tech to overcome the human problem with self-driving cars, “accident” won’t be one of the first words everyone thinks of when they hear “car.”

Many engineers and VCs want to speed through R&D as quickly as possible and get a product to market, but when it comes to autonomous cars and AI projects in general, it is definitely safer to proceed with caution. Rather than see the road ahead as endless intersections of “green lights”, we should treat every stretch of road as if it culminates in a yellow light. When it comes to human lives, that’s the least we can do.

Thank you for reading and sharing!
