Lessons Learned from the Fatal Uber AV Accident

Michael E. McGrath

Author of Autonomous Vehicles: Opportunities, Strategies, and Disruptions

The tragic death of a pedestrian hit by an Uber autonomous vehicle (AV) on March 18th provides some important lessons.

Elaine Herzberg, 49, was walking her bicycle far from the crosswalk on a four-lane road in the Phoenix suburb of Tempe about 10 PM on Sunday March 18, 2018 when she was struck by an Uber autonomous vehicle traveling at about 38 miles per hour, police said. The Uber Volvo XC90 SUV test vehicle was in autonomous mode with an operator behind the wheel. 

“The pedestrian was outside of the crosswalk. As soon as she walked into the lane of traffic she was struck,” Tempe Police Sergeant Ronald Elcock told reporters at a news conference. The video footage shows that the Uber vehicle was traveling in the rightmost of two lanes. Herzberg was moving from left to right, crossing from the center median to the curb on the far right.

There are many lessons to be learned from this tragic accident, some obvious and some more subtle. 

1. A human driver would not have prevented the accident.

This was not an example demonstrating that an autonomous vehicle (AV) is less safe than a human driver. "It's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway," stated the local police chief. Based on the AV's onboard video recorder, the pedestrian appeared suddenly from the dark and was visible for only 1 second before impact, and she didn't appear to notice the car even though the headlights were shining directly on her. It takes a human about 2 seconds to perceive and react to a situation like this.
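To make the timing concrete, here is a back-of-the-envelope sketch in Python (the 2-second figure is the typical perception-reaction estimate cited above, not a measured value from this case):

```python
# Rough timeline arithmetic for the accident described above.
speed_mph = 38
speed_fps = speed_mph * 5280 / 3600       # ~55.7 feet per second

visible_time_s = 1.0                      # pedestrian visible on video for 1 second
human_reaction_s = 2.0                    # typical human perception-reaction time

print(speed_fps * visible_time_s)         # ~56 ft traveled while she was visible
print(speed_fps * human_reaction_s)       # ~111 ft needed before a human even brakes
```

In other words, the car covered about 56 feet while she was visible, but a human driver would have needed about 111 feet of travel just to begin braking.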

So, the first lesson is that this isn't a case where an AV failed to perform as well and as safely as a human driver. The baseline standard is that an AV must perform at least as well as a human driver, but we expect more from AVs.

2. We expect more from AVs than human drivers.

This is potentially a case where an AV should have outperformed a human driver. There was no other traffic around, and nothing obstructed the video camera or the vehicle's other sensors from detecting Herzberg. By the time she came into view of the camera, she had crossed the left lane and was halfway across the right lane, having moved at least 20 feet from the median. She was also walking her bicycle across the road, not dashing out in front of traffic. That means she was out in clear space on the road for at least a couple of seconds. The Volvo XC90s that Uber is using for development are equipped with cameras, radar, and lidar sensors. Radar and lidar are capable of "seeing" their surroundings in complete darkness and should have detected Herzberg's presence in the roadway. The Volvo was traveling at 38 mph, a speed from which it could stop in roughly 60-70 feet, or steer around Herzberg to the left without hitting her.
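A quick physics check supports the 60-70 foot figure. This sketch assumes roughly 0.8 g of deceleration on dry pavement, a typical assumption rather than the Volvo's measured performance:

```python
# Braking distance from 38 mph: d = v^2 / (2 * a)
g = 32.2                         # gravitational acceleration, ft/s^2
speed_fps = 38 * 5280 / 3600     # ~55.7 ft/s
decel = 0.8 * g                  # ~25.8 ft/s^2, a typical dry-pavement stop

print(speed_fps**2 / (2 * decel))    # ~60 ft, excluding any detection delay
```

That 60 feet excludes detection and actuation delay, which is exactly why early detection by radar or lidar matters so much.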

The reason why the Uber AV failed to detect her in time is still undetermined. It could have been a blind spot in its lidar sensors, a failure to characterize the sensor data properly, or possibly something else. Eventually the data logs will most likely provide an answer. In any case, the findings will be used to prevent similar accidents in the future.

3. Unlike human drivers, AVs will learn from this accident and avoid similar ones in the future.

While a human driver involved in this type of accident may learn from it personally, the hundreds of millions of other drivers won't. This is a big difference between human drivers and AVs. Every AV will learn from this experience and be more cautious in the same conditions. This is because the software, which is the brains of the AV, will store this case in its memory to be called on in similar circumstances. The sensors on AVs will be validated to detect and properly characterize similar objects.

This ability to distribute learning will make AVs increasingly safer over time. Within a few days of the accident, I expect that all companies developing AVs verified this scenario and made sure their vehicles would have detected the pedestrian and taken corrective action. It is unlikely that a similar accident will occur with any AV in the future.
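One way to picture this fleet-wide learning is as a shared regression suite that every new software build must pass. The sketch below is purely illustrative; the class, its fields, and the classify interface are hypothetical, not any developer's actual format:

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """A hypothetical test case distilled from a real-world incident."""
    description: str
    ambient_light_lux: float      # near-total darkness is under ~1 lux
    expected_class: str           # what the perception stack must report
    crossing_speed_mps: float     # lateral speed of the crossing object

# Once one vehicle encounters the case, every build is tested against it.
scenario_library = [
    Scenario("Pedestrian walking bicycle across unlit road, mid-block",
             ambient_light_lux=0.5,
             expected_class="pedestrian_with_bicycle",
             crossing_speed_mps=1.4),
]

def validate(perception_stack):
    """Fail the release if any library scenario is misclassified."""
    for s in scenario_library:
        assert perception_stack.classify(s) == s.expected_class, s.description
```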

4. Humans are unpredictable and that behavior needs to be included in an AV’s software.

The victim in this case appeared to walk directly in front of the vehicle without noticing it. She wasn't in a crosswalk, and she directly put herself in jeopardy. About 6,000 pedestrians were killed in the United States in 2016, and most of these were not using crosswalks. AVs need to anticipate unpredictable human behavior and build models of that behavior into their artificial intelligence.
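One way developers can do this is by injecting randomized rule-breaking pedestrians into simulation. The sketch below is a hypothetical illustration; real simulators use far richer behavior models:

```python
import random

def random_pedestrian_event(rng=random):
    """Generate a randomized jaywalking event for simulation testing."""
    return {
        "offset_from_crosswalk_ft": rng.uniform(50, 300),
        "crossing_speed_mps": rng.uniform(0.8, 2.5),   # slow stroll to jog
        "pauses_mid_road": rng.random() < 0.3,         # some hesitate in a lane
        "looks_at_traffic": rng.random() < 0.5,        # some never look
    }

# Run the AV software against thousands of these in simulation.
for _ in range(3):
    print(random_pedestrian_event())
```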

5. Video sensors alone may not be able to get the results we expect from AVs.

The video from the accident showed that the vehicle did not have time to react once the video camera saw the woman. Humans rely primarily on what they see, but AVs have other sensors. While AVs will be better than humans at reacting to what they see with video, especially when the human driver is distracted or impaired, this case shows the importance of AVs being able to detect more than what they can see.

There is some disagreement on the ability of different types of video to identify the pedestrian earlier, but it certainly raises the issue that video alone may not be sufficient. Most AVs use multiple sensors and fuse together what they all detect, but some, especially Tesla, rely primarily on video sensing – seeing what the human eye sees.
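A simple way to see why fusion helps in the dark is to treat each sensor's detection confidence as roughly independent evidence. The function below is an illustrative sketch, with made-up confidence numbers, not any developer's actual algorithm:

```python
def fuse_detections(camera_conf, radar_conf, lidar_conf):
    """Combine per-sensor confidences that an object exists.

    Treats each sensor's chance of missing the object as independent:
    P(object detected) = 1 - product of (1 - confidence_i).
    """
    miss = (1 - camera_conf) * (1 - radar_conf) * (1 - lidar_conf)
    return 1 - miss

# At night the camera contributes almost nothing, but radar and lidar
# do not depend on ambient light:
print(fuse_detections(camera_conf=0.05, radar_conf=0.7, lidar_conf=0.9))  # ~0.97
```

Even with the camera nearly blind, the fused confidence is high enough to detect the pedestrian and brake; a video-only system has no such fallback.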

6. AV developers may want to consider adding thermal sensors to the AV sensor package to distinguish humans and animals from other similar objects.

An AV cannot react to all objects in motion. Slamming on the brakes when a bag is blown across the road can cause an unnecessary accident. But if the object is a human or an animal, braking should be initiated. The difference is the thermal signature of the object: humans and animals on or near the road give off far more heat than their surroundings.

Thermal sensors for autonomous vehicles can provide additional data that could be used to identify an object as human or animal.

Thermal imaging is particularly important when used as night vision. A thermal imaging sensor can see further ahead and provide more time to react than video. It can also distinguish a person or animal from other moving objects at the side of the road. In this case, a thermal imaging camera would have identified the woman as a human moving across the road in the dark.
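The decision logic could be as simple as comparing an object's apparent temperature to the ambient background. The threshold and temperatures below are illustrative assumptions only:

```python
def is_living_obstacle(object_temp_c, ambient_temp_c, threshold_c=8.0):
    """Flag an object whose heat signature stands well above ambient.

    Humans and animals radiate near body temperature, while a
    wind-blown bag stays close to the ambient temperature.
    """
    return (object_temp_c - ambient_temp_c) >= threshold_c

print(is_living_obstacle(object_temp_c=33.0, ambient_temp_c=15.0))  # True: brake
print(is_living_obstacle(object_temp_c=16.0, ambient_temp_c=15.0))  # False: likely debris
```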

Currently, most AVs don't have thermal sensing cameras, but perhaps it's time to add them to the sensor package. This will require substantial reprogramming of the sensor fusion software, but learning from this experience may justify it.

7. The failures of AVs will be publicized, but the successes won’t be noticed.

A headline that you won't see is: "Autonomous vehicle avoids hitting pedestrian who would have been killed by a human driver." Successes aren't anything more than entries in a database for the artificial intelligence guiding AVs. They don't make news stories. Had the vehicle performed as intended, this tragic accident would simply have been another database entry: "detected and avoided hitting pedestrian crossing road with bike."

So, we need to expect that the incident reports in the press will be overwhelmingly negative, since only the negative cases are newsworthy. Skewed news coverage like this may cause people to be more cautious and may incite regulation of AVs, but in the end, it won't stop the advances, because the unknown successes will surely outweigh the failures or limitations.

8. There may be differences in accident avoidance among AV developers.

This may be a harsh reality: some AVs may be better and safer than others because of the way they are designed and developed. Company A's AVs may be safer than Company B's. We already saw a glimpse of this when Waymo claimed that its AVs would have spotted and tried to avoid the pedestrian in the Uber accident. That may or may not be the case, but there will be differences among AVs by developer.

There are some reports that Uber reduced the number of sensors on its AVs when it went from the earlier Ford Fusion test vehicles, with seven lidars, seven radars, and 20 cameras, to the newest Volvo test vehicles, with one lidar, 10 radars, and seven cameras. Uber may have reduced the sensor count because it thought it had originally overdone the number of sensors, but it's possible that the reduction created blind spots in the sensor coverage. For comparison, Waymo's AVs currently use six lidar sensors and GM's use five.

How will people react to this? Will some sort of safety record shape the way people buy AVs or select an autonomous ride service for a trip? Will Uber be forced to withdraw or significantly delay its entry into autonomous ride services? Perhaps something like verified sensor coverage would be an appropriate subject for government regulation.

9. AV accidents, unlike human accidents, will be factually documented.

The facts of the Uber accident illustrate a marked difference in the way accident facts are reported. With the onboard computer recording all the sensor and control readings, the facts are known exactly. The vehicle was traveling 38 mph in a 35 mph zone. There was a complete video recording of the pedestrian entering the road from the moment she was visible. The time from her becoming visible to the moment of impact was 1 second. With AVs, there won't be any discrepancies between driver and witness accounts. The exact facts will be known.
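The sketch below shows the kind of time-stamped record such an onboard computer might append many times per second; the field names are hypothetical, not Uber's actual log format:

```python
import json
import time

record = {
    "timestamp_utc": time.time(),
    "speed_mph": 38.0,
    "speed_limit_mph": 35.0,
    "autonomous_mode": True,
    "brake_command": 0.0,        # no braking input at this instant
    "steering_angle_deg": 0.0,
    "detected_objects": [],      # per-sensor object list at this tick
}

# Appended to an append-only log that investigators can replay exactly.
with open("drive_log.jsonl", "a") as log:
    log.write(json.dumps(record) + "\n")
```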

10. A backup human driver can’t react quickly enough in these situations. 

The backup driver in this case appeared to be distracted, but regardless, it takes about 2 seconds for a human to react to this type of situation, and there was only 1 second between the victim becoming visible and impact. SAE Level 3, Conditional Automation, sets the expectation that a driver should be able to respond appropriately to a request to intervene. This example illustrates that a request to intervene is not practical in situations like this, where an instant response is required. It is only practical for situations such as encountering an unusual traffic situation or a road blockage. This is the reason why some AV developers are skipping SAE Level 3.

11. It may be time to increase the focus on getting pedestrians to use crosswalks.

About 6,000 pedestrians were killed in the United States in 2016, an increase of 27% from 2007. Elaine Herzberg was struck and killed by Uber's autonomous vehicle when she attempted to walk her bike across an eight-lane street, crossing about 100 yards from the crosswalk. Herzberg was walking her bike from the center median across two vehicular lanes when she was struck by the vehicle. The large median at the site of the crash has signs warning people not to cross mid-block and to use the crosswalk to the north at Curry Road instead. But the median also has a brick pathway cutting through the desert landscaping that accommodates people who do cross at that site.

Arizona has the highest rate of pedestrian deaths in the nation. Ten pedestrians were killed in the state in the previous week. Experts have long attributed the state’s high rate of pedestrian deaths to exceptionally wide streets that are engineered to move cars fast and do not provide adequate safety infrastructure for people who are on foot or bike.

This accident focuses attention not just on AVs but also on the general safety of pedestrians. In the future, AVs may be better than human-driven vehicles at avoiding pedestrians, but it is still a better safety measure to discourage pedestrians from crossing roads outside of crosswalks.

Learning from experiences, especially tragic ones like this, is critical to the advancement of new technology. Autonomous vehicles will profoundly change transportation as we know it, and in the process significantly improve lifestyles and create major new industries. However, it's important that they be implemented carefully.