Ford Inside News

Uber autonomous vehicle kills a pedestrian

8K views 32 replies 11 participants last post by  2b2 
#1 ·
This was going to happen, sooner or later. An autonomous vehicle (a Volvo) has accidentally killed a bicyclist in Tempe, Arizona.

Uber has suspended its autonomous vehicle testing.

IMO, autonomous vehicles are not compatible with pedestrians or vehicles driven by humans. It is necessary to create circuits or areas exclusively for autonomous vehicles.

Here's the link: https://www.autoblog.com/2018/03/19/uber-autonomous-volvo-fatality/
 
#3 ·
The Uber car had a human "driver" and he was unable to react in time. The woman was not in a crosswalk and it was late at night so not only dark, but there is a chance she had been drinking. The fact that the pedestrian always has the right of way does not mean they don't have a responsibility to follow the law (cross in a crosswalk which will likely be better lit) and ensure the road is clear before entering the street.

People are stupid. Cars programmed by people will be relatively stupid too.
 
#8 ·
At least three times a week I have deer dart out in front of me. They never use the crosswalks (though I doubt they have been drinking). The thing is, I expect them and am ready to avoid them, which I have to do at 40 mph. If a car is coming in the opposite direction, I slow down and pay even more attention, knowing I have fewer options if one does dart out. A human driver in an automated car is likely to get lulled into a false sense of security and not pay attention.
An alert driver may have had a better chance of avoiding her. I hear she may have been pushing a bicycle; if so, the reflectors on the bike would have been an easy flag for a human driver.
 
#10 ·
(a) I have some difficulty with the proposition that a woman walking a bicycle (reportedly from a median) "darted" into traffic.

(b) one report indicated 40 mph in a 35, and no indication that the brakes were ever applied. If that's the final assessment, we are looking at both a self-driving software/hardware failure and an inattentive human behind the wheel. I'll willingly admit that if I had to sit behind the wheel of a self-driving vehicle for hours on end "just in case," I'd likely wind up less attentive than if I were driving.
 
#11 ·
... here in Madrid, even the fully automatic subway trains, which run on a closed circuit with no pedestrians or bicyclists, have a driver. Why? To give the traveling public confidence in the subway, and for cases of train failure...

The autonomous car is still far in the future.
 
#13 ·
The next 'game' for city kids will be seeing how close you can let an autonomous car get before jumping out in front and causing it to panic brake... and then squeegeeing the windows for 'tips'.
 
#14 ·
Wow, after watching the video, there is no way anyone could say she 'darted' out into traffic. She was slowly pushing the bike and should have been easily picked up by the radar. I believe the autonomous car classified her as a small vehicle in the next lane and did not even 'think' of another scenario.
 
#15 ·
#16 ·
Fails all around. Unfortunately, many will blame the ride-along safety driver based on their history, yet the very nature of the job is to induce incredible boredom.

The real concern is that even when the victim and bicycle were directly ahead, the car didn't brake. Software is supposed to have a faster reaction time than people. One article I read suggested that perhaps Uber had disabled the Volvo's City Safety. I'm sure Volvo really wants to know if that was the case.
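To put that reaction-time point in perspective, here's a back-of-envelope sketch (my own illustrative numbers, not from the investigation): assume a human needs roughly 1.5 seconds of perception-reaction time before braking, a computer perhaps 0.1 seconds, and hard braking on dry pavement decelerates the car at about 7 m/s².

```python
def stopping_distance_m(speed_mph, reaction_s, decel=7.0):
    """Distance covered during the reaction delay plus braking to a full stop.

    decel ~7 m/s^2 is a typical dry-road hard-braking figure (assumed).
    """
    v = speed_mph * 0.44704              # mph -> m/s
    return reaction_s * v + v * v / (2 * decel)

human = stopping_distance_m(40, reaction_s=1.5)      # ~49.7 m
computer = stopping_distance_m(40, reaction_s=0.1)   # ~24.6 m
print(f"human: {human:.1f} m, computer: {computer:.1f} m")
```

Under those assumptions the software's head start roughly halves the total stopping distance at 40 mph, which is exactly why the failure to brake at all is the real concern.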
 
#17 ·
This is a pretty clear-cut scenario where the vehicle's technology should have seen the obstacle in the dark. The 'driver' should have been more attentive and wasn't doing their job correctly, but that is also why the technology is there in the first place, and it's hard to know how quickly they could have responded once they saw the obstacle. It's hard to react quickly when you're not always in control of the vehicle; reaction times are longer, and REAL car companies already know this.
 
#18 ·
swiping:
Volvo Cars CEO Håkan Samuelsson made two promises for its autonomous vehicle program in October 2015. They were in the context of federal guidelines for autonomous driving in the U.S.
1.) Volvo will accept full liability whenever one of its cars is in autonomous mode.
2.) Volvo regards the hacking of a car as a criminal offense.
U.S urged to establish nationwide federal guidelines
for autonomous driving -
Media.Volvocars.com


These guidelines don't apply to Uber's XC90 involved in the Tempe, Arizona incident. The autonomous vehicle technology for that particular car wasn't developed by Volvo Cars.​
 
#20 ·
Uber disabled self-driving Volvo's safety system before U.S. fatality, supplier says
Europe Automotive News
- Gabrielle Coppola, Ian King, Bloomberg - March 27, 2018


NEW YORK/SAN FRANCISCO -- Uber disabled the standard collision avoidance technology in the Volvo XC90 SUV that struck and killed a woman in the U.S. last week, according to Aptiv, the partsmaker that supplied the vehicle's radar and camera.

"We don't want people to be confused or think it was a failure of the technology that we supply for Volvo, because that's not the case," Zach Peterson, a spokesman for Aptiv, said. The Volvo XC90's standard advanced driver-assistance system "has nothing to do" with the Uber test vehicle's autonomous driving system, he said.

Aptiv is speaking up for its technology to avoid being tainted by the fatality involving Uber, which may have been following standard practice by disabling other tech as it develops and tests its own autonomous driving system. The supplier was spun off from Delphi last year as the parent company renamed its automated driving business Aptiv.

Uber's system failed to slow the vehicle as 49-year-old victim Elaine Herzberg crossed the street pushing a bicycle. Police in Tempe, Arizona, and the National Transportation Safety Board are investigating the incident.

Uber declined to comment. A Volvo spokesman said the company can't speculate on the cause of the incident and is awaiting a full investigation report.

Intel's Mobileye, which makes chips and sensors used in collision-avoidance systems and is a supplier to Aptiv, said Monday that it tested its own software after the crash by playing a video of the Uber incident on a television monitor.

Mobileye said it was able to detect Herzberg one second before impact in its internal tests, despite the poor second-hand quality of the video relative to a direct connection to cameras equipped to the car.

'Challenging task'
"The video released by the police seems to demonstrate that even the most basic building block of an autonomous vehicle system, the ability to detect and classify objects, is a challenging task," Mobileye CEO Amnon Shashua wrote on Intel's website. "It is this same technology that is required, before tackling even tougher challenges, as a foundational element of fully autonomous vehicles of the future."

Aptiv's radar and camera system using Mobileye chips and sensors helps power the Volvo XC90's driver-assistance system, which provides collision avoidance, lane-keeping aid and other safety features.

In November, Uber agreed to buy 24,000 Volvo SUVs onto which it planned to install its own sensors and software to permit pilot-less driving.

- - - - - - -

Arizona governor suspends Uber's ability to test self-driving cars
businessinsider-Reuters
- David Schwartz


TEMPE, Ariz. (Reuters) - The governor of Arizona on Monday suspended Uber's ability to test self-driving cars on public roads in the state following a fatal crash last week that killed a 49-year-old woman pedestrian.

In a letter sent to Uber Chief Executive Dara Khosrowshahi and shared with Reuters, Governor Doug Ducey said he found a video released by police of the crash "disturbing and alarming, and it raises many questions about the ability of Uber to continue testing in Arizona."

With its wide and open roads, good weather and a light regulatory touch, Arizona has been prime testing ground for Uber Technologies Inc and other autonomous vehicle developers. Ducey in 2016 welcomed Uber to his state with celebration, saying at the time "We want you to know Arizona does want Uber."

 
#22 ·
Uber reaches settlement with family of autonomous vehicle victim
Reuters
, Bernie Woodall, MARCH 28, 2018 / UPDATED


TEMPE, Ariz. - The family of a woman killed by an Uber Technologies Inc [UBER.UL] self-driving vehicle in Arizona has reached a settlement with the ride services company, ending a potential legal battle over the first fatality caused by an autonomous vehicle.

Cristina Perez Hesano, attorney with the firm of Bellah Perez in Glendale, Arizona, said “the matter has been resolved” between Uber and the daughter and husband of Elaine Herzberg, 49, who died after being hit by an Uber self-driving SUV in the Phoenix suburb of Tempe earlier this month.

The terms of the settlement were not given. The law firm representing Herzberg’s daughter and husband, whose names were not disclosed, said they would have no further comment on the matter as they considered it resolved.

An Uber spokeswoman declined to comment.

The fallout from the accident could stall the development and testing of self-driving vehicles, designed to eventually perform far better than human drivers and to sharply reduce the number of motor vehicle fatalities that occur each year.

Uber has suspended its testing in the wake of the incident. Toyota Motor Corp (7203.T) and chipmaker Nvidia Corp (NVDA.O) have also suspended self-driving car testing on public roads, as they and other companies await the results of an investigation into the Tempe incident, believed to be the first death of a pedestrian struck by a self-driving vehicle.

Uber does not use the self-driving platform architecture of Nvidia, the chipmaker’s Chief Executive Jensen Huang said on Wednesday.

The March 18 fatality near downtown Tempe also presents an unprecedented liability challenge because self-driving vehicles, which are still in the development stage, involve a complex system of hardware and software often made by outside suppliers.

Herzberg was walking her bicycle outside the crosswalk on a four-lane road when she was struck. Video footage from a dash-mounted camera inside the vehicle, released by Tempe police, showed the SUV traveling along a dark street when the headlights suddenly illuminated Herzberg in front of the SUV.

Other footage showed that in the seconds before the accident, the human driver who was behind the wheel was mostly looking down and not at the road.
 
#23 ·
It looks like the family took the payday, and that was their best option, because the woman was crossing a four-lane roadway illegally, in the middle of the night, with poor lighting, wearing dark clothing, without looking to see if traffic was coming. There was less than 1.5 seconds between visual perception and impact: no time for a human to even register the situation, let alone hit the brakes and for the brake system to respond before deceleration could begin. It also looks like the Uber driver looked up before the woman was visible, but there was no time to react. I think Uber decided that instead of spending the dollars defending their legal position, with the added negative impact on the family of proving the woman's fault, a quick settlement was preferable, and the family agreed.
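A rough calculation shows why that window was unsurvivable for a human driver (the speed and timing figures below are approximations from press reports, not official investigation numbers):

```python
# Assumed figures: vehicle speed ~38 mph, ~1.4 s between the victim
# becoming visible in the dash-cam video and impact.
speed_mph = 38
visible_s = 1.4

v = speed_mph * 0.44704            # mph -> m/s, about 17 m/s
distance_covered = v * visible_s   # ground covered before impact
print(f"{distance_covered:.1f} m covered in {visible_s} s")  # ~23.8 m

# A typical human perception-reaction time is about 1.5 s, so by the
# time an attentive driver could even begin braking, the car would
# already have reached the point of impact.
```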

Aside from that, the question is: why didn't the radar or lidar see the woman through the darkness and offer better vision than a human?
 
#27 ·
take-away point: passengers get the TICKET!

Motorcycle cop tickets a self-driving car in San Francisco
Yahoo/Autoblog
- Sven Gustafson - March 30, 2018
WITH VIDEO


Now here's a genuine novelty: In San Francisco, a motorcycle cop pulled over an autonomous vehicle and issued it a ticket. The future has arrived.

But the reason — police said it failed to yield to a pedestrian at a crosswalk — probably shouldn't be taken lightly, coming a day after a self-driving car operated by Uber Technologies Inc. struck and killed a woman walking her bicycle across the road March 18 in Tempe, Ariz. Cruise Automation, the operator of the ticketed self-driving car, says the vehicle did nothing wrong. The story was first reported by CBS affiliate KPIX-TV.

Cruise tells the station that its onboard data shows the pedestrian was 10.8 feet away from the car when it began driving in autonomous mode down Harrison Street at 14th Street. The officer pulled the car over shortly after it began accelerating and ticketed the human test driver.

"Safety is our priority in testing our self-driving vehicles," Cruise said in a statement. "California law requires the vehicle to yield the right of way to pedestrians, allowing them to proceed undisturbed and unhurried without fear of interference of their safe passage through an intersection. Our data indicates that's what happened here." It tells the station the human test driver did everything right but is responsible for the citation.

General Motors purchased San Francisco-based Cruise in 2016 to boost its efforts to develop self-driving vehicles. GM is seeking federal approval for a fully autonomous car that lacks a steering wheel, brake pedal or accelerator pedal to join its first commercial ride-sharing fleet in 2019, and it recently announced plans to build the car, which is based on a Chevrolet Bolt electric vehicle, at a plant near Detroit.

Meanwhile, investigators and autonomous-vehicle equipment suppliers are still trying to figure out what went wrong in the self-driving Uber fatal crash. Uber and the family of Elaine Herzberg, the woman killed in the accident, have reached a settlement in the case.

The latest incident won't help convince a wary public about the safety of our increasingly inevitable self-driving future. It also makes us wonder: What happens when an officer tries to pull over a completely self-driving car that doesn't have a passenger inside of it? On that front, time will certainly tell.
 
#28 ·
"onboard data shows the pedestrian was 10.8 feet away from the car when it began driving"

This sounds like a judgement call the officer made in issuing the citation, while the vehicle had actual data to back up its right of way. Similar to the Uber accident, where the woman was illegally jaywalking at night, wearing dark clothing, and never looked before crossing the road. Then the Tesla accident using Autopilot, where the car gave numerous visual and auditory warnings but the driver was apparently asleep at the wheel, along with a highway safety barrier being missing/damaged from a previous accident at the same spot.

In each of these incidents there was human error, but the focus will be on autonomous cars, since that attracts more interest from the public and more revenue for the media/news outlets, while at the same time dramatically exaggerating the incident rate of autonomous vehicles.
 
#29 ·
Similar to the Uber accident, where the woman was illegally jaywalking at night, wearing dark clothing, and never looked before crossing the road. Then the Tesla accident using Autopilot, where the car gave numerous visual and auditory warnings but the driver was apparently asleep at the wheel, along with a highway safety barrier being missing/damaged from a previous accident at the same spot.

I thought the whole idea of the braking sensors was to brake.


Why are you constantly defending and trivializing events like this?
 