
Uber goes Death Race 3000

seminole97

Veteran Seminole Insider
Jun 14, 2005
6,554
2,596
853
“Tempe, Arizona police report that a self-driving Uber vehicle was in autonomous mode when it was involved in a deadly crash overnight.

As ABC15 reports, the crash occurred near Mill Avenue and Curry Road early Monday morning.

The Uber vehicle was reportedly headed northbound when a woman walking outside of the crosswalk was struck.

The woman was taken to the hospital where she died from her injuries.”

Our robot overlords have deemed jaywalking a capital offense...
 
Jaywalking becomes a capital offense for a lot of people, whether the vehicle is driven by a human or not. A lot of people like to play Frogger and lose.
 
That theory that some people have about pedestrians always having the right of way, and that drivers will see them and stop, keeps getting proven inaccurate, and now we have cars that have no one driving?!? Do you think this will cause people to be more careful? Use the sidewalk and crosswalk?
 
> That theory that some people have about pedestrians always having the right of way, and that drivers will see them and stop, keeps getting proven inaccurate, and now we have cars that have no one driving?!? Do you think this will cause people to be more careful? Use the sidewalk and crosswalk?

If I understand the theory, in time, only the careful will remain...

This story is really just interesting to me because we had about 6,000 pedestrians waxed last year, so I find the response (FBI guy from Die Hard: "Shut it down!") the newsworthy thing.
The longer we delay the rollout of autonomous cars the more deaths we'll have on the roads.
 
Makes you wonder what the autonomous brain thinks is obstructing the road.
 
> If I understand the theory, in time, only the careful will remain...
>
> This story is really just interesting to me because we had about 6,000 pedestrians waxed last year, so I find the response (FBI guy from Die Hard: "Shut it down!") the newsworthy thing.
> The longer we delay the rollout of autonomous cars, the more deaths we'll have on the roads.
Yeah, the big question is: would this likely have happened with a human driver behind the wheel? My guess is yes. Any time there is a human element involved (in this case, the pedestrian) something can go wrong.

In most states, pedestrians in the crosswalk do have the right of way. That said, white striped lines never stopped a car; people should always be wary and use good judgment, especially when we design neighborhood streets for high-speed travel.
 
In Florida, there is actually a set of statutes about being a pedestrian: Chapter 316, Section 130. Main gist: if there is a sidewalk, you must use it. Cross only at a crosswalk when allowed. Don't walk in the road.
 
I read an interesting question about self-driving cars. Where do you set the parameter at which the car will wreck itself, and risk its passengers' lives, to avoid hitting/killing a pedestrian? What if there are multiple passengers at a higher speed?
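That "parameter" question can be made concrete with a toy sketch. Everything here is hypothetical: no production autonomous-driving system exposes a knob like this, and the threshold value and function names are invented purely to illustrate the dilemma the question raises.

```python
# Toy sketch of the "parameter" in question (entirely hypothetical; no
# real autonomous-driving stack exposes a knob like this). The car
# swerves to protect the pedestrian only while the estimated risk to
# its own occupants stays under a configured threshold.

OCCUPANT_RISK_THRESHOLD = 0.11  # hypothetical cutoff for acceptable occupant risk

def should_swerve(pedestrian_hit_prob, occupant_injury_prob, n_passengers=1):
    """Swerve only if occupant risk (scaled by passenger count) is tolerable."""
    scaled_risk = occupant_injury_prob * n_passengers
    return pedestrian_hit_prob > 0 and scaled_risk <= OCCUPANT_RISK_THRESHOLD

print(should_swerve(0.9, 0.05))     # True: occupant risk is low, so swerve
print(should_swerve(0.9, 0.05, 4))  # False: four passengers push the risk too high
```

The point of the sketch is that someone has to pick the threshold and the scaling rule, which is exactly the question being asked.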
 
> I read an interesting question about self-driving cars. Where do you set the parameter at which the car will wreck itself, and risk its passengers' lives, to avoid hitting/killing a pedestrian? What if there are multiple passengers at a higher speed?

11% is more than enough. Save the girl. A human being would have known that.
 
It's finally happening!

[image: MaximumOverdrive.png]
 
> 11% is more than enough. Save the girl. A human being would have known that.
That's the rub. Would you want the car you are riding in to wreck, and potentially cause you harm, to save the life of someone who didn't take responsibility for their own welfare? How about if your child was in the car with you? Would you risk the life of your child to save another?
 
> .......
> In most states, pedestrians in the crosswalk do have the right of way. That said, white striped lines never stopped a car; people should always be wary and use good judgment, especially when we design neighborhood streets for high-speed travel.


That's called being dead right.

The lack of awareness by most pedestrians, who should be motivated to be safe, is the biggest issue. I've seen people pushing strollers across highways; common sense ain't so common. And many drivers are equally distracted. Driving should be easy, but it really is pretty hard the way we do it.
 
> That's the rub. Would you want the car you are riding in to wreck, and potentially cause you harm, to save the life of someone who didn't take responsibility for their own welfare? How about if your child was in the car with you? Would you risk the life of your child to save another?

It's a quote from I, Robot; they discussed this very topic. I am not fond of the idea of automated cars anyway, so I'm the wrong person to ask. :) I'd rather make my emotional decisions based on self-interest. Me and mine above all else.
 
> It's a quote from I, Robot; they discussed this very topic. I am not fond of the idea of automated cars anyway, so I'm the wrong person to ask. :) I'd rather make my emotional decisions based on self-interest. Me and mine above all else.
It's funny that the quote involved 11%. That's almost exactly the share of automobile accidents that are not attributable to human error (10-13%).
 
> That's the rub. Would you want the car you are riding in to wreck, and potentially cause you harm, to save the life of someone who didn't take responsibility for their own welfare? How about if your child was in the car with you? Would you risk the life of your child to save another?
It's the trolley problem - the reason why a lot of people will have a problem switching to autonomous cars. The car has to be programmed by someone to make choices. Do you want your car intentionally crashing - endangering you - in order to not hit someone else?

"There is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person tied up on the side track. You have two options:
  1. Do nothing, and the trolley kills the five people on the main track.
  2. Pull the lever, diverting the trolley onto the side track where it will kill one person."
Which is the most ethical choice?

For your judging pleasure, determine what the autonomous car should do:

http://moralmachine.mit.edu/
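The two options above reduce to a bare body-count comparison, which is the purely utilitarian reading of the dilemma. This is an illustrative sketch only; real vehicle software does not encode choices this way, and the function is invented for this post.

```python
# The trolley dilemma as a bare body-count comparison (purely
# illustrative; real autonomous-vehicle planners do not encode
# choices like this).

def trolley_choice(deaths_if_nothing, deaths_if_switch):
    """Pick the action that minimizes deaths; ties favor doing nothing."""
    if deaths_if_switch < deaths_if_nothing:
        return "pull the lever"
    return "do nothing"

print(trolley_choice(5, 1))  # pull the lever
print(trolley_choice(1, 1))  # do nothing
```

The controversy, of course, is whether minimizing deaths is the right rule at all once the "one person" might be you or your passengers.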
 
People don't act recklessly around railroad tracks because they know the train won't stop for them. The same will become true for cars. It will become known that cars are not programmed to accommodate the mistakes of pedestrians (or other drivers) when faced with the decision of whether to wreck themselves and/or break traffic laws, causing additional injury, vs. avoiding harm to someone who's breaking the rules.

The expectation now is if I jaywalk, the driver will slow down. Maybe they'll honk or flip me off, but I most likely won't die. That gives us a cavalier attitude towards cars. Once folks understand the rules, they'll likely act within them ... or get run over texting someone.
 
> It's the trolley problem - the reason why a lot of people will have a problem switching to autonomous cars. The car has to be programmed by someone to make choices. Do you want your car intentionally crashing - endangering you - in order to not hit someone else?
>
> "There is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person tied up on the side track. You have two options:
>   1. Do nothing, and the trolley kills the five people on the main track.
>   2. Pull the lever, diverting the trolley onto the side track where it will kill one person."
> Which is the most ethical choice?
>
> For your judging pleasure, determine what the autonomous car should do:
>
> http://moralmachine.mit.edu/

I got to #5 before they had the option of 5 fit people exercising vs. 5 overweight people... Made me think about the other thread.

> People don't act recklessly around railroad tracks because they know the train won't stop for them. The same will become true for cars. It will become known that cars are not programmed to accommodate the mistakes of pedestrians (or other drivers) when faced with the decision of whether to wreck themselves and/or break traffic laws, causing additional injury, vs. avoiding harm to someone who's breaking the rules...

Ah, I don't think that is true at all.
https://www.defensivedriving.com/safe-driver-resources/how-to-avoid-being-hit-by-a-train/
 
> That theory that some people have about pedestrians always having the right of way, and that drivers will see them and stop, keeps getting proven inaccurate, and now we have cars that have no one driving?!? Do you think this will cause people to be more careful? Use the sidewalk and crosswalk?

Watching students on campus cross streets without even looking both ways, heads down reading their cellphones, is interesting.
 
Without knowing the details of this crash, it's hard to even guess how it happened. How far ahead was the pedestrian? Did they literally step right out in front of the car, so that no one would have had a chance to stop, or did they step out, at fault, but in such a manner that a human driver could reasonably have noticed the issue and stopped?

That's the question. Did the autonomous vehicle fail, and if so, why? Or did the human literally step right in front of the car so that no one could have avoided it?
 
> People don't act recklessly around railroad tracks because they know the train won't stop for them.

It's a lot better than it used to be, but there are still 200-300 people killed a year by trains at crossings.
From what I've read, most of the difference has come from getting barriers at more crossings. So it's been the application of more tech that's really driven the improvement.

> Did they literally step right out in front of the car, so that no one would have had a chance to stop, or did they step out, at fault, but in such a manner that a human driver could reasonably have noticed the issue and stopped?
>
> That's the question. Did the autonomous vehicle fail, and if so, why? Or did the human literally step right in front of the car so that no one could have avoided it?

Yep, that’s what happened. Bag lady stepped off median into path of car at night.
Splat.
 
> It's the trolley problem - the reason why a lot of people will have a problem switching to autonomous cars. The car has to be programmed by someone to make choices. Do you want your car intentionally crashing - endangering you - in order to not hit someone else?
>
> "There is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person tied up on the side track. You have two options:
>   1. Do nothing, and the trolley kills the five people on the main track.
>   2. Pull the lever, diverting the trolley onto the side track where it will kill one person."
> Which is the most ethical choice?
>
> For your judging pleasure, determine what the autonomous car should do:
>
> http://moralmachine.mit.edu/
That's a damn smart car in those scenarios. It always knows the occupation of possible crash victims.
 
> It's a lot better than it used to be, but there are still 200-300 people killed a year by trains at crossings.
> From what I've read, most of the difference has come from getting barriers at more crossings. So it's been the application of more tech that's really driven the improvement.

> Yep, that's what happened. Bag lady stepped off median into path of car at night.
> Splat.
That's something a good human driver can account for better than current technology.

This exact scenario is a common occurrence in St. Pete south of downtown, near a Salvation Army shelter. If you're not looking out for the homeless people on the sides of the road, you may unexpectedly find one under your car. There also tend to be a lot of dark-skinned folks, dressed darkly, crossing the street at night at random locations. I'd be very hesitant to let an automated car drive 4th/6th Street in St. Pete.
 
> I read an interesting question about self-driving cars. Where do you set the parameter at which the car will wreck itself, and risk its passengers' lives, to avoid hitting/killing a pedestrian? What if there are multiple passengers at a higher speed?
This was solved in I, Robot.
 
> It's the trolley problem - the reason why a lot of people will have a problem switching to autonomous cars. The car has to be programmed by someone to make choices. Do you want your car intentionally crashing - endangering you - in order to not hit someone else?
>
> "There is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person tied up on the side track. You have two options:
>   1. Do nothing, and the trolley kills the five people on the main track.
>   2. Pull the lever, diverting the trolley onto the side track where it will kill one person."
> Which is the most ethical choice?
>
> For your judging pleasure, determine what the autonomous car should do:
>
> http://moralmachine.mit.edu/
Found the dog and cat in car (dog driving obviously) deciding to kill a dog and puppy funny. Surely the car with the cat has to die.
 
> Found the dog and cat in car (dog driving obviously) deciding to kill a dog and puppy funny. Surely the car with the cat has to die.

I notice you didn't use a question mark for your last sentence. I concur.

Back in HS, our physics teacher used to use killing cats in most of his problems. "If I drop a 2 lb cat off a 50 ft cliff, what will be the cat's terminal velocity?"

Note: there was no drop in feline murders in TLH after he retired in the late 90s.
 
> I notice you didn't use a question mark for your last sentence. I concur.
>
> Back in HS, our physics teacher used to use killing cats in most of his problems. "If I drop a 2 lb cat off a 50 ft cliff, what will be the cat's terminal velocity?"
>
> Note: there was no drop in feline murders in TLH after he retired in the late 90s.
Funny, AND nice catch on the non-use of the question mark.
 
> I wouldn't have seen them, based on that video
Seriously? You wouldn't see a person walking a bicycle across a street? Sorry but that would have been really easy to avoid for someone paying attention.

The video makes it look like the person wasn't visible until the last second, but that simply doesn't mesh with reality. It wasn't a corner and she didn't step out from behind anything. The pedestrian would have been visible to a human long in advance.

Not sure how the system missed it, but that was gross error, not an unavoidable accident.
 
Curious: how do these self-driving cars know how fast to go? Many times on the highway the posted speed limits are too slow, and in residential or city areas the limits are too high, factoring in crowds, conditions, etc.

Also, the driver looks like Chris Farley in drag.
 
> I wouldn't have seen them, based on that video

I agree; based on that video, I would not have seen them either. I'm sure the video skews reality, and if I were behind the wheel I would have had better light and would have seen the person a little sooner. However, even if I had seen them a little sooner, I am not 100% positive I would have been able to stop or avoid them.
 
It doesn't look like they will fault Uber. She wasn't walking in the crosswalk, and she crossed several lanes on a very busy street at night. The Uber was going a bit fast, but not by much. The human driver did not look up until just before.
I was under the impression that lidar was being used, but I'm not sure why it didn't pick up the jaywalker sooner.
 
Yeah, the human (in the interior video) didn't seem real focused, which doesn't shock me.
 
> Seriously? You wouldn't see a person walking a bicycle across a street? Sorry but that would have been really easy to avoid for someone paying attention.
>
> The video makes it look like the person wasn't visible until the last second, but that simply doesn't mesh with reality. It wasn't a corner and she didn't step out from behind anything. The pedestrian would have been visible to a human long in advance.
>
> Not sure how the system missed it, but that was gross error, not an unavoidable accident.


Video, especially at night, really skews perception. Objects appear faster when close up but slower when far away, and night vision is always reduced on video. She was also crossing just beyond the street light, so eyes struggle with that lighting variation in real life, much less on video. She was also wearing blue jeans and a black jacket. Being a 4-lane road with a separate median, it likely had a speed limit around 55, so the vehicle likely wasn't speeding at all. I am also a bit surprised there are no FLIR sensors on the cars. Seems that would be significantly better for an autonomous car at night than just headlights.

I don't blame the driver; they are there to monitor the car as well as the road, so they're definitely juggling multiple things. It would cost money, but it would likely be better to have two people: one for the road exclusively and one to monitor the machines. Would a fully attentive person have seen the pedestrian? Possibly, but the pedestrian did almost everything she could to get hit. Hell, she could have crossed the road 10 feet to the right and been under a street light! It does pose the question for me: if you are "monitoring" a self-driving car, are you still liable? You aren't driving... If yes, then you should be focused on the road only, just like a full-time driver. 'Cause Farley definitely was looking down for extended periods. The entire time the car should have seen the pedestrian, his eyes were down.
 
I think the fact that he's not watching the road shows that having a person in the car doesn't really help much in case of a problem. I do think there could be some idea of negligence to charge Uber with in that regard.

That said, I have no idea if I would have seen the woman in normal conditions. I am curious what the car uses to detect obstructions. If it's just that camera, then I'm not surprised it didn't detect her; but if so, the camera alone isn't sufficient, and it likely needs some other detection equipment besides cameras for night driving.
 