Self Driving Uber Car Kills Pedestrian
splashmtn
Star Player
Joined: 30 Aug 2016
Posts: 3961
Posted: Tue Mar 27, 2018 12:45 pm

ringfinger wrote:
splashmtn wrote:
ringfinger wrote:
Chronicle wrote:
ringfinger wrote:
Btw, here is the interesting (and scary) thing about the future of autonomous driving: Decision Making.

If a car can compute, instantly, say via AI, that a fatal accident is about to happen, should it prioritize the lives of its occupants or prioritize the fewest deaths overall? If you're going to hit a school bus in a way that will kill all the kids on board, but your car can execute a maneuver that will save all the kids at your expense, it might end up doing just that.

In short, the car will decide who dies in a scenario where someone is going to die.

https://www.usatoday.com/story/money/cars/2017/11/23/self-driving-cars-programmed-decide-who-dies-crash/891493001/


How is this different from sitting in a taxi where some dude decides who dies?


Well, there are a couple of differences.

1) The taxi driver isn't going to be able to make the same predictive analysis, because he/she won't have the same information available that an autonomous vehicle would. For instance, he won't know the exact speed and direction of the other vehicle(s), whereas an autonomous vehicle, in theory, could. They also wouldn't be able to calculate an enormous number of scenarios, like "if I slam into this car at this rate of speed then this will happen," etc. A human driver also wouldn't necessarily know for certain the number of occupants in each vehicle.

2) A human driver is going to make decisions based on self-preservation, whereas an autonomous vehicle could be programmed to make decisions based on the greater good.

So this could mean that, in a given scenario, it could decide there are only two outcomes: slam into a car with two passengers, killing them, or swerve off a cliff, killing itself (and you), saving two people at the cost of one.

Or we can take it a step further. Let's say the autonomous vehicle knows that the person in the other car has a significant net worth and donates thousands of dollars to charity each year. But you, you're just an everyman who doesn't have the means to do the same thing. It may decide, if someone has to perish, that sparing the other person serves the greater good, and send you off the cliff instead.

So in your example, the two are the same in the sense that some other entity is making the decision on your behalf, but the conclusions they draw, and how they arrive at them, are markedly different.


To the bolded: you don't know if that's the case or not. If someone has the time to react, they might just choose others over themselves. You never know what's on another person's mind.


You're right. You don't know. We're hypothesizing. And that's sort of the point. With AI, the "best" case scenario will always be followed, whereas with a human being you cannot, as you have astutely pointed out, be guaranteed that will be the case.

I think we can at least agree that in the human driver scenario, at least some of the time, they will prioritize self-preservation over the greater good.
It depends.

It's a natural reaction when driving to first veer one direction or the other, out of the way of whatever is in front of you. Not necessarily for self-preservation, but purely out of fear of "oh no, I should move." Maybe it is self-preservation going on, even when in reality that tactic doesn't actually help you preserve your life. For example, I've been in the car many a time with people who will completely veer off to the right when a possum darts out in the road. Not even a house cat. A possum. Think about that for a moment. Now you let it be me. I don't want to do it, but I'm running Mr. Opossum over like he never existed. VOOOM. Sorry buddy, stay out of the road. But a lot of people will get into worse accidents trying not to hit the little ugly animal. Now if it's a dog, I'm going to slow down if I can, or try to veer if doable without creating an accident for myself and others. But most people just go OH NO and veer.
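
For what it's worth, here is a minimal sketch of the two policies being debated above: "fewest deaths overall" versus "protect the occupants first." Every name and number in it is a made-up assumption for illustration, not any manufacturer's actual logic.

```python
# Hypothetical sketch only. The Maneuver class, the candidate maneuvers, and
# the death estimates are all invented to illustrate the two policies debated
# above; no real autonomous-driving stack is being described here.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_deaths: float  # expected deaths inside this car
    other_deaths: float     # expected deaths outside this car

def fewest_deaths(options):
    """'Greater good' policy: minimize total expected deaths."""
    return min(options, key=lambda m: m.occupant_deaths + m.other_deaths)

def protect_occupants(options):
    """Self-preservation policy: protect this car's occupants first."""
    return min(options, key=lambda m: (m.occupant_deaths, m.other_deaths))

options = [
    Maneuver("slam into the car with two passengers", occupant_deaths=0.1, other_deaths=2.0),
    Maneuver("swerve off the cliff", occupant_deaths=1.0, other_deaths=0.0),
]

print(fewest_deaths(options).name)      # swerve off the cliff
print(protect_occupants(options).name)  # slam into the car with two passengers
```

Same inputs, opposite choices, which is exactly the debate: whoever picks the cost function is deciding, ahead of time, who dies.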
splashmtn
Star Player
Joined: 30 Aug 2016
Posts: 3961
Posted: Tue Mar 27, 2018 12:50 pm

governator wrote:
https://www.nytimes.com/2018/02/10/technology/his-2020-campaign-message-the-robots-are-coming.html

To fend off the coming robots, Mr. Yang is pushing what he calls a “Freedom Dividend,” a monthly check for $1,000 that would be sent to every American from age 18 to 64, regardless of income or employment status. These payments, he says, would bring everyone in America up to approximately the poverty line, even if they were directly hit by automation. Medicare and Medicaid would be unaffected under Mr. Yang’s plan, but people receiving government benefits such as the Supplemental Nutrition Assistance Program could choose to continue receiving those benefits, or take the $1,000 monthly payments instead.
Good idea, but not nearly enough. People at the poverty line right now have one leg on the street and one leg in a home somewhere. One more incident and they're out. One more rent hike and they're out on their rears.

Something drastic would have to happen with the housing situation countrywide. That said, if they did something with housing and made it a state, city, or county-controlled thing, you would not be able to move from one state to another, or from one city/county to another. You would have to stay right where you are, because other areas could not afford to have extras taking up spots. Now, there could be a website where, if you could find a family or person who wanted to switch with you, that would be your way in: some people rushing to leave the big city and go back to the Midwest, a Midwest young guy or girl wanting to come out to Cali. Or they could just keep a long list; if people die or go elsewhere, a spot opens up in that area. It would make for a great movie if you did it right. Just pay me on the back end.
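
As a quick sanity check on the article's "approximately the poverty line" claim: $1,000 a month is $12,000 a year, which sits just under the 2018 federal poverty guideline for a single person. The roughly $12,140 figure below is an assumption brought in for comparison, not something from the quoted article.

```python
# Back-of-the-envelope check on the "approximately the poverty line" claim.
# The $12,140 figure is assumed here as the 2018 federal poverty guideline
# for one person (48 contiguous states); it is not from the quoted article.
monthly_dividend = 1_000
annual_dividend = monthly_dividend * 12              # $12,000 per year
poverty_guideline_2018_single = 12_140

print(annual_dividend)                                             # 12000
print(round(annual_dividend / poverty_guideline_2018_single, 2))   # 0.99
```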
ExPatLkrFan
Star Player
Joined: 29 Jul 2004
Posts: 3982
Location: Mukdahan, Thailand
Posted: Tue Mar 27, 2018 6:01 pm

I agree there will be programming for the "greater good"; however, that greater good will be the greater good of whoever owns the vehicle. It would mitigate damages to the car owner. Anything less would be a failure of the fiduciary responsibility to the company's shareholders. It's a lot cheaper to pay off one lawsuit than dozens, even though a rider in a hired car should expect, by contract, to have his rights looked after first. But I doubt it will happen that way.
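
One way to picture the distinction being drawn here: instead of weighting every life equally, the objective could weight harm on the owner's (or company's) side more heavily. The sketch below is purely illustrative; the options, weights, and numbers are invented.

```python
# Illustrative only: a cost function that weights harm to the car's own
# occupants more heavily than harm to others, standing in for "mitigate
# damages to the car owner." All weights and numbers are made up.
options = {
    "slam into the car with two passengers": (0.1, 2.0),  # (occupant deaths, other deaths)
    "swerve off the cliff":                  (1.0, 0.0),
}

def owner_weighted_cost(outcome, owner_weight=5.0, other_weight=1.0):
    occupant_deaths, other_deaths = outcome
    return owner_weight * occupant_deaths + other_weight * other_deaths

choice = min(options, key=lambda name: owner_weighted_cost(options[name]))
print(choice)  # slam into the car with two passengers
```

Under a plain "fewest deaths" objective the same numbers pick the cliff; weight the occupants' side heavily enough and the car keeps its own rider alive instead.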
ringfinger
Retired Number
Joined: 08 Oct 2013
Posts: 29418
Posted: Tue Mar 27, 2018 7:54 pm

^ There’s a lot of debate on whether it would be programmed for the greater good of society or the driver. I don’t think it’s decided yet.

The other question is, when there is a crash by an autonomous vehicle, who is to blame if the human isn’t the “driver”?
splashmtn
Star Player
Joined: 30 Aug 2016
Posts: 3961
Posted: Thu Mar 29, 2018 1:10 pm

More info coming in:

https://www.msn.com/en-us/autos/car-tech/uber-reportedly-disabled-key-safety-feature-prior-to-deadly-crash/ar-BBKOpaR?ocid=spartandhp

Read the entire article.