#2 2016-10-13 11:32:59
They're not wrong. Try to think of a situation involving pedestrians or kids on bikes that would create a life-threatening situation for the occupant of the vehicle.
This type of decision-making is aimed at highway situations, where crash energy, and with it the odds of occupant mortality, grows with the square of speed.
Offline
#3 2016-10-13 12:12:02
OBD-I and prior, thank you just the same. My car has no good reason to reason for me, or to report back to Corporate on my actions.
Offline
#4 2016-10-13 12:13:25
XregnaR wrote:
They're not wrong. Try to think of a situation involving pedestrians or kids on bikes that would create a life-threatening situation for the occupant of the vehicle.
This type of decision-making is aimed at highway situations, where crash energy, and with it the odds of occupant mortality, grows with the square of speed.
The morality of this starts to break down the moment the car decides to drive up onto the sidewalk and hit a bunch of pedestrians to avoid a head-on collision. I knew the Three Laws of Robotics would come into play sometime in my lifetime. It seems Mercedes has added a zeroth Law.
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
0. A robot must preserve the lives of its own humans over the lives of other human beings.
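The priority ordering described above can be sketched in a few lines of Python. This is a purely hypothetical illustration: the function, outcome fields, and scores are all invented here, and no real vehicle firmware works this way.

```python
# Hypothetical sketch: the Three Laws plus the satirical "zeroth" law treated
# as a strict (lexicographic) priority ordering over candidate actions.
# All names and numbers are invented for illustration.

def choose_action(candidate_actions):
    """Pick the action with the lowest cost tuple; lower harm is better.

    Tuples compare lexicographically, so the zeroth law dominates the
    first, the first dominates the second, and so on.
    """
    def law_costs(action):
        return (
            action["occupant_harm"],  # "zeroth" law: protect the car's own humans first
            action["human_harm"],     # First Law: minimize harm to any human
            action["disobedience"],   # Second Law: obey orders when laws above allow
            action["self_damage"],    # Third Law: preserve the robot itself
        )
    return min(candidate_actions, key=law_costs)

swerve = {"occupant_harm": 0, "human_harm": 5, "disobedience": 0, "self_damage": 1}
brake  = {"occupant_harm": 2, "human_harm": 1, "disobedience": 0, "self_damage": 0}

chosen = choose_action([swerve, brake])
print("swerve" if chosen is swerve else "brake")
```

Because the comparison is lexicographic, the car swerves (occupant harm 0 beats 2) even though braking causes far less total human harm, which is exactly the moral breakdown described above.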
Offline
#5 2016-10-13 13:56:42
White people in a Mercedes are more valuable than BLM assholes blocking traffic on the freeway. I'm glad the car would run over the assholes !
Offline
#6 2016-10-13 14:05:44
fnord wrote:
White people in a Mercedes are more valuable than BLM assholes blocking traffic on the freeway. I'm glad the car would run over the assholes !
Hey! It's our butt Pirate!!!
Damn sure have missed you, man; let me buy you a beer (or a nice Chardonnay if you're feeling Twinky).
Luv
Em.
Offline
#7 2016-10-14 21:35:36
Start Judging...
http://moralmachine.mit.edu/
Last edited by Chuck Schick (2016-10-14 21:43:25)
Offline
#8 2016-10-15 02:05:31
False dichotomy.
Offline
#9 2016-10-19 17:18:31
Why do we complain when a machine would make the same moral decision that a human would make? Given the choice, I'm going to choose to kill a pedestrian over running head-on into a truck too.
Offline
#10 2016-10-19 18:07:25
GooberMcNutly wrote:
Why do we complain when a machine would make the same moral decision that a human would make? Given the choice, I'm going to choose to kill a pedestrian over running head-on into a truck too.
Because you honestly don't know what you'd do until the time comes, despite what you say.
With the car it's: young family with two kids dies, rich fuck lives. Every time.
Offline
#11 2016-10-20 14:57:07
Baywolfe wrote:
GooberMcNutly wrote:
Why do we complain when a machine would make the same moral decision that a human would make? Given the choice, I'm going to choose to kill a pedestrian over running head-on into a truck too.
Because you honestly don't know what you'd do until the time comes, despite what you say.
With the car it's: young family with two kids dies, rich fuck lives. Every time.
If it's a family of wetbacks running across the freeway to escape the INS agents at the border, they deserve to die!
Offline
#12 2016-10-20 16:26:00
fnord wrote:
Baywolfe wrote:
GooberMcNutly wrote:
Why do we complain when a machine would make the same moral decision that a human would make? Given the choice, I'm going to choose to kill a pedestrian over running head-on into a truck too.
Because you honestly don't know what you'd do until the time comes, despite what you say.
With the car it's: young family with two kids dies, rich fuck lives. Every time.
If it's a family of wetbacks running across the freeway to escape the INS agents at the border, they deserve to die!
You don't strike me as a border town resident.
Offline
#13 2016-10-20 16:28:55
That's what all the Gated Community people think; paranoia is their mantra.
Offline
#14 2016-10-20 21:37:37
GooberMcNutly wrote:
Why do we complain when a machine would make the same moral decision that a human would make? Given the choice, I'm going to choose to kill a pedestrian over running head-on into a truck too.
This brings up a key point. Why should I not be able to set the moral relativity of my machines? Why does the choice they make have to be fixed? Can they not reflect my own morals, or lack thereof? Can my car not reflect just how much of an asshole I feel today? And if I am of a particular assholeliness, I should have an "I just don't care about you" setting, should I not?
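That "settable morality" amounts to a single user-tunable weight in the planner's cost function. A purely hypothetical sketch, where the cost model, names, and numbers are all invented:

```python
# Hypothetical "selfishness dial": a user-set weight that scales how much
# the planner values occupant harm versus harm to people outside the car.
# Everything here is invented for illustration.

def collision_cost(occupant_harm, outsider_harm, selfishness=0.5):
    """selfishness=1.0 is the 'I just don't care about you' setting,
    0.0 is pure altruism, and 0.5 weighs everyone equally."""
    return selfishness * occupant_harm + (1 - selfishness) * outsider_harm

def pick(options, selfishness):
    """Choose the option with the lowest blended cost for this dial setting."""
    return min(options, key=lambda o: collision_cost(o["occ"], o["out"], selfishness))

hit_wall = {"name": "swerve into wall", "occ": 8, "out": 0}
hit_ped  = {"name": "hold course",      "occ": 0, "out": 9}

# The same physical scenario flips outcome with the dial alone:
print(pick([hit_wall, hit_ped], selfishness=0.2)["name"])  # altruist: swerve into wall
print(pick([hit_wall, hit_ped], selfishness=0.9)["name"])  # asshole mode: hold course
```

The point of the sketch: nothing in the math fixes the weight, which is exactly why it could, in principle, be a menu item.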
Last edited by Johnny_Rotten (2016-10-20 21:38:38)
Offline
#15 2016-10-21 00:28:24
Johnny_Rotten wrote:
This brings up a key point. Why should I not be able to set the moral relativity of my machines? Why does the choice they make have to be fixed? Can they not reflect my own morals, or lack thereof? Can my car not reflect just how much of an asshole I feel today? And if I am of a particular assholeliness, I should have an "I just don't care about you" setting, should I not?
Can I set it to run into the car next to me, the one turning left from the right lane? You know, the one driven by some asshole who wanted to get ahead of ten cars at the stoplight. That's the kind of setting I want to see in the menu: a "run into the BMW" button.
Offline
#16 2016-10-21 12:21:17
Johnny_Rotten wrote:
GooberMcNutly wrote:
Why do we complain when a machine would make the same moral decision that a human would make? Given the choice, I'm going to choose to kill a pedestrian over running head-on into a truck too.
This brings up a key point. Why should I not be able to set the moral relativity of my machines? Why does the choice they make have to be fixed? Can they not reflect my own morals, or lack thereof? Can my car not reflect just how much of an asshole I feel today? And if I am of a particular assholeliness, I should have an "I just don't care about you" setting, should I not?
That's what I'm saying. Every day when you get into the driver's seat, you make the moral decision to either run over pedestrians or not.
Machines won't be made any more altruistic than their designers.
The limiting problem (for both meat and metal) is that neither of us is intelligent enough to pick the altruistic option.
Offline