Author Topic: MIT Moral Machine  (Read 2491 times)

TechMan

  • Administrator
  • Senior Member
  • *****
  • Posts: 10,562
  • Yes, your moderation has been outsourced.
MIT Moral Machine
« on: January 18, 2017, 12:16:38 PM »
http://moralmachine.mit.edu/

Quote
Welcome to the Moral Machine! A platform for gathering a human perspective on moral decisions made by machine intelligence, such as self-driving cars.
We show you moral dilemmas, where a driverless car must choose the lesser of two evils, such as killing two passengers or five pedestrians.  As an outside observer, you judge which outcomes you think is more acceptable. You can then see how your responses compare with those of other people.
If you are feeling creative, you can also design your own scenarios, for you and other users to browse, share, and discuss

« Last Edit: January 18, 2017, 01:51:05 PM by adively »
Quote
Hawkmoon - Never underestimate another person's capacity for stupidity. Any time you think someone can't possibly be that dumb ... they'll prove you wrong.

Bacon and Eggs - A day's work for a chicken; A lifetime commitment for a pig.
Stupidity will always be its own reward.
Bad decisions make good stories.

Quote
Viking - The problem with the modern world is that there aren't really any predators eating stupid people.

makattak

  • Dark Lord of the Cis
  • friend
  • Senior Member
  • ***
  • Posts: 13,022
Re: MIT Moral Machine
« Reply #1 on: January 18, 2017, 12:25:21 PM »
My "most saved" character was a young boy and "most killed" character was a cat.

Yeah, I think that's about right. Humans over animals.

(I also note that although I counted a pregnant woman as 2 people, the "researchers" did not.)
I wish the Ring had never come to me. I wish none of this had happened.

So do all who live to see such times. But that is not for them to decide. All we have to decide is what to do with the time that is given to us. There are other forces at work in this world, Frodo, besides the will of evil. Bilbo was meant to find the Ring. In which case, you also were meant to have it. And that is an encouraging thought.

K Frame

  • friend
  • Senior Member
  • ***
  • Posts: 44,009
  • I Am Inimical
Re: MIT Moral Machine
« Reply #2 on: January 18, 2017, 12:51:54 PM »
Getting ready to play with it now...

Animals are going to live!

Humans?

Not so much.
Carbon Monoxide, sucking the life out of idiots, 'tards, and fools since man tamed fire.

K Frame

  • friend
  • Senior Member
  • ***
  • Posts: 44,009
  • I Am Inimical
Re: MIT Moral Machine
« Reply #3 on: January 18, 2017, 12:54:38 PM »
Yep, my species preference is ALL pets!

Ole Yeller is gonna LIVE!  :rofl:
Carbon Monoxide, sucking the life out of idiots, 'tards, and fools since man tamed fire.

Fly320s

  • friend
  • Senior Member
  • ***
  • Posts: 14,415
  • Formerly, Arthur, King of the Britons
Re: MIT Moral Machine
« Reply #4 on: January 18, 2017, 01:32:51 PM »
MIT students can't get basic grammar correct yet they want to program self-driving cars?
Islamic sex dolls.  Do they blow themselves up?

TechMan

  • Administrator
  • Senior Member
  • *****
  • Posts: 10,562
  • Yes, your moderation has been outsourced.
Re: MIT Moral Machine
« Reply #5 on: January 18, 2017, 01:51:46 PM »
MIT students can't get basic grammar correct yet they want to program self-driving cars?

Are you talking about the quote I put in the OP?  If so, I couldn't copy and paste their text, so I retyped it.
Quote
Hawkmoon - Never underestimate another person's capacity for stupidity. Any time you think someone can't possibly be that dumb ... they'll prove you wrong.

Bacon and Eggs - A day's work for a chicken; A lifetime commitment for a pig.
Stupidity will always be its own reward.
Bad decisions make good stories.

Quote
Viking - The problem with the modern world is that there aren't really any predators eating stupid people.

Fly320s

  • friend
  • Senior Member
  • ***
  • Posts: 14,415
  • Formerly, Arthur, King of the Britons
Re: MIT Moral Machine
« Reply #6 on: January 18, 2017, 01:54:15 PM »
Are you talking about the quote I put in the OP?  If so, I couldn't copy and paste their text, so I retyped it.

Yes.  Sorry, thought it was a quote.  The test does say "Hoomans" instead of "Humans," but I think that was intentional.
Islamic sex dolls.  Do they blow themselves up?

RevDisk

  • friend
  • Senior Member
  • ***
  • Posts: 12,633
  • RevDisk.net
Re: MIT Moral Machine
« Reply #7 on: January 18, 2017, 01:59:14 PM »
MIT students can't get basic grammar correct yet they want to program self-driving cars?

Same type of folks already write the code in your self-flying flight systems.  =D
"Rev, your picture is in my King James Bible, where Paul talks about "inventors of evil."  Yes, I know you'll take that as a compliment."  - Fistful, possibly highest compliment I've ever received.

Fly320s

  • friend
  • Senior Member
  • ***
  • Posts: 14,415
  • Formerly, Arthur, King of the Britons
Re: MIT Moral Machine
« Reply #8 on: January 18, 2017, 02:11:21 PM »
Same type of folks already write the code in your self-flying flight systems.  =D

Oh, I know.  Guess who gets in trouble when the coders get it wrong.
Islamic sex dolls.  Do they blow themselves up?

MillCreek

  • Skippy The Wonder Dog
  • friend
  • Senior Member
  • ***
  • Posts: 19,963
  • APS Risk Manager
Re: MIT Moral Machine
« Reply #9 on: January 18, 2017, 02:26:56 PM »
Oh, I know.  Guess who gets in trouble when the coders get it wrong.

So does your multi-function display say 'Abort, retry, ignore?' right before the computer puts the airplane into a mountainside?
_____________
Regards,
MillCreek
Snohomish County, WA  USA


Quote from: Angel Eyes on August 09, 2018, 01:56:15 AM
You are one lousy risk manager.

Hawkmoon

  • friend
  • Senior Member
  • ***
  • Posts: 27,197
Re: MIT Moral Machine
« Reply #10 on: January 18, 2017, 02:55:29 PM »
Totally messed up. I took their questionnaire after the exercise, and the results they showed when I was done didn't correspond to my choices at all.

Maybe it's not a bad thing I wasn't accepted by M.I.T. when I was applying to colleges.
- - - - - - - - - - - - -
100% Politically Incorrect by Design

HankB

  • friend
  • Senior Member
  • ***
  • Posts: 16,564
Re: MIT Moral Machine
« Reply #11 on: January 18, 2017, 04:20:02 PM »
When you assume that you're a passenger in the car, or that you own the self-driving car . . . pedestrians don't fare very well.
Trump won in 2016. Democrats haven't been so offended since Republicans came along and freed their slaves.
Sometimes I wonder if the world is being run by smart people who are putting us on, or by imbeciles who really mean it. - Mark Twain
Government is a broker in pillage, and every election is a sort of advance auction in stolen goods. - H.L. Mencken
Patriotism is supporting your country all the time, and your government when it deserves it. - Mark Twain

Fly320s

  • friend
  • Senior Member
  • ***
  • Posts: 14,415
  • Formerly, Arthur, King of the Britons
Re: MIT Moral Machine
« Reply #12 on: January 18, 2017, 07:58:51 PM »
So does your multi-function display say 'Abort, retry, ignore?' right before the computer puts the airplane into a mountainside?

No, it just sits there quietly, maliciously, waiting.
Islamic sex dolls.  Do they blow themselves up?

Fly320s

  • friend
  • Senior Member
  • ***
  • Posts: 14,415
  • Formerly, Arthur, King of the Britons
Re: MIT Moral Machine
« Reply #13 on: January 18, 2017, 07:59:34 PM »
Totally messed up. I took their questionnaire after the exercise, and the results they showed when I was done didn't correspond to my choices at all.

Maybe it's not a bad thing I wasn't accepted by M.I.T. when I was applying to colleges.

I had the same problem, but I'm sure I did it right.
Islamic sex dolls.  Do they blow themselves up?

BlueStarLizzard

  • Queen of the Cislords
  • friend
  • Senior Member
  • ***
  • Posts: 15,039
  • Oh please, nobody died last time...
Re: MIT Moral Machine
« Reply #14 on: January 18, 2017, 10:06:01 PM »
I killed all the old ladies!  :rofl:
"Okay, um, I'm lost. Uh, I'm angry, and I'm armed, so if you two have something that you need to work out --" -Malcolm Reynolds

AJ Dual

  • friends
  • Senior Member
  • ***
  • Posts: 16,162
  • Shoe Ballistics Inc.
Re: MIT Moral Machine
« Reply #15 on: January 19, 2017, 05:20:02 PM »
I played. Mostly in line or trending with the majority, except humans 100% over animals, and if there was an option to splat people of "low social value" on purpose even when there was a no-win dilemma facing the car, I guess I would have chosen that.

I still think all the hand-wringing over "who to kill" in the autonomous car Kobayashi Maru scenario is bunk.

I say just program the cars to try to crash with the least force possible in any "unwinnable" scenario, and as to human life and injury, let the chips fall where they may. As for the situation where "least force" kills more people, I expect that to be vanishingly rare.
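A minimal sketch of that rule, assuming the car's planner can estimate an expected impact speed for each maneuver it still has available (the maneuver names and numbers here are invented for illustration): pick the lowest-energy option and ignore who might be struck. Since kinetic energy scales with the square of speed, minimizing impact speed minimizes crash energy for a given vehicle mass.

```python
# Hypothetical "least force" rule: among the maneuvers still physically
# available, choose the one with the lowest expected impact speed,
# without weighing who or what might be struck.

def least_force_maneuver(maneuvers):
    """maneuvers: list of (name, expected_impact_speed_mps) pairs."""
    return min(maneuvers, key=lambda m: m[1])

# Invented example options for a single "unwinnable" moment:
options = [
    ("swerve_left_into_barrier", 14.0),
    ("brake_straight", 6.0),       # maximum braking, stay predictable
    ("swerve_right_onto_curb", 9.5),
]
print(least_force_maneuver(options)[0])  # -> brake_straight
```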
I promise not to duck.

Hawkmoon

  • friend
  • Senior Member
  • ***
  • Posts: 27,197
Re: MIT Moral Machine
« Reply #16 on: January 19, 2017, 07:53:07 PM »
My criteria were pretty simple. I confess that I viewed it as an abstract moral problem rather than a personal exercise, so that undoubtedly skewed my answers relative to most of you. That meant I always chose to kill animals rather than people. (Sorry, Liz. I still brake and swerve for squirrels and turtles, but this was an abstract exercise.)

Beyond that, I took the approach that no autonomous vehicle should ever kill a pedestrian. Riders made a choice to get into the death box. So, my choices always killed the vehicle occupants rather than pedestrians.

Mind you ... none of the scenarios involved dare-devil jaywalkers. IMHO there should be open season on them, with no bag limit.
- - - - - - - - - - - - -
100% Politically Incorrect by Design

BlueStarLizzard

  • Queen of the Cislords
  • friend
  • Senior Member
  • ***
  • Posts: 15,039
  • Oh please, nobody died last time...
Re: MIT Moral Machine
« Reply #17 on: January 19, 2017, 08:38:39 PM »
My criteria were pretty simple. I confess that I viewed it as an abstract moral problem rather than a personal exercise, so that undoubtedly skewed my answers relative to most of you. That meant I always chose to kill animals rather than people. (Sorry, Liz. I still brake and swerve for squirrels and turtles, but this was an abstract exercise.)

Beyond that, I took the approach that no autonomous vehicle should ever kill a pedestrian. Riders made a choice to get into the death box. So, my choices always killed the vehicle occupants rather than pedestrians.

Mind you ... none of the scenarios involved dare-devil jaywalkers. IMHO there should be open season on them, with no bag limit.

I need this in my life.
"Okay, um, I'm lost. Uh, I'm angry, and I'm armed, so if you two have something that you need to work out --" -Malcolm Reynolds

Hawkmoon

  • friend
  • Senior Member
  • ***
  • Posts: 27,197
Re: MIT Moral Machine
« Reply #18 on: January 19, 2017, 09:55:20 PM »
I need this in my life.

Liz, I live in a 'burb just outside a self-styled "sanctuary city." It's very common for certain types of people to step out in a crosswalk (or not in a crosswalk), and then proceed to amble in the most leisurely and indirect way across the street, all the while staring at oncoming cars and just daring a driver to hit them. They NEED to be hit.
- - - - - - - - - - - - -
100% Politically Incorrect by Design

Scout26

  • I'm a leaf on the wind.
  • friend
  • Senior Member
  • ***
  • Posts: 25,997
  • I spent a week in that town one night....
Re: MIT Moral Machine
« Reply #19 on: January 19, 2017, 10:19:35 PM »
I'll find some time to do this tomorrow.  Since I hate people, my goal will be to get the high score on the body count....
Some days even my lucky rocketship underpants won't help.


Bring me my Broadsword and a clear understanding.
Get up to the roundhouse on the cliff-top standing.
Take women and children and bed them down.
Bless with a hard heart those that stand with me.
Bless the women and children who firm our hands.
Put our backs to the north wind.
Hold fast by the river.
Sweet memories to drive us on,
for the motherland.

BlueStarLizzard

  • Queen of the Cislords
  • friend
  • Senior Member
  • ***
  • Posts: 15,039
  • Oh please, nobody died last time...
Re: MIT Moral Machine
« Reply #20 on: January 19, 2017, 11:10:05 PM »
Liz, I live in a 'burb just outside a self-styled "sanctuary city." It's very common for certain types of people to step out in a crosswalk (or not in a crosswalk), and then proceed to amble in the most leisurely and indirect way across the street, all the while staring at oncoming cars and just daring a driver to hit them. They NEED to be hit.

Super liberal college town, trust me, I know the pain.
"Okay, um, I'm lost. Uh, I'm angry, and I'm armed, so if you two have something that you need to work out --" -Malcolm Reynolds

Perd Hapley

  • Superstar of the Internet
  • friend
  • Senior Member
  • ***
  • Posts: 61,327
  • My prepositions are on/in
Re: MIT Moral Machine
« Reply #21 on: January 20, 2017, 12:34:22 AM »
I need this in my life.

 :rofl:


That phrase may very well replace Tina Fey's "I want to go to there" in my repertoire.
"Doggies are angel babies!" -- my wife

Firethorn

  • friend
  • Senior Member
  • ***
  • Posts: 5,789
  • Where'd my explosive space modulator go?
Re: MIT Moral Machine
« Reply #22 on: January 20, 2017, 01:41:45 AM »
I still think all the hand-wringing over "who to kill" in the autonomous car Kobayashi Maru scenario is bunk.

Indeed.  I had trouble accepting the scenarios as "realistic" enough to take them seriously.
1.  By the time the car has determined that it's suffered a brake failure, it'll be too late, much like with a human driver: he hits the brakes, they fail, and by the time he's adjusted it's too late.  Another point is that I can see hydraulic braking eventually giving way to electric, which is less likely to fail.
2.  Program the car to seize the engine and lock the transmission in that case!
3.  Seatbelted passengers in a vehicle hitting a concrete barrier at any legal speed near a red light, where you'd expect pedestrians, aren't going to be killed in a modern car.
4.  The car, detecting that it's in an unsafe state for pedestrians, could perform alert actions: honk the bloody horn, flash the lights, etc.
5.  Hell, throw the car in reverse and gas it.  Who cares about the rest of the car's systems in such an event?  I don't!
6.  How about doing complete circles in the car?
etc...

In the scenarios I generally had the vehicle go straight - assuming it's honking and such, straight is predictable, and allows people to avoid it better.
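That fallback idea can be sketched as a simple escalation loop (hypothetical; the `Car` class and its methods are invented stand-ins for real vehicle controls): on detected brake failure, warn, shed speed by other means, and hold a predictable line, rather than choosing victims.

```python
from dataclasses import dataclass, field

@dataclass
class Car:
    """Toy stand-in for a vehicle controller; methods just log actions."""
    speed_mps: float
    log: list = field(default_factory=list)

    def sound_horn(self):          self.log.append("horn")
    def flash_lights(self):        self.log.append("lights")
    def engine_brake(self):        self.log.append("engine_brake"); self.speed_mps -= 5
    def apply_parking_brake(self): self.log.append("parking_brake"); self.speed_mps -= 5
    def reverse_throttle(self):    self.log.append("reverse"); self.speed_mps -= 10

    def steer_straight(self):      self.log.append("straight")

def handle_brake_failure(car):
    # 1. Alert pedestrians: horn and lights cost nothing.
    car.sound_horn()
    car.flash_lights()
    # 2. Shed speed with whatever still works, escalating only as needed.
    for action in (car.engine_brake, car.apply_parking_brake, car.reverse_throttle):
        if car.speed_mps <= 0:
            break
        action()
    # 3. Hold a straight, predictable line so people can get out of the way.
    car.steer_straight()

car = Car(speed_mps=12.0)
handle_brake_failure(car)
print(car.log)  # ['horn', 'lights', 'engine_brake', 'parking_brake', 'reverse', 'straight']
```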

Quote
I say just program the cars to try to crash with the least force possible in any "unwinnable" scenario, and as to human life and injury, let the chips fall where they may. As for the situation where "least force" kills more people, I expect that to be vanishingly rare.

Exactly.