The trolley problem

Guest
The problem of selecting between 5 lives and 1

https://www.technologyreview.com/2018/10/24/139313/a-global-ethics-study-aims-to-help-ai-solve-the-self-driving-trolley-problem/

Plus other problems

Why would one select 5 to die instead of 1?

Is it because you then decide with a guilty conscience?

Regards

Klaus
 
On Wednesday, April 8, 2020 at 8:13:24 PM UTC-4, John S wrote:
On 4/8/2020 9:48 AM, klaus.kragelund@gmail.com wrote:
The problem of selecting between 5 lives and 1

https://www.technologyreview.com/2018/10/24/139313/a-global-ethics-study-aims-to-help-ai-solve-the-self-driving-trolley-problem/

Plus other problems

Why would one select 5 to die instead of 1?

Is it because you then decide with a guilty conscience?

Regards

Klaus


This is stupid.

"The Moral Machine took that idea to test nine different comparisons
shown to polarize people: should a self-driving car prioritize humans
over pets, passengers over pedestrians, more lives over fewer, women
over men, young over old, fit over sickly, higher social status over
lower, law-abiders over law-benders?"

How does the car know women vs men, young vs old, fit vs sickly, higher
social status vs lower, law-abiders vs law-benders?

Any self-driving car would be able to avoid hitting the many/few, young/old, etc. by stopping.

It's a hypothetical.
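For what it's worth, the article's nine comparisons boil down to ranking outcomes by who gets harmed. Here's a toy sketch of that idea (the weights and function names are made up for illustration; this is not the Moral Machine's actual model):

```python
# Toy harm-minimizing chooser. Each outcome is a list of beings harmed,
# and each being is a list of attribute strings. Hypothetical weights only.
WEIGHTS = {"human": 10, "pet": 1}

def harm_score(outcome):
    """Sum the weights of every attribute of every being harmed."""
    return sum(WEIGHTS.get(attr, 0) for being in outcome for attr in being)

def choose(outcomes):
    """Pick the outcome with the lowest total harm score."""
    return min(outcomes, key=harm_score)

# Staying on course harms five humans; swerving harms one pet.
stay = [["human"]] * 5
swerve = [["pet"]]
print(choose([stay, swerve]))  # prints [['pet']]: harm 1 beats harm 50
```

Of course, the whole debate is about where those weights come from, and (as noted above) whether the sensors could ever supply the attributes in the first place.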

--

Rick C.

- Get 1,000 miles of free Supercharging
- Tesla referral code - https://ts.la/richard11209
 
On 4/8/2020 9:48 AM, klaus.kragelund@gmail.com wrote:
The problem of selecting between 5 lives and 1

https://www.technologyreview.com/2018/10/24/139313/a-global-ethics-study-aims-to-help-ai-solve-the-self-driving-trolley-problem/

Plus other problems

Why would one select 5 to die instead of 1?

Is it because you then decide with a guilty conscience?

Regards

Klaus

This is stupid.

"The Moral Machine took that idea to test nine different comparisons
shown to polarize people: should a self-driving car prioritize humans
over pets, passengers over pedestrians, more lives over fewer, women
over men, young over old, fit over sickly, higher social status over
lower, law-abiders over law-benders?"

How does the car know women vs men, young vs old, fit vs sickly, higher social
status vs lower, law-abiders vs law-benders?
 
On 09/04/20 01:14, John S wrote:
On 4/8/2020 9:48 AM, klaus.kragelund@gmail.com wrote:
The problem of selecting between 5 lives and 1

https://www.technologyreview.com/2018/10/24/139313/a-global-ethics-study-aims-to-help-ai-solve-the-self-driving-trolley-problem/


Plus other problems

Why would one select 5 to die instead of 1?

Is it because you then decide with a guilty conscience?

Regards

Klaus


This is stupid.

"The Moral Machine took that idea to test nine different comparisons shown to
polarize people: should a self-driving car prioritize humans over pets,
passengers over pedestrians, more lives over fewer, women over men, young over
old, fit over sickly, higher social status over lower, law-abiders over
law-benders?"

How does the car know women vs men, young vs old, fit vs sickly, higher social
status vs lower, law-abiders vs law-benders?

Do you also think Schroedinger's Cat is stupid?

How would a human driver know all that?

If you suddenly found yourself in that position, what would you do?
 
On Thursday, April 9, 2020 at 4:26:25 AM UTC-4, Tom Gardner wrote:
On 09/04/20 01:14, John S wrote:
On 4/8/2020 9:48 AM, klaus.kragelund@gmail.com wrote:
The problem of selecting between 5 lives and 1

https://www.technologyreview.com/2018/10/24/139313/a-global-ethics-study-aims-to-help-ai-solve-the-self-driving-trolley-problem/


Plus other problems

Why would one select 5 to die instead of 1?

Is it because you then decide with a guilty conscience?

Regards

Klaus


This is stupid.

"The Moral Machine took that idea to test nine different comparisons shown to
polarize people: should a self-driving car prioritize humans over pets,
passengers over pedestrians, more lives over fewer, women over men, young over
old, fit over sickly, higher social status over lower, law-abiders over
law-benders?"

How does the car know women vs men, young vs old, fit vs sickly, higher social
status vs lower, law-abiders vs law-benders?

Do you also think Schroedinger's Cat is stupid?

How would a human driver know all that?

If you suddenly found yourself in that position, what would you do?

I'd step on the brake until I came to a full stop.

Let's face it, Schrödinger's cat is pretty stupid to be put into a box with radioactive stuff and not scratch the crap out of the person holding you, who would probably be Niels Bohr. No, wait, more likely J. Robert Oppenheimer. He never liked cats much.

--

Rick C.

+ Get 1,000 miles of free Supercharging
+ Tesla referral code - https://ts.la/richard11209
 
