Uncommon Descent Serving The Intelligent Design Community

A note on unthinkable calculations


Barry Arrington published a piece here on unthinkable calculations: the slaughter of tens of millions of Chinese people, apparently accepted in a "nuancey" way by Henry Kissinger.

One possibly useful thought: train-track switch dilemmas (should one throw the switch and doom a fat man, or else two other people?), such as the one raised in Barry's reflection, are almost always false.

Indeed, they should be a good reason to drop "ethics" courses, unless one needs the courses to graduate, in which case the question becomes: what is my degree really worth?

Yes, the train tracks dilemma has really happened. One Belgian workman (I cannot just now find the reference) threw himself onto the tracks to hold two junction lines together and prevent near-certain fatalities.

But that guy didn’t know it was going to happen. He was not sitting in an ethics class at the time. He responded out of an instinct to prevent disaster.

A clever ethics prof was probably schmoozing on the cocktail circuit somewhere while that guy’s buddies were cleaning the tracks and breaking the news to the widow and orphans.

That said, the main reason the train tracks dilemma is almost always false is that most of us are already embedded in the situations in which we make ethical decisions. We have already made a number of earlier decisions that help provide a context for the one we will make now.

It’s not a tic tac toe game.

If a man was attracted to the exercise of power over millions (Kissinger?), he might feel more nuanced about those millions' deaths than, say, an American doctor like Bob Pierce, the World Vision founder, who discovered that it would cost only about $5 a month to save the life of a Korean child. And then spent the rest of his life raising money from people who might spend that much on one restaurant meal.

So one person is in situation A and the other is in situation B. And that is not an accident.

Its origins are mysterious, perhaps, but not an accident.

Comments
Comments

Gordon Davisson
February 5, 2015, 03:55 PM PST

It's not even always a decision made on the spot -- the programmers working on self-driving cars have to decide in advance how the car should respond in various trolley-dilemma-like situations. From the c|net article "Self-driving car advocates tangle with messy morality":

"Sure, dealing with lane changes, firetrucks and construction projects is difficult for engineers building self-driving cars. But what about deciding which people to kill when an accident is unavoidable? That's the kind of thorny problem that's a real issue for the auto industry as it moves to vehicles that steer, brake and accelerate for themselves. Perhaps because computer-driven cars are so closely compared to human-driven cars, though, people have begun wrestling with those moral issues."

They don't have the luxury of dismissing these calculations as unthinkable -- indeed, I'd consider it immoral for them not to think about the issues.

News
February 6, 2015, 03:51 AM PST

Fair enough, Gordon Davisson at 1, but people who confront these issues are not typically building self-driving cars. They don't even wake up in the morning expecting to confront any moral issues that are new to them. The traffic cop knows he is supposed to spot possible drunk drivers (instead of lounging in a donut shop somewhere) and that he is not to take bribes. But no one considers those maxims to be ethical dilemmas in themselves. He doesn't expect to be asked who should live or die in an artificial "fat man" dilemma.

Of course, people engaged in a highly specialized activity might have to consider dilemmas that would not confront the rest of us. But it is a form of corruption if ethics courses spend a lot of time on exotic dilemmas instead of practical ones. Like, why are the police habitually corrupt in country A but much less so in country B? Why are women much more likely to be murdered by their families or their husband's families in country C than in country D? A family doctor told me recently about how women in a certain culture are hounded to get an abortion if the child is a girl. But in other cultures, it is merely a choice of nursery decor. Why?

These are serious, everyday dilemmas, not artificial ones. Ethics students are underserved if the choice is to discuss artificial dilemmas instead.
