2016-05-24

Finding the Cure - Thought Experiment - Part 3 of 3

Part 3

The two situations, as described in Parts 1 & 2, are identical.

Were your choices consistent? If you chose A in Part 1 and A' in Part 2 (or B in Part 1 and B' in Part 2), then your choices were consistent. The Part 1 decision and the Part 2 decision are identical, just worded differently.


Most of the human race would choose inconsistently. Don't be surprised if you are among them.

The expected number of survivors is the same in A, A', B, and B': 200. The expected number of deaths is likewise the same: 400 in each case.

A and A' have a single possible outcome: 200 live and 400 die. B and B' present two probabilistic outcomes: a chance (one in three, given the expected values above) that all of them will live, and a larger chance (two in three) that all of them will die.
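If you want to check that arithmetic, here is a minimal sketch in Python. It assumes 600 people at risk and a gamble that saves everyone with probability 1/3 and no one with probability 2/3; those are the only figures consistent with the expected values stated above, and they match Kahneman's version of the problem.

    # Expected-value arithmetic for the two framings.
    # Assumption: 600 people at risk; the gamble saves all of them with
    # probability 1/3 and none of them otherwise, the only probabilities
    # consistent with an expected 200 survivors out of 600.

    total_at_risk = 600

    # A / A': 200 live with certainty.
    survivors_sure = 200

    # B / B': all live with probability 1/3, none live with probability 2/3.
    p_all_live = 1 / 3
    expected_survivors_gamble = p_all_live * total_at_risk + (1 - p_all_live) * 0

    print(survivors_sure, total_at_risk - survivors_sure)                        # 200 400
    print(expected_survivors_gamble, total_at_risk - expected_survivors_gamble)  # 200.0 400.0

Either way, the expected result is 200 survivors and 400 deaths; the only difference is whether that result is certain or a gamble.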

If you chose A and A', then, in effect, you chose for 200 people to live and 400 to die, with certainty.

According to Kahneman in "Thinking, Fast and Slow", this is an example of the effect of "framing". Presented with Part 1, which frames the problem in terms of surviving (surviving is "good"), most people will choose A. The same people, then presented with Part 2, which frames the problem in terms of dying (dying is "bad"), will choose B' and take the long odds of saving everybody. This tendency holds true for public-health officers, MBA students, cab drivers, everybody. (The response also holds true when separate samples of respondents are asked to decide on Parts 1 and 2.)

Kahneman's explanation is that our minds resolve decisions "between gambles and sure things differently, depending on whether the outcomes are good or bad. Decision makers tend to prefer the sure thing over the gamble (they are risk averse) when the outcomes are good. They tend to reject the sure thing and accept the gamble (they are risk seeking) when both outcomes are negative."

This dismays us. How could people who make even the most important decisions be swayed by such a superficial manipulation? But this is how human minds work. 


Our best strategy: remain aware of this human inconsistency, accept it, and cope with it.

-------------

Looking for Part 2? Click Here.
Looking for Part 1? Click Here.

References:
Daniel Kahneman presented this thought experiment in his book "Thinking, Fast and Slow" http://smile.amazon.com/dp/0374533555
Photo: San Francisco Department of Public Health http://www.sfcdcp.org

1 comment:

  1. The assumption in all of these ethical scenarios, like the ones about pulling a switch to re-route a train about to kill 10 to another route that would only kill your best friend, etc., is that lives are ultimately and equally valuable on an individual basis. However, if you assume that human life in general is ultimately valuable and accept the results from science that populations (forests, wildlife, etc.) need to be thinned in order for the species to survive, then it is easy but uncomfortable to conclude that letting 200 people die is preferable to trying to save 600 people. This strays from the other assumption in this case – "you are a surgeon" – which immediately throws you into the first assumption, i.e. saving individuals. But philosophers, as I am, and policy-makers in the future are going to have to start thinking about the survival of the human species on this planet instead of saving every human life. Overpopulation is the root cause of most of our problems on the planet – global warming, water shortages, famines, traffic, urban sprawl, coyotes in your backyard, etc. International agencies which tout and defend "reproductive rights" as an absolute human right are part of the problem.
