Sunday, 13 May 2012

Possibility Effect and Understanding of Numbers

A few months back, there was a sudden upsurge of protests against the nuclear power project in Kudankulam, India. In the little reading I did on these protests, one of the primary concerns was the safety of the nuclear power plant and the direct and indirect casualties it could trigger. In addition, the Bhopal disaster, probably the worst industrial accident in the history of the country, and the way its victims were handled, had created animosity towards projects like these.

In cases like Kudankulam, the decision to oppose a project is usually based on the casualties caused by similar projects in the past. For example, in the case of the Union Carbide disaster, according to this CNN article the death toll was about 33,000. This Wikipedia entry states that the leak caused about 558,125 injuries. Similarly, this entry from Wikipedia indicates that the approximate number of deaths alone in nuclear accidents around the world is about 5,000, and then there are prolonged side effects on top of that. These numbers seem quite high. But do the protesters actually take account of these numbers?

Now to a different event. This article in 'The Lancet' indicates that about 7.6 million children below the age of five died in 2010, of which about 64% of deaths are attributed to infectious causes, including pneumonia and diarrhea, which are largely preventable. This is the worldwide estimate. Of these 7.6 million, 1.682 million were from India alone, including about 0.397 million deaths from pneumonia and 0.212 million from diarrhea, which are infectious but, again, almost entirely preventable.

Assuming that the above numbers on deaths from nuclear accidents and on child mortality are true (at least proportionally), it is very clear that infectious diseases cause far more deaths (among children below the age of five) than nuclear accidents do. The gap widens further once we account for frequency: unless we start working towards better healthcare, the death rate among these children will stay more or less the same every year, whereas a reactor accident does not necessarily happen every year.
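To make the comparison concrete, here is a rough back-of-the-envelope sketch in Python. The death figures are the ones quoted above; the 60-year span used to annualize nuclear-accident deaths is my own illustrative assumption, not a sourced statistic:

```python
# Back-of-the-envelope comparison of annual deaths from the two causes.
# Figures are the ones quoted in the post; the 60-year span is an
# illustrative ASSUMPTION for annualizing accident deaths.

NUCLEAR_DEATHS_TOTAL = 5_000      # approx. deaths from nuclear accidents worldwide (Wikipedia figure above)
NUCLEAR_ERA_YEARS = 60            # ASSUMED span of the civilian nuclear era

PNEUMONIA_DEATHS_INDIA = 397_000  # under-five deaths, India, 2010 (The Lancet figure above)
DIARRHEA_DEATHS_INDIA = 212_000   # under-five deaths, India, 2010 (The Lancet figure above)

nuclear_per_year = NUCLEAR_DEATHS_TOTAL / NUCLEAR_ERA_YEARS
infectious_per_year = PNEUMONIA_DEATHS_INDIA + DIARRHEA_DEATHS_INDIA

print(f"Nuclear accidents, annualized: ~{nuclear_per_year:,.0f} deaths/year")
print(f"Pneumonia + diarrhea (India, under five): {infectious_per_year:,} deaths/year")
print(f"Ratio: roughly {infectious_per_year / nuclear_per_year:,.0f} to 1")
```

Even if the assumed span is off by a factor of two in either direction, the ratio stays in the thousands, which is the only point the sketch is meant to make.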

So, ideally, our priority in saving human lives should focus more on the events with the higher death rate. But we seldom see protests against causes like infant mortality, which has the far higher toll. Even when such protests happen, they don't always get the required attention. Rather, we tend to focus our time and energy on relatively improbable events. This inconsistency, I think, is an example of the Possibility Effect:


"The decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle. Improbable outcomes are overweighted - this is the possibility effect. Outcomes that are almost certain are underweighted relative to actual certainty."
- Daniel Kahneman, Thinking, Fast and Slow, p. 312
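Kahneman's claim can be put in numbers. Prospect theory models decision weights with a probability-weighting function; the functional form and the gamma of about 0.61 below are Tversky and Kahneman's 1992 estimates for gains, used here purely as an illustration:

```python
# Probability-weighting function from cumulative prospect theory
# (Tversky & Kahneman, 1992). Small probabilities are weighted above
# their true value; near-certain ones below it.

def decision_weight(p: float, gamma: float = 0.61) -> float:
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.01, 0.10, 0.90, 0.99):
    print(f"p = {p:.2f}  ->  decision weight = {decision_weight(p):.3f}")
```

A 1% chance is felt as if it were roughly 5%, while a 99% chance is felt as less than 95%: exactly the overweighting and underweighting the quote describes.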


The Possibility Effect might seem harmless when it is associated with events like gambles and lotteries. But when it shapes our attitude towards serious, life-threatening events, we can see how it distorts our decision making.

In addition, our poor grasp of the numbers behind these events affects our decisions. If these protesters were aware of the actual numbers, would the protests have the same intensity, or would they still work towards the same cause?

For example, suppose the question 'Which of these events should be prevented first? Event A kills 1 lakh (100,000) people, occurring perhaps once in a few years. Event B kills 10 lakh (1,000,000) people almost every year' is posed to the protesters. A reasonable person would want to prevent Event B and work towards it. But if we pose the question without numbers - 'Which would you want to prevent: deaths from infectious disease, or a nuclear disaster?' - do we still get the same answer?
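The trade-off in that question is a one-line expected-value calculation. Here is a sketch; 'once in a few years' is taken as once in five years, an assumption made only to put a number on it:

```python
# Expected annual deaths for the two hypothetical events in the question.
LAKH = 100_000                  # 1 lakh = 100,000 (Indian numbering system)

event_a_deaths = 1 * LAKH       # deaths per occurrence of Event A
event_a_freq = 1 / 5            # ASSUMED: once in five years

event_b_deaths = 10 * LAKH      # deaths per occurrence of Event B
event_b_freq = 1.0              # almost every year

expected_a = event_a_deaths * event_a_freq
expected_b = event_b_deaths * event_b_freq

print(f"Event A: {expected_a:,.0f} expected deaths/year")
print(f"Event B: {expected_b:,.0f} expected deaths/year")
```

On expected deaths alone, Event B is fifty times worse, yet the vivid, improbable event is the one that draws the protests.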

The possibility of a nuclear accident, even though its toll is comparatively small, looms large and attracts more attention and support for eliminating it, while a larger and near-certain outcome like infant mortality is more or less ignored*. So, ideally, we should base our decisions on the probability of an event, its outcome, and the actual numbers, not just on an ideology or on the emotional statements of policy makers and protesters. This would give better value for people's money and time, and might have a better impact on society itself.



* This argument is based only on the numbers available on the internet. If those numbers turn out to be incorrect, or eventually change in a way that reverses the conclusion, then so should our decisions.

4 comments:

Deepan said...

interesting view da.. Even I too would choose option B if you put that question to me. I didn't go through the plethora of details about the Koodankulam protests. But from what I have read, most of the common protesters are misguided in various ways. They obviously won't look into the broad picture of other issues taking more lives than the one they are fighting for, unless it happens in their vicinity. The same people fighting against Koodankulam would fight against disease prevention if 1 million out of that 7.6 million happened in their district/zone alone. It's just normal human character that he/she won't care about an issue unless it bites his/her own ass.

Naanthaanga said...

True da. We often don't see as much as we are supposed to see. Kahneman says this is because one system of our brain makes decisions mostly based on what is in its immediate vicinity. It often cannot see the whole picture.

The system of the brain that has the ability to step back, see the big picture and do complex calculations is generally lazy. So it often lets the first system decide, and hence we get misled by our own brain. "Memory can be treacherous" is not just a statement.

He calls this "WYSIATI" - What You See Is All There Is. This is one of the primary reasons for our bias while making decisions.

Try reading Thinking, Fast and Slow. It will give you a completely different perspective on how we think and why we think the way we do.

Anonymous said...

just read the reviews of that book on Amazon. sounds interesting! Added it to my reading queue.. hope I will pop it out soon :)

-Deepan