See also Social Proof (ch. 4); Social Loafing (ch. 33); In-Group Out-Group Bias (ch. 79);
Planning Fallacy (ch. 91)
26
WHY YOU’LL SOON BE PLAYING MEGATRILLIONS
Neglect of Probability
Two games of chance: in the first, you can win $10 million, and in the second,
$10,000. Which do you play? If you win the first game, it changes your life
completely: you can quit your job, tell your boss where to go and live off the
winnings. If you hit the jackpot in the second game, you can take a nice vacation
in the Caribbean, but you’ll be back at your desk soon enough to see your
in the Caribbean, but you’ll be back at your desk quick enough to see your
postcard arrive. The probability of winning is one in 100 million in the first game,
and one in 10,000 in the second game. So which do you choose?
Our emotions draw us to the first game, even though the second is ten times
better, objectively considered (the expected value: the potential win multiplied by
its probability). Hence the trend towards ever-larger jackpots – Mega Millions,
Mega Billions, Mega Trillions – no matter how small the odds become.
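A quick calculation confirms this: the expected value of the first game is $10
million multiplied by a one-in-100-million chance – just 10 cents. The expected
value of the second is $10,000 multiplied by one in 10,000 – a full dollar, ten
times as much.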
In a classic experiment from 1972, participants were divided into two groups.
The members of the first group were told that they would receive a small electric
shock. In the second group, subjects were told that the risk of this happening was
only 50%. The researchers measured physical anxiety (heart rate, nervousness,
sweating, etc.) shortly before commencing. The results were, well, shocking: there
was absolutely no difference. Participants in both groups were equally stressed.
Next, the researchers announced a series of reductions in the probability of a
shock for the second group: from 50% to 20%, then 10%, then 5%. The result: still
no difference! However, when they declared they would increase the strength of
the expected current, both groups’ anxiety levels rose – again, by the same
degree. This illustrates that we respond to the expected magnitude of an event
(the size of the jackpot or the amount of electricity), but not to its likelihood. In
other words: we lack an intuitive grasp of probability.
The proper term for this is neglect of probability, and it leads to errors in
decision-making. We invest in start-ups because the potential profit makes dollar
signs flash before our eyes, but we forget (or are too lazy) to investigate the slim
chances of new businesses actually achieving such growth. Similarly, following
extensive media coverage of a plane crash, we cancel flights without really
considering the minuscule probability of crashing (which, of course, remains the
same before and after such a disaster). Many amateur investors compare their
investments solely on the basis of yield. For them, Google shares with a return of
20% must be twice as good as property that returns 10%. That’s wrong. It would
be a lot smarter to also consider both investments’ risks. But then again, we have
no natural feel for this, so we often turn a blind eye to it.
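A back-of-the-envelope example (with invented numbers) shows why risk
matters: if the shares’ 20% return came with, say, a one-in-five chance of losing
everything, your expected outcome would be 0.8 × 120% = 96% of your stake –
worse than a safe 10%, which leaves you with 110%.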
Back to the experiment with the electric shocks: in group B, the probability of
getting a jolt was further reduced: from 5% to 4% to 3%. Only when the probability
reached zero did group B respond differently to group A. To us, 0% risk seems
infinitely better than a (highly improbable) 1% risk.
To illustrate this, let’s examine two methods of treating drinking water. Suppose a
river has two equally large tributaries. One is treated using method A, which
reduces the risk of dying from contaminated water from 5% to 2%. The other is
treated using method B, which reduces the risk from 1% to 0%, i.e. the threat is
completely eliminated. So, method A or B? If you think like most people, you will
opt for method B – which is silly, because with method A, three percentage
points fewer people die (the risk falls from 5% to 2%), whereas with B just one
percentage point does (from 1% to 0%). Method A is three times as good! This
fallacy is called the zero-risk bias.
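Put in absolute numbers (assuming, purely for illustration, that 1,000 people
drink from each tributary): method A saves thirty lives, method B only ten.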
A classic example of this is the Delaney Clause of the 1958 U.S. Food Additives
Amendment, which prohibits food additives that contain cancer-causing
substances. Instituted to achieve zero risk of cancer, this
ban sounds good at first, but it ended up leading to the use of more dangerous
(but non-carcinogenic) food additives. It is also absurd: as Paracelsus illustrated
in the sixteenth century, poisoning is always a question of dosage. Furthermore,
this law can never be enforced properly since it is impossible to remove the last
‘banned’ molecule from food. Each farm would have to function like a
hyper-sterile computer-chip factory, and the cost of food would increase a
hundredfold.
Economically, zero risk rarely makes sense. One exception is when the
consequences are colossal, such as a deadly, highly contagious virus escaping
from a biotech laboratory.
We have no intuitive grasp of risk and thus distinguish poorly between different
threats. The more serious the threat and the more emotional the topic (such as
radioactivity), the less reassuring a reduction in risk seems to us. Two
researchers at the University of Chicago have shown that people fear a 1%
chance of contamination by toxic chemicals just as much as a 99% chance. An
irrational response, but a common one.