went too far.
A slave is never supposed to question his master, but most of us
can think of times when we questioned and revised our first
intuitive judgment. The rider-and-elephant metaphor works well
here. The rider evolved to serve the elephant, but it’s a dignified
partnership, more like a lawyer serving a client than a slave serving
a master. Good lawyers do what they can to help their clients, but
they sometimes refuse to go along with requests. Perhaps the
request is impossible (such as finding a reason to condemn Dan, the
student council president—at least for most of the people in my
hypnosis experiment). Perhaps the request is self-destructive (as
when the elephant wants a third piece of cake, and the rider refuses
to go along and find an excuse). The elephant is far more powerful
than the rider, but it is not an absolute dictator.
When does the elephant listen to reason? The main way that we
change our minds on moral issues is by interacting with other
people. We are terrible at seeking evidence that challenges our own
beliefs, but other people do us this favor, just as we are quite good
at finding errors in other people’s beliefs. When discussions are
hostile, the odds of change are slight. The elephant leans away from
the opponent, and the rider works frantically to rebut the
opponent’s charges.
But if there is affection, admiration, or a desire to please the other
person, then the elephant leans toward that person and the rider
tries to find the truth in the other person’s arguments. The elephant
may not often change its direction in response to objections from its
own rider, but it is easily steered by the mere presence of friendly
elephants (that’s the social persuasion link in the social intuitionist
model) or by good arguments given to it by the riders of those
friendly elephants (that’s the reasoned persuasion link).
There are even times when we change our minds on our own,
with no help from other people. Sometimes we have conflicting
intuitions about something, as many people do about abortion and
other controversial issues. Depending on which victim, which
argument, or which friend you are thinking about at a given
moment, your judgment may flip back and forth as if you were looking at a Necker cube (figure 3.1).
FIGURE 3.1. A Necker cube, which your visual system can read in two conflicting ways, although not at the same time. Similarly, some moral dilemmas can be read by your righteous mind in two conflicting ways, but it’s hard to feel both intuitions at the same time.
And finally, it is possible for people simply to reason their way to
a moral conclusion that contradicts their initial intuitive judgment,
although I believe this process is rare. I know of only one study that
has demonstrated this overruling experimentally, and its findings
are revealing.
Joe Paxton and Josh Greene asked Harvard students to judge the
story about Julie and Mark that I told you in chapter 2.⁴⁷ They
supplied half of the subjects with a really bad argument to justify
consensual incest (“If Julie and Mark make love, then there is more
love in the world”). They gave the other half a stronger supporting
argument (about how the aversion to incest is really caused by an
ancient evolutionary adaptation for avoiding birth defects in a world
without contraception, but because Julie and Mark use
contraception, that concern is not relevant). You’d think that
Harvard students would be more persuaded by a good reason than a
bad reason, but it made no difference. The elephant leaned as soon
as subjects heard the story. The rider then found a way to rebut the
argument (good or bad), and subjects condemned the story equally
in both cases.
But Paxton and Greene added a twist to the experiment: some
subjects were not allowed to respond right away. The computer
forced them to wait for two minutes before they could declare their
judgment about Julie and Mark. For these subjects the elephant
leaned, but quick affective flashes don’t last for two minutes. While
the subject was sitting there staring at the screen, the lean
diminished and the rider had the time and freedom to think about
the supporting argument. People who were forced to reflect on the
weak argument still ended up condemning Julie and Mark—slightly
more than people who got to answer immediately. But people who
were forced to reflect on the good argument for two minutes
actually did become substantially more tolerant toward Julie and
Mark’s decision to have sex. The delay allowed the rider to think for
himself and to decide upon a judgment that for many subjects was
contrary to the elephant’s initial inclination.
In other words, under normal circumstances the rider takes its cue
from the elephant, just as a lawyer takes instructions from a client.
But if you force the two to sit around and chat for a few minutes,
the elephant actually opens up to advice from the rider and
arguments from outside sources. Intuitions come first, and under
normal circumstances they cause us to engage in socially strategic
reasoning, but there are ways to make the relationship more of a
two-way street.
IN SUM
The first principle of moral psychology is Intuitions come first,