The Rational Persuader Machine would never influence behaviour or intentions directly; instead, it would create circumstances under which a person would attend to previously ignored considerations, acquire a wider outlook, learn something about the consequences of their actions, or attend to reasons they already had but to which they had not given sufficient weight. Through a network of hired staff and changes to inanimate objects, it would create the circumstances in which a person would change her mind. Sometimes it would be a brick falling close by that would make her reflect on how short life is. Sometimes it would be an anonymous email with just the right content, or a hologram of a street preacher saying just the right words.
The Machine is never a sophist: a constraint placed on its action is that it must increase the person’s ability to make an all-things-considered judgement (or provide the relevant information for doing so).
Since the Rational Persuader Machine was monitoring the thoughts, beliefs and desires of everyone and had run simulations, every intervention would be effective. As before, it would only intervene once the threshold of harm to others’ interests was reached. Murder of innocent people no longer occurred, and everyone was given the best chance to (inevitably) change their minds.
It appears to me that interference from the Rational Persuader Machine, although still somewhat troubling (and shortly we shall explore some of the possible reasons why), would be significantly less problematic than that of the God Machine. The main difference is that there is no direct change of mental phenomena leading to a change of second-order volition and that, consequently, the scenario allows agents to engage with the input in the ‘right way’, i.e., the agent develops a will on her own on the basis of reasons.
One could object by pointing out that the Rational Persuader Machine is indeed
troubling because it is an instance of manipulation; although valid reasons or
nudges to consider reasons are given, they might be interpreted as tendentious and
instrumental in achieving a certain behavioural result. Such a scenario would be
similar to propaganda, or to living in a society with tendentious media reporting. I
admit that this is a valid consideration, yet my basic point stands: the difference
between the Rational Persuader and the God Machine signals the importance of the
agent’s engagement with reasons.
Moreover, this scenario could be conceived of as a high-tech version of a well-known story. In Dickens’ A Christmas Carol, Ebenezer Scrooge is visited by four ghosts: Marley, and the Ghosts of Christmas Past, Present and Yet to Come. Shown images of the past, present and future and warned by the ghosts, Scrooge ‘changes his mind’ and re-shapes his life. Although the ghosts’ intervention could be seen as aimed at a desired outcome, and the images brought to Scrooge’s attention as tendentious, ultimately Scrooge seems to take into account more of the morally relevant considerations than previously. On this reading, his ability to make an all-things-considered judgement has increased, even if he remains an imperfect moral agent.[26]
The potentially relevant difference between the Scrooge story and the Rational Persuader Machine scenario is that in the latter case the agent does not know that another will is involved in shaping the information they can access – that there is a purposeful activity shaping their environment. Since the intervention is done in order to achieve a certain outcome, the whole exercise may be seen as manipulation.
Moreover, since the person is unaware of the intervention, they cannot take that factor into account. I think that this intuition arises from the fact that in our everyday context we have good reasons to distrust information from people who have a vested interest in our doing a certain thing – often, the information may be misleading or unreliable. Yet in our rather removed-from-the-real-world scenario, the constraint on the Machine’s interference is that the agent’s ability to make an all-things-considered judgement increases. Thus, this last objection turns out to be weak.

[26] For the purpose of this argument it is sufficient to contend that Scrooge is somewhat better at making an all-things-considered judgement from a moral perspective, despite the questions about his reasons for adopting the moral perspective and how good a moral agent he is after the ghosts’ appearance.
As a result, I think that it is unimportant that the Rational Persuader Machine provides necessarily compelling reasons, as long as the agent is able to engage with them in the right way. Moreover, although for both machines the outcome is known from the start, the fact that the Rational Persuader Machine is less problematic than the God Machine signals that freedom of action is not the only, and perhaps not even the main, consideration. It appears that it is not the ability to do otherwise – in the sense of having possible futures in which one commits a gross immorality – that is of utmost importance. Rather, it is the appropriate engagement of the agent with reasons that is important. This is not a novel point (e.g. DeGrazia, 2014; Harris, 2014a), and put this way it may seem uncontroversial. Compatibilist accounts of free will often emphasise appropriate engagement with moral reasons: what matters is that there is an appropriate causal connection between the agent’s actions and preferences. A reason-responsive mechanism guiding action seems to be an important component of such accounts (for more details see section 7.7.1). Consequently, if necessarily compelling reasons are given, this does not necessarily mean that freedom of the will is diminished, even if we could not have done otherwise on the basis of the reasons given.
Here we can note a certain asymmetry in the interpretation of the God Machine and the Rational Persuader Machine. Savulescu and Persson interpret the meaning of the God Machine’s intervention as follows:
‘We have argued that there might be interventions, such as the
God Machine, that do indeed produce more moral behaviour that
do control the moral agent, subjugating that person to the will of
another and removing the freedom to act immorally. Such
interventions and such control are not plausibly moral
enhancements of that person – they rather undermine autonomy
by substituting moral for immoral intentions.’ (2012a, p. 417)
While in the God Machine scenario the changes seem to undermine free will and free action (as interpreted by various commentators, e.g. Harris, 2014a; Savulescu and Persson, 2012a), this does not seem to apply to the Rational Persuader scenario, despite the fact that the agent cannot do otherwise. For the purpose of the outcome of action, the agent could just as well be locked in Locke’s room.[27] This does not, however, seem to be as problematic in the Rational Persuader Machine scenario. As a result, it seems that the crucial factor in the assessment of the God Machine is not the undermining of freedom of the will or free action – the main problem is the inability to form a will of