with a doctor.30
Where previously Hall would fly into a rage, he now does not. In triggering situations he can feel his anger increasing, but at some point the device kicks in and the anger diminishes. Often, but not always, he notices when the device kicks in. The device also has a display that allows him to monitor its activity. However, Hall is free to employ other emotion-modulation mechanisms and is learning to regulate his emotions and actions earlier on: by calming himself down, by reappraising when he notices his anger rising, by noticing triggers and exiting the triggering situation, and so on. Sometimes he is successful in regulating his emotions and averting behavioural impacts; sometimes the device kicks in. He can practice self-control with the assurance that he will not harm anyone in the process, as there is a ‘safety net’ if his self-control fails.
This scenario appears to me significantly less problematic than the God Machine scenario. Overdetermination undermining a person’s ability to do otherwise does not seem to be the main concern here – the difference between the two cases does not lie in the ability to do otherwise in any particular situation. Moreover, in this case too the action of the device is direct, in the sense of acting directly on the brain.
The first difference lies in the fact that the influence of the brain implant is not beyond moral review, in the sense of offline reflection and control, although it is beyond the agent’s control at the moment of acting. The second difference is that it aids the agent in making the connection between her all-things-considered judgement and her actions, although this happens in a somewhat more roundabout way than if Hall exercised self-control. The third difference (which I would like to highlight here) is that the agent’s will is not heteronomous to the same extent as in the God Machine scenario. Thus, it is not overdetermination (and the resulting inability to do otherwise) that is the main issue here.

30 This condition found its way into the example partly to accommodate a concern raised by Harris in the context of the God Machine, related to ‘agreeing to enslave oneself,’ i.e., agreeing to a diminishment of freedom that cannot be revoked, and partly with an eye on the current practice of DBS in medical contexts, in which the device can be switched off at any time by the user. The criminal justice-related and impulsivity-related uses would likely offer the ability to switch the device off, but not at any point.