Chapter 1. General remarks. Error as intellectual catastrophe.
This part presents the basic methodology for the analysis, detection and elimination of possible errors in reasoning about global risks — the factors that lead to incorrect estimates and then to incorrect decisions. It is followed by a number of recommendations on how the analysis of global risks should be carried out.
Our reasoning about global risks is subject to various systematic errors and distortions which influence the final conclusions of that reasoning and, hence, our safety. "Errors" is not quite the exact word: in English these are called "cognitive biases", which can be translated as systematic deviations in the course of reasoning. Cognitive biases are a natural property of the human mind, and they carry none of the shade of "fault" that we associate with "errors" from our school days. It is important to know, however, that precisely because cognitive biases are a natural property of the human mind, acquired in the course of evolution, everyone is subject to them and everyone can find them in their own reasoning. The basic way of correcting cognitive biases — experiment — cannot help us in the case of global risks. We must therefore approach the problem of faultless thinking about them much more responsibly. It is also important to remember that any list of cognitive biases is intended for finding errors in one's own thoughts, not for winning disputes with other people; in the second case it will result only in the accumulation of errors in one's own system of beliefs and in closedness to new information.
Even if the contribution of each of several dozen possible errors is small, together they can shift the probability estimate of a given scenario severalfold and lead to a wrong allocation of the means of maintaining safety. It is not difficult to convince oneself of the force of these errors: it is enough to ask several people who know the same facts about human history to give a confident forecast for the 21st century, and you will see how much their final conclusions differ. One will be sure of the inevitable exhaustion of oil, another will believe in the triumph of wind power, a third will expect a world pandemic; one will consider the probability of nuclear weapons being used enormous, another will believe it extremely improbable. The researcher of global risks should know about these reefs. In this section an attempt is made to compile a list of such errors, drawing on the works of foreign and Russian researchers as well as the author's own findings. The basic text on the problem is Eliezer Yudkowsky's article "Cognitive biases potentially affecting judgment of global risks" in the already mentioned collection "Global Catastrophic Risks". The present list does not replace that article, which gives a mathematical and psychological analysis of some of the cognitive biases listed here. However, many descriptions of error factors are taken from other literature or were discovered by the author. The analysis of possible errors in reasoning about global risks is a step on the way to creating a methodology for working with global risks and, therefore, toward their prevention. It is interesting that different groups investigating alternative scenarios of the future each strive to compile their own list of intellectual traps. For example, an article recently appeared about cognitive biases influencing estimates of the theory of "Hubbert's peak", that is, of the exhaustion of oil reserves.
The purpose of this work is to reduce possible cognitive biases to a convenient and structured list. Maximum attention is given to the completeness of the list, rather than to the proof of each separate point.
The list claims neither completeness nor accuracy of classification, and some of its points may turn out to be identical to others, merely worded differently. A detailed explanation of each separate possible error in risk estimation would occupy the whole volume of the article. (See, for example, my article "Natural catastrophes and the Anthropic principle", where one of the possible causes of errors listed here is examined over more than 20 printed pages.)
At the same time it is important to remember that errors in reasoning are prone to the same pathological self-organization as the errors and chains of events which lead to real catastrophes. This means that even small errors, each causing only a small deviation in estimates, tend to hook onto one another and mutually amplify, especially when a positive feedback loop arises between them.
A fallacy is an intellectual catastrophe. Using real accidents as examples, it is easy to trace how erroneous reasoning by aircraft pilots led to catastrophes, and even to specify which errors of reasoning they committed. One can say that almost any catastrophe occurs because of human errors. Chronologically these errors line up as follows: after errors in reasoning about possibilities come errors in design, in "preflight" preparation, in piloting, in managing a critical situation, in eliminating the consequences of the accident, and in analyzing its causes. Our reasoning about global risks mainly concerns the first stage: reasoning about the possibility of various risks and the tentative estimation of their probabilities. There is no sense in building a strategy of opposition to global risks before priorities have been defined. Accordingly, the errors listed in this article concern, first of all, the earliest phase of counteraction to global risks. However, they can also show themselves later, at the stage of designing protective mechanisms and of making concrete decisions. Nevertheless, this text does not pose the problem of analyzing errors at the later stages of protection against global catastrophe, though a number of causes of erroneous actions by "operators" are mentioned.
A separate question is when such errors can happen. Some of these errors occur in the course of discussions in "peacetime", when society decides which risks it should prepare for. Others show themselves fully in emergencies, when people are compelled to quickly estimate the danger and make decisions. Roughly speaking, it is customary to divide all errors into errors of the "designer" and errors of the "pilot". Errors of the "designer" are made by large groups of people over many years, whereas errors of the "pilot" are made by one person or a small group of people within seconds or minutes. Generally speaking, this division may turn out to be incorrect with respect to global catastrophes, if the situation starts developing so quickly that design and operation effectively proceed at the same pace.
There is also a probability that some of the descriptions of errors I give here may themselves turn out to be objects of my incorrect understanding — that is, to be erroneous too. Nor is there the slightest doubt that this list is not complete. Therefore the list should be used rather as a launching pad for the critical analysis of any reasoning about global risks, and not as a tool for pronouncing a definitive diagnosis.
A dangerous illusion is the belief that errors in reasoning about global risks are either insignificant or easily detectable and removable. The roots of this illusion lie in the following reasoning: "Since planes fly despite all possible errors, and life on Earth in general continues, the importance of these errors is insignificant." This analogy is incorrect. Planes fly because, in the course of their evolution, design and testing, thousands of machines crashed. Behind each of those crashes were someone's errors, which were each time taken into account and, on the whole, not repeated. We do not have thousands of planets which we can crash in order to understand how to correctly handle the explosive combination of bio-, nano-, nuclear and AI technologies. Nor can we use the fact that the Earth is still whole to draw any conclusions about the future (see my article "Natural catastrophes and the Anthropic principle"), because no statistical conclusion can be drawn from a single case. And, of course, especially because future technologies will essentially change life on Earth. So we are deprived of the habitual way of eliminating errors — testing. And nevertheless, it is right now, in the history of mankind, that it is most important for us not to be mistaken.
It is possible that there exists a number of cognitive biases and logical paradoxes which show themselves only in reasoning about global risks, which have not yet been discovered by us, but which completely change the whole course of that reasoning. Likewise, I do not wish to say that all researchers commit all the errors listed here. On the contrary, the majority of these errors are probably self-evident to the majority of researchers — or do not seem to be errors at all. However, there is a chance that some errors are being missed.
By the term "cognitive biases" I mean here not only logical violations but any intellectual constructions which can influence final conclusions and increase the risk of global catastrophe. Some of the errors listed may not lead to any consequences in current circumstances; nevertheless, it is useful to keep them in mind.
The possible kinds of errors and cognitive biases are divided into the following groups:
1. Errors possible only in relation to global risks, owing to their specificity.
2. Errors possible in estimating any risks, as applied to global risks.
3. Factors influencing the making of incorrect decisions, which can show themselves in situations of global risk.
4. General logical errors which can show themselves in reasoning about global risks.
5. Specific errors arising in discussions about the danger of uncontrollable development of artificial intelligence (and also specific errors in reasoning about nano-, bio- and other breakthrough and dangerous technologies, including nuclear technologies and astronomy).
Chapter 2. Errors possible only in relation to threats to the existence of mankind
1. Confusion between global catastrophes and merely very big catastrophes
There is a tendency to confuse global catastrophes leading to the extinction of mankind (designated in the English-language literature by the term "existential risks") with any other enormous catastrophes which can bring huge damage, throw a civilization far back and exterminate a considerable part of mankind. The criterion of a global catastrophe is irreversibility. In Russian there is as yet no settled short term for catastrophes leading to the extinction of mankind. (Moiseyev called them civilizational risks, or catastrophes.) I call them global catastrophes. There is also the loan-translation term "existential risks". (For more detail on the definition of global catastrophes and their specificity, see Nick Bostrom's article "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards".) The difference between these two kinds of catastrophes lies not in the number of people lost and the sufferings they experience, but in the future of the planet after them. If even one tribe of 100 humans survives, then in a few thousand years there will again be states, cities and planes on the Earth, and the lost civilization will in some sense be revived from ancient texts. (From the ancient Greeks, by some estimates, only about 1 GB of information has survived, and yet their influence on culture has proved huge.)
An example of this difference is a catastrophe in which all mankind dies out, versus a catastrophe in which everyone dies except several humans who then recreate the human population, like the Old Testament Noah. From the point of view of a separate individual there is no visible difference between these two catastrophes: in either case he will almost certainly perish, and everything of value to him will also be lost. But for mankind as a whole this difference is equivalent to the difference between death and a very serious illness. The difference is that an illness can be long and painful, but it then ends in recovery, whereas death can be easy and instant, but is necessarily irreversible.
2. Underestimation of non-obvious risks
Global risks divide into obvious and non-obvious. Non-obvious risks are in some sense much more dangerous, because their magnitude and probability are unknown, and nothing is being done about them. Some non-obvious risks are known only to a narrow circle of experts, who express opposing opinions about their reality and probability. To a detached onlooker these opinions can look equally well-founded, which forces him to choose between expert opinions either on the basis of personal preferences or by "tossing a coin". However, non-obvious risks carry quite real threat even before the scientific community has definitively settled their parameters. This forces us to pay attention to those fields of knowledge in which there are still many open questions.
As our knowledge of nature and the power of technology have grown, the number of known possible causes of human extinction has constantly grown as well. Moreover, this growth is accelerating. Therefore it is quite reasonable to expect that there are major risks about which we know nothing. And worse still are those risks about which we physically cannot learn anything until they happen.
Besides, obvious risks are much more convenient to analyze. There is a huge volume of data on demography, military potential and stocks of raw materials which can be analyzed in detail. The sheer volume of this analysis can obscure the fact that there are other risks about which we know very little and which do not lend themselves to analysis in numerical form, but which are also mortally dangerous (for example, problems with an incorrectly programmed AI).
It is easy to notice that at the moment an emergency develops — for example, in aviation — the pilots' failure to understand what is happening (especially errors in estimating the altitude and the degree of danger of the process) has the most terrible consequences. On the contrary, when such understanding is present, the plane can often be saved in absolutely improbable conditions. And though the causes of a catastrophe are obvious to us after the fact, for the pilots at that moment they were not obvious at all.
3. Global risks are not identical to national security
Each country spends more money on national security than on global security. Yet global risks represent a bigger threat to each country than national ones, simply because if the whole world perishes, the country perishes together with it. Moreover, actions which increase the safety of a given country at the current stage often reduce overall safety. For example, the security of a certain country increases — at any rate, according to its leadership — when it accumulates stocks of nuclear and bacteriological weapons, but the safety of the whole world falls as a result of the arms race. Or, for example, the problem of the Russian Federation is depopulation, while for the whole world (as for Moscow) it is overpopulation. Another example: a certain American fund runs a project on the prevention of global risks and of terrorist attacks on America. But for us it is clear that the first and the second are not one and the same.
4. The error connected with psychologizing the problem
There has long existed a stereotype of the "doomsday" proponent, the interpreter of the apocalypse, as an individual cast out by society who tries by his ridiculous statements to raise his social importance and thereby compensate for his failures in finance and private life. Regardless of the truth or falsity of such an interpretation of people's psychological motives, it does not affect the degree of risk. Only exact calculations can determine the real weight of a risk. Psychological research has shown that people in a state of depression give more exact predictions of future events than ordinary people, provided the predictions do not concern their own lives.
5. Identification of global catastrophe with the death of all people, and vice versa
The extinction of mankind does not necessarily mean the destruction of all people, and vice versa. It is easy to imagine scenarios in which the greater part of mankind perishes from a certain epidemic, but one island survives and in 200 years restores the human population. However, if all people are infected with a virus that puts the brain into a state of continuous contemplative pleasure, this will mean the end of civilization, although the huge majority of people will still be alive for some time. Or if, in a certain fantastic scenario, aliens conquer the Earth and sell people to space zoos. Moreover, all people living at the present moment, unless a radical means of life extension is invented, will have died out by the beginning of the 22nd century, just as the people who lived in the 19th century have died out by now. But we do not consider this a global catastrophe, because the continuity of mankind is preserved. A true global catastrophe would deprive us of the future.
6. The stereotype of perceiving catastrophes that has formed as a result of the work of the mass media
The mass media create a false image of global catastrophe, which can exert a subconscious influence on our estimates. The experience of watching television reports on catastrophes has formed the stereotype that doomsday will be shown to us on CNN. However, a global catastrophe will affect everyone, and there will be nobody left to watch the reports — or to broadcast them.
In the mass media, risks are regularly given disproportionate coverage. For example, it is interesting to compare the scales of possible damage from global warming and from the mutation of bird flu into a dangerous virus. Without going into the debate, I will say that the real damage often turns out to be disproportionate to its media coverage. Since humans are inclined to unconscious learning, and since the overall number of statements that can be perceived critically is limited, these ideas create a certain information background for any reasoning about global risks (on a par with cinema and science fiction).
7. Possible errors connected with the fact that a global catastrophe has never yet happened to us
Rejection of a certain scenario as fantastic — yet a global catastrophe cannot be anything other than a "fantastic" event.
An error can also arise from failing to realize that no event can be identified as a global catastrophe in advance, or even while it is happening, but only after the fact. Possibly nobody will ever learn that it actually was a global catastrophe. A catastrophe becomes global only after the death of the last human. (However, in scenarios of slow extinction people may realize it — or may be mistaken on this point. A possible example of such a scenario is described in Nevil Shute's novel "On the Beach", where people slowly die out from the consequences of radioactive pollution.)
Inapplicability of the logical operation of induction to reasoning about global catastrophes. Induction as a logical method consists in the assumption that if a certain statement is true at moments 1, 2, 3 … N, then it is also true at N+1 (or for all N). It does not possess absolute logical reliability, but gives good results at very large N and in smooth conditions. For example, all physical laws are based on a finite number of experiments, that is, they are the result of induction.
Induction as a logical method has limits of applicability. It is inapplicable in situations where the future is not similar to the past. In other words, we cannot conclude, on the basis that something has always been so in the past, that it will be so in the future. It is especially dangerous to apply induction in reasoning of the type: since this catastrophe has never happened in the past, it will never happen in the future. (However, induction as a logical procedure is applicable in safety issues: from the point of view of maintaining safety, a threefold repetition of a dangerous event is very significant, whereas from the point of view of proving the validity of a certain law it is not.)
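How little a clean track record actually proves can be put into numbers. The following sketch (my own illustration, not from the source text) uses Laplace's rule of succession: after N "successes" out of N trials under a uniform prior, the probability that the next trial also succeeds is only (N+1)/(N+2). The "60 years" figure below is purely illustrative, not a measurement, and the calculation ignores observation selection effects — survivors are the only ones who get to count their survivals — which makes induction from our own survival even weaker than this suggests.

```python
from fractions import Fraction

def rule_of_succession(successes: int, trials: int) -> Fraction:
    """Laplace's rule of succession: posterior probability that the next
    trial succeeds, given `successes` out of `trials` so far, under a
    uniform prior on the unknown success rate."""
    return Fraction(successes + 1, trials + 2)

# Illustrative: treat each year of survival since the appearance of
# nuclear weapons as one "trial", roughly 60 trials with no catastrophe.
p_survive_next_year = rule_of_succession(60, 60)   # 61/62
p_catastrophe_next_year = 1 - p_survive_next_year  # 1/62, about 1.6%
print(p_catastrophe_next_year)
```

Even a perfect 60-year record thus caps our inductive confidence at roughly 98% per year — far from the reliability one would want for an irreversible outcome.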
8. The cognitive bias consisting in the fact that reflections on global risks automatically activate a certain archetype of "the savior of the world"
The danger of competition between different groups of people defending different models of saving the world is thereby underestimated. After all, every world religion is engaged in saving all of mankind, and the others only get in its way. So the struggle of the saviors of the world among themselves can itself threaten life on Earth. One can recall a joke from Soviet times: "There will be no war, but there will be such a struggle for peace that not a stone will be left standing."
9. Underestimation of global risks because of psychological mechanisms for ignoring thoughts of one's own death
People are not stirred by global risks because they have long since become accustomed to the inevitability of personal death within the next decades and have developed stable psychological defense mechanisms against these thoughts. The longest term of real planning (as distinct from speculative fantasies) can be determined from people's long-term real investments. A typical reflection of such planning is the purchase of a house on a mortgage, pension savings and the education of children: the time horizon of these projects is 30 years, with rare exceptions, and usually less than 20. It is not a fact, however, that such planning is actually effective; and most people know that life is much more unpredictable. In any case, each human has a certain event horizon, and events beyond this horizon are of purely speculative interest to him — and after all, the majority of people believe that global risks are many decades away from us.
10. Errors connected with the fact that whoever investigates global catastrophes as a whole is compelled to rely on the opinions of experts in different areas of knowledge
It often turns out that on any given problem there is a set of opinions that look equally well-founded. A. P. Chekhov wrote: "If many remedies are offered for an illness, it means the illness is incurable." Owing to this, the researcher of global risks has to be an expert in the correct selection and comparison of expert opinions. As this is not always possible, there is always a probability of a wrong choice of the pool of experts and a wrong understanding of their results.
11. The error connected with the fact that global risks as a whole are given less attention than risks of catastrophe to separate objects
Global risks should be estimated on the same scale as the risks of all the other objects making up a civilization. For example, there is no sense in building a failure probability of one in a million into a plane if the whole civilization, with its multitude of planes, has lower reliability than that.
12. The error connected with the fact that a risk acceptable to one person or project is extended to all mankind
Ideas of the type "Mankind should risk one hundredth of a percent for the sake of this new extraordinary result" are vicious, because many researchers and designers can reason this way simultaneously, each of them at the same time overestimating the safety of his own project, which in sum can give a very high risk.
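The arithmetic behind this point can be sketched in a few lines. Assuming, for illustration, that the projects are independent and each carries the same small risk p, the chance that at least one of n such projects ends in catastrophe is 1 - (1-p)^n, which grows far faster than intuition suggests; the figure of a thousand projects below is hypothetical.

```python
def cumulative_risk(per_project_risk: float, n_projects: int) -> float:
    """Probability that at least one of n independent projects causes
    the catastrophe, each carrying the same small risk."""
    return 1.0 - (1.0 - per_project_risk) ** n_projects

p = 0.0001  # one hundredth of a percent, "acceptable" to one project
print(cumulative_risk(p, 1))     # 0.0001 — seems negligible
print(cumulative_risk(p, 1000))  # about 0.095 — nearly a 10% risk
```

A risk that looks negligible from inside any one project is thus anything but negligible once everyone applies the same standard.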
13. Absence of a clear understanding of whom warnings about global risks are addressed to
Are they addressed to citizens, who in any case cannot do anything; to the civic responsibility of scientists, whose existence still has to be proved; to the governments of the large world powers or to the United Nations, which are occupied with their own affairs; or to the commissions and funds specially aimed at the prevention of global risks, whose ability to influence the situation is unknown? The absence of a systematic register of all risks, with which everyone would agree, is also depressing.
14. The peculiar connection between the theoretical and the practical with respect to global risks
The question of global risks is theoretical, since no such event has yet occurred. And we do not wish to test any possibility experimentally. Moreover, we are not able to, because we, the researchers, will not survive a global catastrophe. Yet we must take practical measures so that it does not happen. In doing so we can observe a positive result — namely, that a certain risk has not happened — but it is difficult to establish the reasons why it has not occurred. It is impossible to say why there has been no thermonuclear war: because it was impossible, because we have been improbably lucky, or because it is the result of the struggle for peace.