Preface
Existential risk – One where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.
N. Bostrom, «Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards».
If in the 20th century the possibility of the extinction of mankind was connected first of all with the threat of global nuclear war, now, at the beginning of the 21st century, we can easily name more than ten distinct sources of possible irreversible global catastrophe, most of them new technologies, and the number of such sources of risk is constantly growing. Research on this question strangely lags behind many other, smaller questions, as can be seen simply from the number of scientific works on the theme. (Perhaps this is a feature of human nature: Dale Carnegie once complained that libraries hold many books about worms but none about worry, a much more important subject.) Problems such as the exhaustion of oil, the future of the Chinese economy or the exploration of outer space attract much more attention than irreversible global catastrophes, and studies that compare different kinds of global catastrophe with one another are rarer than discussions of separate risks. Yet it seems senseless to discuss the future of human civilisation before obtaining a reasonable estimate of its chances of survival. Even if such research were to show that the risk is negligibly small, the question would still be worth studying. Unfortunately, I must say at once that we will not obtain such an encouraging result: Sir Martin Rees puts the chances of mankind surviving the 21st century at no better than 50:50, and I think that estimate is well founded.
The book offered to the reader, «The Structure of the Global Catastrophe», is devoted to a theme little covered in Russian literature: a systematic review of "threats to existence", that is, of risks of the irreversible destruction of all human civilisation and the extinction of mankind. The purpose of this book is to give a broad and as plausible as possible overview of the theme. At the same time the book is debatable in character: it is meant not to give definitive answers, but to provoke the reader's thought and to create ground for further discussion. Many of the hypotheses stated here may seem unduly radical. In discussing them, however, I was guided by the "precautionary principle", which recommends considering the worst realistic scenarios where safety is concerned. My criterion of the realism of a scenario is that it could arise if the present rate of technological development is preserved through the 21st century, and that it does not violate the known laws of physics.
Research into the character of global threats and estimates of their probability are needed in order to determine how great the risk is and what measures must be taken to mitigate it. And although possible preventive measures are discussed in this book, it contains no universal recipe for escaping global risks. I would not, however, wish to leave the reader with a feeling that destruction is inevitable. I believe that despite the difficulties and risks which mankind will face in the 21st century, people have a chance to survive and, moreover, to build a more perfect world. But the preservation of mankind is a necessary condition for any perfect world. Besides, this book does not discuss the question of what a perfect world might look like, one in which new technologies are used for good instead of destruction.
In this volume you will find my monograph «The Structure of the Global Catastrophe», and also, in the appendix, three articles by other authors whose themes are necessary for a clearer understanding of the subject. The monograph consists of two large parts: the actual research of risks and the methodology of their analysis. The analysis of concrete threats in the first part consists of a list of them, as detailed as possible, with references to sources and a critical examination. The systemic effects of the interaction of different risks are then investigated, followed by a discussion of ways of estimating the probability of global risks and other related questions. The methodology offered in the second part consists basically of a critical analysis of the capacity of human thinking for the prediction and estimation of global risks. It may be useful, with small changes, in any other futurological research. The same part gives a number of recommendations on how the analysis of risks should be carried out.
Among the appendix materials, note first of all the innovative article by the American scientist E. Yudkowsky, «Artificial Intelligence as a Positive and Negative Factor in Global Risk», translated by me into Russian for the first time. E. Yudkowsky is a leading research fellow of the Singularity Institute in California, which is engaged in designing a universal artificial intelligence and in analyzing the problems of its safety ("friendliness"). He is the author of several works on the problems of creating AI systems and maintaining their "friendliness"; he introduced the concept of Seed AI and has written about problems of futurology and a possible Technological Singularity, a sharp acceleration of the development of technologies in the near future. His institute has developed the «SIAI Guidelines on Friendly AI», which aspire to be the standard in AI safety.
Nick Bostrom is a Swedish scientist and philosopher who heads the Future of Humanity Institute at Oxford and is the author of research on ethics, probability theory, futurology and philosophy. Part of his work on probability theory is devoted to the logical paradox, little known in Russia, called the Doomsday argument. There are many different opinions about its validity, its falsity and the limits of its applicability, but it seems important to us to acquaint the reader with this direction of modern thought. Therefore the appendix includes Bostrom's article «A Primer on the Doomsday Argument». The scientific community treats this problem with caution, yet articles about it have been published in the journal Nature in its hypothesis section, which speaks of a certain level of recognition.
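For readers meeting the Doomsday argument for the first time, a minimal sketch of its best-known form may help; this is Gott's "delta t" version rather than Bostrom's own presentation, and the figure of sixty billion past births is only the rough estimate commonly cited in this literature. If one's birth rank $r$ among all $N$ humans who will ever be born is treated as uniformly distributed, then

$$P\!\left(\frac{r}{N} \le q\right) \approx q,$$

so with 95% confidence $r/N > 0.05$, and hence

$$N < 20\,r \approx 20 \times 6 \times 10^{10} = 1.2 \times 10^{12}.$$

That is, under these assumptions the total number of humans who will ever live is, with high confidence, less than about twenty times the number already born. Bostrom's article examines the hidden assumptions behind such reasoning, above all the choice of the reference class.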
A.A. Kononov, candidate of sciences and research fellow of the Institute for Systems Analysis of the Russian Academy of Sciences, in his article «Ideological Beginnings of the General Theory of the Indestructibility of Mankind», approaches the problem of global risks from the point of view of strategy, namely the necessity of taking up the task of the "indestructibility of mankind". It seems important to us to publish this article here, since it shows the growing interest of Russian scientists in the problems of global threats and outlines prospects for solving them.
Why write about global catastrophes in Russian? I think there were the following reasons:
1. This theme is barely covered in Russian literature and has been left to sectarians of various sorts and preachers of the apocalypse. The basic research has not been translated (with the exception of roughly ten articles on the problem translated by me in 2006-2008). Open discussion of these questions can be of interest not only to experts but also to the general public.
2. Russia's technological lag is not so great as to guarantee that dangerous technologies cannot be developed here, or dangerous products created on their basis. On the contrary, the Russian Federation possesses the technical potential for developing many kinds of dangerous technologies, first of all in the field of nuclear and biological weapons. There are also groups in our country working in the field of AI, and some high-energy physics experiments are carried out here.
3. Russia has repeatedly led the world in important technological developments (e.g. the first man in space) or pushed them to their maximum scale (the "Tsar Bomb"). Besides, some of the largest catastrophes in history took place on the territory of the former USSR (Chernobyl).
4. Irresponsibility and corruption in the organisation of production (the "avos" mentality of hoping that things will somehow work out, and an orientation toward short-term benefit) have led to insufficient attention being paid to safety issues. G.G. Malinetsky, in his books and reports, draws a disastrous picture of the state of prevention of technogenic catastrophes in Russia. Global catastrophes receive even less attention.
5. Information about the risks connected with new technologies created in the West penetrates mass consciousness more slowly than the technologies themselves, and the greater part of the work on the theme of global risks has still not been translated into Russian.
6. The absence of rigid control allows a large number of illegal software developers ("Russian hackers") to exist, and this could become extremely dangerous if the same were to occur in the field of biotechnology.
7. Russia inherited powerful geopolitical contradictions and an "inferiority complex" from the disintegration of the USSR (the "post-imperial syndrome"), which can promote the realisation of dangerous projects.
8. Publications in Russian can have a positive influence on foreign science and public opinion, increasing the "saturation of the environment" with information on global risks. Unique results of Russian researchers can contribute to the common cause of saving the world and civilisation. Russian-language literature will also be accessible in the CIS countries. Many Russian students will in the future study or work in foreign institutions, carrying with them the knowledge accumulated in our country. And there is a considerable group of foreign researchers who read Russian or are of Russian origin.
9. Russia may find itself in circumstances in which its existence as part of the larger world depends on external circumstances, and the fast adoption of adequate decisions is required under a sharp lack of information. In that case both information and prepared people will be needed. A clear understanding by the governments of different countries of the nature of global risks is necessary.
10. The breadth and freedom of outlook which, I hope, is characteristic of thinkers in Russia can give a new view of universal problems, uncover new vulnerabilities and point out new ways of preventing global risks.
11. If our country positions itself as a great power, develops nanotechnology, plans to participate in a flight to Mars, and so on, it should also play a responsible role in ensuring the safety of all mankind.
Moreover, although we speak here about Russia, the same concerns apply to other countries, for example India and China, where technologies are developing quickly while the culture of risk prevention is likewise low.
I also see a deep meaning in the fact that this book appears under the aegis of the Institute for African Studies of the Russian Academy of Sciences. Intelligent life on Earth arose on the African continent, and there it first found itself on the brink of extinction, when the consequences of the eruption of the supervolcano Toba (on the territory of modern Indonesia) some 74,000 years ago led to a long cooling of the climate and put Homo sapiens on the edge of extinction. In today's Africa many processes unfold that can serve as local models of global risks, first of all the total epidemic of AIDS and of drug-resistant tuberculosis. In Uganda in 1999 there appeared the dangerous fungal disease UG99, which affects wheat, and its spread within Africa and beyond threatens world hunger. The uranium used in the bomb dropped on Hiroshima was extracted in the Congo. On the territory of Gabon a unique natural uranium reactor was discovered, in which a chain reaction was once sustained in natural deposits of uranium; extraction continues at these mines to this day, though it is not known to whom the uranium is delivered. And man's treatment of the higher primates of Africa, the chimpanzee and the gorilla, the latter especially pushed to the edge of extinction, should serve as a caution to all who believe that a superhuman artificial intelligence will necessarily be kind to man.
I wish to express gratitude to the people who supported me during the writing of this book. First of all I wish to thank E. Yudkowsky and N. Bostrom, who inspired me to research the theme of global risks with their clear and piercing articles, and who also kindly allowed me to publish translations of those articles in Russian. Certainly this book could not have arisen without the influence that E.D. Plavinskaja had on me. I am grateful to A.V. Sledzevsky, who helped give finished form to my narration and put a great deal of effort into its editing. My gratitude to the Coordination Council of the Russian Transhumanist Movement, Valeria Prajd and Danila Andreevich Medvedev, who rendered material and moral help in the publication of this book, is especially great. I should express gratitude to my first teacher M.M. Allenov, who gave me a model of clarity and insight of thought. I am grateful to my mother Xenia Bogemsky, my father Valery Turchin, my son Stanislav and his mother Anna Soboleva, and also my godmother Natalia Segal. I am grateful to all those whom I cannot name personally, including the readers of my blog (http://turchin.livejournal.com/profile), who helped me with countless comments.
V. Turchin