The structure of the global catastrophe


Chapter 7. Conclusions from the analysis of cognitive biases in the estimation of global risks





The scale of the influence of errors on reasoning about global risks can be estimated by comparing the opinions of different experts, scientists and politicians on the possibility of a definitive global catastrophe and its possible causes. It is easy to see that the spread of opinions is enormous: some consider the total risk negligible, while others are confident that human extinction is inevitable. A multitude of different technologies and scenarios are named as possible causes, and each expert offers his own set of possible scenarios and his own set of impossible ones.

Obviously, the roots of this divergence of opinions lie in the variety of trains of thought, which, in the absence of any visible reference point, fall prey to various biases and cognitive distortions. Since we cannot find a reference point for global risks in experiment, it seems desirable that such a reference point be an open discussion of the methodology of studying global risks, on the basis of which a unified and generally accepted picture of global risks could be formed.

Chapter 8. Possible rules for a reasonably effective estimation of global risks

1. The precautionary principle

This means preparing for the worst realistic scenario in every situation of uncertainty. A scenario should be considered realistic if it does not contradict the known laws of physics and has a measurable probability above some threshold level. This corresponds to the principle of conservative engineering estimation. However, precaution should not take an irrational character, that is, it should not exaggerate the situation. One formulation of the precautionary principle reads: «The precautionary principle is a moral and political principle which asserts that if an action or policy can cause severe or irreversible harm to society, then, in the absence of a scientific consensus that harm will not ensue, the burden of proof lies with those who propose the action».



2. The doubt principle

The principle of doubt requires us to allow for the possibility that any idea may be mistaken. However, doubt should not lead to instability in one's train of thought, blind trust in authorities, lack of one's own opinion, or uncertainty in that opinion when it is sufficiently well founded.



3. Open discussion

It is important to maintain open discussion of all kinds of risks. This means treating any objection as true for long enough to evaluate it before deciding to reject it: not dismissing any objection out of hand, and supporting the presence of opponents.



4. Introspection

Continuous analysis of one's own conclusions for possible errors from the whole list of cognitive biases.



5. Independent repeated calculations

This includes independent calculation by different people, as well as comparison of direct and indirect estimates.
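As a minimal sketch of this rule (the pooling rule and the numbers are my own illustrative assumptions, not taken from the text), several independent estimates of the same risk can be pooled, and a large gap between a direct and an indirect estimate treated as a warning sign:

```python
import statistics

def pool_estimates(probs):
    """Pool independent probability estimates of the same risk via the
    geometric mean -- one simple way to average order-of-magnitude guesses."""
    return statistics.geometric_mean(probs)

def discrepancy(direct, indirect):
    """Ratio between a direct and an indirect estimate of the same risk.
    A large ratio signals that at least one line of reasoning is in error."""
    return max(direct, indirect) / min(direct, indirect)

# Three hypothetical independent estimates of the same risk:
pooled = pool_estimates([0.01, 0.001, 0.0001])
print(pooled)                      # ~0.001

# A direct and an indirect estimate that differ a hundredfold
# should trigger a search for the error, not a quiet averaging:
print(discrepancy(0.01, 0.0001))   # ~100
```

The geometric mean is chosen here only because expert guesses about rare events tend to differ by orders of magnitude; any pooling rule would need its own justification.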



6. Indirect estimation of the degree of error

We can estimate the degree to which global catastrophe is underestimated by studying how much people underestimate similar risks, that is, the risks of unique catastrophes. For example, the Space Shuttle was designed for no more than one failure per 1000 flights, but the first failure occurred on the 25th flight; an initial estimate of 1 in 25 would have been more accurate. Nuclear power stations were built on the assumption of one failure per million years, but the Chernobyl accident occurred after approximately 10,000 station-years of operation (this number is obtained by multiplying the number of stations operating by that time by their average term of operation, and needs refinement). So in the first case real reliability turned out to be 40 times worse than the design estimate, and in the second, 100 times worse. From this we can conclude that in the case of unique complex objects people underestimate the risks by tens of times.
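The arithmetic above can be checked in a few lines (the figures are the chapter's own; the helper function is merely illustrative):

```python
def underestimation_factor(design_events_per_failure, observed_events_per_failure):
    """How many times worse reality turned out than the design estimate."""
    return design_events_per_failure / observed_events_per_failure

# Space Shuttle: designed for under 1 failure per 1000 flights;
# the first failure occurred on the 25th flight.
print(underestimation_factor(1000, 25))           # 40.0

# Nuclear stations: designed for 1 failure per 1,000,000 station-years;
# Chernobyl occurred after roughly 10,000 station-years of operation.
print(underestimation_factor(1_000_000, 10_000))  # 100.0
```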

Conclusion. Prospects for the prevention of global catastrophes

Mankind is by no means doomed to extinction. And even if our chances are slim, an infinitely large future is worth fighting for. A definitely positive fact is that the ice has begun to move: in the 2000s the number of publications on the general problem of global catastrophes increased sharply, and a unified understanding of the problem began to develop. There is hope that in the coming decades the problem of global risks will become generally recognized, and that people who have absorbed an understanding of the importance of these problems will find themselves in power. This will probably happen not smoothly but after painful shocks, like September 11th, each of which will enlarge the readership of the literature on global risks and spur discussion. In addition, one may hope that the efforts of individuals and groups of concerned citizens will promote such a promising strategy as the differential development of technologies: namely, that the development of Friendly AI will outpace, for example, the loading of human consciousness into a computer, which would thereby acquire enormous power but remain uncontrollable. It is also important that powerful AI should arise before strong nanotechnologies appear, so that it can supervise them.

We should probably accept a period of excessive and even totalitarian control over human activity during the time when the risk is greatest and the understanding of concrete threats is least. During this period it will be unclear which knowledge is really knowledge of mass destruction and which is a harmless toy.

It is possible that we will simply be lucky and no risk will materialize. On the other hand, it is possible that we will be less lucky, and a train of large catastrophes will throw civilization far back in its development, though humans will survive and find a wiser approach to realizing technological achievements. It is possible that on this path we will face a difficult choice: to remain forever at a medieval level, renouncing computers and flights to the stars, or to take the risk and try to become something bigger. Despite all the risk, this second scenario looks more attractive to me, since a mankind confined to the Earth is doomed sooner or later to extinction from natural causes.

Growing efforts to create refuges of various kinds can also be observed: in Norway a seed storehouse has been built in case of global catastrophe. Though such a storehouse will not save people, the very intention to invest money and real resources in projects whose return is possible only in centuries is praiseworthy. A project to create a similar refuge on the Moon, which has even been named «a backup disk for civilization», is being actively discussed. In this refuge it is proposed to keep not only all human knowledge but also frozen human embryos, in the hope that somebody (aliens?) will later use them to restore the human race.

At the same time, in this book I have tried to show that ill-considered actions to prevent catastrophes can be no less dangerous than the catastrophes themselves. Hence, at the moment the main efforts should be concentrated not on concrete projects, and certainly not on propaganda for a «green» way of life, but on growing the understanding of the nature of possible risks and on forming a scientific consensus about what is actually dangerous and what levels of risk are acceptable. At the same time such a discussion cannot be infinitely long, as it can be in more abstract fields, or we risk «sleeping through» a truly approaching catastrophe. This means that we are limited in time.



