A Panoply of Existential Risks 
If a little knowledge is dangerous, where is a person who has so much as to be out of danger? 
—Thomas Henry Huxley
I discuss below (see the section "A Program for GNR Defense," p. 422) steps we can take to address these grave risks, 
but we cannot have complete assurance in any strategy that we devise today. These risks are what Nick Bostrom calls 
"existential risks," which he defines as the dangers in the upper-right quadrant of the following table:
18 
Biological life on Earth encountered a human-made existential risk for the first time in the middle of the twentieth 
century with the advent of the hydrogen bomb and the subsequent cold-war buildup of thermonuclear forces. President 
Kennedy reportedly estimated that the likelihood of an all-out nuclear war during the Cuban missile crisis was between 
33 and 50 percent.¹⁹ The legendary information theorist John von Neumann, who became the chairman of the Air
Force Strategic Missiles Evaluation Committee and a government adviser on nuclear strategies, estimated the 
likelihood of nuclear Armageddon (prior to the Cuban missile crisis) at close to 100 percent.²⁰ Given the perspective of the 1960s, what informed observer of those times would have predicted that the world would have gone through the
next forty years without another nontest nuclear explosion? 
Despite the apparent chaos of international affairs, we can be grateful for the successful avoidance thus far of the employment of nuclear weapons in war. But we clearly cannot rest easily, since enough hydrogen bombs still exist to destroy all human life many times over.²¹ Although attracting relatively little public discussion, the massive opposing
ICBM arsenals of the United States and Russia remain in place, despite the apparent thawing of relations. 
Nuclear proliferation and the widespread availability of nuclear materials and know-how are another grave concern, although not an existential one for our civilization. (That is, only an all-out thermonuclear war involving the ICBM arsenals poses a risk to the survival of all humans.) Nuclear proliferation and nuclear terrorism belong to the "profound-local" category of risk, along with genocide. However, the concern is certainly severe because the logic of mutual
assured destruction does not work in the context of suicide terrorists. 
Debatably, we've now added another existential risk, which is the possibility of a bioengineered virus that spreads
easily, has a long incubation period, and delivers an ultimately deadly payload. Some viruses are easily communicable, 
such as the flu and common cold. Others are deadly, such as HIV. It is rare for a virus to combine both attributes. 
Humans living today are descendants of those who developed natural immunities to most of the highly communicable 
viruses. The ability of the species to survive viral outbreaks is one advantage of sexual reproduction, which tends to 
ensure genetic diversity in the population, so that the response to specific viral agents is highly variable. Although 
catastrophic, bubonic plague did not kill everyone in Europe. Other viruses, such as smallpox, have both negative 
characteristics—they are easily contagious and deadly—but have been around long enough that there has been time for 
society to create a technological protection in the form of a vaccine. Gene engineering, however, has the potential to 
bypass these evolutionary protections by suddenly introducing new pathogens for which we have no protection, 
natural or technological. 
The prospect of adding genes for deadly toxins to easily transmitted, common viruses such as the common cold 
and flu introduced another possible existential-risk scenario. It was this prospect that led to the Asilomar conference to 
consider how to deal with such a threat and the subsequent drafting of a set of safety and ethics guidelines. Although 
these guidelines have worked thus far, the underlying technologies for genetic manipulation are growing rapidly in 
sophistication. 
In 2003 the world struggled, successfully, with the SARS virus. The emergence of SARS resulted from a 
combination of an ancient practice (the virus is suspected of having jumped from exotic animals, possibly civet cats, to 
humans living in close proximity) and a modern practice (the infection spread rapidly across the world by air travel). 
SARS provided us with a dry run of a virus new to human civilization that combined easy transmission, the ability to 
survive for extended periods of time outside the human body, and a high degree of mortality, with death rates 
estimated at 14 to 20 percent. Again, the response combined ancient and modern techniques. 
Our experience with SARS shows that most viruses, even if relatively easily transmitted and reasonably deadly, 
represent grave but not necessarily existential risks. SARS, however, does not appear to have been engineered. SARS 
spreads easily through externally transmitted bodily fluids but is not easily spread through airborne particles. Its 
incubation period is estimated to range from one day to two weeks, whereas a longer incubation period would allow the virus to spread through several exponentially growing generations before carriers are identified.²²
SARS is deadly, but the majority of its victims do survive. It continues to be feasible for a virus to be 
malevolently engineered so it spreads more easily than SARS, has an extended incubation period, and is deadly to 
essentially all victims. Smallpox is close to having these characteristics. Although we have a vaccine (albeit a crude 
one), the vaccine would not be effective against genetically modified versions of the virus. 
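To make the arithmetic behind the incubation-period point concrete, here is a minimal Python sketch, not from the book; the reproduction number, the spacing between infection generations, and the detection delays are purely illustrative assumptions. It compares how many infections accumulate undetected before carriers are identified:

    # Minimal sketch with illustrative, assumed parameters (not data from the book).
    def undetected_cases(r0, generation_days, detection_delay_days):
        """Cumulative infections before detection, assuming each generation
        multiplies cases by r0 and generations are generation_days apart."""
        generations = detection_delay_days // generation_days
        return sum(r0 ** g for g in range(generations + 1))

    # Short detection window (symptoms appear quickly): few undetected generations.
    print(undetected_cases(r0=3, generation_days=5, detection_delay_days=10))  # 13
    # Long incubation delays detection: the same virus seeds thousands of cases.
    print(undetected_cases(r0=3, generation_days=5, detection_delay_days=40))  # 9841

With these assumed numbers, stretching the detection delay from two generations to eight turns roughly a dozen undetected cases into nearly ten thousand, which is the sense in which a long incubation period compounds the danger.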
As I describe below, the window of malicious opportunity for bioengineered viruses, existential or otherwise, will 
close in the 2020s when we have fully effective antiviral technologies based on nanobots.²³ However, because
nanotechnology will be thousands of times stronger, faster, and more intelligent than biological entities, self-
replicating nanobots will present a greater risk and yet another existential risk. The window for malevolent nanobots 
will ultimately be closed by strong artificial intelligence, but, not surprisingly, "unfriendly" AI will itself present an 
even more compelling existential risk, which I discuss below (see p. 420). 
