@ Laan Tungir
2025-01-25 20:15:01
### The probability of AGI ending us is negative.
A common tactic among doomers is to state something along the lines of the following:
"If we create AGI, there is some non-zero chance every year, that the AGI will end humanity. Whatever that chance is compounds every year, and is thus unacceptable."
The question I would ask the doomer is **"How do you know that the chance of AGI ending humanity isn't negative?"**
What does it mean for AGI to have a negative chance of ending humanity?
Setting aside AGI for a moment, what is the chance every year that humanity ends from other events, such as nuclear war, solar flares, super-volcanoes, asteroid hits, societal collapse, etc?
How do you even answer this question in a reasonable manner? Well, let's look at past data.
Of all species that have ever existed, 99% have gone extinct.
Of our closest 7 relatives from the genus Homo, 100% have gone extinct.
At the higher level of civilizational collapse, Wikipedia notes that [virtually all](https://en.wikipedia.org/wiki/Societal_collapse) historical societies have eventually collapsed.
Looking out into the universe, we see no signs of life elsewhere, so if there was life out there in the past, we can assume it has gone extinct.
It would therefore be reasonable to say that the chance of our extinction is very high even without AGI: well above 90%, maybe 99%.
You could argue over the exact figures, but at least these numbers extrapolate the future from past data, whereas the doomers use no data whatsoever to support theirs.
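One crude way to turn those base rates into a number is Laplace's rule of succession; this estimator is my own choice, offered only to show that simple extrapolation from past data lands in the "well above 90%" range:

```python
def laplace_estimate(extinct: int, observed: int) -> float:
    """Rule-of-succession estimate: probability that the next case
    goes extinct, given `extinct` extinctions out of `observed` cases."""
    return (extinct + 1) / (observed + 2)

# All 7 of our closest Homo relatives went extinct.
print(laplace_estimate(7, 7))     # ~0.89
# "99% of all species that ever existed" read as 99 of 100.
print(laplace_estimate(99, 100))  # ~0.98
```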
We can also observe that the species and societies that collapsed in the past did so because they did not yet have the knowledge needed to survive. They didn't know how to create antibiotics, or generate enough energy, or defend themselves from their environment.
What is AGI? AGI is a way to create knowledge. Nearly all past societies collapsed because they lacked the knowledge needed to survive, and yet the doomer argument is that we should slow down our ability to create knowledge, when a lack of knowledge is precisely what has been the problem in the past.
It is a certainty that we will all die without more knowledge than we have now. The future death of our sun ensures that.
We have to create the knowledge needed to survive, and that is what AGI does: it creates knowledge.
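One way to formalize the "negative chance" framing is to compare our extinction probability with and without AGI; if the difference is negative, AGI's net contribution to extinction risk is negative. A minimal sketch, with both input figures as placeholders rather than estimates:

```python
def net_effect(p_without_agi: float, p_with_agi: float) -> float:
    """AGI's net contribution to extinction probability.
    A negative value means AGI lowers our overall chance of extinction."""
    return p_with_agi - p_without_agi

# Placeholder figures only: if knowledge creation cuts a ~99% baseline
# risk roughly in half, AGI's marginal contribution is negative.
print(net_effect(p_without_agi=0.99, p_with_agi=0.50))  # about -0.49
```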
In that sense, AGI most certainly has a negative probability of ending humanity: its net effect is to lower our chance of extinction.