Singularity
The technological singularity – also, simply, the singularity – is a hypothetical future point in time when technological growth becomes uncontrollable and irreversible, resulting in unfathomable changes to human civilization.

According to the most popular version of the singularity hypothesis, called the intelligence explosion, an upgradable intelligent agent (such as a computer running software-based artificial general intelligence) will eventually enter a "runaway reaction" of self-improvement cycles. Each new and more intelligent generation appears more and more rapidly, causing an "explosion" in intelligence and producing a powerful superintelligence that qualitatively far surpasses all human intelligence.
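To make the "runaway reaction" intuition concrete, here is a minimal toy simulation in Python of recursive self-improvement. It assumes, purely for illustration, that each cycle multiplies capability by a fixed factor and runs faster than the previous one by a fixed factor; every name and parameter value below is hypothetical, not drawn from any source.

# Toy model of an "intelligence explosion": each self-improvement cycle
# multiplies capability and shrinks the duration of the next cycle.
# All parameters are illustrative assumptions, not empirical values.

def intelligence_explosion(initial_capability=1.0,
                           gain_per_cycle=1.5,     # capability multiplier per cycle (assumed)
                           speedup_per_cycle=1.5,  # how much faster each successive cycle runs (assumed)
                           cycles=20):
    capability = initial_capability
    cycle_time = 1.0   # duration of the first cycle, in arbitrary time units
    elapsed = 0.0
    history = []
    for n in range(cycles):
        elapsed += cycle_time            # this cycle completes
        capability *= gain_per_cycle     # the agent improves itself
        cycle_time /= speedup_per_cycle  # the next cycle is faster
        history.append((n + 1, elapsed, capability))
    return history

if __name__ == "__main__":
    for n, t, c in intelligence_explosion():
        print(f"cycle {n:2d}  t = {t:7.4f}  capability = {c:12.2f}")

With these assumed factors, the elapsed time is a convergent geometric series (it approaches 3 time units here) while capability grows without bound. That is the finite-horizon flavor of the "explosion" argument: ever-larger gains arriving in ever-shorter intervals.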

It's another von Neumann idea

The first to use the concept of a "singularity" in the technological context was John von Neumann. Stanislaw Ulam reports a discussion with von Neumann "centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue". Subsequent authors have echoed this viewpoint.

Developments

I. J. Good's "intelligence explosion" model predicts that a future superintelligence will trigger a singularity.

The concept and the term "singularity" were popularized by Vernor Vinge in his 1993 essay The Coming Technological Singularity, in which he wrote that it would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate. He wrote that he would be surprised if it occurred before 2005 or after 2030.

Four polls, conducted in 2012 and 2013, suggested that the median estimate was a 50% chance that artificial general intelligence (AGI) would be developed by 2040–2050. Such long-range polls should, of course, be treated with heavy skepticism.

Public figures such as Stephen Hawking and Elon Musk have expressed concern that full artificial intelligence could result in human extinction. The consequences of the singularity and its potential benefit or harm to the human race have been intensely debated.
