Technological singularity

A technological singularity is a hypothetical event occurring when technological progress becomes so rapid that it makes the future after the singularity qualitatively different and harder to predict. Many of the most recognized writers on the singularity, such as Vernor Vinge and Ray Kurzweil, define the concept in terms of the technological creation of superintelligence, and argue that a post-singularity world would be unpredictable to humans because humans would be unable to imagine the intentions or capabilities of superintelligent entities.[1][2][3] Some writers use "the singularity" in a broader way to refer to any radical changes in society brought about by new technologies such as molecular nanotechnology,[4][5][6] although Vinge and other prominent writers specifically state that without superintelligence, such changes would not qualify as a true singularity.[1] Many writers also tie the singularity to observations of exponential growth in various technologies (with Moore's Law being the most prominent example), using such observations as a basis for predicting that the singularity is likely to happen sometime within the 21st century.[5][7]
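
The exponential-growth pattern that Moore's Law exemplifies can be sketched as a simple doubling model. This is an illustrative toy only, not an argument from the article: the 2-year doubling period is an idealization, and the baseline of 2,300 transistors (the Intel 4004, 1971) is chosen here merely as a familiar starting point.

```python
def transistor_count(years_elapsed, baseline=2300, doubling_period=2.0):
    """Project a count after `years_elapsed` years of idealized doubling."""
    return baseline * 2 ** (years_elapsed / doubling_period)

# Under this idealized model, 40 years of doubling multiplies the
# baseline by 2**20 (roughly a million-fold).
print(round(transistor_count(40)))  # → 2411724800
```

The point the singularity writers draw from such curves is not the specific numbers but the shape: on an exponential trend, most of the total change is concentrated near the end of any interval you examine.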

Vernor Vinge proposed that the creation of superhuman intelligence would represent a breakdown in the ability of humans to model the future thereafter. He was the first to use the term "singularity" for this notion, in a 1983 article, and a later 1993 article entitled "The Coming Technological Singularity: How to Survive in the Post-Human Era" was widely disseminated on the World Wide Web and helped to popularize the idea.[8] Vinge also compared the event of a technological singularity to the breakdown of the predictive ability of physics at the space-time singularity beyond the event horizon of a black hole.[3]

A technological singularity includes the concept of an intelligence explosion, a term coined in 1965 by I. J. Good.[9] Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which, according to Paul R. Ehrlich, has not changed significantly for millennia.[10] However, with the increasing power of computers and other technologies, it might eventually be possible to build a machine more intelligent than any human.[11] If a superhuman intelligence were invented, whether through the amplification of human intelligence or through artificial intelligence, it would bring to bear greater problem-solving and inventive skills than humans are capable of. It could then design an even more capable machine, or rewrite its own source code to become more intelligent, and that more capable machine could in turn design a machine of yet greater capability. These iterations could accelerate, leading to recursive self-improvement and potentially to enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation set in.[12][13][14]
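
The iterative loop described above can be caricatured in a few lines of code. This is a deliberately crude sketch under assumed parameters (a constant per-generation improvement factor `gain` and a hard cap `limit` standing in for physical or computational limits); none of these numbers come from the sources cited.

```python
def intelligence_explosion(initial=1.0, gain=1.5, limit=1e12, max_generations=50):
    """Return capability level per generation: each machine designs a
    successor better by `gain`, until a hard cap or a generation limit."""
    capability = initial
    history = [capability]
    for _ in range(max_generations):
        capability = min(capability * gain, limit)  # cap models physical limits
        history.append(capability)
        if capability >= limit:
            break  # no further improvement possible under the assumed cap
    return history

levels = intelligence_explosion()
```

Even this toy model shows the qualitative claim: capability compounds generation over generation, so whatever ceiling exists is approached quickly once the loop starts.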
