Singularity n.

usually with the: the point at which technological innovation radically transforms society, esp. the point at which artificial general intelligence outpaces human intelligence; the transition to posthumanity



  • 1983 V. Vinge First Word in Omni Jan. 10/2

    We will soon create intelligences greater than our own. When this happens, human history will have reached a kind of singularity, an intellectual transition as impenetrable as the knotted space-time at the center of a black hole, and the world will pass far beyond our understanding. This singularity, I believe, already haunts a number of science-fiction writers. It makes realistic extrapolation to an interstellar future impossible. To write a story set more than a century hence, one needs a nuclear war in between—and progress enough so that the world remains intelligible.

  • 1989 M. Stiegler in Analog Science Fiction/Science Fact Apr. 12/1

    Have you ever heard of Singularity?… Singularity is a time in the future as envisioned by Vernor and Vinge [sic] . It'll occur when the rate of change of technology is very great—so great that the effort to keep up with the change will overwhelm us.

  • 1994 Science Fiction Eye Spring 36/1

    You’ve all seen the effects. Those affected by this illness are often found talking excitedly about the ‘Singularity’ and how pointless it is to work in ‘doomed’ industries like microelectronics (‘B-o-o-o-ring!’) and genetic engineering (‘Obsolete’).

  • 1996 K. MacLeod Stone Canal (1997) 300

    We must work towards being able to control, or at least contain, their development. The same goes for any form of artificial intelligence capable of improving itself. We will do it. The day will come when we control the Singularity, as we've learned to control the flame on the hearth, the lightning of the sky, and the nuclear fire of the stars!

  • 1997 D. Broderick Spike 2

    Around 2050, or maybe even 2030, is when an upheaval unprecedented in human history—a technological singularity, as it’s been termed—is expected to erupt.

  • 2001 V. Vinge in Locus Jan. 69/1

    In 1982, on a scientific panel, it occurred to me that with all these ideas I had been talking about and others had been talking about earlier, basically if we ever did get a human-equivalent intelligence, very soon things would be very different. Since the creative impetus would not be from us, it would be in some sense unknowable… This panel was the first time I used the term ‘technological singularity’. I did a 900-word piece about that in ’83 in Omni, and almost all my science fiction has been dealing with it. In ’92 or ’93, NASA asked me to come and give a talk on it, and I did an essay where I also tried to look at precursors. In John von Neumann’s obituary written by Stan Ulam, he relates a conversation he had with von Neumann in which he even uses the term ‘essential singularity’. To me, von Neumann’s notion was not quite the same thing. He was saying that technological progress would become so advanced that the situation would be unknowable—that much is like what I was saying. But to me, the fundamental reason for the technological singularity is, technology creates something that is smarter than human.

  • 2011 New Yorker 4 Apr. 70/2

    Tic-tac-toe fell, and then checkers, and chess went down when Deep Blue defeated Garry Kasparov, and now the TV trivia quiz show ‘Jeopardy!’ has fallen to a computer system named Watson. On the surface, at least, what Raymond Kurzweil, in accents both ominous and worshipful, calls ‘the Singularity’—the ‘Matrix’ moment when artificial intelligence becomes as strong as, if not stronger than, human kind—gets closer all the time.


Research requirements: antedating 1983

Earliest cite: in an Omni article by Vernor Vinge

Research History
Treesong submitted a 2001 cite from a Locus interview with Vernor Vinge.
Treesong submitted a 1983 cite from an Omni article also by Vinge.
Mikael Johansson submitted a 1997 cite from Damien Broderick's "The Spike".
Malcolm Farmer submitted a cite from an anonymous author in the Spring 1994 SF Eye.
Malcolm Farmer suggested and Mike Christie located a cite from a 1997 reprint of Ken MacLeod's 1996 "The Stone Canal".
Malcolm Farmer submitted a 1989 cite from Marc Stiegler's "The Gentle Seduction".
Malcolm Farmer submitted a cite from a 2000 reprint of Stewart Brand's 1999 "The Clock of the Long Now".

Last modified 2023-11-18 03:26:58
In the compilation of some entries, HDSF has drawn extensively on corresponding entries in OED.