Full record for Singularity n.

Definition: the transition to posthumanity
OED requirements: antedating 1983
Earliest cite: in an Omni article by Vernor Vinge
Comment: Treesong submitted a 2001 cite from a Locus interview with Vernor Vinge. Treesong submitted a 1983 cite from an Omni article also by Vinge. Mikael Johansson submitted a 1997 cite from Damien Broderick's "The Spike". Malcolm Farmer submitted a cite from an anonymous author in the Spring 1994 SF Eye. Malcolm Farmer suggested and Mike Christie located a cite from a 1997 reprint of Ken MacLeod's 1996 "The Stone Canal". Malcolm Farmer submitted a 1989 cite from Mark Stiegler's "The Gentle Seduction". Malcolm Farmer submitted a cite from a 2000 reprint of Stewart Brand's 1999 "The Clock of the Long Now".
Last modified: 6 July 2008

Citations for Singularity n.


1970 L. Niven Ringworld 106 There wasn't even a theoretical basis for faster-than-light travel. We never did invent hyperdrive, if you'll recall. We'd never have discovered it by accident, either, because we'd never have thought to do our experiments out beyond the singularity.
1983 V. Vinge in Omni Jan. 10/2 We will soon create intelligences greater than our own. When this happens, human history will have reached a kind of singularity, an intellectual transition as impenetrable as the knotted space-time at the center of a black hole, and the world will pass far beyond our understanding. This singularity, I believe, already haunts a number of science-fiction writers. It makes realistic extrapolation to an interstellar future impossible. To write a story set more than a century hence, one needs a nuclear war in between—and progress enough so that the world remains intelligible.
1989 M. Stiegler in Analog Sci. Fiction/Sci. Fact Apr. 12/1 Have you ever heard of Singularity?‥. Singularity is a time in the future as envisioned by Vernor and Vinge [sic] . It'll occur when the rate of change of technology is very great—so great that the effort to keep up with the change will overwhelm us.
1990 P. Anderson Inconstant Star in L. Niven et al. Man-Kzin Wars III (1992) x. 241 Burnt out, a giant star collapses into a form so dense, infinitely dense at the core singularity, that light itself can no longer escape its grip.
1991 D. Brin What continues‥& What Fails‥ in D. Brin Otherness (1994) 331 Not even the singularity was pure enough to typify true blackness.
1994 Sci. Fiction Eye Spring 36/1 You've all seen the effects. Those affected by this illness are often found talking excitedly about the ‘Singularity’ and how pointless it is to work in ‘doomed’ industries like microelectronics (‘B-o-o-o-ring!’) and genetic engineering (‘Obsolete’).
1994 B. Hambly Crossroad x. 142 We'll make a run for the singularity shear and hope their phaser targeting is thrown out of calibration.
1996 K. MacLeod Stone Canal (1997) 300 We must work towards being able to control, or at least contain, their development. The same goes for any form of artificial intelligence capable of improving itself. We will do it. The day will come when we control the Singularity, as we've learned to control the flame on the hearth, the lightning of the sky, and the nuclear fire of the stars!
1997 D. Broderick Spike 2 Around 2050, or maybe even 2030, is when an upheaval unprecedented in human history—a technological singularity, as it's been termed—is expected to erupt.
1998 L. A. Graf & M. J. Friedman War Dragons ii. 30 You find out it was just a subspace echo that got bounced through a singularity from one side of the quadrant to the other.
1998 L. A. Graf & M. J. Friedman War Dragons xiii. 182, I was thinking a nearby singularity, or perhaps the fading wavefront from a long-ago supernova.
2001 V. Vinge in Locus Jan. 69/1 In 1982, on a scientific panel, it occurred to me that with all these ideas I had been talking about and others had been talking about earlier, basically if we ever did get a human-equivalent intelligence, very soon things would be very different. Since the creative impetus would not be from us, it would be in some sense unknowable‥. This panel was the first time I used the term ‘technological singularity’. I did a 900-word piece about that in '83 in Omni, and almost all my science fiction has been dealing with it. In '92 or '93, NASA asked me to come and give a talk on it, and I did an essay where I also tried to look at precursors. In John von Neumann's obituary written by Stan Ulam, he relates a conversation he had with von Neumann in which he even uses the term ‘essential singularity’. To me, von Neumann's notion was not quite the same thing. He was saying that technological progress would become so advanced that the situation would be unknowable—that much is like what I was saying. But to me, the fundamental reason for the technological singularity is, technology creates something that is smarter than human.