Someone asked if Vinge himself was becoming unconvinced of the nearness of the next Singularity. Here is my reply (for a full explanation of the Singularity, see Vinge's 1993 essay, "The Coming Technological Singularity"):
In brief, from that paper:
What Is The Singularity?
“The acceleration of technological progress has been the central feature of this century. We are on the edge of change comparable to the rise of human life on Earth. The precise cause of this change is the imminent creation by technology of entities with greater-than-human intelligence. Science may achieve this breakthrough by several means (and this is another reason for having confidence that the event will occur):
1. Computers that are “awake” and superhumanly intelligent may be developed. (To date, there has been much controversy as to whether we can create human equivalence in a machine. But if the answer is “yes,” then there is little doubt that more intelligent beings can be constructed shortly thereafter.)
2. Large computer networks (and their associated users) may “wake up” as superhumanly intelligent entities.
3. Computer/human interfaces may become so intimate that users may reasonably be considered superhumanly intelligent.
4. Biological science may provide means to improve natural human intellect.
He goes on to state: “What are the consequences of this event? When greater-than-human intelligence drives progress, that progress will be much more rapid. In fact, there seems no reason why progress itself would not involve the creation of still more intelligent entities — on a still-shorter time scale. The best analogy I see is to the evolutionary past: Animals can adapt to problems and make inventions, but often no faster than natural selection can do its work — the world acts as its own simulator in the case of natural selection. We humans have the ability to internalize the world and conduct what-if’s in our heads; we can solve many problems thousands of times faster than natural selection could. Now, by creating the means to execute those simulations at much higher speeds, we are entering a regime as radically different from our human past as we humans are from the lower animals.
This change will be a throwing-away of all the human rules, perhaps in the blink of an eye — an exponential runaway beyond any hope of control. Developments that were thought might only happen in “a million years” (if ever) will likely happen in the next century.”
In the 1950s very few saw it coming. Stan Ulam paraphrased John von Neumann as saying:
One conversation centered on the ever-accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.
Von Neumann even uses the term singularity, though it appears he is thinking of normal progress, not the creation of superhuman intellect. (For me, the superhumanity is the essence of the Singularity. Without that we would get a glut of technical riches, never properly absorbed.) (see my earlier post on Von Neumann and his importance).
He continues: “It’s fair to call this event a singularity (“the Singularity” for the purposes of this piece). It is a point where our old models must be discarded and a new reality rules, a point that will loom vaster and vaster over human affairs until the notion becomes a commonplace.”
In several interviews over the past couple of years he has been asked, a number of times, whether the Singularity is still relevant given the changes in the Web and in computers. He does seem to skirt around the issue, but appears to think it is still “plausible” (note the use of that word rather than “probable,” and he is a man of carefully chosen words).
In an undated (to me, although I know a smattering of French left over from 8 years in middle and high school) article from the French ActuSf site, he states:
“ActuSF: You’ve never been so close, in any of your previous novels, to the origin of the Singularity. Was there any urgent need of explanation?
Vernor Vinge : No. Rainbows End looks at one plausible scenario, but there are others.”
In a 2007 interview with Computerworld (Australia) he says:
“I think it’s the most likely non-catastrophic outcome of the next …”
There is an NPR audio interview with Vinge available at:
Reasononline’s interview with Vinge in 2007 asks him about it:
“Reason: In your speech you foresaw efforts to build ubiquitous monitoring or government controls into our information technology. What’s more, you suggested that this wasn’t deliberate—that the trend is happening regardless of, or in spite of, the conscious choices we’re making about our information technology.
Vernor Vinge: I see an implacable government interest here, and also the convergence of diverse nongovernmental interests—writers unions, Hollywood, “temperance” organizations of all flavors, all with their own stake in exploiting technology to make people “do the right thing.”
Reason: Do you believe this pervasive monitoring and/or control might stall the Singularity?
Vinge: I think that if the Singularity can happen, it will. There are lots of very bad things that could happen in this century. The Technological Singularity may be the most likely of the noncatastrophes.”
“Reason: It’s now more than 20 years after you first started writing about the Singularity and more than a dozen since you presented your ideas in a paper about it. Are we still on track?
Vinge: I think so. In 1993 I said I’d be surprised if the Technological Singularity happened before 2005—I’ll stand by that!—or after 2030. It’s also possible the Singularity won’t happen at all.”
He then goes on to say that the most likely things to stall it would be, first, a disaster such as MAD; second, that we never learn to harness the hardware; and third, and least plausible, that the human mind may hold some key to neural computational competence that machines cannot match.
In Shaun Farrell’s interview (courtesy of Mysterious Galaxy bookstore) in 2006, Vinge says:
“SF: I read your essay, The Coming Technological Singularity, and in it you suggest that if we know the Singularity is coming, we have the freedom to establish initial conditions, but we lack the foreknowledge to know which actions could precipitate the Singularity actually occurring. That’s obviously my paraphrasing there. You wrote that back in 1993, so are the choices any clearer now, 13 years later?
VV: Actually, I think there are certain paths toward the Singularity that seem more likely now. And as we go forward from year to year there will be certain aspects that seem to be proceeding more realistically toward the Singularity. In the essay I think I listed four or five. I made them quite distinct, although they’ll probably intertwine as we actually proceed. Of those 4 or 5 I think all of them are still plausible. But in the last five or six years, and also in the near future, the stuff about the internet and ubiquitous computing and the towers of large numbers of people working this thing together, those seem to be very attractive in a practical sense as things that are ongoing, and it’s pretty obvious they could be exploited to a much greater degree than we’ve already exploited them. That’s one aspect of the difference in time (from 1993 to 2006). It’s made us more confident that certain approaches are going to be plausible. I personally think the other items I had in my 1993 essay are still plausible and it’s not entirely clear to me which would happen first.”
So, where do we end up? The Singularity is still “plausible,” but he won’t say “probable.” But then, how could he deny it when, to many people, it is his raison d’être – so much has been written about him and the big “S,” and others such as Stross, Bear, Egan, Sterling, and my beloved Schroeder have used it, and he likes and admires their work. But he does talk about AI:
“ActuSF: And talking about emerging systems, do you think AI could arise from the internet?
Vernor Vinge : Yes. I see people+computers+networks as one of several possible paths to the Singularity … At the present time, this path appears to be proceeding more successfully than the other possibilities.”
There is a good site dedicated to the Singularity: http://community.livejournal.com/singularity_now/profile and some links from it:
Artificial Intelligence News from KurzweilAI.net
The Singularity Institute – Non-Profit organization researching AI
Singularity links page
Singularity Watch – Interpreting a world of accelerating change.
Yahoo! Groups – Singularity
Singularity Articles by Eliezer Yudkowsky of the Singularity Institute – “excellent”
Law of Accelerating Returns by Ray Kurzweil – a must read for any Singularity junkie
Surviving the Singularity – interview with five transhumanists about the Singularity
Have fun! The Singularity can be a wild ride….