2005/07/06

The Technological Singularity

Ah, the Technological Singularity. There's so much to be said and read. I've only dipped my toes into the pool, nay, the ocean (you've gotta pour on the cheese sometimes). And I only recently noticed the discussion link on Wikipedia. On the Singularity alone, the discussion page is nearly as long as the article itself. The first paragraph of the Wikipedia article describes the Singularity as "a predicted time at which technological progress accelerates beyond the ability of present-day humans to fully comprehend or predict."

I used to understand it as the point when technology has advanced so much that the distinction between biological and mechanical life can no longer be made.

Vernor Vinge is the guy who did the most to popularize the idea in the '80s. You can read his 1993 paper on the subject here. The futurist Raymond Kurzweil, author of The Age of Spiritual Machines (one of my favorite books), has also done a lot to spread the idea with his predictions of 'greater than human intelligence.' That's the most accepted, and most probable, cause for the Singularity: either artificially intelligent machines, augmented humans, or the minds of humans uploaded into machines that operate at speeds our biological neurons aren't capable of. Once a greater than human intelligence is created, it would itself be able to create something more intelligent than itself, and so on. Intelligence would increase at an exponential rate until purely biological intelligence becomes obsolete.
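That feedback loop can be sketched with a toy model (my own illustration, not anything from Vinge or Kurzweil): assume each generation of intelligence designs a successor that is some fixed fraction smarter than itself. The `gain` parameter and the starting level are made-up numbers purely for demonstration.

```python
def self_improvement(start=1.0, gain=0.1, generations=20):
    """Toy model of recursive self-improvement.

    Each generation designs a successor that is `gain` (10%) smarter
    than itself, so intelligence compounds: level_n = start * (1+gain)^n.
    All numbers here are arbitrary, for illustration only.
    """
    levels = [start]
    for _ in range(generations):
        levels.append(levels[-1] * (1.0 + gain))
    return levels

curve = self_improvement()
# A constant 10% gain per generation already yields exponential growth;
# if smarter designers found proportionally bigger gains, growth would
# be even faster than this.
print(f"after 20 generations: {curve[-1]:.2f}x the starting intelligence")
```

Even this mild assumption, a constant 10% improvement per step, multiplies intelligence more than sixfold in twenty generations, which is why the argument doesn't depend on any single dramatic breakthrough.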

This is where the danger lies in the Singularity: will our successors want to keep us around? To try to make it so, the Singularity Institute for Artificial Intelligence has been founded "to study and facilitate a broader discussion and understanding of moral artificial intelligence, and in particular, AI based on the principles of Friendliness theory." The idea of friendly artificial intelligence doesn't use the word 'friendly' the way we normally think about it. They're not trying to make robots that save kittens from trees and give all they have to the poor. The idea, basically, is to make sure the machines have goals similar to our own. Here are a few things to read about Seed AI and Friendly AI.

There's actually a philosophy called Singularitarianism, based on the belief that the technological Singularity is possible. People with this philosophy act in ways they believe will contribute to the likelihood of the Singularity happening, and happening safely. I'll have to read some more before I start considering myself a Singularitarian. The SL4 wiki and mailing list look like a good place to start. There's also a page on Singularitarian Principles.



Wow, I seem to have dropped the ball on that one. I started such a long, involved post, then spent an entire day away from the computer. Oh well, it was worth it. I tried to cover a lot of topics and plan on going into more detail in later posts.
