The Singularity is the idea that we are on the verge of technological advances so profound that we cannot comprehend what is on the other side. In a nutshell, technology will make humans smart enough to improve our own intelligence, and each improvement will accelerate the next. The linked talk by Vernor Vinge quotes I.J. Good:

"Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the _last_ invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control. ... It is more probable than not that, within the twentieth century, an ultraintelligent machine will be built and that it will be the last invention that man need make."

The Singularity is exciting to think about, but I am unconvinced that it will actually happen. We know that Moore's Law has held, more or less, for the past thirty years, but we don't know that it will continue into the future, and we have good reason to believe that it won't, at least without breakthroughs that open up new computing technologies. In other words, we have no guarantees, or even direct evidence, that computing speed can exceed certain thresholds.

We are also not significantly closer to understanding the nature of consciousness, intelligence, or sentience. We know little about how a brain functions. We do not know how to manipulate existing memories. Strictly speaking, none of those are required for the Singularity to occur, but they would help.

Don't get me wrong: the Singularity is possible. But likely? I don't think we have enough evidence to know, one way or the other. Inevitable? No.


  1. The singularity seems to have a lot in common with perpetual motion machines, and such devices are prohibited by the second law of thermodynamics. I wonder if the singularity might likewise be ruled out by laws of physics and mathematics. For instance, there are mathematical bounds (Cramér-Rao) that limit the efficiency of statistical estimation.

    There also seems to be an assumption that unlimited information will lead to unlimited creativity, and that is not necessarily so.
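    The Cramér-Rao idea can be made concrete with a short simulation (the parameters and code here are my own illustration, not anything from the comment): for n i.i.d. draws from a Gaussian with known standard deviation sigma, no unbiased estimator of the mean can have variance below sigma²/n, and the sample mean attains exactly that floor, so no cleverer estimator can beat it.

```python
import random
import statistics

# Illustration of the Cramér-Rao lower bound for estimating a Gaussian mean.
# For n i.i.d. draws from N(mu, sigma^2), any unbiased estimator of mu has
# variance >= sigma^2 / n; the sample mean achieves this bound.
random.seed(0)
mu, sigma, n, trials = 3.0, 2.0, 100, 2000

# Repeatedly estimate mu from fresh samples and record each estimate.
estimates = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    estimates.append(statistics.fmean(sample))

# Empirical variance of the estimator vs. the theoretical floor.
empirical_var = statistics.variance(estimates)
cramer_rao_bound = sigma**2 / n  # 4 / 100 = 0.04

print(f"empirical variance: {empirical_var:.4f}")
print(f"Cramér-Rao bound:   {cramer_rao_bound:.4f}")
```

    Running this, the empirical variance of the sample-mean estimator lands essentially on the 0.04 floor, which is the point of the bound: past a certain efficiency, more ingenuity buys nothing.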

  2. Ha. At the risk of post necromancy, I posit the following....

    Designing a machine to emulate human reason by copying, as closely as possible, the effects of that reason on the brain and the results gained by the brain's actions is like attempting to build a cold virus by recreating in exacting detail all the symptoms of a cold.

    It's reverse engineering. But it won't work. You might get a machine someday which can emulate human reason. But I doubt it.