The Singularity is the idea that we are on the verge of technological advances so amazing that we cannot comprehend what is on the other side. In a nutshell, technology will allow humans to be so smart that we will be able to improve our intelligence exponentially. The linked talk by Vernor Vinge quotes I.J. Good:
"Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the _last_ invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control. ... It is more probable than not that, within the twentieth century, an ultraintelligent machine will be built and that it will be the last invention that man need make."
The Singularity is exciting to think about, but I am unconvinced that it will actually happen. We know that Moore's Law has held, more or less, for the past thirty years, but we don't know that it will continue into the future, and we have good reason to believe that it won't, at least without breakthroughs that open up new computing technologies. In other words, we have no guarantees, or even direct evidence, that computing speed can exceed certain thresholds.
We are also not significantly closer to understanding the nature of consciousness, intelligence, or sentience. We know little about how a brain functions. We do not know how to manipulate existing memories. Strictly speaking, none of these are required for the Singularity to occur, but they would help.
Don't get me wrong: the Singularity is possible. But likely? I don't think we have enough evidence to know, one way or the other. Inevitable? No.