I can't appreciate the implications, because "smarter" is not well defined. Does it mean better at fast multiplication? Machines are already smarter. Holding memory? Computing hashes? Already smarter.
And what is this so-called progress? Desertification? Toxification? Squandering of resources? Or just the spread of mental illness?
This is not to agree or disagree with either side of this argument, but I believe the "smarter" part is important because of the following reasoning (where "smarter" is used to mean "more capable of designing artificial intelligences"):
If humans of intelligence level X manage to design an AI of intelligence level X+1, then that machine of intelligence level X+1 would itself have been capable of designing itself (since it is smarter than the humans who did design it). By implication, it could then design an intelligence of level X+2, and so on.
Personally, I find the logic of a technological singularity lacking; I don't see that the implication holds: who says that an intelligence of level X is always capable of designing an AI of level X+1? Perhaps it gets exponentially harder to produce smarter and smarter designs, in which case there would be no rush toward "infinite" intelligence, but instead a rush toward "maximum intelligence". What's more, what if humanity is already at that maximum? The assumption that we can design an intelligence cleverer than ourselves is just that: an assumption.
You're only thinking in one dimension. For example, we could be mass-producing cheap human-equivalent AIs that communicate with each other over networks. They could, in turn, produce who knows what.
If there is a maximum intelligence, then the singularity is impossible. This is a tautology.
@mucus
"Technological Progress Grows Exponentially and Reaches Infinity in Finite Time". This is a contradiction. An exponential curve reaches y = infinity only at x = infinity, so the referenced page fails to define exponential growth correctly.
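A quick sketch of the point (function name and rate are illustrative assumptions): an exponential is finite at every finite time, however large the argument, so "exponential growth reaching infinity in finite time" is a misstatement.

```python
import math

# An exponential e**(rate * t) is finite for every finite t;
# it only tends to infinity in the limit t -> infinity.

def exponential(t, rate=1.0, start=1.0):
    """Value of exponential growth at finite time t."""
    return start * math.exp(rate * t)

for t in [1, 10, 100, 700]:
    value = exponential(t)
    assert math.isfinite(value)  # enormous, but never infinite
```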
You can't have "infinite progress in finite time". I agree that this is impossible.
Let's define the singularity as "technology growth reaching a rate of ×2 per day". Do you think that this is impossible?
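For scale, here is what that proposed definition amounts to (a sketch; the function name is mine): doubling every day gives 2**n times the starting capability after n days, which is finite at every day n but explosive in practice.

```python
# Sketch of the "x2 per day" definition: capability after n days
# of daily doubling, relative to a starting level.

def capability(days, start=1):
    """Capability after `days` days of doubling once per day."""
    return start * 2 ** days

print(capability(10))  # 1024
print(capability(30))  # 1073741824 -- roughly a billionfold in one month
```

So the question is well posed: each value is finite, yet the growth rate is far beyond anything historical.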