@dinofelis, due to my ongoing medication for TB, I don't have the cognitive energy to (assimilate all the information I need to) finish our discussion/debate right now. Maybe soon...
And I apologize for not explaining the part that wasn't clear, but I'd rather not encourage the discussion to continue until I am back up to full brain power.
In the meantime, some tidbits:
The human brain contains roughly 86 billion neurons. Each neuron forms about 1,000 connections to other neurons, amounting to something like 100 trillion connections. If each neuron could only help store a single memory, running out of space would be a problem; you might have only a few gigabytes of storage space, similar to an iPod or a USB flash drive. Yet neurons combine so that each one helps with many memories at a time, dramatically increasing the brain's memory storage capacity to an estimated 2.5 petabytes (a petabyte is a million gigabytes). For comparison, if your brain worked like a digital video recorder, 2.5 petabytes would be enough to hold around three million hours of TV shows.
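As a quick back-of-the-envelope check on those figures (a minimal sketch; the ~2 Mbps DVR bitrate is my assumption, not from the source):

```python
# Back-of-envelope check of the storage figures above.
# The ~2 Mbps DVR bitrate is an assumed value for illustration.

NEURONS = 86e9            # roughly 86 billion neurons
CONNECTIONS_PER_NEURON = 1_000
BRAIN_CAPACITY_PB = 2.5   # estimated capacity in petabytes

connections = NEURONS * CONNECTIONS_PER_NEURON
print(f"Total connections: {connections:.1e}")   # ~8.6e13, i.e. ~100 trillion

capacity_bytes = BRAIN_CAPACITY_PB * 1e15
dvr_bitrate_bps = 2e6                            # assumed ~2 Mbps video stream
bytes_per_hour = dvr_bitrate_bps / 8 * 3600
print(f"Hours of TV: {capacity_bytes / bytes_per_hour:.2e}")  # ~2.8 million hours
```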
Kurzweil's Singularity (nonsense!) also suffers an energy-efficiency deficit compared to humans:
The efficiency of the two systems depends on the signal-to-noise ratio (SNR) you need to maintain within the system.
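One standard way to make that precise (my own gloss, not from the article) is Shannon's channel capacity, which ties achievable bit rate to SNR:

$$C = B \log_2\!\left(1 + \mathrm{SNR}\right)$$

where $C$ is the capacity in bits per second and $B$ the bandwidth in hertz. Reliable communication also has a floor on energy per bit, $E_b/N_0 \ge \ln 2 \approx 0.69$ (the Shannon limit), so demanding a higher SNR, i.e. a lower error rate, costs more energy per bit whether the system is biological or silicon.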
One of the other differences between existing supercomputers and the brain is that neurons aren't all the same size and they don't all perform the same function. If you've done high school biology you may remember that neurons are broadly classified as motor neurons, sensory neurons, or interneurons. This grouping ignores the subtle differences between the various structures: the actual number of distinct neuron types in the brain is estimated at anywhere from several hundred to perhaps as many as 10,000, depending on how you classify them.
Compare that to a modern supercomputer that uses two or three (at the very most) CPU architectures to perform calculations and you’ll start to see the difference between our own efforts to reach exascale-level computing and simulate the brain, and the actual biological structure.
...
All three charts are interesting, but it's the chart on the far right that intrigues me most. Relative efficiency is plotted on the vertical axis and bits per second on the horizontal axis. Looking at it, you'll notice that the neurons that are most efficient in terms of bits transferred per ATP molecule (ATP is the cell's energy currency, so bits per ATP is the biological analogue of performance per watt in computing) are also among the slowest in terms of bits per second. The neurons that can transfer the most data in bits per second are also the least efficient.
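A toy illustration of that tradeoff (both neuron profiles are invented for the example; only the qualitative shape matters, not the numbers):

```python
# Toy illustration of the rate-vs-efficiency tradeoff described above.
# Both neuron profiles are invented; only the qualitative shape matters.

neurons = {
    # name: (information rate in bits/s, ATP molecules consumed per second)
    "slow, efficient": (5.0, 1e6),
    "fast, inefficient": (500.0, 1e9),
}

for name, (bits_per_s, atp_per_s) in neurons.items():
    bits_per_atp = bits_per_s / atp_per_s
    print(f"{name:>18}: {bits_per_s:7.1f} bits/s, {bits_per_atp:.1e} bits/ATP")

# The "fast" neuron moves 100x more bits per second,
# yet extracts 10x fewer bits from each ATP molecule.
```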
So a typical adult human brain runs on around 12 watts, a fifth of the power required by a standard 60-watt lightbulb. Compared with most other organs, the brain is greedy; pitted against man-made electronics, it is astoundingly efficient. IBM's Watson, the supercomputer that defeated Jeopardy! champions, depends on ninety IBM Power 750 servers, each of which requires around one thousand watts.
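The arithmetic behind that comparison (a quick sketch using only the figures quoted above):

```python
# Power comparison using the figures quoted above.

BRAIN_WATTS = 12
WATSON_SERVERS = 90
WATTS_PER_SERVER = 1_000   # ~1 kW per IBM Power 750 server, as quoted

watson_watts = WATSON_SERVERS * WATTS_PER_SERVER
print(f"Watson draws ~{watson_watts / 1000:.0f} kW")                  # ~90 kW
print(f"That is ~{watson_watts / BRAIN_WATTS:,.0f}x a human brain")   # ~7,500x
```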
I've been trying to make the point to you that raw processing speed alone isn't sufficient to indicate superiority. Drawing that conclusion is simplistic.