Underclocking alone indeed does "diddly squat" for power efficiency. The key is lowering the core voltage. If the core voltage cannot be raised/lowered to achieve higher speed/better efficiency, that generally hints at a design flaw. I have no idea what's going on with Bitmine specifically. Got a link showing that power efficiency doesn't scale roughly quadratically with core voltage?
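To illustrate the point, here is a minimal sketch of the standard CMOS dynamic-power model (P ≈ C·V²·f), with made-up example numbers rather than any real Bitmine specs. Underclocking alone leaves J/GH flat, while undervolting improves it roughly with the square of the voltage:

```python
# Rough sketch of the standard CMOS dynamic-power model: P ~ C_eff * V^2 * f.
# All numbers below are hypothetical example values, not measurements of any real chip.

def dynamic_power(c_eff_farads, v_core, f_hz):
    """Dynamic switching power in watts: P = C_eff * V^2 * f."""
    return c_eff_farads * v_core**2 * f_hz

def joules_per_gh(power_w, hashrate_ghs):
    """Energy per unit of work, in J/GH (numerically equal to W per GH/s)."""
    return power_w / hashrate_ghs

# Hypothetical chip: hashrate scales linearly with clock (1 GH/s per 100 MHz).
C_EFF = 2e-8              # effective switched capacitance, farads (made up)
GHS_PER_HZ = 1.0 / 100e6

for v, f, label in [(0.90, 800e6, "stock"),
                    (0.90, 600e6, "underclock only"),
                    (0.75, 600e6, "underclock + undervolt")]:
    p = dynamic_power(C_EFF, v, f)
    gh = f * GHS_PER_HZ
    print(f"{label:22s} V={v:.2f} V f={f/1e6:.0f} MHz -> "
          f"{p:5.2f} W, {gh:.1f} GH/s, {joules_per_gh(p, gh):.2f} J/GH")
```

Real silicon also has static/leakage power and a minimum stable voltage for any given clock, so the gain is smaller than the pure V² term suggests, but the direction is the same.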
Here are the specs: https://bitcointalksearch.org/topic/bitmine-coincraft-series-users-thread-troubleshooting-efficiency-oc-495357. Again, you need evidence that lowering voltage to increase efficiency is possible; just because it works with CPUs/GPUs doesn't necessarily mean it has to work with Bitcoin ASICs. I have yet to see Bitmain/HashFast claiming anything below 0.6 W/GH, which they would happily do if it were possible. I assume they have already tested the chips to find the maximum efficiency so they can advertise it. Why would they not?
I don't, because I never made that claim.
Then why did you bring up an irrelevant claim? Neither I nor anyone else is interested in <0.4 W/GH chips that are not cost-effective.
If you don't understand the difference between NRE and per-GH production cost, there is not much I can do. Not that $10M sounds realistic to me for a chip as simple as a Bitcoin miner. It's not going to have 15 metal layers like a high-end CPU or GPU. I'd be surprised if it has more than 3, maybe 4.
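To make the distinction concrete, here is a back-of-envelope sketch using purely hypothetical round numbers (the $10M NRE figure under dispute above plus invented wafer, yield, and hashrate values). The point is only that NRE is a one-off that amortizes away with volume, while per-GH production cost is paid on every chip:

```python
# Back-of-envelope: NRE vs per-GH production cost.
# Every input is a hypothetical round number for illustration only.

NRE_USD = 10_000_000       # one-off: masks, tooling, design (the figure debated above)
WAFER_COST_USD = 5_000     # recurring: cost per processed wafer
GOOD_DIES_PER_WAFER = 500  # good chips per wafer after yield
GH_PER_CHIP = 200          # hashrate per chip

def cost_per_gh(chips_shipped):
    """Split total $/GH into amortized NRE and recurring production cost."""
    nre_per_gh = NRE_USD / (chips_shipped * GH_PER_CHIP)
    production_per_gh = (WAFER_COST_USD / GOOD_DIES_PER_WAFER) / GH_PER_CHIP
    return nre_per_gh + production_per_gh, nre_per_gh, production_per_gh

for volume in (10_000, 100_000, 1_000_000):
    total, nre, prod = cost_per_gh(volume)
    print(f"{volume:>9,} chips: ${total:.3f}/GH "
          f"(NRE ${nre:.3f}/GH + production ${prod:.3f}/GH)")
```

With these invented numbers the production cost stays at $0.05/GH regardless of volume, while the NRE share drops from $5/GH at 10k chips to $0.05/GH at a million; that is the whole difference between the two cost categories.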
I understand the difference very clearly. What I don't understand is why KNC would RUSH to the newest node (spending more than necessary simply to be first) when they could simply lower voltage and save millions. Wouldn't it make sense to wait until 20nm is cheaper, since production cost is nowhere near a limiting factor as of now? The only reason I can think of for doing this is that they are limited to 0.6 W/GH (at a cost-effective $/GH).