Also, despite what galaxyASIC said, it seems I have managed to design a logic cell library supposedly worth $10 million (he claimed that such research costs somewhere in the range of $10 million USD). If that job really is that complex and the chip works, then this technology alone is worth more than $10 million. But of course I need to wait to check error rates. Without real hashing and measured error rates the performance can be overestimated, and the real hash rate (clock) could be half of that. And of course there may be a minor bug that ruins the whole thing, because development time was very short. Those are the current questions, not power consumption.
I don't remember claiming that it costs $10M.
But I did the calculation over the weekend, and the result is that running the chip at 0.5-0.6 volts instead of the standard 1.0-1.1 volts is a completely worthless endeavor for consumer bitcoin projects. So a low-voltage chip is just marketing BS. Running chips at low voltage only makes sense for someone who gets them at manufacturing cost (~$1-$2); in that case, using more chips may cost less in supporting components than running fewer chips at maximum power with the more expensive components needed to handle the extra power.
At a ratio as extreme as 6/2 (6 times less power at 2 times less performance), the savings on power do not come close to covering the bitcoins you give up: running at low power from day one yields ~48.76% fewer bitcoins, and even switching from full power to low power at the most optimal moment yields a benefit of less than 0.32%, less than a third of one percent.
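One reason the loss is close to half can be sketched with a simplifying assumption not stated in the post: if your hashrate is a negligible share of the network's, the coins you earn scale linearly with your hashrate no matter how difficulty evolves, so halving performance forfeits ~50% of coins (close to the ~48.76% figure quoted, which presumably includes some additional correction). All numbers below are illustrative assumptions.

```python
# Sketch: coins earned are proportional to (your hashrate / network
# hashrate), summed over time. If your contribution is negligible,
# halving your hashrate halves your coins regardless of how the
# network's difficulty grows. Numbers are made-up assumptions.

network_hashrate = [100.0, 150.0, 225.0, 340.0]  # assumed growth per period
coins_per_period = 50.0                          # assumed reward pool per period

def coins(my_hashrate):
    """Total coins earned across all periods at a fixed personal hashrate."""
    return sum(coins_per_period * my_hashrate / h for h in network_hashrate)

full = coins(2.0)  # full-voltage performance (relative units)
low = coins(1.0)   # "2 times less performance" at low voltage

print(f"loss from running low voltage: {1 - low / full:.2%}")  # prints 50.00%
```

The 50% result is independent of the assumed `network_hashrate` trajectory, which is the point: difficulty growth hurts both modes equally.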
You will just end up using more chips, more electronic components, and more PCB area to reach the same performance, and since power use by any ASICs is already very low and power costs not $4/kWh but only $0.10-$0.40/kWh, there is no point in running consumer chips at low power.
For a low-voltage chip to make any financial sense, power would need to cost over $4/kWh.
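The break-even argument above can be sketched as follows. Only the 6x-power / 2x-performance ratio comes from the post; the per-chip power and revenue figures are made-up assumptions, so the resulting break-even price is illustrative, not a reproduction of the $4/kWh figure.

```python
# Break-even electricity price sketch: low voltage pays off only when
# the hourly power savings exceed the hourly coin revenue lost.
#   price * (P_full - P_low) > revenue_full - revenue_low
# All absolute numbers below are assumptions for illustration.

full_power_kw = 6.0 / 1000        # assumed 6 W per chip at ~1.0-1.1 V
low_power_kw = full_power_kw / 6  # "6 times less power"

revenue_full = 0.05               # assumed coin revenue, USD per chip-hour
revenue_low = revenue_full / 2    # "2 times less performance"

break_even = (revenue_full - revenue_low) / (full_power_kw - low_power_kw)
print(f"break-even electricity price: ${break_even:.2f}/kWh")  # $5.00/kWh here
```

With any plausible coin revenue dwarfing a few watts of savings, the break-even price lands far above real electricity prices, which is the post's point.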
But if he can achieve it, it will be nothing more than an ego stroke.
Lesson? Do cost/benefit analysis before you spend a lot of money.