Facts:
The only devices using 16nm technology are cell phone chips, which draw TINY amounts of voltage and current, and thus produce little heat.
Intel's 14nm process is suffering from massive overheating issues; they are now trying to work in three-dimensional space, because the 14nm/16nm transistors generate so much heat that it melts the wafers.
No 16nm chip produced has had the kind of heat running through it that these ASICs will generate (unlike 20nm, which is used in processors and GPUs).
Speculation:
It's VERY BAD that no processors or GPU chips (high heat) have been made with this technology. This chip will fail: Intel, with 2 years and 50 million dollars invested, hasn't been able to produce an efficient 14nm processor chip, so I doubt these underqualified engineers have a chance in hell of making anything that will work.
Do NOT PRE-ORDER; it may take several months to a year before 16nm chips are ready for ASIC production.
Sources: ACTUAL ENGINEER, consulting with engineer friends who work at the 14nm Intel chip production plant in Chandler, AZ...
Clueless newbie - saying that you spoke to an actual engineer isn't citing a source. To do that, you need to say exactly who you spoke to.
Anyhow... it's ridiculous to say 16nm isn't ideal for bitcoin chips and is only useful for mobile chips. What a ridiculous thing to say. The reality is that anything that's good for mobile chips, which are low-voltage and low-power, is exceptionally good for bitcoin mining chips as well, since they too need to be low-voltage and low-power. In fact, I think you'll find that bitcoin chips run at even lower voltages, on the whole, than mobile phone chips do!
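The low-voltage point can be put on a back-of-envelope footing: dynamic power in a CMOS chip scales roughly as C · V² · f, so supply voltage pays off quadratically. The capacitance, voltage, and frequency numbers below are made-up illustrative values, not figures for any real chip:

```python
# Illustrative sketch only: dynamic CMOS switching power P = C * V^2 * f.
# All constants are invented example values, not vendor data.

def dynamic_power(cap_farads, volts, freq_hz):
    """Approximate dynamic switching power of a CMOS chip."""
    return cap_farads * volts ** 2 * freq_hz

# Hypothetical chip: 1 nF effective switched capacitance at 500 MHz.
C, f = 1e-9, 500e6

p_low = dynamic_power(C, 0.9, f)   # mobile-style low supply voltage
p_high = dynamic_power(C, 1.2, f)  # higher-voltage part, same frequency

print(p_low, p_high)  # the 1.2 V part burns ~78% more power
```

That quadratic term is why a process tuned for low-voltage mobile parts is attractive for mining silicon, where joules per hash is the whole game.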
One thing I'm pretty sure of is that in 2015, a large amount of bitcoin mining will be done on 16nm chips, and by 2016 probably most of it will be 16nm. It's just so much better than 28nm (and yes, even 20nm) that there's no reason not to use it for everything.
Btw, Nvidia has already announced they're using 16nm for their next graphics chips.
CoinTerra may have the first 16nm chip ready for tapeout, but I guarantee that within the space of 6 months, a lot more 16nm chips are coming. It's the obvious next step for the majority of ASIC companies that want the lowest possible power and the smallest possible silicon area (= more dies per wafer). Sure, some smart companies can eke a bit more juice out of 28nm and survive one more generation on it, but ultimately in 2015 they will all be using 16nm. CoinTerra's just the first, that's all. They might be a quarter or two ahead of the pack, but the whole pack is going there!
(Btw, TSMC's 16nm and Samsung/GlobalFoundries' 14nm are pretty similar in specs, so for our purposes they're interchangeable.)
Lol, you obviously don't understand hardware at all, do you? My point was that only mobile processors/applications are using 16nm technology currently. Also, your argument about Nvidia going to 16nm is flawed at best. First, they are over a year out; second, they recently switched to the Maxwell architecture, which lets them use about 50% less power than older models.
My point is, NO ONE has produced an efficient 14nm or 16nm processor yet (like a CPU/GPU running 5-volt to 9-volt power). The reason is that the transistors are so close together on the wafer that they generate extensive heat; they literally melt the silicon between the transistors. This is why Intel's 14nm tech has been in development for so long. They are now trying to solve it by printing in three dimensions (more space between transistors for cooling), but it's going to be another 6 months to a year, and Intel isn't going to share their process for 2 more years.
Obviously you have no understanding or clue what is involved. You say the ASICs can benefit from low voltage, and while that is true, they can't achieve the same processing potential running on 2 volts that they can on 5 volts. It's simply physics, and I'm sorry if I'm talking over your head. Am I saying it's impossible? Absolutely not, but the technology they are working on is cutting-edge. If you think Butterfly Labs had issues with the 20nm process, just wait for CoinTerra trying to do this at the 16nm level.
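For what it's worth, the tradeoff both posters are circling can be sketched crudely: raising supply voltage does let a chip clock faster (more throughput), but dynamic power grows roughly as V² · f, so energy efficiency gets worse. The linear frequency-vs-voltage model and every constant below are illustrative guesses, not measurements of any real part:

```python
# Crude, illustrative voltage tradeoff: higher V -> higher clock (throughput)
# but worse performance-per-watt. Linear f ~ V is a toy model; real scaling
# is more complicated. All constants are invented.

def throughput_and_power(volts, k_freq=1e9, cap=1e-9):
    freq = k_freq * volts             # toy model: frequency grows with voltage
    power = cap * volts ** 2 * freq   # dynamic power ~ C * V^2 * f
    return freq, power

for v in (0.8, 1.0, 1.2):
    f, p = throughput_and_power(v)
    print(f"{v} V: {f / 1e9:.2f} GHz, {p:.2f} W, {f / p / 1e9:.2f} GHz/W")
```

Under this toy model, the higher-voltage part is indeed faster per chip, but the low-voltage part does more work per joule, which is why the earlier poster argues mining silicon wants mobile-style voltages.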
Come revisit this thread in March of next year and we'll see how spot-on the hardware issues are as they miss deadline after deadline.