Author

Topic: New 14nm miners??????? (Read 5991 times)

DrG
legendary
Activity: 2086
Merit: 1035
September 03, 2014, 03:28:19 PM
#69
If TSMC is booked up, well then 28nm it is.
The hash rate is rising so fast that by the time 14nm ASIC fab capability becomes available, fabbing the parts will be uneconomic. We're headed for 1 exahash around the beginning of 2015, about 4x the current hash rate.

By Q1 2015, all miners will be losing money. Then what?


Difficulty will go to the moon!

Well at some point somebody will shut off their miners.  First to go will be most European and Australian miners, since they have high energy rates.  If that doesn't bring the difficulty down, then the people who thought they had cheap power at $0.08/kWh will get kicked out.

The same thing happened with GPUs - that's when FPGA started becoming popular.  ASICs killed FPGA development.

Sooner or later we'll reach equilibrium.  At that point people with "free" or insanely cheap power can mine against large companies.
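The shakeout described above is just arithmetic: a miner stays on only while its share of the daily block reward covers its power bill. A minimal sketch, with hypothetical 2014-era numbers that are not from the thread:

```python
# Rough daily-profit estimate for a miner. Revenue is your share of the
# network hash rate times the daily block reward; cost is the power bill.

def daily_profit_usd(my_ghs, network_ghs, btc_price_usd,
                     power_watts, usd_per_kwh,
                     block_reward_btc=25.0, blocks_per_day=144):
    """Ignores pool fees and difficulty changes; 144 blocks/day at ~10 min/block."""
    revenue = (my_ghs / network_ghs) * blocks_per_day * block_reward_btc * btc_price_usd
    power_cost = (power_watts / 1000.0) * 24 * usd_per_kwh
    return revenue - power_cost

# Hypothetical example: a 1 TH/s, 1000 W miner on a 250 PH/s network.
cheap  = daily_profit_usd(1_000, 250_000_000, 480, 1000, 0.08)  # still profitable
pricey = daily_profit_usd(1_000, 250_000_000, 480, 1000, 0.30)  # negative at ~$0.30/kWh
```

With the same hardware and price, the $0.08/kWh miner is still ahead while the one paying European-style rates is already underwater, which is exactly the shutdown ordering the post describes.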
legendary
Activity: 1204
Merit: 1002
September 03, 2014, 02:53:17 AM
#68
If TSMC is booked up, well then 28nm it is.
The hash rate is rising so fast that by the time 14nm ASIC fab capability becomes available, fabbing the parts will be uneconomic. We're headed for 1 exahash around the beginning of 2015, about 4x the current hash rate.

By Q1 2015, all miners will be losing money. Then what?

Difficulty will go to the moon!
hero member
Activity: 854
Merit: 510
September 03, 2014, 02:05:14 AM
#67
...  Moore's Law does appear to be reaching an end. ....

Heard the same thing in the late 1980s.   It is more likely that there will be a shift to other approaches, like stacking many layers or something else.   Still, that doesn't mean it will be a few guys in a bar figuring out how to do it.

I was on my 3rd PC in the late 80s and I never heard such a thing.  I did hear that they would have problems with the clocks because of possible interference with other electronics - turned out that didn't happen  Cheesy

The basic transistor gate design has to change (i.e. go 3D), but even that can only go so far.  The concept of a switch or gate has to change - graphene or something like that which we haven't used yet.
3rd PC by the late 80s?  You started later than me.   I was on my third computer before 1980.

The talk about Moore's law in the late 80s was mostly around hard disk density and capacity at the time.   I was working as a hardware/software engineer designing sub-systems used in oscilloscopes.  You may have been too young to follow the trade journals at that time.   There weren't many stories in the popular press, as computers weren't mainstream.    Anyway, at the time a typical hard disk was measured in MB.   A few-GB drive was a monster, very expensive, and physically very large.

Currently there is a great deal of research on how to improve ICs and we are a long way from the end of improvements, though that might not be true for shrinking transistor sizes.  Also, much of the new stuff will take years to work its way out to smaller companies unless they are buying general CPUs etc.   I know, for example, that stacking layers of transistors has been in the works for at least 5 years, probably more, but so far I haven't actually seen it.
 

Yeah I'm a physician so all my computer knowledge is hobbywork, but my dad was an EE so my science project in 1st grade was parallel vs series lighting while other kids were doing things with plants and volcanos  Cheesy

And yes I remember MB HDs.  After my IIc I bought a 386 SX16 (turbo 16MHz) from Gateway and my dad paid an extra $500 for the 40MB WD HD.  It was an IDE drive, not one of the RLL/MFM/ESDI ones.  Yeah, he still has punchcards in the garage along with his oscilloscope.  I do indeed remember the talk about the hard drive limits at that time.  I learned to program in Fortran, BASIC, C, and COBOL, but that's all been erased from my mind and filled with useless medical knowledge.


I still read through the IEEE magazines from time to time and I have the 2011 issue talking about digital currency and Bitcoin.

Anyways, the circuit will be revolutionized - we just need a Eureka moment.  It's not my field of study so I get my information from stuff like this and other forum members:
https://www.youtube.com/watch?v=gjx5y9lrtwU

I once considered medical school, even read up on the MCAT before deciding it wasn't for me.   I respect how much a physician has to go through to get to practice.   It seems medical fields are really starting to explode with innovation!   
DrG
legendary
Activity: 2086
Merit: 1035
September 03, 2014, 01:54:24 AM
#66
...  Moore's Law does appear to be reaching an end. ....

Heard the same thing in the late 1980s.   It is more likely that there will be a shift to other approaches, like stacking many layers or something else.   Still, that doesn't mean it will be a few guys in a bar figuring out how to do it.

I was on my 3rd PC in the late 80s and I never heard such a thing.  I did hear that they would have problems with the clocks because of possible interference with other electronics - turned out that didn't happen  Cheesy

The basic transistor gate design has to change (i.e. go 3D), but even that can only go so far.  The concept of a switch or gate has to change - graphene or something like that which we haven't used yet.
3rd PC by the late 80s?  You started later than me.   I was on my third computer before 1980.

The talk about Moore's law in the late 80s was mostly around hard disk density and capacity at the time.   I was working as a hardware/software engineer designing sub-systems used in oscilloscopes.  You may have been too young to follow the trade journals at that time.   There weren't many stories in the popular press, as computers weren't mainstream.    Anyway, at the time a typical hard disk was measured in MB.   A few-GB drive was a monster, very expensive, and physically very large.

Currently there is a great deal of research on how to improve ICs and we are a long way from the end of improvements, though that might not be true for shrinking transistor sizes.  Also, much of the new stuff will take years to work its way out to smaller companies unless they are buying general CPUs etc.   I know, for example, that stacking layers of transistors has been in the works for at least 5 years, probably more, but so far I haven't actually seen it.
 

Yeah I'm a physician so all my computer knowledge is hobbywork, but my dad was an EE so my science project in 1st grade was parallel vs series lighting while other kids were doing things with plants and volcanos  Cheesy

And yes I remember MB HDs.  After my IIc I bought a 386 SX16 (turbo 16MHz) from Gateway and my dad paid an extra $500 for the 40MB WD HD.  It was an IDE drive, not one of the RLL/MFM/ESDI ones.  Yeah, he still has punchcards in the garage along with his oscilloscope.  I do indeed remember the talk about the hard drive limits at that time.  I learned to program in Fortran, BASIC, C, and COBOL, but that's all been erased from my mind and filled with useless medical knowledge.


I still read through the IEEE magazines from time to time and I have the 2011 issue talking about digital currency and Bitcoin.

Anyways, the circuit will be revolutionized - we just need a Eureka moment.  It's not my field of study so I get my information from stuff like this and other forum members:
https://www.youtube.com/watch?v=gjx5y9lrtwU
hero member
Activity: 854
Merit: 510
September 03, 2014, 01:39:29 AM
#65
...  Moore's Law does appear to be reaching an end. ....

Heard the same thing in the late 1980s.   It is more likely that there will be a shift to other approaches, like stacking many layers or something else.   Still, that doesn't mean it will be a few guys in a bar figuring out how to do it.

I was on my 3rd PC in the late 80s and I never heard such a thing.  I did hear that they would have problems with the clocks because of possible interference with other electronics - turned out that didn't happen  Cheesy

The basic transistor gate design has to change (i.e. go 3D), but even that can only go so far.  The concept of a switch or gate has to change - graphene or something like that which we haven't used yet.
3rd PC by the late 80s?  You started later than me.   I was on my third computer before 1980.

The talk about Moore's law in the late 80s was mostly around hard disk density and capacity at the time.   I was working as a hardware/software engineer designing sub-systems used in oscilloscopes.  You may have been too young to follow the trade journals at that time.   There weren't many stories in the popular press, as computers weren't mainstream.    Anyway, at the time a typical hard disk was measured in MB.   A few-GB drive was a monster, very expensive, and physically very large.

Currently there is a great deal of research on how to improve ICs and we are a long way from the end of improvements, though that might not be true for shrinking transistor sizes.  Also, much of the new stuff will take years to work its way out to smaller companies unless they are buying general CPUs etc.   I know, for example, that stacking layers of transistors has been in the works for at least 5 years, probably more, but so far I haven't actually seen it.
 
EDIT: In the late 80s there was a general feeling that solid state storage (i.e. flash) would easily replace hard disks within 10 years.   Still hasn't happened, although we are getting closer to the crossover.   In 1995 I wrote a driver for storing information on flash, back then on Intel 28F010s, 1 Mb each.  Wow, that was really hard, and there was like a 90-second cycle where if you lost power you lost your flash contents.   Things have come a long way since then.   I know a ton about pre-2000 technology but after about 2000 I've mostly done pure software.    Grin   My electrical skills are getting rusty and I don't read the journals much now.  Kids take up too much time.
DrG
legendary
Activity: 2086
Merit: 1035
September 03, 2014, 12:38:51 AM
#64
...  Moore's Law does appear to be reaching an end. ....

Heard the same thing in the late 1980s.   It is more likely that there will be a shift to other approaches, like stacking many layers or something else.   Still, that doesn't mean it will be a few guys in a bar figuring out how to do it.

I was on my 3rd PC in the late 80s and I never heard such a thing.  I did hear that they would have problems with the clocks because of possible interference with other electronics - turned out that didn't happen  Cheesy

The basic transistor gate design has to change (i.e. go 3D), but even that can only go so far.  The concept of a switch or gate has to change - graphene or something like that which we haven't used yet.
hero member
Activity: 854
Merit: 510
September 02, 2014, 05:51:24 PM
#63
...  Moore's Law does appear to be reaching an end. ....

Heard the same thing in the late 1980s.   It is more likely that there will be a shift to other approaches, like stacking many layers or something else.   Still, that doesn't mean it will be a few guys in a bar figuring out how to do it.
DrG
legendary
Activity: 2086
Merit: 1035
September 02, 2014, 11:48:57 AM
#62
There's no point to going towards 14nm when 20nm hasn't even been optimized.  28nm offerings are still sometimes more efficient than their 20nm brethren.  Spending money on improving the current design before stepping down to an expensive process should seem to be the better route.

What 20nm brethren? There's only KnC's half-arsed rushed attempt so far. Also, making ASICs doesn't work the way you are describing it. You don't just take a highly optimised 28nm design and run it through the 20nm fab and end up with a highly optimised 20nm ASIC. The design usually has to be reworked because the actual fab process has changed and once working, it can then be optimised for that specific process.



Well I meant what you said - it may not have come out right.  Basically they're trying to take a leap too far ahead if they rush to 14nm.  Since it would be prohibitively expensive to do so, they should just work on optimizing the 20 & 28nm stuff that they have right now.

TSMC 20nm is fully booked for at least the rest of the year and probably a good portion of next year too. You've basically got Nvidia, AMD and ARM manufacturers fighting for capacity for their next-gen products. The bitcoin ASIC manufacturers would be far better off focusing on 28nm.

Meh, I've been waiting on those new GPUs for ages now.  The 7970s I have from 2012 are more or less the same as the current crop of GPUs at the end of 2014.  Moore's Law does appear to be reaching an end.

If TSMC is booked up, well then 28nm it is.
legendary
Activity: 826
Merit: 1004
September 02, 2014, 06:34:13 AM
#61
There's no point to going towards 14nm when 20nm hasn't even been optimized.  28nm offerings are still sometimes more efficient than their 20nm brethren.  Spending money on improving the current design before stepping down to an expensive process should seem to be the better route.

What 20nm brethren? There's only KnC's half-arsed rushed attempt so far. Also, making ASICs doesn't work the way you are describing it. You don't just take a highly optimised 28nm design and run it through the 20nm fab and end up with a highly optimised 20nm ASIC. The design usually has to be reworked because the actual fab process has changed and once working, it can then be optimised for that specific process.



Well I meant what you said - it may not have come out right.  Basically they're trying to take a leap too far ahead if they rush to 14nm.  Since it would be prohibitively expensive to do so, they should just work on optimizing the 20 & 28nm stuff that they have right now.

TSMC 20nm is fully booked for at least the rest of the year and probably a good portion of next year too. You've basically got Nvidia, AMD and ARM manufacturers fighting for capacity for their next-gen products. The bitcoin ASIC manufacturers would be far better off focusing on 28nm.
DrG
legendary
Activity: 2086
Merit: 1035
September 02, 2014, 03:28:54 AM
#60
There's no point to going towards 14nm when 20nm hasn't even been optimized.  28nm offerings are still sometimes more efficient than their 20nm brethren.  Spending money on improving the current design before stepping down to an expensive process should seem to be the better route.

What 20nm brethren? There's only KnC's half-arsed rushed attempt so far. Also, making ASICs doesn't work the way you are describing it. You don't just take a highly optimised 28nm design and run it through the 20nm fab and end up with a highly optimised 20nm ASIC. The design usually has to be reworked because the actual fab process has changed and once working, it can then be optimised for that specific process.



Well I meant what you said - it may not have come out right.  Basically they're trying to take a leap too far ahead if they rush to 14nm.  Since it would be prohibitively expensive to do so, they should just work on optimizing the 20 & 28nm stuff that they have right now.
legendary
Activity: 826
Merit: 1004
September 01, 2014, 06:24:26 PM
#59
There's no point to going towards 14nm when 20nm hasn't even been optimized.  28nm offerings are still sometimes more efficient than their 20nm brethren.  Spending money on improving the current design before stepping down to an expensive process should seem to be the better route.

What 20nm brethren? There's only KnC's half-arsed rushed attempt so far. Also, making ASICs doesn't work the way you are describing it. You don't just take a highly optimised 28nm design and run it through the 20nm fab and end up with a highly optimised 20nm ASIC. The design usually has to be reworked because the actual fab process has changed and once working, it can then be optimised for that specific process.

hero member
Activity: 854
Merit: 510
September 01, 2014, 05:28:39 PM
#58
There's no point to going towards 14nm when 20nm hasn't even been optimized.  28nm offerings are still sometimes more efficient than their 20nm brethren.  Spending money on improving the current design before stepping down to an expensive process should seem to be the better route.
Good point, and besides, everything I've read indicates that 14nm requires a redesign to get any real advantages.   So companies should be focusing on optimizing first.
DrG
legendary
Activity: 2086
Merit: 1035
September 01, 2014, 05:06:57 PM
#57
There's no point to going towards 14nm when 20nm hasn't even been optimized.  28nm offerings are still sometimes more efficient than their 20nm brethren.  Spending money on improving the current design before stepping down to an expensive process should seem to be the better route.
legendary
Activity: 826
Merit: 1004
September 01, 2014, 01:55:30 PM
#56
The only way anyone except Intel is making 14nm bitcoin ASICs is if they pay a shit load of money to Intel.

Or to someone... 14nm is a pretty significant scaling hurdle, as we are starting to build chips with components that are only a few atoms across

http://www.extremetech.com/computing/97469-is-14nm-the-end-of-the-road-for-silicon-lithography

Very interesting article.    However, it may be possible for companies to get Intel to make the chips.   I expect that would be very costly though.    Probably currently way out of the reach of smaller companies.

It is possible for external companies to use Intel's fabs. They're making the Stratix 10 14nm FPGAs for Altera and 14nm SoCs for Panasonic, to name two customers.
sr. member
Activity: 420
Merit: 250
September 01, 2014, 02:18:54 AM
#55
Well, either power becomes a major factor or, if the growth is slow, the 2016 halving will stop it.

Yup, once the halving happens it will be a whole new game.   

What will halving do to miners?  Please elaborate
Block awards are half, so income per block is half.

Why would they do that though?
Dude!  You better go learn some basics.  There isn't any wanting involved; it is just part of how bitcoin works.   The Bitcoin block reward was 50 BTC per block, now it is 25 BTC, and in about 2 years it will be 12.5 BTC.   It will keep halving every 3 to 4 years until the reward is just dust.

Anyway, run, don't walk, to a bitcoin wiki and read up on some of the basics.

Sorry I misunderstood you.  I didn't know you were referring to the block award.  Yes I am familiar with that.  Thanks for the post.
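The halving schedule being explained here can be written down directly. A simplified sketch using floats for readability (the real consensus code works in integer satoshis with a right shift):

```python
# Bitcoin block subsidy: starts at 50 BTC and halves every 210,000 blocks
# (roughly every four years).

def block_subsidy(height, initial=50.0, interval=210_000):
    halvings = height // interval
    if halvings >= 64:      # the subsidy has shifted down to nothing by then
        return 0.0
    return initial / (2 ** halvings)

assert block_subsidy(0) == 50.0          # genesis era
assert block_subsidy(210_000) == 25.0    # first halving (late 2012)
assert block_subsidy(420_000) == 12.5    # the 2016 halving discussed here
```

So "halving" isn't a choice anyone makes per block; it falls mechanically out of the block height, which is why the reward eventually dwindles to dust.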
hero member
Activity: 854
Merit: 510
September 01, 2014, 02:16:11 AM
#54
Well, either power becomes a major factor or, if the growth is slow, the 2016 halving will stop it.

Yup, once the halving happens it will be a whole new game.  

What will halving do to miners?  Please elaborate
Block awards are half, so income per block is half.

Why would they do that though?
Dude!  You better go learn some basics.  There isn't any wanting involved; it is just part of how bitcoin works.   The Bitcoin block reward was 50 BTC per block, now it is 25 BTC, and in about 2 years it will be 12.5 BTC.   It will keep halving every 3 to 4 years until the reward is just dust.

Anyway, run, don't walk, to a bitcoin wiki and read up on some of the basics.
sr. member
Activity: 420
Merit: 250
September 01, 2014, 12:52:23 AM
#53
Well, either power becomes a major factor or, if the growth is slow, the 2016 halving will stop it.

Yup, once the halving happens it will be a whole new game.   

What will halving do to miners?  Please elaborate
Block awards are half, so income per block is half.

Why would they do that though?
hero member
Activity: 854
Merit: 510
September 01, 2014, 12:19:48 AM
#52
Well, either power becomes a major factor or, if the growth is slow, the 2016 halving will stop it.

Yup, once the halving happens it will be a whole new game.   

What will halving do to miners?  Please elaborate
Block awards are half, so income per block is half.
sr. member
Activity: 420
Merit: 250
August 31, 2014, 11:50:55 PM
#51
Well, either power becomes a major factor or, if the growth is slow, the 2016 halving will stop it.

Yup, once the halving happens it will be a whole new game.   

What will halving do to miners?  Please elaborate
hero member
Activity: 854
Merit: 510
August 31, 2014, 11:10:18 PM
#50
The only way anyone except Intel is making 14nm bitcoin ASICs is if they pay a shit load of money to Intel.

Or to someone... 14nm is a pretty significant scaling hurdle, as we are starting to build chips with components that are only a few atoms across

http://www.extremetech.com/computing/97469-is-14nm-the-end-of-the-road-for-silicon-lithography

Very interesting article.    However, it may be possible for companies to get Intel to make the chips.   I expect that would be very costly though.    Probably currently way out of the reach of smaller companies.
full member
Activity: 121
Merit: 100
August 31, 2014, 08:02:27 PM
#49
The only way anyone except Intel is making 14nm bitcoin ASICs is if they pay a shit load of money to Intel.

Or to someone... 14nm is a pretty significant scaling hurdle, as we are starting to build chips with components that are only a few atoms across

http://www.extremetech.com/computing/97469-is-14nm-the-end-of-the-road-for-silicon-lithography
hero member
Activity: 575
Merit: 500
August 31, 2014, 06:49:59 PM
#48
The only way anyone except Intel is making 14nm bitcoin ASICs is if they pay a shit load of money to Intel.

TBH, Intel is barely making stuff on 14nm ;P
legendary
Activity: 826
Merit: 1004
August 31, 2014, 10:15:05 AM
#47
The only way anyone except Intel is making 14nm bitcoin ASICs is if they pay a shit load of money to Intel.
legendary
Activity: 1281
Merit: 1000
☑ ♟ ☐ ♚
August 31, 2014, 01:46:20 AM
#46
This is still way out there in time.   Assuming it all goes well, chips are not available until May 2015.   Systems would be after that.    You have a long time for your happy dance.   Does seem a lot more possible though.

I meant that BlackArrow, or any other Bitcoin ASIC company, won't be able to produce 14nm chips for years. Come on.
hero member
Activity: 854
Merit: 510
August 30, 2014, 11:40:46 PM
#45
The only way BA is making 14nm ASICs is if they make 28nm stuff and cut it in 1/2 with a Ginsu knife.

Their track record speaks volumes.  Anybody jumping on that is just funding their shadiness.
Strangely I'm not surprised to hear that. 
DrG
legendary
Activity: 2086
Merit: 1035
August 30, 2014, 10:20:19 PM
#44
The only way BA is making 14nm ASICs is if they make 28nm stuff and cut it in 1/2 with a Ginsu knife.

Their track record speaks volumes.  Anybody jumping on that is just funding their shadiness.
sr. member
Activity: 285
Merit: 250
August 30, 2014, 10:13:33 PM
#43
Black arrow have bad records right? Not worth to risk.
hero member
Activity: 854
Merit: 510
August 30, 2014, 08:44:17 PM
#42
This is still way out there in time.   Assuming it all goes well, chips are not available until May 2015.   Systems would be after that.    You have a long time for your happy dance.   Does seem a lot more possible though.
legendary
Activity: 1281
Merit: 1000
☑ ♟ ☐ ♚
DrG
legendary
Activity: 2086
Merit: 1035
July 18, 2014, 11:45:10 PM
#40
Moore's law is approaching its end.  We won't keep seeing leaps in transistor reduction.  Switching to graphene or some other exotic manufacturing might be the future.

But for what seems an eternity in the Bitcoin world, the next 1-2 years will still see more efficient miners released.
Agreed.  I seriously doubt ASIC miniaturization will slow down within the next 18-24 months; after that I suspect we will see a slew of optimization techniques that carry this trend on for another 6-12 months easy.  Of course, all of this is largely contingent upon the bitcoin price continuing its upward trend.

I believe ALToids was saying the opposite of what you're saying.  CPUs have pretty much reached a wall.

ASICs for Bitcoin still have an efficiency jump to 20nm, but 14nm will be cost prohibitive because all the foundries will have other orders.  This won't be for another year - which in Bitcoin terms is like 25 years from now.
Actually the jump to 14nm is more than just a cost problem.  A lot of redesign has to be done to gain the full benefits.   Probably much more than a year yet.

When producing a batch run of 100k chips, the $500K spent on 5 full-time engineers is nothing compared to the $20 million needed for the run.  You are right that a LOT of work needs to be done to optimize 14nm, but most ASIC companies aren't really optimizing their current 20/28nm offerings.  That's why Bitfury was able to do so well on the efficiency end even at a larger node.
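Plugging in the post's own round numbers shows why the engineering cost is noise next to the wafer run (the $100K-per-engineer figure is implied, not stated):

```python
# The post's figures: ~$500K for 5 engineers vs a $20M production run of 100k chips.
nre = 5 * 100_000            # $500K of engineering salaries (non-recurring)
run_cost = 20_000_000        # cost of the production run
chips = 100_000              # chips in the batch

overhead = nre / run_cost    # engineering adds only 2.5% on top of the run
per_chip = nre / chips       # i.e. $5 of extra cost per chip
```

At $5 per chip, skimping on optimization effort saves almost nothing, which is the point being made against the unoptimized 20/28nm offerings.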
full member
Activity: 153
Merit: 100
July 18, 2014, 10:29:55 AM
#39
http://igg.me/p/822701/x

What do you guys think?  Legit or scam?

If this is real it could mean that difficulty would go up even more and our old miners would become obsolete even faster.

Scam, avoid at all cost
hero member
Activity: 854
Merit: 510
July 18, 2014, 04:33:28 AM
#38
Moore's law is approaching its end.  We won't keep seeing leaps in transistor reduction.  Switching to graphene or some other exotic manufacturing might be the future.

But for what seems an eternity in the Bitcoin world, the next 1-2 years will still see more efficient miners released.
Agreed.  I seriously doubt ASIC miniaturization will slow down within the next 18-24 months; after that I suspect we will see a slew of optimization techniques that carry this trend on for another 6-12 months easy.  Of course, all of this is largely contingent upon the bitcoin price continuing its upward trend.

I believe ALToids was saying the opposite of what you're saying.  CPUs have pretty much reached a wall.

ASICs for Bitcoin still have an efficiency jump to 20nm, but 14nm will be cost prohibitive because all the foundries will have other orders.  This won't be for another year - which in Bitcoin terms is like 25 years from now.
Actually the jump to 14nm is more than just a cost problem.  A lot of redesign has to be done to gain the full benefits.   Probably much more than a year yet.
DrG
legendary
Activity: 2086
Merit: 1035
July 18, 2014, 04:03:06 AM
#37
Moore's law is approaching its end.  We won't keep seeing leaps in transistor reduction.  Switching to graphene or some other exotic manufacturing might be the future.

But for what seems an eternity in the Bitcoin world, the next 1-2 years will still see more efficient miners released.
Agreed.  I seriously doubt ASIC miniaturization will slow down within the next 18-24 months; after that I suspect we will see a slew of optimization techniques that carry this trend on for another 6-12 months easy.  Of course, all of this is largely contingent upon the bitcoin price continuing its upward trend.

I believe ALToids was saying the opposite of what you're saying.  CPUs have pretty much reached a wall.

ASICs for Bitcoin still have an efficiency jump to 20nm, but 14nm will be cost prohibitive because all the foundries will have other orders.  This won't be for another year - which in Bitcoin terms is like 25 years from now.
hero member
Activity: 871
Merit: 505
Founder of Incakoin
July 18, 2014, 02:51:32 AM
#36
As for the original post, scam.

It's one of those "flexible funding" Indiegogo projects, which means "even if we don't get funded, we keep your money".

bump!
full member
Activity: 210
Merit: 100
★☆★ 777Coin - The Exciting Bitco
July 15, 2014, 08:51:53 PM
#35
Moore's law is approaching its end.  We won't keep seeing leaps in transistor reduction.  Switching to graphene or some other exotic manufacturing might be the future.

But for what seems an eternity in the Bitcoin world, the next 1-2 years will still see more efficient miners released.
Agreed.  I seriously doubt ASIC miniaturization will slow down within the next 18-24 months; after that I suspect we will see a slew of optimization techniques that carry this trend on for another 6-12 months easy.  Of course, all of this is largely contingent upon the bitcoin price continuing its upward trend.
hero member
Activity: 873
Merit: 1007
July 15, 2014, 04:07:41 AM
#34
Well, either power becomes a major factor or, if the growth is slow, the 2016 halving will stop it.

Yup, once the halving happens it will be a whole new game.   

Do we know exactly when the halving will occur at this rate?

The next halving will occur at the 420,000th block, which will be found around August 2016 according to http://bitcoinclock.com/.

It will stay at August if there's no growth in the network.  If the difficulty decreases it will get pushed back; if it increases, it moves up.  It will probably be sometime between May and June 2016.
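The push-forward effect described here is easy to estimate: the ETA only depends on how many blocks remain and how fast blocks actually arrive. A sketch with a hypothetical mid-2014 height (the 10-minute figure is Bitcoin's design target; sustained hash-rate growth makes real blocks arrive faster, pulling the date earlier):

```python
from datetime import datetime, timedelta

def estimate_halving_date(current_height, now,
                          halving_height=420_000, minutes_per_block=10.0):
    """Naive ETA for the next halving assuming a constant block interval."""
    blocks_left = halving_height - current_height
    return now + timedelta(minutes=blocks_left * minutes_per_block)

# Hypothetical block height for mid-July 2014:
eta = estimate_halving_date(310_000, datetime(2014, 7, 15))
# At exactly 10 min/block this lands in August 2016; at ~9 min/block it
# moves up toward early June 2016.
```

That is why the bitcoinclock-style August figure drifts earlier whenever the network keeps growing between difficulty adjustments.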
legendary
Activity: 1204
Merit: 1002
July 15, 2014, 01:09:11 AM
#33
As for the original post, scam.

It's one of those "flexible funding" Indiegogo projects, which means "even if we don't get funded, we keep your money".
hero member
Activity: 662
Merit: 500
July 13, 2014, 04:05:21 AM
#32
Well, either power becomes a major factor or, if the growth is slow, the 2016 halving will stop it.

Yup, once the halving happens it will be a whole new game.   

Do we know exactly when the halving will occur at this rate?

The next halving will occur at the 420,000th block, which will be found around August 2016 according to http://bitcoinclock.com/.
hero member
Activity: 662
Merit: 500
July 13, 2014, 04:03:04 AM
#31
Moore's law is approaching it's end.  We won't keep seeing leaps in transistor reduction.

Exactly. The quantum tunneling effect will eventually kick in. Smiley

Switching to graphene or some other exotic manufacturing might be the future.

Something like this http://spectrum.ieee.org/semiconductors/devices/the-tunneling-transistor ?
hero member
Activity: 854
Merit: 510
July 13, 2014, 03:55:20 AM
#30
14nm is coming to Intel; it might take a while before 14nm is used for ASICs.
Actually, Intel is currently producing 14nm chips.  However, you are correct that it may be some time before we see 14nm ASICs.
hero member
Activity: 526
Merit: 500
July 13, 2014, 02:21:15 AM
#29
14nm is coming to Intel; it might take a while before 14nm is used for ASICs.
sr. member
Activity: 266
Merit: 250
July 13, 2014, 02:18:25 AM
#28
new generation = higher hash rate = higher price = won't ROI = difficulty going crazy

Am I supposed to say 14nm is impossible for now?
I mean, it won't sell like hotcakes
hero member
Activity: 854
Merit: 510
July 13, 2014, 01:34:25 AM
#27
Impossible; the smallest process node is 20nm, and there wouldn't be such a decrease so soon. I would be interested to see his actual working product if he claims it's 14nm. I don't think we can hit 14nm within a year

Actually the first 14nm FPGA chips started showing up in summer 2013.   While better in power consumption and the number of gates available, they aren't vastly better though.

http://www.fpgacentral.com/news/2013/altera-14nm-stratix-and-20nm-arria-fpga

However, I haven't heard of any 14nm ASICs yet.

There's no reason to jump the gun to a more difficult and costly architecture when the "older" stuff still hasn't been optimized.  Some of the 20/28nm stuff is not as efficient as it could/should be.  Now if 14nm was cheap that would be another story...
14nm isn't cheap and it requires a lot of redesign to get to the 2x performance level.   You can bet that random guys jawing in a bar aren't able to do that level of design.
hero member
Activity: 519
Merit: 500
July 13, 2014, 01:23:16 AM
#26
Impossible; the smallest process node is 20nm, and there wouldn't be such a decrease so soon. I would be interested to see his actual working product if he claims it's 14nm. I don't think we can hit 14nm within a year

Actually the first 14nm FPGA chips started showing up in summer 2013.   While better in power consumption and the number of gates available, they aren't vastly better though.

http://www.fpgacentral.com/news/2013/altera-14nm-stratix-and-20nm-arria-fpga

However, I haven't heard of any 14nm ASICs yet.

There's no reason to jump the gun to a more difficult and costly architecture when the "older" stuff still hasn't been optimized.  Some of the 20/28nm stuff is not as efficient as it could/should be.  Now if 14nm were cheap that would be another story...
hero member
Activity: 854
Merit: 510
July 12, 2014, 11:17:18 PM
#25
Impossible, the smallest process node is 20nm; there wouldn't be such a decrease so soon. I would be interested to see his actual working product if he claims it's 14nm. I don't think we can hit 14nm within a year.

Actually the first 14nm FPGA chips started showing up in the summer of 2013.   While better in power consumption and the number of gates available, they aren't vastly better though.   

http://www.fpgacentral.com/news/2013/altera-14nm-stratix-and-20nm-arria-fpga

However, I haven't heard of any 14nm ASICs yet.
legendary
Activity: 3038
Merit: 4418
Crypto Swap Exchange
July 12, 2014, 08:28:42 PM
#24
Impossible, the smallest process node is 20nm; there wouldn't be such a decrease so soon. I would be interested to see his actual working product if he claims it's 14nm. I don't think we can hit 14nm within a year.
July 07, 2014, 05:24:15 PM
#21
I don't think we will see 14 nm ASICs very soon, because the smallest process node currently is 20 nm. Only Intel has smaller ones at the moment.

Intel does produce 14 nm chips, but I haven't heard of 14 nm ASICs yet.   There are 14 nm FPGAs, http://www.altera.com/devices/fpga/stratix-fpgas/stratix10/stx10-index.jsp

That still doesn't change my view.  It won't be some guys talking about problems in a bar over nachos that end up delivering 14 nm ASICs.  Also, when we do see them, they won't be the huge jump in performance that might be expected.  
sr. member
Activity: 406
Merit: 250
July 07, 2014, 05:08:28 PM
#20
http://igg.me/p/822701/x

What do you guys think?  Legit or scam?

If this is real it could mean that difficulty would go up even more and our old miners would become obsolete even faster.

My thoughts:

1) A 14nm ASIC is unlikely to come from a crowd-funded source with no ASIC experience.
2) There are many problems with scaling ASICs that have not necessarily been solved.   The results may in fact be disappointing, maybe in the range of 1.6x faster and not as much power reduction as expected.
3) 14nm is very new and hasn't had time to mature.   

There will likely be smaller ASICs in coming years, but the huge gains are behind us.   Now it is more a matter of using more ASICs, not vastly faster ASICs.


So I wonder if difficulty will plateau then?


The difficulty growth rate will slow down, but the real question is when.   Personally I think there is still more than a year before we start seeing a real plateau.   At some point the amount of power used for bitcoin mining will become a limit to growth.  

Yeah.  That's what I was thinking.  At some point, electricity costs will set the plateau of difficulty.  Interesting opinions!
We still have quite a way to go before the difficulty will get to this point.
legendary
Activity: 4326
Merit: 8950
'The right to privacy matters'
July 07, 2014, 02:13:48 PM
#19
Well either the power becomes a major factor or if the growth is slow then the 2016 halving will stop it.

Yup, once the halving happens it will be a whole new game.   

Do we know exactly when the halving will occur at this rate?

No, not exactly, but if the average rate stays under 20%, some time in 2016.

Most people do not realize the growth rate has slowed.   It is just a bit under 18% average since the 30% jump on Dec 21st, 2013.  That is about a 200-day time period.     The average from June to Dec 21, 2013 was about 24%.

One simple reason the rate of difficulty growth slowed is that power savings are slowing.

So 24% for 200 days, then 18% for 200 days; the next 200 days will be 13-15% average.

And btw, no 14nm chip will do 14.4TH at under 1000 watts, so the AC dude is tripping on acid.

At best 14.4TH will be 2000 watts with 14nm chips.
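A quick back-of-the-envelope check makes the point above concrete. The efficiency numbers here (including the ~0.5 J/GH ballpark for good 28nm miners in mid-2014) are my own rough assumptions, not vendor figures:

```python
# Sanity-check the claimed 14.4 TH/s at 1000 W against the 2000 W estimate.
def joules_per_gh(watts, th_per_s):
    """Efficiency in J/GH: power draw divided by hash rate in GH/s."""
    return watts / (th_per_s * 1000.0)

claimed = joules_per_gh(1000, 14.4)    # the campaign's claim
realistic = joules_per_gh(2000, 14.4)  # the 2000 W estimate above
best_28nm = 0.5                        # rough ballpark for 28nm, mid-2014

print(f"claimed:   {claimed:.3f} J/GH")    # ~0.069 J/GH, ~7x better than 28nm
print(f"realistic: {realistic:.3f} J/GH")  # ~0.139 J/GH, still a big jump
```

Even the 2000 W version would be several times more efficient than typical 28nm hardware of the time, which is why the 1000 W claim looks so implausible.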
full member
Activity: 221
Merit: 100
July 07, 2014, 02:08:25 PM
#18
I don't think we will see 14 nm ASICs very soon, because the smallest process node currently is 20 nm. Only Intel has smaller ones at the moment.
member
Activity: 87
Merit: 10
July 07, 2014, 02:01:08 PM
#17
Well either the power becomes a major factor or if the growth is slow then the 2016 halving will stop it.

Yup, once the halving happens it will be a whole new game.   

Do we know exactly when the halving will occur at this rate?
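For what it's worth, the halving is fixed at block 420,000 rather than at a calendar date, so the estimate depends only on block height and average block time. A minimal sketch, assuming a height of roughly 308,000 in early July 2014 and blocks arriving slightly faster than 10 minutes while the hash rate grows:

```python
from datetime import date, timedelta

HALVING_HEIGHT = 420_000   # block at which the reward drops from 25 to 12.5 BTC
current_height = 308_000   # rough height as of early July 2014 (my assumption)
avg_block_minutes = 9.6    # slightly under 10 min while hash rate is rising

blocks_left = HALVING_HEIGHT - current_height
days_left = blocks_left * avg_block_minutes / (60 * 24)
estimate = date(2014, 7, 7) + timedelta(days=days_left)
print(estimate)  # lands somewhere in mid-2016
```

Faster difficulty growth pulls the date earlier (blocks come in under 10 minutes between adjustments), which is why the answer shifts with the growth rate.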
hero member
Activity: 854
Merit: 510
July 07, 2014, 01:49:44 PM
#16
Well either the power becomes a major factor or if the growth is slow then the 2016 halving will stop it.

Yup, once the halving happens it will be a whole new game.   
member
Activity: 70
Merit: 10
July 07, 2014, 01:40:34 PM
#15
Power is not necessarily the main factor; there are many who are mining at a loss, hoping for a value increase. There are others who are not paying for electricity, or paying a very small price.
hero member
Activity: 519
Merit: 500
July 07, 2014, 07:40:51 AM
#14
http://igg.me/p/822701/x

What do you guys think?  Legit or scam?

If this is real it could mean that difficulty would go up even more and our old miners would become obsolete even faster.

My thoughts:

1) A 14nm ASIC is unlikely to come from a crowd-funded source with no ASIC experience.
2) There are many problems with scaling ASICs that have not necessarily been solved.   The results may in fact be disappointing, maybe in the range of 1.6x faster and not as much power reduction as expected.
3) 14nm is very new and hasn't had time to mature.   

There will likely be smaller ASICs in coming years, but the huge gains are behind us.   Now it is more a matter of using more ASICs, not vastly faster ASICs.


So I wonder if difficulty will plateau then?


The difficulty growth rate will slow down, but the real question is when.   Personally I think there is still more than a year before we start seeing a real plateau.   At some point the amount of power used for bitcoin mining will become a limit to growth.  

Well either the power becomes a major factor or if the growth is slow then the 2016 halving will stop it.
member
Activity: 87
Merit: 10
July 07, 2014, 07:04:09 AM
#13
http://igg.me/p/822701/x

What do you guys think?  Legit or scam?

If this is real it could mean that difficulty would go up even more and our old miners would become obsolete even faster.

My thoughts:

1) A 14nm ASIC is unlikely to come from a crowd-funded source with no ASIC experience.
2) There are many problems with scaling ASICs that have not necessarily been solved.   The results may in fact be disappointing, maybe in the range of 1.6x faster and not as much power reduction as expected.
3) 14nm is very new and hasn't had time to mature.   

There will likely be smaller ASICs in coming years, but the huge gains are behind us.   Now it is more a matter of using more ASICs, not vastly faster ASICs.


So I wonder if difficulty will plateau then?


The difficulty growth rate will slow down, but the real question is when.   Personally I think there is still more than a year before we start seeing a real plateau.   At some point the amount of power used for bitcoin mining will become a limit to growth.  

Yeah.  That's what I was thinking.  At some point, electricity costs will set the plateau of difficulty.  Interesting opinions!
hero member
Activity: 854
Merit: 510
July 06, 2014, 10:35:30 PM
#12
http://igg.me/p/822701/x

What do you guys think?  Legit or scam?

If this is real it could mean that difficulty would go up even more and our old miners would become obsolete even faster.

My thoughts:

1) A 14nm ASIC is unlikely to come from a crowd-funded source with no ASIC experience.
2) There are many problems with scaling ASICs that have not necessarily been solved.   The results may in fact be disappointing, maybe in the range of 1.6x faster and not as much power reduction as expected.
3) 14nm is very new and hasn't had time to mature.   

There will likely be smaller ASICs in coming years, but the huge gains are behind us.   Now it is more a matter of using more ASICs, not vastly faster ASICs.


So I wonder if difficulty will plateau then?


The difficulty growth rate will slow down, but the real question is when.   Personally I think there is still more than a year before we start seeing a real plateau.   At some point the amount of power used for bitcoin mining will become a limit to growth.  
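The "power becomes a limit to growth" argument can be sketched as a toy break-even calculation. Every input below is a hypothetical round number, not real market data; the point is only that at some electricity price the profit crosses zero, which is what sets the plateau:

```python
def daily_profit(hash_th, watts, btc_price, network_th, elec_per_kwh,
                 block_reward=25.0, blocks_per_day=144):
    """USD/day: your share of the daily coinbase minus electricity,
    ignoring hardware cost and pool fees."""
    revenue = (hash_th / network_th) * block_reward * blocks_per_day * btc_price
    power_cost = watts / 1000.0 * 24 * elec_per_kwh
    return revenue - power_cost

# 1 TH/s rig drawing 1000 W, $600/BTC, 200 PH network, $0.08/kWh (all made up)
print(round(daily_profit(1.0, 1000, 600.0, 200_000, 0.08), 2))
```

As difficulty (the network hash rate here) climbs, revenue shrinks while the power bill stays fixed, so miners with expensive electricity drop out first; that churn is the equilibrium the thread is describing.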
sr. member
Activity: 406
Merit: 250
July 06, 2014, 10:34:39 PM
#11
http://igg.me/p/822701/x

What do you guys think?  Legit or scam?

If this is real it could mean that difficulty would go up even more and our old miners would become obsolete even faster.

My thoughts:

1) A 14nm ASIC is unlikely to come from a crowd-funded source with no ASIC experience.
2) There are many problems with scaling ASICs that have not necessarily been solved.   The results may in fact be disappointing, maybe in the range of 1.6x faster and not as much power reduction as expected.
3) 14nm is very new and hasn't had time to mature.   

There will likely be smaller ASICs in coming years, but the huge gains are behind us.   Now it is more a matter of using more ASICs, not vastly faster ASICs.


So I wonder if difficulty will plateau then?

Difficulty still has some way to go: if you exclude the initial cost of buying the miners, mining can still be profitable after accounting for electricity costs.
hero member
Activity: 519
Merit: 500
July 06, 2014, 09:05:47 PM
#10
Moore's law is approaching its end.  We won't keep seeing leaps in transistor scaling.  Switching to graphene or some other exotic manufacturing might be the future.

But for what seems an eternity in the Bitcoin world, the next 1-2 years will still see more efficient miners released.
full member
Activity: 196
Merit: 100
July 06, 2014, 08:48:33 PM
#9
"I am a Graduated Science Engineer major with a 3.84GPA from Atlantic City, NJ, USA. "

Scam... a person with the credentials offered would not have made this egregious grammar mistake

Yeah.  That's true.  I'm wondering when the improvement in ASICs will stop.

Uh, never.

Unless bitcoin and all other SHA-256 coins go bust. I'm guessing you don't want that.

But what's going to get better?  Optimization, I guess.  Probably not the process node.

Okay folks, a little computer history for you.

In 1990 the fastest consumer-level computer you could get was a Mac IIfx at 40MHz with 4 MB of RAM, expandable to 128 MB. That was 24 years ago.

How long have SHA-256 ASICs been around? 2 years?
member
Activity: 87
Merit: 10
July 06, 2014, 08:44:43 PM
#8
http://igg.me/p/822701/x

What do you guys think?  Legit or scam?

If this is real it could mean that difficulty would go up even more and our old miners would become obsolete even faster.

My thoughts:

1) A 14nm ASIC is unlikely to come from a crowd-funded source with no ASIC experience.
2) There are many problems with scaling ASICs that have not necessarily been solved.   The results may in fact be disappointing, maybe in the range of 1.6x faster and not as much power reduction as expected.
3) 14nm is very new and hasn't had time to mature.   

There will likely be smaller ASICs in coming years, but the huge gains are behind us.   Now it is more a matter of using more ASICs, not vastly faster ASICs.


So I wonder if difficulty will plateau then?
hero member
Activity: 854
Merit: 510
July 06, 2014, 08:42:20 PM
#7
http://igg.me/p/822701/x

What do you guys think?  Legit or scam?

If this is real it could mean that difficulty would go up even more and our old miners would become obsolete even faster.

My thoughts:

1) A 14nm ASIC is unlikely to come from a crowd-funded source with no ASIC experience.
2) There are many problems with scaling ASICs that have not necessarily been solved.   The results may in fact be disappointing, maybe in the range of 1.6x faster and not as much power reduction as expected.
3) 14nm is very new and hasn't had time to mature.   

There will likely be smaller ASICs in coming years, but the huge gains are behind us.   Now it is more a matter of using more ASICs, not vastly faster ASICs.
member
Activity: 87
Merit: 10
July 06, 2014, 08:36:40 PM
#6
"I am a Graduated Science Engineer major with a 3.84GPA from Atlantic City, NJ, USA. "

Scam... a person with the credentials offered would not have made this egregious grammar mistake

Yeah.  That's true.  I'm wondering when the improvement in ASICs will stop.

Uh, never.

Unless bitcoin and all other SHA-256 coins go bust. I'm guessing you don't want that.

But what's going to get better?  Optimization, I guess.  Probably not the process node.
full member
Activity: 196
Merit: 100
July 06, 2014, 08:35:33 PM
#5
"I am a Graduated Science Engineer major with a 3.84GPA from Atlantic City, NJ, USA. "

Scam... a person with the credentials offered would not have made this egregious grammar mistake

Yeah.  That's true.  I'm wondering when the improvement in ASICs will stop.

Uh, never.

Unless bitcoin and all other SHA-256 coins go bust. I'm guessing you don't want that.
full member
Activity: 121
Merit: 100
July 06, 2014, 08:34:12 PM
#4

Yeah.  That's true.  I'm wondering when the improvement in ASICs will stop.

never... technology will always get "better" given enough time and incentive for profit
member
Activity: 87
Merit: 10
July 06, 2014, 08:32:38 PM
#3
"I am a Graduated Science Engineer major with a 3.84GPA from Atlantic City, NJ, USA. "

Scam... a person with the credentials offered would not have made this egregious grammar mistake

Yeah.  That's true.  I'm wondering when the improvement in ASICs will stop.
full member
Activity: 121
Merit: 100
July 06, 2014, 08:31:03 PM
#2
"I am a Graduated Science Engineer major with a 3.84GPA from Atlantic City, NJ, USA. "

Scam... a person with the credentials offered would not have made this egregious grammar mistake
member
Activity: 87
Merit: 10
July 06, 2014, 08:24:13 PM
#1
http://igg.me/p/822701/x

What do you guys think?  Legit or scam?

If this is real it could mean that difficulty would go up even more and our old miners would become obsolete even faster.