
Topic: Are ASICs the last major evolution in mining hardware? - page 3. (Read 6886 times)

full member
Activity: 219
Merit: 100

my 2 bitcents:

in the future who knows..  quantum computing, positronic brains, crystalline entities?



Indeed. A sufficiently big quantum computer would be able to find a suitable block hash in an arbitrarily short time, thus ending Bitcoin for good.

Of course, such a quantum computer would be useful for other things as well (can you spell SSH?)...

legendary
Activity: 980
Merit: 1040
Nothing major to come besides the obvious efficiency and manufacturing improvements, not before the next few reward splits anyway.

Well, I do expect a price war that will have pretty major consequences. Once all the ASIC vendors have sold and delivered their initial runs and difficulty explodes, mining revenue per GH, and thus demand, will dry up unless they cut prices, and they should have very high per-unit margins allowing them to do just that (ASICs cost almost nothing to produce). That causes even higher difficulty, requiring even lower prices for these devices to remain marketable. Over and over. In that sense the next year or two will see pretty dramatic changes IMO, but mostly with the same chips for sale now. Just for satoshis on the bitcoin, so to speak.
hero member
Activity: 560
Merit: 500
Some really good replies here by some really knowledgeable people. It looks like bitcoin has caught up, technology-wise, to the rest of the industry. Nothing major to come besides the obvious efficiency and manufacturing improvements, not before the next few reward splits anyway.

legendary
Activity: 980
Merit: 1040
ASICs will be in the same boat as CPUs/GPUs, as they fundamentally use the same technology. Short of quantum computing, there hasn't been anything even on the horizon that promises a "quantum leap" beyond Moore's law for the past 30 or 40 years. Keeping up with Moore's law will prove difficult enough by itself. Anything is possible, but if something new does come along that offers another order-of-magnitude increase, it will almost certainly revolutionize a whole lot more than just bitcoin mining.

BTW, I don't expect to see any real improvements over the first generation of ASICs any time soon. In the short/mid term they won't even keep up with Moore's law, because I believe there will be no financial incentive to invest in more advanced process nodes. We will get a price war first, and once we get to the bottom of that (orders of magnitude cheaper than today), most miners will be so far under water that I doubt there will be a market for slightly more energy-efficient devices, especially considering the high NRE it would take to develop 28nm or smaller chips.
member
Activity: 91
Merit: 10
“640K ought to be enough for anybody.” -Bill Gates (1981)

Well you can clearly see how that went.

Which has nothing to do with the topic.

Nobody said faster and more efficient ASICs aren't possible, but there is no technology which would allow a 680x increase in efficiency over what the Avalon is capable of. Going to a state-of-the-art, fully custom, optimized 28nm ASIC might allow a 24x increase in performance; after that, miners would be limited to Moore's law. There are no more "shortcuts".

I was referring to that quote because you can never be sure about the future unless you've been there yourself.
As others have said, quantum computing will most likely be available at some point in the future.
-ck
legendary
Activity: 4088
Merit: 1631
Ruu \o/
More people talking nonsense....

There are still algorithmic improvements; at some point further 'shortcuts' will be discovered.
Spotted at least one myself... problem is I'm not a maths guru, so I cannot figure out a formula, but I can see it happening.


So why didn't these happen on GPUs (which are pretty damn easy to reprogram)? GPU mining algorithm efficiency has all but stalled: 18 months ago people were finding 10% here and 3% there, and that all essentially flatlined almost a year ago.

I doubt there is much algorithmic efficiency left. Now, it is possible SHA-256 will be partially compromised, which would allow for the creation of "optimized" hashers that take that cryptographic flaw into account, but it is hard to predict when or if that will happen.
D&T is right. I can tell you that both Diablo and I, along with numerous others along the way, have spent many hours trying everything to further tweak the algorithms as used by GPUs, and we have been unable to squeeze anything more out of them. I suspect the on-chip algorithm on the ASICs closely resembles these optimised OpenCL kernels as used by GPUs.
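For context on what those kernels compute, here is a minimal sketch of the double-SHA-256 proof-of-work check that all this optimization effort targets. The header bytes below are hypothetical placeholders, not a real block; the function names are mine, not from any mining software.

```python
import hashlib
import struct

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def meets_target(header: bytes, target: int) -> bool:
    """A hash wins if, read as a little-endian integer, it falls below the target."""
    return int.from_bytes(double_sha256(header), "little") < target

# Hypothetical 80-byte block header; a miner simply iterates the trailing 4-byte nonce.
for nonce in range(100):
    header = bytes(76) + struct.pack("<I", nonce)
    if meets_target(header, 2 ** 252):
        print(f"nonce {nonce} clears this (deliberately easy) target")
        break
```

One well-known software shortcut this sketch omits: since only the nonce-bearing tail of the header changes per attempt, miners cache the SHA-256 midstate of the first 64 header bytes and restart the hash from there, which is roughly the kind of optimization that had already been wrung out of the GPU kernels.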
donator
Activity: 2058
Merit: 1007
Poor impulse control.

my 2 bitcents:

in the future science fiction who knows..  quantum computing, positronic brains, crystalline entities?



FTFY, and saved me from posting it myself.
legendary
Activity: 1876
Merit: 1000

my 2 bitcents:

in the future who knows..  quantum computing, positronic brains, crystalline entities?

donator
Activity: 1218
Merit: 1079
Gerald Davis
“640K ought to be enough for anybody.” -Bill Gates (1981)

Well you can clearly see how that went.

Which has nothing to do with the topic.

An Avalon ASIC is roughly 17x more efficient than the average FPGA (170 MH/J vs 10 MH/J).
An Avalon ASIC is roughly 85x more efficient than the average GPU (170 MH/J vs 2 MH/J).
An Avalon ASIC is roughly 680x more efficient than the average CPU (170 MH/J vs 0.25 MH/J).

Nobody said faster and more efficient ASICs aren't possible, but there is no technology which would allow a 680x increase in efficiency over what the Avalon is capable of. Going to a state-of-the-art, fully custom, optimized 28nm ASIC might allow a 24x increase in performance; after that, miners would be limited to Moore's law. There are no more "shortcuts".

When you consider that at one time "good miners" were operating at 0.25 MH/J efficiency, while bad guys who could "cheat" (using 28nm full-custom ASICs) could achieve roughly a 16,000x "shortcut", the gap has been significantly closed.
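The ratios quoted above follow directly from the per-device MH/J figures; a quick sanity check:

```python
# Approximate efficiency figures from the post above, in MH/J.
efficiency = {"CPU": 0.25, "GPU": 2.0, "FPGA": 10.0, "Avalon ASIC": 170.0}

avalon = efficiency["Avalon ASIC"]
for device, mh_per_joule in efficiency.items():
    # prints 680x for CPU, 85x for GPU, 17x for FPGA
    print(f"Avalon vs {device}: {avalon / mh_per_joule:.0f}x")
```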

member
Activity: 91
Merit: 10
“640K ought to be enough for anybody.” -Bill Gates (1981)

Well you can clearly see how that went.
legendary
Activity: 952
Merit: 1000
More people talking nonsense....

There are still algorithmic improvements; at some point further 'shortcuts' will be discovered.
I reckon the jump is going to be algorithmic. I mean finding mathematical shortcuts to hashing. That's where the action will be.
For some reason, the wording of both your posts sounded really weird to me. Either way, you're posting very vague answers to the OP's question. Yes, there will be improvements, but no, there will be no major technological advancement that produces the same jump as what we've seen from CPU -> GPU and now from GPU -> ASIC.
donator
Activity: 1218
Merit: 1079
Gerald Davis
More people talking nonsense....

There are still algorithmic improvements; at some point further 'shortcuts' will be discovered.
Spotted at least one myself... problem is I'm not a maths guru, so I cannot figure out a formula, but I can see it happening.


So why didn't these happen on GPUs (which are pretty damn easy to reprogram)? GPU mining algorithm efficiency has all but stalled: 18 months ago people were finding 10% here and 3% there, and that all essentially flatlined almost a year ago.

I doubt there is much algorithmic efficiency left. Now, it is possible SHA-256 will be partially compromised, which would allow for the creation of "optimized" hashers that take that cryptographic flaw into account, but it is hard to predict when or if that will happen.
hero member
Activity: 784
Merit: 502
Bitcoin was released into the wild in Jan 2009 and was first mined on CPUs, which everyone who learned about Bitcoin already had access to.

Then around Sep 2010 the first GPU miner was released. It caught on fast because GPUs were available off any computer store shelf, and most PC enthusiasts already had one in their machine.

Sometime around May 2011 the first FPGA was demonstrated. FPGA chips were already available, but required assembly to work as Bitcoin miners.

Now we have ASICs coming onto the market, which are purpose built chips, designed for the sole purpose of mining Bitcoins.

The big question is, where do you go from here? Or is this the last major leap for Bitcoin hardware?

I reckon the jump is going to be algorithmic. I mean finding mathematical shortcuts to hashing. That's where the action will be.
full member
Activity: 196
Merit: 100
More people talking nonsense....

There are still algorithmic improvements; at some point further 'shortcuts' will be discovered.
Spotted at least one myself... problem is I'm not a maths guru, so I cannot figure out a formula, but I can see it happening.
donator
Activity: 1218
Merit: 1079
Gerald Davis
The only major jumps left are catching up to current state-of-the-art technology (the 28nm process). After that, any future improvements would be limited to Moore's law.

The Avalon is built on a 110nm process, so there are jumps in performance to be had by manufacturing the same chip on a smaller process (80nm, 65nm, 40nm, 28nm). Each of these jumps would roughly double the efficiency (MH/J and MH/$). Also, the design of the Avalon is likely not the "end all" of SHA-256 processor design; some additional efficiency can be tweaked out of an improved chip design, improved construction, etc. How much? No idea; let's guess a 50% improvement is possible.

Still, even if you put all that together, a full-custom, perfectly optimized 28nm miner would likely be ~(110/28)^2 * 150%, or "only" ~24x as efficient, as a theoretical upper bound.
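That back-of-the-envelope bound can be written out explicitly. Note the 1.5x design-improvement factor is the guess from the post above, not a measured number, and the quadratic density scaling is an idealization:

```python
def shrink_upper_bound(old_nm: float, new_nm: float, design_gain: float) -> float:
    """Idealized scaling: transistor density grows with the square of the
    linear shrink, multiplied by an assumed design-improvement factor."""
    return (old_nm / new_nm) ** 2 * design_gain

# 110nm Avalon -> hypothetical full-custom 28nm chip, with a guessed 50% design gain.
bound = shrink_upper_bound(110, 28, 1.5)
print(f"~{bound:.0f}x efficiency upper bound")  # ~23x, i.e. the rough "24x" figure above
```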

legendary
Activity: 952
Merit: 1000
Or is this the last major leap for Bitcoin hardware?
Pretty much. You might see some efficiency improvements as the years go by, but from a performance standpoint, this is pretty much it.
hero member
Activity: 560
Merit: 500
Bitcoin was released into the wild in Jan 2009 and was first mined on CPUs, which everyone who learned about Bitcoin already had access to.

Then around Sep 2010 the first GPU miner was released. It caught on fast because GPUs were available off any computer store shelf, and most PC enthusiasts already had one in their machine.

Sometime around May 2011 the first FPGA was demonstrated. FPGA chips were already available, but required assembly to work as Bitcoin miners.

Now we have ASICs coming onto the market, which are purpose built chips, designed for the sole purpose of mining Bitcoins.

The big question is, where do you go from here? Or is this the last major leap for Bitcoin hardware?