
Topic: "ASIC-Proof" - page 2. (Read 3244 times)

full member
Activity: 164
Merit: 100
June 26, 2013, 05:46:45 AM
#36
It's not about being "ASIC-proof", more about being "disruption-proof". It's harder to disrupt the current Scrypt mining technology than it is to disrupt SHA-256 mining technology. This sort of protection has value for many people, especially since the Scrypt parameters could be tweaked to change the performance.

Actually, this thread is about being "ASIC-Proof" and/or "ASIC-Resistant".

That's a lovely subject for a different thread, though.

Have you looked up ASIC-Proof and ASIC-Resistant in the dictionary? I have and they're not there, so I guess it's possible there's no canonical definition and we're into semantics now.

I consider two meanings:
1) Can never be mined at all by any custom ASIC hardware
2) It is unlikely to be worthwhile to develop mass-produced mining ASICs whose efficiency gains are large enough to make GPU mining unprofitable - i.e. ASICs are not a realistic future threat to replace GPU mining of the coin.

Isn't 2) more important than 1) ?
hero member
Activity: 798
Merit: 1000
‘Try to be nice’
June 26, 2013, 05:08:30 AM
#35
I kind of think maybe we are all bored and wanted a thread where we could all agree with each other ....


{hey digi, that's right i agree !, also i like your point before about where you said we would be agreeing with each other}

{thanks i agree}

{but wait, what about where you said we were bored? , do you mean that's why we made this thread, if so,  then i agree.}

{yeah that's what i said, so i agree with you too. }

Also, what is a BOT?
full member
Activity: 196
Merit: 100
June 25, 2013, 10:58:46 PM
#34
It's not about being "ASIC-proof", more about being "disruption-proof". It's harder to disrupt the current Scrypt mining technology than it is to disrupt SHA-256 mining technology. This sort of protection has value for many people, especially since the Scrypt parameters could be tweaked to change the performance.

Actually, this thread is about being "ASIC-Proof" and/or "ASIC-Resistant".

That's a lovely subject for a different thread, though.
hero member
Activity: 632
Merit: 500
June 25, 2013, 10:50:34 PM
#33
It's not about being "ASIC-proof", more about being "disruption-proof". It's harder to disrupt the current Scrypt mining technology than it is to disrupt SHA-256 mining technology. This sort of protection has value for many people, especially since the Scrypt parameters could be tweaked to change the performance.
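The last point above - that the scrypt parameters can be tweaked - is easy to illustrate. A quick sketch using Python's `hashlib.scrypt` (Litecoin uses N=1024, r=1, p=1; the salt and input bytes below are placeholders for illustration, not real block data):

```python
import hashlib

# Working-set size of scrypt's ROMix stage, the part that punishes ASICs:
# roughly 128 * r * N bytes must be held (or recomputed) per hash in flight.
def scrypt_memory_bytes(n: int, r: int) -> int:
    return 128 * r * n

# Litecoin-style parameters: N=1024, r=1, p=1 -> 128 KiB per hash.
key = hashlib.scrypt(b"block header bytes", salt=b"", n=1024, r=1, p=1, dklen=32)
print(len(key))                       # 32-byte output
print(scrypt_memory_bytes(1024, 1))   # 131072 bytes = 128 KiB
# Tweaked parameters raise the memory bar without changing the algorithm:
print(scrypt_memory_bytes(16384, 8))  # 16777216 bytes = 16 MiB per hash
```

Raising N or r multiplies the memory every ASIC core would need, which is the "tweak the performance" lever being described.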
full member
Activity: 196
Merit: 100
June 25, 2013, 10:47:58 PM
#32
Based on the rapidly ballooning hashrate for litecoin, and the facts that

1. 95% of GPU miners switched to litecoin weeks if not months ago (as it has _LONG_ been more profitable than BTC for GPU miners)
2. Only a complete idiot would continue buying GPUs to mine anything

it's possible that someone has developed an FPGA or ASIC for mining scrypt coins, but is adding hashrate in such a way as to disguise this fact.  It no longer makes sense, in $/kWh terms, to buy GPU miners for scrypt coins.

Completely untrue at current profitability rates: mining LTC, a GPU will pay for itself in about 6.5 months.

Since people aren't exactly rushing out to buy GPUs at the moment and ASICs are a remote risk, we can expect the LTC network to stay at about the same hash rate / difficulty for an extended period of time, more than sufficient to recoup any initial investment.

And GPU mining profit is WAY above the power cost.  Just run a calculator:

https://give-me-ltc.com/calc

Just absurd to suggest that, at current levels, LTC mining won't pay for the power.  Prove your claim.  And if true, why is 20 GH/s being thrown at LTC and clones?

As for ASICs, good luck with your BFL pre-order #2,124,123.  Have fun mining with that once it arrives in 2040.

Regarding ASICs I agree with you wholeheartedly.

As for FPGAs, you'll never convince me that there isn't someone who programs FPGAs for a living or a hobby, already had a few sitting around, and set them up to mine litecoins. I'm sure they're out there.
full member
Activity: 196
Merit: 100
June 25, 2013, 10:45:06 PM
#31
I said this:

Now anybody that wants to argue that there can be an ASIC-Proof algorithm that a general purpose processor such as an i7 can compute just needs to put on an idiot hat and go sit in the corner.  Yes, I know that's rude as hell but, c'mon.

And you said this:

Strongly disagree. It is, at least in principle, possible to make an algorithm that an Intel Core i7 is the ideal chip for (that is, an Intel Core i7 IS the ASIC for this algorithm). You'd have to write the algorithm specifically for that task, but it is possible.

Do you see the disconnect, where you sort of confirm what I just said, but seem to be arguing against it?
hero member
Activity: 1395
Merit: 505
June 25, 2013, 10:40:12 PM
#30
Based on the rapidly ballooning hashrate for litecoin, and the facts that

1. 95% of GPU miners switched to litecoin weeks if not months ago (as it has _LONG_ been more profitable than BTC for GPU miners)
2. Only a complete idiot would continue buying GPUs to mine anything

it's possible that someone has developed an FPGA or ASIC for mining scrypt coins, but is adding hashrate in such a way as to disguise this fact.  It no longer makes sense, in $/kWh terms, to buy GPU miners for scrypt coins.

Completely untrue at current profitability rates: mining LTC, a GPU will pay for itself in about 6.5 months.

Since people aren't exactly rushing out to buy GPUs at the moment and ASICs are a remote risk, we can expect the LTC network to stay at about the same hash rate / difficulty for an extended period of time, more than sufficient to recoup any initial investment.

And GPU mining profit is WAY above the power cost.  Just run a calculator:

https://give-me-ltc.com/calc

Just absurd to suggest that, at current levels, LTC mining won't pay for the power.  Prove your claim.  And if true, why is 20 GH/s being thrown at LTC and clones?

As for ASICs, good luck with your BFL pre-order #2,124,123.  Have fun mining with that once it arrives in 2040.  And for FPGA BTC mining, good luck competing against the ASICs that have shipped and the hordes of them being run by the manufacturers for easy mining profit.
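The payback arithmetic behind the "6.5 months" claim is simple to sketch. The dollar figures below are made up for illustration only, not 2013 market data:

```python
# Toy GPU-mining payback calculator; all inputs are hypothetical.
def payback_months(hardware_cost, monthly_revenue, kwh_price, watts):
    # A card mining 24/7 runs roughly 730 hours per month.
    monthly_power_cost = watts / 1000.0 * 730 * kwh_price
    net = monthly_revenue - monthly_power_cost
    if net <= 0:
        return float("inf")  # revenue never covers the power bill
    return hardware_cost / net

# e.g. a $400 GPU earning $75/month of LTC, drawing 250 W at $0.12/kWh:
print(round(payback_months(400, 75, 0.12, 250), 1))  # → 7.5
```

The point of the exchange above is the `net <= 0` branch: "won't pay for the power" only happens when electricity exceeds revenue, which both posters agree wasn't the case at the time.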
full member
Activity: 196
Merit: 100
June 25, 2013, 10:38:06 PM
#29
Based on the rapidly ballooning hashrate for litecoin, and the facts that

1. 95% of GPU miners switched to litecoin weeks if not months ago (as it has _LONG_ been more profitable than BTC for GPU miners)
2. Only a complete idiot would continue buying GPUs to mine anything

it's possible that someone has developed an FPGA or ASIC for mining scrypt coins, but is adding hashrate in such a way as to disguise this fact.  It no longer makes sense, in $/kWh terms, to buy GPU miners for scrypt coins.

I believe that there are FPGAs mining Litecoin right now, and have been for some time.  It's pretty silly to think there aren't, IMO.

Not ASICs.  There isn't enough money in all of altcoins to cause secret ASIC development - the costs are so extreme that the only hope for viability would be a sales plan.

Remember it isn't the miners that get rich in the gold rush.
full member
Activity: 196
Merit: 100
June 25, 2013, 10:33:23 PM
#28

It appears the OP assumes implementing scrypt in hardware is identical in difficulty and development effort to implementing SHA256 in hardware, and that the benefits in hash rate would be similar to SHA256 on GPU's vs. an ASIC...  

Not at all.  Neither condition is necessary for the point I'm making (well trying to make).  See...

In reality, SHA256 is almost a prime example of an embarrassingly parallel problem (Google "embarrassingly parallel" if you're not familiar with what that term means in the context of parallel processing), and you can pipeline the whole SHA256 calculation and crank out a hash per clock cycle per core.  It's almost trivially simple to develop an SHA256 ASIC; there are even multiple Verilog implementations for FPGA's that you can start with to generate the netlist for the SHA256 cores (at which point the development process diverges from that of FPGA's, of course).  That's why we've seen several SHA256 ASICs arrive on the market that were designed by novices or people with negligible or no prior VLSI experience.

When it comes to development cost, there's also a massive spread.  You can go and pick yourself up an SHA256 core design, for free, that performs fairly well and is fully pipelined, from multiple sources.  For scrypt, you have to go it alone and develop it from scratch, and you end up with an almost infinitely more complex netlist than an SHA256 core (in fact, an scrypt core will tend to contain two SHA256 cores) that is significantly harder to place and route on the die, and much harder to verify in gate-level simulations prior to taping out the masks.  The challenge in making an SHA256 ASIC pretty much amounts to placing and routing a fairly simple netlist against the foundry's provided logic cell library, and then just copy'n'pasting the core all over the available die area.  The challenge with scrypt is monumental in comparison.

So, what are we talking about here?  Let's say $100,000 buys me an extra man-year of engineering time. Let's say it takes 2 man-years, and $200,000 extra.

Preposterous? Do you disagree? Nonetheless.

What's that mean to the overall cost of bringing an ASIC miner all the way to retail from scratch? 5% increase? 3% increase? I'm not going to say it's trivial.  It's no showstopper.


Implementing scrypt in hardware is not what I'd call "embarrassingly parallel" in comparison.  A Radeon 69xx (or maybe a 79xx depending how you value power efficiency) die is fairly close to being a pretty good hardware implementation for scrypt.  Yeah, you can probably do slightly better in a few areas (but worse in others, particularly if you're stuck interfacing to off-chip GDDR5, as AMD kinda has the edge over anything an amateur-developed ASIC is going to have for a memory controller core(s)).

Oh my god, this is entirely new to me.  And absolutely hilarious.  If I recall correctly Litecoin was actually developed for "GPU Resistance".  The irony here is "thick as butta".

On topic, you keep talking about amateurs.  That was what happened with the original, simple implementation of a blockchain, because "amateurs" were the only ones who cared and it was, as you state, embarrassingly simple.

I know I posted some of this after you but, with the example of Bitcoin right in front of them, if a scrypt coin breaks $xx USD (mystery threshold at this time) and holds it for a few months, or shows signs of climbing, the situation is going to be quite different. It won't be about how many years it takes a man, it will be about how many man-years it takes.


...  Or else you do on-die SRAM and pick a good spot along the more obvious TMTO curve (lookup gap) and live with burning tons of die area on SRAM.  But it's not going to result in something with an epic performance gap compared to GPU's, as happened with SHA256 for BTC.

In my opinion, OP's points would be valid if developing an scrypt ASIC were of equal difficulty and complexity to slapping an array of open-source SHA256 cores through an open-source ASIC router and layout tool and sending off the placed and routed design to the foundry (oversimplification, but not by much).  So I would say yes, the algorithm does actually matter.  You can spin an SHA256 ASIC design for significantly less than $1M if you do the design work yourself.  You can even screw up the design royally several times and re-run new masks through MOSIS (an ASIC prototype aggregation service) multiple times and still be under that amount.

The performance gap doesn't have to be epic, it just has to be significant, and "Burning tons of die area on SRAM" may be exactly what would happen.  I'll get back to this in a second. Also a reference back to my comment about your focus on amateurs.

And to bring it further into the clear, I suspect the majority of the difference in cost between producing a scrypt ASIC Miner and a SHA ASIC Miner is going to be in the supporting miner hardware, which is going to require a lot more activity to occur off-chip than bitcoin does.

I'm going to make a statement here that is absolutely outside of my realm of knowledge. It's actually possible that the design, production and fab of a scrypt asic would end up being cheaper than the same process for a SHA ASIC due to this very fact.

Partially correct on the first point above, and very wrong on the last point.  I'm not sure how you got from "scrypt will require more complicated off-chip support components" to "an scrypt ASIC would end up being cheaper" than an SHA256 ASIC.  The die area needed to implement an scrypt core (that actually performs with any sort of noteworthy hash rate) is massively larger than for a simple pipelined SHA256(SHA256()) core, regardless of whether there is off-die memory.  And interfacing to external high-speed I/O is one of the hardest things you can deal with in an ASIC design, especially if we're talking about interfacing to something like a very wide bank of GDDR5 at anything close to the clock rates that the Radeon GPU's operate at.  It is, perhaps, very foolish to suggest that addressing an extremely difficult external I/O problem will drive down the cost of developing and fabricating an ASIC, compared with a simple SHA256 core that barely needs to talk to anything (and when it does, can do so over even a dirt simple open-collector bus that just communicates a winning nonce when one is found).

Maybe people just aren't understanding how dirt simple a hardware implementation of SHA256 really is.  Not exactly ground-breaking technology that demands a cutting-edge process node here.
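For contrast with scrypt's memory-bound inner loop, bitcoin's entire proof-of-work fits in a couple of lines (a sketch on an all-zero header, just to show the shape; real headers carry version, prev-hash, merkle root, time, bits, and nonce):

```python
import hashlib

# Bitcoin's proof-of-work is just SHA256(SHA256(80-byte header)): fixed-width
# bit arithmetic with no memory lookups, which is why every round can be
# unrolled into one pipeline stage in silicon, one hash per clock per core.
def block_hash(header: bytes) -> bytes:
    assert len(header) == 80
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

print(block_hash(bytes(80)).hex())
```

Nothing in that function needs to talk to off-chip memory, which is the "dirt simple" property being described.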

Back to burning up that die space. What if (yes, I'm speculating again, and asking for your input) some basic research or, more likely, early prototyping/simulation were to show that you could do it better by burning the whole damn thing onto a single die? Stick with me here. The cost of the die will increase immensely.  But your ASIC miner could then consist of a power supply, an I/O interface comparable to a SHA256 ASIC's, if not simpler, and a water (more likely oil) cooler.

An interesting thought. If you care to analyze it I'm interested in what you have to say.  If you eliminate so many other factors in the design/development/manufacturing process, it might be worth the increased cost per die. Pile on reliability, life span in the field, and a few other factors. Would you be willing to dismiss this out of hand?

If so, I'd like to know what your experience and/or credentials are before I decide how seriously to take your dismissal.

It takes well under the $8M figure mentioned by the OP to call up AMD and license the Radeon 6950 or 7950 reference design, and produce boards with multiple GPU's on them.

This might be the showstopper. In light of my original point, I want to mention that just because something "isn't the cheapest way" doesn't make it "not economically viable". And in keeping with what I posted after you - the decision to actually create a scrypt ASIC may be an entirely emotional one, with factors that were nonexistent "when bitcoin did it".

Back to our imaginary billionaire Smiley  It depends on what kind of guy he is.  If he wakes up and says "I want to pwn crypto" and goes about it as efficiently as possible, he's likely to speak to AMD. If he wakes up and says "I want to be the guy that brought out the scrypt ASIC", all other factors go out the window.

Profit is profit, and if the numbers are big enough, a 1% improvement in overall efficiency (calculated only by coins/joule) may make it "economically viable".


I'm going to make a statement here that is absolutely outside of my realm of knowledge.

There are probably people on this forum who know the real answer to that, I don't.

You're correct on both of these items though.

 Roll Eyes Good to know I was right about something.  You covered some of the same ground in both posts, if you think me shifting quoted blocks to address similar points changed your meaning, let me know.

I'm also glad no one has showed up to tell me about "an ASIC-proof algorithm".
sr. member
Activity: 322
Merit: 250
June 25, 2013, 10:10:47 PM
#27
Based on the rapidly ballooning hashrate for litecoin, and the facts that

1. 95% of GPU miners switched to litecoin weeks if not months ago (as it has _LONG_ been more profitable than BTC for GPU miners)
2. Only a complete idiot would continue buying GPUs to mine anything

it's possible that someone has developed an FPGA or ASIC for mining scrypt coins, but is adding hashrate in such a way as to disguise this fact.  It no longer makes sense, in $/kWh terms, to buy GPU miners for scrypt coins.
sr. member
Activity: 347
Merit: 250
June 25, 2013, 10:02:29 PM
#26
You've substituted the word "would" where I used the word "could" - in context (and with my admission of ignorance) it's an important change in meaning.  If I hadn't gone ahead and speculated I don't think I would have gotten such a thorough answer.

Mis-quote acknowledged, and I went back and fixed my post above to say "could" instead of "would" as you noted.
full member
Activity: 196
Merit: 100
June 25, 2013, 09:31:11 PM
#25
And to bring it further into the clear, I suspect the majority of the difference in cost between producing a scrypt ASIC Miner and a SHA ASIC Miner is going to be in the supporting miner hardware, which is going to require a lot more activity to occur off-chip than bitcoin does.

I'm going to make a statement here that is absolutely outside of my realm of knowledge. It's actually possible that the design, production and fab of a scrypt asic would end up being cheaper than the same process for a SHA ASIC due to this very fact.

Partially correct on the first point above, and very wrong on the last point.  I'm not sure how you got from "scrypt will require more complicated off-chip support components" to "an scrypt ASIC would end up being cheaper" than an SHA256 ASIC.  The die area needed to implement an scrypt core is massively larger than for a simple pipelined SHA256(SHA256()) core, regardless of whether there is off-die memory.  And interfacing to external high-speed I/O is one of the hardest things you can deal with in an ASIC design, especially if we're talking about interfacing to something like a very wide bank of GDDR5 at anything close to the clock rates that the Radeon GPU's operate at.  It is, perhaps, very foolish to suggest that addressing an extremely difficult external I/O problem will drive down the cost of developing and fabricating an ASIC, compared with a simple SHA256 core that barely needs to talk to anything (and when it does, can do so over even a dirt simple open-collector bus that just communicates a winning nonce when one is found).

When it comes to development cost, there's also a massive spread.  You can go and pick yourself up an SHA256 core design, for free, that performs fairly well and is fully pipelined, from multiple sources.  For scrypt, you have to go it alone and develop it from scratch, and you end up with an almost infinitely more complex netlist than an SHA256 core (in fact, an scrypt core will tend to contain two SHA256 cores) that is significantly harder to place and route on the die, and much harder to verify in gate-level simulations prior to taping out the masks.  The challenge in making an SHA256 ASIC pretty much amounts to placing and routing a fairly simple netlist against the foundry's provided logic cell library, and then just copy'n'pasting the core all over the available die area.  The challenge with scrypt is monumental in comparison.

Maybe people just aren't understanding how dirt simple a hardware implementation of SHA256 really is.  Not exactly ground-breaking technology that demands a cutting-edge process node here.


I'm going to make a statement here that is absolutely outside of my realm of knowledge.

There are probably people on this forum who know the real answer to that, I don't.

You're correct on both of these items though.

I've been at dinner and I'm taking in your posts.  Thank you for stopping in, as I do appreciate actually being educated a little.

I want to point this out, while I'm still taking in some of the more technical parts of your posts:


Partially correct on the first point above, and very wrong on the last point.  I'm not sure how you got from "scrypt will require more complicated off-chip support components" to "an scrypt ASIC would end up being cheaper" than an SHA256 ASIC.  

You've substituted the word "would" where I used the word "could" - in context (and with my admission of ignorance) it's an important change in meaning.  If I hadn't gone ahead and speculated I don't think I would have gotten such a thorough answer.

For now I'll take your knowledgeability at face value and I'm off to re-read your posts before I say anything else.

I'm going to go read them a couple more times and see if I have anything to say Smiley
full member
Activity: 196
Merit: 100
June 25, 2013, 09:13:21 PM
#24

So your ASIC Resistance notion boils down to "economically unviable", and I agree that is a good notion.

I just don't get your distinction between it not being an algorithmic property yet a context property of the system built on that algorithm when in fact these limitations in the ultimately static context arise out of the algorithmic properties in the first place. Unless of course you can provide numbers.


The algorithmic properties will have a very minor impact on the timing of achieving "economic viability".

It technically has nothing at all to do with "the ability to produce an ASIC" for the POW algorithm.

And to bring it further into the clear, I suspect the majority of the difference in cost between producing a scrypt ASIC Miner and a SHA ASIC Miner is going to be in the supporting miner hardware, which is going to require a lot more activity to occur off-chip than bitcoin does.

I'm going to make a statement here that is absolutely outside of my realm of knowledge. It's actually possible that the design, production and fab of a scrypt asic would end up being cheaper than the same process for a SHA ASIC due to this very fact.  There are probably people on this forum who know the real answer to that, I don't.

Note very carefully the distinction between the ASIC itself, and the ASIC miner.

What sort of numbers do you mean?  We're in practically uncharted waters here (one data point does not make much of a chart).  I don't have any.



That's actually my criticism: you don't seem to have any data. Last time I checked, people with a sophisticated understanding of ASIC design and scrypt internals determined it is not economically viable, and likely won't be for decades to come. To put it another way: it seems the algorithmic properties do have a very major impact on the timing of achieving "economic viability".

All right, stating in all caps that it has ABSOLUTELY NOTHING TO DO with it may be a bit of hyperbole.

However, the folks you're talking about (just assuming that they are as credentialed as you indicate) don't have any data either.  Or at least, they have only the one data point that I have, which is the point at which someone felt that BitCoin ASICs were economically viable.

Bitcoin had nothing to blaze a trail, there was no reason for anyone to buy in or believe it would get to where it is.  I don't know what the market cap was when the various people and teams now producing bitcoin ASIC miners decided to move ahead on their projects. I know that they've only recently begun to ship.

Litecoin (no haterade here, sorry, just the best example of a scrypt coin atm) has had bitcoin to blaze a trail.  People who are sorry they missed out on the takeoff of BTC are going to be more eager to speculate, and thus more welcoming of the risk/reward ratio.

Put that burning desire together with Moore's law and current research into various computing technologies.  I say we have an unpredictable situation here.

A billionaire might wake up tomorrow and say "fuck it, let's have a scrypt ASIC". The odds of that having happened when bitcoins were $2 apiece were pretty small. The odds of it happening to scrypt because someone is sorry they missed out on the "bitcoin rush" are much, much better.

Economic viability is a social factor, even though the technicalities certainly do have an impact on those decisions. Or to put it another way, "ASIC-resistance" is a property of the coin and its economy.  While the POW algorithm is admittedly a factor, it's unlikely to be the actual deciding factor.

The actual deciding factor is going to be "fear of not being first".

That's all IMO, and so is anyone else's estimate of the situation. No disrespect intended. There are too many emotional factors this time around.

sr. member
Activity: 347
Merit: 250
June 25, 2013, 09:00:22 PM
#23
And to bring it further into the clear, I suspect the majority of the difference in cost between producing a scrypt ASIC Miner and a SHA ASIC Miner is going to be in the supporting miner hardware, which is going to require a lot more activity to occur off-chip than bitcoin does.

I'm going to make a statement here that is absolutely outside of my realm of knowledge. It's actually possible that the design, production and fab of a scrypt asic would end up being cheaper than the same process for a SHA ASIC due to this very fact.

Partially correct on the first point above, and very wrong on the last point.  I'm not sure how you got from "scrypt will require more complicated off-chip support components" to "an scrypt ASIC could end up being cheaper" than an SHA256 ASIC.  The die area needed to implement an scrypt core (that actually performs with any sort of noteworthy hash rate) is massively larger than for a simple pipelined SHA256(SHA256()) core, regardless of whether there is off-die memory.  And interfacing to external high-speed I/O is one of the hardest things you can deal with in an ASIC design, especially if we're talking about interfacing to something like a very wide bank of GDDR5 at anything close to the clock rates that the Radeon GPU's operate at.  It is, perhaps, very foolish to suggest that addressing an extremely difficult external I/O problem will drive down the cost of developing and fabricating an ASIC, compared with a simple SHA256 core that barely needs to talk to anything (and when it does, can do so over even a dirt simple open-collector bus that just communicates a winning nonce when one is found).

When it comes to development cost, there's also a massive spread.  You can go and pick yourself up an SHA256 core design, for free, that performs fairly well and is fully pipelined, from multiple sources.  For scrypt, you have to go it alone and develop it from scratch, and you end up with an almost infinitely more complex netlist than an SHA256 core (in fact, an scrypt core will tend to contain two SHA256 cores) that is significantly harder to place and route on the die, and much harder to verify in gate-level simulations prior to taping out the masks.  The challenge in making an SHA256 ASIC pretty much amounts to placing and routing a fairly simple netlist against the foundry's provided logic cell library, and then just copy'n'pasting the core all over the available die area.  The challenge with scrypt is monumental in comparison.

Maybe people just aren't understanding how dirt simple a hardware implementation of SHA256 really is.  Not exactly ground-breaking technology that demands a cutting-edge process node here.


I'm going to make a statement here that is absolutely outside of my realm of knowledge.

There are probably people on this forum who know the real answer to that, I don't.

You're correct on both of these items though.

EDIT - I misquoted the OP above, so went back and replaced "would" with "could" above.
sr. member
Activity: 347
Merit: 250
June 25, 2013, 08:39:09 PM
#22
So your ASIC Resistance notion boils down to "economically unviable", and I agree that is a good notion. I just don't get your distinction between it not being an algorithmic property yet a context property of the system built on that algorithm when in fact these limitations in the ultimately static context arise out of the algorithmic properties in the first place. Unless of course you can provide numbers backed by data.

+1

It appears the OP assumes implementing scrypt in hardware is identical in difficulty and development effort to implementing SHA256 in hardware, and that the benefits in hash rate would be similar to SHA256 on GPU's vs. an ASIC.  In reality, SHA256 is almost a prime example of an embarrassingly parallel problem (Google "embarrassingly parallel" if you're not familiar with what that term means in the context of parallel processing), and you can pipeline the whole SHA256 calculation and crank out a hash per clock cycle per core.  It's almost trivially simple to develop an SHA256 ASIC; there are even multiple Verilog implementations for FPGA's that you can start with to generate the netlist for the SHA256 cores (at which point the development process diverges from that of FPGA's, of course).  That's why we've seen several SHA256 ASICs arrive on the market that were designed by novices or people with negligible or no prior VLSI experience.

Implementing scrypt in hardware is not what I'd call "embarrassingly parallel" in comparison.  A Radeon 69xx (or maybe a 79xx depending how you value power efficiency) die is fairly close to being a pretty good hardware implementation for scrypt.  Yeah, you can probably do slightly better in a few areas (but worse in others, particularly if you're stuck interfacing to off-chip GDDR5, as AMD kinda has the edge over anything an amateur-developed ASIC is going to have for a memory controller core(s)).  Or else you do on-die SRAM and pick a good spot along the more obvious TMTO curve (lookup gap) and live with burning tons of die area on SRAM.  But it's not going to result in something with an epic performance gap compared to GPU's, as happened with SHA256 for BTC.
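The TMTO curve (lookup gap) mentioned above can be sketched numerically. This assumes the textbook tradeoff: keep every g-th entry of scrypt's N-entry scratchpad and recompute the missing ones, which costs about (g-1)/2 extra BlockMix calls per lookup on average:

```python
# Rough sketch of scrypt's time-memory tradeoff, not a hardware model.
def tmto_point(n=1024, r=1, g=1):
    mem_bytes = 128 * r * n // g    # on-die SRAM needed for the scratchpad
    rel_work = 1 + (g - 1) / 2      # relative compute per scratchpad access
    return mem_bytes, rel_work

for g in (1, 2, 4, 8):
    print(g, tmto_point(g=g))
```

Each halving of memory buys roughly 50% more recomputation, which is why the curve never yields the epic gap that fixed SHA256 pipelines did.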

In my opinion, OP's points would be valid if developing an scrypt ASIC were of equal difficulty and complexity to slapping an array of open-source SHA256 cores through an open-source ASIC router and layout tool and sending off the placed and routed design to the foundry (oversimplification, but not by much).  So I would say yes, the algorithm does actually matter.  You can spin an SHA256 ASIC design for significantly less than $1M if you do the design work yourself.  You can even screw up the design royally several times and re-run new masks through MOSIS (an ASIC prototype aggregation service) multiple times and still be under that amount.

It takes well under the $8M figure mentioned by the OP to call up AMD and license the Radeon 6950 or 7950 reference design, and produce boards with multiple GPU's on them.
sr. member
Activity: 350
Merit: 250
- "Bitcore (BTX) - Airdrops every Monday"
June 25, 2013, 08:31:49 PM
#21

So your ASIC Resistance notion boils down to "economically unviable", and I agree that is a good notion.

I just don't get your distinction between it not being an algorithmic property yet a context property of the system built on that algorithm when in fact these limitations in the ultimately static context arise out of the algorithmic properties in the first place. Unless of course you can provide numbers.


The algorithmic properties will have a very minor impact on the timing of achieving "economic viability".

It technically has nothing at all to do with "the ability to produce an ASIC" for the POW algorithm.

And to bring it further into the clear, I suspect the majority of the difference in cost between producing a scrypt ASIC Miner and a SHA ASIC Miner is going to be in the supporting miner hardware, which is going to require a lot more activity to occur off-chip than bitcoin does.

I'm going to make a statement here that is absolutely outside my realm of knowledge: it's actually possible that the design, production, and fab of a scrypt ASIC would end up cheaper than the same process for a SHA ASIC for this very reason.  There are probably people on this forum who know the real answer to that; I don't.

Note very carefully the distinction between the ASIC itself, and the ASIC miner.

What sort of numbers do you mean?  We're in practically uncharted waters here (one data point does not make much of a chart).  I don't have any.



That's actually my criticism: you don't seem to have any data. Last time I checked, people with a sophisticated understanding of ASIC design and scrypt internals determined it is not economically viable, and likely won't be for decades to come. To put it another way: the algorithmic properties do have a major impact on the timing of "economic viability" (and scrypt was designed to that end specifically, let's not forget), because the context constraints are ultimately static. So I fail to see why you would believe otherwise when there is nothing to back it up, not even a theoretical sketch of a proof-of-concept design.
full member
Activity: 196
Merit: 100
June 25, 2013, 08:26:58 PM
#20

So your ASIC Resistance notion boils down to "economically unviable", and I agree that is a good notion.

I just don't get your distinction: you say it's not an algorithmic property but a property of the context the system is built in, when in fact those contextual limitations arise out of the algorithmic properties in the first place. Unless, of course, you can provide numbers.


The algorithmic properties will have a very minor impact on the timing of achieving "economic viability".

Technically it has nothing at all to do with "the ability to produce an ASIC" for the PoW algorithm.

And to bring it further into the clear, I suspect the majority of the difference in cost between producing a scrypt ASIC Miner and a SHA ASIC Miner is going to be in the supporting miner hardware, which is going to require a lot more activity to occur off-chip than bitcoin does.

I'm going to make a statement here that is absolutely outside my realm of knowledge: it's actually possible that the design, production, and fab of a scrypt ASIC would end up cheaper than the same process for a SHA ASIC for this very reason.  There are probably people on this forum who know the real answer to that; I don't.

Note very carefully the distinction between the ASIC itself, and the ASIC miner.

What sort of numbers do you mean?  We're in practically uncharted waters here (one data point does not make much of a chart).  I don't have any.

full member
Activity: 126
Merit: 100
June 25, 2013, 08:19:00 PM
#19
the hatorade for litecoins is coming.
I guess we can thank gox for pulling litecoin into the spotlight.


newbie
Activity: 56
Merit: 0
June 25, 2013, 08:11:22 PM
#18
Strongly disagree. It is, at least in principle, possible to make an algorithm that an Intel Core i7 is the ideal chip for (that is, an Intel Core i7 IS the ASIC for this algorithm). You'd have to write the algorithm specifically for that task, but it is possible.

Wait, are we in the same thread?

Anyways, no you would have a chip-specific algorithm, this wouldn't magically make the i7 (a general computing device) suddenly become an ASIC.

This would possibly be a way to build social/economic ASIC-resistance into a coin, because it would be illegal to make it, and nearly impossible to compete with Intel on price.

Either way, I'm not exactly sure what you were reading when you wrote that. If you're not a native English speaker, I apologize for any confusing grammar.

If every single feature of the i7 were used (every FPU, every ALU, etc., all in the most efficient way possible for that chip), then the only way you could beat it with a "mining ASIC" would be to just make more i7s. You could make one twice as powerful, but it'd require twice the silicon, twice the cost, twice the power, etc.

General-use CPUs are so easy to beat for stuff like bitcoin because the algorithm isn't optimized at the hardware level for those general chips. If it is, then you can't really do much better.

Such an algorithm would be spectacularly hard to write, but I don't see why the possibility should be discredited.
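For flavor only, here's a hypothetical Python sketch of what one step of such a PoW might look like: it deliberately chains memory-dependent reads, 64-bit integer multiply-adds, and floating-point work so that the integer ALUs, the FPU, and the cache hierarchy are all on the critical path at once. The function name, constants, and scratchpad size are all invented for illustration; a real attempt would have to be hand-tuned to one specific chip, not written in Python.

```python
import hashlib
import struct

def cpu_bound_mix(seed: bytes, rounds: int = 1000) -> bytes:
    """Hypothetical illustration of the 'the CPU is the ASIC' idea: every
    round couples a cache read, an integer multiply-add, and an FPU op,
    so no single execution unit can be stripped out of a custom chip."""
    pad = bytearray(hashlib.sha256(seed).digest() * 64)   # 2 KiB scratchpad
    a = int.from_bytes(pad[:8], "little")
    f = 1.0
    for _ in range(rounds):
        j = a % (len(pad) - 8)
        b = int.from_bytes(pad[j:j + 8], "little")        # memory-dependent read
        a = (a * 6364136223846793005 + b) & (2**64 - 1)   # integer multiply-add
        f = (f * 1.0000001 + (a & 0xFFFF)) % 65537.0      # floating-point work
        pad[j:j + 8] = struct.pack("<Q", a ^ int(f))      # write back to cache
    return hashlib.sha256(bytes(pad) + struct.pack("<Q", a)).digest()
```

Whether any such construction could really keep a given chip's exact resource mix optimal is precisely what's disputed above; the sketch only shows the kind of coupling the argument requires.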

I apologize if you're not a native English speaker and the technical aspects are hard to understand.
sr. member
Activity: 350
Merit: 250
- "Bitcore (BTX) - Airdrops every Monday"
June 25, 2013, 08:08:57 PM
#17