
Topic: Minimalist Spartan6-LX150 board - page 4. (Read 49998 times)

hero member
Activity: 518
Merit: 500
October 11, 2011, 11:14:42 AM
#75
You are missing my point - I have no idea why the developers have not developed good CUDA code - I only speculated about one of the possible reasons.

I have no reason to believe the CUDA miners aren't any good. It's the Nvidia cards that aren't as suited to bitcoin mining as AMD cards, due to fundamentally different architectures. Since that makes the cards uncompetitive, it stands to reason few people will invest heavily in CUDA apps that can only work on this (for bitcoin) very uncompetitive hardware. No amount of software optimization is going to turn a 140 MH/s Nvidia card into a 400 MH/s one. There is probably less than 10% untapped potential.
full member
Activity: 135
Merit: 100
October 11, 2011, 10:58:27 AM
#74
As a lot of people will say, NVIDIA is capable of GPU mining, but there is not that much CUDA code written - because the hardware is not that good, or, who knows, because the good software developers could only afford ATI hardware to focus their development on.

Please tell me you are kidding.
1) You are aware that many of the software developers do this full time as their day job.  I am sure someone with a $50K to $120K salary can afford an Nvidia card.
2) That has absolutely nothing to do with why Nvidia performance is so poor.

You are missing my point - I have no idea why the developers have not developed good CUDA code - I only speculated about one of the possible reasons.

donator
Activity: 1218
Merit: 1079
Gerald Davis
October 11, 2011, 10:40:57 AM
#73
As a lot of people will say, NVIDIA is capable of GPU mining, but there is not that much CUDA code written - because the hardware is not that good, or, who knows, because the good software developers could only afford ATI hardware to focus their development on.

Please tell me you are kidding.
1) You are aware that many of the software developers do this full time as their day job.  I am sure someone with a $50K to $120K salary can afford an Nvidia card.
2) That has absolutely nothing to do with why Nvidia performance is so poor.
full member
Activity: 135
Merit: 100
October 11, 2011, 09:40:25 AM
#72
I need a kit I can plug in and mine. The total package. When you can offer that (with enough hash rate), I'm sure that people will buy.
Indeed.
I am with these guys..  I am no electrical engineer..  but I can plug in a PSU :)

You have a good point.  But to flip that around, on the GPU side it's taken as a given that you're going to buy your hardware from a company (ATI) that does not provide the mining software (or even admit it knows what bitcoin is).  But I understand that while gamers have seen GPUs before, most bitcoiners are encountering FPGAs for the first time.  They aren't scary; they're just obscenely flexible... "enough rope to hang yourself with."

I could, perhaps, put together a turn-key solution, although it would involve a lot of effort.  My two major concerns are:

1. HDL developer guilt.  It makes me slightly ill to see posts like ngzhang's "hey you lazy-ass HDL developers make your code faster so I can make MOAR PROFITZ!!!".  I'd feel queasy about selling a "solution" that bundled in somebody else's hard work.  I don't know the exact details of the fpgaminer/ztex dispute, but I can certainly empathize with the initial reaction from fpgaminer.  It would make me really happy to be providing low-cost boards to people who are interested in tweaking/tuning/improving the HDL code, but I think I've figured out now that there aren't as many of those people as I'd thought.

2. Support.  I'm happy to help out here in a casual message-board-member way.  But I'm kinda worried about lazy users buying a "turn-key" solution from me and then demanding that I hand-hold them through the whole process of configuring Xilinx's crapware drivers on their Windows host box (I haven't used Windows in almost a decade) under threat of posting negative reviews of my product ("did not work for me").  I definitely can't sell the boards for $250 if I have to budget in my own time spent on extensive tech support work.

Anyways.  Looks like the first run will be small personal-use-only, but there may be another batch of boards in November after I've figured out if it's worth taking this to the next level.

Hi big-chip-small-board,

I have been a lurker here for some time and, more importantly, a huge fan of your work.

May I offer my opinion on this tricky matter?

I think you should turn your HDL developer's departure into an advantage and streamline/optimise your work - in other words, re-position it elegantly so it is better accepted by the broader bitcoin community.

The problems I see arise from the fact that FPGA developers try to tackle both parts of this project - designing the hardware as well as developing the core functionality.

You know that this particular project requires two fundamentally different skill sets; someone who is only averagely good in both areas will not do.

Hence, what about revisiting your marvellous hardware concept, perhaps giving it a final touch and ensuring that it is 100% compatible with the open-source core functionality or any other firmware out there, so the people who buy your hardware can decide for themselves what to run on it.

Instead of trying to excel in two areas, you then need to excel in one - design and produce great hardware - and let others develop the software and add value as they can.

This approach is very similar to when GPU mining started - everyone would agree that the choices of hardware and software are interlinked, but people got them from different sources.

As a lot of people will say, NVIDIA is capable of GPU mining, but there is not that much CUDA code written - because the hardware is not that good, or, who knows, because the good software developers could only afford ATI hardware to focus their development on.

But the result is clear - at the end of the day, ATI does not write hashing code and the bitcoin script writers do not develop highly integrated hardware - and despite this division of labour, the bitcoin community has no problem locating the required bits and pieces and building its rigs.

My 2 cents.
donator
Activity: 980
Merit: 1004
felonious vagrancy, personified
September 30, 2011, 06:47:08 PM
#71
"My FPGAs won't lose 30% overnight due to some Goldman Sachs bullshit."

I am tempted to make this my new .signature
hero member
Activity: 592
Merit: 501
We will stand and fight.
September 28, 2011, 09:57:43 AM
#70

In my opinion, no company will take part in any mining ASICs.

Not specific to bitcoin, but SHA256 has other uses. VIA has CPUs with hardware-accelerated encryption functions, and I thought recent (or upcoming?) Intel chips do too. They are no match for GPUs, but it shows it's already been done. Also, when I google "SHA256 chip" I find, among others, this:
http://www.s2cinc.com/product/pd.asp?id=278

I have no clue how that performs compared to our GPUs, or even if it's usable for bitcoin mining, but I would be surprised if there weren't chips out there, or coming, that could be used for bitcoin, even if they are not designed for bitcoin.

There are some tiny differences between standard SHA256 hashing and bitcoin hashing. So ...
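For anyone curious what those differences amount to in practice, here is a rough sketch (my own illustration in Python, with a made-up mine() helper, not anyone's actual miner): bitcoin hashes the 80-byte block header twice (SHA256d rather than a single SHA256), compares the digest byte-reversed against the target, and only the trailing 4-byte nonce changes between attempts - which is also why real miners precompute the midstate of the first 64-byte chunk.

Code:
import hashlib

def mine(header76, target, max_nonce=2**32):
    # header76 = version, previous block hash, merkle root, time, bits (76 bytes);
    # only the trailing 4-byte nonce changes from one attempt to the next.
    for nonce in range(max_nonce):
        header = header76 + nonce.to_bytes(4, "little")
        digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()  # double SHA256
        if int.from_bytes(digest, "little") < target:  # hash is compared byte-reversed
            return nonce
    return None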
hero member
Activity: 518
Merit: 500
September 28, 2011, 09:33:46 AM
#69

In my opinion, no company will take part in any mining ASICs.

Not specific to bitcoin, but SHA256 has other uses. VIA has CPUs with hardware-accelerated encryption functions, and I thought recent (or upcoming?) Intel chips do too. They are no match for GPUs, but it shows it's already been done. Also, when I google "SHA256 chip" I find, among others, this:
http://www.s2cinc.com/product/pd.asp?id=278

I have no clue how that performs compared to our GPUs, or even if it's usable for bitcoin mining, but I would be surprised if there weren't chips out there, or coming, that could be used for bitcoin, even if they are not designed for bitcoin.
hero member
Activity: 592
Merit: 501
We will stand and fight.
September 28, 2011, 08:43:47 AM
#68
ngzhang, I am pretty much clueless about hardware, so I am interested in your views.  Don't you think that if a large-scale enterprise were to get into this, they would be more interested in making a custom ASIC than an FPGA?  How substantial do you imagine the power/speed gains could be for an ASIC over a GPU?

In my opinion, no company will take part in any mining ASICs. If they have enough resources to tape out an ASIC, I'm sure they will design another project, not one for mining. As individuals we can act on our interests, but a real company can't.

And to answer your question: 1/10 the cost, 10X the performance, 1/10 the energy consumption, on a single ASIC. AT LEAST.
sr. member
Activity: 462
Merit: 250
September 28, 2011, 08:19:39 AM
#67
ngzhang, I am pretty much clueless about hardware, so I am interested in your views.  Don't you think that if a large-scale enterprise were to get into this, they would be more interested in making a custom ASIC than an FPGA?  How substantial do you imagine the power/speed gains could be for an ASIC over a GPU?
hero member
Activity: 592
Merit: 501
We will stand and fight.
September 27, 2011, 10:46:53 PM
#66
I just find .. claims of the death of GPU mining to be naive & frustrating.

I've invested a lot of time into FPGA mining. Here is my thinking:

If GPUs remain the dominant technology, difficulty will adjust to make them barely profitable in average-electricity-cost areas of the world.
I don't think anyone really disagrees with that, it's an intentional design decision in Bitcoin.

Once that happens: GPUs in high-elec-cost areas (like me) will be unprofitable. FPGAs will be profitable everywhere operationally
[in terms of BTC produced minus electricity/cooling/maintenance costs]. So they will eventually pay for themselves,
unless Bitcoin collapses entirely before then, screwing over all miners. It might take 2 years, but that is still a pretty decent ROI compared
to FDIC savings accounts, the stock market, treasuries, etc.

This is true even if, say, 28nm or 20nm GPUs are driving the difficulty. My 45nm FPGAs will still have better MH per watt, so they will
still be profitable operationally.

If FPGAs become the dominant technology, difficulty will adjust to make them barely profitable in average-power-cost areas of the world.
GPUs will then be wildly unprofitable everywhere, except for people that somehow have free electricity
[which I think is a tiny fraction of the network].

I actually hope that GPUs remain the dominant technology, while I mine on FPGAs, with a nice, high profit margin.

that still only gets FPGAs to ~$2/MH installed.

$1/MH is possible today if you build the boards yourself, or professionally in qty 100+

I suspect most miners see their hardware investment as sunk cost, leaving the electricity bill. FPGAs already have better MH/Watt and I suspect that gap will grow as the software matures.

Exactly. The decision to keep a GPU running, or shut it off, is not based on some breakeven calculation you did when you bought it.
It's based on whether it's making or losing money, today, based on current difficulty + elec/cooling costs.
I stand by my statement that if FPGAs take off, they will certainly put most GPU miners out of business,
and capture a large percentage of the coins left to be mined.

-rph


In a short time, $1.5/MH will come true from some "low-manufacturing-cost area" of the world, by my troth.
Because FPGA mining really hasn't become a big business yet, I think professional groups still haven't joined this game. In fact, miners are still a very small group. At this time, most of the people working hard on FPGA mining systems probably really do it for their "love". Their skills and effort could surely earn more money in another field.
rph
full member
Activity: 176
Merit: 100
September 27, 2011, 10:08:36 PM
#65
I just find .. claims of the death of GPU mining to be naive & frustrating.

I've invested a lot of time into FPGA mining. Here is my thinking:

If GPUs remain the dominant technology, difficulty will adjust to make them barely profitable in average-electricity-cost areas.
I don't think anyone really disagrees with that, it's an intentional design decision in Bitcoin.
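(Roughly how that adjustment works, for anyone who hasn't looked: a simplified sketch, ignoring the exact nBits encoding - every 2016 blocks the difficulty is rescaled so blocks average 10 minutes again, which means extra hashrate on the network just pushes difficulty up rather than producing more coins overall.)

Code:
def retarget(old_difficulty, seconds_for_last_2016_blocks):
    expected = 2016 * 600                      # two weeks at 10 minutes per block
    ratio = expected / seconds_for_last_2016_blocks
    ratio = max(0.25, min(4.0, ratio))         # adjustment clamped to a factor of 4 per period
    return old_difficulty * ratio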

Once that happens: GPUs in high-elec-cost areas (like me) will be unprofitable. FPGAs will be profitable everywhere operationally
[in terms of BTC produced minus electricity/cooling/maintenance costs]. So they will eventually pay for themselves,
unless Bitcoin collapses entirely first, screwing over all miners. The payoff might take 2 years, but that is still a pretty decent ROI compared
to FDIC savings accounts, the stock market, treasuries, etc. My FPGAs won't lose 30% overnight due to some Goldman Sachs bullshit.

If/when 28nm or 20nm or 16nm GPUs are driving the difficulty, my 45nm FPGAs will still have better MH per watt, so they will
still be profitable operationally. And anyway I will then be adding 28nm or 20nm or 16nm FPGAs.

If FPGAs become the dominant technology, difficulty will adjust to make them barely profitable in average-power-cost areas of the world.
GPUs will then be wildly unprofitable everywhere, except for people that somehow have free electricity
[which I think is a tiny fraction of the network]. Then we'll see $50 5830s on eBay as lots of people rush to the exits.

I actually hope that GPUs remain the dominant technology, while I mine on FPGAs, with a nice, high profit margin.

If a very high-end ASIC becomes the dominant technology then both GPUs + FPGAs will be unprofitable operationally.
I seriously doubt this will happen. The people with the skills and capital to make it happen could make a lot more money
with less risk building something else. [I'm not talking about a Mosis 250nm ASIC; I'm talking 90nm or better]

that still only gets FPGAs to ~$2/MH installed.

$1/MH is possible today if you build the boards yourself, or professionally in qty 100+

I suspect most miners see their hardware investment as sunk cost, leaving the electricity bill. FPGAs already have better MH/Watt and I suspect that gap will grow as the software matures.

Exactly. The decision to keep a GPU running, or shut it off, is not based on some breakeven calculation you did when you bought it.
It's based on whether it's making or losing money, today, based on current difficulty + elec/cooling costs.
I stand by my statement that if FPGAs take off, they will certainly put most GPU miners out of business,
and capture a large percentage of the coins left to be mined.

-rph
newbie
Activity: 29
Merit: 0
September 27, 2011, 12:44:53 PM
#64
Following
donator
Activity: 1218
Merit: 1079
Gerald Davis
September 27, 2011, 10:17:44 AM
#63
It's true it's a risky investment, certainly at this point, but once this starts generating sufficient volume I can see prices tumbling. After all, an FPGA is likely cheaper to produce than our high-end gaming GPUs.

Most of the cost (60%+) comes from the actual FPGA.  It is unlikely prices will tumble.  FPGAs already have economies of scale.  20 million are sold each year.  Another 1K (or even 10K) miners using FPGAs isn't going to cause a massive drop in price. Maybe if one of the FPGA developers gets a massive buy order they could cut FPGA & assembly costs by 30%.  Software improvements might squeeze another 10%-20% out of current-gen FPGAs, but that still only gets us to ~$2/MH installed.

Yes, FPGAs benefit from Moore's law, but so will GPUs.  GPUs are almost perfectly scalable.  When the process size gets cut in half, you simply double the number of shaders, get roughly 2x the performance, and the die size (and thus cost & power) remains the same.

I have derailed this thread enough as it is.  To the OP: very good work, it looks promising.  I just find the false hope of people (not you) pretending away the economic issues of FPGAs (and claims of the death of GPU mining) to be naive & frustrating.
hero member
Activity: 518
Merit: 500
September 27, 2011, 09:38:46 AM
#62
I'm not an FPGA expert by any stretch, but they definitely follow Moore's law, in fact arguably more easily than CPUs, as they are much simpler and you can just up the number of units as your process gets smaller.  Similar to how, because CPU designs hit an IPC and clock-scaling brick wall, most of the extra transistor budget is simply spent on going from single to dual, quad and octal cores.

That said, I'm not sure I agree with the above math; the market dynamic of mining leads to difficulty gravitating towards the break-even point for the average miner. I suspect most miners see their hardware investment as sunk cost, leaving the electricity bill. FPGAs already have better MH/Watt and I suspect that gap will grow as the software matures. It's true it's a risky investment, certainly at this point, but once this starts generating sufficient volume I can see prices tumbling. After all, an FPGA is likely cheaper to produce than our high-end gaming GPUs.
full member
Activity: 188
Merit: 100
September 27, 2011, 08:45:56 AM
#61
Once there are enough FPGAs on the network, difficulty will increase and GPUs will become unprofitable or barely
profitable for anyone paying for cooling + electricity [probably most people with more than 4-5 GPUs]. It's a self-fulfilling prophecy.

I often see this quoted but it is nonsense.  Higher difficulty will make FPGAs, even @ $2 per MH, even MORE prohibitively expensive.  Higher difficulty benefits those w/ efficient GPUs (like the 5970 & 7xxx series) and moderate to low cost electricity the most.

I think you will see that a difficulty spike will kill demand for new FPGAs, not drive it.

Take a hypothetical FPGA miner at $2 per MH.  150 MH = $300 in cost.  Running 24/7/365 @ 15W.
Break even @ current difficulty is 25 months.
Break even @ 30% difficulty increase is 33 months.
Break even @ 50% difficulty increase is 40 months.

Today one could buy a 5970 for <$500.  Say 3x5970 + power supply + other components = 2.8GH for $2800.  Running 24/7/365 @ 1000W.

Break even @ current difficulty is 17 months.
Break even @ 30% difficulty increase is 25 months.
Break even @ 50% difficulty increase is 32 months.

Difficulty increases close the gap, but $2 per MH is still beaten by anyone w/ $0.10 electricity costs (or less).  I am interested in FPGAs but these dire predictions of them killing GPUs are simply unwarranted unless cost is closer to $1 per MH installed.

Remember GPU performance per watt won't be static.  The 7xxx series looks to almost double performance per watt (cutting electricity costs in half for GPU miners).  A break-even of 40+ months is highly dangerous; one risks being undercut by the next next-gen video cards.  4 years is long enough for 2 product cycles, and we will be looking @ 20nm chips (and another doubling of performance per watt).

Very good info. I guess I would ask the FPGA experts: how often do the FPGA chips increase in performance as well? Do they move as fast as GPUs? Do they follow Moore's law, essentially?
donator
Activity: 1218
Merit: 1079
Gerald Davis
September 27, 2011, 08:27:55 AM
#60
Once there are enough FPGAs on the network, difficulty will increase and GPUs will become unprofitable or barely
profitable for anyone paying for cooling + electricity [probably most people with more than 4-5 GPUs]. It's a self-fulfilling prophecy.

I often see this quoted but it is nonsense.  Higher difficulty will make FPGAs, even @ $2 per MH, even MORE prohibitively expensive.  Higher difficulty benefits those w/ efficient GPUs (like the 5970 & 7xxx series) and moderate to low cost electricity the most.

I think you will see that a difficulty spike will kill demand for new FPGAs, not drive it.

Take a hypothetical FPGA miner at $2 per MH.  150 MH = $300 in cost.  Running 24/7/365 @ 15W.
Break even @ current difficulty is 25 months.
Break even @ 30% difficulty increase is 33 months.
Break even @ 50% difficulty increase is 40 months.

Today one could buy a 5970 for <$500.  Say 3x5970 + power supply + other components = 2.8GH for $2800.  Running 24/7/365 @ 1000W.

Break even @ current difficulty is 17 months.
Break even @ 30% difficulty increase is 25 months.
Break even @ 50% difficulty increase is 32 months.

Difficulty increases close the gap, but even $2 per MH (an impressive improvement) is still undercut by anyone w/ $0.10 electricity costs (or less). I am interested in FPGAs but these dire predictions of them killing GPUs are simply unwarranted unless cost is closer to $1 per MH installed.

Remember GPU performance per watt won't be static.  The 7xxx series looks to almost double performance per watt (cutting electricity costs in half for GPU miners).  A break-even of 40+ months is a considerable risk, as 4 years is long enough for 2 product cycles in the GPU world.  The product after the 7xxx series likely won't improve performance per watt (think a repeat of 5xxx vs 6xxx), but the generation after that (let's call it the 9xxx series) will move to 20nm and bring all the power reduction and performance boosts that a die shrink does.

4 years is a long time.  My comparison above is based on the 5970s.  Soon FPGAs will compete against the 7xxx series (nearly double the performance per watt) and within 4 years against the 9xxx series (4x the performance per watt).
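If you want to check break-even figures like these against your own numbers, here is a rough sketch of the calculation (a simplified model with placeholder difficulty / BTC price / electricity values, not necessarily the exact inputs used above):

Code:
REWARD_BTC  = 50        # current block subsidy
SECONDS_DAY = 86400

def breakeven_months(hw_cost, mhash, watts, difficulty, btc_price, usd_per_kwh):
    hashes_per_day = mhash * 1e6 * SECONDS_DAY
    btc_per_day    = hashes_per_day * REWARD_BTC / (difficulty * 2**32)   # expected coins per day
    net_per_day    = btc_per_day * btc_price - (watts / 1000.0) * 24 * usd_per_kwh
    return float("inf") if net_per_day <= 0 else hw_cost / net_per_day / 30

# e.g. the hypothetical FPGA vs the 3x5970 rig above, with placeholder market numbers:
# breakeven_months(300, 150, 15, difficulty=1.7e6, btc_price=5.0, usd_per_kwh=0.10)
# breakeven_months(2800, 2800, 1000, difficulty=1.7e6, btc_price=5.0, usd_per_kwh=0.10)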
full member
Activity: 182
Merit: 100
September 27, 2011, 07:21:20 AM
#59
Interested
sr. member
Activity: 404
Merit: 250
September 27, 2011, 06:26:30 AM
#58
Following.
hero member
Activity: 518
Merit: 500
September 27, 2011, 02:59:10 AM
#57
I'd feel queasy about selling a "solution" that bundled in somebody else's hard work. 

Then make a deal with those someone elses and give them a share of the revenue (possibly in return for them providing support on the software).

Quote
2. Support.  I'm happy to help out here in a casual message-board-member way.  But I'm kinda worried about lazy users buying a "turn-key" solution from me and then demanding that I hand-hold them through the whole process of configuring Xilinx's crapware drivers on their Windows host box

No harm in stating you'd only support Linux. I think most serious miners use Linux anyway. At least the market for "Linux-able" miners is infinitely bigger than the market for people who are familiar with FPGAs.

Just my 2 cents.
hero member
Activity: 592
Merit: 501
We will stand and fight.
September 27, 2011, 02:34:25 AM
#56
I need a kit I can plug in and mine. The total package. When you can offer that (with enough hash rate), I'm sure that people will buy.
Indeed.
I am with these guys..  I am no electrical engineer..  but I can plug in a PSU :)

You have a good point.  But to flip that around, on the GPU side it's taken as a given that you're going to buy your hardware from a company (ATI) that does not provide the mining software (or even admit it knows what bitcoin is).  But I understand that while gamers have seen GPUs before, most bitcoiners are encountering FPGAs for the first time.  They aren't scary; they're just obscenely flexible... "enough rope to hang yourself with."

I could, perhaps, put together a turn-key solution, although it would involve a lot of effort.  My two major concerns are:

1. HDL developer guilt.  It makes me slightly ill to see posts like ngzhang's "hey you lazy-ass HDL developers make your code faster so I can make MOAR PROFITZ!!!".  I'd feel queasy about selling a "solution" that bundled in somebody else's hard work.  I don't know the exact details of the fpgaminer/ztex dispute, but I can certainly empathize with the initial reaction from fpgaminer.  It would make me really happy to be providing low-cost boards to people who are interested in tweaking/tuning/improving the HDL code, but I think I've figured out now that there aren't as many of those people as I'd thought.

2. Support.  I'm happy to help out here in a casual message-board-member way.  But I'm kinda worried about lazy users buying a "turn-key" solution from me and then demanding that I hand-hold them through the whole process of configuring Xilinx's crapware drivers on their Windows host box (I haven't used Windows in almost a decade) under threat of posting negative reviews of my product ("did not work for me").  I definitely can't sell the boards for $250 if I have to budget in my own time spent on extensive tech support work.

Anyways.  Looks like the first run will be small personal-use-only, but there may be another batch of boards in November after I've figured out if it's worth taking this to the next level.

Here we have a team working hard on the HDL, but unfortunately, it's really extremely difficult work.