Scalability. My "end game" would be putting 4GH/s in a 4U chassis and dropping it in a datacenter.
Am I crazy? Would it be too much cost/complexity for too little gain?
Before you splash that kind of money on expensive and (relatively) power-hungry FPGAs, do consider that if Bitcoin is here to stay,
sooner or later someone will do a quick-path/HardCopy port, or even a full custom ASIC. If it's not BFL, someone else will, and it will make your off-the-shelf FPGAs look almost as silly as someone who bought racks full of Xeons for Bitcoin mining a year ago.
To get a feel for what full custom ASICs could achieve, have a look here:
http://rijndael.ece.vt.edu/sha3/chip/sha3-asic-datasheet.pdf
If my math isn't off, that test chip gets either 150 or 300 MH/W on an old 130nm process.
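For anyone who wants to redo that math, a minimal conversion sketch follows; the throughput and power figures in it are hypothetical placeholders, not values from the datasheet. Keep in mind Bitcoin hashes the block header twice, so a raw SHA-256 MH/W figure gets cut in half for mining, which is one way the same chip can be quoted at two different numbers.

```python
# Rough MH/W conversion sketch. The throughput and power numbers below are
# hypothetical placeholders, NOT values taken from the datasheet.

def mining_mh_per_watt(raw_sha256_mhs, power_watts):
    """Bitcoin mining efficiency (MH/W) for a core with the given raw
    SHA-256 throughput (MH/s). Each mining attempt is a double SHA-256,
    so the raw figure is halved."""
    return (raw_sha256_mhs / 2) / power_watts

# Example with made-up figures: a core doing 300 MH/s of raw SHA-256 at 0.5 W
print(mining_mh_per_watt(300, 0.5))   # -> 300.0 mining MH/W (600 MH/W single-hash)
```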
HardCopy electrical efficiency wouldn't be that high, and full custom ASICs are not happening any time in the near future (if ever). The greater threat comes from the non-viability of Bitcoin (which would affect all custom solutions equally) and the relentless drive of Moore's law. Still, the comparison to a rack full of Xeons isn't realistic. The 7800-series cards will likely have 2x the electrical efficiency of the 5000- and 6000-series cards. Does the day the 7800s launch suddenly turn all current rigs into obsolete, unprofitable junk? Hardly. Sure, a rack full of 7870s would be nicer than 5970s, but the mining market isn't efficient enough to instantly make older tech obsolete.
Greater electrical efficiency starts becoming a game of diminishing returns.
My rigs get about 2.6 MH/W, and that is something I am pretty proud of.
Current real FPGA solutions are roughly 22 MH/W.
A structured ASIC is roughly double the efficiency per watt, say 40 MH/W.
For a full custom ASIC, let's split the difference on that test chip and say 230 MH/W.
1 Bitcoin electrical cost (at the above performance per watt & 1.2M difficulty; rough arithmetic sketched below the list):
GPU - $0.88
FPGA - $0.10
SASIC - $0.06
Custom ASIC - $0.01
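For anyone who wants to check those numbers, here is a rough sketch of the arithmetic. The ~$0.08/kWh electricity price and the 50 BTC block reward are my assumptions, chosen because they reproduce the table above; plug in your own rate.

```python
# Sketch of the cost-per-BTC arithmetic behind the table above.
# Assumptions (mine, not stated elsewhere in the thread):
#   - 50 BTC block reward
#   - ~$0.08/kWh electricity, which reproduces the figures shown

DIFFICULTY   = 1.2e6   # network difficulty used above
BLOCK_REWARD = 50      # BTC per block (assumption)
KWH_PRICE    = 0.08    # USD per kWh (assumption)

hashes_per_btc = DIFFICULTY * 2**32 / BLOCK_REWARD   # expected work per coin

def usd_per_btc(mh_per_watt):
    """Electricity cost to mine one BTC at a given efficiency (MH/W)."""
    joules = hashes_per_btc / (mh_per_watt * 1e6)   # 1 MH/W = 1e6 hashes per joule
    return joules / 3.6e6 * KWH_PRICE               # joules -> kWh -> USD

for label, eff in [("GPU", 2.6), ("FPGA", 22), ("SASIC", 40), ("Custom ASIC", 230)]:
    print(f"{label:12s} {eff:6.1f} MH/W -> ${usd_per_btc(eff):.2f} per BTC")
```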
Sure, SASICs and ASICs are more efficient, but look at current prices: Bitcoins trade at roughly 3x my electrical cost and almost 25x the FPGA cost. Even some day when a Bitcoin is priced at only 3x the SASIC electrical cost, an FPGA is still profitable. Eventually the combination of Moore's law and SASICs (or just more efficient FPGAs) will drive revenue below the cost of production for a "current gen" FPGA cluster, but that day is likely at least a decade away. Remember, even when the tech exists it will take time before the market adopts it.
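To put a rough number on that break-even point, using the table's own figures:

```python
# Toy break-even check for the point above, using the table's figures.
btc_price = 3 * 0.06    # BTC priced at 3x the SASIC electrical cost
fpga_cost = 0.10        # FPGA electrical cost per BTC
print(f"FPGA margin per BTC: ${btc_price - fpga_cost:.2f}")   # -> $0.08, still positive
```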
TLDR version:
Scale doesn't negatively affect ROI%.
If a $20K FPGA cluster is going to lose money over its lifetime, then a $500 FPGA board will also lose money over its lifetime. So if you believe SASICs or custom ASICs make building a cluster a bad investment, why would you want to buy a $500 FPGA board at all?
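A trivial sketch of why scale drops out of the ROI% calculation; all dollar figures here are hypothetical placeholders.

```python
# ROI% is a ratio, so scaling both hardware cost and lifetime revenue by the
# same board count leaves it unchanged. Figures below are hypothetical.

def roi_pct(cost_per_board, lifetime_revenue_per_board, boards):
    cost    = cost_per_board * boards
    revenue = lifetime_revenue_per_board * boards
    return 100 * (revenue - cost) / cost

print(roi_pct(500, 650, 1))    # one $500 board              -> 30.0
print(roi_pct(500, 650, 40))   # $20K cluster of same boards -> 30.0
```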