I have both cards and Icarus. Currently, for me, the electricity to generate 1 BTC is about 29c for GPU and 1.44c for FPGA, and with BTC being around $6 it is not that important to me atm. I think the big advantage is ease of expandability:
Those are all very good points, thank you. The cost savings look substantial; that's the only real issue for me. I have computers used by my family that I haven't upgraded the GPUs on yet, so the other factors are not as big a concern for me. Of course, I forgot about heat because it's winter. Thanks for reminding me how miserable last summer was for me.
Keep in mind that Defkin's cost of 29c to generate 1 BTC is extremely low; most people cannot achieve anywhere near this low a cost.
To generate 1 BTC per day at today's difficulty would require about 1400Mhps. That's 3 overclocked 5870s running at 200W each, or 600W. Add to that the power for the CPU, motherboard, and power supply losses (assume 90% efficiency) and that brings you up to 700W, or 16.8kWh per day. The AVERAGE electricity rate in the US is $0.11/kWh, so 16.8kWh x $0.11/kWh is $1.85. Not $0.29.
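If you want to plug in your own numbers, the arithmetic is easy to script. Here's a minimal Python sketch of the calculation above; the difficulty * 2^32 relationship is the standard expected-hashes-per-block formula, and the 700W and $0.11/kWh figures are the ones from this post, not anything you should take as given:

```python
# Electricity cost to mine 1 BTC, using the figures from the post above.

BLOCK_REWARD = 50.0      # BTC per block at today's reward
SECONDS_PER_DAY = 86400

def hashrate_for_one_btc_per_day(difficulty):
    """Hashes/sec needed to expect 1 BTC/day; difficulty * 2^32 is the
    expected number of hashes to solve one block."""
    return difficulty * 2**32 / (SECONDS_PER_DAY * BLOCK_REWARD)

def cost_per_btc(wall_watts, rate_per_kwh):
    """Electricity cost of one day of mining at a rig sized for 1 BTC/day."""
    kwh_per_day = wall_watts / 1000.0 * 24
    return kwh_per_day * rate_per_kwh

# 3 overclocked 5870s at 200W each, plus CPU/motherboard/PSU losses: ~700W at the wall.
print(round(cost_per_btc(700, 0.11), 2))  # 1.85
```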
So although Defkin can somehow generate 1BTC with his GPUs for $0.29, most people in the States will need $1.85 in electricity to generate that same 1BTC. Run the same numbers for the FPGA and see what you get. Defkin gets 1.44c, but again that is unrealistically low for the average person. Check your electricity bill and see what your actual rate is.
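For the FPGA side, here's the same sketch under the assumption that an Icarus does roughly 380Mhps at about 20W per board (ballpark figures; substitute your own board's numbers):

```python
# Same calculation for Icarus boards. The ~380Mhps at ~20W per board
# figures are assumptions; check your own hardware.
import math

ICARUS_MHPS = 380.0    # assumed per-board hashrate
ICARUS_WATTS = 20.0    # assumed per-board power draw

boards = int(math.ceil(1400 / ICARUS_MHPS))     # boards needed to match ~1400Mhps
kwh_per_day = boards * ICARUS_WATTS / 1000.0 * 24
print(boards, round(kwh_per_day * 0.11, 2))     # 4 boards, ~$0.21 per BTC
```

Under those assumptions, even at the average US rate the FPGAs work out to roughly $0.21 per BTC versus $1.85 for the GPUs, so the relative advantage holds even if Defkin's absolute numbers don't.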