Based on the figures thrown around on here, I gather that tweaking the S3 to achieve 500 GH/s increases the overall at-the-wall wattage by 32% for only a 14% increase in hash rate. That 14% hash rate gain by itself also consumes more than twice the energy, at 1.83 W/GH/s (2.38x the stock 0.77 W/GH/s). Is it worth it? Are we getting too carried away with tweaking and modding? How do these figures affect ROI? I can't help thinking that this is getting to be about interweb bragging rights over whose S3 achieves 0.5 TH/s.
Perhaps there are some hardcore mathematicians/statisticians among us who could elaborate on this subject.
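For anyone who wants to sanity-check those figures, here's a quick Python sketch. The only inputs are the 440 GH/s @ 340 W stock and 500 GH/s @ 450 W tweaked numbers quoted in this thread; everything else is just arithmetic:

Code:
# Stock vs. tweaked S3 efficiency, plus the marginal cost of the extra hashrate.
stock_ghs, stock_w = 440, 340
oc_ghs, oc_w = 500, 450

stock_eff = stock_w / stock_ghs                          # ~0.77 W per GH/s
oc_eff = oc_w / oc_ghs                                   # ~0.90 W per GH/s
marginal_eff = (oc_w - stock_w) / (oc_ghs - stock_ghs)   # ~1.83 W per GH/s for the extra 60 GH/s

print(f"stock:    {stock_eff:.2f} W/GH/s")
print(f"tweaked:  {oc_eff:.2f} W/GH/s")
print(f"marginal: {marginal_eff:.2f} W/GH/s (~{marginal_eff / stock_eff:.1f}x stock)")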
It's actually a ~20% power increase.
Calculating ROI off one unit is just silly.
Look at the difficulty increase from 16.8 to 18.7 in the past two weeks. It will continue to get worse. Every week or two your ROI is going to get pushed out further and further.
Whether or not you pay for power is the biggest factor in terms of ROI. Either way, the small increase in hashrate at the current difficulty should give you multiples more profit than it costs you in power.
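If you want to see how fast that compounds, here's a rough Python sketch. The only numbers taken from the thread are the 500 GH/s hashrate and the 16.8 -> 18.7 jump (read as billions); the hardware cost is purely a placeholder:

Code:
# Rough sketch of how steady difficulty growth stretches payback time.
BLOCK_REWARD = 25                     # BTC per block (2014-era)
HASHRATE = 500e9                      # H/s, a tweaked S3
DIFFICULTY = 18.7e9                   # reading the quoted 18.7 as ~18.7 billion
GROWTH = 18.7 / 16.8 - 1              # ~11.3% per ~2-week adjustment, from the quoted jump
HARDWARE_COST_BTC = 0.9               # placeholder price of one S3 in BTC

def btc_per_day(hashrate, difficulty):
    """Expected BTC mined per day at the given hashrate and difficulty."""
    return hashrate * 86400 * BLOCK_REWARD / (difficulty * 2**32)

earned, day, difficulty = 0.0, 0, DIFFICULTY
while earned < HARDWARE_COST_BTC and day < 3650:
    earned += btc_per_day(HASHRATE, difficulty)
    day += 1
    if day % 14 == 0:                 # bump difficulty every ~2 weeks
        difficulty *= 1 + GROWTH

print(f"days to mine back {HARDWARE_COST_BTC} BTC: {day}"
      if earned >= HARDWARE_COST_BTC
      else "never recoups the hardware cost at this growth rate")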
I'm not sure if you meant hash (rate) power or power consumed, but I can't figure out where your ~20% comes from. I figure:
500 GH/s - 440 GH/s = 60 GH/s
Therefore: 60 GH/s / 440 GH/s x 100 = 13.64% (rounded up to a 14% increase in hash rate from the stock 440 GH/s)
450 W - 340 W = 110 W
Therefore: 110 W / 340 W x 100 = 32.35% (rounded down to a 32% increase in power consumption from the stock 340 W)
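Same arithmetic as a quick Python check, if anyone wants to plug in their own readings:

Code:
# Percentage gains from the stock (440 GH/s, 340 W) to tweaked (500 GH/s, 450 W) figures.
hash_gain = (500 - 440) / 440 * 100    # ~13.6% more hashrate
power_gain = (450 - 340) / 340 * 100   # ~32.4% more wall power
print(f"hash rate: +{hash_gain:.1f}%, power: +{power_gain:.1f}%")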
I don't think calculating off the most basic unit (a single S3) and extrapolating from it is silly. In fact, it is a very sound method.
I agree that there is an advantage in OC'ing the S3 to a certain extent (depending on your kWh rate), as the overall efficiency hit at 500 GH/s is not that bad (from 0.77 W/GH/s to 0.90 W/GH/s).
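To tie the kWh rate back into it, here's a rough break-even sketch for the extra 60 GH/s against the extra 110 W. The 18.7 difficulty is the figure quoted above (read as billions); the BTC price is purely a placeholder, so plug in your own:

Code:
# Break-even electricity rate for the extra 60 GH/s vs. the extra 110 W at the wall.
EXTRA_HASH = 60e9        # H/s gained by tweaking
EXTRA_WATTS = 110        # extra at-the-wall power
DIFFICULTY = 18.7e9      # reading the quoted 18.7 as ~18.7 billion
BTC_PRICE_USD = 500      # placeholder spot price, adjust to taste

extra_btc_day = EXTRA_HASH * 86400 * 25 / (DIFFICULTY * 2**32)
extra_usd_day = extra_btc_day * BTC_PRICE_USD
extra_kwh_day = EXTRA_WATTS * 24 / 1000
breakeven = extra_usd_day / extra_kwh_day   # $/kWh at which the tweak stops paying for itself

print(f"extra revenue: ${extra_usd_day:.2f}/day, extra power: {extra_kwh_day:.2f} kWh/day")
print(f"break-even electricity rate: ${breakeven:.2f}/kWh")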