Topic: Power costs & real efficiency analysis for ATI 5770 (Read 1756 times)

full member
Activity: 406
Merit: 104
I run my 6770 at 960 MHz. Going from 800 to 960 MHz the card's draw jumped by 20 watts, and my hashing went from 185 to 212 Mhash/s.

I'm running it on a 5-6 year old Dell. My Kill A Watt meter says I'm pulling 240 watts on average while hashing away (240 W × 24 h ≈ 5.76 kWh/day), which at 7 cents per kWh costs me roughly (5.76 × $0.07 ≈) 40 cents a day. At the current rate of my pool I am averaging $1.78 income a day against about 40 cents of power expense.

You bring up an interesting point; there might be a certain clock speed that maximizes profit. I haven't downclocked the memory yet either. Ironically, this old Dell draws 15-20 watts more than a 3.6 GHz quad-core AMD with its power-saving features toggled off and a 6850 overclocked to 875 MHz. Looking forward to my new Sempron build :-)
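
If anyone wants to sanity-check the arithmetic, here's a quick throwaway Python scribble of it (just my numbers from above plugged into a made-up little function, nothing fancy):

# Rough daily profit estimate from the figures above (hypothetical helper name).
def daily_profit(watts, price_per_kwh, income_per_day):
    kwh_per_day = watts * 24 / 1000.0            # 240 W -> 5.76 kWh per day
    power_cost = kwh_per_day * price_per_kwh     # 5.76 * 0.07 = ~$0.40 per day
    return income_per_day - power_cost

print(daily_profit(240, 0.07, 1.78))             # ~1.38, i.e. about $1.38/day net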
member
Activity: 98
Merit: 10
Good to see another NZer mining.
Overclocking does bump the power usage up but under-clocking the memory brings it back down.
Just need to find a good balance.
newbie
Activity: 54
Merit: 0
No. There's good documentation showing the results of overclocking here:
https://en.bitcoin.it/wiki/Mining_hardware_comparison

I think I will investigate underclocking the memory, then overclocking the core until it reaches a similar running temperature, around 74-76 °C.

After my initial post, I realised I'd forgotten to include the power cost of my ADSL 4-port modem/router, which draws about 10 watts (though it's used for other things for part of the day anyway); it won't substantially affect the economics of mining.
sr. member
Activity: 302
Merit: 250
Have you tried overclocking the card? It would be interesting to know how overclocking affects power usage/# of hashes gained.
newbie
Activity: 54
Merit: 0
Ok, I have to do some posts in the newbies forum to get greater access, so I'll try to share something worthwhile.

I've been mining for about a week now and my head is spinning from all the reading I've done on the topic. A key factor in the profitability of mining is obviously the cost of the power to run the equipment, especially for an amateur like me who just wants to get some value out of a graphics card. I have a power meter, so measuring my current wattage is as easy as plugging the box into it.

The hardware in my box is bottom-end gaming kit and is about 15 months old. I bought what I think are good-quality brands of economical hardware, and it's proved faultless so far. In particular I made an effort to get an efficient (but value-for-money) PSU, so unbranded equipment will probably use a few percent more power than mine.

Hardware:
Corsair 400W PSU
ASUS Radeon (ATI) EAH5770
AMD Phenom II X2 555 3.2 GHz CPU
Seagate 500 GB 7200 RPM 16 MB SATA
(Running Win 7)

Power consumption (excluding monitor):
Idle: 27W
GPU mining (Phoenix miner, 1 thread, 92% GPU load, ~10% CPU load): 69.5 W, 161 Mhash/s
GPU mining (Phoenix miner, 2 threads, 96% GPU load, ~10% CPU load): 70.5 W, 167 Mhash/s (but with substantially more stale shares)
GPU & CPU mining (Phoenix miner, 1 GPU thread; standard Bitcoin client for the CPU, 100% CPU load): 94.5 W, 165 Mhash/s from the GPU + 1.9 Mhash/s from the CPU

Cost of power = NZ$0.21168/kWh

I'm guessing that CPU mining is so inefficient I won't recoup my power costs, so I won't look into that further.
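
To put a rough number on that guess, here's a quick back-of-envelope Python scribble comparing marginal hashes-per-watt from the measurements above (nothing more rigorous than that, and I'm ignoring the small GPU-rate differences between runs):

# Extra Mhash/s gained per extra watt, GPU vs CPU, from the measurements above.
idle_w = 27.0
gpu_mhash_per_watt = 161 / (69.5 - idle_w)       # ~3.8 Mhash/s per watt
cpu_mhash_per_watt = 1.9 / (94.5 - 69.5)         # ~0.08 Mhash/s per watt
print(gpu_mhash_per_watt / cpu_mhash_per_watt)   # GPU is ~50x more efficient here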

Consumption of GPU mining (if I'm using the computer anyway) = just the additional draw from the GPU load = 69.5 − 27 = 42.5 W
Consumption of GPU mining (if I'd otherwise have shut the PC down) = 69.5 W

The PC is often left on during the day anyway for a number of hours; let's say 10.5 hours on average.
Daily cost = [10.5 h × (42.5 W ÷ 1000) + (24 − 10.5) h × (69.5 W ÷ 1000)] × NZ$0.21168/kWh = 1.3845 kWh × NZ$0.21168/kWh ≈ NZ$0.293
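
The same sum in script form, in case anyone wants to play with the hours or the tariff (again just my own throwaway Python; the variable names are mine):

# Daily cost split into hours where only the extra GPU draw counts (PC on anyway)
# and hours where the whole box counts (PC would otherwise be off).
IDLE_W, MINING_W = 27.0, 69.5
PRICE_NZD_PER_KWH = 0.21168
HOURS_ON_ANYWAY = 10.5

marginal_kwh = HOURS_ON_ANYWAY * (MINING_W - IDLE_W) / 1000.0      # ~0.446 kWh
full_kwh = (24 - HOURS_ON_ANYWAY) * MINING_W / 1000.0              # ~0.938 kWh
daily_cost = (marginal_kwh + full_kwh) * PRICE_NZD_PER_KWH
print(round(daily_cost, 3))                                        # ~0.293 NZ$/day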

It's winter in NZ right now, so the waste heat actually saves on space heating too. Our heat pump (fairly old) probably has a COP of 2.5:1 or worse, so each kWh of waste heat displaces roughly 0.4 kWh of heat-pump electricity; the actual net cost is probably more like NZ$0.20/day.

(If I couldn't make good use of the waste heat, and the PC wouldn't otherwise have been on at all, the cost would be about NZ$0.353/day.)

NZ$1 = approx US$0.80 I think, so net power costs are about US$0.16/day

My earnings over the roughly five days I've had GPU mining set up have averaged about 0.15 BTC/day through pooled mining, which, valued at US$15 per BTC, would earn me US$2.25/day for my 16 cents.
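
Putting it all together in the same throwaway script (the 2.5:1 heat-pump figure and the US$0.80 exchange rate are the guesses from above, so treat the result as a rough estimate):

# Rough net daily result in US$, using the figures above.
daily_cost_nzd = 0.293                 # from the calculation above
heat_pump_cop = 2.5                    # guessed COP; waste heat offsets ~1/COP of the cost
net_cost_nzd = daily_cost_nzd * (1 - 1 / heat_pump_cop)    # ~NZ$0.18 (call it NZ$0.20)
usd_per_nzd = 0.80                     # rough exchange rate
earnings_usd = 0.15 * 15               # 0.15 BTC/day at US$15/BTC = US$2.25
print(earnings_usd - net_cost_nzd * usd_per_nzd)            # ~US$2.11/day net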

If I'd done some work in the office instead of writing this, I probably could've earned more than a week's mining, but I guess it's nice to have an interesting hobby...

PS: My GPU output drops by around 10 Mhash/s when I leave Firefox open (with about 20-30 tabs, mind you, thanks to my crazy browsing habits).