
Topic: What drives power consumption for mining? (Read 423 times)

member
Activity: 81
Merit: 1002
It was only the wind.
June 11, 2017, 09:11:36 PM
#6
Core volts are what EAT watts.

And voltage scales (in a not exactly linear manner) with core clock.
Power is roughly proportional to voltage squared, and if voltage is held constant, power scales linearly with frequency.


Not exactly. I can drop volts and OC. But yeah, overall, you're correct.
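
Rough numbers for the drop-volts-and-OC case, assuming the usual dynamic-power model P ≈ C·V²·f (the voltages and clocks below are made-up examples, not measurements from any particular card):

Code:
# Toy comparison: undervolt + overclock vs. stock, using P ~ C * V^2 * f.
# All numbers are illustrative, not measured from a real card.

def dyn_power(volts, mhz):
    # Dynamic power estimate; the constant C cancels out when comparing ratios.
    return volts ** 2 * mhz

stock = dyn_power(volts=1.05, mhz=1900)   # hypothetical stock settings
tuned = dyn_power(volts=0.90, mhz=2000)   # lower volts, slightly higher clock

print(f"power vs. stock: {tuned / stock:.2f}")   # ~0.77, i.e. ~23% less despite the OC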
sr. member
Activity: 610
Merit: 265
My rule of thumb is:

Power scales to the square of voltage.
Power scales linearly with clock speed at a fixed voltage (and roughly with the cube of clock speed once you raise voltage along with it).
Clock speeds scale linearly with voltage.

Mean time to failure gets exponentially worse with increased voltage
Mean time to fan failure gets exponentially worse as fan speed increases

Because my farm is in a remote location, I just run everything at 0.85 V, Nvidia and AMD GPUs alike, but I'll push them as hard as possible with clock speeds and dual mining at that voltage.
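
Those rules of thumb as a quick sketch, assuming voltage is raised linearly with clock and the usual P ≈ C·V²·f model (the baseline numbers are placeholders, only the ratios matter):

Code:
# Scaling rules of thumb relative to a baseline operating point.
# Baseline numbers are placeholders; only the ratios matter.

BASE_V = 1.00    # volts
BASE_F = 1800    # MHz
BASE_P = 150.0   # watts at the baseline (placeholder)

def power_at(volts, mhz):
    # Power scales with V^2 and linearly with f (P ~ C * V^2 * f).
    return BASE_P * (volts / BASE_V) ** 2 * (mhz / BASE_F)

def power_at_clock(mhz):
    # If voltage has to rise linearly with clock, power ends up roughly ~ f^3.
    volts = BASE_V * (mhz / BASE_F)
    return power_at(volts, mhz)

print(round(power_at(0.85, 1800)))    # undervolt only: ~108 W, about 28% saved
print(round(power_at_clock(1980)))    # +10% clock with matching volts: ~200 W, about 33% more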
sr. member
Activity: 588
Merit: 251
Core volts are what EAT watts.

And voltage scales (in a not exactly linear manner) with core clock.
Power is roughly proportional to voltage squared, and if voltage is held constant, power scales linearly with frequency.
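
To put a number on the V² part alone (constant core clock, ignoring static/leakage power and the memory):

Code:
# What the V^2 term alone buys you at a constant core clock.
for volts in (1.00, 0.95, 0.90, 0.85):
    print(f"{volts:.2f} V -> {volts ** 2:.0%} of baseline dynamic power")
# 1.00 V -> 100%, 0.95 V -> 90%, 0.90 V -> 81%, 0.85 V -> 72%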
hero member
Activity: 1036
Merit: 606
TDP > Core clock > Memory clock in terms of power usage.
sr. member
Activity: 378
Merit: 250
So, random question as I jump back into some coin mining and try to build the most efficient rigs: what drives a card's power consumption down the most? Lowering the power limit, the core clock, or the memory clock? In particular, I've got some 1070s I'm trying to mine ETH with, and while I can get them to 30-31 MH/s, I can't seem to get the card below 110 W draw or so.
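
For comparing whichever settings you end up trying, hash per watt is the number to track; a minimal sketch (only the 1070 row uses the figures above, the other rows are hypothetical):

Code:
# Compare tuning profiles by efficiency (MH/s per watt).
# Only the first entry uses numbers from this post (GTX 1070, ~30 MH/s at ~110 W);
# the other rows are hypothetical settings for illustration.

profiles = {
    "stock-ish 1070":           (30.0, 110.0),   # (MH/s, watts)
    "hypothetical undervolt":   (29.0,  95.0),
    "hypothetical power-limit": (27.5,  85.0),
}

for name, (mhs, watts) in sorted(profiles.items(),
                                 key=lambda kv: kv[1][0] / kv[1][1],
                                 reverse=True):
    print(f"{name:26s} {mhs / watts:.3f} MH/s per W")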