Topic: How to figure out total power draw of a GPU?

newbie
Activity: 8
Merit: 0
I do have the Kill-A-Watt plug, but it's not good for in-depth analysis: I can't go and see day-by-day power draw. Plus, there are stories of them burning up under high-wattage loads, so I'm afraid to use one with my rig.
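For the day-by-day view, HWiNFO64's own CSV log can be summarized after the fact. A minimal sketch, assuming a log file name and "Date" / "GPU Chip Power [W]" column names (the power label is taken from the sensor readings quoted below; adjust to your actual log):

Code:
import csv
from collections import defaultdict

totals = defaultdict(float)  # date -> summed watt samples
counts = defaultdict(int)    # date -> number of samples

# "hwinfo_log.csv" is a placeholder path; point this at your real log.
with open("hwinfo_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        try:
            watts = float(row["GPU Chip Power [W]"])  # assumed column name
        except (KeyError, TypeError, ValueError):
            continue  # skip repeated headers and rows without a valid reading
        date = row.get("Date", "unknown")  # assumed column name
        totals[date] += watts
        counts[date] += 1

for date in sorted(totals):
    print(f"{date}: {totals[date] / counts[date]:.1f} W average")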

So are we saying that the on-board sensors in cards are not accurate?

thanks
jaja
sr. member
Activity: 861
Merit: 281
Hi,

Can someone tell me how to calculate the total power drawn by a GPU?

I'm logging various GPU metrics with HWiNFO64 and would like to know which values constitute the total wattage drawn by a GPU. In the log file I'm seeing the following (these were captured while the card was mining):

GPU Core Voltage (VDDC) [V]          1.194
GPU Core Current [A]                78.313
GPU Core Power [W]                  93.486
GPU Chip Power [W]                 129.578
GPU VRM Voltage Out (VOUT/VID) [V]   1.193
GPU VRM Voltage In (VIN/+12V) [V]   12.125
GPU VRM Current In (IIN) [A]         6.438
GPU VRM Current Out (IOUT) [A]      56.5
GPU VRM Power Out (POUT) [W]        67.25
GPU VRM Power In (PIN) [W]          78

Is the total power the card is drawing Core Power + Chip Power? What about the VRM? Is there a way to validate the wattage drawn against Power In/Current In?

I think understanding these values would be really helpful in deciding how to approach overclocking/underclocking of a specific card (based on historical data) instead of taking a shot in the dark, right?

thanks
jaja


You can start by investing in a Kill-A-Watt. Plug your system through it and check the power consumption at system idle, then run it at full load and record the power consumption again. You can then get an approximate value for the power consumption of one GPU: total power at full load minus total power at idle, divided by the number of GPUs you have.
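A minimal sketch of that arithmetic, with made-up placeholder readings:

Code:
# Per-GPU draw is approximately (full-load watts - idle watts) / number of GPUs.
idle_watts = 110.0  # whole-system idle, read from the wall meter (placeholder)
load_watts = 910.0  # whole-system full mining load (placeholder)
num_gpus = 6        # placeholder rig size

per_gpu_watts = (load_watts - idle_watts) / num_gpus
print(f"Approximate draw per GPU: {per_gpu_watts:.0f} W")  # ~133 W here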

Secondly, the values that you list don't have any bearing on how well a card will undervolt or overclock. The parameter that comes into play for that is the ASIC quality of your GPU.
legendary
Activity: 2002
Merit: 1051
ICO? Not even once.
Measure at the wall. Software can't possibly be accurate.
newbie
Activity: 7
Merit: 0
Software readings aren't accurate unless you have one of the digital power supplies that report readings directly from the PSU. If you don't have one of those, you'll need external hardware to measure it. Something like this is cheap and fairly accurate: https://www.amazon.com/P3-International-P4460-Electricity-Monitor/dp/B000RGF29Q
newbie
Activity: 8
Merit: 0
Hi,

Can someone tell me how to calculate the total power drawn by a GPU?

I'm logging various GPU metrics with HWiNFO64 and would like to know which values constitute the total wattage drawn by a GPU. In the log file I'm seeing the following (these were captured while the card was mining):

GPU Core Voltage (VDDC) [V]          1.194
GPU Core Current [A]                78.313
GPU Core Power [W]                  93.486
GPU Chip Power [W]                 129.578
GPU VRM Voltage Out (VOUT/VID) [V]   1.193
GPU VRM Voltage In (VIN/+12V) [V]   12.125
GPU VRM Current In (IIN) [A]         6.438
GPU VRM Current Out (IOUT) [A]      56.5
GPU VRM Power Out (POUT) [W]        67.25
GPU VRM Power In (PIN) [W]          78

Is the total power the card is drawing Core Power + Chip Power? What about the VRM? Is there a way to validate the wattage drawn against Power In/Current In?
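As a first sanity check, each reported power figure should be roughly the matching voltage multiplied by the matching current (P = V × I). A minimal sketch running that check on the values logged above:

Code:
# Cross-check each reported power against V * I from the same log sample.
checks = [
    ("GPU Core Power",     1.194, 78.313, 93.486),  # VDDC * Core Current
    ("GPU VRM Power In",  12.125,  6.438, 78.0),    # VIN * IIN
    ("GPU VRM Power Out",  1.193, 56.5,   67.25),   # VOUT * IOUT
]
for name, volts, amps, reported in checks:
    print(f"{name}: {volts * amps:.2f} W computed vs {reported} W reported")

All three computed values land within a fraction of a watt of the reported figures, so the sensors are at least self-consistent with one another.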

I think understanding these values would be really helpful in deciding how to approach overclocking/underclocking of a specific card (based on historical data) instead of taking a shot in the dark, right?

thanks
jaja