
Topic: Trying to determine energy cost of gpu per month, is it right? (Read 395 times)

newbie
Activity: 33
Merit: 0
The first reply got it. I know, because I was working through the kW-to-kWh conversion for a while!
full member
Activity: 394
Merit: 101
Just to add though: depending on the motherboard and CPU you are using, as well as other components like an HDD or SSD, that 50-70W overhead can go up to 150W. I personally am using a Pentium G4560 processor and an SSD, but still my CPU-only consumption without any mining is 150W.
How are you managing to burn 150W with a system based on g4560? I've got a few rigs with this CPU, all Z270 boards, and my idle power consumption is ~40 W (from the wall). G4560's TDP is only 54W, and it's not an overclockable CPU — so even if you feel like stressing the CPU to the max for some reason you're still not supposed to go over 80-90W. I've got one system with an overclocked i7-6700k (with increased voltage as well) mining cryptonight and even that thing is a tiny bit below 100W under load (not just the CPU, but the whole "mb+cpu+ram+2ssd+1hdd+fans" combo — all drawing ~100W from the wall).

Yes, I see on Intel's ARK site that the TDP is 54W: https://ark.intel.com/products/97143/Intel-Pentium-Processor-G4560-3M-Cache-3_50-GHz.
Perhaps he was describing the whole system, not just the CPU, and accidentally said "CPU only".
legendary
Activity: 1106
Merit: 1014
Just to add though: depending on the motherboard and CPU you are using, as well as other components like an HDD or SSD, that 50-70W overhead can go up to 150W. I personally am using a Pentium G4560 processor and an SSD, but still my CPU-only consumption without any mining is 150W.
How are you managing to burn 150W with a system based on g4560? I've got a few rigs with this CPU, all Z270 boards, and my idle power consumption is ~40 W (from the wall). G4560's TDP is only 54W, and it's not an overclockable CPU — so even if you feel like stressing the CPU to the max for some reason you're still not supposed to go over 80-90W. I've got one system with an overclocked i7-6700k (with increased voltage as well) mining cryptonight and even that thing is a tiny bit below 100W under load (not just the CPU, but the whole "mb+cpu+ram+2ssd+1hdd+fans" combo — all drawing ~100W from the wall).
sr. member
Activity: 784
Merit: 282
Ok, I bought a GTX 1050 monster card on eBay, and in the EWBF mining software it consistently reads about 150W on average. The highest winter rate on my energy bill last year was $0.25/kWh.
Using that, if I run the GPU mining non-stop for a month, it comes to this:

150W x 24 h/day x 30 days/month / 1000 W/kW = 108 kWh/month.

Multiplying by the highest rate: 108 kWh x $0.25/kWh = $27 per month. This is the price you pay for non-stop mining at 150W at a $0.25/kWh rate.

Does it sound about right?

It's basically:
0.15 kW * $0.25/kWh * 24 h * 30 days, so yeah, $27 per month.

But the PC itself is probably drawing around 50-70 watts too.

The first response hit the nail on the head.

Just to add though: depending on the motherboard and CPU you are using, as well as other components like an HDD or SSD, that 50-70W overhead can go up to 150W. I personally am using a Pentium G4560 processor and an SSD, but still my CPU-only consumption without any mining is 150W. So yeah, you should factor that into your calculations too. Good luck!
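To sanity-check the arithmetic in the posts above, here's a small Python sketch. The 150W card, the $0.25/kWh rate, and the ~60W of system overhead are just the figures mentioned in this thread, not universal numbers:

```python
def monthly_cost(watts, rate_per_kwh, hours_per_day=24, days=30):
    """Cost of running a constant electrical load for a month.

    Electricity is billed in kWh (kilowatt-hours), so convert watts to
    kilowatts and multiply by the hours the load runs in the month.
    """
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate_per_kwh

gpu_only = monthly_cost(150, 0.25)        # 108 kWh over the month
whole_rig = monthly_cost(150 + 60, 0.25)  # plus ~60 W of system overhead
print(f"GPU only: ${gpu_only:.2f}/month, whole rig: ${whole_rig:.2f}/month")
```

Note the billed unit is energy in kWh, not power in kW: the wattage only tells you the rate at which energy is consumed.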
full member
Activity: 394
Merit: 101
That's good to check with you guys. I do understand the GPU isn't the only power draw; there are many other electrical parts in the PC, but once you start mining they become minor and the GPU takes most of the power.

Anyway, I only started a day ago, first time ever, and I can be somewhat profitable. I have a longer-term plan, perhaps investigating solar charging: there are dozens of options on Google, but those cost thousands of dollars and the output isn't really enough (probably barely enough to power the CPU/GPU). The core of the investigation will be a scalable system where you can line up solar arrays and their batteries and pool the output. So far I haven't seen anything like that on the internet; most systems target home use with 2-4 batteries at most, which would likely run out of juice quite fast.

full member
Activity: 846
Merit: 115
At this electricity price it is not profitable to mine coins. My advice to you is to find another way of earning coins. Think about why most of the mining power is concentrated in China: their electricity is three times cheaper than yours. In addition, the mining difficulty keeps growing, and after some time you may find that it is no longer possible to mine bitcoins with a GPU at all.

I'm mining at a 22-cent rate and I can say it's still profitable. Yes, it's a headwind, but it's not going to force me to give up my hobby.

Most people base their calculations on the assumption that the coin price will stay the same for the next 1-3 years. That's never the case. Once you take into account the price appreciating in value, your electricity is no longer that big of a deal.


Plus, there are ways to offset the electricity cost by focusing on efficiency: get Platinum- or Titanium-rated power supplies, run on 240 volts, run video cards at 60 percent TDP, and HODL for the coin value to appreciate. Win!

sr. member
Activity: 518
Merit: 250
You should put a watt meter between your computer and the electrical socket to determine the whole computer's power usage. Sure, the GPU accounts for most of the wattage, but it's not 100%.
legendary
Activity: 1106
Merit: 1014
Think about why most of the mining power is concentrated in China: their electricity is three times cheaper than yours.
My electricity is 5 times cheaper than his ($0.05), and I'm not even in China. :) They definitely pay even less for electricity.
Ok, I bought a GTX 1050 monster card on eBay, and in the EWBF mining software it consistently reads about 150W on average.
A single GTX 1050 card shouldn't consume more than 60-70W while mining. You just need to learn how to set the clocks and voltages right. The simplest way is to lower the power limit in MSI Afterburner down to 60-70%. Just try different limits and see what's most profitable for you.
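As a rough sketch of what power-limiting does to the bill, here's a comparison of monthly cost at stock versus a lowered limit. The $0.25/kWh rate and the 150W/70W figures are just the numbers quoted in this thread, and the real draw after limiting depends on the card and settings:

```python
RATE = 0.25  # $/kWh, the rate quoted earlier in the thread

def monthly_cost(watts, rate=RATE, hours=24, days=30):
    # watts -> kWh consumed over the month, then multiply by the tariff
    return watts / 1000 * hours * days * rate

stock = monthly_cost(150)   # card drawing its full 150 W
capped = monthly_cost(70)   # power limit lowered to ~70 W
print(f"stock: ${stock:.2f}, capped: ${capped:.2f}, saved: ${stock - capped:.2f}")
```

Whether the lower limit actually wins also depends on how much the hashrate drops, so testing a few different limits, as suggested above, is the right approach.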
hero member
Activity: 1218
Merit: 534
At this electricity price it is not profitable to mine coins. My advice to you is to find another way of earning coins. Think about why most of the mining power is concentrated in China: their electricity is three times cheaper than yours. In addition, the mining difficulty keeps growing, and after some time you may find that it is no longer possible to mine bitcoins with a GPU at all.


Why not? If I follow the formula above for my rig, I get:

0.6 kW * €0.20/kWh * 24 h * 31 days = €89.28

My rig uses less than 600W and my power costs less than €0.20/kWh, so those were just maximum numbers.
My earnings are 250-300 euro per month, so that's still some profit. Not sure why you say it's not profitable...?
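The profit claim above can be sketched the same way. The 600W load, €0.20/kWh rate, 31-day month, and €250-300 earnings are the poster's own figures:

```python
def monthly_cost(kw, rate_per_kwh, days=31):
    # constant load: kW * hours in the month * tariff
    return kw * 24 * days * rate_per_kwh

cost = monthly_cost(0.6, 0.20)  # worst-case electricity bill in EUR
for earnings in (250, 300):
    print(f"earnings EUR {earnings} -> profit EUR {earnings - cost:.2f}")
```

Even at the worst-case bill, both ends of the earnings range leave a positive margin, which is the point being made.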
sr. member
Activity: 434
Merit: 255
At this electricity price it is not profitable to mine coins. My advice to you is to find another way of earning coins. Think about why most of the mining power is concentrated in China: their electricity is three times cheaper than yours. In addition, the mining difficulty keeps growing, and after some time you may find that it is no longer possible to mine bitcoins with a GPU at all.
legendary
Activity: 2002
Merit: 1051
ICO? Not even once.
Ok, I bought a GTX 1050 monster card on eBay, and in the EWBF mining software it consistently reads about 150W on average. The highest winter rate on my energy bill last year was $0.25/kWh.
Using that, if I run the GPU mining non-stop for a month, it comes to this:

150W x 24 h/day x 30 days/month / 1000 W/kW = 108 kWh/month.

Multiplying by the highest rate: 108 kWh x $0.25/kWh = $27 per month. This is the price you pay for non-stop mining at 150W at a $0.25/kWh rate.

Does it sound about right?

It's basically:
0.15 kW * $0.25/kWh * 24 h * 30 days, so yeah, $27 per month.

But the PC itself is probably drawing around 50-70 watts too.
full member
Activity: 394
Merit: 101
Ok, I bought a GTX 1050 monster card on eBay, and in the EWBF mining software it consistently reads about 150W on average. The highest winter rate on my energy bill last year was $0.25/kWh.
Using that, if I run the GPU mining non-stop for a month, it comes to this:

150W x 24 h/day x 30 days/month / 1000 W/kW = 108 kWh/month.

Multiplying by the highest rate: 108 kWh x $0.25/kWh = $27 per month. This is the price you pay for non-stop mining at 150W at a $0.25/kWh rate.

Does it sound about right?