
Topic: Heating a house with old GPUs, worth it? (Read 2682 times)

sr. member
Activity: 285
Merit: 250
October 30, 2013, 08:42:34 AM
#16
Not worth it, just sell it. :)
full member
Activity: 231
Merit: 100
October 29, 2013, 10:11:33 PM
#15
To calculate how much heat every 6990 in the setup will produce, you have to consider how much energy the computers containing them will draw from the wall. PSUs aren't 100% efficient, and all the "lost" energy gets converted into heat as well.

Assuming 65 W of power consumption outside the video cards, we get a total of 1565 W for a computer holding four 6990s. If the PSU operates at 85% efficiency, the whole setup will draw approximately 1840 W from the wall, giving 460 W per video card.
By DeathAndTaxes' view, per four 6990s we'll need 1565 W from the wall, producing 1565 W of heat output.

No. The 85% efficient PSU draws 1840 W from the wall. 15% of that is converted directly into heat by the PSU; the remaining 85% is converted into heat by the rig's other components.

If we are on a duty cycle they won't run 24/7, so hopefully they don't get abused too hard.

Far too wasteful. Since the rigs mine coins only while they are heating, you should run them 24/7. An 18,000 W rig running 8 hours per day will produce the same heat as a 6,000 W rig running continuously, but the hardware will cost three times as much and will produce only one third of the coins it could.
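A minimal Python sketch of that comparison; only the post's own 18 kW / 8 h and 6 kW / 24 h figures go in:
Code:
def daily_heat_kwh(rig_watts, hours_per_day):
    # all electrical input ends up as heat, so heat out = power x time
    return rig_watts / 1000 * hours_per_day

print(daily_heat_kwh(18_000, 8))   # oversized rig on a 1/3 duty cycle: 144.0 kWh/day
print(daily_heat_kwh(6_000, 24))   # right-sized rig running 24/7:      144.0 kWh/day
# Same heat either way, but the 18 kW fleet costs ~3x as much to buy and
# mines only a third of the coins its hardware could produce.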
donator
Activity: 1218
Merit: 1079
Gerald Davis
October 29, 2013, 09:37:29 PM
#14
By DeathAndTaxes' view, per four 6990s we'll need 1565 W from the wall, producing 1565 W of heat output.

60k BTU/hr ≈ 17,580 W; 17,580 W / 1565 W ≈ 11.2, so 12 of these rigs

A 60k BTU heater doesn't run 24/7.  It doesn't even run 20% of the time.  Buying 12 rigs to run them 20% of the time is an incredibly expensive idea.  The only way to be more wasteful would be to burn money to produce BTUs.

You are still doing it backwards.  Find out how much heat you need FIRST (i.e. we need x BTUs per day peak) and then figure out how many rigs it will take to produce that.  Note this is academic because GPU mining is mostly dead.

Do you CURRENTLY have heat in your house?  If so, did you have it all last winter?  And do you have a copy of your utility bill?  You can estimate the fuel used (if natural gas/propane), and thus the amount of BTUs, and work backwards from that.
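For example, a rough Python sketch of working backwards from a gas bill; the 150 therms/month and 80% furnace efficiency are made-up placeholders, and the 1565 W per rig comes from the quote above:
Code:
import math

THERM_BTU = 100_000          # 1 therm = 100,000 BTU
BTU_HR_TO_W = 0.29307        # 1 BTU/hr ~= 0.293 W

therms_per_month = 150       # placeholder: read this off your own gas bill
furnace_eff = 0.80           # assumed efficiency of the existing furnace

heat_btu = therms_per_month * THERM_BTU * furnace_eff   # heat actually delivered
avg_w = heat_btu / (30 * 24) * BTU_HR_TO_W              # average (not peak) demand
rigs = math.ceil(avg_w / 1565)                          # 1565 W of heat per rig

print(f"~{avg_w:.0f} W average demand -> {rigs} rig(s) running continuously")
The average comes out well below the 60k BTU/hr peak figure, which is the point.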
full member
Activity: 160
Merit: 100
October 29, 2013, 09:31:22 PM
#13
Let's travel deeper.
1. Figure the actual percentage of wattage released into the air from a GPU.
EDIT: 6990s are rated at 375 W TDP; TDP is defined as the maximum amount of heat the cooling system needs to remove from the IC for proper functioning (http://en.wikipedia.org/wiki/Thermal_design_power). So I figure each card would dissipate 375 W into the air at 100% GPU usage.
2. Possible duty cycle of the GPU heating system to keep the house warm and not cook the humans alive.
3. The duty cycle will give us the cost of electricity.
4. The duty cycle also gives us the potential coins generated.
5. Subtract the electricity cost from the coin value and hope the difference is small!
(currently working on finding good guesstimates for these)

It is far simpler.  Computer components (including GPUs) do no "work" in the physics sense.  So 100% of energy drawn from the wall will be converted to heat.
If you want 3 kW of heat, just build a rig (or rigs) with a 3 kW load.  No other calculations are required.  It doesn't matter where the heat comes from (CPU idling, GPU core, VRMs, power supply inefficiency); it is all heat.
So for 1 it seems that all wattage is released into the air; furthermore, pretty much all wattage consumed by electrical devices is given off as heat. We then need the minimum cost of a computer that can run the maximum number of GPUs at once, plus its power specs.

To calculate how much heat every 6990 in the setup will produce, you have to consider how much energy the computers containing them will draw from the wall. PSUs aren't 100% efficient, and all the "lost" energy gets converted into heat as well.

Assuming 65 W of power consumption outside the video cards, we get a total of 1565 W for a computer holding four 6990s. If the PSU operates at 85% efficiency, the whole setup will draw approximately 1840 W from the wall, giving 460 W per video card.
By DeathAndTaxes' view, per four 6990s we'll need 1565 W from the wall, producing 1565 W of heat output.

60k BTU/hr ≈ 17,580 W; 17,580 W / 1565 W ≈ 11.2, so 12 of these rigs

Now if this rig can produce our needed 60k BTU/hr, we need to figure out the duty cycle of a regular heater at 60k BTU/hr. Then at this duty cycle, determine how much power our rigs would draw and find the cost of it.

Any idea for a house heater's duty cycle?
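Here's a rough Python sketch of steps 3-5 in the meantime; the 20% duty cycle is pure guesswork, and the ~$0.24/kWh and $2,000/month figures are the ones implied by my first post:
Code:
wall_watts = 12 * 1840            # twelve rigs at ~1840 W wall draw each (from above)
duty_cycle = 0.20                 # pure guess at how often the heater actually runs
usd_per_kwh = 0.24                # roughly what $3,000/month at 17.58 kW 24/7 implies
coin_usd_month_24_7 = 2000        # scrypt estimate for ~48 cards mining 24/7

kwh = wall_watts / 1000 * 24 * 30 * duty_cycle    # kWh drawn in a month
cost = kwh * usd_per_kwh
income = coin_usd_month_24_7 * duty_cycle         # coins scale with run time

print(f"electricity ${cost:.0f}/mo, coins ${income:.0f}/mo, net ${income - cost:+.0f}/mo")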


-- If we are on a duty cycle they won't run 24/7, so hopefully they don't get abused too hard.
Also, video cards were not designed to run 24/365.
Even at modest settings something is bound to give out, be it a fan or something worse. If you have no warranty, then this is the worst idea ever.

legendary
Activity: 1246
Merit: 1002
October 29, 2013, 09:30:15 PM
#12
There are a few calculators for estimating mining profit.  You might want to look at one of them.

It appears that you can buy a 4-module Avalon today.  This will give you about 109 GH/s of hashpower.  The cost is about 8 BTC, and the mining return may be about 5 BTC, for a net cost of 3 BTC ($600) per device.

You would need about 23 of them to deliver the heat you desire, for a net loss of about $13,800.

Predicting future income is very difficult.  You could make a profit, or lose even more money than this calculation suggests.
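A quick Python sanity check of that arithmetic; the ~764 W per unit is back-figured from 23 units covering ~17,580 W, and the $200/BTC is what "3 BTC ($600)" above implies:
Code:
target_w = 17_580                  # 60k BTU/hr expressed in watts
units = 23
w_per_unit = target_w / units      # ~764 W drawn (and dissipated) per unit

net_btc_per_unit = 8 - 5           # purchase price minus expected mining return
usd_per_btc = 600 / 3              # implied by "3 BTC ($600)" above
net_loss_usd = units * net_btc_per_unit * usd_per_btc

print(f"~{w_per_unit:.0f} W per unit, net loss ~ ${net_loss_usd:,.0f}")   # ~$13,800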
full member
Activity: 231
Merit: 100
October 29, 2013, 09:22:15 PM
#11
To calculate how much heat every 6990 in the setup will produce, you have to consider how much energy the computers containing them will draw from the wall. PSUs aren't 100% efficient, and all the "lost" energy gets converted into heat as well.

Assuming 65 W of power consumption outside the video cards, we get a total of 1565 W for a computer holding four 6990s. If the PSU operates at 85% efficiency, the whole setup will draw approximately 1840 W from the wall, giving 460 W per video card.

It is far simpler.  Computer components (including GPUs) do no "work" in the physics sense.  So 100% of energy drawn from the wall will be converted to heat.

That's what I said (or at least tried to). I was merely pointing out that considering the GPU's power consumption alone isn't enough, since the remaining components require electricity as well and the PSU has to draw more power than it delivers to those components.
hero member
Activity: 980
Merit: 500
October 29, 2013, 09:15:42 PM
#10
In theory, every watt you draw to power something turns into heat energy one way or another. Even the light from light bulbs.

I'm heating my apartment with my last graphics card; it more than pays for itself, but with the price of Litecoin going down, that will eventually end.
If I had no other use for this card except mining, I would probably sell it, because it can devalue fast. But if the card is so old that you can't sell it, I guess it's OK to use it as a crypto-heater.

Also, video cards were not designed to run 24/365.
Even at modest settings something is bound to give out, be it a fan or something worse. If you have no warranty, then this is the worst idea ever.

Finally, electricity is usually the most expensive source of heat.
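A minimal Python sketch of that comparison; the $0.12/kWh, $1.00/therm and 80% furnace efficiency are placeholder assumptions, so plug in your own rates:
Code:
usd_per_kwh_elec = 0.12     # assumed electric rate; resistance heat is ~100% efficient
usd_per_therm_gas = 1.00    # assumed gas rate
furnace_eff = 0.80          # assumed furnace efficiency
THERM_KWH = 29.3            # 1 therm ~= 29.3 kWh of thermal energy

elec = usd_per_kwh_elec                                # $ per kWh of heat delivered
gas = usd_per_therm_gas / (THERM_KWH * furnace_eff)    # $ per kWh of heat delivered

print(f"electric: ${elec:.3f}/kWh of heat, gas: ${gas:.3f}/kWh of heat")
On those placeholder rates, gas heat costs roughly a third as much per kWh, which is the gap the mined coins would have to close.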
donator
Activity: 1218
Merit: 1079
Gerald Davis
October 29, 2013, 09:12:32 PM
#9
To calculate how much heat every 6990 in the setup will produce, you have to consider how much energy the computers containing them will draw from the wall. PSUs aren't 100% efficient, and all the "lost" energy gets converted into heat as well.

Assuming 65 W of power consumption outside the video cards, we get a total of 1565 W for a computer holding four 6990s. If the PSU operates at 85% efficiency, the whole setup will draw approximately 1840 W from the wall, giving 460 W per video card.

It is far simpler.  Computer components (including GPUs) do no "work" in the physics sense.  So 100% of energy drawn from the wall will be converted to heat.

If you want 3 kW of heat, just build a rig (or rigs) with a 3 kW load.  No other calculations are required.  It doesn't matter where the heat comes from (CPU idling, GPU core, VRMs, power supply inefficiency); it is all heat.

3 kW of electricity in = 3 kW of thermal energy ("heat") out.

legendary
Activity: 1246
Merit: 1002
October 29, 2013, 09:08:02 PM
#8
This looks like a terrible idea. Easier to just buy a real heater? Or did I do some maths wrong?


Yes, it is definitely easier to buy a real heater. The calculation is also wrong, because GPUs produce heat only as a (conceptually unwanted) by-product, while a heater produces heat as its main product. So GPU heat production per watt is not comparable at all to a heater's.

Using mining hardware for heating only makes sense if its mining rate is nearly profitable. The heat is then some "added value" of the mining operation, and not vice versa.


ya.ya.yo!

A watt-hour consumed at the wall and converted to heat is the same no matter what device does the conversion.
full member
Activity: 231
Merit: 100
October 29, 2013, 09:07:03 PM
#7
[...] GPUs produce heat only as a (conceptually unwanted) by-product, while a heater produces heat as its main product. So GPU heat production per watt is not comparable at all to a heater's.

This is complete nonsense. Whether heat is conceptually wanted or not doesn't matter at all. A simple example is the traditional light bulb, which converts over 95% of the consumed electricity directly into heat. The visible light it produces gets converted into heat as well when it is absorbed by the bulb's surroundings (there are some exceptions), so while a light bulb is only 5% efficient as a light source, it is 100% efficient as a heater.

Energy can be neither created nor destroyed (law of conservation of energy), so all the electricity your computer consumes has to be stored or converted into some other kind of energy. In the case of a GPU, all energy is converted into heat in the transistors and the electric circuits. Even a video card's fans actually heat up the ...

Let's also say we want to make our heater out of 6990s, which have a TDP of 375 W and a price of $370.
To get 17,580 watts we need ~48 6990s at $370 ea. = $17,760

To calculate how much heat every 6990 in the setup will produce, you have to consider how much energy the computers containing them will draw from the wall. PSUs aren't 100% efficient as power converters; the "lost" energy gets converted into heat as well.

Assuming 65 W of power consumption outside the video cards, we get a total of 1565 W for a computer holding four 6990s. If the PSU operates at 85% efficiency, the whole setup will draw approximately 1840 W from the wall, giving 460 W per video card.
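In Python, with the post's own assumptions (65 W for the rest of the system, 85% PSU efficiency):
Code:
CARD_TDP = 375     # W per 6990
CARDS = 4
OTHER_W = 65       # rest of the system (the assumption above)
PSU_EFF = 0.85     # assumed PSU efficiency

dc_load = CARDS * CARD_TDP + OTHER_W    # 1565 W delivered by the PSU
wall = dc_load / PSU_EFF                # ~1841 W drawn from the outlet

print(f"{dc_load} W DC load -> {wall:.0f} W at the wall, ~{wall / CARDS:.0f} W per card")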
full member
Activity: 160
Merit: 100
October 29, 2013, 09:03:39 PM
#6
Yes, it is definitely easier to buy a real heater. The calculation is also wrong, because GPUs produce heat only as a (conceptually unwanted) by-product, while a heater produces heat as its main product. So GPU heat production per watt is not comparable at all to a heater's.
You might want to revise your numbers a bit.
You won't be running your heater at 100% capacity 24 hours a day.
Let's travel deeper.
1. Figure the actual percentage of wattage released into the air from a GPU.
EDIT: 6990s are rated at 375 W TDP; TDP is defined as the maximum amount of heat the cooling system needs to remove from the IC for proper functioning (http://en.wikipedia.org/wiki/Thermal_design_power). So I figure each card would dissipate 375 W into the air at 100% GPU usage.
2. Possible duty cycle of the GPU heating system to keep the house warm and not cook the humans alive.
3. The duty cycle will give us the cost of electricity.
4. The duty cycle also gives us the potential coins generated.
5. Subtract the electricity cost from the coin value and hope the difference is small!
(currently working on finding good guesstimates for these)


Can't believe you actually thought about it seriously.
However, if you have $17k to spare, please do it, profitable or not, just for the coolness of it.

That's why I ponder things like this; cool things come from it, though they never get built. Well, the only thing I "created" from an idea like this was putting dry ice directly on my GPU's PCB and IC one day; it was sweet. I still have the card and it works. :)
donator
Activity: 1218
Merit: 1079
Gerald Davis
October 29, 2013, 07:29:11 PM
#5
You might want to revise your numbers a bit.

You won't be running your heater at 100% capacity 24 hours a day.

This.  If you ran a 60k BTU heater 24/7 in a well-insulated house, the internal temperature would probably get hot enough to literally cook you.

You currently have heat, right?  You paid for it, so you can compute the average daily energy you spend on heating.  That is the amount of energy you would need to replace.
hero member
Activity: 700
Merit: 500
What doesn't kill you only makes you sicker!
October 29, 2013, 07:24:26 PM
#4
You might want to revise your numbers a bit.

You won't be running your heater at 100% capacity 24 hours a day.
newbie
Activity: 27
Merit: 0
October 29, 2013, 07:11:42 PM
#3
Can't believe you actually thought about it seriously.
However, if you have $17k to spare, please do it, profitable or not, just for the coolness of it.
legendary
Activity: 1806
Merit: 1024
October 29, 2013, 05:02:15 PM
#2
This looks like a terrible idea. Easier to just buy a real heater? Or did I do some maths wrong?


Yes, it is definitely easier to buy a real heater. The calculation is also wrong, because GPUs produce heat only as a (conceptually unwanted) by-product, while a heater produces heat as its main product. So GPU heat production per watt is not comparable at all to a heater's.

Using mining hardware for heating only makes sense if its mining rate is nearly profitable. The heat is then some "added value" of the mining operation, and not vice versa.


ya.ya.yo!
full member
Activity: 160
Merit: 100
October 29, 2013, 01:53:16 PM
#1
I wanted to do some maths and get some others' input on whether this is a feasible idea.

GPUs create a lot of heat, and they can also mine coins (BTC, LTC, ...). I want to figure out how many would be needed to heat an entire house, and whether the coins they produce could offset electricity costs at all.

So let's first figure out how big a heating unit a house might need; here is a handy calculator for that: http://www.alpinehomeair.com/hc/calculator/heating_estimator.cfm

I got quoted a 60k BTU heater for a 2,000 sq. ft. house.
Let's convert 60k BTU/hr to watts: 60,000 BTU/hr ≈ 17,580 W.
Let's also say we want to make our heater out of 6990s, which have a TDP of 375 W and a price of $370.
To get 17,580 watts we need ~48 6990s at $370 ea. = $17,760
With 48 6990s we would have (SHA-256) 750 MH/s each = 36,000 MH/s = 36 GH/s
With 48 6990s we would have (scrypt) 850 kH/s each = 40,800 kH/s = 40.8 MH/s

If I were to have one right now:
Mining scrypt I could make ~36 LTC a day, ~1064 LTC a month: roughly $2,000
Mining SHA-256 I could make ~0.05 BTC a day, ~1.5 BTC a month: roughly $300

The best estimate I could find: at 17,580 watts running 24/7, the electricity cost would be about $3,000 a month.

In the end I'd spend about $1,000 a month on heating....
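The whole back-of-envelope as a Python sketch (all the prices and rates are my rough estimates from above):
Code:
import math

BTU_HR_TO_W = 0.29307
target_w = 60_000 * BTU_HR_TO_W          # ~17,580 W of heat needed

CARD_TDP, CARD_USD = 375, 370
cards = math.ceil(target_w / CARD_TDP)   # 47, rounded up to ~48 above
capex = 48 * CARD_USD                    # $17,760 for 48 cards

ltc_usd_month = 2000                     # scrypt income estimate above
elec_usd_month = 3000                    # 24/7 electricity estimate above

print(f"{cards}+ cards, ${capex:,} up front, {ltc_usd_month - elec_usd_month:+} USD/month")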

This looks like a terrible idea. Easier to just buy a real heater? Or did I do some maths wrong?