Author

Topic: Heat - Enemy #1, and here's why (Read 2002 times)

newbie
Activity: 38
Merit: 0
September 03, 2014, 07:27:38 PM
#23

The heat signature of a 4U rig is the equivalent of a 42U rack, and most data centers are designed for that density spread out over 42U, not 4. It's like trying to cool a heat gun in a shoebox. Very tough. The heat problem (at least here in the US) hasn't been much of an issue because it was winter and heat was a desirable benefit along with the bitcoin mined. There are liquid-cooled solutions available, and Martin Enclosures just released a new line of racks - http://www.martinenclosures.com/product/bitrack/ - that sounds like it was developed to help deal with the heat problem.




This totally confuses me. Let's take the most dense system on the market, the SP31: you are telling me most 42U racks don't pull more than 6 kW (say, a single NEMA L6-30P)? I am not sure what rock you are living under, but I have been running and/or hosting gear in data centers since the 90s, and it was a rare rack that had such low power usage.

I can point to three global banks, two media companies, two global ad agencies, a computer manufacturer, and an $8B outsourcing company, and tell you with the degree of accuracy that only comes from being embedded in their operations groups on projects that their rack loads are 2.4-3.6 kW today. I wouldn't have believed it.

Contrast that with animation and geospatial (oil & gas) shops that run 30 kW/cabinet or more for their applications. I have blogged about high density and the HPC markets, and how density will be a bazillion watts a foot, but the reality doesn't support the projections in a broad sense. In pockets, yes; generally, no.


High density applications run in *some* racks, not in every single rack from wall to wall. The building simply wasn't designed with that in mind.

The high density environments I am most familiar with are in converted shipping containers. At one animation studio they run 34 kW/cabinet in every cabinet in specially designed computer rooms. They exist, just not broadly.
legendary
Activity: 1666
Merit: 1185
dogiecoin.com
September 03, 2014, 04:44:04 PM
#22

The heat signature of a 4U rig is the equivalent of a 42U rack, and most data centers are designed for that density spread out over 42U, not 4. It's like trying to cool a heat gun in a shoebox. Very tough. The heat problem (at least here in the US) hasn't been much of an issue because it was winter and heat was a desirable benefit along with the bitcoin mined. There are liquid-cooled solutions available, and Martin Enclosures just released a new line of racks - http://www.martinenclosures.com/product/bitrack/ - that sounds like it was developed to help deal with the heat problem.




This totally confuses me. Let's take the most dense system on the market, the SP31: you are telling me most 42U racks don't pull more than 6 kW (say, a single NEMA L6-30P)? I am not sure what rock you are living under, but I have been running and/or hosting gear in data centers since the 90s, and it was a rare rack that had such low power usage.

I can point to three global banks, two media companies, two global ad agencies, a computer manufacturer, and an $8B outsourcing company, and tell you with the degree of accuracy that only comes from being embedded in their operations groups on projects that their rack loads are 2.4-3.6 kW today. I wouldn't have believed it.

Contrast that with animation and geospatial (oil & gas) shops that run 30 kW/cabinet or more for their applications. I have blogged about high density and the HPC markets, and how density will be a bazillion watts a foot, but the reality doesn't support the projections in a broad sense. In pockets, yes; generally, no.


High density applications run in *some* racks, not in every single rack from wall to wall. The building simply wasn't designed with that in mind.
newbie
Activity: 38
Merit: 0
September 03, 2014, 02:07:40 PM
#21

The heat signature of a 4U rig is the equivalent of a 42U rack, and most data centers are designed for that density spread out over 42U, not 4. It's like trying to cool a heat gun in a shoebox. Very tough. The heat problem (at least here in the US) hasn't been much of an issue because it was winter and heat was a desirable benefit along with the bitcoin mined. There are liquid-cooled solutions available, and Martin Enclosures just released a new line of racks - http://www.martinenclosures.com/product/bitrack/ - that sounds like it was developed to help deal with the heat problem.




This totally confuses me. Let's take the most dense system on the market, the SP31: you are telling me most 42U racks don't pull more than 6 kW (say, a single NEMA L6-30P)? I am not sure what rock you are living under, but I have been running and/or hosting gear in data centers since the 90s, and it was a rare rack that had such low power usage.

I can point to three global banks, two media companies, two global ad agencies, a computer manufacturer, and an $8B outsourcing company, and tell you with the degree of accuracy that only comes from being embedded in their operations groups on projects that their rack loads are 2.4-3.6 kW today. I wouldn't have believed it.

Contrast that with animation and geospatial (oil & gas) shops that run 30 kW/cabinet or more for their applications. I have blogged about high density and the HPC markets, and how density will be a bazillion watts a foot, but the reality doesn't support the projections in a broad sense. In pockets, yes; generally, no.
legendary
Activity: 1428
Merit: 1000
https://www.bitworks.io
September 03, 2014, 02:00:07 PM
#20

The heat signature of a 4U rig is the equivalent of a 42U rack, and most data centers are designed for that density spread out over 42U, not 4. It's like trying to cool a heat gun in a shoebox. Very tough. The heat problem (at least here in the US) hasn't been much of an issue because it was winter and heat was a desirable benefit along with the bitcoin mined. There are liquid-cooled solutions available, and Martin Enclosures just released a new line of racks - http://www.martinenclosures.com/product/bitrack/ - that sounds like it was developed to help deal with the heat problem.




This totally confuses me. Let's take the most dense system on the market, the SP31: you are telling me most 42U racks don't pull more than 6 kW (say, a single NEMA L6-30P)? I am not sure what rock you are living under, but I have been running and/or hosting gear in data centers since the 90s, and it was a rare rack that had such low power usage.
legendary
Activity: 1666
Merit: 1185
dogiecoin.com
September 03, 2014, 01:50:29 PM
#19


Anyone have an educated answer to this?



Yep. Build your datacenter somewhere really cold.

You bumped a 4-month-old thread for that?
newbie
Activity: 38
Merit: 0
September 03, 2014, 01:28:49 PM
#18


I want to see how much additional energy it takes to cool per rig watt.

For example, I wish to keep my rigs at an ambient temperature of 80°F when it is 105°F outside. It almost seems to me that more energy will be spent pumping the heat out of the room than will be spent on powering the rigs.

I don't remember physics that well, but I thought at least an equivalent amount of energy would be spent removing the heat, and even more because of the huge inefficiencies in cooling?

Anyone good at physics want to answer this? Point me to where someone writes it up? The link above was more about data centers and not the direct cost-related economics as they relate to profitability (and environmental impact).

Anyone have an educated answer to this?



This is commonly expressed as PUE - Power Usage Effectiveness - in the data center world. Maybe not the most accurate yardstick, but it measures the overhead required to cool the corresponding compute load - a PUE of 1.5 means that for every dollar you spend powering the IT load, you spend another 50 cents powering the cooling. If you look at a data center contract, there is a 'cooling uplift' charge - this covers the PUE and the associated electricity overhead to provide cooling.

Two-phase immersion cooling comes in below 1.1, meaning you spend less than 10% electricity overhead on cooling - roughly a quarter of the overhead of a typical water- or air-cooled deployment.

There are two practical ways to mitigate costs: find the most efficient way to manage the concentration of heat from a rig, and find the cheapest electricity you can, if that's the goal. One watt of load generates about 3.41 BTU/hr of heat; the heat signature is the issue, and the aim is ensuring as little power as possible goes to running the cooling.

Does that help?
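To make the arithmetic concrete, here is a minimal Python sketch of how PUE and the 3.41 BTU/hr-per-watt figure turn into cooling load and a power bill. The 10 kW load, $0.10/kWh price and PUE values are illustrative assumptions, not figures from any particular facility:

WATT_TO_BTU_HR = 3.412          # 1 W of load is roughly 3.41 BTU/hr of heat

def cooling_overhead_kw(it_load_kw, pue):
    """Extra kW drawn by cooling and other infrastructure for a given IT load and PUE."""
    return it_load_kw * (pue - 1.0)

def monthly_power_cost(it_load_kw, pue, price_per_kwh):
    """Approximate monthly electricity cost for IT load plus overhead (30-day month)."""
    total_kw = it_load_kw * pue
    return total_kw * 24 * 30 * price_per_kwh

it_load_kw = 10.0               # assumed example: a 10 kW rack of miners
heat_btu_hr = it_load_kw * 1000 * WATT_TO_BTU_HR
print(f"Heat to reject: {heat_btu_hr:,.0f} BTU/hr")

for pue in (1.1, 1.5):          # immersion-class vs. a typical air-cooled room
    print(f"PUE {pue}: cooling overhead {cooling_overhead_kw(it_load_kw, pue):.1f} kW, "
          f"monthly bill ${monthly_power_cost(it_load_kw, pue, 0.10):,.0f} at $0.10/kWh")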
member
Activity: 89
Merit: 10
September 03, 2014, 12:57:40 PM
#17


Anyone have an educated answer to this?



Yep. Build your datacenter somewhere really cold.
sr. member
Activity: 405
Merit: 250
May 09, 2014, 12:52:01 PM
#16


I want to see how much additional energy it takes to cool per rig watt.

For example, I wish to keep my rigs at an ambient temperature of 80°F when it is 105°F outside. It almost seems to me that more energy will be spent pumping the heat out of the room than will be spent on powering the rigs.

I don't remember physics that well, but I thought at least an equivalent amount of energy would be spent removing the heat, and even more because of the huge inefficiencies in cooling?

Anyone good at physics want to answer this? Point me to where someone writes it up? The link above was more about data centers and not the direct cost-related economics as they relate to profitability (and environmental impact).

Anyone have an educated answer to this?

hero member
Activity: 770
Merit: 509
May 09, 2014, 12:11:36 AM
#15
Two-phase immersion cooling - the #1 solution to the heat problem.

As of now I estimate immersion cooling is just about as costly as buying an A/C. But with specifically designed boards it can be several times cheaper, in addition to being significantly more efficient (under 1.01 PUE vs. 1.3+ for air-cooled).
member
Activity: 92
Merit: 10
May 08, 2014, 11:29:22 PM
#14
This will be a very interesting summer for mining. Modern data centers are NOT the solution. Neither are mobile data centers. Lots of deep-pocketed miners will lose their ass trying to fit their operations into the current infrastructure. Luckily, there are some really smart people quietly building solutions from scratch in anticipation of the evolving landscape.
sr. member
Activity: 405
Merit: 250
May 08, 2014, 11:22:01 PM
#13


I want to see how much additional energy it takes to cool per rig watt.

For example, I wish to keep my rigs at an ambient temperature of 80°F when it is 105°F outside. It almost seems to me that more energy will be spent pumping the heat out of the room than will be spent on powering the rigs.

I don't remember physics that well, but I thought at least an equivalent amount of energy would be spent removing the heat, and even more because of the huge inefficiencies in cooling?

Anyone good at physics want to answer this? Point me to where someone writes it up? The link above was more about data centers and not the direct cost-related economics as they relate to profitability (and environmental impact).
member
Activity: 75
Merit: 10
Vintage4X4
May 08, 2014, 11:07:32 PM
#12
Data centers are only good if you are the operator of the NOC/DC. If you lease space by the cage or by the U, you are better off doing it from home with an amperage upgrade, or in a nondescript, low-cost office space lease. Bitcoin mining doesn't necessarily need high-availability, failover systems.

At the end of the day, the lowest cost and the most bitcoin in the pocket wins, no fancy stuff.




newbie
Activity: 52
Merit: 0
May 08, 2014, 10:39:05 PM
#11
What you say makes sense; we are doing so. Thank you.
hero member
Activity: 490
Merit: 500
May 08, 2014, 04:58:23 PM
#10
I'm pretty sure there are more efficient ways than using an air conditioner. Anybody have any ideas?

Location: Hong Kong
Client: ASICMiner
Completion: Oct 2013
Video: https://www.youtube.com/watch?v=oZavKweMrP4
Text: http://spectrum.ieee.org/computing/it/is-there-a-liquid-fix-for-the-clouds-heavy-energy-footprint


I don't know why anyone would want to underclock something. Overclock and run 'em at maximum throughput so you make more.

Because it runs more efficiently, uses less power, and therefore generates less heat.

Ultimately Bitcoin mining is about power efficiency: you switch off a miner when it no longer covers its power cost with what it mines.

Essentially the Antminer S2 (1 TH/s) uses the same chip as the Antminer S1 (180 GH/s), just underclocked from 2 W/GH to 1 W/GH by throwing more chips into it.
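To see why the W/GH figure decides when a miner gets switched off, here is a rough Python sketch comparing the same 1 TH/s of hashrate at S1-class and S2-class efficiency. The revenue-per-GH and electricity price are hypothetical placeholders, not real market numbers:

def daily_profit(hashrate_gh, watts_per_gh, revenue_per_gh_day, price_per_kwh):
    """Daily profit = mining revenue minus electricity cost."""
    revenue = hashrate_gh * revenue_per_gh_day
    power_cost = hashrate_gh * watts_per_gh / 1000.0 * 24 * price_per_kwh
    return revenue - power_cost

REVENUE_PER_GH_DAY = 0.004   # $/GH/day - hypothetical placeholder
PRICE_PER_KWH = 0.10         # $/kWh - hypothetical placeholder

# Same 1 TH/s of hashrate at S1-class vs. S2-class efficiency:
for label, w_per_gh in (("2 W/GH (S1-class)", 2.0), ("1 W/GH (S2-class)", 1.0)):
    print(f"{label}: ${daily_profit(1000, w_per_gh, REVENUE_PER_GH_DAY, PRICE_PER_KWH):+.2f}/day")

With these made-up numbers the 2 W/GH machine is already under water while the 1 W/GH machine still clears its power bill, which is the whole argument for underclocking and spreading the load over more chips.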
sr. member
Activity: 441
Merit: 250
May 08, 2014, 09:06:22 AM
#9
I'm pretty sure there are more efficient ways than using an air conditioner. Anybody have any ideas?

Despite what many people on these forums believe, or would like to believe, the laws of conservation of energy apply to Bitcoin rigs just like everything else. It doesn't matter if you use water cooling, forced air or immersion baths: if your kit consumes 10 kW, you have to get rid of that 10 kW to an external sink, be it air or water, somehow. The heat stays no matter what you do, unless you use thermoelectric devices, and they're not very efficient.

If you have a flowing river outside, it can carry away a lot of heat, but the environmental authorities might have something to say about that. Can you imagine the public outcry once people realise how many GWh of energy are being used for Bitcoin mining? (Current network load is approximately 720 MW - enough for about 50,000 homes.)
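Putting numbers on "the heat stays": a short Python sketch of the heat-rejection load for the 10 kW example, using the standard conversions (the kit size is the example from the post, nothing site-specific):

# Every watt the kit consumes becomes heat that must reach an external sink.
# 1 W = 3.412 BTU/hr, and 1 ton of refrigeration = 12,000 BTU/hr.

def heat_rejection(kit_kw):
    btu_hr = kit_kw * 1000 * 3.412
    tons_of_cooling = btu_hr / 12000
    return btu_hr, tons_of_cooling

btu_hr, tons = heat_rejection(10)   # the 10 kW example from the post
print(f"10 kW of kit -> {btu_hr:,.0f} BTU/hr -> about {tons:.1f} tons of cooling")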
legendary
Activity: 1834
Merit: 1009
May 08, 2014, 08:35:25 AM
#8
I'm pretty sure there are more efficient ways than using an air conditioner. Anybody have any ideas?
newbie
Activity: 38
Merit: 0
May 08, 2014, 08:30:44 AM
#7
Biffa,

You make a clear distinction that I overlook constantly. As a data center guy, I look at things as: when the data center fails, people die and/or it costs a company $2,000 per second of downtime. The mission-critical goggles are always on. Bitcoin is not in that realm to a large degree (I believe that is changing) and is still made up largely of enthusiasts.

It is not the first summer of mining, but the number of miners who view mining as a get-rich-quick operation has dramatically increased. People with little or no understanding of computer operation are starting to mine too, and don't understand heat, electricity and cooling. What is also starting to happen is that investors are throwing money at this space, and they have very different, much higher expectations for operations - especially at scale. My realm is megawatts of electricity, and you're correct that no data center can handle the heat footprint, which means it is too expensive to make money. The liquid cooling stuff from Allied Control, I think, has promise.

As for Iceland, the seismic risk and potential volcanic ash risk will keep me away. My preference would be to set up in South America or New Zealand and follow the winter, or just find inexpensive, reasonably reliable power in a secure building with climate control and/or a BitRack with the A/C units, and mine. I don't know why anyone would want to underclock something. Overclock and run 'em at maximum throughput so you make more. That is where this is headed IMHO - scale, maximum throughput, an arms race for the next 5 years that will be limited by heat, with more Bitcoin share captured by those who can run fast and handle heat.
legendary
Activity: 3234
Merit: 1220
May 06, 2014, 10:04:06 AM
#6
It's not just the chips. The boards they sit on, the insulation on the wires, anything plastic in the case, capacitors, and fan blades - those are where the trouble is. That stuff can melt before the data center cooling even enters the equation.

The other thing people overlook is that when you plug in a rig that is cold and fire it up, it goes from room temperature to over 100°F quickly. Lots of metals, heat sinks, drives, cases, and other components don't do well with a drastic, sudden increase in temperature. It cracks things. The A/C inside the facility at 65°F, if drawn into the case and over the guts of a rig running at 200°F, can also create a huge temperature differential. That cracks things.

Ideally you look at airflow, you look at the SLAs of the facility, and you tweak the rigs for the environment you operate in.


Maybe we just need chips that don't mind the heat ;)

This stuff is designed to take the temperature differential; you make it sound like this is the first summer anyone has had to mine coins in. People have been mining on GPUs running at near 100°C (not °F) for years. ASICs aren't so different. In a home environment you underclock, slow everything down, get hosting, or do whatever it takes to continue with your hobby :) I think the chances of catastrophic hardware failure due to heat are slim to none. What's more likely is that the systems will just slow down or shut down to try and keep cool.

In a datacenter environment the biggest issue is power/heat density: you can't really put 35 1.25U miners in a 47U rack, because there just aren't datacenters with that sort of power footprint per rack available. It's like when blades started coming out: most datacenters were not designed for the rack/power/heat density of blades and had to build self-contained blade rooms with separate cooling facilities so that the heat from the blade racks didn't overwhelm the overall cooling on the floor.

I think this is moving to dedicated mining floors, or specially commissioned datacenters in ultra-low-electricity-cost locations, and even then it's going to be touch and go from an ROI point of view.
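The density point is easy to quantify. A quick Python sketch comparing a rack full of miners against a few per-rack power budgets; the ~1.2 kW per-miner figure and the budget values are assumptions for illustration only:

# Will a rack of miners fit a given per-rack power budget?
# The per-miner wattage and the budgets are illustrative assumptions.

def rack_load_kw(n_miners, watts_per_miner):
    return n_miners * watts_per_miner / 1000.0

load = rack_load_kw(35, 1200)            # 35 x 1.25U miners at ~1.2 kW each (assumed)
for budget_kw in (4, 6, 30):             # legacy rack, single 30A circuit, HPC-class rack
    verdict = "fits" if load <= budget_kw else "does NOT fit"
    print(f"{load:.1f} kW of miners vs. a {budget_kw} kW/rack budget: {verdict}")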
hero member
Activity: 490
Merit: 500
May 06, 2014, 09:54:56 AM
#5
The answer is to mine in Iceland: cheap, environmentally friendly geothermal energy, and it's cold outside. Everyone is colocating there! ;)

The heat problem is related to GH/W, so instead of just selling space per U, you need to add a per-watt charge into your price equation.

A 2U 2 kW miner needs to be paying twice as much as a 2U 1 kW miner, to encourage underclocking and replacement with more efficient equipment.
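One way to express that pricing model is to bill space and power separately. A small Python sketch with invented rates (the per-U and per-kW prices are made up for illustration):

# Colo pricing with a separate per-watt component, so heavier loads pay more.
# The rates below are invented examples.

def monthly_colo_price(units_u, load_kw, price_per_u=10.0, price_per_kw=200.0):
    return units_u * price_per_u + load_kw * price_per_kw

for load_kw in (1.0, 2.0):
    print(f"2U miner drawing {load_kw:.0f} kW: ${monthly_colo_price(2, load_kw):,.0f}/month")

Because the per-watt charge dominates, the 2 kW miner ends up paying close to twice what the 1 kW miner pays, which is the incentive described above.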
newbie
Activity: 38
Merit: 0
May 06, 2014, 08:44:45 AM
#4
It's not just the chips. The boards they sit on, the insulation on the wires, anything plastic in the case, capacitors, and fan blades - those are where the trouble is. That stuff can melt before the data center cooling even enters the equation.

The other thing people overlook is that when you plug in a rig that is cold and fire it up, it goes from room temperature to over 100°F quickly. Lots of metals, heat sinks, drives, cases, and other components don't do well with a drastic, sudden increase in temperature. It cracks things. The A/C inside the facility at 65°F, if drawn into the case and over the guts of a rig running at 200°F, can also create a huge temperature differential. That cracks things.

Ideally you look at airflow, you look at the SLAs of the facility, and you tweak the rigs for the environment you operate in.


Maybe we just need chips that don't mind the heat ;)
hero member
Activity: 686
Merit: 500
FUN > ROI
May 06, 2014, 07:58:52 AM
#3
That article (link got eaten) is: BitBeat: For Bitcoin Miners, A Hot Problem This Summer
Quote from: article
So, what might happen this summer? In a telephone interview, Mr. MacAuley said he thinks many rigs will have to go offline. After all, bitcoin's price in dollar terms is now less than half what it was in December, at the same time that the mining power they're competing against is now 10 times what it was then. This summer, the profit equation just won't compute.
[...]
For MacAuley it lies in putting computers in new high-tech liquids that keep the machines cool. He’s talking about a sophisticated new product produced by 3M that has so far struggled to find a market. This summer, Bitcoin could provide it with its moment in the sun.
legendary
Activity: 3234
Merit: 1220
May 06, 2014, 07:55:05 AM
#2
Maybe we just need chips that don't mind the heat ;)
newbie
Activity: 38
Merit: 0
May 06, 2014, 07:51:56 AM
#1
Hi folks,

I am a data center guy, so I look at the rigs from the 'how much heat do they generate' perspective. The short answer is that mining rigs are a MONSTER for a data center that is more than 3 years old. I did some math on bitcoin and posted it on my blog (http://blunthammer.wordpress.com), and the data points were picked up by the Wall Street Journal for an article about the heat problem - http://blogs.wsj.com/moneybeat/2014/04/29/bitbeat-for-bitcoin-miners-a-hot-problem-this-summer/ - which is the #1 problem the rigs have.

The heat signature of a 4U rig is the equivalent of a 42U rack, and most data centers are designed for that density spread out over 42U, not 4. It's like trying to cool a heat gun in a shoebox. Very tough. The heat problem (at least here in the US) hasn't been much of an issue because it was winter and heat was a desirable benefit along with the bitcoin mined. There are liquid-cooled solutions available, and Martin Enclosures just released a new line of racks - http://www.martinenclosures.com/product/bitrack/ - that sounds like it was developed to help deal with the heat problem.

Bottom line - it is expensive to mine where power is over $0.05 per kWh. The cost doubles when it is warm outside because you're paying for electricity both to run the rigs AND to cool them. Heat is not a desired byproduct when it is 90°F (32°C) outside. Data center contracts carry a cooling uplift of 30%-50%, so you'll pay for the cooling no matter what; the best defense is to go to a facility where power is inexpensive, or go big with the mining operation (400+ rigs).
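For a sense of scale, a Python sketch of the monthly power bill at $0.05/kWh with and without a cooling uplift; the 400-rig, ~1 kW-per-rig deployment is an assumed example, not a real operation:

# Monthly electricity bill for a mining deployment with a cooling uplift
# applied on top of the IT load. Rig count and wattage are assumed examples.

def monthly_bill(n_rigs, watts_per_rig, price_per_kwh, cooling_uplift):
    it_kwh = n_rigs * watts_per_rig / 1000.0 * 24 * 30
    return it_kwh * price_per_kwh * (1.0 + cooling_uplift)

for uplift in (0.0, 0.3, 0.5):
    bill = monthly_bill(400, 1000, 0.05, uplift)   # 400 rigs at ~1 kW, $0.05/kWh
    print(f"Cooling uplift {uplift:.0%}: ${bill:,.0f}/month")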
