
Topic: .25 BTC BOUNTY for the best answer (Read 13574 times)

sr. member
Activity: 472
Merit: 250
January 14, 2014, 09:17:57 PM
#68


My understanding is that evaporative cooling can pose a risk to electronics, as the air might become saturated with moisture.  I am in search of some expert advice on this subject.  

1)  Is this a feasible option?
2)  How risky is it?
3)  Can the risks be addressed?
4)  If not, what is a better option?


I work for a company that runs a few different datacenters with cheap power in WA state.  We do actually use evaporative coolers in one, pretty neat system.  So let me do my best here:

1)  Yes, evaporative coolers are an option for datacenters, but it depends on the type.  The cooler you linked to would be a "Direct evaporative cooling" type, which is basically just a regular swamp cooler (see: http://en.wikipedia.org/w/index.php?title=Evaporative_cooler under the "Direct evaporative cooling" heading) and is not suited for datacenter use: the temperature drop is small and a large amount of moisture is added to the air.  Eventually the air can become saturated with more humidity than the electronics can survive, and then you risk frying hardware purely from condensation...

2)  Well, for a quick test, put a regular swamp cooler in a closet, put a computer near the air output and see how long it lasts...  There is obviously a much more technical answer, but even then it depends on a lot of factors.  Most swamp coolers pull in outside air and rely on air entering/leaving the space to prevent the buildup of moisture.  Remember that these "evaporative coolers" feel a lot colder to us humans than the temperature they actually drop the air, because of the evaporation aspect.   Again, if you were to just recirculate the same air over and over, eventually you would have hot, muggy air, you would get condensation, and you would start frying stuff.  Really, evaporative cooling could be described as "add water to the air to make it feel cooler".  Again - this only goes so far and only works so much.

3)  Yes, by changing the design to "Indirect evaporative cooling" (refer to the wiki link), where a heat exchanger keeps the wet air stream separate, you eliminate a large amount of the humidity.  But even this has its limitations, and thus the options used for a datacenter are generally a standard AC/evaporative combination, which leads us to:

4)  What datacenters use:  http://www.trane.com/datacenter/pdfs/TR_EvaporativeCooling_web.pdf  (problem:  $$$$$)

Really all the core data is written right on the wiki:

DISADVANTAGES:

  • The air supplied by the evaporative cooler is typically 80–90% relative humidity;
  • High humidity in air accelerates corrosion, particularly in the presence of dust. This can considerably shorten the life of electronic and other equipment.
  • High humidity in air may cause condensation of water. This can be a problem for some situations (e.g., electrical equipment, computers, paper, books, old wood).

So how much humidity do you want?  Not this much:

"a problem in the facility's building-management system led to high temperature and low humidity air from the hot aisles being endlessly recirculated through a water-based evaporative cooling system that sought to cool the air down – which meant that when the air came back into the cold aisle for the servers it was so wet it condensed."

Long story short:

"Some servers broke entirely because they had front-facing power supplies and these shorted out. For a few minutes, Parikh says, you could stand in Facebook's data center and hear the pop and fizzle of Facebook's ultra-lean servers obeying the ultra-uncompromising laws of physics."

http://www.theregister.co.uk/2013/06/08/facebook_cloud_versus_cloud/

The problem is any home-grown evaporative cooler solution is going to have that exact same problem.  Miner heats air > Evap cooler adds water (cools a bit) > miner heats air more > evap cooler adds more water (cools even less) > miner heats air > .... rinse repeat until you have a jungle!  ... then POP/FIZZLE...
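
To put rough numbers on that "jungle" loop, here is a minimal sketch in Python (every input is my own illustrative assumption, not a figure from this thread): a sealed room of about 1500 sq ft with a 10 ft ceiling, 40 kW of miners, air starting at 30 C and 40% RH, and a direct evaporative cooler that soaks up the whole heat load by evaporating water into the recirculated air. It just tracks how fast the trapped air saturates.

Code:
import math

P_ATM = 101_325.0      # Pa, sea-level pressure
H_FG = 2.45e6          # J/kg, approximate latent heat of vaporization of water
ROOM_AIR_KG = 1500 * 10 * 0.0283168 * 1.2   # ft^3 -> m^3, times ~1.2 kg/m^3 of air
Q_MINERS_W = 40_000.0  # assumed miner heat load
TEMP_C = 30.0          # assumed room dry-bulb temperature

def sat_vapor_pressure(t_c: float) -> float:
    """Magnus approximation for saturation vapor pressure (Pa)."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def rel_humidity(w: float, t_c: float) -> float:
    """Relative humidity from humidity ratio w (kg water per kg dry air)."""
    p_w = w * P_ATM / (0.622 + w)
    return p_w / sat_vapor_pressure(t_c)

# Start at 40% RH and invert the RH relation to get the humidity ratio.
p_w0 = 0.40 * sat_vapor_pressure(TEMP_C)
w = 0.622 * p_w0 / (P_ATM - p_w0)

t, dt = 0.0, 10.0  # seconds
while rel_humidity(w, TEMP_C) < 0.95:
    # Each step the cooler evaporates just enough water to absorb the miner heat.
    w += (Q_MINERS_W * dt / H_FG) / ROOM_AIR_KG
    t += dt

print(f"Air hits ~95% RH after about {t/60:.0f} minutes of recirculation.")

With those assumptions the trapped air is near saturation in well under ten minutes, which is exactly the pop/fizzle scenario above.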

Why not just leave a window open?  Wink



full member
Activity: 126
Merit: 100
CAUTION: Angry Man with Attitude.
January 14, 2014, 09:16:54 PM
#67
I highly doubt anyone is going to get the bounty; OP is just stealing your ideas, unless this comment makes him reply in rage.
You must be new here. Let me check.... yep, less than 2 months. Yoch has been around forever, and is very professional. He's probably one of the more successful miners on this forum.

In the past, he has posted bounties for questions. When they're answered, he pays.
You offended or something?  Angry

Why hasn't yoch replied for a while?
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
January 14, 2014, 09:11:27 PM
#66
I highly doubt anyone is going to get the bounty; OP is just stealing your ideas, unless this comment makes him reply in rage.
You must be new here. Let me check.... yep, less than 2 months. Yoch has been around forever, and is very professional. He's probably one of the more successful miners on this forum.

In the past, he has posted bounties for questions. When they're answered, he pays.

Frankly, I'm more interested in a uniquely interesting success than a 0.25BTC bounty.

As I slowly inch myself closer to the solar-powered mining array that heats my pool and jacuzzi and sauna, and dries my towels, I can finally aspire to that noble profession: towel-boy.
Yoch's efforts can only benefit this result.

When my rig is lit by the power of the sun and nothing is wasted from its output, we'll have something worthy. 
Whatever Yoch's ultimate result will be, it is sure to be interesting.

I have a little math yet to do on the surface area of the pool and daily average unheated temperature to see how many BTUs I can pull out of a server room.  Having a little tropical pool zone would be a nice bonus to a mining operation.  Until then I can savor imagining the extra enjoyment of reclining in a jacuzzi knowing that it is heated by the byproduct of securing bitcoin.
legendary
Activity: 952
Merit: 1000
January 14, 2014, 07:46:35 PM
#65
I highly doubt anyone is going to get the bounty; OP is just stealing your ideas, unless this comment makes him reply in rage.
You must be new here. Let me check.... yep, less than 2 months. Yoch has been around forever, and is very professional. He's probably one of the more successful miners on this forum.

In the past, he has posted bounties for questions. When they're answered, he pays.
full member
Activity: 126
Merit: 100
CAUTION: Angry Man with Attitude.
January 14, 2014, 06:06:29 PM
#64
I highly doubt anyone is going to get the bounty; OP is just stealing your ideas, unless this comment makes him reply in rage.
legendary
Activity: 2114
Merit: 1693
C.D.P.E.M
January 14, 2014, 11:10:24 AM
#63
Let's hop back on topic...

OP, can you tell us what the square footage of this datacenter is planned to be, and also let us know what kind of budget you are looking to work with?

Woodser


1500 SF.......I would like to keep cooling to $5,000 or less.  I am looking for out of the box ideas here.  This will not be your typical data center!

For out of the box + not typical...Sell the heat. Have your cooling system pay you instead of paying for it.

Run a jacuzzi / steam room / sauna business next to your data center
Heat a pool
Run a laundromat
Distillery, brewery, bakery
Lots of uses for extra heat if you can control it.

The cost of moving the heat from where it is not wanted (on your chips) to where it is more valuable is paid by the receiver rather than the sender.
This is also a marketing press opportunity.  
Or as I said earlier try geothermal cooling. Pros: runs almost by itself, won't affect electricity bill, causes no moisture problems and doesn't require maintenance. Cons: you need to dig a hole in the ground (bad idea if you're just renting the place).

I am a civil engineer and I can guarantee you that this is not a viable option!
For your information: to pull "calories" from the ground to air-condition a house (and we are not talking about 40,000 W, but much less) in an area that is around 30° in summer, the setup costs around 10,000€ (about $13k).
That is a quick, simplified explanation; since English isn't my main language it's hard to go into more detail.
Geothermal is not possible in THIS case. It might work for a bigger setup where the budget is bigger too.

I don't know if you need to heat your house in winter (what is the average temperature at your place?), but you can use the waste heat to lower your heating bills.

As for the cooling part in summer, you can use some of the heat to warm up your pool or someone else's (in winter too, by the way), but the only DIY solution I can imagine is a large amount of fresh air from the outside (don't forget a filter to keep out bugs, leaves, and so on) blown directly onto the hot spot, with the air extracted through the roof.

This would be a fairly inexpensive solution, and with the heat in winter you can warm houses or a swimming pool!! Think about it: you can make an extra dollar selling it to the neighbors.

The picture below explains what a thermal exchange is (basic stuff).
Usually you take heat from the outside to get even more heat inside (it works down to about -15°, which is why the snowmen). But you can set it up the opposite way: use your hot zone (the garage with the rigs) to warm up the outside (or a swimming pool, or anything); the result is the heat being transferred outside.
 
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
January 13, 2014, 12:32:44 PM
#62

This is also location dependent. Some places it is much warmer in the ground than above it.  
In Malibu, the well water is warmer than the air.  
Where did you come up with that?  In Malibu? Surely you jest.  

The earth's temperature is pretty consistent once you get below a certain level.  I don't think it's that deep either, but I don't remember the exact depth.  It's a constant somewhere around 50 degrees Fahrenheit.   A quick search on "underground temperature constant" will give you plenty of good reliable info on this subject.

Not a jest, I came up with it by measuring... so yes I could search the web... or I could use a thermometer on my well water.
Geothermal heat is highly variable in localized areas; certainly you have heard of hot springs and geysers?
There are some hot springs a few miles away.  In this area of California, the ground moves enough that deep heat does not dissipate evenly as the subterranean rocks turn.  The temperature changes depending on how long I run the water, because of the different temperatures at the different depths.
newbie
Activity: 53
Merit: 0
January 13, 2014, 10:32:03 AM
#61
Here is your answer: Northern Wisconsin  Cheesy
member
Activity: 84
Merit: 10
January 11, 2014, 12:48:11 PM
#60

This is also location dependent. Some places it is much warmer in the ground than above it. 
In Malibu, the well water is warmer than the air. 
Where did you come up with that?  In Malibu? Surely you jest. 

The earth's temperature is pretty consistent once you get below a certain level.  I don't think it's that deep either, but I don't remember the exact depth.  It's a constant somewhere around 50 degrees Fahrenheit.   A quick search on "underground temperature constant" will give you plenty of good reliable info on this subject.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
January 09, 2014, 01:35:08 PM
#59
Let's hop back on topic...

OP, can you tell us what the square footage of this datacenter is planned to be, and also let us know what kind of budget you are looking to work with?

Woodser


1500 SF.......I would like to keep cooling to $5,000 or less.  I am looking for out of the box ideas here.  This will not be your typical data center!

For out of the box + not typical...Sell the heat. Have your cooling system pay you instead of paying for it.

Run a jacuzzi / steam room / sauna business next to your data center
Heat a pool
Run a laundromat
Distillery, brewery, bakery
Lots of uses for extra heat if you can control it.

The cost of moving the heat from where it is not wanted (on your chips) to where it is more valuable is paid by the receiver rather than the sender.
This is also a marketing press opportunity.  
Or as I said earlier try geothermal cooling. Pros: runs almost by itself, won't affect electricity bill, causes no moisture problems and doesn't require maintenance. Cons: you need to dig a hole in the ground (bad idea if you're just renting the place).

This is also location dependent. Some places it is much warmer in the ground than above it. 
In Malibu, the well water is warmer than the air. 
hero member
Activity: 658
Merit: 500
Small Red and Bad
January 09, 2014, 12:27:09 PM
#58
Let's hop back on topic...

OP, can you tell us what the square footage of this datacenter is planned to be, and also let us know what kind of budget you are looking to work with?

Woodser


1500 SF.......I would like to keep cooling to $5,000 or less.  I am looking for out of the box ideas here.  This will not be your typical data center!

For out of the box + not typical...Sell the heat. Have your cooling system pay you instead of paying for it.

Run a jacuzzi / steam room / sauna business next to your data center
Heat a pool
Run a laundromat
Distillery, brewery, bakery
Lots of uses for extra heat if you can control it.

The cost of moving the heat from where it is not wanted (on your chips) to where it is more valuable is paid by the receiver rather than the sender.
This is also a marketing press opportunity.  
Or as I said earlier try geothermal cooling. Pros: runs almost by itself, won't affect electricity bill, causes no moisture problems and doesn't require maintenance. Cons: you need to dig a hole in the ground (bad idea if you're just renting the place).
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
January 09, 2014, 11:09:07 AM
#57
Let's hop back on topic...

OP, can you tell us what the square footage of this datacenter is planned to be, and also let us know what kind of budget you are looking to work with?

Woodser


1500 SF.......I would like to keep cooling to $5,000 or less.  I am looking for out of the box ideas here.  This will not be your typical data center!

For out of the box + not typical...Sell the heat. Have your cooling system pay you instead of paying for it.

Run a jacuzzi / steam room / sauna business next to your data center
Heat a pool
Run a laundromat
Distillery, brewery, bakery
Lots of uses for extra heat if you can control it.

The cost of moving the heat from where it is not wanted (on your chips) to where it is more valuable is paid by the receiver rather than the sender.
This is also a marketing press opportunity.  
full member
Activity: 126
Merit: 100
CAUTION: Angry Man with Attitude.
January 07, 2014, 05:56:04 PM
#56
The summers here are brutally hot....it can get up to 110 F.

Arizona? Phoenix maybe? If so, I'm here too. Let me know if you need help or another hand in setting something like this up.. I'd love to be a part of something like that Smiley... and yes, I do have technical experience.

Lol, me too, Mesa. Cheesy
full member
Activity: 132
Merit: 100
January 07, 2014, 05:41:28 PM
#55
The summers here are brutally hot....it can get up to 110 F.

Arizona? Phoenix maybe? If so, I'm here too. Let me know if you need help or another hand in setting something like this up.. I'd love to be a part of something like that Smiley... and yes, I do have technical experience.
member
Activity: 84
Merit: 10
January 06, 2014, 07:34:43 PM
#54
You got a nearby underground cave Huh  Cheesy

I hadn't seen this post, but it's along the lines of what I'm thinking. 

Your location puts you at a disadvantage right off the bat.  

What's the infrastructure like where you plan on locating this?  Lots of older buildings?  Silos?  If it were me and I was trying to be budget-minded, I would find an existing underground facility or basement.  This will dramatically cut your ambient temps and provide a more stable environment.  Is anything like this available, or is your facility already selected?

Warehouse rentals aren't cheap, even empty spaces; OP will have to add up the cost of electricity, cooling equipment, hardware, and the cost of the space.

If you ask me, I don't think it's worth it; you'll be putting more out of your pocket than you're making.
They're not cheap where I am, but I'm in the Bay Area.  That doesn't mean they're not cheap in the middle of Missouri.  Location is everything, and finding something that's naturally cool is going to make a BIG difference in cost.
full member
Activity: 126
Merit: 100
CAUTION: Angry Man with Attitude.
January 06, 2014, 11:46:31 AM
#53
You got a nearby underground cave Huh  Cheesy

I hadn't seen this post, but it's along the lines of what I'm thinking. 

Your location puts you at a disadvantage right off the bat.  

What's the infrastructure like where you plan on locating this?  Lots of older buildings?  Silos?  If it were me and I was trying to be budget-minded, I would find an existing underground facility or basement.  This will dramatically cut your ambient temps and provide a more stable environment.  Is anything like this available, or is your facility already selected?

Warehouse rentals aren't cheap, even empty spaces; OP will have to add up the cost of electricity, cooling equipment, hardware, and the cost of the space.

If you ask me, I don't think it's worth it; you'll be putting more out of your pocket than you're making.
member
Activity: 84
Merit: 10
January 06, 2014, 12:39:01 AM
#52
You got a nearby underground cave Huh  Cheesy

I hadn't seen this post, but it's along the lines of what I'm thinking. 

Your location puts you at a disadvantage right off the bat.  

What's the infrastructure like where you plan on locating this?  Lots of older buildings?  Silos?  If it were me and I was trying to be budget-minded, I would find an existing underground facility or basement.  This will dramatically cut your ambient temps and provide a more stable environment.  Is anything like this available, or is your facility already selected?
hero member
Activity: 574
Merit: 500
January 05, 2014, 11:00:35 PM
#51
But OP hasn't told us the size of his rig; he doesn't need to spend a lot of money doing this, you know?
really?
I am not sure of the BTU rating, but I will need to dissipate upwards of 40,000 watts.

youchdog has been around for a while ...lolz

He is running ~120m of scrypt, or ~200+ GPUs of various descriptions

Real farms are starting to get out of control, and in regard to heat ...fucking HEAT !!!!

I run 70m and have 6 x 3,000 L/s fans on both intake and exhaust

AC is toooo expensive ... Just go with lots of air directed onto the GPUs and a fan connected to an extract fan mounted in the rig chassis that pumps the air out to a chimney stack, with a 3,000 L/s fan sucking at the other end

Go for a largish 600-800 mm outside exhaust fan stack (it's like the ones you see on McDonald's)
http://www.fantech.com.au/FanRange.aspx?MountingID=RVE&RangeID=32

The key being: control the air in, and then GET RID of the HEAT !!

I live in Australia, which is 100+ at the moment ... the cards are running stable at ~80C (it's not great, but oh well, warranty)

Anyway youchdog, thanks for adding to the diff :S

I could go further but this is what you get for free lolz  Cool
legendary
Activity: 2212
Merit: 1001
January 05, 2014, 06:12:43 PM
#50
You got a nearby underground cave Huh  Cheesy
legendary
Activity: 2044
Merit: 1000
January 04, 2014, 12:12:27 PM
#49
Let's hop back on topic...

OP, can you tell us what the square footage of this datacenter is planned to be, and also let us know what kind of budget you are looking to work with?

Woodser


1500 SF.......I would like to keep cooling to $5,000 or less.  I am looking for out of the box ideas here.  This will not be your typical data center!
full member
Activity: 188
Merit: 100
January 04, 2014, 10:21:38 AM
#48
Let's hop back on topic...

OP, can you tell us what the square footage of this datacenter is planned to be, and also let us know what kind of budget you are looking to work with?

Woodser
sr. member
Activity: 406
Merit: 250
January 02, 2014, 08:34:46 PM
#47
Also, build a datacenter in a cold country like the north of Canada, Iceland, or Norway, and you won't have to cool it; just use the cold air around you. It can save a lot, if you have that option.
full member
Activity: 188
Merit: 100
January 02, 2014, 10:34:21 AM
#46
That's why there are inline duct fans inside the duct.

I guess I missed that during the video....
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
January 02, 2014, 10:11:51 AM
#45
On a related note:
http://www.gfxtechnology.com/
full member
Activity: 126
Merit: 100
CAUTION: Angry Man with Attitude.
January 01, 2014, 09:09:45 PM
#44
We have a lot of the "silliest" answers here, from people who have no datacenter or AC experience. You put up a bounty, you get stupid spammers and beggars.

Before qualifying any answer as silly or stupid, I think we need to know some additional considerations such as the amount of money he is willing to spend.

If we take into account the kind of unit he is considering buying, that sounds more like a garage project, along the lines of DIY and low-cost solutions.

In the company where I work, we recently invested in a cooling solution. The company spent a bit more than 50,000 euros (about 80,000 USD) on a solution based on in-row equipment from APC, like the one linked below. This was a very small datacenter with a power consumption much lower than 40,000 W.

http://www.apc.com/products/family/index.cfm?id=379

So, if someone is willing to spend an amount in the range of 3000~4000 USD, maybe the silly solutions are those intended for real datacenters. Maybe, if he had a budget in the range of 100,000 USD, he would be taking his questions to a professional cooling consultant. Maybe he would be ordering and paying for a whole study and project instead of offering 0.25 BTC (about 180 USD) and looking for ideas from a community of unknown people.

This.

I am looking for low cost, DIY solutions.  This will be built out in space not originally intended for Crypto mining.  I want to keep expenses to a minimum.....no need to spend all the profit on cooling the damn things. 

If you want, I suggest using a dehumidifier to keep out moisture and look at this expert on youtube showing his mining rig shelf and how he sets up cooling. http://youtu.be/G5f_e4P6gMA

My video is a good example though, because the duct is sucking out the hot air as the fans cover the front portion of the shelf .

I watched the video....Is there anything sucking the air into the duct?  To me I only saw the large box fans on the front.  It looks to me like, without proper pressurization, that air would have a hard time traveling through that ducting.

That's why there are inline duct fans inside the duct.
full member
Activity: 188
Merit: 100
January 01, 2014, 03:52:50 PM
#43
We have a lot of the "silliest" answers here, from people who have no datacenter or AC experience. You put up a bounty, you get stupid spammers and beggars.

Before qualifying any answer as silly or stupid, I think we need to know some additional considerations such as the amount of money he is willing to spend.

If we take into account the kind of unit he is considering buying, that sounds more like a garage project, along the lines of DIY and low-cost solutions.

In the company where I work, we recently invested in a cooling solution. The company spent a bit more than 50,000 euros (about 80,000 USD) on a solution based on in-row equipment from APC, like the one linked below. This was a very small datacenter with a power consumption much lower than 40,000 W.

http://www.apc.com/products/family/index.cfm?id=379

So, if someone is willing to spend an amount in the range of 3000~4000 USD, maybe the silly solutions are those intended for real datacenters. Maybe, if he had a budget in the range of 100,000 USD, he would be taking his questions to a professional cooling consultant. Maybe he would be ordering and paying for a whole study and project instead of offering 0.25 BTC (about 180 USD) and looking for ideas from a community of unknown people.

This.

I am looking for low cost, DIY solutions.  This will be built out in space not originally intended for Crypto mining.  I want to keep expenses to a minimum.....no need to spend all the profit on cooling the damn things. 

If you want, I suggest using a dehumidifier to keep out moisture and look at this expert on youtube showing his mining rig shelf and how he sets up cooling. http://youtu.be/G5f_e4P6gMA

My video is a good example though, because the duct is sucking out the hot air as the fans cover the front portion of the shelf .

I watched the video....Is there anything sucking the air into the duct?  To me I only saw the large box fans on the front.  It looks to me like, without proper pressurization, that air would have a hard time traveling through that ducting.
full member
Activity: 126
Merit: 100
CAUTION: Angry Man with Attitude.
January 01, 2014, 06:51:23 AM
#42
We have a lot of the "silliest" answers here, from people who have no datacenter or AC experience. You put up a bounty, you get stupid spammers and beggars.

Before qualifying any answer as silly or stupid, I think we need to know some additional considerations such as the amount of money he is willing to spend.

If we take into account the kind of unit he is considering buying, that sounds more like a garage project, along the lines of DIY and low-cost solutions.

In the company where I work, we recently invested in a cooling solution. The company spent a bit more than 50,000 euros (about 80,000 USD) on a solution based on in-row equipment from APC, like the one linked below. This was a very small datacenter with a power consumption much lower than 40,000 W.

http://www.apc.com/products/family/index.cfm?id=379

So, if someone is willing to spend an amount in the range of 3000~4000 USD, maybe the silly solutions are those intended for real datacenters. Maybe, if he had a budget in the range of 100,000 USD, he would be taking his questions to a professional cooling consultant. Maybe he would be ordering and paying for a whole study and project instead of offering 0.25 BTC (about 180 USD) and looking for ideas from a community of unknown people.

This.

I am looking for low cost, DIY solutions.  This will be built out in space not originally intended for Crypto mining.  I want to keep expenses to a minimum.....no need to spend all the profit on cooling the damn things. 

If you want, I suggest using a dehumidifier to keep out moisture and look at this expert on youtube showing his mining rig shelf and how he sets up cooling. http://youtu.be/G5f_e4P6gMA

My video is a good example though, because the duct is sucking out the hot air as the fans cover the front portion of the shelf .
full member
Activity: 126
Merit: 100
CAUTION: Angry Man with Attitude.
January 01, 2014, 06:49:15 AM
#41
But OP hasn't told us the size of his rig; he doesn't need to spend a lot of money doing this, you know?
really?
I am not sure of the BTU rating, but I will need to dissipate upwards of 40,000 watts.

Must have missed that, thanks.
legendary
Activity: 1512
Merit: 1036
January 01, 2014, 06:05:35 AM
#40
But OP hasn't told us the size of his rig; he doesn't need to spend a lot of money doing this, you know?
really?
I am not sure of the BTU rating, but I will need to dissipate upwards of 40,000 watts.
member
Activity: 82
Merit: 10
January 01, 2014, 05:11:52 AM
#39
Freeze lots of ice while energy costs are low (at night), then melt it with coolant while costs are high (daytime). Might save up to 20% of cooling costs.  Cry
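
For what it's worth, the back-of-the-envelope ice math looks like this (assuming, purely for illustration, the 40 kW load mentioned elsewhere in the thread and 8 peak-rate hours to ride through):

Code:
LATENT_HEAT_ICE = 334_000.0   # J/kg, heat of fusion of water

def ice_needed_kg(load_w: float, hours: float) -> float:
    """Mass of ice that absorbs load_w watts for `hours` hours as it melts."""
    return load_w * hours * 3600.0 / LATENT_HEAT_ICE

print(f"{ice_needed_kg(40_000, 8) / 1000:.1f} metric tons of ice for 8 peak hours")
# -> roughly 3.4 metric tons, before any allowance for losses or warming the meltwater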
full member
Activity: 126
Merit: 100
CAUTION: Angry Man with Attitude.
January 01, 2014, 04:14:00 AM
#38
We have a lot of the "silliest" answers here, from people who have no datacenter or AC experience. You put up a bounty, you get stupid spammers and beggars.

You basically have two choices:

- Traditional refrigerant air conditioning with a condensing outdoor unit,
- Evaporative "Swamp coolers", where facilities allow large outside berth for building flow-through, and the local weather is favorable.

*snip*


Evaporative cooling is measured a different way, in the temperature drop from intake air temp, with accompanying increased humidity. You can make 100F outside air into 75F inside air. However, you will need to look at the cubic feet per minute ratings of the systems to see what can keep up with your heat load. You may decide that 85F will be the maximum "output" temperature rise after air goes through your racks - for this much cooling, you will be looking at garage-door sized walls of fans from the outside and gallons of water per minute.

However, the evaporative cooling does have the advantage that you are putting in a massive outside air circulation system - the 75% of the day and year when outside air is below 75F, you will need nothing more than to run the fans.

Inside a closed air conditioned building, evaporative cooling may enhance efficiency a bit. AC removes humidity, to the point where the IDUs need to pump water out. You could add some humidity back to pre-cool the hot AC intake air (you can't humidify cold air AC output). The humidity would have to be strictly monitored to not go overboard or add more humidity than the AC can remove.

Whatever system is implemented, you need to direct airflow through your facility and systems, ideally in a typical contained hot/cool-aisle system:
*cut*

You could always try an evaporative system based on old technology, like a windcatcher
http://en.wikipedia.org/wiki/Windcatcher
Just get a mechanical engineer to do the required calculations


But OP hasn't told us the size of his rig; he doesn't need to spend a lot of money doing this, you know?
newbie
Activity: 40
Merit: 0
January 01, 2014, 03:54:30 AM
#37
We have a lot of the "silliest" answers here, from people who have no datacenter or AC experience. You put up a bounty, you get stupid spammers and beggars.

You basically have two choices:

- Traditional refrigerant air conditioning with a condensing outdoor unit,
- Evaporative "Swamp coolers", where facilities allow large outside berth for building flow-through, and the local weather is favorable.

*snip*


Evaporative cooling is measured a different way, in the temperature drop from intake air temp, with accompanying increased humidity. You can make 100F outside air into 75F inside air. However, you will need to look at the cubic feet per minute ratings of the systems to see what can keep up with your heat load. You may decide that 85F will be the maximum "output" temperature rise after air goes through your racks - for this much cooling, you will be looking at garage-door sized walls of fans from the outside and gallons of water per minute.

However, the evaporative cooling does have the advantage that you are putting in a massive outside air circulation system - the 75% of the day and year when outside air is below 75F, you will need nothing more than to run the fans.

Inside a closed air conditioned building, evaporative cooling may enhance efficiency a bit. AC removes humidity, to the point where the IDUs need to pump water out. You could add some humidity back to pre-cool the hot AC intake air (you can't humidify cold air AC output). The humidity would have to be strictly monitored to not go overboard or add more humidity than the AC can remove.

Whatever system is implemented, you need to direct airflow through your facility and systems, ideally in a typical contained hot/cool-aisle system:
*cut*

You could always try an evaporative system based on old technology, like a windcatcher
http://en.wikipedia.org/wiki/Windcatcher
Just get a mechanical engineer to do the required calculations
full member
Activity: 126
Merit: 100
CAUTION: Angry Man with Attitude.
December 31, 2013, 06:41:47 PM
#36
We have a lot of the "silliest" answers here, from people who have no datacenter or AC experience. You put up a bounty, you get stupid spammers and beggars.

Before qualifying any answer as silly or stupid, I think we need to know some additional considerations such as the amount of money he is willing to spend.

If we take into account the kind of unit he is considering buying, that sounds more like a garage project, along the lines of DIY and low-cost solutions.

In the company where I work, we recently invested in a cooling solution. The company spent a bit more than 50,000 euros (about 80,000 USD) on a solution based on in-row equipment from APC, like the one linked below. This was a very small datacenter with a power consumption much lower than 40,000 W.

http://www.apc.com/products/family/index.cfm?id=379

So, if someone is willing to spend an amount in the range of 3000~4000 USD, maybe the silly solutions are those intended for real datacenters. Maybe, if he had a budget in the range of 100,000 USD, he would be taking his questions to a professional cooling consultant. Maybe he would be ordering and paying for a whole study and project instead of offering 0.25 BTC (about 180 USD) and looking for ideas from a community of unknown people.

This.

I am looking for low cost, DIY solutions.  This will be built out in space not originally intended for Crypto mining.  I want to keep expenses to a minimum.....no need to spend all the profit on cooling the damn things. 

If you want, I suggest using a dehumidifier to keep out moisture and look at this expert on youtube showing his mining rig shelf and how he sets up cooling. http://youtu.be/G5f_e4P6gMA
full member
Activity: 126
Merit: 100
CAUTION: Angry Man with Attitude.
December 31, 2013, 06:20:38 PM
#35
I design and build, or rather I used to....

I don't want your bounty but will happily offer up my 2 pence worth.

Not a concept I would choose; cooling is more complex than that....

How many servers are we talking? What are the BTU ratings, etc.?

How will air "flow" work around the space? Will each server get enough cooled air?

Why not free cooling? Just high-speed fans in a small space and extraction can be enough on small setups.

It's not as simple as your question makes it sound.

I am not sure of the BTU rating, but I will need to dissipate upwards of 40,000 watts.

Air flow would be a bit tricky.  I was planning on an intake and exhaust fan to get rid of humidity.  

The summers here are brutally hot....it can get up to 110 F.  High-speed fans don't really cut it in those conditions; at least they haven't the last couple of years.  

Are you in AZ or Cali? I believe you can rent cooled spaces.
sr. member
Activity: 406
Merit: 250
December 31, 2013, 06:02:20 PM
#34
Well I have a better option for you. Look at how big companies do it. They try to make it as efficient as possible. Microsoft doesn't even put a roof on their new facilities.
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
December 31, 2013, 09:32:26 AM
#33
I would like to rig miners to my pool's water heater and make use of the heat to heat my pool to tropical temperatures.
This could save .25 BTC a day for each day I want to heat it.
If you have AC or a heat pump, there are already systems that can heat the pool with waste heat (or cool your house with your pool if you look at it the other way):
http://www.hotspotenergy.com/pool-heater/
Thanks for the link.
Yes, this is what I am looking at.
Solar panels provide electricity already for the home/office.
Running the chillers on the miner with the waste heat warming the swimming pool which serves as the evap.
Virtuous cycle mining.
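
For scale, a rough sketch of how much heat that actually is (the 40 kW load and the 50,000 L pool volume are assumptions for the arithmetic only, not numbers from this thread):

Code:
CP_WATER = 4186.0       # J/(kg*K), specific heat of water
POOL_LITERS = 50_000.0  # assumed pool volume; 1 L of water is about 1 kg
LOAD_W = 40_000.0       # assumed miner heat output

joules_per_day = LOAD_W * 24 * 3600
delta_t_c = joules_per_day / (POOL_LITERS * CP_WATER)
print(f"~{delta_t_c:.0f} C of pool temperature rise per day, before losses")
# -> about 17 C per day, so even with large losses there is plenty of heat to spare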
legendary
Activity: 1512
Merit: 1036
December 31, 2013, 12:49:42 AM
#32
I would like to rig miners to my pool's water heater and make use of the heat to heat my pool to tropical temperatures.
This could save .25 BTC a day for each day I want to heat it.
If you have AC or a heat pump, there are already systems that can heat the pool with waste heat (or cool your house with your pool if you look at it the other way):
http://www.hotspotenergy.com/pool-heater/
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
December 31, 2013, 12:19:55 AM
#31
I would like to rig miners to my pool's water heater and make use of the heat to heat my pool to tropical temperatures.
This could save .25 BTC a day for each day I want to heat it.
newbie
Activity: 38
Merit: 0
December 30, 2013, 09:52:13 PM
#30
I'm in no way in possession of the credentials to give the OP the reply he wants, but I once saw a liquid nitrogen cooling solution, which was actually a small flat chamber with the liquid trapped inside, serving as a "floor" for us to step over. To keep it cool at the lowest cost I believe a magnetic cooling system was set up (I don't clearly remember the occasion).

It was serving as a cooling solution for chemical reactors, though; not sure if such technology applies to this case.

A quick search on Google points me to this as the first link: http://en.wikipedia.org/wiki/Magnetic_refrigeration

Have no idea about the costs Cheesy

Happy new year

Edit: The Lab was located at University of Algarve in Portugal, department of Physics and Chemistry
https://www.ualg.pt/home/en
newbie
Activity: 6
Merit: 250
December 30, 2013, 02:39:43 PM
#30
We have a lot of the "silliest" answers here, from people who have no datacenter or AC experience. You put up a bounty, you get stupid spammers and beggars.

Before qualifying any answer as silly or stupid, I think we need to know some additional considerations such as the amount of money he is willing to spend.

If we take into account the kind of unit he is considering buying, that sounds more like a garage project, along the lines of DIY and low-cost solutions.

In the company where I work, we recently invested in a cooling solution. The company spent a bit more than 50,000 euros (about 80,000 USD) on a solution based on in-row equipment from APC, like the one linked below. This was a very small datacenter with a power consumption much lower than 40,000 W.

http://www.apc.com/products/family/index.cfm?id=379

So, if someone is willing to spend an amount in the range of 3000~4000 USD, maybe the silly solutions are those intended for real datacenters. Maybe, if he had a budget in the range of 100,000 USD, he would be taking his questions to a professional cooling consultant. Maybe he would be ordering and paying for a whole study and project instead of offering 0.25 BTC (about 180 USD) and looking for ideas from a community of unknown people.
legendary
Activity: 2044
Merit: 1000
December 30, 2013, 05:50:32 PM
#29
We have a lot of the "silliest" answers here, from people who have no datacenter or AC experience. You put up a bounty, you get stupid spammers and beggars.

Before qualifying any answer as silly or stupid, I think we need to know some additional considerations such as the amount of money he is willing to spend.

If we take into account the kind of unit he is considering buying, that sounds more like a garage project, along the lines of DIY and low-cost solutions.

In the company where I work, we recently invested in a cooling solution. The company spent a bit more than 50,000 euros (about 80,000 USD) on a solution based on in-row equipment from APC, like the one linked below. This was a very small datacenter with a power consumption much lower than 40,000 W.

http://www.apc.com/products/family/index.cfm?id=379

So, if someone is willing to spend an amount in the range of 3000~4000 USD, maybe the silly solutions are those intended for real datacenters. Maybe, if he had a budget in the range of 100,000 USD, he would be taking his questions to a professional cooling consultant. Maybe he would be ordering and paying for a whole study and project instead of offering 0.25 BTC (about 180 USD) and looking for ideas from a community of unknown people.

This.

I am looking for low cost, DIY solutions.  This will be built out in space not originally intended for Crypto mining.  I want to keep expenses to a minimum.....no need to spend all the profit on cooling the damn things. 
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
December 30, 2013, 06:45:42 AM
#28
We have a lot of the "silliest" answers here, from people who have no datacenter or AC experience. You put up a bounty, you get stupid spammers and beggars.

Inside a closed air conditioned building, evaporative cooling may enhance efficiency a bit. AC removes humidity, to the point where the IDUs need to pump water out. You could add some humidity back to pre-cool the hot AC intake air (you can't humidify cold air AC output). The humidity would have to be strictly monitored to not go overboard or add more humidity than the AC can remove.

+1
You should pass some of the bounty to DC for being so accurate and on point.

If you are doing raised flooring, you will have moisture sensors under the floor (spills, leaks, plumbing, flooding); you may also want air humidity alarms if you are using swamp cooling as part of your mix.

Design for what happens when things go wrong, not just for how you want it to work.
legendary
Activity: 1512
Merit: 1036
December 29, 2013, 05:47:13 PM
#27
We have a lot of the "silliest" answers here, from people who have no datacenter or AC experience. You put up a bounty, you get stupid spammers and beggars.

You basically have two choices:

- Traditional refrigerant air conditioning with a condensing outdoor unit,
- Evaporative "Swamp coolers", where facilities allow large outside berth for building flow-through, and the local weather is favorable.

The amount of air conditioning required is calculable. You have two factors:
1. The amount of air conditioning required to keep an unloaded building at room temperature vs the hottest outside temperatures.
 This is directly related to the building's insulation and R factor. If you have an uninsulated warehouse-style steel building, you are going to be using much more AC to keep the building at room temperature than in a highly insulated facility.
2. The amount of heat that needs to be removed from equipment heat generation.

I am not sure of the BTU rating, but I will need to dissipate upwards of 40,000 watts.
Unfortunately, #2 will be the major factor in designing an air conditioning system; your equipment-generated heat is much more than the amount needed for building cooling. The downside of air conditioning is that it is a closed system, so even on cool days you'll be running AC equivalent to 40,000 watts of heat removal. This is one factor that has data centers looking for better ways of doing things.

Air conditioning has a lot of weird units of measurement; they can't seem to just use joules and watts like a normal physicist would. I will try to process some of these measurements like "tons" and "BTUs" to give you an idea about your AC power bills and required capacity.

ton = 12000 BTUs/hour, or 3517 watts. (based on how much ice would be used to provide the same refrigeration)
1 watt = 3600 joules per hour
1 btu = 1055.05585 joules
1 watt = 3.41214163 btu / hour

therm = 100,000 BTU
EER = Energy Efficiency Ratio = BTUs/watt-hour. BTU/hr vs watts of AC unit. A number 8-12 is typical
SEER = season-based voodoo. EER = -0.02 × SEER² + 1.12 × SEER
COP = Coefficient of performance. What we really want to know - i.e. how many watts will remove 1000 watts of heat. COP = EER / 3.412

The first thing to figure out is how much 40,000 watts equals in these AC terms, and how much electricity it will take. Let's remove everything except watts and the EER rating:
Wreq = Wload / COP -> Wreq = Wload * 3.412 / EER

So for 40,000 watts, and an example of 9 EER-rated air conditioning, we get
Wreq = 40000W * 3.412 / 9  ->  15,164 watts

Next, how much AC capacity is required in those weird AC terms?
40000 watts = about 11.4 tons of air conditioning

So add that power use and capacity on top of what AC would normally be required for the space.
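
The same conversions in a small Python sketch, mirroring the worked example above (swap in your own load and EER):

Code:
WATTS_PER_TON = 3517.0         # one ton of refrigeration = 12,000 BTU/hr
BTU_PER_HR_PER_WATT = 3.412

def ac_power_required(load_w: float, eer: float) -> float:
    """Electrical watts the AC draws to remove load_w watts of heat (COP = EER / 3.412)."""
    return load_w * BTU_PER_HR_PER_WATT / eer

def ac_tons_required(load_w: float) -> float:
    """Cooling capacity needed, in tons of refrigeration."""
    return load_w / WATTS_PER_TON

load = 40_000.0
print(f"AC electrical draw at 9 EER: {ac_power_required(load, 9):,.0f} W")  # ~15,164 W
print(f"Capacity needed: {ac_tons_required(load):.1f} tons")                # ~11.4 tons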


Evaporative cooling is measured a different way, in the temperature drop from intake air temp, with accompanying increased humidity. You can make 100F outside air into 75F inside air. However, you will need to look at the cubic feet per minute ratings of the systems to see what can keep up with your heat load. You may decide that 85F will be the maximum "output" temperature rise after air goes through your racks - for this much cooling, you will be looking at garage-door sized walls of fans from the outside and gallons of water per minute.
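
Here is a rough sketch of that CFM check (illustrative numbers only, assuming sea-level air): how much airflow it takes to carry 40 kW away if you allow a 10F rise across the racks.

Code:
CP_AIR = 1005.0        # J/(kg*K), specific heat of air
RHO_AIR = 1.2          # kg/m^3, approximate air density at sea level
M3S_TO_CFM = 2118.88   # cubic meters per second -> cubic feet per minute

def cfm_for_load(load_w: float, delta_t_f: float) -> float:
    """Airflow (CFM) needed to absorb load_w watts with a delta_t_f (F) temperature rise."""
    delta_t_k = delta_t_f * 5.0 / 9.0
    mass_flow = load_w / (CP_AIR * delta_t_k)     # kg/s
    return mass_flow / RHO_AIR * M3S_TO_CFM

print(f"{cfm_for_load(40_000, 10):,.0f} CFM")   # ~12,650 CFM for a 10F rise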

However, the evaporative cooling does have the advantage that you are putting in a massive outside air circulation system - the 75% of the day and year when outside air is below 75F, you will need nothing more than to run the fans.

Inside a closed air conditioned building, evaporative cooling may enhance efficiency a bit. AC removes humidity, to the point where the IDUs need to pump water out. You could add some humidity back to pre-cool the hot AC intake air (you can't humidify cold air AC output). The humidity would have to be strictly monitored to not go overboard or add more humidity than the AC can remove.

Whatever system is implemented, you need to direct airflow through your facility and systems, ideally in a typical contained hot/cool-aisle system:
newbie
Activity: 28
Merit: 0
December 29, 2013, 04:49:11 PM
#26
Indirect Evaporative Cooling


1. Yes.
2. Indirect Evaporative Cooling (IEC) poses less risk than a poorly managed system.
3. Yes.
4. ✗

Furthermore, IEC systems can lower air temperature without adding moisture to the air, making them the more attractive option over direct ones.

newbie
Activity: 6
Merit: 250
December 29, 2013, 04:23:48 PM
#26
Hello,

You should consider using localized refrigeration instead of trying to cool the whole room.

It is important to know what kind of devices you are trying to cool: USB-based ASIC devices? GPU cards?

The base idea of localized cooling is to use air conduits to blow fresh air directly to the intake of your heat-generating devices. You should also use conduits to take the hot air away from these devices, and finally evacuate it out of the room.

I am thinking of something like flexible air conditioning ducts, the ones made from corrugated aluminium.

Connect the "cold air" pipes to a junction box (cold box) and the pipes carrying "hot air" to another one (hot box). You will just have to blow air from the outside into the "cold box", and exhaust the air from the "hot box" out of the room.


Hope you will find this helpful,

 José Antonio

BTC: 1HJMgnNnJ4ouLTenVFsDRhafPF2ZXHgTg9
hero member
Activity: 658
Merit: 500
Small Red and Bad
December 29, 2013, 12:58:30 PM
#25
You just posted it right above me, no need to repeat Wink He should be out of the newbie section by now; maybe he'll come here and defend his idea.
global moderator
Activity: 3990
Merit: 2717
Join the world-leading crypto sportsbook NOW!
December 29, 2013, 12:06:05 PM
#24
Oil is a really bad idea if you're making a data center.  
1. You need big containers to store your stuff.
2. You need to circulate the oil inside containers (pumps)
3. If your oil heats up it's going to evaporate and cover the ceiling and other objects in the room.
4. If the container walls do not dissipate the heat well, equipment may overheat anyway.
5. Hardware modifications are difficult; be prepared to cover yourself in coolant every time you want to connect a cable.
6. Makes reselling almost impossible and voids warranty.

Tell it to this guy: https://bitcointalksearch.org/topic/responso-to-topic-389706-389758 haha
hero member
Activity: 658
Merit: 500
Small Red and Bad
December 29, 2013, 11:54:12 AM
#23
Oil is a really bad idea if you're making a data center. 
1. You need big containers to store your stuff.
2. You need to circulate the oil inside containers (pumps)
3. If your oil heats up it's going to evaporate and cover the ceiling and other objects in the room.
4. If the container walls do not dissipate the heat well, equipment may overheat anyway.
5. Hardware modifications are difficult; be prepared to cover yourself in coolant every time you want to connect a cable.
6. Makes reselling almost impossible and voids warranty.
global moderator
Activity: 3990
Merit: 2717
Join the world-leading crypto sportsbook NOW!
December 29, 2013, 10:05:53 AM
#22
Posting message for generationbitcoin from the newb forum since he is unable. Link here: https://bitcointalksearch.org/topic/responso-to-topic-389706-389758


Hi, I'm answering here coz I'm not allowed to do that in the actual thread.

https://bitcointalksearch.org/topic/m.4195044

My solution would be something like:

http://www.youtube.com/watch?v=Eub39NaC4rc

Just put the hardware in the oil: no contact with air, no damp, no rust. Oil does not conduct current, so no short circuits.

Oil also distributes heat across the whole surface of the containing tank, making it easier to cool down.

Little maintenance.

Only con is that hardware will be permanently oily and hard to clean and resell.

my 2 cents (and hopefully 25 BTC Tongue)

me btc wallet 1L89NqH8vEwmcCkUf9W62fKL8y9Vi3K9KE
newbie
Activity: 28
Merit: 0
December 29, 2013, 01:21:21 AM
#21
buy a cool box:

http://www.minicoolers.co.uk/products/waeco/images/w35open.jpg

pack it full of dry ice and put your equipment inside...your rig will soon look like this:

http://www.freebiespot.net/wp-content/uploads/2010/12/dryice.jpg

send btc here:

15Wzsww4syNhX7joLbKE3mXhH8Fs78EXvc
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
December 29, 2013, 12:42:19 AM
#20
No GPUs in data centers?  You should come to Hollywood, or see Pixar's renderwalls for some Data center GPU mania.
http://www.youtube.com/watch?v=i-htxbHP34s
Any animation or gaming studio of much stature has these gpu renderwalls.
hero member
Activity: 955
Merit: 1004
December 29, 2013, 12:04:24 AM
#19
Two part answer.

1.  The answer is 42.

2. Datacenter servers are all CPU based, there are no GPUs, much less GPUs that can mine Scrypt coins efficiently.   GPU mining is a waste of time for BitCoin now.

3. So unless you are building custom mining PCs, scrap the datacenter idea.

Here's my wallet address:  12eBRzZ37XaWZsetusHtbcRrUS3rRDGyUJ

Thank You!   Grin
newbie
Activity: 10
Merit: 0
December 28, 2013, 11:56:53 PM
#18
I live in a country with long and hot summers.

1. Warm air rises while cool air sinks
2. Open systems are not that hard to cool
3. All in life is about the flow, not how big or how expensive

If you create a big box or a small room where cool air sinks in and settles at the bottom, you have an inflow that is ready to be absorbed / sucked up. Like normal cards, the devices can suck up the cool air and spit it out, and they should spit it out upwards. iiiii ... i = device and the dot is the hot air spit up.

-the devices sit low, with still 2 inches of space underneath, so that water poured on the side of the box/floor would run right under the devices and level out without immediate danger (only air is needed, just visualize it)
-the tops of the devices blow the "steam" (hot air, just visualize) upwards while cool air sinks in from the side
-  [ \ i i i i i i ] cool air slides in to the bottom under the devices and the fans suck it up and blow it upwards, the way warm air would go anyway.

Circulating air without knowing what it wants to do is not going to work, so adjust to nature; you only need air if you understand the flow. I would use the AC flow with a gap in the inflow so that any water would drip onto the floor before flowing into the cool-box with the devices (not a common little drain pipe that can get blocked by one fly or something).

You could put some kind of kitchen hood (a fan mounted backwards in a tunnel connected to a mouth/hood) above the devices, with the hot air flowing out and up above the devices (iiii). Not too close to the cool air input, and maybe not too low, so it doesn't suck up the cool bottom air ahead of the little fans on the devices (you're going to have to play with it).

Now if all that won't work, you have an insanely expensive setup that should be under the ground, or you're just not in a cold country  Wink

Just air will do the job: when I opened up the computer on the left side and turned on a normal living room fan at the left front, it cooled down from 78C to 66C. I also have less dust than a normal closed computer inside the box.

In my next step I will have more cards in a huge cool box with the AC blowing in (not even through a tunnel, just pointed at it and not on swing), just not right underneath in case it drips in the future. If the ceiling fills up with a cloud of hot air I will just put a fan backwards into another room, or pointing outside right where the AC engine is. Maybe I will still create a tunnel with just two wide sheets left under the AC towards the box on the right \*\
hero member
Activity: 518
Merit: 500
December 28, 2013, 10:44:44 PM
#16
If you set up in a place where the temperature reaches 110 in the summer, you are going to end up costing yourself an arm and a leg whatever cooling method you employ. Finding a cooler location should be your first priority.
sr. member
Activity: 454
Merit: 250
Technology and Women. Amazing.
December 28, 2013, 09:45:00 PM
#15
lol @ copypaste answers
legendary
Activity: 1512
Merit: 1036
December 28, 2013, 09:25:38 PM
#14
The move to water-cooled applications raises many challenges for facility executives. For example, experience shows that a building’s chilled water system is anything but clean. Few data center operators understand the biology and chemistry of open or closed loop cooling systems. Even when the operating staff does a great job of keeping the systems balanced, the systems still are subject to human errors that can wreak permanent havoc on pipes.
...

You could just post links instead of being a tool:
www.facilitiesnet.com/datacenters/article/Free-Air-WaterCooled-Servers-Increase-Data-Center-Energy-Efficiency--12989
http://www.facilitiesnet.com/datacenters/article/ripple-effect--8227

Location: Obviously wherever you live will play a huge part in this ... if you're near mountains, the suggestions above will get you some interesting ambient air to play with, along with the possibility of cheap local electricity if you put up some windmill/solar near the facility. Again, that is just additional capital cost when you're probably more focused on spending as much as you can on G/HASH vs hedging your own power source.

Building a data center for BITCOIN or ANYCOIN should follow most of the current standards out there. Running any computer equipment for extended periods of time at high temperatures greatly reduces reliability and longevity of components and will likely cause unplanned downtime. Maintaining an ambient temperature range of 68F to 75F (20 to 24C) is optimal for system reliability. This temperature range provides a safe buffer for equipment to operate in the event of air conditioning or HVAC equipment failure while making it easier to maintain a safe relative humidity level.

It is a generally agreed upon standard in the computer industry that expensive IT equipment should not be operated in a computer room or data center where the ambient room temperature has exceeded 85F (30C).
...
Recommended Computer Room Humidity
Relative humidity (RH) is defined as the amount of moisture in the air at a given temperature in relation to the maximum amount of moisture the air could hold at the same temperature. In a Mining Farm or computer room, maintaining ambient relative humidity levels between 45% and 55% is recommended for optimal performance and reliability.
..
You too:
http://www.avtech.com/About/Articles/AVT/NA/All/-/DD-NN-AN-TN/Recommended_Computer_Room_Temperature_Humidity.htm
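
A tiny sketch of the monitoring rule those recommendations imply (thresholds copied from the quoted text; the function itself is just an illustration):

Code:
def room_status(temp_f: float, rh_pct: float) -> str:
    """Flag the room if temperature or relative humidity leaves the recommended windows."""
    alerts = []
    if not 68 <= temp_f <= 75:
        alerts.append(f"temperature {temp_f:.0f}F outside 68-75F")
    if not 45 <= rh_pct <= 55:
        alerts.append(f"humidity {rh_pct:.0f}% outside 45-55%")
    return "OK" if not alerts else "ALERT: " + "; ".join(alerts)

print(room_status(72, 50))   # OK
print(room_status(86, 62))   # ALERT: both readings out of range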
sr. member
Activity: 423
Merit: 250
December 28, 2013, 09:23:02 PM
#13
The move to water-cooled applications raises many challenges for facility executives. For example, experience shows that a building’s chilled water system is anything but clean. Few data center operators understand the biology and chemistry of open or closed loop cooling systems. Even when the operating staff does a great job of keeping the systems balanced, the systems still are subject to human errors that can wreak permanent havoc on pipes.

Installing dedicated piping to in-row coolers is difficult enough the first time, but it will be nearly intolerable to have to replace that piping under the floor if, in less than five years, it begins to leak due to microbial or chemical attacks. That does happen, and sometimes attempts to correct the problem make it worse.

Consider these horror stories:

A 52-story single-occupant building with a tenant condenser water system feeding its data center and trading systems replaced its entire piping system (live) due to microbial attack.
A four-story data center replaced all of its chilled and condenser water systems (live) when the initial building operators failed to address cross contamination of the chilled water and the condenser water systems while on free cooling.
In yet another high-rise building, a two pipe (non-critical) system was used for heating in the winter and cooling in the summer. Each spring and fall the system would experience water flow blockages, so a chemical cleaning agent was added to the pipes to remove scale build-up.
Before the cleaning agent could be diluted or removed, the heating system was turned on. Thanksgiving night, the 4-inch lines let loose. Chemically treated 180-degree water flooded down 26 stories of the tower. Because no one on site knew how to shut the system down, it ran for two hours before being stopped.

Isolation
Water quality isn’t the only issue to consider. Back in the days of water-cooled mainframes, chilled water was delivered to a flat plate heat exchanger provided by the CPU manufacturer. The other side of the heat exchanger was filled with distilled water and managed by technicians from the CPU manufacturer. Given this design, the areas of responsibility were as clear as the water flowing through the computers.

In today’s designs, some of the better suppliers promote this physical isolation through the use of a “cooling distribution unit” (CDU) with the flat plate heat exchanger inside. Not all CDUs are alike and some are merely pumps with a manifold to serve multiple cooling units. It is therefore wise to be cautious. Isolation minimizes risk.

Currently, vendor-furnished standard CDUs are limited in the number of water-cooled IRC units they can support. Typically these are supplied to support 12 to 24 IRCs with a supply and return line for each. That’s 24 to 48 pipes that need to be run from a single point out to the IRCs. If there are just a few high-density cabinets to cool, that may be acceptable, but, as the entire data center becomes high-density, the volume of piping can become a challenge. Even 1-inch diameter piping measures two inches after it is insulated.

The solution will be evolutionary. Existing data centers will go the CDU route until they reach critical mass. New data centers and ones undergoing major renovations will have the opportunity to run supply and return headers sized for multiple rows of high-density cabinets with individual, valved take-offs for each IRC unit. This reduces clutter under the floor, allowing reasonable airflow to other equipment that remains air-cooled. Again, the smart money will have this distribution isolated from the main chilled water supply and could even be connected to a local air-cooled chiller should the main chilled water plant fail.

Evaluating IRC Units

Given the multitude of water-cooled IRC variations, how do facility executives decide what’s best for a specific application? There are many choices and opportunities for addressing specific needs.

One consideration is cooling coil location. Putting the coils on top saves floor space. And the performance of top-of-the-rack designs is seldom affected by the daily installation and removal of server equipment. But many older data centers and some new ones have been shoehorned into buildings with minimal floor-to-ceiling heights, and many data centers run data cabling in cable trays directly over the racks. Both these situations could make it difficult to put coils on top.

If the coil is on top, does it sit on top of the cabinet or is it hung from the structure above? The method of installation will affect data cabling paths, cable tray layout, sprinklers, lighting and smoke detectors. Be sure that these can all be coordinated within the given overhead space.

Having the coil on the bottom also saves floor space. Additionally it keeps all piping under the raised floor and it allows for overhead cable trays to be installed without obstruction. But it will either increase the height of the cabinet or reduce the number of “U” spaces in the cabinet. A “U” is a unit of physical measure to describe the height of a server, network switch or other similar device. One “U” or “unit” is 44.45 mm (1.75 inches) high. Most racks are sized between 42 and 50 “U”s (6 to 7 feet high) of capacity. To go taller is impractical because doing so usually requires special platforms to lift and install equipment at the top of the rack. To use smaller racks diminishes the opportunities to maximize the data center capacity.

With a coil on the bottom, a standard 42U cabinet will be raised 12 to 14 inches. Will that be too tall to fit through data center and elevator doors? How will technicians install equipment in the top U spaces? One option is a cabinet with fewer U spaces, but that will mean more footprint for the same capacity.

Alternative Locations
Another solution is 1-foot-wide IRC units that are installed between each high-density cabinet. This approach offers the most redundancy and is the simplest to maintain. It typically has multiple fans and can have multiple coils to improve reliability. Piping and power are from under the floor. This design also lends itself to low-load performance enhancements in the future. What’s more, this design usually has the lowest installed price.

On the flip side, it uses more floor space than the other approaches, with a footprint equal to half a server rack. It therefore allows a data center to go to high-density servers but limits the total number of computer racks that can be installed. Proponents of this design concede that this solution takes up space on the data center floor. They admit that data centers have gone to high-density computing for reduced footprint as well as for speed, but they contend that the mechanical cooling systems now need to reclaim some of the space saved.

Rear-door solutions are a good option where existing racks need more cooling capacity. But the design's performance is more affected by daily operations than the other designs, because the door is opened when servers are being installed or removed. Facility executives should determine what happens to the cooling (and the servers) when the rear door is opened.

No matter which configuration is selected, facility executives should give careful consideration to a range of specific factors:

Connections. These probably pose the greatest risk no matter which configuration is selected. Look at the connections carefully. Are they substantial enough to take the stresses of physical abuse when data cables get pulled around them, or when they get stepped on while the floor is open? The connections can be anything from clear rubber tubing held on with hose clamps to threaded brass connections.

Think about how connections are made in the site as well as how much control can be exercised over underfloor work. Are workers aware of the dangers of putting stresses on pipes? Many are not. What if the fitting cracks or the pipe joint leaks? Can workers find the proper valve to turn off the leak? Will they even try? Does the data center use seal-tight electrical conduits that will protect power connections from water? Can water flow under the cables and conduits to the nearest drain or do the cables and conduits act like dams holding back the water and forcing it into other areas?

Valve quality. This is a crucial issue regardless of whether the valves are located in the unit, under the floor or in the CDU. Will the valve seize up over time and become inoperable? Will it always hold tight? To date, ball valves seem to be the most durable. Although valves are easy to take for granted, the ramifications of valve selection will be significant.

Servicing.
Because everything mechanical will eventually fail, one must look at IRC units with respect to servicing and replacement. How easy will servicing be? Think of it like servicing a car. Is everything packed so tight that it literally has to be dismantled to replace the cooling coil? What about the controls? Can they be replaced without shutting the unit down? And are the fans (the component that most commonly fails) hard wired or equipped with plug connections?

Condensate Drainage.
A water-cooled IRC unit is essentially a mini computer-room air conditioning (CRAC) unit. As such, it will condense water on its coils that will need to be drained away. Look at the condensate pans. Are they well drained, or are they flat, allowing deposits to build up? If condensate pumps are needed, what is the power source?

Some vendors are promoting systems that do sensible cooling only. This is good for maintaining humidity levels in the data center. If the face temperature of the cooling coil remains above the dew point temperature in the room, there will not be any condensation. The challenge is starting up a data center, getting it stabilized and then having the ability to track the data center’s dew point with all the controls automatically adjusting to maintain a sensible cooling state only.
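
To make that control problem concrete, here is a minimal Python sketch (my own illustration, not any vendor's control logic) that estimates the room dew point from temperature and relative humidity with the standard Magnus approximation and checks whether an assumed coil face temperature would stay above it, i.e. sensible cooling only with no condensate:

import math

def dew_point_c(temp_c, rh_percent):
    # Magnus approximation; constants a=17.62, b=243.12 (deg C)
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

def coil_stays_dry(coil_face_c, room_temp_c, room_rh, margin_c=1.0):
    # Sensible cooling only if the coil face sits above the room dew point
    return coil_face_c > dew_point_c(room_temp_c, room_rh) + margin_c

print(round(dew_point_c(24.0, 50.0), 1))   # ~12.9 C
print(coil_stays_dry(16.0, 24.0, 50.0))    # True
print(coil_stays_dry(16.0, 24.0, 65.0))    # False once humidity creeps up

At 24 C and 50% RH the dew point is roughly 13 C, so a coil face held in the mid-teens stays dry; let the room humidity creep up and the same coil starts condensing, which is exactly why the controls have to track dew point continuously.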

Power. Data centers do not have enough circuits to wire the computers, and now many more circuits are being added for the IRC units. What's more, designs must be consistent and power the mechanical systems to mirror the power distribution of the computers. What is the benefit of having 15 minutes of battery back-up if the servers go out on thermal overload in less than a minute? That being the case, IRC units need to be dual power corded as well. That requirement doubles the IRC circuit quantities along with the associated distribution boards and feeders back to the service entrance.
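
A rough feel for the "less than a minute" point: assuming all of the IT heat dumps into the room air with no cooling, no outside-air exchange and no credit for the thermal mass of the racks or walls (crude, but it shows the order of magnitude), a sketch in Python:

def seconds_to_overheat(it_load_kw, room_volume_m3,
                        start_c=24.0, trip_c=32.0):
    # All IT heat into the room air; no thermal mass, no air exchange.
    air_mass_kg = room_volume_m3 * 1.2                      # ~1.2 kg/m^3
    energy_kj = air_mass_kg * 1.005 * (trip_c - start_c)    # cp of air ~1.005 kJ/(kg*K)
    return energy_kj / it_load_kw                           # kW == kJ/s

print(round(seconds_to_overheat(40.0, 150.0)))   # ~36 seconds for 40 kW in 150 m^3

Even granting a real room more margin than this, it is clear why the cooling fans and pumps have to ride the same UPS as the servers.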

Before any of the specifics of IRC unit selection really matter, of course, facility executives have to be comfortable with water in the data center. Many are still reluctant to take that step. There are many reasons:

There’s a generation gap. Relatively few professionals who have experience with water-cooled processors are still around.
The current generation of operators has been trained so well about keeping water out of the data center that the idea of water-cooled processors is beyond comprehension.
There is a great perceived risk of making water connections in and around live electronics.
There is currently a lack of standard offerings from the hardware manufacturers.
The bottom line is that water changes everything professionals have been doing in data centers for the last 30 years. And that will create a lot of sleepless nights for many data center facility executives.

Before You Dive In
Traditionally, data centers have been cooled by computer-room air conditioning (CRAC) units via underfloor air distribution. Whether a data center can continue using that approach depends on many factors. The major factors include floor height, underfloor clutter, hot and cold aisle configurations, loss of air through tile cuts, and many more too numerous to list here.

Generally speaking, the traditional CRAC concept can cool a reasonably designed and maintained data center averaging 4 kW to 6 kW per cabinet. Between 6 kW and 18 kW per cabinet, supplementary fan assist generally is needed to increase the airflow through the cabinets.
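
As a sanity check on those per-cabinet numbers, the usual sensible-heat rule of thumb (BTU/hr = 1.08 x CFM x deltaT in F) gives the airflow a cabinet needs; the 20 F rise across the servers below is an assumed figure, not from the article:

def required_cfm(cabinet_kw, delta_t_f=20.0):
    # Sensible-heat rule of thumb: BTU/hr = 1.08 * CFM * deltaT(F)
    return cabinet_kw * 3412.0 / (1.08 * delta_t_f)

for kw in (4, 6, 18):
    print(kw, "kW cabinet ->", round(required_cfm(kw)), "CFM")

Roughly 630 CFM at 4 kW climbs to nearly 2,850 CFM at 18 kW, which is why passive underfloor distribution runs out of steam and fan assist has to take over.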

The fan-assist technology comes in many varieties and has evolved over time.

• First there were the rack-mounted, 1-U type of fans that increase circulation to the front of the servers, particularly to those at the top of the cabinet.

• Next came the fixed muffin fans (mounted top, bottom and rear) used to draw the air through the cabinet. Many of these systems included a thermostat to cycle individual fans on and off as needed.

• Later came larger rear-door and top-mounted fans of various capacities integrated into the cabinet design to maximize the air flow evenly through the entire cabinet and in some cases even to direct the air discharge.

All these added fans add load to the data center and particularly to the UPS. To better address this and to maximize efficiencies, the latest fan-assist design utilizes variable-speed fans that adjust airflow rates to match the needs of a particular cabinet.

Until recently, manufacturers did not include anything more than muffin fans with servers. In the past year, this has started to change. Server manufacturers are now starting to push new solutions out of research labs and into production. At least one server manufacturer is now utilizing multiple variable turbine-type fans in their blade servers. These are compact, high air volume, redundant and part of the manufactured product. More of these server-based cooling solutions can be expected in the coming months.
sr. member
Activity: 423
Merit: 250
December 28, 2013, 09:13:31 PM
#12
An air-side economizer intakes outside air into the building when it is easier to cool than the air being returned from the conditioned space and distributes it to the space; exhaust air from the servers is vented outside. Under certain weather conditions, the economizer may mix intake and exhaust air to meet the temperature and humidity requirements of the computer equipment.

Evaporative cooling uses non-refrigerated water to reduce indoor air temperature to the desirable range. Commonly referred to as swamp coolers, evaporative coolers utilize water in direct contact with the air being conditioned. Either the water is sprayed as a fine mist or a wetted medium is used to increase the rate of water evaporation into the air. As the water evaporates, it absorbs heat energy from the air, lowering the temperature of the air as the relative humidity of the air increases.
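
The reason the climate matters so much can be shown with the standard saturation-efficiency relation for a direct evaporative cooler, T_supply = T_db - eff x (T_db - T_wb). A small Python sketch with an assumed 85% pad efficiency:

def evap_supply_temp_c(dry_bulb_c, wet_bulb_c, pad_efficiency=0.85):
    # Saturation-efficiency relation for a direct evaporative cooler
    return dry_bulb_c - pad_efficiency * (dry_bulb_c - wet_bulb_c)

print(round(evap_supply_temp_c(40.0, 20.0), 1))  # dry desert air: ~23.0 C supply
print(round(evap_supply_temp_c(32.0, 28.0), 1))  # muggy air: only ~28.6 C

In dry desert air the same pad knocks roughly 17 C off the intake; in muggy air it barely does anything, which is exactly the limitation being described.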

These systems are very energy efficient as no mechanical cooling is employed. However, the systems do require dry air to work effectively, which limits full application to specific climates. Even the most conservative organizations, such as financial institutions, are beginning to use these types of systems, especially because ASHRAE has broadened the operating-temperature recommendations for data centers. ASHRAE's Technical Committee 9.9 recommendations allow dry-bulb operating temperatures between 64.4 degrees F (18 degrees C) and 80.6 degrees F (27 degrees C), with humidity controlled to keep dew points below 59.0 degrees F (15 degrees C) or 60 percent RH, whichever is lower. This has given even the most reluctant owners a green light to consider these options.
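
For anyone wiring this into monitoring, a minimal check of a reading against the TC 9.9 recommended envelope quoted above might look like the sketch below (my own interpretation, treating "whichever is lower" as both moisture limits having to hold):

def in_recommended_envelope(dry_bulb_c, dew_point_c, rh_percent):
    # 18-27 C dry bulb; moisture OK only if dew point <= 15 C and RH <= 60%
    temp_ok = 18.0 <= dry_bulb_c <= 27.0
    moisture_ok = dew_point_c <= 15.0 and rh_percent <= 60.0
    return temp_ok and moisture_ok

print(in_recommended_envelope(25.0, 12.0, 45.0))  # True
print(in_recommended_envelope(25.0, 16.0, 55.0))  # False: dew point too high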

Airside economizers and evaporative cooling systems are difficult to implement in existing data centers because they typically require large HVAC ductwork and a location close to the exterior of the building. In new facilities, these systems increase the capital cost of the facility (i.e., larger building volume), HVAC equipment and ductwork. However, over the course of the lifetime of the facility, these systems significantly reduce operating costs when used in the appropriate climate, ideally, locations with consistent moderate temperatures and low humidity. Even under ideal conditions, the owner of a high-density data center that relies on outside air for cooling must minimize risks associated with environmental events, such as a forest fire generating smoke, and HVAC equipment failures.


Just as water is an effective heat-exchange medium in evaporative cooling systems, it can also be circulated throughout the data center to cool the IT equipment at the cabinet level. In fact, water cooling is far more energy efficient than air cooling. A gallon of water can absorb the same energy per degree of temperature change as 500 cubic feet of air. This yields significant operational savings in typical applications, because circulating air to remove heat requires roughly 10 times the energy needed to move water transporting the same amount of heat.
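
The gallon-versus-500-cubic-feet claim checks out with rounded textbook properties; a quick back-of-the-envelope in Python:

gallon_water_kg = 3.785
water_cp = 4.186                  # kJ/(kg*K)
air_per_ft3_kg = 0.0283 * 1.2     # m^3 per ft^3 times ~1.2 kg/m^3
air_cp = 1.005                    # kJ/(kg*K)

water_kj_per_k = gallon_water_kg * water_cp      # ~15.8 kJ/K
air_kj_per_k = 500 * air_per_ft3_kg * air_cp     # ~17.1 kJ/K
print(round(water_kj_per_k, 1), round(air_kj_per_k, 1))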

However, it is more expensive to install water piping than ductwork. An engineer can provide cost comparisons to provide the owner with the financial insight to make a sound decision when constructing a new facility. It is not usually a feasible retrofit for an existing data center.

Rear-door heat exchangers and integral water cooling are options in existing air-cooled data centers to reduce the energy use and cost associated with cooling. They put the water-cooling power of heat exchangers where they are really needed: on the server racks.

Rear-door heat exchangers are mounted on the back of each server rack. Sealed coils within the heat exchanger circulate chilled water supplied from below the raised floor. Hot air exhausted from the server passes over the coils, transferring the heat to the water and cooling the exhaust air to room temperature before it re-enters the room. The heated water is returned to the chiller plant, where the heat is exhausted from the building. Owners can achieve significant operational savings using these devices. To protect the systems during loss of utility power, many facilities put the pumps for the systems on a dedicated uninterruptible power supply (UPS) system.

Owners have been cautious in adopting this approach due to the risk of leaks. The heat exchanger is equipped with baffles that prevent water spraying into the computer in the rare event of a leak. However, water could still leak onto the floor.

Another alternative is integral cooling, a sort of a "mini AC unit" between the cabinets. This close-coupled system takes the hot air discharged from the servers, cools it immediately and then blows it back to the inlet of the server. The system contains the water within the AC unit itself. The installation can also be designed to drain to a piping system under the floor, and it can incorporate leak detectors.


full member
Activity: 208
Merit: 117
December 28, 2013, 09:06:11 PM
#11
Location: Obviously wherever you live will play a huge part in this ... if you're near mountains, the suggestions above will get you some interesting ambient air to play with, along with the possibility of cheap local electricity if you put up some windmill/solar capacity near the facility. Again, that is just additional capital cost when you're probably more focused on spending as much as you can on G/HASH vs hedging your own power source.

Building a data center for BITCOIN or ANYCOIN should follow most of the current standards out there. Running any computer equipment at high temperatures for extended periods greatly reduces reliability and component longevity and will likely cause unplanned downtime. Maintaining an ambient temperature range of 68F to 75F (20 to 24C) is optimal for system reliability. This temperature range provides a safe buffer for equipment to operate in the event of air conditioning or HVAC equipment failure while making it easier to maintain a safe relative humidity level.

It is a generally agreed upon standard in the computer industry that expensive IT equipment should not be operated in a computer room or data center where the ambient room temperature has exceeded 85F (30C).

In today's high-density data centers and computer rooms, measuring the ambient room temperature is often not enough. The temperature of the air where it enters a miner can be measurably higher than the ambient room temperature, depending on the layout of the data center and the concentration of heat-producing rigs. Measuring the temperature of the aisles in the data center at multiple height levels can give an early indication of a potential temperature problem. For consistent and reliable temperature monitoring, place a temperature sensor at least every 25 feet in each aisle, with sensors placed closer together if high-temperature equipment like blade servers is in use. I would recommend installing TemPageR, Room Alert 7E or Room Alert 11E rack units at the top of each rack in the data center. As the heat generated by the components in the rack rises, TemPageR and Room Alert units will provide an early warning and notify staff of temperature issues before critical systems, servers or network equipment is damaged.

Recommended Computer Room Humidity
Relative humidity (RH) is defined as the amount of moisture in the air at a given temperature in relation to the maximum amount of moisture the air could hold at the same temperature. In a Mining Farm or computer room, maintaining ambient relative humidity levels between 45% and 55% is recommended for optimal performance and reliability.

When relative humidity levels are too high, water condensation can occur which results in hardware corrosion and early system and component failure. If the relative humidity is too low, computer equipment becomes susceptible to electrostatic discharge (ESD) which can cause damage to sensitive components. When monitoring the relative humidity in the data center, the recommendation is to set early warning alerts at 40% and 60% relative humidity, with critical alerts at 30% and 70% relative humidity. It is important to remember that the relative humidity is directly related to the current temperature, so monitoring temperature and humidity together is critical.
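
Tying those thresholds together, a minimal alerting sketch in Python (the numbers are the ones recommended above; the sensor names and readings are made up for illustration):

def humidity_alert(rh_percent):
    # 45-55% target; warning outside 40-60%; critical outside 30-70%
    if rh_percent < 30 or rh_percent > 70:
        return "critical"
    if rh_percent < 40 or rh_percent > 60:
        return "warning"
    return "ok"

def temperature_alert(temp_f):
    # 68-75 F target; treat 85 F and above as critical, above 80 F as a warning
    if temp_f >= 85:
        return "critical"
    if temp_f > 80:
        return "warning"
    return "ok"

# Made-up readings from two aisle sensors:
for name, temp_f, rh in [("cold-aisle-1", 72, 48), ("hot-aisle-3", 88, 63)]:
    print(name, temperature_alert(temp_f), humidity_alert(rh))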

So in closing, there are many ways to cool, from traditional air conditioning to evaporative systems ... that part really is a math equation on capital cost. The real focus should be maintaining optimal environmental conditions inside the mining farm to ensure your core capital investment stays operational as efficiently as possible.

Tips: 1BhgD5d6YTDhf7jXLLGYQ3MvtDKw1nLjPd
hero member
Activity: 658
Merit: 500
Small Red and Bad
December 28, 2013, 09:02:47 PM
#10
Have you considered this option?

This might create a cheap boost to your cooling by lowering the air temps and not causing humidity problems.

sr. member
Activity: 896
Merit: 272
Undeadbitcoiner Will not DIE until 1BTC=50K
December 28, 2013, 08:41:07 PM
#9
Please remember that bitcoin mining runs many times hotter than most datacenters.

You will need a lot of electricity. (Like a small hydroelectric dam's worth). You have to innovate to get it cooled. I would say that it would be most efficient to locate a bitcoin data mining center on a mountain.
The electricity could be produced with wind turbines with some solar. Cooling would be easy since being up on a mountain significantly lowers the temperature of the air.


Best idea:
Choosing cool mountains and trying to get a good electricity supply
legendary
Activity: 1204
Merit: 1002
Gresham's Lawyer
December 28, 2013, 08:32:53 PM
#8
I design and build, or rather used to....

I don't want your bounty but will happily offer up my 2 pence worth.

Not a concept I would choose; cooling is more complex than that....

How many servers are we talking? What are the BTU ratings, etc.?

How will air "flow" work around the space? Will each server get enough cooled air?

Why not free cooling? Just high-speed fans in a small space and extraction can be enough on small setups.

It's not as simple as your question makes it

I am not sure of the BTU rating, but I will need to dissipate upwards of 40,000 watts.

Air flow would be a bit tricky.  I was planning on an intake and exhaust fan to get rid of humidity.

The summers here are brutally hot.... it can get up to 110 F.  High-speed fans don't really cut it in those conditions, at least they haven't the last couple of years.

Also an ex data center designer with my .02; I'll leave it to another to write the best answer, but here is a piece:
Condensing coolers will remove humidity from the air.
An evap cooler will add it. Though evap coolers are typically used for more open-air systems, you could do something novel with a combination, being careful with the runoff and cleanliness.
I have contacts at Liebert and Klein, et al.; Trane is also good if you want the full industrial engineering work done for your environment.
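For the roughly 40,000 watts quoted above, some back-of-the-envelope sizing in Python (rounded constants, and assuming evaporation carried the whole load, which in practice it would not):

it_load_w = 40_000
btu_per_hr = it_load_w * 3.412            # ~136,500 BTU/hr
tons_of_cooling = btu_per_hr / 12_000     # ~11.4 tons

# If evaporation carried the whole load (latent heat ~2,400 kJ/kg near room temp):
water_kg_per_hr = (it_load_w / 1000.0) * 3600.0 / 2400.0   # ~60 kg/h
water_gal_per_hr = water_kg_per_hr / 3.785                  # ~16 US gal/h

print(round(btu_per_hr), round(tons_of_cooling, 1), round(water_gal_per_hr, 1))

Call it 11 to 12 tons of cooling, and if an evaporative system carried all of it you would be evaporating something like 15 to 16 gallons of water an hour, which also hints at the humidity you would be pumping into a closed room.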
member
Activity: 84
Merit: 10
December 28, 2013, 08:28:58 PM
#7
A friend of mine works for a company that services the Amazon facility in Milpitas.  If memory serves they use Trane equipment.  You might start here for some research.  http://www.trane.com/datacenter/

Most office building suites have adequate cooling for small to mid-sized server rack configurations.  Where are you planning on building this out? A warehouse or an office suite? What part of the world?
hero member
Activity: 1492
Merit: 763
Life is a taxable event
December 28, 2013, 08:28:09 PM
#6
Please remember that bitcoin mining runs many times hotter than most datacenters.

You will need a lot of electricity. (Like a small hydroelectric dam's worth). You have to innovate to get it cooled. I would say that it would be most efficient to locate a bitcoin data mining center on a mountain.
The electricity could be produced with wind turbines with some solar. Cooling would be easy since being up on a mountain significantly lowers the temperature of the air.

legendary
Activity: 2044
Merit: 1000
December 28, 2013, 08:24:26 PM
#5
I design and build, or rather used to....

I don't want your bounty but will happily offer up my 2 pence worth.

Not a concept I would choose; cooling is more complex than that....

How many servers are we talking? What are the BTU ratings, etc.?

How will air "flow" work around the space? Will each server get enough cooled air?

Why not free cooling? Just high-speed fans in a small space and extraction can be enough on small setups.

It's not as simple as your question makes it

I am not sure of the BTU rating, but I will need to dissipate upwards of 40,000 watts.

Air flow would be a bit tricky.  I was planning on an intake and exhaust fan to get rid of humidity.

The summers here are brutally hot.... it can get up to 110 F.  High-speed fans don't really cut it in those conditions, at least they haven't the last couple of years.
legendary
Activity: 1512
Merit: 1036
December 28, 2013, 08:21:24 PM
#4
Here's a facebook update:

http://www.datacenterknowledge.com/archives/2012/07/16/facebook-revises-data-center-cooling-system/

In phase 2 of the Prineville project, Facebook has replaced the misters with an evaporative cooling system featuring adiabatic media made of fiberglass. Warm air enters through the media, which is dampened by a small flow of water that enters the top of the media. The air is cooled as it passes through the wet media.
Air Not Fully “Scrubbed”

The change followed an incident in which a plume of smoke from a fire spread across the area around the Facebook data center. Staff could smell the smoke inside the data center. That prompted Facebook's data center team to examine other options for treating and "scrubbing" air as it makes its way into the data center.


To clarify the above, there was a brush fire outside and they were pumping smoke and ash through their data center. They now have a waterfall through filter media that pulls particulates out of the air and into the water.

I actually attempted to make my own swamp cooler for my GPU room; it didn't work out so well, as I was just drizzling water through a block of stacked cardboard. Swamp cooler pads aren't available in stores where I live: http://reviews.homedepot.com/1999/100343657/aspen-snow-cool-29-in-x-29-in-replacement-evaporative-cooler-pad-reviews/reviews.htm
member
Activity: 70
Merit: 10
LiveChains.Net
December 28, 2013, 08:17:47 PM
#3
I design and build, or rather used to....

I don't want your bounty but will happily offer up my 2 pence worth.

Not a concept I would choose; cooling is more complex than that....

How many servers are we talking? What are the BTU ratings, etc.?

How will air "flow" work around the space? Will each server get enough cooled air?

Why not free cooling? Just high-speed fans in a small space and extraction can be enough on small setups.

It's not as simple as your question makes it
legendary
Activity: 1512
Merit: 1036
December 28, 2013, 08:16:36 PM
#2
Those work best when you are pumping outside air through the facility, air that is already hot and needs to be cooled. You cannot just recirculate the same air, unless you want to create a rain forest. If it is cool outside, you don't need evaporative cooling, just lots of outside air.

You can look at facebook's system: http://gigaom.com/2012/08/17/a-rare-look-inside-facebooks-oregon-data-center-photos-video/




They use misters to cool lots of outside air - the evaporation of water cools the outside air as it adds humidity.
legendary
Activity: 2044
Merit: 1000
December 28, 2013, 08:11:15 PM
#1
I am in need of some technical advice from those in a position to know.

In building out a new "data center" for mining, I am considering different options for cooling the space.  I am interested in using a unit like this:

http://portablecoolers.com/models/PAC2K482S.html

My understanding is that evaporative cooling can pose a risk to electronics, as the air might become saturated with moisture.  I am in search of some expert advice on this subject. 

1)  Is this a feasible option?
2)  How risky is it?
3)  Can the risks be addressed?
4)  If not, what is a better option?

The most comprehensive answer, with the best information, gets the bounty. 

Thanks!