
Topic: Split AC Unit for Cooling Mining Farm? (Read 645 times)

legendary
Activity: 2170
Merit: 6279
be constructive or S.T.F.U
June 03, 2021, 09:58:50 PM
#34
Even if you are moving enough air, having something to drop the temperature a bit can help even more.

Years ago an HVAC person showed me some calculations for server rooms / data centers in general, and there was a massive amount of math (that I did not understand at all  Grin)
It showed that this many CFM of air movement at this temp could do this, but dropping it by a degree and cutting the CFM could do that, BUT moving more air at a slower speed (bigger fans running slower) could do even more. Really interesting, but as I said I didn't understand most of it beyond the basic theory. It came down to "Just make it cool and let me know how much it's going to cost me".

-Dave

Of course, there is no argument about that; I believe everyone with common sense would agree. In fact, with enough ACs to cool down the hot air coming out of the miners, you will do just fine with no ventilation at all. But it all boils down to the most efficient way of doing it, and cooling a server room is a completely different project.

My gears are located in one of the hottest cities on the planet, and they do just fine without ACs, which makes me confident that any other place would do just fine with enough fans. One important thing to note is pressure: our initial setup wasn't performing well because we were exhausting more air than we forced in, and that air has to come from somewhere, so the negative pressure pulled air back in through the exhaust side of the farm. We only noticed this after one of the exhaust fans died and we found it spinning in the opposite direction (sucking air in); we were also losing miner fans at a high rate.

We ended up adding more intake fans to balance the pressure. The farm got A LOT cooler once the pressure issue was taken care of, and so far there have been zero heat problems with absolutely ZERO AC units in that place.
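
If it helps anyone check their own setup, here is a minimal sketch of that intake-vs-exhaust balance check; the fan counts and CFM ratings are made-up placeholders, not the numbers from our farm.

Code:
# Rough intake/exhaust balance check for a mining room.
# The fan counts and CFM ratings are made-up placeholders -- substitute the rated
# (or better, measured) numbers for your own fans.

intake_fans  = [2000, 2000, 2000]   # CFM of each intake fan
exhaust_fans = [3000, 3000]         # CFM of each exhaust fan

total_in  = sum(intake_fans)
total_out = sum(exhaust_fans)
net = total_in - total_out

print(f"Intake {total_in} CFM, exhaust {total_out} CFM, net {net:+} CFM")
if net < 0:
    print("Negative pressure: air will get pulled back in through the exhaust side.")
else:
    print("Neutral/positive pressure: the intake side stays in control.")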

One thing to keep in mind is that the power consumption and cost of a few large fans is a fraction of that of one large AC unit; ACs also require more regular maintenance and cleaning than fans do.
legendary
Activity: 3458
Merit: 6231
Crypto Swap Exchange
June 03, 2021, 08:47:24 PM
#33
Damn, I'd never have thought that people use AC units to cool their mining farms. I have an old AC that still works; it'd be pretty nice to sell it for mining purposes.

They do. In one of the small farms I have, I still use an AC to help cool down the miners when the outside temperature is above 35°C, but after doing this for many years I've concluded that you don't need an AC if you can move enough air. These miners will run just fine even at an ambient temperature of over 40°C. We have 150 gears in one of the places with zero ACs in it; we run most of them at stock settings, and a few days ago the outside temperature was as high as 40°C and the miners were running pretty cool.

Even if you are moving enough air, having something to drop the temperature a bit can help even more.

Years ago an HVAC person showed me some calculations for server rooms / data centers in general, and there was a massive amount of math (that I did not understand at all  Grin)
It showed that this many CFM of air movement at this temp could do this, but dropping it by a degree and cutting the CFM could do that, BUT moving more air at a slower speed (bigger fans running slower) could do even more. Really interesting, but as I said I didn't understand most of it beyond the basic theory. It came down to "Just make it cool and let me know how much it's going to cost me".

-Dave
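
The core of that math is the standard sensible-heat rule of thumb for air (BTU/hr ≈ 1.08 × CFM × ΔT°F). A rough sketch, with the 54 kW load and the temperature rises picked purely as examples:

Code:
# Sensible-heat rule of thumb for standard air: BTU/hr ~= 1.08 * CFM * deltaT(F).
# Rearranged to estimate the airflow needed for a given heat load and allowed temperature rise.
# The 54 kW load and the temperature rises are example numbers only.

def cfm_needed(load_watts: float, delta_t_f: float) -> float:
    btu_per_hr = load_watts * 3.412           # 1 W = 3.412 BTU/hr
    return btu_per_hr / (1.08 * delta_t_f)

load = 54_000   # e.g. ~40 miners at ~1.35 kW each
for dt in (10, 20, 30):   # allowed intake-to-exhaust rise in deg F
    print(f"dT = {dt:2d} F -> ~{cfm_needed(load, dt):,.0f} CFM")

That is where the "thousands of CFM" figures elsewhere in this thread come from: a 54 kW load needs roughly 17,000 CFM at a 10°F rise, or roughly 5,700 CFM if you can tolerate a 30°F rise.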
legendary
Activity: 2170
Merit: 6279
be constructive or S.T.F.U
Damn, I'd never have thought that people use AC units to cool their mining farms. I have an old AC that still works; it'd be pretty nice to sell it for mining purposes.

They do. In one of the small farms I have, I still use an AC to help cool down the miners when the outside temperature is above 35°C, but after doing this for many years I've concluded that you don't need an AC if you can move enough air. These miners will run just fine even at an ambient temperature of over 40°C. We have 150 gears in one of the places with zero ACs in it; we run most of them at stock settings, and a few days ago the outside temperature was as high as 40°C and the miners were running pretty cool.
jr. member
Activity: 48
Merit: 1
February 12, 2018, 11:03:16 PM
#31
Just an update - got 2 exhaust fans that pull 800 CFM each and put that small 400 CFM inline fan in the old dryer duct, and they are cooling the space very well. The room is not closed off, so it's exposed to the whole basement. I have 2 windows open for intake to balance the basement. It is still winter, so it works well - the real test will be during summer. I would like to sell my 741s before then, but the recent dip in prices is forcing me to hold onto the rigs for now. I expect it will have no problem cooling my 12-GPU rigs.
jr. member
Activity: 94
Merit: 3
February 06, 2018, 01:24:50 PM
#30
Find the cubic footage of the room. I have a small mine and am using a 300 CFM vent fan above the main mining area, venting up through the attic to the outside. Just don't forget about airflow into the room (on the opposite side of the room from the fan). My mining room is 13x20x8 ft. So calculate the cubic feet and get a vent fan with enough CFM (or more) to remove the air from the room in the amount of time you desire.
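
A quick worked version of that calculation, using the room dimensions above:

Code:
# Air changes from room volume and fan CFM.
length, width, height = 13, 20, 8        # ft (room dimensions given above)
fan_cfm = 300                            # rated fan flow, cubic feet per minute

volume = length * width * height         # 2,080 cubic feet
minutes_per_change = volume / fan_cfm    # ~6.9 minutes per full air change
changes_per_hour = 60 / minutes_per_change

print(f"{volume} ft^3, one air change every {minutes_per_change:.1f} min "
      f"(~{changes_per_hour:.0f} air changes/hour)")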
member
Activity: 70
Merit: 34
January 23, 2018, 09:31:27 AM
#29
Heat is the amount of heat produced, also known as the load. It is equal to the amount of power consumed by the miners.

Heat rejection means rejecting that heat to the atmosphere. It's the capacity of the cooling system, which moves heat from indoors to outdoors.

12,000 BTU/hr = 1 ton of cooling
1 ton ≈ 3.52 kW
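
Plugging those conversions into the 54 kW / 29 kW figures discussed elsewhere in this thread (the ~1.35 kW per miner is an assumption, roughly an S9 at the wall):

Code:
# Unit conversions behind the "54 kW heat, 29 kW heat rejection" figures in this thread.
KW_PER_TON    = 3.517    # 1 ton of cooling = 12,000 BTU/hr ~= 3.517 kW
BTU_HR_PER_KW = 3412     # 1 kW = 3,412 BTU/hr

miners       = 40
kw_per_miner = 1.35                      # assumed wall draw per miner (roughly an S9)
heat_load_kw = miners * kw_per_miner     # ~54 kW of heat that has to go somewhere

ac_btu_hr         = 100_000                          # the proposed 100k BTU/hr of A/C
heat_rejection_kw = ac_btu_hr / BTU_HR_PER_KW        # ~29.3 kW
tons_needed       = heat_load_kw / KW_PER_TON        # ~15.4 tons to cover the full load

print(f"Heat load:      {heat_load_kw:.1f} kW = {heat_load_kw * BTU_HR_PER_KW:,.0f} BTU/hr")
print(f"Heat rejection: {heat_rejection_kw:.1f} kW ({ac_btu_hr / 12_000:.1f} tons)")
print(f"Shortfall:      {heat_load_kw - heat_rejection_kw:.1f} kW "
      f"(would need ~{tons_needed:.0f} tons of A/C to cover it all)")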

newbie
Activity: 4
Merit: 1
January 23, 2018, 01:42:17 AM
#28
100k BTU/hr is not enough to keep 40 ASICs cool.

54 kW of heat, 29 kW of heat rejection.



I'll have to look up what you mean by heat and heat rejection in determining adequate cooling.
Thank you for the reply.
newbie
Activity: 4
Merit: 1
January 21, 2018, 09:19:06 PM
#27
Just my two cents -- we run 24 S9s in our farm. It is very much about removing the heat as opposed to cooling the machines. Put your focus on venting heat out as opposed to trying to keep everything cold in a static room. It's much more cost effective. Not that you won't need cooling -- you may, but the main issue is heat removal.

Hope this helps in some way! Godspeed.

I have four MaxAir 10” axial fans I’ve configured for air cycling. Based on recommendations here, those can be reconfigured for exhaust heat extraction (vent to a manifold, blown out through foundation vent blocks). They are currently used for room air exchange, each drawing about 4 watts. These axials are only adequate with low seasonal temps.

Thank you all for sharing your expertise and awareness of the pros/cons.
member
Activity: 70
Merit: 34
January 21, 2018, 01:07:33 AM
#26
100k BTU/hr is not enough to keep 40 ASICs cool.

54 kW of heat, 29 kW of heat rejection.

newbie
Activity: 7
Merit: 0
January 20, 2018, 09:53:31 PM
#25
Just my two cents -- we run 24 S9s in our farm. It is very much about removing the heat as opposed to cooling the machines. Put your focus on venting heat out as opposed to trying to keep everything cold in a static room. It's much more cost effective. Not that you won't need cooling -- you may, but the main issue is heat removal.

Hope this helps in some way! Godspeed.
newbie
Activity: 4
Merit: 1
January 20, 2018, 09:24:27 PM
#24
I’m currently working on a project for this to accommodate up to 40 ASICs. I’m looking at using two Fujitsu AOU45RLXFZ mini splits providing a total of 90,000 BTU of cooling, plus portable filtration and an industrial curtain for the climate area. I’m working out the zone distribution and total amps; I’m out of panel space, so I’m bringing my house up to 320 amps nominal with capacity for 400 amps peak intermittent. Cooling may require two 240-volt outlets at 50 amps. That’s the plan... waiting to hear back on a couple of bids, then I can do some modeling on costs. It’s been a week since the HVAC review and I haven’t heard back from the contractors. I’d really like to avoid this expense, but my space can’t accommodate cooling with just mass flow. Let me know if you want any updates.

Let me know how it goes - my electrician recommended 24,000 BTU for my setup in a 12x13 ft room. For now I am going to put two 800 CFM fans in the one window and leave my 2 other basement windows open. Keep me updated though; a split AC unit might work better - just not sure how much it would cost to run them.

The bid came in using two Daikin units rated at 50,000 BTU each, 100,000 BTU total. Two mini split units cooling an area that would be separated off with an insulated curtain; the curtain also cuts noise by about 17 dB. That’s enough gear to maintain 40 ASICs at optimal temps, humidity, and air quality. All that A/C gear plus the curtain, installed: $25,600. The power lines out will be two 240V circuits at 30 amps each. It’s not as cost effective as mass-flow cooling. In a suburban area I’m troubleshooting an enclosed space and have to mitigate noise and heat without large circulation fans. I keep weighing the numbers against the rapid rise in complexity. The front-end cost of mini-splits and scaling over time... it’s a very difficult choice to build out with unknowns in valuations. There are several tax breaks, credits, and rebates, and I’m doing this under a specific LLC, which helps with the cost structure. If you want to get down to brass tacks, I’d be happy to.
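
A rough sanity check on the electrical side of that bid; the EER of ~10 is an assumed round number, not something from the Daikin spec sheets:

Code:
# Back-of-the-envelope electrical check for the 100,000 BTU/hr mini-split bid.
# The EER is an assumed round number, not from the Daikin spec sheets.

cooling_btu_hr = 100_000
assumed_eer    = 10.0                                   # BTU/hr of cooling per watt in
ac_draw_kw     = cooling_btu_hr / assumed_eer / 1000    # ~10 kW electrical
ac_amps_240v   = ac_draw_kw * 1000 / 240                # ~42 A total at 240 V

miners     = 40
miner_kw   = miners * 1.35                              # ~54 kW IT load (assumed ~1.35 kW each)
miner_amps = miner_kw * 1000 / 240                      # ~225 A at 240 V

print(f"A/C draw ~{ac_draw_kw:.1f} kW (~{ac_amps_240v:.0f} A at 240 V) "
      f"vs two 30 A circuits = 60 A of breaker capacity")
print(f"Miners ~{miner_kw:.0f} kW (~{miner_amps:.0f} A at 240 V) -- "
      f"hence the 320-400 A service upgrade")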
member
Activity: 70
Merit: 34
January 15, 2018, 05:40:34 PM
#23
Interesting points, thanks. I'm glad some places require fire-resistant wallboard.

The affinity laws always hold, in that they show the relationship between flow, pressure, and power. It is not exactly a cubed proportion with pump or fan speed, since performance differs at different speeds. The heat removal requirement will follow the affinity laws, so the fan speed may need to more than double to accomplish the same heat removal when the delta T is halved. Regardless, half the delta T will result in roughly the cube of the fan power required (about eight times). All this is a bit tricky to calculate, as the pressure may change as well. Each time I measured it, the cube function held within experimental error. I have also done whole-office comparisons to demonstrate the point. I know it is contrary to common thought.
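
For reference, a minimal sketch of the ideal affinity-law scaling being discussed here; real fan curves deviate somewhat, as noted above:

Code:
# Ideal fan affinity laws: flow ~ N, pressure ~ N^2, power ~ N^3 (N = speed ratio).
# Real installations deviate because the system pressure changes too, as noted above.

def affinity(speed_ratio: float):
    flow     = speed_ratio          # CFM scales roughly linearly with speed
    pressure = speed_ratio ** 2     # static pressure scales with the square
    power    = speed_ratio ** 3     # shaft power scales with the cube
    return flow, pressure, power

for n in (1.0, 1.5, 2.0):
    f, p, w = affinity(n)
    print(f"speed x{n}: flow x{f:.1f}, pressure x{p:.2f}, power x{w:.1f}")

# Halving the delta T means roughly doubling the flow, so fan power rises ~2^3 = 8x.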

I hope to find the most economical way to run these miners. I'll let you know if/when I prove something.

Thanks for the spirited conversation



legendary
Activity: 1498
Merit: 1030
January 15, 2018, 04:54:40 PM
#22
Okay, so you  really don't know but you think you do. That's dangerous.

https://www.homedepot.com/p/Sheetrock-UltraLight-1-2-in-x-4-ft-x-8-ft-Gypsum-Board-14113411708/202530243

This is not fire-rated sheetrock, and it is for sale today at Home Depot. Generally, if it's 1/2" it's not fire rated; if it's 5/8" it is. That's because they add a 1/8"-thick fiber layer, which is what carries the fire rating.

CPUs burn more power as temperature rises. Not a little more power, but a significant amount - enough for me to pay for mechanical cooling systems and the power required to run them. Oh, and don't forget, all the fans have to speed up. So the warmer the intake air is, the more air you must move over the heatsink to cool the processors. If you double fan speed, fan power goes up with the cube of the speed (roughly eight times). Check the affinity laws.
https://en.wikipedia.org/wiki/Affinity_laws


Very few enterprise class data centers are actually using outside air cooling but yes a few are. I have at least met every large enterprise class colo developer in the US. Even when Washington state tried to force everyone to use direct outside air cooling, I presented how my design was more energy efficient and therefore meets the spirit of the law. That design is currently being built. I have won that argument in Washington now 3 times, because I can show the proof.

I can also argue that all direct outside-air cooling systems are vulnerable to attack. All someone has to do is shoot enough poison into the air intake to kill everyone inside. Even a smoke bomb will cause a clean-agent dump and ventilation closures. That's a solid $250k recovery problem.

How do you cool your data center if there's too much chaff in the air at harvest time? The filters will all clog really quickly. Then you are spending more money on filters than you would have spent on a standard cooling system. That actually happened to a colleague in Dallas. I wouldn't have expected there'd be much harvest in Dallas, but then I read his data. They installed a giant system that is less cost efficient than a standard cooling system. Tragic.

PUE is a flawed method of measuring efficiency. I can improve the PUE of any data center by simply increasing the temperature. The rub is that I have proven this method does not improve the efficiency of the data center; in fact it's slightly worse. You don't have to believe me - it's what I get paid to do. Headlines are designed to grab attention, not to tell the whole truth.

Yes, I've been on the recovery effort after a colleague suffered thermal runaway on a large VRLA battery. I've spoken with a few other colleagues who had thermal-runaway destruction on large lithium-ion batteries; one actually destroyed their power room. I don't think that's the context you were using. I've had several cooling system failures, usually a control or power failure. The room will get hot, but I've never had a server failure because of it. My data center in Puerto Rico just survived the recent hurricane despite a utility power failure lasting 6 weeks. Some of my competitors did fail, so I helped them get going again, as hopefully anyone would do during a tragedy. Fair competition does not include kicking them when they're down.

Ironically, I use outside air to cool my miners. LOL. That's really because my miner rooms started really small. Now I'm looking to build larger rooms so I'm now more serious about the design, efficiency and cost. I am working through an analysis that looks promising. I think I can get the ROI reduced by 40% with my new design. I'm beginning small scale experiments now.


 That particular type of sheetrock is UNAVAILABLE in my area.
 Like I said, it might be a "local codes" type of thing that the only sheetrock I've ever seen was the fire-rated stuff.

 I wasn't talking thermal runaway on batteries (though that's actually MORE dangerous) but on semiconductor gear.

 I can't speak to chaff out of current experience - there isn't much of that around here and I'm upwind from the only close sources, and I didn't have it at all in my previous "in the edge of a city" location - but the SMOKE levels last year were bad.
 I was having to change the filters every week or two at MOST as a result - there are reasons I have always been a strong proponent of "positive pressure with filtered intake air".
 I'd have had to be doing that to some degree anyway even without the miners.
 Chaff shouldn't even be getting TO the filters themselves - there should be insect screening getting clogged by that stuff instead, just make a point of daily inspection during harvest season and cleanout as needed.
 You also pretty much have to be directly downwind and usually adjacent to a farm field that is being harvested for it to be a noticeable issue (based on when I was growing up in a place surrounded by farms), and that's normally going to be a week OR LESS out of the year.

 The "poison" argument is a serious strawman.

 Fan power draw isn't the cube of fan speed, but it's not linear.
 It falls somewhere in between, based on the fan curves I've seen from manufacturers.
 It's an academic point in the servers I've used, though, as the case fans weren't set up for PWM at all and ran at 100% all the time anyway.
 I do grant they were "low end" server designs.

 As it happens, in the place I'm in right now, I CAN'T get enough "massive airflow" to do most of the cooling, except in mid-winter - the evap units I have are doing most of the work, and are certainly more efficient than any mechanical A/C unit ever dreamed of being to date.
 Don't believe the "change every 3 months" claims about the media - if you demineralize the water, they can easily go a year. I had one set still working fairly well after almost 2 years of near-continuous usage, even on aspen media (I just replaced them last week in a unit I bought in May of 2016).


member
Activity: 70
Merit: 34
January 15, 2018, 03:33:03 PM
#21
Okay, so you  really don't know but you think you do. That's dangerous.

https://www.homedepot.com/p/Sheetrock-UltraLight-1-2-in-x-4-ft-x-8-ft-Gypsum-Board-14113411708/202530243

This is not fire-rated sheetrock, and it is for sale today at Home Depot. Generally, if it's 1/2" it's not fire rated; if it's 5/8" it is. That's because they add a 1/8"-thick fiber layer, which is what carries the fire rating.

CPUs burn more power as temperature rises. Not a little more power, but a significant amount - enough for me to pay for mechanical cooling systems and the power required to run them. Oh, and don't forget, all the fans have to speed up. So the warmer the intake air is, the more air you must move over the heatsink to cool the processors. If you double fan speed, fan power goes up with the cube of the speed (roughly eight times). Check the affinity laws.
https://en.wikipedia.org/wiki/Affinity_laws


Very few enterprise class data centers are actually using outside air cooling but yes a few are. I have at least met every large enterprise class colo developer in the US. Even when Washington state tried to force everyone to use direct outside air cooling, I presented how my design was more energy efficient and therefore meets the spirit of the law. That design is currently being built. I have won that argument in Washington now 3 times, because I can show the proof.

I can also argue that all direct outside-air cooling systems are vulnerable to attack. All someone has to do is shoot enough poison into the air intake to kill everyone inside. Even a smoke bomb will cause a clean-agent dump and ventilation closures. That's a solid $250k recovery problem.

How do you cool your data center if there's too much chaff in the air at harvest time? The filters will all clog really quickly. Then you are spending more money on filters than you would have spent on a standard cooling system. That actually happened to a colleague in Dallas. I wouldn't have expected there'd be much harvest in Dallas, but then I read his data. They installed a giant system that is less cost efficient than a standard cooling system. Tragic.

PUE is a flawed method of measuring efficiency. I can improve the PUE of any data center by simply increasing the temperature. The rub is that I have proven this method does not improve the efficiency of the data center; in fact it's slightly worse. You don't have to believe me - it's what I get paid to do. Headlines are designed to grab attention, not to tell the whole truth.

Yes, I've been on the recovery effort after a colleague suffered thermal runaway on a large VRLA battery. I've spoken with a few other colleagues who had thermal-runaway destruction on large lithium-ion batteries; one actually destroyed their power room. I don't think that's the context you were using. I've had several cooling system failures, usually a control or power failure. The room will get hot, but I've never had a server failure because of it. My data center in Puerto Rico just survived the recent hurricane despite a utility power failure lasting 6 weeks. Some of my competitors did fail, so I helped them get going again, as hopefully anyone would do during a tragedy. Fair competition does not include kicking them when they're down.

Ironically, I use outside air to cool my miners. LOL. That's really because my miner rooms started really small. Now I'm looking to build larger rooms so I'm now more serious about the design, efficiency and cost. I am working through an analysis that looks promising. I think I can get the ROI reduced by 40% with my new design. I'm beginning small scale experiments now.






legendary
Activity: 1498
Merit: 1030
January 14, 2018, 05:13:08 AM
#20
The Chicken Coop shows a major overall efficiency gain though versus conventional centers - otherwise they would have stopped building them.
The servers use a LITTLE more power when running warmer, but not nearly as much as A/C was using to cool with.

I've never seen sheet rock that was not fire rated - didn't know it existed.
I do know the difference between "fire rated" and "fireproof" - hour(s) vs more-or-less forever. 9-)
I don't work construction, but I've worked with enough such folks and had enough family that did that I can sometimes hum the tune (electrician excepted, I HAVE worked as a union-trained journeyman electrician in the past).
Perhaps the suppliers I've worked with didn't bother with the non-fire-rated stuff due to "local area" code requirements.
I've also worked with concrete board - nice stuff in its way but kind of a pain to make holes in, and still "fire rated", not "fireproof".

80F is what I target for my intake "cool aisle" - if you can dress for it it's comfortable, but if you have to wear "business casual" it's definitely on the warm side.

I'm not going to ask if you've ever encountered "thermal runaway". 9-)


member
Activity: 70
Merit: 34
January 14, 2018, 01:29:30 AM
#19
It depends on what kind of data center you're running. My typical OCP cabinets are running 20kW, I have routers running 30kW. My cabinets are only 2' wide so I'd bet the power density is 2-3X.

I just designed one room to run 9.6MW IT load, nominal, up to 32kW per 2' cabinet. These little S9s at 1.3kW aren't so bad.

I'd say that every mining farm I've seen so far is completely missing the idea.

Looking at your pictures, I'd suggest you simply isolate the exhaust of all the miners from the intake. I mean a physical barrier that prevents the hot air from mixing with the cool air. I'd just build a box that contains all the hot air, which will go out that convenient window. Place your shelves against the outside of the box, set the miner on the shelf so that the exhaust fan touches the wood (or better yet, sheetrock), use a pencil to outline the location of the exhaust fan, cut a hole, stick the fan through it, and turn it on. Then do the rest of them. Hot-air isolation and containment is the key.

If you really want to, measure the differential pressure between the box and outside. If it exceeds 0.5 psi, mount a squirrel-cage blower with 20% more capacity than the miners' total to suck the air through the miners and force it outside. If you put a VFD on that blower, you can actually set it to about -0.1. This is critical for the effectiveness of the radial fans on the miners; they'll last much longer that way. Don't worry about the fan energy. I've run this test a couple of times and found the fan energy used by the exhaust fan is negated by the fan-energy reduction in all the miners. Then, since the miner chips run cooler, they burn less energy. It'll actually improve your site efficiency.

Sheet rock is better than wood as it's really hard to burn. Your noise level will go way down too. You could even add a layer of insulation but I don't think you'll need it.

Too many mining farms try to throw the air around the room. Too many data centers do too. My data centers all have air to Freon heat exchangers on the back doors of each cabinet.

Think about how the shed solution isolates the hot and cool air. This is doing the same thing.


 OCP cabinets tend to have a lot higher power density than most common 19" rackmount based data centers allow.
 Most data centers do NOT allow 20 kW per cabinet - with most I've worked with, you're lucky if you have 10 kW available.
 Your 32 kW per cabinet data center is the EXCEPTION, not the norm (though not uncommon in OCP usage).

 I've not actually bought any OCP gear, but I keep being tempted by some surplus Quanta Windmill and Winterfell servers/racks, so I have SOME knowledge of the subject and the relatively high power/performance density of OCP vs most "standard" rack-mount servers.

 Sheetrock / drywall is generally fire rated, not just "hard to burn" - how many hours of fire rating tends to vary with the thickness, but I've seen 6-hour ratings on some THICK sheets of the stuff.
 Insulation would probably be a waste, given the airflow level and that drywall itself insulates some.

 New data centers are moving away from the entire "A/C to cool with" concept due to the costs - the Yahoo "Chicken Coop" design is a lot closer to what most large data centers are doing today, as it's a TON more efficient than traditional designs.
 Or look at the GigaWatt "shed" design, which seems to be a lower-tech variant intended to be able to deal with stuff OTHER THAN rack-mounted gear (and seems to be similar to what Bitmain uses in its big farm).


My 19" rack mount cabinets are running 10-30 kW depending on how populated the chassis are. My OCPs are running 14-20 kW; they are design-limited to 24 kW, so technically they cannot reach a Cisco 9922 running full out.

About sheetrock: regular sheetrock is NOT fire-rated; Type X is fire-rated, but not fireproof.
Type X is by no means 100% fireproof; it is simply drywall that will stand up against flame longer than regular drywall. Also, just because an area is rocked in Type X does not ensure fire safety. Fire can still find other avenues to travel: vents, doors, gaps, etc.

If a conventional 1/2"-thick sheet of drywall will stand up to 30 minutes of fire, then the added 1/8" found in Type X drywall, along with its other properties, will increase your margin of safety by another 30 minutes. For this reason, fire-rated drywall is sometimes called one-hour fire wallboard.

I mentioned you could add insulation. There will be a lot of heat in this containment area. If the sheet rock feels warm to the touch, there is a bit of heat conduction between the hot area and the cool area. You may also want to reduce noise further. Insulation is really cheap for such a small project.

Yahoo experimented with the Chicken Coop; they build hyperscale centers. What many have learned is that you can certainly reduce or even eliminate cooling-system energy, but that doesn't mean you'll save total energy. Actually, IT load increases with temperature, as the CPU and transport-interface components behave this way. The headline was great - near-unity PUE - but alas, they forgot to mention the IT load increase.
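
To make that PUE point concrete, here is a toy example with invented numbers: raise the room temperature and PUE improves even though total facility energy goes up.

Code:
# Toy illustration of why PUE alone can mislead (all numbers invented).
# Warmer room: cooling energy drops, but IT load creeps up (leakage, faster server fans).

def pue(it_kw, cooling_kw, other_kw=50):
    total = it_kw + cooling_kw + other_kw
    return total / it_kw, total

cool_pue, cool_total = pue(it_kw=1000, cooling_kw=300)   # cooler setpoint
warm_pue, warm_total = pue(it_kw=1080, cooling_kw=230)   # warmer setpoint

print(f"Cooler room: PUE {cool_pue:.2f}, total {cool_total} kW")
print(f"Warmer room: PUE {warm_pue:.2f}, total {warm_total} kW")
# PUE "improves" (1.35 -> 1.26) even though total facility power went up (1350 -> 1360 kW).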

I was just in Facebook's newest data center in Fort Worth; it's nothing but OCP, and it's not a chicken coop. Good ole mechanical cooling with a very efficient environmental air-exchange system used when conditions allow. Cold aisle temps are 80F, which I found a bit uncomfortable. It's 35MW of IT load per building, with 3 buildings in the plan, which is more than large.

I actually tour many new data centers every year, and speak on the subject at data center engineering conferences. Nearly all new builds are putting in Liebert DSE units with pumped refrigerant economizers. I use some of them too but usually only for a redundancy layer. Our study showed we can actually use less total energy with efficient cooling systems to optimize for server intake air temp of 72F. It also makes for a much more comfortable room to work in.

Over the years I think I've seen 25ish sites with direct outdoor air exchange systems intended to be used most nights and winters. Of all those systems, I've only seen two that are still actually used. One of them is my customer, the other at Cisco, but both of those actually yield much less time in eco mode than we thought they would. Every other one was disabled a long time ago.

Oh, and if it's making noise, it took energy to do that. When you go into a loud data center, I can guarantee it is inefficient. All mine are cold everywhere and quiet enough so I can talk to a small group of people in a normal voice. All noise is wasted energy.

apologies for the tangent topics
legendary
Activity: 1498
Merit: 1030
January 13, 2018, 04:56:44 PM
#18
It depends on what kind of data center you're running. My typical OCP cabinets are running 20kW, I have routers running 30kW. My cabinets are only 2' wide so I'd bet the power density is 2-3X.

I just designed one room to run 9.6MW IT load, nominal, up to 32kW per 2' cabinet. These little S9s at 1.3kW aren't so bad.

I'd say that every mining farm I've seen so far is completely missing the idea.

Looking at your pictures, I'd suggest you simply isolate the exhaust of all the miners from the intake. I mean a physical barrier that prevents the hot air from mixing with the cool air. I'd just build a box that contains all the hot air, which will go out that convenient window. Place your shelves against the outside of the box, set the miner on the shelf so that the exhaust fan touches the wood (or better yet, sheetrock), use a pencil to outline the location of the exhaust fan, cut a hole, stick the fan through it, and turn it on. Then do the rest of them. Hot-air isolation and containment is the key.

If you really want to, measure the differential pressure between the box and outside. If it exceeds 0.5 psi, mount a squirrel-cage blower with 20% more capacity than the miners' total to suck the air through the miners and force it outside. If you put a VFD on that blower, you can actually set it to about -0.1. This is critical for the effectiveness of the radial fans on the miners; they'll last much longer that way. Don't worry about the fan energy. I've run this test a couple of times and found the fan energy used by the exhaust fan is negated by the fan-energy reduction in all the miners. Then, since the miner chips run cooler, they burn less energy. It'll actually improve your site efficiency.

Sheet rock is better than wood as it's really hard to burn. Your noise level will go way down too. You could even add a layer of insulation but I don't think you'll need it.

Too many mining farms try to throw the air around the room. Too many data centers do too. My data centers all have air to Freon heat exchangers on the back doors of each cabinet.

Think about how the shed solution isolates the hot and cool air. This is doing the same thing.


 OCP cabinets tend to have a lot higher power density than most common 19" rackmount based data centers allow.
 Most data centers do NOT allow 20 kW per cabinet - with most I've worked with, you're lucky if you have 10 kW available.
 Your 32 kW per cabinet data center is the EXCEPTION, not the norm (though not uncommon in OCP usage).

 I've not actually bought any OCP gear, but I keep being tempted by some surplus Quanta Windmill and Winterfell servers/racks, so I have SOME knowledge of the subject and the relatively high power/performance density of OCP vs most "standard" rack-mount servers.

 Sheetrock / drywall is generally fire rated, not just "hard to burn" - how many hours of fire rating tends to vary with the thickness, but I've seen 6-hour ratings on some THICK sheets of the stuff.
 Insulation would probably be a waste, given the airflow level and that drywall itself insulates some.

 New data centers are moving away from the entire "A/C to cool with" concept due to the costs - the Yahoo "Chicken Coop" design is a lot closer to what most large data centers are doing today, as it's a TON more efficient than traditional designs.
 Or look at the GigaWatt "shed" design, which seems to be a lower-tech variant intended to be able to deal with stuff OTHER THAN rack-mounted gear (and seems to be similar to what Bitmain uses in its big farm).


 
member
Activity: 70
Merit: 34
January 13, 2018, 12:53:00 AM
#17
It depends on what kind of data center you're running. My typical OCP cabinets are running 20kW, I have routers running 30kW. My cabinets are only 2' wide so I'd bet the power density is 2-3X.

I just designed one room to run 9.6MW IT load, nominal, up to 32kW per 2' cabinet. These little S9s at 1.3kW aren't so bad.

I'd say that every mining farm I've seen so far is completely missing the idea.

Looking at your pictures, I'd suggest you simply isolate the exhaust of all the miners from the intake. I mean a physical barrier that prevents the hot air from mixing with the cool air. I'd just build a box that contains all the hot air, which will go out that convenient window. Place your shelves against the outside of the box, set the miner on the shelf so that the exhaust fan touches the wood (or better yet, sheetrock), use a pencil to outline the location of the exhaust fan, cut a hole, stick the fan through it, and turn it on. Then do the rest of them. Hot-air isolation and containment is the key.

If you really want to, measure the differential pressure between the box and outside. If it exceeds 0.5 psi, mount a squirrel-cage blower with 20% more capacity than the miners' total to suck the air through the miners and force it outside. If you put a VFD on that blower, you can actually set it to about -0.1. This is critical for the effectiveness of the radial fans on the miners; they'll last much longer that way. Don't worry about the fan energy. I've run this test a couple of times and found the fan energy used by the exhaust fan is negated by the fan-energy reduction in all the miners. Then, since the miner chips run cooler, they burn less energy. It'll actually improve your site efficiency.
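
A hedged sketch of that blower-sizing rule; the per-miner CFM is an assumed round number, not a spec:

Code:
# Sizing the containment-box exhaust blower per the 20%-margin rule above.
# The per-miner CFM is an assumed round number; use the spec for your actual miner fans.

miners        = 24
cfm_per_miner = 200                              # assumed flow each miner pushes into the box
blower_cfm    = miners * cfm_per_miner * 1.2     # 20% more than the miners' combined flow

print(f"Miners push ~{miners * cfm_per_miner:,} CFM into the box; "
      f"size the blower for ~{blower_cfm:,.0f} CFM")
# With a VFD, trim the blower speed until the box sits slightly negative relative
# to the room, so the miners' own fans aren't working against back-pressure.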

Sheet rock is better than wood as it's really hard to burn. Your noise level will go way down too. You could even add a layer of insulation but I don't think you'll need it.

Too many mining farms try to throw the air around the room. Too many data centers do too. My data centers all have air to Freon heat exchangers on the back doors of each cabinet.

Think about how the shed solution isolates the hot and cool air. This is doing the same thing.


member
Activity: 504
Merit: 71
Just Getting Started...
January 12, 2018, 07:50:10 PM
#16
40 miners = 54kW

That would be a disaster.


 54 kW is about 184,000 BTU/hr - that wouldn't be a disaster, that would be a "miners in constant overheat shutdown" situation if you didn't end up with a fire.
 MASSIVE AIRFLOW (and possibly some Evap cooling input to the "cold side" of the room as PART OF that massive airflow) is the only viable answer.

 HVAC folks generally have NO CLUE about the heat load miners generate.
Even DATA CENTER folks have NO CLUE about the heat density of miners, as a general rule.



You're right on that aspect. I manage a data center, and initially I was shocked by both the BTU output and the CFM of these ASIC miners. The only things I have that come close are big Cisco 9509s with 8 power supplies each.

jr. member
Activity: 48
Merit: 1
January 12, 2018, 07:13:24 PM
#15

Little 400 CFM fan in right now as a temporary solution - I live in VA, so it has been cold lately and this is better than nothing.

I have an old dryer vent I plan on ducting the 400 CFM fan into - I am thinking of having 6 or 12 of the ASICs blow into a sectioned-off part of the room and have that 400 CFM fan take it all out... not sure if it will be enough though? Probably not.

 You're not even in the ballpark.
 You need to start thinking in terms of more like 5000-10000 CFM as a minimum.

 A 6" dryer vent might handle *3* ASIC units, if you have a heavy duty "booster" type fan on it or *2* miners if you Y-duct the output of those miners directly into it - no way is it even going to be CLOSE to handling 12.
 Most dryer vents are more like 4" though - which is enough for ONE miner, but no more, and even with a high-end BOOSTER fan won't handle more than perhaps 2 miners.


How is your farm set up for cooling? I am going to have 2000 CFM in exhaust fans total - 1600 in the window and 400 in the old dryer ducting - in a 10x12 room. I am considering buying some land and putting a shed on it with ~5000 CFM in exhaust fans on the top and holes in the sides to let air in. I have a couple of buddies who have theirs set up this way. I am trying to work with what I have though... I have an HOA, so I can't put a shed on my property since my backyard overlooks a golf course.
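
For anyone wondering where the "one miner per 4-inch duct" rule of thumb above comes from, here is a rough duct-capacity estimate; the duct velocity and per-miner CFM are assumptions, and real ducts and booster fans vary a lot:

Code:
# Rough capacity of a round duct: CFM = cross-section area (ft^2) * air velocity (ft/min).
# The duct velocity and per-miner CFM below are assumptions, not measurements.
import math

def duct_cfm(diameter_in: float, velocity_fpm: float) -> float:
    area_ft2 = math.pi * (diameter_in / 12 / 2) ** 2   # diameter in inches -> radius in feet
    return area_ft2 * velocity_fpm

velocity  = 1500   # assumed air speed in the duct, ft/min (booster-fan territory)
miner_cfm = 200    # assumed exhaust per miner

for d in (4, 6, 8):
    cfm = duct_cfm(d, velocity)
    print(f'{d}" duct: ~{cfm:,.0f} CFM (~{cfm / miner_cfm:.1f} miners worth)')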