
Topic: A journey of extreme watercooling: Cooling a rack of GPU servers without AC.

donator
Activity: 1218
Merit: 1079
Gerald Davis
Just remember that the real cost in a datacenter isn't space, but amperage.  If those are running on 120 V, you'll have a hard time finding a datacenter that can give you access to nearly 100 amps @ 120 V with just 1 rack.

If you're just going to put them in your own 45U rack, I hope you've got a separate feed going into your house/office that can give you that kind of power Smiley.

It will be in my own rack, no datacenter.  The largest advantage comes from dumping heat directly outside, so a datacenter doesn't really make sense.

I have 60 A @ 240 V in a sub-panel with 2 NEMA L6-30R outlets and a pair of APC AP9571 PDUs (48 A derated).  I have loaded ~40 A without issue.   Right now I only have gear for 6 complete rigs, not a full rack of 10, and I don't have any plans to expand that.
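
As a rough sanity check on the electrical headroom (a minimal Python sketch; the ~1.1 kW per-rig wall draw is an assumption taken from the load readings later in this thread):

Code:
# Rough power-headroom check for the sub-panel (assumptions noted inline).
SUBPANEL_VOLTS = 240        # V, NEMA L6-30R feed
PDU_AMPS = 48               # A, the pair of APC AP9571 PDUs after derating
RIG_WATTS = 1100            # W per rig -- assumed from the ~1070-1120 W wall readings below

pdu_capacity_w = SUBPANEL_VOLTS * PDU_AMPS       # 11,520 W
load_w = 6 * RIG_WATTS                           # ~6,600 W for 6 complete rigs
print(f"PDU capacity {pdu_capacity_w} W, 6-rig load {load_w} W "
      f"({load_w / SUBPANEL_VOLTS:.1f} A of {PDU_AMPS} A)")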
legendary
Activity: 1750
Merit: 1007
Just remember that the real cost in a datacenter isn't space, but amperage.  If those are running on 120 V, you'll have a hard time finding a datacenter that can give you access to nearly 100 amps @ 120 V with just 1 rack.

If you're just going to put them in your own 45U rack, I hope you've got a separate feed going into your house/office that can give you that kind of power Smiley.
donator
Activity: 1218
Merit: 1079
Gerald Davis
Based on current USD/BTC and your kWh rate, what's your estimated investment payback period?

If I only convert this one rig the payback is <3 months, due to me already having some of the components (sunk cost).  Beyond that it is harder to estimate, because I am not sure how far I can push the cards yet, or whether I will find that changes to my plan need to be made.  Also remember these aren't new rigs but "rig conversions", so you need to look at incremental cost and incremental benefit.

Still a rough estimate:
Each rig conversion costs ~$900 and will provide ~600 MH/s of extra hashing power ($66 per month) and will also eliminate ~$45 per month in cooling cost, so direct payback is ~8 months.  If I can push them beyond 3.2 GH/s, every 100 MH/s provides another ~$9 per month in net revenue. I am hoping I can get a bulk discount at least on the 20 waterblocks, so with lower cost and higher hashrate the payback might be as good as ~7 months.  Overvolting is also an option.

So I would say the absolute best case is ~6 months, worst case ~9 months.
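
A minimal sketch of that payback arithmetic (all figures are the estimates above, not measurements):

Code:
# Payback-period model for one rig conversion, using the post's estimates.
conversion_cost = 900.0     # USD per rig
extra_hash_rev = 66.0       # USD/month from ~600 MH/s of extra hashing
cooling_savings = 45.0      # USD/month of AC cost eliminated

base = extra_hash_rev + cooling_savings
print(f"Direct payback: {conversion_cost / base:.1f} months")   # ~8.1

# Every ~100 MH/s pushed beyond 3.2 GH/s adds ~$9/month of net revenue:
for extra_mhs in (100, 200, 300):
    months = conversion_cost / (base + 9.0 * extra_mhs / 100)
    print(f"+{extra_mhs} MH/s -> ~{months:.1f} months")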
legendary
Activity: 916
Merit: 1003
Based on current USD/BTC and your kWh rate, what's your estimated investment payback period?
donator
Activity: 1218
Merit: 1079
Gerald Davis
Damnit DeathAndTaxes, you are going to make it hard on me, keeping up with the Joneses Grin

Sorry about that. I have been planning this for a while (Dec was great, but June will be brutal w/ 6 kW of heat).  Seeing your out-of-the-box thinking got my behind moving.

Quote
First things first: what kind of rad are you planning on to dump 6+ kW of heat? Most oil coolers are rated in horsepower, which is 746 watts per horsepower. You can get a compact (~1 square foot) rad that is good for 10 HP, but only with a loud cooling fan. However, no fan or a low-speed fan derates it to 8 HP (plenty).

I haven't spent too much time on the "outer loop" yet.  I want to stress test this single rig first; if it fails I won't have wasted too much time and money.  I am thinking big (maybe 24" x 24") with a 16" fan or maybe 2 x 10" fans.

There are a couple of options: car radiator, oil cooler, or industrial heat exchanger.  I don't know exactly which yet, but it likely will be big.

I found another company which makes really nice custom units for cooling lasers and other high-temp components, but their prices are insane (way outside my budget).  They have very detailed charts for °C/W, which give me a ballpark idea of where I need to be aiming (surface area and CFM).

http://www.lytron.com/Heat-Exchangers/Standard/Heat-Exchangers-Tube-Fin
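
For context, the °C/W target a radiator has to hit is just the allowed water-over-ambient rise divided by the heat load. A minimal sketch (the 15°C allowed rise is my assumption, not a figure from this thread):

Code:
# Ballpark thermal-resistance target for the outdoor radiator.
heat_load_w = 6000          # W, full-rack thermal load
allowed_rise_c = 15.0       # degC water over ambient -- assumed

print(f"Target: {allowed_rise_c / heat_load_w:.4f} degC/W")   # 0.0025 degC/W
# Compare against the degC/W-vs-CFM curves on datasheets like Lytron's to
# size core area and fan airflow.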

Quote
Second: Heat exchangers. Do you plan to make them pluggable, or have some way to add/remove them without disturbing the whole cluster? Also, which ones? Most I have seen won't handle the heat from 4 5970s.

Yeah, the tubing on the "cold side" of the exchanger will have quick disconnects.  This will allow removing one rig from the rack.  For troubleshooting I am planning on keeping my "test radiator" from the photo above, with a pump and reservoir.  That way I can connect a sick rig to the baby radiator for diagnosis.

I found some brazed flat-plate exchangers that with good flow (2 GPM) can handle 2 kW+ with a 10°C rise over the cool-side inlet temp.  Remember the AC load of a single 5970 (GPU only) is ~230 W, so the DC (thermal) load is closer to 200 W.  I bought one and will test it out this weekend.  I am thinking of a high-lift canister aquarium or pond pump for the "cold" (outer) loop, which should keep flow rates high.
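
As a sanity check that 2 GPM of flow isn't the bottleneck, the heat a water stream can carry at a given temperature rise is Q = m_dot * cp * dT. A quick sketch (the exchanger's actual rating still depends on its plate area and effectiveness, which this ignores):

Code:
# Heat carried by a water loop at a given flow rate and temperature rise.
GPM_TO_KG_S = 3.785 / 60    # kg/s of water per GPM
CP_WATER = 4186             # J/(kg*degC)

flow_gpm = 2.0
rise_c = 10.0               # degC rise over the cool-side inlet

q_w = flow_gpm * GPM_TO_KG_S * CP_WATER * rise_c
print(f"{flow_gpm} GPM at {rise_c} degC rise carries ~{q_w:.0f} W")  # ~5300 W
# Well above the ~800 W per-rig GPU load, so the plate exchanger's
# effectiveness, not the flow rate, sets its 2 kW+ rating.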

Quote
Third: Power. Is that PSU going to be good enough for continuous duty? Not sure what you run now, but perhaps it will be.

I think so.  Seasonic is solid and their customer support is great.  I paid for units w/ a 5-year warranty, so I will be using that.  If they start to fail I will need to think of alternative loading.
vip
Activity: 1358
Merit: 1000
AKA: gigavps
e) the wife acceptance factor.  She has been a "trooper" with this mad scientist and his 14 GH/s of whirling, buzzing, heat-belching fun.

I had to get a warehouse. Consider yourself lucky.
member
Activity: 86
Merit: 10
rjk
sr. member
Activity: 448
Merit: 250
1ngldh
Damnit DeathAndTaxes, you are going to make it hard on me, keeping up with the Joneses Grin

First things first: what kind of rad are you planning on to dump 6+ kW of heat? Most oil coolers are rated in horsepower, which is 746 watts per horsepower. You can get a compact (~1 square foot) rad that is good for 10 HP, but only with a loud cooling fan. However, no fan or a low-speed fan derates it to 8 HP (plenty).

Second: Heat exchangers. Do you plan to make them pluggable, or have some way to add/remove them without disturbing the whole cluster? Also, which ones? Most I have seen won't handle the heat from 4 5970s.

Third: Power. Is that PSU going to be good enough for continuous duty? Not sure what you run now, but perhaps it will be.
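
For reference, converting those horsepower ratings to watts against the ~6 kW rack load (plain arithmetic, no assumptions beyond the numbers above):

Code:
# Oil-cooler horsepower ratings vs. the rack's thermal load.
W_PER_HP = 746

for hp in (10, 8):
    print(f"{hp} HP ~= {hp * W_PER_HP} W")   # 7460 W and 5968 W
# The 8 HP low-fan derating (~5.97 kW) sits right at the ~6 kW full-rack
# load, so the quiet-fan option has little margin to spare.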
full member
Activity: 227
Merit: 100
If you had chosen BitFORCE, you could've built the same system (3328 MH/s) with only 4 units, costing a total of 2400 USD (+ shipping)
and consuming only 330 watts (less if a high-efficiency power supply was used to power all 4 units). Virtually silent compared to GPUs
(26 dB each unit) and a fraction (a third, so to speak) of the electricity costs Smiley


Regards,

And it probably wouldn't arrive until 2013. Don't get me wrong, I'd like to purchase one myself, but spamming this guy's build thread is in quite poor taste IMO.


To the OP: I wish I had gone with rackmount cases for the little miners I have here; your setup looks good.

None was intended. My apologies if it appears so...


Regards,
donator
Activity: 1218
Merit: 1079
Gerald Davis
If you had chosen BitFORCE, you could've built the same system (3328 MH/s) with only 4 units, costing a total of 2400 USD (+ shipping)

And I would get it in 4-6 weeks months?

If you are going to spam/advertise in my thread, it would be nice if you read it first.  The MB, PSU, RAM, CPU, and 24x 5970s (the most expensive part) were purchased between 12 and 6 months ago.  They are already a sunk cost, as I have no desire to try and unload all that equipment.

While I am interested in migrating to FPGAs, I have summer temps arriving in <3 months, so I am looking for a solution to extend the longevity of my EXISTING HARDWARE (which has produced, if my math is right, ~250 quadrillion valid hashes so far).  Hopefully I can get another ~250 quadrillion hashes as I mine these cards into the ground.
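
That hash count checks out as an order of magnitude. A sketch (the ~7 months of average runtime is my assumption from the purchase dates above):

Code:
# Order-of-magnitude check on the cumulative hash count.
fleet_ghs = 14.0            # GH/s, the farm rate quoted in this thread
months = 7                  # assumed average runtime across the cards
seconds = months * 30 * 24 * 3600

total = fleet_ghs * 1e9 * seconds
print(f"~{total / 1e15:.0f} quadrillion hashes")   # ~254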

Quick questions:
If I place an order for 4 BFL Singles today, can you guarantee a delivery date?  What date would that be?  Tell you what: if you are willing to guarantee delivery by 1 April publicly on this forum (w/ a $200 penalty paid by BFL for non-delivery), I will buy 4 today.
sr. member
Activity: 452
Merit: 250
If you had chosen BitFORCE, you could've built the same system (3328 MH/s) with only 4 units, costing a total of 2400 USD (+ shipping)
and consuming only 330 watts (less if a high-efficiency power supply was used to power all 4 units). Virtually silent compared to GPUs
(26 dB each unit) and a fraction (a third, so to speak) of the electricity costs Smiley


Regards,

And it probably wouldn't arrive until 2013. Don't get me wrong, I'd like to purchase one myself, but spamming this guy's build thread is in quite poor taste IMO.


To the OP: I wish I had gone with rackmount cases for the little miners I have here; your setup looks good.
rjk
sr. member
Activity: 448
Merit: 250
1ngldh
If you had chosen BitFORCE, you could've built the same system (3328 MH/s) with only 4 units, costing a total of 2400 USD (+ shipping)
and consuming only 330 watts (less if a high-efficiency power supply was used to power all 4 units). Virtually silent compared to GPUs
(26 dB each unit) and a fraction (a third, so to speak) of the electricity costs Smiley


Regards,
This is more fun Grin

And he can run BitFORCEs off of it!
full member
Activity: 227
Merit: 100
If you had chosen BitFORCE, you could've built the same system (3328 MH/s, about 328 MH/s higher than the actual solution) with only 4 units,
costing a total of 2400 USD (+ shipping) and consuming only 330 watts (less if a high-efficiency power supply was used to power all 4 units).
Virtually silent compared to GPUs (26 dB each unit) and a fraction (a third, so to speak) of the electricity costs Smiley


Regards,
donator
Activity: 1218
Merit: 1079
Gerald Davis
Some updated pics.


Not your average watercooling job.  Very "industrial": no lights, UV dyes, clear tubing, or bling-bling leet gamer nonsense.  Maybe I am weird, but I think it looks nicer than some flashy setups.

The PSU is a Seasonic 1250 W 80 Plus Gold. I only have one of the 80 mm fans in the back hooked up.  The Sempron puts off almost no heat, so it might be fine just to use the PSU intake as an exhaust (yes, computers did that at one time Smiley). The front left (upper right) is the 3x 5.25" bay.  The front center houses a 3.5" bay w/ an 80 mm intake fan (removed).  The front right houses a 120 mm fan for airflow across the GPUs.   The top cover also has 2x 120 mm fans, but I don't think I will be using them.  This case is sold as a CUDA rack, but honestly I don't see it having enough airflow for air-cooling 4x Teslas.

Yeah, the wiring job is horrible.  I haven't decided on which way the tubing will go.  My first thought is to mount the heat exchanger (should arrive by this weekend) in the front and run "cold loop" lines w/ quick disconnects out the 5.25" bay.


Closeup of the 4x 5970s.  Finding a rackmount case w/ 8 expansion slots is tricky; the few that exist are $500+.  Luckily Chenbro makes this case and it wasn't too expensive.  The waterblocks are DangerDen because they are the cheapest full-coverage waterblocks.  Watercooling is expensive, but "saving" money using non-full-coverage blocks is useless, as the VRMs get too hot.  5970s are nice because one block cools two GPUs.  The 7990 would be even nicer, but by the time they are affordable FPGAs will likely have killed that idea.  I likely will seal the bridges with silicone sealant and apply plumber's tape to all the threads.  I want it as low-maintenance as possible.


Front view w/ filter/door open.  On the left is a 120 mm intake (0.3 A).  I would guesstimate it at ~60 CFM.  It provides a slight breeze over the cards, which is all that is needed for the non-waterblocked components like the caps.    On the right is a dual-bay reservoir which also mounts the MCP655 pump.


With the door closed.  It is currently hooked to this "test radiator".  The end goal (assuming all testing goes well) would be to mount a water-to-water heat exchanger inside each rig, with quick disconnects to attach it to an outer "cold loop" which runs to an outdoor radiator.

I am kinda surprised this radiator is holding up, because 4x 120 mm is way undersized for an ~800 W thermal load.    Ambient temp is ~23°C and the cards are running at ~55°C after 8 hours of hashing.  Strangely, the load "bounces" more than I have seen on other rigs, going from 1070 W to 1120 W.  Need to investigate a little further.
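
One way to see why the little quad-120 mm radiator copes: at typical pump flow the per-pass water temperature rise is small, so the blocks mostly see the loop's steady-state water temperature. A sketch (the 1 GPM loop flow is my assumption for an MCP655 pushing through 4 full-cover blocks):

Code:
# Per-pass coolant temperature rise in the GPU loop.
GPM_TO_KG_S = 3.785 / 60    # kg/s of water per GPM
CP_WATER = 4186             # J/(kg*degC)

flow_gpm = 1.0              # assumed MCP655 flow through a restrictive loop
heat_w = 800                # ~800 W thermal load from the 4x 5970s

rise_c = heat_w / (flow_gpm * GPM_TO_KG_S * CP_WATER)
print(f"Water warms only ~{rise_c:.1f} degC per pass")   # ~3 degC
# The ~55 C cores at ~23 C ambient are therefore set by the radiator's
# steady-state water temperature, not by the per-pass rise.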

Thoughts, comments, suggestions?
donator
Activity: 1218
Merit: 1079
Gerald Davis
How much did that thing cost you to build?

Well, I already had 3 of the waterblocks, plus tubing, fittings, connectors, and the radiator.   My "standard" air-cooled rigs are 3x 5970, MSI 890FXA-GD70, 2 GB of RAM, a Sempron, a USB stick, and a Corsair/Seasonic 1200 W/1250 W PSU, all of which went in here.

So the incremental cost for the "test rig" was just the case, pump, and one waterblock: ~$400.

Full conversion of my other rigs would cost ~$900 ea (4x waterblocks, fittings, connectors, tubing, heat exchanger, pump, silver kill coil, and case) unless I can get some volume discounts.

To build it from scratch would be ~$1300 plus the cost of GPUs.  When I bought most of them it was more like $300-$350 ea; now they are insanely expensive, but hopefully that will change.  I would strongly discourage someone from trying this unless they are already a confident miner AND have experience with liquid cooling.  I am not sure I would try this if I didn't already own the air-cooled rigs. Smiley

The reasons for doing it are to:
a) improve efficiency (getting >3 MH/W now at >3 GH/s, and with underclocking/undervolting that can rise to 5 MH/W if necessary over time to stay profitable; see the sketch after this list)
b) push the cards higher.  I think 3.2 GH/s per rig is possible, allowing me to pick up ~20% more revenue before the reward cut
c) eliminate roughly $4000 per year in AC costs, to be more competitive in the face of a rising number of FPGAs
d) keep the temps more stable (50°C @ 99% load 24/7 all year long shouldn't be a problem)
e) the wife acceptance factor.  She has been a "trooper" with this mad scientist and his 14 GH/s of whirling, buzzing, heat-belching fun.
f) maybe someday provide "free" hot water and heating for the entire house (~$1000 per year).
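
A minimal sketch of the efficiency math in (a) (the ~1050 W wall draw is an assumption taken from the ~1070-1120 W load readings earlier; the rest is from the list above):

Code:
# Efficiency and throttle-down headroom for one converted rig.
rig_mhs = 3200              # MH/s target per rig
wall_w = 1050               # W at the wall -- assumed from the readings above

print(f"Efficiency: {rig_mhs / wall_w:.2f} MH/W")   # ~3.05 MH/W

# The undervolting goal: keep hashrate while cutting power draw.
target_mh_per_w = 5.0
print(f"At {target_mh_per_w} MH/W, {rig_mhs} MH/s needs only "
      f"{rig_mhs / target_mh_per_w:.0f} W")          # 640 W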
donator
Activity: 1218
Merit: 1079
Gerald Davis
3 video cards do not have power plugged in... why, or how?
And which motherboard do you use?
Oops, that was an early pic from when I was testing each card.  BAMT only worked w/ 3 cards, so I unplugged one card, mined, powered down, and plugged in a different one to test all 4.

The MB is the "miner classic" MSI 890FXA-GD70.
ZPK
legendary
Activity: 1302
Merit: 1021
3 video cards do not have power plugged in... why, or how?
And which motherboard do you use?
sr. member
Activity: 457
Merit: 251
Why are your GPU temps only around 42-44°C?


I would guess from the pic that it would be due to water cooling.   Cool
full member
Activity: 164
Merit: 100
Why are your GPU temps only around 42-44°C?

member
Activity: 73
Merit: 10
Are you keeping that in a datacenter?

No, although my office is looking more and more like a datacenter.  Got to get some sleep, but 24 5970s produce a lot of heat and AC cuts into my profits.  If this 1-rig test goes well, my goal is to rack 6 of these in a standard server rack with a heat exchanger to a secondary cooling loop which runs outside to a very large radiator.  Dump 6 kW of heat directly outside.  We will see how this 1-unit experiment goes.

How much did that thing cost you to build?