Author

Topic: Custom frame for BTC/LTC mining (Up to 24 GPU) $150 - $190 - $200+ (Read 22673 times)

newbie
Activity: 6
Merit: 0
Is this project still available, or has it died?
newbie
Activity: 13
Merit: 0
So this thread has died on the vine. Pity, it was an excellent and informative thread.
member
Activity: 98
Merit: 10
https://bitcointalk.org/index.php?topic=434996.0
Anyone know what kind of ATX motherboard holds 6 cards?


Something like this:

https://bitcointalksearch.org/topic/motherboard-designed-for-mining-and-better-riser-cables-in-this-thread-365181

and it's awesomely designed for the power requirements of 6 GPUs
member
Activity: 98
Merit: 10
https://bitcointalk.org/index.php?topic=434996.0
Lol, that's really impressive.

But it's funny to see high-end cards of that quantity and variety being loaded on Windows 98. It should be hack-proof, though; I don't think anyone in this current era of hackers would even know what to do with it.
member
Activity: 108
Merit: 10
Anyone know what kind of ATX motherboard holds 6 cards?
sr. member
Activity: 280
Merit: 250
Doesn't look like there have been many updates on this. I am interested in purchasing many of these if you can deliver.
newbie
Activity: 7
Merit: 0
Very interested in this project. Please, any updates?

Joining the question.
hero member
Activity: 574
Merit: 500
Mining for the hell of it.
Very interested in this project. Please, any updates?
PMB
member
Activity: 109
Merit: 10
Hi,

No developments on this? It really seems like an interesting project.

regards,

P.
newbie
Activity: 40
Merit: 0
I don't quite get it. Does this work yet, or is it still in the "testing phase"?
full member
Activity: 234
Merit: 100
This is really interesting! Please add me to the news mailing list.
newbie
Activity: 28
Merit: 0
Hi everybody,

I'm open to all suggestions. But I would like the parts we use to build the frame to be widely available.

Extracting 4000-5000W of heat is not going to be easy, no doubt about that. Air cooling is going to be the hardest way to do it, which is why I'm looking at various options. I just ordered this fan:
http://www.ebay.com/itm/01-05-Chrysler-PT-Cruiser-Radiator-Cooling-Fan-Assembly-NEW-/130632543695?pt=Motors_Car_Truck_Parts_Accessories&fits=Make%3AChrysler&hash=item1e6a4e6dcf&vxp=mtr
http://i.ebayimg.com/t/01-05-Chrysler-PT-Cruiser-Radiator-Cooling-Fan-Assembly-NEW-/00/s/MTIwMFgxMjAw/z/TnIAAMXQuu9RjLtl/$T2eC16R,!zcE9s4g3JJwBRjLtlIWq!~~60_57.JPG
I will give it a try and see how well it performs.
Here is what I found for it:
Cooling Fan Assembly, Ø415mm Universal, suitable for Chrysler PT Cruiser
# Power: 150 and 180 W
# DC voltage: 12 V
So I can expect it to pull roughly 15 A.
I'll run a few tests and report back as soon as I get some results!
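A quick back-of-envelope check of that current estimate (a sketch only; it assumes the listed 150/180 W figures are electrical input on a regulated 12 V supply):

Code:
# Rough current-draw check for the radiator fan, from the listing's specs.
# Assumes the 150 W / 180 W figures are electrical input on a 12 V supply.
def fan_current(power_w: float, voltage_v: float = 12.0) -> float:
    """Current in amps drawn at the given power and voltage."""
    return power_w / voltage_v

for p in (150, 180):
    print(f"{p} W at 12 V -> {fan_current(p):.1f} A")
# 150 W -> 12.5 A, 180 W -> 15.0 A, so the 12 V supply feeding it needs ~15 A of headroom.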
legendary
Activity: 2926
Merit: 1386
FIRST PRE TESTING RESULTS / PRE BETA
Hi Everybody, I'm happy to report that everything is going fairly well and that the first results are positive.
So far I have mounted 4 cards (7950 Gigabyte WF3 with F43 BIOS, BAMT), with the 5th coming this afternoon. I found the following.
Depending on ambient air temperature (tested at 17-18C outside), I got the thing stable at 67C-72C and 1080W. Hash rates averaged 580-600 kH/s per card during testing.
For cooling, I used a 24" industrial fan (low speed) which only really pushes air around the edge (a design flaw), which is not good at all. It seems a shroud would help a lot to push the air in between the cards instead of around the frame.
I'm going to see if I can get some space to allow the cards to stay correctly positioned. Right now, I used a foam spacer, otherwise some of them touch (you'll understand when you see the pictures).
Posting again soon.

The 5 cards setup so far. Waiting for #6 and some special parts to customize the rack.



Just a note: some of us here who are honestly trying to offer you suggestions have studied heat transfer and worked in the field.

And yes, it is rocket science.

Smiley
hero member
Activity: 504
Merit: 500
Fridges come in all sizes...

and as for your statement... "I would love to see what happens in the insulated enclosure if the fan stops working, this would be very interesting."

Same thing that would happen in an open-air design if the fan stops working: it shuts down that card, or the whole rig. In my 25-plus years building electronics, I have NEVER had a fan die. I don't buy cheap crap fans from China... I also don't use stock crap fans.

As for an enclosure, which would actually be better suited to your design: try a large microwave. Junkyards are full of them for free. Larger? Upgrade to a clothes dryer and gut it. Larger still? Use a chest fridge (remove the insulation if you desire, it is just foam).

All I was saying is you are building a "core" without consideration for any actual evacuation of heat. That unit will hold 3000+ watts, heating a whole house in minutes.

Hot air does NOT only exhaust out the end of a card. It exhausts in all directions, which is why they expect you to contain it, and evacuate it with additional wattage-sucking fans. (hidden costs)

Or, you contain the cards, like I do, in a shell... rip off the PSU/mobo/vid-card-powered fans, and use ONE external 120v/240v fan to replace them all. That makes more power available to the graphics, and system, and provides quiet and better structured cooling that actually works.

Your hidden cost, is all the expensive stuff that is beyond the "frame". For the frame, $150 is a great price. But that isn't a "turn-key" frame, and still requires additional setup, install, and building. If they can manage that other building, then I am sure they can manage building the simple frame.

Still, you are "manufacturing" based on "theory"... Thus... show it populated, and working. (Funny that you keep showing a "failed project" as an example. That was never working either. The one which was half-loaded, because he couldn't get all 16 cards working on one motherboard. Completely unrelated to your build.)

If the rack has a shroud and evacuation, you can remove the cards' existing shrouds and place them 1mm apart. Stock, it is best to spread them so they can MIX air better, mixing more cool air with the hot exhaust they spew in all directions. (Which then still has to be evacuated, now that it is a mix of hot and cool air.)
newbie
Activity: 28
Merit: 0
FIRST PRE TESTING RESULTS / PRE BETA
Hi Everybody, I'm happy to report that everything is going fairly well and that the first results are positive.
So far I have mounted 4 cards (7950 Gigabyte WF3 with F43 BIOS, BAMT), with the 5th coming this afternoon. I found the following.
Depending on ambient air temperature (tested at 17-18C outside), I got the thing stable at 67C-72C and 1080W. Hash rates averaged 580-600 kH/s per card during testing.
For cooling, I used a 24" industrial fan (low speed) which only really pushes air around the edge (a design flaw), which is not good at all. It seems a shroud would help a lot to push the air in between the cards instead of around the frame.
I'm going to see if I can get some space to allow the cards to stay correctly positioned. Right now, I used a foam spacer, otherwise some of them touch (you'll understand when you see the pictures).
Posting again soon.

The 5 cards setup so far. Waiting for #6 and some special parts to customize the rack.
http://www.awtti.com/images/LTCrig1.JPG
http://www.awtti.com/images/LTCrig2.JPG
http://www.awtti.com/images/LTCrig3.JPG
http://www.awtti.com/images/LTCrig4.JPG
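A rough projection of a fully populated frame from those pre-beta numbers (a sketch only; it assumes linear scaling and ignores PSU efficiency curves and any thermal throttling):

Code:
# Projecting the 24-GPU frame from the reported pre-beta figures:
# 4 x 7950 at 1080 W total (at the wall) and 580-600 kH/s per card.
# Linear scaling is an assumption, not a measurement.
cards_tested = 4
watts_tested = 1080
khs_per_card = (580 + 600) / 2

watts_per_card = watts_tested / cards_tested   # ~270 W per card, MB/CPU/PSU losses included
full_rack_watts = watts_per_card * 24          # ~6480 W, near the 6400 W (4 x 1600 W PSU) limit cited elsewhere in the thread
full_rack_khs = khs_per_card * 24              # ~14,160 kH/s

print(f"{watts_per_card:.0f} W/card -> ~{full_rack_watts:.0f} W and "
      f"~{full_rack_khs / 1000:.1f} MH/s for 24 cards")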
newbie
Activity: 28
Merit: 0
The first test will be on the bare frame. I'm still waiting on the acrylic trays.
As soon as they are made, I will mount them and redo some testing to make sure they are not affecting the airflow. I did make a modification to them to allow for better airflow under the motherboard by creating an opening in front of the PSU fan. It will also reduce any possible airflow restriction to the PSU and reduce the overall cost in material (acrylic) for all the trays.
newbie
Activity: 28
Merit: 0
You really need to show it populated with actual equipment, not theoretical equipment.

90% of that structure is useless and expensive for such a simple structure.

Missing things... any kind of cover, to actually allow air to "flow" over components. Any form of exhaust-vent attachment for the 12 kW heater/GPU exhaust. (You will not be cooling that with an AC unit.) Missing filter attachment, and blower-fan. (A free-flow fan will not suffice. That is why they don't use them in a professional setup. You need a blower with high static pressure, since this is not "unrestricted air-flow"; thus, free-flow fans are just a cheap and ineffective alternative to what is actually needed.)

I'll sell you my old fridge, which is better suited for 2x the volume of your structure. Same price, insulated, sealed, ready for ducting, and has built-in supplemental cooling for moisture-extraction of ambient outside air. $150 OBO, pickup only. Tongue

Hi Isawhim,

I thought I would address portions of this post.
I'm going to show it populated with actual equipment; this is the objective of the Beta test phase.

I'm not getting what you are saying about 90% of the structure being useless. There will be a lot of weight attached to the structure, and each piece plays a role in either supporting or reinforcing the frame so that it can support the weight of the hardware and PSUs. Do you have some kind of engineering study to support your claim? Do you mind sharing your insight on how you would remove 90% of the components and still have something that would work and correctly support and protect the equipment?
As far as price goes, I'm doing my best to keep it down, but the aluminum profiles are not cheap, the fasteners are expensive, and you still have to pay for somebody to cut and drill every piece. It's made here in the USA, so labor is not that cheap. Right now, it costs less than $50 per 6 GPUs, which is not too bad if the cooling is adequate. The idea is that the frame will help put together a setup that will be easier to cool (and save energy, allowing the rack to pay for itself).
The system is designed to handle 6 kW, not 12 kW. I'm not sure how you get to that number, knowing that the limitation is 4 PSUs or 6400W max... If you want to enclose the frame, you absolutely can. I already plan on doing that when I run the oil cooling test.
If you want to use a fan with a high static pressure, you also can. But I will start testing with regular fans and see what happens.

You are talking about using a refrigerator that is twice the volume, which defeats the purpose of the rack design (a compact setup). We already know that tight spacing with the right airflow can work:
https://encrypted-tbn2.gstatic.com/images?q=tbn:ANd9GcRMk0ZteKoFZO0kqXhSdfCn1uLCdaJKOUc6j8FLFqYoEqmay9l5
Debating this issue is pointless until I start testing with the current spacing. My theory is: if you can make it work with the cards tightly packed together, why couldn't you with an additional 10mm in between them?

I would love to see what happens in the insulated enclosure if the fan stops working; that would be very interesting.

Update: I will receive the first frames on the 8th. I will post some pictures then.


hero member
Activity: 504
Merit: 500
You really need to show it populated with actual equipment, not theoretical equipment.

90% of that structure is useless and expensive for such a simple structure.

Missing things... any kind of cover, to actually allow air to "flow" over components. Any form of exhaust-vent attachment for the 12 kW heater/GPU exhaust. (You will not be cooling that with an AC unit.) Missing filter attachment, and blower-fan. (A free-flow fan will not suffice. That is why they don't use them in a professional setup. You need a blower with high static pressure, since this is not "unrestricted air-flow"; thus, free-flow fans are just a cheap and ineffective alternative to what is actually needed.)

I'll sell you my old fridge, which is better suited for 2x the volume of your structure. Same price, insulated, sealed, ready for ducting, and has built-in supplemental cooling for moisture-extraction of ambient outside air. $150 OBO, pickup only. Tongue
full member
Activity: 162
Merit: 100
First thing to do: drill holes and mount caster wheels.  Smiley
legendary
Activity: 2926
Merit: 1386
Not really, because air will find the path of least resistance, and you are seeking to put it through specific channels of higher resistance.

That means that correct design is to make the spaces between cards the spaces that the air preferably flows through.  You could test this with one graphics card by putting a flat plate to both sides of it at various distances, then turning the card on.

Another issue is the card positioning: the airflow output from one card becomes the input to the next. Thus, in a row of cards, each one gets warmer input air than the previous. The fix for this is flat-plate, or preferably curved, diverters separating each card and directing airflow out of the row and upwards.
Here is a picture of the rack from the bottom. This is what the air will be forced into. I guess we could test and see how much gap is needed to have the perfect setup. But as I said, there are physical constraints and we are at the limits... Don't forget that natural convection and, most importantly, the GPU fans will create a natural upward flow and a "low pressure zone" in between the cards, which the fresh air coming from the bottom will want to rush into.


I think we are thinking the same thing as far as card positioning, although I'm a bit confused. Here is the drawing I posted about stacking the racks. It's just one idea among many possible setups.
I would not recommend having the air go through one rack directly into the next, as it would seriously warm up the second rack.
If I'm not understanding what you mean, could you possibly do a little schematic of what you think the problem is/will be?
The problem is/will be/always is variations in heat transfer between different parts of the boards.

You can figure this out pretty easily: put the whole thing inside an 18"x18" by 8' tube made of, say, building-supply 1/2" foam, then measure airflow (can be done by looking at the RPM of a fan blade with a strobe) and temperature in and out.

The increase in cooling efficiency as you move things apart is exponential, not linear, for at least the first inch or so. The best way to handle a row of cards would be to put flat-plate diverters between each card, made from plastic or cardboard, not metal. That would somewhat eliminate card-to-card radiative heating.
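To put some rough numbers on why gap width matters so much, here is a minimal sketch that treats each card-to-card gap as a wide, thin channel and computes its Reynolds number. The 2 m/s face velocity is an assumed figure, not a measurement from this build:

Code:
# Reynolds number of the card-to-card gap, modeled as a wide rectangular channel.
# Air properties at ~20 C; the 2 m/s face velocity is an assumption.
RHO_AIR = 1.2      # kg/m^3
MU_AIR = 1.8e-5    # Pa*s

def reynolds(gap_inches: float, velocity_ms: float = 2.0) -> float:
    gap_m = gap_inches * 0.0254
    d_h = 2.0 * gap_m          # hydraulic diameter of a wide, thin channel
    return RHO_AIR * velocity_ms * d_h / MU_AIR

for gap in (0.06, 0.35):
    print(f'gap {gap}" -> Re ~ {reynolds(gap):.0f}')
# ~0.06" (stock SLI spacing) gives Re of a few hundred: firmly laminar and friction-dominated,
# so the air prefers any easier path around the cards. ~0.35" (this rack) is roughly an order
# of magnitude higher, which is part of why the extra 1/4" of spacing matters.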
newbie
Activity: 28
Merit: 0
There's so much hype about oil cooling in this thread... When GPU mining becomes unprofitable, you will be left with thousands of dollars in paperweights. Nobody on eBay will want to buy an oily graphics card. They'll think, "Wtf, this is wet, I don't want to plug this in. It will fry all my equipment!"

On topic though, I would be interested if each card had 3 slots. I don't want my equipment to suffocate.

SSateneh, if you want to sell a card that was submerged in oil, it must go through a soap-and-water cleaning (by hand or in a dishwasher). If the card was installed in the oil when new, it will stay that way. Once it is clean, you will not be able to tell the difference between the oil-cooled one and the air-cooled one. Actually, the air-cooled one will be dusty and might have overheated in some spots because the air cooling was not adequate (typical with dust accumulation). With the oil-cooled one, you know the entire board was kept at a fairly uniform, somewhat low (46C-55C) temperature for its entire life. There is a reason Green Revolution Cooling (http://www.grcooling.com/) is building a multi-million dollar business on it.
"When GPU mining becomes not profitable"...: there are already cryptocurrencies that can only be mined on CPU/GPU. So it might take a while before GPU mining is not profitable at all for all the other cryptocurrencies, including the new ones that will surely come out. If you think about it, when GPU mining is no longer profitable for a currency, it will damage that currency, as most of the people who support it, the mass of miners, are GPU mining. When they all leave, replaced by a few ASIC pools, you lose redundancy, you lose security, you lose interest and the free hype that the miners generate. That was literally the foundation LTC is based on: trying to avoid this particular phenomenon.

If you read about oil cooling on GRC's website, you'll also notice that oil cooling has been proven to improve energy efficiency by up to 20%, which would keep GPU mining profitable for a longer period. For the few of us who will be able to recycle some of the lost heat, it might keep GPU mining viable even longer. So far, my calculations for a 24-GPU rack show that you can build an oil cooling system for around $900. Not cheap, but doable if it saves you 15-20% on electricity cost (and improves your mining capacity). Finally, ASICs were developed to mine Bitcoin, and that's all they will do. Your GPU rack will probably have many more applications developed for it.

If you want to space out your cards, you can skip every other slot. I intend to prove to you that this rack will be able to manage the cards just as they are, with the right amount of air. Keep following the thread; I will post updates from the Beta test.
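For a sense of scale on that $900 oil-cooling figure, here is a rough payback sketch using the numbers quoted above (15-20% energy savings, a 24-GPU rack at roughly 270 W per card from the pre-beta test). The $0.12/kWh electricity rate is an assumption, not a figure from this thread:

Code:
# Rough payback estimate for the ~$900 oil-cooling option.
# 24 GPUs at ~270 W each (pre-beta figure); $0.12/kWh is an assumed rate.
oil_system_cost = 900.0
rack_kw = 24 * 270 / 1000          # ~6.5 kW at the wall
rate_per_kwh = 0.12

for savings in (0.15, 0.20):
    saved_kwh_month = rack_kw * 24 * 30 * savings
    saved_dollars = saved_kwh_month * rate_per_kwh
    print(f"{savings:.0%} savings -> ~${saved_dollars:.0f}/month, "
          f"payback in ~{oil_system_cost / saved_dollars:.1f} months")
# Roughly $84-112/month saved, i.e. the oil loop would pay for itself in under a year
# at these assumptions.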
legendary
Activity: 1344
Merit: 1004
There's so much hype about oil cooling in this thread... When GPU mining becomes unprofitable, you will be left with thousands of dollars in paperweights. Nobody on eBay will want to buy an oily graphics card. They'll think, "Wtf, this is wet, I don't want to plug this in. It will fry all my equipment!"

On topic though, I would be interested if each card had 3 slots. I don't want my equipment to suffocate.
newbie
Activity: 28
Merit: 0
My limitations are:
19" wide: because this is the maximum width the rack can be and still fit in a standard server rack.
24" long: because I don't want it any longer, so that it fits standard fans (20" and 24").
So I built the rack based on these limitations.

I am looking at having something like this built myself. I need to put it into a standard 19" rack, though. When you say 19" wide, do you really mean around 17.8", since that is the maximum that will fit in a 19" rack? I was going to do 17.5" myself. Most server enclosures are 16.9-17.1" to accommodate rails.

I was shooting for only 8 cards across that 17.5", knowing that the most an affordable MB could use is 7. I'm thinking 7-8 cards across 17.5" would allow for enough space between them for airflow. I could reduce the number of cards as necessary.

My design was going to have the cards on top and the MB under them. The PSUs were going to be under the MBs. I was going to have the top, bottom, and sides enclosed, with the front and back open for push/pull fans. The front would push in cold air and the back would pull out hot air. This is for a hot/cold aisle in the server rack.

These are just my thoughts. Haven't gotten as far as you in designing it. Keep up the good work!  Cool



Hi Cartman,
Unfortunately, it's 19" total width... If we did half the number of card, we could make it smaller. But we would also have half the density. Somebody mentioned they could make it fit with a 19" width... So it will fit inside the rack, but you might have to remove the front rails to set it up. Not ideal, but the best we can do now considering the space we need. NB: the limitation comes from the space in between the MB and the CPU, the space in between the MB and its tray, and the space in between the tray and the PSU. If we can trim 1.25" or 0.625 on both sides, then you will be able to fit it on rails.
Right now, you do have to consider that we have limitation of 6 cards (maybe 7 but hard to pull) on most motherboard, so we have to do multiples of 6.
legendary
Activity: 1270
Merit: 1000
My limitations are:
19" wide: because this is the maximum width the rack can be and still fit in a standard server rack.
24" long: because I don't want it any longer, so that it fits standard fans (20" and 24").
So I built the rack based on these limitations.

I am looking at having something like this built myself. I need to put it into a standard 19" rack, though. When you say 19" wide, do you really mean around 17.8", since that is the maximum that will fit in a 19" rack? I was going to do 17.5" myself. Most server enclosures are 16.9-17.1" to accommodate rails.

I was shooting for only 8 cards across that 17.5", knowing that the most an affordable MB could use is 7. I'm thinking 7-8 cards across 17.5" would allow for enough space between them for airflow. I could reduce the number of cards as necessary.

My design was going to have the cards on top and the MB under them. The PSUs were going to be under the MBs. I was going to have the top, bottom, and sides enclosed, with the front and back open for push/pull fans. The front would push in cold air and the back would pull out hot air. This is for a hot/cold aisle in the server rack.

These are just my thoughts. Haven't gotten as far as you in designing it. Keep up the good work!  Cool

newbie
Activity: 28
Merit: 0
Oil cooling, only way to go for this setup:
http://www.maximumpc.com/article/features/hardcorepc_reactor
Well, I guess we'll know pretty soon.
I like the oil cooling technique because, in the case of the rack, it will allow for heat recycling all year long with the use of an oil/water heat exchanger (preheating a hot water heater's intake, used as a pool heater, etc...). And it will make the rack absolutely quiet and very stable.
member
Activity: 98
Merit: 10
newbie
Activity: 28
Merit: 0
OK I am back with only a few minor comments.

1) You might consider attaching the GPUs to the upper side of the support so as not to cover any of the cards' rear vents, which I have found will increase temps even if marginally obstructed.

2) I have an FT02 that I used for mining at one time and found it insufficient to handle large heat loads, even in that orientation. What does help, though, is a large amount of air being pushed from the bottom (i.e., not the three 180mm fans it came with). Make sure to have lots of static pressure as well and a fairly open exit.

3) The shroud for the top could do with being more open, that is, not such a steep curve on the exit. Otherwise it will lead to some pooling of the hot air that will be more difficult to remove.

That's it for now, if I think of anything else I will get back to you. Nice setup and keep improving. Smiley

Excellent feedback!!! Thank you. Wink

1) I will probably switch the upper support to a 0.5" (0.5" x 1") profile so that the rear vents are not obstructed. That's an easy fix, but I HAD to go with the profile I used because the other one was not in stock. The rack is designed so that this particular profile (the one holding the graphic cards) can be toyed with, replaced, modified, moved back and forth, etc... It's the only portion of the rack that will be adjustable.
http://thumbs2.ebaystatic.com/m/mm7Ao65B0ktfUqII7gOv5ig/140.jpg

2) In the case of the FT02, which is a great example, you really "only" have 1 x 180mm fan (of the three) blowing on your SLI/Tri-SLI setup (150 CFM / 3 cards). With the rack, I'm counting on 300 CFM / 3 cards minimum (20" box fan) or 500 CFM / 3 cards with a 24" industrial fan.
http://i792.photobucket.com/albums/yy208/dangcjr/DSC_0086.jpg

For the cooling of the rack, I did consider this (high static pressure), a PT Cruiser radiator fan:
http://i.ebayimg.com/t/New-Radiator-Fan-Cooling-Chrysler-PT-Cruiser-2005-2004-2003-2002-2001-CH3115118-/00/s/NTAyWDUzMA==/z/AyAAAMXQ0v1RcOmo/$(KGrHqNHJEQFDjwVMzBBBRcOmoMJ9g~~60_12.JPG
http://images1.carpartsdiscount.com/auto/archive/pictures/135625/600/1/P/420D614/chrysler_pt_cruiser_radiator_cooling_fan_oem_5017407ab.jpg

I have a few questions on your setup. Did you run the case open? If so, did it make a difference compared to the case closed? What was the spacing between your cards (the standard 0.06")? Did you mod it or switch the bottom fans to get better cooling?

3) I drew the shroud on top with Word, so it is just a concept. I think we could test various angles and depth to make sure it does not restrict the airflow.

I'm looking for all possible suggestions to improve the cooling and setup. Anything you suggest will be used to improve the rack.
newbie
Activity: 28
Merit: 0
Looks interesting... What mobo are you using that can run 6 single-GPU cards (i.e., 7950s)?

I'm working on that as we speak.
You can find some info here: https://bitcointalk.org/index.php?topic=186877.20
I posted on the second page a list of threads that deal with that. I'm working with Boozer to find a stable setup.
I will also be working with the Linux developer to test the setup under Linux and optimize it for 6 GPUs.
newbie
Activity: 28
Merit: 0
Not really, because air will find the path of least resistance, and you are seeking to put it through specific channels of higher resistance.

That means that correct design is to make the spaces between cards the spaces that the air preferably flows through.  You could test this with one graphics card by putting a flat plate to both sides of it at various distances, then turning the card on.

Another issue is the card positioning: the airflow output from one card becomes the input to the next. Thus, in a row of cards, each one gets warmer input air than the previous. The fix for this is flat-plate, or preferably curved, diverters separating each card and directing airflow out of the row and upwards.
Here is a picture of the rack from the bottom. This is what the air will be forced into. I guess we could test and see how much gap is needed to have the perfect setup. But as I said, there are physical constraints and we are at the limits... Don't forget that natural convection and, most importantly, the GPU fans will create a natural upward flow and a "low pressure zone" in between the cards, which the fresh air coming from the bottom will want to rush into.
http://www.awtti.com/images/LTCminingrig13.png


I think we are thinking the same thing as far as card positioning, although I'm a bit confused. Here is the drawing I posted about stacking the racks. It's just one idea among many possible setups.
http://www.awtti.com/images/LTCminingrig10.png
I would not recommend having the air go through one rack directly into the next, as it would seriously warm up the second rack.
If I'm not understanding what you mean, could you possibly do a little schematic of what you think the problem is/will be?
legendary
Activity: 1400
Merit: 1000
I owe my soul to the Bitcoin code...
OK I am back with only a few minor comments.

1) You might consider attaching the GPUs to the upper side of the support so as not to cover any of the cards' rear vents, which I have found will increase temps even if marginally obstructed.

2) I have an FT02 that I used for mining at one time and found it insufficient to handle large heat loads, even in that orientation. What does help, though, is a large amount of air being pushed from the bottom (i.e., not the three 180mm fans it came with). Make sure to have lots of static pressure as well and a fairly open exit.

3) The shroud for the top could do with being more open, that is, not such a steep curve on the exit. Otherwise it will lead to some pooling of the hot air that will be more difficult to remove.

That's it for now, if I think of anything else I will get back to you. Nice setup and keep improving. Smiley
legendary
Activity: 2926
Merit: 1386
I think if you removed every other card in the GPU arrays, cooling would be possible.  An attempt to direct air flow into this packed double array would require a shroud around the entire thing.

The distance between the GPU cards is insufficient for good airflow; the air is required to make a 90-degree turn from the card's fan to exit the rack. A 90-degree turn is difficult for any air mass, but for distances of 1/16 to 1/4 inch, laminar flow generates additional frictional losses.

Maybe this is a bit technical, but the sum of it is: put them further apart...

We will test with an enclosure around the rack, we do want to try to force the air through it and see what happens.

I have done some air cooling test on SLI/Crossfire setup. I'm not saying it's perfect, but you can keep your cards at reasonable temperature under full loads if you have a fan blowing directly onto them.
Here is a good example of a setup with a 20" box fan and very little spacing (0.06" -- standard): http://www.youtube.com/watch?v=2nDTBN_cPs0
The spacing in between brackets (not cards; it should be slightly more) in the rack is 0.35", or roughly 3/8". It's not a lot, but it is 1/4" more than in a standard Crossfire/SLI setup (0.06"). I think the fresh air will be able to get in between and reach the GPU fans, which will push it to the top.
I also count on some help from naturally occurring convection, which will create a chimney pull effect. The FT02 from Silverstone has used this concept and the results are very encouraging. And like I said, this is not new technology; similar arrangements have been used in high-end applications. I'm just trying to adapt it for our application.
But I think only testing will show whether this approach is correct or not. Results to be published in about 2 weeks!

If you want to stack the rack in a server rack, you will have to put shrouds on the top and bottom of the rack, as depicted in one of the pictures. The shrouds will have to be much larger than 1/4" because they will have to contain the fan(s) to push (/pull). One idea is that we could sandwich the rack in between 2 fans to improve aeration.
Not really, because air will find the path of least resistance, and you are seeking to put it through specific channels of higher resistance.

That means that correct design is to make the spaces between cards the spaces that the air preferably flows through.  You could test this with one graphics card by putting a flat plate to both sides of it at various distances, then turning the card on.

Another issue is the card positioning: the airflow output from one card becomes the input to the next. Thus, in a row of cards, each one gets warmer input air than the previous. The fix for this is flat-plate, or preferably curved, diverters separating each card and directing airflow out of the row and upwards.
hero member
Activity: 574
Merit: 500
Damnit where were you 2 years ago  Grin

I was designing other stuff that allowed me to design this frame now  Wink

Looks interesting... What mobo are you using that can run 6 single-GPU cards (i.e., 7950s)?
newbie
Activity: 28
Merit: 0
Damnit where were you 2 years ago  Grin

I was designing other stuff that allowed me to design this frame now  Wink
newbie
Activity: 28
Merit: 0
How far apart is optimal?

Answer: As much as you can!
It's hard to tell without testing. But as mentioned, people with the tight "standard SLI spacing" (http://www.youtube.com/watch?v=2nDTBN_cPs0) are able to make it work. So I expect it to work at least as well with 0.25-0.3" of additional space in between.
Here is a bitcoin mining setup with 0.06" spacing and small fans in the back:
https://encrypted-tbn2.gstatic.com/images?q=tbn:ANd9GcRMk0ZteKoFZO0kqXhSdfCn1uLCdaJKOUc6j8FLFqYoEqmay9l5
If it works there, with 3-4 little 120mm fans pushing hardly 400 SCFM into a dense cluster...

And here, with dual GPU board:
http://mining.bitcoin.cz/media/img/miner.jpg


My limitations are:
19" wide: because this is the maximum width the rack can be and still fit in a standard server rack.
24" long: because I don't want it any longer, so that it fits standard fans (20" and 24").
So I built the rack based on these limitations.
newbie
Activity: 28
Merit: 0
I think if you removed every other card in the GPU arrays, cooling would be possible.  An attempt to direct air flow into this packed double array would require a shroud around the entire thing.

The distance between the GPU cards is insufficient for good airflow; the air is required to make a 90-degree turn from the card's fan to exit the rack. A 90-degree turn is difficult for any air mass, but for distances of 1/16 to 1/4 inch, laminar flow generates additional frictional losses.

Maybe this is a bit technical, but the sum of it is: put them further apart...

We will test with an enclosure around the rack, we do want to try to force the air through it and see what happens.

I have done some air cooling test on SLI/Crossfire setup. I'm not saying it's perfect, but you can keep your cards at reasonable temperature under full loads if you have a fan blowing directly onto them.
Here is a good example of a setup with a 20" box fan and very little spacing (0.06" -- standard): http://www.youtube.com/watch?v=2nDTBN_cPs0
The spacing in between brackets (not cards; it should be slightly more) in the rack is 0.35", or roughly 3/8". It's not a lot, but it is 1/4" more than in a standard Crossfire/SLI setup (0.06"). I think the fresh air will be able to get in between and reach the GPU fans, which will push it to the top.
I also count on some help from naturally occurring convection, which will create a chimney pull effect. The FT02 from Silverstone has used this concept and the results are very encouraging. And like I said, this is not new technology; similar arrangements have been used in high-end applications. I'm just trying to adapt it for our application.
But I think only testing will show whether this approach is correct or not. Results to be published in about 2 weeks!

If you want to stack the rack in a server rack, you will have to put shrouds on the top and bottom of the rack, as depicted in one of the pictures. The shrouds will have to be much larger than 1/4" because they will have to contain the fan(s) to push (/pull). One idea is that we could sandwich the rack in between 2 fans to improve aeration.
hero member
Activity: 873
Merit: 1007
Damnit where were you 2 years ago  Grin
member
Activity: 102
Merit: 10

I think if you removed every other card in the GPU arrays, cooling would be possible.  An attempt to direct air flow into this packed double array would require a shroud around the entire thing.

The distance between the GPU cards is insufficient for good airflow; the air is required to make a 90-degree turn from the card's fan to exit the rack. A 90-degree turn is difficult for any air mass, but for distances of 1/16 to 1/4 inch, laminar flow generates additional frictional losses.

Maybe this is a bit technical, but the sum of it is: put them further apart...

How far apart is optimal?
sr. member
Activity: 280
Merit: 250
It is an interesting design for a high density setup.  Let me mull it over some more and I will get back to you with some thoughts.

I look forward to your feedback.

Latest update 4/29/13: I have found a Linux developer who will put together a light version of Debian developed specifically for this rack. It will allow the entire system to run from a USB key that will come pre-programmed with Cgminer, optimized for 6 GPUs per motherboard, with a very light OS and a graphical interface to optimize the Hash/Watt. It will also allow the operator to monitor the rack remotely.  AUTO INSTALL and AUTO RUN (no computer knowledge needed, 99% Plug & Play).

I am interested, please keep us updated.
legendary
Activity: 2926
Merit: 1386
Good morning everybody,

I'm getting ready to order my new LTC mining setup, but I'd like to put together a clean, long-lasting build that will keep my equipment safe and working at optimum capacity.
I want to reach maximum GPU mining efficiency by providing optimum cooling and opting for the largest GPU/(MB+CPU+other) ratio for the highest Hash/W......
Do you have any suggestions for possible improvements?


Looking forward to your feedback on this little project.
I think if you removed every other card in the GPU arrays, cooling would be possible.  An attempt to direct air flow into this packed double array would require a shroud around the entire thing.

The distance between the GPU cards is insufficient for good airflow; the air is required to make a 90-degree turn from the card's fan to exit the rack. A 90-degree turn is difficult for any air mass, but for distances of 1/16 to 1/4 inch, laminar flow generates additional frictional losses.

Maybe this is a bit technical, but the sum of it is: put them further apart...
newbie
Activity: 28
Merit: 0
It is an interesting design for a high density setup.  Let me mull it over some more and I will get back to you with some thoughts.

I look forward to your feedback.

Latest update 4/29/13: I have found a Linux developer who will put together a light version of Debian developed specifically for this rack. It will allow the entire system to run from a USB key that will come pre-programmed with Cgminer, optimized for 6 GPUs per motherboard, with a very light OS and a graphical interface to optimize the Hash/Watt. It will also allow the operator to monitor the rack remotely.  AUTO INSTALL and AUTO RUN (no computer knowledge needed, 99% Plug & Play).
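To make the "plug and play" idea concrete, here is a minimal sketch of the kind of auto-start script such a USB key could run. The pool URL, worker name, intensity and clock values are placeholders, not settings from this project; the flags are standard cgminer 3.x scrypt options:

Code:
# Hypothetical auto-start script for a 6-GPU scrypt rig (placeholders throughout).
import subprocess

POOL = "stratum+tcp://pool.example.com:3333"   # placeholder pool
WORKER, PASSWORD = "worker1", "x"              # placeholder credentials

cmd = [
    "cgminer", "--scrypt",
    "-o", POOL, "-u", WORKER, "-p", PASSWORD,
    "--intensity", "13",             # tune per card
    "--thread-concurrency", "8192",  # a common 7950 starting point
    "--auto-fan", "--temp-target", "75",
    "--api-listen", "--api-network", # enables the remote monitoring mentioned above
]
subprocess.run(cmd, check=False)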
legendary
Activity: 1400
Merit: 1000
I owe my soul to the Bitcoin code...
It is an interesting design for a high density setup.  Let me mull it over some more and I will get back to you with some thoughts.
newbie
Activity: 28
Merit: 0
where do you get the framing?


The framing is made of aluminum profiles, so any company that makes them: 80/20, Misumi, etc...
Right now, the entire assembly is purchased from a manufacturer in the US.
newbie
Activity: 28
Merit: 0
Is it possible for you to post schematics of your design? It's impossible to judge distances with a 681 × 569 3d render.

You have a good point. It is hard to see the details in this picture.

There is about 1.5" in between the motherboard and the graphic cards connectors. This is the section that will be over the CPU, which means that I will probably have to use a low profile heatsink+fan or just a low profile heatsink (passive cooling might be sufficient). http://www.newegg.com/Product/Product.aspx?Item=N82E16835114094
That heatsink alone costs $30. You're better off making your frame a bit bigger to accommodate the OEM heatsink.

Also, I have no idea how your fan design is going to work. You have to realize that the fan doesn't magically pull air up. There's going to be turbulence and dead spots when you're mounting your fan that close.

Hi Grue,

Right now, the solution is to use a somewhat overkill fan/airflow (and to adjust it as needed). The fan will not sit "that" close, in the sense that the enclosure will give it a minimum of 2"-3" of space from the frame.
If you have used a 24" fan before, you know that it pushes a LOT of air. So there might be some dead spots, but overall, it will be ventilated. Plus, you'll get some help from the natural convection. As long as the fans on the video cards can pull some fresh air, you will be fine.
With 2000-4000 SCFM depending on the fan, we need to dissipate about 24,000 BTU/hr. I know that might not mean much, so here is a comparison: a gas furnace with a capacity of 46,000 BTU/hr has an upflow of 1200 SCFM. So, to simplify, it dissipates 46,000 BTU/hr using 1200 SCFM. Granted, the furnace's heat transfer might be a bit better than the rack's, but we still have enough air to dissipate all the heat.
(http://www.younits.com/46000-furnace-stage-multi-speed-multi-position-upflow-1200cfm-p-3219.html)
The frame was originally 1" bigger, and we could have made it 2" bigger. But some people requested that it fit in a rack, so there is a width limitation of 19".
Finally, you need to remember that this rack will hold $12,000 of equipment, so the cost of a $30 heatsink is not really critical.

I posted another image that shows the configuration of the rack a bit better.
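As a sanity check on that airflow argument, here is a sketch using the standard sensible-heat relation for air (BTU/hr ≈ 1.08 × CFM × ΔT°F at sea level). The 24,000 BTU/hr figure and fan CFM values are the ones quoted above; treat the result as a bulk average, since it says nothing about whether the air actually reaches the gaps between the cards:

Code:
# Bulk exhaust temperature rise from the sensible-heat relation
# BTU/hr ~= 1.08 * CFM * dT(F), for sea-level air.
def temp_rise_f(btu_per_hr: float, cfm: float) -> float:
    return btu_per_hr / (1.08 * cfm)

heat_btu_hr = 24_000   # figure quoted in the post (~6-7 kW rack)
for cfm in (1200, 2000, 4000):
    print(f"{cfm} CFM -> bulk air heats up ~{temp_rise_f(heat_btu_hr, cfm):.1f} F")
# Even at the furnace-like 1200 CFM the average rise is only ~18-19 F; at 2000-4000 CFM
# it drops to ~6-11 F, so total airflow is not the limiting factor -- distribution is.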
member
Activity: 102
Merit: 10
where do you get the framing?
legendary
Activity: 2058
Merit: 1452
Is it possible for you to post schematics of your design? It's impossible to judge distances with a 681 × 569 3d render.

You seem to forget that the motherboard needs a huge heatsink on it for the CPU, plus extra space for RAM and northbridge/southbridge heatsinks. Also, I've never seen an ATX motherboard that allows you to pack cards in that manner. Even with risers it's impossible, because there's not enough space for the riser to twist. Plus, you're packing the cards way too densely; there's no way you're going to get good airflow.
You have a good point. It is hard to see the details in this picture.

There is about 1.5" in between the motherboard and the graphic cards connectors. This is the section that will be over the CPU, which means that I will probably have to use a low profile heatsink+fan or just a low profile heatsink (passive cooling might be sufficient). http://www.newegg.com/Product/Product.aspx?Item=N82E16835114094
That heatsink alone costs $30. You're better off making your frame a bit bigger to accommodate the OEM heatsink.

Also, I have no idea how your fan design is going to work. You have to realize that the fan doesn't magically pull air up. There's going to be turbulence and dead spots when you're mounting your fan that close.
newbie
Activity: 28
Merit: 0
Surely that'll need wheels. Must be around 70kg?
YES, it will be very heavy. Wheels, or set it in place on rubber feet.
To put it in an oil bath, I was planning to use an engine hoist.
http://www.asedeals.com/images/44020-Omega-engine-hoist.jpg
newbie
Activity: 28
Merit: 0
Ya, there are a few things that will most likely need to change:

#1 I LOVE the idea of having GPUs mounted vertically. Especially with a blower-style stock fan, it's the best way to get the heat not only off of the GPU, but to blow it far away from the rig itself. However, you can't do this in a rack-mounted case like you're asking for. You're wanting to vent upwards of 5 kW of heat and blow it right into another computer or server?

#2 Those PSUs can't fit like that. You're assuming the fan is mounted on the back, but most high-end PSUs will have the fan on top. If you put a high-end PSU in there, it will suffocate, and likely overheat.

#3 Like others are saying, you need quite a bit more room for the motherboards, riser cables, PSU 6-pin cables, etc.

#1 Not necessarily. You will have to place a shroud to separate each stage. In a server-room environment where lots of cards are stacked vertically, you would want to vent it to the back of the rack and then up. I would have to do a schematic to explain. I have seen it done this way inside military-grade equipment used for UAV control.
http://www.awtti.com/images/LTCminingrig10.png

#2. The PSUs will get their own "airflow". It's hard to see, but they are mounted in a "tunnel" and isolated from the GPUs/MB by an acrylic tray. There is a 0.65" space in between the tray and the power supply to let the air flow since, as you say, all high-end PSUs are fed from the top.
http://www.lepatek.eu/fileadmin/produkte/netzteile/g-serie/g1600_04.jpg

#3. The only challenge will be managing the riser cables. The PSU 6-pin cables will run along the bottom of the frame (there is enough space for that).
Keep in mind that this picture is somewhat of a worst-case scenario: 9"-long power supplies and 12"-long video cards!!! That combination does not really exist currently and would not be used for this setup, but I wanted to have a design that would include all possibilities.
newbie
Activity: 28
Merit: 0
You seem to forget that the motherboard needs a huge heatsink on it for the CPU, plus extra space for RAM and northbridge/southbridge heatsinks. Also, I've never seen an ATX motherboard that allows you to pack cards in that manner. Even with risers it's impossible, because there's not enough space for the riser to twist. Plus, you're packing the cards way too densely; there's no way you're going to get good airflow.
You have a good point. It is hard to see the details in this picture.

There is about 1.5" in between the motherboard and the graphic cards connectors. This is the section that will be over the CPU, which means that I will probably have to use a low profile heatsink+fan or just a low profile heatsink (passive cooling might be sufficient). http://www.newegg.com/Product/Product.aspx?Item=N82E16835114094
http://images17.newegg.com/is/image/newegg/35-114-094-TS?$S300W$
Where the DDR3 is located (and chipset, now often on the back of the MB), there is a 2" space between the MB and the video card. A memory stick with a large heatspreader is about 2" high, so it would not fit. However, a memory stick without heatspreader (which would be fine for this application) is only 1.2" high and would fit just fine.

Now for the air cooling. The theory is that air cooling efficiency is roughly proportional to airflow/volume. Generally, people cool down their 3-4 card setups with 2-3 case fans that each move 80-120 SCFM at best. A typical case is about 1.3 cubic feet. That's about 230 SCFM per cubic foot.

This rack is 1.6 cubic feet. A 24" fan will move as much as 4000 SCFM at full speed (about the equivalent of 40 x 120mm high-quality case fans). This is about 2500 SCFM per cubic foot of space, or 10 times what you have in a standard setup. (That does not take into account the natural convection occurring from the orientation of the GPUs in the rack.)

I feel pretty confident that the combination of a high airflow + natural convection will keep the setup running well.

Additionally, one option is to use oil cooling for this set up. To be tested as soon as we have worked out all the issues with the air cooling set up ;-)
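The airflow-per-volume comparison above, written out as a sketch (the fan CFM and case volumes are the post's own round numbers, so treat the ratio as an order-of-magnitude argument rather than a measurement):

Code:
# Airflow density comparison: typical gaming case vs. this 24-GPU rack.
# Numbers are the post's own round figures.
typical_case = {"cfm": 3 * 100, "volume_cf": 1.3}   # 2-3 case fans at 80-120 SCFM each
this_rack    = {"cfm": 4000,    "volume_cf": 1.6}   # one 24" fan at full speed

for name, s in (("typical case", typical_case), ("24-GPU rack", this_rack)):
    print(f"{name}: ~{s['cfm'] / s['volume_cf']:.0f} SCFM per cubic foot")
# ~230 vs ~2500 SCFM per cubic foot -- roughly the 10x figure claimed above.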
legendary
Activity: 952
Merit: 1000
Ya, there are a few things that will most likely need to change:

#1 I LOVE the idea of having GPUs mounted vertically. Especially with a blower-style stock fan, it's the best way to get the heat not only off of the GPU, but to blow it far away from the rig itself. However, you can't do this in a rack-mounted case like you're asking for. You're wanting to vent upwards of 5 kW of heat and blow it right into another computer or server?

#2 Those PSUs can't fit like that. You're assuming the fan is mounted on the back, but most high-end PSUs will have the fan on top. If you put a high-end PSU in there, it will suffocate, and likely overheat.



#3 Like others are saying, you need quite a bit more room for the motherboards, riser cables, PSU 6-pin cables, etc.
legendary
Activity: 2058
Merit: 1452
You seem to forget that the motherboard needs a huge heatsink on it for the CPU, plus extra space for RAM and northbridge/southbridge heatsinks. Also, I've never seen an ATX motherboard that allows you to pack cards in that manner. Even with risers it's impossible, because there's not enough space for the riser to twist. Plus, you're packing the cards way too densely; there's no way you're going to get good airflow.
legendary
Activity: 1666
Merit: 1185
dogiecoin.com
Surely that'll need wheels. Must be around 70kg?
newbie
Activity: 28
Merit: 0
Good morning everybody,

I'm getting ready to order my new LTC mining setup, but I'd like to put together a clean, long-lasting build that will keep my equipment safe and working at optimum capacity.
I want to reach maximum GPU mining efficiency by providing optimum cooling and opting for the largest GPU/(MB+CPU+other) ratio for the highest Hash/W.

I designed a custom frame that will support the equipment. The advantages are as follows:

Specifications:
- *NEW* Will come with an optional USB key running a light Debian with Cgminer pre-installed for better energy efficiency. It will also allow you to monitor the rack remotely. AUTO INSTALL AND AUTO RUN (no computer knowledge needed, 99% Plug & Play)
- 24 GPUs: 4 x 6 GPUs on 4 ATX motherboards
- Compatible with 4 ATX PSUs up to 12" long
- Footprint (rack compatible): 24" (long) x 19" (wide) x 12" (high), compatible with 20" (box fan) and 24" fans.
- Compatible with GPUs up to 12" long (PCB) without adding feet; "unlimited length" with feet.
- Cooling mounted on top or bottom or both
- Natural convection for added cooling.
- Graphics cards that exhaust heat from the top of the card will send the hot air outside the case in this setup (custom heatsinks from Sapphire, Gigabyte, MSI, etc...).
- Stackable & can be installed on its side.
- Separate airflow for PSU (shielded from GPU heat via acrylic plates)
- GPU on the outside for optimum cooling and better overall performance
- Can be used in a Desktop or Tower format (on its side).
- Height of the GPUs over the MB is adjustable.
- Adjustable feet available


Price FOB the plant (assembled): $148 (6-9 cases), 10+ $142, 40+ $137.
Shipping is on top; I don't know how much it will be yet (the rack is kind of bulky -- 1.6 c.f.).

The order will be placed early next week. People who are interested should PM me as soon as possible. The more we order, the cheaper it will be.

Latest Update 4/30/13: the first batch is shipping to me today. It's going by freight, so I expect it sometime next week.

Latest update 4/29/13: I have found a Linux developer who will put together a light version of Debian developed specifically for this rack. It will allow the entire system to run from a USB key that will come pre-programmed with Cgminer, optimized for 6 GPUs per motherboard, with a very light OS and a graphical interface to optimize the Hash/Watt. It will also allow the operator to monitor the rack remotely.

UPDATE 4/26/13: The order is placed! 2 fully assembled, 3 kits. I will keep one fully assembled and one kit, leaving 1 fully assembled and 2 kits for Beta Testers. I will post the shipping date as soon as I know when it will be ready.

Rack compatible version (19"W x 24"L x 12"H) -- Order Version
http://www.awtti.com/images/LTCminingrig9.png
http://www.awtti.com/images/LTCminingrig11.png
Showing with 9.1" PSU on 0.25" Acrylic plate (9.73" capacity in this set up, 1.25" more is PSU plate on top). ATX Motherboard on Acrylix tray.

http://www.awtti.com/images/LTCminingrig8.png
http://www.awtti.com/images/LTCminingrig10.png
http://www.awtti.com/images/LTCminingrig12.png

I will also investigate an oil cooling option for people interested in keeping their room cool.

Do you have any suggestions for possible improvements?


Looking forward to your feedback on this little project.