
Topic: Custom frame for BTC/LTC mining (Up to 24 GPU) $150 - $190 - $200+ - page 2. (Read 22672 times)

legendary
Activity: 2926
Merit: 1386
Not really, because air will find the path of least resistance, and you are seeking to put it through specific channels of higher resistance.

That means that the correct design is to make the spaces between cards the spaces that the air preferentially flows through.  You could test this with one graphics card by putting a flat plate on each side of it at various distances, then turning the card on.

Another issue, based on the card positioning, is that the airflow output from one card becomes the intake for the next.  Thus in a row of cards, each receives warmer intake air than the previous one.  The fix for this is flat-plate, or preferably curved, diverters separating each card and directing the airflow out of the row and upwards.
Here is a picture of the rack from the bottom. This is what the air will be forced into. I guess we could test and see how much gap is needed to have the perfect setup. But as I said, there are physical constraints and we are at the limits... Don't forget that natural convection and, most importantly, the GPU fans will create a natural upward flow and a "low pressure zone" in between the cards, into which the fresh air coming from the bottom will want to rush.


I think we are thinking the same thing as far as card positioning, although I'm a bit confused. Here is the drawing I posted about stacking the racks. It's just one idea among many possible setups.
I would not recommend having the air go through one rack directly into the next, as it would seriously warm up the second rack.
If I'm not understanding what you mean, could you possibly do a little schematic of what you think the problem is/will be?
The problem is, and always will be, variations in heat transfer between different parts of the boards.

You can figure this out pretty easily: put the whole thing inside an 18"x18" by 8' tube made of, say, building-supply 1/2" foam, then measure the airflow (which can be done by watching the RPM of a fan blade with a strobe) and the temperature in and out. From the flow rate and the temperature rise you can estimate how much heat the setup is actually rejecting (see the sketch below).
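A minimal sketch of that estimate, assuming standard room-temperature air properties; the flow rate and temperatures in the example are made-up numbers:

Code:
# Heat carried away by an air stream, from measured CFM and in/out temperatures.
# Air density and specific heat are approximate sea-level, room-temperature values.
AIR_DENSITY = 1.2         # kg/m^3
AIR_CP = 1005.0           # J/(kg*K)
CFM_TO_M3S = 0.000471947  # one cubic foot per minute in m^3/s

def heat_removed_watts(cfm, t_in_c, t_out_c):
    """Estimate the heat (in watts) removed by the measured airflow."""
    mass_flow = cfm * CFM_TO_M3S * AIR_DENSITY   # kg/s
    return mass_flow * AIR_CP * (t_out_c - t_in_c)

# Example: ~300 CFM through the tube, air entering at 25 C and leaving at 40 C
print(round(heat_removed_watts(300, 25, 40)))  # ~2560 W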

The increase in cooling efficiency as you move the cards apart is exponential, not linear, at least for the first inch or so.  The best way to handle a row of cards would be to put flat-plate diverters between each card, made from plastic or cardboard rather than metal.  That would somewhat reduce card-to-card radiative heating.
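For scale, here is a rough sketch of the card-to-card radiation term those diverters would block; the surface temperatures, emissivity, and facing area are assumed values, not measurements:

Code:
# Net radiation between two facing card surfaces, treated as large parallel
# plates. Under these assumptions it is only a few watts per facing pair,
# i.e. small next to each card's total heat output.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def radiative_watts(t_hot_c, t_cold_c, area_m2, emissivity=0.9):
    """Parallel-plate radiative exchange for equal emissivities."""
    eff_emissivity = 1.0 / (2.0 / emissivity - 1.0)
    t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15
    return eff_emissivity * SIGMA * (t_hot**4 - t_cold**4) * area_m2

# e.g. a 70 C backplate facing a 50 C cooler over ~0.03 m^2 of card area
print(round(radiative_watts(70, 50, 0.03), 1))  # ~4.1 W per facing pair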
newbie
Activity: 28
Merit: 0
There's so much hype about oil cooling in this thread... When GPU mining becomes unprofitable, you will be left with thousands of dollars in paperweights. Nobody on eBay will want to buy an oily graphics card. They'll think "Wtf, this is wet, I don't want to plug this in. It will fry all my equipment!"

On topic though, I would be interested if each card had 3 slots. I don't want my equipment to suffocate.

SSateneh, if you want to sell a card that has been submerged in oil, it must go through a soap and water cleaning (by hand or in a dishwasher). If the card went into the oil when new, the oil keeps it that way. Once it is clean, you will not be able to tell the difference between the oil-cooled one and the air-cooled one. Actually, the air-cooled one will be dusty and might have overheated in some spots where the air cooling was not adequate (typical with dust accumulation). With the oil-cooled one, you know the entire board was kept at a fairly uniform, relatively low (46-55C) temperature for its entire life. There is a reason Green Revolution Cooling (http://www.grcooling.com/) is building a multi-million dollar business on it.
"When GPU mining becomes not profitable" ... : there are already crypto currencies that can only be mined on CPU/GPU. So it might take a while before GPU mining is not profitable at all for all other crypto currencies, including the new ones that will for sure come out. If you think about it, when GPU mining will not be profitable for a currency, it will damage it as most people who support the currency, the mass of miner, is GPU mining. When they all leave replaced by a few ASIC pools, you will loose redundancy, you will loose security, you will lose interest and the free hype that the miners generate. That was literally the foundation LTC is based on: trying to avoid this particular phenomenon.

If you read about oil cooling on GRC's website, you'll also notice they report that oil cooling can improve energy efficiency by up to 20%, which would keep GPU mining profitable for a longer period. For the few of us able to recycle some of the waste heat, it might keep GPU mining viable even longer. So far, my calculations for a 24-GPU rack show that you can build an oil cooling system for around $900. Not cheap, but doable if it saves you 15-20% on electricity cost (and improves your mining capacity). Finally, ASICs were developed to mine Bitcoin and that's all they will ever do. Your GPU rack will probably have many more applications developed for it.
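As a rough illustration of that trade-off, a back-of-the-envelope payback estimate; the per-card draw, savings fraction, and electricity price below are assumptions, not measurements:

Code:
# Payback time for a ~$900 oil cooling system on a 24-GPU rack running 24/7.
def payback_months(system_cost, rack_watts, savings_fraction, price_per_kwh):
    watts_saved = rack_watts * savings_fraction
    kwh_saved_per_month = watts_saved * 24 * 30 / 1000.0
    return system_cost / (kwh_saved_per_month * price_per_kwh)

# Assume 24 cards x 250 W = 6000 W, 15% power savings, $0.12/kWh
print(round(payback_months(900, 24 * 250, 0.15, 0.12), 1))  # ~11.6 months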

If you want to space out your cards, you can skip every other slot. I intend to prove that this rack will be able to manage the cards just as they are, with the right amount of air. Keep following the thread; I will post updates from the beta test.
legendary
Activity: 1344
Merit: 1004
There's so much hype about oil cooling in this thread... When GPU mining becomes unprofitable, you will be left with thousands of dollars in paperweights. Nobody on eBay will want to buy an oily graphics card. They'll think "Wtf, this is wet, I don't want to plug this in. It will fry all my equipment!"

On topic though, I would be interested if each card had 3 slots. I don't want my equipment to suffocate.
newbie
Activity: 28
Merit: 0
My limitations are:
19" wide: because this is the maximum width the rack can be and still fit in a standard server rack.
24" long: because I don't want it any longer than that, so that it matches standard fans (20" and 24").
So I built the rack based on these limitations.

I am looking at having something like this built myself. I need to put it into a standard 19" rack though. When you say 19" wide, do you really mean around 17.8", since that is the maximum that will fit in a 19" rack? I was going to do 17.5" myself. Most server enclosures are 16.9-17.1" wide to accommodate the rails.

I was shooting for only 8 cards across that 17.5", knowing that the most an affordable MB can use is 7. I'm thinking 7-8 cards across 17.5" would allow enough space between them for airflow. I could reduce the number of cards as necessary.

My design was going to have the cards on top and the MBs under them. The PSUs were going to be under the MBs. I was going to have the top, bottom, and sides enclosed, with the front and back open for push/pull fans. The front would push in cold air and the back would pull out hot air. This is for a hot/cold aisle in the server rack.

These are just my thoughts. Haven't gotten as far as you in designing it. Keep up the good work!  Cool



Hi Cartman,
Unfortunately, it's 19" total width... If we did half the number of card, we could make it smaller. But we would also have half the density. Somebody mentioned they could make it fit with a 19" width... So it will fit inside the rack, but you might have to remove the front rails to set it up. Not ideal, but the best we can do now considering the space we need. NB: the limitation comes from the space in between the MB and the CPU, the space in between the MB and its tray, and the space in between the tray and the PSU. If we can trim 1.25" or 0.625 on both sides, then you will be able to fit it on rails.
Right now, you do have to consider that we have limitation of 6 cards (maybe 7 but hard to pull) on most motherboard, so we have to do multiples of 6.
legendary
Activity: 1270
Merit: 1000
My limitations are:
19" wide: because this is the maximum width the rack can be and still fit in a standard server rack.
24" long: because I don't want it any longer than that, so that it matches standard fans (20" and 24").
So I built the rack based on these limitations.

I am looking at having something like this built myself. I need to put it into a standard 19" rack though. When you say 19" wide, do you really mean around 17.8", since that is the maximum that will fit in a 19" rack? I was going to do 17.5" myself. Most server enclosures are 16.9-17.1" wide to accommodate the rails.

I was shooting for only 8 cards across that 17.5", knowing that the most an affordable MB can use is 7. I'm thinking 7-8 cards across 17.5" would allow enough space between them for airflow. I could reduce the number of cards as necessary.

My design was going to have the cards on top and the MBs under them. The PSUs were going to be under the MBs. I was going to have the top, bottom, and sides enclosed, with the front and back open for push/pull fans. The front would push in cold air and the back would pull out hot air. This is for a hot/cold aisle in the server rack.

These are just my thoughts. Haven't gotten as far as you in designing it. Keep up the good work!  Cool

newbie
Activity: 28
Merit: 0
Oil cooling is the only way to go for this setup:
http://www.maximumpc.com/article/features/hardcorepc_reactor
Well, I guess we'll know pretty soon.
I like the oil cooling technique because, in the case of the rack, it will allow for heat recycling all year long with the use of an oil/water heat exchanger (preheating a hot water heater's intake, serving as a pool heater, etc.). And it will make the rack absolutely quiet and very stable.
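To give a sense of how much heat there is to recycle, a rough sketch; the per-card draw (~250 W) and the 80% exchanger effectiveness are assumptions for illustration:

Code:
# How much water the rack's waste heat could warm up through an oil/water
# heat exchanger.
WATER_CP = 4186.0  # J/(kg*K), so ~4186 J per litre per degree C

def litres_heated_per_hour(rack_watts, delta_t_c, exchanger_effectiveness=0.8):
    usable_joules_per_hour = rack_watts * 3600 * exchanger_effectiveness
    return usable_joules_per_hour / (WATER_CP * delta_t_c)

# 24 cards x 250 W = 6000 W, raising water from 15 C to 55 C (a 40 C rise)
print(round(litres_heated_per_hour(24 * 250, 40)))  # ~100 litres per hour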
member
Activity: 98
Merit: 10
newbie
Activity: 28
Merit: 0
OK I am back with only a few minor comments.

1) You might consider attaching the GPUs to the upper side of the support so as not to cover any of the cards' rear vents, which I have found will increase temps even if only marginally obstructed.

2) I have an FT02 that I used for mining at one time and found it insufficient to handle large heat loads even in that orientation. What does help, though, is a large amount of air being pushed from the bottom (i.e. not the three 180mm fans it came with). Make sure to have lots of static pressure as well and a fairly open exit.

3) The shroud for the top could do with being more open, that is, not such a steep curve on the exit. As drawn, it will lead to some pooling of hot air that will be more difficult to remove.

That's it for now, if I think of anything else I will get back to you. Nice setup and keep improving. Smiley

Excellent feedback!!! Thank you. Wink

1) I will probably switch the upper support to a 0.5" (0.5" x 1") profile so that the rear vents are not obstructed. That's an easy fix, but I HAD to go with the profile I used because the other one was not in stock. The rack is designed so that this particular profile (the one holding the graphics cards) can be toyed with, replaced, modified, moved back and forth, etc. It's the only portion of the rack that will be adjustable.
http://thumbs2.ebaystatic.com/m/mm7Ao65B0ktfUqII7gOv5ig/140.jpg

2) In the case of the FT02, which is a great example, you really "only" have one of the three 180mm fans blowing on your SLI/tri-SLI setup (150 CFM / 3 cards, i.e. roughly 50 CFM per card). With the rack, I'm counting on 300 CFM / 3 cards minimum (a 20" box fan), i.e. roughly 100 CFM per card, or 500 CFM with a 24" industrial fan.
http://i792.photobucket.com/albums/yy208/dangcjr/DSC_0086.jpg

For the cooling of the rack, I did consider this high static pressure option, a PT Cruiser radiator fan:
http://i.ebayimg.com/t/New-Radiator-Fan-Cooling-Chrysler-PT-Cruiser-2005-2004-2003-2002-2001-CH3115118-/00/s/NTAyWDUzMA==/z/AyAAAMXQ0v1RcOmo/$(KGrHqNHJEQFDjwVMzBBBRcOmoMJ9g~~60_12.JPG
http://images1.carpartsdiscount.com/auto/archive/pictures/135625/600/1/P/420D614/chrysler_pt_cruiser_radiator_cooling_fan_oem_5017407ab.jpg

I have a few questions about your setup. Did you run the case open? If so, did it make a difference compared to running it closed? What was the spacing between your cards (the standard 0.06")? Did you mod it or switch the bottom fans for better cooling?

3) I drew the shroud on top in Word, so it is just a concept. I think we could test various angles and depths to make sure it does not restrict the airflow.

I'm looking for all possible suggestions to improve the cooling and the setup. Anything you suggest will be used to improve the rack.
newbie
Activity: 28
Merit: 0
Looks interesting... what mobo are you using that can run 6 x single-GPU cards (i.e. 7950s)?

I'm working on that as we speak.
You can find some info here: https://bitcointalk.org/index.php?topic=186877.20
On the second page I posted a list of threads that deal with that. I'm working with Boozer to find a stable setup.
I will also be working with the Linux developer to test the setup under Linux and optimize it for 6 GPUs.
newbie
Activity: 28
Merit: 0
Not really, because air will find the path of least resistance, and you are seeking to put it through specific channels of higher resistance.

That means that the correct design is to make the spaces between cards the spaces that the air preferentially flows through.  You could test this with one graphics card by putting a flat plate on each side of it at various distances, then turning the card on.

Another issue, based on the card positioning, is that the airflow output from one card becomes the intake for the next.  Thus in a row of cards, each receives warmer intake air than the previous one.  The fix for this is flat-plate, or preferably curved, diverters separating each card and directing the airflow out of the row and upwards.
Here is a picture of the rack from the bottom. This is what the air will be forced into. I guess we could test and see how much gap is needed to have the perfect setup. But as I said, there are physical constraints and we are at the limits... Don't forget that natural convection and, most importantly, the GPU fans will create a natural upward flow and a "low pressure zone" in between the cards, into which the fresh air coming from the bottom will want to rush.
http://www.awtti.com/images/LTCminingrig13.png


I think we are thinking the same thing as far as card positioning, although I'm a bit confused. Here is the drawing I posted about stacking the racks. It's just one idea among many possible setups.
http://www.awtti.com/images/LTCminingrig10.png
I would not recommend having the air go through one rack directly into the next, as it would seriously warm up the second rack.
If I'm not understanding what you mean, could you possibly do a little schematic of what you think the problem is/will be?
legendary
Activity: 1400
Merit: 1000
I owe my soul to the Bitcoin code...
OK I am back with only a few minor comments.

1) You might consider attaching the GPUs to the upper side of the support so as not to cover any of the cards' rear vents, which I have found will increase temps even if only marginally obstructed.

2) I have an FT02 that I used for mining at one time and found it insufficient to handle large heat loads even in that orientation. What does help, though, is a large amount of air being pushed from the bottom (i.e. not the three 180mm fans it came with). Make sure to have lots of static pressure as well and a fairly open exit.

3) The shroud for the top could do with being more open, that is, not such a steep curve on the exit. As drawn, it will lead to some pooling of hot air that will be more difficult to remove.

That's it for now, if I think of anything else I will get back to you. Nice setup and keep improving. Smiley
legendary
Activity: 2926
Merit: 1386
I think if you removed every other card in the GPU arrays, cooling would be possible.  An attempt to direct air flow into this packed double array would require a shroud around the entire thing.

The distance between the GPU cards is insufficient for good airflow; the air is required to make a 90-degree turn from the card's fan to exit the rack. A 90-degree turn is difficult for any air mass, but for gaps of 1/16 to 1/4 inch, the laminar flow generates additional frictional losses.

Maybe this is a bit technical, but the gist is: put them further apart...

We will test with an enclosure around the rack; we do want to try to force the air through it and see what happens.

I have done some air cooling tests on SLI/Crossfire setups. I'm not saying it's perfect, but you can keep your cards at reasonable temperatures under full load if you have a fan blowing directly onto them.
Here is a good example of a setup with a 20" box fan and very little spacing (0.06" -- standard): http://www.youtube.com/watch?v=2nDTBN_cPs0
The spacing between brackets (not cards; between cards it should be slightly more) in the rack is 0.35", or roughly 3/8". It's not a lot, but it is nearly 0.3" more than in a standard Crossfire/SLI setup (0.06"). I think the fresh air will be able to get in between and reach the GPU fans, which will push it towards the top.
I'm also counting on some help from natural convection, which will create a chimney pull effect. The FT02 from Silverstone uses this concept and the results are very encouraging. And like I said, this is not new technology; similar arrangements have been used in high-end applications, and I'm just trying to adapt it for our purposes.
But I think only testing will show whether this approach works or not. Results to be published in about 2 weeks!

If you want to stack the racks in a server rack, you will have to put shrouds on the top and bottom of each rack, as depicted in one of the pictures. The shrouds will have to be much larger than 1/4" because they will have to contain the fan(s) that push (/pull). One idea is that we could sandwich the rack between 2 fans to improve the airflow.
Not really, because air will find the path of least resistance, and you are seeking to put it through specific channels of higher resistance.

That means that the correct design is to make the spaces between cards the spaces that the air preferentially flows through.  You could test this with one graphics card by putting a flat plate on each side of it at various distances, then turning the card on.

Another issue, based on the card positioning, is that the airflow output from one card becomes the intake for the next.  Thus in a row of cards, each receives warmer intake air than the previous one.  The fix for this is flat-plate, or preferably curved, diverters separating each card and directing the airflow out of the row and upwards.
hero member
Activity: 574
Merit: 500
Damnit where were you 2 years ago  Grin

I was designing other stuff that led me to the design of this frame now  Wink

Looks interesting... what mobo are you using that can run 6 x single-GPU cards (i.e. 7950s)?
newbie
Activity: 28
Merit: 0
Damnit where were you 2 years ago  Grin

I was designing other stuff that led me to the design of this frame now  Wink
newbie
Activity: 28
Merit: 0
How far apart is optimal?

Answer: As much as you can!
It's hard to tell without testing. But as mentioned, people with the tight "standard SLI spacing" (http://www.youtube.com/watch?v=2nDTBN_cPs0) are able to make it work. So I expect it to work at least as well with 0.25-0.3" of additional space in between.
Here is a Bitcoin mining setup with 0.06" spacing and small fans at the back:
https://encrypted-tbn2.gstatic.com/images?q=tbn:ANd9GcRMk0ZteKoFZO0kqXhSdfCn1uLCdaJKOUc6j8FLFqYoEqmay9l5
If it works there, with 3-4 little 120mm fans pushing barely 400 SCFM in a dense cluster...

And here, with dual-GPU boards:
http://mining.bitcoin.cz/media/img/miner.jpg


My limitations are:
19" wide: because this is the maximum width the rack can be and still fit in a standard server rack.
24" long: because I don't want it any longer than that, so that it matches standard fans (20" and 24").
So I built the rack based on these limitations.
newbie
Activity: 28
Merit: 0
I think if you removed every other card in the GPU arrays, cooling would be possible.  An attempt to direct air flow into this packed double array would require a shroud around the entire thing.

The distance between the GPU cards is insufficient for good airflow; the air is required to make a 90-degree turn from the card's fan to exit the rack. A 90-degree turn is difficult for any air mass, but for gaps of 1/16 to 1/4 inch, the laminar flow generates additional frictional losses.

Maybe this is a bit technical, but the gist is: put them further apart...

We will test with an enclosure around the rack; we do want to try to force the air through it and see what happens.

I have done some air cooling tests on SLI/Crossfire setups. I'm not saying it's perfect, but you can keep your cards at reasonable temperatures under full load if you have a fan blowing directly onto them.
Here is a good example of a setup with a 20" box fan and very little spacing (0.06" -- standard): http://www.youtube.com/watch?v=2nDTBN_cPs0
The spacing between brackets (not cards; between cards it should be slightly more) in the rack is 0.35", or roughly 3/8". It's not a lot, but it is nearly 0.3" more than in a standard Crossfire/SLI setup (0.06"). I think the fresh air will be able to get in between and reach the GPU fans, which will push it towards the top.
I'm also counting on some help from natural convection, which will create a chimney pull effect. The FT02 from Silverstone uses this concept and the results are very encouraging. And like I said, this is not new technology; similar arrangements have been used in high-end applications, and I'm just trying to adapt it for our purposes.
But I think only testing will show whether this approach works or not. Results to be published in about 2 weeks!

If you want to stack the racks in a server rack, you will have to put shrouds on the top and bottom of each rack, as depicted in one of the pictures. The shrouds will have to be much larger than 1/4" because they will have to contain the fan(s) that push (/pull). One idea is that we could sandwich the rack between 2 fans to improve the airflow.
hero member
Activity: 873
Merit: 1007
Damnit where were you 2 years ago  Grin
member
Activity: 102
Merit: 10

I think if you removed every other card in the GPU arrays, cooling would be possible.  An attempt to direct air flow into this packed double array would require a shroud around the entire thing.

The distance between the GPU cards is insufficient for good airflow; the air is required to make a 90-degree turn from the card's fan to exit the rack. A 90-degree turn is difficult for any air mass, but for gaps of 1/16 to 1/4 inch, the laminar flow generates additional frictional losses.

Maybe this is a bit technical, but the gist is: put them further apart...

How far apart is optimal?
sr. member
Activity: 280
Merit: 250
It is an interesting design for a high density setup.  Let me mull it over some more and I will get back to you with some thoughts.

I look forward to your feedback.

Latest update 4/29/13: I have found a Linux developer who will put together a light version of Debian developed specifically for this rack. It will allow the entire system to run from a USB key that comes pre-programmed with cgminer, optimized for 6 GPUs per motherboard, with a very light OS and a graphical interface to optimize the hash/watt. It will also allow the operator to monitor the rack remotely.  AUTO INSTALL and AUTO RUN (no computer knowledge needed, 99% plug & play).
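For the remote monitoring part, here is a minimal sketch of polling cgminer's TCP API from another machine, assuming the rig runs cgminer with --api-listen and is configured to accept remote queries (e.g. --api-network); the IP address is a placeholder:

Code:
# Query a remote cgminer instance over its JSON API (default port 4028).
import json
import socket

def cgminer_query(host, command, port=4028):
    """Send one API command to cgminer and return the parsed JSON reply."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(json.dumps({"command": command}).encode())
        reply = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break
            reply += chunk
    # cgminer terminates its reply with a NUL byte
    return json.loads(reply.rstrip(b"\x00").decode())

summary = cgminer_query("192.168.1.50", "summary")
print(summary["SUMMARY"][0])  # hash rate, accepted/rejected shares, etc.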

I am interested, please keep us updated.
legendary
Activity: 2926
Merit: 1386
Good morning everybody,

I'm getting ready to order my new LTC mining setup, and I'd like to put together a clean, long-lasting build that will keep my equipment safe and working at optimum capacity.
I want to reach the maximum GPU mining efficiency by providing optimum cooling and opting for the largest GPU/(MB+CPU+other) ratio for the highest hash/W...
Do you have any suggestions for possible improvements?


Looking forward to your feedback on this little project.
I think if you removed every other card in the GPU arrays, cooling would be possible.  An attempt to direct air flow into this packed double array would require a shroud around the entire thing.

The distance between the GPU cards is insufficient for good airflow; the air is required to make a 90-degree turn from the card's fan to exit the rack. A 90-degree turn is difficult for any air mass, but for gaps of 1/16 to 1/4 inch, the laminar flow generates additional frictional losses.

Maybe this is a bit technical, but the gist is: put them further apart...
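A quick back-of-the-envelope check of that laminar-flow point, treating the gap between cards as a parallel-plate channel; the 2 m/s face velocity and the air properties are assumed values for illustration:

Code:
# Reynolds number of the air flowing through a narrow gap between cards.
# Both gap sizes come out well below the usual ~2300 transition value, so the
# flow stays laminar and the pressure drop rises sharply as the gap narrows.
AIR_DENSITY = 1.2        # kg/m^3
AIR_VISCOSITY = 1.8e-5   # Pa*s, dynamic viscosity of air near room temperature

def reynolds_number(gap_inches, velocity_m_s):
    gap_m = gap_inches * 0.0254
    hydraulic_diameter = 2.0 * gap_m  # parallel-plate approximation
    return AIR_DENSITY * velocity_m_s * hydraulic_diameter / AIR_VISCOSITY

for gap in (1 / 16, 1 / 4):
    print('%.3f" gap: Re ~ %.0f' % (gap, reynolds_number(gap, 2.0)))
# prints roughly 420 for a 1/16" gap and 1690 for a 1/4" gap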