
Topic: A Few Questions That I Seem To Have Conflicting Data On... (Read 1209 times)

member
Activity: 112
Merit: 100
"I'm not psychic; I'm just damn good"
Just put up a rack somewhere; it's not difficult to do. Have the PSU, mobo, and the cards on their extenders on it, maybe a small fan at the end. If this is your only rig, it won't be much of a headache running it naked.

Unless you're prototyping, in which case you'll run into some trial and error, which will inevitably increase your cost. Or spend the time designing it without an operating rig. It's just opportunity cost.

I went ahead and ran my prototype naked, figuring out the rest while it was mining. I had a GD70 and planned to do 6 cards (8 GPUs) with 2x 850W PSUs. That didn't work out, as I decided it would be dangerous, and my build has since evolved. I'll probably be using the GD70 myself, haha.
full member
Activity: 140
Merit: 100
firstbits: 1kwc1p
I'm no airflow engineer, so I wouldn't be confident propping these cards up somewhere and hoping.

I can tell there will likely be some kind of space issue that's going to be hard to solve, but as I'm not particularly good at DIY, I'm not sure my attempts to engineer a chassis would be more of a solution than a problem.

One option I was looking at is just running the whole thing open-air, but that'd give me storage woes.
hero member
Activity: 602
Merit: 500
Well, I've never tried it, because I'd be terrified of something going wrong. I suppose it could work if you really ziptied it in (perhaps use zipties to suspend the cards) and propped some fans up in the bays... Be aware, though, that the maximum cable length for PCIe extenders is 16", I believe. That severely limits the workable space you'll have to play with.

I wouldn't recommend it, but if you're desperate for a high hash-to-cost ratio, I won't say it won't work, heh.
full member
Activity: 140
Merit: 100
firstbits: 1kwc1p
I assume that resting two of the cards in the empty 5.25" bays would create a cooling headache?

The inline connectors are something like this: http://bit.ly/l2eEBX
hero member
Activity: 602
Merit: 500
I'm not familiar with inline low-profile connectors. The problem I see with any setup is that you will still need space of some kind to place the cards. The Antec One Hundred is a pretty shoddy case, fairly cramped; it would require a lot of modding to fit 5 cards in it.

I still think you're setting yourself up for a real headache unless you make a custom enclosure.
full member
Activity: 140
Merit: 100
firstbits: 1kwc1p
Just to clarify:

1. I'm planning on running Arch Linux.

2. I was hoping to run three graphics cards directly on the motherboard using low-profile inline connectors, and two on riser cables. The case would be an Akasa Freedom Xone or an Antec One Hundred.
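
Once it's assembled, I figure I can sanity-check that all five cards actually enumerate under Linux with something like this (just a sketch; assumes pciutils/lspci is installed, untested):

Code:
#!/usr/bin/env python
# Rough sanity check: count the AMD/ATI VGA devices the kernel enumerates.
# Assumes lspci (pciutils) is installed; untested, adjust the match to taste.
import subprocess

out = subprocess.check_output(["lspci"]).decode()
gpus = [line for line in out.splitlines()
        if "VGA compatible controller" in line
        and ("ATI" in line or "AMD" in line)]

print("%d GPUs detected" % len(gpus))
for line in gpus:
    print("  " + line)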
hero member
Activity: 602
Merit: 500
Quote
Hey all,

I'm thinking about setting up some rigs, but really struggling to come up with solid data on a few key questions.

1. I found a motherboard which has 1x PCI Express 2.0 x16 slot and 4x PCI Express x1 slots. If I got four riser cables, would this be capable of running five ATI Radeon HD 5870s? The motherboard is a Gigabyte GA-770T-D3L.

2. I've heard that as an alternative to a single very powerful PSU, it's possible to run two smaller PSUs. This would definitely be an attractive option for me. Can anyone give me a solid tutorial for this? Preferably with images.

3. What's the best way of keeping said riser cables and the five attached graphics cards tidy within a case? What's the best way to keep them cooled without resorting to water cooling?

4. How easy/difficult are the XFX HD-587X-ZNFC 5870s to overclock? I've been basing my plans on a 950/300 (core/memory) clock, which seems achievable from the stock 850 MHz core, but I'm just trying to be cautious.

Thanks for any help.

In no particular order:

There are almost no cases that can handle 5 double-slot GPUs; the Rosewill Thor is the only one I can think of that costs less than $300. Unless you want to build your own custom case, it might behoove you to pay a little more for something like an MSI 890FXA-GD70 and offset it with the savings on riser cards.
Two smaller PSUs are rarely a superior alternative to a single large PSU: price-wise, hassle-wise, danger-wise. Perhaps in terms of cabling...
I'm a little unclear on the 5-card thing myself; some people claim Windows is limited to 4 physical GPUs, though very little proof has been offered. It has been suggested that Linux will overcome this issue. And yes, x1 slots will do just fine; mining barely touches the PCIe bus, so the reduced bandwidth doesn't matter.
5870s are very easy to overclock in Windows; in Linux, a little harder (see the sketch at the end of this post). Forewarned is forearmed.
Cooling and tidiness are always a hassle. Lots of cable ties and lots of fans, I guess.
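
For the Linux case, something along these lines is the usual approach with the Catalyst driver's aticonfig tool. Treat it as a rough sketch: I'm assuming X is running on display :0, overdrive works on your cards, and the 950/300 clocks are your own targets, not validated numbers:

Code:
#!/usr/bin/env python
# Sketch only: drive Catalyst's aticonfig to set clocks on each adapter.
# Assumptions: X running on :0, overdrive supported, adapters numbered 0-4;
# 950/300 are the OP's target clocks, not values I've validated.
import os
import subprocess

ENV = dict(os.environ, DISPLAY=":0")   # aticonfig needs a running X display
CORE_MHZ, MEM_MHZ = 950, 300           # low memory clock is a common mining tweak

def ati(*args):
    return subprocess.check_output(("aticonfig",) + args, env=ENV).decode()

for n in range(5):                     # one pass per 5870
    adapter = "--adapter=%d" % n
    ati(adapter, "--od-enable")                                  # unlock overdrive
    ati(adapter, "--od-setclocks=%d,%d" % (CORE_MHZ, MEM_MHZ))   # core,mem
    print(ati(adapter, "--odgt").strip())                        # report GPU temp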
full member
Activity: 140
Merit: 100
firstbits: 1kwc1p
Hey all,

I'm thinking about setting up some rigs, but really struggling to come up with solid data on a few key questions.

1. I found a motherboard which has 1x PCI Express 2.0 x16 slot and 4x PCI Express x1 slots. If I got four riser cables, would this be capable of running five ATI Radeon HD 5870s? The motherboard is a Gigabyte GA-770T-D3L.

2. I've heard that as an alternative to a single very powerful PSU, it's possible to run two smaller PSUs. This would definitely be an attractive option for me. Can anyone give me a solid tutorial for this? Preferably with images. (A rough power-budget sketch is at the end of this post.)

3. What's the best way of keeping said riser cables and the five attached graphics cards tidy within a case? What's the best way to keep them cooled without resorting to water cooling?

4. How easy/difficult are the XFX HD-587X-ZNFC 5870s to overclock? I've been basing my plans on a 950/300 (core/memory) clock, which seems achievable from the stock 850 MHz core, but I'm just trying to be cautious.

Thanks for any help.
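
For context, the back-of-the-envelope power math I've been working from; the per-card draw is an assumption (roughly a 5870's TDP), not something I've measured:

Code:
# Back-of-the-envelope budget for the 5-card rig; the per-card draw is an
# assumption (~190 W, roughly a 5870's TDP), not a measured figure.
CARDS = 5
WATTS_PER_CARD = 190      # assumed 5870 draw under mining load
BASE_SYSTEM = 150         # CPU, board, drive, fans: a guess
HEADROOM = 0.80           # avoid running a PSU past ~80% of its rating

total = CARDS * WATTS_PER_CARD + BASE_SYSTEM
print("Estimated draw: %d W" % total)                    # 1100 W
print("PSU capacity wanted: %d W" % (total / HEADROOM))  # 1375 W
# Two 850 W units (1700 W combined) would clear that comfortably.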