
Topic: advice on 6xPCIE rig needed

donator
Activity: 2352
Merit: 1060
between a rock and a block!
June 09, 2011, 08:20:26 PM
#9
My understanding is that Windows won't recognize more than 4 GPUs? So that would give you only 2 dual-GPU cards or 4 single-GPU cards.
Linux goes up to 8 GPUs, so you can use 4 dual-GPU cards or up to 8 single-GPU cards. I intend to find out on my own, but this is what I've read on these forums so far...

I do have the 1500 W PSU mentioned above. It is awesome! However, if you're going to install it into a case, make sure you take measurements. The PSU is larger than smaller ones... I actually have 2 of them in one case, the TJ11 (experimenting with power loads/distribution)...
hero member
Activity: 648
Merit: 500
June 09, 2011, 12:38:30 PM
#8
I'm currently using an MSI 890FXA-GD70 with 2x 6950s and it runs like a champ. As soon as my order comes in, I'll be running quad. I do recommend using extenders though, as the space between the cards is very narrow with quadfire. This isn't as important if you aren't overclocking, but I don't know why you wouldn't.

They do make 1500 W PSUs, they're just expensive. http://www.newegg.com/Product/Product.aspx?Item=N82E16817256054&cm_re=1500w-_-17-256-054-_-Product

legendary
Activity: 2058
Merit: 1005
this space intentionally left blank
June 09, 2011, 11:50:45 AM
#7
Quote
Mainboard:
Gigabyte GA-890FXA-UD7
The link is in German, but you'll get the details if you wish.
6x PCIe (2x16, 2x8, 2x4)
Thing is, we don't know if this board could take 6 dual-slot GPUs... if you have any idea of such a motherboard, pray, tell!
~200€

That's fine if you have cables/ribbons for the cards. But you won't be able to neatly put them inside a case without modifications, though. You'll either need to build a custom riser for the cards, leave them lying loose, or find some other place to mount them.

Quote
CPU:
Some cheap AMD thing
~100€
Also, 100€ is definitely not a 'cheap AMD thing'.


As you might have noticed, I posted this in the meantime:


Mainboard: MSI 790FX-GD70 (AM3, DDR3) - 150€
Four PCIe slots that look as if they are spaced far enough apart to run 4 GPUs. Can anyone confirm this?

CPU/HDD: Cheap AMD + cheap HDD, 50€

So we're scaling down already. The mainboard looks good as far as the slot spacing goes: http://www.amazon.de/MSI-790FX-GD70-Mainboard-PCI-DDR3/dp/B001UGKFSU (just need to find one, lulz) - or this one: http://www.kmelektronik.de/shop/index.php?show=product_info&ArtNr=26150
member
Activity: 66
Merit: 10
June 09, 2011, 11:17:35 AM
#6
Quote
Mainboard:
Gigabyte GA-890FXA-UD7
The link is in German, but you'll get the details if you wish.
6x PCIe (2x16, 2x8, 2x4)
Thing is, we don't know if this board could take 6 dual-slot GPUs... if you have any idea of such a motherboard, pray, tell!
~200€

That's fine if you have cables/ribbons for the cards. But you won't be able to neatly put them inside a case without modifications, though. You'll either need to build a custom riser for the cards, leave them lying loose, or find some other place to mount them.

Quote
CPU:
Some cheap AMD thing
~100€
Also, 100€ is definitely not a 'cheap AMD thing'.
legendary
Activity: 2058
Merit: 1005
this space intentionally left blank
June 09, 2011, 09:54:03 AM
#5
First, I am partnering with a friend who lives a 10-minute walk from the data center, which is staffed 24/7.
Since the rig will run Windows, LogMeIn etc. will be set up so I can check on the damn thing every once in a while.

We are using (we like to think) better hardware to minimize the risk of something going kaput.

The budget is ~1300€, which sadly isn't enough to build two 3x-GPU rigs (one GPU is ~200€). We do, however, need 1000 MHash/s to be halfway decently profitable, hence the 4x approach. If we are indeed profitable, we will split into two machines and upgrade.
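
A rough sanity check on that math (a minimal sketch; the ~240 MHash/s per card and the euro prices are the approximate figures quoted in this thread, not measurements):

# Back-of-the-envelope check of the 4-GPU plan, using this thread's rough figures.
mhash_per_card = 240          # ~240 MHash/s per card (see opening post)
card_price_eur = 200          # ~200 EUR per GPU
budget_eur = 1300

for cards in (3, 4, 6):
    gpu_cost = cards * card_price_eur
    left_over = budget_eur - gpu_cost   # what remains for board/CPU/PSU/etc.
    print(f"{cards} cards: {cards * mhash_per_card} MHash/s, "
          f"{gpu_cost} EUR on GPUs, {left_over} EUR left for the platform")

Four cards give 960 MHash/s for 800€, leaving ~500€ for the rest of the machine, which is about as close to the 1000 MHash/s target as this budget gets; two 3-card rigs would need two platforms and blow past 1300€.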

I am more concerned about the technical aspects: can anyone see any problems heat-wise (case or no case), and does anyone have experience with the mainboard?
newbie
Activity: 19
Merit: 0
June 09, 2011, 09:25:10 AM
#4
Ideally, it would be best to split this into two machines. This would be preferable for several reasons.
1. Downtime: with only one machine, your hash rate drops to 0 during maintenance.
2. Redundancy: if a single component fails, all of your miners will go offline. Since you will be co-locating this service somewhere outside of your immediate control, this could be a big issue. Let's say your miner goes down in the middle of the night. Will your friend be willing to let you into the server room at 1 am? With two machines, you only drop to half of your hashrate.
3. Cheaper components: to go from 1000 W to 1200 W you are looking at a 95€ increase in price. That is almost the cost of another 850 W PSU.


Since this isn't going to be a machine you have immediate access to, I would go with two cheap motherboards with 3 PCIe slots each (buy these new) and two cheap CPUs (buy used if you can). 2 GB of the cheapest RAM that will work in each machine would be way more than enough (buy new). For storage, borrow an old HDD from someone or just run off flash drives. Then 6 GPUs, 3 in each. The most important thing is to get them online as quickly as you can. Buy locally if possible; 3-4 days tied up in shipping is 3-4 days you aren't making money.

-SteveA
legendary
Activity: 2058
Merit: 1005
this space intentionally left blank
June 09, 2011, 09:04:54 AM
#3
Okay, back to the drawing board.

Mainboard: MSI 790FX-GD70 (AM3, DDR3) - 150€
Four PCIe slots that look as if they are spaced far enough apart to run 4 GPUs. Can anyone confirm this?

CPU/HDD: Cheap AMD + cheap HDD, 50€

Power Pack:
a: 1200 W Corsair AX1200 Professional Series Gold - 240€
b: 1000 W Cooler Master Silent Pro M1000 - 145€
When running 4 GPUs, (a) beats (b), probably (see the rough load estimate below).

Case: Cooler Master HAF X Big Tower, ~200€ depending on the fans I get with it

GPUs: 4x Sapphire (Retail) HD5870 1024MB, ~800€, for a total of ~1200 MHash/s

Total price: ~1300€
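
On the (a)-vs-(b) PSU question: a rough load estimate, assuming the HD 5870's official ~188 W TDP and a ~100 W allowance for the rest of the system (both of these are assumptions, not measured figures):

# Rough PSU sizing for the 4x HD 5870 build above.
cards = 4
card_tdp_w = 188    # official HD 5870 TDP (assumption, not measured)
system_w = 100      # mobo + CPU + RAM + HDD + fans, rough allowance

load_w = cards * card_tdp_w + system_w   # ~852 W estimated draw
for psu_w, name in ((1000, "Silent Pro M1000"), (1200, "AX1200")):
    print(f"{name}: ~{load_w / psu_w:.0%} loaded at full tilt")

The 1000 W unit would sit around 85% load, workable but with little headroom; the 1200 W unit runs around 71%, which is why (a) likely beats (b).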

Note: the estimated cost for electricity will be 5% of all coins mined, since the thing will be placed in a friend's computer room.
Looking better now? Or should I just skip the case and let it sit there "open" with a giant fan directed at it?
member
Activity: 112
Merit: 100
"I'm not psychic; I'm just damn good"
June 08, 2011, 10:23:05 PM
#2
  • 1. The PCIe slots are too close together to fit 6 dual-slot cards.
    You must get extenders if you still intend to use this mobo.
  • 2. 6x 6870 = 170 W x 6 = 1020 W, plus mobo + CPU + RAM + fans + HDD (see the power sketch below).
    You would need a >1300 W PSU, which is not worth the money, especially for good ones.
    The choice of GPU is also bad. The 58xx is better for OC, although if you use 6 of those you would require a >1500 W PSU, which is not available on the market.
    So you would have to hook up 2 PSUs of >850 W each.
  • 3. Your MHash/$ is very low for a dedicated build. The breakeven point is very far away, well over a month.
    Expensive and not worth it. Change the build.
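
To make the power arithmetic in point 2 concrete (a minimal sketch; the 170 W per-card figure is the one used above, and the 150 W non-GPU allowance is an assumption):

# Power budget for the proposed 6x HD 6870 build, using the figures above.
cards = 6
card_w = 170     # per-card draw assumed in point 2
other_w = 150    # mobo + CPU + RAM + fans + HDD (rough assumption)

load_w = cards * card_w + other_w   # 1020 + 150 = 1170 W
margin = 1.15                       # ~15% headroom so the PSU isn't maxed out
print(f"estimated draw: {load_w} W, recommended PSU: >{load_w * margin:.0f} W")

That lands at roughly 1350 W, which is why point 2 calls for a >1300 W unit, or two smaller PSUs of >850 W each sharing the load.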
legendary
Activity: 2058
Merit: 1005
this space intentionally left blank
June 08, 2011, 03:54:12 PM
#1
So a friend and I have access to a temperature-controlled room where a buddy of ours runs his tech company.
He's willing to let us put a rig in there if we compensate him for electricity.

So now we're trying to figure out how best to approach this whole thing. Since we're both cheap, we're obviously trying not to go all-out on all the components. The rig will run Windows so we noobs can remote-access it. There will be no overclocking or any such shenanigans.

I'd like to present you with our choices so far:

Mainboard:
Gigabyte GA-890FXA-UD7
The link is in German, but you'll get the details if you wish.
6x PCIe (2x16, 2x8, 2x4)
Thing is, we don't know if this board could take 6 dual-slot GPUs... if you have any idea of such a motherboard, pray, tell!
~200€


CPU:
Some cheap AMD thing
~100€


Memory:
4 GB of something or other
~50€

Harddisk:
We're considering a 16 GB stick, or just some cheap HDD.
~20€

GPUs:
Now here's where it gets interesting.
With 6 slots, we'd obviously run into some kind of temperature problem when trying to run 6x the highest-end card there is.
We're currently considering

6x 6870, which run at ~150€ apiece.
The thing is, we don't know if 6 of these monsters would fit in there. If there's only room / technical provisions for four cards, dammit, that would suck. We should get 240 MHash/s per card, so it makes quite a difference whether there are 4 or 6 in there (quick tally below).
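
For the record, here's that 4-vs-6 difference in numbers (a trivial sketch using the ~240 MHash/s and ~150€ per-card figures above):

# What the 4-vs-6 card question means, using the figures above.
mhash_per_card = 240
price_eur = 150
for cards in (4, 6):
    print(f"{cards} cards: {cards * mhash_per_card} MHash/s "
          f"for {cards * price_eur} EUR in GPUs")

So 4 cards give 960 MHash/s for 600€ in GPUs, while 6 give 1440 MHash/s for 900€; the two extra slots are worth 480 MHash/s if the cards fit.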

Additionally, we're unsure about the power supply. Which power pack would you advise? Would standard cooling suffice, or is there a need for additional measures?

I also came across this open case, which looks awesome.

Hope someone will comment on these first ideas - thank you if you do!
