Topic: PCI Express splitter (Read 5935 times)

newbie
Activity: 27
Merit: 0
August 08, 2011, 10:06:18 PM
#18
"For 30 units we can do 30% discount."

119.14 + shipping if you have the cash...
legendary
Activity: 1012
Merit: 1000
August 07, 2011, 04:30:25 PM
#17
Just something to consider: if a rig with 2 GPUs goes down, you freak out a little. If a rig with 8 (or more) GPUs goes down... you freak out a lot.
newbie
Activity: 27
Merit: 0
August 06, 2011, 05:34:51 PM
#16
Can you give any details, Cablesaurus? Looking to get my mobo soon. If you are about to start selling these splitters at a reasonable price, it will be a mATX with 2 slots. Otherwise, I will be getting an ATX with the max number of slots I can afford...

Thanks,
-Zach
sr. member
Activity: 302
Merit: 250
August 06, 2011, 03:29:50 PM
#15
We're looking into these as well. From what we've seen so far there's still an 8-GPU limit, but it may be surpassable.
newbie
Activity: 27
Merit: 0
August 06, 2011, 02:31:19 PM
#14
They are ~$200/pop  Angry

They must be doing low volume.  Maybe someone from Cablesaurus could do a large order and get the price down to something reasonable?
full member
Activity: 182
Merit: 100
July 29, 2011, 09:10:22 AM
#13
No, it's 8. It's a driver limitation (Catalyst), not the OS.

Can you cite this? I remember seeing people say multiple times that Linux can handle more. This is one of those times.
There are no myths here...

On Windows, prior to driver 11.6, only 4 GPUs would work.

A GPU does not equal a video card in this context...
Windows will recognize more than 4 video cards just fine; however, you won't be able to use more than 4 GPUs with 11.5 or below.

However, with 11.6 drivers, it looks like there's limited confirmation of 8 GPUs being the new limit.
On Linux there are rumors that the limit is at least 10 GPUs...
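Since the whole limit debate comes down to how many GPUs the OS actually enumerates, here is a minimal sketch of counting them from `lspci`-style output. The sample text below is made up for illustration, not output from real hardware; on a live Linux box you'd feed in the actual output of `lspci`.

```python
# Count GPU devices in lspci-style output.
# SAMPLE is illustrative text, not captured from real hardware.
SAMPLE = """\
01:00.0 VGA compatible controller: ATI Radeon HD 5870
02:00.0 VGA compatible controller: ATI Radeon HD 5870
03:00.0 Display controller: ATI Radeon HD 5870
"""

def count_gpus(lspci_text):
    """Count lines that look like GPU devices.

    Dual-GPU cards typically show their second GPU as a plain
    'Display controller' rather than a 'VGA compatible controller',
    which is why GPU count and card count can differ.
    """
    keywords = ("VGA compatible controller", "Display controller")
    return sum(1 for line in lspci_text.splitlines()
               if any(k in line for k in keywords))

print(count_gpus(SAMPLE))  # -> 3
```

This also illustrates the "GPU does not equal a video card" point above: two cards can enumerate as three GPUs.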
full member
Activity: 336
Merit: 100
July 29, 2011, 01:27:44 AM
#12
hmm wonder how many cards i can cram on 1 motherboard like 20??

I think the current driver is limited to 8 cards.
It's 8 on Windows. I'm pretty sure on Linux it's a bit higher.

No, it's 8. It's a driver limitation (Catalyst), not the OS.
full member
Activity: 182
Merit: 100
July 28, 2011, 10:07:18 PM
#11
hmm wonder how many cards i can cram on 1 motherboard like 20??

I think the current driver is limited to 8 cards.
It's 8 on Windows. I'm pretty sure on Linux it's a bit higher.
full member
Activity: 196
Merit: 100
July 28, 2011, 09:39:37 PM
#10
hmm wonder how many cards i can cram on 1 motherboard like 20??

I think the current driver is limited to 8 cards.
member
Activity: 97
Merit: 10
July 28, 2011, 07:41:57 PM
#9
hmm wonder how many cards i can cram on 1 motherboard like 20??
full member
Activity: 182
Merit: 100
July 28, 2011, 05:48:46 PM
#8

Indeed it would, the molex power adapter allows the card to pull that 75W directly from the PSU, instead of through the motherboard.

You only really need the 1x adapter with molex, however; mining is not bandwidth intensive, so you don't need all the lanes available to 8x or 16x.
The 1x is less than a dollar difference in price, so it doesn't really change things.

Edit: The site I'm about to post doesn't list prices at least that I see but take a look at this.
http://www.amfeltec.com/products/x4pcie-splitter4.php
full member
Activity: 196
Merit: 100
July 28, 2011, 05:14:37 PM
#7

Indeed it would, the molex power adapter allows the card to pull that 75W directly from the PSU, instead of through the motherboard.

You only really need the 1x adapter with molex, however; mining is not bandwidth intensive, so you don't need all the lanes available to 8x or 16x.
full member
Activity: 182
Merit: 100
July 28, 2011, 05:03:45 PM
#6
A quick search shows Joel is right (two PCIe x8 links).

The other issue with plugging GPUs into this (bandwidth should not be a problem) will be power draw.  The normal draw is 75W from the mobo, so if you plug in two GPUs it may need double that.  The connections may not be designed to handle that power.  I suspect that's what burned my Dell PowerEdge server motherboard when I was playing with risers last year, i.e. too much current and slow over-heating = crispy fibreglass and lots of smoke => trash.

Would something like this fix the power problem?
https://cablesaurus.com/index.php?main_page=product_info&cPath=1&products_id=11
hero member
Activity: 518
Merit: 500
July 28, 2011, 04:45:24 PM
#5
A quick search shows Joel is right (two PCIe x8 links).

The other issue with plugging GPUs into this (bandwidth should not be a problem) will be power draw.  The normal draw is 75W from the mobo, so if you plug in two GPUs it may need double that.  The connections may not be designed to handle that power.  I suspect that's what burned my Dell PowerEdge server motherboard when I was playing with risers last year, i.e. too much current and slow over-heating = crispy fibreglass and lots of smoke => trash.
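A rough back-of-the-envelope check of the power concern above. The 75W figure is the standard PCIe slot power limit mentioned in the post; the function name and structure are mine, just to make the arithmetic explicit.

```python
# Back-of-the-envelope power budget for splitting one PCIe slot.
# Assumes each card draws up to the full 75 W slot allowance
# (worst case; real draw varies by card).
SLOT_LIMIT_W = 75  # power a single PCIe slot is specified to deliver

def slot_overdraw(num_cards, per_card_w=SLOT_LIMIT_W):
    """Watts the split slot's traces would have to carry
    beyond the 75 W they were designed for."""
    total = num_cards * per_card_w
    return max(0, total - SLOT_LIMIT_W)

print(slot_overdraw(2))  # two cards on one slot: 75 W over spec
```

In other words, two unpowered cards on one split slot can ask the motherboard traces for double their rated load, which is exactly the kind of slow overheating described above; molex-powered risers sidestep it by drawing that 75W straight from the PSU.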
legendary
Activity: 1596
Merit: 1012
Democracy is vulnerable to a 51% attack.
July 28, 2011, 04:32:19 PM
#4
This seems to split an x16 slot into two x8 slots. It doesn't seem to be a bridge but a splitter. Obviously, it won't work if you put it in a slot that's already wired for x8. But most chipsets should accept a split of an x16 slot into two x8's. The biggest issue I can see is that it might confuse the BIOS.
sr. member
Activity: 462
Merit: 250
It's all about the game, and how you play it
July 28, 2011, 04:27:28 PM
#3
That's why I asked about this one specifically: I can see both ends are PCIe, at least. I'm wondering if anyone has tried this or a similar one.
hero member
Activity: 518
Merit: 500
July 28, 2011, 04:23:42 PM
#2
Many riser/splitters convert PCIe to PCI.  I've yet to find a device that does PCIe-to-PCIe splits.  The most useful I ever managed was PCI-X to PCIe, running GPUs in P3 servers.
sr. member
Activity: 462
Merit: 250
It's all about the game, and how you play it
July 28, 2011, 04:12:09 PM
#1
Has anyone tried a board such as this one
http://cgi.ebay.com/Dell-POWEREDGE-R710-PCIE-RISER-BOARD-MX843-/270692444242?pt=LH_DefaultDomain_0&hash=item3f0686e852
in a standard (i.e. non-Dell-PowerEdge) motherboard, to any result? A visual inspection shows it simply breaks the PCIe slot in half for each of the slots; I see no significant supporting logic on the board.