Hi,
I have a problem when I try to connect more than 6 GPUs to my AsRock H81 Pro BTC v2.0 mobo. I bought an extender riser (PCI-E x1 -> 3x PCI-E x1) and I use normal USB3 PCI-E x1->x16 risers. Everything's fine until I connect the 7th card. I can connect the first 6 cards any way I want, but once the 7th card is connected I get a blank screen when I turn on the rig.
How do I make it work with 7 or 8 cards?
Are you using Windows 7, perhaps? I have an 8-card system and the only way I got them all to work was with Windows 10 and Ubuntu. But it could also be that your MB doesn't support more than 6 PCIE cards. Have you tried connecting 5 cards plus one dual card, such as a 7990 or 295X? I have two 7990s and 4 other cards in my 8-card system, and I'm not using the PCI switch.
I use Win10, but I can't even get the OS to boot. I turn on the rig and nothing happens. I don't have any dual cards; I use RX 470 GPUs.
Maybe the AsRock H81 Pro BTC v2.0 mobo doesn't support more than 6 PCI-E devices? I tried disabling the integrated audio, serial ports, etc. in the BIOS and then connecting the 7th card, but that didn't work either.
I don't want to buy an expensive server/workstation mobo. Is there any chance of using 7 or 8 PCI-E devices? I searched for a modded BIOS for this mobo but didn't find anything. Or is it a limit of the H81 chipset rather than the BIOS?
Which PCIE slot are you connecting your 1x3 card into?
It is my understanding that each GPU requires a minimum of 1 PCIE lane to transmit data. That would mean you have to plug the 1->3 card into a slot with at least 3 lanes (in the real world that means a 4x, 8x, or 16x slot). The H81 Pro BTC has a single 16x slot and five 1x slots, so you could only ever run the 1->3 card in the 16x slot.
If I'm wrong on this I am open to correction. I've actually been wanting to play around with the exact setup you describe.
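If you want to check the lane question empirically, here's a rough sketch, assuming the rig can boot a Linux live USB (lspci ships in the pciutils package). On the rig itself you'd run `sudo lspci -vv | grep -E 'VGA|LnkSta:'` and look at the negotiated width of each card; the sample output below is made up purely for illustration, not taken from a real rig:

```shell
# Hypothetical excerpt of `lspci -vv` output (illustrative only):
lnksta_sample='01:00.0 VGA compatible controller: AMD Radeon RX 470
    LnkSta: Speed 2.5GT/s, Width x1
02:00.0 VGA compatible controller: AMD Radeon RX 470
    LnkSta: Speed 2.5GT/s, Width x1'

# Cards behind USB risers or a 1->3 extender should all negotiate Width x1,
# regardless of the physical slot size they hang off:
x1_links=$(printf '%s\n' "$lnksta_sample" | grep -c 'Width x1')
echo "links negotiated at x1: $x1_links"
```

If every card reports `Width x1` even in the 16x slot, that would support the idea that lane *count* per card isn't what's running out.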
I used the 1->3 extender riser in the 6th PCI-E slot (which is an x1 slot) and the 3 cards worked fine on that extender. So I think it's a BIOS/chipset limitation, or maybe there aren't enough IRQs available?
I have a meeting I need to run to but this thread I found may have some useful information in it. I will keep digging when I get back as well.
https://forum.ethereum.org/discussion/5195/riser-1-to-3-pci-e-1x
Edit 1:
After reading through the above thread and taking a look at Intel's specs on the H81 and Z87, it appears the maximum number of PCIE lanes through the H81 chipset is 6, whereas the Z87 chipset can handle 8. The 1x PCIE slots run through the chipset, but the 16x slot connects straight to the CPU.
I would think that you could run 6 GPUs max through the chipset while running the rest of your GPUs through the 16x slot using the CPU's remaining lanes. In that scenario you could only ever use a 1->2 card in one of your 1x slots. If you moved the 1->3 card to the 16x slot, then you could run 5 GPUs through the chipset and 3 (or more, depending on CPU, driver, and OS) through the 16x slot straight to the CPU.
But that doesn't seem to match your experience. If you have the 1->3 card in the last 1x PCIE slot, then the 7th card you plug in should only be the 6th device running through the chipset. Theoretically you should be able to run 7 cards with your setup, but not 8. And you can't.
So I'm wondering: perhaps the bottleneck is somewhere else? Out of curiosity, what CPU are you using?
http://ark.intel.com/products/75013/Intel-Z87-Chipset
https://ark.intel.com/products/75016/Intel-H81-Chipset
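Also, while the rig still boots with 6 cards, it might be worth counting exactly what the OS enumerates; if the 1->3 extender's switch shows up but a GPU behind it doesn't, that points more at the BIOS than the chipset. A rough sketch, again assuming a Linux live USB; the sample lspci output here is made up for illustration:

```shell
# Hypothetical `lspci` output for a 6-card rig (illustrative only):
lspci_sample='01:00.0 VGA compatible controller: AMD Radeon RX 470
02:00.0 VGA compatible controller: AMD Radeon RX 470
03:00.0 VGA compatible controller: AMD Radeon RX 470
04:00.0 VGA compatible controller: AMD Radeon RX 470
05:00.0 VGA compatible controller: AMD Radeon RX 470
06:00.0 VGA compatible controller: AMD Radeon RX 470'

# On the rig itself you would just run:  lspci | grep -c "VGA compatible"
gpu_count=$(printf '%s\n' "$lspci_sample" | grep -c 'VGA compatible controller')
echo "GPUs detected: $gpu_count"
```

If that count stays at 6 no matter which slots you populate, a hard enumeration limit in the BIOS/chipset looks more likely than a per-slot issue.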